Outcooperating the Competition

We often lack a shared framework, a reference business-technology architecture, for building platform-ecosystem last movers through a long-term and inclusive perspective. A deep dive into defensibility flywheels and the role of infrastructures, interfaces, and extensions in outcooperating the competition.

Boundaryless Team

July 01, 2021

When we speak about ecosystems and platforms, new forms of organizations, and the role of software in their development, we often lack a shared framework, a reference business and technology architecture.

In reality, such a convergent view of digitally enabled ecosystems of interacting parties is emerging powerfully from market practices, and it’s worth highlighting because it can provide a foundation for further thinking. More specifically, after having introduced a way to frame the three key challenges of an organization that aims at becoming platform-ready, we now look into these emerging patterns of organizing markets, and into a potential way to orchestrate and design a set of incentives that can bring such an ecosystem forth as a cooperative “last mover”: an ecosystem-weaving initiative that aims at becoming the standard, the “place to be” for innovation to happen.

The essential question this essay investigates is whether it is possible to create ecosystem strategies that reduce the case for destructive competition and maximize the case for collaboration, plugging in, integration of special capabilities, composability, modularity, and, in the end, a wholly systemic actualization where all parties thrive. Such a system would be post-competitive and represent a powerful new way to redesign markets for accelerated innovation.

The components of a platform-ecosystem

Modern, software-powered platform-ecosystem initiatives are essentially based on three key spaces of value creation where exchanges happen:

  1. the marketplace;
  2. the main “product” features (that we call the main UX);
  3. the so-called extension platform.

Indeed, a weaver of a platform-powered ecosystem normally aims at and works towards:

  • creating a marketplace that enables the exchange of niche products/services between parties (producer-consumer), normally monetizing through a “take rate”;
  • providing a main user experience around a set of enabling services and products, centrally provided by the platform owner, often in the form of a SaaS offering or other more capital-intensive services (such as logistics), most often targeted at the producers in the marketplace and, in a more limited set of cases, at the consumers;
  • creating an environment where other third parties can develop so-called “extensions” to the main user experience in the form of apps, templates, and plug-ins. In many cases this happens by adopting a so-called “reverse API” paradigm, where extensions are effectively pieces of software that run in tight connection with the main UX and are often optimized for it (following strict UX guidelines); these apps connect, for data and further workflow execution, to other pieces of software running in external contexts where the users may also have a connected identity, information, and data (e.g., a Shopify seller that keeps its ecommerce in sync with a bookkeeping solution through an extension). A minimal sketch of this pattern follows the list.
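
As a rough illustration of the “reverse API” pattern just described, the sketch below shows how an extension might be declared against a hypothetical main-UX SDK: the extension runs inside the main UX, reacts to a domain event, and syncs data to an external system where the user has a connected identity. All names (ExtensionManifest, onOrderPlaced, the bookkeeping endpoint) are illustrative assumptions, not an existing SDK.

```typescript
// Minimal sketch of a "reverse API" extension: the extension runs inside the
// main UX, reacts to domain events, and syncs data to an external system.
// All types and endpoints here are hypothetical, for illustration only.

interface OrderPlacedEvent {
  orderId: string;
  sellerId: string;
  totalAmount: number; // in minor currency units, e.g. cents
  currency: string;
}

// Contract the (hypothetical) main UX expects every extension to fulfil.
interface ExtensionManifest {
  name: string;
  version: string;
  // UI surface rendered inside the main UX, following its UX guidelines.
  renderSettingsPanel: () => string;
  // Hook invoked by the main UX when a domain event occurs.
  onOrderPlaced: (event: OrderPlacedEvent) => Promise<void>;
}

// Example extension: keeps an external bookkeeping system in sync with sales.
const bookkeepingSyncExtension: ExtensionManifest = {
  name: "bookkeeping-sync",
  version: "0.1.0",
  renderSettingsPanel: () => "<form><!-- connect your bookkeeping account --></form>",
  onOrderPlaced: async (event) => {
    // Push the sale to an external service where the user has a connected
    // identity and data (hypothetical endpoint).
    await fetch("https://bookkeeping.example.com/api/entries", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        reference: event.orderId,
        amount: event.totalAmount,
        currency: event.currency,
      }),
    });
  },
};

export default bookkeepingSyncExtension;
```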

Normally, in most cases, the view of the integrated stack would end here, hiding at least three more key layers. Indeed, behind the platform-owned “back end” you would find:

  • a certain “grammar”, a so-called domain model in DDD terms;
  • a data layer where all data generated are kept safe and accessible;
  • finally, the infrastructure on top of which the system runs.

All of these need to exist for the system to run.

The stack can therefore be seen as follows:
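
As a minimal, illustrative sketch of this layering, the hypothetical TypeScript types below render the six layers and how they relate; all names and fields are assumptions drawn from the description above, not a reference implementation.

```typescript
// Illustrative sketch of the platform-ecosystem stack described above.
// Top three layers: where exchanges happen. Bottom three: what makes them run.
// All names are assumptions for illustration, not a reference implementation.

// ---- Foundational layers --------------------------------------------------
interface Infrastructure {            // compute, storage, networking the system runs on
  provider: string;
}

interface DataLayer {                 // where generated data is kept safe and accessible
  store(collection: string, record: object): void;
}

interface DomainModel {               // the shared "grammar" of the system (DDD sense)
  entities: string[];                 // e.g. ["Listing", "Order", "Review"]
  version: string;
}

// ---- Value-creation layers ------------------------------------------------
interface MainUX {                    // core product features provided by the platform owner
  domain: DomainModel;
  data: DataLayer;
  infra: Infrastructure;
}

interface Marketplace {               // producer/consumer exchanges, monetized via a take rate
  hostUX: MainUX;
  takeRate: number;                   // e.g. 0.1 = 10%
}

interface Extension {                 // third-party apps/templates/plug-ins on the main UX
  hostUX: MainUX;
  // An extension may bring its own extended domain model, data layer, and
  // infrastructure, hence the optional fields below.
  ownDomain?: DomainModel;
  ownData?: DataLayer;
  ownInfra?: Infrastructure;
}
```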

It’s important to note that (as the optional fields in the sketch above suggest) the extensions that run on the platform may also use an extended domain model, store data on different data layers, and run on different modularized infrastructure components. For example, the accounting extension that allows the ecommerce owner to keep books synchronized with sales may store data in a private and controlled space and possibly connect with public tax-filing infrastructure.

Understanding and challenging the framework

Normally, such a platform framework would be run by a prominent company. This company would aim at gaining a certain defensible advantage through a defensibility flywheel such as, for example, a scale advantage, a “lock-in” advantage (becoming essential to the adopters’ workflow), or a proprietary technology or data advantage, as extensively explained in a previous article.

After becoming “too big to shortcut”, the platform would play a game of balance between control (by controlling all the interfaces between the layers) and enablement (by providing services valued by all the entities involved). Entities let go of some independence in order to reap the benefits of being “part of” the ecosystem, such as, for example, greater demand generation or better efficiency.

The platform-owning company would likely seek defensibility and control of the ecosystem by leveraging these reinforcing, multi-sided network effects.

Why, then, would it be important to start looking at this framework beyond the “usual” way? What are the challenges that prevent new approaches to running an ecosystem, approaches that are more inclusive and less centralizing? And would such a system be desirable, or simply more efficient at innovation?

Liberating Interfaces

The essential role of interfaces is well captured by Shea’s Law, listed among David Akin’s laws of spacecraft design, which states:

“the ability to improve a design occurs primarily at the interfaces. This is also the prime location for screwing it up.”

The key aspect we want to explore with regard to interfaces here is the effect of two major drivers. On the one hand, we want to explore what happens when the interfaces that exist between layers and components are liberated from the monopolistic control of one single ruling party. We believe that clear and stable interfaces increase the overall capability of the system to generate broader plurality and optionality, and bring more resilience as an effect.

On the other hand, we also envision that transitioning towards shared governance of interfaces and embracing less centralized incentive structures would bring a longer-term focus: we assume that, as some studies have shown, organizations that are co-managed and co-owned show a broader tendency towards “voice” and “loyalty” (in Hirschman’s Exit, Voice, and Loyalty framing) versus exit, thus making them essentially better equipped for long-termism.

Creating clear interfaces between layers would also be essential to facilitate the evolution of each of those layers: pace layering will clearly differ across them, with the infrastructure and domain model layers evolving much more slowly than the services and products layers.
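
To make this pace-layering point concrete, here is a hypothetical sketch in which the slow-moving domain interface is frozen at a given version while a fast-moving product function evolves freely behind it; all names and versions are illustrative assumptions.

```typescript
// Illustrative sketch: a slow-moving, versioned domain interface decoupled
// from a fast-moving product layer. Names and versions are assumptions.

// Slow layer: changes only through a (shared) governance process.
interface ListingV1 {
  id: string;
  title: string;
  priceCents: number;
}

interface ListingRepositoryV1 {
  getListing(id: string): Promise<ListingV1>;
}

// Fast layer: a main UX feature that can be rewritten frequently without
// touching the interface above, so extensions and other UXs keep working.
async function renderListingCard(repo: ListingRepositoryV1, id: string): Promise<string> {
  const listing = await repo.getListing(id);
  return `<div class="card">${listing.title}: ${(listing.priceCents / 100).toFixed(2)}</div>`;
}
```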

The role of the Main UX

In a system architecture like the one outlined above, the provider of the main UX would be in charge of:

  • implementing the core set of “product” functionalities specified in the domain;
  • providing differentiating elements on top of the core domain functionalities (e.g., with capital-intensive services that plug into the functionality);
  • building its own marketplace(s) of services;
  • building its own marketplace of extensions;
  • managing policing and security of both marketplaces.

By standardizing the interfaces between the main UX and the extensions, it would be possible to have multiple players provide alternative main UXs. Services marketplaces could also be loosely bundled with the main UX(s): for example, a marketplace featuring consultants aiming to provide services to adopters of the software stack would just need the experts to be familiar with the domain model and main UX, and wouldn’t require deeper integration. If the domain model also included the individual marketplace entity and its reputation, experts could technically provide services across different main UXs’ marketplaces by leveraging the same reputation. With such an untangled main UX, and defensibility therefore harder to attain, the main differentiator for main UX providers would be full compatibility with the ecosystem of service providers and extensions and, furthermore, providing the best experience across the core set of features while adding differentiating features on top, all while keeping compatibility with the whole ecosystem.
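
As an illustration of what including reputation in the shared domain model could look like, the hypothetical sketch below keys reputation events to a provider identity rather than to any single main UX, so the same track record could be read across different main UXs’ marketplaces; all names are assumptions.

```typescript
// Illustrative sketch: reputation as a shared domain entity, keyed to the
// service provider's identity rather than to any single main UX.
// All names are assumptions, not part of an existing specification.

interface ReputationEvent {
  providerId: string;    // identity shared across the ecosystem
  marketplaceId: string; // which main UX's marketplace the job ran on
  rating: number;        // e.g. 1..5
  timestamp: string;     // ISO 8601
}

// Any main UX can compute the same aggregate from the shared event log.
function aggregateReputation(events: ReputationEvent[], providerId: string): number | null {
  const relevant = events.filter((e) => e.providerId === providerId);
  if (relevant.length === 0) return null;
  const sum = relevant.reduce((acc, e) => acc + e.rating, 0);
  return sum / relevant.length;
}
```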

The thickness of the main UX depends on the degree of standardization of the business process it enables: the more standard the process, the thicker the main UX is nudged to be. It would also be possible to imagine such an ecosystem sporting a very thin main UX, even a disappearing one: in this case, the extensions would all share the same domain model but implement partially overlapping sets of features, allowing interoperability while providing their own “view” on the domain. The reason to have a main UX, and not only “extensions”, would be to provide basic curation services (policing and security) helping the adopter “navigate” the extensions’ market. The main UX provider would also be best suited to run the services marketplaces. Main UX providers would be in charge of standardizing transactions (for example with a payment system), distribution, reputation-based browsing, and so on.

The case for having a main UX, and the inherent difficulty of standardizing the interface between the main UX and the extensions, indicate that a likely outcome would be a strong coupling between a main UX and a certain ecosystem of extensions.

A thick main UX player would have to deliver a tangible amount of enabling value to the ecosystem by running the transaction engine (for both the extensions and the services marketplaces) and the overall evolutionary learning engine at scale. The thicker the main UX, the more empowerment and services it will need to provide to the ecosystem to justify that thickness. Since defensibility options would certainly be limited for the main UX provider in such an unbundled market, due to the openness of the domain model and interfaces, building trust and empowering features for the ecosystem would require a different financing path. The current financing path for network-based/platform-based organizations is indeed largely based on investing upfront with the aim of creating defensibility and lock-in: platforms struggle to overcome the so-called chicken-and-egg problem and rely on massive subsidies in the early stages to create the attraction that, in the longer term, allows them to create network effects.

In an unbundled ecosystem such as the one we’re exploring here, we would see (and to some extent are already seeing) the application of new incentive design approaches that allow an early and steady creation of trust between users and the platform by cementing a reciprocal set of incentives for success from the beginning. Tools such as Bonding Curves, Augmented Bonding Curves, or similar crypto primitives, designed to create early-stage utility (either financial, by giving rights to future profits, or functional, by allocating special governance rights) while network effects and product maturity have not yet materialized, may fruitfully contribute to building such early trust and solve the problem eloquently explained by Chris Dixon in his landmark “Crypto Tokens: A Breakthrough in Open Network Design”.
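
As a toy illustration of the bonding-curve idea, the sketch below prices a token as a simple linear function of its supply, so early participants mint cheaply and benefit as adoption grows; the curve shape, constants, and function names are illustrative assumptions, not the mechanism of any specific project.

```typescript
// Toy bonding curve: price grows linearly with supply (price = slope * supply).
// The reserve needed to mint from supply s0 to s1 is the area under the curve:
// reserve = slope * (s1^2 - s0^2) / 2. Constants are illustrative only.

const SLOPE = 0.0001; // reserve currency per token, per unit of supply

// Cost (in reserve currency) to mint `amount` tokens on top of `currentSupply`.
function buyCost(currentSupply: number, amount: number): number {
  const s0 = currentSupply;
  const s1 = currentSupply + amount;
  return (SLOPE * (s1 * s1 - s0 * s0)) / 2;
}

// Refund (in reserve currency) for burning `amount` tokens from `currentSupply`.
function sellRefund(currentSupply: number, amount: number): number {
  const s1 = currentSupply;
  const s0 = currentSupply - amount;
  return (SLOPE * (s1 * s1 - s0 * s0)) / 2;
}

// Early contributors pay far less per token than later ones:
console.log(buyCost(0, 1_000));       // cost of the first 1,000 tokens
console.log(buyCost(100_000, 1_000)); // cost of 1,000 tokens once supply is 100,000
```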

Finally, it’s worth noting that in such an open architecture, direct-to-customer extensions and a parallel, main-UX-mediated distribution may also co-exist. An example of such a separation and standardization of interfaces can be seen in the development of the WordPress ecosystem, where, thanks to the standardization of the domain model and the openness of interfaces, we’ve seen a plethora of different approaches emerge, such as headless CMSs, second-order ecosystems such as Elementor’s, built on top of the WordPress domain model and back end, and more.

The Domain model, the data layer, and the Infrastructure

The domain model on top of which the system runs would then act as a common, shared model of the system and would contain all the definitions and actions worth specifying to ensure consistency and compatibility across the different implementations of the main UX, and between the main UX and the extensions. The domain model would represent the actual underlying protocol and would also be the root of the implementation of the data architecture. Such a domain protocol would clearly need to be subject to shared governance processes, to ensure that all points of view are respected and that changes in the domain model do not dramatically impact a subset of the ecosystem’s players.
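
A fragment of such a shared domain protocol might look like the hypothetical sketch below: a versioned set of entity and action definitions that every main UX and extension implements against, with changes gated by the shared governance process; every name here is an assumption made for illustration.

```typescript
// Illustrative fragment of a shared domain protocol: versioned entity and
// action definitions that every main UX and extension implements against.
// Changes would go through the shared governance process; all names are
// assumptions for illustration.

export const PROTOCOL_VERSION = "1.2.0";

// Entity definitions (the "grammar" everyone shares).
export interface Party {
  id: string;
  role: "producer" | "consumer" | "provider";
}

export interface Offer {
  id: string;
  sellerId: string;
  priceCents: number;
  currency: string;
}

// Actions worth specifying in the protocol, expressed as commands.
export type Command =
  | { kind: "PublishOffer"; offer: Offer }
  | { kind: "AcceptOffer"; offerId: string; buyerId: string }
  | { kind: "RecordReview"; offerId: string; rating: number };

// Every main UX or extension exposes the same handler shape, guaranteeing
// consistency and compatibility across implementations.
export interface ProtocolHandler {
  protocolVersion: typeof PROTOCOL_VERSION;
  handle(command: Command): Promise<void>;
}
```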

The data layer would need to be transparent, accessible, and auditable, supporting some level of federation between local clusters, which would allow certain spaces of information to remain closed while staying compatible and available for settlement. To some extent, the common infrastructure and the data model could also overlap in such a system, especially in an implementation based on a permissionless digital ledger. In this case, multiple nodes would be responsible for executing the open ledger and validating all the transactions: token engineering would be needed to ensure the relevant incentives for the nodes composing the network to carry out the validation work.

More “federated” architectures could also be designed to reduce the ledger validation work, by designing trustless settlement layers between clusters made of trusted entities (e.g., settling inter-organizational transactions while keeping intra-organizational transactions in a trust-based environment). The type of interface and data-model coupling would vary with the type of data being exchanged: for exchanges needing historicization, typical of financial transactions, sitting on a distributed ledger means distributing the validation work to nodes; if the need to share a data model for interoperability doesn’t entail keeping consistent ledgers with auditable information, then the need for shared infrastructure would likely disappear, technically leaving the possibility for other types of provisioning of common infrastructure to emerge, such as with cloud providers.
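
As a rough sketch of this federated pattern, the hypothetical router below keeps intra-organizational transfers in a local, trust-based ledger and only forwards inter-organizational ones to a shared settlement layer (which could be a distributed ledger); all interfaces and names are assumptions.

```typescript
// Illustrative sketch of federated settlement: intra-organizational transfers
// stay in a local, trust-based ledger; inter-organizational ones are settled
// on a shared (possibly distributed) settlement layer. All names are assumptions.

interface Transfer {
  fromOrg: string;
  toOrg: string;
  amountCents: number;
}

interface Ledger {
  record(transfer: Transfer): Promise<void>;
}

class SettlementRouter {
  constructor(
    private readonly localLedger: Ledger,      // trusted, per-cluster bookkeeping
    private readonly sharedSettlement: Ledger, // trustless layer between clusters
  ) {}

  async route(transfer: Transfer): Promise<void> {
    if (transfer.fromOrg === transfer.toOrg) {
      // Same organization: no need for shared validation work.
      await this.localLedger.record(transfer);
    } else {
      // Crosses organizational boundaries: settle on the shared layer
      // so the transaction is auditable by all clusters involved.
      await this.sharedSettlement.record(transfer);
    }
  }
}
```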

The Extensions

In such an architecture, extensions run on the premise of using the same shared domain model (reflected in the interfaces), to ensure their compatibility with all the main UXs, and may extend that domain model or use a complementary extension of it into other domains. Extensions could technically also wrap and integrate different operational infrastructures (such as further logistics or computation infrastructures, for example) and make them available to the parties. In a pace-layering view, extensions are the most likely to capture new and emerging behaviours and are subject to strong innovation pressure to keep competing. The main UX(s) and the common domain model continuously exercise an attraction mechanism on features that emerge in the extension ecosystem or in the marketplaces as they mature: in an ILC (Innovate-Leverage-Commoditize) cycle, extensions and marketplaces likely generate most of the innovations, which are then gradually integrated into the main UX through continuous institutionalization. This cycle never stops: as the main UX grows it may bureaucratize and become too monolithic and big, making the case for it to break down into smaller niches of the market, with a subsequent further specification of the domain model, effectively giving space to the birth of further, more vertical, ecosystems.

Outcooperating the competition

In a usual context we would see a single organization investing widely, iterating fast and creating a main UX and a data-infrastructure layer on top of a proprietary domain model.

This company would likely start by providing a single-user value proposition in the main UX and gradually introduce marketplace features and extensions. Other trajectories also exist, although this would be the most likely one today. So when would a pluralistic, cooperative model be worth applying? Why should the traditional approach be overcome here? Which interfaces does it make sense to agree on?

One could argue that, if possible, agreeing on a shared protocol representing a common domain model would make sense to allow inter-system cooperation and interoperability. But in a world of winner-take-all markets such a proposition doesn’t make sense: if you’re competing for a certain niche and your value proposition depends on acquiring network effects, you shouldn’t focus on enhancing low-layer interoperability.

The existing financing and technological patterns that have pushed towards a winner-take-all perspective are, however, being challenged by several essential innovations. First of all, as we’ve briefly explained above, thanks mainly to decentralized finance and governance patterns and crypto-token design, new ways to finance early-stage ecosystem development in a more pluralistic way are emerging. Furthermore, the emergence of these technologies further reduces transaction costs, makes self-executing multi-party contracts (effectively partially autonomous organizations) possible, and puts into question the centralized approach to platform building and ecosystem weaving simply by making alternative ways possible.

Minimizing the attractiveness of exit by ensuring the stability of interfaces (a major concern of third parties that accept to produce under a platform enablement regime) would increase trust in the platform and would push the development of specific IP inside the extensions, while leaving the platform’s enabling services to be the basis of a larger ecosystem. The need to compete with the ecosystem would be minimized, while the need to compete “within the ecosystem” would still be present and create innovations that would be gradually captured and institutionalized by a trusted party (the main UX provider) inside a trusted governance and financial process (the evolution of the domain model). Considering that all interfaces would be open and externalizable, one could argue that competition would not disappear but simply move away from the interface. Interfaces and the domain model, instead, would effectively have to be managed under a “commons” regime and would require a governance process inspired by the eight key principles pointed out by Elinor Ostrom.

From this perspective, innovation would in no way happen through the governance process, which would instead be more of a guarantee of ecosystem stability and thrivability: the main UX providers would compete among themselves and with the extensions for user attention, but a lot of the network value would accrue in shared, non-enclosable spaces.

One could argue that the natural tendency of the market would push ecosystems to compete with each other and thus invalidate most of the ideas outlined in this article. On the other hand, standardization processes have always been part of the history of industry development, and of the Internet itself, and experiences in shared-governance ecosystems are growing, providing promising results in terms of growth and innovation enablement. The Open Compute Project (a shared-governance platform born to “apply the benefits of open source and open collaboration to hardware and rapidly increase the pace of innovation in, near and around the data center”) is now projected to intermediate and facilitate circa $12B of GMV in 2023. Uniswap, a decentralized exchange protocol that connects “developers, liquidity providers and traders” and lets them “participate in a financial marketplace that is open and accessible to all”, now enables almost $1B in transactions every day and is governed through the interplay of a public forum and a governance token (UNI) that was distributed to the stakeholders in the community at a certain issuance moment.

Based on early research and interviews, we foresee that, as a complement to goodwill and long-term commitment, creating interlocking financial incentives that increase every participant’s skin in the game in each other’s success may represent a promising way to keep the case for cooperation stronger than that for exit, forking, and competition. As an example, organizations developing extensions should not only be given access to the governance processes related to the domain model and protocol, but also the possibility to access equity of the main UX provider they decide to connect to. As we’ve anticipated above, a coupling between a particular main UX provider and an ecosystem of entities is to be expected: extensions, which evidently delegate some decision-making power to the main UX, may trade this loss of power for a stake in the success of the main UX they optimize for. These incentives, traditionally related to equity holding and transferred through complex and bureaucratic processes, are being streamlined through new technological approaches, of which token engineering is undoubtedly the most representative.

This story is an early release from the upcoming book “The Power of Ecosystems, Making sense of the new reality for organizations”, which will be published in September by Thinkers 50 as part of the Business Ecosystem Alliance program, of which Boundaryless is part.

Conclusions and further work

In this article we presented a view of how the emerging trends in tech, infrastructure, and ecosystem development are gradually making a different approach to ecosystem building possible. This approach is more integrative and cooperative: it enables competition for innovation in certain spaces while incentivising cooperation and the creation of shared innovation through interface standardization, shared governance, and new types of financial incentives. The emergence of new technologies is making these directions possible, and case studies confirm that this direction already constitutes a viable approach to ecosystem building. Such an approach is advisable to incumbents and upstarts that intend to weave long-term ecosystemic initiatives and embrace a perspective of openness and long-termism rather than land grab and exploitation.

This research stream emerged as, at Boundaryless, in collaboration with other entities, we’re exploring the opportunity to create a software-powered ecosystem around a common protocol of organizing, embracing a long-term and cooperative approach that can outcooperate the competition.

Next Steps: Building a software ecosystem on a shared model

We’re building a software ecosystem according to these principles to enable new ways of organizing. To stay tuned and express your interest in the project, please check out this form.
