I. Introduction
The Internet of Things (IoT) is positioned at the intersection between the physical and digital worlds in a manner that is set to have a profound influence on the world economy.Footnote 1 Global IoT revenue is projected to increase by 301.5 billion euros over the next eight years, reaching 408.7 billion euros in 2030, with almost 8 billion devices interconnected.Footnote 2 This new wave goes beyond smart consumer devices, as it involves the systemic use of sensors for industrial applications, spanning remote monitoring of equipment to factory automation and healthcare services.Footnote 3 In fact, business-to-business solutions are expected to account for nearly 62% of overall IoT revenue by 2030.Footnote 4
However, the economic features at the basis of the rapid adoption of the IoT may represent a double-edged sword if not properly addressed by policymakers. No matter how good a smart device may be, it remains useless outside the context of a digital ecosystem. Since the IoT is a network made up of physical and virtual things that are seamlessly connected, in the absence of smooth interoperability with the rest of the network, any device is likely to lose much of its utility.Footnote 5 Furthermore, IoT environments are possible as long as all sorts of devices can be interconnected and can exchange data in real time. Therefore, the ability to gather and access different data sources is crucial in order for IoT innovation to thrive. As a result, access to data and data-sharing practices have attracted the attention of policymakers and enforcers as key factors for unlocking competition and incentivising innovation.Footnote 6
The European Union (EU) has been a forerunner in promoting the free flow of data to enable inter-platform competition with a broad array of heterogeneous legislative initiatives.Footnote 7 While the General Data Protection Regulation (GDPR) enshrined a general data portability right for individuals,Footnote 8 the Second Payment Services Directive (PSD2) introduced a rule on sector-specific access to account data in favour of specific players;Footnote 9 meanwhile, the Regulation on the free flow of non-personal data has facilitated data-sharing practices in the commercial arena (business to business).Footnote 10 In the same vein, the Commission introduced the Open Data Directive with the goal of putting government data to good use for private players,Footnote 11 and it launched the Data Governance Act to promote the voluntary sharing of data by individuals and businesses and to harmonise conditions for the use of certain public-sector data.Footnote 12
Nevertheless, the exponential growth of mobile ecosystems and large technology platforms within IoT environments has demonstrated that the competitive landscape has not evolved according to policymakers’ expectations.Footnote 13 Digital ecosystems, built around an operating system running on mobile devices, have emerged as digital infrastructures within which a huge number of IoT interactions take place. Therefore, the digital economy seems to be moving away from being a market for information in which consumers move their data backlog from one provider to another. Indeed, major concerns regarding data lock-in have been raised in recent European data strategy initiatives.Footnote 14
As a consequence, policymakers have gradually moved towards a different approach intended to nurture competitive dynamics within and between platform-based ecosystems. With this aim, interoperability requirements are taking centre stage.Footnote 15 Indeed, any lack of interoperability between providers may act as a technical barrier, making it harder for users to switch and multi-home.
Notably, in its data strategy, the European Commission announced the establishment of EU-wide common, interoperable data spaces in strategic sectors to overcome legal and technical barriers to data sharing.Footnote 16 Furthermore, the Commission has identified the lack of interoperability as a crucial element for the exploitation of data value, especially in the context of artificial intelligence deployment.Footnote 17 In addition, as of November 2022, the Digital Markets Act (DMA) has entered into force, introducing, amongst its other provisions, interoperability obligations for online platforms holding a gatekeeper position.Footnote 18 Notably, the DMA envisages horizontal interoperability for basic functionalities of number-independent interpersonal communications services and vertical interoperability obligations to allow the installation of third-party app stores and the sideloading of apps and to ensure access to essential functionalities of the operating system or hardware capabilities of a given device.Footnote 19
With specific regard to IoT environments, the relevance of platform ecosystems and interoperability was underlined by the European Commission in a recent sector inquiryFootnote 20 and in the proposal for a Data Act.Footnote 21 In the former, on the premise that the market viability of IoT service providers and smart device manufacturers depends on smooth access to dominant technology platforms, the Commission argued that both horizontal and vertical interoperability among consumer services, smart devices and technology platforms play pivotal roles in unlocking the full potential of IoT ecosystems and preventing any lock-in to a certain provider’s products.Footnote 22 In the latter, in pursuit of the overarching goal of ending the de facto exclusive control over personal and non-personal information enjoyed by manufacturers of data-collecting devices, the Commission acknowledged that the right to data portability enshrined in the GDPR is inherently unfit to deliver on the pro-competitive expectations voiced by academics and policymakers.Footnote 23 Furthermore, according to the Commission, the absence of an obligation to create technical interfaces for automated and continuous data flows in the context of the IoT “can make it hard to offer certain services that require real time data flows, leading to lock-in situations for data subjects and hampering the development of innovative services based on access to such data”.Footnote 24 However, the proposal supports the adoption of open interoperability specifications and standards only to facilitate switching between data-processing services.Footnote 25
The role of interoperability was finally confirmed with the launch of the proposal for a European Health Data Space (EHDS), the first common data space in a specific area to emerge from the EU strategy for data.Footnote 26 Indeed, the Commission highlighted the absence of binding or compulsory standards across the EU and the consequently limited interoperability.Footnote 27 More recently, EU institutions committed to promoting interoperability and open standards in the European Declaration on Digital Rights and Principles for the Digital Decade.Footnote 28
As far as any type of interoperability is concerned, standards are decisive in enabling the broad complementarity of products and services. The reliance of IoT applications on seamless interoperability and continuous access to different data sources requires efforts to be made in the field of standardisation. Indeed, IoT devices are integrated by means of intelligent interfaces to develop smart environments in which each item is able to exchange data in order to improve the customer experience.
To this end, by analysing the limits and potentials of interoperability obligations in fostering competition and innovation across the IoT economy, this article aims to assess whether and to what extent the experience of the consumer financial data-sharing regulatory framework may provide useful insights. In particular, noting that Open Banking shows similar competitive dynamics and interoperability features, we argue that this experience developed in the UK could serve as best practice for implementing workable interoperability in the IoT.Footnote 29
The paper is structured as follows. Section II summarises the recent literature on interoperability. Section III analyses the role of standards in ensuring the effectiveness of interoperability initiatives and the significant trade-offs faced by policymakers. Section IV illustrates the Open Banking experience and the different approaches taken towards application programming interface (API) standardisation in the retail payment market. Section V puts forward a proposal for delivering workable interoperability in the IoT landscape to ensure market contestability without threatening incentives to innovate. Section VI concludes.
II. Interoperability: definitions and related literature
Since the concepts of interoperability and portability have been widely used in different contexts and are not necessarily always well distinguished, it is worth providing a clarification of their respective features and implications.Footnote 30
Interoperability can be broadly defined as the ability of two or more products or services to work together despite differences in interface, execution or coding language.Footnote 31 More specifically, full interoperability involves the interchangeability, compatibility and usability of services and products offered by different companies.Footnote 32 Indeed, as noted by Crémer, de Montjoye and Schweitzer, three sub-categories of interoperability can be identified in addition to the concept of data portability.Footnote 33 First, protocol interoperability (or partial interoperability) refers to the ability of different products and services to work together in a complementary fashion. This form of interoperability allows smooth interaction between different and potentially complementary services or products provided by different manufacturers. Second, full protocol interoperability ensures that two or more substitute services can interoperate through a more fundamental interconnection and alignment of features (such as messaging systems or mobile ecosystems). Partial and full protocol interoperability should be understood as existing on a continuum.Footnote 34 While the former requires that only some or a subset of all features are shared with other players, the latter involves a deeper level of integration and standardisation between operators. Third, data interoperability refers to the ability to share and access data on a continuous, often reciprocal, real-time basis (usually through APIs).Footnote 35
Conversely, data portability is the ability to port a bulk of data created by an individual during the use of a service from data holder A to data holder B. As clarified by Schnurr, data portability differs from data interoperability because it comes with a one-off transfer at a specified point in time.Footnote 36 In order to be effective and enable consumer switching, data portability requires that the information is available in a structured, commonly used and machine-readable format but does not require the systemic use of APIs.
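The distinction drawn above can be made concrete with a minimal sketch (all function and field names are hypothetical, chosen purely for illustration): a portability export hands over the full data backlog once, in a machine-readable format, whereas data interoperability exposes a continuously pollable interface through which only new records flow.

```python
# Illustrative sketch: one-off portability export vs continuous,
# API-style data interoperability. All names are hypothetical.
import json
from datetime import datetime, timezone


def export_portable_archive(records: list) -> str:
    """Data portability: the whole backlog is dumped once, at a specified
    point in time, in a structured, machine-readable format (here JSON)."""
    archive = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "records": records,
    }
    return json.dumps(archive)  # handed to the user or the new provider


def stream_interoperable_updates(records: list, since: str):
    """Data interoperability: a simulated API endpoint that a third party
    polls continuously, receiving only records created after `since`."""
    for record in records:
        if record["created_at"] > since:
            yield record
```

The first function models GDPR-style portability (a single structured transfer); the second models the automated, continuous data flow that real-time IoT services would require.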
An additional clarification regards the distinction between vertical and horizontal interoperability.Footnote 37 While the former (ie interoperability within a platform/ecosystem) enables downstream integration across the value chain between complementary products as well as within a digital infrastructure provided upstream, the latter (ie interoperability between platforms/ecosystems) requires that similar services and products can smoothly interoperate between each other so as to share direct network effects. As such, horizontal interoperability can be conflated with full protocol interoperability, whereas vertical interoperability relies to a large extent on protocol interoperability.
The case for horizontal and/or vertical interoperability has become a much-debated issue regarding competition policy in digital markets. Indeed, whether horizontal interoperability represents the proper regulatory solution to ensure effective data sharing and promote technological innovation is controversial.
Notably, on the premise that interoperability increases network effects for all players, Crémer, Rey and Tirole argue that interoperability can level the playing field between small and large players, hence increasing contestability and competition.Footnote 38 More recently, Crawford et al maintain that such obligations are necessary to avoid the economic risks arising from firm-specific network effects (eg lock-in dynamics and market tipping).Footnote 39 However, Bourreau points out that interoperability and multi-homing may represent two substitute means to enhance competition and improve contestability in digital markets.Footnote 40 More specifically, mandating horizontal interoperability can have ambivalent impacts on competition.Footnote 41 Bourreau and Krämer show that, as homogenisation restricts differentiation and innovation opportunities, mandating horizontal interoperability could reduce existing incentives for multi-homing and ultimately hinder competition.Footnote 42 By analysing the consolidation processes used by US securities clearinghouses and depositories, Awrey and Macey argue that horizontal interoperability requirements could help dominant firms to take advantage of a frictionless environment with low barriers to entry, ultimately entrenching their monopoly power.Footnote 43
Furthermore, if multi-homing is possible without significant costs, the potential welfare gains from horizontal interoperability obligations are limited in digital markets where innovation is occurring at a fast pace.Footnote 44 Moreover, forcing market players who have already developed their own services to implement new horizontal interoperability obligations would be highly costly, complex and time-consuming, especially in terms of regulatory monitoring and enforcement.Footnote 45 Finally, as interconnected networks should use the same encryption method, horizontal interoperability may affect security and privacy.
In light of these considerations, concerns were raised by the European Commission itself over the proposal to make number-independent interpersonal communication services and social network services offered by gatekeepers horizontally interoperable under the DMA.Footnote 46 Alongside several related technical issues, it is argued that such a measure would hinder innovation, incentives to invest and service differentiation, ultimately damaging consumer welfare.
III. The role of standards
A second layer of complexity involves the relationship between interoperability and standards.
Interoperability relies on standardisation in order to be effective. Different manufacturers and application developers can benefit from open access and interoperability requirements as long as technical protocols and APIs are designed in a homogeneous way, guaranteeing effective data access and smooth interconnection with the underpinning digital infrastructure. Standards serve exactly this purpose by providing a set of technical rules and characteristics that allow devices not only to connect and integrate, but also to ensure the quality and security of IoT interactions. Moreover, from the perspective of competition policy, well-designed standards prevent the risk of third-party providers being surreptitiously undermined in comparison to proprietary services through lower levels of interoperability.
When it comes to the design and implementation of standards, two main distinctions arise.
With regard to implementation, standards can be open or proprietary, depending on whether developers retain control over access to and implementation of the underlying technologies. While the former are open source and freely available to manufacturers and service providers willing to enter the market with interoperable products, proprietary standards often require a licence for the intellectual property owned by developers and may also come with proprietary enhancements for administering access to specific market niches.
Depending on how they are developed, standards can be industry-led or formal. The former are designed by market players which (independently or collectively) voluntarily agree to define common procedures and characteristics for products and services in a timely fashion. When these solutions are broadly adopted by market participants, they are considered de facto standards, as their implementation is substantially unavoidable for manufacturers. Conversely, formal standards are developed with a top-down approach by standard development organisations (SDOs) officially appointed by policymakers and regulators.Footnote 47 They are usually developed via processes that are transparent and open to broad participation by the industry and by stakeholders under the coordination and rules set by the competent SDOs. While formal standardisation prioritises consensus and social welfare implications over efficiency, industry-led standards focus on agility, speed and the need for a positive reaction from market participants in order to succeed.
The policy choice between voluntary and mandatory standardisation initiatives involves relevant trade-offs. Technical expertise and flexibility considerations support bottom-up spontaneous standardisation processes. Moreover, there is a risk that mandating specific solutions in markets at an early stage of development may undermine innovation and lower incentives to invest.Footnote 48 On the other side, technology fragmentation or the lack of open standards available on the market, as well as the need to ensure prompt and effective data sharing, may justify top-down interventions and mandatory standards.Footnote 49 Indeed, standardisation may be a complex and time-consuming process because of the difficulties in reaching an agreement when several players are involved with different and potentially conflicting incentives.
These trade-offs also emerge from the recent EU Commission’s IoT sector inquiry, in which the majority of participants expressed the need to prioritise standardisation over proprietary solutions in order to guarantee higher levels of interoperability.Footnote 50 The Commission noted that the IoT standardisation environment is strongly heterogeneous, as smart devices and services rely on a mix of protocols, open standards, open sources and proprietary technologies, depending on the different technology layers implemented in smart devices.Footnote 51 While formal standards prevail only at the level of basic connectivity technologies (eg Bluetooth and Wi-Fi), de facto standards have emerged in the field of operating systems, wearable devices and user interfaces.Footnote 52 Technology fragmentation and de facto standards can exacerbate the costs and complexity of interoperability. As things stand, manufacturers risk having to make redundant investments in order to comply with heterogeneous APIs and certification processes, ultimately leading to the poor reusability of technical solutions and imposing major hurdles to product innovation.Footnote 53 Furthermore, recent competition inquiries into payment and financial services outlined the risk that API standard fragmentation could translate into higher barriers to entry for new entrants.Footnote 54
Therefore, the Commission, on the one hand, has praised open standardisation and encouraged dialogue and interaction between IoT players in order to develop industry-wide standards,Footnote 55 but, on the other hand, has recognised that intense standardisation activities by a high number of competing SDOs and private partnerships/industry organisations in the IoT sector might also lead to a lack of transparency and ultimately undermine interoperability.Footnote 56
In this scenario, European policymakers have not taken a clear stance towards standardisation. Notably, while acknowledging the importance of interconnection for the flourishing of competition in IoT and data-enabled environments, there is no clear indication as to how standards should be developed and implemented in order to ensure workable interoperability across digital markets. Indeed, the DMA merely states that, “where appropriate and necessary”, the Commission may mandate European standardisation bodies to develop appropriate standards.Footnote 57 With regard to number-independent interpersonal communications services, gatekeepers are obliged to provide the necessary technical interfaces or similar solutions that facilitate interoperability, upon request and free of charge.Footnote 58 Similarly, the Data Act proposal rules out the possibility of mandating the adoption of technical standards or interfaces. However, it provides the Commission with the power to delegate the adoption of European harmonised standards for the interoperability of data-processing services.Footnote 59 Furthermore, the Commission pledges to adopt common specifications by way of implementing acts in case harmonised standards are missing or existing standards are insufficient.Footnote 60
On a related note, the UK Open Banking initiative stands out as one of the most advanced cases of mandated interoperability in the digital economy.Footnote 61 Open Banking is commonly seen as a secure environment that allows consumers to share bank transaction data with trusted third parties who can analyse such information to offer them new services or make payments on their behalf. Therefore, it has the potential to strengthen the consumer bargaining position in relation to financial service providers by facilitating choice and improving the quality of data-enabled products.Footnote 62
The global attention gained by the Open Banking project convinced the UK Government to expand third-party data access and API standardisation to a broader range of financial services and products, thereby launching the Open Finance project.Footnote 63 This initiative is part of the broader Smart Data strategy under which the UK Government is looking to expand data access tools in all regulated markets.Footnote 64
Against this background, this paper aims to contribute to the literature on interoperability and standards by assessing whether the Open Banking standardisation approach may serve as a valuable blueprint for tackling some of the competitive issues underpinning IoT business environments.
IV. The Open Banking experience
Following a review of retail banking, in 2017 the UK Competition and Markets Authority (CMA) found that incumbent banks, which had consistently retained an 80% market share of the retail banking market over the years, were exploiting consumer inertia and barriers to entry in order to enjoy monopolistic rents, jeopardising data-enabled innovation.Footnote 65 In order to tackle these competitive weaknesses, the CMA made full use of its market investigation powers to ease the functioning of the access-to-account rule enshrined in the PSD2.
This piece of legislation mandated banks to allow customers to share their account transaction data with trusted third parties but left them free to do so while adopting the method they preferred.Footnote 66 As a result, access-seekers would have to face major transaction costs as they would either have to develop applications working with many different API standards or rely on technical service providers to interoperate with different banks. Moreover, the incumbents had a clear incentive not to cooperate in the implementation of the access-to-account rule and to keep their own infrastructure as closed as possible to potential rivals.Footnote 67
To address these concerns, the CMA required the nine major banks in Great Britain and Northern Ireland to agree on common and open API standards, data formats and security protocols that would allow new entrants to calibrate their applications according to a single set of specifications.Footnote 68 The CMA entrusted a special-purpose organisation (the Open Banking Implementation Entity) with the task of reaching an agreement between the banks, consumer representatives and fintech third-party providers on the appropriate standards for implementing financial data access. The CMA also appointed an Implementation Trustee, having the power to impose binding decisions on all nine major banks subject to the order in the case of no deal.Footnote 69 As such, this remedy imposed full protocol interoperability with reference to the payment data infrastructure of banks and other account providers.
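The economics of the CMA’s common-standard remedy can be sketched in a few lines of illustrative code (every class and method name here is hypothetical, not an actual Open Banking API): once all account providers implement one open specification, a new entrant writes a single client rather than one bespoke integration per bank.

```python
# Illustrative sketch of a common, open API standard. With one shared
# interface, an access-seeker needs a single code path for every bank,
# instead of N integrations against N proprietary interfaces.
from abc import ABC, abstractmethod


class AccountAPI(ABC):
    """The agreed open standard: every account provider exposes the same
    methods, data formats and security protocols."""

    @abstractmethod
    def list_transactions(self, account_id: str) -> list:
        ...


class BankA(AccountAPI):
    def list_transactions(self, account_id):
        return [{"account": account_id, "amount": -12.5, "currency": "GBP"}]


class BankB(AccountAPI):
    def list_transactions(self, account_id):
        return [{"account": account_id, "amount": 40.0, "currency": "GBP"}]


def aggregate_balance(providers: list, account_id: str) -> float:
    # A third-party provider's single integration, valid for all
    # standard-compliant banks.
    return sum(
        tx["amount"]
        for p in providers
        for tx in p.list_transactions(account_id)
    )
```

Without the common abstract interface, `aggregate_balance` would need provider-specific adapters for each bank’s bespoke API: precisely the transaction costs the CMA order was designed to eliminate.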
The common standard approach allowed the UK to gain a leading position in the worldwide adoption of financial data sharing.Footnote 70 Significantly, the CMA order secured the adoption of Open Banking by preventing incumbents from delaying and frustrating the implementation of the access-to-account rule. Furthermore, account providers that were not subject to the remedy also decided to comply with PSD2 by adopting free API standards rather than developing their own. As of August 2021, there were 119 firms with live-to-market Open Banking-enabled products and services, while the Open Banking ecosystem accrued approximately 3 million users in Great Britain and Northern Ireland.Footnote 71
After taking stock of the UK experience, Australia introduced an even more ambitious economy-wide data-sharing framework (ie the Consumer Data Right; CDR), which gives consumers the right to share their data between any kinds of service providers of their choosing.Footnote 72 In 2019, this regime was initially implemented within the banking sector. The Australian Competition and Consumer Commission required the four major banks in Australia to share product reference data with accredited data recipients and mandated the adoption of a single set of API standards for data sharing.Footnote 73 Furthermore, the Australian Government established the Data Standards Body to deliver open standards supporting the CDR. Within this entity, various working groups open to participation by any stakeholder are now designing and testing open standards.
Given the UK and Australian experiences, many other jurisdictions have developed a considerable interest in following suit. Conversely, the EU has refrained from publicly mandating API standardisation and has left banks free to come up with their own data-sharing interfaces or to take part in privately led standardisation initiatives.Footnote 74 The underpinning rationale of this choice hinged on the concern that a common API standard could jeopardise innovation and dynamic competition between standards. However, when launching the Digital Finance Strategy and the Retail Payments Strategy in 2020, the European Commission recognised that the lack of API interoperability hindered newcomers, and so it committed to establish an Open Finance framework by the end of 2024 as well as to review the PSD2.Footnote 75
V. Open Banking as a blueprint for IoT interoperability regulation?
Some elements suggest that the Open Banking approach could serve as best practice for implementing workable interoperability in the IoT universe.
In particular, Open Banking resembles the IoT in terms of its competitive dynamics and interoperability features, as it builds on smooth data-sharing mechanisms and connections between different service providers in the context of vertically integrated platforms. Indeed, similarly to the retail payment market, the IoT sector encompasses various service providers connected to each other, and the role of intermediaries (either banks or leading technology platforms) enabling machine and user interaction is of the utmost importance for the network to thrive.
Furthermore, information is a key input not only for competing in financial services, but also for designing and producing smart products that can meaningfully interoperate within leading digital ecosystems. Therefore, the type of information that leading technology platforms and financial institutions hold and the way they use it are pivotal for the potential flourishing of the IoT and financial technology. Given the gateway role that such incumbents may play in the viability of data-enabled products and services, a data bottleneck problem may affect both IoT environments and retail financial markets. In this respect, users’ data may sometimes constitute a significant barrier for newcomers wanting to enter either the retail financial or the IoT markets.
Moreover, both the retail financial and the IoT sectors feature fragile market dynamics that are greatly dependent on users’ trust. Therefore, the ability of infrastructure orchestrators to maintain high levels of cybersecurity, personal data protection and user engagement is particularly relevant.Footnote 76 Ultimately, the financial and IoT universes are multi-sided markets where technical failures and unexpected changes can trigger crises of confidence, leading to death spirals for platform-based business models. Thus, both environments share a common need to avoid degradation of network quality due to poorly secured interoperability mechanisms.
The suggestion of looking at Open Banking as a useful case study for IoT interoperability and standardisation is supported by a recent market study conducted by the CMA, which explicitly referred to Open Banking as a model to ensure the successful rollout of electric vehicles (EVs) and related charging infrastructure.Footnote 77 Indeed, since EVs are considered “physical products that obtain, generate or collect … data concerning their performance, use or environment and that are able to communicate that data via publicly available electronic communications services”, they are explicitly included in the scope of the Data Act. EV smart charging requires effective data sharing among several stakeholders. A lack of interoperability due to differing standards limits consumer choice about where to charge and how to pay. Therefore, similarly to what has happened in IoT environments, the main regulatory challenges relate to the policy choices involving interoperability and standardisation. In particular, policymakers have been called upon to assess the option of mandated rather than spontaneous API adoption by providers in order to deliver effective interoperability.Footnote 78
However, since interoperability is context dependent, some important differences between retail financial markets and the IoT should not be overlooked. The banking industry is indeed much more mature than the IoT, and so mandated interoperability was justified by the empirical identification of a market failure. Furthermore, the banking industry is characterised by a relatively high number of legacy incumbents. Hence, if each bank were allowed to adopt different data-sharing interfaces, newcomers would face extremely high transaction costs.Footnote 79 Conversely, the IoT has witnessed a much quicker worldwide consolidation process, and the market viability of IoT service providers and smart device manufacturers currently depends on smooth access to a small number of dominant technology platforms (ie Alphabet, Amazon and Apple).Footnote 80
For these reasons, we advance a proposal for ecosystem-tailored standardisation that relies on the Open Banking approach but introduces adaptations reflecting the specific features of the sector at issue.
Starting from the premise that common certification procedures, APIs and open standards are needed to ensure smooth data access and interoperability, we suggest that IoT ecosystem orchestrators should be asked to engage in standardisation processes that are transparent and open to broad participation by the industry and by stakeholders. Nonetheless, such a regulatory initiative should aim at ensuring interoperability within each ecosystem (rather than among ecosystems) in order to preserve differentiation and incentives to innovate. Therefore, leading IoT ecosystems should be required to implement vertical interoperability.
As orchestrators of key bottlenecks within the IoT economy, they can restrain other firms’ ability to benefit from network effects and obtain unchallenged access to users’ data. Furthermore, they are in a position to determine the requirements and certification processes by which access to and interoperability with their ecosystems take place. Moreover, as they are usually vertically integrated and compete with third-party providers over their own platforms, such a dual role may incentivise them to give preferential treatment to their own products and services (so-called self-preferencing) by restricting interoperability for third-party services and devices. In this scenario, vertical interoperability would promote complementary innovation and the modular combination of services across the value chain.
Conversely, horizontal interoperability should be rejected as it could cause unintended consequences in terms of inter-ecosystem competition and dynamic innovation.Footnote 81 Indeed, it would force firms to provide substitutable services and products, preventing differentiation between ecosystems for the sake of delivering a level playing field among providers. In addition, horizontal interoperability could raise data security issues.
Moreover, digital ecosystem orchestrators should be required to design open interoperability standards together with third-party providers and manufacturers under the strict oversight of a publicly appointed supervisor. In the event of a failure to reach a compromise between the different stakeholders, the competent supervisory body would have the power to impose a solution on all parties. Similarly, the competent supervisor should be entrusted with the task of overseeing compliance over time by both leading technology platforms and third-party manufacturers and service providers. As the UK Open Banking experience has already shown, there is a serious risk that incumbents could try to surreptitiously undermine data transfers, with the final goal of watering down the competitive potential of interoperability.Footnote 82
As long as public supervision and transparency are ensured, ecosystem-tailored standardisation could harness the skills and best practices developed by existing technical bodies. For instance, formal SDOs have already opened several work streams for the development of IoT standards both in the EU and at the international level. The European Committee for Standardization (CEN) and the European Committee for Electrotechnical Standardization (CENELEC) actively cooperate with the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) to develop standards facilitating interoperability and communication between voice assistants, wearable devices and consumer IoT services as well as smart home devices.Footnote 83 At the same time, several of the industry-led standardisation initiatives that have emerged over the last decade could offer useful technical guidance. In 2019, Apple, Google and Amazon established a working group (now named Matter) within the Connectivity Standards Alliance to launch a new, royalty-free connectivity standard enabling compatibility among a wide range of smart home devices.Footnote 84 Moreover, since 2014, the Thread Group alliance has been operating to provide network protocols to connect and control products for home automation.Footnote 85 Finally, in 2019, both Amazon and the Linux Foundation launched initiatives (the Voice Interoperability Initiative and the Open Voice Network, respectively) to facilitate multi-homing and interconnection between voice assistants.Footnote 86
Nevertheless, our proposal would be time-consuming and resource-intensive to implement for both market supervisors and leading platform orchestrators. Implementing vertical interoperability requirements and the ex post standardisation of existing digital ecosystems is an extremely complex, cumbersome and costly task that could face strong resistance from the firms involved. A potential way to deal with this issue would be to require technology platforms to contribute financially to ecosystem-tailored supervisory costs. As the UK Open Banking experience has shown, adequate implementation, team resources and skills are crucial to achieving the pro-competitive goal of ecosystem-tailored standardisation.
Furthermore, conceiving workable governance and enforcement of vertical interoperability obligations requires regulators to strike a delicate balance between the interests of incumbents and newcomers while dealing with limited public resources. The UK Open Banking experience, which was originally funded by incumbent banks, once again helps to illustrate this issue. As the implementation roadmap set out by the CMA was substantially completed by the end of 2022, the UK Government started considering how to ensure the supervision necessary to enforce the continuing obligations under the CMA’s Retail Banking Order, which allow customers to access their data securely and to benefit from financial data sharing. Footnote 87 For instance, the leading industry body for financial services (ie UK Finance) proposed that the nine largest banks be free to withdraw from membership (and funding duties) after three years. Footnote 88 This spurred a discussion because, according to several fintech firms, such a proposal would give incumbents unfair leverage over the new supervisor’s activity, especially when it comes to the oversight of interoperability requirements and standardisation initiatives. Footnote 89
However, a substantial element differentiates our model from that of the UK Open Banking experience. Our proposal would not lead to a one-size-fits-all solution as the standards would be tailored to the specific features of each ecosystem. Indeed, as the IoT encompasses a wide range of heterogeneous products and services interconnected within diverse digital ecosystems, it would not be appropriate to impose a single set of interoperability standards on the whole sector. Rather, the Open Banking paradigm could serve as a reference for delivering vertical interoperability tailored around the features of major digital ecosystems in the IoT universe. Adopting an ecosystem-based approach to standardisation would foster dynamic innovation and ecosystem diversification as platform operators would not be bound to deliver homogeneous offers. Moreover, interoperability could work smoothly in a vertical fashion, thereby facilitating ecosystem entry by newcomers and reducing the risk of technological self-preferencing.
Overall, our proposal is set to complement the existing EU regulatory framework and to fill the gaps left open by the proposed Data Act, which only addressed interoperability for cloud service providers. Furthermore, by adopting a straightforward approach towards mandated standardisation to deliver on vertical interoperability, our solution also complements the DMA, which already introduced some vertical obligations for app store gatekeepers. Indeed, the DMA only empowers the Commission to require SDOs to facilitate the adoption by digital gatekeepers of appropriate standards insofar as this is necessary to implement the vertical interoperability requirements needed to install third-party app stores and to sideload apps, as well as to ensure access to the essential functionalities of operating systems or the hardware capabilities of a specific device. Moreover, our proposal would draw on the technical toolbox recently established by the Data Governance Act. As the European Data Innovation Board is entrusted with the fundamental task of identifying the relevant standards and interoperability requirements for cross-sector data sharing, it could serve as a foothold for launching the implementation of ecosystem-tailored standardisation.Footnote 90
VI. Concluding remarks
The IoT is likely to bring about substantial changes across a large portion of worldwide economies. An increasing number of products require interconnection within larger networks of smart devices; therefore, smooth access to and interoperability with digital ecosystems are essential for third parties to engage within the IoT universe. At the same time, by exerting control over the interface between final users and application developers as well as device manufacturers, the orchestrators of these ecosystems may engage in anti-competitive behaviours and undermine the economic potential of the IoT.
To address these concerns, a wave of regulatory initiatives has progressively emerged in recent years targeting the strategic role played by large platform-based digital ecosystems. In such a scenario, interoperability obligations are taking centre stage, promising to put an end to those network effects that work only in favour of the most prominent digital ecosystem owners. However, given that interoperability is context dependent, interventions should build on a careful assessment of whether it is more desirable to aim for vertical rather than horizontal interoperability, as well as whether it would be preferable to rely on mandatory rather than voluntary standardisation.
Against this background, we put forward a twofold proposal. First, we argue that competition-orientated reforms in the IoT should aim to deliver vertical (within-ecosystem) interoperability. Horizontal (between-ecosystems) interoperability would, indeed, threaten platform design and governance, thus jeopardising business models, preventing their differentiation and ultimately reducing incentives to innovate. Second, we sound a note of caution against poorly designed legislative measures that fail to address the role of standardisation for delivering interoperability. In this regard, by taking stock of the Open Banking experience, we argue that industry-led standardisation under the oversight of independent public bodies is the correct solution for tackling interoperability challenges in the IoT universe. Under our proposal, leading IoT technology platform orchestrators would be expected to design open interoperability standards together with third-party providers and manufacturers. In this way, the hurdles of formal standardisation processes could be overcome while countering the risks of de facto standards being conveniently developed under the control of large technology platforms.
By implementing ecosystem-tailored standardisation, policymakers could strike a reasonable balance between the need to ensure contestability in digital markets and the overarching goal of preserving consumer welfare and innovation.
Competing interests
The authors declare none.