1 Introduction

Data portability (the ability to transfer data to others without affecting its contentFootnote 1) and interoperability (the ability to integrate two or more datasetsFootnote 2) have the potential to beneficially affect the use of data, while nevertheless raising important issues for antitrust policy. This paper lays out the pros and cons of a move towards requirements of data interoperability and portability, comments on regulatory versus competition-policy approaches to managing interoperability and portability, and contrasts the use of these approaches in the U.S. and in the E.U.

Currently, it can be difficult for users to move personal data in a social network to a competing service.Footnote 3 Allowing for improved data portability could facilitate the ability of consumers to switch services, which would, in theory, substantially increase competition. To illustrate, portability might allow a Facebook user to connect with users of other social networks, irrespective of their initial social network provider.

Unfortunately, there is a downside. Barriers to data portability and interoperability can be a major source of social inefficiency (Borgogno & Colangelo, 2019). To be specific, there is a real concern that barriers to data sharing could result in the balkanization of data within sectors, thereby not only impeding innovation within markets, but also reducing spillovers to other markets. Indeed, it is quite possible that private regulation could prevent the sharing of data that would otherwise be efficiency-enhancing (Tucker, 2019).Footnote 4 Moreover, as several commentators have pointed out, network effects can shift beneficial competition for the market to lock-in effects that limit competition in the market (Fletcher et al., 2021; Gulati-Gilbert & Seamans, 2023).

Some technological obstacles to widening the use of data can be overcome when data are standardized (Gal & Rubinfeld, 2017). On the plus side, standardizing data so that they are portable can lead to smoother data flows, better machine learning, easier policing of infringement, and fewer adverse effects of data-fed algorithms. Standardization might also support a more competitive and distributed data collection ecosystem. At the same time, increasing the scale and scope of data analysis can create negative externalities in the form of better profiling, increased harms to privacy, and cybersecurity risks.

Ultimately, whether the push for portability and interoperability will require a more interventionist role for competition or regulatory authorities is an open question. Importantly, the responses to the issues just raised currently differ between the U.S. and the E.U. Characterizing these differences and commenting on their normative implications is the focus of this short paper. The paper begins in Sect. 2 with a description of the particularities that flow from issues relating to data analysis. Building on this, Sect. 3 offers a description of the pros and cons of portability and interoperability, whether through the enforcement of a detailed regulatory structure or through public antitrust enforcement. Section 4 follows with a discussion of the antitrust and regulatory implications that flow from this discussion. Section 5 offers a description (with commentary) of the distinctive approaches currently taken by the U.S. and by the European Commission. There is clearly a tension between the approaches taken in the U.S. vis-à-vis Europe, a tension that is informed by a description of the important differences between the federal systems in the U.S. and in the E.U. The paper suggests that the ideal structure would be one in which the E.U. relies on a powerful regulatory structure, whereas the U.S. relies more heavily on its two federal enforcement agencies and the common law of antitrust. Section 6 concludes.

2 Data: analysis and markets

To understand portability and interoperability issues, it is important to explore the relevant characteristics of data, data analysis, and data markets, as well as some technological obstacles to the use of data and data integration. While some types of data are not fungible,Footnote 5 other datasets can be relevant for multiple users, operating in a wide variety of markets (Stucke & Grunes, 2016; Lambrecht & Tucker, 2017). Furthermore, there are additional complexities when markets are two-sided. Take, for example, the market for Google search advertising, with consumers on one side and advertisers on the other (Ratliff et al., 2014). By filling the role of connector of the two sides of the market, Google has the potential to gain consumer information that is not available to its competitors, in the process generating the benefits that flow from interoperability. At the same time, big data has increased the ability of algorithms to reveal interesting relationships between attributes of datasets and to mine valuable knowledge for descriptive as well as predictive use.

In an ideal world, data would be transferable or replicable at a very low marginal cost. Interoperability can be achieved because data are divisible, non-rivalrous, and can potentially be integrated with other data. Moreover, when economies of scale and scope cannot be achieved by a single entity or by a single source of data, the integration of data through portability has the potential to substantially increase data’s predictive value.Footnote 6 In such cases, data portability and interoperability will be essential if the benefits of the integration of a large volume of data into a high-quality dataset are to be achieved. The challenge is to integrate data that are not necessarily similar in source or structure and to do so quickly and at a reasonable cost.Footnote 7

Competition for data collection, analysis, and storage, as well as competition in markets for data-based products or services, is affected by the extent of entry barriers at various points in the vertical chain, from manufacturer to wholesaler-distributor to retailer and finally to the ultimate consumer. It is not surprising that the demand for data has created an ecosystem of numerous firms that trade in data.Footnote 8 This, in turn, enables firms to use data collected elsewhere to enjoy the benefits of data-related scale and scope economies, via data portability and interoperability.

There are collaborative projects that have been directed towards improving data portability and interoperability. Founded by Google, Microsoft, Yahoo, and Yandex, schema.org is a collaborative effort to create, maintain, and promote schemas for structured data – in essence, to achieve data standardization, thereby increasing data interoperability.Footnote 9
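The basic idea behind schema.org-style standardization can be sketched in a few lines. The example below is purely illustrative – the field names and mappings are invented and are not schema.org's actual vocabulary – but it shows how renaming provider-specific labels onto a shared schema makes otherwise incompatible records interoperable.

```python
# Illustrative only: map two providers' ad-hoc records onto a shared,
# schema.org-style vocabulary so the datasets become interoperable.
# All field names here are hypothetical, not schema.org's actual terms.

def to_common_schema(record, field_map):
    """Rename a record's provider-specific keys to the shared vocabulary."""
    return {field_map[k]: v for k, v in record.items() if k in field_map}

# Two providers label the same underlying facts differently.
provider_a = {"uname": "Alice", "zip": "94720"}
provider_b = {"full_name": "Bob", "postal": "10027"}

map_a = {"uname": "name", "zip": "postalCode"}
map_b = {"full_name": "name", "postal": "postalCode"}

standardized = [to_common_schema(provider_a, map_a),
                to_common_schema(provider_b, map_b)]
# Once standardized, the records can be pooled into a single dataset.
```

The work, of course, lies not in the renaming itself but in agreeing on the shared vocabulary – which is precisely what efforts such as schema.org attempt to coordinate.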

Built on schema.org, datacommons.org is an open knowledge repository that combines data from public datasets using mapped common entities. The datacommons.org website covers many topics, from demographics and economics to emissions and climate change. To illustrate, climate change has important implications for personal health. As datacommons.org notes, increases in mean temperatures are global averages that have been aggregated over time. At every temperature level, there will be some places that become hotter and other places that become colder. As a result, climate change is not just about reducing carbon emissions. It is also about adaptation to the change that is already happening. As an illustration, the datacommons.org website provides a fascinating scatter plot of the expected peak temperatures in various countries against the fraction of people suffering from coronary ailments.

To take another example, Google Takeout allows users to export their data, including email, voice, text, and photos in an “industry standard” form.Footnote 10 By accommodating the porting of information of these various types, Takeout allows for more efficient data management and analysis.

There are three obstacles to achieving the substantial potential benefits that flow from portability and interoperability (Gal & Rubinfeld, 2019). The first involves metadata that describe the data included in a dataset. Metadata uncertainties limit others’ ability to understand what different data points signify (e.g., Does the label “address” relate to billing or to shipping?). These problems can increase information asymmetries regarding the content of datasets, thereby reducing incentives to engage in mutually beneficial data sharing.

The second limitation involves obstacles to data transformation, which can raise the costs of combining the available data into coherent datasets, the key to achieving data interoperability. One such obstacle results from data granularity, as when similarly attributed data are collected at different points in time. Another obstacle can arise from the need to reorganize data into a new, combined dataset with a different structure and/or internal organization.

The third obstacle involves missing data. This limitation arises when some necessary data are not collected, or when the costs of ex-post collection are prohibitive. Missing data may also result from the limited capacity of a database to store the data,Footnote 11 or from data collectors’ limited ability to foresee the value of data interoperability.
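The transformation and missing-data obstacles can be made concrete with a small sketch. The example below is invented for illustration: two sources report the same kind of measurement at different granularities, so the finer-grained source must be aggregated (losing detail) before the two can be merged, and any uncollected readings are simply gone.

```python
# Illustrative sketch of the transformation and missing-data obstacles:
# two sources report the same measurement at different intervals, so one
# must be aggregated (losing granularity) before the sets can be merged.
# All data are invented for the example.

daily = {  # source 1: daily readings (note: only 4 of 7 days were collected)
    "2024-01-01": 10, "2024-01-02": 12, "2024-01-03": 11, "2024-01-04": 13,
}
weekly = {"2024-W01": 50}  # source 2: weekly totals only

def to_weekly_total(daily_readings):
    """Aggregate daily readings to a weekly total (here: one known week)."""
    return {"2024-W01": sum(daily_readings.values())}

# Merging is only possible at the coarser (weekly) granularity.
merged = {
    "source_1": to_weekly_total(daily),
    "source_2": weekly,
}
# Any day absent from `daily` is simply lost: ex-post collection is
# usually impossible, which is the missing-data obstacle in miniature.
```

Even this toy merge requires a schema decision (weekly keys), an aggregation rule (summation), and an acceptance that the uncollected days cannot be recovered.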

These three limitations reduce users’ incentives and ability to extend the use of data and to achieve data synergies. Indeed, a European Commission study found that “merging different datasets and making them interoperable is one of the most resource-intensive activities for data (re-)users and that, even within the same value chain, datasets are rarely interoperable by default.” (Eur. Comm’n, 2015a at 89).

3 The benefits and costs of portability and interoperability

Making data portable and interoperable has the potential to reduce the obstacles to data use by others. Data portability measures can reduce user switching costs and reduce any frictions that arise when utilizing new services. Furthermore, these measures can increase the value of the data and limit duplication (of data collection, storage, and analysis).

Supporting the portability of different data sources also reduces investment risks associated with data collection, organization, and storage. By reducing data portability costs and enabling more market players to utilize data, data standardization may increase incentives for data sharing. Increased use of data may also facilitate cumulative and synergetic knowledge production.Footnote 12

Data portability can support a competitive and distributed data collection ecosystem. Not only can it increase the incentives of firms to collect and to share data, but portability can also increase the ability of firms to integrate different datasets and reduce the need to rely on one source for data, either internal or external. For example, Google may combine data regarding a user’s email, geo-location, and browser history, to better account for her preferences. It can also benefit from the creation of network effects. Other firms, which lack such a variety of data sources, may find it difficult to match these capabilities.

Unfortunately, the quality gap created by such network effects carries the potential for lock-in that can entrench or strengthen the dominance of some firms. As a result, data-based markets could exhibit highly concentrated structures, with a dominant firm possessing a massive share. Of course, the benefits arising from data collection and analysis that are not the result of artificial entry barriers do not necessarily raise antitrust concerns.

Interoperability measures can allow users to multi-home and as a result make markets more contestable. As the OECD points out (OECD, 2021, at 8), “[D]epending on their design, interoperability measures can promote competition among digital platforms, by allowing users to preserve network effects on new services, and within digital platforms by allowing users to mix and match different complementary services from different providers.” Ultimately, the resulting threat or actuality of increased competition can reduce the market power of economically powerful platforms, lowering prices paid, directly or indirectly, by consumers.

The difficulty of achieving scale may be overcome if competitors could combine data collected by numerous sources. The lower the costs and obstacles to data portability and interoperability, the stronger the potential competitive pressures on large data collectors. And, since data are non-rivalrous and often easily replicable, data collectors could share their data with many potential users, potentially strengthening competition even further.

At the same time, data interoperability measures, which require the setting of data standards, raise the risk of lock-in to an inefficient standard. To illustrate, assume that a data standard requires all medical data collectors to gather certain types of data at specified intervals, but these intervals are too far apart for the data to be meaningful. While data could be collected at shorter intervals, the standard might send a wrong signal as to the appropriate interval. Data standards can also impose high compliance costs on all market players, potentially countering some of the competition-driven portability benefits (Swire & Lagos, 2013; Lynskey, 2017; Gal & Rubinfeld, 2017). Finally, data standards can also negatively affect competition by raising some competitors’ costs,Footnote 13 and could make coordination and collusion easier (Ezrachi & Stucke, 2016).

Data portability and interoperability also raise privacy concerns. The easier it is to share data, the greater the concern that private data will fall into more hands (Swire & Lagos, 2013, at 335). Portability could also reduce the willingness of potential data subjects to allow their private data to be collected, thereby potentially affecting data collection and innovation.

Data portability can also affect cybersecurity.Footnote 14 To be specific, the integration of databases may enable security systems to detect patterns of suspicious activity more efficiently. Moreover, the scale of data may allow algorithms to learn rapidly from past patterns to detect future attacks (Tropina & Callanan, 2015). Yet, the more standardized the data, the easier it might be for hackers to access and use it. The potential harm becomes even greater to the extent that data portability enables the creation of larger, less-dispersed databases, given that the size of the dataset may be positively correlated with the potential harm from security breaches (Kerber & Schweitzer, 2017). Furthermore, an inefficient standard can reduce organizations’ ability to detect and respond to cyber threats and make its implementation costly. Finally, there remains the possibility that imposing mandatory portability and interoperability would adversely affect incentives to innovate.

The costs and benefits of requiring portability of data sets and making interoperability possible are likely to differ among various types of data or its uses. As a result, in some cases it may be better to prevent certain uses of data, including its sharing. In other cases, however, safeguards – legal, technological, or even cultural – can help ensure that the overall effects of portability are positive.

Of note, the OECD has recommended that portability measures are best applied to markets where, among other things: (i) the data in question can be used in clearly defined applications; (ii) there are no significant privacy or ownership concerns; and (iii) there is some competition, so that network effects and data-driven economies of scale do not preclude effective competition (OECD, 2021, at 26).

4 Antitrust enforcement and/or regulation?

Can we rely on the market to create and implement efficient mechanisms that support data portability and interoperability? In several settings, the answer is in the affirmative, given the large benefits to be had from portability and interoperability. Interestingly, private endeavors have focused mainly on data portability, rather than interoperability.Footnote 15 However, significant market failures may prevent socially beneficial data standardization, a vital prerequisite to achieving the benefits of portability and interoperability. Consider the world of music recordings, where songs and other types of music have been saved in a variety of formats (e.g., cartridges, CDs, audio tapes, digital audio recordings, etc.) that can be difficult for individuals to use together. Making the system interoperable would greatly expand the reach of music of all types.Footnote 16

There are several reasons for market failures that make data portability and interoperability difficult.Footnote 17 First, the incentives of different market players may differ, making it hard to create efficient standards. To elaborate, some market participants may favor the status quo, which preserves high switching costs, greater lock-in, and reduced data portability. The story is different for the large platforms – Google, Amazon, Facebook, and Apple – these incumbents enjoy data-based comparative advantages that cannot be easily matched by others. They have the potential to gain an advantage by finding ways to raise their rivals’ costs relative to their own, including by limiting data portability and interoperability.

Second, even if a standard for data portability and interoperability is voluntarily created, its content may serve the interests of some market players and not others. Concerns arise from the private interests of those involved in setting the standard, especially given the knowledge that competitive entry may involve substantial sunk costs. Moreover, the chosen standard may impose costs as well as benefits on the rivals of its creators (Besen & Farrell, 1994).

Third, collective action problems might lead market players to block the possibility of achieving portability and interoperability, even when it is beneficial for all to do so. In the absence of an arbiter, the market may be sufficiently fragmented that no single approach will gain critical support, leading to a patchwork of inconsistent standards that slow data flows (Gal & Rubinfeld, 2019, at 23).

In all cases there might be insufficient time for deliberation before the market sets its course. Most importantly, the uncertainty resulting from the fact that users cannot be assured that others will follow their move to a new standard creates a coordination problem. Coordination incentives could also be limited by a lack of knowledge among data collectors about potential data uses and concerns about the obstacles to integrating various types of data. Antitrust concerns, too, could limit incentives to standardize. Finally, the creation of efficient data standards might be inhibited by internal constraints, short-term strategic conduct, or historical legacies.

Even if data portability and interoperability serve the interests of all market players, private standard-setters may disregard the positive spillovers on data subjects, on firms in other markets, and on social welfare. An inherent tension also exists between temporal beneficiaries of data analysis: while tomorrow’s users may benefit from past data collection, their gains are not always easily shared with the current collectors of the data.Footnote 18

Market failures may also arise regarding the implementation of an acceptable standard for data portability and interoperability. There is arguably an important regulatory role in the acknowledgment, evaluation, and – in the right cases – the possible facilitation of data portability. The potential benefits from increased uses of data, as well as the costs accruing from the potential loss of international competitiveness and from the continuing use of a patchwork of (inefficient) standards, should act as a catalyst for data portability and interoperability issues to be seriously pursued.Footnote 19

The competition authorities in the E.U. and the U.S. are well positioned to analyze the pros and cons of data portability and interoperability. They have or can acquire the appropriate technical expertise. They are positioned to understand the implications of their decisions for all market players, to evaluate whether industry standards are economically efficient, and to assess whether the market could and would develop timely and efficient standards without governmental intervention. Furthermore, it is well understood that the case to be made for aggressive enforcement by the competition agencies or through a more traditional regulatory overlay will vary from industry to industry.

Creating an ecosystem of standards that can work in different contexts, and that can interoperate where required, is likely to require consultation with industry, or a coordinated governance process that includes the participation of market players. Both approaches build on the fact that market players often have substantial knowledge and understanding of existing technical needs and the merits of a variety of possible solutions.

Once it is established that allowing for data portability and interoperability will likely increase social welfare, it is important to facilitate the creation of efficient data portability and interoperability mechanisms and standards (Gal & Rubinfeld, 2019, at 26). Regulators face a range of options regarding how such mechanisms and standards can be set, each with its own costs and benefits. These include adopting private solutions, establishing standard-setting organizations (“SSOs”), or determining standards themselves (Gal & Rubinfeld, 2019, at 27).

The preferred regulatory model may also differ depending on the relative competence of different standard setters, the extent of the divergence between private and social interests, and the way in which such a divergence might affect the costs of portability. It seems that in most cases the delegation of authority to an industry-based SSO, comprised of professional data scientists, will be more advantageous than assigning these tasks to a new governmental entity. While regulators play an important role in determining when market failures prevent the creation of welfare-enhancing data standards, they generally have less competence in evaluating the standards that will work best in each market setting. Even when private SSOs are preferred, there will still be a need for a regulator to set and enforce some basic operational rules (Gal & Rubinfeld, 2019, at 27).

Once a data standard is agreed upon, the appropriate competition authority or regulator must decide how to facilitate its adoption. Options include setting best practices, mandating the adoption of data standards, and creating soft incentives for their adoption. It might come as no surprise that the Data Transfer Project undertaken in June 2018 by Microsoft, Google, Facebook, and Twitter, which sets a standard to enable user-initiated data portability among project participants, was initiated amidst increased calls for the government to rein in the power of large digital firms resulting from the control of data.Footnote 20

Finally, it is noteworthy that in some situations the government may have no choice but to set data standards if it is to make portability and interoperability work. This might be the case where the government collects and organizes data internally (such as meteorological, demographic, or legal data), or where it contracts with others to provide it with certain types of data (Gal & Rubinfeld, 2019 at 28).

5 The E.U. and the U.S.

To date, the U.S. and the E.U. have responded quite differently to the issues surrounding portability and interoperability. The E.U. has promulgated several relevant regulations. In contrast, the U.S. competition authorities have been active, but, despite a substantial effort by Congress, substantive regulations have yet to pass. This section begins with descriptions of the current situation in the E.U. and the U.S. Following this, the paper explains that this situation can be viewed as the result of each entity utilizing its own comparative advantage. This is to be expected, given that there are important differences between the U.S. and the E.U. federal systems.Footnote 21

5.1 The E.U. regulatory overlay

The E.U. has pursued an active regulatory approach, leaving as secondary the application of its competition laws. Consistent with this perspective, the European Commission’s Expert Panel on Competition Policy for the Digital Era has suggested that the competition law approach be reserved for cases in which data transfer arrangements or APIs can be standardized (Cremer et al., 2019, at 107). And, in part in response to concerns relating to portability and interoperability, the European Commission has promulgated its General Data Protection Regulation (“GDPR”). The GDPR puts into place a highly centralized regulatory overlay under which the Competition Directorate and the member states can manage competition policy.Footnote 22

The GDPR has generated meaningful benefits with respect to portability and interoperability.Footnote 23 Given that the right to data portability and interoperability is included in the GDPR, it is not surprising to find that, as Bradford (2021, Chap. 5, at 131) points out, the GDPR led businesses around the world “to quickly adjust their data collection, storage, and usage practices in response to the EU’s regulation.” Focusing on the power that currently resides in Brussels, she points out that the de facto capital of the E.U. is the center of the enforcement of data protection legislation that covers all sectors of the economy, “making data protection a powerful manifestation of the Europeanization of the global regulatory environment.” (at 128). This is especially the case because the GDPR applies to E.U. residents, whose data are being collected, held, or processed, irrespective of the location of the company where the data are being processed.

The regulatory coverage of the GDPR was expanded in the E.U. by the November 2022 passage of the Digital Markets Act (“DMA”). The DMA, which serves two different but complementary purposes, is directed towards limiting the power of large digital platforms, setting out an extensive list of core platform services that are covered. While it is not expected to come into full force until 2024, the DMA does make clear that platforms should give end users the right to effective portability of data and allow for the effective interoperability of third-party hardware and software (Turner, 2022).Footnote 24

The DMA is seen by some commentators as an instrument that complements EU antitrust enforcement. Specifically, Recital 10 of the DMA states that it is meant to be ‘complementary’ to the European and national antitrust rules, including the rules prohibiting the abuse of dominant positions.Footnote 25 However, that relationship remains a subject of some debate (Colangelo, 2023). Is it best seen as a short-cut substitute for traditional antitrust economic analysis or as a complement that serves to broaden the scope of that analysis?

There is substantive overlap between the DMA and the GDPR. In particular, the DMA prohibits so-called “gatekeepers” from combining or using personal data between different core platform services, unless the end-user has provided appropriate consent.Footnote 26 Another concern, covered by the DMA, is the possibility that a high-tech platform will utilize business data to leverage a competitive advantage, especially in regard to gatekeepers favoring their own services and products when similar services and products are offered by third parties.

Article 5(2) of the DMA is particularly relevant with respect to concerns that could arise if a gatekeeper such as Google or Amazon were to combine personal data from a core platform service with personal data from services provided by the gatekeeper or a third party. Prohibiting this behavior places clear restrictions on the ability of a platform with substantial market power to expand that market power, at the risk of a loss of potential portability benefits.Footnote 27

The European Commission has seen APIs (“Application Programming Interfaces”) as vital to achieving interoperability and, through portability, to making possible the flourishing of Artificial Intelligence and the Internet of Things.Footnote 28 Similarly, Article 6 of the DMA creates a right to business-to-business data portability.Footnote 29 As Turner (2022) puts it, “Article 6(9) requires gatekeepers to provide end users free of charge with effective portability of data provided by them or generated through their … core platform service.” The article creates rights that expand and complement the data portability rights given by the GDPR.

With respect to interoperability, Article 6(7) requires gatekeepers (such as Google or Apple) to allow providers of services free interoperability, with the same features accessed or controlled by each gatekeeper’s operating system. Ideally this will allow third parties to compete in the provision of services that must interconnect with the gatekeeper’s operating system, with the result being increased consumer choice.

The issues confronting the E.U. have been spelled out with some clarity in a thoughtful E.U.-oriented report (Cremer et al., 2019). The authors point out that the GDPR provides a special framework for personal data – one that grants substantial control rights to individuals. According to Article 20, “each data subject shall have the right … to transmit those data to another controller without hindrance from the controller to which the personal data have been provided.” Furthermore, “the data subject shall have the right to have the personal data transmitted directly from one controller to another, where technically feasible.” Cremer et al. (2019, at 83) conceptualize data portability in the GDPR as an individual right to counter data lock-in and to facilitate switching. Similarly, data interoperability mechanisms rely on APIs, whereby a user authorizes a service to access his or her data.
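The Article 20 pattern – a data subject authorizes a transfer, and the controller exports that subject's data in a structured, machine-readable form – can be sketched in miniature. The example below is purely illustrative: the names, token scheme, and data are invented and correspond to no real controller's API.

```python
# A toy sketch (not any real API) of the GDPR Article 20 pattern: an
# authorized request yields the data subject's data as structured JSON
# that a receiving controller could ingest. All names are hypothetical.
import json

CONTROLLER_DB = {"alice": {"email": "alice@example.com", "posts": ["hi"]}}
AUTHORIZED_TOKENS = {"tok-123": "alice"}  # token -> authorizing data subject

def export_portable_data(token):
    """Return the subject's data as JSON if the request is authorized."""
    subject = AUTHORIZED_TOKENS.get(token)
    if subject is None:
        raise PermissionError("data subject has not authorized this transfer")
    return json.dumps({"subject": subject, "data": CONTROLLER_DB[subject]})

payload = export_portable_data("tok-123")  # a receiving controller ingests this
```

The legal and technical substance lies in what the sketch abstracts away: agreeing on the export schema, authenticating the authorization, and transmitting "directly from one controller to another, where technically feasible."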

5.2 The U.S. perspective

To this point, the U.S. has yet to promulgate meaningful new regulatory legislation, despite data-related concerns in the Biden administration that are likely to include interoperability and portability.Footnote 30 Questions for the U.S. abound. Does imposing a regulatory overlay, perhaps modeled on U.S. telecom regulation, make sense? Short of creating a new federal agency, can the FTC utilize its rule-making functionality to achieve substantial ex ante regulatory-like benefits? Alternatively, can the DOJ achieve similar benefits through litigation and/or the use of its typically ex post consent decree power? Will active antitrust enforcement improve efficiency, or will it lead to remedies that stifle innovation?

As a first step, it makes sense for Congress and the U.S. competition authorities to carefully study market dynamics and characteristics to identify where the benefits of data portability and interoperability outweigh the costs. These include the costs of standard setting, implementation, and oversight; of compliance with the standards; and of lock-in to inefficient standards. The need for study is strengthened by the fact that the current situation is characterized by a patchwork of inconsistent legacy data collection and organizational methods, developed over time by various market players.

5.3 Federalism

That the U.S. can and should rely on active enforcement by its state and federal competition authorities is consistent with a normative theory of federalism in which the U.S. is seen as a more mature federal system than its E.U. counterpart. As Inman and Rubinfeld (2020) explain, beginning in the 18th century with the Articles of Confederation, the U.S. has put in place a strong federal union that allows for the free flow of labor and capital and that has active monetary and fiscal systems. Furthermore, most U.S. states have promulgated competition statutes that are similar to the federal Sherman Act. In addition, the ability of states to support economically powerful local businesses has been restricted by a “state-action exemption doctrine” that gives the central government the legal authority to limit potentially anticompetitive actions to those that are clearly articulated and actively supervised.Footnote 31

The U.S. common law system has been relatively successful in prosecuting high-technology competition cases at both the state and federal levels. In contrast, the strong divergence of political interests at the federal level has made it difficult for Congress to legislate, despite concerns raised by both parties about the economic power of the large, highly successful platform companies.

The story is quite different when one views relevant E.U. activity. For one thing, the E.U. operates as a federal union that has been only mildly successful. The culturally and politically diverse member states of the European Union have supported only modest financial transfers to and from the center, and monetary policy has been limited to the member states that are part of the European Monetary Union.Footnote 32 In essence, the E.U. might be characterized as a system of "cooperative federalism" that requires (near) unanimity among country-level officials for many policies to be put in place.Footnote 33

Given the diverse perspectives of the member states, with some being much more active in terms of domestic competition enforcement than others, and given that systems of private enforcement in Europe are still at an early stage, it is not surprising that the European Commission would seek to put in place a strong central regulatory authority. The Commission has relied to a much lesser degree on its competition authority, whose interventions often take many years to resolve, and even less on private enforcement.

5.4 The future?

It may be welfare enhancing for the U.S. and the E.U. to pursue somewhat different enforcement policies in their treatment of issues relating to data portability and interoperability. The OECD has offered a useful perspective, pointing to several legal cases going back in time that highlight U.S.–E.U. areas of commonality and difference (OECD (2021) at p. 29). To illustrate, in its 1998 browser bundling case against Microsoft, the U.S. District Court for the District of Columbia noted that Microsoft had delayed in providing a crucial API to Netscape. The case settled with respect to interoperability, in part requiring Microsoft to disclose APIs and related documentation needed for competing browsers to interoperate with the Windows operating system. Shortly thereafter, the EU's investigation of Microsoft's bundling of its Media Player led to a 2004 decision in which Microsoft was ordered to provide complete and accurate specifications for the protocols used by Windows workgroup servers.

The continuing debate concerning the 2010 Ticketmaster/Live Nation merger offers a more recent example.Footnote 34 In 2022, many buyers of tickets for Taylor Swift performances were unable to obtain tickets on the Ticketmaster system. This crisis has revived questions as to the effectiveness of the DOJ's remedy, which required Live Nation to provide ticketing clients with their ticketing data and to allow clients to use alternative primary ticketing service providers. While these consent decree provisions are set to expire at the end of 2025, to date they have been largely ineffective in encouraging a competitive alternative to Ticketmaster for many U.S. venues, even though there are no significant barriers to handling interoperability issues with multiple ticket providers.

The policy issues surrounding portability and interoperability that are raised on both sides of the Atlantic are difficult. Too strict a data-oriented policy that limits portability would deny consumers the tangible benefits of scale. However, too loose a regime would stifle innovation and could lead to excessive data harvesting. One possibility, suggested by the FTC's current chair, would be a revival of the doctrine of structural separation, which would exclude digital platform owners from competing within their platform on a case-by-case basis (Khan (2019) at 973).

However, this approach is unlikely to be successful in ensuring that consumers enjoy the benefits of portability and interoperability, for several reasons. First, with respect to so-called "killer acquisitions," ex post short-run efficiency considerations may make it difficult for the U.S. to block acquisitions that could be expected to have negative long-term implications. Second, structural separation may sacrifice substantial consumer benefits (with respect to pricing and innovation) that can flow when there are vertically driven efficiencies (Cabral, 2018). Within the U.S. system, we are left with a litigation system that attempts, within a rule-of-reason structure, to balance the benefits of scale and incremental innovation against the detriments of decreased net innovation and data privacy. Unfortunately, as Hovenkamp (2022) has pointed out, the U.S. agencies have had minimal success in litigating within such a rule-of-reason world.

Many of the markets that are affected by portability and interoperability issues are global in scope. As a result, the more aggressive actions (whether through competition enforcement or regulation) are likely to dominate the relevant policy space. Consequently, should the U.S. not take an active role in examining, and in some cases facilitating, data standards, American firms are likely to find themselves bound by foreign standards for portability and interoperability. Given that the European Union has acknowledged the importance of data standards for ensuring a comprehensive data sharing environment,Footnote 35 and given that its market players are currently in the process of setting such standards to comply with portability requirements, it is important to ensure that U.S. data interoperability and portability considerations not be disregarded.

As of 2023, the EU has been a leader in the management of portability and interoperability issues, combining both regulatory and competition enforcement mechanisms.Footnote 36 To what extent the U.S. will be a follower is not clear. What is clear is that both elements will and should be utilized. Competition agencies are well suited to implement remedies through a market investigation process that examines issues such as user lock-in and switching effects, barriers to entry stemming from network effects, and the economies of scale and scope that are associated with market power. They are also well positioned to gather information on industry structure and to identify remedies that will facilitate entry and competition.

The GDPR and the DMA are best applied when there is a valuable role for oversight, monitoring, and adjudication of disputes, or when non-competition objectives are deemed to be relevant. In some cases, and for some industries, legislation may be needed to assign to consumers ownership rights over certain datasets. Finally, timing could be a concern: when the proceedings of a competition authority last for years, remedies imposed at their conclusion may come too late to be effective (Kerber (2019) at 406).

6 Conclusion

There are substantial benefits, along with some potentially significant costs, to increasing data portability and interoperability. The private sector has been active in making efforts, individually or jointly, to improve data portability. Nevertheless, private benefits and social benefits are not fully aligned, and there is a clear role for public intervention. Adding a regulatory overlay to our current regulatory and enforcement environment that recognizes the potential effects of data portability and interoperability, short of creating an entirely new governmental entity, is appealing.Footnote 37 To date, the E.U. and the U.S. have differed in their approaches to managing portability and interoperability issues. The E.U. has taken the lead through the GDPR and the DMA, with a regulatory approach. The U.S. has been lagging, with greater emphasis being placed on the role of the competition agencies. The paper suggests that the U.S.-E.U. differences make sense in light of the differences in their federal systems. Indeed, the ideal structure would be one in which the E.U. relies on a powerful regulatory structure, whereas the U.S. relies more heavily on its two federal enforcement agencies and the common law of antitrust.