We live in a world where platforms are ubiquitous: they are our sources of news and entertainment, and the means through which we do business, deal with governments, and engage with the larger society. It is therefore important to understand the impact of these platforms on society and how governments and regulators view platforms and their strategies. Given how far these large technology platforms have diversified and globalized, their power to shape our lives is largely unfettered. In this chapter, we discuss the societal impact of platforms, their governance and regulation, and their impact on industry structures and economics.

Social Impact of Platforms

We live in a world dominated by a variety of platforms, spanning information, news, commerce, social networks, and entertainment. Quite a few of these markets are winner-takes-all (WTA) markets, dominated by one or a few global firms. These global corporations are typically privately owned (for-profit) public spaces that dominate societal narratives and discourses across countries and over time. For instance, social media and the internet have been effective tools for amplifying electoral messages in democracies. We will discuss the specific impact of platform firms on society along three axes: content, data, and competition.

Content

Typically, platforms that work around news and information manage user-generated content (UGC) or third-party content (TPC) on their platforms. Take the example of Google or Facebook: neither generates its own content, as both intermediate in information markets. Google collates and organizes the websites and news available, either as search results or through its own products like Google Maps, for use by search users. In products like Google Maps, Google collects basic topological data to map cities and countries, and crowdsources specific details about businesses and landmarks. The quality and accuracy of such third-party data can be highly variable and open to a variety of questions. Similarly, Facebook does not publish any content of its own; all its content is user-generated (UGC), in the form of posts, events, and links to news articles.

A key issue in managing UGC and/or TPC is content moderation. What are the boundaries of the platform's responsibility? How do we ensure the reliability of content on platforms that host UGC and/or TPC? As these platforms replace our traditional means of information filtering (editorial processes, research protocols, and trusted institutional frameworks) with algorithms, there is a trade-off between content moderation and control on one side and monetization on the other. These algorithms are typically designed to highlight and prioritize the content most valued by users, based on analysis of their profiles and preferences. It is such customized content that sustains user engagement, which is highly valued by the other side of these platforms: the advertisers. In such a marketplace, a profit-maximizing platform is more likely to depend on algorithms that amplify specific information rather than moderate it.
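To make the trade-off concrete, consider a minimal, hypothetical ranking function of the kind described above. The field names and weights below are illustrative assumptions, not the algorithm of any particular platform; the point is simply that a ranker optimized for predicted engagement will surface provocative content unless a moderation penalty is explicitly built in.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float  # model's estimate of clicks/shares for this user (0..1)
    policy_risk: float           # estimated probability the post violates content policy (0..1)

def rank_feed(posts, moderation_weight=0.0):
    """Order a user's feed by a simple engagement score.

    With moderation_weight = 0 the ranker maximizes engagement alone,
    which tends to amplify sensational or borderline content.
    Raising moderation_weight trades engagement (and advertising revenue)
    for safer content -- the tension described above.
    """
    def score(p: Post) -> float:
        return p.predicted_engagement - moderation_weight * p.policy_risk

    return sorted(posts, key=score, reverse=True)

# Illustrative use: the same three posts, ranked with and without a moderation penalty.
feed = [
    Post("calm-news", predicted_engagement=0.40, policy_risk=0.05),
    Post("outrage-bait", predicted_engagement=0.90, policy_risk=0.70),
    Post("friend-update", predicted_engagement=0.55, policy_risk=0.01),
]
print([p.post_id for p in rank_feed(feed, moderation_weight=0.0)])  # engagement-only ranking
print([p.post_id for p in rank_feed(feed, moderation_weight=1.0)])  # moderation-aware ranking
```

In this toy example, the engagement-only ranking puts the high-risk post first; adding the penalty pushes it to the bottom, at the cost of lower predicted engagement.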

Though platforms have invested in content moderation in recent years, there are serious questions about the transparency of these algorithms and moderation tools. Add to this the vast differences in digital literacy across countries. With low to moderate levels of digital literacy, some of these markets are more prone to the amplification of misinformation and disinformation than to fact-checking and establishing the provenance of information. This creates a world dominated by fake news and targeted information campaigns that push people into echo chambers: contexts in which people are served only information that conforms to their existing belief systems and little that contradicts them. Such echo chambers are highly useful in mobilizing public opinion around social issues, political movements, or even targeted campaigns by government and semi-government organizations. They can shape what entire segments of the population believe and can have a major impact even on public health programs.

A more worrying issue in the management of UGC and TPC is liability. Platforms have continued to deny any liability arising from the quality and reliability of the content distributed or hosted on their platforms, on the grounds that the content is generated outside the platform and that the platform's only role is to make it available and accessible. Across countries, governments have tried placing liabilities on these intermediaries, with limited success. These platforms are designed as information marketplaces, where demand and supply conditions determine the inventory and transaction of specific information, rather than as publishers, where liability for the basic quality and veracity of the content hosted, published, or transacted rests with the editorial function of the publishing house.

Data

Possibly the most contentious issue in the impact of platforms on society is that of data. Platforms collect a variety of data from users to provide them with customized and personalized content, products, and services. These data could be collected at sign-up (as profile information), at sign-in (as context information), and during the engagement process (as preferences and priorities). In most cases, these data are critical for the platforms to provide appropriate services to users. For instance, profile information like age and gender acts as an important filter in deciding whether to serve adult or sensitive content; location (at sign-in) and language preferences enable targeting users with appropriate local advertisements; and specific search terms and navigation behavior within the platform provide valuable “expressed preferences” that allow the platform to dynamically serve engaging content. Platform sponsors argue (and rightly so) that this profile, context, and preference information is critical to providing a satisfying user experience.
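As a concrete (and purely illustrative) picture of these three layers of data, a platform's user record might look something like the sketch below. The field names and methods are assumptions introduced for exposition, not any platform's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:          # collected at sign-up
    user_id: str
    age: int
    gender: str
    language: str

@dataclass
class SessionContext:       # collected at sign-in
    location: str
    device: str
    time_of_day: str

@dataclass
class Preferences:          # collected during engagement
    search_terms: list = field(default_factory=list)
    pages_visited: list = field(default_factory=list)

@dataclass
class UserRecord:
    profile: UserProfile
    context: SessionContext
    preferences: Preferences

    def eligible_for_sensitive_content(self) -> bool:
        # Profile data acting as a filter, as described above.
        return self.profile.age >= 18

    def ad_targeting_keys(self) -> tuple:
        # Context data enabling local-language, local-market advertising.
        return (self.context.location, self.profile.language)
```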

What matters, therefore, is: who owns this data? What are the accountabilities of the platform in using it? Questions around how platforms can monetize this data are tricky. Given that many behavioral patterns are discerned through algorithms, their use can have significant social consequences. Well before the emergence of platforms, there were instances of such patterns uncovering sensitive information about users, including drug abuse, suicidal tendencies, or even teenage pregnancies.Footnote 1 Dilemmas exist about whether platforms are accountable for sharing such patterns with relevant stakeholders, including law-enforcement authorities, when they point to illegal behavior. Should Facebook or Twitter alert a user's friends and family to signs of suicidal tendencies at all? What about the user's privacy? Should information about a specific user's drug abuse be reported to law enforcement and/or healthcare systems?

Algorithmic accountability is not an easy problem to solve. On the one hand, one can argue that a person designed the algorithm in the first place, and therefore the platform that commissioned the algorithm should be held accountable. However, the fact that algorithms learn and make predictions about user behavior makes it nearly impossible for human agents to define the specific outcomes (beyond broad boundaries).

Managing (user) data privacy is a significant issue for such platforms. There are three layers of rights: the right to use the data within the platform to customize and personalize products and services; the right to monetize the data within the platform by targeting third-party advertisements and content to users; and the right to share the data with third parties (whether commercial entities or governments/regulatory bodies). The European Union took a giant step in this direction with the General Data Protection Regulation (GDPR), which came into force in 2018.Footnote 2 Other countries are catching up, including India with its proposed Personal Data Protection (PDP) Bill, 2019 (still under discussion in the Indian Parliament).Footnote 3 Germany passed the Network Enforcement Act (NetzDG) in 2017, which made companies liable for illegal speech propagated through their platforms.Footnote 4 NetzDG is hailed as one of the first attempts (not without significant controversy) to balance the demands of freedom of speech and privacy on one side against online hate on the other.

Competition

One of the major issues in platform competition is that these firms operate largely in winner-takes-all markets, with little or no effective competition. The dominance and hegemony of these platforms is difficult to control through traditional anti-trust and monopoly regulations. As we have seen before, traditional tools are ineffective in regulating the market power and dominance of these platform firms. Given that these firms are also multi-national corporations, international regulations around information exchange and commerce are tricky as well. What may be acceptable in some markets may not be legal in others.

Some countries, like China, have very strong regulatory frameworks governing how multi-national technology firms may operate within their borders. Country-specific requirements like data localization might impose significant costs on the platforms. In addition, specific regulatory frameworks like blasphemy, sedition, and national security laws in various countries pose different risks for the platforms. For instance, a platform like Twitter might not be held accountable for something a Canadian resident writes about an anti-government protest in India. While the content may attract legal action if posted by Indian citizens or residents, law enforcement agencies may need very different mechanisms to act against such content when it originates abroad. Twitter may be forced by the government to remove the content, block the user for a specific period, or even permanently disable the user's account. But, as we can see, these are reactions rather than proactive regulation and moderation.

Taxation has also been a very thorny issue in the context of global platforms. These firms have been known to avoid taxation by setting up their offices and global headquarters in low-tax economies, bypassing a variety of international regulations. Apart from moving their administrative headquarters, some of these platforms also shift significant value-creating activities to other global locations. For instance, by moving their research and development centers to cities like Bangalore, they save significantly on employee costs (compared to locating them in a city like San Francisco or Seattle).

Patents and copyrights on algorithms and designs are another issue in platform competition. Given the territorial nature of patent law, it has become very difficult and costly to enforce patents, trademarks, and copyrights on product designs globally.

There are no easy answers to these questions: the moderation, transparency, and liability of content; data rights and algorithmic accountability; and the competitive behavior of these platforms. Various governments are trying different measures to govern platforms. At stake are major issues around the liberal values of free speech and privacy, access to public data, easy political participation, and the very pillars of democratic governance.

Platform Governance

As we discussed before, platform firms are private entities that work for private gains, even though they provide public goods. Most of them remain privately owned public spaces, driven by commercial interests. Their impact in widening information asymmetry, amplifying misinformation and disinformation, failing to curb hate speech and fake news, contributing to an overall decline in the reliability of information, creating and propagating information echo chambers that heighten the polarization of public opinion, and raising questions about the psychological health of users (due to addiction and screen time) has been the concern of many public policy professionals. Add to these the issue of winner-takes-all markets, captured by a single firm or, at best, a handful of firms that shape public discourse and opinion. These near-monopolies have also been known to collude with other firms within and outside their networks to maximize their returns. For instance, the role of Cambridge Analytica (CA) in harvesting raw data about millions of Facebook users, by exploiting a loophole in Facebook's APIs, for targeted political advertising shook the world.Footnote 5 CA ran a quiz on Facebook that collected data not just about the quiz takers but also about their friends, without their knowledge, and sold the data. It has been argued that this was not so much a scam by CA as Facebook's inadequate protection of its users from a third-party application designed with the specific purpose of collecting user data without their knowledge. This is complicated by the fact that these are multi-national corporations whose algorithms operate as black boxes, with an architecture that makes it difficult to separate the liabilities of the platforms and their users.

The power of platforms to intervene and interfere in our daily lives has been documented by many scholars and policy practitioners, especially the “Amsterdam school of critical platform studies” (Hargittai, 2007; Introna & Nissenbaum, 2000; Nieborg & Poell, 2018; Van Dijck, Poell, & de Waal, 2018Footnote 6). These scholars argue that, with their epistemic power to filter the information accessible to different actors in the ecosystem, digital platforms engage in a form of regulation themselves. Through their choices of platform architectures and algorithms, these platforms are more likely to perpetuate biases prevalent in society than to address them.

Therefore, it is imperative that platforms be governed by the very stakeholders they seek to serveFootnote 7: the complementors, users, governments and other state actors, and civil society. How they are governed has implications for scale, social impact, and upholding modern values (including transparency and non-discriminatory service delivery, civility of discourse, and content promoting diversity of perspectives). Platforms could be governed internally, like any other corporation, accountable to their stakeholders and within the law of the land they operate in. Such governance has proved problematic, as these for-profit corporations must, as a fiduciary duty, prioritize the demands of their principal stakeholders, the shareholders. They may be compliant with regulations, but technological change has often outpaced regulation. These platforms may comply with the letter of the law without actually following its spirit.

Platforms as Marketplaces, Gatekeepers, and Editors

There are three ways of looking at these platforms: as marketplaces, as gatekeepers, and as editors. When we consider these platforms as marketplaces, they take no responsibility for the products, services, content, and behaviors of the users or complementors on their platform. As in a typical marketplace, platforms own the discovery and matching algorithms and are not accountable for the specific behaviors of complementors and users, beyond basic quality verification. Such models may work for platforms around ecommerce, where the markets are efficient and buyers can efficiently evaluate the quality of products and sellers. However, when markets are less efficient, the platforms need to take more accountability for assuring the quality of the complementors, the products and services offered, and the transactions themselves. Take, for instance, a financial intermediary. Compared to a traditional ecommerce firm, a financial intermediary needs to ensure that the complementors on its platform are regulatorily compliant, the products are approved by the appropriate authorities, and the processes are secure. A simple marketplace model that ensures compatibility and a robust matching algorithm may not be sufficient in this case. In such cases, we need to conceive of the intermediating platform as a gatekeeper. The platform must not only earn the trust of users and complementors in its products and services but should also ensure that only quality users and complementors are affiliated with the platform. In that sense, it plays a gatekeeping role.

In between the two extremes of completely laissez-faire marketplaces and tightly controlled gatekeepers, we could also conceptualize platform intermediaries as editors. As editors, the platforms might allow user-generated content and third-party content to be disseminated, with certain controls. Accountability here is shared between the content creator and the platform. Traditional media organizations have operated in this model, with their content generated by a combination of their own employees, tie-ups with syndicates and agencies, and independent columnists. The split of accountability may differ across the three sources of content, but the platform surely takes some responsibility even in the case of columns by famous writers; after all, the writer was chosen by the editor.

The Problem of Many Hands

However, today's technology-driven platforms have achieved such scale, and enjoy such network effects, that it is difficult to perform these editorial roles effectively. Most often, the business model involves motivating users to engage more and more, and in the process enhancing the volume and diversity of interactions. Gatekeeping and editorial roles are extremely difficult to perform and, if performed at all, are done post hoc: when specific content is flagged as inappropriate, action is taken. A priori evaluation and control of the flow of content might actually be counterproductive to the scale and scope that lie at the heart of the business. Pragmatically, regulators would prefer a central actor that owns full accountability for harms created or caused, and that therefore carries the legal responsibility. Such centralization of responsibility is easier for law enforcement authorities to administer.

This is a manifestation of what is referred to as the problem of many hands.Footnote 8 The problem of many hands occurs when multiple uncoordinated entities contribute in different ways to a problem (or to solving it) in a manner that makes it difficult to accurately assign accountability and responsibility for actions and consequences. Issues like climate change and air pollution are examples of the problem of many hands, where multiple actors contribute to the exacerbation and escalation of the problem, as well as, in their own ways, to its mitigation. It would be practically impossible to apportion the contribution to climate change of activities like deforestation, fossil fuel usage, mining and civil construction, the alteration of rivers through dams and canals, and increased economic activity. Even if one could scientifically separate out the partial contributions of the various causes, it would be very difficult to legally hold specific actors responsible for each of these actions.

Modern platforms clearly suffer from the problem of many hands. For instance, holding Twitter or Facebook accountable for hate speech posted by one of its users is preposterous; so is placing accountability on a few group administrators of WhatsApp groups whose members generate content bordering on illegality (say, sedition, threats to national security, or harassment/bullying). By the same token, absolving these platforms of any accountability for the existence and promotion of hate speech or illegal content is equally untenable. Clearly, it is the responsibility of these platforms to ensure that such content does not enter, remain, or get disseminated through their platforms. They have a variety of means to ensure this, ranging from carefully selecting and ratifying content and educating users, to using technologies like AI to discover offensive content, crowdsourcing the flagging of content, and removing such content (and offenders) when accepted norms are breached. To solve this problem of many hands, Helberger, Pierson & Poell (2018) suggest a system of cooperative responsibility.Footnote 9 They suggest that these platforms should (a) collectively define the essential public values that they intend to uphold; (b) acknowledge that they have a role to play in realizing these values through their activities and decisions; (c) develop a multi-stakeholder process of public deliberation and exchange; and (d) translate the outcomes of these deliberations into shared codes of conduct, rules, and design principles for their platform architecture.

Platforms in Contestable Markets

The theory of contestable markets extends the theory of perfect competition; a contestable market has the following characteristics.Footnote 10

  (a) The market is accessible to potential entrants, where the same customer needs can be served using the same (easily available) technologies as the incumbents.

  (b) Therefore, new entrants evaluate the attractiveness of the market on the basis of the incumbents' pre-entry prices.

  (c) Therefore, entry into such markets is absolutely free, as new entrants face no disadvantage in comparison to incumbents (technology is easily accessible, and there are no consumer lock-ins with the incumbents).

  (d) The market is also characterized by costless exit. In other words, competitors face no exit barriers and no sunk costs to recover. Contestable markets are therefore vulnerable to hit-and-run strategies.

In such a market, where the threat of new entrants is always imminent, incumbents will keep their prices close to the competitive equilibrium, with very low profitability. Given the low entry and exit barriers, when a new entrant enters the market, the only feasible response by the incumbents is to compete by lowering prices. The new entrant may be able to match the lowered prices for some time, but where incumbents retain scale and learning advantages, for whatever they are worth in such markets (in perfectly contestable markets, such advantages do not exist at all), the entrant may not be able to sustain them, and a few firms may exit the market.
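This pricing logic can be summarized in a stylized way. With free entry and costless exit, any price above average cost invites a hit-and-run entrant, so the sustainable price is driven down to average cost and economic profit to zero. The notation below is a generic illustration of this argument, not a formal result reproduced from the chapter's sources.

```latex
% Stylized contestability condition (illustrative notation)
% p*    : sustainable incumbent price
% AC(q) : average cost at output q
% \pi   : economic profit
\[
  p^{*} = AC(q)
  \quad\Rightarrow\quad
  \pi = \bigl(p^{*} - AC(q)\bigr)\, q = 0 .
\]
% If p > AC(q), an entrant can undercut at any price p' with AC(q) < p' < p,
% capture the market, and exit costlessly before the incumbent responds
% (a "hit-and-run" entry), so a price above average cost cannot persist.
```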

It may not always be the case that contestable markets have hundreds of competitors, but even when there are only a handful of firms, the threat of new entry keeps them behaving as if they were in perfect competition. Contestable markets are efficient and increase consumer welfare, as prices are kept to the minimum possible. Given that users face no switching or multi-homing costs, competitors also have to maintain acceptable quality standards.

Let us consider an example. The conventional banking industry had significant costs of entry, including the fixed costs of setting up a network of branches; resources like branding and customer service were differentiators that provided incumbents with competitive advantage; and the costs incurred in branding, promotion, and customer acquisition/retention were sunk costs (they cannot be recovered at exit). The new class of digital banks, however, faces negligible costs of entry: all they need is a set of servers that can be rented from a cloud computing service; online banking provides very little opportunity for differentiation across banks; and user acquisition and retention costs are minimal, given the penetration of electronic and social commerce. Therefore, we can consider digital (online) banking a contestable market. To facilitate the contestability of these markets, governments and regulators across countries have also framed policies to ease switching costs across banks (as well as to integrate physical and online banking).

Increased internet penetration has helped many industries become more and more contestable by reducing entry barriers (easy user access), removing fixed costs (the growth of the sharing economy), proliferating information (easier discovery and evaluation), and reducing sunk costs (opportunities for coring).

Platform firms play a key role in enhancing the contestability of markets. Platforms, with their network effects, help competitors access users easily. Some firms may enter adjacent markets through tipping strategies and port their entire user base to the new market. The consumer cloud storage market is an excellent example of contestability created by coring platforms. Firms like Google and Apple have entered consumer cloud storage markets (Google Drive and iCloud) by leveraging their user bases from products and services in other markets. The barriers to entry are very low, given that these firms have already sunk costs into cloud storage; exit barriers are also low, due to the lack of any specific investments required to offer these services; and there is no differentiation in the core offering. As more and more firms enter the market, specialized incumbents like Dropbox and Box are forced to compete on prices and/or differentiated features, in a market characterized by no consumer lock-ins, low switching and multi-homing costs, and low loyalty. Prices fall as new entrants threaten to enter the market, and there is increasing homogeneity in the range and quality of services offered by competitors.

The Rise of Platform Conglomerates—FANGAM

Such opportunities for platform firms to enter new markets at virtually no entry cost have given rise to what practitioners label platform conglomerates. Platform conglomerates are large technology corporations that started their journey as specialized platforms but slowly diversified into adjacent contestable markets, leveraging their user base and core technologies. Abbreviated in a variety of ways, six large platform firms have come to control users and businesses across the globe: Facebook, Amazon, Netflix, Google, Apple, and Microsoft. Each of these firms started in a different business, but they have increasingly converged and made more and more markets contestable.

  • Facebook began as a peer-to-peer social network but has entered social commerce (small businesses setting up webpages and event pages on Facebook), peer-to-peer messaging (WhatsApp chat), payment solutions (WhatsApp Pay), and video (Instagram Reels) as well.

  • Amazon began as an ecommerce retailer but has diversified into payments (Amazon Pay), video streaming (Prime Video), and voice-assistant consumer devices (Alexa), among others.

  • Netflix began as a DVD rental firm, embraced video streaming of third-party content (movies, TV shows, documentaries, animations, and short films), and then began producing its own content (Netflix Originals).

  • Google began as a search engine and has possibly the most diversified portfolio among tech platforms, with businesses ranging from video sharing (YouTube), mobile operating systems (Android), browsers for PCs and mobile phones (Chrome), an applications marketplace (Play Store), and navigation products (Google Maps), to self-driving cars (Waymo).

  • Apple is an integrated competitor that produces hardware (computers, tablets, phones, televisions, and music players), operating systems and application software (iOS, iPadOS, and other applications), an applications marketplace (App Store), cloud storage (iCloud), and a voice assistant (Siri), among other products, software, and services.

  • Microsoft, a market leader in PC operating systems (Windows) and business productivity software (the MS Office suite), has acquired the professional networking site LinkedIn (which includes a jobs marketplace, blogging, and learning solutions) and the peer-to-peer communication platform Skype to complement its own collaboration platforms like MS Teams.

One can see that these firms compete with one another in certain businesses, and yet, despite the overlaps, each seems to dominate its own core markets. One could draw simple Venn diagrams to represent where these firms compete with each other (a sketch of this overlap analysis follows below). Such competition, where major competitors face each other in multiple markets, has distinct strategic characteristics and is known as multi-market competition.
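A crude way to visualize this multi-market overlap is to treat each firm's portfolio as a set of markets and compute pairwise intersections; the sketch below does exactly that. The market labels are simplified from the bullet list above and are illustrative only.

```python
from itertools import combinations

# Simplified, illustrative market portfolios drawn from the list above.
portfolios = {
    "Facebook":  {"social networking", "messaging", "payments", "video"},
    "Amazon":    {"ecommerce", "payments", "video", "voice assistants", "cloud"},
    "Netflix":   {"video"},
    "Google":    {"search", "video", "mobile OS", "browsers", "app store", "maps", "cloud"},
    "Apple":     {"devices", "mobile OS", "app store", "cloud", "voice assistants"},
    "Microsoft": {"PC OS", "productivity software", "professional networking", "messaging", "cloud"},
}

# Pairwise overlaps: the markets in which each pair of firms meets.
for a, b in combinations(portfolios, 2):
    overlap = portfolios[a] & portfolios[b]
    if overlap:
        print(f"{a} vs {b}: {sorted(overlap)}")
```

Each non-empty intersection marks a market in which the two firms are potential multi-market rivals, which leads directly to the discussion of multi-market competition below.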

Platforms in Multi-market Competition

Competitive strategy scholars define multi-market competition as occurring when firms compete against the same competitors across multiple markets or industries.Footnote 11 When competitors face each other in a variety of markets, it may induce mutual forbearance and reduce rivalry among them. The theory of multi-market competition highlights how strategic similarity among firms reduces competitive intensity and how mutual forbearance is greater in more concentrated markets.Footnote 12

As we have discussed, when enveloping platforms diversify and compete against each other, they engage in multi-market competition. Such platforms have the potential to demonstrate mutual forbearance: they reduce competitive intensity in markets where they are weaker than the competition, in return for the same favor in other markets where they are stronger. In other words, across multiple markets, competitors simply do not compete very hard, for fear of stronger retaliation in some other market.
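The logic of forbearance can be sketched as a repeated-game incentive condition. In the stylized inequality below (illustrative notation, not a formula from the sources cited above), a firm forbears as long as the one-period gain from attacking a rival in a single market is outweighed by the discounted future losses from retaliation; with multi-market contact the retaliation can come in every shared market, so the right-hand side grows and forbearance becomes easier to sustain.

```latex
% Stylized forbearance condition (illustrative notation)
% G_m    : one-period gain from deviating (competing aggressively) in market m
% L_k    : per-period loss from the rival's retaliation in shared market k
% \delta : discount factor;  M : set of markets the two firms share
\[
  G_m \;\le\; \frac{\delta}{1-\delta} \sum_{k \in M} L_k
  \qquad \text{for every market } m \in M .
\]
% With a single shared market the sum has one term; with multi-market contact
% it has |M| terms, so the same deviation gain is easier to deter.
% (Stylized: assumes the deviation occurs in one market at a time; a full
% treatment would pool the incentive constraints across all shared markets.)
```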

For instance, Amazon's Kindle did not expand its capabilities beyond book reading, even though it had the opportunity to become a fully functional tablet. Similarly, Apple has not (yet) positioned Siri as a direct challenger to Alexa in the voice-assistant market. Therefore, in both markets (handheld devices and voice assistants), these competitors do not compete head-on: the Kindle remains an ebook reader against the multi-functional iPad, while Alexa is integrated into a standalone device (the Echo) and Apple's Siri remains primarily an assistant on the iPhone/iPad.

Platforms and International Regulations

When these platforms compete in international markets, there are specific issues of regulatory compliance. The tussle between news organizations and content platforms has come to the fore in markets like Australia and Germany. As platforms like Twitter, Facebook, and Google become the primary sources of news for many users, news organizations have been severely hit, as they have begun losing advertising revenues. News organizations claim that they invested heavily in hard and soft infrastructure to collect, validate, and edit news and to provide it to users in a credible form, in both digital and physical formats. These activities of news collection and distribution cost money, which the news organizations recouped from advertisers. However, with the emergence of the big technology platforms, users began sourcing their news through these platforms (which link to content from the news websites), and consequently, advertisers moved over to the platforms. The platforms claim that these links allow the news companies to market their content to a wider audience, as the links bring many more click-throughs to their websites.

In July 2020, the Australian Competition and Consumer Commission (ACCC) recommended a code requiring that news organizations be fairly compensated for their journalism. Calling on the tech platforms to pay for content, the code allowed these firms to partner with consortia of news organizations.

The two firms primarily affected by this code, Facebook and Google, responded differently.Footnote 13 Google initially threatened to withdraw its search engine from Australia but subsequently announced that it had signed an agreement with the media firm News Corp for sharing news content from its news websites in exchange for payments. Facebook announced that it would stop users from posting news content on its platform and blocked Australian news companies from posting their stories or links on their Facebook pages.

Germany, on the other hand, was in the process of enacting a new competition framework, enforced by the Bundeskartellamt (the Federal Cartel Office), that would proactively frame a set of rules for technology giants to follow.Footnote 14 Especially in markets with winner-takes-all dynamics, the German regulator holds that these gatekeeper corporations need to ensure that they do not give preferential treatment to their own products and services or hinder interoperability with other services. This could become the framework for broader European regulation in the near future.

India has also been working on regulating how data collected by digital technology platforms is stored and used. In 2020, Indian regulators banned a slew of mobile applications, including the popular short-video-sharing platform TikTok, over cross-border data-sharing concerns.Footnote 15 The government is also close to enacting the Personal Data Protection (PDP) Bill into an Act that would specify how these platforms must treat user data. Discussions around India's ecommerce policy have also intensified, especially during the COVID-19-induced lockdown, when local grocery shops gained significant ground on the national ecommerce firms. The concerns around ecommerce in India center on both ends of the business: how fairly small and medium businesses are treated as suppliers on these platforms, and how much this competition has contributed to consumer welfare in terms of prices and convenience.

Conclusion

The emergence of platform business models has had a variety of consequences. The proliferation of digital technologies, aided by network effects and the convergence of standards, has significantly contributed to the rapid growth of platform firms. On the one hand, this growth has expanded the user base and broadened the range of services experienced by users, including personal, social, and commercial benefits. On the other hand, it has come with its own costs: the emergence of winner-takes-all markets and the resulting dominance by global corporations.

Any discourse around emerging topics like platforms, where technologies, business models, and regulation are constantly changing, should co-evolve with the context. However, there are some foundational building blocks that need to be appreciated to sustain the conversation. As with most other topics, there are many perspectives one can take: one could discuss platforms from a policy and governance perspective, from the perspective of a marketer, from the users' perspective, from the perspective of the small businesses that complement these platforms or the gig workers that serve them, as well as from the strategic perspective of the platform owner/manager. Each of these perspectives provides different nuances for understanding the import and dynamics of these business models.

In this book, we took the perspective of the entrepreneur-manager who is building or operating a platform business. We focused on the economics and strategy of these firms. We introduced the basic concepts, differentiated platform firms from traditional pipeline firms, and elaborated on the core properties of platforms: network effects, penguin problems, and winner-takes-all dynamics. We analyzed a variety of platforms, including their value architectures and network mobilization strategies. We elucidated the choices around platform architecture, discussed platform competition and envelopment, and highlighted how multiple business models can come together to create synergies. We conclude the book with a discussion of contemporary issues facing platforms across different countries.