
3 - The Law of the Platforms

Published online by Cambridge University Press:  06 May 2022

Giovanni De Gregorio
Affiliation: University of Oxford

Summary

This chapter highlights the reasons why the freedoms of online platforms have turned into more extensive forms of private power. Understanding the characteristics of platform power is critical to understanding the remedies that mitigate this constitutional challenge. The chapter analyses the two interrelated forms through which platforms exercise power in the digital environment: delegated and autonomous powers. The first part of the chapter analyses the reasons for a governance shift from public to private actors in the digital environment. The second part examines delegated powers in the field of content and data, while the third part focuses on the exercise of autonomous powers competing with public authority.

Type: Chapter
In: Digital Constitutionalism in Europe: Reframing Rights and Powers in the Algorithmic Society, pp. 80–122
Publisher: Cambridge University Press
Print publication year: 2022
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC-BY-NC-ND 4.0 https://creativecommons.org/cclicenses/

3.1 From Public to Private As from Atoms to Bits

In the 1990s, Negroponte defined the increasing level of digitisation as the movement from atoms to bits.Footnote 1 In itself, a bit is merely a 0 or a 1 but, as in the case of atoms, the interrelations between bits can build increasingly complex structures,Footnote 2 leading to the shift from materiality to immateriality.Footnote 3 The move from the industrial to the information society is primarily due to the move from rivalrousness to non-rivalrousness of traditional products and services.Footnote 4 Put another way, the bits exchanged through the Internet have driven the shift from analogue to digital technologies, creating revolutionary models for marketing traditional products or services and raising questions about the application of traditional rules to the digital environment.Footnote 5 The result is an economy no longer based on the creation of value through production but on value created through the flow of information across a transnational architecture governed at the intersection between public authority and private ordering.

At the end of the last century, constitutional democracies across the Atlantic adopted a liberal approach to this technological shift, which promised new paths of economic growth.Footnote 6 The rapid expansion of digital technologies combined with liberal goals have been two of the primary drivers for the accumulation of power by transnational private entities providing increasingly essential services. Instead of the democratic decentralised society envisioned by technology optimists at the end of the last century, an oligopoly of private entities controls the exchange of online information and provides services which are increasingly critical for society at large, akin to public utilities.Footnote 7 As already mentioned in Chapter 1, the global pandemic has revealed to what extent the services offered by these actors constitute critical bricks of daily life. As such, the platform-based regulation of the Internet has prevailed over the community-based model.Footnote 8

Online platforms play a crucial role not only in providing increasingly relevant products and services but also in ensuring the enforcement of public policies online. The activity of content moderation and the enforcement of the right to be forgotten online are only two examples illustrating how public actors rely on online platforms to perform regulatory tasks in the field of content and data.Footnote 9 Online platforms enjoy a broad margin of discretion in deciding how to implement these functions. For instance, the decision to remove and consequently delete a video from YouTube is a clear interference with the user’s right to freedom of expression but could also preserve other fundamental rights, such as the right to privacy. However, this ‘delegation’ of responsibilities is not the only concern at stake. By virtue of the governance of their digital spaces, online platforms also perform autonomous quasi-public functions without needing to rely on the oversight of a public authority, such as the definition and enforcement of their Terms of Service (ToS) through the governance of the technological architecture of their digital spaces.Footnote 10 In both cases, online platforms freely govern the relationship with their communities. They enforce and balance individual fundamental rights by implementing automated decision-making processes outside any constitutional safeguard.

At first glance, this situation might not be seen as problematic from a constitutional standpoint. Rather, it could be considered an expression of private freedoms. In the absence of any regulation, platforms, as private actors, are not required to care about fundamental rights or other constitutional values. Despite incentives such as corporate social responsibility, platforms are primarily driven by the maximisation of profits. This expression of freedom would not raise a constitutional concern as long as public safeguards exist to limit the power which private actors exercise over fundamental rights and democratic values. This is not the case when platforms shape individual fundamental rights according to their own legal, economic and ethical frameworks. When economic freedoms turn into forms of private power, the lack of regulation translating constitutional principles into binding norms can lead to troubling challenges for democratic values such as transparency and the rule of law. The setting, enforcement and balancing of fundamental rights and freedoms in the algorithmic society is increasingly privatised and competes with constitutional standards of protection on a global scale. The consolidation of autonomous areas of power extending a private rationality driven by private incentives is one of the primary calls for action for European digital constitutionalism to preserve democratic values from the influence of market dynamics.

Within this framework, this chapter highlights the reasons why online platforms’ freedoms have turned into more extensive forms of private power. Understanding the characteristics of platform power is critical to understanding the remedies mitigating this constitutional challenge. Therefore, this chapter analyses the two interrelated forms through which platforms exercise power in the digital environment: delegated and autonomous powers. The first part of the chapter analyses the reasons for the governance shift from public to private actors in the digital environment. The second part examines delegated powers in the field of content and data, while the third part focuses on the exercise of autonomous private powers competing with public authority.

3.2 The Governance Shift

In the last twenty years, global trends have underlined different patterns of convergence,Footnote 11 usually named ‘globalisation’, in which the state-centric model has started to lose its power.Footnote 12 The decay of national sovereignty and territorial borders is represented by ‘a world in which jurisdictional borders collapse, and in which goods, services, people and information “flow across seamless national borders”’.Footnote 13 This transformation has exposed the limits of states’ control,Footnote 14 as they struggle with the rise of ‘global law’, a meta-legal system in which different organisations and entities produce and shape norms with extraterritorial implications beyond the state.Footnote 15

Constitutions traditionally embody the values and principles which a specific community decides to adhere to and respect. They represent an expression of the social contract between public power and citizens. Constitutions have seen the light in different contexts through different forms of constituent power.Footnote 16 Nevertheless, it is possible to underline the intimate relationship between constitutions and a certain area of space (i.e. territory) over which sovereign power is exercised and limited. The relationship between (constitutional) law and space is intricate. The law stands on a certain territorial space and relies on political processes legitimising its creation. Formally, outside the domestic legal framework, there are no other legitimised binding forces over a certain territory unless authorised by the legal framework itself. Substantially, the law is only one of the systems influencing space. By moving from a unitary view of the law to pluralism, it is possible to observe how other systems develop their own norms and principles. Therefore, the relationship between law and territory characterising state sovereignty tends to lose its exclusiveness, leaving space for the consolidation of another dyadic relationship: norms and spaces.

As already underlined in Chapter 1, this twofold autopoietic relationship is based on the idea that the law is not a monolith but interacts with other social systems. Although these systems tend to be normatively closed, since they autonomously develop their rules internally, they are cognitively open and influenced by other systems externally.Footnote 17 This form of autopoiesis leads us to look at the law not just as the outcome of the only legitimated political structure in a certain territory but as one of the fragments composing the constitutional puzzle on a global scale.Footnote 18

An interesting example of this phenomenon can be found in the digital environment, or so-called cyberspace. At the end of the last century, Johnson and Post wrote that ‘[c]yberspace radically undermines the relationship between legally significant (online) phenomena and physical location’.Footnote 19 This is why cyberspace was considered a self-regulatory environment where bottom-up regulation replaces top-down rules issued by public authorities lacking power, effectiveness, legitimacy and notice. Besides, unlike top-down norms, which suffer from a high degree of rigidity and uniformity, bottom-up rules ensure more flexibility. Therefore, self-regulation was considered capable of providing a better regulatory framework than centralised rulemaking.Footnote 20

These positions, reflecting the gap between law and space, are one of the reasons for the firm denial of the idea of cyberspace as a new ‘world’ outside the influence of sovereign states.Footnote 21 Territorial boundaries define the limited areas within which states can exercise their sovereignty. In the case of constitutions, these legal sources provide the rules and principles governing a certain group of people in a certain sovereign space. Inside a specific territory, people are expected to comply with the law applicable in that area. The digital environment is not outside this constitutional framework. Rather than facing a ‘lawless place’, states have shown their ability to impose their sovereignty, especially by regulating network architecture.Footnote 22 According to Reidenberg, the architecture of cyberspace prescribes its own rules, constituting the basis of digital regulation while also providing instruments of regulation.Footnote 23 In the case of China, the adoption of the ‘Great Firewall’ is one of the most evident examples of how states can express their sovereign powers over the Internet by regulating the network’s architectural dimension.Footnote 24 Precisely, one of the ways to exercise power in the digital environment lies in the regulation of the online architecture.Footnote 25

Nonetheless, although public authorities can exercise their sovereign powers over the digital environment within their territories, other actors contribute to producing their own norms in turn. It is not by chance that scholars identified a ‘trend toward self-regulation’.Footnote 26 More specifically, this autopoietic trend in cyberspace also results from the code’s architecture, which contributes to defining the constitutional norms of the digital environment. As underlined by Sassen, ‘[p]rivate digital networks are also making possible forms of power other than the distributed power made possible by public digital networks’.Footnote 27 Likewise, Perritt underlines the dispersion of governance in cyberspace among a variety of public and private institutions.Footnote 28

Therefore, understanding the points where social systems overlap becomes crucial to understanding the relationships of power in the digital environment. Unlike the static vision of the ‘pathetic dot’,Footnote 29 public and private actors are ‘active dots’, since they contribute to defining their own rules and exercise regulatory powers over social systems.Footnote 30 The relationships of power in cyberspace are more complicated than they appear at first glance. There are different micro-communities which are isolated and interact independently without knowing each other.Footnote 31 However, there are some points in the network where communities overlap. In those places, it is possible to observe the exercise of power over the information flow. Examples of these points are Internet service providers, search engines such as Google, social network platforms such as Facebook or Twitter, governments and other private organisations. All these actors participate in shaping the environment where communities meet, creating room for sharing values and ideas. As underlined by Greenleaf, regulating the architecture of cyberspace is not a neutral activity but reflects the values of its governors.Footnote 32

Although all these actors contribute to shaping the overall picture, nodes do not have the same influence on the network. Some dots in the network play the role of gatekeepers,Footnote 33 affecting the structure of cyberspace more than others. According to scholars of Network Gatekeeper Theory, ‘[a]ll nodes are not created equal. Nodes vary in their accessibility, their efficacy, the other nodes they can influence and how that influence is exerted. … The capacity of a node to influence or regulate depends in large part upon its resources broadly defined to include a wide range of forms of capital in the Bourdieuian sense’.Footnote 34 The structure of nodes plays a fundamental role in the functioning of societies. Briefly, this model does not consider the individual as isolated in a specific environment; rather, every subject is part of a node which has the power to govern the network. Nodes do not have the same dimension or the same degree of development but, as centres of power, they share some common features: a strategy to govern (mentalities), modalities of governing (technologies), a definition of funds (resources) and a structure (institutions).Footnote 35

States are an example of powerful nodes. Governments define the strategy and modalities of governing, choose the resources needed to make them effective and rely on an institutional structure to execute decisions. This model can also be applied to other entities. Some actors can exercise a stronger influence over the structure of cyberspace than other dots. In other words, by virtue of their ‘gravity’, some actors, or nodes, in the network can attract other active dots, shaping online communities and, as a result, the entire network.Footnote 36 These actors are usually called macro-nodes or gatekeepers.Footnote 37 They mediate horizontally among spaces, for example, the state, the market and society.Footnote 38 For instance, governments are powerful actors influencing and attracting other nodes. However, the influence of nodes is not always equal. In states with a high degree of public intervention in the Internet sector, like China or the Arab states, these nodes can exercise more influence than in constitutional democracies, where public restrictions need to be justified and based on the rule of law. Online platforms are another example of powerful nodes which can impose their rules on the digital environment by defining and enforcing their standards on a global scale. The different weight of nodes confirms that communities are dynamic concepts whose evolution is the consequence of the relations between systems expressing different degrees of power.

Despite the ability of these systems to create their own spaces, their rules are not generated outside any logic but are influenced by other forces, including (constitutional) legal norms. It is precisely when constitutionalism overcomes state boundaries and penetrates the transnational context, including the private sector, that it loses a state-centric perspective and leads to processes of ‘constitutionalisation without the state’,Footnote 39 or beyond the state.Footnote 40 According to Teubner, this process cannot be understood just from the perspective of traditional public institutions but should be considered the expression of different autonomous subsystems of the global society.Footnote 41 In the case of the digital environment, social, technical and legal processes intertwine, with the consequence that the governance of these spaces emerges from the clash of different rationalities, where the architecture constitutes the paradigm of power.

The scope of the norms produced by public and private actors is not equal across the globe but is affected by the legal environment in which these norms are created. It is not by chance that these kinds of norms tend to flourish in liberal democracies, since these systems are characterised by general tolerance for pluralism and the principle of equality. By contrast, these autonomous systems are weaker in authoritarian regimes, where tolerance is replaced by instruments of control and surveillance.

The constitutional asymmetry among the approaches of states to the digital environment is not the only relevant point. New actors operating in the digital environment, such as online platforms, enjoy new areas of power deriving not just from a mix of business opportunities and technologies,Footnote 42 but also from the openness of democracies oriented towards digital liberalism, which has allowed these actors to accumulate power. While authoritarian models of governance aim to control and monitor online activities, democratic models tend towards digital liberalism by nature in order to respect private freedoms and promote economic and social growth.

Nonetheless, as analysed in Chapter 2, this approach devoted to digital liberalism has reduced the influence of states on private actors, which have been able to develop their own systems of governance by relying on constitutional freedoms. A new phase of liberalism, based on a fundamental transformation towards privatisation and deregulation, has triggered the development of a new space of power operating in the digital environment.Footnote 43 In other words, the legal tolerance characterising constitutional democracies has played a crucial role in defining the boundaries of platform geography as a space where online platforms self-generate their rules on a global scale. This process could be described not just as ‘the annihilation of law by space’,Footnote 44 but also as ‘the annihilation of law by law’. Merging the socio-legal and constitutional perspectives, this phenomenon can be considered an expression of the rise of civil constitutions on a global scale.Footnote 45

In order to better understand how the shift of powers from public to private actors primarily concerns constitutional democracies, the next subsections focus on two constitutional asymmetries. The first concerns the relationship between democratic and authoritarian models of digital governance, while the second focuses on the asymmetry between democratic and platform governance.

3.2.1 The First Constitutional Asymmetry

The constitutional asymmetry between liberal and illiberal models of governance provides a first angle from which to understand the challenges raised by private powers for constitutional democracies. Particularly in countries where forms of surveillance and control over information are diffused, like China and the Arab states,Footnote 46 the Internet has been subject to public controls leading to the monitoring of data,Footnote 47 or to Internet shutdowns.Footnote 48 States around the world have not all taken the same road towards the free-market approach to the Internet which Johnson and Post identified as the solution for the governance of cyberspace.Footnote 49 While a liberal approach became the standard across the Atlantic at the end of the last century, illiberal regimes have shown how public actors can regulate the digital environment, thus confirming the paternalistic positions of scholars who criticised the libertarian approach,Footnote 50 and considered network architecture the primary source of regulatory power.Footnote 51

Unlike democratic systems, which consider the Internet an instrument to foster fundamental freedoms and rights, primarily freedom of expression, authoritarian or illiberal regimes have shown less concern about censoring the digital environment.Footnote 52 In this case, Internet censorship is merely a political decision pursuing political purposes that prevail over any right or interest conflicting with the regime. The central authority aims to protect its power by dissolving personal freedoms and other constitutional values and principles such as the rule of law.Footnote 53 These models do not deny constitutional principles and limits but manipulate them as instruments for political purposes, transforming political constitutions into a façade.Footnote 54 Within this framework, the lack of pluralism and of solid democratic institutions leaves no room for freedoms whose boundaries could extend so broadly as to undermine the central authority. In the absence of any safeguard and tolerance for pluralism, censoring the digital environment is no longer a matter of freedoms and rights but is equated with other discretionary measures implemented for political purposes. Therefore, it should not come as a surprise that the first aim of authoritarian and illiberal regimes is to suppress or control the degree of pluralism to avoid any interference with the central authority.

The Internet is a paradigmatic enabler of the pluralism which these systems aim to suppress or limit. Digital technologies provide instruments to exercise fundamental rights and freedoms, particularly civil and political rights, which could potentially undermine the central authority. The examples of Internet shutdowns before elections or during protests, or of less intrusive forms of digital censorship like the suppression of false content, demonstrate how governments implement these practices without providing explanations or relying on a general legal basis.Footnote 55

In the opposite scenario, liberal and democratic models are environments open to pluralism. The expression ‘liberal democracy’ evokes values and principles such as liberty, equality, liberalism and participation rights. Authoritarianism, as already underlined, rests instead on narratives of public interest, paternalism and pragmatic decision-making. By contrast, respect for fundamental rights and freedoms lies at the basis of democratic systems. Without protecting equality, freedom of expression or assembly, it would not be possible to enjoy a democratic society. This shows why fundamental rights and democracy are substantially intertwined. Because of this substantive relationship, fundamental rights cannot easily be exploited to pursue unaccountable political ends.Footnote 56

This first constitutional asymmetry has led to a polarisation of the models for governing the digital environment. While authoritarian and illiberal systems have focused on developing their digital political economy by controlling the market and platforms, as in the case of China,Footnote 57 democratic systems have followed a liberal approach, striking a fair balance between the different rights and interests at stake, primarily the freedom of online platforms to conduct business and freedom of expression. The digital environment is a crucial vehicle for fostering fundamental rights and freedoms, especially through the services offered by private actors such as social media and search engines. Intervening in this market requires constitutional democracies to assess not only the drawbacks for innovation but also the potential disproportionate interference with economic freedoms and fundamental rights.

The openness to legal pluralism is one of the reasons for the asymmetry among models of Internet governance. This gap also provides clues to understanding why the rise of digital private powers primarily affects constitutional democracies. Democratic systems tend to ensure a political and institutional environment where private actors can potentially consolidate their powers. By contrast, in authoritarian countries there is not enough space for the private sector to exercise freedoms, let alone to turn those freedoms into forms of power.

The liberal framework driving constitutional democracies across the Atlantic has led to the consolidation of private powers in governing the flow of information online and in developing instruments of surveillance based on the processing of vast amounts of personal data. The spread of disinformation and the misuse of data are only two examples of the challenges raised by the role of private actors in the digital environment.Footnote 58 As underlined in the next subsection, these challenges are primarily the result of a constitutional asymmetry between public and private powers. While constitutional democracies are not free to restrict rights and freedoms by imposing their authority without balancing conflicting interests, private actors, in the absence of regulation, perform their business without being bound by constitutional limits.

3.2.2 The Second Constitutional Asymmetry

Governing the Internet is far from simple for constitutional democracies. Democratic systems cannot freely pursue their goals; they are (positively) bound by the principle of the rule of law as well as by the protection of fundamental rights and freedoms. Respect for these constitutional values is crucial to safeguarding democratic values. From a European constitutional standpoint, for instance, disproportionate measures to regulate the Internet are not tolerated. In the case of online platforms, Member States are required to respect the freedom to conduct business as recognised by the Charter,Footnote 59 and the Treaties protecting fundamental freedoms, especially the freedom to provide services.Footnote 60

Any attempt to regulate online platforms should comply with the test established by the Charter, based on the principles of legality, legitimacy and proportionality.Footnote 61 Therefore, in order to impose limitations on platform freedoms, it is necessary to comply with this test, which considers not only the limitation of platform freedoms but also the impact of regulation on individual rights such as freedom of expression, privacy and data protection.

The ban on abuse of rights complements this system. According to the Charter, ‘nothing in this Charter shall be interpreted as implying any right to engage in any activity or to perform any act aimed at the destruction of any of the rights and freedoms recognised in this Charter or at their limitation to a greater extent than is provided for herein’.Footnote 62 Therefore, the Union cannot recognise absolute protection for economic freedoms or fundamental rights alone. Instead, it is necessary to ensure that the enjoyment of fundamental rights does not lead to the destruction of other constitutional values. As a result, Member States need to strike a fair balance between platform freedoms and individual rights, thus respecting the core protection of these constitutional values.

Looking at the other side of the Atlantic, online platforms enjoy even broader protection, since the constitutional ground for performing their business lies not merely in economic freedoms but also in the right to free speech as recognised by the First Amendment. Precisely, the US Supreme Court applies a strict scrutiny test according to which any such law must be narrowly tailored to serve a compelling state interest, as Reno v. ACLU underlined at the end of the last century.Footnote 63 Despite the differences between the two models, in both cases online platforms enjoy a ‘constitutional safe area’ whose boundaries can be restricted only where they gain disproportionate prominence over other fundamental rights or legitimate interests. Despite the passing of years and opposing positions, this liberal approach has been reiterated more recently in Packingham v. North Carolina.Footnote 64 In the words of Justice Kennedy: ‘It is cyberspace – the “vast democratic forums of the Internet” in general, and social media in particular’.Footnote 65 Therefore, social media enjoy a safe constitutional area of protection under the First Amendment which, in the last twenty years, has constituted a fundamental bar to any attempt to regulate online speech.Footnote 66

Therefore, when addressing the challenges raised by the algorithmic society, constitutional democracies cannot rely merely on general justifications or political statements invoking the need to protect public security or other public interests. In order to restrict fundamental rights and freedoms, constitutional democracies are required to comply with constitutional procedures and safeguards. Furthermore, respect for other constitutional rights plays a crucial role in limiting the possibility of recognising absolute protection for some values over others and in promoting the development of pluralism.

Historically, the first bills of rights were designed to restrict the power of public actors rather than to interfere with the private sphere. As a result, constitutional provisions have been conceived, on the one hand, as a limit on the power of the state and, on the other, as a source of positive obligations for public actors to protect constitutional rights and liberties. Within this framework, the primary threats to individual rights and freedoms derive not from the exercise of unaccountable freedoms by private actors but from public powers.

The increasing areas of power enjoyed by transnational corporations such as online platforms challenge this constitutional paradigm. The rapid expansion of new digital technologies, combined with the choice of constitutional democracies to adopt a liberal approach to the digital environment, are two of the reasons for the rise of online platforms as private powers.

Neoliberal ideas rejecting market intervention have paved the way towards a self-regulatory environment based on individual autonomy and freedom from public interference. The application of neoliberal approaches to the digital environment has led to neglect of the critical role of public actors in ensuring democratic principles against the consolidation of actors imposing their powers in the digital age.Footnote 67 In particular, the consolidation of online platforms comes from the exploitation of the very neoliberal narrative that constitutional democracies embrace to promote pluralism and freedom.

While illiberal regimes have shown their ability to address this situation, maintaining their power by implementing instruments of control and surveillance, the laissez-faire approach of democratic systems has led to the emergence of private powers, underlining, de facto, a second constitutional asymmetry in the digital environment. In this case, in the absence of any regulation, online platforms can govern their digital spaces without any obligation to protect constitutional values. Like illiberal regimes, platforms escape constitutional obligations in pursuing their business purposes.

Despite these challenges, instead of regulating online platforms to preclude private actors from expanding their powers, constitutional democracies have, in recent decades, indirectly delegated public functions to online platforms. These observations introduce only some of the reasons leading private actors to expand their regulatory influence in the digital age and to develop autonomous forms of power. In order to understand this situation from a constitutional perspective, the next sections address the power of online platforms to exercise delegated and autonomous functions.

3.3 Delegated Exercise of Quasi-Public Powers Online

The consolidation of platform powers has not occurred by chance as the digital environment has evolved. Law and policies have contributed to supporting the consolidation of platform capitalism. Online platforms have not merely exploited the opportunities of digital technologies. The rise of digital private powers can primarily be considered the result of an indirect delegation of public functions. The shift from public to private in the digital environment is not an isolated case; it is part of a general tendency towards the transfer of functions or public tasks from lawmakers to specialised actors in both the public and the private sector.Footnote 68

This complexity is part of a larger system of delegation which no longer involves only the relationship between the lawmaker and the Government (legislative-executive) but also two new branches: public bodies such as agencies (the fourth branch) and private entities dealing with delegated public tasks (the fifth branch). The delegation of public functions is not a unitary phenomenon. It can include agreements between public and private actors based on public-private partnership schemes allowing private entities to provide goods or services.Footnote 69 The cases of smart cities or governmental services are clear examples of the shift of responsibilities from the public sector to private entities through instruments of public procurement.Footnote 70 In other cases, the delegation of public functions consists of the creation of new (private or public) entities to perform public tasks, such as the provision of products and services or the support of rule-making activities. Here, the establishment of a new government corporation or agency is one of the most evident examples.Footnote 71

More than fifteen years ago, this shift of power from public to private actors in the digital environment captured the attention of scholars, who started to consider how public law could be extended to a multi-stakeholder and decentralised system like the Internet. Boyle already wondered whether the Internet would lead to a transformation challenging basic assumptions not only about economics but also about constitutional and administrative law.Footnote 72 As reported by Kaplan in the aftermath of ICANN’s foundation, Zittrain referred to a ‘constitutional convention in a sense’.Footnote 73 At that time, it was clear that ICANN, governing the Internet architecture, was in ‘a position to exercise a substantial degree of power over the supposedly ungovernable world of the Internet’.Footnote 74 The case of ICANN was the first example of the delegation of regulatory powers over the digital environment to an agency or other entities. Froomkin underlined how, in the case of ICANN, the Government was violating the Administrative Procedures Act (APA) and going beyond the non-delegation doctrine derived from the interpretation of Article I of the US Constitution and the principle of the separation of powers.Footnote 75 Likewise, Weinberg underlined how ICANN played the role of a public authority since ‘a private entity wielding what amounts to public power may be subjected to constitutional restraints designed to ensure that its power is exercised consistently with democratic values’.Footnote 76

These challenges had already unveiled some of the primary concerns that are relevant for examining platform powers. At the beginning of this century, scholars defined the cooperation between public actors and online intermediaries as the ‘invisible handshake’,Footnote 77 based on the idea that public actors rely on private actors to pursue their aims online outside constitutional safeguards. For instance, the use of online intermediaries for law enforcement purposes could support public tasks by mitigating the complexity of enforcement in the digital environment. In this case, online intermediaries would provide the infrastructural capabilities to pursue public policies online since they govern the digital spaces where information flows, no matter whether it crosses national borders. In other words, online intermediaries, like other private entities, were considered an instrument for public actors to ensure the enforcement of public policies rather than a threat leading to the rise of new powers online. The size of the infrastructure they provide is of particular interest for public authorities seeking access to information to pursue public tasks.

When focusing on the digital environment, rather than following a trend towards agencification, public actors have recognised the role of online intermediaries in enforcing public policies online. At the beginning of this century, Reidenberg underlined the dependency of the public sector on online intermediaries. He defined three modalities to ensure the enforcement of legal rules online: network intermediaries, network engineering and technological instruments.Footnote 78 Regarding the first approach, Reidenberg explained how public actors can rely on online platforms to ensure the enforcement of public policies online. States do not have the resources to pursue every wrongdoer acting in the digital environment. Already at the beginning of this century, the spread of peer-to-peer and torrent mechanisms unveiled the complexity of investigating, prosecuting and sanctioning millions of infringers every day. In such situations, online providers can function as ‘gateway points’ to identify and block illicit behaviours by acting directly on the network structure. In this way, governments would regain control over the Internet by using platforms as proxies to reaffirm their national sovereignty online.Footnote 79 In recent years, different regulatory models have arisen, moving from traditional approaches like ‘command and control’ to other models,Footnote 80 such as co-regulation, self-regulation and codes of conduct.Footnote 81 The choice of models outside the control of public actors stems from expertise increasingly found outside the government.Footnote 82

The shift of power from the public to the private sector can be interpreted not only as the consequence of economic and technical forces but also as the result of the changing influence of constitutional democracies in the field of Internet governance.Footnote 83 The delegation of public functions to online platforms is linked to the opportunity to rely on entities governing transnational areas such as the digital environment. Governments have increasingly started to rely on online platforms, for instance, to offer public services or improve their quality through digital and automated solutions, as in the case of the urban environment.Footnote 84 However, this cooperation leads, firstly, to tech companies holding a vast amount of information coming from the public sector, including personal data. Secondly, since public actors are increasingly technologically dependent on private actors, platforms can impose their conditions when agreeing on partnerships or other contractual arrangements with public actors. For instance, the use of artificial intelligence developed by private companies and then implemented by public authorities in welfare programs or criminal justice is another example where the (private) code and the accompanying infrastructure mediate individual rights and public functions.Footnote 85 Moreover, governments have also forfeited power to private actors by providing national services based on digital infrastructure governed by the private sector.Footnote 86 In other words, rather than preserving public functions or creating a new administrative body to deal with these areas of power, public actors have considered it more convenient to rely on entities which know how to do the job. Online platforms can indeed influence public policy due to the dependency of the public sector, especially for surveillance purposes, and the interest of citizens in accessing digital services which would otherwise not be offered by the public sector.
The case of online contact tracing of COVID-19 has shown this intimate relationship between public and private actors in the algorithmic society.Footnote 87

Even if online platforms can play a critical role in ensuring the enforcement of policies in the digital environment, delegating public functions to the private sector entails the transfer of the power to set the rules of the game through a mix of law and technology. Online platforms can indeed set the technical rules and the degree of transparency of their technologies, thus precluding public actors from exercising any form of oversight. Whether direct or indirect, the delegation of public functions to private actors touches upon some of the most intimate features of constitutional law: the constitutional divide between public and private actors, the separation of powers, the principle of the rule of law and, even more importantly, the democratic system. Although the gap between public and private actors could seem formal at first glance, this distinction involves the core of constitutional law and, especially, how constitutional provisions apply vertically only to public bodies, while private actors are not required to comply with these boundaries given the lack of any regulatory intervention.

This constitutive difference can explain why the transfer of public functions to the private sector is subject to constitutional limits. These boundaries aim to control to what extent lawmakers can transfer or delegate authority to other (public or private) entities and which constitutional safeguards should apply to avoid a dangerous marginalisation of democratic values in favour of non-accountable logics. These challenges have already emerged in other sectors where financial institutions, telecom companies and other infrastructure providers own the resources and the means to impose private standards on public values.Footnote 88 This concern was already expressed by Brandeis, who defined this situation as the ‘curse of bigness’ to underline the role of corporations and monopolies in the Progressive Era.Footnote 89 However, unlike traditional forms of delegating public functions, online platforms can exercise powers deriving from an indirect form of delegation which is not backed by public safeguards and oversight.

Delegating online platforms to perform public tasks online is not problematic per se. It is the lack of procedural and substantive safeguards that raises constitutional challenges, since it leaves the private sector free to consolidate its power. Precisely the idea of a government ‘of the people, by the people, for the people’ is put under pressure when public functions are left to the discretion of non-accountable private actors establishing standards driven by business interests. Looking at US constitutional law, the ban on Congress delegating power ‘is a principle universally recognized as vital to the integrity and maintenance of the system of government ordained by the Constitution’.Footnote 90 Moving to the European framework, the ECJ has clarified the boundaries of delegation from the Union’s institutions to agencies and private actors by, de facto, creating a judicial non-delegation doctrine.Footnote 91 As observed by the Strasbourg Court, ‘the State cannot absolve itself from responsibility by delegating its obligation to private bodies or individuals’.Footnote 92 This is because ‘the fact that a state chooses a form of delegation in which some of its powers are exercised by another body cannot be decisive for the question of State responsibility … ; [t]he responsibility of the respondent State thus continues even after such a transfer’.Footnote 93

This view has not only been questioned by the increasing reliance of public powers on other public bodies, such as agencies and independent administrative authorities, to face the technocratic reality of the administrative state.Footnote 94 It has also been challenged by a general trust in the role of the private sector, or rather the belief, at the end of the last century, that digital liberalism would be the most suitable approach for the digital environment. Therefore, when delegating public functions to private actors, public safeguards limit the consolidation of unaccountable powers. In other words, the aim of these safeguards is to avoid a dangerous uncertainty resulting from the mix of, quoting Boyle when referring to ICANN, ‘public and private, technical harmonization and political policy choice, contractual agency relationship and delegated rulemaker, state actor and private corporation’.Footnote 95

The rise of digital liberalism at the end of the last century has led to a shift of power and responsibility from public actors to the private sector based on a technological optimism which, given the lack of any safeguard, is misplaced for at least two reasons. Firstly, private actors are not bound to respect constitutional values and principles such as fundamental rights. The absence of any regulatory safeguard or incentive therefore leaves private actors free to choose how to shape constitutional values based on their business interests. Secondly, even supporting self-regulation leaves the private sector free to impose standards which not only influence public values but also affect private entities subject to horizontal forms of authority resulting from a mix of regulatory, economic and technological factors.

The next subsections underline how public actors have indirectly delegated public functions to online platforms. In the field of content, the analysis focuses on how the liability regime of online intermediaries has played a part in encouraging platforms to moderate content and in setting the standard of protection of freedom of expression in the digital environment. The second subsection focuses on the role of European data protection law in entrusting online platforms with discretion over the processing of personal data.

3.3.1 Delegating Powers on Content

The US and European regimes of online intermediaries could be considered examples of delegating public functions in the field of content. The Communications Decency Act,Footnote 96 together with the Digital Millennium Copyright ActFootnote 97 and the e-Commerce Directive,Footnote 98 have not only introduced a special regime of immunity or exemption of liability for online intermediaries, acknowledging, in abstracto, their non-involvement as content providers.Footnote 99 These instruments have also recognised the power of intermediaries to make decisions on content, thus determining the lawfulness of online speech.

Despite this allocation of power, these systems neither provide any procedural safeguard nor require any administrative or judicial filter determining ex ante whether content is illicit in a specific case. The e-Commerce Directive refers to the protection of freedom of expression only when underlining its functional role in the free movement of information society services,Footnote 100 and when clarifying that the removal or disabling of access to online content must be undertaken in observance of the principle of freedom of expression and of procedures established for this purpose at national level. It is not by chance that another Recital clarifies that Member States can require service providers to apply ‘duties of care’ to detect and prevent certain types of illegal activities. These provisions are not binding since they simply play the role of interpretative guidelines for Member States implementing the e-Commerce Directive.Footnote 101 Moreover, even if the Recitals of the e-Commerce Directive refer to the need for online intermediaries to respect the right to free speech when they moderate content, it is not clear whether this interpretative statement refers to the protection ensured, at that time, by Article 10 of the Convention or also to a functional dimension of freedom of expression resulting from the need to ensure the free movement of information society services. This acknowledgement contributes to entrusting online platforms with the power to enforce and adjudicate disputes in the field of online content based on a standard of protection which is not only unclear but also shaped by business interests. This system of liability has therefore contributed to entrusting online intermediaries with the power to decide how to organise information as well as whether to remove or block content, thus creating the basis for business models based on third-party content sharing with limited risks of secondary liability.
Due to the lack of transparency and accountability safeguards, online platforms are not required to consider the impact of their activities on fundamental rights and democratic values.

Moreover, the notice and takedown approach leads online platforms to perform this function based on the risk of being held liable, thus raising questions about collateral censorship.Footnote 102 Since online platforms are run privately, these actors try to avoid the risk of being sanctioned for non-compliance by removing or blocking, in particular, content whose illicit nature is not fully evident. The case of disinformation provides an interesting example. Since it is not always possible to understand whether false content is unlawful and, eventually, on which legal basis, this legal uncertainty encourages online platforms to monitor and remove even lawful speech to avoid any risk of liability.Footnote 103 This obligation encourages online intermediaries to censor even content whose illicit nature is not clear as a means to avoid economic sanctions.Footnote 104 The Strasbourg Court has also underlined this risk, which can produce chilling effects on freedom of expression.Footnote 105 In other words, online intermediaries, as business actors, are likely to focus on minimising this economic risk rather than adopting a fundamental-rights-based approach, thus allowing private interests to prevail over constitutional values.

If the moderation of content is not an issue when online intermediaries merely perform passive hosting functions, the same trust in private enforcement can be questioned by observing how platforms profit from moderating the content of billions of users on a daily basis. In this case, the role of online platforms is not merely passive, given their business-driven interest in organising and removing content. It is here that trust in the market turns into a fear that these actors can influence not only individual rights but also democratic values. Therefore, such delegated activity implies, inter alia, that platforms can take decisions affecting fundamental rights and democratic values.Footnote 106

Even if, as analysed in Chapter 2, the Union has started to limit platform discretion in content moderation, there are still constitutional drawbacks. Firstly, taking decisions on the lawfulness of content is a function traditionally belonging to public authorities. Instead, platforms are called upon to assess the lawfulness of the content in question in order to remove it promptly. Given the lack of any regulation of this process, online platforms are free to assess whether expressions are unlawful and to make decisions regarding the removal or blocking of online content based on the contractual agreement with users. As a result, this anti-system has led platforms to acquire an increasing influence over the enforcement and balancing of users’ fundamental rights. For example, the choice to remove or block defamatory content or hate speech videos interferes with users’ right to freedom of expression. Likewise, the decision about the need to protect other conflicting rights, such as the protection of minors or human dignity, is left to private actors without any public guarantee.

This shift of responsibility also calls for the introduction of effective and appropriate safeguards to prevent the unintended removal of lawful content and to ensure respect for fundamental rights and democratic values.Footnote 107 However, this is not the current situation. As already stressed, the e-Commerce Directive does not provide any safeguards limiting platforms’ discretion. Obligations are indeed directed to Member States, while online platforms, as hosting providers, are merely required to remove content once they become aware of its illicit nature.

Within this framework, as examined in Chapter 5, the primary issue is the lack of any accountability, transparent procedure or redress mechanism limiting platform power in the field of content. For example, platforms are neither obliged to explain the reasons for removing or blocking online content nor to provide remedies against their decisions, even though they process a vast amount of content. While waiting for the effects of the Digital Services Act,Footnote 108 users cannot rely on a horizontal legal remedy against autonomous decisions of online platforms affecting their rights and freedoms. This situation raises concerns even for democratic systems. Delegating online platforms to make decisions on content empowers these private actors to influence public discourse. The deplatforming of former US President Trump is a paradigmatic example of the power an online platform can exercise over online speech. This issue is also connected to the autonomous powers these private entities can exercise by setting the standard of protection of fundamental rights online, including the right to freedom of expression, which is one of the cornerstones of democracy.

Nonetheless, the expansion of digital private powers does not concern just the field of content. As the next subsection will examine, even in the field of data, the Union has contributed to extending platforms’ power to make decisions based on a risk-based approach.

3.3.2 Delegating Powers on Data

The field of data has also experienced a process of delegation of public functions to the private sector. Unlike the case of content, however, the primary concerns are not related to the lack of safeguards but to the risk-based approach which the Data Protection Directive introduced in 1995.Footnote 109 The WP29 stressed the role of a risk-based approach in data protection, underlining how risk management is not a new concept in data protection law.Footnote 110

Even the Council of Ministers of the Organisation for Economic Co-operation and Development (OECD) implemented a risk-based approach when revising the Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data, first adopted in 1980,Footnote 111 for instance, concerning the implementation of security measures.Footnote 112 According to the Data Protection Directive, security measures must ‘ensure a level of security appropriate to the risks represented by the processing and the nature of the data to be protected’.Footnote 113 Even more importantly, the assessment of risk was also considered a legal basis for the processing of personal data when the processing was ‘necessary for the purposes of the legitimate interest pursued by the controller or by the third party or parties to whom data are disclosed, except where such interests are overridden by the interests for fundamental rights and freedoms of the data subjects’.Footnote 114 In both cases, this assessment rested in the hands of the data controller, which ‘determines the purposes and means of the processing of personal data’. This definition explains how the governance of personal data is not determined by public authorities alone but also depends firmly on the choices of the data controller. Beyond these cases, the relevance of risk also extends to Member States, whose data protection authorities are called to assess specific risks arising from the processing of personal data.Footnote 115

The GDPR has consolidated this approach by introducing a comprehensive risk-based approach built upon the principle of accountability of the data controller.Footnote 116 As underlined in Chapter 2, the principle of accountability requires the controller to prove compliance with the general principles of the GDPR by establishing safeguards and limitations based on the specific context of the processing, especially on the risks for data subjects. The GDPR modulates the obligations of the data controller according to the specific context in which the processing takes place.Footnote 117 As observed by Macenaite, ‘risk becomes a new boundary in the data protection field when deciding whether easily to allow personal data processing or to impose additional legal and procedural safeguards in order to shield the relevant data subjects from possible harm’.Footnote 118 It is enough to focus on the norms concerning the Data Protection Impact Assessment,Footnote 119 or the appointment of the Data Protection Officer,Footnote 120 to understand how the GDPR has introduced not mere obligations to comply but a flexible risk-based approach which leads to different margins of responsibility depending on the context at stake.Footnote 121 In other words, the GDPR has led to the merging of a rights-based approach with a risk-based approach based on a case-by-case assessment of the responsibility of data controllers.

However, the risk-based approach leads data controllers to play a critical role in defining not whether but how to comply with the GDPR. This system entrusts data controllers with deciding the appropriate safeguards and procedures which, in a specific context, would be enough to align with the general principles of the GDPR. This approach is also the result of a dynamic definition of the data controller’s responsibilities based on the nature, the scope of application, the context and the purpose of the processing, as well as the risks to individuals’ rights and freedoms. On this basis, the data controller is required to implement appropriate technical and organisational measures to guarantee, and be able to demonstrate, that the processing is conducted in accordance with the GDPR.Footnote 122 The principles of privacy by design and by default contribute to achieving this purpose by imposing an ex ante assessment of compliance with the GDPR and, as a result, with the protection of the fundamental right to data protection.Footnote 123 Put another way, the GDPR focuses on promoting a proactive rather than a reactive approach based on the assessment of the risks and context of specific processing of personal data. A paradigmatic example of this shift is the obligation for the data controller to carry out the Data Protection Impact Assessment, which, however, is a mandatory step only in the cases defined by the GDPR and based on the data controller’s sensitivity.Footnote 124 This obligation requires data controllers to conduct a risk assessment based not only on business interests but also on data subjects’ (fundamental) rights. In other words, the risk-based approach introduced by the GDPR could be considered a delegation to the data controller of the power to balance conflicting interests, thus making the controller the ‘arbiter’ of data protection.

However, the GDPR does not exhaust the concerns about delegation in the field of data. Even before the adoption of the GDPR, the ECJ had contributed to extending platform powers in delisting online content. Even without analysing the well-known facts of the landmark decision in Google Spain,Footnote 125 the ECJ carved out a new right to be forgotten as a part of the right to privacy and data protection in the digital age.Footnote 126 In order to achieve this aim, the ECJ, as a public actor, interpreted the framework of fundamental rights together with the provisions of the Data Protection Directive and, de facto, entrusted private actors, more precisely search engines, with delisting online content, without removing the information itself, upon the request of the individual concerned.

However, unlike the case of content, both the ECJ and the EDPB, and before it the WP29, have identified some criteria according to which platforms shall assess the request of the data subject.Footnote 127 Moreover, the recent European codification of the right to erasure has contributed to clarifying the criteria for applying the right to delist. Precisely, the data subject has the right to obtain from the controller, without undue delay, the erasure of personal data concerning him or her on specific grounds,Footnote 128 while such a right is excluded in other cases,Footnote 129 for example when the processing is necessary for exercising the right to freedom of expression and information.

Although the data subject can rely on a legal remedy by lodging a complaint with the public authority to have their rights protected, the autonomy of platforms remains a relevant concern. When addressing users’ requests for delisting, the balancing of fundamental rights is left to the assessment of online platforms. As explained in Chapter 4, this issue overlaps with the concerns about the notice and takedown mechanism in the field of content, since search engines enjoy a broad margin of discretion when balancing fundamental rights and enforcing their decisions. In the case of the right to be forgotten online, search engines decide whether the exception relating to the freedom to impart information applies in a specific case. They delist search results by relying only on their internal assessments, and they are not obliged to provide any reason for their decisions or any redress mechanism. Therefore, the online enforcement of the right to be forgotten is another example of the (delegated) powers that platforms exercise when balancing and enforcing fundamental rights online.

Even if, in the field of data, the primary concerns do not result from the lack of procedural safeguards, the adoption of the risk-based approach still demonstrates the risk of entrusting private actors with functions which increasingly mirror the powers exercised by public authorities. Even if, as underlined in Chapter 6, the risk-based approach plays a critical role for European digital constitutionalism, it should not be neglected that the structure of European data protection law has entrusted the private sector with important decision-making functions concerning fundamental and democratic values. It is not by chance that, as in the field of content, the delegation of power to online platforms has also contributed to consolidating autonomous powers outside the oversight of public authorities.

3.4 Autonomous Exercise of Quasi-Public Powers Online

The indirect delegation of public functions has not only expanded platform powers. It has also extended the autonomy of online platforms, leading these actors to consolidate areas of power beyond delegation. Indeed, the liberal constitutional approach across the Atlantic has encouraged online platforms to exploit technology not only to become proxies of public actors but also to rely on their freedoms to set their own standards and procedures.

Although online platforms are still considered service providers, the consequences of their gatekeeping role cannot be neglected. The possibility of autonomously setting the rules according to which information flows and is processed on a global scale increases the discretion of these private actors. As Pasquale observed, ‘in functional arenas from room-letting to transportation to commerce, persons will be increasingly subject to corporate, rather than democratic, control’.Footnote 130 Daskal underlined the ability of private actors to set the rules governing the Internet.Footnote 131 Intermediaries have increasingly arisen as surveillance infrastructures,Footnote 132 as well as governors of digital expression.Footnote 133

These functional expressions of power increasingly compete with states’ authority based on the exercise of sovereign powers over a certain territory.Footnote 134 This consideration highlights why some scholars have referred to this phenomenon as the rise of the law of the platforms.Footnote 135 Put another way, online platforms have developed their own private geography on a global scale. The possibility to autonomously set the rules according to which data flows and is processed increases the discretion of private actors.Footnote 136 In the laissez-faire scenario, data and information started being collected globally by private actors for business purposes. Whereas the Internet has allowed private actors to gather information and develop their businesses, today algorithmic technologies enable such actors to process vast amounts of data (or Big Data) to extract value. This processing has increased the economic and political power of some private actors in the digital age, where the monopoly over knowledge no longer belongs exclusively to public authorities but is shared with private actors. From a transnational constitutional perspective, this phenomenon can be described as the rise of a civil constitution outside institutionalised politics. According to Teubner, the constitution of a global society cannot result from a unitary and institutionalised effort but emerges from the constitutionalisation of autonomous subsystems of that global society.Footnote 137

Therefore, these challenges do not just concern the limits faced by public actors in regulating the Internet but, more importantly, how constitutional democracies can avoid the consolidation of private powers whose nature is more global than local.Footnote 138 As already underlined, constitutional democracies aim to protect freedoms and pluralism. The primary challenge arises when such tolerance contributes to the rise of private powers that centralise control and exclude any form of pluralism. In other words, in a circular way, the rise of private power would threaten the goal of constitutional democracies to protect a pluralistic environment. This is why the next subsections focus on the exercise of autonomous powers by online platforms. The first examines how this situation can be considered a new status subjectionis or digital social contract. The second describes how platforms enjoy areas of freedom that de facto represent the exercise of quasi-public powers online.

3.4.1 A New Status Subjectionis or Digital Social Contract

In 2017, Zuckerberg stated that ‘Great communities have great leaders’ and ‘we need to give more leaders the power to build communities’.Footnote 139 These expressions might not raise concerns at first glance. Nevertheless, they indirectly reflect the inspirational values of Facebook. The success of online communities does not come from users’ participation and involvement but from the power of their leaders. The will of the leader, receiving its investiture from the company, shapes communities. This narrative is far from democratic. However, these pharaonic statements should not surprise, since online platforms, as business actors, are not keen on democratic forms of participation based on transparency and accountability. They care more about ensuring sound and stable governance driven by profit maximisation.

Therefore, the starting point for understanding the exercise of autonomous powers by online platforms is the vertical regulation of users, reflecting the relationship between authority and subjects. At first glance, a contractual agreement governs the relationship between users and online platforms. Users decide spontaneously to adhere to the rules established in ToS and community guidelines. Nonetheless, ToS are not just contracts but instruments of governance. It is not by chance that these agreements have been analysed as the constitutional foundation of online platforms’ activities.Footnote 140 As Radin explained, businesses generally try to exploit new forms of contracts to overrule legislation protecting parties’ rights.Footnote 141 Contract law allows private actors to exercise a regulatory authority over a private relationship ‘without using the appearance of authoritarian forms’.Footnote 142 According to Slawson, contracts, and especially standard forms, hide an antidemocratic tendency: ‘[s]ince so much law is made by standard form it is important that it be made democratically’.Footnote 143 Users enter digital spaces where private companies are ‘both service providers and regulatory bodies that govern their own and their users’ conduct’.Footnote 144 It is not by chance that Zuboff describes the aim of ToS as a ‘form of unilateral declaration that most closely resembles the social relations of a pre-modern absolutist authority’.Footnote 145 Likewise, MacKinnon describes this situation as a Hobbesian approach to governance, where users give up their fundamental rights to access and enjoy digital services.Footnote 146 In other words, moving from private to constitutional law, platforms vertically govern their communities, and the horizontal relationships between users, through a mix of technology and contract law.Footnote 147

Besides, the role of online platforms as social infrastructures annihilates any contractual power of the user, making the relationship between users and platforms vertical rather than horizontal. The digital dominance of online platforms plays a critical role in daily life.Footnote 148 They offer people services, for example, to find resources online (i.e. search engines), buy products and services (i.e. e-commerce marketplaces), and communicate and share information and data with other people (i.e. social media). And this role has been confirmed even during the pandemic. Without even considering their market power, it is enough to look at the number of users of Facebook or Google to understand that their communities are bigger than entire regions of the world,Footnote 149 so that the definition of a ‘company-town’ would seem reductive.Footnote 150

The inhabitants of these digital spaces rely on online platforms as primary channels for news, for managing intimate and professional relationships, and for advertising their businesses. According to Pasquale, the real product here is users’ information and data.Footnote 151 The company can exercise a form of private monitoring over the content and data shared, not so different from governmental surveillance. Indeed, Kim and Telman underline how ‘private data mining is just as objectionable and harmful to individual rights as is governmental data mining’,Footnote 152 and ‘because corporate actors are now empowered to use their technological advantages to manipulate and dictate the terms on which they interact with the public, they govern us in ways that can mimic and even supersede governance through democratic processes’.Footnote 153 Therefore, platform power is not just a matter of quantity but also of quality. In other words, online platforms have acquired their areas of power not only from the amount of data they hold or their scale but also from their gatekeeping role based on the organisation of online spaces for billions of users.Footnote 154

These digital spaces governed by online platforms are not based on horizontal systems where communities decide and shape their rules but on vertical contractual relationships resembling a new pactum subjectionis or digital social contract. Users bargain away their constitutional rights to adhere to conditions determined through a top-down approach driven by business interests. As rulers of these digital spaces, online platforms define a private geography of power based on norms and spaces whose boundaries escape the traditional notion of territorial sovereignty.

The mix of automated technologies of moderation with internal and community guidelines reproduces a system of constitutional rules and principles governing communities. As Evans explains, the rules and penalties imposed by platforms mirror (and, in some cases, substitute) those adopted by public authorities.Footnote 155 In this para-constitutional framework, the vertical and horizontal relationships of users, and therefore the exercise of their rights and freedoms, are privately determined without the substantive and procedural safeguards that democratic constitutional norms traditionally offer. Within this authoritarian framework, as observed by Shadmy, ‘corporate services … transforms rights in the public imaginary into privileges that the company grants and can revoke, according to its own will and interest’.Footnote 156

Besides, the power to shape and determine rights and freedoms in the digital environment is not the only concern. The vertical relationship between community and platform reflects the characteristics of an absolute regime rather than those of a private constitutional order. Online platforms set the rules governing their communities without involving users, who have no instrument of participation to determine the rules of the game. Even if online platforms offer their spaces as instruments to foster democracy, there is no space for democratic participation.Footnote 157 Although online platforms base their narrative on their role in establishing a global community, it is worth wondering how it is possible to reach an agreement upon common rules within communities which, in some cases, are made up of up to two billion people. One could argue that users can participate in the platforms’ environment by choosing to hide news or opting in to specific data regimes. However, it should not be forgotten that online platforms set these options, thus leaving users with the mere feeling of freedom in their digital spaces.

In this regard, Jenkins distinguishes between participation and interactivity.Footnote 158 According to Jenkins, ‘Interactivity refers to the ways that new technologies have been designed to be more responsive to consumer feedback’, while ‘Participation, on the other hand, is shaped by the cultural and social protocols’. Translating this distinction into the field of online platforms, it is possible to observe that there is no participation, since online platforms autonomously define the protocols while inviting users to engage and interact. Platforms foster interactivity as an alternative to participation, which creates a feeling of trust and involvement in online platforms’ determinations. The rights and freedoms in the digital environment are not just the result of democratic participation (‘bottom-up’) but also of privileges granted by online platforms (‘top-down’). In this case, constitutional values and principles compete with discretionary private determinations which are not required to respect constitutional safeguards and act like an absolute power.

The lack of any participatory instrument or transparency makes individuals subject to the autonomous powers exercised by online platforms, leading to a process of ‘democratic degradation’.Footnote 159 Therefore, it is not just a matter of formal adherence to boilerplate clauses but of the lack of participation in activities which affect the rights and freedoms of billions of people in the world. This situation also extends to the lack of transparency and redress mechanisms. Although data protection law provides more safeguards on this point, it is possible to observe how online platforms generally escape accountability for their conduct. Within this framework, it could be argued that the power exercised by online platforms mirrors, to some extent, the discretion which an absolute power can exercise over its subjects.

3.4.2 The Exercise of Autonomous Powers

The vertical relationship between platform powers and users is not the only piece of this authoritarian puzzle. It is also critical to understand how online platforms express autonomous forms of power. Through ToS and community guidelines, platforms unilaterally establish what users can do in their digital spaces. Platforms rely on private freedoms to regulate relationships with their online communities, precisely determining how content and data are governed online. In the field of content, this is particularly evident, as underlined in Chapter 2. In the absence of any regulation of the process through which expression is moderated, platforms are free to set the rules according to which speech flows online. In the field of data, the GDPR introduces safeguards and obligations on the one hand but, on the other, leaves data controllers broad margins of discretion in assessing the risks to data subjects’ fundamental rights and in proving compliance with data protection principles according to the principle of accountability.

Regulating speech and data is usually the result of legislative fights and constitutional compromises. By contrast, online platforms autonomously set standards and procedures through instruments of contract law, even though they operate transnationally and are driven by business purposes. At first glance, the significance of this situation from a public (or rather constitutional) law perspective may not be evident, both because boilerplate contracts are very common even in the offline world and because ToS do not seem to differ from the traditional contractual model.Footnote 160 Boilerplate contracts provide clauses based on standard contractual terms that are usually included in other agreements.Footnote 161 However, as Jaffe underlined in the first half of the last century, contract law can be considered a delegation of lawmaking powers to private parties,Footnote 162 and this extends to the private governance of the digital environment.Footnote 163 Put another way, these agreements compete with the traditional way norms and powers are conceived as expressions of public authority.

By defining the rules to enforce private standards as well as the procedural and technical tools underpinning their ToS, platforms govern their digital spaces.Footnote 164 In this sense, ToS constitute the expression of a quasi-legislative power. The lack of transparency and accountability in online platforms’ decision-making processes makes it impossible to assess whether platforms comply with legal standards, internal guidelines or other business purposes.Footnote 165 These instruments do not ensure the same degree of protection as public safeguards.Footnote 166 Although this autonomy is limited in some areas, such as data protection, the global application of their services and the lack of any legal rule regulating online content moderation leave a broad margin of political discretion in their hands when drafting their ToS. In other words, similarly to the law, these private determinations can be considered the legal basis according to which platforms exercise their powers, or an expression of how platforms can promote an autopoietic set of rules which competes with the law.

Besides, the exercise of quasi-legislative functions is not the only expression of platform powers. Online platforms can enforce the contractual clauses provided for in their ToS directly, without relying on public mechanisms such as a judicial order or the intervention of law enforcement authorities. For instance, the removal of online content or the erasure of data can be performed directly and discretionarily by online platforms without the involvement of any public body ordering the infringing party to fulfil the related contractual obligations. This technological asymmetry constitutes the fundamental difference from traditional boilerplate contracts, whose enforcement strictly depends on the role of public authority in ensuring respect for the rights and obligations upon which the parties have agreed. Here, the code assumes the function of the law,Footnote 167 and the network architecture shows its role as a modality of regulation.Footnote 168 Platforms can directly enforce their rights through a quasi-executive function. This private enforcement is the result of an asymmetrical technological position with respect to users. Platforms are the rulers of their digital spaces since they can manage the activities which occur within their boundaries. This power, which is not delegated by public authorities but results from the network architecture itself, is of special concern from a constitutional perspective, since it represents not only a form of disintermediation of the role of public actors but also the constitutionalisation of self-regulation.Footnote 169

Together with these normative and executive functions, online platforms can also exercise a quasi-judicial power. Platforms have shown that they perform functions which are similar to judicial powers and especially mirror the role of constitutional courts, namely the balancing of fundamental rights. When receiving a notice from users asking for content removal or delisting, platforms assess which fundamental rights or interests should prevail in the case at issue in order to render a decision. Taking as an example allegedly defamatory content signalled by a user, the platform could freely decide that the right to inform prevails over human dignity. The same consideration applies when focusing on how the right to privacy should be balanced with freedom of expression. This situation is evident not only when platforms moderate content but also when they process personal data, exploiting the loopholes of data protection law. These decisions are based on their business purposes, without any obligation to respect or take into account fundamental rights. The result is a chilling effect on fundamental rights and, even more importantly, the establishment of paralegal frameworks in the algorithmic society.

Furthermore, adding another layer of complexity – and concern – is the possibility that these activities can be executed using automated decision-making technologies. On the one hand, algorithms can be considered technical instruments facilitating platforms’ functionalities, such as the organisation of online content and the processing of data. On the other hand, such technologies can constitute technical self-executing rules, obviating even the need for a human executive or judicial function.

The use of automated decision-making technologies is not neutral from a constitutional law perspective. Humans and machines shape the choices of online platforms.Footnote 170 The lack of transparency and accountability in these systems challenges fundamental rights and democratic values.Footnote 171 The new relationship between human and machine in the algorithmic society leads to an increase in platform powers. Within this framework, there is no room for users to complain against abuses of power. The governance of decision-making is not shared but centralised and driven by unaccountable purposes. As underlined by Hartzog, Melber and Salinger, ‘our rights are established through non-negotiable, one-sided and deliberately opaque “terms of service” contracts. These documents are not designed to protect us. They are drafted by corporations, for corporations. There are few protections for the users – the lifeblood powering social media’.Footnote 172

From a constitutional perspective, users, as members of online communities, are subject to a contractual authority exercised by platforms through instruments of private law mixed with technology (i.e. the law of the platforms). The three traditional public powers are centralised when focusing on platforms’ quasi-public functions: the definition of the rules to assess online content, their enforcement and the balancing of rights are all practised by the platform without any separation of powers. Constitutionalism has primarily been based on the idea of the separation of powers, as theorised by Charles de Secondat, Baron de Montesquieu,Footnote 173 and the protection of rights and freedoms.Footnote 174 In contrast, the governance of online platforms reflects a private order whose characteristics are not inspired by democratic constitutionalism but are closer to the exercise of absolute power. Precisely, this is not the rise of a ‘private constitutional order’, since neither the separation of powers nor the protection of rights is granted in this system.Footnote 175 This framework has led some authors to refer to this phenomenon as a return to feudalism,Footnote 176 or to the ancien régime.Footnote 177

3.5 Converging Powers in the Algorithmic Society

The rise of online platforms as digital private powers is not an unexpected consequence if framed within the global trends challenging constitutional democracies. Globalisation, also driven by neoliberal narratives, has contributed to the rise of transnational actors producing and shaping norms beyond national boundaries. This situation weakens the relationship between ‘law and territory’ and enhances that between ‘norms and space’. The evolution of different systems leads to the emergence of different institutions which operate according to their internal rationality. As a result, the unity of the state and the role of law are slowly replaced by the fragmentation of new institutions expressing their principles and values on a global scale.

The digital environment constitutes a sort of battlefield between these systems. Different rationalities influence each other since they are cognitively open, although they develop their rules according to their internal norms and procedures, which are based on autonomous principles. On the one hand, public authorities can adopt binding rules censoring content or restricting access to the Internet, also enjoying the exclusive monopoly on the use of force. On the other hand, online platforms develop standards and procedures answering to their business logic, thus inevitably providing a model competing with public powers. Both systems develop their rules according to their internal procedures and logic in a continuous interaction, defining different relationships of power.

While authoritarian systems have responded to these threats by extending their powers to the digital environment to protect central authority, constitutional democracies, which are instead open environments for freedoms, have adopted a liberal approach entrusting online platforms with public tasks in the digital environment. This transfer of responsibilities has also been driven by the ability of platforms, as gatekeepers, to enforce public policies online. Although the delegation of public tasks to private actors should not be considered a negative phenomenon per se, the lack of safeguards leaves these actors free to exercise private powers. Unlike public actors, online platforms are not required to pursue public interests such as the protection of fundamental rights and democratic values.

Therefore, without instruments fostering transparency and accountability, even the indirect delegation of public functions contributes to the consolidation of economic and political power as well as to the exercise of autonomous functions by private actors. Moreover, delegated powers are not the only source of concern. Platforms can exercise private powers over their online spaces through instruments based on contract law and technology. In the fields of content and data, platform governance mirrors the exercise of quasi-public functions by defining the values and the principles on which their communities are based. This discretion in setting the standards of their communities, or the possibility to balance and enforce fundamental rights through automated systems, are examples of a private order mirroring an absolute regime, resulting from a mix of constitutional freedoms and technology.

The fields of content and data have provided a critical angle to examine the exercise of platform powers. This is not a coincidence, but it is the result of the intimate relationship between these two fields in the algorithmic society. Therefore, the next chapter focuses on how the evolution of the digital environment has led to the convergence of content and data, showing how, even if they are based on different constitutional premises, freedom of expression, privacy and data protection share the same constitutional direction.

Footnotes

1 Nicholas Negroponte, Being Digital (Alfred A. Knopf 1995).

2 Bill Gates, The Road Ahead (Viking Press 1995).

3 John P. Barlow, ‘The Economy of Ideas: Selling Wine Without Bottles on the Global Net’ in Peter Ludlow (ed.), High Noon on the Electronic Frontier: Conceptual Issues in Cyberspace (MIT Press 1999).

4 Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom (Yale University Press 2006); Andrew Murray, Information Technology Law: The Law and Society (Oxford University Press 2013).

5 Frank H. Easterbrook, ‘Cyberspace and the Law of the Horse’ (1996) University of Chicago Legal Forum 207.

6 Hartmut Rosa, Social Acceleration: A New Theory of Modernity (Columbia University Press 2013); John G. Palfrey, ‘Four Phases of Internet Regulation’ (2010) 77(3) Social Research 981.

7 K. Sabeel Rahman, ‘The New Utilities: Private Power, Social Infrastructure, and the Revival of the Public Utility Concept’ (2018) 39 Cardozo L. Rev. 1621; Alex Moazed and Nicholas L. Johnson, Modern Monopolies: What It Takes to Dominate the 21st Century Economy (St Martin’s Press 2016); Robin Mansell and Michele Javary, ‘Emerging Internet Oligopolies: A Political Economy Analysis’ in Arthur S. Miller and Warren J. Samuels (eds.), An Institutionalist Approach to Public Utilities Regulation (Michigan State University Press 2002).

8 Orly Lobel, ‘The Law of the Platforms’ (2016) 101 Minnesota Law Review 87.

9 Niva Elkin-Koren and Eldar Haber, ‘Governance by Proxy: Cyber Challenges to Civil Liberties’ (2017) 82(1) Brooklyn Law Review 105.

10 Lawrence Lessig, Code: And Other Laws of Cyberspace. Version 2.0 (Basic Books 2006); Tarleton Gillespie, ‘The Relevance of Algorithms’ in Tarleton Gillespie, Pablo J. Boczkowski and Kirsten A. Foot (eds.), Media Technologies: Essays on Communication, Materiality, and Society (MIT Press 2014); Helen Nissenbaum, ‘From Preemption to Circumvention: If Technology Regulates, Why Do We Need Regulation (and Vice Versa)?’ (2011) 26 Berkley Technology Law Journal 1367.

11 Neil Walker, Intimations of Global Law (Cambridge University Press 2015).

12 Eric C. Ip, ‘Globalization and the Future of the Law of the Sovereign State’ (2010) 8(3) International Journal of Constitutional Law 636.

13 Ran Hirschl and Ayelet Shachar, ‘Spatial Statism’ (2019) 17(2) International Journal of Constitutional Law 387.

14 Saskia Sassen, Losing Control?: Sovereignty in the Age of Globalization (Columbia University Press 1996).

15 Giuliana Ziccardi-Capaldo, The Pillars of Global Law (Ashgate 2008).

16 Mattias Kumm, ‘Constituent Power, Cosmopolitan Constitutionalism, and Post-Positivist Law’ (2016) 14(3) International Journal of Constitutional Law 2016.

17 Gunther Teubner, Law as an Autopoietic System (Blackwell 1993).

18 Gunther Teubner, Constitutional Fragments: Societal Constitutionalism and Globalization (Oxford University Press 2012).

19 David R. Johnson and David Post, ‘Law and Borders: The Rise of Law in Cyberspace’ (1996) 48(5) Stanford Law Review 1367.

20 I. Trotter Hardy, ‘The Proper Legal Regime for “Cyberspace”’ (1994) 55 University of Pittsburgh Law Review 993.

21 Joseph H. Sommer, ‘Against Cyberlaw’ (2000) 15 Berkeley Technology Law Journal 1145; Jack L. Goldsmith, ‘Against Cyberanarchy’ (1999) 40 University of Chicago Law Occasional Paper 1; Andrew Shapiro, ‘The Disappearance of Cyberspace and the Rise of Code’ (1998) 8 Seton Hall Constitutional Law Journal 703; Tim Wu, ‘Cyberspace Sovereignty? The Internet and the International Systems’ (1997) 10(3) Harvard Law Journal 647.

22 Lawrence Lessig and Paul Resnick, ‘Zoning Speech on the Internet: A Legal and Technical Model’ (1998) 98 Michigan Law Review 395; Jack L. Goldsmith, ‘The Internet and the Abiding Significance of Territorial Sovereignty’ (1998) 5 Indiana Journal of Global Legal Studies 474; Joel R. Reidenberg, ‘Governing Networks and Rule-Making Cyberspace’ (1996) 45 Emory Law Journal 911.

23 Joel R. Reidenberg, ‘Lex Informatica: The Formulation of Information Policy Rules through Technology’ (1997–8) 76 Texas Law Review 553.

24 Jonathan Zittrain and Benjamin Edelman, ‘Empirical Analysis of Internet Filtering in China’ (2003) Harvard Law School Public Law Research Paper No. 62.

25 Lessig (Footnote n. 10); Francesca Musiani, ‘Network Architecture as Internet Governance’ (2013) 2(4) Internet Policy Review https://policyreview.info/node/208/pdf accessed 21 November 2021.

26 Jack L. Goldsmith, ‘The Internet, Conflicts of Regulation and International Harmonization’ in Christoph Engel (ed.), Governance of Global Networks in the Light of Differing Local Values 197 (Nomos 2000).

27 Saskia Sassen, ‘On the Internet and Sovereignty’ (1998) 5 Indiana Journal of Global Legal Studies 545, 551.

28 Henry H. Perritt, ‘Cyberspace Self-Government: Town Hall Democracy or Rediscovered Royalism?’ (1997) 12 Berkeley Technology Law Journal 413.

29 Lawrence Lessig, ‘The New Chicago School’ (1998) 27(2) The Journal of Legal Studies 661.

30 Andrew Murray, The Regulation of Cyberspace (Routledge 2007).

31 Cass R. Sunstein, Republic.com 2.0 (Princeton University Press 2007).

32 Graham Greenleaf, ‘An Endnote on Regulating Cyberspace: Architecture vs Law?’ (1998) 2(2) University of New South Wales Law Journal 593.

33 Karine Barzilai-Nahon, ‘Toward a Theory of Network Gatekeeping: A Framework for Exploring Information Control’ (2008) 59(9) Journal of the American Society for Information Science and Technology 1493.

34 Scott Burris, Peter Drahos and Clifford Shearing, ‘Nodal Governance’ (2005) 30 Australian Journal of Legal Philosophy 30.

35 Les Johnston and Clifford Shearing, Governing Security: Explorations in Policing and Justice (Routledge 2003).

36 Andrew Murray, ‘Nodes and Gravity in Virtual Space’ (2011) 5(2) Legisprudence 195.

37 Emily B. Laidlaw, ‘A Framework for Identifying Internet Information Gatekeepers’ (2012) 24(3) International Review of Law, Computers & Technology 263.

38 Julia Black, ‘Constitutionalising Self-Regulation’ (1996) 59(1) The Modern Law Review 24.

39 Gunther Teubner, ‘Societal Constitutionalism: Alternatives to State-Centred Constitutional Theory?’ in Christian Joerges, Inger-Johanne Sand and Gunther Teubner (eds.), Transnational Governance and Constitutionalism 3, 8 (Hart 2004).

40 Nico Krisch, Beyond Constitutionalism: The Pluralist Structure of Postnational Law (Oxford University Press 2010).

42 Nicolas Suzor, Lawless: The Secret Rules That Govern Our Digital Lives (Cambridge University Press 2019).

43 Joshua Barkan, ‘Law and the Geographic Analysis of Economic Globalization’ (2011) 35(5) Progress in Human Geography 589.

44 Bruce D’Arcus, ‘Extraordinary Rendition, Law and the Spatial Architecture of Rights’ (2014) 13 ACME: An International E-Journal for Critical Geographies 79.

45 Teubner (Footnote n. 39), 3.

46 Barney Warf, ‘Geographies of Global Internet Censorship’ (2011) 76 GeoJournal 1.

47 Anupam Chander and Uyen P. Le, ‘Data Nationalism’ (2015) 64(3) Emory Law Journal 677.

48 Giovanni De Gregorio and Nicole Stremlau, ‘Internet Shutdowns and the Limits of Law’ (2020) 14 International Journal of Communication 4224.

49 Johnson and Post (Footnote n. 19).

50 Jack Goldsmith and Tim Wu, Who Controls the Internet? Illusions of a Borderless World (Oxford University Press 2006).

51 Lessig (Footnote n. 10); Reidenberg (Footnote n. 23).

52 Justin Clark and others, ‘The Shifting Landscape of Global Internet Censorship’ (2017) Berkman Klein Center for Internet & Society Research Publication http://nrs.harvard.edu/urn-3:HUL.InstRepos:33084425 accessed 20 November 2021; Ronald Deibert and others, Access Denied: The Practice and Policy of Global Internet Filtering (MIT Press 2008).

53 Tom Ginsburg and Alberto Simpser (eds.), Constitutions in Authoritarian Regimes (Cambridge University Press 2014).

54 Giovanni Sartori, ‘Constitutionalism: A Preliminary Discussion’ (1962) 56(4) The American Political Science Review 853.

55 Ben Wagner, ‘Understanding Internet Shutdowns: A Case Study from Pakistan’ (2018) 12 International Journal of Communication 3917.

56 Susan Marks, The Riddle of All Constitutions: International Law, Democracy, and the Critique of Ideology (Oxford University Press 2004).

57 Yun Wen, The Huawei Model: The Rise of China’s Technology Giant (University of Illinois Press 2020).

58 Giovanni Pitruzzella and Oreste Pollicino, Hate Speech and Disinformation: A European Constitutional Perspective (Bocconi University Press 2020).

59 Charter of Fundamental Rights of the European Union (2012) OJ C 326/391, Art. 16.

60 Consolidated version of the Treaty on the Functioning of the European Union (2012) OJ C 326/47, Arts. 56–62.

61 Charter (Footnote n. 59), Art. 52.

62 Footnote Ibid., Art. 54.

63 Reno v. American Civil Liberties Union, 521 U.S. 844 (1997). Oreste Pollicino and Marco Bassini, ‘Free Speech, Defamation and the Limits to Freedom of Expression in the EU: A Comparative Analysis’ in Andrej Savin and Jan Trzaskowski (eds.), Research Handbook on EU Internet Law 508 (Edward Elgar 2014).

64 Packingham v. North Carolina (2017) 582 US ___.

66 See, for example, Reno (Footnote n. 63); Ashcroft v. Free Speech Coalition (2002) 535 US 234; Aschroft v. American Civil Liberties Union (2002) 535 US 564.

67 Neil W. Netanel, ‘Cyberspace Self-Governance: A Skeptical View from the Liberal Democratic Theory’ (2000) 88 California Law Review 401.

68 Jody Freeman and Martha Minow (eds.), Government by Contract: Outsourcing and American Democracy (Harvard University Press 2009).

69 Albert Sánchez Graells, Public Procurement and the EU Competition Rules (Hart 2015).

70 Sofia Ranchordas and Catalina Goanta, ‘The New City Regulators: Platform and Public Values in Smart and Sharing Cities’ (2020) 36 Computer Law and Security Review 105375.

71 Marta Simoncini, Administrative Regulation Beyond the Non-Delegation Doctrine: A Study on EU Agencies (Hart 2018).

72 James Boyle, ‘A Nondelegation Doctrine for the Digital Age?’ (2000) 50 Duke Law Journal 5.

73 Carl S. Kaplan, ‘A Kind of Constitutional Convention for the Internet’ The New York Times (23 October 1998) www.nytimes.com/library/tech/98/10/cyber/cyberlaw/23law.html accessed 21 November 2021.

74 Boyle (Footnote n. 72).

75 A. Michael Froomkin, ‘Wrong Turn in Cyberspace: Using ICANN to Route Around the APA and the Constitution’ (2000) 50 Duke Law Journal 17.

76 Jonathan Weinberg, ‘ICANN and the Problem of Legitimacy’ (2000) 50 Duke Law Journal 187, 217.

77 Michael D. Birnhack and Niva Elkin-Koren, ‘The Invisible Handshake: The Reemergence of the State in the Digital Environment’ (2003) 8 Virginia Journal of Law & Technology 1.

78 Joel R. Reidenberg, ‘States and Internet Enforcement’ (2004) 1 University of Ottawa Law & Technology Journal 213.

79 Elkin-Koren and Haber (Footnote n. 9).

80 Ian Brown and Christopher Marsden, Regulating Code: Good Governance and Better Regulation in the Information Age (MIT Press 2013).

81 Monroe E. Price and Stefaan G. Verhulst, Self-Regulation and the Internet (Kluwer 2004).

82 Dennis D. Hirsch, ‘The Law and Policy of Online Privacy: Regulation, Self-Regulation, or Co-Regulation?’ (2011) 34 Seattle University Law Review 439.

83 Uta Kohl (ed.), The Net and the Nation State: Multidisciplinary Perspectives on Internet Governance (Cambridge University Press 2017).

84 Robert Brauneis and Ellen P. Goodman, ‘Algorithmic Transparency for the Smart City’ (2018) 20 Yale Journal of Law and Technology 103; Lilian Edwards, ‘Privacy, Security and Data Protection in Smart Cities: A Critical EU Law Perspective’ (2016) 1 European Data Protection Law 26.

85 Aziz Z. Huq, ‘Racial Equity in Algorithmic Criminal Justice’ (2019) 68 Duke Law Journal 1043.

86 Aaron Gregg and Jay Greene, ‘Pentagon Awards Controversial $10 Billion Cloud Computing Deal to Microsoft, Spurning Amazon’ Washington Post (26 October 2019) www.washingtonpost.com/business/2019/10/25/pentagon-awards-controversial-billion-cloud-computing-deal-microsoft-spurning-amazon/ accessed 22 November 2021.

87 Teresa Scassa, ‘Pandemic Innovation: The Private Sector and the Development of Contact-Tracing and Exposure Notification Apps’ (2021) 6(2) Business and Human Rights Journal 352.

88 Tim Wu, The Curse of Bigness: How Corporate Giants Came to Rule the World (Atlantic Books 2020).

89 Louis D. Brandeis, ‘The Curse of Bigness’ in Osmond K. Fraenkel (ed.), The Curse of Bigness: Miscellaneous Papers of Louis D. Brandeis (Viking Press 1934).

90 Field v. Clark 143 US 649 (1892), 692.

91 Robert Schütze, ‘“Delegated” Legislation in the (New) European Union: A Constitutional Analysis’ (2011) 74(5) Modern Law Review 661.

92 Costello-Roberts v. United Kingdom (1993) 19 EHRR 112, 27–8.

93 Wos v. Poland (2007) 45 EHRR 28, 72.

94 Gary Lawson, ‘The Rise and Rise of the Administrative State’ (1994) 107 Harvard Law Review 1231.

95 Boyle (Footnote n. 72), 8.

96 Communications Decency Act (1996), Section 230(c)(1).

97 Digital Millennium Copyright Act (1998), Section 512.

98 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (2000) OJ L 178/1.

99 Mariarosaria Taddeo and Luciano Floridi (eds.), The Responsibilities of Online Service Providers (Springer 2017).

100 Footnote Ibid., Recital 9.

101 Guarantees of some Member States.

102 Jack M. Balkin, ‘Free Speech and Hostile Environments’ (1999) 99 Columbia Law Review 2295.

103 Oreste Pollicino, Giovanni De Gregorio and Laura Somaini, ‘The European Regulatory Conundrum to Face the Rise and Amplification of False Content Online’ (2020) 19(1) Global Yearbook of International Law and Jurisprudence 319.

104 Felix T. Wu, ‘Collateral Censorship and the Limits of Intermediary Immunity’ (2013) 87 Notre Dame Law Review 293; Seth F. Kreimer, ‘Censorship by Proxy: The First Amendment, Internet Intermediaries, and the Problem of the Weakest Link’ (2006) 155 University of Pennsylvania Law Review 11; Jack M. Balkin, ‘Free Speech and Hostile Environments’ (1999) 99 Columbia Law Review 2295.

105 Delfi AS v. Estonia (2015); Magyar Tartalomszolgáltatók Egyesülete and Index.Hu Zrt v. Hungary (2016). According to para 86 of the latter judgment: ‘Such liability may have foreseeable negative consequences on the comment environment of an Internet portal, for example by impelling it to close the commenting space altogether. For the Court, these consequences may have, directly or indirectly, a chilling effect on the freedom of expression on the Internet. This effect could be particularly detrimental to a non-commercial website such as the first applicant’.

106 Orla Lynskey, ‘Regulation by Platforms: The Impact on Fundamental Rights’ in Luca Belli and Nicolo Zingales (eds.), Platform Regulations How Platforms Are Regulated and How They Regulate Us (FGV Direito Rio 2017); James Grimmelmann, ‘Speech Engines’ (2014) 98 Minnesota Law Review 868.

107 e-Commerce Directive (Footnote n. 98), Recital 42.

108 Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825 final.

109 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (1995) OJ L 281/31.

110 Working Party 29, ‘Statement on the Role of a Risk-Based Approach in Data Protection Legal Frameworks’ (2014) https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp218_en.pdf accessed 21 November 2021.

111 OECD, Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data (2013).

112 Christopher Kuner and others, ‘Risk Management in Data Protection’ (2015) 5(2) International Data Privacy Law 95.

113 Data Protection Directive (Footnote n. 109), Art. 17.

114 Footnote Ibid., Art. 7(f).

115 Footnote Ibid., Art. 20.

116 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (2016) OJ L 119/1, Art. 5.

117 Raphaël Gellert, The Risk-Based Approach to Data Protection (Oxford University Press 2020).

118 Milda Macenaite, ‘The “Riskification” of European Data Protection Law through a Two-fold Shift’ (2017) 8(3) European Journal of Risk Regulation 506.

119 GDPR (Footnote n. 116), Art. 35.

120 Footnote Ibid., Art. 37.

121 Raphaël Gellert, ‘Understanding the Notion of Risk in the General Data Protection Regulation’ (2018) 34 Computer Law & Security Review 279; Claudia Quelle, ‘Enhancing Compliance under the General Data Protection Regulation: The Risky Upshot of the Accountability- and Risk-based Approach’ (2018) 9(3) European Journal of Risk Regulation 502.

122 GDPR (Footnote n. 116), Art. 24.

123 Footnote Ibid., Art. 25.

124 Footnote Ibid., Art. 35(3)(a).

125 Case C-131/12 Google Spain SL and Google Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González (2014).

126 Oreste Pollicino and Marco Bassini, ‘Reconciling Right to Be Forgotten and Freedom of Information in the Digital Age. Past and Future of Personal Data Protection in the EU’ (2014) 2 Diritto pubblico comparato ed europeo 641.

127 European Data Protection Board, Guidelines 5/2019 on the criteria of the Right to be Forgotten in the search engines cases under the GDPR, 2 December 2019; Working Party Article 29, ‘Working Party on the Implementation of the Court of Justice of the European Union Judgment on “Google Spain and Inc v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González” C-131/12’, http://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2014/wp225_en.pdf accessed 21 November 2021.

128 GDPR (Footnote n. 116), Art. 17(1).

129 Footnote Ibid., Art. 17(3).

130 Frank Pasquale, ‘From Territorial to Functional Sovereignty: The Case of Amazon’ Law and Political Economy Blog (6 December 2017) https://lpeblog.org/2017/12/06/from-territorial-to-functional-sovereignty-the-case-of-amazon/ accessed 22 November 2021.

131 Jennifer C. Daskal, ‘Borders and Bits’ (2018) 71 Vanderbilt Law Review 179.

132 Alan Z. Rozenshtein, ‘Surveillance Intermediaries’ (2018) 70 Stanford Law Review 99.

133 Kate Klonick, ‘The New Governors: The People, Rules, and Processes Governing Online Speech’ (2018) 131 Harvard Law Review 1598; Jack M. Balkin, ‘Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation’ (2018) 51 U.C. Davis Law Review 1151.

134 Kristen E. Eichensehr, ‘Digital Switzerlands’ (2018) 167 University of Pennsylvania Law Review 665.

135 Luca Belli, Pedro A. Francisco and N. Zingales, ‘Law of the Land or Law of the Platform? Beware of the Privatisation of Regulation and Policy’ in Belli and Zingales (Footnote n. 106), 41.

136 Yochai Benkler, ‘Degrees of Freedom, Dimensions of Power’ (2016) 145 Daedalus 18.

137 Gunther Teubner, Constitutional Fragments: Societal Constitutionalism and Globalization (Oxford University Press 2012).

138 Gunther Teubner, ‘The Anonymous Matrix: Human Rights Violations by “Private” Transnational Actors’ (2006) 69 Modern Law Review 327.

139 Mark Zuckerberg, ‘Bringing the World Closer Together’ Facebook (22 June 2017) https://www.facebook.com/notes/mark-zuckerberg/bringing-the-world-closer-together/10154944663901634/ accessed 22 November 2021.

140 Edoardo Celeste, ‘Terms of Service and Bills of Rights: New Mechanisms of Constitutionalisation in the Social Media Environment?’ (2018) 33(2) International Review of Law, Computers & Technology 122.

141 Margaret J. Radin, Boilerplate: The Fine Print, Vanishing Rights, and the Rule of Law (Princeton University Press 2013).

142 Friedrich Kessler, ‘Contracts of Adhesion – Some Thoughts about Freedom of Contract’ (1943) 43 Columbia Law Review 629, 640.

143 David Slawson, ‘Standard Forms of Contract and Democratic Control of Lawmaking Power’ (1967) 84 Harvard Law Review 529, 530.

144 Omri Ben-Shahar and Carl E. Schneider, More Than You Wanted to Know: The Failure of Mandated Disclosure 27 (Princeton University Press 2016).

145 Shoshana Zuboff, ‘Big Other: Surveillance Capitalism and the Prospects of an Information Civilization’ (2015) 30(1) Journal of Information Technology 75.

146 Rebecca MacKinnon, Consent of the Networked: The Worldwide Struggle for Internet Freedom (Basic Books 2013).

147 Tal Z. Zarsky, ‘Social Justice, Social Norms and the Governance of Social Media’ (2014) 35 Pace Law Review 154.

148 Martin Moore and Damian Tambini (eds.), Digital Dominance: The Power of Google, Amazon, Facebook, and Apple (Oxford University Press 2018).

149 Anupam Chander, ‘Facebookistan’ (2012) 90 North Carolina Law Review 1807.

150 Zarsky (Footnote n. 147).

151 Frank A. Pasquale, ‘Privacy, Autonomy, and Internet Platforms’ in Marc Rotenberg, Julia Horwitz and Jeramie Scott (eds.), Privacy in the Modern Age: The Search for Solutions (The New Press 2015).

152 Nancy S. Kim and D. A. Telman, ‘Internet Giants as Quasi-Governmental Actors and the Limits of Contractual Consent’ (2015) 80 Missouri Law Review 723, 730.

154 Orla Lynskey, ‘Regulating Platform Power’ (2017) LSE Legal Studies Working Paper 1 http://eprints.lse.ac.uk/73404/1/WPS2017-01_Lynskey.pdf accessed 22 November 2021.

155 David S. Evans, ‘Governing Bad Behavior by Users of Multi-Sided Platforms’ (2012) 27 Berkeley Technology Law Journal 1201.

156 Tomer Shadmy, ‘The New Social Contract: Facebook’s Community and Our Rights’ (2019) 37 Boston University International Law Journal 307, 329.

157 Laura Stein, ‘Policy and Participation on Social Media: The Cases of YouTube, Facebook, and Wikipedia’ (2013) 6(3) Communication, Culture & Critique 353.

158 Henry Jenkins, Convergence Culture: Where Old and New Media Collide (New York University Press 2006).

159 Radin (Footnote n. 141), 16.

160 Peter Zumbansen, ‘The Law of Society: Governance Through Contract’ (2007) 14(1) Indiana Journal of Global Legal Studies 19.

161 Woodrow Hartzog, ‘Website Design as Contract’ (2011) 60(6) American University Law Review 1635.

162 Louis Jaffe, ‘Law Making by Private Groups’ (1937) 51 Harvard Law Review 201.

163 Lee A. Bygrave, Internet Governance by Contract (Oxford University Press 2015).

164 Luca Belli and Jamila Venturini, ‘Private Ordering and the Rise of Terms of Service as Cyber-Regulation’ (2016) 5(4) Internet Policy Review https://policyreview.info/node/441/pdf accessed 22 November 2021.

165 Paul S. Berman, ‘Cyberspace and the State Action Debate: The Cultural Value of Applying Constitutional Norms to “Private” Regulation’ (2000) 71 University of Colorado Law Review 1263.

166 Ellen Wauters, Eva Lievens and Peggy Valcke, ‘Towards a Better Protection of Social Media Users: A Legal Perspective on the Terms of Use of Social Networking Sites’ (2014) 22 International Journal of Law & Information Technology 254.

167 Lessig (Footnote n. 10).

168 Reidenberg (Footnote n. 23).

169 Black (Footnote n. 38).

170 Ira Rubinstein and Nathan Good, ‘Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents’ (2013) 28 Berkeley Technology Law Journal 1333; Robert Brendan Taylor, ‘Consumer-Driven Changes to Online Form Contracts’ (2011–12) 67 NYU Annual Survey of American Law 371.

171 Frank Pasquale, The Black Box Society: The Secret Algorithms that Control Money and Information (Harvard University Press 2015); Matteo Turilli and Luciano Floridi, ‘The Ethics of Information Transparency’ (2009) 11(2) Ethics and Information Technology 105; Tal Zarsky, ‘Transparent Predictions’ (2013) 4 University of Illinois Law Review 1507.

172 Woodrow Hartzog, Ari Melber and Evan Selinger, ‘Fighting Facebook: A Campaign for a People’s Terms of Service’ Center for Internet and Society (22 May 2013) http://cyberlaw.stanford.edu/blog/2013/05/fighting-facebook-campaign-people%E2%80%99s-terms-service accessed 22 November 2021.

173 Charles de Secondat, Baron de Montesquieu, De l’esprit des lois (1748).

174 Charles Howard McIlwain, Constitutionalism: Ancient and Modern (Amagi 2007).

175 The French Declaration of the Rights of Man and of the Citizen, Art. 16, states: ‘Any society in which the guarantee of rights is not assured, nor the separation of powers determined, has no Constitution’. Declaration of the Rights of Man and of the Citizen, 26 August 1789.

176 James Grimmelmann, ‘Virtual World Feudalism’ (2009) 118 Yale Law Journal Pocket Part 126.

177 Belli and Venturini (Footnote n. 164).
