
6 - Artificial Intelligence and the Past, Present, and Future of Democracy

from Part II - Current and Future Approaches to AI Governance

Published online by Cambridge University Press:  28 October 2022

Edited by Silja Voeneky (Albert-Ludwigs-Universität Freiburg, Germany), Philipp Kellmeyer (Medical Center, Albert-Ludwigs-Universität Freiburg, Germany), Oliver Mueller (Albert-Ludwigs-Universität Freiburg, Germany), and Wolfram Burgard (Technische Universität Nürnberg)

Summary

In this chapter, the philosopher Mathias Risse reflects on the medium- and long-term prospects and challenges democracy faces from AI. Comparing the political nature of AI systems with traffic infrastructure, the author points out AI’s potential to greatly strengthen democracy, but only with the right efforts. The chapter starts with a critical examination of the relation between democracy and technology from a historical perspective before outlining the techno-skepticism prevalent in several grand narratives of twentieth-century philosophy of technology. Finally, the author explores the possibilities and challenges that AI creates for democracy in the present digital age. He argues that technology critically bears on what forms of human life get realized or imagined, as it changes the materiality of democracy (by altering how collective decision making unfolds) and what its human participants are like. In conclusion, Mathias Risse argues that both technologists and citizens need to engage with ethics and political thought generally to have the spirit and dedication to build and maintain a democracy-enhancing AI infrastructure.

The Cambridge Handbook of Responsible Artificial Intelligence: Interdisciplinary Perspectives, pp. 85–103. Publisher: Cambridge University Press. Print publication year: 2022.
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC-ND 4.0 https://creativecommons.org/cclicenses/

I. Introduction: How AI Is Political

Langdon Winner’s classic essay ‘Do Artifacts Have Politics?’ resists a widespread but naïve view of the role of technology in human life: that technology is neutral, and all depends on use.Footnote 1 Winner does so without enlisting an overbearing determinism that makes technology the sole engine of change. Instead, he distinguishes two ways for artefacts to have ‘political qualities’. First, devices or systems might be means for establishing patterns of power or authority, but the design is flexible: such patterns can turn out one way or another. An example is traffic infrastructure, which can assist many people but also keep parts of the population in subordination, say, if they cannot reach suitable workplaces. Secondly, devices or systems are strongly, perhaps unavoidably, tied to certain patterns of power. Winner’s example is atomic energy, which requires industrial, scientific, and military elites to provide and protect energy sources. Artificial Intelligence (AI), I argue, is political the way traffic infrastructure is: it can greatly strengthen democracy, but only with the right efforts. Understanding ‘the politics of AI’ is crucial since Xi Jinping’s China loudly champions one-party rule as a better fit for our digital century. AI is a key component in the contest between authoritarian and democratic rule.

Unlike conventional programs, AI algorithms learn by themselves. Programmers provide data, which a set of methods known as machine learning analyzes for trends and inferences. Owing to their sophistication and sweeping applications, these technologies are poised to dramatically alter our world. Specialized AI is already broadly deployed. At the high end, one may think of AI mastering chess or Go. More commonly we encounter it in smartphones (Siri, Google Translate, curated newsfeeds), home devices (Alexa, Google Home, Nest), personalized customer services, or GPS systems. Specialized AI is used by law enforcement and the military, in browser searches, in advertising and entertainment (e.g., recommender systems), in medical diagnostics, logistics, and finance (from assessing credit to flagging transactions), in speech recognition producing transcripts, and in trade bots using market data for predictions, but also in music creation and article drafting (e.g., GPT-3’s text generator writing posts or code). Governments track people using AI in facial, voice, or gait recognition. Smart cities analyze traffic data in real time or use it to design services. COVID-19 accelerated the use of AI in drug discovery. Natural language processing – normally used for texts – interprets genetic changes in viruses. Amazon Web Services, Azure, or Google Cloud’s low- and no-code offerings could soon let people create AI applications as easily as websites.Footnote 2
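To make the contrast with conventional programming concrete, here is a minimal illustrative sketch in Python using the scikit-learn library; the umbrella example and all numbers are invented. Instead of hand-coding a decision rule, the programmer supplies labelled examples and the learning method infers the rule from them.

```python
# A minimal sketch of "programmers provide data, methods find the rule":
# a supervised classifier learns a decision rule from labelled examples
# rather than from hand-written if/else logic. Data are made up.
from sklearn.linear_model import LogisticRegression

# Toy training data: [hours_of_daylight, temperature_celsius] -> carried_umbrella (1/0)
X_train = [[8, 5], [9, 7], [14, 22], [15, 25], [10, 12], [16, 28]]
y_train = [1, 1, 0, 0, 1, 0]

model = LogisticRegression()
model.fit(X_train, y_train)      # the "learning" step: parameters are inferred from data

print(model.predict([[9, 6]]))   # the model generalizes to an input it has not seen
```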

General AI approximates human performance across many domains. Once there is general AI smarter than we are, it could produce something smarter than itself, and so on, perhaps very fast. That moment is the singularity, an intelligence explosion with possibly grave consequences. We are nowhere near anything like that. Imitating how mundane human tasks combine agility, reflection, and interaction has proven challenging. However, ‘nowhere near’ means ‘in terms of engineering capacities’. A few breakthroughs might accelerate things enormously. Inspired by how millions of years of evolution have created the brain, neural nets have been deployed in astounding ways in machine learning. Such research indicates to many observers that general AI will emerge eventually.Footnote 3

This essay is located at the intersection of political philosophy, philosophy of technology, and political history. My purpose is to reflect on medium- and long-term prospects and challenges for democracy from AI, emphasizing how critical a stage this is. Social theorist Bruno Latour, a key figure in Science, Technology and Society Studies, has long insisted that no entity matters in isolation but attains meaning through numerous, changeable relations. Human activities tend to depend not only on more people than the protagonists who stand out, but also on non-human entities. Latour calls such multitudes of relations actor-networks.Footnote 4 This perspective takes the materiality of human affairs – the ways they critically involve artefacts, devices, or systems – more seriously than is customary. This standpoint helps gauge AI’s impact on democracy.

Political theorists typically treat democracy as an ideal or an institutional framework rather than considering its materiality. Modern democracies involve structures for collective choice that periodically empower relatively few people to steer the social direction for everybody. As in all forms of governance, technology shapes how this unfolds. Technology determines how citizens obtain information, which delineates their participation (often limited to voting), and it frees up people’s time to engage in collective affairs to begin with. Devices and mechanisms permeate campaigning and voting. Technology shapes how politicians communicate and how bureaucrats administer decisions. Specialized AI changes the materiality of democracy, not just in the sense that independently given actors deploy new tools. AI changes how collective decision making unfolds and what its human participants are like: how they see themselves in relation to their environment, what relationships they have and how those are designed, and generally what forms of human life can come to exist.Footnote 5

Section II explores what democracy is, emphasizes the materiality of ‘early’ and ‘modern’ democracy and rearticulates the perspective we take from Winner. Section III recounts some of the grand techno-skeptical narratives of twentieth-century philosophy of technology, distilling the warnings they convey for the impact of AI on democracy. Section IV introduces another grand narrative, a Grand Democratic AI Utopia, a way of imagining the future we should be wary of. Section V discusses challenges and promises of AI for democracy in this digital century without grand narratives. Instead, we ask how to design AI to harness the public sphere, political power, and economic power for democratic purposes, to make them akin to Winner’s inclusive traffic infrastructure. Section VI concludes.

II. Democracy and Technology

A distinctive feature – and an intrinsic rather than comparative advantage – of recognizably democratic structures is that they give each participant at least minimal ownership of social endeavors and inspire many of them to recognize each other as responsible agents across domains of life. There is disagreement about that ideal, with Schumpeterian democracy stressing peaceful removal of rulers and more participatory or deliberative approaches capturing thicker notions of empowerment.Footnote 6 Arguments for democracy highlight democracy’s possibilities for emancipation, its indispensability for human rights protection, and its promise of unleashing human potentials. Concerns to be overcome include shortsightedness vis-a-vis long-term crises, the twin dangers of manipulability by elites and susceptibility to populists, the potential of competition to generate polarization, and a focus on process rather than results. However, a social-scientific perspective on democracy by David Stasavage makes it easier to focus on its materiality and thus, later on, the impact of AI.Footnote 7 Stasavage distinguishes early from modern democracy, and both of those from autocracy. Autocracy is governance without consent of those people who are not directly controlled by the ruling circles anyway. The more viable and thus enduring autocracies have tended to make up for that lack of consent by developing a strong bureaucracy that would at least guarantee robust and consistent governance patterns.

1. Early Democracy and the Materiality of Small-Scale Collective Choice

Early democracy was a system in which rulers governed jointly with councils or assemblies consisting of members who were independent from rulers and not subject to their whims. Sometimes such councils and assemblies would provide information, sometimes they would assist with governance directly. Sometimes councils and assemblies involved participation from large parts of the population (either directly or through delegation), sometimes councils were elite gatherings. Rulership might be elective or inherited. Its material conditions were such that early democracy would arise in smaller rather than larger polities, in polities where rulers depended on subjects for information about what they owned or produced and so could not tax without compliance, and where people had exit options. Under such conditions, rulers needed consent from parts of the population. Early democracy thus understood was common around the globe and not restricted to Greece, as the standard narrative has it.Footnote 8

However, what is special about Athens and other Greek democracies is that they were most extensively participatory. The reforms of Cleisthenes, in the sixth century BC, divided Athens into 139 demes (150 to 250 men each, women playing no political role) that formed ten artificial ‘tribes’. Demes in the same tribe did not inhabit the same region of Attica. Each tribe sent 50 men, randomly selected, for a year, to a Council of 500 to administer day-to-day affairs and prepare sessions of the Assembly of all citizens. This system fed knowledge and insights from all eligible males into collective decision making without positioning anyone for takeover.Footnote 9 It depended on patterns of production and defense that relied on enslaved people to free parts of the population to attend to collective affairs. Transport and communication had to function to let citizens do their parts. This system also depended on a steady, high-volume circulation of people in and out of office to make governance impersonal, representative, and transparent at the same time. That flow required close bookkeeping to guarantee that people were in the right place – which involved technical devices, the material ingredients of democratic governance.

Let me mention some of those. The kleroterion (allotment machine) was a two-by-three-foot slab of rock with a grid of deep, thin slots gouged into it. Integrating some additional pieces, this sophisticated device helped select the required number of men from each tribe for the Council, or for juries and committees where representation mattered. Officers carried allotment tokens – pieces of ceramic inscribed with pertinent information, each fitting a counterpart kept at a secure location that could be produced if credentials were questioned. (Athens was too large for everyone to be acquainted.) With speaking times limited, a water clock (klepsydra) kept time. Announcement boards recorded decisions or messages. For voting, juries used ballots, flat bronze disks. Occasionally, the Assembly considered expelling citizens whose prominence threatened the impersonal character of governance, ostracisms for which citizens carved names into potsherds. Aristotle argued that citizens assembled for deliberation could display virtue and wisdom no individual could muster, an argument for democracy resonant through the ages.Footnote 10 It took certain material objects to make it work. These objects were at the heart of Athenian democracy, devices in actor-networks to operationalize consent of the governed.Footnote 11
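Purely to make the allotment mechanism vivid, here is a toy sketch in Python of the kind of random selection the kleroterion performed physically; tribe sizes and names are invented, and the historical device of course worked with tickets and slots, not software.

```python
# Drawing 50 council members at random from each of the ten tribes, as the
# kleroterion did by mechanical lottery. Repeated yearly, such selection kept
# office-holding impersonal and hard for any faction to capture.
import random

tribes = {f"tribe_{i}": [f"citizen_{i}_{j}" for j in range(600)] for i in range(1, 11)}

council = []
for members in tribes.values():
    council.extend(random.sample(members, 50))  # 50 per tribe, no one chosen twice

assert len(council) == 500
print(council[:5])
```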

2. Modern Democracy and the Materiality of Large-Scale Collective Choice

As a European invention, modern democracy is representative, with mandates that do not bind representatives to an electorate’s will. Representatives emerge from competitive elections under increasingly universal suffrage. Participation is broad but typically episodic. The material conditions for its existence resemble those of early democracy: it emerges where rulers need subjects to volunteer information and people have exit options. But modern democracies arise in large territories, as exemplified by the United States.Footnote 12 Their territorial dimensions (and large populations) generate two legitimacy problems. First, modern democracy generates distrust because ‘state’ and ‘society’ easily remain abstract and distant. Secondly, there is the problem of overbearing executive power. Modern democracies require bureaucracies to manage day-to-day affairs. Bureaucracies might develop their own dynamics, and eventually citizens no longer see themselves as governing. If the head of the executive is elected directly, excessive executive power becomes personal power.Footnote 13

Modern democracy too depends on material features to function. Consider the United States. In 1787 and 1788, Alexander Hamilton, James Madison, and John Jay, under the collective pseudonym ‘Publius’, published 85 articles and essays (‘Federalist Papers’) to promote the constitution. Hamilton calls the government the country’s ‘center of information’.Footnote 14 ‘Information’ and ‘communication’ matter greatly to Publius: the former term appears in nineteen essays, the latter in a dozen. For these advocates of this trailblazing system, the challenge is to find structures for disclosure and processing of pertinent information about the country. Publius thought members of Congress would bring information to the capital, after aggregating it in the states. But at the dawn of the Republic, the vastness of the territory made these challenges formidable. One historian described the communication situation as a ‘quarantine’ of government from society.Footnote 15 Improvements in postal services and changes in the newspaper business in the nineteenth century brought relief, facilitating the central role of media in modern democracies. Only such developments could turn modern democracies into actor-networks where representatives do not labor in de-facto isolation.Footnote 16

‘The aim of every political constitution is, or ought to be, first to obtain for rulers men who possess most wisdom to discern, and most virtue to pursue, the common good of the society’, we read in Federalist No. 57.Footnote 17 To make this happen, in addition to a political culture where the right people seek office, voting systems are required, the design of which was left to the states. Typically, what they devised barely resembled the orderliness of assigning people by means of the kleroterion. ‘Ballot’ comes from the Italian ballotta (little ball), and ballots often were something small and round, like pebbles, peas, beans, or bullets.Footnote 18 Paper ballots gradually spread, partly because they were easier to count than beans. Initially, voters had to bring their own paper and write down properly spelled names and offices. The spread of paper ballots facilitated the rise of parties. Party leaders printed ballots, often in newspapers – long strips listing entire slates, or pages to be cut into pieces, one per candidate. Party symbols on ballots meant voters did not need to know how to read or write, an issue unknown when people voted by surrendering beans or by voice.

In 1856, the Australian colony of Victoria passed its Electoral Act, detailing the conduct of elections. Officials had to print ballots and erect booths or hire rooms. Voters marked ballots secretly, and nobody else was allowed in polling places. The ‘Australian ballot’ gradually spread, against much resistance. Officially, such resistance arose because the secret ballot eliminated the public character of the vote that many considered essential to honorable conduct. In practice, resistance often arose because the Australian ballot made it hard for politicians to get people to vote for them in exchange for money (as such voting behavior then became hard to verify). In 1888, Massachusetts passed the first statewide Australian-ballot law in the United States. By 1896, most Americans cast secret, government-printed ballots. Such ballots also meant voters had to read, making voting harder for immigrants, formerly enslaved people, and the uneducated poor. Machines for casting and counting votes date to the 1880s. Machines could fail or be manipulated, and the mechanics of American elections have remained contested ever since.

3. Democracy and Technology: Natural Allies?

The distant-state and overbearing-executive problems are so substantial that, for Stasavage, ‘modern democracy is an ongoing experiment, and in many ways, we should be surprised that it has worked at all.’Footnote 19 The alternative to democracy is autocracy, which is viable only if backed by competent bureaucracies. Stasavage argues that advances in production and communication often undermined early democracy. New or improved technologies could reduce the information advantages of subjects over rulers, e.g., regarding the fertility of land – if governments have ways of assessing the value of land, they can tax it; if they do not, they have no good way of taxing it without informational input from the owners. Agricultural improvements led to people living closer together, so bureaucrats could monitor them more easily. Conversely, slow progress in science and development favored the survival of early democracy.

Innovations in writing, mapping, measurement, or agriculture made bureaucracies more effective, and thus made autocracies with functioning bureaucracies the more viable option. Much depends on sequencing. Entrenched democracies are less likely to be undermined by technological advances than polities where autocracy is a live option. And so, in principle, entrenched democracies these days could make good use of AI to enhance their functionality (and thus make AI a key part of the materiality of contemporary democracies). In China, the democratic alternative never gained much traction. In recent decades, the country has made enormous strides under an autocratic system with a competent bureaucracy. Under Xi Jinping, China aggressively advertises its system, and AI has started to play a major role in it, especially in the surveillance of its citizens.Footnote 20

Yuval Noah Harari recently offered a somewhat different view of the relationship between democracy and technology.Footnote 21 Historically, he argues, autocracies have faced handicaps around innovation and growth. In the late twentieth century especially, democracies outperformed dictatorships because they were better at processing information. Echoing Hayek’s Road to Serfdom, Harari thinks twentieth-century technology made it inefficient to concentrate information and power.Footnote 22 But Harari also insists that, at this stage, AI might altogether alter the relative efficiency of democracy vs. authoritarianism.

Stasavage and Harari agree that AI undermines conditions that make democracy the more viable system. This does not mean existing democracies are in imminent danger. In fact, it can only be via technology that individuals matter to politics in modern democracies in ways that solve the distant-state and overbearing-executive problems. Only through the right kind of deployment of modern democracy’s materiality could consent to governance be meaningful and ensure that governance in democracies does not mean quarantining leadership from population, as it did in the early days of the American Republic. As the twenty-first century progresses, AI could play a role in this process. Because history has repeatedly shown how technology strengthens autocracy, democrats must be vigilant vis-à-vis autocratic tendencies from within. Technology is indispensable to make modern democracy work, but it is not its natural ally. Much as in Winner’s infrastructure design, careful attention must be paid to ensure technology advances democratic purposes.Footnote 23

III. Democracy, AI, and the Grand Narratives of Techno-Skepticism

Several grand techno-skeptical narratives have played a significant role in twentieth-century philosophy of technology. To be sure, that field now focuses on a smaller scale, partly because grand narratives are difficult to establish.Footnote 24 However, these narratives issue warnings about how difficult it might be to integrate AI specifically into flourishing democracies, warnings we are well advised to heed, as much is at stake.

1. Lewis Mumford and the Megamachine

Mumford was a leading critic of the machine age.Footnote 25 His 1934 Technics and Civilization traces a veritable cult of the machine through Western history that often devastated creativity and independence of mind.Footnote 26 He argues that ‘men had become mechanical before they perfected complicated machines to express their new bent and interest’.Footnote 27 People had lived in coordinated ways (forming societal machines) and endorsed ideals of specialization, automation, and rationality before physical machines emerged. That they lived that way made people ready for physical machines. In the Middle Ages, mechanical clocks (whose relevance for changing life patterns Mumford tirelessly emphasizes) literally synchronized behavior.Footnote 28

Decades later Mumford revisited these themes in his two-volume ‘Myth of the Machine’.Footnote 29 These works offer an even more sweeping narrative, characterizing modern doctrines of progress as scientifically upgraded justifications for practices the powerful had deployed since pharaonic times to maintain power. Ancient Egypt did machine work without actual machines.Footnote 30 Redeploying his organizational understanding of machines, Mumford argues that the pyramids were built by machines – societal machines, centralized and subtly coordinated labor systems in which ideas like the interchangeability of parts, the centralization of knowledge, and the regimentation of work are vital. The deified king, the pharaoh, is the chief engineer of this original megamachine. Today, Mumford argues, the essence of industrialization is not even the large-scale use of machinery; it is the domination of technical knowledge by expert elites, and our structured way of organizing life. By the early twentieth century, the components of the contemporary megamachine were assembled, controlled by new classes of decision makers governing the ‘megatechnical wasteland’ – a dearth of creative thinking and of possibilities for most people to design their own lives.Footnote 31 The ‘myth’ to be overcome is that this machine is irresistible but also beneficial to whoever complies.

Mumford stubbornly maintained faith in life’s rejuvenating capacities, even under the shadow of the megamachine. But clearly any kind of AI, and social organization in anticipation of general AI, harbors the danger of streamlining the capacities of most people in society – the danger Mumford saw at work since the dawn of civilization. This cannot bode well for governance based on meaningful consent.

2. Martin Heidegger and the World As Gestell

Heidegger’s most influential publication on technology is his 1953 ‘The Question Concerning Technology’.Footnote 32 Modern technology is the contemporary mode of understanding things. Technology makes things show up as mattering, one way or another. The mode of revealing (as Heidegger says) characteristic of modern technology sees everything around us as merely a standing-reserve (Bestand), resources to be exploited as means.Footnote 33 This includes the whole natural world, even humans. In 1966, Heidegger even predicted that ‘someday factories will be built for the artificial breeding of human material’.Footnote 34

Heidegger offers the example of a hydroelectric plant converting the Rhine into a mere supplier of waterpower.Footnote 35 In contrast, a wooden bridge that has spanned the river for centuries reveals it as a natural environment and permits natural phenomena to appear as objects of wonder. Heidegger uses the term Gestell (enframing) to capture the relevance of technology in our lives.Footnote 36 The prefix ‘Ge-’ signals a linking together of elements, as in Gebirge, a mountain range. Gestell is a linking together of things that are posited. The Gestell is a horizon of disclosure according to which everything registers only as a resource. Gestell deprives us of any ability to stand in caring relations to things. Strikingly, Heidegger points out that ‘the earth now reveals itself as a coal mining district, the soil as a material deposit’.Footnote 37 Elsewhere he says the modern world reveals itself as a ‘gigantic petrol station’.Footnote 38 Technology lets us relate to the world only in impoverished ways. Everything is interconnected and exchangeable; efficiency and optimization set the stage. Efficiency demands standardization and repetition. Technology saves us from having to develop skills while also turning us into people who are satisfied with lives that do not involve many skills.

For Heidegger, modern democracy with its materiality could only serve to administer the Gestell, and thus is part of the inauthentic life it imposes. His interpreter Hubert Dreyfus has shown how the Internet specifically exemplifies Heidegger’s concerns about technology as Gestell.Footnote 39 As AI progresses, it will increasingly embody what Heidegger worried about: human possibilities vanishing through the ways technology reveals things. Democracies that manage to integrate AI should be wary of such loss.

3. Herbert Marcuse and the Power of Entertainment Technology

Twentieth-century left-wing social thought needed to address why the revolution Marx predicted never occurred. A typical answer was that capitalism persevered not merely by dominating culture, but by deploying technology to develop a pervasive entertainment sector. The working class got mired in consumption habits that annihilated political instincts. But Marxist thought sustains the prospect that, if the right path were found, a revolution would occur. In the 1930s, Walter Benjamin thought the emerging movie industry could help unite the masses in struggle, capitalism’s efforts at cultural domination notwithstanding. Shared movie experiences could allow people to engage the vast capitalist apparatus that intrudes upon their daily lives. Deployed the right way, this new type of art could help finish off capitalism after all.Footnote 40

When Marcuse published his ‘One-Dimensional Man’ in 1964, such optimism about the entertainment sector had vanished. While he had not abandoned the Marxist commitment to the possibility of a revolution, Marcuse saw culture as authoritarian. Together, capitalism, technology, and entertainment culture created new forms of social control, false needs and a false consciousness around consumption. Their combined force locks one-dimensional man into one-dimensional society, which produces the need for people to recognize themselves in commodities. Powers of critical reflection decline. The working class can no longer operate as a subversive force capable of revolutionary change.

‘A comfortable, smooth, reasonable, democratic unfreedom prevails in advanced industrial civilization, a token of technical progress’, Marcuse starts off.Footnote 41 Technology – especially as used in entertainment, which Benjamin had still viewed differently – immediately enters Marcuse’s reckoning with capitalism. It is ‘by virtue of the way it has organized its technological base, [that] contemporary industrial society tends to be totalitarian’.Footnote 42 He elaborates: ‘The people recognize themselves in their commodities; they find their soul in their automobile, hi-fi set, split-level home, kitchen equipment.’Footnote 43 Today, Marcuse would bemoan that people see themselves in the possibilities offered by AI.

4. Jacques Ellul and Technological Determinism

Ellul diagnoses a systemic technological tyranny over humanity. His most celebrated work on philosophy of technology is ‘The Technological Society’.Footnote 44 In the world Ellul describes, individuals factor into overall tendencies he calls ‘massification’. We might govern particular technologies and exercise agency by operating machines, building roads, or printing magazines. Nonetheless, technology overall – as a Durkheimian social fact that goes beyond any number of specific happenings – outgrows human control. Even as we govern techniques (a term Ellul uses broadly, almost synonymously with a rational, systematic approach, with physical machines being the paradigmatic products), they increasingly shape our activities. We adapt to their demands and structures. Ellul is famous for his thesis of the autonomy of technique, its being a closed system, ‘a reality in itself […] with its special laws and its own determinations’. Technique, he writes, ‘elicits and conditions social, political, and economic change. It is the prime mover of all the rest, in spite of any appearances to the contrary and in spite of human pride, which pretends that man’s philosophical theories are still determining influences and man’s political regimes decisive factors in technical evolution.’Footnote 45

For example, industry and the military began to adopt automated technology. One might think this process resulted from economic or political decisions. But for Ellul the sheer technical possibility provided all the impetus required for going this way. Ellul is a technological determinist, but only for the modern age: technology, one way or another, determines all other aspects of society and culture. It does not have to be this way, and in the past it was not. But now, that is how it is.

Eventually, the state is inextricably intertwined with advancements of technique, as well as with corporations that produce machinery. The state no longer represents citizens if their interests contradict those advancements. Democracy fails, Ellul insists: we face a division between technicians, experts, and bureaucrats, standard bearers of techniques, on the one hand, and politicians who are supposed to represent the people and be accountable on the other. ‘When the technician has completed his task,’ Ellul says, ‘he indicates to the politicians the possible solutions and the probable consequences – and retires.’Footnote 46 The technical class understands technique but is unaccountable. In his most chilling metaphor, Ellul concludes the world technique creates is ‘the universal concentration camp’.Footnote 47 AI would perfect this trend.

IV. The Grand Democratic AI Utopia

Let us stay with grand narratives a bit longer and consider what we might call the Grand Democratic AI Utopia. We are nowhere near deploying anything like what I am about to describe. But once general AI is on our radar, AI-enriched developments of Aristotle’s argument from the wisdom of the multitude should also be. Futurists Yuval Noah Harari and Jamie Susskind touch on something like this;Footnote 48 and with technological innovation, our willingness to integrate technology into our images of the future will only increase. Environmentalist James Lovelock thinks cyborgs could guide efforts to deal with climate change.Footnote 49 And in his discussion of future risks, philosopher Toby Ord considers AI assisting with our existential problems.Footnote 50 Such thinking is appealing because our brains evolved for the limited circumstances of small bands in earlier stages of Homo sapiens rather than the twenty-first century’s complex and globally interconnected world. Our brains could create that world but might not be able to manage its existential threats, including those we created.

But one might envisage something like this. AI knows everyone’s preferences and views and provides people with pertinent information to make them competent participants in governance. AI connects citizens to debate views; it connects like-minded people but also those who dissent from each other. In the latter case, people are made to hear each other. AI gathers the votes, which eliminates challenges of people reaching polling stations, vote counting, etc. Monitoring everything, AI instantly identifies fraud or corruption, and flags or removes biased reporting or misleading arguments. AI improves procedural legitimacy through greater participation while the caliber of decision making increases because voters are well-informed. Voters no longer merely choose one candidate from a list. They are consulted on multifarious issues, in ways that keep them abreast of relevant complexities, ensure their views remain consistent, etc. More sophisticated aggregation methods are used than simple majoritarian voting.Footnote 51
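The chapter leaves open which aggregation methods such a system would use. Purely as an illustration of what ‘more sophisticated than simple majoritarian voting’ can mean, here is a minimal Python sketch of a Borda count, one long-studied rank-aggregation rule; the options and ballots are invented.

```python
# Borda count: each voter ranks all options; an option earns more points the
# higher it is ranked, so full preference orderings, not just first choices,
# determine the outcome. Ballots below are purely illustrative.
from collections import defaultdict

ballots = [
    ["park", "library", "transit"],   # each ballot lists options best-to-worst
    ["transit", "park", "library"],
    ["library", "park", "transit"],
    ["park", "transit", "library"],
]

scores = defaultdict(int)
for ranking in ballots:
    for position, option in enumerate(ranking):
        scores[option] += len(ranking) - 1 - position   # top rank: 2 points here, last: 0

winner = max(scores, key=scores.get)
print(dict(scores), "->", winner)
```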

Perhaps elected politicians are still needed for some purposes. But by and large AI catapults early democracy into the twenty-first century while solving the problems of the distant state and of overbearing executive power. AI resolves relatively unimportant matters itself, consulting representative groups for others to ensure everything gets attention without swallowing too much time. In some countries citizens can opt out. Others require participation, with penalties for those with privacy settings that prohibit integration into the system. Nudging techniques – to get people to do what is supposed to be in their own best interest – are perfected for smooth operations.Footnote 52 AI avoids previously prevalent issues around lack of inclusiveness. Privacy settings protect all data. AI calls for elections if confidence in the government falls below a threshold. Bureaucracies are much smaller because AI delivers public services, evaluating experiences from smart cities to create smart countries. Judges are replaced by sophisticated algorithms delivering even-handed rulings. These systems can be arranged such that many concerns about functionality and place of AI in human affairs are resolved internally. In such ways enormous amounts of time are freed up for people to design their lives meaningfully.

As a desirable possibility, something like this might become more prominent in debates about AI and democracy. But we should be wary of letting our thinking be guided by such scenarios. To begin with, imagining a future like this presupposes that for a whole range of issues there is a ‘most intelligent’ solution that for various reasons we have just been unable to put into practice. But intelligence research does not even accept that intelligence is conceptually unique, that is, that there is only one kind of intelligence.Footnote 53 Appeals to pure intelligence are illusory, and allowing algorithms to take over judgments and decisions in this way harbors dangers. It might amount to brainwashing people, with intelligent beings downgraded to responders to stimuli.Footnote 54 Moreover, designing such a system inevitably involves unprecedented efforts at building state capacities, which are subject to hijacking and other abuse. We should not forget that at the dawn of the digital era we also find George Orwell’s 1984.

This Grand Democratic AI Utopia, a grand narrative itself, also triggers the warnings from our four techno-skeptical narratives: Mumford would readily see in such designs the next version of the megamachine, Heidegger detect yet more inauthenticity, Marcuse pillory the potential for yet more social control, and Ellul recognize how in this way the state is ever more inextricably intertwined with advancements of technique.

V. AI and Democracy: Possibilities and Challenges for the Digital Century

We saw in Section II that modern democracy requires technology to solve its legitimacy problems. Careful design of the materiality of democracy is needed to solve the distant-state and overbearing-executive problems. At the same time, autocracy benefits from technological advances because they make bureaucracies more effective. The grand techno-skeptical narratives add warnings to the prospect of harnessing technology for democratic purposes, but these warnings do not undermine efforts to harness technology to advance democracy. Nor should we be guided by any Grand Democratic AI Utopia. What then are the possibilities and challenges of AI for democracy in this digital century? Specifically, how should AI be designed to harness the public sphere, political power, and economic power for democratic purposes, and thus make them akin to Winner’s inclusive traffic infrastructure?

1. Public Spheres

Public spheres are actor-networks for spreading and receiving information or opinions about matters of shared concern beyond family and friendship ties.Footnote 55 Prior to the invention of writing, public spheres were limited to people talking. Their flourishing depended on the availability of places where they could do so safely. The printing press mechanized exchange networks, dramatically lowering the costs of disseminating information or ideas. Eventually, newspapers became so central to public spheres that the press, and later the media collectively, were called ‘the fourth estate’.Footnote 56 After newspapers and other printed media came the telegraph, then radio, film, and television. In time, leading twentieth-century media scholars coined slogans to capture the importance of media for contemporary life, most distinctly Marshall McLuhan announcing ‘the medium is the message’ and Friedrich Kittler stating ‘the media determine our situation’.Footnote 57

‘Fourth estate’ is an instructive term. It highlights the relevance of the media, and the deference for the more prominent among them, as well as for particular journalists whose voices carry weight with the public. But the term also highlights that media have class interests of sorts: aside from legal regulations, journalists had demographic and educational backgrounds that generated certain agendas rather than others. The ascent of social media, enabled by the Internet, profoundly altered this situation, creating a public sphere where availability of information and viewpoints was no longer limited by ‘the fourth estate’. Big Tech companies have essentially undermined the point of referring to media that way.

In the Western world, Google became dominant in internet search. Facebook, Twitter, and YouTube offered platforms for direct exchanges among individuals and associations at a scale previously impossible. Political theorist Archon Fung refers to the kind of democracy that arose this way as ‘wide aperture, low deference democracy’: a much wider range of ideas and policies are explored than before, with traditional leaders in politics, media, or culture no longer treated with deference but ignored or distrusted.Footnote 58 Social media generated not only new possibilities for networking but also an abundance of data, gathered and analyzed to predict trends or to target people with messages data mining deems them receptive to. The 2018 Cambridge Analytica scandal – in which a British consulting firm obtained the personal data of millions of Facebook users without consent for use in political advertising – revealed the potential of data mining, especially in places where elections tend to be won by small margins.Footnote 59

Digital media have by now generated an online communications infrastructure that forms an important part of the public sphere, whose size and importance will only increase. This infrastructure consists of the paraphernalia and systems that make our digital lives happen, from the hardware of the Internet to institutions that control domain names and the software that maintains the functionality of the Internet and provides tools to make digital spaces usable (browsers, search engines, app stores, etc.). Today, private interests dominate our digital infrastructure. Typically, engineers and entrepreneurs ponder market needs, profiting from the fact that more and more of our lives unfolds on platforms optimized for clicks and virality.

In particular, news is presented to appeal to certain users, which not only creates echo chambers but also spreads a plethora of deliberate falsehoods (disinformation, rather than misinformation) that reinforce the worldviews of those users. Political scientists have long lamented the ignorance of democratic citizens and the resulting poor quality of public decision making.Footnote 60 Even well-informed, engaged voters choose based on social identities and partisan loyalties.Footnote 61 Digital media reinforce these tendencies. Twitter, Facebook, YouTube, and their competitors seek growth and revenue. The attention-grabbing algorithms of social media platforms can sow confusion, ignorance, prejudice, and chaos. AI tools then readily create artificial unintelligence.Footnote 62
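To make the worry about engagement-driven curation concrete, here is a deliberately crude Python sketch of ranking that optimizes only for predicted clicks; the posts and scores are invented, and real platform ranking systems are far more complex, but the structural point stands: nothing in the objective rewards accuracy or civic value.

```python
# Rank a feed purely by predicted engagement. Content that provokes clicks,
# accurate or not, rises to the top because accuracy plays no role in the score.
posts = [
    {"title": "City council budget report",   "predicted_clicks": 0.04, "accurate": True},
    {"title": "Outrageous rumor about rival", "predicted_clicks": 0.31, "accurate": False},
    {"title": "Fact-check of the rumor",      "predicted_clicks": 0.09, "accurate": True},
]

feed = sorted(posts, key=lambda p: p["predicted_clicks"], reverse=True)
for post in feed:
    print(f'{post["predicted_clicks"]:.2f}  {post["title"]}')
```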

The emergence of deepfakes has recently made it much harder to maintain a public sphere in which viewpoints can be articulated with authority. Bringing photoshopping to video, deepfakes replace people in existing videos with someone else’s likeness. Currently their reach is mostly limited to pornography, but their potential goes considerably beyond that. In recent decades, video has played a distinguished role in inquiry. What was captured on film served as indisputable evidence, in ways photography no longer could after manipulation techniques became widespread. Until the arrival of deepfakes, videos offered an ‘epistemic backstop’ in contested testimony.Footnote 63 Alongside other synthetic media and fake news, deepfakes might help create no-trust societies where people no longer bother to separate truth from falsehood, and no media help them do so.

What is needed to counteract such tendencies is the creation of what media scholar Ethan Zuckerman calls ‘digital public infrastructure’.Footnote 64 Digital public infrastructure lets us engage in public and civic life in digital spaces, with norms and affordances designed around civic values. Figuratively speaking, designing digital public infrastructure is like creating parks and libraries for the Internet. Such infrastructures are devised to inform us, structured to connect us both to people we agree with and to people we disagree with, and meant to encourage dialogue rather than simply reinforce perceptions. As part of their design, synthetic media must be integrated appropriately, in ways that require clear signaling of how they are put together. People would also operate within such infrastructures in ways that protect their entitlements as knowers and knowns, their epistemic rights.Footnote 65

One option is to create a fleet of localized, community-specific, public-serving institutions to serve in digital space the functions that community institutions have served for centuries in physical places. There must be some governance model so that this fleet serves the public. Wikipedia’s system of many editors and authors and Taiwan’s digital democracy platform provide inspiring models for decentralized participatory governance.Footnote 66 Alternatively, governments could create publicly funded nonprofit corporations to manage and maintain the public’s interest in digital life. Specialized AI would be front and center in such work. Properly designed digital public infrastructures would be like Winner’s inclusive traffic infrastructures and could greatly help solve the distant-state and overbearing-executive problems.

2. Political Power

As far as the use of AI for the maintenance of power is concerned, the Chinese social credit system – a broadly based system for gathering information about individuals and bringing that information to bear on what people may do – illustrates how autocratic regimes avail themselves of technological advances.Footnote 67 Across the world, cyberspace has become a frequent battleground between excessively profit-seeking or outright criminal activities and overly strong state reactions to them. By now many tools exist that help governments rein in such activities, but those same tools also help authoritarians suppress political activity.Footnote 68 While most mass protests in recent years, from Hong Kong to Algeria and Lebanon, were inspired by hashtags, coordinated through social networks, and convened by smartphones, governments have learned how to counter such movements. They control online spaces by blocking platforms and disrupting the Internet.Footnote 69

In his 1961 farewell speech, US president Dwight D. Eisenhower famously warned against acquisition of unwarranted influence ‘by the military-industrial complex’ and against public policy becoming ‘captive of a scientific-technological elite’.Footnote 70 Those interconnected dangers would be incompatible with a flourishing democracy. Eisenhower spoke only years after the Office of Naval Research had partly funded the first Summer Research Project on AI at Dartmouth in 1956, and thereby indicated that the military-industrial complex had a stake in this technology developed by the scientific-technological elite.Footnote 71 Decades later, the 2013 Snowden revelations showed what US intelligence could do with tools we can readily classify as specialized AI. Phones, social media platforms, email, and browsers serve as data sources for the state. Analyzing meta-data (who moved where, connected to whom, read what) provides insights into operations of groups and activities of individuals. Private-sector partnerships have considerably enhanced capacities of law enforcement and military to track people (also involving facial, gait, and voice recognition), from illegal immigrants at risk of deportation to enemies targeted for killing.Footnote 72

Where AI systems are deployed as part of the welfare state, they often surveil people and restrict access to resources, rather than providing greater support.Footnote 73 Secret databases and little-known AI applications have had harmful effects in finance, business, education, and politics. AI-based decisions on parole, mortgages, or job applications are often opaque and biased in ways that are hard to detect. Democratic ideals require reasons and explanations, but the widespread use of opaque and biased algorithms has prompted one observer to call societies that make excessive use of algorithms ‘black-box societies’.Footnote 74 If algorithms do things humans find hard to assess, it is unclear what would even count as relevant explanations. Such practices readily perpetuate past injustice. After all, data inevitably reflect how people have been faring so far. Thus, they reflect the biases, including racial biases, that have structured exercises of power.Footnote 75 Decades ago Donna Haraway’s ‘Cyborg Manifesto’, a classic at the intersection of feminist thought and the philosophy of technology, warned the digital age might sustain white capitalist patriarchy with ‘informatics of domination’.Footnote 76
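How past bias can pass into ostensibly neutral systems can be shown with a toy sketch in Python using scikit-learn; the data are invented and grossly simplified, but they illustrate the mechanism: when the historical decisions encoded in training data tracked group membership rather than qualification, a model trained on those decisions reproduces exactly that pattern.

```python
# Train a small classifier on invented 'historical approvals' in which two groups
# with identical qualification scores received different outcomes. The learned
# model picks up the group feature and perpetuates the disparity.
from sklearn.tree import DecisionTreeClassifier

# Features: [qualification_score, group]; label: past approval decision (1 = approved)
X = [[70, 0], [72, 0], [68, 0], [71, 0],
     [70, 1], [72, 1], [68, 1], [71, 1]]
y = [1, 1, 1, 1,
     0, 0, 0, 0]

model = DecisionTreeClassifier(random_state=0).fit(X, y)

# Same qualification score, different group: the prediction differs only by group.
print(model.predict([[70, 0], [70, 1]]))   # expected output: [1 0]
```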

Of course, digital technologies can strengthen democracy. In 2011, Iceland produced the world’s first ‘crowdsourced’ constitutional proposal. In Taiwan, negotiations among authorities, citizens, and companies like Uber and Airbnb were aided by an innovative digital process for deliberative governance called vTaiwan. France relied on digital technologies for the Great National Debate in early 2019 and the subsequent Convention on Climate Change between October 2019 and June 2020, experimenting with deliberation at the scale of a large nation.Footnote 77 Barcelona has become a global leader in the smart city movement, deploying digital technology for matters of municipal governance,Footnote 78 though it is Singapore, Helsinki, and Zurich that do best on the Smart City Index 2020 (speaking to how much innovation goes on in that domain).Footnote 79 An Australian non-profit eDemocracy project, openforum.com.au, invites politicians, senior administrators, academics, businesspeople, and other stakeholders to engage in policy debates. The California Report Card is a mobile-optimized web application promoting public involvement in state government. As the COVID-19 pandemic ravaged the world, democracies availed themselves of digital technologies to let people remain connected and to serve as key components of public health surveillance. And while civil society organizations frequently are no match for abusive state power, there are remarkable examples of how even investigations limited to open internet sources can harvest the abundance of available data to pillory abuse of power. The best-known example is the British investigative journalism website Bellingcat, which specializes in fact-checking and open-source intelligence.Footnote 80

One striking fact specifically about the American version of modern democracy is that, when preferences of low- or middle-income Americans diverge from those of the affluent, there is virtually no correlation between policy outcomes and desires of the less advantaged groups.Footnote 81 As far as political power is concerned, the legitimacy of modern democracy is questionable indeed. Democracy could be strengthened considerably by well-designed AI. Analyzing databases would give politicians a more precise image of what citizens need. The bandwidth of communication between voters and politicians could increase immensely. Some forms of surveillance will be necessary, but democratic governance requires appropriate oversight. The digital public infrastructure discussed in the context of the public sphere can be enriched to include systems that deploy AI for improving citizen services. The relevant know-how exists.Footnote 82

3. Economic Power

Contemporary ideals of democracy include egalitarian empowerment of sorts. But economic inequality threatens any such empowerment. Contemporary democracies typically have capitalist economies. As French economist Thomas Piketty has argued, over time capitalism generates inequality because, roughly speaking, owners of shares of the economy benefit more from it than people living on the wages the owners willingly pay.Footnote 83 A worry about democracy across history (also much on the mind of Publius) has been that the masses would expropriate elites. But in capitalist democracies, we must worry about the opposite. It takes sustained policies around taxation, transportation, the design of cities, healthcare, digital infrastructure, pension and education systems, and macro-economic and monetary management to curtail inequality.

One concern about AI is that, generally, the ability to produce or use technology is one mechanism that drives inequality, allowing those with the requisite skills to advance – not only to become well-off but to become owners in the economy in ways that resonate across generations. Technology generally and AI specifically are integral parts of the inequality-enhancing mechanisms Piketty identifies. One question is how these inequality-increasing tendencies play out for those who are not among the clear winners. AI will profoundly transform jobs, at least because aspects of many jobs will be absorbed by AI or otherwise mechanized. These changes also create new jobs, including at the lower end, in the maintenance of hardware and in basic tasks around data gathering and analysis.Footnote 84 On the optimistic side of predictions about the future of work, we find visions of a society with traditional jobs gradually transformed, some eliminated and new jobs added – in ways that create much more leisure time for average people, owing to increased societal wealth.

On the pessimistic side, many who are unqualified for meaningful roles in tech economies might be dispensable to the labor force. Their political relevance might eventually amount to little more than that they must be pacified if they cannot be excluded outright. Lest this standpoint be dismissed as Luddite alarmism (‘at the end of the tunnel, there have always been more jobs than before’), we should note that economies where data ownership becomes increasingly relevant and AI absorbs many tasks could differ critically from economies organized around ownership of land or factories. In those earlier cases, large numbers of people were needed to provide labor – and, in the factory case, also to serve as consumers. Elites could not risk losing too many laborers. But this constraint might not apply in the future. To be sure, a lot here will depend on how questions around control over, or ownership of, data are resolved, questions whose relevance for our future economy cannot be overstated.Footnote 85

As recently argued by Shoshana Zuboff, the importance of data collection for the economy has become so immense that the term ‘surveillance capitalism’ characterizes the current stage of capitalism.Footnote 86 Surveillance capitalism as an economic model was developed by Google, which to surveillance capitalism is what Ford was to mass production. Later, the model was adopted by Facebook, Amazon, and others. Previously, data were collected largely to improve services. But subsequently, data generated as byproducts of interactions with multifarious devices were deployed to develop predictive products, designed to forecast what we will feel, think, or do, but ultimately also to control and change it, always for the sake of monetization. Karl Marx and Friedrich Engels identified increasing commodification as a basic mechanism of capitalism (though they did not use that very term). Large-scale data collection is its maximal version: It commodifies all our lived reality.

In the twentieth century, Hannah Arendt and others diagnosed mechanisms of ‘totalitarian’ power, the state’s all-encompassing power.Footnote 87 Its central metaphor is Big Brother, capturing the state’s omnipresence. Parallel to that, Zuboff talks about ‘instrumentarian’ power, exercised through use of electronic devices in social settings for harvesting profits. The central metaphor is the ‘Big Other’, the ever-present electronic device that knows just what to do. Big Brother aimed for total control, Big Other for predictive certainty.

Current changes are driven by relatively few companies, which futurist Amy Webb calls ‘the Big Nine’: in the US, Google, Microsoft, Amazon, Facebook, IBM, and Apple; in China, Tencent, Alibaba, and Baidu.Footnote 88 At least at the time of Webb’s writing, the Chinese companies were busy consolidating and mining massive amounts of data to serve the government’s ambitions; the American ones implemented surveillance capitalism, embedded in a legal and political framework that, as of 2021, shows little interest in developing strategic plans for a democratic future – in doing for democracy what the Chinese Communist Party did for its system: upgrading it into this century. To be sure, the EU is more involved in such efforts. But none of the Big Nine are based there, and overall, the economic competition in the tech sector seems to be ever more between the United States and China.

The optimistic side of predictions about the future of work seems reachable. But to make that happen in ways that also strengthen democracy, both civil society and the state must step up, and the enormous power concentrated in Big Tech companies needs to be harnessed for democratic purposes.

VI. Conclusion

Eventually there might be a full-fledged Life 3.0, whose participants not only design their cultural context (as in Life 2.0, which sprang from the evolutionary Life 1.0), but also their physical shapes.Footnote 89 Life 3.0 might be populated by genetically enhanced humans, cyborgs, uploaded brains, as well as advanced algorithms embedded into any manner of physical device. If Life 3.0 ever emerges, new questions of governance will arise. Would humans still exercise control? If so, would there be democracies, or would some people or countries subjugate everybody else? Would it be appropriate to involve the new intelligent entities in governance, and what would they have to be like for the answer to be affirmative? If humans are not in control, what will governance be like? Would humans even be involved?Footnote 90

It is unclear when questions about democracy in Life 3.0 will become urgent. Meanwhile, as innovation keeps happening, societies will change. Innovation will increase awareness of human limitations and set in motion different ways for people to deal with them. As Norbert Wiener, whose invention of cybernetics inaugurated later work on AI, stated in 1964:

The world of the future will be an ever more demanding struggle against the limitation of our intelligence, not a comfortable hammock in which we can lie down to be waited upon by our robot slaves.Footnote 91

Maybe more and more individuals will want to adapt to technological change, perhaps deploying technology to morph into a transhuman stage.Footnote 92 Generally, what technology people use – the materiality of their lives – affects who they are and want to become. Technology mediates how we see ourselves in relation to our social and natural environment, how we engage with people, animals, and material objects, what we do with ourselves, how we spend our professional lives, and so on. In short, technology critically bears on what forms of human life get realized or even imagined. For those that do get realized or imagined, what it means to be part of them cannot be grasped without comprehending the role of technology in them.Footnote 93

As we bring about the future, computer scientists will become ever more important, including as experts in designing specialized AI for democratic purposes. That raises its own challenges. Much as technology and democracy are no natural allies, technologists are no natural champions of, nor even qualified advisers to, democracy. Any scientific activity, Arendt observed some years before Wiener wrote the words just cited, acts into nature from the standpoint of the universe rather than into the web of human relationships; it therefore lacks the revelatory character of action as well as the ability to produce stories and become historical, which together form the very source from which meaningfulness springs into and illuminates human existence.Footnote 94

Democracy is a way of life more than anything else, one that greatly benefits from the kind of action Arendt mentions. And yet modern democracy critically depends on technology to be the kind of actor-network that solves the distant-state and overbearing-executive problems. Without suitable technology, modern democracy cannot survive. Technology needs to be consciously harnessed to become like Winner’s inclusive traffic infrastructure, and both technologists and citizens generally need to engage with ethics and political thought to have the spirit and dedication to build and maintain that kind of infrastructure.

Footnotes

* I am grateful to audiences at University College London and at the University of Freiburg for helpful discussions during Zoom presentations of this material in June 2021. I also acknowledge helpful comments from Sushma Raman, Derya Honca, and Silja Voeneky.

1 L Winner, ‘Do Artifacts Have Politics?’ (1980) 109 Daedalus 121.

2 For current trends, see P Chojecki, Artificial Intelligence Business: How You Can Profit from AI (2020). For the state of the art, see M Mitchell, Artificial Intelligence: A Guide for Thinking Humans (2019); T Taulli, Artificial Intelligence Basics: A Non-Technical Introduction (2019); S Russell, Human Compatible: Artificial Intelligence and the Problem of Control (2019). See also The Future Today Institute, ‘14th Annual Tech Trends Report’ (2021); for musings on the future of AI, see J Brockman (ed), Possible Minds: Twenty-Five Ways of Looking at AI (2019).

3 For optimism about the occurrence of a singularity, see R Kurzweil, The Singularity Is Near: When Humans Transcend Biology (2006); for pessimism, see EJ Larson, The Myth of Artificial Intelligence: Why Computers Can’t Think the Way We Do (2021); see also N Bostrom, Superintelligence: Paths, Dangers, Strategies (2016) (hereafter Bostrom, Superintelligence); M Tegmark, Life 3.0: Being Human in the Age of Artificial Intelligence (2017) (hereafter Tegmark, Life 3.0).

4 B Latour, Reassembling the Social: An Introduction to Actor-Network-Theory (2007); B Latour, We Have Never Been Modern (1993). To be sure, and notwithstanding the name of the theory, Latour speaks of actants rather than actors, to emphasize the role of non-human entities.

5 How to understand ‘technology’ is a non-trivial question in the philosophy of technology, as it affects how broad our focus is; see C Mitcham, Thinking through Technology: The Path between Engineering and Philosophy (1994); M Coeckelbergh, Introduction to Philosophy of Technology (2019). For AI one could just think of a set of tools in machine learning; alternatively, one could think of the whole set of devices in which these tools are implemented, and all productive activities that come with procurement and extraction of materials involved; see K Crawford, Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (2021) (hereafter Crawford, Atlas of AI). While I mostly sideline these issues, I adopt an understanding of technology from W Bijker, ‘Why and How Technology Matters’ in RE Goodin and C Tilly (eds), The Oxford Handbook of Contextual Political Analysis (2006). At a basic level, ‘technology’ refers to sets of artefacts like computers, cars, or voting machines. At the next level, it also includes human activities, as in ‘the technology of e‐voting’. Thereby it refers also to the making and handling of such machines. Finally, and closest to its Greek origin, ‘technology’ refers to knowledge: It is about what people know as well as what they do with machines and related production processes.

6 For a good overview, see A Gutmann, ‘Democracy’ in RE Goodin, P Pettit, and TW Pogge (eds), A Companion to Contemporary Political Philosophy (2007).

7 D Stasavage, The Decline and Rise of Democracy: A Global History from Antiquity to Today (2020) (hereafter Stasavage, The Decline and Rise of Democracy).

8 To think of Greek democracy as a uniquely located innovation also contradicts the evolutionary story of early bands of humans who succeeded because they were good at cooperating and had brains that had evolved to serve cooperative purposes. See, for example, C Boehm, Hierarchy in the Forest: The Evolution of Egalitarian Behavior (1999). To the extent that a demos separate from an aristocracy is the hallmark of democracy (a sensible view given the etymology), many cases covered by Stasavage do not count. Still, his account creates an illuminating contrast with autocracies. Also, in structures where consent is needed, internal dynamics over time typically demand broader inclusion.

9 Stasavage, The Decline and Rise of Democracy (Footnote n 7) 29; J Ober, The Rise and Fall of Classical Greece (2015) 123; J Thorley, Athenian Democracy (2004) 23.

10 O Höffe (ed), Aristotle. Politics (1998). Also see M Risse, ‘The Virtuous Group: Foundations for the ‘Argument from the Wisdom of the Multitude’’ (2001) 31 Canadian Journal of Philosophy 31, 53.

11 For the devices, I draw on J Dibbell, ‘Info Tech of Ancient Democracy’ (Alamut), www.alamut.com/subj/artiface/deadMedia/agoraMuseum.html, which explores museum literature on these artefacts displayed in Athens. See also S Dow, ‘Aristotle, the Kleroteria, and the Courts’ (1939) 50 Harvard Studies in Classical Philology 1. For the mechanics of Athenian democracy, see also MH Hansen, The Athenian Democracy in the Age of Demosthenes: Structure, Principles, and Ideology (1991).

12 Hélène Landemore has argued that modern democracy erred in focusing on representation. Instead, possibilities of small-scale decision making with appropriate connections to government should have been favored – which now is more doable through technology. See H Landemore, ‘Open Democracy and Digital Technologies’ in L Bernholz, H Landemore, and R Reich (eds), Digital Technology and Democratic Theory (2021) 62; H Landemore, Open Democracy: Reinventing Popular Rule for the Twenty-First Century (2020).

13 Howard Zinn has a rather negative take specifically on the founding of the United States that would make it unsurprising that these legitimacy problems arose: ‘Around 1776, certain important people in the English colonies […] found that by creating a nation, a symbol, a legal unity called the United States, they could take over land, profits, and political power from favorites of the British Empire. In the process, they could hold back a number of potential rebellions and create a consensus of popular support for the rule of a new, privileged leadership’; H Zinn, A People’s History of the United States (2015) 59.

14 JE Cooke, The Federalist (1961), 149 (hereafter Cooke, Federalist).

15 JS Young, The Washington Community 1800–1828 (1966) 32.

16 B Bimber, Information and American Democracy: Technology in the Evolution of Political Power (2003) 89. For the argument that, later, postal services were critical to the colonization of the American West (and thus have been thoroughly political throughout their existence), see C Blevins, Paper Trails: The US Post and the Making of the American West (2021).

17 Cooke, Federalist (Footnote n 14) 384.

18 I follow J Lepore, ‘Rock, Paper, Scissors: How We Used to Vote’ (The New Yorker, 13 October 2008). Some of those themes also appear in J Lepore, These Truths: A History of the United States (2019), especially chapter 9. See also RG Saltman, History and Politics of Voting Technology: In Quest of Integrity and Public Confidence (2006). For the right to vote in the United States, see A Keyssar, The Right to Vote: The Contested History of Democracy in the United States (2009).

19 Stasavage, The Decline and Rise of Democracy (Footnote n 7) 296. For a political-theory idealization of modern democracy in terms of two ‘tracks’, see J Habermas, Between Facts and Norms: Contributions to a Discourse Theory of Law and Democracy (1996) Chapters 7–8. The first track is formal decision making (e.g., parliament, courts, agencies). The other is informal public deliberation, where public opinion is formed.

20 The success of the Chinese model has prompted some philosophers to defend features of that model, also in light of how democracies have suffered from the two legitimacy problems; see DA Bell, The China Model: Political Meritocracy and the Limits of Democracy (2016); T Bai, Against Political Equality: The Confucian Case (2019); J Chan, Confucian Perfectionism: A Political Philosophy for Modern Times (2015). For the view that China’s Communist Party will face a crisis that will force it to let China become democratic, see J Ci, Democracy in China: The Coming Crisis (2019). For the argument that different governance models emerge for good reasons at different times, see F Fukuyama, The Origins of Political Order: From Pre-Human Times to the French Revolution (2012); F Fukuyama, Political Order and Political Decay: From the Industrial Revolution to the Globalization of Democracy (2014).

21 YN Harari, ‘Why Technology Favors Tyranny’ (The Atlantic, October 2018) www.theatlantic.com/magazine/archive/2018/10/yuval-noah-harari-technology-tyranny/568330/.

22 FA Hayek, The Road to Serfdom (2007).

23 Similarly, Cohen and Fung – reviewing deterministic viewpoints that see technology as clearly favoring or disfavoring democracy – conclude that ‘the democratic exploitation of technological affordances is vastly more contingent, more difficult, and more dependent on ethical conviction, political engagement, and good design choices than the technological determinists appreciated’; A Fung and J Cohen, ‘Democracy and the Digital Public Sphere’ in L Bernholz, H Landemore, and R Reich (eds), Digital Technology and Democratic Theory (2021) 25 (hereafter Fung and Cohen, ‘Democracy and the Digital Public Sphere’). Or as computer scientist Nigel Shadbolt says, addressing worries that ‘machines might take over’: ‘[T]he problem is not that machines might wrest control of our lives from the elites. The problem is that most of us might never be able to wrest control of the machines from the people who occupy the command posts’, N Shadbolt and R Hampson, The Digital Ape: How to Live (in Peace) with Smart Machines (2019) 63.

24 M Coeckelbergh, Introduction to Philosophy of Technology (2019) Part II.

25 On Mumford, see DL Miller, Lewis Mumford: A Life (1989).

26 L Mumford, Technics and Civilization (2010).

28 Footnote Ibid, 12–18.

29 L Mumford, Myth of the Machine: Technics and Human Development (1967) (hereafter Mumford, Myth of the Machine); L Mumford, Pentagon of Power: The Myth of the Machine (1974) (hereafter Mumford, Pentagon of Power).

30 Mumford, Myth of the Machine (Footnote n 29) chapter 9.

31 The title of chapter 11 of Mumford, Pentagon of Power (Footnote n 29).

32 M Heidegger, The Question Concerning Technology, and Other Essays (1977) 3–35 (hereafter Heidegger, The Question Concerning Technology). On Heidegger, see J Richardson, Heidegger (2012); ME Zimmerman, Heidegger’s Confrontation with Modernity: Technology, Politics, and Art (1990).

33 Heidegger, The Question Concerning Technology (Footnote n 32) 17.

34 Quoted in J Young, Heidegger’s Later Philosophy (2001) 46.

35 Heidegger, The Question Concerning Technology (Footnote n 32) 16.

38 Quoted in J Young, Heidegger’s Later Philosophy (2001) 50.

39 HL Dreyfus, On the Internet (2008).

40 W Benjamin, The Work of Art in the Age of Its Technological Reproducibility, and Other Writings on Media (2008).

41 H Marcuse, One-Dimensional Man: Studies in the Ideology of Advanced Industrial Society (1991) 1 (hereafter Marcuse, One-Dimensional Man).

42 Marcuse, One-Dimensional Man (Footnote n 41) 3.

44 J Ellul, The Technological Society (1964) (hereafter Ellul, The Technological Society). For recent discussions, see JP Greenman, Understanding Jacques Ellul (2012); HM Jerónimo, JL Garcia, and C Mitcham, Jacques Ellul and the Technological Society in the 21st Century (2013).

45 Ellul, The Technological Society (Footnote n 44) 133.

46 Ellul, The Technological Society (Footnote n 44) 258.

48 J Susskind, Future Politics: Living Together in a World Transformed by Tech (2018) Chapter 13; YN Harari, Homo Deus: A Brief History of Tomorrow (2018) Chapter 9.

49 J Lovelock, Novacene: The Coming Age of Hyperintelligence (2020).

50 See T Ord, The Precipice: Existential Risk and the Future of Humanity (2021) Chapter 5.

51 For a discussion of majority rule in the context of competing methods that process information differently, see also M Risse, ‘Arguing for Majority Rule’ (2004) 12 Journal of Political Philosophy 41.

52 RH Thaler and CR Sunstein, Nudge: Improving Decisions about Health, Wealth, and Happiness (2009).

53 See e.g., HE Gardner, Frames of Mind: The Theory of Multiple Intelligences (2011).

54 On this, see also D Helbing and others, ‘Will Democracy Survive Big Data and Artificial Intelligence?’ (Scientific American, 25 February 2017) www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence/.

55 For a classic study of the emergence of public spheres, see J Habermas, The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society (1991). For how information spread in different periods, see A Blair and others, Information: A Historical Companion (2021). For the development of media in recent centuries, see P Starr, The Creation of the Media: Political Origins of Modern Communications (2005).

56 This term has been attributed to Edmund Burke, and thus goes back to a time decades before media played that kind of role in the American version of modern democracy; see J Schultz, Reviving the Fourth Estate: Democracy, Accountability and the Media (1998) 49.

57 M McLuhan, Understanding Media: The Extensions of Man (1994); FA Kittler, Gramophone, Film, Typewriter (1999).

58 For the emergence of digital media and their role for democracy, see Fung and Cohen, ‘Democracy and the Digital Public Sphere’ (Footnote n 23). For the formulation I attributed to Fung, see, for instance, the podcast PolicyCast, ‘211 Post-expert Democracy: Why Nobody Trusts Elites Anymore’ (Harvard Kennedy School, 3 February 2020) www.hks.harvard.edu/more/policycast/post-expert-democracy-why-nobody-trusts-elites-anymore.

59 A Jungherr, G Rivero and D Gayo-Avello, Retooling Politics: How Digital Media Are Shaping Democracy (2020) Chapter 9; C Véliz, Privacy Is Power: Why and How You Should Take Back Control of Your Data (2021) Chapter 3 (hereafter Véliz, Privacy Is Power).

60 J Brennan, Against Democracy (2017); B Caplan, The Myth of the Rational Voter: Why Democracies Choose Bad Policies (2nd ed., 2008); I Somin, Democracy and Political Ignorance: Why Smaller Government Is Smarter (2nd ed., 2016).

61 CH Achen and LM Bartels, Democracy for Realists: Why Elections Do Not Produce Responsive Government (2017).

62 M Broussard, Artificial Unintelligence: How Computers Misunderstand the World (2019) (hereafter Broussard, Artificial Unintelligence).

63 R Rini, ‘Deepfakes and the Epistemic Backstop’ (2020) 20 Philosophers’ Imprint 1. See also C Kerner and M Risse, ‘Beyond Porn and Discreditation: Promises and Perils of Deepfake Technology in Digital Lifeworlds’ (2021) 8(1) Moral Philosophy and Politics 81.

64 For E Zuckerman’s work, see E Zuckerman, ‘What Is Digital Public Infrastructure’ (Center for Journalism & Liberty, 17 November 2020) www.journalismliberty.org/publications/what-is-digital-public-infrastructure#_edn3; and E Zuckerman, ‘The Case for Digital Public Infrastructure’ (Knight First Amendment Institute, 17 January 2020) https://knightcolumbia.org/content/the-case-for-digital-public-infrastructure; see also E Pariser and D Allen, ‘To Thrive, Our Democracy Needs Digital Public Infrastructure’ (Politico, 5 January 2021) www.politico.com/news/agenda/2021/01/05/to-thrive-our-democracy-needs-digital-public-infrastructure-455061.

65 S Zuboff, ‘The Coup We Are Not Talking About’ (New York Times, 29 January 2021) www.nytimes.com/2021/01/29/opinion/sunday/facebook-surveillance-society-technology.html; M Risse, ‘The Fourth Generation of Human Rights: Epistemic Rights in Digital Lifeworlds’ (2021) Moral Philosophy and Politics https://doi.org/10.1515/mopp-2020-0039.

66 On Taiwan, see A Leonard, ‘How Taiwan’s Unlikely Digital Minister Hacked the Pandemic’ (Wired, 23 July 2020) www.wired.com/story/how-taiwans-unlikely-digital-minister-hacked-the-pandemic/.

67 For a recent take, see J Reilly, M Lyu, and M Robertson, ‘China’s Social Credit System: Speculation vs. Reality’ (The Diplomat, 30 March 2021) https://thediplomat.com/2021/03/chinas-social-credit-system-speculation-vs-reality/. See also B Dickson, The Party and the People: Chinese Politics in the 21st Century (2021).

68 RJ Deibert, Black Code: Surveillance, Privacy, and the Dark Side of the Internet (2013); RJ Deibert, Reset: Reclaiming the Internet for Civil Society (2020).

69 Fung and Cohen, ‘Democracy and the Digital Public Sphere’ (Footnote n 23).

70 For the speech, see DD Eisenhower, ‘Farewell Address’ (1961) www.ourdocuments.gov/doc.php?flash=false&doc=90&page=transcript.

71 Crawford, Atlas of AI (Footnote n 5) 184. Obviously, in 1961 AI was not what Eisenhower had in mind.

72 Crawford, Atlas of AI (Footnote n 5) Chapter 6. See also Véliz, Privacy Is Power (Footnote n 59).

73 V Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (2018).

74 F Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (2016). See also Broussard, Artificial Unintelligence (Footnote n 62); C O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (2017).

75 R Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (2019); R Benjamin, Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life (2019); SU Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (2018). See also C D’Ignazio and LF Klein, Data Feminism (2020); S Costanza-Chock, Design Justice: Community-Led Practices to Build the Worlds We Need (2020).

76 D Haraway, Simians, Cyborgs, and Women: The Reinvention of Nature (2015) 149–182.

77 L Bernholz, H Landemore, and R Reich (eds), Digital Technology and Democratic Theory (2021).

78 P Preville, ‘How Barcelona Is Leading a New Era of Digital Democracy’ (Medium, 13 November 2019) https://medium.com/sidewalk-talk/how-barcelona-is-leading-a-new-era-of-digital-democracy-4a033a98cf32.

79 IMD, ‘Smart City Index’ (IMD, 2020) www.imd.org/smart-city-observatory/smart-city-index/.

80 E Higgins, We Are Bellingcat: Global Crime, Online Sleuths, and the Bold Future of News (2021). See also M Webb, Coding Democracy: How Hackers Are Disrupting Power, Surveillance, and Authoritarianism (2020).

81 LM Bartels, Unequal Democracy: The Political Economy of the New Gilded Age (2018); M Gilens, Affluence and Influence: Economic Inequality and Political Power in America (2014).

82 On AI and citizen services, see H Mehr, ‘Artificial Intelligence for Citizen Services and Government’ (Harvard Ash Center Technology & Democracy Fellow, August 2017) https://ash.harvard.edu/files/ash/files/artificial_intelligence_for_citizen_services.pdf.

83 T Piketty, Capital in the Twenty-First Century (2014).

84 On these topics, see e.g., D Susskind, A World Without Work: Technology, Automation, and How We Should Respond (2020); DM West, The Future of Work: Robots, AI, and Automation (2019).

85 On this, also see M Risse, ‘Data as Collectively Generated Patterns: Making Sense of Data Ownership’ (Carr Center for Human Rights Policy, 4 April 2021) https://carrcenter.hks.harvard.edu/publications/data-collectively-generated-patterns-making-sense-data-ownership.

86 S Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019). See also Véliz, Privacy Is Power (Footnote n 59).

87 H Arendt, The Origins of Totalitarianism (1973).

88 A Webb, The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity (2020).

89 For that term, see Tegmark, Life 3.0 (Footnote n 3).

90 Tegmark, Life 3.0 (Footnote n 3) Chapter 5.

91 N Wiener, God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion (1964) 69.

92 D Livingstone, Transhumanism: The History of a Dangerous Idea (2015); M More and N Vita-More, The Transhumanist Reader: Classical and Contemporary Essays on the Science, Technology, and Philosophy of the Human Future (2013); Bostrom, Superintelligence (Footnote n 3).

93 For some aspects of this, see NC Carr, The Shallows: How the Internet Is Changing the Way We Think, Read and Remember (2011); S Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other (2017). But the constitutive role of technology in human life is a central theme in the philosophy of technology and adjacent areas generally.

94 H Arendt, The Human Condition (1958) 324.
