
Training Humans Not Machines

Artificial Intelligence and the Performance Culture of Its Critique

Published online by Cambridge University Press:  01 March 2024


Abstract

In the 21st century, the performance culture of critique has transformed with the increasing implementation of AI technologies upon which the operative functions of data capitalism are built. Operating within the performance-based culture industry, the works of Trevor Paglen, Gerald Nestler/Sylvia Eckermann, and Vladan Joler respond critically to data capitalism’s modes of data extraction and how the societal performances of capitalism condition people’s physical and digital performances.

Type
Special Issue Still Exhausted: Labor, Digital Technologies, and the Performing Arts
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press for Tisch School of the Arts/NYU

“Are we all going to be working for a smart machine, or will we have smart people around the machine?” (Zuboff 2019:3). With this provocative question Shoshana Zuboff opens her book The Age of Surveillance Capitalism, which describes a new production logic within the 21st-century capitalist system.Footnote 1 Without addressing issues concerning class or labor per se, the business scholar analyzes how leading Silicon Valley tech companies (e.g., Amazon, Apple, Google/Alphabet, Microsoft, Meta Platforms/Facebook) extract, harvest, and analyze human behavior patterns with artificial intelligence technologies; the accumulated big data of human activities and identities online are then metabolized through the companies’ machine-learning operations into algorithmic code, serving the corporate aim of generating financial profit—hence capital.Footnote 2 Zuboff’s model of 21st-century data capitalism contextualizes how B.F. Skinner’s behavioral conditioning techniques (1938), which derived from scientific animal studies and provided a groundwork for mid-20th-century cybernetics, are implemented through artificial intelligence operations and performed via digital platforms. These algorithm-based technologies not only track but also continuously manipulate people’s behavior patterns on a mass scale. The extractive data industry of AI forms the technological operations on which the infrastructures of 21st-century capitalism run, and it has, so far, not reached a point of exhaustion.

Given the urgency of critical intervention into how capitalist AI technologies perform (including their production, application, and use) and condition people’s behavior and actions, performance can be considered in two ways to address this issue. First, I define the performance of AI technologies as operating within capitalism’s political economy and feeding on the exploitation of human activity and natural resources; and second, I define artistic performances that operate within the cultural industry and are critical of AI technologies as performance models of critique. This two-fold approach to performance exposes the relation between capitalism’s AI operations and artistic critique. I propose three distinct artistic “performance models of critique” to reevaluate the notions of both performance and critique.

In the early millennium, Jon McKenzie suggested that performance is not a discipline but a paradigm, and coined the notion of “techno-performance” to denote “technological performances” engineered by researchers who initially studied high-performance missiles and early computer systems in the US in the 1950s (2001:131). While McKenzie focuses on the performance of the military-industrial-academic complex, I am interested in exploring the relationship between the recent cybernetic technologies on which 21st-century data capitalism operates and the artistic performances that critique their capitalist production and their impact on society.

Performance Models of Critique

The historical tradition of the Enlightenment is key to the entanglement of how AI technologies and artistic critique perform. Isabelle Graw and Christoph Menke define “critique” as “the enlightenment strategy of judgment whereby a subject establishes itself and declares its autonomy as an independent judge above and distant from the matter of its consideration,” with the aim of empowering the subject itself (2019:9). Although the commercialization of critique is not a concern of Graw and Menke, subjective empowerment fuels the relation between the performance of cultural critique and artistic work: artists study a specific subject that they critique, and through their cultural performances they provide critical insights into it. As the aesthetic form of works analyzed in this article both incorporates knowledge about and uses some of the same means of production and techniques of capitalism’s operative AI technologies to stage a critique, these critical artistic performances are produced within the same societal infrastructures that they critique and therefore also actively shape. Although this kind of critical artistic labor is complicit with economic production, I suggest that its value form represents a negative attitude as it “stands in the service of self-preservation or -empowerment” (10).

Following this proposition, human agency is implicit in both the production and the user-consumer experiences of techno-capitalism. As the three artistic practices from the 2010s that I consider use the cultural sphere of art to critique the application of capitalism’s operational functions of AI technologies, the following questions arise: How can humans navigate the algorithmicized “technological condition” (Hörl 2015) without becoming corrupted and governed by corporate algorithms that feed on the insight gained from their behavioral patterns? Which aesthetic and technological means do artists use to perform their critique of the invisible power mechanisms of 21st-century capitalist AI technologies to a lay audience? What aesthetics do such critical performances generate? Does it matter if they are regarded as art? And ultimately, what do such critical artistic performance practices reveal about the limitations of both machine and human agency, operating within a political economy steered by the latest cybernetic technologies?

To offer answers to these questions addressing conceptions of contemporary AI technology and artistic performances of its cultural critique in the 21st century, I take up Dave Beech’s prompt to focus on how “art correspond[s] to the capitalist mode of production” (2015:5), as well as Gerald Nestler’s term “post-disciplinary” (2020:114), which conceives of art not as a discipline but as an entanglement of theoretical, artistic, scientific, and political engagement. Distinct from performance studies scholarship that explores the relationship between media and performance, I analyze three distinct artistic performance practices that address the political economy of how AI technologies perform, its human and material resources, and the ethical issues (the social discrimination patterns) that it (re)produces.Footnote 3 To conduct a close visual and performance analysis of the three works and map their distinct artistic performance models, I draw on conversations and interviews I had with the artists.

I focus on Trevor Paglen’s live performance Sight Machine (2019), Gerald Nestler and Sylvia Eckermann’s participatory live two-part performance event series The Future of Demonstration (2017–18), and Vladan Joler’s animation video The New Extractivism (2020). Although these artistic positions realize a critique of AI technologies in distinct exhibition formats, their “artistic performances” (they perform their human agency as artists in the cultural industry) all draw attention to the gap between bodily figuration and technological abstraction, and intervene in the political economy of AI by mediating the necessity for greater transparency regarding the otherwise invisible capitalist development and application of AI technologies, as well as the impact data capitalism has on society.

The Cooptation of Scientific Research and Artistic Production

The cultural industry’s social critique of AI is representative of the increasing approximation of artistic and scientific production. Since the early 1990s, structural changes to universities and their funding have resulted in the introduction of PhD-in-practice programs, first in the UK and Finland and then across the EU with the Bologna Process in 1999 (Share Network n.d.). This has led to more and more artists earning PhDs for intellectual and sometimes also financial support. Following the change in research and arts-based research education in the 2000s, artistic and scientific research have been practiced in close proximity, with methods and knowledge shared.Footnote 4 Cultural critic Irit Rogoff has questioned the impact “educational turns” have on cultural practices as notions of “knowledge production,” “research,” “education,” “open-end production,” and “self-organized pedagogies” have increasingly formalized artistic working methods (2008).Footnote 5 This recent transformation of the educational and cultural sphere of production poses a challenge to the understandings of artistic research and work, and prompts the tongue-in-cheek question: Can artists perform critically at all?

Trevor Paglen, Gerald Nestler, and Vladan Joler hold PhDs, and the research methods that they have developed shape the aesthetics of their work: Paglen completed a PhD in geography at the University of California, Berkeley, on top of his MFA; Nestler earned a PhD-in-Practice from Goldsmiths, University of London, after completing his diploma at the Academy of Fine Arts Vienna; and Joler holds a PhD-in-Practice from the arts department of the University of Novi Sad, where he also studied art.Footnote 6 Over the past few years, their critical artistic practice has built on recent scientific and insider knowledge, which they stage in aesthetic forms distinct from established artistic practices resulting in painting, photography, sculpture, music, visual art, video art, installation, or visual/dance/theatre performances. Paglen, Nestler, and Joler’s productive appropriation of scientific knowledge in their artistic work enables them to present cultural critique as art in public spaces. Their political commitment to researching AI systems and exhibiting their findings makes the otherwise invisible mechanisms of data capitalism transparent to laypeople.

Approaching the technological condition of data capitalism critically, I suggest that the critical work of Paglen, Nestler working in collaboration with Eckermann, and Joler represents a stream of artistic practice that moves beyond existing performance models of artistic critique. Their focus on the operational applications of AI, which is the basis for today’s financialized capitalist system (including the cultural industry and tech companies), resonates with art theorist Marina Vishmidt’s recent call for a shift from the “critique of institutions”—dating from conceptual performance art in museums and galleries in the 1960s and culminating with Andrea Fraser’s critical museum performances in the 1990s (see Fraser 2005)—towards “infrastructural critique,” a form of critique that “exceeds the institution” (Vishmidt 2017:267). Focusing on the political economy of capitalism’s AI technologies, the three artists under consideration perform a critique of the operations of AI-steered data capitalism, a critique that exceeds, and does not per se include, the art institution.

Paglen, Nestler, and Joler’s artistic performances of critique came to the fore at a time when the “ethos of critique,” a description Elizabeth S. Anker and Rita Felski use in their work on postcritique (2017), was faced with the challenges that economic infrastructures impose on the arts and the humanities.Footnote 7 Anker and Felski propose that critique has exhausted itself; political critique, by contrast, targets capitalist operations, whereas postcritique is primarily preoccupied with literary modes of interpretation and the genre of criticism per se. The practice of critiquing AI in the arts thus hinges on the tension between artists’ critique of capitalism’s AI systems and their thematic affirmation of these same systems by making them the subject of their artworks.

As such artistic practices combine critique with affirmation of the status quo, their work represents “the passive side of all activity,” with “art enabl[ing] the active exploration of our passivity” (Seel 2019:73–75). Without doing away with the enlightening mindset that is implicit in critical artistic practices, the following case studies present three artistic performance models of visualizing, experiencing, and mapping, and provide insights into the role the artistic critique of capitalism’s technological AI operations plays in the cultural sphere.

Performance Model of Critique I: VISUALIZING

Trevor Paglen’s Sight Machine

In 2017, Trevor Paglen’s multimedia performance Sight Machine, which he made in collaboration with the renowned Kronos Quartet, premiered at Pier 70 in San Francisco.Footnote 8 David Harrington, John Sherba, Hank Dutt, and Sunny Yang formed the string quartet that played 11 titles especially written or arranged for them. Among them were compositions by John Oswald, Laurie Anderson, Raymond Scott, Terry Riley, and Steve Reich (Barbican 2019). The performance of each piece of music was aligned with a machine-learning algorithm that was given a specific task: to project onto the screen behind the ensemble an image of how the cameras perceived the performers. Paglen’s art studio team did not write the algorithms for this multimedia performance but built a software environment into which they fed several existing machine vision algorithms to test their validity (Paglen 2022). The output of the computer vision program was generated during the live performance and projected onto the stage where the performers sat. For example, when the ensemble played Islam Chipsy’s “Zaghlala,” the facial recognition algorithms were given the task of continuously perceiving and analyzing the age, gender, and emotional state of the performers. Due to the performers’ erratic gestural movements, the facial recognition algorithm failed to identify the musicians correctly. At one point during the song, David Harrington was perceived and identified by the camera vision as being 70.33% female; Sunny Yang as Batman (fig. 1); and John Sherba as being 44.0% angry and 20.0% neutral. The categories with which they were identified by the camera vision algorithm were projected on the stage, visible to the viewers of the performance. It is important to stress that the confusion between how the machine vision interpreted the identity characteristics of the performers and their actual embodied identities was neither explicitly intended nor avoided by Paglen’s team. As Paglen clarifies, “we wanted to show how bad the algorithms (developed by humans) are that everyone uses” (2022). The hybrid condition was therefore not designed to show that the chosen algorithms do not work; rather, Sight Machine’s synchronization of a live event with AI operations demonstrated how infantile, discriminatory, and slow the algorithms are.

Figure 1. Trevor Paglen, Sight Machine, 2017–present, production still. (Photo © Trevor Paglen; courtesy of the artist)
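To give a concrete sense of how such live classification overlays can be produced, here is a minimal sketch, assuming an off-the-shelf pipeline rather than Paglen’s actual software environment: a stock OpenCV face detector finds faces in a video frame, and a placeholder emotion classifier (its random scores stand in for whatever pretrained model one plugs in) supplies the kind of percentage labels, such as “44.0% angry,” that typically arise as softmax confidence scores.

```python
# Minimal sketch (not Paglen's pipeline): detect faces with OpenCV's stock
# Haar cascade, score each crop with a placeholder emotion classifier, and
# overlay the top label with its confidence, as projected in Sight Machine.
import cv2
import numpy as np

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

EMOTIONS = ["angry", "neutral", "happy", "sad", "surprised"]

def classify_emotion(face_crop: np.ndarray) -> np.ndarray:
    """Placeholder for any off-the-shelf emotion model; returns softmax scores."""
    logits = np.random.randn(len(EMOTIONS))      # stand-in for real model output
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def annotate_frame(frame: np.ndarray) -> np.ndarray:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        scores = classify_emotion(frame[y:y + h, x:x + w])
        label = f"{EMOTIONS[scores.argmax()]}: {100 * scores.max():.1f}%"
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), 2)
        cv2.putText(frame, label, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 1)
    return frame
```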

The 2017 live performance of Sight Machine is part of Paglen’s long-term research project on machine vision. The project started to take form in 2007/08, at a time when the tools of AI were largely inaccessible; only over time, with storage becoming easier to acquire and computers getting faster, was he able to make art that responds to AI technologies (Paglen 2022). Sight Machine grew out of Paglen’s residency at the Cantor Arts Center at Stanford University. For five months, he had access to various departments to conduct his research. He also received support for the live event from Metro Pictures (his New York gallery at that time), the Altman Siegel Gallery, A.I. Now, and Obscura Digital. The financial support structure of the performance sheds light on why Paglen considers himself primarily an artist who runs an art studio in Berlin, and not an activist who works towards the realization of a political program (Paglen 2022).

Paglen came into the art world’s spotlight with The Black Sites, his large-scale photographs of the CIA’s secret military bases (see Paglen 2011). Initially he approached photography from the historically productivist position of the early 20th-century avantgarde, conceiving of photography as “a political performance,” underpinned by “the right to do it and enacting that right” (Stallabrass 2011:5). Moving on from pairing a critique of state surveillance with an aesthetic practice of photography, Paglen has, in his more recent research on AI and performances made with AI technologies, foregrounded the performative role of critique. On the one hand, his work criticizes how corporate AI technologies shape everyday life; on the other hand, it sparks a reconsideration of what can today be regarded as a critical artistic performance.

“Training Humans”

In 2019, Paglen staged the exhibition Training Humans in collaboration with AI expert Kate Crawford at the Fondazione Prada in Milan. In 2017, Crawford had cofounded the AI Now Institute in New York.Footnote 9 The center is devoted to interdisciplinary research on AI and to public engagement with its social impact. It claims autonomy and does not accept funding from corporate donors (such as tech companies whose practices and products are suspect).Footnote 10 With the public display of Training Humans, Paglen and Crawford wanted to, as Paglen notes, “tell a story about the history of images used to ‘recognize’ humans in computer vision and AI systems” (Fondazione Prada 2019). The exhibition’s visual storytelling approach traced the evolution of image sets from the 1960s onwards and made transparent the relation between the digital image classifications found on the internet and applied in everyday life, and AI machine-learning operations (Fondazione Prada 2019). In her groundbreaking book Atlas of AI, Crawford does away with the popular trope of mythologizing AI. She defines AI systems as “embedded in social, political, cultural, and economic worlds, shaped by humans, institutions, and imperatives that determine what they do and how they do it” (2021:211). She stresses that “contemporary forms of artificial intelligence are neither artificial nor intelligent” as they are rooted in society’s material base:

the hard physical labor of mine workers, the repetitive factory labor on the assembly line, the cybernetic labor in the cognitive sweatshops of outsourced programmers, the poorly paid crowd-sourced labor of Mechanical Turk workers, and the unpaid immaterial work of everyday users. (2021:69)

Paglen’s political engagement with AI technologies is not limited to visual storytelling and exhibition-making but, like Crawford’s output, is available in the form of research articles online (see Paglen 2014 and 2016). In their coauthored article “Excavating AI: The Politics of Images in Machine Learning Training Sets,” published via the AI Now Institute, Crawford and Paglen outline the technical details of the relation between images sourced by companies from the internet and the training datasets on which AI systems are programmed (Crawford and Paglen 2019). Humans train the neural networks of machines through conditioning techniques to have them automatically recognize specific “objects” (digital images) and classify them under specific labels. CAPTCHA (the Completely Automated Public Turing test to tell Computers and Humans Apart) is one example of this; the program runs on nonwaged labor and low-waged data classification jobs. The most widely shared visual database, ImageNet, which was developed in the mid-2000s by academic researchers, is one of Paglen and Crawford’s key case studies to explain the workings of image data systems and how they reproduce patterns of social discrimination (Crawford and Paglen 2019). As part of their critique of ImageNet, they also launched the freely available website ImageNet Roulette. Until the website was discontinued, its users could feed selfies into the system, revealing the sexist and racist classifications such neural networks produce when they misrecognize people.
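The conditioning Crawford and Paglen describe can be made tangible with a minimal, hypothetical sketch (not their tooling, and with invented labels rather than ImageNet’s): a network is nudged, one human-labeled image at a time, toward reproducing whatever categories the annotators assigned, biases included.

```python
# Toy sketch of supervised "conditioning": the model is penalized whenever it
# deviates from the human-assigned label and so learns to reproduce it.
import torch
import torch.nn as nn

LABELS = ["cellist", "violinist", "anomaly"]   # hypothetical categories

model = nn.Sequential(                         # toy classifier over 64x64 RGB images
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 128), nn.ReLU(),
    nn.Linear(128, len(LABELS)),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

def training_step(images: torch.Tensor, label_ids: torch.Tensor) -> float:
    """One feedback pass: whatever the annotators encoded, the model absorbs."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), label_ids)
    loss.backward()
    optimizer.step()
    return loss.item()

# e.g., a batch of 8 images annotated by crowdworkers (random stand-ins here):
images = torch.randn(8, 3, 64, 64)
label_ids = torch.randint(0, len(LABELS), (8,))
print(training_step(images, label_ids))
```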

ImageNet Roulette now exists only as an art installation presented in museums and galleries. Parts of this research project were last presented in the exhibition From “Apple” to “Anomaly” at the Barbican Centre’s Curve in London as a large image wall (fig. 2) incorporating approximately 30,000 photographs from ImageNet, a database that makes sense of culture’s variety by classifying images of it. From an art historical perspective, the exhibition’s format recalls Aby Warburg’s Bilderatlas Mnemosyne (1925–1929), an attempt to group different images (photographs, photographic reproductions, diagrams, postcards, etc.) on wooden panels to trace similarities between images since antiquity. While Warburg’s experimental classifying system poses challenges to formally accepted and systematically reproduced image-based forms of representation, ImageNet Roulette relies on preprogrammed AI techniques and thereby shows how human-trained AI systems determine how digital images perform in everyday life. The initial impetus to launch ImageNet Roulette foregrounds that Paglen’s artistic research into capitalist technologies is both critique and performance, hence output oriented.

Figure 2. Trevor Paglen, From “Apple” to “Anomaly,” installation view, Barbican Art Gallery, 2019 (Photo © Max Colson)

Engineering Facial Recognition Programs

Paglen’s 2017 performance project Sight Machine moves on from two-dimensional datasets used to train image recognition systems to stage the hybrid condition of AI technologies. It is not preoccupied with the application of optimized new media per se, but with the social and political impacts AI operations have on everyday life. In Sight Machine, Paglen and his team reveal that AI technologies are incapable of reading people’s identities correctly, as the machine vision technology adds incorrect identity labels to the musicians based on its reading of their faces—which draws attention to the digital reproduction of social discrimination rooted in class, gender, and racial differences.Footnote 11 One of the key concerns about AI technologies is that they are not trained to perceive identity-based differences because they can operate without this capacity.Footnote 12 As a result, applied algorithms, such as those in the Google Photos app, have struggled to identify nonwhite people and to distinguish between different animal species.Footnote 13

Discrimination based on facial recognition techniques is rooted in the fact, as Paglen has stressed in a conversation with Alona Pardo, that corporate AI systems are the products of a “small set of elite university laboratories: spaces that in the West tend to be extremely affluent, very white, technically orientated and, of course, predominantly male” (Pardo 2019:37).Footnote 14 The elision of racial, gender, or class differences in algorithms is therefore not intentional; it follows automatically from the perceptions and valuations of the workers who produce the systems according to their own conceptions. The corporations subsequently founded by these people therefore develop biased data systems, which they then sell to governments. In real-life situations in which facial recognition techniques have to identify living bodies, as Sight Machine shows, the structural mismatches often move to the fore and require human agency to correct computational recognition, such as in the case of border control AI systems and when the technology has to identify nonwhite people. Diverse identity categories are therefore only recognized by AI technologies if the computational machines have already been trained on diverse identity categories.
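This mechanism can be illustrated with a toy, entirely synthetic sketch (no real face data, and not any vendor’s pipeline): when a single shared classifier is trained on data in which one group vastly outnumbers another, its aggregate accuracy looks fine while its error rate for the underrepresented group balloons.

```python
# Toy sketch of representational bias: one shared classifier trained where
# group A supplies 95% of the examples (and follows a different pattern than
# group B) ends up far less accurate for the minority group. Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, rule):
    X = rng.normal(size=(n, 2))
    return X, rule(X).astype(int)

# the two groups happen to follow different feature-label patterns
X_a, y_a = make_group(9500, lambda X: X[:, 0] > 0)
X_b, y_b = make_group(500, lambda X: X[:, 1] > 0)

model = LogisticRegression().fit(np.vstack([X_a, X_b]),
                                 np.concatenate([y_a, y_b]))

for name, X, y in [("group A (95%)", X_a, y_a), ("group B (5%)", X_b, y_b)]:
    print(name, "accuracy:", round(model.score(X, y), 2))
# typical output: roughly 0.98 for group A and roughly 0.55 for group B;
# the model "works" on average while failing the underrepresented group.
```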

While Paglen focuses on the conditioning techniques of AI and makes visible how AI techno-performances reproduce visual representations of social discrimination, Gerald Nestler and Sylvia Eckermann create participatory kinesthetic experiences that draw attention to how AI systems depend on and impact how people perform (physically), as well as to the financial base structure of AI technologies.

Performance Model of Critique II: EXPERIENCING

Gerald Nestler and Sylvia Eckermann’s The Future of Demonstration

In response to society’s “algorithmic governance” (Schuilenburg and Peeters 2021), which shapes everyday life and perceptions of reality, the artists Gerald Nestler and Sylvia Eckermann produced the performance event series The Future of Demonstration at the Reaktor in Vienna in October 2017 and at the Atelier Augarten in October 2018.Footnote 15 The two-part series was funded by a mélange of public money from the Stadt Wien and the federal government of Austria as well as by private money from the insurance company UNIQA. Comprising eight independent evening-long events, the performances were staged throughout the entire building of each venue, and viewers could move around independently.Footnote 16 The Future of Demonstration was split into two seasons, based on the themes Vermögen (capability) and Passion. Each evening was devoted to a certain thematic constellation and made with a different group of artists, filmmakers, architects, theorists, scientists, and other experts. Each episode assembled an audience that walked among the different performative-discursive presentations, often participating. The participatory performance event had the goal of “engaging in renegade agency as artistic devices for transgressing critique towards new forms of insurrection” (Eckermann and Nestler 2018a). Making the Black Box Speak, the last event of The Future of Demonstration (Passion), opens with the actress Anna Mendelssohn reciting Nestler’s words:

What you see is not what you get. In fact, you see next to nothing. Your vision is blurred. Not by distraction but by deception. Distraction entertains you. By giving you something to hold on to, to bond with. And you enjoy the noise because it makes information so much more interesting. Deflection moves you suddenly away from yourself and those you might bond with […] Now what is happening is you. You are performed. And this is your bondage for which you pay dearly with each microsecond, which they call your life. Now, you are no longer prey to representation. It is all performative and affective. The stage is all yours. (Eckermann and Nestler 2018b)

This extract of the performance’s introductory text contains the logic of 21st-century techno-capitalism, which performs its power through deeply rooted cybernetic techniques. The field of cybernetics, in whose development Norbert Wiener played a key role, emerged during the Cold War in the aftermath of WWII to develop “a machine for predicting and monitoring the positions of enemy aircraft for the purpose of destroying them” (Tiqqun [2001] 2020:39). Wiener’s mathematically developed cybernetic feedback system was ultimately intended to produce self-regulation in and across living beings (humans and animals) and their environment by steering feedback mechanisms that mimic the organic nervous system (see Wiener 1961). Before Wiener, the Russian physiologist Ivan Pavlov had run lab experiments on the gastric function of dogs and children at the turn of the 20th century, which became widely known as classical conditioning. Subsequently, the American scientist B.F. Skinner experimented with the behavior of animals (such as rats, dogs, and pigeons) to learn more about the functioning of the human body through “operant conditioning” (1938), and about how it can be steered to implement particular morals on a societal level. Focused on controlling external “bodies,” cybernetics has thus been aimed at steering the kinesthetic sense of living beings and of the objects that humans operate. As the artists’ collective Tiqqun notes, the techno-science of cybernetics aimed first to determine the position and then the behavior of a body or object in order to do away with the difference between its actual and desired behavior ([2001] 2020:39).

In the second half of the 20th century, first- and second-wave theories of cybernetic conditioning mechanisms (Wiener 1961; von Foerster 1984) and the search for self-regulating (human and machine) systems developed alongside neoliberal thought and economic theory, paired with the power of the state. First- and second-wave cybernetic theory and the invention of computational machine-based algorithms arrived in step with the emerging market-based ideology referred to as neoliberalism (Mirowski 2002; Slobodian 2018:224–35).Footnote 17 By the turn of the 21st century, capitalism had increasingly incorporated these mechanisms by tracking how humans have performed digitally since the rise of the personal computer in the 1980s. As people’s digital performances are the motor of data capitalism, algorithms are, Gerald Raunig stresses, always on the hunt for what lies outside of their territory (2016:14). Algorithmic operations, like humans and animals, are not black boxes, as many have claimed, but carefully engineered systems that depend on and consequently manipulate people’s behaviors in order to keep them, for as long as possible, on their digital interfaces as active users, who are always also potential consumers of the products and services on display.
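The behavioral feedback loop described here can be sketched, in deliberately toy form, as an engagement-maximizing bandit; this is an illustrative construction, not any platform’s actual recommender, and the content categories and watch-time figures are invented for the example.

```python
# Minimal sketch of the conditioning loop: an epsilon-greedy bandit keeps
# recommending whatever content has held a simulated user's attention longest.
import random

CONTENT = ["news", "outrage", "cats", "ads"]

class EngagementMaximizer:
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.total_time = {c: 0.0 for c in CONTENT}    # observed watch time
        self.shown = {c: 1 for c in CONTENT}           # impressions (start at 1)

    def recommend(self):
        if random.random() < self.epsilon:             # explore occasionally
            return random.choice(CONTENT)
        # otherwise exploit: pick what has held attention best per impression
        return max(CONTENT, key=lambda c: self.total_time[c] / self.shown[c])

    def observe(self, content, seconds_watched):
        self.shown[content] += 1
        self.total_time[content] += seconds_watched    # behavioral feedback

def simulated_user(content):
    """Stand-in for a tracked user; 'outrage' happens to hold attention longest."""
    mean = {"news": 20, "outrage": 45, "cats": 30, "ads": 5}[content]
    return random.expovariate(1 / mean)

recommender = EngagementMaximizer()
for _ in range(5000):
    c = recommender.recommend()
    recommender.observe(c, simulated_user(c))
print(max(CONTENT, key=lambda c: recommender.shown[c]))  # what the loop converges on
```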

The Digitalization of Financial Markets

Following Mendelssohn’s delivery of Nestler’s text at the beginning of Making the Black Box Speak, dancer Florentina Holzinger entered the building from above by breaking through the high glass ceiling. She fell all the way down onto a white mattress lying on the floor, around which the viewers stood, and began a stunt fight with Nina Porzer. Her physical performance was followed by that of rapper Soulcat E-Phife, who sang:

From traitor to educator, find what emancipates her, interrogate her but you cannot break her.

She’s the 1 percent not the 99 […]

I embrace the risk, do you all share in? Get informed to win. (Eckermann and Nestler 2018b)

Soulcat’s vocal performance is followed by Haim Bodek’s insightful talk about his profession (fig. 3) as a high-frequency trader, expert in automated financial markets, and whistleblower. As a high-frequency trader, he and his team develop the kind of automated AI operations on which the financial system operates today, and which, in turn, depends on the private corporations that develop them. As Nestler stressed at the video screening and panel discussion of “(Art)Activism and AI” that I curated and moderated in Vienna: “AI is in private hands. Alphabet, Google’s parent company, spends over 31 billion on AI a year” (Nestler 2022). Google, “a leviathan corporation” to use Julian Stallabrass’s term (1995:30), has more than 90% of the market share of search engines and operates on factors such as paid advertising and user optimization. For artistic performances that critique the capitalist technologies used in governmental and corporate operations, the devil lies in knowing the details of AI’s political economy; for the corporations, the key to financial success is knowing more about their users’ identity and behavior. As Bodek reveals in his performance lecture at the live event, he makes:

a living out of knowing more about the details. These details are hidden in complexity […W]e say we put intelligence into the black box but what we really mean is that we need an information advantage over you. […] We need you to make the wrong decision. We need you to be at a disadvantage. We need you to not know how things work. (2017)

Figure 3. Video still from Sylvia Eckermann/Gerald Nestler, “Making the Black Box Speak.” The Future of Demonstration, season 2, episode 3. Atelier Augarten, Vienna, 2018, thefutureofdemonstration.net/passion/e03/. (Courtesy of Gerald Nestler)

Bodek’s transparent self-introduction demystifies the notion of the technological black box of financial markets while making transparent his Janus-faced role as a capitalist and whistleblower. He is a long-term collaborator of Nestler’s and also appeared in Nestler’s performance event Instanternity: A Black Box Body Cult at the Freies Theater in Innsbruck in 2017. There, in the west of Austria, he gave a performance-lecture together with Nestler focused on the operations of financial markets. Bodek and Nestler unpacked abstract concepts such as volatility, leverage, liquidity, and arbitrage that form derivative finance by writing on a whiteboard and talking to their audience. At the same time, three dancers moved through the space, interacting with and touching the audience in order to, according to Nestler’s concept, performatively transfer this insider knowledge from body to body (Nestler 2022).

Nestler and Eckermann’s critical staging of knowledge about techno-capitalism’s operations in the form of a large public physical performance event is a continuation of their individual and collaborative work on the operations of financial markets and their experimentation with new media. Eckermann is a self-taught artist; in 2014 she was the first artist to receive the City of Vienna’s media art award, and in 2018 she won the Austrian State art prize for media art for her pioneering work in digital space, which explored the link between digital and analog spaces at a time when cyberspace was becoming increasingly capitalized (see Eckermann 2014). Nestler, in contrast, studied painting and has approached what he terms the “derivative condition” through artistic and academic research on how digital technologies have reshaped financial markets, and how finance has consequently reshaped technologies, since the late 1990s. In the second half of the 2010s, he continued this research as part of the research group Forensic Architecture at the Centre for Research Architecture at Goldsmiths, University of London, where he earned a PhD-in-Practice in 2017.

A key work predating The Future of Demonstration derives from Nestler’s PhD research. In the video Countering Capitulation: From Automated Participation to Renegade Solidarity (2014), Nestler draws on his earlier stint as a broker and trader from 1994 to 1997 and explores the contested historical phenomenon of the flash crash (fig. 4). The US flash crash of May 2010 was, at the time, the biggest one-day market decline (and recovery, some 15 minutes later) in history, believed to have been triggered by a trader participating in automated market operations. Nestler’s video talks the viewers through the time-based operations of automated computer-based high-frequency trading (HFT), in which transactions are executed in microseconds; time is a key factor of algorithmic trading, as it is responsible for capturing the consistent flows of capital and hence for its growth. As the artist collective Tiqqun noted nine years before the US flash crash, “cybernetic capitalism” maximizes circulation by speeding trading operations up almost to the speed of light (2020:65). Because the whole financial system has been switching to algorithmic systems since the late 1990s, human agency can, as the 2010 flash crash shows, potentially still have an impact on the whole AI system, similar to how the performance of financial markets impacts political, economic, cultural, and social lives.Footnote 18 Nestler’s artistic performance of critique is driven by what he calls “renegade activism.” In a research paper, he theorizes his artistic research practice, stressing that it establishes “alliances with those that make the black box speak from inside,” that is, with dissident expert figures who betray the system from within (2020:121). Understanding the artist-as-collective, Nestler and Eckermann’s two-part performance event, The Future of Demonstration, represents a temporary body of work that draws critical attention to AI’s political economy.

Figure 4. Video still from Gerald Nestler. Countering Capitulation. From Automated Participation to Renegade Solidarity. High-frequency trading and the forensic analysis of the Flash Crash, 6 May 2010, single channel video, 11:20 min., 2013–14, produced for the Forensic Architecture exhibition, Forensis at the Haus der Kulturen der Welt, Berlin, curated by Anselm Franke and Eyal Weizmann, 2014, vimeo.com/channels/AoR. (Courtesy of Gerald Nestler; courtesy of Nanec LLC)
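How microsecond-scale feedback among automated traders can produce the kind of sudden collapse and rebound Nestler forensically unpacks can be suggested with a deliberately crude toy simulation; the figures and decision rules below are invented for illustration and bear no relation to the actual order flow of 6 May 2010.

```python
# Toy sketch, not a reconstruction of the 2010 event: a brief automated sell
# program, amplified by momentum algorithms that sell harder the faster prices
# fall, collapses the price; value-seeking buyers then pull it back up.
def simulate(ticks=600, start=100.0):
    price, prev_ret = start, 0.0
    path = [price]
    for t in range(ticks):
        drawdown = max((start - price) / start, 0.0)
        sell_program = 0.004 if 100 <= t < 130 else 0.0   # the initial shock
        momentum_sell = 0.5 * max(-prev_ret, 0.0)         # feedback amplification
        value_buy = 0.05 * drawdown                       # contrarian buying
        ret = value_buy - sell_program - momentum_sell
        price *= 1.0 + ret
        prev_ret = ret
        path.append(price)
    return path

path = simulate()
print(f"low: {min(path):.1f}, final: {path[-1]:.1f}")     # V-shaped crash and recovery
```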

The Performance of Critique

The terms “insurrection” and “demonstration,” both used by Nestler and Eckermann, are key to their critical art-activist agenda. Just as the notion of performance describes the activities of techno-capitalism that they critique, the nature of their performance event series presents their critical approach to artistic performances by enabling new encounters for audience members. In contrast to Paglen, who performs as an artist by publishing research and staging exhibitions that make AI’s operations transparent, Nestler and Eckermann perform as an artistic duo who critique the operational performativity of AI technology by creating large public performance productions. As both make AI operations transparent, their artistic practices of critique proceed from a productivist and enlightening logic. For Nestler, transparency is not the aim of the work; instead, he observes that transparency “has come under extreme pressure and the logics of techno-capitalism have thus become a threat to the body politic” (2021:173). Following how he describes his own artistic practice, his artistic performances in the culture industry are more about “finding collective forms, because even when we have transparency, we realize that it is always passive” (Nestler 2022).

As the act of making capitalism’s AI mechanisms transparent remains economically passive, artistic performances of critique fall into a mode of performing what Fred Moten, following an idealist strand of thinking, refers to as “nonperformance” (2015). For him, humans can refuse to perform because of their freedom. Despite the ideal of absolute freedom that Moten addresses, humans (including artists) have to perform to make a living; to perform efficiently, artists’ performance of critique has to operate as an effective performance within the economic system itself. Artistic performances that provide critical insight into capitalism’s AI operations can, perversely, only present insider knowledge when they actively participate in the attention economy; and the alignment of the subject of artistic work with one of capitalism’s leading industries implies that artistic performances are culturally empowered if they are relevant for society at a given moment in time. However, in financial terms, critical artistic insights into the operative functions of data capitalism remain without impact in contrast to the AI industry, as they do not contribute to the increasing digital transformation of society’s infrastructures in the name of macro-economic progress. The social potential of artistic performances of critique therefore lies in their ability to communicate insider knowledge to a public lay audience.

The second model of critique performed in The Future of Demonstration, “experiencing,” makes an audience aware of the invisible power of capitalist corporations working on and with AI by producing a shared kinesthetic experience. Nestler suggests a move from critique and aesthetics, focused on perception, to processes of making, and refers to his and Eckermann’s artistic and research method as a “poetics of resolution” (2021:212–13).Footnote 19 This collective approach to both the producer and the consumer aspects of artistic work deals with the current technological condition that shapes the “condition of production” and consequently, as Walter Benjamin notes, the “social conditions” ([1934] 2005:699). In contrast, Vladan Joler’s critical investigation into the social and material resources underlying the production of AI systems attempts to grasp the multiple layers of the complex production process and the user-producer relation by graphically mapping knowledge about their operations.

Performance Model of Critique III: MAPPING

Vladan Joler’s New Extractivism

The form in which artists critique the condition of the technology-based political economy reflects their working method. The focus on method in art discourse over the past 20 years points to, as Alexander Galloway pinpoints, “that moment in history when knowledge becomes production, when knowledge loses its absolute claims to immanent efficacy, when knowledge ceases being intuitive and must be legitimized via recourse to some kind of metadiscourse” (2014:108). Joler’s animation video, New Extractivism (2020), is representative of the knowledge-production-metadiscourse problem of the 21st century that Galloway describes. Since the transformation of the internet into a mass medium in the early 1990s, educational and public institutions have, on the one hand, increasingly digitized knowledge to make it widely accessible; on the other hand, internet users have increasingly accepted the unlimited digital content offered by formally approved websites as valid information, free of charge in exchange for their personal user data.

Joler’s educational video sets out to provide “one semi-coherent picture, or let us say a map, a worldview” by assembling objects that form the complex layers and networks of the current technological condition (Joler 2020). It comprises 33 guiding points and is extended by 40 footnotes. These points include the “digital labor triangle”; “the allegory of the cave,” which alludes to Plato’s dystopic parable; “dividuals,” a term used by Gilles Deleuze to describe the hybrid condition of human identity in which an individual exists in various forms, constellations, contexts, and at different places at the same time; and “the extraction” of data, nature, and labor. Joler’s first digitally animated map was commissioned for the 2021 online exhibition Open Secret at KW Institute for Contemporary Art in Berlin. Using video animation as a storytelling device, New Extractivism delves into the complex networks and operations that produce AI technologies which, in turn, underlie the interface of the world wide web. It takes the physical concept of “gravity” as its starting point to analyze how the biggest techno-monopolies, Alphabet and Amazon (based on the number of users and content), extract and then create value through the subsumption of human activities and material resources.

The video traces the passive performance of a single (male) human figure in the face of social isolation and relates his position to the networks of the global economy, in which the history of colonialism is still incorporated. In the beginning, we see an “imaginary hero” who unsuccessfully tries to swim against the gravity of the “black hole” produced by companies that extract data from human behavior and then sell it to companies and governmental bodies (fig. 5). The computer-generated voice of the male narrator explains that “the cost of opting out has become so high that opting out has essentially become a fantasy [and that] the social and economic price of leaving these platforms is becoming too high” (Joler 2020). Then we see the “human hero” entering the “allegory of the Cave,” where he is caged in a transparent cube and exposed to a constant flow of digital animations of the world, which monitors each and all of his (re)actions and emotions—and also transforms them into data. The human figure seems to be lost, as his posture suggests: his upper body and head are bent forward, and his gaze is fixed on the floor as if there were nothing left for him to see or sense kinesthetically.

Figure 5. Vladan Joler, New Extractivism, https://extractivism.online, 2020. (Courtesy of Vladan Joler)

Performing Knowledge Critically

New Extractivism’s dystopian visual storytelling about the operations of 21st-century techno-capitalism expands Joler’s previous cartographic work. In 2018, for example, he collaborated, like Paglen, with Kate Crawford and the AI Now Institute. Together they made Anatomy of an AI System, a large-scale map and essay on capitalist AI systems, taking as its starting point Alexa, the voice assistant of the Amazon Echo, a human-AI interaction interface (Crawford and Joler 2018).Footnote 20 Read from left to right, the static map begins with income distributors, assemblers, mines, and material elements (forming the Marxian base-structure of the digital extraction economy). It then expands to the “human operator,” who adds data to the operation at the end of the production chain, and moves via the quantification of “nature” (such as lithium) and the exploitation of data (fig. 6) to the material leftovers that are partly toxic, electronic, or metal. These materials are what enable digital surplus value extraction to take place in the first place. In the essay accompanying the large black-and-white map, Crawford and Joler note that the scale underlying the consumer voice-command tool is “almost beyond human imagining,” and ask: “how can we begin to see it, to grasp its immensity and complexity as a connected form?” (2018).

Figure 6. Vladan Joler, Facebook Algorithmic Factory, from NOVI EKSTRAKTIVIZAM: o mašinama, eksploataciji ljudi i prirode (New Extractivism: On Machines, the Exploitation of People and Nature), MSUV. Photograph. (Courtesy of Vladan Joler)

Borrowing terms and concepts from existing research papers and monographs, such as Sandro Mezzadra and Brett Neilson’s theory of “extractivism” in “contemporary capitalism” (2017), Joler’s artistic critique of the capitalist operations that produce, train, and depend on AI operations and social exploitation follows a diagrammatic strategy of visualization. He does away with critical theory and instead gathers existing scientific knowledge about AI’s political economy and ecology, groups it coherently together, and thereby produces his own working aesthetics. Joler’s maps and accompanying essays do not present new scientific research per se: his maps neither follow scientific standards in the sense of presenting a concept or argument stemming from raw material, nor do they produce a traditionally marketable art object that has no other function than to be aesthetically pleasing. Instead, Joler’s method of visual mapping critically investigates the totalizing (note the difference from totalitarian) operations of AI’s social and material means of production and techniques.

The form and content of Joler’s artistic performance practice are codependent. The way he combines the critique of and knowledge about AI systems recalls Fredric Jameson’s speculative call for “an aesthetics of cognitive mapping” from the late 1980s to debate political issues from a theoretical position (Jameson 1988). In 1991, just three years before internet service providers were founded, Jameson observed that society was struggling against a loss of orientation within the increasingly globalized and abstract capitalist operation, and that the practice of mapping, like a concept, has its limits because it is “drawn back by the force of gravity of the black hole of the map itself […] and therein cancels out its own impossible originality” (1991:279). It is important to stress that Jameson’s concept of cognitive mapping is not to be represented by practices of actual mapping (275), but should be understood as a “reflexive form of ‘theoretical discourse’” (44).

Artistic production that critiques the technological operations of the political economy, like academic discourse itself (including this article), exemplifies how increasing and accelerated cognitive production has led to a decrease in the production of substantial knowledge, with both having to be legitimized through a metadiscourse in order to justify their existence and their claim to (re)produce societal relevance. The footnotes and concepts of Joler’s New Extractivism use cross-referencing as a metadiscourse to affirm the work’s validity.

While referencing is an artistic and scholarly method, networking is one of data capitalism’s growth strategies. The appropriation of this technique in the arts and humanities brings back to mind the performance of the human protagonist in New Extractivism, who passively experiences rather than actively shapes society’s means of production. Similar to how members of the audience are regarded as passive entities in the theatre, Joler’s unheroic human character glides through different events without being able to interfere with the video’s narrative. Given the totalizing digitization of all spheres of everyday life (food shopping, banking, booking doctor appointments, and online activities such as streaming music or videos), cultural production and its self-reflexive critique are not exempt from the digital hybrid condition. Just as citizens have, to a certain degree, to be daily users of digital platforms in order to participate in (Western) representative democracy, Joler provocatively notes in his video that “[d]igital identity labor [not paid labor] is the forced labor of the 21st century” (2020).

Joler has also been teaching from New Extractivism at the Academy of Arts at the University of Novi Sad (Serbia), where he holds a professorship in the New Media program (focusing on video and media arts) founded in 2020. As the video covers several complex layers of data capitalism and its operations, he is able to teach from it for a whole seminar (Joler 2022). Joler’s artistic method of mapping used for educational purposes goes back to his art school training during the Yugoslav wars in the 1990s, when he was involved in activist happenings around the university. While new media approaches to cybernetic inventions were experimental in the long 1960s (see e.g., Higgins and Kahn 2012), Joler’s work performs an activist approach to both artistic and scholarly work that also drives the independent artistic organization SHARE LAB, which he cofounded.Footnote 21 His political engagement with capitalism’s operative functions of AI approximates artistic critique to artistic labor in order to make sense of the current conditions of techno-capitalism.

Carry On

The increasing application of capitalism’s AI technologies has drastically changed our kinesthetic experience of everyday life and work. Similar to how technology changes “the concept of sense” in our information society, Hörl notes, it also reorients “the entire sense culture” (2015:3). As Joler, Nestler/Eckermann, and Paglen’s artistic performances of critique operate in the cultural field and are presented as art, their critical engagement with the latest developments of capitalism’s operative functions and their public artistic performances enact their political practices as educative aesthetic experiences. Their works do not follow a totalizing enlightening strategy. Still, as is the nature of critique, these artists make the viewers of their work conscious of the normalized operations of AI technologies in everyday life.

If the three artistic performance models of critique (visualizing, experiencing, and mapping) constitute the “progressive intelligentsia,” a term used by Benjamin to describe people interested in “serving the class struggle” ([1934] 2005:774) and thereby, ideally, in reducing society’s asymmetrical wealth distribution, then digital interfaces and AI technologies must not, like capitalism, be regarded as automated black boxes in which human agency plays no decisive role. The implementation of AI technologies on an infrastructural level represents the performance of specific human interests in the profit-seeking political economy. As Joler, Paglen, and Nestler/Eckermann perform their human agency to critique AI’s operative functions in the current capitalist system, their artistic work has absolutely nothing to do with an uncritical understanding of artistic practice that aesthetically reproduces fashionable utopic theories of posthuman ethics and fantasies.

While posthuman theories draw attention to the fact that the human is only one among other nonhuman, self-regulating, and “intelligent” entities (e.g., Bridle 2022), humans’ historically developed neoliberal humanism remains the driving force of growing wealth and power imbalances. In the face of the increasing mass use of information and communication technologies in Western societies, is there even a point in asking whether (privileged artistic) critique, performed through human agency, has an impact on the increasingly totalizing, automatized algorithmic operations of the political economy and thereby of everyday life and work?

So what is it that continues to bring forth artistic performance models of critique? The three models mapped here should be understood as representing a new paradigm of the performance of artistic critique: they marry the aesthetic desire to produce art with activist endeavors to permeate culture’s normatively conditioned public sphere. Writing about “performance activism,” Dan Friedman, quoting Richard Schechner (2002), notes that if “ritual performance ‘makes belief’ and performance in the theater ‘makes believe,’ performance in performance activism ‘makes-anew’” (2021:14). In a recent roundtable discussion on radicality, Boris Groys optimistically observes that if we see “a revolt of life against the dead weight of technology and against the technologically based social order,” then there is a promise of new radical art that turns to “life as its actual root” (in Alberro et al. 2021:48). As the human is the root of capitalist operations and continues to perform within them, artistic critique has not fully exhausted itself. It carries on being actively performed as artistic labor, with its social value and effectiveness remaining beyond measure.

Footnotes

1 Evgeny Morozov has criticized Zuboff’s work, noting that “the problem with Zuboff’s account of dispossession-obsessed ‘surveillance capitalism’ is that it is constitutionally incapable of grasping just how the non-capitalist digital economy might operate in the future. As a result, it has no radical political agenda except for some vaguely liberal demands for undefinable things like ‘the right to the future.’ In pathologizing the ongoing extractivist side of contemporary digital capitalism, Zuboff’s critique normalizes its non-extractivist dimension” (2022:112).

2 On 30 November 2022, these companies were listed among the 12 largest companies in the world on the Forbes Global 2000, an index based on an assessment of their sales, profits, assets, and market value (see Murphy and Contreras 2022).

3 The artists work with media distinct from the 12 categories of media applied in performance that David Z. Saltz started to map in 2001, before digitization became an increasing part of everyday life practices (2015:93–125).

4 For more on this topic see Tom Holert’s Knowledge Beside Itself (2020).

5 Also, performer Mårten Spångberg stressed with regard to the performing arts: “From the product and image intensive period of the 1980s, following a period of politically orientated work, the 1990s and early 2000s will most probably be remembered as the era of research” (2010).

6 Two other prominent critical artists could also be added to the list: Hito Steyerl (PhD in philosophy from the Academy of Fine Arts Vienna) and Eyal Weizman (PhD in architecture from the London Consortium).

7 For a postcritique approach to critique addressing the field of science see Latour (2004).

8 The live performance stems from Paglen’s video installation Image Operations. Op.10. Sight Machine was also performed at the Holland Festival in Amsterdam (2018), at the Smithsonian American Art Museum in Washington, DC (2018), and at the Barbican Centre in London (2019). Coincidentally(?), a San Francisco–based company (founded in Michigan in 2011) that analyzes and produces data platforms is also called Sight Machine.

9 In The Atlas of AI, Crawford focuses on the material and social means implicit in the production of AI technologies. Her study of “data capitalism” defines AI systems as being “embedded in social, political, cultural, and economic worlds” that are “shaped by humans, institutions, and imperatives that determine what they do and how they do it” (2021:211).

10 See the institute’s website at ainowinstitute.org/about.

11 For a key study on how certain technologies are “ideologically shaped by the operation of gender interests and, consequently, how they serve to reinforce traditional gendered patterns of power and authority” see Balsamo (1995:10).

12 See Wendy Hui Kyong Chun’s detailed study of how computational operations have historically reproduced social discrimination (2021).

13 This made the news in 2015 when Google’s app made a “racist blunder” (BBC 2015). For a focused study of the relation between race and technology see the special issue of Camera Obscura edited by Wendy Hui Kyong Chun (2009). Also see Simone Browne’s outline of the historical reproduction of “prototypical whiteness,” in which she argues that “Digital epidermalization is the exercise of power cast by the disembodied gaze of certain surveillance technologies (for example, identity card and e-passport verification machines)” (2010:135).

14 Paglen’s collaborator Kate Crawford refers to the fact that AI systems depend on how humans program their operations as a “data problem” (2016:11).

15 Dates of the two parts of The Future of Demonstration: Vermögen, 31 October–11 November 2017; Passion, 20–25 October 2018. See thefutureofdemonstration.net.

16 The arts series was funded by the Media Art Festival of the City of Vienna.

17 For a neat study of the relation between artistic performances, neoliberalism, and the liberal subject in “computer choreography” see Eacho (2021).

18 For an analysis of the performativity of markets in relation to the work of the sociologist Michel Callon and ANT (actor-network theory) see Schröter (2017).

19 The concept of “Aesthetic Commons” resonates with Nestler and Eckermann’s approach (see Sollfrank et al. 2021).

20 The visual map of Anatomy of an AI System was exhibited in several art institutions, such as the ZKM Center for Art and Media in Karlsruhe (2018), the Frankfurter Kunstverein (2020), and MoMA (May 2022–ongoing).

21 SHARE LAB is based in Serbia (Novi Sad and Belgrade) with members Vladan Joler, Olivia Solis Villaverde, Andrej Petrovski, Dušan Ostraćanin, and Milica Jovanović. Their first case study was Facebook.

References

Alberro, Alexander, Homi Bhabha, Alejandra Castillo, Keti Chukhrov, T.J. Demos, Keyna Eleison, Irmgard Emmelhainz, Darby English, Patrick Flores, Jennifer A. González, Boris Groys, Tom Holert, Andreas Huyssen, Amelia Jones, David Joselit, Joan Kee, Nicholas Mirzoeff, Peter Osborne, John Roberts, Nizan Shaked, Terry Smith, Kristine Stiles, Ming Tiampo, and Anne M. Wagner. 2021. “What is Radical? Roundtable.” ARTMargins 10, 3:8–96. doi.org/10.1162/artm_a_00301
Anker, Elizabeth S., and Rita Felski, eds. 2017. Critique and Postcritique. Duke University Press.
Balsamo, Anne. 1995. Technologies of the Gendered Body: Reading Cyborg Women. Duke University Press.
Barbican. 2019. Evening program for Sight Machine, 11 July. Barbican Hall, London.
BBC. 2015. “Google Apologises for Photos App’s Racist Blunder.” BBC News, 1 July. www.bbc.com/news/technology-33347866
Beech, Dave. 2015. Art and Value: Art’s Economic Exceptionalism in Classical, Neoclassical and Marxist Economics. Brill.
Benjamin, Walter. (1934) 2005. “The Author as Producer.” In Walter Benjamin, Selected Writings. Vol. 2, Part 2, 1931–34. Trans. Rodney Livingstone et al., ed. Michael W. Jennings, Howard Eiland, and Gary Smith, 768–91. Harvard University Press.
Bodek, Haim. 2017. “Instanternity Vorbrenner 17: A Black Box Body Cult.” Lecture-Performance, 6 April. vorbrenner.org/black-box-body-cult-gerald-nestler/
Bridle, James. 2022. Ways of Being: Beyond Human Intelligence. Penguin Books.
Browne, Simone. 2010. “Digital Epidermalization: Race, Identity and Biometrics.” Critical Sociology 36, 1:131–50. doi.org/10.1177/0896920509347144
Chun, Wendy Hui Kyong. 2009. “Introduction: Race and/as Technology; or, How to Do Things to Race.” Camera Obscura 70, 24:7–35. doi.org/10.1215/02705346-2008-013
Chun, Wendy Hui Kyong. 2021. Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition. The MIT Press.
Crawford, Kate. 2016. “Artificial Intelligence’s White Guy Problem.” New York Times, 25 June. www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html
Crawford, Kate. 2021. Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
Crawford, Kate, and Joler, Vladan. 2018. “Anatomy of an AI System: The Amazon Echo as an Anatomical Map of Human Labor, Data and Planetary Resources.” AI Now Institute and Share Lab, 7 September. anatomyof.ai
Crawford, Kate, and Paglen, Trevor. 2019. “Excavating AI: The Politics of Images in Machine Learning Training Sets.” AI Now Institute, 19 September. excavating.ai
Eacho, Douglas. 2021. “Scripting Control: Computer Choreography and Neoliberal Performance.” Theatre Journal 73, 3:339–57. doi.org/10.1353/tj.2021.0071
Eckermann, Sylvia. 2014. Algorithmisiert. Artist catalog. Czernin. issuu.com/eckermann/docs/sylvia_eckermann
Eckermann, Sylvia, and Nestler, Gerald. 2018a. The Future of Demonstration. thefutureofdemonstration.net/passion/index.html
Eckermann, Sylvia, and Nestler, Gerald. 2018b. “Making the Black Box Speak.” The Future of Demonstration S2E3. vimeo.com/297599065?embedded=true&source=vimeo_logo&owner=33570491
von Foerster, Heinz. 1984. Observing Systems. Intersystems Publications.
Fondazione Prada. 2019. Training Humans, exhibition booklet, #26. Milan, Fondazione Prada, 12 September 2019–24 February 2020.
Fraser, Andrea. 2005. “From the Critique of Institutions to an Institution of Critique.” Artforum, September. www.artforum.com/print/200507/from-the-critique-of-institutions-to-an-institution-of-critique-9407
Friedman, Dan. 2021. Performance Activism: Precursors and Contemporary Pioneers. Palgrave Macmillan.
Galloway, Alexander R. 2014. “The Cybernetic Hypothesis.” differences: A Journal of Feminist Cultural Studies 25, 1:107–31. doi.org/10.1215/10407391-2420021
Graw, Isabelle, and Christoph Menke, eds. 2019. The Value of Critique: Exploring the Interrelations of Value, Critique, and Artistic Labour. Campus Verlag.
Higgins, Hannah B., and Douglas Kahn, eds. 2012. Mainframe Experimentalism: Early Computing and the Foundations of the Digital Arts. University of California Press.
Holert, Tom. 2020. Knowledge Beside Itself: Contemporary Art’s Epistemic Politics. Sternberg Press.
Hörl, Erich. 2015. “The Technological Condition.” Trans. Anthony Enns. Parrhesia 22:1–15.
Jameson, Fredric. 1988. “Cognitive Mapping.” In Marxism and the Interpretation of Culture, ed. Cary Nelson and Lawrence Grossberg, 347–57. University of Illinois Press.
Jameson, Fredric. 1991. Postmodernism, or, The Cultural Logic of Late Capitalism. Duke University Press.
Joler, Vladan. 2020. New Extractivism: An Assemblage of Concepts and Allegories. extractivism.online
Joler, Vladan. 2022. Zoom conversation with author, 7 December.
Latour, Bruno. 2004. “Why Has Critique Run Out of Steam? From Matters of Fact to Matters of Concern.” Critical Inquiry 30, 2:225–48. doi.org/10.1086/421123
McKenzie, Jon. 2001. Perform or Else: From Discipline to Performance. Routledge.
Mezzadra, Sandro, and Neilson, Brett. 2017. “On the Multiple Frontiers of Extraction: Excavating Contemporary Capitalism.” Cultural Studies 31, 2–3:185–204. doi.org/10.1080/09502386.2017.1303425
Mirowski, Philip. 2002. Machine Dreams: Economics Becomes a Cyborg Science. Cambridge University Press.
Morozov, Evgeny. 2022. “Critique of Techno-Feudal Reason.” New Left Review 133/134:89–126.
Moten, Fred. 2015. “Blackness and Nonperformance.” Afterlives: The Persistence of Performance. The Museum of Modern Art, YouTube, 25 September. www.youtube.com/watch?v=G2leiFByIIg
Murphy, Andrea, and Contreras, Isabel. 2022. “Forbes Global 2000 List 2022.” Forbes, 12 May. www.forbes.com/sites/forbesstaff/2022/05/12/forbes-global-2000-list-2022-the-top-200/?sh=1eebe77a3290
Nestler, Gerald. 2020. “Contingent Claims: The Performativity of Finance, or How the Future Materializes in Technocapitalism.” Performance Research 25, 3:114–22. doi.org/10.1080/13528165.2020.1807770
Nestler, Gerald. 2021. “Countering Capitulation: An Arts-Based, Postdisciplinary Approach to Resolving Non-Transparency.” In Retracing Political Dimensions: Strategies in Contemporary New Media Art, ed. Oliver Grau and Inge Hinterwaldner, 173–93. De Gruyter.
Nestler, Gerald. 2022. (Art)Activism in the 21st-century. Video screening and panel discussion with Gerald Nestler and Vladan Joler, curated by Lisa Moravec. Belvedere 21, Vienna, 12 November.
Nestler, Gerald, and Eckermann, Sylvia. 2021. “Against Platform Non-Transparency: A Politics of Resolution for Renegade Activism.” In Platform Urbanism and Its Discontents, ed. Peter Mörtenböck and Helge Mooshammer, 209–16. nai010 publishers.
Paglen, Trevor. 2011. “The Expeditions: Landscape as Performance.” TDR 55, 2 (T210):23. www.jstor.org/stable/23017613
Paglen, Trevor. 2014. “Operational Images.” e-flux journal 59 (November). www.e-flux.com/journal/59/61130/operational-images/
Paglen, Trevor. 2016. “Invisible Images (Your Pictures Are Looking at You).” The New Inquiry, 8 December. thenewinquiry.com/invisible-images-your-pictures-are-looking-at-you/
Paglen, Trevor. 2022. WhatsApp call with author, 25 November.
Pardo, Alona. 2019. “From ‘Car’ to ‘Omelette’—Trevor Paglen in Conversation with Alona Pardo.” In Trevor Paglen: From “Apple” to “Anomaly,” ed. Sarah Cook and Alona Pardo, 35–45. Barbican Centre.
Raunig, Gerald. 2016. “No Future, Dividuelle Linien, neue AkteurInnen der Kreativität.” springerin 01/2016:12–14.
Rogoff, Irit. 2008. “Turning.” e-flux journal 00 (November). www.e-flux.com/journal/00/68470/turning/
Saltz, David Z. 2015. “Sharing the Stage with Media: A Taxonomy of Performer-Media Interactions.” In Performance and Media: Taxonomies for a Changing Field, ed. Sarah Bay-Cheng, Jennifer Parker-Starbuck, and David Z. Saltz, 93–125. University of Michigan Press.
Schechner, Richard. 2002. Performance Studies: An Introduction. Routledge.
Schröter, Jens. 2017. “Performing the Economy, Digital Media and Crisis.” In Performing the Digital: Performativity and Performance Studies in Digital Cultures, ed. Martina Leeker, Imanuel Schipper, and Timon Beyes, 247–76. Transcript Verlag.
Schuilenburg, Marc, and Peeters, Rik, eds. 2021. The Algorithmic Society: Technology, Power, and Knowledge. Routledge.
Seel, Martin. 2019. “Letting ourselves be determined: A Comment on ‘Diderot, or: The Power of Critique.’” In The Value of Critique: Exploring the Interrelations of Value, Critique, and Artistic Labour, ed. Isabelle Graw and Christoph Menke, 73–75. Campus Verlag.
Share Network. n.d. “Emerging Models for Artistic Research across Europe.” www.sharenetwork.eu/images/products/5/1-emerging-models-for-artistic-research-across-europe.pdf
Sight Machine. 2019. Evening program. Barbican Hall, London, 11 July.
Skinner, B.F. 1938. The Behavior of Organisms: An Experimental Analysis. Appleton-Century-Crofts.
Slobodian, Quinn. 2018. Globalists: The End of Empire and the Birth of Neoliberalism. Harvard University Press.
Sollfrank, Cornelia, Stalder, Felix, and Niederberger, Shusha, eds. 2021. Aesthetics of the Commons. Diaphanes.
Spångberg, Mårten. 2010. “Researching Research, Some reflections on the current status of research in performing arts.” web.archive.org/web/20100423050331/https://www.international-festival.org/node/28529
Stallabrass, Julian. 1995. “Empowering Technology: The Exploration of Cyberspace.” New Left Review 211:3–32.
Stallabrass, Julian. 2011. “Negative Dialectics in the Google Era: An Interview with Trevor Paglen.” October 138:3–14. www.jstor.org/stable/41417903
Tiqqun. (2001) 2020. The Cybernetic Hypothesis. Trans. Robert Hurley. Semiotext(e).
Vishmidt, Marina. 2017. “Between Not Everything and Not Nothing: Cuts Towards Infrastructural Critique.” In Former West: Art and the Contemporary After 1989, ed. Maria Hlavajova and Simon Sheikh, 265–69. The MIT Press.
Wiener, Norbert. 1961. Cybernetics: Or Control and Communication in the Animal and the Machine. The MIT Press.
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.

TDReadings

Critical Art Ensemble. 2000. “Recombinant Theatre and Digital Resistance.” TDR 44, 4 (T168):151–66. doi.org/10.1162/10542040051058546
McGlotten, Shaka. 2019. “Streaking.” TDR 63, 4 (T244):152–71. doi.org/10.1162/dram_a_00881
Paglen, Trevor. 2011. “The Expeditions: Landscape as Performance.” TDR 55, 2 (T210):23. doi.org/10.1162/DRAM_a_00065

Figure 1. Trevor Paglen, Sight Machine, 2017–present, production still. (Photo © Trevor Paglen; courtesy of the artist)

Figure 2. Trevor Paglen, From “Apple” to “Anomaly,” installation view, Barbican Art Gallery, 2019. (Photo © Max Colson)

Figure 3. Video still from Sylvia Eckermann/Gerald Nestler, “Making the Black Box Speak.” The Future of Demonstration, season 2, episode 3. Atelier Augarten, Vienna, 2018, thefutureofdemonstration.net/passion/e03/. (Courtesy of Gerald Nestler)

Figure 4. Video still from Gerald Nestler. Countering Capitulation. From Automated Participation to Renegade Solidarity. High-frequency trading and the forensic analysis of the Flash Crash, 6 May 2010, single channel video, 11:20 min., 2013–14, produced for the Forensic Architecture exhibition, Forensis, at the Haus der Kulturen der Welt, Berlin, curated by Anselm Franke and Eyal Weizman, 2014, vimeo.com/channels/AoR. (Courtesy of Gerald Nestler; courtesy of Nanex LLC)

Figure 5. Vladan Joler, New Extractivism, https://extractivism.online, 2020. (Courtesy of Vladan Joler)

Figure 6. Vladan Joler, Facebook Algorithmic Factory, NOVI EKSTRAKTIVIZAM: o mašinama, eksploataciji ljudi i prirode (New Extractivism: On Machines, the Exploitation of People and Nature), MSUV, photograph. (Courtesy of Vladan Joler)