From Part II - Gender and Technology at the Workplace
Ellen Balka, Simon Fraser University, British Columbia; Ina Wagner, Universität Siegen, Germany; Anne Weibert, Universität Siegen, Germany; Volker Wulf, Universität Siegen, Germany
The chapter starts with the invisibility of women in the early days of computer science and their substantial contributions to the field. An important part of this chapter is a set of studies demonstrating the differences in women’s participation in the computing field across countries. A nuanced, culturally situated intersectional analysis of gender and computing reveals that some countries (e.g. Malaysia or India) are much more open to educating and employing women IT professionals than others. Power issues continue to determine women’s opportunities to enter a career in the computing field. The chapter describes women’s work experiences in IT professions, the male-dominated working culture, and forms of gendered racism. One of the main barriers for women continues to be the typical working conditions in IT companies. But studies also show how women, over time, empower themselves, carving out a space that helps them deploy their skills, grow, and take control of their own careers. The chapter concludes with the question ‘What is wrong with computing?’
We have covered a considerable amount of ground. After beginning with some working definitions of gender and technology and providing an overview of the ethical-political approach to design, we went on to address historical and contemporary studies concerned with the interaction of gender and technology at work. The first step was to look back, collecting and presenting studies of women’s paid employment in different areas of work. This was done with the aim of describing key insights about gender and technology gained in these early studies and analyzing the ways in which they influenced our approach to system design and design justice. A crucial role was played, for example, by findings about invisible work. We also demonstrated how the early studies of women and technology at work helped lay the foundation for related areas concerned with design justice in relation to technology at work – such as critical race studies, queer studies, and intersectionality.
This chapter discusses the computerization of office work, which in many ways served as a focal point for the emergence of research about gender and technology in the 1980s. The focus on office automation allows us to explore topics that have been central to the development of feminist perspectives on work. The issues the chapter addresses include the nature of office work, invisible work and skill, the ‘gendering’ of office machines, and debates about skill – deskilling, upskilling, and the social construction of skill. Observational studies of office information systems also produced many valuable insights concerning technology, organization, and managerial decisions, with a focus on skill, learning, and the need for workplace design that supports office workers in appropriating the new technologies and integrating them into their practices. The chapter positions workplace design with respect to several strong traditions, developed in the 1970s and 1980s, of (re)designing jobs and the organizations in which they are embedded.
This chapter discusses the current state of laws regulating facial recognition technology (FRT) in the United States. The stage is set for the discussion with a presentation of some of the unique aspects of regulation in the United States and of the relevant technology. The current status of FRT regulation in the United States is then discussed, including general laws (such as those that regulate the use of biometrics) and those that more specifically target FRT (such as those that prohibit the use of such technologies by law enforcement and state governments). Particular attention is given to the different regulatory institutions in the United States, including the federal and state governments and federal regulatory agencies, as well as the different treatment of governmental and private users of FRT. The chapter concludes by considering likely future developments, including potential limits of or challenges to the regulation of FRT.
The key ethical requirements for all AI technologies, including facial recognition technology (FRT), are transparency and explainability. This chapter first identifies the extent to which transparency and explainability are needed in relation to FRT among different stakeholders. Second, after briefly examining which types of information about AI could potentially be protected as trade secrets, it identifies situations where trade secret protection may inhibit transparent and explainable FRT. It then analyses whether current trade secret law, in particular the ‘public interest’ exception, is capable of addressing the conflict between the proprietary interests of trade secret owners and the artificial intelligence transparency needs of certain stakeholders. The chapter focusses on FRT in law enforcement, with a greater emphasis on real-time biometric identification technologies, which are considered the highest risk.
This chapter goes back to the feminist discourse on science/technology and gender, which started in the 1960s and 1970s and was led by women scientists. Feminists criticized the gender binary and other dualisms and brought forward an understanding of ‘scientific objectivity’ as being rooted in the multiplicity of experiences. Feminist criticism of science and technology was later enriched by queer theory and a focus on intersectionality. Of particular influence on a feminist approach to science and technology were feminist standpoint theory and, connected with it, Donna Haraway’s notion of ‘situated knowledge’. In an STS tradition, Cynthia Cockburn analyzed the gendering of technologies – or the mutual shaping of gender and technology. Researchers in the field of cultural studies have followed the STS tradition with empirical studies of how gender plays out in activities such as radio tinkering or in makerspaces. One of the important insights on the way to a gender/intersectional perspective on design is Faulkner’s work on engineers and her understanding that the gendering that occurs in engineering practices is complex and heterogeneous.
The aim of this chapter is to promote a view of designers as making ethical-political choices. It introduces key concepts that feminist scholars contributed to our understanding of ethics and politics concerning the relationships between equality, difference, and social justice. The feminist ethics of care and responsibility builds on the centrality of relationships for women’s thinking. Joan Tronto emphasized the interrelatedness of an ethics of care with social justice, arguing that the ways care is performed and institutionalized are deeply entangled with issues of power and inequality. There is also a connection between caretaking and the many forms of invisible work that socialist feminists claimed women do to maintain the paid labour force, as well as their domestic labours that ensure social reproduction. The chapter concludes with observations about feminist politics, drawing a line from ‘the personal is political’ to the ‘matrix of domination’, concepts that help us understand how power as a source of inequalities is organized on different levels and what can be done to enable marginalized groups to ‘jump into the public sphere’, become visible, and have a voice.
Russia’s invasion of Ukraine is the first major military conflict in which facial recognition technology (FRT) is being used openly, with Ukraine’s Ministry of Defence publicly acknowledging its use of FRT to assist in the identification of Russian soldiers killed in combat. The technology has also likely been used to investigate people at checkpoints and during interrogations. We can expect FRT to be used for tracing individuals responsible for war crimes in the near future. For the Russian Federation, FRT has become a powerful tool to suppress anti-war protests and to identify those taking part in them. In territories occupied by Russia, FRT has been used to identify political opponents and people opposing Russian rule. This chapter focusses on the potential and risks of the use of FRT in a war situation. It discusses the advantages that FRT brings to both sides of the conflict and underlines the associated concerns. It is argued that despite human rights concerns, FRT is becoming a tool of military technology that is likely to spread and develop further for military purposes.
Digital surveillance technologies using artificial intelligence (AI) tools such as computer vision and facial recognition are becoming cheaper and easier to integrate into governance practices worldwide. Morocco serves as an example of how such technologies are becoming key tools of governance in authoritarian contexts. Based on qualitative fieldwork including semi-structured interviews, observation, and extensive desk reviews, this chapter focusses on the role played by AI-enhanced technology in urban surveillance and the control of migration across the Moroccan–Spanish border. Two cross-cutting issues emerge: first, while international donors provide funding for urban and border surveillance projects, their role in enforcing transparency mechanisms in their implementation remains limited; second, Morocco’s existing legal framework hinders any kind of public oversight. Video surveillance is treated as the sole prerogative of the security apparatus, and so far public actors have avoided engaging directly with the topic. The lack of institutional oversight and public debate on the matter raises serious concerns about the extent to which the deployment of such technologies affects citizens’ rights. AI-enhanced surveillance is thus an intrinsically transnational challenge in which private interests of economic gain and public interests of national security collide with citizens’ human rights across the Global North/Global South divide.
This chapter reflects on some of the challenges surrounding the broader context of design. Strengthening intersectionality in systems design requires data concerning not just women but also gender minorities, data that can be shared and analyzed; it requires ‘finding’ those who should be part of research and creating safe spaces for them. A feminist perspective encourages prioritizing the ‘personal’, recognizing it as a political act of resistance. At the heart of gender equality is the collective dimension of women’s citizenship and their social capital. Alliances in support of gender/social justice in design need to be built using strategies such as participatory infrastructuring and ‘institutioning’, but also acknowledging the importance of feminist trade unionism. The final points raised in this chapter are: how to connect with moves to decolonize discourses and practices of IT design; how to take a feminist perspective with regard to teaching; how to get funded and published; and how to challenge the business models of the software industry that undermine technical flexibility and make gender-sensitive design approaches difficult to implement on a larger scale.
This chapter provides an introductory overview of the recent emergence of facial recognition technologies (FRTs) into everyday societal contexts and settings. It provides valuable social, political, and economic context to the legal, ethical, and regulatory issues that surround this fast-growing area of technology development. In particular, the chapter considers a range of emerging ‘pro-social’ applications of FRT that have begun to be introduced across various societal domains - from the application of FRTs in retail and entertainment, through to the growing prevalence of one-to-one ID matching for intimate practices such as unlocking personal devices. In contrast to this seemingly steady acceptance of FRT in everyday life, the chapter makes a case for paying renewed attention to the everyday harms of these technologies in situ. The chapter argues that FRT remains a technology that should not be considered a benign addition to the current digital landscape. It is a technology that requires continued critical attention from scholars working in the social, cultural, and legal domains.
State actors in Europe, in particular security authorities, are increasingly deploying biometric methods such as facial recognition for different purposes, especially in law enforcement, despite a lack of independent validation of the promised benefits to public safety and security. Although some rules, such as the General Data Protection Regulation and the Law Enforcement Directive, are in force, no concrete legal framework addressing the use of facial recognition technology (FRT) in Europe exists so far. Given that FRT processes extremely sensitive personal data, does not always work reliably, and is associated with risks of unfair discrimination, a general ban on any use of artificial intelligence for automated recognition of human features, at least in publicly accessible spaces, has been demanded. Against this background, the chapter adopts a fundamental rights perspective and examines whether and to what extent a government use of FRT can be accepted under European law.