Ellen Balka, Simon Fraser University, British Columbia; Ina Wagner, Universität Siegen, Germany; Anne Weibert, Universität Siegen, Germany; Volker Wulf, Universität Siegen, Germany
from Part II - Gender and Technology at the Workplace
Factory work was, and still is, strongly connected with images of masculinity and of the ideal worker as a man. The chapter starts out with historical studies of women’s work in the electronics, meatpacking, and automotive industries. The findings from these studies point to theories of the gendered organization: the gender subtext in organizations and its ‘omnirelevance’, the institutional nature of gendered notions of skill and performance, and the diversity and persistence of masculinities. Images of masculinity, one of the main barriers to ‘undoing gender’ in modern industrial settings, are particularly dominant in male-dominated industrial companies – in construction, mining, and oil and gas – which continue to resist accepting women workers and women engineers, even though new technologies have reduced the physical strain of the work and shifted skill requirements. The chapter also discusses feminist research on the gender dynamics of the globalization of production and of household relations, based on case studies in East Asian countries. It concludes with design considerations for the modern industrial workplace.
Facial recognition technology (FRT) has been actively deployed by both the private and public sectors for a wide range of purposes in China. As the technology has become more prevalent, the laws governing FRT have developed rapidly in recent years. While the use of FRT is increasingly regulated in the country, the regulatory restrictions can invariably be lifted on grounds of public security. Government agencies have consistently claimed this regulatory exemption for their massive FRT deployments. Moreover, the liability for the government’s abuse or misuse of personal data is relatively insignificant compared with that imposed on private parties. Based on recent laws and cases, this chapter explains China’s asymmetric regulatory framework and the factors shaping it.
Almost forty Brazilian cities have begun to deploy facial recognition technology (FRT) in a bid to automate the public safety, transportation, and border control sectors. Such initiatives are frequently introduced in the context of ‘Smart City’ programmes, which exist in a sort of legislative vacuum. Despite the numerous bills recently discussed in the Brazilian Parliament, there is still no legislation that addresses artificial intelligence in general or FRT use specifically. Only minimal and incomplete guidance can be found in general frameworks and sectoral legislation, such as the Brazilian General Data Protection Law (LGPD), the Brazilian Civil Rights Framework for the Internet, the Civil Code, and even the Federal Constitution. This chapter provides an overview of the current status of FRT regulation in Brazil, highlighting the existing deficiencies and risks. It discusses whether LGPD rules allowing the use of FRT for public safety, national defence, state security, investigative activities, and the repression of criminal activities are reasonable and justified.
Ellen Balka, Simon Fraser University, British Columbia; Ina Wagner, Universität Siegen, Germany; Anne Weibert, Universität Siegen, Germany; Volker Wulf, Universität Siegen, Germany
This chapter frames the book. It explains the focus on women as a historically highly relevant category, while acknowledging the multiplicities of (gender) identities and relations that the rise of queer theory has opened up. It also draws attention to the different experiences that women have at work in relation to technology, which are mediated in complex ways by ethnic and class backgrounds as well as by issues of sexuality. The chapter outlines the different disciplinary orientations the book draws upon – including feminist theory; science, technology, and society studies; sociology of work; political economy; organizational studies; labour history; as well as CSCW, HCI, and participatory design – and then introduces the key concepts and theories used in the book: the distinction between sex and gender, intersectionality, the problematic notion of race, the view of engineers/designers as making ethical-political choices, and the concept of technology. It puts forward the notion of practice-based research and the importance of involving users in design decisions as key to achieving gender equality in design. These concepts are elaborated, as well as made ‘practical’, in the course of the book.
Facial recognition technology (FRT) has achieved remarkable progress in the last decade owing to the improvement of deep convolutional neural networks. The massive deployment of FRT in the United Kingdom has unsurprisingly tested the limits of democracy: where should the line be drawn between acceptable uses of this technology for collective or private purposes and the protection of the individual entitlements that FRT compresses? The Bridges v South Wales Police case offered guidance on this issue. After lengthy litigation, the Court of Appeal of England and Wales ruled in favour of the applicant, a civil rights campaigner who claimed that the live FRT deployed by the police at public gatherings infringed his rights. Although the Bridges case offered crucial directives on balancing individual rights against the lawful use of FRT for law enforcement purposes under the current UK rules, several ethical and legal questions remain unresolved. This chapter provides an overview of sociological and regulatory attitudes towards this technology in the United Kingdom; discusses the Bridges saga and its implications; and offers reflections on the future of FRT regulation in the United Kingdom.
from Part II - Gender and Technology at the Workplace
Ellen Balka, Simon Fraser University, British Columbia; Ina Wagner, Universität Siegen, Germany; Anne Weibert, Universität Siegen, Germany; Volker Wulf, Universität Siegen, Germany
Like clerical work, much data work is skilled but undervalued, while other parts of data work are standardized, repetitive, and organized via platforms. Feminist HCI emphasizes the skills and care that are needed to create meaningful data. While online platform work is not necessarily women’s work, research suggests that significant gender disparities exist. The chapter presents a number of case studies, ranging from outsourced machine learning (ML) data work in Latin America to small-town women doing Amazon Mechanical Turk (AMT) and other crowdwork in India. While this work offers an independent income to women who would otherwise not have access to one, the studies also highlight the workers’ vulnerability to pressures arising from the work itself and from the demands of family members. The chapter underlines the importance of labour issues connected to these modern workplaces – the invisibility of the workers, the precarity of their work situation, the lack of opportunities for learning, and so forth. It points to design issues such as how to support data workers in producing data with care, and how to provide them with opportunities to learn and to professionalize their work.
Ellen Balka, Simon Fraser University, British Columbia; Ina Wagner, Universität Siegen, Germany; Anne Weibert, Universität Siegen, Germany; Volker Wulf, Universität Siegen, Germany
This chapter goes back to the arguments about the importance of context for design-oriented research on women’s work. It addresses questions such as: What are the most relevant aspects of context? How much do designers need to know about them? And which methods can help designers understand and deal with contextual elements in their work? The chapter revisits concepts that help in understanding contexts, along with their epistemological roots, and discusses approaches to dealing with context in practical terms: learning about the history of a place and its culture; understanding politics, policymaking, and the institutional/organizational context; getting a grasp of working conditions and skills; and making space for intersectionality. The chapter includes a retrospective analysis of two of the authors’ own design/research projects, looking into how they dealt with context. It formulates a set of questions intended to help designers develop strategies for maintaining sensitivity towards gender issues.
from Part II - Gender and Technology at the Workplace
Ellen Balka, Simon Fraser University, British Columbia; Ina Wagner, Universität Siegen, Germany; Anne Weibert, Universität Siegen, Germany; Volker Wulf, Universität Siegen, Germany
This chapter follows the debate on data work, without which AI-based technologies would not exist. It introduces concepts that capture bias in datasets and algorithms from a feminist data ‘ethics of care’ approach and discusses approaches to avoiding or counteracting bias. The chapter then turns to work and the question of how to make AI-based technologies work in practice. Examples, most of them from IT development and health care, help in understanding the centrality of care, trust, and human–algorithm collaborations – what is often called ‘the human in the loop’ – as key elements determining the usefulness of AI-based systems and tools at work. Trust and care are central to data and algorithmic stewardship, which will need to be mobilized in relation to AI and machine learning if we are to achieve gender justice in future work processes.
This chapter examines facial recognition technology (FRT) and its potential for bias and discrimination against racial minorities in the criminal justice system. The chapter argues that defining the technology as an automated process implies objectivity, suggesting that such technologies are free from errors and prejudices. However, facial recognition depends on the data used to train an algorithm, and operators make judgements shaped by the wider social system and structures within which it is deployed. The algorithms that underpin FRT will continue to reinforce the status quo with respect to power relations in the criminal justice system unless both data-based and societal issues of inequality and discrimination are remedied. FRT is imbued with biases that can negatively affect outcomes for minority groups. The chapter argues that there is a need to focus on systemic discrimination and inequality (rather than calling for a ban on the technology). While the data-based issues are more straightforward to address, this alone will not be sufficient: addressing broader and more complex social factors must be a key focus in working towards a more equal society.
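The chapter’s point that error rates are a property of the training data can be made concrete. Below is a minimal sketch, using entirely synthetic numbers of our own (the group labels and score distributions are hypothetical), of the kind of audit used to expose such disparities: at a single global decision threshold, the false match rate differs sharply between two demographic groups whose impostor score distributions differ.

```python
import numpy as np

def false_match_rate(scores: np.ndarray, same_person: np.ndarray, threshold: float) -> float:
    """Fraction of different-person comparisons wrongly accepted as matches."""
    impostor_scores = scores[~same_person]
    return float(np.mean(impostor_scores >= threshold))

# Synthetic similarity scores for two demographic groups. The genuine
# (same-person) distributions are identical, but the impostor distribution
# sits higher for group_b - e.g. because training data under-represented
# that group, so unrelated faces embed closer together.
rng = np.random.default_rng(1)
for group, impostor_mean in [("group_a", 0.30), ("group_b", 0.45)]:
    scores = np.concatenate([
        rng.normal(0.80, 0.10, 1000),           # genuine comparisons
        rng.normal(impostor_mean, 0.10, 1000),  # impostor comparisons
    ])
    same_person = np.array([True] * 1000 + [False] * 1000)
    fmr = false_match_rate(scores, same_person, threshold=0.60)
    print(f"{group}: false match rate at threshold 0.60 = {fmr:.4f}")
```

Under these assumptions, group_b’s false match rate is roughly fifty times higher than group_a’s at the same threshold, which is the data-based pattern the chapter describes.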
Scholarly treatment of facial recognition technology (FRT) has focussed on its human rights impacts, with frequent calls for the prohibition of the technology. While acknowledging the potentially detrimental and discriminatory uses of FRT by the state, this chapter seeks to advance the discussion of what principled regulation of FRT might look like: it should be possible to prohibit or regulate unacceptable uses while retaining less hazardous ones. The chapter reflects on the principled use and regulation of FRT in the public sector, with a focus on Australia and Aotearoa New Zealand. The authors draw on their experience as researchers in this area and on their professional involvement in oversight and regulatory mechanisms in these jurisdictions and elsewhere. Both countries have seen significant growth in the use of FRT, but regulation remains patchwork. In comparison with other jurisdictions, human rights protections and avenues for individual citizens to complain and seek redress remain insufficient in Australia and New Zealand.
Protest movements are gaining momentum across the world, with Extinction Rebellion, Black Lives Matter, and strong pro-democracy protests in Chile and Hong Kong taking centre stage. At the same time, many governments are increasing their surveillance capacities in the name of protecting the public and addressing emergencies. Whether these events and political strategies relate to the war on terror, pro-democracy movements, or anti-racism protests, states’ resort to technology and increased surveillance as tools to control the population has been similar. This chapter focusses on the chilling effect that the use of facial recognition technology (FRT) in public spaces has on the right to peaceful assembly and political protest. Pointing to the absence of oversight and accountability mechanisms over government use of FRT, the chapter demonstrates that FRT has significantly strengthened state power. Attention is drawn to the crucial role of tech companies in assisting governments in public space surveillance and in curtailing protests, and it is argued that hard human rights obligations should bind these companies and governments, so as to ensure that political movements and protests can flourish in the post-COVID-19 world.
Although law enforcement institutions in all EU member states must adhere to European standards on the use of facial recognition technology (FRT), each country has national standards that transpose these requirements into the framework governing FRT in practice. Since every society has an important role in controlling the implementation of legal acts, especially those relating to human rights, society and related interest groups must insist on the proper implementation of FRT regulation; otherwise it remains declarative and void. If public awareness and pressure to have a law implemented properly are high, the implementing institutions are forced to take action.
This chapter analyses the regulation of FRT usage by Lithuanian law enforcement institutions. Public discussion of FRT usage in the media, the involvement of non-governmental organisations, and other forms of social control are also considered. Finally, the chapter examines the changes that the EU Artificial Intelligence Act may bring to the national regulation of FRT.
Ellen Balka, Simon Fraser University, British Columbia; Ina Wagner, Universität Siegen, Germany; Anne Weibert, Universität Siegen, Germany; Volker Wulf, Universität Siegen, Germany
The material for this chapter comes from interviews the authors conducted with women who pioneered research on women’s work and technology, as well as with researchers who are earlier in their careers and who continue this tradition with their own ideas about gender and design. Analysis of these interviews shows that the ways in which people come to focus on gender are varied: they come from different disciplinary backgrounds, have developed their approaches in different contexts while seizing different research opportunities, and have forged their own pathways to careers in women’s/gender studies and technology development. The chapter takes up these personal biographies, highlighting their different starting points, research interests, and struggles.
This chapter, authored by a computer scientist and an industry expert in computer vision, briefly explains the fundamentals of artificial intelligence and facial recognition technologies. The discussion encompasses the typical development life cycle of these technologies and unravels the essential building blocks integral to understanding the complexities of facial recognition systems. The authors further explore the key challenges confronting computer and data scientists in their pursuit of accurate, effective, and trustworthy systems – challenges that also drive many of the common concerns about facial recognition technologies.
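As a companion to the chapter’s building blocks, here is a minimal sketch (our illustration, not the authors’) of the verification step at the heart of most modern facial recognition pipelines: detect and align a face, map it to an embedding vector with a deep network, and compare embeddings against a stored template. The embedding network is stubbed out with random vectors, and the 512-dimensional embedding size and 0.6 threshold are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, template: np.ndarray, threshold: float = 0.6) -> bool:
    """Declare a match when the embeddings are sufficiently similar.

    Choosing the threshold trades false matches against false non-matches,
    one of the accuracy/trustworthiness challenges the chapter discusses.
    """
    return cosine_similarity(probe, template) >= threshold

# In a deployed system the embeddings come from a deep network applied to a
# detected and aligned face crop; here random vectors stand in for them.
rng = np.random.default_rng(0)
template = rng.normal(size=512)                      # enrolled identity
probe = template + rng.normal(scale=0.1, size=512)   # new capture, same face
print(verify(probe, template))                       # True: embeddings are close
```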
As facial recognition technology (FRT) becomes more widespread, and its productive processes, supply chains, and circulations become more visible and better understood, privacy concepts become more difficult to apply consistently. This chapter argues that the clunkiness of privacy and data protection law in regulating facial recognition is a product of how privacy and data protection conceptualise the nature of online images. Whereas privacy and data protection embed a ‘representational’ understanding of images, the dynamic facial recognition ecosystem of image scraping, dataset production, and searchable image databases used to build FRT suggests that online images are better understood as ‘operational’. Online images do not simply present their referent for easy human consumption; rather, they enable and participate in a sequence of automated operations and machine–machine communications that are foundational to the proliferation of biometric techniques. The chapter demonstrates how privacy law’s failure to accommodate this theorisation of images leads to confusion and inconsistency in the juridical treatment of facial recognition, and to the declining coherence of legal concepts.
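The scrape/index/search pipeline the chapter calls ‘operational’ can be sketched in a few lines. Below is a minimal, hypothetical illustration of our own (the FaceIndex class and the URL labels are invented): once embedded, an online image no longer merely depicts its referent; it becomes an operand in automated nearest-neighbour lookups.

```python
import numpy as np

class FaceIndex:
    """Toy searchable index of face embeddings keyed by source URL."""

    def __init__(self) -> None:
        self.vectors: list[np.ndarray] = []
        self.labels: list[str] = []

    def add(self, embedding: np.ndarray, label: str) -> None:
        # Dataset production: the scraped image is stored only as a vector.
        self.vectors.append(embedding / np.linalg.norm(embedding))
        self.labels.append(label)

    def search(self, query: np.ndarray) -> str:
        # Machine-machine operation: nearest-neighbour lookup, no human viewer.
        q = query / np.linalg.norm(query)
        similarities = np.array([v @ q for v in self.vectors])
        return self.labels[int(np.argmax(similarities))]

rng = np.random.default_rng(2)
index = FaceIndex()
scraped = {url: rng.normal(size=512) for url in ["img_url_1", "img_url_2", "img_url_3"]}
for url, embedding in scraped.items():
    index.add(embedding, url)

# A probe image close to the face behind img_url_2 retrieves that source.
probe = scraped["img_url_2"] + rng.normal(scale=0.05, size=512)
print(index.search(probe))  # -> img_url_2
```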