Book contents
- Frontmatter
- Dedication
- Contents
- About the Author
- Acknowledgements
- One Information Warfare in Technocratic Times
- Two The Digiqueer Fight Against Algorithmic Governance
- Three Information Warfare Against Drag Queen Storytime
- Four (Mis)Representation of Same-Sex Attraction
- Five Digiqueer Activism, Advocacy and Allyship
- Six Data Driven Times?
- Notes
- References
- Index
Two - The Digiqueer Fight Against Algorithmic Governance
Published online by Cambridge University Press: 18 January 2024
Introduction
The sorting of populations into measurable types for security and profit through surveillance technologies has amplified the pre-digital fault lines of anti-LGBTQ+ stigma, reinforcing aspects of the hierarchy of human value. As noted in Chapter 1, algorithmically filtered, discriminatory depictions can reinforce the subordination of a group based on identity, including through stereotyping, recognition, denigration and underrepresentation (Mehrabi et al 2021). Platform biometrics such as facial and gait recognition software can be the basis for unlawful discrimination based on ethnicity, race, national origin, gender and other characteristics (United Nations General Assembly 2019). Artificial intelligence extracts patterns from digital information in ways that contest Western liberal notions of consent and privacy, and it is only as reliable as its source information.
Norms of consent have ‘blurred’ through data accrual on sexual identity and government aggregation of social media data with other available databases (Shephard 2016). The appropriation of digital human experience has ‘redistributed’ privacy (Zuboff 2019). As a consequence, algorithmic intervention and the adoption of ‘neuroliberal’ approaches to behaviour modification – a combination of neoliberal principles with policy initiatives derived from insights in the behavioural sciences (Whitehead et al 2019) – have further justified the erosion of standards of truth and trust in democratic institutions by ostensibly supporting liberal orthodoxies of freedom while adopting ‘novel cognitive strategies, emotions, and pre-cognitive affects as a way of securing preferred forms of social conduct’ (Whitehead et al 2019, p 633).
The expropriation of critical human rights has been described as an overthrow of the people's sovereignty and a central tenet of ‘surveillance capitalism’ – ‘[a] new economic order that claims human experience as free raw materials for hidden commercial practices of extraction, prediction, and sales’ (Zuboff 2019, p vi). Simply put, algorithmic governance and neuroliberal approaches to behaviour modification can commodify questionably sourced user-generated data without commensurate investment in strategies to mitigate the social costs of the biases they might generate and/or perpetuate. It is within this context that the digiqueer citizen must address the intact social origins of stigma, despite the decriminalization of same-sex conduct in a growing number of jurisdictions, the expansion of legitimate categories of vulnerability enshrined in anti-discrimination law (Solanke 2017), and progress made on marriage equality.
- Representation, Resistance and the Digiqueer: Fighting for Recognition in Technocratic Times, pp 25-47. Publisher: Bristol University Press. Print publication year: 2023