Book contents
- Frontmatter
- Dedication
- Contents
- Acknowledgements
- Series Editor’s Preface
- 1 Introduction
- 2 Setting the Ground: The Intermediary Liability Debate and Framing Issues
- 3 First Principles and Occupiers’ Liability: The Case against Immunity
- 4 Property and Privacy: The Case for Strict Liability
- 5 Property and Privacy: Objections and Possible Extensions
- 6 The Policy Debate: Uniqueness of Harm from NCII
- 7 The Policy Debate: Freedom of Expression and Financial Costs of Filtering
- 8 The Easy Case for Viewers’ Liability: Child Pornography and Apportionment of Liability
- 9 Viewers’ Liability: Intention and Objective Fault
- 10 The Power of Property: Strict Liability for Viewing NCII
- 11 Scope of Liability for Breaches of Privacy
- 12 Is Suing Viewers Practicable?
- 13 Conclusion
- References
- Index
7 - The Policy Debate: Freedom of Expression and Financial Costs of Filtering
Published online by Cambridge University Press: 17 January 2024
Summary
Introduction
In this chapter I debunk the claims that filtering NCII imposes excessive costs, whether to freedom of expression or in financial terms, as well as the related claim that such filtering is not technologically feasible. I first focus on Facebook’s (now Meta’s) filtering practice as reflected in its (less than transparent) transparency report. I then evaluate this practice to highlight its shortcomings and to delineate the contours of an acceptable and practicable NCII filtering regime, backed by (a more controversial) strict liability for harm from remaining NCII. I discuss penumbral definitional issues of intimacy beyond nudity and cultural differences, and the scope of liability for harms from these images. The approach I take diverges from the Law Commission’s recent definition of intimate images (2022) by affording better protection to cultural minorities and by drawing lessons from medical ethics and law. I also discuss economies of scale and their potential relevance to smaller intermediaries, with a critique of the weight given in recent policy discussions to a means-based test as limiting intermediaries’ potential duties to filter content.
Existing filtering practice, with a focus on Facebook
As documented in Chapters 2 and 3, the two main arguments against intermediaries’ liability for user content are the financial costs such an obligation involves – including its effects on competition – and a chilling effect on expression arising from excessive removal of lawful speech (Friedman and Buono, 2000; Guo, 2008; Kosseff, 2010; Goldman, 2012; Keller, 2017b). This part examines whether these costs are real or significant for a system of NCII filtering. As the discussion demonstrates, they are not.
In evaluating the financial and expression costs of filtering NCII, it is useful to start by distinguishing between two versions of pre-notice liability. One is a best-efforts – or feasibility – standard (Helman and Parchomovsky, 2011), which is similar to the duty of care standard suggested by the government in the Online Harms White Paper and the Online Safety Bill, to the European Commission’s approach of expecting platforms – albeit voluntarily – to proactively monitor some types of illegal content, and to the Copyright in the Digital Single Market (C-DSM) Directive.
- Egalitarian Digital Privacy: Image-based Abuse and Beyond, pp. 110–131. Publisher: Bristol University Press. Print publication year: 2023.