Published online by Cambridge University Press: 10 June 2021
The debate over whether novel beings should be legally recognized as legitimate rights holders has produced a vast amount of commentary. This paper contributes to that discourse by shifting the normative focus away from criteria possessed by the novel beings in question and back toward the criterion on which we ourselves are able to make legitimate rights claims. It draws heavily on the moral writings of Alan Gewirth, in particular his identification of noumenal agency as the source of all legitimate rights claims. Taking Gewirthian ethical rationalism as providing a universally applicable categorical imperative that binds all agents to comply with its requirements, the paper argues that it is at least morally desirable that any legal system recognize the moral rights claims of all agents as equally legitimate. By extension, it is at least morally desirable that the status of legal personhood be granted by a legal system to all novel beings who are noumenal agents, insofar as that status is necessary for the legal recognition of their rights. Having established the desirability of this extension, the paper closes with an examination of recent cases involving both biological and nonbiological novel beings in order to assess their conformity with the approach outlined above. The paper demonstrates that such recognition is conceptually possible, and that we must therefore move beyond the current anthropocentricity of legal systems and recognize the legitimate moral claim to legal personhood of all novel beings who possess noumenal agency.
1. Because questions remain as to the desirability of legal rights for all of these novel beings, this non-exhaustive list should be treated as a definitional point of reference for the term “novel being” throughout the remainder of this article.
2. Department of Health and Social Security, Report of the Committee of Inquiry into Human Fertilisation and Embryology (Cmnd 9314, 1984), para 11.17.
3. See note 2, Warnock Report 1984, para 3–5.
4. See note 2, Warnock Report 1984, paras 11.18, 22, 24, and 30, respectively.
5. The argument is dialectical because it proceeds on a conversational basis in which a series of claims affecting an agent is made, and necessary because acceptance of the starting premise logically requires the agent to accept each subsequent premise. It is therefore designed to be logically compelling: the end point of the argument is an inescapable conclusion for anyone who accepts the starting premise.
6. Beyleveld, D. The Dialectical Necessity of Morality: An Analysis and Defense of Alan Gewirth’s Argument to the Principle of Generic Consistency. Chicago: University of Chicago Press; 1991.
7. Jowitt, J. Monkey see, monkey sue: Gewirth’s principle of generic consistency and rights for non-human agents. Trinity College Law Review 2016;19:71.
8. A categorical imperative is a specific type of maxim that claims to regulate our behavior in a way which overrides all contradictory reasons. It is an imperative in that it is addressed to all agents, and categorical in that it applies unconditionally.
9. A noumenal (or bare) agent is an agent whose subjective characteristics have been removed, leaving only the ability to choose to act in response to a desire for a given end, beyond mere reflex or natural impulse.
10. See note 6, Beyleveld 1991, at 14.
11. This paper will not attempt to engage with the debate as to whether the standard view is the correct view of agency. It assumes that, in the absence of convincing evidence as to why this view is incorrect, it should be accepted as an accurate representation of what it means to be an agent.
12. Gewirth, A. Reason and Morality. Chicago: University of Chicago Press; 1978:22–63.
13. See note 12, Gewirth 1978, at 63–103.
14. See note 6, Beyleveld 1991, at 44–5.
15. See note 12, Gewirth 1978, at 104–98 (especially 104–28).
16. See note 6, Beyleveld 1991, at 44–5.
17. Nagel, T. What is it like to be a bat? The Philosophical Review 1974;83(4):435, 436–40.
18. Beyleveld, D. The principle of generic consistency as the supreme principle of human rights. Human Rights Review 2011;13:9–10.
19. Capps, B. Do chimeras have minds? The ethics of clinical research on a human-animal brain model. Cambridge Quarterly of Healthcare Ethics 2017;26:577–8.
20. Sebo, J. The moral problem of other minds. Harvard Review of Philosophy 2018;25:51–3.
21. Floridi, L, Sanders, J. On the morality of artificial agents. Minds and Machines 2004;14(3):349–50; Gunkel, DJ. A vindication of the rights of machines. Philosophy and Technology 2014;27:118.
22. See note 21, Floridi, Sanders 2004, at 350.
23. Wallach, W, Allen, C, Franklin, S. Consciousness and ethics: Artificially conscious moral agents. International Journal of Machine Consciousness 2011;3(1):179, 190.
24. Himma, K. Artificial agency, consciousness, and the criteria for moral agency: What properties must an artificial agent have to be a moral agent? Ethics and Information Technology 2009;11:21.
25. The question of legal personhood for beings other than Homo sapiens has been explored before. See Stone, CD. Should trees have standing? Toward legal rights for natural objects. Southern California Law Review 1972;45:450.
26. Chen, J, Burgess, P. The boundaries of legal personhood: How spontaneous intelligence can problematise differences between humans, artificial intelligence, companies and animals. Artificial Intelligence and Law 2019;27(1):80; Brozek, B. The troublesome “person”. In: Kurki, V, Pietrzykowski, T, eds. Legal Personhood: Animals, Artificial Intelligence and the Unborn. Cham: Springer; 2017:8.
27. Naffine, N. Who are law’s persons? From Cheshire cats to responsible subjects. Modern Law Review 2003;66:362–4.
28. Taylor, C. Sources of the Self: The Making of the Modern Identity. Cambridge, MA: Harvard University Press; 1992; Griffin, J. On Human Rights. Oxford: Oxford University Press; 2008.
29. van den Hoven van Genderen, R. Do we need new legal personhood in the age of robots and AI? In: Corrales, M, Fenwick, M, Forgó, N, eds. Robotics, AI and the Future of Law. Singapore: Springer; 2018:25.
30. Lawrence, D. More human than human. Cambridge Quarterly of Healthcare Ethics 2017;26:486.
31. See note 30, Lawrence 2017, at 485.
32. Warwick, K. What is it like to be a robot? In: Vallverdú, J, ed. Thinking Machines and the Philosophy of Computer Science: Concepts and Principles. Hershey, PA: IGI Global; 2010:322.
33. Midgley, M. Animals and Why They Matter. Athens: University of Georgia Press; 1983:65; StPierre, DW. The transition from property to people: The road to the recognition of rights for non-human animals. Hastings Women’s Law Journal 1998;9(2):255; Wise, SM. Rattling the Cage: Towards Legal Rights for Animals. London: Profile; 2000:5, 49–50; McDaris, CJ. Legal protection only for those who are most like us—what animal activists can learn from the early women’s movement about society’s resistance to acknowledging rights. Journal of Animal Law 2006;2(1):159; Cornell, D. Are women persons? Animal Law 1997;3:7.
34. Lawrence, D, Palacios-González, C, Harris, J. Artificial intelligence: The Shylock syndrome. Cambridge Quarterly of Healthcare Ethics 2016;25:250.
35. See note 34, Lawrence, Palacios-González, Harris 2016, at 251.
36. Boella, G, Lesmo, L, Damiano, R. On the ontological status of plans and norms. Artificial Intelligence and Law 2004;12:317.
37. Pagallo, U, Corrales, M, Fenwick, M, Forgó, N. The rise of robotics & AI: Technological advances & normative dilemmas. In: Corrales, M, Fenwick, M, Forgó, N, eds. Robotics, AI and the Future of Law. Singapore: Springer; 2018:7.
38. Bryson, J, Diamantis, M, Grant, T. Of, for, and by the people: The legal lacuna of synthetic persons. Artificial Intelligence and Law 2017;25(2):273.
39. See note 38, Bryson, Diamantis, Grant 2017, at 275.
40. See note 38, Bryson, Diamantis, Grant 2017, at 278.
41. See note 38, Bryson, Diamantis, Grant 2017, at 283.
42. See note 38, Bryson, Diamantis, Grant 2017, at 284.
43. See note 38, Bryson, Diamantis, Grant 2017, at 274; Solaiman, SM. Legal personality of robots, corporations, idols and chimpanzees: A quest for legitimacy. Artificial Intelligence and Law 2017;25(2):155.
44. See note 27, Naffine 2003, at 346.
45. See note 27, Naffine 2003, at 359.
46. Te Awa Tupua (Whanganui River Claims Settlement) Act 2017 ss 18–20 (2) (New Zealand).
47. Mohd. Salim v State of Uttarakhand and others (Writ Petition (PIL) No.126 of 2014) para 19; available at https://drive.google.com/file/d/0BzXilfcxe7yuM3VRWTZDeEtmSGc/view (last accessed 26 Mar 2019).
48. Children and Young Persons Act 1933 s 50, as amended by Children and Young Persons Act 1963 s 16 (1).
49. McGee, EM, Maguire, GQ Jr. Becoming Borg to become immortal: Regulating brain implant technologies. Cambridge Quarterly of Healthcare Ethics 2007;16:291.
50. Acción de hábeas corpus presentada por la Asociación de Funcionarios y Abogados por los Derechos de los Animales (AFADA), EXPTE. NRO. P-72.254/15, Tercer Juzgado de Garantías, Poder Judicial Mendoza; available at https://www.nonhumanrights.org/content/uploads/Chimpanzee-Cecilia_translation-FINAL-for-website-2.pdf, translation by Ana María Hernández (Cecilia’s Case) (last accessed 5 Apr 2018).
51. See note 50, Cecilia’s Case 2015, at 32–3.
52. See note 50, Cecilia’s Case 2015, at 8–12.
53. See note 50, Cecilia’s Case 2015, at 24.
54. Favre, D. A new property status for animals: Equitable self-ownership. In: Sunstein, C, Nussbaum, M, eds. Animal Rights: Current Debates and New Directions. Oxford: Oxford University Press; 2004:236; Wise, SM. Animal rights, one step at a time. In: Sunstein, C, Nussbaum, M, eds. Animal Rights: Current Debates and New Directions. Oxford: Oxford University Press; 2004:19.
55. Goodall, J, Wise, SM. Are chimpanzees entitled to fundamental legal rights? Animal Law 1997;3:69.
56. Moritz v Commissioner of Internal Revenue 469 F.2d 466 (10th Cir. 1972).
57. Atiyah, PS, Summers, R. Form and Substance in Anglo-American Law. Oxford: Clarendon Press; 1987:134; Llewellyn, K. The Common Law Tradition: Deciding Appeals. Boston: Little, Brown and Co; 1960:90–1; Kelch, TG. Toward a non-property status for animals. New York University Environmental Law Journal 1998;6(3):546.
58. Arguably, the two most influential and thorough examples of such academic advocacy are the following contributions: see note 33, Wise 2000; and Regan, T. The Case for Animal Rights. 2nd ed. Berkeley: University of California Press; 2004.
59. Objectives of Nonhuman Rights Project; available at https://www.nonhumanrights.org/who-we-are/ (last accessed 20 Apr 2019).
60. Naruto et al. v Slater et al., No 3:15-cv-04324, 2016 (ND CA, 6 Jan 2016) at 4; Naruto et al. v Slater et al., 888 F.3d 418, 426 (9th Cir. 2018).
61. See note 60, Naruto 2016.
62. See note 60, Naruto 2016, at 27.
63. See note 60, Naruto 2016, at 33.
64. Naruto et al. v Slater et al., No 3:15-cv-04324, 2016 (ND CA, 6 Jan 2016) Motion to dismiss, filed 6 Nov 2015, at 2.
65. See note 60, Naruto 2018, at 419–22.
66. The Cetacean Community v Bush 386 F. 3d 1169 (9th Cir. 2004).
67. Nonhuman Rights Project, Inc., on Behalf of Tommy v. Lavery, 31 N.Y.3d 1054, 1058 (8 May 2018).
68. See note 50, Cecilia’s Case 2015, at 23–4.
69. See note 60, Naruto 2016, at 27.
70. See note 32, Warwick 2010, at 313–7.
71. See note 32, Warwick 2010, at 317–8.
72. See note 32, Warwick 2010, at 321.
73. Zenor, J. Endowed by their creator with certain rights: The future rise of civil rights for artificial intelligence? Savannah Law Review 2018;5(1):119.
74. Andrade, F, Novais, P, Machado, J, Neves, J. Contracting agents: Legal personality and representation. Artificial Intelligence and Law 2007;15:364.
75. Arnold, BB, Gough, D. Turing’s people: Personhood, artificial intelligence and popular culture. Canberra Law Review 2017;15(1):1.
76. Brozek, B, Jakubiec, M. On the legal responsibility of autonomous machines. Artificial Intelligence and Law 2017;25:293.
77. See note 76, Brozek, Jakubiec 2017, at 301–2.
78. See note 76, Brozek, Jakubiec 2017, at 299.
79. See note 37, Pagallo, Corrales, Fenwick, Forgó 2018, at 5.
80. See note 24, Himma 2009, at 23.
81. See note 24, Himma 2009, at 25, 27–8.
82. See note 32, Warwick 2010, at 324.
83. Joint, A. Laws of robotics: After Asimov. Law Society Gazette 2017 Feb 13; In Practice: Technology.
84. Pagallo, U. The Laws of Robots: Crimes, Contracts, and Torts. Dordrecht: Springer; 2013:25.
85. See note 29, van den Hoven van Genderen 2018, at 16.
86. Nasarre-Aznar, S. Ownership at stake (once again): Housing, digital contents, animals and robots. Journal of Property, Planning and Environmental Law 2018;10(1):69.