
7 - Privacy

from Section IA - Concepts

Published online by Cambridge University Press:  09 June 2021

Graeme Laurie, Edward Dove, Agomoni Ganguli-Mitra, Catriona McMillan, Emily Postan, Nayha Sethi and Annie Sorbie
Affiliation: University of Edinburgh

Summary

Privacy is a well-established element of the governance and narrative of modern society. In research, it is a mainstay of good and best practice; major research initiatives all speak of safeguarding participants’ rights and ensuring ‘privacy protecting’ processing of personal data. However, while privacy protection is pervasive in modern society and is at the conceptual heart of human rights, it remains nebulous in character. For researchers who engage with people in their studies, the need to respect privacy is obvious, yet how to do so is less so. This chapter offers first an explanation of why privacy is a difficult concept to express, how the law approaches the concept, and how it might be explored as a broader normative concept that can be operationalised by researchers. In that broad scheme, I show how individuals respond to the same privacy situation in different ways – that we have a range of privacy sensitivities. I then examine four elements of privacy in the law: human rights, privacy in legal theory, personal data protection and consent. Finally, I consider how law participates in the broader normative understanding of privacy as the private life lived in society.

Type: Chapter
Information
Publisher: Cambridge University Press
Print publication year: 2021
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution-NonCommercial licence (CC BY-NC 4.0), https://creativecommons.org/licenses/by-nc/4.0/

7.1 Introduction: The Modern Difficulty Footnote 1

Privacy is a well-established element of the governance and narrative of modern society. In research, it is a mainstay of good and best practice; major research initiatives all speak of safeguarding participants’ rights and ensuring ‘privacy protecting’ processing of personal data. However, while privacy protection is pervasive in modern society and is at the conceptual heart of human rights, it remains nebulous in character. For researchers who engage with people in their studies, the need to respect privacy is obvious, yet how to do so is less so. This chapter offers first an explanation of why privacy is a difficult concept to express, how the law approaches the concept and how it might be explored as a broader normative concept that can be operationalised by researchers. In that wider scheme, I show how individuals respond to the same privacy situation in different ways – that we have a range of privacy sensitivities. I then examine four elements of privacy in the law: human rights, privacy in legal theory, personal data protection and consent. Finally, I consider how law participates in the broader normative understanding of privacy as the private life lived in society.

7.2 Privacy as a Normative Difficulty

A good starting point is to ask: what do we mean when we talk about ‘privacy’? It would be difficult for a modern research project to suggest that it was not ‘privacy respecting’ or ‘privacy preserving’. However, the concept is somewhat ill-defined, and that claim to be privacy respecting or preserving might, in reality, add little to the protection of individuals. In part, this problem stems from the colloquial, cultural aspect of the concept: we each have our own idea of what constitutes our privacy – our private space.

Imagine setting up a new data-sharing project. You hypothesise that linking data that different institutions already gather could address a modern health problem – say, the growth of obesity and type 2 diabetes. Such data, current and historical, could be used by machine learning to create and continuously revise algorithms that help identify and ‘nudge’ those at risk of developing the condition or disease. The already-gathered data could come from general practitioners and hospitals, supermarkets and banks, gym memberships, and health and lifestyle apps on smart phones, watches and other ‘wearables’. But how would individuals’ privacy be protected within such a project? Many will be uneasy about such data being stored in the first place, let alone about its retention and linkage for this purpose. Many will see that there might be a benefit, but would want to be convinced of the technical safeguards before opting into such a project. Many will be happy, having ‘nothing to hide’ and seeing the benefits for their health through such an app. Some would see the initiative as socially desirable – perhaps as part of one’s general duty and the basis of personalised medicine – so that such processing would be a compulsory part of registration for healthcare: an in-kind payment to the healthcare system alongside financial payments, necessary for the continued development of modern healthcare as a general societal and personal good.
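
By way of illustration only, the sketch below shows one way such a project might link pseudonymised records from two of the sources mentioned above before any modelling takes place. The field names, the salt handling and the hashing scheme are assumptions made for this example, not a description of any real system or of a recommended design.

```python
# Illustrative sketch only: pseudonymised linkage of records from two hypothetical
# sources (GP records and lifestyle-app data) before any risk modelling.
# The field names, salt handling and hashing scheme are assumptions for this
# example, not a description of any real project.
import hashlib


def pseudonym(identifier: str, salt: str) -> str:
    """Derive a stable pseudonym so records can be linked without keeping the raw identifier."""
    return hashlib.sha256((salt + identifier).encode("utf-8")).hexdigest()


def link_sources(gp_records: list[dict], app_records: list[dict], salt: str) -> list[dict]:
    """Join the two sources on the derived pseudonym, keeping only the fields the study needs."""
    app_by_pid = {pseudonym(r["patient_id"], salt): r for r in app_records}
    linked = []
    for rec in gp_records:
        pid = pseudonym(rec["patient_id"], salt)
        if pid in app_by_pid:
            linked.append({
                "pid": pid,  # pseudonym rather than the raw identifier
                "hba1c": rec["hba1c"],  # clinical marker from the GP source
                "weekly_active_minutes": app_by_pid[pid]["active_minutes"],
            })
    return linked
```

Even in such a sketch, the questions of this chapter remain: a stable pseudonym still permits linkage, so whether this counts as ‘privacy preserving’ depends on the very sensitivities described above.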

Our difficulty is that each one of the people taking these different positions would see their response as a ‘privacy preserving’ stance.Footnote 2 As explored elsewhere in this volume, this observation underlines the diversity of ‘publics’ and their views (see Aitken and Cunningham Burley, Chapter 11, and Burgess, Chapter 25). Under the label ‘privacy’ there is a wide spectrum of positions: from the enthusiastic adopter who would make participation compulsory for all, through those who would allow people to opt out or who would leave participation to opting in, to those who want nothing to do with such projects. How then can a researcher frame a ‘privacy’ policy for their research? Are we creating the problem by using the term ‘privacy’ informally and colloquially? Does the law provide a definition of the term that avoids or militates against the problem?

7.3 Privacy as a Human Right

A logical starting point might be human rights law, in which privacy and the right to respect for private life are enshrined. Unfortunately, human rights law gives little assistance in defining those rights. Two examples show the common problem clearly.Footnote 3 Article 12 of the Universal Declaration of Human Rights states:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.Footnote 4

Article 8 of the European Convention on Human Rights creates the right in this way:

  1. Everyone has the right to respect for his private and family life, his home and his correspondence.

  2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.Footnote 5

Two observations can be made about these ‘privacy’ rights: (1) privacy is not an absolute right, i.e. there are always exceptions; and (2) ‘privacy’ and ‘respect for private life’ require a great deal of further definition to make them operational. As to the first observation, the rights are held in relation to the competing rights of others: a right against ‘arbitrary interference’, and ‘no interference … except such’. The concepts of privacy in human rights legislation acknowledge that the rights are held in balance between members of society; privacy is not absolute, because on occasion one has to give way to the needs of others.

As to the content of privacy – and reflecting the broad conceptualisations in the research project example above – we see that what is available from the human right to privacy is international recognition of a space where an individual can exist, free from the demands of others; there is a normative standard that recognises that people must be respected as individuals.

The European Court of Human Rights has ruled extensively on the human right to respect for private life, creating a line of caselaw. This produces a canon of decisions settling particular disputes where the parties have been unable to resolve their conflict between themselves. However, does that line of cases produce a normative definition of privacy, i.e. one that sits with and accommodates the range of sensitivities expressed above? I think not. A courtroom determination arguably defines a point on the range of sensitivities as ‘privacy’, pragmatically, for the parties before the court. Our problem comes when we try to use caselaw as indicative of anything more than how judges resolve conflicts between intractable parties when a privacy right is engaged. Does this mean the law adds little to the broader normative question of how we, as researchers, should respect the privacy of those with whom we engage in our work?

Two North American contributions could help us to understand this. The first expression of a legal right to privacy is usually recognised as Warren and Brandeis’ 1890 argument that individuals have the right to be left alone.Footnote 6 Read today, their paper resonates with current concerns: technological developments and increasing press prurience required a right to be ‘left alone’. In the modern context of genetics, Allen proposes a broader typology of privacy: ‘physical privacy’, ‘proprietary privacy’, ‘informational privacy’ and ‘decisional privacy’.Footnote 7 The first two, which may seem strange ‘privacies’ today, are where Warren and Brandeis saw the Common Law as having reached by 1890: law protects individuals’ physical privacy through consent, and private property law is equally well established. Warren and Brandeis identified ‘informational privacy’ – and what might be described as ‘reputational privacy’ – as the area where the law needed to develop in 1890. Allen, in discussing the right to ‘decisional privacy’, pointed to the vast and compelling literature around the woman’s right to choose. Legal theory, in part, responds to current privacy issues. Today, two major privacy issues in research are personal data protection and informed consent.

7.4 Privacy in Specific Legal Responses: Personal Data Protection and Informed Consent

The development of the automated processing of personal data has focussed privacy, at least in part, around ‘informational privacy’.Footnote 8 The Organization for Economic Co-operation and Development (OECD) Guidelines on the Protection of Privacy and Transborder Flows of Personal Data set an international standard in 1980 that remains at the core of data protection law.Footnote 9 The Guidelines have been transposed into regional and national laws.Footnote 10

Data protection, as an expression of an area of privacy, seeks to balance a variety of interests in the processing of personal data within the non-absolute nature of privacy; the object of data protection is to create the legal conditions under which it is possible and appropriate to process personal data. Taking the European Union General Data Protection Regulation 2016/679 (GDPR) as an example, data protection law has four elements: data protection principles;Footnote 11 legal bases for processing personal data;Footnote 12 information that must be given to the data subject;Footnote 13 and the rights of the data subject.Footnote 14 Each element contains a balance of interests.

For stand-alone research with human participants contacted directly by the researcher, the route through the GDPR is clear. Security and data minimisation standards (i.e. gathering, analysing and keeping only the data necessary for the purpose of the project, and only for as long as that purpose requires) are clear; data subjects can be fully informed about the project, and data subjects’ rights can be respected. More complex data-sharing methodologies – perhaps the project envisaged in Section 7.2 above – are more difficult to negotiate through the GDPR. Are the original consents valid for the new processing? Was the consent too broad to meet the GDPR’s requirements? Might the processing be compatible with the original purpose for which the data were gathered? Could the new processing be in the public interest? How should data subjects be informed about the proposed new project? Each of these questions is open to debate under the GDPR. And the problem remains: how can the lack of definitional clarity in the rights be resolved in a way that accommodates all the positions on the spectrum of interests indicated in Section 7.2 above? One could say that law must produce a working definition and that, in a democracy, not all differences can be accommodated, so some will be disappointed. However, the sensitivity of the data in the example above shows the danger that those who fall outside the working definition of privacy will be alienated from participating in key areas of social life, perhaps even avoiding interaction with, say, health research or medical services, to their detriment.
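
As a minimal sketch of what data minimisation and time-limited retention can look like in practice for such a stand-alone study, the following assumes a hypothetical set of permitted fields and a hypothetical retention period; the GDPR prescribes neither, and the names used here are assumptions for illustration only.

```python
# Illustrative sketch of data minimisation and time-limited retention for a
# stand-alone study. The permitted fields and the retention period are
# assumptions for this example; the GDPR does not prescribe either.
from datetime import date, timedelta

PERMITTED_FIELDS = {"pid", "age_band", "hba1c"}  # only what the stated research purpose needs
RETENTION_PERIOD = timedelta(days=5 * 365)       # hypothetical project retention period


def minimise(record: dict) -> dict:
    """Keep only the fields the research purpose requires; everything else is dropped."""
    return {key: value for key, value in record.items() if key in PERMITTED_FIELDS}


def due_for_deletion(collected_on: date, today: date) -> bool:
    """Flag a record whose retention period has expired so it can be deleted."""
    return today - collected_on > RETENTION_PERIOD
```

In an actual project, the field list and retention period would be fixed by the stated purpose and the study protocol rather than chosen in code.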

If one of the current legal discussions is around informational privacy, the other is around decisional privacy. Informed consent is a legal mechanism to protect decisional privacy, not just in research but across consumer society. The right of individual adults to make their own choices is largely unchallenged.Footnote 15 The choices must be free and informed. The question is: how informed must a choice be to qualify as a valid choice by an individual? In many modern biomedical research methodologies, this is contested. A biobank, where data are gathered for the purpose of providing datasets for future, as yet undefined, research projects, depends on creating the biobank at the outset through a ‘broad’ consent. How, though, can an individual be said to give ‘informed consent’ if the purposes for which consent is sought cannot be explained in detail? How can a consent that is ‘for research’ be specific enough to be an adequate safeguard of privacy interests? (See, for example, Kaye and Prictor, Chapter 10, in this volume for a specific discussion of consent in this context.)

The privacy issue is: what constitutes sufficient information upon which a participant can base her choice? Two conditions have to be satisfied: the quality of the information that will be made available must be assured; and it must be settled who determines the amount of information that is necessary to underpin a decision. A non-specialist participant is not necessarily in a position to judge the first condition. That is the role of independent review boards, which stand as a proxy for the participant in assessing the quality and trustworthiness of the scientific and methodological information that will be offered to the participant. As to the second condition, what counts as sufficient information upon which to make a decision is a matter for the individual participant to determine, and should not be seen as part of the role of the ethics committee, researcher or any other body. The purpose of informed consent is to protect the individual participant from, essentially, paternalism – the usurping of the participant’s free choice of whether or not to participate (unless the decision is palpably to the detriment of an individual who is not deemed competent to make a choice). Therefore, in the general case, it is inappropriate to remove from the particular person the determination of what information is sufficient to inform them, or to determine for them which considerations are appropriate or inappropriate to bring to the decision-making process. This seems crucial in ensuring an individual’s decisional privacy – the extent of the right to make decisions for oneself.

7.5 Realising Privacy in Modern Research Governance

So far I have made two claims about privacy. First, individuals hold a range of sensitivities about their privacy (and we could add that this is a dynamic balance, depending also upon the relationships between individuals and the emotional setting or moment of the relationship).Footnote 16 Second, the law produces a mechanism for resolving conflicts that fall within its definition of privacy, but it does not provide a complete normative definition of privacy that meets all the social functions required of the concept – the functions that will confront researchers negotiating privacy relationships with their participants. Two observations might help to locate our thinking at this point. First, no discipline offers a complete normative definition of privacy that satisfactorily captures its dynamic nature. There are many different definitions and conceptualisations, but there is no granular agreement on the normative question: what ought I to understand as ‘my permitted private life’?Footnote 17 Second, the presentation so far might appear to suggest that privacy is a matter of individual autonomy, set in opposition to society. In the remainder of the chapter I will argue that this is not the case, by exploring how privacy might be operationalised – in our case, in research. The question is: what tools can we use to understand our relationships as individuals in society?

To do this, I suggest three areas that can usefully be considered both by researchers (and participants) in the particular circumstances of a research project, and by society in trying to understand the conceptualisation of privacy in modern society: the public interest, confidentiality and discourse.

The public interest – the common good – as a measure of solidarity is very attractive. It directly addresses the range-of-sensitivities problem to which I refer throughout the chapter (see Kieslich and Prainsack, Chapter 5, and Sorbie, Chapter 6, in this section, and Taylor and Whitton, Chapter 24, later in this volume). Appealing to the public interest is a practical mechanism that answers the individual’s privacy sensitivities with the following: whatever you believe to be your privacy, these are the supervening arguments why, for example, you should let me stand on your land or use your personal data (your privacy has to accommodate these broader needs of others). The difficulty with the public interest is that it seems itself to have no definition or internal rules. Appeals to the public interest seem to be constructed loosely through a utilitarian calculus: the greatest utility for the greatest number. Mill himself identifies the problem: the tyranny of the majority. The problem has two elements. The claim to ‘supervening utility’ can itself seem a subjective claim, so those in the minority, suffering the consequences of a loss of amenity (in this case the breach of their privacy), are not immediately convinced of the substance of the argument. And the construction does not balance the magnitude of the loss to the individual against the benefit (or avoided loss) to another individual; rather, the one stands against the many. This is not particularly satisfactory, especially when one links it back to the fundamental nature of the breach and the sense that a loss of privacy cuts to the personhood of the individual. Adopting the arguments of Arendt, we might phrase this more strongly. Arendt identifies the individual as constituted in two parts: the physical and the legal. In her studies of totalitarianism, she finds that tyrannies occur where the two parts of the individual are separated by bureaucracies and the legal individual is forgotten. Left with only the physical individual, the human is reduced to an expendable commodity.Footnote 18 Simple appeals to the public interest could be in danger of overlooking the whole individual and of alienating those whose rights are removed in the name of the public interest or common good.

Another way of constructing the appeal to the public interest is through deontological rather than consequentialist theories, particularly those of Kant and Rawls. Taking Kant,Footnote 19 a first step would be to consider the losses to the individuals involved – the person who stands to lose their privacy rights, and a person who would suffer a loss if that privacy were not breached. A second step would be to require each of those individuals to consider their claim to privacy through the lens of the second formulation of the Categorical Imperative – that one should treat others as ends in themselves, not merely as means to one’s own ends.Footnote 20 Because privacy is not an absolute right, when making such a claim we must each ask: do I merely instrumentalise the other person in the balance by making this privacy claim? This is a matter of fact: which of us will suffer most? A third step is to acknowledge that the law can require me to adopt that choice if I fail to make it for myself, as it is the choice I should have made unprompted (I can argue that the calculation on the facts is incorrect, but not that the calculation ought not to be made). Rawls might construct it slightly differently: although I might prefer a particular action that preserves my privacy, I must accept the breach of my privacy as reasonable in the circumstances. Behind his ‘veil of ignorance’, not knowing my potential status in society, I must adopt the measure that protects the least-well-off member of society when the decision is made.Footnote 21

In the example raised in Section 7.2, using this public interest consideration helps to reconcile the range-of-sensitivities problem. As a researcher trying to design privacy safeguards, I can use these calculations to evaluate the risks and benefits identifiable in the research, and then present the evaluation to participants and regulators. The public interest creates a discourse that steps outside self-interest. However, this sets off a klaxon: the public interest, it might be objected, is not antithetical to privacy, as presented here; the public interest is part of privacy. And I agree. I am suggesting that using public interest arguments is a mechanism for defining the relationship of the individual to others (that is, to other individuals). The result is not that the public interest ‘breaches’ the privacy of the participant, but that it helps to define the individual’s privacy in relation to others (for the individual, and for other people and institutions). It brings to the subjectivity of the dynamic range of sensitivities (which I identified as an issue at the outset) the solidarity and community that are also part of one’s privacy.Footnote 22 This holistic understanding of privacy – as a private life lived in community, not reducible to a simple autonomy-based claim – is best explored in Laurie’s ‘spatial’, psychological privacy.Footnote 23

Confidentiality is a second legal tool to ensure participants’ rights are safeguarded. Arguably, it is a more practical tool or concept for researchers than privacy. Confidentiality earths abstract privacy concepts in actionable relationships and duties. Taking Common Law confidentiality as an example, it is constructed either expressly, as a contractual term, or it is implied into the conduct of a contract or through equity into the relationship between individuals.Footnote 24 Confidentiality depends on concrete, known parameters of the relationship, or parameters that one ought, in good conscience, to have known. Like data protection, it does not prohibit behaviour; rather confidentiality creates an environment in which particular behaviours can occur. This is important in the context of health research regulation because many potential research participants will be recruited through the professional relationships they enjoy with healthcare professionals; it is a tool that can be extended into other researcher–participant relationships. Confidentiality and the trust-based nature of that relationship can both help with recruitment and provide a welcome degree of reassurance about privacy protection.

Finally, and implicit throughout the operationalisation of privacy, privacy is a negotiated space that requires public engagement through discourse. Discourse ethics has a modern iteration, but a long history. The virtue ethics of Aristotle and Ancient Greek philosophy is dependent on the identification of the extent and nature of the virtues and their application in human life; Shaftesbury’s early enlightenment ‘politeness’Footnote 25 and the salons of the Age of Reason again ground the discussion of the questions, ‘who are we, and how ought we to behave?’ in public, albeit intellectual, discourse; today, Habermas et al. advocate this inclusion as a part of a participative democracy, perhaps reiterating the central arguments of the early Frankfurt School against the false consciousness of the Culture Industry.Footnote 26 The thrust of this whole chapter is that privacy must be debated and understood in the lives of individuals; universities, professional bodies, and ethics committees must facilitate conversations that empower individuals to realise their decisional privacy in making choices about the nature of their participation in society.Footnote 27

7.6 Conclusion

This chapter has focused on different aspects of a conceptual problem raised in relation to a modern research dilemma: how do we negotiate privacy-protecting research where individuals hold a dynamic range of sensitivities about their relationships to others in society? We have seen that human rights law does not present granular definitions of privacy, that courts use privacy concepts to resolve particular disputes, and that attempts in legal theory and in specific areas of law (personal data protection and informed consent) do not fill the conceptual gaps. The argument I advance is that using the public interest, confidentiality and public engagement discourse in constructing research protocols will go some way towards addressing those gaps. It will also strengthen the relationship between researchers and the public they seek to engage and to serve, and could facilitate a greater understanding of the methods and objectives of science.

Footnotes

1 I am grateful to Graeme Laurie, Annie Sorbie and all the editors and colleagues who commented on this chapter. Errors are mine.

2 See the range of sensitivities expressed in public opinion surveys about privacy. For example, the Eurobarometers on data protection, Eurobarometers numbers 147 and 196 (2003), 225 and 226 (2008), 359 (2011), and 431 (2015), and on biotechnology, Eurobarometers numbers 61 (1991), 80 (1993), 108 (1997), 134 (2000), 177 (2003), 244b (2006), and 341 (2010), all available at ‘Public Opinion’, (European Union), www.ec.europa.eu/commfrontoffice/publicopinion/index.cfm.

For a discussion of a broader literature, see D. Townend et al., ‘Privacy Interests in Biobanking: A Preliminary View on a European Perspective’ in J. Kaye and M. Stranger (eds), Principles and Practice in Biobanking Governance (Farnham: Ashgate Publishing Ltd., 2009), pp. 137–159.

3 See also, Articles 7, 8 and 52 of the European Union, Charter of Fundamental Rights of the European Union, 26 October 2012, 2012/C 326/02.

4 UN General Assembly, ‘Universal Declaration of Human Rights’, 10 December 1948, 217 A (III).

5 Council of Europe, European Convention for the Protection of Human Rights and Fundamental Freedoms, as amended by Protocols Nos 11 and 14, 4 November 1950, ETS 5.

6 S. D. Warren and L. D. Brandeis, ‘The Right to Privacy’, (1890) Harvard Law Review, 4(5), 193–220.

7 A. L. Allen, ‘Genetic Privacy: Emerging Concepts and Values’ in M. A. Rothstein (ed.) Genetic Secrets: Protecting Privacy and Confidentiality in the Genetic Era (New Haven: Yale University Press, 1997), pp. 31–60.

8 See the tone, for example, of the Council of Europe website, where the focus is on privacy of personal data. ‘Council of Europe Data Protection Website’, (Council of Europe), www.coe.int/en/web/data-protection

9 Organization for Economic Co-operation and Development, ‘OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data’, (OECD, 1980). See also, OECD, ‘The OECD Privacy Framework’, (OECD, 2013).

10 See, for example, Council of Europe Convention 108; European Union Directive 95/46/EC replaced by the General Data Protection Regulation 2016/679.

11 GDPR, Article 5.

12 GDPR, Articles 6 and 9.

13 GDPR, Articles 13 and 14.

14 GDPR, Articles 15–22.

15 Although it is not an absolute right. See, for example, A. Smith, The Theory of Moral Sentiments (1759) or J. S. Mill, On Liberty (1859).

16 See M. J. Taylor, Genetic Data and the Law: A Critical Perspective on Privacy Protection (Cambridge University Press, 2012).

17 This can be seen in privacy debates in other academic disciplines. See, for example, J. DeCew, ‘Privacy’, (The Stanford Encyclopedia of Philosophy, Spring 2018 Edition), E. N. Zalta (ed.), www.plato.stanford.edu/archives/spr2018/entries/privacy/; A. Westin, Privacy and Freedom (New York: Atheneum, 1967); A. Westin, ‘Social and Political Dimensions of Privacy’, (2003) Journal of Social Issues, 59(2), 431–453.

18 H. Arendt, The Human Condition (Chicago University Press, 1958).

19 I have developed this idea previously: D. Townend, ‘Privacy, Politeness and the Boundary Between Theory and Practice in Ethical Rationalism’ in P. Capps and S. Pattinson (eds), Ethical Rationalism and the Law (Oxford: Hart Publishing, 2017), pp. 171–189.

20 I. Kant, Groundwork of the Metaphysics of Morals (1785). See M. Rohlf, ‘Immanuel Kant’, (The Stanford Encyclopedia of Philosophy, Spring 2020 Edition), E. N. Zalta (ed.), www.plato.stanford.edu/archives/spr2020/entries/kant/ (section 5.4).

21 J. Rawls, A Theory of Justice (Cambridge, MA: Belknap Press, 1971, Revised Edition 1999). See L. Wenar, ‘John Rawls’, (The Stanford Encyclopedia of Philosophy, Spring 2017 Edition), E. N. Zalta (ed), www.plato.stanford.edu/archives/spr2017/entries/rawls/.

22 And this is what the court advocates in W v. Egdell [1989] EWCA Civ 13 and is arguably the purpose of the derogations in human rights law discussed above.

23 G. Laurie, Genetic Privacy: A Challenge to Medico-Legal Norms (Cambridge University Press, 2002).

24 Francome v. Mirror Group Newspapers Ltd [1984] 1 WLR 892 (UK); Campbell v. MGN Ltd [2004] UKHL 22. See Taylor and Whitton, Chapter 24, in this volume.

25 A. A. Cooper, Third Earl of Shaftesbury, Characteristics of Men, Manners, Opinions, Times, L. E. Klein (ed.), (Cambridge University Press, 1999); L. Klein, Shaftesbury and the Culture of Politeness: Moral Discourse and Cultural Politics in Early Eighteenth-Century England (Cambridge University Press, 1994).

26 See, for example, J. Habermas, Between Facts and Norms: Contributions to a Discourse Theory of Law and Democracy (tr. W. Rehg) (Cambridge, US-MA: MIT Press, 1996) (originally published in German, 1992); M. Horkheimer and T. W. Adorno, Dialectic of Enlightenment (tr. J. Cumming) (New York: Herder and Herder, 1972) (original publication in German, 1944).

27 Townend, ‘Privacy’.
