
Part I - Relocating the Dead-End

Published online by Cambridge University Press:  18 February 2021

Elizabeth T. Hurren, University of Leicester

Hidden Histories of the Dead: Disputed Bodies in Modern British Medical Research, pp. 1–100
Publisher: Cambridge University Press
Print publication year: 2021
Creative Commons
This content is Open Access and distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (CC BY-NC-ND 4.0), https://creativecommons.org/licenses/by-nc-nd/4.0/

Our Dead Are Never Dead To Us,

Until We Have Forgotten Them

[Adam Bede, George Eliot, 1819–1880]

Introduction: A Consignment for the Cul-de-Sac of History?

At the heart of modern conceptions of biomedicine sits a core narrative of ‘progress’, one in which profound scientific breakthroughs from the nineteenth century onwards have cumulatively and fundamentally transformed the individual life course for many patients in the global community. Whilst there remain healthcare inequalities around the world, science has endeavoured to make medical breakthroughs for everybody. Thus for many commentators it has been vital to focus on the ends – the preservation or extension of life and the reduction of human suffering emerging out of new therapeutic regimes – and to accept that the accumulation of past practice cannot be judged against the yardstick of the most modern ethical values. Indeed, scientists, doctors and others in the medical field have consistently tried hard to follow ethical practices even when the law was loose or unfocussed and public opinion was supportive of an ends rather than means approach. Unsystematic instances of poor practice in research and clinical engagement thus had (and have) less contemporary meaning than larger systemic questions of social and political inequalities for the living, related abuses of power by states and corporate entities in the global economy, and the suffering wrought by cancer, degenerative conditions and antibiotic-resistant diseases. Perhaps unsurprisingly given how many patients were healed, there has been a tendency in recent laboratory studies of the history of forensic science, pathology and transplant surgery to clean up, smooth over and thus harmonise the medical past.1 Yet, these processes of ‘progress’ have also often been punctuated by scandals (historical and current) about medical experimentation, failed drug therapies, rogue doctors and scientists and the misuse of human research material.2 In this broad context, while the living do have a place in the story of ‘progress’, it is the bodies of the dead which have had, and continue to have, a central role. They are a key component of medical training and anatomical teaching, provide the majority of resources for organ transplantation and (through the retention and analysis of organs and tissue) constitute one of the basic building blocks of modern medical research. For many in the medical sciences field, the dead could and should become bio-commons given the powerful impact of modern degenerative and other diseases, accelerating problems linked to lifestyle, and the threats of current and future pandemics. Yet, equally, inside the medical research community there remain many neglected hidden histories of the dead that are less understood than they should be in global medicine, and for this reason they are central to this new book.

Such perspectives are important. On the one hand, they key into a wider sense that practice in medical science should not be subject to retrospective ethical reconstruction. On the other hand, it is possible to trace a range of modern challenges to the theme of ‘progress’, the ethics of medical research and practice, as well as the scope and limits of professional authority. These might include resistance to vaccination, scepticism about the precision of precision medicine, an increasing willingness to challenge medical decisions and mistakes in the legal system, accelerating public support for assisted dying, and a widening intolerance of the risks associated with new and established drugs. Nowhere is this challenge more acute than in what historians broadly define as ‘body ethics’. By way of recent example, notwithstanding the provisions of the Human Tissue Act (Eliz. 2 c. 30: 2004) (hereafter HTA2004), the BBC reported in 2018 that the NHS had a huge backlog of ‘clinical waste’ because its sub-contracted disposal systems had failed.3 Material labelled ‘anatomical waste’ and kept in secure refrigerated units contained organs awaiting incineration at home or abroad. By July 2019, the Daily Telegraph revealed how such human waste, including body parts and amputations from operative surgeries, had been found in 100 shipping containers sent from Britain to Sri Lanka for clinical waste disposal.4 More widely, the global trade in organs for transplantation has come into increasingly sharp relief, while the supply of cadavers, tissue and organs for medical research remains contentious. Some pathologists and scientists, for instance, are convinced that HTA2004 stymied creative research opportunities.5 They point out that serendipity is necessary for major medical breakthroughs. Legislating against kismet may, they argue, have been counterproductive. Ethical questions around ‘whose body is it anyway?’ thus continue to attract considerable media publicity and often involve the meaning of the dead for all our medical futures.

Lately these ethical issues have also been the focus of high-profile discussion in the global medical community, especially amongst those countries participating in the International Federation of Associations of Anatomists (hereafter IFAA). It convened in Beijing, China, in 2014, where a new proposal promised ‘to create an international network on body donation’ with the explicit aim of providing practical ‘assistance to those countries with difficulties setting up donation programmes’.6 The initiative was developed by the Trans-European Pedagogic Anatomical Research Group (TEPARC), following HTA2004 in Britain, which had increased global attention on best practice in body donation. Under the TEPARC reporting umbrella, Beat Riederer remarked in 2015: ‘From an ethical point of view, countries that depend upon unclaimed bodies of dubious provenance are [now] encouraged to use these reports and adopt strategies for developing successful donation programmes.’7 Britain can with some justification claim to be a global leader in moving away from a reliance on ‘unclaimed’ corpses for anatomical teaching and research to embracing a system of body bequests based on informed consent. Similar ethical frameworks have begun to gain a foothold in Europe and East Asia, and are starting to have more purchase in Africa8 and North and South America too.9 Nonetheless, there is a long way to travel. As Gareth Jones explains, although ‘their use is far less in North America’ it is undeniable that ‘unclaimed corpses continue to constitute … around 20 per cent of medical schools’ anatomical programmes’ in the USA and Canada.10 Thus, the New York Times reported in 2016 that a new New York State law aimed to stop the use of ‘unclaimed’ corpses for dissection.11 The report came about because of a public exposé that the newspaper ran about the burial of a million bodies on Hart Island in an area of mass graves called Potter’s Field. Since 1980, the Hart Island Research project has found 65,801 ‘unclaimed’ bodies, dissected and buried anonymously.12 In a new digital hidden history project called the ‘Passing Cloud Museum’, their stories are being collected for posterity.13 This has some contemporary relevance, for during the Covid-19 pandemic the Hart Island pauper graveyard was re-opened by the New York public health authorities. Today, it once more contains contaminated bodies with untold stories about the part people played in medical ‘progress’. For the current reality is that ‘in some states of the US, unclaimed bodies are passed to state anatomy boards’. Jones thus points out that:

When the scalpel descends on these corpses, no-one has given informed consent for them to be cut up. … Human bodies are more than mere scientific material. They are integral to our humanity, and the manner in which this material is obtained and used reflects our lives together as human beings. The scientific exploration of human bodies is of immense importance, but it must only be carried out in ways that will enhance anatomy’s standing in the human community.14

In a global medical marketplace, then, the legal ownership of human material and the ethical conduct of the healthcare and medical sciences can twist and turn. But with the increasing reach of medical research and intervention, questions of trust, communication, authority, ownership and professional boundaries become powerfully insistent. As the ethicist Heather Douglas reminds us: ‘The question is what we should expect of scientists qua scientists in their behaviour, in their decisions as scientists, engaged in their professional life. As the importance of science in our society has grown over the past half-century, so has the urgency of this question.’ She helpfully elaborates:

The standard answer to this question, arising from the Freedom of Science movement in the early 1940s, has been that scientists are not burdened with the same moral responsibilities as the rest of us, that is, that scientists enjoy ‘a morally unencumbered freedom from permanent pressure to moral self-reflection’. … Because of the awesome power of science to change our world, our lives, and our conception of ourselves, the actual implementation of scientists’ general responsibilities will fall heavily on them. With full awareness of science’s efficacy and power, scientists must think carefully about the possible impacts and potential implications of their work. … The ability to do harm (and good) is much greater for a scientist, and the terrain almost always unfamiliar. The level of reflection such responsibility requires may slow down science, but such is the price we all pay for responsible behavior.15

Whether increasing public scepticism of experts and medical science will require a deeper and longer process of reflection and regulation is an important and interesting question. There is also, however, a deep need for historical explorations of these broad questions, and particularly historical perspectives on the ownership and use of, authority over and ethical framing of the dead body. As George Santayana reminds us, we must guard against either neglecting a hidden scientific past or embellishing it, since each generic storyline is unlikely to provide a reliable future guide –

Progress, far from consisting in change, depends on retentiveness. When change is absolute there remains no being to improve and no direction is set for possible improvement: and when experience is not retained … infancy is perpetual. Those who cannot remember the past are condemned to repeat it.16

Against this backdrop, in his totemic book The Work of the Dead, Thomas Laqueur reminds us how: ‘the dead body still matters – for individuals, for communities, for nations’.17 This is because there has been ‘an indelible relationship between the dead body and the civilisation of the living’.18 Cultural historians thus criticise those in medico-scientific circles who are often trained to ignore or moderate the ‘work of the dead for the living’ in their working lives. Few appreciate the extent to which power relations, political and cultural imperatives and bureaucratic procedures have shaped, controlled and regulated the taking of dead bodies and body parts for medical research, transplantation and teaching over the longue durée. Yet our complex historical relationships with the dead (whether in culture, legislation, memory, medicine or science) have significant consequences for the understanding of current ethical dilemmas. Again, as George Santayana observed: ‘Our dignity is not in what we do, but in what we understand’ about our recent past and its imperfect historical record.19 It is to this issue that we now turn.

History and Practice

To offer a critique of the means and not the ends of medical research, practice and teaching through the lens of bodies and body parts is potentially contentious. Critics of the record of medical science are often labelled as neo-liberals, interpreting past decisions from the standpoint of the more complete information afforded by hindsight and judging people and processes according to yardsticks which were not in force or enforced at the time. Historical mistakes, practical and ethical, are regrettable but they are also explicable in this view. Such views underplay, however, two factors that are important for this book. First, there exists substantial archival evidence of the scale of questionable practice in medical teaching, research and body ethics in the past, but it has often been overlooked or ignored. Second, there has been an increasing realisation that the general public and other stakeholders in the past were aware of and contested control, ownership and use of bodies and body parts. While much weight has been given to the impact of very recent medical scandals on public trust, looking further back suggests that ordinary people had a clear sense that they were either marginalised in, or had been misinformed about, the major part their bodies played in medical ‘progress’. In see-saw debates about what medicine did right and what it did wrong, intensive historical research continues to be an important counterweight to the success story of biomedicine.

Evidence to substantiate this view is employed in subsequent chapters, but an initial insight is important for framing purposes. Thus, in terms of ownership and control of the dead body, it is now well established that much anatomy teaching and anatomical or biomedical research in the Victorian and Edwardian periods was dependent upon medical schools and researchers obtaining the ‘unclaimed bodies’ of the very poor.20 This past is a distant country, but under the NHS (and notwithstanding that some body-stock was generated through donation schemes promoted from the 1950s) the majority of cadavers were still delivered to medical schools from the poorest and most vulnerable sectors of British society until the 1990s. The extraordinary debt that modern society owes to these friendless and nameless people has until recently been one of the biggest untold stories in medical science. More than this, however, the process of obtaining bodies and then using them for research and teaching purposes raised and raises important questions of power, control and ethics. Organ retention scandals, notably at Liverpool Children’s Hospital at Alder Hey, highlighted the fact that bodies and body parts had been seen as a research resource on a considerable scale. Human material had been taken and kept over many decades, largely without the consent or knowledge of patients and relatives, and the scandals highlighted deep-seated public beliefs in the need to protect the integrity of the body at death. As Laqueur argues: ‘The work of the dead – dead bodies – is possible only because they remain so deeply and complexly present’ in our collective actions and sense of public trust at a time of globalisation in healthcare.21 It is essentially for this reason that a new system of informed consent, with an opt-in clause, in which body donation has to be a positive choice written down by the bereaved and/or witnessed by a person making a living will, was enshrined in HTA2004. Even under the terms of that act, however, it is unclear whether those donating bodies or allowing use of tissue and other samples understand all the ways in which that material might be recycled over time or converted into body ‘data’. Questions of ownership, control and power in modern medicine must thus be understood across a much longer continuum than is currently the case.

The same observation might be made of related issues of public trust and the nature of communication. There is little doubt that public trust was fundamentally shaken by the NHS organ retention scandals of the early twenty-first century, but one of the contributions of this book is to trace a much longer history of flashpoints between a broadly conceived ‘public’ and different segments of the medical profession. Thus, when a Daily Mail editorial asked in 1968 – ‘THE CHOICE: Do we save the living … or do we protect the dead?’ – it was crystallising the question of how far society should prioritise and trust the motives of doctors and others involved in medical research and practice.22 There was (as we will see in subsequent chapters) good reason not to, something rooted in a very long history of fractured and incomprehensible communication between practitioners or researchers and their patients and donors. Thus, a largely unspoken aspect of anatomical teaching and research is that some bodies, organs and tissue samples – identified by age, class, disability, ethnicity, gender, sexuality and epidemiology – have always been more valuable than others.23 Equally, when human harvesting saves lives, questions of the quality of life afterwards are often downplayed. The refinement of organ transplantation has saved many lives, and yet there is little public commentary on the impact of rejection drugs and the link between those drugs and a range of other life-reducing conditions. It was informative, therefore, that in the summer of 2016 the BBC reported that although many patients are living longer after a cancer diagnosis, the standard treatments they undergo have (and always have had) significant long-term side effects even in remission.24 These are physical – a runny nose, loss of bowel control, and hearing loss – as well as mental. Low self-esteem is common for many cancer sufferers. A 2016 study by Macmillan Cancer Support, highlighted in the same BBC report, found that of the ‘625,000 patients in remission’, the majority ‘are suffering with depression after cancer treatment’. We often think that security issues are about protecting personal banking on the Internet, preventing terrorism incidents and stopping human trafficking, but there are also ongoing biosecurity issues in the medical sciences concerning (once more) whose body and mind is it anyway?25

Other communication issues are easily identifiable. How many people, for instance, really understand that coroners, medical researchers and pathologists have relied on the dead body to demarcate their professional standing, and still do?26 In the past, to raise the status of the Coronial office (by way of example), there was a concerted campaign to get coroners, who were by tradition legally qualified, to become medically qualified. But to achieve that professional outcome, they needed better access to, and authority over, the dead. And how many people – both those giving consent for use of bodies and body parts and those with a vaguer past and present understanding of the processes of research and cause of death evaluation – truly comprehend the journey on which such human material might embark? In the Victorian and Edwardian periods, people might be dissected to their extremities, with organs, bodies and samples retained or circulated for use and re-use. Alder Hey reminded the public that this was the normative journey in the twentieth century too. Even today, Coronial Inquests create material that is passed on, and time limits on the retention of research material slip and are meant to slip, as we shall see in Part II. The declaration of death by a hospital doctor was (and is) often not the dead-end. As the poet Bill Coyle recently wrote:

The dead, we say, are departed. They
pass on, they pass away, they leave behind
family, friends, the whole of humankind –
They have gone on before. Or so we say.27

But, he asks, ‘could it be the opposite is true?’ To be alive is to experience a future tense ‘through space and time’. To be dead is all about the deceased becoming fixed in time – ‘while you stay where you are’, as the poet reminds us. Yet, this temporal dichotomy – the living in perpetual motion, the dead stock-still – has been and remains deceptive. Medical science and training rely, and have always relied, on the constant movement of bodies, body parts and tissue samples. Tracing the history of this movement is a key part of addressing current ethical questions about where the limits of that process of movement should stand, and thus is central to the novel contribution being made in this book.

A final sense of the importance of historical perspective in understanding current questions about body ethics can be gained by asking the question: When is a body dead? One of the difficulties in arriving at a concise definition of a person’s dead-end is that the concept of death itself has been a very fluid one in European society.28 In early modern times, when the heart stopped the person was declared dead. By the late-Georgian era, the heart and lungs had to cease functioning together before the person became officially deceased. Then by the early nineteenth century, surgeons started to appreciate that brain death was a scientific mystery and that the brain was capable of surviving deep physical trauma. Notions of coma, hypothermia, oxygen starvation and resuscitation, and their neurology, entered the medical canon. Across the British Empire, meantime, cultures of death and their medical basis in countries like India and on the African subcontinent remained closely associated with indigenous spiritual concepts of the worship of a deity.29 Thus, the global challenge of ‘calling the time of death’ started to be the subject of lively debates from the 1960s as intersecting mechanisms – growing world population levels, the huge costs of state-subsidised healthcare, the rise of do-not-resuscitate protocols in emergency medicine, and a biotechnological revolution that made it feasible to recycle human material in ways unimaginable fifty years before – gave questions such as when to prolong a whole life, and when to accept that the parts of a person are more valuable to others, a new focus and meaning. Simultaneously, however, the reach of medical technology in the twentieth century has complicated the answers to such questions. As the ability to monitor even the faintest traces of human life – chemically in cells, biologically in the organs and neurologically in the brain – became more feasible in emergency rooms and Intensive Care Units, hospital staff began to witness the wonders of the human body within. It turned out to have survival mechanisms seldom seen or understood.

In the USA, Professor Sam Parnia’s recent work has highlighted how calling death at twenty minutes in emergency room medicine has tended to be done for customary reasons rather than sound medical ones.30 He points out, ‘My basic message is this: The death we commonly perceive today … is a death that can be reversed’ and resuscitation figures tell their own story: ‘The average resuscitation rate for cardiac arrest patients is 18 per cent in US hospitals and 16 per cent in Britain. But at this hospital [in New York] it is 33 per cent – and the rate peaked at 38 per cent earlier this year.’31 More doctors now recognise that there is a fine line between peri-mortem (at or near the point of death) and post-mortem (being in death). And it would be a brave medic indeed who claimed always to know the definitive difference, because it really depends on how much the patient’s blood can be oxygenated to protect the brain from anoxic insults in trauma. Ironically, however, the success story of medical technology has started to reintroduce medical dilemmas with strong historical roots. An eighteenth-century surgeon with limited medical equipment in his doctor’s bag knew that declaring the precise time of death was always a game of medical chance. His counterpart, the twenty-first-century hospital consultant, is now equipped with an array of technology, but calling time still remains a calculated risk. Centuries apart, the historical irony is that in this grey zone, ‘the past may be dead’, but sometimes ‘it is very difficult to make it lie down’.32

In so many ways, then, history matters in a book about disputed bodies and body disputes. Commenting in the press on controversial NHS organ donation scandals in 1999, Lord Winston, a leading pioneer of infertility and IVF treatments, said:

The headlines may shock everyone, but believe me, the research is crucial. … Organs and parts of organs are removed and subjected to various tests – They are weighed and measured, pieces removed and placed under the microscope and biochemically tested. While attempts can be made to restore the external appearance of the body at the conclusion of a post-mortem, it is inevitable some parts may be occasionally missing.33

Winston admitted that someone of Jewish descent (as he was) would be upset to learn that a loved one’s body had been harvested for medical research without consent and that what was taken might not be returned. As a scientist, he urged people to continue to be generous in the face of a public scandal. He was, like many leading figures in the medical profession, essentially asking the public to act in a more enlightened manner than the profession had itself done for centuries. The sanctions embodied in HTA2004 – the Human Tissue Authority public information website explains, for instance, that: ‘It is unlawful to have human tissue with the intention of its DNA being analyzed, without the consent of the person from whom the tissue came’ – are a measure of the threat to public trust that Winston was prefiguring.34 But this was not a new threat. As one leading educationalist pointed out in a feature article for the BBC Listener magazine in March 1961: ‘Besides, there are very few cultural or historical situations that are inert’ – the priority, he argued, should be dismembering medicine’s body of ethics – comparable, he thought, to ‘corpses patiently awaiting dissection’.35 By the early twenty-first century it was evident that medical ethics had come to a crucial crossroads and the choice was clear-cut. Medicine had to choose either ‘proprietorial’ or ‘custodial’ property rights over the dead body, and to concede that the former had been its default position for too long.36 Phrases like ‘public trust’ could no longer simply be about paying lip service to public sensibilities, and there had been some recognition that the medical sciences needed to make a cultural transition in the public imagination from an ethics of conviction to an ethics of responsibility.37 Yet this transition is by no means complete. New legislation crossed a legal threshold on informed consent, but changing ingrained opinions takes a lot longer. And wider questions for both the public and scientists remain: Is the body ever a ‘dead-end’ in modern medical research? At what end-of-life stage should no more use be made of human material in a clinical or laboratory setting? Have the dead the moral right to limit future medical breakthroughs for the living in a Genome era? Would you want your body material to live on after you are dead? And if you did, would you expect that contribution to be cited in a transcript at an award ceremony for a Nobel Prize for science? Are you happy for that gift to be anonymous, for medical law to describe your dead body as abandoned to posterity? Or perhaps you agree with the former Archbishop of Canterbury, Dr Rowan Williams, Master of Magdalene College, Cambridge, who believes that ‘the dead must be named’ or else we lose our sense of shared humanity in the present?38

In this journey from proprietorial to custodial rights, from the ethics of conviction to an ethics of responsibility, and to provide a framework for answering the rhetorical questions posed above, history is important. And central to that history are the individual and collective lives of the real people whose usually hidden stories lie behind medical progress and medical scandal. They are an intrinsic aspect of a medical mosaic, too often massaged or airbrushed in a history of the body, because it seemed harder to make sense of the sheer scale of the numbers involved and their human complexities. Engaging the public today involves co-creating a more complete historical picture.

Book Themes

Against this backdrop, the primary purpose of this book is to ask what have often been uncomfortable questions about the human material harvested for research and teaching in the past. It has often been assumed (incorrectly) that the journey of such material could not be traced in the historical archives because once dead bodies and their parts had entered a modern medical research culture, their ‘human stories’ disappeared in the name of scientific ‘progress’. In fact, the chapters that follow are underpinned by a selection of representative case-studies focussing on Britain in the period 1945 to 2000. Through them, we can reconstruct, trace and analyse the multi-layered material pathways, networks and thresholds the dead passed through as their bodies were broken up in a complex and often secretive chain of supply. The overall aim is therefore to recover a more personalised history of the body at the end of life by blending the historical and ethical to touch on a number of themes that thread their way throughout Parts I and II. We will encounter, inter alia, notions of trust and expertise; the problem of piecemeal legislation; the ambiguities of consent and the ‘extra time’ of the dead that was created; the growth of the Information State and its data revolution; the ever-changing role of memory in culture; the shifting boundaries of life and death (both clinically and philosophically); the differential power relations inside the medical profession; and the nature and use of history itself in narratives of medical ‘progress’. In the process, the book moves from the local to the national and, in later chapters, to the international, highlighting the very deep roots of concerns over the use of the dead which we casually associate with the late twentieth century.

Part II presents the bulk of the new research material, raising fundamental historical questions about: the working practices of the medical sciences; the actors, disputes and concealments involved; the issues surrounding organ donation; how a range of professionals inside dissection rooms, Coronial courts, pathology premises and hospital facilities often approached their workflows in an ahistorical way; the temporal agendas set by holding on to research material as long as possible; the extent to which post-war medical research demanded a greater breaking up of the body compared to the past; and the ways that the medical profession engaged in acts of spin-doctoring at difficult moments in its contemporary history. Along the way, elements of actor network theory are utilised (an approach discussed in Chapter 1). This is because the dead passed through the hands and knowledge of a range of actors, including hospital porters, undertakers, ambulance drivers, coroners, local government officials registering death certificates, as well as those cremating clinical waste, not all of whom are currently understood as agents of biomedicine. The chapters also invite readers of this book to make unanticipated connections from core questions of body ethics to, for instance: smog and air pollution, networks between institutions and the deceased, and the cultural importance of female bodies to dissection. These perspectives are balanced by taking into account that medical scientists are complex actors in their own right too, shaped by social, cultural, political, economic and administrative circumstances that are sometimes in their control and sometimes not. In other words, this book is all about the messy business of human research material and the messy inside stories of its conduct in the modern era.

In this context, three research objectives frame Chapters 4 to 6. The first is to investigate how the dead passed along a complex chain of material supply in twentieth-century medicine and what happened at each research stage, highlighting why those post-mortem journeys still matter for the living, because they fundamentally eroded trust in medicine in a way that continues to shape public debates. We thus begin in Chapter 4 with a refined case-study analysis of the human material that was acquired by, or donated to, the dissection room of St Bartholomew’s Hospital in London from the 1930s to the 1970s. Since Victorian times, it has been the fourth largest teaching facility in Britain. Never-before-seen data on dissections and their human stories reconnect us to hidden histories of the dead generated on the premises of this iconic place to train in medicine, and to wider historical lessons in an era when biomedicine moved centre-stage in the global community.

Second, we then take a renewed look at broken-up bodies and the muddled bureaucracy that processed them. This human material was normally either dispatched using a bequest form from the mid-1950s or, more usually, acquired from a care home or hospice because the person had died without close relatives, and it was not always subject to the same rigorous audit procedures. What tended to happen to these body stories is that they arrived in a dissection room or research facility with a patient case note, and then clinical mentalities took over. In the course of this, little consideration was given to the fact that processes of informed consent (by hospitals, coroners, pathologists and transplant teams) were not as transparent as they should have been; some parts of the body had been donated explicitly (on kidney donor cards) and others not (such as the heart). Effectively, the ‘gift’ became piecemeal, even before the organ transplantation, dissection or further research study got under way. Frequently, bureaucracy de-identified and therefore abridged the ‘gift exchange’. Human connections were thus consigned to the cul-de-sac of history. This is a physical place (real, rather than imagined) inside medical research processes where the human subjects of medical ‘progress’ often got parked out of sight of the general public. The majority were labelled as retentions and refrigerated for a much longer period of time than the general public realised, sometimes up to twenty-five years. This is not to argue that these retentions were necessarily an inconvenient truth, a professional embarrassment or part of a conspiracy with Big Pharma. Rather, retentions reflected the fact that the promise of ‘progress’ and a consequent augmentation of medico-legal professional status and authority proved very difficult to deliver unless it involved little public consultation in an era of democracy. Thus Chapter 5 analyses questions of the ‘extra time’ for the retention of bodies and body parts created inside the working practices of coroners, questions which are only drawn out through detailed consideration of organ donation controversies. A lack of visibility of these body parts was often the human price of a narrative of ‘progress’, and that invisibility tended to disguise the end of the process of use and larger ethical questions of dignity in death. Likewise, a publicity-shy research climate created many missed research opportunities; frequently, coroners’ autopsies got delayed, imprecise paperwork was commonplace at post-mortems and few thought the bureaucratic system was working efficiently. As we will see in Chapter 5, frustrated families complained about poor communication between the police, coroners and grieving relatives, factors that would later influence the political reach of HTA2004. Paradoxically, the medical sciences, by not putting their ethics in order sooner, propped up a supply system of the dead that was not working for everyone involved on the inside, and thus recent legislation, instead of mitigating the mistakes of this recent past, regulated much more extensively. In so doing, serendipity – the opportunity costs of potential future medical breakthroughs – took second place to the need for an overhaul of informed consent. Hidden histories of the dead therefore proved to be tactical and not strategic in the modern era for the medical research community.

Finally, the book culminates by examining the complex ways that bodies could be disputed, and how the body itself could be in dispute with the best intentions of new medical research after 1945. It focusses specifically on the work of pathologists in the modern era and their extensive powers of retention and further research. Unquestionably, many patients have benefitted from brain banking and the expansion of the science of neurology, a central thematic focus of Chapter 6. Yet, this innovative work was often conducted behind the closed doors of research facilities that did not, until recently, see the need for better public engagement. As we shall see, that proved to be a costly error too, both for levels of professional trust in pathologists and for better public understanding of what patients could expect of medicine in painful end-of-life situations. For many patients, meanwhile, the side-effects of drug development for brain conditions have sometimes been downplayed, with detrimental outcomes for their sense of well-being. Quality-of-life ‘gains’ did contrast with the claims of ‘progress’ that underpinned a furtive research climate, and this resulted in a public stand-off once NHS scandals about brain retention started to emerge in 1999. On the one hand, in an ageing population research into degenerative diseases had and has a powerful role to play in global medicine. On the other hand, medicine still needs to learn much more about the complex and interconnected relationship between brain, emotion and memory formation as lived experiences. Few in the medical professions appreciated that missed body disputes – misinformation by doctors about lengthy retentions of human material – could create a countermovement that disputed medicine’s best intentions. Disputes about the body can go both ways – forwards and backwards, grateful and resentful, accepting and questioning – and it is this Janus-like approach that the book in part recounts.

To enter into this closed world of medico-legal actors and their support staff without setting their working lives in context would be to misunderstand this fascinating and fast-moving modern medical research culture in Britain from 1945. In Part I, therefore, Chapter 1 outlines the key historical debates about this complex medical community of competing interest groups and their focus on the need to obtain more human research material. It concentrates on the main gaps in our historical knowledge about their working lives. To fully appreciate that backdrop, Chapter 2 reviews the broad ethical and legal frameworks that regulated the use of the dead for research purposes locally, nationally and internationally. Chapter 3 then illustrates, with a selection of representative human stories, the main cultural trends and the threads of the central argument of the book that will be developed in Part II. We end this Introduction, therefore, with a thought-provoking encounter on the BBC imagined for us by Christopher Hitchens – talented journalist, public intellectual and writer, science champion, prominent atheist and cancer sufferer. He reminded his worldwide audience in Mortality (2012) why hidden histories of the dead matter to us all in a global community. His body had disputed chemotherapy’s ‘kill or cure venom’ that made him ‘a passive patient in a fight he did not pick’ with cancer. He disputed the ‘battle’ he was expected to wage when the disease was battling him, and praised the promise of precision medicine to retrieve out of the cul-de-sac of history lost or neglected parts of this dreaded human experience, to be fused with new knowledge and creative thinking.39 He hoped that superstitions surrounding cancer (what he called ‘its maladies of the mind and body’) would eventually ‘yield to reason and science’ not just in the laboratory but by co-creating with patients, both the living and the dead. For Hitchens died on 15 December 2011. The final deadline that he met was to sequence his genome. It remains deposited for posterity at the American National Institutes of Health. He pushed past the dead-end one last time, into scientific eternity – Eram quod es, eris quod sum – I was what you are; you will be what I am.40

1 Disputed Bodies and Their Hidden Histories

Introduction

In January 2001, the famous English sportsman Randolph Adolphus Turpin was elected to America’s International Boxing Hall of Fame. The celebration marked fifty years since he had defeated Sugar Ray Robinson to win the world middleweight boxing title in 1951.1 Older fans of boxing appreciated that Turpin would not be present at the US induction ceremony. He had committed suicide in 1966, aged just 38. Few, however, knew that the fatal decision to end his life had caused considerable controversy in British medical circles. His boxer’s brain became the subject of professional debates and medical research disputes between a coroner, a pathologist, senior neurologists and heart specialists, as well as his family and the popular press. In 1966, the tragic events were opened up to public enquiry and exposed medico-legal tensions about who owned a body and its parts in death. In neglected archives, forgotten medical stories like that of Turpin reveal narratives of the dead that often question the global picture of a medico-scientific consensus which argued that the accumulation, de-identification and retention of human material was necessary for ‘progress’. We rediscover, instead, faces, people, families and communities whose loved ones became the unacknowledged bedrock of modern British medical research. These missing persons, relocated in the historical record, exemplify how medical breakthroughs could have been part of an important and ongoing public engagement campaign in a biomedical age.

On Friday 22 July 1966, the lead sports writer of the Daily Mail featured the sad death of Turpin. The ex-boxer ‘shot himself with a .22 pistol in an attic bedroom over his wife’s Leamington Spa café on May 17’.2 The case looked like a straightforward suicide, but was to prove more complicated and controversial. Turpin died ‘after wounding his daughter, Carmen, aged two’ (although critically injured, she survived the violent attack by her father). At the Inquest, medical evidence established how: ‘Turpin fired at himself twice. The first bullet lodged against his skull but was not fatal. The second passed through his heart.’ The coroner, however, came in for considerable criticism in the press about his conclusions. It was noted that ‘Dr. H. Stephens Tibbits did not call for the brain tests that could have decided if brain damage caused by Turpin’s 24 years of boxing (including his amateur days) might have contributed to his state of mind on the day he died’. The pathologist who conducted the post-mortem on behalf of the Coronial hearing expressed the prevailing medical view that: ‘An examination by a neuropathologist using a fine microscope could have disclosed any tell-tale symptoms of brain damage such as a boxer might suffer.’ In particular, more medical research would have pinpointed ‘traces of haemorrhage in the tiny blood vessels of his brain’. But Dr Barrowcliff (pathologist) was not permitted to proceed because Dr Tibbits (coroner) would not authorise him to do so. The pathologist regretted that: ‘There was a certain amount of urgency involved here’ because of the fame of the suicide victim ‘to which academic interest took second place’. The press thus noted that ‘the opportunity had been missed to carry out this study’, which ‘was received with dismay’ by a physician concerned with the Royal College of Physicians Committee on Boxing. Its ‘eight leading specialists on the brain, heart and eyes’ were very disappointed that the pursuit of medical research that was in the public interest had been overridden by a coroner’s exclusive powers over the dead. The family meanwhile were relieved to have been consulted at all, since consultation was not a legal requirement at the time. They were anxious that the Coronial hearing should take into account Turpin’s suicide note. His last words, in fact, revealed disagreement between medical personnel, the family and the suicide victim about the cause of death and therefore the potential of his brain for further research. To engage with this sort of hidden history of the dead and its body parts dispute, which is normally neglected in the literature, we need to trace this human story in greater archival depth.

Thus, Turpin left a handwritten note which stated that the Inland Revenue were chasing him for a large unpaid tax bill. He claimed this was levied on money he had not actually earned, and that this was the chief cause of his death – ‘Naturally they will say the balance of my mind was disturbed but it is not’, he wrote; ‘I have had to carry the can.’3 Money troubles since his retirement from boxing in 1958 certainly seemed to have mounted. Four years previously the Daily Mail had reported on a bankruptcy hearing which established that ‘Turpin who earned £150,000 from his boxing career, now tussles for £25 a bout as a wrestler’.4 At a tax hearing at Warwick it was reported that: ‘His main creditor is the Inland Revenue. It claims £17,126 tax for boxing earnings between 1949 and 1958.’ He still owed ‘£15,225’ and could only offer to pay back the tax bill ‘at £2 per week’ – a repayment schedule which would take ‘153 years’. Turpin had earned about £750 in 1961–2, but had paid back a £450 loan to a friend and given £300 in cash to his wife, rather than paying the taxman. He was essentially broke and a broken man. The press, however, did not let the matter of his perilous financial situation or mental health condition rest. And because they did not, we can retrace the human circumstances of a controversial Coronial case concerning his valuable brain material: an approach this book will be following in subsequent chapters. For the aim is to uncover the sorts of human faces that were subsumed inside modern British medical research cultures.

In a hard-hitting editorial, the Daily Mail insisted that: ‘two questions must be answered about Randolph Turpin’s wretched life whilst boxing – Was he the lingering victim of punch drunkenness? What happened to the £585,439 paid to see his four title fights?’ Here was a ‘back-street kid who was a wealthy champion at 23, bankrupt at 34, and demented and dead at 38’.5 His ‘first marriage broke up, there were stories of assaults all pointing to a diminishing sense of social responsibility. A second marriage was to bring him happiness but his career… never recovered’. The newspaper asked why his family GP was not called as a medical witness at the Inquest. When interviewed by the press, the family doctor said that although ‘I do not like using the phrase, I would say that Turpin was punch drunk. He was not the sort of man to worry about financial matters or about people who had let him down. In my opinion boxing was responsible for his death.’ It was revealed that Turpin was ‘part deaf from a childhood swimming accident’ and he became ‘increasingly deaf through the years’. The GP, however, believed his hearing impairment had not impacted on either his physical balance or the balance of his mind. His elder brother and a family friend, nevertheless, contradicted that statement, telling the press that Turpin had ‘eye trouble’ and ‘double vision’ from his boxing days. He often felt dizzy and disorientated. The difficulty was that only Turpin’s 4-year-old daughter, Charmaine, and his youngest child, Carmen, aged 17 months (she sustained ‘bullet wounds in her head and chest’6), really knew what happened at the suicidal shooting. They were too young and traumatised to give evidence in the coroner’s court.7 In the opinion of Chief Detective-Inspector Frederick Bunting, head of Warwickshire CID, it was simply a family tragedy.8 Turpin had risen from childhood poverty and fought against racial discrimination (his father was from British Guiana and died after being gassed in WWI; his mother, almost blind, brought up five children on an army pension of just 27s per week, living in a single room).9 Sadly, ‘the money came too quickly’ and his ‘personality did not match his ring skill’, according to Bunting. Even so, by the close of the case what was noteworthy from a medico-legal perspective were the overarching powers of the coroner once the corpse came into his official purview. That evidence hinted at a hinterland of medical science research that seldom came into public view.

It seems clear that the pathologist commissioned to do Turpin’s post-mortem was prepared to apply pressure to obtain more human material for research purposes. Here was a fit young male body, from an ethnic-minority background, that could provide valuable anatomical teaching and research material. This perspective about the utility of the body and its parts was shared by the Royal College of Physicians, which wanted to better understand the impact of boxing on the brain. This public interest line of argument was also highlighted in the medical press, notably the Lancet. The family, meanwhile, were understandably concerned with questions of dignity in death. Their priority was to keep Turpin’s body as intact as possible. Yet, what material journeys really happened in death were never recounted in the Coronial court. For, once the Inquest verdict of ‘death by suicide’ was reached, there was no need for any further public accountability. The pathologist in court did confirm that he examined the brain; he said he wanted to do further research, but tellingly he stated that he did not proceed ‘at that point’. Crucially, however, he did not elaborate on what would happen beyond ‘that point’ to the retained brain once the coroner’s case was completed in court.

As all good historians know, what is not said is often as significant as what is. Today historians know to double-check stories of safe storage by tracing what really happened to valuable human material once the public work of a coroner or pathologist was complete. The material reality was that Turpin’s brain was refrigerated, and it could technically be retained for many years. Whilst it was not subdivided in the immediate weeks and months after death, the fact of its retention meant that in subsequent years it could still enter a research culture as a brain slice once the publicity had died down. As we shall see, particularly in Chapter 6, this was a common occurrence from the 1960s onwards. At the time, it was normal for family and friends to trust a medico-legal system that could be misleading about the extra time of the dead that it created with human research material. This neglected perspective therefore requires framing within the historiography dealing with bodies, body donations and the harvesting of human material for medical research purposes, and it is this task that informs the rest of the chapter.

The Long View

Historical studies of the dead, anatomisation and the use of bodies for research processes have become increasingly numerous since the early 2000s.10 Adopting theories and methodological approaches drawn from cultural studies,11 ethnography,12 social history, sociology, anthropology and intellectual history, writers have given us an increasingly rich understanding of cultures of death, the engagement of the medical professions with the dead body and the wider culture of body ethics. It is unfeasible (and not desirable) here to give a rendering of the breadth of this field given its locus at the intersection of so many disciplines. To do so would over-burden the reader with a cumbersome and time-consuming literature review. Imagine entering an academic library and realising that the set reading for this topic covered three floors of books, articles and associated reading material. It could make even the most enthusiastic student of the dead feel defeated. Two features of that literature, however, are important for the framing of this book.

First, we have become increasingly aware that medical ‘advances’ were intricately tied up with the power of the state and medicine over the bodies of the poor, the marginal and so-called ‘ordinary’ people. This partly involved the strategic alignment of medicine with the expansion of asylums, mental hospitals, prisons and workhouses.13 But it also went much further. Renewed interest in ‘irregular’ practitioners and their practices in Europe and its colonies highlighted how medical power and professionalisation were inextricably and explicitly linked to the extension of authority over the sick, dying and dead bodies of ‘ordinary’ people.14 More than this, the development of subaltern studies on the one hand and a ‘history from below’ movement on the other has increasingly suggested the vital importance, for anatomists, medical researchers and other professionals involved in the process of death, of gaining and retaining control of the bodies of the very poorest and least powerful segments of national populations.15 A second feature of the literature has been a challenge to the sense and ethics of medical ‘progress’, notably by historians of the body who have been diligent in searching out the complex and fractured stories of the ‘ordinary’ people whose lives and deaths stand behind ‘great men’ and ‘great advances’. In this endeavour they have, inch by inch, begun to reconstruct a medico-scientific mindset that was a mixture of caring and careless, clinical and inexact, dignified and disingenuous, elitist and evasive. In this space, ethical dilemmas and mistakes about medicine’s cultural impact, such as those highlighted in the Turpin case with which this chapter opened, were multiple. Exploring these mistakes and dilemmas – to some extent explicable but nonetheless fundamental for our understanding of questions of power, authority and professionalisation – is, historians have increasingly seen, much more important than modern ‘presentist’ views of medicine would have us believe.16

These are some of the imperatives for the rest of Parts I and II of this book. The remainder of this first chapter develops some of these historiographical perspectives. It does so by focussing on how trends in the literature interacted with social policy issues in the modern world. What is presented is not therefore a traditional historiographical dissection of the minutiae of academic debates of interest to a select few, but one that concentrates on the contemporary impact of archival work by historians as a collective. For that is where the main and important gap exists in the historical literature – we know, in general, some aspects of this medical past, but we need to know much more about its human interactions. Before that, however, we must engage with the question of definitions. Thus, around 1970 a number of articles appeared in the medical press about ‘spare-part surgery’ (today called organ transplantation). ‘Live donors’ and ‘donated’ cadavers sourced across the NHS in England will be our focus in this book too. To avoid confusion, we will be referring to this supply system as a combination of ‘body donations’ (willingly done) and ‘mechanisms of body supply’ (often involuntary). The former were bequeathed before death by altruistic individuals; the latter were usually acquired without consent. They entered research cultures that divided up the whole body for teaching, transplant and associated medical research purposes. This material process reflected the point at which the disassembling of identity took place – anatomical, Coronial, neurological and pathological – into the pathways and procedures which we will be reconstructing. In other words, ‘pioneer operations’ in transplantation surgery soon ‘caught unawares the medical, legal, ethical and social issues’, which seemed to the media urgently to require public consultation in Britain.17 As one contemporary leading legal expert explained:

This is a new area of medical endeavour; its consequences are still so speculative that nobody can claim an Olympian detachment from them. Those who work outside the field do not yet know enough about it to form rational and objective conclusions. Paradoxically, those who work in the thick of it … know too much and are too committed to their own projects to offer impartial counsel to the public, who are the ultimate judges of the value of spare-part surgery.18

Other legal correspondents pointed out that since time was of the essence when someone died, temporal issues were bound to cause a great many practical problems:

For a few minutes after death cellular metabolism continues throughout the majority of the body cell mass. Certain tissues are suitable for removal only during this brief interval, although improvements in storage and preservation may permit a short delay in actual implantation in the recipient. Cadaver tissues are divided into two groups according to the speed with which they must be salvaged. First, there are ‘critical’ tissues, such as the kidney and liver, which must be removed from the deceased within a matter of thirty to forty-five minutes after death. On the other hand, certain ‘noncritical’ tissues may be removed more at leisure. Skin may be removed within twelve hours from time of death. The cornea may be taken at any time within six hours. The fact is, however, that in all cases action must be taken promptly to make use in a living recipient of the parts of a non-living donor, and this gives rise to legal problems. There is but little time to negotiate with surviving relatives, and waiting for the probate of the will is out of the question.19

Transplant surgeons today and anatomists over the past fifty years have shared an ethical dilemma – how to obtain human research material quickly, before it decayed too much for re-use. It was this common medico-legal scenario that scholars were about to rediscover in the hidden histories of the body in the historical record, just as the transplantation era opened.

Ruth Richardson’s distinguished book, Death, Dissection and the Destitute, was first published in 1987. It pioneered hidden histories of disputed bodies.20 In it, she identified the significance of the Anatomy Act of 1832 (hereafter AA1832) in Britain, noting that the poorest, by virtue of pauperism, had become the staple of the Victorian dissection table. As Richardson pointed out, that human contribution to the history of medical science had been vital but hidden from public view. Those in economic destitution, needing welfare, owed a healthcare debt to society in death according to the New Poor Law (1834). Once this class injustice had been identified, more substantive and detailed archival work was required to appreciate its cultural dimensions, but it would take another twenty-five years for the next generation of researchers to trace what exactly happened to those dying below the critical threshold separating relative from absolute poverty.21 The author of this book (and of three previous ones) has been at the vanguard of aligning such historical research with contemporary social policy issues in the medical humanities.

Once that research was under way, it anticipated several high-profile human material scandals in the NHS. These included the retention and storage of children’s organs at Alder Hey Children’s Hospital, the clinical audit of the practice of Dr Harold Shipman, and the response to the inquiry into the children’s heart surgery service at Bristol Royal Infirmary. Such scandals brought to the public’s attention a lack of informed consent, lax procedures in death certification, inadequate post-mortems and substandard human tissue retention paperwork, almost all of which depended upon bureaucracy developed from Victorian times. Eventually, these controversies would culminate in public pressure for the passing of HTA2004, which repealed the various Anatomy Acts of the nineteenth and twentieth centuries and put in place a proper system of informed consent, as we will go on to see in Chapter 2. Recent legislation likewise provided for the setting up of a Human Tissue Authority in 2005 to license medical research and its teaching practices in human anatomy, and more broadly regulate the ethical boundaries of biomedicine. As the Introduction suggested, it seemed that finally the secrets of the past were being placed on open access in the public domain. Or were they?

Today, studies of the cultural history of anatomy and the business of acquiring the dead for research purposes – and it has always been a commercial transaction of some description with remarkable historical longevity – have been the focus of renewed scholarly endeavours that are now pan-European and postcolonial, and encompass neglected areas of the global South.22 In part, what prompted this genre of global studies was an increasing focus on today’s illegal trade in organs and body parts that proliferates in the poorest parts of the world. The most recent literatures on this subject highlight remarkable echoes with the increasingly rich historical record. Scott Carney, for instance, has investigated how the social inequalities of the transplantation era in a global marketplace are prolific because of e-medical tourism. In The Red Market (the term for the sale of blood products, bone, skulls and organs), Carney explains that on the Internet in 2011 his body was, and is, worth $200,000 to body-brokers who operate behind online firewalls to shield themselves from international law.23 He could also sell what these e-traders term ‘black gold’ – waste products like human hair or teeth – less dangerous to his well-being to extract for sale, but still intrinsic to his sense of identity and mental health. Carney calculates that the commodification of human hair is a $900 billion worldwide business. The sacred (hair bought at Hindu temples and shrines) has become the profane (wigs, hair extensions and so on), whether it involves ‘black gold’ or ‘Red Market’ commodities. Carney’s original phrasing (quoted in a New York Times book review) describes:

an impoverished Indian refugee camp for survivors of the 2004 tsunami that was known as Kidneyvakkam, or Kidneyville, because so many people there had sold their kidneys to organ brokers in efforts to raise desperately needed funds. ‘Brokers,’ he writes, ‘routinely quote a high payout – as much as $3,000 for the operation – but usually only dole out a fraction of the offered price once the person has gone through it. Everyone here knows that it is a scam. Still the women reason that a rip-off is better than nothing at all.’ For these people, he adds, selling organs ‘sometimes feels like their only option in hard times’; poor people around the world, in his words, ‘often view their organs as a critical social safety net’.24

Having observed this body-part brokering often during his investigative journalism on location across the developing world, Carney raises a pivotal ethical question. Surely, he asks, the term ‘organ donor’ in such medical record-keeping is simply a convenient cover story for criminal activity? When the poorest are exploited for their body parts, eyes, hair and human tissues – dead or alive – the brokers that do this turn the gift of goodwill implied in the phrase ‘organ donor’ into something far more sinister, the ‘organ vendor’. This perspective, as Carney himself acknowledges, is deeply rooted in medical history.

In the past, the removal of an organ or body part from a dissected body involved the immediate loss of a personal history. Harvesting was generally hurried and the paperwork done quickly. A tick-box exercise was the usual method within hours of death. Recycling human identity involved medical bureaucracy and confidential paperwork. This mode of discourse mattered. Clinical mentalities soon took over, and this lesson from the past has considerable resonance in the present. Thus, by the time that the transplant surgeon talks to the potential recipient of a body donation ‘gift’, involving a solid organ like the heart, the human transaction can become (and often became) a euphemism. Importantly, that language shift, explains Carney, has created a linguistic register for unscrupulous body traders too. Thus, when a transplant surgeon typically says to a patient today ‘you need a kidney’, what they should be saying is ‘you need someone else’s kidney’. Even though each body part has a personal profile, the language of ‘donation’ generally discards it in the desire to anonymise the ‘gift’. Yet, Carney argues, just because a person is de-identified does not mean that their organ has to lose its hidden history too. It can be summarised: male, 24, car crash victim, carried a donor card, liked sports – female, 35, washerwoman, Bangladeshi, 3 children, healthy, endemic poverty. It might be upsetting to learn the human details on a post-mortem passport, disturbing the organ recipient’s peace of mind after transplant surgery, but modern medical ethics needs to be balanced by declaring the ‘gift’ from the dead to the living too. Instead, medical science has tended to have a fixed mentality about the superior contribution of bio-commons to its research endeavours.

Historians of the body who have worked on the stories of the poorest in the past, to learn their historical lessons for the future, argue that it would be a more honest transaction to know their human details, either post-mortem or post-operative. Speaking about the importance of the ‘gift relationship’ without including its human face amounts to false history, according to Carney and others. In this, he reflects a growing body of literature on medical tourism, which challenges the prevailing view that medical science’s neglected hidden histories do not matter compared to larger systemic questions of social, medical and life-course inequalities for the living. Instead, for Carney and his fellow scholars, the hidden histories of ‘body donations’ in Britain after WWII were a dangerous road to travel without public accountability in the material journeys of human beings. They created a furtive research climate that others could then exploit. Effectively, unintended consequences have meant that body-brokers do buy abroad, do import those organs and do pass them off as ‘body donations’ to patients often so desperate for a transplant that medical ignorance is the by-product of this ‘spare-part’ trade. Just as the dead in the past lost their human faces on a class basis, so today the vulnerable are exploited:

Eventually, Red Markets have the nasty social side effect of moving flesh upward – never downward – through social classes. Even without a criminal element, unrestricted free markets act like vampires, sapping the health and strength from ghettos of poor donors and funneling their parts to the wealthy.25

Thus, we are in a modern sense outsourcing human misery in medicine to the poorest communities in India, Africa and China, in exactly the same way that medical science once outsourced its body supply needs in the past to places of high social deprivation across Britain, America and Australia, as well as European cities like Brussels and Vienna.26 The dead (in the past), the living-dead (in the recent past) and those living (today) are part of a chain of commodification over many centuries. In other words, what medical science is reluctant to acknowledge and which historians have been highlighting for thirty years is that a wide variety of hidden histories of the body have been shaped by the ‘tyranny of the gift’, as much as altruism, and continue to be so.27

Unsurprisingly, then, the complexities surrounding this ‘gift relationship’ are an important frame for this book.28 Margaret Lock, for instance, in Twice Dead: Transplants and the Reinvention of Death (2002), has explored ‘the Christian tradition of charity [which] has facilitated a willingness to donate organs to strangers’ via a medical profession which, ironically, generally takes a secular view of the ‘donated body’.29 One reason public confidence in the donation process broke down, she notes, was that medical science did not review ‘ontologies of death’ and their meaning in popular culture. Instead, the emphasis was placed on giving without a balancing mechanism in medical ethics that ‘invites an examination of the ways in which contemporary society produces and sustains a discourse and practices that permit us to be thinkers at the end-of-life’ and, for the purpose of this book, what we do with the dead-end of life too.30 Lock helpfully elaborates:

Even when the technologies and scientific knowledge that enable these innovations [like transplant surgery] are virtually the same, they produce different effects in different settings. Clearly, death is not a self-evident phenomenon. The margins between life and death are socially and culturally constructed, mobile, multiple, and open to dispute and reformulation. … We may joke about being brain-dead but many of us do not have much idea of what is implicated in the clinical situation. … We are scrutinising extraordinary activities: death-defying technologies, in which the creation of meaning out of destruction produces new forms of human affiliation. These are profoundly emotional matters. … Competing discourse and rhetoric on the public domain in turn influences the way in which brain death is debated, institutionalised, managed and modified in clinical settings.31

Thus, for a generation that donated their bodies after WWII, questions of reciprocity were often raised in the press but seldom resolved by the medical profession through co-creating medical ethics with the general public. There remained more continuity than discontinuity in the history of body supply, whether for dissection or transplant surgery, as we shall see in Part II. The reach of this research culture hence remains overlooked in ways that this book maps for the first time. Meanwhile, along this historical road, as Donna Dickenson highlights, often ‘the language of the gift relationship was used to camouflage … exploitation’. This is the common situation today when a patient consents to their human tissue donation, but it is recycled for commercial gain into data-generation. For the donor is seldom party to that medical choice, nor do they share directly in the knowledge or profits generated.32 In other words: ‘Researchers, biotechnology companies and funding bodies certainly don’t think the gift relationship is irrelevant: they do their very best to promote donors’ belief in it, although it is a one-way gift-relationship.’33 Even though these complex debates about what can, and what should, go for further medical research and training today can seem so fast moving that the past is another country, they still merit more historical attention. Consequently, the historical work that Richardson pioneered was a catalyst, stimulating a burgeoning field of medical humanities study, and one with considerable relevance for contemporary social policy trends now.

How then do the hidden histories of this book relate to what is happening today in a biotech age? The answer lies in the immediate aftermath of WWII, when medical schools started to reform how they acquired bodies for dissection and what they intended to do with them. Seldom do those procedures and precedents feature in the historical literature. This author studied in depth older legislation like the Murder Act (running from 1752 to 1832) and the first Anatomy Act (covering 1832–1929) in two previous books. Even so, few studies move forward in time by maintaining those links to the past that continue to have meaning in the post-1945 era in the way that this study does.34 That anomaly is important because it limits our historical appreciation of medical ethics. It likewise adds to the problem of how science relates its current standards to the recent past. Kwame Anthony Appiah (philosopher, cultural theorist and novelist), delivering the Reith Lectures for the BBC, thus reminds us: ‘Although our ancestors are powerful in shaping our attitudes to the past’ – and we need to always be mindful of this – we equally ‘should always be in active dialogue with the past’ – to stay engaged with what we have done, and why.35 Indeed, as academic research has shown in the past decade, the policing of the boundaries of medical ethics that involve the sorts of body disputes which are fundamental to us as a society also involves the maintenance of long-term confidence and public trust that have been placed in the medical sciences. This still requires vigilance, and in this sense the investigation and production of a seamless historical timeframe is vital. Such a process demands that we engage in an overview of the various threshold points that created – and create – hidden histories in the first place.

This is the subject of the next section, but since hidden histories of the body in the post-war period – stories like that of Randolph Turpin – are the product of, reflect and embody the powerful reach of intricate networks of power, influence and control, it is first necessary to engage briefly with the field of actor-network studies. Helpfully, Bruno Latour wrote in the 1980s that everything in the world exists in a constantly shifting network of relationships.36 The human actors involved operate together with varying degrees of agency and autonomy. Retracing and reconstructing these actor networks therefore involves engaged research and engagement with research, argue Michel Callon, John Law and Arie Rip.37 This approach to historical studies can enhance our collective understanding of how confidence and public trust change over time, as well as illuminate mistrust in the medical sciences. Latour argues we thus first need to ‘describe’ the network of actors involved in a given situation. Only then can we investigate the ‘social forces acting’ that shape the matrix of those involved. Latour, along with Michel Callon, hence prioritised the need to map the dynamic interactions of science and technology, since these disciplines have come to such prominence in Western society. How the sociology of science operates in the modern world was likewise an extension of their work. Actor network theory and its study are therefore essentially ‘science in action’ and are one of the foundational premises of the case studies in Part II of this book.

Latour pioneered this novel approach because he recognised that science needed help to rebuild its reputation and regain its authority in the modern period, at a time when the ethical basis of so much medical research and its claims to serve the public good became controversial in the global community. In 1999, John Law and John Hassard outlined a further development of actor network theory, arguing that if it was to become a genuine framework for transdisciplinary studies then it had to have five basic characteristics:

  • It does not explore why or how a network takes the form it does.

  • It does explore the relational ties of the network – its methods and how to do something within its world.

  • It is interested in how common activities, habits and procedures sustain themselves.

  • It is adamantly empirical – the focus is on how the network of relationships performs – because without performance the network dissolves.

  • It is concerned with conflict in the network, as well as consensus, since both are part of its performative social elements.38

Michael Lee Scott’s 2006 work further refined this model.39 He pointed out that those who defend the achievements of science and its research cultures too often treat its performance like a car. As long as the car travels, they do not question the performance of the results, the function of its individual components or its destination. Only when science stumbles or breaks down is its research apparatus investigated. When society treats science like a well-performing car, ‘the effect is known as punctualisation’. We need medical mistakes and/or a breakdown of public confidence, argues Scott, to ‘punctuate’ our apathy about the human costs of the medical sciences to society as a whole. In other words, belief in science and rationalism is logical, but human beings are emotional and experiential too. If science has encapsulated our cultural imaginations for good healthcare reasons, we still need to keep checking that its medical performance delivers for everybody and is ethical. This notion of ‘encapsulation’, Scott explains, is important for understanding how the research cultures of the medical sciences really work. A useful analogy is computer programming. It is common for programmers to adopt a ‘language of mechanism that restricts access to some of the object’s component’. In other words, when a member of the general public turns a computer on, they are generally only concerned that the computer works today, in the way that the car-owner is when they turn the ignition key in the morning to go to work. Even so, those simple actions hand over a considerable part of human agency to new technology. On the computer, we do not see the language of algorithms (the mechanisms of the system) that have authority over us and conceal their real-time operation (a schematic sketch of such encapsulation follows this paragraph). Science operates in an equivalent way to computer programmes, according to Scott, because it has hidden and privileged research objectives, written into its code of conduct, and a complex, interrelated and often hidden set of actors. This book takes its lead from this latest conceptual thinking in actor network studies, but it also takes those methods in a novel research direction too. We begin by remodelling the sorts of research threshold points created inside the system of so-called body bequests and what these ‘donations’ meant for the way that the medical sciences conducted themselves, networked and performed their research expertise in post-war Britain.
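To make the programming analogy concrete, a minimal sketch in Python (illustrative only, and not an example drawn from Scott’s own text) shows how encapsulation exposes one simple public action while the mechanism that actually performs the work stays hidden from the user:

    class Car:
        """A 'well-performing car': users see one action, not the mechanism."""

        def __init__(self):
            self._engine_primed = False   # internal state, conventionally private

        def start(self):
            # The single 'ignition key' action that the public performs.
            self._prime_engine()
            return 'running'

        def _prime_engine(self):
            # The hidden mechanism: access is restricted by convention.
            self._engine_primed = True

    print(Car().start())   # prints 'running'; the inner workings stay out of view

Only if start() failed – the analogue of a medical scandal or a breakdown of public confidence – would the user have any reason to look behind the interface at the concealed components.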

Remapping Disputed Bodies – Missing Persons’ Reports

The Latin maxim ‘volenti non fit iniuria’ – no wrong is done to one who is willing – encapsulates modern attitudes towards ethical conduct in the dissection room, the transplant operating theatre and, more widely, towards the use of human tissue and body parts for research purposes.40 In practice, however, things are rarely this simple. Bronwyn Parry, a cultural geographer, has described this defensive position as follows:

New biotechnologies enable us, in other words, to extract genetic or biochemical material from living organisms, to process it in some way – by replicating, modifying, or transforming it – and in so doing, to produce from it further combinations of that information that might themselves prove to be commodifiable and marketable.41

In other words, the patient consents, is willing, and soon becomes the ‘other’, whether in life or death. A new cell-line, blood-product or genome sequence erases an original body identity. The ‘donor’ and ‘donated human part’ or ‘tissue’ are re-designated – ‘Out there’.42 As Margaret Lock explains – ‘first a dead body must be recognised as alienable … legally available for transfer or sale. Current policies in North America and Europe treat cadavers and body-parts as “quasi-property”, thus making them alienable, but their transfer may not involve payment’ or at least not a direct payment.43 Often there is (for instance) a refund of petty cash expenses to suppliers, as a way of working around regulations. The law of medical negligence on both sides of the Atlantic states in case law that the body is ‘abandoned’ into these recycling schemes – known as bio-commons. If the person has consented to this, then it is a transparent process. Yet often, and particularly under the Human Tissue Act of 1961 and the Anatomy Act of 1984, this was not the case (which Chapter 2 explains in greater detail). During the various government enquiries into NHS organ scandals, the conclusion was that all the original paperwork needed to reconstruct what really happened had been cursory, destroyed or never created in the first place. Generic figures covering the scale of organ retentions are thus often cited in the historical literature, without checking their material pathways inside the research culture of the time. This book argues that the human material was traceable, provided we begin by reconstructing the threshold points of medical research. Thus, after 1945, the anatomical sciences, coroners and pathologists formed actor networks inside the research community of the medical sciences in Britain. They passed human material along a chain of supply from operating theatre (amputated part or dead person) to hospital morgue or pathology laboratory, on to the teaching lecture theatre or dissection room, and finally to burial or cremation. Together they performed a series of research thresholds that produced disputed bodies and hidden histories of the dead. In remapping these, it is feasible to trace a whole series of what effectively became missing persons’ reports, acknowledged by HTA2004. Conceptually, we thus need to start modelling a process that was hidden from public view.

The first research threshold point of the historical process for each individual ‘body donation’ was the need to put pressure on people to think more about giving. The second threshold point is usually then the approach made by a medical professional to obtain that tissue or organ when the time is right. The third threshold point normally comes with the medical decision to use that tissue or organ for a particular purpose. These threshold points go on being crossed until the human donor ‘disappears’ in terms of their whole body identity (see Chapter 4), but crucially their body part or human material does not. In point of fact, it is capable under certain circumstances of being recycled multiple times. A human heart transplanted from a young to an older person could (in theory), for instance, be reused, provided that it remains healthy enough to be taken from one transplant recipient and given to another patient on a waiting list (see Chapter 5). Sometimes recipients need two hearts in their lifetime because each operation is time-limited by the effectiveness of immunosuppressant drugs. Mortality rates are much higher in such cases, but they are occasionally medically feasible. Cultured tissue or brain slices could likewise be recycled many times for different purposes under a myriad of medical research circumstances (see Chapter 6). This means that crossing these threshold points in modern science will always involve the potential for ambiguity, dispute, dilemma and resolution. Nothing is fixed, little is certain. Yet medical science does two critical things with and around these threshold points which are in turn crucial for this book.

The first is that it treats each one of the threshold points described above as discrete. This is deliberate because such an approach distracts public attention from potential disputes about the fuzziness that surrounds medical ethics as each research threshold is crossed. The breaking up of a whole body history into parts across discrete moments on a research pathway is essential to disaggregate the human being from their ‘body donation’ point. In mapping, therefore, its historical process, we find – donation(s), discrete(s), disaggregate(s) and destination(s) – all to push past – dead-end(s) (see Figure 1.1). In other words, to become the ‘other’ you need a ‘donation mechanism’ that separates the ‘gift’ from its eventual destination, often called ‘out there’ or ‘abandoned’ as bio-commons in medical case law.

Figure 1.1 Re-modelling the threshold points in body bequests used for dissection and further research in the medical sciences, c. 1945–2015.

Source: Author designed; themes embedded in Chapters 4–6, Part II, of this book

The second aspect is that medical science effectively treats each threshold point as ahistorical. The history of the person and the body or body part is there, but it does not matter or is not central to the crossing of a threshold. To add to the confusion, the keeper of the record of what is happening at each threshold point is one step removed from the clinical bench of medical science itself. The regulator does not take an overview of the entire research recycling process but concentrates instead on monitoring each threshold point: essentially the modus operandi of the Human Tissue Authority and of older legislation (see Chapter 2). Regulators tend to wait until medical science reports to them the need for a licence to use human material. This is a matter of professional trust, but it also distances that official oversight from the whole body of the donor, from which in principle a wide variety of human material disappears from public view. Essentially, medical science’s ‘body donation mechanism’ was (and is) given the capacity to act in a series of discrete steps in terms of its actor-network performance, and because its research professionals did just that, acts of bequest and donation move seamlessly into hidden histories of dead bodies and body parts. However, at each threshold point, relevant choices about its component activities and parts can become controversial – a drug development was judged worth the investment return – a specific treatment became commercialised – an experiment that was externally fundable was prioritised – and so on. In this sense, a set of related ethical questions arises that tends to remain unresolved in the historiography because few study them in-depth. What happens if these threshold points are not considered discrete in popular culture, and as a donor you regard them as one whole – as many people did around the time of the NHS scandals in 1999? What happens if this complete history does matter in certain cases, as it did in the case of Randolph Turpin at the start of this chapter? By way of further example, although there are rigorous screening protocols in place for cancer patients in full remission who donate, some recent transplant cases have been reported in which a donor gave recipients undetected breast cancer at the point of transplantation.44 Surgeons estimate the chances of this happening are ‘between 1 in 10,000 and 5 in 10,000’; even so, the discrete history of each organ does matter to those living patients reliant on the dead for their healthy survival. Likewise, what happens when a whole set of body disputes emerges in time to undermine public confidence and trust? These are complex issues, but ranging widely over the historiographical literature and primary materials, we can see that dealing in discrete thresholds generates three sorts of tensions (or disputes) between medical science broadly defined and ‘ordinary’ people and those (like the press) who represented them. These stress points are crucial to this volume.

First, they involve implicit disputes of the sort explored in Chapter 4. Here we encounter the stories of people who allowed use of their bodies by default rather than design, largely a reflection of the fact that nobody explained to them all the research steps properly. Second, we encounter explicit disputes of the sort explored in Chapter 5, where, for instance, coroners co-operating with transplant teams claimed the right to remove more than they declared officially after, say, road-traffic accidents – a discovery which brought them into direct conflict with families, the law or both. Finally, we can find missed disputes of the sort that underpin Chapter 6. Here, people were not able to dispute the use of dead bodies and their brain material because the discrete thresholds, layered onto complex actor-network relationships, kept them uninformed, such as at Alder Hey Children’s Hospital when pituitary glands of dead children were taken as ‘bio-extras’. In other words, it is true that ‘no wrong is done to one who is willing’ (as the Latin maxim quoted at the start of this section stated). However, many people might have been unwilling to consent to the extent of what was about to be done to them or their deceased loved one, but they did not know this at the time, and these hidden histories matter to everybody. For, paradoxically, the medical profession prefers piecemeal methodologies that are untraceable, since these are not easily legally accountable. By pausing briefly to engage with a human story, this scenario can be poignantly and powerfully illustrated.

In the late 1950s, a distinguished and decorated hero of WWII died under tragic circumstances. For ethical reasons, this book does not identify this individual because they may still have living relatives. The 100-year rule has been applied to ensure that any distant kin who could not be consulted to give informed consent are still treated with the utmost dignity in this study, despite the fact that some of this information has been in the public domain for sixty years. Detailed record-linkage work reveals that the person in question had worked on minesweepers in the Atlantic during WWII. Their career ladder was impressive. They were promoted after being ‘Mentioned in Despatches’ (MID) for bravery and eventually awarded the Distinguished Service Order (DSO). Once the war finished, like many service personnel, they were not discharged for some years after 1945. Even by the early 1950s, there was still a lot of cleaning up to do and de-militarisation of equipment to co-ordinate from the War Office in London. Thus, the war hero transferred to the regions and was allocated a new logistics job. Soon they were ‘overworked’, according to contemporary accounts. They had to process a large amount of paperwork in what became a busy semi-civilian job. As they were a diligent person, eventually the excessive workload triggered ‘depression’. Since they had never before experienced mental ill-health, they booked an appointment with a local doctor under the new NHS. That GP signed the person off work for a time, but then ‘allowed [his patient] to return to work because he considered [the patient] was worrying so much about [the] paperwork piling up’ that a leave of absence was counter-productive to the war hero’s mental well-being.45 By now, the individual was middle-aged, with a settled home life and a stable marriage, but still they found it hard to cope at work. Eventually, one Sunday evening, they drove their family car to a remote side-road near the coast in the South of England, attached a tube from the exhaust pipe into the passenger side, and rolled up the window. At a subsequent Coronial hearing: ‘the cause of death was stated as asphyxia due to the inhalation of carbon monoxide gas … while the balance of [name withheld by this book’s author] mind was disturbed’. The individual in question did not donate their body to medical science in their will. Nonetheless, what happened next does indicate the research threshold points that this dead body was now about to cross in the hands of medical science.

The first threshold point was that, by virtue of the physical fact of a suicide, the body in question became the responsibility of a coroner, whose public duty it was to perform a post-mortem and report to an Inquest Jury. At this point, the coroner had two legal options: to cut open the body extensively and examine the lungs and heart and/or test the carbon monoxide levels in the tissue; or to examine the external appearance of the body and use his powers of discretion to declare a death by suicide. Historically, this latter option, a ‘non-Jury’ case, came under Coroners’ Regulations. Since the early 1830s, when coroners started to be medically, rather than legally, qualified, they had the discretion to save the costs of a post-mortem if a death was obvious, for example in drowning cases. In other words, at this first threshold point the body might be cut a little, some, or a lot. It all depended on the decision of the coroner, whether he was legally or medically qualified (or both), and the local political temperature, shaped by events surrounding an unexplained death. Today this practice continues with paper inquests, and it has always been part of Coronial discretionary justice.

The second noteworthy threshold point is that, despite the lack of a bequest, this body went next to St Bartholomew’s Hospital in London. The records indicate that the person died, there was a quick Coronial hearing and the body arrived at the dissection room within a total of two days. It is likely some testing had been done on its carbon monoxide levels, and heart and lung tissue samples were removed for examination, but the body itself was substantially intact at this handover given the speed of delivery. It was now about to fall under the official jurisdiction of the dissection room because Coronial offices often had close working relationships with medical schools needing a steady supply of the dead to train students in anatomy in the 1950s. It had travelled about 100 miles by van. In other words, the whole body had started to become the ‘other’ on that journey – literally and metaphorically moving by means of a medical bypass – but it was not, as yet, ‘out there’ in parts – where its ultimate destination would be diverted to the cul-de-sac of history (as the Introduction outlined) until this book remapped it.

Crossing a third threshold point, the body of the dead person passed into the dissection room jurisdiction to underpin further teaching and research. It is evident from the original records that this phase took one and a half years in total, from entry to leaving the dissection room for the final time to be buried (cremation was not yet commonplace as it is today). In other words, this body was cut up extensively and no opportunity to learn was missed. On entry, it was refrigerated and embalmed. This involved first washing the body. Then embalmers made initial small cuts at the neck in the carotid artery area and injected preservation chemicals into the inner thigh. The embalmer on duty pumped embalming fluid (a mixture of ethanol and formalin) into the arteries; about 25 to 40 litres was the normal amount. Bodies were always refrigerated and checked regularly to see that the process was working. It was likewise usual to inject additional fluid directly into areas of the body that were not responding to the chemical processes fixing the human material. Once preserved, cadavers, placed on a metal table in a temperature-controlled dissection room, were covered with a shroud until ready for teaching. The head was shaved for cleanliness too, akin to the shorn-head appearance of serving recruits in the armed forces. As the procedures for dissection were methodical on site, we can proceed to the fourth threshold point.

Allowing for the fact that the heart was still present (in some cases, coroners removed it as a precaution in suspected homicides, but this does not seem to have been so here), medical students on site spent a concerted amount of time dissecting it. The lungs likewise were always the focus of intense interest, as would be major organs like the liver and kidneys. The separated skin and each body part were prepared as prosections. The head generally was the focus of a month of teaching sessions too. Of importance here were the age, general condition of the body and its gender. The coroner’s report said the deceased ‘enjoyed reasonably good health’ despite a recent episode of ‘depression’. The person was middle-aged and had led an active life; therefore, the human remains were very good teaching aids. They were also useful for further research into mental ill-health in the brain, provided the pathologist had frozen that body part below –20 degrees centigrade (rather than embalming it) after a post-mortem. Consequently, the crossing of thresholds three and four technically represented a research opportunity to learn more about the potential physical manifestations of a suicide case and its neurology. Each threshold point was self-evidently a discrete step in which a whole body history was being disassembled into a series of hidden histories, where the physical reality of completeness and the history of the person were eroded.

What happened then to each body part, organ, tissue or brain slice tends to fade from view into the jurisdiction of the pathologist and medical research community, as we shall see in the following chapters. After eighteen months, the body was buried with a Christian ceremony, complying with legislation. A family of undertakers in the employ of the dissection room at St Bartholomew’s for almost 100 years did the interment (see Chapter 4). Consequently, here, as the ethnographer Marie-Andrée Jacob puts it: ‘What deserves particular attention is the very creative ways actors [in this case, coroner, pathologist, dissector, student, medical researcher, lab technician and scientist] go around the law while going through the legal processes: for this is how legality is experienced.’46 In other words, it is important not to be distracted by the medical sciences’ insistence on the ‘global’ over the ‘local’. Indeed, reuniting these dispersed histories does require a lot more concerted effort in the archives. Nonetheless, what historical research has to do is ‘privilege the microscope over the telescope’ to trace each threshold point, engaging with its hidden histories of the dead and potential body disputes (explicit, implicit and missed).47 That endeavour will provide a checking mechanism in respect of the success story of the ‘body donation mechanism’ of the medical sciences since WWII, testing in context the maintenance of public confidence and trust (or not) in actor networks and their achievements.

The material reality is that this suicide could have had many different types of threshold points. These would have shaped the sorts of disputes that could arise. The individual might have made a body bequest in a particular way that led to a medical breakthrough. If so, their bereaved family may have wanted to participate in its knowledge formation as a consolation after death but missed an opportunity to do so. Once the body was opened up (even without this happening voluntarily), it is entirely feasible that a war hero’s sound physiology was exactly what a medical researcher had been waiting for. Certainly, one cannot rule out the possibility that this body in the 1950s contributed to the development of crime scene forensic science. It could also have been used for new research into cancers caused by the presence of asbestos in the lungs, for the person had worked on minesweepers in the war, an exposure that would later prove to be of importance for the study of painful mesothelioma. As yet, Crick and Watson’s discovery of the structure of DNA at Cambridge was just four years old. Had the war hero died ten years later, the potential would have been there in the cells for early genetic study. Even so, human tissue culture work took place at the Strangeways laboratory in Cambridge at the time of death, and St Bartholomew’s Hospital had shared training facilities and dissected cadavers with Oxbridge since the war. All of these possibilities and their potential thresholds could have created material afterlives. Speaking about them in this way is not about ‘moral pronouncements’, in which there have traditionally been ‘two camps’ – one defending science’s achievements, the other doing the opposite – but instead focusses historical attention onto the nature, scope and meaning of body ethics in both a historical and modern sense.48 And of course there is an irony here. Because the importance of discrete threshold points and their potential for generating dispute has rarely been acknowledged, medical science has gone about the Enlightenment project in a rather contradictory manner. Combating ignorance with reason, rationality and science has been dependent on the ignorance of donors about what was going on to achieve the ultimate goal called ‘progress’. Whether the combating of medical ignorance should rely on generating cultural ignorance to this extent is a thought-provoking question, and one with wide-ranging ethical implications. Soon it gave rise to public criticism and a demand that the human story be restored to the relationship between medical researchers and teachers and the bodies that they relied upon. It is to this medical humanities issue that the chapter finally turns.

Everybody – ‘Who Must Own the Organs of My Body?’

I think it is self-centred of the public to feel they have a right to other people’s organs without offering their own, and I think the present system, under which hundreds of kidney patients die each year while many more useful organs are destroyed is … inefficient. And yet, I cannot go along with the suggestion one’s body, even after death, should be considered the property of the state. Perhaps this is a libertarian view, or perhaps it is simply the greatest irony of the transplant problem. The period of this great scientific advance has coincided with a decline in a sense of collective responsibility, and the advance itself, by making us think we can postpone death indefinitely, has discouraged us from making arrangements for our own demise.49

In many respects this short extract from a feature article in the Independent on Sunday in the early 1990s denoted the start of a public discussion about body disputes. It recognised controversial human harvesting issues that the general public may have wanted to raise about the regulation of organ donation, transplant surgery and ‘body donation’ bequests, but did not have the full information to do so. The rise of doctoring in British society as a profession over several hundred years had created a set of expectations for fee-paying patients that ‘death’s door’ would be held shut for as long as possible by the medical sciences.50 After 1948, NHS consumers became taxpayers with a stake in the best that medical science could share with everybody. As George Steiner, the moral philosopher, explains: ‘Death has its history. The history is biological, social and mental. … Every historical era, every society and culture has had their own understanding, iconography and rites of mortality.’51 In Western cultures, by the modern era, however, the way that people had traditionally edified ‘the House of the Dead’ (to use Steiner’s analogy) was starting to change shape, and radically so. It no longer had in the popular imagination a clearly defined deadline – the metaphysical belief that this is your time, and date, and you must enter here after the traditional lifespan of three score years and ten – for that biblical timetable had eroded slowly with secularism and science. Patients now expected to push past the dead-end of life, and indeed, in many respects that so-called deadline seemed more alive than dead in emergency rooms that had lower mortality rates from improved resuscitation facilities. For Steiner this has nonetheless created what he calls ‘the barbarism of specialisation’, and with it the inability to see material things, including the human body, ‘in its totality’. The real problem is that it has also tended, in his view, to misrepresent scientific invention as human creativity. It is important to reflect on this philosophical perspective because it has often been excluded from historical accounts of the ‘success story’ of the medical sciences in the modern era.

Steiner points out that science seldom looks back. Its mentality is to cancel a drug, medical procedure or surgical innovation and move on to the next big breakthrough.52 Why study Newton when Einstein has taken a leap forward was, by the twentieth century, a rational position to take; the new displaces the old. But this, Steiner believes, is contrary to the history of the creative arts over centuries. Creativity links the whole to each part – one artwork to another, one novel to a series of writings and so on. It is rare for new knowledge to cancel out old mind-sets and perspectives altogether. Knowledge is often compartmentalised for a time, retains the potential to re-join a creative conversation, may keep changing emphasis, and often productively so. What has tended to happen over a century of scientific innovation – and what worries Steiner and philosophers of the body – is that the public have come to expect medical science to do the editing of information for them. This neglects the creative potential of knowledge formation, reinvention and retrieval in which everyone should be involved. Science instead will typically develop a new drug and work to lessen its side-effects because of the expense of clinical trials. Even if the drug is not really fit for purpose for some patients, the medical sciences will keep using it despite its downsides; until, that is, their lack of creative imagination to revisit their research agendas is held to public account. Occasionally we glimpse this sort of scenario, most notably in the case of thalidomide, which illustrates this key point succinctly.

Thalomid [sic] was the original name of the drug developed and sold in Germany in 1957 under the trade name Contergan.53 It was marketed without proper clinical safeguards for nausea and morning sickness in pregnancy, then banned. Later, its chemical interaction that stunted human growth in the limbs persuaded some governments to issue it under special licence for cancer and leprosy treatments to inhibit tumours. The ‘dark remedy’ thus had a ‘one-track’ scientific history, until a public outcry caused its creative potential to be unlocked. This is exactly the sort of predicament that troubled Rhoda Koenig (the journalist) in her 1990s short piece on organ donation that opened this section. The ‘postponement of death’, as she put it, makes everyone’s eventual ‘demise’ not just difficult to talk about but also the subject of an endemic cultural denial. ‘Edit me down so that I survive longer’ is all very well, as she explains, but it also disempowers the patient. Further complicating that situation was the reluctance of the medical profession to speak openly about the successes and failures of their clinical work, as the thalidomide controversy showed. Indeed, seldom was the legislative framework regulating laboratory practice, the development of drug rejection therapies or human tissue experiments set out clearly in print in the immediate post-WWII era (see Chapter 2 for a more detailed discussion of the legalities). Innovations were publicised in the medical press, like the Lancet and British Medical Journal, but almost never was the cohort of bodies or human tissue research activities acknowledged openly. It was not a legal requirement and was thus omitted. Any publicity tended to be about promoting a new breakthrough and accrediting it to a doctor or scientist on their career path. A cultural fissure consequently started to open up after 1945 in Britain. The public thought they were being fully informed, when they were not; and the medical sciences assumed that the general public did not want to know what they did not know about!

Anatomists, clinicians and pathologists thus found themselves in a self-validating prophecy of their own making: the public do not understand what we do, and we do not understand their attitude to us – ergo, we cannot work or co-create together. What exacerbated this situation was how talented doctors and scientists – ones genuinely working to improve the public good – assumed that past laws had been superseded by present-day regulations. Soon it became clear that they were still working with outdated laws, broken down, tinkered with and rehashed, but never repealed. In stressing patient confidentiality (a legitimate legal concern), they seldom thought to look at the legal basis of their paperwork on bequests, post-mortems and so on. In other words, the methodologies of the medical sciences, with their threshold points in human dissection and further research done in discrete stages, ironically matched the way that the law itself had been revised in bits and pieces instead of in its entirety for the living and the dead. This cultural stand-off (for that is what it amounted to by the end of the 1990s) was further exacerbated by the medical sciences’ scepticism about the value of human stories to their research endeavours. This scepticism is misplaced, argue moral philosophers and poets such as John Glenday; his poetic satire is biting about medical science’s proverbial rubble from this recent past:

Rubble
            General term for a people who are harvested and reused
              Or broken. To be heaped randomly or roughly stored.
          That which is held for common use. Infill. Of little worth.
      Break them in different ways but they will always be the same.
Hold them in the dark; remind yourself why they should stay forgotten.
          These days there is little interest in stones that bear names.
      May they be piled up and given this title in common.
  Let them take their place in the register of unspoken things
               May they never be acknowledged again.54

To disassemble might be a necessary and inevitable part of research, but to forget is not. This book thus builds on philosophies of the body and science since it challenges, resituates and rediscovers the human ‘rubble’ of a bio-commons.

Conclusion: ‘No Decisions about Me, without Me’

HTA2004 reflected a wide range of reactions to a recent (and not so recent) history of disputed bodies that included anger, blame, disappointment, frustration, regret and sadness. In many respects, it follows that the fallout of that history was always going to be far-reaching, but not necessarily in the ways that the medical sciences would have anticipated. After 2005, body donations did not decline dramatically, and more people were willing to donate their organs in the first decade of the new legislation because of the work of the Organ Donation Taskforce set up by the government in 2008.55 Rebuilding public trust can nonetheless be a complicated process. Often it is damaged far quicker than the long time it takes to be established. What continues to be at issue is the cultural fissure opened up by NHS scandals in 1999. These have been exemplified by the ongoing public stand-off over compulsory organ donation. On 17 July 2007 BBC News, for instance, reported that Sir Liam Donaldson (Chief Medical Officer at the time of the NHS organ retention controversy) had done a volte-face, despite his support for the principle of inclusiveness embedded into the new HTA2004 statute. He had embraced a system of ‘presumed consent’ because of long waiting lists for organs.56 Yet, as the Shadow Health Secretary at the time, Andrew Lansley, replied: ‘The state does not own our bodies, or have the right to take organs after death’ – echoing the prescient journalism of Rhoda Koenig in the 1990s touched on earlier in this chapter. That concept of state ownership has been rejected in Scotland (for now), though adopted in Wales from 2015. In England, meantime, what remains the subject of lively debate is the ethical principle of ‘No decisions about me, without me’, as the country embraces in 2020 a new organ donation scheme based on the Welsh opt-out facility for the living and presumed consent for the dead. This proposal to change the law prompted lively discussion at a meeting of the Royal Society of Medicine (hereafter RSM) convened on 23 June 2016 to reflect, twelve years on, on ‘the good, the bad and the ugly’ of HTA2004.57

What remains palpable in bioethics is that if a person (alive or dead) gives (or has given) consent – whether for human tissue, cell-line, biopsy or organ – and if a medical researcher makes an invention or innovation that proves to be of commercial value from that human material – then that outcome distorts the ‘goodwill’ of the bequest. If we have moved from ‘proprietorial’ to ‘consensual’ medical ethics after HTA2004, then that legal emphasis has yet to become a medical reality in working practices. Moreover, there remains the difficult question of what happens when human tissue becomes recycled into computer data. Hugh Whittall, Director of the Nuffield Council on Bioethics, thus explained at the recent RSM conference in June 2016 that:

The long-term challenge is the issue of tissue banking. The value of a tissue sample, he says, is beginning to reside more ‘in the huge amount of data it can deliver once you put it through any kind of biochemical or genetic analysis’.

‘So to some extent, tissue banks could become redundant once you have got the data or information in the tissue. We then move from the framework of human tissue regulation into the framework of data and information regulation.’ The interaction of regulatory control and legal and ethical frameworks is going to be very difficult, he thinks, because ‘the two areas have not necessarily matched up completely’.

The current legislation … should be capable of working for ‘another 10 or 15 years, because we quite deliberately introduced a degree of flexibility and discretion that could be exercised by the HTA’.58

The rising cost of regulation, the bureaucracy involved, the question of whether systems of medical research are streamlined enough to be inspected uniformly and, above all, the fast-moving e-globalisation of all our personal information remain open questions. ‘Hack-work’ was once pejorative slang for medical students cutting open corpses. Now to be ‘hacked’ involves breaches of data-protection privacy laws, and ‘goodwill’ needs a better firewall to protect the biomedical boundaries being broken down in medical science.

Looking back, leaping forward, it remains apparent that when the medical sciences had ‘a degree of flexibility and discretion’ in the past (to quote Hugh Whittall’s phrasing above) they proved incapable of handling it. To build the same discretionary powers ‘deliberately’ into HTA2004, so as to ensure its longevity as a piece of legislation administered through a Human Tissue Authority management culture, is therefore dubious from a historical standpoint, however well intended its work. For it negates any historical sense of the research processes and their threshold points in the pieces of a medical mosaic. Indeed, it is striking that no historian of the body was (or is) invited to sit on the Human Tissue Authority. Such observations suggest that scientists, doctors, anatomists, coroners and pathologists continue to take a proprietorial view of the bodies and body parts in their professional hands. Few have voluntarily adopted the mentalities of custodianship, and arguably this hidden history is still having important ramifications in scientific research circles today. As Sir Jeremy Farrar, Director of the Wellcome Trust, highlighted in a blog post on 10 September 2019:

The emphasis on excellence in the research system is stifling diverse thinking and positive behaviours. As a community we can rethink our approach to research culture to achieve excellence in all we do. The UK’s research sector is powering ahead, with our world-leading universities generating knowledge and innovations that are improving lives around the world. But in the engine room of this great enterprise, warning lights are blinking on. The relentless drive for research excellence has created a culture in modern science that cares exclusively about what is achieved and not about how it is achieved. As I speak to people at every stage of a scientific career, although I hear stories of wonderful support and mentorship, I’m also hearing more and more about the troubling impact of prevailing culture. People tell me about instances of destructive hyper-competition, toxic power dynamics and poor leadership behaviour – leading to a corresponding deterioration in researchers’ wellbeing. We need to cultivate, reward, and encourage the best while challenging what is wrong.59

Perhaps one of the greatest ironies is that the heritage sector may have better working practices for the custodianship of our national assets than the medical sciences, which dominate public spending by government. Maybe because the heritage sector has always had a charitable status defined by trusteeship, its ethics were co-created in conversation with the entrance-fee-paying public. Yet in medicine, people do pay their equivalent entrance fee in taxes to fund the NHS, and its medical research base built from patient case-histories is a national asset too, as the Covid-19 pandemic is highlighting. It is a point of view worth considering that whereas voters want politicians to protect the physical public ownership of the natural landscape, seldom are the insides of human beings seen as needing the same public-property safeguards. One thing remains certain. This is a history not simply in our keeping, but in our collective making too. For, as Farrar emphasises, the medical sciences still need a more caring culture – one concerned ‘not exclusively about what is achieved’ but about ‘how it is achieved’ too. The disputed bodies that have been missed and mislaid exemplify the need for vigilance about the ethical basis of pushing back all our deadlines. We therefore turn next to the legal framework governing the messy business of these muddled research threshold points of the modern era.

2 Res Nullius – Nobody’s Thing

While the stories and hidden histories of the dead stand at the heart of this book, it is important to frame these narratives against the restrictions and permissions of the ‘laws’ that governed matters of consent, harvesting and research in modern British medical research. This seemingly simple endeavour is considerably complicated by the fact that, as well as direct legislation on these matters, medical practice and the ‘rights’ of the dead and dying are shaped by legislation in other areas of criminal, civil and administrative law. Official and unofficial ‘guidance’ and long-established customs also have purchase on these matters. In turn, the fact that much ‘law’ merely clarified or amended previous legislation rather than repealing it means that ‘the law’ becomes ‘the laws’. Thus, there is often considerable scope for differential interpretations of legal permissions at any chronological point. In this sense, law matters very much for the interpretation of the stories that we will go on to encounter in the rest of this volume.

A starting point for this process is the long tradition in English Common Law that: ‘A dead person cannot own the property of their body once deceased – the legal principle is Res Nullius – Nobody’s Thing.’1 In many respects, this lack of a human identity set the tone for how medical science represented its dissection and research work to government, as we have already begun to see in previous chapters. The importance of this basic principle becomes apparent in the eighteenth century, when many European states were threatened by revolution and the mob, and preventing criminal behaviour became a matter of urgency. In Britain, central government decided by 1750 that the forces of law and order should link heinous crimes like murder to a system of extra-physical punishments. Murder thus became punishable by death and dissection. The thinking was that this double deterrent would prevent ordinary people from seeking the radical political change threatened in Europe. These new regulations drew on ingrained body taboos in northern European cultures. Popular opinion held that any interference with the integrity of the human body in death was a moral shame. For the soul to go to heaven, the dead body had to be buried intact. As this author has argued extensively elsewhere, the culmination of these cultural mentalities was the passing of new capital legislation called the Murder Act (25 Geo. 2 c. 37: 1752) in England.2 Based on the Common Law principle of Lex Talionis – that the punishment must match the degree of offence committed – it had a biblical counterpart, the ‘eye for an eye’ of retributive justice outlined in the book of Numbers, chapter 35. After 1752, if convicted of homicide in a court of law, the condemned faced a death sentence, was hanged on a public gallows, and then surgeons either dissected the criminal corpse or placed it on a gibbet to rot. The bodies thus released by the justice system became one significant strand of the supply that medical science required for its educational and research needs over the next eighty years. It relied on ‘Nobody’s Thing’.

It was not to be enough. There was meantime a corporate ambition amongst practitioners to gain full professional status from an expansion of European medical education. At Bologna, Padua and Paris, training doctors in human anatomy had been a national priority since the Renaissance. Now others followed suit, particularly in northern European countries and cities where Enlightenment values gained a strong intellectual foothold, like Edinburgh. Yet those in Britain faced a logistical problem. The murder rate lagged behind the expansion of human anatomy training. Not enough people were convicted of homicide to supply dissection tables, and medical students thus lacked enough corpses to dissect. Grave robbing soon became commonplace, and newspapers reflected public concern that the unscrupulous were indiscriminately digging up the dead for anatomical profit. Resurrection men sold the dead of the rich, middling-sort and labouring poor, disinterred for dissection. This class question of who owned the dead body and who should be charged legally for stealing human remains became a highly emotive one in contemporary British culture, until, that is, the controversial Anatomy Act (2 & 3 Will. 4 c. 75: 1832) (hereafter AA1832) changed the medical status quo. Two catalysts changed public debates about the need for more legal supply lines in human anatomy by the 1830s. First, the famous ‘Burke and Hare’ murders in Edinburgh revealed how the destitute, killed for medical profit, entered the supply chain of anatomists in Scotland. Second, the simultaneous death of an ‘Italian boy’ in London, murdered and traded for a similar dissection sale, caused public outrage. These scandals would result in the medical profession successfully lobbying for a better and more plentiful legal mechanism of supply, but crucially one still based on class inequalities. AA1832 permitted the poorest in society to become the staple of dissection tables, supplied by asylums, infirmaries, workhouses and street deaths, amongst the homeless, friendless and nameless of society. In turn, key aspects of AA1832 were to remain in force until HTA2004, a remarkable 172 years. Officially, AA1832 was supposed to end when the New Poor Law closed in 1929.3 In reality, as we shall see, its class ethos, tinkered with and rehashed a number of times, did not alter that much. This was because, as Richard Smith and Peregrine Horden have observed, early Welfare State council care homes were really just workhouse infirmaries renamed. They still supplied the dispossessed for dissection.4 In other words, in terms of body supply-mechanisms there was a great deal more continuity than discontinuity inside the healthcare system, a theme that runs throughout this book. Starting from this point, Table 2.1 summarises key statutes and important regulatory changes in British law on matters of consent, biomedical research regulation and the rights of the dead.

Table 2.1 The official boundaries of bio-security in modern Britain and Europe

Timeline | Legislation/regulations | Main features of its remit

Ancient times | English Common Law – Res Nullius – Nobody’s Thing | A dead person cannot own the property of their body once deceased

1832 | Anatomy Act | The dead must repay any welfare debt to society. Welfare costs paid from public taxation merit post-mortem. The individual is dissected and dismembered for the purposes of anatomy teaching and medical research

1926 | Coroners (Amendment) Act | Extended retention powers over post-mortems

1926 | Registration of Stillbirths Act | Stillborn children now constitute a potential ‘living’ person in law and as such their death and burial must be registered officially

1950s | Pituitary Gland Programme | Extraction of Human Growth Hormone post-mortem by anatomists, coroners and pathologists

1952 | Corneal Grafting Act | Regulates the removal of eye material taken from cadavers post-mortem

1960/1 | Declaration of Helsinki | World Medical Association’s new ethical framework for medical research

1961 | Human Tissue Act | Human tissue from a dead patient considered in law to be an unconditional gift. In the case of material derived from fatal operations (organ, body part, tissue), provided the patient when living gave consent for the surgical procedure that led to that removal, the material once removed is in law abandoned. It hence becomes the legal property of the medical establishment, removed for the therapeutic benefit of the consenting patient before their death. Doctors need ‘only make reasonable enquiries’ as to where human material originates

1962/3 | Medical Research Council (MRC) Annual Report | Seen as a cornerstone of medical ethics in Britain. Future funding of research studies dependent on adhering to a new Ethical Code of Conduct. Has been revised many times, especially in 1979 (see below)

1977 | National Health Service Act | Section 25 – where the Secretary of State has acquired: (a) supplies of human blood … or (b) any part of a human body … s/he may arrange to make such supplies or that part available (on such terms, including terms as to charges, as he thinks fit) to any person

1979 | Medical Research Council (MRC) Ethical Code | Compulsory for scientific and medical research studies based in Britain

1984 | Coroners’ Rules | Clarified post-mortems by coroners and the preservation of material. Rule 12 stated that: A person making a special examination shall make provision, as far as possible, for the preservation of the material submitted to him for such period as the coroner thinks fit

1984 | Anatomy Act | Passed to repeal aspects of the 1961 legislation but did not adequately clarify the use of tissue and organs, and their ownership

1986 | Corneal Tissue Act | Permitted the removal of eyes or parts of eyes for therapeutic purposes, medical education and research by persons who are not medically qualified, subject to appropriate safeguards. Amended parts of HTA1961 so that responsibility for medical death resided with the doctor(s) who had cared for the patient

1988 | Anatomy Regulations | A written record to be kept of all bodies and body parts retained by medical schools for human anatomy teaching and medical research

1989 | Human Organ Transplants Act | Passed to prevent the illegal trade in organs globally and to protect the vulnerable from becoming victims of organ harvesting

1990 | Human Fertilisation and Embryology Act | Specifically regulates research into fertility and embryology due to international concern about the future of designer babies

1998 | European Directive on the Legal Protection of Biotechnological Inventions | Directive 98/44/EC of the European Parliament and of the Council of 6 July 1998, ratified under the Treaty of Rome – harmonises the laws of Member States on the patentability of biotechnological inventions, plant varieties (as legally defined) and human genes – under BREXIT review in the UK

2004 | Human Tissue Act | It is a criminal offence to use or store human bodies or body parts without explicit consent. Human tissue can, however, subsequently be used in medical research under presumed consent provided it was first removed for the benefit of a living patient being treated and they have not objected in person

2008 | Health and Social Care Act (Regulated Activities) | Saying Sorry campaign of the NHS Litigation Authority

2009 | Jonathan Yearworth and others v. North Bristol NHS Trust (known as the Yearworth Judgment) | Court of Appeal judges warned that patients were entitled to compensation where sperm they had banked before undergoing chemotherapy was mistakenly destroyed by a hospital. It was not a defence in law that the hospital owned that sperm and so was not liable for its mistake. The case was an admission that Common Law may no longer be reliable with regard to the development of medical technologies and the ownership of bodies, body parts and bodily products

2014 | Care Act (NHS) | Duty of Candour – admission of errors is now a clinical responsibility to NHS patients

A full description of the technicalities of this legislative canvas is neither possible nor desirable in the context of this book. Broad trends are, however, important. Thus, prior to WWI a raft of intersecting changes fundamentally influenced public and legislative attitudes to the supply of the dead for dissection and research. The passing of the Third Reform Act (48 & 49 Vict. c. 3: 1884), the creation of County Councils (51 & 52 Vict. c. 41: 1888), the democratisation of the New Poor Law under the Local Government Act (56 & 57 Vict. c. 73: 1894) and the Liberal Welfare Reform Programme (1906–1911) encapsulated a growing sense that poverty and pauperism were not the fault of individuals.5 Having the vote without the citizenship rights of healthcare and welfare provision was thus regarded as an empty political promise by the labouring poor, and no longer tenable in a modern society. The progressive extension of the franchise to women, the structural and cultural effects of the war, increasing political and economic assertiveness by the working class and the final demise of the New Poor Law in 1929 all signalled the increasing fragility of public support for the legislative base that underpinned the use of bodies for medical research and teaching. During the 1930s, however, the modus operandi of the medical sciences did not really alter that much. It was resistant to the direction of wider cultural shifts happening in British life, and continued to rely on Victorian legislation.6

Change, when it came, arrived from a seemingly unusual angle. The growth of the Victorian information state had been a boon for the medical sciences by the early twentieth century.7 In particular, the expansion of the Coronial Office proved to be an important stepping stone in the piecemeal regulation of dissection and its further research agendas by the 1930s. This was the culmination of fifty years or more of a strategic realignment of the professional classes inside the expanding Information State, in which coroners sought to be pivotal to the development of forensic medicine and crime-scene evidence, working closely with the anatomical sciences as well as pathologists. As this author has shown elsewhere, some coroners were so successful at expanding their official jurisdiction that by the turn of the century a medical school which did not co-operate with the Coronial Office risked losing an important source of supply in the dead.8 It came therefore as less of a surprise to the medical profession as a whole that coroners were the first to lobby about the need for ‘special examinations’ (not just post-mortems) under the Coroners (Amendment) Act (16 & 17 Geo. 5 c. 59: 1926). For the purposes of this chapter’s legislative review, the part of the Act that mattered most to anatomists was Sections 21–24, which gave the coroner special powers for:

Post-Mortem and Special Examination

21. Post-Mortem examination without an Inquest.

22. Power of Coroner to request specially qualified person to make a Post-Mortem and Special Examination.

23. Fees to Medical Witnesses.

24. Power of Removal of body for Post-Mortem Examination.9

All of these slippery legal terms, notably ‘Special Examination’, created material ambiguities that were only removed when HTA2004 repealed them. Meantime, what the legal framework did was to extend the already extensive powers of the coroner and the discretionary justice in their hands. This discretion coroners made, and remade, during the modern era, often to the benefit of their professional contacts in dissection rooms and pathology labs, as we shall see in Part II.

At the same time, central government passed the Registration of Stillbirths Act (16 & 17 Geo. 5 c. 48: 1926), alarming anatomists. They worried that their natural allies at the Coronial Office, in sponsoring this new legislation, might cut dissectors off from parts of their historic supply-lines. Previously a stillbirth – defined by the Victorians as the death of a fetus after the twentieth week of pregnancy – went unrecorded as an ‘official’ death. In English law, spontaneously aborted fetuses (accidental and unnatural) had physically to breathe independently when separated from their mothers or they did not exist legally as a human being. To save money, such grieving parents normally buried their dead offspring without paying a sexton’s fee or covering a doctor’s death certificate expenses.10 Often when a mother and child died together, burying both in the same coffin was commonplace; families registered just the dead parent in the parish burial records of a local church. Anatomists could therefore ask coroners for their stillbirth cases without any official oversight, with the promise of a small supply fee to those struggling to make ends meet in relative or absolute poverty. But after 1927, acquired human material had to be recorded officially: ‘“still-born” and “still-birth” shall apply to any child which has issued forth from its mother after the twenty-eighth week of pregnancy and which did not at any time after being completely expelled from its mother, breathe or show any other signs of life’.11 Then the Births and Deaths Registration Act (1 & 2 Eliz. 2 c. 20: 1953) altered this stipulation again. The qualifying time span of official notification increased to ‘within 42 days of the birth’. This regulatory change meant that anatomists who acquired (or were supplied with) dead fetuses for the purposes of teaching and research could no longer do so unofficially, and without a time limit, as they had done for 200 years.12 The outcome of the legislation was that it convinced the medical sciences of the vital importance of co-ordinating with coroners more closely by the 1950s. The professional tensions that arose in this process are explored in Part II of this book.

By the early 1950s, a series of new laws and regulations about the use of the dead by the medical sciences became even more piecemeal. These generally reflected concerted public health campaigns that again had their roots in the late-Victorian era. Two in particular stand out because they were to have long-term consequences for disputed bodies, and issues surrounding them were to feature in public debates around the time of the NHS scandal at Alder Hey Children’s Hospital. The first was the Pituitary Gland Programme (hereafter PGP), which began in the USA in 1958 and extended to the UK under the auspices of the Medical Research Council (hereafter MRC). The aim of the initiative was to investigate whether children born with a shorter stature needed growth hormone treatment. The medical facts were that Growth Hormone Deficiency (GHD) originates in the pituitary gland, a pea-sized gland at the base of the brain. Its function in the body is to be the ‘master controller’ that ‘make[s] hormones and control[s] the function of other glands’ efficiently.13 Once it starts to malfunction, growth ‘slows down or stops from the age of two or three years onwards. It is often first detected through routine monitoring using growth charts although it can become more obvious when a child starts nursery or school and is much shorter than other children in the class.’ Children characteristically display GHD by ‘growing slowly’ but crucially they do so ‘in proportion’, that is, ‘the length of their arms and legs stay at the same ratio to their chest and abdomen’. Thus, ‘their face may look younger than their actual age. They may also seem chubbier, more than other children, due to the effect of growth hormone on fat storage in the body. Puberty may occur later than usual or not at all.’ By early adulthood, typical symptoms will have started to manifest, as:

  • Increase in fatty tissue, especially around the waist

  • Decrease in lean body mass (muscle)

  • Decrease in strength and stamina, reduction in exercise capacity

  • Decrease in bone density, increase in rate of fracture in middle age and beyond

  • Changes in blood cholesterol concentrations

  • Increased sensitivity to cold or heat

  • Excessive tiredness, anxiety or depression

  • Reduction in quality of life14

Medical science in Britain was therefore concerned from the 1950s to do new research on whether GHD had links to poor diet, a lack of sanitation or substandard housing: all social problems once familiar to the late-Victorians, exacerbated by the Wall Street Crash (1929) and the food-rationing privations of WWII. The main research tool was to extract GH post-mortem in order to see ‘if it could be manufactured in the laboratory and used to treat patients with hypopituitarism’.15 This PGP initiative would expand exponentially in the 1960s, and by the 1980s it had grown into a commercial enterprise in northern Europe, but one still reliant (in Britain) on the relatively cheap extraction of GH by anatomists, coroners and pathologists. The standard MRC payment for each post-mortem extraction was 1s 6d (7.5p in decimal currency) in the 1950s, increasing to 20p by 1985. As the amount of GH extracted each time was very small, multiple extractions happened until a more profitable, synthetic replacement was officially approved for NHS use in the 1990s. It was this hidden history that Professor Van Velzen exploited at Alder Hey Children’s Hospital when he removed organs, including pituitary glands, as so-called ‘bio-extras’. The standard means of harvesting GH was thus a classic case of ‘going around the law while going through legal processes’, overseen by the MRC and then supposedly the NHS.16 And it proved to be a pivotal catalyst for HTA2004.

Meantime a second post-war initiative involved the passing of the Corneal Grafting Act (15 & 16 Geo. 6 & 1 Eliz. 2: 1952). This too had its roots in late-Victorian public health concerns about the welfare of the poorest children in England. Many suffered from common eye diseases and eye defects due to vitamin deficiencies and birthing problems associated with substandard medical practices before the establishment of the NHS. Professor Arthur Thomson, for instance, who ran the dissection programme at Oxford University medical school from 1885, pioneered eye research and was funded by the MRC from WWI to study ophthalmology and its neurology. The new legislation in 1952 was hence the culmination of fifty years of research work, which seemed to justify expanding regulation of the removal of eye material from cadavers post-mortem. As the British Medical Journal announced:

The use of cadaver material for medical purposes [has been] … governed by the Anatomy Act of 1832 (2nd and 3rd William 4, cap 75.), which put a stop to the practices of the ‘resurrectionists’, and aimed at ensuring a legal supply of subjects for anatomical dissections from the bodies of unclaimed persons dying in public institutions. That Act did not help the provision of material for corneal graft surgery, since a complicated legal procedure has to be carried out before the body is available, and does not permit the removal of a fresh organ from the body since this is permissible only on a Coroner’s order. Nor did the Act allow any person to bequeath his or her own eyes for graft purposes, as in law the dead body has no property. Legal opinion was that the removal of cadaver eyes for graft purposes, even with the consent of relations was, therefore, illegal. In addition, a large number of enlightened people in Great Britain who wished to bequeath their eyes for corneal grafts were, by law, prevented from doing so. It seemed, therefore, that if these obstacles could be removed the supply of donor material would be legally increased; surgeons would not run the risk of legal actions and the voluntary bequest of eyes would probably be sufficient for anticipated needs.17

Importantly, this legislation created two further initiatives that should have opened up a medico-legal space for donors and their families to enquire more about bodies and their body parts in their material afterlives. All the eye grafts were sent to a new eye bank and corneal plastic units based at prominent hospital eye departments, such as that at the Queen Victoria Hospital in East Grinstead, West Sussex. Aware also of the sensitivities surrounding the gift of eyes, with many people feeling squeamish about donating them even after death, government launched a major publicity campaign. The BBC contributed; the press (both quality and tabloid newspapers) withheld sensational cases and emphasised instead the positive outcomes for NHS patients; and together the Women’s Voluntary Service and the Royal College of Surgeons approached bereaved families in hospital emergency rooms for donations. In other words, in this specific context at the start of Queen Elizabeth II’s new reign there seemed to be a concerted effort to be more engaging and open-handed. The confusion about material afterlives therefore came about after the passing of three amendments to AA1832: namely the Human Tissue Act (9 & 10 Eliz. 2 c. 54: 1961), the Human Organ Transplants Act (Eliz. 2 c. 31: 1989) and the Anatomy Act (Eliz. 2 c. 14: 1984).

In what follows in the rest of this chapter, these are styled HTA1961, HOTA1989 and AA1984 to avoid confusion. Before summarising their key features and explaining why they gave rise to disputed bodies by the late 1990s, it is important to set these cumulative legislative initiatives in the context of the history of international law. This is because what was happening in Britain did not occur in political isolation. Thus, as P. Sohl and H. A. Bassford explain: ‘During the 1900s with the growth of complexity in both scientific knowledge and the organization of health services, the medical ethical codes addressed themselves to elaborate rules of conduct to be followed by the members of the newly emerging national medical associations.’18 Then ‘after World War II the World Medical Association was established as an international forum where national medical associations could debate the ethical problems presented by modern medicine’. Against this backdrop, concern was nonetheless also being expressed that there was a danger of seeing international consensus as ‘progress’ whilst ignoring its ‘cultural relativism’. In reality, everyone welcomed the international framework of medical ethics, but it had to be applied in countries with ‘different methods of financing medical services’, and these differential socio-economic forces shaped doctoring and medical research cultures that were constantly evolving during the post-war era. In other words, we need to engage briefly with what the Hippocratic principle to ‘first do no harm’ meant in principle (the international foundation of medical ethics) before considering how it was applied in practice in modern Britain (the national imprint of HTA1961, HOTA1989 and AA1984).

Primum Non Nocere – First Do No Harm – International Medical Ethics

Once the Nuremberg Trials in 1945 exposed the atrocities of Nazi medical experimentation in the death camps of Auschwitz-Birkenau, there was an international effort co-ordinated by the Security Council members of the United Nations to protect individuals from future exploitation.19 The Nuremberg Code (1947) hence outlawed human experimentation of all descriptions that involved doing harm to the patient. Linked to the Declaration of Geneva (1948), this reflected widespread condemnation of war crimes in medicine, as well as a global commitment to monitor medical ethics to an international standard. The subsequent Declaration of Helsinki (hereafter DofH) in 1960/1, however, did not become international law. Instead, the UN ratified it as a code of practice, and monitored its uptake. One influential organisation to adopt its framework voluntarily in June 1964 was the World Medical Association (hereafter WMA). The WMA consisted of a collection of voluntary national associations containing some eight million doctors worldwide, who signed up to self-regulate their commitment to medical ethics, education and the highest professional standards in patient-practitioner relationships. A crucial part of this commitment was that the WMA promised to remain politically neutral and independent of the UN. At its 50th anniversary celebration in 2014, the WMA celebrated the fact that the original DofH was by then regarded as the cornerstone of human rights, a code of medical ethics that seeks to protect individuals against human experimentation in a global medical marketplace. It has unquestionably become the standard by which all ethical codes in individual nation states are judged in the human rights arena. It is not a code fixed in aspic: quite the opposite. Seven revisions have been made since 1964, and that evolution is a creative process that keeps medical ethics valid in biomedicine today. In summary, an overview reads:

The fundamental principle is respect for the individual (Article 8), their right to self-determination and the right to make informed decisions (Articles 20, 21 and 22) regarding participation in research, both initially and during the course of the research. The investigator’s duty is solely to the patient (Articles 2, 3 and 10) or volunteer (Articles 16, 18), and while there is always a need for research (Article 6), the subject’s welfare must always take precedence over the interests of science and society (Article 5), and ethical considerations must always take precedence over laws and regulations (Article 9).

The recognition of the increased vulnerability of individuals and groups calls for special vigilance (Article 8). It is recognised that when the research participant is incompetent, physically or mentally incapable of giving consent, or is a minor (Articles 23, 24), then allowance should be considered for surrogate consent by an individual acting in the subject’s best interest. In which case their consent, should still be obtained, if at all possible.

(Article 25)20

The principal issue nonetheless with this important DofH codification is not its best intentions but, rather, its flaws. Few countries have queried the dignity of the human research subject. Most agree that an ethics committee should oversee scientific research that involves people (whether alive or dead). There is likewise consensus that good practice is what medicine is all about. Nation states do, however, differ on the degree of legal emphasis contained in the original DofH and its seven revisions. For the purposes of this book, there has been a great deal of contention about the meaning of ‘informed decisions’ (Articles 20, 21 and 22) and what system of consent (opt-in versus opt-out) should be adopted on location. In some countries like England, patients have to make a positive choice to enter a clinical study or donate their human remains to medical research in writing prior to death. In Wales, by contrast, the National Assembly officially ratified an opt-out system of organ donation from 1 January 2015 because of organ donation shortages; that is, if you die it will be presumed in law that you intended to donate unless you took steps when living to state otherwise.21 Recently, the Conservative government under Theresa May ratified new legislation in Parliament that followed the Welsh example in organ donation – though not without controversy. Thus, the fundamentals are the same but their resource management does differ, and this matters if historians are to trace research threshold points and actor networks (discussed in Chapter 1), as well as the body disputes that have taken place in different places, at different times and for different reasons using donated bodies.

There has been, therefore, an increasing recognition in legal circles that translational medical ethics requires good communication and an ongoing dialogue to reflect cultural change, and that in the modern world this has been a very complicated process since WWII. Some legislation succeeded; other bills did not. This was because, in the recent past, civil servants who drafted government business in Britain were tasked with reconciling ‘medical ethics, business ethics, professional ethics, and human rights considerations’ as well as taking into account a doctor’s ‘fundamental fiduciary responsibility to the patient in the context of a growing secular, libertarian tradition’.22 That complex and fast-moving bioethical backdrop started to expose the need for ‘a fundamental reorientation’ of issues of informed consent. Slowly, as legislation did not have the impact intended, patient groups began to argue that legal and ethical guarantees were not as robust as the medical sciences claimed. However, this often became the focus of public attention only after a number of body disputes attracted press coverage. This was because unless you can measure something, it is often difficult to manage it properly. Much modern medical research involved body parts, brain slices and tissue samples. It was consequently easier for those inside the system to evaluate international ethical policies translated to national contexts than actual practices that were piecemeal locally. Approved policies also took time to be adopted, refined and applied by their intended users, and this continually had the potential to produce multiple local variations. It is therefore necessary to return to a discussion of keynote legislation and core medico-legal issues in the UK, since these ambiguities frame the research cultures in the rest of this book.

A Toothless Tiger23

On 6 November 1967, the Right Hon. Julian Snow MP, Minister for Health in Harold Wilson’s first Labour government (1964–1970), was asked by Cranley Onslow, MP for Woking, in the House of Commons: ‘if he is satisfied that general practitioners are sufficiently aware of the provisions of the Human Tissue Act 1961; and if he will make a statement’.24 The Minister replied that: ‘My Department gave general practitioners guidance on the provisions of this Act in a memorandum issued in September, 1961 and I have no reason to believe that this has been generally overlooked. I am, however, glad to take this opportunity of again drawing attention to this guidance.’ The matter, though, did not rest there. Over the next four years, there were numerous debates and discussions in Parliament about the efficacy of HTA1961. At issue were its implications for organ transplantation, and the degree to which it had placed more, not less, discretion in the hands of coroners, doctors, pathologists and transplant surgeons to decide on the material fate of donations from the dead and living donors in hospital care. So much so that, during a heated Prime Minister’s question time in the House of Commons on 15 June 1971, Edward Heath (the Conservative Prime Minister), in reply to a question about the need to repeal HTA1961 and replace it with a new HTA statute in a forthcoming Queen’s Speech, announced:

I realise that it is not only a question of opinion in the medical profession but that many hon. and right hon. Members have expressed the view that there should be legislation on this subject. Nevertheless, I think that if the hon. Gentleman studies the matter closely he will recognise that it is extremely controversial. What is required is a clear indication that legislation will improve the situation, and at the moment I think that that clear and convincing proof is lacking.25

At issue was that HTA1961 was supposed to have sorted out the class injustices of AA1832, but instead it had led to more ambiguity, confusion and misinformation. For the general public, what the legislation was supposed to have done was to set out in plain English exactly what informed consent meant, but it was flawed by the slippery civil-service speak of Parliamentary parlance. As Professor Margaret Brazier, Chair of Law at the University of Manchester, noted in the Journal of Medical Ethics:

The Human Tissue Act 1961 is a toothless tiger imposing fuzzy rules with no provision for sanctions or redress. Absent directions from the deceased … the act provides that the person lawfully in possession of the body (often the hospital where the body lies) may authorize removal of body parts for the purposes of medical education or research providing that having ‘made such reasonable inquiry as may be practicable’ [even though there is] … no reason to believe that the deceased had expressed objections to such a process or that ‘the surviving spouse or any surviving relative of the deceased objects to the body being so dealt with’. Under the Human Tissue Act it may appear that the requisite authorization, consent if you like, comes from the hospital. Hospitals permit themselves to remove organs and tissue which they desire to put to scientific or medical uses.26

Hindsight, she conceded, is a wonderful thing. Nonetheless, those who drafted HTA1961 should have been aware that although ‘consent is such a simple word’ it was also self-evident that a lack of clarity had resulted in many disputed cases. Helpfully, Brazier also elaborated on the legal position of the medical sciences:

A previous Master of the Rolls, Lord Donaldson, took a straightforward view of consent to medical treatment by living patients. He likened consent to a flak jacket. Once consent is obtained, the doctor is protected from legal gunfire. Consent protects his back. He cannot be sued. Academic lawyers, those rather precious creatures, dislike the analogy, ignoring as it does any analysis of the interests consent protects, avoiding even any mention of autonomy. Moreover, whether you like flak jackets or not, the crucial question remains of who has the requisite authority to provide the flak jacket to the doctor.27

There were essentially two medico-legal issues: ‘Whose consent should have been obtained for organ retention? And whose consent ought to be obtained for organ return?’ In other words, the main flaw in HTA1961 was exactly what the ethnographer Marie-Andree Jacobs identifies as a central problem with ‘the law: how was everyone involved absorbing and using legal frameworks’, and in what ways were those ‘actors’ going ‘around the law while going through legal processes?’28 In many respects, these key ethical questions were not resolved by the raft of new legislation passed in the 1970s and set out in Table 2.1. This was despite how widely the medical profession welcomed the Medical Research Council’s new Ethical Code in 1979, which made MRC funding dependent on following its guidance. By the opening of the 1980s, there seemed to be an urgent need for yet more piecemeal legislation, tackling but never resolving discrete aspects of the consent issue.

The enterprise culture of Margaret Thatcher’s Conservative government (1979–1990) saw the start of an unprecedented expansion of biotechnology in Britain.29 In part, this reflected just how much early transplant surgery had benefitted from improved surgical training techniques, as well as the development of the next generation of anti-rejection drug therapies by the pharmaceutical industry. There were public health campaigns organised by the Department of Health to get more of the general public to carry organ donation cards, but still sociological studies found that half of those bereaved were prepared to give and half were not. As transplant lists grew longer and patients’ expectations rose, wanting to push past the dead-end of life, more and more parliamentary questions reflected on the need to deal separately with human organ transplantation. The result was the passing of HOTA1989. It had been preceded by AA1984 and the Anatomy Regulations (1988) (hereafter ARA1988). HOTA and ARA were in principle about better accountability. The first prevented the illegal trade in organs and protected the vulnerable from becoming victims of organ harvesting. The second made it compulsory for a written record to be kept of all bodies and body parts retained by medical schools for human anatomy teaching and medical research in Britain. This second medico-legal guarantee was heralded as a major ethical step forward, but it was nothing of the sort, because the original AA1832 had had a very robust system of tagging bodies to paperwork at each stage the corpse was moved on or changed hands.30 ARA1988 therefore reintroduced an old safeguard that HTA1961 had watered down, reviving it to mask the fact that HTA1961 was flawed. Because no official body had oversight of the entire process of medical research and its various hidden histories of the dead, older standards could be recycled in the belief that this was progress. It was clumsy and careless to revive AA1832 provisions that had proved not fit for purpose in their HTA1961 form.

Focussing on the central aims of the various pieces of legislation passed in the 1980s to protect patients and facilitate further medical research, one aspect of AA1984 stands out. Amendments to statutes dealing with the legal use of organs and human tissue did not clarify who owned human material removed from its source. Moreover, it was clear that the issue of informed consent in a whole variety of contexts was very complex indeed. This was because it involved a balancing act between four sorts of agency: the patient, scientific research, medical doctors and public scrutiny. Thus, in letters to the British Medical Journal (hereafter BMJ) at the time the new AA1984 became law, some clinicians were asking uncomfortable ethical questions. What would happen to vulnerable patients with mental ill-health, manipulated into clinical trials by virtue of their vulnerability, and would those who committed suicide be automatically handed over by coroners for medical research post-mortem? Of concern were those patients who helped test new psychiatric drugs or ‘electroconvulsive therapy’ that aimed to alleviate severe depression. Is it possible, enquired Dr Neville-Smith in a letter to the BMJ, that fully informed consent is never achievable because the person in mental ill-health has an unbalanced mind? Others likewise questioned what happens in organ donation to those so bereaved after a fatality that they cannot think straight. In response, a member of the psychiatric department at Leicester Royal Infirmary claimed that:

SIR,- Dr Neville-Smith raises an important ethical issue when he questions the nature of informed consent. It is, however, impossible to offer a simple solution. The protection of the individual patient, the need for research to improve both fundamental knowledge and patient care, and the need to maintain a humane and scientific profession must all be secured by policies acceptable to doctors and open to public scrutiny.31

It was the emphasis in this letter of reply on matters of consent being ‘acceptable to doctors’ (first – paternalism) and open to ‘public scrutiny’ (second – accountability) in that running order of priority that would prove to be contentious by the end of the 1980s. Eventually, the Isaacs Report (2003) would set out how and why the various statutes had proven to be inadequate by the end of the 1990s, even without the various NHS scandals that were to be catalysts for HTA2004:

9.3 No claim by statute is available to the person from whom tissue is removed. Indeed, the implication of the Human Tissue Act 1961, the Human Organ Transplants Act 1989 and the Anatomy Act 1984, though it is not expressly stated, is that the tissue removed pursuant to these Acts is given free of all claims, that is an unconditional gift. The Human Fertilisation and Embryology Act 1990, is less straightforward. Donors of gametes or embryos may impose conditions on use and may vary or withdraw any consent given. By adopting a scheme of consents, however, the Act avoids vesting any property claim in the donor [sic].32

The ethical issue was that the piecemeal nature of legislation matched the piecemeal climate of actual research on the body: disassembled into parts, opened up for transplant harvesting of organs, and disaggregated to facilitate tissue, cellular and DNA modification. As Ronald Munson in his thought-provoking study of organ transplantation, ethics and society observes: ‘Here is the “body that will not die” or at least not until the medical sciences is “done with it”.’33 Thus, the ethical question remains: why was (and is) the public not sharing in the profitable outcomes of this enterprise? For, Munson insists, to describe the reach of scientific research as a simple ‘gift exchange’ in a biomedical era is misleading, especially when ‘transplantation … is a second-rate technology. … It’s a crude, stop-gap measure to keep people from dying.’34 It is a viewpoint shared by many others in the wider scientific community. Sir Robert Lechler, Chair of Immunology at King’s College London, thus explained in an interview with the Times on 14 July 2018 that soon ‘organ regeneration could end “barbaric” transplants’.35 His latest regenerative medical research aims to allow patients to ‘regrow their own diseased tissue … through stem cell changes to their genetic machinery’. The leading journal Nature likewise featured the latest laboratory discovery that there is a ‘latent capacity of some organs to grow back when they are damaged’ without the sort of debilitating side-effects that can blight the lives of transplantation patients on permanent immunosuppressant drugs. Science now recognises that transplantation does extend life expectancy, but it also has opportunity costs for patients, ones seldom elaborated honestly in public health campaigns. As Jacobs reflects in a similar refrain: ‘what emerges from documentation practices [in patient case notes] is agency in abeyance, a form of submissive self’.36 It was this lived experience that would culminate in HTA2004, but not before the question of brain research was resolved.

Brain Banking

The final catalyst that would contribute to a very public set of debates about the need for a repeal of old legislation in its entirety was the publication of the Isaacs Report in 2003. Jeremy Metters, HM Inspector of Anatomy, conducted a public enquiry into the retention of brains at Manchester University for post-mortem investigation and further medical research. As he explained:

It is important to remember that this investigation followed the chance discovery by Mrs Elaine Isaacs in April 2000 that the brain of her late husband had been retained for research in February 1987.

Had Mrs Isaacs not come across the letter sent to Mr Isaacs’ general practitioner by the joint research team, she would never have known that her husband’s brain had been retained, and the widespread retention of brains, and other organs, from Coroners’ post mortems might have remained undisclosed.

Most of the brains from Coroners’ cases in the 1980s and 1990s were initially held for entirely proper diagnostic investigation into the cause of death. A very much smaller number were retained specifically for research or teaching. The feature that unifies both these categories is that very few relatives were aware of the practice and I found no evidence that any were asked for their consent for later research or teaching use.

In this way the requirements of the Human Tissue Act [1984] were consistently disregarded.37

Metters undertook an audit and discovered that ‘21,000 brains collected between 1970 and 1990 were still held’ for medical research. It was unclear how and under what circumstances Coronial cases generated human material from hospital mortuaries, or asylums, in England and Wales. He concluded that: ‘Among the limited number of consent forms that I have examined, few specifically mention organ retention.’ He thus reflected that: ‘It appears the assumption was made that a signed post mortem consent form also indicated agreement to organ and tissue retention. It will never be known how many relatives were aware that organs might be retained from hospital post mortems without their knowledge.’38 There was hence a need for an explicit form of informed consent that kept relatives fully and transparently engaged. This required new legislation to restore public confidence in post-mortems. His view was that there were ‘serious weaknesses in the Human Tissue Act (1984)’. Perhaps the most obvious human one was that the statute made little allowance for the fact that:

The sudden death of a relative is among the most stressful of life’s experiences and the closer the relative the greater the distress. The same usually holds true for the relatives of those whose deaths are reported to the Coroner for other reasons.

Many who are suddenly bereaved are ‘in shock’ in the days that immediately follow. More ready access is needed to the advice, support and counselling that is available for the relatives of those who die in NHS hospitals. …

When for the Coroner’s purposes a formal statement is needed, there should be no pressure on a relative for its urgent completion or duress over the contents. While ‘in shock’, erroneous information may too easily be included.

As many relatives do not, at first, take in details of what is explained to them a written summary should be provided.39

It was imperative that those bereaved had a process of informed consent explained to them, a notion that echoed what some correspondents had been saying in the letters page of the BMJ since 1984. In the case of Mr Isaacs, whose brain had been retained, allegedly for medical research, but in reality ‘destroyed’ (according to the official report) without the knowledge of his Orthodox Jewish family, an apology was sent on 28 July 2000 by Professor Deakin, head of the brain research unit at Manchester University, which read:

I do fully understand and sympathise with the additional distress this discovery has caused you. I very much regret that current standards and safeguards about post-mortem tissue that would have prevented this occurrence today, were not in place 13 years ago. At that time there was little awareness that a relative might have strong views or legitimate rights concerning the removal of tissue and this was overshadowed by a strong desire to assist research. While not in any way condoning these attitudes, it is worth reflecting that this UK research led directly to understanding the causes of Alzheimer’s disease and to entirely new treatments for this incurable condition [sic].40

There were two key misleading elements in this well-intentioned statement. The first is that Jeremy Metters, HM Inspector of Anatomy, concluded that: ‘My enquiries have subsequently confirmed that no research had been undertaken on Mr Isaacs’ brain, which had probably been disposed of in 1993.’41 So the apology and its justification based on a medical research defence – namely the contribution that brain retention in this case may have made to a future cure for Alzheimer’s – was a false one.42 It was in fact very rare for a medical researcher at the time to be able to identify explicitly from their flimsy paperwork what they hoped to achieve with specific human material at the point of so-called ‘donation’, or subsequently, because the culture of record-keeping was to keep it sparse. This therefore looked and read like an officious excuse for an apology to those who read it. The second misleading element concerned the culture of medical research and its claimed lack of knowledge about wider cultural and religious sensitivities at the time. Again, this was incorrect.

Mrs Isaacs had repeatedly told the police, coroner and attending doctor on the night of her husband’s suicide that he was an Orthodox Jew and that she therefore needed to bury the body intact within twenty-four hours according to her family’s religious traditions, but she was ignored. This failure of oversight is striking. Given the publication of Ruth Richardson’s renowned book, Death, Dissection and the Destitute, in 1987, there was ample information in the public domain about the cultural and religious meaning of death and dissection since the original AA1832. Richardson’s study received a lot of publicity in the medical press, and it was well known in the media that criticisms were being made about the cultural conduct of the medical research community per se. Indeed, so respected was her work that the Chief Medical Officer at the time of the various public enquiries into the NHS organ retention scandals at Liverpool and Bristol, Sir Liam Donaldson, had asked Richardson to assist with the cultural dimensions of his findings. It would therefore have been more honest to say in the Isaacs letter of apology that the medical profession did not choose to inform itself, rather than trying to use a weak ethical defence that ‘current standards and safeguards’ were not in place and there was ‘little awareness’ of the impact on grieving relatives. Indeed, it would be the scale of retention both at Manchester (‘5,000 organs and tissues held at 4 locations’43) and elsewhere (some 50,000 organs,44 rising to 105,000 in the subsequent Redfern Report45) that prompted a public backlash. It was no longer tenable to say that the medical sciences were sincere, but sincerely wrong.46

Today there is international recognition that bioethics is a very significant but also a somewhat complex and confusing legal framework, one which individual clinicians apply in their own cultural settings across the global community. One enduring criticism of bioethicists is that, ‘in terms of the classic triad of thought, emotion and action’, they have ‘focused almost exclusively on thought – ethical thinking per se – and given inadequate attention to emotion and action’.47 Thus, ‘what has been lost in the academic processes’ of evaluating the evolution of international and national ethical frameworks are the ‘concrete human dimensions … the connection between ethical discourse and the full dimensions’ of clinical decision-making between actors in biomedical research facilities, particularly as technology advanced after WWII. To advance clinical ethics thus requires more careful historical consideration of rhetoric (ethical codes internationally) and reality (muddled national legislation), and their ambiguities. Moreover, as George Belkin wrote, we need medico-legal perspectives that are:

less concerned with generating rules of conduct than with deepening and enriching the self-understanding and perspective brought to bear when people confront choices and each other. And a humanist ongoing engagement and routine reflection can make medicine more deeply ethical than can duels over methodologies or ethics per se. Bioethics has narrowed how reflection in medicine about medicine takes place and has inhibited rather than rescued a medical humanism by an overrated focus on restrictive reduction to ‘the ethical’.48

This book sits at this intersection – between rules and practicalities – between laws and choices in research spaces – between human stories and medical ethics as they really happened.

Conclusion

A raft of legislation in Britain, stretching from the Murder Act in 1752 to the Human Tissue Act in 2004, had sought to regulate the use of human material from the dead and the living for teaching and research purposes. Largely, however, regulations were piecemeal, and Parliament never took robust oversight of all the stipulations to check that they still made sense in a fast-changing biomedical world. Those working inside laboratories (pathologists and neurologists), dissection rooms (anatomists), medical schools (clinicians and doctors), as well as specialists attached to cancer study centres, all assumed that the particular law they were following was correct. Few stopped to think about, much less check on, the robustness of their medical ethics and governance criteria. Everyone assumed that methods and training were correct, standard practice within the medical science community. It was the cultural changes taking place in modern British society which would lead to their proper investigation by the Chief Medical Officer around the Millennium. Meantime, the network of actors involved – in which the Coronial Office would prove to be a linchpin – followed fundamentally flawed statutes. The legal framework turned out to be akin to standing on ethical quicksand. Thus, to engage with the sort of ‘medical humanism’ that Belkin called for recently, we end Part I of this book by navigating a selection of human stories in Chapter 3 that reflect the main research themes to come in Chapters 4–6 in Part II. In this way, instead of dissecting bodies and mislaying their material histories, we begin to reconstruct, trace and analyse what it meant to conduct medical research behind closed doors, to sign up to train in human anatomy and to experience medically what soon became known colloquially in popular culture as the Ministry of Offal.

3 The Ministry of Offal

Introduction

Frances Partridge, diarist and writer, attended a Christmas wedding in central London on 23 December 1962.1 Recently widowed, she found her financial affairs precarious. She would shortly take the difficult decision to sell her Wiltshire home, Ham Spray House, it being too expensive to maintain on a small widow’s pension. Frances looked forward, even so, to her only son’s yuletide marriage.2 He had been a great comfort to her in the dark days of early bereavement. Bleak times seemed to be behind them both because there was now the promise of a future grandchild. Her son’s fiancée was pregnant and would shortly give birth to a baby girl. Little did Frances know, however, that her hopes of enlarging her family circle would soon be dashed, and cruelly so. Her beloved son, an up-and-coming talented writer, was to die of a heart attack just nine months after his wedding and only three weeks after the birth of his new daughter.3 On 7 September 1963, the day of her son’s death, Frances’s grief as recorded in her diary was raw: she wrote – ‘I have utterly lost my heart: I want no more of this cruel life’.4

On her son Burgo’s wedding day, Frances’s heart had in fact been full of hope.5 She invited a wide circle of friends to the celebration, many from amongst the famous Bloomsbury set of artists, painters and writers, her relatives by marriage. Her new daughter-in-law, 17-year-old Henrietta, was the daughter of David ‘Bunny’ Garnett.6 He was bisexual, a former lover of the painter Duncan Grant and the widower of Frances’s sister.7 As Bunny lived in France, it was a gathering from across Europe and England that promised to entwine more closely the bonds of friends and family. Frances wrote an affectionate and amusing account of those assembled in her diary:

Notes on the wedding: the absolute charm of Duncan, arriving with a button-hole in a white paper bag, beaming at everyone. The geniality of Bunny who suddenly began talking about the necessity of leaving one’s body to the doctors with a look of great jollity on his face (more suitable to the occasion, than the subject). His father’s mistress, old Nellie someone-or-other, has just died and when Bunny went to arrange the funeral he found to his relief that the body-snatchers had been already, and all the trouble and expense were spared him: ‘You just ring up the Ministry of Offal, Sackville Street’ is what I remember his saying, but I suppose he can’t have.8

Having lost a husband and a 28-year-old son to heart failure over a three-year period, Frances had every reason to revisit her diary entry on the Ministry of Offal. New medical research might have prevented the early deaths of those she loved. Yet, even this dispassionate, highly intelligent woman could not bear to donate her husband’s or son’s body to medical science. Here was someone so shockingly bereaved, in such emotional turmoil, that the physical pain she experienced was almost impossible to bear. The amusing quip at her son’s wedding had foreshadowed a tragic end to her intimate family life, akin to a Grimms’ fairy tale. As Burgo’s publisher, Anthony Blond, wrote years later:

One afternoon whilst talking on the phone to Charlotte [Blond’s wife], Burgo died. He was suffering from von Falkenhausen’s disease [an aortic aneurysm discovered at the post-mortem] and part of his aorta had flaked off and choked him. I am told that when his mother was informed she telephoned Harrod’s and asked them to collect her son’s body, cremate him, and send her the bill.9

For a woman who did not believe in an afterlife, there was no solace to be gained from a sense of spirituality. Nor could she bear to contemplate the bodies of her loved ones displayed for public consumption in any respect. Indeed, in accordance with her rationalist and atheist beliefs, Frances refused to hold a formal funeral for Burgo – a decision that his publisher said he ‘never forgave her’ for taking.10 The alternative consolation of the gift to humanity of her son’s body was unconscionable as she sank into depression, unable to write her diary for the next two years. The gap came to symbolise the gulf that death left in her life.

Even so, Frances was a writer and what she could constructively do was to chronicle the human condition of trying to live with the pain of a double bereavement. As Anne Boston remarked of her diaries covering this sad period: ‘The stages of grief stand out almost like a clinical case history. At first she feels eerily like an amputee, at the same time fearing her sense of loss still lies in wait.’ Frances herself remarked in 1962 that grief is like a ‘ghastly elephant trap. … I have buried and suffocated some part of it and one day I shall wake and find I’ve been falsely bearing the unbearable and either kill myself or go mad.’11 It is precisely this sort of scenario that has often resulted in disputed bodies in modern biomedicine. For Frances could afford a cremation, she had legal control of the body and she never had to resort to voluntary donation out of poverty. A doctor did not compel her to think about when exactly the dead-end of life happens in a laboratory or dissection-room setting. Everyone respected her wish to cremate her son with dignity and in the way that she and her daughter-in-law envisaged. And without the proverbial Ministry of Offal this would also have been the end of the story in all cases of untimely or tragic death. In practice, however, most ‘ordinary’ people did not know that at Coronial Inquests parts of their loved ones were used to establish a cause of death and for further medical study under one of the Human Tissue Acts outlined in Chapter 2. The Ministry of Offal had a fleeting presence in a doctor’s interaction with patients or in written guidance and advice. This is not necessarily a criticism of medical science. Many researchers and other professionals acted within current guidance at the time, and the story above clearly shows the dilemmas involved in reconciling research ethics with painful personal sensibilities. In later life, Frances thus still recalled ‘the sharpness of the death of her husband and son’ even after forty years of bereavement.12 Yet this was also the sort of person expected to be open to body donation. Frances never espoused religious beliefs that constructed medical research as something taboo: quite the opposite. Even so, like many of her contemporaries, it was the physical shock of grief that outweighed the call of medical science. In this case, her wishes were respected. In others, the wishes of families were either ignored or never canvassed, or undue pressure was applied to obtain consent. The rest of this chapter unpicks some of the competing influences that shape how disputes about bodies (the focus of Part II of this book) might originate. Running from the early twentieth century to the present, it will concentrate on five core sets of life writing.

The first, letters by Mrs Pearl Craigie, explores how negative public sentiment about the use of bodies and the harvesting of organs could develop, and the defensive attitudes this provoked in the medical establishment. The second, third and fourth sets of life writing – respectively, Richard Harrison, Jonathan Miller and Michael Crichton – illustrate the complex ethical, moral and personal standpoints of those who benefitted from or conducted anatomical research and its teaching activities. A final set of life writing – the author’s own reflections on visits to modern anatomical spaces and dissections – focusses on the sentimental and experiential aspect of anatomical practices, in effect showing how the three types of body disputes that underpin the agenda for Part II of this book can sometimes (but not always) be generated by the complex feelings involved in medical research cultures rather than by an intent to deceive. Here then, we encounter the human flow of medical research and the tides of public opinion in the serpentine river of life and death of a biomedical age.

Mrs Craigie’s Complaint

At the turn of the twentieth century, female novelists who came to prominence in the press often did so with strong political convictions, and many went on to become journalists. One leading columnist was ‘John Oliver Hobbes’, the pseudonym of Mrs Pearl Mary Teresa Craigie. She used her writing talents and feeling for a good story, not just to entertain, but to tackle social inequalities in British society. Thus, the London Review observed how Mrs Craigie ‘with an unfailing finger pointed out the sores of modern life’ and did so in the belief that she should be ‘a woman who faithfully served her contemporaries to her utmost ability’ in popular print culture.13 During the Edwardian era, she focussed public attention on hidden histories of the dead, to the embarrassment of those dissecting at leading London medical schools.

In the late spring of 1906, a series of letters appeared in the Daily Mail, which caused considerable consternation in medico-legal circles. They were penned by Craigie (see Illustration 3.1), a former president of the Society for Women Journalists in London.14 One controversial letter asked ‘Mr Sydney Holland … Chairman of the London Hospital’ to reveal ‘how a post-mortem examination may be performed with the act of dissection’. Craigie queried the standard methods of cutting up a dead body according to the various definitions set out in a medical dictionary, pointing out that it was self-evident that there was a great deal of difference between:

  • Dissection: The operation of cutting-open a dead body.

  • Post-Mortem: An examination of the body after death: autopsy.

  • Autopsy: Dissection and inspection of a dead body.15

Illustration 3.1 Photograph of ‘Mrs Craigie’ for an article by Margaret Maison, ‘The Brilliant Mrs Craigie’, The Listener Magazine, 28 August 1969, Issue 2109, p. 272. The photograph originally appeared in the flyleaf of John Morgan Richards, The Life of John Oliver Hobbes told in her correspondence with numerous friends (John Murray, Albemarle Street, 1911). As this publication is now out of the copyright clearance restrictions and this author owns a copy of that original book, the image is reproduced here under creative commons Attribution Non-Commercial Share Alike 4.0 International (CC BY-NC-SA, 4.0), authorised here for open access, and non-profit making for academic purposes only.

She wanted to know explicitly: ‘Mr Holland speaks of the “small disfigurement” caused by a post-mortem examination. With all respect, I must ask him whether he has personally seen many bodies after the operation in question, or bodies not especially prepared for his inspection?’ Mrs Craigie also queried whether relatives could dispute the use of their loved ones’ remains for post-mortem and subsequent medical research, or whether medical science ignored their intimate feelings. She challenged the prevailing medico-legal viewpoint that post-mortem protected patients from future medical negligence and was always a positive experience that the bereaved had consented to. Surely, she queried, this was dependent on the number of material cuts to the body of a loved one:

Again: is it always made clear to every patient (or to his or her relative), on entering other hospitals, that, in the event of his or her death, the body may be subjected to the ‘small disfigurement’ in question?16

She was sceptical that a relative would be told of deaths caused by the ‘hospital’s own negligence’, or indeed from ‘carelessness, or ignorance or bad nursing’. The common situation was surely that hospital doctors would instead close ranks to protect their reputations. Thus, she enquired, if the bereaved objected to a post-mortem and further medical research, ‘in the event of a refusal’ are the ‘relatives reminded that they have received free treatment’? This question of financial obligation was to have remarkable longevity in Britain, and indeed often shapes media debates today about the need to open up patient data for research in the NHS (as we shall see throughout this book). Meantime, Mrs Craigie’s questions about the ethical basis of medico-legal research and its actual working practices were to prove remarkably forward-thinking. In many respects, a lack of informed consent – her central complaint – was not resolved until the Human Tissue Act (Eliz. 2 c. 30: 2004), as we saw in Chapters 1 and 2. And so, in 1906 her letters caused an outcry at the start of a century of controversy. To appreciate her impact in the media and how defensive medical science became at the time, we need to reflect briefly on her social origins and the reach of her social policy journalism in popular culture.

One of the reasons that Mrs Craigie’s Complaint (as it was styled in the national press) received such widespread publicity was that she was not only a successful novelist but also a well-known playwright and contemporary of Oscar Wilde on the London stage.17 Craigie was American by birth, born in Massachusetts, but brought up in London by wealthy Anglo-American parents. As the Listener magazine explained:

Her father, John Morgan Richards, was a successful businessman of Non-Conformist stock. At the time, there were only about a dozen American families living in London. Mr Richards became founder and chairman of the American Society in England. He introduced the sale of American cigarettes into this country and became a leading light in the brave new world of advertising. His interests were literary, as well as commercial, and at one time he was proprietor of the Academy Magazine and Carter’s Little Liver Pills. His pioneering spirit made him a large fortune and he realised a cherished dream by buying a castle on the Isle of Wight.18

Richards thus had the financial wherewithal to fund his eldest child’s expensive education. Pearl enrolled at Misses Godwin’s boarding school at Newbury in Berkshire (1876–1877) before entering a number of private day schools in London. By 1885, she had grown into a confident young teenager and spent a year in Paris, where she became an accomplished pianist. Mrs Craigie was renowned, however, for having made an ill-fated marriage aged 19 to Reginald (known as Robert) Walpole Craigie, a banker seven years her senior.19 On her honeymoon Pearl realised that she had made a serious mistake, as her husband proved to be an alcoholic and a philanderer. Her marital problems were, she told friends, akin to ‘being strangled by a boa constrictor’. Nevertheless, she did her marital duty by giving birth to a son, John Churchill Craigie, in 1890. Soon, though, a legal separation and divorce followed in August 1895. In between, to avoid her husband’s excessive drinking and womanising, Pearl enrolled as a student of classics and philosophy at University College London. She also started to do some serious creative writing and developed intimate friendships with gentlemen in her social circle. In part, these inspired Henry James’s famous novel, The Wings of the Dove (1902). Consequently, according to commentators in the media, Pearl espoused the ‘new woman’ of the 1890s. For she was determined to speak her mind, earn an independent living and thus break free from the marital restraints of her bitterly unhappy home life. To become financially independent, and to secure sole custody of her only child in the divorce court, she published a novel, Some Emotions and a Moral, in 1891. The storyline concerned the trials of infidelity and a bad marriage.

It became an instant best-seller. Pearl was delighted when it sold ‘80,000 copies’ in the first year. The publicity surrounding her publishing success and the notoriety of her divorce case reflected her wide social circle of not just political but bohemian friends too. Many were up-and-coming artists, poets and dramatists of the fin-de-siècle. They included the first contributors to the famous Yellow Book, a magazine devoted to the decadent arts, featuring Oscar Wilde, George Tyrell, Aubrey Beardsley and George Moore. She likewise was befriended by the elderly William Gladstone (former Prime Minister) and a young Winston Churchill. Yet, her closest friendships were from amongst a wave of wealthy young American women who migrated to England during the annual social season. Many went on to marry into the top ranks of the British aristocracy. Most bought their title but soon found the marriage bargain to be disillusioning. One such was Consuelo Vanderbilt, who resented, but had to comply with, an arranged marriage to the 9th Duke of Marlborough in exchange for her dowry of $2.5 million. By 1906 (the date of Pearl Craigie’s letter to the Daily Mail) Consuelo too had separated, and she was to divorce in 1921. In many respects, then, Pearl espoused a new form of female liberation, and it was on this basis that medico-legal figures of the Gilded Age on both sides of the Atlantic derided Mrs Craigie’s Complaint.20

In all the articles and letters written to counter Mrs Craigie’s Complaint by those associated with the London Hospital and the medical research culture of the time in England, three things stand out. First, the responses all had an aggressive, affronted tone. To paraphrase their male sentiments, most said: Who is this woman with the effrontery to question what we as a medical profession do with the dead body? Second, they all sought to reassure the public that the dead were treated with the utmost respect. Again, a précis in the media often ran something like this: Why does this over-sensitive female writer, who is divorced and has converted to the Roman Catholic Church to assuage her guilt, think she has the right to interfere in our work of national importance? Third, all responders to her letters stated categorically that only the poorest were dissected, and that a post-mortem for the rich and middle classes did not in any respect resemble what happened to the ‘unclaimed’ from amongst the lower classes who could not afford a funeral. The line of argument stressed was ‘that there was never a time when the hospitals of this country were so much endeared to all classes of the community’.21 Yet, this trinity of stock responses was disingenuous, and thus Mrs Craigie kept pressing for better public accountability.

Not one single medical correspondent was prepared to elaborate on the reasonable questions Mrs Craigie posed in print. Nobody defined what the material differences were between an autopsy, post-mortem and dissection. One angrily said: ‘I think Mrs Craigie should have taken the trouble to understand the differences between dissection and post-mortem’ before going into print.22 Of course, this only made readers of the Daily Mail more suspicious as to why the medical profession was not prepared to do so in the first place. Even a family acquaintance, Edwin Howard MRCS, did not explain explicitly that dissection meant dismemberment in his letters to the editors of several national newspapers in which he defended his profession. Nor did he concede how little, materially, was left at the end to bury. For, as this author has shown elsewhere, at best only about one third of the body remained at the end of an average dissection during the Edwardian era.23 In other words, what Mrs Craigie had done was to ask some inconvenient questions.

The timing of Mrs Craigie’s letter was particularly unwelcome for the London Hospital. Mr Sydney Holland, to whom her letters were addressed, was the 2nd Viscount Knutsford, a barrister and hereditary peer, who chaired the London Hospital House Committee from 1896 to 1931. He had just completed a major fund-raising drive, and would clearly have been embarrassed socially by the allegations of medical impropriety.24 The press dubbed Holland The Prince of Beggars for the sheer number of financial activities he had personally undertaken to raise money to rebuild the rundown infrastructure of the London Hospital.25 By 1906, he had generated enough capital donations to rebuild the premises in their entirety, and this gave the hospital doctors a new opportunity to increase their involvement in medical research. It was likely therefore that in the future they would want to acquire more, not fewer, bodies to dissect. In private, Holland conceded that the hospital focussed on ‘B.I.D.’ patients – ‘Brought-In-Dead’ – the initials doctors used in their medical case-notes to indicate that a body might be suitable for medical research after post-mortem.26 The irony was not lost on those like Mrs Craigie that they would be ‘bid for’ in an expanding supply system that was becoming very competitive. In Part II we will be examining how these networks of actors acquired human material, their common activities, habits and procedures, building on and extending in new directions the conceptual approach of Bruno Latour, Michel Callon and John Law in actor-network theory, outlined in Chapter 1.27 For whilst historians and sociologists have considered in general terms how actor networks were fashioned by the science and technology of the twentieth century, there is a much less refined sense of how and for what purpose anatomists, coroners and pathologists generated and regenerated complex chains of human material to sustain new research cultures. In this book, we will be describing this actor network by mapping it out. From 1945 to 2000, the human material it acquired created notable research agendas, attracting external funding, building professional status and making careers. This had performative elements that were intended and unintended, orthodox and unorthodox, seen and unseen. In other words, we are going to take our research lead from Mrs Craigie and her searching enquiries about ‘B.I.D.’ Her opponent Holland, meanwhile, was also a keen advocate of vivisection, believing that animal research was justified for the public good. So much so, that in 1908 he became the president of the Research Defence Society, a position he held until 1931.28 He was therefore a committed and vocal exponent of human and animal research: passions that set in context Mrs Craigie’s Complaint and the press coverage it generated.

What Sydney Holland chiefly objected to was the accusation, made by Mrs Craigie in a letter to the Daily Mail of 28 April 1906, that ‘it is known that the hospitals are not under any Government inspection’. This was despite the Anatomy Act (2 & 3 Will. 4 c. 75: 1832) setting up an Anatomy Inspectorate to oversee dissection and its supply lines from infirmaries, large teaching hospitals and workhouse premises.29 As Pearl pointed out, ‘Some are well managed; some are less well managed.’ The fact that inspection was seriously underfunded meant it lacked rigour. She then used emotive language to describe bodies handed back after post-mortem: ‘I leave your readers to imagine the feelings of parents and others on receiving the bodies of their dead brutally disfigured and coarsely sewn up as though they were carcasses from Smithfield’ livestock market. There is no doubt that this was a controversial way to question contemporary medical ethics, and many thought that she should have used more measured language. Today, she would (ironically) be criticised by some historians of science and medicine for her ‘neo-liberal’ values in a pre-liberal era, whereas she maintained that what she espoused was a ‘basic humanism’.30 Pearl Craigie was a plain-speaking American who liked to take risks, and she thought that people of education in the public sphere of the arts should be radical. Thomas Hardy, the novelist, praised this character trait in her, often quoting her definition of the role of an artist in society: a person, she said, ‘who thinks more than there is to think, feels more than there is to feel, and sees more than there is to see’.31 Even so, she had only a partial picture of reality, as subsequent letters to the press revealed.

Most dead patients underwent a post-mortem, but it was not the whole body that was taken for further research: rather, body parts, organs, tissues and cells could be, and often were, removed, supposedly to establish a cause of death, as we have already seen in earlier chapters. Coroners and the medical men they employed to do post-mortem work had a lot of discretion to remove human material as they saw fit. Mrs Craigie could not have known this in 1906, but she had hinted at a trade shrouded in secrecy. There were in fact many unseen aspects to the business of anatomy and its supply lines.32 For instance, an amputated leg or arm, sold on after operative surgery, often entered the chain of anatomical supply in London. The poorest, used extensively for teaching and research purposes, were divided up before burial. Bodies were broken up for sale because a body in parts was more profitable than a body whole. Generally, the anatomist on duty did their best to make sure the body contained enough human material sewn up inside the skin for burial. The dead body thus weighed enough to meet grieving relatives’ expectations at the graveside (a theme we return to below). Meanwhile, the reference to Smithfield market in Mrs Craigie’s Complaint was ironic, because across the road from the famous meat market stood St Bartholomew’s Hospital, which always competed with the London Hospital to buy the dead and destitute of the East End for medical research and teaching purposes (see Chapter 4 for the modern period). In other words, the comments by Mrs Craigie were ill informed on the essential details, but they did hint that larger ethical problems existed. Predictably, perhaps, Sydney Holland picked on the inaccuracy of the finer details. He chose to ignore the bigger ethical dilemmas that the medical profession faced: there was a trade in the dead, it was active in 1906, and it would continue to be so at least up to the 1960s, and often until very recently, in most medical schools in Britain.

Sydney Holland admitted to the Daily Mail that the London Hospital undertook some ‘1,100 post-mortems every year’.33 He did not, though, reveal how many actual full-scale dissections this involved. Instead, he stressed that in the case of post-mortems generated on the hospital wards, when he received a complaint from a relative about medico-legal impropriety, he always investigated it personally. Holland appreciated that ‘the horror of post-mortem being made on anyone one loves is shared by the poor as well as the rich’ but reiterated that only a ‘small disfigurement’ occurred, disguised by being covered over when relatives came to view the body. This was misleading: the poorest, cut ‘on the extremities and to the extremities’, could not accurately be described as having a ‘small disfigurement’.34 Class played a central role in cutting a little, or a lot. Holland, by concentrating on what happened at a post-mortem before a body went for dissection, was being deliberately evasive. Instead, he countered that Mrs Craigie was not in a position to verify her statements, and that in his opinion ‘she has permitted her tender feelings, stimulated perhaps by a complaint she has not tested, to tempt her to publish one more work of fiction, which, unlike her others, will give pain to many, and pleasure to none’. In a follow-up letter, he did reveal when pressed that there had been some ‘one hundred and ten thousand’ post-mortems in the ‘last ten years’ but stressed ‘we have had only three complaints’.35 He also emphasised that ‘very special and loving care is shown to the dead in the London Hospital’. There was a mortuary chapel, built from the bequest of William Evans Gordon, a major benefactor. Yet, this still did not elaborate on the fate of those sent for a full-scale dissection and dismemberment. Instead, Mrs Craigie faced accusations of being an interfering female of a sensitive disposition, given to storytelling, who was not in command of the material facts. It was difficult to see how she could be, when the dead-end of life seldom featured in public. Searching questions often created this sort of medical backlash, and it could be biting, in order to conceal the many missed body disputes of the sort analysed in later chapters.

There was to be one final twist in this storyline about disputing the dead-end of those used for medical research. Pearl Craigie died within just three months of penning her robust exchanges with Sydney Holland in the Daily Mail. On 13 August 1906 she was staying at her father’s house in London, excited about a touring holiday she was about to embark on to Scotland. Retiring to bed, she said she felt tired, but ill-health was not suspected. In the morning, a maid tried to rouse her in her bedroom, but to no avail. She had died of a heart attack in the night. Her shocked parents and her 16-year-old son were grieved to discover that, as her sudden death was unexplained, she would have to undergo an autopsy followed by a post-mortem. At a Coronial Inquest conducted in Paddington by Dr George Danford Thomas, the GP called to the death-bed scene (Dr Leslie Meredith) recalled that he ‘found Mrs Craigie lying on her back in bed, dead’.36 He thought that she had expired ‘painlessly’ and been dead ‘three or four hours, probably more’ sometime the previous evening. His post-mortem examination concluded with an informative summary: ‘One division of the heart was dilated and the muscle was thin and degenerated. Death was due to cardiac failure, and entirely due to natural causes.’ The jury heard the medical circumstances in full:

Coroner: Her death might have occurred anywhere suddenly?

Dr Leslie: Oh yes.

Coroner: She must have fallen right back on her bed, dead?

Dr Leslie: Yes.

Coroner: And that would be a painless death?

Dr Leslie: Yes, quite …

Coroner: The case seemed a perfectly simple one. The deceased had probably been exerting herself. She was an active woman, and the heart not being able to stand the strain had given way, causing her death, which was quite painless. The deceased was a married lady. The marriage had been an unhappy one, and she took proceedings and obtained a divorce.37

Despite her having been divorced for eleven years, this legal status, her gender and her financial plight determined the courtroom’s attitude to Pearl Craigie’s unexpected death. The Inquest Jury was very concerned to make sure she had not committed suicide in despair at her failed marriage, or died from the exertion of having to work to earn a living. The fact that she would have strongly objected to a post-mortem of any description never featured in court. Yet until cause of death was confirmed, Craigie and her body did not belong in mainstream society. The need to establish why she died required that her family engage with a medico-legal process she had opposed determinedly and in recent memory. They understandably wanted to bury her but had to wait until the body was returned to them by the Coronial Court, and without her heart (a recurrent theme in such cases to which we return in Chapter 5). And when it was given back, at the reading of the will they discovered that Pearl wanted a cremation, which created yet more controversy. She had converted to Roman Catholicism in 1892 and the parish priest felt strongly that a burial would be more appropriate under the circumstances. Cremation was still a contentious and novel request in 1906. A requiem mass was thus held at Farm Street in Mayfair, and Pearl Mary Teresa Craigie was buried at St Mary’s Cemetery, Kensal Green in London. Despite her best efforts to prevent it, her cut-open heart, major dissected organs and tissue samples did not join her cadaver when it was sewn back up for interment in the ground, superseding in death all the things she had objected to in life. Nor did the press disclose the retention of her tissue for long-term heart research. Yet, as we shall see, heart failure and research to prevent it generated some of the commonest entries in the dissection registers of leading medical schools like St Bartholomew’s in London (see Chapter 4). It was incontrovertible that a 38-year-old woman in the prime of her life would have been a valuable research commodity, and that, if her whole body was not retained for further research, it was because class had protected her from a fate the poorest could seldom hope to avoid. In many respects, then, Mrs Craigie’s Complaint personified a dead-end that medical science denied, and one in which the Ministry of Offal did have a basis in reality. The material reality of what went on behind the closed door of this ministry – in effect the substance of the answer that Mrs Craigie was searching for when she penned her first letter to the press – can be garnered from another, later, representative set of life stories.

KEEP OUT – Private!

On the eve of WWII, Richard Harrison, aged 17, was a grammar school boy living in London, where he was a diligent student.38 Studying hard was essential if he was to realise his ambition of becoming a qualified doctor. He needed to obtain his Higher School Certificate in the sciences because entrance to a good medical school was very competitive. Like most young people of his wartime generation, Richard wanted to get ahead in his career plans. It was likely that he might have to enlist in the armed forces as war threatened across Europe. As a prospective medical student, he was eager to win a place at a prestigious London teaching hospital. He hoped to train somewhere with an excellent reputation. Before the National Health Service (hereafter NHS) in 1948, junior doctors needed a good reference from their medical school to be able to buy into a solvent general practice and start earning back the cost of their expensive, privately funded education.

Richard’s father encouraged his son to engage with the recruitment brochures of medical schools that he sent for in the post. Together they made a decision to apply to St Bartholomew’s Hospital, central London, for three key reasons: first, it was where his mother had been treated successfully for laryngeal carcinoma; second, the medical staff had treated her with a courtesy and professionalism which augured well; and third, the hospital was within travelling distance of the family home in Mill Hill, north-west London. Richard could commute daily, live at home to save costs, and do extra work in the holidays to earn his keep. As there was no tradition of a career in medicine in the Harrison family, Richard was nervous about his chances of securing a place at medical school. Yet, he impressed the interview committee by telling them that he had never forgotten his childhood inspiration, the medical memoir The Elephant Man and Other Reminiscences written by Sir Frederick Treves, which he had read aged 13. It was, he believed, ‘the best volume of surgical memoirs ever published’.39 This was a curious coincidence because Mr Sydney Holland, 2nd Viscount Knutsford, had been responsible for the dissected body of the ‘Elephant Man’ in the collection of the London Hospital. Without knowing it, Richard Harrison had a strong connection to a hidden history of medical research that Mrs Craigie’s Complaint had hinted at some thirty-three years before he became a new medical student. For now, Richard was convinced that by training at St Bartholomew’s he would be at the centre of an exciting medical world.

Richard obtained a training place in the Indian summer of 1939. He remembered: ‘the huge poster covering the wall of the building nearest to the Old Bailey which proclaimed Barts was the Mother Hospital of the Empire. It convinced me that I had made a sensible choice [sic]’.40 Soon, however, the German Blitz on London would affect the training of all medical students. The Daily Mail announced on 29 September 1939 that some ‘6,000 medical students’ were about to ‘study amongst the sandbags’.41 Central government then asked Oxford and Cambridge universities to prepare for a threefold increase in evacuated students from the capital. New medical students, like Richard Harrison, arrived at either Queens’ College, Cambridge from St Bartholomew’s Hospital or St Catharine’s College, Cambridge from the London Hospital Medical School, sent there for the duration of the war. On his arrival, Richard found that ‘Cambridge in wartime was a sombre, not very sociable, place. Barts was at the university, but not truly of it [sic]’. He needed to find a way to make his mark, and he did so in the dissection room. The sign on the door read KEEP OUT – Private! Even so, Richard gained permission to enter this exclusive and privileged medical space. In doing so, he provides us with insights into the material substance of Mrs Craigie’s Complaint and the medical profession’s appellation The Ministry of Offal.

Like most medical students, Richard reflected that he was nervous about dissecting his first corpse:

We were required to dissect, and in considerable detail, the whole of the body. From time to time I had wondered, in desultory fashion, whether that might prove an emotional, even a fearful experience.42

He soon discovered that ‘I need not have worried’. For ‘our subjects were unclaimed corpses from the workhouse which had been steeped in preservation for so many weeks before reaching us that they would have been quite unrecognisable to anyone who might have known them in life’. Later he recalled what the bodies preserved with formaldehyde looked like: ‘They were, indeed, so shrunk and wizened, with such tough and leathery skins, as not to be instantly identifiable as human at all.’43 A relieved Richard explained that this inhuman appearance helped him to develop a clinical mentality of medical research in the dissection room: ‘As we teased them apart we gave little thought to the existence each had led.’ The priority was to compare each corpse according to Cunningham’s Manual of Practical Anatomy, the set textbook. Yet, Richard was troubled too: ‘I suppose we had become conditioned to the fact that we would have to dissect a human body.’ It may have been mundane and routine after a while, but from time to time he was reminded that others might dispute his dispassionate demeanour. One incident he called to mind:

Visitors to the dissecting room were not encouraged, but one weekend, when it was deserted, I took my father. He was not a squeamish man, and had seen much service on the Western Front but I heard not long after that, for 24 hours, he felt unwell and could eat nothing.44

Richard was close to his father and it disturbed him that a man familiar with the horrors of trench warfare in WWI could still react in the way he did to death, and its dead-end.

The main reason that medical students like Richard developed a detached attitude was, of course, that the corpse they dissected was not a complete body shell for long. It soon became a fragmented human being in the dissection room. Seldom did medical students and those training them in anatomy discuss the material reality of dismemberment, and so Richard’s recollections are strikingly honest:

Though we each dissected the whole body, it was not a single particular body. Six teams, each of three students, were assigned to every cadaver – one team to each limb, and two others to the torso and the head. This caused arguments at the start of each term, since those working on the arm began by approaching the shoulder from behind, whilst the ‘leg’ men commenced on the front of the hip. So a notice was hung from the subject’s toes during the first fortnight, saying: ‘Body will be turned at 2pm’.45

Here we can trace the development of a medical discourse in anatomical action. The person on the dissection table without a name was a ‘corpse’ – then a ‘cadaver’ – the ‘subject’ – a ‘body’ to be ‘turned over’ – facedown. As Richard conceded, ‘Gradual disintegration thereafter resolved the problem’ of how to divide up the dead on a daily basis. There was also a further practical problem to overcome – one generally offensive to public sensibilities. Richard elaborated:

Each corpse was weighed when it came into the department. It had to weigh, when eventually buried in consecrated ground, about the same as it had done originally. So, at the end of each day, Arthur, the attendant, transferred the fragments, from each cadaver back to its specific coffin. At least he did in theory. In practice, he moved down the long, brightly lit, and spotlessly clean room, sweeping the pieces of tissue from each glass-topped table into one bucket. He divided its contents between all the coffins, tipping into each as much as he calculated would satisfy HM Inspectors [of Anatomy]. If that seems like an arbitrary or irreverent procedure I always understood Arthur had arranged when the time came, he too would be dissected.46

In many respects, this first-hand testimony is not only representative of what happened inside many medical schools in Britain; it also provides confirmation of Mrs Craigie’s Complaint.

To use Richard Harrison’s precise phrase, anatomists buried ‘fragments’ of corpses in pieces that were ‘calculated’ to be concealed. The macabre may have made medical history, but it remained in the scientific shadowlands. There was no public engagement effort, and communication was clumsy. Seldom did a newspaper feature an article that led with: We did this with your dead-end to push past the deadline of life. Nor was that status quo debated or reformed as cultural tastes changed – effectively it did not exist in the public domain. Richard Harrison made clear that in his medical training he was taught ‘punctilious history taking’ at the bedside, but never at the dissection table, for the obvious reason that his patient cohort was dead. Few thought to ask whether the dead should have a post-mortem passport, in which their material journey could be mapped and précised for relatives to connect them to the gift of donation and its medical legacy. The attitude was that it took too much time, effort and resource to design and maintain identity links, and without public pressure to do so, the practical option was to follow ‘proprietorial’ rather than ‘custodial’ medical ethics.47 This essentially remained the medical sciences’ default position, enshrined in law, until, that is, HTA2004. Thus, the profession kept disputed bodies and bodies in dispute with modern medical research behind the KEEP OUT – Private! sign. A similar representative life story takes us forward in time, to trace how this set of training attitudes endured from the 1950s into the 1970s.

‘Say Ah!’

One key question that historians examining these sorts of personal accounts always need to ask themselves is how reliable and representative this recollection is of what happened. Did it reflect what occurred elsewhere? The answer is often straightforward – many medical students experienced dissection as a dehumanising encounter and they were relieved to do so. Jonathan Miller, writing for Vogue magazine in 1968, for instance, recounted his training as a doctor in the 1950s, which was in many ways similar to the sort of human anatomy sessions experienced by Richard Harrison in the 1940s:

That anatomy course stands out for another reason, too. As with most students, it was my first encounter with the dead. On the first day of term we were assembled in a lecture theatre and told what to expect. Afterwards we all trooped down to the tiled vestibule outside the dissection rooms and dared each other to be first inside. I cannot remember now just what macabre fantasies I had before going in the first time, but I remember quite clearly the vapid sense of anti-climax when we finally pushed through the frosted glass doors and stood facing our subjects.48

Once inside he was surprised how mundane the furniture, equipment and room looked. Again, the dead were called ‘subjects’, a professional language that Miller adopted easily. He recalled, ‘In our ignorance we had expected some ghastly parody of our living selves’ but instead ‘what we saw bore so little relationship to life that it didn’t seem to have anything to do with death either’. This was the grey zone of the dead-end of life, in which paradoxically the deceased would help the living push past a deadline. Soon he echoed Harrison’s impressions, but here the scale was greater. Miller trained at University College Hospital London (hereafter UCHL). The anatomy department had a policy of obtaining bodies of the homeless found dead in the streets around the back of Euston, King’s Cross and St Pancras stations. These were in plentiful supply during the cold winters of the early 1950s:

The bodies were laid out on fifty or sixty glass-topped tables, arranged in rows right down the length of an enormous shed lit from the windows in the roof. Most of them had been aged paupers. The pickle had turned them grey and stiff, and they lay in odd unfinished postures, like those pumice corpses fixed in headlong flight from the hot ash at Pompeii. Even their organs were dry and leathery, blood vessels filled with red lead, and hearts choked with the ochre of brick dust. It was only much later, when we came to autopsies – dissection, that is, performed on the recent dead – that we finally experienced the ordeal of which we had been so mysteriously cheated.49

Miller then went on to describe what it was like to dissect a fresh cadaver. He soon came to appreciate the clinical importance of those aged paupers he encountered. Unbeknownst to him at that time, they were either destitute street deaths or passed on from old infirmaries and workhouse premises now run by the new NHS:

The body is opened from the chin to pubis and the organs are taken out and examined one by one and laid on a side table like a windfall of rotten exotic fruit. When it’s all been cleared, the carcass lies open to the sky with the ribs and spine showing like the hull of a wet canoe. It’s always a shock to see how much we hold inside us and the florid variety of it all. Heart, liver, spleen, bladder, lungs and guts, we know them all by name but we don’t feel them and know them directly as we do our limbs and torso. This bloody cargo of tripes [sic] is carried from day to day more or less without being felt.50

Unlike Harrison, Miller explains why this sort of clinical intimacy is essential for general practice. He elaborates on his belief that it may always be necessary for the dead with hidden histories to continue to inform the case histories of living patients, regardless of medicine’s technological prowess:

The doctor is not just a critical spectator, he is a participant … licensed by law to go right up close to the actors [patients] and poke the suffering innards. He can feel the physical vibrato of the patient’s pain and overhear the otherwise silent complaints of the injured heart. There is no job on earth that brings one into such close and such refined contact with the physical substance of human feeling.51

Every time a junior doctor asks a new patient to ‘Say Ah’ so as to be able to hear properly the heart and lungs functioning, it is, ironically, to holding the hearts of the dead that they owe their dexterity.

What is thought-provoking about this personal memoir is its candour and emotional engagement. At UCHL, remaining unfeeling about the autopsies of dead aged paupers was essential for a future doctor’s ability to feel for his patients (literally). Indeed, Miller concludes that before he dissected ‘it was almost as if one were deaf before going onto the wards’. For he says that taking his transferable skills from the dissection table to the bedside meant that: ‘The scales suddenly drop from one’s sense and for the first time one can hear the complex eloquence of the tissues.’ He observed often that: ‘The muffled gibberish of the cells and organs suddenly makes sense, becomes grammatical, and makes itself heard in verses and paragraphs of distress.’ Yet, he never knew the names of his aged paupers nor how they arrived at their autopsy. Even so, he was sensitive to his situation, more attuned perhaps than many others. For it is one of the greatest ironies of this type of medical education that students soon discover how the shapes of organs ‘like the kidneys also provide a perfect illustration of the age-old anatomical truth: the body is designed to protect itself, not to be easy to dissect’.52 Barriers have to be broken when going under the lancet, just as the doctor trained in human anatomy will later have to cut through the sensibilities of patients who might dispute her or his actions. Cutting-edge reach is paradoxically always about cutting into and up the deadline of life. That process can be strikingly personal, something that goes a little way to explaining why, past and present, some researchers suggest that too much knowledge about its unsavoury material side can be incompatible with the competing ‘public good’ of giving consent for the use of bodies in death. The final section of this chapter thus tries to show through personal experiences – notably those of other medical students in the 1970s and this author’s visits to current dissection spaces – just how complex the issues explored through the stories that underpin Part II of the book actually are.

‘Cut!’

How candid would you want your dissector to be? Would you ask in advance to know everything, a bit or not that much? The usual riposte to this unsettling question is: Well, why worry? After all, you will be dead! This is a material fact of life. All bodies are abandoned, you might reasonably reply. You cannot change decay. Yet, what about the question of dignity in death? Donors and their relatives need reassurance that loved ones are handled decently, because there has been a long history of disrespect for those dying in destitution. And since that hidden history is inextricably bound up with ongoing questions of public trust in the medical sciences, it is not something that can be simply argued away by holding that it does not matter for the dead because it is the living who celebrate, commemorate, cremate and bury. So what was it like to experience dissection in the more recent past? Here is how Michael Crichton describes his first encounter with a dissected body at medical school in the 1970s:

NOBODY moved. Everybody looked at one another. The instructor said that we would have to work quickly and steadily if we hoped to keep on schedule and finish the dissection in three months. Then, finally, we began to cut. The skin was cold, grey-yellow, slightly damp. I made my first cut with a scalpel. … I didn’t cut deeply enough the first time. I barely nicked the skin. ‘No, no,’ said my instructor. ‘Cut!’53

Crichton soon lost his appetite for this dead work. He was not supposed to find it difficult. It was a rite of passage – something all medical students did with dark humour. So why could he not simply grin and bear it like his fellow students? If laughter is the best medicine, he still found it difficult to see the funny side: ‘The second-year students regarded us with amusement, but we weren’t making many jokes in the early days.’ In fact, he observed that most trainees ‘were all struggling too hard to handle the feelings, to do it all’.54 A lack of life experience created emotional hurdles not covered by the instructions in dissection manuals.

Then the atmosphere in the dissection room intensified as each body was broken up. Dissection soon gave way to dismemberment and the realisation that: ‘There were certain jobs in the dissection [room] that nobody wanted to do.’ Soon, he explains, the medical students ‘portioned out these jobs, argued over them’. His recollection is that: ‘I managed to avoid each of these jobs’ until, that is, the demonstrator in anatomy said, ‘OK, Crichton, but then you have to section the head [sic].’ He kept thinking, do not panic – ‘The head was in the future. I’d worry about it when I got there. But the day finally came’:

They handed me the hacksaw. I realized I had made a terrible bargain. I was stuck with the most overt mutilation of all. … I had to go through with it, try to do it correctly. Somewhere inside me, there was a kind of click, a shutting off; a refusal to acknowledge, in ordinary human terms, what I was doing. After that click, I was all right. I cut well. Mine was the best section in the class. People came round to admire the job I had done.55

To test the integrity and reliability of memories like this, there are two options: either analyse yet more autobiographies published in the past twenty years or so for comparable accounts, or leap forward in time to find out in person exactly what dissection has been like since the 1980s. Several logistical issues are the deciding factor.

Medical students’ memories are a mixture of feelings, general recollections and post hoc rationalisations – in other words, bias needs balancing out. Entering a selection of dissection spaces today to check these recollections hence seems sensible, but it also presents its own contemporary challenges. There is the need for a strong stomach. Just because, for instance, this author has written extensively about the history of dissection does not mean that they would relish the thought of cutting up a body personally, any more than Richard Harrison, Jonathan Miller or Michael Crichton once did from the 1950s to the 1980s. Then there is the question of how to judge what is happening inside the dissection space when your perception is going to be coloured by the vast amount of academic reading that you have done on this subject for fifteen years. Seeing the present with fresh historical eyes will take a great deal of reflection and self-control. Indeed, as E. H. Carr always reminded his undergraduate students at Cambridge, find out about your historian and you will then understand the sort of history they write.56 Another thing to keep in mind is that medical schools have regulations about dignity standards, and you generally need an invitation to enter the dissection room. This is an admirable ethical requirement, but it can also compromise the degree of physical freedom visitors have once inside a dissection space. A uniform of a white laboratory coat is standard, talking loudly is discouraged and engaging with the reactions of students must be about participant observation. Nonetheless, on balance it is necessary to have a checking mechanism, because otherwise the unarticulated parts of this rite of passage – the feelings, sentiments and beliefs of those behind the closed doors of the Ministry of Offal – could be missed, or misconstrued. All good historians know that what is not said can be as important as what is – indeed, as Marianne Boruch, the dissection room poet, reminds us:

People say a lot of things.
And think three times that many.
Nothing like this place ever crossed my mind.57

Three features of the contemporary dissection spaces which this author visited in preparation for this book form an important addendum to the medical experiences we have already encountered in this chapter.58

The first is that they are seldom what you expect. Of course, they look clinical because they must be kept clean (refer to Illustration 3.2).59 The furniture and basic equipment are much the same as they have been for a hundred years or more, and the layout of the tables in rows feels familiar from old photographs (compare Illustration 3.3). But the air of anticipation, the sense that this room might be a bit smaller, lit slightly differently or run by individuals you have never met before, creates a first-time feeling on entering each new dissection venue. Indeed, the architectural variety and pragmatic past uses of these medico-legal spaces are surprising to the uninitiated. We can see this, by way of example, in archive images of St Bartholomew’s Hospital dissection room in London. It was once hung with military recruitment posters from the First World War, which were also used to cover the cadavers being dissected each night (Illustration 3.4). Later, teaching facilities were streamlined by building a separate new lecture theatre for the anatomy department to ensure clean sight lines: dissections were selected for special lectures and body parts placed on the lectern at the front of the room for students to observe (Illustration 3.5).60

Illustration 3.2 Publicity photograph of ‘Students Dissecting at the New Medical Centre’ ©University of Leicester – see, https://www2.le.ac.uk/departments/medicine/resources-for-staff/clinical-teaching/images/students-in-dissecting-room/view, accessed 10 January 2017, authorised for open access, and non-profit making, reproduced here under (CC BY-NC-SA, 4.0), for academic purposes only. Authorised by the University of Leicester where the author works.

Illustration 3.3 ©Wellcome Image, L0014980, ‘Photograph of Newcastle Dissection Room 1897’, by J. B. Walters, copyright cleared under creative commons Attribution Non-Commercial Share Alike 4.0, reproduced here under (CC BY-NC-SA, 4.0), authorised for open access, and non-profit making for academic purposes only.

Illustration 3.4 ©St Bartholomew’s Hospital Archives, Photographic Collection, ‘Dissection Room, 1915’, copyright cleared under creative commons Attribution Non-Commercial Share Alike 4.0, reproduced here under (CC BY-NC-SA, 4.0), authorised for open access, and non-profit making for academic purposes only.

Illustration 3.5 ©Wellcome Images, s3_L0018000_L0018253, ‘The New Operating Theatre at St Bartholomew’s Hospital around 1910’, looking recognisably modern with its stacked lecture theatre seats, Wellcome Trust Collection, digital download image reference, https://wellcomecollection.org/works/mtgyyb5w, reproduced under (CC BY-NC-SA, 4.0), authorised for open access, and non-profit making for academic purposes only.

Then, once inside modern premises, a second feature makes itself felt: the five senses recalibrate their normal running order. On entering, the room is a place for smelling and listening first, and then looking. Even a visual learner generally sniffs the air on entry, because the olfactory imprint of chemicals onto skin, clothes and hair is what most people worry about. Being led by the nose into the room is commonplace. Quickly, though, the head turns to the side, because to most visitors’ surprise there is the low hum of air-conditioning units. These reduce any lingering chemical smells and keep the atmosphere crisp and fresh on entry. The eyes soon start to adjust to the lights overhead too, before modifying their lenses from a portrait view (seeing the upright students and demonstrators in the foreground) to a landscape scan (glimpsing the actual corpses and dissection tables in the background). The brain processes information fast in these first few minutes to make the visitor feel safe and to circumvent the hyper-arousal mechanism of fight or flight that deals with fear in the body. An unnerving feeling can be triggered on entry: the sense that someone is standing just behind you. Some nervous visitors shudder, and then realise that there is no reason to be spooked: a member of staff is assigned to stand behind them for the first few seconds to make sure they do not faint. The third feature of this experience is that most people initially want to look across the room, not straight down at an actual corpse. It is the equivalent of a fear of heights, where you want to look out at a view but not down from a sharp precipice at what lies below. This gives the mind time to adjust to seeing a dead body with a human face. Generally, therefore, the new visitor is guided to an area of the room where the demonstrator in anatomy has pre-prepared a dissection of a limb, known as a prosection. First-time students are learning how to handle human material with dignity, and to touch the preserved tissues that will be the basis of their working life from now on. The ancient philosophy ‘healer, know thyself’ starts here. There is a human connectedness in this room, even for those with less interest in the anatomical sciences per se as a discipline-defining pursuit in their future careers.

One of the most common unforeseen experiences is the quality of human expression still preserved in body parts. Even an experienced visitor to these sorts of spaces can be drawn to the touching beauty of the shape of a hand: the fingers that look female or male, the expressive quality of digits in an open greeting, all placed on the table for inspection. It is not difficult to spot a former farmer whose hands toiled the soil for half a century – calluses, stodgy fingers, a big firm grasp – or the hairdresser who once chatted busily to her customers, with the telltale indentation of scissor marks on her forefinger. All of this echoes what the poet Marianne Boruch recounts in Cadaver, Speak, her book of dissection-room poems:

The hand in cadaver lab – the first fully human thing
we did. I thought. No hands alike, raging
small vessels run through them – you’d never
believe how many ribbons. The arm
kept springing up, no
not to volunteer. We tied it down with the ordinary rope
you’d get at the hardware store, and even then61

Wrists too are surprisingly evocative: the thinner they are, the more elegant the mental impression of the absent person. A ring mark on the paler skin of a third finger likewise signifies a love token, taken perhaps in consolation by the bereaved before body donation. Slowly the fragmentary clues start to build a picture of the dead. Painted nails are redolent of a wartime generation for whom make-up was part of a person’s glamour. Tattoos too ‘are a reminder that this is not just a body, but somebody’.62 It is striking how very few preserved hands point the finger; all the moral judgements have evaporated. These are open hands into which you can slip your own in greeting, and they can stimulate a student to respond in kind. Some stroke the hand and arm – intuitively (they often say later), impulsively (most tend to claim), calmly (say those whose interest in the science of dissection takes over quickly). There is a concentrated honesty in those present and it is a refreshing experience, because in the dead all pretence is stripped away.

Perhaps the most unanticipated aspect of visiting dissection rooms is the reaction of some of the staff on duty to the corpses and body parts. Those who work part-time to prepare the prosections generally do shift work in local NHS general hospitals. Some are skilled in emergency medicine or intensive care nursing, and so this space can be challenging: to them, it is a room full of failure. Every-body was a life that medical science could not save from death. The demonstrators in anatomy are dissecting their let-downs. Often, one of the most difficult emotional experiences involves unwrapping a cold-storage body and recognising a patient who died in the demonstrator’s own care; death can intrude uninvited into even the most impartial medic’s memory. There is thus a lot of subjectivity surrounding the research subjects, just as there is a lot of emotional anticipation in what will become an emotive scientific endeavour. One thing, though, is obvious from all the visits. Whether at Harvard Medical School (where Michael Crichton trained in the 1970s) or at a British medical school since then, most students find that they have ‘that click’ somewhere deep inside themselves. The switch can be flicked to shut off their emotions, or not; it really does depend on the person. Crichton discovered that he had a talent for dissection, but he still looked for his emotional exit strategy, eventually becoming a successful film-maker and novelist. Jonathan Miller also left medicine, becoming a renowned literary polymath and theatre director with a deep respect for his former general practice. Richard Harrison meanwhile worked tirelessly for patients with cancer and gynaecological problems until his retirement. He had few plaudits in the press, but it was, he thought on reflection, a life lived well. All three nevertheless depended on the hidden histories of the corpses in dissection rooms, secretly dreaded and silently taken for granted in their youth.

Janus-Like Hidden Histories of the Dead

In Paul Thompson’s seminal book about the value of oral history, The Voice of the Past (2000), he wrote that it is a combination of the written and spoken historical record that ‘can give back to the people who made and experienced history, through their own words, a central place’.63 Yet, in rediscovering the threshold points, research pathways and paperwork processes of the actors who created hidden histories of the dead inside modern medical research cultures, it is evident that much more archival record-linkage work is necessary to arrive at a revisionist perspective. Many closed conversations were never collected on paper or recorded. In the official evidence base, there were gaps, silences, incomplete and shredded files. In public, private conversations were evasive; even so, they were peopled with honesty, integrity and sincerity too. Professional standards of behaviour continued to exude both medical altruism and clinical mentalities. Equally, medical staff and their students were trained not to speak openly outside their rank and file, or to give only a partial account of what really happened behind the dissection room door, in the pathology laboratory or in the hospital morgue, because of wider cultural sensitivities about death, dying and the re-use of the dead in society. Part II thus sits at this complex cultural intersection, where so much was consigned for filing but did not necessarily get forgotten. Often it was pared down, but it could later be at least partially recalled, and thus, although considered lost forever, in fact endured in living memory to a remarkable degree. Chapters 4–6 nonetheless guard against the justifiable criticism of oral history that it could result in ‘the collection of trivia’ or ‘become little more than the study of myths’. For as Julianne Nyhan and Andrew Flinn alert us:

If oral history aimed to recover ‘the past as it was’, questions [from the 1970s] were asked as to whether the testimonies based upon retrospective memories of events (as opposed to documentary records produced contemporaneously and then authenticated and analysed through a professionally recognised method of ‘objective’ historical scholarship) could be relied on to be accurate. It was asked whether oral histories were not fatally compromised by the biases and uncertainties introduced by the interview process; and in the case of collective, community-focussed projects whether the selection of interviewees would introduce an unrepresentative or overly homogeneous data collection sample into the studies.64

Thus the new case material generated in this book reflects how the above historical debate has moved on, and recently so, with the advent of the digital humanities. Now historians of science and medicine test the validity of oral histories ‘by subjecting them to rigorous cross-checking with other sources, arguing for the general accuracy of memory and its suitability as a source of historical evidence, importing methodologies from sociology and the other social sciences’, particularly with regard to the representativeness of selected testimony.65 Historians today concur that every piece of historical evidence – whether written or spoken – is partial, and that through rigorous archival checking it is feasible to arrive at a new ‘critical consciousness’.66 To achieve this, finding and fusing new source material will, according to Alessandro Portelli, mean that we arrive at a new consensus in which: ‘The peculiarities of oral history are not just about what people did, but what they wanted to do, what they believed they were doing, and what they now think they did.’67 The Oral History of British Science (2009–2013), deposited at the British Library, is one example of this fascinating and necessary research journey. Admittedly, some scholars have criticised the OHBS as innovative yet inward-looking, seminal yet celebratory, significant yet insufficiently self-reflective. Concern has been expressed that some scientists are too quick to praise the past because of a club-culture mentality. Even so, new digital oral history collections like this mark a break with the more fragmented past on paper. Speaking up about the hidden past of the dead will always be about human paradoxes that sit today at the ‘intersection and interaction with society, culture and ideology’:68 and this is where this book’s novel contribution is located too.

Part II thus builds on Thompson’s view that ‘the richest possibilities for oral history lie within the development of a more socially conscious and democratic history’.69 It does not seek to explore that historical record out of context, applying ‘neo-liberal’ values to a time when thinking was very different in the immediate aftermath of WWII. Instead, it is framed by a Janus-like approach, looking back to better understand a hidden past, and forward to engage with the long-term lessons of its lived experiences. Because its focus is implicit, explicit and missed body disputes, at times there may be more emphasis in Chapters 4–6 on case histories where things went wrong with medical ethics and inside research cultures. This is balanced with a holistic sense that human beings can only learn from past mistakes once they know what those mistakes were, and so make future improvements. In other words, this is not a book about covering up, blame or pointing the finger. Instead, its central focus is joining in and renewing recent conversations about cultural change: from the proprietorial ethics of the past to a custodial ethics of the future, and from the ethics of conviction that framed the professionalisation of medical training to an ethics of responsibility in a global community of precision medicine. For at the dead-end of life, as we shall see, there were many different sorts of hidden histories of the dead, and these created body disputes whose stories did not have to be buried or cremated without acknowledgement. Its bio-commons had medical dimensions and ethical implications not just in our keeping, but in our making too. In modern Britain from 1945 to 2000, we return to it by looking forward to its past.
