
Ethical dilemmas in psychiatry: When teams disagree

Published online by Cambridge University Press:  02 January 2018


Summary

Although many ethical dilemmas in medicine are associated with highly unusual clinical situations, they are an almost daily challenge for mental health teams. We describe the ethical issues that arose in relation to a significant difference of opinion between team members about using nasogastric clozapine in the treatment of a severely ill patient. We discuss how conflicting emotions and perspectives within teams acquire ethical significance, and how negotiation and reflection are essential for good-quality ethical reasoning to take place.

Learning Objectives

• Understand the different effects and importance of reasoning and emotions in moral decision-making

• Use a clinical scenario involving a difficult and controversial procedure to explore the impact of social persuasion in moral decision-making

• Consider how heuristic biases can work against rational decision-making

Type
Articles
Copyright
Copyright © The Royal College of Psychiatrists 2017 

Ethical dilemmas are common in mental health settings. They often arise because mental disorder leaves many patients lacking the capacity to make decisions for themselves and dependent on clinicians, who are bound in ethics and law to attend to their best interests. However, this lack of autonomy also leaves patients vulnerable to clinicians, and there is a sad history of the abuse and exploitation of patients by mental healthcare professionals.

Ethical tensions also arise because mental healthcare is delivered by clinical teams comprising different types of people with different views, roles and responsibilities. We discuss here a case where strong emotions generated conflict regarding a novel clinical intervention. Although the field of moral psychology acknowledges the influences of emotions and interpersonal persuasion on decision-making and associated cognitions, the bioethics literature offers little practical comment.

We sought the patient's permission to publish this case and he gave express consent. We are grateful to him for his generosity. Identifying details have been changed.

Case vignette

Clinical background

Ben was a man in his 40s with treatment-resistant schizophrenia. For two decades his illness had been unresponsive to multiple depot and oral antipsychotic medications, including dosages well above the British National Formulary (BNF) licensed maxima (up to 1000%) and combinations with mood stabilisers. Ben refused clozapine and the necessary blood tests, so it had never been prescribed.

Detained under section 3 of the Mental Health Act 1983 and lacking capacity because of his floridly psychotic mental state, Ben was subject to involuntary treatment. Following psychotically driven assaults he moved up levels of secure care and after several years in seclusion was transferred to a high secure hospital. By the time of his arrival in high security Ben was very distressed and could not attend to his own most basic needs. He was dishevelled, went for long periods neither eating nor drinking and had lost a great deal of weight. He had reached a crisis point.

Clinical intervention

The team considered the various potential interventions and their associated risks and benefits (Table 1), and the treating consultant proposed the enforced nasogastric administration of clozapine, a practice used in the USA (Fisher 2003) but novel in the UK. However, the combination of the patient's extreme and at times childlike vulnerability, the anticipated further distress, the uncertainty of outcome and the ongoing risk of violence evoked powerful emotions in the professionals involved, and many had concerns about whether, and how best, to proceed.

TABLE 1 Risks and benefits of potential treatment strategies considered by the team

Consultation regarding the novel intervention and team concerns

In light of the difficulties in initiating a novel and unusual intervention, extensive discussions took place during the weekly multidisciplinary team meetings as well as with professionals and others inside and outside the hospital. These involved a medical peer review including the lead psychiatrist, key nursing, auxiliary and social care staff, the hospital's legal department and the team in charge of physical restraint. The treating consultant and social worker also consulted Ben's family. Both Ben's lawyer and an independent mental health advocate (IMHA) attended care team meetings and discussed the alternatives. For advice about the technical aspects of the procedure, an intensive care physician from the local general hospital and a psychiatrist in North America who had described its use in a published paper (Fisher 2003) were contacted.

As required by the Mental Health Act 1983, the team sought authorisation of the proposed treatment from a second opinion approved doctor (SOAD). The SOAD, who was an independent psychiatrist appointed by the Care Quality Commission (CQC), met with nursing staff and the team psychologist before agreeing to authorise the plan. The CQC was consulted as to whether the nasogastric route should be specified (see footnote a).

Although anxious about the proposal, everyone was concerned about Ben's desperate condition. Many alternative treatments, such as atypical antipsychotics, high-dose regimes and covert medication, were suggested; all either lacked likely efficacy as they had already been unsuccessful over the previous two decades, or themselves presented serious problems in relation to harms and ethical difficulties.

Concerned team members were reassured by the wide-ranging consultation, inclusion of the patient's family and legal team as well as support from both within the hospital and externally. Nurses discussed the issues among themselves formally during handovers and reflective practice and informally. The team were also united in wanting to help a desperately ill individual and they had benefited from multiple shared experiences over many years of successfully initiating clozapine with dramatic and sustained improvements.

Treatment initiation and clinical progress

Ben was offered oral clozapine every day for a month. In the final week, the treating consultant visited daily, offering the oral preparation and explaining in simple terms the need for treatment. He eventually explained that if Ben continued to refuse medication, with no clinical improvement, he would be held and a ‘tube’ would be used to administer it.

Ben continued to refuse and so involuntary treatment was initiated for a 5-day trial. He was restrained by nursing staff while a junior doctor inserted a nasogastric tube; a stomach aspirate was taken to confirm its position and the dose of clozapine was then given. Interventions lasted up to 15 minutes. The treating consultant was present for all but one restraint. This provided support for the team as a whole and for the junior medical staff in particular. It also allowed the team to debrief and decide whether or not to persist. That decision took into account the severity of Ben's physical and mental states, the distress and difficulty of the procedure for him and for team members, the lack of less restrictive options likely to work, and the risk of deterioration through dose re-titration if no clozapine was taken for 48 hours. The judgement could only be made effectively with first-hand knowledge of the process, and it was apparent that some staff needed the psychological support of knowing that the consultant was present.

In light of rapid improvement in Ben's mental state, with reduced distress and improved food and fluid intake but ongoing resistance to oral medication, the initial 5-day plan was extended at the suggestion of the nursing staff, and medical and physical healthcare staff made themselves available at the weekends to ensure that there was no break in clozapine titration.

Ben received nasogastric clozapine under restraint four times over 19 days. On the other days he reluctantly took oral clozapine. By day 20, Ben was eating and drinking regularly, taking clozapine without demur, and markedly less fearful and hostile. Seclusion ended the next day. His level of functioning markedly improved and other patients, recognising his vulnerability, befriended him. Seven months after the initiation of clozapine, and following some negotiation, he transferred to a low secure hospital. Shortly before he moved from high security he had his first escorted community leave in years, to a nearby beach and for fish and chips. When he left, he thanked the care team looking after him, giving both them and his treating consultant a thank-you card. He remained in telephone contact with the ward nursing staff for over a year, because he liked them.

Ben continues to be markedly improved, taking oral clozapine, and he has been transferred out of forensic services to a rehabilitation hospital. He is able to take unescorted leave and travels alone across his home city to visit a relative. His current team are looking to discharge him to a supported community setting. When asked about his experiences he was not distressed, nor did he appear to have meaningful recall of what had happened.

The ethical dilemmas – moral philosophy

Restraint and restriction to achieve a beneficent outcome

The primary ethical dilemma was whether the intended benefit of the proposed intervention justified or obliged the degree of physical intrusion, restriction and restraint.

Principle-based ethics

The standard medical ethical imperatives of beneficence (doing good), non-maleficence (avoiding harm), autonomy (respecting an individual's wishes) and justice (distributing resources fairly) provide a start but not a conclusion. The intention of enforced medication that necessitated intrusion and restraint was to do good: to ease the extreme distress that was restricting Ben's autonomy by improving his mental and physical health, and to deliver the care and rehabilitation that he deserved. However, this argument could not convince everyone; deontological or principle-based ethical frameworks are particularly hard taskmasters. Beneficence and its counterpart non-maleficence present particular challenges. Primum non nocere, first do no harm, is usually seen as more compelling than doing good. The proposed intervention appeared highly likely to cause some degree of harm in the form of fear and suffering during restraint. Further harms might have included worsening Ben's psychosis or causing physical harm by administering medication through a misplaced nasogastric tube. In contrast, as his treating psychiatrist freely acknowledged, there was no guarantee of success of any sort. Some health professionals would argue that, on this principle-based analysis alone, the intervention could never be justified. Further, although Ben's ability to express his autonomy was limited, he could at least scream, spit and fight: he clearly did not agree. A physically stronger individual, or one with greater mental capacity, might have been able to resist or refuse, so it could be argued that Ben's vulnerabilities were exploited (Perlin 2004). However, the limitations of a principle-based approach are that it potentially places little weight on the consequences of inaction and offers no process for weighing potential harms against potential benefits.

Consequentialism

As the name suggests, consequentialism judges the morality of a course of action on the outcome. A decision to accept the status quo would have a very poor, if not fatal, outcome, whether or not disguised by a ‘therapeutic fig leaf’ such as a high antipsychotic dose or polypharmacy regime. Team members might have felt safer in not acting, but a strict consequentialist approach treats omissions as morally significant actions, so harms from decisions not to act are as significant as harms from positive decisions to take action (which in this case entailed trying an intervention that was potentially risky).

Virtue-based ethics

The virtue-ethics framework requires that actions are driven by positive intents, such as benevolence and a desire for justice. Following the breakdowns in care in Mid Staffordshire NHS Foundation Trust (2010), the nursing strategy for England set out a virtues-based approach embedding the six 'Cs': care, compassion, competence, communication, courage and commitment (Department of Health 2012). An approach based on virtues focuses on the intentions of the team, which are to help the patient and reduce their suffering overall. Although the intervention would probably cause some minimal harm (and possibly some wrong), neither would be a primary intention. Such an action is an example of the principle of double effect (Foot 1967), whereby it is morally justifiable to carry out an action with an intended positive outcome even if there is a known but unintended harmful outcome. This approach is commonplace in medicine whenever a treatment is given that carries a risk of harm. The complexity here is that Ben lacked capacity and was therefore vulnerable to the actions of his carers, so their intentions were crucial.

Paternalism

‘Paternalism’ refers to situations in medicine when healthcare professionals override the choices and views of patients, in effect acting like a parent to a child (as the term suggests). Paternalism is generally seen as wrong conduct in medicine, although in situations where a person lacks capacity to make autonomous choices, it is only ‘weakly’ paternalistic for professionals to make choices on their behalf. There is case law and policy to ensure that, when professionals act in a paternalistic way towards a patient, they must do so only for the shortest of times and in the patient's best interests. There is no justification for strong paternalism, i.e. overriding the choices and views of patients with capacity, although English mental health law does legislate for this situation.

Ben's is a case of weak paternalism because he lacked capacity to make any choices about accepting or refusing treatment. English mental health law supports weak paternalism by professionals, as do the law on capacity and the best interests test (within the Mental Capacity Act 2005). What makes this case complex is that pursuing Ben's best mental health interests entailed a potential short-term risk to his best physical interests. Further, overriding his resistance felt disrespectful to what little autonomy Ben retained. Best interests analysis tends to be consequentialist in nature, and the team had made a good case that it was in Ben's best interests to try this intervention. However, it has been pointed out that there are interests other than medical ones, and preservation of dignity is one of them.

Legal mechanism and proxy decision-making

In England and Wales incapacitous or non-consenting patients detained under the Mental Health Act 1983 may be administered drug treatments for mental disorders for longer than 3 months only if a SOAD approves the plan, including the route of administration. In this case, the team also consulted Ben's family, advocate and solicitor. Consulting a wide range of parties, including people who knew the patient before they became unwell, can help ensure that the patient's best interests take precedence (rather than any ulterior motives of the clinicians) and can also help determine what the patient may have chosen had they been able.

Professional and team anxieties

Despite legal authorisation and ethical justification, anxieties remained among the care team, leading to conscious (and possibly unconscious) obstruction.

Prior to treatment

During initial discussions, concern was raised within the care team, as some members did not agree. Perhaps fuelled by fear of subsequent criticism were harm to occur, they expressed the view that it was degrading and inhumane to restrain a physically vulnerable man, even if it would help him. Some psychiatrists, who had accepted Ben's transfer to high security to establish clozapine, then expressed the view that the plan was 'just wrong', believing nasogastric administration to be an act of last resort. They suggested that it would be better (ethically and clinically) if he were left to deteriorate to the point of life-threatening dehydration, when a nasogastric tube could be used to give fluid and also, if he survived, clozapine.

Some nurses asked anonymously (via the care team meetings) to be excused from participating, and other medical professionals implored the team to consider the high-dose strategies that had already failed. Covert medication was suggested, but Ben already believed his food to be poisoned, left large amounts scattered in his room and ate less than daily. It was hoped he would improve and regain some sense of reality: how then would the improvement be explained to him? Although covert medication is used with patients with dementia or intellectual impairment, there are ethical concerns about the level of deceit involved, and professional guidance prohibits its use in the treatment of patients with schizophrenia (Welsh 2002; Royal College of Psychiatrists 2004). Some professionals in Ben's case seemed to be suggesting that they would be prepared to cause harm and distress only to save life, not to improve health, and they did not express the same concerns about deceiving a person without capacity to consent as they did about intrusive restraint. They also discounted the greater risk, were the nasogastric tube to be misplaced, of giving a large volume of fluid as opposed to a small volume of active drug.

During treatment

While the procedure was ongoing, senior professionals who were not actively involved continued to comment that the technique should never be used in any circumstances owing to the traumatising and aversive consequences it would have for Ben. This raised anxiety in the treating team about whether their analysis was correct and introduced a degree of resentment towards colleagues who were perceived as unsupportive of a difficult decision.

Active demonstrations of discontent persisted within small parts of the clinical team. The clozapine tablets which were to be offered to Ben for a month prior to any decision to restrain were placed in the wrong cupboard, the one enteral syringe for nasogastric clozapine (which was reusable and had been marked ‘do not throw away’) was mislaid and, on the only occasion the treating psychiatrist was absent, the junior doctor placing the tube overheard a team member say: ‘This is cruel’.

So where did the objections come from? How were they resolved? What have the team members learnt? – Moral psychology

How are ethical decisions made?

One might suppose that ethical decision-making is done by carefully collating all the available evidence and weighing up the likelihoods and consequences of the anticipated outcomes in an aware and reflective way, supported by a societal consensus regarding rights and values, and then reaching a decision, without emotional influences, that can be applied within a legal framework.

How likely is that, and what comes first anyway, the decision or the reasoning? A major theme of moral psychology is the investigation of what drives moral decision-making. Psychiatrists in particular should not be surprised by Haidt's social intuitionist theory: this posits that, rather than the series of careful cognitive steps suggested above, moral decisions are intuitive, with moral reasoning taking the form of post hoc rationalisations made to influence others, which in turn may be influenced or changed through discussion (Haidt 2001). Greene & Haidt (2002) then developed the dual process theory of moral reasoning, acknowledging that both emotional and reasoned judgements can take place.

Functional magnetic resonance imaging (fMRI) supports the hypothesised relationships between intuitive judgements and emotion, and between consequentialist approaches and reasoned thought, by demonstrating that principle-based decisions are made quickly and with activation of the amygdala, whereas consequentialist decision-making takes longer and involves frontal lobe activation (Greene 2001, 2009).

The emotion of disgust is particularly cited as one that may result in moral judgements and reasoning in ways that are irrational in terms of consequences. The 'yuk factor' (Haidt 1997) – in the case of Ben, the restraint of a weak and vulnerable man to place a nasogastric tube for treatment he resists – is likely to evoke various painful emotions, such as anger, fear and resentment, but from different perspectives and resulting in opposed conclusions. For example, from team members: 'He is weak and vulnerable. Restraint and a nasogastric tube is cruel/disgusting. We must do everything we can to protect him'. From the treating consultant: 'He will die. I am afraid I/we will be blamed. We must do everything we can to save him'. Alternatively, but perhaps less likely, more thought-out conclusions from principle-based and consequentialist positions might have been in opposition.

When decisions of any sort are being made, decision-making that seems to be based on purely cognitive appraisal may in fact be affected by the short-term experience of positive or negative emotions (Kahneman 1979). If decision makers are facing difficult, unpleasant, risky and uncertain choices, unconscious and automatic emotional reactions have a significant influence on their attitudes, appraisals and actions (Damasio 2005; Slovic 2006). This has a much wider application in psychiatric professional judgements, particularly about risk.

Resolving ethical conflicts

Ethical disagreement in healthcare teams is clearly not new, strange or undesirable. For some emotionally demanding types of work it may be inevitable (Kovacs 2010). However, the key issue to consider in terms of team functioning is the degree to which team members may lack awareness of these differences and their emotional responses to them. They may be unaware of the different value perspectives that they are taking and of the similarities and differences between these different perspectives. This can cause anxiety in people trained to make difficulties disappear, and this may be acted out in ways that affect the quality of team functioning.

In the case of Ben, the clinical team were made to feel like 'bad' people for doing something they thought was right, even after a lengthy process of ethical and legal debate supporting their plan. In his description of values-based practice in mental healthcare, Fulford (2008) argues that different ethical positions may reflect important differences in value perception and weighting, and that it is important for teams to explore this ethical 'dis-sensus', because there may be more than one ethically sustainable argument. It can be hard for healthcare professionals to accept that in some situations there is no single 'good' intervention that will save the patient and allow them to retain their sense of themselves as 'good' people.

Haidt (2001) proposes that the main purposes of moral positions are to influence others, maintain social relationships and preserve self-definitional attitudes. Moral positions can be changed through private reflection and also by the reasoned persuasion, emotions and intuitions of others. It is in the sharing of judgements, as well as reasons, that the emotions, intuitions and ideas of others can be shifted. In this case, the treating consultant shared his fears about the consequences of inaction – death through neglect – as well as the more formal approaches of weighing up and analysing. This may well have helped shift the intuitions, and hence conclusions, of opposing team members, which may have been dominated by fears for Ben's welfare or of criticism from others. Sharing thinking might also have helped team members take a consequentialist approach, looking at the outcomes of each of the unpleasant alternatives.

A particular difficulty of this case was that, although many other treatment avenues were suggested, each had been tried repeatedly and failed (apart from electroconvulsive therapy (ECT), which would have entailed greater physical health risks). But as well as the discussions at care team meetings, many others were involved, were seen to be involved, visited the ward and talked with staff about their judgements and reasoning. This allowed many opportunities for social persuasion (the influence of one person's judgement on another's intuition) as well as reasoned persuasion (the influence of one person's reasoning on another's intuition). This became quite clear when nurses who spoke of their fears to the SOAD were told: 'You should not be worried about doing this; you should be worried about not doing this'. This was a clear statement of a consequentialist position and it had a very powerful effect.

What seems key here is the degree to which professionals can become aware of their emotional biases: the anxiety, fear, competitive arousal and anger that may influence decision-making in difficult circumstances. Woodbridge & Fulford (2004) suggest that teams in ethical conflict hold group discussions to explore their different perspectives, similar to the reflective practice processes now regularly recommended for teams working in emotionally demanding clinical settings such as mental healthcare (Graham 2000; Yakeley 2014). It seems especially important in mental health settings, where professionals work closely with people whose mental disorders make them irrational, that those professionals pay close attention to their own capacity for 'irrationality'.

Irresolvable ethical disagreement

Although they did not affect the action in this case, some ethical disagreements are irresolvable. These are associated with moral conviction: a strong, absolute belief that something is right or wrong, moral or immoral (Skitka 2005). Typical examples are abortion and capital punishment. Such beliefs tend to be perceived by the holder as universal matters of fact and are difficult, if not impossible, to change through discussion. Interestingly, at least one vocal critic maintains the view that, although in retrospect the decision was correct in Ben's case, a similar course of action should never be taken again.

What was learnt?

In this case, the team's assessment was accurate, and therapeutic success was achieved more quickly and to a higher degree than expected. The anticipated good outcomes (based on clinical experience and evidence) did happen; the only 'bad' consequences were that Ben suffered some short-term discomfort and loss of dignity, and some staff were unhappy and anxious, also in the short term. However, nursing staff in particular shifted position radically, from initial fear and scepticism to actively suggesting a similar course of action with subsequent patients. The team worked together cohesively once Ben's condition began to improve. The ability to take a flexible approach was probably very important, although it caused some difficulties. A series of questions was raised in the discussions before the restraints began: 'How long will we carry on for?', 'When will we stop?', 'What if Ben makes no improvement?', 'How will we know if it is worth it?' These were good questions with no easy answers, other than: 'We will make a decision together, taking everything into account'. Had Ben shown little or no improvement, treatment would probably not have been pursued beyond a fortnight, but the alternatives would then have been very limited.

After an interval of about 2 years, further patients at Ashworth Hospital have received similar interventions, including from consultants who had initially been very fearful. At least one psychologist has remarked: 'Why didn't we do this earlier?' About half of the patients for whom the intervention was planned accepted oral medication, without recourse to restraint or nasogastric administration at all (some had been informed of the plan for nasogastric administration after oral clozapine had been refused; others were persuaded before the planned use of nasogastric clozapine was revealed). Subsequent team decision-making has not been associated with conflict of anything like the degree first experienced. Instead, there has been great uncertainty from external teams asked to assess patients who have markedly improved following some use of nasogastric clozapine and for whom transfer out of high secure hospital has been requested.

Ethics, law and intrusive interventions in mental healthcare

There is an extensive literature on the ethics of mental healthcare, focusing on the tension between respect for autonomy and a duty of beneficence (Adshead 2000, 2007). Severe mental illness typically deprives people of their capacity to make truly autonomous decisions, leaving professionals with a persistent tension between respecting, as far as possible, what patients say they want while they lack capacity, and maximising their welfare and restoring their capacity, at which stage they might well express another view. This tension is exaggerated, augmented and developed by the legal process invoked. Most countries have legislation that allows for some form of involuntary restraint or treatment of patients who have lost the capacity to make decisions for themselves as a result of their mental disorders.

Different jurisdictions have different approaches to this dilemma: the UK has mental health legislation that allows for involuntary treatment of a person with a professionally diagnosed mental disorder, whereas other jurisdictions allow forced treatment only in rare and limited circumstances, such as when capacity can be shown to be abolished by the disorder. In the USA, involuntary treatment can be given only after a legal hearing to establish lack of capacity. It is of interest that although US culture traditionally favours freedom from state interference and personal freedom of choice, nasogastric administration of clozapine has been expressly authorised by the courts (Mossman 2000, 2002).

Most discussions of ethical dilemmas in mental healthcare assume one professional trying to decide between two courses of action, one good and one not so good, where ‘good’ may have multiple meanings. These discussions assume something like a replicable, rational cognitive process. In reality, unconscious emotional influences may have a profound effect.

What our case raises are two aspects of ethical reasoning in mental healthcare that are not often discussed:

  1. group dynamics and how ethical decisions are made in teams and complex systems

  2. general heuristic biases against rational decision-making.

Group dynamics

Within teams, a range of negative behaviours can be observed that profoundly influence group dynamics: competition, pairing and scapegoating (Yank 1992). Unrecognised, these behaviours can create hierarchical autocratic atmospheres, splitting and rigidity, and isolation of solitary opinions. Conflicts within groups and teams have been reported to the UK's National Clinical Assessment Service as a common reason for calling into question the professional capacity of doctors (Donaldson 2014).

At present within the National Health Service (NHS), group consensus is increasingly valued in healthcare teams. However, clinical teams can struggle to manage the anxiety that goes with the management of complex cases, as staff are subject to steadily higher levels of scrutiny, especially following critical incidents or inquiries, in a working climate where blame and 'finger-pointing' at individuals are favoured (Mid Staffordshire NHS Foundation Trust Inquiry 2010). In clinical contexts where the risk is high and emotion is strong, anxiety about who will be held accountable if things go wrong can lead to hostile behaviours and interactions within and between teams. Staff may be reluctant to take risks or be innovative, even if this means that patient welfare suffers (as seems to have been the case in Mid Staffordshire). These pressures may be even more painful in mental healthcare, where the evidence base is less well established and there are many empirical uncertainties about how best to treat really complex conditions.

Unconscious heuristics in decision-making

Unconscious emotions can have a significant influence on high-risk and high-cost decisions. In the case of Ben, the patient was so disturbed and distressed that it was inevitable his care would evoke very strong feelings in all those involved, no matter how peripherally. The treatment proposed was novel, came with no guarantee of success, was inherently intrusive and would result in at least temporary distress. It is likely that the anxiety about doing harm made it difficult for staff to ‘see’ the complexity of the different therapeutic options.

Particularly in mental healthcare, in the absence of an acute crisis there is potentially a heuristic bias towards doing nothing rather than something, especially in relation to high-risk cases, i.e. professionals may not see inaction as a form of action that may itself cause harm (Spranca 1991). In the UK, nursing staff, and society in general, show very low approval of most restrictive practices such as mechanical restraint, although interestingly not of seclusion (Whittington 2009). Actions involving involuntary treatment and physical restraint, which do not fit with the traditional professional ideal of the 'good' carer, may therefore be met with ambivalence. But UK preferences are not ubiquitous in psychiatry as a whole. Across Europe, the frequency and type of coercive interventions, and legal restrictions on them, vary widely: in the UK (Raboch 2010) seclusion is particularly common, but mechanical restraint is rarely if ever used (Bowers 2007).

It is possible that there is still some lingering remnant of the ethos that allowed Mary Barnes to live for 5 years in the experimental therapeutic community at Kingsley Hall, London, smeared with her own excrement (Bjarnason 2001); what commentators have called 'rotting with their rights on' (Gutheil 1980). Unconscious disgust and fear can lead staff to avoid action in the name of respect for liberty and freedom, a freedom that people who lack capacity cannot enjoy or exercise.

Clozapine – unconscious heuristics?

There may also be unconscious anxiety and suspicion about the use of clozapine. Despite its positive evidence base, clozapine is not enforced in the way other medications in mental healthcare are, and it arguably remains underutilised (Mistry 2011; Stroup 2014). A UK survey reported that, where clozapine had been enforced, restraint to take blood was uncommon, and a significant minority of professionals viewed the use of clozapine under restraint as 'always wrong' (Pereira 1999a,b). One psychiatrist has argued that studies of involuntary clozapine should not have been published at all, suggesting (possibly tongue in cheek) that they were 'instructions in martial arts for psychiatrists' (Fahy 2000). Other commentators have offered technical objections, or recommended involuntary ECT, which appears to be at least as restrictive, invasive and hazardous (Barnes 1999), especially in patients like Ben, whose physical health was compromised by dehydration and starvation. In some professional minds, the chief objection to forced nasogastric administration of clozapine appears to be its association with the protests of the oppressed, evoking historical images of the treatment of political prisoners, suffragettes or detainees in Guantanamo Bay.

Conclusions

A key feature of the complexities in ethical decision-making is that emotions are likely to be stronger when there are uncertainties and fears of harm to the patient or of external criticism. When outcomes are uncertain but an action is sure to produce some harm, even with the potential for a far greater reward, this creates a classic scenario for risk aversion (Kahneman 1979). Most people are reluctant to accept a sure loss now for the chance of a far larger return some time later. But failure to take a difficult action can be just as morally questionable as taking an action that causes some degree of harm.
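As a rough illustration of the arithmetic behind this reluctance, consider the value function at the heart of prospect theory (Kahneman 1979). The sketch below is illustrative only: the power-function form and the parameter values (α ≈ 0.88, λ ≈ 2.25) are those commonly quoted in the later prospect-theory literature rather than figures taken from the 1979 paper, the outcome magnitudes are notional, and probability weighting is ignored for simplicity.

v(x) = x^α for gains (x ≥ 0) and v(x) = −λ(−x)^α for losses (x < 0), with α ≈ 0.88 and λ ≈ 2.25.

For a sure 'loss' of 10 now (the distress of restraint) set against an 80% chance of a 'gain' of 30 later (clinical recovery), the objective expectation favours acting (0.8 × 30 − 10 = 14), yet the subjective value is roughly 0.8 v(30) + v(−10) ≈ 0.8(19.9) − 2.25(7.6) ≈ −1.1. Because losses loom larger than gains, the action can still feel like a bad bet, consistent with the reluctance described above.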

Making time for reflection and articulation of anxieties and perspectives is a crucial part of the process of resolving such tensions. There is some limited evidence that, when reflective spaces are provided on a regular basis, this empowers teams to take difficult decisions without using immature defence mechanisms (Hinshelwood 2002). Negotiation within and between teams requires similar skills in terms of making time and listening to different perspectives. Conflict resolution training, and attention to team emotions when there are no 'good' decisions to be made, are vital in mental health settings where complex cases are the norm.

MCQs

Select the single best option for each question stem

  1 In England and Wales, where a patient cannot consent to treatment, the administration of clozapine via a nasogastric tube must be approved by:

    a both a second opinion approved doctor (SOAD) and the patient's family

    b both a SOAD and an independent mental health advocate (IMHA)

    c both a SOAD and a tribunal

    d both a SOAD and a judge

    e only a SOAD.

  2 The following are examples of the correct application of given ethical principles:

    a respecting autonomy: not bathing an extremely psychotic man who is covered in excrement (and so respecting his wishes)

    b respecting autonomy: not prohibiting access to pornography for a patient who has no history of sexual deviance

    c beneficence: preventing a patient with capacity from making an unwise financial decision

    d proportionality: waiting until a psychotic patient who refuses to eat becomes life-threateningly ill before treating them against their expressed wishes

    e consensus decision-making: avoiding making decisions when a team cannot agree.

  3 When considering the ethical implications of a decision:

    a emotional reactions are often important factors

    b professionals from different disciplines will always share common value perspectives

    c the success or failure of the action taken is more important than the intended aim

    d in a multi-disciplinary team setting, all members must agree

    e causing any harm must be avoided.

  4 If a team is faced with a situation in which a particular action carries an immediate chance of some harm, followed by a greater chance of a larger benefit:

    a the action should not be taken: primum non nocere

    b they are in a classic situation for risk aversion

    c the ethical argument of 'double effect' will not be sufficient to justify any harm that may arise

    d team members are likely to resolve the problem and reach a consensus easily

    e a psychiatrist making a decision to act should be referred to the General Medical Council for unethical practice.

  5 In team decision-making in hospital settings:

    a problems relating to conflicts within teams rarely call the professional practice of doctors into question

    b a process of ethical decision-making is carried out by weighing up all the information and likely consequences in an aware and reflective way, supported by a societal consensus regarding values, codified by a legal framework without emotional influences

    c team members may have opposed value perceptions, resulting in more than one ethically sustainable argument, a dis-sensus

    d individuals who object on principle are likely to respond to a logical argument setting out the benefits of the action proposed

    e members of the same profession are likely to follow similar decision-making patterns, regardless of the country in which the care is delivered.

MCQ answers

1 e 2 b 3 a 4 b 5 c

Footnotes

Declaration of Interest

E.S. has received speaker fees from Novartis and Janssen Pharmaceuticals (none since 2009).

a At the time, the advice from the CQC secretariat was that the nasogastric as opposed to the oral route needed to be specified. Subsequently, the principal SOAD has indicated that the oral and nasogastric routes are equivalent: both are enteral.

References

Adshead, G (2000) Care or custody? Ethical dilemmas in forensic psychiatry. Journal of Medical Ethics, 26: 302–4.
Adshead, G (2007) Coercion and capacity to consent. BMC Psychiatry, 7 (suppl 1): S30.
Barnes, TRE (1999) Commentary: The risks of enforcing clozapine therapy. The Psychiatrist, 23: 656–7.
Bjarnason, S, Edgar, D (2001) Mary Barnes: gifted artist who documented her struggle with mental illness in paint and print. The Guardian, 13 July (https://www.theguardian.com/news/2001/jul/13/guardianobituaries.books).
Bowers, L, van der Werf, B, Vokkolainen, A, et al (2007) International variation in containment measures for disturbed psychiatric inpatients: a comparative questionnaire survey. International Journal of Nursing Studies, 44: 357–64.
Damasio, AR (2005) Descartes' Error: Emotion, Reason, and the Human Brain. Penguin.
Department of Health (2012) Compassion in Practice: Nursing, Midwifery and Care Staff – Our Vision and Strategy. NHS Commissioning Board.
Donaldson, LJ, Panesar, SS, McAvoy, PA, et al (2014) Identification of poor performance in a national medical workforce over 11 years: an observational study. BMJ Quality & Safety, 23: 147–52.
Fahy, TJ (2000) Martial arts for psychiatrists. The Psychiatrist, 24: 197–8.
Fisher, WA (2003) Elements of successful restraint and seclusion reduction programs and their application in a large, urban, state psychiatric hospital. Journal of Psychiatric Practice, 9: 7–15.
Foot, P (1967) The problem of abortion and the doctrine of double effect. Oxford Review, 5: 5–15.
Fulford, KWM (2008) Values-based practice: a new partner to evidence-based practice and a first for psychiatry? Mens Sana Monographs, 6(1): 10–21.
Graham, IW (2000) Reflective practice and its role in mental health nurses' practice development: a year-long study. Journal of Psychiatric and Mental Health Nursing, 7: 109–17.
Greene, JD, Sommerville, RB, Nystrom, LE, et al (2001) An fMRI investigation of emotional engagement in moral judgment. Science, 293: 2105–8.
Greene, J, Haidt, J (2002) How (and where) does moral judgment work? Trends in Cognitive Sciences, 6: 517–23.
Greene, JD (2009) The cognitive neuroscience of moral judgment. In The Cognitive Neurosciences (4th edn) (ed Gazzaniga, M): 987–1001. MIT Press.
Gutheil, TG (1980) In search of true freedom: drug refusal, involuntary medication, and 'rotting with your rights on'. American Journal of Psychiatry, 137: 327–8.
Haidt, J, Rozin, P, McCauley, C, et al (1997) Body, psyche, and culture: the relationship between disgust and morality. Psychology and Developing Societies, 9: 107–31.
Haidt, J (2001) The emotional dog and its rational tail: a social intuitionist approach to moral judgment. Psychological Review, 108: 814–34.
Hinshelwood, RD, Skogstad, W (2002) Observing Organisations: Anxiety, Defence and Culture in Health Care. Routledge.
Kahneman, D, Tversky, A (1979) Prospect theory: an analysis of decision under risk. Econometrica, 47: 263–91.
Kovacs, M, Kovacs, E, Hegedus, K (2010) Is emotional dissonance more prevalent in oncology care? Emotion work, burnout and coping. Psycho-Oncology, 19: 855–62.
Mid Staffordshire NHS Foundation Trust Inquiry (2010) Independent Inquiry into Care Provided by Mid Staffordshire NHS Foundation Trust January 2005 to March 2009 (HC375). TSO (The Stationery Office).
Mistry, H, Osborn, D (2011) Underuse of clozapine in treatment-resistant schizophrenia. Advances in Psychiatric Treatment, 17: 250–5.
Miyamoto, S, Jarskog, LF, Fleischhacker, WW (2015) Schizophrenia: when clozapine fails. Current Opinion in Psychiatry, 28: 243–8.
Mossman, D, Lehrer, DS (2000) Conventional and atypical antipsychotics and the evolving standard of care. Psychiatric Services, 51: 1528–35.
Mossman, D (2002) Unbuckling the chemical straightjacket: the legal significance of recent advances in the pharmacological treatment of psychosis. San Diego Law Review, 39: 1033–164.
Pereira, S, Beer, D, Paton, C (1999a) Enforcing treatment with clozapine: survey of views and practice. The Psychiatrist, 23: 342–5.
Pereira, S, Beer, D, Paton, C (1999b) When all else fails: a locally devised structured decision process for enforcing clozapine therapy. The Psychiatrist, 23: 654–6.
Perlin, ML (2004) 'Salvation' or a 'lethal dose'? Attitudes and advocacy in right to refuse treatment cases. Journal of Forensic Psychology Practice, 4(4): 51–69.
Raboch, J, Kalisova, L, Nawka, A, et al (2010) Use of coercive measures during involuntary hospitalization: findings from ten European countries. Psychiatric Services, 61: 1012–7.
Royal College of Psychiatrists (2004) College statement on covert administration of medicines. Psychiatric Bulletin, 28: 385–6.
Royal College of Psychiatrists (2014) Consensus Statement on High-Dose Antipsychotic Medication (College Report CR190). Royal College of Psychiatrists.
Skitka, LJ, Bauman, CW, Sargis, EG (2005) Moral conviction: another contributor to attitude strength or something more? Journal of Personality and Social Psychology, 88: 895–917.
Slovic, P, Peters, E (2006) Risk perception and affect. Current Directions in Psychological Science, 15: 322–5.
Spranca, M, Minsk, E, Baron, J (1991) Omission and commission in judgment and choice. Journal of Experimental Social Psychology, 27: 76–105.
Stroup, TS, Gerhard, T, Crystal, S, et al (2014) Geographic and clinical variation in clozapine use in the United States. Psychiatric Services, 65: 186–92.
Treloar, A, Philpot, M, Beats, B (2001) Concealing medication in patients' food. Lancet, 357: 62–4.
Welsh, S, Deahl, MP (2002) Modern psychiatric ethics. Lancet, 359: 253–5.
Whittington, R, Bowers, L, Nolan, P, et al (2009) Approval ratings of inpatient coercive interventions in a national sample of mental health service users and staff in England. Psychiatric Services, 60: 792–8.
Woodbridge, K, Fulford, K (2004) Whose Values? A Workbook for Values-Based Practice in Mental Health Care. Sainsbury Centre for Mental Health.
Yakeley, J, Hale, R, Johnston, J, et al (2014) Psychiatry, subjectivity and emotion – deepening the medical model. Psychiatric Bulletin, 38: 97–101.
Yank, GR, Lindsay, RJ, Barber, JW, et al (1992) Ethical issues for academic participants in state–university collaboration programs. Hospital & Community Psychiatry, 43: 1213–7.
