For over 130 years the MDU has supported doctors when things go wrong in the course of treating patients, and we have seen first-hand what works well in promoting safer patient care and what does not.
In our contribution to World Patient Safety Day 2019 we looked at the fundamental importance of getting organisational culture right – moving the focus from blaming an individual to working out why that person, within the system, was not able to deliver the safest care possible.
To help illustrate the futility of a blame culture, the tragic and avoidable deaths associated with accidental intrathecal injection of vincristine are a useful starting point. For decades, doctors were criminalised for their part in fatal drug administration errors, but this did not help. What was ultimately needed was an understanding of the human and systemic factors that influenced their behaviour.
The death of a patient due to a drug error is a tragedy that should be preventable. It is devastating for the patient's family and for the staff involved, and both speak with one voice in saying that the error must not be allowed to happen again. But drug errors, even very serious ones, are difficult to eradicate, and the accidental intrathecal administration of vincristine is a case in point, and one where the MDU repeatedly argued for change.
Vincristine, an important drug in the treatment of Hodgkin's disease, has been known since the 1960s to be potentially fatal if administered intrathecally1. In 1978, Shepherd and colleagues2 at the Texas Children's Hospital published a case report of the accidental intrathecal administration of vincristine to a five-year-old child who subsequently died. But this was not a dry, academic description of what happened; it included recommended solutions, including the need to clearly label syringes and not to place intrathecal and intravenous drugs in the same treatment area.
Intrathecal vincristine deaths
We can only assume those observations went unheeded, because the cases kept occurring. MDU senior solicitor Ian Barker discusses the case of Dr Michael Prentice (whom he was involved in defending in 1993) in an earlier MDU journal article about defending doctors in criminal proceedings. The case was important in clarifying the law on gross negligence manslaughter and, as it happened, also arose from the accidental intrathecal administration of vincristine. The incident took place in 1990 and was not an isolated occurrence: in the 16 years to 2001, 11 patients died from accidental intrathecal vincristine injection. The NHS was not learning from these fatal errors, and prosecuting the medical practitioners involved was not the solution.
In England, two reports3 on intrathecal medication errors were published in 2001, followed by national guidance4 from the Department of Health the same year. Physical preventive measures (infusion device connectors) were subsequently introduced into the NHS in 2009 and 20115. These interventions made a difference: there have been no further deaths from the accidental administration of intrathecal vincristine since January 2001.
Many people will be surprised that it took decades to bring about such an important patient safety change. The delay is better understood in terms of organisational culture, which is explained below.
Just and learning culture
NHS Improvement's Patient Safety Strategy has rightly put a patient safety culture at the centre of its work on delivering safer systems. It recognises that 'just cultures in the NHS are too often thwarted by fear and blame' and sets out specific actions to support a patient safety culture.
The history of intrathecal vincristine deaths exemplifies the tension between a perceived need to find fault and apportion blame, and the need to learn from the tragedies that occurred. Returning to the case of Dr Prentice, a raft of systemic failings contributed to the catastrophic outcome, including inadequate clinical supervision, the absence of a data sheet for the cytotoxic drugs, and the involvement of very inexperienced junior doctors and student nurses in the process.
But the system is almost set up to fail; a patient's unexpected and preventable death may lead to a criminal investigation, a coronial investigation, disciplinary and regulatory procedures, civil litigation and media coverage. Sometimes these processes are necessary, but where the focus almost immediately narrows to an individual's failings, what is often lost is the opportunity to properly and rigorously examine the context and the wider systems failures that are often at the heart of such incidents.
It is no coincidence that airline pilot and academic Sidney Dekker, in writing the first edition of his book Just Culture, was motivated by the criminalisation of error in aviation and healthcare. In the third edition6, Dekker notes that there is no evidence that criminalising error is helpful or effective:
'There is no evidence, however, that the original purposes of a judicial system (such as prevention, retribution, or rehabilitation – not to mention getting a 'true' account of what happened or actually serving 'justice') are furthered by criminalising human error… Not only is the criminalisation of human error by justice systems a possible misuse of tax money – money that could be spent on better ways to improve safety – it can actually end up hurting the interests of the society that the justice system is supposed to serve… If you want people in a system to account for their mistakes in ways that can help the system learn and improve, then charging and convicting a practitioner is unlikely to do that.'
Establishing a just culture
If there is one thing everyone can agree on, it is that changing an organisation's culture is not easy. As Dekker says, just culture is not a 'particular programme or blueprint' that can be bought off the shelf or learned from reading a book about it. The good news is that some NHS-specific guidance is available to those who want to understand better what needs to be done to establish and embed a just culture in their organisations.
NHS Resolution's report Being Fair brings together theory and practical experience on how a just culture can be achieved. Key to its approach is a move away from retributive justice (punishing the individual who made the error) to restorative justice (rebuilding trust in the person and organisation who made the error).
The main points are summarised below.
NHS Resolution's expert group agreed three future aims:
- prioritise learning about how to minimise the conditions and behaviours that can underpin or lead to error rather than apportion individual blame
- build a consistent approach for all staff, no matter what profession or what background
- be determined to avoid, where possible, inappropriate exclusion and disciplinary action against staff unless there is evidence of wilful intent to cause harm.
The need for a proportionate response in respect of individuals when things go wrong is mirrored in NHS Improvement's Just culture guide, which describes a step-wise assessment of an individual's culpability in a patient safety incident. This includes:
- deliberateness of action (wilfulness)
- the health of the individual
- the foreseeability of the event (related in part to whether protocols or training had been given to prevent such situations)
- whether other similar professional groups would have acted in a similar way
- whether there are any mitigating circumstances.
The explicit commitment in the Patient Safety Strategy to promoting a just and learning culture in the NHS is welcome. We know that the move to a just and learning culture will not happen overnight. It will require knowledgeable and motivated clinical leaders and managers, and time to restore trust and confidence in processes designed to investigate clinical errors.
But perhaps most of all it will require an acknowledgement that the punitive systems of old have not served patients and their families well – but that a fresh approach, built on a just culture, will.
An edited version of this article first appeared on the MDU website as part of World Patient Safety Day 2019.