When you hear hoofbeats, do you think horses or zebras? How identifying and allowing for bias in your clinical practice can help anticipate medico-legal issues.

Cognitive bias arises when automatic thinking leads to a different conclusion from the one that would have been reached with more analytical and deliberate thinking. Most of our thinking as we go about our day-to-day lives occurs unconsciously and efficiently – indeed, it is this automatic processing that makes it so useful, particularly from an evolutionary perspective, as it enables us to respond quickly to situations.

Problems arise, however, with the complex situations of modern life. The practice of medicine is particularly susceptible to cognitive bias because of its complexity, its variety and its periodic hidden curve balls. Add in fatigue, time pressures and other distractions – all of which favour automatic thinking – and it is not surprising that biases commonly occur.

Heuristics are rules of thumb that we process quickly and automatically, and they can still be useful in modern life. For example, if you want to catch a fast-moving, high-looping ball, you don't need to solve complex differential equations, consciously or unconsciously. The heuristic that works is to adjust your running speed so that the angle of gaze between your eye and the ball remains constant, as sketched loosely below.
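
As a loose illustration (not from the article itself), that rule can be written as a simple feedback loop. Everything here – the function name, the gain and the numbers – is an assumption made for the sketch, since the right correction depends on the geometry of the catch:

```python
# A minimal sketch of the gaze heuristic as a feedback rule (hypothetical
# names and numbers; the gain and its sign are placeholders).

def adjust_speed(current_speed, gaze_angle, previous_gaze_angle, gain=0.5):
    """Proportional correction: change running speed just enough to cancel
    any drift in the gaze angle, rather than predicting the trajectory."""
    drift = gaze_angle - previous_gaze_angle
    return current_speed + gain * drift

speed = 5.0  # metres per second (illustrative)
for gaze_now, gaze_before in [(0.62, 0.60), (0.61, 0.62)]:
    speed = adjust_speed(speed, gaze_now, gaze_before)
    print(round(speed, 3))  # 5.01, then 5.005 – small moment-by-moment corrections
```

The point of the sketch is that nothing about the ball's trajectory is ever computed; a single quantity is monitored and nudged back towards constancy.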

But as well as being useful, the speed of this process can also cause harm, for example by leading us to stereotype and to make inaccurate judgements.

Some of the heuristics used in medicine have been immortalised through the ages:

  • If it looks like a duck, sounds like a duck, and walks like a duck, it's a duck.
  • Occam's razor: search for the single diagnosis that can explain everything.
  • Common conditions occur commonly: if you hear hoofbeats, think horses rather than zebras.

These work most of the time, but not as reliably as the ball-catching heuristic. And a medical 'dropped ball' can have catastrophic consequences.

Almost half of medical errors involve problems with reasoning or decision quality, and many of the adverse consequences are preventable (Wilson et al, Medical Journal of Australia, 1999). A paper on our lack of insight in medicine, published ten years ago, noted that only 10% of clinicians admitted to having made a diagnostic error in the preceding year; that 40% of diagnoses about which clinicians were certain proved wrong at post-mortem; and that there was a psychological inertia against changing a diagnosis – even when alternatives were suggested by colleagues or decision-support tools – with clinicians remaining wedded to their original formulation.

Types of bias

A number of problems stem from too readily following the familiar. The initial facts presented about a case prime certain assumptions about potential diagnoses, and thought processes and investigations then follow to fit those initial assumptions.

Some examples of bias are listed below, along with the more disciplined thinking and mitigating factors that you could take into account when analysing the potential for bias in a given situation.

Premature closure

I recall an incident as a medical student in Oxford in the early '90s, in the days when patients lined up on trolleys in corridors were a nightly occurrence. An alcoholic patient slumped on a trolley had been left to sleep it off, before it eventually dawned, at dawn, that the cause of the slumping was actually cerebral: the patient had suffered what proved to be a large bleed. Alcohol increases the risk both of prejudice and of other pathologies.

  • Disciplined thinking: what are my biases about this patient and the diagnosis? What else might it be? Could there be several processes? Think systematically about all possibilities. Is there a rare alternative that should not be missed?

Band-wagon effect

Consider the frequent attender, or the hypervigilant patient reporting multiple symptoms. They may have had multiple previous consultations and investigations with nothing found – but eventually something will be. The challenge is recognising this and being rigorous enough to think systematically with each presentation, which is difficult when there is a poor 'signal-to-noise' ratio of symptoms to pathology.

  • Disciplined thinking: am I at risk of automatically dismissing these symptoms? I need to force myself to think through the possibilities, or perhaps ask a colleague to review the case with a fresh pair of eyes.

The zebra retreat

Sometimes the junior registrar is at an advantage: they've had less automatic conditioning and more recent revision of lists. Uncommon things do occur, sometimes hidden among more common problems. For example, the headache of idiopathic intracranial hypertension has characteristics similar to those of migraine, and both can cause visual disturbance. Many patients will have a history of migraine, so ask the right questions about visual disturbance and look at the fundi – or you'll miss the zebra…

  • Disciplined thinking: those hoofbeats may belong to zebras, not horses. Another analogy is to weigh the energy cost of running away from a sound in the jungle against the cost of ignoring it – the sound might be a monkey or it might be a tiger. The tiger may only be a one in 100 chance, but get it wrong and it's game over; that makes the wasted energy of the other 99 runs a price worth paying, as the rough sketch below illustrates.
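
As a loose illustration (not part of the original analogy), the asymmetry can be made explicit with some numbers – the probabilities and costs below are invented purely for the sketch:

```python
# A rough expected-cost comparison for the jungle analogy (all numbers
# hypothetical, chosen only to show the asymmetry).

p_tiger = 1 / 100        # chance the sound really is a tiger
cost_per_run = 1         # energy wasted on one unnecessary run
cost_of_tiger = 10_000   # catastrophic cost of staying put when it is a tiger

always_run = cost_per_run             # you pay the small cost every time
never_run = p_tiger * cost_of_tiger   # you gamble on it being a monkey

print(always_run, never_run)  # 1 100.0 – the 99 'wasted' runs are a bargain
```

On these numbers the habitual runner pays an expected cost of 1 per sound, while the gambler pays 100 – the same logic that justifies investigating the occasional zebra.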

Priming

This is when a prior exposure influences the interpretation of a subsequent one. One of my favourite examples is the same chemical being perceived as disgusting or pleasant depending on whether one is shown a picture of sweaty armpits or of ripe soft cheese on a farmhouse table.

In medicine it may be the subtle waft of yesterday's booze on the breath of the patient with tingly feet. This could end up leading one down an alcoholic neuropathy line, when the problem might in fact be B12 deficiency.

  • Disciplined thinking: why am I pursuing this particular line of thinking? What other lines of enquiry could I have followed?

Availability heuristic

We are much more likely to investigate for conditions that readily come to mind because we have seen them recently or heard them presented at meetings.

  • Disciplined thinking: is it just random variation that several buses have come at once? Or could previous occurrences have made me more aware, while I've forgotten all the other differentials that don't come to mind so readily?

Hindsight bias

There are many other cognitive biases to which we can succumb, but perhaps the one seen most often in a medico-legal context is hindsight bias: the patient suffers some injury and the series of events leading to it is laid out. It all seems so obvious, and terrible, in retrospect.

We have a remarkable ability to create stories that make sense of events and to conclude that, had one been wise, it was obvious that A was going to lead to B. We see this regularly in analysis of politics and other world events. It is understandable that patients also think this way, and this probably contributes to their anger. As professionals, and especially as writers of medico-legal reports, we must not succumb to this tendency ourselves.

Summary

We are at the mercy of everything our brains have been exposed to since birth, as well as of our evolutionary history, and most of our thinking occurs unconsciously. As the social psychologist and author Jonathan Haidt puts it, our consciousness is like a rider on an elephant, with relatively little influence over much of our thinking. The more insight we gain into this, the better we can understand our own cognitive biases and protect both patients and ourselves.

Recommended reading

Gigerenzer, G (2014). Risk Savvy: How to Make Good Decisions. Allen Lane.

Kahneman, D (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.

Haidt, J (2012). The Righteous Mind. Penguin.
