
All cognition is flawed


Clinicians, like researchers, can fall prey to cognitive bias (Kleinmuntz 1990). It lurks in our minds without our awareness and can show up in everyday life as a stereotype or an assumption. For clinicians, though, the cognitive biases we hold affect two things we are routinely required to do: diagnose a problem, i.e. ‘what we think it is’, and provide treatment, i.e. ‘how to fix it’ (Croskerry 2013).

When identifying the problem a patient has come to see us about, clinicians can draw on two modes of thinking: heuristic and analytical (Croskerry 2003). Heuristics are the shortcuts our brain uses to save energy by solving a problem quickly; they could also be called ‘rules of thumb’ or ‘intuitive judgements’. While useful when time is short and resources (mental and physical) are low, heuristic thinking can lead to trouble, because it increases the chance of cognitive bias affecting the thinking process and leading to diagnostic errors (Kleinmuntz 1990). Over 100 different cognitive biases have been described (Croskerry 2003), far more than one blog post can cover. This post will identify just a few and provide examples of how they may affect clinical practice.

Let’s look at the usual presentation of a patient to a clinician. A patient presents with a particular problem. The clinician listens and gathers routine information such as medication use and lifestyle factors. The clinician then has to gather further information about the presenting problem, and here lies the potential for bias. If the clinician arrives at a hypothesis (‘what we think it is’) too quickly and fails to adjust it when new information emerges (known as anchoring), the chance of a diagnostic error increases and an incorrect treatment plan (‘how to fix it’) may follow (Graber, Gordon & Franklin 2002). Anchoring can be especially severe when it is combined with confirmation bias, where the clinician searches for information that will prove the hypothesis correct rather than information that may prove it incorrect (Rabin & Schrag 1999).

For example, a patient presents to the emergency department with multiple stab wounds to the chest, head and arms. The patient is intoxicated but calm and co-operative. There are no signs of lung problems, and all physical findings (apart from the multiple stab wounds) are normal. The first concern is the stab wound most likely to damage the vital organs of the chest. After chest scans and a physical examination of the chest wound, that danger is ruled out, and following treatment the patient is discharged. The patient returns four days later with blurred vision, vomiting and trouble concentrating. A CT scan of the head shows a knife wound that had penetrated the brain. In this example you can see both biases: the clinician ‘anchored’ onto the chest injury and then used scans and a physical examination of the chest to confirm the initial hypothesis, and therefore failed to search for other possibilities or other injuries. The example is dramatic, but it shows how serious the consequences of missing something can be, even when the clinician appears to have done an in-depth examination (example adapted from Croskerry 2013).

Other biases can come into play when receiving previously gathered information (referrals or handovers) from other clinicians. Often the previous clinician will tell you what they think it is and, like a snowball rolling down a hill, that thought gathers momentum; what started out as a possibility evolves into a “certainty” (known as diagnosis momentum) (Croskerry 2003). Often accompanying this bias is the way the previous or referring clinician ‘frames’ the information, which can promote a particular view of the problem and limit other possibilities (Croskerry 2003).

For example, a patient is sent to a psychiatric facility by her general practitioner for severe symptoms of anxiety and depression. She has been having trouble breathing and has fainted several times. The psychiatrist wants to rule out a chest infection and sends her on to the hospital for a chest x-ray. At the hospital the patient is assessed and it is noted that she is a smoker, is overweight and has asthma. Her chest is examined and the chest films are normal. The hospital doctor concludes that the breathing problems are due to anxiety. As the patient is leaving, she faints; she cannot be resuscitated and the monitor shows that her heart has stopped. The autopsy reveals multiple pelvic vein blood clots extending from the femoral vein, and clots in both lungs, which would have caused the breathing problems. To diagnose the clots, further tests would have been needed. Once again a dramatic example, but here one can clearly see the effect of framing and diagnosis momentum. The possibility that the breathing problems were due to anxiety gradually gathered momentum until it stuck, despite the hospital clinician knowing of conflicting evidence (the smoking and the weight). Complicating matters further, the information had been ‘framed’ as a possible chest infection, which shaped the thinking of the clinicians who followed (example adapted from Croskerry 2013).

The problem with cognitive bias is that it accounts for a large proportion of incorrect diagnoses, some of which have a huge negative impact on the patient (Croskerry 2013). What makes it so tricky is that clinicians must make decisions based on the information given to them by patients, other clinicians and so on, while funding, resources and time limit their ability to run multiple tests or spend long hours with each patient (Graber 2003). However, research has identified several ways to minimise cognitive bias. One is metacognition, which is ‘thinking about your thinking’ (Croskerry 2002). For a clinician, this might mean using the analytical mode of thinking to double-check a hypothesis before accepting it as true, or re-thinking a previous hypothesis when conflicting information arises (Croskerry 2002). Another, used frequently, is to ask oneself ‘what else could this be?’ and to search for evidence that may disprove the first hypothesis (‘what I think this is’). Lastly, practice scenarios can be used to highlight cognitive biases and identify ways to reduce them (Croskerry 2002). Clinicians can start using these techniques to minimise the effect of cognitive bias in clinical practice.

This is the second in a three-part series of posts looking at what cognitive bias is and how it influences our clinical practice and research.

About Kerwin Talbot

Kerwin Talbot, BiM: I completed my degree in Podiatry (that’s right, feet) with honours in 2011. After working clinically for two years I returned to the research world. After seeing several perplexing patients during my clinical years, I decided (rather naively) to start a PhD in pain and neuroscience, and had the amazing fortune of being made a part of the BiM research team. I also do some clinical and academic teaching in Podiatry at the University of South Australia. My research looks at pain and classical conditioning. With the rest of my free time I try to balance training for triathlons with watching devastatingly bad television.

References:

Croskerry, P 2002, ‘Achieving quality in clinical decision making: cognitive strategies and detection of bias’, Academic Emergency Medicine, vol. 9, no. 11, pp. 1184-1204.

Croskerry, P 2003, ‘The importance of cognitive errors in diagnosis and strategies to minimize them’, Academic Medicine, vol. 78, no. 8, pp. 775-780.

Croskerry, P 2013, ‘From mindless to mindful practice—cognitive bias and clinical decision making’, New England Journal of Medicine, vol. 368, no. 26, pp. 2445-2448.

Graber, M, Gordon, R & Franklin, N 2002, ‘Reducing diagnostic errors in medicine: what’s the goal?’, Academic Medicine, vol. 77, no. 10, pp. 981-992.

Graber, M 2003, ‘Metacognitive training to reduce diagnostic errors: ready for prime time?’, Academic Medicine, vol. 78, no. 8, p. 781.

Kleinmuntz, B 1990, ‘Why we still use our heads instead of formulas: Toward an integrative approach’, Psychological Bulletin, vol. 107, no. 3, p. 296.

Rabin, M & Schrag, JL 1999, ‘First impressions matter: A model of confirmatory bias’, Quarterly Journal of Economics, pp. 37-82.
