Heuristics in Clinical Decision Making

Original Editor - User Name

Top Contributors - Merinda Rodseth, Kim Jackson, Tarina van der Stockt, Ewa Jaraczewska and Jess Bell  

Introduction[edit | edit source]

Clinicians make daily decisions that impact the lives of others (Whelehan 2020).[1] They must decide continually despite the uncertainty that often clouds a situation and despite their own cognitive limitations (Gorini 2011). To achieve this, clinicians mostly rely on heuristics - simple cognitive shortcuts, shaped by our cognitive biases, which assist in clinical decision making (CDM) (Gorini 2011, Croskerry 2013one, Scott 2017, Whelehan 2020). The advantage of relying on heuristics is that decisions can be made quickly and are mostly accurate and efficient (Tversky & Kahneman, Scott 2017, Gorini 2011). The disadvantage is that they often lead to systematic cognitive errors (Gorini 2011, Tversky & Kahneman, Croskerry 2013one). These systematic errors are also known as “cognitive biases” (Croskerry 2013one).

What is cognitive bias?[edit | edit source]

Bias is inherent to human judgement (Croskerry 2013one) and can be defined as:

  • “the psychological tendency to make a decision based on incomplete information and subjective factors rather than empirical evidence” (Yuen 2018)
  • “predictable deviations from rationality” (Croskerry 2013one).

Cognitive biases are evident in CDM when information is inappropriately processed and/or overly focused upon, while other, more relevant information is ignored. This happens subconsciously, with the clinician unaware of its influence, and mostly occurs during automatic System 1 processing using heuristics (Kinsey 2019). A systematic review by Saposnik et al. (2016) found that cognitive biases may be associated with diagnostic inaccuracies, but limited information is currently available on their impact on evidence-based care (Saposnik 2016).

Heuristics and biases in medical practice[edit | edit source]

Availability heuristic (Gorini 2011, Dobler 2018, Whelehan 2020, Trimble 2016, Blumenthal 2015, Ehrlinger 2016, Scott 2017, O’Sullivan 2018)[edit | edit source]

  • “Tendency to make likelihood predictions based on what can easily be remembered” (Dobler 2018).
  • “More recent and readily available answers and solutions are preferentially favoured because of ease of recall and incorrectly perceived importance” (O’Sullivan 2018).
  • “Tendency to overestimate the frequency of things if they are more easily brought to mind. Things are judged to be more frequently occurring if they come to mind easily, probably because they are remembered without difficulty or because they were recently encountered.” (Gorini 2011).
  • “Emotionally charged and vivid case studies that come easily to mind (ie, are available) can unduly inflate estimates of the likelihood of the same scenario being repeated” (Scott 2017).

It is important to note that the most available evidence is not necessarily the most relevant, and events that are easily remembered do not necessarily occur more frequently (Gorini 2011, Dobler 2018). This heuristic is also closely related to the “Base-rate neglect” fallacy:

  • “... when the underlying incident rates of conditions or population-based knowledge are ignored as if they do not apply to the patient in question” (O’Sullivan 2018).
  • “...tendency to ignore the true prevalence of a disease, either inflating or reducing its base rate...” (Croskerry 2011)

Impact: Base rate neglect overrides knowledge of the prevalence of a disease or condition, so unnecessary tests are ordered despite a very low probability (O’Sullivan 2018). It can lead to distorted hypothesis generation and thereby result in under- or overestimation of certain diagnoses. It can also heavily influence the types of tests ordered, the treatments given and the information provided to patients (based on what comes to mind) (Whelehan 2020, podcast, Gorini 2011).
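
A simple worked example shows why neglecting the base rate distorts probability estimates. Suppose a condition has a prevalence (base rate) of 1% and a test for it has 90% sensitivity and 90% specificity - illustrative numbers only, not drawn from the cited sources. Bayes' theorem gives the probability of the condition given a positive test:

P(\text{disease} \mid \text{positive}) = \frac{0.90 \times 0.01}{0.90 \times 0.01 + 0.10 \times 0.99} \approx 0.08

Even with a positive result, the probability of disease is only about 8%, far below the 80-90% a clinician might intuitively estimate by focusing on the test's accuracy and ignoring the 1% base rate.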

Examples: A physiotherapist who recently attended a course on manipulation will be more inclined to use manipulation in the days following the course (Podcast). The same effect is evident when people are asked to name the capital of Australia: most will quickly answer “Sydney”, whereas Canberra is actually the capital. Because Sydney is generally better known, it comes to mind first (Ehrlinger 2016). Conditions/diseases that are encountered less frequently will be less “available” in the clinician’s mind and therefore less likely to be diagnosed (Gorini 2011, Hussain 2018).

Mitigators: Look for refuting evidence in the history and examination that may be less obvious. Ensure a comprehensive knowledge of differential diagnosis. Be attentive to atypical signs and symptoms which could be indicative of more serious pathology and consult with colleagues on such cases (Hussain 2018).

Anchoring heuristic[edit | edit source]

  • “...the disposition to concentrate on salient features at the very beginning of the diagnostic process and to insufficiently adjust the initial impression in the light of later information... this first impression can be described as a starting value or ‘anchor’” (Gorini 2011).
  • “...the clinician fixates on a particular aspect of the patient’s initial presentation, excluding other more relevant clinical facts” (Yuen 2018).

Impact: It can be an effective heuristic for ensuring efficiency, but it negatively influences judgement when the anchor is no longer applicable to the situation, increasing the likelihood of incorrect diagnosis and management through premature closure (Whelehan 2020).

Example: A medical assistant informs a busy doctor of a patient complaining of fatigue who also seems depressed. The doctor’s thought processes are potentially anchored to the initial label of “depressed patient” and, if this is not deliberately counteracted, the doctor will prescribe antidepressant medication. Had the doctor inquired about further symptoms, he would have learned about the patient’s skin and hair changes (unusual in depression), which would have pointed to a more probable diagnosis of hypothyroidism. (Yuen 2018)

Mitigators: Be aware of the “trap” and avoid early guessing. Use a differential diagnosis toolbox. Delay making a diagnosis until you have a full picture of the patient’s signs and symptoms and your information is complete. Involve the patient in the decision making process. (Hussain 2018, Dale talk)

Representativeness heuristic[edit | edit source]

  • “...the assumption that something that seems similar to other things in a certain category is itself a member of that category” (Gorini 2011).
  • “...probabilities are evaluated by the degree to which A represents B, that is, by the degree to which A resembles B... [and] not influenced by factors that should affect judgements... prior probability outcomes... sample sizes... chance... predictability... validity...” (Blumenthal 2015).
  • “The physician looks for prototypical manifestations of disease (pattern recognition) and fails to consider atypical variants” (Ely 2011) i.e. ”if it looks like a duck, quacks like a duck, then it is a duck” (Whelehan video)

Impact: For experienced practitioners, pattern recognition leads to prompt treatment and improved efficiency. It can, however, restrict CDM to pattern recognition alone, resulting in overemphasis of particular aspects of the assessment while atypical presentations are missed. Less commonly known conditions therefore remain undiagnosed and undertreated. This heuristic can also result in misclassification because of overreliance on the prevalence of a condition (Whelehan 2020). Reliance on the representativeness heuristic may also lead to overestimation of improbable diagnoses and over-utilisation of resources due to the impact of the “base-rate neglect” effect (Gorini 2011). This heuristic is particularly evident in older patients with complex multimorbidity and frailty (Scott 2017).

Examples: A young boy spends the majority of his childhood taking apart electronic equipment (radios, old computers) and reading books about the mechanics behind electronics. As he grows into adulthood, would you expect him to study for a degree in business or in engineering? The majority of people will expect him to study engineering based on his interests, even though statistically more people study for a business degree. This is associated with the “base rate” neglect/fallacy, where established prevalence rates are ignored (podcast). Also associated with this is the “halo-effect” bias - “...the tendency for another person’s perceived traits to “spill over” from one area of their personality to another” (Kinsey 2019). For example, for a patient who is successful in business, works hard and is easy to communicate with, the expectation is also that they are going to follow recommendations, do their exercises and successfully rehabilitate (podcast).

Mitigators: Use a safety check system after the initial diagnosis to shift thought processes from pattern recognition to analytical processing (Dale video). Consider further hypotheses for symptoms other than those that readily fit the pattern (Ely 2011).

[2]


Confirmation bias[edit | edit source]

  • “...tendency to look for and notice information that is consistent with our pre-existing expectations and beliefs” (Gorini 2011).
  • “...tendency to look for evidence “confirming” a diagnosis rather than disconfirming evidence to refute it” (Hussain 2018)
  • “Tunnel-vision searching for data to support initial diagnoses while actively ignoring potential data which will reject initial hypotheses.” (Whelehan 2020)

Impact: Even though it can help experienced clinicians make quick, low-risk decisions in resource-scarce situations, it results in premature closure of a diagnosis (Whelehan 2020, Trimble 2016). Clinicians may even frame their enquiries to support their beliefs (podcast). It is closely related to the “anchoring” heuristic (Whelehan 2020). Relying on confirmation heuristics can also lead to wasted time, effort and resources while the correct diagnosis is missed (Hussain 2018). Confirmation bias produces tunnel vision in diagnosis, which increases the likelihood of paternalistic, system-based approaches to healthcare (the physician knows best), as opposed to patient-centered care (Dale video).

Example: When a patient presents with raised white blood cells, a physician immediately suspects an infection instead of asking, “Why are the white cells raised? What other findings are there?” (O’Sullivan 2018)

Mitigators: Remember that the initial diagnosis is debatable and dependent on both confirming and negating evidence - look at competing hypotheses (Hussain 2018). Be open to feedback and open with the patient to engage in shared decision making. Engage in reflective practice (Dale talk).

Overconfidence heuristic[edit | edit source]

  • “When a physician is too sure of their own conclusion to entertain other possible differential diagnoses. It may result in decision-making being formulated through opinion or ‘hunch’ as opposed to systematic approaches”. (Whelehan 2020)
  • “...the tendency to overestimate one’s own knowledge and accuracy in making decisions. People place too much faith in their own opinions instead of carefully gathered evidence.” (Gorini 2011)
  • “A common tendency to believe we know more than we do.” (Hussain 2018)

Impact: This heuristic is driven by the general human need to maintain a positive self-image (Gorini 2011). It is heavily influenced by personality and increases the risk of an illusion of control over the situation. This can lead to overestimation of knowledge and understanding and, eventually, to incorrect diagnosis and treatment.

Example: A physician assesses a patient who has presented with headaches and dizziness over the last few weeks. The physician is convinced the patient has migraine. The patient, however, thinks they are just “sick”, but the physician disregards this without considering alternative hypotheses such as otitis media or sinusitis (Whelehan 2020).

Mitigators: These personality-based biases need to be challenged. Cognitive forcing strategies, such as simulated environments in which clinicians can see the consequences of their overconfidence, can help them become more aware of their own limitations and gaps in their knowledge (Whelehan video).

Bandwagon heuristic[edit | edit source]

  • “...tendency to side with the majority in decision making for fear of standing out” (Whelehan 2020).
  • “Group-think and herd effects, often fueled by influential individuals with authority or charisma, may discourage or dismiss dissenting views about the value of an intervention”. (Scott 2017).

Impact: This heuristic is influenced by the physiotherapist’s work culture and is encouraged in settings where non-disclosure is more prevalent. Although it can result in better harmony and cooperation in teams, it impedes proper decision making due to the lack of opposing ideas, creativity and feedback on decisions, ultimately resulting in missed learning opportunities and sub-optimal care (Whelehan video).

Example: Students mentored by senior members of staff who override individual decision making. Another example is junior doctors on a ward round led by an experienced specialist: the junior doctors would rather concede to the opinion of the specialist than raise their own opinions or concerns, out of fear of repercussions (Whelehan 2020).

Mitigators: A culture of open disclosure promotes effective communication among team members and optimises patient care through collaboration (Whelehan video).

Commission bias[edit | edit source]

  • “A tendency towards action rather than inaction. Better to be safe than sorry” (O’Sullivan 2018)
  • “The tendency in the midst of uncertainty to err on the side of action, regardless of the evidence”. (Yuen 2018)

Impact: The commission bias is an important driver of low-value care, including over-investigation and over-treatment (Dobler 2018, Yuen 2018). It can also lead to overconfidence in clinicians, who then treat patients inappropriately, which can exacerbate symptoms or cause bodily harm.

Example: In terminal illness, clinicians may continue to administer futile care, fueled by the desire to act (Yuen 2018).

Mitigators: Clinical mentoring from experienced clinicians to junior clinicians, safety checklists and decision-making aids (Dale talk).

Omission heuristic[edit | edit source]

  • “Tendency to judge actions that lead to harm as worse or less moral than equally harmful non-actions (omissions)”. (Dobler 2018)
  • “A tendency towards inaction grounded in the principle of ‘Do No Harm'”. (Hussain 2018)

Impact: Managing patients too conservatively can lead to delays in treatment and an inadequate response to the clinical symptoms (Whelehan 2020). The clinician would rather attribute the patient’s outcome to the natural progression of a disease than to his/her own actions (Yuen 2018).

Example: Physicians are more concerned about the potential adverse effects of treatment than about the more pertinent risks of morbidity and mortality associated with the disease (Dobler 2018). For example, performing sub-optimal depth compressions during cardiac resuscitation in order to avoid causing rib fractures (Yuen 2018).

Mitigators: Clinical mentoring from experienced clinicians to junior clinicians, safety checklists and decision-making aids (Dale talk).

Aggregate heuristic[edit | edit source]

  • “...when physicians believe that aggregated data, such as those used to develop practice guidelines, do not apply to individual patients they are treating” (Whelehan 2020)

Impact: Clinicians believe that aggregated data, such as those used to develop clinical guidelines, do not apply to the individual patients they are treating, so clinical decision-making rules are overridden in favour of individual judgement (Whelehan video). This approach can lead to the use of “old-school” techniques and non-evidence-based approaches in practice, which can result in prolonged, ineffective treatment of the patient.

Example: A patient arrives in the emergency department after a car accident with severe bleeding. The patient is also a Jehovah’s Witness and will not consent to a blood transfusion under any circumstances. Acting on what the surgeon considers to be in the patient’s best interest, the surgeon waits until the patient loses consciousness before deciding to operate and transfuse (Whelehan 2020).

Mitigators: The use of evidence-based guidelines and engaging in research as part of continuing professional development (Dale talk).

Status quo bias[edit | edit source]

  • “A preference for the current state and can be explained with loss aversion. Any change is associated with potential losses and discomfort. As people are loss averse (prospect theory), the losses weigh heavier than the gains.” (Dobler 2018)
  • “Having to consider the advantages and disadvantages of ceasing or declining certain interventions is often confronting, resulting in a preference to simply maintain the status quo” (Scott 2017)

Impact: Can result in “clinician inertia”, where clinicians do not intensify or step down treatments even when this is indicated.

Example: Failing to step down asthma medication or to intensify treatment for Type 2 diabetes mellitus when indicated (Dobler 2018).

Framing bias[edit | edit source]

  • “...refers to the fact that people’s reaction to a particular choice varies depending on how it is presented, for example, as a loss or as a gain”. (Dobler 2018)
  • “Reacting to a particular choice differently depending on how the information is presented to you” (O’Sullivan 2018).

Example: A physician telling a patient that the risk of a brain haemorrhage from oral anticoagulation is 2% is perceived very differently to informing the patient that there is a 98% chance of not having a brain haemorrhage on treatment (Dobler 2018).
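
The two framings in this example are numerically identical, since the probabilities of complementary events sum to one:

P(\text{no haemorrhage}) = 1 - P(\text{haemorrhage}) = 1 - 0.02 = 0.98

Only the presentation differs, yet the perceived risk can change markedly.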

Premature closure bias[edit | edit source]

  • “Tendency to cease inquiry once a possible solution for a problem is found.” (Yuen 2018).
  • “The decision making process ends too soon, the diagnosis is accepted before it has been fully ‘verified’” (Hussain 2018).
  • “When the diagnosis is made, the thinking stops” (Croskerry 2011)

Impact: Premature closure leads to incomplete assessment of the problem which will result in incorrect conclusions (Yuen 2018).

Affect heuristic[edit | edit source]

  • “Representations of objects and events in people’s minds as tagged to varying degrees of affect. People revert to the “affect pool” (all the positive and negative tags associated with  the representations) in the process of making judgements” (Blumenthal 2015)
  • “Favourable impressions of an intervention may evoke feelings of attachment and persisting judgements of high benefits, despite clear evidence to the contrary” (Scott 2017)

Sunk cost bias[edit | edit source]

  • “...tendency to continue an endeavor once an investment in money, effort or time has been made” (Blumenthal 2015).
  • “Clinicians may persist with low value care principally because considerable time, effort, resources and training have already been invested and cannot be forsaken” (Scott 2017).

Impact: The sunk cost fallacy forms part of the confirmation bias. Clinicians justify past investments of time, money and effort by continuing to invest: they keep using treatment techniques learnt at school or on a course, confirming their belief that the original investment was not wasted and is not a sunk cost (podcast).

Cognitive debiasing[edit | edit source]

The solution to overcoming the biases that influence our decision making lies in a series of cognitive interventional steps called “cognitive debiasing” - “a process of creating awareness of existing bias and intervening to minimise it” (Dobler 2018). Cognitive debiasing is “an essential skill in developing sound clinical reasoning” (Croskerry 2013two).

Cognitive debiasing entails change that cannot occur in a single event, but rather unfolds through a series of stages (Croskerry 2013two). See Figure 1.

Figure 1: Model of change. Adapted from Croskerry 2013two

Cognitive debiasing strategies[edit | edit source]

Croskerry, Singhal and Mamede (2013two) proposed three groups of strategies that could aid in mitigating the impact of cognitive bias. These groups overlap considerably and should be considered as a spectrum rather than in isolation.

a. Education strategies (Dobler 2018, Croskerry 2017, O’Sullivan 2018)

  • Aims to improve clinicians’ abilities to detect the need for debiasing in the future
  • Involves learning about bias and its risks, how to identify bias and providing skills to mitigate it (O’Sullivan 2018, Dobler 2018)
  • Elicit self-awareness of own biases in decision making
  • Cognitive tutoring systems and simulation training to force clinicians to approach their biases directly (Dobler 2018)
  • Little evidence on the effectiveness of education as a debiasing strategy (Dobler 2018, O’Sullivan 2018)

b. Workplace strategies (Croskerry 2013two, Dobler 2018)

  • Debiasing implemented at the time of problem-solving, while reasoning about the problem at hand (Croskerry 2013two)
  • Includes strategies that rely on the clinician’s cognitive processes, strategies that require interventions in the practice setting, and strategies embedded in the healthcare system to help mitigate cognitive bias (Croskerry 2013two, Dobler 2018).
  • Thorough information gathering
  • Slowing down strategies - induce slow and deliberate reasoning and a switch to System 2 processing, e.g. a planned time out in the operating room (Croskerry 2013two, Dobler 2018, O’Sullivan 2018)
  • Reflective practice and role modeling (Scott 2017, Croskerry 2013two, Croskerry 2017, Yuen 2018, Hussain 2018, Trimble 2016, Saposnik 2016).
  • Metacognition and considering alternatives - think about your own thinking and consider “what else could this be”? (Croskerry 2009, Croskerry 2013two, Croskerry 2017, O’Sullivan 2018, Trimble 2016)
  • Group decision strategy - collaboration and discussion between clinicians about cases (Croskerry 2013two, Yuen 2018, Hussain 2018)
  • Personal accountability and feedback - “thinking out loud” (Croskerry 2013two, Trimble 2016, Hussain 2018)
  • Decision support systems - including CDM aids, differential diagnosis generators, checklists (Yuen 2018, Croskerry 2013two, Whelehan 2020, Dobler 2018, Gorini 2011, Ely 2011, O’Sullivan 2018)
  • Exposure control - not providing the nurse’s notes for the doctor to read before assessing the patient (Croskerry 2013two)
  • Some evidence exists for the effectiveness of these interventions (Dobler 2018)

c. Forcing functions/Strategies for individual decision makers (Croskerry 2013two, Dobler 2018)

  • Based on cognitive forcing functions - “...the decision maker is forcing conscious attention to information before taking action” (Dobler 2018) or “...rules that depend on the clinician consciously applying a metacognitive step and cognitively forcing a necessary consideration of alternatives” (Croskerry 2013two).
  • Includes deliberate, real-time reflection (Dobler 2018)
  • Clinical prediction rules (Croskerry 2013two)
  • Rule out worst-case scenario (Croskerry 2013two)
  • Consider the opposite (Croskerry 2013two, Dobler 2018)
  • Checklists (Croskerry 2013two, Dobler 2018, Ely 2011)
  • Structured review of data, templates (Croskerry 2013two, Croskerry 2011)
  • No evidence of effectiveness in the medical context (Dobler 2018)

Many other strategies have also been proposed as useful in overcoming cognitive biases:

Culture of error disclosure (Whelehan video)

  • Have an open culture around error making - medical error is inevitable and it is important to reflect on past activities
  • Mitigates personality-based heuristics

Human factors training (Whelehan 2020, Yuen 2018, O’Sullivan 2018)

  • Cognitive overload
  • Difficult patients
  • Sleep deprivation
  • Fatigue
  • Impact of affect/emotion - include self-awareness check-ins and resilience training

Shared decision making (Scott 2017, Gorini 2011, Tousignant-Laflamme 2017, Hoffmann 2019)

  • “... a partnership between the clinician and the patient, in which the clinician’s knowledge and preferences (in terms of health care interventions) are evaluated together with the patient’s preferences and needs” (Gorini 2011)

Conclusion[edit | edit source]

The study of heuristics raises three questions and thereby has three goals (Raab 2015):

  1. Which heuristics do clinicians use or rely on? To answer this, we need to analyse the “adaptive toolbox” (the collection of heuristics) that clinicians have at their disposal.
  2. When should which heuristic be used, i.e. in which situations is a heuristic more likely to be successful? This goal is prescriptive in nature and requires studying the ecological rationality of heuristics.
  3. How can we improve decision making? This is an engineering/design goal (“intuitive design”) that strives to design expert systems that improve decision making.

The general assumption in the literature is that biases and heuristics are “bad” and result in suboptimal decisions. This might occasionally be the case, but heuristics can also be “strengths” that allow clinicians to make quick decisions using simple rules, which is often essential in clinical practice. It is therefore imperative to understand which cognitive biases and heuristics are detrimental to sound CDM, and in which contexts (Blumenthal 2015). Many of the biases evident in clinicians are recognisable and correctable, which underlies the concept of learning and refining clinical practice (Croskerry 2013one). The general goal should therefore be to formalise and understand heuristics in order to teach their use more effectively, as this will also result in less variation in practice and more efficient health care (Marewski 2012).

References[edit | edit source]

  1. Whelehan DF, Conlon KC, Ridgway PF. Medicine and heuristics: cognitive biases and medical decision-making. Irish Journal of Medical Science. 2020 May 14;189:1477-1484. DOI: 10.1007/s11845-020-02235-1
  2. Intermittent Diversion. Kahneman and Tversky: How heuristics impact our judgment. Available from https://www.youtube.com/watch?v=3IjIVD-KYF4 [last accessed 28 January 2021]