Clinical Decision Making in Physiotherapy Practice

Introduction

The majority of time spent working in clinical practice involves thinking and decision-making.[1][2][3] Because decisions are made so commonly, it is easy to assume that anyone can make effective decisions.[1] Effectiveness in clinical practice depends on the decisions made, which stresses the importance of learning how to make decisions optimally.[4][5] Clinical or diagnostic reasoning has been proposed to be the most important core skill of any healthcare practitioner.[6] The dynamic and ever-changing realm of health care demands that practitioners provide meaningful improvements in patient outcomes, and the clinical decision process is the only path to achieving this.[5][7] This decision-making can range from fast, intuitive and heuristic to analytical and evidence-based.[8] The rate at which practitioners fail at this critical skill is alarmingly high. Not only have large discrepancies (20-40%) been reported between antemortem and postmortem diagnoses, but many postmortem examinations would not have been necessary had the correct diagnosis been made.[9][10] Health care spending per person in the United States continues to increase, and it has been estimated that nearly 18% of the United States GDP ($2.7 trillion) is spent on healthcare.[11] However, more than 30% of this spending is wasted on inappropriate care.[12] Up to 80 000 deaths occur annually in hospitalised patients in the United States due to incorrect diagnoses. In the outpatient setting, around 5% of adults (12 million US adults annually) are incorrectly diagnosed.[13]

Most of these failures are not attributed to system problems or knowledge failure but to how practitioners think - how they solve problems, reason and ultimately make decisions.[5] To improve healthcare, it is essential to improve clinical reasoning and clinical decision-making.

What is Clinical Decision Making?

Clinical decision-making (CDM) is a contextual, continuous and evolving process where data is gathered, interpreted and evaluated to select an evidence-based action.[14] "The CDM process is a highly complex, multi-faceted skill that is developmental in nature and requires a substantial amount of practice with realistic patients to develop" - i.e. it is a learnt behaviour.[7][15] The understanding of CDM is still evolving and is largely based on research in psychology, medicine and nursing.[7][16] It involves multiple different types of reasoning.[7] It is therefore wrapped up in the "Great Rationality Debate" about the optimal course of our reasoning, decision-making and actions.[17] Rationality is proposed to be the foremost characteristic of the accomplished decision-maker.[5]

What is the Great Rationality Debate All About?

The debate exists partly because multiple significant theories of rationality could be relevant to the medical field.[17] "Rationality is often defined as acting in a way that helps us achieve our goals, which in the clinical setting typically means a desire to improve our health."[17] Most significant consequence-based theories agree that to achieve these goals, we need to take into account both the benefits (gains) and harms (losses) of alternative courses of action.[17] Djulbegovic & Elqayam[17] developed a list of "core ingredients" of rationality and its relevance to the medical field (Table 1).

Table 1. Core ingredients/principles of rationality commonly identified across theoretical models (adapted from Djulbegovic & Elqayam)[17]

Principle 1: Rational decision-making requires the integration of benefits (gains) and harms (losses) to accomplish our goals, such as improved health.
Principle 2: It generally happens under conditions of uncertainty.
  • The rational approach requires the use of reliable evidence to deal with the intrinsic uncertainties.
  • It depends upon cognitive processes that permit the integration of probabilities/uncertainties.
Principle 3: "Rational thinking should be informed by human cognitive architecture."
  • This architecture is made up of type 1 "old mind" reasoning processes (affect-based, intuitive, fast, resource-frugal) and type 2 "new mind" processes ("analytic and deliberative, consequential driven, and effortful").
Principle 4: Rationality is context dependent and should take into account the environmental and computational constraints of the human brain.
Principle 5: Rationality (in medicine) is closely related to the ethics and morality of our actions.
  • It requires consideration of utilitarian (society-oriented), duty-bound (individual-oriented), and rights-based (autonomy, "no decision about me, without me") ethics.

Despite this ongoing debate about defining rationality, there is consensus that it should conform to a normative standard - how the decision "ought" to be made.[5] The dominant paradigm seems to be that "Decision-making should be logical, evidence-based, follow the laws of science and probability and lead to decisions that are consistent with the rational choice theory".[5]

How Do We Make Decisions?

There are two main frameworks (mental models) described in the medical literature that clinicians employ in clinical decision-making:

  • Fast and frugal heuristics[18]
  • Dual-process theory[19] - which can also be described as a two-system mind

The fast and frugal heuristics (FFH) approach is based on the theory that "people do not rely on a single cognitive strategy but rather select an adequate heuristic from what has been dubbed an adaptive toolbox. Each of the heuristics (decision-making tools) in this cognitive "toolbox" is suitable for different environments. By relying on a heuristic that fits a particular environment, people can make efficient decisions in little time and based on little information (hence 'fast-and-frugal')".[20]

Heuristics can be defined as:

  • “...mental shortcuts developed by a clinician over time and include recognizing patterns of disease, case experience, intuitive judgment and the “rule of thumb” applications” [15]
  • “...short-cuts, abbreviated ways of thinking, maxims, ‘seen this many times before’ ways of thinking” [21]
  • “...cognitive strategies or mental shortcuts that are automatically and unconsciously employed - are essential for decision making. Heuristics can facilitate decision-making but can also lead to errors. When a heuristic fails, it is referred to as a cognitive bias” [22]

Heuristics are our "gut decisions" and rely on our intuition - recognition based on our past experiences. Heuristics utilise the capacities developed through extended practice.[23] The aim of heuristics is not to "optimise" (find the best solution) but rather to "satisfice" (find a good enough solution).[23]

FFH is used when there is a need to make quick and accurate decisions and the available information is limited to only what is necessary at that specific time.[24][25] This is commonly seen in medicine with triage or emergency medicine. In emergencies, there is no need for a wide spectrum of information. The clinician only needs the immediate information necessary to know what to do at that specific moment - whether to resuscitate, intubate, apply a tourniquet, or identify whether the patient is having a stroke. In these life-or-death situations, FFH strategies are important, and algorithmic decision-making trees come into play. The clinician only considers a series of "yes/no" questions until an answer is reached.[26] These fast-and-frugal decision-making trees and tools can be even more accurate than multiple regression.[23]
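
To make the "series of yes/no questions" concrete, here is a minimal sketch of a fast-and-frugal tree, loosely modelled on the well-known coronary care unit example from the FFH literature. The cue names, their order and the exit rules are simplified illustrative assumptions, not a validated clinical rule.

```python
# Illustrative sketch of a fast-and-frugal tree (FFT): at most three yes/no questions,
# and every question except the last can trigger an immediate exit.
# The cues and exits below are hypothetical teaching examples, NOT clinical guidance.

def fft_chest_pain_triage(st_segment_change: bool,
                          chest_pain_is_chief_complaint: bool,
                          any_other_risk_cue: bool) -> str:
    # Question 1: the single most informative cue can decide on its own.
    if st_segment_change:
        return "coronary care unit"
    # Question 2: if the first cue is absent, check the next cue; a "no" exits immediately.
    if not chest_pain_is_chief_complaint:
        return "regular nursing bed"
    # Question 3: the final cue forces a decision either way - no further information is gathered.
    return "coronary care unit" if any_other_risk_cue else "regular nursing bed"


if __name__ == "__main__":
    # A patient with chest pain as the chief complaint but no other cues exits at question 3.
    print(fft_chest_pain_triage(st_segment_change=False,
                                chest_pain_is_chief_complaint=True,
                                any_other_risk_cue=False))  # -> regular nursing bed
```

Note how "frugal" the tree is: each branch asks about one cue, stops as soon as an exit fires, and never weighs or combines cues the way a multiple regression model would.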

This approach cannot be applied to every situation, and in healthcare, it is only applicable in two instances where:

  • Rapid decisions need to be made. The amount of information is therefore limited, and simple decision-making tools/trees are used
  • The decision maker has intuition - a substantial amount of experience in similar enough situations to draw upon.[26]

FFH is not universally applicable in physiotherapy[26] because:

  • Not everybody has intuition. For example, newly qualified clinicians do not yet have an experience bank to draw upon
  • Most situations in physiotherapy are non-urgent. Therefore, fast decisions are not always needed, and physiotherapists can spend time gathering more information.
  • The decision-making trees used mainly depend on the biomedical model, which is essential in triage-type situations where patient-oriented factors like goal setting are unnecessary. Physiotherapy, however, has been moving away from the biomedical model towards the biopsychosocial approach to healthcare.

The second, and more commonly used, approach to CDM is dual-process theory, developed by Kahneman and Tversky.[19] Dual-process theory incorporates the use of intuitive (System 1) and analytical (System 2) processes in thinking, reasoning and deciding.[6][19][22][27] See Table 2.

Intuitive/System 1 reasoning

  • Similar to FFH and most commonly used in clinical practice.
  • Heavily reliant on the experience of the clinician making the decision[6]
  • It is fast, automatic and uses thin-slicing - relying on our instinctive first impressions to form an unconscious diagnosis [2][6]
  • These decisions use heuristics [6]
  • It is an adaptive mechanism that saves time and effort when making decisions. We spend the majority of our time in this “automatic zone” - where few events are deliberate, and most events automatically trigger the next.[21]
  • These unconscious processes may control our behaviour without us being aware.[27]
  • Heuristics are mostly effective, but they are also prone to error/bias.[21]

Analytical/System 2 processes

  • A slower and more deliberate process.[2][6]
  • Methodical and analytical and involves critical thinking and hypothesis testing[2][6][27]
  • More reliable, safe and effective but also slower and more resource-intensive.[6][28]

Table 2. Comparison of intuitive and analytical approaches to decision making[5][6][17][27]

Intuitive decision making (System 1/"old mind")

  • Depends on experiential-inductive logic
  • Fast
  • Unconscious thinking
  • Automatic, with little effort
  • Pattern recognition/gestalt effect
  • High capacity
  • Associative cognition
  • Evolutionary/hardwired
  • Heuristic
  • Default process
  • Reflexive
  • More prone to error - based on our personal biases

Analytical decision making (System 2/"new mind")

  • Uses hypothetico-deductive processes
  • Slow
  • Deliberate, conscious thought
  • Effortful and controlled
  • Hypothesis generation
  • Low capacity
  • Rule-based cognition
  • Acquired through critical thought
  • Normative reasoning
  • Inhibitory
  • Reflective
  • Less prone to error, as a high level of attention is applied to the cognitive load

[29]

The Decision Making Process

The decision-making process starts with the presentation of the patient's signs and symptoms.

  • If the clinician recognizes the signs and symptoms - the visual presentation (swelling, posture, area of symptoms, biomechanics) or the combination of symptoms/findings (behaviour of symptoms, history, imaging reports) - the intuitive process (System 1) is immediately and automatically activated. This process is reflexive and unconscious and does not involve any deliberate thinking.[6] Besides pattern recognition, other System 1 responses may be triggered simultaneously, for example by the clinician's feelings, experiences and perceptions, or by the ambient environment.[6][21]
  • If the clinician does not recognize the presented signs and symptoms, or there is uncertainty, analytical/System 2 processes are triggered instead. The clinician now uses analytical thinking to systematically analyse the presented data and reach multiple hypotheses. This process is slow, effortful and requires conscious thought.[6][15][28]

Although there appears to be a clear distinction between the intuitive and analytical approaches, they are more likely to interact, and clinicians typically integrate both approaches.[6][30]

Dual Process Theory Model

The relationship between these two systems (intuitive and analytical) is described in a model developed by Croskerry[6][21] in the context of diagnostic reasoning. The model recognizes that a continuous fluctuation may exist between the two approaches within a clinical setting (Figure 1).

He proposes eight major features of this model[21] (a simple conceptual sketch of how these features fit together follows the list):

  1. System 1 processing is fast, automatic and most prevalent. It is usually effective but leans heavily on heuristics, and because it is unchecked, it is more prone to biases
  2. System 2 is slower but deliberate and conscious, which may limit mistakes.
  3. Errors occur more often in System 1 reasoning
  4. Repetitive use of analytical/System 2 processing and the repeated presentation of the same problem to System 2 will lead to System 1 processing as learning occurs and skills are acquired.[4] Following this concept, skilled clinicians are more prone to automatically use System 1 processing, whereas novices initially rely more on System 2 processing.[30]
  5. Heuristics/biases can be overridden by the deliberate use of clinical reasoning, which means System 2 processes can override System 1 processes.
  6. Conversely, over-reliance on System 1 processes can also override System 2 processes, leading to unchecked decisions. This is evident when clinicians ignore well-developed clinical decision rules or habitually persist with clinical practices without substantial evidence.[4]
  7. The clinician can toggle/fluctuate between both systems
  8. In its continuous effort for cognitive ease, the brain will generally default to System 1 processing when possible.
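
The following is a purely conceptual sketch of how these features fit together, written as a toy routine. The pattern dictionary, the analytic_workup stand-in and the example presentations are all hypothetical; the code illustrates the structure of the model, not real cognition or clinical practice.

```python
# Conceptual sketch of the dual-process toggling described above (features 1, 2, 4, 5 and 8).
# Everything here is illustrative and hypothetical, not clinical content.

RECOGNISED_PATTERNS = {  # stands in for illness scripts built up through experience
    "acute lateral ankle swelling after inversion injury": "lateral ankle sprain",
}

def analytic_workup(presentation):
    """Stand-in for slow, effortful hypothesis generation and testing (System 2)."""
    return f"structured hypothesis list for: {presentation}"

def diagnose(presentation, deliberately_check=False):
    """Return a working diagnosis and the route (System 1 or System 2) that produced it."""
    # Features 1 and 8: fast, automatic pattern recognition is the default when available.
    if presentation in RECOGNISED_PATTERNS and not deliberately_check:
        return RECOGNISED_PATTERNS[presentation], "System 1 (intuitive)"
    # Features 2 and 5: unfamiliar or uncertain presentations, or a deliberate decision to
    # check oneself, engage the slower analytical route, which can override System 1.
    return analytic_workup(presentation), "System 2 (analytical)"

def learn(presentation, confirmed_diagnosis):
    # Feature 4: repeated analytical handling of the same problem turns it into a
    # recognised pattern, so future encounters default to System 1.
    RECOGNISED_PATTERNS[presentation] = confirmed_diagnosis

print(diagnose("acute lateral ankle swelling after inversion injury"))  # System 1 route
print(diagnose("gradual bilateral leg weakness with night pain"))       # System 2 route
```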


Figure 1. Model for diagnostic reasoning (Adapted from Croskerry [6])

[31]

What Influences our Clinical Decision Making?

Clinicians are expected to accurately diagnose patients' problems based on limited evidence and in limited time. They must be highly skilled experts to fulfil this expectation.[32] A patient-centred physiotherapy practice requires shared decision-making based on collaboration between therapists and patients.[33] This shared decision-making is influenced by a complex interaction between the following:[3][5][6][21][28]

  • Intrinsic factors
    • Professional experience, skills and training
    • Context-specific knowledge - mindware gaps (failure to acquire knowledge, or forgetting of knowledge)
    • Mindware contamination by bias and incorrect thinking
    • Personal factors like general fatigue, affective state, sleep deprivation, cognitive overload, decision fatigue
    • Individual factors like gender, personality, intelligence
    • Self-reflection
    • Communication
    • Metacognition (thinking about what we are thinking) during the process of gaining experience (on a cognitive level)
  • Extrinsic factors
    • Patient characteristics (appearance, demeanour, communication issues, past experiences with the patient)
    • The unique presentation of the patient (their condition - severity, previous experience with the presenting complaints)
    • Organisational factors in the medical environment (workload, other patients’ needs, interruptions, time restraints, ergonomic factors, cost)
    • Resource limitations (availability of specific tests, procedures, consultants etc.)

Biases can occur in both approaches, but most biases are related to the failure of heuristics (cognitive biases) and are therefore seated in System 1 processing.[5][6][15]

Most of the cognitive errors that lead to incorrect diagnoses are not due to a lack of knowledge but rather flaws in the collection, integration and verification of data, which often results in premature diagnoses.[15]

Heuristics (System 1) might be best used by experienced clinicians, who can recognise patterns more efficiently. Therapists using heuristic-style decisions should be aware of the increased chance of bias and the consequences of error, and should use their analytical System 2 processes to check their decisions. As discussed previously, errors in decision-making have a significant impact on patient safety and satisfaction. To improve our clinical decision-making, it is imperative to increase our understanding of the different approaches to decision-making and to become more aware of our biases, heuristics and limitations.

[34]


References

  1. Croskerry P, Nimmo GR. Better clinical decision making and reducing diagnostic error. The Journal of the Royal College of Physicians of Edinburgh. 2011 Jun 1;41(2):155-62.
  2. Smyth O, McCabe C. Think and think again! Clinical decision-making by advanced nurse practitioners in the Emergency Department. Int Emerg Nurs. 2017 Mar 1;31:72-4.
  3. Whelehan DF, Conlon KC, Ridgway PF. Medicine and heuristics: cognitive biases and medical decision-making. Irish Journal of Medical Science. 2020 May 14;189:1477-1484.
  4. Croskerry P. Context is everything or how could I have been that stupid. Healthc Q. 2009 Aug 15;12(Spec No Patient):e171-176.
  5. Croskerry P. A model for clinical decision-making in medicine. Medical Science Educator. 2017 Dec 1;27(1):9-13.
  6. Croskerry P. A universal model of diagnostic reasoning. Academic Medicine. 2009 Aug 1;84(8):1022-8.
  7. Huhn K, Black L, Christensen N, Furze J, Vendrely A, Wainwright S. Clinical reasoning: a survey of teaching methods and assessment in entry-level physical therapist clinical education. Journal of Physical Therapy Education. 2018 Sep 1;32(3):241-7.
  8. Miller KE, Singh H, Arnold R, Klein G. Clinical decision-making in complex healthcare delivery systems. In: Clinical Engineering Handbook. 2020 Jan 1 (pp. 858-864). Academic Press.
  9. Gawande A. Final cut. In: Complications: A Surgeon's Notes on an Imperfect Science. New York: Henry Holt and Company; 2002. p. 197-198.
  10. Graber M. Diagnostic errors in medicine: a case of neglect. The Joint Commission Journal on Quality and Patient Safety. 2005 Feb 1;31(2):106-13.
  11. Dieleman JL, Cao J, Chapin A, Chen C, Li Z, Liu A, Horst C, Kaldjian A, Matyasz T, Scott KW, Bui AL. US health care spending by payer and health condition, 1996-2016. JAMA. 2020 Mar 3;323(9):863-84.
  12. Berwick DM, Hackbarth AD. Eliminating waste in US health care. JAMA. 2012 Apr 11;307(14):1513-6.
  13. Singh H, Meyer AN, Thomas EJ. The frequency of diagnostic errors in outpatient care: estimations from three large observational studies involving US adult populations. BMJ Quality & Safety. 2014 Sep 1;23(9):727-31.
  14. Tiffen J, Corbridge SJ, Slimmer L. Enhancing clinical decision making: development of a contiguous definition and conceptual framework. Journal of Professional Nursing. 2014 Sep 1;30(5):399-405.
  15. Hussain A, Oestreicher J. Clinical decision-making: heuristics and cognitive biases for the ophthalmologist. Survey of Ophthalmology. 2018 Jan 1;63(1):119-24.
  16. Chowdhury A, Bjorbækmo WS. Clinical reasoning—embodied meaning-making in physiotherapy. Physiotherapy Theory and Practice. 2017 Jul 3;33(7):550-9.
  17. Djulbegovic B, Elqayam S. Many faces of rationality: implications of the great rationality debate for clinical decision-making. Journal of Evaluation in Clinical Practice. 2017 Oct;23(5):915-22.
  18. Gigerenzer G, Todd PM. Simple heuristics that make us smart. Oxford University Press, USA; 1999.
  19. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974 Sep 27;185(4157):1124-31.
  20. Hafenbrädl S, Waeger D, Marewski JN, Gigerenzer G. Applied decision making with fast-and-frugal heuristics. Journal of Applied Research in Memory and Cognition. 2016 Jun 1;5(2):215-31.
  21. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Quality & Safety. 2013 Oct 1;22(Suppl 2):ii58-64.
  22. Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Academic Medicine. 2017 Jan 1;92(1):23-30.
  23. Gigerenzer G. Why heuristics work. Perspectives on Psychological Science. 2008 Jan;3(1):20-9.
  24. Gigerenzer G. Moral intuition = fast and frugal heuristics? In: Moral Psychology. 2008 (pp. 1-26). MIT Press.
  25. Gigerenzer G, Kurzenhaeuser S. Fast and frugal heuristics in medical decision making. In: Science and Medicine in Dialogue: Thinking through Particulars and Universals. 2005 Jan 30:3-15.
  26. Walston Z. How do we make clinical decisions? [Accessed on 22 January 2021]
  27. Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008 Jan 10;59:255-78.
  28. Trimble M, Hamilton P. The thinking doctor: clinical decision making in contemporary medicine. Clinical Medicine. 2016 Aug;16(4):343.
  29. Learn Liberty. Heuristics, Explained. Published 25 Sep 2017. Available from: https://youtu.be/ReFqFPJHLhA [last accessed 9 Jan 2021]
  30. Price A, Zulkosky K, White K, Pretz J. Accuracy of intuition in clinical decision-making among novice clinicians. Journal of Advanced Nursing. 2017 May;73(5):1147-57.
  31. ACAPT. Clinical reasoning in Physical Therapy: Fast & slow thinking. Published 27 Aug 2020. Available from: https://youtu.be/LekUj7dlxlw [last accessed 9 Jan 2021]
  32. Grote T, Berens P. On the ethics of algorithmic decision-making in healthcare. Journal of Medical Ethics. 2020 Mar 1;46(3):205-11.
  33. Hoffmann TC, Lewis J, Maher CG. Shared decision-making should be an integral part of physiotherapy practice. Physiotherapy. 2020 Jun 1;107:43-9.
  34. EmPROinsurance. PRI Interviews Dr Pat Croskerry about How Physicians Make Decisions. Published 18 Dec 2018. Available from: https://youtu.be/hgGlrzKaoqI [last accessed 9 Jan 2021]