
Introduction

The foundations of human judgment and decision theory have influenced studies on decision making for decades in various domains. A specific area of human judgment is decision-making under conditions of uncertainty. Medicine is an example of decision-making under conditions of uncertainty, where doctors constantly make decisions with incomplete information, knowledge gaps, and sometimes inaccurate information. These conditions are exacerbated in critical care environments (Emergency Departments (ED) and Intensive Care Units (ICU)), which are complex, information-intensive, time-sensitive, highly stressful, non-deterministic, interruption-laden, and life-critical [1]. Caring for critically ill patients within these situations often requires clinicians to make life-and-death decisions within a few seconds while relying on large quantities of questionable information. In order to make these decisions in a timely manner, the clinician must reduce the large quantity of data to a manageable dimension and quickly determine what information is critical to handle the current situation [2]. Studies have shown that individuals often deal with such situations by using cognitive heuristics, or mental shortcuts [1, 2]. Even though the use of heuristics can lead to appropriate judgments, inappropriate heuristic use can result in severe and systematic errors [3–5]. In medicine, such errors include incorrect or delayed diagnosis, and inappropriate or delayed treatment, all of which can result in adverse medical events and patient harm. Due to the severe consequences of medical errors, it is imperative to minimize inappropriate use of cognitive heuristics by developing techniques to identify cognitive heuristic use.

The purpose of this chapter is to describe a theoretical framework, with associated methods, that characterizes physicians’ use of cognitive heuristics and biases when caring for critically ill patients. Given that heuristics can be very beneficial and result in sound judgments, whereas biases can (but do not always) result in flawed judgment [3–5], the framework we developed enables identification of specific actions associated with heuristic and bias use leading to sound decisions, as well as actions leading to flawed judgment. Identification of these events can facilitate the development of computer-based modules that detect when clinical reasoning is deviating toward flawed judgment and suggest reasoning strategies to nudge the clinician toward sound judgment. These computer modules can be incorporated into biomedical informatics tools to enhance decision-making at the point of care. Development of such automated error detection and correction systems is critical for the management of medical errors and for enhancing patient safety.

This chapter begins with a review of the literature on theories of decision-making and cognitive heuristics and biases. We then discuss how heuristics and biases are used in medicine and how they can impact clinical reasoning. Next, we describe the methods we used to develop and validate a theoretical framework. Our methods include a pilot study in which we ascertained physicians’ view of the heuristics and biases they use in their daily practice, and a proof-of-concept study based on naturalistic data from the ICU, supplemented with a thorough review of the heuristic and bias literature. Following the presentation of our methods, we discuss our critical care cognitive heuristic and bias framework in detail. Finally, we discuss the implications of the framework and suggest ways it can be used in the real world to minimize flawed judgment and enhance patient safety.

Background

Theories of Decision-Making

When individuals make decisions they choose a course of action from a set of alternatives with the aim of achieving a goal [6]. There are two primary categories of decision theory: Normative Decision Theory and Descriptive Decision Theory. Normative decision theories propose the manner in which people should make decisions in order to optimize an outcome, whereas descriptive decision theories depict how individuals actually make decisions. Normative decision theories utilize axiomatic mathematical models of human behavior, including probability theories such as Bayesian Theory, and utility theories such as Expected Utility Theory [7–9]. Normative decision theories assume an ideal decision maker, who is fully informed and rational and is able to process information with perfect accuracy, resulting in an optimal decision [10]. Since individuals are unable to process information with perfect accuracy, and people do not behave in ways consistent with axiomatic rules, a related area of decision-making came into being that describes how people actually make decisions. Descriptive decision theories describe the manner in which individuals have been observed making decisions. These theories or models include the Satisficing Model [11], the Conjunctive/Disjunctive Model [12], the Recognition Primed Decision Model [13], the Mental Model Theory [14], and the Dual Process Theory [15].
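As a concrete illustration of the normative approach, the expected-utility calculation can be sketched in a few lines of code. The alternatives, probabilities, and utilities below are invented for illustration and are not drawn from the chapter:

```python
# Sketch of Expected Utility Theory: each alternative is scored as the
# probability-weighted sum of its outcome utilities; the ideal rational
# decision maker picks the alternative with the maximum score.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one alternative."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * u for p, u in outcomes)

# Hypothetical treatment choice; all numbers are made up.
alternatives = {
    "treatment_A": [(0.7, 80), (0.3, 10)],   # risky but high payoff
    "treatment_B": [(0.9, 50), (0.1, 40)],   # safer, modest payoff
}

best = max(alternatives, key=lambda a: expected_utility(alternatives[a]))
print(best)  # treatment_A: 0.7*80 + 0.3*10 = 59 vs 0.9*50 + 0.1*40 = 49
```

Descriptive theories arose precisely because people do not carry out this computation explicitly under real-world constraints.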

A concept that applies to both normative and descriptive decision theories is rationality. In general, it is thought that individuals are rational decision makers, in that people make choices to maximize utility or self-benefit. Rational behavior is characterized by an individual who has a “well-organized and stable system of preferences and a skill in computation that enables him to calculate, for the alternative courses of action, which alternative will permit him to reach the highest attainable point on his preference scale” [16]. In practice, however, rational decision-making is nearly impossible due to human limitations and the circumstances humans must face. According to Simon, “rationality denotes a style of behavior that is appropriate to the achievement of given goals, within the limits imposed by given circumstances and constraints” [17]. As a result of studying these limitations, Herbert Simon developed the concept of Bounded Rationality [18], which theorizes that in decision-making, the rationality of individuals is limited by three things: (1) available information; (2) cognitive limitations of the mind; and (3) the finite amount of time available to make decisions. When making decisions, we do not always have the information necessary to make the optimal decision. We are limited in formulating and solving complex problems by our ability to receive, store, retrieve and transmit information. We also find ourselves in time-critical situations that restrict our ability to assess, comprehend and process information in order to make optimal decisions. Such constraints result in humans using heuristics rather than strict, rigid rules to arrive at a decision.

In summary, this section discussed theories of decision-making across many domains (not specific to medicine). Normative theories of decision-making propose that humans are able to arrive at the optimal decision, given that they have the ability to execute axiomatic mathematical computations during the decision-making process. Descriptive decision theories assert that humans do not have the ability to quickly execute these computations, and that decisions are actually made much differently than the normative theories propose. Descriptive theories propose that people tend to arrive at a decision when primed by knowledge readily accessible within their memory, and when people arrive at a solution (decision) that is satisfactory, they discontinue the problem-solving process [11, 13]. Experts discontinue the problem-solving process quickly, while novices continue problem solving for an extended period. Within both paradigms of decision-making, people are bounded by limitations such as cognitive constraints and circumstances such as time pressure. It is these factors that induce the use of mental shortcuts to assist in the decision-making process. The normative and descriptive theories can also be applied within the domain of medical decision-making. For a more detailed discussion of the paradigms of cognition in medical decision-making, see Patel, Kaufman and Arocha [19]. Decision-making techniques specific to the diagnostic process are detailed in the section “The Diagnostic Process and the Use and Impact of Heuristics and Biases in Medicine”.

Heuristic and Bias Theoretical Foundation

A cognitive heuristic is a mental shortcut applied to make complex tasks simpler. Kahneman and Tversky spent nearly three decades studying how people make judgments under conditions of uncertainty. Based on empirical studies they found that (1) people rely on a limited number of heuristic principles to reduce complex tasks of probabilistic assessment and prediction to simpler judgmental operations; (2) people rely on heuristics when confronted with a complicated judgment or decision; and (3) people use heuristics during problem solving to speed up the process of finding a solution where an exhaustive search is impractical [3, 10, 18, 20]. Use of heuristics is typically unconscious to the decision-maker, and is largely due to our cognitive and environmental limitations: the cognitive limitations of short-term memory and memory retrieval, and environmental limitations such as the finite amount of time one has to make a decision and the need to assess a large amount of information within a short period. Use of heuristics can result in a close approximation to the optimal decision suggested by normative theories, can be very efficient, and can result in appropriate judgments [5]. However, when not used properly, heuristics can also lead to severe and systematic errors, or cognitive biases, which are departures from normative rational theory [5, 21]. It should be noted that inappropriate use of heuristics and use of biases do not necessarily result in errors or flawed judgment, but such use can result in these events.

Although best known as the work of Kahneman and Tversky, the cognitive heuristic and bias paradigm has also been studied by other researchers, including the ABC Research Group headed by Gerd Gigerenzer, who takes a different approach to heuristic-based reasoning. According to Gigerenzer, other researchers have promoted only one side of heuristics, i.e. that heuristics are bad and result in biased judgment [22]. Gigerenzer focuses on the benefits of heuristics and promotes the advantages associated with heuristic use. The approach to heuristic-based reasoning, according to Gigerenzer, is the ‘Fast and Frugal’ strategy, which enables decision makers to make good decisions with limited information. The two attributes of this strategy are fast and frugal, where (1) fast involves utilizing a minimum amount of time, knowledge and computation; and (2) frugal involves searching a subset of the available information rather than the entire database. Gigerenzer proposes that both of these attributes are exploited within one’s environmental structure to yield adaptive decisions [22]. Fast and frugal heuristics limit the decision makers’ need to search for information using easily computable stopping rules, and allow them to make choices with easily computable decision rules [23]. This type of reasoning can be used to solve problems of sequential search through options, or to select between simultaneous options that require searching for cues, features or consequences within each option. Gigerenzer and his colleagues consider the ‘Fast and Frugal’ heuristic paradigm a descriptive decision theory in that it captures how people make decisions in the real world under constraints of limited time, knowledge and computational power [22]. Gigerenzer’s approach does not go unchallenged.
A criticism of the fast and frugal strategy is that its simplicity might result in highly inaccurate decisions, compared to complex statistical classification methods that process and combine all available predictors [24].
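Gigerenzer’s best-known fast-and-frugal heuristic, ‘take-the-best’, has stopping and decision rules simple enough to sketch in code. The cues and options below are illustrative assumptions, borrowed from the classic “which city is larger?” demonstration domain rather than from this chapter:

```python
# Sketch of the "take-the-best" fast-and-frugal heuristic: cues are
# checked in order of validity, and the FIRST cue that discriminates
# between the two options decides -- no further search, no weighting
# of the remaining cues.

def take_the_best(option_a, option_b, cues_by_validity):
    """Return the chosen option, or None if no cue discriminates.
    option_*: dict mapping cue name -> 0/1;
    cues_by_validity: cue names ordered from most to least valid."""
    for cue in cues_by_validity:                 # frugal: early stopping
        a, b = option_a.get(cue, 0), option_b.get(cue, 0)
        if a != b:                               # stopping rule
            return option_a if a > b else option_b  # decision rule
    return None                                  # no cue discriminates

# Hypothetical cues for "which city is larger?"
city_a = {"name": "A", "has_airport": 1, "is_capital": 0}
city_b = {"name": "B", "has_airport": 1, "is_capital": 1}
choice = take_the_best(city_a, city_b, ["has_airport", "is_capital"])
print(choice["name"])  # B -- decided by the second cue checked
```

The criticism noted above is that such single-cue decisions can be badly wrong compared with models that weigh all predictors, though Gigerenzer’s group reports they often perform surprisingly well.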

Given that healthcare is complex with different settings sometimes requiring complicated judgments to be made in an expedient manner (the same conditions in which people commonly use heuristics), a better understanding of the role of heuristics and biases within medicine will enable us to develop and integrate resilient health information technology within these settings.

The Diagnostic Process and the Use and Impact of Heuristics and Biases in Medicine

A number of empirical studies have shown physicians’ use of heuristics and biases while gathering and interpreting information during the diagnostic process [4, 19, 25–28]. The diagnostic process includes assessing clinical data in order to generate a hypothesis of the patient’s diagnosis (differential diagnosis), followed by reviewing additional data and/or performing a course of action (such as carrying out a procedure or running a medical test) in order to narrow the differential to a more specific list of diseases (rule in or rule out specific diseases). Once a diagnosis has been established, action is taken to treat the patient.

During hypothesis generation, when a diagnosis or a differential diagnosis is generated, physicians are susceptible to biases based on Representativeness and Availability. Representativeness is used to determine how closely a patient’s findings resemble the prototypical manifestations of diseases [29]. Use of such pattern-recognition methods can lead to errors when the physician does not consider atypical presentations [29]. Availability occurs when a diagnosis is triggered by recent cases similar to the current case. If a diagnosis is made based on recently assessed cases, but there are attributes in the current case that do not correspond with the disease, a diagnostic error could occur. A misdiagnosis can also occur if the physician assumes this patient cannot possibly have the same diagnosis as the last three patients they have seen (Gambler’s Fallacy) [29]. A number of cognitive biases such as Confirmation Bias, Search Satisficing, Premature Closure and Overconfidence Bias can prompt clinicians to make errors when pruning, selecting and/or validating a diagnosis [29]. Search Satisficing, or calling off a search once something is found, may occur when a physician arrives at an initial diagnostic hypothesis based on a review of only a portion of the available clinical data, and does not review additional clinical data once their initial diagnosis has been specified. Premature Closure is when a physician accepts a diagnosis before it has been fully verified. Confirmation Bias is the tendency to look for confirming evidence to support a diagnosis rather than disconfirming evidence to refute it, even when the latter is persuasive and definitive [29].
When a physician does not review additional data or order additional tests because they are confident in their diagnosis, they may be committing the Overconfidence bias, which is a “tendency to act on incomplete information, intuitions or hunches; or when too much faith is placed in an opinion instead of carefully gathered evidence” [29].

When selecting a course of action to treat the patient, Omission Bias and Outcome Bias can adversely influence treatment decisions if the physician focuses too heavily on what could happen, rather than on what is most likely to happen once a treatment or therapy is initiated [30]. Physicians can underutilize preventive interventions in order to avoid having a direct role in bad outcomes [30–32]. Death by natural causes can be viewed as better than death by prescription [3]. Outcome Bias is when a physician places too much emphasis on patient outcomes, and does not consider the rationale and evidence underlying medical decisions [3, 33]. Another heuristic physicians use in the therapeutic process is Extrapolation, in which study outcomes are generalized to populations not included in the clinical trials and/or research studies, and the extrapolation is done inconsistently. For example, the outcome of a study testing moderate antihypertensive treatment in men was extrapolated to women (who were not study participants) [22, 34].

Based on empirical studies, Elstein and Chapman describe decision biases they believe are used within medicine, including biases occurring when judging the likelihood of events such as potential diagnoses and treatment outcomes, and biases occurring when determining preferences and evaluations of outcome utility when choosing a treatment or patient management plan [35]. Heuristics and biases that can occur when judging the likelihood of events include Support Theory, the Unpacking Principle, Outcome Bias and Confirmation Bias. Support Theory is a descriptive theory that posits an unpacking principle, which states that providing a more detailed description of an event increases its judged probability [36]. For example, when given a clinical scenario to diagnose, one group of subjects was given three options to choose from – the patient has gastroenteritis, ectopic pregnancy, or ‘none of the above’ – whereas another group of subjects was given five diagnoses – gastroenteritis, ectopic pregnancy, appendicitis, pyelonephritis, pelvic inflammatory disease – plus ‘none of the above’ [37]. For each group, the probabilities across all options should total 100 %. The percentage assigned to the ‘none of the above’ option in the short-list group should equal the total for the appendicitis, pyelonephritis, pelvic inflammatory disease, and ‘none of the above’ options in the long-list group, since the ‘none of the above’ option in the short-list condition includes the other diseases specified in the long-list condition. The study showed that the probability assigned to the ‘none of the above’ option in the short-list group was 50 %, whereas the sum of the probabilities assigned to the appendicitis, pyelonephritis, pelvic inflammatory disease, and ‘none of the above’ options in the long-list group was 69 %.
Unpacking the ‘none of the above’ option by specifying particular diseases (appendicitis, pyelonephritis, pelvic inflammatory disease) increased the total judged probability assigned to those diseases [37]. Inflating the probability of a diagnosis can result in misdiagnosis and incorrect and/or delayed treatment. Another bias that occurs when judging the likelihood of events is Outcome Bias, in which decisions are evaluated more favorably if they result in a good outcome rather than a poor outcome. An impact of this bias is that a clinician may not attempt a treatment for fear of it producing an unfavorable outcome, even when there is no evidence that the poor outcome will occur in the patient being treated. Confirmation Bias is another bias in this category: the decision maker searches for evidence to support an initial hypothesis and ignores evidence that refutes it. In clinical practice, this bias can lead to ordering unnecessary tests that do not contribute to revising the initial hypothesis; having additional data that does not refute a hypothesis does not necessarily increase the accuracy of that diagnosis.
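The arithmetic behind the unpacking result above can be made explicit. Only the 50 % and 69 % totals come from the study [37]; the individual component values below are invented so that they sum to 69 %:

```python
# Numerical illustration of the unpacking effect: the packed
# "none of the above" option draws a lower judged probability than
# the sum of its unpacked components, violating additivity.

packed_none_of_above = 0.50      # short-list group (from the study)

unpacked = {                     # long-list group; component values are
    "appendicitis": 0.20,        # illustrative -- only the 0.69 total
    "pyelonephritis": 0.15,      # matches the reported result
    "pelvic inflammatory disease": 0.20,
    "none of the above": 0.14,
}
unpacked_total = sum(unpacked.values())   # 0.69

# Under coherent (additive) probability judgments these would be equal.
print(round(unpacked_total - packed_none_of_above, 2))  # 0.19 "excess"
```

The 19-point gap is the judged probability created purely by describing the same event in more detail.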

The second category of decision biases Elstein and Chapman believe occurs in medicine involves determining preferences and evaluations of outcome utility, including Framing Effects, the Attraction Effect, Sunk Cost Bias and Omission Bias. The Framing Effect occurs when preferences between equivalent options change depending on how the same situation is presented. In a classic study conducted by Kahneman and Tversky, two groups of health officials were presented with the same scenario of an outbreak of an Asian disease that was expected to kill 600 people [38]. One group of health officials was presented with a plan framed in terms of lives saved; the other group was presented with the same plan framed in terms of lives lost. The lives-saved framing indicated that the plan to combat the outbreak would save 200 lives for sure, versus a one-third probability that all 600 people would be saved. The lives-lost framing indicated that 400 people would die for sure, versus a two-thirds probability that all 600 people would die. The health officials preferred the plan when it was framed in terms of lives saved [38]. If present, the framing effect could have implications in clinical practice in that a treatment with a better outcome may not be selected simply because it was presented in a manner that implied the outcome would be more detrimental. Another effect that impacts a decision-maker within clinical practice is the Attraction Effect, which occurs when decision alternatives are added. The addition of choice options, much like the framing of a situation, should not have an effect on the choice made; however, studies have shown that factors that should have no effect on the decision do have an effect. Redelmeier and Shafir conducted a study in which they presented two groups of family physicians with a case of a patient who had osteoarthritis of the hip and a set of management plans to treat the condition [39].
One group of physicians was presented with two plans (refer to an orthopedist and do not start any new medication; refer to an orthopedist and start ibuprofen); the other group was presented with three plans (refer to an orthopedist and do not start any new medication; refer to an orthopedist and start ibuprofen; refer to an orthopedist and start piroxicam). In the group presented with two options, 53 % of the physicians selected option one (refer to an orthopedist and do not start any new medication). In the group presented with three options, 72 % chose the first option. The addition of the third alternative increased the preference for alternative one; this is called the Attraction Effect [39]. The third choice (commonly referred to as a decoy) is seldom chosen, but it does influence the choice between the other two, “attracting market share to the option that is superior in every way to the decoy” [35]. The Sunk Cost Bias is “when a decision maker continues to invest resources in a previously selected action or plan even after it is perceived to be suboptimal” [35]. There have been few empirical studies investigating this bias in medical decision-making. One study investigated the bias during the patient management process by asking residents in Internal Medicine and Family Practice to review four scenarios (one medical scenario and three non-medical scenarios) and decide whether the current management strategy should be continued or discontinued. The residents were more likely to stay with the original plan if a high level of resources had already been invested; however, this effect was most evident in the non-medical scenarios. This study provides some evidence that physicians avoid the sunk cost fallacy in their own area of expertise, and that choosing the most effective treatment overrode the sunk cost fallacy in the medical domain [35, 40, 41]. Omission Bias is another bias in this category.
This bias occurs when the decision maker feels an omission, or doing nothing, is a better alternative than an action that leads to a harmful outcome. In medicine, this bias commonly occurs when a physician chooses not to treat a patient (they opt to do nothing) in order to avoid feeling guilty about committing an act that may bring harm to the patient. This finding has been confirmed by empirical studies showing that “decision makers saw omissions that led to harmful outcomes as less immoral or less bad than acts that led to the same outcomes” [7, 35, 42].
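Returning to the Kahneman and Tversky framing scenario described earlier in this section: the two framings are numerically identical in expectation, which is why any preference reversal between them is attributed to presentation rather than substance. A quick check:

```python
# Verify that the "lives saved" and "lives lost" framings of the
# Asian disease problem describe the same expected outcomes.

population = 600

# "Lives saved" frame: 200 saved for sure, vs 1/3 chance all 600 saved.
sure_saved = 200
gamble_saved = 600 * 1 / 3        # expected lives saved in the gamble

# "Lives lost" frame: 400 die for sure, vs 2/3 chance all 600 die.
sure_lost = 400
gamble_lost = 600 * 2 / 3         # expected lives lost in the gamble

assert sure_saved == gamble_saved == population - sure_lost
assert sure_lost == gamble_lost
print(sure_saved, gamble_saved)   # 200 200.0 -- identical in expectation
```

Every option, in either frame, has an expected value of 200 survivors and 400 deaths; only the wording changes.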

Most empirical studies have investigated heuristic and bias use during the diagnostic process, but only a small proportion have studied their use throughout the therapeutic process. Other researchers have investigated heuristic and bias use by looking at specific cognitive processes associated with the diagnostic process, i.e. judging the likelihood of events and determining outcome utilities. We know that heuristics and biases play a role in hypothesis generation, pruning a differential diagnosis, validating a specific diagnosis, and establishing a therapeutic course of action (i.e., patient management plan) [22, 29–34]. Our work extends prior research in that we investigate the use of heuristics and biases in both the diagnostic and therapeutic processes in a very specific medical setting – hospital critical care units – where the fast-paced environment should induce clinicians to rely on cognitive shortcuts and rules of thumb. The manner in which heuristics and biases are utilized within hospital emergency departments and intensive care units has not been formally studied. Our work is novel in that we assessed data throughout the entire patient care process and used naturalistic (real-world) clinical reasoning data. An understanding of the role and impact of heuristics and biases in these environments is required in order to design healthcare information technology systems that enable clinicians to attend to pertinent information (and not become bogged down with irrelevant information) and expedite the decision-making process without compromising the quality of healthcare. This provides the theoretical foundation for our methods.

Methods

Data Collection

We based our critical care heuristic and bias framework on three sources of data: (a) a review of the heuristic and bias literature; (b) data from a pilot study conducted to ascertain physicians’ views on the heuristics and biases they utilize; and (c) data from a proof-of-concept study performed to obtain naturalistic clinical reasoning data from critical care settings. We carried out these studies sequentially so as to progressively explore the use of heuristics and biases in a broad domain and then narrow our investigation to a very specific domain. We started by exploring the published literature on how heuristics and biases are used during decision-making in psychology – the domain in which the heuristic and bias paradigm originated. We then searched the literature specifically for heuristic and bias use in medicine and medical decision-making. Since a large proportion of studies in the literature were empirical studies conducted in a laboratory, we wanted to obtain data on heuristic and bias use within real-world naturalistic settings. We asked critical care clinicians to provide their perception of the prevalence of heuristics and biases in the ER and ICU (pilot study). We then immersed ourselves in the ICU to observe team interaction and decision-making sessions (proof-of-concept study). We chose this environment because we felt this highly dynamic setting would induce clinicians to use mental shortcuts in order to keep pace with quickly changing conditions. Conducting these different studies allowed us to build our framework on a solid foundation of rich data from a variety of sources. Procedures of data collection for each source are described below.

Literature Review

We performed a heuristic and bias literature review from multiple domains including psychology and medicine. Our primary focus was on empirical studies assessing heuristic and bias use during the diagnostic and therapeutic processes. We also reviewed literature that documents the opinion of experts on heuristic and bias use in medicine.

Clinicians’ View of Heuristic Use (Pilot Study)

We conducted a pilot study ascertaining critical care attending physicians’ perception of how frequently they use various heuristics and biases during clinical reasoning. We developed a semi-structured questionnaire that contained a definition and clinical example of 37 heuristics and biases [9]. The definitions were drawn from the literature; clinical examples were created with the assistance of physicians. Practicing critical care physicians were contacted via email and asked to rate the prevalence of heuristic and bias use in clinical practice (on a scale of 1–5, where 1 was the least prevalent and 5 was the most prevalent). Attending physicians from various regions of the United States practicing in ER and ICU settings participated in the study.

Naturalistic Clinical Reasoning (Proof-of-Concept Study)

The data was collected during morning rounds at a 16-bed adult Medical Intensive Care Unit (ICU) at a large teaching hospital in Houston. A clinician team from the Medical ICU was included in the study. The team consisted of an attending physician, a clinical fellow, residents, trainee-interns, medical students, nurses and ancillary staff. The clinicians conducted daily patient assessment and management-planning sessions in the MICU. During these sessions, residents presented information on real patients at the bedside, and clinical teams discussed each patient’s status, diagnosis, and management plan. Each morning round lasted approximately 5 h, and researchers spent 3 h per day for 3 days shadowing and observing clinician teams. Clinical team interactions were audio-recorded and transcribed verbatim with all identifiers removed. We used data collected over two morning round sessions. Several months later we conducted a second observation session in which we shadowed clinicians 3 h a day for three non-consecutive days. The clinical team observed during this session was different from the team observed in the first session. A total of 24 h of observations was available for analysis. Table 10.1 provides details of the sessions. Data from the first observation, along with data from the literature, was used to develop our framework. Data from the second observation was used to validate and enhance the framework.

Table 10.1 Clinical observation sessions

Data Analysis

Clinicians’ View of Heuristic Use (Pilot Study)

We analyzed the data from the pilot study by calculating the mean perceived heuristic and bias prevalence provided by all study participants. The data was also analyzed by comparing participant groups, i.e. comparisons were drawn between ER and ICU physicians.
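The analysis above reduces to group means over the questionnaire ratings and can be sketched in a few lines. The heuristic names and rating values below are hypothetical, invented on the 1–5 scale used in the questionnaire, and are not the study data:

```python
# Sketch of the pilot-study analysis: mean perceived prevalence per
# heuristic across all raters, then broken out by setting (ER vs ICU).

from statistics import mean

ratings = [  # (setting, heuristic, rating on 1-5 scale) -- hypothetical
    ("ER",  "Availability", 4), ("ICU", "Availability", 5),
    ("ER",  "Anchoring",    3), ("ICU", "Anchoring",    4),
]

def mean_by(key, items):
    """Group rows by key(row) and average the rating (third field)."""
    groups = {}
    for row in items:
        groups.setdefault(key(row), []).append(row[2])
    return {k: mean(v) for k, v in groups.items()}

print(mean_by(lambda r: r[1], ratings))  # per heuristic, all raters
print(mean_by(lambda r: r[0], ratings))  # ER vs ICU comparison
```

The same grouping helper serves both analyses, averaging over all participants or within each setting.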

Naturalistic Clinical Reasoning (Proof-of-Concept Study)

We then analyzed the data from the proof-of-concept study by performing an in-depth coding process employing the Grounded Theory Method [43]. Using 10 % of the transcripts, we analyzed the data inductively, reading and rereading transcripts in order to extract relevant text. Themes present in the transcripts were identified and text was grouped according to the emerging themes. Themes included clinicians making decisions on patients’ diagnosis and treatment plan, information used to arrive at a decision, and clinicians performing actions associated with standard clinical practice. Once themes were identified, we explored the data by breaking the transcripts into sentences, and sentences into small parts, each part representing a single thought, decision or action. Once single-purpose phrases were identified, we assigned a code to each phrase. Once this process was established, we coded and analyzed the remaining transcripts. We applied codes consistently so that it would be possible to group codes and determine where in the patient care process the decision or action occurred (categories). We located axes between the codes and categories and developed the theoretical framework of heuristic and bias use within critical care settings. The themes and codes produced are further described below. Table 10.2 is an example of a coded transcript.
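The final grouping step, locating where in the patient care process each coded decision or action occurred, amounts to a simple tally over the coded phrases. The codes and phases below are hypothetical examples, not actual study data:

```python
# Sketch of grouping coded transcript phrases by patient-care phase,
# the step that lets coded heuristic/bias events be located within
# the care process (categories).

from collections import Counter

coded_phrases = [  # (code, patient_care_phase) -- hypothetical data
    ("representativeness", "hypothesis_generation"),
    ("anchoring",          "hypothesis_generation"),
    ("search_satisficing", "hypothesis_testing"),
    ("omission_bias",      "treatment_planning"),
    ("availability",       "hypothesis_generation"),
]

by_phase = Counter(phase for _, phase in coded_phrases)
print(by_phase.most_common())
# In this toy sample, hypothesis_generation carries the most codes.
```

Tallies like this make it easy to see which phases of care concentrate heuristic and bias use before the axial analysis.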

Table 10.2 Example of a coded transcript

Decisions Made – A decision was defined as reaching a conclusion after consideration of available clinical data. Where possible, a decision was identified as relating to arriving at a diagnosis or to the patient’s treatment plan. A decision does not necessarily result in an action being taken; there are times when a conclusion has been reached but no action is performed. The decision not to medicate the patient (Table 10.2) is an example of a decision made while caring for the patient.

Information Leading to Decision – This is the information that led the clinical team to make a decision or arrive at a conclusion. The information leading to the decision to not medicate the patient (Table 10.2) is that the patient came out of the seizures on her own.

Actions Associated with Standard Clinical Practice – We identified actions associated with standard clinical practice, defined as thought processes, practices or procedures commonly used when caring for patients. For example, a common technique used when diagnosing a problem is to rule in and rule out diseases (diagnoses) by conducting a test or performing a procedure. An example of a standard clinical practice is to perform a head CT for a patient having seizures. An example of a therapeutic standard clinical practice is to give blood when the patient’s hemoglobin is below a specific level.

Use of Heuristic or Bias – Once the above items were coded, we reviewed the coded transcripts to identify heuristics and biases used while caring for critically ill patients. To accomplish this, we mapped events that took place during the critical care process to the definitions of heuristics and biases as documented in the literature. An example of the Representativeness heuristic is shown in Table 10.2, where the attending physician and resident discuss what diseases are associated with (representative of) a seizure. An example (not shown in the tables) of actions corresponding to the Anchoring heuristic (locking on to an initial diagnosis early in the diagnostic process) and Confirmation Bias (seeking information that supports an initial diagnosis while overlooking critical data that refutes it) is a clinician who diagnoses a patient with chest pain as having a Myocardial Infarction while ignoring evidence that the patient is not within the population that commonly suffers heart attacks (the patient is 25 years of age) and that the patient has a Type A personality and a very stressful job (findings that may instead correspond to stress or a panic attack). Once heuristics and biases were identified, we determined where in the critical care process each was used. We identified the heuristics and biases used during needs assessment, hypothesis generation, hypothesis testing, establishing or revising a treatment (management) plan, and monitoring the patient.
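The coding scheme described above can be represented as a simple data structure. The sketch below is purely illustrative; the class, field and code names are our own invention and do not reflect the study's actual tooling:

```python
from dataclasses import dataclass

@dataclass
class CodedPhrase:
    """A single-purpose phrase extracted from a transcript."""
    text: str       # the phrase itself
    code: str       # assigned code, e.g. "information_leading_to_decision"
    theme: str      # emerging theme, e.g. "decisions made"
    category: str   # where in the care process the decision/action occurred

def group_by_category(phrases):
    """Group coded phrases by patient-care category, mirroring how codes
    were grouped to locate decisions within the care process."""
    groups = {}
    for p in phrases:
        groups.setdefault(p.category, []).append(p)
    return groups

# Hypothetical phrases modeled loosely on the Table 10.2 example
phrases = [
    CodedPhrase("patient came out of the seizures on her own",
                "information_leading_to_decision", "information used",
                "needs assessment"),
    CodedPhrase("decision not to medicate the patient",
                "decision_treatment", "decisions made",
                "needs assessment"),
]
grouped = group_by_category(phrases)
```

Grouping by category in this way is what makes it possible to ask where in the care process each heuristic or bias occurred.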

Methods for Framework Development and Validation

Our overall goal was to identify heuristic and bias use within critical care settings. For each step of the critical care process, we identified heuristics and biases commonly used; then through consultation with experienced critical care clinicians, we identified associated reasoning errors and patient outcomes. To frame our analysis, we started by identifying the heuristics and biases used in medicine in general, then assessed the real-world critical care environment to ascertain heuristics and biases used specifically in critical care. Once we developed the framework, we validated it with data from real-world decision-making sessions within an intensive care unit.

Heuristics & Biases Used in Medicine

The first step of developing the framework was to compose a list of the heuristics and biases used when physicians diagnose a patient and determine their treatment plan, regardless of where the diagnosis and treatment occurs. Based on our literature review and pilot study, we found that physicians commonly use Representativeness, Anchoring and Adjustment, Availability, Confirmation Bias, Premature Closure, Search Satisficing, Omission and Outcome Bias, and Over Confidence when diagnosing and treating patients.

Heuristics & Biases Used in Critical Care Settings

Steps Within the Critical Care Process

We consulted with two board-certified attending physicians who specialize in critical care to determine the steps that commonly occur in critical care settings. We reviewed the coded transcripts from the proof-of-concept study with these consultants to determine whether the processes involved in caring for critically ill patients were evident in the data. During this process we looked for key actions that were consistent across the care of multiple patients. We grouped the events into logical steps, identifying a high-level category and the low-level steps that comprise each category.

Heuristic and Bias Use Within Critical Care Process

Once we determined the steps that commonly occur within the critical care environment, we analyzed the data collected during observations (coded transcripts) of the proof-of-concept study to determine what heuristics and biases are used within each step of the critical care process.

Reasoning Errors and Patient Outcomes

Based on our literature review and data from our pilot and proof-of-concept studies, we had extensive discussions with the board-certified critical care attending physicians to compile a list of potential reasoning errors and patient outcomes associated with inappropriate use of heuristics within each step of the critical care process.

Framework Validation

We validated the framework using real-world data collected during a second observation session of the proof-of-concept study (reference section “Data collection” for a description of the observation sessions). We coded and analyzed the data from the second observation in the same manner as the data from the first observation, then determined whether the framework adequately reflected the heuristics/biases physicians use in critical care. The worksheet shown in Table 10.3 was used to document our findings from the framework validation. We modified the framework as necessary, adding heuristics and biases that were apparent from the second observation session.

Table 10.3 Framework validation worksheet

Results

Literature Review

The results of our literature review are detailed in the Background of this chapter. We have provided an overview of various theories of decision-making (reference section “Theories of Decision-Making”), described the theoretical foundation of heuristics and biases (reference section “Heuristic and Bias Theoretical Foundation”), detailed the use and impact of heuristic and biases throughout the diagnostic process (reference section “The Diagnostic Process and the Use and Impact of Heuristics and Biases in Medicine”) and explained how heuristics and biases are used in critical care settings (reference section “Naturalistic clinical reasoning (proof-of-concept study)”).

Clinicians’ View of Heuristic Use (Pilot Study)

The top five perceived heuristics and biases are detailed in Table 10.4 [9]. Physicians practicing in the ICU perceive Confirmation Bias to be most prevalent, followed by Availability, the Planning Fallacy, In-group Bias and Deformation Professionelle. Emergency room attending physicians perceive the Clustering Illusion, Deformation Professionelle, Illusory Correlation, Disconfirmation Bias and Availability to be most prevalent. Across both groups, attending physicians perceive Availability, Deformation Professionelle, In-group Bias, the Planning Fallacy, and Anchoring & Adjustment to be most prevalent in critical care settings.

Table 10.4 Most prevalent heuristics and biases used in critical care

The heuristics and biases ER and ICU attending physicians perceive to be least prevalent are listed in Table 10.5. ICU attending physicians perceive Value-Induced Bias, the Texas Sharpshooter Fallacy, the Clustering Illusion, Illusory Correlation and Gambler’s Fallacy to be least prevalent. ER attending physicians perceive the Overconfidence Effect, Confirmation Bias, Hindsight Bias, Retrospective Bias and Representativeness to be least prevalent in their setting. Across both groups, attending physicians perceive Neglect of Prior Base Rates, Selection Bias, the Texas Sharpshooter Fallacy, Value-Induced Bias and Illusory Correlation to be least prevalent.

Table 10.5 Least prevalent heuristics and biases used in critical care

Naturalistic Clinical Reasoning (Proof-of-Concept Study)

Identifying heuristic and bias use in the real world by examining transcripts of team discussion and decision-making sessions was not as straightforward as assessing the data from the pilot study. From the transcripts of the 15 h of team clinical reasoning sessions used to develop our framework (Session 1), we found evidence of Anchoring & Adjustment, Confirmation Bias, Availability, Search Satisficing, Deformation Professionelle and In-group Bias (reference Table 10.6). From the transcripts of the 9 h of clinical reasoning sessions used to validate the framework (Session 2), we found evidence of Deformation Professionelle, In-group Bias, Representativeness and Confirmation Bias (reference Table 10.6).

Table 10.6 Heuristics and biases used in team clinical reasoning sessions

Critical Care Heuristics and Bias Framework

Our critical care heuristic and bias framework is illustrated in Fig. 10.1. This diagram depicts the steps associated with caring for a critically ill patient (top row). The middle of the diagram shows the heuristics and biases associated with each of the critical patient care steps based on data from all three data sources utilized in this research (literature review, pilot study and proof-of-concept study). The bottom of the diagram reflects potential reasoning errors and patient outcomes associated with each of the patient care categories; this data is based on the opinion of our expert critical care physicians. The heuristics, biases, reasoning errors and patient outcomes specified in Fig. 10.1 are not a comprehensive list of items that can occur during the critical care process; they are examples of items identified in this study. The definition of each heuristic and bias listed in Fig. 10.1 can be found in Table 10.7.

Fig. 10.1
figure 1

Critical care heuristic and bias framework

Table 10.7 Definition of heuristics and biases

Steps in Critical Patient Care

Based on the data from our proof-of-concept study and consultation with expert critical care physicians, we identified three main steps of the critical care process: Immediate Need Assessment, Address Problem, and Patient Management. The steps presented are a snapshot of part of the critical care process that has been simplified for the purpose of this chapter. It should be noted that even though these steps are presented linearly, they rarely happen linearly, as the critical care setting is an ever-changing, dynamic environment where actions depend on critical changes in a patient’s condition. Table 10.8 contains a description of the steps within each of these categories that take place when a patient is in a critical care setting such as the ICU. Critical care physicians often deal with a patient in a life-and-death situation. In such situations, stabilizing the patient as quickly as possible is necessary. Therefore, the first steps of the critical care process are to identify the immediate need of the patient and determine what is required to stabilize the patient. After the patient is stabilized, the clinical team identifies the problem associated with the patient’s chief complaint. This step consists of compiling a list of possible hypotheses or diagnoses (the differential diagnosis) and determining whether a hypothesis is accurate (testing the hypothesis by running tests, performing procedures, etc.). When the problem (diagnosis) has been identified, an action is performed to alleviate the problem (treat the patient). After the patient’s problem has been addressed, a management plan is developed to bring the patient to an optimal state. The patient is then monitored to ensure the plan is sufficient and the patient improves. If the patient is not improving, the management plan is adjusted. Steps 6 and 7 are repeated until the patient’s health has reached the state where they can be moved out of the critical care environment. These steps are in accordance with the data collected for this study and are not necessarily the steps carried out in every hospital ICU.

Table 10.8 Critically ill patient care process

Heuristics and Biases Used Within Critical Care

Based on the data from our literature review, heuristics and biases critical care physicians indicate they use (pilot study), and data from our real-world observations (proof-of-concept study), we determined that within each step of the critical patient care process several heuristics and biases are prevalent. First we present the descriptive statistics of heuristics and biases critical care physicians use (based on the opinion of critical care providers and observation of team decision making within critical care settings), then we describe where in the care process these heuristics and biases are used.

During the immediate need assessment phase, a number of cognitive heuristics and biases are used. A clinician may base their diagnosis or management plan on similar patients they have recently seen (Availability), or on diseases or treatments common for a set of symptoms (Representativeness). A clinician who locks on to a diagnosis early in the assessment process (Anchoring), then seeks evidence to support that diagnosis while ignoring data that refutes it, is committing Confirmation Bias. A clinician who makes a diagnosis without considering incidence and prevalence rates within the patient’s population is committing Base Rate Neglect. If a diagnosis or treatment is not selected because several prior patients have had the same outcome, Gambler’s Fallacy is being used. If a clinician arrives at a diagnosis quickly, without performing diagnostic tests to confirm their decision, they may be Over Confident. If the clinician resorts to inaction out of fear of being held responsible for harming the patient, they are committing Omission Bias. Cognitive heuristics and biases are also prevalent when teams of clinicians collaborate on a patient’s case. In-group Bias can occur when preferential treatment is given to those within one’s own group; for example, a physician may consider the views of another physician as more accurate than the view of a nurse who has considerable knowledge of the patient. Projection Bias, the tendency to assume others in your group share the same thoughts, beliefs, values and opinions as you, is another bias that can occur within a team decision-making environment.

Many of the same cognitive heuristics and biases apply during the addressing the problem stage of critical patient care. When assessing the patient’s symptoms, a clinician will commonly begin to generate a list of possible diagnoses (hypotheses) quickly. During this hypothesis generation step, physicians may compare the patient’s signs and symptoms to a mental disease model (Representativeness), or compare the patient to a recent patient they cared for (Availability). Neglecting to take into account disease rates for a specific population (Base Rate Neglect), preferring the opinion of those in one’s alliance (In-group Bias) and/or assuming others share one’s views (Projection Bias) are potential flaws when hypothesizing about the patient’s diagnosis. Once a hypothesis has been generated, physicians look for clinical information to either confirm or refute the hypothesis (test the hypothesis). As they seek information to test the hypothesis, several heuristics and biases such as Over Confidence, Premature Closure, Confirmation Bias and Gambler’s Fallacy may be used.

A different set of heuristics and biases is commonly used during the construction and monitoring of the patient management plan. The heuristics used in this phase include Hyperbolic Discount, the tendency for people to prefer immediate payoffs over later payoffs; Omission Bias, selecting inaction over action to avoid being held accountable for bringing the patient harm; Illusory Correlation, inaccurately perceiving a relationship (e.g., assuming a particular event caused the patient’s condition when that event is unconnected to the condition); and Selective Perception, when a clinician’s expectations affect their perception. Other heuristics used in the patient management phase are Representativeness, Availability and the Status Quo Bias, which are based on the premise that what works for others will also work in the present situation. Common team-based heuristics include In-group Bias, Projection Bias, Deformation Professionelle (looking at things according to the conventions of one’s own profession, forgetting any broader point of view) and the Illusion of Control (the tendency to believe one can control or influence outcomes that one cannot). Once the management plan has been established, the patient is monitored to ensure the plan is resolving the issue. Heuristics commonly used at this stage are the Self-serving Bias, the tendency to claim more responsibility for successes than failures, and Outcome Bias, the tendency to judge a decision by its eventual outcome instead of by the quality of the decision at the time it was made.
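The phase-specific conditions described above hint at how heuristic and bias use might eventually be rendered machine-detectable, as proposed in this chapter's introduction. The following is a hypothetical sketch only, not part of the framework itself; the case fields, rule conditions and flag labels are our own illustrative assumptions:

```python
def flag_possible_biases(case):
    """Flag potential heuristic/bias patterns from simple case features.

    `case` is a dict of illustrative boolean features; every key is
    hypothetical and would need to be derived from real clinical data.
    """
    flags = []
    # Anchoring + Confirmation Bias: early lock-in plus ignored refuting data
    if case.get("diagnosis_fixed_early") and case.get("refuting_data_ignored"):
        flags.append("possible Anchoring / Confirmation Bias")
    # Base Rate Neglect: diagnosis made without considering population rates
    if case.get("diagnosis_made") and not case.get("base_rates_considered"):
        flags.append("possible Base Rate Neglect")
    # Over Confidence: diagnosis reached without confirmatory testing
    if case.get("diagnosis_made") and not case.get("confirmatory_tests_run"):
        flags.append("possible Over Confidence")
    return flags

# Example: an early, unrevised diagnosis with no base-rate check
flags = flag_possible_biases({
    "diagnosis_made": True,
    "diagnosis_fixed_early": True,
    "refuting_data_ignored": True,
    "base_rates_considered": False,
    "confirmatory_tests_run": True,
})
```

A real point-of-care module would need far richer inputs than boolean flags, but the rule-per-bias structure illustrates how the verbal definitions above could seed automated detection.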

Reasoning Errors and Patient Outcomes

Based on our findings from the heuristic and bias literature review, data from our pilot and proof-of-concept studies, and the opinion of critical care specialists practicing in our study site, we identified potential reasoning errors that can lead to flawed judgment, which, in turn, can lead to negative patient outcomes.

In the immediate need assessment phase, common potential reasoning errors resulting from inappropriate use of heuristics and/or biases include neglecting base rate information for the patient population; considering data that is not critical; ignoring data that is critical; inaccurately mapping the current patient’s situation to disease models and/or prior patients’ situations; and not considering all possible diagnoses, such as failing to review additional clinical data or order additional tests once a diagnosis has been reached. As a result of these reasoning errors, patients could receive incorrect treatment and/or a delay of the proper treatment, both of which may lead to prolonged or undue suffering and/or death.

In the address the problem phase of the critical patient care process, potential reasoning errors include not considering data critical to making the correct diagnosis, considering data that is not associated with the correct diagnosis, and not fully investigating all the diagnostic possibilities. These reasoning errors may result in completely missing a diagnosis, incorrectly diagnosing a patient, or a delay in diagnosing a patient. These could lead to not treating a patient, incorrectly treating a patient and/or a delay in treatment.

In the patient management phase, common reasoning errors include not noticing a change in the patient, and having too narrow a focus, which may occur if the patient is monitored only for the problem they had when entering the critical care unit, without recognizing that a preexisting condition is at a less than optimal state or is impacting their current state. These errors could result in a missed, incorrect or delayed diagnosis and/or treatment, all of which can cause patient harm, suffering and/or death.

Discussion

The objective of this research was to develop a framework characterizing the use and impact of cognitive heuristics and biases in complex hospital critical care environments such as emergency rooms and intensive care units. Our framework details the heuristics and biases used at each step of the critical patient care process, from the time the patient enters critical care through the transition to a non-critical state (including assessing the patient’s immediate needs and stabilizing their condition, identifying and treating problems contributing to the patient’s illness, and developing and monitoring a treatment and management plan). The framework includes heuristics and biases used by individual clinicians making independent decisions and by teams of clinicians collaborating on the optimal plan for a patient. In addition, the framework specifies potential reasoning errors and patient outcomes that may occur as a result of inappropriate heuristic use. We developed and validated the framework with real-world clinical decision-making data, a thorough review of the literature, and physicians’ views of the heuristics and biases they use in their clinical practice.

The findings of our real-world clinical observations indicate that multiple heuristics and biases are used throughout the entire critical patient care process. The majority of the heuristics and biases, reasoning errors and patient outcomes associated with ‘Assessing the Immediate Need’ of the patient are also present during the ‘Addressing the Problem’ phase. It is not surprising that similar heuristics and biases are used during these steps, since similar cognitive processes occur: clinicians are assessing the patient’s symptoms and clinical data to determine factors contributing to their illness, and ruling in and out applicable diseases. Heuristics and biases used during these steps of patient care are commonly based on specific reasoning strategies such as compiling a differential diagnosis and then narrowing it down to a specific disease (Anchoring and Adjustment); basing a diagnosis on past events such as patients the clinician has recently seen (Availability, Gambler’s Fallacy); and comparing the patient’s signs and symptoms to disease mental models and disease prevalence rates acquired throughout one’s career (Representativeness, Base Rate Neglect). Flaws in clinical reasoning during the ‘Addressing the Problem’ step (Premature Closure, Confirmation Bias) may be due to the critical nature of the patient and the urgency to determine what is causing the patient to be so ill. Once a clinician has formulated a list of hypotheses (differential diagnosis), they ‘Test the Hypotheses’ by gathering additional clinical data, running tests and/or performing procedures. Our findings indicate that biases are commonly used during this step (Search Satisficing, Confirmation Bias, Outcome Bias and Overconfidence Bias). Since hypotheses are tested after the patient has been stabilized, clinicians have the opportunity (and time) to more thoroughly assess the patient’s illness by running tests and/or procedures. It is therefore somewhat surprising that biases are so common when validating the hypotheses.

Our findings also indicate that a unique set of heuristics and biases is used when developing the patient’s treatment and management plan and when monitoring the patient once the treatment plan has been put into action. Heuristics and biases used in these steps are commonly action- and/or payoff-based. For example, Hyperbolic Discount (the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs) and Omission Bias (the tendency toward inaction, or reluctance to treat, due to fear of being held directly responsible for the outcome) are prevalent during the therapeutic stages of patient care. A unique set of reasoning errors occurs when establishing a patient treatment management plan. These potential errors can occur when the focus (or framing) of the problem is not accurate. For example, the clinician may focus on treating a specific problem (such as the patient’s chief complaint) instead of realizing that a pre-existing condition may be contributing to the problem and/or impacted by a given treatment (a drug that fixes one problem may negatively impact another). Given that critical care clinicians commonly deal with patients with co-morbidities, it is surprising that such reasoning errors are prevalent. It would be expected that critical care physicians would be more inclined to assess the interaction of a treatment on multiple problems.

Our findings confirm and extend those of prior research on clinicians’ use of heuristics and biases. The critical patient care process we identified from observations in hospital intensive care settings is similar to the steps in the diagnostic and treatment processes identified in the literature [4, 19, 25–28, 32, 35]. The heuristics and biases we identified in the ‘Immediate Need Assessment’ and ‘Address Problem’ steps are similar to the heuristics/biases identified in the hypothesis generation and the pruning, selecting and/or validating a diagnosis steps as documented in the literature (reference section “The diagnostic process and the use and impact of heuristics and biases in medicine”) [29]. The heuristics/biases we identified in the ‘Patient Management’ step are similar to those documented in the literature when clinicians select a course of action [29, 32, 35]. Our research extends prior research in that we assessed heuristic and bias use within a specialized area of the hospital that cares for patients with critical life-threatening issues. Limited empirical research exists assessing heuristic and bias use during the therapeutic phase of patient care; our research includes a detailed analysis of this phase in conjunction with the diagnostic phase. In addition, we assessed heuristics and biases used by a single clinician making a stand-alone decision, as well as by a team of clinicians engaged in team decision-making. We not only identified heuristic and bias use within medicine, we also identified potential reasoning errors and patient outcomes associated with such use. A significant contribution of our research, not found in prior research, is that our framework was developed and validated using real-world clinical decision-making data from multiple teams of clinicians. The majority of research studies assessing heuristics and biases have been laboratory-based.
Assessing heuristic and bias use within real-world environments, especially in a specialized area such as critical care, provides researchers and the healthcare community with firm insight into the benefits of heuristic use and how such use can enhance patient care, as well as how inappropriate heuristic/bias use can be detrimental to patients.

Limitations of this research include the generalizability of these findings, given that the framework was, in part, based on observations of two clinical teams practicing within the same intensive care unit at the same institution. Even though we followed each clinical team for several days, team interaction was comparable from day to day. However, there were differences in interaction between the two teams, which provides more generalizability than if we had observed only one clinical team. Another limitation of our study was that only one research scientist coded and analyzed the decision-making session transcripts; the results may have differed had multiple researchers coded and analyzed the data. We feel that basing the framework on a thorough review of the literature and data collected from multiple studies provides a solid foundation for understanding decision-making within critical care settings.

Our framework depicts the heuristics, biases, potential reasoning errors and patient outcomes associated with the patient care process in critical care settings. Given that patient care in these settings requires clinicians to make life-and-death decisions within a few seconds while assessing large quantities of information, the environment is ripe for heuristic and bias use. Heuristic use can be a powerful reasoning strategy in such an environment; however, when heuristic and bias use results in flawed reasoning, the outcome can be detrimental. As healthcare progresses, it is crucial to incorporate tools into critical care environments that enhance clinical reasoning and enable clinicians to use strategies such as heuristics in a manner that produces unassailable judgments. Technology has the potential to play a role in enhancing clinicians’ clinical reasoning, reducing adverse patient outcomes, and improving patient care.

Summary

Critical care settings such as hospital emergency departments and intensive care units are complex environments that are stressful, time sensitive and interruption-laden, where clinicians, influenced by factors such as extended work hours and sleep deprivation, make life-critical decisions. Within such dynamic environments, decision-making requires the use of cognitive heuristics, or mental shortcuts, in order to sustain the required pace. It is crucial to understand clinicians’ use of cognitive heuristics and their associated biases, and the impact of that use on patient care within critical care. The objective of this chapter is to describe a theoretical framework, with associated methods, designed to characterize the use of cognitive heuristics and biases in critical care. This framework was developed and enhanced through in-depth coding and analysis of real-world clinical decision-making data collected through an ethnographic study, a study ascertaining physicians’ perspectives on the heuristics they use in their daily practice, and a review of the literature on empirical studies assessing the use of heuristics and biases. We show that application of the framework can facilitate identification of specific actions associated with heuristics and biases that result in better decisions, as well as actions with the potential for patient harm. Identification of these actions will permit generation of procedures that can be incorporated into computer-based medical systems to detect reasoning processes leading to flawed judgment and signal clinicians to alternatives that could lead to unassailable judgments. The development of automated detection and correction systems is critical to the advancement of health information technology within healthcare, the reduction of medical errors and the enhancement of patient safety.

Implications for Biomedical Informatics

The application of our framework facilitates identification of specific actions associated with heuristic and bias use. These actions can serve as the basis for developing modules that can be incorporated into computer-based health information tools to recognize when clinicians’ reasoning strategies may lead to flawed judgment, and to provide alternative reasoning strategies that enhance clinical reasoning and the patient care process. Our goal is to develop and incorporate such auto-detection and correction tools at the point-of-care in order to reduce medical errors such as missed or incorrect diagnoses and incorrect or delayed treatment. To our knowledge, such a system does not exist at the point-of-care. Incorporating health information technology within critical care settings has the potential to greatly enhance medical decision-making and improve the care of the critically ill.

Conclusion

Caring for the critically ill requires clinicians to quickly assess and act upon a large amount of information, as time does not permit an exhaustive search process. The use of cognitive heuristics can be a valuable tool, and provide a means for clinicians to accelerate the process of assessing the immediate need of the patient, identifying the correct diagnosis, and establishing a management plan that will reduce the patient’s pain and suffering.

Our framework characterizes and identifies cognitive heuristics and biases used during this patient care process within critical care settings. It spans the entire patient care process from diagnosing the patient to establishing and monitoring the patient management plan. Our model was validated against data collected from real-world decision-making sessions within an ICU of a large academic hospital. Use of this framework will result in the identification of specific actions and events that lead to flawed judgment within critical care settings. Based on this, computer-based tools can be developed to detect specific actions that lead to flawed judgment and prompt clinicians to consider alternative reasoning strategies that will result in sound judgment, ultimately resulting in enhanced patient care, and a reduction of adverse patient outcomes.

Discussion Questions

  1. Select three of the various types of biases described in this chapter, and think of an example of each from (a) everyday experience and (b) the health care domain. How do these biases influence judgments and decisions?

  2. What are the key aspects of the dispute between Kahneman & Tversky’s and Gigerenzer’s theories on the use of heuristics and biases in decision-making?

  3. Describe how the framework in this chapter can inform the development of biomedical informatics tools to enhance clinical decision-making.

  4. Define heuristics. Rule-based expert systems are sometimes called heuristic systems. Explain.