Introduction

Pediatric cardiologists and cardiac surgeons must often make efficient, high-risk decisions in care environments that are fast paced, complex, and characterized by data that is incomplete, noisy, and laden with artifacts. While providers in some environments, such as the operating room, catheterization laboratory, or intensive care unit (ICU), routinely navigate the challenges of these “naturalistic” decision-making environments [1], no subspecialty in the field is immune to subjective experiences of excessive and unsustainable mental effort while performing demanding tasks in difficult care contexts. The mental effort required to perform tasks within these environments is referred to as cognitive load, and its foundational concepts are vital for cardiologists to understand because excessive cognitive load contributes to medical errors and suboptimal patient outcomes [2,3,4]. This paper will serve as a primer on cognitive load for frontline providers who work in high-stakes cardiology environments. We will emphasize the core concepts necessary to define and measure cognitive load; highlight the key relationships between cognitive load, care processes, and outcomes; and suggest actionable targets for reducing cognitive load based on the available literature.

Cognitive Load Theory

Cognitive load theory (CLT) is based on the concept that human cognitive architecture is composed of two different memory systems: working memory and long-term memory. Working memory is, functionally, the brain’s temporary scratchpad; for example, it is the system that keeps track of what a partner in conversation just said so that you can respond appropriately. In the cardiac intensive care unit, noting instantaneous vital signs, medication infusion rates, and ventilator settings at the bedside to inform an immediate change in therapy relies heavily on working memory. CLT holds that working memory resources are finite and that there are potentially significant consequences for decision-making and task performance when the demands on working memory exceed its available resources [5].

As such, cognitive load theorists have delineated core processes and components central to using working memory resources more efficiently. One mechanism for increasing the efficiency of working memory is to develop learning processes that promote transfer of knowledge and skills from working to long-term memory [6]. Another approach is to understand how different subtypes of cognitive load affect working memory resources and to modify those subtypes when possible. According to cognitive load theory, the three subtypes of cognitive load are intrinsic, extraneous, and germane (Table 1). Intrinsic cognitive load refers to the cognitive effort associated with the inherent difficulty or complexity of the primary task. Extraneous cognitive load refers to the cognitive effort associated with processing information unrelated to the primary task; this type of load is mainly determined by the environment in which the task is being completed and can interfere with task performance (e.g., an irrelevant conversation taking place at the same time as the primary task). Germane cognitive load refers to the cognitive resources available for learning while completing the task [5, 7].

Table 1 Cognitive load theory and subtypes

As an example, the act of performing quality chest compressions imparts intrinsic cognitive load, while the chaos of the surrounding emergency environment imparts extraneous cognitive load. Intrinsic and extraneous cognitive loads are additive and together determine total load [5]. Germane cognitive load does not contribute to total cognitive load per se; rather, it describes the proportion of working memory resources available to learn from the primary task. When extraneous cognitive load is low, germane cognitive load is high (indicating high learning potential); when extraneous cognitive load is high, germane cognitive load is low (indicating low learning potential) because relatively more cognitive resources must be spent processing extraneous elements [8].
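
These relationships can be summarized schematically. The following is a rough formalization only, not part of the theory’s formal notation; it assumes a fixed working memory capacity C for a given provider at a given moment:

```latex
% Schematic only: C denotes working memory capacity (assumed fixed).
\[
  L_{\mathrm{total}} = L_{\mathrm{intrinsic}} + L_{\mathrm{extraneous}},
  \qquad
  R_{\mathrm{germane}} \propto \max\bigl(0,\; C - L_{\mathrm{total}}\bigr)
\]
% Overload, and hence degraded performance and learning, is expected when L_total > C.
```

In words, total load is the sum of the intrinsic and extraneous components, the resources available for germane (learning-related) processing are whatever capacity remains, and overload occurs when total load exceeds capacity.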

These relationships are important because the subtypes of cognitive load differ in their modifiability. Intrinsic cognitive load is largely unmodifiable in the moment (it is inherent to the task) and is determined primarily by task complexity and performer expertise. Germane load is similarly unmodifiable (assuming constant levels of motivation) and varies with extraneous cognitive load as described above. Extraneous cognitive load, however, is modifiable. Changing the performer’s environment (e.g., by reducing sources of distraction) can reduce extraneous load, freeing working memory resources that can be devoted to intrinsic and germane cognitive load. In other words, as extraneous cognitive load decreases, the resources available to perform the task and to learn from it increase. Therefore, reducing extraneous cognitive load by manipulating its determinants could have significant impacts on working memory capacity, decision-making, and performance.

Why You Should Care: The Relationship Between Cognitive Load and Task Performance

Cognitive load has an inverted U-shaped relationship with performance and learning: task performance and learning typically suffer at both very low and very high cognitive loads. Because providers in cardiology rarely function at very low levels of cognitive load (i.e., rote, repetitive tasks that quickly induce boredom), for practical purposes, task performance and learning are threatened by increasing cognitive load as one nears the limits of working memory. This relationship has been demonstrated in other high-stakes industries, such as aviation [9]. An abrupt increase in taxi errors, runway incursions, and fatal air carrier accidents in the 1970s was partly related to compromised situational awareness [10, 11], even in well-trained and technically proficient crews [12]. Situational awareness involves perceiving and processing information in the surrounding environment and is highly demanding of working memory. It is thus highly vulnerable to excessive cognitive load and almost certainly contributed to the negative aviation outcomes of the 1970s. The aviation industry has since made numerous improvements to electronics, navigation systems, and warning sensors to reduce cognitive load and the risk of human error, ushering in the safest era in aviation history [13].

The relationship between excessive cognitive load and impaired performance and learning is becoming increasingly clear in medicine. In the emergency department, excessive cognitive load from frequent interruptions, distractions, and task switching is associated with greater risk of medical errors [14, 15, 16••]. In simulated surgical operations, excessive cognitive load has also been linked with an increased likelihood of errors during a complex laparoscopic task [17]. Sources of excessive cognitive load abound in ICU environments, in which patients generate an estimated 1300+ unique data points daily [18]. Held and colleagues studied the cognitive load of a multidisciplinary ICU rounding team [19] and found a mean of 20 extraneous cognitive load events (e.g., interruptions, distractions, redundant communication, and split attention) per hour, which were associated with increased subjective cognitive load. The cognitive load of daily ICU rounds has also been shown to induce mental fatigue and impair working memory immediately following rounds [20], which may contribute to “lapses” (i.e., errors of omission) and “slips” (i.e., errors of commission) in performance that underpin preventable ICU safety events [3]. One recent study evaluating the workload of frontline providers in a tertiary care pediatric cardiovascular ICU demonstrated increased patient mortality and length of stay when bed occupancy was high and staffing was limited [21••], suggesting a potentially important role of cognitive load in patient outcomes. Further research is needed to understand the magnitude and mediators of the relationship between cognitive load and performance in other high-stakes fields in cardiology. For example, the impact of cognitive load on performance may be mediated by burnout [22, 23•], which can be experienced differently based on specialty and role. Nevertheless, the principle that excessive cognitive load impairs performance and learning as working memory resources are depleted likely applies to nearly every medical and procedural subspecialty in the field.

Measuring Cognitive Load

Evaluating and reducing cognitive load require ways to measure it, and a variety of techniques have been proposed that can be used in isolation or combination (Table 2). Given the strengths and drawbacks of different approaches for measuring cognitive load, utilizing multiple approaches together is likely to provide a more holistic assessment of cognitive load than any individual approach alone.

Table 2 Examples of commonly used cognitive load measurement tools

Numeric (psychometric) rating scales can be used to estimate the cognitive load associated with a given task. The two most popular scales are the Paas Mental Effort Rating Scale [24] and the National Aeronautics and Space Administration Task Load Index (NASA-TLX) [25, 26]. The Paas Mental Effort Rating Scale asks respondents to rate the overall mental effort required to complete a task on a 1 to 9 Likert scale, with 1 representing “very, very low mental effort” and 9 representing “very, very high mental effort” [24]. The NASA-TLX asks subjects to rate cognitive load across six subscales: mental demand, physical demand, temporal demand, frustration, effort, and performance. Mental and physical demands are self-explanatory; temporal demand reflects the time pressure perceived by the respondent; performance reflects how successfully the respondent feels they performed the task; effort reflects how hard they had to work overall to achieve that level of performance; and frustration reflects feelings of discouragement and stress while performing the task [25, 26]. Each subscale is rated on a 0–100 scale (where 0 is “very low” and 100 is “very high”), and the subscale scores are used to calculate an overall mean demand, or task load index [25, 26]. Numeric rating scales are easy to administer and economical but are subjective and limited to measuring the respondent’s perception of cognitive load in retrospect rather than in the moment.
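
As a concrete illustration, the sketch below tallies a raw (unweighted) NASA-TLX score as the mean of the six subscale ratings, alongside a single Paas rating. The subscale names follow the instruments described above, but the example ratings and function names are hypothetical.

```python
from statistics import mean

# NASA-TLX subscales, each rated 0-100 ("very low" to "very high")
NASA_TLX_SUBSCALES = (
    "mental_demand", "physical_demand", "temporal_demand",
    "performance", "effort", "frustration",
)

def raw_tlx(ratings: dict[str, float]) -> float:
    """Raw (unweighted) NASA-TLX: the mean of the six subscale ratings."""
    missing = set(NASA_TLX_SUBSCALES) - ratings.keys()
    if missing:
        raise ValueError(f"missing subscale ratings: {missing}")
    return mean(ratings[s] for s in NASA_TLX_SUBSCALES)

def paas_rating(score: int) -> int:
    """Paas Mental Effort Rating Scale: a single 1-9 rating of overall mental effort."""
    if not 1 <= score <= 9:
        raise ValueError("Paas rating must be between 1 and 9")
    return score

# Hypothetical post-task ratings from one provider
example = {
    "mental_demand": 85, "physical_demand": 30, "temporal_demand": 75,
    "performance": 40, "effort": 80, "frustration": 65,
}
print(f"Raw TLX: {raw_tlx(example):.1f}")  # 62.5 for these example ratings
print(f"Paas:    {paas_rating(7)}")
```

Note that the NASA-TLX also has a weighted variant based on pairwise comparisons of the subscales; the unweighted mean shown here corresponds to the overall mean demand described above.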

Behavioral and physiologic methods are newer techniques designed to measure, in real time, objective processes that theoretically correlate with cognitive load while the task is performed. Objective changes in behavior, such as alterations in speech, voice patterns, and phonation, may track the changing cognitive demands of a task and be used to estimate cognitive load [27]. Similarly, physiologic parameters, such as alterations in heart rate, pupil dilation, hormone levels, galvanic skin response, and electrical/functional activity of the brain, may also scale with cognitive load [24, 28,29,30,31]. However, while behavioral and physiologic techniques are more objective and contemporaneous than numeric rating scales, the relationship between these parameters and cognitive load is indirect and complex, which may limit interpretability [32, 33].
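
As a simple illustration of how such signals are often analyzed, the sketch below correlates a physiologic proxy (pupil diameter) with concurrent subjective effort ratings. The data are entirely synthetic, and, as noted above, a correlation of this kind should be interpreted cautiously because the underlying relationship is indirect.

```python
import numpy as np

# Hypothetical, synthetic data: mean pupil diameter (mm) and Paas ratings (1-9)
# collected across ten task episodes for one provider.
rng = np.random.default_rng(0)
paas = np.array([2, 3, 3, 4, 5, 6, 6, 7, 8, 9], dtype=float)
pupil_mm = 3.0 + 0.15 * paas + rng.normal(0.0, 0.1, size=paas.size)

# Pearson correlation between the physiologic proxy and subjective load
r = np.corrcoef(pupil_mm, paas)[0, 1]
print(f"Pearson r between pupil diameter and rated mental effort: {r:.2f}")
```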

Another paradigm for quantifying cognitive load involves manipulating the conditions under which the task of interest is performed. In the dual-task method, as the user performs the primary task (i.e., the task of interest), a secondary task is introduced that must be completed simultaneously. The secondary task typically involves a simple activity requiring sustained attention (e.g., monitoring or detecting a visual, tactile, or auditory stimulus) [5]. The cognitive load of the primary task is inferred from performance on the secondary task: the higher the cognitive load of the primary task, the worse the performance on the secondary task (presumably because fewer working memory resources are available) [34, 35]. The dual-task method is objective and contemporaneous, but the heterogeneity of study designs and the indirect relationship with cognitive load may challenge its validity [36].
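
A minimal sketch of how dual-task data might be summarized is shown below: secondary-task reaction times under dual-task conditions are compared with a single-task baseline, and a larger decrement is taken as evidence of a more demanding primary task. The reaction times and function name are hypothetical.

```python
from statistics import mean

def dual_task_decrement(baseline_rt_ms: list[float], dual_rt_ms: list[float]) -> float:
    """Percent slowing of secondary-task reaction time under dual-task conditions.

    Larger values imply fewer spare working memory resources, i.e., a more
    demanding primary task.
    """
    baseline = mean(baseline_rt_ms)
    dual = mean(dual_rt_ms)
    return 100.0 * (dual - baseline) / baseline

# Hypothetical reaction times (ms) to an auditory probe
baseline = [420, 450, 430, 440, 460]          # probe alone
during_easy_task = [470, 480, 460, 490, 475]  # probe during a routine task
during_hard_task = [620, 650, 640, 700, 610]  # probe during a complex task

print(f"Easy task decrement: {dual_task_decrement(baseline, during_easy_task):.0f}%")  # ~8%
print(f"Hard task decrement: {dual_task_decrement(baseline, during_hard_task):.0f}%")  # ~46%
```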

Tools and Targets for Decreasing Cognitive Load

Multiple strategies have been proposed to decrease excessive cognitive load in medicine, with heavy influences from other fields (e.g., aviation) that have successfully reduced cognitive load and improved performance, learning, and outcomes [53, 54]. These tools span automation, visualization, clinical decision support, crisis resource management (CRM), simulation, and human factors (HFs), among other fields. A combination of approaches, with rigorous systematic evaluation, is likely needed to meaningfully impact cognitive load, decision-making, task performance, and outcomes.

Increasing Automation

Workflow automation in the healthcare setting has the potential to decrease cognitive load for providers. Automation exists on a spectrum from “assistive” to “autonomous.” To date, most automation technologies are assistive, for example, aiding in administrative processes such as appointment scheduling, billing, delivery of medications, and medication compounding [55]. However, advances in machine learning (ML) have moved the technology toward becoming more autonomous. Zhang et al. describe a convolutional neural network trained to review echocardiographic images, measure ejection fraction and longitudinal strain, and detect hypertrophic cardiomyopathy, cardiac amyloidosis, and pulmonary arterial hypertension, with results comparable or superior to manual interpretation [56]. Closed-loop control systems, autonomous systems capable of using complex algorithms to monitor patients and deliver personalized therapies [57], may one day reduce the cognitive load of providers who currently spend significant mental effort performing these tasks.
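
To make the closed-loop concept concrete, the sketch below implements a toy proportional controller that nudges a simulated vasoactive infusion rate toward a mean arterial pressure (MAP) target. It illustrates the control-loop idea only; it is not a description of any cited system, and all parameters and values are invented.

```python
def proportional_controller(map_measured: float,
                            map_target: float,
                            current_rate: float,
                            gain: float = 0.02,
                            rate_limits: tuple[float, float] = (0.0, 1.0)) -> float:
    """Toy closed-loop step: adjust infusion rate (mcg/kg/min) toward a MAP target.

    error > 0 (pressure below target) increases the rate; error < 0 decreases it.
    The new rate is clamped to plausible bounds.
    """
    error = map_target - map_measured
    new_rate = current_rate + gain * error
    low, high = rate_limits
    return max(low, min(high, new_rate))

# Simulated sequence of MAP measurements (mmHg) with a target of 60 mmHg
rate = 0.10
for map_now in [48, 52, 55, 58, 61, 63]:
    rate = proportional_controller(map_now, map_target=60.0, current_rate=rate)
    print(f"MAP {map_now} mmHg -> infusion rate {rate:.2f} mcg/kg/min")
```

A real closed-loop system would add integral and derivative terms, safety interlocks, and human oversight, but the basic measure-compare-adjust cycle is the element that offloads continuous monitoring effort from the provider.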

Improving Visualization

Providers in high-stakes cardiology environments are undoubtedly familiar with the impacts of suboptimal visualization in their daily work with electronic health records (EHRs). One study of ICU providers found that after EHR implementation, residents and attending physicians spent more of their time on clinical review and documentation and experienced more frequent task switching [58]. Effective visualization is paramount to decreasing cognitive load in data-rich environments such as the ICU [59]. The implementation of visualization dashboards can decrease provider cognitive load by translating data elements into visual objects, minimizing the user actions required to accomplish a goal, organizing data spatially, and removing extraneous or distracting information [60]. High-density visualization platforms that represent data as trends and symbols can increase the efficiency of cognitive processing and the reaction time of clinicians in acute care settings [61]. However, significant room for improvement remains with regard to user-level customization and data filtering [62], and subjecting providers to a plethora of poorly designed dashboards risks increasing the very cognitive load these tools are meant to reduce.

Leveraging Machine Learning for Clinical Decision Support

Advances in ML have made it possible to decrease cognitive load in high-stakes cardiology environments via improved predictive capabilities and clinical decision support [63]. ML can forecast hemodynamic or respiratory instability [64], create early warning systems for the detection of sepsis [65•], phenotype heterogeneous diseases like acute respiratory distress syndrome [66•], and interpret echocardiograms [66•]. These tools may point to the correct diagnosis or therapy sooner than clinicians would recognize it on their own, thereby decreasing the cognitive load associated with decision-making [67, 68]. However, the relationship between ML, clinical decision support, cognitive load, and outcomes is complex and understudied. There is a lack of studies that directly assess changes in cognitive load before and after ML implementation [69], despite the field’s growing interest in ML’s impacts on usability, clinical workflows, and decision-making [70]. Concerns about ML’s impact on cognitive load are justified, as even robust algorithms that predict patient decompensation after congenital cardiac surgery generate one alarm per patient per day that must be evaluated by the care team [71•].
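
The alarm-burden concern can be quantified. The sketch below, using entirely synthetic risk scores from a hypothetical deterioration model, shows how the choice of alert threshold trades sensitivity against alarms per patient-day, the quantity the care team ultimately has to absorb.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic hourly risk scores for 100 patients over one day (24 scores each);
# 5 patients truly deteriorate and receive higher scores on average.
n_patients, hours = 100, 24
scores = rng.beta(2, 8, size=(n_patients, hours))          # mostly low risk
deteriorating = rng.choice(n_patients, size=5, replace=False)
scores[deteriorating] += 0.35                               # shift true events upward

labels = np.zeros(n_patients, dtype=bool)
labels[deteriorating] = True

for threshold in (0.3, 0.5, 0.7):
    alarms = scores > threshold
    alarms_per_patient_day = alarms.sum() / n_patients
    sensitivity = (alarms.any(axis=1) & labels).sum() / labels.sum()
    print(f"threshold {threshold:.1f}: "
          f"{alarms_per_patient_day:.1f} alarms/patient-day, "
          f"sensitivity {sensitivity:.0%}")
```

Even a well-calibrated model imposes a nontrivial interpretive workload once its alerts are routed to frontline providers, which is why cognitive load should be measured alongside predictive performance when such tools are deployed.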

Promoting Crisis Resource Management

CRM is the healthcare industry’s direct adaptation of crew resource management from aviation [53]. It has primarily been utilized in surgery, anesthesia, emergency care, and the ICU [72, 73]. CRM focuses on developing the non-technical skills needed for effective teamwork in a crisis, and some of its key principles relate directly to the mitigation of excessive cognitive load. For example, CRM’s emphasis on defining roles, sharing mental models, allocating attention wisely, and distributing workload may all impact cognitive load [74], and reducing cognitive load is an essential element of improving situational awareness and CRM skills [75]. Effective CRM has improved team dynamics and task performance in trauma resuscitation [76] and anesthesia emergencies [77]. Medical trainees who were “high performers” on a simulated clinical examination task were more successful in leveraging CRM to manage excessive cognitive load than those who were “low performers” [78].

Utilizing Simulation

Simulation techniques can be harnessed to recreate clinical environments and assess task performance and cognitive processes in dynamic, realistic circumstances. Pauses and debriefings can decompose complicated tasks (with high overall intrinsic cognitive load) into elemental steps (with lower individual intrinsic cognitive loads), thereby facilitating the development of task expertise, especially among novice learners [79]. As task expertise develops, encoding learned principles into long-term memory may directly decrease the intrinsic cognitive load of the task when it is performed in real clinical environments [79]. Furthermore, the simulation laboratory is an ideal environment for testing modifiers of extraneous cognitive load: novel decision aids, workflows, and environmental modifiers can be piloted in controlled settings, with cognitive load and task performance closely measured in creative experimental designs [69].

Optimizing Human Factors and System Engineering

HFs and systems engineering (SE) encompass frameworks for understanding how humans interact with elements of their environment within a broader system (or systems) of care delivery [80]. Core elements of HF/SE relate to system design, including considerations of how humans use and interact with different elements of their environment to optimize system performance (i.e., decision-making and care delivery). Poor environmental design, or simply a lack of attention to environmental design, can induce significant extraneous cognitive load that negatively impacts user performance in the complex socio-technical healthcare system. In cardiology acute care settings, constant interruptions from patient monitor alarms, phone calls, paging alerts, and ambient noise can contribute to excessive cognitive load and medical errors [81]. Many of these interruptions are modifiable, for example, through systems that more intelligently triage alerts [82] and minimize noise pollution [83]. These types of interventions may have positive impacts on working memory capacity [84], thereby mitigating the effects of excessive cognitive load and improving performance.
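
As one illustration of the kind of triage logic such systems might apply, the sketch below suppresses low-priority alarms and deduplicates repeats from the same source within a short window. The alarm structure, priority categories, and window length are all hypothetical and not drawn from any cited system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Alarm:
    time_s: float      # seconds since start of shift
    source: str        # e.g., "SpO2", "NIBP", "infusion_pump"
    priority: str      # "high", "medium", or "low"

def triage(alarms: list[Alarm],
           min_priority: str = "medium",
           dedup_window_s: float = 120.0) -> list[Alarm]:
    """Keep alarms at or above min_priority, dropping repeats of the same
    source/priority that fire within dedup_window_s of the last kept alarm."""
    rank = {"low": 0, "medium": 1, "high": 2}
    kept: list[Alarm] = []
    last_kept: dict[tuple[str, str], float] = {}
    for alarm in sorted(alarms, key=lambda a: a.time_s):
        if rank[alarm.priority] < rank[min_priority]:
            continue                                   # below the escalation threshold
        key = (alarm.source, alarm.priority)
        if alarm.time_s - last_kept.get(key, float("-inf")) < dedup_window_s:
            continue                                   # duplicate within the window
        kept.append(alarm)
        last_kept[key] = alarm.time_s
    return kept

stream = [
    Alarm(10, "SpO2", "low"),
    Alarm(15, "SpO2", "high"),
    Alarm(60, "SpO2", "high"),      # repeat within 120 s -> suppressed
    Alarm(300, "NIBP", "medium"),
]
for a in triage(stream):
    print(a)
```

The safety of any such filtering obviously depends on careful clinical validation, but the underlying principle is the HF/SE one described above: remove extraneous stimuli so that working memory is reserved for signals that matter.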

Conclusion

Effective performance and learning in high-stakes cardiology environments require providers to have a working understanding of cognitive load. Cognitive load theory is based on the principle that working memory resources are finite and that excessive intrinsic and/or extraneous load can overwhelm working memory capacity and compromise performance. Given the negative impacts on providers, patients, and health systems, significant attention should be paid to evaluating and reducing cognitive load in high-stakes cardiology environments and in medicine more generally. Multiple techniques exist for measuring cognitive load, each with strengths and drawbacks, and combining multiple measurement modalities may more holistically reflect cognitive load and its impact on providers. Numerous strategies may decrease cognitive load, including increasing automation, improving visualization, leveraging ML, promoting CRM, utilizing simulation, and optimizing HF/SE. Lessons learned from other industries can inform the implementation of these and other strategies, which will need to be undertaken in combination to decrease excessive cognitive load in medicine. Stakeholders must be aware of cognitive load as a potential threat to safe care delivery, task performance, and learning in high-stakes cardiology environments. Only by recognizing the magnitude and key drivers of excessive cognitive load in our work environments can we employ creative thinking and aggressive mitigation strategies to benefit patients and providers.