1 Introduction

In today’s dynamic, ever more complex healthcare environment, in which the doubling time of medical knowledge will soon be measured in months rather than years and disruptive technological innovations continue to change the way clinicians practice [1], surgeons can no longer rely on their own wit and talent to provide quality care to the surgical patient. Instead, they must depend on smoothly functioning, inter-professional teams of health professionals from other disciplines who bring their own expertise, within their scopes of practice, to assist surgeons in guiding increasingly sick patients through surgical procedures to a successful outcome. Gone are the days in which the autonomous surgeon acted as the “captain of the ship,” dictating every component of the care plan to all around. Instead, the contemporary surgeon must act more like a coach, collaborating with teammates to ensure effective care is rendered. This is especially true because advances in critical care, anesthesia, pharmacology, surgical technology, physical and occupational therapy, and the like outpace surgeons’ ability to keep abreast of them. The Institute of Medicine (IOM) recognized this shift in practice in Health Professions Education: A Bridge to Quality when it designated the ability to work in inter-professional teams as a new core competency [2]. This work was followed by the IOM’s Redesigning Continuing Education in the Health Professions, which called for the transformation of continuing education into an inter-professional activity [3].

This expanding emphasis on inter-professional teamwork and team function presents new challenges for contemporary surgical educators. In addition to teaching medical knowledge and technical skills, they must also introduce learners to team-based competencies to ensure the effective development of surgical teams. Such training entails inculcating teamwork concepts and principles in students new to the profession as well as overcoming ingrained patterns of detrimental team behavior among practicing clinicians. By taking a human factors (HF) approach to such teaching, surgical educators can meet these challenges. This chapter begins to address how to develop highly reliable surgical teams by discussing the role of HF in promoting safe surgical care. It does so by addressing two objectives: (1) discussing the theoretical underpinnings of HF and (2) demonstrating the need for it, given the current inadequacy of surgical teamwork in the clinical environment.

2 The Role of Human Factors in Promoting Safe Surgical Care

Although the term “human factors” was first coined in 1957 with the founding of the Human Factors Society, the field’s origins date back to the beginning of the twentieth century and are closely tied to aviation [4, 5]. In fact, the need to identify qualified individuals for pilot training during World War I was a major impetus to the development of aviation psychology [4, 5]. With the rise of civil aviation during the interwar period, work in the field continued. It was during this time that the first flight simulator, the Link Trainer, was developed by the American Edwin Albert Link in Binghamton, New York [4]. World War II spurred further advances in the field as a result of two major trends: (1) the need to design processes to fit people’s capabilities and minimize their limitations in the face of massive mobilization for the war effort and (2) the inability of humans to compensate for poor design amid the rapid technological advances of the period [5]. In the United States, World War II marks the birth of the discipline [5]. After the war, the field entered a period of rapid expansion, with research and development that continues to this day.

Christensen, Topmiller, and Gill have defined the term “human factors” as “…that branch of science and technology that includes what is known and theorized about human behavioral, cognitive, and biologic characteristics that can be validly applied to specification, design, evaluation, operation, maintenance of products, jobs, tasks, and systems to enhance safe, effective, and satisfying use by individuals, groups, and organizations.” [6] Put another way, HF is the study of the interaction of humans with their environment. As Christensen et al.’s definition implies, this “environment” can entail the technology on which an individual works, the system processes and procedures of an individual’s workplace, and the work teams with which an individual interacts.

The central axiom of the field of HF can be summed up by the following adage: “We’re only human.” This maxim encapsulates the HF concept that human error is inevitable, making the construction of an error-free system impossible [7]. Thus, HF is founded on “…a fundamental rejection of the notion that humans are primarily at fault when making errors in the use of a socio-technical system.” [8] Instead, as James Reason [7] has posited, catastrophic errors within complex systems are the result of the combination of unnoticed weaknesses within these systems, so-called latent conditions, with active failures resulting from decisions and actions of individuals that are influenced by these systems. Consequently, multiple holes within the defenses erected to prevent a problem align, much like holes in Swiss cheese, creating a set of circumstances culminating in a catastrophic event. Recent examples of “Swiss cheese in action” can be found in multiple industries: nuclear power [9], offshore oil drilling [10], and, too frequently, healthcare [11].

One of the major goals of work in HF, therefore, is to design systems and devices with defenses in depth for safe, effective use by humans [12]. To optimize the interaction between humans and their work environment, HF experts study human behaviors, abilities, and limitations in an effort to create robust systems adept at avoiding, trapping, and mitigating potential and real threats and errors [14]. Such application of HF to real-world situations is known as HF engineering.

In essence, HF engineers attempt to shape human behavior within a work environment by designing systems and processes that optimize the recognition and mitigation of problems and deficiencies within those systems. According to Cafazzo and St-Cyr [8], HF engineers pursue this goal through a two-pronged approach: (1) systems-focused and (2) people-focused (Fig. 25.1 [13,14,15,16]). The former approach is more effective in preventing error, whereas the latter allows for the positive impact of human judgment. Systems-based solutions to error reduction include standardization of processes, decreasing complexity and optimizing information processing within systems, the intelligent application of automation and computerization, and forcing functions [17]. Of these, forcing functions are the most effective, since they create physical constraints that prevent humans from committing an error. The Pin Index Safety System (PISS) in anesthesia, in which small cylinders of anesthetic gases can be attached only to the connector having that gas’s unique pin configuration, is an example of a forcing function in healthcare [16]. The oversized diesel nozzle that prevents its insertion into an unleaded fuel filler is an everyday example.

Fig. 25.1

Approaches to human factors engineering. Human factors engineers use both systems-focused and people-focused approaches to create robust systems adept at avoiding, trapping, and mitigating potential and real threats and errors. The effectiveness of such approaches increases as one moves from people-focused to systems-focused solutions

People-focused approaches involve the application of procedural constraints such as the use of checklists and reminders or policies and procedures. The Joint Commission’s Universal Protocol for Preventing Wrong Site, Wrong Procedure, and Wrong Person Surgery™ [19] is an excellent surgical example of such a constraint. Other people-focused interventions involve training and education to instill expected values and behaviors in the workplace. In this manner, cultural constraints are fostered to create an environment in which doing “the right thing at the right time” becomes the norm. In such environments, safety becomes the primary priority, superseding all other goals (e.g., profit and efficiency).

Such a culture of safety is the defining characteristic of a high reliability organization (HRO). In Managing the Unexpected: Assuring High Performance in an Age of Complexity [18], Weick and Sutcliffe define the key principles and attributes that allow HROs to perform in a consistent and safe manner in high-risk, dynamic environments. Most notably, HROs demonstrate a preoccupation with failure, consistently searching for weaknesses within the systems and processes of the organizational structure that may lead to threats and hazards before they surface. HROs also possess a sensitivity to operations and a reluctance to simplify interpretations of problems, in order to avoid missing a potential latent condition. This sensitivity to operations manifests itself in HROs’ deference to expertise, in lieu of rank or seniority, when dealing with an issue. All these characteristics combine to create a commitment to resilience that allows HROs to adapt fluidly and smoothly to changing situations and conditions within their environment. In a nutshell, an HRO promotes mindfulness in lieu of “mindlessness” among all the individuals working within it.

Two examples outside healthcare demonstrate the benefits of having, and the perils of lacking, what Westrum [19] refers to as a generative organizational culture. The story of the seaman who lost a wrench on the flight deck of the nuclear aircraft carrier USS Carl Vinson illustrates how an HRO operates. Such a loss can be catastrophic if the tool gets sucked into the jet engine of a fighter plane taking off or landing. The seaman, therefore, spoke up to inform his superiors of the loss. Consequently, all flight operations were halted, and the deck was systematically searched until the wrench was found. For reporting the loss, the seaman was officially recognized and rewarded the next day during a ceremony on the aircraft carrier [18].

A cautionary tale is provided by British Petroleum (BP). This energy company, which marketed itself as environmentally friendly, was in reality anything but, owing to an organizational culture that placed profit before safety. Even though the Macondo Well Explosion and Oil Spill in the Gulf of Mexico [11, 20] represents the most recent and costly example of the consequences of this cultural attitude, the preceding Texas City Refinery Explosion [23] and the Prudhoe Bay oil spill [23] reveal that BP was prone to such catastrophic events because of this culture. Unfortunately, the work of several researchers has demonstrated that the cultural bent of the healthcare industry leans more toward BP than the USS Carl Vinson [21].

Why is achieving HRO status so difficult in healthcare? Runciman and Walton [22] have argued that its diversity of tasks and activity patterns, its lack of regulation, and its focus on sick humans with variable characteristics and outcomes are contributing factors. Given that cultural change can take up to a decade and requires a concerted, coordinated approach [23], one might consider trying to create a culture of safety across the healthcare industry a quixotic endeavor. Fortunately, such change does not need to occur at the macro-system level to ensure its existence at the clinical micro-system level. In fact, a clinical micro-system, defined as a group of healthcare professionals working together with a shared clinical purpose to provide care to a defined patient population, can independently function like an HRO [24]. Thus, HRO practices can be fostered within an operating room (OR), postanesthesia care unit (PACU), intensive care unit (ICU), emergency department (ED), or patient care floor. Additionally, such practices might be developed within several of these at once or within a service line within an institution, such as perioperative care. Over time, the creation of such pockets of HRO-like clinical micro-systems can assist in changing the behavior of the institution as a whole.

The cornerstone of any HRO is highly reliable team function within that organization [25]. Without teams of individuals performing in such a manner, the communication and resiliency needed to maintain high reliability within an organization are curtailed. Salas et al. [26] have identified key traits and coordinating mechanisms demonstrated by highly reliable teams in HRO settings, which have been incorporated into the Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS™) [27] program developed by the Department of Defense in coordination with the Agency for Healthcare Research and Quality.

3 Contemporary Surgical Teamwork

Much like a culture of safety, highly reliable team function tends to be the exception rather than the norm in healthcare. This is especially true in surgery and the OR, where a sense of tribalism [28], fostered by a silo mentality [29], promotes multi-professional interaction instead of inter-professional teamwork [30]. Thus, the OR is better characterized as a group of experts than as an expert team [31]. Most damaging, these behaviors are propagated from one generation of clinicians to the next as students model them under the influence of the “hidden curriculum” of their training. Many factors contribute to this toxic work environment: unwanted hierarchical structures [32], role confusion [33], differing perceptions of teamwork [34], weak interpersonal skills among professions [35], and increased tension [36]. Such problems extend beyond the OR to other clinical micro-systems where surgical teams work, including the ICU and the surgical wards [37].

Particularly remarkable is the striking breakdown of communication within surgical teams [38]. Such ineffective communication can be due to misunderstandings, failure to hear, or inappropriate timing in the delivery of information [39]. Unfortunately, it can occur during the management of critical events [40], and its consequences can negatively impact patient care [38]. Thus, even though members of surgical teams are speaking to one another, they often do not understand the meaning of what is being said. Much like the citizens of the United States, the United Kingdom, and Australia, surgical team members are often separated by a common language (Fig. 25.2)!

Fig. 25.2

Separated by a common language! Much like the term “football” connotes a different sport in the United States, the United Kingdom, and Australia, surgical team members may misunderstand or misinterpret one another’s communication

The consequences of ineffective teamwork in surgical micro-systems are manifold. Among them are distractions and disruptions that can degrade team function (Table 25.1 [41,42,43,44]). Thus, the dysfunctions of contemporary surgical teams have tangible consequences for the care given to the patient.

Table 25.1 Disruptions in the operating room

4 Conclusion

Surgical teams are more often ineffective than effective in contemporary clinical practice. An understanding of HF can help address this gap. Key HF concepts include creating defenses in depth to avoid, trap, and mitigate the inevitable errors that occur in any human-designed system, and employing both systems-focused and people-focused approaches to promote highly reliable team behavior. Applied in the surgical setting, these concepts can help create adaptive teams able to respond to dynamic, high-risk changes in the environment.