Almost a quarter of a century has elapsed since Dr. Lucian L. Leape issued his clarion call, “Error in Medicine” [1], in which he charged that the medical community had for too long underestimated or, worse, deliberately ignored the problem of medical errors and the harm they cause.

“All physicians,” he said, “recognize that mistakes are inevitable. Most would like to examine their mistakes and learn from them. From an emotional standpoint, they need the support and understanding of their colleagues and patients when they make mistakes. Yet they are denied both insight and support by misguided concepts of infallibility and by fear: fear of embarrassment by colleagues, fear of patient reaction, and fear of litigation” [1]. As another physician, David Hilfiker, had charged a decade earlier, “The medical profession seems to have no place for its mistakes” [2].

Cognition not Character

Drawing on the behavioral theories of British psychologist James Reason [3], Leape argued that most errors in medicine result not from flaws in individual character (not being careful enough or trying hard enough) but from innate failures in human cognition, from what he termed “aberrations in mental functioning” – whether they be unconscious slips or mistakes in judgment. Hospital workers’ slips or mistakes, he continued, were often precipitated and/or facilitated by latent flaws in the design of the systems the workers were a part of, flaws that inadvertently but inevitably set them up to fail.

Accordingly, Leape urged the redesign of systems and the reorganization of processes to maximize patient safety by making it harder for hospital workers to commit errors, by automatically halting and reversing any errors they might commit, and by anticipating and neutralizing those conditions that could serve as the preconditions for such mishaps.

Leape concluded:

The most fundamental change that will be needed if hospitals are to make meaningful progress in error reduction is a cultural one. Physicians and nurses need to accept the notion that error is an inevitable accompaniment of the human condition, even among conscientious professionals with high standards. Errors must be accepted as evidence of system flaws not character flaws. Until and unless that happens, it is unlikely that any substantial progress will be made in reducing medical errors. [1]

Leape’s mission to make error prevention a primary focus of medical practice would have a profound effect upon the way hospitals viewed the problem of adverse events. His contributions helped to shape important later studies [4, 5] and led to the implementation of major reforms [6].

A Cultural Revolution

As Leape strove to transform a dysfunctional health care culture and thereby reduce the frequency of medical error, a different revolution had already begun to transform American culture at large, one that would pose unprecedented future challenges to patient safety and care.

Whereas the Industrial Revolution of the nineteenth and early twentieth centuries was essentially machine driven, the new revolution of the late twentieth century was not mechanical but electronic. The first revolution produced progress mainly in manufacturing (giant factories and assembly lines) and transportation (railroads, automobiles, and airplanes); the second, chiefly in communications (television, cellular phones, and the Internet).

The first word processor would appear in 1970; the first microprocessor, in 1971; and the first personal computer, in 1975. By 1994, the year Leape’s article was published, one-third of American homes had a computer, and by the following year, a quarter of them had two. The speed of computers, meanwhile, was doubling every 18 months. And during the mid-1990s, the public use of a newly commercialized, email-equipped Internet was rapidly expanding.

Telephone technology also kept pace. The traditional landline telephone became cordless and cellular in the 1970s. Eventually, deskbound personal computers (PCs) were overtaken by portable, lightweight laptops and sleek tablets. And by 1994, the very year Leape’s article appeared, the twin technologies of telephone and computer merged for the first time in the handheld “Simon Personal Communicator,” the prototype of today’s ubiquitous, Wi-Fi-enhanced smartphone. According to the Pew Research Center, by 2015, smartphones were in the hands of 64% of adult Americans [7] and were being used professionally by more than eight out of ten US physicians [8].

Yet, more important than the popularity of any one of these technologies is their combination, which radically reinforces and intensifies the accelerative effect that each separate technology would have had alone. It is their electronic linkage that keeps pictures, sounds, and data continually coursing on a nonstop, high-speed track, saturating our environment with instancy. And the more our society depends upon electronic information, the more our everyday lives need to keep up with its speed-of-light pace, since our economic and emotional existence is wired into its circuitry. [9]

During the Industrial Revolution, advanced technology had been mostly confined to factories. Then it moved into people’s homes. Today we carry it around in our pockets. With each step, technology’s presence and influence became more intimately entwined with our lives.

Hyperculture

Smartphone speed is stimulating and exhilarating. It gives us what we need and want faster than ever before – from breaking news to the latest sales. Through texts and images, it connects us instantly with our friends and reassures us about our own identities, identities that are now defined by our Internet presence and the social networks we belong to. And because our communication devices are so essential to our existence, our lives have become unthinkable without them.

The speed at which those devices operate and our personal dependence upon them have created a new kind of society, a “hyperculture” [9], an electronic culture governed by speed. Energized by electrons racing around a nonstop track at the speed of light, a hyperculture creates its own peculiar kind of urgency – not a real urgency but an artificial one that is even more demanding, one that sucks us into its all-consuming vortex. Spun around in that vortex, we become convinced we must always keep up or we will fall hopelessly behind, thereby losing everything life has to offer. That struggle creates stress, a stress that can seem unending because we can never match the speed of our machines [10]. And when such stress temporarily relents, the void created by its absence causes us to hunger for a renewal of its hyperstimulation to end our boredom, not unlike those who, upon entering an empty room, automatically flip on the television set to drive away the silence [9].

Our dependency on our electronic devices has spawned a whole new set of psychological maladies: “nomophobia,” the fear of having no mobile phone handy [11, 12]; “phantom vibration and ringing syndrome,” the sensation that a phone has vibrated in your pocket or rung when, in fact, it has not [13, 14]; and, most characteristically of all, the recent Merriam-Webster entry, “FOMO,” the fear of missing out [15]. While these ailments may strike us as amusing, there’s nothing laughable about individuals so addicted [16–22] to their screens that out of negligence they cause harm to themselves and others.

Smartphone Zombies

In 2011, a woman named Cathy Cruz Marrero was making her way across a mall near Reading, Pennsylvania [23]. Failing to see a fountain directly in front of her, she stumbled over its retaining wall and toppled into the fountain’s pool. Fortunately, she was only drenched and bruised a bit by her encounter, but her mishap was captured by mall surveillance cameras and later, to her acute embarrassment, broadcast by security guards to millions on YouTube, where it can still be seen today [24]. Mrs. Marrero was texting a friend on her cell phone and was so focused on pecking out the letters on her keyboard that she failed to observe the obstacle that lay in her path.

The following year, while walking and texting, another woman named Bonnie Miller fell off the edge of a pier in South Bend, Indiana, and had to be fished out of a river by bystanders [25].

Marrero and Miller are merely stragglers in a relentless army of cell phone users now marching across the urban landscapes of New York, London, Tokyo, and Hong Kong. Dubbed “smartphone zombies,” they continually bump into fellow pedestrians, run into trees, and crash into light poles, like balls in a bizarre game of human pinball [26]. Intent on pursuing their electronic lives, notes one Berliner, “they walk in the streets without checking the traffic, they sit silently across from each other in restaurants, whole hordes of them in the subway, and all of them constantly gazing into the screen of their smartphones as if they were staring through a magical looking glass into another dimension, one that seems to be significantly more exciting than the world that surrounds them” [27]. In China, they’re called “dai tan juk,” the “head-down tribe,” and have been assigned special sidewalk lanes to ensure the safety of others [28, 29]. And to keep texters from hurting themselves, one British city has even installed experimental shock-absorbent pads on its lampposts [28].

Research shows that staring at a smartphone can narrow your field of vision to as little as 5% of its normal range [26], and texting while walking can make you deviate as much as 61% from a planned course [30]. In the process, what you lose is called “situational awareness” [31], an ongoing awareness of the physical environment you are in, a quality long valued as a critical component of successful aviation, navigation, and soldiering.

A Deadly Wandering

In civilian life, the loss of situational awareness – whether from talking on a cell phone or texting – has been responsible for deadly car crashes that kill over 3,000 people a year and injure more than 1,000 a day [32]. Talking on the phone while driving increases the risk of a crash fourfold; texting while driving, sixfold [33]. The human toll these crashes take and the efforts to prevent them have been dramatically documented by Pulitzer Prize-winning reporter Matt Richtel in his book, A Deadly Wandering: A Tale of Tragedy and Redemption in the Age of the Internet [33].

Surprisingly, in the 5 seconds that the average person’s eyes are off the road while texting, a car moving at 55 mph can travel the length of a football field [34]. Moreover, according to University of Utah neuroscientist Dr. David Strayer, “depending on the complexity of the driving task, it may take 15 seconds or more after you’ve pushed ‘send’ before you’re fully back in an unimpaired state” and recover from what he calls “inattention blindness,” not seeing what’s going on around you [33, 34]. And while driving and talking on a phone, especially a hands-free phone, may seem safer, your mind is still somewhere else, with a reaction time worse than that of someone legally drunk [33].
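The football-field comparison is easy to verify with a quick back-of-the-envelope conversion; the arithmetic below is a rough check, not a figure taken from [34]: at 55 mph a car covers roughly 80.7 feet every second.

```latex
% Rough check of the eyes-off-the-road figure (editorial arithmetic, not from [34]):
% convert 55 mph to feet per second, then multiply by 5 seconds of looking at the phone.
\[
55\ \tfrac{\mathrm{mi}}{\mathrm{h}} \times \frac{5280\ \mathrm{ft/mi}}{3600\ \mathrm{s/h}}
\approx 80.7\ \tfrac{\mathrm{ft}}{\mathrm{s}},
\qquad
80.7\ \tfrac{\mathrm{ft}}{\mathrm{s}} \times 5\ \mathrm{s} \approx 403\ \mathrm{ft},
\]
% which exceeds the length of a football field even measured end zone to end zone (360 ft).
```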

Distracted walking and distracted driving are not simply common and potentially dangerous examples of inattention but organically related behaviors symptomatic of the society we have built and inhabit [9, 35–37]. Some inventions – like the motion picture and television screen – long ago showed our eyes distant vistas even as the telephone and radio opened our ears to faraway voices and sounds. But now more than ever, we have become, in the words of Thoreau, “the tools of our tools” [38], with the devices we have newly created creating a new kind of us. As a consequence, our latest devices permit us, indeed invite us, to be mentally somewhere other than where we physically are. As we gather around the table for a meal, our remote devices transport us individually to separate universes even though we sit but a couple of feet apart. And with the advent of digital streaming, Neil Postman’s three-decades-old premise that Americans are “amusing ourselves to death” [39] is more portably true today than ever before. Like prisoners in Plato’s legendary cave, we sit in theaters before the feature begins, fitfully checking the apps on our glowing screens lest we miss some seemingly important but inevitably trivial connection with the outside world, or we stay at home playing video games that enable us to escape from a seemingly intractable reality into a fulfilling realm of fantasy. Meanwhile, drugs both illegal and legal increasingly insulate us from the issues and challenges of the present that cry out for our attention.

In short, what we have manufactured is an age of appsence. And in counseling us to restore a missing sense of “presence” to our lives, many psychologists fail to recognize that our absence from the lives of others, including the lives of those who love and need us, may be less a function of our conscious choice than the consequence of the multiple wired and wireless devices we have eagerly allowed ourselves to become addicted to.

Digital Doctoring

While the upside of digital doctoring, apps included, is indisputable [40–44], its potential downside is undeniable [45].

One of the biggest challenges with any new device is its potential to distract the clinician and alienate the patient, ultimately emphasizing technology over people. When the clinician becomes too focused on the data collection process, he or she begins to lose the personal connection that lies at the heart of the patient-clinician relationship. [46]

The tendency of clinicians to focus not on the human beings sitting in front of them but on the disembodied data on their EMRs [47–51] – in some cases, about a third of the time [52] – and to thus confuse the real patient with the iPatient [53–59] is not a mere lapse in courtesy but a telling by-product of dwelling in what twentieth-century French philosopher Jacques Ellul termed a “technological society.” As Ellul wrote: “When technique enters into every area of life, including the human, it ceases to be external to man and becomes his very substance. It is no longer face to face with man but integrated with him, and it progressively absorbs him” [60].

In effect, the radiant device bathes everything else in its own light, coloring the world around it in its own hues until the distinctive identity of the non-device – the human being – fades away.

Rightly revered as a wondrous and portable tool for instantly recording, storing, retrieving, organizing, analyzing, and transmitting medical information [61–63], the computer also sends some powerfully subversive signals that have nothing to do with its intended purposes but everything to do with its inherent nature. And the closer our relationship with a computer becomes, the more its lessons sink into our souls until, like obedient slaves, we learn to speak the language of our masters.

Because the currency of computers is data, computers implicitly teach that what is quantitative is superior to what is qualitative and that what can be expressed in numbers is more important than what cannot [9]. Furthermore, because the best computer is the fastest computer, anything slow is automatically labeled as inferior [9]. Yet think for a moment about the things that best define us as human beings – patience, compassion, dedication, and love – qualities that take time to express and cannot be reduced to numbers. If the values of the computer increasingly become the values of medicine, how humane will the practice of that medicine be when a patient is viewed chiefly as a storehouse of data to be summarily and impersonally accessed?

In fact, if our daily interactions are mostly with computers, we may risk losing the skill, or even desire, to communicate face to face. We may not even realize we are losing vital listening skills that could otherwise enable us to hear what a patient is really saying. And we may lose the willingness to take time and listen to a vulnerable patient’s narrative in a way that could permit us to better diagnose and heal.

In some cases, the “absence” of the physician from the patient, even though both are in the same room, is due to the inability to be in more than one place at one time. Forced to choose the focus of his attention, the physician chooses the machine.

Here Drs. Shelley Ross and Sarah Forgie recount the all-too-familiar story of a busy resident:

A 39-year-old man suffering from multiple facial contusions and a head injury after a water-skiing accident was seen in the emergency department, accompanied by his spouse. The resident began taking a history, then stopped mid-sentence, pulled out his phone, read the screen and began to text. The spouse of the patient said, “What are you doing?” The resident replied, “I have to answer this. It’s about dinner.” He turned his back, continued to text, waited for a response, then texted again. Replacing his phone, he started again with the history. When the spouse complained about the interruption, the resident looked at her blankly, and again stated, “I had to answer it. It was about dinner.” [64]

Another illustration of the computer’s stamp on our behavior is the common term “multitasking,” a term originally applied to advanced computers but now applied to people [65]. Emulating machines, people multitask in the mistaken belief that when you do two things simultaneously, neither of them suffers. Researchers at Stanford University, however, have demonstrated just the opposite [66–68]. While multitaskers may delude themselves into thinking they’re being more efficient, and have others convinced they are as well, switching from one task to another actually wastes time and interrupts the undivided attention needed to perform a particular task extremely well. Multitaskers, moreover, are more easily distracted than those focused on a single task. Initially believing that habitual multitaskers had a special gift, the Stanford researchers studied them at work and were amazed to discover the exact opposite. In the frank words of one investigator, Prof. Clifford Nass: “Multitaskers were just lousy at everything… They’re suckers for irrelevancy… Everything distracts them” [65, 67].

If, however, the defining principle of a hyperculture is its inordinate speed, and if multitasking is our misguided and inept way of trying to cope with its multiple demands, then our attempts to fulfill our professional responsibilities are doomed to fall grievously short.

Unfortunately, the multitasking mentality has entered the operating room, convincing health care personnel that they can enjoy their private lives on their smartphones at the very same time they do justice to their medical obligations to others [69–73].

A hospital, after all, isn’t the same as a restaurant or mall. At least, it never used to be. But the new reality is an electronic one, a Wi-Fi world that knows no borders, where boundaries that used to separate one place and its accepted behavior from another have ceased to exist.

One shocking study [74, 75], for example, revealed that casual smartphone use was all too common during critical surgeries. In cardiopulmonary bypass procedures, 55.6% of perfusionists admitted using their smartphones for personal business during operations: 49.2% of these sent text messages; 21% accessed email; 15.1% surfed the web; and 3.1% checked and posted on social networking sites. While 78.3% of the perfusionists polled expressed concern about the practice, believing it posed a potentially significant safety risk to patients, over half did it anyway! And in another study [76], 54% of nurse anesthetists and residents admitted accessing their computers in the OR even while they were aware that they were being observed. Most, as it turned out, were checking out vacation cruises on the Internet!

According to Dr. Stephen Luczycki, an anesthesiologist and medical director of a surgical intensive care unit at Yale-New Haven Hospital, his colleagues regularly use their ICU computers for “Amazon, Gmail, I’ve seen all sorts of shopping, I’ve seen eBay. You name it, I’ve seen it” [77]. Texting is also all too common and can likewise pose risks [78].

It’s no surprise, therefore, that serious adverse events, including fatalities, have been reported. During one surgery, a patient was left partially paralyzed after the neurosurgeon, while operating, took personal calls on his wireless headset [76, 79, 80]. In another case [81], a 61-year-old woman died during surgery to correct an irregular heartbeat while her anesthesiologist, it is alleged, posted personal messages on Facebook, all the while failing to notice that his patient had low blood-oxygen levels until 15 or 20 minutes after she had turned blue.

While these latter two examples of negligence are egregious, they exemplify the inherent dangers of digital distraction and its potentially tragic consequences.

Cognition and Character

We live in an age of distraction [36, 82, 83] in which a million electronic stimuli, and the promise of more, continually compete for our attention and keep us from focusing on what is most essential [84].

A quarter of a century ago, when Lucian Leape issued his call to acknowledge the prevalence of medical errors and to reexamine their fundamental origin, the full impact of this environment of distraction had not yet been felt.

To maximize patient safety, Leape had urged the redesign of hospital systems and the reorganization of their processes, believing that the key to error reduction was a cultural one. Medical errors, he argued, reflected system flaws, not character flaws.

What Leape did not yet recognize was that the culture of an entire nation was changing under the pervasive influence of addictive speed-of-light technologies. The “culture” of a particular hospital, and the practice of medicine within it, had henceforth to be understood as part of a wider culture that presented its own unprecedented temptations to, and imposed its own unprecedented demands on, every person and every institution.

“Hyperculture medicine” would indeed mean that information would flow faster and more abundantly than ever before, creating new chances for effective treatment and cure. But when interposed between physician and patient, computer screens would undermine previous opportunities for therapeutic interpersonal communication. Simultaneously, ever-present smartphones would beckon to doctors and nurses and invite them to escape stress or boredom by turning to the seductive and addictive devices in their pockets, thereby evading the responsibilities of their jobs.

New vulnerabilities of human cognition were thus exposed, and new issues of individual character were unmasked. Human error could no longer be blamed on old systems alone because the systems themselves had been increasingly subverted by a new kind of culture with its own new set of values, values that cared less about duty and sacrifice than about dataflow and self-gratification.

Henceforth, only tougher regulations to guard against the abuse of technology [85] and an educational system emphasizing personal accountability [86, 87] and self-discipline [88–91] would permit the profession of medicine to meet the extraordinary challenges of the new electronic age.