1 Introduction

1.1 How Average Can Be “Below-Average”—An Analog to Remember

Imagine departing Denver International Airport, as I have done many times: the plane lifts and turns west toward massive mountains rising to 14,000 feet. The pilot announces that a new altimeter has been installed that reports the average altitude over the previous 10 min of flight. It is easier to use than the old altimeter and much cheaper. The problem is that an average altitude could mean a below-average flight for you—or perhaps your last!

Like average altitudes, average readings of vital signs may be easy to calculate yet inadequate. Medical practice often relies on point measures or thresholds—beats per minute (bradycardia or not), current temperature (febrile or not), and 750 Hounsfield units (emphysema or not). Averages are easy to calculate, and point measures need be taken only once. But the scientific method does not endorse doing what is easiest; it calls for measures and tools of inquiry appropriate to the subject of study.

The core thesis of this chapter, for both clinical practice and applied research, is that the complexity sciences and their mathematics can significantly improve the discovery and application of biomarkers that improve patient care. This is demonstrated throughout the chapter with a focus on the four ancient standard vital signs—heart rate, respiratory rate, blood pressure, and temperature.

1.2 Why Is It That Complexity Sciences and Mathematics Do Particularly Well in Pursuing Biomarkers?

Complexity sciences provide explanations of particular types of change. They add value by improving insight into previously unrecognized or unanalyzed dynamic patterns and pattern dynamics. These patterns can be emergent and self-organizing, grounded in thermal or nonthermal disequilibrium—with relevance to a myriad of medical topics—and in their biological, bioenergetic, physical, and chemical foundations.

The corollary theoretical proposition of this chapter is that human interactions are characterized by qualities and capacities for interaction with the continuous dynamic variability of the exigencies of existence. Humans exist as patterns of interactions with self, others, and environments—built and natural. Humans experience wellness and illness not just in the sense of resisting change, but in the sense of resilience or endurance to change. Embodied, self-aware humans engage and interact fully with change in ways that proactively leverage and shape change to fit the fundamental requisites of the human organism—to survive, replicate, and thrive.

Complexity sciences support the core proposition that: Continuous dynamic variability of human interactions provides the primary sources of continuous sustained viability.

The various complexity sciences have provided useful insights for improved clinical care. Heart rate variability dynamics (HRVD)—as R–R interval fractal patterns—can provide early detection of neonatal ICU infection/sepsis. Blood pressure variability (BPV) can serve as an indicator of stroke risk and of the efficacy of different hypertension drug classes [3,4,5]. Complexity analytics of the temperature curve have been predictive of survival or mortality [6].

1.3 Vital Signs as Biomarkers

To locate our topic in clinical practice and research, we will use examples involving the long-used standard vital signs—pulse, respiratory rate, blood pressure, and temperature—as biomarkers.

The following definition of biomarkers by Snell and Newbold [7] is used in this chapter:

…any measurement that predicts a patient’s disease state (a diagnostic or prognostic marker) or response to treatment (a clinical endpoint or surrogate for such a measure) can be called a biomarker.

Biomarkers are not just physical samples, nor just measures of direct effect—as shown in this chapter, changes in variability of function can indicate physiological alteration, and changes in variability of structure can identify anatomical alteration. Responding to changes in the “biomarker” HRVD in the context of a febrile patient is about combating imminent sepsis, rather than the direct change of “the measure” HRVD.

1.4 Mathematics of Complexity Sciences

The complexity sciences now comprise multiple research fields, each with its own concepts, methods, measures, and analytics. These include: Complex Responsive Processes [8], Coordination Dynamics [9], and Complex Dissipative Systems [10]. Others include Chaos [11], Complex Adaptive Systems Type I (agents and rules do not change) and Type II (agents and rules can change, innovating what can be done), Synergetics [9, 12], Criticality [13], and Statistical Mechanics (e.g., the Tsallis generalization of Boltzmann–Gibbs applied with wavelet and Tsallis entropy analysis).

Fractal analysis—for example, detrended fluctuation analysis (DFA)—and mathematical information entropies (MSE, SampEn, and Tsallis entropy) have become key contributors over the last 30 years to understanding heart rate variability as a diagnostic and prognostic biomarker.

Fractal analysis is derived from fractal geometry [14] and can be used to analyze time series data (like HRVD and temperature variability) or structures (branching lung structures, vasculatures, and blood clots). Fractals are temporal–spatial scaling symmetries adept at quantifying pattern within seemingly irregular biological forms across the human phenotype (healthy heart rate variations or vasculature patterns).
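To make the time-series side of this concrete, here is a minimal detrended fluctuation analysis (DFA) sketch in Python (a simplified estimator for illustration, not a clinically validated implementation; scales and series are arbitrary choices):

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Estimate the DFA scaling exponent alpha of a 1-D time series.

    Steps: integrate the mean-centered series, split it into windows
    of each scale, detrend each window with a linear fit, and regress
    the log of the RMS fluctuation against the log of the scale.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())              # integrated profile
    fluct = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)     # linear detrend per window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        fluct.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return alpha

# White noise should give alpha near 0.5; a random walk, well above 1.
rng = np.random.default_rng(0)
print(dfa_alpha(rng.standard_normal(4096)))
print(dfa_alpha(np.cumsum(rng.standard_normal(4096))))
```

The exponent summarizes how fluctuations grow with observation scale, which is the quantity reported as T α in the preterm-infant temperature work discussed later.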

Mathematical information entropies, simply stated, quantify complexity as the probability of expected versus unexpected patterns in runs of data. Generally, a higher entropy value indicates a greater degree of complexity, whereas a lower one indicates a greater degree of order. Humans exist in continual critical transitions; in Turing’s terms they are “becoming new patterns” [15]—a dynamic balancing between patterns that are “too orderly” or “too disorderly.”
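The entropy idea can be sketched with a minimal sample entropy (SampEn) implementation (illustrative only; the series and tolerance are arbitrary, and production HRV tools use optimized, validated code):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series.

    Counts pairs of length-m templates that match within tolerance
    r * SD, then asks how often those matches persist at length m + 1.
    Lower values mean more regularity; higher values, more
    unpredictability.
    """
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        n = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            n += np.sum(d < tol)
        return n

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b)

# A strictly periodic series is far more "predictable" than noise.
rng = np.random.default_rng(1)
periodic = np.sin(np.linspace(0, 40 * np.pi, 1000))
noisy = rng.standard_normal(1000)
print(sample_entropy(periodic) < sample_entropy(noisy))
```

The periodic series yields a low entropy (its patterns keep repeating), while the random series yields a high one, which is the “too orderly” versus “too disorderly” axis described above.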

2 Vital Signs Variability

This section will look at pragmatic examples of nonlinear, dynamic complexity sciences analysis of vital signs variability—heart rate, respiratory rate, blood pressure, and temperature variability.

2.1 Heart Rate Variability

Heart rate variability dynamics (HRVD) refers to the continuously variable patterning of the size and frequency of R–R intervals, typically measured in milliseconds. As a dynamic biomarker, this provides very different information than the often-used “average variability,” “standard deviation,” or other summary statistics (Fig. 1). Averaging removes and obscures information. An important example highlighting the advantage of dynamic analysis of HRVD is the earlier detection and diagnosis of infection and/or sepsis. Time is particularly vital for effective treatment of infection, and especially sepsis.

Fig. 1
figure 1

Mean and SD do not distinguish a healthy from a diseased heart. Mean and standard deviation of heartbeats per minute may not differentiate between health (A): mean 65.0, SD = 4.8, and disease (B): mean 65.0, SD = 4.7. From variability and complexity, Goldberger and DaCosta, www.physionet.org
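The figure’s point, that summary statistics can hide dynamics, is easy to reproduce: the toy series below share identical mean and SD, yet a simple order-sensitive measure (lag-1 autocorrelation) tells them apart. All values here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# A smoothly oscillating "heart rate" series around 65 bpm
hr_structured = 65 + 4.8 * np.sin(np.linspace(0, 20 * np.pi, 600))
# Same values with temporal order destroyed: mean and SD are unchanged
hr_shuffled = rng.permutation(hr_structured)

def lag1_autocorr(x):
    """Correlation between each value and its successor."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

print(np.isclose(hr_structured.mean(), hr_shuffled.mean()))  # same mean
print(np.isclose(hr_structured.std(), hr_shuffled.std()))    # same SD
print(lag1_autocorr(hr_structured) > 0.9)   # strong temporal order
print(abs(lag1_autocorr(hr_shuffled)) < 0.2)  # order destroyed
```

Any analytic built only from the value distribution (mean, SD, percentiles) is blind to the difference; dynamic measures such as entropy or DFA are not.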

2.1.1 HRVD for Neonate Infection and Sepsis, Clinical Use, and Significance

A heart rate characteristics (HRC) biomarker is in daily clinical use in over 30 neonatal ICUs across the USA and in Europe [16]. Combined with other patient risk data, it helps provide up to a 24-h added window of diagnostic and prognostic warning of imminent infection/sepsis. Mortality reductions of around 20% have been achieved [17]. The index measures heart rate dynamics with SampEn, standard deviation, and sample asymmetry (transient accelerations vs. transient decelerations). Clinical use was preceded by a multiyear NIH-sponsored RCT with 3000 infants at nine ICUs [17].
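For illustration, a simplified version of the sample-asymmetry idea (quadratically weighted deviations above vs. below the median R–R interval) can be sketched as follows. This is a sketch of the concept, not the published HRC formula, and the series is synthetic:

```python
import numpy as np

def sample_asymmetry(rr_ms):
    """Simplified sample asymmetry of an R-R interval series.

    Weight deviations from the median quadratically, and compare the
    contribution of values above the median (transient decelerations,
    longer R-R intervals) with the contribution below it
    (accelerations). Ratios well above 1 reflect the
    deceleration-dominated pattern reported before neonatal sepsis.
    """
    rr = np.asarray(rr_ms, dtype=float)
    dev = rr - np.median(rr)
    r_decel = np.sum(dev[dev > 0] ** 2)   # longer-than-median intervals
    r_accel = np.sum(dev[dev < 0] ** 2)   # shorter-than-median intervals
    return r_decel / r_accel

rng = np.random.default_rng(3)
baseline = 450 + 10 * rng.standard_normal(500)   # symmetric variability
decels = baseline.copy()
decels[::25] += 80                               # sporadic decelerations
print(sample_asymmetry(baseline))   # near 1
print(sample_asymmetry(decels))     # well above 1
```

The point is that the statistic is sensitive to the *shape* of the variability, not merely its amount.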

In contrast to the information HRVD provides about infection/sepsis, the systemic inflammatory response syndrome (SIRS) criteria have included a heart rate of ≥90 beats/min, along with a respiratory rate of ≥20 breaths/min or PaCO2 ≤ 32 mm Hg. In the latest, 3rd Sepsis Guidelines, heart rate and respiratory rate have been dropped as not contributing significantly. However, these measures were not replaced by their variability counterparts—indeed, it appears this has never been considered.

2.1.2 HRV Dynamics Potential for Increased Prognostic Window of Infection/Sepsis in Ambulatory Bone Marrow Transplant (BMT) Recipients

The compelling HRV dynamics tracings below show ambulatory BMT recipients at high risk for sepsis: 14 who developed infection/sepsis and three who remained healthy. The HRVD of those who developed infection/sepsis declined and became more orderly (in red)—calculated using a discrete wavelet transformation (other analytics, including DFA, confirmed the results)—and showed a 35-h prognostic window before the clinical onset of infection. These findings illustrate the strikingly large window for earlier treatment that HRVD might afford in diagnosing and treating infection and/or sepsis—and the attendant implications for outcomes (Fig. 2).

Fig. 2
figure 2

Heart rate variability (HRV) in patients with sepsis. Solid black line indicates treatment administered, taken as the point of sepsis diagnosis in these high-risk, immunocompromised patients. Dotted black line indicates that HRV has dropped 25% below baseline (reproduced from Ahmad et al. Continuous Multi-Parameter Heart Rate Variability Analysis Heralds Onset of Sepsis in Adults, PLoS ONE. 2009;4(8):e6642. (Creative Commons Attribution License))

2.2 Blood Pressure Variability

Common practice in blood pressure monitoring focuses almost exclusively on mean—generally interpreted as “true”—blood pressure. However, research on blood pressure variability by Peter Rothwell and colleagues over the last decade [3,4,5] has shown that BPV is a better prognostic indicator of stroke risk. Their work also showed differential effects of antihypertensive drug classes on BPV, concluding that:

The opposite effects of calcium-channel blockers [reduction] and β blockers [increase] on variability of blood pressure account for the disparity in observed effects on risk of stroke and expected effects based on mean blood pressure. To prevent stroke most effectively, blood-pressure-lowering drugs should reduce mean blood pressure without increasing variability; ideally they should reduce both [4].

These findings have implications for the design of “flexipills” (i.e., compounding several drugs in one tablet) for the treatment of hypertension and the prevention of stroke. As O’Brien [18], a member of the Rothwell group, explains, such combinations give physicians the ability to achieve two treatment objectives—lowering mean blood pressure and reducing BPV.

Blood pressure variability (BPV), which predicts cardiovascular outcome, especially stroke, should be considered as a target for treatment. The recent introduction of variable doses of combination drugs in “flexipills” …provides a means of not only lowering BP, but of also reducing BPV by using medication with contrasting modes of action. Recently, amlodipine/perindopril has been shown to significantly reduce total and cardiovascular mortality, compared with atenolol/diuretic.

These findings show that average can be below-average when a single measure is tied to a desired outcome—obsessional attention to mean blood pressure ignores the role of BP variability.

Peter Rothwell argues that the management of hypertension has been clouded by the fact that physicians and scientists have been distracted from consideration of variability by giving obsessional attention to mean blood pressure (BP). The hypertension guidelines, which insist on reduction of BP per se and remove BP variability from consideration, may have done science a disservice: although mean BP makes a very valuable contribution to outcome, it does not always account fully for the benefit of therapeutic intervention, which might also be due, in part, to a reduction in BPV [19, p. 25].

Shifting or broadening the perspective to include variability of a vital sign, like blood pressure, has implications for clinical practice. As O’Brien [19] observed:

…the sternest historical indictment from future generations will be directed at our insistence on permitting isolated BP measurements to dictate diagnostic and management policies of hypertension in the light of an abundance of evidence …How many patients have been subjected to unnecessary or inappropriate therapy and continue to be so mismanaged at the time of writing.

O’Brien cites predecessors to current BPV studies, particularly the landmark publication in 1904 of Theodore Janeway’s work and the later work during the 1960s by Pickering’s Oxford group. Their research showed marked elevation in BP during doctor visits—the “white-coat effect”—resulting in much unnecessary prescribing, overtreatment, and patient harm.

Ambulatory blood pressure monitoring, introduced in the 1960s, provides multiple measures that take temporal and contextual conditions into account, giving a reasonable picture of BPV. Rothwell’s team [3, 4] limited their analysis to summary statistical methods; future work on BP variability will likely use more advanced analytics as this applied research matures. More important than issues of measurement is the general acceptance of the importance of variability for clinical care.
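As a sketch of the kind of summary BPV statistics used in such visit-to-visit analyses, the snippet below computes SD, coefficient of variation, and average real variability (the readings are invented; the two series are constructed to have the same mean):

```python
import numpy as np

def bpv_metrics(sbp):
    """Common visit-to-visit blood pressure variability summaries.

    sd  : standard deviation across visits
    cv  : coefficient of variation (SD scaled by the mean)
    arv : average real variability, the mean absolute change between
          consecutive readings, which, unlike SD, is sensitive to the
          order of the measurements.
    """
    x = np.asarray(sbp, dtype=float)
    return {
        "sd": float(x.std(ddof=1)),
        "cv": float(x.std(ddof=1) / x.mean()),
        "arv": float(np.mean(np.abs(np.diff(x)))),
    }

steady = [142, 141, 143, 142, 140, 143]     # stable across visits
swinging = [128, 156, 130, 158, 126, 153]   # same mean, volatile
print(bpv_metrics(steady))
print(bpv_metrics(swinging))
```

A mean-only view treats the two patients as identical; every variability summary separates them.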

2.3 Temperature Curve Variability

The clinical thermometer came into routine use in the 1800s with the innovative reduction of its length from a foot to about 5 in. and of reading time from twenty minutes to five [20]. The expected human temperature, discerned over the history of temperature taking, is generally viewed as a set-point of, or range around, 98.6 °F (about 37 °C).

Varela [21] noted the challenge arising from temperature averages—“Measuring body temperature is one of the oldest clinical tools available. Nevertheless, after hundreds of years of experience with this tool, its diagnostic and prognostic value remains limited.” Papaioannou [22] pointed out the paradox that temperature is in fact a “continuous quantitative variable, [but] its measurement has been considered [only as a dichotomous] snapshot of a process, indicating …[a] febrile or afebrile [state].”

The works of Varela, Papaioannou, and others illustrate the “nonlinear dynamic turn,” in which researchers across the sciences are turning away from averages, set-points, and linear approximations of data arising from complex adaptive systems. Temperature curve complexity analysis mines the data between temperature measures to improve diagnostic and prognostic insights. Varela and colleagues [23] showed that temperature curve complexity in critically ill patients suffering multiple organ failure has high clinical utility:

  • An inverse correlation between the clinical status and ApEn temperature curve complexity

  • Reduced mean and minimum ApEn complexity indicating high likelihood of fatal outcome

  • An increase of 0.1 units in minimum or mean ApEn increased the odds of survival 15.4- and 18.5-fold, respectively

Papaioannou [22] confirmed Varela et al.’s [23] earlier findings, showing that

…complexity analysis of temperature signals can assess inherent thermoregulatory dynamics during systemic inflammation and has increased discriminating value in patients with infectious versus non-infectious conditions, probably associated with severity of illness.

and that:

…the analysis of a continuously monitored temperature curve in critically ill patients using sophisticated techniques from signal processing theory, such as wavelets and multiscale entropy, was able to discriminate patients with systemic inflammatory response syndrome (SIRS), sepsis and septic shock with an accuracy of 80% [25].

2.3.1 Temperature Curve Variability in Preterm Infants

Preterm infants are at increased risk of morbidity and mortality due to poor control of their body temperature. Temperature curve complexity dynamics, expressed as the scaling exponent α (T α) calculated from DFA of a temperature time series, showed a negative association with gestational age and a positive association with the need for ventilatory support. T α provides an assessment of an infant’s autonomic maturity and disease severity. These findings point to possible added diagnostic and prognostic insights, including improved decision-making about when to transfer a neonate from the incubator to an open cot, a decision currently based largely on weight and trial and error [26].

2.4 A Multiple Variability Biomarker Index for Sepsis

These findings suggest that it might be possible to create a multiple variability biomarker index for sepsis status and morbidity and mortality risk incorporating HRC/HRV, BPV, and temperature curve variability.
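As a purely hypothetical sketch, such an index might combine direction-signed z-scores of the individual variability markers. Every reference value, weighting, and input below is invented for illustration; building a real index would require the kind of validation behind the HRC monitor:

```python
import numpy as np

def variability_index(hrv_sampen, temp_apen, sbp_sd, reference):
    """Hypothetical composite variability index (illustration only).

    Each input is converted to a z-score against reference (mean, SD)
    pairs, signed so that the direction associated with worse outcome
    in the studies cited above (lower HRV entropy, lower temperature
    curve complexity, higher systolic BPV) increases the index.
    """
    z = lambda value, key: (value - reference[key][0]) / reference[key][1]
    score = (-z(hrv_sampen, "hrv")     # loss of HRV complexity
             - z(temp_apen, "temp")    # loss of temperature complexity
             + z(sbp_sd, "bpv"))       # elevated BP variability
    return score / 3.0

# Illustrative reference statistics (made up for this sketch)
ref = {"hrv": (1.2, 0.3), "temp": (0.9, 0.2), "bpv": (8.0, 3.0)}
healthy = variability_index(1.3, 1.0, 7.0, ref)
at_risk = variability_index(0.6, 0.4, 14.0, ref)
print(healthy < at_risk)
```

The design question is not the arithmetic but the reference populations, weights, and outcome validation each component would need.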

3 The Fractal Nature of Structure and Function

The fractal nature of structure and function allows for the most energy-efficient vital organ function. This section briefly describes the fractal branching structure of the airways and its effect on respiratory physiology.

3.1 The Fractal Nature of the Branching Airways

As a measure, fractal dimension (DF) quantifies how the branching airways make more use of the available chest cavity than an equivalent Euclidean structure could [27]. Calculating DF can distinguish healthy from diseased lungs [28] (Fig. 3).

Fig. 3
figure 3

Fractal dimension distinguishes healthy and diseased lungs: a measure to quantify and differentiate healthy vs diseased lung. (A) and (B) Healthy control subject, fractal dimension = 1.83. (C) and (D) Asthma patient, fatal attack, reduced fractal dimension = 1.72

The advantage of the fractal lung structure seems to be at least twofold for a given volume: the surface exchange area is increased and transport costs are reduced. Other branching structures, such as the retinal vasculature, also tap into this advantage.
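The DF used in such image studies can be estimated by box counting. Here is a minimal sketch on simple Euclidean test shapes (not lung images), just to show the mechanics of the estimator:

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Box-counting estimate of the fractal dimension of a binary image.

    Count the boxes of each size that contain any foreground pixel,
    then fit log(count) against log(1/size); the slope estimates DF.
    """
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log([1.0 / s for s in sizes]), np.log(counts), 1)
    return slope

# Sanity checks: a filled square has DF = 2; a straight line, DF = 1.
square = np.ones((256, 256), dtype=bool)
line = np.zeros((256, 256), dtype=bool)
line[128, :] = True
print(round(box_counting_dimension(square), 1))   # 2.0
print(round(box_counting_dimension(line), 1))     # 1.0
```

A branching airway tree segmented from an image would land between these extremes, with disease shifting the estimate, as in the figure above.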

3.2 Respiratory Rate Variability

Papaioannou and colleagues [29] used several complexity analytics to study the difficulties of weaning from mechanical ventilation: DFA, SampEn, fractal dimension (here applied to time series), and the largest Lyapunov exponent (a measure of phase-space divergence in the three-dimensional space of system states). They concluded that “…complexity analysis of respiratory signals can assess inherent breathing pattern dynamics and has increased prognostic impact upon weaning outcome in surgical patients.” The team thought the advantages of variability analysis included:

  • Observing over longer time periods

  • Different perspective—why and how much values deviate from the mean

  • Continuous real-time information for any weaning process point

They did not assume stationary time series behavior.

3.2.1 Biologically Variable Artificial Ventilation

Biologically Variable Artificial Ventilation describes the use of noisy mechanical ventilation. This approach overcomes the disadvantages of set-point, fixed-parameter ventilation by mimicking the breathing patterns of healthy people. As Brewster et al. [30] pointed out:

Most ventilators monotonously deliver the same sized breaths, like clockwork; however, healthy people do not breathe this way. This has led to the development of a biologically variable ventilator—one that incorporates noise.

They applied a static compliance function [31] with insights from Jensen’s inequality about nonlinear averaging [32] to determine the best way to add “noise” to vary ventilation rate and tidal volume.

Adding “noise” leads to “higher mean volume (at the same mean pressure) or lower mean pressure (at the same mean volume) …[resulting in] enhanced gas exchange or less stress on the lungs” [31]. The fractal nature of structure and function improves respiratory performance—increased gas exchange area at reduced respiratory effort—and improves post-anesthetic outcomes (Fig. 4).

Fig. 4
figure 4

Graphs illustrating Jensen’s inequality. From Denny [32]. (a) Linear function: the average value equals the function evaluated at the average input. (b) Convex function: the average value is greater than the function evaluated at the average input. (c) Concave function: the average value is less than the function evaluated at the average input
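The “noise benefit” described by Brewster et al. follows from Jensen’s inequality applied to the convex lower region of a sigmoidal pressure–volume curve. The toy model below (all parameters invented) makes that concrete: with the same mean driving pressure, the noisy input yields a higher mean volume:

```python
import numpy as np

def lung_volume(pressure, v_max=1.0, p0=15.0, s=4.0):
    """Toy sigmoidal static pressure-volume curve.

    Below the inflection pressure p0 the curve is convex, the regime
    in which Jensen's inequality predicts that a variable ("noisy")
    driving pressure yields a higher mean volume than a fixed pressure
    with the same mean.
    """
    return v_max / (1.0 + np.exp(-(pressure - p0) / s))

rng = np.random.default_rng(4)
mean_p = 8.0
fixed_volume = lung_volume(mean_p)
noisy_p = mean_p + rng.uniform(-4.0, 4.0, 10_000)  # noise around same mean
noisy_volume = lung_volume(noisy_p).mean()
print(noisy_volume > fixed_volume)
```

Above the inflection point the curve turns concave and the inequality reverses, which is why where on the curve ventilation operates matters.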

4 Jensen’s Inequality—or the “Fallacy of the Average”

This section explains in more detail the Jensen’s inequality proposition used above by Brewster et al. [30] for Biologically Variable Mechanical Ventilation. Jensen’s inequality is a durable example (dating to 1906!) of the difficulties that arise in biological and medical inquiry from (1) averages and summary statistics and (2) assumed bell-shaped/normal distributions. Denny [32] states:

Biologists often cope with variation in physiological, environmental and ecological processes by measuring how living systems perform under average conditions. However, performance at average conditions is seldom equal to average performance across a range of conditions. This basic property of non-linear averaging—known as ‘Jensen’s inequality’ or ‘the fallacy of the average’—has important implications for all of biology.

Averages are not necessarily helpful to detect, diagnose, or extrapolate information from typically nonlinear biological processes; more importantly, they may overlook proper therapeutic choices or suggest incorrect or harmful treatments. This is clearly seen in Brewster et al.’s [30] analysis of mechanical ventilation, where the lung injury produced by fixed-parameter “monotonous” ventilation is prevented by biologically variable “noise-assisted” ventilation.
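Denny’s point can be shown in a few lines: with a convex response curve, average performance exceeds performance at the average condition (a concave curve reverses the inequality). The numbers are illustrative:

```python
import numpy as np

temps = np.array([10.0, 20.0, 30.0])    # variable conditions
rate = lambda t: 0.02 * t ** 2          # convex "performance" curve

performance_at_average = rate(temps.mean())  # f(mean input) = 8.0
average_performance = rate(temps).mean()     # mean of 2.0, 8.0, 18.0
print(round(float(performance_at_average), 2),
      round(float(average_performance), 2))
```

Measuring the system only at the average condition (20 degrees) underestimates its average performance across the conditions it actually experiences.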

Uses of Jensen’s inequality occur across research fields and disciplines. Its use is growing in ecology and evolutionary studies, for example to better explain population dynamics. The evaluation of risk and uncertainty in the “human generated financial asset market” likewise reveals the flaw—or fallacy—of averages: they are below average in usefulness for evaluating nonlinear functions [33].

Denny [32] concludes:

Because nature is variable and biological response functions are typically nonlinear, it is dangerous to assume that average performance is equal to the performance under average conditions. Ecological physiologists and evolutionary biologists have heeded this warning in their attempt to predict the effects of the looming shifts in Earth’s climate. For example …increased variance in temperature is likely to have greater impact [on species] than the increase in average temperature.

Jensen’s inequality provides the mathematical scaffolding for the role the complexity sciences are playing in biomarker research, and it cautions that nonlinear averaging should supersede the use of simple averages and linear assumptions.

5 Implications

The future is already here, it’s just unevenly distributed.

William Gibson [34]

Vital signs variability analysis provides major advantages over averages, point measures, and threshold indicators in detecting important diagnostic and prognostic signs; relying on the latter, as demonstrated herein, risks missed diagnostic and prognostic insights. These insights raise two important questions:

  • What is the cost to the patient, in morbidity and mortality, of missing the early signs of sepsis in the context of constant time pressures on care staff? and

  • How can we design better search strategies for variability biomarkers?

Ongoing applied research will find added uses for the integration of dynamic biomarkers. The “Nonlinear Dynamic Turn” is not just a matter of better data analysis compared to mechanistic models [35]; it also entails changing our fundamental assumptions about the nature of health, wellness, and illness—we need a new theory of disease. The abductive question that comes to mind is:

  • If omnipresent variability characterizes human interactions and requires proper analytics, then what abductive explanation, what theory, describes the ontological fundamentals of being human as having the qualities and capacities that enable continuous variability for continuous viability?

Besides these philosophical questions, we also need pragmatic questions to enhance the future of clinical care, such as:

  • What applied research can develop a biomarker variability index for infection/sepsis? Candidate markers could include [36, 37]:

    • HRVD (infection/sepsis risk, vasopressor independence)

    • Temperature curve variability (survival prospects)

    • Systolic BPV (28-day mortality)

    • Clot structure variability (sepsis coagulopathies)

  • How do humans actually embrace and leverage continuous change, rather than merely survive or remain resilient to it? How does the embrace of change help generate persistence?

  • A thread to follow here is that fractals are “fractal temporal–spatial scaling symmetries” [14]. Mathematical symmetries, in a simple definition, describe actions taken on a particular object that leave that object unchanged in its essential aspects [9]. For example, the heart as it was at the start of the day resembles the heart as it still is at the end of the day (ceteris paribus). How do fractal symmetries help us understand that persistence?

The implications of bio-symmetries of interaction point to the entrained presence (pari passu, as to biological rather than physics symmetries, q.v. Longo & Montévil [38]) of conservation laws (inherent in symmetry mathematics), here in the biological context of energy and momentum. This indicates, importantly, that when an integrated bio-symmetries variability index is altered—episodically, as in sepsis, or chronically, as in COPD—information may be present to investigate correlated disruptions of useful energy generation in the body, e.g., the higher resting energy expenditure (REE) seen in various chronic diseases.

The Transformative Aspects of This Study

  • Consolidates evidence-based examples of current or potential advantageous clinical use of particular complexity sciences and mathematics to improve care processes and outcomes.

  • Highlights a general mathematical analysis issue of using averages across nonlinear functions by illustrating Jensen’s inequality.

  • Points to integrative potential and lines of research on the implications of biological mathematical symmetries as characterized by the ubiquitous usefulness of analysis by fractal temporal–spatial scaling symmetries.

Take Home Message

  • Complexity sciences, applied as Dynamic Variability Analysis, have already improved clinical care through variability biomarkers.

  • More resources should be used to apply complexity sciences to search for and accelerate the clinical use of dynamic biomarkers.

  • The “Nonlinear Turn” of improved dynamic analytics of process and structure (epistemology) invokes the abductive question: How do we develop a matching processual human ontology—what does it mean to be continuously dynamically variable?