1 Introduction

Technology, in common with many other activities, tends toward avoidance of risks by investors. Uncertainty is ruled out if possible. People generally prefer the predictable. Few recognize how destructive this can be, how it imposes severe limits on variability and thus makes whole populations fatally vulnerable to the shocking ways our universe can throw the dice.

Frank Herbert (Heretics of Dune)

This handbook of Uncertainty Quantification (UQ) consists of six chapters, each with its own chapter editor. The choice of these chapters reflects a convergence of opinions on the part of the editors-in-chief and organizes the handbook around methodological developments, algorithms for the statistical exploration of the forward model, sensitivity analysis, risk assessment, codes of practice, and software. Most inference problems of current significance to the UQ community can be assembled using building blocks from these six components. The contributions consist of overview articles of interest to both newcomers and veterans of UQ.

Scientific progress proceeds in increments, and its transformative jumps invariably entail falsifying prevalent theories by comparing predictions from theory with experimental evidence. While this recipe for advancing knowledge remains as effective today as it has been throughout history, its two key ingredients carry within them a signature of their own time and are thus continually evolving. Indeed, predicting and observing the physical world, the two main ingredients of the scientific process, both reflect the perspective of their era and are delineated by its technology.

The pace of innovation across the whole scientific spectrum, coupled with previously unimaginable capabilities to both observe and analyze the physical world, has heightened expectations that the scientific machinery can anticipate the state of the world and can thus serve to improve comfort and health and to mitigate disasters.

Uncertainty quantification is the rational process by which proximity between predictions and observations is characterized. It can be thought of as the task of determining appropriate uncertainties to associate with model-based predictions. More broadly, it is a field that combines concepts from applied mathematics, engineering, computational science, and statistics, producing methodology, tools, and research to connect computational models to the actual physical systems they simulate. In this broader interpretation, UQ is relevant to a wide span of investigations, ranging from seeking detailed quantitative predictions for well-understood and accurately modeled engineering systems to exploratory investigations focused on understanding trade-offs in a new or even hypothetical physical system.
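To make this concrete, the following minimal sketch of forward uncertainty propagation samples an uncertain model input and pushes it through a simple computational model, summarizing the spread induced in the prediction. The model, the lognormal input distribution, and all parameter values are hypothetical choices made only for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical forward model: static deflection of a linear spring, u = F / k.
    def forward_model(k, force=10.0):
        return force / k

    # Uncertain input: stiffness k, modeled here (illustratively) as lognormal.
    k_samples = rng.lognormal(mean=np.log(2.0), sigma=0.1, size=10_000)

    # Monte Carlo propagation: push the input samples through the model.
    u_samples = forward_model(k_samples)

    # Summarize the uncertainty induced in the model-based prediction.
    lo, hi = np.percentile(u_samples, [2.5, 97.5])
    print(f"mean prediction: {u_samples.mean():.3f}")
    print(f"95% interval: [{lo:.3f}, {hi:.3f}]")

Characterizing how such predicted distributions relate to actual observations of the system is the complementary, inverse side of the same task.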

Uncertainty in model-based predictions arises from a variety of sources, including (1) uncertainty in model inputs (e.g., parameters, initial conditions, boundary conditions, forcings); (2) model discrepancy, or inadequacy, due to the difference between the model and the true system; (3) computational cost, which limits the number of model runs and supporting analysis computations that can be carried out; and (4) solution and coding errors. (A common formalization of sources (1) and (2) is sketched just after the list below.) Verification can help eliminate solution and coding errors. Speeding up a model by replacing it with a reduced-order model is a strategy for trading off error/uncertainty between (2) and (3) above. Similarly, obtaining additional data, or higher-quality data, is often helpful in reducing uncertainty due to (1) but will do little to reduce uncertainty from the other sources.

The multidisciplinary nature of UQ makes it ripe for exploiting synergies among the disciplines that comprise this new field. More specifically, for instance,

  • Principles and approaches from different fields can be combined in novel ways to produce effective, synergistic solutions to UQ problems.

  • The features and nuances of a particular application typically call for specific methodological advances and approaches.

  • Novel solutions often emerge from adapting concepts and algorithms from one field of research to another in order to solve a particular UQ problem.

  • The concept of “codesign” – building computational models, analysis approaches, and algorithms with HPC architecture in mind – is natural in UQ research, leading to novel solutions to UQ problems.

  • Every effort to quantify uncertainty can be leveraged to make new studies – modeling efforts, data collections, computational approaches, etc. – more accurate and/or more efficient.
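One common way to make sources (1) and (2) above concrete, in the spirit of the formulation of Kennedy and O'Hagan widely used for computer-model calibration, is to write each observation of the true system as model output plus discrepancy plus measurement error; the notation here is illustrative rather than a handbook-wide convention:

    y(x) = η(x, θ) + δ(x) + ε,

where y(x) is an observation at input condition x, η(x, θ) is the computational model evaluated at calibration parameters θ (source (1) enters through uncertainty in θ and in x), δ(x) is the model discrepancy (source (2)), and ε is measurement error. Sources (3) and (4) enter through the cost and the numerical fidelity of evaluating η itself.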

Managing these trade-offs to best effect, considering computational costs, personnel costs, the cost of data acquisition, etc., depends on the goals of the investigation as well as on the characteristics of the models involved. Unlike more data-driven fields, such as data mining, machine learning, and signal processing, UQ more commonly focuses on leveraging information from detailed models of complex physical systems. Because of this, UQ brings forward a unique set of issues regarding the combination of detailed computational models with experimental or observational data. Quite often such data are limited in availability, tilting the balance toward leveraging the computational models (a toy calibration sketch follows the list below). Key considerations in UQ investigations include

  • The amount and relevance of the available system observations,

  • The accuracy and uncertainty accompanying the system observations,

  • The complexity of the system being modeled,

  • The degree of extrapolation required for the prediction relative to the available observations and the level of empiricism encoded in the model,

  • The computational demands (run time, computing infrastructure) of the computational model,

  • The accuracy of the computational model’s solution relative to that of the mathematical model (numerical error),

  • The accuracy of the computational model’s solution relative to that of the true system (model discrepancy),

  • The existence of model parameters that require calibration using the available system observations,

  • The availability of alternative computational models to assess the impact of different modeling schemes or implementations on the prediction.
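As a toy illustration of the calibration consideration above (and of how limited, noisy observations shape the result), the following sketch computes a grid-based Bayesian posterior for a single calibration parameter. The model form, observations, flat prior, and noise level are all hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical computational model with one calibration parameter theta.
    def model(x, theta):
        return theta * np.sin(x)

    # Sparse, noisy system observations (synthetic, for illustration only).
    x_obs = np.array([0.5, 1.0, 1.5, 2.0])
    theta_true, noise_sd = 1.3, 0.05
    y_obs = model(x_obs, theta_true) + rng.normal(0.0, noise_sd, size=x_obs.size)

    # Grid-based Bayesian calibration: flat prior, Gaussian likelihood.
    theta_grid = np.linspace(0.0, 3.0, 601)
    step = theta_grid[1] - theta_grid[0]
    resid = y_obs[None, :] - model(x_obs[None, :], theta_grid[:, None])
    log_like = -0.5 * np.sum((resid / noise_sd) ** 2, axis=1)
    post = np.exp(log_like - log_like.max())  # flat prior: posterior ∝ likelihood
    post /= post.sum() * step                 # normalize to a density on the grid

    # Posterior mean and a 95% credible interval for theta.
    cdf = np.cumsum(post) * step
    print(f"posterior mean: {np.sum(theta_grid * post) * step:.3f}")
    print(f"95% interval: [{theta_grid[np.searchsorted(cdf, 0.025)]:.3f}, "
          f"{theta_grid[np.searchsorted(cdf, 0.975)]:.3f}]")

The width of the resulting posterior directly reflects the first two considerations on the list: the amount, relevance, and accuracy of the available system observations.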

The concept of a well-posed UQ problem is nucleating in response to the flurry of activity in this field. In particular, the question of whether UQ can be framed as a problem in approximation theory on product spaces, as an optimization problem that relates evidence to decisions, or as a Bayesian inference problem with likelihoods constrained by hierarchical evidence points to a convergence of mathematical rigor, engineering pragmatism, and statistical reasoning, all powered by developments in computational science.
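To fix ideas for the Bayesian framing: given unknowns θ with prior p(θ) and observations d with likelihood p(d | θ), Bayes' rule yields the posterior

    p(θ | d) ∝ p(d | θ) p(θ),

which is exactly the update computed, on a grid, in the calibration sketch above.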

Our initial intent for this introductory chapter was to present a common notation to serve as a unifying thread throughout the handbook. This proved premature; the diversity of the contributions points to a still-nascent field. Although, at present, UQ lacks a coherent general presentation, much as probability theory did before its rigorous formulation by Kolmogorov in the 1930s, the potential for such a development is clear, and we hope that this handbook will contribute to it by presenting an overview of fundamental challenges, applications, and emerging results.