Probabilistic numerics and uncertainty in computations.
View Article:
PubMed Central - PubMed
Affiliation: Department of Empirical Inference , Max Planck Institute for Intelligent Systems , Tübingen, Germany.
ABSTRACT
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
Mentions: We term the probabilistic numeric approach to quadrature Bayesian quadrature. Diaconis [12] may have been the first to point out a clear connection between a Gaussian process regression model and a deterministic quadrature rule, an observation subsequently generalized by Wahba [22, §8] and O'Hagan [23], and also noted by [24]. Details can be found in these works; here we construct an intuitive example highlighting the practical challenges of assigning uncertainty to the result of a computation. For concreteness, consider the function f plotted in black in figure 1a. Evidently, f has a compact symbolic form and f(x) can be computed for virtually any x ∈ ℝ in nanoseconds. It is a wholly deterministic object. Nevertheless, the real number

F = ∫₋₃³ f(x) dx   (2.1)

has no simple analytic value, in the sense that it cannot be natively evaluated in low-level code. Quadrature rules offer 'black box' estimates of F. These rules have been optimized so heavily (e.g. [21]) that they could almost be called 'low level', but their results do not come with the strict error bounds of floating-point operations; instead, assumptions about f are necessary to bound the error. Perhaps the simplest quadrature rule is the trapezoid rule, which amounts to linear interpolation of f (red line in figure 1a(i)): evaluate f(xᵢ) on a grid of N points and compute

F̂_trapezoid = Σᵢ₌₂ᴺ ½ [f(xᵢ) + f(xᵢ₋₁)] (xᵢ − xᵢ₋₁).   (2.2)
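The trapezoid rule (2.2) and its Gaussian-process reinterpretation can both be sketched in a few dozen lines. The sketch below is ours, not the paper's code: the integrand e^(−x²) is a stand-in (the paper's f appears only in its figure), and the squared-exponential kernel with lengthscale 0.5 is an arbitrary modelling assumption chosen because its integrals against the kernel have closed forms via the error function. Under that prior, the Bayesian-quadrature estimate of F is a weighted sum of the same evaluations f(xᵢ), with weights K⁻¹z, plus a posterior variance quantifying the numerical uncertainty.

```python
import math

def trapezoid(f, a, b, n):
    """Trapezoid rule (2.2): linear interpolation of f on an n-point grid."""
    xs = [a + (b - a) * i / (n - 1) for i in range(n)]
    return sum(0.5 * (f(xs[i]) + f(xs[i - 1])) * (xs[i] - xs[i - 1])
               for i in range(1, n))

def rbf(x, y, ell):
    """Squared-exponential covariance with lengthscale ell (our choice)."""
    return math.exp(-(x - y) ** 2 / (2.0 * ell ** 2))

def kernel_mean(xi, a, b, ell):
    """Closed form for z_i = integral over [a, b] of k(x, x_i) dx."""
    s = ell * math.sqrt(2.0)
    return ell * math.sqrt(math.pi / 2.0) * (math.erf((b - xi) / s)
                                             - math.erf((a - xi) / s))

def solve(A, rhs):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            fac = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= fac * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def bayesian_quadrature(f, a, b, xs, ell):
    """GP posterior mean and variance for F = integral of f over [a, b]."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], ell) + (1e-9 if i == j else 0.0)  # jitter
          for j in range(n)] for i in range(n)]
    z = [kernel_mean(x, a, b, ell) for x in xs]
    w = solve(K, z)                          # quadrature weights K^{-1} z
    mean = sum(wi * f(xi) for wi, xi in zip(w, xs))
    # Prior variance of F: double integral of the kernel over [a, b]^2.
    L = b - a
    kk = (L * ell * math.sqrt(2.0 * math.pi) * math.erf(L / (ell * math.sqrt(2.0)))
          + 2.0 * ell ** 2 * (math.exp(-L ** 2 / (2.0 * ell ** 2)) - 1.0))
    var = kk - sum(wi * zi for wi, zi in zip(w, z))
    return mean, var

# Illustration on a stand-in integrand with a known value for comparison:
g = lambda x: math.exp(-x * x)
exact = math.sqrt(math.pi) * math.erf(3.0)   # integral of e^{-x^2} over [-3, 3]
grid = [-3.0 + 6.0 * i / 14 for i in range(15)]
est_trap = trapezoid(g, -3.0, 3.0, 15)
est_bq, var_bq = bayesian_quadrature(g, -3.0, 3.0, grid, ell=0.5)
```

Both estimators consume the same 15 evaluations of g; the probabilistic one additionally returns var_bq, an explicit (prior-dependent) measure of how wrong the estimate might be, which is the quantity the deterministic rule cannot supply.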