Probabilistic numerics

July 13, 2023 — September 25, 2023

calculus, dynamical systems, geometry, Hilbert space, how do science, Lévy processes, machine learning, neural nets, physics, regression, sciml, SDEs, signal processing, statistics, statmech, stochastic processes, surrogate, time series, uncertainty
Probabilistic Numerics claims:

Probabilistic numerics (PN) aims to quantify uncertainty arising from intractable or incomplete numerical computation and from stochastic input. This new paradigm treats a numerical problem as one of statistical inference instead. The probabilistic viewpoint provides a principled way to encode structural knowledge about a problem. By giving an explicit role to uncertainty from all sources, in particular from the computation itself, PN gives rise to new applications beyond the scope of classical methods.

Typical numerical tasks to which PN may be applied include optimization, integration, the solution of ordinary and partial differential equations, and the basic tasks of linear algebra, e.g. solution of linear systems and eigenvalue problems.
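
To make the "computation as inference" idea concrete, here is a minimal Bayesian-quadrature sketch in the spirit of O'Hagan (1991): place a GP prior on the integrand, condition on a few evaluations, and read off a Gaussian posterior over the integral. Everything here (the squared-exponential kernel, the uniform measure on [0, 1], the toy integrand, the node placement) is my own illustrative choice, not any particular library's API.

```python
# Bayesian quadrature as GP inference over Z = ∫_0^1 f(x) dx (toy sketch).
import numpy as np
from scipy.special import erf

def k(a, b, ell=0.2):
    """Squared-exponential kernel."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def kernel_mean(x, ell=0.2):
    """z_i = ∫_0^1 k(x, x_i) dx, closed form for the SE kernel and uniform measure."""
    c = np.sqrt(np.pi / 2) * ell
    return c * (erf((1 - x) / (np.sqrt(2) * ell)) - erf((0 - x) / (np.sqrt(2) * ell)))

def kernel_double_integral(ell=0.2):
    """∫_0^1 ∫_0^1 k(x, x') dx dx' for the SE kernel."""
    return (np.sqrt(2 * np.pi) * ell * erf(1 / (np.sqrt(2) * ell))
            + 2 * ell**2 * (np.exp(-1 / (2 * ell**2)) - 1))

f = lambda x: np.sin(3 * x) + x**2       # integrand we pretend is expensive
X = np.linspace(0.05, 0.95, 7)           # a handful of evaluation nodes
y = f(X)

K = k(X, X) + 1e-10 * np.eye(len(X))     # jitter for numerical stability
z = kernel_mean(X)
w = np.linalg.solve(K, z)                # quadrature weights K^{-1} z

post_mean = w @ y
post_var = kernel_double_integral() - z @ w

print(f"posterior over the integral: N({post_mean:.4f}, {post_var:.2e})")
# The posterior variance is an explicit model of the discretization error
# remaining after only 7 evaluations of f.
```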

As well as offering an enriched reinterpretation of classical methods, the PN approach has several concrete practical points of value. The probabilistic interpretation of computation

  • allows the construction of customized methods for specific problems via bespoke priors
  • formalizes the design of adaptive methods using tools from decision theory
  • provides a way of setting parameters of numerical methods via the Bayesian formalism
  • expedites the solution of mutually related problems of similar type
  • naturally incorporates sources of stochasticity in the computation
  • can express structured uncertainty via a full probability measure, rather than a bare error estimate

and finally it offers a principled approach to including numerical error in the propagation of uncertainty through chains of computations.
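
A hedged sketch of that last point: because the quadrature result above is a full Gaussian posterior rather than a point estimate, its numerical error can be pushed through a downstream computation like any other source of uncertainty. The downstream map g and the stand-in posterior moments below are hypothetical.

```python
# Propagating numerical error from an upstream probabilistic solver (toy sketch).
import numpy as np

rng = np.random.default_rng(0)
post_mean, post_var = 0.99, 1e-4          # stand-ins for the quadrature posterior above
Z_samples = rng.normal(post_mean, np.sqrt(post_var), size=10_000)
g_samples = np.exp(-Z_samples)            # hypothetical downstream computation g(Z) = exp(-Z)

print(f"downstream estimate: {g_samples.mean():.4f} ± {g_samples.std():.4f}")
# The ± term now reflects numerical error in the upstream integral,
# not only stochasticity in the inputs.
```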

1 Questions

Connection to sparse GPs?

2 Incoming

Herding: Chai et al. (2019); Welling (2009).
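
For orientation, a toy kernel-herding loop in the spirit of Welling (2009): greedily pick the point where the gap between the target's kernel mean embedding and the embedding of the points chosen so far is largest. Huszár and Duvenaud (2016) show that the optimally weighted version of this scheme is exactly Bayesian quadrature. The kernel, discretized target density, and greedy rule below are my own simplifications.

```python
# Kernel herding on a 1-D grid (toy sketch, not any library's implementation).
import numpy as np

def k(a, b, ell=0.2):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

grid = np.linspace(0, 1, 400)                       # candidate points
target = np.exp(-0.5 * (grid - 0.5) ** 2 / 0.1**2)
target /= target.sum()                              # discretized target density p(x)

mu_p = k(grid, grid) @ target                       # kernel mean embedding of p at each candidate
chosen = []
for t in range(10):
    if chosen:
        mu_q = k(grid, np.array(chosen)).mean(axis=1)   # embedding of points chosen so far
    else:
        mu_q = np.zeros_like(grid)
    chosen.append(grid[np.argmax(mu_p - mu_q)])         # greedy herding update

print(np.round(chosen, 3))                          # points spread out to match p in the RKHS
```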

Connection to ensemble Kalman methods?

3 References

Bach. 2015. “On the Equivalence Between Kernel Quadrature Rules and Random Feature Expansions.”
Chai, Ton, Garnett, et al. 2019. “Automated Model Selection with Bayesian Quadrature.”
Hennig, Ipsen, Mahsereci, et al. 2022. “Probabilistic Numerical Methods - From Theory to Implementation (Dagstuhl Seminar 21432).” Edited by Philipp Hennig, Ilse C.F. Ipsen, Maren Mahsereci, and Tim Sullivan. Dagstuhl Reports.
Hennig, Osborne, and Girolami. 2015. “Probabilistic Numerics and Uncertainty in Computations.” Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences.
Hennig, Osborne, and Kersting. 2022. Probabilistic Numerics: Computation as Machine Learning.
Huszár, and Duvenaud. 2016. “Optimally-Weighted Herding Is Bayesian Quadrature.”
O’Hagan. 1991. “Bayes–Hermite Quadrature.” Journal of Statistical Planning and Inference.
Song, Zhang, Smola, et al. 2008. “Tailoring Density Estimation via Reproducing Kernel Moment Matching.” In Proceedings of the 25th International Conference on Machine Learning. ICML ’08.
Welling. 2009. “Herding Dynamical Weights to Learn.” In Proceedings of the 26th Annual International Conference on Machine Learning. ICML ’09.