Tim Sullivan (Junior Professor of Mathematics)
- Free University of Berlin / Zuse Institute Berlin
Beginning with a seminal paper of Diaconis (1988), the aim of so-called "probabilistic numerics" is to compute probabilistic solutions to deterministic problems arising in numerical analysis by casting them as statistical inference problems. For example, numerical integration of a deterministic function can be seen as the integration of an unknown/random function, with evaluations of the integrand at the integration nodes providing partial information about the integrand. Advantages offered by this viewpoint include: access to the Bayesian representation of prior and posterior uncertainties; better propagation of uncertainty through hierarchical systems than simple worst-case error bounds; and appropriate accounting for numerical truncation and round-off error in inverse problems, so that the replicability of deterministic simulations is not confused with their accuracy, which would otherwise yield an inappropriately concentrated Bayesian posterior. This talk will describe recent work on probabilistic numerical solvers for ordinary and partial differential equations, including their theoretical construction, convergence rates, and applications to forward and inverse problems. Joint work with Andrew Stuart (Warwick).
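To make the inference viewpoint concrete, here is a minimal sketch of Bayesian quadrature, the example mentioned above: the integrand is modelled by a Gaussian process prior, node evaluations are the "partial information", and the posterior mean of the integral is a weighted sum of those evaluations. The kernel, length-scale, and node choices below are illustrative assumptions, not taken from the talk.

```python
import numpy as np
from math import erf, sqrt, pi

def bq_posterior_mean(f, nodes, ell=0.3, jitter=1e-8):
    """Bayesian quadrature estimate of the integral of f over [0, 1].

    Under a zero-mean GP prior with squared-exponential kernel
    k(x, x') = exp(-(x - x')^2 / (2 ell^2)), the posterior mean of the
    integral given evaluations y = f(x) at the nodes is z^T K^{-1} y,
    where z_i = int_0^1 k(x, x_i) dx has a closed form via erf.
    """
    x = np.asarray(nodes, dtype=float)
    y = f(x)
    # Gram matrix of kernel evaluations at the nodes (jitter for stability)
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * ell**2))
    K += jitter * np.eye(len(x))
    # Kernel mean embedding of the uniform measure on [0, 1]:
    # int_0^1 exp(-(x - xi)^2 / (2 ell^2)) dx, evaluated via erf
    c = ell * sqrt(pi / 2)
    z = np.array([
        c * (erf((1 - xi) / (ell * sqrt(2))) - erf(-xi / (ell * sqrt(2))))
        for xi in x
    ])
    return z @ np.linalg.solve(K, y)

# Example: integrate sin(pi * x) over [0, 1]; the true value is 2 / pi.
f = lambda x: np.sin(pi * x)
nodes = np.linspace(0.0, 1.0, 12)
est = bq_posterior_mean(f, nodes)
```

Unlike a classical quadrature rule, the same GP posterior also yields a variance for the integral, i.e. a model-based uncertainty over the numerical answer itself.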
Biography: Tim Sullivan obtained his PhD in mathematics from Warwick in 2009, under the supervision of Florian Theil. He did his post-doctoral research at Caltech, under the supervision of Michael Ortiz and Houman Owhadi, funded by the Department of Energy's interdisciplinary Predictive Science Academic Alliance Program. He has been a Warwick Zeeman Lecturer (Assistant Professor) in the Mathematics Institute of the University of Warwick, UK, and is now Junior Professor of Mathematics at the Free University of Berlin and the Zuse Institute Berlin. His main research interests are the mathematical foundations and numerical methods of uncertainty quantification, with a particular interest in the rigorous quantification of uncertainties for 'high-consequence low-information' problems, in which assessments must be made using highly incomplete models and data.