Bayesian methods based on Gaussian process priors are frequently used in statistical inverse problems arising with partial differential equations (PDEs). They can be implemented by Markov chain Monte Carlo (MCMC) algorithms. The underlying statistical models are naturally high- or infinite-dimensional, and this book gives a rigorous mathematical analysis of the statistical performance and algorithmic complexity of such methods in a natural setting of non-linear random-design regression.
Due to the non-linearity present in many of these inverse problems, natural least squares functionals are non-convex, and the Bayesian paradigm presents an attractive alternative to optimisation-based approaches. This book develops a general theory of Bayesian inference for non-linear forward maps and rigorously treats two PDE model examples arising from Darcy’s problem and a Schrödinger equation. The focus is initially on statistical consistency of Gaussian process methods, and then moves on to local fluctuations and approximations of posterior distributions by Gaussian or log-concave measures whose curvature is described by PDE mapping properties of underlying ‘information operators’. Applications to the algorithmic runtime of gradient-based MCMC methods are discussed, as are computation-time lower bounds on the worst-case performance of some algorithms.
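As a toy illustration of the kind of MCMC implementation discussed above, the following sketch runs a preconditioned Crank–Nicolson (pCN) sampler, a standard dimension-robust algorithm for posteriors with Gaussian priors. The forward map `G`, the discretisation dimension, the prior covariance, and the noise level are all illustrative assumptions standing in for a genuine PDE solve; none of these choices are taken from the book.

```python
import numpy as np

# Hypothetical toy setup (NOT from the book): recover a coefficient vector
# theta with Gaussian prior N(0, C) from noisy data y = G(theta) + noise,
# where G is a mildly non-linear stand-in for a PDE forward map.
rng = np.random.default_rng(0)
d = 10                                         # discretisation dimension
C_sqrt = np.diag(1.0 / np.arange(1, d + 1))    # prior covariance square root

def G(theta):
    # stand-in non-linear forward map (a real application would solve a PDE)
    return np.tanh(theta)

theta_true = C_sqrt @ rng.standard_normal(d)
sigma = 0.1
y = G(theta_true) + sigma * rng.standard_normal(d)

def log_lik(theta):
    # Gaussian log-likelihood up to an additive constant
    r = y - G(theta)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

def pcn(n_iter=5000, beta=0.2):
    # pCN proposals preserve the Gaussian prior, so the accept/reject
    # ratio involves only the likelihood, not the prior density.
    theta = np.zeros(d)
    ll = log_lik(theta)
    samples, accepted = [], 0
    for _ in range(n_iter):
        prop = np.sqrt(1 - beta ** 2) * theta + beta * (C_sqrt @ rng.standard_normal(d))
        ll_prop = log_lik(prop)
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
            accepted += 1
        samples.append(theta)
    return np.array(samples), accepted / n_iter

samples, acc_rate = pcn()
posterior_mean = samples[2000:].mean(axis=0)   # discard burn-in
```

A posterior mean computed this way plays the role of the Bayesian point estimator whose consistency and fluctuation behaviour the book analyses; the curvature of the log-likelihood around it is what the ‘information operators’ quantify in the PDE examples.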