### Welcome!

I am **Associate Professor in Predictive Modelling** in the Mathematics Institute and School of Engineering at the University of Warwick.
I have wide interests in uncertainty quantification in the broad sense, understood as the meeting point of numerical analysis, applied probability and statistics, and scientific computation.
On this site you will find information about how to contact me, my research, publications, and teaching activities.

### Autoencoders in function space

Justin Bunker, Mark Girolami, Hefin Lambley, Andrew Stuart and I have just uploaded a preprint of our paper “Autoencoders in function space” to the arXiv.

**Abstract.** Autoencoders have found widespread application, in both their original deterministic form and in their variational formulation (VAEs). In scientific applications it is often of interest to consider data that are comprised of functions; the same perspective is useful in image processing. In practice, discretisation (of differential equations arising in the sciences) or pixellation (of images) renders problems finite dimensional, but conceiving first of algorithms that operate on functions, and only then discretising or pixellating, leads to better algorithms that smoothly operate between different levels of discretisation or pixellation. In this paper function-space versions of the autoencoder (FAE) and variational autoencoder (FVAE) are introduced, analysed, and deployed. Well-definedness of the objective function governing VAEs is a subtle issue, even in finite dimension, and more so on function space. The FVAE objective is well defined whenever the data distribution is compatible with the chosen generative model; this happens, for example, when the data arise from a stochastic differential equation. The FAE objective is valid much more broadly, and can be straightforwardly applied to data governed by differential equations. Pairing these objectives with neural operator architectures, which can thus be evaluated on any mesh, enables new applications of autoencoders to inpainting, superresolution, and generative modelling of scientific data.
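To illustrate the "conceive on functions first, discretise second" idea in a toy setting (this is purely illustrative and is *not* the paper's FVAE/FAE neural-operator architecture), here is a hypothetical encoder whose latent code is defined through integrals of the input function against fixed basis functions. Because the code is defined at the level of functions, it can be evaluated on any mesh, and refining the mesh changes it only up to quadrature error:

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule for samples y of a function on grid x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def encode(x_grid, u_vals, n_latent=4):
    """Toy mesh-independent encoder: the latent code consists of quadrature
    approximations of the L^2 inner products <u, phi_j> against fixed
    basis functions phi_j(x) = sin(j * pi * x). Purely illustrative."""
    return np.array([
        trapezoid(u_vals * np.sin(j * np.pi * x_grid), x_grid)
        for j in range(1, n_latent + 1)
    ])

u = lambda x: np.exp(-5.0 * (x - 0.3) ** 2)  # a sample "function datum"

coarse = np.linspace(0.0, 1.0, 101)
fine = np.linspace(0.0, 1.0, 1001)
z_coarse = encode(coarse, u(coarse))
z_fine = encode(fine, u(fine))

# The codes agree up to quadrature error, whatever the discretisation:
assert np.allclose(z_coarse, z_fine, atol=1e-2)
```

A genuinely discretisation-invariant architecture is of course more sophisticated, but the principle is the same: the latent representation is a functional of \(u\), not of any particular vector of point values.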

Published on Monday 5 August 2024 at 12:00 UTC #preprint #bunker #girolami #lambley #stuart #autoencoders

### UQ for Si using interatomic potentials

Iain Best, James Kermode, and I have just uploaded a preprint of our paper “Uncertainty quantification in atomistic simulations of silicon using interatomic potentials” to the arXiv.

**Abstract.** Atomistic simulations often rely on interatomic potentials to access greater time- and length-scales than those accessible to first principles methods such as density functional theory (DFT). However, since a parameterised potential typically cannot reproduce the true potential energy surface of a given system, we should expect a decrease in accuracy and increase in error in quantities of interest calculated from simulations. Quantifying the uncertainty on the outputs of atomistic simulations is thus an important, necessary step so that there is confidence in results and available metrics to explore improvements in said simulations. Here, we address this research question by forming ensembles of Atomic Cluster Expansion (ACE) potentials, and using Conformal Prediction with DFT training data to provide meaningful, calibrated error bars on several quantities of interest for silicon: the bulk modulus, elastic constants, relaxed vacancy formation energy, and the vacancy migration barrier. We evaluate the effects on uncertainty bounds using a range of different potentials and training sets.
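The calibration step at the heart of conformal prediction can be sketched in a few lines. The following is a generic split-conformal example on synthetic data, not the ACE/DFT pipeline of the paper: the toy `predict` function stands in for an ensemble-mean prediction, and the noisy "reference" values play the role of the held-out DFT data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for an ensemble-mean prediction of a scalar
# quantity of interest; the noisy "reference" values mimic DFT data.
def predict(x):
    return np.sin(x)

x_cal = rng.uniform(0.0, np.pi, 200)
y_cal = np.sin(x_cal) + 0.1 * rng.normal(size=200)  # calibration references

# Split conformal prediction: nonconformity scores on the calibration set,
# then a finite-sample-corrected quantile gives the interval half-width.
alpha = 0.1                                  # target 90% coverage
scores = np.abs(y_cal - predict(x_cal))
k = int(np.ceil((len(scores) + 1) * (1.0 - alpha)))
q = float(np.sort(scores)[k - 1])            # calibrated error-bar half-width

# Empirical coverage of predict(x) +/- q on fresh data from the same model:
x_test = rng.uniform(0.0, np.pi, 1000)
y_test = np.sin(x_test) + 0.1 * rng.normal(size=1000)
coverage = float(np.mean(np.abs(y_test - predict(x_test)) <= q))
```

The appeal of this construction is its distribution-free finite-sample guarantee: whatever the error distribution, the interval \( \text{predict}(x) \pm q \) covers a fresh exchangeable observation with probability at least \(1 - \alpha\).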

Published on Saturday 24 February 2024 at 12:00 UTC #preprint #kermode #best #uq #interatomic-potentials

### Unbounded images of Gaussian and other stochastic processes in Analysis and Applications

The final version of “Images of Gaussian and other stochastic processes under closed, densely-defined, unbounded linear operators” by Tadashi Matsumoto and myself has just appeared in *Analysis and Applications*.

The purpose of this article is to provide a self-contained rigorous proof of the well-known formula for the mean and covariance function of a stochastic process — in particular, a Gaussian process — when it is acted upon by an *unbounded* linear operator such as an ordinary or partial differential operator, as used in probabilistic approaches to the solution of ODEs and PDEs.
This result is easy to establish in the case of a bounded operator, but the unbounded case requires a careful application of Hille's theorem for the Bochner integral of a Banach-valued random variable.
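In symbols, the formulae in question are, roughly speaking, the following (this is a sketch; see the paper for the precise domain and integrability hypotheses):

```latex
% If u is a square-integrable stochastic process with mean function m and
% covariance function k, and T is a closed, densely defined linear operator
% acting on the sample paths of u, then v := Tu satisfies
\[
  m_v = T m ,
  \qquad
  k_v(s, t) = T^{(s)} T^{(t)} k(s, t) ,
\]
% where T^{(s)} and T^{(t)} denote T applied to the first and second
% arguments of k respectively.
```

For bounded \(T\) these identities follow by exchanging \(T\) with the expectation; the content of the paper is that they survive, under suitable hypotheses, when \(T\) is merely closed and densely defined.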

T. Matsumoto and T. J. Sullivan. “Images of Gaussian and other stochastic processes under closed, densely-defined, unbounded linear operators.” *Analysis and Applications* 22(3):619–633, 2024.

**Abstract.**
Gaussian processes (GPs) are widely-used tools in spatial statistics and machine learning and the formulae for the mean function and covariance kernel of a GP \(v\) that is the image of another GP \(u\) under a linear transformation \(T\) acting on the sample paths of \(u\) are well known, almost to the point of being folklore. However, these formulae are often used without rigorous attention to technical details, particularly when \(T\) is an unbounded operator such as a differential operator, which is common in several modern applications. This note provides a self-contained proof of the claimed formulae for the case of a closed, densely-defined operator \(T\) acting on the sample paths of a square-integrable stochastic process. Our proof technique relies upon Hille's theorem for the Bochner integral of a Banach-valued random variable.

Published on Wednesday 21 February 2024 at 10:00 UTC #publication #anal-appl #prob-num #gp #matsumoto

### Hille’s theorem for locally convex spaces

I have just uploaded a preprint of the paper “Hille's theorem for Bochner integrals of functions with values in locally convex spaces” to the arXiv. This paper extends the relatively well known result that Bochner integration commutes with closed operators from the classical Banach space setting to the case of more general locally convex spaces.
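For context, the classical Banach-space form of Hille's theorem is the following standard statement:

```latex
% Hille's theorem (Banach-space version). Let T be a closed linear operator
% from D(T) \subseteq X to a Banach space Y, and let f : \Omega \to X be
% Bochner integrable with f(\omega) \in D(T) almost everywhere and
% T f Bochner integrable. Then \int_\Omega f \, d\mu \in D(T) and
\[
  T \int_\Omega f \, \mathrm{d}\mu = \int_\Omega T f \, \mathrm{d}\mu .
\]
```

The preprint asks what becomes of this commutation property when \(X\) and \(Y\) are allowed to be more general locally convex spaces.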

Published on Thursday 18 January 2024 at 09:00 UTC #preprint #functional-analysis #hille-theorem

### Order-theoretic perspectives on MAP estimation in SIAM/ASA JUQ

The final version of “An order-theoretic perspective on modes and maximum a posteriori estimation in Bayesian inverse problems” by Hefin Lambley and myself has just appeared online in the *SIAM/ASA Journal on Uncertainty Quantification*.

On a heuristic level, modes and MAP estimators are intended to be the “most probable points” of a space \(X\) with respect to a probability measure \(\mu\). Thus, in some sense, they would seem to be the greatest elements of some order on \(X\), and a rigorous order-theoretic treatment is called for, especially for cases in which \(X\) is, say, an infinite-dimensional function space. Such an order-theoretic perspective opens up some attractive proof strategies for the existence of modes and MAP estimators but also leads to some interesting counterexamples. In particular, because the orders involved are not total, some pairs of points of \(X\) can be incomparable (i.e. neither is more nor less likely than the other). In fact we show that there are examples for which the collection of such mutually incomparable elements is dense in \(X\).
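One natural way to build such an order out of small-ball probabilities is the following (a rough sketch in the spirit of the MAP-estimation literature; the paper's definitions are more careful):

```latex
% For a probability measure \mu on a metric space X, with B_r(x) the open
% ball of radius r > 0 centred at x, one may declare
\[
  x \preceq y
  \quad :\Longleftrightarrow \quad
  \limsup_{r \to 0} \frac{\mu(B_r(x))}{\mu(B_r(y))} \leq 1 ,
\]
% so that a mode is, heuristically, a greatest element for this relation.
% Since \preceq need not be total, x and y can be incomparable: neither
% ratio limit is at most one, so neither point is "more probable".
```

It is exactly this failure of totality that produces the counterexamples mentioned above, including measures for which the mutually incomparable points are dense.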

H. Lambley and T. J. Sullivan. “An order-theoretic perspective on modes and maximum a posteriori estimation in Bayesian inverse problems.” *SIAM/ASA Journal on Uncertainty Quantification* 11(4):1195–1224, 2023.

Published on Friday 20 October 2023 at 09:00 UTC #publication #modes #order-theory #map-estimators #lambley #juq