### Welcome!

I am **Assistant Professor in Predictive Modelling** in the Mathematics Institute and School of Engineering at the University of Warwick and **Research Group Leader for Uncertainty Quantification** at the Zuse Institute Berlin.
I have wide interests in uncertainty quantification in the broad sense, understood as the meeting point of numerical analysis, applied probability and statistics, and scientific computation.
On this site you will find information about how to contact me, my research, publications, and teaching activities.

### A rigorous theory of conditional mean embeddings in SIMODS

The article “A rigorous theory of conditional mean embeddings” by Ilja Klebanov, Ingmar Schuster, and myself has just appeared online in the *SIAM Journal on Mathematics of Data Science*.
In this work we take a close mathematical look at the method of conditional mean embedding.
In this approach to non-parametric inference, a random variable \(Y \sim \mathbb{P}_{Y}\) in a set \(\mathcal{Y}\) is represented by its *kernel mean embedding*, the reproducing kernel Hilbert space element

\( \displaystyle \mu_{Y} = \int_{\mathcal{Y}} \psi(y) \, \mathrm{d} \mathbb{P}_{Y} (y) \in \mathcal{G}, \)

and conditioning with respect to an observation \(x\) of a related random variable \(X \sim \mathbb{P}_{X}\) in a set \(\mathcal{X}\) with RKHS \(\mathcal{H}\) is performed using the Woodbury formula

\( \displaystyle \mu_{Y|X = x} = \mu_Y + (C_{XX}^{\dagger} C_{XY})^\ast \, (\varphi(x) - \mu_X) . \)

Here \(\psi \colon \mathcal{Y} \to \mathcal{G}\) and \(\varphi \colon \mathcal{X} \to \mathcal{H}\) are the canonical feature maps and the \(C\)'s denote the appropriate centred (cross-)covariance operators of the embedded random variables \(\psi(Y)\) in \(\mathcal{G}\) and \(\varphi(X)\) in \(\mathcal{H}\).
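In finite dimensions, with identity feature maps, the formula above reduces to the familiar conditional mean of a jointly Gaussian vector, \(\mu_{Y|X=x} = \mu_Y + C_{YX} C_{XX}^{\dagger} (x - \mu_X)\). The following sketch illustrates that special case numerically; the means, covariance, and Monte Carlo check are arbitrary illustrative choices, not taken from the paper.

```python
import numpy as np

# Finite-dimensional analogue of the conditioning formula:
#   mu_{Y|X=x} = mu_Y + C_YX C_XX^{-1} (x - mu_X)
# for jointly Gaussian scalar X and Y.  All numbers are illustrative.

rng = np.random.default_rng(0)
mu = np.array([1.0, 2.0])            # (mu_X, mu_Y)
C = np.array([[2.0, 1.2],
              [1.2, 1.5]])           # joint covariance [[C_XX, C_XY], [C_YX, C_YY]]

x_obs = 0.5
cond_mean = mu[1] + C[1, 0] / C[0, 0] * (x_obs - mu[0])

# Cross-check against a Monte Carlo estimate from joint samples.
samples = rng.multivariate_normal(mu, C, size=200_000)
mask = np.abs(samples[:, 0] - x_obs) < 0.05   # samples with X close to x_obs
mc_estimate = samples[mask, 1].mean()

print(cond_mean)      # exact: 2 + 0.6 * (0.5 - 1) = 1.7
print(mc_estimate)    # close to 1.7, up to Monte Carlo error
```

The article's point is that the same linear-algebraic structure survives, with suitable care, when \(x\) and \(y\) are replaced by feature-space embeddings and the covariances by (cross-)covariance operators.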

Our article aims to provide rigorous mathematical foundations for this attractive but apparently naïve approach to conditional probability, and hence to Bayesian inference.

I. Klebanov, I. Schuster, and T. J. Sullivan. “A rigorous theory of conditional mean embeddings.” *SIAM Journal on Mathematics of Data Science* 2(3):583–606, 2020.

**Abstract.**
Conditional mean embeddings (CMEs) have proven themselves to be a powerful tool in many machine learning applications. They allow the efficient conditioning of probability distributions within the corresponding reproducing kernel Hilbert spaces by providing a linear-algebraic relation for the kernel mean embeddings of the respective joint and conditional probability distributions. Both centered and uncentered covariance operators have been used to define CMEs in the existing literature. In this paper, we develop a mathematically rigorous theory for both variants, discuss the merits and problems of each, and significantly weaken the conditions for applicability of CMEs. In the course of this, we demonstrate a beautiful connection to Gaussian conditioning in Hilbert spaces.
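To make the "linear-algebraic relation" concrete, here is a minimal empirical sketch of the standard regularised CME estimator from samples: the conditional embedding is represented by weights \(w(x) = (K + n\lambda I)^{-1} k(x)\) on the observed \(y_i\), so that \(\mathbb{E}[f(Y) \mid X = x] \approx \sum_i w_i(x) f(y_i)\). The Gaussian kernel, bandwidth, and regularisation parameter below are illustrative choices, not prescriptions from the paper.

```python
import numpy as np

# Empirical CME sketch: weights w(x) = (K + n*lam*I)^{-1} k(x) on the samples,
# giving E[f(Y) | X = x] ~ sum_i w_i(x) f(y_i).  Kernel and lam are illustrative.

def gauss_kernel(a, b, ell=0.3):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(-1.0, 1.0, n)
y = np.sin(np.pi * x) + 0.05 * rng.normal(size=n)   # Y depends on X, small noise

lam = 1e-3
K = gauss_kernel(x, x)
x_test = np.array([0.25])
w = np.linalg.solve(K + n * lam * np.eye(n), gauss_kernel(x, x_test))

# Estimated conditional mean E[Y | X = 0.25], taking f = identity:
cond_mean = (w[:, 0] * y).sum()
print(cond_mean)   # close to sin(pi * 0.25), i.e. roughly 0.71
```

Conditions under which such estimators are meaningful, and with centred versus uncentred covariance operators, are exactly what the paper makes rigorous.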

Published on Wednesday 15 July 2020 at 08:00 UTC #publication #simods #mathplus #tru2 #rkhs #mean-embedding #klebanov #schuster

### Adaptive reconstruction of monotone functions

Luc Bonnet, Jean-Luc Akian, Éric Savin, and I have just uploaded a preprint of our recent work “Adaptive reconstruction of imperfectly-observed monotone functions, with applications to uncertainty quantification” to the arXiv. In this work, motivated by the computational needs of the optimal uncertainty quantification (OUQ) framework, we present and develop an algorithm for reconstructing a monotone function \(F\), given the ability to interrogate \(F\) pointwise, subject to partially controllable one-sided observational errors of the type that one would typically encounter if the observations arose from a numerical optimisation routine.

**Abstract.**
Motivated by the desire to numerically calculate rigorous upper and lower bounds on deviation probabilities over large classes of probability distributions, we present an adaptive algorithm for the reconstruction of increasing real-valued functions.
While this problem is similar to the classical statistical problem of isotonic regression, we assume that the observational data arise from optimisation problems with partially controllable one-sided errors, and this setting alters several characteristics of the problem and opens natural algorithmic possibilities.
Our algorithm uses imperfect evaluations of the target function to direct further evaluations of the target function either at new sites in the function's domain or to improve the quality of evaluations at already-evaluated sites.
We establish sufficient conditions for convergence of the reconstruction to the ground truth, and apply the method both to synthetic test cases and to a real-world example of uncertainty quantification for aerodynamic design.

Published on Monday 13 July 2020 at 10:00 UTC #preprint #daad #ouq #isotonic #bonnet #akian #savin

### Optimality of probabilistic numerical methods

The paper “Optimality criteria for probabilistic numerical methods” by Chris Oates, Jon Cockayne, Dennis Prangle, Mark Girolami, and myself has just appeared in print:

C. J. Oates, J. Cockayne, D. Prangle, T. J. Sullivan, and M. Girolami. “Optimality criteria for probabilistic numerical methods” in *Multivariate Algorithms and Information-Based Complexity*, ed. F. J. Hickernell and P. Kritzer. *Radon Series on Computational and Applied Mathematics* 27:65–88, 2020.

**Abstract.**
It is well understood that Bayesian decision theory and average case analysis are essentially identical.
However, if one is interested in performing *uncertainty quantification* for a numerical task, it can be argued that standard approaches from the decision-theoretic framework are neither appropriate nor sufficient.
Instead, we consider a particular optimality criterion from Bayesian experimental design and study its implied optimal information in the numerical context.
This information is demonstrated to differ, in general, from the information that would be used in an average-case-optimal numerical method.
The explicit connection to Bayesian experimental design suggests several distinct regimes in which optimal probabilistic numerical methods can be developed.

Published on Sunday 31 May 2020 at 08:00 UTC #publication #prob-num #oates #cockayne #prangle #girolami

### Online Probabilistic Numerics Minisymposia

Like many international conferences, the SIAM Conference on Uncertainty Quantification planned for 24–27 March 2020 had to be postponed indefinitely in view of the Covid-19 pandemic. Undeterred by this, the speakers of four minisymposia on the theme of Probabilistic Numerical Methods have generously taken the time to adapt their talks for a new medium and record them for general distribution. The talks can be found at http://probabilistic-numerics.org/meetings/SIAMUQ2020/.

We hope that these talks will be of general interest. Furthermore, the speakers have declared themselves ready to answer questions in written form. If you would like to ask any questions or contribute to the discussion, then please submit your question via this form by 10 May 2020.

Organised jointly by Alex Diaz, Alex Geßner, Philipp Hennig, Toni Karvonen, Chris Oates, and myself.

Published on Friday 24 April 2020 at 10:00 UTC #event #siam #prob-num #diaz #gessner #hennig #karvonen #oates

### PhD Project on Adaptive Probabilistic Meshless Methods

There is an opening for a PhD student to work with me and co-PIs Jon Cockayne and James Kermode on the project “Adaptive probabilistic meshless methods for evolutionary systems” as part of the EPSRC Centre for Doctoral Training in Modelling of Heterogeneous Systems at the University of Warwick.

This project will develop and implement a new class of numerical solvers for evolving systems such as interacting fluid-structure flows. To cope with extreme strain rates and large deformations these new solvers will be adaptive and meshless, and they will also implicitly represent their own solution uncertainty, thus enabling optimal design and uncertainty quantification. This exciting project brings together aspects of continuum mechanics, numerical methods for partial differential equations, and statistical machine learning.

Interested students should contact me and the other PIs with informal queries. Formal applications should use the HetSys application page.

Published on Monday 20 April 2020 at 08:00 UTC #group #job #phd #warwick #hetsys #kermode #cockayne