# Tim Sullivan


### Adaptive reconstruction of monotone functions

Luc Bonnet, Jean-Luc Akian, Éric Savin, and I have just uploaded a preprint of our recent work “Adaptive reconstruction of imperfectly-observed monotone functions, with applications to uncertainty quantification” to the arXiv. In this work, motivated by the computational needs of the optimal uncertainty quantification (OUQ) framework, we present and develop an algorithm for reconstructing a monotone function $$F$$ given the ability to interrogate $$F$$ pointwise, but subject to partially controllable one-sided observational errors of the type that one would typically encounter if the observations arose from a numerical optimisation routine.

Abstract. Motivated by the desire to numerically calculate rigorous upper and lower bounds on deviation probabilities over large classes of probability distributions, we present an adaptive algorithm for the reconstruction of increasing real-valued functions. While this problem is similar to the classical statistical problem of isotonic regression, we assume that the observational data arise from optimisation problems with partially controllable one-sided errors, and this setting alters several characteristics of the problem and opens natural algorithmic possibilities. Our algorithm uses imperfect evaluations of the target function to direct further evaluations of the target function either at new sites in the function's domain or to improve the quality of evaluations at already-evaluated sites. We establish sufficient conditions for convergence of the reconstruction to the ground truth, and apply the method both to synthetic test cases and to a real-world example of uncertainty quantification for aerodynamic design.
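
To illustrate the flavour of such a scheme, here is a minimal Python sketch (my own illustration, not the algorithm of the preprint): the target $$F$$ is queried through an oracle whose one-sided error shrinks as more computational effort is spent, and each round either explores a new site or refines the crudest existing evaluation. The oracle `noisy_eval`, the error proxy `1.0 / effort`, and the explore/refine rule are all illustrative assumptions.

```python
import numpy as np

# Toy oracle: an "observation" of F(x) obtained with a given computational
# effort returns F(x) plus a nonnegative (one-sided) error whose typical
# size shrinks as the effort grows.  This is an illustrative assumption.
def noisy_eval(F, x, effort, rng):
    return F(x) + rng.exponential(1.0 / effort)


def adaptive_reconstruction(F, a, b, n_rounds=30, seed=0):
    """Sketch of an adaptive loop: each round either inserts a new
    evaluation site at the midpoint of the widest gap (exploration) or
    re-evaluates, with doubled effort, the site whose error proxy is
    largest (refinement).  Not the algorithm of the preprint."""
    rng = np.random.default_rng(seed)
    sites = [a, b]
    effort = {a: 1, b: 1}
    values = {x: noisy_eval(F, x, effort[x], rng) for x in sites}

    for _ in range(n_rounds):
        sites.sort()
        gaps = np.diff(sites)
        worst = min(sites, key=lambda x: effort[x])   # crudest evaluation so far
        if gaps.max() > 1.0 / effort[worst]:
            # explore: bisect the widest gap in the domain
            i = int(np.argmax(gaps))
            x_new = 0.5 * (sites[i] + sites[i + 1])
            sites.append(x_new)
            effort[x_new] = 1
            values[x_new] = noisy_eval(F, x_new, 1, rng)
        else:
            # refine: spend more effort where the error proxy is largest,
            # keeping the smallest (i.e. best) upper estimate seen so far
            effort[worst] *= 2
            values[worst] = min(values[worst],
                                noisy_eval(F, worst, effort[worst], rng))

    # Since every observation over-estimates F and F is increasing, the
    # running minimum from the right is an increasing upper bound for F.
    xs = np.array(sorted(sites))
    ys = np.minimum.accumulate(np.array([values[x] for x in xs])[::-1])[::-1]
    return xs, ys
```

For example, calling `adaptive_reconstruction(lambda x: x**2, 0.0, 1.0)` returns sites and values forming an increasing upper bound on $$F$$, which should tighten as `n_rounds` grows.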

Published on Monday 13 July 2020 at 10:00 UTC #preprint #daad #ouq #isotonic #bonnet #akian #savin

### Luc Bonnet visits the UQ Group

It is a pleasure to announce that Luc Bonnet will be joining the UQ research group for a three-month visit funded by the German Academic Exchange Service (DAAD), with effect from 1 September 2019. He will be working on the application of optimal uncertainty quantification methodology to problems in aerodynamics and aircraft design.

Published on Sunday 1 September 2019 at 09:00 UTC #group #daad #ouq #bonnet

### Bayesian Brittleness in SIAM Review

The 2015 Q4 issue of SIAM Review will carry an article by Houman Owhadi, Clint Scovel, and myself on the brittle dependence of Bayesian posteriors upon the choice of prior. This is an abbreviated presentation of results given in full earlier this year in Elec. J. Stat. The PDF is available for free under the terms of the Creative Commons 4.0 licence.

H. Owhadi, C. Scovel, and T. J. Sullivan. “On the brittleness of Bayesian inference.” SIAM Review 57(4):566–582, 2015. doi:10.1137/130938633

Abstract. With the advent of high-performance computing, Bayesian methods are becoming increasingly popular tools for the quantification of uncertainty throughout science and industry. Since these methods can impact the making of sometimes critical decisions in increasingly complicated contexts, the sensitivity of their posterior conclusions with respect to the underlying models and prior beliefs is a pressing question to which there currently exist positive and negative answers. We report new results suggesting that, although Bayesian methods are robust when the number of possible outcomes is finite or when only a finite number of marginals of the data-generating distribution are unknown, they could be generically brittle when applied to continuous systems (and their discretizations) with finite information on the data-generating distribution. If closeness is defined in terms of the total variation (TV) metric or the matching of a finite system of generalized moments, then (1) two practitioners who use arbitrarily close models and observe the same (possibly arbitrarily large amount of) data may reach opposite conclusions; and (2) any given prior and model can be slightly perturbed to achieve any desired posterior conclusion. The mechanism causing brittleness/robustness suggests that learning and robustness are antagonistic requirements, which raises the possibility of a missing stability condition when using Bayesian inference in a continuous world under finite information.
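
To make the mechanism concrete, here is a small numerical caricature in Python (my own toy, not the construction used in the paper): the likelihood is altered only on a parameter set of small prior mass, yet once the data are recorded at fine enough precision the posterior conclusion flips. The grid, the perturbation set, and all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Toy caricature of the brittleness mechanism (not the paper's construction).
# Data are recorded at finite precision: we only learn that the sample lies
# in a bin of width delta around the recorded value x_obs.
theta = np.linspace(0.0, 1.0, 2001)        # parameter grid
prior = np.ones_like(theta) / len(theta)   # uniform prior
x_obs, sigma = 0.3, 0.1                    # recorded datum, observation noise

def perturbed_posterior_mass(delta, eps=0.05):
    """Posterior probability of {theta > 1/2} under a model perturbed only on
    the small parameter set A = {theta > 1 - eps} (prior mass ~ eps), where
    the perturbed model predicts the observed bin with certainty."""
    bin_lik = (norm.cdf(x_obs + delta / 2, theta, sigma)
               - norm.cdf(x_obs - delta / 2, theta, sigma))
    A = theta > 1.0 - eps
    lik = np.where(A, 1.0, bin_lik)        # perturbation confined to A
    post = prior * lik
    post /= post.sum()
    return post[theta > 0.5].sum()

for delta in [1e-1, 1e-2, 1e-3, 1e-4]:
    print(f"delta = {delta:g}: P(theta > 1/2 | data) = {perturbed_posterior_mass(delta):.3f}")

# With the unperturbed model, x_obs = 0.3 makes P(theta > 1/2 | data) small;
# under the perturbed model the same conclusion tends to 1 as delta -> 0,
# even though the perturbation lives on a set of prior mass of only ~5%.
```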

Published on Friday 6 November 2015 at 12:00 UTC #publication #siam-review #ouq #bayesian #owhadi #scovel

### Bayesian Brittleness in Elec. J. Stat.

The Electronic Journal of Statistics has published an article by Houman Owhadi, Clint Scovel, and myself on the brittle dependence of Bayesian posteriors upon the choice of prior.

H. Owhadi, C. Scovel, and T. J. Sullivan. “Brittleness of Bayesian inference under finite information in a continuous world.” Electronic Journal of Statistics 9(1):1–79, 2015. doi:10.1214/15-EJS989

Abstract. We derive, in the classical framework of Bayesian sensitivity analysis, optimal lower and upper bounds on posterior values obtained from Bayesian models that exactly capture an arbitrarily large number of finite-dimensional marginals of the data-generating distribution and/or that are as close as desired to the data-generating distribution in the Prokhorov or total variation metrics; these bounds show that such models may still make the largest possible prediction error after conditioning on an arbitrarily large number of sample data measured at finite precision. These results are obtained through the development of a reduction calculus for optimization problems over measures on spaces of measures. We use this calculus to investigate the mechanisms that generate brittleness/robustness and, in particular, we observe that learning and robustness are antagonistic properties. It is now well understood that the numerical resolution of PDEs requires the satisfaction of specific stability conditions. Is there a missing stability condition for using Bayesian inference in a continuous world under finite information?
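
In shorthand notation (mine, chosen for this summary rather than taken from the paper), the optimisation problems in question are of the form

$$ \underline{\Phi}(\mathcal{A}) := \inf_{(\pi, L) \in \mathcal{A}} \mathbb{E}_{\pi}[\phi \mid d] \quad \text{and} \quad \overline{\Phi}(\mathcal{A}) := \sup_{(\pi, L) \in \mathcal{A}} \mathbb{E}_{\pi}[\phi \mid d], $$

where $$\mathcal{A}$$ is the class of prior-likelihood pairs consistent with the available finite information, $$\phi$$ is the quantity of interest, and $$d$$ are the observed data. The brittleness results say that, for natural classes $$\mathcal{A}$$ defined by finitely many generalised moments or by closeness in the Prokhorov or total variation metrics, these optimal bounds become trivial, spanning the essential range of $$\phi$$, once the data are conditioned upon at sufficiently fine resolution.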

Published on Tuesday 3 February 2015 at 10:00 UTC #publication #elec-j-stat #ouq #bayesian #owhadi #scovel

### UQ for Legacy Data from Lipschitz Functions in M2AN

Mathematical Modelling and Numerical Analysis has just published a paper by Mike McKerns, Dominic Meyer, Florian Theil, Houman Owhadi, Michael Ortiz, and myself on optimal UQ for legacy data observations of Lipschitz functions.

In this paper, we address both mathematically and numerically the challenge of giving optimal bounds on quantities of interest of the form $$\mathbb{P}_{X \sim \mu}[f(X) \geq t]$$, where the probability distribution $$\mu$$ of $$X$$ is only partially known through some of its moments, and the forward model $$f$$ is partially known through some pointwise observations and smoothness information.

T. J. Sullivan, M. McKerns, D. Meyer, F. Theil, H. Owhadi, and M. Ortiz. “Optimal uncertainty quantification for legacy data observations of Lipschitz functions.” ESAIM: Mathematical Modelling and Numerical Analysis 47(6):1657–1689, 2013. doi:10.1051/m2an/2013083

Abstract. We consider the problem of providing optimal uncertainty quantification (UQ) – and hence rigorous certification – for partially-observed functions. We present a UQ framework within which the observations may be small or large in number, and need not carry information about the probability distribution of the system in operation. The UQ objectives are posed as optimization problems, the solutions of which are optimal bounds on the quantities of interest; we consider two typical settings, namely parameter sensitivities (McDiarmid diameters) and output deviation (or failure) probabilities. The solutions of these optimization problems depend non-trivially (even non-monotonically and discontinuously) upon the specified legacy data. Furthermore, the extreme values are often determined by only a few members of the data set; in our principal physically-motivated example, the bounds are determined by just 2 out of 32 data points, and the remainder carry no information and could be neglected without changing the final answer. We propose an analogue of the simplex algorithm from linear programming that uses these observations to offer efficient and rigorous UQ for high-dimensional systems with high-cardinality legacy data. These findings suggest natural methods for selecting optimal (maximally informative) next experiments.
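
As a much-simplified illustration of the kind of optimisation problem involved (my own toy, which keeps only a single mean constraint on $$\mu$$ and drops the Lipschitz information on $$f$$): reduction theorems for such problems guarantee that the extreme value is attained by a discrete measure with few support points, so a brute-force search over two-point measures already recovers the sharp Markov-type bound.

```python
import numpy as np

# A deliberately simplified OUQ-style problem (an illustrative sketch, not
# the paper's setting): maximise P[X >= t] over all probability measures mu
# on [0, 1] whose mean equals m.  Reduction theorems for such problems say
# that the optimum is attained by a measure supported on at most two points,
# so a brute-force search over two-point measures suffices here.
def ouq_upper_bound(m, t, n_grid=201):
    xs = np.linspace(0.0, 1.0, n_grid)
    best = 0.0
    for x1 in xs:
        for x2 in xs:
            if np.isclose(x1, x2):
                continue
            w = (m - x2) / (x1 - x2)      # weight on x1 needed for mean m
            if not 0.0 <= w <= 1.0:
                continue
            prob = w * float(x1 >= t) + (1.0 - w) * float(x2 >= t)
            best = max(best, prob)
    return best

m, t = 0.2, 0.5
print(ouq_upper_bound(m, t))   # ~0.4, attained by mass 0.4 at x = 0.5 and 0.6 at x = 0
print(min(1.0, m / t))         # the sharp Markov-type bound m / t = 0.4
```

The paper itself addresses the much harder situation in which $$f$$ is only known through legacy pointwise observations together with a Lipschitz bound, using the simplex-like algorithm described in the abstract above.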

Published on Friday 30 August 2013 at 18:00 UTC #publication #m2an #ouq #ortiz #owhadi #mckerns #meyer #theil