Tim Sullivan


I am a Junior Professor in Applied Mathematics with Specialism in Risk and Uncertainty Quantification at the Free University of Berlin and Research Group Leader for Uncertainty Quantification at the Zuse Institute Berlin. I have wide interests in uncertainty quantification in the broad sense, understood as the meeting point of numerical analysis, applied probability and statistics, and scientific computation. On this site you will find information on how to contact me, as well as my research, publications, and teaching activities.


Mathematisches Forschungsinstitut Oberwolfach

UQ at Oberwolfach

Last week, from 11 to 15 March 2019, the Mathematisches Forschungsinstitut Oberwolfach hosted its first full-size workshop on Uncertainty Quantification, organised by Oliver Ernst, Fabio Nobile, Claudia Schillings, and myself. This intensive and invigorating workshop brought together over fifty researchers in mathematics, statistics, and computational science from around the world.

Photographs from the workshop can be found in the Oberwolfach Photo Collection.

Published on Saturday 16 March 2019 at 17:00 UTC #event #mfo #oberwolfach #ernst #nobile #schillings

Ilja Klebanov Joins the UQ Group

It is a pleasure to announce that Ilja Klebanov will join the UQ research group as a postdoctoral researcher with effect from 1 February 2019. He will be working on project TrU-2 “Demand modelling and control for e-commerce using RKHS transfer operator approaches” within the Berlin Mathematics Excellence Cluster MATH+; the project will be led by Stefan Klus and myself.

Published on Friday 1 February 2019 at 09:00 UTC #group #job #tru2 #mathplus #klus #klebanov

A modern retrospective on probabilistic numerics

Preprint: Probabilistic numerics retrospective

Chris Oates and I have just uploaded a preprint of our paper “A modern retrospective on probabilistic numerics” to the arXiv.

Abstract. This article attempts to cast the emergence of probabilistic numerics as a mathematical-statistical research field within its historical context and to explore how its gradual development can be related to modern formal treatments and applications. We highlight in particular the parallel contributions of Sul'din and Larkin in the 1960s and how their pioneering early ideas have reached a degree of maturity in the intervening period, mediated by paradigms such as average-case analysis and information-based complexity. We provide a subjective assessment of the state of research in probabilistic numerics and highlight some difficulties to be addressed by future works.

Published on Tuesday 15 January 2019 at 12:00 UTC #preprint #prob-num #oates

Optimality criteria for probabilistic numerical methods

Preprint: Optimality of probabilistic numerical methods

Chris Oates, Jon Cockayne, Dennis Prangle, Mark Girolami, and I have just uploaded a preprint of our paper “Optimality criteria for probabilistic numerical methods” to the arXiv.

Abstract. It is well understood that Bayesian decision theory and average case analysis are essentially identical. However, if one is interested in performing uncertainty quantification for a numerical task, it can be argued that the decision-theoretic framework is neither appropriate nor sufficient. To this end, we consider an alternative optimality criterion from Bayesian experimental design and study its implied optimal information in the numerical context. This information is demonstrated to differ, in general, from the information that would be used in an average-case-optimal numerical method. The explicit connection to Bayesian experimental design suggests several distinct regimes in which optimal probabilistic numerical methods can be developed.

Published on Tuesday 15 January 2019 at 11:00 UTC #preprint #prob-num #oates #cockayne #prangle #girolami


Associate Editor for the SIAM/ASA Journal on Uncertainty Quantification

It is a pleasure and an honour to announce that, with effect from today, I will be serving as an Associate Editor for the SIAM/ASA Journal on Uncertainty Quantification (JUQ).

SIAM/ASA Journal on Uncertainty Quantification (JUQ) publishes research articles presenting significant mathematical, statistical, algorithmic, and application advances in uncertainty quantification, defined as the interface of complex modeling of processes and data, especially characterizations of the uncertainties inherent in the use of such models. The journal also focuses on related fields such as sensitivity analysis, model validation, model calibration, data assimilation, and code verification. The journal also solicits papers describing new ideas that could lead to significant progress in methodology for uncertainty quantification as well as review articles on particular aspects. The journal is dedicated to nurturing synergistic interactions between the mathematical, statistical, computational, and applications communities involved in uncertainty quantification and related areas. JUQ is jointly offered by SIAM and the American Statistical Association.

Published on Tuesday 1 January 2019 at 18:00 UTC #editorial #juq