Tim Sullivan

Welcome!

I am Junior Professor in Applied Mathematics with Specialism in Risk and Uncertainty Quantification at the Freie Universität Berlin and Research Group Leader for Uncertainty Quantification at the Zuse Institute Berlin. I have wide interests in uncertainty quantification in the broad sense, understood as the meeting point of numerical analysis, applied probability and statistics, and scientific computation. On this site you will find information about how to contact me, my research, publications, and teaching activities.

Free University of Berlin

Course in Inverse Problems at FU Berlin

This Summer Semester 2019, I will offer a course on the mathematics of Inverse Problems at the Freie Universität Berlin. The course load will be 2+2 SWS (two hours each of lectures and exercises per week), and the course will be a valid selection for the Numerik IV and Stochastik IV modules.

Inverse problems, meaning the recovery of parameters or states in a mathematical model that best match some observed data, are ubiquitous in the applied sciences. This course will provide an introduction to the deterministic (variational) and stochastic (Bayesian) theories of inverse problems in function spaces.
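
To fix ideas, here is a rough sketch of the two viewpoints in my own illustrative notation, which is not necessarily that of the course. Suppose that the data arise as \( y = F(u) + \eta \), where \(F \colon \mathcal{U} \to \mathcal{Y}\) is a forward operator between Hilbert spaces and \(\eta\) is observational noise (taken here to have identity covariance, for simplicity). The variational approach regularises the ill-posed least-squares problem, e.g. in Tikhonov form

\( u_{\alpha} \in \underset{u \in \mathcal{U}}{\mathrm{arg\,min}} \ \frac{1}{2} \| y - F(u) \|_{\mathcal{Y}}^{2} + \frac{\alpha}{2} \| u \|_{\mathcal{U}}^{2} , \qquad \alpha > 0 , \)

whereas the Bayesian approach places a prior measure \(\mu_{0}\) on \(u\) and takes as the solution the whole posterior measure

\( \mu^{y} ( \mathrm{d} u ) \propto \exp \bigl( - \tfrac{1}{2} \| y - F(u) \|_{\mathcal{Y}}^{2} \bigr) \, \mu_{0} ( \mathrm{d} u ) . \)

For linear \(F\), Gaussian noise, and a Gaussian prior, the two viewpoints meet: the posterior is again Gaussian and its mode is a Tikhonov-type minimiser with norms weighted by the noise and prior covariances.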

Course Contents

  1. Examples of inverse problems in mathematics and physical sciences
  2. Preliminaries from functional analysis
  3. Preliminaries from probability theory
  4. Linear inverse problems and variational regularisation
  5. Bayesian regularisation of inverse problems
  6. Monte Carlo methods for Bayesian problems (a small illustrative sketch follows this list)
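
As a small illustration of item 6, here is a minimal Python sketch, entirely my own and not course material, of random-walk Metropolis sampling for the posterior of a toy scalar inverse problem \( y = F(u) + \eta \) with Gaussian noise and a standard Gaussian prior:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy inverse problem: recover u from noisy observations y = F(u) + noise.
    F = lambda u: np.sin(u) + 0.5 * u
    u_true = 1.3
    sigma = 0.1                                  # observational noise std. dev.
    y = F(u_true) + sigma * rng.normal(size=5)   # five noisy observations

    def log_posterior(u):
        """Unnormalised log-posterior: Gaussian likelihood plus N(0, 1) prior."""
        misfit = np.sum((y - F(u)) ** 2) / (2.0 * sigma ** 2)
        prior = 0.5 * u ** 2
        return -(misfit + prior)

    # Random-walk Metropolis: propose a local move, accept with the usual ratio.
    u, samples = 0.0, []
    for _ in range(20000):
        proposal = u + 0.2 * rng.normal()
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(u):
            u = proposal
        samples.append(u)

    print("posterior mean estimate:", np.mean(samples[5000:]))

In the function-space setting of the course the proposal must be designed with more care to remain effective in high or infinite dimension, but the accept/reject structure is the same.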

For further details see the KVV page.

Published on Monday 8 April 2019 at 09:00 UTC #fub #inverse-problems

Compression, inversion, and approximate PCA of dense kernel matrices at near-linear computational complexity

Preprint: Computing with dense kernel matrices at near-linear cost

Florian Schäfer, Houman Owhadi, and I have just uploaded a revised and improved version of our preprint “Compression, inversion, and approximate PCA of dense kernel matrices at near-linear computational complexity” to the arXiv. This paper shows how a surprisingly simple algorithm, the zero fill-in incomplete Cholesky factorisation performed with respect to a cleverly chosen sparsity pattern, allows for near-linear-complexity compression, inversion, and approximate PCA of square matrices of the form

\( \Theta = \begin{bmatrix} G(x_{1}, x_{1}) & \cdots & G(x_{1}, x_{N}) \\ \vdots & \ddots & \vdots \\ G(x_{N}, x_{1}) & \cdots & G(x_{N}, x_{N}) \end{bmatrix} \in \mathbb{R}^{N \times N} , \)

where \(\{ x_{1}, \dots, x_{N} \} \subset \mathbb{R}^{d}\) is a data set and \(G \colon \mathbb{R}^{d} \times \mathbb{R}^{d} \to \mathbb{R}\) is a covariance kernel function. Such matrices play a key role in, for example, Gaussian process regression and RKHS-based machine learning techniques.
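
To make the basic building block concrete, here is a minimal NumPy sketch of a zero fill-in incomplete Cholesky factorisation restricted to a prescribed sparsity pattern, applied to a toy one-dimensional kernel matrix. It is purely illustrative: it uses dense loops, the natural ordering, and a naive distance-based pattern, rather than the multiresolution ordering and pattern constructed in the paper, so it does not attain the near-linear complexity.

    import numpy as np

    def ichol0(A, S):
        """Zero fill-in incomplete Cholesky of an SPD matrix A: only entries
        (i, j) with S[i, j] True (plus the diagonal) are retained in L."""
        N = A.shape[0]
        L = np.zeros_like(A)
        for j in range(N):
            L[j, j] = np.sqrt(A[j, j] - L[j, :j] @ L[j, :j])
            for i in range(j + 1, N):
                if S[i, j]:
                    L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
        return L

    # Toy data: exponential kernel G(x, x') = exp(-|x - x'| / 0.05) on a 1D grid.
    x = np.linspace(0.0, 1.0, 200)
    D = np.abs(x[:, None] - x[None, :])
    Theta = np.exp(-D / 0.05)
    S = D < 0.2                       # keep only entries for nearby points
    L = ichol0(Theta, S)
    print(np.linalg.norm(Theta - L @ L.T) / np.linalg.norm(Theta))

For this well-behaved example the discarded entries of the exact Cholesky factor decay exponentially with the distance between the corresponding points, so the truncated factor should reproduce \(\Theta\) accurately; the content of the paper is that a suitable ordering and pattern achieve this for a much broader class of kernels at near-linear cost.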

Abstract. Dense kernel matrices \(\Theta \in \mathbb{R}^{N \times N}\) obtained from point evaluations of a covariance function \(G\) at locations \(\{ x_{i} \}_{1 \leq i \leq N}\) arise in statistics, machine learning, and numerical analysis. For covariance functions that are Green's functions of elliptic boundary value problems and homogeneously-distributed sampling points, we show how to identify a subset \(S \subset \{ 1 , \dots , N \}^2\), with \(\# S = O ( N \log (N) \log^{d} ( N /\varepsilon ) )\), such that the zero fill-in incomplete Cholesky factorisation of the sparse matrix \(\Theta_{ij} 1_{( i, j ) \in S}\) is an \(\varepsilon\)-approximation of \(\Theta\). This factorisation can provably be obtained in complexity \(O ( N \log( N ) \log^{d}( N /\varepsilon) )\) in space and \(O ( N \log^{2}( N ) \log^{2d}( N /\varepsilon) )\) in time; we further present numerical evidence that \(d\) can be taken to be the intrinsic dimension of the data set rather than that of the ambient space. The algorithm only needs to know the spatial configuration of the \(x_{i}\) and does not require an analytic representation of \(G\). Furthermore, this factorization straightforwardly provides an approximate sparse PCA with optimal rate of convergence in the operator norm. Hence, by using only subsampling and the incomplete Cholesky factorization, we obtain, at nearly linear complexity, the compression, inversion and approximate PCA of a large class of covariance matrices. By inverting the order of the Cholesky factorization we also obtain a solver for elliptic PDE with complexity \(O ( N \log^{d}( N /\varepsilon) )\) in space and \(O ( N \log^{2d}( N /\varepsilon) )\) in time.

Published on Tuesday 26 March 2019 at 12:00 UTC #publication #preprint #prob-num #schaefer #owhadi

BMS Summer School on Mathematics of Deep Learning

This summer the Berlin Mathematical School will be offering the BMS Summer School 2019 on “Mathematics of Deep Learning”, 19–30 August 2019, at the Zuse Institute Berlin.

This summer school is aimed at graduate students in mathematics; postdocs are also encouraged to attend. It will offer lectures both on the theory of deep neural networks and on related questions such as generalization, expressivity, and explainability, as well as on applications of deep neural networks (e.g. to PDEs, inverse problems, or specific real-world problems).

The first week will be devoted to the theory of deep neural networks, while the second week will focus on applications. The format is dominated by 1.5-hour lectures by international experts; in addition, there will be a poster session for the participants.

Speakers include: Taco Cohen (Qualcomm), François Fleuret (IDIAP | EPF, Lausanne), Eldad Haber (University of British Columbia), Robert Jenssen (Tromsø), Andreas Krause (ETH Zurich), Gitta Kutyniok (TU Berlin), Ben Leimkuhler (U Edinburgh), Klaus-Robert Müller (TU Berlin), Frank Noé (FU Berlin), Christof Schütte (FU Berlin | ZIB), Vladimir Spokoiny (HU Berlin | WIAS), René Vidal (Johns Hopkins University).

For more information, see www.mathplus.de and www.math-berlin.de/academics/summer-schools. The deadline for application is 8 April 2019.

Published on Sunday 24 March 2019 at 08:00 UTC #event #bms #deep-learning

Mathematisches Forschungsinstitut Oberwolfach

UQ at Oberwolfach

Last week, from 11 to 15 March 2019, the Mathematisches Forschungsinstitut Oberwolfach hosted its first full-size workshop on Uncertainty Quantification, organised by Oliver Ernst, Fabio Nobile, Claudia Schillings, and myself. This intensive and invigorating workshop brought together over fifty researchers in mathematics, statistics, and computational science from around the globe.

Photographs from the workshop can be found in the Oberwolfach Photo Collection.

Published on Saturday 16 March 2019 at 17:00 UTC #event #mfo #oberwolfach #ernst #nobile #schillings

Ilja Klebanov Joins the UQ Group

It is a pleasure to announce that Ilja Klebanov will join the UQ research group as a postdoctoral researcher with effect from 1 February 2019. He will be working on project TrU-2 “Demand modelling and control for e-commerce using RKHS transfer operator approaches” within the Berlin Mathematics Excellence Cluster MATH+; the project will be led by Stefan Klus and myself.

Published on Friday 1 February 2019 at 09:00 UTC #group #job #tru2 #mathplus #klus #klebanov