Tim Sullivan

Junior Professor in Applied Mathematics:
Risk and Uncertainty Quantification

Probabilistic Numerics at MCQMC

There will be a workshop on Probabilistic Numerics at this year's MCQMC conference at Stanford University. The workshop will be held on Thursday, 18 August 2016, 15:50–17:50, at the Li Ka Shing Center on the Stanford University campus. Speakers include:

  • Mark Girolami (University of Warwick & Alan Turing Institute) — Probabilistic Numerical Computation: A New Concept?
  • François-Xavier Briol (University of Warwick & University of Oxford) — Probabilistic Integration: A Role for Statisticians in Numerical Analysis?
  • Chris Oates (University of Technology Sydney) — Probabilistic Integration for Intractable Distributions
  • Jon Cockayne (University of Warwick) — Probabilistic meshless methods for partial differential equations and Bayesian inverse problems

Update, 19 August 2016. The slides from the talks can be found here, on Chris Oates' website.

Published on Sunday 31 July 2016 at 14:00 UTC #prob-num #event


UQ Talks: Hans Kersting

Next week Hans Kersting (MPI Tübingen) will give a talk in the UQ research seminar about “UQ in probabilistic ODE solvers”.

Time and Place. Tuesday 14 June 2016, 12:15–13:15, ZIB Seminar Room 2006, Zuse Institute Berlin, Takustraße 7, 14195 Berlin

Abstract. In an ongoing push to construct probabilistic extensions of classic ODE solvers for application in statistics and machine learning, two recent papers have provided distinct methods that return probability measures instead of point estimates, based on sampling and filtering respectively. While both approaches leverage classical numerical analysis by building on well-studied existing solvers, their different constructions of probability measures strike a divergent balance between formal quantification of epistemic uncertainty and low computational overhead.

On the one hand, Conrad et al. proposed to randomise existing non-probabilistic one-step solvers by adding suitably scaled Gaussian noise after every step, thereby inducing a probability measure over the solution space of the ODE which contracts to a Dirac measure on the true, unknown solution at the convergence rate of the underlying classical numerical method. However, the computational cost of these methods is significantly higher than that of classical solvers.
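
As an illustration (not code from the talk), the following minimal Python sketch randomises the explicit Euler method in this spirit: after every deterministic step, Gaussian noise is added, here with standard deviation proportional to h to the power 3/2, a scaling assumed to preserve the first-order convergence of Euler's method.

    import numpy as np

    def randomised_euler(f, y0, t0, t1, h, scale=1.0, n_samples=100, rng=None):
        """Randomised explicit Euler: add Gaussian noise after every step.

        The noise standard deviation scale * h**1.5 is an assumed choice,
        intended to be consistent with preserving the first-order
        convergence of Euler's method in a Conrad-et-al.-style construction.
        """
        rng = np.random.default_rng() if rng is None else rng
        ts = np.arange(t0, t1 + h, h)
        samples = np.empty((n_samples, len(ts)))
        for i in range(n_samples):
            y = y0
            samples[i, 0] = y
            for k, t in enumerate(ts[:-1]):
                y = y + h * f(t, y)                              # classical Euler step
                y = y + scale * h**1.5 * rng.standard_normal()   # random perturbation
                samples[i, k + 1] = y
        return ts, samples

    # Example: logistic growth y' = y(1 - y); the spread of the samples at
    # the final time is a crude indicator of the numerical uncertainty.
    ts, ys = randomised_euler(lambda t, y: y * (1 - y), y0=0.1, t0=0.0, t1=5.0, h=0.1)
    print(ys[:, -1].mean(), ys[:, -1].std())
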

On the other hand, Schober et al. recast the estimation of the solution as state estimation by a Gaussian (Kalman) filter and proved that employing an integrated Wiener process prior returns a posterior Gaussian process whose maximum likelihood (ML) estimate matches the solution of classic Runge–Kutta methods. In an attempt to amend this method's rough uncertainty calibration while sustaining its negligible cost overhead, we propose a novel way to quantify uncertainty in this filtering framework by probing the gradient using Bayesian quadrature.
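
Again purely as an illustration (not from the talk), here is a minimal sketch of such a Gaussian ODE filter for a scalar ODE: a once-integrated Wiener process prior is placed on the state (y, y'), and the evaluation of the vector field at the predicted mean is treated as a noise-free observation of the derivative. The diffusion parameter sigma is an assumed tuning constant, and the function name iwp_ode_filter is hypothetical.

    import numpy as np

    def iwp_ode_filter(f, y0, t0, t1, h, sigma=1.0):
        """Kalman ODE filter with a once-integrated Wiener process prior.

        Returns posterior means and standard deviations for y on a uniform
        time grid; the observation model encodes y'(t) = f(t, y(t)).
        """
        A = np.array([[1.0, h], [0.0, 1.0]])                  # IWP(1) transition
        Q = sigma**2 * np.array([[h**3 / 3, h**2 / 2],
                                 [h**2 / 2, h]])              # process noise
        H = np.array([[0.0, 1.0]])                            # observe the derivative

        m = np.array([y0, f(t0, y0)])                         # initial mean
        P = np.zeros((2, 2))                                  # initial covariance
        ts = np.arange(t0, t1 + h, h)
        means, stds = [m[0]], [0.0]
        for t in ts[1:]:
            # Predict under the integrated Wiener process prior.
            m_pred = A @ m
            P_pred = A @ P @ A.T + Q
            # Update: probe the vector field at the predicted mean.
            z = f(t, m_pred[0])
            S = (H @ P_pred @ H.T)[0, 0]
            K = (P_pred @ H.T).ravel() / S
            m = m_pred + K * (z - m_pred[1])
            P = P_pred - np.outer(K, K) * S
            means.append(m[0])
            stds.append(np.sqrt(max(P[0, 0], 0.0)))
        return ts, np.array(means), np.array(stds)

    # Example: y' = -y with y(0) = 1; compare the filter mean to exp(-t).
    ts, mu, sd = iwp_ode_filter(lambda t, y: -y, y0=1.0, t0=0.0, t1=2.0, h=0.1)
    print(mu[-1], np.exp(-2.0), sd[-1])
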

Published on Monday 6 June 2016 at 10:00 UTC #event #uq-talk #prob-num


The Digital Future: 75th Anniversary of the Zuse Z3

11 May 2016 marks the seventy-fifth anniversary of the unveiling of Konrad Zuse's Z3 computer. The Z3 was the world's first working programmable, fully automatic digital computer.

In celebration of this landmark achievement in computational science, the Zuse Institute, the Berlin–Brandenburg Academy of Sciences, and Der Tagesspiegel are organising a conference on “The Digital Future: 75 Years Zuse Z3 and the Digital Revolution”. For further information, see www.zib.de/zuse75.

Published on Monday 2 May 2016 at 11:00 UTC #event


UQ Talks: Steven Niederer

This week Steven Niederer (King's College London) will give a talk about “Linking physiology and cardiology through mathematical models”.

Time and Place. Thursday 28 April 2016, 11:00–12:00, Room 4027 of the Zuse Institute Berlin, Takustraße 7, 14195 Berlin

Abstract. Much effort has gone into the analysis of cardiac function using mathematical and computational models. Fully realising the potential of these studies requires translating these models into clinical applications to aid in diagnosis and clinical planning.

Achieving this goal requires the integration of multiple disparate clinical data sets into a common modelling framework. To this end we have developed a coupled electro-mechanical model of the human heart. This model combines patient-specific anatomical geometry, active contraction, electrophysiology, tissue heterogeneities, and boundary conditions fitted to comprehensive imaging and clinical catheter measurements.

This multi-scale computational model allows us to link subcellular mechanisms to whole-organ function. It provides a novel tool for determining the mechanisms that underpin treatment outcomes and offers the ability to determine hidden variables that provide new metrics of cardiac function. Specifically, we report on the application of these methods in patients receiving cardiac resynchronisation therapy and ablation for atrial fibrillation.

Published on Sunday 24 April 2016 at 07:00 UTC #event #uq-talk


ECMath Colloquium

This week's colloquium at the Einstein Center for Mathematics Berlin will be on the topic of “Sparsity: Statistics, Optimization and Applications.” The speakers will be:

  • Peter Richtárik (Edinburgh): Empirical Risk Minimization: Complexity, Duality, Sampling, Sparsity and Big Data
  • Gitta Kutyniok (TU Berlin): Anisotropic Structures and Sparsity-based Regularization
  • Mario Figueiredo (Lisbon): Learning with Strongly Correlated Variables: Ordered Weighted ℓ1 Regularization

Time and Place. Friday 22 April 2016, 14:00–17:00, Humboldt-Universität zu Berlin, Main Building Room 2.094, Unter den Linden 6, 10099 Berlin

Published on Monday 18 April 2016 at 08:00 UTC #event
