Tim Sullivan

Exact active subspace Metropolis–Hastings, with applications to the Lorenz-96 system

Preprint: Active subspace Metropolis–Hastings

Ingmar Schuster, Paul Constantine, and I have just uploaded a preprint of our latest article, “Exact active subspace Metropolis–Hastings, with applications to the Lorenz-96 system”, to the arXiv. This paper reports on our first investigations into accelerating Markov chain Monte Carlo methods using active subspaces, in comparison with other adaptivity techniques, and is supported by the DFG through SFB 1114 Scaling Cascades in Complex Systems.

Abstract. We consider the application of active subspaces to inform a Metropolis–Hastings algorithm, thereby aggressively reducing the computational dimension of the sampling problem. We show that the original formulation, as proposed by Constantine, Kent, and Bui-Thanh (SIAM J. Sci. Comput., 38(5):A2779–A2805, 2016), possesses asymptotic bias. Using pseudo-marginal arguments, we develop an asymptotically unbiased variant. Our algorithm is applied to a synthetic multimodal target distribution as well as a Bayesian formulation of a parameter inference problem for a Lorenz-96 system.
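For readers curious about the mechanics, here is a minimal, purely illustrative sketch of the general idea rather than the paper's exact algorithm: a pseudo-marginal Metropolis–Hastings chain on the active coordinates of a toy problem, with the likelihood over the inactive coordinates replaced by an unbiased Monte Carlo estimate. The dimensions, the placeholder log_likelihood, the finite-difference gradients, and the tuning constants below are all assumptions made for illustration.

```python
# Purely illustrative sketch, not the algorithm of the paper: pseudo-marginal
# Metropolis-Hastings on the active coordinates, assuming a standard Gaussian
# prior on x in R^d and a user-supplied log_likelihood(x).
import numpy as np

rng = np.random.default_rng(0)
d, k = 20, 2  # full and active dimensions (assumed for illustration)

def log_likelihood(x):
    # Placeholder log-likelihood; in practice this is the expensive model.
    return -5.0 * np.sum((x[:2] - 1.0) ** 2)

def grad_log_likelihood(x, eps=1e-6):
    # Crude finite-difference gradient, adequate for this sketch.
    g = np.empty(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = eps
        g[i] = (log_likelihood(x + e) - log_likelihood(x - e)) / (2.0 * eps)
    return g

# Estimate the active subspace from the gradient outer-product matrix
# C = E[grad f grad f^T], averaged over draws from the prior.
X = rng.standard_normal((200, d))
G = np.array([grad_log_likelihood(x) for x in X])
C = G.T @ G / len(X)
eigvals, eigvecs = np.linalg.eigh(C)
W1, W2 = eigvecs[:, -k:], eigvecs[:, :-k]  # active / inactive directions

def marginal_like_hat(y, n_inner=10):
    # Unbiased Monte Carlo estimate of the likelihood marginalized over the
    # inactive coordinates z ~ N(0, I); this is the pseudo-marginal ingredient.
    Z = rng.standard_normal((n_inner, d - k))
    return np.mean([np.exp(log_likelihood(W1 @ y + W2 @ z)) for z in Z])

def pm_mh(n_steps=2000, step=0.3):
    # Pseudo-marginal Metropolis-Hastings on the active variable y only.
    y, L, chain = np.zeros(k), marginal_like_hat(np.zeros(k)), []
    for _ in range(n_steps):
        y_prop = y + step * rng.standard_normal(k)
        L_prop = marginal_like_hat(y_prop)
        # Gaussian prior on y; the current estimate L is reused rather than
        # re-estimated, which is the pseudo-marginal trick.
        log_alpha = np.log(L_prop) - np.log(L) - 0.5 * (y_prop @ y_prop - y @ y)
        if np.log(rng.uniform()) < log_alpha:
            y, L = y_prop, L_prop
        chain.append(y.copy())
    return np.array(chain)

samples_y = pm_mh()
```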

Published on Friday 8 December 2017 at 08:00 UTC #publication #preprint #mcmc #sfb1114 #schuster #constantine

Ingmar Schuster Joins the UQ Group

It is a pleasure to announce that Ingmar Schuster will join the UQ research group as a postdoctoral researcher with effect from 15 June 2017. He will be working on project A06 “Enabling Bayesian uncertainty quantification for multiscale systems and network models via mutual likelihood-informed dimension reduction” as part of SFB 1114 Scaling Cascades in Complex Systems.

Published on Thursday 15 June 2017 at 08:00 UTC #group #sfb1114 #schuster

UQ Talks: Ingmar Schuster

Ingmar Schuster (Université Paris-Dauphine) “Gradient Importance Sampling”

Time and Place. Friday 11 March 2016, 11:15–12:45, Room 126 of Arnimallee 6 (Pi-Gebäude), 14195 Berlin

Abstract. Adaptive Monte Carlo schemes developed in recent years usually seek to ensure ergodicity of the sampling process, in line with MCMC tradition. This poses constraints on what is possible in terms of adaptation: in the general case, ergodicity can only be guaranteed if the adaptation diminishes at a certain rate. Importance sampling approaches offer a way to circumvent this limitation and to design sampling algorithms that keep adapting. Here I present an adaptive variant of the discretized Langevin algorithm for estimating integrals with respect to a target density, which uses an importance sampling correction instead of the usual Metropolis–Hastings correction.
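As a rough sketch of this kind of scheme (not the speaker's exact method), one can propose each new point by a single discretized Langevin step and attach to it a self-normalized importance weight against the target, in place of a Metropolis–Hastings accept/reject step. The placeholder log_target, the step size, and the dimension below are illustrative assumptions.

```python
# Purely illustrative sketch, not the speaker's exact method: points proposed
# by one discretized Langevin step each, then importance-weighted against the
# target instead of being accepted or rejected as in Metropolis-Hastings.
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # Placeholder standard Gaussian target; replace with the density of interest.
    return -0.5 * np.sum(x ** 2)

def grad_log_target(x):
    return -x

def langevin_importance_sampling(n_steps=5000, eps=0.5, dim=2):
    x = np.zeros(dim)
    points, log_weights = [], []
    for _ in range(n_steps):
        # Euler-Maruyama (unadjusted Langevin) step as the adaptive proposal.
        mean = x + 0.5 * eps ** 2 * grad_log_target(x)
        x_new = mean + eps * rng.standard_normal(dim)
        # Importance weight = target density / proposal density (each known
        # only up to a constant, which cancels in the self-normalization below).
        log_q = -0.5 * np.sum((x_new - mean) ** 2) / eps ** 2
        log_weights.append(log_target(x_new) - log_q)
        points.append(x_new)
        x = x_new
    points = np.array(points)
    w = np.exp(np.array(log_weights) - np.max(log_weights))
    return points, w / w.sum()

points, w = langevin_importance_sampling()
print("weighted mean (should be near 0):", w @ points)
```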

Published on Monday 7 March 2016 at 12:00 UTC #event #uq-talk #schuster