Convergence rates of Gaussian ODE filters in Statistics and Computing
The paper “Convergence rates of Gaussian ODE filters” by Hans Kersting, Philipp Hennig, and myself has just appeared in the journal Statistics and Computing. In this work, we examine the strong convergence rates of Gaussian-filtering-based probabilistic solvers for ODEs of the form \(\dot{x}(t) = f(x(t))\). In some sense, this work combines the numerical analysis perspective of Conrad et al. (2016) and Lie et al. (2019) with the filtering perspective on probabilistic numerical methods for ODEs of Schober et al. (2014).
H. Kersting, T. J. Sullivan, and P. Hennig. “Convergence rates of Gaussian ODE filters.” Statistics and Computing 30(6):1791–1816, 2020.
Abstract. A recently introduced class of probabilistic (uncertainty-aware) solvers for ordinary differential equations (ODEs) applies Gaussian (Kalman) filtering to initial value problems. These methods model the true solution \(x\) and its first \(q\) derivatives a priori as a Gauss–Markov process \(X\), which is then iteratively conditioned on information about \(\dot{x}\). This article establishes worst-case local convergence rates of order \(q + 1\) for a wide range of versions of this Gaussian ODE filter, as well as global convergence rates of order \(q\) in the case of \(q = 1\) and an integrated Brownian motion prior, and analyses how inaccurate information on \(\dot{x}\) coming from approximate evaluations of \(f\) affects these rates. Moreover, we show that, in the globally convergent case, the posterior credible intervals are well calibrated in the sense that they globally contract at the same rate as the truncation error. We illustrate these theoretical results by numerical experiments which might indicate their generalizability to \(q \in \{ 2, 3 , \dots \}\).
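For readers who would like a concrete picture of the kind of solver being analysed, here is a minimal sketch (not the implementation used in the paper) of a Gaussian ODE filter for a scalar ODE with the \(q = 1\) integrated Brownian motion prior: the state \((x, \dot{x})\) is predicted forward under the Gauss–Markov prior and then conditioned, Kalman-style, on the evaluation of \(f\) at the predicted mean. The function names, the unit diffusion `sigma2 = 1`, the zero measurement noise `R = 0`, and the test problem \(\dot{x} = -x\) are illustrative choices of mine, not taken from the paper.

```python
import numpy as np

def ibm_transition(h, sigma2=1.0):
    """Discrete-time transition for the q = 1 integrated Brownian motion prior."""
    A = np.array([[1.0, h],
                  [0.0, 1.0]])
    Q = sigma2 * np.array([[h**3 / 3.0, h**2 / 2.0],
                           [h**2 / 2.0, h]])
    return A, Q

def gaussian_ode_filter(f, x0, t_span, h, R=0.0):
    """Kalman-filter sketch for the scalar IVP x' = f(x), x(t0) = x0, with q = 1.

    The state is (x, x'); at each step the predicted derivative is conditioned
    on the "observation" f(predicted mean of x), with measurement noise R.
    """
    t0, t_end = t_span
    H = np.array([[0.0, 1.0]])            # observe the derivative component
    m = np.array([x0, f(x0)])             # initialise the mean with the exact derivative
    P = np.zeros((2, 2))                  # no initial uncertainty
    A, Q = ibm_transition(h)

    ts, means, covs = [t0], [m.copy()], [P.copy()]
    t = t0
    while t < t_end - 1e-12:
        # Predict under the Gauss--Markov prior
        m_pred = A @ m
        P_pred = A @ P @ A.T + Q
        # Condition on the vector field evaluated at the predicted mean
        z = f(m_pred[0]) - (H @ m_pred)[0]        # residual on the derivative
        S = (H @ P_pred @ H.T)[0, 0] + R
        K = (P_pred @ H.T)[:, 0] / S
        m = m_pred + K * z
        P = P_pred - np.outer(K, H @ P_pred)
        t += h
        ts.append(t); means.append(m.copy()); covs.append(P.copy())
    return np.array(ts), np.array(means), np.array(covs)

if __name__ == "__main__":
    f = lambda x: -x                      # toy ODE x' = -x with solution exp(-t)
    for h in [0.1, 0.05, 0.025]:
        ts, means, covs = gaussian_ode_filter(f, x0=1.0, t_span=(0.0, 2.0), h=h)
        err = np.max(np.abs(means[:, 0] - np.exp(-ts)))
        # For q = 1 the global error should shrink roughly at first order in h
        print(f"h = {h:6.3f}   max error = {err:.2e}")
```

Halving the step size in this toy run should roughly halve the maximum error, in line with the global order-\(q\) rate for \(q = 1\); the filter covariance `covs` is what the calibration result in the paper is about, namely that the credible intervals contract at the same rate as the truncation error.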
Published on Tuesday 15 September 2020 at 09:00 UTC #publication #stco #prob-num #kersting #hennig