
Adaptive Tuning for Metropolis Adjusted Langevin Trajectories


Lionel Riou-Durand


University of Warwick


Friday, 09 December 2022




Function Space, UCL Centre for Artificial Intelligence, 1st Floor, 90 High Holborn, London WC1V 6BH



Event series

DeepMind/ELLIS CSML Seminar Series


Hamiltonian Monte Carlo (HMC) is a widely used sampler for continuous probability distributions. In many cases, the underlying Hamiltonian dynamics exhibit a phenomenon of resonance which decreases the efficiency of the algorithm and makes it very sensitive to hyperparameter values. This issue can be tackled efficiently, either via the use of trajectory length randomization (RHMC) or via partial momentum refreshment. The second approach is connected to the kinetic Langevin diffusion, and has mostly been investigated through the use of Generalized HMC (GHMC). However, GHMC induces momentum flips upon rejections, causing the sampler to backtrack and waste computational resources. In this work we focus on a recent algorithm that bypasses this issue, named Metropolis Adjusted Langevin Trajectories (MALT). We build upon recent strategies for tuning the hyperparameters of RHMC that target a bound on the Effective Sample Size (ESS), and adapt them to MALT, thereby enabling the first user-friendly deployment of this algorithm. We construct a method to optimize a sharper bound on the ESS and reduce the estimator variance. Easily compatible with parallel implementation, the resulting Adaptive MALT algorithm is competitive in terms of ESS rate and hits useful tradeoffs in memory usage when compared to GHMC, RHMC and NUTS.
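To make the mechanics concrete, the following is a minimal, hypothetical sketch of a MALT-style iteration for a Gaussian target (function names and parameters are illustrative, not the authors' implementation): each iteration fully refreshes the momentum, simulates a trajectory of OBABO-split kinetic Langevin steps, and applies a single Metropolis correction to the whole trajectory, so a rejection needs no momentum flip.

```python
import numpy as np

def malt_sample(grad_U, U, x0, n_iters, L, step, gamma, rng):
    """Illustrative sketch of a MALT-style sampler.

    Each iteration: draw a fresh momentum, run L steps of an OBABO
    splitting of the kinetic Langevin dynamics (O = partial momentum
    refreshment, BAB = leapfrog step), accumulate the energy error of
    the Hamiltonian (BAB) parts, and accept or reject the trajectory
    as a whole. The O parts are exact, so they contribute no error.
    """
    eta = np.exp(-gamma * step / 2.0)  # damping factor for the O-steps
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_iters):
        p = rng.standard_normal(x.shape)  # full momentum refreshment
        y, q = x.copy(), p.copy()
        delta = 0.0                       # accumulated energy error
        for _ in range(L):
            # O: partial momentum refreshment (exact)
            q = eta * q + np.sqrt(1 - eta**2) * rng.standard_normal(x.shape)
            h_before = U(y) + 0.5 * q @ q
            # BAB: one leapfrog step of the Hamiltonian part
            q = q - 0.5 * step * grad_U(y)
            y = y + step * q
            q = q - 0.5 * step * grad_U(y)
            delta += (U(y) + 0.5 * q @ q) - h_before
            # O: second partial refreshment (exact)
            q = eta * q + np.sqrt(1 - eta**2) * rng.standard_normal(x.shape)
        if rng.random() < np.exp(min(0.0, -delta)):  # Metropolis step
            x = y  # accept the whole trajectory; no momentum flip needed
        samples.append(x.copy())
    return np.array(samples)

# Usage on a 2-d standard Gaussian target (hyperparameters chosen by hand;
# tuning them adaptively is precisely the subject of the talk).
rng = np.random.default_rng(0)
U = lambda x: 0.5 * x @ x
grad_U = lambda x: x
xs = malt_sample(grad_U, U, np.zeros(2), n_iters=5000,
                 L=5, step=0.5, gamma=1.0, rng=rng)
```

Because the momentum is fully refreshed at the start of every trajectory, the usual GHMC momentum flip on rejection has no effect and is simply dropped, which is the backtracking issue the abstract refers to.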


Lionel Riou-Durand has been a postdoctoral fellow in the Department of Statistics at the University of Warwick since September 2019. He works on the CoSInES project (Computational Statistical Inference for Engineering and Security), whose principal investigator is Gareth Roberts. He did his PhD at the Center for Research in Economics and Statistics (CREST) under the supervision of Nicolas Chopin and Arnak Dalalyan, and defended his thesis in July 2019. His research themes are connected to computational methods for statistics and machine learning. His primary focus is on sampling algorithms, widely used tools for the numerical approximation of statistical estimators. He is particularly interested in measuring their accuracy, evaluating their computational complexity and studying their robustness. The approximation of statistical estimators involves two interdependent challenges: guaranteeing a numerical error that is negligible compared to the statistical uncertainty, while controlling the computational burden of the algorithm. These algorithms are widely used to approximate statistical estimators based on high-dimensional integrals, which are omnipresent in Bayesian statistics. Their construction and study draw on several mathematical fields, such as stochastic processes, optimal transport, numerical analysis and optimisation. An overall objective is to develop efficient and reliable algorithms, while making them easily accessible to practitioners.