
Scaling up MCMC: a subsampling approach


Remi Bardenet


Department of Statistics, Oxford


Friday, 23 May 2014






MPEB 1.03

Event series

DeepMind/ELLIS CSML Seminar Series


Markov chain Monte Carlo (MCMC) methods are often deemed far too computationally intensive to be of any practical use for large datasets. In this talk, I will describe a methodology that aims to scale up the Metropolis-Hastings (MH) algorithm in this context. We propose an approximate implementation of the accept/reject step of MH based on concentration inequalities, which only requires evaluating the likelihood of a random subset of the data, yet is guaranteed to coincide with the accept/reject step based on the full dataset with a probability higher than a user-specified tolerance level. This adaptive subsampling technique is an alternative to the recent approach developed in (Korattikara et al., to appear in ICML'14), and it allows us to establish rigorously that the resulting approximate MH algorithm samples from a perturbed version of the target distribution of interest. Furthermore, the total variation distance between this perturbed target and the target of interest is controlled explicitly. I will demonstrate the benefits and limitations of this scheme on several examples.
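The adaptive accept/reject test sketched in the abstract can be illustrated roughly as follows. This is a minimal sketch, not the exact procedure from the paper: the function name `approx_mh_accept`, the empirical-Bernstein-style bound with its constants, and the assumption that an a priori bound `C` on the range of the per-datum log-likelihood ratios is available are all my illustrative choices.

```python
import numpy as np

def approx_mh_accept(x, log_ratio, psi, C, delta=0.01, batch=100, rng=None):
    """Approximate MH accept/reject via adaptive subsampling (a sketch).

    log_ratio(xi) returns the per-datum log p(xi | theta') - log p(xi | theta);
    psi is the MH threshold averaged over the n data points; C is an assumed
    a priori bound on the range of the per-datum log ratios, as required by
    the concentration inequality (empirical-Bernstein-style, with
    illustrative constants). The loop stops as soon as the bound separates
    the running mean from psi, so typically only a random subset of the
    data is evaluated.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n = len(x)
    perm = rng.permutation(n)       # data visited in random order
    seen = np.empty(0)
    t = 0
    log3d = np.log(3.0 / delta)
    while t < n:
        idx = perm[t:t + batch]     # draw the next mini-batch
        seen = np.concatenate([seen, log_ratio(x[idx])])
        t = len(seen)
        mean, var = seen.mean(), seen.var()
        # concentration bound on |running mean - full-data mean|
        c = np.sqrt(2.0 * var * log3d / t) + 3.0 * C * log3d / t
        if abs(mean - psi) > c:
            break                   # confident decision from a subsample
    # accept iff the (sub)sampled mean log ratio clears the threshold,
    # and report how many data points were actually used
    return mean > psi, t
```

For a clear-cut proposal (say, per-datum log ratios well separated from `psi`), the test typically terminates after touching a small fraction of the dataset, which is the source of the speed-up discussed in the talk.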

Joint work with Arnaud Doucet and Chris Holmes, ICML'14.