
Distributed Bayesian Learning


Speaker

Yee-Whye Teh

Affiliation

University of Oxford

Date

Wednesday, 01 June 2016

Time

13:00-14:00

Location

Roberts G06 Sir Ambrose Fleming LT

Event series

DeepMind/ELLIS CSML Seminar Series

Abstract

Full title: Distributed Bayesian Learning with Stochastic Natural-gradient Expectation Propagation and the Posterior Server

We make two contributions to Bayesian machine learning algorithms. Firstly, we propose stochastic natural-gradient expectation propagation (SNEP), a novel alternative to expectation propagation (EP), a popular variational inference algorithm. SNEP is a black-box variational algorithm, in that it requires no simplifying assumptions about the distribution of interest beyond the existence of some Monte Carlo sampler for estimating the moments of the EP tilted distributions. Further, unlike EP, which has no convergence guarantee, SNEP can be shown to be convergent even when Monte Carlo moment estimates are used.
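For orientation, the quantities involved can be written in standard EP notation (this sketch is ours, not taken from the talk; the precise SNEP updates are given in the paper). With a posterior factorising over n likelihood terms and an exponential-family approximation built from local site factors f_i:

```latex
\[
p(\theta \mid \mathcal{D}) \propto p_0(\theta) \prod_{i=1}^{n} \ell_i(\theta),
\qquad
q(\theta) \propto p_0(\theta) \prod_{i=1}^{n} f_i(\theta),
\]
\[
q_{\setminus i}(\theta) \propto \frac{q(\theta)}{f_i(\theta)},
\qquad
\tilde{p}_i(\theta) \propto q_{\setminus i}(\theta)\, \ell_i(\theta).
\]
```

Classical EP repeatedly updates each site f_i so that the moments of q match those of the tilted distribution \tilde{p}_i. SNEP instead estimates the tilted moments with the Monte Carlo sampler and takes stochastic natural-gradient steps in the site parameters, which is the form of update for which convergence can be shown.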

Secondly, we propose a novel architecture for distributed Bayesian learning, which we call the posterior server. The posterior server allows scalable and robust Bayesian learning when a dataset is stored in a distributed manner across a cluster, with each compute node holding a disjoint subset of the data. Each compute node runs an independent Markov chain Monte Carlo (MCMC) sampler that has direct access only to its local data subset, yet targets an approximation to the global posterior distribution given all the data across the cluster. This is achieved with a distributed asynchronous implementation of SNEP that passes messages across the cluster. We demonstrate SNEP and the posterior server on distributed Bayesian learning of logistic regression and neural networks.
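To make the message-passing pattern concrete, below is a minimal, hypothetical sketch of the idea on a toy conjugate Gaussian model. It is not the authors' implementation: it runs sequential, damped, EP-style moment-matching updates in a single process (standing in for both server and workers) rather than SNEP's asynchronous natural-gradient updates, and it uses a simple random-walk Metropolis sampler for the tilted moments. All names and the toy model are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (hypothetical, for illustration only):
# theta ~ N(0, 1) prior; each observation y ~ N(theta, SIGMA^2), SIGMA known.
SIGMA = 2.0
THETA_TRUE = 1.5
data_shards = [THETA_TRUE + SIGMA * rng.standard_normal(20) for _ in range(4)]

# Natural parameters of a 1-D Gaussian: (precision * mean, precision).
prior_nat = np.array([0.0, 1.0])

def log_tilted(theta, cavity_nat, shard):
    """Unnormalised log density of the tilted distribution:
    the cavity Gaussian times this worker's local likelihood."""
    pm, prec = cavity_nat
    log_cavity = pm * theta - 0.5 * prec * theta ** 2
    log_lik = -0.5 * np.sum((shard - theta) ** 2) / SIGMA ** 2
    return log_cavity + log_lik

def tilted_moments(cavity_nat, shard, n_steps=4000, step=0.5):
    """Monte Carlo (random-walk Metropolis) estimate of the tilted mean
    and variance -- the only quantities a sampler is needed for."""
    theta = 0.0
    lp = log_tilted(theta, cavity_nat, shard)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_tilted(prop, cavity_nat, shard)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    s = np.asarray(samples[n_steps // 2:])  # crude burn-in removal
    return s.mean(), s.var()

# "Posterior server" state: one site (message) per worker, initially flat.
sites = [np.zeros(2) for _ in data_shards]
DAMPING = 0.5

for sweep in range(10):  # sequential sweeps; the real system is asynchronous
    for i, shard in enumerate(data_shards):
        global_nat = prior_nat + sum(sites)
        cavity_nat = global_nat - sites[i]       # global approximation minus own site
        m, v = tilted_moments(cavity_nat, shard)
        tilted_nat = np.array([m / v, 1.0 / v])  # moments -> natural parameters
        new_site = tilted_nat - cavity_nat       # moment-matched site update
        sites[i] = (1 - DAMPING) * sites[i] + DAMPING * new_site

post_nat = prior_nat + sum(sites)
print("approx posterior mean:", post_nat[0] / post_nat[1])
print("approx posterior var: ", 1.0 / post_nat[1])
```

At convergence the accumulated natural parameters approximate the exact conjugate posterior. In the actual posterior server, each site update travels asynchronously over the network, and no sampler ever touches data outside its own shard.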

Authors: Yee Whye Teh, Leonard Hasenclever, Thibaut Lienart, Sebastian Vollmer, Stefan Webb, Balaji Lakshminarayanan, Charles Blundell
