ICML preview talks


Speaker

David Barber, Kacper Chwiałkowski, Dino Sejdinovic

Affiliation

UCL

Date

Friday, 13 June 2014

Time

13:00-14:00

Location

Malet Place Engineering Building 1.02

Event series

DeepMind/ELLIS CSML Seminar Series

Abstract

A chance to hear previews of ICML 2014 talks from CSML researchers.

Talk 1

Gaussian Processes for Bayesian Estimation in Ordinary Differential Equations

Yali Wang and David Barber

Bayesian parameter estimation in coupled ordinary differential equations (ODEs) is challenging due to the high computational cost of numerical integration. In gradient matching, a separate data model is introduced with the property that its gradient may be calculated easily. Parameter estimation is then achieved by requiring consistency between the gradients computed from the data model and those specified by the ODE. We propose a Gaussian process model that directly links state derivative information with system observations, simplifying previous approaches and improving estimation accuracy.
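
The following is a minimal sketch of the gradient-matching idea, not the authors' method: the logistic ODE, RBF kernel, lengthscale, noise level, and grid search are all invented for illustration. A GP is fit to noisy observations of a single state, the derivative of its posterior mean is obtained in closed form, and a parameter value is scored by how well the ODE right-hand side reproduces that derivative.

import numpy as np

ell = 0.5  # RBF lengthscale (invented for this sketch)

def rbf(t1, t2):
    d = t1[:, None] - t2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def drbf_dt1(t1, t2):
    # derivative of the RBF kernel in its first argument
    d = t1[:, None] - t2[None, :]
    return -(d / ell**2) * np.exp(-0.5 * (d / ell) ** 2)

# Noisy observations of logistic growth dx/dt = theta * x * (1 - x)
rng = np.random.default_rng(0)
theta_true = 2.0
t = np.linspace(0.1, 3.0, 25)
x_true = 1.0 / (1.0 + 9.0 * np.exp(-theta_true * t))   # closed-form solution
y = x_true + 0.01 * rng.standard_normal(t.size)

K = rbf(t, t) + 1e-4 * np.eye(t.size)   # Gram matrix plus noise variance
alpha = np.linalg.solve(K, y)
x_hat = rbf(t, t) @ alpha               # GP posterior mean states
dx_hat = drbf_dt1(t, t) @ alpha         # derivative of the posterior mean

def mismatch(theta):
    # gradient-matching objective: GP derivative vs. ODE right-hand side
    return np.sum((dx_hat - theta * x_hat * (1.0 - x_hat)) ** 2)

thetas = np.linspace(0.5, 4.0, 200)
theta_hat = thetas[np.argmin([mismatch(th) for th in thetas])]
print(f"estimated theta: {theta_hat:.2f} (true value {theta_true})")

The point of the construction is that no numerical ODE solver is ever called: the GP supplies both the smoothed states and their derivatives analytically.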

Talk 2

A Kernel Independence Test for Random Processes

Kacper Chwiałkowski and Arthur Gretton

A non-parametric approach to the problem of testing the independence of two random processes will be presented. The test statistic is the Hilbert-Schmidt Independence Criterion (HSIC), which was used previously in testing independence for i.i.d. pairs of variables. The asymptotic behaviour of HSIC will be established when computed from samples drawn from random processes. We will show that earlier bootstrap procedures which worked in the i.i.d. case will fail for random processes, and an alternative consistent estimate of the p-values will be proposed. Tests on artificial data and real-world forex data indicate that the new test procedure discovers dependence which is missed by linear approaches, while the earlier bootstrap procedure returns an elevated number of false positives.
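
As a rough illustration of the statistic (a hedged sketch, not the authors' code), the biased HSIC estimate below is trace(KHLH)/n^2 with RBF kernels, and the null distribution is approximated by recomputing it on circularly shifted copies of one series. The shift construction stands in for the paper's corrected p-value estimate: it preserves each series' temporal structure while breaking the cross-dependence, whereas an i.i.d.-style permutation would also destroy the autocorrelation and over-reject. The AR(1) data and all parameter values are invented.

import numpy as np

def rbf_gram(x, sigma=1.0):
    # RBF Gram matrix for a 1-D sample
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * sigma**2))

def hsic(x, y):
    # Biased V-statistic estimate of HSIC: trace(K H L H) / n^2
    n = x.size
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(rbf_gram(x) @ H @ rbf_gram(y) @ H) / n**2

rng = np.random.default_rng(1)
n = 300
# Two AR(1) processes whose innovations are nonlinearly dependent:
# e and |e| are uncorrelated, so a linear test sees nothing here
e = rng.standard_normal(n)
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + e[t]
    y[t] = 0.5 * y[t - 1] + np.abs(e[t])

stat = hsic(x, y)

# Null distribution via circular shifts of y, keeping a margin around zero lag
null = np.array([hsic(x, np.roll(y, s)) for s in range(20, n - 20)])
p_value = np.mean(null >= stat)
print(f"HSIC = {stat:.4f}, shift-based p-value = {p_value:.3f}")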

Talk 3

Kernel Adaptive Metropolis-Hastings

D. Sejdinovic, H. Strathmann, M. Lomeli Garcia, C. Andrieu and A. Gretton

A Kernel Adaptive Metropolis-Hastings algorithm is introduced, for the purpose of sampling from a target distribution with strongly nonlinear support. The algorithm embeds the trajectory of the Markov chain into a reproducing kernel Hilbert space (RKHS), such that the feature space covariance of the samples informs the choice of proposal. The procedure is computationally efficient and straightforward to implement, since the RKHS moves can be integrated out analytically: our proposal distribution in the original space is a normal distribution whose mean and covariance depend on where the current sample lies in the support of the target distribution, and adapts to its local covariance structure. Furthermore, the procedure requires neither gradients nor any other higher order information about the target, making it particularly attractive for contexts such as Pseudo-Marginal MCMC. Kernel Adaptive Metropolis-Hastings outperforms competing fixed and adaptive samplers on multivariate, highly nonlinear target distributions, arising in both real-world and synthetic examples.
Code: https://github.com/karlnapf/kameleon-mcmc
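
The repository above contains the authors' implementation; the sketch below is only a plausible reading of the proposal described in the abstract. At the current point x, the proposal is Gaussian with mean x and covariance gamma^2 I + nu^2 M H M^T, where the columns of M are RBF-kernel gradients between x and a subsample of the chain history and H is the centering matrix. The target, kernel, and all parameter values are invented, and the decaying adaptation schedule needed for ergodicity is omitted.

import numpy as np
from scipy.stats import multivariate_normal

def kernel_grad_matrix(x, Z, sigma=1.0):
    # column i is grad_x k(x, z_i) for the RBF kernel
    diffs = Z - x                                    # shape (m, d)
    k = np.exp(-0.5 * np.sum(diffs**2, axis=1) / sigma**2)
    return (diffs * k[:, None]).T / sigma**2         # shape (d, m)

def proposal_cov(x, Z, gamma=0.2, nu=1.0):
    m = Z.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m              # centering matrix
    M = kernel_grad_matrix(x, Z)
    return gamma**2 * np.eye(x.size) + nu**2 * M @ H @ M.T

def log_target(x):
    # a 2-D "banana" density: strongly nonlinear support, no gradients used
    y = np.array([x[0], x[1] + 0.5 * (x[0]**2 - 1.0)])
    return -0.5 * np.sum(y**2)

rng = np.random.default_rng(2)
x = np.zeros(2)
history = [x.copy()]
for it in range(5000):
    idx = rng.choice(len(history), min(50, len(history)), replace=False)
    Z = np.array(history)[idx]                       # history subsample
    C_x = proposal_cov(x, Z)
    x_new = rng.multivariate_normal(x, C_x)
    # the covariance depends on the current point, so the proposal is
    # asymmetric and both densities enter the acceptance ratio
    C_new = proposal_cov(x_new, Z)
    log_a = (log_target(x_new) - log_target(x)
             + multivariate_normal.logpdf(x, x_new, C_new)
             - multivariate_normal.logpdf(x_new, x, C_x))
    if np.log(rng.random()) < log_a:
        x = x_new
    history.append(x.copy())
print("posterior mean estimate:", np.mean(history[1000:], axis=0))

Because the covariance is built from kernel gradients at the current point, the proposal stretches along the local direction of the target's support without ever querying the target for gradients.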
