
Variational approximate inference in linear latent variable models


Speaker

Ed Challis

Affiliation

UCL

Date

Friday, 11 January 2013

Time

12:30-14:00

Location

Cruciform B404 - LT2

Event series

DeepMind/ELLIS CSML Seminar Series

Abstract

Linear latent variable models (such as factor analysis and probabilistic principal components analysis) and Bayesian generalized linear models (such as logistic regression and noise-robust linear regression) are used widely throughout machine learning and statistics. However, in all but the simplest cases, exact inference in these models is computationally intractable.

This talk will focus on parametric Kullback-Leibler (KL) approximate inference methods as applied to such models. Parametric KL approximate inference provides both a parametric approximation to the intractable posterior and a lower bound on its normalisation constant. I will present my work on developing Gaussian KL approximate inference methods and introduce a new flexible class of approximating densities for which parametric KL inference is tractable and efficient.
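For concreteness, the bound referred to above is the standard KL variational lower bound; the sketch below uses generic notation (prior p(w), likelihood p(D|w), approximating density q(w)) and is not taken from the talk itself:

\begin{align}
\log p(\mathcal{D}) &= \mathrm{KL}\left[ q(\mathbf{w}) \,\|\, p(\mathbf{w} \mid \mathcal{D}) \right] + \mathcal{B}(q), \\
\mathcal{B}(q) &= \mathbb{E}_{q}\!\left[ \log p(\mathcal{D} \mid \mathbf{w})\, p(\mathbf{w}) \right] - \mathbb{E}_{q}\!\left[ \log q(\mathbf{w}) \right].
\end{align}

Since the KL term is non-negative, \(\mathcal{B}(q) \le \log p(\mathcal{D})\); maximising \(\mathcal{B}(q)\) over the parameters of q (for Gaussian KL methods, a mean and covariance) simultaneously tightens the bound on the normalisation constant and drives q towards the intractable posterior.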

Slides for the talk: PDF
