| Speaker | Yingzhen Li |
|---|---|
| Affiliation | University of Cambridge |
| Date | Friday, 01 December 2017 |
| Time | 13:00-14:00 |
| Location | Roberts Building G08 Sir David Davies LT |
| Link | Zoom |
| Event series | Jump Trading/ELLIS CSML Seminar Series |
| Abstract | This talk describes very recent efforts to develop approximate inference algorithms that enable approximations of arbitrary form. I will start by revisiting fundamental tractability issues of Bayesian computation and argue that density evaluation of the approximate posterior is mostly unnecessary. I will then present four categories of wild approximate inference methods that have been explored recently, focusing on two developed by myself and colleagues. I will briefly cover: 1. an amortised MCMC algorithm that improves the approximate posterior by following the particle updates of a valid MCMC sampler; and 2. a gradient estimation method that allows variational inference to be applied to approximate distributions without a tractable density. |
| Biography | |
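To make the first idea in the abstract concrete, here is a minimal sketch of the amortised-MCMC flavour of update: draw particles from the approximation q, improve them with a few steps of a valid MCMC sampler, then refit q to the improved particles. The toy 1D Gaussian target, the use of unadjusted Langevin dynamics as the sampler, and the maximum-likelihood refit of q are all illustrative assumptions, not the speaker's exact algorithm.

```python
import torch

# Toy unnormalised target: log p(z) for a 1D Gaussian N(2.0, 0.5^2).
# (Hypothetical target; the approach only needs gradients of log p.)
def log_p(z):
    return -0.5 * ((z - 2.0) / 0.5) ** 2

# Parameters of a Gaussian approximation q(z) = N(mu, sigma^2).
mu = torch.zeros(1, requires_grad=True)
log_sigma = torch.zeros(1, requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma], lr=1e-2)

step, T, n = 0.05, 5, 128  # Langevin step size, MCMC steps, particles

for it in range(2000):
    # 1. Draw particles from the current approximation q.
    z = (mu + log_sigma.exp() * torch.randn(n, 1)).detach()

    # 2. Improve them with T steps of unadjusted Langevin dynamics,
    #    an (approximately valid) MCMC sampler targeting p.
    for _ in range(T):
        z = z.requires_grad_(True)
        grad = torch.autograd.grad(log_p(z).sum(), z)[0]
        z = (z + 0.5 * step * grad
             + step ** 0.5 * torch.randn_like(z)).detach()

    # 3. Update q to follow the improved particles: here by maximum
    #    likelihood of q on the moved (detached) particles.
    opt.zero_grad()
    nll = -torch.distributions.Normal(mu, log_sigma.exp()).log_prob(z).mean()
    nll.backward()
    opt.step()

print(mu.item(), log_sigma.exp().item())  # roughly 2.0 and 0.5
```

Note that step 3 never evaluates the density of the particles under the sampler, which is the point made in the abstract: the improvement signal comes from the particle updates themselves, so the recipe extends to approximate posteriors of much wilder form than the Gaussian used in this sketch.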