| Speaker | David Duvenaud |
|---|---|
| Affiliation | University of Toronto |
| Date | Friday, 12 February 2021 |
| Time | 14:00-15:00 |
| Location | Zoom |
| Link | https://ucl.zoom.us/j/99166798620 |
| Event series | Jump Trading/ELLIS CSML Seminar Series |
| Abstract | We show how to do gradient-based stochastic variational inference in stochastic differential equations (SDEs) in a way that allows the use of adaptive SDE solvers. This allows us to scalably fit a new family of richly parameterized distributions over irregularly sampled time series. We apply latent SDEs to motion-capture data and use them to demonstrate infinitely deep Bayesian neural networks. We also discuss the pros and cons of this barely explored model class, comparing it to Gaussian processes and neural processes. Some technical details are in this paper: https://arxiv.org/abs/2001.01328 |
| Biography | David Duvenaud is an assistant professor in computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. He did his postdoc at Harvard University and his Ph.D. at the University of Cambridge. David also co-founded Invenia, an energy forecasting company. |
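
As a rough, illustrative sketch of the kind of model the abstract describes, the snippet below uses the `torchsde` library (released alongside the linked paper) to integrate an SDE with a learnable drift and diffusion and backpropagate a toy loss through the solver. The `ToySDE` class, network sizes, data, and objective are assumptions for illustration only, not the experiments or variational objective discussed in the talk.

```python
import torch
import torchsde


class ToySDE(torch.nn.Module):
    # Required attributes for torchsde: diagonal noise, Ito interpretation.
    noise_type = "diagonal"
    sde_type = "ito"

    def __init__(self, state_dim=4, hidden_dim=32):
        super().__init__()
        self.drift_net = torch.nn.Sequential(
            torch.nn.Linear(state_dim, hidden_dim),
            torch.nn.Tanh(),
            torch.nn.Linear(hidden_dim, state_dim),
        )
        # State-independent learnable diffusion scale, one per state dimension.
        self.log_sigma = torch.nn.Parameter(torch.zeros(state_dim))

    def f(self, t, y):
        # Drift term, parameterized by a small neural network.
        return self.drift_net(y)

    def g(self, t, y):
        # Diagonal diffusion term; must have the same shape as y.
        return self.log_sigma.exp().expand_as(y)


batch_size, state_dim = 8, 4
sde = ToySDE(state_dim=state_dim)
y0 = torch.randn(batch_size, state_dim)   # initial states
ts = torch.linspace(0.0, 1.0, 20)         # evaluation times (can be irregular)

# Integrate the SDE; the result is differentiable with respect to the parameters.
# torchsde also provides sdeint_adjoint and adaptive solvers for larger problems.
ys = torchsde.sdeint(sde, y0, ts, method="euler", dt=1e-2)  # (len(ts), batch, dim)

# Toy objective standing in for a variational bound: shrink trajectories to zero.
loss = ys.pow(2).mean()
loss.backward()
print(f"loss={loss.item():.4f}, grad norm={sde.log_sigma.grad.norm().item():.4f}")
```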