| Speaker | Umut Şimşekli |
|---|---|
| Affiliation | Télécom ParisTech |
| Date | Thursday, 09 May 2019 |
| Time | 13:00–14:00 |
| Location | Roberts G08 |
| Link | Zoom |
| Event series | Jump Trading/ELLIS CSML Seminar Series |
**Abstract**
Building on recent theory that connects implicit generative modeling with optimal transport, in this talk I will present a novel parameter-free algorithm for learning the underlying distributions of complicated datasets and sampling from them. The proposed algorithm is based on a functional optimization problem, which aims at finding a measure that is as close to the data distribution as possible while remaining expressive enough for generative modeling purposes. The problem will be formulated as a gradient flow in the space of probability measures. The connection between gradient flows and stochastic differential equations will let us develop a computationally efficient algorithm for solving the optimization problem, and the resulting algorithm will resemble recent dynamics-based Markov Chain Monte Carlo algorithms. I will then present finite-time error guarantees for the proposed algorithm. Finally, I will present experimental results, which support our theory and show that our algorithm is able to capture the structure of challenging distributions.
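The gradient-flow/SDE connection mentioned in the abstract is the same mechanism behind dynamics-based MCMC samplers. As a generic illustration only (not the speaker's algorithm), the sketch below discretizes the Langevin SDE, whose continuous-time limit is the Wasserstein gradient flow of the KL divergence to the target distribution:

```python
import numpy as np

def langevin_sample(grad_log_p, x0, step=0.05, n_steps=20000, seed=0):
    """Unadjusted Langevin dynamics:
        x_{k+1} = x_k + step * grad log p(x_k) + sqrt(2*step) * N(0, I).
    This Euler-Maruyama discretization of the Langevin SDE targets p
    (up to discretization bias), mirroring the gradient-flow view of
    sampling as optimization over probability measures."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for k in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_p(x) + np.sqrt(2 * step) * noise
        samples[k] = x
    return samples

# Target: standard Gaussian, so grad log p(x) = -x.
samples = langevin_sample(lambda x: -x, x0=[3.0])
chain = samples[5000:, 0]  # discard burn-in
```

After burn-in, the chain's empirical mean and standard deviation should approach the target's (0 and 1 here), up to a small step-size-dependent bias.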
**Biography**