| Speaker | Tom Furmston |
|---|---|
| Affiliation | UCL |
| Date | Friday, 11 May 2012 |
| Time | 12:30–14:00 |
| Location | Wilkins Haldane Room, Wilkins Building |
| Event series | Jump Trading/ELLIS CSML Seminar Series |
| Abstract | Gradient-based algorithms are among the methods of choice for the optimisation of Markov Decision Processes. In this talk we will present a novel approximate Newton algorithm for the optimisation of such models. The algorithm has several desirable properties over a naive application of Newton's method. Firstly, the approximate Hessian is guaranteed to be negative semi-definite over the entire parameter space whenever the controller is $\log$-concave in the control parameters. Additionally, the inference required for our approximate Newton method is often the same as that required for first-order methods, such as steepest gradient ascent. The approximate Hessian also has many useful sparsity properties that are not present in the true Hessian and that make its inversion efficient in many situations of interest. We also provide an analysis that highlights a relationship between our approximate Newton method and both Expectation Maximisation and natural gradient ascent. Empirical results suggest that the algorithm has excellent convergence and robustness properties. Time permitting, we will then go on to the problem of performing inference in gradient-based algorithms, where we shall focus on model-based inference. Slides for the talk: PDF (a sketch of the key identity follows the table) |
| Biography | |
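For context, here is a minimal sketch of the kind of identity behind such an approximate Newton method. It follows from standard policy-gradient calculations rather than from the talk itself, so the exact decomposition and conditions used by the speaker may differ. Writing $U(\theta) = \mathbb{E}_{p_\theta}[R(\tau)]$ for the expected return over trajectories $\tau$, differentiating the score-function form of the gradient gives

$$
\nabla^2 U(\theta)
= \underbrace{\mathbb{E}_{p_\theta}\!\big[\nabla \log p_\theta(\tau)\,\nabla \log p_\theta(\tau)^{\top} R(\tau)\big]}_{\mathcal{H}_1(\theta)}
+ \underbrace{\mathbb{E}_{p_\theta}\!\big[\nabla^2 \log p_\theta(\tau)\, R(\tau)\big]}_{\mathcal{H}_2(\theta)}.
$$

Because the transition dynamics do not depend on $\theta$, the second term reduces to $\mathbb{E}\big[\sum_t \nabla^2 \log \pi_\theta(a_t \mid s_t)\, R(\tau)\big]$, which is negative semi-definite whenever $\log \pi_\theta$ is concave in $\theta$ and the rewards are non-negative; this matches the $\log$-concavity condition in the abstract. Retaining only $\mathcal{H}_2$ as the approximate Hessian yields the update

$$
\theta_{k+1} = \theta_k - \alpha\, \mathcal{H}_2(\theta_k)^{-1} \nabla U(\theta_k), \qquad \alpha > 0,
$$

which requires only expectations of the policy's score and its derivatives under the current policy, i.e. the same inference as steepest gradient ascent.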