UCL ELLIS

Series Expansion Methods for Approximate Learning, Filtering and Smoothing in Diffusions


Speaker

Amos Storkey

Affiliation

Edinburgh University

Date

Friday, 21 November 2014

Time

13:00-14:00

Location

Roberts G08 (Sir David Davies lecture theatre)

Event series

DeepMind/ELLIS CSML Seminar Series

Abstract

Many systems in science, engineering and finance are described using known forms of parameterised differential equations. Often there are unknown random influences on such systems, and so a stochastic differential system is an appropriate model. However, inference and learning in general nonlinear stochastic differential systems are notoriously hard, primarily because the finite time transition probability cannot be represented explicitly.
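
As a rough sketch of the setting (notation assumed here, not taken from the abstract), such a system can be written as a parameterised diffusion

    dX_t = f(X_t; \theta) \, dt + \sigma(X_t; \theta) \, dW_t,

where W_t is a Brownian motion. The transition density p(x_{t+\Delta} | x_t; \theta) solves the associated Fokker-Planck equation over an interval of length \Delta, and for a general nonlinear drift f it has no closed form, which is why the likelihood of discretely observed data cannot be evaluated exactly.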

I will discuss the series expansion approach for approximating a diffusion, and demonstrate the method both applied directly to parameter estimation in diffusion processes and used within nonlinear Kalman filters and the unscented particle filter.
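
As a hedged illustration of the series expansion idea (a generic sketch, not the specific algorithm presented in the talk; the function names, the truncation level and the example diffusion dX = -X^3 dt + 0.5 dW are my own choices), one can replace the Brownian path on an interval by a truncated Karhunen-Loeve expansion, so that a single transition of the SDE becomes an ordinary differential equation driven by finitely many Gaussian coefficients:

    import numpy as np

    def kl_brownian_derivative(z, t, T):
        # Time derivative of a Brownian path on [0, T] approximated by a
        # truncated Karhunen-Loeve (series) expansion with coefficients z.
        # The k-th basis function of W_t is sqrt(2T) sin((k - 1/2) pi t / T) / ((k - 1/2) pi),
        # so its derivative is sqrt(2 / T) cos((k - 1/2) pi t / T).
        k = np.arange(1, len(z) + 1)
        return np.sum(z * np.sqrt(2.0 / T) * np.cos((k - 0.5) * np.pi * t / T))

    def approximate_transition(x0, drift, diffusion, T, z, n_steps=200):
        # Integrate dX = drift(X) dt + diffusion(X) dW over [0, T], with the
        # Brownian motion replaced by its truncated expansion, so the SDE is
        # treated as an ODE conditioned on the Gaussian coefficients z.
        dt = T / n_steps
        x = x0
        for i in range(n_steps):
            t = i * dt
            x = x + (drift(x) + diffusion(x) * kl_brownian_derivative(z, t, T)) * dt
        return x

    # One approximate draw from the transition density of the example
    # diffusion dX = -X^3 dt + 0.5 dW over an interval of length T = 1.
    rng = np.random.default_rng(0)
    z = rng.standard_normal(8)  # eight series coefficients, each N(0, 1)
    print(approximate_transition(1.0, lambda x: -x**3, lambda x: 0.5, T=1.0, z=z))

Because the transition now depends smoothly on a finite Gaussian vector z, it can be pushed through sigma-point approximations (unscented transforms) or sampled within a particle filter, which is the kind of use described in the abstract.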

This talk describes joint work with Simon Lyons and Simo Sarkka, funded by an MSR Cambridge PhD fellowship.

Biography

Amos Storkey is a reader (associate professor) at the School of Informatics, Edinburgh University. He did his PhD in neural networks at the Neural Systems Group, Imperial College, London. His research interests include machine learning markets, Bayesian methods for brain imaging, continuous time/depth systems, dynamical Boltzmann machine models, and scalable deep learning.

Video of the talk here.
