UCL ELLIS

Generating Sequences with Recurrent Neural Networks


Speaker

Alex Graves

Affiliation

Google DeepMind

Date

Friday, 25 April 2014

Time

13:00-14:00

Location

Malet Place Engineering Building 1.02

Event series

DeepMind/ELLIS CSML Seminar Series

Abstract

Generating sequential data is the closest computers get to dreaming. Digital dreams are likely to play a crucial role in the future of AI, by helping agents to simulate, predict and interpret their surroundings. This talk shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with large-scale structure, simply by
predicting one step at a time. The method is demonstrated for character-level language modelling (where the data are discrete) and speech and handwriting generation (where the data are real-valued). A novel extension allows the network to condition its predictions on an auxiliary input sequence, making it possible to speak or write specific texts.
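The core idea in the abstract, generating a long sequence simply by predicting one step at a time and feeding each prediction back in, can be sketched in a few lines. The snippet below is a hedged illustration, not the talk's actual model: the dictionary `NEXT_CHAR_PROBS` is a toy stand-in for a trained character-level LSTM, which would instead compute the next-character distribution from its recurrent hidden state.

```python
import random

# Toy stand-in for a trained character-level LSTM (assumption for
# illustration): a next-character distribution conditioned only on the
# previous character. A real LSTM conditions on its full hidden state.
NEXT_CHAR_PROBS = {
    "h": {"e": 1.0},
    "e": {"l": 1.0},
    "l": {"l": 0.5, "o": 0.5},
    "o": {".": 1.0},
}

def sample_sequence(seed: str, max_len: int, rng: random.Random) -> str:
    """Autoregressive generation: sample one character at a time and
    feed each sampled character back in as the next input."""
    out = seed
    while len(out) < max_len:
        dist = NEXT_CHAR_PROBS.get(out[-1])
        if dist is None:  # '.' has no successors: stop generating
            break
        chars, weights = zip(*dist.items())
        out += rng.choices(chars, weights=weights)[0]
    return out

print(sample_sequence("h", 10, random.Random(0)))
```

The same loop structure carries over to the real-valued case (speech, handwriting), where each step emits parameters of a continuous distribution instead of a categorical one, and to the conditional extension mentioned in the abstract, where an auxiliary text sequence is supplied as an extra input at every step.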

Biography