
Implicit Regularization for Optimal Sparse Recovery


Speaker

Varun Kanade

Affiliation

University of Oxford

Date

Friday, 15 November 2019

Time

13:00-14:00

Location

Roberts Building G06 Sir Ambrose Fleming LT

Event series

Jump Trading/ELLIS CSML Seminar Series

Abstract

We present an implicit regularization scheme for gradient descent methods
applied to unpenalized least squares regression to solve the problem of
reconstructing a sparse signal from an underdetermined system of linear
measurements under the restricted isometry assumption. For a given
parameterization yielding a non-convex optimization problem, we show that
prescribed choices of initialization, step size and stopping time yield a
statistically and computationally optimal algorithm that achieves the minimax
rate at a cost that matches, up to poly-logarithmic factors, the cost of
reading the data. Beyond minimax optimality, we show that our algorithm adapts to
instance difficulty and yields a dimension-independent rate when the
signal-to-noise ratio is high enough. We validate our findings with numerical
experiments and compare our algorithm against explicit $\ell_{1}$ penalization.
Going from hard instances to easy ones, our algorithm is seen to undergo a
phase transition, eventually matching the performance of least squares with
oracle knowledge of the true support.
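
As a concrete illustration, the sketch below implements one common instantiation of this idea from the sparse-recovery literature: gradient descent on the unpenalized least-squares loss under the Hadamard-product reparameterization $w = u \odot u - v \odot v$, with a small initialization, a fixed step size, and early stopping. The parameterization and the values of alpha, eta and n_iters here are illustrative assumptions; the abstract does not spell out the scheme used in the talk, and the prescribed choices there carry theoretical guarantees that this toy sketch does not.

import numpy as np

def sparse_recovery_gd(X, y, alpha=1e-6, eta=0.1, n_iters=2000):
    """Gradient descent on the unpenalized least-squares loss
    f(u, v) = ||X(u*u - v*v) - y||^2 / (2n).  The small initialization
    alpha, the fixed step size eta, and early stopping (n_iters)
    together play the role of the implicit regularizer."""
    n, d = X.shape
    u = np.full(d, alpha)           # small positive initialization
    v = np.full(d, alpha)
    for _ in range(n_iters):
        w = u * u - v * v           # current iterate in signal space
        g = X.T @ (X @ w - y) / n   # gradient of the loss w.r.t. w
        u = u - eta * 2 * g * u     # chain rule: dw/du =  2u
        v = v + eta * 2 * g * v     # chain rule: dw/dv = -2v
    return u * u - v * v

# Toy usage: a 5-sparse signal, 100 noisy Gaussian measurements, d = 400.
rng = np.random.default_rng(0)
n, d, k = 100, 400, 5
X = rng.standard_normal((n, d))
w_true = np.zeros(d)
w_true[rng.choice(d, k, replace=False)] = rng.choice([-1.0, 1.0], size=k)
y = X @ w_true + 0.01 * rng.standard_normal(n)
print("recovery error:", np.linalg.norm(sparse_recovery_gd(X, y) - w_true))

Note that no explicit $\ell_{1}$ penalty appears anywhere: sparsity of the recovered signal is controlled entirely by the initialization scale and the stopping time, which is the sense in which the regularization is implicit.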

(based on joint work with Patrick Rebeschini and Tomas Vaskevicius)

Biography

Varun Kanade is an associate professor in the Department of Computer Science at the University of Oxford. He was previously a Simons Postdoctoral Fellow at the University of California, Berkeley, and an FSMP postdoctoral fellow at ENS, Paris. He obtained his Ph.D. from Harvard University in 2012. His research interests are in machine learning and theoretical computer science.