Workshop on Inverse Problems and Machine Learning

May 27-29, 2019

Program


Monday, May 27, 2019

08:30 - 09:00
Registration (Room 5345) and Coffee & Croissants (Room 6245)


09:00 - 09:40
Mikhail (Misha) Belkin
(The Ohio State University)
Rethinking the bias-variance trade-off
09:45 - 10:25
Dirk Lorenz
(Technische Universität Braunschweig)
Unrolling of primal-dual algorithms
10:30 - 11:10
Coffee break
11:10 - 11:50
Dejan Slepcev
(Carnegie Mellon University)
Proper weighted Laplacian for semi-supervised learning
12:00 - 13:30
Lunch break
13:30 - 14:10
Michael Mahoney
(UC Berkeley)
Why deep learning works: traditional and heavy-tailed implicit self-regularization in deep neural networks
14:15 - 14:55
Christoph Brune
(University of Twente)
Deep learning decomposition for inverse problems
15:00 - 15:30
Coffee break
15:30 - 16:10
Venkat Chandrasekaran
(California Institute of Technology)
A geometric perspective on false discovery control
16:15 - 16:55
Giovanni S. Alberti
(Università di Genova)
Adversarial deformations in deep neural networks


Tuesday, May 28, 2019

08:30 - 09:00
Coffee & Croissants


09:00 - 09:40
Jeffrey William Calder
(University of Minnesota)
PDE continuum limits for prediction with expert advice
09:45 - 10:25
Christoph Schwab
(ETH Zurich)
Deep neural network expression in Bayesian PDE constrained data assimilation and inversion
10:30 - 11:10
Coffee break
11:10 - 11:50
Bamdad Hosseini
(California Institute of Technology)
Consistency of semi-supervised learning algorithms on graphs
12:00 - 13:30
Lunch break
13:30 - 14:10
Nicolas Le Roux
(Google Brain - Montréal)
On the interplay between noise and curvature in deep learning
14:15 - 14:55
Matteo Santacesaria
(University of Genoa)
Inverse problems for PDEs via compressed sensing
15:00 - 15:30
Coffee break
15:30 - 16:10
Matthew Thorpe
(University of Cambridge)
On the well posedness of graph Laplacian regularisation in semi-supervised learning
16:15 - 16:55
Nikola Kovachki
(California Institute of Technology)
Continuous time limits for momentum methods as implemented in machine learning


Wednesday, May 29, 2019

08:30 - 09:00
Coffee & Croissants


09:00 - 09:40
Mauro Maggioni
(Johns Hopkins University)
Statistical learning & dynamical systems: exploiting hidden low-dimensional structures
09:45 - 10:25
Bharath Sriperumbudur
(The Pennsylvania State University)
Distribution regression: theory and applications
10:30 - 11:10
Coffee break
11:10 - 11:50
Daniel Sanz-Alonso
(University of Chicago)
Function-space inverse problems and semi-supervised learning
12:00 - 13:30
Lunch break
13:30 - 14:10
Franca Hoffmann
(California Institute of Technology)
Geometric insights into spectral clustering by graph Laplacian embeddings
14:15 - 14:55
Eldad Haber
(The University of British Columbia)
Efficient architectures for deep neural networks
15:00 - 15:30
Coffee break
15:30 - 16:10
Alfredo Garbuno Inigo
(California Institute of Technology)
Optimize, learn, sample
16:15 - 16:55
Adam M. Oberman
(McGill University)