Monday, May 27, 2019

08:30 - 09:00  Registration (room 5345) and coffee & croissants (room 6245)
09:00 - 09:40  Mikhail (Misha) Belkin (The Ohio State University): Rethinking the bias-variance trade-off [Abstract]
09:45 - 10:25  Dirk Lorenz (Technische Universität Braunschweig): Unrolling of primal-dual algorithms [Abstract]
10:30 - 11:10  Coffee break
11:10 - 11:50  Dejan Slepcev (Carnegie Mellon University): Proper weighted Laplacian for semi-supervised learning [Abstract]
12:00 - 13:30  Lunch break
13:30 - 14:10  Michael Mahoney (UC Berkeley): Why deep learning works: traditional and heavy-tailed implicit self-regularization in deep neural networks [Abstract]
14:15 - 14:55  Christoph Brune (University of Twente): Deep learning decomposition for inverse problems [Abstract]
15:00 - 15:30  Coffee break
15:30 - 16:10  Venkat Chandrasekaran (California Institute of Technology): A geometric perspective on false discovery control [Abstract]
16:15 - 16:55  Giovanni S. Alberti (Università di Genova): Adversarial deformations in deep neural networks [Abstract]
Tuesday, May 28, 2019

08:30 - 09:00  Coffee & croissants
09:00 - 09:40  Jeffrey William Calder (University of Minnesota): PDE continuum limits for prediction with expert advice [Abstract]
09:45 - 10:25  Christoph Schwab (ETH Zurich): Deep neural network expression in Bayesian PDE constrained data assimilation and inversion [Abstract]
10:30 - 11:10  Coffee break
11:10 - 11:50  Bamdad Hosseini (California Institute of Technology): Consistency of semi-supervised learning algorithms on graphs [Abstract]
12:00 - 13:30  Lunch break
13:30 - 14:10  Nicolas Le Roux (Google Brain - Montréal): On the interplay between noise and curvature in deep learning [Abstract]
14:15 - 14:55  Matteo Santacesaria (University of Genoa): Inverse problems for PDEs via compressed sensing [Abstract]
15:00 - 15:30  Coffee break
15:30 - 16:10  Matthew Thorpe (University of Cambridge): On the well-posedness of graph Laplacian regularisation in semi-supervised learning [Abstract]
16:15 - 16:55  Nikola Kovachki (California Institute of Technology): Continuous time limits for momentum methods as implemented in machine learning [Abstract]
Wednesday, May 29, 2019

08:30 - 09:00  Coffee & croissants
09:00 - 09:40  Mauro Maggioni (Johns Hopkins University): Statistical learning & dynamical systems: exploiting hidden low-dimensional structures [Abstract]
09:45 - 10:25  Bharath Sriperumbudur (The Pennsylvania State University): Distribution regression: theory and applications
10:30 - 11:10  Coffee break
11:10 - 11:50  Daniel Sanz-Alonso (University of Chicago): Function-space inverse problems and semi-supervised learning [Abstract]
12:00 - 13:30  Lunch break
13:30 - 14:10  Franca Hoffmann (California Institute of Technology): Geometric insights into spectral clustering by graph Laplacian embeddings [Abstract]
14:15 - 14:55  Eldad Haber (The University of British Columbia): Efficient architectures for deep neural networks [Abstract]
15:00 - 15:30  Coffee break
15:30 - 16:10  Alfredo Garbuno-Iñigo (California Institute of Technology): Optimize, learn, sample [Abstract]
16:15 - 16:55  Adam M. Oberman (McGill University)
Last updated: Monday, May 27, 2019, 14:07