Workshop on Learning:
Optimization and Stochastics

5-7 July 2020

Overview

The goal of the workshop is to bring together experts from the areas of mathematics and computer science related to the theory of machine learning, and to share recent and exciting developments in a relaxed atmosphere. The workshop will take place on the EPFL campus, with social activities in the Lake Geneva area.

Workshop preview

Due to the COVID-19 situation, the workshop has been postponed to the same dates next year. A short preview was held over Zoom instead, featuring several open-problem and research presentations:

  • Lower Bounds for Sampling (Peter Bartlett; slides)
  • Black-box complexity of optimization in low dimensions (Sébastien Bubeck; slides)
  • Learning neural networks (Adam Klivans)
  • Cutting Convex Sets with Margin (Shay Moran; slides)
  • From Nesterov's Estimate Sequence to Riemannian Acceleration (Suvrit Sra; slides)

Speakers

Peter Bartlett (UC Berkeley)
Gerard Ben Arous (New York University)
Simina Branzei (Purdue University)
Sébastien Bubeck (Microsoft Research Redmond)
John Duchi (Stanford University)
Moritz Hardt (UC Berkeley)
Piotr Indyk (MIT)
Prateek Jain (Microsoft Research India)
Stefanie Jegelka (MIT)
Gauri Joshi (CMU)
Adam Klivans (UT Austin)
Pravesh Kothari (CMU)
James Lee (University of Washington)
Yin Tat Lee (University of Washington)
Aleksander Madry (MIT)
Andrea Montanari (Stanford University)
Shay Moran (Technion)
Praneeth Netrapalli (Microsoft Research India)
Alexander Rakhlin (MIT)
Suvrit Sra (MIT)
Nati Srebro (TTIC)

Tentative Program

6.7.20

09:00 - 12:00 Talks
12:00 - 13:30 Lunch
14:00 - 16:30 Talks
18:30 Dinner cruise

7.7.20

09:00 - 12:00 Talks
12:00 - 13:30 Lunch
14:00 - 16:30 Talks
17:00 Closing session

Registration

Coming soon.