LIONS @ 2020 OPT Workshop on Optimization for Machine Learning

OPT 2020 workshop website © 2021 EPFL

Professor Volkan Cevher was invited to give a talk at the 12th OPT Workshop on Optimization for Machine Learning.
The 2020 edition of the OPT workshop was held as a virtual event together with NeurIPS, from 11-12 December.
Slides for the presentation can be found at the following site.

Abstract:

In this talk, we review some of the recent advances in first-order methods for convex and non-convex optimization, as well as their universality properties. We say an algorithm is universal if it does not require knowledge of whether the optimization objective is smooth. We first recall the AdaGrad method and show that AdaGrad is a universal algorithm without any modifications: it implicitly exploits the smoothness of the problem to achieve the standard O(1/k) rate for smooth objectives, where k is the iteration count. Building on this, we introduce an accelerated, universal variant of AdaGrad, dubbed AcceleGrad, which in addition obtains the optimal convergence rate of O(1/k^2) in the smooth setting with deterministic oracles. We then introduce UniXGrad, the first algorithm that simultaneously achieves optimal rates for smooth or non-smooth problems with either deterministic or stochastic first-order oracles in the constrained convex setting. We conclude the presentation with results in non-convex optimization revolving around ADAM-type algorithms, including new convergence rates.
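
To illustrate the adaptivity behind the universality claim, here is a minimal sketch of the scalar (norm) variant of AdaGrad: the step size is set from the accumulated squared gradient norms, so the method needs no prior knowledge of smoothness or of the Lipschitz constant. This is an illustrative simplification of the diagonal AdaGrad discussed in the talk; the function name and parameters (eta, eps, iters) are hypothetical choices, not from the source.

```python
import numpy as np

def adagrad_norm(grad, x0, eta=1.0, eps=1e-8, iters=1000):
    """Sketch of AdaGrad-norm: a single adaptive scalar step size.

    grad : callable returning the (sub)gradient at x.
    The step size eta / sqrt(sum of squared gradient norms) shrinks
    automatically; on smooth problems the gradients vanish near the
    optimum, so the accumulated sum stays bounded and the method
    implicitly recovers a faster rate without any tuning.
    """
    x = np.asarray(x0, dtype=float).copy()
    g_sq_sum = 0.0
    for _ in range(iters):
        g = grad(x)
        g_sq_sum += float(np.dot(g, g))          # accumulate ||g_t||^2
        x -= eta / (np.sqrt(g_sq_sum) + eps) * g  # adaptive step
    return x

# Example: minimize the smooth quadratic f(x) = 0.5 * ||x||^2,
# whose gradient is simply x.
x_star = adagrad_norm(lambda x: x, x0=np.ones(5))
```

The same update also runs unchanged on non-smooth objectives (replace the gradient with any subgradient), which is the sense in which a single algorithm covers both regimes.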