Abstract: We introduce a new framework for unifying and systematizing the performance analysis of first-order black-box optimization algorithms for unconstrained convex minimization. The low per-iteration cost of first-order algorithms makes them particularly relevant for applications in machine learning and large-scale data analysis. Our approach is based on sum-of-squares optimization, which allows us to introduce a hierarchy of semidefinite programs (SDPs) that yield increasingly tight convergence bounds at higher levels of the hierarchy. The (dual of the) first level of the sum-of-squares hierarchy corresponds to the SDP reformulation of the Performance Estimation Problem, first introduced by Drori and Teboulle [Math. Program., 145(1):451-482, 2014] and developed further by Taylor, Hendrickx, and Glineur [Math. Program., 161(1):307-345, 2017]. To illustrate the usefulness of our approach, we recover, in a unified manner, several known convergence bounds for four widely used first-order algorithms, and derive new convergence results for noisy gradient descent with inexact line search.
This is joint work with Sandra S. Y. Tan (ECE, NUS) and Antonios Varvitsiotis (Singapore University of Technology and Design). Details can be found at https://arxiv.org/abs/1906.04648.
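As a concrete instance of the kind of convergence bound recovered by performance estimation, Drori and Teboulle's analysis gives, for gradient descent with step size 1/L on an L-smooth convex function, the worst-case guarantee f(x_N) - f* <= L ||x_0 - x*||^2 / (4N + 2). The following is a minimal numerical sketch checking this bound on a simple quadratic instance; the test function, dimensions, and variable names are illustrative choices, not part of the paper.

```python
import numpy as np

# Quadratic test function f(x) = 0.5 x^T A x with A positive semidefinite
# and eigenvalues at most L, so f is L-smooth and convex; minimizer x* = 0
# with optimal value f* = 0. (Illustrative instance, not from the paper.)
L = 1.0
A = np.diag([1.0, 0.5, 0.1, 0.01])  # eigenvalues in (0, L]

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

# Run N steps of gradient descent with the classical step size 1/L.
x0 = np.ones(4)
N = 20
x = x0.copy()
for _ in range(N):
    x = x - (1.0 / L) * grad(x)

gap = f(x)                                  # f(x_N) - f*, since f* = 0 here
bound = L * np.dot(x0, x0) / (4 * N + 2)    # Drori-Teboulle worst-case bound
print(gap, "<=", bound)
```

On this instance the observed optimality gap sits comfortably below the theoretical worst-case bound; the PEP/SOS machinery in the talk certifies that no L-smooth convex instance can exceed it.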
Bio: Vincent Tan is an associate professor in the Department of Electrical and Computer Engineering and Department of Mathematics at the National University of Singapore. His interests are in information theory, machine learning and high-dimensional statistics. He is a Distinguished Lecturer of the IEEE Information Theory Society.