On the Iteration Complexity of Oblivious First-Order Optimization Algorithms

Yossi Arjevani, Ohad Shamir

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We consider a broad class of first-order optimization algorithms which are oblivious, in the sense that their step sizes are scheduled regardless of the function under consideration, except for limited side-information such as smoothness or strong convexity parameters. With the knowledge of these two parameters, we show that any such algorithm attains an iteration complexity lower bound of Ω(√(L/ε)) for L-smooth convex functions, and Ω̃(√(L/μ) ln(1/ε)) for L-smooth μ-strongly convex functions. These lower bounds are stronger than those in the traditional oracle model, as they hold independently of the dimension. To attain these, we abandon the oracle model in favor of a structure-based approach which builds upon a framework recently proposed in Arjevani et al. (2015). We further show that without knowing the strong convexity parameter, it is impossible to attain an iteration complexity better than Ω̃((L/μ) ln(1/ε)). This result is then used to formalize an observation regarding L-smooth convex functions, namely, that the iteration complexity of algorithms employing time-invariant step sizes must be at least Ω(L/ε).
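To make the notion of an oblivious scheme concrete, here is a minimal sketch (not taken from the paper) of two such methods on a toy separable quadratic f(x) = ½ Σᵢ aᵢxᵢ². Both plain gradient descent with the time-invariant step size 1/L and Nesterov-style accelerated descent with a momentum coefficient set from the condition number L/μ fix their coefficients in advance from (L, μ) alone, which is exactly the side-information-only obliviousness the abstract describes. The function and parameter choices below are illustrative assumptions.

```python
# Sketch: two oblivious first-order methods on f(x) = 0.5 * sum(a_i * x_i^2),
# where L = max(a) and mu = min(a). Step sizes depend only on (L, mu),
# never on the iterates themselves.

def grad(a, x):
    # Gradient of the separable quadratic: (a_1 x_1, ..., a_d x_d).
    return [ai * xi for ai, xi in zip(a, x)]

def f(a, x):
    return 0.5 * sum(ai * xi * xi for ai, xi in zip(a, x))

def gd(a, x0, L, steps):
    # Plain gradient descent with the time-invariant step size 1/L;
    # its iteration complexity on smooth convex problems is of order L/eps.
    x = list(x0)
    for _ in range(steps):
        x = [xi - gi / L for xi, gi in zip(x, grad(a, x))]
    return x

def agd(a, x0, L, mu, steps):
    # Accelerated (Nesterov-style) scheme for strongly convex functions:
    # the momentum coefficient is scheduled from L/mu alone.
    kappa = L / mu
    beta = (kappa ** 0.5 - 1) / (kappa ** 0.5 + 1)
    x, y = list(x0), list(x0)
    for _ in range(steps):
        x_new = [yi - gi / L for yi, gi in zip(y, grad(a, y))]
        y = [xn + beta * (xn - xo) for xn, xo in zip(x_new, x)]
        x = x_new
    return x

a = [1.0, 100.0]   # mu = 1, L = 100, condition number kappa = 100
x0 = [1.0, 1.0]
steps = 50
print("GD :", f(a, gd(a, x0, 100.0, steps)))
print("AGD:", f(a, agd(a, x0, 100.0, 1.0, steps)))
```

On this instance, gradient descent contracts at roughly (1 − 1/κ) per iteration while the accelerated scheme contracts at roughly (1 − 1/√κ), so after 50 steps the accelerated iterate is far closer to the optimum. Since the accelerated schedule needs μ, the abstract's lower bound for unknown μ says this gap is unavoidable for oblivious algorithms.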
Original language: English
Title of host publication: ICML 2016
Publisher: PMLR
Pages: 908–916
Number of pages: 9
State: Published - 2016
Event: 33rd International Conference on Machine Learning, ICML 2016 - New York City, United States
Duration: 19 Jun 2016 – 24 Jun 2016

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 48
ISSN (Electronic): 2640-3498

Conference

Conference: 33rd International Conference on Machine Learning, ICML 2016
Country/Territory: United States
City: New York City
Period: 19/06/16 – 24/06/16
