Dimension-free iteration complexity of finite sum optimization problems

Yossi Arjevani, Ohad Shamir

Research output: Contribution to journal › Conference article › peer-review


Abstract

Many canonical machine learning problems boil down to a convex optimization problem with a finite sum structure. However, while much progress has been made in developing faster algorithms for this setting, the inherent limitations of these problems are not satisfactorily addressed by existing lower bounds. Indeed, current bounds focus on first-order optimization algorithms and apply only in the often unrealistic regime where the number of iterations is less than O(d/n) (where d is the dimension and n is the number of samples). In this work, we extend the framework of Arjevani et al. [3, 5] to provide new lower bounds, which are dimension-free and go beyond the assumptions of current bounds, thereby covering standard finite sum optimization methods such as SAG, SAGA, SVRG, and SDCA without duality, as well as stochastic coordinate-descent methods such as SDCA and accelerated proximal SDCA.
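The finite sum structure referred to above is the objective min_w F(w) = (1/n) * sum_{i=1}^n f_i(w). As a concrete illustration of one of the variance-reduced methods covered by these lower bounds, here is a minimal NumPy sketch of SVRG on a least-squares instance. This code is not from the paper (which proves lower bounds rather than proposing an algorithm), and the data A, b, the step size, and the epoch length are all illustrative assumptions.

    import numpy as np

    def svrg(A, b, step=0.01, epochs=20, m=None, seed=0):
        """Illustrative SVRG sketch (not the paper's contribution) on the
        finite-sum least-squares objective
        F(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2."""
        n, d = A.shape
        m = m if m is not None else 2 * n          # inner-loop length per epoch
        rng = np.random.default_rng(seed)
        w = np.zeros(d)
        for _ in range(epochs):
            w_ref = w.copy()                       # epoch reference point
            full_grad = A.T @ (A @ w_ref - b) / n  # full gradient at w_ref
            for _ in range(m):
                i = rng.integers(n)                    # sample one component f_i
                g_i = A[i] * (A[i] @ w - b[i])         # grad of f_i at w
                g_ref = A[i] * (A[i] @ w_ref - b[i])   # grad of f_i at w_ref
                w -= step * (g_i - g_ref + full_grad)  # variance-reduced step
        return w

    # Tiny usage example on synthetic data.
    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 5))
    w_true = rng.normal(size=5)
    b = A @ w_true
    print(np.linalg.norm(svrg(A, b) - w_true))  # should be close to 0

The key design point is that each inner step uses the control variate grad f_i(w) - grad f_i(w_ref) + full_grad, whose variance shrinks as w approaches w_ref; this is what lets such methods beat plain SGD, and it is exactly the class of methods the paper's dimension-free lower bounds apply to.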

Original language: English
Pages (from-to): 3548-3555
Number of pages: 8
Journal: Advances in Neural Information Processing Systems
State: Published - 2016
Externally published: Yes
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: 5 Dec 2016 - 10 Dec 2016

Bibliographical note

Publisher Copyright:
© 2016 NIPS Foundation - All Rights Reserved.
