IDEAL: Inexact decentralized accelerated augmented Lagrangian method

Yossi Arjevani, Joan Bruna, Bugra Can, Mert Gürbüzbalaban, Stefanie Jegelka, Hongzhou Lin

Research output: Contribution to journal › Conference article › peer-review


Abstract

We introduce a framework for designing primal methods in the decentralized optimization setting where local functions are smooth and strongly convex. Our approach consists of approximately solving a sequence of sub-problems induced by the accelerated augmented Lagrangian method, thereby providing a systematic way to derive several well-known decentralized algorithms, including EXTRA [47] and SSDA [43]. When coupled with accelerated gradient descent, our framework yields a novel primal algorithm whose convergence rate is optimal and matched by recently derived lower bounds. We provide experimental results that demonstrate the effectiveness of the proposed algorithm on highly ill-conditioned problems.
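Since the abstract is terse, the sketch below illustrates the general shape of scheme it describes: an outer augmented Lagrangian loop with momentum on the dual variable, whose sub-problems are solved only approximately by an inner gradient method. Every concrete choice here is an assumption for illustration, not the paper's algorithm: quadratic local losses f_i, a ring-graph gossip matrix W, plain gradient descent as the inexact inner solver (the paper's optimal variant couples the outer loop with accelerated gradient descent), a fixed heavy-ball-style dual momentum theta standing in for an accelerated dual schedule, and hand-picked step sizes.

```python
# Hypothetical sketch of an inexact accelerated augmented Lagrangian scheme
# for decentralized consensus optimization. NOT the paper's IDEAL algorithm:
# losses, gossip matrix, solver, and all parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3                      # number of agents, problem dimension
A = rng.normal(size=(n, 8, d))   # local data matrices A_i (assumed quadratic losses)
b = rng.normal(size=(n, 8))      # local targets b_i

def grad_F(X):
    """Stacked gradient of F(X) = sum_i f_i(x_i), one row per agent,
    with f_i(x) = 0.5 * ||A_i x - b_i||^2."""
    return np.stack([A[i].T @ (A[i] @ X[i] - b[i]) for i in range(n)])

# Gossip matrix W (graph Laplacian of a ring): W @ X == 0 iff all rows agree,
# so the consensus constraint can be written as sqrt(W) X = 0.
W = 2.0 * np.eye(n)
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = -1.0

beta, eta, theta = 1.0, 0.01, 0.5   # penalty, inner step size, dual momentum
X = np.zeros((n, d))                 # primal iterates, one row per agent
Lam = np.zeros((n, d))               # dual variable (multipliers folded through sqrt(W))
Lam_prev = Lam.copy()

for k in range(200):
    # Momentum-style extrapolation on the dual variable.
    Y = Lam + theta * (Lam - Lam_prev)
    # Inexactly solve the augmented Lagrangian sub-problem
    #   min_X  F(X) + <Y, X> + (beta/2) <X, W X>
    # with a fixed budget of gradient steps (the inexact inner solver).
    for _ in range(10):
        X = X - eta * (grad_F(X) + Y + beta * (W @ X))
    # Dual ascent step; W @ X measures the remaining disagreement.
    Lam_prev, Lam = Lam, Y + beta * (W @ X)

print("consensus violation:", np.linalg.norm(W @ X))
```

In this sketch, multiplication by W is the only operation that mixes rows of X, so each inner gradient step corresponds to one round of neighbor-to-neighbor communication; this is the sense in which such a method runs on a decentralized network rather than through a central coordinator.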

Original language: American English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
State: Published - 2020
Externally published: Yes
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 to 12 Dec 2020

Bibliographical note

Funding Information:
YA and JB acknowledge support from the Sloan Foundation and Samsung Research. BC and MG acknowledge support from the grants NSF DMS-1723085 and NSF CCF-1814888. HL and SJ acknowledge support from the Defense Advanced Research Projects Agency (grant number YFA17 N66001-17-1-4039). The views, opinions, and/or findings contained in this article are those of the authors and should not be interpreted as representing the official views or policies, either expressed or implied, of the Defense Advanced Research Projects Agency or the Department of Defense.

Publisher Copyright:
© 2020 Neural Information Processing Systems Foundation. All rights reserved.
