Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization

Shai Shalev-Shwartz, Tong Zhang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

113 Scopus citations


We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
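The method described in the abstract, stochastic dual coordinate ascent with a proximal step, can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a hedged illustration for the squared loss (ridge regression) with an optional L1 term handled by a soft-threshold prox, using the standard closed-form dual coordinate update for the smooth squared loss. The function name `prox_sdca_ridge` and all parameter choices are assumptions for illustration; the paper's accelerated variant additionally wraps such an inner solver in an outer iteration, which is omitted here.

```python
import numpy as np

def prox_sdca_ridge(X, y, lam, sigma1=0.0, epochs=50, seed=0):
    """Illustrative sketch (not the paper's implementation) of proximal SDCA for
    min_w (1/n) sum_i 0.5*(x_i^T w - y_i)^2 + lam*0.5*||w||^2 + sigma1*||w||_1.
    sigma1 = 0 recovers plain ridge SDCA; sigma1 > 0 adds an L1 term via a
    soft-threshold prox mapping the dual average v to the primal iterate w."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    alpha = np.zeros(n)                  # dual variables, one per example
    v = np.zeros(d)                      # v = (1/(lam*n)) * X^T alpha
    sq = (X ** 2).sum(axis=1)            # per-example squared norms, precomputed
    for _ in range(epochs):
        for i in rng.permutation(n):
            # proximal (soft-threshold) map from v to the primal point w
            w = np.sign(v) * np.maximum(np.abs(v) - sigma1 / lam, 0.0)
            # closed-form dual coordinate ascent step for the squared loss
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq[i] / (lam * n))
            alpha[i] += delta
            v += delta * X[i] / (lam * n)
    return np.sign(v) * np.maximum(np.abs(v) - sigma1 / lam, 0.0)
```

With `sigma1 = 0` this reduces to exact SDCA for ridge regression, so the output can be checked against the closed-form ridge solution; the randomized coordinate order (one pass over a permutation per epoch) is the "stochastic" part of the method.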

Original language: American English
Pages (from-to): 105-145
Number of pages: 41
Journal: Mathematical Programming
Issue number: 1-2
State: Published - 1 Jan 2016

Bibliographical note

Publisher Copyright:
© 2014, Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society.


Mathematics Subject Classification codes:
  • 90C06
  • 90C15
  • 90C25
