Abstract
We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
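To make the abstract's setting concrete, here is a minimal sketch of the plain (non-accelerated) stochastic dual coordinate ascent primitive that the proximal and accelerated variants build on, specialized to ridge regression. This is an illustration under our own assumptions, not the paper's algorithm: the function name, the regularized objective (1/(2n))·Σᵢ(xᵢᵀw − yᵢ)² + (λ/2)‖w‖², and the closed-form dual coordinate update for the squared loss are choices made for this example.

```python
import numpy as np

def sdca_ridge(X, y, lam, epochs=50, seed=0):
    """Sketch of plain SDCA for ridge regression (illustrative, not the paper's
    accelerated prox-SDCA). Maintains dual variables alpha and the induced
    primal iterate w = X.T @ alpha / (lam * n)."""
    n, d = X.shape
    rng = np.random.default_rng(seed)
    alpha = np.zeros(n)
    w = np.zeros(d)
    sq_norms = np.einsum('ij,ij->i', X, X)  # precomputed ||x_i||^2
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Exact maximization of the dual over the single coordinate
            # alpha_i; for the squared loss this step has a closed form.
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += (delta / (lam * n)) * X[i]  # keep w consistent with alpha
    return w
```

Because each coordinate step solves its one-dimensional dual problem exactly, the iterate converges to the ridge solution (XᵀX/n + λI)⁻¹Xᵀy/n; the accelerated framework of the paper wraps such an inner solver in an outer extrapolation loop to improve the dependence on the condition number.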
| Original language | English |
|---|---|
| Pages (from-to) | 105-145 |
| Number of pages | 41 |
| Journal | Mathematical Programming |
| Volume | 155 |
| Issue number | 1-2 |
| DOIs | |
| State | Published - 1 Jan 2016 |
Bibliographical note
Publisher Copyright: © 2014, Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society.
Keywords
- 90C06
- 90C15
- 90C25