Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization

Shai Shalev-Shwartz*, Tong Zhang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multi-class SVM. Experiments validate our theoretical findings.
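For orientation, when the regularizer is a plain ℓ2 term, proximal SDCA reduces to ordinary stochastic dual coordinate ascent, which admits a closed-form coordinate update for the squared loss. The sketch below illustrates that basic (non-accelerated, non-proximal) SDCA loop for ridge regression; it is an illustrative reconstruction, not the paper's accelerated inner-outer algorithm, and all function and variable names are my own.

```python
import numpy as np

def sdca_ridge(X, y, lam, epochs=200, seed=0):
    """Basic SDCA sketch for ridge regression:
    min_w (1/n) * sum_i 0.5*(x_i.w - y_i)^2 + (lam/2)*||w||^2.
    Maintains dual variables alpha and the primal iterate
    w = (1/(lam*n)) * sum_i alpha_i * x_i.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    alpha = np.zeros(n)
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            xi = X[i]
            # Closed-form maximization of the dual over coordinate i
            # (squared loss): standard SDCA update.
            delta = (y[i] - xi @ w - alpha[i]) / (1.0 + xi @ xi / (lam * n))
            alpha[i] += delta
            w += delta * xi / (lam * n)
    return w
```

Because the squared loss is smooth, this inner loop converges linearly; the paper's contribution is wrapping such an inner solver in an outer acceleration scheme and handling general (possibly non-smooth) regularizers via a proximal step.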

Original language: American English
Title of host publication: 31st International Conference on Machine Learning, ICML 2014
Publisher: International Machine Learning Society (IMLS)
Pages: 111-119
Number of pages: 9
ISBN (Electronic): 9781634393973
State: Published - 2014
Event: 31st International Conference on Machine Learning, ICML 2014 - Beijing, China
Duration: 21 Jun 2014 - 26 Jun 2014

Publication series

Name: 31st International Conference on Machine Learning, ICML 2014
Volume: 1

Conference

Conference: 31st International Conference on Machine Learning, ICML 2014
Country/Territory: China
City: Beijing
Period: 21/06/14 - 26/06/14
