TY - JOUR
T1 - Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
AU - Shalev-Shwartz, Shai
AU - Zhang, Tong
N1 - Publisher Copyright:
© 2014, Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society.
PY - 2016/1/1
AB - We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
KW - 90C06
KW - 90C15
KW - 90C25
UR - http://www.scopus.com/inward/record.url?scp=84953283129&partnerID=8YFLogxK
DO - 10.1007/s10107-014-0839-0
M3  - Article
AN - SCOPUS:84953283129
SN - 0025-5610
VL - 155
SP - 105
EP - 145
JO - Mathematical Programming
JF - Mathematical Programming
IS - 1-2
ER -