Pegasos: Primal estimated sub-GrAdient sOlver for SVM

Shai Shalev-Shwartz*, Yoram Singer, Nathan Srebro

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

707 Scopus citations

Abstract

We describe and analyze a simple and effective iterative algorithm for solving the optimization problem cast by Support Vector Machines (SVM). Our method alternates between stochastic gradient descent steps and projection steps. We prove that the number of iterations required to obtain a solution of accuracy ε is Õ(1/ε). In contrast, previous analyses of stochastic gradient descent methods require Ω(1/ε²) iterations. As in previously devised SVM solvers, the number of iterations also scales linearly with 1/λ, where λ is the regularization parameter of SVM. For a linear kernel, the total run-time of our method is Õ(d/(λε)), where d is a bound on the number of non-zero features in each example. Since the run-time does not depend directly on the size of the training set, the resulting algorithm is especially suited for learning from large datasets. Our approach can seamlessly be adapted to employ non-linear kernels while working solely on the primal objective function. We demonstrate the efficiency and applicability of our approach by conducting experiments on large text classification problems, comparing our solver to existing state-of-the-art SVM solvers. For example, it takes less than 5 seconds for our solver to converge when solving a text classification problem from Reuters Corpus Volume 1 (RCV1) with 800,000 training examples.
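To make the "stochastic gradient step plus projection step" structure concrete, below is a minimal NumPy sketch of a single-example Pegasos-style update: each iteration samples one training example, takes a sub-gradient step on the regularized hinge loss, and projects the weight vector onto the ball of radius 1/√λ. The step-size schedule η_t = 1/(λt), the function name, the default arguments, and the dense-array representation are illustrative choices here, not details taken from this abstract.

```python
import numpy as np

def pegasos_sketch(X, y, lam=1e-4, T=100_000, seed=None):
    """Illustrative single-example Pegasos-style solver.

    X: (n, d) array of examples; y: (n,) array of labels in {-1, +1};
    lam: SVM regularization parameter (lambda); T: number of iterations.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    radius = 1.0 / np.sqrt(lam)          # projection radius 1/sqrt(lambda)
    for t in range(1, T + 1):
        i = rng.integers(n)              # sample one training example uniformly
        eta = 1.0 / (lam * t)            # assumed step size 1/(lambda * t)
        margin = y[i] * X[i].dot(w)
        # sub-gradient step on (lambda/2)||w||^2 + hinge loss of the sampled example
        w *= (1.0 - eta * lam)
        if margin < 1.0:
            w += eta * y[i] * X[i]
        # projection step: keep w inside the ball of radius 1/sqrt(lambda)
        norm = np.linalg.norm(w)
        if norm > radius:
            w *= radius / norm
    return w
```

In this sketch the per-iteration cost depends only on the sampled example (its non-zero features, if stored sparsely), not on the size of the training set, which is consistent with the Õ(d/(λε)) run-time the abstract reports for the linear-kernel case.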

Original language: English
Pages: 807-814
Number of pages: 8
DOIs
State: Published - 2007
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvallis, OR, United States
Duration: 20 Jun 2007 - 24 Jun 2007

Conference

Conference: 24th International Conference on Machine Learning, ICML 2007
Country/Territory: United States
City: Corvallis, OR
Period: 20/06/07 - 24/06/07
