Some theory for generalized boosting algorithms

Peter J. Bickel, Ya'acov Ritov*, Alon Zakai

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

We give a review of various aspects of boosting, clarifying the issues through a few simple results, and relate our work and that of others to the minimax paradigm of statistics. We consider the population version of the boosting algorithm and prove its convergence to the Bayes classifier as a corollary of a general result about Gauss-Southwell optimization in Hilbert space. We then investigate the algorithmic convergence of the sample version and give bounds on the time until perfect separation of the sample. We conclude with some results on the statistical optimality of L2 boosting.
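To make the L2 boosting idea concrete, here is a minimal sketch of componentwise least-squares boosting: at each step the single coordinate whose least-squares fit to the current residuals most reduces squared error is selected (a Gauss-Southwell-style greedy choice), and a shrunken step is taken in that direction. This is an illustrative sketch, not the paper's implementation; the function name `l2_boost`, the step size `nu`, and the toy data are all assumptions made for the example.

```python
import numpy as np

def l2_boost(X, y, n_steps=500, nu=0.1):
    """Componentwise L2 boosting (illustrative sketch).

    At each step, fit each column of X to the current residuals by
    least squares, greedily pick the column giving the largest
    squared-error reduction (Gauss-Southwell selection), and update
    that coefficient by a shrunken amount nu.
    """
    n, p = X.shape
    coef = np.zeros(p)
    resid = y.astype(float).copy()
    col_norms = (X ** 2).sum(axis=0)  # ||x_j||^2 for each column
    for _ in range(n_steps):
        # least-squares coefficient of each column against the residuals
        b = X.T @ resid / col_norms
        # squared-error reduction achieved by each single coordinate
        gains = col_norms * b ** 2
        j = int(np.argmax(gains))
        coef[j] += nu * b[j]
        resid -= nu * b[j] * X[:, j]
    return coef

# Toy usage (hypothetical data): recover a sparse noiseless linear signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 2.0 * X[:, 3] - 1.0 * X[:, 7]
coef = l2_boost(X, y)
```

With a noiseless sparse signal, the greedy selection concentrates its updates on the two active coordinates, and the shrinkage factor `nu` plays the role of the slow-learning step size that the statistical analysis of L2 boosting relies on.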

Original language: English
Pages (from-to): 705-732
Number of pages: 28
Journal: Journal of Machine Learning Research
Volume: 7
State: Published - 2006

Keywords

  • AdaBoost
  • Classification
  • Cross-validation
  • Gauss-Southwell algorithm
  • Non-parametric convergence rate

