A PTAS for agnostically learning halfspaces

Amit Daniely*

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

We present a PTAS for agnostically learning halfspaces w.r.t. the uniform distribution on the d-dimensional sphere. Namely, we show that for every μ > 0 there is an algorithm that runs in time poly(d, 1/ϵ) and is guaranteed to return a classifier with error at most (1 + μ)opt + ϵ, where opt is the error of the best halfspace classifier. This improves on Awasthi, Balcan and Long (Awasthi et al., 2014), who showed an algorithm with an (unspecified) constant approximation ratio. Our algorithm combines the classical technique of polynomial regression (e.g., Linial et al., 1989; Kalai et al., 2005) with the new localization technique of Awasthi et al. (2014).
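To make the polynomial-regression ingredient mentioned in the abstract concrete, the sketch below fits a low-degree polynomial to the labels in L1 norm and classifies by its sign, in the spirit of Kalai et al. (2005). This is a minimal, illustrative sketch and not the paper's algorithm: it omits the localization step entirely, and the degree, sample size, noise rate, and the helper name `l1_poly_regression_classifier` are assumptions made for this example.

```python
# Minimal sketch of KKMS-style L1 polynomial regression (illustrative only;
# not the paper's algorithm -- the localization step is omitted).
import numpy as np
from scipy.optimize import linprog
from sklearn.preprocessing import PolynomialFeatures


def l1_poly_regression_classifier(X, y, degree=3):
    """Return sign(p(x)) where p minimizes the empirical L1 error sum_i |p(x_i) - y_i|."""
    Phi = PolynomialFeatures(degree).fit_transform(X)   # all monomials up to `degree`
    n, m = Phi.shape
    # LP formulation: minimize sum(t) subject to Phi@w - t <= y, -Phi@w - t <= -y, t >= 0
    c = np.concatenate([np.zeros(m), np.ones(n)])
    A_ub = np.block([[Phi, -np.eye(n)], [-Phi, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * m + [(0, None)] * n
    w = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs").x[:m]
    return lambda Xnew: np.sign(PolynomialFeatures(degree).fit_transform(Xnew) @ w)


# Toy usage: points on the unit sphere labelled by a halfspace, with a few flipped
# labels standing in for agnostic noise (all parameters are illustrative choices).
rng = np.random.default_rng(0)
d, n = 5, 400
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)          # project samples to the sphere
w_star = rng.normal(size=d)
y = np.sign(X @ w_star)
y[rng.random(n) < 0.05] *= -1                          # flip ~5% of the labels
clf = l1_poly_regression_classifier(X, y, degree=3)
print("training error:", np.mean(clf(X) != y))
```

The L1 fit is written as a linear program (auxiliary variables t bound the absolute residuals), which keeps the regression step a convex problem solvable in time polynomial in the number of monomials; the degree chosen here is an arbitrary small constant for illustration.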

Original language: English
Pages (from-to): 484-502
Number of pages: 19
Journal: Proceedings of Machine Learning Research
Volume: 40
State: Published - 2015
Event: 28th Conference on Learning Theory, COLT 2015 - Paris, France
Duration: 2 Jul 2015 - 6 Jul 2015

Bibliographical note

Publisher Copyright:
© 2015 A. Daniely.

Keywords

  • Agnostic learning
  • Approximation algorithms
  • Halfspaces
  • Localization
  • Polynomial approximation
  • Polynomial regression
  • Uniform distribution
