Complexity theoretic limitations on learning DNF's

Research output: Contribution to journal › Conference article › peer-review

55 Scopus citations

Abstract

Using the recently developed framework of Daniely et al. (2014), we show that under a natural assumption on the complexity of random K-SAT, learning DNF formulas is hard. Furthermore, the same assumption implies the hardness of various learning problems, including intersections of ω(log(n)) halfspaces, agnostically learning conjunctions, as well as virtually all (distribution free) learning problems that were previously shown hard (under various complexity assumptions).
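For readers unfamiliar with the terminology, the central object of the abstract, a DNF (disjunctive normal form) formula, is an OR of ANDs of literals; a minimal illustration (the specific example formula below is hypothetical, chosen only to show the shape):

```latex
% A DNF formula over Boolean variables x_1, ..., x_n is a disjunction of
% terms, where each term is a conjunction of literals (a variable or its
% negation):
\varphi(x_1,\dots,x_n) \;=\; \bigvee_{i=1}^{t} \bigwedge_{j \in S_i} \ell_j

% Illustrative example with three terms:
\varphi \;=\; (x_1 \wedge \bar{x}_2) \;\vee\; (x_2 \wedge x_3 \wedge \bar{x}_4) \;\vee\; \bar{x}_5
```

The hardness result concerns PAC-learning such formulas in the distribution-free setting, i.e., without assumptions on the distribution generating the examples.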

Original language: English
Pages (from-to): 815-830
Number of pages: 16
Journal: Journal of Machine Learning Research
Volume: 49
State: Published - 6 Jun 2016
Event: 29th Conference on Learning Theory, COLT 2016 - New York, United States
Duration: 23 Jun 2016 - 26 Jun 2016

Bibliographical note

Publisher Copyright:
© 2016 A. Daniely & S. Shalev-Shwartz.

Keywords

  • DNFs
  • Hardness of learning
