Abstract
Using the recently developed framework of Daniely et al. (2014), we show that, under a natural assumption on the complexity of random K-SAT, learning DNF formulas is hard. Furthermore, the same assumption implies the hardness of various other learning problems, including learning intersections of ω(log(n)) halfspaces, agnostically learning conjunctions, and virtually all distribution-free learning problems that were previously shown hard (under various complexity assumptions).
| Original language | English |
|---|---|
| Pages (from-to) | 815-830 |
| Number of pages | 16 |
| Journal | Journal of Machine Learning Research |
| Volume | 49 |
| State | Published - 6 Jun 2016 |
| Event | 29th Conference on Learning Theory, COLT 2016 - New York, United States<br>Duration: 23 Jun 2016 → 26 Jun 2016 |
Bibliographical note
Publisher Copyright: © 2016 A. Daniely & S. Shalev-Shwartz.
Keywords
- DNFs
- Hardness of learning