Abstract
Using the recently developed framework of Daniely et al. (2014), we show that under a natural assumption on the complexity of refuting random K-SAT formulas, learning DNF formulas is hard. Furthermore, the same assumption implies the hardness of various learning problems, including learning intersections of ω(log(n)) halfspaces, agnostically learning conjunctions, and virtually all (distribution-free) learning problems that were previously shown hard (under various complexity assumptions).
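To make the objects in the abstract concrete, below is a minimal, hypothetical Python sketch (not from the paper): a DNF formula evaluated as an OR of ANDs of literals, alongside a generator for random K-SAT instances of the kind the hardness assumption concerns. All names here (`eval_dnf`, `random_ksat_instance`, the literal encoding) are our own illustrative choices.

```python
import random

# A DNF formula is an OR of terms; each term is an AND of literals.
# We encode a literal as (variable_index, is_positive).

def eval_dnf(terms, assignment):
    """True iff at least one term has all of its literals satisfied."""
    return any(
        all(assignment[v] == pos for v, pos in term)
        for term in terms
    )

def random_ksat_instance(n_vars, n_clauses, k, rng=random):
    """A random K-SAT instance: each clause is an OR of k literals over
    distinct variables, with each sign chosen uniformly at random."""
    return [
        [(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), k)]
        for _ in range(n_clauses)
    ]

# Example DNF: (x0 AND NOT x1) OR x2
terms = [[(0, True), (1, False)], [(2, True)]]
print(eval_dnf(terms, [True, False, False]))  # True: the first term is satisfied
print(random_ksat_instance(n_vars=10, n_clauses=3, k=3))  # a small random 3-SAT instance
```

The sketch only illustrates the syntax of the two problem classes; the paper's result is about the computational hardness of learning the former under an assumption about refuting the latter.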
| Original language | English |
| --- | --- |
| Pages (from-to) | 815-830 |
| Number of pages | 16 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 49 |
| Issue number | June |
| State | Published - 6 Jun 2016 |
| Event | 29th Conference on Learning Theory, COLT 2016, New York, United States; Duration: 23 Jun 2016 – 26 Jun 2016 |
Bibliographical note
Funding Information: Amit Daniely was a recipient of the Google Europe Fellowship in Learning Theory, and this research was supported in part by this Google Fellowship. Shai Shalev-Shwartz is supported by the Israeli Science Foundation grant number 590-10. We thank Uri Feige, Guy Kindler and Nati Linial for valuable discussions.
Publisher Copyright:
© 2016 A. Daniely & S. Shalev-Shwartz.
Keywords
- DNFs
- Hardness of learning