Abstract
We present a PTAS for learning random constant-depth networks. We show that for any fixed ϵ > 0 and depth i, there is a poly-time algorithm that, for any distribution on √d · S^{d−1}, learns random Xavier networks of depth i up to an additive error of ϵ. The algorithm runs in time and sample complexity (d̄)^{poly(ϵ^{−1})}, where d̄ is the size of the network. For some cases of sigmoid and ReLU-like activations the bound can be improved to (d̄)^{polylog(ϵ^{−1})}, resulting in a quasi-poly-time algorithm for learning constant-depth random networks.
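The learning setting described in the abstract can be illustrated with a short sketch: inputs drawn from the scaled sphere √d · S^{d−1} and a random constant-depth network with Xavier-style initialization. This is an assumption-laden illustration, not the paper's algorithm; the width, depth, activation, and the particular Xavier convention (i.i.d. N(0, 1/fan_in) weights) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64      # input dimension (illustrative)
depth = 3   # a fixed constant depth
width = 64  # hidden-layer width (illustrative)

def sample_input(d):
    """Sample uniformly from sqrt(d) * S^{d-1}: normalize a Gaussian
    vector to the unit sphere, then scale by sqrt(d)."""
    x = rng.standard_normal(d)
    return np.sqrt(d) * x / np.linalg.norm(x)

def random_xavier_net(dims):
    """One common Xavier convention: entries i.i.d. N(0, 1/fan_in),
    which keeps pre-activation magnitudes O(1) at every layer."""
    return [rng.standard_normal((m, n)) / np.sqrt(n)
            for n, m in zip(dims[:-1], dims[1:])]

def forward(weights, x, act=np.tanh):
    """Evaluate the network; tanh stands in for a sigmoid-like activation."""
    for W in weights[:-1]:
        x = act(W @ x)
    return weights[-1] @ x  # linear output layer

dims = [d] + [width] * depth + [1]
net = random_xavier_net(dims)
x = sample_input(d)
y = forward(net, x)  # scalar-valued target the learner must approximate
```

The learner's task in this setting is to output, from samples (x, f(x)) of such a random network f, a predictor within additive error ϵ under the input distribution.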
Original language | English |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023 |
Editors | A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine |
Publisher | Neural Information Processing Systems Foundation |
ISBN (Electronic) | 9781713899921 |
State | Published - 2023 |
Event | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States. Duration: 10 Dec 2023 → 16 Dec 2023. Conference number: 37 |
Publication series
Name | Advances in Neural Information Processing Systems |
---|---|
Volume | 36 |
ISSN (Print) | 1049-5258 |
Conference
Conference | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 |
---|---|
Country/Territory | United States |
City | New Orleans |
Period | 10/12/23 → 16/12/23 |
Bibliographical note
Publisher Copyright: © 2023 Neural Information Processing Systems Foundation. All rights reserved.