Proving the Lottery Ticket Hypothesis: Pruning is All You Need

Eran Malach*, Gilad Yehudai, Shai Shalev-Shwartz, Ohad Shamir

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

15 Scopus citations

Abstract

The lottery ticket hypothesis (Frankle and Carbin, 2018) states that a randomly-initialized network contains a small subnetwork that, when trained in isolation, can compete with the performance of the original network. We prove an even stronger hypothesis (as was also conjectured in Ramanujan et al., 2019), showing that for every bounded distribution and every target network with bounded weights, a sufficiently over-parameterized neural network with random weights contains a subnetwork with roughly the same accuracy as the target network, without any further training.
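The claim can be illustrated on a toy scale (this sketch is not from the paper; the network sizes, target function, and brute-force mask search are illustrative assumptions): fix a random two-layer ReLU network, never update its weights, and search only over binary pruning masks for the subnetwork that best matches a simple target function.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "target network": f(x) = 0.5 * x, with bounded weights.
def target(x):
    return 0.5 * x

# Over-parameterized random 2-layer ReLU net; weights are frozen at init.
width = 5
a = rng.uniform(-1, 1, width)   # input -> hidden weights
b = rng.uniform(-1, 1, width)   # hidden -> output weights

xs = np.linspace(-1, 1, 41)     # evaluation points from a bounded distribution

def pruned_output(mask_a, mask_b, x):
    """Output of the subnetwork selected by the binary masks."""
    h = np.maximum(0.0, (a * mask_a)[:, None] * x[None, :])  # ReLU hidden layer
    return ((b * mask_b)[:, None] * h).sum(axis=0)

def mse(mask_a, mask_b):
    ma, mb = np.array(mask_a), np.array(mask_b)
    return float(np.mean((pruned_output(ma, mb, xs) - target(xs)) ** 2))

# Brute-force search over all 2^(2*width) masks: pruning alone, no training,
# picks the best subnetwork of the random net.
best_err, best_mask = min(
    (mse(ma, mb), (ma, mb))
    for ma in itertools.product([0, 1], repeat=width)
    for mb in itertools.product([0, 1], repeat=width)
)

full_err = mse((1,) * width, (1,) * width)
print(f"full random network MSE: {full_err:.4f}")
print(f"best pruned subnet MSE:  {best_err:.4f}")
```

Since the all-ones mask is included in the search, the best subnetwork is never worse than the full random network; the paper's result says that with enough over-parameterization, such a subnetwork approximates the target itself.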

Original language: American English
Title of host publication: 37th International Conference on Machine Learning, ICML 2020
Editors: Hal Daumé III, Aarti Singh
Publisher: International Machine Learning Society (IMLS)
Pages: 6638-6647
Number of pages: 10
ISBN (Electronic): 9781713821120
State: Published - 1 Mar 2020
Externally published: Yes
Event: 37th International Conference on Machine Learning, ICML 2020 - Virtual, Online
Duration: 13 Jul 2020 → 18 Jul 2020

Publication series

Name: 37th International Conference on Machine Learning, ICML 2020
Volume: PartF168147-9

Conference

Conference: 37th International Conference on Machine Learning, ICML 2020
City: Virtual, Online
Period: 13/07/20 → 18/07/20

Bibliographical note

Funding Information:
Acknowledgements: This research is supported by the European Research Council (TheoryDL project), and by European Research Council (ERC) grant 754705.

Publisher Copyright:
© 2020 37th International Conference on Machine Learning, ICML 2020. All rights reserved.
