Training neural networks is traditionally done by providing a sequence of random mini-batches sampled uniformly from the entire training data. In this work, we analyze the effect of curriculum learning, which involves the non-uniform sampling of mini-batches, on the training of deep networks, and specifically CNNs trained for image recognition. To employ curriculum learning, the training algorithm must resolve two challenges: (i) sort the training examples by difficulty; (ii) compute a series of mini-batches that exhibit an increasing level of difficulty. We address challenge (i) using two methods: transfer learning from some competitive "teacher" network, and bootstrapping. In our empirical evaluation, both methods show similar benefits in terms of increased learning speed and improved final performance on test data. We address challenge (ii) by investigating different pacing functions to guide the sampling. The empirical investigation includes a variety of network architectures, using images from CIFAR-10, CIFAR-100, and subsets of ImageNet. We conclude with a novel theoretical analysis of curriculum learning, where we show how it effectively modifies the optimization landscape. We then define the concept of an ideal curriculum, and show that under mild conditions it does not change the corresponding global minimum of the optimization function.
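The two-step scheme described in the abstract can be sketched as follows: examples are sorted by a difficulty score (which, per the paper, could come from a pretrained "teacher" network's confidence), and a pacing function grows the easiest-first prefix of the sorted data from which each mini-batch is drawn uniformly. This is a minimal illustrative sketch, not the paper's implementation; the function and parameter names (`exponential_pacing`, `start_frac`, `inc`, `step_length`) and the random scores are assumptions for illustration.

```python
import numpy as np

def exponential_pacing(step, start_frac=0.1, inc=1.9, step_length=100, n_total=1000):
    """Number of (difficulty-sorted) examples exposed at a given training step.

    One example of a pacing function: the usable prefix of the sorted data
    starts at start_frac of the dataset and grows geometrically every
    step_length steps. Parameter names are hypothetical.
    """
    frac = start_frac * (inc ** (step // step_length))
    return min(n_total, int(frac * n_total))

def sample_curriculum_batch(sorted_indices, step, batch_size, rng):
    """Sample a mini-batch uniformly from the easiest-first prefix
    whose size is set by the pacing function."""
    limit = exponential_pacing(step, n_total=len(sorted_indices))
    limit = max(limit, batch_size)  # always expose at least one batch
    return rng.choice(sorted_indices[:limit], size=batch_size, replace=False)

# Difficulty scores would in practice come from a teacher network's
# per-example confidence (transfer scoring); here they are random stand-ins.
rng = np.random.default_rng(0)
scores = rng.random(1000)             # higher = easier (assumed convention)
sorted_indices = np.argsort(-scores)  # easiest examples first
batch = sample_curriculum_batch(sorted_indices, step=0, batch_size=32, rng=rng)
```

Early batches thus draw only from the easiest examples, and as training progresses the sampling pool expands until it covers the full training set, recovering standard uniform sampling in the limit.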
|Title of host publication||36th International Conference on Machine Learning, ICML 2019|
|Publisher||International Machine Learning Society (IMLS)|
|Number of pages||10|
|State||Published - 2019|
|Event||36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States|
|Duration||9 Jun 2019 → 15 Jun 2019|
|Bibliographical note||Funding Information|
This work was supported in part by a grant from the Israel Science Foundation (ISF), the MAFAT Center for Deep Learning, and the Gatsby Charitable Foundation.
Copyright 2019 by the author(s).