Abstract
Finite-sum optimization problems are ubiquitous in machine learning, and are commonly solved using first-order methods, which rely on gradient computations. Recently, there has been growing interest in second-order methods, which rely on both gradients and Hessians. In principle, second-order methods can require far fewer iterations than first-order methods, and hold the promise of more efficient algorithms. Although computing and manipulating Hessians is prohibitive for high-dimensional problems in general, the Hessians of individual functions in finite-sum problems can often be computed efficiently, e.g., because they possess a low-rank structure. Can second-order information indeed be used to solve such problems more efficiently? In this paper, we provide evidence that the answer, perhaps surprisingly, is negative, at least in terms of worst-case guarantees. We also discuss what additional assumptions and algorithmic approaches might potentially circumvent this negative result.
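As a minimal sketch (not part of the published record), the finite-sum setting the abstract refers to can be written as below; the generalized linear model form of the components is only an illustrative assumption, included to show one common way individual Hessians turn out to be low rank and cheap to handle.

```latex
% Finite-sum objective: minimize the average of n component functions f_i.
\min_{x \in \mathbb{R}^d} \; F(x) \;=\; \frac{1}{n} \sum_{i=1}^{n} f_i(x)

% Illustrative assumption (not taken from the abstract): if each component is
% a generalized linear model loss on a data vector a_i \in \mathbb{R}^d,
%   f_i(x) = \ell_i(a_i^\top x),
% then its Hessian is rank one, hence cheap to compute, store, and apply:
\nabla^2 f_i(x) \;=\; \ell_i''(a_i^\top x)\, a_i a_i^\top
```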
Original language | English |
---|---|
Title of host publication | 34th International Conference on Machine Learning, ICML 2017 |
Publisher | International Machine Learning Society (IMLS) |
Pages | 274-297 |
Number of pages | 24 |
ISBN (Electronic) | 9781510855144 |
State | Published - 2017 |
Externally published | Yes |
Event | 34th International Conference on Machine Learning, ICML 2017, Sydney, Australia. Duration: 6 Aug 2017 → 11 Aug 2017 |
Publication series
Name | 34th International Conference on Machine Learning, ICML 2017 |
---|---|
Volume | 1 |
Conference
Conference | 34th International Conference on Machine Learning, ICML 2017 |
---|---|
Country/Territory | Australia |
City | Sydney |
Period | 6/08/17 → 11/08/17 |
Bibliographical note
Publisher Copyright: © 2017 International Machine Learning Society (IMLS). All rights reserved.