Abstract
We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let $p = \sum_i q_i^2$ be an $n$-variate SOS polynomial of degree $2d$. Denoting by $L := \binom{n+d}{d}$ and $U := \binom{n+2d}{2d}$ the dimensions of the vector spaces in which the $q_i$'s and $p$ live, respectively, our algorithm runs in time $\tilde{O}(LU^{1.87})$. This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers, which achieve runtime $\tilde{O}(L^{0.5}\min\{U^{2.37}, L^{4.24}\})$. The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis, which efficiently extends to multivariate SOS optimization and requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse maintenance, where low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
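For background, the dimensions $L$ and $U$ come from the standard Gram-matrix view of SOS polynomials, and the Hadamard-product structure mentioned in the abstract arises in the interpolant-basis barrier used in prior SOS interior-point work. The following display is a sketch of that standard setup, not a verbatim statement of this paper's construction; the notation $v_d$, $P$, $\Lambda^{*}$, and $F$ is ours.

```latex
% Standard background sketch (notation v_d, P, Lambda^*, F is assumed here,
% not taken from the paper). p is SOS of degree 2d iff it admits a PSD
% Gram matrix Q over a basis v_d(x) of the degree-<=d polynomials:
\[
  p = \sum_i q_i^2
  \;\Longleftrightarrow\;
  \exists\, Q \succeq 0 :\; p(x) = v_d(x)^{\top} Q\, v_d(x),
  \qquad
  \dim v_d = \binom{n+d}{d} = L,
  \quad
  \dim\{\text{degree-}{\le}2d \text{ polys}\} = \binom{n+2d}{2d} = U .
\]
% In the interpolant basis, with P in R^{U x L} holding the evaluations of
% the degree-d basis at U interpolation points, the dual SOS barrier is
% F(x) = -log det(Lambda^*(x)), whose Hessian is a Hadamard square:
\[
  \Lambda^{*}(x) = P^{\top}\mathrm{diag}(x)\,P,
  \qquad
  \nabla^{2} F(x)
  = \bigl(P\,\Lambda^{*}(x)^{-1}P^{\top}\bigr)
    \circ
    \bigl(P\,\Lambda^{*}(x)^{-1}P^{\top}\bigr).
\]
% A low-rank change to Lambda^*(x)^{-1} therefore perturbs a Hadamard
% product, which is the structure the paper's data structure maintains.
```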
| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 2843–2884 |
| Number of pages | 42 |
| Journal | Algorithmica |
| Volume | 85 |
| Issue number | 9 |
| DOIs | |
| State | Published - Sep 2023 |
Bibliographical note
Funding Information: Supported by NSF CAREER award CCF-1844887. This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (Grant Agreement No. 757481-ScaleOpt). Supported by NSF CAREER Award CCF-1844887 and ISF Grant #3011005535.
Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
Keywords
- Convex optimization
- Dynamic matrix inverse
- Interior point methods
- Sum-of-squares optimization