Abstract
We settle the complexity of dynamic least-squares regression (LSR), where rows and labels (A^(t), b^(t)) can be adaptively inserted and/or deleted, and the goal is to efficiently maintain an ε-approximate solution to min_{x^(t)} ||A^(t) x^(t) − b^(t)||_2 for all t ∈ [T]. We prove sharp separations (d^{2−o(1)} vs. ∼d) between the amortized update time of: (i) fully vs. partially dynamic 0.01-LSR; (ii) high- vs. low-accuracy LSR in the partially dynamic (insertion-only) setting. Our lower bounds follow from a gap-amplification reduction, reminiscent of iterative refinement, from the exact version of the Online Matrix-Vector Multiplication Conjecture (OMv) [HKNS15] to constant-factor approximate OMv over the reals, where the i-th online product Hv^(i) only needs to be computed to 0.1-relative error. All previous fine-grained reductions from OMv to its approximate versions only show hardness for inverse-polynomial approximation ε = n^{−Ω(1)} (additive or multiplicative). This result is of independent interest in fine-grained complexity and for the investigation of the OMv Conjecture, which is still widely open.
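To make the problem setup concrete, the following is a minimal NumPy sketch of the partially dynamic (insertion-only) setting, maintained exactly via the normal equations at Θ(d^2)-or-more cost per update. The class `InsertionOnlyLSR` and its interface are illustrative assumptions, not from the paper; in particular, this is the textbook exact baseline, not the ∼d-time low-accuracy algorithm or the d^{2−o(1)} lower-bound construction discussed in the abstract.

```python
import numpy as np

class InsertionOnlyLSR:
    """Hypothetical baseline: maintain the exact least-squares solution
    under row insertions by updating the normal equations A^T A and A^T b.

    Each insertion costs O(d^2) to update the sufficient statistics, plus a
    solve to recover x. This only illustrates the dynamic LSR problem; it is
    not the paper's algorithm or lower-bound instance.
    """

    def __init__(self, d, ridge=1e-8):
        # A tiny ridge term keeps the system invertible before d rows arrive.
        self.AtA = ridge * np.eye(d)
        self.Atb = np.zeros(d)

    def insert(self, a_row, b_val):
        # Rank-one update of A^T A and A^T b for the new (row, label) pair.
        self.AtA += np.outer(a_row, a_row)
        self.Atb += b_val * a_row

    def solve(self):
        # Exact minimizer of ||A x - b||_2 over the rows inserted so far.
        return np.linalg.solve(self.AtA, self.Atb)

# Usage: stream in rows of a random instance and check the maintained solution.
rng = np.random.default_rng(0)
d, T = 5, 100
lsr = InsertionOnlyLSR(d)
rows, labels = [], []
for t in range(T):
    a, b = rng.standard_normal(d), rng.standard_normal()
    lsr.insert(a, b)
    rows.append(a)
    labels.append(b)

x_dyn = lsr.solve()
x_ref, *_ = np.linalg.lstsq(np.array(rows), np.array(labels), rcond=None)
assert np.allclose(x_dyn, x_ref, atol=1e-6)
```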
Original language | English |
---|---|
Title of host publication | Proceedings - 2023 IEEE 64th Annual Symposium on Foundations of Computer Science, FOCS 2023 |
Publisher | IEEE Computer Society |
Pages | 1605-1627 |
Number of pages | 23 |
ISBN (Electronic) | 9798350318944 |
DOIs | |
State | Published - 2023 |
Event | 64th IEEE Annual Symposium on Foundations of Computer Science, FOCS 2023 - Santa Cruz, United States |
Duration | 6 Nov 2023 → 9 Nov 2023 |
Publication series
Name | Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS |
---|---|
ISSN (Print) | 0272-5428 |
Conference
Conference | 64th IEEE Annual Symposium on Foundations of Computer Science, FOCS 2023 |
---|---|
Country/Territory | United States |
City | Santa Cruz |
Period | 6/11/23 → 9/11/23 |
Bibliographical note
Publisher Copyright: © 2023 IEEE.
Keywords
- Numerical linear algebra
- dynamic algorithms
- fine-grained complexity