2023 IEEE 64th Annual Symposium on Foundations of Computer Science (FOCS)

Abstract

We settle the complexity of dynamic least-squares regression (LSR), where rows and labels $(\mathbf{A}^{(t)}, \mathbf{b}^{(t)})$ can be adaptively inserted and/or deleted, and the goal is to efficiently maintain an $\epsilon$-approximate solution to $\min_{\mathbf{x}^{(t)}} \|\mathbf{A}^{(t)} \mathbf{x}^{(t)} - \mathbf{b}^{(t)}\|_2$ for all $t \in [T]$. We prove sharp separations ($d^{2-o(1)}$ vs. $\sim d$) between the amortized update time of: (i) fully vs. partially dynamic $0.01$-LSR; (ii) high- vs. low-accuracy LSR in the partially dynamic (insertion-only) setting. Our lower bounds follow from a gap-amplification reduction, reminiscent of iterative refinement, from the exact version of the Online Matrix Vector Conjecture (OMv) [HKNS15] to constant-approximate OMv over the reals, where the $i$-th online product $\mathbf{H}\mathbf{v}^{(i)}$ only needs to be computed to $0.1$-relative error. All previous fine-grained reductions from OMv to its approximate versions only show hardness for inverse-polynomial approximation $\epsilon = n^{-\omega(1)}$ (additive or multiplicative). This result is of independent interest in fine-grained complexity and for the investigation of the OMv Conjecture, which remains widely open.
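
For concreteness, the following is a minimal sketch of the insertion-only setting: the classical recursive-least-squares baseline that maintains $(\mathbf{A}^\top \mathbf{A})^{-1}$ and $\mathbf{A}^\top \mathbf{b}$ via Sherman-Morrison updates, giving an exact (high-accuracy) solution in $O(d^2)$ time per insertion. This only illustrates the quantity being maintained; the class name `InsertOnlyLSR` and the small ridge term are our own illustrative choices, and this is neither the $\sim d$-time low-accuracy algorithm nor the lower-bound construction from the paper.

```python
import numpy as np

class InsertOnlyLSR:
    """Exact (high-accuracy) baseline for insertion-only dynamic LSR.

    Maintains M = (A^T A + ridge*I)^{-1} and c = A^T b under row insertions
    via the Sherman-Morrison formula, so each insertion and solve costs
    O(d^2) time. Illustrative only; not the paper's algorithm.
    """

    def __init__(self, d, ridge=1e-6):
        self.M = np.eye(d) / ridge  # inverse of ridge*I; keeps M well defined before d rows arrive
        self.c = np.zeros(d)        # running A^T b

    def insert(self, a, beta):
        """Insert one row a (shape (d,)) with label beta."""
        Ma = self.M @ a
        # Sherman-Morrison: (X + a a^T)^{-1} = X^{-1} - X^{-1} a a^T X^{-1} / (1 + a^T X^{-1} a)
        self.M -= np.outer(Ma, Ma) / (1.0 + a @ Ma)
        self.c += beta * a

    def solve(self):
        """Current regularized solution (A^T A + ridge*I)^{-1} A^T b."""
        return self.M @ self.c

# Sanity check against a static solver on random insertions.
rng = np.random.default_rng(0)
d, T = 5, 200
lsr, rows, labels = InsertOnlyLSR(d), [], []
for _ in range(T):
    a, beta = rng.normal(size=d), rng.normal()
    lsr.insert(a, beta)
    rows.append(a)
    labels.append(beta)
x_dyn = lsr.solve()
x_ref, *_ = np.linalg.lstsq(np.array(rows), np.array(labels), rcond=None)
print("max coordinate difference:", np.max(np.abs(x_dyn - x_ref)))
```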
