Approximating the span of principal components via iterative least-squares

Yariv Aizenbud*, Barak Sober

*Corresponding author for this work

Research output: Contribution to journal › Letter › peer-review

Abstract

Over the last century, Principal Component Analysis (PCA) has become one of the pillars of modern scientific methods. Although PCA is typically viewed as a statistical tool aimed at finding orthogonal directions on which the variance is maximized, its first introduction by Pearson in 1901 was in the framework of the non-linear least-squares minimization problem of fitting a plane to scattered data points. Since linear least-squares regression also fits a plane to scattered data points, PCA and linear least-squares regression thus have a natural kinship, which we explore in this paper. In particular, we present an iterated linear least-squares approach, yielding a sequence of subspaces that converges to the space spanned by the leading principal components. The key observation, by which we establish our result, is that each iteration of the Power (or Subspace) Iterations, applied to the covariance matrix, can be interpreted as a solution to a linear least-squares problem.
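The key observation can be sketched numerically. The snippet below is a minimal illustration, not the paper's exact algorithm: it runs subspace iteration on a sample covariance matrix, where each update can be read as the least-squares fit B = argmin_B ||C − V Bᵀ||_F (for an orthonormal V, the normal equations give B = C V), followed by re-orthonormalization. The data, dimensions, and iteration count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 500, 10, 3

# Synthetic centered data with a decaying spectrum, so the
# top-k principal subspace is well separated.
X = rng.standard_normal((n, d)) * np.linspace(3.0, 0.1, d)
X -= X.mean(axis=0)
C = X.T @ X / n  # sample covariance matrix

# Subspace iteration: each step is the solution of a linear
# least-squares problem, B = argmin_B ||C - V B^T||_F, which for
# orthonormal V reduces to B = C V (one possible phrasing of the
# paper's observation, not necessarily its exact formulation).
V, _ = np.linalg.qr(rng.standard_normal((d, k)))
for _ in range(200):
    B = C @ V               # least-squares solution for the current basis
    V, _ = np.linalg.qr(B)  # re-orthonormalize to keep a subspace basis

# Compare the iterated subspace with the span of the top-k
# principal components, via the distance between projectors.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
P_pca = Vt[:k].T @ Vt[:k]
P_iter = V @ V.T
err = np.linalg.norm(P_pca - P_iter)
```

Since each iteration amplifies the leading eigendirections of C at a geometric rate governed by the eigenvalue gap, `err` shrinks rapidly and the two projectors become numerically indistinguishable.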

Original language: American English
Pages (from-to): 84-92
Number of pages: 9
Journal: Applied and Computational Harmonic Analysis
Volume: 63
DOIs
State: Published - Mar 2023

Bibliographical note

Publisher Copyright:
© 2022 Elsevier Inc.

Keywords

  • Least-squares
  • Principal component analysis
  • Singular value decomposition
  • Subspace iterations
