Sum-of-Squares lower bounds for sparse PCA

Tengyu Ma, Avi Wigderson

Research output: Contribution to journal › Conference article › peer-review

37 Scopus citations

Abstract

This paper establishes a statistical versus computational trade-off for solving a basic high-dimensional machine learning problem via a basic convex relaxation method. Specifically, we consider the Sparse Principal Component Analysis (Sparse PCA) problem, and the family of Sum-of-Squares (SoS, aka Lasserre/Parrilo) convex relaxations. It was well known that in large dimension p, a planted k-sparse unit vector can in principle be detected using only n ≈ k log p (Gaussian or Bernoulli) samples, but all known efficient (polynomial time) algorithms require n ≈ k2 samples. It was also known that this quadratic gap cannot be improved by the most basic semi-definite (SDP, aka spectral) relaxation, which is equivalent to a degree-2 SoS algorithm. Here we prove that degree-4 SoS algorithms also cannot improve this quadratic gap. This average-case lower bound adds to the small collection of hardness results in machine learning for this powerful family of convex relaxation algorithms. Moreover, our design of moments (or "pseudo-expectations") for this lower bound is quite different from that of previous lower bounds. Establishing lower bounds for higher-degree SoS algorithms remains a challenging problem.
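The sample-size gap described above can be illustrated with a small simulation. The sketch below, which is not from the paper, draws data from a standard spiked (Wishart) covariance model N(0, I + θvvᵀ) with a planted k-sparse unit vector v, and applies diagonal thresholding, a simple efficient test that succeeds at the quadratic sample size n ≈ k². The parameter values and the choice of statistic are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def samples(n, p, k, theta, planted):
    """Draw n samples from N(0, I) (null) or from the spiked model
    N(0, I + theta * v v^T) with a planted k-sparse unit vector v."""
    X = rng.standard_normal((n, p))
    if planted:
        v = np.zeros(p)
        v[:k] = 1.0 / np.sqrt(k)  # k-sparse unit vector (support is illustrative)
        X += np.sqrt(theta) * rng.standard_normal((n, 1)) * v
    return X

def diag_stat(X):
    """Diagonal-thresholding statistic: the largest diagonal entry of the
    sample covariance. Support coordinates have variance 1 + theta/k,
    so the statistic separates the two models once n >> k^2."""
    return np.max(np.einsum('ij,ij->j', X, X) / X.shape[0])

# Illustrative parameters (assumptions, not taken from the paper).
p, k, theta = 2000, 20, 5.0
n = 10 * k * k  # quadratic-in-k sample size, where efficient tests work

null_stat = diag_stat(samples(n, p, k, theta, planted=False))
spiked_stat = diag_stat(samples(n, p, k, theta, planted=True))
print(f"null: {null_stat:.3f}  spiked: {spiked_stat:.3f}")
```

At this sample size the spiked statistic concentrates near 1 + θ/k while the null maximum stays close to 1, so the test distinguishes the two models; shrinking n toward k log p makes the null fluctuations swamp the signal for this and, conjecturally, for every efficient test.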

Original language: English
Pages (from-to): 1612-1620
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
Volume: 2015-January
State: Published - 2015
Externally published: Yes
Event: 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 - Montreal, Canada
Duration: 7 Dec 2015 - 12 Dec 2015
