The power of sum-of-squares for detecting hidden structure
with Sam Hopkins, Pravesh Kothari, Aaron Potechin, Prasad Raghavendra, Tselil Schramm. FOCS 2017.
abstract
We study planted problems—finding hidden structures in random noisy inputs—through the lens of the sum-of-squares semidefinite programming hierarchy (SOS). This family of powerful semidefinite programs has recently yielded many new algorithms for planted problems, often achieving the best known polynomial-time guarantees in terms of accuracy of recovered solutions and robustness to noise. One theme in recent work is the design of spectral algorithms which match the guarantees of SOS algorithms for planted problems. Classical spectral algorithms are often unable to accomplish this: the twist in these new spectral algorithms is the use of spectral structure of matrices whose entries are low-degree polynomials of the input variables.
We prove that for a wide class of planted problems, including refuting random constraint satisfaction problems, tensor and sparse PCA, densest-$k$-subgraph, community detection in stochastic block models, planted clique, and others, eigenvalues of degree-$d$ matrix polynomials are as powerful as SOS semidefinite programs of size roughly $n^{O(d)}$. For such problems it is therefore always possible to match the guarantees of SOS without solving a large semidefinite program.
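To make the "low-degree matrix polynomial" idea concrete, here is a minimal sketch for the two-community stochastic block model (all parameters are assumed for illustration; this is a toy instance of the idea, not a construction from the paper). The entries of $B^2$, where $B$ is the centered adjacency matrix, are degree-2 polynomials of the input entries, and the top eigenvector of this matrix polynomial still exposes the planted partition.

```python
import numpy as np

# Toy sketch (assumed parameters): a degree-2 matrix polynomial of the
# adjacency matrix whose top eigenvector reveals the hidden communities.

rng = np.random.default_rng(1)
n, p_in, p_out = 200, 0.6, 0.4
labels = np.repeat([1.0, -1.0], n // 2)      # hidden community signs

# Sample a symmetric adjacency matrix from the two-community block model.
P = np.where(np.outer(labels, labels) > 0, p_in, p_out)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # symmetric, no self-loops

B = A - A.mean()                             # crude centering
M = B @ B                                    # entries are degree-2 polynomials of A
np.fill_diagonal(M, 0.0)                     # remove the large, nearly constant diagonal

w, V = np.linalg.eigh(M)                     # eigenvalues in ascending order
v_top = V[:, -1]                             # top eigenvector of the matrix polynomial
agreement = abs(np.sign(v_top) @ labels) / n # fraction classified correctly, up to sign
print(agreement)
```

Here rounding the top eigenvector's signs recovers almost all community labels; the paper's point is that for many planted problems, suitably chosen low-degree matrix polynomials achieve everything that large SOS relaxations do.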
Using related ideas on SOS algorithms and low-degree matrix polynomials (and inspired by recent work on SOS and the planted clique problem [Barak, Hopkins, Kelner, Kothari, Moitra, Potechin; FOCS'16]), we prove a new SOS lower bound for the tensor PCA problem: given a random tensor $T$ of the form $T = \tau \cdot v^{\otimes 3} + A$, where $A$ is a $3$-tensor with iid standard Gaussian entries and $v \in \mathbb{R}^n$ is a random unit vector, recover $v$. We prove that for every $\varepsilon > 0$, when $\tau \leq n^{3/4 - \varepsilon}$, SOS algorithms require time $n^{n^{\Omega(\varepsilon)}}$ to distinguish $T$ from a tensor without a rank-$1$ spike. This matches the best known algorithms, which run in time roughly $n^{n^{c\varepsilon}}$ for some constant $c > 0$.
keywords
- sum-of-squares method
- eigenvalues
- semidefinite programming
- lower bounds
- average-case complexity