Cornell University; Member, School of Mathematics
October 25, 2016
The sum-of-squares (SOS) method is a conceptually simple algorithmic technique for polynomial optimization that---quite surprisingly---captures and generalizes the best known efficient algorithms for a wide range of NP-hard optimization problems. For many of these problems, especially those related to Khot's Unique Games Conjecture, no strong limitations on SOS are known, unlike for many other algorithmic techniques. There is therefore hope that SOS yields efficient algorithms with significantly stronger guarantees than other techniques.

We consider the problem of finding a rank-1 n-by-n matrix close, in Frobenius norm, to a given linear subspace. We will show that SOS solves this problem in time $\exp(\sqrt n)$, significantly faster than the previous best algorithm, a brute-force search running in time $\exp(n)$. As a consequence, we also obtain improved algorithms for one of the most basic problems in quantum information, called separability, which asks whether a given state of a composite quantum system is separable or entangled.

Our proof is based on a general framework for analyzing SOS in terms of restricted proof systems and on a generalization of Lovett's bound for the log-rank conjecture in communication complexity.

Based on joint work with Boaz Barak and Pravesh Kothari.
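To make the matrix problem concrete, here is a minimal numerical sketch of the objective being optimized (not the SOS algorithm itself, and not from the talk): given an orthonormal basis of a linear subspace of n-by-n matrices, it computes the Frobenius distance from a unit-norm rank-1 candidate $uv^\top$ to that subspace. The names and setup are illustrative assumptions.

```python
import numpy as np

def dist_to_subspace(M, basis):
    """Frobenius distance from matrix M to the span of `basis`.

    `basis` is a list of n x n matrices whose flattenings are orthonormal,
    so projection is just B^T (B v) on the flattened vectors.
    """
    v = M.flatten()
    B = np.stack([b.flatten() for b in basis])  # shape (k, n^2), orthonormal rows
    proj = B.T @ (B @ v)                        # orthogonal projection onto the subspace
    return np.linalg.norm(v - proj)

rng = np.random.default_rng(0)
n, k = 4, 3

# Random k-dimensional subspace of n x n matrices, with an orthonormal basis
# obtained via QR on flattened random matrices (illustrative construction).
Q, _ = np.linalg.qr(rng.standard_normal((n * n, k)))
basis = [Q[:, i].reshape(n, n) for i in range(k)]

# Unit-Frobenius-norm rank-1 candidate u v^T.
u, v = rng.standard_normal(n), rng.standard_normal(n)
M = np.outer(u, v)
M /= np.linalg.norm(M)

print(dist_to_subspace(M, basis))  # a value in [0, 1] since ||M||_F = 1
```

The hard part, which the talk addresses, is searching over all rank-1 candidates: brute force over a net of unit vectors takes $\exp(n)$ time, whereas the SOS relaxation achieves $\exp(\sqrt n)$.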