Identity Testing of Tensors, Low Rank Recovery and Compressed Sensing

A matrix A naturally defines a bilinear form x^t A y. If A has rank <= r, then a rank <= r decomposition A = u_1 v_1^t + ... + u_r v_r^t corresponds naturally to a circuit of size ~2nr for computing this form, since x^t A y = sum_i (u_i^t x)(v_i^t y). It is straightforward to perform "white box" polynomial identity testing for such circuits, and the motivating question for this work is black-box identity testing. The probabilistic method shows that there is a hitting set of size ~2nr for such polynomials. In this work we match this bound with an explicit construction (over large fields), which is optimal up to constant factors.
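For concreteness, here is a small NumPy sketch (illustrative only: it shows the decomposition-based circuit and the standard randomized black-box test, not the explicit hitting set constructed in this work; all names and parameters are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 3

# Rank-<=r matrix via its decomposition A = sum_i u_i v_i^t.
U = rng.integers(-5, 6, size=(n, r)).astype(float)
V = rng.integers(-5, 6, size=(n, r)).astype(float)
A = U @ V.T

x = rng.integers(-5, 6, size=n).astype(float)
y = rng.integers(-5, 6, size=n).astype(float)

# Direct evaluation of the bilinear form x^t A y (~n^2 operations).
direct = x @ A @ y

# Evaluation through the decomposition: sum_i (u_i . x)(v_i . y),
# i.e. the circuit of size ~2nr mentioned above.
via_circuit = (x @ U) @ (V.T @ y)
assert np.isclose(direct, via_circuit)

# Randomized black-box identity test (the baseline that an explicit
# hitting set derandomizes): a nonzero form is nonzero at a random
# point from a large enough grid with high probability.
def is_probably_zero(blackbox, n, trials=10, grid=10**6):
    pts = rng.integers(-grid, grid, size=(trials, 2, n)).astype(float)
    return all(np.isclose(blackbox(p[0], p[1]), 0.0) for p in pts)

print(is_probably_zero(lambda a, b: a @ A @ b, n))        # expected False (A != 0)
print(is_probably_zero(lambda a, b: a @ (A - A) @ b, n))  # expected True (zero form)
```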

Further, we explore how A can be reconstructed from evaluations of the bilinear form. Similar probabilistic constructions show that there exist ~4nr evaluations that determine any such matrix A. Again, we match this bound (over large fields) with an explicit construction, and furthermore give a polynomial-time algorithm that reconstructs A from such evaluations. More generally, we give an efficient reduction from (exact) low-rank matrix reconstruction to (exact) sparse recovery. This reduction is novel in the compressed-sensing realm, as it is field-independent and does not rely on convex optimization.
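As an illustration of the kind of exact, non-convex-optimization sparse-recovery primitive such a reduction can target, the sketch below implements the classical Prony-style (annihilating-polynomial) decoder, which recovers a k-sparse vector exactly from 2k structured measurements. This is not the construction from this work; all names and parameters are assumptions.

```python
import numpy as np

def sparse_recover(measurements, nodes, k):
    """Recover a k-sparse vector c from s_j = sum_i c_i * nodes[i]**j,
    j = 0..2k-1 (Prony / annihilating-polynomial method)."""
    s = np.asarray(measurements, dtype=float)
    # Step 1: annihilating polynomial p(z) = z^k + p_{k-1} z^{k-1} + ... + p_0
    # whose roots are the nodes on the support; its coefficients satisfy the
    # k x k Hankel system  sum_t p_t s_{j+t} = -s_{j+k},  j = 0..k-1.
    H = np.array([[s[j + t] for t in range(k)] for j in range(k)])
    p = np.linalg.solve(H, -s[k:2 * k])                  # p_0, ..., p_{k-1}
    roots = np.roots(np.concatenate(([1.0], p[::-1])))
    # Step 2: map each root to the nearest known node -> support indices.
    support = sorted({int(np.argmin(np.abs(nodes - z.real))) for z in roots})
    # Step 3: solve for the values on the support (small Vandermonde system).
    Vand = np.vander(nodes[support], N=2 * k, increasing=True).T  # (2k, k)
    vals, *_ = np.linalg.lstsq(Vand, s, rcond=None)
    c = np.zeros(len(nodes))
    c[support] = vals
    return c

# Toy check: n = 12, sparsity k = 3, nodes 1..n, 2k = 6 measurements.
n, k = 12, 3
nodes = np.arange(1, n + 1, dtype=float)
c_true = np.zeros(n)
c_true[[2, 5, 9]] = [1.5, -2.0, 0.75]
s = [c_true @ nodes**j for j in range(2 * k)]
print(np.allclose(sparse_recover(s, nodes, k), c_true))   # expected True
```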

Finally, using matrices as the base case, we also derive a quasi-polynomial-size hitting set for higher-order tensors.
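For intuition on the higher-order case (again only a hedged sketch, not this work's construction), an order-d tensor of rank <= r is a sum of r rank-one tensors, and the associated multilinear form can be evaluated with roughly d*n*r operations from that decomposition; the snippet below checks this for d = 3.

```python
import numpy as np

rng = np.random.default_rng(1)
n, r, d = 5, 2, 3

# Rank-<=r order-3 tensor: T = sum_i u_i (outer) v_i (outer) w_i.
U, V, W = (rng.standard_normal((n, r)) for _ in range(d))
T = sum(np.einsum('a,b,c->abc', U[:, i], V[:, i], W[:, i]) for i in range(r))

x, y, z = (rng.standard_normal(n) for _ in range(d))

# Direct evaluation of the multilinear form T(x, y, z) (~n^3 operations).
direct = np.einsum('abc,a,b,c->', T, x, y, z)

# Evaluation through the decomposition: sum_i (u_i.x)(v_i.y)(w_i.z),
# a circuit of size ~d*n*r, generalizing the matrix (d = 2) case.
via_circuit = np.sum((x @ U) * (y @ V) * (z @ W))

print(np.isclose(direct, via_circuit))   # expected True
```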

Joint work with Michael Forbes.

Date:

Speakers: Amir Shpilka

Affiliation: Technion