The local convexity of solving systems of quadratic equations

06/25/2015
by Chris D. White, et al.

This paper considers the recovery of a rank-r positive semidefinite matrix XX^T ∈ R^{n×n} from m scalar measurements of the form y_i := a_i^T XX^T a_i (i.e., quadratic measurements of X). Such problems arise in a variety of applications, including covariance sketching of high-dimensional data streams, quadratic regression, and quantum state tomography. A natural approach is to minimize the loss function f(U) = ∑_i (y_i - a_i^T UU^T a_i)^2, which has an entire manifold of solutions given by {XO}_{O ∈ O_r}, where O_r is the group of r×r orthogonal matrices. This loss is non-convex in the n×r matrix U, but methods like gradient descent are simple and easy to implement (compared to semidefinite relaxation approaches). In this paper we show that given m ≥ C n r log^2(n) measurements with isotropic Gaussian a_i, with high probability (a) this function admits a dimension-independent region of local strong convexity on lines perpendicular to the solution manifold, and (b) with an additional polynomial factor of r samples, a simple spectral initialization lands within this region of convexity with high probability. Together, these results imply that gradient descent with spectral initialization (but no re-sampling) converges linearly to the correct X, up to an orthogonal transformation. We believe that this general technique (local convexity reachable by spectral initialization) should prove applicable to a broader class of nonconvex optimization problems.
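The pipeline the abstract describes — weight the sensing vectors by the measurements to form an empirical matrix, initialize from its top-r eigenvectors, then run plain gradient descent on f(U) — is easy to sketch. Below is a minimal NumPy sketch, not the authors' implementation: the step size, iteration count, and initialization scaling (which uses the facts that E[y_i] = tr(XX^T) and E[(1/m)∑_i y_i a_i a_i^T] = 2XX^T + tr(XX^T)·I for standard Gaussian a_i) are illustrative choices, not constants from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 50, 2, 4000                       # m on the order of C*n*r*log^2(n)

X = rng.standard_normal((n, r))             # ground-truth factor (X X^T is rank r, PSD)
A = rng.standard_normal((m, n))             # rows are isotropic Gaussian a_i
y = np.sum((A @ X) ** 2, axis=1)            # y_i = a_i^T X X^T a_i = ||X^T a_i||^2

# Spectral initialization from M = (1/m) * sum_i y_i a_i a_i^T.
# For standard Gaussian a_i, E[M] = 2 X X^T + tr(X X^T) I, so the top-r
# eigenvectors of M approximate the column space of X.
M = (A.T * y) @ A / m
vals, vecs = np.linalg.eigh(M)              # eigenvalues in ascending order
trace_est = y.mean()                        # E[y_i] = tr(X X^T)
scales = np.sqrt(np.maximum((vals[-r:] - trace_est) / 2, 1e-12))
U = vecs[:, -r:] * scales                   # U0 with estimated singular-value scaling

# Gradient descent on f(U) = sum_i (y_i - a_i^T U U^T a_i)^2, whose gradient
# is grad f(U) = -4 * sum_i (y_i - ||U^T a_i||^2) a_i a_i^T U.
eta = 0.02 / (m * trace_est)                # heuristic step size (illustrative)
for _ in range(500):
    resid = y - np.sum((A @ U) ** 2, axis=1)
    U += 4 * eta * (A.T * resid) @ (A @ U)  # U <- U - eta * grad f(U)

# X is recovered only up to right-multiplication by an orthogonal matrix,
# so compare the PSD matrices U U^T and X X^T rather than U and X.
err = np.linalg.norm(U @ U.T - X @ X.T) / np.linalg.norm(X @ X.T)
print(f"relative error: {err:.2e}")
```

Note that the error is measured on UU^T rather than U itself, since any U on the solution manifold {XO} yields the same measurements; this is exactly the orthogonal ambiguity the abstract refers to.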
