Global Convergence of Gradient Descent for Asymmetric Low-Rank Matrix Factorization

06/27/2021
by Tian Ye, et al.

We study the asymmetric low-rank factorization problem: min_{𝐔∈ℝ^{m×d}, 𝐕∈ℝ^{n×d}} ½‖𝐔𝐕^⊤ − Σ‖_F², where Σ is a given matrix of size m × n and rank d. This canonical problem presents two difficulties in optimization: 1) non-convexity and 2) non-smoothness (due to the imbalance between 𝐔 and 𝐕). It is also a prototype for more complex problems such as asymmetric matrix sensing and matrix completion. Despite the non-convexity and non-smoothness, it has been observed empirically that randomly initialized gradient descent solves this problem in polynomial time. Existing theories that explain this phenomenon all require artificial modifications of the algorithm, such as adding noise at each iteration or adding a regularizer that keeps 𝐔 and 𝐕 balanced. This paper presents the first proof that randomly initialized gradient descent converges to a global minimum of the asymmetric low-rank factorization problem at a polynomial rate. For the proof, we develop 1) a new symmetrization technique to capture the magnitudes of the symmetry and asymmetry, and 2) a quantitative perturbation analysis to approximate matrix derivatives. We believe both are useful for other related non-convex problems.
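
To make the setting concrete, here is a minimal sketch (not the authors' code) of plain, randomly initialized gradient descent on f(𝐔, 𝐕) = ½‖𝐔𝐕^⊤ − Σ‖_F², with no added noise and no balancing regularizer. The dimensions, step size, and initialization scale are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of randomly initialized gradient descent on
# f(U, V) = 1/2 * ||U V^T - Sigma||_F^2.
# m, n, d, the step size, and the init scale below are
# illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)
m, n, d = 50, 40, 5

# A ground-truth Sigma of size m x n and rank d.
Sigma = rng.standard_normal((m, d)) @ rng.standard_normal((d, n))

# Small random initialization; no noise, no balancing regularizer.
U = 0.01 * rng.standard_normal((m, d))
V = 0.01 * rng.standard_normal((n, d))

# Constant step size, scaled by the spectral norm of Sigma for stability.
eta = 0.5 / np.linalg.norm(Sigma, 2)

for t in range(20000):
    R = U @ V.T - Sigma   # residual U V^T - Sigma
    gU = R @ V            # gradient of f w.r.t. U
    gV = R.T @ U          # gradient of f w.r.t. V
    U, V = U - eta * gU, V - eta * gV

print("final loss:", 0.5 * np.linalg.norm(U @ V.T - Sigma) ** 2)
```

In this sketch the loss drops to numerical zero despite the non-convexity, matching the empirical behavior the paper sets out to explain; note that 𝐔 and 𝐕 are never explicitly rebalanced during the run.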
