Authors
Rong Ge, Jason D Lee, Tengyu Ma
Publication date
2016
Journal
Advances in neural information processing systems
Volume
29
Description
Matrix completion is a basic machine learning problem that has wide applications, especially in collaborative filtering and recommender systems. Simple non-convex optimization algorithms are popular and effective in practice. Despite recent progress in proving that various non-convex algorithms converge from a good initial point, it remains unclear why random or arbitrary initialization suffices in practice. We prove that the commonly used non-convex objective function for matrix completion has no spurious local minima: all local minima must also be global. Therefore, many popular optimization algorithms such as (stochastic) gradient descent can provably solve matrix completion with arbitrary initialization in polynomial time.
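To make the setting concrete, below is a minimal sketch (not the paper's exact setup or proof) of the non-convex objective the abstract refers to: given partial observations of a low-rank symmetric matrix M, minimize the squared error of a factorized model U U^T over the observed entries by plain gradient descent from a random starting point. The dimensions, observation probability, step size, and iteration count are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth rank-r positive semidefinite matrix M = Z Z^T
# (the symmetric matrix-completion setting); sizes are illustrative.
n, r = 30, 2
Z = rng.standard_normal((n, r))
M = Z @ Z.T

# Observe each entry independently with probability p, with a symmetric mask.
p = 0.5
upper = np.triu(rng.random((n, n)) < p)
mask = upper | upper.T

def loss(U):
    # Non-convex objective: f(U) = 1/2 * sum over observed (i, j) of
    # ((U U^T)_{ij} - M_{ij})^2.
    R = mask * (U @ U.T - M)
    return 0.5 * np.sum(R ** 2)

def grad(U):
    # Gradient of f with respect to U; R is symmetric, so df/dU = 2 R U.
    R = mask * (U @ U.T - M)
    return 2.0 * R @ U

# Plain gradient descent from a *random* initialization.
U = rng.standard_normal((n, r))
eta = 0.002  # small constant step size, chosen by hand for this instance
for _ in range(2000):
    U -= eta * grad(U)

rel_err = np.linalg.norm(U @ U.T - M) / np.linalg.norm(M)
print(rel_err)
```

On random instances like this one, the iterates typically recover M to small relative error despite the random start, which is the behavior the paper's no-spurious-local-minima result explains.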
Total citations
2016: 19, 2017: 66, 2018: 94, 2019: 126, 2020: 126, 2021: 102, 2022: 69, 2023: 58, 2024: 42