Authors
Hongyi Zhang, Sashank J Reddi, Suvrit Sra
Publication date
2016
Journal
Advances in Neural Information Processing Systems
Volume
29
Description
We study optimization of finite sums of \emph{geodesically} smooth functions on Riemannian manifolds. Although variance reduction techniques for optimizing finite sums have received tremendous attention in recent years, existing work is limited to vector-space problems. We introduce \emph{Riemannian SVRG} (RSVRG), a new variance-reduced Riemannian optimization method. We analyze RSVRG for both geodesically \emph{convex} and \emph{nonconvex} (smooth) functions. Our analysis reveals that RSVRG inherits the advantages of the usual SVRG method, but with factors depending on the curvature of the manifold that influence its convergence. To our knowledge, RSVRG is the first \emph{provably fast} stochastic Riemannian method. Moreover, our paper presents the first non-asymptotic complexity analysis (novel even for the batch setting) for nonconvex Riemannian optimization. Our results have several implications; for instance, they offer a Riemannian perspective on variance-reduced PCA, which promises a short, transparent convergence analysis.
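The method described above generalizes the classical SVRG estimator to manifolds. As a point of reference, here is a minimal sketch of vector-space SVRG (the Euclidean special case that RSVRG extends) on a least-squares finite sum; all names (`A`, `b`, `eta`, `epochs`, `m`) and the problem instance are illustrative, not taken from the paper, and the Riemannian version additionally requires exponential maps and parallel transport, which are omitted here.

```python
import numpy as np

# Illustrative finite-sum least-squares problem:
# f(w) = (1/n) * sum_i (a_i^T w - b_i)^2
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
b = A @ w_true

def grad_i(w, i):
    # Gradient of the i-th component f_i(w) = (a_i^T w - b_i)^2.
    return 2.0 * (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    # Full gradient of f, averaged over all n components.
    return 2.0 * A.T @ (A @ w - b) / n

def svrg(eta=0.01, epochs=30, m=None):
    """Euclidean SVRG: outer loop takes a snapshot and computes the
    full gradient; the inner loop takes variance-reduced stochastic steps."""
    m = m or n
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        mu = full_grad(w_snap)  # full gradient at the snapshot
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced gradient estimate: unbiased, and its
            # variance shrinks as w and w_snap approach the optimum.
            v = grad_i(w, i) - grad_i(w_snap, i) + mu
            w = w - eta * v
    return w

w_hat = svrg()
```

In RSVRG the subtraction `grad_i(w, i) - grad_i(w_snap, i)` is not directly meaningful, since the two gradients live in different tangent spaces; the paper's update instead parallel-transports the snapshot gradient to the current iterate's tangent space, and the curvature-dependent factors in the analysis arise from this transport.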
Total citations
2016: 2, 2017: 15, 2018: 33, 2019: 39, 2020: 42, 2021: 38, 2022: 37, 2023: 43, 2024: 23
Scholar articles
H Zhang, SJ Reddi, S Sra - Advances in Neural Information Processing Systems, 2016