Authors
Lorenzo Rosasco, Silvia Villa, Bằng Công Vũ
Publication date
2020/12
Journal
Applied Mathematics & Optimization
Volume
82
Pages
891-917
Publisher
Springer US
Description
We study the extension of the proximal gradient algorithm where only a stochastic gradient estimate is available and a relaxation step is allowed. We establish convergence rates for function values in the convex case, as well as almost sure convergence and convergence rates for the iterates under further convexity assumptions. Our analysis avoids both averaging of the iterates and error summability assumptions, which might not be satisfied in applications, e.g., in machine learning. Our proof technique extends classical ideas from the analysis of deterministic proximal gradient algorithms.
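The abstract describes a relaxed stochastic proximal gradient iteration of the general form x_{n+1} = x_n + lam_n * (prox_{gamma_n * g}(x_n - gamma_n * G_n) - x_n), where G_n is a stochastic estimate of the gradient of the smooth term. The following is a minimal Python sketch of such a scheme on an l1-regularized least-squares problem; the objective, the step sizes gamma_n, the relaxation parameter lam, and the single-row gradient sampling are illustrative assumptions, not the paper's exact setting.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def relaxed_stochastic_prox_grad(A, b, reg, n_iter=5000, lam=0.5, seed=0):
        # Minimize (1/2m)||Ax - b||^2 + reg * ||x||_1 using a stochastic
        # gradient estimate (one sampled row per iteration), a proximal
        # step, and a relaxation step with parameter lam in (0, 1].
        rng = np.random.default_rng(seed)
        m, d = A.shape
        x = np.zeros(d)
        for n in range(1, n_iter + 1):
            gamma = 1.0 / np.sqrt(n)              # diminishing step size (illustrative)
            i = rng.integers(m)                   # sample one data point
            grad_est = (A[i] @ x - b[i]) * A[i]   # unbiased gradient estimate
            y = soft_threshold(x - gamma * grad_est, gamma * reg)  # proximal step
            x = x + lam * (y - x)                 # relaxation step
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        A = rng.standard_normal((200, 50))
        x_true = np.zeros(50)
        x_true[:5] = 1.0
        b = A @ x_true + 0.01 * rng.standard_normal(200)
        x_hat = relaxed_stochastic_prox_grad(A, b, reg=0.05)
        print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.1))

Setting lam = 1 in this sketch recovers the unrelaxed stochastic proximal gradient iteration.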
Total citations
2014: 3, 2015: 11, 2016: 15, 2017: 11, 2018: 18, 2019: 15, 2020: 15, 2021: 16, 2022: 21, 2023: 28, 2024: 13
Scholar articles
Convergence of stochastic proximal gradient algorithm - L Rosasco, S Villa, BC Vũ - Applied Mathematics & Optimization, 2020