Authors
Martin Benning, Marta M Betcke, Matthias J Ehrhardt, Carola-Bibiane Schönlieb
Publication date
2021
Journal
SIAM Journal on Imaging Sciences
Volume
14
Issue
2
Pages
814-843
Publisher
Society for Industrial and Applied Mathematics
Description
We propose an extension of a special form of gradient descent---known in the literature as linearized Bregman iteration---to a larger class of nonconvex functions. We replace the classical (squared) two-norm metric in the gradient descent setting with a generalized Bregman distance, based on a proper, convex, and lower semicontinuous function. The algorithm's global convergence is proven for functions that satisfy the Kurdyka--Łojasiewicz property. Examples illustrate that features of different scales are introduced throughout the iteration, transitioning from coarse to fine. This coarse-to-fine approach with respect to scale allows us to recover solutions of nonconvex optimization problems that are superior to those obtained with conventional gradient descent, or even projected and proximal gradient descent. The effectiveness of the linearized Bregman iteration in combination with early stopping is illustrated for …
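The iteration described in the abstract replaces the implicit squared two-norm of plain gradient descent with a Bregman distance of a convex function J: the dual (subgradient) variable takes the gradient step, and the primal iterate is recovered through the convex conjugate of J. A minimal sketch of this idea, assuming the common elastic-net-type choice J(u) = γ‖u‖₁ + ½‖u‖² (for which ∇J* is soft-thresholding) and a generic differentiable E (here illustrated on a simple least-squares fit, not an example from the paper):

```python
import numpy as np

def soft_threshold(p, gamma):
    # For J(u) = gamma*||u||_1 + 0.5*||u||^2, the conjugate gradient
    # grad J*(p) is componentwise soft-thresholding.
    return np.sign(p) * np.maximum(np.abs(p) - gamma, 0.0)

def linearized_bregman(grad_E, u0, tau, gamma, iters):
    """Sketch of linearized Bregman iteration:
        p^{k+1} = p^k - tau * grad E(u^k)   (subgradient/dual update)
        u^{k+1} = grad J*(p^{k+1})          (primal update)
    Names and the specific J are illustrative assumptions, not the
    paper's general setting.
    """
    u = u0.copy()
    p = np.zeros_like(u0)  # valid since 0 is a subgradient of J at u^0 = 0
    for _ in range(iters):
        p = p - tau * grad_E(u)
        u = soft_threshold(p, gamma)
    return u

# Toy illustration: E(u) = 0.5*||A u - b||^2 with a sparse ground truth.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
x_true = np.zeros(50)
x_true[:3] = [3.0, -2.0, 1.5]
b = A @ x_true
grad_E = lambda u: A.T @ (A @ u - b)
tau = 1.0 / np.linalg.norm(A, 2) ** 2  # step size bounded by 1/L
u = linearized_bregman(grad_E, np.zeros(50), tau=tau, gamma=0.5, iters=2000)
```

The coarse-to-fine behavior the abstract refers to shows up here as large-magnitude coefficients crossing the threshold γ first, with finer-scale components entering in later iterations; stopping the iteration early therefore acts as a scale-dependent regularizer.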