Authors
Guy Narkiss, Michael Zibulevsky
Publication date
2005/10/30
Pages
26
Publisher
Technion-IIT, Department of Electrical Engineering
Description
We present the Sequential Subspace Optimization (SESOP) method for large-scale smooth unconstrained problems. At each iteration we search for a minimum of the objective function over a subspace spanned by the current gradient and by the directions of a few previous steps. We also include in this subspace the direction from the starting point to the current point, and a weighted sum of all previous gradients, following [Nemirovski-1982]. This safeguard measure provides an optimal worst-case convergence rate of order 1/N² (for convex problems), where N is the iteration count. In the case of a quadratic objective, the method is equivalent to the conjugate gradient method. We identify an important class of problems where subspace optimization can be implemented extremely fast. This happens when the objective function is a combination of expensive linear mappings with computationally cheap non-linear functions. This is a typical situation in many applications, such as tomography, signal and image denoising with Basis Pursuit, pattern recognition with Support Vector Machines, and many others. We demonstrate highly competitive numerical results using examples from the mentioned areas.
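To illustrate the iteration described in the abstract, here is a minimal sketch of a SESOP-style loop in Python/NumPy. It is not the authors' implementation: the memory size, the numerical subspace solver (SciPy's BFGS on the small inner problem), and the test quadratic are all assumptions made for the example, and the Nemirovski safeguard directions (starting-point direction and weighted gradient sum) are omitted for brevity.

```python
# Sketch of a SESOP-style iteration (illustrative only, not the authors' code).
import numpy as np
from scipy.optimize import minimize

def sesop(f, grad, x0, n_iter=50, memory=2, tol=1e-10):
    """At each iteration, minimize f over the low-dimensional affine subspace
    x + span{current gradient, directions of the last `memory` steps}."""
    x = np.asarray(x0, dtype=float)
    steps = []  # directions of previous steps
    for _ in range(n_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Columns of D span the current search subspace.
        D = np.stack([g] + steps[-memory:], axis=1)
        # Inner problem: minimize f(x + D @ alpha) over the small vector alpha.
        res = minimize(lambda a: f(x + D @ a), np.zeros(D.shape[1]),
                       method="BFGS")
        step = D @ res.x
        x = x + step
        steps.append(step)
    return x

# Example: a convex quadratic with minimizer [1, 2, 3]. For a quadratic
# objective the method behaves like conjugate gradients, as the abstract notes.
A = np.diag([1.0, 10.0, 100.0])
b = A @ np.array([1.0, 2.0, 3.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = sesop(f, grad, np.zeros(3))
```

The "extremely fast" case mentioned in the abstract arises when f(x) = phi(Ax) with A an expensive linear map: one can precompute A @ d for each subspace direction d once, so every inner-problem evaluation costs only the cheap non-linear part.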
Total citations
(per-year citation chart covering 2005–2024; individual counts not recoverable from the extraction)