Authors
Suriya Gunasekar, Jason Lee, Daniel Soudry, Nathan Srebro
Publication date
2018/7/3
Conference
International Conference on Machine Learning
Pages
1832-1841
Publisher
PMLR
Description
We study the bias of generic optimization methods, including Mirror Descent, Natural Gradient Descent and Steepest Descent with respect to different potentials and norms, when optimizing underdetermined linear models or separable linear classification problems. We ask whether the global minimum (among the many possible global minima) reached by optimization can be characterized in terms of the potential or norm, and independently of hyper-parameter choices such as step size and momentum.
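A minimal sketch of the kind of implicit bias the abstract refers to: on an underdetermined least-squares problem (more parameters than observations, so infinitely many global minima), plain gradient descent initialized at zero converges to the minimum L2-norm interpolating solution. This standard example illustrates how the optimization geometry, not the loss alone, selects among global minima; it is an illustrative assumption-laden toy, not the paper's full result, and the dimensions and step size below are arbitrary choices.

```python
import numpy as np

# Underdetermined linear regression: fewer observations than parameters,
# so the squared loss has infinitely many global minima (interpolants).
rng = np.random.default_rng(0)
n, d = 10, 50                       # 10 observations, 50 parameters
A = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Plain gradient descent on 0.5 * ||A w - y||^2, initialized at zero.
# From a zero init, iterates stay in the row space of A, so the
# interpolant reached is the minimum-L2-norm one.
w = np.zeros(d)
lr = 0.01
for _ in range(20_000):
    w -= lr * A.T @ (A @ w - y)

# The minimum-L2-norm interpolating solution, via the pseudoinverse.
w_min_norm = np.linalg.pinv(A) @ y

print(np.linalg.norm(A @ w - y))       # ~0: a global minimum is reached
print(np.linalg.norm(w - w_min_norm))  # ~0: GD selected the min-norm minimum
```

Swapping gradient descent for mirror descent with a different potential changes which global minimum is selected, which is the question the paper studies in general.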
Total citations
Annual citation counts, 2018–2024 (histogram)
Scholar articles
Characterizing implicit bias in terms of optimization geometry
S Gunasekar, J Lee, D Soudry, N Srebro - International Conference on Machine Learning, 2018