Authors
Xiao-Hu Yu, Guo-An Chen, Shi-Xin Cheng
Publication date
1995/5
Journal
IEEE Transactions on Neural Networks
Volume
6
Issue
3
Pages
669-677
Publisher
IEEE
Description
It has been observed by many authors that backpropagation (BP) error surfaces usually consist of a large number of flat regions as well as extremely steep regions. As such, the BP algorithm with a fixed learning rate will have low efficiency. This paper considers dynamic learning rate optimization of the BP algorithm using derivative information. An efficient method of deriving the first and second derivatives of the objective function with respect to the learning rate is explored; it does not involve explicit calculation of second-order derivatives in weight space, but rather uses information gathered from the forward and backward propagation. Several learning rate optimization approaches are subsequently established, based on linear expansion of the actual outputs with line searches satisfying an acceptable-descent condition and on Newton-like methods, respectively. Simultaneous determination of the optimal learning rate …
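The abstract describes a Newton-like choice of learning rate driven by the first and second derivatives of the objective with respect to the rate. As a rough illustration only, the Python sketch below computes those two derivatives along the steepest-descent direction and takes one Newton step on the rate. The finite-difference curvature estimate is an assumption standing in for the paper's exact forward/backward-propagation computation (it likewise avoids forming second derivatives in weight space), and grad_fn, eps, and eta_max are hypothetical names.

    import numpy as np

    def newton_learning_rate(w, grad_fn, eps=1e-6, eta_max=1.0):
        """Return a Newton-like learning rate for one BP step.

        w       -- current weight vector (1-D numpy array)
        grad_fn -- function returning dE/dw at a given weight vector
                   (hypothetical interface; any BP gradient routine fits)
        """
        g = grad_fn(w)
        d = -g                               # steepest-descent direction
        dphi0 = g @ d                        # phi'(0) = -||g||^2  (< 0)
        # Finite-difference estimate of the curvature phi''(0) = d^T H d,
        # standing in for the paper's forward/backward derivation:
        g_eps = grad_fn(w + eps * d)
        curvature = (g_eps - g) @ d / eps
        if curvature <= 0.0:                 # locally non-convex: fall back
            return eta_max
        eta = -dphi0 / curvature             # Newton step on eta from 0
        return min(eta, eta_max)             # guard against huge steps

    # Toy quadratic E(w) = 0.5 * w^T A w, where the optimal rate has the
    # closed form eta* = (g^T g) / (g^T A g); the sketch recovers it.
    A = np.diag([1.0, 10.0])
    grad = lambda w: A @ w
    w = np.array([1.0, 1.0])
    eta = newton_learning_rate(w, grad)
    w_next = w - eta * grad(w)               # one optimally scaled BP step

On the quadratic above the routine returns eta = 101/1001, matching the closed-form optimum, since the finite difference of a linear gradient is exact.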
Total citations
[Citations-per-year bar chart, 1995–2024; per-year counts not recoverable from the page text]
Scholar articles
XH Yu, GA Chen, SX Cheng - IEEE Transactions on Neural Networks, 1995