Authors
Xiao-Hu Yu, Guo-An Chen
Publication date
1997/4/1
Journal
Neural Networks
Volume
10
Issue
3
Pages
517-527
Publisher
Pergamon
Description
This paper considers efficient backpropagation learning using a dynamically optimal learning rate (LR) and momentum factor (MF). A family of approaches exploiting the derivatives of the error with respect to the LR and MF is presented; these approaches do not explicitly compute the first- and second-order derivatives in weight space, but instead reuse information gathered during the forward and backward passes. The computational and storage cost of estimating the optimal LR and MF is at most three times that of the standard backpropagation algorithm (BPA); in return, the learning procedure is accelerated with remarkable savings in running time. Extensive computer simulations provided in this paper indicate that at least an order-of-magnitude saving in running time can be achieved using the present family of approaches. © 1997 Elsevier Science Ltd. All Rights Reserved.
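The idea in the abstract can be illustrated with a minimal sketch: backpropagation with momentum in which the learning rate is re-chosen at every step by probing the loss along the momentum-smoothed update direction. This is only a crude stand-in for the paper's derivative-based estimation of the optimal LR and MF (which reuses forward/backward quantities rather than extra probes); the network, data, candidate LRs, and fixed momentum factor below are all illustrative assumptions, not the authors' setup.

```python
import numpy as np

# Illustrative sketch (NOT the paper's exact algorithm): a tiny 2-2-1
# network on XOR, trained by backprop with momentum. At each step the
# learning rate is picked from a small candidate set by evaluating the
# loss along the proposed update -- a stand-in for choosing a
# dynamically optimal LR. The momentum factor is held fixed here.

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(params):
    W1, b1, W2, b2 = params
    h = sigmoid(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # network output
    return h, out

def loss(params):
    _, out = forward(params)
    return float(np.mean((out - y) ** 2))

def gradients(params):
    """Standard backward pass for the mean-squared error."""
    W1, b1, W2, b2 = params
    h, out = forward(params)
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    gW2 = h.T @ d_out; gb2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * h * (1 - h)
    gW1 = X.T @ d_h; gb1 = d_h.sum(0)
    return [gW1, gb1, gW2, gb2]

params = [W1, b1, W2, b2]
velocity = [np.zeros_like(p) for p in params]
mf = 0.9                          # fixed momentum factor (illustrative)
lr_candidates = [0.05, 0.2, 0.8, 3.0]  # probed each step (illustrative)

initial = loss(params)
for step in range(500):
    grads = gradients(params)
    best = None
    # Probe each candidate LR and keep the update with the lowest loss,
    # approximating a dynamically optimal learning rate.
    for lr in lr_candidates:
        v_try = [mf * v - lr * g for v, g in zip(velocity, grads)]
        p_try = [p + dv for p, dv in zip(params, v_try)]
        l_try = loss(p_try)
        if best is None or l_try < best[0]:
            best = (l_try, p_try, v_try)
    _, params, velocity = best

print("loss:", initial, "->", loss(params))
```

Each probe costs one extra forward pass, loosely mirroring the abstract's point that the added work is a small constant factor over standard BPA while the step sizes it enables can cut total running time substantially.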
Total citations
[Annual citation histogram, 1998–2024]