Authors
Johan AK Suykens, Joos Vandewalle
Publication date
2000/7
Journal
IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Volume
47
Issue
7
Pages
1109-1114
Publisher
IEEE
Description
The method of support vector machines (SVM's) has been developed for solving classification and static function approximation problems. In this paper we introduce SVM's within the context of recurrent neural networks. Instead of Vapnik's epsilon-insensitive loss function, we consider a least squares version related to a cost function with equality constraints for a recurrent network. Essential features of SVM's remain, such as Mercer's condition and the fact that the output weights are a Lagrange-multiplier-weighted sum of the data points. The solution to recurrent least squares SVM's (LS-SVM's) is characterized by a set of nonlinear equations. Due to its high computational complexity, we focus on a limited case of assigning the squared error an infinitely large penalty factor with early stopping as a form of regularization. The effectiveness of the approach is demonstrated on trajectory learning of the double scroll attractor in …
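The abstract's least-squares idea is easiest to see in the *static* LS-SVM regression case that the recurrent formulation builds on: with equality constraints instead of the epsilon-insensitive loss, training reduces to one linear system, and the model output is a Lagrange-multiplier-weighted sum of kernel evaluations on the data points. The sketch below illustrates that static case only (the recurrent version in the paper leads to nonlinear equations); the kernel choice, hyperparameters, and function names are illustrative assumptions, not the paper's code.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian (RBF) kernel matrix; a kernel satisfying Mercer's condition.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Equality-constrained least-squares dual: solve the linear system
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    # for the bias b and Lagrange multipliers alpha.
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias, multipliers

def lssvm_predict(X_new, X, alpha, b, sigma=1.0):
    # Output is a multiplier-weighted sum of kernel terms on the data points.
    return rbf_kernel(X_new, X, sigma) @ alpha + b

# Usage: fit a noisy sine wave and report the training mean squared error.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 40)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
b, alpha = lssvm_fit(X, y, gamma=100.0, sigma=0.5)
y_hat = lssvm_predict(X, X, alpha, b, sigma=0.5)
print(float(np.mean((y_hat - y) ** 2)))
```

Note the role of `gamma`: it is the penalty on the squared error, and the paper's limited case corresponds to letting it grow infinitely large (the ridge term `I/gamma` vanishes), with early stopping supplying the regularization instead.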
Total citations
(per-year citation histogram, 1999–2024; individual counts not recoverable from the page extract)
Scholar articles
JAK Suykens, J Vandewalle - IEEE Transactions on Circuits and Systems I …, 2000