Authors
Tony Van Gestel, Johan AK Suykens, Gert Lanckriet, Annemie Lambrechts, Bart De Moor, Joos Vandewalle
Publication date
2002/5/1
Journal
Neural Computation
Volume
14
Issue
5
Pages
1115-1147
Publisher
MIT Press
Description
The Bayesian evidence framework has been successfully applied to the design of multilayer perceptrons (MLPs) in the work of MacKay. Nevertheless, the training of MLPs suffers from drawbacks like the nonconvex optimization problem and the choice of the number of hidden units. In support vector machines (SVMs) for classification, as introduced by Vapnik, a nonlinear decision boundary is obtained by mapping the input vector first in a nonlinear way to a high-dimensional kernel-induced feature space in which a linear large margin classifier is constructed. Practical expressions are formulated in the dual space in terms of the related kernel function, and the solution follows from a (convex) quadratic programming (QP) problem. In least-squares SVMs (LS-SVMs), the SVM problem formulation is modified by introducing a least-squares cost function and equality instead of inequality constraints, and the solution …
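The abstract notes that the LS-SVM modification (a least-squares cost with equality constraints) turns the SVM's quadratic program into a set of linear equations. A minimal sketch of that idea, assuming an RBF kernel and illustrative hyperparameters `gamma` (regularization) and `sigma` (kernel width) not specified in the text:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # The equality constraints reduce training to one linear system:
    #   [ 0      y^T            ] [b]     [0]
    #   [ y   Omega + I / gamma ] [alpha] = [1]
    # with Omega_ij = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(X_train, y, alpha, b, X_test, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b )
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y) + b)

# Toy usage: two well-separated 2-D clusters with labels -1 / +1.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1],
              [3, 3], [3, 4], [4, 3], [4, 4]], dtype=float)
y = np.array([-1, -1, -1, -1, 1, 1, 1, 1], dtype=float)
b, alpha = lssvm_train(X, y)
pred = lssvm_predict(X, y, alpha, b, X)
```

Unlike the standard SVM, every training point typically receives a nonzero `alpha` here (no sparsity), which is the trade-off for replacing the QP with a single solve.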
Total citations
[Citations-per-year chart, 2001–2024; per-year counts not recoverable from the scraped text]