Authors
Tony Van Gestel, Johan AK Suykens, Bart Baesens, Stijn Viaene, Jan Vanthienen, Guido Dedene, Bart De Moor, Joos Vandewalle
Publication date
2004/1
Journal
Machine Learning
Volume
54
Pages
5-32
Publisher
Kluwer Academic Publishers-Plenum Publishers
Description
In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations in the dual space. While the SVM classifier has a large margin interpretation, the LS-SVM formulation is related in this paper to a ridge regression approach for classification with binary targets and to Fisher's linear discriminant analysis in the feature space. Multiclass categorization problems are represented by a set of binary classifiers using different output coding schemes. While regularization is used to control the effective number of parameters of the LS-SVM classifier, the sparseness property of SVMs is lost due to the choice of the 2-norm. Sparseness can be imposed in a second stage by gradually …
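The core computational point of the abstract — that the LS-SVM classifier replaces the SVM's QP with a single linear system in the dual variables — can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's reference implementation: the RBF kernel and the values of the regularization constant `gamma` and kernel width `sigma` are assumptions chosen for the toy example. The linear system is the standard LS-SVM KKT system with matrix Omega_ij = y_i y_j K(x_i, x_j).

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gram matrix of the RBF kernel K(x, z) = exp(-||x - z||^2 / (2 sigma^2)).
    # The paper treats kernels generically; RBF with sigma=1 is an assumption here.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Fit a binary LS-SVM classifier (labels y in {-1, +1}).

    Instead of a QP, solve one (n+1)x(n+1) linear system:
        [ 0    y^T            ] [ b     ]   [ 0 ]
        [ y    Omega + I/gamma] [ alpha ] = [ 1 ]
    Returns the dual variables alpha and the bias b.
    """
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x, x_i) + b ).
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
```

Note that because the cost is a least-squares one, every `alpha_i` is generically nonzero — this is exactly the loss of sparseness the abstract mentions, which the paper's second stage addresses by pruning.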
Total citations
2003: 10, 2004: 10, 2005: 18, 2006: 38, 2007: 42, 2008: 29, 2009: 60, 2010: 47, 2011: 60, 2012: 54, 2013: 62, 2014: 68, 2015: 55, 2016: 55, 2017: 54, 2018: 45, 2019: 46, 2020: 33, 2021: 40, 2022: 31, 2023: 26, 2024: 14
Scholar articles
T Van Gestel, JAK Suykens, B Baesens, S Viaene… - Machine learning, 2004