Authors
Guido Bugmann
Publication date
1998/8/31
Journal
Neurocomputing
Volume
20
Issue
1-3
Pages
97-110
Publisher
Elsevier
Description
The performance of normalized RBF (NRBF) nets and standard RBF nets is compared in simple classification and mapping problems. In normalized RBF networks, the traditional roles of weights and activities in the hidden layer are switched. Hidden nodes perform a function similar to a Voronoi tessellation of the input space, and the output weights become the network's output over the partition defined by the hidden nodes. Consequently, NRBF nets lose the localized characteristics of standard RBF nets and exhibit excellent generalization properties, to the extent that hidden nodes need to be recruited only for training data at the boundaries of class domains. Reflecting this, a new learning rule is proposed that greatly reduces the number of hidden nodes needed in classification tasks. As for mapping applications, it is shown that NRBF nets may outperform standard RBF nets and exhibit more uniform errors. In …
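As a rough illustration of the normalization described in the abstract (a minimal sketch, not taken from the paper; function and variable names are invented for illustration), the following code computes the output of a Gaussian RBF net and of its normalized counterpart, where each hidden activation is divided by the sum of all hidden activations before the output weights are applied.

import numpy as np

def rbf_output(x, centres, sigma, weights, normalised=False):
    """Output of a Gaussian RBF net for a single input vector x.

    x          : (d,) input vector
    centres    : (m, d) hidden-node centres
    sigma      : common width of the Gaussian basis functions
    weights    : (m,) output weights
    normalised : if True, divide each hidden activation by the sum of
                 all activations (normalized RBF / NRBF net)
    """
    # Gaussian activation of each hidden node
    dist2 = np.sum((centres - x) ** 2, axis=1)
    phi = np.exp(-dist2 / (2.0 * sigma ** 2))

    if normalised:
        # The normalized activations form a partition of unity, so the
        # output tends toward the weight of the nearest centre, i.e. a
        # Voronoi-like partition of the input space.
        phi = phi / np.sum(phi)

    return np.dot(weights, phi)

# Far from all centres, the standard RBF output decays to zero, while the
# normalized net still returns (approximately) the weight of the nearest centre.
centres = np.array([[0.0], [1.0]])
weights = np.array([1.0, -1.0])
x = np.array([5.0])
print(rbf_output(x, centres, 0.5, weights, normalised=False))  # ~0
print(rbf_output(x, centres, 0.5, weights, normalised=True))   # ~-1

This non-local behaviour is what the abstract refers to: the response over a whole partition is set by a single output weight, so hidden nodes are only needed where the mapping or class label changes, such as at class boundaries.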
Total citations
Per-year cited-by counts, 1999–2024 (histogram)