Authors
Halbert White
Publication date
1990/1/1
Journal
Neural networks
Volume
3
Issue
5
Pages
535-549
Publisher
Pergamon
Description
It has recently been shown (e.g., Hornik, Stinchcombe & White, 1989, 1990) that sufficiently complex multilayer feedforward networks are capable of representing arbitrarily accurate approximations to arbitrary mappings. We show here that these approximations are learnable by proving the consistency of a class of connectionist nonparametric regression estimators for arbitrary (square integrable) regression functions. The consistency property ensures that as network “experience” accumulates (as indexed by the size of the training set), the probability of network approximation error exceeding any specified level tends to zero. A key feature of the demonstration of consistency is the proper control of the growth of network complexity as a function of network experience. We give specific growth rates for network complexity compatible with consistency. We also consider automatic and semi-automatic data-driven …
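The central idea described in the abstract is a "sieve"-style estimator: fit a single-hidden-layer network whose number of hidden units is allowed to grow slowly with the size of the training set, so that approximation error and estimation error shrink together. Below is a minimal illustrative sketch of that idea, not the estimator or the growth rates analyzed in the paper; it assumes scikit-learn's MLPRegressor, and the growth rate n**0.25 is a hypothetical choice made only for this example.

```python
# Illustrative sketch (not from the paper): nonparametric regression with a
# single-hidden-layer network whose complexity grows with training-set size n,
# echoing the paper's notion of controlling complexity as a function of
# network "experience". The n**0.25 rate is hypothetical.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def fit_growing_network(X, y):
    n = len(y)
    hidden_units = max(1, int(n ** 0.25))   # hidden units grow slowly with n
    net = MLPRegressor(hidden_layer_sizes=(hidden_units,),
                       activation="logistic",
                       max_iter=2000,
                       random_state=0)
    return net.fit(X, y)

# Toy square-integrable regression function on [0, 1].
f = lambda x: np.sin(2 * np.pi * x)

for n in [50, 200, 800, 3200]:
    X = rng.uniform(0, 1, size=(n, 1))
    y = f(X[:, 0]) + 0.1 * rng.standard_normal(n)
    net = fit_growing_network(X, y)
    X_test = np.linspace(0, 1, 500).reshape(-1, 1)
    mse = np.mean((net.predict(X_test) - f(X_test[:, 0])) ** 2)
    print(f"n={n:5d}  hidden_units={max(1, int(n ** 0.25))}  test MSE={mse:.4f}")
```

In this toy setting the test error typically decreases as n grows, which is the behavior the consistency result formalizes: the probability of the approximation error exceeding any fixed level tends to zero as experience accumulates, provided network complexity grows at a compatible rate.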
Total citations
[Citations-per-year histogram, 1991–2024; per-year counts not recoverable from the page residue]