Authors
Byoung-Tak Zhang, Heinz Muehlenbein
Publication date
1993/6
Journal
Complex Systems
Volume
7
Issue
3
Pages
199-220
Publisher
Champaign, IL, USA: Complex Systems Publications
Description
Genetic algorithms have been used for neural networks in two main ways: to optimize the network architecture and to train the weights of a fixed architecture. While most previous work focuses on only one of these two options, this paper investigates an alternative evolutionary approach called Breeder Genetic Programming (BGP) in which the architecture and the weights are optimized simultaneously. The genotype of each network is represented as a tree whose depth and width are dynamically adapted to the particular application by specifically defined genetic operators. The weights are trained by a next-ascent hillclimbing search. A new fitness function is proposed that quantifies the principle of Occam's razor. It makes an optimal trade-off between the error-fitting ability and the parsimony of the network. Simulation results on two benchmark problems of differing complexity suggest that the method finds minimal size networks on clean data. The experiments on noisy data show that using Occam's razor not only improves the generalization performance, it also accelerates the convergence speed of evolution.
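The abstract's central idea, a fitness function that trades off fitting error against network parsimony, can be sketched as follows. This is a minimal illustration, not the paper's actual formula: the function name, the penalty coefficient `alpha`, and the use of weight count as the complexity measure are all assumptions for demonstration.

```python
# Hedged sketch of an Occam's-razor-style fitness: lower is better.
# The exact weighting scheme in the paper is not reproduced here;
# alpha and the weight-count complexity measure are illustrative.

def occam_fitness(error: float, num_weights: int, alpha: float = 0.01) -> float:
    """Combine fitting error with a parsimony penalty on network size."""
    return error + alpha * num_weights

# A much smaller network can win even with slightly worse fitting error:
big = occam_fitness(error=0.10, num_weights=120)
small = occam_fitness(error=0.15, num_weights=20)
```

Under such a criterion, evolution is pushed toward the smallest network that still fits the data, which is what the abstract credits for both better generalization and faster convergence on noisy data.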