Authors
Heinz Mühlenbein, Robin Höns
Publication date
2005/3
Journal
Evolutionary Computation
Volume
13
Issue
1
Pages
1-27
Publisher
MIT Press
Description
Estimation of Distribution Algorithms (EDA) have been proposed as an extension of genetic algorithms. In this paper we explain the relationship of EDA to algorithms developed in statistics, artificial intelligence, and statistical physics. The major design issues are discussed within a general interdisciplinary framework. It is shown that maximum entropy approximations play a crucial role. All proposed algorithms try to minimize the Kullback-Leibler divergence KLD between the unknown distribution p(x) and a class q(x) of approximations. However, the Kullback-Leibler divergence is not symmetric. Approximations which suppose that the function to be optimized is additively decomposed (ADF) minimize KLD(q||p), whereas methods which learn the approximate model from data minimize KLD(p||q). This minimization is identical to maximizing the log-likelihood. In the paper three classes of algorithms are …
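The asymmetry of the Kullback-Leibler divergence that the abstract highlights is easy to verify numerically. The sketch below is illustrative only (the distributions `p` and `q` are made-up values, not from the paper): it computes KLD(p||q) and KLD(q||p) for two discrete distributions and shows they differ.

```python
import math

def kld(p, q):
    """Kullback-Leibler divergence KLD(p || q) for discrete distributions.

    Terms with p_i == 0 contribute nothing, by the usual 0*log(0) = 0 convention.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two illustrative distributions over three states (hypothetical values)
p = [0.5, 0.4, 0.1]
q = [0.3, 0.3, 0.4]

print(kld(p, q))  # KLD(p||q)
print(kld(q, p))  # KLD(q||p): a different value, since KLD is not symmetric
print(kld(p, p))  # zero when the distributions coincide
```

This asymmetry is exactly why the two design choices the abstract contrasts behave differently: minimizing KLD(q||p) (the ADF-based approximations) and minimizing KLD(p||q) (learning the model from data, equivalent to maximizing the log-likelihood) are distinct objectives.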
Total citations
[Per-year citation histogram, 2004–2024; individual yearly counts not recoverable from this extract.]