Authors
Kagan Tumer, Joydeep Ghosh
Publication date
1996/2/1
Journal
Pattern Recognition
Volume
29
Issue
2
Pages
341–348
Publisher
Pergamon
Description
Combining or integrating the outputs of several pattern classifiers has led to improved performance in a multitude of applications. This paper provides an analytical framework to quantify the improvements in classification results due to combining. We show that combining networks linearly in output space reduces the variance of the actual decision region boundaries around the optimum boundary. This result is valid under the assumption that the a posteriori probability distributions for each class are locally monotonic around the Bayes optimum boundary. In the absence of classifier bias, the error is shown to be proportional to the boundary variance, resulting in a simple expression for error rate improvements. In the presence of bias, the error reduction, expressed in terms of a bias reduction factor, is shown to be less than or equal to the reduction obtained in the absence of bias. The analysis presented here …
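The abstract's central claim — that linearly combining classifier outputs reduces the variance of the learned decision boundary around the Bayes optimum, and that (absent bias) the added error is proportional to that boundary variance — can be illustrated with a small simulation. The sketch below is not the paper's derivation: the sigmoid posterior model, the ensemble size N, the per-classifier boundary-error scale sigma, and the grid-search root finding are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Assumed setup: true posterior P(class 1 | x) = sigmoid(x), so the
# Bayes-optimal boundary sits at x = 0. Each simulated classifier
# estimates this posterior with a random boundary offset e_i, i.e.
# f_i(x) = sigmoid(x - e_i); its own boundary is therefore x = e_i.
N = 10        # classifiers in the ensemble (assumption)
sigma = 0.5   # std. dev. of each classifier's boundary error (assumption)
trials = 2000

xs = np.linspace(-3.0, 3.0, 6001)
single_boundaries = []
combined_boundaries = []

for _ in range(trials):
    offsets = rng.normal(0.0, sigma, size=N)          # unbiased errors
    # Boundary of an individual classifier (classifier 0) is its offset.
    single_boundaries.append(offsets[0])
    # Linear combiner: average the N posterior estimates in output
    # space, then locate where the averaged estimate crosses 0.5.
    avg = sigmoid(xs[None, :] - offsets[:, None]).mean(axis=0)
    combined_boundaries.append(xs[np.argmin(np.abs(avg - 0.5))])

var_single = np.var(single_boundaries)
var_combined = np.var(combined_boundaries)
print("single-classifier boundary variance :", var_single)
print("averaged-combiner boundary variance :", var_combined)
print("reduction factor (expect ~N for i.i.d., unbiased errors):",
      var_single / var_combined)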
Total citations
[Per-year citation counts, 1996–2024 (Google Scholar citation histogram)]