Authors
Bojan Cestnik, Ivan Bratko
Publication date
1991
Conference
Machine Learning—EWSL-91: European Working Session on Learning, Porto, Portugal, March 6–8, 1991, Proceedings
Pages
138-150
Publisher
Springer Berlin Heidelberg
Description
In this paper we introduce a new method for decision tree pruning, based on minimisation of the expected classification error, a method proposed by Niblett and Bratko. The original Niblett-Bratko pruning algorithm uses Laplace probability estimates. Here we introduce a new, more general Bayesian approach to estimating probabilities, which we call m-probability-estimation. By varying a parameter m in this method, tree pruning can be adjusted to particular properties of the learning domain, such as the level of noise. The resulting pruning method improves on the original Niblett-Bratko pruning in the following respects: a priori probabilities can be incorporated into error estimation, several trees pruned to various degrees can be generated, and the degree of pruning is not affected by the number of classes. These improvements are supported by experimental findings. m-probability-estimation also enables the combination …
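For readers skimming the abstract, here is a minimal sketch of the m-probability estimate it refers to, assuming the standard formulation p = (n_c + m·p_a) / (n + m); the function names and example counts below are illustrative, not taken from the paper:

```python
def m_estimate(n_c, n, p_a, m):
    """m-probability estimate of a class probability at a tree node.

    n_c : number of training examples of class c reaching the node
    n   : total number of training examples reaching the node
    p_a : a priori probability of class c
    m   : parameter; larger m pulls the estimate toward the prior
    """
    return (n_c + m * p_a) / (n + m)


def expected_leaf_error(class_counts, priors, m):
    """Expected classification error if the node were turned into a leaf.

    The leaf predicts the class with the highest m-estimated probability;
    the expected error is one minus that probability.
    """
    n = sum(class_counts.values())
    best = max(m_estimate(class_counts.get(c, 0), n, priors[c], m)
               for c in priors)
    return 1.0 - best


# Illustrative two-class node: 8 positive and 2 negative examples,
# uniform priors.  With m = 2 the majority-class estimate is
# (8 + 2*0.5) / (10 + 2) = 0.75, so the expected leaf error is 0.25.
counts = {"pos": 8, "neg": 2}
priors = {"pos": 0.5, "neg": 0.5}
print(expected_leaf_error(counts, priors, m=2))  # 0.25
```

Note that with uniform priors p_a = 1/k and m set to the number of classes k, the estimate reduces to the Laplace estimate (n_c + 1) / (n + k) used in the original Niblett-Bratko pruning, which is why the m-estimate can be seen as its generalisation.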
Total citations
[Citations-per-year chart, 1991–2023]
Scholar articles
B Cestnik, I Bratko - Machine Learning—EWSL-91: European Working …, 1991