Authors
Thomas G Dietterich, Ghulum Bakiri
Publication date
1994
Journal
Journal of Artificial Intelligence Research
Volume
2
Pages
263-286
Description
Multiclass learning problems involve finding a definition for an unknown function f(x) whose range is a discrete set containing k > 2 values (i.e., k "classes"). The definition is acquired by studying collections of training examples of the form [x_i, f(x_i)]. Existing approaches to multiclass learning problems include direct application of multiclass algorithms such as the decision-tree algorithms C4.5 and CART, application of binary concept learning algorithms to learn individual binary functions for each of the k classes, and application of binary concept learning algorithms with distributed output representations. This paper compares these three approaches to a new technique in which error-correcting codes are employed as a distributed output representation. We show that these output representations improve the generalization performance of both C4.5 and backpropagation on a wide range of multiclass learning tasks. We also demonstrate that this approach is robust with respect to changes in the size of the training sample, the assignment of distributed representations to particular classes, and the application of overfitting avoidance techniques such as decision-tree pruning. Finally, we show that, like the other methods, the error-correcting code technique can provide reliable class probability estimates. Taken together, these results demonstrate that error-correcting output codes provide a general-purpose method for improving the performance of inductive learning programs on multiclass problems.
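The abstract's core idea can be sketched in a few lines: assign each class a binary codeword, train one binary classifier per codeword bit, and classify a new example by running all bit classifiers and choosing the class whose codeword is nearest in Hamming distance. The code matrix, toy data, and trivial "memorizing" learner below are illustrative assumptions for the sketch, not the paper's experimental setup (which used C4.5 and backpropagation as the underlying learners).

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit sequences."""
    return sum(x != y for x, y in zip(a, b))

# A 4-class problem encoded with 7-bit codewords (one row per class).
# This matrix has minimum pairwise Hamming distance 4, so decoding can
# correct any single bit classifier making an error.
CODE = [
    (0, 0, 0, 0, 0, 0, 0),
    (0, 1, 1, 1, 1, 0, 0),
    (1, 0, 1, 1, 0, 1, 0),
    (1, 1, 0, 1, 0, 0, 1),
]

def ecoc_train(X, y, fit_binary):
    """Train one binary classifier per codeword bit.

    fit_binary(X, bits) must return a predict function mapping an
    example to {0, 1}; any binary learner can be plugged in here.
    """
    learners = []
    for j in range(len(CODE[0])):
        bits = [CODE[label][j] for label in y]   # bit j of each label's codeword
        learners.append(fit_binary(X, bits))
    return learners

def ecoc_predict(learners, x):
    """Run every bit classifier, then decode to the nearest codeword."""
    word = tuple(h(x) for h in learners)
    return min(range(len(CODE)), key=lambda c: hamming(word, CODE[c]))

# Hypothetical stand-in learner: memorizes the training set exactly.
def fit_memorizer(X, bits):
    table = {tuple(x): b for x, b in zip(X, bits)}
    return lambda x: table.get(tuple(x), 0)

X = [[0], [1], [2], [3]]
y = [0, 1, 2, 3]
learners = ecoc_train(X, y, fit_memorizer)
predictions = [ecoc_predict(learners, x) for x in X]
```

Nearest-codeword decoding is where the robustness claimed in the abstract comes from: with minimum row distance d, up to floor((d - 1) / 2) of the binary classifiers can be wrong on an example and the correct class is still recovered.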
Total citations
[Yearly citation histogram, 1995-2024]