Authors
Sanjoy Dasgupta, Michael Littman, David McAllester
Publication date
2001
Journal
Advances in neural information processing systems
Volume
14
Description
The rule-based bootstrapping introduced by Yarowsky, and its co-training variant by Blum and Mitchell, have met with considerable empirical success. Earlier work on the theory of co-training has been only loosely related to empirically useful co-training algorithms. Here we give a new PAC-style bound on generalization error which justifies both the use of confidences—partial rules and partial labeling of the unlabeled data—and the use of an agreement-based objective function as suggested by Collins and Singer. Our bounds apply to the multiclass case, i.e., where instances are to be assigned one of k labels for k ≥ 2.
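The agreement-based idea the abstract refers to can be illustrated with a toy sketch (my own illustration, not code from the paper): each instance has two views that are conditionally independent given the label, a separate rule predicts from each view, and the empirical disagreement rate between the two rules is measured on unlabeled data. Under the view-independence assumption, low disagreement constrains the true error, which is the intuition the paper's PAC-style bound formalizes. All names and the 0.1 noise rate below are assumptions for the demo.

```python
import random

random.seed(0)

# Toy co-training setup: each instance has two views (x1, x2) that are
# conditionally independent given the true label y (the Blum-Mitchell assumption).
def sample_instance():
    y = random.choice([0, 1])
    # Each view is a noisy copy of the label (flipped with probability 0.1).
    x1 = y if random.random() > 0.1 else 1 - y
    x2 = y if random.random() > 0.1 else 1 - y
    return x1, x2, y

data = [sample_instance() for _ in range(10000)]

# One trivial "rule" per view (the identity rule on that view).
h1 = lambda x1: x1
h2 = lambda x2: x2

# Agreement-based objective: empirical disagreement rate, computable
# from unlabeled data alone (the label y is never consulted).
disagreement = sum(h1(x1) != h2(x2) for x1, x2, _ in data) / len(data)

# True error of h1 (uses labels; shown only to compare against disagreement).
error_h1 = sum(h1(x1) != y for x1, _, y in data) / len(data)

print(f"disagreement = {disagreement:.3f}, error(h1) = {error_h1:.3f}")
```

With independent 0.1-noise views, the expected disagreement is 2(0.1)(0.9) = 0.18 while each rule's error is 0.10, so here the unlabeled-data disagreement upper-bounds the error, in the spirit of the agreement-based objective.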