Authors
Alina Beygelzimer, Daniel Hsu, John Langford, Tong Zhang
Publication date
2010/6/14
Journal
arXiv preprint arXiv:1006.2588
Description
We present and analyze an agnostic active learning algorithm that works without keeping a version space. This is unlike all previous approaches where a restricted set of candidate hypotheses is maintained throughout learning, and only hypotheses from this set are ever returned. By avoiding this version space approach, our algorithm sheds the computational burden and brittleness associated with maintaining version spaces, yet still allows for substantial improvements over supervised learning for classification.
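The central idea, replacing an explicit version space with importance weighting of the labels that are actually queried, can be illustrated with a small sketch. The code below is a conceptual example only, not the paper's algorithm: the margin-based query rule, the probability floor p_min, and the use of scikit-learn's LogisticRegression as the base learner are assumptions made purely for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Conceptual sketch (hypothetical query rule, not the paper's algorithm):
# an importance-weighted active learning loop. Each incoming example's label
# is requested with probability p_t; queried labels get weight 1/p_t, so the
# weighted sample remains an unbiased surrogate for a fully labeled stream
# and no explicit version space ever has to be maintained.

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X @ rng.normal(size=5) + 0.3 * rng.normal(size=1000) > 0).astype(int)

p_min = 0.1                      # floor on the query probability (assumed)
Xq, yq, wq = [], [], []          # queried points, labels, importance weights
clf = LogisticRegression()

for x_t, y_t in zip(X, y):
    if len(set(yq)) < 2:
        p_t = 1.0                # query everything until both classes appear
    else:
        clf.fit(np.array(Xq), np.array(yq), sample_weight=np.array(wq))
        margin = abs(clf.decision_function(x_t.reshape(1, -1))[0])
        p_t = max(p_min, min(1.0, 1.0 / (1.0 + margin)))  # uncertain -> query more
    if rng.random() < p_t:       # flip a biased coin to decide whether to ask for y_t
        Xq.append(x_t)
        yq.append(y_t)
        wq.append(1.0 / p_t)     # importance weight keeps estimates unbiased

print(f"queried {len(yq)} of {len(y)} labels")

In this sketch the labeling cost drops because confidently classified points are queried only with the floor probability, while the 1/p_t weights keep the weighted empirical error an unbiased estimate, which is the role importance weighting plays in the version-space-free approach described above.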
Total citations
[Citations-per-year chart, 2009–2024]
Scholar articles
Agnostic active learning without constraints
A Beygelzimer, DJ Hsu, J Langford, T Zhang - Advances in Neural Information Processing Systems, 2010