Authors
Sanjoy Dasgupta, Daniel Hsu, Claire Monteleoni
Publication date
2007/12
Journal
Advances in neural information processing systems
Volume
20
Pages
353-360
Description
We present an agnostic active learning algorithm for any hypothesis class of bounded VC dimension under arbitrary data distributions. Most previous work on active learning either makes strong distributional assumptions, or else is computationally prohibitive. Our algorithm extends the simple scheme of Cohn, Atlas, and Ladner [1] to the agnostic setting, using reductions to supervised learning that harness generalization bounds in a simple but subtle manner. We provide a fall-back guarantee that bounds the algorithm’s label complexity by the agnostic PAC sample complexity. Our analysis yields asymptotic label complexity improvements for certain hypothesis classes and distributions. We also demonstrate improvements experimentally.
Total citations
(Per-year citation histogram, 2006–2024; individual counts garbled in extraction and not recoverable.)
Scholar articles
S Dasgupta, DJ Hsu, C Monteleoni - Advances in neural information processing systems, 2007