Authors: David JC MacKay
Publication date: 1992/7/1
Journal: Neural Computation
Volume: 4
Issue: 4
Pages: 590–604
Publisher: MIT Press
Description: Learning can be made more efficient if we can actively select particularly salient data points. Within a Bayesian learning framework, objective functions are discussed that measure the expected informativeness of candidate measurements. Three alternative specifications of what we want to gain information about lead to three different criteria for data selection. All these criteria depend on the assumption that the hypothesis space is correct, which may prove to be their main weakness.
Total citations: [yearly citation counts, 1994–2024]
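The Description above refers to objective functions that measure the expected informativeness of candidate measurements. As a purely illustrative sketch (not code from the paper), the snippet below shows one such criterion under an assumed Bayesian linear-regression model with Gaussian prior and noise: the expected gain in information about the parameters from a measurement at input x is 0.5*log(1 + sigma_y^2(x)/sigma_nu^2), where sigma_y^2(x) is the predictive variance of the noiseless output, so the most informative candidate is the one with the largest predictive variance. The function names expected_info_gain and select_next, and all numerical values, are hypothetical.

    import numpy as np

    def expected_info_gain(A, x, noise_var):
        """Expected Shannon-information gain about the weights from one noisy
        measurement at input x, for a Gaussian posterior with precision matrix A."""
        pred_var = x @ np.linalg.solve(A, x)  # predictive variance of the noiseless output
        return 0.5 * np.log(1.0 + pred_var / noise_var)

    def select_next(A, candidates, noise_var):
        """Return the index of the candidate input expected to be most informative."""
        gains = [expected_info_gain(A, x, noise_var) for x in candidates]
        return int(np.argmax(gains))

    # Hypothetical usage: a 2-parameter linear model, 5 inputs already measured,
    # 3 candidate inputs to choose between.
    rng = np.random.default_rng(0)
    noise_var = 0.1
    X_seen = rng.normal(size=(5, 2))
    A = np.eye(2) + X_seen.T @ X_seen / noise_var  # posterior precision after the seen data
    candidates = rng.normal(size=(3, 2))
    print("most informative candidate:", select_next(A, candidates, noise_var))

This sketch corresponds only to the case where we want information about the model parameters themselves; the abstract notes that other specifications of what we want to learn about lead to different selection criteria.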