Authors
Justin Zobel
Publication date
1998/8/1
Book
Proceedings of the 21st annual international ACM SIGIR conference on Research and development in information retrieval
Pages
307-314
Description
Two stages in measurement of techniques for information retrieval are gathering of documents for relevance assessment and use of the assessments to numerically evaluate effectiveness. We consider both of these stages in the context of the TREC experiments, to determine whether they lead to measurements that are trustworthy and fair. Our detailed empirical investigation of the TREC results shows that the measured relative performance of systems appears to be reliable, but that recall is overestimated: it is likely that many relevant documents have not been found. We propose a new pooling strategy that can significantly increase the number of relevant documents found for given effort, without compromising fairness.
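The abstract's findings hinge on how the TREC judgment pool is assembled. As background, here is a minimal sketch of conventional depth-k pooling, the baseline practice the paper examines: the union of the top k documents from each submitted run is judged, and unjudged documents are treated as non-relevant. The `runs` mapping and document ids are hypothetical, and this illustrates the standard pool only, not the alternative pooling strategy the paper proposes.

```python
from typing import Dict, List, Set

def depth_k_pool(runs: Dict[str, List[str]], k: int = 100) -> Set[str]:
    """Form a judgment pool from the top-k documents of each run.

    `runs` maps a run name to its ranked list of document ids for a
    single topic. Only pooled documents are assessed for relevance;
    anything outside the pool is assumed non-relevant, which is why
    recall computed from the pool can be an overestimate.
    """
    pool: Set[str] = set()
    for ranking in runs.values():
        pool.update(ranking[:k])
    return pool

# Toy example: three hypothetical runs for one topic, pooled to depth 2.
runs = {
    "run_a": ["d3", "d1", "d7", "d2"],
    "run_b": ["d1", "d4", "d3", "d9"],
    "run_c": ["d5", "d3", "d1", "d8"],
}
print(sorted(depth_k_pool(runs, k=2)))  # ['d1', 'd3', 'd4', 'd5'] would be judged
```

Relevant documents ranked below depth k by every run never enter the pool, which is the mechanism behind the paper's observation that recall is overestimated.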
Total citations
(Citations per year, 2000-2024; Google Scholar citation chart not reproduced)