Authors
Charles L. A. Clarke, Maheedhar Kolla, Gordon V. Cormack, Olga Vechtomova, Azin Ashkan, Stefan Büttcher, Ian MacKinnon
Publication date
2008/7/20
Book
Proceedings of the 31st annual international ACM SIGIR conference on Research and development in information retrieval
Pages
659-666
Description
Evaluation measures act as objective functions to be optimized by information retrieval systems. Such objective functions must accurately reflect user requirements, particularly when tuning IR systems and learning ranking functions. Ambiguity in queries and redundancy in retrieved documents are poorly reflected by current evaluation measures. In this paper, we present a framework for evaluation that systematically rewards novelty and diversity. We develop this framework into a specific evaluation measure, based on cumulative gain. We demonstrate the feasibility of our approach using a test collection based on the TREC question answering track.
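The cumulative-gain measure developed in this paper is commonly known as α-nDCG: a document's gain for covering an information nugget is discounted by a factor of (1 − α) for every higher-ranked document that already covered the same nugget, so redundant results earn less and novel, diverse results earn more. Below is a minimal sketch in Python; the function names, the string-valued nuggets, and the use of the run's own documents as the pool for the greedy ideal ranking are illustrative assumptions, not the paper's exact experimental setup.

    import math
    from typing import Dict, List, Sequence, Set

    def alpha_dcg(ranking: Sequence[Set[str]], alpha: float = 0.5, depth: int = 10) -> float:
        # Each ranked item is the set of nuggets the document was judged to contain.
        # A nugget's gain decays by (1 - alpha) for every earlier document covering it.
        seen: Dict[str, int] = {}
        score = 0.0
        for k, nuggets in enumerate(ranking[:depth], start=1):
            gain = sum((1 - alpha) ** seen.get(n, 0) for n in nuggets)
            score += gain / math.log2(k + 1)  # rank discount, as in classic (n)DCG
            for n in nuggets:
                seen[n] = seen.get(n, 0) + 1
        return score

    def alpha_ndcg(ranking: Sequence[Set[str]], alpha: float = 0.5, depth: int = 10) -> float:
        # Normalize by a greedily built "ideal" reordering of the same documents:
        # at each step take the document with the highest residual gain. Computing
        # the true optimum is NP-hard, so a greedy approximation is used.
        remaining: List[Set[str]] = list(ranking)
        seen: Dict[str, int] = {}
        ideal: List[Set[str]] = []
        while remaining:
            best = max(remaining, key=lambda ns: sum((1 - alpha) ** seen.get(n, 0) for n in ns))
            remaining.remove(best)
            ideal.append(best)
            for n in best:
                seen[n] = seen.get(n, 0) + 1
        denom = alpha_dcg(ideal, alpha, depth)
        return alpha_dcg(ranking, alpha, depth) / denom if denom > 0 else 0.0

    # Ranking the novel document ({"c"}) ahead of a redundant one ({"a"}) scores higher:
    print(alpha_ndcg([{"a", "b"}, {"c"}, {"a"}]))  # 1.0 (matches the greedy ideal)
    print(alpha_ndcg([{"a", "b"}, {"a"}, {"c"}]))  # ~0.977

With α = 0 every repeated nugget counts in full and the measure reduces to ordinary cumulative gain over nugget counts; larger α penalizes redundancy more sharply.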
Total citations
2008: 4    2009: 36   2010: 60   2011: 111  2012: 91   2013: 102
2014: 97   2015: 93   2016: 106  2017: 93   2018: 80   2019: 78
2020: 64   2021: 65   2022: 50   2023: 54   2024: 35