Authors
Myriam C Traub, Jacco van Ossenbruggen, Jiyin He, Lynda Hardman
Publication date
2014
Conference
Advances in Information Retrieval: 36th European Conference on IR Research, ECIR 2014, Amsterdam, The Netherlands, April 13-16, 2014. Proceedings
Pages
112-123
Publisher
Springer International Publishing
Description
Tasks that require users to have expert knowledge are difficult to crowdsource. They are often too complex to be carried out by non-experts, and the available experts in the crowd are difficult to target. Adapting an expert task into a non-expert user task, thereby enabling the ordinary “crowd” to accomplish it, can be a useful approach. We studied whether a simplified version of an expert annotation task can be carried out by non-expert users. Users conducted a game-style annotation task of oil paintings. The obtained annotations were compared with those from experts. Our results show significant agreement between the annotations of experts and non-experts, that users improve over time, and that aggregating users’ annotations per painting increases their precision.
Total citations
[Per-year citation chart, 2014–2023]
Scholar articles
MC Traub, J van Ossenbruggen, J He, L Hardman - Advances in Information Retrieval: 36th European …, 2014