Publications
Crowdsourcing for relevance evaluation
Alonso, O.; Rose, D. E. & Stewart, B.
SIGIR Forum, 42(2), 9-15 (2008) [pdf]
Relevance evaluation is an essential part of the development and maintenance of information retrieval systems. Yet traditional evaluation approaches have several limitations; in particular, conducting new editorial evaluations of a search system can be very expensive. We describe a new approach to evaluation called TERC, based on the crowdsourcing paradigm, in which many online users, drawn from a large community, each perform a small evaluation task.
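As a rough illustration of the general idea (not the TERC design itself), the sketch below shows one common way to combine crowdsourced relevance judgments: several workers label the same query-document pair, and the labels are aggregated by majority vote. The data and function names are hypothetical.

```python
from collections import Counter

def majority_label(labels):
    """Return the most frequent relevance label among a set of worker judgments."""
    return Counter(labels).most_common(1)[0][0]

# Hypothetical crowd judgments: each (query, doc) pair labeled by three workers.
judgments = {
    ("jaguar speed", "doc_17"): ["relevant", "relevant", "not relevant"],
    ("jaguar speed", "doc_42"): ["not relevant", "not relevant", "not relevant"],
}

for (query, doc), labels in judgments.items():
    print(f"{query} / {doc} -> {majority_label(labels)}")
```

Splitting the work into many such small tasks is what lets a large community of online users replace a small, expensive pool of trained editors.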