Abstract

Relevance evaluation is an essential part of the development and maintenance of information retrieval systems. Yet traditional evaluation approaches have several limitations; in particular, conducting new editorial evaluations of a search system can be very expensive. We describe a new approach to evaluation called TERC, based on the crowdsourcing paradigm, in which many online users, drawn from a large community, each performing a small evaluation task.

Links and resources

URL:
BibTeX key: alonso2008crowdsourcing

