Author | Title | Year | Journal/Proceedings | Reftype | DOI/URL |
---|---|---|---|---|---|
Alonso, O., Rose, D. E. & Stewart, B. | Crowdsourcing for relevance evaluation | 2008 | SIGIR Forum | article | https://doi.org/10.1145/1480506.1480508 |
Abstract: Relevance evaluation is an essential part of the development and maintenance of information retrieval systems. Yet traditional evaluation approaches have several limitations; in particular, conducting new editorial evaluations of a search system can be very expensive. We describe a new approach to evaluation called TERC, based on the crowdsourcing paradigm, in which many online users, drawn from a large community, each performs a small evaluation task.
BibTeX:

    @article{alonso2008crowdsourcing,
      author    = {Alonso, Omar and Rose, Daniel E. and Stewart, Benjamin},
      title     = {Crowdsourcing for relevance evaluation},
      journal   = {SIGIR Forum},
      publisher = {ACM},
      year      = {2008},
      volume    = {42},
      number    = {2},
      pages     = {9--15},
      url       = {http://doi.acm.org/10.1145/1480506.1480508},
      doi       = {10.1145/1480506.1480508}
    }
Created by JabRef export filters on 25/04/2024 by the social publication management platform PUMA