    Alonso, O., Rose, D.E. & Stewart, B. (2008). Crowdsourcing for relevance evaluation. SIGIR Forum, 42(2), 9–15. Article. DOI: 10.1145/1480506.1480508
    Abstract: Relevance evaluation is an essential part of the development and maintenance of information retrieval systems. Yet traditional evaluation approaches have several limitations; in particular, conducting new editorial evaluations of a search system can be very expensive. We describe a new approach to evaluation called TERC, based on the crowdsourcing paradigm, in which many online users, drawn from a large community, each performs a small evaluation task.
    BibTeX:
    @article{alonso2008crowdsourcing,
      author = {Alonso, Omar and Rose, Daniel E. and Stewart, Benjamin},
      title = {Crowdsourcing for relevance evaluation},
      journal = {SIGIR Forum},
      publisher = {ACM},
      year = {2008},
      volume = {42},
      number = {2},
      pages = {9--15},
      url = {http://doi.acm.org/10.1145/1480506.1480508},
      doi = {10.1145/1480506.1480508}
    }
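
    A minimal usage sketch (not part of the JabRef export): assuming the entry above is saved in a file named references.bib (a hypothetical filename), it could be cited in a LaTeX document as follows.

    \documentclass{article}
    \begin{document}
    Crowdsourcing has been proposed for relevance evaluation~\cite{alonso2008crowdsourcing}.
    \bibliographystyle{plain}
    \bibliography{references} % hypothetical .bib file containing the entry above
    \end{document}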
    
