TREC 2012 Crowdsourcing Track - Call for Participation
June 2012 – November 2012
https://sites.google.com/site/treccrowd/
Goals
As part of the National Institute of Standards and Technology (NIST)'s annual Text REtrieval Conference (TREC), the Crowdsourcing track investigates emerging crowd-based methods for evaluating search systems and for developing hybrid search systems that combine automation with the crowd. This year, our goal is to evaluate approaches to crowdsourcing high-quality relevance judgments for two different types of media:
- textual documents
- images
Participants may use any crowdsourcing methods and platforms, including home-grown systems. Submissions will be evaluated against a gold-standard set of labels and against consensus labels aggregated across all participating teams' submissions.
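To make the consensus-based evaluation concrete, the sketch below shows one plausible aggregation scheme: simple majority voting over the teams' labels for each document. The track guidelines do not fix the consensus method, and the function name, data layout, and example labels here are purely illustrative assumptions.

    from collections import Counter

    def consensus_labels(team_labels):
        """Aggregate per-team relevance labels into one consensus label per document.

        team_labels: dict mapping team_id -> dict of doc_id -> label
        Returns: dict of doc_id -> majority-vote label (ties broken arbitrarily).
        """
        votes = {}
        for labels in team_labels.values():
            for doc_id, label in labels.items():
                votes.setdefault(doc_id, []).append(label)
        return {doc_id: Counter(ls).most_common(1)[0][0]
                for doc_id, ls in votes.items()}

    # Hypothetical example: three teams label two documents
    # (0 = non-relevant, 1 = relevant)
    teams = {
        "teamA": {"doc1": 1, "doc2": 0},
        "teamB": {"doc1": 1, "doc2": 1},
        "teamC": {"doc1": 0, "doc2": 1},
    }
    print(consensus_labels(teams))  # {'doc1': 1, 'doc2': 1}

More sophisticated aggregation (e.g., weighting teams by agreement with the gold standard) is equally compatible with this setup; majority voting is shown only because it is the simplest baseline.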
Tentative Schedule
- Jun 1: Document corpora, training topics (for the image task), and task guidelines available
- Jul 1: Training labels for the image task released
- Aug 1: Test data released
- Sep 15: Submissions due
- Oct 1: Preliminary results released
- Oct 15: Conference notebook papers due
- Nov 6-9: TREC 2012 conference at NIST, Gaithersburg, MD, USA
- Nov 15: Final results released
- Jan 15, 2013: Final papers due
Participation
To take part, please register by submitting a formal application directly to NIST (even if you are a returning participant); see the bottom of the page at http://trec.nist.gov/pubs/call2012.html
Participants should also join our Google Group discussion list, where all track-related communications will take place.
Organizers
- Gabriella Kazai, Microsoft Research
- Matthew Lease, University of Texas at Austin
- Panagiotis G. Ipeirotis, New York University
- Mark D. Smucker, University of Waterloo
Further information
For further information, please visit https://sites.google.com/site/treccrowd/
We welcome any questions you may have, either by emailing the organizers or by posting on the Google Group discussion page.