TREC 2012 Crowdsourcing Track - Call for Participation
June 2012 – November 2012
Goals
As part of the National Institute of Standards and Technology (NIST)'s annual Text REtrieval Conference (TREC), the Crowdsourcing track investigates emerging crowd-based methods for evaluating search systems and for building hybrid search systems that combine automation with crowd input.
This year, our goal is to evaluate approaches to crowdsourcing high-quality relevance judgments for two different types of media:
- textual documents
- images
Participants may use any crowdsourcing methods and platforms, including home-grown systems. Submissions will be evaluated against a gold standard set of labels and against consensus labels over all participating teams.
Timeline
- Jun 1: Document corpora, training topics (for the image task), and task guidelines available
- Jul 1: Training labels for the image task released
- Aug 1: Test data released
- Sep 15: Submissions due
- Oct 1: Preliminary results released
- Oct 15: Conference notebook papers due
- Nov 6-9: TREC 2012 conference at NIST, Gaithersburg, MD, USA
- Nov 15: Final results released
- Jan 15, 2013: Final papers due
Participation
To take part, please register by submitting a formal application directly to NIST (even if you are a returning participant). See the bottom of the page at http://trec.nist.gov/pubs/call2012.html
Participants should also join our Google Group discussion list, where all track-related communications will take place.
Track organizers
- Gabriella Kazai, Microsoft Research
- Matthew Lease, University of Texas at Austin
- Panagiotis G. Ipeirotis, New York University
- Mark D. Smucker, University of Waterloo
Further information
For further information, please visit https://sites.google.com/site/treccrowd/
We welcome any questions you may have, either by emailing the organizers or by posting on the Google Group discussion page.