The Text Retrieval Conference (TREC) workshop series encourages
research in information retrieval and related applications by
providing a large test collection, uniform scoring procedures,
and a forum for organizations interested in comparing their
results. Details about TREC
can be found at the TREC web site, http://trec.nist.gov.
You are invited to participate in TREC 2021. TREC 2021 will
consist of a set of tasks known as "tracks". Each track focuses
on a particular subproblem or variant of the retrieval task as
described below. Organizations may choose to participate in any or
all of the tracks. Training and test materials are available from
NIST for some tracks; other tracks will use special collections that
are available from other organizations for a fee.
Dissemination of TREC work and results other than in the (publicly
available) conference proceedings is welcomed, but the conditions of
participation specifically preclude any advertising claims based
on TREC results. All retrieval results submitted to NIST are
published in the Proceedings and are archived on the TREC web site
with the submitting organization identified.
Schedule:
As soon as possible -- submit your application to participate in
TREC 2021 as described below.
We accept applications to participate until late May, but
applying earlier means you can be involved in track discussions.
Processing applications requires some manual effort on our end.
Once your application is processed (at most a few business days),
the "Welcome to TREC" email message with
details about TREC participation will be sent to the email address
provided in the application.
July--August
Results submission deadline for most tracks.
Specific deadlines for each track will be included in
the track guidelines, which will be finalized in the spring.
Some tracks are likely to have a spring submission deadline,
so be sure to subscribe to the track mailing lists described below.
September 30 (estimated)
Relevance judgments and individual
evaluation scores due back to participants.
Nov 17--19
TREC 2021 conference at NIST in Gaithersburg, Maryland, USA, if an in-person meeting can be held; otherwise, a virtual conference during this week.
Task Description
Below is a brief summary of the tasks. Complete descriptions of
tasks performed in previous years are included in the Overview
papers in each of the TREC proceedings (in the Publications section
of the web site).
The exact definition of the tasks to be performed in each track for
TREC 2021 is still being formulated. Track discussion takes place
on the track mailing list (or other communication medium). To join
the discussion,
follow the instructions for the track as detailed below.
TREC 2021 will contain eight tracks. Seven of the tracks ran in TREC 2020; the
Precision Medicine track that ran for the past several years is replaced by
the new Clinical Trials track.
Clinical Trials Track
The goal of the new Clinical Trials track is to focus research on the
clinical trials matching problem: given a free text summary of a patient
health record, find suitable clinical trials for that patient.
Track coordinators:
Dina Demner-Fushman, U.S. National Library of Medicine
William Hersh, Oregon Health and Science University
Kirk Roberts, University of Texas Health Science Center
Ellen Voorhees, NIST
Track Web Page:
http://www.trec-cds.org/
Mailing list:
Google group, name: trec-cds
Conversational Assistance Track
The main aim of the Conversational Assistance Track (CAsT) is to advance research
on conversational search systems. The goal of the track is to create
reusable benchmarks for open-domain, information-centric conversational dialogues.
Track coordinators:
Jamie Callan, Carnegie Mellon University
Jeff Dalton, University of Glasgow
Chenyan Xiong, Microsoft Research
Track Web Page:
Conversational Assistance track web page
Mailing list:
Google group, name: trec-cast
Twitter: @treccast
Deep Learning Track
The Deep Learning track focuses on IR tasks where a large training set is available, allowing us to compare a variety of retrieval approaches, including deep neural networks and strong non-neural approaches, to see what works best in a large-data regime.
Track coordinators:
Daniel Campos, University of Illinois
Nick Craswell, Microsoft
Jimmy Lin, University of Waterloo
Bhaskar Mitra, Microsoft
Emine Yilmaz, University College London
Track Web Page:
Deep Learning track web page
Fair Ranking Track
The Fair Ranking track focuses on building two-sided systems that offer fair exposure to ranked content producers while ensuring high results quality for ranking consumers.
Track coordinators:
Michael Ekstrand, Boise State University
Isaac Johnson, Wikimedia
Graham McDonald, University of Glasgow
Amifa Raj, Boise State University
Track Web Page:
Fair Ranking track web page
Mailing list:
Google group, name: fair-trec
Health Misinformation Track
The Health Misinformation track aims to (1) provide a venue for research on
retrieval methods that promote better decision making with search engines,
and (2) develop new online and offline evaluation
methods to predict the decision making quality induced by search results.
Consumer health information is used as the domain of interest in the track.
Track coordinators:
Charlie Clarke, University of Waterloo
Maria Maistro, University of Copenhagen
Mark Smucker, University of Waterloo
Track Web Page:
Health Misinformation track web page
Mailing list:
Google group, name: trec-health-misinformation-track
Incident Streams Track
The Incident Streams track is designed
to bring together academia and industry to research technologies that
automatically process social media streams during emergency situations,
with the aim of categorizing the information and aid requests posted there
for emergency service operators.
Track coordinators:
Cody Buntain, New Jersey Institute of Technology
Richard McCreadie, University of Glasgow
Ian Soboroff, NIST
Track Web Page:
Incident Streams track web page
Track Mailing List:
Google group, name: trec-is
News Track
The News track features modern search tasks in the news domain.
In partnership with The Washington Post, the track develops test collections
that support the search needs of news readers and news writers in the
current news environment.
Track coordinators:
Donna Harman, NIST
Shudong Huang, NIST
Ian Soboroff, NIST
Track Web Page:
News track web page
Mailing list:
Google group, name: trec-news-track
Podcasts Track
The aim of the Podcasts track is to develop methods
for information retrieval and content understanding from open-domain
podcast transcripts and audio.
Track coordinators:
Ann Clifton, Spotify
Ben Carterette, Spotify
Maria Eskevich, CLARIN ERIC
Gareth Jones, Dublin City University
Rosie Jones, Spotify
Jussi Karlgren, Spotify
Sravana Reddy, Spotify
Md Iftekhar Tanveer, Spotify
Track Web Page:
Podcasts track web page (updated 4/2021)
Mailing list:
Google group, name: trec-podcasts
Conference Format
The conference itself will be used as a forum both for presentation
of results (including failure analyses and system comparisons),
and for more lengthy system presentations describing retrieval
techniques used, experiments run using the data, and other issues
of interest to researchers in information retrieval.
All groups will be invited to present their results in a joint
poster session (assuming in-person meeting is possible).
Some groups may also be selected to present
during plenary talk sessions.
Application Details
Organizations wishing to participate in TREC 2021 should respond
to this call for participation by submitting an application.
Participants in previous TRECs who wish to participate
in TREC 2021 must submit a new application.
To apply, submit the online application at
http://ir.nist.gov/trecsubmit.open/application.html
The application system
will send an acknowledgement to the email address
supplied in the form once the application has been processed.
Any questions about conference participation should be sent
to the general TREC email address, trec (at) nist.gov.