Call to TREC 2020





February 2020 - November 2020

Conducted by:
National Institute of Standards and Technology (NIST)

The Text Retrieval Conference (TREC) workshop series encourages research in information retrieval and related applications by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. Details about TREC can be found at the TREC web site.

You are invited to participate in TREC 2020. TREC 2020 will consist of a set of tasks known as "tracks". Each track focuses on a particular subproblem or variant of the retrieval task as described below. Organizations may choose to participate in any or all of the tracks. Training and test materials are available from NIST for some tracks; other tracks will use special collections that are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly available) conference proceedings is welcomed, but the conditions of participation specifically preclude any advertising claims based on TREC results. All retrieval results submitted to NIST are published in the Proceedings and are archived on the TREC web site with the submitting organization identified.


Schedule: As soon as possible -- submit your application to participate in TREC 2020 as described below.
Submitting an application will add you to the active participants' mailing list. On Feb 20, NIST will announce a new password for the "active participants" portion of the TREC web site.

Beginning March 1
Document disks used in some existing TREC collections will be distributed to participants who have returned the required forms. Please note that no disks will be shipped before March 1.

Results submission deadline for most tracks. Specific deadlines for each track will be included in the track guidelines, which will be finalized in the spring. Some tracks are likely to have a spring submission deadline, so be sure to subscribe to the track lists described below.

September 30 (estimated)
Relevance judgments and individual evaluation scores due back to participants.

Nov 18--20
TREC 2020 conference at NIST in Gaithersburg, Md., USA.

Task Description

Below is a brief summary of the tasks. Complete descriptions of tasks performed in previous years are included in the Overview papers in each of the TREC proceedings (in the Publications section of the web site).

The exact definition of the tasks to be performed in each track for TREC 2020 is still being formulated. Track discussion takes place on the track mailing list (or other communication medium). To join the discussion, follow the instructions for the track as detailed below. You must be invited to join the TREC Slack channels. Instructions on how to request an invitation will be sent by email as part of the processing of your TREC 2020 registration.

TREC 2020 will contain eight tracks.

Conversational Assistance Track

The main aim of the Conversational Assistance Track (CAsT) is to advance research on conversational search systems. The goal of the track is to create reusable benchmarks for open-domain, information-centric conversational dialogues.
Track coordinators:
Jeff Dalton, University of Glasgow
Chenyan Xiong, Microsoft Research
Jamie Callan, Carnegie Mellon University
Track Web Page:
CAsT web page
Mailing list:
Google group, name: trec-cast
Twitter: @treccast

Deep Learning Track

The Deep Learning track focuses on IR tasks where a large training set is available, allowing us to compare a variety of retrieval approaches, including deep neural networks and strong non-neural approaches, to see what works best in a large-data regime.
Track coordinators:
Nick Craswell, Microsoft
Bhaskar Mitra, Microsoft
Emine Yilmaz, University College London
Daniel Campos, Microsoft
Track Web Page:
Deep Learning track web page
Mailing list:
Slack: deep-learning channel of TREC Slack

Fair Ranking Track

The Fair Ranking track focuses on building two-sided systems that offer fair exposure to ranked content producers while ensuring high results quality for ranking consumers.
Track coordinators:
Asia Biega, Microsoft Research Montreal
Fernando Diaz, Microsoft Research Montreal
Michael Ekstrand, Boise State University
Track Web Page:
Fairness track web page
Mailing list:
Google group, name: fair-trec

Health Misinformation Track

The Health Misinformation track aims to (1) provide a venue for research on retrieval methods that promote better decision making with search engines, and (2) develop new online and offline evaluation methods to predict the decision making quality induced by search results. Consumer health information is used as the domain of interest in the track. (This track was called the Decision Track in TREC 2019.)
Track coordinators:
Charlie Clarke, University of Waterloo
Maria Maistro, University of Copenhagen
Mark Smucker, University of Waterloo
Guido Zuccon, University of Queensland
Track Web Page:
Health Misinformation web page
Mailing list:
Google group, name: trec-decision-track

Incident Streams Track

The Incident Streams track is designed to bring together academia and industry to research technologies for automatically processing social media streams during emergency situations, with the aim of categorizing information and aid requests made on social media for emergency service operators.
Track coordinators:
Richard McCreadie, University of Glasgow
Cody Buntain, NYU
Ian Soboroff, NIST
Track Web Page:
Incident Streams track web page
Track Mailing List:
Google group, name: trec-is

News Track

The News track features modern search tasks in the news domain. In partnership with The Washington Post, the track develops test collections that support the search needs of news readers and news writers in the current news environment.
Track coordinators:
Shudong Huang, NIST
Donna Harman, NIST
Ian Soboroff, NIST
Track Web Page:
Mailing list:
Google group, name: trec-news-track

Podcasts Track

A new track for 2020. The aim of the Podcasts track is to develop methods for information retrieval and content understanding from open-domain podcast transcripts and audio.
Track coordinators:
Aasish Pappu, Spotify
Ann Clifton, Spotify
Ben Carterette, Spotify
Gareth Jones, Dublin City University
Rosie Jones, Spotify
Jussi Karlgren, Spotify
Sravana Reddy, Spotify
Track Web Page:
Podcasts track web page
Mailing list:
Google group, name: trec-podcasts

Precision Medicine Track

The Precision Medicine track focuses on building systems that use data (e.g., a patient's past medical history and genomic information) to link oncology patients to clinical trials for new treatments as well as evidence-based literature to identify the most effective existing treatments.
Track coordinators:
Kirk Roberts, University of Texas Health Science Center
Dina Demner-Fushman, U.S. National Library of Medicine
Ellen Voorhees, NIST
William Hersh, Oregon Health and Science University
Alexander Lazar, University of Texas MD Anderson Cancer Center
Shubham Pant, University of Texas MD Anderson Cancer Center
Track Web Page:
Mailing list:
Google group, name: trec-cds

Conference Format

The conference itself will be used as a forum both for presentation of results (including failure analyses and system comparisons), and for more lengthy system presentations describing retrieval techniques used, experiments run using the data, and other issues of interest to researchers in information retrieval. All groups will be invited to present their results in a joint poster session. Some groups may also be selected to present during plenary talk sessions.

Application Details

Organizations wishing to participate in TREC 2020 should respond to this call for participation by submitting an application. Participants in previous TRECs who wish to participate in TREC 2020 must submit a new application. To apply, submit the online application at

The application system will send an acknowledgement to the email address supplied in the form once it has processed the form.

Any questions about conference participation should be sent to the general TREC email address, trec (at)

The TREC Conference series is sponsored by NIST's
Information Technology Laboratory (ITL)
Information Access Division (IAD)
Retrieval Group

Last updated: Tuesday, 18-Feb-2020 13:00:09 MST
Date created: December 16, 2019
