Call to TREC 2019





February 2019 - November 2019

Conducted by:
National Institute of Standards and Technology (NIST)

The Text Retrieval Conference (TREC) workshop series encourages research in information retrieval and related applications by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. Details about TREC can be found at the TREC web site.

You are invited to participate in TREC 2019. TREC 2019 will consist of a set of tasks known as "tracks". Each track focuses on a particular subproblem or variant of the retrieval task as described below. Organizations may choose to participate in any or all of the tracks. Training and test materials are available from NIST for some tracks; other tracks will use special collections that are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly available) conference proceedings is welcomed, but the conditions of participation specifically preclude any advertising claims based on TREC results. All retrieval results submitted to NIST are published in the Proceedings and are archived on the TREC web site with the submitting organization identified. The workshop in November is open only to participating groups that submit retrieval results for at least one track and to selected government invitees.


Schedule: As soon as possible -- submit your application to participate in TREC 2019 as described below.
Submitting an application will add you to the active participants' mailing list. On Feb 21, NIST will announce a new password for the "active participants" portion of the TREC web site.

Beginning March 1
Document disks used in some existing TREC collections will be distributed to participants who have returned the required forms. Please note that no disks will be shipped before March 1.

Results submission deadline for most tracks. Specific deadlines for each track will be included in the track guidelines, which will be finalized in the spring. Some tracks may have a spring submission deadline.

September 30 (estimated)
Relevance judgments and individual evaluation scores due back to participants.

Nov 13--15
TREC 2019 conference at NIST in Gaithersburg, Md., USA.

Task Description

Below is a brief summary of the tasks. Complete descriptions of tasks performed in previous years are included in the Overview papers in each of the TREC proceedings (in the Publications section of the web site).

The exact definition of the tasks to be performed in each track for TREC 2019 is still being formulated. Track discussion takes place on the track mailing list (or other communication medium). To join the discussion, follow the instructions for the track as detailed below.

TREC 2019 will contain eight tracks.

Complex Answer Retrieval Track

The focus of the Complex Answer Retrieval track is on developing systems that are capable of answering complex information needs by collating relevant information from an entire corpus.
Track coordinators:
Laura Dietz, University of New Hampshire
Ben Gamari, Well-typed LLP
Track Web Page:
Mailing list:
Google group, name: trec-car

Conversational Assistance Track

The Conversational Assistance Track is a forum for building and testing systems that engage in open-domain, information-centric conversational dialogues.
Track coordinators:
Jeff Dalton, University of Glasgow
Chenyan Xiong, Microsoft Research
Jamie Callan, Carnegie Mellon University
Mailing list:
Google group, name: trec-cast
Twitter: @treccast

Decision Track

The Decision Track aims to (1) provide a venue for research on retrieval methods that promote better decision making with search engines, and (2) develop new online and offline evaluation methods to predict the decision making quality induced by search results.
Track coordinators:
Christina Lioma, University of Copenhagen
Mark Smucker, University of Waterloo
Guido Zuccon, University of Queensland
Mailing list:
Google group, name: trec-decision-track

Deep Learning Track

The Deep Learning track focuses on IR tasks where a large training set is available, allowing us to compare a variety of retrieval approaches including deep neural networks and strong non-neural approaches, to see what works best in a large-data regime.
Track coordinators:
Nick Craswell, Microsoft
Bhaskar Mitra, Microsoft and University College London
Emine Yilmaz, University College London
Daniel Campos, Microsoft
Track Web Page:
Deep Learning track web page
Mailing list:
The list will use the new (forthcoming) TREC Slack. Register to participate to access the Deep Learning Track challenge.

Fair Ranking Track

The Fair Ranking track focuses on building two-sided systems that offer fair exposure to ranked content producers while ensuring high results quality for ranking consumers.
Track coordinators:
Asia Biega, Max Planck Institute for Informatics
Fernando Diaz, Microsoft Research Montreal
Michael Ekstrand, Boise State University
Track Web Page:
Fairness track web page
Mailing list:
Google group, name: fair-trec

Incident Streams Track

The Incident Streams track brings together academia and industry to research technologies for automatically processing social media streams during emergencies, categorizing the information and aid requests posted there for emergency service operators.
Track coordinators:
Richard McCreadie, University of Glasgow
Cody Buntain, NYU
Ian Soboroff, NIST
Track Web Page:
Incident Streams track web page
Track Mailing List:
Google group, name: trec-is

News Track

The News track features modern search tasks in the news domain. In partnership with The Washington Post, the track develops test collections that support the search needs of news readers and news writers in the current news environment.
Track coordinators:
Shudong Huang, NIST
Donna Harman, NIST
Ian Soboroff, NIST
Track Web Page:
Mailing list:
Google group, name: trec-news-track

Precision Medicine Track

This track is a specialization of the Clinical Decision Support track of previous TRECs. It focuses on building systems that use data (e.g., a patient's past medical history and genomic information) to link oncology patients to clinical trials for new treatments as well as evidence-based literature to identify the most effective existing treatments.
Track coordinators:
Kirk Roberts, University of Texas Health Science Center
Dina Demner-Fushman, U.S. National Library of Medicine
Ellen Voorhees, NIST
William Hersh, Oregon Health and Science University
Alexander Lazar, University of Texas MD Anderson Cancer Center
Shubham Pant, University of Texas MD Anderson Cancer Center
Track Web Page:
Mailing list:
Google group, name: trec-cds

Conference Format

The conference itself will be used as a forum both for presentation of results (including failure analyses and system comparisons), and for more lengthy system presentations describing retrieval techniques used, experiments run using the data, and other issues of interest to researchers in information retrieval. All groups will be invited to present their results in a joint poster session. Some groups may also be selected to present during plenary talk sessions.

Application Details

Organizations wishing to participate in TREC 2019 should respond to this call for participation by submitting an application. Participants in previous TRECs who wish to participate in TREC 2019 must submit a new application. To apply, submit the online application at

The application system will send an acknowledgement to the email address supplied in the form once it has processed the form.

Any questions about conference participation should be sent to the general TREC email address, trec (at)

The TREC Conference series is sponsored by NIST's
Information Technology Laboratory (ITL)
Information Access Division (IAD)
Retrieval Group

Last updated: Thursday, 20-Dec-2018 07:28:41 EST
Date created: December 19, 2018
