Call for Participation in TREC 2018





February 2018 - November 2018

Conducted by:
National Institute of Standards and Technology (NIST)

The Text Retrieval Conference (TREC) workshop series encourages research in information retrieval and related applications by providing large test collections, uniform scoring procedures, and a forum for organizations interested in comparing their results. Details about TREC can be found at the TREC web site.

You are invited to participate in TREC 2018. TREC 2018 will consist of a set of tasks known as "tracks". Each track focuses on a particular subproblem or variant of the retrieval task as described below. Organizations may choose to participate in any or all of the tracks. Training and test materials are available from NIST for some tracks; other tracks will use special collections that are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly available) conference proceedings is welcomed, but the conditions of participation specifically preclude any advertising claims based on TREC results. All retrieval results submitted to NIST are published in the Proceedings and are archived on the TREC web site. The workshop in November is open only to participating groups that submit retrieval results for at least one track and to selected government invitees.


Schedule:

As soon as possible
Submit your application to participate in TREC 2018 as described below. Submitting an application will add you to the active participants' mailing list. On Feb 28, NIST will announce a new password for the "active participants" portion of the TREC web site.

Beginning March 1
Document disks used in some existing TREC collections will be distributed to participants who have returned the required forms. Please note that no disks will be shipped before March 1.

Results submission deadline for most tracks. Specific deadlines for each track will be included in the track guidelines, which will be finalized in the spring. Some tracks may have a late spring submission deadline.

September 30 (estimated)
Relevance judgments and individual evaluation scores due back to participants.

Nov 14--16
TREC 2018 conference at NIST in Gaithersburg, Md., USA. Some tracks may hold dedicated track workshops on Tuesday, November 13.

Task Description

Below is a brief summary of the tasks. Complete descriptions of tasks performed in previous years are included in the Overview papers in each of the TREC proceedings (in the Publications section of the web site).

The exact definition of the tasks to be performed in each track for TREC 2018 is still being formulated. Track discussion takes place on the track mailing list or wiki. To join a track mailing list, follow the instructions for the track as detailed below. For questions about the track, post your question to the track mailing list once you join.

TREC 2018 will contain seven tracks.


CENTRE Track

This is a new track for 2018, which will run in parallel (with somewhat different emphases) in CLEF 2018, NTCIR-14, and TREC 2018. The overall goal of the track is to develop and tune a reproducibility evaluation protocol for IR.
Track coordinators:
Nicola Ferro, University of Padua
Tetsuya Sakai, Waseda University
Ian Soboroff, NIST
Track Web Page:
Mailing list:
Google group, name: centre-eval

Common Core Track

The Common Core track uses an ad hoc search task over news documents. As such, it serves as a common task for a wide spectrum of IR researchers to attract a diverse run set that can be used to investigate new methodologies for test collection construction.
Track coordinators:
Evangelos Kanoulas, University of Amsterdam
James Allan, University of Massachusetts
Donna Harman, NIST
Track Web Page:
Mailing list:
Google group, name: trec-core

Complex Answer Retrieval Track

The focus of the Complex Answer Retrieval track is on developing systems that are capable of answering complex information needs by collating relevant information from an entire corpus.
Track coordinators:
Laura Dietz, University of New Hampshire
Jeff Dalton, University of Glasgow
Ben Gamari, Well-typed LLP
Manisha Verma, University College London
Prasenjit Mitra, Penn State
Nick Craswell, Microsoft
Track Web Page:
Mailing list:
Google group, name: trec-car

Incident Streams Track

This is a new track for TREC 2018. The Incident Streams track is designed to bring together academia and industry to research technologies to automatically process social media streams during emergency situations with the aim of categorizing information and aid requests made on social media for emergency service operators.
Track coordinators:
Richard McCreadie, University of Glasgow
Ian Soboroff, NIST
Track Web Page:
Incident Streams track web page
Track Mailing List:
Google group, name: trec-is

News Track

The News Track is a new track for 2018. It will feature modern search tasks in the news domain. In partnership with The Washington Post, we will develop test collections that support the search needs of news readers and news writers in the current news environment.
Track coordinators:
Shudong Huang, NIST
Donna Harman, NIST
Ian Soboroff, NIST
Track Web Page:
Mailing list:
Google group, name: trec-news-track

Precision Medicine Track

This track is a specialization of the Clinical Decision Support track of previous TRECs. It focuses on building systems that use data (e.g., a patient's past medical history and genomic information) to link oncology patients to clinical trials for new treatments, as well as to evidence-based literature that identifies the most effective existing treatments.
Track coordinators:
Kirk Roberts, University of Texas Health Science Center
Dina Demner-Fushman, U.S. National Library of Medicine
Ellen Voorhees, NIST
William Hersh, Oregon Health and Science University
Alexander Lazar, University of Texas MD Anderson Cancer Center
Shubham Pant, University of Texas MD Anderson Cancer Center
Track Web Page:
Mailing list:
Google group, name: trec-cds

Real-Time Summarization Track

The Real-Time Summarization (RTS) track explores techniques for constructing real-time update summaries from social media streams in response to users' information needs. The track will include a "Personalized ArXiv Digest" subtask in 2018 that addresses the task of generating article recommendations (to real subscribers based on their real user interest profiles) in the form of a regular email digest.
Track coordinators:
Jimmy Lin, University of Waterloo
Krisztian Balog, University of Stavanger
Track Web Page:
Mailing list:
Google group, name: trec-rts

Conference Format

The conference itself will be used as a forum both for presentation of results (including failure analyses and system comparisons), and for more lengthy system presentations describing retrieval techniques used, experiments run using the data, and other issues of interest to researchers in information retrieval. All groups will be invited to present their results in a joint poster session. Some groups may also be selected to present during plenary talk sessions.

Application Details

Organizations wishing to participate in TREC 2018 should respond to this call for participation by submitting an application. Participants in previous TRECs who wish to participate in TREC 2018 must submit a new application. To apply, submit the online application at

The application system will send an acknowledgement to the email address supplied in the form once it has processed the form.

Any questions about conference participation should be sent to the general TREC email address, trec (at)

The TREC Conference series is sponsored by the Retrieval Group of NIST's Information Technology Laboratory (ITL), Information Access Division (IAD).

Last updated: Monday, 02-Apr-2018 08:54:26 EDT
Date created: December 21, 2017
