Call for Participation in TREC 2015


TREC Statement on Product Testing and Advertising



February 2015 - November 2015

Conducted by:
National Institute of Standards and Technology (NIST)

The Text Retrieval Conference (TREC) workshop series encourages research in information retrieval and related applications by providing large test collections, uniform scoring procedures, and a forum for organizations interested in comparing their results. Now in its 24th year, the conference has become the major experimental effort in the field. Participants in previous TREC conferences have examined a wide variety of retrieval techniques and retrieval environments, including cross-language retrieval, retrieval of web documents, multimedia retrieval, and question answering. Details about TREC can be found at the TREC web site.

You are invited to participate in TREC 2015. TREC 2015 will consist of a set of tasks known as "tracks". Each track focuses on a particular subproblem or variant of the retrieval task as described below. Organizations may choose to participate in any or all of the tracks. Training and test materials are available from NIST for some tracks; other tracks will use special collections that are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly available) conference proceedings is welcomed, but the conditions of participation specifically preclude any advertising claims based on TREC results. All retrieval results submitted to NIST are published in the Proceedings and are archived on the TREC web site. The workshop in November is open only to participating groups that submit retrieval results for at least one track and to selected government invitees.


Schedule: As soon as possible -- submit your application to participate in TREC 2015 as described below.
Submitting an application will add you to the active participants' mailing list. On Feb 25, NIST will announce a new password for the "active participants" portion of the TREC web site.

Beginning March 1
Document disks used in some existing TREC collections will be distributed to participants who have returned the required forms. Please note that no disks will be shipped before March 1.

Results submission deadline for most tracks. Specific deadlines for each track will be included in the track guidelines, which will be finalized in the spring.

September 30 (estimated)
Relevance judgments and individual evaluation scores due back to participants.

Nov 17--20
TREC 2015 conference at NIST in Gaithersburg, Md., USA

Task Description

Below is a brief summary of the tasks. Complete descriptions of tasks performed in previous years are included in the Overview papers in each of the TREC proceedings (in the Publications section of the web site).

The exact definition of the tasks to be performed in each track for TREC 2015 is still being formulated. Track discussion takes place on the track mailing list or wiki. To join a track mailing list, follow the instructions for the track as detailed below. For questions about the track, post your question to the track mailing list once you join.

TREC 2015 will contain eight tracks.

Clinical Decision Support Track

The clinical decision support track investigates techniques for linking medical cases to biomedical literature relevant for patient care.
Track coordinators:
Kirk Roberts, U.S. National Library of Medicine
William Hersh, Oregon Health and Science University
Matthew Simpson
Ellen Voorhees, NIST
Track Web Page:
Mailing list:

Contextual Suggestion Track

The Contextual Suggestion track investigates search techniques for complex information needs that are highly dependent on context and user interests.
Track coordinators:
Adriel Dean-Hall, University of Waterloo
Charles L A Clarke, University of Waterloo
Jaap Kamps, University of Amsterdam
Julia Kiseleva, Eindhoven University of Technology
Track Web Page:
Mailing list:!forum/treccontext/

Dynamic Domain Track

This track focuses on domain-specific search algorithms that adapt to the dynamic information needs of professional users as they explore in complex domains.
Track coordinators:
Grace Hui Yang, Georgetown University
John Frank, MIT and Diffeo
Ian Soboroff, NIST
Track Web Page:
Mailing list:

Live QA Track

In this track, systems will generate answers, in real time, to questions from real users delivered via a live question stream.
Track coordinators:
Eugene Agichtein, Yahoo Labs and Emory University
David Carmel, Yahoo Labs
Donna Harman, NIST
Track Web Page:
Track Mailing List:
Subscribe yourself to the mailing list by mailing to using the subject: subscribe

Microblog Track

The Microblog track examines the nature of real-time information needs and their satisfaction in the context of microblogging environments such as Twitter.
Track coordinators:
Miles Efron, University of Illinois
Jimmy Lin, University of Maryland
Track Web Page:
Mailing list:

Tasks Track

The goal of the Tasks track is to test whether systems can induce the possible tasks users might be trying to accomplish given a query.
Track coordinators:
Ben Carterette, University of Delaware
Nick Craswell, Microsoft
Evangelos Kanoulas, University of Amsterdam
Manisha Verma, University College London
Emine Yilmaz, University College London
Track Web Page:
Mailing list:

Temporal Summarization Track

The goal of the Temporal Summarization track is to develop systems that allow users to efficiently monitor the information associated with an event over time.
Track coordinators:
Javad Aslam, Northeastern University
Fernando Diaz, Microsoft Research
Matthew Ekstrand-Abueg, Northeastern University
Richard McCreadie, University of Glasgow
Virgil Pavlu, Northeastern University
Tetsuya Sakai, Waseda University
Track Web Page:

Total Recall Track

The focus of the Total Recall Track is to evaluate methods to achieve very high recall, including methods that include a human assessor in the loop.
Track coordinators:
Adam Roegiest, University of Waterloo
Gordon V. Cormack, University of Waterloo
Maura R. Grossman, Wachtell, Lipton, Rosen & Katz
Charles L A Clarke, University of Waterloo
Track Web Page:
Mailing list:

Conference Format

The conference itself will be used as a forum both for presentation of results (including failure analyses and system comparisons) and for more lengthy system presentations describing the retrieval techniques used, experiments run using the data, and other issues of interest to researchers in information retrieval. All groups will be invited to present their results in a joint poster session. Some groups may also be selected to present during plenary talk sessions.

Application Details

Organizations wishing to participate in TREC 2015 should respond to this call for participation by submitting an application. Participants in previous TRECs who wish to participate in TREC 2015 must submit a new application. To apply, submit the online application at

The application system will send an acknowledgement to the email address supplied in the form once it has processed the form.

Any questions about conference participation should be sent to the general TREC email address, trec (at)

The TREC conference series is sponsored by the Retrieval Group of the Information Access Division (IAD) in NIST's Information Technology Laboratory (ITL).

Last updated: Monday, 23-Feb-2015 08:21:12 MST
Date created: December 16, 2014