
CALL FOR PARTICIPATION

TEXT RETRIEVAL CONFERENCE (TREC) 2017

February 2017 - November 2017


Conducted by:
National Institute of Standards and Technology (NIST)

The Text Retrieval Conference (TREC) workshop series encourages research in information retrieval and related applications by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. Details about TREC can be found at the TREC web site, http://trec.nist.gov.

You are invited to participate in TREC 2017. TREC 2017 will consist of a set of tasks known as "tracks". Each track focuses on a particular subproblem or variant of the retrieval task as described below. Organizations may choose to participate in any or all of the tracks. Training and test materials are available from NIST for some tracks; other tracks will use special collections that are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly available) conference proceedings is welcomed, but the conditions of participation specifically preclude any advertising claims based on TREC results. All retrieval results submitted to NIST are published in the Proceedings and are archived on the TREC web site. The workshop in November is open only to participating groups that submit retrieval results for at least one track and to selected government invitees.

Schedule:

As soon as possible
Submit your application to participate in TREC 2017 as described below. Submitting an application will add you to the active participants' mailing list. On February 22, NIST will announce a new password for the "active participants" portion of the TREC web site.

Beginning March 1
Document disks used in some existing TREC collections are distributed to participants who have returned the required forms. Please note that no disks will be shipped before March 1.

July--August
Results submission deadline for most tracks. Specific deadlines for each track will be included in the track guidelines, which will be finalized in the spring. Some tracks may have a late spring submission deadline.

September 30 (estimated)
Relevance judgments and individual evaluation scores due back to participants.

November 15--17
TREC 2017 conference at NIST in Gaithersburg, Md., USA. Some tracks may hold dedicated track workshops on Tuesday, November 14.

Task Description

Below is a brief summary of the tasks. Complete descriptions of tasks performed in previous years are included in the Overview papers in each of the TREC proceedings (in the Publications section of the web site).

The exact definition of the tasks to be performed in each track for TREC 2017 is still being formulated. Track discussion takes place on the track mailing list or wiki. To join a track's mailing list, follow the instructions for that track as detailed below. Once you have joined, post any questions about the track to its mailing list.

TREC 2017 will contain eight tracks.

Common Core Track

This is a new track for TREC 2017. The track will serve as a common task for a wide spectrum of IR researchers, thus attracting a diverse run set that can be used to investigate new methodologies for test collection construction.
Track coordinators:
Evangelos Kanoulas, University of Amsterdam
James Allan, University of Massachusetts
Donna Harman, NIST
Track Web Page:
trec-core.github.io/2017/
Mailing list:
Google group, name: trec-core


Complex Answer Retrieval Track

This is a new track for TREC 2017. Its focus is on developing systems that are capable of answering complex information needs by collating relevant information from an entire corpus.
Track coordinators:
Laura Dietz, University of New Hampshire
Manisha Verma, University College London
Filip Radlinski, Google
Nick Craswell, Microsoft
Track Web Page:
trec-car.cs.unh.edu
Mailing list:
Google group, name: trec-car


Dynamic Domain Track

This track focuses on interactive search algorithms that adapt to the dynamic information needs of professional users as they explore complex domains.
Track coordinators:
Grace Hui Yang, Georgetown University
Ian Soboroff, NIST
Track Web Page:
trec-dd.org
Mailing list:
Google group, name: trec-dd


Live QA Track

In this track, systems generate answers to real questions from real users, delivered via a live question stream, in real time.
Track coordinators:
Eugene Agichtein, Emory University
Asma Ben Abacha, National Institutes of Health
Eric Nyberg, Carnegie Mellon University
Donna Harman, NIST
Yuval Pinter, Georgia Institute of Technology
Track Web Page:
trec-liveqa.org
Track Mailing List:
Subscribe to the mailing list by sending an email message to trec-liveqa-join@nist.gov and following the instructions in the reply you receive.


OpenSearch Track

The OpenSearch track explores the "Living Labs" evaluation paradigm for IR, which involves real users of operational search engines. The task in the track will continue and expand upon the ad hoc Academic Search task of TREC 2016.
Track coordinators:
Krisztian Balog, University of Stavanger
Maarten de Rijke, University of Amsterdam
Anne Schuth, Blendle
Track Web Page:
trec-open-search.org
Mailing list:
Google group, name: trec-open-search

Precision Medicine Track

This track is a specialization of the Clinical Decision Support track of previous TRECs. It will focus on building systems that use data (e.g., a patient's past medical history and genomic information) to link oncology patients to clinical trials for new treatments, as well as to evidence-based literature to identify the most effective existing treatments.
Track coordinators:
Kirk Roberts, University of Texas Health Science Center
Dina Demner-Fushman, U.S. National Library of Medicine
Ellen Voorhees, NIST
William Hersh, Oregon Health and Science University
Shubham Pant, University of Texas MD Anderson Cancer Center
Track Web Page:
http://www.trec-cds.org/
Mailing list:
Google group, name: trec-cds


Real-Time Summarization Track

The Real-Time Summarization (RTS) track explores techniques for constructing real-time update summaries from social media streams in response to users' information needs.
Track coordinators:
Jimmy Lin, University of Waterloo
Adam Roegiest, University of Waterloo
Luchen Tan, University of Waterloo
Richard McCreadie, University of Glasgow
Track Web Page:
trecrts.github.io
Mailing list:
Google group, name: trec-rts


Tasks Track

The goal of the Tasks track is to test whether systems can induce the possible tasks users might be trying to accomplish given a query.
Track coordinators:
Emine Yilmaz, University College London
Rishabh Mehrotra, University College London
Ben Carterette, University of Delaware
Evangelos Kanoulas, University of Amsterdam
Nick Craswell, Microsoft
Peter Bailey, Microsoft
Track Web Page:
www.cs.ucl.ac.uk/tasks-track-2017/
Mailing list:
Google group, name: tasks-track




Conference Format

The conference itself will be used as a forum both for presentation of results (including failure analyses and system comparisons), and for more lengthy system presentations describing retrieval techniques used, experiments run using the data, and other issues of interest to researchers in information retrieval. All groups will be invited to present their results in a joint poster session. Some groups may also be selected to present during plenary talk sessions.

Application Details


Applications to participate in TREC 2017 are now closed.

Any questions about conference participation should be sent to the general TREC email address, trec (at) nist.gov.


The TREC Conference series is sponsored by NIST's
Information Technology Laboratory (ITL)
Information Access Division (IAD)
Retrieval Group
