The Text Retrieval Conference (TREC) workshop series encourages
research in information retrieval and related applications by
providing a large test collection, uniform scoring procedures,
and a forum for organizations interested in comparing their
results. Now in its 25th year, the conference has become
the major experimental effort in the field. Participants in
the previous TREC conferences have examined a wide variety
of retrieval techniques and retrieval environments,
including cross-language retrieval, web retrieval, document filtering,
multimedia retrieval, and question answering. Details about TREC
can be found at the TREC web site, http://trec.nist.gov.
You are invited to participate in TREC 2016. TREC 2016 will
consist of a set of tasks known as "tracks". Each track focuses
on a particular subproblem or variant of the retrieval task as
described below. Organizations may choose to participate in any or
all of the tracks. Training and test materials are available from
NIST for some tracks; other tracks will use special collections that
are available from other organizations for a fee.
Dissemination of TREC work and results other than in the (publicly
available) conference proceedings is welcomed, but the conditions of
participation specifically preclude any advertising claims based
on TREC results. All retrieval results submitted to NIST are
published in the Proceedings and are archived on the TREC web site.
The workshop in November is open only to participating groups that
submit retrieval results for at least one track and to selected
government invitees.
Schedule:
As soon as possible -- submit your application to participate in
TREC 2016 as described below.
Submitting an application will add you to the active participants'
mailing list. On Feb 25, NIST will announce a new password
for the "active participants" portion of the TREC web site.
Beginning March 1
Document disks used in some existing TREC collections will be
distributed to participants who have returned the required forms.
Please note that no disks will be shipped before March 1.
July--August
Results submission deadline for most tracks.
Specific deadlines for each track will be included in
the track guidelines, which will be finalized in the spring.
Some tracks may have an earlier (late spring) submission deadline.
September 30 (estimated)
Relevance judgments and individual
evaluation scores due back to participants.
Nov 15--18
TREC 2016 conference at NIST in Gaithersburg, Maryland, USA.
We anticipate that the program will include a 25th anniversary
celebration.
Task Description
Below is a brief summary of the tasks. Complete descriptions of
tasks performed in previous years are included in the Overview
papers in each of the TREC proceedings (in the Publications section
of the web site).
The exact definition of the tasks to be performed in each track for
TREC 2016 is still being formulated. Track discussion takes place
on the track mailing list or wiki. To join a track mailing list,
follow the instructions for the track as detailed below.
For questions about a track, post to its mailing list once you
have joined.
TREC 2016 will contain eight tracks.
Clinical Decision Support Track
The clinical decision support track investigates techniques for
linking medical cases to biomedical literature relevant for
patient care.
Track coordinators:
Kirk Roberts, U.S. National Library of Medicine
Dina Demner-Fushman, U.S. National Library of Medicine
William Hersh, Oregon Health and Science University
Ellen Voorhees, NIST
Track Web Page:
http://www.trec-cds.org/
Mailing list:
Google group, name: trec-cds
Contextual Suggestion Track
The Contextual Suggestion track investigates search techniques for
complex information needs that are highly dependent on context
and user interests.
Track coordinators:
Seyyed Hadi Hashemi, University of Amsterdam
Jaap Kamps, University of Amsterdam
Julia Kiseleva, Eindhoven University of Technology
Charles L A Clarke, University of Waterloo
Track Web Page:
http://sites.google.com/site/treccontext/
Mailing list:
Google group, name: treccontext
Dynamic Domain Track
This track focuses on domain-specific search algorithms that adapt
to the dynamic information needs of professional users as they
explore complex domains.
Track coordinators:
Grace Hui Yang, Georgetown University
Ian Soboroff, NIST
Track Web Page:
http://trec-dd.org/
Mailing list:
Google group, name: trec-dd
Live QA Track
In this track, systems will generate answers in real time to real
questions originating from real users via a live question stream.
Track coordinators:
David Carmel, Yahoo Labs
Dan Pelleg, Yahoo Labs
Yuval Pinter, Yahoo Labs
Eugene Agichtein, Emory University
Donna Harman, NIST
Track Web Page:
http://trec-liveqa.org/
Track Mailing List:
Subscribe to the trec-liveqa@nist.gov mailing list by following the
instructions on its subscription page.
OpenSearch Track
The OpenSearch track explores an evaluation paradigm for IR that
involves real users of operational search engines.
For this first year of the track, the task will be ad hoc Academic Search.
Track coordinators:
Anne Schuth, University of Amsterdam
Krisztian Balog, University of Stavanger
Track Web Page:
http://trec-open-search.org/
Mailing list:
Google group, name: trec-open-search
Real-Time Summarization Track
The Real-Time Summarization (RTS) track explores techniques for
constructing real-time update summaries from social media streams
in response to users' information needs.
Track coordinators:
Fernando Diaz, Microsoft Research
Jimmy Lin, University of Waterloo
Richard McCreadie, University of Glasgow
Adam Roegiest, University of Waterloo
Mailing list:
Google group, name: trec-rts
Tasks Track
The goal of the Tasks track is to test whether systems can induce the
possible tasks users might be trying to accomplish given a query.
Track coordinators:
Emine Yilmaz, University College London
Manisha Verma, University College London
Rishabh Mehrotra, University College London
Ben Carterette, University of Delaware
Evangelos Kanoulas, University of Amsterdam
Nick Craswell, Microsoft
Peter Bailey, Microsoft
Track Web Page:
http://www.cs.ucl.ac.uk/tasks-track-2016/
Mailing list:
Google group, name: tasks-track-2016
Total Recall Track
The focus of the Total Recall Track is to evaluate methods for
achieving very high recall, including methods that keep a human
assessor in the loop.
Track coordinators:
Adam Roegiest, University of Waterloo
Gordon V. Cormack, University of Waterloo
Maura R. Grossman, Wachtell, Lipton, Rosen & Katz
Charles L A Clarke, University of Waterloo
Track Web Page:
http://trec-total-recall.org/
Mailing list:
Google group, name: trec-total-recall
Conference Format
The conference itself will be used as a forum both for presentation
of results (including failure analyses and system comparisons) and
for more lengthy system presentations describing retrieval
techniques used, experiments run using the data, and other issues
of interest to researchers in information retrieval.
All groups will be invited to present their results in a joint
poster session. Some groups may also be selected to present
during plenary talk sessions.
Application Details
Organizations wishing to participate in TREC 2016 should respond
to this call for participation by submitting an application.
Participants in previous TRECs who wish to participate
in TREC 2016 must submit a new application.
To apply, submit the online application at
http://ir.nist.gov/trecsubmit.open/application.html
The application system will send an acknowledgement to the email
address supplied in the form once it has processed the application.
Any questions about conference participation should be sent
to the general TREC email address, trec (at) nist.gov.