
CALL FOR PARTICIPATION

TEXT RETRIEVAL CONFERENCE (TREC) 2022

February 2022 - November 2022


Conducted by:
National Institute of Standards and Technology (NIST)

The Text Retrieval Conference (TREC) workshop series encourages research in information retrieval and related applications by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. Details about TREC can be found at the TREC web site, http://trec.nist.gov.

You are invited to participate in TREC 2022. TREC 2022 will consist of a set of tasks known as "tracks". Each track focuses on a particular subproblem or variant of the retrieval task as described below. Organizations may choose to participate in any or all of the tracks. Training and test materials are available from NIST for some tracks; other tracks will use special collections that are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly available) conference proceedings is welcomed, but the conditions of participation specifically preclude any advertising claims based on TREC results. All retrieval results submitted to NIST are published in the Proceedings and are archived on the TREC web site with the submitting organization identified.

Schedule:

As soon as possible -- submit your application to participate in TREC 2022 as described below.
Submitting an application adds you to the active participants' mailing list. On Feb 22, NIST will announce a new password for the "active participants" portion of the TREC web site. Applications are accepted until late May, but applying earlier means you can take part in track discussions. Processing an application requires some manual effort on our end; once your application is processed (at most a few business days), a "Welcome to TREC" email with details about TREC participation will be sent to the address provided in the application.

July--August
Results submission deadline for most tracks. Specific deadlines for each track will be included in the track guidelines, which will be finalized in the spring. Some tracks are likely to have a spring submission deadline, so be sure to subscribe to the track lists described below.

September 30 (estimated)
Relevance judgments and individual evaluation scores due back to participants.

Nov 14--18
TREC 2022 conference at NIST in Gaithersburg, Maryland, USA, if an in-person meeting can be held. Otherwise, a virtual conference during this week.

Task Description

Below is a brief summary of the tasks. Complete descriptions of tasks performed in previous years are included in the Overview papers in each of the TREC proceedings (in the Publications section of the web site).

The exact definition of the tasks to be performed in each track for TREC 2022 is still being formulated. Track discussion takes place on the track mailing list (or other communication medium). To join the discussion, follow the instructions for the track as detailed below.

TREC 2022 will contain seven tracks. Five of the tracks ran in TREC 2021; the Incident Streams, News, and Podcast tracks have ended, and two new tracks, CrisisFACTS and NeuCLIR, are starting.

Clinical Trials Track

The goal of the Clinical Trials track is to focus research on the clinical trials matching problem: given a free-text summary of a patient health record, find suitable clinical trials for that patient.
Anticipated timeline: TBD
Track coordinators:
Dina Demner-Fushman, U.S. National Library of Medicine
William Hersh, Oregon Health and Science University
Kirk Roberts, University of Texas Health Science Center
Ellen Voorhees, NIST
Track Web Page:
http://www.trec-cds.org/
Mailing list:
Google group, name: trec-cds
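
As a rough illustration of the matching task described above, the following Python sketch ranks hypothetical clinical trials against a free-text patient note using a small BM25 implementation. The trial IDs, eligibility texts, and patient summary are invented placeholders, not track data.

    import math
    from collections import Counter

    # Hypothetical trial IDs mapped to eligibility text (placeholders only).
    trials = {
        "NCT001": "adults with type 2 diabetes and elevated hba1c",
        "NCT002": "children with asthma requiring inhaled corticosteroids",
        "NCT003": "adults with hypertension and type 2 diabetes",
    }

    def tokenize(text):
        return text.lower().split()

    docs = {tid: tokenize(t) for tid, t in trials.items()}
    N = len(docs)
    avgdl = sum(len(d) for d in docs.values()) / N
    df = Counter(term for d in docs.values() for term in set(d))

    def bm25(query, doc, k1=1.2, b=0.75):
        tf = Counter(doc)
        score = 0.0
        for term in tokenize(query):
            if term not in tf:
                continue
            idf = math.log(1 + (N - df[term] + 0.5) / (df[term] + 0.5))
            norm = tf[term] + k1 * (1 - b + b * len(doc) / avgdl)
            score += idf * tf[term] * (k1 + 1) / norm
        return score

    # Free-text patient summary (also hypothetical).
    patient = "65 year old adult with type 2 diabetes"
    ranking = sorted(docs, key=lambda tid: bm25(patient, docs[tid]), reverse=True)
    print(ranking)  # trials ordered by estimated suitability for the patient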


Conversational Assistance Track

The main aim of the Conversational Assistance Track (CAsT) is to advance research on conversational search systems. The goal of the track is to create reusable benchmarks for open-domain, information-centric conversational dialogues.
Anticipated timeline: Topics released in late Spring, results due in August
Track coordinators:
Leif Azzopardi, University of Strathclyde
Jeff Dalton, University of Glasgow
Mohammad Aliannejadi, University of Amsterdam
Paul Ogbonoko, University of Glasgow
Johanne Trippas, University of Melbourne
Svitlana Vakulenko, University of Amsterdam
Track Web Page:
Conversational Assistance track web page
Mailing list:
Google group, name: trec-cast
Track Slack:
treccast.slack.com
Twitter:
@treccast
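
To make the conversational setting concrete: each user utterance may omit context established by earlier turns, so systems commonly rewrite or contextualize it before retrieval. The sketch below uses naive history concatenation as a stand-in baseline; it is illustrative only, not a method the track prescribes.

    # A toy three-turn conversation; later turns rely on earlier context.
    conversation = [
        "What is a qubit?",
        "How is it different from a classical bit?",
        "What are the obstacles to building them at scale?",
    ]

    def contextualize(history, utterance, window=2):
        """Prepend the last `window` turns as crude retrieval context."""
        return " ".join(history[-window:] + [utterance])

    for i, turn in enumerate(conversation):
        query = contextualize(conversation[:i], turn)
        print(f"turn {i + 1} query: {query}")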


CrisisFACTS Track

The CrisisFACTS track focuses on temporal summarization for first responders in emergency situations. These summaries differ from traditional summarization in that they order information by time and produce a series of short updates instead of a longer narrative.
Anticipated timeline: Results due in July/August
Track coordinators:
Cody Buntain, University of Maryland
Richard McCreadie, University of Glasgow
Track Web Page:
CrisisFACTS track web page
Mailing list:
Google group, name: trec-is
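
As a toy view of the output this task calls for, the sketch below buckets a stream of timestamped facts into time windows and emits the highest-scoring fact per window, yielding short, time-ordered updates. The stream items and importance scores are invented.

    from itertools import groupby

    # (minutes since event onset, importance score, short fact) -- hypothetical
    stream = [
        (5, 0.9, "Wildfire reported near Highway 9"),
        (12, 0.4, "Smoke visible from downtown"),
        (65, 0.8, "Evacuation ordered for Riverside district"),
        (70, 0.7, "Shelter opened at Lincoln High School"),
    ]

    WINDOW = 60  # one update per hour-long window

    def window_of(item):
        return item[0] // WINDOW

    # Emit the single most important fact per window, in time order.
    for win, items in groupby(sorted(stream, key=window_of), key=window_of):
        best = max(items, key=lambda it: it[1])
        print(f"hour {win}: {best[2]}")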


Deep Learning Track

The Deep Learning track focuses on IR tasks where a large training set is available, allowing us to compare a variety of retrieval approaches, including deep neural networks and strong non-neural approaches, to see what works best in a large-data regime.
Anticipated timeline: Results due in early August
Track coordinators:
Daniel Campos, University of Illinois at Urbana-Champaign
Nick Craswell, Microsoft
Jimmy Lin, University of Waterloo
Bhaskar Mitra, Microsoft
Emine Yilmaz, University College London
Track Web Page:
Deep Learning track web page
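
The contrast the track probes can be shown in miniature: a lexical ranker scores documents by term overlap, while a dense ranker scores learned embedding vectors, which can reward semantic matches with little lexical overlap. The toy vectors below stand in for the output of a trained neural encoder; they are illustrative assumptions only.

    def sparse_score(query_terms, doc_terms):
        """Matching-term count, a crude stand-in for lexical rankers like BM25."""
        return len(set(query_terms) & set(doc_terms))

    def dense_score(query_vec, doc_vec):
        """Dot product of embedding vectors, as in dense retrieval."""
        return sum(q * d for q, d in zip(query_vec, doc_vec))

    query = "cheap flights to boston".split()
    doc = "inexpensive airfare to boston from new york".split()
    print(sparse_score(query, doc))  # 2: only "to" and "boston" overlap

    # Hypothetical 4-dim embeddings: a dense model can still score
    # "cheap"/"inexpensive" texts as similar despite low term overlap.
    print(dense_score([0.8, 0.1, 0.3, 0.5], [0.7, 0.2, 0.4, 0.4]))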


Fair Ranking Track

The Fair Ranking track focuses on building two-sided systems that offer fair exposure to ranked content producers while ensuring high results quality for ranking consumers.
Anticipated timeline: Training queries in June/July, evaluation queries in July, results due at the beginning of August
Track coordinators:
Michael Ekstrand, Boise State University
Isaac Johnson, Wikimedia Foundation
Graham McDonald, University of Glasgow
Amifa Raj, Boise State University
Track Web Page:
Fair Ranking track web page
Mailing list:
Google group, name: fair-trec
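
A common building block for this kind of evaluation is position-based exposure: items ranked lower receive discounted attention, and exposure can be aggregated by producer group across repeated rankings. The discount function and group labels below are illustrative assumptions, not the track's official metric.

    import math

    def exposure(rank):
        """DCG-style attention discount for a 1-based rank position."""
        return 1.0 / math.log2(rank + 1)

    # Hypothetical repeated rankings of (document, producer group) pairs.
    rankings = [
        [("d1", "A"), ("d2", "B"), ("d3", "A")],
        [("d2", "B"), ("d1", "A"), ("d3", "A")],
    ]

    group_exposure = {}
    for ranking in rankings:
        for rank, (_, group) in enumerate(ranking, start=1):
            group_exposure[group] = group_exposure.get(group, 0.0) + exposure(rank)

    total = sum(group_exposure.values())
    for group, exp in sorted(group_exposure.items()):
        print(f"group {group}: {exp / total:.2f} of total exposure")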

Health Misinformation Track

The Health Misinformation track aims to (1) provide a venue for research on retrieval methods that promote better decision making with search engines, and (2) develop new online and offline evaluation methods to predict the decision-making quality induced by search results. Consumer health information is used as the domain of interest in the track.
Anticipated timeline: TBD
Track coordinators:
Charlie Clarke, University of Waterloo
Maria Maistro, University of Copenhagen
Mark Smucker, University of Waterloo
Track Web Page:
Health Misinformation track web page
Mailing list:
Google group, name: trec-health-misinformation-track
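
One simple way to see how such evaluation might differ from standard relevance measures: a ranking can be scored so that correct documents add credit and harmful ones subtract it, each with a rank discount. The measure below is a deliberately simplified illustration, not the track's official metric.

    import math

    # Hypothetical per-document labels: +1 correct/helpful, -1 harmful.
    ranking_labels = [1, -1, 1, 1]

    def harm_aware_score(labels):
        """Sum rank-discounted credit, penalizing harmful documents."""
        return sum(lab / math.log2(rank + 1)
                   for rank, lab in enumerate(labels, start=1))

    print(round(harm_aware_score(ranking_labels), 3))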


NeuCLIR Track

Cross-language Information Retrieval (CLIR) has been studied at TREC and subsequent evaluation forums for more than twenty years, but recent advances in the application of deep learning to information retrieval (IR) warrant a new, large-scale effort that will enable exploration of classical and modern IR techniques for this task.
Anticipated timeline: Document collection available in January, evaluation topics and baseline results in June, final submissions in July
Track coordinators:
Dawn Lawrie, Johns Hopkins University
Sean MacAvaney, University of Glasgow
James Mayfield, Johns Hopkins University
Paul McNamee, Johns Hopkins University
Douglas W. Oard, University of Maryland
Luca Soldaini, Amazon Alexa AI
Eugene Yang, Johns Hopkins University
Track Web Page:
NeuCLIR track web page
Mailing list:
Google group, name: neuclir-participants
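
For context, one classical CLIR baseline against which newer neural methods are compared is translate-then-retrieve: map the query into the document language, then match terms monolingually. The tiny lexicon and documents below are invented placeholders; see the track page for the actual collections and topics.

    # Toy English -> transliterated-Russian lexicon (hypothetical).
    LEXICON = {"economy": "ekonomika", "growth": "rost"}

    docs = {
        "doc1": "ekonomika strany pokazala rost",
        "doc2": "novyy film vyshel na ekrany",
    }

    def translate(query):
        """Word-by-word dictionary translation of the query."""
        return [LEXICON.get(term, term) for term in query.lower().split()]

    def score(query_terms, doc_text):
        """Count query-term occurrences in the document."""
        return sum(doc_text.split().count(t) for t in query_terms)

    q = translate("economy growth")
    ranked = sorted(docs, key=lambda d: score(q, docs[d]), reverse=True)
    print(ranked)  # doc1 should rank first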

Conference Format

The conference itself will be used as a forum both for presentation of results (including failure analyses and system comparisons) and for more lengthy system presentations describing retrieval techniques used, experiments run using the data, and other issues of interest to researchers in information retrieval. All groups will be invited to present their results in a joint poster session (assuming an in-person meeting is possible). Some groups may also be selected to present during plenary talk sessions.

Application Details


Organizations wishing to participate in TREC 2022 should respond to this call for participation by submitting an application. Participants in previous TRECs who wish to participate in TREC 2022 must submit a new application. To apply, submit the online application at

http://ir.nist.gov/trecsubmit.open/application.html

The application system will send an acknowledgment to the email address supplied in the form once it has processed the form.

Any questions about conference participation should be sent to the general TREC email address, trec (at) nist.gov.


The TREC Conference series is sponsored by NIST's
Information Technology Laboratory (ITL)
Information Access Division (IAD)
Retrieval Group
