Call for Proposals for TREC 2023
The Text REtrieval Conference (TREC) is a community-based evaluation workshop series focused on tasks in information retrieval and information access. The community evaluation process serves to build reusable resources that support retrieval research beyond the workshop itself. TREC consists of a series of "tracks", sets of tasks each focused on some facet of the retrieval problem. Examples of recent tracks include conversational search, retrieval of podcast episodes, cross-language retrieval, retrieval of web documents, and question answering. The tracks invigorate TREC by encouraging research in new areas of information retrieval. The tracks also support the retrieval research community by creating the infrastructure, such as test collections, necessary for task-specific research.

The TREC Program Committee seeks proposals for tracks to take place in TREC 2023 (January - November 2023). Track proposals may be no longer than four pages and must include the information described below.
The set of tracks run in any particular year depends on the interests of the participants and sponsors, as well as on the suitability of the problem to the TREC environment. Proposals for tracks will be reviewed by the TREC Program Committee, and decisions will be shared in time for coordinators of new tracks to plan to attend the TREC Conference (November 14-18, 2022). All current TREC tracks must submit a proposal to continue, and such proposals will be considered alongside new track proposals. Proposals for continuing tracks should indicate how the proposed iteration fits into an ongoing plan for the track. Generally, the PC will have to decide to terminate an existing track to accommodate a newly proposed track, so it is certainly possible that no new tracks will be selected for the coming year.

Track proposals must be submitted by September 18, 2022, AoE. The criteria for judging a track proposal are:
- a strong advocate who is willing to be the track coordinator (track coordinator is a volunteer position);
- a large enough core of interested researchers to make the track viable;
- the availability of sufficient resources, such as appropriate corpora and assessors with expertise in the area; and
- the fit with other tracks.

Proposals need to contain enough information for the PC to assess the criteria above. Proposals should contain an explicit statement of the goals of the track (i.e., what is expected to be learned and/or what infrastructure would be created if the track were run). If relevance judging (or some similar sort of annotation) is required, the proposal needs to state where the judging would occur (NIST or elsewhere?), any special qualifications the assessors would need (special domain expertise required?), and an estimate of the amount of time such assessing would require. Any special constraints on the document sets needed should also be noted. Finally, proposals must contain full contact details of the proposer. On the flip side, proposals need to be concise and to-the-point: if your proposal is more than four pages, it is too long. Send a proposal as a PDF document to trec@nist.gov. Proposals in other formats will be bounced unopened. Aside from the length limit of four pages, there are no formatting restrictions.
Ian Soboroff