Cross-Language
- BBN Technologies
TREC-9 Cross Lingual Retrieval at BBN, page 106
- Chinese University of Hong Kong
TREC-9 CLIR at CUHK: Disambiguation by Similarity Values Between Adjacent Words, page 151
- Fudan University
FDU at TREC-9: CLIR, Filtering and QA Tasks, page 189
- IBM T.J. Watson Research Center
English-Chinese Information Retrieval at IBM, page 223
- Johns Hopkins University, APL
The HAIRCUT System at TREC-9, page 273
- Korea Advanced Institute of Science and Technology
TREC-9 Experiments at KAIST: QA, CLIR and Batch Filtering, page 303
- Microsoft Research China, Tsinghua University, and Université de Montréal
TREC-9 CLIR Experiments at MSRCN, page 343
- MNIS-TextWise Labs
CINDOR TREC-9 English-Chinese Evaluation, page 379
- National Taiwan University
Description of NTU QA and CLIR Systems in TREC-9, page 389
- Queens College, CUNY
TREC-9 Cross Language, Web and Question-Answering Track Experiments using PIRCS, page 419
- RMIT University
Melbourne TREC-9 Experiments, page 437
- University of California at Berkeley
TREC-9 Cross-Language Information Retrieval (English-Chinese) Overview, page 15
English-Chinese Cross-Language IR Using Bilingual Dictionaries, page 517
- University of Maryland
TREC-9 Experiments at Maryland: Interactive CLIR, page 543
- University of Massachusetts
INQUERY and TREC-9, page 551
Filtering
- Carnegie Mellon University
kNN at TREC-9, page 127
YFilter at TREC-9, page 135
- Fudan University
FDU at TREC-9: CLIR, Filtering and QA Tasks, page 189
- Informatique-CDC, ESPCI
Training Context-Sensitive Neural Networks with Few Relevant Examples for the TREC-9 Routing, page 257
- IRIT-SIG
Mercure at trec9: Web and Filtering tasks, page 263
- KDD R&D Laboratories, Inc., Waseda University
Experiments on the TREC-9 Filtering Track, page 295
- Korea Advanced Institute of Science and Technology
TREC-9 Experiments at KAIST: QA, CLIR and Batch Filtering, page 303
- Microsoft Research, WhizzBang Labs
The TREC-9 Filtering Track Final Report, page 25
- Microsoft Research Ltd., UK
Microsoft Cambridge at TREC-9: Filtering Track, page 361
- Queens College, CUNY
TREC-9 Cross Language, Web and Question-Answering Track Experiments using PIRCS, page 419
- Rutgers University
Logical Analysis of Data in the TREC-9 Filtering Track, page 453
- Université de Montréal
The System RELIEFS: A New Approach for Information Filtering, page 573
- University of Iowa
Filters and Answers: The University of Iowa TREC-9 Results, page 533
- University of Nijmegen
Incrementality, Half-life, and Threshold Optimization for Adaptive Document Filtering, page 589
Interactive
- Chapman University
Passive Feedback Collection--An Attempt to Debunk the Myth of Clickthroughs, page 141
- National Institute of Standards and Technology and the Oregon Health Sciences University
TREC-9 Interactive Track Report, page 41
- Oregon Health Sciences University
Further Analysis of Whether Batch and User Evaluations Give the Same Results with a Question-Answering Task, page 407
- RMIT University
Melbourne TREC-9 Experiments, page 437
- Rutgers University
Support for Question-Answering in Interactive Information Retrieval: Rutgers' TREC-9 Interactive Track Experience, page 463
- University of Glasgow
Question Answering, Relevance Feedback and Summarisation: TREC-9 Interactive Track Report, page 523
- University of Sheffield
Sheffield Interactive Experiment at TREC-9, page 645
Query
- Hummingbird
Hummingbird's Fulcrum SearchServer at TREC-9, page 211
- Microsoft Research Ltd., UK
Microsoft Cambridge at TREC-9: Filtering Track, page 361
- National Institute of Standards and Technology
Query Expansion Seen Through Return Order of Relevant Documents, page 51
- SabIR Research, Inc.
Query Expansion Seen Through Return Order of Relevant Documents, page 51
The TREC-9 Query Track, page 81
- Sun Microsystems Laboratories
Halfway to Question Answering, page 489
- University of Massachusetts
INQUERY and TREC-9, page 551
Question Answering
- CL Research
Syntactic Clues and Lexical Resources in Question-Answering, page 157
- Fudan University
FDU at TREC-9: CLIR, Filtering and QA Tasks, page 189
- IBM T.J. Watson Research Center
One Search Engine or Two for Question-Answering, page 235
IBM's Statistical Question Answering System, page 231
- Imperial College of Science, Technology and Medicine
A Simple Question Answering System, page 251
- Korea Advanced Institute of Science and Technology
TREC-9 Experiments at KAIST: QA, CLIR and Batch Filtering, page 303
- Korea University
Question Answering Considering Semantic Categories and Co-Occurrence Density, page 317
- LIMSI-CNRS
QALC--The Question-Answering System of LIMSI-CNRS, page 325
- Microsoft Research Ltd.
Question Answering Using a Large NLP System, page 355
- The MITRE Corporation
Another Sys Called Qanda, page 369
- National Institute of Standards and Technology
Overview of the TREC-9 Question Answering Track, page 71
- National Taiwan University
Description of NTU QA and CLIR Systems in TREC-9, page 389
- NTT Data Corporation
NTT DATA TREC-9 Question Answering Track Report, page 399
- Queens College, CUNY
TREC-9 Cross Language, Web and Question-Answering Track Experiments using PIRCS, page 419
- Southern Methodist University
FALCON: Boosting Knowledge for Answer Engines, page 479
- Sun Microsystems Laboratories
Halfway to Question Answering, page 489
- Syracuse University
Question Answering: CNLP at the TREC-9 Question Answering Track, page 501
- Universidad de Alicante
A Semantic Approach to Question Answering Systems, page 511
- Università di Pisa, Italy
The PISAB Question Answering System, page 621
- Université de Montréal
Goal-Driven Answer Extraction, page 563
- University of Iowa
Filters and Answers: The University of Iowa TREC-9 Results, page 533
- University of Massachusetts
INQUERY and TREC-9, page 551
- University of Sheffield
University of Sheffield TREC-9 QA System, page 635
- University of Southern California
Question Answering in Webclopedia, page 655
- University of Waterloo, CTIT
Question Answering by Passage Selection (MultiText Experiments for TREC-9), page 673
Spoken Document Retrieval
- Cambridge University
Spoken Document Retrieval for TREC-9 at Cambridge University, page 117
- LIMSI
The LIMSI SDR System for TREC-9, page 335
- National Institute of Standards and Technology
Spoken Document Retrieval Track Slides
- University of Sheffield
The Thisl SDR System at TREC-9, page 627
Web
- AT&T Labs-Research
AT&T Labs at TREC-9, page 103
- CSIRO Mathematics and Information Sciences
ACSys/CSIRO TREC-9 Experiments, page 167
Melbourne TREC-9 Experiments, page 437
Overview of the TREC-9 Web Track, page 87
- CWI, Amsterdam
The Mirror DBMS at TREC-9, page 171
- Dublin City University
Dublin City University Experiments in Connectivity Analysis for TREC-9, page 179
- Fujitsu Laboratories, Ltd.
Fujitsu Laboratories TREC-9 Report, page 203
- Hummingbird
Hummingbird's Fulcrum SearchServer at TREC-9, page 211
- Illinois Institute of Technology
IIT TREC-9 - Entity Based Feedback with Fusion, page 241
- IRIT-SIG
Mercure at trec9: Web and Filtering tasks, page 263
- Johns Hopkins University, APL
The HAIRCUT System at TREC-9, page 273
- Justsystem Corporation
Reflections on "Aboutness" TREC-9 Evaluation Experiments at Justsystem, page 281
- Queens College, CUNY
TREC-9 Cross Language, Web and Question-Answering Track Experiments using PIRCS, page 419
- RICOH Co., Ltd.
Structuring and Expanding Queries in the Probabilistic Model, page 427
- RMIT University
Melbourne TREC-9 Experiments, page 437
- SabIR Research, Inc.
SabIR Research at TREC-9, page 475
- TNO-TPD and Univ. of Twente
TNO-UT at TREC-9: How Different are Web Documents?, page 665
- Kasetsart University, Bangkok, Thailand
Kasetsart University TREC-9 Experiments, page 289
- Université de Neuchâtel
Report on the TREC-9 Experiment: Link-based Retrieval and Distributed Collections, page 579
- University of North Carolina, Chapel Hill
Information Space Based on HTML Structure, page 601
- University of Padova, Italy
Web Document Retrieval Using Passage Retrieval, Connectivity Information, and Automatic Link Weighting--TREC-9 Report, page 611
- University of Waterloo, CTIT
Question Answering by Passage Selection (MultiText Experiments for TREC-9), page 673
Last updated: Friday, 05-Oct-2001 13:17:36 UTC
Date created: Wednesday, 01-Aug-17
trec@nist.gov