- AT&T Labs-Research
AT&T at TREC-9, page 103
- BBN Technologies
TREC-9 Cross Lingual Retrieval at BBN, page 106
- Bertin Technologies
QALC--The Question-Answering System of LIMSI-CNRS, page 325
- Cambridge University
Spoken Document Retrieval for TREC-9 at Cambridge University, page 117
- Carnegie Mellon University
kNN at TREC-9, page 127
YFilter at TREC-9, page 135
One Search Engine or Two for Question-Answering, page 235
- Chapman University
Passive Feedback Collection--An Attempt to Debunk the Myth of Clickthroughs, page 141
- Chinese University of Hong Kong
TREC-9 CLIR at CUHK: Disambiguation by Similarity Values Between Adjacent Words, page 151
- CL Research
Syntactic Clues and Lexical Resources in Question-Answering, page 157
- CSIRO Mathematics and Information Sciences
ACSys/CSIRO TREC-9 Experiments, page 167
Melbourne TREC-9 Experiments, page 437
Overview of the TREC-9 Web Track, page 87
- CWI, Amsterdam
The Mirror DBMS at TREC-9, page 171
- Dublin City University
Dublin City University Experiments in Connectivity Analysis for TREC-9, page 179
- ESPCI
Training Context-Sensitive Neural Networks with Few Relevant Examples for the TREC-9 Routing, page 257
- Fudan University
FDU at TREC-9: CLIR, Filtering and QA Tasks, page 189
- Fujitsu Laboratories, Ltd.
Fujitsu Laboratories TREC-9 Report, page 203
- Hummingbird
Hummingbird's Fulcrum SearchServer at TREC-9, page 209
- IBM T.J. Watson Research Center
English-Chinese Information Retrieval at IBM, page 223
One Search Engine or Two for Question-Answering, page 235
IBM's Statistical Question Answering System, page 229
- Illinois Institute of Technology
IIT TREC-9 - Entity Based Feedback with Fusion, page 241
- Imperial College of Science, Technology and Medicine
A Simple Question Answering System, page 249
- Informatique-CDC
Training Context-Sensitive Neural Networks with Few Relevant Examples for the TREC-9 Routing,
page 257
- IRIT-SIG
Mercure at trec9: Web and Filtering tasks, page 263
- Johns Hopkins University, APL
The HAIRCUT System at TREC-9, page 273
- Justsystem Corporation
Reflections on "Aboutness": TREC-9 Evaluation Experiments at Justsystem, page 281
- KDD R&D Laboratories, Inc.
Experiments on the TREC-9 Filtering Track, page 295
- Korea Advanced Institute of Science and Technology
TREC-9 Experiments at KAIST: QA, CLIR and Batch Filtering, page 303
- Korea University
Question Answering Considering Semantic Categories and Co-Occurrence Density, page 317
- LIMSI-CNRS
QALC--The Question-Answering System of LIMSI-CNRS, page 325
The LIMSI SDR System for TREC-9, page 335
- Microsoft Research
The TREC-9 Filtering Track Final Report, page 25
- Microsoft Research China
TREC-9 CLIR Experiments at MSRCN, page 343
- Microsoft Research Ltd.
Question Answering Using a Large NLP System, page 355
- Microsoft Research Ltd., UK
Microsoft Cambridge at TREC-9: Filtering Track, page 361
- The MITRE Corporation
Another Sys Called Qanda, page 369
- MNIS-TextWise Labs
CINDOR TREC-9 English-Chinese Evaluation, page 379
- National Institute of Standards and Technology
Overview of the Ninth Text REtrieval Conference (TREC-9), page 1
TREC-9 Interactive Track Report, page 41
Overview of the TREC-9 Question Answering Track, page 71
Spoken Document Retrieval Track Slides
- National Taiwan University
Description of NTU QA and CLIR Systems in TREC-9, page 389
- NCR Corporation
IIT TREC-9 - Entity Based Feedback with Fusion, page 241
- NTT Data Corporation
NTT DATA TREC-9 Question Answering Track Report, page 399
- Oregon Health Sciences University
Further Analysis of Whether Batch and User Evaluations Give the Same Results with a Question-Answering Task, page 407
TREC-9 Interactive Track Report, page 41
- Queens College, CUNY
TREC-9 Cross Language, Web and Question-Answering Track Experiments using PIRCS, page 417
- RICOH Co., Ltd.
Structuring and Expanding Queries in the Probabilistic Model, page 427
- RMIT University
Melbourne TREC-9 Experiments, page 437
- Rutgers University
Support for Question-Answering in Interactive Information Retrieval: Rutgers' TREC-9 Interactive Track Experience, page 463
Logical Analysis of Data in the TREC-9 Filtering Track, page 453
IBM's Statistical Question Answering System, page 229
- SabIR Research, Inc.
SabIR Research at TREC-9, page 475
Query Expansion Seen Through Return Order of Relevant Documents, page 51
The TREC-9 Query Track, page 81
- Southern Methodist University
FALCON: Boosting Knowledge for Answer Engines, page 479
- Sun Microsystems Laboratories
Halfway to Question Answering, page 489
- Syracuse University
Question Answering: CNLP at the TREC-9 Question Answering Track, page 501
- TNO-TPD
TNO-UT at TREC-9: How Different are Web Documents?, page 665
- Tsinghua University, China
TREC-9 CLIR Experiments at MSRCN, page 343
- U.S. Government
IIT TREC-9 - Entity Based Feedback with Fusion, page 241
- Universidad de Alicante
A Semantic Approach to Question Answering Systems, page 511
- Università di Pisa, Italy
The PISAB Question Answering System, page 621
- Université de Montréal
TREC-9 CLIR Experiments at MSRCN, page 343
Goal-Driven Answer Extraction, page 563
The System RELIEFS: A New Approach for Information Filtering, page 573
- Université de Neuchâtel
Report on the TREC-9 Experiment: Link-based Retrieval and Distributed Collections, page 579
- University of California at Berkeley
TREC-9 Cross-Language Information Retrieval (English-Chinese) Overview, page 15
English-Chinese Cross-Language IR Using Bilingual Dictionaries, page 517
- University of Glasgow
Question Answering, Relevance Feedback and Summarisation: TREC-9 Interactive Track Report, page 523
- University of Iowa
Filters and Answers: The University of Iowa TREC-9 Results, page 533
- University of Maryland
TREC-9 Experiments at Maryland: Interactive CLIR, page 543
- University of Massachusetts
INQUERY and TREC-9, page 551
- University of Melbourne
Melbourne TREC-9 Experiments, page 437
- University of Michigan
One Search Engine or Two for Question-Answering, page 235
- University of Nijmegen
Incrementality, Half-life, and Threshold Optimization for Adaptive Document Filtering, page 589
- University of North Carolina, Chapel Hill
Information Space Based on HTML Structure, page 601
- University of Padova, Italy
Web Document Retrieval Using Passage Retrieval, Connectivity Information, and Automatic Link Weighting--TREC-9 Report, page 611
- University of Sheffield
University of Sheffield TREC-9 QA System, page 635
The Thisl SDR System at TREC-9, page 627
Sheffield Interactive Experiment at TREC-9, page 645
- University of Southern California
Question Answering in Webclopedia, page 655
- University of Twente, CTIT
TNO-UT at TREC-9: How Different are Web Documents?, page 665
- University of Waterloo
Question Answering by Passage Selection (MultiText Experiments for TREC-9), page 673
- Kasetsart University, Bangkok, Thailand
Kasetsart University TREC-9 Experiments, page 289
- Waseda University
Experiments on the TREC-9 Filtering Track, page 295
- WhizzBang Labs
The TREC-9 Filtering Track Final Report, page 25