TABLE OF CONTENTS
FOREWORD
ABSTRACT
PAPERS
- Overview of the Ninth Text REtrieval Conference (TREC-9), page 1
E. Voorhees, D. Harman, National Institute of Standards and Technology
- TREC-9 Cross-Language Information Retrieval (English-Chinese) Overview, page 15
F. Gey, A. Chen, University of California, Berkeley
- The TREC-9 Filtering Track Final Report, page 25
S. Robertson, Microsoft Research
D.A. Hull, WhizzBang Labs
- The TREC-9 Interactive Track Report, page 41
W. Hersh, Oregon Health Sciences University
P. Over, National Institute of Standards and Technology
- Query Expansion Seen Through Return Order of Relevant Documents, page 51
W. Liggett, NIST
C. Buckley, SabIR Research, Inc.
- Overview of the TREC-9 Question Answering Track, page 71
E. Voorhees, NIST
- The TREC-9 Query Track, page 81
C. Buckley, SabIR Research, Inc.
- Spoken Document Retrieval Track Slides
J. Garofolo, J. Lard, E. Voorhees, NIST
- Overview of the TREC-9 Web Track, page 87
D. Hawking, CSIRO Mathematical and Information Sciences
- Structuring and Expanding Queries in the Probabilistic Model, page 427
Y. Ogawa, H. Mano, M. Narita, S. Honma, RICOH Co., Ltd.
- IIT TREC-9 - Entity Based Feedback with Fusion, page 241
A. Chowdhury, S. Beitzel, E. Jensen, M. Sai-lee, D. Grossman,
O. Frieder, Illinois Institute of Technology
M.C. McCabe, U.S. Government
D. Holmes, NCR Corporation
- TREC-9 Cross Language, Web and Question-Answering Track Experiments using PIRCS, page 419
K.L. Kwok, L. Grunfeld, N. Dinstl, M. Chan, Queens College, CUNY
- TREC-9 CLIR Experiments at MSRCN, page 343
J. Gao, E. Xun, M. Zhou, C. Huang, Microsoft Research China
J-Y Nie, Université de Montréal
J. Zhang, Y. Su, Tsinghua University China
- FALCON: Boosting Knowledge for Answer Engines, page 479
S. Harabagiu, D. Moldovan, M. Pasca, R. Mihalcea, M. Surdeanu, R. Bunescu, R. Gîrju, V. Rus, P. Morarescu, Southern Methodist University
- IBM's Statistical Question Answering System, page 229
A. Ittycheriah, M. Franz, W-J Zhu, A. Ratnaparkhi, IBM T.J. Watson Research Center
R.J. Mammone, Rutgers University
- Question Answering by Passage Selection (MultiText Experiments for TREC-9), page 673
C.L.A. Clarke, G.V. Cormack, D.I.E. Kisman, T.R. Lynam, University of Waterloo
- Question Answering in Webclopedia, page 655
E. Hovy, L. Gerber, U. Hermjakob, M. Junk, C-Y Lin, University of Southern California
- Filters and Answers: The University of Iowa TREC-9 Results, page 533
E. Catona, D. Eichmann, P. Srinivasan, University of Iowa
- The LIMSI SDR System for TREC-9, page 335
J.-L. Gauvain, L. Lamel, C. Barras, G. Adda, Y. de Kercardio, LIMSI-CNRS
- Microsoft Cambridge at TREC-9: Filtering Track, page 361
S.E. Robertson, S. Walker, Microsoft Research Ltd., UK
- AT&T at TREC-9, page 103
A. Singhal, M. Kaszkiel, AT&T Labs-Research
- Spoken Document Retrieval for TREC-9 at Cambridge University, page 117
S.E. Johnson, P. Jourlin, K. Spärck Jones, P.C. Woodland, Cambridge University
- kNN at TREC-9, page 127
T. Ault, Y. Yang, Carnegie Mellon University
- YFilter at TREC-9, page 135
Y. Zhang, J. Callan, Carnegie Mellon University
- Passive Feedback Collection--An Attempt to Debunk the Myth of Clickthroughs, page 141
C. Vogt, Chapman University
- TREC-9 CLIR at CUHK: Disambiguation by Similarity Values Between Adjacent Words, page 151
H. Jin, K-F Wong, The Chinese University of Hong Kong
- Syntactic Clues and Lexical Resources in Question-Answering, page 157
K.C. Litkowski, CL Research
- Dublin City University Experiments in Connectivity Analysis for TREC-9, page 179
C. Gurrin, A.F. Smeaton, Dublin City University
- FDU at TREC-9: CLIR, Filtering and QA Tasks, page 189
L. Wu, X-j Huang, Y. Guo, B. Liu, Y. Zhang, Fudan University
- Fujitsu Laboratories TREC-9 Report, page 203
I. Namba, Fujitsu Laboratories, Ltd.
- Hummingbird's Fulcrum SearchServer at TREC-9, page 209
S. Tomlinson, T. Blackwell, Hummingbird
- English-Chinese Information Retrieval at IBM, page 223
M. Franz, J.S. McCarley, W-J Zhu, IBM T.J. Watson Research Center
- One Search Engine or Two for Question-Answering, page 235
J. Prager, E. Brown, IBM T.J. Watson Research Center
D.R. Radev, University of Michigan
K. Czuba, Carnegie Mellon University
- Training Context-Sensitive Neural Networks with Few Relevant Examples for the TREC-9 Routing, page 257
M. Stricker, Informatique-CDC and ESPCI
F. Vichot, F. Wolinski, Informatique-CDC
G. Dreyfus, ESPCI
- Mercure at trec9: Web and Filtering tasks, page 263
M. Abchiche, M. Boughanem, T. Dkaki, J. Mothe, C. Soule Dupuy, M. Tmar, IRIT-SIG
- The HAIRCUT System at TREC-9, page 273
P. McNamee, J. Mayfield, C. Piatko, The Johns Hopkins University, APL
- Experiments on the TREC-9 Filtering Track, page 295
K. Hoashi, K. Matsumoto, N. Inoue, K. Hashimoto, KDD R&D Laboratories, Inc.
T. Hasegawa, K. Shirai, Waseda University
- TREC-9 Experiments at KAIST: QA, CLIR and Batch Filtering, page 303
K-S Lee, J-H Oh, JX Huang, J-H Kim, K-S Choi, Korea Advanced Institute of Science and Technology
- Question Answering Considering Semantic Categories and Co-Occurrence Density, page 317
S-M Kim, D-H Baek, S-B Kim, H-C Rim, Korea University
- QALC--The Question-Answering System of LIMSI-CNRS, page 235
O. Ferret, B. Grau, M. Hurault-Plantet, G. Illouz, C. Jacquemin, LIMSI-CNRS
N. Masson, P. Lecuyer, Bertin Technologies
- Question Answering Using a Large NLP System, page 355
D. Elworthy, Microsoft Research Ltd.
- NTT DATA TREC-9 Question Answering Track Report, page 399
T. Takaki, NTT Data Corporation
- Description of NTU QA and CLIR Systems in TREC-9, page 389
C-J Lin, W-C Lin, H-H Chen, National Taiwan University
- Further Analysis of Whether Batch and User Evaluations Give the Same Results with a Question-Answering Task, page 407
W. Hersh, A. Turpin, L. Sacherek, D. Olson, S. Price, B. Chan, D. Kraemer, Oregon Health Sciences University
- Melbourne TREC-9 Experiments, page 437
D. D'Souza, M. Fuller, J. Thom, P. Vines, J. Zobel, RMIT University
O. de Kretser, University of Melbourne
R. Wilkinson, M. Wu, CSIRO, Division of Mathematics and Information Science
- Support for Question-Answering in Interactive Information Retrieval: Rutgers' TREC-9 Interactive Track Experience, page 463
N.J. Belkin, A. Keller, D. Kelly, J. Perez-Carballo, C. Sikora, Y. Sun, Rutgers University
- Halfway to Question Answering, page 489
W.A. Woods, S. Green, P. Martin, A. Houston, Sun Microsystems Laboratories
- Question Answering: CNLP at the TREC-9 Question Answering Track, page 501
A. Diekema, X. Liu, J. Chen, H. Wang, N. McCracken, O. Yilmazel, E.D. Liddy, Syracuse University, School of Information Studies
- CINDOR TREC-9 English-Chinese Evaluation, page 379
M.E. Ruiz, S. Rowe, M. Forrester, P. Sheridan, MNIS-TextWise Labs
- TNO-UT at TREC-9: How Different are Web Documents?, page 665
W. Kraaij, TNO-TPD
T. Westerveld, University of Twente, CTIT
- A Semantic Approach to Question Answering Systems, page 511
J.L. Vicedo, A. Ferrandez, Universidad de Alicante
- The PISAB Question Answering System, page 621
G. Attardi, C. Burrini, Università di Pisa, Italy
- Goal-Driven Answer Extraction, page 563
M. Laszlo, L. Kosseim, G. Lapalme, Université de Montréal
- The System RELIEFS: A New Approach for Information Filtering, page 573
C. Brouard and J-Y Nie, Université de Montréal
- Report on the TREC-9 Experiment: Link-based Retrieval and Distributed Collections, page 579
J. Savoy, Y. Rasolofo, Université de Neuchâtel
- English-Chinese Cross-Language IR Using Bilingual Dictionaries, page 517
A. Chen, H. Jiang, School of Information Management and Systems, University of California at Berkeley
F. Gey, UC Data Archive & Technical Assistance (UC DATA), University of California at Berkeley
- Question Answering, Relevance Feedback and Summarisation: TREC-9 Interactive Track Report, page 523
N. Alexander, C. Brown, J. Jose, I. Ruthven, A. Tombros, University of Glasgow
- INQUERY and TREC-9, page 551
J. Allan, M.E. Connell, W.B. Croft, F-F Feng, D. Fisher, X. Li, Center for Intelligent Information Retrieval, Department of Computer Science, University of Massachusetts
- Information Space Based on HTML Structure, page 601
G. Newby, University of North Carolina, Chapel Hill
- Web Document Retrieval Using Passage Retrieval, Connectivity Information, and Automatic Link Weighting--TREC-9 Report, page 611
F. Crivellari, M. Melucci, University of Padova (Italy)
- The Thisl SDR System at TREC-9, page 627
S. Renals, D. Abberley, University of Sheffield, UK
- University of Sheffield TREC-9 Q&A System, page 635
S. Scott, R. Gaizauskas, University of Sheffield
- The Mirror DBMS at TREC-9, page 171
A.P. de Vries, CWI, Amsterdam, The Netherlands
- TREC-9 Cross-lingual Retrieval at BBN, page 106
J. Xu, R. Weischedel, BBN Technologies
- Sheffield Interactive Experiment at TREC-9, page 645
M. Beaulieu, H. Fowkes, H. Joho, University of Sheffield, UK
- Reflections on "Aboutness": TREC-9 Evaluation Experiments at Justsystem, page 281
S. Fujita, Justsystem Corporation
- Incrementality, Half-life, and Threshold Optimization for Adaptive Document Filtering, page 589
A. Arampatzis, J. Beney, C.H.A. Koster, T.P. van der Weide, University of Nijmegen
- ACSys/CSIRO TREC-9 Experiments, page 167
D. Hawking, CSIRO Mathematics and Information Sciences
- Kasetsart University TREC-9 Experiments, page 289
P. Narasetsathaporn, A. Rungsawang, Kasetsart University, Bangkok, Thailand
- SabIR Research at TREC-9, page 475
C. Buckley, J. Walz, SabIR Research
- A Simple Question Answering System, page 249
R. J. Cooper, S.M. Rüger, Imperial College of Science, Technology and Medicine
- TREC-9 Experiments at Maryland: Interactive CLIR, page 543
D.W. Oard, G-A Levow, C.I. Cabezas, University of Maryland
- Logical Analysis of Data in the TREC-9 Filtering Track, page 453
E. Boros, P.B. Kantor, D.J. Neu, Rutgers University
- Another Sys Called Qanda, page 369
E. Breck, J. Burger, L. Ferro, W. Greiff, M. Light, I. Mani, J. Rennie, The MITRE Corporation
APPENDICES
- TREC-9 Results, page A-1
- Evaluation Techniques and Measures, page A-15
- Cross-Language Runs List, page A-2
- Cross-Language track results, page A-20
- Filtering Runs List, page A-3
- Filtering track results, page A-58
- Interactive track results, page A-88
- Question Answering Runs List, page A-5
- Question Answering track results, page A-100
- Query track results, page A-95
- Spoken Document Retrieval Runs List, page A-12
- Spoken Document Retrieval track results, page A-178
- Web Runs List, page A-12
- Web track results, page A-210