Elicitation and use of relevance feedback information

Authors:

Highlights:

Abstract

The paper presents two approaches to interactively refining user search formulations and their evaluation in the new High Accuracy Retrieval from Documents (HARD) track of TREC-12. The first method asks the user to select a number of sentences that represent documents. The second method shows the user a list of noun phrases extracted from the initial document set. Both methods then expand the query based on the user's feedback. The TREC results show that one of the methods is an effective means of interactive query expansion and yields significant performance improvements. The paper presents a comparison of the two methods and a detailed analysis of the evaluation results.
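The second method amounts to expanding the query with terms drawn from the noun phrases the user marks as relevant. The following is a minimal sketch of that general idea, not the authors' exact weighting scheme; the function name, weights, and the whitespace tokenization are illustrative assumptions.

```python
# Sketch of query expansion from user-selected noun phrases
# (illustrative only; not the paper's actual algorithm).

from collections import Counter


def expand_query(original_terms, selected_phrases, top_k=10, feedback_weight=0.5):
    """Combine the original query terms with terms from the noun phrases
    the user marked as relevant, returning a weighted term vector."""
    # Original query terms keep full weight.
    weights = Counter({term.lower(): 1.0 for term in original_terms})

    # Count term occurrences across the user-selected noun phrases.
    feedback_counts = Counter(
        word.lower()
        for phrase in selected_phrases
        for word in phrase.split()
    )

    # Add the top-k feedback terms with a reduced weight.
    for term, count in feedback_counts.most_common(top_k):
        weights[term] += feedback_weight * count

    return dict(weights)


if __name__ == "__main__":
    query = ["oil", "spill", "cleanup"]
    phrases = ["marine oil pollution", "coastal cleanup operations"]
    print(expand_query(query, phrases))
```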

Keywords: Information retrieval, Query expansion, Natural language processing, Interactive retrieval, Relevance feedback

Article history: Received 5 April 2004, Accepted 21 October 2004, Available online 8 December 2004.

Paper URL: https://doi.org/10.1016/j.ipm.2004.10.006