Exploring Topic Difficulty in Information Retrieval Systems Evaluation

Abstract

The cost and reliability of an information retrieval (IR) evaluation are highly correlated with the number of topics used. Producing large sets of relevance judgments requires many assessors, which incurs high cost and time, so using a large number of topics in a retrieval experiment is often neither practical nor economical. This work proposes an approach to identify the most effective topics for evaluating IR systems with regard to topic difficulty. The proposed approach identifies which topics, and which topic set sizes, yield reliable evaluations of system effectiveness. Easy topics appeared to be the most suitable for effectively evaluating IR systems.
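The abstract does not detail how subset reliability is measured, but a common approach in this literature is to rank systems on a topic subset and compare that ranking with the full-set ranking using rank correlation (e.g. Kendall's tau). The sketch below illustrates this under those assumptions; the system names, topic labels, and scores are hypothetical.

```python
from itertools import combinations

def kendall_tau(rank_a, rank_b):
    """Kendall's tau-a between two rankings (lists of system IDs, best first)."""
    pos_a = {s: i for i, s in enumerate(rank_a)}
    pos_b = {s: i for i, s in enumerate(rank_b)}
    concordant = discordant = 0
    for x, y in combinations(rank_a, 2):
        agree = (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y])
        if agree > 0:
            concordant += 1
        elif agree < 0:
            discordant += 1
    n = len(rank_a)
    return (concordant - discordant) / (n * (n - 1) / 2)

def rank_systems(scores_per_topic, topics):
    """Rank systems by mean effectiveness over the chosen topic subset."""
    means = {
        sys: sum(per_topic[t] for t in topics) / len(topics)
        for sys, per_topic in scores_per_topic.items()
    }
    return sorted(means, key=means.get, reverse=True)

# Hypothetical per-topic effectiveness scores (e.g. average precision).
scores = {
    "sysA": {"t1": 0.9, "t2": 0.8, "t3": 0.2},
    "sysB": {"t1": 0.7, "t2": 0.6, "t3": 0.4},
    "sysC": {"t1": 0.3, "t2": 0.2, "t3": 0.9},
}
full_ranking = rank_systems(scores, ["t1", "t2", "t3"])
easy_ranking = rank_systems(scores, ["t1", "t2"])  # hypothetical "easy" subset
print(kendall_tau(full_ranking, easy_ranking))     # 1.0 -> subset preserves the ranking
```

A tau close to 1.0 would indicate that the cheaper subset reproduces the full-set system ordering, which is the sense in which a topic subset can be called reliable.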

CITATION STYLE

APA

Ting Pang, W., Rajagopal, P., Wang, M., Zhang, S., & Devi Ravana, S. (2019). Exploring Topic Difficulty in Information Retrieval Systems Evaluation. In Journal of Physics: Conference Series (Vol. 1339). Institute of Physics Publishing. https://doi.org/10.1088/1742-6596/1339/1/012019
