With huge quantities of multimedia information becoming available on the Internet every day, our foremost mechanisms for finding information still rely on text-based retrieval systems with keyword-based query interfaces. However, little to nothing is known about the retrieval performance or the quality of the user interfaces of these search engines. When a retrieval system is developed, the evaluation often focuses either on the retrieval performance of the retrieval strategy or on usability testing of the interface the system offers. Both kinds of experiment are time-consuming to set up and often require the same preconditions to be fulfilled, i.e. a test reference collection and, in the case of usability testing, respondents must be available. The contribution of this article is twofold. First, it discusses a testbed for the evaluation of a wide variety of retrieval systems that allows both a usability experiment and a retrieval experiment to be conducted on the same platform. Besides greatly reducing the effort needed to set up and perform such experiments, it also allows the relationship between usability testing and retrieval performance analysis to be investigated. Second, it presents the results of a case study with the testbed, comparing three major search engines available on the Internet. © Springer-Verlag 2004.
CITATION STYLE
Van Zwol, R., & Van Oostendorp, H. (2004). Google’s “i’m feeling lucky”, truly a gamble? Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3306, 378–389. https://doi.org/10.1007/978-3-540-30480-7_39