Live Demonstrator of EEG and Eye-Tracking Input for Disambiguation of Image Search Results

9 citations · 7 readers (Mendeley)
Abstract

When searching for images on the web, users are often confronted with irrelevant results due to ambiguous queries. Consider a search term like Bill: the results will probably consist of a mixture of images depicting Bill Clinton, Bill Cosby, and money bills. If the user is only interested in pictures of money bills, most of the results are irrelevant. We built a demo application that exploits EEG and eye-tracking data to disambiguate between two possible interpretations of an ambiguous search term. The demo exhibits the integration of sensor input into a modern web application.
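The disambiguation idea described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual pipeline: it assumes that some upstream classifier has already turned EEG/gaze features into a per-image relevance score, and simply aggregates those scores per interpretation.

```python
# Hypothetical sketch: disambiguating between two interpretations of an
# ambiguous query by aggregating per-image relevance scores. The scores
# stand in for classifier outputs derived from EEG/eye-tracking features;
# the actual feature extraction and classification are not reproduced here.
from statistics import mean

def disambiguate(scored_images):
    """scored_images: list of (interpretation_label, relevance_score) pairs.
    Returns the interpretation with the higher mean relevance score."""
    by_label = {}
    for label, score in scored_images:
        by_label.setdefault(label, []).append(score)
    return max(by_label, key=lambda lbl: mean(by_label[lbl]))

# Illustrative example: the user attends more to money-bill images
# than to images of Bill Clinton.
results = [
    ("money bill", 0.82), ("Bill Clinton", 0.31),
    ("money bill", 0.74), ("Bill Clinton", 0.22),
]
print(disambiguate(results))  # money bill
```

The design choice here is deliberately simple: averaging per-interpretation scores makes the decision robust to a single noisy trial, which matters for physiological signals like EEG.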

Citation (APA)

Golenia, J. E., Wenzel, M., & Blankertz, B. (2015). Live Demonstrator of EEG and Eye-Tracking Input for Disambiguation of Image Search Results. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9359, pp. 81–86). Springer Verlag. https://doi.org/10.1007/978-3-319-24917-9_8
