We propose a saliency model that estimates task-driven eye movements. Human eye movement patterns are affected by the observer's task and mental state [1]. Existing saliency models, however, detect saliency only from low-level image features such as bright regions, edges, and colors. In this paper, a task (e.g., evaluating a piano performance) is given to an observer watching a music video. Unlike existing vision-based methods, we use both musical score features and image features to detect saliency. We show that our saliency model outperforms existing models when evaluated against recorded eye movement patterns.
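To illustrate the general idea of combining a visual feature map with a score-derived feature map, here is a minimal sketch. It assumes a simple weighted-sum fusion of two normalized 2D maps; the function names (`normalize`, `fuse_saliency`) and the fusion rule are illustrative assumptions, not the paper's actual model.

```python
def normalize(m):
    """Scale a 2D map to [0, 1]; return zeros if the map is flat."""
    lo = min(min(row) for row in m)
    hi = max(max(row) for row in m)
    if hi == lo:
        return [[0.0] * len(m[0]) for _ in m]
    return [[(v - lo) / (hi - lo) for v in row] for row in m]


def fuse_saliency(image_map, score_map, alpha=0.5):
    """Hypothetical fusion: weighted sum of a low-level image saliency
    map and a musical-score feature map, with weight alpha on the image
    map. (The paper's real combination method is not reproduced here.)"""
    im, sc = normalize(image_map), normalize(score_map)
    return [[alpha * a + (1 - alpha) * b for a, b in zip(ra, rb)]
            for ra, rb in zip(im, sc)]


# Example: a bright image region (top-right) and a score-emphasized
# region (top-left) both contribute to the fused map.
image_map = [[0.0, 1.0], [0.2, 0.4]]
score_map = [[1.0, 0.0], [0.0, 0.0]]
fused = fuse_saliency(image_map, score_map, alpha=0.5)
```

With equal weights, regions highlighted by either modality receive elevated saliency in the fused map; in practice the weight would be tuned to the task given to the observer.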
CITATION STYLE
Numano, S., Enami, N., & Ariki, Y. (2015). Task-driven saliency detection on music video. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9009, pp. 658–671). Springer Verlag. https://doi.org/10.1007/978-3-319-16631-5_48