A novel video retrieval method to support a user’s recollection of past events aiming for wearable information playing

Abstract

Our system supports a user’s location-based recollection of past events using continuously captured (“always gazing”) video data as direct input, which allows the user to recall an event simply by looking at a viewpoint, while providing stable online, real-time video retrieval. We propose three functional methods: image retrieval using motion information, video scene segmentation, and real-time video retrieval. Our experimental results show that these functions are effective enough to support wearable information playing.
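The abstract names video scene segmentation as one of the three functional methods but does not describe its algorithm here. The following is a minimal, hypothetical sketch of a generic frame-difference approach to scene segmentation, not the authors’ actual method: a scene boundary is declared wherever the mean absolute pixel difference between consecutive frames exceeds a threshold.

```python
# Generic scene segmentation sketch (illustrative only; the paper's
# actual method is not detailed in this abstract).

def mean_abs_diff(a, b):
    """Mean absolute pixel difference between two grayscale frames."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def segment_scenes(frames, threshold=40.0):
    """Return the start indices of detected scenes: a new scene begins
    wherever consecutive frames differ by more than `threshold`."""
    boundaries = [0]
    for i in range(1, len(frames)):
        if mean_abs_diff(frames[i - 1], frames[i]) > threshold:
            boundaries.append(i)
    return boundaries

# Toy example: tiny 4-pixel frames with an abrupt change at frame 2.
frames = [
    [10, 10, 10, 10],
    [12, 11, 10, 13],
    [200, 205, 199, 210],
    [202, 204, 200, 211],
]
print(segment_scenes(frames))  # [0, 2]
```

A real system would compute the difference over downsampled grayscale frames (or color histograms) and smooth the decision over a window to avoid spurious cuts from camera motion, which matters for always-gazing wearable video.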

Citation (APA)

Kawamura, T., Kono, Y., & Kidode, M. (2001). A novel video retrieval method to support a user’s recollection of past events aiming for wearable information playing. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2195, pp. 24–31). Springer Verlag. https://doi.org/10.1007/3-540-45453-5_4
