Our system supports a user's location-based recollection of past events through direct input, namely continuously 'gazing' video data captured from the wearer's viewpoint, which lets the user recall an event simply by looking at a location, while providing stable online, real-time video retrieval. We propose three functional methods: image retrieval with motion information, video scene segmentation, and real-time video retrieval. Our experimental results show that these functions are effective for wearable information playing.
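The paper does not give implementation details in this abstract, but the idea behind motion-based video scene segmentation can be illustrated with a minimal sketch: declare a scene boundary wherever consecutive frames differ strongly, a rough proxy for the wearer moving to a new viewpoint. The function name `segment_scenes` and the threshold value are illustrative assumptions, not the authors' method.

```python
import numpy as np

def mean_abs_diff(a, b):
    """Mean absolute pixel difference between two grayscale frames."""
    return float(np.mean(np.abs(a.astype(np.int16) - b.astype(np.int16))))

def segment_scenes(frames, threshold=30.0):
    """Return the indices at which a new scene starts (frame 0 always does).

    A boundary is declared when two consecutive frames differ by more
    than `threshold` on average -- a crude stand-in for the motion cues
    a real wearable retrieval system would use.
    """
    boundaries = [0]
    for i in range(1, len(frames)):
        if mean_abs_diff(frames[i - 1], frames[i]) > threshold:
            boundaries.append(i)
    return boundaries

# Toy example: two "static" scenes separated by an abrupt change.
scene_a = [np.zeros((4, 4), dtype=np.uint8)] * 3
scene_b = [np.full((4, 4), 200, dtype=np.uint8)] * 3
print(segment_scenes(scene_a + scene_b))  # → [0, 3]
```

A production system would replace raw frame differencing with the motion information the paper describes, but the boundary-detection loop has the same shape.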
Citation
Kawamura, T., Kono, Y., & Kidode, M. (2001). A novel video retrieval method to support a user’s recollection of past events aiming for wearable information playing. In Lecture Notes in Computer Science (Vol. 2195, pp. 24–31). Springer. https://doi.org/10.1007/3-540-45453-5_4