Automatic annotation of tennis action for content-based retrieval by integrated audio and visual information

Abstract

This paper proposes a method for automatically annotating tennis actions through the integrated use of audio and video information. The proposed method extracts ball-hitting times, called "impact times," from the audio track, and evaluates the positional relation between the player and the ball at each impact time to identify the player's basic actions, such as a forehand swing or an overhead swing. Simulation results show that the detection rate for impact times influences the recognition rate of the player's basic actions. They also reveal that using audio information avoids some event-recognition failures that cannot be averted with video information alone, demonstrating the performance and validity of the approach. © Springer-Verlag Berlin Heidelberg 2003.
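The abstract's classification step can be illustrated with a minimal sketch: given the player's and ball's image positions at an audio-detected impact time, label the basic action from their relative geometry. All names, the coordinate convention, and the decision rule below are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of the described pipeline: classify a basic tennis action
# from the ball's position relative to the player at an "impact time".
# The coordinate convention (image rows increase downward) and the simple
# side/height rule are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Point:
    x: float  # image column (pixels), increasing rightward
    y: float  # image row (pixels), increasing downward

def classify_swing(player: Point, ball: Point, right_handed: bool = True) -> str:
    """Label a basic action from the ball's position relative to the player."""
    if ball.y < player.y:            # ball above the player's body centre
        return "overhead swing"
    on_right = ball.x > player.x     # ball on the player's right-hand side
    forehand_side = on_right if right_handed else not on_right
    return "forehand swing" if forehand_side else "backhand swing"

# Example: ball to the right of and below a right-handed player's centre
print(classify_swing(Point(100, 200), Point(140, 220)))  # forehand swing
```

In the paper's setting, the audio stream supplies the impact times at which this kind of spatial test is applied, which is why the impact-time detection rate directly bounds the action-recognition rate.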

Citation (APA)

Miyamori, H. (2003). Automatic annotation of tennis action for content-based retrieval by integrated audio and visual information. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2728, 331–341. https://doi.org/10.1007/3-540-45113-7_33
