This paper proposes a method for automatically annotating tennis actions through the integrated use of audio and video information. The proposed method extracts ball-hitting times, called "impact times", from the audio, and evaluates the positional relation between the player and the ball at each impact time to identify the player's basic actions, such as the forehand swing and the overhead swing. Simulation results show that the detection rate for impact times influences the recognition rate of the player's basic actions. They also reveal that using audio information avoids some recognition failures that cannot be averted using video information alone, demonstrating the performance and validity of the approach. © Springer-Verlag Berlin Heidelberg 2003.
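The classification idea described in the abstract — labeling a basic action from where the ball is relative to the player at an audio-detected impact time — might be sketched as follows. This is a simplified illustration under assumed rules and thresholds (the `Box` type, the facing direction, and the above-the-head test are all hypothetical), not the paper's actual algorithm.

```python
from dataclasses import dataclass


@dataclass
class Box:
    """Player bounding box in image coordinates (origin at top-left)."""
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height


def classify_action(player: Box, ball_x: float, ball_y: float,
                    facing_right: bool = True) -> str:
    """Label a basic action from the ball's position relative to the player
    at an impact time detected from the audio track."""
    if ball_y < player.y:  # ball above the player's head
        return "overhead swing"
    cx = player.x + player.w / 2  # player's horizontal centre
    on_racket_side = (ball_x > cx) == facing_right
    return "forehand swing" if on_racket_side else "backhand swing"


# Ball beside the player on the racket side vs. above the head:
print(classify_action(Box(100, 50, 40, 120), ball_x=160, ball_y=90))  # forehand swing
print(classify_action(Box(100, 50, 40, 120), ball_x=160, ball_y=20))  # overhead swing
```

The point of the sketch is the division of labor the paper describes: audio pinpoints *when* to look, and simple positional relations in the video frame at that instant decide *which* action occurred.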
CITATION STYLE
Miyamori, H. (2003). Automatic annotation of tennis action for content-based retrieval by integrated audio and visual information. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 2728, pp. 331–341. https://doi.org/10.1007/3-540-45113-7_33