Abstract
We propose a new method for determining the correspondence between audio and visual events, based on the selective attention observed in living organisms. For auditory attention, our system recognizes a target sound in noisy environments using an auditory filter that attends only to the specific time period corresponding to a visual event. We confirmed that the success rate of matching audio and visual events increased from 78.6% without the auditory filter to 95.2% with it, demonstrating the importance of visual information for auditory attention. We also realized visual attention: when the system perceives a sound but cannot obtain visual information, it localizes the sound source and adjusts the camera's line of sight and iris. The success rate in the visual-attention experiments was 93.3%. These results show the effectiveness of the proposed method.
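The abstract does not give implementation details of the auditory filter, but the core idea (attending only to audio within a short time window around each visual event) can be illustrated with a minimal sketch. The function name, the window width, and the simple zero-out gating strategy below are our assumptions for illustration, not the authors' method.

```python
import numpy as np

def gate_audio_by_visual_events(audio, sample_rate, visual_event_times, window_s=0.5):
    """Keep audio only within +/- window_s seconds of each visual event.

    audio: 1-D array of samples.
    sample_rate: samples per second.
    visual_event_times: visual event onsets in seconds (e.g., from a motion detector).
    window_s: half-width of the attention window (an assumed parameter).
    """
    mask = np.zeros(audio.shape, dtype=bool)
    half = int(window_s * sample_rate)
    for t in visual_event_times:
        center = int(t * sample_rate)
        lo, hi = max(0, center - half), min(len(audio), center + half)
        mask[lo:hi] = True
    # Samples outside every attention window are suppressed to zero.
    return np.where(mask, audio, 0.0)

if __name__ == "__main__":
    sr = 16_000
    audio = np.random.randn(5 * sr)  # stand-in for a 5-second recorded signal
    filtered = gate_audio_by_visual_events(audio, sr, [1.2, 3.4])
```

In this sketch the filter simply discards audio outside the attention windows; the paper's actual filter and its coupling to the recognition stage may differ.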
Citation
Nishibori, K., Takeuchi, Y., Matsumoto, T., Kudo, H., & Ohnishi, N. (2008). A biologically inspired method for finding correspondence between audio-visual events based on selective attention. Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers, 62(7), 1086–1097. https://doi.org/10.3169/itej.62.1086