How animals integrate information from various senses to navigate and generate perceptions is a fundamental question. Bats are ideal animal models for studying multisensory integration because they rely on vision and echolocation, two modalities that allow distal sensing with high spatial resolution. Using three behavioral paradigms, we studied different aspects of multisensory integration in Egyptian fruit bats. We show that bats learn the three-dimensional shape of an object using vision only, even when both vision and echolocation are available. Nevertheless, we demonstrate that they can classify objects using echolocation and even translate echoic information into a visual representation. Last, we show that during navigation, bats dynamically switch between the modalities: vision was given more weight when deciding where to fly, while echolocation was more dominant when approaching an obstacle. We conclude that sensory integration is task dependent and that bimodal information is weighed in a more complex manner than previously suggested.
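For context, the weighting scheme "previously suggested" in the multisensory-integration literature is typically the reliability-weighted (maximum-likelihood) linear cue-combination rule. The sketch below states that standard rule with vision (V) and echolocation (E) as the two cues; the notation is illustrative and is not taken from the paper itself.

$$\hat{s} = w_V\,\hat{s}_V + w_E\,\hat{s}_E,\qquad w_V = \frac{1/\sigma_V^{2}}{1/\sigma_V^{2} + 1/\sigma_E^{2}},\qquad w_E = 1 - w_V$$

Here $\hat{s}_V$ and $\hat{s}_E$ are the unimodal estimates and $\sigma_V^{2}$, $\sigma_E^{2}$ their variances. Under this rule the weights are fixed by cue reliability alone, whereas the abstract's finding implies weights that also shift with the task, for example between choosing a flight path and approaching an obstacle.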
Danilovich, S., & Yovel, Y. (2019). Integrating vision and echolocation for navigation and perception in bats. Science Advances, 5(6). https://doi.org/10.1126/sciadv.aaw6503