Object size determines the spatial spread of visual time

Citations: 12
Readers (Mendeley): 31

Abstract

A key question for temporal processing research is how the nervous system extracts event duration, despite a notable lack of neural structures dedicated to duration encoding. This is in stark contrast to the orderly arrangement of neurons tasked with spatial processing. In this study, we examine the link between the spatial and temporal domains. We use sensory adaptation techniques to generate after-effects in which perceived duration is either compressed or expanded in the direction opposite to the adapting stimulus' duration. Our results indicate that these after-effects are broadly tuned, extending over an area approximately five times the size of the stimulus. This region scales directly with the size of the adapting stimulus: the larger the adapting stimulus, the greater the spatial spread of the after-effect. We construct a simple model to test predictions based on overlapping adapted versus non-adapted neuronal populations and show that our effects cannot be explained by any single, fixed-scale neural filtering. Rather, our effects are best explained by a self-scaled mechanism underpinned by duration-selective neurons that also pool spatial information across earlier stages of visual processing.
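
The population-overlap idea can be illustrated with a toy simulation. The sketch below is not the authors' model; it simply assumes Gaussian spatial response profiles and treats the predicted after-effect at a given adapt-test separation as the overlap between the adapted population's profile and the test stimulus' profile. The widths, stimulus sizes, and the scaling constant k are illustrative assumptions, chosen only to contrast a fixed-scale filter with a self-scaled (size-dependent) pooling stage.

```python
import numpy as np

def gaussian_profile(x, center, sigma):
    # Assumed Gaussian spatial response profile (arbitrary units).
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

def predicted_aftereffect(offsets_deg, stim_size_deg, self_scaled=True,
                          fixed_sigma_deg=1.0, k=2.5):
    # Normalised after-effect as a function of adapt-test separation.
    # self_scaled=True : pooling width grows with stimulus size ("self-scaled").
    # self_scaled=False: a single fixed-scale filter, independent of size.
    x = np.linspace(-30.0, 30.0, 4001)
    dx = x[1] - x[0]
    sigma = k * stim_size_deg if self_scaled else fixed_sigma_deg
    adapted = gaussian_profile(x, 0.0, sigma)           # adapted population at the adapt location
    overlaps = []
    for d in offsets_deg:
        test = gaussian_profile(x, d, stim_size_deg)    # test stimulus profile at offset d
        overlaps.append(np.sum(adapted * test) * dx)    # overlap ~ predicted after-effect
    overlaps = np.array(overlaps)
    return overlaps / overlaps.max()

offsets = np.linspace(0.0, 10.0, 6)                     # adapt-test separations (deg)
for size in (0.5, 1.0, 2.0):                            # illustrative stimulus sizes (deg)
    print(size, np.round(predicted_aftereffect(offsets, size), 2))
```

With self_scaled=True the predicted spread of the after-effect widens as stimulus size increases, mirroring the paper's size-dependent result; with self_scaled=False the spread stays fixed regardless of stimulus size.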

Citation (APA)

Fulcher, C., McGraw, P. V., Roach, N. W., Whitaker, D., & Heron, J. (2016). Object size determines the spatial spread of visual time. Proceedings of the Royal Society B: Biological Sciences, 283(1835). https://doi.org/10.1098/rspb.2016.1024
