In object learning, humans can learn to abstract and conceptualize the shared visual features that define an object category. As a result, learning generalizes to transformations of familiar objects and even to new objects that differ in other physical properties. In contrast, visual perceptual learning (VPL), the improvement in discriminating fine differences of a basic visual feature through training, is commonly regarded as specific, low-level learning because the improvement often disappears when the trained stimulus is simply relocated or rotated in the visual field. Such location and orientation specificity is taken as evidence for neural plasticity in primary visual cortex (V1) or for improved readout of V1 signals. However, new training methods have shown complete VPL transfer across stimulus locations and orientations, suggesting the involvement of high-level cognitive processes. Here we report that VPL shares similar properties with object learning. Specifically, we found that orientation discrimination learning transfers completely between luminance gratings initially encoded in V1 and bilaterally symmetric dot patterns encoded in higher visual cortex. Similarly, motion direction discrimination learning transfers between first- and second-order motion signals. These results suggest that VPL can take place at a conceptual level and generalize to stimuli with different physical properties. Our findings thus reconcile perceptual and object learning into a unified framework.
Wang, R., Wang, J., Zhang, J. Y., Xie, X. Y., Yang, Y. X., Luo, S. H., … Li, W. (2016). Perceptual learning at a conceptual level. Journal of Neuroscience, 36(7), 2238–2246. https://doi.org/10.1523/JNEUROSCI.2732-15.2016