When we see motion, our perception of how one image feature moves depends on the behaviour of other features nearby. In particular, the Gestaltists proposed the law of shared common fate [1,2], in which features tend to be perceived as moving together, that is, coherently. Recent psychophysical findings, such as the cooperativity of the motion system [3,4] and motion capture [5,6], support this law. Computationally, coherence is a sensible assumption, because if two features are close then they probably belong to the same object and thus tend to move together. Moreover, the measurement of local motion may be inaccurate [7], so integrating motion information over large areas may improve performance. Present theories of visual motion, however, do not account fully for these coherent motion percepts. We propose here a theory that does account for these phenomena and also provides a solution to the aperture problem [8], where the local information in the image flow is insufficient to specify the motion uniquely.
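The aperture problem and the benefit of pooling motion information over an area can be illustrated with a small numerical sketch. The Python snippet below is an assumption-laden illustration, not the paper's formulation: the function name coherent_velocity, the regularisation weight lam, and the prior v_prior are all hypothetical. It shows that a single edge measurement constrains only the velocity component along the edge normal, whereas combining measurements from several nearby features, with a small quadratic "coherence" penalty, determines the motion uniquely.

```python
import numpy as np

# Minimal sketch (not the paper's exact functional): each local edge
# measurement constrains only the velocity component normal to the edge,
#     n_i . v = c_i    (the aperture problem: one equation, two unknowns),
# so a single measurement cannot specify v. Pooling constraints over a
# neighbourhood and adding a small quadratic penalty toward a prior
# velocity gives a unique, coherent estimate.

def coherent_velocity(normals, speeds, lam=1e-3, v_prior=(0.0, 0.0)):
    """Least-squares velocity from normal-flow constraints n_i . v = c_i,
    regularised toward v_prior with weight lam (both are assumptions here)."""
    N = np.asarray(normals, dtype=float)   # shape (k, 2), unit edge normals
    c = np.asarray(speeds, dtype=float)    # shape (k,), measured normal speeds
    v0 = np.asarray(v_prior, dtype=float)
    A = N.T @ N + lam * np.eye(2)
    b = N.T @ c + lam * v0
    return np.linalg.solve(A, b)

# One measurement alone: the component along the edge is unconstrained,
# and the regulariser simply picks the smallest-norm solution.
print(coherent_velocity([[1.0, 0.0]], [2.0]))        # ~[2, 0]

# Differently oriented edges of the same object pin down the true motion.
true_v = np.array([2.0, 1.0])
normals = np.array([[1.0, 0.0], [0.0, 1.0], [0.6, 0.8]])
speeds = [n @ true_v for n in normals]
print(coherent_velocity(normals, speeds))            # ~[2, 1]
```

The pooled estimate here is a crude stand-in for the theory's idea that integrating local, individually ambiguous measurements over a region yields a coherent velocity percept.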
Yuille, A. L., & Grzywacz, N. M. (1988). A computational theory for the perception of coherent visual motion. Nature, 333(6168), 71–74. https://doi.org/10.1038/333071a0