A Neural Model of Distance-Dependent Percept of Object Size Constancy.
ABSTRACT: Size constancy is a well-known visual phenomenon demonstrating perceptual stability: perceived object size is kept stable despite the effect of viewing distance on retinal image size. Although theories involving distance scaling to achieve size constancy have flourished on the basis of psychophysical studies, the underlying neural mechanisms remain unknown. Single-cell recordings show that distance-dependent size-tuned cells are common along the ventral stream, from V1, V2, and V4 to IT. In addition, recent fMRI research demonstrates that an object's perceived size, associated with its perceived egocentric distance, modulates its retinotopic representation in V1. These results suggest that V1 contributes to size constancy and that its activity is regulated by feedback of distance information from other brain areas. Here, we propose a neural model based on these findings. First, we construct an egocentric distance map in LIP by integrating horizontal disparity and vergence through gain-modulated MT neurons. Second, LIP neurons send modulatory feedback of distance information to size-tuned cells in V1, resulting in a spread of V1 cortical activity. This process provides V1 with distance-dependent size representations. The model supports the view that size constancy is preserved by scaling retinal image size to compensate for changes in perceived distance, and it suggests a possible neural circuit capable of implementing this process.
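The first stage of the model described above, recovering egocentric distance from horizontal disparity and vergence, can be illustrated with a toy sketch. This is not the authors' implementation: the interocular distance, the Gaussian disparity tuning, and the multiplicative vergence gain are all illustrative assumptions, chosen only to show how a gain-modulated population could in principle support a distance readout. The geometric relation used is the standard small-angle approximation, disparity ≈ IOD/fixation_distance − IOD/object_distance.

```python
import numpy as np

IOD = 0.065  # interocular distance in metres (assumed value)

def egocentric_distance(disparity, vergence):
    """Recover egocentric distance (m) from horizontal disparity and
    vergence angle (both in radians), via the small-angle relation
    disparity ~ IOD/fixation_dist - IOD/object_dist."""
    return IOD / (vergence - disparity)

def gain_modulated_population(disparity, vergence, pref_disps, sigma=0.01):
    """Toy MT-like population: Gaussian disparity tuning whose response
    gain scales multiplicatively with vergence (a gain field).
    All parameter choices here are illustrative, not from the paper."""
    tuning = np.exp(-(disparity - pref_disps) ** 2 / (2 * sigma ** 2))
    return vergence * tuning  # multiplicative gain modulation

# Fixation at 1 m: vergence angle ~ IOD / 1.0 (small-angle approximation).
verg = IOD / 1.0
# An object at 2 m then produces disparity IOD * (1/1 - 1/2).
disp = IOD * (1.0 - 0.5)
print(egocentric_distance(disp, verg))  # recovers 2.0 m

# A small population responding to that disparity, gain-scaled by vergence.
prefs = np.linspace(-0.05, 0.05, 11)
print(gain_modulated_population(disp, verg, prefs).shape)
```

In this sketch a downstream (LIP-like) readout could decode distance from the joint disparity-vergence activity, because the vergence gain disambiguates the same retinal disparity at different fixation distances.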
SUBMITTER: Qian J
PROVIDER: S-EPMC4489391 | biostudies-literature | 2015
REPOSITORIES: biostudies-literature