Dataset Information

Visual influence on path integration in darkness indicates a multimodal representation of large-scale space.


ABSTRACT: Our ability to return to the start of a route recently performed in darkness is thought to reflect path integration of motion-related information. Here we provide evidence that motion-related interoceptive representations (proprioceptive, vestibular, and motor efference copy) combine with visual representations to form a single multimodal representation guiding navigation. We used immersive virtual reality to decouple visual input from motion-related interoception by manipulating the rotation or translation gain of the visual projection. First, participants walked an outbound path with both visual and interoceptive input, and returned to the start in darkness, demonstrating the influences of both visual and interoceptive information in a virtual reality environment. Next, participants adapted to visual rotation gains in the virtual environment, and then performed the path integration task entirely in darkness. Our findings were accurately predicted by a quantitative model in which visual and interoceptive inputs combine into a single multimodal representation guiding navigation, and are incompatible with a model of separate visual and interoceptive influences on action (in which path integration in darkness must rely solely on interoceptive representations). Overall, our findings suggest that a combined multimodal representation guides large-scale navigation, consistent with a role for visual imagery or a cognitive map.
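To make the contrast between the two candidate accounts concrete, the short Python sketch below illustrates (it is not the authors' fitted quantitative model) how a single multimodal-combination model and a separate-influences model diverge in the homeward turn they predict once the visual rotation gain is manipulated. The function name, the weighting parameter w_visual, and the example numbers are hypothetical and introduced purely for illustration.

def predicted_home_turn(physical_turn_deg, visual_gain, w_visual=0.5):
    """Predicted homeward turn under two illustrative accounts.

    physical_turn_deg : rotation actually performed on the outbound path (degrees)
    visual_gain       : displayed rotation / physical rotation during outbound walking
    w_visual          : hypothetical weight given to the visual estimate (0 to 1)
    """
    interoceptive_estimate = physical_turn_deg           # body-based (vestibular/proprioceptive/motor) cue
    visual_estimate = physical_turn_deg * visual_gain    # what the virtual scene displayed

    # Multimodal model: one combined estimate is formed during the outbound
    # path and later guides the return, even though the return is in darkness.
    multimodal = w_visual * visual_estimate + (1 - w_visual) * interoceptive_estimate

    # Separate-influences model: with no vision on the return leg,
    # only the interoceptive estimate can drive the response.
    separate = interoceptive_estimate

    return multimodal, separate

# Example: a 90-degree physical turn displayed with a 1.3x visual rotation gain.
multimodal, separate = predicted_home_turn(90.0, visual_gain=1.3)
print(f"multimodal prediction:          {multimodal:.1f} deg")
print(f"separate-influences prediction: {separate:.1f} deg")

On this toy reading, any systematic shift of the homing response toward the visually displayed rotation, as reported in the abstract, is consistent with the multimodal account, whereas the separate-influences account predicts a response governed by the physical rotation alone.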

SUBMITTER: Tcheang L 

PROVIDER: S-EPMC3024704 | biostudies-other | 2011 Jan

REPOSITORIES: biostudies-other

Publications

Visual influence on path integration in darkness indicates a multimodal representation of large-scale space.

Lili Tcheang, Heinrich H. Bülthoff, Neil Burgess

Proceedings of the National Academy of Sciences of the United States of America, 2011 Jan 03, issue 3


Similar Datasets

| S-EPMC7244182 | biostudies-literature
| S-EPMC9898225 | biostudies-literature
| S-EPMC5732274 | biostudies-literature
| S-EPMC3420935 | biostudies-literature
| S-EPMC8156799 | biostudies-literature
| S-EPMC9251113 | biostudies-literature
| S-EPMC9945067 | biostudies-literature
| S-EPMC6373447 | biostudies-literature
| S-EPMC8794728 | biostudies-literature
| S-EPMC7355300 | biostudies-literature