Systematic misperceptions of 3-D motion explained by Bayesian inference.
ABSTRACT: People make surprising but reliable perceptual errors. Here, we provide a unified explanation for systematic errors in the perception of three-dimensional (3-D) motion. To do so, we characterized the binocular retinal motion signals produced by objects moving through arbitrary locations in 3-D. Next, we developed a Bayesian model, treating 3-D motion perception as optimal inference given sensory noise in the measurement of retinal motion. The model predicts a set of systematic perceptual errors, which depend on stimulus distance, contrast, and eccentricity. We then used a virtual-reality headset as well as a standard 3-D desktop stereoscopic display to test these predictions in a series of perceptual experiments. As predicted, we found evidence that errors in 3-D motion perception depend on the contrast, viewing distance, and eccentricity of a stimulus. These errors include a lateral bias in perceived motion direction and a surprising tendency to misreport approaching motion as receding and vice versa. In sum, we present a Bayesian model that provides a parsimonious account of a range of systematic misperceptions of motion in naturalistic environments.
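The abstract describes the model only in outline (optimal inference on noisy binocular retinal-motion measurements), so the following is a minimal illustrative sketch of that class of model, not the authors' implementation. It assumes a small-angle linear forward model mapping a 3-D object velocity (lateral component vx, in-depth component vz) at viewing distance Z to left/right retinal angular velocities, Gaussian measurement noise, and a zero-centered "slow motion" Gaussian prior; the prior and all parameter values are assumptions for illustration and are not quoted from the record.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): linear-Gaussian Bayesian
# estimation of 3-D object velocity from noisy binocular retinal velocities.
#
# Assumed geometry (illustrative small-angle approximation): an object on the
# midline at viewing distance Z (m) moving with lateral velocity vx and
# in-depth velocity vz (m/s) produces left/right retinal angular velocities
#   omega_L ~ vx / Z + (a / 2) * vz / Z**2
#   omega_R ~ vx / Z - (a / 2) * vz / Z**2
# where a is the interocular separation. Parameter values are placeholders.

def map_estimate_3d_velocity(omega_l, omega_r, Z=1.0, a=0.065,
                             sigma_meas=0.02, sigma_prior=0.5):
    """MAP estimate of (vx, vz) under Gaussian retinal-velocity noise and a
    zero-mean 'slow motion' Gaussian prior on world velocity (assumed)."""
    # Linear forward model: m = A @ v, with v = [vx, vz]
    A = np.array([[1.0 / Z,  a / (2.0 * Z**2)],
                  [1.0 / Z, -a / (2.0 * Z**2)]])
    m = np.array([omega_l, omega_r])

    # Posterior precision = likelihood precision + prior precision
    precision = A.T @ A / sigma_meas**2 + np.eye(2) / sigma_prior**2
    v_map = np.linalg.solve(precision, A.T @ m / sigma_meas**2)
    return v_map  # [vx_hat, vz_hat]

if __name__ == "__main__":
    # Simulate an object moving obliquely toward the observer.
    Z, a = 1.0, 0.065
    vx, vz = 0.1, -0.1
    omega_l = vx / Z + (a / 2) * vz / Z**2
    omega_r = vx / Z - (a / 2) * vz / Z**2
    print(map_estimate_3d_velocity(omega_l, omega_r, Z=Z, a=a))
```

In this kind of model, the in-depth component produces much weaker retinal signals than the lateral component (scaled by a/(2Z^2) versus 1/Z), so the prior shrinks the estimated in-depth velocity more strongly; the shrinkage grows with measurement noise (e.g., lower contrast) and viewing distance, which is consistent in spirit with the contrast- and distance-dependent lateral bias described in the abstract.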
SUBMITTER: Rokers B
PROVIDER: S-EPMC6691918 | biostudies-literature | 2018 Mar
REPOSITORIES: biostudies-literature