Predictions drive neural representations of visual events ahead of incoming sensory information.
ABSTRACT: The transmission of sensory information through the visual system takes time. As a result of these delays, the visual information available to the brain always lags behind the timing of events in the present moment. Compensating for these delays is crucial for functioning within dynamic environments, since interacting with a moving object (e.g., catching a ball) requires real-time localization of the object. One way the brain might achieve this is via prediction of anticipated events. Using time-resolved decoding of electroencephalographic (EEG) data, we demonstrate that the visual system represents the anticipated future position of a moving object, showing that predictive mechanisms activate the same neural representations as afferent sensory input. Importantly, this activation is evident before sensory input corresponding to the stimulus position is able to arrive. Finally, we demonstrate that, when predicted events do not eventuate, sensory information arrives too late to prevent the visual system from representing what was expected but never presented. Taken together, we demonstrate how the visual system can implement predictive mechanisms to preactivate sensory representations, and argue that this might allow it to compensate for its own temporal constraints, enabling us to interact with dynamic visual environments in real time.
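For readers unfamiliar with the method named in the abstract, the sketch below illustrates the general logic of time-resolved decoding: a classifier is trained at each time sample to read out stimulus position from the spatial pattern of EEG activity, and the time course of decoding accuracy reveals when position information becomes available. The data dimensions, classifier, and cross-validation scheme here are illustrative assumptions on synthetic data, not the authors' actual analysis pipeline.

```python
# Minimal sketch of time-resolved decoding on synthetic EEG-like data.
# Assumed shapes and classifier choices are for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 100          # assumed sizes
X = rng.standard_normal((n_trials, n_channels, n_times))  # epochs: trials x channels x time
y = rng.integers(0, 4, n_trials)                        # e.g., 4 possible stimulus positions

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
accuracy = np.empty(n_times)
for t in range(n_times):
    # Decode stimulus position from the spatial pattern at each time point
    accuracy[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

# 'accuracy' traces when position information is present in the signal;
# above-chance decoding of a future position before feedforward input could
# arrive is the signature of predictive preactivation described in the abstract.
```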
SUBMITTER: Blom T
PROVIDER: S-EPMC7132318 | biostudies-literature | 2020 Mar
REPOSITORIES: biostudies-literature