Accuracy of Motor Error Predictions for Different Sensory Signals.
ABSTRACT: Detecting and evaluating errors in action execution is essential for learning. Through complex interactions between the inverse and the forward model, the human motor system can predict and subsequently adjust ongoing or subsequent actions. Inputs to such a prediction are efferent and afferent signals from various sources. The aim of the current study was to examine the contribution of visual signals, as well as of combined efferent and proprioceptive signals, to error prediction in a complex motor task. Predicting motor errors has been shown to correlate with a neural signal known as the error-related negativity (Ne/ERN). Here, we tested how the Ne/ERN amplitude was modulated by the availability of different sensory signals in a semi-virtual throwing task in which the action outcome (hit or miss of the target) was temporally delayed relative to movement execution, allowing participants to form predictions about the outcome before knowledge of results became available. Nineteen participants practiced the task, and the electroencephalogram was recorded in two test conditions. In the Visual condition, participants received only visual input by passively observing the throwing movement. In the EffProp condition, participants actively executed the task while visual information about the real and the virtual effector was occluded; hence, only efferent and proprioceptive signals were available. Results show a significant modulation of the Ne/ERN in the Visual condition, whereas no effect was observed in the EffProp condition. In addition, amplitudes of the feedback-related negativity in response to the actual outcome feedback were inversely related to the Ne/ERN amplitudes. Our findings indicate that error prediction is modulated by the availability of input signals to the forward model. The observed amplitudes were attenuated in comparison to previous studies in which all efferent and sensory inputs were present. Furthermore, we assume that visual signals are weighted more strongly than proprioceptive signals, at least in goal-oriented tasks with visual targets.
SUBMITTER: Joch M
PROVIDER: S-EPMC6090479 | biostudies-literature | 2018
REPOSITORIES: biostudies-literature