Dataset Information

Manual Gestures Modulate Early Neural Responses in Loudness Perception.


ABSTRACT: How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies of audiovisual interaction have focused on abstract levels of representation, such as categorical perception (e.g., the McGurk effect), and it remains unclear whether cross-modal modulation extends to low-level perceptual attributes. This study used moving manual gestures to test whether and how loudness perception can be modulated by visual-motion information. Specifically, we implemented a novel paradigm in which participants compared the loudness of two consecutive sounds whose intensity differed by approximately the just-noticeable difference (JND), with manual gestures presented concurrently with the second sound. In two behavioral and two EEG experiments, we tested the hypothesis that the visual-motor information in gestures would modulate loudness perception. Behavioral results showed that the gestural information biased loudness judgments. More importantly, the EEG results demonstrated that early auditory responses around 100 ms after sound onset (N100) were modulated by the gestures. These consistent results across four experiments suggest that visual-motor processing can integrate with auditory processing at an early perceptual stage to shape the perception of a low-level attribute such as loudness, at least under challenging listening conditions.
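
The abstract compares an N100 component across gesture conditions. As a rough illustration of the kind of measurement involved, and not the authors' actual analysis pipeline, the minimal numpy sketch below quantifies an N100-like response as the trial-averaged mean amplitude in a window around 100 ms after sound onset. The sampling rate, epoch bounds, window, and all names are assumptions for illustration only.

```python
# Hypothetical sketch of an N100 amplitude measure; NOT the dataset's
# actual pipeline. All parameters below are assumed for illustration.
import numpy as np

FS = 500                     # assumed sampling rate in Hz
EPOCH_START = -0.2           # assumed epoch start relative to sound onset (s)
N100_WINDOW = (0.08, 0.12)   # assumed window around the N100 peak (s)

def n100_amplitude(epochs: np.ndarray) -> float:
    """Mean amplitude in the N100 window of the trial-averaged ERP.

    epochs: array of shape (n_trials, n_samples) for one EEG channel,
            time-locked to the onset of the second sound.
    """
    times = EPOCH_START + np.arange(epochs.shape[1]) / FS
    mask = (times >= N100_WINDOW[0]) & (times <= N100_WINDOW[1])
    erp = epochs.mean(axis=0)       # average over trials -> ERP
    return float(erp[mask].mean())  # mean amplitude inside the window

# Toy usage with synthetic noise for two hypothetical gesture conditions.
rng = np.random.default_rng(0)
n_trials = 60
n_samples = int((0.6 - EPOCH_START) * FS)
cond_a = rng.normal(0.0, 1.0, (n_trials, n_samples))
cond_b = rng.normal(0.0, 1.0, (n_trials, n_samples))
print(n100_amplitude(cond_a), n100_amplitude(cond_b))
```

A real analysis would, among other things, baseline-correct each epoch and compare conditions statistically across participants; this sketch only shows the windowed-amplitude idea the abstract alludes to.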

SUBMITTER: Sun J 

PROVIDER: S-EPMC8440995 | biostudies-literature

REPOSITORIES: biostudies-literature

Similar Datasets

| S-EPMC2929818 | biostudies-other
| S-EPMC7755647 | biostudies-literature
| S-EPMC4738254 | biostudies-other
| S-EPMC4485572 | biostudies-literature
| S-EPMC4381242 | biostudies-literature
| S-EPMC6698263 | biostudies-other
| S-EPMC7536367 | biostudies-literature
| S-EPMC7518190 | biostudies-literature