The integration of audio-tactile information is modulated by multimodal social interaction with physical contact in infancy.
ABSTRACT: Interaction between caregivers and infants is multimodal in nature. To react interactively and smoothly to such multimodal signals, infants must integrate all of these signals. However, few empirical infant studies have investigated how multimodal social interaction with physical contact facilitates multimodal integration, especially regarding audio-tactile (A-T) information. Using electroencephalography (EEG) and event-related potentials (ERPs), the present study investigated how the neural processing involved in A-T integration is modulated by tactile interaction. Seven- to eight-month-old infants heard one pseudoword both while being tickled (multimodal 'A-T' condition) and while not being tickled (unimodal 'A' condition). Thereafter, their EEG was measured during perception of the same words. Compared to the A condition, the A-T condition resulted in enhanced ERPs and higher beta-band activity within the left temporal regions, indicating neural processing of A-T integration. Additionally, theta-band activity within the middle frontal region was enhanced, which may reflect heightened attention to social information. Furthermore, the differential ERPs correlated with the degree of engagement in the tickling interaction. We provide neural evidence that the integration of A-T information in infants' brains is facilitated through tactile interaction with others. Such plastic changes in neural processing may promote harmonious social interaction and effective learning in infancy.
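To make the analyses named in the abstract concrete, the sketch below illustrates, in a purely hypothetical way that is not the authors' actual pipeline, how condition-wise ERPs and theta/beta band power might be compared between the A-T and A conditions with MNE-Python. The channel subset, sampling rate, epoch counts, epoch length, and frequency-band boundaries are all assumptions introduced for illustration.

```python
# Illustrative sketch only -- not the authors' analysis pipeline.
# Assumes epoched EEG (n_epochs x n_channels x n_times) for the A and A-T
# conditions; channel names, sampling rate, and bands are hypothetical.
import numpy as np
import mne

sfreq = 250.0                                  # assumed sampling rate (Hz)
ch_names = ["F3", "Fz", "F4", "T7", "T8"]      # illustrative channel subset
info = mne.create_info(ch_names, sfreq, ch_types="eeg")

rng = np.random.default_rng(0)
def fake_epochs(n_epochs):
    # Placeholder data (in volts) standing in for real epoched recordings.
    data = rng.normal(0.0, 1e-6, (n_epochs, len(ch_names), 250))
    return mne.EpochsArray(data, info)

epochs_a, epochs_at = fake_epochs(40), fake_epochs(40)

# Condition-wise ERPs: average over epochs, then form the A-T minus A contrast.
erp_a, erp_at = epochs_a.average(), epochs_at.average()
erp_diff = mne.combine_evoked([erp_at, erp_a], weights=[1, -1])

# Time-frequency power via Morlet wavelets; theta (~4-6 Hz) and
# beta (~15-25 Hz) ranges here are assumptions for illustration.
freqs = np.arange(4.0, 26.0, 1.0)
power_at = mne.time_frequency.tfr_morlet(
    epochs_at, freqs=freqs, n_cycles=freqs / 2.0, return_itc=False)
power_a = mne.time_frequency.tfr_morlet(
    epochs_a, freqs=freqs, n_cycles=freqs / 2.0, return_itc=False)

theta = (freqs >= 4) & (freqs <= 6)
beta = (freqs >= 15) & (freqs <= 25)
theta_diff = power_at.data[:, theta].mean() - power_a.data[:, theta].mean()
beta_diff = power_at.data[:, beta].mean() - power_a.data[:, beta].mean()
print(f"A-T minus A power: theta {theta_diff:.3e}, beta {beta_diff:.3e}")
```

In a real analysis these condition differences would be evaluated over the reported regions (left temporal, middle frontal) with appropriate statistics rather than simple means, but the structure of the contrast is the same.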
SUBMITTER: Tanaka Y
PROVIDER: S-EPMC6969118 | biostudies-literature
REPOSITORIES: biostudies-literature