Dataset Information

Representational interactions during audiovisual speech entrainment: Redundancy in left posterior superior temporal gyrus and synergy in left motor cortex.


ABSTRACT: Integration of multimodal sensory information is fundamental to many aspects of human behavior, but the neural mechanisms underlying these processes remain mysterious. For example, during face-to-face communication, we know that the brain integrates dynamic auditory and visual inputs, but we do not yet understand where and how such integration mechanisms support speech comprehension. Here, we quantify representational interactions between dynamic audio and visual speech signals and show that different brain regions exhibit different types of representational interaction. With a novel information theoretic measure, we found that theta (3-7 Hz) oscillations in the posterior superior temporal gyrus/sulcus (pSTG/S) represent auditory and visual inputs redundantly (i.e., represent common features of the two), whereas the same oscillations in left motor and inferior temporal cortex represent the inputs synergistically (i.e., the instantaneous relationship between audio and visual inputs is also represented). Importantly, redundant coding in the left pSTG/S and synergistic coding in the left motor cortex predict behavior, i.e., speech comprehension performance. Our findings therefore demonstrate that processes classically described as integration can have different statistical properties and may reflect distinct mechanisms that occur in different brain regions to support audiovisual speech comprehension.
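The "novel information theoretic measure" referenced above is a partial information decomposition (PID) of the information that two stimulus signals (auditory and visual speech) carry about a neural response. The paper's actual estimator is more sophisticated, but the redundancy/synergy distinction can be illustrated with plain co-information (interaction information) on binned signals: a positive value means the two inputs carry overlapping (redundant) information about the response, while a negative value means the joint input carries information neither input carries alone (synergy). The sketch below is a minimal illustration under these assumptions; the function names and binning scheme are illustrative, not the authors' implementation.

```python
import numpy as np

def discrete_mi(x, y, bins_x=4, bins_y=4):
    """Mutual information I(X;Y) in bits, estimated from a 2-D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=[bins_x, bins_y])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal P(x), column vector
    py = pxy.sum(axis=0, keepdims=True)   # marginal P(y), row vector
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])))

def co_information(aud, vis, brain, bins=4):
    """Net interaction: I(A;B) + I(V;B) - I(A,V;B).
    > 0 : redundancy dominates;  < 0 : synergy dominates."""
    q = np.linspace(0, 1, bins + 1)[1:-1]           # interior quantile levels
    a = np.digitize(aud, np.quantile(aud, q))       # discretize to 0 .. bins-1
    v = np.digitize(vis, np.quantile(vis, q))
    av = a * bins + v                               # joint (A,V) as one variable
    return (discrete_mi(a, brain, bins, bins)
            + discrete_mi(v, brain, bins, bins)
            - discrete_mi(av, brain, bins * bins, bins))
```

A toy demonstration of the two regimes the abstract describes. In the redundant case, both inputs and the response reflect a common signal (as in pSTG/S tracking features shared by audio and lip movements); in the synergistic case, the response depends on the instantaneous relationship between two independent inputs (as described for left motor cortex):

```python
rng = np.random.default_rng(0)
n = 20000

# Redundancy: both inputs reflect a common source that also drives the response
common = rng.standard_normal(n)
aud_r = common + 0.3 * rng.standard_normal(n)
vis_r = common + 0.3 * rng.standard_normal(n)
resp_r = common + 0.3 * rng.standard_normal(n)
print(co_information(aud_r, vis_r, resp_r))    # positive: redundancy

# Synergy: response encodes the relationship between independent inputs
aud_s = rng.standard_normal(n)
vis_s = rng.standard_normal(n)
resp_s = np.sign(aud_s) * np.sign(vis_s) + 0.3 * rng.standard_normal(n)
print(co_information(aud_s, vis_s, resp_s))    # negative: synergy
```

Note that co-information reports only the net balance of redundancy and synergy; a full PID, as used in the study, separates the two components.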

SUBMITTER: Park H 

PROVIDER: S-EPMC6095613 | biostudies-literature | 2018 Aug

REPOSITORIES: biostudies-literature


Publications

Representational interactions during audiovisual speech entrainment: Redundancy in left posterior superior temporal gyrus and synergy in left motor cortex.

Hyojin Park, Robin A. A. Ince, Philippe G. Schyns, Gregor Thut, Joachim Gross

PLoS Biology, 2018 Aug 06 (issue 8)


Similar Datasets

| S-EPMC7470920 | biostudies-literature
| S-EPMC7703954 | biostudies-literature
| S-EPMC2967728 | biostudies-other
| S-EPMC6332511 | biostudies-literature
| S-EPMC2536697 | biostudies-literature
| S-EPMC4162511 | biostudies-literature
| S-EPMC5838482 | biostudies-literature
| S-EPMC6957234 | biostudies-literature
| S-EPMC7379161 | biostudies-literature
| S-EPMC6153351 | biostudies-literature