
Dataset Information


Dynamic EEG analysis during language comprehension reveals interactive cascades between perceptual processing and sentential expectations.


ABSTRACT: Understanding spoken language requires analysis of the rapidly unfolding speech signal at multiple levels: acoustic, phonological, and semantic. However, there is not yet a comprehensive picture of how these levels relate. We recorded electroencephalography (EEG) while listeners (N = 31) heard sentences in which we manipulated acoustic ambiguity (e.g., a bees/peas continuum) and sentential expectations (e.g., Honey is made by bees). EEG was analyzed with a mixed effects model over time to quantify how language processing cascades proceed on a millisecond-by-millisecond basis. Our results indicate: (1) perceptual processing and memory for fine-grained acoustics is preserved in brain activity for up to 900 msec; (2) contextual analysis begins early and is graded with respect to the acoustic signal; and (3) top-down predictions influence perceptual processing in some cases; however, these predictions are available simultaneously with the veridical signal. These mechanistic insights provide a basis for a better understanding of the cortical language network.
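
The abstract states that the EEG was analyzed with a mixed effects model over time. As an illustration only, the sketch below shows one common way such a millisecond-by-millisecond analysis can be set up in Python with statsmodels; the column names (amplitude, vot_step, context, subject, time_ms) and the model formula are hypothetical assumptions, not the authors' actual specification.

# Minimal, hypothetical sketch of a time-resolved mixed-effects analysis of
# single-trial EEG, assuming a long-format DataFrame with columns:
#   amplitude - single-trial EEG amplitude at one electrode and timepoint
#   vot_step  - step along the acoustic (e.g., bees/peas) continuum
#   context   - sentential expectation condition, coded numerically
#   subject   - participant identifier (modeled as a random intercept)
#   time_ms   - timepoint relative to target-word onset
import pandas as pd
import statsmodels.formula.api as smf

def fit_timecourse(df: pd.DataFrame) -> pd.DataFrame:
    """Fit a separate mixed model at each timepoint and collect fixed effects."""
    rows = []
    for t, d in df.groupby("time_ms"):
        # Fixed effects for continuum step, sentence context, and their
        # interaction; random intercept for each subject.
        model = smf.mixedlm("amplitude ~ vot_step * context", d, groups=d["subject"])
        fit = model.fit(reml=True)
        for name in fit.fe_params.index:
            rows.append({"time_ms": t, "term": name,
                         "estimate": fit.fe_params[name],
                         "pvalue": fit.pvalues[name]})
    return pd.DataFrame(rows)

The resulting per-timepoint coefficient traces could then be inspected to ask when the continuum step, the sentence context, and their interaction reliably modulate the EEG, which is the kind of question the abstract's results address.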

SUBMITTER: Sarrett ME 

PROVIDER: S-EPMC7682806 | biostudies-literature

REPOSITORIES: biostudies-literature

Similar Datasets

S-EPMC8354901 | biostudies-literature
S-EPMC4585526 | biostudies-literature
S-EPMC7331387 | biostudies-literature
S-EPMC2427059 | biostudies-literature
S-EPMC4405243 | biostudies-literature
S-EPMC3515598 | biostudies-literature
S-EPMC6381768 | biostudies-literature
S-EPMC6870623 | biostudies-literature
S-EPMC10655903 | biostudies-literature
S-EPMC6694754 | biostudies-literature