Dataset Information

Auditory selective attention to speech modulates activity in the visual word form area.


ABSTRACT: Selective attention to speech versus nonspeech signals in complex auditory input could produce top-down modulation of cortical regions previously linked to perception of spoken, and even visual, words. To isolate such top-down attentional effects, we contrasted 2 equally challenging active listening tasks, performed on the same complex auditory stimuli (words overlaid with a series of 3 tones). Instructions required selectively attending to either the speech signals (in service of rhyme judgment) or the melodic signals (tone-triplet matching). Selective attention to speech, relative to attention to melody, was associated with blood oxygenation level-dependent (BOLD) increases during functional magnetic resonance imaging (fMRI) in left inferior frontal gyrus, temporal regions, and the visual word form area (VWFA). Further investigation of the activity in visual regions revealed overall deactivation relative to baseline rest for both attention conditions. Topographic analysis demonstrated that while attending to melody drove deactivation equivalently across all fusiform regions of interest examined, attending to speech produced a regionally specific modulation: deactivation of all fusiform regions, except the VWFA. Results indicate that selective attention to speech can topographically tune extrastriate cortex, leading to increased activity in VWFA relative to surrounding regions, in line with the well-established connectivity between areas related to spoken and visual word perception in skilled readers.
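For readers who want to explore the associated imaging data, the sketch below illustrates the kind of region-of-interest (ROI) contrast described in the abstract: comparing attend-speech versus attend-melody activity within the VWFA and neighboring fusiform sites. This is a minimal illustration using nilearn, not the authors' analysis pipeline; the file names, condition labels, and ROI coordinates are assumptions made for the example.

# Minimal sketch of an ROI contrast in the spirit of the topographic analysis
# described above. File names, condition labels, and coordinates are
# illustrative assumptions, not the authors' pipeline.
import numpy as np
from nilearn.maskers import NiftiSpheresMasker

# Hypothetical subject-level contrast images (each condition versus rest).
speech_maps = ["sub-01_speech_vs_rest.nii.gz", "sub-02_speech_vs_rest.nii.gz"]
melody_maps = ["sub-01_melody_vs_rest.nii.gz", "sub-02_melody_vs_rest.nii.gz"]

# Approximate MNI seeds: a commonly cited VWFA locus in left mid-fusiform
# cortex plus neighboring fusiform control sites (assumed coordinates).
rois = {
    "VWFA": (-42, -57, -15),
    "posterior_fusiform": (-42, -78, -15),
    "anterior_fusiform": (-42, -36, -15),
}
masker = NiftiSpheresMasker(seeds=list(rois.values()), radius=6.0)

def roi_means(maps):
    # Mean contrast value in each spherical ROI, one row per subject.
    return np.vstack([masker.fit_transform(img) for img in maps])

speech = roi_means(speech_maps)   # shape: (n_subjects, n_rois)
melody = roi_means(melody_maps)

# Attention effect (speech minus melody) per ROI; relative sparing of the
# VWFA amid fusiform deactivation would mirror the pattern reported above.
effect = speech.mean(axis=0) - melody.mean(axis=0)
for name, value in zip(rois, effect):
    print(f"{name}: {value:+.3f}")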

SUBMITTER: Yoncheva YN 

PROVIDER: S-EPMC2820701 | biostudies-literature | 2010 Mar

REPOSITORIES: biostudies-literature

Publications

Auditory selective attention to speech modulates activity in the visual word form area.

Yoncheva YN, Zevin JD, Maurer U, McCandliss BD

Cerebral Cortex (New York, N.Y.: 1991), published online 2009-07-01, issue 3


Similar Datasets

| S-EPMC3756304 | biostudies-literature
| S-EPMC9298413 | biostudies-literature
| S-EPMC7264140 | biostudies-literature
| S-EPMC6898452 | biostudies-literature
| S-EPMC7535924 | biostudies-literature
| S-EPMC3386120 | biostudies-literature
| S-EPMC2706007 | biostudies-literature
| S-EPMC6328158 | biostudies-literature
| S-EPMC9838839 | biostudies-literature
| S-EPMC5928127 | biostudies-literature