
Dataset Information


Audio-visual spatial alignment improves integration in the presence of a competing audio-visual stimulus.


ABSTRACT: In order to parse the world around us, we must constantly determine which sensory inputs arise from the same physical source and should therefore be perceptually integrated. Temporal coherence between auditory and visual stimuli drives audio-visual (AV) integration, but the role played by AV spatial alignment is less well understood. Here, we manipulated AV spatial alignment and collected electroencephalography (EEG) data while human subjects performed a free-field variant of the "pip and pop" AV search task. In this paradigm, visual search is aided by a spatially uninformative auditory tone, the onsets of which are synchronized to changes in the visual target. In Experiment 1, tones were either spatially aligned or spatially misaligned with the visual display. Regardless of AV spatial alignment, we replicated the key pip and pop result of improved AV search times. Mirroring the behavioral results, we found an enhancement of early event-related potentials (ERPs), particularly the auditory N1 component, in both AV conditions. We demonstrate that both top-down and bottom-up attention contribute to these N1 enhancements. In Experiment 2, we tested whether spatial alignment influences AV integration in a more challenging context with competing multisensory stimuli. An AV foil was added that visually resembled the target and was synchronized to its own stream of synchronous tones. The visual components of the AV target and AV foil occurred in opposite hemifields; the two auditory components were also in opposite hemifields and were either spatially aligned or spatially misaligned with the visual components to which they were synchronized. Search was fastest when the auditory and visual components of the AV target (and the foil) were spatially aligned. Attention modulated ERPs in both spatial conditions, but importantly, the scalp topography of early evoked responses shifted only when stimulus components were spatially aligned, signaling the recruitment of different neural generators likely related to multisensory integration. These results suggest that AV integration depends on AV spatial alignment when stimuli in both modalities compete for selective integration, a common scenario in real-world perception.

SUBMITTER: Fleming JT 

PROVIDER: S-EPMC8016516 | biostudies-literature

REPOSITORIES: biostudies-literature

Similar Datasets

S-EPMC7930005 | biostudies-literature
S-EPMC6979315 | biostudies-literature
S-EPMC7788022 | biostudies-literature
S-EPMC4831745 | biostudies-literature
S-EPMC9334025 | biostudies-literature
S-EPMC3102664 | biostudies-literature
S-EPMC5247431 | biostudies-literature
S-EPMC6865810 | biostudies-literature
S-EPMC5633019 | biostudies-literature
S-EPMC6542535 | biostudies-literature