Dataset Information

Seeing the song: left auditory structures may track auditory-visual dynamic alignment.


ABSTRACT: Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it is not well known what sensory mechanisms generally track ongoing auditory-visual synchrony for non-speech signals in a complex auditory-visual environment. To begin to address this question, we used music and visual displays that varied in the dynamics of multiple features (e.g., auditory loudness and pitch; visual luminance, color, size, motion, and organization) across multiple time scales. Auditory activity (monitored using auditory steady-state responses, ASSR) was selectively reduced in the left hemisphere when the music and dynamic visual displays were temporally misaligned. Importantly, ASSR was not affected when attentional engagement with the music was reduced, or when visual displays presented dynamics clearly dissimilar to the music. These results suggest that left-lateralized auditory mechanisms are sensitive to auditory-visual temporal alignment, but perhaps only when the dynamics of auditory and visual streams are similar. These mechanisms may contribute to correct auditory-visual binding in a busy sensory environment.

SUBMITTER: Mossbridge JA 

PROVIDER: S-EPMC3806747 | biostudies-literature | 2013

REPOSITORIES: biostudies-literature

Publications

Seeing the song: left auditory structures may track auditory-visual dynamic alignment.

Julia A. Mossbridge, Marcia Grabowecky, Satoru Suzuki

PLoS ONE, 2013-10-23, issue 10

Similar Datasets

| S-EPMC3762867 | biostudies-literature
| S-EPMC3871798 | biostudies-other
| S-EPMC7034432 | biostudies-literature
| S-EPMC7225351 | biostudies-literature
| S-EPMC7670876 | biostudies-literature
| S-EPMC7189489 | biostudies-literature
| S-EPMC8532969 | biostudies-literature
| S-EPMC9092957 | biostudies-literature
| S-EPMC9911736 | biostudies-literature
| S-EPMC10021028 | biostudies-literature