Visual influences on echo suppression.
ABSTRACT: Locating sounds in realistic scenes is challenging because of distracting echoes and coarse spatial acoustic estimates. Fortunately, listeners can improve performance through several compensatory mechanisms. For instance, their brains perceptually suppress short-latency (1-10 ms) echoes by constructing a representation of the acoustic environment in a process called the precedence effect. This remarkable ability depends on the spatial and spectral relationship between the first, or precedent, sound wave and subsequent echoes. In addition to using acoustics alone, the brain also improves sound localization by incorporating spatially precise visual information. Specifically, vision refines auditory spatial receptive fields and can capture auditory perception such that sound is localized toward a coincident visual stimulus. Although visual cues and the precedence effect are each known to improve performance independently, it is not clear whether these mechanisms can cooperate or interfere with each other. Here we demonstrate that echo suppression is enhanced when visual information spatially and temporally coincides with the precedent wave. Conversely, echo suppression is inhibited when vision coincides with the echo. These data show that echo suppression is a fundamentally multisensory process in everyday environments, where vision modulates even this largely automatic auditory mechanism to organize a coherent spatial experience.
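For readers unfamiliar with precedence-effect stimuli, the sketch below illustrates the kind of lead/lag pair the abstract refers to: a precedent click followed by a simulated echo within the 1-10 ms suppression window. This is not the authors' stimulus code; the sample rate, click duration, and echo gain are arbitrary assumptions chosen only for illustration.

```python
# Illustrative sketch (assumed parameters, not the study's actual stimuli):
# a precedent click plus a delayed, attenuated copy (the "echo") whose lag
# falls inside the 1-10 ms window over which listeners fuse the two sounds.
import numpy as np

FS = 44_100  # assumed sample rate in Hz


def lead_lag_pair(echo_delay_ms: float, echo_gain: float = 0.8,
                  click_dur_ms: float = 0.5, total_dur_ms: float = 50.0) -> np.ndarray:
    """Return a mono waveform containing a precedent click and a delayed echo."""
    n_total = int(FS * total_dur_ms / 1000)
    n_click = max(1, int(FS * click_dur_ms / 1000))
    n_delay = int(FS * echo_delay_ms / 1000)

    click = np.hanning(n_click)                               # brief click transient
    signal = np.zeros(n_total)
    signal[:n_click] += click                                 # precedent (first-arriving) wave
    signal[n_delay:n_delay + n_click] += echo_gain * click    # simulated echo
    return signal


# Echo delays spanning the classic 1-10 ms suppression window.
for delay_ms in (1.0, 5.0, 10.0):
    wave = lead_lag_pair(delay_ms)
    print(f"echo delay {delay_ms:4.1f} ms -> {np.count_nonzero(wave)} nonzero samples")
```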
SUBMITTER: Bishop CW
PROVIDER: S-EPMC3068473 | biostudies-other | 2011 Feb
REPOSITORIES: biostudies-other