Project description: During sustained viewing of an ambiguous stimulus, an individual's perceptual experience will generally switch between the different possible alternatives rather than stay fixed on one interpretation (perceptual rivalry). Here, we measured pupil diameter while subjects viewed different ambiguous visual and auditory stimuli. For all stimuli tested, pupil diameter increased just before the reported perceptual switch, and the relative amount of dilation before this switch was a significant predictor of the duration of the subsequent period of perceptual stability. These results could not be explained by blink or eye-movement effects, by the motor response, or by stimulus-driven changes in retinal input. Because pupil dilation reflects levels of norepinephrine (NE) released from the locus coeruleus (LC), we interpret these results as suggesting that the LC-NE complex may play the same role in perceptual selection as in behavioral decision making.
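The predictive relationship reported here (pre-switch dilation predicting the duration of the next stable percept) could be quantified with a simple mixed-effects regression. Below is a minimal sketch, not the authors' analysis; the file and column names are hypothetical placeholders.

```python
# Minimal sketch (not the original analysis): regress the duration of the next
# stable percept on pre-switch pupil dilation, with random intercepts by subject.
# File and column names (subject, pre_switch_dilation, duration) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rivalry_trials.csv")
model = smf.mixedlm("duration ~ pre_switch_dilation", data=df, groups=df["subject"])
print(model.fit().summary())
```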
Project description: Objectives: The listening demand incurred by speech perception fluctuates in normal conversation. At the acoustic-phonetic level, natural variation in pronunciation acts as a speed bump on the way to accurate lexical selection. Any given utterance may be more or less phonetically ambiguous, a problem the listener must resolve to choose the correct word. This becomes especially apparent when considering two common speech registers, clear and casual, which have characteristically different levels of phonetic ambiguity. Clear speech prioritizes intelligibility through hyperarticulation, which results in less ambiguity at the phonetic level, whereas casual speech tends to have a more collapsed acoustic space. We hypothesized that listeners would invest greater cognitive resources while listening to casual speech than to clear speech in order to resolve the increased phonetic ambiguity. To this end, we used pupillometry as an online measure of listening effort during perception of clear and casual continuous speech in two background conditions: quiet and noise. Design: Forty-eight participants performed a probe detection task while listening to spoken, nonsensical sentences (masked and unmasked) as pupil size was recorded. Pupil size was modeled using growth curve analysis to capture the dynamics of the pupil response as the sentence unfolded. Results: Pupil size during listening was sensitive to the presence of noise and to speech register (clear/casual). Unsurprisingly, listeners had larger overall pupil dilations during speech perception in noise, replicating earlier work. The pupil dilation pattern for clear and casual sentences was considerably more complex: pupil dilation during clear-speech trials was slightly larger than during casual-speech trials, across both quiet and noisy backgrounds. Conclusions: We suggest that listener motivation could explain the larger pupil dilations to clearly spoken speech. We propose that, within the context of this task, listeners devoted more resources to perceiving the speech signal with the greatest acoustic/phonetic fidelity. Further, we unexpectedly found systematic differences in pupil dilation preceding the onset of the spoken sentences. Together, these data demonstrate that the pupillary system is not merely reactive but also adaptive, sensitive to both task structure and listener motivation to maximize accurate perception in a limited-resource system.
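As a rough illustration of the growth curve approach mentioned in the Design section, the sketch below fits orthogonal polynomial time terms in a mixed-effects model. It is an assumed, simplified version of such an analysis; the column names, polynomial order, and random-effects structure are placeholders rather than the study's actual model.

```python
# A simplified, assumed version of a growth curve analysis of the pupil time
# course (column names, polynomial order, and random-effects structure are
# placeholders, not the study's actual model).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def orthogonal_time(t, degree=2):
    """Orthogonal polynomial time terms (linear, quadratic, ...) for each row of t."""
    V = np.vander(np.asarray(t, dtype=float), degree + 1, increasing=True)  # 1, t, t^2, ...
    Q, _ = np.linalg.qr(V)                                                  # orthogonalize columns
    return Q[:, 1:]                                                         # drop the constant term

df = pd.read_csv("pupil_timecourse.csv")   # hypothetical long-format data:
                                           # subject, time_bin, register, noise, pupil
ot = orthogonal_time(df["time_bin"], degree=2)
df["ot1"], df["ot2"] = ot[:, 0], ot[:, 1]

# Fixed effects of speech register and noise and their interactions with the time
# terms; random intercepts by subject (random slopes omitted to keep the sketch short).
model = smf.mixedlm("pupil ~ (ot1 + ot2) * register * noise",
                    data=df, groups=df["subject"])
print(model.fit().summary())
```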
Project description: Pupil dilation has been reliably identified as a physiological marker of consciously reportable mental effort. This classical finding raises the question of whether pupil dilation could be a specific somatic signature of conscious processing. To explore this possibility, we engaged healthy volunteers in the 'local global' auditory paradigm we previously designed to disentangle conscious from non-conscious processing of novelty. We discovered that consciously reported violations of global (inter-trial) regularity were associated with a pupil dilation effect both in an active counting task and in a passive attentive task. This pupil dilation effect was detectable both at the group level and at the individual level. In contrast, unreported violations of this global regularity, as well as unreported violations of local (intra-trial) regularity that do not require conscious access, were not associated with a pupil dilation effect. We replicated these findings in a phonemic version of the 'local global' paradigm. Taken together, these results strongly suggest that pupil dilation is a somatic marker of conscious access in the auditory modality, and that it could therefore be used to probe conscious processing at the individual level without interfering with participants' stream of consciousness by questioning them.
Project description: The measurement of cognitive resource allocation during listening, or listening effort, provides valuable insight into the factors influencing auditory processing. In recent years, many studies inside and outside the field of hearing science have measured the pupil response evoked by auditory stimuli. The aim of the current review was to provide an exhaustive overview of these studies. The 146 studies included in this review originated from multiple domains, including hearing science and linguistics, but the review also covers research into motivation, memory, and emotion. The present review provides a unique overview of these studies and is organized according to the components of the Framework for Understanding Effortful Listening. A summary table presents the sample characteristics, an outline of the study design, the stimuli, the pupil parameters analyzed, and the main findings of each study. The results indicate that the pupil response is sensitive to various task manipulations as well as to interindividual differences. Many of the findings have been replicated. Frequent interactions between the independent factors affecting the pupil response have been reported, which indicates that complex processes underlie cognitive resource allocation. This complexity should be taken into account in future studies, which should focus more on interindividual differences and also include older participants. This review facilitates the careful design of new studies by indicating the factors that should be controlled for. In conclusion, measuring the pupil dilation response to auditory stimuli has been demonstrated to be a sensitive method applicable to numerous research questions. The sensitivity of the measure calls for carefully designed stimuli.
Project description: The dynamics of auditory stream segregation were evaluated using repeating triplets composed of pure tones or the syllable /ba/. Stimuli differed in frequency (tones) or fundamental frequency (speech) by 4, 6, 8, or 10 semitones, and the standard frequency was either 250 Hz (tones and speech) or 400 Hz (tones). Twenty normal-hearing adults participated. For both tones and speech, a two-stream percept became more likely as frequency separation increased. Perceptual organization for speech tended to be more integrated and less stable compared to tones. Results suggest that prior data patterns observed with tones in this paradigm may generalize to speech stimuli.
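For reference, the semitone separations above map onto frequency ratios of 2^(n/12); a short sketch of that arithmetic follows (the function name and printed tabulation are ours, for illustration only).

```python
# The semitone separations above correspond to frequency ratios of 2**(n / 12);
# the function name and this tabulation are ours, for illustration only.
def semitones_above(f0_hz: float, n_semitones: float) -> float:
    return f0_hz * 2 ** (n_semitones / 12)

for f0 in (250, 400):
    print(f0, [round(semitones_above(f0, n), 1) for n in (4, 6, 8, 10)])
# 250 Hz -> 315.0, 353.6, 396.9, 445.4 Hz; 400 Hz scales by the same ratios.
```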
Project description: OBJECTIVES: This study measured the impact of auditory spectral resolution on listening effort. Systematic degradation in spectral resolution was hypothesized to elicit corresponding systematic increases in pupil dilation, consistent with the notion of pupil dilation as a marker of cognitive load. DESIGN: Spectral resolution of sentences was varied with two different vocoders: (1) a noise-channel vocoder with a variable number of spectral channels; and (2) a vocoder designed to simulate front-end processing of a cochlear implant, including peak-picking channel selection with variable synthesis filter slopes to simulate spread of neural excitation. Pupil dilation was measured after subject-specific luminance adjustment and trial-specific baseline measures. Mixed-effects growth curve analysis was used to model pupillary responses over time. RESULTS: For both types of vocoder, pupil dilation grew with each successive degradation in spectral resolution. Within each condition, pupillary responses were not related to intelligibility scores, and the effect of spectral resolution on pupil dilation persisted even when only trials with 100% correct responses were analyzed. CONCLUSIONS: Intelligibility scores alone were not sufficient to quantify the effort required to understand speech with poor resolution. Degraded spectral resolution increases the effort required to understand speech, even when intelligibility is at 100%. Pupillary responses were a sensitive and highly granular measure of changes in listening effort. Pupillary responses might reveal benefits of aural prostheses that are not captured by speech intelligibility performance alone, as well as disadvantages that are overcome only by increased listening effort.
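To make the first manipulation concrete, the sketch below implements a generic noise-channel vocoder of the kind described: band-split the speech, extract per-channel envelopes, and use them to modulate band-limited noise. The filter orders, channel edges, and envelope cutoff are assumptions for illustration, not the study's actual signal chain, and the peak-picking cochlear-implant simulation is not included.

```python
# A rough sketch of a generic noise-channel vocoder (not the study's exact signal
# chain): split speech into log-spaced bands, extract each band's envelope, and
# use the envelopes to modulate band-limited noise carriers. Filter orders, channel
# edges, and the 300 Hz envelope cutoff are illustrative assumptions; assumes the
# sample rate fs is well above 2 * f_hi (e.g., 22.05 or 44.1 kHz audio).
import numpy as np
from scipy.signal import butter, sosfiltfilt

def noise_vocode(x, fs, n_channels=8, f_lo=100.0, f_hi=8000.0):
    x = np.asarray(x, dtype=float)
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)          # log-spaced band edges
    noise = np.random.default_rng(0).standard_normal(len(x))  # broadband carrier
    env_lp = butter(4, 300.0, btype="lowpass", fs=fs, output="sos")
    out = np.zeros_like(x)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        env = sosfiltfilt(env_lp, np.maximum(sosfiltfilt(band, x), 0.0))  # channel envelope
        out += env * sosfiltfilt(band, noise)                  # envelope-modulated band noise
    return out / np.max(np.abs(out))                           # peak-normalize
```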
Project description: Our ability to detect target sounds in complex acoustic backgrounds is often limited not by the ear's resolution, but by the brain's information-processing capacity. The neural mechanisms and loci of this "informational masking" are unknown. We combined magnetoencephalography with simultaneous behavioral measures in humans to investigate neural correlates of informational masking and auditory perceptual awareness in the auditory cortex. Cortical responses were sorted according to whether or not target sounds were detected by the listener in a complex, randomly varying multi-tone background known to produce informational masking. Detected target sounds elicited a prominent, long-latency response (50-250 ms), whereas undetected targets did not. In contrast, both detected and undetected targets produced equally robust auditory middle-latency, steady-state responses, presumably from the primary auditory cortex. These findings indicate that neural correlates of auditory awareness in informational masking emerge between early and late stages of processing within the auditory cortex.
Project description: A fundamental goal of cognitive neuroscience is to explain how mental decisions originate from basic neural mechanisms. The goal of the present study was to investigate the neural correlates of perceptual decisions in the context of emotional perception. To probe this question, we investigated how fluctuations in functional MRI (fMRI) signals were correlated with behavioral choice during a near-threshold fear detection task. fMRI signals predicted behavioral choice independently of stimulus properties and task accuracy in a network of brain regions linked to emotional processing: posterior cingulate cortex, medial prefrontal cortex, right inferior frontal gyrus, and left insula. We quantified the link between fMRI signals and behavioral choice in a whole-brain analysis by computing choice probabilities with signal-detection theory methods. Our results demonstrate that voxel-wise fMRI signals can reliably predict behavioral choice in a quantitative fashion (choice probabilities ranged from 0.63 to 0.78), at levels comparable to neuronal data. We suggest that the conscious decision that a fearful face has been seen is represented across a network of interconnected brain regions that prepare the organism to appropriately handle emotionally challenging stimuli and that regulate the associated emotional response.
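The choice-probability metric referred to here is, in signal-detection terms, the area under the ROC curve separating single-trial responses on the two choice outcomes. The sketch below shows that calculation in isolation; the variable names are hypothetical, and this is not the study's full whole-brain pipeline.

```python
# A minimal sketch of a choice-probability calculation in the signal-detection
# sense: the area under the ROC curve separating single-trial responses on the
# two choice outcomes. Variable names are hypothetical; this is not the study's
# full whole-brain pipeline.
import numpy as np
from scipy.stats import rankdata

def choice_probability(resp_choice_a, resp_choice_b):
    """AUROC for distinguishing the two response distributions; 0.5 = no choice information."""
    a = np.asarray(resp_choice_a, dtype=float)
    b = np.asarray(resp_choice_b, dtype=float)
    ranks = rankdata(np.concatenate([a, b]))                 # ties get average ranks
    u_a = ranks[: len(a)].sum() - len(a) * (len(a) + 1) / 2  # Mann-Whitney U for sample a
    return u_a / (len(a) * len(b))

# e.g., choice_probability(bold_fear_reported, bold_fear_not_reported)
```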
Project description: Detecting and integrating information across the senses is an advantageous mechanism for responding efficiently to the environment. In this study, a simple auditory-visual detection task was employed to test whether pupil dilation, generally associated with successful target detection, could be used as a reliable measure for studying multisensory integration in humans. We recorded reaction times and pupil dilation in response to a series of visual and auditory stimuli, which were presented either alone or in combination. The results indicated faster reaction times and larger pupil diameter in response to combined auditory and visual stimuli than to the same stimuli presented in isolation. Moreover, the responses in the multisensory condition exceeded the linear summation of the responses obtained in each unimodal condition. Importantly, faster reaction times corresponded to larger pupil dilation, suggesting that the latter can also be a reliable measure of multisensory processes. This study will serve as a foundation for the investigation of auditory-visual integration in populations where simple reaction times cannot be collected, such as developmental and clinical populations.
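The linear-summation comparison mentioned above can be illustrated with a short sketch: per participant, subtract the sum of the unimodal responses from the multisensory response and test whether the difference exceeds zero. The array names and the use of a one-sample t-test are illustrative assumptions, not necessarily the analysis used in the study.

```python
# A short sketch of the linear-summation comparison: per participant, subtract
# the sum of the unimodal responses from the multisensory response and test
# whether the difference exceeds zero. Array names and the one-sample t-test
# are illustrative assumptions (requires scipy >= 1.6 for `alternative`).
import numpy as np
from scipy.stats import ttest_1samp

def superadditivity_test(av, a_only, v_only):
    """av, a_only, v_only: per-participant baseline-corrected peak pupil dilations."""
    excess = np.asarray(av) - (np.asarray(a_only) + np.asarray(v_only))
    t, p = ttest_1samp(excess, 0.0, alternative="greater")
    return excess.mean(), t, p
```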
Project description: In quiescent states such as anesthesia and slow-wave sleep, cortical networks show slow rhythmic synchronized activity. In sensory cortices this rhythmic activity shows a stereotypical pattern that is recapitulated by stimulation of the appropriate sensory modality. The amygdala receives sensory input from a variety of sources, and in anesthetized animals, neurons in the basolateral amygdala (BLA) show slow rhythmic synchronized activity. Extracellular field potential recordings show that these oscillations are synchronized with sensory cortex and the thalamus, with both the thalamus and cortex leading the BLA. Using whole-cell recording in vivo, we show that the membrane potential of principal neurons spontaneously oscillates between up- and down-states. Footshock and auditory stimulation delivered during down-states evoke an up-state that fully recapitulates those occurring spontaneously. These results suggest that neurons in the BLA receive convergent input from networks of cortical neurons with slow oscillatory activity and that somatosensory and auditory stimulation can trigger activity in these same networks.