Prosodic and narrative processing in American Sign Language: an fMRI study.
ABSTRACT: Signed languages such as American Sign Language (ASL) are natural human languages that share all of the core properties of spoken human languages but differ in the modality through which they are communicated. Neuroimaging and patient studies have revealed similar left hemisphere (LH)-dominant patterns of brain organization for signed and spoken languages, suggesting that the linguistic nature of the information, rather than the modality, drives brain organization for language. However, the role of the right hemisphere (RH) in sign language has been less explored. In spoken languages, the RH supports the processing of numerous types of narrative-level information, including prosody, affect, facial expression, and discourse structure. In the present fMRI study, we contrasted the processing of ASL sentences containing these types of narrative information with that of similar sentences lacking marked narrative cues. For all sentences, Deaf native signers showed robust bilateral activation of perisylvian language cortices as well as the basal ganglia and medial frontal and medial temporal regions. However, RH activation in the inferior frontal gyrus and superior temporal sulcus was greater for sentences containing narrative devices; these regions include areas involved in processing narrative content in spoken languages. These results provide additional support for the claim that all natural human languages rely on a core set of LH brain regions, and they extend our knowledge by showing that the narrative-level linguistic functions typically associated with the RH in spoken languages are similarly organized in signed languages.
SUBMITTER: Newman AJ
PROVIDER: S-EPMC2908987 | biostudies-literature | 2010 Aug
REPOSITORIES: biostudies-literature