Project description: Strengthening the connections between sign language and written language may improve reading skills in deaf and hard-of-hearing (DHH) signing children. The main aim of the present study was to investigate whether computerized sign language-based literacy training improves reading skills in DHH signing children who are learning to read. Further, longitudinal associations between sign language skills and developing reading skills were investigated. Participants were recruited from Swedish state special schools for DHH children, where pupils are taught in both sign language and spoken language. Reading skills were assessed on five occasions, and the intervention was implemented in a cross-over design. Results indicated that reading skills improved over time and that development of word reading was predicted by the ability to imitate unfamiliar lexical signs, but there was only weak evidence that it was supported by the intervention. These results demonstrate for the first time a longitudinal link between sign-based abilities and word reading in DHH signing children who are learning to read. We suggest that the active construction of novel lexical forms may be a supramodal mechanism underlying word reading development.
Project description: We investigated the robust correlation between American Sign Language (ASL) proficiency and English reading ability in 51 young deaf signers aged 7;3 to 19;0 (years;months). Signers were divided into 'skilled' and 'less-skilled' groups based on their performance on three measures of ASL. We next assessed reading comprehension of four English sentence structures (actives, passives, pronouns, reflexive pronouns) using a sentence-to-picture-matching task. Of interest was the extent to which ASL proficiency provided a foundation for the lexical and syntactic processing of English. Skilled signers outperformed less-skilled signers overall. Error analyses further indicated greater single-word recognition difficulties in less-skilled signers, marked by a higher rate of errors reflecting an inability to identify the actors and actions described in the sentence. Our findings provide evidence that greater ASL ability supports English sentence comprehension at the level of both individual words and syntax. This is consistent with the theory that first-language learning promotes second-language learning through transfer of linguistic elements, irrespective of how transparently grammatical structures map between the two languages.
Project description: Some studies have concluded that sign language hinders spoken language development in deaf and hard-of-hearing (DHH) children, even though sign language exposure could protect DHH children from experiencing language deprivation. Furthermore, this research has rarely considered the bilingualism of children learning both a signed and a spoken language. Here we compare spoken English development in 2- to 6-year-old deaf and hearing American Sign Language-English bilingual children to each other and to monolingual English speakers in a comparison database. Age predicted bilinguals' language scores on all measures, whereas hearing status was significant for only one measure. Both bilingual groups tended to score below monolinguals. Deaf bilinguals' scores differed more from monolinguals', potentially because of a later age of exposure to English, less total exposure to English, and/or hearing through a cochlear implant. Overall, these results are consistent with typical early bilingual language development. Research and practice must treat signing-speaking children as bilinguals and consider the bilingual language development literature.
Project description: Previous studies have suggested that deafness could lead to deficits in motor skills and other body-related abilities. However, the literature on motor skills in deaf adults is scarce, and existing studies have often included participants with heterogeneous language backgrounds and deafness etiologies, making it difficult to delineate the effects of deafness. In this study, we investigated motor learning in deaf native signers and hearing nonsigners. To disentangle the effects of deafness from those of acquiring a signed language, we additionally tested a group of hearing native signers. Two well-established motor learning paradigms were employed, in which participants had to adapt their hand movements to a rotation of the visual feedback (Experiment 1) or to the introduction of a force field (Experiment 2). Proprioceptive estimates were assessed before and after adaptation. Like hearing nonsigners, deaf and hearing signers showed robust adaptation in both paradigms. No significant differences in motor adaptation or motor memory were observed between deaf signers and hearing nonsigners, or between hearing signers and hearing nonsigners. Moreover, no discernible group differences in proprioceptive accuracy were observed. These findings challenge the prevalent notion that deafness leads to deficits in motor skills and other body-related abilities.
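As a toy illustration of the visuomotor rotation manipulation used in Experiment 1 (the force-field paradigm is not sketched), the snippet below rotates the hand position about the movement origin before it is displayed as a cursor, so reaches must adapt to land on the target. The 30-degree angle and function names are illustrative assumptions, not the study's actual parameters.

```python
# Toy sketch of a visuomotor rotation: the displayed cursor is the true hand
# position rotated about the start point. Angle and names are assumptions.
import numpy as np

def rotated_cursor(hand_xy, origin_xy, angle_deg=30.0):
    """Return the displayed cursor position for a given hand position."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    hand = np.asarray(hand_xy, dtype=float)
    origin = np.asarray(origin_xy, dtype=float)
    return origin + rot @ (hand - origin)

# Example: a straight reach toward a target at (0, 10) is displayed rotated by 30 degrees.
print(rotated_cursor((0.0, 10.0), (0.0, 0.0)))
```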
Project description: Age of acquisition (AoA) effects have been used to support the notion of a critical period for first language acquisition. In this study, we examine AoA effects in deaf British Sign Language (BSL) users via a grammaticality judgment task. When English reading performance and nonverbal IQ are factored out, results show that the accuracy of grammaticality judgments decreases as AoA increases, up to around age 8, demonstrating a unique effect of AoA on grammaticality judgment in early learners. No such effects were found in those who acquired BSL after age 8. These late learners appear to have first-language proficiency in English instead, which may have been used to scaffold the learning of BSL as a second language later in life.
Project description: Real-time hand movement trajectory tracking based on machine learning approaches may assist the early identification of dementia in ageing Deaf individuals who are users of British Sign Language (BSL), since there are few clinicians with appropriate communication skills and a shortage of sign language interpreters. Unlike other computer vision systems used in dementia stage assessment, such as RGB-D video captured with a depth camera, activities of daily living (ADL) monitored by information and communication technology (ICT) facilities, or X-ray, computed tomography (CT), and magnetic resonance imaging (MRI) images fed to machine learning algorithms, the system developed here focuses on analysing the sign language space envelope (sign trajectories/depth/speed) and the facial expressions of deaf individuals using ordinary 2D videos. In this work, we aim to provide more accurate segmentation of the objects of interest from the background so that hand trajectories (trajectory path and speed) can be extracted accurately in real time. The paper presents and evaluates two hand movement trajectory models. In the first model, the hand sign trajectory is tracked using skin colour segmentation. In the second model, the hand sign trajectory is tracked using Part Affinity Fields based on the OpenPose Skeleton Model [1, 2]. A comparison of the two models shows that the second provides improved tracking accuracy and robustness. The pattern differences in facial and trajectory motion data obtained from the presented models will be beneficial not only for dementia screening in deaf individuals, but also for the assessment of other acquired neurological impairments associated with motor changes, for example stroke and Parkinson's disease.
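As a rough illustration of the first model's skin-colour segmentation approach, the sketch below extracts a per-frame hand centroid from an ordinary 2D video with OpenCV. The YCrCb thresholds, morphological cleanup, and centroid-based trajectory are illustrative assumptions rather than the authors' exact pipeline, and the OpenPose-based second model is not reproduced here.

```python
# Minimal sketch of skin-colour-segmentation-based hand trajectory tracking
# (illustrative only; thresholds and post-processing are assumptions, not the
# parameters used in the paper).
import cv2
import numpy as np

def hand_trajectory(video_path):
    """Return a list of (frame_index, x, y) centroids of the largest skin-coloured region."""
    # Commonly used YCrCb skin-colour bounds; real systems tune these per dataset.
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    cap = cv2.VideoCapture(video_path)
    trajectory, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
        mask = cv2.inRange(ycrcb, lower, upper)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            hand = max(contours, key=cv2.contourArea)
            m = cv2.moments(hand)
            if m["m00"] > 0:
                trajectory.append((frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]))
        frame_idx += 1
    cap.release()
    return trajectory
```

Frame-to-frame displacement of these centroids, divided by the frame interval, gives an estimate of signing speed, one component of the sign language space envelope described above.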
Project description: Under natural conditions, listeners use both auditory and visual speech cues to extract meaning from speech signals containing many sources of variability. However, traditional clinical tests of spoken word recognition routinely employ isolated words or sentences produced by a single talker in an auditory-only presentation format. The more central cognitive processes involved in multimodal integration, perceptual normalization, and lexical discrimination, which may contribute to individual variation in spoken word recognition performance, are not assessed by conventional tests of this kind. In this article, we review our past and current research aimed at developing a series of new assessment tools designed to evaluate spoken word recognition in children who are deaf or hard of hearing. These measures are theoretically motivated by a current model of spoken word recognition and also incorporate "real-world" stimulus variability in the form of multiple talkers and presentation formats. The goal of this research is to enhance our ability to estimate real-world listening skills and to predict benefit from sensory aid use in children with varying degrees of hearing loss.
Project description: The purpose of this study was to compare the developmental trajectories of oral language acquisition across the preschool years in children who are deaf and hard of hearing (DHH) and children with typical hearing. Thirty children who are DHH and who use amplification and spoken language, and 31 children with typical hearing, completed an early language and literacy assessment battery every six months from age 4 to age 6. The developmental trajectories of each group's language skills were examined via growth curve analysis. Oral language skills were lower for children who are DHH than for children with typical hearing at study entry. For vocabulary, children who are DHH demonstrated growth over the two years but did not close the performance gap over time. For morphosyntax, specifically verb tense marking, children who are DHH demonstrated growth over the preschool period, becoming more adult-like in their productions.
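For readers unfamiliar with growth curve analysis, the snippet below shows one common way to fit such a model: a mixed-effects regression with a random intercept and age slope per child, using statsmodels. The data file, column names, and model terms are hypothetical placeholders, not the study's actual specification.

```python
# Illustrative growth curve analysis: vocabulary score modelled as a function of
# (centred) age, hearing group, and their interaction, with a random intercept
# and age slope for each child. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

long_df = pd.read_csv("language_scores_long.csv")  # one row per child per assessment wave

model = smf.mixedlm(
    "vocab_score ~ age_centred * group",   # fixed effects: growth rate and group differences
    data=long_df,
    groups=long_df["child_id"],            # repeated measures nested within children
    re_formula="~age_centred",             # random intercept and random age slope
).fit()
print(model.summary())
```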
Project description: Background: Approximately 235,000 deaf and hard of hearing (DHH) people live in Germany. Due to communication barriers, medical care for this group is difficult in many respects. Especially in the case of acute illnesses, the possibilities for communication, e.g., through sign language interpreters, are limited. This study investigates the satisfaction of DHH patients with medical care in Germany in unplanned medical consultations. The aim of this study is to provide insights into DHH patients' perceptions of medical care and to identify barriers and avoidance behaviours that stem from fears, miscommunication, and prior experiences. Methods: We obtained data from adult DHH participants throughout Germany between February and April 2022 via an online survey in German Sign Language. The responses of N = 383 participants (65% female, M = 44 years, SD = 12.70 years) were included in the statistical analyses. Outcomes were conviction of receiving help, satisfaction with healthcare provision, and avoidance of healthcare visits; further variables were concerns during healthcare visits, incidences of miscommunication, and a communication score. We calculated t-tests, ANOVAs, correlations, and linear and logistic regression analyses. Results: Our main findings show that (1) DHH patients were dissatisfied with the healthcare provided (M = 3.88; SD = 2.34; range 0-10); (2) DHH patients reported many concerns, primarily about communication and treatment aspects, when visiting a doctor; and (3) 57% of participants deliberately avoided doctor visits even though they experienced symptoms. Factors such as concerns during doctor's visits (B = -0.18; 95% CI: -0.34 to -0.02; p = .027) and miscommunication with medical staff (B = -0.19; 95% CI: -0.33 to -0.06; p = .006) were associated with satisfaction with medical care, whereas we found almost no associations with gender and location, and only a few with age and education. Conclusions: Overall, our findings suggest that DHH patients are dissatisfied with the healthcare provided, deliberately avoid doctor visits, and face various communication barriers. This study revealed several communication-related determinants of satisfaction with healthcare in DHH patients, such as incidences of miscommunication and the communication score. Communication-related barriers have high potential to be addressed in collaboration with the DHH community. To improve medical care and healthcare satisfaction for DHH patients, training for healthcare professionals, digital technologies, and other communication-enhancing interventions should be explored in future intervention studies.
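To illustrate the kind of linear regression behind the unstandardized B coefficients and 95% confidence intervals reported above, the sketch below regresses satisfaction on concern and miscommunication scores with pandas and statsmodels. The file and column names are hypothetical placeholders rather than the study's actual variables or covariate set.

```python
# Illustrative linear regression of satisfaction with medical care on concerns
# during doctor visits and miscommunication with staff (hypothetical columns;
# not the authors' analysis script).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dhh_survey.csv")  # one row per survey participant

model = smf.ols(
    "satisfaction ~ concerns + miscommunication + age + education",
    data=df,
).fit()
print(model.params)                 # unstandardized B coefficients
print(model.conf_int(alpha=0.05))   # 95% confidence intervals
print(model.pvalues)                # p-values for each predictor
```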
Project description: Hearing parents of deaf children face difficult decisions about what language(s) to use with their child. Sign languages such as American Sign Language (ASL) are fully accessible to deaf children, yet most hearing parents are not proficient in ASL before having a deaf child. Parents are often discouraged from learning ASL, based in part on an assumption that it will be too difficult, yet there is little evidence supporting this claim. In this mixed-methods study, we surveyed hearing parents of deaf children (n = 100) who had learned ASL to find out more about their experiences. In their survey responses, parents identified a range of resources that supported their ASL learning, as well as frequent barriers. Parents identified strongly with belief statements indicating the importance of ASL and affirmed that learning ASL is attainable for hearing parents. We discuss the implications of this study for parents who are considering ASL as a language choice and for the professionals who guide them.