Project description:This work revisits the problem of breathing cues used for the management of speaking turns in multiparty casual conversation. We propose a new categorization of turn-taking events which combines the criterion of speaker change with whether the original speaker inhales before producing the next talkspurt. We demonstrate that the latter criterion could potentially serve as a proxy for the pragmatic completeness of the previous utterance (and, by extension, for the interruptive character of the incoming speech). We also present evidence that breath holds are used in reaction to incoming talk rather than as a turn-holding cue. In addition to analysing dimensions which are routinely omitted in studies of the interactional functions of breathing (exhalations, presence of overlapping speech, breath holds), the present study also examines patterns of breath holds in silent breathing and shows that breath holds are sometimes produced toward the beginning (and toward the top) of silent exhalations, potentially indicating an abandoned intention to take the turn. We claim that the breathing signal can thus be used to uncover hidden turn-taking events, which are otherwise obscured by silence-based representations of interaction.
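As a purely illustrative aside, the proposed two-way categorization amounts to crossing two binary criteria; the function name and category labels in the sketch below are hypothetical placeholders, not the labels used in the study.

```python
# Illustrative sketch of crossing the two binary criteria described above.
# The labels are invented placeholders, not the categories defined in the study.

def categorize_turn_event(speaker_changes: bool, original_speaker_inhales: bool) -> str:
    """Combine 'speaker change' with 'original speaker inhales before the next talkspurt'."""
    change = "speaker change" if speaker_changes else "same speaker continues"
    breath = ("with prior-speaker inhalation" if original_speaker_inhales
              else "without prior-speaker inhalation")
    return f"{change}, {breath}"

# Example: another speaker takes over although the original speaker has inhaled,
# suggesting (per the abstract) that the previous utterance may have been incomplete.
print(categorize_turn_event(speaker_changes=True, original_speaker_inhales=True))
```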
Project description:Conversations are an essential form of communication in daily family life. Specific patterns of caregiver-child conversation have been linked to children's socio-cognitive development and to the quality of children's relationships beyond the immediate family environment. Recently, interpersonal neural synchronization has been proposed as a neural mechanism supporting conversation. Here, we present a functional near-infrared spectroscopy (fNIRS) hyperscanning study examining the temporal dynamics of neural synchrony during mother-child conversation. Preschoolers (20 boys and 20 girls, mean age 5;07 years) and their mothers (mean age 36.37 years) were tested simultaneously with fNIRS hyperscanning while engaging in a free verbal conversation lasting 4 min. Neural synchrony (assessed with wavelet transform coherence analysis) was tracked over time. Furthermore, each conversational turn was coded for conversation patterns comprising turn-taking, relevance, contingency and intrusiveness. Results from linear mixed-effects modeling revealed that turn-taking, but not relevance, contingency or intrusiveness, predicted neural synchronization during the conversation over time. Results are discussed with regard to possible variables affecting parent-child conversation quality and the potential functional role of interpersonal neural synchronization in parent-child conversation.
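The turn-level analysis described above can be roughly sketched as follows. This is a minimal illustration on synthetic data with hypothetical column names (`synchrony`, `turn_taking`, `relevance`, `contingency`, `intrusiveness`, `dyad`); it is not the authors' actual pipeline.

```python
# Minimal sketch of a turn-level linear mixed-effects analysis (illustration only).
# Assumes one row per conversational turn with a wavelet-coherence-based synchrony
# value and codes for the four conversation patterns; all names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_dyads, n_turns = 40, 30
df = pd.DataFrame({
    "dyad": np.repeat(np.arange(n_dyads), n_turns),
    "turn_taking": rng.integers(0, 2, n_dyads * n_turns),
    "relevance": rng.integers(0, 2, n_dyads * n_turns),
    "contingency": rng.integers(0, 2, n_dyads * n_turns),
    "intrusiveness": rng.integers(0, 2, n_dyads * n_turns),
})
# Synthetic per-turn synchrony values (e.g., mean wavelet transform coherence per turn).
df["synchrony"] = 0.3 + 0.05 * df["turn_taking"] + rng.normal(0, 0.1, len(df))

# Fixed effects for the four conversation-pattern codes, random intercept per dyad.
model = smf.mixedlm(
    "synchrony ~ turn_taking + relevance + contingency + intrusiveness",
    data=df,
    groups=df["dyad"],
)
print(model.fit().summary())
```

Under this toy setup only the turn-taking coefficient is built to come out reliably positive, mirroring the pattern of results reported above.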
Project description:Behavioral coordination and synchrony contribute to a common biological mechanism that maintains communication, cooperation and bonding within many social species, such as primates and birds. Similarly, human language and social systems may also be attuned to coordination to facilitate communication and the formation of relationships. Gross similarities in movement patterns and convergence in the acoustic properties of speech have already been demonstrated between interacting individuals. In the present studies, we investigated how coordinated movements contribute to observers' perception of affiliation (friends vs. strangers) between two conversing individuals. We used novel computational methods to quantify motor coordination and demonstrated that individuals familiar with each other coordinated their movements more frequently. Observers used coordination to judge affiliation between conversing pairs but only when the perceptual stimuli were restricted to head and face regions. These results suggest that observed movement coordination in humans might contribute to perceptual decisions based on availability of information to perceivers.
Project description:Older adults' daily conversations with other older adults enable them to connect to their surrounding communities and strengthen their friendships. However, typical aging processes and fluctuations in family caregiving might change how they converse. The purpose of this study was to explore the quantitative contributions of conversation turns (CTs) and speaking roles (SRs) in Mandarin-Chinese-speaking conversation dyads between mutually familiar healthy older adults (HOAs). A total of 20 HOAs aged 65 or over were recruited. Each dyad conversed for ten minutes once a week for five weeks, five sessions per dyad, for a total of 50 sessions. The frequencies and percentages of the coded CTs and SRs contributed by each HOA were tallied and calculated individually. Contributions of CTs and SRs were quantitatively symmetrical in Mandarin-Chinese-speaking conversation dyads between mutually familiar HOAs. Although typical aging processes might change conversations, both Mandarin-Chinese-speaking HOAs served as active interlocutors, taking CTs and SRs to co-construct the processes and content of their dyadic conversations. Sufficient knowledge of how conversations are co-constructed might help provide more supportive environments in which older adults can connect to surrounding communities and strengthen their friendships.
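For illustration only, a per-speaker tally of the kind described above might look like the following; the code labels and data are invented for the sketch, not taken from the study's coding scheme.

```python
# Illustrative tally of per-speaker CT contributions in one dyad (invented codes/data).
import pandas as pd

# One row per coded unit: which HOA produced it and its conversation turn (CT) code.
coded = pd.DataFrame({
    "speaker": ["A", "B", "A", "B", "A", "B", "A", "A"],
    "ct_code": ["initiate", "respond", "respond", "initiate",
                "respond", "respond", "initiate", "respond"],
})

# Frequency and percentage of each CT code contributed by each speaker.
ct_counts = coded.groupby(["speaker", "ct_code"]).size().unstack(fill_value=0)
ct_percent = ct_counts.div(ct_counts.sum(axis=1), axis=0) * 100
print(ct_counts, ct_percent.round(1), sep="\n\n")
```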
Project description:Despite the emphasis on engaging in shared decision-making for decisions involving life-prolonging interventions, there remains uncertainty about which communication strategies best achieve shared decision-making. In this paper, we present the communication strategies used in a code status discussion from a single audio-recorded case, collected as part of a research study of how patients and physicians make decisions about the plan of care during daily rounds. When presenting this case at various forums to demonstrate our findings, we found that some clinicians viewed the communication strategies used in the case as an exemplar of shared decision-making, whereas other clinicians viewed them as perpetuating paternalism. Given this polarized reaction, the purpose of this perspective paper is to examine the communication strategies used in the code status discussion and compare those strategies with our current conceptualization of shared decision-making and communication best practices.
Project description:Allen James Wilcox was born on 30 September 1946 in Columbus, OH. He studied medicine at the University of Michigan, graduated in 1973, and, after a rotating internship, completed a master's degree in maternal and child health (1976) and a PhD in epidemiology (1979) at the University of North Carolina at Chapel Hill. After graduation, he went to work at the National Institute of Environmental Health Sciences (NIEHS, one of the US National Institutes of Health) in Durham, NC, where he has spent his career. He developed a research program in reproductive and perinatal epidemiology, a relatively unexplored area at the time. His studies include the Early Pregnancy Study, which documented the extent of subclinical pregnancy loss in humans and established the fertile days of a woman's menstrual cycle. He served as Chief of the Epidemiology Branch from 1991 to 2001, and as Editor-in-Chief of the journal EPIDEMIOLOGY from 2001 to 2014. His textbook, Fertility and Pregnancy: An Epidemiologic Perspective, was published by Oxford University Press in 2010. He was elected to the American Epidemiological Society in 1989 and served as its president in 2003. He also served as president of the Society for Pediatric and Perinatal Epidemiologic Research (1996) and of the Society for Epidemiologic Research (1998). He holds adjunct teaching appointments at the University of North Carolina, Harvard University, and the University of Bergen (Norway), which awarded him an honorary doctoral degree in 2008.
Project description:We report 2 experiments during which participants conversed with either a confederate (Experiment 1) or a close friend (Experiment 2) while tracking a moving target on a computer screen. In both experiments, talking led to worse performance on the tracking task than listening. We attribute this finding to the increased cognitive demands of speech planning and monitoring. Growth curve analyses of task performance during the beginning and end of conversation segments revealed dynamical changes in the impact of conversation on visuomotor task performance, with increasing impact during the beginning of speaking segments and decreasing impact during the beginning of listening segments. At the end of speaking and listening segments, this pattern reversed. These changes became more pronounced with increased difficulty of the task. Together, these results show that the planning and monitoring aspects of conversation require the majority of the attentional resources that are also used for nonlinguistic visuomotor tasks. The fact that similar results were obtained when conversing with both a confederate and a friend indicates that our findings apply to a wide range of conversational situations. This is the first study to show the fine-grained time course of the shifting attentional demands of conversation on a concurrently performed visuomotor task.
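A growth-curve-style analysis of the sort mentioned above can be sketched roughly as follows. The data are synthetic and the variable names (`error`, `time`, `segment`, `pid`) are hypothetical, so this illustrates only the general technique, not the authors' model.

```python
# Rough growth-curve sketch: tracking error over normalized time within a segment,
# with linear and quadratic time terms interacting with segment type.
# Synthetic data and hypothetical names; illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for pid in range(30):                          # hypothetical participants
    for seg_type in ("speaking", "listening"):
        t = np.linspace(0, 1, 20)              # normalized time within the segment
        # Toy pattern: impact grows early in speaking segments, shrinks early in listening ones.
        trend = 0.4 * t - 0.3 * t**2 if seg_type == "speaking" else -0.3 * t + 0.25 * t**2
        error = 1.0 + trend + rng.normal(0, 0.1, t.size)
        rows.append(pd.DataFrame({"pid": pid, "segment": seg_type, "time": t, "error": error}))
df = pd.concat(rows, ignore_index=True)

# Polynomial time terms and their interaction with segment type; random intercept per participant.
model = smf.mixedlm("error ~ (time + I(time**2)) * segment", data=df, groups=df["pid"])
print(model.fit().summary())
```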
Project description:Objective: To understand the impact of cost conversations on the following decision-making outcomes: patients' knowledge about their conditions and treatment options, decisional conflict, and patient involvement. Patients and methods: In 2020 we performed a secondary analysis of a randomly selected set of 220 video recordings of clinical encounters from trials run between 2007 and 2015. Videos were obtained from eight practice-based randomized trials and one pre-post prospective study comparing care with and without shared decision-making (SDM) tools. Results: The majority of trial participants were female (61%) and White (86%), with a mean age of 56 years, some college education (68%), and an income of $40,000 or more per year (75%); 52% did not participate in an encounter aided by an SDM tool. Cost conversations occurred in 106 encounters (48%). In encounters with SDM tools, having a cost conversation led to lower uncertainty scores (2.1 vs 2.6, P=.02) and higher knowledge (0.7 vs 0.6, P=.04) and patient involvement scores (20 vs 15.7, P=.009) than in encounters using SDM tools where cost conversations did not occur. In a multivariate model, we found slightly worse decisional conflict scores when patients, rather than clinicians, started cost conversations. Furthermore, we found higher levels of knowledge when conversations addressed indirect rather than direct cost issues. Conclusion: Cost conversations have a minimal but favorable impact on decision-making outcomes in clinical encounters, particularly when they occur in encounters aided by an SDM tool that raises cost as an issue.
Project description:Background and aims: Humans communicate primarily through spoken language, and speech perception is a core function of the human auditory system. Among the autistic community, atypical sensory reactivity and social communication difficulties are pervasive, yet the research literature lacks in-depth self-report data on speech perception in this population. The present study aimed to elicit detailed first-person accounts of autistic individuals' abilities and difficulties perceiving the spoken word. Methods: Semi-structured interviews were conducted with nine autistic adults. The interview schedule addressed interviewees' experiences of speech perception, factors influencing those experiences, and responses to those experiences. Resulting interview transcripts underwent thematic analysis. The six-person study team included two autistic researchers, to reduce the risk of neurotypical 'overshadowing' of autistic voices. Results: Most interviewees reported pronounced difficulties perceiving speech in the presence of competing sounds. They emphasised that such listening difficulties are distinct from social difficulties, though the two can add and interact. Difficulties were of several varieties, ranging from powerful auditory distraction to drowning out of voices by continuous sounds. Contributing factors encompassed not only features of the soundscape but also non-acoustic factors such as multisensory processing and social cognition. Participants also identified compounding factors, such as lack of understanding of listening difficulties. Impacts were diverse and sometimes disabling, affecting socialising, emotions, fatigue, career, and self-image. A wide array of coping mechanisms was described. Conclusions: The first in-depth qualitative investigation of autistic speech-perception experiences has revealed diverse and widespread listening difficulties. These can combine with other internal, interpersonal, and societal factors to induce profound impacts. Lack of understanding of such listening difficulties - by the self, by communication partners, by institutions, and especially by clinicians - appears to be a crucial exacerbating factor. Many autistic adults have developed coping strategies to lessen speech-perception difficulties or mitigate their effects, and these are generally self-taught due to lack of clinical support. Implications: There is a need for carefully designed, adequately powered confirmatory research to verify, quantify, and disentangle the various forms of listening difficulty, preferably using large samples to explore heterogeneity. More immediate benefit might be obtained through development of self-help and clinical guidance materials, and by raising awareness of autistic listening experiences and needs among the autistic community, communication partners, institutions, and clinicians.