Project description: Background: Certain facial configurations are believed to be associated with distinct affective meanings (i.e. basic facial expressions), and such associations are common across cultures (i.e. the universality of facial expressions). Recently, however, many studies have suggested that various types of contextual information, rather than the facial configuration itself, are important factors in facial emotion perception. Methodology/principal findings: To examine systematically how contextual information influences facial emotion perception, the present study directly estimated observers' perceptual thresholds for detecting negative facial expressions via a forced-choice psychophysical procedure, using faces embedded in various emotional contexts. We additionally measured individual differences in affective information-processing tendency (BIS/BAS) as a possible factor determining the extent to which contextual information is used in facial emotion perception. Contextual information influenced observers' perceptual thresholds for facial emotion. Importantly, individuals' affective information-processing tendencies modulated the extent to which they incorporated contextual information into their facial emotion perception. Conclusions/significance: These findings suggest that facial emotion perception depends not only on facial configuration but also on the context in which the face appears, and that this contextual influence varies with individuals' information-processing characteristics. In summary, individual traits, as well as facial configuration and the context in which a face appears, need to be taken into account in the study of facial emotion perception.
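A forced-choice threshold procedure of this kind is typically analyzed by fitting a psychometric function to proportion-correct data and reading off the stimulus intensity at a criterion performance level. The sketch below is an illustration only, not the study's analysis: the logistic form, guess/lapse rates, grid ranges, and data values are all assumptions, and the fit is a simple least-squares grid search in plain Python.

```python
import math

def logistic(x, threshold, slope, guess=0.5, lapse=0.02):
    """2AFC psychometric function: P(correct) rises from the guess
    rate (0.5) toward 1 - lapse as stimulus intensity x increases."""
    p = 1.0 / (1.0 + math.exp(-slope * (x - threshold)))
    return guess + (1.0 - guess - lapse) * p

def fit_threshold(intensities, p_correct, criterion=0.75):
    """Least-squares grid-search fit; returns the intensity at which
    the fitted curve crosses the criterion performance level."""
    best_err, best_t, best_s = None, None, None
    for t in [i / 100 for i in range(101)]:       # candidate thresholds
        for s in [k / 2 for k in range(1, 41)]:   # candidate slopes
            err = sum((logistic(x, t, s) - p) ** 2
                      for x, p in zip(intensities, p_correct))
            if best_err is None or err < best_err:
                best_err, best_t, best_s = err, t, s
    # invert the fitted curve at the criterion level
    guess, lapse = 0.5, 0.02
    q = (criterion - guess) / (1.0 - guess - lapse)
    return best_t + math.log(q / (1.0 - q)) / best_s

# Invented data: proportion of correct "negative" responses at
# increasing expression intensities (0 = neutral, 1 = full expression).
xs = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
ps = [0.50, 0.52, 0.58, 0.70, 0.83, 0.92, 0.97, 0.99]
print(round(fit_threshold(xs, ps), 2))
```

In practice such fits use maximum-likelihood estimation (e.g. via dedicated psychophysics toolboxes), and a threshold would be estimated separately for each emotional context and compared across contexts and BIS/BAS groups.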
Project description: Purpose: The aim of this study was to determine the feasibility and effectiveness of a 25-week school-based intervention and its ability to improve interoception and emotion regulation in an autistic pediatric population. Method: A one-group pre- and posttest design implementing The Interoception Curriculum: A Guide to Developing Mindful Self-Regulation in a self-contained school. Participants were 14 students (11 male, 3 female) between 9 and 19 years old. The Behavior Rating Inventory of Executive Function 2 (BRIEF-2) and the Caregiver Questionnaire for Interoceptive Awareness-2nd Edition (CQIA-2) were used to determine changes in interoceptive awareness and emotion regulation. Results: Statistically significant improvements were found between preintervention and postintervention scores for both interoceptive awareness and emotion regulation. Conclusion: This was the first study to examine the Interoception Curriculum in its entirety, providing evidence that its use is feasible in a school setting and suggesting that the intervention is effective for improving interoception. Findings also suggest that this improvement in interoception is related to improved emotion regulation in an autistic pediatric population.
Project description: Facial expressions of emotion are produced by contracting and relaxing the muscles of the face. I hypothesize that the human visual system solves the inverse problem of production: to interpret an emotion, the visual system attempts to identify the underlying muscle activations. I show converging computational, behavioral and imaging evidence in favor of this hypothesis. I detail the computations performed by the human visual system to decode these facial actions and identify a brain region where these computations likely take place. The resulting computational model explains how humans readily classify emotions into categories as well as along continuous variables. The model also predicts the existence of a large number of previously unknown facial expressions, including compound emotions, affect attributes and mental states that are regularly used by people. I provide evidence in favor of this prediction.
Project description: BACKGROUND: Whether individuals with anorexia nervosa (AN) are able to accurately perceive emotions from the faces of others is unclear. Furthermore, whether individuals with AN process images of their own face differently from healthy individuals has thus far not been investigated. Therefore, the aim of this study was to investigate facial affect processing and the processing of one's own face through measures of emotion identification, functional magnetic resonance imaging (fMRI) and eyetracking. METHODS: Twenty-four females with AN and 25 matched healthy control participants were presented with an implicit emotion processing task during fMRI and eyetracking, followed by an explicit emotion identification task. RESULTS: The AN group was found to 'hyperscan' stimuli and to avoid visually attending to salient features of their own face images. fMRI results revealed increased activity to own-face stimuli in AN in the right inferior and middle temporal gyri and the right lingual gyrus. AN participants did not display emotion identification deficits for the standard emotional face stimuli. DISCUSSION: The findings are discussed in terms of increased anxiety to disorder-relevant stimuli in AN. Potential clinical implications are discussed in relation to the use of eyetracking techniques to improve the perception of self in AN.
Project description: Multiple psychiatric disorders are associated with difficulties in facial emotion recognition. However, generalized anxiety disorder may be associated with more accurate recognition of others' emotional expressions, particularly expressions of happiness and fear, which index safety and threat. Children aged 9-14 from a community sample (N = 601) completed a facial emotion labeling task. Children's symptoms of depressive and anxiety syndromes were assessed by self- and parent-report. Elevated symptoms of generalized anxiety disorder were associated with more accurate facial emotion recognition (β = 0.16, p = 0.007), specifically recognition of happiness (β = 0.17, p = 0.002) and fear (β = 0.15, p = 0.006). Elevated depressive symptoms were associated with less accurate facial emotion recognition (β = -0.12, p = 0.018), specifically happiness (β = -0.15, p = 0.002). Elevated symptoms of separation anxiety disorder were also associated with less accurate facial emotion recognition (β = -0.16, p = 0.003), specifically happiness (β = -0.15, p = 0.006) and fear (β = -0.15, p = 0.005), which highlights the importance of distinguishing between anxiety syndromes. Results held when adjusting for child age and sex. Evidence that symptoms of generalized anxiety disorder are associated with more accurate recognition of happiness and fear is consistent with theories of heightened social vigilance and supports a transdiagnostic role of facial emotion recognition that may inform the psychosocial development of youth with anxiety and depressive symptoms.
Project description: BACKGROUND: Facial emotion perception is a major social skill, but its molecular signaling pathway remains unclear. The MET/AKT cascade affects neurodevelopment in general populations and face recognition in patients with autism. This study explores the possible role of the MET/AKT cascade in facial emotion perception. METHODS: One hundred and eighty-two unrelated healthy volunteers (82 men and 100 women) were recruited. Four single nucleotide polymorphisms (SNPs) of MET (rs2237717, rs41735, rs42336, and rs1858830) and AKT rs1130233 were genotyped and tested for their effects on facial emotion perception, which was assessed by the face task of the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT). Neurocognitive functions were also thoroughly assessed. RESULTS: Regarding MET rs2237717, individuals with the CT genotype performed better in facial emotion perception than those with TT (p = 0.016 by ANOVA; 0.018 by a general linear regression model [GLM] controlling for age, gender, and education duration), and showed no difference from those with CC. Carriers of the most common MET CGA haplotype (frequency = 50.5%) performed better in facial emotion perception than non-carriers (p = 0.018, df = 1, F = 5.69; p = 0.009 by GLM). For the MET rs2237717/AKT rs1130233 interaction, the C carrier/G carrier group showed better facial emotion perception than the TT/AA genotype group (p = 0.035 by ANOVA; 0.015 by GLM), even when neurocognitive functions were controlled for (p = 0.046 by GLM). CONCLUSIONS: To our knowledge, this is the first study to suggest that genetic factors can affect facial emotion perception performance. The findings indicate that MET variants and the MET/AKT interaction may affect facial emotion perception, implying that the MET/AKT cascade plays a significant role in this ability. Further replication studies are needed.
Project description: Recent theoretical accounts argue that conceptual knowledge dynamically interacts with processing of facial cues, fundamentally influencing visual perception of social and emotion categories. Evidence is accumulating for the idea that a perceiver's conceptual knowledge about emotion is involved in emotion perception, even when stereotypic facial expressions are presented in isolation [1-4]. However, existing methods have not allowed a comprehensive assessment of the relationship between conceptual knowledge and emotion perception across individuals and emotion categories. Here we use a representational similarity analysis approach to show that conceptual knowledge predicts the representational structure of facial emotion perception. We conducted three studies using computer mouse-tracking [5] and reverse-correlation [6] paradigms. Overall, we found that when individuals believed two emotions to be conceptually more similar, faces from those categories were perceived with a corresponding similarity, even when controlling for any physical similarity in the stimuli themselves. When emotions were rated conceptually more similar, computer-mouse trajectories during emotion perception exhibited a greater simultaneous attraction to both category responses (despite only one emotion being depicted; studies 1 and 2), and reverse-correlated face prototypes exhibited a greater visual resemblance (study 3). Together, our findings suggest that differences in conceptual knowledge are reflected in the perceptual processing of facial emotion.
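Representational similarity analysis of the kind mentioned above compares the similarity structure of two domains by correlating the off-diagonal entries of their similarity matrices. A minimal sketch in plain Python, using entirely hypothetical conceptual-rating and perceptual-confusion matrices over four emotion categories (none of these values come from the study):

```python
def upper_triangle(m):
    """Strictly upper-triangular entries of a square matrix, flattened."""
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(a, b):
    """Pearson correlation between two equal-length vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Hypothetical pairwise similarities over four emotion categories
# (anger, disgust, fear, happiness): one matrix from conceptual
# judgments, one from perceptual confusions. Higher = more similar.
conceptual = [[1.0, 0.8, 0.6, 0.1],
              [0.8, 1.0, 0.5, 0.2],
              [0.6, 0.5, 1.0, 0.1],
              [0.1, 0.2, 0.1, 1.0]]
perceptual = [[1.0, 0.7, 0.5, 0.2],
              [0.7, 1.0, 0.6, 0.2],
              [0.5, 0.6, 1.0, 0.3],
              [0.2, 0.2, 0.3, 1.0]]

# RSA statistic: correlation between the two similarity structures
r = pearson(upper_triangle(conceptual), upper_triangle(perceptual))
print(round(r, 2))
```

In full RSA pipelines a rank (Spearman) correlation is often preferred, and physical stimulus similarity is partialed out, as the study describes when controlling for similarity in the stimuli themselves.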
Project description: Background: Impairment in facial emotion perception is an important domain of social cognition deficits in schizophrenia. Although impaired facial emotion perception has been found in individuals with negative schizotypy (NS), little is known about the corresponding changes in brain functional connectivity. Methods: Sixty-four participants were classified into a high NS group (n = 34) and a low NS group (n = 30) based on their total scores on the Chapman scales for physical and social anhedonia. All participants undertook a facial emotion discrimination functional imaging task that comprised four emotional valences (angry, fearful, happy, and neutral). For the univariate analysis, the signal change at the bilateral amygdala was compared for each emotional contrast using SPSS (P < .05). For the functional connectivity analysis, we calculated the beta-series functional connectivity of the bilateral amygdala with the medial prefrontal cortex (mPFC) and compared group differences in SPM12 (P < .05, small-volume family-wise error correction). Results: No significant differences were found between the high and low NS groups in accuracy or reaction time on the facial emotion discrimination task. The high NS group showed reduced brain activation at the amygdala under the fearful and neutral conditions, as well as reduced functional connectivity between the amygdala and the mPFC/dorsal anterior cingulate cortex under the happy and fearful conditions. Conclusions: Our findings suggest that individuals with high NS show altered brain activity and functional connectivity at the amygdala during facial emotion processing, providing new evidence for understanding social cognition deficits in at-risk individuals.
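Beta-series functional connectivity, as used in the analysis above, estimates one activation (beta) per trial per region and then correlates the trial-wise beta series between regions; the resulting r is usually Fisher z-transformed before group statistics. A toy sketch of the correlation step in plain Python (the beta values below are invented; in practice they come from a trial-wise GLM fit to the fMRI time series):

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def beta_series_connectivity(betas_roi1, betas_roi2):
    """Correlate trial-wise beta estimates between two regions and
    apply the Fisher r-to-z transform for group-level statistics."""
    return math.atanh(pearson(betas_roi1, betas_roi2))

# Invented trial-wise betas for the amygdala and mPFC under one
# condition (e.g. fearful faces), one value per trial.
amygdala = [0.9, 1.4, 0.7, 1.1, 1.6, 0.8, 1.2, 1.0]
mpfc = [0.8, 1.2, 0.6, 1.0, 1.5, 0.9, 1.1, 0.9]
print(round(beta_series_connectivity(amygdala, mpfc), 2))
```

Group differences in these z values (here, high vs low NS, per emotional condition) would then be tested voxel-wise with small-volume correction, as the study reports.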
Project description: Emotional content is particularly salient, but situational factors such as cognitive load may disturb the attentional prioritization of affective stimuli and interfere with their processing. In this study, the perception of affective prosodies by 31 autistic and 31 typically developing children was assessed via event-related spectral perturbations of neuronal oscillations recorded by electroencephalography, under attentional-load modulations induced by Multiple Object Tracking or neutral images. Although an intermediate load optimized emotion processing in typically developing children, load and emotion did not interact in children with autism. Results also outlined impaired emotional integration, reflected in theta, alpha and beta oscillations at early and late stages, and lower attentional ability, indexed by tracking capacity. Furthermore, both tracking capacity and the neuronal patterns of emotion perception during the task were predicted by daily-life autistic behaviors. These findings highlight that an intermediate load may encourage emotion processing in typically developing children, whereas autism is associated with impaired affective processing and selective attention, both insensitive to load modulations. Results are discussed within a Bayesian perspective that suggests atypical updating of precision between sensations and hidden states, leading to poor contextual evaluations. For the first time, implicit emotion perception assessed by neuronal markers was integrated with environmental demands to characterize autism.
Project description:Some researchers have argued that normal human observers can exhibit "blindsight-like" behavior: the ability to discriminate or identify a stimulus without being aware of it. However, we recently used a bias-free task to show that what looks like blindsight may in fact be an artifact of typical experimental paradigms' susceptibility to response bias. While those findings challenge previous reports of blindsight in normal observers, they do not rule out the possibility that different stimuli or techniques could still reveal perception without awareness. One intriguing candidate is emotion processing, since processing of emotional stimuli (e.g. fearful/happy faces) has been reported to potentially bypass conscious visual circuits. Here we used the bias-free blindsight paradigm to investigate whether emotion processing might reveal "featural blindsight," i.e. ability to identify a face's emotion without introspective access to the task-relevant features that led to the discrimination decision. However, we saw no evidence for emotion processing "featural blindsight": as before, whenever participants could identify a face's emotion they displayed introspective access to the task-relevant features, matching predictions of a Bayesian ideal observer. These results add to the growing body of evidence that perceptual discrimination ability without introspective access may not be possible for neurologically intact observers.