Similar articles (20 results)
1.
According to theories of emotional complexity, individuals low in emotional complexity encode and represent emotions in visceral or action-oriented terms, whereas individuals high in emotional complexity encode and represent emotions in a differentiated way, using multiple emotion concepts. During functional magnetic resonance imaging, participants viewed valenced animated scenarios of simple ball-like figures while attending either to social or to spatial aspects of the interactions. Participants' emotional complexity was assessed using the Levels of Emotional Awareness Scale. We found that a distributed set of brain regions previously implicated in processing emotion from facial, vocal, and bodily cues, in processing social intentions, and in emotional response was sensitive to emotion conveyed by motion alone. Attention to social meaning amplified the influence of emotion in a subset of these regions. Critically, increased emotional complexity correlated with enhanced processing in a left temporal polar region implicated in detailed semantic knowledge; with a diminished effect of social attention; and with increased differentiation of brain activity between films of differing valence. Decreased emotional complexity was associated with increased activity in regions of pre-motor cortex. Thus, neural coding of emotion in semantic versus action systems varies as a function of emotional complexity, helping reconcile puzzling inconsistencies in neuropsychological investigations of emotion recognition.

2.
Patients with borderline personality disorder (BPD) exhibit impairment in labeling facial emotional expressions. However, it is not clear whether these deficits affect the whole domain of basic emotions, are valence-specific, or are specific to individual emotions. Whether BPD patients' errors in a facial emotion recognition task form a specific pattern also remains to be elucidated. Our study tested two hypotheses: first, that the emotion perception impairment in BPD is specific to the negative emotion domain; and second, that BPD patients would show error patterns in a facial emotion recognition task more commonly and more systematically than healthy comparison subjects. Participants comprised 33 inpatients with BPD and 32 matched healthy control subjects, who performed a computerized version of the Ekman 60 Faces test. The indices of emotion recognition and the direction of errors were processed in separate analyses. Clinical symptoms and personality functioning were assessed using the Symptom Checklist-90-Revised and the Young Schema Questionnaire Long Form. Results showed that patients with BPD were less accurate than control participants in emotion recognition, in particular in the discrimination of negative emotions, while they were not impaired in the recognition of happy facial expressions. In addition, relative to controls, patients over-attributed disgust and surprise and under-attributed fear to the facial expressions. These findings suggest the importance of carefully considering error patterns, besides measuring recognition accuracy, especially among emotions with negative affective valence, when assessing facial affect recognition in BPD.

3.
The importance of the right hemisphere in emotion perception in general is well documented, but its precise role is disputed. We compared the performance of 30 right hemisphere damaged (RHD) patients, 30 left hemisphere damaged (LHD) patients, and 50 healthy controls on both facial and vocal affect perception tasks for specific emotions. Brain damaged subjects had a single-episode cerebrovascular accident localised to one hemisphere. The results showed that right hemisphere patients were markedly impaired relative to left hemisphere patients and healthy controls on test performance: labelling and recognition of facial expressions and recognition of emotions conveyed by prosody. This held at the level of individual basic emotions, of positive versus negative valence, and of emotional expressions in general. The impairment remained highly significant despite covarying for the group's poorer accuracy on a neutral facial perception test and on identification of neutral vocal expressions. The LHD group was only impaired relative to controls on facial emotion tasks when performance was summed over all emotion categories, and before age and other cognitive factors were taken into account. On the prosody test, however, the LHD patients showed significant impairment, performing mid-way between the right hemisphere patients and the healthy comparison group. Recognition of positive emotional expressions was better than negative in all subjects, and was not disproportionately poorer in the LHD patients. Recognition of individual emotions in one modality correlated only weakly with recognition in another, in all three groups. These data confirm the primacy of the right hemisphere in processing emotional expressions across modalities, both positive and negative, but suggest that left hemisphere emotion processing is modality specific. It is possible that the left hemisphere has a particular role in the perception of emotion conveyed through meaningful speech.

4.
Objective: Existing single-case studies have reported deficits in recognizing basic emotions through facial expressions with unaffected performance on body expressions, but not the opposite pattern. The aim of this paper is to present a case study with impaired emotion recognition through body expressions and intact performance with facial expressions. Methods: In this single-case study we assessed a 30-year-old patient with autism spectrum disorder, without intellectual disability, and a healthy control group (n = 30) on four tasks of basic and complex emotion recognition through face and body movements, and two non-emotional control tasks. To analyze the dissociation between facial and body expressions, we used Crawford and Garthwaite's operational criteria, and we compared the patient's and the control group's performance with a modified one-tailed t-test designed specifically for single-case studies. Results: There were no statistically significant differences between the patient's and the control group's performances on the non-emotional body movement task or the facial perception task. For both kinds of emotions (basic and complex), when the patient's performance was compared to the control group's, statistically significant differences were observed only for the recognition of body expressions. There were no significant differences between the patient's and the control group's correct answers for emotional facial stimuli. Conclusions: Our results showed a profile of impaired emotion recognition through body expressions and intact performance with facial expressions. This is the first case study describing this kind of dissociation pattern between facial and body expressions of basic and complex emotions.
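The modified one-tailed t-test mentioned above, commonly attributed to Crawford and Howell for comparing a single case against a small control sample, can be sketched as follows. The scores and task are illustrative placeholders, not the study's actual data:

```python
from math import sqrt
from statistics import mean, stdev

def single_case_t(patient_score, control_scores):
    """Crawford-Howell style t statistic: treats the patient as a sample
    of one drawn from the control population; df = n - 1."""
    n = len(control_scores)
    m, s = mean(control_scores), stdev(control_scores)
    return (patient_score - m) / (s * sqrt((n + 1) / n))

# Illustrative scores on a hypothetical body-expression recognition task.
controls = [10, 12, 11, 13, 9, 12, 11, 10, 12, 11]
t = single_case_t(5, controls)  # strongly negative: patient well below controls
```

The resulting t is evaluated against a t distribution with n − 1 degrees of freedom (one-tailed), which keeps the false-positive rate controlled even with very small control samples.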

5.
The recognition of facial expressions of emotion is impaired in semantic dementia (SD) and is associated with right-sided brain atrophy in areas known to be involved in emotion processing, notably the amygdala. Whether patients with SD also experience difficulty recognizing emotions conveyed by other media, such as music, is unclear. Prior studies have used excerpts of known music from the classical or film repertoire, but not unfamiliar melodies designed to convey distinct emotions. Patients with SD (n = 11), patients with Alzheimer's disease (n = 12), and healthy control participants (n = 20) underwent tests of emotion recognition in two modalities (unfamiliar musical tunes and unknown faces), as well as volumetric MRI. Patients with SD were the most impaired in the recognition of both facial and musical emotions, particularly negative emotions. Voxel-based morphometry showed that the labelling of emotions, regardless of modality, correlated with the degree of atrophy in the right temporal pole, amygdala, and insula. The recognition of musical (but not facial) emotions was also associated with atrophy of the left anterior and inferior temporal lobe, which overlapped with regions correlating with standardized measures of verbal semantic memory. These findings highlight the common neural substrates supporting the processing of emotions from facial and musical stimuli, but also indicate that the recognition of emotions from music draws upon brain regions associated with semantics in language.

6.
Autism spectrum disorders (ASD) are characterized by early-onset qualitative impairments in reciprocal social development. However, whether individuals with ASD exhibit impaired recognition of facial expressions corresponding to basic emotions is debatable. To investigate subtle deficits in facial emotion recognition, we asked 14 children diagnosed with high-functioning autism/Asperger syndrome (HFA/AS) and 17 typically developing peers to complete a new, highly sensitive test of facial emotion recognition. The test stimuli comprised faces expressing increasing degrees of emotional intensity, slowly changing from a neutral expression to full-intensity happiness, sadness, surprise, anger, disgust, or fear. We assessed individual differences in the intensity of stimuli required to make accurate judgments about emotional expressions. We found that different emotions had different identification thresholds, and that the two groups were generally similar in the ordering of discrimination thresholds across the six basic expressions. It was easier for individuals in both groups to identify emotions that were relatively fully expressed (e.g., intensity > 50%). Compared with control participants, children with ASD generally required stimuli of significantly greater intensity to correctly identify anger, disgust, and fear expressions. These results suggest that individuals with ASD have not a general but rather a selective impairment in basic emotion recognition.

7.
Although evidence from primates suggests an important role for the anterior temporal cortex in social behaviour, human research has to date concentrated almost solely on the orbitofrontal cortex and amygdala. By describing four cases of the temporal variant of frontotemporal dementia, we show how this degenerative condition provides an excellent model for investigating the role of the anterior temporal lobe, especially the right, in emotion, empathy, and social behaviour. Semantic memory, processing of emotional facial expressions, and emotional prosody were assessed; empathy was measured; and facial expressions of emotion were coded. Of the two right-handers described, one subject with predominantly left temporal lobe atrophy had severe semantic impairment but normal performance on all emotional tasks. In contrast, the subject with right temporal lobe atrophy showed severely impaired recognition of emotion from faces and voices that was not due to semantic or perceptual difficulties. Empathy was lost, interpersonal skills were severely affected, and facial expression of emotion was characterized by a fixed expression that was unresponsive to situations. Additionally, two left-handers with right temporal lobe atrophy are described. One demonstrated the same pattern of hemispheric lateralization as the right-handers and had emotional impairment. The other left-hander showed the opposite pattern of deficits, suggesting a novel presentation of anomalous dominance with reversed hemispheric specialization for semantic memory and emotional processing.

8.
S. B. Hamann, R. Adolphs. Neuropsychologia, 1999, 37(10): 1135-1141
Bilateral damage to the amygdala in humans has been previously linked to two deficits in recognizing emotion in facial expressions: recognition of individual basic emotions, especially fear, and recognition of similarity among emotional expressions. Although several studies have examined recognition of individual emotions following amygdala damage, only one subject has been examined on recognition of similarity. To assess the extent to which deficits in recognizing similarity among facial expressions might be a general consequence of amygdala damage, we examined this ability in two subjects with complete bilateral amygdala damage. Both subjects had previously demonstrated entirely normal recognition of individual facial emotions. Here we report that these two patients also are intact in their ability to recognize similarity between emotional expressions. These results indicate that, like the recognition of individual basic emotions in facial expressions, the recognition of similarity among emotional expressions does not have an absolute dependence on the amygdala.

9.
Background: A plethora of research on facial emotion recognition in autism spectrum disorders (ASD) exists and has reported deficits in ASD compared to controls, particularly for negative basic emotions. However, these studies have largely used static, high-intensity stimuli. The current study investigated facial emotion recognition across three levels of expression intensity from videos, examining accuracy rates to identify impairments in facial emotion recognition and error patterns ('confusions') to explore potential underlying factors. Method: Twelve individuals with ASD (9 M/3 F; M(age) = 17.3) and 12 matched controls (9 M/3 F; M(age) = 16.9) completed a facial emotion recognition task including 9 emotion categories (anger, disgust, fear, sadness, surprise, happiness, contempt, embarrassment, pride) and neutral, each expressed by 12 encoders at low, intermediate, and high intensity. Results: A facial emotion recognition deficit was found overall for the ASD group compared to controls, as well as deficits in recognising individual negative emotions at varying expression intensities. Compared to controls, the ASD group showed significantly more, albeit typical, confusions between emotion categories (at high intensity), and significantly more confusions of emotions as 'neutral' (at low intensity). Conclusions: The facial emotion recognition deficits identified in ASD, particularly for negative emotions, are in line with previous studies using other types of stimuli. Error analysis showed that individuals with ASD had difficulty detecting emotional information in the face (sensitivity) at low intensity and correctly identifying emotional information (specificity) at high intensity. These results suggest different underlying mechanisms for the facial emotion recognition deficits at low versus high expression intensity.
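The accuracy-plus-confusion analysis described above can be sketched in a few lines. The trial data and function names here are illustrative, not the study's actual materials:

```python
from collections import Counter, defaultdict

# Hypothetical trial data for one participant: (intended emotion, chosen label).
trials = [
    ("anger", "anger"), ("anger", "disgust"),
    ("fear", "fear"), ("fear", "neutral"),
    ("sadness", "sadness"), ("sadness", "sadness"),
]

def emotion_accuracy(trials):
    """Proportion of correctly labelled trials per intended emotion."""
    total, correct = defaultdict(int), defaultdict(int)
    for intended, chosen in trials:
        total[intended] += 1
        correct[intended] += (intended == chosen)
    return {e: correct[e] / total[e] for e in total}

def confusions(trials):
    """Counts of each mislabelling, e.g. ('fear', 'neutral'): 1."""
    return Counter((i, c) for i, c in trials if i != c)

acc = emotion_accuracy(trials)
conf = confusions(trials)
```

Tallying mislabellings separately from accuracy is what lets a study distinguish, for instance, emotions confused as 'neutral' (a sensitivity failure) from emotions confused with other emotions (a specificity failure).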

10.
11.
Despite the high relevance of emotion processing for social functioning, the impairment of facial affect recognition in multiple sclerosis (MS) has received little attention. Previous research reported evidence for emotion processing deficits, but their nature and extent are not fully explained. Thirty-five MS patients underwent dedicated neuropsychological assessment of emotion processing using two facial affect recognition tasks and self-report measures of alexithymia; healthy participants served as controls. Relative to healthy controls, MS patients were impaired in facial affect recognition for four of the six Ekman basic emotions, all except happiness and disgust. The MS patients were also more alexithymic than the healthy controls. These data provide evidence for deficits in both the recognition of emotional facial expressions and emotional introspection in MS.

12.
Physiological hormonal fluctuations during the menstrual cycle, postpartum, and menopause have been implicated in the modulation of mood, cognition, and affective disorders. Given that women's performance in memory tasks can also fluctuate with circulating hormone levels across the menstrual cycle, we evaluated young women in different phases of the cycle on a delayed matching-to-sample working memory task for emotional facial expressions, using the six basic emotions as stimuli. Our findings suggest that high levels of estradiol in the follicular phase could have a negative effect on this delayed matching-to-sample task with emotionally valenced stimuli. Moreover, in the follicular phase, compared to the menstrual phase, the percentage of errors was significantly higher for the emotional facial expressions of sadness and disgust. Response times (time taken to answer) for each emotional facial expression differed significantly between the follicular and luteal phases for the expression of sadness. Our results show that high levels of estradiol in the follicular phase could impair working memory performance. However, this effect is specific to selective facial expressions, suggesting that across the phases of the menstrual cycle in which conception risk is high, women may give less importance to recognizing the emotional facial expressions of sadness and disgust. This study agrees with research on non-human primates showing that fluctuations of ovarian hormones across the menstrual cycle influence a variety of social and cognitive behaviors. Moreover, our data could also represent a useful tool for investigating emotional disturbances linked to menstrual cycle phases and menopause in women.

13.
AIM: To conduct a systematic literature review of the influence of gender on the recognition of facial expressions of six basic emotions. METHODS: We performed a systematic search with the terms (face OR facial) AND (processing OR recognition OR perception) AND (emotional OR emotion) AND (gender OR sex) in the PubMed, PsycINFO, LILACS, and SciELO electronic databases for articles assessing outcomes related to response accuracy, response latency, and emotional intensity. Article selection was performed according to parameters set by COCHRANE. The reference lists of the articles found through the database search were checked for additional references of interest. RESULTS: With respect to accuracy, women tend to perform better than men when all emotions are considered as a set. Regarding specific emotions, there seem to be no gender-related differences in the recognition of happiness, whereas results are quite heterogeneous for the remaining emotions, especially sadness, anger, and disgust. Fewer articles dealt with response latency and emotional intensity, which hinders the generalization of their findings, especially given their methodological differences. CONCLUSION: The analysis of the studies conducted to date does not allow for definite conclusions concerning the role of the observer's gender in the recognition of facial emotion, mostly because of the absence of standardized methods of investigation.

14.
15.
BACKGROUND: Persons suffering from schizophrenia have impaired perception of emotional expressions, but it is not clear whether this is part of a generalized deficit in cognitive function. AIM: To test for the existence of emotion-specific deficits by studying the effects of valence on the recognition of facial emotional expressions. METHODS: 24 male subjects suffering from chronic schizophrenia were examined with two tests of emotion perception: the Penn Emotion Acuity Test (PEAT 40) and the Emotion Differentiation Task (EMODIFF). Clinical state was assessed with the Scale for the Assessment of Negative Symptoms (SANS) and the Scale for the Assessment of Positive Symptoms (SAPS), visual memory with the Benton Visual Retention Test (BVRT), and motor function with the finger tapping test. RESULTS: Identification of happy facial expressions showed significant negative correlations with age, cumulative time in hospital, and length of current hospitalization; positive correlations were found with visual retention and finger tapping scores. Identification of sad facial expressions correlated significantly only with cumulative time in hospital, while identification of neutral facial expressions showed no significant correlations. Discrimination between degrees of happy, but not sad, facial expression showed a positive correlation with negative symptoms. CONCLUSION: Perception of happy and sad emotion relates differently to significant illness parameters. This dissociation supports the existence of an emotion-specific deficit in the perception of emotions in schizophrenia, and of separate channels for processing positive and negative emotions.

16.
Deep brain stimulation of the subthalamic nuclei (STN-DBS) for the treatment of levodopa-induced motor complications in advanced Parkinson's disease (APD) has been associated with neuropsychiatric disorders. It has been suggested that a postoperative decline in visual emotion recognition is responsible for those adverse events, although there is also evidence that emotional processing deficits can be present before surgery. The aim of the present study was to compare the ability to recognize emotions before and one year after surgery in APD. Methods: Consecutively operated APD patients were tested pre-operatively and one year after STN-DBS with the Comprehensive Affect Testing System (CATS), which evaluates visual recognition of 7 basic emotions (happiness, sadness, anger, fear, surprise, disgust, and neutral) from facial expressions and 4 emotions from prosody (happiness, sadness, anger, and fear). Results: In a sample of 30 patients, 6 had depression or apathy at baseline, a number that significantly increased to 14 post-surgery. There were no significant changes after STN-DBS in the tests of identity discrimination, discrimination of emotional faces, naming of emotional faces, recognition of emotional prosody, or naming of emotional prosody. The results of the emotion tests could not predict the development of neuropsychiatric symptoms. Discussion: This study does not support the hypothesis of an acquired change in emotion recognition, either from faces or from prosody, after STN-DBS in APD patients. Neuropsychiatric symptoms appearing after STN-DBS should not be attributed to new deficits in emotion recognition.

17.
Autism Spectrum Disorders (ASD) are characterised by social and communication impairment, yet evidence for deficits in the ability to recognise facial expressions of basic emotions is conflicting. Many studies reporting no deficits have used stimuli that may be too simple (with associated ceiling effects), for example, 100% 'full-blown' expressions. To investigate subtle deficits in facial emotion recognition, 21 adolescent males with high-functioning ASD and 16 age- and IQ-matched typically developing control males completed a new, sensitive test of facial emotion recognition that uses dynamic stimuli of varying intensities of expressions of the six basic emotions (Emotion Recognition Test; Montagne et al., 2007). Participants with ASD were found to be less accurate at processing the basic emotional expressions of disgust, anger, and surprise. Disgust recognition was the most impaired, at 100% intensity and lower levels, whereas recognition of surprise and anger was intact at 100% but impaired at lower levels of intensity.

18.
The Ekman 60-Faces (EK-60F) Test is a well-known neuropsychological tool assessing emotion recognition from facial expressions. It is the most widely employed task for research purposes in psychiatric and neurological disorders, including neurodegenerative diseases such as the behavioral variant of Frontotemporal Dementia (bvFTD). Despite its remarkable usefulness in the social cognition research field, to date there are no normative data for the Italian population, limiting its application in a clinical context. In this study, we report procedures and normative data for the Italian version of the test. One hundred and thirty-two healthy Italian participants aged between 20 and 79 years, with at least 5 years of education, were recruited on a voluntary basis. They were administered the EK-60F Test from the Ekman and Friesen series of Pictures of Facial Affect, after a preliminary semantic recognition test of the six basic emotions (i.e., anger, fear, sadness, happiness, disgust, surprise). Data were analyzed according to the Capitani procedure [1]. The regression analysis revealed significant effects of demographic variables, with younger, more educated, female subjects showing higher scores. The normative data were then applied to a sample of 15 bvFTD patients, who showed globally impaired performance on the task, consistent with the clinical condition. We provide EK-60F Test normative data for the Italian population, allowing the investigation of global emotion recognition ability as well as selective impairment in the recognition of basic emotions, for both clinical and research purposes.

19.
Social Neuroscience, 2013, 8(2): 241-251
According to the reverse simulation model of embodied simulation theory, we recognize others' emotions by subtly mimicking their expressions, which allows us to feel the corresponding emotion through facial feedback. Previous studies examining whether facial mimicry is necessary for facial expression recognition were limited by potentially distracting manipulations intended to artificially restrict facial mimicry or very small samples of people with facial paralysis. We addressed these limitations by collecting the largest sample to date of people with Moebius syndrome, a condition characterized by congenital bilateral facial paralysis. In this Internet-based study, 37 adults with Moebius syndrome and 37 matched control participants completed a facial expression recognition task. People with Moebius syndrome did not differ from the control group or normative data in emotion recognition accuracy, and accuracy was not related to extent of ability to produce facial expressions. Our results do not support the hypothesis that reverse simulation with facial mimicry is necessary for facial expression recognition.

20.

Objective

To investigate the ability of patients with myotonic dystrophy type 1 (DM-1) to recognise basic facial emotions. We also explored the relationship between facial emotion recognition, neuropsychological data, personality, and CTG repeat expansion data in the DM-1 group.

Methods

In total, 50 patients with DM-1 (28 women and 22 men) participated, along with 41 healthy controls. Recognition of facial emotional expressions was assessed using photographs of basic emotions. A set of tests measured cognition and personality dimensions, and CTG repeat size was quantified in blood lymphocytes.

Results

Patients with DM-1 showed impaired recognition of facial emotions compared with controls. A significant negative correlation was found between the total score for emotion recognition in a forced-choice task and CTG repeat size. Furthermore, specific cognitive functions (vocabulary, visuospatial construction ability, and speed) and personality dimensions (reward dependence and cooperativeness) correlated with scores on the forced-choice emotion recognition task.

Conclusion

These findings revealed a CTG repeat-dependent facial emotion recognition deficit in the DM-1 group, which was associated with specific neuropsychological functions. Furthermore, a correlation was found between facial emotion recognition ability and personality dimensions associated with sociability. This adds a new clinically relevant dimension to the cognitive deficits associated with DM-1.
