Similar Documents (20 results)
1.
Social Neuroscience, 2013, 8(6): 705-716
ABSTRACT

There is compelling evidence that semantic memory is involved in emotion recognition. However, its contribution to the recognition of emotional valence and basic emotions remains unclear. We compared the performance of 10 participants with the semantic variant of primary progressive aphasia (svPPA), a clinical model of semantic memory impairment, to that of 33 healthy participants using three experimental tasks assessing the recognition of: (1) emotional valence conveyed by photographic scenes, (2) basic emotions conveyed by facial expressions, and (3) basic emotions conveyed by vocal prosody. Individuals with svPPA showed significant deficits in the recognition of emotional valence and basic emotions (except happiness and surprise conveyed by facial expressions). However, the performance of the two groups was comparable when performance on tests assessing semantic memory was added as a covariate in the analyses. Altogether, these results suggest that semantic memory contributes to the recognition of emotional valence and basic emotions. By examining the recognition of emotional valence and basic emotions in individuals with selective semantic memory loss, our results contribute to the refinement of current theories on the role of semantic memory in emotion recognition.
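The covariate analysis described above (a group comparison re-run with semantic memory scores entered as a covariate, i.e., an ANCOVA) can be illustrated with a minimal sketch. All data, variable names, and effect sizes below are simulated assumptions, not the study's dataset or analysis script; the sketch only shows how a covariate can absorb a group difference.

```python
# Illustrative ANCOVA: does a group difference in emotion-recognition accuracy
# survive once semantic-memory performance is entered as a covariate?
# All data here are simulated; this is not the study's analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sv, n_hc = 10, 33  # group sizes reported in the abstract

# Simulate semantic-memory scores (lower in svPPA) and let emotion-recognition
# accuracy depend on semantic memory rather than on group membership per se.
sem_mem = np.concatenate([rng.normal(40, 5, n_sv), rng.normal(60, 5, n_hc)])
emo_acc = 0.8 * sem_mem + rng.normal(0, 5, n_sv + n_hc)
df = pd.DataFrame({
    "group": ["svPPA"] * n_sv + ["control"] * n_hc,
    "semantic_memory": sem_mem,
    "emotion_accuracy": emo_acc,
})

# Group effect without the covariate ...
print(sm.stats.anova_lm(smf.ols("emotion_accuracy ~ group", df).fit(), typ=2))
# ... and with semantic memory as a covariate (ANCOVA): the group term shrinks.
print(sm.stats.anova_lm(
    smf.ols("emotion_accuracy ~ group + semantic_memory", df).fit(), typ=2))
```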

2.
OBJECTIVES: Some studies have reported deficits in the perception of facial expressions among depressed individuals compared with healthy controls, while others have reported negative biases in expression perception. We examined whether altered perception of emotion reflects an underlying trait-like effect in affective disorder by examining facial expression perception in euthymic bipolar patients. METHODS: Sensitivity to six different facial expressions, as well as accuracy of emotion recognition, was examined among 17 euthymic bipolar patients and 17 healthy controls using an interactive computer program. RESULTS: No differences were found between euthymic bipolar patients and controls in terms of sensitivity to any particular emotion. Although initial analysis of the data suggested impairment in the recognition of fear among the patients, identification of this emotion was not relatively impaired compared with that of the other emotions. CONCLUSIONS: The study did not find any conclusive evidence for trait-like deficits in the perception of facially conveyed emotions in bipolar disorder. Altered perception of facial expressions that has been found to accompany depressed mood may instead reflect mood-congruent biases.

3.
Comprehension of affective and nonaffective prosody
We studied patients with damage to either the right hemisphere (RHD) or the left hemisphere (LHD) and control subjects to determine whether the RHD patients had a global or limited prosodic defect. Compared with LHD patients and controls, RHD subjects had decreased comprehension of emotional prosody. Both LHD and RHD groups had more impaired comprehension of propositional prosody than controls, but the RHD and LHD groups did not differ. The right hemisphere, therefore, seems to be dominant for comprehending emotional prosody but not propositional prosody.

4.
Intact recognition of emotional prosody following amygdala damage.
R Adolphs, D Tranel. Neuropsychologia, 1999, 37(11): 1285-1292
Bilateral damage to the amygdala in a variety of animal species can impair emotional reactions to stimuli in several sensory modalities. Such damage in humans impairs visual recognition of emotion in facial expressions, but possible impairments in modalities other than vision have not been sufficiently explored. We examined two subjects with complete bilateral amygdala damage, and seven with unilateral amygdala damage, on a standardized task of emotional prosody recognition. The data were compared to those from 15 brain-damaged and from 14 normal control subjects. One of the bilateral amygdala subjects, whose lesions were restricted to the amygdala, was entirely normal in recognizing emotion in prosody on all tasks; the other, whose damage also included substantial lesions in extra-amygdalar structures, especially in the right hemisphere, was normal on most, albeit not all, measures of emotional prosody recognition. We suggest that the human amygdala's role in recognizing emotion in prosody may not be as critical as it is for facial expressions, and that extra-amygdalar structures in the right hemisphere may be more important for recognizing emotional prosody. It remains possible that recognition of emotion in classes of auditory stimuli other than prosody will require the amygdala.

5.
Studies of patients with brain damage, as well as studies with normal subjects, have revealed that the right hemisphere is important for recognizing emotions expressed by faces and prosody. It is unclear, however, whether the knowledge needed to perform recognition of emotional stimuli is organized by modality or by the type of emotion. Thus, the purpose of this study is to assess these alternative a priori hypotheses. The participants of this study were 30 stroke patients with right hemisphere damage (RHD) and 31 normal controls (NC). Subjects were assessed with the Polish adaptation of the Right Hemisphere Language Battery of Bryan and the Facial Affect Recognition Test based on the work of Ekman and Friesen. RHD participants were significantly impaired on both emotional tasks. Whereas on the visual-faces task the RHD subjects recognized happiness better than anger or sadness, the reverse dissociation was found in the auditory-prosody test. These results confirm prior studies demonstrating the role of the right hemisphere in understanding facial and prosodic emotional expressions. These results also suggest that the representations needed to recognize these emotional stimuli are organized by modality (prosodic-echoic and facial-eidetic) and that some modality-specific features are more impaired than others.

6.
The disproportionate impairment for the recognition of facial expressions of disgust in patients with Huntington's disease (HD) forms a double dissociation with the impaired recognition of fear that has been reported in amygdala patients. The dissociation has generated discussion regarding the potential existence of neural substrates dedicated to the recognition of facial signals of specific emotions. The aim of this study was to establish whether the impairment for disgust in HD was restricted solely to the domain of facial perception, or whether HD patients also demonstrate impairment in other kinds of disgust. Fourteen HD patients and fourteen age- and education-matched healthy controls participated in seven disparate emotion processing tasks: (1) a measure of knowledge of the situational determinants of distinct emotions; (2) recognition of emotion expressed in nonverbal vocalisations; (3) recognition of the emotional content of explicit lexical stimuli; (4) recognition of emotional content in pictures of emotion scenes; (5) a disgust experience questionnaire; (6) a measure of olfactory hedonic responsiveness; and (7) a measure of gustatory perception. While verbal aspects of disgust processing were preserved, parallel impairments were revealed for olfactory disgust, vocal disgust expressions, the classification of disgusting pictures, and declarative knowledge of disgust elicitors. The finding of impaired perception of disgust signalled through different input domains suggests that the inability to recognise the facial expression in this population reflects a fundamental problem with disgust processing.

7.
BACKGROUND: The human amygdala is implicated in the formation of emotional memories and the perception of emotional stimuli, particularly fear, across various modalities. OBJECTIVES: To discern the extent to which these functions are related. METHODS: 28 patients who had anterior temporal lobectomy (13 left and 15 right) for intractable epilepsy were recruited. Structural magnetic resonance imaging showed that three of them had atrophy of their remaining amygdala. All participants were given tests of affect perception from facial and vocal expressions and of emotional memory, using a standard narrative test and a novel test of word recognition. The results were standardised against matched healthy controls. RESULTS: Performance on all emotion tasks in patients with unilateral lobectomy ranged from unimpaired to moderately impaired. Perception of emotions in faces and voices was (with exceptions) significantly positively correlated, indicating multimodal emotional processing. However, there was no correlation between the subjects' performance on tests of emotional memory and perception. Several subjects showed strong emotional memory enhancement but poor fear perception. Patients with bilateral amygdala damage had greater impairment, particularly on the narrative test of emotional memory, one showing superior fear recognition but absent memory enhancement. CONCLUSIONS: Bilateral amygdala damage is particularly disruptive of emotional memory processes in comparison with unilateral temporal lobectomy. On a cognitive level, the pattern of results implies that perception of emotional expressions and emotional memory are supported by separate processing systems or streams.

8.
Deficits in the comprehension of facial and prosodic expressions are commonly associated with right hemisphere stroke. However, little is known regarding the impact of these disorders on social relations. We examined facial and prosodic processing, mood, and marital satisfaction in 12 right hemisphere damaged (RHD) stroke patients and nine controls. Results revealed significant impairments in the comprehension of facial expressions and prosody among RHD stroke patients. Nonparametric correlations in the RHD group showed significant associations between marital satisfaction and facial affect discrimination and matching, and nonaffective prosody discrimination. We conclude that deficits in the recognition of nonverbal expressions are associated with reduced relationship satisfaction.
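As a rough illustration of the nonparametric correlation analysis mentioned above, the sketch below computes a Spearman rank correlation between a marital-satisfaction score and a facial affect discrimination score. All values and variable names are hypothetical; they are not the study's data and are chosen only to show the shape of such an analysis.

```python
# Illustrative Spearman (nonparametric) correlation between marital satisfaction
# and facial affect discrimination in a small patient group (n = 12, mirroring
# the abstract). The numbers are invented for demonstration purposes only.
import numpy as np
from scipy import stats

marital_satisfaction = np.array([92, 110, 85, 121, 100, 78, 95, 130, 88, 105, 99, 115])
facial_discrimination = np.array([24, 27, 20, 30, 26, 18, 23, 32, 21, 28, 25, 29])

rho, p_value = stats.spearmanr(marital_satisfaction, facial_discrimination)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```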

9.
OBJECTIVE: Cognitive deficits associated with frontal lobe dysfunction can occur in amyotrophic lateral sclerosis (ALS), particularly in individuals with bulbar ALS, who can also suffer pathologic emotional lability. Because frontal pathophysiology can alter emotional perception, we examined whether emotional perception deficits occur in ALS, and whether they are related to depressive or dementia symptoms. METHODS: Bulbar ALS participants (n=13) and age-matched healthy normal controls (n=12) completed standardized tests of facial emotional and prosodic recognition, the Geriatric Depression Scale, and the Mini-Mental State Examination. Participants identified the basic emotion (happy, sad, angry, afraid, surprised, disgusted) that matched 39 facial expressions and 28 taped, semantically neutral, intoned sentences. RESULTS: ALS patients performed significantly worse than controls on facial recognition but not on prosodic recognition. Eight of 13 patients (62%) scored below the 95% confidence interval of controls in recognizing facial emotions, and 3 of these patients (23% overall) also scored lower in prosody recognition. Among the 8 patients with emotional perceptual impairment, one-half had no depressive, memory, or cognitive symptoms on screening, whereas the remainder showed dementia symptoms alone or together with depressive symptoms. CONCLUSIONS: Emotional recognition deficits occur in bulbar ALS, particularly for emotional facial expressions, and can arise independent of depressive and dementia symptoms or comorbid with depression and dementia. These findings expand the scope of cognitive dysfunction detected in ALS and bolster the view of ALS as a multisystem disorder involving cognitive as well as motor deficits.
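The impairment criterion used above (scoring below the controls' 95% confidence interval) can be sketched as follows, assuming the criterion refers to a 95% confidence interval around the control group mean. The scores below are invented and the sample sizes simply mirror those in the abstract; this is not the study's data or analysis code.

```python
# Sketch of a "below the controls' 95% confidence interval" impairment criterion:
# compute the lower bound of the 95% CI around the control mean, then count how
# many patients fall under it. Hypothetical scores, not the study's data.
import numpy as np
from scipy import stats

controls = np.array([34, 36, 35, 37, 33, 36, 38, 35, 34, 37, 36, 35])      # n = 12
patients = np.array([28, 36, 30, 25, 35, 27, 37, 29, 26, 35, 31, 24, 36])  # n = 13

mean = controls.mean()
sem = stats.sem(controls)
lower, upper = stats.t.interval(0.95, len(controls) - 1, loc=mean, scale=sem)

n_impaired = int((patients < lower).sum())
print(f"Control 95% CI: [{lower:.1f}, {upper:.1f}]")
print(f"{n_impaired}/{len(patients)} patients score below the lower bound")
```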

10.
The purpose of this study was to consider the effects of valence, motoric direction (i.e., approach/withdrawal), and arousal on the perception of facial emotion in patients with unilateral cortical lesions. We also examined the influence of lesion side, site, and size on emotional perception. Subjects were 30 right-hemisphere-damaged (RHD) and 30 left-hemisphere-damaged (LHD) male patients with focal lesions restricted primarily to the frontal, temporal, or parietal lobe. Patient groups were comparable on demographic and clinical neurological variables. Subjects were tested for their ability to match photographs of four facial emotional expressions: happiness, sadness, fear, and anger. Overall, RHD patients were significantly more impaired than LHD patients in perceiving facial emotion. Lesion side, but not site, was associated with motoric direction and valence dimensions. RHD patients had specific deficits relative to LHD patients in processing negative and withdrawal emotions; there were no group differences for positive/approach emotions. Lesion size was not significantly correlated with accuracy of emotional perception.

11.
Emotional signals in spoken language can be conveyed by semantic as well as prosodic cues. We investigated the role of the right-hemisphere fronto-parietal operculum, a somatosensory area where the lips, tongue and jaw are represented, in the detection of emotion in prosody vs. semantics. A total of 14 healthy volunteers participated in the present experiment, which involved transcranial magnetic stimulation (TMS) in combination with frameless stereotaxy. As predicted, compared with sham stimulation, TMS over the right fronto-parietal operculum differentially affected the reaction times for detection of emotional prosody vs. emotional semantics, showing that there is a dissociation at a neuroanatomical level. Detection of withdrawal emotions (fear and sadness) in prosody was delayed significantly by TMS. No effects of TMS were observed for approach emotions (happiness and anger). We propose that the right fronto-parietal operculum is not globally involved in emotion evaluation, but is sensitive to specific forms of emotional discrimination and emotion types.

12.
The facial expressions of six basic emotions were posed by two groups of right (N = 23) and left (N = 34) brain damaged patients and by a control group of normal subjects (N = 28). The posed expressions were examined by means of the Facial Action Coding System (FACS), which provides analytical and objective scoring, as well as by a subjective scale of appropriateness of expression. Results indicated no difference between controls and patients with a lesion in the right or left hemisphere. These findings are inconsistent with the hypothesis that the right hemisphere plays a specific role in the control of posed facial expression. No relationship was observed between posed emotional expressions and facial paralysis or the presence of oral apraxia.

13.
BACKGROUND: It has been suggested that depressed patients have a "negative bias" in recognising other people's emotions; however, the detailed structure of this negative bias is not fully understood. OBJECTIVES: To examine the ability of depressed patients to recognise emotion, using moving facial and prosodic expressions of emotion. METHODS: 16 depressed patients and 20 matched (non-depressed) controls selected the basic emotion (happiness, sadness, anger, fear, surprise, or disgust) that best described the emotional state represented by a moving face and prosody. RESULTS: There was no significant difference between depressed patients and controls in their recognition of facial expressions of emotion. However, the depressed patients were impaired relative to controls in their recognition of surprise from emotional prosody, judging it to be more negative. CONCLUSIONS: We suggest that depressed patients tend to interpret neutral emotions, such as surprise, as negative. Considering that the deficit was seen only for prosodic emotive stimuli, it would appear that stimulus clarity influences the recognition of emotion. These findings provide valuable information on how depressed patients behave in complicated emotional and social situations.

14.
This study investigated the ability of adults with Asperger syndrome to recognize emotional categories of facial expressions and emotional prosodies with graded emotional intensities. The individuals with Asperger syndrome showed poorer recognition performance for angry and sad expressions from both facial and vocal information. The group difference in facial expression recognition was prominent for stimuli with low or intermediate emotional intensities. In contrast to this, the individuals with Asperger syndrome exhibited lower recognition accuracy than typically developed controls mainly for emotional prosody with high emotional intensity. In facial expression recognition, Asperger and control groups showed an inversion effect for all categories. The magnitude of this effect was less in the Asperger group for angry and sad expressions, presumably attributable to reduced recruitment of the configural mode of face processing. The individuals with Asperger syndrome outperformed the control participants in recognizing inverted sad expressions, indicating enhanced processing of local facial information representing sad emotion. These results suggest that the adults with Asperger syndrome rely on modality-specific strategies in emotion recognition from facial expression and prosodic information.

15.
Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, currently little is known about how emotion perception and its neural substrates differ across facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expression (angry, happy or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primes and bimodal materials as targets) and participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for a happy expression. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region of interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on perceived emotion for bimodal targets, irrespective of the target valence. These findings suggest that facial expression is associated with enhanced emotion perception compared to equivalent vocal prosodies.

16.
BACKGROUND: While there is abundant evidence that patients with Huntington's disease (HD) have an impairment in the recognition of the emotional facial expression of disgust, previous studies have only examined emotion perception using full-blown facial expressions. OBJECTIVE: The current study examines the perception of facial emotional expressions in HD at different levels of intensity to investigate whether more subtle deficits can be detected, possibly also in other emotions. METHOD: We compared early symptomatic HD patients with healthy matched controls on emotion perception, presenting short video clips of a neutral face changing into one of the six basic emotions (happiness, anger, fear, surprise, disgust and sadness) with increasing intensity. Overall face perception ability as well as depressive symptoms were taken into account. RESULTS: A specific impairment in recognizing the emotions disgust and anger was found, which was present even at low emotion intensities. CONCLUSION: These results extend previous findings and support the use of more sensitive emotion perception paradigms, which enable the detection of subtle neurobehavioral deficits even in the pre- and early symptomatic stages of the disease.

17.
Spontaneous facial expression of emotions in brain-damaged patients
Spontaneous facial expression of emotion was studied in two groups of right (N = 23) and left (N = 39) brain-damaged patients and in a control group of normal subjects (N = 28). To elicit emotions, four short movies, constructed to produce positive, negative or neutral emotional responses, were used. The method used to assess the facial expression of emotions was the Facial Action Coding System. Brain-damaged patients showed fewer facial responses to emotional stimuli than normal controls, but no difference was observed between subjects with right- and left-sided lesions in either global or disaggregated data analyses, which is inconsistent with the hypothesis of a specialization of the right hemisphere for facial emotional expressions. An unexpected difference was observed in response to the unpleasant movie. Both normal controls and left brain-damaged patients often averted their gaze from the screen when unpleasant material was displayed, whereas right brain-damaged patients rarely showed gaze aversion. This finding suggests that the degree of emotional involvement or manner of coping with stressful input may be reduced as a result of right brain damage.

18.
Background: Research addressing prosodic deficits in brain-damaged populations has concentrated on the specialised capabilities of the right and the left cerebral hemispheres in processing the global characteristics of prosody. This focus has been of interest in that the fundamental frequency (F0), duration and intensity acoustic characteristics within a prosodic structure can convey different linguistic and nonlinguistic information. Much of the research investigating this interesting phenomenon has produced conflicting results. As such, different theories have been proposed in an attempt to provide plausible explanations of the conflicting findings regarding hemispheric specialisation in processing prosodic structures. Aims: The purpose of this study was to examine one of the theories, the functional lateralisation theory, through four experiments that altered the linguistic and nonlinguistic functions across a range of prosodic structures. Methods & Procedures: Three groups of subjects participated in each of the four experiments: (1) eight subjects with LHD, (2) eight subjects with RHD, and (3) eight control subjects. The first experiment addressed the extent to which the processing of lexical stress differences would be lateralised to the left or right hemisphere by requiring listeners to determine the meanings and grammatical assignments of two-syllable words conveyed through stressed or unstressed syllables. In another linguistic condition, the second experiment placed demands on syntactic parsing operations by requiring listeners to parse syntactically ambiguous sentences which were disambiguated through the perception of prosodic boundaries located at syntactic junctures. A third linguistic condition required listeners to determine the categorical assignment of a speaker's intention of making a statement or asking a question conveyed through the prosodic structures. The fourth experiment was designed to determine hemispheric lateralisation in processing nonlinguistic prosodic structures. In this experiment, listeners were required to determine the emotional state of a speaker conveyed through the prosodic structures in sentences that contained semantic information which was either congruent or incongruent with the emotional content of the prosodic structures. Results: When subjects were asked to identify lexical stress differences (Experiment 1), syntactically ambiguous sentences (Experiment 2), and questions and statements (Experiment 3) conveyed through prosody, the LHD group demonstrated a significantly poorer performance than the control and RHD groups. When asked to identify emotions conveyed through prosody (Experiment 4), the RHD group demonstrated a significantly poorer performance than the control and LHD groups. Conclusion: These findings support the functional lateralisation theory that proposes a left hemisphere dominance for processing linguistic prosodic structures and a right hemisphere dominance for processing nonlinguistic prosodic structures.

19.
In this report we aim to explore severe deficits in facial affect recognition in three boys, all of whom meet the criteria for Asperger's syndrome (AS), as well as overt prosopagnosia in one (B) and covert prosopagnosia in the remaining two (C and D). Subject B, highly gifted in physics and mathematics (a talent that runs in his family), showed no interest in people, a quasi-complete lack of comprehension of emotions, and very poor emotional reactivity. The marked neuropsychological deficits were a moderate prosopagnosia and severely disordered recognition of facial emotions, gender and age. Expressive facial emotion, whole-body psychomotor expression and speech prosody were quasi-absent as well. In all three boys these facial processing deficits were more or less isolated, and general visuospatial functions, attention, formal language and scholastic performances were normal or even highly developed, with the exception of deficient gestalt perception in B. We consider the deficient facial emotion perception to be an important pathogenetic symptom of the autistic behaviour in the three boys. Prosopagnosia, the absent facial and bodily expression, and speech prosody were important but varying co-morbid disorders. The total clinical picture of non-verbal disordered communication is a complex of predominantly bilateral and/or right hemisphere cortical deficits. Moreover, in B, insensitivity to pain, smells, noises and internal bodily feelings suggested a more general emotional anaesthesia and/or a deficient means of expression. It is possible that a limbic component might be involved, thus making affective appreciation also deficient.

20.
Patients with borderline personality disorder (BPD) exhibit impairment in labeling of facial emotional expressions. However, it is not clear whether these deficits affect the whole domain of basic emotions, are valence-specific, or are specific to individual emotions. Whether BPD patients' errors in a facial emotion recognition task create a specific pattern also remains to be elucidated. Our study tested two hypotheses: first, we hypothesized that the emotion perception impairment in borderline personality disorder is specific to the negative emotion domain. Second, we hypothesized that BPD patients would show error patterns in a facial emotion recognition task more commonly and more systematically than healthy comparison subjects. Participants comprised 33 inpatients with BPD and 32 matched healthy control subjects who performed a computerized version of the Ekman 60 Faces test. The indices of emotion recognition and the direction of errors were processed in separate analyses. Clinical symptoms and personality functioning were assessed using the Symptom Checklist-90-Revised and the Young Schema Questionnaire Long Form. Results showed that patients with BPD were less accurate than control participants in emotion recognition, in particular, in the discrimination of negative emotions, while they were not impaired in the recognition of happy facial expressions. In addition, patients over-attributed disgust and surprise and under-attributed fear to the facial expressions relative to controls. These findings suggest the importance of carefully considering error patterns, besides measuring recognition accuracy, especially among emotions with negative affective valence, when assessing facial affect recognition in BPD.
