Similar Documents
20 similar documents retrieved (search time: 781 ms)
1.
Spontaneous facial expression of emotions in brain-damaged patients   (Total citations: 1; self-citations: 0; cited by others: 1)
Spontaneous facial expression of emotion was studied in two groups of right (N = 23) and left (N = 39) brain-damaged patients and in a control group of normal subjects (N = 28). To elicit emotions, four short movies constructed to produce positive, negative or neutral emotional responses were used. Facial expression of emotions was assessed with the Facial Action Coding System. Brain-damaged patients showed fewer facial responses to emotional stimuli than normal controls, but no difference was observed between subjects with right- and left-sided lesions in either global or disaggregated analyses, a finding inconsistent with the hypothesis of a specialization of the right hemisphere for facial emotional expression. An unexpected difference was observed in response to the unpleasant movie: both normal controls and left brain-damaged patients often averted their gaze from the screen when unpleasant material was displayed, whereas right brain-damaged patients rarely showed gaze aversion. This finding suggests that the degree of emotional involvement, or the manner of coping with stressful input, may be reduced as a result of right brain damage.

2.
Aspects of emotional facial expression (responsivity, appropriateness, intensity) were examined in brain-damaged adults with right or left hemisphere cerebrovascular lesions and in normal controls. Subjects were videotaped during experimental procedures designed to elicit emotional facial expression and non-emotional facial movement (paralysis, mobility, praxis). On tasks of emotional facial expression, patients with right hemisphere pathology were less responsive and less appropriate than patients with left hemisphere pathology or normal controls. These results corroborate other research findings that the right cerebral hemisphere is dominant for the expression of facial emotion. Both brain-damaged groups had substantial facial paralysis and impairment in muscular mobility on the hemiface contralateral to site of lesion, and the left brain-damaged group had bucco-facial apraxia. Performance measures of emotional expression and non-emotional movement were uncorrelated, suggesting a dissociation between these two systems of facial behaviour.

3.
Although there is a consensus that patients with schizophrenia have certain deficits in perceiving and expressing facial emotions, previous studies of facial emotion perception in schizophrenia do not present consistent results. The objective of this study was to explore facial emotion perception deficits in Chinese patients with schizophrenia and their non-psychotic first-degree relatives. Sixty-nine patients with schizophrenia, 56 of their first-degree relatives (33 parents and 23 siblings), and 92 healthy controls (67 younger healthy controls matched to the patients and siblings, and 25 older healthy controls matched to the parents) completed a set of facial emotion perception tasks, including facial emotion discrimination, identification, intensity, valence, and corresponding face identification tasks. The results demonstrated that patients with schizophrenia were significantly less accurate than their siblings and younger healthy controls across a variety of facial emotion perception tasks, whereas the siblings of the patients performed as well as the corresponding younger healthy controls in all of the facial emotion perception tasks. Patients with schizophrenia also responded significantly more slowly than younger healthy controls, while siblings of patients did not differ significantly in speed from either the patients or the younger healthy controls. We also found that parents of the schizophrenia patients were significantly less accurate than the corresponding older healthy controls in facial emotion identification, valence judgement, and the composite index of the facial discrimination, identification, intensity and valence tasks. Moreover, no significant differences in speed were found between the parents of patients and older healthy controls after controlling for years of education and IQ. Taken together, the results suggest that facial emotion perception deficits may serve as potential endophenotypes for schizophrenia.
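The abstract refers to a composite index across the discrimination, identification, intensity and valence tasks without describing how it is built. The sketch below is purely illustrative (not the authors' method): it assumes one common convention, averaging per-task accuracies after z-scoring each against the matched control group. All task names and numbers are invented.

```python
# Hypothetical sketch: one common way to build a composite facial-emotion
# perception index is to z-score each task's accuracy against the control
# group and average the z-scores. The study's exact construction is not
# specified in the abstract; all data here are invented.
import numpy as np

def composite_index(subject_acc, control_means, control_sds):
    """Mean of per-task z-scores relative to the control group."""
    z = (np.asarray(subject_acc) - np.asarray(control_means)) / np.asarray(control_sds)
    return z.mean()

tasks = ["discrimination", "identification", "intensity", "valence"]
control_means = [0.90, 0.85, 0.80, 0.88]   # invented control accuracies
control_sds   = [0.05, 0.06, 0.07, 0.05]
patient_acc   = [0.82, 0.74, 0.70, 0.80]   # invented patient accuracies

print(f"composite z = {composite_index(patient_acc, control_means, control_sds):.2f}")
```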

4.
How the brain is lateralised for emotion processing remains a key question in contemporary neuropsychological research. The right hemisphere hypothesis asserts that the right hemisphere dominates emotion processing, whereas the valence hypothesis holds that positive emotion is processed in the left hemisphere and negative emotion is controlled by the right hemisphere. A meta-analysis was conducted to assess unilateral brain-damaged individuals’ performance on tasks of facial emotion perception according to valence. A systematic search of the literature identified seven articles that met the conservative selection criteria and could be included in a meta-analysis. A total of 12 meta-analyses of facial expression perception were constructed assessing identification and labelling tasks according to valence and the side of brain damage. The results demonstrated that both left and right hemisphere damage leads to impairments in emotion perception (identification and labelling) irrespective of valence. Importantly, right hemisphere damage produced more pronounced emotion perception impairment than left hemisphere damage across valence, suggesting right hemisphere dominance for emotion perception. Furthermore, right hemisphere damage was associated with a larger tendency for impaired perception of negative than positive emotion across identification and labelling tasks. Overall, the findings support Adolphs, Jansari, and Tranel's (2001) model, whereby the right hemisphere preferentially processes negative facial expressions and both hemispheres process positive facial expressions.
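The review does not publish its analysis code; as a minimal sketch of the pooling step behind such effect-size meta-analyses, the snippet below implements a DerSimonian-Laird random-effects model. The effect sizes and variances are hypothetical placeholders, not values from the seven included articles.

```python
# Minimal illustrative sketch of the pooling step in a random-effects
# meta-analysis (DerSimonian-Laird), of the kind used to combine facial
# emotion perception effect sizes across lesion studies. This is not the
# authors' code; every number below is a hypothetical placeholder.
import numpy as np

def random_effects_pool(effects, variances):
    """Pool standardized mean differences under a DerSimonian-Laird model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                                 # fixed-effect weights
    pooled_fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - pooled_fixed) ** 2)       # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)       # between-study variance
    w_star = 1.0 / (variances + tau2)                   # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Hypothetical per-study effect sizes (e.g., Hedges' g for RHD impairment)
g = [0.45, 0.62, 0.30, 0.71]
v = [0.04, 0.09, 0.05, 0.12]
pooled, se, tau2 = random_effects_pool(g, v)
print(f"pooled g = {pooled:.2f}, 95% CI +/- {1.96 * se:.2f}, tau^2 = {tau2:.3f}")
```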

5.
The literature about the lateralization of facial emotion perception according to valence (positive, negative) is conflicting; investigating the underlying processes may shed light on why some studies show right-hemisphere dominance across valence and other studies demonstrate hemispheric differences according to valence. This is the first clinical study to examine whether the use of configural and featural cues underlies hemispheric differences in affective face perception. Right brain-damaged (RBD; n = 17), left brain-damaged (LBD; n = 17) and healthy control (HC; n = 34) participants completed an affective face discrimination task that tested configural processing using whole faces and featural processing using partial faces. No group differences in expression perception according to valence or processing strategy were found. Across emotions, the RBD group was less accurate than the HC group in discriminating whole faces, whilst the RBD and LBD groups were less accurate than HCs in discriminating partial faces. This suggests that the right hemisphere processes facial expressions from configural and featural information, whereas the left hemisphere relies more heavily on featural facial information.

6.
Disorders in nonverbal communication of emotion have been documented in patients with right hemisphere pathology; lexical expression of emotion is virtually unstudied. In this preliminary investigation, emotionally laden slides were used to elicit discourse from right brain-damaged (RBD), left brain-damaged (LBD), and normal control (NC) subjects. New techniques were developed to examine the ability of these subjects to express emotion in words; formalistic and pragmatic analyses of the discourse were conducted. RBDs, relative to NCs and LBDs, were less successful in using words to convey emotion and produced words of lower emotional intensity. LBD aphasics, despite their linguistic deficits, were comparable to NCs in conveying emotional valence. The data tend to support the speculation that the right hemisphere is dominant for lexical expression of emotion. This study has implications for the neuropsychological investigation of language, emotion, and the brain.

7.
The purpose of this study was to consider the effects of valence, motoric direction (i.e., approach/withdrawal), and arousal on the perception of facial emotion in patients with unilateral cortical lesions. We also examined the influence of lesion side, site, and size on emotional perception. Subjects were 30 right-hemisphere-damaged (RHD) and 30 left-hemisphere-damaged (LHD) male patients with focal lesions restricted primarily to the frontal, temporal, or parietal lobe. Patient groups were comparable on demographic and clinical neurological variables. Subjects were tested for their ability to match photographs of four facial emotional expressions: happiness, sadness, fear, and anger. Overall, RHD patients were significantly more impaired than LHD patients in perceiving facial emotion. Lesion side, but not site, was associated with motoric direction and valence dimensions. RHD patients had specific deficits relative to LHD patients in processing negative and withdrawal emotions; there were no group differences for positive/approach emotions. Lesion size was not significantly correlated with accuracy of emotional perception.

8.
Intact recognition of emotional prosody following amygdala damage.   (Total citations: 4; self-citations: 0; cited by others: 4)
Adolphs R, Tranel D. Neuropsychologia, 1999, 37(11): 1285-1292
Bilateral damage to the amygdala in a variety of animal species can impair emotional reactions to stimuli in several sensory modalities. Such damage in humans impairs visual recognition of emotion in facial expressions, but possible impairments in modalities other than vision have not been sufficiently explored. We examined two subjects with complete bilateral amygdala damage, and seven with unilateral amygdala damage, on a standardized task of emotional prosody recognition. The data were compared to those from 15 brain-damaged and 14 normal control subjects. One of the bilateral amygdala subjects, whose lesions were restricted to the amygdala, was entirely normal in recognizing emotion in prosody on all tasks; the other, whose damage also included substantial lesions in extra-amygdalar structures, especially in the right hemisphere, was normal on most, albeit not all, measures of emotional prosody recognition. We suggest that the human amygdala's role in recognizing emotion in prosody may not be as critical as it is for facial expressions, and that extra-amygdalar structures in the right hemisphere may be more important for recognizing emotional prosody. It remains possible that recognition of emotion in classes of auditory stimuli other than prosody will require the amygdala.

9.
The aim of this study was to investigate facial expression recognition (FER) accuracy in social phobia and, in particular, to explore how facial expressions of emotion were misclassified. We hypothesised that compared with healthy controls, subjects with social phobia would be no less accurate in their identification of facial emotions (as reported in previous studies) but that they would misclassify facial expressions as expressing threatening emotions (anger, fear or disgust). Thirty individuals with social phobia and twenty-seven healthy controls completed a FER task which featured six basic emotions morphed using computer techniques between 0 percent (neutral) and 100 percent intensity (full emotion). Supporting our hypotheses, we found no differences between the groups on measures of the accuracy of emotion recognition, but, compared with healthy controls, the social phobia group were more likely both to misclassify facial expressions as angry and to interpret neutral facial expressions as angry. The healthy control group were more likely to misclassify neutral expressions as sad. The role of these biases in social phobia requires further replication, but they may help in understanding the disorder and provide an interesting area for future research and therapy.
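To make the distinction between overall accuracy and systematic misclassification concrete, here is an illustrative sketch (not the study's analysis code) that tabulates presented-versus-chosen emotion labels from trial-level responses and reports how often neutral faces are labelled as angry. All trial data are invented.

```python
# Illustrative sketch: build a confusion count of presented vs. chosen
# emotion labels and compute the rate at which neutral expressions are
# misclassified as angry. Not the study's code; data are invented.
from collections import Counter

def confusion_counts(trials):
    """trials: iterable of (presented_label, response_label) pairs."""
    return Counter(trials)

# Hypothetical trial-level data for one participant
trials = [("neutral", "angry"), ("neutral", "neutral"), ("neutral", "sad"),
          ("angry", "angry"), ("fearful", "angry"), ("happy", "happy")]

counts = confusion_counts(trials)
neutral_total = sum(v for (shown, _), v in counts.items() if shown == "neutral")
neutral_as_angry = counts[("neutral", "angry")]
print(f"neutral-as-angry rate: {neutral_as_angry / neutral_total:.2f}")
```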

10.
To evaluate the ability to process implicature, especially indirect requests and indirect refusals, the Four Scenes Test was administered to 20 left hemisphere damaged (LHD) subjects, 20 right hemisphere damaged (RHD) subjects and 15 normal control subjects. The results showed, first, that the performance of the brain-damaged subjects was inferior to that of the control subjects in processing implicature; second, that indirect request understanding was easier than indirect refusal understanding for the brain-damaged subjects; and third, that indirect refusal understanding in the left hemisphere damaged subjects was inferior to that of the right hemisphere damaged subjects. There was no group difference in indirect request understanding, although the ability of brain-damaged subjects was poorer than that of the normal controls. These findings suggest that LHD and RHD subjects were not the same in implicature processing and that their responses depended on the type of implicature (refusal or request).

11.
Clinical research on facial emotions has focused primarily on differences between the right and left hemiface. Social psychology, however, has suggested that differences between upper versus lower facial displays may be more important, especially during social interactions. We demonstrated previously that upper facial displays are perceived preferentially by the right hemisphere, while lower facial displays are perceived preferentially by the left hemisphere. A marginal age-related effect was observed. The current research expands our original cohort to include 26 elderly individuals over age 62. Fifty-six strongly right-handed, healthy adult volunteers were tested tachistoscopically by flashing randomized facial displays of emotion to the right and left visual fields. The stimuli consisted of line drawings displaying various combinations of emotions on the upper and lower face. The subjects were tested under two conditions: without attend instructions and with instructions to attend to the upper face. Based on linear regression and discriminant analyses modeling age, subject performance could be divided into two distinct groups: Young (< 62 years) and Old (> 62 years). Without attend instructions, both groups robustly identified the emotion displayed on the lower face, regardless of visual field presentation. With instructions to attend to the upper face, the Old group demonstrated a markedly decreased ability to identify upper facial displays, compared to the Young group. The most significant difference was noted in the left visual field/right hemisphere. Our results demonstrate a significant decline in the processing of upper facial emotions by the right hemisphere in older individuals, thus providing partial support for the right hemisphere hypothesis of cognitive aging. The decreased ability to perceive upper facial displays, coupled with age-related deficits in processing affective prosody, may well cause impaired psychosocial competency in the elderly.

12.
A great number of studies have shown that non-clinical individuals rely predominantly on the right hemisphere to process facial emotion. Previous studies have shown that males suffering from Asperger's syndrome show a typical right hemisphere bias for processing facial emotion (happiness and sadness) but a reduced right hemisphere bias for processing facial identity. This study looks at the lateralisation of all six basic emotions using the chimeric faces test in 64 non-clinical participants (32 males, 32 females) and correlates it with their autistic traits measured using the Broad Autistic Phenotype Questionnaire. For males only, regression analyses showed a relationship between the aloof personality trait and lateralisation for fear, happiness, and surprise. Males with high autistic scores on the aloof personality subscale (showing a lack of interest in social interaction) were more strongly lateralised to the right hemisphere for processing fear, happiness, and surprise. For males there was no relationship with anger, disgust, sadness, or non-facial stimuli, and for females there were no significant relationships at all. The autistic traits of rigidity and pragmatic language were not significant predictors of emotion lateralisation. The over-reliance on the right hemisphere for processing facial emotion in males seems to support the idea that the autistic brain could be seen as hyper-masculinised, possibly due to prenatal testosterone exposure.

13.
Objectives: Although emotional cues like facial emotion expressions seem to be important in social interaction, there is no specific training in emotional cues for psychiatrists. Here, we aimed to investigate psychiatrists' facial emotion recognition ability and its relation to their clinical identification as psychotherapy- or psychopharmacology-oriented and to being an adult or child-adolescent psychiatrist. Methods: A Facial Emotion Recognition Test, constructed from a set of photographs (happy, sad, fearful, angry, surprised, disgusted and neutral faces) from Ekman and Friesen's series, was administered to 130 psychiatrists. Results: Psychotherapy-oriented adult psychiatrists were significantly better at recognizing sad facial emotion (p = .003) than psychopharmacologists, while no significant differences were detected according to therapeutic orientation among child-adolescent psychiatrists (for each, p > .05). Adult psychiatrists were significantly better at recognizing fearful (p = .012) and disgusted (p = .003) facial emotions than child-adolescent psychiatrists, while the latter were better at recognizing angry facial emotion (p = .008). Conclusion: For the first time, we have shown differences in psychiatrists' facial emotion recognition ability according to therapeutic identification and to being an adult or child-adolescent psychiatrist. It would be valuable to investigate how these differences, or training in facial emotion recognition, would affect the quality of patient–clinician interaction and treatment-related outcomes.

14.
In this review article, we summarize our results from magnetoencephalography (MEG) and electroencephalography (EEG) studies on face perception. The primary results were as follows: (1) facial (eye and mouth) movements are processed differently from general motion perception, but eye and mouth movements are likely processed in the same manner. (2) In a study investigating the interaction between auditory and visual stimuli relating to vowel sounds in the auditory cortex, vowel sound perception in the auditory cortex, at least in the primary processing stage, was not affected by simultaneously viewing mouth movements. (3) In a study investigating the effects of face contour and features on early occipitotemporal activity when viewing eye movement, there was evidence of specific information processing for eye movements in the occipitotemporal region, and this activity was significantly influenced by whether the movements appeared with the face contour and/or features. (4) In a study investigating the effects of inverting facial contour (hair and chin) and features (eyes, nose and mouth) on the processing of static and dynamic face perception, activity in the right fusiform area was more affected by the inversion of features, whereas activity in the left fusiform area was more affected by disruption of the spatial relationship between the contour and features in static face perception, and activity in the right occipitotemporal area was most affected by inversion of the facial contour in dynamic face perception. (5) In a study investigating the perception of changes in facial emotion, the areas of the brain involved in perceiving changes in facial emotion were found to have not matured by 14 years of age.

15.
To depict the neural substrates for facial emotion recognition and to determine whether their activation is confounded by a verbal factor, we studied eight normal volunteers with functional magnetic resonance imaging (fMRI). Verbal and non-verbal sample stimuli were used in a facial emotion matching task and a gender matching task (control condition). Compared with the gender tasks, the emotion tasks significantly activated the right ventral prefrontal cortex, the right lingual cortex, and the left lateral fusiform cortex, irrespective of sample stimuli. The visual association cortices showed a significant interaction between the task and the material presented, as the activation for verbal materials was higher than for non-verbal materials during the emotion matching tasks. By contrast, no significant interaction was found in the right ventral prefrontal cortex. These results suggest that the verbal factor has a different effect on the neural networks for facial emotion processing.

16.
There is an increasing amount of evidence which suggests that each hemisphere is differently specialised for processing facial stimuli, with the right hemisphere specialised for the processing of configural information and the left hemisphere specialised for the processing of featural information. While there is evidence for this distinction from studies of face recognition, it has not been shown in studies of lateralisation for processing facial emotion. In this study the chimeric faces test was used with faces expressing anger, disgust, fear, happiness, sadness or surprise, presented in either an upright or an inverted orientation. When presented upright, a significant right hemisphere bias was found for all six emotions. However, when inverted, a significant left hemisphere bias was found for the processing of happiness and surprise, but not for the processing of negative emotions (although the analysis was approaching significance for anger). These findings support the hypothesis that each hemisphere is differently specialised for processing facial emotion, but contradicts previous work that examined the effects of inversion on chimeric face stimuli.
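Chimeric-face studies like the two above summarise each participant's hemispheric bias with a laterality index. The exact scoring is not given in these abstracts, so the sketch below assumes one common convention: score a trial +1 when the chimera with the emotional half in the left visual field is judged more emotional, -1 otherwise, and average over trials per emotion. All responses are invented.

```python
# Hypothetical sketch of a chimeric-faces laterality index. +1 marks trials
# where the chimera with the emotional half in the left visual field was
# chosen (conventionally read as a right-hemisphere bias), -1 otherwise.
# Not the studies' scoring procedure; data are invented.
from statistics import mean

def laterality_index(choices):
    """choices: list of 'left' / 'right' visual-field selections."""
    return mean(+1 if c == "left" else -1 for c in choices)

responses = {
    "happiness": ["left", "left", "right", "left"],   # invented trials
    "fear":      ["left", "left", "left", "right"],
}
for emotion, choices in responses.items():
    print(f"{emotion}: LI = {laterality_index(choices):+.2f}")
```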

17.
Narme P, Bonnet AM, Dubois B, Chaby L. Neuropsychologia, 2011, 49(12): 3295-3302
Parkinson's disease (PD) has been frequently associated with facial emotion recognition impairments, which could adversely affect the social functioning of these patients. Facial emotion recognition requires processing of the spatial relations between facial features, known as the facial configuration. Few studies, however, have investigated this ability in people with PD. We hypothesized that facial emotion recognition impairments in patients with PD could be accounted for by a deficit in configural processing. To assess this hypothesis, three tasks were administered to 10 patients with PD and 10 healthy controls (HC): (i) a facial emotion recognition task with upright faces, (ii) a similar task with upside-down faces, to explore the face inversion effect, and (iii) a configural task to assess participants’ abilities to detect configural modifications made on a horizontal or vertical axis. The results showed that when compared with the HC group, the PD group had impaired facial emotion recognition, in particular for faces expressing anger and fear, and exhibited a reduced face inversion effect for these emotions. More importantly, the PD group's performance on the configural task for detecting vertical modifications was lower than the HC group's. Taken together, these results suggest the presence of a configural processing alteration in patients with PD, especially for vertical, second-order information. Furthermore, configural performance was positively correlated with emotion recognition for anger, disgust, and fear, suggesting that facial emotion recognition could be related, at least partially, to configural processing.
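The abstract reports a reduced face inversion effect without stating the measure; a conventional choice (assumed here, not necessarily the authors') is the drop in recognition accuracy from upright to upside-down faces, computed per emotion as in this toy sketch with invented accuracies.

```python
# Hypothetical sketch: the face inversion effect is commonly expressed as
# upright-face accuracy minus inverted-face accuracy for each emotion.
# The abstract does not give the study's exact measure; data are invented.
upright = {"anger": 0.80, "fear": 0.75, "happiness": 0.95}   # invented accuracies
inverted = {"anger": 0.55, "fear": 0.50, "happiness": 0.85}

inversion_effect = {emo: upright[emo] - inverted[emo] for emo in upright}
for emo, effect in inversion_effect.items():
    print(f"{emo}: inversion effect = {effect:+.2f}")
```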

18.
Children and adults with mental retardation were tested on their ability to recognize facial expressions of emotion. The sample consisted of 80 children and adults with mental retardation and a control group of 80 nonhandicapped children matched on mental age and gender. Ekman and Friesen's normed photographs of the six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) were used in a recognition task of facial expressions. Subjects were individually read two-sentence stories identifying a specific emotion, presented with a randomized array of the six photographs of the basic facial expressions of emotion, and then asked to select the photograph that depicted the emotion identified in the story. This procedure was repeated with 24 different stories, with each of the six basic emotions being represented four times. Results showed that, as a group, individuals with mental retardation were not as proficient as their mental-age-matched nonhandicapped control subjects at recognizing facial expressions of emotion. Although adults with mild mental retardation were more proficient at this task than those with moderate mental retardation, this finding was not true for children. There was a modest difference between the children with moderate mental retardation and their nonhandicapped matched controls in their ability to recognize facial expression of disgust.

19.
Facial expression processing and the attribution of facial emotions to a context were investigated in adults with Down syndrome (DS) in two experiments. Their performances were compared with those of a child control group matched for receptive vocabulary. The ability to process faces without emotional content was controlled for, and no differences appeared between the two groups. Specific impairments were found in the DS group according to the task modalities and the type of facial emotional expressions. In the emotion matching condition, the DS adults showed overall difficulties whereas in the identification and recognition conditions they were particularly impaired when processing the neutral expression. In the emotion attribution task, they exhibited difficulties with the sad expression only and the analysis of their error pattern revealed that they rarely selected this expression throughout the task. The sad emotion was the only one that showed a significant relationship with the facial expression processing tasks.

20.
Asymmetry in the comprehension of facial expressions of emotion was explored in the present study by analysing alpha band variation over the right and left cortical sides. Second, the behavioural activation system (BAS) and behavioural inhibition system (BIS) were considered as explanatory factors to verify the effect of a motivational/emotional variable on alpha activity. A total of 19 participants viewed a broad range of facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness, and neutral) in random order. The results demonstrated that anterior frontal sites were more active than central and parietal sites in response to facial stimuli. Moreover, right- and left-side responses varied as a function of emotion type, with increased right frontal activity for negative, aversive emotions versus increased left activity for positive emotion. Finally, whereas higher-BIS participants generated more right hemisphere activation for some negative emotions (such as fear, anger, surprise, and disgust), BAS participants were more responsive to positive emotion (happiness) within the left hemisphere. The motivational significance of facial expressions was considered to elucidate cortical differences in participants' responses to emotion types.
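The abstract describes alpha band variation over right and left sites without detailing the computation. A standard approach, assumed here rather than taken from the study's pipeline, is to estimate alpha power with Welch's method and contrast homologous right and left electrodes on a log scale; the sketch below uses synthetic signals and the hypothetical electrode pair F3/F4.

```python
# Hypothetical sketch of a frontal alpha-asymmetry computation, a standard
# way to quantify right- vs. left-hemisphere alpha activity. Not the study's
# pipeline; the signals below are synthetic noise for illustration only.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate in Hz

def alpha_power(signal, fs=FS, band=(8.0, 13.0)):
    """Mean power spectral density in the alpha band via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

rng = np.random.default_rng(0)
left_f3 = rng.standard_normal(FS * 10)    # synthetic 10-s epoch, left frontal
right_f4 = rng.standard_normal(FS * 10)   # synthetic 10-s epoch, right frontal

# ln(right alpha) - ln(left alpha) > 0 means relatively more alpha on the
# right; since alpha is inversely related to cortical activity, that is
# conventionally read as relatively greater left-hemisphere activation.
asymmetry = np.log(alpha_power(right_f4)) - np.log(alpha_power(left_f3))
print(f"frontal alpha asymmetry (F4 vs F3): {asymmetry:+.3f}")
```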
