Similar Articles
A total of 20 similar articles were retrieved.
1.
Social Neuroscience, 2013, 8(1): 36-49
Facial expressions play a critical role in social interactions by eliciting rapid responses in the observer. Failure to perceive and experience a normal range and depth of emotion seriously impacts interpersonal communication and relationships. As has been demonstrated across a number of domains, abnormal emotion processing in individuals with psychopathy plays a key role in their lack of empathy. However, the neuroimaging literature is unclear as to whether deficits are specific to particular emotions such as fear and perhaps sadness. Moreover, findings are inconsistent across studies. In the current experiment, 80 incarcerated adult males scoring high, medium, and low on the Hare Psychopathy Checklist-Revised (PCL-R) underwent functional magnetic resonance imaging (fMRI) scanning while viewing dynamic facial expressions of fear, sadness, happiness, and pain. Participants who scored high on the PCL-R showed a reduction in neuro-hemodynamic response to all four categories of facial expressions in the face processing network (inferior occipital gyrus, fusiform gyrus, and superior temporal sulcus (STS)) as well as the extended network (inferior frontal gyrus and orbitofrontal cortex (OFC)), which supports a pervasive deficit across emotion domains. Unexpectedly, the response in the dorsal insula to fear, sadness, and pain was greater in psychopaths than in non-psychopaths. Importantly, the orbitofrontal cortex and ventromedial prefrontal cortex (vmPFC), regions critically implicated in affective and motivated behaviors, were significantly less active in individuals with psychopathy during the perception of all four emotional expressions.

2.

Background

Abnormal neural responses to others’ emotions, particularly cues of threat and distress, have been implicated in the development of chronic violence. We examined neural responses to several emotional cues within a prospectively identified group of chronically violent men. We also explored the association between neural responses to social emotions and psychopathic features.

Methods

We compared neural responses to happy, sad, angry, fearful and neutral faces between chronically violent (n = 22) and non-violent (n = 20) men using functional magnetic resonance imaging (fMRI). Participants were prospectively identified from a longitudinal study based on information collected from age 7 to 27 years. We assessed psychopathic features using a self-report measure administered in adulthood.

Results

The chronically violent men exhibited significantly reduced neural responses in the dorsomedial prefrontal cortex to all faces, regardless of the emotional content, compared with non-violent men. We also observed a hyperactive amygdala response to neutral faces in chronically violent men, but only within the context of viewing happy faces. Moreover, they exhibited a greater dorsomedial prefrontal cortex response to mildly fearful faces than non-violent men. These abnormalities were not associated with psychopathic features in chronically violent men.

Limitations

It remains unclear whether the observed neural abnormalities preceded or are a consequence of persistent violence, and these results may not generalize to chronically violent women.

Conclusion

Chronically violent men exhibit a reduced neural response to facial cues regardless of emotional content. It appears that chronically violent men may view emotionally ambiguous facial cues as potentially threatening and implicitly reinterpret subtle cues of fear in others so they no longer elicit a negative response.

3.
Studies investigating the ability to recognize emotional facial expressions in non-demented individuals with Parkinson's disease (PD) have yielded equivocal findings. A possible reason for this variability may lie in the confounding of emotion recognition with cognitive task requirements, a confound arising from the lack of a control condition using non-emotional stimuli. The present study examined emotional facial expression recognition abilities in 20 non-demented patients with PD and 23 control participants relative to their performance on a non-emotional landscape categorization test with comparable task requirements. We found that PD participants were normal on the control task but exhibited selective impairments in the recognition of facial emotion, specifically for anger (driven by those with right hemisphere pathology) and surprise (driven by those with left hemisphere pathology), even when controlling for depression level. Male but not female PD participants further displayed specific deficits in the recognition of fearful expressions. We suggest that the neural substrates that may subserve these impairments include the ventral striatum, amygdala, and prefrontal cortices. Finally, we observed that in PD participants, deficiencies in facial emotion recognition correlated with higher levels of interpersonal distress, which calls attention to the significant psychosocial impact that facial emotion recognition impairments may have on individuals with PD.

4.
Previous work has suggested that elevated levels of trait anxiety are associated with an increased ability to accurately recognize the facial expression of fear. However, to date, recognition has only been assessed after viewing periods of 10 s, despite the fact that emotion recognition from faces typically takes a fraction of this time. The current study required participants with either high or low levels of non-clinical trait anxiety to make speeded emotional classification judgments for a series of facial expressions drawn from seven emotional categories. Following previous work, it was predicted that recognition of fearful facial expressions would be more accurate in the high-trait anxious group than in the low-trait anxious group. However, contrary to this prediction, no anxiety-related differences in emotion perception were observed for any of the seven emotions. This suggests that anxiety does not influence the perception of fear as has been previously proposed.

5.
Facial expression recognition is a central feature of emotional and social behaviour, and previous studies have found that alcoholics are impaired in this skill when presented with single emotions of differing intensities. The aim of this study was to explore biases in alcoholics' recognition of emotions when the stimuli were a mixture of two closely related emotions. The amygdala is intimately involved in the encoding of emotions, especially those related to fear. In animals, an increased number of withdrawals from alcohol leads to increased seizure sensitivity associated with facilitated transmission in the amygdala and related circuits. A further objective, therefore, was to explore the effect of previous alcohol detoxifications on the recognition of emotional facial expressions. Fourteen alcoholic inpatients were compared with 14 age- and sex-matched social-drinking controls. They were asked to rate how much of each of six emotions (happiness, surprise, fear, sadness, disgust and anger) was present in morphed pictures portraying a mix of two of those emotions. The alcoholic group showed enhanced fear responses to all of the pictures compared to the controls and showed a different pattern of responding for anger and disgust. There were no differences between groups in decoding sad, happy and surprised expressions. In addition, the enhanced fear recognition found in the alcoholic group was related to the number of previous detoxifications. These results provide further evidence of the impairment in facial expression recognition present in alcoholic patients. Moreover, since the amygdala has been associated with the processing of facial expressions of emotion, particularly those of fear, the present data suggest that previous detoxifications may be related to changes within the amygdala.

6.
S. B. Hamann, R. Adolphs. Neuropsychologia, 1999, 37(10): 1135-1141
Bilateral damage to the amygdala in humans has been previously linked to two deficits in recognizing emotion in facial expressions: recognition of individual basic emotions, especially fear, and recognition of similarity among emotional expressions. Although several studies have examined recognition of individual emotions following amygdala damage, only one subject has been examined on recognition of similarity. To assess the extent to which deficits in recognizing similarity among facial expressions might be a general consequence of amygdala damage, we examined this ability in two subjects with complete bilateral amygdala damage. Both subjects had previously demonstrated entirely normal recognition of individual facial emotions. Here we report that these two patients are also intact in their ability to recognize similarity between emotional expressions. These results indicate that, like the recognition of individual basic emotions in facial expressions, the recognition of similarity among emotional expressions does not have an absolute dependence on the amygdala.

7.
The ability to evaluate the intensity of emotional facial expressions was investigated in patients undergoing the intracarotid sodium amytal procedure. It was found that when the hemisphere non-dominant for language (usually right) was anesthetized, the patients' ratings of the intensity of emotional expressions in photographs were lower than baseline ratings of these expressions. Such an effect was not seen with anesthetization of the hemisphere dominant for language (usually left). Ratings of shades of gray (which served as control stimuli) showed no such effect. The findings are interpreted in terms of a right hemisphere superiority in the perception and evaluation of emotional expression.

8.
Interpersonal contacts depend to a large extent on understanding emotional facial expressions of others. Several neurological conditions may affect proficiency in emotional expression recognition. It has been shown that chronic alcoholics are impaired in labelling emotional expressions. More specifically, they mislabel sad expressions, regarding them as more hostile. Surprisingly, there has been relatively little research on patients with Korsakoff's syndrome as a result of chronic alcohol abuse. The current study investigated 23 patients diagnosed with Korsakoff's syndrome compared to 23 matched control participants. This study is the first to make use of a newly developed sensitive paradigm to measure emotion recognition for several emotions (anger, disgust, fear, happiness, sadness and surprise). The results show that patients with Korsakoff's syndrome are impaired at recognizing angry, fearful and surprised facial emotional expressions. These deficits might be due to the reported sub-cortical brain dysfunction in Korsakoff's syndrome.

9.
People spontaneously mimic a variety of behaviors, including emotional facial expressions. Embodied cognition theories suggest that mimicry reflects internal simulation of perceived emotion in order to facilitate its understanding. If so, blocking facial mimicry should impair recognition of expressions, especially of emotions that are simulated using facial musculature. The current research tested this hypothesis using four expressions (happy, disgust, fear, and sad) and two mimicry-interfering manipulations: (1) biting on a pen and (2) chewing gum, as well as two control conditions. Experiment 1 used electromyography over cheek, mouth, and nose regions. The bite manipulation consistently activated the assessed muscles, whereas the chew manipulation activated them only intermittently. Further, expressing happiness generated the most facial action. Experiment 2 found that the bite manipulation interfered most with recognition of happiness. These findings suggest that facial mimicry differentially contributes to recognition of specific facial expressions, thus allowing for more refined predictions from embodied cognition theories.

10.
Parkinson's disease (PD) is associated with impairments in facial emotion recognition as well as visual and executive dysfunction. We investigated whether facial emotion categorization impairments in PD are attributable to visual scanning abnormalities by recording the eye movements of 16 non-demented PD and 20 healthy control (HC) participants during an emotion recognition task. We examined the influence of several factors that can affect visual scanning, including oculomotor, basic visual, and cognitive abilities (executive function). Increases in the number and duration of fixations in the top regions of surprise facial expressions were related to increases in recognition accuracy for this emotion in PD participants with left-sided motor-symptom onset. Compared to HC men, HC women spent less time fixating on fearful expressions. PD participants displayed oculomotor abnormalities (antisaccades), but these were unrelated to scanning patterns. Performance on visual measures (acuity, contrast sensitivity) correlated with scanning patterns in the PD group only. Poorer executive function was associated with longer fixation times in PD and with a greater number of fixations in HC. Our findings indicate a specific relation between facial emotion categorization impairments and scanning of facial expressions in PD. Furthermore, PD and HC participants' scanning behaviors during an emotion categorization task were driven by different perceptual processes and cognitive strategies. Our results underscore the need to consider differences in perceptual and cognitive abilities in studies of visual scanning, particularly when examining this ability in patient populations for which both vision and cognition are impaired.

11.
In ADHD, impaired interpersonal relationships have been documented. They have been hypothesized to be secondary to impairment of receptive nonverbal language. Recognition of emotional facial expressions is an important aspect of receptive nonverbal language, and it has been demonstrated to be central to the organization of emotional and social behavior. This study investigated the identification of facial expressions of four emotions (joy, anger, disgust, and sadness) in a group of 30 children aged 7-12 years who met the DSM-IV criteria for ADHD, predominantly hyperactive-impulsive type, and had no comorbid mental retardation, specific learning difficulties, developmental coordination disorder, pervasive developmental disorders, conduct disorder, bipolar disorder, or substance abuse, and in 30 matched unimpaired control children. The test used included 16 validated photographs depicting these emotions in varying intensities constructed by morphing. Children with ADHD exhibited a general deficit in decoding emotional facial expressions, with specific deficits in identifying anger and sadness. Self-rating of task difficulty revealed a lack of awareness of decoding errors in the ADHD group as compared with control subjects. Within the ADHD group, there was a significant correlation between interpersonal problems and emotional facial expression decoding impairment, which was more marked for anger expressions. These results suggest suboptimal nonverbal decoding abilities in ADHD that may have important implications for therapy.

12.
Deficits in decoding emotional facial expressions in Parkinson's disease
INTRODUCTION: The basal ganglia have numerous connections not only with the motor cortex but also with the prefrontal and limbic cortical areas. Therefore, basal ganglia lesions can disturb not only motor function but also cognitive function and emotion processing. The aim of the present study was to assess the consequences of Parkinson's disease (PD) for the ability to decode emotional facial expressions (EFEs), a method commonly used to investigate non-verbal emotion processing. METHODS: Eighteen PD patients participated in the study, together with 18 healthy subjects strictly matched with respect to age, education and sex. The patients were early in the course of the disease and had not yet received any antiparkinsonian treatment. Decoding of EFEs was assessed using a standardized, quantitative task in which the expressions were of moderate intensity, i.e. quite similar to those encountered in everyday life. A set of tests also assessed executive function. Visuospatial perception, depression and anxiety were measured. RESULTS: Early in the course of the disease, untreated PD patients were significantly impaired in decoding EFEs, as well as in executive function. The deficits were significantly interrelated, although neither was significantly related to the severity of the motor symptoms. Visuospatial perception was not impaired, and the patients' impairment was related neither to their depression nor to their anxiety scores. The PD patients' impairment in decoding EFEs was related to a systematic response bias. CONCLUSION: Early in the course of PD, non-verbal emotional information processing is disturbed. This suggests that in PD, nigrostriatal dopaminergic depletion leads not only to motor and cognitive disturbances but also to deficits in emotional information processing. The observed correlation pattern does not enable adoption of a clear-cut position in the debate over totally or partially segregated functional organization of the basal ganglia circuits.

13.
Affective judgments can often be influenced by emotional information people unconsciously perceive, but the neural mechanisms responsible for these effects and how they are modulated by individual differences in sensitivity to threat are unclear. Here we studied subliminal affective priming by recording brain potentials to surprise faces preceded by 30-msec happy or fearful prime faces. Participants showed valence-consistent changes in affective ratings of surprise faces, although they reported no knowledge of prime-face expressions, nor could they discriminate between prime-face expressions in a forced-choice test. In conjunction with the priming effect on affective evaluation, larger occipital P1 potentials at 145-175 msec were found with fearful than with happy primes, and source analyses implicated the bilateral extrastriate cortex in this effect. Later brain potentials at 300-400 msec were enhanced with happy versus fearful primes, which may reflect differential attentional orienting. Personality testing for sensitivity to threat, especially social threat, was also used to evaluate individual differences potentially relevant to subliminal affective priming. Indeed, participants with high trait anxiety demonstrated stronger affective priming and greater P1 differences than did those with low trait anxiety, and these effects were driven by fearful primes. Results thus suggest that unconsciously perceived affective information influences social judgments by altering very early perceptual analyses, and that this influence is accentuated to the extent that people are oversensitive to threat. In this way, perception may be subject to a variety of influences that govern social preferences in the absence of concomitant awareness of such influences.
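To make the P1-window comparison concrete, the sketch below uses simulated single-channel ERP data (the sampling rate, epoch length, and variable names are illustrative assumptions, not the authors' pipeline) to contrast mean occipital amplitude in the 145-175 ms window between fearful-prime and happy-prime trials.

```python
# Minimal sketch with simulated data: compare mean occipital ERP amplitude in
# the 145-175 ms (P1) window between fearful-prime and happy-prime trials.
import numpy as np

rng = np.random.default_rng(0)
sfreq = 500.0                                 # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1 / sfreq)       # epoch from -100 to 500 ms
n_trials = 120

# trials x time, single occipital channel, in microvolts (simulated)
epochs_fear = rng.normal(size=(n_trials, times.size))
epochs_happy = rng.normal(size=(n_trials, times.size))

window = (times >= 0.145) & (times <= 0.175)  # P1 window
p1_fear = epochs_fear[:, window].mean(axis=1)   # per-trial mean amplitude
p1_happy = epochs_happy[:, window].mean(axis=1)

diff = p1_fear.mean() - p1_happy.mean()
print(f"mean P1 (fearful primes) - mean P1 (happy primes) = {diff:.3f} microvolts")
```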

14.
Social Neuroscience, 2013, 8(6): 633-638
Although neuromyelitis optica (NMO) is classically recognized as an affliction of the optic nerves and spinal cord, recent reports have shown brain atrophy and cognitive dysfunction in this condition. Importantly, emotion-related brain regions appear to be impaired in NMO. However, no studies of emotional processing in NMO have been published. The goal of the current study was to investigate facial emotion recognition in 10 patients with NMO and 10 healthy controls while controlling for relevant cognitive factors. Consistent with previous reports, NMO patients performed poorly across cognitive domains (divided attention, working memory, and information-processing speed). Our findings further demonstrate the relative inability of NMO patients to recognize negative emotions (disgust, anger, and fear) in comparison to controls, with these deficits not explained by other cognitive impairments. The results provide the first evidence that NMO may impair the ability to recognize negative emotions. These impairments appear to be related to possible damage in brain regions underlying emotional networks, including the anterior cingulate cortex, amygdala, and medial prefrontal cortex. The findings advance our understanding of both the cognitive impairment in NMO and the neural networks underlying negative emotions.

15.
The linguistic and cognitive profiles of five deaf adults with a sign language disorder were compared with those of matched deaf controls. The assessment comprised a battery of sign language tests, a signed narrative discourse task, and a neuropsychological test protocol administered in sign language. Spatial syntax and facial processing were examined in detail and correlated with the language and cognitive findings. The battery clearly differentiated the performance of the clinical participants from that of the normal controls. Furthermore, the test performance of the clinical individuals was distinct and showed marked correlations with neurological history as well as with cognitive profiles. The important role of narrative discourse as a clinically sensitive diagnostic tool is described.

16.
The amygdala plays a central role in processing facial affect, responding to diverse expressions and to features shared between expressions. Although speculation exists regarding the nature of the relationships between expression- and feature-specific amygdala reactivity, this matter has not been fully explored. We used functional magnetic resonance imaging and principal component analysis (PCA) in a sample of 300 young adults to investigate patterns related to expression- and feature-specific amygdala reactivity to faces displaying neutral, fearful, angry or surprised expressions. The PCA revealed a two-dimensional correlation structure that distinguished emotional categories. The first principal component separated neutral and surprised from fearful and angry expressions, whereas the second principal component separated neutral and angry from fearful and surprised expressions. This two-dimensional correlation structure of amygdala reactivity may represent specific feature-based cues conserved across discrete expressions. To delineate which feature-based cues characterized this pattern, face stimuli were averaged and then subtracted according to their principal component loadings. The first principal component corresponded to displacement of the eyebrows, whereas the second principal component corresponded to increased exposure of the eye whites together with movement of the brow. Our results suggest a convergent representation of facial affect in the amygdala reflecting feature-based processing of discrete expressions.
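As an illustration of the general analysis shape, the sketch below runs PCA over simulated per-subject amygdala reactivity to the four expression conditions and prints the loadings of the first two components; the data, condition ordering, and variable names are assumptions for demonstration, not the authors' pipeline.

```python
# Hypothetical sketch: PCA over per-subject amygdala reactivity to four
# expression conditions, inspecting how each condition loads on the components.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
conditions = ["neutral", "fearful", "angry", "surprised"]

# rows = subjects, columns = mean amygdala reactivity per condition (simulated)
reactivity = rng.normal(size=(300, len(conditions)))

# standardize each condition, then extract the first two principal components
z = (reactivity - reactivity.mean(axis=0)) / reactivity.std(axis=0)
pca = PCA(n_components=2)
scores = pca.fit_transform(z)        # per-subject component scores
loadings = pca.components_           # how each expression loads on each component

for i, pc in enumerate(loadings, start=1):
    print(f"PC{i} loadings:", dict(zip(conditions, np.round(pc, 2))))
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
```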

17.
Early processing of the six basic facial emotional expressions
Facial emotions represent an important part of the non-verbal communication used in everyday life. Recent studies of emotional processing have implicated differing brain regions for different emotions, but little has been determined about the timing of this processing. Here we presented a large number of unfamiliar faces expressing the six basic emotions, plus neutral faces, to 26 young adults while recording event-related potentials (ERPs). Subjects were naive with respect to the specific questions investigated; it was an implicit emotional task. ERPs showed global effects of emotion from 90 ms (P1), while latency and amplitude differences among emotional expressions were seen from 140 ms (N170 component). Positive emotions evoked the N170 significantly earlier than negative emotions, and the amplitude of the N170 evoked by fearful faces was larger than that evoked by neutral or surprised faces. At longer latencies (330-420 ms) at fronto-central sites, we also found a different pattern of effects among emotions. Localization analyses confirmed the involvement of the superior and middle temporal regions in early processing of facial expressions; the negative emotions elicited later, distinctive activations. The data support a model of automatic, rapid processing of emotional expressions.

18.
Neuropsychological and neuroimaging evidence suggests that the human brain contains facial expression recognition detectors specialized for specific discrete emotions. However, some human behavioral data suggest that humans recognize expressions as similar rather than discrete entities. This latter observation has been taken to indicate that internal representations of facial expressions may be best characterized as varying along continuous underlying dimensions. To examine the potential compatibility of these two views, the present study compared human and support vector machine (SVM) facial expression recognition performance. Separate SVMs were trained to develop fully automatic, optimal recognition of one of six basic emotional expressions in real time, with no explicit training on expression similarity. Performance revealed high recognition accuracy for expression prototypes. Without explicit training in similarity detection, the magnitude of activation across each emotion-specific SVM captured human judgments of expression similarity. This evidence suggests that combinations of expert classifiers based on separate internal representations give rise to similarity judgments between expressions, supporting the appearance of a continuous underlying dimensionality. Further, these data suggest that similarity in expression meaning is supported by superficial similarities in expression appearance. A minimal sketch of this classifier-activation scheme follows.
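The sketch below (random stand-in features and labels; not the authors' implementation) trains one linear SVM per basic emotion in a one-vs-rest scheme and reads out the full vector of decision values for a new face; the graded activations across the emotion-specific classifiers serve as a similarity profile over expressions.

```python
# Hypothetical sketch: one-vs-rest SVMs per emotion; decision values for a test
# face act as a graded similarity profile across expression categories.
import numpy as np
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

rng = np.random.default_rng(2)
emotions = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

X = rng.normal(size=(600, 128))               # stand-in for facial appearance features
y = rng.integers(0, len(emotions), size=600)  # stand-in expression labels (0..5)

# one linear SVM per emotion, trained against all other emotions
clf = OneVsRestClassifier(LinearSVC(C=1.0, max_iter=10000)).fit(X, y)

test_face = rng.normal(size=(1, 128))
activations = clf.decision_function(test_face)[0]  # one value per emotion-specific SVM
profile = dict(zip(emotions, np.round(activations, 2)))

print("predicted expression:", emotions[int(np.argmax(activations))])
print("activation profile (proxy for expression similarity):", profile)
```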

19.
BACKGROUND: Depersonalisation disorder is characterised by emotion suppression, but the cerebral mechanisms of this symptom are not yet fully understood. AIMS: To compare brain activation and autonomic responses of individuals with the disorder and healthy controls. METHOD: Happy and sad emotion expressions in increasing intensities (neutral to intense) were presented in an implicit event-related functional magnetic resonance imaging (fMRI) design with simultaneous measurement of autonomic responses. RESULTS: Participants with depersonalisation disorder showed fMRI signal decreases, whereas the control group showed signal increases in response to emotion intensity increases in both happy and sad expressions. The analysis of evoked haemodynamic responses from regions exhibiting functional connectivity between central and autonomic nervous systems indicated that in depersonalisation disorder initial modulations of haemodynamic response occurred significantly earlier (2 s post-stimulus) than in the control group (4-6 s post-stimulus). CONCLUSIONS: The results suggest that fMRI signal decreases are possible correlates of emotion suppression in depersonalisation disorder.

20.
We review recent research on the neural mechanisms of face recognition from three perspectives: facial discrimination and identification, recognition of facial expressions, and face perception in itself. First, it has been demonstrated that the fusiform gyrus plays a central role in facial discrimination and identification. However, whether the FFA (fusiform face area) is truly specialized for face processing remains controversial; some researchers argue that the FFA is instead related to 'becoming an expert' with certain kinds of visual objects, including faces. The neural mechanisms of prosopagnosia bear directly on this issue. Second, the amygdala appears to be closely involved in the recognition of facial expressions, especially fear. The amygdala, which is connected with the superior temporal sulcus and the orbitofrontal cortex, appears to modulate the function of these cortical regions. The amygdala and the superior temporal sulcus are also related to gaze recognition, which may explain why a patient with bilateral amygdala damage failed to recognize only fearful expressions: information from the eyes is necessary for fear recognition. Finally, even a newborn infant can recognize a face as a face, which is consistent with the innateness hypothesis of face recognition. Some researchers speculate that the neural basis of such face perception is a subcortical network comprising the amygdala, the superior colliculus, and the pulvinar. This network may also underlie the covert recognition observed in prosopagnosic patients.
