Similar Articles
20 similar articles retrieved.
1.
2.
Neuropsychological and neuroimaging evidence suggests that the human brain contains facial expression recognition detectors specialized for specific discrete emotions. However, some human behavioral data suggest that humans recognize expressions as similar and not discrete entities. This latter observation has been taken to indicate that internal representations of facial expressions may be best characterized as varying along continuous underlying dimensions. To examine the potential compatibility of these two views, the present study compared human and support vector machine (SVM) facial expression recognition performance. Separate SVMs were trained to develop fully automatic optimal recognition of one of six basic emotional expressions in real-time with no explicit training on expression similarity. Performance revealed high recognition accuracy for expression prototypes. Without explicit training of similarity detection, magnitude of activation across each emotion-specific SVM captured human judgments of expression similarity. This evidence suggests that combinations of expert classifiers from separate internal neural representations result in similarity judgments between expressions, supporting the appearance of a continuous underlying dimensionality. Further, these data suggest similarity in expression meaning is supported by superficial similarities in expression appearance.
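To make the classifier scheme concrete, the following is a minimal sketch (not the authors' implementation) of one-vs-rest emotion SVMs whose continuous decision values form an "activation profile"; correlating category-averaged profiles gives a stand-in for the expression-similarity structure described above. The feature extraction step, the LinearSVC settings, and the variable names X and y are assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def train_emotion_svms(X, y):
    """Train one binary SVM per emotion (that emotion vs. all other expressions)."""
    classifiers = {}
    for emotion in EMOTIONS:
        clf = LinearSVC(C=1.0)
        clf.fit(X, (y == emotion).astype(int))  # one-vs-rest labels
        classifiers[emotion] = clf
    return classifiers

def activation_profile(classifiers, x):
    """Signed margin of each emotion-specific SVM for one face feature vector."""
    return np.array([classifiers[e].decision_function(x[None, :])[0] for e in EMOTIONS])

def expression_similarity(classifiers, X_test, y_test):
    """Correlate category-averaged activation profiles into a 6 x 6 similarity matrix."""
    profiles = {e: np.mean([activation_profile(classifiers, x)
                            for x in X_test[y_test == e]], axis=0)
                for e in EMOTIONS}
    return np.array([[np.corrcoef(profiles[a], profiles[b])[0, 1] for b in EMOTIONS]
                     for a in EMOTIONS])
```

Under these assumptions, the resulting matrix could be compared against human similarity judgments (e.g. by rank correlation), which is the kind of comparison the abstract reports.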

3.
Intrinsic emotional expressions such as those communicated by faces and vocalizations have been shown to engage specific brain regions, such as the amygdala. Although music constitutes another powerful means to express emotions, the neural substrates involved in its processing remain poorly understood. In particular, it is unknown whether brain regions typically associated with processing ‘biologically relevant’ emotional expressions are also recruited by emotional music. To address this question, we conducted an event-related functional magnetic resonance imaging study in 47 healthy volunteers in which we directly compared responses to basic emotions (fear, sadness and happiness, as well as neutral) expressed through faces, non-linguistic vocalizations and short novel musical excerpts. Our results confirmed the importance of fear in emotional communication, as revealed by significant blood oxygen level-dependent signal increases in a cluster within the posterior amygdala and anterior hippocampus, as well as in the posterior insula, across all three domains. Moreover, subject-specific amygdala responses to fearful music and vocalizations were correlated, consistent with the proposal that the brain circuitry involved in the processing of musical emotions might be shared with the one that has evolved for vocalizations. Overall, our results show that processing of fear expressed through music engages some of the same brain areas known to be crucial for detecting and evaluating threat-related information.

4.
An ongoing debate in human memory research is whether the encoding and the retrieval of memory engage the same part of the hippocampus and the same cells, or whether encoding preferentially involves the anterior part of the hippocampus and retrieval its posterior part. Here, we used a human-to-rat translational behavioral approach combined with high-resolution molecular imaging to address this issue. We showed that successful memory performance is predicted by encoding and reactivation patterns only in the dorsal part of the rat hippocampus (posterior part in humans), but not in the ventral part (anterior part in humans). Our findings support the view that the encoding and the retrieval processes per se are not segregated along the longitudinal axis of the hippocampus, but that activity predictive of successful memory is segregated and concerns specifically the dorsal part of the hippocampus. In addition, we found evidence that these processes are likely to be mediated by the activation/reactivation of the same cells at this level. Given the translational character of the task, our results suggest that both the encoding and the retrieval processes take place in the same cells of the posterior part of the human hippocampus. © 2015 Wiley Periodicals, Inc.

5.
BACKGROUND: Washing symptoms in Obsessive-Compulsive Disorder (OCD) are associated with increased trait sensitivity to disgust. This study explored neural systems underlying sensitivity to symptom-unrelated disgust and fear in OCD using functional neuroimaging. METHODS: Seventeen OCD subjects and 19 controls viewed facial expressions of disgust and fear (versus neutral) presented just above the level of conscious awareness in a backward masking paradigm. RESULTS: The OCD group showed greater activation than controls in the left ventrolateral prefrontal cortex, but reduced activation in the thalamus, to facial expressions of disgust. There were no between-group differences in response to fear. Further analysis using a median-split to divide OCD subjects into high and low washers suggested that the enhanced ventrolateral prefrontal cortex response was being driven by predominantly female OCD subjects with high washing symptoms. These subjects also reported higher levels of trait sensitivity to disgust. CONCLUSIONS: These findings are consistent with previous reports of increased response to symptom-relevant and generally disgusting stimuli in neural regions associated with disgust and autonomic response processing in OCD patients with prominent washing symptoms. Together, these findings point to increased sensitivity to disgust stimuli as a component of the pathophysiology of the washing/contamination symptom dimension of OCD.

6.
Task instructions modulate neural responses to fearful facial expressions.
BACKGROUND: The amygdala, hippocampus, ventral, and dorsal prefrontal cortices have been demonstrated to be involved in the response to fearful facial expressions. Little is known, however, about the effect of task instructions upon the intensity of responses within these regions to fear-inducing stimuli. METHODS: Using functional magnetic resonance imaging, we examined neural responses to alternating, 30-sec blocks of fearful and neutral expressions in nine right-handed male volunteers during three different 5-min conditions: 1) passive viewing; 2) performance of a gender-decision task, with no explicit judgment of facial emotion; 3) performance of an emotionality judgment task - an explicitly emotional task. RESULTS: There was a significant effect of task upon activation within the left hippocampus and the left inferior occipital gyrus, and upon the magnitude of response within the left hippocampus, with maximal activation in these regions occurring during passive viewing, and minimal during performance of the explicit task. Performance of the gender-decision and explicit tasks, but not passive viewing, was also associated with activation within ventral frontal cortex. CONCLUSIONS: Neural responses to fearful facial expressions are modulated by task instructions.

7.
Amygdala response to facial expressions in children and adults.
BACKGROUND: The amygdala plays a central role in the human response to affective or emotionally charged stimuli, particularly fear-producing stimuli. We examined the specificity of the amygdala response to facial expressions in adults and children. METHODS: Six adults and 12 children were scanned in a 1.5-T scanner during passive viewing of fearful and neutral faces using an EPI BOLD sequence. All scans were registered to a reference brain, and analyses of variance were conducted on the pooled data to examine interactions with age and gender. RESULTS: Overall, we observed predominantly left amygdala and substantia innominata activity during the presentation of nonmasked fearful faces relative to fixation, and a decrease in activation in these regions with repeated exposure to the faces. Adults showed increased left amygdala activity for fearful faces relative to neutral faces. This pattern was not observed in the children who showed greater amygdala activity with neutral faces than with fearful faces. For the children, there was an interaction of gender and condition whereby boys but not girls showed less activity with repeated exposure to the fearful faces. CONCLUSIONS: This is the first study to examine developmental differences in the amygdala response to facial expressions using functional magnetic resonance imaging.

8.
Brain Stimulation, 2021, 14(3): 607-615.
Background: Neuroimaging studies suggest that the inferior frontal operculum (IFO) is part of a neuronal network involved in facial expression processing, but the causal role of this region in emotional face discrimination remains elusive. Objective: We used cathodal (inhibitory) tDCS to test whether the right (r-IFO) and left (l-IFO) IFO play a role in discriminating basic facial emotions in healthy volunteers. Specifically, we tested if the two sites are selectively involved in the processing of facial expressions conveying high or low arousal emotions. Based on the Arousal Hypothesis, we expected to find a modulation of high and low arousal emotions by cathodal tDCS of the r-IFO and the l-IFO, respectively. Methods: First, we validated an Emotional Faces Discrimination Task (EFDT). Then, we targeted the r-IFO and the l-IFO with cathodal tDCS (i.e. the cathode was placed over the right or left IFO, while the anode was placed over the contralateral supraorbital area) during facial emotion discrimination on the EFDT. Non-active (i.e. sham) tDCS was a control condition. Results: Overall, participants manifested the “happy face advantage”. Interestingly, tDCS to the r-IFO enhanced discrimination of faces expressing anger (a high arousal emotion), whereas tDCS to the l-IFO decreased discrimination of faces expressing sadness (a low arousal emotion). Conclusions: Our findings revealed a differential causal role of the r-IFO and l-IFO in the discrimination of specific high and low arousal emotions. Crucially, these results suggest that cathodal tDCS might reduce the neural noise triggered by facial emotions, improving discrimination of high arousal emotions but disrupting discrimination of low arousal emotions. These findings offer new insights for treating clinical populations with deficits in processing facial expressions.

9.
This study investigates the voluntary production of emotional facial expressions in 43 brain-damaged and 9 control subjects. The expressions of right- and left-hemisphere lesion groups did not differ significantly, but those of the anterior lesion group were impoverished relative to the posterior lesion and control groups. Deficits of voluntary expression were dissociable from impairments in "non-emotional" facial-motor functions, dysphasia, and unilateral neglect.

10.
In psychiatrically well subjects, modulation of event-related potentials (ERPs) by emotional facial expressions is found in several components from approximately 100 ms onward. A face-related ERP, the N170, is abnormally reduced in schizophrenia in response to faces relative to other complex objects, and research suggests emotional modulation of the N170 may be reduced as well. To further examine facial emotion modulation of the N170, subjects detected neutral facial expressions from among five emotional expressions (happy, sad, fearful, angry, and disgusted). Over occipitotemporal sites, psychiatrically well subjects showed bilateral differences in N170 amplitude among expressions (P = 0.014). Schizophrenia subjects failed to show this modulation (P = 0.551). Accuracy on the task did not differ between groups, nor did the pattern of errors. However, in patients, greater positive and negative symptom ratings were associated with increased failure to button-press to neutral faces, suggesting misattribution of emotion to neutral expressions in the more ill patients. Because the N170 is largely specific to faces, these results suggest that an impairment specific to the visual processing of facial expressions contributes to the well-known behavioral abnormalities in facial emotion tasks in schizophrenia.

11.
We investigated facial expressivity in 19 people with Parkinson's disease (PD; 14 men and 5 women) and 26 healthy controls (13 men and 13 women). Participants engaged in experimental situations that were designed to evoke emotional facial expressions, including watching video clips and holding conversations, and were asked to pose emotions and imitate nonemotional facial movements. Expressivity was measured with subjective rating scales, objective facial measurements (Facial Action Coding System), and self-report questionnaires. As expected, PD participants showed reduced spontaneous facial expressivity across experimental situations. PD participants also had more difficulty than controls posing emotional expressions and imitating nonemotional facial movements. Despite these difficulties, however, PD participants' overall level of expressivity was still tied to emotional experience and social context.

12.
Early processing of the six basic facial emotional expressions
Facial emotions represent an important part of non-verbal communication used in everyday life. Recent studies on emotional processing have implicated differing brain regions for different emotions, but little has been determined on the timing of this processing. Here we presented a large number of unfamiliar faces expressing the six basic emotions, plus neutral faces, to 26 young adults while recording event-related potentials (ERPs). Subjects were naive with respect to the specific questions investigated; it was an implicit emotional task. ERPs showed global effects of emotion from 90 ms (P1), while latency and amplitude differences among emotional expressions were seen from 140 ms (N170 component). Positive emotions evoked the N170 significantly earlier than negative emotions, and the amplitude of the N170 evoked by fearful faces was larger than that evoked by neutral or surprised faces. At longer latencies (330-420 ms) at fronto-central sites, we also found a different pattern of effects among emotions. Localization analyses confirmed the superior and middle temporal regions for early processing of facial expressions; the negative emotions elicited later, distinctive activations. The data support a model of automatic, rapid processing of emotional expressions.
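As a concrete illustration of how peak measures like P1 and N170 latency and amplitude can be quantified, here is a hedged numpy sketch that finds the extremum of a condition-averaged waveform within conventional time windows. The window boundaries, sampling rate, and simulated waveforms are assumptions for demonstration, not the study's actual pipeline.

```python
import numpy as np

def peak_in_window(erp, times, tmin, tmax, polarity=+1):
    """Return (latency_ms, amplitude) of the extremum within [tmin, tmax] ms.

    polarity=+1 finds the most positive deflection (e.g. P1),
    polarity=-1 the most negative one (e.g. N170).
    """
    mask = (times >= tmin) & (times <= tmax)
    segment = polarity * erp[mask]
    idx = np.argmax(segment)
    return times[mask][idx], polarity * segment[idx]

# Assumed example: 1 ms sampling from -100 to 500 ms, simulated grand averages
times = np.arange(-100, 500)
rng = np.random.default_rng(1)
erp_fear = rng.normal(scale=0.5, size=times.size)     # stand-in for a real average
erp_neutral = rng.normal(scale=0.5, size=times.size)

p1_lat, p1_amp = peak_in_window(erp_fear, times, 80, 130, polarity=+1)
n170_lat_f, n170_amp_f = peak_in_window(erp_fear, times, 130, 200, polarity=-1)
n170_lat_n, n170_amp_n = peak_in_window(erp_neutral, times, 130, 200, polarity=-1)
print(f"N170 fearful: {n170_lat_f} ms, {n170_amp_f:.2f} vs neutral: {n170_lat_n} ms")
```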

13.
The study examined the perception of facial expressions of different emotional intensities in obsessive-compulsive disorder (OCD) subtypes. Results showed that the High Risk Assessment and Checking subtype was more sensitive in perceiving the emotions fear and happiness. This suggests that altered affective processing may underlie the clinical manifestation of OCD.

14.
Selective disruption of the recognition of facial expressions of anger
Appetitive aggression occurs in the context of resource/dominance disputes in a wide variety of species. Hence, the possibility arises that a specific neural system may have evolved to detect and coordinate responses to this specific form of challenge or threat. The dopamine system has been implicated in the processing of signals of aggression in social-agonistic encounters in several species. Here we report that dopaminergic antagonism in healthy male volunteers, following acute administration of the dopamine D2-class receptor antagonist sulpiride, leads to a selective disruption in the recognition of facial expressions of anger (signals of appetitive aggression in humans), but leaves intact recognition of other emotions and the matching of unfamiliar faces.

15.
The amygdala plays a central role in processing facial affect, responding to diverse expressions and features shared between expressions. Although speculation exists regarding the nature of relationships between expression- and feature-specific amygdala reactivity, this matter has not been fully explored. We used functional magnetic resonance imaging and principal component analysis (PCA) in a sample of 300 young adults to investigate patterns related to expression- and feature-specific amygdala reactivity to faces displaying neutral, fearful, angry or surprised expressions. The PCA revealed a two-dimensional correlation structure that distinguished emotional categories. The first principal component separated neutral and surprised from fearful and angry expressions, whereas the second principal component separated neutral and angry from fearful and surprised expressions. This two-dimensional correlation structure of amygdala reactivity may represent specific feature-based cues conserved across discrete expressions. To delineate which feature-based cues characterized this pattern, face stimuli were averaged and then subtracted according to their principal component loadings. The first principal component corresponded to displacement of the eyebrows, whereas the second principal component corresponded to increased exposure of eye whites together with movement of the brow. Our results suggest a convergent representation of facial affect in the amygdala reflecting feature-based processing of discrete expressions.
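The following is a small, hedged sketch of the kind of analysis described above: PCA over per-subject amygdala reactivity to the four expression conditions, with the component loadings indicating how the expressions separate along each dimension. Array shapes, variable names, and the simulated data are assumptions, not the study's pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

CONDITIONS = ["neutral", "fearful", "angry", "surprised"]

def expression_components(reactivity):
    """reactivity: (n_subjects, 4) array of amygdala responses, one column per expression."""
    pca = PCA(n_components=2)
    scores = pca.fit_transform(reactivity)   # subject scores on the two components
    loadings = pca.components_               # (2, 4): expression weights on each component
    return scores, loadings, pca.explained_variance_ratio_

# Assumed example with simulated reactivity values for 300 subjects
rng = np.random.default_rng(0)
reactivity = rng.normal(size=(300, 4))
scores, loadings, explained = expression_components(reactivity)
for pc, weights in enumerate(loadings, start=1):
    print(f"PC{pc} ({explained[pc - 1]:.0%} variance):",
          dict(zip(CONDITIONS, np.round(weights, 2))))
```

With real data, the sign pattern of each component's loadings is what would group, for example, neutral and surprised against fearful and angry expressions, as reported in the abstract.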

16.
17.
The ability to recognize accurately and respond appropriately to facial expressions of emotion is essential for interpersonal interaction. Individuals with mental retardation typically are deficient in these skills. The ability of 7 adults, 1 with severe and 6 with moderate mental retardation, to recognize facial expressions of emotion correctly was assessed. Then, they were taught this skill using a combination of a discrimination training procedure for differentiating facial movements, directed rehearsal, and Ekman and Friesen's "flashing photograph" technique. Their average increase in accuracy over baseline was at least 30% during the course of the training and over 50% during the last 5 days of the training phase. Further, these individuals were able to generalize their skills from posed photographs to videotaped role plays and were able to maintain their enhanced skills during the 8 to 9 months following the termination of training. This is the first study to show that individuals with mental retardation can be taught skills that enhance their ability to recognize facial expressions of emotion.

18.
Alerting stimuli, such as intense tones, presented to cats in wakefulness (W) elicit the orienting response (OR) and/or the acoustic startle reflex (ASR) in conjunction with elicited ponto-geniculo-occipital waves (PGOE) from the lateral geniculate body (LGB) and elicited waves from the thalamic central lateral nucleus (CLE). Alerting stimuli presented during rapid eye movement sleep (REM) and non-rapid eye movement sleep (NREM) also elicit PGOE. We presented tones in W, REM and NREM to determine whether CLE could be obtained in sleep and to examine the patterns of responsiveness of PGOE and CLE across behavioral states. Also, we recorded ASR and OR and compared the response patterns of behavioral and central correlates of alerting. The subjects were 7 cats; all exhibited spontaneously occurring waves in LGB and CL. All cats exhibited PGOE and 5 cats exhibited CLE in W, REM and NREM. PGOE and CLE showed less evidence of habituation than did ASR and OR. The pattern of responsiveness of CLE across behavioral states was different from that found for PGOE, and spontaneous CL waves were much rarer than the LGB waves. ASR was elicited in 5 cats during W trials, and in 3 cats during REM trials. OR habituated rapidly in W and did not occur in REM and NREM. The data indicate that central mechanisms of alerting function in sleep states as well as in W and suggest that CLE and PGOE reflect activity in mechanisms underlying cortical desynchronization and visual processes which may act in concert during alerting.

19.
S. B. Hamann, R. Adolphs. Neuropsychologia, 1999, 37(10): 1135-1141.
Bilateral damage to the amygdala in humans has been previously linked to two deficits in recognizing emotion in facial expressions: recognition of individual basic emotions, especially fear, and recognition of similarity among emotional expressions. Although several studies have examined recognition of individual emotions following amygdala damage, only one subject has been examined on recognition of similarity. To assess the extent to which deficits in recognizing similarity among facial expressions might be a general consequence of amygdala damage, we examined this ability in two subjects with complete bilateral amygdala damage. Both subjects had previously demonstrated entirely normal recognition of individual facial emotions. Here we report that these two patients also are intact in their ability to recognize similarity between emotional expressions. These results indicate that, like the recognition of individual basic emotions in facial expressions, the recognition of similarity among emotional expressions does not have an absolute dependence on the amygdala.

20.
We examined how cognitive set influences the long latency components of normal postural responses in the legs. We disturbed the postural stability of standing human subjects with sudden toe-up ankle rotations. To influence the subjects' cognitive set, we varied the rotation amplitude either predictably (serial 4 degrees versus serial 10 degrees) or unpredictably (random mixture of 4 degrees and 10 degrees). The subjects' responses to these ankle rotations were assessed from the EMG activity of the tibialis anterior, the medial gastrocnemius, and the vastus lateralis muscles of the left leg. The results indicate that, when the rotation amplitude is predictable, only the amplitude of the long latency (LL) response in tibialis anterior and vastus lateralis varied directly with perturbation size. Furthermore, when the rotation amplitude is unpredictable, the central nervous system selects a default amplitude for the LL response in the tibialis anterior. When normal subjects are exposed to 2 perturbation amplitudes which include the potential risk of falling, the default LL response in tibialis anterior appropriately anticipates the larger amplitude perturbation rather than the smaller or an intermediate one.
