Similar Articles
20 similar articles were retrieved.
1.
The facial expression of contempt has been regarded as communicating feelings of moral superiority. Contempt is an emotion that is closely related to disgust, but in contrast to disgust, contempt is inherently interpersonal and hierarchical. The aim of this study was twofold: first, to investigate the hypothesis of preferential amygdala responses to contempt expressions versus disgust; and second, to investigate whether, at a neural level, men would respond more strongly than women to biological signals of interpersonal superiority (e.g., contempt). We performed an experiment using functional magnetic resonance imaging (fMRI), in which participants watched facial expressions of contempt and disgust in addition to neutral expressions. The faces were presented as distractors in an oddball task in which participants had to react to one target face. Facial expressions of contempt and disgust activated a network of brain regions, including prefrontal areas (superior, middle and medial prefrontal gyrus), anterior cingulate, insula, amygdala, parietal cortex, fusiform gyrus, occipital cortex, putamen and thalamus. Contemptuous faces did not elicit stronger amygdala activation than did disgusted expressions. To limit the number of statistical comparisons, we confined our analyses of sex differences to the frontal and temporal lobes. Men displayed stronger brain activation than women to facial expressions of contempt in the medial frontal gyrus, inferior frontal gyrus, and superior temporal gyrus. Conversely, women showed stronger neural responses than men to facial expressions of disgust. In addition, the effect of stimulus sex differed for men versus women. Specifically, women showed stronger responses to male contemptuous faces (as compared to female expressions) in the insula and middle frontal gyrus. Contempt has been conceptualized as signaling perceived moral violations of social hierarchy, whereas disgust would signal violations of physical purity. Thus, our results suggest a neural basis for sex differences in moral sensitivity regarding hierarchy on the one hand and physical purity on the other.

2.
Whether common or distinct neural systems underpin perception of different emotions and the degree to which these systems are automatically engaged during emotional perception are unresolved. We performed an event-related fMRI experiment in which subjects viewed morphed emotional faces displaying low or high intensities of disgust, fear, happiness, or sadness under two task conditions. The amygdala and fusiform cortex responded to high-intensity expressions of all emotions, independent of task. Right superior temporal sulcus showed an additive effect of the emotion-directed task and high-intensity emotion. Ventromedial prefrontal and somatosensory cortices, regions implicated in providing representations of somatic states, showed enhanced activity during explicit emotional judgments. We failed to find predicted differences between emotions. The results suggest that amygdala contributes to task-independent perceptual processing of a range of emotions. We interpret ventromedial prefrontal and somatosensory cortex activations as evidence that these regions contribute to explicit emotion processing through linking emotion perception with representations of somatic states previously engendered by emotions.

3.
The fusiform face area (FFA) and the superior temporal sulcus (STS) are suggested to process facial identity and facial expression information respectively. We recently demonstrated a functional dissociation between the FFA and the STS as well as correlated sensitivity of the STS and the amygdala to facial expressions using an interocular suppression paradigm [Jiang, Y., He, S., 2006. Cortical responses to invisible faces: dissociating subsystems for facial-information processing. Curr. Biol. 16, 2023-2029.]. In the current event-related brain potential (ERP) study, we investigated the temporal dynamics of facial information processing. Observers viewed neutral, fearful, and scrambled face stimuli, either visibly or rendered invisible through interocular suppression. Relative to scrambled face stimuli, intact visible faces elicited larger positive P1 (110-130 ms) and larger negative N1 or N170 (160-180 ms) potentials at posterior occipital and bilateral occipito-temporal regions respectively, with the N170 amplitude significantly greater for fearful than neutral faces. Invisible intact faces generated a stronger signal than scrambled faces at 140-200 ms over posterior occipital areas whereas invisible fearful faces (compared to neutral and scrambled faces) elicited a significantly larger negative deflection starting at 220 ms along the STS. These results provide further evidence for cortical processing of facial information without awareness and elucidate the temporal sequence of automatic facial expression information extraction.
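The P1/N170 findings above rest on mean amplitude differences in fixed latency windows at posterior electrodes. As a rough, hedged illustration of how such ERP component amplitudes are commonly quantified (not the authors' actual pipeline), the sketch below uses placeholder epoch arrays, an assumed 1000 Hz sampling rate, and arbitrary channel indices:

```python
import numpy as np
from scipy import stats

# Hypothetical epoched EEG data: (trials, channels, samples), baseline-corrected,
# sampled at 1000 Hz with epochs starting at stimulus onset (t = 0 ms).
sfreq = 1000.0
occipito_temporal = [10, 11]            # assumed indices of bilateral occipito-temporal channels

def mean_amplitude(epochs, tmin_ms, tmax_ms, channels):
    """Mean voltage over a latency window and channel set, per trial."""
    lo, hi = int(tmin_ms * sfreq / 1000), int(tmax_ms * sfreq / 1000)
    return epochs[:, channels, lo:hi].mean(axis=(1, 2))

# Placeholder arrays standing in for real fearful/neutral face epochs.
rng = np.random.default_rng(0)
fearful_epochs = rng.normal(size=(80, 32, 600)) * 1e-6
neutral_epochs = rng.normal(size=(80, 32, 600)) * 1e-6

# N170 window (160-180 ms) for fearful vs. neutral intact faces; a more negative
# N170 for fearful faces would appear as fear < neutral here.
n170_fear = mean_amplitude(fearful_epochs, 160, 180, occipito_temporal)
n170_neut = mean_amplitude(neutral_epochs, 160, 180, occipito_temporal)

t, p = stats.ttest_ind(n170_fear, n170_neut)
print(f"N170 fear vs. neutral: t = {t:.2f}, p = {p:.3f}")
```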

4.
The current event-related fMRI study examined the relative involvement of different parts of the medial temporal lobe (MTL), particularly the contribution of hippocampus and perirhinal cortex, in either intentional or incidental recognition of famous faces in contrast to unfamiliar faces. Our intention was to further explore the controversial contribution of the MTL to semantic memory tasks. Subjects viewed a sequence of famous and unfamiliar faces. Two tasks were used, encouraging attention to either fame or gender. In the fame task, the subjects were requested to identify the person when seeing his/her face and also to try to generate the name of this person. In the gender task, the subjects were asked to judge a person's gender when seeing his/her face. Visual processing was hence directed to gender and was thereby expected to diminish attention to semantic information, leading only to a "passive" registration of famous and non-familiar faces. Recognition of famous faces, in both contrasts, produced significant activations in the MTL. First, during the intentional recognition (the person identification task), increased activity was observed in the anterolateral part of the left hippocampus, in proximity to the amygdala. Second, during the incidental recognition of famous faces (the gender classification task), there was increased activity in the left posterior MTL with a focus in the perirhinal cortex. Our results suggest that the hippocampus may be centrally involved in the intentional retrieval of semantic memories while the perirhinal cortex is associated with the incidental recognition of semantic information.

5.
Selective attention, particularly during the processing of emotionally evocative events, is a crucial component of adolescent development. We used functional magnetic resonance imaging (fMRI) with adolescents and adults to examine developmental differences in activation in a paradigm that involved selective attention during the viewing of emotionally engaging face stimuli. We evaluated developmental differences in neural activation for three comparisons: (1) directing attention to subjective responses to fearful facial expressions relative to directing attention to a nonemotional aspect (nose width) of fearful faces, (2) viewing fearful relative to neutral faces while attending to a nonemotional aspect of the face, and (3) viewing fearful relative to neutral faces while attention was unconstrained (passive viewing). The comparison of activation across attention tasks revealed greater activation in the orbital frontal cortex in adults than in adolescents. Conversely, when subjects attended to a nonemotional feature, fearful relative to neutral faces influenced activation in the anterior cingulate more in adolescents than in adults. When attention was unconstrained, adolescents relative to adults showed greater activation in the anterior cingulate, bilateral orbitofrontal cortex, and right amygdala in response to the fearful relative to neutral faces. These findings suggest that adults show greater modulation of activity in relevant brain structures based on attentional demands, whereas adolescents exhibit greater modulation based on emotional content.

6.
Rapid interruption of ongoing motor actions is crucial to respond to unexpected and potentially threatening situations. Yet, it remains unclear how motor inhibition interacts with emotional processes. Here we used a modified stop-signal task including an emotional component (fearful faces) to investigate whether neural circuits engaged by action suppression are modulated by task-irrelevant threat-related signals. Behavioral performance showed that reaction times were prolonged in the presence of incidental threat information, and this emotional slowing was enhanced when incorrect responses were made following stop signals. However, the speed and efficacy of voluntary inhibition was unaffected by emotion. Brain imaging data revealed that emotional cues during stop trials interacted with activity in limbic regions encompassing the basal amygdala and sublenticular extended amygdala region, as well as with the supplementary motor area (SMA). In addition, successful motor inhibition to threat signals selectively recruited a region in lateral orbitofrontal cortex, distinct from areas in inferior frontal gyrus typically associated with voluntary inhibition. Activity in primary motor cortex was lower when incorrect responses were made on stop signal trials accompanied by a fearful face, relative to neutral, in parallel with the slower response times observed behaviorally. Taken together, our findings suggest that the amygdala may not only promote protective motor reactions in emotionally-significant contexts (such as freezing or defensive behavior) but also influence the execution of ongoing actions by modulating brain circuits involved in motor control, so as to afford quick and adaptive changes in current behavior.

7.
Kilts CD, Egan G, Gideon DA, Ely TD, Hoffman JM. NeuroImage, 2003, 18(1): 156-168
Facial expressions of emotion powerfully influence social behavior. The distributed network of brain regions thought to decode these social signals has been empirically defined using static, usually photographic, displays of such expressions. Facial emotional expressions are, however, highly dynamic signals that encode the emotion message in facial action patterns. This study sought to determine whether the encoding of facial expressions of emotion by static or dynamic displays is associated with different neural correlates for their decoding. We used positron emission tomography to compare patterns of brain activity in healthy men and women during the explicit judgment of emotion intensity in static and dynamic facial expressions of anger and happiness. Compared to judgments of spatial orientation for moving neutral facial expressions, the judgment of anger in dynamic expressions was associated with increased right-lateralized activity in the medial, superior, middle, and inferior frontal cortex and cerebellum, while judgments of happiness were associated with relative activation of the cuneus, temporal cortex, and the middle, medial, and superior frontal cortex. In contrast, the perception of anger or happiness in static facial expressions activated a motor, prefrontal, and parietal cortical network previously shown to be involved in motor imagery. The direct contrast of dynamic and static expressions indicated differential activation of visual area V5, superior temporal sulcus, periamygdaloid cortex, and cerebellum for dynamic angry expressions and differential activation of area V5, extrastriate cortex, brain stem, and middle temporal cortex for dynamic happy expressions. Thus, a distributed set of neural activations is related to the analysis of emotion messages in the nearly constant biological motion of the face, and these activations differ for angry and happy expressions. Static displays of facial emotional expression may represent noncanonical stimuli that are processed for emotion content by mental strategies and neural events distinct from their more ecologically relevant dynamic counterparts.

8.
The amygdala is related to recognition of faces and emotions, and functional magnetic resonance imaging (fMRI) studies have reported that the amygdala habituates over time with repetition of facial stimuli. When subjects are presented repeatedly with unfamiliar faces, they come to gradually recognize the unfamiliar faces as familiar. To investigate the brain areas participating in the acquisition of familiarity to repeatedly presented unfamiliar faces, we conducted an fMRI study in 16 healthy subjects. During the task periods, the subjects were instructed to view the presented unfamiliar faces repeatedly and to judge whether each face was male or female or whether it had emotional valence. The experiment consisted of nine sessions. To clarify the brain areas that showed increasing or decreasing activation as the experimental sessions proceeded, we analyzed the fMRI data using specified linear covariates in the face recognition task from the first session to the ninth session. Imaging data were investigated on a voxel-by-voxel basis for single-group analysis according to the random-effects model using Statistical Parametric Mapping. The bilateral posterior cingulate cortices showed significant increases in activity as the experimental sessions proceeded, while the activation in the right amygdala and the left medial fusiform gyrus decreased. Thus, the posterior cingulate cortex may play an important role in the acquisition of facial familiarity.
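One plausible reading of the "specified linear covariates" analysis is a linear trend across the nine sessions: per-session response estimates are weighted by a mean-centered linear covariate, and the resulting trend scores are tested across subjects in a random-effects analysis. The sketch below is only an illustration under assumed data shapes, not the study's SPM implementation:

```python
import numpy as np
from scipy import stats

n_subjects, n_sessions = 16, 9
rng = np.random.default_rng(1)

# Hypothetical per-session response estimates (e.g., ROI betas) for one region,
# shape (subjects, sessions).
roi_betas = rng.normal(size=(n_subjects, n_sessions))

# Linear covariate over sessions, mean-centered so it captures the trend only.
trend = np.arange(n_sessions) - (n_sessions - 1) / 2.0

# Per-subject trend score = dot product of the session betas with the covariate.
subject_trend = roi_betas @ trend

# Random-effects (one-sample) test: is the linear trend non-zero across subjects?
t, p = stats.ttest_1samp(subject_trend, 0.0)
print(f"linear trend across sessions: t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")
```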

9.
Schmidt CF, Boesiger P, Ishai A. NeuroImage, 2005, 26(3): 852-859
In this study, we compared fMRI activation measured with gradient- and spin-echo-based fMRI during visual perception of faces, which is mediated by neural activation within a distributed cortical network. With both fMRI techniques, bilateral activation was observed in multiple regions including the inferior occipital gyrus, fusiform gyrus, superior temporal sulcus, amygdala, inferior frontal gyrus, and orbitofrontal cortex. When compared with the gradient-echo sequence, activation measured with the spin-echo sequence was significantly reduced. This decrease was manifested by smaller cluster size, lower statistical significance, smaller amplitude of the fMRI signal, and smaller number of subjects who showed activation in all face-responsive regions. In orbitofrontal cortex, a region prone to susceptibility-related signal dephasing, the spin-echo acquisition considerably restored the signal, but did not reveal stronger activation when compared with the gradient-echo acquisition. Our data indicate that optimized GE sequences that reduce susceptibility artefacts are sufficient to detect activation in regions such as the orbitofrontal cortex.

10.
Emotions are complex events recruiting distributed cortical and subcortical cerebral structures, and how functional integration within the involved neural circuits varies with the nature of the different emotions is still unknown. Using fMRI, we measured the neural responses elicited by films representing basic emotions (fear, disgust, sadness, happiness). The amygdala and the associative cortex were conjointly activated by all basic emotions. Furthermore, distinct arrays of cortical and subcortical brain regions were additionally activated by each emotion, with the exception of sadness. Such findings informed the definition of three effective connectivity models, testing for the functional integration of visual cortex and amygdala, as regions processing all emotions, with domain-specific regions, namely: i) for fear, the frontoparietal system involved in preparing adaptive motor responses; ii) for disgust, the somatosensory system, reflecting protective responses against contaminating stimuli; and iii) for happiness, the medial prefrontal and temporoparietal cortices involved in understanding joyful interactions. Consistently with these domain-specific models, the results of the effective connectivity analysis indicate that the amygdala is involved in distinct functional integration effects with cortical networks processing sensorimotor, somatosensory, or cognitive aspects of basic emotions. The resulting effective connectivity networks may serve to regulate motor and cognitive behavior based on the quality of the induced emotional experience.
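The abstract does not specify how the effective connectivity models were estimated. A common, simple way to test condition-dependent coupling between a seed region (e.g., the amygdala) and a domain-specific target is a psychophysiological interaction (PPI) style regression; the sketch below uses entirely hypothetical time series and block structure:

```python
import numpy as np

rng = np.random.default_rng(2)
n_scans = 200

# Hypothetical data: seed (amygdala) and target (e.g., somatosensory ROI) time series,
# plus a psychological regressor coding the emotion condition (e.g., disgust blocks = 1).
seed = rng.normal(size=n_scans)
condition = (np.arange(n_scans) // 20) % 2          # alternating blocks, illustrative only
target = 0.3 * seed * condition + rng.normal(size=n_scans)

# PPI design matrix: seed, condition, their interaction, and an intercept.
X = np.column_stack([seed, condition, seed * condition, np.ones(n_scans)])
beta, *_ = np.linalg.lstsq(X, target, rcond=None)

# A positive interaction weight indicates stronger seed-target coupling in that condition.
print(f"interaction (PPI) coefficient: {beta[2]:.3f}")
```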

11.
The amygdala has been consistently isolated as a key neural substrate for processing facial displays of affect. Recent evidence from human lesion and functional neuroimaging studies has begun to challenge the notion that the amygdala is reserved for signals of threat (fear/anger). We performed a 4 T fMRI study in which 20 subjects viewed a contemporary set of photographs displaying 6 different facial expressions (fearful, disgusted, angry, sad, neutral, happy) while performing a task with minimal cognitive demand. Across subjects, the left amygdala was activated by each face condition separately, and its response was not selective for any particular emotion category. These results challenge the notion that the amygdala has a specialized role in processing certain emotions and suggest that the amygdala may have a more general-purpose function in processing salient information from faces.

12.
Simon D, Craig KD, Miltner WH, Rainville P. Pain, 2006, 126(1-3): 309-318
The facial expression of pain is a prominent non-verbal pain behaviour, unique and distinct from the expression of basic emotions. Yet, little is known about the neurobiological basis for the communication of pain. Here, subjects performed a sex-discrimination task while we investigated neural responses to implicit processing of dynamic visual stimuli of male or female faces displaying pain or angry expressions, matched on expression intensity and compared to neutral expressions. Stimuli were presented in a mixed blocked/event-related design while the blood oxygenation level dependent (BOLD) signal was acquired using whole-brain functional magnetic resonance imaging (fMRI) at 1.5 Tesla. Comparable sustained responses to pain and angry faces were found in the superior temporal sulcus (STS). Stronger transient activation was also observed to male expressions of pain (vs. neutral and anger) in high-order visual areas (STS and fusiform face area) and in emotion-related areas including the amygdala (highest peak t-value = 10.8), perigenual anterior cingulate cortex (ACC), and SI. Male pain compared to anger expressions also activated the ventromedial prefrontal cortex, SII/posterior insula and anterior insula. This is consistent with the hypothesis that the implicit processing of male pain expression triggers an emotional reaction characterized by a threat-related response. Unexpectedly, several areas responsive to male expressions, including the amygdala, perigenual ACC, and somatosensory areas, showed a decrease in activation to female pain faces (vs. neutral). This sharp contrast in the response to male and female faces suggests potential differences in the socio-functional role of pain expression in males and females.
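A mixed blocked/event-related design, as used here, models sustained (block-length) and transient (event-locked) responses with separate regressors in one general linear model. The following sketch shows how such a design matrix might be assembled; the hemodynamic response shape, timings, and block onsets are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

TR, n_scans = 2.0, 240
t = np.arange(n_scans) * TR

def hrf(times, peak=6.0):
    """Crude gamma-shaped hemodynamic response function (illustrative only)."""
    h = (times ** peak) * np.exp(-times)
    return h / h.max()

def convolve(stick, kernel):
    return np.convolve(stick, kernel)[:len(stick)]

kernel = hrf(np.arange(0, 32, TR))

# Sustained regressor: boxcar spanning each 30-s block of pain faces (hypothetical onsets).
block = np.zeros(n_scans)
for onset in (30, 150, 270, 390):
    block[int(onset / TR):int((onset + 30) / TR)] = 1.0

# Transient regressor: brief events at each face onset within the first block.
events = np.zeros(n_scans)
for onset in np.arange(30, 60, 3.0):
    events[int(onset / TR)] = 1.0

X = np.column_stack([convolve(block, kernel), convolve(events, kernel), np.ones(n_scans)])
print(X.shape)  # (240, 3): sustained, transient, intercept
```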

13.
The present study examined the functional association of the amygdala and right ventral prefrontal cortex (PFC) during cognitive evaluation of facial expressions. A situation was created in which the emotional valence of the stimuli was unconsciously manipulated by using subliminal affective priming. Twelve healthy volunteers were asked to evaluate the facial expression of a target face (500-ms duration) as "anger", "neutral", or "happy". All target faces expressed relatively weak anger. Just before the presentation of the target face, a 35-ms prime from one of three conditions (angry face, neutral face, or white blank) was presented. The subjects could not consciously identify the primes in this procedure. Activity in the right amygdala was greater with subliminal presentation of the angry prime compared with subliminal presentation of a neutral face or white-blank stimuli. Most importantly, the degree of activation of the right amygdala was negatively correlated with that of the right ventral PFC only with the anger prime. Furthermore, activation of the amygdala was positively correlated with the rate at which subjects recognized anger in the target faces. These results are discussed in terms of the functional association between the right PFC and the amygdala and its influence on cognitive processing.

14.
OBJECTIVE: The reliable discrimination of emotional expressions in faces is essential for adequate social interaction. Deficits in facial emotion processing are an important impairment in schizophrenia with major consequences for social functioning and subjective well-being. Whether neural circuits underlying emotion processing are already altered before illness onset is still unclear. Investigating neural correlates of emotion processing in individuals clinically at risk for psychosis offers the possibility to examine neural processes unchanged by the manifest disorder and to study trait aspects of emotion dysfunctions. MATERIAL AND METHODS: Twelve subjects clinically at risk for psychosis and 12 matched control subjects participated in this study. fMRI data were acquired during an emotion discrimination task consisting of standardized photographs of faces displaying different emotions (happiness, sadness, anger, fear) as well as faces with neutral facial expression. RESULTS: There were no group differences in behavioral performance. Emotion discrimination was associated with hyperactivations in high-risk subjects in the right lingual and fusiform gyrus as well as the left middle occipital gyrus. Further, high-risk compared to control subjects exhibited stronger activation related to neutral faces relative to emotional faces in the inferior and superior frontal gyri, the cuneus, the thalamus and the hippocampus. CONCLUSIONS: The present study indicates that individuals clinically at risk for psychosis show differences in brain activation associated with the processing of emotional and, even more markedly, neutral facial expressions despite adequate behavioral performance. The proneness to attribute salience to neutral stimuli might indicate a biological risk marker for psychosis.

15.
Effective fear processing relies on the amygdala and medial prefrontal cortex (MPFC). Post-trauma reactions provide a compelling model for examining how the heightened experience of fear impacts these systems. Post-traumatic stress disorder (PTSD) has been associated with excessive amygdala activity and a lack of MPFC activity in response to nonconscious facial signals of fear, but responses to consciously processed facial fear stimuli have not been examined. We used functional MRI to elucidate the effect of trauma reactions on amygdala-MPFC function during an overt fear perception task. Subjects with PTSD (n = 13) and matched non-traumatized healthy subjects (n = 13) viewed 15 blocks of eight fearful face stimuli alternating pseudorandomly with 15 blocks of neutral faces (stimulus duration 500 ms; ISI 767 ms). We used random effects analyses in SPM2 to examine within- and between-group differences in the MPFC and amygdala search regions of interest. Time series data were used to examine amygdala-MPFC associations and changes across the first (Early) versus second (Late) phases of the experiment. Relative to non-traumatized subjects, PTSD subjects showed a marked bilateral reduction in MPFC activity (in particular, right anterior cingulate cortex, ACC), which showed a different Early-Late pattern from that of non-traumatized subjects and was more pronounced with greater trauma impact and symptomatology. PTSD subjects also showed a small but significant enhancement in left amygdala activity, most apparent during the Late phase, but a reduction in Early right amygdala response. Over the time course, trauma was related to a distinct pattern of ACC and amygdala connections. The findings suggest that major life trauma may disrupt the normal pattern of medial prefrontal and amygdala regulation.
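As a hedged illustration of the kind of time-series analysis described (ROI time courses split into Early and Late phases, with amygdala-MPFC coupling compared between them), the sketch below uses hypothetical single-subject data; the Fisher z-transform step is an assumption about how such correlations would typically be compared and aggregated:

```python
import numpy as np

rng = np.random.default_rng(3)
n_scans = 300

# Hypothetical ROI time courses from one subject (e.g., left amygdala and rostral ACC).
amygdala = rng.normal(size=n_scans)
mpfc = 0.4 * amygdala + rng.normal(size=n_scans)

# Split the run into Early and Late halves and correlate the two ROIs in each half.
half = n_scans // 2
early = np.corrcoef(amygdala[:half], mpfc[:half])[0, 1]
late = np.corrcoef(amygdala[half:], mpfc[half:])[0, 1]

# Fisher z-transform so Early vs. Late coupling can be compared and, across
# subjects, entered into a random-effects test.
z_early, z_late = np.arctanh(early), np.arctanh(late)
print(f"amygdala-MPFC coupling: early r = {early:.2f}, late r = {late:.2f}, "
      f"z difference = {z_late - z_early:.2f}")
```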

16.
Lesion studies indicate distinct neural systems for recognition of facial identity and emotion. Split-brain experiments also suggest that emotional evaluation of a stimulus can occur without conscious identification. The present study tested a hypothesis of a differential neural response, independent of explicit conscious mediation, to emotional compared to nonemotional faces. The experimental paradigm involved holding in mind an image of a face across a 45-s delay while regional cerebral blood flow was measured using positron emission tomography. Prior to the delay, a single face was presented with an explicit instruction to match it to one of two faces presented at the end of the delay, each photographed at a different angle from the target face. Repeated blood flow measures were obtained while subjects held happy or neutral faces in mind or during a neutral control fixation condition without initial face presentation. The representation of emotional faces over a delay period, compared to either the nonemotional or the fixation condition, was associated with significant activation in the left ventral prefrontal cortex, the left anterior cingulate cortex, and the right fusiform gyrus. The findings support our hypothesis of a differential neural response to facial emotion, independent of conscious mediation, in regions implicated in the processing of faces and of emotions.

17.
Pichon S, Rieger SW, Vuilleumier P. NeuroImage, 2012, 62(3): 1610-1621
To what extent do past experiences shape our behaviors, perceptions, and thoughts even without explicit knowledge of these influences? Behavioral research has demonstrated that various cognitive processes can be influenced by conceptual representations implicitly primed during a preceding and unrelated task. Here we investigated whether emotion processing might also be influenced by prior incidental exposure to negative semantic material and which neural substrates would mediate these effects. During a first (priming) task, participants performed a variant of the hangman game with either negative or neutral emotion-laden words. Subsequently, they performed a second, unrelated visual task with fearful and neutral faces presented at attended or unattended locations. Participants were generally not aware of any relationships between the two tasks. We found that priming with emotional words enhanced amygdala sensitivity to faces in the subsequent visual task, while decreasing discriminative responses to threat. Furthermore, the magnitude of the induced bias in behavior and amygdala activation was predicted by the effectiveness of semantic access observed in the priming task. This demonstrates that emotional processing can be modulated by implicit influence of environmental information processed at an earlier time, independently of volitional control.

18.
Recent studies have cast doubts on the appealing idea that the processing of threat-related stimuli in the amygdala is unconstrained by the availability of attentional resources. However, these studies exclusively used face stimuli presented at fixation and it is unclear whether their conclusion can apply to peripheral face stimuli. Thus, we designed an experiment in which we manipulated the perceptual attentional load of the task used to divert attention from peripheral face stimuli: participants were presented simultaneously with four peripheral pictures (two faces, either both neutral or both fearful, and two houses) that were slightly tilted, and had to match two of these pictures (defined by their position on the screen) either for orientation of the tilt or for identity. The identity task was confirmed to involve greater attentional load than the orientation task by differences in accuracy, reaction times, subsequent face recognition performance, and patterns of activation in several cortical regions. In the orientation task, ignored fearful faces led to stronger activation in the right amygdala than ignored neutral faces. However, this differential response was abolished when participants performed the difficult identity-matching task. Thus, emotional processing of peripheral faces in the amygdala also appears to depend on the available perceptual attentional resources.

19.
In visual perception of emotional stimuli, low- and high-level appraisal processes have been found to engage different neural structures. Beyond emotional facial expression, emotional prosody is an important auditory cue for social interaction. Neuroimaging studies have proposed a network for emotional prosody processing that involves a right temporal input region and explicit evaluation in bilateral prefrontal areas. However, the comparison of different appraisal levels has so far relied upon using linguistic instructions during low-level processing, which might confound effects of processing level and linguistic task. In order to circumvent this problem, we examined processing of emotional prosody in meaningless speech during gender labelling (implicit, low-level appraisal) and emotion labelling (explicit, high-level appraisal). While bilateral amygdala, left superior temporal sulcus and right parietal areas showed stronger blood oxygen level-dependent (BOLD) responses during implicit processing, areas with stronger BOLD responses during explicit processing included the left inferior frontal gyrus, bilateral parietal, anterior cingulate and supplementary motor cortex. Emotional versus neutral prosody evoked BOLD responses in right superior temporal gyrus, bilateral anterior cingulate, left inferior frontal gyrus, insula and bilateral putamen. Basal ganglia and right anterior cingulate responses to emotional versus neutral prosody were particularly pronounced during explicit processing. These results are in line with an amygdala-prefrontal-cingulate network controlling different appraisal levels, and suggest a specific role of the left inferior frontal gyrus in explicit evaluation of emotional prosody. In addition to brain areas commonly related to prosody processing, our results suggest specific functions of the anterior cingulate and basal ganglia in detecting emotional prosody, particularly when explicit identification is necessary.

20.
Repeated recognition of the face of a familiar individual is known to show a semantic repetition priming effect. In this study, normal subjects were repeatedly presented with faces of their colleagues, and the effect of repetition on regional cerebral blood flow was measured using positron emission tomography. They repeated a set of three tasks: the familiar-face detection (F) task, the facial direction discrimination (D) task, and the perceptual control (C) task. During five repetitions of the F task, familiar faces were presented six times from different views in a pseudorandom order. Activation reduction through the repetition of the F tasks was observed in the bilateral anterior (anterolateral to the polar region) temporal cortices, which are suggested to be involved in access to long-term memory concerning people. The bilateral amygdala, the hypothalamus, and the medial frontal cortices were constantly activated during the F tasks and are considered to be associated with the behavioral significance of the presented familiar faces. Constant activation was also observed in the bilateral occipitotemporal regions and fusiform gyri and the right medial temporal regions during perception of the faces, and in the left medial temporal regions during the facial familiarity detection task, which is consistent with the results of previous functional brain imaging studies. The results have provided further information about the functional segregation of the anterior temporal regions in face recognition and long-term memory.

