Similar Articles
20 similar articles found (search time: 15 ms)
1.
Early processing of the six basic facial emotional expressions   (total citations: 18; self-citations: 0; citations by others: 18)
Facial emotions represent an important part of non-verbal communication used in everyday life. Recent studies on emotional processing have implicated differing brain regions for different emotions, but little has been determined on the timing of this processing. Here we presented a large number of unfamiliar faces expressing the six basic emotions, plus neutral faces, to 26 young adults while recording event-related potentials (ERPs). Subjects were naive with respect to the specific questions investigated; it was an implicit emotional task. ERPs showed global effects of emotion from 90 ms (P1), while latency and amplitude differences among emotional expressions were seen from 140 ms (N170 component). Positive emotions evoked N170 significantly earlier than negative emotions, and the amplitude of N170 evoked by fearful faces was larger than that evoked by neutral or surprised faces. At longer latencies (330-420 ms) at fronto-central sites, we also found a different pattern of effects among emotions. Localization analyses confirmed the superior and middle-temporal regions for early processing of facial expressions; the negative emotions elicited later, distinctive activations. The data support a model of automatic, rapid processing of emotional expressions.

2.
The ability to interpret others' body language is a vital skill that helps us infer their thoughts and emotions. However, individuals with autism spectrum disorder (ASD) have been found to have difficulty in understanding the meaning of people's body language, perhaps leading to an overarching deficit in processing emotions. The current fMRI study investigates the functional connectivity underlying emotion and action judgment in the context of processing body language in high‐functioning adolescents and young adults with autism, using an independent components analysis (ICA) of the fMRI time series. While there were no reliable group differences in brain activity, the ICA revealed significant involvement of occipital and parietal regions in processing body actions; and inferior frontal gyrus, superior medial prefrontal cortex, and occipital cortex in body expressions of emotions. In a between‐group analysis, participants with autism, relative to typical controls, demonstrated significantly reduced temporal coherence in left ventral premotor cortex and right superior parietal lobule while processing emotions. Participants with ASD, on the other hand, showed increased temporal coherence in left fusiform gyrus while inferring emotions from body postures. Finally, a positive predictive relationship was found between empathizing ability and the brain areas underlying emotion processing in ASD participants. These results underscore the differential role of frontal and parietal brain regions in processing emotional body language in autism. Hum Brain Mapp 35:5204–5218, 2014. © 2014 Wiley Periodicals, Inc.

3.
Research on emotional processing in schizophrenia suggests relatively intact subjective responses to affective stimuli “in the moment.” However, neuroimaging evidence suggests diminished activation in brain regions associated with emotional processing in schizophrenia. We asked whether, given a more vulnerable cognitive system in schizophrenia, individuals with this disorder would show increased or decreased modulation of working memory (WM) as a function of the emotional content of stimuli compared with healthy control subjects. In addition, we examined whether higher anhedonia levels were associated with a diminished impact of emotion on behavioral and brain activation responses. In the present study, 38 individuals with schizophrenia and 32 healthy individuals completed blocks of a 2-back WM task in a functional magnetic resonance imaging scanning session. Blocks contained faces displaying either only neutral stimuli or neutral and emotional stimuli (happy or fearful faces), randomly intermixed and occurring both as targets and non-targets. Both groups showed higher accuracy but slower reaction time for negative compared to neutral stimuli. Individuals with schizophrenia showed intact amygdala activity in response to emotionally evocative stimuli, but demonstrated altered dorsolateral prefrontal cortex (DLPFC) and hippocampal activity while performing an emotionally loaded WM task. Higher levels of social anhedonia were associated with diminished amygdala responses to emotional stimuli and increased DLPFC activity in individuals with schizophrenia. Emotional arousal may challenge dorsal-frontal control systems, which may have both beneficial and detrimental influences. Our findings suggest that disturbances in emotional processing in schizophrenia relate to alterations in emotion-cognition interactions rather than to the perception and subjective experience of emotion per se.

4.
Positive‐social emotions mediate one's cognitive performance, mood, well‐being, and social bonds, and represent a critical variable within therapeutic settings. It has been shown that the upregulation of positive emotions in social situations is associated with increased top‐down signals that stem from the prefrontal cortices (PFC) which modulate bottom‐up emotional responses in the amygdala. However, it remains unclear if positive‐social emotion upregulation of the amygdala occurs directly through the dorsomedial PFC (dmPFC) or indirectly linking the bilateral amygdala with the dmPFC via the subgenual anterior cingulate cortex (sgACC), an area which typically serves as a gatekeeper between cognitive and emotion networks. We performed functional MRI (fMRI) experiments with and without effortful positive‐social emotion upregulation to demonstrate the functional architecture of a network involving the amygdala, the dmPFC, and the sgACC. We found that effortful positive‐social emotion upregulation was associated with an increase in top‐down connectivity from the dmPFC on the amygdala via both direct and indirect connections with the sgACC. Conversely, we found that emotion processes without effortful regulation increased network modulation by the sgACC and amygdala. We also found that more anxious individuals with a greater tendency to suppress emotions and intrusive thoughts were likely to display decreased amygdala, dmPFC, and sgACC activity and stronger connectivity strength from the sgACC onto the left amygdala during effortful emotion upregulation. The analyzed brain network suggests a more general role of the sgACC in cognitive control and sheds light on neurobiologically informed treatment interventions.

5.
Previous functional magnetic resonance imaging and event-related brain potential studies revealed that performing a cognitive task may suppress the preferential processing of emotional stimuli. However, these studies utilized simple and artificial tasks (i.e. letter, shape or orientation discrimination tasks), unfamiliar to the participants. The present event-related potential study examined the emotion–attention interaction in the context of a comparably more natural scene categorization task. Deciding whether a natural scene contains an animal or not is a familiar and meaningful task to the participants and presumed to require little attentional resources. The task images were presented centrally and were overlaid upon emotional or neutral background pictures. Thus, implicit emotion and explicit semantic categorization may compete for processing resources in neural regions implicated in object recognition. Additionally, participants passively viewed the same stimulus materials without the demand to categorize task images. Significant interactions between task condition and emotional picture valence were observed for the occipital negativity and late positive potential. In the passive viewing condition, emotional background images elicited an increased occipital negativity followed by an increased late positive potential. In contrast, during the animal-/non-animal-categorization task, emotional modulation effects were replaced by strong target categorization effects. These results suggest that explicit semantic categorization interferes with implicit emotion processing when both processes compete for shared resources.

6.
This study investigates the spatiotemporal brain dynamics of emotional information processing during reading using a combination of surface and intracranial electroencephalography (EEG). Two different theoretical views were opposed. According to the standard psycholinguistic perspective, emotional responses to words are generated within the reading network itself subsequent to semantic activation. According to the neural re-use perspective, brain regions that are involved in processing emotional information contained in other stimuli (faces, pictures, smells) might be in charge of the processing of emotional information in words as well. We focused on a specific emotion—disgust—which has a clear locus in the brain, the anterior insula. Surface EEG showed differences between disgust and neutral words as early as 200 ms. Source localization suggested a cortical generator of the emotion effect in the left anterior insula. These findings were corroborated through the intracranial recordings of two epileptic patients with depth electrodes in insular and orbitofrontal areas. Both electrodes showed effects of disgust in reading as early as 200 ms. The early emotion effect in a brain region (insula) that responds to specific emotions in a variety of situations and stimuli clearly challenges classic sequential theories of reading in favor of the neural re-use perspective.

7.
Misreading facial expressions as signals of social disapproval, such as anger and disgust, may maintain social anxiety. If so, manipulating face processing could be therapeutic. It remains unclear, however, whether socially anxious individuals are in fact more sensitive to disapproving emotions. We assessed decoding of, and cost attributions to, emotional expressions in high and low socially anxious females (n=102) using five emotions (anger, disgust, fear, happiness, and sadness) expressed at 15 intensities (9–65%), providing 75 stimuli (see Supplementary Material). The decoding task briefly presented the stimuli and participants identified the emotion. The cost attribution task asked individuals to rate each stimulus for how costly it would be for them to interact with the person. Random effects regression indicated that social anxiety was not associated with overall decoding accuracy but was associated with a response bias. High socially anxious individuals had a lower threshold for decoding emotions but also more frequently classified low intensity emotions incorrectly. These effects were not emotion-specific. Socially anxious individuals also attributed excessive social cost to expressions of negative valence. Our results provide a novel conceptual framework for understanding emotion decoding in social anxiety, indicating the importance of considering both accuracy and response bias.

8.
The ability to read emotions in the face of another person is an important social skill that can be impaired in subjects with traumatic brain injury (TBI). To determine the brain regions that modulate facial emotion recognition, we conducted a whole-brain analysis using a well-validated facial emotion recognition task and voxel-based lesion symptom mapping (VLSM) in a large sample of patients with focal penetrating TBIs (pTBIs). Our results revealed that individuals with pTBI performed significantly worse than normal controls in recognizing unpleasant emotions. VLSM mapping results showed that impairment in facial emotion recognition was due to damage in a bilateral fronto-temporo-limbic network, including medial prefrontal cortex (PFC), anterior cingulate cortex, left insula and temporal areas. Besides those common areas, damage to the bilateral and anterior regions of PFC led to impairment in recognizing unpleasant emotions, whereas damage to bilateral posterior PFC and left temporal areas led to impairment in recognizing pleasant emotions. Our findings add empirical evidence that the ability to read pleasant and unpleasant emotions in other people's faces is a complex process involving not only a common network that includes bilateral fronto-temporo-limbic lobes, but also other regions depending on emotional valence.
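The VLSM approach used above can be sketched in a few lines: at each voxel, patients whose lesion covers that voxel are compared on the behavioral score with patients whose lesion spares it. This is a schematic sketch under assumed inputs (binary lesion masks, one score per patient); real VLSM pipelines add minimum-lesion-overlap thresholds and permutation-based multiple-comparison correction.

```python
import numpy as np
from scipy.stats import ttest_ind

def vlsm_map(lesions, scores):
    """Schematic voxel-based lesion-symptom mapping: at each voxel, compare
    the behavioral scores of patients whose lesion covers that voxel with
    those of patients whose lesion spares it.
    lesions: (n_patients, n_voxels) binary masks; scores: (n_patients,).
    Returns a t-value per voxel (large positive t = lesion lowers the score)."""
    n_vox = lesions.shape[1]
    tvals = np.full(n_vox, np.nan)
    for v in range(n_vox):
        lesioned = scores[lesions[:, v] == 1]
        spared = scores[lesions[:, v] == 0]
        if len(lesioned) >= 2 and len(spared) >= 2:  # need both groups
            tvals[v] = ttest_ind(spared, lesioned).statistic
    return tvals

# Toy demo: lesions at voxel 0 go with low recognition scores
lesions = np.array([[1, 0], [1, 0], [1, 1], [0, 1], [0, 0], [0, 0]])
scores = np.array([1.0, 1.2, 0.9, 5.0, 5.1, 4.8])
tvals = vlsm_map(lesions, scores)  # tvals[0] >> tvals[1]
```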

9.
Social Neuroscience, 2013, 8(6): 705–716
There is compelling evidence that semantic memory is involved in emotion recognition. However, its contribution to the recognition of emotional valence and basic emotions remains unclear. We compared the performance of 10 participants with the semantic variant of primary progressive aphasia (svPPA), a clinical model of semantic memory impairment, to that of 33 healthy participants using three experimental tasks assessing the recognition of: 1) emotional valence conveyed by photographic scenes, 2) basic emotions conveyed by facial expressions, and 3) basic emotions conveyed by prosody sounds. Individuals with svPPA showed significant deficits in the recognition of emotional valence and basic emotions (except happiness and surprise conveyed by facial expressions). However, the performance of the two groups was comparable when the performance on tests assessing semantic memory was added as a covariate in the analyses. Altogether, these results suggest that semantic memory contributes to the recognition of emotional valence and basic emotions. By examining the recognition of emotional valence and basic emotions in individuals with selective semantic memory loss, our results contribute to the refinement of current theories on the role of semantic memory in emotion recognition.

10.
This study explored the effect of lateralized left–right resting brain activity on prefrontal cortical responsiveness to emotional cues and on the explicit appraisal (stimulus evaluation) of emotions based on their valence. The hypothesis was that subjective responses to different emotional stimuli would be predicted by resting brain activity and would be lateralized and valence-related (positive vs negative valence). A hemodynamic measure, functional near-infrared spectroscopy (fNIRS), was used: resting hemodynamic activity and brain responses to emotional cues were recorded while subjects (N = 19) viewed positive vs negative emotional stimuli (IAPS). The lateralization index (LI) during resting state, the LI during emotional processing, and Self-Assessment Manikin ratings were analyzed. Regression analysis showed a significant predictive effect of resting activity (more left- or right-lateralized) on both the brain response and the appraisal of emotional cues based on stimulus valence. Moreover, significant effects were found as a function of valence (more right-sided response to negative stimuli; more left-sided response to positive stimuli) during emotion processing. Therefore, the resting state may be considered a predictive marker of subsequent cortical responsiveness to emotions. The significance of the resting condition for emotional behavior is discussed.
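The lateralization index used in such studies is commonly defined as a normalized left–right contrast, LI = (L − R)/(L + R). A minimal sketch (the channel-averaged `left`/`right` response values are illustrative assumptions, not data from the study):

```python
import numpy as np

def lateralization_index(left, right):
    """Normalized left-right contrast: (L - R) / (L + R).
    Positive values indicate left-lateralized activity."""
    left, right = np.asarray(left, dtype=float), np.asarray(right, dtype=float)
    return (left - right) / (left + right)

# Illustrative values: a stronger left-hemisphere hemodynamic response
li = lateralization_index(left=0.6, right=0.2)
print(li)  # 0.5 -> left-lateralized
```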

11.
Individuals often align their emotional states during conversation. Here, we reveal how such emotional alignment is reflected in synchronization of brain activity across speakers and listeners. Two “speaker” subjects told emotional and neutral autobiographical stories while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). The stories were recorded and played back to 16 “listener” subjects during fMRI. After scanning, both speakers and listeners rated the moment‐to‐moment valence and arousal of the stories. Time‐varying similarity of the blood‐oxygenation‐level‐dependent (BOLD) time series was quantified by intersubject phase synchronization (ISPS) between speaker–listener pairs. Telling and listening to the stories elicited similar emotions across speaker–listener pairs. Arousal was associated with increased speaker–listener neural synchronization in brain regions supporting attentional, auditory, somatosensory, and motor processing. Valence was associated with increased speaker–listener neural synchronization in brain regions involved in emotional processing, including amygdala, hippocampus, and temporal pole. Speaker–listener synchronization of subjective feelings of arousal was associated with increased neural synchronization in somatosensory and subcortical brain regions; synchronization of valence was associated with neural synchronization in parietal cortices and midline structures. We propose that emotion‐dependent speaker–listener neural synchronization is associated with emotional contagion, thereby implying that listeners reproduce some aspects of the speaker's emotional state at the neural level.
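Intersubject phase synchronization is typically computed from the instantaneous phases of the (band-pass filtered) BOLD signals via the Hilbert transform; ISPS at each time point is the length of the mean phase-difference vector within a sliding window. A schematic sketch, with the window length and demo signals as illustrative assumptions rather than the study's exact pipeline:

```python
import numpy as np
from scipy.signal import hilbert

def isps(x, y, win=17):
    """Intersubject phase synchronization between two time series.
    Instantaneous phases come from the analytic (Hilbert) signal; ISPS at
    each time point is the length of the mean phase-difference vector
    inside a sliding window (1 = perfect phase locking, 0 = none)."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    vec = np.exp(1j * dphi)
    half = win // 2
    return np.array([np.abs(vec[max(0, t - half):t + half + 1].mean())
                     for t in range(len(vec))])

# Demo: two signals with a constant phase offset are near-perfectly synchronized
t = np.linspace(0, 10, 500)
sync = isps(np.sin(2 * np.pi * t), np.sin(2 * np.pi * t + 0.1))
```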

12.
BACKGROUND: People with Asperger syndrome (AS) have life-long deficits in social behavior. The biological basis of this is unknown, but most likely includes impaired processing of facial emotion. Human social communication involves processing different facial emotions, and at different intensities. However, no study has examined brain function in people with AS when implicitly (unconsciously) processing four primary emotions at varying emotional intensities. METHODS: We used event-related functional magnetic resonance imaging (fMRI) to examine neural responses when people with AS and controls implicitly processed neutral expressions, and mild (25%) and intense (100%) expressions of fear, disgust, happiness, and sadness. We included 18 right-handed adults; 9 with AS and 9 healthy controls who did not differ significantly in IQ. RESULTS: Both groups significantly activated 'face perception' areas when viewing neutral faces, including fusiform and extrastriate cortices. Further, both groups had significantly increased activation of fusiform and other extrastriate regions to increasing intensities of fear and happiness. However, people with AS generally showed fusiform and extrastriate hyporesponsiveness compared to controls across emotion types and intensities. CONCLUSIONS: Fusiform and extrastriate cortices are activated by facial expressions of four primary emotions in people with AS, but generally to a lesser degree than in controls. This may partly explain the social impairments of people with AS.

13.
We examined the influence of emotional valence and type of item to be remembered on brain activity during recognition, using faces and scenes. We used multivariate analyses of event-related fMRI data to identify whole-brain patterns, or networks of activity. Participants demonstrated better recognition for scenes vs faces and for negative vs neutral and positive items. Activity was increased in extrastriate cortex and inferior frontal gyri for emotional scenes, relative to neutral scenes and all face types. Increased activity in these regions also was seen for negative faces relative to positive faces. Correct recognition of negative faces and scenes (hits vs correct rejections) was associated with increased activity in amygdala, hippocampus, extrastriate, frontal and parietal cortices. Activity specific to correctly recognized emotional faces, but not scenes, was found in sensorimotor areas and rostral prefrontal cortex. These results suggest that emotional valence and type of visual stimulus both modulate brain activity at recognition, and influence multiple networks mediating visual, memory and emotion processing. The contextual information in emotional scenes may facilitate memory via additional visual processing, whereas memory for emotional faces may rely more on cognitive control mediated by rostrolateral prefrontal regions.

14.
Disorders of the basal ganglia (BG) alter perception and experience of emotions. Left hemisphere BG (LBG) stroke is also associated with depression. The interplay between depression and alterations in emotional processing following LBG stroke was examined. Evoked affective responses to emotion-laden pictorial stimuli were compared among LBG stroke participants, healthy participants, and participants with stroke damage in brain regions not including the LBG, selected to equate depression severity (measured using the Hamilton Depression Scale) with the LBG damage participants. Brain activity ([15O]water positron emission tomography, PET) was measured in LBG stroke relative to healthy participants to identify changes in regions associated with emotion processing and depression. LBG stroke subjects reported less intense emotions compared with healthy, but not stroke comparison, participants. Depression negatively correlated with emotional experience for positive and negative emotions. In response to positive stimuli, LBG subjects exhibited higher activity in amygdala, anterior cingulate, dorsal prefrontal cortex, and insula compared to healthy volunteers. In response to negative stimuli, LBG subjects demonstrated lower activity in the right frontal-polar region and fusiform gyrus. Higher baseline activity in amygdala and ventral and mesial prefrontal cortex and lower activity in left dorsolateral prefrontal cortex were associated with higher depression scores. LBG stroke led to blunted emotions, with alterations in brain activity accounting for reduced affective experience and awareness as well as depression. Depression and fronto-limbic activity changes may contribute to emotional blunting following LBG stroke.

15.
Changes in social and emotional behaviour have been consistently observed in patients with traumatic brain injury. These changes are associated with emotion recognition deficits which represent one of the major barriers to a successful familiar and social reintegration. In the present study, 32 patients with traumatic brain injury, involving the frontal lobe, and 41 age- and education-matched healthy controls were analyzed. A Go/No-Go task was designed, where each participant had to recognize faces representing three social emotions (arrogance, guilt and jealousy). Results suggested that the ability to recognize two social emotions (arrogance and jealousy) was significantly reduced in patients with traumatic brain injury, indicating that frontal lesions can reduce emotion recognition ability. In addition, the analysis of the results for hemispheric lesion location (right, left or bilateral) suggested that the bilateral lesion sub-group showed a lower accuracy on all social emotions.

16.
Somatoform disorder patients suffer from impaired emotion recognition and other emotional deficits. Emotional empathy refers to the understanding and sharing of emotions of others in social contexts. It is likely that the emotional deficits of somatoform disorder patients are linked to disturbed empathic abilities; however, little is known so far about empathic deficits of somatoform patients and the underlying neural mechanisms. We used fMRI and an empathy paradigm to investigate 20 somatoform disorder patients and 20 healthy controls. The empathy paradigm contained facial pictures expressing anger, joy, disgust, and a neutral emotional state; a control condition contained unrecognizable stimuli. In addition, questionnaires testing for somatization, alexithymia, depression, empathy, and emotion recognition were applied. Behavioral results confirmed impaired emotion recognition in somatoform disorder and indicated a rather distinct pattern of empathic deficits of somatoform patients with specific difficulties in “empathic distress.” In addition, somatoform patients revealed brain areas with diminished activity in the contrasts “all emotions”–“control,” “anger”–“control,” and “joy”–“control,” whereas we did not find brain areas with altered activity in the contrasts “disgust”–“control” and “neutral”–“control.” Significant clusters with less activity in somatoform patients included the bilateral parahippocampal gyrus, the left amygdala, the left postcentral gyrus, the left superior temporal gyrus, the left posterior insula, and the bilateral cerebellum. These findings indicate that disturbed emotional empathy of somatoform disorder patients is linked to impaired emotion recognition and abnormal activity of brain regions responsible for emotional evaluation, emotional memory, and emotion generation. Hum Brain Mapp, 2012. © 2011 Wiley Periodicals, Inc.

17.
OBJECTIVE: Patients with schizophrenia are characterized by emotional symptoms such as flattened affect which are accompanied by cerebral dysfunctions. This study aimed at determining changes of mood-related neural correlates under standardized pharmacological therapy in first-episode schizophrenia. METHOD: Using fMRI in a longitudinal approach, 10 first-episode schizophrenia patients (6 males) and 10 healthy subjects (same education, gender and age) were investigated during sad and happy mood induction using facial expressions. Reassessments were carried out following 6 months of standardized antipsychotic treatment. Data analysis focused on therapy-related changes in cerebral activation and on stable, therapy-independent group differences. RESULTS: According to self-ratings, mood induction was successful in both groups and did not reveal time-dependent changes. Patients revealed stable hypoactivations in core brain regions of emotional processing like the anterior cingulate cortex, orbitofrontal and temporal areas as well as the hippocampus. Therapy-related signal increases in pre- and postcentral, inferior temporal and frontal areas were restricted to sadness. DISCUSSION: Stable dysfunctions which are unaffected by therapy and symptom improvement were found in cortico-limbic regions crucially involved in emotion processing. They presumably reflect patients' difficulties in emotion regulation and emotional memory processes. However, therapy-related activation changes were also observed and demonstrate efficacy of antipsychotic therapy on improving emotion functionality. They may represent an increased usage of autobiographic emotional memories and an improved strategy to experience an emotion by mirroring someone else's emotions.

18.
Background: Screen media activities (SMAs; e.g., watching videos, playing videogames) have become increasingly prevalent among youth as ways to alleviate or escape from negative emotional states. However, neural mechanisms underlying these processes in youth are incompletely understood.
Method: Seventy-nine youth aged 11–15 years completed a monetary incentive delay task during fMRI scanning. Neural correlates of reward/loss processing and their associations with SMAs were explored. Next, brain activations during reward/loss processing in regions implicated in the processing of emotions were examined as potential mediating factors between difficulties in emotion regulation (DER) and engagement in SMAs. Finally, a moderated mediation model tested the effects of depressive symptoms in such relationships.
Result: The emotional components associated with SMAs in reward/loss processing included activations in the left anterior insula (AI) and right dorsolateral prefrontal cortex (DLPFC) during anticipation of working to avoid losses. Activations in both the AI and DLPFC mediated the relationship between DER and SMAs. Moreover, depressive symptoms moderated the relationship between AI activation in response to loss anticipation and SMAs.
Conclusion: The current findings suggest that DER links to SMAs through loss-related brain activations implicated in the processing of emotions and motivational avoidance, particularly in youth with greater levels of depressive symptoms. The findings suggest the importance of enhancing emotion-regulation tendencies/abilities in youth and, in particular, their regulatory responses to negative emotional situations in order to guide moderate engagement in SMAs.

19.
The recognition of facial expressions of emotion is impaired in semantic dementia (SD) and is associated with right-sided brain atrophy in areas known to be involved in emotion processing, notably the amygdala. Whether patients with SD also experience difficulty recognizing emotions conveyed by other media, such as music, is unclear. Prior studies have used excerpts of known music from classical or film repertoire but not unfamiliar melodies designed to convey distinct emotions. Patients with SD (n = 11), Alzheimer's disease (n = 12) and healthy control participants (n = 20) underwent tests of emotion recognition in two modalities: unfamiliar musical tunes and unknown faces as well as volumetric MRI. Patients with SD were most impaired with the recognition of facial and musical emotions, particularly for negative emotions. Voxel-based morphometry showed that the labelling of emotions, regardless of modality, correlated with the degree of atrophy in the right temporal pole, amygdala and insula. The recognition of musical (but not facial) emotions was also associated with atrophy of the left anterior and inferior temporal lobe, which overlapped with regions correlating with standardized measures of verbal semantic memory. These findings highlight the common neural substrates supporting the processing of emotions by facial and musical stimuli but also indicate that the recognition of emotions from music draws upon brain regions that are associated with semantics in language.

20.
Supramodal representation of emotion and its neural substrates have recently attracted attention as a marker of social cognition. However, the question of whether perceptual integration of facial and vocal emotions takes place in primary sensory areas, multimodal cortices, or affective structures remains unanswered. Using novel computer-generated stimuli, we combined emotional faces and voices in congruent and incongruent ways and assessed functional brain data (fMRI) during an emotional classification task. Both congruent and incongruent audiovisual stimuli evoked larger responses in thalamus and superior temporal regions compared with unimodal conditions. Congruent emotions were characterized by activation in amygdala, insula, ventral posterior cingulate (vPCC), temporo-occipital, and auditory cortices; incongruent emotions activated a frontoparietal network and bilateral caudate nucleus, indicating a greater processing load in working memory and emotion-encoding areas. The vPCC alone exhibited differential reactions to congruency and incongruency for all emotion categories and can thus be considered a central structure for supramodal representation of complex emotional information. Moreover, the left amygdala reflected supramodal representation of happy stimuli. These findings document that emotional information does not merge at the perceptual audiovisual integration level in unimodal or multimodal areas, but in vPCC and amygdala.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) | 京ICP备09084417号