Similar Literature
20 similar records found (search time: 62 ms)
1.
In daily life, we perceive a person's facial reaction as part of the natural environment surrounding it. Because most studies have investigated how facial expressions are recognized using isolated faces, it is unclear what role context plays. Although the N170 to facial expressions has been observed to be modulated by emotional context, it was not clear whether individuals use context information at this stage of processing to discriminate between facial expressions. The aim of the present study was to investigate how the early stages of face processing are affected by emotional scenes when explicit categorizations of fearful and happy facial expressions are made. Emotion effects were found for the N170, with larger amplitudes for faces in fearful scenes than for faces in happy and neutral scenes. Critically, N170 amplitudes were significantly increased for fearful faces in fearful scenes compared to fearful faces in happy scenes, an effect expressed in left occipito-temporal scalp topography differences. Our results show that the information provided by the facial expression is combined with the scene context during the early stages of face processing.

2.
We present the response pattern of intracranial event-related potentials (ERPs) recorded from depth electrodes in the human amygdala (four patients) to faces or face parts encoding fearful, happy or neutral expressions. The amygdala showed increased-amplitude ERPs (from 200 to 400 ms post-stimulus) in response to the eye region of the face compared to whole faces and to the mouth region. In particular, a strong emotional valence effect was observed, both at the group and at the single-subject level, with a preferential response to fearful eyes relative to every other stimulus category from 200 to 400 ms after stimulus presentation. A preferential response to smiling eyes compared to happy faces and smiling mouths was also observed at the group level from 300 to 400 ms post-stimulus. A complementary time-frequency analysis showed that an increase in the theta frequency band (4-7 Hz) accounted for the main event-related band power (ERBP) change during the 200-500 ms post-stimulus interval. Analysis of the ERBP changes according to emotional valence showed a strong increase in theta ERBP to fearful eyes, which was higher than to any other facial stimulus. Moreover, the theta ERBP increase to "smiling eyes" was larger than that evoked by smiling mouths and whole happy faces. Minimal post-stimulus ERBP changes were evoked by neutral stimuli. These data are consistent with a special role of the amygdala in processing facial signals, both with negative and positive valence, conveyed by the eye region of the face.
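The event-related band power analysis described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration (not the authors' pipeline): a Butterworth band-pass at 4-7 Hz followed by the squared Hilbert envelope, applied to a synthetic single trial carrying a theta burst in the 200-500 ms window. The filter order, sampling rate, and burst amplitude are arbitrary assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_power_envelope(signal, fs, low=4.0, high=7.0):
    """Band-pass filter a 1-D signal and return its instantaneous power
    (squared Hilbert envelope). Frequencies are in Hz."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return np.abs(hilbert(filtered)) ** 2

# Synthetic single-trial "EEG": a 5 Hz (theta) burst from 200-500 ms
# superimposed on background noise, sampled at 500 Hz.
fs = 500
t = np.arange(0, 1.0, 1.0 / fs)          # 0-1000 ms post-stimulus
rng = np.random.default_rng(0)
trial = 0.2 * rng.standard_normal(t.size)
burst = (t >= 0.2) & (t < 0.5)
trial[burst] += np.sin(2 * np.pi * 5 * t[burst])

power = band_power_envelope(trial, fs)
baseline = power[t < 0.2].mean()
event = power[burst].mean()
print(event > baseline)   # → True: theta ERBP rises in the 200-500 ms window
```

In a real analysis this would be computed per trial and channel and expressed as percent change from a pre-stimulus baseline, but the band-pass-plus-envelope core is the same.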

3.
In natural viewing conditions, different stimulus categories such as people, objects, and natural scenes carry relevant affective information that is usually processed simultaneously. But these different signals may not always have the same affective meaning. Using body‐scene compound stimuli, we investigated how the brain processes fearful signals conveyed by either a body in the foreground or scenes in the background and the interaction between foreground body and background scene. The results showed that left and right extrastriate body areas (EBA) responded more to fearful than to neutral bodies. More interestingly, a threatening background scene compared to a neutral one showed increased activity in bilateral EBA and right‐posterior parahippocampal place area (PPA) and decreased activity in right retrosplenial cortex (RSC) and left‐anterior PPA. The emotional scene effect in EBA was only present when the foreground body was neutral and not when the body posture expressed fear (significant emotion‐by‐category interaction effect), consistent with behavioral ratings. The results provide evidence for emotional influence of the background scene on the processing of body expressions. Hum Brain Mapp 35:492–502, 2014. © 2012 Wiley Periodicals, Inc.

4.
Painful events in our environment are often accompanied by stimuli from other sensory modalities. These stimuli may influence the perception and processing of acute pain, in particular when they comprise emotional cues, like facial expressions of people surrounding us. In this whole-head magnetoencephalography (MEG) study, we examined the neuronal mechanisms underlying the influence of emotional (fearful, angry, or happy) compared to neutral facial expressions on the processing of pain in humans. Independent of their valence, subjective pain ratings for intracutaneous inputs were higher when pain stimuli were presented together with emotional facial expressions than when they were presented with a neutral facial expression. Source reconstruction using linear beamforming revealed pain-induced early (70-270 ms) oscillatory beta-band activity (BBA; 15-25 Hz) and gamma-band activity (GBA; 60-80 Hz) in the sensorimotor cortex. The presentation of faces with emotional expressions compared to faces with neutral expressions led to a stronger bilateral suppression of the pain-induced BBA, possibly reflecting enhanced response readiness of the sensorimotor system. Moreover, pain-induced GBA in the sensorimotor cortex was larger for faces expressing fear than for faces expressing anger, which might reflect the facilitation of avoidance-motivated behavior triggered by the concurrent presentation of faces with fearful expressions and painful stimuli. Thus, the presence of emotional cues, like facial expressions from people surrounding us, while receiving acute pain may facilitate neuronal processes involved in the preparation and execution of adequate protective motor responses.

5.
To investigate whether the processing of faces and emotional facial expression can be modulated by spatial attention, ERPs were recorded in response to stimulus arrays containing two faces and two non-face stimuli (houses). In separate trials, attention was focused on the face pair or on the house pair, and facial expression was either fearful or neutral. When faces were attended, a greater frontal positivity in response to arrays containing fearful faces was obtained, starting about 100 ms after stimulus onset. In contrast, with faces located outside the attentional focus, this emotional expression effect was completely eliminated. This differential result demonstrates for the first time a strong attentional gating of brain processes involved in the analysis of emotional facial expression. It is argued that while an initial detection of emotionally relevant events mediated by the amygdala may occur pre-attentively, subsequent stages of emotional processing require focal spatial attention. The face-sensitive N170 component was unaffected by emotional facial expression, but N170 amplitudes were enhanced when faces were attended, suggesting that spatial attention can modulate the structural encoding of faces.

6.
An ERP study on the time course of emotional face processing
Eimer M, Holmes A. Neuroreport 2002, 13(4):427-431
Using event-related brain potentials (ERPs), we investigated the time course of facial expression processing in human subjects watching photographs of fearful and neutral faces. Upright fearful faces elicited a frontocentral positivity within 120 ms after stimulus presentation, which was followed by a broadly distributed sustained positivity beyond 250 ms post-stimulus. Emotional expression effects were delayed and attenuated when faces were inverted. In contrast, the face-specific N170 component was completely unaffected by facial expression. We conclude that emotional expression analysis and the structural encoding of faces are parallel processes. Early emotional ERP modulations may reflect the rapid activation of prefrontal areas involved in the analysis of facial expression.

7.
To investigate the impact of spatial frequency on emotional facial expression analysis, ERPs were recorded in response to low spatial frequency (LSF), high spatial frequency (HSF), and unfiltered broad spatial frequency (BSF) faces with fearful or neutral expressions, houses, and chairs. In line with previous findings, BSF fearful facial expressions elicited a greater frontal positivity than BSF neutral facial expressions, starting at about 150 ms after stimulus onset. In contrast, this emotional expression effect was absent for HSF and LSF faces. Given that some brain regions involved in emotion processing, such as amygdala and connected structures, are selectively tuned to LSF visual inputs, these data suggest that ERP effects of emotional facial expression do not directly reflect activity in these regions. It is argued that higher order neocortical brain systems are involved in the generation of emotion-specific waveform modulations. The face-sensitive N170 component was neither affected by emotional facial expression nor by spatial frequency information.
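The LSF/HSF stimulus manipulation used in studies like this one is commonly implemented with Gaussian low-pass filtering. The Python sketch below is a hypothetical illustration, not the authors' stimulus-preparation code; the cutoff (a Gaussian sigma of 4 pixels) and the toy "face" image are arbitrary assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_spatial_frequencies(img, sigma=4.0):
    """Split a grayscale image into low- and high-spatial-frequency
    versions with a Gaussian low-pass filter. LSF keeps the coarse
    luminance layout; HSF = original minus LSF keeps fine edges."""
    lsf = gaussian_filter(img.astype(float), sigma=sigma)
    hsf = img - lsf
    return lsf, hsf

# A toy 64x64 image: a smooth gradient (coarse content) plus a fine
# checkerboard (high-frequency content).
side = 64
y, x = np.mgrid[0:side, 0:side]
coarse = np.cos(2 * np.pi * y / side)     # low-frequency component
fine = 0.5 * ((x + y) % 2)                # Nyquist-rate component
img = coarse + fine

lsf, hsf = split_spatial_frequencies(img)
# The LSF image tracks the coarse gradient, not the checkerboard.
print(np.corrcoef(lsf.ravel(), coarse.ravel())[0, 1] > 0.9)   # → True
```

By construction the two bands sum back to the original image; real studies additionally equate contrast and luminance across the filtered versions, which this sketch omits.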

8.
In human social interactions, facial emotional expressions are a crucial source of information. Repeatedly presented information typically leads to an adaptation of neural responses. However, processing seems sustained with emotional facial expressions. Therefore, we tested whether sustained processing of emotional expressions, especially threat-related expressions, would attenuate neural adaptation. Neutral and emotional expressions (happy, mixed and fearful) of same and different identity were presented at 3 Hz. We used electroencephalography to record the evoked steady-state visual potentials (ssVEP) and tested to what extent the ssVEP amplitude adapts to repeated same-identity as compared with different-identity faces. We found adaptation to the identity of a neutral face. However, for emotional faces, adaptation was reduced, decreasing linearly with negative valence, with the least adaptation to fearful expressions. This short and straightforward method may prove to be a valuable new tool in the study of emotional processing.
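The ssVEP amplitude at a known stimulation frequency (3 Hz above) is typically read off the FFT amplitude spectrum. Below is a minimal Python sketch on synthetic data; the recording length, sampling rate, and signal amplitude are assumptions for illustration only, not values from the study.

```python
import numpy as np

def ssvep_amplitude(signal, fs, stim_freq=3.0):
    """Return the amplitude-spectrum value at the stimulation frequency.
    Assumes the epoch length puts an FFT bin at (or very near) stim_freq."""
    n = signal.size
    spectrum = np.abs(np.fft.rfft(signal)) * 2 / n   # single-sided amplitude
    freqs = np.fft.rfftfreq(n, 1 / fs)
    return spectrum[np.argmin(np.abs(freqs - stim_freq))]

# Synthetic 10 s recording at 250 Hz: a 3 Hz ssVEP of amplitude 2.0
# buried in unit-variance noise.
fs, dur = 250, 10
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(1)
eeg = 2.0 * np.sin(2 * np.pi * 3 * t) + rng.standard_normal(t.size)

amp = ssvep_amplitude(eeg, fs)
print(abs(amp - 2.0) < 0.2)   # → True: the 3 Hz amplitude is recovered
```

Adaptation in the paradigm above would then be quantified by comparing this amplitude between same-identity and different-identity presentation blocks.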

9.
The ability to associate neutral stimuli with motivationally relevant outcomes is an important survival strategy. In this study, we used event‐related potentials (ERPs) to investigate the brain dynamics of associative emotional learning when participants were confronted with multiple heterogeneous stimuli. Participants viewed 144 different objects in the context of 144 different emotional and neutral background scenes. During each trial, neutral objects were shown in isolation and then paired with the background scene. All pairings were presented twice to compare ERPs in response to neutral objects before and after a single association. After single pairing, neutral objects previously encoded in the context of emotional scenes evoked a larger P100 over occipital electrodes compared to objects that were previously paired with neutral scenes. Likewise, larger late positive potentials (LPPs) were observed over parieto‐occipital electrodes (450–750 ms) for objects previously associated with emotional relative to neutral contexts. The LPP – but not P100 – enhancement was also related to subjective object/context binding. Taken together, our ERP data provide evidence for fast emotional associative learning, as reflected by heightened perceptual and sustained elaborative processing for neutral information previously encountered in emotional contexts. These findings could assist in understanding binding mechanisms in stress and anxiety, as well as in addiction and eating‐related disorders.

10.

Background and objectives

Recent work suggests that the ability to disengage attention from threatening information is impaired in anxiety. The present study compared the difficulty of disengaging from angry, fearful and neutral faces in Low Trait Anxious (LTA) versus High Trait Anxious (HTA) individuals at two stages of facial expression processing (i.e., initial and later face processing).

Methods

HTA and LTA individuals performed an attentional shifting task to assess attentional disengagement. Participants had to classify a peripheral target letter, appearing 200 or 500 ms after a face was displayed.

Results

LTA individuals were quicker when the letter appeared after 500 ms than after 200 ms, regardless of the emotion of the face. Impaired disengagement in HTA individuals was observed for fearful and angry faces (i.e., no reaction time difference between 200 and 500 ms) but not for neutral faces. These results suggest that it is particularly difficult for anxious individuals to switch attention from one stimulus to another if the engaged stimulus is a threatening face.

Limitations

Generalisation of our results is restricted to trait anxiety and emotional facial expression processing.

Conclusions

LTA individuals can use the additional processing time (i.e., from 200 to 500 ms) to make a rapid attentional shift and engage the target stimulus, whereas HTA individuals did not and continued to process the threatening facial expression. These results also point to the role of top-down processes in the regulation of disengagement from threatening information in anxiety.
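The disengagement pattern reported above can be expressed as a simple reaction-time contrast: mean RT at the 200 ms face-target interval minus mean RT at the 500 ms interval (a positive value means attention had moved on from the face by 500 ms). The sketch below uses entirely hypothetical RT values shaped like the reported pattern; it illustrates the index, not the study's data or analysis.

```python
import numpy as np

def disengagement_effect(rt_200, rt_500):
    """RT benefit (ms) of the longer face-target interval: positive
    values indicate attention disengaged from the face by 500 ms."""
    return np.mean(rt_200) - np.mean(rt_500)

# Hypothetical mean RTs (ms) per cue emotion, as (200 ms, 500 ms) pairs:
# LTA speed up by 500 ms for every emotion; HTA show no benefit when the
# cue face is threatening.
lta = {"neutral": ([520, 530], [480, 470]),
       "fearful": ([525, 535], [485, 475])}
hta = {"neutral": ([525, 530], [485, 480]),
       "fearful": ([540, 545], [538, 542])}

for group, name in ((lta, "LTA"), (hta, "HTA")):
    for emotion, (rt200, rt500) in group.items():
        print(name, emotion, disengagement_effect(rt200, rt500))
```

With these made-up numbers the index is large for every LTA condition but collapses only for HTA fearful cues, mirroring the interaction described in the Results.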

11.
Visual processing of facial affect
To evaluate the role of the fusiform gyrus in identifying and processing facial emotional expression in humans, MEG data were collected while six healthy subjects judged whether photographs of faces displayed emotion (happiness or disgust) compared to neutral faces and equiluminant scrambled faces. For all six subjects, a magnetic source localizing to right fusiform gyrus was evident approximately 150 ms following presentation of face stimuli, but not following non-face stimuli. MEG source strength for this component was greatest for happy, intermediate for disgust, and lowest for neutral facial expressions, suggesting that activity in fusiform gyrus is sensitive to both face-specific stimuli and to the affective content of the face. These findings are considered in the context of a specialized neural face-dependent information system.

12.
We compared electrical brain responses to fearful vs. neutral facial expressions in healthy volunteers while they performed an orthogonal gender decision task. Face stimuli either had a broadband spatial-frequency content, or were filtered to create either low spatial-frequency (LSF) or high spatial-frequency (HSF) faces, always overlapped with their complementary SF content in upside-down orientation to preserve the total stimulus energy. We tested the hypothesis that the coarse LSF content of faces might be responsible for an early modulation of event-related potentials (ERPs) to fearful expressions. Consistent with previous findings, we show that broadband images of fearful faces, relative to neutral faces, elicit a higher global field power at approximately 130 ms post-stimulus onset, corresponding to an increased P1 component over lateral occipital electrodes, with neural sources located within the extrastriate visual cortex. Bandpass filtering of faces strongly affected the latency and amplitude of ERPs, with a suppression of the normal N170 response for both LSF and HSF faces, irrespective of expression. Critically, we found that LSF information from fearful faces, unlike HSF information, produced a right-lateralized enhancement of the lateral occipital P1, without any change in the scalp topography, relative to unfiltered (broadband) fearful faces. These results demonstrate that an early P1 response to fear expression depends on a visual pathway preferentially tuned to coarse magnocellular inputs, and can persist unchanged even when the N170 generators are disrupted by SF filtering.

13.
We investigated how visual and linguistic information interact in the perception of emotion. We borrowed a phenomenon from film theory whereby presentation of a visual scene that is neutral in itself intensifies the percept of fear or suspense induced through a different channel of information, such as language. Our main aim was to investigate how neutral visual scenes can enhance responses to fearful language content in parts of the brain involved in the perception of emotion. Healthy participants' brain activity was measured (using functional magnetic resonance imaging) while they read fearful and less fearful sentences presented with or without a neutral visual scene. The main idea is that the visual scenes intensify the fearful content of the language by subtly implying and concretizing what is described in the sentence. Activation levels in the right anterior temporal pole were selectively increased when a neutral visual scene was paired with a fearful sentence, compared to reading the sentence alone, as well as to reading non-fearful sentences presented with the same neutral scene. We conclude that the right anterior temporal pole serves a binding function for emotional information across domains such as visual and linguistic information.

14.
The goal of the present study was to characterize the effects of valence in facial cues and object targets on event-related potential (ERP) indices of gaze-directed orienting. Participants were shown faces at fixation that concurrently displayed dynamic gaze shifts and expression changes from neutral to fearful or happy emotions. Emotionally salient target objects subsequently appeared in the periphery and were spatially congruent or incongruent with the gaze direction. ERPs were time-locked to target presentation. Three sequential ERP components were modulated by happy emotion, indicating a progression from an expression effect to a gaze-by-expression interaction to a target emotion effect. These effects included larger P1 amplitude over contralateral occipital sites for targets following happy faces, larger centrally distributed N1 amplitude for targets following happy faces with leftward gaze, and faster P3 latency for positive targets. In addition, parietally distributed P3 amplitude was reduced for validly cued targets following fearful expressions. Results are consistent with accounts of attentional broadening and motivational approach by happy emotion, and facilitation of spatially directed attention in the presence of fearful cues. The findings have implications for understanding how socioemotional signals in faces interact with each other and with emotional features of objects in the environment to alter attentional processes.

15.
People understand others' emotions quickly from their facial expressions. However, facial expressions of ingroup and outgroup members may signal different social information and thus be mediated by distinct neural activities. We investigated whether there are distinct neuronal responses to fearful and happy expressions of same-race (SR) and other-race (OR) faces. We recorded the electroencephalogram from Chinese adults viewing an adaptor face (with fearful/neutral expressions in Experiment 1 but happy/neutral expressions in Experiment 2) and a target face (with fearful expressions in Experiment 1 but happy expressions in Experiment 2) presented in rapid succession. We found that both fearful and happy (vs neutral) adaptor faces increased the amplitude of a frontocentral positivity (P2). However, a fearful but not happy (vs neutral) adaptor face decreased the P2 amplitudes to target faces, and this repetition suppression (RS) effect occurred when adaptor and target faces were of the same race but not when of different races. RS was observed on two late parietal/central positive activities to fearful/happy target faces, which, however, occurred regardless of whether adaptor and target faces were of the same or different races. Our findings suggest that early affective processing of fearful expressions may engage distinct neural activities for SR and OR faces.

16.
In psychiatrically-well subjects, the modulation of event-related potentials (ERPs) by emotional facial expressions is found in several ERPs from approximately 100 ms onward. A face-related ERP, the N170, is abnormally reduced in schizophrenia in response to faces relative to other complex objects, and research suggests emotional modulation of the N170 may be reduced as well. To further examine facial emotion modulation of the N170, subjects detected neutral facial expressions from among five emotional expressions (happy, sad, fearful, angry, and disgusted). Over occipitotemporal sites, psychiatrically-well subjects showed bilateral differences in N170 amplitude among expressions (P = 0.014). Schizophrenia subjects failed to show this modulation (P = 0.551). Accuracy on the task did not differ between groups, nor did the pattern of errors. However, in patients, greater positive and negative symptom ratings were associated with increased failure to button press to neutral faces, suggesting misattribution of emotion to neutral expressions in the more ill patients. Because the N170 is largely specific to faces, these results suggest that an impairment specific to the visual processing of facial expressions contributes to the well-known behavioral abnormalities in facial emotion tasks in schizophrenia.

17.
Individuals with autism spectrum disorders (ASDs) have different automatic responses to faces than typically developing (TD) individuals. We recorded visual evoked potentials (VEPs) in 10 individuals with high-functioning ASD (HFASD) and 10 TD individuals. Visual stimuli consisted of upright and inverted faces (fearful and neutral) and objects presented subliminally in a backward-masking paradigm. In all participants, the occipital N1 (about 100 ms) and P1 (about 120 ms) peaks were major components of the evoked response. We calculated “subliminal face effect (SFE)” scores by subtracting the N1/P1 amplitudes and latencies of the object stimuli from those of the face stimuli. In the TD group, the SFE score for the N1 amplitude was significantly higher for upright fearful faces but not neutral faces, and this score was insignificant when the stimuli were inverted. In contrast, the N1 amplitude of the HFASD subjects did not show this SFE in the upright orientation. There were no significant group differences in SFE scores for P1 amplitude, latency, or N1 latency. Our findings suggest that individuals with HFASD have altered automatic visual processing for emotional faces within the lower level of the visual cortex. This impairment could be a neural component of the disrupted social cognition observed in individuals with HFASD.

18.
Although shyness is presumed to be related to an increased sensitivity to detect motivationally salient social stimuli, we know little of how shyness affects the early perception of facial emotions. We demonstrate here that individual differences in normative shyness were related to brain responses to some emotional faces as early as the P1 electrocortical component, 80-130 ms after stimulus onset. High-shy individuals showed reduced P1 amplitude for fearful faces compared to neutral faces. Low-shy individuals processed happy faces faster than other emotions and showed increased P1 amplitudes for happy faces over neutral faces. Regardless of shyness level, participants showed increased amplitudes in the N170 component (130-200 ms) for all emotions over neutral conditions, particularly for the emotion of fear. This study presents the first evidence that shyness is related to early electrocortical responses to the processing of fearful faces, consistent with a fast-path amygdala sensitivity model.

19.
Faces are multi-dimensional stimuli bearing important social signals, such as gaze direction and emotion expression. To test whether perception of these two facial attributes recruits distinct cortical areas within the right hemisphere, we used single-pulse transcranial magnetic stimulation (TMS) in healthy volunteers while they performed two different tasks on the same face stimuli. In each task, two successive faces were presented with varying eye-gaze directions and emotional expressions, separated by a short interval of random duration. TMS was applied over either the right somatosensory cortex or the right superior lateral temporal cortex, 100 or 200 ms after presentation of the second face stimulus. Participants performed a speeded matching task on the second face during one of two possible conditions, requiring judgements about either gaze direction or emotion expression (same/different as the first face). Our results reveal a significant task-stimulation site interaction, indicating a selective TMS-related interference following stimulations of somatosensory cortex during the emotional expression task. Conversely, TMS of the superior lateral temporal cortex selectively interfered with the gaze direction task. We also found that the interference effect was specific to the stimulus content in each condition, affecting judgements of gaze shifts (not static eye positions) with TMS over the right superior temporal cortex, and judgements of fearful expressions (not happy expressions) with TMS over the right somatosensory cortex. These results provide for the first time a double dissociation in normal subjects during social face recognition, due to transient disruption of non-overlapping brain regions. The present study supports a critical role of the somatosensory and superior lateral temporal regions in the perception of fear expression and gaze shift in seen faces, respectively.

20.
The emotional valence of facial expressions can be reliably discriminated even in the absence of conscious visual experience by patients with lesions to the primary visual cortex (affective blindsight). Prior studies in one such patient (GY) also showed that this non-conscious perception can influence conscious recognition of normally seen emotional faces. Here we report a similar online interaction across hemispheres between conscious and non-conscious perception of emotions in normal observers. Fearful and happy facial expressions were presented either unilaterally (to the left or right visual field) or simultaneously to both visual fields. In bilateral displays, conscious perception of one face in a pair was prevented by backward masking after 20 ms, while the opposite expression remained normally visible. The results showed a bidirectional influence of non-conscious fear processing over conscious recognition of happy as well as fearful expressions. Consciously perceived fearful faces were more readily recognized when they were paired with invisible emotionally congruent fearful expressions in the opposite field, as compared to the single presentation of the same unmasked faces. On the other hand, recognition of unmasked happy faces was delayed by the simultaneous presence of a masked fearful face. No such effect was reported for masked happy expressions. These findings show that non-conscious processing of fear may modulate ongoing conscious evaluation of facial expressions via neural interhemispheric summation even in the intact brain.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)