Similar Articles
Found 20 similar articles (search time: 46 ms)
1.
Research investigating the early development of emotional processing has focused mainly on infants' perception of static facial emotional expressions, likely restricting the amount and type of information available to infants. In particular, the question of whether dynamic information in emotional facial expressions modulates infants' neural responses has rarely been investigated. The present study aimed to fill this gap by recording 7-month-olds' event-related potentials to static (Study 1) and dynamic (Study 2) happy, angry, and neutral faces. In Study 1, happy faces evoked a faster right-lateralized negative central (Nc) component compared to angry faces. In Study 2, both happy and angry faces elicited a larger right-lateralized Nc compared to neutral faces. Irrespective of stimulus dynamicity, a larger P400 to angry faces was associated with higher scores on the Negative Affect temperamental dimension. Overall, the results suggest that 7-month-olds are sensitive to facial dynamics, which might play a role in shaping the neural processing of facial emotional expressions. The results also suggest that the amount of attentional resources infants allocate to angry expressions is associated with their temperamental traits. These findings represent a promising avenue for future studies exploring the neurobiological processes involved in perceiving emotional expressions using dynamic stimuli.
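Several of the studies listed here quantify ERP components such as the Nc or P400 as the mean amplitude of a trial-averaged, baseline-corrected waveform inside a component time window. A minimal stdlib-only sketch of that computation, using toy data; the function name, sampling layout, and window bounds are illustrative assumptions, not the values used in the study above:

```python
from statistics import mean

def erp_mean_amplitude(epochs, times, t_start, t_end, baseline_end=0.0):
    """Average single-trial epochs into an ERP, baseline-correct it,
    and return the mean amplitude inside a component window.

    epochs: list of equal-length voltage lists (uV), one per trial
    times:  time points (s) aligned with every epoch; negative = pre-stimulus
    """
    # Grand average across trials at each time point
    erp = [mean(trial[i] for trial in epochs) for i in range(len(times))]
    # Subtract the mean pre-stimulus voltage (t < baseline_end)
    baseline = mean(v for v, t in zip(erp, times) if t < baseline_end)
    corrected = [v - baseline for v in erp]
    # Mean amplitude inside the component window [t_start, t_end]
    return mean(v for v, t in zip(corrected, times) if t_start <= t <= t_end)
```

Comparing this quantity between conditions (e.g., epochs from happy versus angry trials) is the kind of contrast that amplitude results like those above rest on.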

2.
It has been argued that the amygdala represents an integral component of a vigilance system that is primarily involved in the perception of ambiguous stimuli of biological relevance. The present investigation examined the relationship between automatic amygdala responsivity to fearful faces, which may be interpreted as an index of trait-like threat sensitivity, and the spatial processing characteristics of facial emotions. During 3T fMRI scanning, pictures of human faces bearing fearful, angry, and happy expressions were presented to 20 healthy volunteers using a backward masking procedure based on neutral facial expressions. Subsequently, a computer-based face-in-the-crowd task using schematic face stimuli was administered. The neural response of the (right) amygdala to masked fearful faces correlated consistently with response speed to negative and neutral faces. Neither amygdala activation during the masked presentation of angry faces nor amygdala activation during the presentation of happy faces was correlated with any of the response latencies in the face-in-the-crowd task. Our results suggest that amygdala responsivity to masked facial expressions is differentially related to general visual search speed for facial expressions. Neurobiologically defined threat sensitivity seems to represent an important determinant of visual scanning behaviour.

3.
Facial Reactions to Facial Expressions
Ulf Dimberg, Psychophysiology, 1982, 19(6): 643-647
Previous research has demonstrated that different patterns of facial muscle activity are correlated with different emotional states. In the present study, subjects were exposed to pictures of happy and angry facial expressions while their facial electromyographic (EMG) activity, heart rate (HR), and palmar skin conductance responses (SCRs) were recorded. It was found that happy and angry faces evoked different facial EMG response patterns, with increased zygomatic region activity to happy stimuli and increased corrugator region activity to angry stimuli. Furthermore, both happy and angry faces evoked HR decelerations and similar SCR magnitudes. The results are interpreted as suggesting that facial EMG recordings provide a method for distinguishing between response patterns to “positive” and “negative” emotional visual stimuli.
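The zygomatic/corrugator pattern reported here is typically analyzed as baseline-corrected change scores per muscle region. A toy sketch of that bookkeeping with a deliberately crude valence read-out; the region labels and the decision rule are our illustrative assumptions, not Dimberg's analysis:

```python
def emg_change_scores(baseline, stimulus):
    """Baseline-corrected EMG change per muscle region.

    baseline, stimulus: dicts mapping region name -> mean rectified
    EMG (uV) before and during picture viewing; a positive change
    means the muscle became more active.
    """
    return {m: stimulus[m] - baseline[m] for m in baseline}

def classify_valence(change):
    """Toy read-out of the pattern described above: relatively more
    zygomatic ('smiling') activity -> positive stimulus, relatively
    more corrugator ('frowning') activity -> negative stimulus."""
    if change["zygomatic"] > change["corrugator"]:
        return "positive"
    if change["corrugator"] > change["zygomatic"]:
        return "negative"
    return "indeterminate"
```

In practice such scores are averaged over many trials per condition before any comparison is made; the single-observation read-out here is only to make the response pattern concrete.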

4.
The effects of task demands and the interaction between gender and expression in face perception were studied using event-related potentials (ERPs). Participants performed three different tasks with male and female faces that were emotionally inexpressive or that showed happy or angry expressions. In two of the tasks (gender and expression categorization) facial properties were task-relevant, while in a third task (symbol discrimination) facial information was irrelevant. Effects of expression were observed on the visual P100 component under all task conditions, suggesting the operation of an automatic process that is not influenced by task demands. The earliest interaction between expression and gender was observed later, in the face-sensitive N170 component. This component showed differential modulations by specific combinations of gender and expression (e.g., angry male vs. angry female faces). Main effects of expression and task were observed in a later occipito-temporal component peaking around 230 ms post-stimulus onset (EPN, or early posterior negativity): amplitudes were less positive in the presence of angry faces and during performance of the gender and expression tasks. Finally, task demands also modulated a positive component peaking around 400 ms (LPC, or late positive complex), which showed enhanced amplitude for the gender task. The pattern of results obtained here adds new evidence about the sequence of operations involved in face processing and the interaction of facial properties (gender and expression) under different task demands.

5.
This investigation examined the effects of maltreatment during the first year of life on the neural correlates of processing facial expressions of emotion at 30 months of age. Event-related potentials (ERPs) were recorded while children passively viewed standardized pictures of female models posing angry, happy, and neutral facial expressions. Four ERP waveform components were derived: early negative (N150), early positive (P260), negative central (Nc), and positive slow wave (PSW). Differences in these waveforms between groups of 35 maltreated and 24 nonmaltreated children were examined. The groups did not differ on the early perceptual negative component (N150), whereas the maltreated children had greater P260 amplitude at frontal leads than the nonmaltreated children in response to viewing angry facial expressions. For the Nc component, the nonmaltreated comparison children exhibited greater amplitude while viewing pictures of happy faces compared to angry and neutral faces, whereas the maltreated children showed greater Nc amplitude at central sites while viewing angry faces. For the PSW, the nonmaltreated group showed a greater area score in the right hemisphere in response to viewing angry facial expressions compared to the maltreated group. The results are discussed in terms of brain development and function, as well as their implications for the design and evaluation of preventive interventions.

6.
Numerous investigators have tested the contention that angry faces capture early attention more completely than happy faces do in the context of other faces. However, syntheses of studies on early event-related potentials relevant to this anger superiority hypothesis have yet to be conducted, particularly for the N200 posterior-contralateral (N2pc) component, which provides a reliable electrophysiological index of attentional orienting suitable for testing the hypothesis. Fifteen samples (N = 534) from 13 studies featuring the assessment of N2pc amplitudes during exposure to angry-neutral and/or happy-neutral facial expression arrays were included in a meta-analysis. Moderating effects of study design features and sample characteristics on effect size variability were also assessed. N2pc amplitudes elicited by affectively valenced expressions (angry and happy) were significantly more pronounced than those elicited by neutral expressions. However, the mean effect size difference between angry and happy expressions was nonsignificant. N2pc effect sizes were moderated by sample age, number of trials, and the nature of the facial images used (schematic vs. real), with larger effect sizes observed when samples were comparatively younger, more task trials were presented, and schematic face arrays were used. The N2pc results did not support the anger superiority hypothesis. Instead, attentional resources allocated to angry versus happy facial expressions were similar in early stages of processing. As such, possible adaptive advantages of biases in orienting toward both angry and happy expressions warrant consideration in revisions of related theory.
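Pooling per-sample effect sizes, as in the meta-analysis above, typically starts from the inverse-variance (fixed-effect) estimator. A stdlib-only sketch; the numbers in the usage below are invented for illustration and are not the N2pc data from the study:

```python
from math import sqrt

def fixed_effect_summary(effects):
    """Inverse-variance weighted summary of per-sample effect sizes.

    effects: list of (effect_size, variance) pairs, one per sample.
    Returns (summary_effect, standard_error) of the pooled estimate.
    """
    weights = [1.0 / var for _, var in effects]        # precision weights
    pooled = sum(w * es for (es, _), w in zip(effects, weights)) / sum(weights)
    se = sqrt(1.0 / sum(weights))                      # SE of the pooled effect
    return pooled, se
```

A random-effects variant would add a between-sample variance component to each weight, which matters when moderators such as sample age or stimulus type drive real heterogeneity, as reported above.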

7.
BACKGROUND: Emotional Stroop tasks have shown attention biases of clinical populations towards stimuli related to their condition. Asperger Syndrome (AS) is a neuropsychiatric condition with social and communication deficits, repetitive behaviours and narrow interests. The social deficits are particularly striking and include difficulties in understanding others. METHOD: We measured the latencies of adults with and without AS to name the colours of pictures containing angry facial expressions, neutral expressions or non-social objects. We tested three hypotheses: whether (1) controls show longer colour-naming latencies for angry versus neutral facial expressions with male actors, (2) people with AS show differential latencies across picture types, and (3) differential response latencies persist when the photographs contain females. RESULTS: Controls had longer latencies to pictures of male faces with angry compared to neutral expressions. The AS group did not show longer latencies to angry versus neutral expressions in male faces; instead, they showed slower latencies to pictures containing any facial expression compared to objects. When the pictures contained females, controls no longer showed longer latencies for angry versus neutral expressions. However, the AS group still showed longer latencies to all facial picture types compared to objects, providing further evidence that faces produce interference effects for this clinical group. CONCLUSIONS: The pictorial emotional Stroop paradigm reveals normal attention biases towards threatening emotional faces. The AS group showed Stroop interference effects to all facial stimuli regardless of expression or sex, suggesting that faces cause disproportionate interference in AS.

8.
Neuroimaging evidence suggests that dynamic facial expressions elicit greater activity than static face stimuli in brain structures associated with social cognition, interpreted as reflecting greater ecological validity. However, a quantitative meta-analysis of brain activity associated with dynamic facial expressions has been lacking. The current study investigated, in three fMRI experiments, activity elicited by (a) dynamic and static happy faces, (b) dynamic and static happy and angry faces, and (c) dynamic faces and dynamic flowers. In addition, using activation likelihood estimate (ALE) meta-analysis, we determined areas concordant across published studies that (a) used dynamic faces and (b) specifically compared dynamic and static emotional faces. The middle temporal gyri (Experiment 1) and superior temporal sulci (STS; Experiments 1 and 2) were more active for dynamic than static faces. In contrasts against baseline, the amygdalae were more active for dynamic faces (Experiments 1 and 2) and the fusiform gyri were active in all conditions (all experiments). The ALE meta-analyses revealed concordant activation in all of these regions as well as in areas associated with cognitive manipulations (inferior frontal gyri). Converging data from the experiments and the meta-analyses suggest that dynamic facial stimuli elicit increased activity in regions associated with the interpretation of social signals and emotional processing.

9.
The angry facial expression is an important socially threatening stimulus, argued to have evolved to regulate social hierarchies. In the present study, event-related potentials (ERPs) were used to investigate the involvement and temporal dynamics of the frontal and parietal regions in the processing of angry facial expressions. Angry, happy and neutral faces were shown to eighteen healthy right-handed volunteers in a passive viewing task. Stimulus-locked ERPs were recorded from frontal and parietal scalp sites, and the P200, N300 and early contingent negative variation (eCNV) components were investigated. Analyses revealed statistically significant reductions in P200 amplitude for the angry facial expression at both frontal and parietal electrode sites. Furthermore, apart from being strongly associated with the anterior P200, the N300 was also more negative for the angry facial expression in the anterior regions. Finally, the eCNV was more pronounced over the parietal sites for the angry facial expressions. The present study thus demonstrated specific electrocortical correlates underlying the processing of angry facial expressions in anterior and posterior brain regions. The P200 is argued to indicate valence tagging by a fast and early detection mechanism. The lowered N300 with an anterior distribution for the angry facial expressions indicates more elaborate evaluation of stimulus relevance. The fact that the P200 and the N300 are highly correlated suggests that they reflect different stages of the same anterior evaluation mechanism. The more pronounced posterior eCNV suggests sustained attention to socially threatening information.

10.
Startle reflex modulation by affective pictures is a well-established effect in human emotion research. However, much less is known about startle modulation by affective faces, despite growing evidence that facial expressions robustly activate emotion-related brain circuits. In this study, acoustic startle probes were administered to 37 young adult participants (20 women) during the viewing of slides from the Pictures of Facial Affect set, including neutral, happy, angry, and fearful faces. The effect of expression valence (happy, neutral, and negative) on startle magnitude was highly significant (p < .001). The startle reflex was strongly potentiated by negative expressions (fearful and angry); however, no attenuation by happy faces was observed. A significant valence by gender interaction suggests stronger startle potentiation effects in females. These results demonstrate that affective facial expressions can produce significant modulation of the startle reflex.

11.
In this experiment, a lateralized right hemisphere effect was found for electrodermal associative learning to facial emotional expressions. Sixty-two subjects were presented simultaneously with a slide of a happy face in the right or left visual half field (VHF) and a slide of an angry face in the opposite VHF. Four groups were formed by the combination of the two VHF positions of angry/happy faces and the administration/omission of shock unconditioned stimuli. The results showed that simultaneous presentation of the angry face to the right hemisphere and the happy face to the left hemisphere, together with shock, resulted in a strong conditioned association with the angry face and a relatively weak association with the happy face. Furthermore, simultaneous presentation of the angry face to the left hemisphere and the happy face to the right hemisphere, together with shock, resulted in a relatively weak association with both stimuli. No significant differences were found for the no-shock control groups. The present results confirm previous findings of a right hemisphere advantage for representation of associative learning.

12.
Processing and maintenance in working memory involve active attention allocation; consequently, a recognition process could interfere with the performance of highly demanding working memory tasks. Event-related brain potentials (ERPs) were recorded while fourteen healthy male adults performed a visual verbal dual working memory task. Three conditions were examined: A) reference (no preceding stimuli); B) happy, angry or neutral faces presented for 30 ms, 250 ms prior to task onset; and C) a visual noise control condition. Behavioral results showed that reaction times were significantly longer in the condition preceded by the presentation of faces, regardless of their emotional content. ERPs showed a predominantly right temporo-occipital negative component at around 170 ms during conditions B and C (with higher amplitude in B), which probably reflects the pre-categorical structural encoding of the face. Following task onset, an early positive right temporo-parietal component (P380) appeared during condition B, probably reflecting a delayed reallocation of working memory attentional resources to the task requirements. Afterwards, two other main fronto-parietal components were observed in all three conditions: a positive wave that peaked at around 410 ms, and a subsequent negative component (N585). This latter waveform reached a significantly higher amplitude during the reference condition (A) and was interpreted as mirroring the phonologic-lexical manipulation of the stimuli in working memory. These results suggest that early face detection can induce an attentional decrement that interferes with subsequent performance of a visual verbal working memory task. They also suggest that while face detection and facial emotional content analysis might be parallel processes, they are not onset-synchronized.

13.
Gender differences in facial reactions to facial expressions
This study explored whether males and females differ in facial muscle reactivity when exposed to facial expressions. The study also examined whether the sex of the stimulus faces differentially influences the response patterns to facial stimuli. Thus, sex was manipulated in a 2 × 2 factorial design by exposing males and females to slides of angry and happy faces displayed by both sexes. Facial electromyographic (EMG) activity was measured from the corrugator and zygomatic muscle regions, and the subjects were also required to rate the stimuli on different dimensions. The results showed that angry faces evoked increased corrugator activity whereas happy faces evoked increased zygomatic activity. As predicted, these effects were more pronounced for females, particularly for the response to happy faces. Interestingly, there were no facial EMG effects of stimulus gender. It was further found that males and females perceived the stimuli similarly. The results are consistent with previous findings indicating that females are more facially reactive than males.

14.
Preliminary studies have demonstrated that school-aged children (average age 9-10 years) show mimicry responses to happy and angry facial expressions. The aim of the present study was to assess the feasibility of using facial electromyography (EMG) to study facial mimicry responses of younger children, aged 6-7 years, to the emotional facial expressions of other children. Facial EMG activity during the presentation of dynamic emotional faces was recorded from the corrugator, zygomaticus, frontalis and depressor muscles in sixty-one healthy participants aged 6-7 years. Results showed that the presentation of angry faces was associated with corrugator activation and zygomaticus relaxation, happy faces with an increase in zygomaticus and a decrease in corrugator activation, fearful faces with frontalis activation, and sad faces with a combination of corrugator and frontalis activation. This study demonstrates the feasibility of measuring facial EMG responses to emotional facial expressions in 6- to 7-year-old children.

15.
To investigate the mechanisms by which oxytocin improves socioaffective processing, we measured behavioral and pupillometric data during a dynamic facial emotion recognition task. In a double-blind between-subjects design, 47 men received either 24 IU intranasal oxytocin (OXT) or a placebo (PLC). Participants in the OXT group recognized all facial expressions at lower intensity levels than did participants in the PLC group. Improved performance was accompanied by increased task-related pupil dilation, indicating an increased recruitment of attentional resources. We also found increased pupil dilation during the processing of female compared with male faces. This gender-specific stimulus effect diminished in the OXT group, in which pupil size specifically increased for male faces. Results suggest that improved emotion recognition after OXT treatment might be due to an intensified processing of stimuli that usually do not recruit much attention.

16.
The present study investigated whether social anxiety modulates the processing of facial expressions. Event-related potentials were recorded during an oddball task in young adults reporting high or low levels of social anxiety as evaluated by the Liebowitz Social Anxiety Scale. Repeated pictures of faces with a neutral expression were infrequently replaced by pictures of the same face displaying happiness, anger, fear or disgust. For all participants, response latencies were shorter in detecting faces expressing disgust and happiness as compared to fear or anger. Low social anxiety individuals showed an enhanced P1 in response to angry faces as compared to other stimuli, while highly socially anxious participants displayed an enlarged P1 for all emotional stimuli as compared to neutral ones, and generally higher amplitudes as compared to non-anxious individuals. Conversely, the face-specific N170 and the task-related decision P3b were not influenced by social anxiety. These results suggest increased pre-attentive detection of facial cues in socially anxious individuals and are discussed within the framework of recent models of anxiety.

17.
Pictures of emotional facial expressions or natural scenes are often used as cues in emotion research. We examined the extent to which these different stimuli engage emotion and attention, and whether the presence of social anxiety symptoms influences responding to facial cues. Sixty participants reporting high or low social anxiety viewed pictures of angry, neutral, and happy faces, as well as violent, neutral, and erotic scenes, while skin conductance and event-related potentials were recorded. Acoustic startle probes were presented throughout picture viewing, and blink magnitude, probe P3, and reaction time to the startle probe were also measured. Results indicated that viewing emotional scenes prompted strong reactions in autonomic, central, and reflex measures, whereas pictures of faces were generally weak elicitors of measurable emotional response. However, higher social anxiety was associated with modest electrodermal changes when viewing angry faces and mild startle potentiation when viewing either angry or smiling faces, compared to neutral. Taken together, pictures of facial expressions do not strongly engage fundamental affective reactions, but these cues appeared to be effective in distinguishing between high and low social anxiety participants, supporting their use in anxiety research.

18.
Facial muscular reactions to avatars' static (neutral, happy, angry) and dynamic (morphs developing from neutral to happy or angry) facial expressions, presented for 1 s each, were investigated in 48 participants. Dynamic expressions led to better recognition rates and higher intensity and realism ratings. Angry expressions were rated as more intense than happy expressions. EMG recordings indicated emotion-specific reactions to happy avatars, reflected in increased M. zygomaticus major and decreased M. corrugator supercilii tension, with stronger reactions to dynamic as compared to static expressions. Although rated as more intense, angry expressions elicited no significant M. corrugator supercilii activation. We conclude that facial reactions to angry and to happy facial expressions serve different functions in social interactions. Further research should vary dynamics in different ways and also include additional emotional expressions.

19.
To investigate the time course of emotional expression processing, we recorded ERPs to facial stimuli. The first task was to discriminate emotional expressions. Enhanced negativity of the face-specific N170 was elicited by emotional as opposed to neutral faces, followed by an occipital negativity (240-340 ms poststimulus). The second task was to classify face gender. Here, the N170 was unaffected by emotional expression; however, an emotional expression effect was expressed in an anterior positivity (160-250 ms poststimulus) and a subsequent occipital negativity (240-340 ms poststimulus). The results support the thesis that structural encoding relevant to gender recognition and simultaneous expression analysis are independent processes. Attention modulates facial emotion processing at 140-185 ms poststimulus. Involuntary differentiation of facial expression was observed later (160-340 ms poststimulus), suggesting unintentional attention capture.

20.
Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information about emotions that is relevant for social communication. Several studies have demonstrated fast decoding of emotions from words, as has been shown for faces, whereas others report a rather delayed decoding of emotional information from words. Here, we introduced an implicit task (color naming) and an explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERPs). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in modulation of the N170, EPN and LPP components, and these modulations were found during both the explicit and implicit tasks. Emotional words affected only the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source in the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing than emotional words do, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as indicated by the extrastriate source activity.
