Similar Articles
 20 similar articles retrieved (search time: 15 ms)
1.
Research investigating the early development of emotional processing has focused mainly on infants' perception of static facial emotional expressions, likely restricting the amount and type of information available to infants. In particular, the question of whether dynamic information in emotional facial expressions modulates infants' neural responses has rarely been investigated. The present study aimed to fill this gap by recording 7-month-olds' event-related potentials to static (Study 1) and dynamic (Study 2) happy, angry, and neutral faces. In Study 1, happy faces evoked a faster right-lateralized negative central (Nc) component compared to angry faces. In Study 2, both happy and angry faces elicited a larger right-lateralized Nc compared to neutral faces. Irrespective of stimulus dynamicity, a larger P400 to angry faces was associated with higher scores on the Negative Affect temperamental dimension. Overall, results suggest that 7-month-olds are sensitive to facial dynamics, which might play a role in shaping the neural processing of facial emotional expressions. Results also suggest that the amount of attentional resources infants allocate to angry expressions is associated with their temperamental traits. These findings represent a promising avenue for future studies exploring the neurobiological processes involved in perceiving emotional expressions using dynamic stimuli.

2.
Startle reflex modulation by affective pictures is a well-established effect in human emotion research. However, much less is known about startle modulation by affective faces, despite the growing evidence that facial expressions robustly activate emotion-related brain circuits. In this study, acoustic startle probes were administered to 37 young adult participants (20 women) during the viewing of slides from the Pictures of Facial Affect set, including neutral, happy, angry, and fearful faces. The effect of expression valence (happy, neutral, and negative) on startle magnitude was highly significant (p < .001). The startle reflex was strongly potentiated by negative expressions (fearful and angry); however, no attenuation by happy faces was observed. A significant valence by gender interaction suggests stronger startle potentiation effects in females. These results demonstrate that affective facial expressions can produce significant modulation of the startle reflex.

3.
Pictures of emotional facial expressions or natural scenes are often used as cues in emotion research. We examined the extent to which these different stimuli engage emotion and attention, and whether the presence of social anxiety symptoms influences responding to facial cues. Sixty participants reporting high or low social anxiety viewed pictures of angry, neutral, and happy faces, as well as violent, neutral, and erotic scenes, while skin conductance and event-related potentials were recorded. Acoustic startle probes were presented throughout picture viewing, and blink magnitude, probe P3, and reaction time to the startle probe also were measured. Results indicated that viewing emotional scenes prompted strong reactions in autonomic, central, and reflex measures, whereas pictures of faces were generally weak elicitors of measurable emotional response. However, higher social anxiety was associated with modest electrodermal changes when viewing angry faces and mild startle potentiation when viewing either angry or smiling faces, compared to neutral. Taken together, pictures of facial expressions do not strongly engage fundamental affective reactions, but these cues appeared to be effective in distinguishing between high and low social anxiety participants, supporting their use in anxiety research.

4.
P3b reflects maltreated children's reactions to facial displays of emotion (total citations: 4; self-citations: 0; citations by others: 4)
Processing of emotion information by maltreated and control children was assessed with event-related brain potentials (ERPs). Maltreated children, for whom negative facial displays may be especially salient, and demographically comparable peers were tested to increase knowledge of differential processing of emotion information. ERPs were measured while children responded to pictures depicting facial displays of anger, fear, and happiness. Maltreated children showed larger P3b amplitude when angry faces appeared as targets than did control children; the two groups did not differ when targets were either happy or fearful facial expressions or for nontargets of any emotional content. These results indicate that aberrant emotional experiences associated with maltreatment may alter the allocation of attention and sensitivity that children develop to process specific emotion information.

5.
Emotional facial expressions have affective significance. Smiles, for example, are perceived as positive and responded to with increased happiness, whereas angry expressions are perceived as negative and threatening. Yet these perceptions are modulated in part by facial morphological cues related to the sex of the expresser. The present research assessed both eyeblink startle and the postauricular reflex during happy and angry expressions posed by men and women. To this end, 14 male and 16 female undergraduates viewed happy, neutral, and angry facial expressions as well as positive and negative pictures. The postauricular reflex was potentiated during happy expressions and inhibited during angry expressions; however, as expected, this pattern was more clearly found for female expressers. Conversely, the expected pattern of eyeblink startle potentiation during angry faces and inhibition during happy faces was found only for male expressers.

6.
Facial Reactions to Facial Expressions (total citations: 9; self-citations: 0; citations by others: 9)
Ulf Dimberg, Psychophysiology, 1982, 19(6): 643-647
Previous research has demonstrated that different patterns of facial muscle activity are correlated with different emotional states. In the present study subjects were exposed to pictures of happy and angry facial expressions, in response to which their facial electromyographic (EMG) activities, heart rate (HR), and palmar skin conductance responses (SCRs) were recorded. It was found that happy and angry faces evoked different facial EMG response patterns, with increased zygomatic region activity to happy stimuli and increased corrugator region activity to angry stimuli. Furthermore, both happy and angry faces evoked HR decelerations and similar SCR magnitudes. The results are interpreted as suggesting that facial EMG recordings provide a method for distinguishing between response patterns to "positive" and "negative" emotional visual stimuli.

7.
It has been argued that the amygdala represents an integral component of a vigilance system that is primarily involved in the perception of ambiguous stimuli of biological relevance. The present investigation was conducted to examine the relationship between automatic amygdala responsivity to fearful faces, which may be interpreted as an index of trait-like threat sensitivity, and spatial processing characteristics of facial emotions. During 3T fMRI scanning, pictures of human faces bearing fearful, angry, and happy expressions were presented to 20 healthy volunteers using a backward masking procedure based on neutral facial expressions. Subsequently, a computer-based face-in-the-crowd task using schematic face stimuli was administered. The neural response of the (right) amygdala to masked fearful faces correlated consistently with response speed to negative and neutral faces. Neither amygdala activation during the masked presentation of angry faces nor amygdala activation during the presentation of happy faces was correlated with any of the response latencies in the face-in-the-crowd task. Our results suggest that amygdala responsivity to masked facial expressions is differentially related to general visual search speed for facial expressions. Neurobiologically defined threat sensitivity seems to represent an important determinant of visual scanning behaviour.

8.
Event-related potentials (ERPs) were recorded to brief images of caregivers' and strangers' faces for 72 institutionalized children (IG), ages 7-32 months, and compared with ERPs from 33 children, ages 8-32 months, who had never been institutionalized. All children resided in Bucharest, Romania. Prominent differences in four ERP components were observed: early negative (N170), early positive (P250), midlatency negative (Nc), and positive slow wave (PSW). For all but the P250, the amplitude of these components was larger in the never-institutionalized group than the institutionalized group; this pattern was reversed for the P250. Typical effects of the Nc (amplitude greater to stranger vs. caregiver) were observed in both groups; in contrast, the IG group showed an atypical pattern in the PSW. These findings are discussed in the context of the role of experience in influencing the neural circuitry putatively involved in recognizing familiar and novel faces.

9.
Numerous investigators have tested contentions that angry faces capture early attention more completely than happy faces do in the context of other faces. However, syntheses of studies on early event-related potentials related to the anger superiority hypothesis have yet to be conducted, particularly in relation to the N200 posterior-contralateral (N2pc) component, which provides a reliable electrophysiological index of attentional orienting suitable for testing this hypothesis. Fifteen samples (N = 534) from 13 studies featuring the assessment of N2pc amplitudes during exposure to angry-neutral and/or happy-neutral facial expression arrays were included for meta-analysis. Moderating effects of study design features and sample characteristics on effect size variability were also assessed. N2pc amplitudes elicited by affectively valenced expressions (angry and happy) were significantly more pronounced than those elicited by neutral expressions. However, the mean effect size difference between angry and happy expressions was not significant. N2pc effect sizes were moderated by sample age, number of trials, and the nature of the facial images used (schematic vs. real), with larger effect sizes observed when samples were comparatively younger, more task trials were presented, and schematic face arrays were used. The N2pc results did not support the anger superiority hypothesis. Instead, attentional resources allocated to angry versus happy facial expressions were similar in early stages of processing. As such, possible adaptive advantages of biases in orienting toward both angry and happy expressions warrant consideration in revisions of related theory.

10.
Anxiety is thought to enhance the processing of threatening information. Here, we investigated the cortical processing of angry faces during anticipated public speaking. To elicit anxiety, one group of participants was told that they would have to deliver a public speech. As a control condition, another group was told that they would have to write a short essay. During anticipation of these tasks, participants viewed facial expressions (angry, happy, and neutral) while the electroencephalogram was recorded. Event-related potential analysis revealed larger N170 amplitudes for angry compared to happy and neutral faces in the anxiety group. The early posterior negativity, an index of motivated attention, was also enhanced for angry compared to happy and neutral faces in participants anticipating public speaking. These results indicate that fear of public speaking influences early perceptual processing of faces such that the processing of angry faces in particular is facilitated.

11.
BACKGROUND: Emotional Stroop tasks have shown attention biases of clinical populations towards stimuli related to their condition. Asperger Syndrome (AS) is a neuropsychiatric condition with social and communication deficits, repetitive behaviours, and narrow interests. Social deficits are particularly striking, including difficulties in understanding others. METHOD: We investigated the latencies of adults with and without AS to name the colours of pictures containing angry facial expressions, neutral expressions, or non-social objects. We tested three hypotheses: whether (1) controls show longer colour-naming latencies for angry versus neutral facial expressions with male actors, (2) people with AS show differential latencies across picture types, and (3) differential response latencies persist when the photographs contain females. RESULTS: Controls had longer latencies to pictures of male faces with angry compared to neutral expressions. The AS group did not show longer latencies to angry versus neutral expressions in male faces, instead showing longer latencies to pictures containing any facial expression compared to objects. When the pictures contained females, controls no longer showed longer latencies for angry versus neutral expressions. However, the AS group still showed longer latencies to all facial picture types compared to objects, providing further evidence that faces produce interference effects for this clinical group. CONCLUSIONS: The pictorial emotional Stroop paradigm reveals normal attention biases towards threatening emotional faces. The AS group showed Stroop interference effects to all facial stimuli regardless of expression or sex, suggesting that faces cause disproportionate interference in AS.

12.
OBJECTIVE: To explore the characteristics of facial emotion recognition in young children with autism. METHODS: Children aged 18-36 months diagnosed with autism at the Child Developmental and Behavioral Center of the Third Affiliated Hospital of Sun Yat-sen University between March 2007 and September 2008 were enrolled as the autism group; typically developing children undergoing routine health examinations during the same period, matched to the autism group for age and sex, served as the control group. All children passively viewed pictures of five basic facial emotional expressions (happy, sad, surprised, angry, and fearful) displayed on a computer screen, and the two groups' visual attention to each expression and their own emotional reactions were observed and compared. RESULTS: Forty-five children were enrolled in each group during the study period. The duration of first fixation on the expressions showed no clear group effect but a clear expression effect: both groups looked longer at happy and angry expressions than at fearful expressions on first fixation. However, the autism group looked back at the expression pictures significantly less often than the control group, and their total looking time was also significantly shorter. In the control group, emotional-reaction scores differed markedly across expressions: positive-emotion scores were significantly higher for happy expressions than for the other expressions, negative-emotion scores for happy expressions were significantly lower than for angry and fearful expressions, and negative-emotion scores for sad and surprised expressions were also significantly lower than for fearful expressions. In the autism group, emotional-reaction scores did not differ significantly across expressions. CONCLUSIONS: Young children with autism show not only reduced visual attention to facial emotional expressions but also deficits in perceiving them, with particular difficulty understanding negative emotional expressions.

13.
The angry facial expression is an important socially threatening stimulus, argued to have evolved to regulate social hierarchies. In the present study, event-related potentials (ERPs) were used to investigate the involvement and temporal dynamics of the frontal and parietal regions in the processing of angry facial expressions. Angry, happy, and neutral faces were shown to eighteen healthy right-handed volunteers in a passive viewing task. Stimulus-locked ERPs were recorded from frontal and parietal scalp sites. The P200, N300, and early contingent negative variation (eCNV) components of the electric brain potentials were investigated. Analyses revealed statistically significant reductions in P200 amplitudes for the angry facial expression at both frontal and parietal electrode sites. Furthermore, apart from being strongly associated with the anterior P200, the N300 was also more negative for the angry facial expression in the anterior regions. Finally, the eCNV was more pronounced over the parietal sites for the angry facial expressions. The present study demonstrated specific electrocortical correlates underlying the processing of angry facial expressions in the anterior and posterior brain sectors. The P200 is argued to indicate valence tagging by a fast and early detection mechanism. The lowered N300 with an anterior distribution for the angry facial expressions indicates more elaborate evaluation of stimulus relevance. The fact that the P200 and the N300 are highly correlated suggests that they reflect different stages of the same anterior evaluation mechanism. The more pronounced posterior eCNV suggests sustained attention to socially threatening information.

14.
Preliminary studies have demonstrated that school-aged children (average age 9-10 years) show mimicry responses to happy and angry facial expressions. The aim of the present study was to assess the feasibility of using facial electromyography (EMG) to study facial mimicry responses to the emotional facial expressions of other children in younger children aged 6-7 years. Facial EMG activity in response to the presentation of dynamic emotional faces was recorded from the corrugator, zygomaticus, frontalis, and depressor muscles in sixty-one healthy participants aged 6-7 years. Results showed that the presentation of angry faces was associated with corrugator activation and zygomaticus relaxation, happy faces with an increase in zygomaticus and a decrease in corrugator activation, fearful faces with frontalis activation, and sad faces with a combination of corrugator and frontalis activation. This study demonstrates the feasibility of measuring facial EMG responses to emotional facial expressions in 6- to 7-year-old children.

15.
Male and female participants (n = 19) high or low in speech fear viewed pictures of faces posed in angry, neutral, and joyful expressions for 8 s each. Zygomaticus major and corrugator supercilii EMG, skin conductance, and heart rate were measured during picture viewing, and subjective ratings were made after each picture. Compared to anger expressions, joy expressions were responded to with greater zygomatic EMG, less corrugator EMG, and greater heart rate and skin conductance. Physiological responses to neutral expressions were similar to responses to anger expressions. Expressions posed by women elicited more negative physiological responses than expressions posed by men. More fearful participants exhibited more negative and less positive facial expressions, and showed skin conductance responses suggesting greater attention when viewing negative expressions. The results suggest that reactions to facial expressions are influenced by social context and are not simple mimicry.

16.
This investigation examined the neural and personality correlates of processing infant facial expressions in mothers with substantiated neglect of a child under 5 years old. Event-related potentials (ERPs) were recorded from 14 neglectful and 14 control mothers as they viewed and categorized pictures of infant cries, laughs, and neutral faces. Maternal self-reports of anhedonia and empathy were also completed. Early components (the negative occipitotemporal component peaking at around 170 ms [N170] and the positive potential peaking at about 200 ms [P200]) and the late positive potential (LPP) were selected for analysis. Both groups of mothers showed behavioral discrimination between the different facial expressions in reaction time and accuracy measures. Unlike control mothers, neglectful mothers did not exhibit increased N170 amplitude at temporal leads in response to viewing crying versus laughing and neutral expressions. Both groups had greater P200 and LPP amplitudes at centroparietal leads in response to viewing crying versus neutral facial expressions. However, neglectful mothers displayed an overall attenuated LPP response that was related to their higher scores on social anhedonia but not to their empathy scores. The ERP data suggest that failures in the early differentiation of cry stimuli and in the sustained processing of infant expressions, related to social anhedonia, may underlie insensitive responding in neglectful mothers. The implications of these results for the design and evaluation of preventive interventions are discussed.

17.
An extensive literature documents the infant's ability to recognize and discriminate a variety of facial expressions of emotion. However, little is known about the neural bases of this ability. To examine the neural processes that may underlie infants' responses to facial expressions, we recorded event-related potentials (ERPs) while 7-month-olds watched pictures of a happy face and a fearful face (Experiment 1) or an angry face and a fearful face (Experiment 2). In both experiments an early positive component, a middle-latency negative component and a later positive component were elicited. However, only when the infants saw the happy and fearful faces did the components differ for the two expressions. These results are discussed in the context of the neurobiological processes involved in perceiving facial expressions.

18.
Developing measures of socioaffective processing is important for understanding the mechanisms underlying emotional-interpersonal traits relevant to health, such as hostility. In this study, cigarette smokers with low (LH; n = 49) and high (HH; n = 43) trait hostility completed the Emotional Interference Gender Identification Task (EIGIT), a newly developed behavioral measure of socioaffective processing biases toward facial affect. The EIGIT involves quickly categorizing the gender of facial pictures that vary in affective valence (angry, happy, neutral, sad). Results showed that participants were slower and less accurate in categorizing the gender of angry faces in comparison to happy, neutral, and sad faces (which did not differ), signifying interference indicative of a socioaffective processing bias toward angry faces. Compared to LH individuals, HH participants exhibited diminished biases toward angry faces on error-based (but not speed-based) measures of emotional interference, suggesting impaired socioaffective processing. The EIGIT may be useful for future research on the role of socioaffective processing in traits linked with poor health.

19.
Does contextual affective information influence the processing of facial expressions even at relatively early stages of face processing? We measured event-related brain potentials to happy and sad facial expressions primed by preceding pictures of affectively positive and negative scenes. The face-sensitive N170 response amplitudes showed a clear affective priming effect: N170 amplitudes to happy faces were larger when presented after positive vs. negative primes, whereas N170 amplitudes to sad faces were larger when presented after negative vs. positive primes. Priming effects were also observed on later brain responses. The results support early integration of contextual and facial affective information in processing. The results also provide neurophysiological support for theories suggesting that behavioral affective priming effects are based, at least in part, on facilitation of the encoding of incoming affective information.

20.
The ability to distinguish facial emotions emerges in infancy. Although this ability has been shown to emerge between 5 and 7 months of age, the literature is less clear regarding the extent to which neural correlates of perception and attention play a role in processing of specific emotions. This study's main goal was to examine this question among infants. To this end, we presented angry, fearful, and happy faces to 7-month-old infants (N = 107, 51% female) while recording event-related brain potentials. The perceptual N290 component showed a heightened response for fearful and happy relative to angry faces. Attentional processing, indexed by the P400, showed some evidence of a heightened response for fearful relative to happy and angry faces. We did not observe robust differences by emotion in the negative central (Nc) component, although trends were consistent with previous work suggesting a heightened response to negatively valenced expressions. Results suggest that perceptual (N290) and attentional (P400) processing is sensitive to emotions in faces, but these processes do not provide evidence for a fear-specific bias across components.
