Similar literature
20 similar records found (search time: 31 ms)
1.
Research investigating the early development of emotional processing has focused mainly on infants' perception of static facial emotional expressions, likely restricting the amount and type of information available to infants. In particular, the question of whether dynamic information in emotional facial expressions modulates infants' neural responses has rarely been investigated. The present study aimed to fill this gap by recording 7-month-olds' event-related potentials to static (Study 1) and dynamic (Study 2) happy, angry, and neutral faces. In Study 1, happy faces evoked a faster right-lateralized negative central (Nc) component compared to angry faces. In Study 2, both happy and angry faces elicited a larger right-lateralized Nc compared to neutral faces. Irrespective of stimulus dynamicity, a larger P400 to angry faces was associated with higher scores on the Negative Affect temperamental dimension. Overall, results suggest that 7-month-olds are sensitive to facial dynamics, which might play a role in shaping the neural processing of facial emotional expressions. Results also suggest that the amount of attentional resources infants allocate to angry expressions is associated with their temperamental traits. These findings represent a promising avenue for future studies exploring the neurobiological processes involved in perceiving emotional expressions using dynamic stimuli.

2.
It remains an open question whether it is possible to assign a single brain operation or psychological function for facial emotion decoding to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during emotional face information processing. In the present study brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured while twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: subliminal (10 ms) vs. supraliminal (150 ms) stimulation (100 target-mask pairs for each condition). The results showed that both consciousness and the significance of the stimulus in terms of arousal can modulate power synchronization (ERD decrease) in the 150-350 ms time range: an early oscillatory event peaked at about 200 ms post-stimulus. GBA was enhanced more by supraliminal than by subliminal elaboration, as well as more by high-arousal (anger and fear) than by low-arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere discriminated emotional from neutral faces.

3.
The N170 is widely regarded as a face-sensitive potential, having its maximum at occipito-temporal sites, with right-hemisphere dominance. However, it is debatable whether the N170 is modulated by different emotional expressions of a face. The aim of this study was to analyze the N170 elicited by schematic happy and angry faces when the emotional expression is semantically processed. To investigate the influence of different emotional expressions of schematic faces, we used a Prime-Probe procedure with the N400 effect as an indicator of semantic processing. Eighteen subjects were presented with the German word "happiness" or "anger" followed by happy and angry faces. The word-face pair could be congruent or incongruent in emotional meaning. Subjects were instructed to compare the emotional meaning of the words and faces and to count the congruent trials. Event-related potentials were recorded from 124 sites. Congruent faces elicited a smaller negativity in the N400 time range than incongruent faces, indicating that the facial emotional expression was cognitively processed. The face-sensitive N170 was most pronounced at posterior and occipital sites, and N170 amplitudes were larger following the angry as compared to the happy faces. It is concluded that different emotional expressions of schematic faces can modulate the N170.

4.
An extensive literature documents the infant's ability to recognize and discriminate a variety of facial expressions of emotion. However, little is known about the neural bases of this ability. To examine the neural processes that may underlie infants' responses to facial expressions, we recorded event-related potentials (ERPs) while 7-month-olds watched pictures of a happy face and a fearful face (Experiment 1) or an angry face and a fearful face (Experiment 2). In both experiments an early positive component, a middle-latency negative component and a later positive component were elicited. However, only when the infants saw the happy and fearful faces did the components differ for the two expressions. These results are discussed in the context of the neurobiological processes involved in perceiving facial expressions. © 1996 John Wiley & Sons, Inc.

5.
Dan O, Raz S. Biological Psychology, 2012, 91(2): 212-220
Attachment-related electrophysiological differences in emotional processing biases were examined using Event-Related Potentials (ERPs). We identified ERP correlates of emotional processing by comparing ERPs elicited in trials with angry and neutral faces. These emotional expression effects were then compared across groups with secure, anxious and avoidant attachment orientations. Results revealed significant interactions between attachment orientation and facial expression in mean amplitudes of the early C1 (50-80 ms post-stimulus) and P1 (80-120 ms post-stimulus) ERP components. Significant differences in C1 and P1 mean amplitudes were found at occipital and posterior-parietal channels in response to angry compared with neutral faces only within the avoidant attachment group. No such differences were found within the secure or anxious attachment groups. The present study underscores the usefulness of the ERP methodology, as a sensitive measure for the study of emotional processing biases in the research field of attachment.
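Several of the abstracts above (e.g., item 5) quantify ERP components such as the C1 (50-80 ms) and P1 (80-120 ms) as mean amplitudes within a post-stimulus time window. A minimal sketch of that computation, assuming a hypothetical 1000 Hz sampling rate, a 100 ms pre-stimulus baseline, and a single-channel epoch in microvolts (all assumptions for illustration, not details taken from the study):

```python
import numpy as np

SFREQ = 1000       # sampling rate in Hz (assumption)
BASELINE_MS = 100  # pre-stimulus interval included in the epoch (assumption)

def mean_amplitude(epoch, start_ms, end_ms):
    """Mean voltage of `epoch` between start_ms and end_ms post-stimulus."""
    start = int((BASELINE_MS + start_ms) * SFREQ / 1000)
    end = int((BASELINE_MS + end_ms) * SFREQ / 1000)
    return float(np.mean(epoch[start:end]))

# Toy epoch: zero everywhere except a 2 µV plateau in the P1 window.
epoch = np.zeros(600)        # 100 ms baseline + 500 ms post-stimulus
epoch[180:220] = 2.0         # 80-120 ms post-stimulus

print(mean_amplitude(epoch, 50, 80))    # C1 window → 0.0
print(mean_amplitude(epoch, 80, 120))   # P1 window → 2.0
```

In practice such windows are averaged over many artifact-free trials per condition before group statistics are computed.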

6.
Affective faces are important stimuli with relevance to healthy and abnormal social and affective information processing. The aim of this study was to investigate the effect of brief presentations of affective faces on attention and emotional state across the time course of stimulus processing, as indexed by startle eyeblink response modulation. Healthy adults were presented with happy, neutral, and disgusted male and female faces that were backward masked by neutral faces. Startle responses were elicited at 300, 800, and 3,500 ms following stimulus presentation to probe early and late startle eyeblink modulation, indicative of attention allocation and emotional state, respectively. Results revealed that, at 300 ms, both face expression and face gender modulated startle eyeblink response, suggesting that more attention was allocated to masked happy compared to disgusted female faces, and masked disgusted compared to neutral male faces. There were no effects of either face expression or face gender on startle modulation at 800 ms. At 3,500 ms, target face expression did not modulate startle, but male faces elicited larger startle responses than female faces, indicative of a more negative emotional state. These findings provide a systematic investigation of attention and emotion modulation by brief affective faces across the time course of stimulus processing.

7.
Objective: To examine emotion recognition in heroin abstainers under audiovisually congruent and incongruent conditions, and to characterize how they integrate emotional faces and voices. Methods: Thirty-two male heroin abstainers (abstinence group) and 30 healthy men (control group) performed an audiovisual emotion classification task in which they judged the category of an emotional face (angry or happy) while ignoring a simultaneous emotional voice. Results: The abstinence group responded faster on congruent than on incongruent trials [(670±66) ms vs. (687±77) ms, P<0.001]. The incongruent-minus-congruent reaction-time difference was smaller in the abstinence group than in the control group [(17±21) ms vs. (32±31) ms, P<0.05]. With angry voices, the control group responded faster to angry than to happy faces [(653±94) ms vs. (679±78) ms, P<0.05], whereas the abstinence group showed no statistically significant difference between the two face types (P>0.05). Conclusion: Heroin abstainers show a crossmodal integration facilitation effect for emotion, but their multisensory integration is weaker than that of healthy controls; this impairment may reflect reduced sensitivity to angry voices.

8.
Neuroimaging evidence suggests that dynamic facial expressions elicit greater activity than static face stimuli in brain structures associated with social cognition, interpreted as greater ecological validity. However, a quantitative meta-analysis of brain activity associated with dynamic facial expressions is lacking. The current study investigated, using three fMRI experiments, activity elicited by (a) dynamic and static happy faces, (b) dynamic and static happy and angry faces, and (c) dynamic faces and dynamic flowers. In addition, using activation likelihood estimate (ALE) meta-analysis, we determined areas concordant across published studies that (a) used dynamic faces and (b) specifically compared dynamic and static emotional faces. The middle temporal gyri (Experiment 1) and superior temporal sulci (STS; Experiments 1 and 2) were more active for dynamic than static faces. In contrasts with baseline, the amygdalae were more active for dynamic faces (Experiments 1 and 2) and the fusiform gyri were active for all conditions (all experiments). The ALE meta-analyses revealed concordant activation in all of these regions as well as in areas associated with cognitive manipulations (inferior frontal gyri). Converging data from the experiments and the meta-analyses suggest that dynamic facial stimuli elicit increased activity in regions associated with interpretation of social signals and emotional processing.

9.
A face-specific brain EEG potential at approximately 160 ms after stimulus presentation has recently been described by various research groups. Most of these studies analysed this face-specific brain potential using smiling faces as stimuli. In electrophysiological studies, however, differences in amplitude due to the emotional valence of the stimuli were described as early as 100 ms after stimulus presentation. In order to investigate the effect of facial expressions with different emotional content on face-specific brain EEG potentials, event-related potentials (ERPs) to faces with sad, happy and neutral expressions were compared to ERPs elicited with buildings in 16 healthy subjects. A face-specific potential at vertex approximately 160 ms after stimulus presentation has been verified in the present study. No significant differences in latency or amplitude of this component were found for different facial expressions.

10.
As human faces are important social signals in everyday life, processing of facial affect has recently entered into the focus of neuroscientific research. In the present study, priming of faces showing the same emotional expression was measured with the help of event-related potentials (ERPs) in order to investigate the temporal characteristics of processing facial expressions. Participants classified portraits of unfamiliar persons according to their emotional expression (happy or angry). The portraits were either preceded by the face of a different person expressing the same affect (primed) or the opposite affect (unprimed). ERPs revealed both early and late priming effects, independent of stimulus valence. The early priming effect was characterized by attenuated frontal ERP amplitudes between 100 and 200 ms in response to primed targets. Its dipole sources were localised in the inferior occipitotemporal cortex, possibly related to the detection of expression-specific facial configurations, and in the insular cortex, considered to be involved in affective processes. The late priming effect, an enhancement of the late positive potential (LPP) following unprimed targets, may evidence greater relevance attributed to a change of emotional expressions. Our results (i) point to the view that a change of affect-related facial configuration can be detected very early during face perception and (ii) support previous findings that the amplitude of the late positive potential is related to arousal rather than to the specific valence of an emotional signal.

11.
Music is one of the most powerful elicitors of subjective emotion, yet it is not clear whether emotions elicited by music are similar to emotions elicited by visual stimuli. This leads to an open question: can music-elicited emotion be transferred to and/or influence subsequent vision-elicited emotional processing? Here we addressed this question by investigating processing of emotional faces (neutral, happy and sad) primed by short excerpts of musical stimuli (happy and sad). Our behavioural experiment showed a significant effect of musical priming: prior listening to happy (sad) music enhanced the perceived happiness (sadness) of a face irrespective of facial emotion. Further, this priming-induced effect was largest for neutral faces. Our electrophysiological experiment showed that such crossmodal priming effects were manifested in event-related brain potential components at a very early stage (within 100 ms post-stimulus) of neuronal information processing. Altogether, these results offer new insight into the crossmodal nature of music and its ability to transfer emotion to the visual modality.

12.
Facial cues of racial outgroup or anger mediate fear learning that is resistant to extinction. Whether this resistance is potentiated if fear is conditioned to angry, other race faces has not been established. Two groups of Caucasian participants were conditioned with two happy and two angry face conditional stimuli (CSs). During acquisition, one happy and one angry face were paired with an aversive unconditional stimulus whereas the second happy and angry faces were presented alone. CS face race (Caucasian, African American) was varied between groups. During habituation, electrodermal responses were larger to angry faces regardless of race and declined less to other race faces. Extinction was immediate for Caucasian happy faces, delayed for angry faces regardless of race, and slowest for happy racial outgroup faces. Combining the facial cues of other race and anger does not enhance resistance to extinction of fear.

13.
Introduction. Selective attention to threat‐related information has been associated with clinical delusions in schizophrenia and nonclinical delusional ideation in healthy individuals. However, it is unclear whether biased attention for threat reflects early engagement effects on selective attention, or later difficulties in disengaging attention from perceived threat. The present study examined which of these processes operate in nonclinical delusion‐prone individuals.

Methods. A total of 100 psychologically healthy participants completed the Peters et al. () Delusions Inventory (PDI). Twenty‐two scoring in the upper quartile (high‐PDI group) and 22 scoring in the lower quartile (low‐PDI group) completed a modified dot‐probe task. Participants detected dot‐probes appearing 200, 500, or 1250 ms after an angry‐neutral face pair or a happy‐neutral face pair.

Results. High‐PDI individuals responded faster to dot‐probes presented in the same location as angry compared to happy faces at the short 200 ms stimulus onset asynchrony (SOA), but only when the emotional faces were presented to the left visual field. At the two longer SOAs (500 ms, 1250 ms), the high‐PDI group were also faster to respond to dot‐probes presented in the same location as angry compared to happy faces and slower to respond to dot‐probes presented in different spatial locations to angry (vs. happy) faces. The latter effects were seen whether emotional faces were presented to the left or the right visual field.

Conclusions. Results support the operation of emotion‐selective engagement and defective disengagement for threat‐related facial expressions (i.e., anger) in delusion‐prone individuals.
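The dot-probe effects described above are conventionally summarized as a bias score: mean RT on incongruent trials (probe replacing the neutral face) minus mean RT on congruent trials (probe replacing the angry face), with positive scores indicating vigilance toward threat. A small illustration of that arithmetic with made-up RTs (the values are not taken from the study):

```python
import statistics

def bias_score(rt_incongruent, rt_congruent):
    """Dot-probe attentional bias: positive = faster to probes at threat."""
    return statistics.mean(rt_incongruent) - statistics.mean(rt_congruent)

# Illustrative reaction times in ms (hypothetical, not from the study).
rt_probe_at_angry = [512, 498, 505, 509]    # congruent trials
rt_probe_at_neutral = [531, 540, 525, 528]  # incongruent trials

print(bias_score(rt_probe_at_neutral, rt_probe_at_angry))  # → 25.0
```

A bias score would typically be computed per participant and per SOA, then submitted to a group-level analysis such as the group × SOA comparison the abstract reports.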

14.
Processing and maintenance in working memory involve active attention allocation; consequently, it is possible that a recognition process could interfere with the performance of highly demanding working memory tasks. Event-related brain potentials (ERPs) were recorded while fourteen healthy male adults performed a visual verbal dual working memory task. Three conditions were examined: A) reference (with no preceding stimuli); B) happy, angry or neutral faces presented 250 ms prior to task onset for 30 ms; and, C) visual noise control condition. Behavioral results showed that reaction times were significantly longer in the condition preceded by the presentation of faces, regardless of their emotional content. ERPs showed a predominantly right temporo-occipital negative component at around 170 ms during conditions B and C (higher amplitude in B), which probably reflects the pre-categorical structural encoding of the face. Following task onset, an early positive right temporo-parietal component (P380) appeared during condition B, probably reflecting a delayed reallocation of working memory attentional resources to carry out the task requirements. Afterwards, two other main fronto-parietal components were observed in the three conditions: a positive wave that peaked at around 410 ms, and a subsequent negative component (N585). This latter waveform reached a significantly higher amplitude during the reference condition (A) and was interpreted as mirroring the phonologic-lexical manipulation of the stimuli in working memory. These results suggest that early face detection could induce an attentional decrement that interferes with subsequent performance of a visual verbal working memory task. They also suggest that while face detection and facial emotional content analysis might be parallel processes, they are not onset-synchronized.

15.
Pictures of emotional facial expressions or natural scenes are often used as cues in emotion research. We examined the extent to which these different stimuli engage emotion and attention, and whether the presence of social anxiety symptoms influences responding to facial cues. Sixty participants reporting high or low social anxiety viewed pictures of angry, neutral, and happy faces, as well as violent, neutral, and erotic scenes, while skin conductance and event-related potentials were recorded. Acoustic startle probes were presented throughout picture viewing, and blink magnitude, probe P3 and reaction time to the startle probe also were measured. Results indicated that viewing emotional scenes prompted strong reactions in autonomic, central, and reflex measures, whereas pictures of faces were generally weak elicitors of measurable emotional response. However, higher social anxiety was associated with modest electrodermal changes when viewing angry faces and mild startle potentiation when viewing either angry or smiling faces, compared to neutral. Taken together, pictures of facial expressions do not strongly engage fundamental affective reactions, but these cues appeared to be effective in distinguishing between high and low social anxiety participants, supporting their use in anxiety research.

16.
Social anxiety has been characterized by an attentional bias towards threatening faces. Electrophysiological studies have demonstrated modulations of cognitive processing from 100 ms after stimulus presentation. However, the impact of the stimulus features and task instructions on facial processing remains unclear. Event-related potentials were recorded while high and low socially anxious individuals performed an adapted Stroop paradigm that included a colour-naming task with non-emotional stimuli, an emotion-naming task (the explicit task) and a colour-naming task (the implicit task) on happy, angry and neutral faces. Whereas the impact of task factors was examined by contrasting an explicit and an implicit emotional task, the effects of perceptual changes on facial processing were explored by including upright and inverted faces. The findings showed an enhanced P1 in social anxiety during the three tasks, without a moderating effect of the type of task or stimulus. These results suggest a global modulation of attentional processing in performance situations.

17.
Using a spatial cueing paradigm with emotional and neutral facial expressions as cues, we examined early and late patterns of information processing in cognitive avoidant coping (CAV). Participants were required to detect a target that appeared either in the same location as the cue (valid) or in a different location (invalid). Cue–target onset asynchrony (CTOA) was manipulated to be short (250 ms) or long (750 ms). CAV was associated with early facilitation and faster disengagement from angry faces. No effects were found for happy or neutral faces. After completing the spatial cueing task, participants prepared and delivered a public speech and heart rate variability (HRV) was recorded. Disengagement from angry faces was related to a decrease in HRV in response to this task. Together, these data suggest that CAV is related to early engagement followed by disengagement from threat‐related cues that might impact physiological stress responses.

18.
Hemispheric Asymmetry in Conditioning to Facial Emotional Expressions
In the present experiment, we report a right hemisphere advantage for autonomic conditioning to facial emotional expressions. Specifically, angry, but not happy, facial expressions showed significantly more resistance to extinction when presented initially to the right as compared to the left hemisphere. Slides of happy and angry faces were used as conditioned stimuli (CS+ and CS-) with shock as the unconditioned stimulus (UCS). Half of the subjects (n = 15) had the angry face as CS+ (and the happy face as CS-), the other half had the happy face as CS+ (and the angry face as CS-). During acquisition, the CSs were presented foveally. During extinction, using the Visual Half-Field (VHF) technique, half of the CS+ and CS- trials were randomly presented in the right visual half-field (initially to the left hemisphere), and half of the trials were presented in the left half-field (initially to the right hemisphere). Stimuli were presented for 210 ms during acquisition, and for 30 ms during extinction. Bilateral skin conductance responses (SCRs) were recorded. The results showed effects of acquisition only for the angry CS+ group. During extinction, there was a significant Conditioning X Half-field interaction which was due to greater SCRs to the CS+ angry face when it was presented in the left half-field. It is concluded that the present results reveal hemisphere asymmetry effects in facial emotional conditioning.

19.
This study investigated the temporal course of attentional biases for threat-related (angry) and positive (happy) facial expressions. Electrophysiological (event-related potential) and behavioral (reaction time [RT]) data were recorded while participants viewed pairs of faces (e.g., angry face paired with neutral face) shown for 500 ms and followed by a probe. Behavioral results indicated that RTs were faster to probes replacing emotional versus neutral faces, consistent with an attentional bias for emotional information. Electrophysiological results revealed that attentional orienting to threatening faces emerged earlier (early N2pc time window; 180–250 ms) than orienting to positive faces (after 250 ms), and that attention was sustained toward emotional faces during the 250–500-ms time window (late N2pc and SPCN components). These findings are consistent with models of attention and emotion that posit rapid attentional prioritization of threat.
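The N2pc reported above is derived as a contralateral-minus-ipsilateral difference at posterior electrode pairs (commonly PO7/PO8), averaged over a window such as 180-250 ms. A hedged sketch of that derivation; the electrode pair, the 250 Hz sampling rate, and epochs starting at stimulus onset are all assumptions for illustration:

```python
import numpy as np

SFREQ = 250  # sampling rate in Hz (assumption)

def n2pc(po7_left, po8_left, po7_right, po8_right, t0_ms=180, t1_ms=250):
    """Mean contralateral-minus-ipsilateral amplitude in the N2pc window.

    po7_left/po8_left: averaged waveforms for trials with the emotional
    face on the LEFT; po7_right/po8_right: for the face on the RIGHT.
    Epochs are assumed to start at stimulus onset (no baseline samples).
    """
    contra = (po8_left + po7_right) / 2   # electrode opposite the face
    ipsi = (po7_left + po8_right) / 2     # electrode on the same side
    s0, s1 = int(t0_ms * SFREQ / 1000), int(t1_ms * SFREQ / 1000)
    return float(np.mean(contra[s0:s1] - ipsi[s0:s1]))

# Toy waveforms: a flat -1 µV contralateral negativity, 0 µV ipsilateral.
n = 150  # 600 ms of samples at 250 Hz
po8_left = -1.0 * np.ones(n)
po7_right = -1.0 * np.ones(n)
po7_left = np.zeros(n)
po8_right = np.zeros(n)

print(n2pc(po7_left, po8_left, po7_right, po8_right))  # → -1.0
```

A negative value in this window is the N2pc signature of attention shifting toward the lateralized emotional face.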

20.
The effects of task demands and the interaction between gender and expression in face perception were studied using event-related potentials (ERPs). Participants performed three different tasks with male and female faces that were emotionally inexpressive or that showed happy or angry expressions. In two of the tasks (gender and expression categorization) facial properties were task-relevant, while in a third task (symbol discrimination) facial information was irrelevant. Effects of expression were observed on the visual P100 component under all task conditions, suggesting the operation of an automatic process that is not influenced by task demands. The earliest interaction between expression and gender was observed later, in the face-sensitive N170 component. This component showed differential modulations by specific combinations of gender and expression (e.g., angry male vs. angry female faces). Main effects of expression and task were observed in a later occipito-temporal component peaking around 230 ms post-stimulus onset (EPN, or early posterior negativity): amplitudes were less positive in the presence of angry faces and during performance of the gender and expression tasks. Finally, task demands also modulated a positive component peaking around 400 ms (LPC, or late positive complex) that showed enhanced amplitude for the gender task. The pattern of results obtained here adds new evidence about the sequence of operations involved in face processing and the interaction of facial properties (gender and expression) in response to different task demands.
