Similar Literature
20 similar documents found (search time: 15 ms)
1.
Startle reflex modulation by affective pictures is a well-established effect in human emotion research. However, much less is known about startle modulation by affective faces, despite growing evidence that facial expressions robustly activate emotion-related brain circuits. In this study, acoustic startle probes were administered to 37 young adult participants (20 women) during the viewing of slides from the Pictures of Facial Affect set, including neutral, happy, angry, and fearful faces. The effect of expression valence (happy, neutral, and negative) on startle magnitude was highly significant (p < .001). The startle reflex was strongly potentiated by negative expressions (fearful and angry); however, no attenuation by happy faces was observed. A significant valence by gender interaction suggests stronger startle potentiation effects in females. These results demonstrate that affective facial expressions can produce significant modulation of the startle reflex.

2.
The present study investigated whether social anxiety modulates the processing of facial expressions. Event-related potentials were recorded during an oddball task in young adults reporting high or low levels of social anxiety as evaluated by the Liebowitz Social Anxiety Scale. Repeated pictures of faces with a neutral expression were infrequently replaced by pictures of the same face displaying happiness, anger, fear or disgust. For all participants, response latencies were shorter in detecting faces expressing disgust and happiness as compared to fear or anger. In low social anxiety individuals, angry faces evoked an enhanced P1 compared to other stimuli, while high socially anxious participants displayed an enlarged P1 for all emotional stimuli compared to neutral ones, and generally higher amplitudes than non-anxious individuals. Conversely, the face-specific N170 and the task-related decision P3b were not influenced by social anxiety. These results suggest increased pre-attentive detection of facial cues in socially anxious individuals and are discussed within the framework of recent models of anxiety.

3.
This study investigated the temporal course of attentional biases for threat-related (angry) and positive (happy) facial expressions. Electrophysiological (event-related potential) and behavioral (reaction time [RT]) data were recorded while participants viewed pairs of faces (e.g., angry face paired with neutral face) shown for 500 ms and followed by a probe. Behavioral results indicated that RTs were faster to probes replacing emotional versus neutral faces, consistent with an attentional bias for emotional information. Electrophysiological results revealed that attentional orienting to threatening faces emerged earlier (early N2pc time window; 180–250 ms) than orienting to positive faces (after 250 ms), and that attention was sustained toward emotional faces during the 250–500-ms time window (late N2pc and SPCN components). These findings are consistent with models of attention and emotion that posit rapid attentional prioritization of threat.
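The N2pc analysis described above amounts to averaging amplitude in fixed time windows and taking a contralateral-minus-ipsilateral difference. A minimal sketch, assuming toy single-trace data (the function names, sampling grid, and values are illustrative assumptions, not the study's pipeline):

```python
import numpy as np

def window_mean(erp, times, t_start, t_end):
    """Mean amplitude of a 1-D ERP trace within [t_start, t_end) ms."""
    mask = (times >= t_start) & (times < t_end)
    return float(erp[mask].mean())

def n2pc(contra, ipsi, times, t_start=180, t_end=250):
    """Contralateral-minus-ipsilateral difference amplitude in a window."""
    return window_mean(contra, times, t_start, t_end) - \
           window_mean(ipsi, times, t_start, t_end)

# Toy data: 1-ms resolution from 0 to 499 ms, with a negative deflection
# at contralateral sites confined to the early window.
times = np.arange(500)
contra = np.zeros(500)
contra[180:250] = -1.5
ipsi = np.zeros(500)

early = n2pc(contra, ipsi, times, 180, 250)  # early N2pc window
late = n2pc(contra, ipsi, times, 250, 500)   # late window (sustained attention)
```

In real data the traces would be grand averages over trials and posterior electrode pairs; the window bounds here simply mirror those reported in the abstract.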

4.
This investigation examined the effects of maltreatment during the first year of life on the neural correlates of processing facial expressions of emotion at 30 months of age. Event-related potentials (ERPs) in response to children passively viewing standardized pictures of female models posing angry, happy, and neutral facial expressions were examined. Four ERP waveform components were derived: early negative (N150), early positive (P260), negative central (Nc), and positive slow wave (PSW). Differences in these waveforms between a group of 35 maltreated and 24 nonmaltreated children were reported. The groups did not differ on the early perceptual negative component (N150), whereas the maltreated children had greater P260 amplitude at frontal leads compared to the nonmaltreated children in response to viewing angry facial expressions. For the Nc component, the nonmaltreated comparison children exhibited greater amplitude while viewing pictures of happy faces compared to angry and neutral faces, whereas the maltreated children showed greater Nc amplitude at central sites while viewing angry faces. For the PSW, the nonmaltreated group showed a greater area score in the right hemisphere in response to viewing angry facial expressions compared to the maltreated group. The results are discussed in terms of brain development and function, as well as their implications for the design and evaluation of preventive interventions.

5.
Emotional facial expressions have affective significance. Smiles, for example, are perceived as positive and responded to with increased happiness, whereas angry expressions are perceived as negative and threatening. Yet these perceptions are modulated in part by facial morphological cues related to the sex of the expresser. The present research assessed both eyeblink startle and the postauricular reflex during happy and angry expressions by men and women. To this end, 14 male and 16 female undergraduates viewed happy, neutral, and angry facial expressions as well as positive and negative pictures. The postauricular reflex was potentiated during happy expressions and inhibited during angry expressions; however, as expected, this pattern was more clearly found for female expressers. Conversely, the expected pattern of eyeblink startle potentiation during angry faces and inhibition during happy faces was found only for male expressers.

6.
Facial muscular reactions to avatars' static (neutral, happy, angry) and dynamic (morphs developing from neutral to happy or angry) facial expressions, presented for 1 s each, were investigated in 48 participants. Dynamic expressions led to better recognition rates and higher intensity and realism ratings. Angry expressions were rated as more intense than happy expressions. EMG recordings indicated emotion-specific reactions to happy avatars as reflected in increased M. zygomaticus major and decreased M. corrugator supercilii tension, with stronger reactions to dynamic as compared to static expressions. Although rated as more intense, angry expressions elicited no significant M. corrugator supercilii activation. We conclude that facial reactions to angry and to happy facial expressions hold different functions in social interactions. Further research should vary dynamics in different ways and also include additional emotional expressions.
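Facial EMG reactions of the kind reported above are commonly scored as mean rectified activity during the stimulus minus a pre-stimulus baseline, per channel (zygomaticus, corrugator). A minimal sketch with fabricated toy values; the function name, window lengths, and sampling rate are assumptions:

```python
import numpy as np

def emg_change(signal, sr, baseline_s=1.0, stim_s=1.0):
    """Baseline-corrected mean rectified EMG for one trial.

    signal: 1-D array covering baseline_s seconds of pre-stimulus activity
    followed by stim_s seconds of stimulus period, sampled at sr Hz.
    """
    n_base = int(baseline_s * sr)
    n_stim = int(stim_s * sr)
    rectified = np.abs(signal)
    baseline = rectified[:n_base].mean()
    stimulus = rectified[n_base:n_base + n_stim].mean()
    return float(stimulus - baseline)

sr = 1000  # Hz
# Toy trial: 2.0 µV baseline activity, rising to 5.0 µV during the stimulus.
trial = np.concatenate([np.full(sr, 2.0), np.full(sr, 5.0)])
delta = emg_change(trial, sr)  # positive for zygomaticus-style activation
```

A negative `delta` on the corrugator channel would correspond to the relaxation reported for happy avatars.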

7.
BACKGROUND: Emotional Stroop tasks have shown attention biases of clinical populations towards stimuli related to their condition. Asperger Syndrome (AS) is a neuropsychiatric condition with social and communication deficits, repetitive behaviours and narrow interests. Social deficits are particularly striking, including difficulties in understanding others. METHOD: We investigated colour-naming latencies of adults with and without AS to name colours of pictures containing angry facial expressions, neutral expressions or non-social objects. We tested three hypotheses: whether (1) controls show longer colour-naming latencies for angry versus neutral facial expressions with male actors, (2) people with AS show differential latencies across picture types, and (3) differential response latencies persist when photographs contain females. RESULTS: Controls had longer latencies to pictures of male faces with angry compared to neutral expressions. The AS group did not show longer latencies to angry versus neutral expressions in male faces, instead showing slower latencies to pictures containing any facial expression compared to objects. When pictures contained females, controls no longer showed longer latencies for angry versus neutral expressions. However, the AS group still showed longer latencies to all facial picture types, compared to objects, providing further evidence that faces produce interference effects for this clinical group. CONCLUSIONS: The pictorial emotional Stroop paradigm reveals normal attention biases towards threatening emotional faces. The AS group showed Stroop interference effects to all facial stimuli regardless of expression or sex, suggesting that faces cause disproportionate interference in AS.

8.
In the present study, the startle reflex was examined with respect to the degree of anger displayed in facial expressions. To this end, 52 participants viewed faces that were morphed to display 0, 20, 40, 60, 80, or 100% anger. As the percentage of anger in faces increased from 0 to 100%, faces were perceived as increasingly angry; however, relative to neutral facial expressions, startle amplitude was only potentiated to maximally angry faces. These data imply a non‐linear relationship between the intensity of angry faces and defensive physiological activity. This pattern of startle modulation suggests a categorical distinction between threatening (100% anger) and other facial expressions presented. These results are further discussed in terms of existing data, and how this paradigm might be utilized in psychopathology research.
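The categorical-versus-linear question above reduces to comparing mean startle magnitude across morph levels. A hedged illustration with fabricated toy numbers (not the study's data), showing the flat-then-jump pattern the abstract describes:

```python
import numpy as np

def mean_by_level(levels, magnitudes):
    """Mean startle blink magnitude for each unique anger-morph level."""
    levels = np.asarray(levels)
    magnitudes = np.asarray(magnitudes, dtype=float)
    return {int(l): float(magnitudes[levels == l].mean())
            for l in np.unique(levels)}

# Toy pattern mimicking a categorical effect: roughly flat from 0-80% anger,
# with potentiation appearing only at the 100% morph.
levels = [0, 0, 20, 20, 40, 40, 60, 60, 80, 80, 100, 100]
mags   = [50, 52, 51, 49, 50, 50, 52, 48, 49, 51, 60, 62]

means = mean_by_level(levels, mags)
potentiation = means[100] - means[0]  # potentiation relative to 0% (neutral)
```

Under a linear intensity account the per-level means would climb steadily; a jump confined to 100% anger is what motivates the categorical interpretation.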

9.
Eye movements were monitored during picture viewing, and the effects of hedonic content, perceptual composition, and repetition on scanning were assessed. In Experiment 1, emotional and neutral pictures that were figure-ground compositions or more complex scenes were presented for a 6-s free viewing period. Viewing emotional pictures or complex scenes prompted more fixations and broader scanning of the visual array, compared to neutral pictures or simple figure-ground compositions. Effects of emotion and composition were independent, supporting the hypothesis that these oculomotor indices reflect enhanced information seeking. Experiment 2 tested an orienting hypothesis by repeatedly presenting the same pictures. Although repetition altered specific scan patterns, emotional, compared to neutral, picture viewing continued to prompt oculomotor differences, suggesting that motivationally relevant cues enhance information seeking in appetitive and defensive contexts.
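The two oculomotor indices above, fixation count and breadth of scanning, can be computed directly from fixation coordinates. A sketch under assumptions (the coordinates are invented, and "breadth" is taken here as mean distance of fixations from their centroid, one common operationalization):

```python
import numpy as np

def scanning_breadth(fixations):
    """Mean Euclidean distance of fixation points (x, y) from their centroid."""
    pts = np.asarray(fixations, dtype=float)
    centroid = pts.mean(axis=0)
    return float(np.linalg.norm(pts - centroid, axis=1).mean())

# Toy data: four fixations at the corners of a 200 x 200 px region.
fix = [(0, 0), (200, 0), (0, 200), (200, 200)]
n_fixations = len(fix)
breadth = scanning_breadth(fix)  # distance from centre (100, 100) to a corner
```

Broader scanning of emotional or complex scenes would show up as both a larger `n_fixations` and a larger `breadth` relative to neutral, figure-ground pictures.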

10.
Previous studies, mainly with Caucasian samples, have shown that facial expressions of emotion are contagious, a phenomenon known as facial mimicry. This study examined facial mimicry using a Japanese sample. Participants were shown a series of Japanese faces (from Matsumoto and Ekman, 1988) on a computer screen expressing "happiness", "sadness", "anger", or "disgust". While viewing the facial expressions, electromyograms (EMG) of the participants' faces were recorded to see whether their own facial muscles corresponding to the stimulus faces were activated. Consistent with the previous studies using Caucasian samples, all four facial expressions were mimicked. The peak time of mimicry of angry or happy faces occurred later, while that of disgusted faces occurred relatively earlier. The potential relation of facial mimicry to "emotional contagion", a social phenomenon whereby subjective feelings transfer between people, is discussed.

11.
When regulating negative emotional reactions, one goal is to reduce physiological reactions. However, not all regulation strategies succeed in doing that. We tested whether heart rate biofeedback helped participants reduce physiological reactions in response to negative and neutral pictures. When viewing neutral pictures, participants could regulate their heart rate whether the heart rate feedback was real or not. In contrast, when viewing negative pictures, participants could regulate heart rate only when feedback was real. Ratings of task success paralleled heart rate. Participants' general level of anxiety, emotion awareness, or cognitive emotion regulation strategies did not influence the results. Our findings show that accurate online heart rate biofeedback provides an efficient way to down-regulate autonomic physiological reactions when encountering negative stimuli.

12.
Previous research indicates that predictive cues can dampen subsequent defensive reactions. The present study investigated whether effects of cuing are specific to aversive stimuli, using modulation of the blink startle reflex as a measure of emotional reactivity. Participants viewed pictures depicting violence, romance/erotica, or mundane content. On half of all trials, a cue (color) predicted the content of the upcoming picture; on the remaining trials, scenes were presented without a cue. Acoustic startle probes were presented during picture viewing on trials with predictive cues and trials without a cue. Replicating previous studies, blink reflexes elicited when viewing violent pictures that had not been preceded by a cue were potentiated compared to uncued mundane scenes, and reflexes were attenuated when viewing scenes of erotica/romance that had not been cued. On the other hand, reflex potentiation when viewing scenes of violence (relative to mundane scenes) was eliminated when these pictures were preceded by a predictive cue, whereas scenes of romance prompted reliable reflex attenuation regardless of whether pictures were cued or not. Taken together, the data suggest that cuing elicits an anticipatory coping process that is specific to aversive stimuli.

13.
Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and an explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERPs). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components, and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source in the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as indicated by the extrastriate source activity.

14.
Developing measures of socioaffective processing is important for understanding the mechanisms underlying emotional-interpersonal traits relevant to health, such as hostility. In this study, cigarette smokers with low (LH; n = 49) and high (HH; n = 43) trait hostility completed the Emotional Interference Gender Identification Task (EIGIT), a newly developed behavioral measure of socioaffective processing biases toward facial affect. The EIGIT involves quickly categorizing the gender of facial pictures that vary in affective valence (angry, happy, neutral, sad). Results showed that participants were slower and less accurate in categorizing the gender of angry faces in comparison to happy, neutral, and sad faces (which did not differ), signifying interference indicative of a socioaffective processing bias toward angry faces. Compared to LH individuals, HH participants exhibited diminished biases toward angry faces on error-based (but not speed-based) measures of emotional interference, suggesting impaired socioaffective processing. The EIGIT may be useful for future research on the role of socioaffective processing in traits linked with poor health.

15.
Male and female participants (n=19) high or low in speech fear viewed pictures of faces posed in anger, neutral, and joyful expressions for 8 s each. Zygomaticus major and corrugator supercilii EMG, skin conductance, and heart rate were measured during picture viewing, and subjective ratings were made after each picture. Compared to anger expressions, joy expressions were responded to with greater zygomatic EMG, less corrugator EMG, and greater heart rate and skin conductance. Physiological response to neutral expressions was similar to response to anger expressions. Expressions posed by women were responded to physiologically more negatively than expressions posed by men. More fearful participants exhibited more negative and less positive facial expressions, and showed skin conductance responses suggesting greater attention when viewing negative expressions. Results suggest that reactions to facial expressions are influenced by social context, and are not simple mimicry.

16.
Background Empirical evidence suggests substantial deficits regarding emotion recognition in bulimia nervosa (BN). The aim of the current study was to investigate electrophysiologic evidence for deficits in emotional face processing in patients with BN. Methods Event-related potentials were recorded from 13 women with BN and 13 matched healthy controls while viewing neutral, happy, fearful, and angry facial expressions. Participants' recognition performance for emotional faces was tested in a subsequent categorization task. In addition, the degree of alexithymia, depression, and anxiety were assessed via questionnaires. Results Categorization of emotional faces was hampered in BN (p = .01). Amplitudes of event-related potentials differed during emotional face processing: face-specific N170 amplitudes were less pronounced for angry faces in patients with BN (mean [M] [standard deviation {SD}] = 1.46 [0.56] μV versus M [SD] = -1.23 [0.61] μV, p = .02). In contrast, P3 amplitudes were more pronounced in patients with BN as compared with controls (M [SD] = 2.64 [0.46] μV versus M [SD] = 1.25 [0.39] μV, p = .04), independent of emotional expression. Conclusions The study provides novel electrophysiologic data showing that emotional faces are processed differently in patients with BN as compared with healthy controls. We suggest that deficits in early automatic emotion classification in BN are followed by an increased allocation of attentional resources to compensate for those deficits. These findings might contribute to a better understanding of the impaired social functioning in BN.

17.
Previous studies have shown that recognition of facial expressions is influenced by the affective information provided by the surrounding scene. The goal of this study was to investigate whether similar effects could be obtained for bodily expressions. Images of emotional body postures were briefly presented as part of social scenes showing either neutral or emotional group actions. In Experiment 1, fearful and happy bodies were presented in fearful, happy, neutral and scrambled contexts. In Experiment 2, we compared happy with angry body expressions. In Experiments 3 and 4, we blurred the facial expressions of all people in the scene. This way, we were able to ascribe possible scene effects to the presence of body expressions visible in the scene, and we were able to measure the contribution of facial expressions to body expression recognition. In all experiments, we observed an effect of social scene context. Bodily expressions were better recognized when the actions in the scenes expressed an emotion congruent with the bodily expression of the target figure. The specific influence of facial expressions in the scene was dependent on the emotional expression but did not necessarily increase the congruency effect. Taken together, the results show that the social context influences our recognition of a person's bodily expression.

18.
Neuroscience research indicates that individual differences in anxiety may be attributable to a neural system for threat-processing, involving the amygdala, which modulates attentional vigilance, and which is more sensitive to fearful than angry faces. Complementary cognitive studies indicate that high-anxious individuals show enhanced visuospatial orienting towards angry faces, but it is unclear whether fearful faces elicit a similar attentional bias. This study compared biases in initial orienting of gaze to fearful and angry faces, which varied in emotional intensity, in high- and low-anxious individuals. Gaze was monitored whilst participants viewed a series of face-pairs. Results showed that fearful and angry faces elicited similar attentional biases. High-anxious individuals were more likely to direct gaze at intense negative facial expressions than low-anxious individuals, whereas the groups did not differ in orienting to mild negative expressions. Implications of the findings for research into the neural and cognitive bases of emotion processing are discussed.

19.
Although neutral faces do not initially convey an explicit emotional message, it has been found that individuals tend to assign them an affective content. Moreover, previous research has shown that affective judgments are mediated by the task participants have to perform. Using functional magnetic resonance imaging in 21 healthy participants, we focus this study on the cerebral activity patterns triggered by neutral and emotional faces in two different tasks (social or gender judgments). Results obtained using conjunction analyses indicated that viewing both emotional and neutral faces evokes activity in several similar brain areas, indicating a common neural substrate. Moreover, neutral faces specifically elicit activation of the cerebellum and frontal and temporal areas, while emotional faces involve the cuneus, anterior cingulate gyrus, medial orbitofrontal cortex, posterior superior temporal gyrus, precentral/postcentral gyrus and insula. The task selected was also found to influence brain activity, in that the social task recruited frontal areas while the gender task involved the posterior cingulate, inferior parietal lobule and middle temporal gyrus to a greater extent. Specifically, in the social task viewing neutral faces was associated with longer reaction times and increased activity of the left dorsolateral frontal cortex compared with viewing facial expressions of emotions. In contrast, in the same task emotional expressions distinctively activated the left amygdala. The results are discussed taking into consideration the fact that, like other facial expressions, neutral expressions are usually assigned some emotional significance. However, neutral faces evoke a greater activation of circuits probably involved in more elaborate cognitive processing.

20.
Using a spatial cueing paradigm with emotional and neutral facial expressions as cues, we examined early and late patterns of information processing in cognitive avoidant coping (CAV). Participants were required to detect a target that appeared either in the same location as the cue (valid) or in a different location (invalid). Cue–target onset asynchrony (CTOA) was manipulated to be short (250 ms) or long (750 ms). CAV was associated with early facilitation and faster disengagement from angry faces. No effects were found for happy or neutral faces. After completing the spatial cueing task, participants prepared and delivered a public speech and heart rate variability (HRV) was recorded. Disengagement from angry faces was related to a decrease in HRV in response to this task. Together, these data suggest that CAV is related to early engagement followed by disengagement from threat‐related cues that might impact physiological stress responses.
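In a spatial cueing design like the one above, the quantity of interest is the validity effect, mean RT on invalid trials minus mean RT on valid trials, computed separately per CTOA. A minimal sketch with toy RTs (all values and names are illustrative assumptions):

```python
import numpy as np

def validity_effect(trials):
    """trials: list of (ctoa_ms, valid: bool, rt_ms).

    Returns {ctoa: invalid-minus-valid mean RT}. Positive values indicate
    facilitation at the cued location; negative values at a long CTOA are
    often read as disengagement or inhibition of return.
    """
    out = {}
    for ctoa in sorted({t[0] for t in trials}):
        valid = [rt for c, v, rt in trials if c == ctoa and v]
        invalid = [rt for c, v, rt in trials if c == ctoa and not v]
        out[ctoa] = float(np.mean(invalid) - np.mean(valid))
    return out

trials = [
    (250, True, 300), (250, True, 310), (250, False, 340), (250, False, 350),
    (750, True, 330), (750, True, 330), (750, False, 320), (750, False, 310),
]
effects = validity_effect(trials)
```

The toy pattern (a positive effect at 250 ms, a negative one at 750 ms) mirrors the early-facilitation-then-disengagement profile the abstract attributes to CAV for angry faces.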


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.) · 京ICP备09084417号