Similar Documents
20 similar documents found (search time: 15 ms)
1.
Anxious individuals have been shown to interpret others' emotional states negatively. Since most studies have used facial expressions as emotional cues, we examined whether trait anxiety affects the recognition of emotion in a dynamic face and voice that were presented in synchrony. The face and voice cues conveyed either matched (e.g., happy face and voice) or mismatched emotions (e.g., happy face and angry voice). Participants with high or low trait anxiety were asked to indicate the perceived emotion using one of the cues while ignoring the other. The results showed that individuals with high trait anxiety were more likely to interpret others' emotions in a negative manner, putting more weight on the to-be-ignored angry cues. This interpretation bias was found regardless of the cue modality (i.e., face or voice). Since trait anxiety did not affect recognition of the face or voice cues presented in isolation, this interpretation bias appears to reflect an altered integration of the face and voice cues among anxious individuals.

2.
The amygdala is purported to play an important role in face processing, yet the specificity of its activation to face stimuli and the relative contribution of identity and expression to its activation are unknown. In the current study, neural activity in the amygdala was recorded as monkeys passively viewed images of monkey faces, human faces, and objects on a computer monitor. Comparable proportions of neurons responded selectively to images from each category. Neural responses to monkey faces were further examined to determine whether face identity or facial expression drove the face-selective responses. The majority of these neurons (64%) responded both to identity and facial expression, suggesting that these parameters are processed jointly in the amygdala. Large fractions of neurons, however, showed pure identity-selective or expression-selective responses. Neurons were selective for a particular facial expression by either increasing or decreasing their firing rate compared with the firing rates elicited by the other expressions. Responses to appeasing faces were often marked by significant decreases of firing rates, whereas responses to threatening faces were strongly associated with increased firing rates. Thus, global activation in the amygdala might be larger for threatening faces than for neutral or appeasing faces.

3.
Objective: To explore the characteristics of facial emotion recognition in children with autism. Methods: Children aged 18–36 months diagnosed with autism at the Child Development and Behavior Center of the Third Affiliated Hospital of Sun Yat-sen University between March 2007 and September 2008 formed the autism group; typically developing children matched to the autism group for age and sex, who underwent routine health examinations during the same period, served as the control group. The children passively viewed images of five basic facial expressions (happy, sad, surprised, angry, and fearful) displayed on a computer screen, and the two groups were compared on visual attention to each expression and on the children's own emotional reactions. Results: Forty-five children were enrolled in each group during the study period. The duration of first fixation on the expressions showed no significant group effect but a significant expression effect: both groups fixated longer on happy and angry expressions than on fearful ones. However, the autism group looked back at the expression images significantly less often than the control group, and their total fixation time was also significantly shorter. In the control group, the children's own emotional-reaction scores differed significantly across expressions: positive-emotion scores were significantly higher for happy expressions than for the others; negative-emotion scores for happy expressions were significantly lower than for angry and fearful expressions; and negative-emotion scores for sad and surprised expressions were also significantly lower than for fearful expressions. In the autism group, emotional-reaction scores did not differ significantly across expressions. Conclusion: Young children with autism not only show reduced visual attention to facial emotional expressions but also have deficits in perceiving them, with particular difficulty understanding negative emotional expressions.

4.
When perceiving emotional facial expressions there is an automatic tendency to react with a matching facial expression. A classic explanation of this phenomenon, termed the matched motor hypothesis, highlights the importance of topographic matching, that is, the correspondence in body parts, between perceived and produced actions. More recent studies using mimicry paradigms have challenged this classic account, producing ample evidence against the matched motor hypothesis. However, research using stimulus-response compatibility (SRC) paradigms usually assumed the effect relies on topographic matching. While mimicry and SRC share some characteristics, critical differences between the paradigms suggest conclusions cannot be simply transferred from one to another. Thus, our aim in the present study was to directly test the matched motor hypothesis using SRC. Specifically, we investigated whether observing emotional body postures or hearing emotional vocalizations produces a tendency to respond with one's face, despite completely different motor actions being involved. In three SRC experiments, participants were required to either smile or frown in response to a color cue, presented concurrently with stimuli of happy and angry facial (experiment 1), body (experiment 2), or vocal (experiment 3) expressions. Reaction times were measured using facial EMG. Whether presenting facial, body, or vocal expressions, we found faster responses in compatible, compared to incompatible trials. These results demonstrate that the SRC effect of emotional expressions does not require topographic matching. Our findings question interpretations of previous research and suggest further examination of the matched motor hypothesis.

5.
Several lines of evidence suggest that emotional responses to facial expressions of emotion have a biological basis. The present study involved 4 experiments where pictures of angry, happy or neutral facial expressions were used as conditioned stimuli in aversive Pavlovian electrodermal conditioning. From an evolutionary perspective it was expected that an angry face should have an excitatory effect on aversively conditioned responses, whereas a happy face should have an inhibitory effect. It was also expected that the effect should be specific for the stimulus person showing the display. The data showed that the stimulus person was a critical mediating factor for obtaining persistent conditioning effects, that is to say, responses which showed resistance to extinction. Persistent responding was primarily manifested when the stimulus person displayed anger during extinction. On the other hand, this effect was inhibited when the person displayed a happy face during extinction. Furthermore, resistance to extinction was increased or decreased dependent on whether the person expressed anger or happiness during acquisition. Thus, consistent with predictions, angry and happy faces exhibited an excitatory and inhibitory effect, respectively, and these effects were mediated by the stimulus person.

6.
P3b reflects maltreated children's reactions to facial displays of emotion
Processing of emotion information by maltreated and control children was assessed with event-related brain potentials (ERPs). Maltreated children, for whom negative facial displays may be especially salient, and demographically comparable peers were tested to increase knowledge of differential processing of emotion information. ERPs were measured while children responded to pictures depicting facial displays of anger, fear, and happiness. Maltreated children showed larger P3b amplitude when angry faces appeared as targets than did control children; the two groups did not differ when targets were either happy or fearful facial expressions or for nontargets of any emotional content. These results indicate that aberrant emotional experiences associated with maltreatment may alter the allocation of attention and sensitivity that children develop to process specific emotion information.

7.
We report two functional magnetic resonance imaging experiments showing enhanced responses in human middle superior temporal sulcus for angry relative to neutral prosody. This emotional enhancement was voice specific, unrelated to isolated acoustic amplitude or frequency cues in angry prosody, and distinct from any concomitant task-related attentional modulation. Attention and emotion seem to have separate effects on stimulus processing, reflecting a fundamental principle of human brain organization shared by voice and face perception.

8.
Emotion understanding in postinstitutionalized Eastern European children
To examine the effects of early emotional neglect on children's affective development, we assessed children who had experienced institutionalized care prior to adoption into family environments. One task required children to identify photographs of facial expressions of emotion. A second task required children to match facial expressions to an emotional situation. Internationally adopted, postinstitutionalized children had difficulty identifying facial expressions of emotion. In addition, postinstitutionalized children had significant difficulty matching appropriate facial expressions to happy, sad, and fearful scenarios. However, postinstitutionalized children performed as well as comparison children when asked to identify and match angry facial expressions. These results are discussed in terms of the importance of emotional input early in life on later developmental organization.

9.
Human neuropsychological studies suggest that the amygdala is implicated in social cognition, in which cognition of seen gaze direction, especially direct gaze, is essential, and that the perception of gaze direction is modulated by the head orientation of the facial stimuli. However, the neural correlates of these processes remain unknown. In the present study, neuronal activity was recorded from the macaque monkey amygdala during performance of a sequential delayed non-matching-to-sample task based on gaze direction. The facial stimuli consisted of two head orientations (frontal: facing the monkey directly; profile: 30 degrees rightward from frontal) with different gaze directions (directed toward the monkey, or averted to its left or right). Of the 1091 neurons recorded, 61 responded to more than one facial stimulus. Of these face-responsive neurons, 44 displayed responses selective to the facial stimuli (face neurons). Most amygdalar face neurons discriminated both gaze direction and head orientation, and exhibited a significant interaction between the two types of information. Furthermore, factor analysis on the response magnitudes of the face neurons to the facial stimuli revealed that two factors derived from these facial stimuli were correlated with the two head orientations. The overall responses of the face neurons to direct gazes in the profile and frontal faces were significantly larger than those to averted gazes. The results suggest that information about both gaze and head direction is integrated in the amygdala, and that the amygdala is implicated in the detection of direct gaze.

10.
Individuals with social phobia display neural hyperactivation towards angry facial expressions. However, it is uncertain whether they also show abnormal brain responses when processing angry voices. In an event-related functional magnetic resonance imaging study, we investigated brain responses to neutral and angry voices in 12 healthy control participants and 12 individuals with social phobia when emotional prosody was either task-relevant or task-irrelevant. Regardless of task, both phobic and non-phobic participants recruited a network comprising frontotemporal regions, the amygdala, the insula, and the striatum, when listening to angry compared to neutral prosody. Across participants, increased activation in orbitofrontal cortex during task-relevant as compared to task-irrelevant emotional prosody processing was found. Compared to healthy controls, individuals with social phobia displayed significantly stronger orbitofrontal activation in response to angry versus neutral voices under both task conditions. These results suggest a disorder-associated increased involvement of the orbitofrontal cortex in response to threatening voices in social phobia.

11.
The present study was designed to evaluate whether aversively conditioned responses to facial stimuli are detectable in all three components of the emotional response system, i.e. the expressive/behavioral, the physiological/autonomic and the cognitive/experienced component of emotion. Two groups of subjects were conditioned to angry or happy facial expression stimuli using a 100 dB noise as UCS in a differential aversive conditioning paradigm. The three components of the emotional response system were measured as: facial-EMG reactions (corrugator and zygomatic muscle regions); autonomic activity (skin conductance, SCR; SCR half recovery time, T/2; heart rate, HR); and ratings of experienced emotion. Responses in all components of the emotional response system were detectable in the angry group as greater EMG and autonomic resistance to extinction and greater self-reported fear. More specifically, the angry group showed a resistant conditioning effect for the facial-EMG corrugator muscle that was accompanied by resistant conditioning for SCR frequency, slower SCR recovery, resistant conditioning in HR, and higher self-reported fear than the happy group. Thus, aversive conditioning to angry facial stimuli induces a uniform negative emotional response pattern as indicated by all three components of the emotional response system. These data suggest that a negative 'affect program' triggers responses in the different emotional components. The results suggest that human subjects are biologically prepared to react with a negative emotion to angry facial stimuli.

12.
An extensive literature documents the infant's ability to recognize and discriminate a variety of facial expressions of emotion. However, little is known about the neural bases of this ability. To examine the neural processes that may underlie infants' responses to facial expressions, we recorded event-related potentials (ERPs) while 7-month-olds watched pictures of a happy face and a fearful face (Experiment 1) or an angry face and a fearful face (Experiment 2). In both experiments an early positive component, a middle-latency negative component and a later positive component were elicited. However, only when the infants saw the happy and fearful faces did the components differ for the two expressions. These results are discussed in the context of the neurobiological processes involved in perceiving facial expressions. © 1996 John Wiley & Sons, Inc.

13.
This study investigated the temporal course of attentional biases for threat-related (angry) and positive (happy) facial expressions. Electrophysiological (event-related potential) and behavioral (reaction time [RT]) data were recorded while participants viewed pairs of faces (e.g., angry face paired with neutral face) shown for 500 ms and followed by a probe. Behavioral results indicated that RTs were faster to probes replacing emotional versus neutral faces, consistent with an attentional bias for emotional information. Electrophysiological results revealed that attentional orienting to threatening faces emerged earlier (early N2pc time window; 180–250 ms) than orienting to positive faces (after 250 ms), and that attention was sustained toward emotional faces during the 250–500-ms time window (late N2pc and SPCN components). These findings are consistent with models of attention and emotion that posit rapid attentional prioritization of threat.
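The N2pc analysis summarized above is conventionally computed as a contralateral-minus-ipsilateral difference in trial-averaged voltage at a posterior electrode pair, averaged within fixed time windows. The sketch below illustrates that computation with NumPy; the electrode pairing, sampling rate, and simulated deflection are illustrative assumptions, not details taken from the study:

```python
import numpy as np

def n2pc_amplitude(contra, ipsi, times, window):
    """Mean contralateral-minus-ipsilateral amplitude (in µV) within a time window.

    contra, ipsi: 1-D arrays of trial-averaged ERP voltage at a posterior
    electrode pair (commonly PO7/PO8), one sample per time point.
    times: 1-D array of time points in ms, same length as the ERP arrays.
    window: (start_ms, end_ms), inclusive analysis window.
    """
    mask = (times >= window[0]) & (times <= window[1])
    return float(np.mean(contra[mask] - ipsi[mask]))

# Illustrative data: a 500-ms epoch sampled at 1 kHz, with a simulated
# negative-going contralateral deflection beginning at 180 ms.
times = np.arange(0, 500)          # ms
contra = np.zeros(500)
contra[180:] = -1.5                # µV, fake N2pc-like negativity
ipsi = np.zeros(500)

early = n2pc_amplitude(contra, ipsi, times, (180, 250))  # early N2pc window
late = n2pc_amplitude(contra, ipsi, times, (250, 500))   # late N2pc/SPCN window
print(early, late)  # both negative: contralateral negativity in each window
```

A negative value in the early window would correspond to the rapid orienting effect described above; a sustained negative value in the late window would correspond to the late N2pc/SPCN pattern.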

14.
Neuroscience research indicates that individual differences in anxiety may be attributable to a neural system for threat-processing, involving the amygdala, which modulates attentional vigilance, and which is more sensitive to fearful than angry faces. Complementary cognitive studies indicate that high-anxious individuals show enhanced visuospatial orienting towards angry faces, but it is unclear whether fearful faces elicit a similar attentional bias. This study compared biases in initial orienting of gaze to fearful and angry faces, which varied in emotional intensity, in high- and low-anxious individuals. Gaze was monitored whilst participants viewed a series of face-pairs. Results showed that fearful and angry faces elicited similar attentional biases. High-anxious individuals were more likely to direct gaze at intense negative facial expressions, than low-anxious individuals, whereas the groups did not differ in orienting to mild negative expressions. Implications of the findings for research into the neural and cognitive bases of emotion processing are discussed.

15.
A considerable body of research has focused on neural responses evoked by emotional facial expressions, but little is known about mother-specific brain responses to infant facial emotions. We used near-infrared spectroscopy to investigate prefrontal activity while 14 mothers and 14 age-matched women who had never been pregnant (non-mothers) discriminated happy, angry, sad, fearful, surprised, and neutral facial expressions of unfamiliar infants and unfamiliar adults. Our results revealed that discriminating infant facial emotions increased the relative oxyHb concentration in mothers' right prefrontal cortex but not in their left prefrontal cortex, compared with each side of the prefrontal cortices of non-mothers. However, there was no difference between mothers and non-mothers in right or left prefrontal cortex activation while viewing adult facial expressions. These results suggest that the right prefrontal cortex is involved in human maternal behavior concerning infant facial emotion discrimination.

16.
Research investigating the early development of emotional processing has focused mainly on infants' perception of static facial emotional expressions, likely restricting the amount and type of information available to infants. In particular, the question of whether dynamic information in emotional facial expressions modulates infants' neural responses has rarely been investigated. The present study aimed to fill this gap by recording 7-month-olds' event-related potentials to static (Study 1) and dynamic (Study 2) happy, angry, and neutral faces. In Study 1, happy faces evoked a faster right-lateralized negative central (Nc) component compared to angry faces. In Study 2, both happy and angry faces elicited a larger right-lateralized Nc compared to neutral faces. Irrespective of stimulus dynamicity, a larger P400 to angry faces was associated with higher scores on the Negative Affect temperamental dimension. Overall, results suggest that 7-month-olds are sensitive to facial dynamics, which might play a role in shaping the neural processing of facial emotional expressions. Results also suggest that the amount of attentional resources infants allocate to angry expressions is associated with their temperamental traits. These findings represent a promising avenue for future studies exploring the neurobiological processes involved in perceiving emotional expressions using dynamic stimuli.

17.
Humans' experience of emotion and comprehension of affective cues varies substantially across the lifespan. Work in cognitive and affective neuroscience has begun to characterize behavioral and neural responses to emotional cues that systematically change with age. This review examines work to date characterizing the maturation of facial expression comprehension, and dynamic changes in amygdala recruitment from early childhood through late adulthood while viewing facial expressions of emotion. Recent neuroimaging work has tested amygdala and prefrontal engagement in experimental paradigms mimicking real aspects of social interactions, which we highlight briefly, along with considerations for future research.

18.
Hemispheric Asymmetry in Conditioning to Facial Emotional Expressions
In the present experiment, we report a right hemisphere advantage for autonomic conditioning to facial emotional expressions. Specifically, angry, but not happy, facial expressions showed significantly more resistance to extinction when presented initially to the right as compared to the left hemisphere. Slides of happy and angry faces were used as conditioned stimuli (CS+ and CS-) with shock as the unconditioned stimulus (UCS). Half of the subjects (n = 15) had the angry face as CS+ (and the happy face as CS-), the other half had the happy face as CS+ (and the angry face as CS-). During acquisition, the CSs were presented foveally. During extinction, using the Visual Half-Field (VHF) technique, half of the CS+ and CS- trials were randomly presented in the right visual half-field (initially to the left hemisphere), and half of the trials were presented in the left half-field (initially to the right hemisphere). Stimuli were presented for 210 ms during acquisition, and for 30 ms during extinction. Bilateral skin conductance responses (SCRs) were recorded. The results showed effects of acquisition only for the angry CS+ group. During extinction, there was a significant Conditioning X Half-field interaction which was due to greater SCRs to the CS+ angry face when it was presented in the left half-field. It is concluded that the present results reveal hemisphere asymmetry effects in facial emotional conditioning.

19.
The anterior superior temporal sulcus (STS) of macaque monkeys is thought to be involved in the analysis of incoming perceptual information for face recognition or identification; face neurons in the anterior STS show tuning to facial views and/or gaze direction in the faces of others. Although it is well known that both the anatomical architecture and the connectivity differ between the rostral and caudal regions of the anterior STS, the functional heterogeneity of these regions is not well understood. We recorded the activity of face neurons in the anterior STS of macaque monkeys during the performance of a face identification task, and we compared the characteristics of face neuron responses in the caudal and rostral regions of the anterior STS. In the caudal region, facial views that elicited optimal responses were distributed among all views tested; the majority of face neurons responded symmetrically to right and left views. In contrast, the face neurons in the rostral region responded optimally to a single oblique view; right-left symmetry among the responses of these neurons was less evident. Modulation of the face neuron responses according to gaze direction was more evident in the rostral region. Some of the face neuron responses were specific to a certain combination of a particular facial view and a particular gaze direction, whereas others were associated with the relative spatial relationship between facial view and gaze direction. Taken together, these results indicated the existence of a functional heterogeneity within the anterior STS and suggested a plausible hierarchical organization of facial information processing.

20.
To investigate the neuronal basis underlying face identification, the activity of face neurons in the anterior superior temporal sulcus (STS) and the anterior inferior temporal gyrus (ITG) of macaque monkeys was analyzed during their performance of a face-identification task. A face space was constructed from the activities of face neurons during the face-identification task using a multidimensional scaling (MDS) method; the face space constructed from the anterior STS neurons represented facial views, whereas that constructed from the anterior ITG neurons represented facial identity. The temporal correlation between the behavioral reaction time of the animal and the latency of face-related neuronal responses was also analyzed. The response latency of some of the face neurons in the anterior ITG exhibited a significant correlation with the behavioral reaction time, whereas this correlation was not significant in the anterior STS. The correlation of the latency of face-related neuronal responses in the anterior ITG with the behavioral reaction time could not be attributed to a correlation between the response latency and the magnitude of the neuronal responses. The present results suggest that the anterior ITG is closely related to judgments of facial identity, and that the anterior STS is closely related to analyses of incoming perceptual information; face identification in monkeys might involve interactions between the two areas.
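The MDS step described in the abstract above amounts to embedding stimuli into a low-dimensional space from pairwise dissimilarities between population response vectors. A minimal sketch using classical (Torgerson) MDS with NumPy follows; the response matrix is synthetic, and the choice of Euclidean dissimilarity and a 2-D embedding are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed n items with distance matrix D into k dims."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)     # eigh returns ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:k]      # pick the top-k eigenvalues
    scale = np.sqrt(np.clip(eigvals[idx], 0, None))
    return eigvecs[:, idx] * scale           # n x k embedding coordinates

# Synthetic population-response matrix: rows = face stimuli (e.g. different
# views or identities), columns = firing rates of individual face neurons.
rng = np.random.default_rng(0)
responses = rng.random((8, 40))              # 8 stimuli x 40 neurons

# Dissimilarity between stimuli = Euclidean distance between response vectors.
D = np.linalg.norm(responses[:, None, :] - responses[None, :, :], axis=-1)

face_space = classical_mds(D, k=2)           # 2-D "face space" coordinates
print(face_space.shape)                      # (8, 2)
```

In such a space, stimuli that evoke similar population responses land close together, so clustering by view versus by identity can be read off directly, which is the kind of contrast the abstract draws between the STS and ITG face spaces.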


Copyright © Beijing Qinyun Science and Technology Development Co., Ltd. 京ICP备09084417号