Similar Articles (20 results)
1.
Alexithymic individuals have difficulties in identifying and verbalizing their emotions. The amygdala is known to play a central role in processing emotional stimuli and in generating emotional experience. In the present study, automatic amygdala reactivity to facial emotion was investigated as a function of alexithymia (as assessed by the 20-Item Toronto Alexithymia Scale). The Beck Depression Inventory (BDI) and the State-Trait Anxiety Inventory (STAI) were administered to measure participants' depressivity and trait anxiety. During 3T fMRI scanning, pictures of faces bearing sad, happy, and neutral expressions masked by neutral faces were presented to 21 healthy volunteers. The amygdala was selected as the region of interest (ROI); voxel values of the ROI were extracted, summarized by their mean, and compared among conditions. A detection task was applied to assess participants' awareness of the masked emotional faces shown in the fMRI experiment. Masked sad and happy facial emotions led to greater right amygdala activation than masked neutral faces. The alexithymia facet "difficulty identifying feelings" was negatively correlated with the neural response of the right amygdala to masked sad faces, even when controlling for depressivity and anxiety. Reduced automatic amygdala responsivity may contribute to problems in identifying one's emotions in everyday life. Low spontaneous reactivity of the amygdala to sad faces may reflect reduced engagement in the encoding of negative emotional stimuli.
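The ROI analysis described above (extracting the amygdala voxel values and summarizing each condition by their mean) can be sketched as follows. The array shapes, mask location, and condition maps are illustrative assumptions, not the study's actual data:

```python
import numpy as np

# Hypothetical data: one contrast map per condition on a small voxel grid,
# plus a boolean mask marking the (toy) right-amygdala ROI.
rng = np.random.default_rng(0)
shape = (16, 16, 16)
condition_maps = {
    "masked_sad": rng.normal(0.5, 1.0, shape),
    "masked_happy": rng.normal(0.4, 1.0, shape),
    "masked_neutral": rng.normal(0.0, 1.0, shape),
}
amygdala_mask = np.zeros(shape, dtype=bool)
amygdala_mask[10:13, 6:9, 5:8] = True  # toy ROI location

def roi_mean(volume, mask):
    """Summarize an ROI by the mean of its voxel values."""
    return float(volume[mask].mean())

roi_means = {cond: roi_mean(vol, amygdala_mask) for cond, vol in condition_maps.items()}
for cond, m in roi_means.items():
    print(cond, round(m, 3))
```

In a real analysis the mask and contrast maps would come from anatomically defined atlas labels and first-level model outputs; the summarize-by-mean step itself is as simple as shown.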

2.
Objective: To use event-related functional MRI to study the recognition of dynamic facial expressions in healthy Han Chinese women and to explore its neural basis. Methods: A 1.5T fMRI system was used to record the brain responses of 13 healthy female subjects while they identified video clips of sad, happy, and neutral dynamic facial expressions. Image data were processed and statistically analyzed with SPM2 to obtain brain activation maps. Results: Compared with viewing a fixation cross, recognition of neutral expressions activated the left middle frontal gyrus, bilateral precentral gyri, right amygdala, left inferior parietal lobule, right postcentral gyrus, and thalamus. Compared with neutral expressions, recognition of happy expressions activated the right medial frontal gyrus, right superior frontal gyrus, right middle frontal gyrus, right anterior cingulate gyrus, left subcallosal gyrus, right superior occipital gyrus, right inferior occipital gyrus, left middle occipital gyrus, and right superior temporal gyrus, whereas recognition of sad expressions activated the left medial frontal gyrus, right middle frontal gyrus, left inferior temporal gyrus, and left superior temporal gyrus. Conclusion: Face processing and the recognition of dynamic expressions are mediated by a distributed neural network, in which the medial frontal gyrus participates in the processing of multiple emotions and may constitute a common pathway for emotion processing, while the superior temporal gyrus is chiefly responsible for processing dynamic facial features.

3.
Objective: To study the characteristics of explicit and implicit self-esteem in prisoners convicted of duty-related (occupational) crimes, and the differences in personality traits and mood states among prisoners with different self-esteem structures. Methods: A convenience sample of 112 prisoners convicted of duty-related crimes was recruited from one prison; 88 met the inclusion criteria. The NEO Five-Factor Inventory (NEO-FFI), the Profile of Mood States-Short Form (POMS-SF), the Self-Esteem Scale (SES), and an Implicit Association Test (IAT) were administered as questionnaires and a behavioral experiment. Using the mean SES and IAT scores as cutoffs, participants were divided into high-explicit/high-implicit (n=13), high-explicit/low-implicit (n=14), low-explicit/high-implicit (n=12), and low-explicit/low-implicit (n=12) groups. Results: 51 valid questionnaires were obtained. In the IAT, reaction times in compatible trials were shorter than in incompatible trials (P=0.022, d=0.45). Explicit and implicit self-esteem scores were not significantly correlated (r=0.10, P=0.527). The two high-explicit-self-esteem groups scored lower on NEO-FFI neuroticism than the two low-explicit groups. The high-explicit/low-implicit group scored higher on extraversion than the high-explicit/high-implicit and low-explicit/low-implicit groups, higher on openness than the low-explicit/high-implicit group, and higher on conscientiousness than the two low-explicit groups. The low-explicit/high-implicit group scored higher on the POMS-SF depression-dejection, anger-hostility, fatigue, and confusion-bewilderment subscales than the two high-explicit groups. Explicit self-esteem correlated positively with extraversion, openness, and vigor (r=0.40, 0.34, 0.34; all P<0.05) and negatively with neuroticism, depression-dejection, and fatigue (r=-0.52, -0.35, -0.42; P<0.01 or P<0.05); implicit self-esteem correlated positively with vigor (r=0.39, P<0.01). Conclusion: Explicit and implicit self-esteem are two mutually independent constructs in this population; prisoners with high explicit self-esteem show relatively stable emotional personality profiles, whereas those with low explicit but high implicit self-esteem show relatively prominent negative moods such as depression, anger, fatigue, and confusion.
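The IAT result reported above (compatible trials faster than incompatible trials, d = 0.45) rests on a comparison of mean reaction times across the two block types. A minimal sketch of computing the IAT effect and a paired-samples Cohen's d, using invented latencies rather than the study's data:

```python
import numpy as np

# Hypothetical per-participant mean reaction times (ms) in the two IAT block types.
compatible = np.array([612.0, 655.0, 698.0, 588.0, 702.0, 641.0])
incompatible = np.array([671.0, 690.0, 744.0, 615.0, 748.0, 689.0])

diff = incompatible - compatible   # positive values: compatible trials were faster
iat_effect = diff.mean()

# Cohen's d for paired samples: mean difference / SD of the differences.
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"mean RT difference: {iat_effect:.1f} ms, d = {cohens_d:.2f}")
```

The significance test itself would be a paired t-test on `diff`; only the effect-size arithmetic is shown here.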

4.
It is known that the temporal cortex is involved in perception of emotional facial expressions, and the involvement is relatively independent of the emotional valence of those expressions. The present study revealed a valence-dependent aspect of the temporal cortex through individual differences analyses involving the neuroticism trait, one of the representative affective personality traits. Functional MRI was administered while subjects classified expressions of faces, and neuroticism scores were obtained from individual subjects. Significant brain activity was observed in the temporal pole (TP) during perception of both happy and sad expressions relative to neutral expressions. Correlational analyses revealed that TP activity during perception of sad expressions, but not happy expressions, correlated with the neuroticism scores. These results demonstrate differential roles for the temporal cortex in perception of happy and sad faces, and suggest that TP recruitment during understanding of negative emotions is dependent on the personality of the individuals.

5.
It has been argued that the amygdala represents an integral component of a vigilance system that is primarily involved in the perception of ambiguous stimuli of biological relevance. The present investigation was conducted to examine the relationship between automatic amygdala responsivity to fearful faces which may be interpreted as an index of trait-like threat sensitivity and spatial processing characteristics of facial emotions. During 3T fMRI scanning, pictures of human faces bearing fearful, angry, and happy expressions were presented to 20 healthy volunteers using a backward masking procedure based on neutral facial expressions. Subsequently, a computer-based face-in-the-crowd task using schematic face stimuli was administered. The neural response of the (right) amygdala to masked fearful faces correlated consistently with response speed to negative and neutral faces. Neither amygdala activation during the masked presentation of angry faces nor amygdala activation during the presentation of happy faces was correlated with any of the response latencies in the face-in-the-crowd task. Our results suggest that amygdala responsivity to masked facial expression is differentially related to the general visual search speed for facial expression. Neurobiologically defined threat sensitivity seems to represent an important determinant of visual scanning behaviour.

6.
This experiment investigated the prime frequency effect of masked affective stimuli on effort-related cardiovascular response. Cardiovascular reactivity was recorded during a baseline period and an attention task in which either 1/3, 2/3, or 3/3 of the trials included the presentation of masked emotional facial expressions (sad vs. happy). In the resting trials participants were exposed to masked neutral expressions. As expected, and replicating previous findings (Gendolla and Silvestrini, in press), participants in the 1/3 priming condition showed stronger systolic blood pressure reactivity - indicating more effort - when they were exposed to masked sad faces than when they were exposed to masked happy faces. This effect disappeared in the 2/3 and 3/3 conditions. Findings are interpreted as demonstrating habituation effects of masked affective stimuli on effort mobilization.

7.
To investigate whether subliminally priming for competition influences facial reactions to facial emotional displays, 49 participants were either subliminally competition primed or neutrally primed. Thereafter, they viewed computer generated avatar faces with happy, neutral, and sad expressions while Corrugator supercilii and Zygomaticus major reactions were recorded. Results revealed facial mimicry to happy and sad faces in the neutrally primed group but not the competition primed group. Furthermore, subliminal competition priming enhanced Corrugator supercilii activity after an initial relaxation while viewing happy faces. An impression formation task revealed counter empathic effects confirming successful competition priming. Overall, results indicate that nonconscious processes influence a presumably nonconscious behavior.

8.
The current event-related potential study investigated the perceptual processing of the categorization advantage of happy over sad faces in social anhedonia with a nonclinical sample. A high social anhedonia (HSA, N = 25) group and a low social anhedonia (LSA, N = 27) group performed a facial expression categorization task during which they classified facial expressions (happy, neutral, sad), irrespective of face orientation (upright, upside-down). Behaviorally, happy faces were identified more quickly than sad ones in the upright but not inverted orientation. Electrophysiologically, the LSA group showed earlier N170 latencies for happy than for sad faces in the upright but not upside-down orientation, whereas the HSA group did not show any expression effect on N170 latencies. Moreover, N170 and P2 amplitude results revealed that HSA relative to LSA individuals showed delayed neural discrimination between happy and sad faces. These results suggest that social anhedonia is associated with a deficit of perceptual processing during facial expression categorization.

9.
Recent studies that used adult faces as the baseline have revealed that attentional bias toward infant faces is strongest for neutral expressions, relative to happy and sad expressions. However, the time course of this strongest attentional bias toward infant neutral expressions is unclear. To clarify this time course, we combined a behavioral dot-probe task with electrophysiological event-related potentials (ERPs) to measure adults' responses to infant and adult faces with happy, neutral, and sad expressions derived from the same face. The results indicated that, compared with the corresponding expressions in adult faces, attentional bias toward infant faces with various expressions resulted in different patterns during rapid and prolonged attention stages. In particular, first, neutral expressions in infant faces elicited greater behavioral attentional bias and P1 responses than happy and sad ones did. Second, sad expressions in infant faces elicited greater N170 responses than neutral and happy ones did; notably, sad expressions elicited greater N170 responses in the left hemisphere in women than in men. Third, late positive potential (LPP) responses were greater for infant faces than for adult faces under each expression condition. Thus, we propose a three-stage model of attentional allocation patterns that reveals the time course of attentional bias toward infant faces with various expressions. This model highlights the prominent role of neutral facial expressions in the attentional bias toward infant faces.

10.
In communication, language can be interpreted differently depending upon the emotional context. To clarify the effect of emotional context on language processing, we performed experiments using a cross-modal priming paradigm with an auditorily presented prime and a visually presented target. The primes were the names of people that were spoken with a happy, sad, or neutral intonation; the targets were interrogative one-word sentences with emotionally neutral content. Using magnetoencephalography, we measured neural activities during silent reading of the targets presented in a happy, sad, or neutral context. We identified two conditional differences: the happy and sad conditions produced less activity than the neutral condition in the right posterior inferior and middle frontal cortices in the latency window from 300 to 400 ms; the happy and neutral conditions produced greater activity than the sad condition in the left posterior inferior frontal cortex in the latency window from 400 to 500 ms. These results suggest that the use of emotional context stored in the right frontal cortex starts at ∼300 ms, that integration of linguistic information with emotional context starts at ∼400 ms in the left frontal cortex, and that language comprehension dependent on emotional context is achieved by ∼500 ms.

11.
Zhao L, Li J. Neuroscience Letters. 2006;410(2):126-131.
A modified "cross-modal delayed response" paradigm was used to investigate putative processing of face expression in the absence of focused attention to the face. Neutral, happy and sad faces were presented during intervals occurring between a tone and a response imperative signal (a faint click), while subjects were instructed to discriminate the location of the tone as quickly and accurately as possible and to ignore the faces. A neutral face was presented in 80% of the trials whereas the happy and sad faces were presented in the remaining trials - 10%, respectively. Expression mismatch negativity (EMMN) was obtained by subtracting the ERP elicited by neutral faces from that elicited by sad faces or happy faces. The EMMN started from around 120 ms (sad) and 110 ms (happy) lasting up to 430 ms (sad) and 360 ms (happy) post-stimulus. The EMMN elicited by sad faces was more negative than that elicited by happy faces. Both EMMNs distributed over posterior areas and covered larger areas in the right than in the left hemisphere sites (especially for happy EMMN).  相似文献   

12.
Social anxiety has been characterized by an attentional bias towards threatening faces. Electrophysiological studies have demonstrated modulations of cognitive processing from 100 ms after stimulus presentation. However, the impact of the stimulus features and task instructions on facial processing remains unclear. Event-related potentials were recorded while high and low socially anxious individuals performed an adapted Stroop paradigm that included a colour-naming task with non-emotional stimuli, an emotion-naming task (the explicit task) and a colour-naming task (the implicit task) on happy, angry and neutral faces. Whereas the impact of task factors was examined by contrasting an explicit and an implicit emotional task, the effects of perceptual changes on facial processing were explored by including upright and inverted faces. The findings showed an enhanced P1 in social anxiety during the three tasks, without a moderating effect of the type of task or stimulus. These results suggest a global modulation of attentional processing in performance situations.

13.
Previous research has shown that implicitly measured shyness predicted spontaneous shy behavior in social situations, while explicit self-ratings of shyness predicted controlled shy behavior (Asendorpf, Banse, & Mücke, 2002). The present study examined whether these same results would be replicated in Japan. In Study 1, college students (N=47) completed a shyness Implicit Association Test (IAT for shyness) and explicit self-ratings of shyness. In Study 2, friends (N=69) of the Study 1 participants rated those participants on various personality scales. Covariance structure analysis revealed that only implicit self-concept measured by the shyness IAT predicted other-rated high interpersonal tension (spontaneous shy behavior). Also, only explicit self-concept predicted other-rated low praise seeking (controlled shy behavior). The results of this study are similar to the findings of the previous research.

14.
Developing measures of socioaffective processing is important for understanding the mechanisms underlying emotional-interpersonal traits relevant to health, such as hostility. In this study, cigarette smokers with low (LH; n = 49) and high (HH; n = 43) trait hostility completed the Emotional Interference Gender Identification Task (EIGIT), a newly developed behavioral measure of socioaffective processing biases toward facial affect. The EIGIT involves quickly categorizing the gender of facial pictures that vary in affective valence (angry, happy, neutral, sad). Results showed that participants were slower and less accurate in categorizing the gender of angry faces in comparison to happy, neutral, and sad faces (which did not differ), signifying interference indicative of a socioaffective processing bias toward angry faces. Compared to LH individuals, HH participants exhibited diminished biases toward angry faces on error-based (but not speed-based) measures of emotional interference, suggesting impaired socioaffective processing. The EIGIT may be useful for future research on the role of socioaffective processing in traits linked with poor health.
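The interference described above (slower and less accurate gender categorization for angry faces relative to the other expressions) can be expressed as a simple bias score: the angry-face score minus the mean of the other conditions. The condition means below are invented for illustration, and the scoring function is a plausible sketch rather than the EIGIT's published scoring rule:

```python
# Hypothetical per-condition mean reaction times (ms) and accuracies for one participant.
rt = {"angry": 702.0, "happy": 661.0, "neutral": 658.0, "sad": 664.0}
acc = {"angry": 0.91, "happy": 0.97, "neutral": 0.98, "sad": 0.96}

def interference(scores, target="angry", higher_is_bias=True):
    """Bias toward the target expression: target score minus the mean of the others."""
    others = [v for k, v in scores.items() if k != target]
    delta = scores[target] - sum(others) / len(others)
    return delta if higher_is_bias else -delta

rt_bias = interference(rt)                           # positive: slower on angry faces
acc_bias = interference(acc, higher_is_bias=False)   # positive: less accurate on angry faces

print(f"speed-based bias: {rt_bias:.1f} ms, error-based bias: {acc_bias:.3f}")
```

Speed-based and error-based variants of the same contrast are kept separate, matching the abstract's finding that group differences appeared only on the error-based measure.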

15.
Hemispheric perception of emotional valence from facial expressions.
The authors previously reported that normal subjects are better at discriminating happy from neutral faces when the happy face is located to the viewer's right of the neutral face; conversely, discrimination of sad from neutral faces is better when the sad face is shown to the left, supporting a role for the left hemisphere in processing positive valence and for the right hemisphere in processing negative valence. Here, the authors extend this same task to subjects with unilateral cerebral damage (31 right, 28 left). Subjects with right damage performed worse when discriminating sad faces shown on the left, consistent with the prior findings. However, subjects with either left or right damage actually performed superior to normal controls when discriminating happy faces shown on the left. The authors suggest that perception of negative valence relies preferentially on the right hemisphere, whereas perception of positive valence relies on both left and right hemispheres.

16.
BACKGROUND: The processing of facial emotion involves a distributed network of limbic and paralimbic brain structures. Many of these regions are also implicated in the pathophysiology of mood disorders. Behavioural data indicate that depressed subjects show a state-related positive recognition bias for faces displaying negative emotions. There are sparse data to suggest there may be an analogous, state-related negative recognition bias for negative emotions in mania. We used functional magnetic resonance imaging (fMRI) to investigate the behavioural and neurocognitive correlates of happy and sad facial affect recognition in patients with mania. METHOD: Functional MRI and an explicit facial affect recognition task were used in a case-control design to measure brain activation and associated behavioural response to variable intensity of sad and happy facial expressions in 10 patients with bipolar I mania and 12 healthy comparison subjects. RESULTS: The patients with mania had attenuated subjective rating of the intensity of sad facial expressions, and associated attenuation of activation in the subgenual anterior cingulate and bilateral amygdala, with increased activation in the posterior cingulate and posterior insula. No behavioural or neurocognitive abnormalities were found in response to presentation of happy facial expressions. CONCLUSIONS: Patients with mania showed a specific, mood-congruent, negative bias in sad facial affect recognition, which was associated with an abnormal profile of brain activation in paralimbic regions implicated in affect recognition and mood disorders. Functional imaging of facial emotion recognition may be a useful probe of cortical and subcortical abnormalities in mood disorders.

17.
To determine how sad affect (or brief sad mood) interacts with paralinguistic aspects of speech, we investigated the effect of a happy or sad mood induction on speech production in 49 healthy volunteers. Several speech parameters measuring speech rate, loudness and pitch were examined before and after a standardized mood-induction procedure that involved viewing facial expressions. Speech samples were collected during the self-initiated reading of emotionally "neutral" sentences; there was no explicit demand to produce mood-congruent speech. Results indicated that, after the mood induction, the speech of participants in the sad group was slower, quieter and more monotonous than the speech of participants in the happy group. This speech paradigm provides a model for studying how changes in mood states interact with the motor control of speech.

18.
Depressive states are thought to involve a cognitive bias in reactivity to emotional events. Moreover, gender may influence emotional processing. The current study investigated whether cognitive bias interacts with gender in emotional processing in minor depression (MiD) and major depression (MaD). The N170 component was obtained during a visual emotional oddball paradigm to manipulate the processing of emotional information in 33 patients with MiD, 36 with MaD, and 32 controls (CN). Compared with CN, among males, both MiD and MaD had lower N170 amplitudes for happy faces, but MaD had higher N170 amplitudes for sad faces; among females, both MiD and MaD had lower N170 amplitudes for happy and neutral faces, but higher N170 amplitudes for sad faces. Compared with MaD, among males, MiD had higher N170 amplitudes for happy faces and lower N170 amplitudes for sad faces; among females, MiD had higher N170 amplitudes only for sad faces. Interestingly, in depressed patients, N170 amplitude was negatively correlated with the HDRS score for identification of happy faces, whereas it was positively correlated with the HDRS score for identification of sad faces. These results provide novel evidence for the mood-brightening effect, with an interaction of cognitive bias and gender on emotional processing. They further suggest that female patients with depression may be more vulnerable than male patients during emotional face processing, with an unconscious negative cognitive bias, and that depressive syndromes may exist on a spectrum of severity in emotional face processing.

19.
Music is one of the most powerful elicitors of subjective emotion, yet it is not clear whether emotions elicited by music are similar to emotions elicited by visual stimuli. This leads to an open question: can music-elicited emotion be transferred to and/or influence subsequent vision-elicited emotional processing? Here we addressed this question by investigating processing of emotional faces (neutral, happy and sad) primed by short excerpts of musical stimuli (happy and sad). Our behavioural experiment showed a significant effect of musical priming: prior listening to happy (sad) music enhanced the perceived happiness (sadness) of a face irrespective of facial emotion. Further, this musical priming-induced effect was largest for neutral faces. Our electrophysiological experiment showed that such crossmodal priming effects were manifested by event-related brain potential components at a very early stage (within 100 ms post-stimulus) of neuronal information processing. Altogether, these results offer new insight into the crossmodal nature of music and its ability to transfer emotion to the visual modality.

20.
Although neutral faces do not initially convey an explicit emotional message, it has been found that individuals tend to assign them an affective content. Moreover, previous research has shown that affective judgments are mediated by the task participants have to perform. Using functional magnetic resonance imaging in 21 healthy participants, we focused this study on the cerebral activity patterns triggered by neutral and emotional faces in two different tasks (social or gender judgments). Conjunction analyses indicated that viewing both emotional and neutral faces evokes activity in several similar brain areas, indicating a common neural substrate. Moreover, neutral faces specifically elicit activation of the cerebellum and frontal and temporal areas, while emotional faces involve the cuneus, anterior cingulate gyrus, medial orbitofrontal cortex, posterior superior temporal gyrus, precentral/postcentral gyrus and insula. The task selected was also found to influence brain activity, in that the social task recruited frontal areas while the gender task involved the posterior cingulate, inferior parietal lobule and middle temporal gyrus to a greater extent. Specifically, in the social task viewing neutral faces was associated with longer reaction times and increased activity of the left dorsolateral frontal cortex compared with viewing facial expressions of emotions. In contrast, in the same task emotional expressions distinctively activated the left amygdala. The results are discussed taking into consideration the fact that, like other facial expressions, neutral expressions are usually assigned some emotional significance. However, neutral faces evoke a greater activation of circuits probably involved in more elaborate cognitive processing.
