Similar documents
20 similar documents found (search time: 31 ms)
1.
How we perceive emotional signals from our environment depends on our personality. Alexithymia, a personality trait characterized by difficulties in emotion regulation, has been linked to aberrant brain activity for visual emotional processing. Whether alexithymia also affects the brain’s perception of emotional speech prosody is currently unknown. We used functional magnetic resonance imaging to investigate the impact of alexithymia on hemodynamic activity in three a priori regions of the prosody network: the superior temporal gyrus (STG), the inferior frontal gyrus and the amygdala. Twenty-two subjects performed an explicit task (emotional prosody categorization) and an implicit task (metrical stress evaluation) on the same prosodic stimuli. Irrespective of task, alexithymia was associated with a blunted response of the right STG and the bilateral amygdalae to angry, surprised and neutral prosody. Individuals with difficulty describing feelings deactivated the left STG and the bilateral amygdalae to a lesser extent in response to angry compared with neutral prosody, suggesting that they perceived angry prosody as relatively more salient than neutral prosody. In conclusion, alexithymia may be associated with a generally blunted neural response to speech prosody. Such restricted prosodic processing may contribute to the problems in social communication associated with this personality trait.

2.
Change in linguistic prosody generates a mismatch negativity response (MMN), indicating neural representation of linguistic prosody, while change in affective prosody generates a positive response (P3a), reflecting its motivational salience. However, the neural response to concurrent affective and linguistic prosody is unknown. The present paper investigates the integration of these two prosodic features in the brain by examining the neural response to separate and concurrent processing with electroencephalography (EEG). A spoken pair of Swedish words—fasen 'phase' and fasen 'damn', segmentally identical but differing in emotional semantics due to linguistic prosody (word accent)—was presented to 16 subjects in angry and neutral affective prosody using a passive auditory oddball paradigm. Acoustically matched pseudoword counterparts (vasem, carrying the same two accent patterns) were used as controls. Following the constructionist concept of emotions, which holds that the conceptualization of emotions is grounded in language, it was hypothesized that concurrent affective and linguistic prosody with the same valence—angry fasen 'damn'—would elicit a unique late EEG signature, reflecting the temporal integration of affective voice with emotional semantics of prosodic origin. In accordance, linguistic prosody elicited an MMN at 300–350 ms, and affective prosody evoked a P3a at 350–400 ms, irrespective of semantics. Beyond these responses, concurrent affective and linguistic prosody evoked a late positive component (LPC) at 820–870 ms in frontal areas, indicating the conceptualization of affective prosody based on linguistic prosody. This study provides evidence that the brain not only distinguishes between these two functions of prosody but also integrates them based on language and experience.

3.
Background: Individuals with intermittent explosive disorder (IED) were previously found to exhibit amygdala hyperactivation and relatively reduced orbital medial prefrontal cortex (OMPFC) activation to angry faces while performing an implicit emotion information processing task during functional magnetic resonance imaging (fMRI). This study examines the neural substrates associated with explicit encoding of facial emotions among individuals with IED. Method: Twenty unmedicated IED subjects and twenty healthy, matched comparison subjects (HC) underwent fMRI while viewing blocks of angry, happy, and neutral faces and identifying the emotional valence of each face (positive, negative or neutral). We compared amygdala and OMPFC reactivity to faces between IED and HC subjects. We also examined the relationship between amygdala/OMPFC activation and aggression severity. Results: Compared to controls, the IED group exhibited greater amygdala response to angry (vs. neutral) facial expressions. In contrast, IED and control groups did not differ in OMPFC activation to angry faces. Across subjects, amygdala activation to angry faces was correlated with the number of prior aggressive acts. Conclusions: These findings extend previous evidence of amygdala dysfunction in response to the identification of an ecologically valid social threat signal (processing angry faces) among individuals with IED, further substantiating a link between amygdala hyperactivity to social signals of direct threat and aggression.

4.
ERP evidence for a sex-specific Stroop effect in emotional speech
The present study investigated the interaction of emotional prosody and word valence during emotional comprehension in men and women. In a prosody–word interference task, participants listened to positive, neutral, and negative words that were spoken with happy, neutral, and angry prosody. Participants were asked to rate word valence while ignoring emotional prosody, or vice versa. Congruent stimuli were responded to faster and more accurately than incongruent emotional stimuli. This behavioral effect was more salient for the word valence task than for the prosodic task and was comparable between men and women. The event-related potentials (ERPs) revealed a smaller N400 amplitude for congruent as compared to emotionally incongruent stimuli. This ERP effect, however, was significant only for the word valence judgment and only for female listeners. The present data suggest that the word valence judgment was more difficult and more easily influenced by task-irrelevant emotional information than the prosodic task in both men and women. Furthermore, although emotional prosody and word valence may have a similar influence on an emotional judgment in both sexes, ERPs indicate sex differences in the underlying processing. Women, but not men, show an interaction between prosody and word valence during a semantic processing stage.

5.
In the current study, we examined 7-month-old infants' processing of emotional prosody using event-related brain potentials (ERPs). Infants heard semantically neutral words that were spoken with either a happy, angry, or neutral voice. The ERP data revealed that angry prosody elicited a more negative response than did happy or neutral prosody, suggesting greater allocation of attention to angry prosody. A positive slow wave was elicited by angry and happy prosody over temporal electrode sites, indicating enhanced sensory processing of the emotionally loaded stimuli (happy and angry). The current findings demonstrate that very early in development, the human brain detects emotionally loaded words and shows differential attentional responses depending on their emotional valence.

6.
Williams syndrome (WS), a neurodevelopmental genetic disorder caused by a microdeletion on chromosome 7, is associated with an intriguing socio-cognitive phenotype. Deficits in prosody production and comprehension have been consistently reported in behavioral studies. The neurobiological processes underlying prosody processing in WS, however, remain to be clarified. This study aimed to characterize the electrophysiological response to neutral, happy, and angry prosody in WS, and to examine whether this response depended on the semantic content of the utterance. A group of 12 participants (5 female and 7 male) diagnosed with WS, with an age range of 9 to 31 years, was compared with a group of typically developing participants, individually matched for chronological age, gender and laterality. After inspection of EEG artifacts, data from 9 participants with WS and 10 controls were included in ERP analyses. Participants were presented with neutral, positive and negative sentences in two conditions: (1) with intelligible semantic and syntactic information; (2) with unintelligible semantic and syntactic information (‘pure prosody’ condition). They were asked to decide which emotion underlay each auditory sentence. Atypical event-related potential (ERP) components related to prosodic processing (N100, P200, N300) were found in WS: a reduced N100 for prosodic sentences with semantic content; a more positive P200 for sentences with semantic content, in particular for happy and angry intonations; and a reduced N300 for both sentence conditions. These findings suggest abnormalities in early auditory processing, indicating a bottom-up contribution to the impairment in emotional prosody processing and comprehension. At least for the N100 and P200, they also suggest top-down contributions of semantic processes to the sensory processing of speech.
This study showed, for the first time, that abnormalities in ERP measures of early auditory processing in WS are also present during the processing of emotional vocal information. This may represent a physiological signature of underlying impairments in on-line language and socio-emotional processing.

7.
Introduction: Vocal anger is a salient social signal serving adaptive functions in typical child development. Despite recent advances in the developmental neuroscience of emotion processing with regard to visual stimuli, little is known about the neural correlates of vocal anger processing in childhood. This study represents the first attempt to isolate a neural marker of vocal anger processing in children using electrophysiological methods. Methods: We compared ERP waveforms during the processing of non-word emotional vocal stimuli in a population sample of 55 typically developing children aged 6–11 years. Children listened to three types of stimuli expressing angry, happy, and neutral prosody and completed an emotion identification task with three response options (angry, happy and neutral/‘ok’). Results: A distinctive N400 component modulated by the emotional content of the vocal stimulus was observed in children over parietal and occipital scalp regions—amplitudes were significantly attenuated to angry compared to happy and neutral voices. Discussion: Findings of the present study regarding the N400 are compatible with adult studies showing reduced N400 amplitudes to negative compared to neutral emotional stimuli. Implications for studies of the neural basis of vocal anger processing in children are discussed.

8.
Deficits in emotional prosodic processing, the expression of emotions in voice, have been widely reported in patients with schizophrenia, affecting not only the comprehension of emotional prosody but also its expression. Given that prosodic cues are important in memory for voice and speaker identity, Cutting has proposed that prosodic deficits may contribute to the misattribution that appears to occur in auditory hallucinations in psychosis. The present study compared hallucinating patients with schizophrenia, non-hallucinating patients and normal controls on an emotional prosodic processing task. It was hypothesised that hallucinators would demonstrate greater deficits in emotional prosodic processing than non-hallucinators and normal controls. Participants were 67 patients with a diagnosis of schizophrenia or schizoaffective disorder (hallucinating = 38, non-hallucinating = 29) and 31 normal controls. The prosodic processing task comprised a series of semantically neutral sentences expressed in happy, sad and neutral voices, which were rated on a 7-point Likert scale from sad (−3) through neutral (0) to happy (+3). Hallucinating patients showed significant deficits on the prosodic processing task compared to non-hallucinating patients and normal controls. No significant differences were observed between non-hallucinating patients and normal controls. Thus, patients experiencing auditory hallucinations were not as successful in recognising and using prosodic cues as the non-hallucinating patients. These results are consistent with Cutting's hypothesis that prosodic dysfunction may mediate the misattribution of auditory hallucinations.

9.
《Schizophrenia Research》2014,152(1):235-241
Background: Abnormalities in emotional prosody processing have been consistently reported in schizophrenia and are related to poor social outcomes. However, the role of stimulus complexity in abnormal emotional prosody processing is still unclear. Method: We recorded event-related potentials in 16 patients with chronic schizophrenia and 16 healthy controls to investigate: 1) the temporal course of emotional prosody processing; and 2) the relative contribution of prosodic and semantic cues in emotional prosody processing. Stimuli were prosodic single words presented in two conditions: with intelligible semantic content (semantic content condition—SCC) and with unintelligible semantic content (pure prosody condition—PPC). Results: Relative to healthy controls, schizophrenia patients showed reduced P50 for happy PPC words, and reduced N100 for both neutral and emotional SCC words and for neutral PPC stimuli. Also, increased P200 was observed in schizophrenia for happy prosody in SCC only. Behavioral results revealed higher error rates in schizophrenia for angry prosody in SCC and for happy prosody in PPC. Conclusions: Together, these data further demonstrate the interaction between abnormal sensory processes and higher-order processes in producing emotional prosody processing dysfunction in schizophrenia. They further suggest that impaired emotional prosody processing depends on stimulus complexity.

10.
Facial and vocal expressions are essential modalities mediating the perception of emotion and social communication. Nonetheless, little is currently known about how emotion perception and its neural substrates differ between facial expression and vocal prosody. To clarify this issue, functional MRI scans were acquired in Study 1, in which participants were asked to discriminate the valence of emotional expression (angry, happy or neutral) from facial, vocal, or bimodal stimuli. In Study 2, we used an affective priming task (unimodal materials as primes and bimodal materials as targets), and participants were asked to rate the intensity, valence, and arousal of the targets. Study 1 showed higher accuracy and shorter response latencies in the facial than in the vocal modality for happy expressions. Whole-brain analysis showed enhanced activation during facial compared to vocal emotions in the inferior temporal-occipital regions. Region-of-interest analysis showed a higher percentage signal change for facial than for vocal anger in the superior temporal sulcus. Study 2 showed that facial relative to vocal priming of anger had a greater influence on perceived emotion for bimodal targets, irrespective of target valence. These findings suggest that facial expression is associated with enhanced emotion perception compared to equivalent vocal prosody.

11.
Background: Several studies have shown that female and male subjects process emotions differently. As women appear to be especially sensitive and responsive to negative and threatening stimuli, gender‐specific emotional processing might be an important factor contributing to the increased likelihood of women compared to men to develop anxiety disorders, e.g. panic disorder (PD). Methods: In this study, gender‐specific neural activation during facial emotion processing was investigated in 20 PD patients (12 women, 8 men) by functional magnetic resonance imaging. Results: Overall, significantly stronger activation, encompassing the amygdala, prefrontal, temporal, and occipital cortical areas, basal ganglia, and thalamus, was observed in women than in men during the processing of angry, fearful, or neutral but not happy facial expressions. Additionally, functional connectivity between the amygdala and prefrontal cortical areas and thalamus during the processing of angry facial expressions was significantly stronger in women than in men. Conclusions: These results emphasize gender as an important variable in neural activation patterns of emotional processing and may help to further elucidate the biological substrate of gender‐specific susceptibility for PD. Depression and Anxiety, 2010. © 2010 Wiley‐Liss, Inc.

12.
《Social neuroscience》2013,8(2):185-196
Emotion research is guided both by the view that emotions are points in a dimensional space, such as valence or approach–withdrawal, and by the view that emotions are discrete categories. We determined whether effective connectivity of amygdala with medial orbitofrontal cortex (MOFC) and lateral orbitofrontal cortex (LOFC) differentiates the perception of emotion faces in a manner consistent with the dimensional and/or categorical view. Greater effective connectivity from left MOFC to amygdala differentiated positive and neutral expressions from negatively valenced angry, disgust, and fear expressions. Greater effective connectivity from right LOFC to amygdala differentiated emotion expressions conducive to perceiver approach (happy, neutral, and fear) from angry expressions that elicit perceiver withdrawal. Finally, consistent with the categorical view, there were unique patterns of connectivity in response to fear, anger, and disgust, although not in response to happy expressions, which did not differ from neutral ones.

14.
BACKGROUND: Little is known about the functional neuroanatomy underlying the processing of emotional stimuli in social phobia. OBJECTIVES: To investigate specific brain activation associated with the processing of threat and safety signals in social phobics. METHODS: Using functional magnetic resonance imaging, brain activation was measured in social phobic and nonphobic subjects during the presentation of angry, happy and neutral facial expressions under free viewing conditions. RESULTS: Compared to controls, phobics showed increased activation of extrastriate visual cortex regardless of facial expression. Angry, but not neutral or happy, faces elicited greater insula responses in phobics. In contrast, both angry and happy faces led to increased amygdala activation in phobics. CONCLUSIONS: The results support the hypothesis that the amygdala is involved in the processing of both negative and positive stimuli. Furthermore, social phobics respond sensitively not only to threatening but also to accepting faces, and both common and distinct neural mechanisms appear to be associated with the processing of threat versus safety signals.

15.
Functional magnetic resonance imaging was used to investigate hemodynamic responses to adjectives pronounced in happy and angry intonations of varying emotional intensity. In separate sessions, participants judged the emotional valence of either intonation or semantics. To disentangle effects of emotional prosodic intensity from confounding acoustic parameters, mean and variability of volume and fundamental frequency of each stimulus were included as nuisance variables in the statistical models. A linear dependency between hemodynamic responses and emotional intensity of happy and angry intonations was found in the bilateral superior temporal sulcus during both tasks, indicating that increases of hemodynamic responses in this region are elicited by both positive and negative prosody independent of low-level acoustic properties and task instructions.

16.
There is increasing evidence for a role of the amygdala in processing gaze direction and the emotional relevance of faces. In this event-related functional magnetic resonance imaging study we investigated amygdala responses while orthogonally manipulating head direction, gaze direction and facial expression (angry, happy and neutral). This allowed us to investigate effects of stimulus ambiguity, low-level factors and non-emotional factors on amygdala activation. Averted vs direct gaze induced increased activation in the right dorsal amygdala regardless of facial expression and head orientation. Furthermore, valence effects were found in the ventral amygdala and were strongly dependent on head orientation. We observed enhanced activation to angry and neutral vs happy faces for observer-directed faces in the left ventral amygdala, while the averted-head condition reversed this pattern, resulting in increased activation to happy as compared to angry and neutral faces. These results suggest that gaze direction specifically drives dorsal amygdala activation regardless of facial expression, low-level perceptual factors or stimulus ambiguity. The role of the amygdala is thus not restricted to the detection of potential threat but extends to more general attentional processes. Furthermore, valence effects are associated with activation of the ventral amygdala and are strongly influenced by non-emotional factors.

17.
Multiple lines of evidence converge on the human amygdala as a core moderator of facial emotion perception. The major subregions of the human amygdala have been anatomically delineated, but the individual contribution of these subregions to facial emotion perception is unclear. Here we combined functional MRI (fMRI) with cytoarchitectonically defined maximum probabilistic maps to investigate the response characteristics of amygdala subregions in 14 subjects presented with dynamic animations of angry and happy relative to neutral facial expressions. We localized facial emotion-related signal changes in the basolateral and superficial (cortical) subregions of the left amygdala, with most robust responses observed to happy faces. Moreover, we demonstrate a differential neural response to happy faces in ventromedial prefrontal cortex, which is consistent with a hypothesized role of this brain region in positive valence processing. Furthermore, angry and happy faces both evoked temporopolar responses. Our findings extend current models of facial emotion perception in humans by suggesting an intrinsic functional differentiation within the amygdala related to the extraction of value from facial expressions.

18.
Neural substrates of facial emotion processing using fMRI
We identified human brain regions involved in the perception of sad, frightened, happy, angry, and neutral facial expressions using functional magnetic resonance imaging (fMRI). Twenty-one healthy right-handed adult volunteers (11 men, 10 women; aged 18-45; mean age 21.6 years) participated in four separate runs, one for each of the four emotions. Participants viewed blocks of emotionally expressive faces alternating with blocks of neutral faces and scrambled images. In comparison with scrambled images, neutral faces activated the fusiform gyri, the right lateral occipital gyrus, the right superior temporal sulcus, the inferior frontal gyri, and the amygdala/entorhinal cortex. In comparisons of emotional and neutral faces, we found that (1) emotional faces elicit increased activation in a subset of cortical regions involved in neutral face processing and in areas not activated by neutral faces; (2) differences in activation as a function of emotion category were most evident in the frontal lobes; and (3) men showed a differential neural response depending upon the emotion expressed, but women did not.

19.
In depression, patients suffer from emotional and cognitive deficits, including deficits in semantic processing. Whether these semantic deficits are purely cognitive or interact with emotional dysfunction remains an open question. The aim of the current study was to investigate the influence of emotional valence on the neural correlates of semantic priming in major depression. In a lexical decision task, positive, negative, and neutral word pairs were presented during fMRI measurement. Nineteen inpatients and 19 demographically matched controls were recruited. Behaviorally, positive and neutral valence induced a priming effect, whereas negative valence induced no effect (controls) or even inhibition (slower RTs for related stimuli) in patients. At the neural level, the semantic relation effect revealed similar activation in right middle frontal regions for patients and controls. Group differences emerged in the right fusiform gyrus and the ACC. Activity associated with positive valence differed in the DLPFC and amygdala, and for negative valence in the putamen and cerebellum. Activation of the amygdala and DLPFC correlated negatively with the severity of depression. To conclude, semantic processing deficits in depression are modulated by the emotional valence of the stimulus at the behavioral as well as the neural level, in right‐lateralized prefrontal areas and the amygdala. The results highlight an influence of depression severity on emotional information processing, as the severity of symptoms correlated negatively with neural responses to positively and negatively valenced information. Hence, dysfunctional emotion processing may further enhance the cognitive deficits in depression. Hum Brain Mapp 35:471–482, 2014. © 2012 Wiley Periodicals, Inc.

20.
Emotional stimuli have been shown to preferentially engage initial attention, but their sustained effects on neural processing remain largely unknown. The present study evaluated whether emotional faces engage sustained neural processing by examining the attenuation of neural repetition suppression to repeated emotional faces. Repetition suppression refers to the general reduction of neural activity when processing a repeated stimulus. Preferential processing of emotional face stimuli, however, should elicit sustained neural processing such that repetition suppression to repeated emotional faces is attenuated relative to faces with no emotional content. We measured the reduction of functional magnetic resonance imaging signals associated with immediate repetition of neutral, angry and happy faces. Whereas neutral faces elicited the greatest suppression in ventral visual cortex, followed by angry faces, repetition suppression was most attenuated for happy faces. Indeed, happy faces showed almost no repetition suppression in parts of the right inferior occipital and fusiform gyri, which play an important role in face-identity processing. Our findings suggest that happy faces are associated with sustained visual encoding of face identity, thereby assisting in the formation of more elaborate representations of the faces, congruent with findings in the behavioral literature.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)