Similar Articles
20 similar articles found (search time: 15 ms).
1.

Background

It has been suggested that individuals with social anxiety disorder (SAD) are excessively concerned with approval and disapproval from others. We therefore assessed the recognition of facial expressions by individuals with SAD, attempting to overcome the limitations of previous studies.

Methods

The sample comprised 231 individuals (78 SAD patients and 153 healthy controls). All individuals were treatment naïve, aged 18-30 years, and of similar socioeconomic status. Participants judged which emotion (happiness, sadness, disgust, anger, fear, and surprise) was presented in the facial expression of stimuli displayed on a computer screen. The stimuli were manipulated to depict different emotional intensities: the initial image was a neutral face (0%), and as the participant moved across the images, the expression increased in emotional intensity until reaching the full emotion (100%). The time, accuracy, and emotional intensity necessary to perform the judgments were evaluated.
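As a rough illustration of how graded-intensity stimuli of this kind can be produced, the sketch below cross-fades a neutral photograph into the same actor's full-emotion photograph in equal steps. This is only a minimal pixel-blending approximation; the study may well have used dedicated morphing software, and the file names and step size here are hypothetical.

    import numpy as np
    from PIL import Image

    def make_morph_sequence(neutral_path, emotional_path, steps=11):
        # Cross-fade a neutral face into a full-emotion face; returns a list of
        # (intensity_percent, image) pairs running from 0% (neutral) to 100%.
        # Assumes both photographs are the same size and roughly aligned.
        neutral = np.asarray(Image.open(neutral_path).convert("L"), dtype=float)
        emotional = np.asarray(Image.open(emotional_path).convert("L"), dtype=float)
        sequence = []
        for alpha in np.linspace(0.0, 1.0, steps):
            blend = (1.0 - alpha) * neutral + alpha * emotional
            sequence.append((round(100 * alpha), Image.fromarray(blend.astype(np.uint8))))
        return sequence

    # Hypothetical file names, not taken from the study.
    # frames = make_morph_sequence("neutral.png", "happiness_100.png")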

Results

The groups did not show statistically significant differences with respect to the number of correct judgments or the time needed to respond. However, women with SAD required less emotional intensity to recognize faces displaying fear (p = 0.002), sadness (p = 0.033) and happiness (p = 0.002); no significant differences were found for the other emotions or for men with SAD.

Conclusions

The findings suggest that women with SAD are hypersensitive to threat-related and approval-related social cues. Future studies investigating the neural basis of the impaired processing of facial emotion in SAD using functional neuroimaging would be desirable and opportune.

2.
3.
4.
Psychopathic individuals have been shown to respond less strongly than normal controls to emotional stimuli. Data about their ability to judge emotional facial expressions are inconsistent and limited to males. To measure categorical and dimensional evaluations of emotional facial expressions in psychopathic and non-psychopathic women, 13 female psychopathic forensic inpatients, 15 female non-psychopathic forensic inmates and 16 female healthy participants were tested in an emotion-categorizing task. Emotional facial expressions were presented briefly (33 ms) or until button press. Participants classified the emotional expressions and rated their valence and arousal. Group differences in categorization were observed at both presentation times. Psychopathic patients performed worst with briefly presented sad expressions. Moreover, their dimensional evaluations resulted in less positive ratings for happy expressions and less arousal for angry expressions compared with the responses of non-psychopathic and normal subjects. These results shed light on the mechanism possibly underlying the emotional deficits in psychopathic women.

5.
The importance of the right hemisphere in emotion perception in general has been well documented but its precise role is disputed. We compared the performance of 30 right hemisphere damaged (RHD) patients, 30 left hemisphere damaged (LHD) patients, and 50 healthy controls on both facial and vocal affect perception tasks of specific emotions. Brain damaged subjects had a single episode cerebrovascular accident localised to one hemisphere. The results showed that right hemisphere patients were markedly impaired relative to left hemisphere and healthy controls on test performance: labelling and recognition of facial expressions and recognition of emotions conveyed by prosody. This pertained at the level of individual basic emotions, positive versus negative, and emotional expressions in general. The impairment remained highly significant despite covarying for the group's poorer accuracy on a neutral facial perception test and identification of neutral vocal expressions. The LHD group were only impaired relative to controls on facial emotion tasks when their performance was summed over all the emotion categories and before age and other cognitive factors were taken into account. However, on the prosody test the LHD patients showed significant impairment, performing mid-way between the right hemisphere patients and healthy comparison group. Recognition of positive emotional expressions was better than negative in all subjects, and was not relatively poorer in the LHD patients. Recognition of individual emotions in one modality correlated weakly with recognition in another, in all three groups. These data confirm the primacy of the right hemisphere in processing all emotional expressions across modalities--both positive and negative--but suggest that left hemisphere emotion processing is modality specific. It is possible that the left hemisphere has a particular role in the perception of emotion conveyed through meaningful speech.

6.

Objective

We investigated the deficit in the recognition of facial emotions in a sample of medicated, stable Korean patients with schizophrenia using Korean facial emotion pictures and examined whether the possible impairments would corroborate previous findings.

Methods

Fifty-five patients with schizophrenia and 62 healthy control subjects completed the Facial Affect Identification Test with a new set of 44 colored photographs of Korean faces including the six universal emotions as well as neutral faces.

Results

Korean patients with schizophrenia showed impairments in the recognition of sad, fearful, and angry faces [F(1,114)=6.26, p=0.014; F(1,114)=6.18, p=0.014; F(1,114)=9.28, p=0.003, respectively], but their accuracy was no different from that of controls in the recognition of happy emotions. Higher total and three subscale scores of the Positive and Negative Syndrome Scale (PANSS) correlated with worse performance on both angry and neutral faces. Correct responses on happy stimuli were negatively correlated with negative symptom scores of the PANSS. Patients with schizophrenia also exhibited different patterns of misidentification relative to normal controls.
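For readers unfamiliar with the statistics reported above, the following sketch shows how a two-group comparison (yielding an F statistic like those quoted) and a symptom-accuracy correlation can be computed with SciPy. The accuracy and PANSS values are randomly generated placeholders, not the study's data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Placeholder recognition accuracies (proportion correct) per subject.
    patients = rng.normal(0.65, 0.10, 55)   # 55 patients with schizophrenia
    controls = rng.normal(0.75, 0.10, 62)   # 62 healthy controls

    # One-way ANOVA with two groups; the paper reports such tests as F(1, df).
    f_value, p_value = stats.f_oneway(patients, controls)
    print(f"F = {f_value:.2f}, p = {p_value:.3f}")

    # Pearson correlation between symptom severity and recognition accuracy.
    panss_total = rng.normal(70, 15, 55)    # placeholder PANSS totals
    r, p = stats.pearsonr(panss_total, patients)
    print(f"r = {r:.2f}, p = {p:.3f}")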

Conclusion

These findings were consistent with previous studies carried out with different ethnic groups, suggesting cross-cultural similarities in facial recognition impairment in schizophrenia.

7.
Facial expressions of emotion display a wealth of important social information that we use to guide our social judgements. The aim of the current study was to investigate whether patients with orbitofrontal cortex (OFC) lesions exhibit an impaired ability to judge the approachability of emotional faces. Furthermore, we also intended to establish whether impaired approachability judgements provided to emotional faces emerged in the presence of preserved explicit facial expression recognition. Using non-parametric statistics, we found that patients with OFC lesions had a particular difficulty using negative facial expressions to guide approachability judgements, compared to healthy controls and patients with frontal lesions sparing the OFC. Importantly, this deficit arose in the absence of an explicit facial expression recognition deficit. In our sample of healthy controls, we also demonstrated that the capacity to recognise facial expressions was not significantly correlated with approachability judgements given to emotional faces. These results demonstrate that the integrity of the OFC is critical for the appropriate assessment of approachability from negatively valenced faces and this ability is functionally dissociable from the capacity to explicitly recognise facial expressions.

8.
Long-term memory (LTM) is enhanced for emotional information, but the influence of stimulus emotionality on short-term memory (STM) is less clear. We examined the electrophysiological correlates of improved visual STM for emotional face identity, focusing on the P1, N170, P3b and N250r event-related potential (ERP) components. These correlates are taken to indicate which memory processing stages and cognitive processes contribute to the improved STM for emotional face identity. In the encoding phase, one or three angry, happy or neutral faces were presented for 2 s, resulting in a memory load of one or three. The subsequent 1-s retention phase was followed by a 2-s retrieval phase, in which participants indicated whether a probe face had been present or not during encoding. Memory performance was superior for angry and happy faces over neutral faces at load three. None of the ERP components during encoding were affected by facial expression. During retrieval, the early P3b was decreased for emotional compared to neutral faces, which presumably reflects greater resource allocation to the maintenance of the emotional faces. Furthermore, the N250r during retrieval was increased for emotional compared to neutral faces, reflecting an enhanced repetition effect for emotional faces. These findings suggest that enhanced visual STM for emotional faces arises from improved maintenance and from improved detection of face repetition at retrieval.
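As a minimal sketch of how the amplitude of an ERP component such as the P3b is typically quantified from epoched EEG, the code below averages trials into an ERP and takes the mean voltage in a component time window. The sampling rate, window, and data are illustrative assumptions, not details taken from the study.

    import numpy as np

    fs = 500                                   # assumed sampling rate (Hz)
    times = np.arange(-0.2, 0.8, 1.0 / fs)     # epoch from -200 to 800 ms

    def mean_component_amplitude(epochs, times, window):
        # Average trials into an ERP, then take the mean amplitude (microvolts)
        # inside a component time window, e.g. (0.30, 0.60) s for a P3b-like window.
        erp = epochs.mean(axis=0)
        mask = (times >= window[0]) & (times < window[1])
        return erp[mask].mean()

    # Placeholder epochs (trials x samples) for two conditions at one electrode.
    rng = np.random.default_rng(1)
    emotional_epochs = rng.normal(0.0, 5.0, (60, times.size))
    neutral_epochs = rng.normal(0.0, 5.0, (60, times.size))
    print(mean_component_amplitude(emotional_epochs, times, (0.30, 0.60)),
          mean_component_amplitude(neutral_epochs, times, (0.30, 0.60)))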

9.
Previous work has suggested that elevated levels of trait anxiety are associated with an increased ability to accurately recognize the facial expression of fear. However, to date, recognition has only been assessed after viewing periods of 10 s, despite the fact that the process of emotion recognition from faces typically takes a fraction of this time. The current study required participants with either high or low levels of non-clinical trait anxiety to make speeded emotional classification judgments to a series of facial expressions drawn from seven emotional categories. Following previous work, it was predicted that recognition of fearful facial expressions would be more accurate in the high-trait anxious group than in the low-trait anxious group. However, contrary to this prediction, no anxiety-related differences in emotion perception were observed for any of the seven emotions. This suggests that anxiety does not influence the perception of fear as has been previously proposed.

10.
The recognition of facial immaturity and emotional expression by children with autism, language disorders, mental retardation, and non-disabled controls was studied in two experiments. Children identified immaturity and expression in upright and inverted faces. The autism group identified fewer immature faces and expressions than control (Exp. 1 & 2), language disordered (Exp. 1), and mental retardation (Exp. 2) groups. Facial inversion interfered with all groups’ recognition of facial immaturity and with control and language disordered groups’ recognition of expression. Error analyses (Exp. 1 & 2) showed similarities between autism and other groups’ perception of immaturity but differences in perception of expressions. Reasons for similarities and differences between children with and without autism when perceiving facial immaturity and expression are discussed.

11.
The neuropeptide oxytocin has been shown to improve many aspects of social cognitive functioning, including facial emotion recognition, and to promote social approach behaviour. In the present study, we investigated the modulatory effects of oxytocin on the recognition of briefly presented facial expressions. In order to diversify the degree of visual awareness for the facial stimuli, presentation duration was systematically varied. Fifty-six participants were administered intranasal oxytocin or a placebo in a double-blind, randomized, between-subjects design. Participants viewed angry and happy target faces or neutral distractors for 18, 35, or 53 ms subsequently masked by neutral faces. Participants had to indicate the presence or absence of the briefly presented target face. Discrimination indices (d') showed that oxytocin generally enhanced detection accuracy of emotional stimuli. This effect was more pronounced for the recognition of happy faces. We provide evidence that a single dose of intranasally administered oxytocin enhances detection of briefly presented emotional stimuli. The possible role of stimulus valence and recognition difficulty is discussed.
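The discrimination index d' mentioned above comes from signal detection theory: d' = z(hit rate) - z(false-alarm rate). A minimal sketch of its computation follows; the 0.5 correction used to keep rates away from 0 and 1 is one common convention and may differ from the study's exact procedure, and the counts are invented.

    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # Sensitivity index d' = z(hit rate) - z(false-alarm rate), with a small
        # correction so that rates of exactly 0 or 1 do not produce infinities.
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Invented counts for one participant detecting briefly presented happy targets.
    print(d_prime(hits=40, misses=20, false_alarms=10, correct_rejections=50))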

12.

Objective

Previous studies have reported gender differences in facial emotion recognition in healthy people, with women performing better than men. The few studies that have examined gender differences in facial emotion recognition in schizophrenia have yielded inconsistent findings. The aim of this study was to investigate gender differences in facial emotion identification and discrimination abilities in patients with schizophrenia.

Methods

35 female and 35 male patients with schizophrenia, along with 35 female and 35 male healthy controls, were included in the study. All subjects were evaluated with the Facial Emotion Identification Test (FEIT), the Facial Emotion Discrimination Test (FEDT), and the Benton Facial Recognition Test (BFRT). Patients' psychopathological symptoms were rated by means of the Positive and Negative Syndrome Scale (PANSS).

Results

Male patients performed significantly worse than female patients on FEIT total and negative scores. Male controls performed significantly worse than female controls on FEIT total and negative scores. On all tasks, female patients performed comparably to controls. Male patients performed significantly worse than controls on the FEIT and the FEDT.

Conclusion

Women with schizophrenia outperformed men in facial emotion recognition, in a pattern similar to that of the healthy controls. Male patients with schizophrenia may therefore need special consideration for emotion perception deficits.

13.
Hsu SM, Pessoa L. Neuropsychologia. 2007;45(13):3075-3086
While the role of attention in determining the neural fate of unattended emotional items has been investigated in the past, it remains unclear whether bottom-up and top-down factors have differential effects in shaping responses evoked by such stimuli. To study the effects of bottom-up and top-down factors on the processing of neutral and fearful faces, we employed functional magnetic resonance imaging (fMRI) while participants performed attentional tasks that manipulated these factors. To probe the impact of top-down mechanisms on the processing of face distractors, target letters either had to be found among several distinct nontarget letters (attentional load condition) or among identical nontarget letters (baseline condition). To probe the impact of bottom-up factors, we decreased the salience of the targets by reducing their size and contrast relative to baseline (salience condition). Our findings revealed that bottom-up and top-down manipulations produced dissociable effects on amygdala and fusiform gyrus responses to fearful-face distractors when task difficulty was equated. When the attentional load of the main task was high, weaker responses were evoked by fearful-face distractors relative to baseline during the early trials. By contrast, decreasing target salience resulted in increased responses relative to baseline. The present findings suggest that responses evoked by unattended fearful faces are modulated by several factors, including attention and stimulus salience.

14.

Background

Depressed patients are characterized both by maladaptive schemas that distort social reality and by impairments in facial expression recognition. The aim of the present study was to identify specific associations among depression symptom severity, early maladaptive schemas, and patterns of recognition of facially expressed emotions.

Methods

The subjects were inpatients diagnosed with depression. We used two virtual humans to present the basic emotions for the assessment of emotion recognition. The Symptom Check List 90 (SCL-90) was used as a self-report measure of psychiatric symptoms, and the Beck Depression Inventory (BDI) was applied to assess symptoms of depression. The Young Schema Questionnaire Long Form (YSQ-L) was used to assess the presence of early maladaptive schemas.

Results

The recognition rate for happiness showed significant associations with both the BDI and the depression subscale of the SCL-90. After performing a second-order factor analysis of the YSQ-L, we found statistically significant associations between the recognition indices of specific emotions and the main factors of the YSQ-L.

Conclusions

In this study we found correlations between maladaptive schemas and emotion recognition impairments. While both domains likely contribute to the symptoms of depression, we believe that the results will help us to better understand the social cognitive deficits of depressed patients at the schema level and at the emotion recognition level.

15.
Many studies have provided evidence that the emotional content of visual stimuli modulates behavioral performance and neuronal activity. Surprisingly, these studies were carried out using stimuli presented in the center of the visual field, while the majority of visual events first appear in the peripheral visual field. In this study, we assessed the impact of the emotional facial expression of fear when projected in the near and far periphery. Sixteen participants were asked to categorize fearful and neutral faces projected at four peripheral visual locations (15° and 30° of eccentricity on the right and left sides of the visual field) while reaction times and event-related potentials (ERPs) were recorded. ERPs were analyzed by means of spatio-temporal principal component and baseline-to-peak methods. Behavioral data confirmed the decrease in performance with eccentricity and showed that fearful faces induced shorter reaction times than neutral ones. Electrophysiological data revealed that the spatial position and the emotional content of faces modulated ERP components. In particular, the amplitude of the N170 was enhanced by fearful facial expressions. These findings shed light on how visual eccentricity modulates the processing of emotional faces and suggest that, despite impoverished visual conditions, the preferential neural coding of fearful facial expressions still persists in far peripheral vision. The emotional content of faces could therefore contribute to their foveal or attentional capture, as in social interactions.
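A minimal sketch of a sequential spatio-temporal principal component analysis of the kind named above: a temporal PCA treats time points as variables, and a spatial PCA of the resulting factor scores treats electrodes as variables. The array shapes and random data are assumptions for illustration, not the authors' pipeline.

    import numpy as np
    from sklearn.decomposition import PCA

    # Placeholder ERP data: (subjects x conditions) observations, electrodes, samples.
    rng = np.random.default_rng(2)
    n_obs, n_electrodes, n_samples = 16 * 8, 32, 300
    erp = rng.normal(0.0, 1.0, (n_obs, n_electrodes, n_samples))

    # Temporal PCA: time points are the variables; each (observation, electrode)
    # waveform is one case.
    temporal_matrix = erp.reshape(n_obs * n_electrodes, n_samples)
    temporal_pca = PCA(n_components=5).fit(temporal_matrix)
    temporal_scores = temporal_pca.transform(temporal_matrix)

    # Spatial PCA on the scores of the first temporal factor: electrodes are the
    # variables, one row per observation.
    factor0_scores = temporal_scores[:, 0].reshape(n_obs, n_electrodes)
    spatial_pca = PCA(n_components=3).fit(factor0_scores)
    print(temporal_pca.explained_variance_ratio_, spatial_pca.explained_variance_ratio_)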

16.
Previous studies indicate that Multiple Complex Developmental Disorder (MCDD) children differ from PDD-NOS and autistic children on a symptom level and on psychophysiological functioning. Children with MCDD (n = 21) and PDD-NOS (n = 62) were compared on two facets of social-cognitive functioning: identification of neutral faces and facial expressions. Few significant group differences emerged. Children with PDD-NOS demonstrated a more attention-demanding strategy of face processing, and processed neutral faces more similarly to complex patterns, whereas children with MCDD showed an advantage for face recognition compared to complex patterns. Results further suggested that any disadvantage in face recognition was related more to the autistic features of the PDD-NOS group than to characteristics specific to MCDD. No significant group differences emerged for identifying facial expressions. This work was conducted in the Department of Child and Adolescent Psychiatry, Erasmus Medical Center/Sophia Children’s Hospital, Rotterdam.

17.
Central nervous system habituation in humans was studied using functional magnetic resonance imaging and repeated presentations of single fearful and neutral faces. Habituation of blood-oxygenation-level-dependent (BOLD) signal during exposure to face stimuli, collapsed over fearful and neutral expressions, was evident in the right amygdala and hippocampus, as well as in the medial/inferior temporal cortex bilaterally. In the hippocampus, significantly greater habituation was evident on the right as compared to the left side, which could reflect the visual nature of the stimuli. There were no time by expression interaction effects in these regions, suggesting similar neural attenuation rates to fearful and neutral face stimuli. These results indicate that brain regions involved in novelty detection and memory processing habituate at similar rates regardless of whether the face in focus displays an aversive emotional expression or not.

18.
The neural substrates that subserve decoding of different emotional expressions are subject to different rates of degeneration and atrophy in Alzheimer's disease (AD), and there is therefore reason to anticipate that a differentiated profile of affect recognition impairment may emerge. However, it remains unclear whether AD differentially affects the recognition of specific emotions. Further, there is only limited research on whether affect recognition deficits in AD generalize to more ecologically valid stimuli. In the present study, participants with relatively mild AD (n = 24), older controls (n = 30) and younger controls (n = 30) were administered measures of affect recognition. Significant AD deficits were observed relative to both the younger and older control groups on a measure that involved labeling of static images of facial affect. AD deficits on this measure were observed for all emotions assessed (anger, sadness, happiness, surprise and fear), with the exception of disgust, which was preserved even relative to the younger adult group. The relative preservation of disgust could not be attributed to biases in the choice of labels made, and it is suggested instead that this finding might reflect the relative sparing of the basal ganglia in AD. No significant AD effect was observed for the more ecologically valid measure, which involved dynamic displays of facial expressions in conjunction with paralinguistic and body movement cues, although a trend for greater AD difficulty was observed.

19.
Seizure. 2014;23(10):892-898
Purpose

To describe the visual scanning pattern for facial identity recognition (FIR) and facial emotion recognition (FER) in patients with idiopathic generalized epilepsy (IGE) and mesial temporal lobe epilepsy (MTLE). A secondary endpoint was to correlate the results with cognitive function.

Methods

The Benton Facial Recognition Test (BFRT) and the Ekman & Friesen series were used for FIR and FER, respectively, in 23 controls, 20 IGE patients and 19 MTLE patients. Eye movements were recorded by a Hi-Speed eye-tracker system. Neuropsychological tools explored cognitive function.

Results

The correct FIR rate was 78% in controls, 70.7% in IGE patients and 67.4% (p = 0.009) in MTLE patients. FER hits reached 82.7% in the control, 74.3% in the IGE (p = 0.006) and 73.4% in the MTLE (p = 0.002) groups. IGE patients failed in disgust (p = 0.005) and MTLE patients in fear (p = 0.009) and disgust (p = 0.03). FER correlated with neuropsychological scores, particularly verbal fluency (r = 0.542, p < 0.001). Eye-tracking revealed that, for FIR, controls scanned faces more diffusely than IGE and MTLE patients, who tended toward the top facial areas. A longer scanning of the top facial area was found in all three groups for FER. The gap between top and bottom facial region fixation times decreased in MTLE patients, who made more but shorter fixations in the bottom facial region. However, none of these findings were statistically significant.

Conclusion

FIR was impaired in MTLE patients, and FER in both IGE and MTLE patients, particularly for fear and disgust. Although the differences were not statistically significant, those with impaired FER tended to scan the faces more diffusely and to have cognitive dysfunction.
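As an illustration of how the top versus bottom fixation-time gap described above can be summarized from raw eye-tracking output, the sketch below aggregates fixation durations by area of interest (AOI) for one trial. The AOI labels and durations are invented; a real analysis would first map gaze coordinates onto face regions.

    from collections import defaultdict

    # Invented fixations for one trial: (area_of_interest, duration_ms).
    fixations = [("top", 310), ("top", 240), ("bottom", 120),
                 ("top", 180), ("bottom", 95)]

    def fixation_time_share(fixations):
        # Return each AOI's share of the total fixation time for a trial.
        totals = defaultdict(float)
        for aoi, duration in fixations:
            totals[aoi] += duration
        grand_total = sum(totals.values())
        return {aoi: t / grand_total for aoi, t in totals.items()}

    shares = fixation_time_share(fixations)
    gap = shares.get("top", 0.0) - shares.get("bottom", 0.0)   # top-minus-bottom gap
    print(shares, gap)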

20.
Social Neuroscience. 2013;8(4):436-443
Both emotional images and human faces are particularly salient compared to other categories of visual stimuli. The late positive potential (LPP) is larger for emotional than neutral images, and some evidence suggests that the LPP is further enhanced for images containing people. Studies of emotion frequently compare pleasant and unpleasant IAPS pictures to neutral ones, without an explicit understanding of how the presence of faces in these images may affect attentional allocation and psychophysiological response. The present experiment examined the effect of faces on the LPP elicited by neutral and threatening IAPS images. The LPP was enhanced by faces in neutral images, but no difference was observed between threatening images with and without faces. These results demonstrate that the inclusion of faces in IAPS images significantly impacts the LPP; however, this effect is unique to neutral images.
