Similar literature
20 similar documents found (search time: 15 ms)
1.
The present study assessed whether individuals mimic and show emotional contagion in response to relatively weak and idiosyncratic dynamic facial expressions of emotion similar to those encountered in everyday life. It also addressed whether mimicry leads to emotional contagion and in turn facilitates emotion recognition. Forty-one female participants rated a series of short video clips of stimulus persons expressing anger, sadness, disgust, and happiness with respect to the emotions expressed. An unobtrusive measure of emotional contagion was taken. Evidence for mimicry was found for all types of expressions, and evidence for emotional contagion was found for happiness and sadness. Mediational analyses could not confirm any relation between mimicry and emotional contagion, or between mimicry and emotion recognition.
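The mediational analysis in study 1 can be illustrated with a minimal indirect-effect (a × b) sketch. The variable roles (mimicry → contagion → recognition) follow the abstract, but the function, its data, and the plain least-squares approach are illustrative assumptions, not the study's actual procedure.

```python
import numpy as np

def indirect_effect(x, m, y):
    """Simple mediation sketch: indirect effect = a * b.

    a is the slope of mediator ~ predictor (e.g. contagion ~ mimicry);
    b is the slope of outcome ~ mediator, controlling for the predictor
    (e.g. recognition ~ contagion + mimicry). Variable roles are
    illustrative, not the study's data.
    """
    # a-path: regress the mediator on the predictor (with intercept).
    Xa = np.column_stack([np.ones(len(x)), x])
    a = np.linalg.lstsq(Xa, m, rcond=None)[0][1]
    # b-path: regress the outcome on mediator and predictor.
    Xb = np.column_stack([np.ones(len(x)), m, x])
    b = np.linalg.lstsq(Xb, y, rcond=None)[0][1]
    return a * b
```

A real mediation analysis would additionally bootstrap a confidence interval around the indirect effect rather than report the point estimate alone.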

2.
Twenty-seven female undergraduates completed three tasks: (1) feel four emotions (happiness, sadness, anger, peacefulness); (2) express these emotions without trying to feel them; and (3) feel and clearly express these four emotions. During each trial, subjects pressed a button to indicate when they had reached the required state, and the latency from emotion cue to button press was measured. Heart rate, skin conductance and EMG from four facial sites (brow, cheek, jaw and mouth) were recorded for 15 s before and after the button press and during a baseline period prior to each trial. Self-reports were obtained after each trial. Facial EMG and patterns of autonomic arousal differentiated among the four emotions within each task. Shorter self-generation latency in the Feel-and-Show versus the Feel condition indicated the facilitative effect of facial expression on the self-generation of emotion. Furthermore, the presence of autonomic changes and self-reported affect in the Show condition supports the sufficiency version of the facial feedback hypothesis. The self-generation method employed as an emotion elicitor was shown to reliably induce emotional reactions and is proposed as a useful technique for eliciting various emotional states in the laboratory.

3.
Alexithymic individuals have difficulties identifying and verbalizing their emotions. The amygdala is known to play a central role in processing emotion stimuli and in generating emotional experience. In the present study, automatic amygdala reactivity to facial emotion was investigated as a function of alexithymia (assessed with the 20-item Toronto Alexithymia Scale). The Beck Depression Inventory (BDI) and the State-Trait Anxiety Inventory (STAI) were administered to measure participants' depressivity and trait anxiety. During 3T fMRI scanning, pictures of faces bearing sad, happy, and neutral expressions masked by neutral faces were presented to 21 healthy volunteers. The amygdala was selected as the region of interest (ROI); voxel values of the ROI were extracted, summarized by their mean, and compared across conditions. A detection task was applied to assess participants' awareness of the masked emotional faces shown in the fMRI experiment. Masked sad and happy facial emotions led to greater right amygdala activation than masked neutral faces. The alexithymia facet "difficulty identifying feelings" was negatively correlated with the neural response of the right amygdala to masked sad faces, even when controlling for depressivity and anxiety. Reduced automatic amygdala responsivity may contribute to problems in identifying one's emotions in everyday life. Low spontaneous reactivity of the amygdala to sad faces could implicate less engagement in the encoding of negative emotional stimuli.
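The correlation "controlling for depressivity and anxiety" described in study 3 amounts to a partial correlation. A minimal sketch, assuming simple least-squares residualization; the function name and inputs are hypothetical:

```python
import numpy as np

def partial_corr(x, y, covars):
    """Correlation between x and y after regressing out covariates.

    x, y: 1-D arrays, e.g. 'difficulty identifying feelings' scores
    and mean right-amygdala response to masked sad faces.
    covars: (n_subjects, n_covariates) array, e.g. BDI and STAI scores.
    All variable roles here are illustrative.
    """
    C = np.column_stack([np.ones(len(x)), covars])
    # Residualize x and y on the covariates via least squares.
    rx = x - C @ np.linalg.lstsq(C, x, rcond=None)[0]
    ry = y - C @ np.linalg.lstsq(C, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]
```

A negative value, as reported in the abstract, would indicate that higher alexithymia scores go with lower amygdala responses once depressivity and anxiety are held constant.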

4.
Previous studies, mainly with Caucasian samples, have shown that facial expressions of emotion are contagious, a phenomenon known as facial mimicry. This study examined facial mimicry using a Japanese sample. Participants were shown a series of Japanese faces (from Matsumoto and Ekman, 1988) on a computer screen expressing "happiness", "sadness", "anger", or "disgust". While participants viewed the facial expressions, electromyograms (EMG) of their faces were recorded to see whether the facial muscles corresponding to the stimulus faces were activated. Consistent with the previous studies using Caucasian samples, all four facial expressions were mimicked. The peak time of mimicry of angry or happy faces was later, while that of disgusted faces was relatively earlier. The potential relation of facial mimicry to "emotional contagion", a social phenomenon whereby subjective feelings transfer between people, is discussed.
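Mimicry in EMG studies like this one is typically quantified as the change in rectified muscle activity relative to a pre-stimulus baseline. A minimal sketch under that assumption; the sampling rate, baseline length, and the function itself are illustrative, not taken from the study:

```python
import numpy as np

def emg_change(rectified, sfreq, baseline_ms=1000):
    """Mean baseline-corrected amplitude of a rectified EMG trace.

    rectified: 1-D array of rectified EMG samples, baseline first.
    sfreq: sampling rate in Hz (an assumed parameter).
    baseline_ms: assumed pre-stimulus baseline duration.
    """
    n = int(baseline_ms / 1000 * sfreq)
    # Post-stimulus mean minus pre-stimulus (baseline) mean.
    return rectified[n:].mean() - rectified[:n].mean()
```

A positive `emg_change` for the zygomaticus while viewing happy faces, or for the corrugator while viewing angry faces, would count as mimicry under this definition.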

5.
Facial mimicry, the tendency to imitate others' facial expressions, has frequently been described as a reflex-like mechanism that functions independently of the relationship between expresser and observer. However, there is also evidence suggesting that it is a social cue regulating social interactions and that, consequently, mimicry varies as a function of social context and the type of emotion expression shown. Two studies were conducted to assess the impact of social group membership and type of expression on facial mimicry. Results suggest that the level of facial mimicry varies as a function of group membership. Moreover, mimicry levels were influenced by the kind of emotion displayed by the expresser. Although participants mimicked happiness displays regardless of the expresser's group membership, negative emotions were either not mimicked at all or mimicked only when shown by an ingroup member.

6.
Four experiments were conducted to determine whether voluntarily produced emotional facial configurations are associated with differentiated patterns of autonomic activity, and if so, how this might be mediated. Subjects received muscle-by-muscle instructions and coaching to produce facial configurations for anger, disgust, fear, happiness, sadness, and surprise while heart rate, skin conductance, finger temperature, and somatic activity were monitored. Results indicated that voluntary facial activity produced significant levels of subjective experience of the associated emotion, and that autonomic distinctions among emotions: (a) were found both between negative and positive emotions and among negative emotions, (b) were consistent between group and individual subjects' data, (c) were found in both male and female subjects, (d) were found in both specialized (actors, scientists) and nonspecialized populations, (e) were stronger when the voluntary facial configurations most closely resembled actual emotional expressions, and (f) were stronger when experience of the associated emotion was reported. The capacity of voluntary facial activity to generate emotion-specific autonomic activity: (a) did not require subjects to see facial expressions (either in a mirror or on an experimenter's face), and (b) could not be explained by differences in the difficulty of making the expressions or by differences in concomitant somatic activity.

7.
Facial expressions are important emotional stimuli during social interactions. Symbolic emotional cues, such as affective words, also convey information regarding emotions that is relevant for social communication. Various studies have demonstrated fast decoding of emotions from words, as was shown for faces, whereas others report a rather delayed decoding of information about emotions from words. Here, we introduced an implicit (color naming) and explicit task (emotion judgment) with facial expressions and words, both containing information about emotions, to directly compare the time course of emotion processing using event-related potentials (ERP). The data show that only negative faces affected task performance, resulting in increased error rates compared to neutral faces. Presentation of emotional faces resulted in a modulation of the N170, the EPN and the LPP components and these modulations were found during both the explicit and implicit tasks. Emotional words only affected the EPN during the explicit task, but a task-independent effect on the LPP was revealed. Finally, emotional faces modulated source activity in the extrastriate cortex underlying the generation of the N170, EPN and LPP components. Emotional words led to a modulation of source activity corresponding to the EPN and LPP, but they also affected the N170 source on the right hemisphere. These data show that facial expressions affect earlier stages of emotion processing compared to emotional words, but the emotional value of words may have been detected at early stages of emotional processing in the visual cortex, as was indicated by the extrastriate source activity.

8.
Two different emotion regulation strategies, cognitive reappraisal and expressive suppression, are strongly associated with increased neural activity in the prefrontal cognitive control network. In this event-related fMRI study, we investigated whether individual differences in habitual reappraisal and suppression tendencies are related to differences in prefrontal cognitive control processes for emotional information. In order to measure cognitive control over inhibiting a dominant response to happy or sad stimuli (in favor of the opposite valence), thirty-one healthy female participants performed the Cued Emotional Conflict Task (CECT). The Emotion Regulation Questionnaire was used to measure individual differences in everyday use of emotion regulation. Results demonstrate that high reappraisers are behaviorally faster and exert more fronto-cingulate activity when inhibiting a response to sad faces (compared to happy faces, FDR corrected). On the other hand, suppression scores are not correlated with performance on CECT trials. Interestingly, suppression scores are associated with higher amygdala activation during the inhibition of a response to sad faces (compared to happy faces). These data suggest that habitual reappraisal is associated with underlying functional cognitive control processes that inhibit a dominant response to negative material. In contrast, the effort to control negative material has negative consequences in individuals who have a tendency to suppress emotions.

9.
Results obtained with a novel emotional Go/NoGo task allowing the investigation of facial mimicry (FM) during the production and inhibition of voluntary smiles are discussed. Healthy participants were asked to smile rapidly to happy faces and maintain a neutral expression to neutral faces, or the reverse. Replicating and extending previous results, happy faces induced FM, as shown by stronger and faster zygomatic activation to happy than to neutral faces in Go trials, and a greater number of false alarms to happy faces in NoGo trials. Facial mimicry effects remained present during participants’ active inhibition of facial movement. Latencies of FM were short, at 126-250 ms in Go trials and 251-375 ms in NoGo trials. The utility of the Go/NoGo task, which allows the assessment of response inhibition in the domain of facial expression by installing strong prepotent motor responses via short stimulus presentation times and a large number of Go trials, is discussed.
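Mimicry onset latencies such as the 126-250 ms reported in study 9 are commonly defined as the first post-stimulus sample at which rectified EMG exceeds a baseline-derived threshold. A hedged sketch of one such rule; the mean + 3 SD threshold and all parameters are assumptions, not the study's method:

```python
import numpy as np

def mimicry_onset_ms(zygo, sfreq, k=3.0, baseline_ms=200):
    """First latency (ms from stimulus onset) at which a rectified
    zygomaticus trace exceeds baseline mean + k * SD.

    zygo: 1-D rectified EMG trace, baseline samples first.
    sfreq: sampling rate in Hz. Returns None if no crossing occurs.
    """
    n_base = int(baseline_ms / 1000 * sfreq)
    base = zygo[:n_base]
    thresh = base.mean() + k * base.std()
    # Index of the first post-stimulus sample above threshold.
    above = np.flatnonzero(zygo[n_base:] > thresh)
    if above.size == 0:
        return None
    return above[0] / sfreq * 1000
```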

10.
Pictures of emotional facial expressions or natural scenes are often used as cues in emotion research. We examined the extent to which these different stimuli engage emotion and attention, and whether the presence of social anxiety symptoms influences responding to facial cues. Sixty participants reporting high or low social anxiety viewed pictures of angry, neutral, and happy faces, as well as violent, neutral, and erotic scenes, while skin conductance and event-related potentials were recorded. Acoustic startle probes were presented throughout picture viewing, and blink magnitude, probe P3 and reaction time to the startle probe also were measured. Results indicated that viewing emotional scenes prompted strong reactions in autonomic, central, and reflex measures, whereas pictures of faces were generally weak elicitors of measurable emotional response. However, higher social anxiety was associated with modest electrodermal changes when viewing angry faces and mild startle potentiation when viewing either angry or smiling faces, compared to neutral. Taken together, pictures of facial expressions do not strongly engage fundamental affective reactions, but these cues appeared to be effective in distinguishing between high and low social anxiety participants, supporting their use in anxiety research.

11.
Preliminary studies have demonstrated that school-aged children (average age 9-10 years) show mimicry responses to happy and angry facial expressions. The aim of the present study was to assess the feasibility of using facial electromyography (EMG) to study facial mimicry responses in younger children, aged 6-7 years, to emotional facial expressions of other children. Facial EMG activity to the presentation of dynamic emotional faces was recorded from the corrugator, zygomaticus, frontalis and depressor muscles in sixty-one healthy participants aged 6-7 years. Results showed that the presentation of angry faces was associated with corrugator activation and zygomaticus relaxation, happy faces with an increase in zygomaticus and a decrease in corrugator activation, fearful faces with frontalis activation, and sad faces with a combination of corrugator and frontalis activation. This study demonstrates the feasibility of measuring facial EMG responses to emotional facial expressions in 6- to 7-year-old children.

12.
A considerable body of research has focused on neural responses evoked by emotional facial expressions, but little is known about mother-specific brain responses to infant facial emotions. We used near-infrared spectroscopy to investigate prefrontal activity while 14 mothers and 14 age-matched women who had never been pregnant (non-mothers) discriminated happy, angry, sad, fearful, surprised, and neutral facial expressions of unfamiliar infants and unfamiliar adults. Our results revealed that discriminating infant facial emotions increased the relative oxyHb concentration in mothers' right prefrontal cortex but not in their left prefrontal cortex, compared with both sides of the prefrontal cortices of non-mothers. However, there was no difference between mothers and non-mothers in right or left prefrontal cortex activation while viewing adult facial expressions. These results suggest that the right prefrontal cortex is involved in human maternal behavior concerning infant facial emotion discrimination.

13.
Alpha brain oscillation modulation was analyzed in response to masked emotional facial expressions. In addition, the behavioural activation (BAS) and behavioural inhibition (BIS) systems were considered as explicative factors to verify the effect of motivational significance on cortical activity. Nineteen subjects were presented with a wide range of facial expressions of emotion (anger, fear, surprise, disgust, happiness, sadness, and neutral). The results demonstrated that anterior frontal sites were more active than central and posterior sites in response to facial stimuli. Moreover, right-side responses varied as a function of emotion type, with increased right-frontal activity for negative emotions. Finally, whereas higher-BIS subjects generated more right-hemisphere activation for some negative emotions (such as fear, anger, and surprise), Reward-BAS subjects were more responsive to positive emotion (happiness) within the left hemisphere. The valence and potential threatening power of facial expressions were considered to elucidate these cortical differences.

14.
It remains an open question whether a single brain operation or psychological function for facial emotion decoding can be assigned to a certain type of oscillatory activity. Gamma band activity (GBA) offers an adequate tool for studying cortical activation patterns during the processing of emotional face information. In the present study, brain oscillations were analyzed in response to facial expressions of emotion. Specifically, GBA modulation was measured while twenty subjects looked at emotional (angry, fearful, happy, and sad) or neutral faces in two different conditions: supraliminal (150 ms) vs. subliminal (10 ms) stimulation (100 target-mask pairs per condition). The results showed that both consciousness and the significance of the stimulus in terms of arousal can modulate power synchronization (ERD decrease) during the 150-350 ms time range: an early oscillatory event peaked at about 200 ms post-stimulus. GBA was enhanced more by supraliminal than subliminal elaboration, and more by high-arousal (anger and fear) than low-arousal (happiness and sadness) emotions. Finally, a left-posterior dominance for conscious elaboration was found, whereas the right hemisphere discriminated emotional from neutral faces.
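Event-related (de)synchronization of band power, as analyzed in study 14, is conventionally expressed as the percentage power change relative to a pre-stimulus baseline. A minimal sketch; the 150-350 ms window follows the abstract, but the baseline interval and the function itself are illustrative:

```python
import numpy as np

def erd_percent(power, times, baseline=(-0.3, 0.0), window=(0.15, 0.35)):
    """Percent band-power change in a post-stimulus window relative
    to a pre-stimulus baseline.

    power: 1-D array of band power per sample; times: matching
    array of times in seconds (stimulus onset at 0). Negative values
    indicate event-related desynchronization (ERD), positive values
    synchronization.
    """
    bmask = (times >= baseline[0]) & (times < baseline[1])
    wmask = (times >= window[0]) & (times < window[1])
    p0 = power[bmask].mean()
    return (power[wmask].mean() - p0) / p0 * 100.0
```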

15.
Participants watching a facial expression of emotion tend to respond with the same facial expression. This facial concordance is well established for happiness, but not for other emotions. The present study used an average-face method to investigate whether facial expressions of the six basic emotions induce facial concordance. Facial reactions of 20 subjects were videotaped while they watched facial expressions of the six emotions, performed by amateur actors, on a computer display. Six average faces were then made from the subjects' facial reactions to each displayed emotion. Sixty-two newly recruited subjects were asked to classify these average faces into six categories of emotion. Facial concordance was found for happiness and surprise, but not for disgust and fear. For the average faces of anger and sadness, however, classifications were split across several categories. This suggests that the average faces may have blended ambiguous or differing expressions. Re-classification using individual faces rather than average faces may therefore be necessary.

16.
The present study was designed to evaluate whether aversively conditioned responses to facial stimuli are detectable in all three components of the emotional response system, i.e. the expressive/behavioral, the physiological/autonomic and the cognitive/experiential component of emotion. Two groups of subjects were conditioned to angry or happy facial expression stimuli using a 100 dB noise as the UCS in a differential aversive conditioning paradigm. The three components of the emotional response system were measured as: facial EMG reactions (corrugator and zygomatic muscle regions); autonomic activity (skin conductance, SCR; SCR half-recovery time, T/2; heart rate, HR); and ratings of experienced emotion. Responses in all components of the emotional response system were detectable in the angry group as greater EMG and autonomic resistance to extinction and greater self-reported fear. More specifically, the angry group showed a resistant conditioning effect for the corrugator muscle that was accompanied by resistant conditioning for SCR frequency, slower SCR recovery, resistant conditioning in HR, and higher self-reported fear than the happy group. Thus, aversive conditioning to angry facial stimuli induces a uniform negative emotional response pattern, as indicated by all three components of the emotional response system. These data suggest that a negative 'affect program' triggers responses in the different emotional components. The results suggest that human subjects are biologically prepared to react with a negative emotion to angry facial stimuli.

17.
Introduction: The study investigates how benzodiazepine (BZD) use and detoxification affect empathy and the recognition and intensity rating of emotional facial expressions. The sample comprised 43 participants in three groups: (1) during detoxification (N = 13), (2) after detoxification (N = 15), (3) a matched control group (N = 15). Clinical subjects were recruited from in-patients of an addiction treatment unit.

Methods: Empathy levels were tested with the Empathy Quotient (EQ-Short). Recognition accuracy and emotion intensity rating were based on a computerised task displaying static and dynamic facial expressions of joy, anger, sadness, and fear.

Results: The controls proved more accurate than both experimental groups in identifying facial expressions of negative emotions. Joy recognition proved most accurate overall. Among the clinical subjects, women in particular exhibited an impaired ability to correctly identify negative emotions from facial expressions. Dynamic stimuli were better recognised than static ones albeit only in the experimental groups. No significant differences were found for emotion intensity ratings and EQ scores.

Conclusion: Our findings suggest that impaired facial emotion recognition accuracy is not caused by deficits in empathy. No improvement was recorded post-detoxification, which may indicate impaired interpersonal functioning among BZD users. Further research is warranted in light of this study’s limitations.

18.
Sleep deprivation impacts subjective mood states, but very little research has examined the impact on processing emotional information. In the current study, we investigated the impact of total sleep deprivation on neural responses to emotional facial expressions as well as the accuracy and speed with which these faces were categorized. Forty-nine participants completed two tasks in which they were asked to categorize emotional facial expressions as Happy, Sad, Angry, or Fearful. They were shown the ‘full’ expression of the emotions in one task and more subtle expressions in a second task in which expressions were ‘morphed’ with neutral faces so that the intensity of emotion varied. It was expected that sleep deprivation would lead to greater reactivity (indexed by larger amplitude N170 event-related potentials), particularly for negative and more subtle facial expressions. In the full face task, sleep-deprived (SD) participants were significantly less accurate than controls (C) at identifying Sad faces and slower to identify all emotional expressions. P1 was smaller and N170 was larger for the SD compared to C group, but for all emotions, indicating generalized impairment in low-level visual processing. In the more difficult morphed face task, SD participants were less accurate than C participants for Sad faces; as well, the group difference in reaction time was greatest for Sad faces. For the SD group, N170 increased in amplitude with increasing perceptual difficulty for the Fearful and Angry faces, but decreased in amplitude with increasing difficulty for Sad faces. These data illustrate that sleep deprivation led to greater neural reactivity for the threat-related negative emotions as they became more subtle; however, there was a failure to engage these perceptual resources for the processing of Sad faces. Sleep loss preferentially impacted the processing of Sad faces; this has widespread implications for sleep-deprived groups.
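ERP component effects like the N170 modulations in study 18 are usually quantified from the waveform within a component-typical time window. A sketch assuming a 130-200 ms N170 window; the window bounds, units, and function are illustrative, not the study's parameters:

```python
import numpy as np

def n170_peak(erp, times, window=(0.13, 0.20)):
    """Latency and amplitude of the most negative deflection within
    an assumed N170 window.

    erp: 1-D waveform (e.g. microvolts); times: matching array in
    seconds. Returns (peak_latency_s, peak_amplitude).
    """
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.flatnonzero(mask)
    # Most negative sample inside the window.
    peak = idx[np.argmin(erp[idx])]
    return times[peak], erp[peak]
```

A "larger N170" in the abstract's sense corresponds to a more negative peak amplitude returned by this function.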

19.

Objective

To explore the feasibility of applying an experimental design to study the relationship between non-verbal emotions and empathy development in simulated consultations.

Method

In video-recorded simulated consultations, twenty clinicians were randomly allocated to either an experimental group (instructed to mimic the non-verbal emotions of a simulated patient, SP) or a control group (no such instruction). Baseline empathy scores were obtained before each consultation, and relational empathy was rated by the SP after the consultation. Multilevel logistic regression modelled the probability of mimicry occurrence, controlling for baseline empathy and clinical experience. ANCOVA compared group differences on relational empathy and consultation smoothness.

Results

Instructed mimicry lasted longer than spontaneous mimicry. Mimicry was marginally related to improved relational empathy. The SP felt treated more like a whole person during consultations with spontaneous mimicry, and clinicians who displayed spontaneous mimicry felt their consultations went more smoothly.

Conclusion

The experimental approach improved our understanding of how non-verbal emotional mimicry contributed to relational empathy development during consultations. Further work should ascertain the potential of instructed mimicry to enhance empathy development.

Practice implications

Understanding how non-verbal emotional mimicry impacts on patients’ perceived clinician empathy during consultations may inform training and intervention programme development.
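The ANCOVA step in study 19's Method, comparing groups on relational empathy while adjusting for baseline empathy, can be sketched as an ordinary-least-squares model with a group indicator and a covariate. The variable names and data below are hypothetical, not the study's:

```python
import numpy as np

def adjusted_group_effect(y, group, covar):
    """ANCOVA-style adjusted group difference via least squares:
    outcome ~ intercept + group + covariate.

    y could be the SP's relational-empathy rating, group the
    instructed-mimicry indicator (0/1), covar the clinician's
    baseline empathy score; all roles are illustrative.
    Returns the group coefficient (covariate-adjusted difference).
    """
    X = np.column_stack([np.ones(len(y)), group, covar])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]
```

A full ANCOVA would also test this coefficient against its standard error; the sketch returns only the adjusted effect size.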

20.
Mothers’ ability to empathically share their offspring's emotional feelings is considered integral to primary affective bonds and healthy socio-emotional development. What neurobiological mechanism is responsible for this ability in humans? It has been proposed that the psychological and neural components of affective experiences are strictly associated with autonomic-visceral changes. Hence, the vicarious response of empathy may also embody a sharing of changes in body physiology. The present study investigated whether maternal empathy is accompanied by synchrony in autonomic responses. Using a contact-free methodology in an ecological context, we simultaneously recorded the facial thermal imprints of mother and child while the mother observed the child in a distressing situation. The results showed a situation-specific parallelism between mothers’ and children's facial temperature variations, providing preliminary evidence for direct affective sharing involving autonomic responding. These findings support a multidimensional approach to understanding emotional parent-child relationships.
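Physiological synchrony of the kind reported in study 20 is often indexed by correlating the two signals across a range of temporal lags. A minimal sketch using raw Pearson correlation on toy traces; real thermal-imaging analyses would first detrend, resample, and align the signals:

```python
import numpy as np

def lagged_corr(a, b, max_lag):
    """Best Pearson correlation between two equal-length time series
    (e.g. mother and child facial-temperature traces) over integer
    sample lags in [-max_lag, max_lag].

    Returns (best_lag, best_r); a positive lag means b follows a.
    """
    best_lag, best_r = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            x, y = a[:len(a) - lag], b[lag:]
        else:
            x, y = a[-lag:], b[:len(b) + lag]
        r = np.corrcoef(x, y)[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r
```

The lag with the highest correlation indicates which partner's signal leads, which is one simple way to operationalize the "parallelism" the abstract describes.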


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号