Similar articles
20 similar articles found (search time: 31 ms)
1.
Pictures of emotional facial expressions or natural scenes are often used as cues in emotion research. We examined the extent to which these different stimuli engage emotion and attention, and whether the presence of social anxiety symptoms influences responding to facial cues. Sixty participants reporting high or low social anxiety viewed pictures of angry, neutral, and happy faces, as well as violent, neutral, and erotic scenes, while skin conductance and event-related potentials were recorded. Acoustic startle probes were presented throughout picture viewing, and blink magnitude, probe P3, and reaction time to the startle probe were also measured. Results indicated that viewing emotional scenes prompted strong reactions in autonomic, central, and reflex measures, whereas pictures of faces were generally weak elicitors of measurable emotional response. However, higher social anxiety was associated with modest electrodermal changes when viewing angry faces and mild startle potentiation when viewing either angry or smiling faces, compared to neutral. Taken together, pictures of facial expressions do not strongly engage fundamental affective reactions, but these cues appear to be effective in distinguishing between high and low social anxiety participants, supporting their use in anxiety research.

2.
Although neutral faces do not initially convey an explicit emotional message, individuals tend to assign them an affective content. Moreover, previous research has shown that affective judgments are mediated by the task participants have to perform. Using functional magnetic resonance imaging in 21 healthy participants, we focused this study on the cerebral activity patterns triggered by neutral and emotional faces in two different tasks (social or gender judgments). Conjunction analyses indicated that viewing both emotional and neutral faces evokes activity in several similar brain areas, indicating a common neural substrate. Moreover, neutral faces specifically elicited activation of the cerebellum and frontal and temporal areas, while emotional faces involved the cuneus, anterior cingulate gyrus, medial orbitofrontal cortex, posterior superior temporal gyrus, precentral/postcentral gyrus, and insula. The task selected was also found to influence brain activity, in that the social task recruited frontal areas while the gender task involved the posterior cingulate, inferior parietal lobule, and middle temporal gyrus to a greater extent. Specifically, in the social task viewing neutral faces was associated with longer reaction times and increased activity of left dorsolateral frontal cortex compared with viewing facial expressions of emotions. In contrast, in the same task emotional expressions distinctively activated the left amygdala. The results are discussed taking into consideration the fact that, like other facial expressions, neutral expressions are usually assigned some emotional significance. However, neutral faces evoke greater activation of circuits probably involved in more elaborate cognitive processing.

3.
Research investigating the early development of emotional processing has focused mainly on infants' perception of static facial emotional expressions, likely restricting the amount and type of information available to infants. In particular, the question of whether dynamic information in emotional facial expressions modulates infants' neural responses has rarely been investigated. The present study aimed to fill this gap by recording 7-month-olds' event-related potentials to static (Study 1) and dynamic (Study 2) happy, angry, and neutral faces. In Study 1, happy faces evoked a faster right-lateralized negative central (Nc) component compared to angry faces. In Study 2, both happy and angry faces elicited a larger right-lateralized Nc compared to neutral faces. Irrespective of stimulus dynamicity, a larger P400 to angry faces was associated with higher scores on the Negative Affect temperamental dimension. Overall, results suggest that 7-month-olds are sensitive to facial dynamics, which might play a role in shaping the neural processing of facial emotional expressions. Results also suggest that the amount of attentional resources infants allocate to angry expressions is associated with their temperamental traits. These findings represent a promising avenue for future studies exploring the neurobiological processes involved in perceiving emotional expressions using dynamic stimuli.

4.
When perceiving emotional facial expressions there is an automatic tendency to react with a matching facial expression. A classic explanation of this phenomenon, termed the matched motor hypothesis, highlights the importance of topographic matching, that is, the correspondence in body parts, between perceived and produced actions. More recent studies using mimicry paradigms have challenged this classic account, producing ample evidence against the matched motor hypothesis. However, research using stimulus-response compatibility (SRC) paradigms usually assumed the effect relies on topographic matching. While mimicry and SRC share some characteristics, critical differences between the paradigms suggest conclusions cannot be simply transferred from one to another. Thus, our aim in the present study was to directly test the matched motor hypothesis using SRC. Specifically, we investigated whether observing emotional body postures or hearing emotional vocalizations produces a tendency to respond with one's face, despite completely different motor actions being involved. In three SRC experiments, participants were required to either smile or frown in response to a color cue, presented concurrently with stimuli of happy and angry facial (experiment 1), body (experiment 2), or vocal (experiment 3) expressions. Reaction times were measured using facial EMG. Whether presenting facial, body, or vocal expressions, we found faster responses in compatible, compared to incompatible trials. These results demonstrate that the SRC effect of emotional expressions does not require topographic matching. Our findings question interpretations of previous research and suggest further examination of the matched motor hypothesis.

5.
While the neural regions associated with facial identity recognition are considered to be well defined, the neural correlates of non-moving and moving images of facial emotion processing are less clear. This study examined the brain electrical activity changes in 26 participants (14 males, M = 21.64, SD = 3.99; 12 females, M = 24.42, SD = 4.36), during a passive face viewing task, a scrambled face task, and separate emotion and gender face discrimination tasks. The steady state visual evoked potential (SSVEP) was recorded from 64 electrode sites. Consistent with previous research, face-related activity was evidenced at scalp regions over the parieto-temporal region approximately 170 ms after stimulus presentation. Results also identified different SSVEP spatio-temporal changes associated with the processing of static and dynamic facial emotions with respect to gender, with static stimuli predominantly associated with an increase in inhibitory processing within the frontal region. Dynamic facial emotions were associated with changes in SSVEP response within the temporal region, which are proposed to index inhibitory processing. It is suggested that static images represent non-canonical stimuli which are processed via different mechanisms to their more ecologically valid dynamic counterparts.

6.
Recent Alzheimer's trials have recruited cognitively normal people at risk for Alzheimer's dementia. Due to the lack of clinical symptoms in this population, conventional clinical outcome measures are not suitable for these early trials. While several groups are developing new composite cognitive tests that could serve as potential outcome measures by detecting subtle cognitive changes in normal people, there is a need for longitudinal brain imaging techniques that can correlate with temporal changes in these new tests and provide additional objective measures of neuropathological changes in the brain. Positron emission tomography (PET) is a nuclear medicine imaging procedure based on the measurement of annihilation photons after positron emission from radiolabeled molecules, allowing biological processes to be tracked in the body, including the brain. PET is a well‐established in vivo imaging modality in Alzheimer's disease diagnosis and research due to its capability of detecting abnormalities in three major hallmarks of this disease: (1) amyloid beta plaques; (2) neurofibrillary tau tangles; and (3) decreased neuronal activity due to loss of nerve cell connections and cell death. While semiquantitative PET imaging techniques are commonly used to set discrete cut‐points to stratify abnormal levels of amyloid accumulation and neurodegeneration, they are suboptimal for detecting subtle longitudinal changes. In this study, we have identified and discussed four critical barriers in conventional longitudinal PET imaging that may be particularly relevant for early Alzheimer's disease studies. These include within- and across-subject heterogeneity of AD‐affected brain regions, PET intensity normalization, neuronal compensation in early disease stages, and cerebrovascular amyloid deposition.
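The semiquantitative approach criticized above is typically a standardized uptake value ratio (SUVR): mean tracer uptake in target regions divided by uptake in a reference region, then dichotomized at a cut-point. A minimal sketch of that dichotomization — the threshold value and the use of list-of-region means are illustrative assumptions, not details from this study:

```python
# Illustrative SUVR computation and cut-point stratification.
# Inputs are mean uptake values (e.g. in kBq/mL) for target cortical
# regions and for a reference region such as the cerebellum.

def suvr(target_uptake, reference_uptake):
    """Standardized uptake value ratio: mean target / mean reference."""
    mean_target = sum(target_uptake) / len(target_uptake)
    mean_reference = sum(reference_uptake) / len(reference_uptake)
    return mean_target / mean_reference

# Hypothetical amyloid-positivity threshold; real cut-points are
# tracer- and cohort-specific.
CUTPOINT = 1.11

def amyloid_positive(target_uptake, reference_uptake, cutpoint=CUTPOINT):
    """Dichotomize a scan at the cut-point (the step that discards
    subtle longitudinal change, as the abstract notes)."""
    return suvr(target_uptake, reference_uptake) >= cutpoint
```

The abstract's point is that this binary readout is exactly what makes conventional PET suboptimal for tracking small within-subject changes over time.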

7.
Pre-existing Alzheimer's disease is a risk factor for severe/fatal COVID-19, and infection by the SARS-CoV2 virus has been associated with an increased incidence of unmasked Alzheimer's disease. The molecular basis whereby SARS-CoV2 may amplify Alzheimer's disease is not well understood. This study analyzed, in a blinded fashion, the molecular changes in autopsy brain tissues from people with pre-existing dementia who died of COVID-19 (n = 5), compared with equivalent tissues from people who died of COVID-19 with no history of dementia (n = 8), people with Alzheimer's disease pre-COVID-19 (n = 10), and age-matched controls (n = 10). Immunohistochemistry analyses for hyperphosphorylated tau protein, α-synuclein, and β-amyloid-42 confirmed the diagnoses of Alzheimer's disease (n = 4) and Lewy body dementia (n = 1) in the COVID-19 group. The brain tissues from patients who died of COVID-19 with no history of dementia showed a diffuse microangiopathy marked by endocytosis of spike subunits S1 and S2, primarily in CD31+ endothelia, with strong co-localization with ACE2, Caspase-3, IL6, TNFα, and Complement component 6 that was not associated with SARS-CoV2 RNA. Microglial activation, marked by increased TMEM119 and MCP1 protein expression, closely paralleled the endocytosed spike protein. Compared to controls, the COVID-19 tissues from people with no pre-existing dementia showed 5- to 10-fold increases in expression of neuronal NOS and NMDAR2, as well as a marked decrease in the expression of proteins whose loss is associated with worsening Alzheimer's disease: MFSD2a, SHIP1, BCL6, BCL10, and BACH1. In COVID-19 tissues from people with dementia, the widespread spike-induced microencephalitis and concomitant microglial activation co-existed in the same areas where neurons had hyperphosphorylated tau protein, suggesting that the already dysfunctional neurons were additionally stressed by the SARS-CoV2-induced microangiopathy. ACE2+ human brain endothelial cells treated with high-dose (but not vaccine-equivalent low-dose) spike S1 protein demonstrated each of the molecular changes noted in the in vivo COVID-19 and COVID-19/Alzheimer's disease brain tissues. It is concluded that fatal COVID-19 induces a diffuse microencephalitis and microglial activation in the brain due to endocytosis of circulating viral spike protein, which amplifies pre-existing dementia in at least two ways: (1) it modulates the expression of proteins that may worsen Alzheimer's disease, and (2) it stresses the already dysfunctional neurons by causing an acute proinflammatory/hypercoagulable/hypoxic microenvironment in areas with abundant hyperphosphorylated tau protein and/or β-amyloid-42.

8.
Humans' experience of emotion and comprehension of affective cues varies substantially across the lifespan. Work in cognitive and affective neuroscience has begun to characterize behavioral and neural responses to emotional cues that systematically change with age. This review examines work to date characterizing the maturation of facial expression comprehension, and dynamic changes in amygdala recruitment from early childhood through late adulthood while viewing facial expressions of emotion. Recent neuroimaging work has tested amygdala and prefrontal engagement in experimental paradigms mimicking real aspects of social interactions, which we highlight briefly, along with considerations for future research.

9.
Sleep deprivation impacts subjective mood states, but very little research has examined its impact on the processing of emotional information. In the current study, we investigated the impact of total sleep deprivation on neural responses to emotional facial expressions as well as the accuracy and speed with which these faces were categorized. Forty-nine participants completed two tasks in which they were asked to categorize emotional facial expressions as Happy, Sad, Angry, or Fearful. They were shown the 'full' expression of the emotions in one task and more subtle expressions in a second task, in which expressions were 'morphed' with neutral faces so that the intensity of emotion varied. It was expected that sleep deprivation would lead to greater reactivity (indexed by larger-amplitude N170 event-related potentials), particularly for negative and more subtle facial expressions. In the full face task, sleep-deprived (SD) participants were significantly less accurate than controls (C) at identifying Sad faces and slower to identify all emotional expressions. P1 was smaller and N170 was larger for the SD compared to the C group, but for all emotions, indicating generalized impairment in low-level visual processing. In the more difficult morphed face task, SD participants were less accurate than C participants for Sad faces; as well, the group difference in reaction time was greatest for Sad faces. For the SD group, N170 increased in amplitude with increasing perceptual difficulty for the Fearful and Angry faces, but decreased in amplitude with increasing difficulty for Sad faces. These data illustrate that sleep deprivation led to greater neural reactivity for the threat-related negative emotions as they became more subtle; however, there was a failure to engage these perceptual resources for the processing of Sad faces. Sleep loss preferentially impacted the processing of Sad faces; this has widespread implications for sleep-deprived groups.
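The N170 amplitude comparisons described above are conventionally computed as the most negative deflection within a fixed post-stimulus latency window at occipito-temporal electrodes. A minimal sketch of that measurement — the window bounds and sampling rate here are generic assumptions, not values reported by this study:

```python
def peak_amplitude(erp, sample_rate_hz, window_ms=(130, 200), polarity=-1):
    """Peak amplitude of an ERP waveform within a latency window.

    `erp` is a list of voltage samples (microvolts) starting at stimulus
    onset. For a negative-going component like the N170 (polarity=-1),
    the peak is the minimum voltage in the window; for positive-going
    components like the P1, pass polarity=+1.
    """
    start = int(window_ms[0] / 1000 * sample_rate_hz)
    stop = int(window_ms[1] / 1000 * sample_rate_hz)
    segment = erp[start:stop]
    return min(segment) if polarity < 0 else max(segment)
```

Per-condition averages of this measure (e.g. SD vs. C, full vs. morphed faces) are what amplitude statements like "N170 was larger for the SD group" quantify.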

10.
Individuals with social phobia display neural hyperactivation towards angry facial expressions. However, it is uncertain whether they also show abnormal brain responses when processing angry voices. In an event-related functional magnetic resonance imaging study, we investigated brain responses to neutral and angry voices in 12 healthy control participants and 12 individuals with social phobia when emotional prosody was either task-relevant or task-irrelevant. Regardless of task, both phobic and non-phobic participants recruited a network comprising frontotemporal regions, the amygdala, the insula, and the striatum, when listening to angry compared to neutral prosody. Across participants, increased activation in orbitofrontal cortex during task-relevant as compared to task-irrelevant emotional prosody processing was found. Compared to healthy controls, individuals with social phobia displayed significantly stronger orbitofrontal activation in response to angry versus neutral voices under both task conditions. These results suggest a disorder-associated increased involvement of the orbitofrontal cortex in response to threatening voices in social phobia.

11.
Previous studies, mainly with Caucasian samples, have shown that facial expressions of emotion are contagious, a phenomenon known as facial mimicry. This study examined facial mimicry using a Japanese sample. Participants were shown a series of Japanese faces (from Matsumoto and Ekman, 1988) on a computer screen expressing "happiness", "sadness", "anger", or "disgust". While participants viewed the facial expressions, electromyograms (EMG) of their faces were recorded to see whether their own facial muscles corresponding to the stimulus faces were activated. Consistent with the previous studies using Caucasian samples, all four facial expressions were mimicked. The peak time of mimicry of angry or happy faces occurred later, while that of disgusted faces occurred relatively sooner. The potential relation of facial mimicry to "emotional contagion", a social phenomenon whereby subjective feelings transfer between people, is discussed.

12.
Memory strategy usage and awareness of memory performance are both crucial for memory rehabilitation. We explored Alzheimer's patients' ability to apply and control learning strategies, and also their ability to predict the effect of these strategies on subsequent performance. In a rehearsal condition, participants were explicitly asked to overtly rehearse words and were given as long as they liked at study. In a control condition, participants read the words passively at a fixed presentation rate. In all groups, recall was better in the rehearsal condition than in the reading condition. Alzheimer's patients showed different strategy usage. Overall, people with Alzheimer's disease spent longer studying to-be-remembered words under unpaced conditions, but they did not use this time to rehearse to the same extent as controls. We hypothesize that this failure to rehearse could be based on an inability to use the effortful executive mechanisms involved during study.

13.
Recognition of facial expressions of emotion is very important for communication and social cognition. Neuroimaging studies have shown that numerous brain regions participate in this complex function. To study spatiotemporal aspects of the neural representation of facial emotion recognition, we recorded neuromagnetic activity in 12 healthy individuals by means of a whole-head magnetoencephalography system. Source reconstructions revealed that several cortical and subcortical brain regions produced strong neural activity in response to emotional faces at latencies between 100 and 360 ms, much stronger than the responses to neutral as well as to blurred faces. Orbitofrontal cortex and amygdala showed affect-related activity at short latencies, already within 180 ms after stimulus onset. Some of the emotion-responsive regions were repeatedly activated during the stimulus presentation period, suggesting that these reactivations are indicators of a distributed interacting circuitry.

14.
We present an investigation of facial expression recognition by three people (BC, LP, and NC) with Möbius syndrome, a congenital disorder producing facial paralysis. The participants were asked to identify the emotion displayed in 10 examples of facial expressions associated with each of 6 basic emotions from the Ekman and Friesen (1976) series. None of the three people with Möbius syndrome was significantly impaired on this task. On a second test of facial expression recognition using computer-morphed facial expressions, NC showed a statistically significant impairment, BC a borderline deficit, and LP was unimpaired. However, even when impairments were found, people with Möbius syndrome still recognised many of the facial expressions shown to them. The recognition of facial expressions by people who have never been able to produce such signals on their own faces demonstrates that the ability to produce facial expressions is not a necessary prerequisite of their recognition.

15.
Previous studies have shown that affective symptoms are part of the clinical picture in amyotrophic lateral sclerosis (ALS), the most common motor neuron disorder in elderly people. Diffuse neurodegeneration of limbic regions (e.g., prefrontal cortex [PFC], amygdala) was demonstrated in ALS post-mortem, although the mechanisms of emotional dysregulation in ALS in vivo remain unclear. Using functional imaging, we assessed the brain responses to emotional faces in 11 cognitively unimpaired ALS patients and 12 healthy controls (HCs). We tested whether regional activities and connectivity patterns in the limbic system differed between ALS patients and HCs and whether the variability in clinical measures modulated the neuroimaging data. Relative to HCs, ALS patients displayed greater activation in a series of PFC areas and altered left amygdala–PFC connectivity. Anxiety modulated the right amygdala–PFC connectivity in HCs but not in ALS patients. Reduced right premotor cortex activity and altered left amygdala–supplementary motor area connectivity were associated with longer disease duration and greater disease severity, respectively. Our findings demonstrate dysfunctions of the limbic system in ALS patients at early stages of the disease, and extend our knowledge about the interplay between emotional brain areas and the regions traditionally implicated in motor control.

16.
Neurobiology of Aging, 2014, 35(12): 2845–2857
Posterior cortical atrophy (PCA) is a neurodegenerative syndrome characterized by impaired higher visual processing skills; however, motor features more commonly associated with corticobasal syndrome may also occur. We investigated the frequency and clinical characteristics of motor features in 44 PCA patients and, with 30 controls, conducted voxel-based morphometry, cortical thickness, and subcortical volumetric analyses of their magnetic resonance imaging. Prominent limb rigidity was used to define a PCA-motor subgroup. A total of 30% (13) had PCA-motor, all demonstrating asymmetrical left upper limb rigidity. Limb apraxia was more frequent and asymmetrical in PCA-motor, as was myoclonus. Tremor and alien limb phenomena only occurred in this subgroup. The subgroups did not differ in neuropsychological test performance or apolipoprotein E4 allele frequency. Greater asymmetry of atrophy occurred in PCA-motor, particularly involving right frontoparietal and peri-rolandic cortices, putamen, and thalamus. The 9 patients (including 4 PCA-motor) with available pathology or cerebrospinal fluid data all showed evidence of Alzheimer's disease. Our data suggest that PCA patients with motor features have greater atrophy of contralateral sensorimotor areas but are still likely to have underlying Alzheimer's disease.

17.
Observers can deliberately attend to some aspects of a face (e.g. emotional expression) while ignoring others. How do internal goals influence representational geometry in face-responsive cortex? Participants watched videos of naturalistic dynamic faces during MRI scanning. We measured multivariate neural response patterns while participants formed an intention to attend to a facial aspect (age, or emotional valence), and then attended to that aspect, and responses to the face's emotional valence, independent of attention. Distinct patterns of response to the two tasks were found while forming the intention, in left fronto-lateral but not face-responsive regions, and while attending to the face, in almost all face-responsive regions. Emotional valence was represented in right posterior superior temporal sulcus and medial prefrontal cortex, but could not be decoded when unattended. Shifting the focus of attention thus alters cortical representation of social information, probably reflecting neural flexibility to optimally integrate goals and perceptual input.

18.
Recognition of facial expressions of basic emotions was investigated in HL and UJ, two people with Huntington's disease who showed little evidence of general cognitive deterioration. No impairments were found in tests of the perception of age, sex, familiar face identity, unfamiliar face identity, and gaze direction, indicating adequate processing of the face as a physical stimulus. Computer-interpolated ("morphed") images of facial expressions of basic emotions were used to demonstrate deficits in the recognition of disgust and fear for UJ. HL also showed a deficit in the recognition of disgust, and was not very adept (but not significantly impaired) at recognising fear. Other basic emotions (happiness, surprise, sadness, anger) were recognised at normal levels of performance by HL and UJ. These results show that impairments of emotion recognition can be circumscribed, affecting some emotions more than others and occurring in people who do not show pronounced perceptual or intellectual deterioration. Questionnaires examining self-assessed emotion indicated normal experience of anger by HL and UJ, but possible abnormalities for disgust and fear. The processes involved in recognising other people's emotions may therefore be linked to those involved in experiencing emotion, and the basic emotions of fear and disgust may have separable neural substrates.

19.
A considerable body of research has focused on neural responses evoked by emotional facial expressions, but little is known about mother-specific brain responses to infant facial emotions. We used near-infrared spectroscopy to investigate prefrontal activity while 14 mothers and 14 age-matched females who had never been pregnant (non-mothers) discriminated happy, angry, sad, fearful, surprised, and neutral facial expressions of unfamiliar infants and unfamiliar adults. Our results revealed that discriminating infant facial emotions increased the relative oxyHb concentration in mothers' right prefrontal cortex but not in their left prefrontal cortex, compared with each side of the prefrontal cortices of non-mothers. However, there was no difference between mothers and non-mothers in right or left prefrontal cortex activation while viewing adult facial expressions. These results suggest that the right prefrontal cortex is involved in human maternal behavior concerning infant facial emotion discrimination.
