Similar articles
20 similar articles found.
1.
What happens in our brains when we see a face? The neural mechanisms of face processing – namely, the face‐selective regions – have been extensively explored. Research has traditionally focused on visual cortex face‐regions; more recently, the role of face‐regions outside the visual cortex (i.e., non‐visual‐cortex face‐regions) has been acknowledged as well. The major quest today is to reveal the functional role of each of these regions in face processing. To make progress in this direction, it is essential to understand the extent to which the face‐regions, and particularly the non‐visual‐cortex face‐regions, process only faces (i.e., face‐specific, domain‐specific processing) or rather are involved in more domain‐general cognitive processing. In the current functional MRI study, we systematically examined the activity of the whole face‐network during a face‐unrelated reading task (i.e., written meaningful sentences with content unrelated to faces/people, and non‐words). We found that the non‐visual‐cortex face‐regions (i.e., right lateral prefrontal cortex and posterior superior temporal sulcus), but not the visual‐cortex face‐regions, responded significantly more strongly to sentences than to non‐words. In general, some degree of sentence selectivity was found in all non‐visual‐cortex face‐regions. The present results highlight the possibility that processing in the non‐visual‐cortex face‐selective regions might not be exclusively face‐specific, but rather more, or even fully, domain‐general. In this paper, we illustrate how knowledge about domain‐general processing in face‐regions can help to advance our general understanding of face processing mechanisms. Our results therefore suggest that the problem of face processing should be approached in the broader scope of cognition in general.
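
The core comparison described above (sentences versus non-words within independently localized face-selective regions) boils down to a paired test on per-subject ROI estimates. The following is only a minimal illustrative sketch in Python: the ROI labels, sample size, and simulated parameter estimates are assumptions, not values from the study.

import numpy as np
from scipy import stats

# Hypothetical per-subject parameter estimates (betas) for each face-selective ROI,
# one value per subject and condition; simulated here for illustration only.
rois = ["rFFA", "rOFA", "rpSTS", "rIFG"]
n_subjects = 20
rng = np.random.default_rng(0)
betas_sentences = {roi: rng.normal(0.6, 0.3, n_subjects) for roi in rois}
betas_nonwords = {roi: rng.normal(0.4, 0.3, n_subjects) for roi in rois}

# Paired t-test per ROI: do face-selective regions respond more to sentences?
for roi in rois:
    t, p = stats.ttest_rel(betas_sentences[roi], betas_nonwords[roi])
    print(f"{roi}: sentences > non-words, t({n_subjects - 1}) = {t:.2f}, p = {p:.3f}")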

2.
First impressions, especially of emotional faces, may critically impact later evaluation of social interactions. Activity in limbic regions, including the amygdala and ventral striatum, has previously been shown to correlate with identification of emotional content in faces; however, little work has been done describing how these signals may influence emotional face memory. We report an event‐related functional magnetic resonance imaging study in 21 healthy adults where subjects attempted to recognize a neutral face that was previously viewed with a threatening (angry or fearful) or nonthreatening (happy or sad) affect. In a hypothesis‐driven region of interest analysis, we found that neutral faces previously presented with a threatening affect recruited the left amygdala. In contrast, faces previously presented with a nonthreatening affect activated the left ventral striatum. A whole‐brain analysis revealed increased response in the right orbitofrontal cortex to faces previously seen with threatening affect. These effects of prior emotion were independent of task performance, with differences being seen in the amygdala and ventral striatum even if only incorrect trials were considered. The results indicate that a network of frontolimbic regions may provide emotional bias signals during facial recognition. Hum Brain Mapp, 2009. © 2009 Wiley‐Liss, Inc.

3.
The attractiveness of a face is a highly salient social signal, influencing mate choice and other social judgements. In this study, we used event-related functional magnetic resonance imaging (fMRI) to investigate brain regions that respond to attractive faces which manifested either a neutral or mildly happy face expression. Attractive faces produced activation of medial orbitofrontal cortex (OFC), a region involved in representing stimulus-reward value. Responses in this region were further enhanced by a smiling facial expression, suggesting that the reward value of an attractive face as indexed by medial OFC activity is modulated by a perceiver-directed smile.

4.
Rolls ET. Neuropsychologia, 2007, 45(1): 124-143
Neurophysiological evidence is described showing that some neurons in the macaque inferior temporal visual cortex have responses that are invariant with respect to the position, size and view of faces and objects, and that these neurons show rapid processing and rapid learning. Which face or object is present is encoded using a distributed representation in which each neuron conveys independent information in its firing rate, with little information evident in the relative time of firing of different neurons. This ensemble encoding has the advantages of maximising the information in the representation useful for discrimination between stimuli using a simple weighted sum of the neuronal firing by the receiving neurons, generalisation and graceful degradation. These invariant representations are ideally suited to provide the inputs to brain regions such as the orbitofrontal cortex and amygdala that learn the reinforcement associations of an individual's face, for then the learning, and the appropriate social and emotional responses, generalise to other views of the same face. A theory is described of how such invariant representations may be produced in a hierarchically organised set of visual cortical areas with convergent connectivity. The theory proposes that neurons in these visual areas use a modified Hebb synaptic modification rule with a short-term memory trace to capture whatever can be captured at each stage that is invariant about objects as the objects change in retinal view, position, size and rotation. Another population of neurons in the cortex in the superior temporal sulcus encodes other aspects of faces such as face expression, eye gaze, face view and whether the head is moving. These neurons thus provide important additional inputs to parts of the brain such as the orbitofrontal cortex and amygdala that are involved in social communication and emotional behaviour. Outputs of these systems reach the amygdala, in which face-selective neurons are found, and also the orbitofrontal cortex, in which some neurons are tuned to face identity and others to face expression. In humans, activation of the orbitofrontal cortex is found when a change of face expression acts as a social signal that behaviour should change; and damage to the orbitofrontal cortex can impair face and voice expression identification, and also the reversal of emotional behaviour that normally occurs when reinforcers are reversed.
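
The "modified Hebb synaptic modification rule with a short-term memory trace" can be stated compactly: the postsynaptic term of the Hebbian update is replaced by a running average of recent activity, so successive transformed views of the same object come to drive the same output neurons. The Python sketch below is a generic illustration of that trace rule under assumed layer sizes and constants; it is not code or parameter values from the paper.

import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 64, 16       # illustrative layer sizes
alpha, eta = 0.05, 0.8     # learning rate and trace decay (illustrative values)

W = rng.normal(0.0, 0.1, (n_out, n_in))
trace = np.zeros(n_out)

def present(x):
    """One stimulus presentation followed by a trace-rule weight update."""
    global trace, W
    y = np.maximum(W @ x, 0.0)              # simple rectified output activity
    trace = (1.0 - eta) * y + eta * trace   # short-term memory trace of activity
    W += alpha * np.outer(trace, x)         # Hebbian update using the traced term
    W /= np.linalg.norm(W, axis=1, keepdims=True)  # keep weight vectors bounded

# Successive views of the same face share the trace, so the update tends to map
# them onto the same output neurons, yielding view-invariant responses.
for view in (rng.normal(size=n_in) for _ in range(5)):
    present(view)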

5.
Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly with fearful expressions, but also with other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on fusiform; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with some increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces. Other electrical components arise at later latencies and involve more sustained activities, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with distributed activity in time and space. Moreover, although traditional models in cognitive neuropsychology have often considered that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.

6.
This is a case study involving a female patient (NN) with complete loss of autobiographical memory and identity despite normal neurological assessment. To test the hypothesis that patients with dissociative amnesia (DA) possess the ability to covertly process facial identities they are unaware of, we conducted functional magnetic resonance imaging (fMRI) and assessed skin conductance responses (SCR) to (a) strangers, (b) celebrities, and (c) familiar faces not seen since the onset of DA. We also performed associative face‐name memory tasks to test the patient's ability to learn and recall newly learned face‐name pairs. Although NN did not recognize any of the faces of her friends and relatives, their images triggered a stronger involvement of the left fusiform gyrus, the bilateral hippocampus/amygdala region, the orbitofrontal cortex, the middle temporal regions, and the precuneus, along with higher SCR. During recollection of previously learned face‐name pairs, NN (compared to healthy controls) demonstrated a weaker involvement of the hippocampus. Our findings suggest that, in DA, specific arousal systems remain capable of being activated by familiar faces outside of conscious awareness. The decreased activation observed in the hippocampus demonstrates that the functioning of memory‐sensitive regions may be impaired by trauma.

7.
Intelligence is among the key determinants of power and social status in modern societies. In this functional magnetic resonance imaging study, we examined the neural correlates of intelligence evaluation from faces. Participants underwent scans while they evaluated the perceived intelligence and friendliness of faces. We found that medial orbitofrontal cortex activity increased linearly with friendliness ratings. The relationship between perceived intelligence and brain activity was positively linear in the right caudate nucleus and U‐shaped (i.e., strong responses to unintelligent‐looking or intelligent‐looking faces) in the right anterior insula/inferior frontal gyrus. Perceived intelligence was also significantly positively correlated with both friendliness and attractiveness. Furthermore, intelligence rating scores had a positive linear effect on reaction times in the friendliness rating task, suggesting that participants had greater conflicts when making friendliness judgments for faces that appeared to belong to intelligent individuals. In addition, the degree of this effect predicted individual differences in the positive linear modulatory effect of intelligence scores in the right caudate nucleus. Our interpretation was that the activity in the caudate nucleus revealed an approach‐avoidance conflict with regard to highly intelligent people, that is, they were perceived as attractive but also potentially threatening. Although our interpretations are merely suggestive because we did not measure the approach‐avoidance behaviors directly, our findings have important implications for understanding the dynamics of human interaction in modern societies that increasingly allocate power and status based on intelligence.
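
The linear (right caudate) versus U-shaped (right anterior insula/inferior frontal gyrus) relationships described above correspond to linear and quadratic parametric modulators of the trial-wise intelligence ratings. Below is a minimal sketch of that idea, using simulated ratings and responses and ordinary least squares rather than a full fMRI general linear model; all values are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(1)
ratings = rng.uniform(1, 7, 120)         # simulated perceived-intelligence ratings
ratings_c = ratings - ratings.mean()     # mean-centre before adding a quadratic term

# Simulated regional responses: one region with a linear effect, one U-shaped.
caudate = 0.5 * ratings_c + rng.normal(0, 1, ratings.size)
insula = 0.4 * ratings_c ** 2 + rng.normal(0, 1, ratings.size)

# Design matrix with intercept, linear, and quadratic modulators.
X = np.column_stack([np.ones_like(ratings_c), ratings_c, ratings_c ** 2])
for name, y in [("caudate", caudate), ("insula", insula)]:
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(f"{name}: linear beta = {beta[1]:.2f}, quadratic beta = {beta[2]:.2f}")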

8.
Individuals with Williams syndrome (WS) demonstrate an abnormally positive social bias. However, the neural substrates of this hypersociability, i.e., positive attribution bias and increased drive toward social interaction, have not fully been elucidated. We performed an event-related functional magnetic resonance imaging study while individuals with WS and typically developing controls (TD) matched positive and negative emotional faces. WS compared to TD showed reduced right amygdala activation during presentation of negative faces, as in the previous literature. In addition, WS showed a unique pattern of right orbitofrontal cortex activation. While TD showed medial orbitofrontal cortex activation in response to positive faces and lateral orbitofrontal cortex activation in response to negative faces, WS showed the opposite pattern. In light of the general notion of a medial/lateral gradient of reward/punishment processing in the orbitofrontal cortex, these findings provide an additional biological explanation for, or correlate of, the positive attribution bias and hypersociability in WS.

9.
Faces are multi-dimensional stimuli bearing important social signals, such as gaze direction and emotion expression. To test whether perception of these two facial attributes recruits distinct cortical areas within the right hemisphere, we used single-pulse transcranial magnetic stimulation (TMS) in healthy volunteers while they performed two different tasks on the same face stimuli. In each task, two successive faces were presented with varying eye-gaze directions and emotional expressions, separated by a short interval of random duration. TMS was applied over either the right somatosensory cortex or the right superior lateral temporal cortex, 100 or 200 ms after presentation of the second face stimulus. Participants performed a speeded matching task on the second face during one of two possible conditions, requiring judgements about either gaze direction or emotion expression (same/different as the first face). Our results reveal a significant task-stimulation site interaction, indicating a selective TMS-related interference following stimulations of somatosensory cortex during the emotional expression task. Conversely, TMS of the superior lateral temporal cortex selectively interfered with the gaze direction task. We also found that the interference effect was specific to the stimulus content in each condition, affecting judgements of gaze shifts (not static eye positions) with TMS over the right superior temporal cortex, and judgements of fearful expressions (not happy expressions) with TMS over the right somatosensory cortex. These results provide for the first time a double dissociation in normal subjects during social face recognition, due to transient disruption of non-overlapping brain regions. The present study supports a critical role of the somatosensory and superior lateral temporal regions in the perception of fear expression and gaze shift in seen faces, respectively.

10.
A face‐selective neural signal is reliably found in humans with functional MRI and event‐related potential (ERP) measures, which provide complementary information about the spatial and temporal properties of the neural response. However, because most neuroimaging studies so far have studied ERP and fMRI face‐selective markers separately, the relationship between them is still unknown. Here we simultaneously recorded fMRI and ERP responses to faces and chairs to examine the correlations across subjects between the magnitudes of fMRI and ERP face‐selectivity measures. Findings show that the face‐selective responses in the temporal lobe (i.e., fusiform gyrus—FFA) and superior temporal sulcus (fSTS), but not the face‐selective response in the occipital cortex (OFA), were highly correlated with the face‐selective N170 component. In contrast, the OFA was correlated with earlier ERPs at about 110 ms after stimulus‐onset. Importantly, these correlations reveal a temporal dissociation between the face‐selective area in the occipital lobe and face‐selective areas in the temporal lobe. Despite the very different time‐scale of the fMRI and EEG signals, our data show that a correlation analysis across subjects may be informative with respect to the latency in which different brain regions process information. Hum Brain Mapp, 2010. © 2010 Wiley‐Liss, Inc.
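
The key analysis above is a correlation computed across subjects between a per-subject fMRI face-selectivity measure for each region and a per-subject ERP face-selectivity measure (for example, a face-minus-chair N170 amplitude difference). Here is a hypothetical Python sketch with simulated per-subject values; the effect sizes and the way selectivity is quantified are assumptions for illustration only.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_subjects = 24
n170_selectivity = rng.normal(3.0, 1.0, n_subjects)   # simulated face-minus-chair N170 amplitude

# Simulated fMRI face-selectivity per region: FFA and fSTS track the N170, OFA does not.
fmri_selectivity = {
    "FFA": 0.8 * n170_selectivity + rng.normal(0, 1, n_subjects),
    "fSTS": 0.6 * n170_selectivity + rng.normal(0, 1, n_subjects),
    "OFA": rng.normal(2.0, 1.0, n_subjects),
}

for region, values in fmri_selectivity.items():
    r, p = stats.pearsonr(values, n170_selectivity)
    print(f"{region} vs. N170 selectivity: r = {r:.2f}, p = {p:.3f}")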

11.
Little is known about the relationship between weight status and reward-related brain activity in normal weight humans. We correlated orbitofrontal and anterior cingulate cortex activity as measured by functional magnetic resonance imaging with body mass index in 13 healthy, normal-weight adult women as they viewed images of high-calorie and low-calorie foods, and dining-related utensils. Body mass index correlated negatively with both cingulate and orbitofrontal activity during high-calorie viewing, negatively with orbitofrontal activity during low-calorie viewing, and positively with orbitofrontal activity during presentations of nonedible utensils. With greater body mass, activity was reduced in brain regions important for evaluating and modifying learned stimulus-reward associations, suggesting a relationship between weight status and responsiveness of the orbitofrontal cortex to rewarding food images.

12.
The eyes have it: the neuroethology, function and evolution of social gaze
Gaze is an important component of social interaction. The function, evolution and neurobiology of gaze processing are therefore of interest to a number of researchers. This review discusses the evolutionary role of social gaze in vertebrates (focusing on primates), and a hypothesis that this role has changed substantially for primates compared to other animals. This change may have been driven by morphological changes to the face and eyes of primates, limitations in the facial anatomy of other vertebrates, changes in the ecology of the environment in which primates live, and a necessity to communicate information about the environment, emotional and mental states. The eyes represent different levels of signal value depending on the status, disposition and emotional state of the sender and receiver of such signals. There are regions in the monkey and human brain which contain neurons that respond selectively to faces, bodies and eye gaze. The ability to follow another individual's gaze direction is affected in individuals with autism and other psychopathological disorders, and after particular localized brain lesions. The hypothesis that gaze following is "hard-wired" in the brain, and may be localized within a circuit linking the superior temporal sulcus, amygdala and orbitofrontal cortex, is discussed.

13.
It is easier to recognize a familiar face than a newly learned face. The neural basis of familiar face recognition has been elucidated in functional imaging and lesion studies. Behavioural and neuropsychological data indicate, however, that brain systems involved in episodic retrieval of familiar and newly learned faces are distinct. In our study, 12 subjects viewed 30 novel faces in an encoding session. In the study condition, event-related functional magnetic resonance imaging (fMRI) was used to compare brain activation during correct recognition of the recently learned faces to that observed during correct rejection of unknown control faces. Differences were present in the left inferior parietal (BA 40) and left medial frontal/anterior cingulate (BA 32/9) cortex. These two regions may be part of a pathway in the dorsal visual stream, responsible for a "feeling of familiarity" in contrast to the ventral pathway in the temporal lobes, which is mainly involved in the recognition of personal identity.

14.
The current study examines the effect of status information on the neural substrates of person perception. In an event-related fMRI experiment, participants were presented with photographs of faces preceded by information denoting either low or high financial status (e.g., "earns $25,000" or "earns $350,000"), or low or high moral status (e.g., "is a tobacco executive" or "does cancer research"). Participants were asked to form an impression of the targets, but were not instructed to explicitly evaluate their social status. Building on previous brain-imaging investigations, regions of interest analyses were performed for brain regions expected to support either cognitive (i.e., intraparietal sulcus) or emotional (i.e., ventromedial prefrontal cortex) components of social status perception. Activation of the intraparietal sulcus was found to be sensitive to the financial status of individuals while activation of the ventromedial prefrontal cortex was sensitive to the moral status of individuals. The implications of these results towards uncovering the neural substrates of status perception and, more broadly, the extended network of brain regions involved in person perception are discussed.

15.
In two fMRI experiments (n = 44) using tasks with different demands (approach-avoidance versus one-back recognition decisions), we measured the responses to the social value of faces. The face stimuli were produced by a parametric model of face evaluation that reduces multiple social evaluations to two orthogonal dimensions of valence and power [Oosterhof, N. N., & Todorov, A. The functional basis of face evaluation. Proceedings of the National Academy of Sciences, U.S.A., 105, 11087-11092, 2008]. Independent of the task, the response within regions of the occipital, fusiform, and lateral prefrontal cortices was sensitive to the valence dimension, with larger responses to low-valence faces. Additionally, there were extensive quadratic responses in the fusiform gyri and dorsal amygdala, with larger responses to faces at the extremes of the face valence continuum than faces in the middle. In all these regions, participants' avoidance decisions correlated with brain responses, with faces more likely to be avoided evoking stronger responses. The findings suggest that both explicit and implicit face evaluation engage multiple brain regions involved in attention, affect, and decision making.

16.
Developing effective preference modification paradigms is crucial to improve the quality of life in a wide range of behaviors. The cue‐approach training (CAT) paradigm has been introduced as an effective tool to modify preferences lasting months, without external reinforcements, using the mere association of images with a cue and a speeded button response. In the current work, for the first time, we used fMRI with faces as stimuli in the CAT paradigm, focusing on face‐selective brain regions. We found a behavioral change effect of CAT with faces immediately and 1 month after training; however, face‐selective regions were not indicative of behavioral change, and thus preference change is less likely to rely on face‐processing brain regions. Nevertheless, we found that during training, fMRI activations in the ventral striatum were correlated with individual preference change. We also found a correlation between preference change and activations in the ventromedial prefrontal cortex during the binary choice phase. Functional connectivity among striatum, prefrontal regions, and high‐level visual regions was also related to individual preference change. Our work sheds new light on the involvement of neural mechanisms in the process of valuation. This could lead to the development of novel real‐world interventions.

17.
Face perception is mediated by a distributed cortical network
The neural system associated with face perception in the human brain was investigated using functional magnetic resonance imaging (fMRI). In contrast to many studies that focused on discrete face-responsive regions, the objective of the current study was to demonstrate that regardless of stimulus format, emotional valence, or task demands, face perception evokes activation in a distributed cortical network. Subjects viewed various stimuli (line drawings of unfamiliar faces and photographs of unfamiliar, famous, and emotional faces) and their phase-scrambled versions. A network of face-responsive regions was identified that included the inferior occipital gyrus, fusiform gyrus, superior temporal sulcus, hippocampus, amygdala, inferior frontal gyrus, and orbitofrontal cortex. Although bilateral activation was found in all regions, the response in the right hemisphere was stronger. This hemispheric asymmetry was manifested by larger and more significant clusters of activation and a larger number of subjects who showed the effect. A region of interest analysis revealed that while all face stimuli evoked activation within all regions, viewing famous and emotional faces resulted in larger spatial extents of activation and higher amplitudes of the fMRI signal. These results indicate that a mere percept of a face is sufficient to localize activation within the distributed cortical network that mediates the visual analysis of facial identity and expression.

18.
The present work explores the relationship between interracial contact and the neural substrates of explicit social and non-social judgments about both racial ingroup and outgroup targets. Convergent evidence from univariate and multivariate partial least squares (PLS) analyses reveals that contact shapes the recruitment of brain regions involved in social cognition similarly for both ingroup and outgroup targets. Results support the hypothesis that increased contact is associated with generalized changes in social cognition toward both ingroup and outgroup faces. Specifically, regardless of target race, low- and average-contact perceivers showed the typically observed increased recruitment of temporoparietal junction and dorsomedial prefrontal cortex during social compared to perceptual judgments. However, high-contact perceivers did not show selective recruitment of these brain regions for social judgments. Complementing univariate results, multivariate PLS analyses reveal that greater perceiver contact leads to reduced co-activation in networks of brain regions associated with face processing (e.g. fusiform gyrus) and salience detection (e.g. anterior cingulate cortex and insula). Across univariate and multivariate analyses, we found no evidence that contact differentially impacted cross-race face perception. Instead, when performing either a social or a novel perceptual task, interracial contact appears to broadly shape how perceivers engage with all faces.

19.
Neuroimaging studies have implicated a set of striatal and orbitofrontal cortex (OFC) regions that are commonly activated during reward processing tasks. Resting‐state functional connectivity (RSFC) studies have demonstrated that the human brain is organized into several functional systems that show strong temporal coherence in the absence of goal‐directed tasks. Here we use seed‐based and graph‐theory RSFC approaches to characterize the systems‐level organization of putative reward regions at rest. Peaks of connectivity from seed‐based RSFC patterns for the nucleus accumbens (NAcc) and orbitofrontal cortex (OFC) were used to identify candidate reward regions, which were merged with a previously used set of regions (Power et al., 2011). Graph theory was then used to determine system‐level membership for all regions. Several regions previously implicated in reward processing (NAcc, lateral and medial OFC, and ventromedial prefrontal cortex) comprised a distinct, preferentially coupled system. This RSFC system is stable across a range of connectivity thresholds and shares strong overlap with meta‐analyses of task‐based reward studies. This reward system shares between‐system connectivity with systems implicated in cognitive control and self‐regulation, including the fronto‐parietal, cingulo‐opercular, and default systems. Differences may exist in the pathways through which control systems interact with reward system components. Whereas NAcc is functionally connected to cingulo‐opercular and default systems, OFC regions show stronger connectivity with the fronto‐parietal system. We propose that future work may be able to interrogate group or individual differences in connectivity profiles using the regions delineated in this work to explore potential relationships to appetitive behaviors, self‐regulation failure, and addiction.
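
The pipeline sketched in the abstract (region-wise time series, a correlation matrix, a thresholded graph, and community assignment into "systems") can be illustrated in a few lines. The sketch below is generic and hypothetical: it uses simulated time series, an arbitrary edge threshold, and greedy modularity communities, which need not match the specific graph methods of the study.

import numpy as np
import networkx as nx
from networkx.algorithms import community

rng = np.random.default_rng(3)
n_regions, n_timepoints = 50, 300
timeseries = rng.normal(size=(n_regions, n_timepoints))   # simulated resting-state signals

corr = np.corrcoef(timeseries)          # region-by-region correlation (RSFC) matrix
np.fill_diagonal(corr, 0.0)

threshold = np.percentile(corr, 90)     # keep only the strongest positive edges
adjacency = (corr > threshold).astype(int)

graph = nx.from_numpy_array(adjacency)
systems = community.greedy_modularity_communities(graph)
for i, members in enumerate(systems):
    print(f"system {i}: regions {sorted(members)[:8]} ...")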

20.
According to a non‐hierarchical view of human cortical face processing, selective responses to faces may emerge in a higher‐order area of the hierarchy, the lateral part of the middle fusiform gyrus (fusiform face area [FFA]), independently from face‐selective responses in a lower‐order area, the lateral inferior occipital gyrus (occipital face area [OFA]). Here we provide a stringent test of this hypothesis by gradually revealing segmented face stimuli through strict linear descrambling of phase information [Ales et al., 2012]. Using a short sampling rate (500 ms) of fMRI acquisition and single‐subject statistical analysis, we show face‐selective responses emerging earlier, that is, at a lower level of structural (i.e., phase) information, in the FFA compared with the OFA. In both regions, the face detection response emerged at a lower level of structural information for upright than for inverted faces, in line with behavioral responses and with previous findings of delayed responses to inverted faces in direct recordings of neural activity. Overall, these results support the non‐hierarchical view of human cortical face processing and open new perspectives for time‐resolved analysis at the single‐subject level of fMRI data obtained during continuously evolving visual stimulation. Hum Brain Mapp 38:120–139, 2017. © 2016 Wiley Periodicals, Inc.
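
The "strict linear descrambling of phase information" above refers to stimuli whose Fourier phase is progressively interpolated from random values back to the original phase, so that face structure is revealed gradually while the amplitude spectrum stays constant. The numpy sketch below is a simplified, hypothetical illustration of that idea; it ignores the phase-wrapping subtleties handled by Ales et al. (2012) and is not their stimulation code.

import numpy as np

def phase_descramble_sequence(image, n_steps=10, seed=0):
    """Return n_steps frames whose phase goes linearly from scrambled to intact."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.fft2(image)
    amplitude = np.abs(spectrum)
    phase = np.angle(spectrum)
    random_phase = rng.uniform(-np.pi, np.pi, image.shape)

    frames = []
    for w in np.linspace(0.0, 1.0, n_steps):      # w = proportion of original phase
        mixed_phase = (1.0 - w) * random_phase + w * phase
        frames.append(np.fft.ifft2(amplitude * np.exp(1j * mixed_phase)).real)
    return frames

face_image = np.random.default_rng(1).random((128, 128))  # stands in for a face photograph
frames = phase_descramble_sequence(face_image)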
