Similar literature
1.
Human neuroimaging and event-related potential (ERP) studies suggest that ventral and lateral temporo-occipital cortex is sensitive to static faces and face parts. Recent fMRI data also show activation by facial movements. In this study we recorded from 22 posterior scalp locations in 20 normal right-handed males to assess ERPs evoked by viewing: (1) moving eyes and mouths in the context of a face; (2) moving and static eyes with and without facial context. N170 and P350 peak amplitude and latency data were analysed. N170 is an ERP previously shown to be preferentially responsive to face and eye stimuli, and P350 immediately follows N170. Major results were: (1) N170 was significantly larger over the bilateral temporal scalp to viewing opening mouths relative to closing mouths, and to eye aversion relative to eyes gazing at the observer; (2) at a focal region over the right inferior temporal scalp, N170 was significantly earlier to mouth opening relative to closing, and to eye aversion relative to eyes gazing at the observer; (3) the focal ERP effect of eye aversion occurred independently of facial context; (4) these differences cannot be attributed to movement per se, as they did not occur in a control condition in which checks moved in comparable areas of the visual field; (5) isolated static eyes produced N170s that were not significantly different from N170s to static full faces over the right inferior temporal scalp, unlike in the left hemisphere where face N170s were significantly larger than eye N170s; (6) unlike N170, P350 exhibited nonspecific changes as a function of stimulus movement. These results suggest that: (1) bilateral temporal cortex forms part of a system sensitive to biological motion, of which facial movements form an important subset; (2) there may be a specialised system for facial gesture analysis that provides input for neuronal circuitry dealing with social attention and the actions of others.
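The peak-scoring step mentioned in this abstract (peak amplitude and latency of N170 and P350) can be illustrated with a minimal sketch. The sampling rate, epoch length, measurement windows, and the single-channel simulated waveform below are assumptions chosen for illustration, not parameters of the study.

```python
# Minimal sketch of N170/P350 peak scoring from a single averaged ERP channel.
# All numbers (sampling rate, windows, simulated waveform) are illustrative.
import numpy as np

SFREQ = 500.0                                  # sampling rate in Hz (assumed)
TIMES = np.arange(-0.1, 0.5, 1.0 / SFREQ)      # epoch time axis in seconds

def peak_in_window(waveform, tmin, tmax, polarity):
    """Return (latency_ms, amplitude) of the extreme value in a time window.

    polarity = -1 picks the most negative deflection (e.g. N170),
    polarity = +1 the most positive one (e.g. P350).
    """
    mask = (TIMES >= tmin) & (TIMES <= tmax)
    segment = waveform[mask]
    idx = np.argmin(segment) if polarity < 0 else np.argmax(segment)
    return TIMES[mask][idx] * 1000.0, segment[idx]

# Simulated average over one posterior temporal site (placeholder data).
rng = np.random.default_rng(0)
erp = rng.normal(0.0, 0.5, TIMES.size)
erp -= 4.0 * np.exp(-((TIMES - 0.17) ** 2) / (2 * 0.015 ** 2))   # fake N170
erp += 3.0 * np.exp(-((TIMES - 0.35) ** 2) / (2 * 0.030 ** 2))   # fake P350

n170_lat, n170_amp = peak_in_window(erp, 0.13, 0.20, polarity=-1)
p350_lat, p350_amp = peak_in_window(erp, 0.30, 0.40, polarity=+1)
print(f"N170: {n170_amp:.2f} uV at {n170_lat:.0f} ms")
print(f"P350: {p350_amp:.2f} uV at {p350_lat:.0f} ms")
```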

2.
Human visual evoked potentials were recorded during presentation of photos of human and animal faces and various face features. Negative waves with approximate peak latencies of 165 msec (N170) were bilaterally recorded from the occipito-temporal regions. Mean peak latencies of the N170 were shorter for faces than for eyes presented alone. Analyses of the amplitudes of the evoked potentials indicated that the N170 elicited by faces reflected activity of a specific neural system that was insensitive to detailed differences among individual faces regardless of species, suggesting that this system may function to detect the presence of faces in general. On the other hand, the mean amplitude of the N170 elicited by human eyes was significantly larger than that elicited by animal eyes. These differences in response latencies and amplitudes of the N170 suggest the existence of at least two different visual evoked potentials with similar latencies (i.e., N170) that are sensitive to faces in general and to human eyes, respectively. Dipole source localization analysis indicated that dipoles for the N170 elicited by eyes were located in the posterior inferior temporal gyrus, whereas those for faces were located initially in the same region but moved toward the fusiform and lingual gyri in the late phase of the N170. The results indicate that information processing of faces and eyes separates at least as early as the latency of the N170, in the posterior inferior temporal gyrus as well as the fusiform and lingual gyri, and may provide a neurophysiological and anatomical basis for an initial structural encoding stage of human faces.
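The latency finding reported here (shorter N170 peak latencies for faces than for isolated eyes) is the kind of result normally tested with a paired comparison across subjects. The sketch below shows one hypothetical form of that comparison with invented per-subject latencies; it is not the analysis or the data of the study.

```python
# Hedged sketch of a paired latency comparison (faces vs. isolated eyes)
# using simulated per-subject N170 peak latencies in milliseconds.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects = 12
lat_faces = rng.normal(160.0, 8.0, n_subjects)             # hypothetical values
lat_eyes = lat_faces + rng.normal(10.0, 5.0, n_subjects)   # eyes slower on average

t, p = stats.ttest_rel(lat_faces, lat_eyes)
print(f"faces {lat_faces.mean():.1f} ms vs eyes {lat_eyes.mean():.1f} ms, "
      f"t({n_subjects - 1}) = {t:.2f}, p = {p:.4f}")
```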

3.
BACKGROUND: Emotional Stroop tasks have shown attention biases of clinical populations towards stimuli related to their condition. Asperger Syndrome (AS) is a neuropsychiatric condition with social and communication deficits, repetitive behaviours and narrow interests. Social deficits are particularly striking, including difficulties in understanding others. METHOD: We measured the latencies of adults with and without AS to name the colours of pictures containing angry facial expressions, neutral expressions or non-social objects. We tested three hypotheses: whether (1) controls show longer colour-naming latencies for angry versus neutral facial expressions with male actors, (2) people with AS show differential latencies across picture types, and (3) differential response latencies persist when photographs contain females. RESULTS: Controls had longer latencies to pictures of male faces with angry compared to neutral expressions. The AS group did not show longer latencies to angry versus neutral expressions in male faces, instead showing longer latencies to pictures containing any facial expression compared to objects. When pictures contained females, controls no longer showed longer latencies for angry versus neutral expressions. However, the AS group still showed longer latencies to all facial picture types, compared to objects, providing further evidence that faces produce interference effects for this clinical group. CONCLUSIONS: The pictorial emotional Stroop paradigm reveals normal attention biases towards threatening emotional faces. The AS group showed Stroop interference effects to all facial stimuli regardless of expression or sex, suggesting that faces cause disproportionate interference in AS.
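Stroop interference in a design like this is usually expressed as differences between mean colour-naming latencies per condition. The sketch below computes such difference scores from a hypothetical trial table; the column names and latency values are invented for illustration and do not come from the study.

```python
# Illustrative computation of emotional Stroop interference scores from
# trial-level colour-naming latencies; all values are hypothetical.
import pandas as pd

trials = pd.DataFrame({
    "group":     ["AS", "AS", "AS", "control", "control", "control"] * 2,
    "condition": ["angry", "neutral", "object"] * 4,
    "rt_ms":     [720, 705, 650, 690, 640, 635,
                  715, 710, 655, 700, 645, 630],
})

# Mean latency per group and condition, then two common interference indices.
means = trials.groupby(["group", "condition"])["rt_ms"].mean().unstack()
means["angry_vs_neutral"] = means["angry"] - means["neutral"]
means["face_vs_object"] = means[["angry", "neutral"]].mean(axis=1) - means["object"]
print(means.round(1))
```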

4.
The properties of the face-sensitive N170 component of the event-related brain potential (ERP) were explored through an orientation discrimination task using natural faces, objects, and Arcimboldo paintings presented upright or inverted. Because Arcimboldo paintings are composed of non-face objects but have a global face configuration, they provide strong control for disentangling high-level face-like and object-like visual processes at the level of the N170, and may help to examine the implication of each hemisphere in the global/holistic processing of face formats. For the upright orientation, N170 amplitudes in the right occipito-temporal region did not differ between natural faces and Arcimboldo paintings but were larger for both of these categories than for objects, supporting the view that as early as the N170 time-window, the right hemisphere is involved in holistic perceptual processing of face-like configurations irrespective of their features. Conversely, in the left hemisphere, N170 amplitudes differed between Arcimboldo portraits and natural faces, suggesting that this hemisphere processes local facial features. For the upside-down orientation, in both hemispheres N170 amplitudes did not differ between Arcimboldo paintings and objects, but were reduced for both categories compared to natural faces, indicating that the disruption of holistic processing with inversion leads to an object-like processing of Arcimboldo paintings due to the lack of local facial features. Overall, these results provide evidence that global/holistic perceptual processing of faces and face-like formats involves the right hemisphere as early as the N170 time-window, and that the local processing of face features is rather implemented in the left hemisphere.

5.
The current event-related potential study investigated the perceptual processing underlying the categorization advantage of happy over sad faces in social anhedonia, using a nonclinical sample. A high social anhedonia (HSA, N = 25) group and a low social anhedonia (LSA, N = 27) group performed a facial expression categorization task during which they classified facial expressions (happy, neutral, sad), irrespective of face orientation (upright, upside-down). Behaviorally, happy faces were identified more quickly than sad ones in the upright but not the inverted orientation. Electrophysiologically, the LSA group showed earlier N170 latencies for happy than for sad faces in the upright but not the upside-down orientation, whereas the HSA group did not show any expression effect on N170 latencies. Moreover, N170 and P2 amplitude results revealed that HSA relative to LSA individuals showed delayed neural discrimination between happy and sad faces. These results suggest that social anhedonia is associated with a deficit of perceptual processing during facial expression categorization.

6.
Event-related potentials (ERPs) to schematic faces in adults and children.
Event-related potentials (ERPs) were recorded from 4-year-old children, 8-10-year-old children, and adults in response to a schematic face, an inverted face and a jumbled face. The subjects were instructed to fixate the stimuli and no other response was required. The schematic face and inverted face were shown with a frequency of 20% each and the remaining presentations (60%) were of the jumbled face. P1 and N170 peak latencies were measurable in the children and adults; these peaks occurred at longer latencies in the children. P3 was measurable in the adults and 8-10-year-old children but not the 4-year-olds. The adults had a larger, longer-latency P1 and a smaller-amplitude N170 to the inverted face than to the other faces. In contrast, the P1 was unaffected by inversion in the children and the N170 was not smaller to the inverted or jumbled face. It is concluded that this result reflects developmental differences in the processing of configuration, with the children relying on an under-specified configuration of the face.
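The presentation probabilities described here (schematic face 20%, inverted face 20%, jumbled face 60%) can be turned into a trial sequence with a few lines of code. The sketch below is a generic illustration; the trial count is chosen arbitrarily and is not taken from the study.

```python
# Sketch of generating a stimulus sequence with 20/20/60 presentation
# probabilities; the number of trials is an arbitrary illustration.
import numpy as np

rng = np.random.default_rng(2)
stimuli = ["schematic", "inverted", "jumbled"]
probs = [0.2, 0.2, 0.6]
n_trials = 200

sequence = rng.choice(stimuli, size=n_trials, p=probs)
unique, counts = np.unique(sequence, return_counts=True)
print(dict(zip(unique, counts)))   # approximate 20/20/60 split over 200 trials
```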

7.
States of depression are thought to involve biased cognitive reactivity to emotional events. Moreover, gender may influence differences in emotional processing. The current study investigated whether cognitive bias interacts with gender during emotional processing in minor depression (MiD) and major depression (MaD). The N170 component was obtained during a visual emotional oddball paradigm that manipulated the processing of emotional information in 33 MiD patients, 36 MaD patients, and 32 controls (CN). Compared with CN, in males both MiD and MaD patients had lower N170 amplitudes for happy faces, but MaD patients had higher N170 amplitudes for sad faces; in females, both MiD and MaD patients had lower N170 amplitudes for happy and neutral faces, but higher N170 amplitudes for sad faces. Compared with MaD, in males MiD patients had higher N170 amplitudes for happy faces and lower N170 amplitudes for sad faces; in females, MiD patients had higher N170 amplitudes only for sad faces. Interestingly, in depressed patients N170 amplitude was negatively correlated with the HDRS score for identification of happy faces, whereas it was positively correlated with the HDRS score for identification of sad faces. These results provide novel evidence for the mood-brightening effect and for an interaction of cognitive bias with gender in emotional processing. They further suggest that depressed women may be more vulnerable than depressed men to an unconscious negative cognitive bias during emotional face processing, and that depressive syndromes may lie on a spectrum of severity with respect to emotional face processing.
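The amplitude-severity relationships reported here are simple bivariate correlations between per-subject N170 amplitude and HDRS scores. The sketch below shows the general form of such an analysis with invented data; neither the values nor the direction of the simulated effect are taken from the study.

```python
# Hedged sketch of correlating per-subject N170 amplitude with HDRS scores,
# using invented data for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 30
hdrs = rng.uniform(8, 30, n)                              # hypothetical severity scores
n170_amp = -6.0 + 0.12 * hdrs + rng.normal(0, 0.8, n)     # simulated amplitudes (uV)

r, p = stats.pearsonr(hdrs, n170_amp)
print(f"r = {r:.2f}, p = {p:.4f}")
```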

8.
OBJECTIVES: The objective of this study was to examine whether configural alterations of faces affect early or late processing stages as a function of their familiarity and their level of representation in memory. We then sought to verify whether the structural encoding stage is susceptible to top-down influences. METHODS: Electrophysiologic and behavioral studies were undertaken, during which unfamiliar and familiar faces were presented upright or upside-down with or without feature alterations. The subjects were asked to determine whether the faces were familiar or not. RESULTS: N170 and N360 amplitudes were larger for familiar faces as well as for altered ones. A higher degree of familiarity decreased reaction times (RTs) and N360 latencies, but increased N170 latencies, whereas face alterations increased RTs and the latencies of both components examined. However, familiarity interacted with altered face configurations only for RTs and the N170. SIGNIFICANCE: In the perceptual stage, familiar faces seem to undergo a more elaborate type of processing because of top-down influences linked to the robust nature of their representations in memory. This more elaborate processing of familiar faces benefits the following steps of information processing by facilitating access to structural representations in memory (N360) as well as the final step reflected in RTs. The fact that configural alterations cause different effects for familiar as opposed to unfamiliar faces indicates that these stimuli are processed in a qualitatively different manner and solicit different representations in memory.

9.
To study spatial frequency (SF) effects on cortical face processing, we recorded magnetoencephalographic responses in seven healthy subjects to upright and inverted human faces. Four face types were used: original (broad-band SF, BSF), low SF (LSF, <5 cycles/face), middle SF (MSF, 5-15 cycles/face), and high SF (HSF, >15 cycles/face) face images. Using equivalent current dipole (ECD) modeling, neuromagnetic M170 responses to BSF faces, peaking around 160-185 ms, were localized to the right occipitotemporal region across subjects. M170 responses to LSF faces showed longer latency and smaller amplitude compared with those to BSF faces. We found no significant difference among the BSF, MSF, and HSF conditions in M170 amplitude or latency. ECD locations for the four upright face conditions were close to one another, although the mean locations for MSF and HSF seemed more medial than those for BSF or LSF. Longer latencies for inverted than upright faces were observed in the BSF (183.4 ± 8.5 ms versus 168 ± 6.9 ms, P < 0.001) and LSF conditions (223.6 ± 13.1 ms versus 207.3 ± 16.3 ms, P < 0.01). M170 ECDs were located more medially for inverted than for upright images in both the BSF and LSF conditions. In conclusion, the weaker M170 activation to LSF faces suggests that face-part information is important for early face processing. The cortical representations in the right occipitotemporal region for configural and face-feature processing overlap. Our findings on the face inversion effect suggest that inverted BSF and LSF faces may be processed as objects.
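The SF bands used in this study (LSF < 5, MSF 5-15, HSF > 15 cycles/face) correspond to band filtering of the stimulus images in the Fourier domain. The sketch below shows a generic version of that step on a stand-in array; actual stimulus preparation would also equate luminance and contrast across bands, which is omitted here.

```python
# Sketch of spatial-frequency band filtering of an image via a radial FFT mask.
# The cutoffs follow the bands above, but the image is a random stand-in.
import numpy as np

def sf_filter(img, low=None, high=None):
    """Keep spatial frequencies f (cycles per image) with low <= f <= high."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    ny, nx = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny)) * ny      # cycles per image, vertical
    fx = np.fft.fftshift(np.fft.fftfreq(nx)) * nx      # cycles per image, horizontal
    radius = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    mask = np.ones_like(radius, dtype=bool)
    if low is not None:
        mask &= radius >= low
    if high is not None:
        mask &= radius <= high
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

img = np.random.default_rng(4).random((256, 256))      # stand-in for a face photo
lsf = sf_filter(img, high=5)          # low spatial frequencies only
msf = sf_filter(img, low=5, high=15)  # middle band
hsf = sf_filter(img, low=15)          # high spatial frequencies only
print(lsf.shape, msf.shape, hsf.shape)
```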

10.
Young adults more accurately remember own-age than older faces. We tested whether this own-age bias (OAB) is reduced by increased experience. Young experts (geriatric nurses) and controls performed a recognition experiment with young and old faces. Critically, while control participants demonstrated better memory for young faces, no OAB was observed in the experts. Event-related potentials revealed larger N170 and P2 amplitudes for young than old faces in both groups, suggesting no group differences during early perceptual processing. At test, N250 repetition effects were more anteriorly distributed for own- than other-age faces in control participants, whereas experts showed no corresponding effects. A larger late positive component (LPC) for old than young faces was observed in controls, but not in experts. Larger LPCs may reflect prolonged stimulus processing compromising memory retrieval. In sum, experience with other-age faces does not affect early perceptual processing, but modulates later stages related to memory retrieval.

11.
This article reviews behavioral and electrophysiological studies of face processing and discusses hypotheses for understanding the nature of face processing impairments in autism. Based on the results of behavioral studies, it shows that individuals with autism have impaired face discrimination and recognition and use atypical strategies for processing faces, characterized by reduced attention to the eyes and piecemeal rather than configural strategies. Based on the results of electrophysiological studies, the article concludes that face processing impairments are present early in autism, by 3 years of age. Such studies have detected abnormalities in both early (N170, reflecting structural encoding) and late (Nc, reflecting recognition memory) stages of face processing. Event-related potential studies of young children and adults with autism have found slower speed of processing of faces, a failure to show the expected speed advantage for processing faces versus nonface stimuli, and atypical scalp topography suggesting abnormal cortical specialization for face processing. Other electrophysiological studies have suggested that autism is associated with early- and late-stage processing impairments for facial expressions of emotion (fear) and decreased perceptual binding, as reflected in reduced gamma during face processing. The article describes two types of hypotheses, cognitive/perceptual and motivational/affective, that offer frameworks for understanding the nature of face processing impairments in autism, and discusses implications for intervention.

12.
Sleep deprivation impacts subjective mood states, but very little research has examined its impact on the processing of emotional information. In the current study, we investigated the impact of total sleep deprivation on neural responses to emotional facial expressions as well as the accuracy and speed with which these faces were categorized. Forty-nine participants completed two tasks in which they were asked to categorize emotional facial expressions as Happy, Sad, Angry, or Fearful. They were shown the ‘full’ expression of the emotions in one task and more subtle expressions in a second task in which expressions were ‘morphed’ with neutral faces so that the intensity of emotion varied. It was expected that sleep deprivation would lead to greater reactivity (indexed by larger-amplitude N170 event-related potentials), particularly for negative and more subtle facial expressions. In the full-face task, sleep-deprived (SD) participants were significantly less accurate than controls (C) at identifying Sad faces and slower to identify all emotional expressions. P1 was smaller and N170 was larger for the SD than for the C group, but this held across all emotions, indicating a generalized impairment in low-level visual processing. In the more difficult morphed-face task, SD participants were less accurate than C participants for Sad faces; in addition, the group difference in reaction time was greatest for Sad faces. For the SD group, N170 increased in amplitude with increasing perceptual difficulty for the Fearful and Angry faces, but decreased in amplitude with increasing difficulty for Sad faces. These data illustrate that sleep deprivation led to greater neural reactivity for the threat-related negative emotions as they became more subtle; however, there was a failure to engage these perceptual resources for the processing of Sad faces. Sleep loss preferentially impacted the processing of Sad faces; this has widespread implications for sleep-deprived groups.
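Graded expression intensity of the kind used in the morphed-face task is often approximated by blending an emotional face image with a neutral one. The sketch below illustrates plain linear (alpha) blending on placeholder arrays; dedicated morphing software additionally warps facial geometry, and the intensity steps shown are invented rather than those of the study.

```python
# Hedged sketch of graded expression intensity via linear blending between an
# emotional and a neutral face image; the arrays are random placeholders.
import numpy as np

rng = np.random.default_rng(5)
neutral = rng.random((128, 128))     # placeholder for a neutral face image
fearful = rng.random((128, 128))     # placeholder for a full fearful expression

intensities = [0.2, 0.4, 0.6, 0.8, 1.0]   # proportion of the emotional face (assumed steps)
morphs = {k: (1 - k) * neutral + k * fearful for k in intensities}
for k, img in morphs.items():
    print(f"intensity {k:.0%}: mean pixel {img.mean():.3f}")
```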

13.
Does contextual affective information influence the processing of facial expressions already at the relatively early stages of face processing? We measured event-related brain potentials to happy and sad facial expressions primed by preceding pictures of affectively positive and negative scenes. The face-sensitive N170 response amplitudes showed a clear affective priming effect: N170 amplitudes to happy faces were larger when presented after positive vs. negative primes, whereas N170 amplitudes to sad faces were larger when presented after negative vs. positive primes. Priming effects were also observed on later brain responses. The results support early integration of contextual and facial affective information in processing. The results also provide neurophysiological support for theories suggesting that behavioral affective priming effects are based, at least in part, on facilitation of the encoding of incoming affective information.

14.
The aim of this study was to investigate the differences between children and adults in recognizing facial expressions of simple line drawings of the "Chernoff face". First, the angles of the eyebrows and mouth of the Chernoff face were changed in a stepwise way with a personal computer, and the emotional response of the subjects was evaluated by a questionnaire. Second, three drawings of non-target stimuli (neutral face, angry face, and wheelchair) and target stimuli were used to elicit event-related potentials (ERPs). Children had higher scores for the facial expressions than adults, and relied much more on the angles of the eyebrows and mouth. The major ERP findings were: (1) the latencies of P100 and N170 were significantly longer in children than in adults; (2) the amplitudes of P100 were significantly larger in children than in adults, but the N170 amplitudes were not significantly different; and (3) a slow negative shift was recorded with a latency of 240-460 ms at the posterior-temporal site for the angry face compared with the neutral face in adults but not in children. These results suggest that electrophysiological differences in the recognition of facial expressions emerge from about 240 ms after the appearance of the Chernoff face in adults but not in children.
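The stepwise manipulation of eyebrow and mouth angles described here can be mimicked with a small plotting routine. The sketch below draws a crude parameterized schematic face; all of the geometry (circle sizes, positions, and the way the angles are applied) is invented for illustration and is not the stimulus set used in the study.

```python
# Rough sketch of a parameterised schematic face in the spirit of a Chernoff
# face, with eyebrow and mouth angles (degrees) as free parameters.
import numpy as np
import matplotlib.pyplot as plt

def draw_face(ax, brow_angle=20, mouth_angle=20):
    ax.add_patch(plt.Circle((0, 0), 1.0, fill=False))          # head outline
    for x in (-0.35, 0.35):
        ax.add_patch(plt.Circle((x, 0.25), 0.08, color="k"))   # eyes
        dx = 0.18 * np.cos(np.radians(brow_angle))
        dy = 0.18 * np.sin(np.radians(brow_angle)) * np.sign(-x)
        ax.plot([x - dx, x + dx], [0.45 - dy, 0.45 + dy], "k-") # mirrored eyebrows
    # mouth: a shallow arc whose curvature follows the sign of mouth_angle
    t = np.linspace(-0.4, 0.4, 50)
    ax.plot(t, -0.45 + 2 * np.sin(np.radians(mouth_angle)) * (t ** 2 - 0.16), "k-")
    ax.set_aspect("equal")
    ax.set_xlim(-1.2, 1.2)
    ax.set_ylim(-1.2, 1.2)
    ax.axis("off")

fig, axes = plt.subplots(1, 2, figsize=(6, 3))
draw_face(axes[0], brow_angle=15, mouth_angle=30)    # "happier" configuration
draw_face(axes[1], brow_angle=-25, mouth_angle=-30)  # "angrier" configuration
plt.savefig("schematic_faces.png")
```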

15.
Event-related potentials offer evidence for face-distinctive neural activity that peaks at about 170 ms following the onset of face stimuli (the N170 effect). We investigated the role of the perceptual mechanism reflected by the N170 effect by comparing the adaptation of the N170 amplitude when target faces were preceded either by identical face images or by different faces, relative to when they were preceded by objects. In two experiments, we demonstrate that the N170 is equally adapted by repetition of the same or of different faces. Thus, our findings show that the N170 is sensitive to the category rather than the identity of a face. This outcome supports the hypothesis that the N170 effect reflects the activity of a perceptual mechanism that discriminates faces from objects and streams face stimuli to dedicated circuits specialized in encoding and decoding information about the face.

16.
The present study investigated whether social anxiety modulates the processing of facial expressions. Event-related potentials were recorded during an oddball task in young adults reporting high or low levels of social anxiety as evaluated by the Liebowitz Social Anxiety Scale. Repeated pictures of faces with a neutral expression were infrequently replaced by pictures of the same face displaying happiness, anger, fear or disgust. For all participants, response latencies were shorter in detecting faces expressing disgust and happiness as compared to fear or anger. Low-social-anxiety individuals showed an enhanced P1 in response to angry faces as compared to other stimuli, whereas highly socially anxious participants displayed an enlarged P1 for all emotional stimuli as compared to neutral ones, and generally higher amplitudes than non-anxious individuals. Conversely, the face-specific N170 and the task-related decision P3b were not influenced by social anxiety. These results suggest increased pre-attentive detection of facial cues in socially anxious individuals and are discussed within the framework of recent models of anxiety.

17.
The P1 and N170 components, two event-related potentials sensitive to face processing, were examined in response to faces and vehicles for children with autism and typical development. P1 amplitude decreased, P1 latency decreased, and N170 amplitude became more negative with age. Children with typical development had larger P1 amplitudes for inverted faces than upright faces, but children with autism did not show this pattern. Children with autism had longer N170 latencies than children with typical development. Smaller P1 amplitudes and more negative N170 amplitudes for upright faces were associated with better social skills for children with typical development.


