Similar Articles
20 similar articles found (search time: 437 ms)
1.
Is the face-sensitive N170 the only ERP not affected by selective attention? (cited 10 times: 0 self-citations, 10 by others)
Cauquil AS  Edmonds GE  Taylor MJ 《Neuroreport》2000,11(10):2167-2171
We assessed the effect of directed attention on early neurophysiological indices of face processing by measuring the N170 event-related potential (ERP). Twelve subjects each performed two tasks in which they attended either to eyes alone or to faces with closed eyes, presented within series of facial and control stimuli. Consistent with the ERP literature, the N170 was recorded to facial stimuli at posterior temporal electrodes, with a concomitant positive peak at the vertex, at latencies of around 150 ms for faces and 174 ms for eyes. However, unlike in fMRI studies, neither the latency nor the amplitude of the peaks was sensitive to the target/non-target status of either the eye or the face stimuli. This suggests that the early stages of face processing indexed by the N170 are automatic and unmodified by selective attention.
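The N170 latencies and amplitudes discussed above come from event-related potentials: EEG epochs averaged time-locked to stimulus onset, with the component then read off as a peak inside a search window. As a purely hypothetical illustration (not taken from the paper — the toy data, window, and function are invented for this sketch), a minimal NumPy version of averaging epochs and extracting a negative peak in a 130–200 ms window might look like:

```python
import numpy as np

def erp_peak(epochs, times, window=(0.13, 0.20)):
    """Average epochs (trials x samples) and find the most negative peak
    inside `window` (seconds). Returns (latency_s, amplitude)."""
    erp = epochs.mean(axis=0)                       # grand average over trials
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.argmin(erp[mask])                      # N170 is a negativity
    return times[mask][idx], erp[mask][idx]

# toy data: 50 trials at 1000 Hz, -0.1..0.5 s, a -5 uV peak planted at 170 ms
times = np.arange(-0.1, 0.5, 0.001)
rng = np.random.default_rng(0)
template = -5.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.01 ** 2))
epochs = template + rng.normal(0.0, 1.0, size=(50, times.size))

lat, amp = erp_peak(epochs, times)
print(lat, amp)   # latency near 0.17 s, amplitude near -5
```

Averaging over trials attenuates the single-trial noise by roughly the square root of the trial count, which is why the planted peak dominates the window despite trial-level noise as large as the component itself.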

2.
Although the emotional expression of faces is believed to be accessed rapidly, previous ERP studies have rarely found correlates of these processes. Here, we report findings from a study that investigated dichoptic binocular interaction using emotional face stimuli. Thirty-one subjects were briefly presented with schematic normal and scrambled faces (of neutral, positive, or negative expression) that appeared simultaneously in the left and right visual fields. Stimuli for the two eyes could be congruent (control) or incongruent (dichoptic). Subjects decided which of the superimposed images in the two hemifields appeared more "face-like", and during this task the EEG was recorded from 30 channels. VEPs were analysed topographically according to the influence of the different experimental conditions (defined by presentation form, emotional expression, and location). Behavioural responses to the ambiguous dichoptic stimuli demonstrated a functional eye dominance unrelated to visual acuity and conventional eye preference. Electrophysiological data revealed three components with mean latencies of 85, 160, and 310 ms. The topography of the second component (equivalent to the face-related N170) differed in the left-right and anterior-posterior directions compared with simple checkerboard stimuli. Dichoptic presentation reduced the field strength of all three components and increased the latency of the first. Faces with negative expression yielded the largest field strength of the second and third components. Moreover, emotional expression affected the topography not only of the late components but also of the first. This provides new evidence about the timing of perceptual processes related to facial expression, indicating that VEP components occurring as early as 80-90 ms are already sensitive to emotional content.

3.
Social Neuroscience, 2013, 8(3): 285-295
This study investigated the electrophysiological (event-related potential, ERP) and behavioral (reaction time, RT) correlates of gaze-cued shifts of visuospatial attention. Participants viewed centrally presented faces with neutral expressions in which eyes looked straight ahead and then shifted to the left or right. Upon movement of the eyes, the facial expression either stayed the same (neutral) or changed to a fearful or happy expression. Participants' task was to identify a probe letter (T or L), which appeared in either the same or the opposite location to the direction of the eye gaze. There was behavioral evidence of a gaze congruency effect, as RTs were faster when the eyes looked toward rather than away from the location of the target. The ERP data indicated the presence of a significant gaze-congruent anterior directing attention negativity (ADAN) at anterior locations (300–500 ms after the onset of the gaze cue). ERP data did not show evidence of initial orienting of attention triggered by gaze cues in the early directing attention negativity (EDAN) at posterior locations (200–300 ms post-cue onset). The gaze cueing effects in the RT and ERP data were not significantly influenced by the emotional expression of the faces. The presence of the ADAN may reflect neural mechanisms that underlie the holding of attention on gazed-at locations.

4.
Perceived gaze in faces is an important social signal that may influence orienting of attention in normal observers. Would such effects of gaze still occur in patients with right parietal damage and left neglect who usually fail to attend to contralesional space? Two experiments tested for effects of perceived gaze on visual extinction. Face or shape stimuli were presented in the right, left, or both hemifields, with faces looking either straight ahead or toward the opposite field. On bilateral trials, patients extinguished a left shape much less often when a concurrent right face looked leftward rather than straight ahead. This occurred, even though gaze was not relevant to the task and processing of facial signals implied attention to a competing ipsilesional stimulus. By contrast, rightward gaze in faces presented on the left side had no effect on extinction, suggesting that gaze cues are not extracted without attention. Two other experiments examined effects of perceived gaze on the detection of peripheral targets. Targets appeared at one of four possible locations to the right or left of a central face looking either toward the target location, another location on the same side, the opposite side, or straight ahead. Face and gaze were not relevant to the task and not predictive of target location. Patients responded faster when the face looked toward the target on both the contralesional and ipsilesional sides. In contralesional space, gaze allowed shifting of attention in a specific quadrant direction, but only to the first target along the scan path when there were different possible locations on the same side. By contrast, in intact ipsilesional space, attention was selectively directed to one among different eccentric locations. Control experiments showed that symbolic arrow cues did not produce similar effects. 
These results indicate that even though parietal damage causes spatial neglect and impairs the representation of location on the contralesional side, perceived gaze in faces can still trigger automatic shifts of attention in the contralesional direction, suggesting the existence of specific and anatomically distinct attentional mechanisms.

5.
Several recent studies have begun to examine the neurocognitive mechanisms involved in perceiving and responding to eye contact, a salient social signal of interest and readiness for interaction. Laboratory experiments measuring observers' responses to pictorial rather than live eye gaze cues may, however, only vaguely approximate the real-life affective significance of gaze direction cues. To take this into account, we measured event-related brain potentials and subjective affective responses in healthy adults while they viewed live faces with a neutral expression through an electronic shutter and faces presented as pictures on a computer screen. Direct gaze elicited greater face-sensitive N170 amplitudes and early posterior negativity potentials than averted gaze or closed eyes, but only in the live condition. The results show that early-stage processing of facial information is enhanced by another person's direct gaze when the person is faced live. We propose that a live face with a direct gaze is processed more intensely than a face with averted gaze or closed eyes, as the direct gaze is capable of intensifying the feeling of being the target of the other's interest and intentions. These results may have implications for the use of pictorial stimuli in social cognition studies.

6.
We investigated young infants’ object encoding and processing in response to isolated eye gaze cues on the neural and behavioral level. In two experiments, 4-month-old infants watched a pair of isolated eyes gazing towards or away from novel objects. Subsequently, the same objects were presented alone (i.e., without eyes). We measured event-related potentials (ERP) in response to object-directed and object-averted eye gaze as well as to the subsequently presented isolated objects. Using eye-tracking methods, we additionally measured infants’ looking behavior in reaction to the subsequently presented isolated objects. The ERP data revealed an enhanced slow wave positivity for object-directed eye gaze, indicating increased encoding of observed gaze cues. Regarding the objects, we found an enhanced Nc amplitude and increased looking times in response to previously uncued objects, indicating a novelty response on the neural and behavioral level. The results suggest that isolated eye gaze stimuli are sufficient to trigger object encoding and facilitate further object processing.

7.
We report results from two experiments in which subjects had to categorize briefly presented upright or inverted natural scenes. In the first experiment, subjects decided whether images contained animals or human faces presented at different scales. Behavioral results showed virtually identical processing speed between the two categories and very limited effects of inversion. One type of event-related potential (ERP) comparison, potentially capturing low-level physical differences, showed large effects with onsets at about 150 msec in the animal task. However, in the human face task, those differences started as early as 100 msec. In the second experiment, subjects responded to close-up views of animal faces or human faces in an attempt to limit physical differences between image sets. This manipulation almost completely eliminated small differences before 100 msec in both tasks. But again, despite very similar behavioral performances and short reaction times in both tasks, human faces were associated with earlier ERP differences compared with animal faces. Finally, in both experiments, as an alternative way to determine processing speed, we compared the ERP with the same images when seen as targets and nontargets in different tasks. Surprisingly, all task-dependent ERP differences had relatively long latencies. We conclude that task-dependent ERP differences fail to capture object processing speed, at least for some categories like faces. We discuss models of object processing that might explain our results, as well as alternative approaches.

8.
S Watanabe  R Kakigi  S Koyama  E Kirino 《Neuroreport》1999,10(10):2193-2198
Brain responses to eyes and to the whole face were studied by magnetoencephalography (MEG). We used five types of visual stimuli: a face with open eyes, a face with closed eyes, eyes alone, a scrambled face, and a hand. The 1M component was evoked in response to all stimulus types, but the 2M, peaking at approximately 180 ms, was recorded only for the face and eye stimuli. The peak latencies of 1M and 2M and the 1M-2M interpeak latency to eyes were significantly longer than those to the face, and there was no significant difference in latency between the face with open eyes and the face with closed eyes. The 2M for both the face and the eyes was generated in the same area, the inferior temporal cortex, around the fusiform gyrus.

9.
Previous work has found a left visual field (LVF) advantage for various judgements on faces, including identity and emotional expression. This has been related to possible right-hemisphere specialisation for face processing, and it has been proposed that this might reflect configural processing. We sought to determine whether a similar LVF advantage may also exist for gaze perception. In two experiments, normal adult subjects made judgements of seen gaze direction (left, right or straight). To assess how visual field may influence perception of gaze direction, eye stimuli were briefly presented unilaterally or bilaterally. In the latter case, the gaze direction of the two seen eyes could be congruent or incongruent (i.e. the two eyes could gaze in the same or different directions). For unilateral displays, performance was more accurate for LVF than for RVF stimuli. With bilateral incongruent gaze, the LVF eye influenced judgements more strongly than the RVF eye. No such LVF advantage was found in a control experiment, in which subjects judged pupil size for similar eye stimuli. Taken together, these results reveal an LVF advantage for the perception of gaze direction. Since only the eye region was visible, our results cannot be due to an LVF bias in processing the entire face context. Instead they suggest lateralisation specifically in processing the direction of seen gaze.

10.
Results from recent event-related brain potential (ERP) studies investigating brain processes involved in the detection and analysis of emotional facial expression are reviewed. In all experiments, emotional faces were found to trigger an increased ERP positivity relative to neutral faces. The onset of this emotional expression effect was remarkably early, ranging from 120 to 180 ms post-stimulus in different experiments where faces were either presented at fixation or laterally, and with or without non-face distractor stimuli. While broadly distributed positive deflections beyond 250 ms post-stimulus have been found in previous studies for non-face stimuli, the early frontocentrally distributed phase of this emotional positivity is most likely face-specific. Similar emotional expression effects were found for six basic emotions, suggesting that these effects are not primarily generated within neural structures specialised for the automatic detection of specific emotions. Expression effects were eliminated when attention was directed away from the location of peripherally presented emotional faces, indicating that they are not linked to pre-attentive emotional processing. When foveal faces were unattended, expression effects were attenuated, but not completely eliminated. It is suggested that these ERP correlates of emotional face processing reflect activity within a neocortical system where representations of emotional content are generated in a task-dependent fashion for the adaptive intentional control of behaviour. Given the early onset of the emotion-specific effects reviewed here, it is likely that this system is activated in parallel with the ongoing evaluation of emotional content in the amygdala and related subcortical brain circuits.

11.
Episodic memory is supported by recollection, the conscious retrieval of contextual information associated with the encoding of a stimulus. Event-Related Potential (ERP) studies of episodic memory have identified a robust neural correlate of recollection—the left parietal old/new effect—that has been widely observed during recognition memory tests. This left parietal old/new effect is believed to provide an index of generic cognitive operations related to recollection; however, it has recently been suggested that the neural correlate of recollection observed when faces are used as retrieval cues has an anterior scalp distribution, raising the possibility that faces are recollected differently from other types of information. To investigate this possibility, we directly compared neural activity associated with remember responses for correctly recognized face and name retrieval cues. Compound face–name stimuli were studied, and at test either a face or a name was presented alone. Participants discriminated studied from unstudied stimuli, and made a remember/familiar decision for stimuli judged ‘old’. Remembering faces was associated with anterior (500–700 ms) and late right frontal old/new effects (700–900 ms), whereas remembering names elicited mid frontal (300–500 ms) and left parietal (500–700 ms) effects. These findings demonstrate that when directly compared, with reference to common episodes, distinct cognitive operations are associated with remembering faces and names. We discuss whether faces can be remembered in the absence of recollection, or whether there may be more than one way of retrieving episodic context.

12.
Recent research on affective processing has suggested that the low spatial frequency information of fearful faces provides rapid emotional cues to the amygdala, whereas high spatial frequencies convey fine-grained information to the fusiform gyrus, regardless of emotional expression. In the present experiment, we examined the effects of low (LSF, <15 cycles/image width) and high (HSF, >25 cycles/image width) spatial frequency filtering on brain processing of complex pictures depicting pleasant, unpleasant, and neutral scenes. Event-related potentials (ERPs), the percentage of recognized stimuli, and response times were recorded in 19 healthy volunteers. Behavioral results indicated faster reaction times in response to unpleasant LSF than to unpleasant HSF pictures. Unpleasant LSF pictures and pleasant unfiltered pictures also elicited significant enhancements of P1 amplitudes at occipital electrodes compared with neutral LSF and unfiltered pictures, respectively, whereas no significant effects of affective modulation were found for HSF pictures. Moreover, mean ERP amplitudes between 200 and 500 ms post-stimulus were significantly greater for affective (pleasant and unpleasant) than for neutral unfiltered pictures, whereas no significant affective modulation was found for HSF or LSF pictures at those latencies. The fact that affective LSF pictures elicited enhanced brain responses at early, but not at later, latencies suggests the existence of a rapid and preattentive neural mechanism for the processing of motivationally relevant stimuli, which could be driven by LSF cues. Our findings thus confirm previous results showing differences in the brain processing of affective LSF and HSF faces, and extend them to more complex social affective pictures.
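The LSF/HSF manipulation described above is typically implemented as filtering in the 2-D Fourier domain, with cutoffs expressed in cycles per image width. The following is a minimal sketch of that idea, assuming FFT-based filtering and using the cutoffs quoted in the abstract (15 and 25 cycles/image); the study's actual filter design may well differ:

```python
import numpy as np

def sf_filter(img, low=None, high=None):
    """Keep spatial frequencies f with low <= f <= high (cycles/image)."""
    spectrum = np.fft.fft2(img)
    # Frequencies in cycles per image: fftfreq gives cycles/sample, so
    # multiplying by the image size converts to cycles per image width/height.
    fy = np.fft.fftfreq(img.shape[0]) * img.shape[0]
    fx = np.fft.fftfreq(img.shape[1]) * img.shape[1]
    radius = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    mask = np.ones_like(radius, dtype=bool)
    if low is not None:
        mask &= radius >= low
    if high is not None:
        mask &= radius <= high
    return np.real(np.fft.ifft2(spectrum * mask))

# toy "picture": random noise stands in for an actual scene image
img = np.random.default_rng(1).normal(size=(128, 128))
lsf = sf_filter(img, high=15)   # low spatial frequencies only (<15 c/iw)
hsf = sf_filter(img, low=25)    # high spatial frequencies only (>25 c/iw)
```

Because the mask is radially defined and Hermitian-symmetric, the output of `ifft2` is real up to numerical error, and applying the same filter twice leaves the image unchanged.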

13.
Human eyes are a powerful social cue that may automatically attract the attention of an observer. Here we tested whether looking toward open human eyes, as often arises in standard clinical "confrontation" tests, may affect contralesional errors in a group of right brain-damaged patients showing visual extinction. Patients were requested to discriminate peripheral shape-targets presented on the left, right, or bilaterally. On each trial they also saw a central task-irrelevant stimulus, comprising an image of the eye sector of a human face, with those seen eyes open or closed. The conditions with central eye stimuli open (vs closed) induced more errors for contralesional peripheral targets, particularly for bilateral trials. These results suggest that seeing open eyes in central vision may attract attentional resources there, reducing attention to the periphery, particularly for the affected contralesional side. The seen gaze of the examiner may thus need to be considered during confrontation testing and may contribute to the effectiveness of that clinical procedure.

14.
Effects of language learning on categorical perception have been detected in multiple domains. We extended the methods of these studies to gender and pitted the predictions of androcentrism theory and the spatial agency bias against each other. Androcentrism is the tendency to take men as the default gender and is socialized through language learning. The spatial agency bias is a tendency to imagine men before women in the left–right axis in the direction of one's written language. We examined how gender-ambiguous faces were categorized as female or male when presented in the left visual fields (LVFs) and right visual fields (RVFs) to 42 native speakers of English. When stimuli were presented in the RVF rather than the LVF, participants (1) applied a lower threshold to categorize stimuli as male and (2) categorized clearly male faces as male more quickly. Both findings support androcentrism theory suggesting that the left hemisphere, which is specialized for language, processes face stimuli as male-by-default more readily than the right hemisphere. Neither finding evidences an effect of writing direction predicted by the spatial agency bias on the categorization of gender-ambiguous faces.

15.
In a previous experiment using scalp event-related potentials (ERPs), we have described the neuroelectric activities associated with the processing of gender information on human faces (Mouchetant-Rostaing, Giard, Bentin, Aguera, & Pernier, 2000). Here we extend this study by examining the processing of age on faces using a similar experimental paradigm, and we compare age and gender processing. In one session, faces were of the same gender (women) and of one age range (young or old), to reduce gender and age processing. In a second session, faces of young and old women were randomly intermixed but age was irrelevant for the task, hence, age discrimination, if any, was assumed to be incidental. In the third and fourth sessions, faces had to be explicitly categorized according to their age or gender, respectively (intentional discrimination). Neither age nor gender processing affected the occipito-temporal N170 component often associated with the detection of physiognomic features and global structural encoding of faces. Rather, the three age and gender discrimination conditions induced similar fronto-central activities around 145-185 msec. In our previous experiment, this ERP pattern was also found for implicit and explicit categorization of gender from faces but not in a control condition manipulating hand stimuli (Mouchetant-Rostaing, Giard, Bentin, et al., 2000). Whatever their exact nature, these 145-185 msec effects therefore suggest, first, that similar mechanisms could be engaged in age and gender perception, and second, that age and gender may be implicitly processed irrespective of their relevance to the task, through somewhat specialized mechanisms. Additional ERP effects were found at early latencies (45-90 msec) in all three discrimination conditions, and around 200-400 msec during explicit age and gender discrimination. 
These effects have been previously found in control conditions manipulating nonfacial stimuli and may therefore be related to more general categorization processes.

16.
In a previous experiment aimed at studying gender processing from faces, we found unexpected early ERP differences (45-85 ms) for task-irrelevant stimuli between a condition in which the stimuli of each gender were delivered in separate runs and a condition in which the stimuli of both genders were mixed. Similar effects were observed with hand stimuli. These early ERP differences were tentatively related to incidental categorization processes between male and female stimuli. The present study was designed to test the robustness of these early effects for faces, and to examine whether similar effects can also be generated between two classes of non-biological stimuli. We replicated the previous findings for faces, and found similar early differential effects (50-65 ms) for non-biological stimuli (grey and hatched geometrical shapes), though only when the two shape categories were separated by conspicuous visual characteristics. While these results can partly be explained by phenomena related to neuronal habituation in the visual cortex, they may also suggest the existence of coarse and automatic categorization processes for rapid distinction between two broad classes of stimuli with strong psychosocial significance for humans.

17.
Paulsen DJ  Neville HJ 《Neuropsychologia》2008,46(10):2532-2544
Previous research has shown that, in the context of event-related potential (ERP) prime-target experiments, processing meaningful stimuli such as words, phonemes, numbers, pictures of objects, and faces elicits negativities around 400 ms. However, there is little information on whether non-symbolic numerical magnitudes elicit this negative component. The present experiments recorded ERPs while adults made same/different judgments on serially presented prime-target pairs of non-symbolic numerical stimuli containing the same, close, or distant quantities. In Experiment 1, a negativity between 350 and 450 ms was elicited for targets preceded by primes of unequal quantity, and this was greater for close than for distant quantities. Change direction (decreasing or increasing) also modulated a similar negativity: a greater negativity was elicited by targets preceded by larger than by smaller quantities. Experiment 2 replicated the numerical distance and change direction effects for numerical judgments, but found no negative distance effect in a color comparison task using the same stimuli. Additionally, ERP effects of numerical distance were found under implicit conditions, and task proficiency in the number condition modulated implicit and explicit numerical distance ERP effects. These results suggest that the neural systems involved in processing numerical magnitudes contribute to the construction of meaningful, contextual representations, are partly automatic, and display marked individual differences.

18.
Face and gaze processing were studied using magnetoencephalography in 10 children with autism and 10 normally developing children, aged between 7 and 12 years. The children performed two tasks in which they had to discriminate whether images of faces presented sequentially in pairs were identical. The images showed four different categories of gaze: direct gaze, eyes averted (left or right) and closed eyes but there was no instruction to focus on the direction of gaze. Images of motorbikes were used as control stimuli. Faces evoked strong activity over posterior brain regions at about 100 ms in both groups of children. A response at 140 ms to faces observed over extrastriate cortices, thought to be homologous to the N170 in adults, was weak and bilateral in both groups and somewhat weaker (approaching significance) in the children with autism than in the control children. The response to motorbikes differed between the groups at 100 and 140 ms. Averted eyes evoked a strong right lateralized component at 240 ms in the normally developing children that was weak in the clinical group. By contrast, direct gaze evoked a left lateralized component at 240 ms only in children with autism. The findings suggest that face and gaze processing in children with autism follows a trajectory somewhat similar to that seen in normal development but with subtle differences. There is also a possibility that other categories of object may be processed in an unusual way. The inter-relationships between these findings remain to be elucidated.

19.
Social Neuroscience, 2013, 8(4): 317-331
Multiple sources of information from the face guide attention during social interaction. The present study modified the Posner cueing paradigm to investigate how dynamic changes in emotional expression and eye gaze in faces affect the neural processing of subsequent target stimuli. Event-related potentials (ERPs) were recorded while participants viewed centrally presented face displays in which gaze direction (left, direct, right) and facial expression (fearful, neutral) covaried in a fully crossed design. Gaze direction was not predictive of peripheral target location. ERP analysis revealed several sequential effects, including: (1) an early enhancement of target processing following fearful faces (P1); (2) an interaction between expression and gaze (N1), with enhanced target processing following fearful faces with rightward gaze; and (3) an interaction between gaze and target location (P3), with enhanced processing for invalidly cued left visual field targets. Behaviorally, participants responded faster to targets following fearful faces and targets presented in the right visual field, in concordance with the P1 and N1 effects, respectively. The findings indicate that two nonverbal social cues—facial expression and gaze direction—modulate attentional orienting across different temporal stages of processing. Results have implications for understanding the mental chronometry of shared attention and social referencing.

20.
We investigated the ERP correlates of the subjective perception of upright and upside-down ambiguous pictures as faces, using two-tone Mooney stimuli in an explicit facial decision task (deciding whether or not a face is perceived in the display). The difficulty of perceiving upside-down Mooneys as faces was reflected in both lower rates of "Face" responses and delayed "Face" reaction times for upside-down relative to upright stimuli. The N170 was larger for stimuli reported as "faces". It was also larger for upright than for upside-down stimuli, but only when they were reported as faces. Furthermore, facial decision and stimulus orientation effects spread from 140-190 ms to 390-440 ms. The behavioural delay in "Face" responses to upside-down stimuli was reflected in the ERPs by a later effect of facial decision for upside-down relative to upright Mooneys over occipito-temporal electrodes. Moreover, an orientation effect was observed only for stimuli reported as faces; it yielded a marked hemispheric asymmetry, lasting from 140-190 ms to 390-440 ms post-stimulus onset in the left hemisphere but only from 340-390 ms to 390-440 ms in the right hemisphere. Taken together, these results support a preferential involvement of the right hemisphere in the detection of faces, whatever their orientation. By contrast, the early orientation effect in the left hemisphere suggests that upside-down Mooney stimuli were processed as non-face objects until a facial decision was reached in this hemisphere. The present data show that face perception involves not only spatially but also temporally distributed activity in occipito-temporal regions.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  ICP licence 京ICP备09084417号