Similar Literature
20 similar documents found (search time: 31 ms)
1.
It has been shown that stimuli of a task-irrelevant modality receive enhanced processing when they are presented at an attended location in space (crossmodal attention). The present study investigated the effects of visual deprivation on the interaction of the intact sensory systems. Random streams of tactile and auditory stimuli were presented at the left or right index finger of congenitally blind participants. They had to attend to one modality (auditory or tactile) of one side (left or right) and had to respond to deviant stimuli of the attended modality and side. While in a group of sighted participants, early event-related potentials (ERPs) were negatively displaced to stimuli presented at the attended position, compared to the unattended position, for both the task-relevant and the task-irrelevant modality, starting as early as 80 ms after stimulus onset (unimodal and crossmodal spatial attention effects, respectively), corresponding crossmodal effects could not be detected in the blind. In the sighted, spatial attention effects after 200 ms were only significant for the task-relevant modality, whereas a crossmodal effect for this late time window was observed in the blind. This positive rather than negative effect possibly indicates an active suppression of task-irrelevant stimuli at an attended location in space. The present data suggest that developmental visual input is essential for the use of space to integrate input of the non-visual modalities, possibly because of its high spatial resolution. Alternatively, enhanced perceptual skills of the blind within the intact modalities may result in reduced multisensory interactions ("inverse effectiveness of multisensory integration").

2.
An increasing number of animal and human studies suggests that different sensory systems share spatial representations in the brain. The aim of the present study was to test whether attending to auditory stimuli presented at a particular spatial location influences the processing of tactile stimuli at that position and vice versa (crossmodal attention). Moreover, it was investigated which processing stages are influenced by orienting attention to a certain stimulus modality (intermodal attention). Event-related brain potentials (ERPs) were recorded from 15 participants while tactile and auditory stimuli were presented at the left or right side of the body midline. The task of the participants was to attend to either the auditory or to the tactile modality, and to respond to infrequent double-stimuli of either the left or right side. Results showed that spatial attention modulated both early and late somatosensory and auditory ERPs when touch and tones were relevant, respectively. Moreover, early somatosensory (N70–100, N125–175) and auditory (N100–170) potentials, but not later deflections, were affected by spatial attention to the other modality, suggesting bi-directional crossmodal links between hearing and touch. Additionally, ERPs were modulated by intermodal selection mechanisms: stimuli elicited enhanced negative early and late ERPs when they belonged to the attended modality compared to those that belonged to the unattended modality. The present results provide evidence for the parallel influence of spatial and intermodal selection mechanisms at early processing stages while later processing steps are restricted to the relevant modality.

3.
Recent studies indicate that the coordination of spatial attention across modalities may in part be mediated by a supramodal attentional system. We sought to extend the concept of a supramodal system by hypothesizing that involuntary modulations of auditory attentional processes by irrelevant speech signals influence visuospatial attention, suggesting crossmodal links between vision and speech. To test this, we recorded event-related brain potentials (ERPs) of 12 healthy subjects in a visuospatial selective attention task. The task of identifying target stimuli appearing at lateral visual-field locations produced the expected enhancements of the early P1 and N1 ERP components to attended visual stimuli. Understandable and non-understandable task-irrelevant speech was presented either at the visually attended position or at the opposite visual-field location. Speech contralateral to unattended visual stimuli led to a decreased N1 amplitude. This effect was stronger for understandable speech. Thus, speech influences the allocation of visual spatial attention if it is presented at the unattended location. The results suggest crossmodal links between speech and visuospatial attention mechanisms at a very early stage of human perception.

4.
One finding in attention research is that visual and auditory attention mechanisms are linked together. Such a link would predict a central, amodal capacity limit in processing visual and auditory stimuli. Here we show that this is not the case. Letter streams were accompanied by asynchronously presented streams of auditory, visual, and audiovisual objects. Either the letter streams or the visual, auditory, or audiovisual parts of the object streams were attended. Attending to various aspects of the objects resulted in modulations of the steady-state visual evoked potentials (SSVEPs) elicited by the letter streams. SSVEPs were larger when auditory objects were attended than when visual objects alone, or auditory and visual objects together, were attended. SSVEP amplitudes were the same in the latter conditions, indicating that attentional capacity between modalities is larger than attentional capacity within one and the same modality.

5.
Attending to a visual or auditory stimulus often requires irrelevant information to be filtered out, both within the modality attended and in other modalities. For example, attentively listening to a phone conversation can diminish our ability to detect visual events. We used functional magnetic resonance imaging (fMRI) to examine brain responses to visual and auditory stimuli while subjects attended visual or auditory information. Although early cortical areas are traditionally considered unimodal, we found that brain responses to the same ignored information depended on the modality attended. In early visual area V1, responses to ignored visual stimuli were weaker when attending to another visual stimulus, compared with attending to an auditory stimulus. The opposite was true in more central visual area MT+, where responses to ignored visual stimuli were weaker when attending to an auditory stimulus. Furthermore, fMRI responses to the same ignored visual information depended on the location of the auditory stimulus, with stronger responses when the attended auditory stimulus shared the same side of space as the ignored visual stimulus. In early auditory cortex, responses to ignored auditory stimuli were weaker when attending to a visual stimulus. A simple parameterization of our data can describe the effects of redirecting attention across space within the same modality (spatial attention) or across modalities (cross-modal attention), and the influence of spatial attention across modalities (cross-modal spatial attention). Our results suggest that the representation of unattended information depends on whether attention is directed to another stimulus in the same modality or the same region of space.

6.
Crossmodal links in spatial attention were studied in an experiment where participants had to detect peripheral tactile or visual targets on the attended side, while ignoring all stimuli on the unattended side and in the currently irrelevant modality. Both relevant locations and relevant modalities were specified on a trial-by-trial basis by auditory precues. Spatial orienting in the cue-target interval was reflected in anterior negativities and occipital positivities contralateral to the cued side, either when vision or touch was cued as relevant. These effects resembled previously reported ERP modulations during shifts of visual attention, implicating supramodal mechanisms in the control of spatial attention and demonstrating their independence of cue modality. Early effects of spatial attention on somatosensory and visual ERPs were of equivalent size for currently relevant and irrelevant modalities. Results support the idea that crossmodal links in spatial attention are mediated by supramodal control mechanisms.

7.
We used event-related functional magnetic resonance imaging to study the neural correlates of endogenous spatial attention for vision and touch. We examined activity associated with attention-directing cues (central auditory pure tones), symbolically instructing subjects to attend to one hemifield or the other prior to upcoming stimuli, for a visual or tactile task. In different sessions, subjects discriminated either visual or tactile stimuli at the covertly attended side, during bilateral visuotactile stimulation. To distinguish cue-related preparatory activity from any modulation of stimulus processing, unpredictably on some trials only the auditory cue was presented. The use of attend-vision and attend-touch blocks revealed whether preparatory attentional effects were modality-specific or multimodal. Unimodal effects of spatial attention were found in somatosensory cortex for attention to touch, and in occipital areas for attention to vision, both contralateral to the attended side. Multimodal spatial effects (i.e. effects of attended side irrespective of task-relevant modality) were detected in contralateral intraparietal sulcus, traditionally considered a multimodal brain region; and also in the middle occipital gyrus, an area traditionally considered purely visual. Critically, all these activations were observed even on cue-only trials, when no visual or tactile stimuli were subsequently presented. Endogenous shifts of spatial attention result in changes of brain activity prior to the presentation of target stimulation (baseline shifts). Here, we show for the first time the separable multimodal and unimodal components of such preparatory activations. Additionally, irrespective of the attended side and modality, attention-directing auditory cues activated a network of superior frontal and parietal association areas that may play a role in voluntary control of spatial attention for both vision and touch.

8.
To compare the effects of music from different cultural environments (Guqin: Chinese music; piano: Western music) on crossmodal selective attention, behavioral and event-related potential (ERP) data in a standard two-stimulus visual oddball task were recorded from Chinese subjects in three conditions: silence, Guqin music, or piano music background. Visual task data were then compared with auditory task data collected previously. In contrast with the results of the auditory task, the early (N1) and late (P300) stages exhibited no differences between Guqin and piano backgrounds during the visual task. Taking our previous study and this study together, we can conclude that although culturally familiar music influenced selective attention in both the early and late stages, these effects appeared only within a sensory modality (auditory) but not across sensory modalities (visual). Thus, the musical cultural factor is more obvious in intramodal than in crossmodal selective attention.

9.
Behavioral and event-related potential (ERP) studies have shown that spatial attention is gradually distributed around the center of the attentional focus. The present study compared uni- and crossmodal gradients of spatial attention to investigate whether the orienting of auditory and visual spatial attention is based on modality-specific or supramodal representations of space. Auditory and visual stimuli were presented from five speaker locations positioned in the right hemifield. Participants had to attend to the innermost or outermost right position in order to detect either visual or auditory deviant stimuli. Detection rates and event-related potentials (ERPs) indicated that spatial attention is distributed as a gradient. Unimodal spatial ERP gradients correlated with the spatial resolution of the modality. Crossmodal spatial gradients were always broader than the corresponding unimodal spatial gradients. These results suggest that both modality-specific and supramodal spatial representations are activated during the orienting of attention in space.

10.
In everyday life, emotional events are perceived by multiple sensory systems. Research has shown that recognition of emotions in one modality is biased towards the emotion expressed in a simultaneously presented but task irrelevant modality. In the present study, we combine visual and auditory stimuli that convey similar affective meaning but have a low probability of co-occurrence in everyday life. Dynamic face-blurred whole body expressions of a person grasping an object while expressing happiness or sadness are presented in combination with fragments of happy or sad instrumental classical music. Participants were instructed to categorize the emotion expressed by the visual stimulus. The results show that recognition of body language is influenced by the auditory stimuli. These findings indicate that crossmodal influences as previously observed for audiovisual speech can also be obtained from the ignored auditory to the attended visual modality in audiovisual stimuli that consist of whole bodies and music.

11.
The postauricular reflex (PAR) is a vestigial microreflex evoked by an abrupt auditory onset. Previous studies have indicated that the PAR is unaffected by auditory selective attention. Here, we report that the PAR can be modulated by a crossmodal manipulation of attentional demands within the visual modality. Subjects (N = 17) performed a central rapid serial visual presentation (RSVP) task while presented with irrelevant auditory distractor probes that elicited the PAR. Visual attentional demands were manipulated by altering the perceptual load of the RSVP task. The PAR modulated systematically with perceptual load, decreasing in amplitude with increased perceptual load. Results indicate that the PAR can be influenced by attention, at least within the visual modality.

12.
There is debate in the crossmodal cueing literature as to whether capture of visual attention by means of sound is a fully automatic process. Recent studies show that when visual attention is endogenously focused, sound still captures attention. The current study investigated whether there is interaction between exogenous auditory and visual capture. Participants performed an orthogonal cueing task in which the visual target was preceded by both a peripheral visual and an auditory cue. When both cues were valid at chance level, both visual and auditory capture were observed. However, when the validity of the visual cue was increased to 80%, only visual capture and no auditory capture was observed. Furthermore, a highly predictive (80% valid) auditory cue was not able to prevent visual capture. These results demonstrate that crossmodal auditory capture does not occur when a competing predictive visual event is presented and is therefore not a fully automatic process.

13.
Recent event-related brain potential (ERP) studies have revealed crossmodal links in spatial attention, but have not yet investigated differences in the spatial tuning of attention between task-relevant and irrelevant modalities. We studied the spatial distribution of attention in vision under conditions where participants were instructed to attend to the left or right hand in order to detect infrequent targets, and to entirely ignore visual stimuli presented via LEDs at two eccentricities in the left or right hemifield. Hands were located close to two of these four LEDs in different blocks. Visual N1 amplitudes were enhanced when visual stimuli in the cued hemifield were close to the attended hand, relative to visual stimuli presented at the other location on the same side. These within-hemifield attentional modulations of visual processing demonstrate that crossmodal attention is not distributed diffusely across an entire hemifield. The spatial tuning of tactile attention transfers crossmodally to affect vision, consistent with spatial selection at a multimodal level of representation.

14.
The sustained periodic modulation of a stimulus induces an entrainment of cortical neurons responding to the stimulus, appearing as a steady‐state evoked potential (SS‐EP) in the EEG frequency spectrum. Here, we used frequency tagging of SS‐EPs to study the crossmodal links in spatial attention between touch and vision. We hypothesized that a visual stimulus approaching the left or right hand orients spatial attention toward the approached hand, and thereby enhances the processing of vibrotactile input originating from that hand. Twenty‐five subjects took part in the experiment: 16‐s trains of vibrotactile stimuli (4.2 and 7.2 Hz) were applied simultaneously to the left and right hand, concomitantly with a punctate visual stimulus blinking at 9.8 Hz. The visual stimulus approached the left or right hand. The hands were either uncrossed (left and right hands to the left and right of the participant) or crossed (left and right hands to the right and left of the participant). The vibrotactile stimuli elicited two distinct SS‐EPs with scalp topographies compatible with activity in the contralateral primary somatosensory cortex. The visual stimulus elicited a third SS‐EP with a topography compatible with activity in visual areas. When the visual stimulus was over one of the hands, the amplitude of the vibrotactile SS‐EP elicited by stimulation of that hand was enhanced, regardless of whether the hands were uncrossed or crossed. This demonstrates a crossmodal effect of spatial attention between vision and touch, integrating proprioceptive and/or visual information to map the position of the limbs in external space.
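The frequency-tagging logic in the abstract above — each hand's vibrotactile train "tagged" at its own frequency, with attention read out as an amplitude change at the corresponding spectral peak — can be illustrated with a minimal single-bin DFT sketch. Only the tag frequencies 4.2 and 7.2 Hz come from the abstract; the sampling rate, attention gain, noise level, and the signal itself are hypothetical, and a 10-s epoch is used so both tag frequencies complete whole cycles.

```python
import math, cmath, random

def ssep_amplitude(signal, fs, freq):
    """Single-bin DFT: amplitude of the signal at one tagged frequency."""
    n = len(signal)
    # correlate the signal with a complex exponential at the tag frequency
    acc = sum(x * cmath.exp(-2j * math.pi * freq * k / fs)
              for k, x in enumerate(signal))
    return 2 * abs(acc) / n  # rescale so a pure sinusoid returns its amplitude

fs, dur = 500, 10  # Hz, seconds (both tag frequencies fit whole cycles in 10 s)
t = [k / fs for k in range(fs * dur)]
random.seed(0)

# Hypothetical trial: left hand tagged at 4.2 Hz, right hand at 7.2 Hz; the
# visual stimulus approaches the left hand, boosting that hand's tag amplitude.
attended_gain = 1.4
eeg = [attended_gain * math.sin(2 * math.pi * 4.2 * x)
       + math.sin(2 * math.pi * 7.2 * x)
       + random.gauss(0, 0.5) for x in t]

amp_left = ssep_amplitude(eeg, fs, 4.2)   # enhanced (attended hand)
amp_right = ssep_amplitude(eeg, fs, 7.2)  # baseline (unattended hand)
print(round(amp_left, 2), round(amp_right, 2))
```

The comparison of the two tag amplitudes within the same recording is the essence of the method: both hands are stimulated continuously, yet the spectrum separates their responses cleanly.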

15.
Temporally synchronous auditory cues can facilitate participants’ performance on dynamic visual search tasks. Making auditory cues spatially informative with regard to the target location can reduce search latencies still further. In the present study, we investigated how multisensory integration, and temporal and spatial attention, might conjointly influence participants’ performance on an elevation discrimination task for a masked visual target presented in a rapidly changing sequence of masked visual distractors. Participants were presented with either spatially uninformative (centrally presented), spatially valid (on the target side), or spatially invalid tones that were synchronous with the presentation of the visual target. Participants responded significantly more accurately following the presentation of the spatially valid as compared to the uninformative or invalid auditory cues. Participants endogenously shifted their attention to the likely location of the target indicated by the valid spatial auditory cue (reflecting top-down, cognitive processing mechanisms), which facilitated their processing of the visual target over and above any bottom-up benefits associated solely with the synchronous presentation of the auditory and visual stimuli. The results of the present study therefore suggest that crossmodal attention (both spatial and temporal) and multisensory integration can work in parallel to facilitate people's ability to respond most efficiently to multisensory information.

16.
The aim of this study was to establish whether spatial attention triggered by bimodal exogenous cues acts differently as compared to unimodal and crossmodal exogenous cues due to crossmodal integration. In order to investigate this issue, we examined cuing effects in discrimination tasks and compared these effects in a condition wherein a visual target was preceded by both visual and auditory exogenous cues delivered simultaneously at the same side (bimodal cue), with conditions wherein the visual target was preceded by either a visual (unimodal cue) or an auditory cue (crossmodal cue). The results of two experiments revealed that cuing effects on RTs in these three conditions with an SOA of 200 ms had comparable magnitudes. Differences at a longer SOA of 600 ms (inhibition of return for bimodal cues, Experiment 1) disappeared when catch trials were included (in Experiment 2). The current data do not support an additional influence of crossmodal integration on exogenous orienting, but are well in agreement with the existence of a supramodal spatial attention module that allocates attentional resources towards stimulated locations for different sensory modalities.

17.
Selective attention allows us to focus on particular sensory modalities and locations. Relatively little is known about how attention to a sensory modality may relate to selection of other features, such as spatial location, in terms of brain oscillations, although it has been proposed that low-frequency modulation (α- and β-bands) may be key. Here, we investigated how attention to space (left or right) and attention to modality (vision or touch) affect ongoing low-frequency oscillatory brain activity over human sensory cortex. Magnetoencephalography was recorded while participants performed a visual or tactile task. In different blocks, touch or vision was task-relevant, whereas spatial attention was cued to the left or right on each trial. Attending to one or other modality suppressed α-oscillations over the corresponding sensory cortex. Spatial attention led to reduced α-oscillations over both sensorimotor and occipital cortex contralateral to the attended location in the cue-target interval, when either modality was task-relevant. Even modality-selective sensors also showed spatial-attention effects for both modalities. The visual and sensorimotor results were generally highly convergent, yet, although attention effects in occipital cortex were dominant in the α-band, in sensorimotor cortex, these were also clearly present in the β-band. These results extend previous findings that spatial attention can operate in a multimodal fashion and indicate that attention to space and modality both rely on similar mechanisms that modulate low-frequency oscillations.

18.
The goal of the present study was to determine if older adults benefited from attention to a specific sensory modality in a voluntary attention task and evidenced changes in voluntary or involuntary attention when compared to younger adults. Suppressing and enhancing effects of voluntary attention were assessed using two cued forced-choice tasks, one that asked participants to localize and one that asked them to categorize visual and auditory targets. Involuntary attention was assessed using the same tasks, but with no attentional cues. The effects of attention were evaluated using traditional comparisons of means and Cox proportional hazards models. All analyses showed that older adults benefited behaviorally from selective attention in both visual and auditory conditions, including robust suppressive effects of attention. Of note, the performance of the older adults was commensurate with that of younger adults in almost all analyses, suggesting that older adults can successfully engage crossmodal attention processes. Thus, age-related increases in distractibility across sensory modalities are likely due to mechanisms other than deficits in attentional processing.

19.
The purpose of the research reported here was to examine a number of issues relating to the nature of selective attention effects on auditory event-related potentials (ERPs), namely, to determine the relative contribution of N1 and slow wave (SW) to the early and late components of Nd respectively, where Nd is defined as the negative shift of attended ERPs relative to unattended ERPs; to examine whether individual differences in Nd morphology are related to performance and the strategies that subjects use; and to determine the contribution of changes in the attended and unattended ERPs to Nd. Auditory ERPs were recorded from subjects as they carried out an auditory selective attention task and a visual target detection task. The auditory selective attention task was a multidimensional task in which stimuli varied on location, pitch and duration and in which the subject's task was to pay attention to a particular location/pitch combination and respond whenever they detected a long-duration target tone. In the visual target detection task, subjects were required to respond whenever they detected a colour change in a light-emitting diode which also acted as a fixation point. Auditory ERPs recorded during the visual task were used to provide a measure of exogenous components uncontaminated by differential effects of selective processing of auditory stimuli. The results suggested that early Nd and N1 are independently generated as Nd did not exhibit the contralateral scalp focus typical of N1, and that late Nd is independent of SW. While substantial differences in Nd morphology were observed over subjects, these differences showed no consistent relationships to performance or to task strategies. 
Comparison of auditory ERPs during active auditory attention with auditory ERPs recorded during the visual control task indicated that there was an early negative shift of the attended ERP, a later negative shift of the attended ERP which had a frontal focus and a later positive shift of the unattended ERP. These results suggest that there are active processes involved in the processing of stimuli from both the attended and unattended source.
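The Nd measure defined in the abstract above is simply a difference wave: the attended ERP minus the unattended ERP, examined in early and late time windows. A toy sketch of the computation follows; the averaged ERP values (in µV, one sample per 50 ms) and the window boundaries are invented for illustration and do not come from the study.

```python
# Nd is defined as the negative shift of the attended ERP relative to the
# unattended ERP: Nd = ERP_attended - ERP_unattended.
# Hypothetical averaged ERPs (µV) at 10 time points, 50 ms apart.
attended   = [0.0, -1.2, -2.5, -3.1, -2.0, -1.0, -0.5, 0.2, 0.4, 0.3]
unattended = [0.0, -0.8, -1.5, -1.9, -1.1, -0.4,  0.1, 0.5, 0.6, 0.4]

# the Nd difference wave, sample by sample
nd = [a - u for a, u in zip(attended, unattended)]

# summarize early Nd (first half) and late Nd (second half) as window means
early_nd = sum(nd[:5]) / 5
late_nd = sum(nd[5:]) / 5
print([round(x, 2) for x in nd], round(early_nd, 2), round(late_nd, 2))
```

With these invented values, both window means are negative (the attended ERP is more negative throughout), with the early window showing the larger shift — the pattern the Nd nomenclature is built around.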

20.
We studied brain activity during the displacement of attention in a modified visuo‐spatial orienting paradigm. Using a behaviorally relevant no‐shift condition as a control, we asked whether ipsi‐ or contralateral parietal alpha band activity is specifically related to covert shifts of attention. Cue‐related event‐related potentials revealed an attention directing anterior negativity (ADAN) contralateral to the shift of attention and P3 and contingent negative variation waveforms that were enhanced in both shift conditions as compared to the no‐shift task. When attention was shifted away from fixation, alpha band activity over parietal regions ipsilateral to the attended hemifield was enhanced relative to the control condition, albeit with different dynamics in the upper and lower alpha subbands. Contralateral‐to‐attended parietal alpha band activity was indistinguishable from the no‐shift task.


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号