Similar literature (20 results)
1.
A fundamental question with regard to perceptual development is how multisensory information is processed in the brain during the early stages of development. Although a growing body of evidence has shown the early emergence of modality‐specific functional differentiation of the cortical regions, the interplay between sensory inputs from different modalities in the developing brain is not well understood. To study the effects of auditory input during audio‐visual processing in 3‐month‐old infants, we evaluated the spatiotemporal cortical hemodynamic responses of 50 infants while they perceived visual objects with or without accompanying sounds. The responses were measured using 94‐channel near‐infrared spectroscopy over the occipital, temporal, and frontal cortices. The effects of sound manipulation were pervasive throughout the diverse cortical regions and were specific to each cortical region. Visual stimuli co‐occurring with sound induced the early‐onset activation of the early auditory region, followed by activation of the other regions. Removal of the sound stimulus resulted in focal deactivation in the auditory regions and reduced activation in the early visual region, the association region of the temporal and parietal cortices, and the anterior prefrontal regions, suggesting multisensory interplay. In contrast, equivalent activations were observed in the lateral occipital and lateral prefrontal regions, regardless of sound manipulation. Our findings indicate that auditory input did not generally enhance overall activation in relation to visual perception, but rather induced specific changes in each cortical region. The present study implies that 3‐month‐old infants may perceive audio‐visual multisensory inputs by using the global network of functionally differentiated cortical regions. Hum Brain Mapp, 2013. © 2011 Wiley Periodicals, Inc.

2.
An extrastriate visual area near the human temporo-parieto-occipital junction (TPO) may selectively mediate motion processing, while contributing little to the perception of color or form. This TPO area may be the human analogue of the monkey middle temporal (MT or V5) and medial superior temporal (MST) extrastriate visual areas. The selectivity of the effect of transcranial magnetic stimulation (TMS) on motion processing was unknown, as was the time course of occipital-to-TPO motion processing. In the first experiment, unilateral TMS was delivered over TPO 50-250 ms after the onset of a random dot motion discrimination display that was presented in the right or left hemifield. TMS reduced the correct discrimination of motion direction only when it was delivered in a discrete time window 100-175 ms following the onset of the display. TMS did not significantly affect hemifield spatial acuity in the same time window. In the second experiment, bilateral TMS delivered over occipital cortex also degraded the discrimination of motion-defined form (MDF) in a discrete time window following the onset of a display presented foveally. Bilateral focal TMS delivered over TPO disrupted the discrimination of MDF in a time window beginning 20-40 ms later than the effect of TMS delivered over occipital cortex. Bilateral focal TMS delivered over TPO also degraded the discrimination of color-defined form, motion direction and color. TMS can trace the timing of visual processing from occipital to extrastriate visual areas.

3.
Ambiguous stimuli have been widely used to study the neuronal correlates of consciousness. Recently, it has been suggested that conscious perception might arise from the dynamic interplay of functionally specialized but widely distributed cortical areas. While previous research mainly focused on phase coupling as a correlate of cortical communication, more recent findings indicated that additional coupling modes might coexist and possibly subserve distinct cortical functions. Here, we studied two coupling modes, namely phase and envelope coupling, which might differ in their origins, putative functions and dynamics. Therefore, we recorded 128‐channel EEG while participants performed a bistable motion task and utilized state‐of‐the‐art source‐space connectivity analysis techniques to study the functional relevance of different coupling modes for cortical communication. Our results indicate that gamma‐band phase coupling in extrastriate visual cortex might mediate the integration of visual tokens into a moving stimulus during ambiguous visual stimulation. Furthermore, our results suggest that long‐range fronto‐occipital gamma‐band envelope coupling sustains the horizontal percept during ambiguous motion perception. Additionally, our results support the idea that local parieto‐occipital alpha‐band phase coupling controls the inter‐hemispheric information transfer. These findings provide correlative evidence for the notion that synchronized oscillatory brain activity reflects the processing of sensory input as well as the information integration across several spatiotemporal scales. The results indicate that distinct coupling modes are involved in different cortical computations and that the rich spatiotemporal correlation structure of the brain might constitute the functional architecture for cortical processing and specific multi‐site communication. Hum Brain Mapp 37:4099–4111, 2016. © 2016 Wiley Periodicals, Inc.
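The two coupling modes contrasted in this abstract can be made concrete with a minimal sketch (illustrative only, not the authors' source-space pipeline): phase coupling measured as the phase-locking value between two band-limited signals, and envelope coupling measured as the correlation of their amplitude envelopes, both derived from the Hilbert transform.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase coupling: consistency of the instantaneous phase difference
    between two band-limited signals (1 = perfectly locked, 0 = none)."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

def envelope_correlation(x, y):
    """Envelope coupling: Pearson correlation of the amplitude envelopes."""
    env_x = np.abs(hilbert(x))
    env_y = np.abs(hilbert(y))
    return np.corrcoef(env_x, env_y)[0, 1]

# Toy example: two gamma-band (60 Hz) signals with a constant phase lag
# and a shared slow amplitude modulation -- high in both measures.
fs = 1000
t = np.arange(0, 2, 1 / fs)
mod = 1 + 0.5 * np.sin(2 * np.pi * 0.5 * t)       # shared slow envelope
x = mod * np.sin(2 * np.pi * 60 * t)
y = mod * np.sin(2 * np.pi * 60 * t + np.pi / 4)  # constant 45-degree lag
print(phase_locking_value(x, y))   # near 1: strong phase coupling
print(envelope_correlation(x, y))  # near 1: strong envelope coupling
```

In real EEG analyses these measures are computed per frequency band on source-projected signals; the point of the toy example is only that the two measures index different aspects of coordination, so they can dissociate.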

4.
The timing of personal movement with respect to external events has previously been investigated using a synchronized finger‐tapping task with a sequence of auditory or visual stimuli. While visuomotor synchronization is more accurate with moving stimuli than with stationary stimuli, it remains unclear whether the same principle holds true in the auditory domain. Although the right inferior–superior parietal lobe (IPL/SPL), a center of auditory motion processing, is expected to be involved in auditory–motor synchronization with moving sounds, its functional relevance has not yet been investigated. The aim of the present study was thus to clarify whether horizontal auditory motion affects the accuracy of finger‐tapping synchronized with sounds, as well as whether the application of transcranial direct current stimulation (tDCS) to the right IPL/SPL affects this. Nineteen healthy right‐handed participants performed a task in which tapping was synchronized with both stationary sounds and sounds that created apparent horizontal motion. This task was performed before and during anodal, cathodal and sham tDCS application to the right IPL/SPL in separate sessions. The time difference between the onset of the sounds and tapping was larger with apparently moving sounds than with stationary sounds. Cathodal tDCS decreased this difference, anodal tDCS increased the variance of the difference and sham stimulation had no effect. These results supported the hypothesis that auditory motion disturbs efficient auditory–motor synchronization and that the right IPL/SPL plays an important role in tapping in synchrony with moving sounds via auditory motion processing.

5.
A remarkable example of rapid perceptual learning is the visual recalibration of auditory spatial perception, which can result in either a bias (ventriloquism after-effect) or an improvement (multisensory enhancement) in auditory localization. Here, we examine the possibility that these after-effects might depend on two distinct neural pathways (geniculostriate vs. collicular–extrastriate). To this end, patients with a lesion of the striate cortex (hemianopic patients) or temporoparietal cortex (neglect patients) were asked to localize weak sounds, before and after a brief exposure to repetitive auditory–visual stimulation which was given either in the normal or in the affected field. Adaptation comprised spatially disparate (Experiment 1) or spatially coincident (Experiment 2) auditory–visual stimuli. After exposure to spatially disparate stimuli in the normal field, all patients exhibited the usual shifts toward the visual attractor, at each sound location. In contrast, when the same kind of adaptation was given in the affected field, a consistent shift was still evident in neglect patients but not in patients with hemianopia. After adaptation to spatially coincident stimuli, and regardless of the adaptation hemifield, all patients exhibited a significant improvement in auditory localization, which was largest for sounds presented at the adapted location. The findings suggest the presence of two distinct recalibration mechanisms. Adapting to spatially conflicting stimuli invokes a corrective mechanism implemented within the geniculostriate circuit, which tries to reduce the registered discrepancy. Adapting to spatially aligned inputs invokes a mechanism implemented along a collicular–extrastriate circuit, which tries to reduce the localization error.

6.
Prismatic adaptation is increasingly recognised as an effective procedure for rehabilitating symptoms of unilateral spatial neglect, producing relatively long-lasting improvements on a variety of spatial attention tasks. The mechanisms by which the aftereffects of adaptation change neglect patients’ performance on these tasks remain controversial. It is not clear, for example, whether adaptation directly influences the pathological ipsilesional attention bias that underlies neglect, or whether it simply changes exploratory motor behaviour. Here we used visual and auditory versions of a target detection task with a secondary task at fixation. Under these conditions, patients with neglect demonstrated a spatial gradient in their ability to orient to the brief, peripheral visual or auditory targets. Following prism adaptation, we found that overall performance on both the auditory and visual task improved; however, most patients in our sample did not show changes in their visual or auditory spatial gradient of attention, despite adequate aftereffects of adaptation and significant improvement in neglect on visual cancellation. Although there were individual cases that suggested prism-induced changes in visual target detection, and even reversal of the visual spatial gradient, such cases were not evident for the auditory modality. The findings indicate that spatial gradients in stimulus-driven attention may be less responsive to the effects of prism adaptation than neglect symptoms in voluntary orienting and exploratory behaviour. Individual factors such as lesion site and symptom severity may also determine the expression of prism effects on spatial neglect.

7.
Several studies have shown activation of the mirror neuron system (MNS), comprising the temporal, posterior parietal, and sensorimotor areas when observing plausible actions, but far less is known on how these cortical areas interact during the recognition of a plausible action. Here, we recorded neural activity with magnetoencephalography while subjects viewed point‐light displays of biologically plausible and scrambled versions of actions. We were interested in modulations of oscillatory activity and, specifically, in coupling of oscillatory activity between visual and motor areas. Both plausible and scrambled actions elicited modulations of θ (5–7 Hz), α (7–13 Hz), β (13–35 Hz), and γ (55–100 Hz) power within visual and motor areas. When comparing between the two actions, we observed sequential and spatially distinct increases of γ (~65 Hz), β (~25 Hz), and α (~11 Hz) power between 0.5 and 1.3 s in parieto‐occipital, sensorimotor, and left temporal areas. In addition, significant clusters of γ (~65 Hz) and α/β (~15 Hz) power decrease were observed in right temporal and parieto‐occipital areas between 1.3 and 2.0 s. We found β‐power in sensorimotor areas to be positively correlated on a trial‐by‐trial basis with parieto‐occipital γ and left temporal α‐power for the plausible but not for the scrambled condition. These results provide new insights in the neuronal oscillatory activity of the areas involved in the recognition of plausible action movements and their interaction. The power correlations between specific areas underscore the importance of interactions between visual and motor areas of the MNS during the recognition of a plausible action. Hum Brain Mapp 35:581–592, 2014. © 2012 Wiley Periodicals, Inc.

8.
Sanabria D, Soto-Faraco S, Spence C. Neuroreport 2004, 15(18):2745–2749
In the present study, we explored the role of visual perceptual grouping on audiovisual motion integration, using an adaptation of the crossmodal dynamic capture task developed by Soto-Faraco et al. The principles of perceptual grouping were used to vary the perceived direction (horizontal vs vertical) and extent of apparent motion within the visual modality. When the critical visual stimuli, giving rise to horizontal local motion, were embedded within a larger array of lights, giving rise to the perception of global motion vertically, the influence of visual motion information on the perception of auditory apparent motion (moving horizontally) was reduced significantly. These results highlight the need to consider intramodal perceptual grouping when investigating crossmodal perceptual grouping.

9.
Behavioral and physiological studies have established that visual attention to a given feature or location can modulate early visual processing. In the present experiment, we asked whether auditory attention can likewise influence visual processing. We used a visual illusion, the motion aftereffect (MAE), to assess the effects of visual and auditory attention on motion processing in human area MT+. We acquired psychophysical and functional magnetic resonance imaging (fMRI) data while subjects fixated and viewed moving and stationary stimuli in alternating blocks. For each of four motion conditions, we measured the duration of the subsequent MAE, the time for activity in MT+ to return to baseline after motion adaptation (decay time), and the magnitude of MT+ activity during motion adaptation. For each subject, we first obtained measures of motion processing in the absence of attentional demands, by comparing reversing and expanding motion conditions. Subjects perceived the MAE following adaptation to expanding but not reversing motion, as observed previously, and decay times in MT+ were selectively prolonged after expanding motion. We then assessed the effects of performing either a visual or an auditory attentional task during expanding motion adaptation. Performance of the attentional task, whether visual or auditory, produced a significant reduction of subsequent MAE perception and associated decay times in MT+, as compared to expanding motion with fixation only. Both attentional tasks also reduced the magnitude of activation during motion adaptation. These data show that auditory attention, like visual attention, can modify sensory processing at a remarkably early stage of the visual hierarchy.

10.
Adaptation to visual or auditory motion affects within‐modality motion processing as reflected by visual or auditory free‐field motion‐onset evoked potentials (VEPs, AEPs). Here, a visual–auditory motion adaptation paradigm was used to investigate the effect of visual motion adaptation on VEPs and AEPs to leftward motion‐onset test stimuli. Effects of visual adaptation to (i) scattered light flashes, and motion in the (ii) same or in the (iii) opposite direction of the test stimulus were compared. For the motion‐onset VEPs, i.e. the intra‐modal adaptation conditions, direction‐specific adaptation was observed – the change‐N2 (cN2) and change‐P2 (cP2) amplitudes were significantly smaller after motion adaptation in the same than in the opposite direction. For the motion‐onset AEPs, i.e. the cross‐modal adaptation condition, there was an effect of motion history only in the change‐P1 (cP1), and this effect was not direction‐specific – cP1 was smaller after scatter than after motion adaptation to either direction. No effects were found for later components of motion‐onset AEPs. While the VEP results provided clear evidence for the existence of a direction‐specific effect of motion adaptation within the visual modality, the AEP findings suggested merely a motion‐related, but not a direction‐specific effect. In conclusion, the adaptation of veridical auditory motion detectors by visual motion is not reflected by the AEPs of the present study.

11.
Proprioceptive influences on auditory and visual spatial localization
We evaluated the influence of proprioceptive information about arm position on the perceptual localization of auditory and visual targets attached to the hand. Our approach was to distort the perceived position of the restrained arm by means of mechanical vibration of the biceps brachii muscle; such vibration elicits compelling apparent extension of the stationary forearm (Goodwin, G. M., D. I. McCloskey, and P. B. C. Matthews (1972) Science 175: 1382-1384, Brain 95: 705-748), and subjects report changes in the apparent directions of the auditory and visual targets attached to their hand. These changes are in the same direction and plane as apparent arm motion and their onsets are coincident with or lag slightly behind the experienced displacement of the arm. While visual motion is being experienced, a subject's eyes remain steadily fixating the target light. The pattern of findings demonstrates that proprioceptive information about limb position can influence the central representation of gaze and auditory localization can be similarly influenced. The biasing of auditory localization indicates that identical patterns of arrival time and intensity cues at the two ears can give rise to the perception of sounds in widely disparate spatial positions in relation to the head and body, depending on the proprioceptive representation of the direction of the sound source.
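The "arrival time and intensity cues at the two ears" mentioned above are the interaural time and level differences. As a hedged illustration of the time cue only (a standard spherical-head approximation, not a model from this study), the Woodworth formula relates source azimuth to interaural time difference; the abstract's point is that this cue fixes direction only relative to the head, so a shifted proprioceptive representation of the body can relocate the same cue pattern in external space.

```python
import math

def interaural_time_difference(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Woodworth spherical-head approximation of the interaural time
    difference (seconds) for a distant source at the given azimuth
    (0 degrees = straight ahead; default head radius ~8.75 cm)."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + math.sin(theta))

print(interaural_time_difference(0))   # 0.0 s: source straight ahead
print(interaural_time_difference(90))  # roughly 0.00066 s (~660 microseconds)
```

The same azimuth-to-ITD mapping holds whatever the arm feels like it is doing; it is the interpretation of that head-centred azimuth in body coordinates that the vibration manipulation distorts.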

12.
Transcranial magnetic stimulation (TMS) over the occipital pole can produce an illusory percept of a light flash (or ‘phosphene’), suggesting an excitatory effect. Whereas previous reported effects produced by single‐pulse occipital pole TMS are typically disruptive, here we report the first demonstration of a location‐specific facilitatory effect on visual perception in humans. Observers performed a spatial cueing orientation discrimination task. An orientation target was presented in one of two peripheral placeholders. A single pulse below the phosphene threshold applied to the occipital pole 150 or 200 ms before stimulus onset was found to facilitate target discrimination in the contralateral compared with the ipsilateral visual field. At the 150‐ms time window contralateral TMS also amplified cueing effects, increasing both facilitation effects for valid cues and interference effects for invalid cues. These results are the first to show location‐specific enhanced visual perception with single‐pulse occipital pole stimulation prior to stimulus presentation, suggesting that occipital stimulation can enhance the excitability of visual cortex to subsequent perception.

13.
Numerous studies have demonstrated that the vision of lip movements can alter the perception of auditory speech syllables (McGurk effect). While there is ample evidence for integration of text and auditory speech, there are only a few studies on the orthographic equivalent of the McGurk effect. Here, we examined whether written text, like visual speech, can induce an illusory change in the perception of speech sounds on both the behavioural and neural levels. In a sound categorization task, we found that both text and visual speech changed the identity of speech sounds from an /aba/‐/ada/ continuum, but the size of this audiovisual effect was considerably smaller for text than visual speech. To examine at which level in the information processing hierarchy these multisensory interactions occur, we recorded electroencephalography in an audiovisual mismatch negativity (MMN, a component of the event‐related potential reflecting preattentive auditory change detection) paradigm in which deviant text or visual speech was used to induce an illusory change in a sequence of ambiguous sounds halfway between /aba/ and /ada/. We found that only deviant visual speech induced an MMN, but not deviant text, which induced a late P3‐like positive potential. These results demonstrate that text has much weaker effects on sound processing than visual speech does, possibly because text has different biological roots than visual speech.
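The MMN in paradigms like the one described above is conventionally quantified as a deviant-minus-standard difference wave computed from trial-averaged ERPs. A minimal sketch with simulated data (illustrative only, not the authors' analysis code):

```python
import numpy as np

def mismatch_negativity(standard_epochs, deviant_epochs):
    """Average each condition over trials (rows = trials, columns = time
    samples) and return the deviant-minus-standard difference wave."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Toy data: 100 trials x 50 time samples of unit-variance noise; deviant
# trials carry an extra negative deflection in samples 20-29, standing in
# for the MMN elicited by a perceived change in the sound sequence.
rng = np.random.default_rng(0)
standard = rng.normal(0.0, 1.0, (100, 50))
deviant = rng.normal(0.0, 1.0, (100, 50))
deviant[:, 20:30] -= 2.0  # simulated MMN: deviants more negative here

diff = mismatch_negativity(standard, deviant)
print(diff[20:30].mean())  # clearly negative in the simulated MMN window
```

In the study summarized above, the interesting contrast is whether the deviance is carried by visual speech or by text while the sounds themselves stay physically identical: an MMN then indicates that the visual context changed the auditory percept preattentively.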

14.
Previous neuroimaging studies devoted to auditory motion processing have shown the involvement of a cerebral network encompassing the temporoparietal and premotor areas. Most of these studies were based on a comparison between moving stimuli and static stimuli placed at a single location. However, moving stimuli vary in spatial location, and therefore motion detection can include both spatial localisation and motion processing. In this study, we used fMRI to compare neural processing of moving sounds and static sounds in various spatial locations in blindfolded sighted subjects. The task consisted of simultaneously determining both the nature of a sound stimulus (pure tone or complex sound) and the presence or absence of its movement. When movement was present, subjects had to identify its direction. This comparison of how moving and static stimuli are processed showed the involvement of the parietal lobules, the dorsal and ventral premotor cortex and the planum temporale during auditory motion processing. It also showed the specific recruitment of V5, the visual motion area. These results suggest that the previously proposed network of auditory motion processing is distinct from the network of auditory localisation. In addition, they suggest that the occipital cortex can process non-visual stimuli and that V5 is not restricted to visual processing.

15.
Electrical brain stimulation can provide important information about the functional organization of the human visual cortex. Here, we report the visual phenomena evoked by a large number (562) of intracerebral electrical stimulations performed at low‐intensity with depth electrodes implanted in the occipito‐parieto‐temporal cortex of 22 epileptic patients. Focal electrical stimulation evoked primarily visual hallucinations with various complexities: simple (spot or blob), intermediary (geometric forms), or complex meaningful shapes (faces); visual illusions and impairments of visual recognition were more rarely observed. With the exception of the most posterior cortical sites, the probability of evoking a visual phenomenon was significantly higher in the right than the left hemisphere. Intermediary and complex hallucinations, illusions, and visual recognition impairments were almost exclusively evoked by stimulation in the right hemisphere. The probability of evoking a visual phenomenon decreased substantially from the occipital pole to the most anterior sites of the temporal lobe, and this decrease was more pronounced in the left hemisphere. The greater sensitivity of the right occipito‐parieto‐temporal regions to intracerebral electrical stimulation to evoke visual phenomena supports a predominant role of right hemispheric visual areas from perception to recognition of visual forms, regardless of visuospatial and attentional factors. Hum Brain Mapp 35:3360–3371, 2014. © 2013 Wiley Periodicals, Inc.

16.
Task‐irrelevant visual stimuli can enhance auditory perception. However, while there is some neurophysiological evidence for mechanisms that underlie the phenomenon, the neural basis of visually induced effects on auditory perception remains unknown. Combining fMRI and EEG with psychophysical measurements in two independent studies, we identified the neural underpinnings and temporal dynamics of visually induced auditory enhancement. Lower‐ and higher‐intensity sounds were paired with a non‐informative visual stimulus, while participants performed an auditory detection task. Behaviourally, visual co‐stimulation enhanced auditory sensitivity. Using fMRI, enhanced BOLD signals were observed in primary auditory cortex for low‐intensity audiovisual stimuli which scaled with subject‐specific enhancement in perceptual sensitivity. Concordantly, a modulation of event‐related potentials could already be observed over frontal electrodes at an early latency (30–80 ms), which again scaled with subject‐specific behavioural benefits. Later modulations starting around 280 ms, that is in the time range of the P3, did not fit this pattern of brain‐behaviour correspondence. Hence, the latency of the corresponding fMRI‐EEG brain‐behaviour modulation points at an early interplay of visual and auditory signals in low‐level auditory cortex, potentially mediated by crosstalk at the level of the thalamus. However, fMRI signals in primary auditory cortex, auditory thalamus and the P50 for higher‐intensity auditory stimuli were also elevated by visual co‐stimulation (in the absence of any behavioural effect) suggesting a general, intensity‐independent integration mechanism. We propose that this automatic interaction occurs at the level of the thalamus and might signify a first step of audiovisual interplay necessary for visually induced perceptual enhancement of auditory perception.

17.
Converging lines of evidence suggest that auditory system short-term plasticity can enable several perceptual and cognitive functions that have been previously considered as relatively distinct phenomena. Here we review recent findings suggesting that auditory stimulation, auditory selective attention and cross-modal effects of visual stimulation each cause transient excitatory and (surround) inhibitory modulations in the auditory cortex. These modulations might adaptively tune hierarchically organized sound feature maps of the auditory cortex (e.g. tonotopy), thus filtering relevant sounds during rapidly changing environmental and task demands. This could support auditory sensory memory, pre-attentive detection of sound novelty, enhanced perception during selective attention, influence of visual processing on auditory perception and longer-term plastic changes associated with perceptual learning.

18.
The role of induced gamma‐band responses (iGBRs) in the human electroencephalogram (EEG) is a controversial topic. On the one hand, iGBRs have been associated with neuronal activity reflecting the (re‐)activation of cortical object representations. On the other hand, it was shown that miniature saccades (MSs) lead to high‐frequency artifacts in the EEG that can mimic cortical iGBRs. We recorded EEG and eye movements simultaneously while participants were engaged in a combined repetition priming and object recognition experiment. MS rates were mainly modulated by object familiarity in a time window from 100 to 300 ms after stimulus onset. In contrast, artifact‐corrected iGBRs were sensitive to object repetition and object familiarity in a prolonged time window. EEG source analyses revealed that stimulus repetitions modulated iGBRs in temporal and occipital cortex regions while familiarity was associated with activity in parieto‐occipital regions. These results are in line with neuroimaging studies employing functional magnetic resonance imaging or magnetoencephalography. We conclude that MSs reflect early mechanisms of visual perception while iGBRs mirror the activation of cortical networks representing a perceived object.

19.
In auditory-visual synesthesia, sounds automatically elicit conscious and reliable visual experiences. It is presently unknown whether this reflects early or late processes in the brain. It is also unknown whether adult audiovisual synesthesia resembles auditory-induced visual illusions that can sometimes occur in the general population or whether it resembles the electrophysiological deflection over occipital sites that has been noted in infancy and has been likened to synesthesia. Electrical brain activity was recorded from adult synesthetes and control participants who were played brief tones and required to monitor for an infrequent auditory target. The synesthetes were instructed to attend either to the auditory or to the visual (i.e., synesthetic) dimension of the tone, whereas the controls attended to the auditory dimension alone. There were clear differences between synesthetes and controls that emerged early (100 msec after tone onset). These differences tended to lie in deflections of the auditory-evoked potential (e.g., the auditory N1, P2, and N2) rather than the presence of an additional posterior deflection. The differences occurred irrespective of what the synesthetes attended to (although attention had a late effect). The results suggest that differences between synesthetes and others occur early in time, and that synesthesia is qualitatively different from similar effects found in infants and certain auditory-induced visual illusions in adults. In addition, we report two novel cases of synesthesia in which colors elicit sounds, and vice versa.

20.
Blindness induces processes of neural plasticity, resulting in recruitment of the deafferentated visual areas for non‐visual sensory functions. These processes are related to superior abilities of blind compared with sighted individuals for specific auditory and tactile tasks. Recently, an exceptional performance of the blind has been demonstrated for auditory motion perception, with a minimum audible movement angle that was half that of sighted controls (J. Lewald (2013) Neuropsychologia, 51, 181–186). The present study revealed an electrophysiological correlate of this finding by analysing the so‐called motion‐onset response, a prominent auditory‐evoked potential to the onset of motion. The cN1 component of this response, appearing about 170 ms after motion onset, was two times higher in amplitude for blind compared with matched sighted control subjects. At the time of the cN1, electrical neuroimaging using sLORETA revealed stronger activation in blind than sighted subjects primarily in ventral visual areas (V1v, V2v, VP, V4v) of the right occipital lobe. Activation was also obtained in middle temporal area V5. These findings suggest that blindness results in stronger involvement of both non‐motion areas of the ventral visual stream and motion areas of the dorsal visual stream in processing of auditory motion at the same point in time after motion onset. This argues against the view that visual motion areas, such as area V5, are preferentially recruited for auditory motion analysis in the blind. Rather, cross‐modal reorganization of cortical areas induced by blindness seems to be largely independent of the specific visual functions of the same areas in sighted persons.
