Similar Documents (20 results)
1.
Visual and auditory motion information can be used together to provide complementary information about the movement of objects. To investigate the neural substrates of such cross-modal integration, functional magnetic resonance imaging was used to assess brain activation while subjects performed separate visual and auditory motion discrimination tasks. Areas of unimodal activation included the primary and/or early sensory cortex for each modality plus additional sites extending toward parietal cortex. Areas conjointly activated by both tasks included lateral parietal cortex, lateral frontal cortex, anterior midline and anterior insular cortex. The parietal site encompassed distinct, but partially overlapping, zones of activation in or near the intraparietal sulcus (IPS). A subsequent task requiring an explicit cross-modal speed comparison revealed several foci of enhanced activity relative to the unimodal tasks. These included the IPS, anterior midline, and anterior insula but not frontal cortex. During the unimodal auditory motion task, portions of the dorsal visual motion system showed signals depressed below resting baseline. Thus, interactions between the two systems involved either enhancement or suppression depending on the stimuli present and the nature of the perceptual task. Together, these results identify human cortical regions involved in polysensory integration and the attentional selection of cross-modal motion information.

2.
Recent studies, conducted almost exclusively in primates, have shown that several cortical areas usually associated with modality-specific sensory processing are subject to influences from other senses. Here we demonstrate using single-unit recordings and estimates of mutual information that visual stimuli can influence the activity of units in the auditory cortex of anesthetized ferrets. In many cases, these units were also acoustically responsive and frequently transmitted more information in their spike discharge patterns in response to paired visual-auditory stimulation than when either modality was presented by itself. For each stimulus, this information was conveyed by a combination of spike count and spike timing. Even in primary auditory areas (primary auditory cortex [A1] and anterior auditory field [AAF]), approximately 15% of recorded units were found to have nonauditory input. This proportion increased in the higher level fields that lie ventral to A1/AAF and was highest in the anterior ventral field, where nearly 50% of the units were found to be responsive to visual stimuli only and a further quarter to both visual and auditory stimuli. Within each field, the pure-tone response properties of neurons sensitive to visual stimuli did not differ in any systematic way from those of visually unresponsive neurons. Neural tracer injections revealed direct inputs from visual cortex into auditory cortex, indicating a potential source of the visual responses. Primary visual cortex projects sparsely to A1, whereas higher visual areas innervate auditory areas in a field-specific manner. These data indicate that multisensory convergence and integration are features common to all auditory cortical areas but are especially prevalent in higher areas.
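The information-transmission comparison in the abstract above rests on estimating mutual information between stimulus condition and spike response. The following is only a minimal sketch of a plug-in (histogram) estimator over discrete spike counts, with invented toy data; the study's actual analysis also exploited spike timing and would require bias correction for limited trial counts.

```python
import numpy as np

def mutual_information(stimuli, spike_counts):
    """Plug-in estimate of I(S;R) in bits from paired samples of
    stimulus labels and discrete spike counts."""
    stimuli = np.asarray(stimuli)
    spike_counts = np.asarray(spike_counts)
    mi = 0.0
    for s in np.unique(stimuli):
        p_s = np.mean(stimuli == s)                  # P(stimulus = s)
        for r in np.unique(spike_counts):
            p_r = np.mean(spike_counts == r)         # P(response = r)
            p_sr = np.mean((stimuli == s) & (spike_counts == r))
            if p_sr > 0:
                mi += p_sr * np.log2(p_sr / (p_s * p_r))
    return mi

# Toy example: 100 trials of two conditions ("A"uditory-only vs.
# paired "V"isual-auditory), with counts that perfectly discriminate
# the conditions -> exactly 1 bit of information.
stim = ['A', 'A', 'V', 'V'] * 25
counts = [0, 0, 3, 3] * 25
print(mutual_information(stim, counts))  # 1.0
```

A unit whose spike counts discriminate paired from unimodal stimulation better than chance yields a higher estimate, which is the sense in which bimodal stimulation "transmitted more information" here.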

3.
Increasing evidence suggests separate auditory pattern and space processing streams. The present paper describes two magnetoencephalogram studies examining gamma-band activity to changes in auditory patterns using consonant-vowel syllables (experiment 1), animal vocalizations and artificial noises (experiment 2). Two samples of each sound type were presented to passively listening subjects in separate oddball paradigms with 80% standards and 20% deviants differing in their spectral composition. Evoked magnetic mismatch fields peaking approximately 190 ms poststimulus showed a trend for a left-hemisphere advantage for syllables, but no hemispheric differences for the other sounds. Frequency analysis and statistical probability mapping of the differences between deviants and standards revealed increased gamma-band activity above 60 Hz over left anterior temporal/ventrolateral prefrontal cortex for all three types of stimuli. This activity peaked simultaneously with the mismatch responses for animal sounds (180 ms) but was delayed for noises (260 ms) and syllables (320 ms). Our results support the hypothesized role of anterior temporal/ventral prefrontal regions in the processing of auditory pattern change. They extend earlier findings of gamma-band activity over posterior parieto-temporal cortex during auditory spatial processing that supported the putative auditory dorsal stream. Furthermore, earlier gamma-band responses to animal vocalizations may suggest faster processing of fear-relevant information.
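The deviant-versus-standard gamma-band comparison above amounts to contrasting spectral power above 60 Hz between the two trial types. A minimal periodogram-based sketch with synthetic signals follows; the function name and test signals are mine, and published MEG analyses of induced gamma typically use wavelet decompositions and trial-wise statistics rather than a single FFT.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean periodogram power of `signal` within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

fs = 600.0
t = np.arange(int(fs)) / fs                        # 1 s epoch
standard = np.sin(2 * np.pi * 40.0 * t)            # energy below 60 Hz only
deviant = standard + np.sin(2 * np.pi * 70.0 * t)  # extra >60 Hz component

# The deviant epoch carries more power in the 60-90 Hz gamma band.
print(band_power(deviant, fs, 60.0, 90.0) > band_power(standard, fs, 60.0, 90.0))  # True
```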

4.
Our brain integrates the information provided by the different sensory modalities into a coherent percept, and recent studies suggest that this process is not restricted to higher association areas. Here we evaluate the hypothesis that auditory cortical fields are involved in cross-modal processing by probing individual neurons for audiovisual interactions. We find that visual stimuli modulate auditory processing both at the level of field potentials and single-unit activity and already in primary and secondary auditory fields. These interactions strongly depend on a stimulus' efficacy in driving the neurons but occur independently of stimulus category and for naturalistic as well as artificial stimuli. In addition, interactions are sensitive to the relative timing of audiovisual stimuli and are strongest when visual stimuli lead by 20-80 msec. Exploring the underlying mechanisms, we find that enhancement correlates with the resetting of slow (approximately 10 Hz) oscillations to a phase angle of optimal excitability. These results demonstrate that visual stimuli can modulate the firing of neurons in auditory cortex in a manner that depends on stimulus efficacy and timing. These neurons thus meet the criteria for sensory integration and provide the auditory modality with multisensory contextual information about co-occurring environmental events.
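Testing for the phase resetting described above requires reading out the instantaneous phase of the slow (~10 Hz) oscillation at each stimulus onset. A standard band-pass + Hilbert-transform sketch is shown below with a synthetic signal; the function and test data are mine, not the study's pipeline, and real analyses would additionally quantify phase concentration across trials (e.g., with a Rayleigh test).

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def phase_at_onset(lfp, fs, onset_idx, band=(8.0, 12.0)):
    """Instantaneous phase (radians) of the ~10 Hz component at a given
    stimulus-onset sample: band-pass filter, then Hilbert transform."""
    sos = butter(3, band, btype='band', fs=fs, output='sos')
    narrow = sosfiltfilt(sos, lfp)          # zero-phase filtering
    return np.angle(hilbert(narrow))[onset_idx]

# Synthetic field potential: a pure 10 Hz cosine, so at t = 1.0 s
# (an integer number of cycles) the phase should be near 0 (the peak).
fs = 1000.0
t = np.arange(int(2 * fs)) / fs
lfp = np.cos(2 * np.pi * 10.0 * t)
ph = phase_at_onset(lfp, fs, onset_idx=1000)
print(abs(ph) < 0.3)  # True: phase close to 0 at the oscillation peak
```

If visual input resets the oscillation, onset phases cluster at a preferred angle across trials instead of being uniformly distributed.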

5.
Perceptual suppression of distractors may depend on both endogenous and exogenous factors, such as attentional load of the current task and sensory competition among simultaneous stimuli, respectively. We used functional magnetic resonance imaging (fMRI) to compare these two types of attentional effects and examine how they may interact in the human brain. We varied the attentional load of a visual monitoring task performed on a rapid stream at central fixation without altering the central stimuli themselves, while measuring the impact on fMRI responses to task-irrelevant peripheral checkerboards presented either unilaterally or bilaterally. Activations in visual cortex for irrelevant peripheral stimulation decreased with increasing attentional load at fixation. This relative decrease was present even in V1, but became larger for successive visual areas through to V4. Decreases in activation for contralateral peripheral checkerboards due to higher central load were more pronounced within retinotopic cortex corresponding to 'inner' peripheral locations relatively near the central targets than for more eccentric 'outer' locations, demonstrating a predominant suppression of nearby surround rather than strict 'tunnel vision' during higher task load at central fixation. Contralateral activations for peripheral stimulation in one hemifield were reduced by competition with concurrent stimulation in the other hemifield only in inferior parietal cortex, not in retinotopic areas of occipital visual cortex. In addition, central attentional load interacted with competition due to bilateral versus unilateral peripheral stimuli specifically in posterior parietal and fusiform regions. These results reveal that task-dependent attentional load, and interhemifield stimulus-competition, can produce distinct influences on the neural responses to peripheral visual stimuli within the human visual system. These distinct mechanisms in selective visual processing may be integrated within posterior parietal areas, rather than earlier occipital cortex.

6.
The rat auditory cortex is divided anatomically into several areas, but little is known about the functional differences in information processing between these areas. To determine the filter properties of rat posterior auditory field (PAF) neurons, we compared neurophysiological responses to simple tones, frequency modulated (FM) sweeps, and amplitude modulated noise and tones with responses of primary auditory cortex (A1) neurons. PAF neurons have excitatory receptive fields that are on average 65% broader than those of A1 neurons. The broader receptive fields of PAF neurons result in responses to narrowband and broadband inputs that are stronger than those of A1. In contrast to A1, we found little evidence for an orderly topographic gradient in PAF based on frequency. These neurons exhibit latencies that are twice as long as those of A1. In response to modulated tones and noise, PAF neurons adapt to repeated stimuli at significantly slower rates. Unlike A1, neurons in PAF rarely exhibit facilitation to rapidly repeated sounds. Neurons in PAF do not exhibit strong selectivity for rate or direction of narrowband one-octave FM sweeps. These results indicate that PAF, like nonprimary visual fields, processes sensory information on larger spectral and longer temporal scales than primary cortex.

7.
The ability to detect and preferentially process salient auditory stimuli, even when irrelevant to a current task, is often critical for adaptive behavior. This stimulus-driven allocation of processing resources is known as "attentional capture." Here we used functional magnetic resonance imaging in humans to investigate brain activity and behavioral effects related to such auditory attentional capture. Participants searched a sequence of tones for a target tone that was shorter or longer than the nontarget tones. An irrelevant singleton feature in the tone sequence resulted in behavioral interference (attentional capture) and activation of parietal and prefrontal cortices only when the singleton was associated with a nontarget tone (nontarget singleton) and not when associated with a target tone (target singleton). In contrast, the presence (vs. absence) of a singleton feature in the sequence was associated with activation of frontal and temporal loci previously associated with auditory change detection. These results suggest that a ventral network involving superior temporal and inferior frontal cortices responds to acoustic variability, regardless of attentional significance, but a dorsal frontoparietal network responds only when a feature singleton captures attention.

8.
Primary sensory cortical responses are modulated by the presence or expectation of related sensory information in other modalities, but the sources of multimodal information and the cellular locus of this integration are unclear. We investigated the modulation of neural responses in the murine primary auditory cortical area Au1 by extrastriate visual cortex (V2). Projections from V2 to Au1 terminated in a classical descending/modulatory pattern, with highest density in layers 1, 2, 5, and 6. In brain slices, whole-cell recordings revealed long latency responses to stimulation in V2L that could modulate responses to subsequent white matter (WM) stimuli at latencies of 5-20 ms. Calcium responses imaged in Au1 cell populations showed that preceding WM with V2L stimulation modulated WM responses, with both summation and suppression observed. Modulation of WM responses was most evident for near-threshold WM stimuli. These data indicate that corticocortical projections from V2 contribute to multimodal integration in primary auditory cortex.

9.
The cognitive and neural bases of the ability to focus attention on information in one sensory modality while ignoring information in another remain poorly understood. We hypothesized that bimodal selective attention results from increased activity in corresponding sensory cortices with a suppression of activity in non-corresponding sensory cortices. In a functional magnetic resonance imaging (fMRI) study, we presented melodies and shapes alone (unimodal) or simultaneously (bimodal). Subjects monitored for changes in an attended modality while ignoring the other. Subsequently, memory for both attended and unattended stimuli was tested. Subjects remembered attended stimuli equally well in unimodal and bimodal conditions, and significantly better than ignored stimuli in bimodal conditions. When a subject focused on a stimulus, the blood-oxygen-level-dependent (BOLD) response increased in sensory cortices corresponding to that modality in both unimodal and bimodal conditions. Additionally, the BOLD response decreased in sensory cortices corresponding to the non-presented modality in unimodal conditions and the unattended modality in bimodal conditions. We conclude that top-down attentional effects modulate the interaction of sensory cortical areas by gating sensory input. This interaction between sensory cortices enhances processing of one modality at the expense of the other during selective attention, and subsequently affects memory encoding.

10.
Visualizing emotionally loaded pictures intensifies peripheral reflexes toward sudden auditory stimuli, suggesting that the emotional context may potentiate responses elicited by novel events in the acoustic environment. However, psychophysiological results have reported that attentional resources available to sounds become depleted, as attention allocation to emotional pictures increases. These findings have raised the challenging question of whether an emotional context actually enhances or attenuates auditory novelty processing at a central level in the brain. To solve this issue, we used functional magnetic resonance imaging to first identify brain activations induced by novel sounds (NOV) when participants made a color decision on visual stimuli containing both negative (NEG) and neutral (NEU) facial expressions. We then measured modulation of these auditory responses by the emotional load of the task. Contrary to what was assumed, activation induced by NOV in superior temporal gyrus (STG) was enhanced when subjects responded to faces with a NEG emotional expression compared with NEU ones. Accordingly, NOV yielded stronger behavioral disruption on subjects' performance in the NEG context. These results demonstrate that the emotional context modulates the excitability of auditory and possibly multimodal novelty cerebral regions, enhancing acoustic novelty processing in a potentially harmful environment.

11.
Hemispheric asymmetries during auditory sensory processing were examined using whole-head magnetoencephalographic recordings of auditory evoked responses to monaurally and binaurally presented amplitude-modulated sounds. Laterality indices were calculated for the transient onset responses (P1m and N1m), the transient gamma-band response, the sustained field (SF) and the 40 Hz auditory steady-state response (ASSR). All response components showed laterality toward the hemisphere contralateral to the stimulated ear. In addition, the SF and ASSR showed right hemispheric (RH) dominance. Thus, laterality of sustained response components (SF and ASSR) was distinct from that of transient responses. ASSR and SF are sensitive to stimulus periodicity. Consequently, ASSR and SF likely reflect periodic stimulus attributes and might be relevant for pitch processing based on temporal stimulus regularities. In summary, the results of the present studies demonstrate that asymmetric organization in the cerebral auditory cortex is already established on the level of sensory processing.
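The laterality indices mentioned above are typically normalized contrasts of response amplitudes between hemispheres. A minimal sketch follows; the exact formula and sign convention used in the study are not given in the abstract, so the (R - L)/(R + L) form below is an assumption, chosen so that positive values indicate right-hemispheric dominance.

```python
def laterality_index(left_amp, right_amp):
    """Normalized hemispheric contrast (assumed convention):
    (R - L) / (R + L). Positive -> right-hemispheric dominance,
    negative -> left-hemispheric dominance, 0 -> symmetric."""
    left_amp, right_amp = float(left_amp), float(right_amp)
    return (right_amp - left_amp) / (right_amp + left_amp)

# Hypothetical ASSR source amplitudes (arbitrary units):
# a larger right-hemisphere response yields a positive index.
print(laterality_index(20.0, 30.0))  # 0.2
```

The same index can be computed per response component (P1m, N1m, SF, ASSR) and per stimulation condition to compare transient versus sustained asymmetries.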

12.
BACKGROUND: Functional magnetic resonance imaging (fMRI) using blood-oxygen-level-dependent (BOLD) contrasts is a common method for studying sensory or cognitive brain functions. The aim of the present study was to assess the effect of the intravenous anaesthetic propofol on auditory-induced brain activation using BOLD contrast fMRI. METHODS: In eight neurosurgical patients, musical stimuli were presented binaurally in a block design. Imaging was performed under five conditions: no propofol (or wakefulness) and propofol plasma target concentrations of 0.5, 1.0, 1.5, and 2.0 microg ml(-1). RESULTS: During wakefulness we found activations in the superior temporal gyrus (STG) corresponding to the primary and secondary auditory cortex as well as in regions of higher functions of auditory information processing. The BOLD response decreased with increasing concentrations of propofol but remained partially preserved in areas of basic auditory processing in the STG during propofol 2.0 microg ml(-1). CONCLUSIONS: Our results suggest a dose-dependent impairment of central processing of auditory information after propofol administration, consistent with electrophysiological findings that measure neuronal activity directly. However, propofol did not totally blunt primary cortical responses to acoustic stimulation, indicating that patients may process auditory information under general anaesthesia.

13.
Valid expectations are known to improve target detection, but the preparatory attentional mechanisms underlying this perceptual facilitation remain an open issue. Using functional magnetic resonance imaging, we show here that expecting auditory, tactile, or visual targets, in the absence of stimulation, selectively increased baseline activity in corresponding sensory cortices and decreased activity in irrelevant ones. Regardless of sensory modality, expectancy activated bilateral premotor and posterior parietal areas, supplementary motor area as well as right anterior insula and right middle frontal gyrus. The bilateral putamen was sensitive to the modality specificity of expectations during the unexpected omission of targets. Thus, across modalities, detection improvement arising from selectively directing attention to a sensory modality appears mediated through transient changes in pretarget activity. This flexible advance modulation of baseline activity in sensory cortices resolves ambiguities among previous studies unable to discriminate modality-specific preparatory activity from attentional modulation of stimulus processing. Our results agree with predictive-coding models, which suggest that these expectancy-related changes reflect top-down biases--presumably originating from the observed supramodal frontoparietal network--that modulate signal-detection sensitivity by differentially modifying background activity (i.e., noise level) in different input channels. The putamen appears to code omission-related Bayesian "surprise" that depends on the specificity of predictions.

14.
When multiple objects are present in a visual scene, they compete for cortical processing in the visual system; selective attention biases this competition so that representations of behaviorally relevant objects enter awareness and irrelevant objects do not. Deployments of selective attention can be voluntary (e.g., a shift of attention to a target's expected spatial location) or stimulus driven (e.g., capture of attention by a target-defining feature such as color). Here we use functional magnetic resonance imaging to show that both of these factors induce spatially selective attentional modulations within regions of human occipital, parietal, and frontal cortex. In addition, the voluntary attentional modulations are temporally sustained, indicating that activity in these regions dynamically tracks the locus of attention. These data show that a convolution of factors, including prior knowledge of location and target-defining features, determines the relative competitive advantage of visual stimuli within multiple stages of the visual system.

15.
Affectively arousing visual stimuli have been suggested to automatically attract attentional resources in order to optimize sensory processing. The present study crosses the factors of spatial selective attention and affective content, and examines the relationship between instructed (spatial) and automatic attention to affective stimuli. In addition to response times and error rate, electroencephalographic data from 129 electrodes were recorded during a covert spatial attention task. This task required silent counting of random-dot targets embedded in a 10 Hz flicker of colored pictures presented to both hemifields. Steady-state visual evoked potentials (ssVEPs) were obtained to determine amplitude and phase of electrocortical responses to pictures. An increase of ssVEP amplitude was observed as an additive function of spatial attention and emotional content. Statistical parametric mapping of this effect indicated ssVEP amplitude modulation over occipito-temporal and parietal cortex contralateral to the attended visual hemifield. This difference was most pronounced during selection of the left visual hemifield, at right temporal electrodes. In line with this finding, phase information revealed accelerated processing of aversive arousing, compared to affectively neutral pictures. The data suggest that affective stimulus properties modulate the spatiotemporal process along the ventral stream, encompassing amplitude amplification and timing changes of posterior and temporal cortex.
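Because the pictures flicker at a fixed 10 Hz, the ssVEP amplitude and phase described above can be read directly off the Fourier component at the flicker frequency. The sketch below demonstrates this on a synthetic signal of known amplitude; the function name and parameters are illustrative, and real analyses average over many epochs and electrodes first.

```python
import numpy as np

def ssvep_amplitude_phase(signal, fs, flicker_hz):
    """Amplitude (in the signal's units) and phase (radians) of the
    steady-state response at the flicker frequency, from the DFT bin
    nearest that frequency."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum = np.fft.rfft(signal)
    k = np.argmin(np.abs(freqs - flicker_hz))   # bin at the flicker rate
    amplitude = 2.0 * np.abs(spectrum[k]) / n   # rescale to peak amplitude
    phase = np.angle(spectrum[k])
    return amplitude, phase

# Synthetic 2 s "EEG" epoch containing a 10 Hz ssVEP of amplitude 3.
fs, dur = 500.0, 2.0
t = np.arange(int(fs * dur)) / fs
eeg = 3.0 * np.sin(2 * np.pi * 10.0 * t)
amp, _ = ssvep_amplitude_phase(eeg, fs, 10.0)
print(round(amp, 3))  # 3.0
```

Comparing such amplitudes across attended/unattended and emotional/neutral conditions yields the additive modulation reported in the abstract; the phase term indexes the timing differences.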

16.
The posterior field (P) of the cat auditory cortex contains a very high proportion of neurons whose responses change non-monotonically with the sound pressure level (SPL) of tonal stimuli, leading to circumscribed frequency-SPL response areas, and it has therefore been suggested that field P may be specialized for processing of sound intensity. We demonstrate here a great diversity of response areas in field P. Furthermore, by varying tone SPL and rise time, we show that, as in primary auditory cortex (AI), the onset response of a field P neuron is better described as a function of the instantaneous peak pressure (envelope) at the time of response generation than of the steady-state SPL of the stimulus. Such responses could be used to track transients or represent envelopes in more general terms, rather than to code SPL. Compared with AI, field P neurons have relatively long minimum latencies along with a large jitter in spike timing. Tracking would therefore be most effective for slowly varying envelopes, and one function of the inhibition that generates non-monotonicity in field P may be to suppress temporally sluggish responses to rapid transients, such as the onsets of high-SPL, short rise time tones. Field P may thus be specialized for coding slowly varying signals.
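The envelope-versus-SPL distinction above can be made concrete with a tone whose onset has a linear rise ramp: during the ramp, the instantaneous peak pressure depends on both the steady-state level and the rise time. The sketch below is a toy illustration under an assumed linear-ramp model (the function and numbers are mine, not the study's); it shows how two tones with different steady-state levels can present identical envelope values at a given moment of response generation.

```python
def envelope_at(t, peak_pressure, rise_time):
    """Instantaneous peak pressure of a tone with a linear onset ramp:
    grows linearly toward `peak_pressure` over `rise_time` seconds,
    then stays constant (assumed toy model)."""
    return peak_pressure * min(t / rise_time, 1.0)

# Hypothetical readout 5 ms after onset: a louder tone with a slow ramp
# and a quieter tone with a fast ramp present the SAME envelope value,
# so a neuron tracking the envelope responds identically to both.
print(envelope_at(0.005, peak_pressure=1.0, rise_time=0.010))  # 0.5
print(envelope_at(0.005, peak_pressure=0.5, rise_time=0.005))  # 0.5
```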

17.
The human brain protects the processing of task-relevant stimuli from interference ("conflict") by task-irrelevant stimuli via attentional biasing mechanisms. The lateral prefrontal cortex has been implicated in resolving conflict between competing stimuli by selectively enhancing task-relevant stimulus representations in sensory cortices. Conversely, recent data suggest that conflict from emotional distracters may be resolved by an alternative route, wherein the rostral anterior cingulate cortex inhibits amygdalar responsiveness to task-irrelevant emotional stimuli. Here we tested the proposal of 2 dissociable, distracter-specific conflict resolution mechanisms, by acquiring functional magnetic resonance imaging data during resolution of conflict from either nonemotional or emotional distracters. The results revealed 2 distinct circuits: a lateral prefrontal "cognitive control" system that resolved nonemotional conflict and was associated with enhanced processing of task-relevant stimuli in sensory cortices, and a rostral anterior cingulate "emotional control" system that resolved emotional conflict and was associated with decreased amygdalar responses to emotional distracters. By contrast, activations related to both emotional and nonemotional conflict monitoring were observed in a common region of the dorsal anterior cingulate. These data suggest that the neuroanatomical networks recruited to overcome conflict vary systematically with the nature of the conflict, but that they may share a common conflict-detection mechanism.

18.
Speech perception requires cortical mechanisms capable of analysing and encoding successive spectral (frequency) changes in the acoustic signal. To study temporal speech processing in the human auditory cortex, we recorded intracerebral evoked potentials to syllables in right and left human auditory cortices including Heschl's gyrus (HG), planum temporale (PT) and the posterior part of superior temporal gyrus (area 22). Natural voiced (/ba/, /da/, /ga/) and voiceless (/pa/, /ta/, /ka/) syllables, spoken by a native French speaker, were used to study the processing of a specific temporally based acoustico-phonetic feature, the voice onset time (VOT). This acoustic feature is present in nearly all languages, and it is the VOT that provides the basis for the perceptual distinction between voiced and voiceless consonants. The present results show a lateralized processing of acoustic elements of syllables. First, processing of voiced and voiceless syllables is distinct in the left, but not in the right HG and PT. Second, only the evoked potentials in the left HG, and to a lesser extent in PT, reflect a sequential processing of the different components of the syllables. Third, we show that this acoustic temporal processing is not limited to speech sounds but applies also to non-verbal sounds mimicking the temporal structure of the syllable. Fourth, there was no difference between responses to voiced and voiceless syllables in either left or right areas 22. Our data suggest that a single mechanism in the auditory cortex, involved in general (not only speech-specific) temporal processing, may underlie the further processing of verbal (and non-verbal) stimuli. This coding, bilaterally localized in auditory cortex in animals, takes place specifically in the left HG in man. A defect of this mechanism could account for hearing discrimination impairments associated with language disorders.

19.
We used positron emission tomography (PET) to investigate the neural correlates of selective attention in humans. We examined the effects of attending to one side of space versus another (spatial selection) and to one sensory modality versus another (intermodal selection) during bilateral, bimodal stimulation of vision and touch. Attention toward one side resulted in greater activity in several contralateral areas. In somatosensory cortex, these spatial attentional modulations were found only when touch was relevant. In the intraparietal sulcus, spatial attentional effects were multimodal, independent of the modality attended. In occipital areas, spatial modulations were also found during both visual and tactile attention, indicating that tactile attention can affect activity in visual cortex; but occipital areas also showed more activity overall during visual attention. This suggests that while spatial attention can exert multimodal influences on visual areas, these still maintain their specificity for the visual modality. Additionally, irrespective of the attended side, attending to vision activated posterior parietal and superior premotor cortices, while attending to touch activated the parietal operculi. We conclude that attentional selection operates at multiple levels, with attention to locations and attention to modalities showing distinct effects. These jointly contribute to boost processing of stimuli at the attended location in the relevant modality.

20.
We aimed at testing the cortical representation of complex natural sounds within auditory cortex using human functional magnetic resonance imaging (fMRI). To this end, we employed 2 different paradigms in the same subjects: a block-design experiment served to localize areas involved in the processing of animal vocalizations, whereas an event-related fMRI adaptation experiment served to characterize the representation of animal vocalizations in the auditory cortex. During the first experiment, we presented subjects with recognizable and degraded animal vocalizations. We observed significantly stronger fMRI responses for animal vocalizations compared with the degraded stimuli along the bilateral superior temporal gyrus (STG). In the second experiment, we employed an event-related fMRI adaptation paradigm in which pairs of auditory stimuli were presented in 4 different conditions: 1) 2 identical animal vocalizations, 2) 2 different animal vocalizations, 3) an animal vocalization and its degraded control, and 4) an animal vocalization and a degraded control of a different sound. We observed significant fMRI adaptation effects within the left STG. Our data thus suggest that complex sounds such as animal vocalizations are represented in putatively nonprimary auditory cortex in the left STG. Their representation is probably based on their spectrotemporal dynamics rather than simple spectral features.
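The logic of the fMRI adaptation contrast above is that a region representing the sounds responds less to a repeated (identical) pair than to a pair of different sounds. A minimal sketch of that contrast follows; the function and the trial-wise amplitudes are invented for illustration, whereas the actual study fitted condition regressors in a general linear model.

```python
import numpy as np

def adaptation_effect(resp_identical, resp_different):
    """Repetition-suppression contrast: mean response to 'different'
    pairs minus mean response to 'identical' pairs. A positive value
    indicates fMRI adaptation (suppressed response to repeats)."""
    return float(np.mean(resp_different) - np.mean(resp_identical))

# Hypothetical response amplitudes (arbitrary units) for one left-STG
# region across trials of two of the four conditions.
identical = [0.8, 0.9, 0.7, 0.8]   # condition 1: two identical vocalizations
different = [1.2, 1.1, 1.3, 1.2]   # condition 2: two different vocalizations
print(adaptation_effect(identical, different) > 0)  # True: adaptation present
```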
