Similar literature
A total of 20 similar records were found (search time: 31 ms)
1.
Schmid C, Büchel C, Rose M. NeuroImage 2011, 55(1): 304-311
Visual dominance refers to the observation that in bimodal environments vision often has an advantage over the other senses in humans. Better memory performance for visual than for, e.g., auditory material is therefore assumed. However, the reason for this preferential processing and its relation to memory formation are largely unknown. In this fMRI experiment, we manipulated cross-modal competition and attention, two factors that both modulate bimodal stimulus processing and can affect memory formation. Pictures and sounds of objects were presented simultaneously at two levels of recognisability, thus manipulating the amount of cross-modal competition. Attention was manipulated via task instruction and directed either to the visual or the auditory modality. The factorial design allowed a direct comparison of the effects between both modalities. The resulting memory performance showed that visual dominance was limited to a distinct task setting: visual object memory was superior to auditory object memory only when attention was allocated towards the competing modality. During encoding, cross-modal competition and attention towards the opposing domain reduced fMRI signals in both neural systems, but cross-modal competition was more pronounced in the auditory system, and only in auditory cortex was this competition further modulated by attention. Furthermore, the reduction of neural activity in auditory cortex during encoding was closely related to the behavioural auditory memory impairment. These results indicate that visual dominance emerges from a less pronounced vulnerability of the visual system to competition from the auditory domain.

2.
Yucel G, McCarthy G, Belger A. NeuroImage 2007, 34(3): 1245-1252
Previous studies suggest that involuntary auditory attention evoked by unattended auditory stimuli is not influenced by the primary focus of attention. However, prior studies from our laboratory have found that processing of unattended auditory deviant tones in the auditory and frontal regions is modulated by top-down attentional demands and resource availability. Whether processing of unattended visual deviant stimuli is altered by the availability of attentional resources has not been established. The goal of the current study was to examine the automaticity of these activations, their modulation by attentional capacity, and the neuroanatomical distribution of any attentional effects upon visual deviance detection. We designed an event-related functional magnetic resonance imaging (fMRI) study during which subjects performed a continuous perceptual-motor-visual tracking task whose difficulty was modulated by changing the control dynamics of a joystick. Changes in the anatomical localization, spatial distribution, and intensity of the blood oxygenation level-dependent (BOLD) response associated with unattended infrequent visual changes were examined during low- and high-difficulty tracking conditions of the primary visual task. Results revealed that the unattended deviants elicited BOLD activation in the visual, fusiform, and parietal regions. In these regions, the intensity and extent of the activation evoked by the deviants decreased as a function of the demands of the primary visual task. These findings suggest that processing of unattended visual deviant stimuli is restricted by the attentional demands of a primary task, as previously demonstrated for unattended auditory deviant tones.

3.
Chen JL, Zatorre RJ, Penhune VB. NeuroImage 2006, 32(4): 1771-1781
When listening to music, we often spontaneously synchronize our body movements to a rhythm's beat (e.g. tapping our feet). The goals of this study were to determine how features of a rhythm, such as its metric structure, can facilitate motor responses, and to elucidate the neural correlates of these auditory-motor interactions using fMRI. Five variants of an isochronous rhythm were created by increasing the contrast in sound amplitude between accented and unaccented tones, progressively highlighting the rhythm's metric structure. Subjects tapped in synchrony to these rhythms, and as metric saliency increased across the five levels, louder tones evoked longer tap durations with concomitant increases in the BOLD response at auditory and dorsal premotor cortices. The functional connectivity between these regions was also modulated by the stimulus manipulation. These results show that metric organization, as manipulated via intensity accentuation, modulates motor behavior and neural responses in auditory and dorsal premotor cortex. Auditory-motor interactions may take place at these regions, with the dorsal premotor cortex interfacing sensory cues with temporally organized movement.
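A minimal sketch of how such accented isochronous stimuli could be generated (not the authors' actual stimulus code; the tone frequency, tempo, and accent levels are illustrative assumptions):

```python
import numpy as np

# Illustrative sketch: an isochronous tone sequence in which every second tone
# is accented; the accent size (in dB) controls the salience of the binary metre.
fs = 44100                       # audio sampling rate (Hz), assumed
ioi_s = 0.5                      # inter-onset interval (s), assumed
tone_s = 0.1                     # tone duration (s), assumed
n_tones = 16

def make_rhythm(accent_db):
    """Strong-weak accented isochronous sequence; accent_db is the amplitude
    contrast between accented and unaccented tones."""
    t = np.arange(0, tone_s, 1 / fs)
    tone = np.sin(2 * np.pi * 440 * t) * np.hanning(t.size)    # smooth-edged 440 Hz tone
    out = np.zeros(int(n_tones * ioi_s * fs))
    for i in range(n_tones):
        gain = 1.0 if i % 2 == 0 else 10 ** (-accent_db / 20)  # attenuate weak beats
        start = int(i * ioi_s * fs)
        out[start:start + tone.size] += gain * tone
    return out

# Five variants with progressively larger accent contrast (greater metric salience).
variants = [make_rhythm(db) for db in (0, 3, 6, 9, 12)]
```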

4.
To form a unified percept of our environment, the human brain integrates information within and across the senses. This MEG study investigated interactions within and between sensory modalities using a frequency analysis of steady-state responses (SSRs) that are elicited time-locked to periodically modulated stimuli. Critically, in the frequency domain, interactions between sensory signals are indexed by crossmodulation terms (i.e. the sums and differences of the fundamental frequencies). The 3 × 2 factorial design manipulated (1) modality: auditory, visual or audiovisual, and (2) steady-state modulation: the auditory and visual signals were modulated either in one sensory feature (e.g. visual gratings modulated in luminance at 6 Hz) or in two features (e.g. tones modulated in frequency at 40 Hz and in amplitude at 0.2 Hz). This design enabled us to investigate the crossmodulation frequencies that are elicited when two stimulus features are modulated concurrently (i) in one sensory modality or (ii) in the auditory and visual modalities. In support of within-modality integration, we reliably identified crossmodulation frequencies when two stimulus features in one sensory modality were modulated at different frequencies. In contrast, no crossmodulation frequencies were identified when information needed to be combined from the auditory and visual modalities. The absence of audiovisual crossmodulation frequencies suggests that the previously reported audiovisual interactions in primary sensory areas may mediate low-level spatiotemporal coincidence detection that is prominent for stimulus transients but less relevant for sustained SSRs. In conclusion, our results indicate that information in SSRs is integrated over multiple time scales within, but not across, sensory modalities at the primary cortical level.
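The crossmodulation logic can be illustrated numerically: a nonlinear interaction between two sinusoidal modulations (modeled here as simple multiplication) produces spectral peaks at the sum and difference of the fundamental frequencies, whereas a linear superposition contains only the fundamentals. This is an illustrative sketch only; the 6 Hz and 40 Hz values echo the example frequencies in the abstract, and everything else is assumed:

```python
import numpy as np

fs = 1000.0                      # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)     # 10 s of signal
f1, f2 = 6.0, 40.0               # two steady-state modulation frequencies (Hz)

s1 = np.sin(2 * np.pi * f1 * t)
s2 = np.sin(2 * np.pi * f2 * t)

linear_sum = s1 + s2             # non-interacting superposition
interaction = s1 * s2            # nonlinear interaction -> crossmodulation terms

def peak_freqs(x, n_peaks):
    """Frequencies of the n largest spectral peaks of x."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return np.sort(freqs[np.argsort(spec)[-n_peaks:]])

print("linear sum peaks: ", peak_freqs(linear_sum, 2))   # ~[6, 40] Hz: fundamentals only
print("interaction peaks:", peak_freqs(interaction, 2))  # ~[34, 46] Hz: f2 -/+ f1
```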

5.
Event-related potential (ERP) studies in the visual domain often report an emotion-evoked early posterior negativity (EPN). Studies in the auditory domain have recently shown a similar component. Little source localization has been done on the visual EPN, and none on the auditory EPN. The aim of the current study was to identify the neural generators of the auditory EPN using EEG-fMRI single-trial coupling. Data were recorded from 19 subjects who completed three auditory choice reaction tasks: (1) a control task using neutral tones; (2) a prosodic emotion task involving the categorization of syllables; and (3) a semantic emotion task involving the categorization of words. The waveforms of the emotion tasks diverged from the neutral task over parietal scalp during a very early time window (132-156 ms) and later during a more traditional EPN time window (252-392 ms). In the EEG-fMRI analyses, the variance of the voltage in the earlier time window was correlated with activity in the medial prefrontal cortex, but only in the word task. In the EEG-fMRI analyses of the traditional EPN time window, both emotion tasks covaried with activity in the left superior parietal lobule. Our results support previous parietal cortex source localization findings for the visual EPN, and suggest enhanced selective attention to emotional stimuli during the EPN time window.
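A rough sketch of the general single-trial EEG-fMRI coupling idea (not the authors' pipeline): a per-trial EEG measure from a time window of interest is mean-centred, placed at the trial onsets, convolved with a canonical HRF, and correlated with a voxel's BOLD time course. All data, trial timings, and scan parameters below are simulated placeholders:

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(1)

# Hypothetical per-trial EEG measure, e.g. mean voltage in a 252-392 ms window.
n_trials = 60
trial_amplitude = rng.normal(0.0, 1.0, n_trials)            # placeholder data

# Assumed scan parameters and trial timing (placeholders).
tr, n_scans = 2.0, 240
trial_onsets_s = np.arange(n_trials) * 8.0

def canonical_hrf(tr, duration=32.0):
    """Simple double-gamma HRF sampled at the TR (a common approximation)."""
    t = np.arange(0, duration, tr)
    return gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)

# EEG-informed parametric regressor: trial values placed at onsets, HRF-convolved.
stick = np.zeros(n_scans)
stick[(trial_onsets_s / tr).astype(int)] = trial_amplitude - trial_amplitude.mean()
regressor = np.convolve(stick, canonical_hrf(tr))[:n_scans]

# Correlate with a (simulated) voxel time course.
voxel = 0.5 * regressor + rng.normal(0.0, 1.0, n_scans)      # fake BOLD signal
r = np.corrcoef(regressor, voxel)[0, 1]
print(f"EEG-informed regressor vs. voxel time course: r = {r:.2f}")
```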

6.
Speech perception can use not only auditory signals, but also visual information from seeing the speaker's mouth. The relative timing and relative location of auditory and visual inputs are both known to influence crossmodal integration psychologically, but previous imaging studies of audiovisual speech have focused primarily on temporal aspects. Here we used Positron Emission Tomography (PET) during audiovisual speech processing to study how temporal and spatial factors might jointly affect brain activations. In agreement with previous work, synchronous versus asynchronous audiovisual speech yielded increased activity in multisensory association areas (e.g., superior temporal sulcus [STS]), as well as in some unimodal visual areas. Our orthogonal manipulation of relative stimulus position (auditory and visual stimuli presented at the same location vs. on opposite sides) and stimulus synchrony showed that (i) ventral occipital areas and the superior temporal sulcus were unaffected by relative location; (ii) lateral and dorsal occipital areas were selectively activated for synchronous bimodal stimulation at the same external location; and (iii) the right inferior parietal lobule was activated for synchronous auditory and visual stimuli at different locations, that is, in the condition classically associated with the 'ventriloquism effect' (a shift of perceived auditory position toward the visual location). Thus, different brain regions are involved in different aspects of audiovisual integration. While ventral areas appear more affected by audiovisual synchrony (which can influence speech identification), more dorsal areas appear to be associated with spatial multisensory interactions.

7.
The analysis of auditory deviant events outside the focus of attention is a fundamental capacity of human information processing and has been studied in experiments on the Mismatch Negativity (MMN) and the P3a component in evoked potential research. However, the generators contributing to these components are still under discussion. Here we assessed cortical blood flow responses to auditory stimulation in three conditions. Six healthy subjects were presented with standard tones, frequency-deviant tones (MMN condition), and complex novel sounds (Novelty condition), while attention was directed to a nondemanding visual task. Analysis of the MMN condition contrasted with the standard condition revealed blood flow changes in the left and right superior temporal gyrus, right superior temporal sulcus and left inferior frontal gyrus. Complex novel sounds contrasted with the standard condition activated the left superior temporal gyrus and the left inferior and middle frontal gyrus. A small subcortical activation emerged in the left parahippocampal gyrus, and an extended activation was found covering the right superior temporal gyrus. Novel sounds activated the right inferior frontal gyrus when controlling for deviance probability. In contrast to previous studies, our results indicate a left hemisphere contribution to a frontotemporal network of auditory deviance processing. Our results provide further evidence for a contribution of the frontal cortex to the processing of auditory deviance outside the focus of directed attention.

8.
The role of attention in speech comprehension is not well understood. We used fMRI to study the neural correlates of auditory word, pseudoword, and nonspeech (spectrally rotated speech) perception during a bimodal (auditory, visual) selective attention task. In three conditions, Attend Auditory (ignore visual), Ignore Auditory (attend visual), and Visual (no auditory stimulation), 28 subjects performed a one-back matching task in the assigned attended modality. The visual task, attending to rapidly presented Japanese characters, was designed to be highly demanding in order to prevent attention to the simultaneously presented auditory stimuli. Regardless of stimulus type, attention to the auditory channel enhanced activation by the auditory stimuli (Attend Auditory>Ignore Auditory) in bilateral posterior superior temporal regions and left inferior frontal cortex. Across attentional conditions, there were main effects of speech processing (word+pseudoword>rotated speech) in left orbitofrontal cortex and several posterior right hemisphere regions, though these areas also showed strong interactions with attention (larger speech effects in the Attend Auditory than in the Ignore Auditory condition) and no significant speech effects in the Ignore Auditory condition. Several other regions, including the postcentral gyri, left supramarginal gyrus, and temporal lobes bilaterally, showed similar interactions due to the presence of speech effects only in the Attend Auditory condition. Main effects of lexicality (word>pseudoword) were isolated to a small region of the left lateral prefrontal cortex. Examination of this region showed significant word>pseudoword activation only in the Attend Auditory condition. Several other brain regions, including left ventromedial frontal lobe, left dorsal prefrontal cortex, and left middle temporal gyrus, showed Attention × Lexicality interactions due to the presence of lexical activation only in the Attend Auditory condition. These results support a model in which neutral speech presented in an unattended sensory channel undergoes relatively little processing beyond the early perceptual level. Specifically, processing of phonetic and lexical-semantic information appears to be very limited in such circumstances, consistent with prior behavioral studies.

9.
The aim of the present study was to find a functional MRI correlate in human auditory cortex of the psychoacoustical effect of release from masking, using amplitude-modulated noise stimuli. A sinusoidal target signal was embedded in a bandlimited white noise, which was either unmodulated or (co)modulated. Psychoacoustical thresholds were measured for the target signals in both types of masking noise, using an adaptive procedure. The mean threshold difference between the unmodulated and the comodulated condition, i.e., the release from masking, was 15 dB. The same listeners then participated in an fMRI experiment, recording activation of auditory cortex in response to tones in the presence of modulated and unmodulated noise maskers at five different signal-to-noise ratios. In general, a spatial dissociation of changes in overall level and signal-to-noise ratio in auditory cortex was found, replicating a previous fMRI study on pure-tone masking. The comparison of the fMRI activation maps for a signal presented in modulated and in unmodulated noise revealed that those regions in the antero-lateral part of Heschl's gyrus previously shown to represent the audibility of a tonal target (rather than overall level) exhibited stronger activation for the modulated than for the unmodulated conditions. This result is interpreted as a physiological correlate of the psychoacoustical effect of comodulation masking release at the level of the auditory cortex.
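How masked thresholds and the release from masking are typically quantified can be sketched with a generic 2-down/1-up adaptive staircase (not necessarily the procedure used in the study); the simulated listener and all threshold values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulated_listener(level_db, true_threshold_db, slope=1.0):
    """Hypothetical listener: probability of detection follows a logistic
    psychometric function of the target level."""
    p = 1.0 / (1.0 + np.exp(-slope * (level_db - true_threshold_db)))
    return rng.random() < p

def two_down_one_up(true_threshold_db, start_db=60.0, step_db=2.0, n_trials=200):
    """Minimal 2-down/1-up track; converges near the 70.7%-correct level."""
    level, correct_run, direction, reversals = start_db, 0, 0, []
    for _ in range(n_trials):
        if simulated_listener(level, true_threshold_db):
            correct_run += 1
            if correct_run == 2:                # two correct in a row -> harder
                if direction == +1:
                    reversals.append(level)
                level, correct_run, direction = level - step_db, 0, -1
        else:                                   # one miss -> easier
            if direction == -1:
                reversals.append(level)
            level, correct_run, direction = level + step_db, 0, +1
    return np.mean(reversals[-6:])              # average the last reversals

# Hypothetical "true" masked thresholds for the two noise conditions.
thr_unmodulated = two_down_one_up(true_threshold_db=55.0)
thr_comodulated = two_down_one_up(true_threshold_db=40.0)
print(f"release from masking: {thr_unmodulated - thr_comodulated:.1f} dB")
```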

10.
Detecting changes in a stream of sensory information is vital to animals and humans. While there have been several studies of automatic change detection in various sensory modalities, olfactory change detection is largely unstudied. We investigated brain regions responsive to both passive and active detection of olfactory change using fMRI. Nine right-handed healthy, normosmic subjects (five men) were scanned in two conditions while breathing in synchrony with a metronome. In one condition, subjects mentally counted infrequent odors (Attend condition), whereas in the other condition, subjects' attention was directed elsewhere as they counted auditory tones (Ignore condition). Odors were delivered via a nasal cannula using a computer-controlled air-dilution olfactometer. Infrequently occurring olfactory stimuli evoked significant (P < .05, corrected) activity in the subgenual cingulate and in central posterior orbitofrontal cortex, but only in the Ignore condition, as confirmed by direct comparison of the Ignore session with the Attend session (P < .05, corrected). Subgenual cingulate and posterior orbital cortex may therefore play a role in detecting discrepant olfactory events while attention is otherwise engaged in another sensory modality.

11.
The temporal synchrony of auditory and visual signals is known to affect the perception of an external event, yet it is unclear what neural mechanisms underlie the influence of temporal synchrony on perception. Using parametrically varied levels of stimulus asynchrony in combination with BOLD fMRI, we identified two anatomically distinct subregions of multisensory superior temporal cortex (mSTC) that showed qualitatively distinct BOLD activation patterns. A synchrony-defined subregion of mSTC (synchronous > asynchronous) responded only when auditory and visual stimuli were synchronous, whereas a bimodal subregion of mSTC (auditory > baseline and visual > baseline) showed significant activation to all presentations, but showed monotonically increasing activation with increasing levels of asynchrony. The presence of two distinct activation patterns suggests that the two subregions of mSTC may rely on different neural mechanisms to integrate audiovisual sensory signals. An additional whole-brain analysis revealed a network of regions responding more with synchronous than asynchronous speech, including right mSTC, and bilateral superior colliculus, fusiform gyrus, lateral occipital cortex, and extrastriate visual cortex. The spatial location of individual mSTC ROIs was much more variable in the left than right hemisphere, suggesting that individual differences may contribute to the right lateralization of mSTC in a group SPM. These findings suggest that bilateral mSTC is composed of distinct multisensory subregions that integrate audiovisual speech signals through qualitatively different mechanisms, and may be differentially sensitive to stimulus properties including, but not limited to, temporal synchrony.

12.
Zhang P, Chen X, Yuan P, Zhang D, He S. NeuroImage 2006, 33(2): 715-724
This work investigated the role of cognitive control functions in selective attention when task-relevant and -irrelevant stimuli come from different sensory modalities. We parametrically manipulated the load of an attentive tracking task and investigated its effect on irrelevant acoustic change-related processing. While subjects were performing the visual attentive tracking task, event-related potentials (ERPs) were recorded for frequent standard tones and rare deviant tones presented as auditory distractors. The deviant tones elicited two change-related ERP components: the mismatch negativity (MMN) and the P3a. The amplitude of the MMN, which indexes the early detection of irregular changes, increased with increasing attentional load, whereas the subsequent P3a component, which indicates the involuntary orienting of attention to deviants, was significant only in the lowest load condition. These findings suggest that active exclusion of the early detection process of irrelevant acoustic changes depends on available resources of cognitive control, whereas the late involuntary orienting of attention to deviants can be passively suppressed by high demand on central attentional resources. The present study thus reveals opposing visual attentional load effects at different temporal and functional stages in the rejection of deviant auditory distractors and provides a new perspective on the resolution of the long-standing early versus late attention selection debate.
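For readers unfamiliar with how the MMN is extracted, a sketch of the standard deviant-minus-standard difference wave computed on simulated epochs (the sampling rate, time window, and data are illustrative assumptions, not the study's parameters):

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 1000                                    # sampling rate (Hz), assumed
t = np.arange(-0.1, 0.5, 1 / fs)             # epoch: -100 to +500 ms

# Simulated single-trial epochs (trials x time); real recordings would go here.
standards = rng.normal(0, 5, (400, t.size))
deviants = rng.normal(0, 5, (80, t.size))
# Inject a toy MMN-like negativity into the deviant trials (~100-250 ms).
deviants += -3.0 * np.exp(-((t - 0.175) ** 2) / (2 * 0.03 ** 2))

# Difference wave: average deviant ERP minus average standard ERP.
diff_wave = deviants.mean(axis=0) - standards.mean(axis=0)

# MMN peak: most negative point within a conventional 100-250 ms window.
win = (t >= 0.10) & (t <= 0.25)
mmn_latency_ms = 1000 * t[win][np.argmin(diff_wave[win])]
print(f"MMN peak: {diff_wave[win].min():.1f} uV at {mmn_latency_ms:.0f} ms")
```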

13.
14.
Bouvier SE, Engel SA. NeuroImage 2011, 57(3): 1177-1183
To what extent does attention modulate neural activity in early visual areas? fMRI measurements of attentional modulation in primary visual cortex (V1) show large effects, while single-unit recordings show much smaller ones. This discrepancy suggests that fMRI measures of attention may be inflated, perhaps by activity related to other processes. To test whether effects measured with fMRI actually reflect attentional enhancement, we used a rapid acquisition protocol to determine their timing. Subjects were presented with two stimuli on either side of fixation and were cued to attend to one and ignore the other. Attended stimuli showed a greater magnitude of response in V1, but this increase was delayed by roughly one second relative to both unattended responses and response increases due to boosting stimulus contrast. These results suggest that fMRI measurements of attention may primarily depend upon other processes that take a relatively long time to feed back to V1. Our results demonstrate the importance of using the fine timing information available in the fMRI response.

15.
Waves of consciousness: ongoing cortical patterns during binocular rivalry
We present ongoing patterns of distributed synchronous brain activity that correlate with the spontaneous flow of perceptual dominance during binocular rivalry. Specific modulation of the magnetoencephalographic (MEG) response evoked during conscious perception of a frequency-tagged stimulus was observed throughout rivalry. Estimation of the underlying cortical sources revealed, in addition to strong bilateral striate and extrastriate visual cortex activation, parietal, temporal pole and frontal contributions. Cortical activity was significantly modulated concomitantly with perceptual alternations in visual cortex, medial parietal and left frontal regions. Upon dominance, coactivation of occipital and frontal regions, including anterior cingulate and medial frontal areas, was established. This distributed cortical network, as measured by phase synchrony in the frequency-tag band, was dynamically modulated in concert with the perceptual dominance of the tagged stimulus. While the anteroposterior pattern recurred across subjects, individual variations in the extension of the network were apparent.
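Phase synchrony in a frequency-tag band is commonly quantified with a phase-locking value (PLV); the sketch below shows one generic way to compute it between two simulated sensors after narrow-band filtering around an assumed 7 Hz tag (the study's actual tag frequency and analysis details are not specified here):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

rng = np.random.default_rng(3)

# Two simulated MEG sensors sharing a tag-driven component; the 7 Hz tag,
# noise level, and recording length are illustrative assumptions.
fs, tag_hz = 600.0, 7.0
t = np.arange(0, 5, 1 / fs)
sensor_a = np.sin(2 * np.pi * tag_hz * t) + 0.8 * rng.normal(size=t.size)
sensor_b = np.sin(2 * np.pi * tag_hz * t + 0.4) + 0.8 * rng.normal(size=t.size)

def plv(x, y, f0, half_bw=1.0):
    """Phase-locking value between x and y in a narrow band around f0."""
    sos = butter(4, [f0 - half_bw, f0 + half_bw], btype="band", fs=fs, output="sos")
    phase_x = np.angle(hilbert(sosfiltfilt(sos, x)))
    phase_y = np.angle(hilbert(sosfiltfilt(sos, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

print(f"phase synchrony in the tag band: PLV = {plv(sensor_a, sensor_b, tag_hz):.2f}")
```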

16.
Spinks JA, Zhang JX, Fox PT, Gao JH, Hai Tan L. NeuroImage 2004, 23(2): 517-524
The present study examined the interaction of the central executive in working memory with visual attention. Native Chinese participants were given two versions of a number subtraction task, one of low demand and one of high demand, and were asked to ignore a simultaneously presented peripheral distractor. The distractor could be Chinese or Korean characters, familiar or novel to participants, respectively. Compared with the low-demand subtraction task, brain regions commonly associated with central executive functions, including left middle prefrontal cortex, anterior cingulate cortex, and precentral gyrus/sulcus, were significantly activated in the high-demand task. Critically, there was a significant interaction between distractor type and task demand. Novel distractors captured attention and elicited automatic visual analysis, indicated by primary visual cortex activation, only when the subtraction task was of low demand but not when it was of high demand. The results provide confirmatory evidence that the extent to which higher-level cognitive resources, specifically the central executive component of working memory, are absorbed by a cognitive task affects the automatic processing that occurs in response to distracting items.

17.
Event-related potentials (ERPs) have become an important tool in the quest to understand how infants process perceptual information. Identification of the activation loci of the ERP generators is a technique that provides an opportunity to explore the neural substrates that underlie auditory processing. Nevertheless, as infant brain templates from healthy, non-clinical samples have not been available, the majority of source localization studies in infants have used non-realistic head models or brain templates derived from older children or adults. Given the dramatic structural changes seen across infancy, all of which profoundly affect the electrical fields measured with EEG, it is important to use individual MRIs or age-appropriate brain templates and parameters to explore the localization and time course of auditory ERP sources. In this study, 6-month-old infants were presented with a passive oddball paradigm using consonant-vowel (CV) syllables that differed in voice onset time. Dense-array EEG/ERPs were collected while the infants were awake and alert. In addition, MRIs were acquired during natural non-sedated sleep for a subset of the sample. Discrete dipole and distributed source models were mapped onto individual and averaged infant MRIs. The CV syllables elicited a positive deflection at about 200 ms followed by a negative deflection that peaked around 400 ms. The source models generated placed the dipoles at temporal areas close to auditory cortex for both positive and negative responses. Notably, an additional dipole for the positive peak was localized in the frontal area, at the level of the anterior cingulate cortex (ACC). ACC activation has been reported in adults, but has not, to date, been reported in infants during processing of speech-related signals. The frontal ACC activation was earlier but smaller in amplitude than the left and right auditory temporal activations. These results demonstrate that in infancy the ERP generators to CV syllables are localized in cortical areas similar to those reported in adults, but exhibit a notably different temporal course. Specifically, ACC activation in infants significantly precedes auditory temporal activation, whereas in adults ACC activation follows that of temporal cortex. We suggest that these timing differences could be related to current maturational changes, to the ongoing construction of language-specific phonetic maps, and/or to more sensitive attentional switching as a response to speech signals in infancy.

18.
PET was used to test the hypothesis that similar neural systems are involved in attending to spectral and to spatial features of sounds. In each of four conditions subjects heard tones varying randomly in frequency and location and responded to either the low- or the high-frequency stimuli, ignoring location, or to stimuli on the left or right, ignoring frequency. In comparison to a silent baseline, CBF increases were observed in auditory cortex bilaterally and in the right superior parietal, right dorsolateral frontal, and right premotor regions, with no modulation as a function of attentional condition. Analysis of regional covariation indicated a coordinated CBF response between the right parietal region and the right frontal and middle temporal regions. The data imply that auditory attention engages a network of right-hemisphere cortical regions for both spatial location and tonal frequency and support a model whereby auditory attention operates at a level at which separate features have been integrated into a unitary representation.

19.
We examined changes in relative cerebral blood flow (relCBF) using PET during a sustained attention paradigm which included auditory stimulation and different mental counting tasks. Ten normal volunteers underwent PET (15O water) during a baseline state and under experimental conditions which included listening to clicks, serial counting with auditory stimulation, counting with no auditory stimulation, and an additional component of working memory and time estimation. All subjects performed within normal limits on a battery of neurocognitive tests, which included measures of attention and working memory. Both counting with auditory stimulation and counting with no auditory stimulation engaged motor cortex, putamen, cerebellum, and anterior cingulate. Furthermore, counting with no auditory stimulation relative to counting while listening resulted in significantly increased relCBF in the inferior parietal, dorsolateral prefrontal, and anterior cingulate cortices. The findings obtained in this study support the notion that the parietal and dorsolateral prefrontal cortex are involved when time estimation and working memory contribute to a task requiring sustained attention.

20.
The purpose of this study was to identify functional areas of the brain that modulate selective auditory or visual attention toward utterances. Regional cerebral blood flow was measured in six normal volunteers using positron emission tomography during two selective attention tasks and a control condition. The auditory task activated the auditory, inferior parietal, prefrontal, and anterior cingulate cortices. The visual task activated the visual association, inferior parietal, and prefrontal cortices. Both conditions activated the same area in the superior temporal sulcus. During the visual task, deactivation was observed in the auditory cortex. These results indicate that a modality-dependent selective attention mechanism exists that activates or deactivates cortical areas in different ways.
