Similar references
20 similar records retrieved (search time: 31 ms).
1.
Recent work provides evidence that the infant brain is able to make top-down predictions, but this has been explored only in limited contexts and domains. We build upon this evidence of predictive processing in infants using a new paradigm to examine auditory repetition suppression (RS). RS is a well-documented neural phenomenon in which repeated presentations of the same stimulus result in reduced neural activation compared to non-repeating stimuli. Many theories explain RS using bottom-up mechanisms, but recent work has posited that top-down expectation and predictive coding may bias, or even explain, RS. Here, we investigate whether RS in the infant brain is similarly sensitive to top-down mechanisms. We use fNIRS to measure infants’ neural response in two experimental conditions, one in which variability in stimulus presentation is expected (occurs 75% of the time) and a control condition where variability and repetition are equally likely (50% of the time). We show that 6-month-old infants exhibit attenuated frontal lobe response to blocks of variable auditory stimuli during contexts when variability is expected as compared to the control condition. These findings suggest that young infants’ neural responses are modulated by predictions gained from experience and not simply by bottom-up mechanisms.  相似文献   
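As a toy illustration of the predictive-processing logic at issue here (not the authors' model), the "expectedness" of a variable block in each context can be expressed as its Shannon surprisal; a predictive-coding account would map the lower surprisal in the 75% context onto the attenuated frontal response. The probabilities are the ones stated above; the surprisal mapping itself is an assumption.

```python
# Toy sketch: surprisal of a "variable" block under each experimental context.
import numpy as np

def surprisal_bits(p: float) -> float:
    """Shannon surprisal (in bits) of an event with probability p."""
    return -np.log2(p)

contexts = {"variability expected (75%)": 0.75, "control (50/50)": 0.50}
for name, p in contexts.items():
    print(f"{name}: surprisal of a variable block = {surprisal_bits(p):.2f} bits")
# The smaller surprisal in the 75% context mirrors the attenuated response
# to variable blocks reported above.
```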

2.
Repetition priming refers to enhanced or biased performance with repeatedly presented stimuli. Modality-specific perceptual repetition priming has been demonstrated behaviorally for both visually and auditorily presented stimuli. In functional neuroimaging studies, repetition of visual stimuli has resulted in reduced activation in the visual cortex, as well as in multimodal frontal and temporal regions. The reductions in sensory cortices are thought to reflect plasticity in modality-specific neocortex. Unexpectedly, repetition of auditory stimuli has resulted in reduced activation in multimodal and visual regions, but not in the auditory temporal lobe cortex. This finding puts the coupling of perceptual priming and modality-specific cortical plasticity into question. Here, functional magnetic resonance imaging was used with environmental sounds to reexamine whether auditory priming is associated with reduced activation in the auditory cortex. Participants heard environmental sounds (e.g., animals, machines, musical instruments, etc.) in blocks, alternating between initial and repeated presentations, and decided whether or not each sound was produced by an animal. Repeated versus initial presentations of sounds resulted in repetition priming (faster responses) and reduced activation in the right superior temporal gyrus, bilateral superior temporal sulci, and right inferior prefrontal cortex. The magnitude of behavioral priming correlated positively with reduced activation in these regions. This indicates that priming for environmental sounds is associated with modification of neural activation in modality-specific auditory cortex, as well as in multimodal areas.  相似文献   
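A minimal sketch of the brain-behaviour analysis described above, using simulated numbers rather than the study's data: per-subject behavioural priming (initial minus repeated reaction time) is correlated with the repetition-related reduction in activation within a region of interest.

```python
# Simulated brain-behaviour correlation: RT priming vs. activation reduction.
import numpy as np

rng = np.random.default_rng(0)
n_subjects = 20
rt_initial = rng.normal(900, 60, n_subjects)         # ms, initial presentations
priming_ms = rng.normal(60, 20, n_subjects)          # hypothetical speed-up on repeats
rt_repeated = rt_initial - priming_ms
# Hypothetical ROI signal reduction that partly tracks priming
reduction = 0.004 * priming_ms + rng.normal(0, 0.05, n_subjects)

behavioural_priming = rt_initial - rt_repeated
r = np.corrcoef(behavioural_priming, reduction)[0, 1]
print(f"Pearson r between priming magnitude and activation reduction: {r:.2f}")
```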

3.
Successful interactions with the environment entail interpreting ambiguous sensory information. To address this challenge it has been suggested that the brain optimizes performance through experience. Here we used functional magnetic resonance imaging (fMRI) to investigate whether perceptual experience modulates the cortical circuits involved in visual awareness. Using ambiguous visual stimuli (binocular rivalry or ambiguous structure‐from‐motion) we were able to disentangle the co‐occurring influences of stimulus repetition and perceptual repetition. For both types of ambiguous stimuli we observed that the mere repetition of the stimulus evoked an entirely different pattern of activity modulations than the repetition of a particular perceptual interpretation of the stimulus. Regarding stimulus repetition, decreased fMRI responses were evident during binocular rivalry but weaker during 3‐D motion rivalry. Perceptual repetition, on the other hand, entailed increased activity in stimulus‐specific visual brain regions – for binocular rivalry in the early visual regions and for ambiguous structure‐from‐motion in both early as well as higher visual regions. This indicates that the repeated activation of a visual network mediating a particular percept facilitated its later reactivation. Perceptual repetition was also associated with a response change in the parietal cortex that was similar for the two types of ambiguous stimuli, possibly relating to the temporal integration of perceptual information. We suggest that perceptual repetition is associated with a facilitation of neural activity within and between percept‐specific visual networks and parietal networks involved in the temporal integration of perceptual information, thereby enhancing the stability of previously experienced percepts.  相似文献   

4.
This study mapped the developmental trajectories of cortical regions in comparison to overall brain growth in typically developing, socially-housed infant macaques. Volumetric changes of cortical brain regions were examined longitudinally between 2–24 weeks of age (equivalent to the first 2 years in humans) in 21 male rhesus macaques. Growth of the prefrontal, frontal, parietal, occipital, and temporal cortices (visual and auditory) was examined using MRI and age-specific infant macaque brain atlases developed by our group. Results indicate that cortical volumetric development follows a cubic growth curve, but maturational timelines and growth rates are region-specific. Total intracranial volume (ICV) increased significantly during the first 5 months of life, leveling off thereafter. Prefrontal and temporal visual cortices showed fast volume increases during the first 16 weeks, followed by a plateau, and significant growth again between 20–24 weeks. Volume of the frontal and temporal auditory cortices increased substantially between 2–24 weeks. The parietal cortex showed a significant volume increase during the first 4 months, whereas the volume of the occipital lobe increased between 2–12 weeks and plateaued thereafter. These developmental trajectories show similarities to cortical growth in human infants, providing foundational information necessary to build nonhuman primate (NHP) models of human neurodevelopmental disorders.  相似文献   
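The growth-curve modelling described above can be illustrated with a short sketch: fitting a cubic polynomial of regional volume on age and reading its derivative as a region-specific growth rate. The ages and volumes below are invented for illustration, not the macaque measurements.

```python
# Cubic growth-curve fit of regional volume against age (illustrative data).
import numpy as np

age_weeks = np.array([2, 4, 8, 12, 16, 20, 24], dtype=float)
volume_cm3 = np.array([4.1, 5.0, 6.2, 6.9, 7.3, 7.4, 7.6])   # hypothetical ROI volumes

coeffs = np.polyfit(age_weeks, volume_cm3, deg=3)   # cubic growth curve
fit = np.poly1d(coeffs)
growth_rate = fit.deriv()                           # cm^3 per week
for a in (4, 12, 24):
    print(f"week {a:>2}: fitted volume {fit(a):.2f} cm^3, rate {growth_rate(a):+.3f} cm^3/week")
```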

5.
A frontoparietal network of brain regions is often implicated in both auditory and visual information processing. Although it is possible that the same set of multimodal regions subserves both modalities, there is increasing evidence that there is a differentiation of sensory function within frontoparietal cortex. Magnetic resonance imaging (MRI) in humans was used to investigate whether different frontoparietal regions showed intrinsic biases in connectivity with visual or auditory modalities. Structural connectivity was assessed with diffusion tractography and functional connectivity was tested using functional MRI. A dorsal–ventral gradient of function was observed, where connectivity with visual cortex dominates dorsal frontal and parietal connections, while connectivity with auditory cortex dominates ventral frontal and parietal regions. A gradient was also observed along the posterior–anterior axis, although in opposite directions in prefrontal and parietal cortices. The results suggest that the location of neural activity within frontoparietal cortex may be influenced by these intrinsic biases toward visual and auditory processing. Thus, the location of activity in frontoparietal cortex may be influenced as much by stimulus modality as the cognitive demands of a task. It was concluded that stimulus modality was spatially encoded throughout frontal and parietal cortices, and was speculated that such an arrangement allows for top–down modulation of modality‐specific information to occur within higher‐order cortex. This could provide a potentially faster and more efficient pathway by which top–down selection between sensory modalities could occur, by constraining modulations to within frontal and parietal regions, rather than long‐range connections to sensory cortices. Hum Brain Mapp 38:255–270, 2017. © 2016 Wiley Periodicals, Inc.  相似文献   
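One hedged way to express the modality bias described above is a normalized index contrasting each seed's connectivity with visual versus auditory cortex; positive values indicate a visual bias and negative values an auditory bias. The seed names and connectivity values below are purely illustrative assumptions.

```python
# Illustrative modality-bias index per frontoparietal seed region.
seeds = {
    "dorsal parietal":  {"visual": 0.62, "auditory": 0.21},
    "ventral parietal": {"visual": 0.25, "auditory": 0.55},
    "dorsal frontal":   {"visual": 0.48, "auditory": 0.19},
    "ventral frontal":  {"visual": 0.18, "auditory": 0.44},
}
for name, c in seeds.items():
    bias = (c["visual"] - c["auditory"]) / (c["visual"] + c["auditory"])
    print(f"{name:16s} modality-bias index = {bias:+.2f}")  # >0 visual, <0 auditory
```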

6.
A fundamental question with regard to perceptual development is how multisensory information is processed in the brain during the early stages of development. Although a growing body of evidence has shown the early emergence of modality‐specific functional differentiation of the cortical regions, the interplay between sensory inputs from different modalities in the developing brain is not well understood. To study the effects of auditory input during audio‐visual processing in 3‐month‐old infants, we evaluated the spatiotemporal cortical hemodynamic responses of 50 infants while they perceived visual objects with or without accompanying sounds. The responses were measured using 94‐channel near‐infrared spectroscopy over the occipital, temporal, and frontal cortices. The effects of sound manipulation were pervasive throughout the diverse cortical regions and were specific to each cortical region. Visual stimuli co‐occurring with sound induced the early‐onset activation of the early auditory region, followed by activation of the other regions. Removal of the sound stimulus resulted in focal deactivation in the auditory regions and reduced activation in the early visual region, the association region of the temporal and parietal cortices, and the anterior prefrontal regions, suggesting multisensory interplay. In contrast, equivalent activations were observed in the lateral occipital and lateral prefrontal regions, regardless of sound manipulation. Our findings indicate that auditory input did not generally enhance overall activation in relation to visual perception, but rather induced specific changes in each cortical region. The present study implies that 3‐month‐old infants may perceive audio‐visual multisensory inputs by using the global network of functionally differentiated cortical regions. Hum Brain Mapp, 2013. © 2011 Wiley Periodicals, Inc.  相似文献   

7.
Task‐irrelevant visual stimuli can enhance auditory perception. However, while there is some neurophysiological evidence for mechanisms that underlie the phenomenon, the neural basis of visually induced effects on auditory perception remains unknown. Combining fMRI and EEG with psychophysical measurements in two independent studies, we identified the neural underpinnings and temporal dynamics of visually induced auditory enhancement. Lower‐ and higher‐intensity sounds were paired with a non‐informative visual stimulus, while participants performed an auditory detection task. Behaviourally, visual co‐stimulation enhanced auditory sensitivity. Using fMRI, enhanced BOLD signals were observed in primary auditory cortex for low‐intensity audiovisual stimuli which scaled with subject‐specific enhancement in perceptual sensitivity. Concordantly, a modulation of event‐related potentials could already be observed over frontal electrodes at an early latency (30–80 ms), which again scaled with subject‐specific behavioural benefits. Later modulations starting around 280 ms, that is in the time range of the P3, did not fit this pattern of brain‐behaviour correspondence. Hence, the latency of the corresponding fMRI‐EEG brain‐behaviour modulation points at an early interplay of visual and auditory signals in low‐level auditory cortex, potentially mediated by crosstalk at the level of the thalamus. However, fMRI signals in primary auditory cortex, auditory thalamus and the P50 for higher‐intensity auditory stimuli were also elevated by visual co‐stimulation (in the absence of any behavioural effect) suggesting a general, intensity‐independent integration mechanism. We propose that this automatic interaction occurs at the level of the thalamus and might signify a first step of audiovisual interplay necessary for visually induced perceptual enhancement of auditory perception.  相似文献   
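The behavioural quantity that the early neural modulations are said to track is detection sensitivity; a small sketch of how d′ would be computed for the auditory-only and audiovisual conditions is shown below, with hypothetical hit and false-alarm rates.

```python
# Detection sensitivity (d') with vs. without visual co-stimulation (hypothetical rates).
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

auditory_only = d_prime(hit_rate=0.62, fa_rate=0.20)
audio_visual = d_prime(hit_rate=0.74, fa_rate=0.20)
print(f"d' auditory only : {auditory_only:.2f}")
print(f"d' audio-visual  : {audio_visual:.2f}")
print(f"visually induced enhancement: {audio_visual - auditory_only:+.2f}")
```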

8.
Spatio-temporal constraints for auditory–visual integration
The perceptual coherence of auditory and visual information is achieved by integrative brain processes. Specialized single neurons with spatial and temporal interactions of auditory and visual stimuli have been demonstrated by several neurophysiological studies. The present, psychophysical, study investigates possible perceptual correlates of these neuronal features. Subjects had to indicate the point of subjective spatial alignment (PSSA) for a horizontally moving visual stimulus that crossed the position of a stationary sound source. Auditory and visual stimuli consisted of periodic pulses that were systematically varied in their phase relationship or repetition rate. PSSAs obtained for continuous visual stimuli served as a reference. When sound and light pulses were coincident in phase at a repetition rate of 2 Hz, PSSAs were shifted by approximately 3° in a direction opposite to the movement of the visual stimulus (with respect to the reference condition). This shift markedly decreased when the temporal disparity exceeded approximately 100 ms and disappeared near phase opposition (250 ms disparity). With a 4 Hz repetition rate (temporal disparity ≤ 125 ms), there was no significant effect of phase relationship on PSSAs, but still an approximately constant shift with respect to the reference value. Variation of the repetition rate resulted in almost constant shifts in PSSA of approximately 3° between 1 and 4 Hz and a linear decrease (slope 0.27°/Hz) with higher repetition rates. These results suggest a spatio-temporal 'window' for auditory-visual integration that extends over approximately 100 ms and approximately 3°: when auditory and visual stimuli are within this window, they are always perceived as spatially coincident. These psychophysical findings may be related to properties of bimodal neurons such as have been demonstrated by neurophysiological recordings in midbrain and cortex.
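A simple piecewise reading of the reported values (an interpretation for illustration, not the authors' model): the PSSA shift stays near 3° for repetition rates of 1–4 Hz and then declines linearly at roughly 0.27°/Hz.

```python
# Piecewise sketch of the PSSA shift as a function of repetition rate.
def pssa_shift_deg(rate_hz: float) -> float:
    plateau, slope, knee = 3.0, 0.27, 4.0   # values taken from the abstract above
    if rate_hz <= knee:
        return plateau
    return max(0.0, plateau - slope * (rate_hz - knee))

for rate in (1, 2, 4, 6, 8, 10, 15):
    print(f"{rate:>2} Hz -> shift ≈ {pssa_shift_deg(rate):.2f}°")
```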

9.
The event‐related potential ‘mismatch negativity’ (MMN) is an indicator of a perceiver's ability to detect deviations in sensory signal streams. MMN and its homologue in animals, mismatch activity (MMA), are differential neural responses to a repeatedly presented stimulus and a subsequent deviant stimulus (oddball). Because neural mechanisms underlying MMN and MMA remain unclear, there is a controversy as to whether MMN and MMA arise solely from stimulus‐specific adaptation (SSA), in which the response to a stimulus cumulatively attenuates with its repetitive presentation. To address this issue, we used electrocorticography and the auditory roving‐oddball paradigm in two awake macaque monkeys. We examined the effect of stimulus repetition number on MMA and on responses to repeated stimuli and oddballs across the cerebral cortex in the time–frequency domain. As the repetition number increased, MMA spread across the temporal, frontal and parietal cortices, and each electrode yielded a larger MMA. Surprisingly, this increment in MMA largely depended on response augmentation to the oddball rather than on SSA to the repeated stimulus. Following sufficient repetition, the oddball evoked a spectral power increment in some electrodes on the frontal cortex that had shown no power increase to the stimuli with less or no preceding repetition. We thereby revealed that repetitive presentation of one stimulus not only leads to SSA but also facilitates the cortical response to oddballs involving a wide range of cortical regions. This facilitative effect might underlie the generation of MMN‐like scalp potentials in macaques that potentially shares similar neural mechanisms with MMN in humans.  相似文献   
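A sketch of how a roving-oddball sequence of the kind used here can be generated: tones repeat in trains of variable length, and the first tone after each frequency change serves as the oddball whose response is compared against the preceding standard. The tone set and train lengths are illustrative assumptions.

```python
# Generate a roving-oddball tone sequence and label oddballs with their preceding repetition count.
import numpy as np

rng = np.random.default_rng(1)
tone_set_hz = [500, 1000, 2000, 4000]          # hypothetical tone frequencies
sequence, labels = [], []                      # labels: (role, repetitions before the change)

freq, prev_train_len = rng.choice(tone_set_hz), 0
for _ in range(12):                            # 12 trains of variable length
    train_len = int(rng.integers(3, 9))
    for i in range(train_len):
        role = "oddball" if (i == 0 and prev_train_len > 0) else "standard"
        sequence.append(int(freq))
        labels.append((role, prev_train_len if role == "oddball" else 0))
    prev_train_len = train_len
    freq = rng.choice([f for f in tone_set_hz if f != freq])

# Mismatch activity would then be estimated as response(oddball) minus response(preceding
# standard), binned by how many repetitions preceded the frequency change.
print(labels[:10])
```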

10.
Priming, response learning and repetition suppression
Horner AJ, Henson RN. Neuropsychologia, 2008, 46(7): 1979–1991.
Prior exposure to a stimulus can facilitate its subsequent identification and classification, a phenomenon called priming. This behavioural facilitation is usually accompanied by a reduction in neural response within specific cortical regions (repetition suppression, RS). Recent research has suggested that both behavioural priming and RS can be largely determined by previously learned stimulus-response associations. According to this view, a direct association forms between the stimulus presented and the response made to it. On a subsequent encounter with the stimulus, this association automatically cues the response, bypassing the various processing stages that were required to select that response during its first presentation. Here we reproduce behavioural evidence for such stimulus-response associations, and show the PFC to be sensitive to such changes. In contrast, RS within ventral temporal regions (such as the fusiform cortex), which are usually associated with perceptual processing, is shown to be robust to response changes. The present study therefore suggests a dissociation between RS within the PFC, which may be sensitive to retrieval of stimulus-response associations, and RS within posterior perceptual regions, which may reflect facilitation of perceptual processing independent of stimulus-response associations.  相似文献   

11.
The synchronous occurrence of the unisensory components of a multisensory stimulus contributes to their successful merging into a coherent perceptual representation. Oscillatory gamma-band responses (GBRs, 30-80 Hz) have been linked to feature integration mechanisms and to multisensory processing, suggesting they may also be sensitive to the temporal alignment of multisensory stimulus components. Here we examined the effects on early oscillatory GBR brain activity of varying the precision of the temporal synchrony of the unisensory components of an audio-visual stimulus. Audio-visual stimuli were presented with stimulus onset asynchronies ranging from -125 to +125 ms. Randomized streams of auditory (A), visual (V), and audio-visual (AV) stimuli were presented centrally while subjects attended to either the auditory or visual modality to detect occasional targets. GBRs to auditory and visual components of multisensory AV stimuli were extracted for five subranges of asynchrony (e.g., A preceded by V by 100 ± 25 ms, by 50 ± 25 ms, etc.) and compared with GBRs to unisensory control stimuli. Robust multisensory interactions were observed in the early GBRs when the auditory and visual stimuli were presented with the closest synchrony. These effects were found over medial-frontal brain areas after 30-80 ms and over occipital brain areas after 60-120 ms. A second integration effect, possibly reflecting the perceptual separation of the two sensory inputs, was found over occipital areas when auditory inputs preceded visual by 100 ± 25 ms. No significant interactions were observed for the other subranges of asynchrony. These results show that the precision of temporal synchrony can have an impact on early cross-modal interactions in human cortex.
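A hedged sketch of the additive comparison implied above, in which the multisensory interaction is taken as GBR(AV) minus the sum of the unisensory GBRs, evaluated separately per asynchrony subrange; all values below are invented for illustration.

```python
# Superadditivity check per stimulus-onset-asynchrony (SOA) subrange (hypothetical values).
import numpy as np

soa_bins_ms = [-100, -50, 0, 50, 100]                 # centres of ±25 ms subranges
gbr_av = np.array([0.92, 1.05, 1.40, 1.10, 0.95])     # hypothetical AV gamma power
gbr_a, gbr_v = 0.60, 0.55                             # hypothetical unisensory controls

interaction = gbr_av - (gbr_a + gbr_v)
for soa, inter in zip(soa_bins_ms, interaction):
    flag = "  <- largest near synchrony" if soa == 0 else ""
    print(f"SOA {soa:+4d} ms: AV - (A+V) = {inter:+.2f}{flag}")
```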

12.
Effects of spatially directed auditory attention on human brain activity, as indicated by changes in regional cerebral blood flow (rCBF), were measured with positron emission tomography (PET). Subjects attended to left-ear tones, right-ear tones, or foveal visual stimuli presented at rapid rates in three concurrent stimulus sequences. It was found that attending selectively to the right-ear input activated the auditory cortex predominantly in the left hemisphere and vice versa. This selective tuning of the left and right auditory cortices according to the direction of attention was presumably controlled by executive attention mechanisms of the frontal cortex, where enhanced activation during auditory attention was also observed.  相似文献   

13.
Cortical signals associated with infrequent tone omissions were recorded from 9 healthy adults with a whole-head 122-channel neuromagnetometer. The stimulus sequence consisted of monaural (left or right) 50-ms 1-kHz tones repeated every 0.2 or 0.5 s, with 7% of the tones randomly omitted. Tones elicited typical responses in the supratemporal auditory cortices. Omissions evoked strong responses over temporal and frontal areas, independently of the side of stimulation, with peak amplitudes at 145–195 ms. Response amplitudes were 60% weaker when the subject was not attending to the stimuli. Omission responses originated in supratemporal auditory cortices bilaterally, indicating that auditory cortex plays an important role in the brain's modelling of temporal characteristics of the auditory environment. Additional activity was observed in the posterolateral frontal cortex and in the superior temporal sulcus, more often in the right than in the left hemisphere.  相似文献   
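The stimulus schedule described above can be sketched as follows: tone onsets at a fixed 0.2- or 0.5-s interval with roughly 7% of tones randomly omitted (illustrative code, not the authors' presentation software).

```python
# Build a tone-onset schedule with ~7% random omissions.
import numpy as np

rng = np.random.default_rng(2)
soa_s, n_stimuli, p_omit = 0.5, 400, 0.07

onsets_s = np.arange(n_stimuli) * soa_s
omitted = rng.random(n_stimuli) < p_omit          # True where the tone is omitted
print(f"{omitted.sum()} of {n_stimuli} tones omitted "
      f"({100 * omitted.mean():.1f}%); first omission at {onsets_s[omitted][0]:.1f} s")
```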

14.
Although some brain areas preferentially process information from a particular sensory modality, these areas can also respond to other modalities. Here we used fMRI to show that such responsiveness to tactile stimuli depends on the temporal frequency of stimulation. Participants performed a tactile threshold-tracking task where the tip of either their left or right middle finger was stimulated at 3, 20, or 100 Hz. Whole-brain analysis revealed an effect of stimulus frequency in two regions: the auditory cortex and the visual cortex. The BOLD response in the auditory cortex was stronger during stimulation at hearable frequencies (20 and 100 Hz) whereas the response in the visual cortex was suppressed at infrasonic frequencies (3 Hz). Regardless of which hand was stimulated, the frequency-dependent effects were lateralized to the left auditory cortex and the right visual cortex. Furthermore, the frequency-dependent effects in both areas were abolished when the participants performed a visual task while receiving identical tactile stimulation as in the tactile threshold-tracking task. We interpret these findings in the context of the metamodal theory of brain function, which posits that brain areas contribute to sensory processing by performing specific computations regardless of input modality.  相似文献   

15.
We used electrical stimulation mapping to compare performance on auditory and visual naming tasks in inferotemporal, lateral temporal, frontal, and parietal cortex in 8 temporal lobe epilepsy (TLE) patients with subdural electrodes placed for preoperative language localization. Performance on auditory responsive naming (ARN) and visual confrontation naming (VCN) was best during stimulation of parietal cortex and was equally impaired during stimulation of inferotemporal and frontal cortex. In contrast, ARN performance was significantly poorer than VCN performance during stimulation of anterior and posterior lateral temporal cortex. In most patients, stimulation of inferotemporal cortex at relatively low stimulus intensities (≥ 5 mA) during either ARN or VCN elicited reproducible errors in which patients could describe, gesture, spell, or draw, but not name, in response to auditory or visual cues. Inferotemporal and frontal cortex appear to be multimodality language regions distinct from lateral temporal cortex.  相似文献   

16.
The reduced neural response in certain brain regions when a task‐relevant stimulus is repeated (“repetition suppression”, RS) is often attributed to facilitation of the cognitive processes performed in those regions. Repetition of visual objects is associated with RS in the ventral and lateral occipital/temporal regions, and is typically attributed to facilitation of visual processes, ranging from the extraction of shape to the perceptual identification of objects. In two fMRI experiments using a semantic classification task, we found RS in a left lateral occipital/inferior temporal region to a picture of an object when the name of that object had previously been presented in a separate session. In other words, we found RS despite negligible visual similarity between the initial and repeated occurrences of an object identity. There was no evidence that this RS was driven by the learning of task‐specific responses to an object identity (“S‐R learning”). We consider several explanations of this occipitotemporal RS, such as phonological retrieval, semantic retrieval, and visual imagery. Although no explanation is fully satisfactory, it is proposed that such effects most plausibly relate to the extraction of task‐relevant information relating to object size, either through the extraction of sensory‐specific semantic information or through visual imagery processes. Our findings serve to emphasize the potential complexity of processing within traditionally visual regions, at least as measured by fMRI. Hum Brain Mapp, 2010. © 2010 Wiley‐Liss, Inc.

17.
The neural correlates of consciousness (NCC), i.e., patterns of brain activity that specifically accompany a particular conscious experience, have been investigated mainly in the visual system using particularly suited paradigms, such as binocular rivalry and multistable percepts in combination with neural recordings or neuroimaging. Through the same principles, we look here for possible NCC in the auditory modality exploiting the properties of the Deutsch's illusion, a stimulation condition in which a sequence of two specular dichotic stimuli presented in alternation causes an illusory segregation of pitch and side (ear of origin), which can yield up to four different auditory percepts per dichotic stimulus. Using magnetoencephalography in humans, we observed cortical activity specifically accompanying conscious experience of pitch inside an early bilateral network, including the Heschl's gyrus, the middle temporal gyrus, the right inferior, and the superior frontal gyri. The conscious experience of perceived side was instead accompanied by later activity observed bilaterally in the inferior parietal lobe and in the superior frontal gyrus. These results suggest that the NCC are not independent of stimulus features and modality and that, even at the higher cortical levels, the different aspects of a single perceptual scene may not be simultaneously processed.  相似文献   

18.
It is well known that previous perceptual experiences alter subsequent perception, but the details of the neural underpinnings of this general phenomenon are still sketchy. Here, we ask whether previous experience with an item (such as seeing a person's face) leads to the alteration of the neural correlates related to processing of the item as such, or whether it creates additional associative connections between such substrates and those activated during prior experience. To address this question, we used magnetoencephalography (MEG) to identify neural changes accompanying subjects' viewing of unfamiliar versus famous faces and hearing of unfamiliar versus famous names. We were interested in the nature of the involvement of auditory brain regions in the viewing of faces, and in the involvement of visual regions in the hearing of names. Evoked responses from MEG recordings for the names and faces conditions were localized to auditory and visual cortices, respectively. Unsurprisingly, peak activation strength of evoked responses was larger for famous versus nonfamous names within the superior temporal gyrus (STG), and was similar for famous and nonfamous faces in the occipital cortex. More relevant to the issue of experience on perception, peak activation strength in the STG was larger for viewed famous versus nonfamous faces, and peak activation within the occipital cortex was larger for heard famous versus nonfamous names. Critically, these experience-related responses were present within 150-250 msec of stimulus onset. These findings support the hypothesis that prior experiences may influence processing of faces and names such that perception encompasses more than what is imparted on the senses.

19.
Cross‐modal reorganization in the auditory and visual cortices has been reported after hearing and visual deficits mostly during the developmental period, possibly underlying sensory compensation mechanisms. However, there are very few data on the existence or nature and timeline of such reorganization events during sensory deficits in adulthood. In this study, we assessed long‐term changes in activity‐dependent immediate early genes c‐Fos and Arc/Arg3.1 in auditory and neighboring visual cortical areas after bilateral deafness in young adult rats. Specifically, we analyzed qualitatively and quantitatively c‐Fos and Arc/Arg3.1 immunoreactivity at 15 and 90 days after cochlea removal. We report extensive, global loss of c‐Fos and Arc/Arg3.1 immunoreactive neurons in the auditory cortex 15 days after permanent auditory deprivation in adult rats, which is partly reversed 90 days after deafness. Simultaneously, the number and labeling intensity of c‐Fos‐ and Arc/Arg3.1‐immunoreactive neurons progressively increase in neighboring visual cortical areas from 2 weeks after deafness and these changes stabilize three months after inducing the cochlear lesion. These findings support plastic, compensatory, long‐term changes in activity in the auditory and visual cortices after auditory deprivation in the adult rats. Further studies may clarify whether those changes result in perceptual potentiation of visual drives on auditory regions of the adult cortex.  相似文献   

20.
Temporal information in acoustic signals is important for the perception of environmental sounds, including speech. This review focuses on several aspects of temporal processing within human auditory cortex and its relevance for the processing of speech sounds. Periodic non-speech sounds, such as trains of acoustic clicks and bursts of amplitude-modulated noise or tones, can elicit different percepts depending on the pulse repetition rate or modulation frequency. Such sounds provide convenient methodological tools to study representation of timing information in the auditory system. At low repetition rates of up to 8-10 Hz, each individual stimulus (a single click or a sinusoidal amplitude modulation cycle) within the sequence is perceived as a separate event. As repetition rates increase up to and above approximately 40 Hz, these events blend together, giving rise first to the percept of flutter and then to pitch. The extent to which neural responses of human auditory cortex encode temporal features of acoustic stimuli is discussed within the context of these perceptual classes of periodic stimuli and their relationship to speech sounds. Evidence for neural coding of temporal information at the level of the core auditory cortex in humans suggests possible physiological counterparts to perceptual categorical boundaries for periodic acoustic stimuli. Temporal coding is less evident in auditory cortical fields beyond the core. Finally, data suggest hemispheric asymmetry in temporal cortical processing.  相似文献   
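The perceptual categories summarized in this review can be written down as a simple rate-to-percept mapping; the boundary values below follow the approximate figures given above and are not exact thresholds.

```python
# Approximate mapping from click-train repetition rate to reported percept.
def percept_for_rate(rate_hz: float) -> str:
    if rate_hz <= 10:        # up to ~8-10 Hz: each click heard as a separate event
        return "separate events"
    if rate_hz < 40:         # intermediate rates: flutter
        return "flutter"
    return "pitch"           # ~40 Hz and above: fused pitch percept

for rate in (2, 8, 20, 40, 100, 200):
    print(f"{rate:>3} Hz click train -> {percept_for_rate(rate)}")
```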

