Similar Articles
20 similar articles found
1.
Rearing cats from birth to adulthood in darkness prevents neurons in the superior colliculus (SC) from developing the capability to integrate visual and non‐visual (e.g. visual‐auditory) inputs. Presumably, this developmental anomaly is due to a lack of experience with the combination of those cues, which is essential to form associative links between them. The visual‐auditory multisensory integration capacity of SC neurons has also been shown to depend on the functional integrity of converging visual and auditory inputs from the ipsilateral association cortex. Disrupting these cortico‐collicular projections at any stage of life results in a pattern of outcomes similar to those found after dark‐rearing; SC neurons respond to stimuli in both sensory modalities, but cannot integrate the information they provide. Thus, it is possible that dark‐rearing compromises the development of these descending tecto‐petal connections and the essential influences they convey. However, the results of the present experiments, using cortical deactivation to assess the presence of cortico‐collicular influences, demonstrate that dark‐rearing does not prevent the association cortex from developing robust influences over SC multisensory responses. In fact, dark‐rearing may increase their potency over that observed in normally‐reared animals. Nevertheless, their influences are still insufficient to support SC multisensory integration. It appears that cross‐modal experience shapes the cortical influence to selectively enhance responses to cross‐modal stimulus combinations that are likely to be derived from the same event. In the absence of this experience, the cortex develops an indiscriminate excitatory influence over its multisensory SC target neurons.

2.
To make accurate perceptual estimates, observers must take the reliability of sensory information into account. Despite many behavioural studies showing that subjects weight individual sensory cues in proportion to their reliabilities, it is still unclear when during a trial neuronal responses are modulated by the reliability of sensory information or when they reflect the perceptual weights attributed to each sensory input. We investigated these questions using a combination of psychophysics, EEG‐based neuroimaging and single‐trial decoding. Our results show that the weighted integration of sensory information in the brain is a dynamic process; effects of sensory reliability on task‐relevant EEG components were evident 84 ms after stimulus onset, while neural correlates of perceptual weights emerged 120 ms after stimulus onset. These neural processes had different underlying sources, arising from sensory and parietal regions, respectively. Together these results reveal the temporal dynamics of perceptual and neural audio‐visual integration and support the notion of temporally early and functionally specific multisensory processes in the brain.
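The reliability weighting described in this abstract is conventionally modelled as maximum-likelihood cue combination: each cue's weight is proportional to its reliability (the inverse of its variance), and the fused estimate is more precise than either cue alone. A minimal sketch of that rule (the function name and numeric values are illustrative, not taken from the study):

```python
import numpy as np

def reliability_weighted_estimate(estimates, variances):
    """Fuse unimodal estimates, weighting each by its reliability (1/variance).

    This is the standard maximum-likelihood cue-combination rule: weights are
    proportional to reliabilities, and the variance of the fused estimate is
    lower than that of any single cue.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    fused = float(np.dot(weights, estimates))
    fused_variance = 1.0 / reliabilities.sum()
    return fused, fused_variance

# Hypothetical example: a noisy auditory estimate (variance 4) and a precise
# visual estimate (variance 1) of the same location.
fused, fused_var = reliability_weighted_estimate([10.0, 6.0], [4.0, 1.0])
```

Under this rule the visual cue receives weight 0.8 and the auditory cue 0.2, so the fused location (6.8) sits closer to the more reliable visual estimate, and the fused variance (0.8) is below both unimodal variances.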

3.
Prior studies have repeatedly reported behavioural benefits to events occurring at attended, compared to unattended, points in time. It has been suggested that, as for spatial orienting, temporal orienting of attention spreads across sensory modalities in a synergistic fashion. However, the consequences of cross‐modal temporal orienting of attention remain poorly understood. One challenge is that the passage of time leads to an increase in event predictability throughout a trial, thus making it difficult to interpret possible effects (or lack thereof). Here we used a design that avoids complete temporal predictability to investigate whether attending to a sensory modality (vision or touch) at a point in time confers beneficial access to events in the other, non‐attended, sensory modality (touch or vision, respectively). In contrast to previous studies and to what happens with spatial attention, we found that events in one (unattended) modality do not automatically benefit from happening at the time point when another modality is expected. Instead, it seems that attention can be deployed in time with relative independence for different sensory modalities. Based on these findings, we argue that temporal orienting of attention can be cross‐modally decoupled in order to flexibly react according to the environmental demands, and that the efficiency of this selective decoupling unfolds in time.

4.
Although there is increasing knowledge about how visual and tactile cues from the hands are integrated, little is known about how self‐generated hand movements affect such multisensory integration. Visuo‐tactile integration often occurs under highly dynamic conditions requiring sensorimotor updating. Here, we quantified visuo‐tactile integration by measuring cross‐modal congruency effects (CCEs) in different bimanual hand movement conditions with the use of a robotic platform. We found that classical CCEs also occurred during bimanual self‐generated hand movements, and that such movements lowered the magnitude of visuo‐tactile CCEs as compared to static conditions. Visuo‐tactile integration, body ownership and the sense of agency were decreased by adding a temporal visuo‐motor delay between hand movements and visual feedback. These data show that visual stimuli interfere less with the perception of tactile stimuli during movement than during static conditions, especially when decoupled from predictive motor information. The results suggest that current models of visuo‐tactile integration need to be extended to account for multisensory integration in dynamic conditions.
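The cross-modal congruency effect (CCE) measured in this study is conventionally computed as the difference in mean reaction time between incongruent and congruent visuo-tactile trials. A minimal sketch with invented reaction times (none of these numbers are data from the study; they merely mirror its qualitative pattern):

```python
import numpy as np

# Hypothetical reaction times (ms) for congruent vs. incongruent
# visuo-tactile trials, in static and movement conditions.
rts = {
    ("static", "congruent"): [520.0, 540.0, 510.0],
    ("static", "incongruent"): [600.0, 630.0, 590.0],
    ("movement", "congruent"): [530.0, 545.0, 525.0],
    ("movement", "incongruent"): [560.0, 575.0, 555.0],
}

def cce(condition):
    """CCE = mean RT(incongruent) - mean RT(congruent): the reaction-time
    cost of an incongruent visual distractor on tactile judgments."""
    return float(np.mean(rts[(condition, "incongruent")])
                 - np.mean(rts[(condition, "congruent")]))

# Mirroring the study's pattern: a smaller CCE during movement indicates
# that visual stimuli interfere less with tactile perception.
static_cce = cce("static")
movement_cce = cce("movement")
```

With these toy values the static CCE (about 83 ms) exceeds the movement CCE (30 ms), the direction of effect the abstract reports.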

5.
This study analyzed high‐density event‐related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task‐irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory‐visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross‐modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non‐linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top‐down attentional control that further modulates cross‐modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context‐based control over multisensory processing, whose influences multiplex across finer and broader time scales. Hum Brain Mapp 37:273–288, 2016. © 2015 Wiley Periodicals, Inc.

6.
The orienting of attention to the spatial location of sensory stimuli in one modality based on sensory stimuli presented in another modality (i.e., cross‐modal orienting) is a common mechanism for controlling attentional shifts. The neuronal mechanisms of top‐down cross‐modal orienting have been studied extensively. However, the neuronal substrates of bottom‐up audio‐visual cross‐modal spatial orienting remain to be elucidated. Therefore, behavioral and event‐related functional magnetic resonance imaging (FMRI) data were collected while healthy volunteers (N = 26) performed a spatial cross‐modal localization task modeled after the Posner cuing paradigm. Behavioral results indicated that although both visual and auditory cues were effective in producing bottom‐up shifts of cross‐modal spatial attention, reorienting effects were greater for the visual cues condition. Statistically significant evidence of inhibition of return was not observed for either condition. Functional results also indicated that visual cues with auditory targets resulted in greater activation within ventral and dorsal frontoparietal attention networks, visual and auditory “where” streams, primary auditory cortex, and thalamus during reorienting across both short and long stimulus onset asynchronies. In contrast, no areas of unique activation were associated with reorienting following auditory cues with visual targets. In summary, current results question whether audio‐visual cross‐modal orienting is supramodal in nature, suggesting rather that the initial modality of cue presentation heavily influences both behavioral and functional results. In the context of localization tasks, reorienting effects accompanied by the activation of the frontoparietal reorienting network are more robust for visual cues with auditory targets than for auditory cues with visual targets. Hum Brain Mapp 35:964–974, 2014. © 2013 Wiley Periodicals, Inc.

7.
Task‐irrelevant visual stimuli can enhance auditory perception. However, while there is some neurophysiological evidence for mechanisms that underlie the phenomenon, the neural basis of visually induced effects on auditory perception remains unknown. Combining fMRI and EEG with psychophysical measurements in two independent studies, we identified the neural underpinnings and temporal dynamics of visually induced auditory enhancement. Lower‐ and higher‐intensity sounds were paired with a non‐informative visual stimulus, while participants performed an auditory detection task. Behaviourally, visual co‐stimulation enhanced auditory sensitivity. Using fMRI, enhanced BOLD signals were observed in primary auditory cortex for low‐intensity audiovisual stimuli, which scaled with subject‐specific enhancement in perceptual sensitivity. Concordantly, a modulation of event‐related potentials could already be observed over frontal electrodes at an early latency (30–80 ms), which again scaled with subject‐specific behavioural benefits. Later modulations starting around 280 ms, that is, in the time range of the P3, did not fit this pattern of brain‐behaviour correspondence. Hence, the latency of the corresponding fMRI‐EEG brain‐behaviour modulation points at an early interplay of visual and auditory signals in low‐level auditory cortex, potentially mediated by crosstalk at the level of the thalamus. However, fMRI signals in primary auditory cortex, auditory thalamus and the P50 for higher‐intensity auditory stimuli were also elevated by visual co‐stimulation (in the absence of any behavioural effect), suggesting a general, intensity‐independent integration mechanism. We propose that this automatic interaction occurs at the level of the thalamus and might signify a first step of audiovisual interplay necessary for visually induced perceptual enhancement of auditory perception.

8.
While the olfactory and tactile vibrissal systems have been extensively studied in the rat, the neural basis of these cross‐modal associations is still elusive. Here we tested the hypothesis that the lateral entorhinal cortex (LEC) could be particularly involved. To tackle this question, we developed a new behavioral paradigm in which rats must find one baited cup (+) among three, each presenting a different and specific odor/texture (OT) combination. During the acquisition of a first task (Task OT1), the three cups were associated with the following OT combinations: O1T1 for the baited cup; O2T1 and O1T2 for the non‐baited ones. Most rats learned this task within three training sessions (20 trials/session). In a second task (Task OT2), animals had to pair another OT combination with the reward using a new set of stimuli (O3T3+, O4T3, and O3T4). Results showed that rats managed to learn Task OT2 within a single session. In a third task (Task OT3), animals had to learn another OT combination based on previously learned items (e.g. O4T4+, O1T4 and O4T1); this is called the “recombination task.” Results showed that control rats solved the recombination task within one session. Animals bilaterally implanted with cannulae in the LEC were microinfused with d‐APV (3 µg/0.6 µL) just before the acquisition or the test session of each task. The results showed that NMDA receptor blockade in the LEC did not affect recall of Task OT1 but strongly impaired acquisition of both Tasks OT2 and OT3. Moreover, two control groups of animals infused with d‐APV showed no deficit in the acquisition of unimodal olfactory and tactile tasks. Taken together, these data show that the NMDA system in the LEC is involved in the acquisition of associations between olfactory and tactile stimuli during a cross‐modal learning task. © 2014 Wiley Periodicals, Inc.

9.
The brain improves speech processing through the integration of audiovisual (AV) signals. Situations involving AV speech integration may be crudely dichotomized into those where auditory and visual inputs contain (1) equivalent, complementary signals (validating AV speech) or (2) inconsistent, different signals (conflicting AV speech). This simple framework may allow the systematic examination of broad commonalities and differences between AV neural processes engaged by various experimental paradigms frequently used to study AV speech integration. We conducted an activation likelihood estimation meta‐analysis of 22 functional imaging studies comprising 33 experiments, 311 subjects, and 347 foci examining “conflicting” versus “validating” AV speech. Experimental paradigms included content congruency, timing synchrony, and perceptual measures, such as the McGurk effect or synchrony judgments, across AV speech stimulus types (sublexical to sentence). Colocalization of conflicting AV speech experiments revealed consistency across at least two contrast types (e.g., synchrony and congruency) in a network of dorsal stream regions in the frontal, parietal, and temporal lobes. There was consistency across all contrast types (synchrony, congruency, and percept) in the bilateral posterior superior/middle temporal cortex. Although fewer studies were available, validating AV speech experiments were localized to other regions, such as ventral stream visual areas in the occipital and inferior temporal cortex. These results suggest that while equivalent, complementary AV speech signals may evoke activity in regions related to the corroboration of sensory input, conflicting AV speech signals recruit widespread dorsal stream areas likely involved in the resolution of conflicting sensory signals. Hum Brain Mapp 35:5587–5605, 2014. © 2014 Wiley Periodicals, Inc.

10.
There are ongoing debates on whether object concepts are coded as supramodal identity‐based or modality‐specific representations in the human brain. In this fMRI study, we adopted a cross‐modal “prime–neutral cue–target” semantic priming paradigm, in which the prime–target relationship was manipulated along both the identity and the modality dimensions. The prime and the target could refer to either the same or different semantic identities, and could be delivered via either the same or different sensory modalities. By calculating the main effects and interactions of this 2 (identity cue validity: “Identity_Cued” vs. “Identity_Uncued”) × 2 (modality cue validity: “Modality_Cued” vs. “Modality_Uncued”) factorial design, we aimed at dissociating three neural networks: one involved in creating novel identity‐specific representations independent of sensory modality, one in creating modality‐specific representations independent of semantic identity, and one in evaluating changes of an object along both the identity and the modality dimensions. Our results suggested that the bilateral lateral occipital cortex was involved in creating a new supramodal semantic representation irrespective of the input modality; the left dorsal premotor cortex and left intraparietal sulcus were involved in creating a new modality‐specific representation irrespective of its semantic identity; and the bilateral superior temporal sulcus was involved in creating a representation when the identity and modality properties were both cued or both uncued. In addition, the right inferior frontal gyrus showed enhanced neural activity only when both the identity and the modality of the target were new, indicating its functional role in novelty detection. Hum Brain Mapp 35:4002–4015, 2014. © 2014 Wiley Periodicals, Inc.

11.
The localization of touch in external space requires the remapping of somatotopically represented tactile information into an external frame of reference. Several recent studies have highlighted the role of posterior parietal areas for this remapping process, yet its temporal dynamics are poorly understood. The present study combined cross‐modal stimulation with electrophysiological recordings in humans to trace the time course of tactile spatial remapping during visual–tactile interactions. Adopting an uncrossed or crossed hand posture, participants made speeded elevation judgments about rare vibrotactile stimuli within a stream of frequent, task‐irrelevant vibrotactile events presented to the left or right hand. Simultaneous but spatially independent visual stimuli had to be ignored. An analysis of the recorded event‐related potentials to the task‐irrelevant vibrotactile stimuli revealed a somatotopic coding of tactile stimuli within the first 100 ms. Between 180 and 250 ms, neither an external nor a somatotopic representation dominated, suggesting that both coordinates were active in parallel. After 250 ms, tactile stimuli were coded in a somatotopic frame of reference. Our results indicate that cross‐modal interactions start before the termination of tactile spatial remapping, that is, within the first 100 ms. Thereafter, tactile stimuli are represented simultaneously in both somatotopic and external spatial coordinates, which are dynamically (re‐)weighted as a function of processing stage.

12.
Even the healthiest older adults experience changes in cognitive and sensory function. Studies show that older adults have reduced neural responses to sensory information. However, it is well known that sensory systems do not act in isolation but function cooperatively to either enhance or suppress neural responses to individual environmental stimuli. Very little research has been dedicated to understanding how aging affects the interactions between sensory systems, especially cross-modal deactivations or the ability of one sensory system (e.g., audition) to suppress the neural responses in another sensory system cortex (e.g., vision). Such cross-modal interactions have been implicated in attentional shifts between sensory modalities and could account for increased distractibility in older adults. To assess age-related changes in cross-modal deactivations, functional MRI studies were performed in 61 adults between 18 and 80 years old during simple auditory and visual discrimination tasks. Results within visual cortex confirmed previous findings of decreased responses to visual stimuli for older adults. Age-related changes in the visual cortical response to auditory stimuli were, however, much more complex and suggested an alteration with age in the functional interactions between the senses. Ventral visual cortical regions exhibited cross-modal deactivations in younger but not older adults, whereas more dorsal aspects of visual cortex were suppressed in older but not younger adults. These differences in deactivation also remained after adjusting for age-related reductions in brain volume of sensory cortex. Thus, functional differences in cortical activity between older and younger adults cannot solely be accounted for by differences in gray matter volume.

13.
Human activities often involve hand‐motor responses following external auditory–verbal commands. It has been believed that hand movements are predominantly driven by the contralateral primary sensorimotor cortex, whereas auditory–verbal information is processed in both superior temporal gyri. It remains unknown whether cortical activation in the superior temporal gyrus during an auditory–motor task is affected by the laterality of hand‐motor responses. Here, event‐related γ‐oscillations were intracranially recorded as quantitative measures of cortical activation; we determined how cortical structures were activated by auditory‐cued movement using each hand in 15 patients with focal epilepsy. Auditory–verbal stimuli elicited augmentation of γ‐oscillations in a posterior portion of the superior temporal gyrus, whereas hand‐motor responses elicited γ‐augmentation in the pre‐ and postcentral gyri. The magnitudes of such γ‐augmentation in the superior temporal, precentral, and postcentral gyri were significantly larger when the hand contralateral to the recorded hemisphere was used for motor responses than when the ipsilateral hand was. The superior temporal gyrus in each hemisphere might play a more pivotal role when the contralateral hand needs to be used for motor responses than when the ipsilateral hand does. Hum Brain Mapp, 2010. © 2010 Wiley‐Liss, Inc.

14.
The temporary closure of one eye in juvenile and young adult mice induces a shift of the ocular dominance (OD) of neurons in the binocular visual cortex. However, OD plasticity typically declines with age and is completely absent in mature mice beyond postnatal day (PD) 110. As it has been shown that the deprivation of one sensory input can induce neuronal alterations in non‐deprived sensory cortices, we here investigated whether cross‐modal interactions have the potential to reinstate OD plasticity in mature mice. Strikingly, using intrinsic signal imaging we could demonstrate that both whisker deprivation and auditory deprivation for only one week reinstated OD plasticity in fully adult mice. These OD shifts were always mediated by an increase of V1 responsiveness to visual stimulation of the open eye, a characteristic feature of OD plasticity normally only found in young adult mice. Moreover, systemic administration of the competitive NMDA receptor antagonist CPP completely abolished cross‐modally induced OD plasticity. Taken together, we demonstrate here for the first time that the deprivation of non‐visual senses has the potential to rejuvenate the adult visual cortex.

15.
Visual neurons coordinate their responses in relation to the stimulus; however, the complex interplay between a stimulus and the functional dynamics of an assembly still eludes neuroscientists. To this aim, we recorded cell assemblies from multi‐electrodes in the primary visual cortex of anaesthetized cats in response to randomly presented sine‐wave drifting gratings whose orientation tilted in 22.5° steps. Cross‐correlograms revealed the functional connections at all the tested orientations. We show that a cell‐assembly discriminates between orientations by recruiting a ‘salient’ functional network at every presented orientation, wherein the connections and their strengths (peak‐probabilities in the cross‐correlogram) change from one orientation to another. Within these assemblies, closely tuned neurons exhibited increased connectivity and connection‐strengths compared with differently tuned neurons. Minimal connectivity between untuned neurons suggests the significance of neuronal selectivity in assemblies. This study reflects upon the dynamics of functional connectivity, and brings to the fore the importance of a ‘signature’ functional network in an assembly that is strictly related to a specific stimulus. It appears that an assembly is the major ‘functional unit’ of information processing in cortical circuits, rather than the individual neurons.
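A cross-correlogram of the kind used to reveal these functional connections is a histogram of spike-time differences between two neurons: a clear peak marks a putative connection, and the peak probability indexes its strength. A simplified sketch (the function name, window and bin sizes are illustrative choices, not the authors' analysis code):

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, window=0.05, bin_width=0.005):
    """Histogram of spike-time lags (b - a), in seconds, within +/- window.

    A peak near zero lag suggests a functional connection between the two
    neurons; its height (normalized to a probability) is one common measure
    of connection strength.
    """
    edges = np.arange(-window, window + bin_width / 2, bin_width)
    counts = np.zeros(len(edges) - 1)
    spikes_b = np.asarray(spikes_b, dtype=float)
    for t in spikes_a:
        lags = spikes_b - t
        lags = lags[(lags >= -window) & (lags < window)]
        counts += np.histogram(lags, bins=edges)[0]
    return edges, counts

# Toy spike trains: neuron B tends to fire ~12 ms after neuron A, so the
# correlogram peaks in the bin covering +10 to +15 ms.
a = [0.100, 0.300, 0.500]
b = [t + 0.012 for t in a]
edges, counts = cross_correlogram(a, b)
```

Dividing `counts` by the number of reference spikes converts the peak into the peak probability used as a connection-strength measure in studies of this kind.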

16.
It is well established that the congenital lack of one sensory modality enhances functionality in the spared senses. However, whether a late onset deprivation of one sense leads to such alterations is largely unknown. Here, we investigated whether a somatosensory deprivation induced by bilateral whisker removal affects visual acuity and contrast sensitivity in fully adult mice. Using the visual cortex‐dependent visual water task, we found that a brief somatosensory deprivation markedly improved behavioral visual acuity and contrast sensitivity by about 40%. Determining these attributes of vision using periodic optical imaging of intrinsic signals in the same mice revealed that visual cortex responses elicited by weak visual stimuli were massively increased after somatosensory deprivation. Strikingly, comparison of visual acuity and contrast sensitivity values determined by the visual water task and intrinsic signal imaging revealed that these measurements were almost identical, even at the level of individual animals. In summary, our results suggest that a brief manipulation of somatosensory experience profoundly boosts visual cortex‐dependent vision in adults.

17.
“Sense of agency” (SoA), the feeling of control over events caused by one's own actions, is deceived by visuomotor incongruence. Sensorimotor networks are implicated in SoA; however, little evidence exists on brain function during agency processing. Concurrently, it has been suggested that the brain's intrinsic resting‐state (rs) activity has a preliminary influence on the processing of agency cues. Here, we investigated the relation between performance in an agency attribution task and functional interactions among brain regions as derived by network analysis of rs functional magnetic resonance imaging. The action–effect delay was adaptively increased (range 90–1,620 ms) and behavioral measures were correlated to indices of cognitive processes and appraised self‐concepts. They were then regressed on local metrics of rs brain functional connectivity to isolate the core areas enabling self‐agency. Across subjects, the time window for self‐agency was 90–625 ms, while action–effect integration was impacted by self‐evaluated personality traits. Neurally, the brain's intrinsic organization sustaining consistency in self‐agency attribution was characterized by high connectiveness in the secondary visual cortex and regional segregation in the primary somatosensory area. Decreased connectiveness in the secondary visual area, regional segregation in the superior parietal lobule, and information control within a primary visual cortex–frontal eye fields network sustained self‐agency over long‐delayed effects. We thus demonstrate that self‐agency is grounded in the intrinsic mode of brain function designed to organize information for visuomotor integration. Our observation is relevant for current models of psychopathology in clinical conditions in which both rs activity and sense of agency are altered.

18.
This study describes a possible mechanism of coding of multisensory information in the anterior ectosylvian visual area of the feline cortex. Extracellular microelectrode recordings on 168 cells were carried out in the anterior ectosylvian sulcal region of halothane-anaesthetized, immobilized, artificially ventilated cats. Ninety-five neurons were found to respond to visual stimuli, 96 responded to auditory stimuli and 45 were bimodal, reacting to both visual and auditory modalities. A large proportion of the neurons exhibited significantly different responses to stimuli appearing in different regions of their huge receptive field. These neurons have the ability to provide information via their discharge rate on the site of the stimulus within their receptive field. This suggests that they may serve as panoramic localizers. The ability of the bimodal neurons to localize bimodal stimulus sources is better than any of the unimodal localizing functions. Further, the sites of maximal responsivity of the visual, auditory and bimodal neurons are distributed over the whole extent of the large receptive fields. Thus, a large population of such panoramic visual, auditory and multisensory neurons could accurately code the locations of the sensory stimuli. Our findings support the notion that there is a distributed population code of multisensory information in the feline associative cortex.
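A simple way to illustrate such a distributed population code is a response-weighted read-out: the stimulus location is estimated as the average of the neurons' sites of maximal responsivity, weighted by their discharge rates. A toy sketch (the tuning widths, preferred azimuths and stimulus position are all hypothetical, not values from these recordings):

```python
import numpy as np

def population_decode(preferred_sites, responses):
    """Estimate stimulus location as the response-weighted average of the
    neurons' sites of maximal responsivity (a simple population read-out)."""
    responses = np.asarray(responses, dtype=float)
    return float(np.dot(responses, preferred_sites) / responses.sum())

# Hypothetical panoramic localizers whose preferred azimuths tile the field;
# each neuron's rate falls off with the distance between its preferred site
# and the stimulus (Gaussian-like profile, width 30 degrees).
preferred = np.array([-60.0, -30.0, 0.0, 30.0, 60.0])  # degrees azimuth
stimulus = 20.0
responses = np.exp(-0.5 * ((preferred - stimulus) / 30.0) ** 2)
estimate = population_decode(preferred, responses)
# The estimate lands near the true azimuth, slightly biased toward the
# centre because the preferred sites truncate at +/-60 degrees.
```

No single neuron in this sketch localizes the stimulus on its own; the location is recovered only from the joint pattern of rates, which is the sense in which the code is distributed across the population.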

19.
The orphan receptor GPR88 is emerging as a key player in the pathophysiology of several neuropsychiatric diseases, including psychotic disorders. Knockout (KO) mice lacking GPR88 throughout the brain exhibit many abnormalities relevant to schizophrenia, including locomotor hyperactivity, behavioural hypersensitivity to dopaminergic psychostimulants and deficient sensorimotor gating. Here, we used conditional knockout (cKO) mice lacking GPR88 selectively in striatal medium spiny neurons expressing the A2A receptor to determine the neuronal circuits underlying these phenotypes. We first studied the locomotor responses of A2AR‐Gpr88 KO mice and their control littermates to the psychotomimetic amphetamine and to the selective D1 and D2 receptor agonists SKF‐81297 and quinpirole, respectively. To assess sensorimotor gating performance, mice were submitted to acoustic and visual prepulse inhibition (PPI) paradigms. Total knockout GPR88 mice were also studied for comparison. Like total GPR88 KO mice, A2AR‐Gpr88 KO mice displayed a heightened sensitivity to the locomotor stimulant effects of amphetamine and SKF‐81297. They also exhibited enhanced locomotor activity in response to quinpirole, which tended to suppress locomotion in control mice. By contrast, they had normal acoustic and visual PPI, unlike total GPR88 KO mice, which show impairments across different sensory modalities. Finally, none of the genetic manipulations altered central auditory temporal processing assessed by gap‐PPI. Together, these findings support the role of GPR88 in the pathophysiology of schizophrenia and show that GPR88 in A2A receptor‐expressing neurons modulates psychomotor behaviour but not sensorimotor gating.

20.
The complex neuroanatomical connections of the inferior colliculus (IC) and its major subdivisions offer a juxtaposition of segregated processing streams with distinct organizational features. While the tonotopically layered central nucleus is well‐documented, less is known about functional compartments in the neighboring lateral cortex (LCIC). In addition to a laminar framework, LCIC afferent‐efferent patterns suggest a multimodal mosaic, consisting of a patchy modular network with surrounding extramodular domains. This study utilizes several neurochemical markers that reveal an emerging LCIC modular‐extramodular microarchitecture. In newborn and post‐hearing C57BL/6J and CBA/CaJ mice, histochemical and immunocytochemical stains were performed for acetylcholinesterase (AChE), nicotinamide adenine dinucleotide phosphate‐diaphorase (NADPH‐d), glutamic acid decarboxylase (GAD), cytochrome oxidase (CO), and calretinin (CR). Discontinuous layer 2 modules are positive for AChE, NADPH‐d, GAD, and CO throughout the rostrocaudal LCIC. While not readily apparent at birth, discrete cell clusters emerge over the first postnatal week, yielding an identifiable modular network prior to hearing onset. Modular boundaries continue to become increasingly distinct with age, as surrounding extramodular fields remain largely negative for each marker. Alignment of modular markers in serial sections suggests that each highlights the same periodic patchy network throughout the nascent LCIC. In contrast, CR patterns appear complementary, preferentially staining extramodular LCIC zones. Double‐labeling experiments confirm that NADPH‐d, the most consistent developmental modular marker, and CR label separate, nonoverlapping LCIC compartments. How this emerging modularity may align with similar patch‐matrix‐like Eph/ephrin guidance patterns in the LCIC, and how each interfaces with, and potentially influences, developing multimodal LCIC projection configurations, is discussed.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司) · 京ICP备09084417号