Similar documents
20 similar documents retrieved (search time: 703 ms)
1.
Previous research has provided inconsistent results regarding the spatial modulation of auditory-somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or simultaneous auditory-somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus opposite sides, and from one of two distances (20 vs. 70 cm) from the participant's head. The results demonstrated a spatial modulation of auditory-somatosensory interactions when auditory stimuli were presented from close to the head. In Experiment 2, electrocutaneous stimuli were delivered to the hands, which were placed either close to or far from the head, while the auditory stimuli were again presented at one of two distances. The results revealed that the spatial modulation observed in Experiment 1 was specific to the particular body part stimulated (head) rather than to the region of space (i.e. around the head) where the stimuli were presented. The results of Experiment 3 demonstrate that sounds that contain high-frequency components are particularly effective in eliciting this auditory-somatosensory spatial effect. Taken together, these findings help to resolve inconsistencies in the previous literature and suggest that auditory-somatosensory multisensory integration is modulated by the stimulated body surface and acoustic spectra of the stimuli presented.

2.
Simple reaction times (RTs) to auditory-somatosensory (AS) multisensory stimuli are facilitated over their unisensory counterparts both when stimuli are delivered to the same location and when separated. In two experiments we addressed the possibility that top-down and/or task-related influences can dynamically impact the spatial representations mediating these effects and the extent to which multisensory facilitation will be observed. Participants performed a simple detection task in response to auditory, somatosensory, or simultaneous AS stimuli that in turn were either spatially aligned or misaligned by lateralizing the stimuli. Additionally, we informed the participants that they would be retrogradely queried (one-third of trials) regarding the side where a given stimulus in a given sensory modality was presented. In this way, we sought to have participants attending to all possible spatial locations and sensory modalities, while nonetheless having them perform a simple detection task. Experiment 1 provided no cues prior to stimulus delivery. Experiment 2 included spatially uninformative cues (50% of trials). In both experiments, multisensory conditions significantly facilitated detection RTs with no evidence for differences according to spatial alignment (though general benefits of cuing were observed in Experiment 2). Facilitated detection occurs even when attending to spatial information. Performance with probes, quantified using sensitivity (d′), was impaired following multisensory trials in general and significantly more so following misaligned multisensory trials. This indicates that spatial information is not available, despite being task-relevant. The collective results support a model wherein early AS interactions may result in a loss of spatial acuity for unisensory information.
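
The probe analysis above rests on standard signal-detection theory. Below is a minimal sketch (not the authors' code) of how sensitivity might be computed from probe responses; the trial counts and the log-linear correction are illustrative assumptions.

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    # Log-linear correction keeps the z-transform finite at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical probe counts after aligned vs. misaligned multisensory trials:
print(d_prime(38, 12, 9, 41))   # aligned
print(d_prime(30, 20, 15, 35))  # misaligned: lower d' = poorer spatial report
```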

3.
We investigated the time-course and scalp topography of multisensory interactions between simultaneous auditory and somatosensory stimulation in humans. Event-related potentials (ERPs) were recorded from 64 scalp electrodes while subjects were presented with auditory-alone stimulation (1000-Hz tones), somatosensory-alone stimulation (median nerve electrical pulses), and simultaneous auditory-somatosensory (AS) combined stimulation. Interaction effects were assessed by comparing the responses to combined stimulation with the algebraic sum of responses to the constituent auditory and somatosensory stimuli when they were presented alone. Spatiotemporal analysis of ERPs and scalp current density (SCD) topographies revealed AS interaction over the central/postcentral scalp which onset at approximately 50 ms post-stimulus presentation. Both the topography and timing of these interactions are consistent with multisensory integration early in the cortical processing hierarchy, in brain regions traditionally held to be unisensory.
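
The additive-model logic used here has a direct computational form: the interaction term is the paired response minus the algebraic sum of the unisensory responses. A minimal sketch with synthetic arrays standing in for real ERPs; the array shapes, sampling rate, and onset criterion are all assumptions for illustration.

```python
import numpy as np

# Synthetic condition-averaged ERPs, shape (electrodes, timepoints); a 1 kHz
# sampling rate is assumed, with samples 0-99 forming a pre-stimulus baseline.
rng = np.random.default_rng(0)
erp_a  = rng.normal(size=(64, 600))   # auditory-alone
erp_s  = rng.normal(size=(64, 600))   # somatosensory-alone
erp_as = rng.normal(size=(64, 600))   # simultaneous auditory-somatosensory

# Additive model: any reliable deviation of the paired response from the
# algebraic sum of the unisensory responses indexes a nonlinear AS interaction.
interaction = erp_as - (erp_a + erp_s)

# Global field power (SD across electrodes) collapses the interaction to one
# timecourse; a crude onset estimate is the first post-stimulus sample that
# exceeds a baseline-derived criterion (real analyses use stricter statistics).
gfp = interaction.std(axis=0)
criterion = gfp[:100].mean() + 2 * gfp[:100].std()
onset_ms = int(np.argmax(gfp[100:] > criterion))
print(f"Interaction onset (synthetic data): {onset_ms} ms post-stimulus")
```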

4.
Human–environment interactions are mediated through the body and occur within the peripersonal space (PPS), the space immediately adjacent to and surrounding the body. The PPS is taken to be a critical interface between the body and the environment, and indeed, body-part specific PPS remapping has been shown to depend on body-part utilization, such as upper limb movements in otherwise static observers. How vestibular signals induced by whole-body movement contribute to PPS representation is less well understood. In a series of experiments, we mapped the spatial extension of the PPS around the head while participants were submitted to passive whole-body rotations inducing vestibular stimulation. Forty-six participants, in three experiments, executed a tactile detection reaction time task while task-irrelevant auditory stimuli approached them. The maximal distance at which the auditory stimulus facilitated tactile reaction time was taken as a proxy for the boundary of peri-head space. The present results indicate two distinct vestibular effects. First, vestibular stimulation speeded tactile detection, indicating a vestibular facilitation of somatosensory processing. Second, vestibular stimulation modulated audio-tactile interaction of peri-head space in a rotation direction-specific manner. Congruent but not incongruent audio-vestibular motion stimuli expanded the PPS boundary further away from the body as compared to no rotation. These results show that vestibular inputs dynamically update the multisensory delineation of PPS and far space, which may serve to maintain accurate tracking of objects close to the body and to update spatial self-representations.
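
Boundary estimates of this kind are commonly obtained by fitting a sigmoid to tactile RTs as a function of sound distance and reading off its inflection point. A minimal sketch under that assumption, not the authors' pipeline; the distances, RTs, and starting values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical mean tactile RTs (ms) at several sound-to-head distances (cm).
distance = np.array([10., 25., 40., 55., 70., 85., 100.])
rt       = np.array([392., 395., 401., 418., 430., 432., 433.])

def sigmoid(d, near, far, x0, k):
    """RT rises from 'near' to 'far' with distance; the inflection x0 is
    read as the peripersonal-space boundary in this kind of analysis."""
    return near + (far - near) / (1.0 + np.exp(-(d - x0) / k))

params, _ = curve_fit(sigmoid, distance, rt, p0=[390.0, 435.0, 50.0, 10.0])
print(f"Estimated peri-head boundary: {params[2]:.0f} cm")
```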

5.
The spatial rule of multisensory integration holds that cross-modal stimuli presented from the same spatial location result in enhanced multisensory integration. The present study investigated whether processing within the somatosensory cortex reflects the strength of cross-modal visuotactile interactions depending on the spatial relationship between visual and tactile stimuli. Visual stimuli were task-irrelevant and were presented simultaneously with touch in peripersonal and extrapersonal space, in the same or opposite hemispace with respect to the tactile stimuli. Participants directed their attention to one of their hands to detect infrequent tactile target stimuli at that hand while ignoring tactile targets at the unattended hand, all tactile nontarget stimuli, and any visual stimuli. Enhancement of ERPs recorded over and close to the somatosensory cortex was present as early as 100 msec after onset of stimuli (i.e., overlapping with the P100 component) when visual stimuli were presented next to the site of tactile stimulation (i.e., perihand space) compared to when these were presented at different locations in peripersonal or extrapersonal space. Therefore, this study provides electrophysiological support for the spatial rule of visual-tactile interaction in human participants. Importantly, these early cross-modal spatial effects occurred regardless of the locus of attention. In addition, and in line with previous research, we found attentional modulations of somatosensory processing only to be present in the time range of the N140 component and at longer latencies, with an enhanced negativity for tactile stimuli at attended compared to unattended locations. Taken together, the pattern of results from this study suggests that visuotactile spatial effects on somatosensory processing occur prior to, and independently of, tactile-spatial attention.

6.
This review discusses how visual and tactile signals are combined in the brain to ensure appropriate interactions with the space around the body. Visual and tactile signals converge in many regions of the brain (e.g. parietal and premotor cortices) where multisensory input can interact on the basis of specific spatial constraints. Crossmodal interactions can also modulate unisensory visual and somatosensory cortices, possibly via feedback projections from fronto-parietal areas. These processes enable attentional selection of relevant locations in near body space, as demonstrated by studies of spatial attention in healthy volunteers and in neuropsychological patients with crossmodal extinction. These crossmodal spatial effects can be flexibly updated to take into account the position of the eyes and the limbs, thus reflecting the spatial alignment of visuo-tactile stimuli in external space. Further, studies that manipulated vision of body parts (alien, real or fake limbs) have demonstrated that passive viewing of the body can influence the perception of somatosensory stimuli, again involving areas in the premotor and parietal cortices. Finally, we discuss how tool-use can expand the region of visuo-tactile integration in near body space, emphasizing the flexibility of this system at the single-neuron level in the monkey's parietal cortex, with corresponding multisensory effects in healthy individuals and neuropsychological patients. We conclude that visuo-tactile crossmodal links dominate the representation of near body space and that this is implemented functionally in parietal and premotor brain regions. These integration processes mediate the orienting of spatial attention and generate an efficient and flexible representation of the space around the body.

7.
Sensorimotor co-ordination in mammals is achieved predominantly via the activity of the basal ganglia. To investigate the underlying multisensory information processing, we recorded the neuronal responses in the caudate nucleus (CN) and substantia nigra (SN) of anaesthetized cats to visual, auditory or somatosensory stimulation alone and also to their combinations, i.e. multisensory stimuli. The main goal of the study was to ascertain whether multisensory information provides more information to the neurons than do the individual sensory components. A majority of the investigated SN and CN multisensory units exhibited significant cross-modal interactions. The multisensory response enhancements were either additive or superadditive; multisensory response depressions were also detected. CN and SN cells with facilitatory and inhibitory interactions were found in each multisensory combination. The strengths of the multisensory interactions did not differ in the two structures. A significant inverse correlation was found between the strengths of the best unimodal responses and the magnitudes of the multisensory response enhancements, i.e. the neurons with the weakest net unimodal responses exhibited the strongest enhancement effects. The onset latencies of the responses of the integrative CN and SN neurons to the multisensory stimuli were significantly shorter than those to the unimodal stimuli. These results provide evidence that the multisensory CN and SN neurons, similarly to those in the superior colliculus and related structures, have the ability to integrate multisensory information. Multisensory integration may help in the effective processing of sensory events and the changes in the environment during motor actions controlled by the basal ganglia.
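
"Additive" and "superadditive" here compare the paired response against the unisensory responses; enhancement is conventionally expressed against the best unisensory response (the Meredith-and-Stein-style index). A minimal sketch of those criteria, not the study's code; the firing rates are illustrative assumptions.

```python
def enhancement_index(multi, best_uni):
    """Percent multisensory enhancement: 100 * (CM - SMmax) / SMmax, where CM
    is the combined-stimulus response and SMmax the best unisensory response."""
    return 100.0 * (multi - best_uni) / best_uni

def classify_interaction(multi, uni_1, uni_2):
    """Label the paired response relative to the unisensory responses."""
    if multi > uni_1 + uni_2:
        return "superadditive"
    if multi < max(uni_1, uni_2):
        return "depression"
    return "additive/subadditive"

# Hypothetical spike counts per presentation:
print(enhancement_index(multi=18.0, best_uni=10.0))  # 80.0 (% enhancement)
print(classify_interaction(18.0, 10.0, 6.0))         # superadditive (18 > 16)
```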

8.
Multisensory integration is essential for the expression of complex behaviors in humans and animals. However, few studies have investigated the neural sites where multisensory integration may occur. Therefore, we used electrophysiology and retrograde labeling to study a region of the rat parietotemporal cortex that responds uniquely to auditory and somatosensory multisensory stimulation. This multisensory responsiveness suggests a functional organization resembling multisensory association cortex in cats and primates. Extracellular multielectrode surface mapping defined a region between auditory and somatosensory cortex where responses to combined auditory/somatosensory stimulation were larger in amplitude and earlier in latency than responses to either stimulus alone. Moreover, multisensory responses were nonlinear and differed from the summed unimodal responses. Intracellular recording found almost exclusively multisensory cells that responded to both unisensory and multisensory stimulation with excitatory postsynaptic potentials (EPSPs) and/or action potentials, conclusively defining a multisensory zone (MZ). In addition, intracellular responses were similar to extracellular recordings, with larger and earlier EPSPs evoked by multisensory stimulation, and interactions suggesting nonlinear postsynaptic summation to combined stimuli. Thalamic input to MZ from unimodal auditory and somatosensory thalamic relay nuclei and from multisensory thalamic regions supports the idea that parallel thalamocortical projections may drive multisensory functions as strongly as corticocortical projections. Whereas the MZ integrates uni- and multisensory thalamocortical afferent streams, it may ultimately influence brainstem multisensory structures such as the superior colliculus.

9.
The extracellular response properties of neurons in the caudate-putamen (CPu), globus pallidus (GP) and lateral amygdaloid nucleus (La) evoked by auditory and somatosensory stimuli were investigated. A total of 61 neurons in these areas responded either solely to somatosensory stimulation (unisensory) or to both somatosensory and auditory stimulation (multisensory). Higher rates of somatosensory stimulation reduced the response magnitude of CPu neurons more than that of GP neurons. In multisensory neurons, combined somatosensory and auditory stimulation, compared to unisensory stimulation, resulted in three characteristic response patterns: enhancement, depression or interaction. Temporal misalignment of the peak frequency latencies evoked by auditory and somatosensory stimulation altered the response magnitude in the majority of neurons. The response properties and anatomical connectivity of CPu and GP neurons suggest that the observed multisensory integrative effects may be used to facilitate motor responses to low-intensity stimuli.

10.
Enhanced detection and discrimination, along with faster reaction times, are the most typical behavioural manifestations of the brain's capacity to integrate multisensory signals arising from the same object. In this study, we examined whether multisensory behavioural gains are observable across different components of the localization response that are potentially under the command of distinct brain regions. We measured the ability of ferrets to localize unisensory (auditory or visual) and spatiotemporally coincident auditory–visual stimuli of different durations that were presented from one of seven locations spanning the frontal hemifield. During the localization task, we recorded the head movements made following stimulus presentation, as a metric for assessing the initial orienting response of the ferrets, as well as the subsequent choice of which target location to approach to receive a reward. Head-orienting responses to auditory–visual stimuli were more accurate and faster than those made to visual but not auditory targets, suggesting that these movements were guided principally by sound alone. In contrast, approach-to-target localization responses were more accurate and faster to spatially congruent auditory–visual stimuli throughout the frontal hemifield than to either visual or auditory stimuli alone. Race model inequality analysis of head-orienting reaction times and approach-to-target response times indicates that different processes, probability summation and neural integration, respectively, are likely to be responsible for the effects of multisensory stimulation on these two measures of localization behaviour.
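
Race model inequality analysis tests Miller's bound F_AV(t) <= F_A(t) + F_V(t): if multisensory RTs beat this bound, probability summation of two independent unisensory races cannot explain the gain, implying neural integration. A minimal sketch with simulated RTs, not the study's analysis code; the quantile grid and distribution parameters are illustrative.

```python
import numpy as np

def ecdf(rts, t):
    """Empirical cumulative distribution of reaction times evaluated at t."""
    rts = np.sort(np.asarray(rts, dtype=float))
    return np.searchsorted(rts, t, side="right") / rts.size

def race_model_violation(rt_av, rt_a, rt_v, quantiles=np.arange(5, 100, 5)):
    """F_AV(t) - min(F_A(t) + F_V(t), 1) on a quantile grid; positive values
    mark violations of the race model (evidence for neural integration)."""
    t = np.percentile(np.concatenate([rt_av, rt_a, rt_v]), quantiles)
    bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)
    return ecdf(rt_av, t) - bound

# Simulated RTs (ms): multisensory responses fast enough to violate the bound.
rng = np.random.default_rng(1)
rt_a  = rng.normal(310, 40, 200)
rt_v  = rng.normal(330, 45, 200)
rt_av = rng.normal(270, 35, 200)
print(race_model_violation(rt_av, rt_a, rt_v).max())  # > 0 at early quantiles
```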

11.
Multisensory interactions are a fundamental feature of brain organization. Principles governing multisensory processing have been established by varying stimulus location, timing and efficacy independently. Determining whether and how such principles operate when stimuli vary dynamically in their perceived distance (as when looming/receding) provides an assay for synergy among the above principles and also a means of linking multisensory interactions between rudimentary stimuli with higher-order signals used for communication and motor planning. Human participants indicated movement of looming or receding versus static stimuli that were visual, auditory, or multisensory combinations while 160-channel EEG was recorded. Multivariate EEG analyses and distributed source estimations were performed. Nonlinear interactions between looming signals were observed at early poststimulus latencies (~75 ms) in analyses of voltage waveforms, global field power, and source estimations. These looming-specific interactions positively correlated with reaction time facilitation, providing direct links between neural and performance metrics of multisensory integration. Statistical analyses of source estimations identified looming-specific interactions within the right claustrum/insula extending inferiorly into the amygdala and also within the bilateral cuneus extending into the inferior and lateral occipital cortices. Multisensory effects common to all conditions, regardless of perceived distance and congruity, followed (~115 ms) and manifested as faster transition between temporally stable brain networks (vs summed responses to unisensory conditions). We demonstrate the early-latency, synergistic interplay between existing principles of multisensory interactions. Such findings change how multisensory interactions should be modeled at neural and behavioral/perceptual levels. We also provide neurophysiologic backing for the notion that looming signals receive preferential treatment during perception.

12.
The synchronous occurrence of the unisensory components of a multisensory stimulus contributes to their successful merging into a coherent perceptual representation. Oscillatory gamma-band responses (GBRs, 30-80 Hz) have been linked to feature integration mechanisms and to multisensory processing, suggesting they may also be sensitive to the temporal alignment of multisensory stimulus components. Here we examined the effects on early oscillatory GBR brain activity of varying the precision of the temporal synchrony of the unisensory components of an audio-visual stimulus. Audio-visual stimuli were presented with stimulus onset asynchronies ranging from -125 to +125 ms. Randomized streams of auditory (A), visual (V), and audio-visual (AV) stimuli were presented centrally while subjects attended to either the auditory or visual modality to detect occasional targets. GBRs to auditory and visual components of multisensory AV stimuli were extracted for five subranges of asynchrony (e.g., A preceded by V by 100±25 ms, by 50±25 ms, etc.) and compared with GBRs to unisensory control stimuli. Robust multisensory interactions were observed in the early GBRs when the auditory and visual stimuli were presented with the closest synchrony. These effects were found over medial-frontal brain areas after 30-80 ms and over occipital brain areas after 60-120 ms. A second integration effect, possibly reflecting the perceptual separation of the two sensory inputs, was found over occipital areas when auditory inputs preceded visual by 100±25 ms. No significant interactions were observed for the other subranges of asynchrony. These results show that the precision of temporal synchrony can have an impact on early cross-modal interactions in human cortex.

13.
We live in a multisensory world and one of the challenges the brain is faced with is deciding what information belongs together. Our ability to make assumptions about the relatedness of multisensory stimuli is partly based on their temporal and spatial relationships. Stimuli that are proximal in time and space are likely to be bound together by the brain and ascribed to a common external event. Using this framework we can describe multisensory processes in the context of spatial and temporal filters or windows that compute the probability of the relatedness of stimuli. Whereas numerous studies have examined the characteristics of these multisensory filters in adults and discrepancies in window size have been reported between infants and adults, virtually nothing is known about multisensory temporal processing in childhood. To examine this, we compared the ability of 10- and 11-year-olds and adults to detect audiovisual temporal asynchrony. Findings revealed striking and asymmetric age-related differences. Whereas children were able to identify asynchrony as readily as adults when visual stimuli preceded auditory cues, significant group differences were identified at moderately long stimulus onset asynchronies (150-350 ms) where the auditory stimulus was first. Results suggest that changes in audiovisual temporal perception extend beyond the first decade of life. In addition to furthering our understanding of basic multisensory developmental processes, these findings have implications for disorders (e.g., autism, dyslexia) in which emerging evidence suggests alterations in multisensory temporal function.

14.
Clinical and neuroimaging observations of the cortical network implicated in tactile attention have identified foci in parietal somatosensory, posterior parietal, and superior frontal locations. Tasks involving intentional hand-arm movements activate similar or nearby parietal and frontal foci. Visual spatial attention tasks and deliberate visuomotor behavior also activate overlapping posterior parietal and frontal foci. Studies in the visual and somatosensory systems thus support a proposal that attention to the spatial location of an object engages cortical regions responsible for the same coordinate referents used for guiding purposeful motor behavior. Tactile attention also biases processing in the somatosensory cortex through amplification of responses to relevant features of selected stimuli. Psychophysical studies demonstrate retention gradients for tactile stimuli like those reported for visual and auditory stimuli, and suggest analogous neural mechanisms for working memory across modalities. Neuroimaging studies in humans using memory tasks, and anatomic studies in monkeys, support the idea that tactile information relayed from the somatosensory cortex is directed ventrally through the insula to the frontal cortex for short-term retention and to structures of the medial temporal lobe for long-term encoding. At the level of single neurons, tactile short-term memory (like visual and auditory short-term memory) appears as a persistent response during delay intervals between sampled stimuli.

15.
This study analyzed high-density event-related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task-irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory-visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross-modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non-linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top-down attentional control that further modulates cross-modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context-based control over multisensory processing, whose influences multiplex across finer and broader time scales. Hum Brain Mapp 37:273–288, 2016. © 2015 Wiley Periodicals, Inc.

16.
Because the posterior limb of the rostral suprasylvian sulcus (RSp) of the cat resides in close proximity to representations of the somatosensory, auditory, and visual modalities, the surrounding cortices would be expected to be a region where a high degree of multisensory convergence and integration is found. The present experiments tested this notion by using anatomical and electrophysiological methods. Tracer injections into somatosensory, auditory, and visual cortical areas almost all produced terminal labeling within the RSp, albeit at different locations and in different proportions. Inputs from somatosensory cortices primarily targeted the inner portion of the anterior RSp; inputs from auditory cortices generally filled the outer portion of the middle and posterior RSp; inputs from visual cortices terminated in the inner portion of the posterior RSp. These projections did not have sharp borders but often overlapped one another, thereby providing a substrate for multisensory convergence. Electrophysiological recordings confirmed this anatomical organization as well as identifying the presence of multisensory (bimodal) neurons in the areas of overlap between representations. Curiously, however, the proportion of bimodal neurons was only 24% of the neurons sampled in this region, and the majority of these did not show multisensory interactions when combined-modality stimuli were presented. In summary, these experiments indicate that the RSp is primarily auditory in nature, but this representation could be further subdivided into an outer sulcal anterior auditory field (sAAF) and an inner field of the rostral suprasylvian sulcus (FRS).

17.
Multisensory peripersonal space develops in a maturational process that is thought to be influenced by early sensory experience. We investigated the role of vision in the effective development of audiotactile interactions in peripersonal space. Early blind (EB), late blind (LB) and sighted control (SC) participants were asked to lateralize auditory, tactile and audiotactile stimuli. The experiment was conducted with the hands uncrossed or crossed over the body midline in order to alter the relationship between personal and peripersonal spatial representations. First, we observed that the crossed posture results in a greater detrimental effect on tactile performance in sighted subjects but a greater deficit in auditory performance in early blind ones. This result is interpreted as evidence for a visually driven developmental process that automatically remaps tactile and proprioceptive spatial representations into an external framework. Second, we demonstrate that the improvement in reaction times observed in the bimodal conditions in SC and LB exceeds that predicted by probability summation in both posture conditions, indicating neural integration of different sensory information. In EB, nonlinear summation was obtained in the uncrossed but not in the crossed posture. We argue that the default use of an anatomically anchored reference system in EB prevents effective audiotactile interactions in the crossed posture due to the poorly aligned spatial coordinates of these two modalities in such conditions. Altogether, these results provide compelling evidence for the critical role of early vision in the development of multisensory perception and action control in peripersonal space.

18.
The physical arrangement of receptive fields (RFs) within neural structures is important for local computations. Nonuniform distribution of tuning within populations of neurons can influence emergent tuning properties, causing bias in local processing. This issue was studied in the auditory system of barn owls. The owl's external nucleus of the inferior colliculus (ICx) contains a map of auditory space in which the frontal region is overrepresented. We measured spatiotemporal RFs of ICx neurons using spatial white noise. We found a population-wide bias in surround suppression such that suppression from frontal space was stronger. This asymmetry increased with laterality in spatial tuning. The bias could be explained by a model of lateral inhibition based on the overrepresentation of frontal space observed in ICx. The model predicted trends in surround suppression across ICx that matched the data. Thus, the uneven distribution of spatial tuning within the map could explain the topography of time-dependent tuning properties. This mechanism may have significant implications for the analysis of natural scenes by sensory systems.
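
The lateral-inhibition account can be illustrated with a toy map: if neurons are denser at frontal locations, the suppression any site receives is dominated by its frontally situated neighbours, and that imbalance grows with laterality. A minimal sketch, not the authors' model; the density profile, kernel width, and all parameters are invented for illustration.

```python
import numpy as np

# Toy 1-D map of auditory space (degrees azimuth, 0 = frontal), with neuron
# density falling off laterally to mimic the overrepresentation of frontal space.
azimuth = np.linspace(-90, 90, 181)
density = np.exp(-np.abs(azimuth) / 40.0)

# Lateral inhibition: each site is suppressed by its neighbours, weighted by a
# Gaussian distance kernel and by the number of neurons at the neighbour site.
sigma = 20.0
dist = np.abs(azimuth[:, None] - azimuth[None, :])
kernel = np.exp(-dist**2 / (2 * sigma**2))
np.fill_diagonal(kernel, 0.0)

# Split the suppression each site receives into the part arriving from more
# frontal neighbours versus more peripheral ones.
more_frontal = np.abs(azimuth)[None, :] < np.abs(azimuth)[:, None]
frontal_part = np.where(more_frontal, kernel, 0.0) @ density
peripheral_part = np.where(~more_frontal, kernel, 0.0) @ density

# The fraction of suppression arriving from more frontal locations grows with
# laterality, matching the direction of the reported population bias.
frac = frontal_part / (frontal_part + peripheral_part)
print(frac[90], frac[135], frac[180])  # sites tuned to 0, 45, and 90 degrees
```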

19.
Coordinated attention to information from multiple senses is fundamental to our ability to respond to salient environmental events, yet little is known about brain network mechanisms that guide integration of information from multiple senses. Here we investigate dynamic causal mechanisms underlying multisensory auditory–visual attention, focusing on a network of right-hemisphere frontal–cingulate–parietal regions implicated in a wide range of tasks involving attention and cognitive control. Participants performed three 'oddball' attention tasks involving auditory, visual and multisensory auditory–visual stimuli during fMRI scanning. We found that the right anterior insula (rAI) demonstrated the most significant causal influences on all other frontal–cingulate–parietal regions, serving as a major causal control hub during multisensory attention. Crucially, we then tested two competing models of the role of the rAI in multisensory attention: an 'integrated' signaling model in which the rAI generates a common multisensory control signal associated with simultaneous attention to auditory and visual oddball stimuli versus a 'segregated' signaling model in which the rAI generates two segregated and independent signals in each sensory modality. We found strong support for the integrated, rather than the segregated, signaling model. Furthermore, the strength of the integrated control signal from the rAI was most pronounced on the dorsal anterior cingulate and posterior parietal cortices, two key nodes of saliency and central executive networks respectively. These results were preserved with the addition of a superior temporal sulcus region involved in multisensory processing. Our study provides new insights into the dynamic causal mechanisms by which the AI facilitates multisensory attention.

20.
Previous studies on crossmodal spatial orienting typically used simple and stereotyped stimuli in the absence of any meaningful context. This study combined computational models, behavioural measures and functional magnetic resonance imaging to investigate audiovisual spatial interactions in naturalistic settings. We created short videos portraying everyday life situations that included a lateralised visual event and a co-occurring sound, either on the same or on the opposite side of space. Subjects viewed the videos with or without eye-movements allowed (overt or covert orienting). For each video, visual and auditory saliency maps were used to index the strength of stimulus-driven signals, and eye-movements were used as a measure of the efficacy of the audiovisual events for spatial orienting. Results showed that visual salience modulated activity in higher-order visual areas, whereas auditory salience modulated activity in the superior temporal cortex. Auditory salience also modulated activity in the posterior parietal cortex, but only when audiovisual stimuli occurred on the same side of space (multisensory spatial congruence). Orienting efficacy affected activity in the visual cortex, within the same regions modulated by visual salience. These patterns of activation were comparable in overt and covert orienting conditions. Our results demonstrate that, during viewing of complex multisensory stimuli, activity in sensory areas reflects both stimulus-driven signals and their efficacy for spatial orienting; and that the posterior parietal cortex combines spatial information about the visual and the auditory modality. Hum Brain Mapp 35:1597–1614, 2014. © 2013 Wiley Periodicals, Inc.
