Similar Documents (20 results)
1.
Self-disturbances such as an anomalous perception of one’s own body boundary are central to the phenomenology of schizophrenia (SZ), but measuring the spatial parameters of the hypothesized self–other boundary has proved challenging. Peripersonal space (PPS) refers to the immediate zone surrounding the body where the self interacts physically with the environment; that is, the space corresponding to the hypothesized self–other boundary. PPS is characterized by enhanced multisensory integration and faster reaction times (RTs) for objects near the body. Thus, multisensory RT tasks can be used to estimate the self–other boundary. We aimed to quantify PPS in SZ using an immersive virtual-reality visuotactile RT paradigm. Twenty-four participants with SZ and 24 demographically matched controls (CO) were asked to detect a tactile vibration while watching a ball approach them, thrown either by a machine (nonsocial condition) or by an avatar (social condition). PPS parameters were estimated from the midpoint of the spatial range where tactile RT decreased most rapidly (size) and from the gradient of the RT change at this midpoint (slope). Overall, PPS was smaller in participants with SZ than in CO. The PPS slope was shallower in SZ than in CO in the social but not the nonsocial condition, indicating increased uncertainty of the self–other boundary across an extended zone in SZ. The social condition also increased false alarms for tactile detection in SZ. Clinical symptoms were not clearly associated with PPS parameters. These findings suggest a context-dependent weakening of the body boundary in SZ and underscore the importance of reconciling objective and subjective aspects of self-disturbances.
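The PPS estimation described above — a sigmoidal fit to tactile RTs as a function of stimulus distance, with the fitted midpoint taken as PPS size and the gradient at that midpoint as slope — can be sketched as follows. The data points and starting parameters below are hypothetical illustrations; the study's exact fitting procedure may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, rt_near, rt_far, midpoint, slope):
    """Tactile RT as a sigmoidal function of the approaching stimulus distance d."""
    return rt_near + (rt_far - rt_near) / (1.0 + np.exp(-slope * (d - midpoint)))

# Hypothetical mean tactile RTs (ms) at six ball distances (arbitrary units)
distances = np.array([1, 2, 3, 4, 5, 6], dtype=float)
rts = np.array([310, 315, 340, 380, 398, 400], dtype=float)

params, _ = curve_fit(sigmoid, distances, rts, p0=[300, 400, 3.5, 2.0])
rt_near, rt_far, midpoint, slope = params
print(f"PPS size (midpoint): {midpoint:.2f}; slope at midpoint: {slope:.2f}")
```

A shallower fitted slope spreads the RT transition over a wider range of distances, which is how the abstract operationalizes an "uncertain" self–other boundary.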

2.
Peripheral vestibular organs feed the central nervous system with inputs favoring the correct perception of space during head and body motion. Applying temporal order judgments (TOJs) to pairs of simultaneous or asynchronous stimuli presented in the left and right egocentric space, we evaluated the influence of leftward and rightward vestibular rotatory accelerations given around the vertical head-body axis on covert attentional orienting. In a first experiment, we presented visual stimuli in the left and right hemifield. In a second experiment, tactile stimuli were presented to hands lying on their anatomical side or in a crossed position across the sagittal body midline. In both experiments, stimuli were presented while normal subjects suppressed or did not suppress the vestibulo-ocular response (VOR) evoked by head-body rotation. Independently of VOR suppression, visual and tactile stimuli presented on the side of rotation were judged to precede simultaneous stimuli presented on the side opposite the rotation. When limbs were crossed, attentional facilitatory effects were only observed for stimuli presented to the right hand lying in the left hemispace during leftward rotatory trials with VOR suppression. This result points to spatiotopic rather than somatotopic influences of vestibular inputs, suggesting that cross-modal effects of these inputs on tactile ones operate on a representation of space that is updated following arm crossing. In a third control experiment, we demonstrated that temporal prioritization of stimuli presented on the side of rotation was not determined by response bias linked to spatial compatibility between the directions of rotation and the directional labels used in TOJs (i.e., "left" or "right" first). These findings suggest that during passive rotatory head-body accelerations, covert attention is shifted toward the direction of rotation and the direction of the fast phases of the VOR.  相似文献   

3.
The localization of touch in external space requires the remapping of somatotopically represented tactile information into an external frame of reference. Several recent studies have highlighted the role of posterior parietal areas for this remapping process, yet its temporal dynamics are poorly understood. The present study combined cross‐modal stimulation with electrophysiological recordings in humans to trace the time course of tactile spatial remapping during visual–tactile interactions. Adopting an uncrossed or crossed hand posture, participants made speeded elevation judgments about rare vibrotactile stimuli within a stream of frequent, task‐irrelevant vibrotactile events presented to the left or right hand. Simultaneous but spatially independent visual stimuli had to be ignored. An analysis of the recorded event‐related potentials to the task‐irrelevant vibrotactile stimuli revealed a somatotopic coding of tactile stimuli within the first 100 ms. Between 180 and 250 ms, neither an external nor a somatotopic representation dominated, suggesting that both coordinates were active in parallel. After 250 ms, tactile stimuli were coded in a somatotopic frame of reference. Our results indicate that cross‐modal interactions start before the termination of tactile spatial remapping, that is within the first 100 ms. Thereafter, tactile stimuli are represented simultaneously in both somatotopic and external spatial coordinates, which are dynamically (re‐)weighted as a function of processing stage.  相似文献   

4.
When a sound is presented in the free field at a location that remains fixed to the head during whole‐body rotation in darkness, it is heard displaced in the direction opposing the rotation. This phenomenon is known as the audiogyral illusion. Consequently, the subjective auditory median plane (AMP) (the plane where the binaural difference cues for sound localization are perceived to be zero) shifts in the direction of body rotation. Recent experiments, however, have suggested opposite AMP results when using a fixation light that also moves with the head. Although in this condition the eyes remain stationary in the head, an ocular pursuit signal cancels the vestibulo‐ocular reflex, which could induce an additional AMP shift. We tested whether the AMP is influenced by vestibular signals, eye position or eye velocity. We rotated subjects sinusoidally at different velocities, either in darkness or with a head‐fixed fixation light, while they judged the laterality (left vs. right with respect to the midsagittal plane of the head) of broadband sounds presented over headphones. Subjects also performed the same task without vestibular stimulation while tracking a sinusoidally moving visual target, which mimicked the average eye‐movement patterns of the vestibular experiments in darkness. Results show that whole‐body rotation in darkness induces a shift of the AMP in the direction of body rotation. In contrast, we obtained no significant AMP change when a fixation light was used. The pursuit experiments showed a shift of the AMP in the direction of eccentric eye position but not at peak pursuit velocity. We therefore conclude that the vestibular‐induced shift in average eye position underlies both the audiogyral illusion and the AMP shift.  相似文献   

5.
In the present study we report neuropsychological evidence of the existence of an auditory peripersonal space representation around the head in humans and its characteristics. In a group of right brain-damaged patients with tactile extinction, we found that a sound delivered near the ipsilesional side of the head (20 cm) strongly extinguished a tactile stimulus delivered to the contralesional side of the head (cross-modal auditory-tactile extinction). By contrast, when an auditory stimulus was presented far from the head (70 cm), cross-modal extinction was dramatically reduced. This spatially specific cross-modal extinction was most consistently found (i.e., both in the front and back spaces) when a complex sound was presented, like a white noise burst. Pure tones produced spatially specific cross-modal extinction when presented in the back space, but not in the front space. In addition, the most severe cross-modal extinction emerged when sounds came from behind the head, thus showing that the back space is more sensitive than the front space to the sensory interaction of auditory-tactile inputs. Finally, when cross-modal effects were investigated by reversing the spatial arrangement of cross-modal stimuli (i.e., touch on the right and sound on the left), we found that an ipsilesional tactile stimulus, although inducing a small amount of cross-modal tactile-auditory extinction, did not produce any spatial-specific effect. Therefore, the selective aspects of cross-modal interaction found near the head cannot be explained by a competition between a damaged left spatial representation and an intact right spatial representation. Thus, consistent with neurophysiological evidence from monkeys, our findings strongly support the existence, in humans, of an integrated cross-modal system coding auditory and tactile stimuli near the body, that is, in the peripersonal space.  相似文献   

6.
Occelli V, Spence C, Zampini M. Neuropsychologia. 2008;46(11):2845-2850.
In the present study, we examined the potential modulatory effect of relative spatial position on audiotactile temporal order judgments (TOJs) in sighted, early, and late blind adults. Pairs of auditory and tactile stimuli were presented from the left and/or right of participants at varying stimulus onset asynchronies (SOAs) using the method of constant stimuli. The participants had to make unspeeded TOJs regarding which sensory modality had been presented first on each trial. Systematic differences between the participants emerged: While the performance of the sighted participants was unaffected by whether the two stimuli were presented from the same or different positions (replicating the results of several recent studies), the blind participants (regardless of the age of onset of blindness) were significantly more accurate when the auditory and tactile stimuli were presented from different positions rather than from the same position. These results provide the first empirical evidence to suggest a spatial modulation of audiotactile interactions in a temporal task performed by visually impaired humans. The fact that the performance of the blind participants was modulated by the relative spatial position of the stimuli is consistent with data showing that visual deprivation results in an improved ability to process spatial cues within the residual tactile and auditory modalities. These results support the hypothesis that the absence of visual cues results in the emergence of more pronounced audiotactile spatial interactions.  相似文献   
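TOJ data of the kind collected above are typically summarized by fitting a psychometric function to the proportion of "touch first" responses across SOAs; the point of subjective simultaneity (PSS) and just noticeable difference (JND) then index temporal sensitivity. A minimal sketch with made-up response proportions (not data from the study), assuming a cumulative-Gaussian model:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cum_gauss(soa, pss, sigma):
    # Proportion of "touch first" responses modeled as a cumulative Gaussian of SOA
    return norm.cdf(soa, loc=pss, scale=sigma)

# Sign convention (assumed): negative SOA = sound leads, positive = touch leads
soas = np.array([-200, -100, -50, 0, 50, 100, 200], dtype=float)
p_touch_first = np.array([0.05, 0.15, 0.35, 0.55, 0.70, 0.90, 0.97])

(pss, sigma), _ = curve_fit(cum_gauss, soas, p_touch_first, p0=[0.0, 80.0])
jnd = sigma * norm.ppf(0.75)  # SOA separating 50% from 75% responding
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

In this framework, the higher accuracy reported for blind participants with spatially separated stimuli would show up as a smaller JND (a steeper psychometric function) in that condition.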

7.
Previous research has provided inconsistent results regarding the spatial modulation of auditory-somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or simultaneous auditory-somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus opposite sides, and from one of two distances (20 vs. 70 cm) from the participant's head. The results demonstrated a spatial modulation of auditory-somatosensory interactions when auditory stimuli were presented from close to the head. In Experiment 2, electrocutaneous stimuli were delivered to the hands, which were placed either close to or far from the head, while the auditory stimuli were again presented at one of two distances. The results revealed that the spatial modulation observed in Experiment 1 was specific to the particular body part stimulated (head) rather than to the region of space (i.e. around the head) where the stimuli were presented. The results of Experiment 3 demonstrate that sounds that contain high-frequency components are particularly effective in eliciting this auditory-somatosensory spatial effect. Taken together, these findings help to resolve inconsistencies in the previous literature and suggest that auditory-somatosensory multisensory integration is modulated by the stimulated body surface and acoustic spectra of the stimuli presented.  相似文献   

8.
Eye and head movements during vestibular stimulation in the alert rabbit
Rabbits passively oscillated in the horizontal plane with a free head tended to stabilize their head in space (re: earth-fixed surroundings) by moving the head on the trunk (neck angular deviation, NAD) opposite the passively imposed body rotation. The gain (NAD/body rotation) of head stabilization varied from 0.0 to 0.95 (nearly perfect stability) and was most commonly above 0.5. Horizontal eye movement (HEM) was inversely proportional to head-in-space stability, i.e. the gaze (sum of HEM, NAD, and body rotation) was stable in space (regardless of the gain of head stabilization). When the head was fixed to the rotating platform, attempted head movements (head torque) mimicked eye movements in both the slow and fast phases of vestibular nystagmus; tonic eye position was also accompanied by conjugate shifts in tonic head torque. Thus, while eye and head movements may at times be linked, that the slow eye and head movements vary inversely during vestibular stimulation with a free head indicates that the linkage is not rigid.Absence of a textured stationary visual field consistently produced a response termed ‘visual inattentiveness,’ which was characterized by, among other things, a reduction of head and gaze stability in space. This behavioral response could also be reproduced in a subject allowed vision during prolonged vestibular stimulation in the absence of other environmental stimuli. It is suggested that rabbits optimize gaze stability (re: stationary surroundings), with the head contributing variably, as long as the animal is attending to its surroundings.  相似文献   

9.
Evidence from electrophysiological and imaging studies suggests that audio‐visual (AV) stimuli presented in spatial coincidence enhance activity in the subcortical colliculo‐dorsal extrastriate pathway. To test whether repetitive AV stimulation might specifically activate this neural circuit underlying multisensory integrative processes, electroencephalographic data were recorded before and after 2 h of AV training, during the execution of two lateralized visual tasks: a motion discrimination task, relying on activity in the colliculo‐dorsal MT pathway, and an orientation discrimination task, relying on activity in the striate and early ventral extrastriate cortices. During training, participants were asked to detect and perform a saccade towards AV stimuli that were disproportionally allocated to one hemifield (the trained hemifield). Half of the participants underwent a training in which AV stimuli were presented in spatial coincidence, while the remaining half underwent a training in which AV stimuli were presented in spatial disparity (32°). Participants who received AV training with stimuli in spatial coincidence had a post‐training enhancement of the anterior N1 component in the motion discrimination task, but only in response to stimuli presented in the trained hemifield. However, no effect was found in the orientation discrimination task. In contrast, participants who received AV training with stimuli in spatial disparity showed no effects on either task. The observed N1 enhancement might reflect enhanced discrimination for motion stimuli, probably due to increased activity in the colliculo‐dorsal MT pathway induced by multisensory training.  相似文献   

10.
When subjects decide whether two visual stimuli presented in various orientations are identical or mirror-images, reaction time increases with the angular disparity between the stimuli. The interpretation of this well-known observation is that subjects mentally rotate images of the stimuli until they are in congruence, in order to solve the task. Here we review studies involving mental rotation of tactile stimuli. Mental rotation in tactile tasks is specifically associated with the requirement for mirror-image discrimination, as opposed to identity judgments. The key brain region mediating mental rotation of tactile stimuli seems to be the parietal cortex. Visual processing appears to facilitate task performance. We report an experiment from our laboratory addressing the nature of the reference frame for mental rotation of tactile stimuli. Our observations indicate that when the hand is directly in front of the body, with the head facing forward, the shortest reaction times for mirror-image discrimination of stimuli applied to the fingerpad are obtained when the longitudinal axis of the stimulus is in or parallel to the sagittal plane, even when this is perpendicular to the long axis of the finger. Thus, the reference frame for mental rotation of tactile stimuli is not purely hand-centered. This is consistent with other findings indicating variable assignment of reference frames for tactile perception.  相似文献   
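The signature mental-rotation result — RT increasing with angular disparity — is conventionally quantified by a linear fit whose slope gives the time cost per degree, and whose reciprocal gives a rotation rate. A sketch with hypothetical RTs (not data from the studies reviewed):

```python
import numpy as np

angles = np.array([0, 45, 90, 135, 180], dtype=float)   # angular disparity, degrees
rts = np.array([520, 610, 700, 795, 880], dtype=float)  # hypothetical mean RTs, ms

slope, intercept = np.polyfit(angles, rts, 1)  # ms per degree of rotation
rotation_rate = 1000.0 / slope                 # degrees mentally rotated per second
print(f"slope = {slope:.2f} ms/deg; rotation rate = {rotation_rate:.0f} deg/s")
```

The intercept captures rotation-independent stages (encoding, comparison, response), so differences in slope versus intercept across conditions can dissociate the rotation process itself from other task components.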

11.
Although some brain areas preferentially process information from a particular sensory modality, these areas can also respond to other modalities. Here we used fMRI to show that such responsiveness to tactile stimuli depends on the temporal frequency of stimulation. Participants performed a tactile threshold-tracking task where the tip of either their left or right middle finger was stimulated at 3, 20, or 100 Hz. Whole-brain analysis revealed an effect of stimulus frequency in two regions: the auditory cortex and the visual cortex. The BOLD response in the auditory cortex was stronger during stimulation at hearable frequencies (20 and 100 Hz) whereas the response in the visual cortex was suppressed at infrasonic frequencies (3 Hz). Regardless of which hand was stimulated, the frequency-dependent effects were lateralized to the left auditory cortex and the right visual cortex. Furthermore, the frequency-dependent effects in both areas were abolished when the participants performed a visual task while receiving identical tactile stimulation as in the tactile threshold-tracking task. We interpret these findings in the context of the metamodal theory of brain function, which posits that brain areas contribute to sensory processing by performing specific computations regardless of input modality.  相似文献   

12.
To investigate the role of visual spatial information in the control of spatial attention, event-related brain potentials (ERPs) were recorded during a tactile attention task for a group of totally blind participants who were either congenitally blind or had lost vision during infancy, and for an age-matched, sighted control group who performed the task in the dark. Participants had to shift attention to the left or right hand (as indicated by an auditory cue presented at the start of each trial) in order to detect infrequent tactile targets delivered to this hand. Effects of tactile attention on the processing of tactile events, as reflected by attentional modulations of somatosensory ERPs to tactile stimuli, were very similar for early blind and sighted participants, suggesting that the capacity to selectively process tactile information from one hand versus the other does not differ systematically between the blind and the sighted. ERPs measured during the cue-target interval revealed an anterior directing attention negativity (ADAN) that was present for the early blind group as well as for the sighted control group. In contrast, the subsequent late directing attention positivity (LDAP) over posterior sites was absent in both groups. These results suggest that these two components reflect functionally distinct attentional control mechanisms that differ in their dependence on the availability of visually coded representations of external space.

13.
Tactile, acoustic and vestibular systems sum to elicit the startle reflex
The startle reflex is elicited by intense tactile, acoustic or vestibular stimuli. Fast mechanoreceptors in each modality can respond to skin or head displacement. In each modality, stimulation of cranial nerves or primary sensory nuclei evokes startle-like responses. The most sensitive sites in rats are found in the ventral spinal trigeminal pathway, corresponding to inputs from the dorsal face. Cross-modal summation is stronger than intramodal temporal summation, suggesting that the convergence of acoustic, vestibular and tactile information is important for eliciting startle. This summation declines sharply if the cross-modal stimuli are not synchronous. Head impact stimuli activate trigeminal, acoustic and vestibular systems together, suggesting that the startle response protects the body from impact stimuli. In each primary sensory nucleus, large, second-order neurons project to pontine reticular formation giant neurons critical for the acoustic startle reflex. In vestibular nucleus sites, startle-like responses appear to be mediated mainly via the vestibulospinal tract, not the reticulospinal tract. Summation between vestibulospinal and reticulospinal pathways mediating startle is proposed to occur in the ventral spinal cord.  相似文献   

14.
The process of visuo-spatial updating is crucial in guiding human behaviour. While the parietal cortex has long been considered a principal candidate for performing spatial transformations, the exact underlying mechanisms are still unclear. In this study, we investigated the ability of a patient with a right occipito-parietal lesion to update visual space during vestibularly guided saccades. To quantify possible deficits in visual and vestibular memory processes, we studied the subject's performance in two separate memory tasks, visual (VIS) and vestibular (VEST). In the VIS task, a saccade was elicited from a central fixation point to the location of a memorized visual target; in the VEST task, the saccade was elicited after whole-body rotation, toward the starting position, thus compensating for the rotation. Finally, in an updating task (UPD), the subject had to memorize the position of a visual target and then, after a whole-body rotation, produce a saccade to the remembered target location in space. Our main finding was significant hypometria in the final eye position of both VEST and UPD saccades induced during rotation toward the left (contralesional) hemispace, as compared with saccades induced after right (ipsilesional) rotation. Moreover, these deficits in vestibularly guided saccades correlated with deficits in the vestibulo-ocular time constant, reflecting disorders in the inertial vestibular integration path. We conclude that the occipito-parietal cortex in man can provide a first stage in visuo-spatial remapping by encoding inertial head position signals during gaze orientation.

15.
In the present study, causal roles of both the primary somatosensory cortex (SI) and the posterior parietal cortex (PPC) were investigated in a tactile unimodal working memory (WM) task. Individual magnetic resonance imaging‐based single‐pulse transcranial magnetic stimulation (spTMS) was applied, respectively, to the left SI (ipsilateral to tactile stimuli), right SI (contralateral to tactile stimuli) and right PPC (contralateral to tactile stimuli), while human participants were performing a tactile‐tactile unimodal delayed matching‐to‐sample task. The time points of spTMS were 300, 600 and 900 ms after the onset of the tactile sample stimulus (duration: 200 ms). Compared with ipsilateral SI, application of spTMS over either contralateral SI or contralateral PPC at those time points significantly impaired the accuracy of task performance. Meanwhile, the deterioration in accuracy did not vary with the stimulating time points. Together, these results indicate that the tactile information is processed cooperatively by SI and PPC in the same hemisphere, starting from the early delay of the tactile unimodal WM task. This pattern of processing of tactile information is different from the pattern in tactile‐visual cross‐modal WM. In a tactile‐visual cross‐modal WM task, SI and PPC contribute to the processing sequentially, suggesting a process of sensory information transfer during the early delay between modalities.  相似文献   

16.
Mental body representations are flexible and depend on sensory signals from the body and its surroundings. Clinical observations in amputees, paraplegics and brain-damaged patients suggest a vestibular contribution to the body schema, but studies using well-controlled psychophysical procedures are still lacking. In Experiment 1, we used a tactile distance comparison task between two body segments (hand and forehead). The results showed that objects contacting the hand were judged longer during caloric vestibular stimulation than during control thermal stimulation. In Experiment 2, participants located four anatomical landmarks on their left hand by pointing with their right hand. The perceived length and width of the left hand increased during caloric vestibular stimulation with respect to a control stimulation. The results show that the body schema temporarily adjusts as a function of vestibular signals, modifying the internal representation of hand size. The data provide evidence that vestibular functions are not limited to postural and oculomotor control, and they extend the contribution of the vestibular system to bodily cognition. These findings suggest the inclusion of vestibular signals in current models of body representations and bodily self-consciousness.

17.
The existence of a human primary vestibular cortex is still debated. Current knowledge mainly derives from functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) acquisitions during artificial vestibular stimulation. This may be problematic, as artificial vestibular stimulation entails coactivation of other sensory receptors. The use of fMRI is challenging because the strong magnetic field and loud noise during MRI may both stimulate the vestibular organ. This study aimed to characterize cortical activity during natural stimulation of the human vestibular organ. Two fluorodeoxyglucose (FDG)-PET scans were obtained after natural vestibular stimulation in a self-propelled chair. Two types of stimuli were applied: (a) rotation (horizontal semicircular canal) and (b) linear sideways movement (utriculus). A comparable baseline FDG-PET scan was obtained after sitting motionless in the chair. In both stimulation paradigms, significantly increased FDG uptake was measured bilaterally in the medial part of Heschl's gyrus, with some overlap into the posterior insula. This is the first neuroimaging study to visualize cortical processing of natural vestibular stimuli. FDG uptake was demonstrated in the medial-most part of Heschl's gyrus, normally associated with the primary auditory cortex. This anatomical localization seems plausible, considering that the labyrinth contains both the vestibular organ and the cochlea.

18.
The spatial rule of multisensory integration holds that cross-modal stimuli presented from the same spatial location result in enhanced multisensory integration. The present study investigated whether processing within the somatosensory cortex reflects the strength of cross-modal visuotactile interactions depending on the spatial relationship between visual and tactile stimuli. Visual stimuli were task-irrelevant and were presented simultaneously with touch in peripersonal and extrapersonal space, in the same or opposite hemispace with respect to the tactile stimuli. Participants directed their attention to one of their hands to detect infrequent tactile target stimuli at that hand while ignoring tactile targets at the unattended hand, all tactile nontarget stimuli, and any visual stimuli. Enhancement of ERPs recorded over and close to the somatosensory cortex was present as early as 100 msec after onset of stimuli (i.e., overlapping with the P100 component) when visual stimuli were presented next to the site of tactile stimulation (i.e., perihand space) compared to when these were presented at different locations in peripersonal or extrapersonal space. Therefore, this study provides electrophysiological support for the spatial rule of visual-tactile interaction in human participants. Importantly, these early cross-modal spatial effects occurred regardless of the locus of attention. In addition, and in line with previous research, we found attentional modulations of somatosensory processing only to be present in the time range of the N140 component and for longer latencies with an enhanced negativity for tactile stimuli at attended compared to unattended locations. Taken together, the pattern of the results from this study suggests that visuotactile spatial effects on somatosensory processing occur prior and independent of tactile-spatial attention.  相似文献   

19.
The effect of passive whole-body rotation about the earth-vertical axis on the lateralization of dichotic sound was investigated in human subjects. Pure-tone pulses (1 kHz; 0.1 s duration) with various interaural time differences were presented via headphones during brief, low-amplitude rotation (angular acceleration 400°/s²; maximum velocity 90°/s; maximum displacement 194°). Subjects made two-alternative forced-choice (left/right) judgements on the acoustic stimuli. The auditory median plane of the head was shifted opposite to the direction of rotation, indicating a shift of the intracranial auditory percept in the direction of rotation. The mean magnitude of the shift was 10.7 µs. This result demonstrates a slight but significant influence of rotation on sound lateralization, suggesting that vestibular information is taken into account by the brain for accurate localization of stationary sound sources during natural head and body motion.
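For intuition about the size of this effect, the reported 10.7 µs shift can be converted into an equivalent azimuthal angle using a spherical-head (Woodworth) ITD model. The head radius and speed of sound below are generic textbook assumptions, not values from the study, so the resulting angle is only an order-of-magnitude estimate.

```python
import math

# Woodworth's spherical-head approximation for the interaural time difference:
#   ITD(theta) = (a / c) * (theta + sin(theta)),  theta = source azimuth in radians
A = 0.0875  # assumed head radius, m
C = 343.0   # assumed speed of sound, m/s

def itd(theta):
    return (A / C) * (theta + math.sin(theta))

def azimuth_for_itd(target_itd, lo=0.0, hi=math.pi / 2):
    # Invert ITD(theta) by bisection; ITD is monotonic on [0, pi/2]
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if itd(mid) < target_itd:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

shift_us = 10.7  # reported mean shift of the auditory median plane, microseconds
theta = azimuth_for_itd(shift_us * 1e-6)
print(f"{shift_us} us of ITD corresponds to roughly {math.degrees(theta):.2f} deg of azimuth")
```

Under these assumptions the shift corresponds to an azimuthal displacement on the order of one degree, consistent with the abstract's characterization of the effect as slight but significant.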

20.
Animal experiments have shown that the spatial correspondence between auditory and tactile receptive fields of ventral pre-motor neurons provides a map of auditory peripersonal space around the head. This allows neurons to localize a near sound with respect to the head. In the present study, we demonstrated the existence of an auditory peripersonal space around the head in humans. In a right-brain damaged patient with tactile extinction, a sound delivered near the ipsilesional side of the head extinguished a tactile stimulus delivered to the contralesional side of the head (cross-modal auditory-tactile extinction). In contrast, when an auditory stimulus was presented far from the head, cross-modal extinction was dramatically reduced. This spatially specific cross-modal extinction was found only when a complex sound like a white noise burst was presented; pure tones did not produce spatially specific cross-modal extinction. These results show a high degree of functional similarity between the characteristics of the auditory peripersonal space representation in humans and monkeys. This similarity suggests that analogous physiological substrates might be responsible for coding this multisensory integrated representation of peripersonal space in human and non-human primates.  相似文献   
