Similar Literature
20 similar records found (search time: 515 ms)
1.
In oculomotor research, there are two common methods by which the apparent location of visual and/or auditory targets is measured: saccadic eye movements with the head restrained, and gaze shifts (combined saccades and head movements) with the head unrestrained. Because cats have a small oculomotor range (approximately ±25 degrees), head movements are necessary when orienting to targets at the extremes of or outside this range. Here we tested the hypothesis that the accuracy of localizing auditory and visual targets using more ethologically natural head-unrestrained gaze shifts would be superior to that achieved with head-restrained eye saccades. The effect of stimulus duration on localization accuracy was also investigated. Three cats were trained using operant conditioning, with their heads initially restrained, to indicate the location of auditory and visual targets via eye position. Long-duration visual targets were localized accurately with little error, but the locations of short-duration visual and both long- and short-duration auditory targets were markedly underestimated. With the head unrestrained, localization accuracy improved substantially for all stimuli and all durations. While the improvement for long-duration stimuli with the head unrestrained might be expected, given that dynamic sensory cues were available during the gaze shifts and no memory component was required, surprisingly, the improvement was greatest for the auditory and visual stimuli with the shortest durations, where the stimuli were extinguished prior to the onset of the eye or head movement. The underestimation of auditory targets with the head restrained is explained in terms of the unnatural sensorimotor conditions that likely result during head restraint.

2.
Stimulation of the superior colliculus in rats produces movements of the head and body that resemble either orientation and approach towards a contralateral stimulus, or avoidance of, or escape from, such a stimulus. A variety of evidence indicates that the crossed descending pathway, which runs in the contralateral predorsal bundle to the pontomedullary reticular formation and the spinal cord, is involved in orienting movements. The nature of this involvement was investigated by assessing the effects on tectally elicited movements of midbrain knife-cuts intended to section the pathway as it crosses the midline in the dorsal tegmental decussation. As expected, ipsilateral movements resembling avoidance or escape were little affected by section of the dorsal tegmental decussation, whereas contralateral circling movements of the body were almost abolished. However, contralateral movements of the head in response to electrical stimulation were not eliminated, nor were orienting head movements to visual or tactile stimuli. There was some suggestion that section of the dorsal tegmental decussation increased the latency of head movements evoked by electrical stimulation at lateral sites, and decreased the accuracy of orienting movements to sensory stimuli. These results support the view that the crossed tectoreticulospinal system is concerned with approach rather than avoidance movements. However, it appears that other, as yet unidentified, tectal efferent systems are also involved in orienting head movements. It is possible that this division of labour reflects functional differences between various kinds of apparently similar orienting responses. One suggestion is that the tectoreticulospinal system is concerned less with open-loop orienting responses (those that are initiated but not subsequently guided by sensory stimuli) than with following or pursuit movements.

3.
Previous studies have demonstrated that human subjects update the location of visual targets for saccades after head and body movements and in the absence of visual feedback. This phenomenon is known as spatial updating. Here we investigated whether a similar mechanism exists for the perception of motion direction. We recorded eye positions in three dimensions and behavioral responses in seven subjects during a motion task in two different conditions: when the subject's head remained stationary and when subjects rotated their heads around an anteroposterior axis (head tilt). We demonstrated that (1) after head tilt, subjects updated the direction of saccades made in the perceived stimulus direction (motion direction updating); (2) the amount of updating varied across subjects and stimulus directions; (3) the amount of motion direction updating was highly correlated with the amount of spatial updating during a memory-guided saccade task; (4) subjects updated the stimulus direction during a two-alternative forced-choice direction discrimination task in the absence of saccadic eye movements (perceptual updating); (5) perceptual updating was more accurate than motion direction updating involving saccades; and (6) subjects updated motion direction similarly during active and passive head rotation. These results demonstrate the existence of an updating mechanism for the perception of motion direction in the human brain that operates during active and passive head rotations and that resembles that of spatial updating. Such a mechanism operates during different tasks involving different motor and perceptual skills (saccades and motion direction discrimination) with different degrees of accuracy.
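A minimal sketch of the geometry behind such direction updating, with an illustrative updating gain (the function and values below are hypothetical, not from the study): with full updating (gain = 1) the reported direction stays fixed in space, while with no updating (gain = 0) it rotates with the head.

def predicted_report_deg(true_direction_deg, head_roll_deg, updating_gain=1.0):
    """Reported motion direction (deg, space-fixed frame) after a head roll.
    If the remembered direction were carried with the head and then counter-rotated
    by updating_gain * head_roll_deg, gain = 1 gives a spatially accurate report
    and gain = 0 gives a report that rotates fully with the head."""
    return (true_direction_deg + (1.0 - updating_gain) * head_roll_deg) % 360.0

# Upward motion (90 deg) viewed before a 30 deg head roll:
print(predicted_report_deg(90.0, 30.0, updating_gain=1.0))  # 90.0  -> full updating
print(predicted_report_deg(90.0, 30.0, updating_gain=0.0))  # 120.0 -> no updating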

4.
The sudden onset of a novel stimulus usually triggers orienting responses of the eyes, head and external ears (pinnae). These responses facilitate the reception of additional signals originating from the source of the stimulus and assist in the sensory guidance of appropriate limb and body movements. A midbrain structure, the superior colliculus, plays a critical role in triggering and organizing orienting movements and is a particularly interesting structure for studying the neural computations involved in the translation of sensory signals into motor commands. Auditory, somatosensory and visual signals converge in its deep layers, where neurons are found that generate motor commands for eye, head and pinna movements. This article focuses on the role of the superior colliculus in the control of saccadic (quick, high-velocity) eye movements with particular regard to three issues related to the functional properties of collicular neurons. First, how do neurons with large movement fields specify accurately the direction and amplitude of an eye movement? Second, how are signals converted from different sensory modalities into commands in a common motor frame of reference? Last, how are the motor command signals found in the superior colliculus transformed into those needed by the motor neuron pools innervating the extraocular muscles?

5.
Spatial orientation is crucial when subjects have to accurately reach memorized visual targets. In previous studies, modified gravitoinertial force fields were used to affect the accuracy of pointing movements in complete darkness without visual feedback of the moving limb. Target mislocalization was put forward as one hypothesis to explain this decrease in accuracy of pointing movements. The aim of this study was to test this hypothesis by determining the accuracy of spatial localization of memorized visual targets in a perturbed gravitoinertial force field. As head orientation is involved in localization tasks and the head carries the relevant sensory systems (visual, vestibular and neck-muscle proprioceptive), we also tested the effect of head posture on the accuracy of localization. Subjects (n=10) were seated off-axis on a rotating platform (120 degrees/s) in complete darkness with the head fixed (head-fixed session) or free to move (head-free session). They were required to report verbally the egocentric spatial localization of memorized visual targets. They gave the perceived target location in direction (i.e. left or right) and in amplitude (in centimeters) relative to the direction they judged to be straight ahead. Results showed that the accuracy of visual localization decreased when subjects were exposed to inertial forces. Moreover, subjects localized the memorized visual targets more to the right than their actual position, i.e. in the direction of the inertial forces. Further analysis showed that this shift of localization was concomitant with a shift of the visual straight ahead (VSA) in the opposite direction. Thus, the modified gravitoinertial force field led to a modification in the orientation of the egocentric reference frame. Furthermore, this shift of localization increased when the head was free to move, with the head tilted in roll toward the center of rotation of the platform and turned in yaw in the same direction. It is concluded that the orientation of the egocentric reference frame was influenced by the gravitoinertial vector.
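For orientation, a rough sketch of the gravitoinertial tilt such a platform produces; the seating radius below is an assumption for illustration only (the abstract does not state it).

import math

def gif_tilt_deg(rotation_deg_per_s, radius_m, g=9.81):
    """Tilt of the gravitoinertial force (GIF) vector away from gravitational
    vertical for a subject seated off-axis on a platform rotating at constant
    velocity. The centrifugal acceleration is omega^2 * r, directed away from
    the rotation axis."""
    omega = math.radians(rotation_deg_per_s)
    centrifugal = omega ** 2 * radius_m
    return math.degrees(math.atan2(centrifugal, g))

# Assumed example: 120 degrees/s rotation with the subject 1 m off-axis (radius is illustrative).
print(round(gif_tilt_deg(120.0, 1.0), 1))  # ~24.1 deg of GIF tilt away from vertical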

6.
The effects of stimulus intensity, duration, and risetime on the autonomic and behavioral components of orienting, startle, and defense responses were investigated. Six groups of 10 students were presented with 15 white noise stimuli at either 60 or 100 dB, with controlled risetimes of either 5 or 200 ms, and at stimulus durations of 1 or 5 s (1 s only in the case of the 60-dB groups). A dishabituation stimulus consisting of a 1000 Hz tone was also presented. Measures consisted of skin conductance and heart rate, together with ratings of facial expressions and upper torso movement obtained using video recording. Increased intensity resulted in greater amplitudes and frequencies of electrodermal and behavioral responses, and a change from cardiac deceleration to acceleration. Faster risetimes elicited larger electrodermal responses, greater frequencies of eye-blinks, head and body movements, and larger cardiac accelerations. The effects of duration for the 100-dB stimuli were less clear-cut. Overall, the results are discussed in relation to the differentiation of orienting, startle, and defense responses.

7.
We examined the motor error hypothesis of visual and auditory interaction in the superior colliculus (SC), first tested by Jay and Sparks in the monkey. We trained cats to direct their eyes to the location of acoustic sources and studied the effects of eye position on both the ability of cats to localize sounds and the auditory responses of SC neurons with the head restrained. Sound localization accuracy was generally not affected by initial eye position, i.e., accuracy was not proportionally affected by the deviation of the eyes from the primary position at the time of stimulus presentation, showing that eye position is taken into account when orienting to acoustic targets. The responses of most single SC neurons to acoustic stimuli in the intact cat were modulated by eye position in the direction consistent with the predictions of the "motor error" hypothesis, but the shift accounted for only two-thirds of the initial deviation of the eyes. However, when the average horizontal sound localization error, which was approximately 35% of the target amplitude, was taken into account, the magnitude of the horizontal shifts in the SC auditory receptive fields matched the observed behavior. The modulation by eye position was not due to concomitant movements of the external ears, as confirmed by recordings carried out after immobilizing the pinnae of one cat. However, the pattern of modulation after pinnae immobilization was inconsistent with the observations in the intact cat, suggesting that, in the intact animal, information about the position of the pinnae may be taken into account.
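A minimal numeric sketch of one way to read the arithmetic above (the function name and the 15-degree example are illustrative): under the motor-error hypothesis an SC auditory receptive field, expressed relative to the eye, should shift with eye position; scaling that prediction by the behavioral localization gain (~0.65, i.e. a ~35% undershoot) reproduces the reported two-thirds shift.

def predicted_rf_shift(eye_deviation_deg, localization_gain=1.0):
    """Horizontal shift of an SC auditory receptive field predicted by the
    motor-error hypothesis. With perfect localization (gain = 1) the field
    should shift by the full initial eye deviation; if sound location is
    behaviorally underestimated (gain < 1), the required shift scales down."""
    return localization_gain * eye_deviation_deg

eye_dev = 15.0                              # deg, illustrative initial eye deviation
print(predicted_rf_shift(eye_dev, 1.0))     # 15.0 deg: naive motor-error prediction
print(predicted_rf_shift(eye_dev, 0.65))    # 9.75 deg: about two-thirds, consistent with a ~35% undershoot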

8.
Encoding of visual target location in extrapersonal space requires convergence of at least three types of information: retinal signals, information about orbital eye positions, and the position of the head on the body. Since the position of gaze is the sum of the head position and the eye position, inaccuracy of spatial localization of the target may result from the sum of the corresponding three levels of error: retinal, ocular and head. In order to evaluate the possible errors evoked at each level, accuracy of target encoding was assessed through a motor response requiring subjects to point with the hand towards a target seen under foveal vision, eliminating the retinal source of error. Subjects had first to orient their head to one of three positions to the right (0, 40, 80°) and maintain this head position while orienting gaze and pointing to one of five target positions (0, 20, 40, 60, 80°). This resulted in 11 combinations of static head and eye positions, and corresponded to five different gaze eccentricities. The accuracy of target pointing was tested without vision of the moving hand. Six subjects were tested. No systematic bias in finger pointing was observed for eye positions ranging from 0 to 40° to the right or left within the orbit. However, the variability (as measured by a surface error) given by the scatter of hand pointing increased quadratically with eye eccentricity. A similar observation was made with the eye centred and the head position ranging from 0 to 80°, although the surface error increased less steeply with eccentricity. Some interaction between eye and head eccentricity also contributed to the pointing error. These results suggest that pointing should be most accurate with a head displacement corresponding to 90% of the gaze eccentricity. These results explain the systematic hypometry of head orienting towards targets observed under natural conditions: the respective contributions of head and eye to gaze orientation might thus be determined in order to optimize the accuracy of target encoding.
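A hedged sketch of one way the reported ~90% optimum could arise from the quadratic growth of pointing variability: if eye eccentricity inflates variance much more steeply than head eccentricity, shifting most of the gaze displacement onto the head minimizes the summed variance. The coefficients below are illustrative, not the study's fits.

def optimal_head_share(a_eye, b_head):
    """Fraction of a gaze eccentricity G that the head should cover to minimize
    total pointing variance, assuming variance grows as a_eye*E_eye^2 with eye
    eccentricity and b_head*E_head^2 with head eccentricity, where E_eye + E_head = G.
    Setting d/dE_eye [a*E_eye^2 + b*(G - E_eye)^2] = 0 gives E_head/G = a/(a+b)."""
    return a_eye / (a_eye + b_head)

# Illustrative coefficients only: if eye eccentricity drives variance ~9x more
# steeply than head eccentricity, the optimum puts ~90% of the gaze on the head.
print(optimal_head_share(9.0, 1.0))  # 0.9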

9.
Sound localization in humans relies on binaural differences (azimuth cues) and monaural spectral shape information (elevation cues) and is therefore the result of a neural computational process. Despite the fact that these acoustic cues are referenced with respect to the head, accurate eye movements can be generated to sounds in complete darkness. This ability necessitates the use of eye position information. So far, however, sound localization has been investigated mainly with a fixed head position, usually straight ahead. Yet the auditory system may rely on head motor information to maintain a stable and spatially accurate representation of acoustic targets in the presence of head movements. We therefore studied the influence of changes in eye-head position on auditory-guided orienting behavior of human subjects. In the first experiment, we used a visual-auditory double-step paradigm. Subjects made saccadic gaze shifts in total darkness toward brief broadband sounds presented before an intervening eye-head movement that was evoked by an earlier visual target. The data show that the preceding displacements of both eye and head are fully accounted for, resulting in spatially accurate responses. This suggests that auditory target information may be transformed into a spatial (or body-centered) frame of reference. To further investigate this possibility, we exploited the unique property of the auditory system that sound elevation is extracted independently from pinna-related spectral cues. In the absence of such cues, accurate elevation detection is not possible, even when head movements are made. This is shown in a second experiment where pure tones were localized at a fixed elevation that depended on the tone frequency rather than on the actual target elevation, under both head-fixed and head-free conditions. To test, in a third experiment, whether the perceived elevation of tones relies on a head- or space-fixed target representation, eye movements were elicited toward pure tones while subjects kept their head in different vertical positions. It appeared that each tone was localized at a fixed, frequency-dependent elevation in space that shifted to a limited extent with changes in head elevation. Hence information about head position is used under static conditions too. Interestingly, the influence of head position also depended on the tone frequency. Thus tone-evoked ocular saccades typically showed a partial compensation for changes in static head position, whereas noise-evoked eye-head saccades fully compensated for intervening changes in eye-head position. We propose that the auditory localization system combines the acoustic input with head-position information to encode targets in a spatial (or body-centered) frame of reference. In this way, accurate orienting responses may be programmed despite intervening eye-head movements. A conceptual model, based on the tonotopic organization of the auditory system, is presented that may account for our findings.
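A minimal sketch of the double-step bookkeeping described above, with hypothetical names and one-dimensional angles: only a spatially (body-centered) stored target predicts the accurate responses reported for noise bursts, because it subtracts the full intervening eye-head displacement.

def required_gaze_shift(target_in_space, gaze_before, intervening_gaze_displacement,
                        spatial_storage=True):
    """Gaze displacement (deg, one axis) needed to acquire a remembered sound
    after an intervening eye-head movement. With spatial storage, the
    intervening displacement is fully accounted for; without it, the response
    ignores the displacement and lands off target."""
    if spatial_storage:
        return target_in_space - (gaze_before + intervening_gaze_displacement)
    return target_in_space - gaze_before

# Example: sound at +20 deg, gaze initially at 0 deg, then a +30 deg intervening eye-head movement.
print(required_gaze_shift(20.0, 0.0, 30.0, spatial_storage=True))   # -10.0 deg: spatially accurate
print(required_gaze_shift(20.0, 0.0, 30.0, spatial_storage=False))  # +20.0 deg: misses by the 30 deg displacement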

10.
1. Orienting movements, consisting of coordinated eye and head displacements, direct the visual axis to the source of a sensory stimulus. A recent hypothesis suggests that the CNS may control gaze position (gaze = eye-relative-to-space = eye-relative-to-head + head-relative-to-space) by the use of a feedback circuit wherein an internally derived representation of gaze motor error drives both eye and head premotor circuits. In this paper we examine the effect of behavioral task on the individual and summed trajectories of horizontal eye- and head-orienting movements to gain more insight into how the eyes and head are coupled and controlled in different behavioral situations. 2. Cats whose heads were either restrained (head-fixed) or unrestrained (head-free) were trained to make orienting movements of any desired amplitude in a simple cat-and-mouse game we call the barrier paradigm. A rectangular opaque barrier was placed in front of the hungry animal who either oriented to a food target that was visible to one side of the barrier or oriented to a location on an edge of the barrier where it predicted the target would reappear from behind the barrier. 3. The dynamics (e.g., maximum velocity) and duration of eye- and head-orienting movements were affected by the task. Saccadic eye movements (head-fixed) elicited by the visible target attained greater velocity and had shorter durations than comparable amplitude saccades directed toward the predicted target. A similar observation has been made in human and monkey. In addition, when the head was unrestrained both the eye and head movements (and therefore gaze movements) were faster and shorter in the visible- compared with the predicted-target conditions. Nevertheless, the relative contributions of the eye and head to the overall gaze displacement remained task independent: i.e., the distance traveled by the eye and head movements was determined by the size of the gaze shift only. This relationship was maintained because the velocities of the eye and head movements covaried in the different behavioral situations. Gaze-velocity profiles also had characteristic shapes that were dependent on task. In the predicted-target condition these profiles tended to have flattened peaks, whereas when the target was visible the peaks were sharper. 4. Presentation of a visual cue (e.g., reappearance of food target) immediately before (less than 50 ms) the onset of a gaze shift to a predicted target triggered a midflight increase in first the eye- and, after approximately 20 ms, the head-movement velocity. (ABSTRACT TRUNCATED AT 400 WORDS)
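A toy sketch of the gaze-feedback hypothesis mentioned above, with arbitrary first-order "plants" and gains (nothing below is fitted to cat data): a single gaze motor error drives both eye and head, and the relative eye/head contributions fall out of the ratio of the gains, which is one way to rationalize the task-independent contributions reported.

def simulate_gaze_feedback(target_deg, n_steps=400, dt=0.001,
                           k_eye=60.0, k_head=20.0):
    """Toy discrete-time version of a common gaze-feedback scheme: a single
    gaze motor error (target - eye - head) drives first-order eye and head
    plants. Gains and dynamics are illustrative only."""
    eye, head = 0.0, 0.0
    for _ in range(n_steps):
        gaze_error = target_deg - (eye + head)
        eye += k_eye * gaze_error * dt     # eye-in-head contribution
        head += k_head * gaze_error * dt   # head-in-space contribution
    return eye, head, eye + head

eye, head, gaze = simulate_gaze_feedback(40.0)
print(round(eye, 1), round(head, 1), round(gaze, 1))  # 30.0 10.0 40.0 -> split follows the gain ratio, gaze converges to the target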

11.
This study was motivated by the observation of early head movements (EHMs) occasionally generated before gaze shifts. Human subjects were presented with a visual or auditory target, along with an accompanying stimulus of the other modality, that either appeared at the same location as the target (enhancer condition) or at the diametrically opposite location (distractor condition). Gaze shifts generated to the target in the distractor condition sometimes were preceded by EHMs directed either to the side of the target (correct EHMs) or the side of the distractor (incorrect EHMs). During EHMs, the eyes performed compensatory eye movements to keep gaze stable. Incorrect EHMs were usually between 1 and 5 degrees in amplitude and reached peak velocities generally <50 degrees/s. These metrics increased for more eccentric distractors. The dynamics of incorrect EHMs initially followed a trajectory typical of much larger head movements. These results suggest that incorrect EHMs are head movements that initially were planned to orient to the peripheral distractor. Furthermore, gaze shifts preceded by incorrect EHMs had longer reaction latencies than gaze shifts not preceded by incorrect EHMs, suggesting that the processes leading to incorrect EHMs also serve to delay gaze-shift initiation. These results demonstrate a form of distraction analogous to the incorrect gaze shifts (IGSs) described in the previous paper and suggest that a motor program encoding a gaze shift to a distractor is capable of initiating either an IGS or an incorrect EHM. A neural program not strong enough to initiate an IGS nevertheless can initiate an incorrect EHM.

12.
1. This study investigates the contribution of the optic tectum to encoding the metric and kinetic properties of saccadic head movements. We describe the dependence of head movement components (size, direction, and speed) on parameters of focal electrical stimulation of the barn owl's optic tectum. The results demonstrate that both the site and the amount of activity can influence head saccade metrics and kinetics. 2. Electrical stimulation of the owl's optic tectum elicited rapid head movements that closely resembled natural head movements made in response to auditory and visual stimuli. The kinetics of these movements were similar to those of saccadic eye movements in primates. 3. The metrics and kinetics of head movements evoked from any given site depended strongly on stimulus parameters. Movement duration increased with stimulus duration, as did movement size. Both the size and the maximum speed of the movement increased to a plateau value with current strength and pulse rate. Movement direction was independent of stimulus parameters. 4. The initial position of the head influenced the size, direction, and speed of movements evoked from any given site: when the owl initially faced away from the direction of the induced saccade, the movement was larger and faster than when the owl initially faced toward the direction of the induced movement. 5. A characteristic movement of particular size, direction, and speed could be defined for each site by the use of stimulation parameters that elicited plateau movements with normal kinetic profiles and by having the head initially centered on the body. The size, direction, and speed of these characteristic movements varied systematically with the site of stimulation across the tectum. The map of head movement vector (size and direction) was aligned with the sensory representations of visual and auditory space, such that the movement elicited from a given site when the owl initially faced straight ahead brought the owl to face that region of space represented by the sensory responses of the neurons at the site of stimulation. 6. The results imply that both the site and the amount of neural activity in the optic tectum contribute to encoding the metrics and kinetics of saccadic movements. A comparison of the present findings with previous studies on saccadic eye movements in primates and combined eye and head movements in cats suggests striking similarities in the ways in which tectal activity specifies a redirection in gaze to such dissimilar motor effectors as the eyes and head.

13.
Spatial attention modulates sound localization in barn owls
Attentional influence on the sound-localization behavior of barn owls was investigated in a cross-modal spatial cuing paradigm. After being cued to the most probable target side with a visual cuing stimulus, owls localized upcoming auditory target stimuli with a head turn toward the position of the sound source. In 80% of the trials, cuing stimuli pointed toward the side of the upcoming target stimulus (valid configuration), and in 20% they pointed toward the opposite side (invalid configuration). We found that owls initiated head turns a mean of 37.4 ms earlier in valid trials; i.e., mean response latencies of head turns were reduced by 16% after a valid cuing stimulus. Thus auditory stimuli appearing at the cued side were processed faster than stimuli appearing at the uncued side, indicating the influence of a spatial-selective attention mechanism. Turning angles were not different when owls turned their head toward a cued or an uncued location. Other attentional effects on sound localization, e.g., a reduction of response latency as a function of the duration of the cue-target delay, were not observed. This study is the first attempt to investigate attentional influences on sound localization in an animal model.
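A rough back-calculation from the two reported numbers (both are rounded in the abstract, so the result is only approximate): a 37.4 ms benefit corresponding to a 16% reduction implies uncued head-turn latencies of roughly 234 ms and validly cued latencies of roughly 196 ms.

# Approximate baseline latencies implied by the reported cuing effect (37.4 ms ~ 16%).
latency_benefit_ms = 37.4
fractional_reduction = 0.16
uncued_latency_ms = latency_benefit_ms / fractional_reduction   # ~234 ms
cued_latency_ms = uncued_latency_ms - latency_benefit_ms        # ~196 ms
print(round(uncued_latency_ms), round(cued_latency_ms))         # 234 196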

14.
1. In natural conditions, gaze (i.e., eye + head) orientation is a complex behavior involving simultaneously the eye and head motor systems. Thus one of the key problems of gaze control is whether or not the vestibuloocular reflex (VOR) elicited by head rotation and saccadic eye movement linearly add. 2. Kinematics of human gaze saccades within the oculomotor range (OMR) were quantified under different conditions of head motion. Saccades were visually triggered while the head was fixed or passively moving at a constant velocity (200 deg/s) either in the same direction as, or opposite to, the saccade. Active eye-head coordination was also studied in a session in which subjects were trained to actively rotate their head at a nearly constant velocity during the saccade and, in another session, during natural gaze responses. 3. When the head was passively rotated toward the visual target, both maximum and mean gaze velocities increased with respect to control responses with the head fixed; these effects increased with gaze saccade amplitude. In addition, saccade duration was reduced so that corresponding gaze accuracy, although poorer than for control responses, was not dramatically affected by head motion. 4. The same effects on gaze velocity were present during active head motion when a constant head velocity was maintained throughout saccade duration, and gaze saccades were as accurate as with the head fixed. 5. During natural gaze responses, an increased gaze velocity and a decreased saccade duration with respect to control responses became significant only for gaze displacement larger than 30 degrees, due to the negligible contribution of head motion for smaller responses. 6. When the head was passively rotated in the opposite direction to target step, gaze saccades were slower than those obtained with the head fixed; but their average accuracy was still maintained. 7. These results confirm a VOR inhibition during saccadic eye movements within the OMR. This inhibition, present in all 16 subjects studied, ranged from 40 to 96% (for a 40 degree target step) between subjects and increased almost linearly with target step amplitude. Furthermore, the systematic difference between instantaneous VOR gain estimated at the time of maximum gaze velocity and mean VOR gain estimated over the whole saccadic duration indicates a decay of VOR inhibition during the ongoing saccade. 8. A simplified model is proposed with a varying VOR inhibition during the saccade. It suggests that VOR inhibition is not directly controlled by the saccadic pulse generator. (ABSTRACT TRUNCATED AT 400 WORDS)
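A minimal sketch of the linear-summation bookkeeping such studies rely on (all values illustrative): with the VOR intact the passive head rotation is cancelled and gaze velocity equals the head-fixed saccadic velocity, whereas partial VOR inhibition lets a fraction of the head velocity add to, or subtract from, the gaze saccade.

def gaze_velocity(saccade_velocity, head_velocity, vor_inhibition):
    """Gaze velocity under linear summation of a saccadic eye command and a
    passively imposed head rotation. vor_inhibition = 0 -> intact VOR fully
    cancels the head movement; vor_inhibition = 1 -> no cancellation, so the
    full head velocity adds to (or subtracts from) the saccade."""
    vor_gain = 1.0 - vor_inhibition
    eye_in_head_velocity = saccade_velocity - vor_gain * head_velocity
    return eye_in_head_velocity + head_velocity   # = saccade + inhibition * head

# 400 deg/s saccade with the head passively rotating at 200 deg/s toward the target:
print(gaze_velocity(400.0, 200.0, vor_inhibition=0.0))   # 400.0 deg/s: VOR intact
print(gaze_velocity(400.0, 200.0, vor_inhibition=0.7))   # 540.0 deg/s: 70% inhibition speeds gaze
# Head rotating opposite to the saccade:
print(gaze_velocity(400.0, -200.0, vor_inhibition=0.7))  # 260.0 deg/s: gaze slowed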

15.
A region in the barn owl forebrain, referred to as the archistriatal gaze fields (AGF), is shown to be involved in auditory orienting behavior. In a previous study, electrical microstimulation of the AGF was shown to produce saccadic movements of the eyes and head, and anatomical data revealed that neurons in the AGF region of the archistriatum project directly to brainstem tegmental nuclei that mediate gaze changes. In this study, we investigated the effects of AGF inactivation on the auditory orienting responses of trained barn owls. The AGF and/or the optic tectum (OT) were inactivated pharmacologically using the GABA(A) agonist muscimol. Inactivation of the AGF alone had no effect on the probability or accuracy of orienting responses to contralateral acoustic stimuli. Inactivation of the OT alone decreased the probability of responses to contralateral stimuli, but the animals were still capable of orienting accurately toward stimuli on about 60% of the trials. Inactivation of both the AGF and the OT drastically decreased the probability of responses to 16–21% and, on the few trials that the animals did respond, there was no relationship between the final direction of gaze and the location of the stimulus. Thus, with the AGF and OT both inactivated, the animals were no longer capable of orienting accurately toward acoustic stimuli located on the contralateral side. These data confirm that the AGF is involved in gaze control and that the AGF and the OT have parallel access to gaze control circuitry in the brainstem tegmentum. In these respects, the AGF in barn owls is functionally equivalent to the frontal eye fields in primates.

16.
In the absence of visual feedback, subject reports of hand location tend to drift over time. Such drift has been attributed to a gradual reduction in the usefulness of proprioception to signal limb position. If this account is correct, drift should degrade the accuracy of movement distance and direction over a series of movements made without visual feedback. To test this hypothesis, we asked participants to perform six series of 75 repetitive movements from a visible start location to a visible target, in time with a regular, audible tone. Fingertip position feedback was given by a cursor during the first five trials in the series. Feedback was then removed, and participants were to continue on pace for the next 70 trials. Movements were made in two directions (30 degrees and 120 degrees) from each of three start locations (initial shoulder angles of 30, 40, and 50 degrees, with an initial elbow angle of 90 degrees). Over the 70 trials, the start location of each movement drifted, on average, 8 cm away from the initial start location. This drift varied systematically with movement direction, indicating that drift is related to movement production. However, despite these dramatic changes in hand position and joint configuration, movement distance and direction remained relatively constant. Inverse dynamics analysis revealed that movement preservation was accompanied by substantial modification of joint muscle torque. These results suggest that proprioception continues to be a reliable source of limb position information after prolonged time without vision, but that this information is used differently for maintaining limb position and for specifying movement trajectory.
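A compact sketch of the kind of inverse-dynamics computation referred to above, for a two-link arm moving in the horizontal plane (gravity omitted); the segment parameters and example postures are illustrative assumptions, not the study's values. Evaluating the same joint-space motion at two postures shows why a drifting configuration forces the joint torques to change even when the movement itself is preserved.

import math

def two_link_inverse_dynamics(q, qd, qdd,
                              m=(2.0, 1.5), l=(0.30, 0.33),
                              lc=(0.15, 0.17), I=(0.025, 0.045)):
    """Shoulder and elbow torques (N*m) for a two-link arm moving in the
    horizontal plane (no gravity term). q, qd, qdd are (shoulder, elbow)
    angles, velocities, accelerations in rad, rad/s, rad/s^2. Segment masses,
    lengths, centers of mass and inertias are illustrative."""
    q1, q2 = q
    q1d, q2d = qd
    q1dd, q2dd = qdd
    m1, m2 = m
    l1, _ = l
    lc1, lc2 = lc
    I1, I2 = I
    h = m2 * l1 * lc2 * math.sin(q2)
    H11 = m1 * lc1**2 + I1 + m2 * (l1**2 + lc2**2 + 2 * l1 * lc2 * math.cos(q2)) + I2
    H12 = m2 * (lc2**2 + l1 * lc2 * math.cos(q2)) + I2
    H22 = m2 * lc2**2 + I2
    tau_shoulder = H11 * q1dd + H12 * q2dd - h * (2 * q1d * q2d + q2d**2)
    tau_elbow = H12 * q1dd + H22 * q2dd + h * q1d**2
    return tau_shoulder, tau_elbow

# The same joint-space motion evaluated at two postures requires different torques,
# illustrating why configuration drift entails torque adjustments.
print(two_link_inverse_dynamics((math.radians(40), math.radians(90)), (1.0, -1.0), (4.0, -4.0)))
print(two_link_inverse_dynamics((math.radians(55), math.radians(70)), (1.0, -1.0), (4.0, -4.0)))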

17.
Horizontal step-ramp stimuli were used to examine gaze-, eye-, and head-movement dynamics during head-unrestrained pursuit in two rhesus monkeys. In a first series of experiments, we characterized and compared head-restrained (HR) and -unrestrained (HU) pursuit responses to unpredictable, nonperiodic, constant velocity (20-80 degrees/s) stimuli. When the head was free to move, both monkeys used a combination of eye and head motion to initially fixate and then pursue the target. The pursuit responses (i.e., gaze responses) were highly stereotyped and nearly identical among the HR and HU conditions for a given step-ramp stimulus. In the HU condition, initial eye and initial head acceleration tended to increase as a function of target velocity but did not vary systematically with initial target eccentricity. In a second series of experiments, step-ramp stimuli (40 degrees/s) were presented, and, approximately 125 ms after pursuit onset, a constant retinal velocity error (RVE) was imposed for a duration of 300 ms. In each monkey, HR and HU gaze velocity was similarly affected by stabilizing the target with respect to the monkey's fovea (i.e., RVE = 0 degrees/s) and by moving the target with constant retinal velocity errors (i.e., RVE = +/- 10 degrees/s). In the HU condition, changes in both eye and head velocity trajectories contributed to the observed gaze velocity responses to imposed RVEs. We conclude that eye and head movements are not independently controlled during HU pursuit but rather are controlled, at least in part, by a shared upstream controller within the pursuit pathways.
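A toy sketch of the shared retinal-velocity-error (RVE) feedback idea the conclusion points to, with arbitrary gains and an arbitrary eye/head split (none of this is fitted to the monkey data): one RVE signal updates a common gaze-velocity command, and replacing the natural RVE with an imposed constant changes the gaze-velocity trajectory in the qualitative way described.

def simulate_shared_pursuit(target_velocity=40.0, imposed_rve=None,
                            n_steps=30, impose_from=5, gain=0.2, head_share=0.3):
    """Toy discrete-time pursuit loop: each step, the retinal velocity error
    (target velocity minus gaze velocity) updates a single gaze-velocity
    command that is then split between eye and head. From step `impose_from`
    onward the natural RVE can be replaced by an imposed constant, mimicking
    the open-loop manipulation. Gains and the eye/head split are illustrative."""
    gaze_velocity = 0.0
    for step in range(n_steps):
        natural_rve = target_velocity - gaze_velocity
        rve = natural_rve if (imposed_rve is None or step < impose_from) else imposed_rve
        gaze_velocity += gain * rve
    eye_velocity = (1.0 - head_share) * gaze_velocity
    head_velocity = head_share * gaze_velocity
    return round(gaze_velocity, 1), round(eye_velocity, 1), round(head_velocity, 1)

print(simulate_shared_pursuit())                   # closed loop: gaze velocity approaches the 40 deg/s target
print(simulate_shared_pursuit(imposed_rve=0.0))    # stabilized target: gaze velocity freezes at its value when RVE was zeroed
print(simulate_shared_pursuit(imposed_rve=10.0))   # imposed +10 deg/s RVE: gaze velocity keeps ramping beyond the target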

18.
Compensatory horizontal eye movements of head restrained rats were compared with compensatory horizontal eye-head movements of partially restrained rats (head movements limited to the horizontal plane). Responses were evoked by constant velocity optokinetic and vestibular stimuli (10–60°/s) and recorded with search coils in a rotating magnetic field. Velocity and position components of eye and head responses were analysed. The velocity gains of optokinetic and vestibular responses of partially restrained and of head restrained rats were similarly high (between 0.8 and 1.0). Eye movements in partially restrained rats also contributed most (about 80%) to the velocity components of the responses. At stimulus velocities above 10°/s, the “beating field” of the evoked optokinetic and vestibular nystagmus was shifted transiently in the direction of ocular quick phases. The amplitude of this shift of the line of sight was about 3–10° in head restrained and about 20–30° in partially head restrained rats. Most of this large, transient gaze shift (about 80%) was accomplished by head movements. We interpret this gaze shift as an orienting response, and conclude that the recruitment of the ocular and the neck motor systems can be independent and task specific: head movements are primarily used to orient eye, ear and nose towards a sector of particular relevance, whereas eye movements provide the higher frequency dynamics for image stabilization and vergence movements.

19.
Differences between gaze shifts evoked by collicular electrical stimulation and those triggered by the presentation of a visual stimulus were studied in head-free cats by increasing the head moment of inertia. This maneuver modified the dynamics of these two types of gaze shifts by slowing down head movements. Such an increase in the head moment of inertia did not affect the metrics of visually evoked gaze saccades because their duration was precisely adjusted to compensate for these changes in movement dynamics. In contrast, the duration of electrically evoked gaze shifts remained constant irrespective of the head moment of inertia, and therefore their amplitude was significantly reduced. These results suggest that visually and electrically evoked gaze saccades are controlled by different mechanisms. Whereas the accuracy of visually evoked saccades is likely to be assured by on-line feedback information, the absence of duration adjustment in electrically evoked gaze shifts suggests that feedback information necessary to maintain their metrics is not accessible or is corrupted during collicular stimulation. This is of great importance when these two types of movements are compared to infer the role of the superior colliculus in the control of orienting gaze shifts.

20.
The coordination between eye and head movements during a rapid orienting gaze shift has been investigated mainly when subjects made horizontal movements towards visual targets with the eyes starting at the centre of the orbit. Under these conditions, it is difficult to identify the signals driving the two motor systems, because their initial motor errors are identical and equal to the coordinates of the sensory stimulus (i.e. retinal error). In this paper, we investigate head-free gaze saccades of human subjects towards visual as well as auditory stimuli presented in the two-dimensional frontal plane, under both aligned and unaligned initial fixation conditions. Although the basic patterns of eye and head movements were qualitatively comparable for both stimulus modalities, systematic differences were also obtained under aligned conditions, suggesting a task-dependent movement strategy. Auditory-evoked gaze shifts showed smaller eye-head latency differences, consistently larger head movements and smaller concomitant ocular saccades than visually triggered movements. By testing gaze control for eccentric initial eye positions, we found that the head displacement vector was best related to the initial head motor error (target-re-head), rather than to the initial gaze error (target-re-eye), regardless of target modality. These findings suggest an independent control of the eye and head motor systems by commands in different frames of reference. However, we also observed a systematic influence of the oculomotor response on the properties of the evoked head movements, indicating a subtle coupling between the two systems. The results are discussed in view of current eye-head coordination models.
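A minimal sketch of the two candidate driving signals compared above, using hypothetical one-dimensional angles: with the eyes centred in the orbit the gaze error and the head motor error coincide, and only eccentric initial eye positions (the unaligned condition) can dissociate them.

def initial_motor_errors(target_in_space, head_in_space, eye_in_head):
    """Return (gaze_error, head_error) in deg for one axis.
    gaze_error = target re eye  = T - (head + eye-in-head)
    head_error = target re head = T - head
    With the eyes centred in the orbit the two are identical; eccentric initial
    eye positions dissociate them, which is what identifies the signal driving the head."""
    gaze_error = target_in_space - (head_in_space + eye_in_head)
    head_error = target_in_space - head_in_space
    return gaze_error, head_error

# Target 30 deg right, head straight ahead, eyes 15 deg right in the orbit:
print(initial_motor_errors(30.0, 0.0, 15.0))   # (15.0, 30.0) -> a 30 deg head movement implies head-error control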

