Similar Documents
20 similar documents found.
1.
Summary The study was aimed at defining the role of hand (and arm) kinaesthetic information in coordination control of the visuo-oculo-manual tracking system. Baboons were trained to follow slow-moving and stepping visual targets either with the eyes alone or with the eyes and a lever moved by the forelimb about the vertical axis. An LED was attached to the lever extremity. Four oculo-manual tracking conditions were tested and compared to eye-alone tracking: eye and hand tracking of a visual target presented on a screen, eye tracking of the hand, and eye tracking of an imaginary target actively moved by the arm. The performance of the animals, evaluated in terms of latency and of velocity and position precision for both eye and hand movements, was equivalent to that of humans in similar situations. After dorsal root rhizotomy (C1-T2) the animals were unable to produce slow arm motion in response to slow-moving targets. Instead, they produced successions of ballistic-like motions whose amplitude decreased as retraining proceeded. In addition, the animals could no longer respond with smooth pursuit eye movements to an imaginary target actively displaced by the animal's forelimb. It was concluded that the absence of ocular smooth pursuit after the lesion results from the disruption of a signal derived from arm kinaesthetic information and addressed to the oculomotor system. This signal is likely to be used in the control of coordination between arm and eye movements during visuo-oculo-manual tracking tasks. One cause of the animals' inability to achieve slow arm movement in response to slow target motion is thought to be a lesion-induced alteration of the spinal common-pathway dynamics that normally integrate the velocity signal descending from the arm movement command system.

2.
Dysfunction of the cerebellum leads to significant deterioration of movements performed under visual guidance and of co-ordinated eye and hand movement. Visually guided tracking tasks combine both of these control features, as the eyes and hand together track a visual target. To better understand the involvement of the cerebellum in tracking tasks, we used functional magnetic-resonance imaging to study the activation of cerebellar structures in visually guided tracking movements of the eye and hand. Subjects were tested performing ocular tracking, manual tracking without eye movement, or combined eye and hand tracking of a smoothly moving visual target. Three areas were activated in the cerebellum: a bilateral region in the ansiform lobule of the lateral hemisphere, a region in the ipsilateral paramedian lobule and a region in the oculomotor vermis. The ansiform and paramedian areas were most strongly activated by hand movement, although the vermal site was also active. The reverse was found for ocular tracking, with predominantly vermal activation. Activation of these cerebellar cortical areas related to movement of eyes or hand alone was significantly enhanced when the subjects performed co-ordinated eye and hand tracking of a visual target. These results provide the first direct evidence from a functional-imaging study for cerebellar activation in eye and hand co-ordination.

3.
Summary The aim of this study was to examine coordination control in eye and hand tracking of visual targets. We studied eye tracking of a self-moved target, and simultaneous eye and hand tracking of an external visual target moving horizontally on a screen. Predictive features of eye-hand coordination control were studied by introducing a delay (0 to 450 ms) between the Subject's (S's) hand motion and the motion of the hand-driven target on the screen. In self-moved target tracking with artificial delay, the eyes started to move in response to arm movement while the visual target was still motionless, that is before any retinal slip had been produced. The signal likely to trigger smooth pursuit in that condition must be derived from non-visual information. Candidates are efference copy and afferent signals from arm motion. When tracking an external target with the eyes and the hand, in a condition where a delay was introduced in the visual feedback loop of the hand, the Ss anticipated the movement of the target with the arm in order to compensate for the delay. After a short tracking period, Ss were able to track with a low lag, or even to create a lead between the hand and the target. This was observed if the delay was less than 250–300 ms. For larger delays, the hand lagged the target by 250–300 ms. Ss did not completely compensate for the delay and did not, on average, correct for sudden changes in movement of the target (at the direction reversal of the trajectory). Conversely, over the whole range of studied delays (0–450 ms), the eyes were always in phase with the visual target (except during the first part of the first cycle of the movement, as seen previously). These findings are discussed in relation to a scheme in which both predictive (dynamic nature of the motion) and coordination (eye and hand movement system interactive signals) controls are included.
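A minimal sketch of the kind of delayed visual-feedback loop described above, assuming the hand-driven target simply redisplays the hand position sampled delay_ms earlier (the function names, the 10 ms frame period, and the buffer scheme are illustrative assumptions, not the authors' apparatus):

    from collections import deque

    def make_delayed_feedback(delay_ms, dt_ms=10, initial_position=0.0):
        """Return a function mapping the current hand position to the
        position drawn on screen, delayed by delay_ms (assumed 0-450 ms)."""
        n = max(1, round(delay_ms / dt_ms))           # delay in frames
        buffer = deque([initial_position] * n, maxlen=n)

        def update(hand_position):
            delayed = buffer[0]           # oldest sample = position delay_ms ago
            buffer.append(hand_position)  # store current sample for later frames
            return delayed

        return update

    # Example: a 20 ms delay at a 10 ms frame period lags the hand by 2 frames
    screen_target = make_delayed_feedback(20)
    for hand in [0.0, 1.0, 2.0, 3.0]:
        print(screen_target(hand))        # prints 0.0, 0.0, 0.0, 1.0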

4.
We evaluated the role of visual and non-visual information in the control of smooth pursuit movements during tracking of a self-moved target. Previous work has shown that self-moved target tracking is characterised by shorter smooth pursuit latency and higher maximal velocity than eye-alone tracking. In fact, when a subject tracks a visual target controlled by his own arm, eye movement and arm movement are closely synchronised. In the present study, we showed that, in a condition where the direction of motion of a self-moved visual target was opposite to that of the arm (same amplitude, same velocity, but opposite direction of movement), the resulting smooth pursuit eye movements occurred with low latency, and continued for about 140 ms in the direction of the arm movement rather than in the direction of the actual visual target movement. After 140 ms, the eye movement direction reversed through a combination of smooth pursuit and saccades. Subsequently, while arm and visual target still moved in opposite directions, smooth pursuit occurred in pace with the visual target motion. Subjects also completed a series of 60 tracking trials in which the arm-to-target motion relationship was systematically reversed. Under these conditions subjects were able to initiate early smooth pursuit in the actual direction of the visual target. Overall, these results confirm that non-visual information produced by the arm motor system can trigger and control smooth pursuit. They also demonstrate the plasticity of the neuronal network handling eye-arm coordination control.

5.
Summary The processes which develop to coordinate eye and hand movements in response to motion of a visual target were studied in young children and adults. We have shown that functional maturation of the coordination control between eye and hand takes place as a result of training. We observed, in the trained child and in the adult, that when the hand is used either as a target or to track a visual target, the dynamic characteristics of the smooth pursuit system are markedly improved: the eye-to-target delay is decreased from 150 ms in eye-alone tracking to 30 ms, and smooth pursuit maximum velocity is increased by 100%. Coordination signals between arm and eye motor systems may be responsible for smooth pursuit eye movements which occur during self-tracking of hand or finger in darkness. These signals may also account for the higher velocity smooth pursuit eye movements and the shortened tracking delay when the hand is used as a target, as well as for the synkinetic eye-arm motions observed at the early stage of oculo-manual tracking training in children. We propose a model to describe the interaction which develops between two systems involved in the execution of a common sensorimotor task. The model applies to the visuo-oculo-manual tracking system, but it may be generalized to other coordinated systems. According to our definition, coordination control results from the reciprocal transfer of sensory and motor information between two or more systems involved in the execution of single, goal-directed or conjugate actions. This control, originating in one or more highly specialized structures of the central nervous system, combines with the control processes normally operating in each system. Our model relies on two essential notions which describe the dynamic and static aspects of coordination control: timing and mutual coupling.

6.
Summary Two monkeys were trained to track a continuously moving target using a joystick. One then had a cooling probe implanted in nucleus interpositus of the cerebellum ipsilateral to his tracking arm. The other had a cannula implanted in the ipsilateral cortex of the lateral cerebellum through which local anaesthetic could be infused. Both monkeys showed similar tracking deficits during temporary inactivation of the cerebellum. The main effects seen were an increase in the peak velocity of their intermittent corrective tracking movements, and a decrease in the accuracy of these movements. Linear regression analyses were undertaken of the peak velocity and amplitude of each corrective movement against a number of possible control signals (target velocity, target position, error, error velocity, etc.). The initially strong correlation of the amplitude of each movement with target velocity was severely reduced during cerebellar inactivation, and movement amplitude became better predicted by the error between target and joystick positions. The peak velocity of movements became more strongly correlated with movement amplitude and less correlated with target velocity than in the intact animal. These results are consistent with the hypothesis that intermittent tracking is achieved by the production of primitive movements that are then adjusted to the correct amplitude and velocity required to catch up with the moving target. Our findings suggest that the cerebellum may normally be responsible for these adjustments, using visual and memorised cues about the target. The velocity of each movement may be reduced, and its amplitude adjusted, by combining measures of the current error with estimates of target speed and direction. We conclude that the cerebellum has an inhibitory role in tuning movements during visuo-motor tasks and that optimal tuning using feedforward measurements of target motion cannot be made without it.
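A minimal sketch of the kind of linear regression analysis described above, fitting the amplitude of each corrective movement against one candidate control signal at a time (variable names and the synthetic data are illustrative assumptions, not the authors' data or code):

    import numpy as np

    def fit_single_predictor(amplitude, predictor):
        """Ordinary least squares of movement amplitude on one candidate
        control signal; returns slope, intercept and R^2."""
        X = np.column_stack([predictor, np.ones_like(predictor)])
        (slope, intercept), res, _, _ = np.linalg.lstsq(X, amplitude, rcond=None)
        ss_tot = np.sum((amplitude - amplitude.mean()) ** 2)
        r2 = 1.0 - res[0] / ss_tot if res.size else 1.0
        return slope, intercept, r2

    # Synthetic example: corrective-movement amplitudes plus two candidate signals
    rng = np.random.default_rng(0)
    target_velocity = rng.uniform(5, 25, 200)            # deg/s at movement onset
    position_error  = rng.uniform(-5, 5, 200)            # target - joystick, deg
    amplitude = 0.08 * target_velocity + rng.normal(0, 0.2, 200)  # "intact" case

    for name, signal in [("target velocity", target_velocity),
                         ("position error", position_error)]:
        print(name, fit_single_predictor(amplitude, signal))

Comparing R^2 across candidate signals, before and during inactivation, is the kind of contrast the abstract reports: amplitude tracked target velocity in the intact animal but was better predicted by positional error during cerebellar inactivation.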

7.
Summary Some aspects of the manner in which the central nervous system uses sensory information for the guidance of eye and arm movements were investigated. When subjects experience apparent motion of their restrained forearm, induced by vibration of their biceps muscle in the dark, they are able to pursue with their eyes at least part of this motion and to point with their nonvibrated limb to the apparent location of the vibrated arm. The presence of a small target light on the vibrated hand limits the extent of illusory change in limb position and results in illusory motion of the target light in the same direction as the arm motion. When asked to indicate the spatial position of the light or hand, subjects still point with their nonvibrated arm to the apparent locations. Although visual pursuit of the illusory motion of the forearm can still be elicited in the presence of the target light on the hand, the subjects' eyes remain steadily fixating the stationary target light when they are instructed to track its illusory motion. These findings demonstrate that sensory and motor factors affecting the perception of visual direction and the guidance of arm and eye movements can be differentially employed at several levels of central nervous control.

8.
The initiation of coupled eye and arm movements was studied in six patients with mild cerebellar dysfunction and in six age-matched control subjects. The experimental paradigm consisted of 40 deg step-tracking elbow movements made under different feedback conditions. During tracking with the eyes only, saccadic latencies in patients were within normal limits. When patients were required to make coordinated eye and arm movements, however, eye movement onset was significantly delayed. In addition, removal of visual information about arm versus target position had a pronounced differential effect on movement latencies. When the target was extinguished for 3 s immediately following a step change in target position, both eye and arm onset times were further prolonged compared to movements made to continuously visible targets. When visual information concerning arm position was removed, onset times were reduced. Eye and arm latencies in control subjects were unaffected by changes in visual feedback. The results of this study clearly demonstrate that, in contrast to earlier reports of normal saccadic latencies associated with cerebellar dysfunction, initiation of both eye and arm movements is prolonged during coordinated visuomotor tracking, thus supporting a coordinative role for the cerebellum during oculo-manual tracking tasks.

9.
The ocular-following response is a slow tracking eye movement that is elicited by sudden drifting movements of a large-field visual stimulus in primates. It helps to stabilize the eyes on the visual scene. Previous single-unit recordings and chemical lesion studies have reported that the ocular-following response is mediated by a pathway that includes the medial superior temporal (MST) area of the cortex and the ventral paraflocculus (VPFL) of the cerebellum. Using a linear regression model, we systematically analyzed the quantitative relationships between the complex temporal patterns of neural activity at each level and the sensory input and motor output signals (acceleration, velocity, and position) during ocular following. The results revealed the following: (1) the temporal firing pattern of the MST neurons locally encodes the dynamic properties of the visual stimulus within a limited range. On the other hand, (2) the temporal firing pattern of the Purkinje cells in the cerebellum globally encodes almost the entire motor command for the ocular-following response. We conclude that the cerebellum is the major site of the sensory-to-motor transformation necessary for ocular following, where population coding is integrated into rate coding.
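The abstract above describes a linear regression of firing rate on acceleration, velocity, and position. A generic form of such a model, written here only as an illustration (the coefficients a, b, c, the bias f0, and the latency term δ are assumptions, not values from the paper), is:

    f(t) ≈ a · acceleration(t − δ) + b · velocity(t − δ) + c · position(t − δ) + f0

where f(t) is the instantaneous firing rate and the right-hand side is evaluated on the visual stimulus for the sensory (MST) comparison or on the eye movement for the motor (Purkinje cell) comparison.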

10.
Multimodal reference frame for the planning of vertical arms movements
In this study we investigated the reference frames used to plan arm movements. Specifically, we asked whether the body axis, visual cues and graviception can each play a role in defining "up" and "down" in the planning and execution of movements along the vertical axis. Horizontal and vertical pointing movements were tested in two postures (upright and reclined) and two visual conditions (with and without vision) to identify possible effects of each of these cues on kinematics of movement. Movements were recorded using an optical 3D tracking system and analysis was conducted on velocity profiles of the hand. Despite a major effect of gravity, our analysis shows an effect of the movement direction with respect to the body axis when subjects were reclined with eyes closed. These results suggest that our CNS takes into account multimodal information about vertical in order to compute an optimal motor command that anticipates the effects of gravity.

11.
Eye–hand coordination is a crucial element of goal-directed movements. However, few studies have looked at the extent to which unconstrained movements of the eyes and hand made to targets influence each other. We studied human participants who moved either their eyes or both their eyes and hand to one of three static or flashed targets presented in 3D space. The eyes were directed, and the hand was located, at a common start position on either the right or left side of the body. We found that the velocity and scatter of memory-guided saccades (flashed targets) differed significantly when produced in combination with a reaching movement compared with when produced alone. Specifically, when accompanied by a reach, peak saccadic velocities were lower than when the eye moved alone. Peak saccade velocities, as well as latencies, were also highly correlated with those for reaching movements, especially for the briefly flashed targets compared to the continuously visible target. The scatter of saccade endpoints was greater when the saccades were produced with the reaching movement than when produced without, and the size of the scatter for both saccades and reaches was weakly correlated. These findings suggest that the saccades and reaches made to 3D targets are weakly to moderately coupled both temporally and spatially and that this is partly the result of the arm movement influencing the eye movement. Taken together, this study provides further evidence that the oculomotor and arm motor systems interact above and beyond any common target representations shared by the two motor systems.

12.
When tracking moving visual stimuli, primates orient their visual axis by combining two kinds of eye movements, smooth pursuit and saccades, that have very different dynamics. Yet, the mechanisms that govern the decision to switch from one type of eye movement to the other are still poorly understood, even though they could contribute significantly to the understanding of how the CNS combines different kinds of control strategies to achieve a common motor and sensory goal. In this study, we investigated the oculomotor responses to a large range of different combinations of position error and velocity error during visual tracking of moving stimuli in humans. We found that the oculomotor system uses a prediction of the time at which the eye trajectory will cross the target, defined as the "eye crossing time" (T_XE). The eye crossing time, which depends on both position error and velocity error, is the criterion used to switch between smooth and saccadic pursuit, i.e., to trigger catch-up saccades. On average, for T_XE between 40 and 180 ms, no saccade is triggered and target tracking remains purely smooth. Conversely, when T_XE becomes smaller than 40 ms or larger than 180 ms, a saccade is triggered after a short latency (around 125 ms).
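A minimal sketch of the switching rule described above. The 40–180 ms window and the ~125 ms saccade latency come from the abstract; the expression T_XE ≈ position error / velocity error is an assumption based on the stated definition of the eye-crossing time, not a formula quoted from the paper:

    def eye_crossing_time(position_error, velocity_error):
        """Predicted time (s) for the eye trajectory to cross the target,
        assuming T_XE ~= position error / velocity error (retinal slip)."""
        if velocity_error == 0:
            return float("inf")
        return position_error / velocity_error

    def should_trigger_saccade(position_error, velocity_error,
                               lower=0.040, upper=0.180):
        """Trigger a catch-up saccade (after ~125 ms latency) when T_XE falls
        outside the 40-180 ms window; otherwise keep tracking smooth."""
        t_xe = eye_crossing_time(position_error, velocity_error)
        return not (lower <= t_xe <= upper)

    # Example: 1 deg position error, 10 deg/s velocity error -> T_XE = 100 ms
    print(should_trigger_saccade(1.0, 10.0))   # False: smooth pursuit continues
    print(should_trigger_saccade(3.0, 10.0))   # True: T_XE = 300 ms > 180 ms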

13.
The present study evaluated the role of eye movements for manual adaptation to reversed vision. Subjects tracked a visual target using a mouse-driven cursor. In Experiment A, they were instructed to look at the target, look at the cursor, fixate straight ahead, or received no instructions regarding eye movements (Groups T, C, F, and N, respectively). Experiment B involved Groups T and C only. In accordance with the literature, baseline manual tracking was more accurate when subjects were instructed to move their eyes rather than to fixate straight ahead. In contrast, no such benefit was observed for the adaptive improvement of tracking. We therefore concluded that transfer of information from the oculomotor to the hand motor system enhances the ongoing control of hand movements but not their adaptive modification, probably because the large computational demand of adaptation does not allow additional processing of supplementary oculomotor signals. We further found adaptation to be worse in T than in any other group. In particular, adaptation was worse in T than in C although eye movements were the same: subjects in both groups moved their eyes in close relationship with the target rather than the cursor, Group C thus disobeying our instructions. The deficient performance of Group T is therefore not related to eye movements per se, but rather to our instructions. We conclude that an independently moving target strongly attracts eye movements independent of instruction (i.e. Groups T and C), but instructions may redirect spatially selective attention (i.e. Group T vs. C), and thus influence adaptation.

14.
Limited knowledge is available regarding the processes by which the brain codes the velocity of visual targets with respect to the observer. Two models have been previously proposed to describe the visual target localization mechanism. Both assume that the necessary information is derived from the coding of the position of the eye in the orbit, either through a copy of the muscular activation (outflow model) or through eye muscle proprioception (inflow model). Eye velocity coding might be derived from velocity-sensitive ocular muscle proprioceptors or from position-coding signals through differentiation. We used techniques based on manual pointing and manual tracking of a visual target, combined with passive deviation of one covered eye, to demonstrate that ocular muscle proprioception is involved in (i) eye-in-head position coding, hence in target localization function; (ii) long-term maintenance of ocular alignment (phoria); and (iii) sensing of visual target velocity with respect to the head. These observations support other data now available, describing the processes by which the brain codes position and velocity of visual targets. Such findings might interest engineers in the field of robotics who are facing the problem of providing robots with the ability to sense object position and velocity in order to create an internal model of their working environment. One of the authors (G. M. Gauthier) had the privilege of starting his scientific career with Prof. Lawrence Stark, to whom this work is dedicated. The project was supported by CNRS UA 372, INSERM 896007, and Human Frontier Science Program grants. J. Blouin was a postdoctoral fellow supported by the grant and by le Conseil de recherches en sciences naturelles et en génie du Canada. We thank David S. Zee for contributing to the part of the work reported here on induced phoria changes.

15.
This experiment investigated the relative extent to which different signals from the visuo-oculomotor system are used to improve accuracy of arm movements. Different visuo-oculomotor conditions were used to produce various retinal and extraretinal signals leading to a similar target amplitude: (a) fixating a central target while pointing to a peripheral visual target, (b) tracking a target through a smooth pursuit movement and then pointing to the target when its excursion ceased, and (c) pointing to a target reached previously by a saccadic eye movement. The experiment was performed with a deafferented subject and control subjects. For the deafferented patient, the absence of proprioception prevented any comparison between internal representations of target and limb (through proprioception) positions during the arm movement. The deafferented patient's endpoint therefore provided a good estimate of the accuracy of the target coordinates used by the arm motor system. The deafferented subject showed relatively good accuracy when producing a saccade prior to pointing, but large overshooting in the fixation condition and undershooting in the pursuit condition. The results suggest that the deafferented subject does use oculomotor signals to program arm movement and that signals associated with fast movements of the eyes are better for pointing accuracy than slow ramp movements. The inaccuracy of the deafferented subject when no eye movement is allowed (the condition in which the controls were the most accurate) suggests that, in this condition, a proprioceptive map is involved in which both the target and the arm are represented.

16.
Ocular gaze is anchored to the target of an ongoing pointing movement
It is well known that, typically, saccadic eye movements precede goal-directed hand movements to a visual target stimulus. Pointing is also generally more accurate when the pointing target is gazed at. In this study, it is hypothesized that saccades not only precede pointing but that gaze is also stabilized during pointing in humans. Subjects, whose eye and pointing movements were recorded, had to make a hand movement and a saccade to a first target. At arm movement peak velocity, when the eyes are usually already fixating the first target, a new target appeared, and subjects had to make a saccade toward it (dynamic trial type). In the static trial type, a new target was offered when pointing was just completed. In a control experiment, a sequence of two saccades had to be made, with two different interstimulus intervals (ISI), comparable with the ISIs found in the first experiment for dynamic and static trial types. In a third experiment, ocular fixation position and pointing target were dissociated: subjects pointed at targets they did not fixate. The results showed that latencies of saccades toward the second target were on average 155 ms longer in the dynamic trial types, compared with the static trial types. Saccades evoked during pointing appeared to be delayed by approximately the remaining deceleration time of the pointing movement, resulting in "normal" residual saccadic reaction times (RTs), measured from pointing movement offset to saccade movement onset. In the control experiment, the latency of the second saccade was on average only 29 ms larger when the two targets appeared with a short ISI compared with trials with long ISIs. Therefore the saccadic refractory period cannot be responsible for the substantially larger delays that were found in the first experiment. The observed saccadic delay during pointing is modulated by the distance between ocular fixation position and pointing target. The largest delays were found when the targets coincided, the smallest delays when they were dissociated. In sum, our results provide evidence for an active saccadic inhibition process, presumably to keep steady ocular fixation at a pointing target and its surroundings. Possible neurophysiological substrates that might underlie the reported phenomena are discussed.

17.
Visuomotor control of the arm was assessed in a single case study of a subject with focal lesions in the cerebellum and brainstem. A dissociation between 'on-line' and 'off-line' visuomotor control was revealed: impairments in 'on-line' visuomotor control included inaccuracy of tracking velocity, increase in spatial pointing variability and a delay in simple reaction time; whereas the patient was able to adapt to a gain change in 'off-line' visual feedback during a pointing task, and his adaptation was less affected than that of control subjects by trial-to-trial random fluctuations in 'off-line' visual feedback. We conclude that focal damage in the cerebellar peduncles may be principally responsible for this dissociation.

18.
The coordination of the oculomotor and manual effector systems is an important component of daily motor behavior. Previous work has primarily examined oculomotor/manual coordination in discrete targeting tasks. Here we extend this work to learning a tracking task that requires continuous response and movement update. Over two sessions, participants practiced controlling a computer mouse with movements of their arm to follow a target moving in a repeated sequence. Eye movements were also recorded. In a retention test, participants demonstrated sequence-specific learning with both effector systems, but differences between effectors also were apparent. Time-series analysis and multiple linear regression were employed to probe spatial and temporal contributions to overall tracking accuracy within each effector system. Sequence-specific oculomotor learning occurred only in the spatial domain. By contrast, sequence-specific learning at the arm was evident only in the temporal domain. There was minimal interdependence in error rates for the two effector systems, underscoring their independence during tracking. These findings suggest that the oculomotor and manual systems learn contemporaneously, but performance improvements manifest differently and rely on different elements of motor execution. The results may in part reflect what the motor learning system values for each effector, given that effector's inertial properties.

19.
Control strategies in directing the hand to moving targets
Summary We have evaluated the use of visual information about the movement of a target in two tasks, tracking and interception, involving multi-joint reaching movements with the arm. Target velocity was either varied in a pseudorandom order (random condition) or kept constant (predictable condition) across trials. Response latency decreased as target velocity increased in each condition. A simple model, which assumes that latency is the sum of two components (the time taken for target motion to be detected and a fixed processing time), provides a good fit to the data. Results from a step-ramp experiment, in which the target stepped a small distance immediately preceding the onset of the ramp motion, were consistent with this model. The characteristics of the first 100 ms of the response depended on the amount of information about target motion available to the subject. In the tracking task with randomly varied target velocities, the initial changes in hand velocity were largely independent of target velocity. In contrast, when the velocity was predictable the initial hand velocity depended on target velocity. Analogously, the initial changes in the direction of hand motion in the interception task were independent of target velocity in the random condition, but depended on target velocity in the predictable condition. The time course for the development of response dependence was estimated by controlling the amount of visual information about target velocity available to the subject before the onset of limb movement. The results suggest that when target velocity was random, hand movement started before visual motion processing was complete. The response was subsequently adjusted after target velocity was computed. Subjects displayed idiosyncratic strategies during the catch-up phase in the tracking task. The peak hand velocity depended on target velocity and was similar for all subjects. The time at which the peak occurred, in contrast, varied substantially among subjects. In the interception task the hand paths were straighter in the predictable than in the random condition. This appeared to be the result of making adjustments in movement direction in the former condition to correct for initially inappropriate responses.
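A minimal sketch of the two-component latency model described above. The fixed processing time and the idea that detection time falls as target velocity rises (here via a fixed displacement threshold) follow the abstract's description; the specific parameter values and the threshold formulation are illustrative assumptions:

    def predicted_latency(target_velocity_deg_s, detection_threshold_deg=0.5,
                          fixed_processing_ms=150.0):
        """Latency = time for the target to move far enough to be detected
        plus a fixed processing time. Faster targets -> shorter detection time."""
        detection_ms = 1000.0 * detection_threshold_deg / target_velocity_deg_s
        return detection_ms + fixed_processing_ms

    for v in (5.0, 10.0, 20.0):                    # target velocity in deg/s
        print(v, predicted_latency(v))             # latency decreases with velocity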

20.
Coordination between the eyes and the hand is likely to be based on a process of motor learning, so that the interactions between the two systems can be accurately controlled. By using an unusual tracking task we measured the change in brain activation levels, as recorded with 3T functional magnetic resonance imaging (fMRI), between naïve human subjects and the same subjects after a period of extended training. Initially the performance of the two groups was similar. One subject group was then trained in a synchronous, coordinated eye–hand task; the other group trained with a 304 ms temporal offset between hand and eye tracking movements. After training, different patterns of performance were observed for the groups, and different functional activation profiles. Significant change in the relationship between functional activation levels and eye–hand task conditions was predominantly restricted to visuo-motor areas of the lateral and vermal cerebellum. In an additional test with one of the subject groups, we show that there was increased cerebellar activation after learning, irrespective of change in performance error. These results suggest that two factors contribute to the measured blood oxygen level-dependent (BOLD) signal. One declined with training and may be directly related to performance error. The other increased after training, in the test conditions nearest to the training condition, and may therefore be related to acquisition of experience in the task. The loci of activity changes suggest that improved performance is due to selectively modified processing of ocular and manual control signals within the cerebellum. These results support the suggestion that coordination between eye and hand movement is based on an internal model acquired by the cerebellum that provides predictive signals linking the control of the two effectors.
