Direct coupling of haptic signals between hands

Authors: Lucile Dupin, Vincent Hayward, Mark Wexler

Affiliations: (a) Laboratoire Psychologie de la Perception, CNRS and Université Paris Descartes, 75006 Paris, France; (b) Sorbonne Universités, Université Pierre et Marie Curie, Paris 06, Unité Mixte de Recherche 7222, Institut des Systèmes Intelligents et de Robotique, 75005 Paris, France

Abstract: Although motor actions can profoundly affect the perceptual interpretation of sensory inputs, it is not known whether the combination of sensory and movement signals occurs only for sensory surfaces undergoing movement, or whether it is a more general phenomenon. In the haptic modality, the independent movement of multiple sensory surfaces poses a challenge to the nervous system when combining the tactile and kinesthetic signals into a coherent percept. When exploring a stationary object, the tactile and kinesthetic signals come from the same hand. Here we probe the internal structure of haptic combination by directing the two signal streams to separate hands: one hand moves but receives no tactile stimulation, while the other hand feels the consequences of the first hand's movement but remains still. We find that both discrete and continuous tactile and kinesthetic signals are combined as if they came from the same hand. This combination proceeds by direct coupling or transfer of the kinesthetic signal from the moving to the feeling hand, rather than by assuming the displacement of a mediating object. The combination of signals is due to perception rather than inference, because a small temporal offset between the signals significantly degrades performance. These results suggest that the brain simplifies the complex coordinate-transformation task of remapping sensory inputs to take into account the movements of multiple body parts in haptic perception, and they show that the effects of action are not limited to moving sensors.

Motor and kinesthetic signals arising from the movement of the eyes in the head, and from the translation of the eyes and ears in space due to head and body movements, have been shown to play an important role in visual (1–3) and auditory (4–6) perception. However, because of the small number of sensory surfaces in these modalities, and the rigid constraints on their movement, the number of kinesthetic degrees of freedom is limited.
In active touch, or the haptic modality (7–13), the large number of sensory surfaces, and the nearly unlimited ways these surfaces can move, raise the question of how movement can be represented and associated with the cutaneous or tactile signals. To study how tactile and kinesthetic cues are combined to haptically perceive object shape and size, we created a novel haptic stimulus in which these cues were completely dissociated. This stimulus consisted of simulated triangles felt through a narrow slit, as in anorthoscopic perception in vision (14–16) or haptic perception (17), and as illustrated in the figure below.

Figure: Tactile stimuli and movement conditions. (A) An example of the tactile stimulus, here an expanding bar. The stimulus is shown as a series of "snapshots," with time running to the right; the actual stimulus was continuous. The red rectangle represents the vibrating pins. (B) SAME condition: the same hand moves and experiences the tactile consequences of the movement. The hand moves the tactile display (green rectangle) mounted on the slider. In this example, a forward movement (away from the participant, upward in the figure), together with an expanding bar, simulates a backward-pointing triangle (shown in red) felt through a slit. (C) DIFF condition: one hand moves, while the other hand, immobile, receives the tactile stimulus. The tactile signal (expanding bar) and kinesthetic signal (forward movement) are the same as in the previous example. Will the observer perceive simply an expanding bar, or a triangle in space? If a triangle is perceived, in which direction will it point? (D) IMMOB condition: the tactile signal is presented alone, with no movement. The expanding or contracting bar's width as a function of time is a replay of a previous trial. (E) Correct spatial orientation of the triangle in the SAME condition, as a function of the movement direction (forward or backward) and the tactile stimulus (expansion or contraction).
This truth table is an exclusive-or function, which has null correlation with each of its two input signals; using either signal alone will therefore result in chance performance. The tactile signal consisted of a line that expanded or contracted on the index finger, delivered by a tactile display composed of pins that could vibrate independently. In one condition (SAME), the display was mounted on a slider that participants slid along a track perpendicular to the tactile expansion or contraction. The slider's position was used to update the tactile display so as to simulate stationary triangles of various lengths, oriented toward or away from the participant, felt through a virtual slit that moved with the finger. Would participants perceive an extended triangular shape, rather than the proximal stimulus of expansion or contraction, and if so, which orientation would they perceive? For example, a backward-pointing triangle could result from a tactile expansion coupled with a forward movement (forward motions and orientations, away from the participant, are shown as upward in the figure), or from a contraction coupled with a backward movement, and vice versa for a forward-pointing triangle. Thus, the directions of the tactile stimulus and of the finger movement are insufficient by themselves to yield veridical perception of triangle orientation; they must be combined in an exclusive-or function. There are no purely tactile shape cues: the triangle's edges are not slanted in our tactile stimulus (in contrast to the edges of an actual triangle behind a slit). Therefore, from a functional point of view, the only way to obtain triangle orientation is for the nervous system to make use of the exclusive-or rule on some level.
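The exclusive-or rule above can be sketched as follows. This is a minimal illustration, not the authors' implementation; the boolean encoding and function name are assumptions introduced here.

```python
def perceived_orientation(movement_forward: bool, tactile_expanding: bool) -> str:
    """Veridical triangle orientation from the pairing of the kinesthetic
    signal (movement direction) and the tactile signal (expansion vs.
    contraction), per the truth table in the text:

        forward movement + expansion    -> backward-pointing triangle
        backward movement + contraction -> backward-pointing triangle
        the two mixed pairings          -> forward-pointing triangle

    i.e., an exclusive-or of the two signal directions.
    """
    return "backward" if movement_forward == tactile_expanding else "forward"
```

Note that for either input taken alone, both orientations occur equally often across the values of the other input, which is why each single signal is uncorrelated with the correct answer and yields chance performance by itself.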
Below we will show that this rule is applied perceptually rather than through cognitive inference. Crucially, in the different-hands (DIFF) condition we separated the tactile and kinesthetic stimuli: participants moved the slider with one hand and received the tactile stimulation through the stationary fingertip of the other hand. The position of the moving hand was measured and used to update the tactile display, as in the SAME condition. Finally, in the immobile (IMMOB) condition, neither hand moved, and the durations of the tactile expansions and contractions were the same as in the other conditions. Thus, in both the DIFF and IMMOB conditions, we simulated a moving triangle felt through a stationary slit. In all conditions, participants reported both the perceived orientation and the perceived size of the triangle using a visual probe at the end of each trial. Participants closed their eyes during the movement and tactile stimulation, and therefore could not see the display during the crucial part of each trial. In the movement conditions (SAME and DIFF), the direction and speed of finger movement varied from trial to trial. Each of the three conditions was performed in a separate block for each hand, with the blocks presented in random order.
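The way the display could be driven in the three conditions can be sketched as follows. The triangle geometry (a linear taper from apex to base) and all names here are illustrative assumptions introduced for clarity, not the authors' actual stimulus code.

```python
def bar_width(slit_pos: float, apex_pos: float, length: float,
              base_width: float) -> float:
    """Width of the simulated tactile bar when the virtual slit is at
    slit_pos, for a triangle whose apex sits at apex_pos and whose base
    lies `length` units away along the movement axis. The width grows
    linearly from 0 at the apex to base_width at the base."""
    d = abs(slit_pos - apex_pos)
    if d > length:
        return 0.0  # slit has moved past the triangle
    return base_width * d / length

# SAME / DIFF: slit_pos is the measured position of the moving hand, so a
# forward movement over a triangle whose apex faces the participant yields
# an expanding bar (in DIFF, on the other, stationary fingertip).
# IMMOB: no position signal is available; the recorded width-vs-time trace
# of a previous trial is simply replayed on the display.
```

On this sketch, the tactile signal carries only the bar's instantaneous width; recovering the triangle's orientation still requires combining that signal with the movement direction, as described above.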
Keywords: touch, haptics, perception, sensorimotor integration, kinesthesis