Similar Documents
20 similar documents found (search time: 797 ms)
1.
In studies of rhythmic coordination, where sensory information is often generated by an auditory stimulus, spatial and temporal variability are known to decrease at points in the movement cycle coincident with the stimulus, a phenomenon known as anchoring (Byblow et al. 1994). Here we hypothesize that the role of anchoring may be to globally stabilize coordination under conditions in which it would otherwise undergo a global coordinative change such as a phase transition. To test this hypothesis, anchoring was studied in a bimanual coordination paradigm in which either inphase or antiphase coordination was produced as auditory pacing stimuli (and hence movement frequency) were scaled over a wide range of frequencies. Two different anchoring conditions were used: a single-metronome condition, in which peak amplitude of right finger flexion coincided with the auditory stimulus; and a double-metronome condition, in which each finger reversal (flexion and extension) occurred simultaneously with the auditory stimuli. Anchored reversal points displayed lower spatial variation than unanchored reversal points, resulting in more symmetric phase plane trajectories in the double- than the single-metronome condition. The global coordination dynamics of the double-metronome condition was also more stable, with transitions from antiphase to inphase occurring less often and at higher movement frequencies than in the single-metronome condition. An extension of the Haken-Kelso-Bunz model of bimanual coordination is presented briefly which includes specific coupling of sensory information to movement through a process we call parametric stabilization. The parametric stabilization model provides a theoretical account of both local effects on the individual movement trajectories (anchoring) and global stabilization of observed coordination patterns, including the delay of phase transitions.
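The phase transitions described in this abstract follow from the classic Haken-Kelso-Bunz relative-phase dynamics. As a minimal sketch (using the standard HKB equation, not the authors' parametric-stabilization extension, and with illustrative parameter values): the relative phase φ between the two fingers evolves as dφ/dt = −a·sin(φ) − 2b·sin(2φ), where the ratio b/a falls as movement frequency rises; the antiphase fixed point φ = π loses stability once b/a drops below 0.25, producing the antiphase-to-inphase transition.

```python
import math

def hkb_rhs(phi, a, b):
    """Right-hand side of the HKB relative-phase equation:
    dphi/dt = -a*sin(phi) - 2*b*sin(2*phi)."""
    return -a * math.sin(phi) - 2.0 * b * math.sin(2.0 * phi)

def antiphase_stable(a, b):
    """Linear stability of the antiphase fixed point phi = pi.
    The slope of the right-hand side at phi = pi is a - 4*b,
    so antiphase is stable only while b/a > 0.25."""
    return (a - 4.0 * b) < 0.0

# Low movement frequency (large b/a): antiphase is a stable pattern.
print(antiphase_stable(a=1.0, b=1.0))
# High movement frequency (small b/a): antiphase destabilizes,
# so the system transitions to inphase.
print(antiphase_stable(a=1.0, b=0.1))
```

In this picture, the parametric stabilization proposed by the authors would act to keep the antiphase pattern viable to higher frequencies, consistent with the delayed transitions they report in the double-metronome condition.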

2.
We examined to what extent the CNS can efficiently bind together the perception of non-coincident multimodal events with coordinated movements. To do so, we selected a bimanual coordination task with left–right asymmetry, namely 3:2 polyrhythmic movements. We asked participants to synchronize left and right fingers’ movements to events presented, respectively, to the left and to the right side. In two segregated conditions, sound was presented on one side at one frequency while touch was presented on the other side at the other frequency; thus, the left and right rhythms were paced via a distinct sensory modality. In the three control conditions, the stimuli on both sides were presented via the same sensory modality: sound, touch, or coincident sound and touch. Our aim was to contrast two opposing hypotheses: Sensory segregated pacing (1) stabilizes polyrhythmic coordination because it favors the distinction between the fast and the slow rhythm versus (2) destabilizes polyrhythmic coordination because it introduces a very strong asymmetry. We performed a parametric study in which the ability to maintain the polyrhythmic coordination was explored over a broad range of pacing rates. We found that switches from the polyrhythmic coordination to an isofrequency pattern took place only in the sensory segregated conditions, at the highest frequencies. Moreover, transitions were preceded by an increase in the variability of the synchronization of movement to stimuli. We therefore propose that the destabilization originating from the asymmetry between sensory modalities overrides the assumed segregation effect. We discuss the possible neuronal underpinnings of this failure of binding of movement to segregated sound and touch.

3.
We used event-related functional magnetic resonance imaging to study the neural correlates of endogenous spatial attention for vision and touch. We examined activity associated with attention-directing cues (central auditory pure tones), symbolically instructing subjects to attend to one hemifield or the other prior to upcoming stimuli, for a visual or tactile task. In different sessions, subjects discriminated either visual or tactile stimuli at the covertly attended side, during bilateral visuotactile stimulation. To distinguish cue-related preparatory activity from any modulation of stimulus processing, unpredictably on some trials only the auditory cue was presented. The use of attend-vision and attend-touch blocks revealed whether preparatory attentional effects were modality-specific or multimodal. Unimodal effects of spatial attention were found in somatosensory cortex for attention to touch, and in occipital areas for attention to vision, both contralateral to the attended side. Multimodal spatial effects (i.e. effects of attended side irrespective of task-relevant modality) were detected in contralateral intraparietal sulcus, traditionally considered a multimodal brain region; and also in the middle occipital gyrus, an area traditionally considered purely visual. Critically, all these activations were observed even on cue-only trials, when no visual or tactile stimuli were subsequently presented. Endogenous shifts of spatial attention result in changes of brain activity prior to the presentation of target stimulation (baseline shifts). Here, we show for the first time the separable multimodal and unimodal components of such preparatory activations. Additionally, irrespective of the attended side and modality, attention-directing auditory cues activated a network of superior frontal and parietal association areas that may play a role in voluntary control of spatial attention for both vision and touch.

4.
An increasing number of animal and human studies suggests that different sensory systems share spatial representations in the brain. The aim of the present study was to test whether attending to auditory stimuli presented at a particular spatial location influences the processing of tactile stimuli at that position and vice versa (crossmodal attention). Moreover, it was investigated which processing stages are influenced by orienting attention to a certain stimulus modality (intermodal attention). Event-related brain potentials (ERPs) were recorded from 15 participants while tactile and auditory stimuli were presented at the left or right side of the body midline. The task of the participants was to attend either to the auditory or to the tactile modality, and to respond to infrequent double-stimuli on either the left or right side. Results showed that spatial attention modulated both early and late somatosensory and auditory ERPs when touch and tones were relevant, respectively. Moreover, early somatosensory (N70–100, N125–175) and auditory (N100–170) potentials, but not later deflections, were affected by spatial attention to the other modality, suggesting bi-directional crossmodal links between hearing and touch. Additionally, ERPs were modulated by intermodal selection mechanisms: stimuli elicited enhanced negative early and late ERPs when they belonged to the attended modality compared to those that belonged to the unattended modality. The present results provide evidence for the parallel influence of spatial and intermodal selection mechanisms at early processing stages while later processing steps are restricted to the relevant modality.

5.
Understanding how we synchronize our actions with stimuli from different sensory modalities plays a central role in helping to establish how we interact with our multisensory environment. Recent research has shown better performance with multisensory over unisensory stimuli; however, the type of stimuli used has mainly been auditory and tactile. The aim of this article was to expand our understanding of sensorimotor synchronization with multisensory audio-visual stimuli and compare these findings to their individual unisensory counterparts. This research also aims to assess the role of spatio-temporal structure for each sensory modality. The visual and/or auditory stimuli had either temporal or spatio-temporal information available and were presented to the participants in unimodal and bimodal conditions. Globally, the performance was significantly better for the bimodal compared to the unimodal conditions; however, this benefit was limited to only one of the bimodal conditions. In terms of the unimodal conditions, the level of synchronization with visual stimuli was better than auditory, and while there was an observed benefit with the spatio-temporal compared to temporal visual stimulus, this was not replicated with the auditory stimulus.

6.
Two positron-emission tomography (PET) experiments explored the neural basis of selective spatial attention in vision and touch, testing for modality-specific versus multimodal activations due to attended side. In the first study, either light flashes or finger vibrations were presented bilaterally. Twelve healthy volunteers were scanned while sustaining covert attention on the left or right hemifield within each modality. The main effect for attending right minus left, across both modalities, revealed bimodal spatial attention effects in the left intraparietal sulcus and left occipitotemporal junction. Modality-specific attentional effects (again, for attending right vs. left) were found in the left superior occipital gyrus for vision, and left superior postcentral gyrus for touch. No significant activations were seen for attending left minus right. The second study presented only tactile stimuli, manipulating whether the eyes were open or closed, and including passive stimulation and rest baselines. The unimodal activation for tactile spatial attention in the left superior postcentral gyrus was replicated. The bimodal activation of the left intraparietal sulcus observed in the first study was now found for touch, but only when the eyes were open (hands visible), apparently confirming its multimodal nature. These results reveal mechanisms of sustained spatial attention operating at both modality-specific and multimodal levels.

7.
A series of experiments evaluated whether the habituation of the startle response of the rat to tactile and auditory cues is stimulus specific. Experiment 1 showed stimulus specificity of a short-term habituation effect, whereby the startle to the second of a pair of stimuli was significantly less when the initial stimulus involved the same rather than the different modality. Experiments 2 and 3 focused on the more persistent decrement in startle that is a result of repeated stimulation, and demonstrated that such long-term habituation to the tactile and auditory stimuli contained a stimulus specific component in addition to a generalized component. The generalized habituation observed between the tactile and auditory stimuli in the three experiments may be due to an auditory accompaniment of the tactile stimulus employed. Discussion emphasized the utility of investigating habituation in a preparation with robust specificity.

8.
The contribution of the auditory cortex to tactile information processing was studied by measuring somatosensory evoked magnetic fields (SEFs). Three kinds of vibrotactile stimuli with frequencies of 180, 280 and 380 Hz were randomly delivered on the right index finger with a probability of 40, 20 and 40%, respectively. Twenty normal subjects participated in four kinds of tasks: a control condition to ignore these stimuli, a simple task to discriminate the 280-Hz stimulus from the other two stimuli (discrimination task for the vibrotactile stimuli, Ts task), a feedback task modified from the Ts task by adding acoustic feedback of the vibratory frequency at 1300 ms poststimulus (tactile discrimination with auditory clues, TA), and an easy version of the TA task (TA-easy) to discriminate the 280-Hz stimulus (20% target) from the 180- or 380-Hz stimuli (80% nontarget). The Ts and TA tasks required accurate perception of the vibrotactile frequencies to discriminate among the three kinds of stimuli. Under such a task demand, the post hoc auditory feedback in the TA task was expected to induce acoustic imagery for the tactile sensation. The SEFs for the nontarget stimuli were analyzed. A middle-latency component (M150/200) was specifically evoked by the three discrimination tasks. In the Ts and TA-easy tasks, the M150/200 source indicated inferior parietal cortical activities (SII area). In the TA task, 11 subjects showed activity in both the SII area and the superior temporal auditory region and increased accuracy of discrimination compared with the Ts task, in contrast with other subjects who showed activity only in the SII area and small changes in task accuracy between the Ts and TA tasks. Asynchronous auditory feedback for the vibrotactile sensation induced the auditory cortex activity in the SEFs in relation to the progress in tactile discrimination, which suggested an induction of acoustic imagery to complement the tactile information processing.

9.
Participants in Experiments 1 and 2 performed a discrimination and counting task to assess the effect of lead stimulus modality on attentional modification of the acoustic startle reflex. Modality of the discrimination stimuli was changed across subjects. Electrodermal responses were larger during task-relevant stimuli than during task-irrelevant stimuli in all conditions. Larger blink magnitude facilitation was found during auditory and visual task-relevant stimuli, but not for tactile stimuli. Experiment 3 used acoustic, visual, and tactile conditioned stimuli (CSs) in differential conditioning with an aversive unconditioned stimulus (US). Startle magnitude facilitation and electrodermal responses were larger during a CS that preceded the US than during a CS that was presented alone regardless of lead stimulus modality. Although not unequivocal, the present data pose problems for attentional accounts of blink modification that emphasize the importance of lead stimulus modality.

10.
The purpose of the current work was to quantify the influence of posture-mediated skin deformation on trunk dorsum tactile perceptual sensitivity. Twelve young and healthy individuals were assessed while adopting three different spine postures (extension, neutral and flexion). Tactile sensitivity threshold tests (T10 and L4 vertebral levels) included measures of touch sensitivity, spatial acuity and stretch sensitivity. The results demonstrate that tactile sensitivity can differ due to changes in body posture. The skin of the trunk dorsum had increased thresholds for touch sensitivity, longitudinal spatial acuity and transverse stretch sensitivity in spine flexion. Furthermore, spine flexion also resulted in a reduced sensory threshold to stretching stimuli in the longitudinal direction. The opposite trends occurred when participants adopted spine extension. It is suggested that posture-mediated skin deformation generates changes in the amount of strain experienced by individual skin mechanoreceptors, and the relative spacing between mechanoreceptors. Furthermore, it is suggested that “pre-stretch” of the skin brings mechanoreceptors closer to their stretch activation thresholds, thereby increasing an individual’s sensitivity to skin stretch when in spine flexion.

11.
Mental imagery is considered to be important for normal conscious experience. It is most frequently investigated in the visual, auditory and motor domain (imagination of movement), while the studies on tactile imagery (imagination of touch) are scarce. The current study investigated the effect of tactile and auditory imagery on the left/right discriminations of tactile and auditory stimuli. In line with our hypothesis, we observed that after tactile imagery, tactile stimuli were responded to faster as compared to auditory stimuli and vice versa. On average, tactile stimuli were responded to faster as compared to auditory stimuli, and stimuli in the imagery condition were on average responded to slower as compared to baseline performance (left/right discrimination without imagery assignment). The former is probably due to the spatial and somatotopic proximity of the fingers receiving the taps and the thumbs performing the response (button press), the latter to a dual task cost. Together, these results provide the first evidence of a behavioural effect of a tactile imagery assignment on the perception of real tactile stimuli.

12.
In the well-known spatial ventriloquism effect, auditory stimuli are mislocalized towards the location of synchronous but spatially disparate visual stimuli. Recent studies have demonstrated a similar influence of tactile stimuli on auditory localization, which predominantly operates in an external coordinate system. Here, we investigated whether this audio-tactile ventriloquist illusion leads to comparable aftereffects in the perception of auditory space as have been observed previously for audiovisual stimulation. Participants performed a relative sound localization task in which they had to judge whether a brief sound was perceived at the same or a different location as a preceding tactile stimulus (“Experiment 1”) or to the left or right of a preceding visual stimulus (“Experiment 2”). Sound localization ability was measured before and after exposure to synchronous audio-tactile stimuli with a constant spatial disparity. After audio-tactile adaptation, unimodal sound localization was shifted in the direction of the tactile stimuli during the preceding adaptation phase in both tasks. This finding provides evidence for the existence of an audio-tactile ventriloquism aftereffect and suggests that auditory space (rather than specific audio-tactile connections) can be rapidly recalibrated to compensate for audio-tactile spatial disparities.

13.
We present two experiments in which we investigated whether tactile attention is modulated by action preparation. In Experiment 1, participants prepared a saccade toward either the left or right index finger, depending on the pitch of a non-predictive auditory cue. In Experiment 2, participants prepared to lift the left or right index finger in response to the auditory cue. In half of the trials in both experiments, a suprathreshold vibratory stimulus was presented with equal probability to either finger, to which the participants made a speeded foot response. The results showed facilitation in the processing of targets delivered at the goal location of the prepared movement (Experiment 1), as well as at the effector of the prepared movement (Experiment 2). These results are discussed within the framework of theories on motor preparation and spatial attention.

14.
It has been shown that stimuli of a task-irrelevant modality receive enhanced processing when they are presented at an attended location in space (crossmodal attention). The present study investigated the effects of visual deprivation on the interaction of the intact sensory systems. Random streams of tactile and auditory stimuli were presented at the left or right index finger of congenitally blind participants. They had to attend to one modality (auditory or tactile) of one side (left or right) and had to respond to deviant stimuli of the attended modality and side. While in a group of sighted participants, early event-related potentials (ERPs) were negatively displaced to stimuli presented at the attended position, compared to the unattended, for both the task-relevant and the task-irrelevant modality, starting as early as 80 ms after stimulus onset (unimodal and crossmodal spatial attention effects, respectively), corresponding crossmodal effects could not be detected in the blind. In the sighted, spatial attention effects after 200 ms were only significant for the task-relevant modality, whereas a crossmodal effect for this late time window was observed in the blind. This positive rather than negative effect possibly indicates an active suppression of task-irrelevant stimuli at an attended location in space. The present data suggest that developmental visual input is essential for the use of space to integrate input of the non-visual modalities, possibly because of its high spatial resolution. Alternatively, enhanced perceptual skills of the blind within the intact modalities may result in reduced multisensory interactions ("inverse effectiveness of multisensory integration").

15.
Loud acoustic stimuli presented during movement preparation can shorten reaction time and increase response forcefulness. We examined how efferent connectivity of an agonist muscle to reticulospinal and corticospinal pathways, and the level of prepared movement force, affect reaction time and movement execution when the motor response is triggered by an intense acoustic stimulus. In Experiment 1, participants executed ballistic wrist flexion and extension movements of low and high force in response to visual stimuli. A loud acoustic stimulus (LAS; 105 dBa) was presented simultaneously with the visual imperative stimulus in probe trials. In Experiment 2, participants executed ballistic wrist flexion movements ranging from 10%–50% of maximum voluntary contraction with a LAS presented in probe trials. The shortening of response initiation was not affected by movement type (flexion or extension) or prepared movement force. Enhancement of response magnitude, however, was proportionally greater for low force movements and for the flexor muscle. Changes in peak force induced by the intense acoustic stimulus indicated that the neural activity introduced to motor program circuits by acoustic stimulation is additive to the voluntary neural activity that occurs due to movement preparation, rather than multiplicative.

16.
Six right-handed subjects performed rhythmic flexion and extension movements of the index finger in time with an auditory metronome. On each block of trials, the wrist of the response hand was placed in an extended, neutral or flexed position. In the flex-on-the-beat condition, subjects were instructed to coordinate maximum excursion in the direction of finger flexion with each beat of the metronome. In the extend-on-the-beat condition, subjects were instructed to coordinate maximum excursion in the direction of finger extension with each beat of the metronome. The frequency of the metronome was increased from 2.00 Hz to 3.75 Hz in 8 steps (8 s epochs) of 0.25 Hz. During trials prepared in the extend-on-the-beat pattern, all subjects exhibited transitions to either a flex-on-the-beat pattern or to phase wandering as the frequency of pacing was increased. The time at which these transitions occurred was reliably influenced by the position of the wrist. Four subjects exhibited qualitative departures from the flex-on-the-beat pattern at pacing frequencies that were greater than those at which the extend-on-the-beat pattern could be maintained. The time at which these departures occurred was not influenced by the position of the wrist. These results are discussed with reference to the constraints imposed on the coordination dynamics by the intrinsic properties of the neuromuscular-skeletal system. Received: 1 October 1997 / Accepted: 20 March 1998

17.
Shape is an inherent property of objects existing in both vision and touch but not audition. Can shape then be represented by sound artificially? It has previously been shown that sound can convey visual information by means of image-to-sound coding, but whether sound can code tactile information is not clear. Blindfolded sighted individuals were trained to recognize tactile spatial information using sounds mapped from abstract shapes. After training, subjects were able to match auditory input to tactually discerned shapes and showed generalization to novel auditory–tactile pairings. Furthermore, they showed complete transfer to novel visual shapes, despite the fact that training did not involve any visual exposure. In addition, we found enhanced tactile acuity specific to the training stimuli. The present study demonstrates that as long as tactile space is coded in a systematic way, shape can be conveyed via a medium that is not spatial, suggesting a metamodal representation.

18.
Four right-handed subjects performed rhythmic flexion and extension movements of the index finger in time with an auditory metronome. On each block of trials the forearm of the response hand was placed in a prone, neutral or supine position. In the flex-on-the-beat condition, subjects were instructed to coordinate maximum excursion in the direction of finger flexion with each beat of the metronome. In the extend-on-the-beat condition, subjects were instructed to coordinate maximum excursion in the direction of finger extension with each beat of the metronome. The frequency of the metronome was increased from 1.75 Hz to 3.50 Hz in eight steps (8-s plateaus) of 0.25 Hz. During trials prepared in the extend-on-the-beat pattern, abrupt transitions to either a flex-on-the-beat pattern or to phase wandering often occurred, particularly at higher pacing frequencies. In marked contrast, during trials prepared in the flex-on-the-beat pattern such transitions were never present. Both the frequency and the alacrity of these transitions were greater when the forearm was in a prone or neutral position than when the forearm was in a supine position. These results are discussed with reference to the constraints imposed on the coordination dynamics by the intrinsic properties of the neuromuscular-skeletal system.

19.
Certain sounds, such as fingernails screeching down a chalkboard, have a strong association with somatosensory percepts. In order to assess the influences of audition on somatosensory perception, three experiments measured how task-irrelevant auditory stimuli alter detection rates for near-threshold somatosensory stimuli. In Experiment 1, we showed that a simultaneous auditory stimulus increases sensitivity, but not response biases, to the detection of an electrical cutaneous stimulus delivered to the hand. Experiment 2 demonstrated that this enhancement of somatosensory perception is spatially specific—only monaural sounds on the same side increased detection. Experiment 3 revealed that the effects of audition on touch are also frequency dependent—only sounds with the same frequency as the vibrotactile frequency enhanced tactile detection. These results indicate that auditory information influences touch perception in highly systematic ways and suggest that similar coding mechanisms may underlie the processing of information from these different sensory modalities.
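The distinction drawn in Experiment 1 between sensitivity and response bias comes from signal detection theory: sensitivity (d′) separates true detectability from the observer's willingness to report a stimulus (criterion c). A minimal sketch of the standard computation from detection counts (the counts below are invented for illustration, not data from the study):

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Compute signal-detection sensitivity d' and criterion c from raw
    counts. A log-linear correction (add 0.5 to each cell) keeps the
    z-transform finite when hit or false-alarm rates reach 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf          # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# An auditory stimulus that raises detectability shows up as a higher d'
# (hypothetical counts: hits, misses, false alarms, correct rejections).
d_alone, c_alone = dprime_criterion(30, 20, 10, 40)
d_sound, c_sound = dprime_criterion(40, 10, 10, 40)
print(d_sound > d_alone)  # sensitivity increased with the sound
```

An effect confined to d′ with c unchanged, as the abstract reports, indicates genuinely improved detection rather than a shift toward saying "yes" more often.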

20.
Recent behavioural and event-related potential (ERP) studies reported cross-modal links in spatial attention between vision, audition and touch. Such links could reflect differences in hemispheric-activation levels associated with spatial attention to one side, or more abstract spatial reference-frames mediating selectivity across modalities. To distinguish these hypotheses, ERPs were recorded to lateral tactile stimuli, plus visual (experiment 1) or auditory stimuli (experiment 2), while participants attended to the left or right hand to detect infrequent tactile targets, and ignored other modalities. In separate blocks, hands were either in a crossed or uncrossed posture. With uncrossed hands, visual stimuli on the tactually attended side elicited enhanced N1 and P2 components at occipital sites, and an enhanced negativity at midline electrodes, reflecting cross-modal links in spatial attention from touch to vision. Auditory stimuli at tactually attended locations elicited an enhanced negativity overlapping with the N1 component, reflecting cross-modal links from touch to audition. An analogous pattern of results arose for crossed hands, with tactile attention enhancing auditory or visual responses on the side where the attended hand now lay (i.e. in the opposite visual or auditory hemifield to that enhanced by attending the same hand when uncrossed). This suggests that cross-modal attentional links are not determined by hemispheric projections, but by common external locations. Unexpectedly, somatosensory ERPs were strongly affected by hand posture in both experiments, with attentional effects delayed and smaller for crossed hands. This may reflect the combined influence of anatomical and external spatial codes within the tactile modality, while cross-modal links depend only on the latter codes.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号