Similar Literature
20 similar records found (search time: 15 ms)
1.
In face-to-face communication, speech is typically enriched by gestures. Clearly, not all people gesture in the same way, and the present study explores whether such individual differences in gesture style are taken into account during the perception of gestures that accompany speech. Participants were presented with one speaker who gestured in a straightforward way and another who also produced self-touch movements. Adding trials with such grooming movements makes the gesture information a much weaker cue compared with the gestures of the non-grooming speaker. The electroencephalogram (EEG) was recorded as participants watched videos of the individual speakers. Event-related potentials elicited by the speech signal revealed that adding grooming movements attenuated the impact of gesture for this particular speaker. Thus, these data suggest that there is sensitivity to the personal communication style of a speaker and that this sensitivity affects the extent to which gesture and speech are integrated during language comprehension.

2.
Gestures are an important part of human communication. However, little is known about the neural correlates of gestures accompanying speech comprehension. The goal of this study is to investigate the neural basis of speech-gesture interaction as reflected in activation increases and decreases during observation of natural communication. Fourteen German participants watched video clips of 5 s duration depicting an actor who performed metaphoric gestures to illustrate the abstract content of spoken sentences. Furthermore, video clips of isolated gestures (without speech), isolated spoken sentences (without gestures) and gestures in the context of an unknown language (Russian) were additionally presented while functional magnetic resonance imaging (fMRI) data were acquired. Bimodal speech and gesture processing led to left-hemispheric activation increases of the posterior middle temporal gyrus, the premotor cortex, the inferior frontal gyrus, and the right superior temporal sulcus. Activation reductions during the bimodal condition were located in the left superior temporal gyrus and the left posterior insula. Gesture-related activation increases and decreases were dependent on language semantics and were not found in the unknown-language condition. Our results suggest that semantic integration processes for bimodal speech-plus-gesture comprehension are reflected in activation increases in the classical left-hemispheric language areas. Speech-related gestures seem to enhance language comprehension during face-to-face communication.

3.
In human face-to-face communication, the content of speech is often illustrated by coverbal gestures. Behavioral evidence suggests that gestures provide advantages in the comprehension and memory of speech. Yet, how the human brain integrates abstract auditory and visual information into a common representation is not known. Our study investigates the neural basis of memory for bimodal speech and gesture representations. In this fMRI study, 12 participants were presented with video clips showing an actor performing meaningful metaphoric gestures (MG), unrelated, free gestures (FG), and no arm and hand movements (NG) accompanying sentences with an abstract content. After the fMRI session, the participants performed a recognition task. Behaviorally, the participants showed the highest hit rate for sentences accompanied by meaningful metaphoric gestures. Despite comparable old/new discrimination performances (d') for the three conditions, we obtained distinct memory-related left-hemispheric activations in the inferior frontal gyrus (IFG), the premotor cortex (BA 6), and the middle temporal gyrus (MTG), as well as significant correlations between hippocampal activation and memory performance in the metaphoric gesture condition. In contrast, unrelated speech and gesture information (FG) was processed in areas of the left occipito-temporal and cerebellar region and the right IFG just like the no-gesture condition (NG). We propose that the specific left-lateralized activation pattern for the metaphoric speech-gesture sentences reflects semantic integration of speech and gestures. These results provide novel evidence about the neural integration of abstract speech and gestures as it contributes to subsequent memory performance.
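The old/new discrimination score (d′) reported above comes from signal detection theory: d′ = z(hit rate) − z(false-alarm rate). A minimal sketch of how such a score is typically computed (not the authors' code; the counts are hypothetical), using a log-linear correction to avoid infinite z-scores at rates of exactly 0 or 1:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate).

    Adding 0.5 to each cell (log-linear correction) keeps both rates
    strictly between 0 and 1, so the inverse-normal transform is finite.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical recognition-test counts for one participant
print(round(d_prime(hits=40, misses=10, false_alarms=5, correct_rejections=45), 2))
```

Equal hit and false-alarm rates give d′ = 0, i.e., no old/new discrimination beyond chance.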

4.
We compared gesture comprehension and imitation in patients with lesions in the left parietal lobe (LPAR, n=5) or left premotor cortex/supplementary motor area (LPMA, n=8), in patients with damage to the right parietal lobe (RPAR, n=6) or right premotor/supplementary motor area (RPMA, n=6), and in 16 non-brain-damaged control subjects. Three patients with left parietal lobe damage had aphasia. Subjects were shown 136 meaningful pantomimed motor acts on a video screen and were asked to identify the movements and to imitate the motor acts from memory with their ipsilesional and contralesional hand or with both hands simultaneously. Motor tasks included gestures without object use (e.g. to salute, to wave), pantomimed imitation of gestures on one's own body (e.g. to comb one's hair), and pantomimed imitation of motor acts which imply tool use on an object in extrapersonal space (e.g. to hammer a nail). Videotaped test performance was analysed by two independent raters; errors were classified as spatial errors, body-part-as-object errors, parapraxic performance, and non-identifiable movements. In addition, action discrimination was tested by evaluating whether a complex motor sequence was correctly performed. Results indicate that LPAR patients were most severely disturbed when imitation performance was assessed. Interestingly, LPAR patients were worse when imitating gestures on their own bodies than when imitating movements with reference to an external object, with the most pronounced deficits in the spatial domain. In contrast to imitation, comprehension was not, or only slightly, disturbed, and no clear correlation was found between the severity of imitation deficits and gesture comprehension. Moreover, although the three patients with aphasia imitated the movements more poorly than the non-aphasic LPAR patients, the severity of comprehension errors did not differ. Whereas unimanual imitation performance and gesture comprehension of the premotor/supplementary motor area patients did not differ significantly from control subjects, bimanual tasks were severely disturbed, in particular when executing different movements simultaneously with the right and left hands.

5.
Abstract

In order to determine the extent to which aphasic patients are able to use hand gestures in their spontaneous communication, informally structured 20-minute interviews were conducted with three Wernicke's aphasics, four Broca's aphasics and five non-neurologically impaired control subjects. The study was designed to overcome some of the methodological problems inherent in existing research, notably: (1) the classification of subjects according to the type of aphasia present, (2) the analysis of representative samples of naturally occurring conversation, (3) the use of a well-defined and reliable system of gesture classification, and (4) the interpretation of results based on formal statistical analyses. Overall, it was found that there were significant differences between the three groups in their use of hand gestures. The results of the study indicate that hand gestures are used most by Broca's aphasics and least by non-aphasic controls. Batons, ideographs and deictic movements were associated predominantly with Broca's aphasics, while both Wernicke's and Broca's aphasics used kinetographs significantly more than the non-neurologically impaired comparison group. The results are discussed in terms of the model of communication favoured by Cicone et al. (1979), which considers that expressive communication is governed by a ‘central organizer’ which initiates and controls the clarity and complexity of both speech and gesture. Based on the differential usage of gestures by the Wernicke's and Broca's aphasics, different approaches to the problem of remediation in aphasia are considered.

6.
In stroke patients, it has been suggested that communication disorders could result from lexical and syntactic disorders in left hemisphere lesions and from pragmatic problems in right hemisphere lesions. However, we have little information on patient behaviour in dyadic communication, especially in conversation. Here, we analyzed the various processes participating in communication difficulties at the rehabilitation phase (1–6 months) post-stroke, in order to define the main mechanisms of verbal and non-verbal communication (VC, NVC) disorders and their relationship with aphasic disorders. Sixty-three patients were recruited, belonging to six groups with left or right cortico-sub-cortical (L-CSC, R-CSC) or sub-cortical (L-SC, R-SC), frontal (Fro) or posterior fossa (PF) lesions. They were compared with an equivalent control group (gender, age, education level). We used the Lille Communication Test, which comprises three parts: participation in communication (greeting, attention, engagement), verbal communication (verbal comprehension, speech outflow, intelligibility, word production, syntax, verbal pragmatics and verbal feedback) and non-verbal communication (understanding gestures, affective expressivity, producing gestures, pragmatics and feedback). We also used the Functional Communication Profile and the Boston Diagnostic Aphasia Examination (BDAE). A decrease in participation was found in L-CSC, R-CSC and Fro patients. Verbal communication was essentially disrupted in the L-CSC and L-SC groups, including by verbal pragmatic disorders, and to a lesser degree in frontal patients. Non-verbal communication was mainly affected in R-CSC patients, especially by pragmatic difficulties. L-CSC patients showed an increase in gesture production, compensating for aphasia. In conclusion, communication disorders were relatively complex and could not be summarised as syntactic and lexical difficulties in left stroke and pragmatic problems in right stroke: left-stroke patients also showed severe verbal pragmatic difficulties, and frontal stroke also resulted in evident verbal and non-verbal disorders.

7.
Abstract

The purpose of this study was to investigate the acquisition and generalization of mode-interchange skills in people with severe aphasia. The experiment had three phases. In the first, three subjects were trained to acquire gestures for, and to draw, items used in the experiment. In the second phase, subjects were trained to change the communication mode, that is, from drawing to gesture or from gesture to drawing, in a ‘request’ situation using a single-subject design. The third phase was a probe for generalization. Subjects acquired both gestures and drawings; however, none used these mode-interchange skills spontaneously, and additional training using prompt signs to signal mode interchange was needed. Further, the responses of communication partners were noted to determine the use of mode-interchange skills. To increase non-verbal communication flexibility, it appeared to be important that partners wait for another response by the subject and thus provide the subject with an opportunity to use the mode-interchange skills.

8.
Patients suffering from severe aphasia have to rely on non-verbal means of communication to convey a message. However, to date it is not clear which patients are able to do so. Clinical experience indicates that some patients use non-verbal communication strategies like gesturing very efficiently, whereas others fail to transmit semantic content by non-verbal means. Concerns have been expressed that limb apraxia would affect the production of communicative gestures. Research investigating whether and how apraxia influences the production of communicative gestures has led to contradictory outcomes. The purpose of this study was to investigate the impact of limb apraxia on spontaneous gesturing. Further, linguistic and non-verbal semantic processing abilities were explored as potential factors that might influence non-verbal expression in aphasic patients. Twenty-four aphasic patients with highly limited verbal output were asked to retell short video clips. The narrations were videotaped. Gestural communication was analyzed in two ways. In the first part of the study, we used a form-based approach: physiological and kinetic aspects of hand movements were transcribed with a notation system for sign languages, and we determined the formal diversity of the hand gestures as an indicator of the potential richness of the transmitted information. In the second part of the study, the comprehensibility of the patients' gestural communication was evaluated by naive raters. The raters were familiarized with the model video clips and shown the recordings of the patients' retellings without sound. They were asked to indicate, for each narration, which story was being told and which aspects of the stories they recognized. The results indicate that non-verbal faculties are the most important prerequisites for the production of hand gestures: whereas results on standardized aphasia testing did not correlate with any gestural indices, non-verbal semantic processing abilities predicted the formal diversity of hand gestures, while apraxia predicted the comprehensibility of gesturing.

9.
During face-to-face communication, body orientation and coverbal gestures influence how information is conveyed. The neural pathways underpinning the comprehension of such nonverbal social cues in everyday interaction are still partly unknown. During fMRI data acquisition, 37 participants were presented with video clips showing an actor speaking short sentences. The actor produced speech-associated iconic gestures (IC) or no gestures (NG) while he was visible either from an egocentric (ego) or from an allocentric (allo) position. Participants were asked to indicate via button press whether they felt addressed or not. We found a significant interaction of body orientation and gesture in addressment evaluations, indicating that participants evaluated IC-ego conditions as most addressing. The anterior cingulate cortex (ACC) and left fusiform gyrus were more strongly activated for the egocentric versus allocentric actor position in the gesture context. Activation increase in the ACC for IC-ego > IC-allo further correlated positively with increased addressment ratings in the egocentric gesture condition. Gesture-related activation increases in the supplementary motor area, left inferior frontal gyrus and right insula correlated positively with the gesture-related increase of addressment evaluations in the egocentric context. Results indicate that gesture use and body orientation contribute to the feeling of being addressed and together influence neural processing in brain regions involved in motor simulation, empathy and mentalizing. Hum Brain Mapp 36:1925–1936, 2015. © 2015 Wiley Periodicals, Inc.

10.
Background: Gestures can provide an excellent natural alternative to verbal communication in people with aphasia (PWA). However, despite numerous studies focusing on gesture production in aphasia, it is still a matter of debate whether the gesture system remains intact after language impairment and how PWA use gestures to improve communication. A likely source of the contradictory results is that many studies were conducted on individual cases or on heterogeneous groups of individuals with additional cognitive deficits such as conceptual impairment and comorbid conditions such as limb apraxia.

Aims: The goal of the current study was to evaluate the integrity and function of gestures in PWA in light of cognitive theories of language–gesture relationship. Since all such theories presuppose the integrity of the conceptual system, and the absence of comorbid conditions that selectively impair gesturing (i.e., limb apraxia), our sample was selected to fulfill these assumptions.

Methods & Procedures: We examined gesture production in eight PWA with preserved auditory comprehension, no comorbidities, and various degrees of expressive deficit, as well as 11 age- and education-matched controls, while they described events in 20 normed video clips. Both speech and gesture data were coded for quantitative measures of informativeness, and gestures were grouped into several functional categories (matching, complementary, compensatory, social cueing, and facilitating lexical retrieval) based on correspondence to the accompanying speech. Using rigorous group analyses, individual-case analyses, and analyses of individual differences, we provide converging evidence for the integrity and type of function(s) served by gesturing in PWA.

Outcomes & Results: Our results indicate that the gesture system can remain functional even when language production is severely impaired. Our PWA heavily relied on iconic gestures to compensate for their language impairment, and the degree of such compensation was correlated with the extent of language impairment. In addition, we found evidence that producing iconic gestures was related to higher success rates in resolving lexical retrieval difficulties.

Conclusions: When comprehension and comorbidities are controlled for, impairment of language and gesture systems is dissociable. In PWA with good comprehension, gesturing can provide an excellent means to both compensate for the impaired language and act as a retrieval cue. Implications for cognitive theories of language–gesture relationship and therapy are discussed.


11.
Humans speak and produce symbolic gestures. Do these two forms of communication interact, and how? First, we tested whether the two communication signals influenced each other when emitted simultaneously. Participants either pronounced words, or executed symbolic gestures, or emitted the two communication signals simultaneously. Relative to the unimodal conditions, multimodal voice spectra were enhanced by gestures, whereas multimodal gesture parameters were reduced by words. In other words, gesture reinforced word, whereas word inhibited gesture. In contrast, aimless arm movements and pseudo-words had no comparable effects. Next, we tested whether observing word pronunciation during gesture execution affected verbal responses in the same way as emitting the two signals. Participants responded verbally to either spoken words, or to gestures, or to the simultaneous presentation of the two signals. We observed the same reinforcement of the voice spectra as during simultaneous emission. These results suggest that spoken word and symbolic gesture are coded as a single signal by a unique communication system. This signal represents the intention to engage in a closer interaction with a hypothetical interlocutor, and it may have a meaning different from when word and gesture are encoded singly.

12.
Patients with schizophrenia spectrum disorders (SSD) exhibit an aberrant perception and comprehension of abstract speech-gesture combinations associated with dysfunctional activation of the left inferior frontal gyrus (IFG). Recently, a significant deficit of speech-gesture mismatch detection was identified in SSD, but the underlying neural mechanisms have not yet been examined. A novel mismatch-detection fMRI paradigm was implemented manipulating speech-gesture abstractness (abstract/concrete) and relatedness (related/unrelated). During fMRI data acquisition, 42 SSD patients (schizophrenia, schizoaffective disorder, or other non-organic psychotic disorder [ICD-10: F20, F25, F28; DSM-IV: 295.X]) and 36 healthy controls were presented with short video clips of an actor reciting abstract or concrete sentences accompanied by either a semantically related or unrelated gesture. Participants indicated via button press whether they perceived each gesture as matching the speech content or not. Speech-gesture mismatch detection performance was significantly impaired in patients compared to controls. fMRI data analysis revealed that patients showed lower activation in bilateral frontal areas, including the IFG for all abstract > concrete speech-gesture pairs. In addition, they exhibited reduced engagement of the right supplementary motor area (SMA) and bilateral anterior cingulate cortices (ACC) for unrelated > related stimuli. We provide first evidence that impaired speech-gesture mismatch detection in SSD could be the result of dysfunctional activation of the SMA and ACC. Failure to activate the left IFG disrupts the integration of abstract speech-gesture combinations in particular. Future investigations should focus on brain stimulation of the SMA, ACC, and the IFG to improve communication and social functioning in SSD.

13.
Individuals diagnosed with psychotic disorders exhibit abnormalities in the perception of expressive behaviors, which are linked to symptoms and visual information processing domains. Specifically, the literature suggests these groups have difficulties perceiving gestures that accompany speech. While our understanding of gesture perception in psychotic disorders is growing, knowledge of gesture perception abnormalities, and clues about their potential causes and consequences, among individuals meeting criteria for a clinical high-risk (CHR) syndrome is limited. In the present study, 29 individuals with a CHR syndrome and 32 healthy controls completed an eye-tracking gesture perception paradigm. In this task, participants viewed an actor using abstract and literal gestures while presenting a story, and eye gaze data (e.g., fixation counts and total fixation time) were collected. Furthermore, relationships between fixation variables and both symptoms (positive, negative, anxiety, and depression) and measures of visual information processing (working memory and attention) were examined. Findings revealed that the CHR group gazed at abstract gestures fewer times than the control group. When individuals in the CHR group did gaze at abstract gestures, on average, they spent significantly less time fixating compared to controls. Furthermore, reduced fixation (i.e., count and time) was related to depression and slower response time on an attentional task. While a similar pattern of group differences in the same direction appeared for literal gestures, the effect was not significant. These data highlight the importance of integrating gesture perception abnormalities into vulnerability models of psychosis and inform the development of targeted treatments for social communicative deficits.
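The fixation variables described here (fixation counts and total fixation time on a gesture) are typically derived by testing which fixation events fall inside an area of interest (AOI) around the gesturing hands. A simplified illustration (hypothetical data format and coordinates, not the study's actual eye-tracking pipeline):

```python
def aoi_fixation_stats(fixations, aoi):
    """Count fixations landing in an area of interest (AOI) and sum their durations.

    fixations: list of (x, y, duration_ms) fixation events from an eye tracker
    aoi: (x_min, y_min, x_max, y_max) bounding box of the gesture region
    """
    x0, y0, x1, y1 = aoi
    durations = [d for (x, y, d) in fixations if x0 <= x <= x1 and y0 <= y <= y1]
    return len(durations), sum(durations)

# Hypothetical fixation events while watching an abstract-gesture clip
fixations = [(120, 300, 250), (400, 310, 180), (130, 290, 300), (700, 90, 220)]
gesture_aoi = (100, 250, 450, 400)   # pixel coordinates of the hands
count, total_ms = aoi_fixation_stats(fixations, gesture_aoi)
print(count, total_ms)  # 3 fixations totalling 730 ms inside the AOI
```

Group comparisons like the one reported above would then aggregate these per-trial counts and times per participant.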

14.
The role of iconic gestures in speech disambiguation: ERP evidence
The present series of experiments explored the extent to which iconic gestures convey information not found in speech. The electroencephalogram (EEG) was recorded as participants watched videos of a person gesturing and speaking simultaneously. The experimental sentences contained an unbalanced homonym in the initial part of the sentence (e.g., She controlled the ball ...) and were disambiguated at a target word in the subsequent clause (which during the game ... vs. which during the dance ...). Coincident with the initial part of the sentence, the speaker produced an iconic gesture which supported either the dominant or the subordinate meaning. Event-related potentials were time-locked to the onset of the target word. In Experiment 1, participants were explicitly asked to judge the congruency between the initial homonym-gesture combination and the subsequent target word. The N400 at target words was found to be smaller after a congruent gesture and larger after an incongruent gesture, suggesting that listeners can use gestural information to disambiguate speech. Experiment 2 replicated the results using a less explicit task, indicating that the disambiguating effect of gesture is somewhat task-independent. Unrelated grooming movements were added to the paradigm in Experiment 3. The N400 at subordinate targets was found to be smaller after subordinate gestures and larger after dominant gestures as well as grooming, indicating that an iconic gesture can facilitate the processing of a less frequent word meaning. The N400 at dominant targets no longer varied as a function of the preceding gesture in Experiment 3, suggesting that the addition of meaningless movements weakened the impact of gesture. Thus, the integration of gesture and speech in comprehension does not appear to be an obligatory process but is modulated by situational factors such as the amount of observed meaningful hand movements.
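Time-locking event-related potentials to the target word, as described above, amounts to epoching the EEG around each word onset, subtracting a pre-stimulus baseline, averaging across trials, and measuring mean amplitude in the N400 window (roughly 300–500 ms). A schematic single-channel sketch with synthetic data (all parameters are illustrative, not taken from the study):

```python
import numpy as np

def n400_mean_amplitude(eeg, onsets, sfreq, baseline=(-0.2, 0.0), window=(0.3, 0.5)):
    """Epoch a single-channel EEG around word onsets, baseline-correct,
    average across trials, and return mean amplitude in the N400 window."""
    b0, b1 = int(baseline[0] * sfreq), int(baseline[1] * sfreq)
    w0, w1 = int(window[0] * sfreq), int(window[1] * sfreq)
    epochs = []
    for t in onsets:
        epoch = eeg[t + b0 : t + w1].astype(float)
        epoch -= epoch[: b1 - b0].mean()      # subtract pre-stimulus baseline
        epochs.append(epoch)
    erp = np.mean(epochs, axis=0)             # trial-averaged ERP
    return erp[w0 - b0 : w1 - b0].mean()      # mean amplitude in the window

# Synthetic EEG: flat signal with a 4 uV negative deflection 300-500 ms
# after each of three (hypothetical) target-word onsets
sfreq = 500
eeg = np.zeros(10000)
onsets = [1000, 3000, 5000]
for t in onsets:
    eeg[t + int(0.3 * sfreq) : t + int(0.5 * sfreq)] -= 4.0
print(n400_mean_amplitude(eeg, onsets, sfreq))  # -4.0
```

A congruency effect like the one reported would then be the difference of this measure between congruent and incongruent trial sets.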

15.
During face-to-face communication, listeners integrate speech with gestures. The semantic information conveyed by iconic gestures (e.g., a drinking gesture) can aid speech comprehension in adverse listening conditions. In this magnetoencephalography (MEG) study, we investigated the spatiotemporal neural oscillatory activity associated with gestural enhancement of degraded speech comprehension. Participants watched videos of an actress uttering clear or degraded speech, accompanied by a gesture or not, and completed a cued-recall task after each video. When gestures semantically disambiguated degraded speech, an alpha and beta power suppression and a gamma power increase revealed engagement and active processing in the hand area of the motor cortex, the extended language network (LIFG/pSTS/STG/MTG), medial temporal lobe, and occipital regions. These observed low- and high-frequency oscillatory modulations support general unification, integration and lexical access processes during online language comprehension, and simulation of and increased visual attention to manual gestures over time. All individual oscillatory power modulations associated with gestural enhancement of degraded speech comprehension predicted a listener's correct disambiguation of the degraded verb after watching the videos. Our results thus go beyond the previously proposed role of oscillatory dynamics in unimodal degraded speech comprehension and provide the first evidence for the role of low- and high-frequency oscillations in predicting the integration of auditory and visual information at a semantic level.
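Band-limited power changes such as the alpha suppression reported here are commonly quantified as a relative power change against a pre-stimulus baseline. A toy example with a synthetic 10 Hz rhythm whose amplitude halves post-stimulus (Welch parameters and the 8–12 Hz band are illustrative choices, not the study's analysis settings):

```python
import numpy as np
from scipy.signal import welch

def band_power(x, sfreq, fmin, fmax):
    """Integrate a Welch power spectral density over a frequency band."""
    freqs, psd = welch(x, fs=sfreq, nperseg=sfreq)
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].sum()

def relative_power_change(baseline, active, sfreq, band=(8, 12)):
    """(active - baseline) / baseline; negative values indicate suppression."""
    pb = band_power(baseline, sfreq, *band)
    pa = band_power(active, sfreq, *band)
    return (pa - pb) / pb

# Synthetic data: a 10 Hz alpha rhythm that halves in amplitude post-stimulus
sfreq = 250
t = np.arange(0, 2, 1 / sfreq)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
active = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
print(relative_power_change(baseline, active, sfreq))  # near -0.75 (power scales with amplitude squared)
```

Halving the amplitude quarters the band power, so the relative change lands near −0.75, i.e., a 75% alpha suppression.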

16.
Abstract

To study the relationship between verbal and nonverbal behaviour in aphasia, a Gesture Recognition Test (GRT) was given to 111 aphasic patients and to 48 normal controls (NC). Forty-eight aphasics were impaired on the GRT. Global, Wernicke's and transcortical aphasics performed worse than Broca's, conduction and anomic aphasics, whose scores did not differ from those of NC. Although a moderate to strong correlation was found between GRT and auditory comprehension performances, type of aphasia had an effect on gesture recognition that was independent of the severity of auditory comprehension impairment. This may reflect the major role played by posterior left-hemispheric areas in the identification of gestures. GRT impairment was associated with reading deficits only in patients with central alexia. A strong correlation was found with constructional apraxia, suggesting that these two nonverbal tasks share common neural mechanisms. The weak association between gesture recognition impairment and ideomotor apraxia supports an independence between “receptive” and “expressive” aspects of gestural communication.

17.
18.
Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech. Hum Brain Mapp, 2009. © 2009 Wiley-Liss, Inc.

19.
Patients with degenerative dementia often show language disorders, but little is known about their verbal (VC) and non-verbal communication (NVC). Our aim was to analyse VC and NVC in patients meeting standard criteria for mild-to-moderately severe dementia (MMSE ≥ 14/30) resulting from Alzheimer's disease (AD; 29 cases), the behavioural variant of frontotemporal dementia (FTD; 16), or dementia with Lewy bodies (DLB; 13). We used the Lille Communication Test, which addresses three domains: participation in communication (PC: greeting, attention, participation), VC (verbal comprehension, speech outflow, intelligibility, word production, syntax, verbal pragmatics and verbal feedback), and NVC (understanding gestures, affective expressivity, producing gestures, pragmatics and feedback). Patients were compared with 47 matched control subjects. AD patients were partially impaired (p ≤ 0.01) in PC (greeting), and more clearly in VC, especially through verbal comprehension and word-finding difficulties, and to a much lesser degree in verbal pragmatics (responding to open questions, presenting new information), while NVC was mostly preserved. FTD patients were severely impaired in PC. Their VC difficulties were related to lexical-semantic, syntactic and more specifically pragmatic problems. NVC was impaired by difficulties in affective expressivity, pragmatics and feedback management. DLB patients showed modest difficulties with VC. PC, VC and NVC strongly correlated with performance in the dementia rating scale. In conclusion, the profile of communication difficulties was quite different between groups. FTD patients showed the most severe difficulties in PC and verbal and non-verbal pragmatics, in relation to their frontal lesions. AD patients had prominent impairment of lexical-semantic operations.

20.
Left inferior parietal dominance in gesture imitation: an fMRI study
The inability to imitate gestures is an essential feature of apraxia. However, discrepancies exist between clinical studies in apraxic patients and neuroimaging findings on imitation. We therefore aimed to investigate: (1) which areas are recruited during imitation under conditions similar to clinical tests for apraxic deficits; (2) whether there are common lateralized areas subserving imitation irrespective of the acting limb side; and (3) whether there are differences between hand and finger gestures. We used fMRI in 12 healthy, right-handed subjects to investigate the imitation of four types of variable gestures presented by video clips (16 different finger and 16 different hand gestures, with either the right or the left arm). The respective control conditions consisted of stereotyped gestures (only two gestures presented in pseudorandom order). Subtraction analysis of each type of gesture imitation (variable > stereotyped) revealed a bilateral activation pattern including the inferior parietal cortex (Brodmann area 40), the superior parietal cortex, the inferior frontal cortex (opercular region), the prefrontal motor cortex, the lateral occipito-temporal junction, and the cerebellum. These results were supported by statistical conjunction of all four subtraction analyses and by the common analysis of all four types of gesture imitation. The direct comparison of right and left hemispheric activation revealed a lateralization to the left of only the inferior parietal cortex. Comparisons between different types of gesture imitation yielded no significant results. In conclusion, gesture imitation recruits bilateral fronto-parietal regions, with significant lateralization of only one area, namely the left inferior parietal cortex. These in vivo data indicate left inferior parietal dominance for gesture imitation in right-handers, confirming lesion-based theories of apraxia.
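At the group level, a subtraction contrast such as "variable > stereotyped" reduces to a voxelwise paired comparison of condition estimates across subjects. The following toy version on random data (not real fMRI; the effect size and threshold are arbitrary) illustrates the idea with a paired t-test per voxel:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(42)
n_subjects, n_voxels = 12, 1000           # 12 subjects, as in the study above

# Hypothetical per-subject mean activation (beta estimates) per voxel
stereotyped = rng.standard_normal((n_subjects, n_voxels))
variable = stereotyped + 0.5 * rng.standard_normal((n_subjects, n_voxels))
variable[:, :50] += 1.0                   # 50 voxels with a genuine effect

# Paired t-test across subjects at every voxel (the subtraction contrast)
t, p = ttest_rel(variable, stereotyped, axis=0)
active = (t > 0) & (p < 0.001)            # positive-effect voxels, uncorrected threshold
print(active.sum())                       # detections fall mostly in the first 50 voxels
```

Real analyses add spatial smoothing, multiple-comparison correction, and anatomical labelling on top of this core contrast; conjunction analyses then intersect such maps across contrasts.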


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号