Similar Documents
20 similar records retrieved.
1.
Gestures are an important part of interpersonal communication, for example by illustrating physical properties of speech contents (e.g., “the ball is round”). The meaning of these so-called iconic gestures is strongly intertwined with speech. We investigated the neural correlates of semantic integration for verbal and gestural information. Participants watched short videos of five speech and gesture conditions performed by an actor, including variation of language (familiar German vs. unfamiliar Russian), variation of gesture (iconic vs. unrelated), as well as isolated familiar language, while brain activation was measured using functional magnetic resonance imaging. For familiar speech with either gesture type contrasted with Russian speech-gesture pairs, activation increases were observed at the left temporo-occipital junction. Apart from this shared location, speech with iconic gestures exclusively engaged left occipital areas, whereas speech with unrelated gestures activated bilateral parietal and posterior temporal regions. Our results demonstrate that the processing of speech with speech-related versus speech-unrelated gestures occurs in two distinct but partly overlapping networks. The distinct processing streams (visual versus linguistic/spatial) are interpreted in terms of “auxiliary systems” allowing the integration of speech and gesture in the left temporo-occipital region. Hum Brain Mapp, 2009. © 2009 Wiley-Liss, Inc.

2.
During language comprehension, listeners use the global semantic representation from previous sentence or discourse context to immediately integrate the meaning of each upcoming word into the unfolding message-level representation. Here we investigate whether communicative gestures that often spontaneously co-occur with speech are processed in a similar fashion and integrated with previous sentence context in the same way as lexical meaning. Event-related potentials were measured while subjects listened to spoken sentences with a critical verb (e.g., knock), which was accompanied by an iconic co-speech gesture (i.e., KNOCK). Verbal and/or gestural semantic content matched or mismatched the content of the preceding part of the sentence. Despite the difference in the modality and in the specificity of meaning conveyed by spoken words and gestures, the latency, amplitude, and topographical distribution of both word and gesture mismatches were found to be similar, indicating that the brain integrates both types of information simultaneously. This provides evidence for the claim that neural processing in language comprehension involves the simultaneous incorporation of information coming from a broader domain of cognition than only verbal semantics. The neural evidence for similar integration of information from speech and gesture emphasizes the tight interconnection between speech and co-speech gestures.

3.
In everyday conversation, listeners often rely on a speaker's gestures to clarify any ambiguities in the verbal message. Using fMRI during naturalistic story comprehension, we examined which brain regions in the listener are sensitive to speakers' iconic gestures. We focused on iconic gestures that contribute information not found in the speaker's talk, compared with those that convey information redundant with the speaker's talk. We found that three regions—left inferior frontal gyrus triangular (IFGTr) and opercular (IFGOp) portions, and left posterior middle temporal gyrus (MTGp)—responded more strongly when gestures added information to nonspecific language, compared with when they conveyed the same information in more specific language; in other words, when gesture disambiguated speech as opposed to reinforced it. An increased BOLD response was not found in these regions when the nonspecific language was produced without gesture, suggesting that IFGTr, IFGOp, and MTGp are involved in integrating semantic information across gesture and speech. In addition, we found that activity in the posterior superior temporal sulcus (STSp), previously thought to be involved in gesture-speech integration, was not sensitive to the gesture-speech relation. Together, these findings clarify the neurobiology of gesture-speech integration and contribute to an emerging picture of how listeners glean meaning from gestures that accompany speech. Hum Brain Mapp 35:900–917, 2014. © 2012 Wiley Periodicals, Inc.

4.
Gestures are an important part of human communication. However, little is known about the neural correlates of gestures accompanying speech comprehension. The goal of this study is to investigate the neural basis of speech-gesture interaction as reflected in activation increase and decrease during observation of natural communication. Fourteen German participants watched video clips of 5 s duration depicting an actor who performed metaphoric gestures to illustrate the abstract content of spoken sentences. Furthermore, video clips of isolated gestures (without speech), isolated spoken sentences (without gestures) and gestures in the context of an unknown language (Russian) were additionally presented while functional magnetic resonance imaging (fMRI) data were acquired. Bimodal speech and gesture processing led to left hemispheric activation increases of the posterior middle temporal gyrus, the premotor cortex, the inferior frontal gyrus, and the right superior temporal sulcus. Activation reductions during the bimodal condition were located in the left superior temporal gyrus and the left posterior insula. Gesture-related activation increases and decreases were dependent on language semantics and were not found in the unknown-language condition. Our results suggest that semantic integration processes for bimodal speech plus gesture comprehension are reflected in activation increases in the classical left hemispheric language areas. Speech-related gestures seem to enhance language comprehension during face-to-face communication.

5.
For humans, the ability to communicate and use language is instantiated not only in the vocal modality but also in the visual modality. The main examples of this are sign languages and (co-speech) gestures. Sign languages, the natural languages of Deaf communities, use systematic and conventionalized movements of the hands, face, and body for linguistic expression. Co-speech gestures, though non-linguistic, are produced in tight semantic and temporal integration with speech and constitute an integral part of language together with speech. The articles in this issue explore and document how gestures and sign languages are similar or different and how communicative expression in the visual modality can change from being gestural to grammatical in nature through processes of conventionalization. As such, this issue contributes to our understanding of how the visual modality shapes language and the emergence of linguistic structure in newly developing systems. Studying the relationship between signs and gestures provides a new window onto the human ability to recruit multiple levels of representation (e.g., categorical, gradient, iconic, abstract) in the service of using or creating conventionalized communicative systems.

6.
In face-to-face communication, speech is typically enriched by gestures. Clearly, not all people gesture in the same way, and the present study explores whether such individual differences in gesture style are taken into account during the perception of gestures that accompany speech. Participants were presented with one speaker who gestured in a straightforward way and another who also produced self-touch movements. Adding trials with such grooming movements makes the gesture information a much weaker cue compared with the gestures of the non-grooming speaker. The electroencephalogram (EEG) was recorded as participants watched videos of the individual speakers. Event-related potentials elicited by the speech signal revealed that adding grooming movements attenuated the impact of gesture for this particular speaker. Thus, these data suggest that there is sensitivity to the personal communication style of a speaker and that this sensitivity affects the extent to which gesture and speech are integrated during language comprehension.

7.
The role of iconic gestures in speech disambiguation: ERP evidence
The present series of experiments explored the extent to which iconic gestures convey information not found in speech. The electroencephalogram (EEG) was recorded as participants watched videos of a person gesturing and speaking simultaneously. The experimental sentences contained an unbalanced homonym in the initial part of the sentence (e.g., She controlled the ball ...) and were disambiguated at a target word in the subsequent clause (which during the game ... vs. which during the dance ...). Coincident with the initial part of the sentence, the speaker produced an iconic gesture which supported either the dominant or the subordinate meaning. Event-related potentials were time-locked to the onset of the target word. In Experiment 1, participants were explicitly asked to judge the congruency between the initial homonym-gesture combination and the subsequent target word. The N400 at target words was found to be smaller after a congruent gesture and larger after an incongruent gesture, suggesting that listeners can use gestural information to disambiguate speech. Experiment 2 replicated the results using a less explicit task, indicating that the disambiguating effect of gesture is somewhat task-independent. Unrelated grooming movements were added to the paradigm in Experiment 3. The N400 at subordinate targets was found to be smaller after subordinate gestures and larger after dominant gestures as well as grooming, indicating that an iconic gesture can facilitate the processing of a less frequent word meaning. The N400 at dominant targets no longer varied as a function of the preceding gesture in Experiment 3, suggesting that the addition of meaningless movements weakened the impact of gesture. Thus, the integration of gesture and speech in comprehension does not appear to be an obligatory process but is modulated by situational factors such as the amount of observed meaningful hand movements.
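Entry 7 does not report its analysis pipeline, but as a rough, hedged illustration of how such an N400 comparison is typically quantified, the Python sketch below averages single-trial amplitudes in a conventional 300–500 ms post-target window and compares two congruency conditions. The sampling rate, baseline offset, window, array names, and the simulated trial data are assumptions for illustration, not details taken from the study.

```python
import numpy as np
from scipy import stats

# Illustrative sketch only -- not the authors' pipeline.
# Assumed layout: each condition is an array of shape (n_trials, n_samples)
# of baseline-corrected EEG (microvolts) time-locked to target-word onset.
FS = 500                      # assumed sampling rate (Hz)
BASELINE = 0.2                # assumed 200 ms pre-stimulus period in each epoch (s)
N400_WINDOW = (0.300, 0.500)  # conventional N400 window (s); an assumption here

def n400_mean_amplitude(epochs: np.ndarray) -> np.ndarray:
    """Mean amplitude per trial within the assumed N400 window."""
    start = int((BASELINE + N400_WINDOW[0]) * FS)
    stop = int((BASELINE + N400_WINDOW[1]) * FS)
    return epochs[:, start:stop].mean(axis=1)

# Simulated placeholder trials (not real data): a more negative mean for the
# incongruent condition mimics the larger N400 reported in the abstract.
rng = np.random.default_rng(0)
congruent = rng.normal(0.0, 2.0, size=(40, 450))
incongruent = rng.normal(-1.5, 2.0, size=(40, 450))

amp_con = n400_mean_amplitude(congruent)
amp_inc = n400_mean_amplitude(incongruent)
t, p = stats.ttest_ind(amp_con, amp_inc)
print(f"congruent: {amp_con.mean():.2f} uV, incongruent: {amp_inc.mean():.2f} uV, "
      f"t = {t:.2f}, p = {p:.3f}")
```

In practice, the same window-averaging step would be applied per electrode or electrode cluster before the statistical comparison.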

8.
In human face-to-face communication, the content of speech is often illustrated by coverbal gestures. Behavioral evidence suggests that gestures provide advantages in the comprehension and memory of speech. Yet, how the human brain integrates abstract auditory and visual information into a common representation is not known. Our study investigates the neural basis of memory for bimodal speech and gesture representations. In this fMRI study, 12 participants were presented with video clips showing an actor performing meaningful metaphoric gestures (MG), unrelated, free gestures (FG), and no arm and hand movements (NG) accompanying sentences with an abstract content. After the fMRI session, the participants performed a recognition task. Behaviorally, the participants showed the highest hit rate for sentences accompanied by meaningful metaphoric gestures. Despite comparable old/new discrimination performances (d') for the three conditions, we obtained distinct memory-related left-hemispheric activations in the inferior frontal gyrus (IFG), the premotor cortex (BA 6), and the middle temporal gyrus (MTG), as well as significant correlations between hippocampal activation and memory performance in the metaphoric gesture condition. In contrast, unrelated speech and gesture information (FG) was processed in areas of the left occipito-temporal and cerebellar region and the right IFG just like the no-gesture condition (NG). We propose that the specific left-lateralized activation pattern for the metaphoric speech-gesture sentences reflects semantic integration of speech and gestures. These results provide novel evidence about the neural integration of abstract speech and gestures as it contributes to subsequent memory performance.

9.
Background: Previous research has found that people with aphasia produce more spontaneous iconic gesture than control participants, especially during word-finding difficulties. There is some evidence that impaired semantic knowledge impacts on the diversity of gestural handshapes, as well as the frequency of gesture production. However, no previous research has explored how impaired semantic knowledge impacts on the frequency and type of iconic gestures produced during fluent speech compared with those produced during word-finding difficulties.

Aims: To explore the impact of impaired semantic knowledge on the frequency and type of iconic gestures produced during fluent speech and those produced during word-finding difficulties.

Methods & Procedures: A group of 29 participants with aphasia and 29 control participants were video recorded describing a cartoon they had just watched. All iconic gestures were tagged and coded as either “manner”, “path only”, “shape outline” or “other”. These gestures were then separated into either those occurring during fluent speech or those occurring during a word-finding difficulty. The relationships between semantic knowledge and gesture frequency and form were then investigated in the two different conditions.

Outcomes & Results: As expected, the participants with aphasia produced a higher frequency of iconic gestures than the control participants, but when the iconic gestures produced during word-finding difficulties were removed from the analysis, the frequency of iconic gesture was not significantly different between the groups. While there was not a significant relationship between the frequency of iconic gestures produced during fluent speech and semantic knowledge, there was a significant positive correlation between semantic knowledge and the proportion of word-finding difficulties that contained gesture. There was also a significant positive correlation between the speakers’ semantic knowledge and the proportion of gestures that were produced during fluent speech that were classified as “manner”. Finally while not significant, there was a positive trend between semantic knowledge of objects and the production of “shape outline” gestures during word-finding difficulties for objects.

Conclusions: The results indicate that impaired semantic knowledge in aphasia impacts on both the iconic gestures produced during fluent speech and those produced during word-finding difficulties, but in different ways. These results shed new light on the relationship between impaired language and iconic co-speech gesture production and also suggest that analysis of iconic gesture may be a useful addition to clinical assessment.

10.
Lin Wang, Mingyuan Chu. Neuropsychologia, 2013, 51(13): 2847–2855
The present study investigated whether and how beat gesture (small baton-like hand movements used to emphasize information in speech) influences semantic processing as well as its interaction with pitch accent during speech comprehension. Event-related potentials were recorded as participants watched videos of a person gesturing and speaking simultaneously. The critical words in the spoken sentences were accompanied by a beat gesture, a control hand movement, or no hand movement, and were expressed either with or without pitch accent. We found that both beat gesture and control hand movement induced smaller negativities in the N400 time window than when no hand movement was presented. The reduced N400s indicate that both beat gesture and control movement facilitated the semantic integration of the critical word into the sentence context. In addition, the words accompanied by beat gesture elicited smaller negativities in the N400 time window than those accompanied by control hand movement over right posterior electrodes, suggesting that beat gesture has a unique role in enhancing semantic processing during speech comprehension. Finally, no interaction was observed between beat gesture and pitch accent, indicating that they affect semantic processing independently.

11.
Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech. Hum Brain Mapp, 2009. © 2009 Wiley-Liss, Inc.

12.
During communication in real-life settings, the brain integrates information from auditory and visual modalities to form a unified percept of our environment. In the current magnetoencephalography (MEG) study, we used rapid invisible frequency tagging (RIFT) to generate steady-state evoked fields and investigated the integration of audiovisual information in a semantic context. We presented participants with videos of an actress uttering action verbs (auditory; tagged at 61 Hz) accompanied by a gesture (visual; tagged at 68 Hz, using a projector with a 1,440 Hz refresh rate). Integration difficulty was manipulated by lower-order auditory factors (clear/degraded speech) and higher-order visual factors (congruent/incongruent gesture). We identified MEG spectral peaks at the individual (61/68 Hz) tagging frequencies. We furthermore observed a peak at the intermodulation frequency of the auditory and visually tagged signals (f_visual − f_auditory = 7 Hz), specifically when lower-order integration was easiest because signal quality was optimal. This intermodulation peak is a signature of nonlinear audiovisual integration, and was strongest in left inferior frontal gyrus and left temporal regions; areas known to be involved in speech-gesture integration. The enhanced power at the intermodulation frequency thus reflects the ease of lower-order audiovisual integration and demonstrates that speech-gesture information interacts in higher-order language areas. Furthermore, we provide a proof-of-principle of the use of RIFT to study the integration of audiovisual stimuli, in relation to, for instance, semantic context.
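The intermodulation logic in entry 12 is simple arithmetic on the two tagging frequencies (68 Hz − 61 Hz = 7 Hz). As a hedged proof-of-concept, not the study's MEG analysis, the sketch below simulates a tagged signal with a multiplicative (nonlinear) interaction term and reads out spectral power at the two tagging frequencies and at their difference; the sampling rate, signal model, and noise level are assumptions.

```python
import numpy as np

# Illustrative sketch, not the study's MEG pipeline. The tagging frequencies
# come from the abstract; everything else is an assumption for demonstration.
FS = 1000          # assumed sampling rate (Hz)
F_AUDITORY = 61.0  # auditory tagging frequency (Hz)
F_VISUAL = 68.0    # visual tagging frequency (Hz)
F_INTERMOD = F_VISUAL - F_AUDITORY  # 7 Hz intermodulation frequency

t = np.arange(0, 10.0, 1.0 / FS)   # 10 s of simulated signal
aud = np.sin(2 * np.pi * F_AUDITORY * t)
vis = np.sin(2 * np.pi * F_VISUAL * t)

# A purely linear mixture only contains the two tagging frequencies; the
# multiplicative term models nonlinear integration and adds power at 7 Hz.
rng = np.random.default_rng(1)
signal = aud + vis + 0.3 * aud * vis + 0.1 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1.0 / FS)

def power_at(f_target: float) -> float:
    """Spectral power at the frequency bin closest to f_target."""
    return float(spectrum[np.argmin(np.abs(freqs - f_target))])

for label, f in [("auditory", F_AUDITORY), ("visual", F_VISUAL),
                 ("intermodulation", F_INTERMOD)]:
    print(f"{label:>15} ({f:.0f} Hz): power = {power_at(f):.1f}")
```

Removing the multiplicative term from the simulated signal leaves the two tagging peaks intact but eliminates the 7 Hz peak, which is the logic behind treating intermodulation power as a marker of nonlinear integration.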

13.
Background: Conveying instructions is an everyday use of language, and gestures are likely to be a key feature of this. Although co-speech iconic gestures are tightly integrated with language, and people with aphasia (PWA) produce procedural discourses that are impaired at the linguistic level, no previous studies have investigated how PWA use co-speech iconic gestures in these contexts.

Aims: This study investigated how PWA communicated meaning using gesture and language in procedural discourses, compared with neurologically healthy people (NHP). We aimed to identify the relative contributions of gesture and speech, in the context of impaired language, both overall and in individual events.

Methods & Procedures: Twenty-nine PWA and 29 NHP produced two procedural discourses. The structure and semantic content of language of the whole discourses were analysed through predicate argument structure and spatial motor terms, and gestures were analysed for frequency and semantic form. Gesture and language were analysed in two key events, to determine the relative information presented in each modality.

Outcomes & Results: PWA and NHP used similar frequencies and forms of gestures, although PWA used syntactically simpler language and fewer spatial words. This meant, overall, relatively more information was present in PWA gesture. This finding was also reflected in the key events, where PWA used gestures conveying rich semantic information alongside semantically impoverished language more often than NHP.

Conclusions: PWA gestures, containing semantic information omitted from the concurrent speech, may help listeners with meaning when language is impaired. This finding indicates that gesture should be included in clinical assessments of meaning-making.

14.
Co-speech gestures have a close semantic relationship to speech in adult conversation. In typically developing children, co-speech gestures which give additional information to speech facilitate the emergence of multi-word speech. A difficulty with integrating audio-visual information is known to exist for individuals with Autism Spectrum Disorder (ASD), which may affect development of the speech-gesture system. A longitudinal observational study was conducted with four children with ASD, aged 2;4 to 3;5 years. Participants were video-recorded for 20 min every 2 weeks during their attendance on an intervention programme. Recording continued for up to 8 months, thus affording a rich analysis of gestural practices from pre-verbal to multi-word speech across the group. All participants combined gesture with either speech or vocalisations. Co-speech gestures providing additional information to speech were observed to be either absent or rare. Findings suggest that children with ASD do not make use of the facilitating communicative effects of gesture in the same way as typically developing children.

15.
There is no doubt that gestures are communicative and can be integrated online with speech. Little is known, however, about the nature of this process, for example, its automaticity and how our own communicative abilities and also our environment influence the integration of gesture and speech. In two event-related potential (ERP) experiments, the effects of gestures during speech comprehension were explored. In both experiments, participants performed a shallow task, thereby avoiding explicit gesture-speech integration. In the first experiment, participants with normal hearing viewed videos in which a gesturing actress uttered sentences which were either embedded in multi-speaker babble noise or not. The sentences contained a homonym which was disambiguated by the information in a gesture, which was presented asynchronously to speech (1000 msec earlier). Downstream, the sentence contained a target word that was either related to the dominant or subordinate meaning of the homonym and was used to indicate the success of the disambiguation. Both the homonym and the target word position showed clear ERP evidence of gesture-speech integration and disambiguation only under babble noise. Thus, during noise, gestures were taken into account as an important communicative cue. In Experiment 2, the same asynchronous stimuli were presented to a group of hearing-impaired students and age-matched controls. Only the hearing-impaired individuals showed significant speech-gesture integration and successful disambiguation at the target word. The age-matched controls did not show any effect. Thus, individuals who chronically experience suboptimal communicative situations in daily life automatically take gestures into account. The data from both experiments indicate that gestures are beneficial in countering difficult communication conditions independent of whether the difficulties are due to external (babble noise) or internal (hearing impairment) factors.

16.
We assessed intelligence and receptive and expressive language skills in 6 children, aged 7 years 9 months to 12 years 4 months, with bilateral perisylvian polymicrogyria of variable extent and with dysarthria of differing severity. In view of the recent findings of a close relationship between word and gesture, we also examined the communicative use of gesture. We found that mental retardation was related to the extent of cortical malformation; lexical comprehension, but not morphosyntactic comprehension, and verbal production were more compromised than expected from nonverbal intellectual abilities; lack of verbal language was not compensated by the use of referential gestures. Results are discussed suggesting that compromised verbal and gestural communication in bilateral perisylvian polymicrogyria is not due simply to mental retardation and/or dysarthria but also to dysfunction of Sylvian fissure areas concerned with the totality of language processing.

17.
Multifactorial investigations of intraspecific laterality of primates’ gestural communication aim to shed light on factors that underlie the evolutionary origins of human handedness and language. This study assesses gorillas’ intraspecific gestural laterality considering the effect of various factors related to gestural characteristics, interactional context and sociodemographic characteristics of signaller and recipient. Our question was: which factors influence gorillas’ gestural laterality? We studied laterality in three captive groups of gorillas (N = 35), focusing on their most frequent gesture types (N = 16). We show that signallers used predominantly their hand ipsilateral to the recipient for tactile and visual gestures, whatever the emotional context, gesture duration, recipient’s sex or the kin relationship between both interactants, and whether or not a communication tool was used. Signallers’ contralateral hand was not preferentially used in any situation. Signallers’ right-hand use was more pronounced in negative contexts, in short gestures, when signallers were female, and its use increased with age. Our findings showed that gorillas’ gestural laterality could be influenced by different types of social pressures, thus supporting the theory of the evolution of laterality at the population level. Our study also evidenced that some particular gesture categories are better markers than others of the left-hemisphere language specialization.
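Entry 17 does not state its statistical model; as a hedged sketch of how individual gestural laterality is commonly summarised in this literature, the snippet below computes a handedness index HI = (R − L) / (R + L) per signaller and tests the group mean against zero as a crude probe of population-level laterality. The index, the one-sample test, and the example counts are assumptions for illustration, not the authors' method.

```python
import numpy as np
from scipy import stats

# Hedged illustration of a standard laterality summary, not the study's analysis.
# right/left hold hypothetical counts of right- and left-hand gestures per gorilla.
right = np.array([42, 30, 55, 18, 61])  # hypothetical example counts
left = np.array([28, 25, 40, 20, 35])

# Handedness index per individual: +1 = exclusively right-handed, -1 = exclusively left-handed.
hi = (right - left) / (right + left)

# A group-level deviation from zero would suggest laterality at the population level.
t, p = stats.ttest_1samp(hi, popmean=0.0)
print(f"mean HI = {hi.mean():.2f}, t = {t:.2f}, p = {p:.3f}")
```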

18.
We present the neuropsychological and linguistic follow-up of a girl with bilateral perisylvian polymicrogyria during 4 years of gestural and verbal speech therapy. Some researchers have suggested that children with bilateral perisylvian polymicrogyria mentally fail to reach the syntactic phase and do not acquire a productive morphology. This patient achieved a mean length of utterance in signs/gestures of 3.4, a syntactic phase of completion of the nuclear sentence and the use of morphological modifications. We discuss the link between gesture and language and formulate hypotheses on the role of gestural input on the reorganization of compensatory synaptic circuits.

19.
Gestures represent an integral aspect of interpersonal communication, and they are closely linked with language and thought. Brain regions for language processing overlap with those for gesture processing. Two types of gesticulation, beat gestures and metaphoric gestures, are particularly important for understanding the taxonomy of co-speech gestures. Here, we investigated gesture production during taped interviews with respect to regional brain volume. First, we were interested in whether beat gesture production is associated with similar regions as metaphoric gesture. Second, we investigated whether cortical regions associated with metaphoric gesture processing are linked to gesture production based on correlations with brain volumes. We found that beat gestures are uniquely related to regional volume in cerebellar regions previously implicated in discrete motor timing. We suggest that these gestures may be an artifact of the timing processes of the cerebellum that are important for the timing of vocalizations. Second, our findings indicate that brain volumes in regions of the left hemisphere previously implicated in metaphoric gesture processing are positively correlated with metaphoric gesture production. Together, this novel work extends our understanding of left hemisphere regions associated with gesture to indicate their importance in gesture production, and also suggests that beat gestures may be especially unique. This provides important insight into the taxonomy of co-speech gestures, and also further insight into the general role of the cerebellum in language. Hum Brain Mapp 36:4016–4030, 2015. © 2015 Wiley Periodicals, Inc.

20.
During face-to-face communication, body orientation and coverbal gestures influence how information is conveyed. The neural pathways underpinning the comprehension of such nonverbal social cues in everyday interaction are still partly unknown. During fMRI data acquisition, 37 participants were presented with video clips showing an actor speaking short sentences. The actor produced speech-associated iconic gestures (IC) or no gestures (NG) while he was visible either from an egocentric (ego) or from an allocentric (allo) position. Participants were asked to indicate via button press whether they felt addressed or not. We found a significant interaction of body orientation and gesture in addressment evaluations, indicating that participants evaluated IC-ego conditions as most addressing. The anterior cingulate cortex (ACC) and left fusiform gyrus were more strongly activated for the egocentric versus the allocentric actor position in the gesture context. Activation increase in the ACC for IC-ego > IC-allo further correlated positively with increased addressment ratings in the egocentric gesture condition. Gesture-related activation increase in the supplementary motor area, left inferior frontal gyrus and right insula correlated positively with gesture-related increase of addressment evaluations in the egocentric context. Results indicate that gesture use and body orientation contribute to the feeling of being addressed and together influence neural processing in brain regions involved in motor simulation, empathy and mentalizing. Hum Brain Mapp 36:1925–1936, 2015. © 2015 Wiley Periodicals, Inc.
