Similar Literature
20 similar articles found (search time: 140 ms)
1.
Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was further accompanied by gesture, regardless of the semantic relation to speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech. Hum Brain Mapp, 2009. © 2009 Wiley-Liss, Inc.

2.
In human face-to-face communication, the content of speech is often illustrated by coverbal gestures. Behavioral evidence suggests that gestures provide advantages in the comprehension and memory of speech. Yet, how the human brain integrates abstract auditory and visual information into a common representation is not known. Our study investigates the neural basis of memory for bimodal speech and gesture representations. In this fMRI study, 12 participants were presented with video clips showing an actor performing meaningful metaphoric gestures (MG), unrelated, free gestures (FG), and no arm and hand movements (NG) accompanying sentences with an abstract content. After the fMRI session, the participants performed a recognition task. Behaviorally, the participants showed the highest hit rate for sentences accompanied by meaningful metaphoric gestures. Despite comparable old/new discrimination performances (d') for the three conditions, we obtained distinct memory-related left-hemispheric activations in the inferior frontal gyrus (IFG), the premotor cortex (BA 6), and the middle temporal gyrus (MTG), as well as significant correlations between hippocampal activation and memory performance in the metaphoric gesture condition. In contrast, unrelated speech and gesture information (FG) was processed in areas of the left occipito-temporal and cerebellar region and the right IFG just like the no-gesture condition (NG). We propose that the specific left-lateralized activation pattern for the metaphoric speech-gesture sentences reflects semantic integration of speech and gestures. These results provide novel evidence about the neural integration of abstract speech and gestures as it contributes to subsequent memory performance.
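The old/new recognition analysis in entry 2 rests on the sensitivity index d', computed from hit and false-alarm rates. Below is a minimal Python sketch of that computation; the condition labels match the abstract, but the example rates and trial counts are illustrative, not taken from the study.

    from scipy.stats import norm

    def d_prime(hit_rate, fa_rate, n_old, n_new):
        """Signal-detection sensitivity d' with a log-linear correction
        to avoid infinite z-scores at rates of exactly 0 or 1."""
        hr = (hit_rate * n_old + 0.5) / (n_old + 1)
        fr = (fa_rate * n_new + 0.5) / (n_new + 1)
        return norm.ppf(hr) - norm.ppf(fr)

    # Hypothetical per-condition rates (MG = metaphoric, FG = free, NG = no gesture)
    for label, hits, fas in [("MG", 0.82, 0.20), ("FG", 0.74, 0.21), ("NG", 0.73, 0.22)]:
        print(label, round(d_prime(hits, fas, n_old=30, n_new=30), 2))

Comparable d' values across conditions, as reported in the abstract, would indicate that the memory-related fMRI differences are not simply driven by differences in recognition accuracy.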

3.
Lin Wang, Mingyuan Chu. Neuropsychologia, 2013, 51(13): 2847-2855
The present study investigated whether and how beat gesture (small baton-like hand movements used to emphasize information in speech) influences semantic processing as well as its interaction with pitch accent during speech comprehension. Event-related potentials were recorded as participants watched videos of a person gesturing and speaking simultaneously. The critical words in the spoken sentences were accompanied by a beat gesture, a control hand movement, or no hand movement, and were expressed either with or without pitch accent. We found that both beat gesture and control hand movement induced smaller negativities in the N400 time window than when no hand movement was presented. The reduced N400s indicate that both beat gesture and control movement facilitated the semantic integration of the critical word into the sentence context. In addition, the words accompanied by beat gesture elicited smaller negativities in the N400 time window than those accompanied by control hand movement over right posterior electrodes, suggesting that beat gesture has a unique role for enhancing semantic processing during speech comprehension. Finally, no interaction was observed between beat gesture and pitch accent, indicating that they affect semantic processing independently.
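The N400 comparison in entry 3 amounts to averaging EEG epochs per condition and contrasting mean amplitude in a roughly 300-500 ms post-word window at posterior electrodes. A minimal NumPy sketch under assumed array shapes is below; the window, sampling rate, channel indices, and simulated data are illustrative assumptions, not the study's pipeline.

    import numpy as np

    def n400_mean_amplitude(epochs, times, ch_idx, tmin=0.30, tmax=0.50):
        """Mean voltage in the N400 window, averaged over trials and
        the selected (e.g., right-posterior) channels.

        epochs : array, shape (n_trials, n_channels, n_samples), in microvolts
        times  : array, shape (n_samples,), in seconds relative to word onset
        """
        win = (times >= tmin) & (times <= tmax)
        return epochs[:, ch_idx, :][:, :, win].mean()

    # Hypothetical data: 40 trials, 64 channels, 1 s epochs sampled at 500 Hz
    rng = np.random.default_rng(0)
    times = np.linspace(-0.2, 0.8, 500)
    beat = rng.normal(0, 5, (40, 64, 500))
    no_movement = rng.normal(-1, 5, (40, 64, 500))  # more negative mean ~ larger N400
    posterior = [52, 53, 58, 59]                     # placeholder channel indices
    print(n400_mean_amplitude(beat, times, posterior) -
          n400_mean_amplitude(no_movement, times, posterior))

A smaller (less negative) mean amplitude for the gesture condition relative to the no-movement condition corresponds to the reduced N400 the abstract describes.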

4.
In everyday conversation, listeners often rely on a speaker's gestures to clarify any ambiguities in the verbal message. Using fMRI during naturalistic story comprehension, we examined which brain regions in the listener are sensitive to speakers' iconic gestures. We focused on iconic gestures that contribute information not found in the speaker's talk, compared with those that convey information redundant with the speaker's talk. We found that three regions—left inferior frontal gyrus triangular (IFGTr) and opercular (IFGOp) portions, and left posterior middle temporal gyrus (MTGp)—responded more strongly when gestures added information to nonspecific language, compared with when they conveyed the same information in more specific language; in other words, when gesture disambiguated speech as opposed to reinforcing it. An increased BOLD response was not found in these regions when the nonspecific language was produced without gesture, suggesting that IFGTr, IFGOp, and MTGp are involved in integrating semantic information across gesture and speech. In addition, we found that activity in the posterior superior temporal sulcus (STSp), previously thought to be involved in gesture-speech integration, was not sensitive to the gesture-speech relation. Together, these findings clarify the neurobiology of gesture-speech integration and contribute to an emerging picture of how listeners glean meaning from gestures that accompany speech. Hum Brain Mapp 35:900-917, 2014. © 2012 Wiley Periodicals, Inc.

5.
During face-to-face communication, body orientation and coverbal gestures influence how information is conveyed. The neural pathways underpinning the comprehension of such nonverbal social cues in everyday interaction are still partly unknown. During fMRI data acquisition, 37 participants were presented with video clips showing an actor speaking short sentences. The actor produced speech-associated iconic gestures (IC) or no gestures (NG) while he was visible either from an egocentric (ego) or from an allocentric (allo) position. Participants were asked to indicate via button press whether they felt addressed or not. We found a significant interaction of body orientation and gesture in addressment evaluations, indicating that participants evaluated IC-ego conditions as most addressing. The anterior cingulate cortex (ACC) and left fusiform gyrus were more strongly activated for egocentric versus allocentric actor position in the gesture context. Activation increase in the ACC for IC-ego > IC-allo further correlated positively with increased addressment ratings in the egocentric gesture condition. Gesture-related activation increase in the supplementary motor area, left inferior frontal gyrus and right insula correlated positively with gesture-related increase of addressment evaluations in the egocentric context. Results indicate that gesture use and body orientation contribute to the feeling of being addressed and together influence neural processing in brain regions involved in motor simulation, empathy and mentalizing. Hum Brain Mapp 36:1925-1936, 2015. © 2015 Wiley Periodicals, Inc.

6.
Body orientation and eye gaze influence how information is conveyed during face-to-face communication. However, the neural pathways underpinning the comprehension of social cues in everyday interaction are not known. In this study we investigated the influence of addressing vs. non-addressing body orientation on the neural processing of speech accompanied by gestures. While in an fMRI scanner, participants viewed short video clips of an actor speaking sentences with object- (O; e.g., shape) or person-related content (P; e.g., saying goodbye) accompanied by iconic (e.g., circle) or emblematic gestures (e.g., waving), respectively. The actor's body was oriented either toward the participant (frontal, F) or toward a third person (lateral, L) not visible. For frontal vs. lateral actor orientation (F > L), we observed activation of bilateral occipital, inferior frontal, medial frontal, right anterior temporal and left parietal brain regions. Additionally, we observed activity in the occipital and anterior temporal lobes due to an interaction effect between actor orientation and content of the communication (PF > PL) > (OF > OL). Our findings indicate that social cues influence the neural processing of speech-gesture utterances. Mentalizing (the process of inferring the mental state of another individual) could be responsible for these effects. In particular, socially relevant cues seem to activate regions of the anterior temporal lobes if abstract person-related content is communicated by speech and gesture. These new findings illustrate the complexity of interpersonal communication, as our data demonstrate that multisensory information pathways interact at both perceptual and semantic levels.
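Entry 6 tests an interaction contrast, (PF > PL) > (OF > OL), alongside the main effect F > L. In a standard GLM framing these are simply weight vectors over the four condition regressors; the sketch below shows that setup with an illustrative design and hypothetical betas, not the study's actual model.

    import numpy as np

    # Assumed column order of the condition regressors:
    # PF = person-related/frontal, PL = person-related/lateral,
    # OF = object-related/frontal, OL = object-related/lateral
    conditions = ["PF", "PL", "OF", "OL"]

    main_effect_frontal = np.array([+1, -1, +1, -1])  # F > L
    interaction = np.array([+1, -1, -1, +1])          # (PF > PL) > (OF > OL)

    def contrast_estimate(beta, weights):
        """Contrast value for one voxel given its per-condition beta estimates."""
        return float(np.dot(weights, beta))

    beta_voxel = np.array([1.2, 0.4, 0.9, 0.8])       # hypothetical betas for one voxel
    print(contrast_estimate(beta_voxel, main_effect_frontal))
    print(contrast_estimate(beta_voxel, interaction))

A positive interaction value means the frontal-versus-lateral difference is larger for person-related than for object-related content, which is the pattern reported for the anterior temporal lobes.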

7.
Gallagher HL, Frith CD. Neuropsychologia, 2004, 42(13): 1725-1736
Previous functional imaging studies have sought to characterize the neural correlates of gesture representation. However, little is yet known about the representation of different categories of gesture. Here we contrasted the perception of hand gestures that express inner feeling states, e.g. I am angry, I do not care, with the perception of instrumental gestures intended to change the behavior of others by communicating commands, e.g. come here, look over there. We hypothesised that recognition of expressive gestures would activate a network of brain regions associated with mentalising ('theory of mind') whereas instrumental gestures would activate different neural pathways. Twelve normal volunteers underwent fMRI while they watched a series of short videos (3 s duration) of actors performing expressive and instrumental gestures. The volunteers had either to recognise the gesture or to monitor the positions of the hands. As predicted, different neural networks were activated by the observation of instrumental or expressive gestures. The perception of expressive gestures elicited activity in the anterior paracingulate cortex, the amygdala and the temporal poles bilaterally and the right superior temporal sulcus. These regions have all previously been activated during the performance of mentalising tasks. In contrast, instrumental gestures elicited activity in a left-lateralised system previously associated with language and motor imitation.

8.
Observing a speaker's articulatory gestures can contribute considerably to auditory speech perception. At the level of neural events, seen articulatory gestures can modify auditory cortex responses to speech sounds and modulate auditory cortex activity also in the absence of heard speech. However, possible effects of attention on this modulation have remained unclear. To investigate the effect of attention on visual speech-induced auditory cortex activity, we scanned 10 healthy volunteers with functional magnetic resonance imaging (fMRI) at 3 T during simultaneous presentation of visual speech gestures and moving geometrical forms, with the instruction to either focus on or ignore the seen articulations. Secondary auditory cortex areas in the bilateral posterior superior temporal gyrus and planum temporale were active both when the articulatory gestures were ignored and when they were attended to. However, attention to visual speech gestures enhanced activity in the left planum temporale compared to the situation when the subjects saw identical stimuli but engaged in a nonspeech motion discrimination task. These findings suggest that attention to visually perceived speech gestures modulates auditory cortex function and that this modulation takes place at a hierarchically relatively early processing level.

9.
Gestures are an important part of human communication. However, little is known about the neural correlates of gestures accompanying speech comprehension. The goal of this study is to investigate the neural basis of speech-gesture interaction as reflected in activation increase and decrease during observation of natural communication. Fourteen German participants watched video clips of 5 s duration depicting an actor who performed metaphoric gestures to illustrate the abstract content of spoken sentences. Furthermore, video clips of isolated gestures (without speech), isolated spoken sentences (without gestures) and gestures in the context of an unknown language (Russian) were additionally presented while functional magnetic resonance imaging (fMRI) data were acquired. Bimodal speech and gesture processing led to left hemispheric activation increases of the posterior middle temporal gyrus, the premotor cortex, the inferior frontal gyrus, and the right superior temporal sulcus. Activation reductions during the bimodal condition were located in the left superior temporal gyrus and the left posterior insula. Gesture-related activation increases and decreases were dependent on language semantics and were not found in the unknown-language condition. Our results suggest that semantic integration processes for bimodal speech plus gesture comprehension are reflected in activation increases in the classical left hemispheric language areas. Speech-related gestures seem to enhance language comprehension during face-to-face communication.

10.
During face-to-face communication, listeners integrate speech with gestures. The semantic information conveyed by iconic gestures (e.g., a drinking gesture) can aid speech comprehension in adverse listening conditions. In this magnetoencephalography (MEG) study, we investigated the spatiotemporal neural oscillatory activity associated with gestural enhancement of degraded speech comprehension. Participants watched videos of an actress uttering clear or degraded speech, accompanied by a gesture or not, and completed a cued-recall task after watching every video. When gestures semantically disambiguated degraded speech comprehension, an alpha and beta power suppression and a gamma power increase revealed engagement and active processing in the hand-area of the motor cortex, the extended language network (LIFG/pSTS/STG/MTG), medial temporal lobe, and occipital regions. These observed low- and high-frequency oscillatory modulations in these areas support general unification, integration and lexical access processes during online language comprehension, and simulation of and increased visual attention to manual gestures over time. All individual oscillatory power modulations associated with gestural enhancement of degraded speech comprehension predicted a listener's correct disambiguation of the degraded verb after watching the videos. Our results thus go beyond the previously proposed role of oscillatory dynamics in unimodal degraded speech comprehension and provide first evidence for the role of low- and high-frequency oscillations in predicting the integration of auditory and visual information at a semantic level.
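The oscillatory effects in entry 10 (alpha/beta suppression, gamma increase) are typically quantified as baseline-normalised band power from a time-frequency decomposition. Below is a minimal sketch using a Welch power spectrum from SciPy with hypothetical band edges, sampling rate, and toy signals; real MEG pipelines use wavelet or multitaper transforms plus source localisation, which is beyond this illustration.

    import numpy as np
    from scipy.signal import welch

    FS = 600  # assumed MEG sampling rate in Hz
    BANDS = {"alpha": (8, 12), "beta": (15, 30), "gamma": (60, 90)}

    def band_power(signal, fs=FS):
        """Average power per frequency band from a Welch spectrum."""
        freqs, psd = welch(signal, fs=fs, nperseg=fs)
        return {name: psd[(freqs >= lo) & (freqs <= hi)].mean()
                for name, (lo, hi) in BANDS.items()}

    def relative_change(active, baseline):
        """Percent change of band power relative to a pre-stimulus baseline."""
        return {b: 100 * (active[b] - baseline[b]) / baseline[b] for b in BANDS}

    rng = np.random.default_rng(1)
    baseline = band_power(rng.normal(size=2 * FS))  # 2 s pre-stimulus, toy data
    gesture = band_power(rng.normal(size=2 * FS))   # 2 s gesture+speech, toy data
    print(relative_change(gesture, baseline))

Negative values in the alpha/beta bands and positive values in the gamma band would correspond to the suppression and increase described in the abstract.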

11.
There are a number of reasons to suppose that language evolved from manual gestures. We review evidence that the transition from primarily manual to primarily vocal language was a gradual process, and is best understood if it is supposed that speech is itself a gestural system rather than an acoustic system, an idea captured by the motor theory of speech perception and articulatory phonology. Studies of primate premotor cortex, and, in particular, of the so-called "mirror system" suggest a double hand/mouth command system that may have evolved initially in the context of ingestion, and later formed a platform for combined manual and vocal communication. In humans, speech is typically accompanied by manual gesture, speech production itself is influenced by executing or observing hand movements, and manual actions also play an important role in the development of speech, from the babbling stage onwards. The final stage at which speech became relatively autonomous may have occurred late in hominid evolution, perhaps with a mutation of the FOXP2 gene around 100,000 years ago.

12.
During language comprehension, listeners use the global semantic representation from previous sentence or discourse context to immediately integrate the meaning of each upcoming word into the unfolding message-level representation. Here we investigate whether communicative gestures that often spontaneously co-occur with speech are processed in a similar fashion and integrated with previous sentence context in the same way as lexical meaning. Event-related potentials were measured while subjects listened to spoken sentences with a critical verb (e.g., knock), which was accompanied by an iconic co-speech gesture (i.e., KNOCK). Verbal and/or gestural semantic content matched or mismatched the content of the preceding part of the sentence. Despite the difference in the modality and in the specificity of meaning conveyed by spoken words and gestures, the latency, amplitude, and topographical distribution of both word and gesture mismatches are found to be similar, indicating that the brain integrates both types of information simultaneously. This provides evidence for the claim that neural processing in language comprehension involves the simultaneous incorporation of information coming from a broader domain of cognition than only verbal semantics. The neural evidence for similar integration of information from speech and gesture emphasizes the tight interconnection between speech and co-speech gestures.

13.
Recipients process information from speech and co-speech gestures, but it is currently unknown how this processing is influenced by the presence of other important social cues, especially gaze direction, a marker of communicative intent. Such cues may modulate neural activity in regions associated either with the processing of ostensive cues, such as eye gaze, or with the processing of semantic information, provided by speech and gesture. Participants were scanned (fMRI) while taking part in triadic communication involving two recipients and a speaker. The speaker uttered sentences that were and were not accompanied by complementary iconic gestures. Crucially, the speaker alternated her gaze direction, thus creating two recipient roles: addressed (direct gaze) vs unaddressed (averted gaze) recipient. The comprehension of Speech&Gesture relative to SpeechOnly utterances recruited middle occipital, middle temporal and inferior frontal gyri, bilaterally. The calcarine sulcus and posterior cingulate cortex were sensitive to differences between direct and averted gaze. Most importantly, Speech&Gesture utterances, but not SpeechOnly utterances, produced additional activity in the right middle temporal gyrus when participants were addressed. Marking communicative intent with gaze direction modulates the processing of speech-gesture utterances in cerebral areas typically associated with the semantic processing of multi-modal communicative acts.

14.
ABSTRACT

Background: Gestures are spontaneous hand and arm movements that are part of everyday communication. The roles of gestures in communication are disputed. Most agree that they augment the information conveyed in speech. More contentiously, some argue that they facilitate speech, particularly when word-finding difficulties (WFD) occur. Exploring gestures in aphasia may further illuminate their role.

Aims: This study explored the spontaneous use of gestures in the conversation of participants with aphasia (PWA) and neurologically healthy participants (NHP). It aimed to examine the facilitative role of gesture by determining whether gestures particularly accompanied WFD and whether those difficulties were resolved.

Methods & Procedures: Spontaneous conversation data were collected from 20 PWA and 21 NHP. Video samples were analysed for gesture production, speech production, and WFD. Analysis 1 examined whether the production of semantically rich gestures in these conversations was affected by whether the person had aphasia, and/or whether there were difficulties in the accompanying speech. Analysis 2 identified all WFD in the data and examined whether these were more likely to be resolved if accompanied by a gesture, again for both groups of participants.

Outcomes & Results: Semantically rich gestures were frequently employed by both groups of participants, but with no effect of group. There was an effect of the accompanying speech, with gestures occurring most commonly alongside resolved WFD. An interaction showed that this was particularly the case for PWA. NHP, on the other hand, employed semantically rich gestures most frequently alongside fluent speech. Analysis 2 showed that WFD were common in both groups of participants. Unsurprisingly, these were more likely to be resolved for NHP than PWA. For both groups, resolution was more likely if a WFD was accompanied by a gesture.

Conclusions: These findings shed light on the different functions of gesture within conversation. They highlight the importance of gesture during WFD, both in aphasic and neurologically healthy language, and suggest that gesture may facilitate word retrieval.
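The key result in entry 14, that word-finding difficulties (WFD) are more likely to be resolved when accompanied by a gesture, can be checked with a simple contingency-table test on counts of resolved vs unresolved WFD with and without gesture. The sketch below uses hypothetical counts purely to illustrate the analysis; it is not the study's data or its exact statistical model.

    from scipy.stats import chi2_contingency

    # Rows: WFD with gesture, WFD without gesture
    # Columns: resolved, unresolved  (hypothetical counts)
    table = [[58, 22],
             [35, 41]]

    chi2, p, dof, expected = chi2_contingency(table)
    resolved_with = table[0][0] / sum(table[0])
    resolved_without = table[1][0] / sum(table[1])
    print(f"resolved with gesture: {resolved_with:.2f}, "
          f"without: {resolved_without:.2f}, chi2={chi2:.2f}, p={p:.3f}")

A higher resolution proportion in the gesture row, with a reliable test statistic, is the pattern the abstract reports for both participant groups.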

15.
In face-to-face communication, speech is typically enriched by gestures. Clearly, not all people gesture in the same way, and the present study explores whether such individual differences in gesture style are taken into account during the perception of gestures that accompany speech. Participants were presented with one speaker who gestured in a straightforward way and another who also produced self-touch movements. Adding trials with such grooming movements makes the gesture information a much weaker cue compared with the gestures of the non-grooming speaker. The electroencephalogram was recorded as participants watched videos of the individual speakers. Event-related potentials elicited by the speech signal revealed that adding grooming movements attenuated the impact of gesture for this particular speaker. Thus, these data suggest that there is sensitivity to the personal communication style of a speaker and that this affects the extent to which gesture and speech are integrated during language comprehension.

16.
We compared brain activation involved in the observation of isolated right hand movements (e.g. twisting a lid), body-referred movements (e.g. brushing teeth) and expressive gestures (e.g. threatening) in 20 healthy subjects by using functional magnetic resonance imaging (fMRI). Perception-related areas in the occipital and inferior temporal lobe, as well as the mirror neuron system in the lateral frontal (ventral premotor cortex and BA 44) and superior parietal lobe, were active during all three conditions. Observation of body-referred compared to common hand actions induced increased activity in the bilateral posterior superior temporal sulcus (STS), the left temporo-parietal lobe and left BA 45. Expressive gestures involved additional areas related to social perception (bilateral STS, temporal poles, medial prefrontal lobe), emotional processing (bilateral amygdala, bilateral ventrolateral prefrontal cortex (VLPFC)), speech and language processing (Broca's and Wernicke's areas), and the pre-supplementary motor area (pre-SMA). In comparison to body-referred actions, expressive gestures evoked additional activity only in the left VLPFC (BA 47). The valence ratings for expressive gestures correlated significantly with activation intensity in the VLPFC during expressive gesture observation. Valence ratings for negative expressive gestures correlated with right STS activity. Our data suggest that both the VLPFC and the STS code for differential emotional valence during the observation of expressive gestures.

17.
This fMRI study explores brain regions involved with perceptual enhancement afforded by observation of visual speech gesture information. Subjects passively identified words presented in the following conditions: audio-only, audiovisual, audio-only with noise, audiovisual with noise, and visual only. The brain may use concordant audio and visual information to enhance perception by integrating the information in a converging multisensory site. Consistent with response properties of multisensory integration sites, enhanced activity in middle and superior temporal gyrus/sulcus was greatest when concordant audiovisual stimuli were presented with acoustic noise. Activity found in brain regions involved with planning and execution of speech production, in response to visual speech presented with degraded or absent auditory stimulation, is consistent with the use of an additional pathway through which speech perception is facilitated by a process of internally simulating the intended speech act of the observed speaker.

18.
Gestures represent an integral aspect of interpersonal communication, and they are closely linked with language and thought. Brain regions for language processing overlap with those for gesture processing. Two types of gesticulation, beat gestures and metaphoric gestures, are particularly important for understanding the taxonomy of co-speech gestures. Here, we investigated gesture production during taped interviews with respect to regional brain volume. First, we were interested in whether beat gesture production is associated with similar regions as metaphoric gesture. Second, we investigated whether cortical regions associated with metaphoric gesture processing are linked to gesture production based on correlations with brain volumes. We found that beat gestures are uniquely related to regional volume in cerebellar regions previously implicated in discrete motor timing. We suggest that these gestures may be an artifact of the timing processes of the cerebellum that are important for the timing of vocalizations. Second, our findings indicate that brain volumes in regions of the left hemisphere previously implicated in metaphoric gesture processing are positively correlated with metaphoric gesture production. Together, this novel work extends our understanding of left hemisphere regions associated with gesture to indicate their importance in gesture production, and also suggests that beat gestures may be especially unique. This provides important insight into the taxonomy of co-speech gestures, and also further insight into the general role of the cerebellum in language. Hum Brain Mapp 36:4016-4030, 2015. © 2015 Wiley Periodicals, Inc.
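Entry 18 relates regional brain volume to how often speakers produce beat and metaphoric gestures, which at its simplest is a per-region correlation between volume and gesture rate across participants. A minimal sketch with simulated values is below; the sample size, region label, units, and absence of covariates (e.g., total intracranial volume) are assumptions, not the study's pipeline.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(2)
    n_subjects = 40

    # Hypothetical per-subject measures
    cerebellar_volume = rng.normal(140, 10, n_subjects)                    # cm^3, toy values
    beat_rate = 0.05 * cerebellar_volume + rng.normal(0, 1, n_subjects)    # gestures per minute

    r, p = pearsonr(cerebellar_volume, beat_rate)
    print(f"cerebellar volume vs beat-gesture rate: r={r:.2f}, p={p:.4f}")

The same correlation, computed for left-hemisphere cortical volumes against metaphoric-gesture rate, would correspond to the second finding in the abstract.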

19.
Perception of speech is improved when presentation of the audio signal is accompanied by concordant visual speech gesture information. This enhancement is most prevalent when the audio signal is degraded. One potential means by which the brain affords perceptual enhancement is thought to be through the integration of concordant information from multiple sensory channels in common sites of convergence, multisensory integration (MSI) sites. Some studies have identified potential sites in the superior temporal gyrus/sulcus (STG/S) that are responsive to multisensory information from the auditory speech signal and visual speech movement. One limitation of these studies is that they do not control for activity resulting from attentional modulation cued by such things as visual information signaling the onsets and offsets of the acoustic speech signal, as well as activity resulting from MSI of properties of the auditory speech signal with aspects of gross visual motion that are not specific to place of articulation information. This fMRI experiment uses spatial wavelet bandpass filtered Japanese sentences presented with background multispeaker audio noise to discern brain activity reflecting MSI induced by auditory and visual correspondence of place of articulation information that controls for activity resulting from the above-mentioned factors. The experiment consists of a low-frequency (LF) filtered condition containing gross visual motion of the lips, jaw, and head without specific place of articulation information, a midfrequency (MF) filtered condition containing place of articulation information, and an unfiltered (UF) condition. Sites of MSI selectively induced by auditory and visual correspondence of place of articulation information were determined by the presence of activity for both the MF and UF conditions relative to the LF condition. Based on these criteria, sites of MSI were found predominantly in the left middle temporal gyrus (MTG), and the left STG/S (including the auditory cortex). By controlling for additional factors that could also induce greater activity resulting from visual motion information, this study identifies potential MSI sites that we believe are involved with improved speech perception intelligibility.
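The MSI criterion in entry 19 is effectively a conjunction: a voxel counts as a candidate multisensory-integration site only if it shows activity for both the MF > LF and UF > LF contrasts. A toy sketch of that conjunction over statistical maps is below; the threshold, array shape, and simulated t-maps are illustrative, and real analyses apply corrected voxel- or cluster-level thresholds.

    import numpy as np

    def conjunction_mask(t_mf_vs_lf, t_uf_vs_lf, threshold=3.1):
        """Voxels significant in BOTH contrasts (minimum-statistic conjunction)."""
        return (t_mf_vs_lf > threshold) & (t_uf_vs_lf > threshold)

    # Hypothetical whole-brain t-maps on a small toy grid
    rng = np.random.default_rng(3)
    shape = (4, 4, 4)
    t_mf = rng.normal(2, 1.5, shape)
    t_uf = rng.normal(2, 1.5, shape)

    msi_sites = conjunction_mask(t_mf, t_uf)
    print("candidate MSI voxels:", int(msi_sites.sum()), "of", msi_sites.size)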

20.
Asymmetry in auditory cortical oscillations could play a role in speech perception by fostering hemispheric triage of information across the two hemispheres. Due to this asymmetry, fast speech temporal modulations relevant for phonemic analysis could be best perceived by the left auditory cortex, while slower modulations conveying vocal and paralinguistic information would be better captured by the right one. It is unclear, however, whether and how early oscillation-based selection influences speech perception. Using a dichotic listening paradigm in human participants, where we provided different parts of the speech envelope to each ear, we show that word recognition is facilitated when the temporal properties of speech match the rhythmic properties of auditory cortices. We further show that the interaction between the speech envelope and auditory cortical rhythms translates into their level of neural activity (as measured with fMRI). In the left auditory cortex, the neural activity level related to stimulus-brain rhythm interaction predicts speech perception facilitation. These data demonstrate that speech interacts with auditory cortical rhythms differently in the right and left auditory cortex, and that in the latter, the interaction directly impacts speech perception performance.
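The dichotic manipulation in entry 20 depends on splitting the speech temporal envelope into slow and fast modulation bands and routing them to different ears. Below is a minimal sketch of envelope extraction and band-splitting with SciPy; the band edges (roughly below 4 Hz vs 20-40 Hz), filter settings, sampling rate, and toy signal are illustrative assumptions rather than the study's exact parameters.

    import numpy as np
    from scipy.signal import hilbert, butter, sosfiltfilt

    FS = 16000  # assumed audio sampling rate in Hz

    def envelope(audio):
        """Broadband temporal envelope via the Hilbert transform."""
        return np.abs(hilbert(audio))

    def band(env, lo, hi, fs=FS, order=4):
        """Band-limit the envelope to a given modulation-frequency range."""
        sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, env)

    rng = np.random.default_rng(4)
    speech = rng.normal(size=FS)        # 1 s of toy "speech"
    env = envelope(speech)
    slow_mod = band(env, 0.5, 4.0)      # syllabic/prosodic rate, right-hemisphere bias
    fast_mod = band(env, 20.0, 40.0)    # phonemic rate, left-hemisphere bias
    print(slow_mod.shape, fast_mod.shape)

In the actual paradigm, the two band-limited envelope streams would be used to modulate the signals delivered to the left and right ears so that each hemisphere receives the modulation rate it is hypothesised to prefer.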
