Similar Documents
20 similar documents retrieved (search time: 46 ms).
1.
Gaab N  Schlaug G 《Neuroreport》2003,14(18):2291-2295
We compared brain activation patterns between musicians and non-musicians (matched in performance score) while they performed a pitch memory task (using a sparse temporal sampling fMRI method). Both groups showed bilateral activation of the superior temporal, supramarginal, posterior middle and inferior frontal gyri, and superior parietal lobe. Musicians showed more right temporal and supramarginal gyrus activation, while non-musicians had more right primary and left secondary auditory cortex activation. Since the two groups' performance was matched, these results likely indicate processing differences between the groups, possibly related to musical training. Non-musicians rely more on brain regions important for pitch discrimination, whereas musicians prefer to use brain regions specialized in short-term memory and recall to perform well in this pitch memory task.
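The sparse temporal sampling method mentioned above separates stimulus presentation from image acquisition so that scanner noise does not mask the auditory stimuli. Below is a minimal sketch of such an acquisition schedule; all timing values and the function name are illustrative assumptions, not those of the study.

```python
# Minimal sketch of a sparse temporal sampling schedule: each auditory
# stimulus is played in a silent gap, and a single volume is acquired
# afterwards so scanner noise does not overlap the stimulus.
# All timing values are illustrative assumptions, not those of the study.

def sparse_schedule(n_trials, stim_dur=5.0, gap=2.0, acq_dur=2.0):
    """Return (stimulus_onset, acquisition_onset) pairs in seconds."""
    trial_len = stim_dur + gap + acq_dur
    events = []
    for i in range(n_trials):
        t0 = i * trial_len
        events.append((t0, t0 + stim_dur + gap))
    return events

for stim_on, acq_on in sparse_schedule(4):
    print(f"stimulus at {stim_on:5.1f} s, volume acquired at {acq_on:5.1f} s")
```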

2.
In this event-related fMRI study, 12 right-handed volunteers heard human laughter, sentential speech, and nonvocal sounds in which global temporal and harmonic information were varied while they performed a simple auditory target-detection task. This study aimed to delineate distinct peri-auditory regions that preferentially respond to laughter, speech, and nonvocal sounds. Results show that all three types of stimuli evoked blood-oxygen-level-dependent responses along the left and right peri-sylvian cortex. However, we observed differences in regional strength and lateralization in that (i) hearing human laughter preferentially involves auditory and somatosensory fields primarily in the right hemisphere, (ii) hearing spoken sentences activates left anterior and posterior lateral temporal regions, and (iii) hearing nonvocal sounds recruits bilateral areas in the medial portion of Heschl's gyrus and at the medial wall of the posterior Sylvian fissure (planum parietale and parietal operculum). Generally, the data imply a differential regional sensitivity of peri-sylvian areas to different auditory stimuli, with the left hemisphere responding more strongly to speech and the right hemisphere being more amenable to nonspeech stimuli. Interestingly, passive perception of human laughter activates brain regions which control motor (larynx) functions. This observation may speak to a dense intertwining of expressive and receptive mechanisms in the auditory domain. Furthermore, the present study provides evidence for a functional role of inferior parietal areas in auditory processing. Finally, a post hoc conjunction analysis meant to reveal the neural substrates of human vocal timbre demonstrates a particular preference of left and right lateral parts of the superior temporal lobes for stimuli made up of human voices relative to nonvocal sounds.

3.
Although there has been great interest in the neuroanatomical basis of reading, little attention has been focused on auditory language processing. The purpose of this study was to examine the differential neuroanatomical response to the auditory processing of real words and pseudowords. Eight healthy right-handed participants performed two phoneme monitoring tasks (one with real-word stimuli and one with pseudowords) during a functional magnetic resonance imaging (fMRI) scan with a 4.1 T system. Both tasks activated the inferior frontal gyrus (IFG), the posterior superior temporal gyrus (pSTG), and the inferior parietal lobe (IPL). Pseudoword processing elicited significantly more activation within the posterior cortical regions compared with real-word processing. Previous reading studies have suggested that this increase is due to an increased demand on the lexical access system. The left inferior frontal gyrus, on the other hand, did not reveal a significant difference in the amount of activation as a function of stimulus type. The lack of a differential response in the IFG for auditory processing supports its hypothesized involvement in grapheme-to-phoneme conversion processes. These results are consistent with those from previous neuroimaging reading studies and emphasize the utility of examining both input modalities (e.g., visual or auditory) to compose a more complete picture of the language network.

4.
Categorization is fundamental to our perception and understanding of the environment. However, little is known about the neural bases underlying the categorization of sounds. Using human functional magnetic resonance imaging (fMRI), we compared brain responses during a category discrimination task and an auditory discrimination task that used identical sets of sounds. Our stimuli differed along two dimensions: a speech-nonspeech dimension and a fast-slow temporal dynamics dimension. For both tasks, all stimuli activated regions in the primary and nonprimary auditory cortices in the temporal cortex and in the parietal and frontal cortices. When comparing the activation patterns for the category discrimination task to those for the auditory discrimination task, the results show that a core group of regions beyond the auditory cortices, including the inferior and middle frontal gyri, dorsomedial frontal gyrus, and intraparietal sulcus, was preferentially activated for familiar speech categories and for novel nonspeech categories. A number of studies have shown that these regions play a role in working memory tasks. Additionally, the categorization of nonspeech sounds activated the left middle frontal gyrus and right parietal cortex to a greater extent than did the categorization of speech sounds. Processing the temporal aspects of the stimuli had a greater impact on the left lateralization of the categorization network than did other factors, particularly in the inferior frontal gyrus, suggesting that there is no inherent left-hemisphere advantage in the categorical processing of speech stimuli, or for the categorization task itself.

5.
Xiao Z  Zhang JX  Wang X  Wu R  Hu X  Weng X  Tan LH 《Human brain mapping》2005,25(2):212-221
Following Newman and Twieg and others, we used a fast event-related functional magnetic resonance imaging (fMRI) design and contrasted the lexical processing of pseudowords and real words. Participants carried out an auditory lexical decision task on a list of randomly intermixed real and pseudo Chinese two-character (two-syllable) words. The pseudowords were constructed by recombining constituent characters of the real words to control for sublexical code properties. Processing of pseudowords and real words activated a highly comparable network of brain regions, including the bilateral inferior frontal gyrus, superior and middle temporal gyri, calcarine and lingual gyri, and left supramarginal gyrus. Mirroring a behavioral lexicality effect, the left inferior frontal gyrus (IFG) was significantly more activated for pseudowords than for real words. This result disconfirms the popular view that this area plays a role in grapheme-to-phoneme conversion, as such a conversion process was unnecessary in our task with auditory stimulus presentation. The results instead support an alternative view that attributes the increased activity in the left IFG for pseudowords to general processes in decision making, specifically in making positive versus negative responses. Activation in the left supramarginal gyrus was of a much larger volume for real words than for pseudowords, suggesting a role for this region in the representation of phonological or semantic information for two-character Chinese words at the lexical level.

6.
During natural speech perception, humans must parse temporally continuous auditory and visual speech signals into sequences of words. However, most studies of speech perception present only single words or syllables. We used electrocorticography (subdural electrodes implanted on the brains of epileptic patients) to investigate the neural mechanisms for processing continuous audiovisual speech signals consisting of individual sentences. Using partial correlation analysis, we found that posterior superior temporal gyrus (pSTG) and medial occipital cortex tracked both the auditory and the visual speech envelopes. These same regions, as well as inferior temporal cortex, responded more strongly to a dynamic video of a talking face compared to auditory speech paired with a static face. Occipital cortex and pSTG carry temporal information about both auditory and visual speech dynamics. Visual speech tracking in pSTG may be a mechanism for enhancing perception of degraded auditory speech.
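A rough sketch of the partial-correlation logic used for envelope tracking: correlate a neural time series with the auditory speech envelope while controlling for the visual envelope. The signals and variable names below are synthetic stand-ins, not the study's data or pipeline.

```python
# Sketch: how strongly does a neural time series track the auditory speech
# envelope once the visual (mouth-movement) envelope is regressed out?
# All signals here are synthetic placeholders.
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after regressing z out of both."""
    design = np.column_stack([np.ones_like(z), z])        # intercept + covariate
    rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
    ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
t = np.arange(0, 60, 0.01)                       # 60 s sampled at 100 Hz
audio_env = np.abs(np.sin(2 * np.pi * 0.5 * t))  # toy auditory envelope
visual_env = np.roll(audio_env, 15)              # correlated "visual" envelope
neural = 0.8 * audio_env + 0.3 * visual_env + rng.normal(0, 0.5, t.size)

print("partial r(neural, audio | visual) =",
      round(partial_corr(neural, audio_env, visual_env), 3))
```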

7.
This study investigated cortical activity during logical processing of auditory information using a 122-channel dc-SQUID gradiometer. The experimental task was designed to require a simple logical decision prior to counting rare paired tones, which consisted of two different pitches presented to separate ears. Among the six subjects, three showed predominantly left and three predominantly right dipolar activity. When the dipolar sources were superimposed on MR images, the inferior region of the supramarginal gyrus showed activation, suggesting that logical processing occurred in the association cortex rather than in the auditory cortex. We propose a modified cognitive sequence model in which auditory information processed in Heschl's gyri is transmitted to the supramarginal gyrus to commence automatic detection processing.

8.
The brain networks supporting speech identification and comprehension under difficult listening conditions are not well specified. The networks hypothesized to underlie effortful listening include regions responsible for executive control. We conducted meta-analyses of auditory neuroimaging studies to determine whether a common activation pattern of the frontal lobe supports effortful listening under different speech manipulations. Fifty-three functional neuroimaging studies investigating speech perception were divided into three independent Activation Likelihood Estimation analyses based on the type of speech manipulation paradigm used: speech-in-noise (SIN; 16 studies involving 224 participants); spectrally degraded speech using filtering techniques (15 studies involving 270 participants); and linguistic complexity (i.e., levels of syntactic, lexical, and semantic intricacy/density; 22 studies involving 348 participants). Meta-analysis of the SIN studies revealed that higher effort was associated with activation in the left inferior frontal gyrus (IFG), left inferior parietal lobule, and right insula. Studies using spectrally degraded speech demonstrated increased activation of the insula bilaterally and the left superior temporal gyrus (STG). Studies manipulating linguistic complexity showed activation in the left IFG, right middle frontal gyrus, left middle temporal gyrus, and bilateral STG. Planned contrasts revealed left IFG activation in linguistic complexity studies that differed from the activation patterns observed in SIN or spectral degradation studies. Although there was no significant overlap in prefrontal activation across these three speech manipulation paradigms, SIN and spectral degradation showed overlapping regions in the left and right insula. These findings provide evidence that there is regional specialization within the left IFG and that differential executive networks underlie effortful listening.
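Activation Likelihood Estimation treats each study's reported peak coordinates as centres of Gaussian probability distributions, forms a modeled-activation map per study, and then combines the maps voxelwise. Below is a toy numerical sketch of that core computation; the coordinates, grid, and kernel width are invented, and a real ALE analysis (e.g., with GingerALE) also applies permutation-based significance thresholding.

```python
# Toy ALE: blur each study's peak coordinates with a Gaussian to get a
# modeled-activation (MA) map, then combine maps as ALE = 1 - prod(1 - MA).
import numpy as np

GRID = np.stack(np.meshgrid(np.arange(-60, 61, 4),
                            np.arange(-60, 61, 4),
                            np.arange(-60, 61, 4), indexing="ij"), axis=-1)

def ma_map(foci, fwhm=12.0):
    """Modeled activation: voxelwise max of Gaussians centred on each focus."""
    sigma = fwhm / 2.355
    dists = np.linalg.norm(GRID[..., None, :] - np.asarray(foci), axis=-1)
    return np.exp(-dists**2 / (2 * sigma**2)).max(axis=-1)

# Hypothetical peak coordinates (mm) reported by three studies
studies = [[(-44, 20, 8)], [(-40, 24, 4), (36, 20, -4)], [(-48, 16, 12)]]
ale = 1.0 - np.prod([1.0 - ma_map(f) for f in studies], axis=0)
peak = np.unravel_index(np.argmax(ale), ale.shape)
print("peak ALE value:", round(float(ale.max()), 3), "at grid voxel", peak)
```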

9.
The perception of an action is influenced by the observer's familiarity with its movement. But how does motor familiarity with one's own movement patterns modulate the visual perception of action effects? Cortical activation was examined with fMRI while 20 observers watched videotaped point-light displays of markers on the shoulders, right elbow, and wrist of an opposing table tennis player. The racket and ball were not displayed. Participants were asked to predict the invisible effect of the stroke, that is, the ball's flight direction. Different table tennis models were used, and the observers were not informed in advance that some of the presented videos displayed their own movements from earlier training sessions. Predictions had to be made irrespective of the identity of the player represented by the four moving markers. Results showed that participants performed better when observing their "own" strokes. Using a region-of-interest approach, fMRI data showed that observing own videos was accompanied by stronger activation (compared to other videos) in the left angular gyrus of the inferior parietal lobe and the anterior rostral medial frontal cortex. Other videos elicited stronger activation than own videos in the left intraparietal sulcus and right supramarginal gyrus. We suggest that during action observation of motorically familiar movements, the compatibility between the observed action and the observer's motor representation is already coded in the parietal angular gyrus, in addition to the paracingulate gyrus. The activation in the angular gyrus is presumably part of an action-specific effect retrieval that accompanies actor-specific prefrontal processing. The intraparietal sulcus seems to be sensitive to incongruence between observed kinematics and internal model representations, and this also influences processing in the supramarginal gyrus.
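A minimal sketch of the kind of region-of-interest comparison described above: per-subject mean parameter estimates for "own" versus "other" videos in one ROI, compared with a paired t-test. The values are simulated, not taken from the study.

```python
# Paired ROI comparison: mean beta per subject for "own" vs "other" videos.
# Data are simulated; only the logic of the comparison is illustrated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
own = rng.normal(0.6, 0.3, 20)            # 20 observers, ROI betas for "own" videos
other = own - rng.normal(0.2, 0.25, 20)   # lower on average for "other" videos

t, p = stats.ttest_rel(own, other)
print(f"left angular gyrus, own > other: t(19) = {t:.2f}, p = {p:.4f}")
```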

10.
In this study of native Korean trilinguals, we examined the effect of syntactic similarity between the first (L1) and second (L2) languages on cortical activation during the processing of Japanese and English, which are, respectively, very similar to and different from Korean. Subjects had equivalent proficiency in Japanese and English. They performed auditory sentence comprehension tasks in Korean, Japanese, and English during functional MRI (fMRI). The bilateral superior temporal cortex was activated during comprehension of all three languages. The pars triangularis of the left inferior frontal gyrus (IFG) was additionally activated for L2 processing. Furthermore, the right cerebellum, the pars opercularis of the left IFG, and the posteromedial part of the superior frontal gyrus were activated during the English tasks only. We observed significantly greater activation in the pars opercularis of the left IFG, the right cerebellum, and the right superior temporal cortex during the English task than the Japanese task; activation in these regions did not differ significantly between Korean and Japanese. Differential activation of the pars opercularis of the left IFG and the right cerebellum likely reflects syntactic distance, and differential activation in the right superior temporal cortex may reflect the prosodic distance of English from Korean and Japanese. Furthermore, in the pars opercularis of the left IFG and the right cerebellum, a significant negative correlation between activation and duration of exposure was observed for English, but not for Japanese. Our research supports the notion that linguistic similarity between L1 and L2 affects the cortical processing of a second language.

11.
The present study used pleasant and unpleasant music to evoke emotion and functional magnetic resonance imaging (fMRI) to determine neural correlates of emotion processing. Unpleasant (permanently dissonant) music contrasted with pleasant (consonant) music showed activations of amygdala, hippocampus, parahippocampal gyrus, and temporal poles. These structures have previously been implicated in the emotional processing of stimuli with (negative) emotional valence; the present data show that a cerebral network comprising these structures can be activated during the perception of auditory (musical) information. Pleasant (contrasted to unpleasant) music showed activations of the inferior frontal gyrus (IFG, inferior Brodmann's area (BA) 44, BA 45, and BA 46), the anterior superior insula, the ventral striatum, Heschl's gyrus, and the Rolandic operculum. IFG activations appear to reflect processes of music-syntactic analysis and working memory operations. Activations of Rolandic opercular areas possibly reflect the activation of mirror-function mechanisms during the perception of the pleasant tunes. Rolandic operculum, anterior superior insula, and ventral striatum may form a motor-related circuitry that serves the formation of (premotor) representations for vocal sound production during the perception of pleasant auditory information. In all of the mentioned structures, except the hippocampus, activations increased over time during the presentation of the musical stimuli, indicating that the effects of emotion processing have temporal dynamics; the temporal dynamics of emotion have so far mainly been neglected in the functional imaging literature.

12.
Meyer M  Zaehle T  Gountouna VE  Barron A  Jancke L  Turk A 《Neuroreport》2005,16(18):1985-1989
This functional magnetic resonance imaging study investigates the neural underpinnings of spectro-temporal integration during speech perception. Participants performed an auditory discrimination task on a set of sine-wave analogues that could be perceived as either nonspeech or speech. Behavioural results revealed a difference in the processing mode; spectro-temporal integration occurred during speech perception, but not when stimuli were perceived as nonspeech. In terms of neuroimaging, we observed an activation increase in the left posterior primary and secondary auditory cortex, namely Heschl's gyrus and planum temporale encroaching onto the superior temporal sulcus, reflecting a shift from auditory to speech perception. This finding demonstrates that the left posterior superior temporal lobe is essential for spectro-temporal processing during speech perception.
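Sine-wave analogues of the kind used here replace natural speech with a few sinusoids that follow formant-like frequency tracks, stripping away voice quality while preserving the spectro-temporal trajectory. Below is a rough synthesis sketch; the frequency tracks are invented, whereas real stimuli are derived from formants measured on recorded utterances.

```python
# Rough sketch of sine-wave speech synthesis: sum a few sinusoids whose
# frequencies follow formant-like trajectories. Tracks here are invented.
import numpy as np

fs, dur = 16000, 0.5
t = np.arange(int(fs * dur)) / fs
tracks = [np.linspace(300, 700, t.size),      # pseudo-F1 (Hz)
          np.linspace(2200, 1100, t.size),    # pseudo-F2 (Hz)
          np.full(t.size, 2800.0)]            # pseudo-F3 (Hz)
# Integrate instantaneous frequency to get phase, then sum the sinusoids.
analogue = sum(np.sin(2 * np.pi * np.cumsum(f) / fs) for f in tracks) / 3.0
print("samples:", analogue.size,
      "peak amplitude:", round(float(np.abs(analogue).max()), 2))
```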

13.
Recent animal and human studies indicate the existence of a neural pathway for sound localization which is similar to the "where" pathway of the visual system and distinct from the sound identification pathway. This study sought to highlight this pathway using a passive listening protocol. We employed fMRI to study cortical areas activated during the processing of sounds coming from different locations, and MEG to disclose the temporal dynamics of these areas. In addition, the hypothesis of different activation levels in the right and left hemispheres, due to hemispheric specialization of the human brain, was investigated. The fMRI results indicate that the processing of sounds coming from different locations activates a complex neuronal circuit, similar to the sound localization system described in monkeys and known as the auditory "where" pathway. This system includes Heschl's gyrus, the superior temporal gyrus, the supramarginal gyrus, and the inferior and middle frontal lobe. The MEG analysis allowed assessment of the timing of this circuit: activation of Heschl's gyrus was observed 139 ms after the auditory stimulus, the peak latency of the source located in the superior temporal gyrus was at 156 ms, and the inferior parietal lobule and the supramarginal gyrus peaked at 162 ms. Both hemispheres were involved in the processing of sounds coming from different locations, but stronger activation was observed in the right hemisphere.
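Peak latencies like those quoted above are read off source waveforms as the time of maximum response in a post-stimulus window. A schematic illustration with synthetic waveforms follows; the Gaussian bumps merely stand in for evoked responses at the reported latencies.

```python
# Find the peak latency of each (synthetic) source waveform in a
# post-stimulus search window. Waveforms are placeholders, not real MEG data.
import numpy as np

fs = 1000                                   # 1 kHz sampling, t = 0 at stimulus onset
t = np.arange(-0.1, 0.4, 1 / fs)
rng = np.random.default_rng(2)

def toy_source(peak_s):
    return np.exp(-((t - peak_s) ** 2) / (2 * 0.02 ** 2)) + rng.normal(0, 0.05, t.size)

sources = {"Heschl's gyrus": toy_source(0.139),
           "superior temporal gyrus": toy_source(0.156),
           "supramarginal gyrus": toy_source(0.162)}
for name, wave in sources.items():
    window = (t >= 0.05) & (t <= 0.3)                   # post-stimulus search window
    latency = t[window][np.argmax(np.abs(wave[window]))]
    print(f"{name}: peak at {latency * 1000:.0f} ms")
```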

14.
Functional magnetic resonance imaging was used to compare cortical organization of the first (L1, Russian) and second (L2, English) languages. Six fluent Russian-English bilinguals who acquired their second language postpuberty were tested with words and nonwords presented either auditorily or visually. Results showed that both languages activated similar cortical networks, including the inferior frontal, middle frontal, superior temporal, middle temporal, angular, and supramarginal gyri. Within the inferior frontal gyrus (IFG), L2 activated a larger cortical volume than L1 during lexical and phonological processing. For both languages, the left IFG was more active than the right IFG during lexical processing. Within the left IFG, the distance between centers of activation associated with lexical processing of translation equivalents across languages was larger than the distance between centers of activation associated with lexical processing of different words in the same language. Results of phonological processing analyses revealed different centers of activation associated with the first versus the second language in the IFG, but not in the superior temporal gyrus (STG). These findings are discussed within the context of the current literature on cortical organization in bilinguals and suggest variation in bilingual cortical activation associated with lexical, phonological, and orthographic processing.
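The "distance between centers of activation" measure used here is the Euclidean distance between peak (or centre-of-mass) coordinates in stereotaxic space. A small sketch with invented coordinates:

```python
# Euclidean distance between two activation centres in stereotaxic space.
# The coordinates below are invented for illustration.
import numpy as np

def centre_distance(peak_a, peak_b):
    """Euclidean distance (mm) between two activation centres."""
    return float(np.linalg.norm(np.asarray(peak_a) - np.asarray(peak_b)))

l1_lexical = (-46, 22, 10)   # hypothetical L1 centre in left IFG (mm)
l2_lexical = (-40, 28, 4)    # hypothetical L2 centre in left IFG (mm)
print("cross-language distance:",
      round(centre_distance(l1_lexical, l2_lexical), 1), "mm")
```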

15.
Objective: To use fMRI to investigate brain activation patterns during a simple arithmetic task in young and elderly Chinese adults and their relationship to behavioral performance. Methods: Healthy volunteers in a young group (n = 19) and an elderly group (n = 20) underwent fMRI during a control task and a simple arithmetic task. Results: The two groups did not differ significantly in education level (P = 0.125), intelligence (P = 0.921), or accuracy on the control task (P = 0.142) and the simple multiplication task (P = 0.880), but reaction times of the elderly group were significantly longer on both the control task (P = 0.000) and the simple multiplication task (P = 0.005). In the young group, the task activated the right supramarginal gyrus, extending into the intraparietal sulcus and the posterior middle and superior temporal gyri, the precentral gyrus and premotor area, and the prefrontal cortex; the left supramarginal gyrus, extending into the posterior superior temporal gyrus and angular gyrus, the intraparietal sulcus region, and the middle and inferior temporal gyri; and, medially, the posterior cingulate gyrus, precuneus, supplementary motor area, hippocampal sulcus, parahippocampal gyrus, and medial prefrontal cortex. In the elderly group, activation was observed in the right supramarginal gyrus and inferior parietal region, extending into the posterior middle and superior temporal gyri, the precentral gyrus and premotor area, and the prefrontal cortex; the left supramarginal gyrus and angular gyrus, extending into the inferior parietal region, the precentral gyrus and premotor area, the insula, and the prefrontal cortex; and, medially, the posterior cingulate gyrus and paracentral lobule, anterior cingulate gyrus, and medial prefrontal cortex. Regions activated in both groups included the inferior parietal region, precuneus, pre- and postcentral gyri, and the frontoparietal network, as well as the temporal lobe and subcortical structures such as the parahippocampal gyrus, uncus, claustrum, and posterior cingulate gyrus. Conclusion: The main components of the network related to arithmetic-fact retrieval are relatively unaffected by age; in the elderly, task-related activation converges mainly on task-relevant parietal regions.
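A minimal sketch of the behavioural comparison described above: reaction times of a young and an elderly group compared with an independent-samples (Welch) t-test. The data are simulated, and only the group sizes mirror those in the study.

```python
# Independent-samples comparison of reaction times between two groups.
# Data are simulated placeholders, not the study's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rt_young = rng.normal(650, 80, 19)      # ms, 19 hypothetical young subjects
rt_old = rng.normal(820, 110, 20)       # ms, 20 hypothetical elderly subjects

t, p = stats.ttest_ind(rt_young, rt_old, equal_var=False)  # Welch's t-test
print(f"reaction time: t = {t:.2f}, p = {p:.4f}")
```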

16.
Liu J  Li J  Rieth CA  Huber DE  Tian J  Lee K 《Neuropsychologia》2011,49(5):1177-1186
The present study employed dynamic causal modeling to investigate the effective functional connectivity between regions of the neural network involved in top-down letter processing. We used an illusory letter detection paradigm in which participants detected letters while viewing pure noise images. When participants detected letters, the response of the right middle occipital gyrus (MOG) in the visual cortex was enhanced by increased feed-backward connectivity from the left inferior frontal gyrus (IFG). In addition, illusory letter detection increased feed-forward connectivity from the right MOG to the left inferior parietal lobule. Originating in the left IFG, this top-down letter processing network may facilitate the detection of letters by activating letter processing areas within the visual cortex. This activation in turn may highlight the visual features of letters and send letter information to activate the associated phonological representations in the identified parietal region.
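Dynamic causal modeling rests on a bilinear neural state equation, dx/dt = (A + u_mod*B)x + C*u_drive, where A holds the fixed connections, B the modulation of connections by an experimental input, and C the driving-input weights. Below is a toy forward simulation of that equation; the region labels echo the abstract, but all connection strengths and inputs are invented, and real DCM additionally fits a haemodynamic model and estimates the parameters from the data (e.g., in SPM).

```python
# Toy simulation of the bilinear DCM state equation.
# Connection strengths, inputs, and timings are illustrative assumptions.
import numpy as np

regions = ["right MOG", "left IPL", "left IFG"]
A = np.array([[-1.0, 0.0, 0.2],    # A[0, 2]: IFG -> MOG feedback
              [ 0.3, -1.0, 0.0],   # A[1, 0]: MOG -> IPL feedforward
              [ 0.0, 0.2, -1.0]])  # A[2, 1]: IPL -> IFG
B = np.zeros((3, 3)); B[0, 2] = 0.5   # letter detection boosts IFG -> MOG
C = np.array([0.8, 0.0, 0.0])         # visual input drives right MOG

dt, T = 0.01, 10.0
x = np.zeros(3)                        # neural states of the three regions
for step in range(int(T / dt)):
    time = step * dt
    u_stim = 1.0                                   # noise images always on screen
    u_detect = 1.0 if 3.0 <= time < 7.0 else 0.0   # illusory-detection period
    x = x + dt * ((A + u_detect * B) @ x + C * u_stim)

print({region: round(float(state), 3) for region, state in zip(regions, x)})
```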

17.
The reading system can be broken down into four basic subcomponents in charge of prelexical, orthographic, phonological, and lexico-semantic processes. These processes need to work together for a child to become a fluent and efficient reader. Using functional magnetic resonance imaging (fMRI), we systematically analyzed differences in the neural activation patterns of these four basic subcomponents in children (N = 41, 9–13 years) using tasks specifically tapping each component (letter identification, orthographic decision, phonological decision, and semantic categorization). Regions of interest (ROIs) were selected based on a meta-analysis of child reading and included the left ventral occipito-temporal cortex (vOT), left posterior parietal cortex (PPC), left inferior frontal gyrus (IFG), and bilateral supplementary motor area (SMA). Compared to a visual baseline task, enhanced activation in vOT and IFG was observed for all tasks, with very little difference between tasks. Activity in the dorsal PPC system was confined to prelexical and phonological processing. Activity in the SMA was found in the orthographic, phonological, and lexico-semantic tasks. Our results are consistent with the idea of an early engagement of the vOT accompanied by executive control functions in the frontal system, including the bilateral SMA.

18.
Auditory verbal hallucinations (AVHs) have a high prevalence in schizophrenic patients. An array of studies has explored the neural correlates of AVHs by means of functional neuroimaging and has associated AVHs with diverse brain regions, some of which have been shown to be involved in speech generation, speech perception, and auditory stimulus processing. We divided these studies into "state" studies, comparing periods of presence and absence of AVHs within subjects, and "trait" studies, comparing patients experiencing AVHs with patients without AVHs or healthy controls during tasks with verbal material. We set out to test the internal consistency and possible dissociations of the neural correlates of AVHs. We used activation likelihood estimation to perform quantitative meta-analyses on brain regions reported in state and trait studies on AVHs to assess significant concordance across studies. State studies were associated with activation in the bilateral inferior frontal gyrus, bilateral postcentral gyrus, and left parietal operculum. Trait studies, on the other hand, showed convergent decreases in hallucinating subjects in activity of the left superior temporal gyrus, left middle temporal gyrus, anterior cingulate cortex, and left premotor cortex. Based on the clear dissociation of brain regions that show convergence across state as compared to trait studies, we conclude that the state of experiencing AVHs is primarily related to brain regions that have been implicated in speech production, i.e., Broca's area, whereas the general trait that makes humans prone to AVHs seems to be related to brain areas involved in auditory stimulus processing and speech perception, i.e., the auditory cortex.

19.
Yeh YY  Kuo BC  Liu HL 《Brain research》2007,1130(1):146-157
The neural mechanisms of attentional orienting in visuospatial working memory for change detection were investigated. A spatial cue was provided, with its onset time manipulated to allow more effective top-down control with an early cue than with a late cue. The change type was also manipulated so that accurate detection depended on color or on the binding of color and location. The results showed that both the frontal and parietal regions subserved the change detection task without cueing. When data were collapsed over the two change types, early cueing increased activation in the right inferior frontal gyrus (IFG) and middle frontal gyrus (MFG), while late cueing increased activation in the right inferior parietal lobule (IPL) and temporoparietal junction (TPJ), as compared with the no-cue condition. The cue onset time led to different levels of enhancement in the frontal and posterior cortices, related to top-down control and stimulus-driven orienting, respectively. For feature detection, early cueing increased activation in the right MFG, and late cueing increased activation in the bilateral precuneus (PCu), right TPJ, and right cuneus. The neural correlates of conjunction detection involved the right PCu and cerebellum without cueing, were associated with the anterior MFG, left IFG, and left STG with early cueing, and involved the right MFG, left IFG, and right IPL with late cueing. The left IFG was correlated with memory retrieval of the cued representation for conjunction detection, and the right posterior PCu was associated with maintenance and memory retrieval among competing stimuli.

20.
We compared fMRI findings (using SPM99) obtained with a repetition task in normal subjects with those of two patients with Broca's and Wernicke's aphasia who had received speech therapy and shown complete recovery. Both aphasic patients with left-hemisphere damage who showed complete recovery exhibited activation of only the compensatory area in the right hemisphere during the repetition task. Recovery from Broca's aphasia involves reorganization and neuromodulation between the external temporopolar area and the anterior superior temporal area of the superior temporal gyrus, the putamen, and the inferior frontal gyrus, while recovery from Wernicke's aphasia involves reorganization and neuromodulation between the superior temporal gyrus of the temporal region, the posterior supramarginal gyrus, and the inferior parietal lobule of the parietal region.

