Similar Articles
20 similar articles were retrieved.
1.
Chen JL  Rae C  Watkins KE 《NeuroImage》2012,59(2):1200-1208
Interactions between the auditory and motor systems are important for music and speech, and may be especially relevant when one learns to associate sounds with movements such as when learning to play a musical instrument. However, little is known about the neural substrates underlying auditory-motor learning. This study used fMRI to investigate the formation of auditory-motor associations while participants with no musical training learned to play a melody. Listening to melodies before and after training activated the superior temporal gyrus bilaterally, but neural activity in this region was significantly reduced on the right when participants listened to the trained melody. When playing melodies and random sequences, activity in the left dorsal premotor cortex (PMd) was reduced in the late compared to early phase of training; learning to play the melody was also associated with reduced neural activity in the left ventral premotor cortex (PMv). Participants with the highest performance scores for learning the melody showed more reduced neural activity in the left PMd and PMv. Learning to play a melody or random sequence involves acquiring conditional associations between key-presses and their corresponding musical pitches, and is related to activity in the PMd. Learning to play a melody additionally involves acquisition of a learned auditory-motor sequence and is related to activity in the PMv. Together, these findings demonstrate that auditory-motor learning is related to the reduction of neural activity in brain regions of the dorsal auditory action stream, which suggests increased efficiency in neural processing of a learned stimulus.

2.
Rhythm is an essential element of human culture, particularly in language and music. To acquire language or music, we have to perceive the sensory inputs, organize them into structured sequences as rhythms, actively hold the rhythm information in mind, and use the information when we reproduce or mimic the same rhythm. Previous brain imaging studies have elucidated brain regions related to the perception and production of rhythms. However, the neural substrates involved in the working memory of rhythm remain unclear. In addition, little is known about the processing of rhythm information from non-auditory inputs (visual or tactile). Therefore, we measured brain activity by functional magnetic resonance imaging while healthy subjects memorized and reproduced auditory and visual rhythmic information. The inferior parietal lobule, inferior frontal gyrus, supplementary motor area, and cerebellum exhibited significant activations during both encoding and retrieving rhythm information. In addition, most of these areas exhibited significant activation also during the maintenance of rhythm information. All of these regions functioned in the processing of auditory and visual rhythms. The bilateral inferior parietal lobule, inferior frontal gyrus, supplementary motor area, and cerebellum are thought to be essential for motor control. When we listen to a certain rhythm, we are often stimulated to move our body, which suggests the existence of a strong interaction between rhythm processing and the motor system. Here, we propose that rhythm information may be represented and retained as information about bodily movements in the supra-modal motor brain system.

3.
Semantic memory has been investigated in numerous neuroimaging and clinical studies, most of which have used verbal or visual material, and only very rarely musical material. Clinical studies have suggested that there is a relative neural independence between verbal and musical semantic memory. In the present study, “musical semantic memory” is defined as memory for “well-known” melodies without any knowledge of the spatial or temporal circumstances of learning, while “verbal semantic memory” corresponds to general knowledge about concepts, again without any knowledge of the spatial or temporal circumstances of learning. Our aim was to compare the neural substrates of musical and verbal semantic memory by administering the same type of task in each modality. We used high-resolution H2(15)O PET to observe 11 young subjects performing two main tasks: (1) a musical semantic memory task, where the subjects heard the first part of familiar melodies and had to decide whether the second part they heard matched the first, and (2) a verbal semantic memory task with the same design, but where the material consisted of well-known expressions or proverbs. The musical semantic memory condition activated the superior temporal area and inferior and middle frontal areas in the left hemisphere and the inferior frontal area in the right hemisphere. The verbal semantic memory condition activated the middle temporal region in the left hemisphere and the cerebellum in the right hemisphere. We found that the verbal and musical semantic processes activated a common network extending throughout the left temporal neocortex. In addition, there was a material-dependent topographical preference within this network, with predominantly anterior activation during musical tasks and predominantly posterior activation during semantic verbal tasks.

4.
Schmithorst VJ 《NeuroImage》2005,25(2):444-451
Music perception is a quite complex cognitive task, involving the perception and integration of various elements including melody, harmony, pitch, rhythm, and timbre. A preliminary functional MRI investigation of music perception was performed, using a simplified passive listening task. Group independent component analysis (ICA) was used to separate out various components involved in music processing, as the hemodynamic responses are not known a priori. Various components consistent with auditory processing, expressive language, syntactic processing, and visual association were found. The results are discussed in light of various hypotheses regarding modularity of music processing and its overlap with language processing. The results suggest that, while some networks overlap with ones used for language processing, music processing may involve its own domain-specific processing subsystems.
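
For orientation, group spatial ICA of fMRI data of the kind mentioned above can be expressed in a few lines with the nilearn package. The sketch below is a generic illustration only; the file names, number of components, and smoothing value are assumptions, not the parameters of the study.

    # Minimal sketch of group spatial ICA on fMRI runs (nilearn assumed installed).
    from nilearn.decomposition import CanICA

    func_files = ["sub01_music_bold.nii.gz",   # hypothetical preprocessed 4D images,
                  "sub02_music_bold.nii.gz"]   # one file per subject

    ica = CanICA(n_components=20,              # number of group components to extract
                 smoothing_fwhm=6.0,           # spatial smoothing in mm
                 standardize=True,
                 random_state=0)
    ica.fit(func_files)                        # concatenates subjects and runs group ICA

    # One 3D spatial map per estimated component
    ica.components_img_.to_filename("group_ica_components.nii.gz")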

5.
Kim JJ  Kim MS  Lee JS  Lee DS  Lee MC  Kwon JS 《NeuroImage》2002,15(4):879-891
Verbal working memory plays a significant role in language comprehension and problem-solving. The prefrontal cortex has been suggested as a critical area in working memory. Given that domain-specific dissociations of working memory may exist within the prefrontal cortex, it is possible that there may also be further functional divisions within the verbal working memory processing. While differences in the areas of the brain engaged in native and second languages have been demonstrated, little is known about the dissociation of verbal working memory associated with native and second languages. We have used H2(15)O positron emission tomography in 14 normal subjects in order to identify the neural correlates selectively involved in working memory of native (Korean) and second (English) languages. All subjects were highly proficient in the native language but poorly proficient in the second language. Cognitive tasks were a two-back task for three kinds of visually presented objects: simple pictures, English words, and Korean words. The anterior portion of the right dorsolateral prefrontal cortex and the left superior temporal gyrus were activated in working memory for the native language, whereas the posterior portion of the right dorsolateral prefrontal cortex and the left inferior temporal gyrus were activated in working memory for the second language. The results suggest that the right dorsolateral prefrontal cortex and left temporal lobe may be organized into two discrete, language-related functional systems. Internal phonological processing seems to play a predominant role in working memory processing for the native language with a high proficiency, whereas visual higher order control does so for the second language with a low proficiency.
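
The two-back task mentioned above has a simple target rule: a trial is a match when the current item equals the item presented two trials earlier. Below is a minimal Python sketch of that scoring rule with made-up stimuli; it is illustrative only and not the authors' stimulus-delivery code.

    # A trial is a 2-back match when the current item equals the item two trials back.
    def two_back_targets(stimuli):
        """Return a list of booleans marking 2-back matches."""
        return [i >= 2 and stimuli[i] == stimuli[i - 2]
                for i in range(len(stimuli))]

    # Example with hypothetical Korean-word stimuli
    seq = ["나무", "바다", "나무", "하늘", "나무"]
    print(two_back_targets(seq))   # [False, False, True, False, True]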

6.
We used functional magnetic resonance imaging to investigate the effect of two factors on the neural control of temporal sequence performance: the modality in which the rhythms had been trained, and the modality of the pacing stimuli preceding performance. The rhythms were trained 1-2 days before scanning. Each participant learned two rhythms: one was presented visually, the other auditorily. During fMRI, the rhythms were performed in blocks. In each block, four beats of a visual or auditory pacing metronome were followed by repetitive self-paced rhythm performance from memory. Data from the self-paced performance phase was analysed in a 2x2 factorial design, with the two factors Training Modality (auditory or visual) and Metronome Modality (auditory or visual), as well as with a conjunction analysis across all active conditions, to identify activations that were independent of both Training Modality and Metronome Modality. We found a significant main effect only for visual versus auditory Metronome Modality, in the left angular gyrus, due to a deactivation of this region after auditory pacing. The conjunction analysis revealed a set of brain areas that included dorsal auditory pathway areas (left temporo-parietal junction area and ventral premotor cortex), dorsal premotor cortex, the supplementary and presupplementary premotor areas, the cerebellum and the basal ganglia. We conclude that these regions are involved in controlling performance of well-learned rhythms, regardless of the modality in which the rhythms are trained and paced. This suggests that after extensive short-term training, all rhythms, even those that were both trained and paced in visual modality, had been transformed into auditory-motor representations. The deactivation of the angular cortex following auditory pacing may represent cross-modal auditory-visual inhibition.
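
As a rough illustration of how a 2x2 within-subject main effect and a conjunction across conditions can be tested on extracted activation estimates, consider the Python sketch below. The placeholder data, the condition ordering, and the minimum-statistic formulation of the conjunction are assumptions for illustration and do not reproduce the study's actual voxel-wise analysis.

    import numpy as np
    from scipy import stats

    # Placeholder per-subject activation estimates for one region: subjects x 4 conditions,
    # ordered [trainA-paceA, trainA-paceV, trainV-paceA, trainV-paceV].
    betas = np.random.randn(16, 4)

    # Main effect of Metronome Modality: auditory-paced minus visually-paced conditions
    metronome_effect = (betas[:, 0] + betas[:, 2]) - (betas[:, 1] + betas[:, 3])
    t_met, p_met = stats.ttest_1samp(metronome_effect, 0.0)

    # Conjunction across all four active conditions vs. baseline: under the
    # conjunction-null logic, the smallest of the four t-values must be significant.
    t_each, _ = stats.ttest_1samp(betas, 0.0, axis=0)
    print(t_met, p_met, t_each.min())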

7.
Adults and children processing music: an fMRI study
Koelsch S  Fritz T  Schulze K  Alsop D  Schlaug G 《NeuroImage》2005,25(4):1068-1076
The present study investigates the functional neuroanatomy of music perception with functional magnetic resonance imaging (fMRI). Three different subject groups were investigated to examine developmental aspects and effects of musical training: 10-year-old children with varying degrees of musical training, adults without formal musical training (nonmusicians), and adult musicians. Subjects made judgements on sequences that ended on chords that were music-syntactically either regular or irregular. In adults, irregular chords activated the inferior frontal gyrus, orbital frontolateral cortex, the anterior insula, ventrolateral premotor cortex, anterior and posterior areas of the superior temporal gyrus, the superior temporal sulcus, and the supramarginal gyrus. These structures presumably form different networks mediating cognitive aspects of music processing (such as processing of musical syntax and musical meaning, as well as auditory working memory), and possibly emotional aspects of music processing. In the right hemisphere, the activation pattern of children was similar to that of adults. In the left hemisphere, adults showed larger activations than children in prefrontal areas, in the supramarginal gyrus, and in temporal areas. In both adults and children, musical training was correlated with stronger activations in the frontal operculum and the anterior portion of the superior temporal gyrus.

8.
This 3-T fMRI study investigates brain regions similarly and differentially involved with listening and covert production of singing relative to speech. Given the greater use of auditory-motor self-monitoring and imagery with respect to consonance in singing, brain regions involved with these processes are predicted to be differentially active for singing more than for speech. The stimuli consisted of six Japanese songs. A block design was employed in which the tasks for the subject were to listen passively to singing of the song lyrics, passively listen to speaking of the song lyrics, covertly sing the song lyrics visually presented, covertly speak the song lyrics visually presented, and to rest. The conjunction of passive listening and covert production tasks used in this study allows general neural processes underlying both perception and production to be discerned that are not exclusively a result of stimulus-induced auditory processing or of low-level articulatory motor control. Brain regions involved with both perception and production for singing as well as speech were found to include the left planum temporale/superior temporal parietal region, as well as left and right premotor cortex, lateral aspect of the VI lobule of posterior cerebellum, anterior superior temporal gyrus, and planum polare. Greater activity for the singing over the speech condition for both the listening and covert production tasks was found in the right planum temporale. Greater activity for singing over speech was also present in brain regions involved with consonance: the orbitofrontal cortex (listening task) and the subcallosal cingulate (covert production task). The results are consistent with the PT mediating representational transformation across auditory and motor domains in response to consonance for singing over that of speech. Hemispheric laterality was assessed by paired t tests between active voxels in the contrast of interest relative to the left-right flipped contrast of interest calculated from images normalized to the left-right reflected template. Consistent with some hypotheses regarding hemispheric specialization, a pattern of differential laterality for speech over singing (both covert production and listening tasks) occurs in the left temporal lobe, whereas singing over speech (listening task only) occurs in the right temporal lobe.
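
The laterality analysis described above compares each contrast image with its left-right mirror. The Python sketch below illustrates the core idea on synthetic arrays; it simply flips the data along the x axis rather than re-normalizing images to a reflected template, and all dimensions are placeholders.

    import numpy as np
    from scipy import stats

    # Placeholder contrast maps (subjects x X x Y x Z), assumed to sit in a
    # left-right symmetric template space so that flipping x swaps hemispheres.
    contrast_maps = np.random.randn(12, 64, 64, 40)
    flipped_maps = contrast_maps[:, ::-1, :, :]   # mirror along the left-right axis

    # Voxel-wise paired t-test: positive t marks voxels more active than their
    # homologue in the opposite hemisphere.
    t_map, p_map = stats.ttest_rel(contrast_maps, flipped_maps, axis=0)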

9.
Obleser J  Meyer L  Friederici AD 《NeuroImage》2011,56(4):2310-2320
Under real-life adverse listening conditions, the interdependence of the brain's analysis of language structure (syntax) and its analysis of the acoustic signal is unclear. In two fMRI experiments, we first tested the functional neural organization when listening to increasingly complex syntax in fMRI. We then tested parametric combinations of syntactic complexity (argument scrambling in three degrees) with speech signal degradation (noise-band vocoding in three different numbers of bands), to shed light on the mutual dependency of sound and syntax analysis along the neural processing pathways. The left anterior and the posterior superior temporal sulcus (STS) as well as the left inferior frontal cortex (IFG) were linearly more activated as syntactic complexity increased (Experiment 1). In Experiment 2, when syntactic complexity was combined with improving signal quality, this pattern was replicated. However, when syntactic complexity was additive to degrading signal quality, the syntactic complexity effect in the IFG shifted dorsally and medially, and the activation effect in the left posterior STS shifted from posterior toward more middle sections of the sulcus. A distribution analysis of supra- as well as subthreshold data was indicative of this pattern of shifts in the anterior and posterior STS and within the IFG. Results suggest a signal quality gradient within the fronto-temporal language network. More signal-bound processing areas, lower in the processing hierarchy, become relatively more recruited for the analysis of complex language input under more challenging acoustic conditions ("upstream delegation"). This finding provides evidence for dynamic resource assignments along the neural pathways in auditory language comprehension.
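
Noise-band vocoding, the degradation method named above, follows a standard recipe: split the signal into frequency bands, extract each band's amplitude envelope, and re-impose those envelopes on band-limited noise. The sketch below illustrates that recipe with arbitrary filter settings and a synthetic signal; it is not the study's stimulus-generation code.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def vocode(signal, fs, n_bands, f_lo=100.0, f_hi=7000.0):
        """Noise-band vocode a 1-D signal into n_bands log-spaced bands."""
        edges = np.geomspace(f_lo, f_hi, n_bands + 1)
        noise = np.random.randn(len(signal))
        out = np.zeros(len(signal))
        for lo, hi in zip(edges[:-1], edges[1:]):
            sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            envelope = np.abs(hilbert(sosfiltfilt(sos, signal)))  # band amplitude envelope
            out += envelope * sosfiltfilt(sos, noise)             # modulate band-limited noise
        return out

    # Synthetic 1-s example; a real sentence recording would be loaded instead.
    fs = 16000
    degraded_4band = vocode(np.random.randn(fs), fs=fs, n_bands=4)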

10.
To investigate cortical auditory and motor coupling in professional musicians, we compared the functional magnetic resonance imaging (fMRI) activity of seven pianists to seven non-musicians utilizing a passive task paradigm established in a previous learning study. The tasks involved either passively listening to short piano melodies or pressing keys on a mute MRI-compliant piano keyboard. Both groups were matched with respect to age and gender, and did not exhibit any overt performance differences in the keypressing task. The professional pianists showed increased activity compared to the non-musicians in a distributed cortical network during both the acoustic and the mute motion-related task. A conjunction analysis revealed a distinct musicianship-specific network being co-activated during either task type, indicating areas involved in auditory-sensorimotor integration. This network is comprised of dorsolateral and inferior frontal cortex (including Broca's area), the superior temporal gyrus (Wernicke's area), the supramarginal gyrus, and supplementary motor and premotor areas.

11.
We examined changes in relative cerebral blood flow (relCBF) using PET during a sustained attention paradigm which included auditory stimulation and different tasks of mental counting. Ten normal volunteers underwent PET (15O water) during a baseline state and under experimental conditions which included listening to clicks, serial counting with auditory stimulation, counting with no auditory stimulation, and an additional component of working memory and time estimation. All subjects performed within normal limits in a battery of neurocognitive tests, which included measures of attention and working memory. Both counting with auditory stimulation and counting with no auditory stimulation engaged motor cortex, putamen, cerebellum, and anterior cingulate. Furthermore, counting with no auditory stimulation relative to counting while listening resulted in significantly increased relCBF in the inferior parietal, dorsolateral prefrontal, and anterior cingulate. The findings obtained in this study support the notion that the parietal and dorsolateral prefrontal cortex are involved when time estimation and working memory form part of a task requiring sustained attention.

12.
Objective: To explore the role of the left anterior temporal lobe in the processing of auditory information in Chinese. Methods: Functional magnetic resonance imaging (fMRI) was performed on 15 healthy volunteers (5 male, 10 female) using a 3.0 T MRI system with a standard head coil. Subjects completed an auditory repetition task and an auditory semantic danger-judgment task. The AFNI software package was used to analyze the functional localization of the two auditory tasks in the left anterior temporal lobe and the differences between them. Results: In normal adults, the auditory semantic judgment task activated the anterior portions of the left middle and inferior temporal gyri more strongly than the auditory repetition task, whereas the auditory phonological repetition task activated the anterior portion of the left superior temporal gyrus more strongly than the semantic judgment task. Conclusion: Processing of Chinese auditory phonological and semantic information is dissociated within the left anterior temporal lobe: the anterior superior temporal region is more involved in phonological analysis, while the anterior middle and inferior temporal regions are more involved in semantic analysis.

13.
Chen JL  Zatorre RJ  Penhune VB 《NeuroImage》2006,32(4):1771-1781
When listening to music, we often spontaneously synchronize our body movements to a rhythm's beat (e.g. tapping our feet). The goals of this study were to determine how features of a rhythm, such as metric structure, can facilitate motor responses, and to elucidate the neural correlates of these auditory-motor interactions using fMRI. Five variants of an isochronous rhythm were created by increasing the contrast in sound amplitude between accented and unaccented tones, progressively highlighting the rhythm's metric structure. Subjects tapped in synchrony to these rhythms, and as metric saliency increased across the five levels, louder tones evoked longer tap durations with concomitant increases in the BOLD response at auditory and dorsal premotor cortices. The functional connectivity between these regions was also modulated by the stimulus manipulation. These results show that metric organization, as manipulated via intensity accentuation, modulates motor behavior and neural responses in auditory and dorsal premotor cortex. Auditory-motor interactions may take place at these regions, with the dorsal premotor cortex interfacing sensory cues with temporally organized movement.
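
One simple way to formalize the reported parametric increase across the five saliency levels is a per-subject linear regression of the dependent measure (tap duration or a regional BOLD estimate) on level, followed by a group test on the slopes. The sketch below uses placeholder data and is not the authors' analysis script.

    import numpy as np
    from scipy import stats

    levels = np.arange(1, 6)                 # five metric-saliency levels
    data = np.random.rand(12, 5)             # subjects x levels (placeholder values)

    # Per-subject slope of the measure against saliency level
    slopes = [stats.linregress(levels, subj).slope for subj in data]

    # Group-level test of a linear trend: are the slopes different from zero?
    t_val, p_val = stats.ttest_1samp(slopes, 0.0)
    print(f"group linear trend: t = {t_val:.2f}, p = {p_val:.3f}")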

14.
A prevailing neurobiological theory of semantic memory proposes that part of our knowledge about concrete, highly imageable concepts is stored in the form of sensory-motor representations. While this theory predicts differential activation of the semantic system by concrete and abstract words, previous functional imaging studies employing this contrast have provided relatively little supporting evidence. We acquired event-related functional magnetic resonance imaging (fMRI) data while participants performed a semantic similarity judgment task on a large number of concrete and abstract noun triads. Task difficulty was manipulated by varying the degree to which the words in the triad were similar in meaning. Concrete nouns, relative to abstract nouns, produced greater activation in a bilateral network of multimodal and heteromodal association areas, including ventral and medial temporal, posterior-inferior parietal, dorsal prefrontal, and posterior cingulate cortex. In contrast, abstract nouns produced greater activation almost exclusively in the left hemisphere in superior temporal and inferior frontal cortex. Increasing task difficulty modulated activation mainly in attention, working memory, and response monitoring systems, with almost no effect on areas that were modulated by imageability. These data provide critical support for the hypothesis that concrete, imageable concepts activate perceptually based representations not available to abstract concepts. In contrast, processing abstract concepts makes greater demands on left perisylvian phonological and lexical retrieval systems. The findings are compatible with dual coding theory and less consistent with single-code models of conceptual representation. The lack of overlap between imageability and task difficulty effects suggests that once the neural representation of a concept is activated, further maintenance and manipulation of that information in working memory does not further increase neural activation in the conceptual store.

15.
The purpose of this study was to reveal functional areas of the brain modulating processing of selective auditory or visual attention toward utterances. Regional cerebral blood flow was measured in six normal volunteers using positron emission tomography during two selective attention tasks and a control condition. The auditory task activated the auditory, inferior parietal, prefrontal, and anterior cingulate cortices. The visual task activated the visual association, inferior parietal, and prefrontal cortices. Both conditions activated the same area in the superior temporal sulcus. During the visual task, deactivation was observed in the auditory cortex. These results indicate that there exists a modality-dependent selective attention mechanism which activates or deactivates cortical areas in different ways.

16.
Plante E  Creusere M  Sabin C 《NeuroImage》2002,17(1):401-410
Sentence processing was contrasted with processing of syntactic prosody under two task conditions in order to examine the representation of these components of language and their interaction with working memory load. Twelve adults received fMRI scans while they listened to low-pass filtered and unfiltered sentences either passively, or during tasks that required subjects to remember and recognize information contained in the stimuli. Results indicated that temporal activation for prosodic stimuli differed from activation for sentence stimuli only during passive listening tasks. The inclusion of memory demands was associated with frontal activation, which was differentially lateralized for sentence and prosodic stimuli. The results demonstrate differential brain activation for prosodic vs sentential stimuli which interacts with the memory demands placed on the subjects.

17.
Miranda RA  Ullman MT 《NeuroImage》2007,38(2):331-345
Language and music share a number of characteristics. Crucially, both domains depend on both rules and memorized representations. Double dissociations between the neurocognition of rule-governed and memory-based knowledge have been found in language but not music. Here, the neural bases of both of these aspects of music were examined with an event-related potential (ERP) study of note violations in melodies. Rule-only violations consisted of out-of-key deviant notes that violated tonal harmony rules in novel (unfamiliar) melodies. Memory-only violations consisted of in-key deviant notes in familiar well-known melodies; these notes followed musical rules but deviated from the actual melodies. Finally, out-of-key notes in familiar well-known melodies constituted violations of both rules and memory. All three conditions were presented, within-subjects, to healthy young adults, half musicians and half non-musicians. The results revealed a double dissociation, independent of musical training, between rules and memory: both rule violation conditions, but not the memory-only violations, elicited an early, somewhat right-lateralized anterior-central negativity (ERAN), consistent with previous studies of rule violations in music, and analogous to the early left-lateralized anterior negativities elicited by rule violations in language. In contrast, both memory violation conditions, but not the rule-only violation, elicited a posterior negativity that might be characterized as an N400, an ERP component that depends, at least in part, on the processing of representations stored in long-term memory, both in language and in other domains. The results suggest that the neurocognitive rule/memory dissociation extends from language to music, further strengthening the similarities between the two domains.
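
For orientation, ERP effects such as the ERAN and the N400 are typically quantified on condition-average difference waves. The sketch below shows only that averaging step on placeholder single-trial arrays; filtering, baselining, and statistics are omitted.

    import numpy as np

    # Placeholder single-trial epochs: trials x channels x time samples, one array per condition
    violation_epochs = np.random.randn(80, 64, 500)
    control_epochs = np.random.randn(80, 64, 500)

    erp_violation = violation_epochs.mean(axis=0)     # average across trials
    erp_control = control_epochs.mean(axis=0)
    difference_wave = erp_violation - erp_control     # effects such as the ERAN or N400
                                                      # are measured on such differences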

18.
Both electrophysiological research in animals and human brain imaging studies have suggested that, similar to the visual system, separate cortical ventral "what" and dorsal "where" processing streams may also exist in the auditory domain. Recently we have shown enhanced gamma-band activity (GBA) over posterior parietal cortex belonging to the putative auditory dorsal pathway during a sound location working memory task. Using a similar methodological approach, the present study assessed whether GBA would be increased over auditory ventral stream areas during an auditory pattern memory task. Whole-head magnetoencephalogram was recorded from N = 12 subjects while they performed a working memory task requiring same-different judgments about pairs of syllables S1 and S2 presented with 0.8-s delays. S1 and S2 could differ either in voice onset time or in formant structure. This was compared with a control task involving the detection of possible spatial displacements in the background sound presented instead of S2. Under the memory condition, induced GBA was enhanced over left inferior frontal/anterior temporal regions during the delay phase and in response to S2 and over prefrontal cortex at the end of the delay period. Gamma-band coherence between left frontotemporal and prefrontal sensors was increased throughout the delay period of the memory task. In summary, the memorization of syllables was associated with synchronously oscillating networks both in frontotemporal cortex, supporting a role of these areas as parts of the putative auditory ventral stream, and in prefrontal, possible executive regions. Moreover, corticocortical connectivity was increased between these structures.
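
Sensor-pair coherence restricted to the gamma band, as reported above, can be estimated with Welch-based magnitude-squared coherence. The Python sketch below runs on placeholder time series; the sampling rate, band limits, and segment length are assumptions rather than the study's MEG parameters.

    import numpy as np
    from scipy.signal import coherence

    fs = 600.0                                   # hypothetical MEG sampling rate (Hz)
    n = int(0.8 * fs)                            # one 0.8-s delay period
    frontotemporal = np.random.randn(n)          # placeholder sensor time series
    prefrontal = np.random.randn(n)

    f, cxy = coherence(frontotemporal, prefrontal, fs=fs, nperseg=256)
    gamma = (f >= 60) & (f <= 90)                # example gamma-band limits
    print("mean gamma-band coherence:", cxy[gamma].mean())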

19.
Gruber O  von Cramon DY 《NeuroImage》2003,19(3):797-809
In the present event-related functional magnetic resonance imaging study, the neural implementation of human working memory was reinvestigated using a factorial design with verbal and visuospatial item-recognition tasks each performed under single-task conditions, under articulatory suppression, and under visuospatial suppression. This approach allowed us to differentiate between brain systems subserving domain-specific working memory processes and possible neural correlates of more "central" executive or storage functions. The results of this study indicate (1) a domain-specific functional-neuroanatomical organization of verbal and visuospatial working memory, (2) a dual architecture of verbal working memory in contrast to a unitary macroscopic architecture of visuospatial working memory, (3) possible neural correlates for a domain-unspecific "episodic buffer" in contrast to a failure to find brain areas attributable to a "central executive," and (4) competition for neuronal processing resources as the causal principle for the occurrence of domain-specific interference in working memory.

20.
When two identical stimuli, such as a pair of clicks, are presented with a sufficiently long time-interval between them, they are readily perceived as two separate events. However, as they are presented progressively closer together, there comes a point when the two separate stimuli are perceived as one. This phenomenon applies not only to hearing but also to other sensory modalities. Damage to the basal ganglia disturbs this type of temporal discrimination irrespective of sensory modality, suggesting a multimodal process is involved. Our aim was to study the neural substrate of auditory temporal discrimination in healthy subjects and to compare it with structures previously associated with analogous tactile temporal discrimination. During fMRI scanning, paired clicks separated by variable inter-stimulus intervals (1-50 ms) were delivered binaurally, with different intensities delivered to each ear, yielding a lateralised auditory percept. Subjects were required (a) to report whether they heard one or two stimuli (TD: temporal discrimination); or (b) to report whether the stimuli were located on the right or left side of the head mid-line (SD: spatial discrimination); or (c) simply to detect the presence of an auditory stimulus (control task). Our results showed that both types of auditory discrimination (TD and SD) compared to simple detection activated a network of brain areas including regions of prefrontal cortex and basal ganglia. Critically, two clusters in pre-SMA and the anterior cingulate cortex were specifically activated by TD. Furthermore, these clusters overlap with regions activated for similar judgments in the tactile modality suggesting that they fulfill a multimodal function in the temporal processing of sensory events.
