Similar articles
1.
Working memory (WM) is known to activate the prefrontal cortex. In the present study we hypothesized that when additional contingencies are added to the instruction of a WM task, this would increase the WM load and result in the activation of additional prefrontal areas. With positron emission tomography we measured regional cerebral blood flow in nine subjects performing a control task and two delayed matching to sample tasks, in which the subjects were matching colours and patterns to a reference picture. The second of the two delayed matching tasks had a more complex instruction than the first, with additional contingencies of how to alternate between the matching of colours and patterns. This task thus required the subjects not only to remember a stimulus to match but also to perform this matching according to a specified plan. Both delayed matching tasks activated cortical fields in the middle frontal gyrus, the frontal operculum, upper cingulate gyrus, inferior parietal cortex and cortex lining the intraparietal sulcus, all in the left hemisphere. When alternated delayed matching was compared to simple delayed matching, increases were located in the right superior and middle frontal gyrus and the right anterior inferior parietal cortex. The increased demand during alternated matching thus resulted in bilateral activation of both dorsolateral prefrontal and inferior parietal cortex. The area in the inferior parietal cortex has previously been coactivated with the dorsolateral prefrontal cortex in several WM tasks, irrespective of the sensory modality of the stimuli, and during tasks involving planning.

2.
To identify and categorize complex stimuli such as familiar objects or speech, the human brain integrates information that is abstracted at multiple levels from its sensory inputs. Using cross-modal priming for spoken words and sounds, this functional magnetic resonance imaging study identified 3 distinct classes of visuoauditory incongruency effects, which were selective for 1) spoken words in the left superior temporal sulcus (STS), 2) environmental sounds in the left angular gyrus (AG), and 3) both words and sounds in the lateral and medial prefrontal cortices (IFS/mPFC). From a cognitive perspective, these incongruency effects suggest that prior visual information influences the neural processes underlying speech and sound recognition at multiple levels, with the STS being involved in phonological, AG in semantic, and mPFC/IFS in higher conceptual processing. In terms of neural mechanisms, effective connectivity analyses (dynamic causal modeling) suggest that these incongruency effects may emerge via greater bottom-up effects from early auditory regions to intermediate multisensory integration areas (i.e., STS and AG). This is consistent with a predictive coding perspective on hierarchical Bayesian inference in the cortex where the domain of the prediction error (phonological vs. semantic) determines its regional expression (middle temporal gyrus/STS vs. AG/intraparietal sulcus).

3.
Visual and auditory motion information can be used together to provide complementary information about the movement of objects. To investigate the neural substrates of such cross-modal integration, functional magnetic resonance imaging was used to assess brain activation while subjects performed separate visual and auditory motion discrimination tasks. Areas of unimodal activation included the primary and/or early sensory cortex for each modality plus additional sites extending toward parietal cortex. Areas conjointly activated by both tasks included lateral parietal cortex, lateral frontal cortex, anterior midline and anterior insular cortex. The parietal site encompassed distinct, but partially overlapping, zones of activation in or near the intraparietal sulcus (IPS). A subsequent task requiring an explicit cross-modal speed comparison revealed several foci of enhanced activity relative to the unimodal tasks. These included the IPS, anterior midline, and anterior insula but not frontal cortex. During the unimodal auditory motion task, portions of the dorsal visual motion system showed signals depressed below resting baseline. Thus, interactions between the two systems involved either enhancement or suppression depending on the stimuli present and the nature of the perceptual task. Together, these results identify human cortical regions involved in polysensory integration and the attentional selection of cross-modal motion information.

4.
Neural substrates of phonemic perception
The temporal lobe in the left hemisphere has long been implicated in the perception of speech sounds. Little is known, however, regarding the specific function of different temporal regions in the analysis of the speech signal. Here we show that an area extending along the left middle and anterior superior temporal sulcus (STS) is more responsive to familiar consonant-vowel syllables during an auditory discrimination task than to comparably complex auditory patterns that cannot be associated with learned phonemic categories. In contrast, areas in the dorsal superior temporal gyrus bilaterally, closer to primary auditory cortex, are activated to the same extent by the phonemic and nonphonemic sounds. Thus, the left middle/anterior STS appears to play a role in phonemic perception. It may represent an intermediate stage of processing in a functional pathway linking areas in the bilateral dorsal superior temporal gyrus, presumably involved in the analysis of physical features of speech and other complex non-speech sounds, to areas in the left anterior STS and middle temporal gyrus that are engaged in higher-level linguistic processes.

5.
Cortical activity at rest predicts cochlear implantation outcome
The functional status of central neural pathways, in particular their susceptibility to plasticity and functional reorganization, may influence speech performance of deaf cochlear implant users. In this paper, we sought to determine how brain metabolic activity measured before implantation relates to cochlear implantation outcome, that is, speech perception. In 22 prelingually deaf children between 1 and 11 years, we correlated preoperative glucose metabolism as measured by F-18 fluorodeoxyglucose positron emission tomography with individual speech perception performance assessed 3 years after implantation, while factoring out the confounding effect of age at implantation. Whereas age at implantation was positively correlated with increased activity in the right superior temporal gyrus, speech scores were selectively associated with enhanced metabolic activity in the left prefrontal cortex and decreased metabolic activity in right Heschl's gyrus and in the posterior superior temporal sulcus. These results reinforce the notion that implantation should be performed as early as possible to prevent cross-modal takeover of auditory regions and suggest that rehabilitation strategies may be more efficient if they capitalize on general cognitive functions instead of only targeting specialized circuits dedicated to auditory and audiovisual pattern recognition.
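The key analysis above is a partial correlation: preoperative metabolism is related to later speech scores while the linear effect of age at implantation is removed. A minimal Python sketch of that logic follows; the variable names and random numbers are hypothetical placeholders, not the study's data.

```python
# Hedged sketch: partial correlation via residualization (not the authors' code).
import numpy as np
from scipy import stats

def partial_corr(x, y, covariate):
    """Pearson correlation of x and y after regressing out a single covariate."""
    def residualize(v, c):
        design = np.column_stack([np.ones_like(c), c])
        beta, *_ = np.linalg.lstsq(design, v, rcond=None)
        return v - design @ beta
    return stats.pearsonr(residualize(x, covariate), residualize(y, covariate))

# Hypothetical values for 22 children (placeholders only).
rng = np.random.default_rng(0)
metabolism = rng.normal(size=22)          # e.g. regional FDG uptake before implantation
speech_score = rng.normal(size=22)        # speech perception score 3 years later
age_at_implant = rng.uniform(1, 11, 22)   # age at implantation in years
r, p = partial_corr(metabolism, speech_score, age_at_implant)
print(f"partial r = {r:.2f}, p = {p:.3f}")
```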

6.
The left hemisphere specialization for speech perception might arise from asymmetries at more basic levels of auditory processing. In particular, it has been suggested that differences in "temporal" and "spectral" processing exist between the hemispheres. Here we used functional magnetic resonance imaging to test this hypothesis further. Fourteen healthy volunteers listened to sequences of alternating pure tones that varied in the temporal and spectral domains. Increased temporal variation was associated with activation in Heschl's gyrus (HG) bilaterally, whereas increased spectral variation activated the superior temporal gyrus (STG) bilaterally and right posterior superior temporal sulcus (STS). Responses to increased temporal variation were lateralized to the left hemisphere; this left lateralization was greater in posteromedial HG, which is presumed to correspond to the primary auditory cortex. Responses to increased spectral variation were lateralized to the right hemisphere specifically in the anterior STG and posterior STS. These findings are consistent with the notion that the hemispheres are differentially specialized for processing auditory stimuli even in the absence of linguistic information.
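A lateralization result like the one above is often summarized with a laterality index, LI = (L - R) / (L + R), computed over suprathreshold voxels in homologous left and right regions. The sketch below is a generic illustration of that index, not the analysis actually used in the study; the threshold and the stand-in t-values are assumptions.

```python
# Hedged sketch of a laterality index over homologous left/right ROIs.
import numpy as np

def laterality_index(left_map, right_map, threshold=3.1):
    """LI = (L - R) / (L + R) from suprathreshold voxel counts; > 0 means left-lateralized."""
    n_left = int((left_map > threshold).sum())
    n_right = int((right_map > threshold).sum())
    total = n_left + n_right
    return 0.0 if total == 0 else (n_left - n_right) / total

rng = np.random.default_rng(5)
left_hg = rng.normal(1.0, 2.0, size=500)    # stand-in t-values, left Heschl's gyrus
right_hg = rng.normal(0.0, 2.0, size=500)   # stand-in t-values, right Heschl's gyrus
print(f"LI = {laterality_index(left_hg, right_hg):.2f}")
```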

7.
Three regions of the macaque inferior parietal lobule and adjacent lateral intraparietal sulcus (IPS) are distinguished by the relative strengths of their connections with the superior colliculus, parahippocampal gyrus, and ventral premotor cortex. It was hypothesized that connectivity information could therefore be used to identify similar areas in the human parietal cortex using diffusion-weighted imaging and probabilistic tractography. Unusually, the subcortical routes of the 3 projections have been reported in the macaque, so it was possible to compare not only the terminations of connections but also their course. The medial IPS had the highest probability of connection with the superior colliculus. The projection pathway resembled that connecting parietal cortex and superior colliculus in the macaque. The posterior angular gyrus and the adjacent superior occipital gyrus had a high probability of connection with the parahippocampal gyrus. The projection pathway resembled the macaque inferior longitudinal fascicle, which connects these areas. The ventral premotor cortex had a high probability of connection with the supramarginal gyrus and anterior IPS. The connection was mediated by the third branch of the superior longitudinal fascicle, which interconnects similar regions in the macaque. Human parietal areas have anatomical connections resembling those of functionally related macaque parietal areas.
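The classification logic above (label each seed voxel by the target it reaches with the highest connection probability) can be illustrated with a small winner-take-all sketch. The arrays below are synthetic stand-ins for tractography output; this is not the authors' pipeline.

```python
# Hedged sketch: winner-take-all connectivity-based labelling of seed voxels.
import numpy as np

rng = np.random.default_rng(42)
n_voxels = 1000
targets = ["superior_colliculus", "parahippocampal_gyrus", "ventral_premotor"]

# One connection-probability value per seed voxel and target (synthetic data).
probs = rng.dirichlet(alpha=[1.0, 1.0, 1.0], size=n_voxels)

labels = probs.argmax(axis=1)    # index of the winning target for each voxel
for i, name in enumerate(targets):
    print(f"{name}: {(labels == i).sum()} voxels")
```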

8.
We studied eight normal subjects in an fMRI experiment where they listened to natural speech sentences and to matched simple or complex speech envelope noises. Neither of the noises (simple or complex) was understood initially, but after the corresponding natural speech sentences had been heard, comprehension was close to perfect for the complex but still absent for the simple speech envelope noises. This setting thus involved identical stimuli that were understood or not and made it possible to identify (i) a neural substrate of speech comprehension unconfounded by stimulus acoustic properties (common to natural speech and complex noises), (ii) putative correlates of auditory search for phonetic cues in noisy stimuli (common to simple and complex noises once the matching natural speech had been heard) and (iii) the cortical regions where speech comprehension and auditory search interact. We found correlates of speech comprehension in bilateral medial (BA21) and inferior (BA38 and BA38/21) temporal regions, whereas acoustic feature processing occurred in more dorsal temporal regions. The left posterior superior temporal cortex (Wernicke's area) responded to the acoustic complexity of the stimuli but was additionally sensitive to auditory search and speech comprehension. Attention was associated with recruitment of the dorsal part of Broca's area (BA44) and interaction of auditory attention and comprehension occurred in bilateral insulae, the anterior cingulate and the right medial frontal cortex. In combination, these results delineate a neuroanatomical framework for the functional components at work during natural speech processing, i.e. when comprehension results from concurrent acoustic processing and effortful auditory search.

9.
Human temporal lobe activation by speech and nonspeech sounds
Functional organization of the lateral temporal cortex in humans is not well understood. We recorded blood oxygenation signals from the temporal lobes of normal volunteers using functional magnetic resonance imaging during stimulation with unstructured noise, frequency-modulated (FM) tones, reversed speech, pseudowords and words. For all conditions, subjects performed a material-nonspecific detection response when a train of stimuli began or ceased. Dorsal areas surrounding Heschl's gyrus bilaterally, particularly the planum temporale and dorsolateral superior temporal gyrus, were more strongly activated by FM tones than by noise, suggesting a role in processing simple temporally encoded auditory information. Distinct from these dorsolateral areas, regions centered in the superior temporal sulcus bilaterally were more activated by speech stimuli than by FM tones. Identical results were obtained in this region using words, pseudowords and reversed speech, suggesting that the speech-tones activation difference is due to acoustic rather than linguistic factors. In contrast, previous comparisons between word and nonword speech sounds showed left-lateralized activation differences in more ventral temporal and temporoparietal regions that are likely involved in processing lexical-semantic or syntactic information associated with words. The results indicate functional subdivision of the human lateral temporal cortex and provide a preliminary framework for understanding the cortical processing of speech sounds.

10.
Previous neuroimaging studies have identified brain regions that underlie verbal working memory in humans. According to these studies a phonological store is located in the left inferior parietal cortex, and a complementary subvocal rehearsal mechanism is implemented by mostly left-hemispheric speech areas. In the present functional magnetic resonance imaging study, classical interfering and non-interfering dual-task situations were used to investigate further the neural correlates of verbal working memory. Verbal working memory performance under non-interfering conditions activated Broca's area, the left premotor cortex, the cortex along the left intraparietal sulcus and the right cerebellum, thus replicating the results from previous studies. By contrast, no significant memory-related activation was found in these areas when silent articulatory suppression prevented the subjects from rehearsal. Instead, this non-articulatory maintenance of phonological information was associated with enhanced activity in several other, particularly anterior prefrontal and inferior parietal, brain areas. These results suggest that phonological storage may be a function of a complex prefronto-parietal network, and not localized in only one, parietal brain region. Further possible implications for the functional organization of human working memory are discussed.

11.
BACKGROUND: Functional magnetic resonance imaging offers a compelling, new perspective on altered brain function but is sparsely used in studies of anesthetic effect. To examine effects on verbal memory encoding, the authors imaged human brain response to auditory word stimulation using functional magnetic resonance imaging at different concentrations of an agent not previously studied, and tested memory after recovery. METHODS: Six male volunteers were studied breathing 0.0, 2.0, and 1.0% end-tidal sevoflurane (awake, deep, and light states, respectively) via laryngeal mask. In each condition, they heard 15 two-syllable English nouns via closed headphones. Each word was repeated 15 times (1/s), followed by 15 s of rest. Blood oxygenation level-dependent brain activations during blocks of stimulation versus rest were assessed with a 3-T Siemens Trio scanner and a 20-voxel spatial extent threshold. Memory was tested approximately 1.5 h after recovery with an auditory recognition task (chance performance = 33% correct). RESULTS: Scans showed widespread activations (P < 0.005, uncorrected) in the awake state, including bilateral superior temporal, frontal, and parietal cortex, right occipital cortex, bilateral thalamus, striatum, hippocampus, and cerebellum; more limited activations in the light state (bilateral superior temporal gyrus, right thalamus, bilateral parietal cortex, left frontal cortex, and right occipital cortex); and no significant auditory-related activation in the deep state. During recognition testing, subjects correctly selected 77 +/- 12% of words presented while they were awake as "old," versus 32 +/- 15 and 42 +/- 8% (P < 0.01) correct for the light and deep stages, respectively. CONCLUSIONS: Sevoflurane induces dose-dependent suppression of auditory blood oxygenation level-dependent signals, which likely limits the ability of words to be processed during anesthesia and compromises memory.
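To make the block design concrete (each word presented for 15 s, i.e. 15 repetitions at 1/s, followed by 15 s of rest), the sketch below builds a stimulation-versus-rest design matrix with nilearn. The repetition time of 2 s is an assumption for illustration; it is not stated in the abstract.

```python
# Hedged sketch of a blocked words-vs-rest design matrix (assumed TR = 2 s).
import numpy as np
import pandas as pd
from nilearn.glm.first_level import make_first_level_design_matrix

t_r = 2.0                         # assumed repetition time in seconds
n_blocks = 15                     # 15 words, one 15-s block each
block_len, rest_len = 15.0, 15.0  # seconds of stimulation and rest per block

events = pd.DataFrame({
    "onset": np.arange(n_blocks) * (block_len + rest_len),
    "duration": block_len,
    "trial_type": "words",
})

frame_times = np.arange(0, n_blocks * (block_len + rest_len), t_r)
design = make_first_level_design_matrix(frame_times, events, hrf_model="glover")
print(design.columns.tolist())    # 'words', cosine drift terms, 'constant'
```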

12.
We employed functional magnetic resonance imaging (fMRI) in 12 healthy subjects to measure cerebral activation related to a set of higher order manual sensorimotor tasks performed in the absence of visual guidance. Purposeless manipulation of meaningless plasticine lumps served as a reference against which we contrasted two tasks where manual manipulation served a meaningful purpose, either the perception and recognition of three-dimensional shapes or the construction of such shapes out of an amorphous plasticine lump. These tasks were compared with the corresponding mental imagery of the modelling process which evokes the constructive concept but lacks concomitant sensorimotor input and output. Neural overlap was found in a bilateral activity increase in the posterior and anterior intraparietal sulcus area (IPS and AIP). Differential activation was seen in the supplementary and cingulate motor areas, the left M1 and the superior parietal lobe for modelling and in the left angular and ventral premotor cortex for imagery. Our data thus point to a congruent neural substrate for both perceptive and constructive object-oriented sensorimotor cognition in the AIP and posterior IPS. The leftward asymmetry of the inferior parietal activations, including the angular gyrus, during imagery of modelling along with the ventral premotor activations emphasizes the close vicinity of the circuitry for cognitive manipulative motor behaviour and language.

13.
Although anatomical, histochemical and electrophysiological findings in both animals and humans have suggested a parallel and serial mode of auditory processing, precise activation timings of each cortical area are not well known, especially in humans. We investigated the timing of arrival of signals to multiple cortical areas using magnetoencephalography in humans. Following click stimuli applied to the left ear, activations were found in six cortical areas in the right hemisphere: the posteromedial part of Heschl's gyrus (HG) corresponding to the primary auditory cortex (PAC), the anterolateral part of the HG region on or posterior to the transverse sulcus, the posterior parietal cortex (PPC), posterior and anterior parts of the superior temporal gyrus (STG), and the planum temporale (PT). The mean onset latencies of each cortical activity were 17.1, 21.2, 25.3, 26.2, 30.9 and 47.6 ms respectively. These results suggested a serial model of auditory processing along the medio-lateral axis of the supratemporal plane and, in addition, implied the existence of several parallel streams running postero-superiorly (from the PAC to the belt region and then to the posterior STG, PPC or PT) and anteriorly (PAC-belt-anterior STG).
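The abstract reports onset latencies but not the detection criterion; one common convention (assumed here purely for illustration, not taken from the paper) is the first post-stimulus sample at which the source waveform exceeds a few standard deviations of its pre-stimulus baseline.

```python
# Hedged sketch: onset latency as the first sample exceeding 3 SD of baseline.
import numpy as np

def onset_latency_ms(waveform, times_ms, baseline=(-100.0, 0.0), n_sd=3.0):
    """First post-stimulus time (ms) at which |waveform| exceeds n_sd * baseline SD."""
    base = waveform[(times_ms >= baseline[0]) & (times_ms < baseline[1])]
    above = np.abs(waveform[times_ms > 0]) > n_sd * base.std()
    return float(times_ms[times_ms > 0][above.argmax()]) if above.any() else None

# Synthetic example: noise plus a step "response" starting near 17 ms.
times = np.arange(-100, 100, 1.0)            # 1 kHz sampling, in ms
rng = np.random.default_rng(1)
signal = rng.normal(0, 1, times.size)
signal[times >= 17] += 8.0
print(onset_latency_ms(signal, times))       # approximately 17.0
```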


15.
The purpose of this study was to identify the functional anatomy of the mechanisms involved in visually guided prehension and in object recognition in humans. The cerebral blood flow of seven subjects was investigated by positron emission tomography. Three conditions were performed using the same set of stimuli. In the 'grasping' condition, subjects were instructed to accurately grasp the objects. In the 'matching' condition, subjects were requested to compare the shape of the presented object with that of the previous one. In the 'pointing' condition (control), subjects pointed towards the objects. The comparison between grasping and pointing showed a regional cerebral blood flow (rCBF) increase in the anterior part of the inferior parietal cortex and part of the posterior parietal cortex. The comparison between grasping and matching showed an rCBF increase in the cerebellum, the left frontal cortex around the central sulcus, the mesial frontal cortex and the left inferior parietal cortex. Finally, the comparison between matching and pointing showed an rCBF increase in the right temporal cortex and the right posterior parietal cortex. Thus object-oriented action and object recognition activate a common posterior parietal area, suggesting that some kind of within-object spatial analysis was processed by this area whatever the goal of the task.

16.
Using event-related functional magnetic resonance imaging (fMRI), the neural correlates of memory encoding can be studied by contrasting item-related activity elicited in a study task according to whether the items are remembered or forgotten in a subsequent memory test. Previous studies using this approach have implicated the left prefrontal cortex in the successful encoding of verbal material into episodic memory when the study task is semantic in nature. In the current study, we asked whether the neural correlates of episodic encoding differ depending on type of study task. Seventeen volunteers participated in an event-related fMRI experiment in which at study, volunteers were cued to make either animacy or syllable judgements about words. A recognition memory test followed after a delay of approximately 15 min. For the animacy task, words that were subsequently remembered showed greater activation in left and medial prefrontal regions. For the syllable task, by contrast, successful memory for words was associated with activations in bilateral intraparietal sulcus, bilateral fusiform gyrus, right prefrontal cortex and left superior occipital gyrus. These findings suggest that the brain networks supporting episodic encoding differ according to study task.
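A minimal nilearn sketch of the subsequent-memory logic described above, in which study items are modelled by whether they were later remembered or forgotten and the two are contrasted, is given below. The file name, repetition time, and event timings are hypothetical placeholders, not the study's data or code.

```python
# Hedged sketch of an event-related subsequent-memory contrast with nilearn.
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# Study-phase events labelled by later recognition performance (placeholder timings).
events = pd.DataFrame({
    "onset":      [10.0, 25.0, 40.0, 55.0],
    "duration":   [1.0, 1.0, 1.0, 1.0],
    "trial_type": ["remembered", "forgotten", "remembered", "forgotten"],
})

model = FirstLevelModel(t_r=2.0, hrf_model="glover", smoothing_fwhm=6)
model = model.fit("sub-01_task-study_bold.nii.gz", events=events)  # hypothetical file

# Voxelwise subsequent-memory effect: remembered > forgotten.
zmap = model.compute_contrast("remembered - forgotten", output_type="z_score")
zmap.to_filename("sub-01_subsequent_memory_zmap.nii.gz")
```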

17.
Novel mapping stimuli composed of biological motion figures were used to study the extent and layout of multiple retinotopic regions in the entire human brain and to examine the independent manipulation of retinotopic responses by visual stimuli and by attention. A number of areas exhibited retinotopic activations, including full or partial visual field representations in occipital cortex, the precuneus, motion-sensitive temporal cortex (extending into the superior temporal sulcus), the intraparietal sulcus, and the vicinity of the frontal eye fields in frontal cortex. Early visual areas showed mainly stimulus-driven retinotopy; parietal and frontal areas were driven primarily by attention; and lateral temporal regions could be driven by both. We found clear spatial specificity of attentional modulation not just in early visual areas but also in classical attentional control areas in parietal and frontal cortex. Indeed, strong spatiotopic activity in these areas could be evoked by directed attention alone. Conversely, motion-sensitive temporal regions, while exhibiting attentional modulation, also responded significantly when attention was directed away from the retinotopic stimuli.

18.
Human neuroimaging studies conducted during visuospatial working memory tasks have inconsistently detected activation in the prefrontal cortical areas depending presumably on the type of memory and control tasks employed. We used functional magnetic resonance imaging to study brain activation related to the performance of a visuospatial n-back task with different memory loads (0-back, 1-back and 2-back tasks). Comparison of the 2-back versus 0-back tasks revealed consistent, bilateral activation in the middle frontal gyrus (MFG), superior frontal sulcus and adjacent cortical tissue (SFS/SFG) in all subjects and in six out of seven subjects in the intraparietal sulcus (IPS). Activation was also detected in the inferior frontal gyrus, medially in the superior frontal gyrus, precentral gyrus, superior and inferior parietal lobuli, occipital visual association areas, anterior and posterior cingulate areas and in the insula. Comparison between the 1-back versus 0-back tasks revealed activation only in a few brain areas. Activation in the MFG, SFS/SFG and IPS appeared dependent on memory load. The results suggest that the performance of a visuospatial working memory task engages a network of distributed brain areas and that areas in the dorsal visual pathway are engaged in mnemonic processing of visuospatial information.

19.
Visuospatial attention can either be "narrowly" focused on (zooming in) or "widely" distributed to (zooming out) different locations in space. In the current functional magnetic resonance imaging study, we investigated the shared and differential neural mechanisms underlying the dynamic "zooming in" and "zooming out" processes while potential distance confounds from visual inputs between zooming in and zooming out were controlled for. When compared with zooming out, zooming in differentially implicated left anterior intraparietal sulcus (IPS), which may reflect the functional specificity of left anterior IPS in focusing attention on local object features. By contrast, zooming out differentially activated right inferior frontal gyrus, which may reflect higher demands on cognitive control processes associated with enlarging the attentional focus. A conjunction analysis between zooming in and zooming out revealed significant shared activations in right middle temporal gyrus, right superior occipital gyrus, and right superior parietal cortex. The latter result suggests that the right posterior temporal-occipital-parietal system, which is known to be crucial for the control of spatial attention, is involved in updating the internal representation of the spatial locations that attentional processing is associated with.
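Conjunction analyses of the kind mentioned above are commonly implemented as a minimum-statistic test: a voxel counts as shared only if it exceeds threshold in both contrasts. The sketch below illustrates that idea on synthetic maps with an assumed threshold; it is not the study's code.

```python
# Hedged sketch of a minimum-statistic conjunction between two contrast maps.
import numpy as np

rng = np.random.default_rng(7)
t_zoom_in = rng.normal(size=(4, 4, 4))    # stand-in t-map for the zooming-in contrast
t_zoom_out = rng.normal(size=(4, 4, 4))   # stand-in t-map for the zooming-out contrast
t_threshold = 3.1                         # assumed voxelwise threshold

conjunction = np.minimum(t_zoom_in, t_zoom_out)   # minimum statistic per voxel
shared = conjunction > t_threshold                # significant in BOTH contrasts
print(f"{shared.sum()} voxels survive the conjunction")
```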

20.
Objective: To assess changes in rhesus monkey brain function after acute alcohol exposure using resting-state fMRI based on the fractional amplitude of low-frequency fluctuation (fALFF). Methods: Seven healthy male rhesus monkeys underwent BOLD fMRI and 3D structural scans before intravenous alcohol injection and at 10, 28 and 46 min after injection; the fALFF algorithm was used to obtain and compare brain regions showing fALFF differences across the four time points. Results: Regions with significant overall fALFF differences across the four time points were the right postcentral gyrus, right insula, right cerebellum, left parahippocampal gyrus, bilateral inferior frontal gyri, cerebellar vermis, right occipital lobe, precuneus and left supramarginal gyrus (all P < 0.05). Regions with decreased fALFF after intravenous alcohol were the bilateral superior frontal gyri, right inferior frontal gyrus, right fusiform gyrus, right angular gyrus, bilateral superior temporal gyri, right occipital lobe, left lateral sulcus, left postcentral gyrus, left cuneus, left thalamus, left insula and anterior cingulate cortex (all P < 0.05); regions with increased fALFF were the right inferior frontal gyrus and right middle temporal gyrus (all P < 0.05). Conclusion: Brain metabolic activity changes markedly during the acute phase of alcohol exposure, mainly involving the default mode network, the reward and emotion processing systems, and visual and auditory cortices.
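For reference, fALFF is conventionally defined as the ratio of amplitude in a low-frequency band (typically 0.01-0.08 Hz) to amplitude over the full frequency range of a voxel's resting-state time series. The sketch below illustrates that ratio on a synthetic signal; the band limits and TR are conventional assumptions, not parameters taken from this study.

```python
# Hedged sketch of fractional ALFF (fALFF) for a single voxel time series.
import numpy as np

def falff(timeseries, tr, band=(0.01, 0.08)):
    """Ratio of spectral amplitude in `band` to amplitude over all non-DC frequencies."""
    ts = timeseries - timeseries.mean()
    freqs = np.fft.rfftfreq(ts.size, d=tr)
    amplitude = np.abs(np.fft.rfft(ts))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return amplitude[in_band].sum() / amplitude[1:].sum()

# Synthetic example: 300 volumes at TR = 2 s with a slow 0.05 Hz component plus noise.
tr = 2.0
t = np.arange(300) * tr
rng = np.random.default_rng(3)
signal = np.sin(2 * np.pi * 0.05 * t) + rng.normal(0, 1, t.size)
print(f"fALFF = {falff(signal, tr):.2f}")
```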
