Similar Articles
20 similar articles retrieved (search time: 15 ms).
1.
1. In order to determine whether the responsiveness of neurons in the caudolateral orbitofrontal cortex (a secondary cortical gustatory area) is influenced by hunger, the activity evoked by prototypical taste stimuli (glucose, NaCl, HCl, and quinine hydrochloride) and fruit juice was recorded in single neurons in this cortical area before, while, and after cynomolgus macaque monkeys were fed to satiety with glucose or fruit juice. 2. It was found that the responses of the neurons to the taste of glucose decreased to zero as the monkey ate it to satiety, during the course of which its behaviour changed from avid acceptance to active rejection. 3. This modulation of the gustatory responsiveness of the neurons by satiety was not due to peripheral adaptation in the gustatory system or to altered efficacy of gustatory stimulation after satiety was reached, because modulation of neuronal responsiveness by satiety was not seen at earlier stages of the gustatory system, including the nucleus of the solitary tract, the frontal opercular taste cortex, and the insular taste cortex. 4. The decreases in the responsiveness of the neurons were relatively specific to the food with which the monkey had been fed to satiety. For example, in seven experiments in which the monkey was fed glucose solution, neuronal responsiveness decreased to the taste of glucose but not to the taste of blackcurrant juice. Conversely, in two experiments in which the monkey was fed to satiety with fruit juice, the responses of the neurons decreased to fruit juice but not to glucose. 5. These and earlier findings lead to a proposed neurophysiological mechanism for sensory-specific satiety in which the information coded by single neurons in the gustatory system becomes more specific through the processing stages consisting of the nucleus of the solitary tract, the taste thalamus, and the frontal opercular and insular primary taste cortices, until neuronal responses become relatively specific for the food tasted in the caudolateral orbitofrontal cortex (secondary) taste area. Sensory-specific satiety then occurs because in this caudolateral orbitofrontal cortex taste area (but not earlier in the taste system) it is a property of the synapses that repeated stimulation results in a decreased neuronal response. 6. Evidence was obtained that gustatory processing involved in thirst also becomes interfaced to motivation in the caudolateral orbitofrontal cortex taste projection area, in that neuronal responses here to water were decreased to zero while water was drunk until satiety was produced.
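The mechanism proposed in point 5, synapses at the secondary (orbitofrontal) taste stage that depress with repeated stimulation of one food's input, can be illustrated with a toy simulation. This is only an illustrative sketch; the depression factor, input rate, and weights below are arbitrary assumptions, not values from the study.

```python
# Toy illustration (not the authors' model): sensory-specific satiety as
# stimulus-specific synaptic depression at the caudolateral OFC taste stage.
# Repeated activation of the "glucose" input depresses only that synapse,
# so the response to a different food (blackcurrant juice) is preserved.

def ofc_response(weight, afferent_rate):
    """Firing of a caudolateral OFC taste neuron: synaptic weight * gustatory input."""
    return weight * afferent_rate

w = {"glucose": 1.0, "blackcurrant": 1.0}   # synaptic efficacies (arbitrary units)
afferent = 50.0                             # primary taste cortex input (spikes/s), unchanged by satiety
depression = 0.7                            # fractional efficacy retained per feeding bout (assumed)

for bout in range(1, 8):                    # feed glucose to satiety over 7 bouts
    w["glucose"] *= depression              # only the repeatedly stimulated synapse depresses
    print(bout,
          round(ofc_response(w["glucose"], afferent), 1),       # falls toward zero
          round(ofc_response(w["blackcurrant"], afferent), 1))  # unchanged
```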

2.
Seeing a speaker's face benefits speech comprehension, especially in challenging listening conditions. This perceptual benefit is thought to stem from the neural integration of visual and auditory speech at multiple stages of processing, whereby movement of a speaker's face provides temporal cues to auditory cortex, and articulatory information from the speaker's mouth can aid recognizing specific linguistic units (e.g., phonemes, syllables). However, it remains unclear how the integration of these cues varies as a function of listening conditions. Here, we sought to provide insight on these questions by examining EEG responses in humans (males and females) to natural audiovisual (AV), audio, and visual speech in quiet and in noise. We represented our speech stimuli in terms of their spectrograms and their phonetic features and then quantified the strength of the encoding of those features in the EEG using canonical correlation analysis (CCA). The encoding of both spectrotemporal and phonetic features was shown to be more robust in AV speech responses than what would have been expected from the summation of the audio and visual speech responses, suggesting that multisensory integration occurs at both spectrotemporal and phonetic stages of speech processing. We also found evidence to suggest that the integration effects may change with listening conditions; however, this was an exploratory analysis and future work will be required to examine this effect using a within-subject design. These findings demonstrate that integration of audio and visual speech occurs at multiple stages along the speech processing hierarchy. SIGNIFICANCE STATEMENT During conversation, visual cues impact our perception of speech. Integration of auditory and visual speech is thought to occur at multiple stages of speech processing and vary flexibly depending on the listening conditions. Here, we examine audiovisual (AV) integration at two stages of speech processing using the speech spectrogram and a phonetic representation, and test how AV integration adapts to degraded listening conditions. We find significant integration at both of these stages regardless of listening conditions. These findings reveal neural indices of multisensory interactions at different stages of processing and provide support for the multistage integration framework.
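As a rough illustration of the CCA-based encoding analysis described above, the sketch below quantifies how strongly a set of stimulus features is encoded in multichannel EEG as a held-out canonical correlation. The arrays are random placeholders standing in for real, time-aligned stimulus and EEG data; this is not the authors' exact pipeline.

```python
# Sketch of the general CCA approach: relate stimulus features (spectrogram
# bands or phonetic features) to EEG channels and score encoding strength as
# the correlation of the first canonical pair on held-out data.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n_samples = 5000
stim = rng.standard_normal((n_samples, 16))   # placeholder stimulus features
eeg = rng.standard_normal((n_samples, 64))    # placeholder EEG channels (possibly time-lagged)

half = n_samples // 2                          # simple split: fit on one half, evaluate on the other
cca = CCA(n_components=2, max_iter=1000)
cca.fit(stim[:half], eeg[:half])
stim_c, eeg_c = cca.transform(stim[half:], eeg[half:])

# Encoding strength = correlation of the first canonical component pair.
r = np.corrcoef(stim_c[:, 0], eeg_c[:, 0])[0, 1]
print(f"first canonical correlation (held-out): {r:.3f}")
```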

3.
The perceptual pattern in autism has been related either to a specific localized processing deficit or to a pathway-independent, complexity-specific anomaly. We examined auditory perception in autism using an auditory disembedding task that required spectral and temporal integration. Twenty-three children with high-functioning autism and 23 matched controls participated. Participants were presented with two-syllable words embedded in various auditory backgrounds (pink noise, moving ripple, amplitude-modulated pink noise, amplitude-modulated moving ripple) to assess speech-in-noise reception thresholds. The gain in speech perception in pink noise with temporal dips, relative to pink noise without temporal dips, was smaller in children with autism (p = 0.008). Thus, the autism group was less able to integrate auditory information present in temporal dips in the background sound, supporting the complexity-specific perceptual account.
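The "gain" from temporal dips described above is typically computed as a masking-release score, i.e. the difference between speech reception thresholds (SRTs) measured in steady and in amplitude-modulated maskers. A worked example with hypothetical threshold values (not the study's data):

```python
# Illustrative arithmetic only: masking release as the SRT improvement (in dB SNR)
# when the masker contains temporal dips compared with steady noise.
srt_steady_noise = -4.0      # dB SNR for 50% correct in steady pink noise (hypothetical)
srt_modulated_noise = -11.0  # dB SNR in amplitude-modulated pink noise (hypothetical)

masking_release = srt_steady_noise - srt_modulated_noise  # benefit from "dip listening"
print(f"masking release: {masking_release:.1f} dB")
# A smaller masking release, as reported for the autism group, indicates less
# ability to integrate the speech fragments available in the masker's temporal dips.
```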

4.
5.
6.
Recent research has revealed a surprisingly large range of cognitive operations to be preserved during sleep in humans. The new challenge is therefore to understand the functions and mechanisms of processes that have so far been investigated mainly in awake subjects. The current study focuses on dynamic changes of brain oscillations and connectivity patterns in response to environmental stimulation during non-REM sleep. Our results indicate that aurally presented names were processed and neuronally differentiated across the wake-sleep spectrum. Simultaneously recorded EEG and MEG signals revealed two distinct clusters of oscillatory power increase in response to the stimuli: (1) vigilance-state-independent θ synchronization occurring immediately after stimulus onset, followed by (2) sleep-specific α/σ synchronization peaking after stimulus offset. We discuss the possible role of θ, α, and σ oscillations during non-REM sleep, and work toward a unified theory of brain rhythms and their functions during sleep. SIGNIFICANCE STATEMENT Previous research has revealed a (residual) capacity of the sleeping human brain to interact with the environment. How sensory processing is realized by neural assemblies in different stages of sleep is, however, unclear. To tackle this question, we examined simultaneously recorded MEG and EEG data. We discuss the possible roles of θ, α, and σ oscillations during non-REM sleep. In contrast to the θ band response, which reflected an early stage of stimulus processing, the succeeding α and σ band activity was sensitive to the salience of the incoming information and contingent on the sleep stage. Our findings suggest that a specific reorganization of the mechanisms involved in later stages of sensory processing takes place upon falling asleep.
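A minimal sketch of how stimulus-evoked θ, α, and σ band power can be quantified: band-pass filter each channel, take the Hilbert envelope, and compare post-stimulus amplitude with a pre-stimulus baseline. The signal, sampling rate, and band limits below are illustrative assumptions, not the authors' EEG/MEG pipeline.

```python
# Event-related band power for theta, alpha, and sigma on one synthetic channel.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250                                         # sampling rate in Hz (assumed)
t = np.arange(0, 3, 1 / fs)                      # one 3-s trial, stimulus onset at t = 1 s
signal = np.random.default_rng(1).standard_normal(t.size)

bands = {"theta": (4, 7), "alpha": (8, 12), "sigma": (12, 16)}
for name, (lo, hi) in bands.items():
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    envelope = np.abs(hilbert(filtfilt(b, a, signal)))   # instantaneous band amplitude
    baseline = envelope[t < 1].mean()                    # pre-stimulus baseline
    post = envelope[t >= 1].mean()                       # post-stimulus period
    print(f"{name}: post/baseline power ratio = {(post / baseline) ** 2:.2f}")
```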

7.
An experiment was designed to study the processing of hemiretinally presented random line drawings. The nature of the processing demanded by the experimental task appears to be the major factor underlying the observed asymmetries. The findings are consistent with the hypothesis that the left hemisphere will be dominant in tasks that require an analysis of the structure of a stimulus.

8.
9.
10.
11.
The study aimed to investigate possible differences in lateralized olfactory processing between left- and right-handed subjects using a functional MRI paradigm. Twenty-four right-handers (14 female, 10 male) and 24 left-handers (14 female, 10 male) participated; their mean age was 24.0 years, and all were in excellent health with no indication of any major nasal or other health problems. The rose-like odor phenyl ethyl alcohol and the smell of rotten eggs (H2S) were used for relatively specific olfactory activation in a block design using a 1.5-T MR scanner. Results indicated no major differences in lateralized olfactory activation between left- and right-handers. This suggests that in simple olfactory tasks, handedness does not seem to play a substantial role in the processing of olfactory information.

12.
Journal of Autism and Developmental Disorders - In this study we investigate whether persons with autism spectrum disorder (ASD) perceive social images differently than control participants (CON)...

13.
14.
Journal of Autism and Developmental Disorders - While many individuals with autism spectrum disorder (ASD) experience difficulties with language processing, non-linguistic semantic processing may...

15.
A group of congenitally deaf children and a group of normal-hearing children were given two dichhaptically presented tests of hemispheric specialization. Left-hemisphere specialization for language was assessed by a letter-sequences task, and right-hemisphere specialization for spatial stimuli was measured by the nonsense-shapes test of Witelson (1974). The results showed a significant right tactual field superiority for the processing of tactually presented letter sequences. There were no significant differences between the tactual fields for the processing of tactually presented nonsense shapes. The pattern of lateralization demonstrated on the letter-sequences task was consistent across both groups. However, the groups differed significantly in overall performance on the verbal task, with deaf subjects demonstrating lower accuracy scores than hearing subjects. Age and IQ factors produced significant differences between sub-groups on the nonsense-shapes task.

16.
There is evidence that the cerebellum contributes to the neural processing of both emotions and painful stimuli. This could be particularly relevant in conditions associated with chronic abdominal pain, such as irritable bowel syndrome (IBS), which are often also characterized by affective disturbances. We aimed to test the hypothesis that in IBS, symptoms of anxiety and depression modulate brain activation within the cerebellum during visceral stimulation. We reanalyzed a previous data set from N = 15 female IBS patients and N = 12 healthy women with a specific focus on the cerebellum, using advanced normalization methods. Rectal distension-induced brain activation was measured with functional magnetic resonance imaging using non-painful and painful rectal distensions. Symptoms of anxiety and depression, assessed with the Hospital Anxiety and Depression Scale, were correlated with cerebellar activation within the IBS patients. Within IBS, depression scores were associated with non-painful distension-induced activation in the right cerebellum, primarily in Crus II and lobule VIIIb and additionally in Crus I. Depression scores were also associated with painful distension-induced activation, predominantly in vermal lobule V with some extension into the intermediate cerebellum. Anxiety scores correlated significantly with non-painful distension-induced activation in Crus II. Symptoms of anxiety and depression, which are frequently found in chronic pain conditions like IBS, modulate activation evoked by visceral sensory signals not only in cortical and subcortical brain areas but also in the cerebellum.
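The brain-behaviour association described above amounts to correlating questionnaire scores with region-of-interest activation estimates across patients. The sketch below shows this schematically with random placeholder data; the variable names and values are invented and do not reproduce the study's analysis.

```python
# Schematic correlation of HADS depression scores with distension-induced
# activation estimates extracted from a cerebellar region of interest.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_patients = 15
hads_depression = rng.integers(0, 22, n_patients).astype(float)          # HADS subscale (0-21), placeholder
crus_ii_beta = 0.05 * hads_depression + rng.standard_normal(n_patients)  # ROI contrast estimates (a.u.), placeholder

r, p = pearsonr(hads_depression, crus_ii_beta)
print(f"r = {r:.2f}, p = {p:.3f}")
```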

17.
18.
We study the claim that multisensory environments are useful for visual learning because nonvisual percepts can be processed to produce error signals that people can use to adapt their visual systems. This hypothesis is motivated by a Bayesian network framework. The framework is useful because it ties together three observations that have appeared in the literature: (a) signals from nonvisual modalities can “teach” the visual system; (b) signals from nonvisual modalities can facilitate learning in the visual system; and (c) visual signals can become associated with (or be predicted by) signals from nonvisual modalities. Experimental data consistent with each of these observations are reviewed.
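As a loose illustration of the idea that a nonvisual percept can supply an error signal for visual adaptation, the toy sketch below nudges a miscalibrated visual gain toward an auditory estimate of the same quantity. It is not the paper's Bayesian-network model; all quantities and the update rule are invented for illustration.

```python
# Cross-modal teaching signal: the discrepancy between an auditory estimate and
# a visual estimate of the same property drives adaptation of a visual parameter.
visual_gain = 0.6          # miscalibrated visual mapping (assumed starting value)
learning_rate = 0.2

for trial in range(20):
    true_value = 10.0
    visual_estimate = visual_gain * true_value
    auditory_estimate = true_value                       # treated here as the "teacher"
    error = auditory_estimate - visual_estimate          # cross-modal error signal
    visual_gain += learning_rate * error / true_value    # adapt the visual mapping

print(f"adapted visual gain = {visual_gain:.3f}")        # converges toward 1.0
```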

19.
Varicella zoster limited to the mandibular nerve is rare. The classical symptoms are pain, hypesthesia, and vesicular eruption restricted to the third trigeminal segment (V3). Little is known about taste impairment after mandibular nerve zoster. We report two cases of patients suffering from mandibular zoster associated with a subjective taste disorder. In both cases, gustatory measurements confirmed ipsilateral hemiageusia of the anterior two-thirds of the tongue. After 2 months, the symptoms regressed and the psychophysical measures returned to normal values, whereas post-zoster neuralgia lasted for more than 1 year. Gustatory dysfunction is a possible symptom after mandibular nerve zoster. In contrast to post-zoster neuralgia, taste function seems to recover quickly.

20.
Intranasal insulin has attracted attention not only with respect to enhancing memory processes, but also for its anorexic effects and its effects on olfactory sensitivity. In the present study, the influence of intranasal insulin on gustatory sensitivity was investigated using intranasal applications of insulin or placebo in a double-blind manner, alongside a control condition without any application. We hypothesised that, because it mediates satiety, intranasal insulin alters gustatory sensitivity, whereas the placebo and control conditions should not. We did not expect sensitivity to the different taste solutions to differ. Sweet, salty, bitter and sour liquids, in four concentrations each, were sprayed onto the tongue of healthy male subjects. Additionally, water with no taste was applied to enable calculation of taste sensitivity in terms of the parameter d′ of signal detection theory. The task of the subject was to identify the quality of the respective tastant. Gustatory sensitivity and blood parameters were evaluated using repeated-measures ANOVAs. Gustatory sensitivity (across all tastants) improved significantly after intranasal insulin application compared with placebo, although the improvement did not reach significance compared with the control condition. Subjects performed best when detecting the sweet taste and worst when detecting the bitter taste. The blood parameters glucose, insulin, homeostatic model assessment and leptin did not differ between the insulin and placebo conditions, nor between measurements preceding and following intranasal application, confirming that peripheral euglycaemia was preserved during the experiment. Thus, it can be concluded that the application of intranasal insulin led to improved gustatory sensitivity compared with placebo.
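The d′ measure mentioned above is computed from hit and false-alarm rates as d′ = z(H) − z(FA), where z is the inverse of the standard normal cumulative distribution. A worked example with hypothetical response counts (not the study's data):

```python
# Sensitivity (d') from signal detection theory: hits = tastant trials correctly
# identified, false alarms = tasteless water trials reported as having a taste.
from scipy.stats import norm

hits, misses = 18, 6                # responses on tastant trials (hypothetical)
false_alarms, correct_rej = 4, 20   # responses on water (blank) trials (hypothetical)

hit_rate = hits / (hits + misses)
fa_rate = false_alarms / (false_alarms + correct_rej)
d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)   # z(H) - z(FA)
print(f"d' = {d_prime:.2f}")
```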
