Similar articles
1.

Background

The anterior‐medial thalamus (AMT), which is associated with memory processing, is severely affected by Alzheimer's disease pathology and, when damaged, can be the sole cause of dementia.

Objective

To assess the frequency of magnetic resonance imaging (MRI) hyperintensities affecting the AMT, and their relationship with sudden cognitive decline.

Methods

A total of 205 consecutive participants from a university cognitive neurology clinic underwent clinical evaluation, neuropsychological testing and quantitative MRI.

Results

AMT hyperintensities >5 mm³ occurred in 0 of 34 normal controls but were found in 5 of 30 (17%) participants with cognitive impairment with no dementia (CIND), 9 of 109 (8%) patients with probable Alzheimer's disease, 7 of 17 (41%) with mixed disease and 8 of 15 (53%) with probable vascular dementia (VaD). AMT hyperintensities occurred more often in participants with stepwise decline than in those with slow progression (χ² = 31.7; p<0.001). Of the 29 people with AMT hyperintensities, those with slow progression had smaller medial temporal width (p<0.001) and smaller anterior‐medial thalamic hyperintensities (p<0.001). In a logistic regression model, both variables were significant, and the pattern of decline was correctly classified in 86% of the sample (Cox and Snell R² = 0.56; p<0.001). Those with AMT hyperintensities >55 mm³ were likely to have stepwise decline in cognitive function regardless of medial temporal lobe width; in contrast, those with smaller AMT hyperintensities showed a stepwise decline only in the absence of medial temporal lobe atrophy. All patients with VaD had left‐sided AMT hyperintensities, whereas those with CIND had right‐sided AMT hyperintensities.

Conclusions

AMT hyperintensities >55 mm³ probably result in symptomatic decline, whereas smaller lesions may go unrecognised by clinicians and radiologists. Only half of those with AMT hyperintensities had diagnoses of VaD or mixed disease; the other AMT hyperintensities occurred in patients diagnosed with Alzheimer's disease or CIND. These silent hyperintensities may nevertheless contribute to cognitive dysfunction. AMT hyperintensities may represent a major and under‐recognised contributor to cognitive impairment.

Dementia caused solely by cerebrovascular disease is rare. In a large memory clinic autopsy series of over 1900 people with dementia, only six had infarcts without any Alzheimer's disease neuropathology,1 and all six had infarctions affecting at least one of three key areas: the thalamus, the medial temporal lobe and the frontal cortex.1 Although the medial temporal lobe has long been appreciated as a site of strategic importance for dementia, the involvement of the thalamus is less frequently assessed. The anterior nucleus of the thalamus is part of a cortical network, including the hippocampus, anterior cingulate and mamillary bodies, which mediates memory processing.2,3,4,5 Infarcts to the anterior and dorsomedial thalamus are associated with memory impairment in animal studies,6 human case reports7,8,9,10 and case series.1,11,12,13,14 One indication that thalamic infarcts may be important in dementia populations comes from the Nun Study, which found that people with infarcts to the basal ganglia, thalamus and deep white matter exhibited dementia with less Alzheimer's disease neuropathology than those without infarcts.15 Despite this finding, and despite the appreciated role of anterior‐medial thalamic (AMT) infarcts in causing isolated cases of amnesia or dementia in stroke populations,11,12,13,16 the frequency and consequences of thalamic lesions in a large sample of people with cognitive impairment have not been evaluated.
In this study, we quantified hyperintensities on magnetic resonance imaging (MRI) in the anterior‐medial thalamus in a cognitive neurology clinic sample. We determined the frequency and volumes of thalamic hyperintensities and whether these hyperintensities were associated with sudden changes in cognitive status defined by clinical history.
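The interaction reported in the results (large lesions predicted stepwise decline regardless of medial temporal width; smaller lesions only in the absence of atrophy) can be read as a simple decision rule. The sketch below is a hypothetical illustration of that reading using the reported cut-offs; the function name and the boolean atrophy flag are assumptions, not part of the study, and this is not a validated clinical classifier.

```python
def predict_decline(amt_volume_mm3, medial_temporal_atrophy):
    """Decision-rule reading of the reported interaction between AMT
    hyperintensity volume and medial temporal lobe atrophy.
    Hypothetical sketch; thresholds are the cut-offs quoted above."""
    if amt_volume_mm3 > 55:
        # Large lesions: stepwise decline regardless of temporal width.
        return "stepwise"
    if amt_volume_mm3 > 5 and not medial_temporal_atrophy:
        # Smaller (but measurable, >5 mm^3) lesions: stepwise decline
        # only in the absence of medial temporal lobe atrophy.
        return "stepwise"
    return "slow"
```

For example, a 30 mm³ lesion would be expected to produce stepwise decline only when medial temporal atrophy is absent, whereas an 80 mm³ lesion predicts stepwise decline either way.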

2.

Background

Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disease with severe cervical cord damage due to degeneration of the corticospinal tracts and loss of lower motor neurones. Diffusion tensor magnetic resonance imaging (DT MRI) allows the measurement of quantities reflecting the size (such as mean diffusivity) and orientation (such as fractional anisotropy) of water‐filled spaces in biological tissues.

Methods

Mean diffusivity and fractional anisotropy histograms from the cervical cord of patients with ALS were obtained to: (1) quantify the extent of tissue damage in this critical central nervous system region; and (2) investigate the magnitude of the correlation of cervical cord DT MRI metrics with patients' disability and tissue damage along the brain portion of the corticospinal tracts. Cervical cord and brain DT MRI scans were obtained from 28 patients with ALS and 20 age‐matched and sex‐matched controls. Cord mean diffusivity and fractional anisotropy histograms were produced and the cord cross‐sectional area was measured. Average mean diffusivity and fractional anisotropy along the brain portion of the corticospinal tracts were also measured.

Results

Compared with controls, patients with ALS had significantly lower mean fractional anisotropy (p = 0.002) and cord cross‐sectional area (p<0.001). Mean diffusivity histogram‐derived metrics did not differ between the two groups. A strong correlation was found between mean cord fractional anisotropy and the ALS Functional Rating Score (r = 0.74, p<0.001). Mean cord and brain fractional anisotropy values correlated moderately (r = 0.37, p = 0.05).

Conclusions

Cervical cord DT MRI in patients with ALS allows the extent of cord damage to be graded. The conventional and DT MRI changes found are compatible with the presence of neuroaxonal loss and reactive gliosis, with a heterogeneous distribution of the pathological process between the brain and the cord. The correlation found between cord fractional anisotropy and disability suggests that DT MRI may be a useful adjunctive tool to monitor the evolution of ALS.

Amyotrophic lateral sclerosis (ALS) is the most common adult‐onset motor neurone disease, characterised by a progressive and simultaneous degeneration of upper and lower motor neurones.1,2 In its typical form, the disease begins either in one limb or with a combination of bulbar and corticobulbar symptoms, and continues with progressive weakness of the bulbar, limb, thoracic and abdominal musculature.1,2 By using a variety of conventional magnetic resonance imaging (MRI) sequences, several studies3,4,5,6,7,8,9,10,11,12,13,14,15 have shown changes in signal intensity along the brain portion of the corticospinal tracts, particularly in the posterior limb of the internal capsule and cerebral peduncles, with reported frequencies varying between 25% and 80%. Reduced magnetisation transfer ratios in the internal capsule8,11 and N‐acetylaspartate levels in the motor cortex13,16,17 of patients with ALS have also been observed.
However, none of these studies has reported a correlation between such magnetic resonance abnormalities and the degree of disability.8,11,13,16,17 Diffusion tensor magnetic resonance imaging (DT MRI) enables the random diffusional motion of water molecules to be measured and thus provides quantitative indices of the structural and orientational features of the central nervous system (CNS).18 DT MRI has been used to assess quantitatively the tissue damage of the brain portion of the corticospinal tracts in ALS,12,19,20,21,22,23 and all studies have shown increased mean diffusivity (indicating a loss of structural barriers limiting the motion of water molecules) and decreased fractional anisotropy (indicating a loss of tissue organisation). However, brain DT MRI studies have yielded heterogeneous clinicopathological correlations: some authors found a moderate correlation between brain DT MRI metrics and the severity of disability,12,21,23 but others did not.19 In the past few years, DT MRI has also been used successfully to grade the extent of cervical cord damage associated with demyelinating conditions.24,25,26

Considering that the cervical cord in ALS is one of the most affected portions of the CNS (owing to the combined presence of neuronal loss in the anterior horns of the grey matter and degeneration of the corticospinal tracts), we obtained mean diffusivity and fractional anisotropy histograms of the cervical cord from patients with ALS with the following aims: (1) to quantify the extent of tissue damage in this critical CNS region; and (2) to investigate the magnitude of the correlation of cervical cord DT MRI metrics with patients' disability and tissue damage along the brain portion of the corticospinal tracts.
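The strong relationship reported between mean cord fractional anisotropy and the ALS Functional Rating Score (r = 0.74) is a Pearson correlation. As a minimal sketch of how such a coefficient is computed (the FA and ALSFRS values below are made up for illustration, not the study's data):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Illustrative (made-up) values: lower cord FA tracking lower function.
fa     = [0.55, 0.52, 0.48, 0.45, 0.40, 0.38]
alsfrs = [40, 38, 33, 30, 24, 25]
r = pearson_r(fa, alsfrs)
```

With these toy values the correlation is strongly positive, mirroring the direction (though not the magnitude) of the reported finding.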

3.

Background and aim

Trunk performance is an important predictor of functional outcome after stroke. However, the percentage of explained variance varies considerably between studies. This may be explained by the stroke population examined, the different scales used to assess trunk performance and the time points used to measure outcome. The aim of this multicentre study was to examine the predictive validity of the Trunk Impairment Scale (TIS) and its subscales when predicting the Barthel Index score at 6 months after stroke.

Methods

A total of 102 subjects were recruited in three European rehabilitation centres. Participants were assessed on admission (median time since stroke onset 20 days) and 6 months after stroke. Correlation analysis and forward stepwise multiple regression analysis were used to model outcome.

Results

The best predictors of the Barthel Index scores at 6 months after stroke were total TIS score (partial R² = 0.52, p<0.0001) and static sitting balance subscale score (partial R² = 0.50, p<0.0001) on admission. The TIS score on admission and its static sitting balance subscale were stronger predictors of the Barthel Index score at 6 months than the Barthel Index score itself on admission.

Conclusions

This study emphasises the importance of trunk performance, especially static sitting balance, when predicting functional outcome after stroke. The TIS is recommended as a prediction instrument in the rehabilitation setting when considering the prognosis of stroke patients. Future studies should address the evolution of trunk performance over time and the evaluation of treatment interventions to improve trunk performance.

Although the age‐specific incidence of major stroke has fallen over the past few years,1 it is still the main cause of long‐term disability in adults, with a growing number of survivors being dependent for activities of daily living (ADL).2,3 Frequently identified variables predicting ADL after stroke are age and initial severity of motor and functional deficits.4 Trunk performance has also been identified as an important independent predictor of ADL after stroke.5,6,7,8,9 However, based on multiple regression analyses, the reported variance of functional outcome after stroke explained by trunk performance ranges from 9% to 71%.5,6,7,8,9 Differences in reported variance could be explained by the stroke population included, the various scales used to measure trunk performance and the time points used to measure outcome.

Previous studies evaluating the predictive validity of trunk performance after stroke were performed in a single rehabilitation setting, warranting caution when generalising results.5,6,7,8,9,10 Clinical tools used to assess trunk performance are the Trunk Control Test,5,6,10 the trunk control items of the Postural Assessment Scale for Stroke patients7,8 and the trunk assessment of Fujiwara et al.9 A limitation of the first two tests is that they both have a ceiling effect, which makes them less suitable for long‐term outcome studies.5,11,12,13 Furthermore, the trunk control items of the Trunk Control Test and the Postural Assessment Scale for Stroke patients are largely comparable with the items of the trunk measure of Fujiwara et al.9 All previously mentioned clinical tools include items in the supine position, which involve rolling, as well as only basic balance movements in sitting. Finally, with the exception of the trunk control items of the Postural Assessment Scale for Stroke patients,8 no study has evaluated the prognostic value of trunk performance when predicting functional outcome at 6 months after stroke.

The Trunk Impairment Scale (TIS) for patients after stroke was designed to measure ADL‐related selective trunk movements rather than participation of the trunk in gross transfer movements.14 The TIS assesses static and dynamic sitting balance and trunk coordination. Reliability, validity, measurement error, internal consistency and discriminant ability of the TIS have been reported elsewhere.14,15 The TIS has no ceiling effect in subacute and chronic stroke patients and has already been shown to be strongly related to measures of gait, balance and functional ability in a cross‐sectional study.12 To the best of our knowledge, the predictive value of the TIS and its subscales has not been evaluated. Including age and other measures of motor and functional performance could provide a useful combination of variables predicting outcome after stroke. The Barthel Index is a widely accepted measure in stroke rehabilitation research and assesses functional milestones in stroke recovery. Predicting Barthel Index scores at 6 months after stroke from measurements taken on admission to a rehabilitation centre would further establish the importance of trunk performance when predicting long‐term outcome after stroke. Experts in the field of neurological rehabilitation have addressed the trunk as the central key point of the body.16 Proximal stability of the trunk is a prerequisite for distal head and limb movement and is therefore expected to be related to functional ADL.

In summary, there is still a lack of clarity regarding the importance of trunk performance in functional outcome after stroke.
Scales used in previous studies have important statistical limitations and are unlikely to be a comprehensive measure of motor performance of the trunk. Therefore, the aim of this multicentre study was to examine the predictive validity of the TIS and its subcomponents, together with other known predictors, in predicting functional outcome measured as a Barthel Index score at 6 months after stroke.
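The analysis described above relies on forward stepwise multiple regression: predictors are added one at a time, each step choosing the candidate that most increases the explained variance. The sketch below illustrates that greedy procedure on synthetic data in which a hypothetical admission TIS score dominates the outcome; the data, coefficients and variable names are all invented for illustration.

```python
import numpy as np

def forward_stepwise(X, y, names, k=2):
    """Greedy forward selection: at each step, add the predictor that
    most increases the R^2 of an ordinary least squares fit (a sketch
    of forward stepwise multiple regression)."""
    def r2(cols):
        # Design matrix: intercept plus the currently selected columns.
        A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    chosen = []
    for _ in range(k):
        best = max((c for c in range(X.shape[1]) if c not in chosen),
                   key=lambda c: r2(chosen + [c]))
        chosen.append(best)
    return [names[c] for c in chosen], r2(chosen)

# Synthetic data: outcome driven mainly by a hypothetical admission
# TIS score (range 0-23), with a weak age effect and noise.
rng = np.random.default_rng(0)
n = 60
tis = rng.uniform(0, 23, n)
age = rng.uniform(50, 85, n)
barthel = 20 + 3.2 * tis - 0.1 * age + rng.normal(0, 6, n)

order, r2_final = forward_stepwise(np.column_stack([tis, age]), barthel,
                                   ["TIS", "age"])
```

Because the simulated TIS effect dwarfs the age effect, the procedure selects TIS first, analogous to the TIS entering the reported model ahead of the admission Barthel Index.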

4.

Background and purpose

Pre‐existing cognitive decline and new‐onset dementia are common in patients with stroke, but their influence on institutionalisation rates is unknown.

Objective

To evaluate the influence of cognitive impairment on the institutionalisation rate 3 years after a stroke.

Design

The following were prospectively evaluated: (1) the previous cognitive state of 192 consecutive patients with stroke living at home before the stroke (with the Informant Questionnaire on COgnitive Decline in the Elderly (IQCODE)); (2) new‐onset dementia occurring within 3 years; and (3) institutionalisation rates within 3 years in the 165 patients who were discharged alive after the acute stage.

Results

Independent predictors of institutionalisation over a 3‐year period that were available at admission were age (adjusted odds ratio (adjOR) for 1‐year increase = 1.08; 95% confidence interval (CI) 1.03 to 1.15), severity of the neurological deficit (adjOR for 1‐point increase in Orgogozo score = 0.97; 95% CI 0.96 to 0.99) and severity of cognitive impairment (adjOR for 1‐point increase in IQCODE score = 1.03; 95% CI 1.00 to 1.06). Factors associated with institutionalisation at 3 years that were present at admission or occurred during the follow‐up were age (adjOR for 1‐year increase = 1.17; 95% CI 1.07 to 1.27) and any (pre‐existing or new) dementia (adjOR = 5.85; 95% CI 1.59 to 21.59), but not the severity of the neurological deficit.

Conclusion

Age and cognitive impairment are more important predictors of institutionalisation 3 years after a stroke than the severity of the physical disability.

Institutionalisation after a stroke increases with the severity of the neurological deficit, increasing age, female gender, low socioeconomic level, marital status and poor social environment.1,2,3,4,5,6 Dementia is common after a stroke,7 leading to loss of autonomy.8 Pre‐existing dementia is present in up to 16% of patients with stroke,9,10,11,12 and post‐stroke dementia (PSD) occurs in up to one third.7 Several studies have found a link between cognitive impairment and institutionalisation after a stroke,1,2,3,4,5 but they had several methodological limitations: (1) cross‐sectional studies were performed in long‐term stroke survivors and did not take into account patients who had been institutionalised but died before the study6; (2) there was no systematic cognitive assessment13 or only a Mini Mental State Examination,14 which is not appropriate for patients with stroke; and (3) most studies included only patients recruited in rehabilitation centres, leading to selection bias.1,2,3,4,5 To our knowledge, no study has prospectively evaluated the influence of pre‐existing cognitive impairment and PSD on the institutionalisation rate after a stroke.

The aim of this study was to evaluate the influence of the previous cognitive state and new‐onset dementia on the institutionalisation rate 3 years after a stroke.
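The per-unit adjusted odds ratios reported above (e.g. 1.08 per year of age) look small, but odds ratios compound multiplicatively on the odds scale. A short sketch of how a per-unit OR scales to a larger contrast (the function is illustrative; only the 1.08 figure comes from the abstract):

```python
from math import exp, log

# Per-unit adjusted odds ratio for age reported in the abstract.
OR_PER_YEAR = 1.08

def compound_or(per_unit_or, units):
    """Odds ratios combine multiplicatively on the log-odds scale, so
    the OR for a contrast of several units is the per-unit OR raised
    to the number of units."""
    return exp(units * log(per_unit_or))

or_10_years = compound_or(OR_PER_YEAR, 10)
```

Over a 10-year age difference, a per-year OR of 1.08 corresponds to roughly a doubling of the odds of institutionalisation, holding the other covariates fixed.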

5.

Background

Patients with Alzheimer's disease and dementia commonly suffer from behavioural and psychological symptoms of dementia (BPSD). A genetic component to BPSD development in Alzheimer's disease has been demonstrated. Several studies have investigated whether the exon 4 ε2/ε3/ε4 haplotype of the apolipoprotein E (APOE) gene is associated with BPSD, with variable results.

Objective

We investigated the exon 4 polymorphisms and extended this study to include promoter polymorphisms and the resultant haplotypes across the gene.

Methods

Our large independent cohort of 388 patients with longitudinal measures of BPSD assessed by the Neuropsychiatric Inventory was used to analyse whether any of these variants were associated with the presence of BPSD.

Results

Several relationships were significant before correction for multiple testing. The exon 4 haplotype was associated with hallucinations and anxiety, A‐491T with irritability, T‐427C with agitation/aggression and appetite disturbances, and T‐219C with depression. Haplotype analyses of all variants did not reveal any statistically significant findings.

Conclusions

Our data and a review of previous studies showed a diversity of relationships, suggesting that these findings might be due to chance and so collectively do not support a role for the APOE gene in BPSD.

Many patients with dementia display behavioural and psychological symptoms of dementia (BPSD). Unlike cognitive decline, BPSD do not necessarily persist once they have occurred. Genetic determinants of BPSD in Alzheimer's disease have been proposed from studies on families.1,2,3 It has been hypothesised that the genes that increase the risk for Alzheimer's disease may also determine the presence of BPSD.4 The ε4 allele of the apolipoprotein E (APOE) gene is the only risk factor robustly associated with Alzheimer's disease. However, previous investigations on APOE have produced inconsistent findings on BPSD, with some researchers reporting associations with a variety of different symptoms and alleles4,5,6,7,8,9,10,11,12,13,14,15,16 (summarised in the table provided online at http://jnnp.bmjjournals.com/supplemental), whereas others found no relevant relationships.17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33 We used a large independent clinical cohort of patients with Alzheimer's disease, with longitudinal data on BPSD, to further extend these studies, and additionally investigated promoter polymorphisms of APOE, which have been shown to independently incur risk of Alzheimer's disease in some studies.34
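The conclusion above turns on correction for multiple testing: when many symptom/variant comparisons are run, several nominally significant p values are expected by chance alone. A minimal Bonferroni sketch (the p values below are hypothetical, not those of the study):

```python
def bonferroni_survivors(p_values, alpha=0.05):
    """Indices of tests still significant after Bonferroni correction:
    each nominal p value is compared against alpha / m for m tests."""
    threshold = alpha / len(p_values)
    return [i for i, p in enumerate(p_values) if p < threshold]

# Hypothetical nominal p values for five symptom/variant comparisons.
p_vals = [0.03, 0.04, 0.012, 0.20, 0.0009]
survivors = bonferroni_survivors(p_vals)
```

With five tests the per-test threshold drops to 0.01, so nominal p values of 0.03 or 0.04 no longer count as significant, which is how associations that look real individually can vanish collectively.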

6.

Background

Alzheimer's disease (AD) and frontotemporal dementia (FTD) are the commonest causes of presenile dementia. In the absence of a biological marker, diagnosis is reliant on clinical evaluation. Confirmation is often sought from neuroimaging, including single‐photon emission computed tomography (SPECT). Most previous SPECT studies lack pathological validation.

Aim

To examine the accuracy of SPECT in differentiating FTD from AD in patients with subsequent pathological confirmation.

Methods

Technetium‐99‐labelled hexamethyl propylene amine oxime SPECT images obtained at initial evaluation in 25 pathologically confirmed cases of FTD were examined. These images were visually rated by an experienced blinded nuclear medicine consultant and compared with those of 31 patients with AD, also with pathological validation.

Results

A reduction in frontal cerebral blood flow (CBF) was more common in FTD and was of diagnostic value (sensitivity 0.80, specificity 0.65 and positive likelihood ratio (+LR) 2.25; 95% CI 1.35 to 3.77). A pattern of bilateral frontal CBF reduction without associated bilateral parietal CBF change was diagnostically more accurate (sensitivity 0.80, specificity 0.81 and +LR 4.13; 95% CI 1.96 to 8.71). Diagnostic categorisation (FTD or AD) on the basis of SPECT alone was less accurate than clinical diagnosis (based on neurology and detailed neuropsychological evaluation). One patient with FTD was initially clinically misdiagnosed as AD, owing to the lack of availability of full neuropsychological assessment. However, SPECT correctly diagnosed this patient, providing a diagnostic gain of 4%.

Conclusion

Technetium‐99‐labelled hexamethyl propylene amine oxime SPECT CBF patterns provide valuable information in the diagnosis of FTD and AD. These data are better used as an adjunct to clinical diagnosis if pathology is to be correctly predicted in life.

Frontotemporal dementia (FTD) is a cortical dementia distinct from other dementing illnesses. It typically presents with personality/behavioural change and decline in social conduct with early loss of insight.1,2 In the absence of biological markers, the pathological detection of characteristic histological changes remains the gold standard of diagnosis. In life, diagnosis is primarily based on patterns of neurological and neuropsychological findings. However, differentiation from other dementias can be difficult and demands an astute qualitative analysis of various behaviours and neuropsychological test performances.3 With a paucity of experienced neuropsychologists, additional and independent diagnostic information is often sought through imaging, be it structural (CT and MRI) and/or functional (single‐photon emission computed tomography (SPECT) and positron emission tomography).

SPECT is used to evaluate patients with dementia and can show purported characteristic changes in FTD and in Alzheimer's disease (AD).4,5,6,7,8,9 The technique provides a method of evaluating blood flow in various regions of the brain, which reflects areas of poor function by showing reductions in regional cerebral blood flow (CBF). It has been shown that posterior changes in regional CBF are common in AD,4,5,6,7 whereas in FTD anterior changes are prevalent7,8,9 and posterior changes rare.7 However, CBF changes are neither wholly specific nor invariable in the various dementing illnesses. Masterman et al10 looked at the value of bitemporal hypoperfusion in diagnosing AD, and found that, although it was a sensitive measure for detecting dementia (0.75), it was poorly specific for AD (0.55).
Consequently, bitemporal hypoperfusion on SPECT can be a non‐specific finding in various forms of dementia and is not exclusive to AD. Starkstein et al11 reported deficits in CBF in the frontal (especially orbitofrontal) and anterior temporal cortices in FTD. However, they provided a measure neither of the diagnostic accuracy of SPECT in FTD nor of the diagnostic gain it may provide. Most of these studies are also limited by the fact that the dementia groups were defined clinically. The clinical diagnostic accuracy of FTD in life varies widely, between 14% and 85%.12,13,14

A few studies have looked at the accuracy of clinical and SPECT findings in relation to the final pathological diagnoses.15,16,17,18,19 Although these studies found that SPECT findings do correlate with dementia type, they did not examine whether SPECT provides any additional diagnostic gain over clinical judgement. These studies are also severely limited by the small numbers of patients in the FTD groups.

The aims of this study were to evaluate the diagnostic accuracy of SPECT in differentiating FTD from AD at initial assessment in a group of patients with final pathological confirmation of diagnosis, and to examine the diagnostic gain SPECT may provide over clinical diagnosis in this group of patients with FTD and AD.
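The sensitivity, specificity and likelihood-ratio figures quoted in the results all follow from a 2×2 diagnostic table. The counts below are reconstructed to be consistent with the reported sample sizes (25 FTD, 31 AD) and the reported sensitivity/specificity of the bilateral-frontal pattern; they are an assumption for illustration, not published raw data:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and positive likelihood ratio from a
    2x2 diagnostic table (here, positive test = the SPECT pattern of
    bilateral frontal CBF reduction without parietal change)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    pos_lr = sens / (1 - spec)  # how much a positive test raises the odds
    return sens, spec, pos_lr

# Counts reconstructed to match the reported figures:
# 25 FTD cases (20 test-positive), 31 AD cases (6 test-positive).
sens, spec, pos_lr = diagnostic_metrics(tp=20, fn=5, tn=25, fp=6)
```

With these counts the positive likelihood ratio works out to about 4.13, consistent with the +LR reported for this SPECT pattern.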

7.

Background

While patients with amyotrophic lateral sclerosis (ALS) may complain of fatigue, the underlying mechanisms appear complex, with dysfunction of central and peripheral nervous systems independently reported as contributing factors. The aim of the present study was to further delineate the mechanisms underlying increased fatigability in ALS by measuring activity dependent changes in axonal excitability following a maximum voluntary contraction (MVC).

Methods

Nerve excitability changes were recorded before and after an MVC of the abductor pollicis brevis in 16 patients with ALS and 25 controls.

Results

In patients with ALS, there was a greater increase in threshold (36.5 (5.9)%; controls 19.6 (3.5)%; p<0.05) as a result of MVC, with reduction in the amplitude of the compound muscle action potential generated by a submaximal stimulus (ALS 49 (7.6)%; controls 41.0 (5.4)%). These changes were associated with an increase in superexcitability (ALS 65.1 (25.4)%; controls 42.3 (5.7)%) and reduction in strength–duration time constant (ALS 20 (4.9)%; controls 10 (2.5)%; p<0.01), indicative of axonal hyperpolarisation. The increase in threshold was more pronounced in patients with ALS with predominantly lower motor neuronal involvement.

Conclusions

Higher firing rates of surviving motor axons attempting to compensate for neurogenic weakness are likely to explain the greater activity dependent changes in ALS. As such, the present study suggests a further peripheral factor underlying the development of fatigue in ALS.

Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disorder that affects motor neurones in the spinal cord, brainstem and motor cortex.1,2 The consequences of this neurodegeneration are motor deficits in the limbs, bulbar and respiratory muscles.3 Although the mechanisms of neuronal dysfunction, and ultimately the development of symptoms in ALS, remain unknown, glutamate excitotoxicity,4,5,6 increased levels of inducible nitric oxide synthase4 and, in cases of inherited ALS, oxidative stress secondary to mutations in the superoxide dismutase‐1 gene, have been proposed.7,8,9,10

Increased fatigability, defined as an inability to sustain a predictable maximal force during voluntary contraction, is a common symptom of ALS.11,12,13 The mechanisms underlying fatigue in ALS are complex, and contributions from both the central and peripheral nervous systems have been reported.11,12 Central fatigue refers to a reduced excitatory drive to motor neurones, secondary to central nervous system dysfunction, resulting in incomplete motor unit recruitment and submaximal motor unit discharge rates. In contrast, peripheral fatigue typically refers to impaired muscle activation, caused by dysfunction at or below the anterior horn cell.13,14 Perhaps somewhat counterintuitively, fatigue in ALS appears to be independent of muscle strength and disease severity.15,16 Regardless of the underlying mechanism, fatigue in ALS severely impacts on the patient's quality of life.15,16

The ability to sustain a motor output may be assessed by measuring changes in axonal membrane threshold following a voluntary contraction.
Specifically, in peripheral nerves, voluntary contraction activates the axonal membrane Na+/K+ pump,17 which attempts to return the resting membrane potential to baseline after contraction has ceased,18,19,20,21 resulting in activity dependent hyperpolarisation. The magnitude of activity dependent hyperpolarisation is determined by the impulse load22 and, in neurological diseases where the safety margin for impulse conduction has been reduced (as occurs, for instance, in demyelinating neuropathy), may be sufficient to induce conduction failure.23,24,25 In an attempt to further delineate the mechanisms underlying fatigability and weakness in ALS, the present study measured activity dependent changes in axonal excitability induced by voluntary contraction.

8.

Background

Among elderly people without dementia, the apolipoprotein E ε4 allele (APOE4) has been associated with cognitive deficit, particularly in episodic memory, but few reports are available on whether this association differs by sex.

Methods

In a community‐dwelling Norwegian cohort of 2181 elderly people (55% women), aged 70–74 years, episodic memory was examined in relation to sex and APOE4 zygosity, with the Kendrick Object Learning Test (KOLT).

Results

Possession of at least one APOE4 allele had a modest, detrimental effect on episodic memory in women, whereas in men, heterozygotes were unaffected and homozygotes had markedly lower scores across the distribution of KOLT scores. This sex difference was found consistently in all analyses: on comparing means and medians, examining trends across quintiles, and studying the distribution of scores and the risk of cognitive impairment. Results were broadly similar when adjusted for known determinants of cognition and also when severely impaired participants were excluded. The adjusted odds ratio (OR) of cognitive impairment in women was 1.8 (95% confidence interval (CI): 1.1 to 2.8) for heterozygotes and 1.1 (0.3 to 3.7) for homozygotes; the adjusted OR in men was 1.1 (0.6 to 2.1) for heterozygotes and 10.7 (4.7 to 24) for homozygotes.

Conclusions

Although the harmful effect of APOE4 on episodic memory was modest in women, this risk applied to about 30% of women. APOE4 had a dramatic effect on episodic memory in men, but only in homozygotes, who comprised about 3% of men: the whole male homozygous group showed a marked shift to lower memory scores.

Age and the apolipoprotein E ε4 allele (APOE4) are the most important known risk factors for sporadic Alzheimer's disease. The disease is thought to have a long presymptomatic phase,1 which suggests that APOE4 starts exerting its detrimental effects in the preclinical phase. Most studies on elderly people without dementia have found that the APOE4 allele is associated with various cognitive deficits,2,3,4,5,6,7,8,9,10,11,12,13,14 particularly in memory.2,3,4,5,6,7 A recent meta‐analysis of more than 20 000 people concluded that this allele was associated with poorer performance on tests of global cognitive functioning, episodic memory and executive functioning.15

The association of APOE4 with Alzheimer's disease varies with sex.16,17,18,19,20 The meta‐analysis by Farrer et al20 found that APOE4 homozygosity confers a high risk of Alzheimer's disease for both men and women, but that a single copy of the allele confers a greater risk on women than on men. A similar sex difference related to APOE4 has been found in the degree of hippocampal atrophy in a cohort with mild cognitive impairment.21 We may therefore expect to find a sex‐related effect of the APOE4 allele in cognitive tests in elderly people without dementia. The two studies3,22 that have reported an influence of sex on this relationship found a stronger effect of APOE4 in women.3,22

In this study, we investigated whether sex influences the relationship between APOE alleles and episodic memory in community‐dwelling elderly people. We selected episodic memory because memory deficit is a hallmark of Alzheimer's disease.
Tests of episodic memory have been found to be particularly effective in identifying people at risk.23,24 We compared the influence of sex in our cohort with that found on the risk of Alzheimer's disease. We studied a relatively large group of 2181 people from western Norway.
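Odds ratios like those reported above are, in unadjusted form, computed from a 2×2 table of allele carriage against impairment, with a Wald interval on the log scale; the study's ORs were additionally adjusted via logistic regression. A minimal sketch, using hypothetical counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a/b = impaired/unimpaired carriers, c/d = impaired/unimpaired non-carriers."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only:
or_, lo, hi = odds_ratio_ci(30, 170, 45, 455)   # or_ ~ 1.78
```

Covariate-adjusted ORs, such as those quoted in the abstract, are instead obtained by exponentiating the fitted logistic regression coefficients.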

9.

Background

Therapeutic management of gait disorders in patients with advanced Parkinson's disease (PD) can sometimes be disappointing, since dopaminergic drug treatments and subthalamic nucleus (STN) stimulation are more effective for limb‐related parkinsonian signs than for gait disorders. Gait disorders could also be partly related to norepinephrine system impairment, and the pharmacological modulation of both dopamine and norepinephrine pathways could potentially improve the symptomatology.

Aim

To assess the clinical value of chronic, high doses of methylphenidate (MPD) in patients with PD who have gait disorders despite optimal dopaminergic doses and STN stimulation parameters.

Methods

Efficacy was blindly assessed on video for 17 patients in the absence of l‐dopa and again after acute administration of the drug, both before and after a 3‐month course of MPD, using a Stand–Walk–Sit (SWS) Test, the Tinetti Scale, the Unified Parkinson's Disease Rating Scale (UPDRS) part III score and the Dyskinesia Rating Scale.

Results

An improvement was observed in the number of steps and time in the SWS Test, the number of freezing episodes, the Tinetti Scale score and the UPDRS part III score in the absence of l‐dopa after 3 months of taking MPD. The l‐dopa‐induced improvement in these various scores was also stronger after the 3‐month course of MPD than before. The Epworth Sleepiness Scale score fell dramatically in all patients. No significant adverse effects were observed.

Interpretation

Chronic, high doses of MPD improved gait and motor symptoms in the absence of l‐dopa and increased the intensity of response of these symptoms to l‐dopa in a population with advanced PD.

The significant, long‐term benefits of dopaminergic treatment1 and bilateral stimulation of the subthalamic nucleus (STN)2 have been well documented for limb‐related syndromes in patients with advanced Parkinson's disease (PD). However, after several years of disease progression (and regardless of the ongoing treatment), axial signs in general and gait disorders in particular (including reduced step length, freezing and postural instability) become more prominent and lead to falls and even institutionalisation. Therapeutic management of the condition is disappointing, since dopaminergic treatments and STN stimulation are more effective for other limb‐related parkinsonian signs than for gait disorders as such.2,3 However, an interesting therapeutic approach could involve the combined modulation of l‐dopa bioavailability (to potentiate the partial dopa‐sensitivity of gait disorders) and the non‐dopaminergic system, particularly the norepinephrine system, which has previously been suspected of involvement in gait disorders.4,5 This “norepinephrine hypothesis” could explain the positive results on freezing of gait observed in some open‐label studies on small populations of patients with advanced PD using the synthetic norepinephrine precursor l‐threo‐dihydroxyphenylserine6,7 or tizanidine, an α‐2 adrenergic agonist.4 However, these results have never been confirmed, probably because l‐threo‐dihydroxyphenylserine is a weak precursor of norepinephrine and only slightly influences striatal, extracellular dopamine levels.8

Methylphenidate (MPD, Ritalin) is an amphetamine‐like psychomotor stimulant, which influences both the dopaminergic and norepinephrine systems.
Indeed, MPD inhibits the dopamine transporter (DAT), particularly in the striatum.9 The DAT is one of the most important determinants of extracellular dopamine concentrations, as demonstrated in DAT knock‐out mice.10 Through inhibition of the DAT, MPD blocks presynaptic dopamine re‐uptake.11 To a lesser extent, MPD also influences the norepinephrine system through presynaptic norepinephrine transporter inhibition.11,12,13 Hence, by targeting the DAT and the norepinephrine transporter, MPD might disperse dopamine widely and consign dopamine storage and release to regulation by norepinephrine neurones as well as by dopaminergic neurones.13 Effects of MPD may be mediated by restoration of the dopaminergic/norepinephrine neurotransmitter balance.13,14

A pilot study on five patients with PD with motor fluctuations showed that low doses of MPD (0.2 mg/kg) combined with l‐dopa led to greater peak right‐hand tapping speed.15 The effects of doses of up to 0.4 mg/kg of MPD were also assessed in a double‐blind, placebo‐controlled procedure; MPD seemed to lack an effect when given alone but did potentiate the effects of l‐dopa on walking speeds and dyskinesia.9 Recently, positive effects on gait speed, fall risk and attention were demonstrated in an open‐label study using an acute, low dose (20 mg) of MPD.16 It therefore seemed interesting to determine whether higher doses and longer‐term treatment could improve the MPD‐induced partial response for gait disorders. Indeed, up to 70% of the dopamine nerve terminals (and consequently 70% of DAT activity) are lost in severe PD.17 An oral dose of 0.25 mg/kg MPD may only occupy half of the striatal DATs in humans,12 whereas oral doses of 0.5–0.8 mg/kg allow a higher occupancy and lead to high extracellular dopamine concentrations.13,18,19 Moreover, high doses could also enhance the norepinephrine effects of MPD.

Our research hypothesis was that MPD would improve gait.
The aim of this study was to assess the clinical value of a high‐dose, 3‐month course of MPD (1 mg/kg) in STN‐stimulated patients with advanced PD (free of motor fluctuations) who had gait disorders despite optimal dopaminergic doses and STN stimulation parameters. The primary outcome measure was the completion time in the Stand–Walk–Sit (SWS) Test.20 Efficacy was blindly assessed on video in the absence of l‐dopa and then again after acute l‐dopa administration, to assess the potential norepinephrine and/or dopaminergic effects of MPD on gait speed and step length.

10.

Background

Carotid body (CB) glomus cells are highly dopaminergic and express the glial cell line derived neurotrophic factor. The intrastriatal grafting of CB cell aggregates exerts neurotrophic actions on nigrostriatal neurons in animal models of Parkinson disease (PD).

Objective

We conducted a phase I–II clinical study to assess the feasibility, long term safety, clinical and neurochemical effects of intrastriatal CB autotransplantation in patients with PD.

Methods

Thirteen patients with advanced PD underwent bilateral stereotactic implantation of CB cell aggregates into the striatum. They were assessed before surgery and up to 1–3 years after surgery according to CAPIT (Core Assessment Programme for Intracerebral Transplantation) and CAPSIT‐PD (Core Assessment Programme for Surgical Interventional Therapies in Parkinson's Disease) protocols. The primary outcome measure was the change in video blinded Unified Parkinson's Disease Rating Scale III score in the off‐medication state. Seven patients had 18F‐dopa positron emission tomography scans before and 1 year after transplantation.

Results

Clinical amelioration in the primary outcome measure was observed in 10 of 12 blindly analysed patients, which was maximal at 6–12 months after transplantation (5–74%). Overall, mean improvement at 6 months was 23%. In the long term (3 years), 3 of 6 patients still maintained improvement (15–48%). None of the patients developed off‐period dyskinesias. The main predictive factors for motor improvement were the histological integrity of the CB and a milder disease severity. We observed a non‐significant 5% increase in mean putaminal 18F‐dopa uptake but there was an inverse relationship between clinical amelioration and annual decline in putaminal 18F‐dopa uptake (r = −0.829; p = 0.042).

Conclusions

CB autotransplantation may induce clinical effects in patients with advanced PD, which seem partly related to the biological properties of the implanted glomus cells.

Parkinson disease (PD) is a progressive neurodegenerative disorder of unknown aetiology. Its main pathological hallmark is the degeneration of midbrain dopaminergic neurons projecting to the striatum, although other neuronal systems are also affected.1 Current pharmacological and surgical therapies are symptomatically effective but their long term utility is limited because of disease progression.2,3 Therefore, there is a need for neuroprotective and/or neurorestorative therapies capable of arresting or reversing the neurodegenerative process.

Over the past two decades, cell replacement therapies have been tested in PD patients with the objective of restoring the striatal dopaminergic deficit.4 Transplantation of fetal mesencephalic neurons, the most frequently used technique, can increase the striatal dopamine storage, but does not always produce the expected clinical benefit and may induce disabling off‐medication dyskinesias.5,6 Thus it appears that the ectopic placement of dopamine secreting cells in the striatum is not the ideal approach to compensate for progressive nigrostriatal neuronal loss.7 Given this scenario, the clinical applicability of other transplantation procedures based on a similar rationale (eg, intrastriatal grafting of porcine mesencephalic neurons, retinal pigment epithelial cells or stem cell derived dopaminergic neurons) is, for the moment, uncertain.

More recently, other strategies aiming to protect or restore the nigrostriatal pathway have emerged. Glial cell line derived neurotrophic factor (GDNF) has been shown to exert neuroprotective and neurorestorative actions in animal models of PD.8,9,10 The clinical efficacy of GDNF has been assayed in clinical trials, but the method of delivery is a critical issue.
Whereas intraventricular administration failed to induce clinical benefit,11 intraputaminal infusion showed promising results,12,13 although a placebo controlled trial using this route has been halted because of lack of efficacy and safety concerns about recombinant human GDNF administration.14 Other alternative methods being tested experimentally in parkinsonian animals include in vivo gene therapy using GDNF encoding viral vectors15,16,17 and the intrastriatal grafting of recombinant GDNF producing cell lines.18,19,20,21 Carotid body (CB) glomus cells are neural crest derived dopaminergic cells that express high levels of GDNF. Glomus cell GDNF production is resistant to 1‐methyl‐4‐phenyl‐1,2,3,6‐tetrahydropyridine administration, and maintained in aged rodents or after intrastriatal grafting.22,23 The survival rate of these cells after transplantation (>70%) is particularly high, as hypoxia stimulates their growth and function. Moreover, CB grafts performed in young rats remain active for the entire animal lifespan.22,23 Transplantation of CB cell aggregates has been shown to induce a neurotrophic mediated recovery in animal models of PD22,23,24,25,26,27 and stroke.28,29

We conducted a phase I–II video blinded clinical study to assess the long term safety, clinical and neurochemical effects of intrastriatal CB autotransplantation in patients with advanced PD. In a pilot report of our first six patients, we showed this procedure to be feasible.30 Here we report the clinical outcomes and prognostic factors in the whole study (n = 13), as well as 18F‐dopa positron emission tomography (PET) outcomes in a subgroup of patients (n = 7).
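The inverse relationship reported above between clinical amelioration and annual decline in putaminal 18F‐dopa uptake (r = −0.829) is a Pearson correlation over patient pairs. A minimal stdlib sketch of the coefficient itself, with illustrative numbers rather than the study's values:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (hypothetical) data: greater clinical improvement paired
# with a smaller annual decline in putaminal F-dopa uptake
improvement = [5, 15, 25, 40, 60, 74]          # % improvement, hypothetical
decline = [9.0, 8.1, 7.2, 5.5, 3.0, 1.8]       # % annual decline, hypothetical
r = pearson_r(improvement, decline)            # strongly negative, as in the study
```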

11.

Background

To assess whether the premorbid dietary intake of fatty acids, cholesterol, glutamate or antioxidants was associated with the risk of developing amyotrophic lateral sclerosis (ALS).

Methods

Patients referred to our clinic during 2001–2002, who had definite, probable or possible ALS according to El Escorial criteria, without a familial history of ALS, were asked to participate in a case–control study (132 patients and 220 healthy controls). A food‐frequency questionnaire was used to assess dietary intake for the nutrients of interest. Multivariate logistic regression analysis was performed with adjustment for confounding factors (sex, age, level of education, energy intake, body mass index and smoking).

Results

A high intake of polyunsaturated fatty acid (PUFA) and vitamin E was significantly associated with a reduced risk of developing ALS (PUFA: odds ratio (OR) = 0.4, 95% confidence interval (CI) = 0.2 to 0.7, p = 0.001; vitamin E: OR = 0.4, 95% CI = 0.2 to 0.7, p = 0.001). PUFA and vitamin E appeared to act synergistically, because in a combined analysis the trend OR for vitamin E was further reduced from 0.67 to 0.37 (p = 0.02), and that for PUFA from 0.60 to 0.26 (p = 0.005), with a significant interaction term (p = 0.03). The intake of flavonols, lycopene, vitamin C, vitamin B2, glutamate, calcium or phytoestrogens was not associated with the risk of developing ALS.

Conclusion

A high intake of PUFAs and vitamin E is associated with a 50–60% decreased risk of developing ALS, and these nutrients appear to act synergistically.

Sporadic amyotrophic lateral sclerosis (ALS) probably develops through the combined effects of several modifying genes and environmental factors.1 Despite several studies that investigated environmental exposures in relation to ALS, age, gender and smoking are the only established risk factors.2 Several, not mutually exclusive, pathological processes may contribute to motor neurone death in ALS in a so‐called convergence model,3 including oxidative stress, mitochondrial dysfunction, protein misfolding, axonal strangulation, apoptosis, inflammation, glutamate excitotoxicity and defects in neurotrophin biology. Nutrients are factors that could influence these processes and thereby the risk of developing ALS or its clinical expression.

ALS was previously found to be positively associated with intake of glutamate,4 fat,4 fish5 and milk,6,7 and inversely associated with intake of lycopene,8 dietary fibre,4 bread and pasta.9 Two other studies, however, failed to establish the relationship with milk.10,11 Several of these studies included only small samples of patients (<25),5,6,9 or investigated nutrition as one of many environmental factors, thus increasing the likelihood of chance findings.5,6,7,9,10,11 Furthermore, most studies did not account for the possible influence of clinical onset preceding the diagnosis5,6,7,8,9,10,11 or adjust for possible confounders including total energy intake, body mass index (BMI), sex, smoking and education.5,6,7,9,10,11

One study found an association between intake of total fat and ALS, although this was not hypothesised beforehand.4 This finding is of interest considering the observed associations of intake of saturated and unsaturated fatty acids and cholesterol with other neurodegenerative diseases.12 In this case–control study, therefore, we examined the possible association between
premorbid dietary intake of fatty acids, cholesterol, glutamate, phytoestrogens, calcium and anti‐oxidants and the risk of developing ALS, adjusting for confounding factors.
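The synergy reported above corresponds to an interaction term in the logistic model: with a negative interaction coefficient, the protective OR for one nutrient becomes stronger at a high level of the other. A sketch with assumed, purely illustrative coefficients (not the study's estimates):

```python
import math

# Illustrative log-odds model for ALS risk with an interaction term:
#   logit(p) = b0 + b_pufa*PUFA + b_vite*VITE + b_int*PUFA*VITE
# All coefficients below are assumed values, for demonstration only.
b0, b_pufa, b_vite, b_int = -1.0, -0.5, -0.4, -0.6

def or_pufa(vite):
    """OR for high vs low PUFA intake, at a given vitamin E level (0 or 1)."""
    return math.exp(b_pufa + b_int * vite)

print(or_pufa(0))  # exp(-0.5) ~ 0.61: protective at low vitamin E
print(or_pufa(1))  # exp(-1.1) ~ 0.33: stronger protection at high vitamin E
```

This mirrors the pattern in the results: the trend ORs for PUFA and vitamin E each shrink further when the other exposure is present and the interaction term is significant.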

12.

Objective

To determine the usefulness of an interactive multimedia internet‐based system (IMIS) for the cognitive stimulation of Alzheimer's disease.

Methods

This is a 24‐week, single‐blind, randomised pilot study conducted on 46 mildly impaired patients suspected of having Alzheimer's disease receiving stable treatment with cholinesterase inhibitors (ChEIs). The patients were divided into three groups: (1) those who received 3 weekly, 20‐min sessions of IMIS in addition to 8 h/day of an integrated psychostimulation program (IPP); (2) those who received only IPP sessions; and (3) those who received only ChEI treatment. The primary outcome measure was the Alzheimer's Disease Assessment Scale‐Cognitive (ADAS‐Cog). Secondary outcome measures were: Mini‐Mental State Examination (MMSE), Syndrom Kurztest, Boston Naming Test, Verbal Fluency, and the Rivermead Behavioral Memory Test story recall subtest.

Results

After 12 weeks, the patients treated with both IMIS and IPP had improved outcome scores on the ADAS‐Cog and MMSE, which was maintained through 24 weeks of follow‐up. The patients treated with IPP alone had better outcome than those treated with ChEIs alone, but the effects were attenuated after 24 weeks. All patients had improved scores in all of the IMIS individual tasks, attaining higher levels of difficulty in all cases.

Conclusion

Although both the IPP and IMIS improved cognition in patients with Alzheimer's disease, the IMIS program provided an improvement above and beyond that seen with IPP alone, which lasted for 24 weeks.

Alzheimer's disease is the most frequent form of dementia in elderly people,1,2 and its current treatment includes cholinesterase inhibitors (ChEIs)3,4,5 and n‐methyl‐d‐aspartate receptor blockers (eg, memantine).6 However, symptomatic treatment often entails non‐pharmacological treatments as well, and adequate dementia management requires a wide range of interventions to help maximise the patient's independence, increase their self‐confidence and relieve the burden on the care giver.

Current symptomatic treatment of Alzheimer's disease can improve cognition and functionality.3,4,5,6 However, before the emergence of these drugs, non‐pharmacological treatments had already been evaluated and cognitive stimulation had been found to be potentially beneficial for patients with dementia.7,8,9 Although these non‐pharmacological treatments do not always seem efficacious, methodological problems may limit the validity of some studies.10 A recent Cochrane review11 emphasised caution when interpreting the results of non‐pharmacological treatments, but suggested that certain cognitive domains could, in fact, benefit from these types of interventions.

Clinical and laboratory studies have shown that mental and physical activity can positively influence cognition in normal elderly people and people with dementia.
Education12 and lifestyle choices (eg, occupation and leisure activities)13,14,15 can modulate the risk of developing dementia, and psychomotor stimulation improves cognition in patients with Alzheimer's disease.16,17 Environmental enrichment can improve cognition in transgenic mice.18,19 Despite the continued deposition of β‐amyloid, exercise can increase the levels of brain‐derived neurotrophic factor20 and may reduce amyloid burden.21

Despite the progressive nature of the degenerative process, patients with Alzheimer's disease also seem to retain the physiological capacity to alter brain structure and function. Recent studies have shown cognitive plasticity and learning potential not only in patients with Alzheimer's disease but also in healthy elders.22,23 Positron emission tomography studies that used activation paradigms24,25 have found that people with Alzheimer's disease have greater activation than those without dementia in the brain regions usually associated with memory tasks, as well as in the frontal lobes, which were activated only with increasing difficulty of tasks. Pathological studies conducted on biopsy specimens of patients with Alzheimer's disease with mild or moderate dementia have shown increased synaptic contact size.26 Thus, the brain may be able to compensate during the early stages of Alzheimer's disease, suggesting that there may be some utility to non‐pharmacological adjunctive interventions.

Although studies on cognitive stimulation show that it is possible to stimulate the memory of patients with Alzheimer's disease, the results are often modest. Because of methodological limitations, there is a need to conduct more randomised controlled trials with larger samples to validate this therapeutic approach. Computerised systems27 and internet‐based distance programs offer one potential mechanism by which non‐pharmacological cognitive stimulation can be conducted in patients with dementia.
In this study, we evaluated an interactive multimedia internet‐based system (IMIS) as an adjunct to ChEI treatment and classic psychostimulation treatment.

13.

Objective

To compare the profile of cognitive impairment in Alzheimer's disease (AD) with that of dementia associated with Parkinson's disease (PDD).

Methods

Neuropsychological assessment was performed in 488 patients with PDD and 488 patients with AD using the Mini‐Mental State Examination (MMSE) and the Alzheimer's Disease Assessment Scale‐cognitive subscale (ADAS‐cog). Logistic regression analysis was used to investigate whether the diagnosis could be accurately predicted from the cognitive profile. Additionally, the cognitive profiles were compared with a normative group using standardised effect sizes (Cohen's d).

Results

Diagnosis was predicted from the cognitive profile, with an overall accuracy of 74.7%. Poor performance of the AD patients on the orientation test in ADAS‐cog best discriminated between the groups, followed by poor performance of the PDD patients on the attentional task in MMSE. Both groups showed memory impairment, AD patients performing worse than PDD patients.

Conclusion

The cognitive profile in PDD differs significantly from that in AD. Tests of orientation and attention best differentiate the groups.

Alzheimer's disease (AD) and Parkinson's disease (PD) are the most common neurodegenerative diseases in the elderly. AD is primarily a dementing disease whereas PD is mainly characterised by a movement disorder. However, dementia is common among patients with PD (PDD), with an average point prevalence of 31%1 and a cumulative prevalence close to 80%.2 In PD, dementia is associated with rapid motor3 and functional decline,4 and increased mortality.5

Cortical Lewy body pathology correlates best with dementia in PD6,7,8,9; subcortical pathology10 and AD‐type pathology11 have also been found to be associated with PDD. In addition to differences in morphological changes, AD and PDD also differ in the regional pattern of the pathology. In AD the first and most pronounced changes are found in the entorhinal cortex and parahippocampal region,12 subsequently involving neocortical areas, including the posterior association cortices.13 In contrast, in patients with PD without dementia, brainstem nuclei and other subcortical structures are initially affected.14 In PDD, limbic areas, neocortical association cortices, and the motor cortex and primary sensory cortical areas are thought to be successively involved with disease progression.15

Given the difference in the distribution and progression of pathology in AD and PDD, it is expected that their cognitive profiles would also differ.16,17 AD is characterised by memory loss emerging in the early stages of the disease,18 primarily involving learning and encoding deficits19 which are associated with medial temporal lobe pathology.20,21,22,23 As the disease progresses, deficits in language, praxis, visuospatial and executive functions gradually develop.
In contrast, the cognitive deficits in the early stages of PDD are characterised by executive dysfunction, including impairment in attention24 and working memory,25,26,27 reflecting involvement of brainstem nuclei and frontal–subcortical circuits; deficits in visuoperceptual28,29,30 and visuoconstructional tasks are also frequent.31 Memory impairment is often present26,32,33,34 but whether it is primarily a consequence of frontally mediated executive deficits resulting in poor learning efficacy and retrieval, or whether involvement of limbic areas directly related to memory encoding (such as hippocampal atrophy) also contributes to memory impairment, is debated. Patients with PDD have difficulties in retrieving newly learned material, but perform better in recognition,35 indicating that executive, rather than encoding, deficits are the underlying mechanism. Conflicting results, however, have been reported recently,36,37 which could indicate that the type and mechanisms of memory deficits may vary within the PD group.32

Most studies investigating the cognitive profile of PDD patients included small samples which were not community based and thus not necessarily representative of the PD population at large. As there is evidence of interindividual heterogeneity,33 such studies may not adequately reflect the cognitive profile of patients with PDD. In order to assess the profile of cognitive deficits in PDD compared with AD in larger patient populations, we analysed the baseline cognitive data from large clinical trials conducted with the cholinesterase inhibitor rivastigmine.38,39
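The standardised effect sizes (Cohen's d) used above to compare each patient group with the normative sample are the mean difference expressed in units of the pooled standard deviation. A minimal sketch with made-up numbers:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical test scores: a patient group vs a normative group with equal SDs
d = cohens_d(10.0, 2.0, 50, 8.0, 2.0, 50)   # (10 - 8) / 2 = 1.0, a large effect
```

By convention d ≈ 0.2 is a small effect, 0.5 medium and 0.8 large, which is how such profile comparisons are usually read.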

14.

Objective

To evaluate cognitive outcome in adult survivors of bacterial meningitis.

Methods

Data from three prospective multicentre studies were pooled and reanalysed, involving 155 adults surviving bacterial meningitis (79 after pneumococcal and 76 after meningococcal meningitis) and 72 healthy controls.

Results

Cognitive impairment was found in 32% of patients and this proportion was similar for survivors of pneumococcal and meningococcal meningitis. Survivors of pneumococcal meningitis performed worse on memory tasks (p<0.001) and tended to be cognitively slower than survivors of meningococcal meningitis (p = 0.08). We found a diffuse pattern of cognitive impairment in which cognitive speed played the most important role. Cognitive performance was not related to time since meningitis; however, there was a positive association between time since meningitis and self‐reported physical impairment (p<0.01). The frequency of cognitive impairment and the numbers of abnormal test results for patients with and without adjunctive dexamethasone were similar.

Conclusions

Adult survivors of bacterial meningitis are at risk of cognitive impairment, which consists mainly of cognitive slowness. The loss of cognitive speed is stable over time after bacterial meningitis; however, there is a significant improvement in subjective physical impairment in the years after bacterial meningitis. The use of dexamethasone was not associated with cognitive impairment.

The estimated annual incidence of bacterial meningitis is 4–6 per 100 000 adults, and Streptococcus pneumoniae (pneumococcus) and Neisseria meningitidis (meningococcus) are the causative bacteria in 80% of cases.1,2 Fatality rates in patients with pneumococcal meningitis (26%) and meningococcal meningitis (7%) are substantial.1,2,3 Even in patients with apparently good recovery, cognitive impairment occurs frequently,4 especially after pneumococcal meningitis.4,5,6 The cognitive functions affected by bacterial meningitis differ between studies, most likely because of the limited numbers of patients examined, and the lack of uniformity across studies in assessment methods and in the definition of cognitive impairment.4,5,6,7,8,9,10 We therefore pooled data on cognitive outcome after bacterial meningitis from three of our previous studies to determine more clearly which cognitive functions are affected by bacterial meningitis and to identify which patients are at risk of developing cognitive impairment.
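Whether the proportion of impaired survivors differs between the pneumococcal and meningococcal groups is the kind of comparison typically tested with a Pearson chi-square on a 2×2 table. A minimal sketch, with counts that only approximate the figures above:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table (no continuity correction).
    Rows: groups; columns: impaired / not impaired."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Approximate, illustrative counts (impaired vs not, by causative organism):
chi2 = chi2_2x2(26, 53, 24, 52)   # small statistic: proportions are similar
```

A statistic near zero (well below the 3.84 critical value at one degree of freedom) is consistent with the report that the impairment rates were similar in the two groups.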

15.

Background

Psychiatric symptoms are a common feature of Huntington's disease (HD) and often precede the onset of motor and cognitive impairments. However, it remains unclear whether psychiatric changes in the preclinical period result from structural change, are a reaction to being at risk or are simply a coincidental occurrence. Few studies have investigated the temporal course of psychiatric disorder across the preclinical period.

Objectives

To compare lifetime and current prevalence of psychiatric disorder in presymptomatic gene carriers and non‐carriers and to examine the relationship of psychiatric prevalence in gene carriers to temporal proximity of clinical onset.

Methods

Lifetime and current psychiatric histories of 204 at risk individuals (89 gene carriers and 115 non‐carriers) were obtained using a structured clinical interview, the Composite International Diagnostic Interview. Psychiatric disorders were classified using both standardised diagnostic criteria and a more subtle symptom based approach. Follow‐up of gene carriers (n = 51) enabled analysis of the role of temporal proximity to clinical onset.

Results

Gene carriers and non‐carriers did not differ in terms of the lifetime frequency of clinical psychiatric disorders or subclinical symptoms. However, gene carriers reported a significantly higher rate of current depressive symptoms. Moreover, the rate of depression increased as a function of proximity to clinical onset.

Conclusions

Affective disorder is an important feature of the prodromal stages of HD. The findings indicate that depression cannot be accounted for by natural concerns of being at risk. There is evidence of a window of several years in which preclinical symptoms are apparent.

Huntington's disease (HD) is an inherited neurodegenerative disorder, characterised by motor dysfunction, cognitive impairment and psychiatric disturbance. HD is associated with a wide range of psychiatric disturbances, including affective disorders,1,2,3 irritability,4,5,6 apathy1,3,6 and psychosis.4,7,8 Both major depression1,2,4,9 and more subtle mood disturbances10 have been reported to predate clinical onset, conventionally defined by onset of motor symptoms. However, the basis for psychiatric symptoms remains unclear. Depression has been observed to occur up to 20 years before the onset of motor symptoms,9,11 raising the possibility that psychiatric symptoms are an early indicator of HD and result from incipient neurodegenerative changes. However, the finding that psychiatric symptoms tend to cluster in certain HD families might indicate that psychiatric changes have a genetic basis and reflect a “switching on” of the HD gene early in life.2,8 High rates of psychiatric disturbance have also been observed in HD family members who do not carry the genetic mutation,9,10 raising the alternative possibility that affective changes arise in response to emotional stressors, such as being at risk, or the burden of growing up in a family with affected members. A more thorough understanding of the underlying basis of psychiatric changes in preclinical gene carriers is crucial, as future therapeutic strategies are most likely to target such individuals.

Previous psychiatric studies of at risk individuals have yielded inconsistent results.
Earlier studies reported high lifetime rates of psychiatric disorder in preclinical gene carriers (eg, 18% major affective disorder),2 whereas more recent studies indicate little difference between rates for gene carrier and non‐carrier groups.10,12,13,14 A number of factors may account for these discrepancies. The majority of earlier reports were limited to retrospective observation of affected individuals and therefore lacked appropriate controls.4,5 The advent of predictive testing has enabled direct comparison of at risk individuals who have the HD mutation and those who do not, thereby controlling for social and environmental factors.10,12,13,14 Whereas the majority of earlier studies lacked standardised assessment criteria,4,7 more recent studies have utilised operational diagnostic criteria, although these have in turn been criticised for failing to detect the more subtle psychiatric disturbances that can occur in HD.3,15

Few studies have taken account of the temporal distance to onset of motor symptoms. It is now well established that the clinical onset of HD is typically preceded by a prodromal period of several months or years during which non‐specific mild neurological signs arise intermittently.16 The difficulty in establishing exact dates of onset for retrospective cases may have led to the inclusion in earlier studies of individuals who were already in the early stages of HD. Studies of presymptomatic individuals have typically recruited participants without motor signs, who may have been further from clinical onset.

The present study is a double blind comparison of lifetime and current prevalence of psychiatric disorders in preclinical gene carriers and non‐carriers, using a combination of standardised psychiatric diagnostic criteria and a more subtle symptom based approach. Follow‐up of gene carriers has enabled analysis of the role of temporal proximity to clinical onset.

16.

Aim

To assess the long‐term cognitive and behavioural outcome after bilateral deep brain stimulation (DBS) of the subthalamic nucleus (STN) in patients affected by Parkinson's disease, with a 5‐year follow‐up after surgery.

Methods

11 patients with Parkinson's disease treated by bilateral DBS of STN underwent cognitive and behavioural assessments before implantation, and 1 and 5 years after surgery. Postoperative cognitive assessments were carried out with stimulators turned on.

Results

A year after surgery, there was a marginally significant decline on a letter verbal fluency task (p = 0.045) and a significant improvement on Mini‐Mental State Examination (p = 0.009). 5 years after surgery, a significant decline was observed on a letter verbal fluency task (p = 0.007) and an abstract reasoning task (p = 0.009), namely Raven's Progressive Matrices 1947. No significant postoperative change was observed on other cognitive variables. No patient developed dementia 5 years after surgery. A few days after the implantation, two patients developed transient manic symptoms with hypersexuality and one patient developed persistent apathy.

Conclusion

The decline of verbal fluency observed 5 years after implantation for DBS in STN did not have a clinically meaningful effect on daily living activities in our patients with Parkinson's disease. As no patient developed global cognitive deterioration in our sample, these findings suggest that DBS of STN is associated with a low cognitive and behavioural morbidity over a 5‐year follow‐up, when selection criteria for neurosurgery are strict.

Chronic bilateral deep brain stimulation (DBS) of the subthalamic nucleus (STN) is an effective neurosurgical procedure for treatment of motor symptoms in patients with advanced Parkinson's disease, who cannot be satisfactorily treated with pharmacological treatments. The safety of this procedure has been investigated by several studies, which have assessed the effects of STN DBS on cognition and behaviour.1,2,3 Some investigations have also attempted to distinguish between the cognitive effects of surgical intervention and those of DBS of STN in itself.4,5,6,7 All neuropsychological investigations in patients treated by STN DBS showed a postoperative decline of verbal fluency, whereas less consistent effects have been reported on other cognitive tasks in different studies. A postoperative decline of episodic verbal memory, which was detectable 3 months after surgery, has been reported in some investigations.6,8 Different effects of STN DBS on various frontal cognitive functions have been described.
STN stimulation may impair response‐inhibition performance on the interference task of the Stroop test, as compared with the off‐stimulation condition.5,7,9 A positron emission tomography study showed that such impaired performance on the Stroop test in the on‐stimulation condition is associated with decreased activation in both the right anterior cingulate cortex and the right ventral striatum.9 Conversely, short‐term STN stimulation may improve performance on cognitive flexibility tasks, including random number generation7 and the Modified Wisconsin Card Sorting Test (MWCST).5 Various behavioural effects have been described in patients with Parkinson's disease treated by STN DBS. Some studies reported cases of depression10 or increased apathy,11 whereas cases of mania were described in other studies12,13,14 and an improvement of depression1 or apathy15 was also found. The long‐term cognitive and behavioural effects of bilateral STN DBS were investigated in 70 patients with Parkinson's disease followed up for 3 years.11 In this study, a decline of verbal fluency, an improvement of depression and increased apathy were observed 3 years after surgery. Some patients showed behavioural changes (aggressive behaviour, hypomania, depression and psychosis), which were mostly transient. Recently, the long‐term outcome of bilateral DBS of STN was investigated in a multicentre study conducted in 49 patients with Parkinson's disease followed up for 3 or 4 years.16 This study showed that stimulation of the STN induced a significant improvement in Parkinsonian motor symptoms and activities of daily living 3–4 years after surgery.
Among the adverse events, the authors reported memory decline or psychiatric disturbances (including hallucinations, delirium, depression, apathy and anxiety), which occurred in about 30% of the patients. In two recent investigations, the long‐term outcome of bilateral DBS of STN was investigated in patients with a 5‐year follow‐up.17,18 In one study conducted on 49 patients with Parkinson's disease,17 cognitive performance was assessed by means of the Mattis Dementia Rating Scale (MDRS)19 and a frontal‐lobe score.4 Five years after surgery, there was a marked improvement of both motor function, while off drugs, and activities of daily living, a statistical trend towards a decline on the MDRS (reflecting the appearance of progressive dementia in three patients between the third and the fifth postoperative years) and a significant decline in the average frontal‐lobe score. Another study carried out on 37 patients with Parkinson's disease18 also assessed cognitive performance by means of MDRS19 and a frontal score.20 Five years after the implantation, there was an improvement in Parkinsonian motor symptoms and activities of daily living and a reduction of levodopa‐related motor complications and levodopa daily doses. However, a significant decline in cognitive performance was detected on the MDRS and the frontal score. To our knowledge, no extensive neuropsychological data have been reported so far in patients with a follow‐up >3 years. The aim of the present study was to assess the long‐term cognitive and behavioural outcome after bilateral DBS of the STN in a series of patients followed up for 5 years after surgery.

17.

Background

Adult normal pressure hydrocephalus (NPH) is one of the few potentially treatable causes of dementia. Some morphological and functional abnormalities attributed to hydrocephalus improve following treatment.

Objectives

We focused on analysis of changes in cerebral metabolites using proton magnetic resonance spectroscopy (1H‐MRS) after NPH treatment, and on their clinical and cognitive correlates.

Methods

1H‐MRS, neuropsychological and clinical status examinations were performed before and 6 months after shunting in 12 adults with idiopathic NPH. We obtained N‐acetyl‐aspartate (NAA), choline (Cho), myoinositol (MI) and creatine (Cr) values.

Results

After surgery, NAA/Cr was significantly increased. Moreover, NAA/Cr values were related to cognitive deterioration.

Conclusion

MRS could be a marker of neuronal dysfunction in NPH.

Normal pressure hydrocephalus (NPH) is a potentially treatable cause of dementia,1,2 characterised by progressive cognitive dysfunction, gait disturbance and urinary incontinence associated with ventricular enlargement and abnormalities in CSF dynamics. In these patients, some morphological and functional abnormalities attributed to hydrocephalus improve after treatment.3,4,5 Proton magnetic resonance spectroscopy (1H‐MRS) allows non‐invasive in vivo measurement of brain metabolites. Findings from MRS studies reveal that 1H‐MRS is a potentially non‐invasive technique with sufficient sensitivity to detect subtle changes in neuronal function in neurodegenerative diseases, allowing investigation of neuronal injury or dysfunction6,7 and the assessment of treatment efficacy.8,9,10 1H‐MRS studies in patients with hydrocephalus are scarce.6,7,11,12,13,14,15 Changes in cerebral metabolites after treatment of hydrocephalus using this technique have been analysed in only two studies, which concentrated exclusively on the results of lactate metabolites.11,12 The aim of our study was to describe changes in other major metabolites, using 1H‐MRS, before and after treatment in idiopathic NPH patients, and to obtain preliminary data on their clinical and cognitive correlation, which could serve as the basis for larger studies with control subjects.
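The significant postoperative increase in NAA/Cr reported above rests on a paired pre/post design in 12 patients. A minimal sketch of such a paired comparison might look like the following; the ratio values are invented for illustration and are not the study's data:

```python
# Paired pre/post comparison of NAA/Cr ratios, as in a shunt study
# with 12 patients scanned before and 6 months after surgery.
# All values below are invented for illustration only.
from math import sqrt
from statistics import mean, stdev

naa_cr_pre  = [1.32, 1.28, 1.41, 1.25, 1.30, 1.22, 1.35, 1.27, 1.29, 1.33, 1.24, 1.31]
naa_cr_post = [1.45, 1.39, 1.52, 1.31, 1.44, 1.30, 1.47, 1.36, 1.38, 1.49, 1.33, 1.42]

diffs = [post - pre for pre, post in zip(naa_cr_pre, naa_cr_post)]
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))  # paired t statistic
T_CRIT_5PCT_DF11 = 2.201  # two-tailed 5% critical value of Student's t, df = 11
verdict = "significant" if abs(t_stat) > T_CRIT_5PCT_DF11 else "not significant"
print(f"mean NAA/Cr change: {mean(diffs):+.3f}, t = {t_stat:.2f}, {verdict} at the 5% level")
```

The critical-value comparison stands in for a full p-value calculation so the sketch needs only the standard library; a real analysis would report the exact p-value.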

18.
19.

Background

The precise time of stroke onset during sleep is difficult to specify, but this has a considerable influence on circadian variations of stroke onset.

Aim

To investigate circadian variations in situations at stroke onset—that is, in the waking state or during sleep—and their differences among subtypes.

Methods

12 957 cases of first‐ever stroke, diagnosed by computed tomography or magnetic resonance imaging and registered in the Iwate Stroke Registry between 1991 and 1996, were analysed. Circadian variations were compared using the number of onsets in each 2‐h period, expressed as a relative risk against the expected number (the average of the 12 2‐h intervals), in the waking state or during sleep, for cerebral infarction (CIF), intracerebral haemorrhage (ICH) and subarachnoid haemorrhage (SAH).
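The comparison described here, onset counts per 2‐h period expressed as a relative risk against the expected number (the average over the 12 intervals), can be sketched as follows; the counts are invented for illustration, not taken from the Iwate registry:

```python
# Relative risk per 2-h interval: RR_i = observed onsets in interval i
# divided by the expected count under a uniform-onset assumption
# (daily total / 12). Counts below are illustrative, not registry data.

def relative_risk_by_interval(onset_counts):
    """onset_counts: 12 onset counts, one per 2-h interval from 00:00.
    Returns a list of 12 relative risks."""
    if len(onset_counts) != 12:
        raise ValueError("expected 12 two-hour intervals")
    expected = sum(onset_counts) / 12  # average across the 12 intervals
    return [count / expected for count in onset_counts]

# Illustrative morning-peaked distribution (e.g. waking-state infarction):
counts = [40, 35, 30, 45, 90, 120, 100, 80, 70, 75, 65, 50]
rr = relative_risk_by_interval(counts)
peak = max(range(12), key=lambda i: rr[i])
print(f"peak interval: {peak * 2:02d}:00-{peak * 2 + 2:02d}:00, RR = {rr[peak]:.2f}")
```

An interval with RR > 1 has more onsets than expected under uniform timing; a bimodal subtype would simply show two such intervals separated by a trough.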

Results

When all situations at onset were considered together, ICH and SAH showed bimodal circadian variations whereas CIF had a single peak; in the waking state alone, all three subtypes showed bimodal circadian variations. These waking‐state variations differed in that CIF showed a bimodal pattern with a higher peak in the morning and a lower peak in the afternoon, whereas ICH and SAH had the same bimodal pattern with lower and higher peaks in the morning and afternoon, respectively.

Conclusions

Sleep, or the physiological state during sleep, tends to promote ischaemic stroke and to suppress haemorrhagic stroke. Triggers or factors that promote ischaemic stroke and prevent haemorrhagic stroke in the morning appear to underlie the differing waking‐state variations between ischaemic and haemorrhagic stroke.

Stroke occurrence shows chronobiological variations,1 such as circannual variations, circaseptan variations and circadian variations. Various patterns have been reported but no conclusions have yet been reached on circadian variations. The circadian variations of stroke onset may differ according to subtype or reporter, and are classified as cerebral infarction (CIF) with a single peak2,3,4,5,6 or double peaks,7,8 subarachnoid haemorrhage (SAH) with a single peak9 or double peaks,6,10,11,12,13,14 and intracerebral haemorrhage (ICH) with double peaks.6,10,12 Most previous studies have not treated the three major subtypes simultaneously. Only three reports6,7,8 discussed all three subtypes, but the number of cases of ICH, and especially of SAH, was too small for investigation of circadian variation. This may have led to differences in the reported patterns of circadian variation. Large numbers of cases in population‐based samples are required to investigate and compare the circadian variations of stroke onset among subtypes. For investigation of the triggers and risk factors of stroke onset, it is necessary to determine the circadian variations of stroke onset with precise times. The precise time of stroke onset during sleep is difficult to specify, but this has a considerable influence on circadian variations of stroke onset. We investigated circadian variation in stroke onset by situations at onset in CIF, ICH and SAH in a Japanese population, by using stroke registry data. We also investigated the differences in circadian variations, triggers and risk factors among subtypes.

20.

Background

In Latvia and other endemic regions, a single tick bite has the potential to transmit both tick‐borne encephalitis (TBE) and Lyme borreliosis.

Objective

To analyse the clinical features and differential diagnosis of combined tick‐borne infection with TBE and Lyme borreliosis in 51 patients with serological evidence of both infections, of whom 69% had tick bites.

Results

Biphasic fever suggestive of TBE occurred in 55% of the patients. Meningitis occurred in 92%, with painful radicular symptoms in 39%. Muscle weakness occurred in 41%; in 29% the flaccid paralysis was compatible with TBE. Only two patients presented with the bulbar palsy typical of TBE. Typical Lyme borreliosis facial palsy occurred in three patients. Typical TBE oculomotor disturbances occurred in two. Other features typical of Lyme borreliosis detected in our patients were distal peripheral neuropathy (n = 4), arthralgia (n = 9), local erythema 1–12 days after tick bite (n = 7) and erythema chronicum migrans (n = 1). Echocardiogram abnormalities occurred in 15 patients.

Conclusions

Patients with double infection with TBE and Lyme borreliosis fell into three main clinical groups: febrile illness, 3 (6%); meningitis, 15 (30%); central or peripheral neurological deficit (meningoencephalitis, meningomyelitis, meningoradiculitis and polyradiculoneuritis), 33 (65%). Systemic features pointing to Lyme borreliosis were found in 25 patients (49%); immunoglobulin (Ig)M antibodies to borreliosis were present in 18 of them. The clinical occurrence of both Lyme borreliosis and TBE varies after exposure to tick bite, and the neurological manifestations of each disorder vary widely, with considerable overlap. This observational study provides no evidence that co‐infection produces unusual manifestations due to unpredicted interaction between the two diseases. Patients with tick exposure presenting with acute neurological symptoms in areas endemic for both Lyme borreliosis and TBE should be investigated for both conditions. The threshold for simultaneous treatment of both conditions should be low, given the possibility of co‐occurrence and the difficulty in ascribing individual neurological manifestations to one condition or the other.

The Baltic region is an endemic focus for both tick‐borne encephalitis (TBE) and Lyme borreliosis transmitted by ticks.1,2,3,4 In Latvia, 7061 cases of TBE and 3566 cases of Lyme borreliosis were registered between 1994 and 2003, out of a population of 2.4 million. Both tick species present in Latvia, Ixodes ricinus and I. persulcatus, can transmit the encephalitis virus, the borreliosis spirochete and, more rarely, ehrlichiosis. A single tick bite has the potential to transmit both infections.5 Despite their different clinical courses, TBE and Lyme borreliosis have neurological features in common: lymphocytic meningitis, flaccid or spastic limb weakness and cranial nerve involvement.
Thus, differentiating between these disorders is important, given their different approaches to treatment. Of the two infections, only TBE runs a biphasic course, with the initial prodromal period of influenza‐like symptoms usually developing 1–2 weeks after the tick bite. Then, after an asymptomatic period lasting 2–10 days, about a third of infected patients enter a second phase with aseptic meningitis.2 Subsequently, 2–10% in the Western TBE subtype or 10–25% in the Eastern TBE subtype develop encephalitis, myelitis or meningoencephalomyelitis, typically manifesting as combinations of flaccid paresis of the limbs, usually the arms and neck, bulbar dysfunction, disorientation, aphasia and spastic paresis.1,2 A poliomyelitis‐like syndrome is described in central European TBE.6 Manifestations of TBE in the Baltic may be heterogeneous, given that the Western, Far Eastern and Siberian subtypes all cause human infection in Latvia.7 Although severe manifestations usually subside after 3–6 weeks, the convalescence period of TBE may be very long, with nearly 40% having a postencephalitic syndrome at 4 years.8 The uptake of TBE vaccination is increasing in the Baltic region. Classical Lyme borreliosis differs considerably from TBE and produces local and generalised forms, systemic involvement, and development over several stages. Its acute and chronic courses pose problems of diagnosis and management.1,9 Diagnosis of neuroborreliosis requires a definite or possible tick bite, erythema migrans or seropositivity, and typical peripheral or central nervous system involvement.10 In early neuroborreliosis (2–10 weeks after tick bite) the most common neurological abnormalities are meningitis, meningoradiculoneuritis and cranial neuritis, particularly facial palsy.1,9,10,11 Progressive chronic encephalomyelitis, polyneuritis and cerebrovascular disorders are later manifestations of Lyme borreliosis, usually occurring months after the initial infection.
Neurological features are noted in 10–12% of all patients with Lyme borreliosis in Europe1 and in 10–15% of patients in North America.11 Neurological manifestations in 330 European patients with Lyme borreliosis included radicular pain (70%), headache (18%), peripheral paresis (45%), central paresis (4%), sensory disturbances (44%) and facial palsy (39%).1 Borrelia infection takes a subclinical or minimally symptomatic course in up to 80% of the population after tick bites.12 Importantly, borreliosis is treatable with antibiotics. TBE infection can be proven by specific and sensitive ELISA detection of antibody in cerebrospinal fluid (CSF), or by detection of genome through polymerase chain reaction.13 Serum IgM antibodies can remain positive for ⩾10 months.2,14 By contrast, serological tests for Lyme borreliosis infection are less sensitive and specific, owing to the variable onset and occurrence of specific IgM and IgG antibodies, with recognised persistent seronegatives; direct detection of the pathogen is rarely possible, and reliance must be placed on interpreting the laboratory investigations in the light of the clinical picture.13,15,16 Demonstration of intrathecal antibody production provides a specific test,17 but is not sensitive in detecting all forms of neuroborreliosis.15 Despite their different clinical courses, TBE and Lyme borreliosis have neurological features in common: lymphocytic meningitis, flaccid or spastic limb weakness, and cranial nerve involvement. Pain, particularly in a radicular distribution, and sensory disturbance are regarded as features more typical of Lyme borreliosis than TBE. Only limited information on double infection with TBE and Lyme borreliosis is available.
Single cases, small series or serologically defined series with limited clinical information are described from Germany, Slovenia, Central Russia and Finland.18,19,20,21,22,23,24 This retrospective clinical observational study analyses the clinical features and problems of differential diagnosis in patients with evidence of both TBE and Lyme borreliosis infection in Latvia.
