Similar Articles (20 results)
1.

Objectives

To investigate the relationship of 4 sarcopenia definitions with long-term all-cause mortality risk in older Australian women.

Design

Data from the Perth Longitudinal Study in Aging Women from 2003 to 2013 were examined in this prospective cohort study. The 4 sarcopenia definitions were the United States Foundation for the National Institutes of Health (FNIH), the European Working Group on Sarcopenia in Older People (EWGSOP), and adapted FNIH (AUS-POPF) and EWGSOP (AUS-POPE) definitions using Australian population-specific cut-points [<2 standard deviations (SD) below the mean of young healthy Australian women]. All-cause mortality was captured via linked data systems.

Setting and Participants

In total, 903 community-dwelling older Australian women (baseline mean age 79.9 ± 2.6 years) with concurrent measures of muscle strength (grip strength), physical function (timed-up-and-go; TUG) and appendicular lean mass (ALM) were included.

Measures

Cox proportional hazards modeling was used to examine the relationship between sarcopenia definitions and mortality over 5 and 9.5 years.

Results

Baseline prevalence of sarcopenia by the 4 definitions differed substantially [FNIH (9.4%), EWGSOP (24.1%), AUS-POPF (12.0%), AUS-POPE (10.7%)]. EWGSOP and AUS-POPE had increased age-adjusted hazard ratios (aHRs) for mortality over 5 years [aHR 1.88, 95% confidence interval (CI) 1.24-2.85, P < .01; aHR 2.52, 95% CI 1.55-4.09, P < .01, respectively] and 9.5 years (aHR 1.39, 95% CI 1.06-1.81, P = .02; aHR 1.94, 95% CI 1.40-2.69, P < .01, respectively). No such associations were observed for FNIH or AUS-POPF. Sarcopenia components including weaker grip strength (per SD, 4.9 kg; 17%) and slower TUG (per SD, 3.1 seconds; 40%), but not ALM-adjusted variants (ALM/body mass index or ALM/height²), were associated with greater relative hazards for mortality over 9.5 years.

Conclusions/Relevance

Unlike FNIH, the EWGSOP sarcopenia definition incorporating weak muscle strength and/or poor physical function was related to prognosis, as was the regionally adapted version of EWGSOP. Although sarcopenia definitions were not developed based on prognosis, this is an important consideration for globally standardizing the sarcopenia framework.
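The grip strength result above (a 17% higher hazard per SD of 4.9 kg) rests on the standard Cox model identity HR per SD = exp(β · SD). A minimal sketch of that arithmetic; the per-kg coefficient is back-calculated from the abstract's numbers and is not reported in the study:

```python
import math

# Hedged sketch: relating a per-unit Cox log-hazard coefficient to the
# per-SD hazard ratio quoted in the abstract ("per SD, 4.9 kg; 17%").
# The per-kg coefficient is back-calculated, not reported in the study.

def hr_per_sd(beta_per_unit: float, sd: float) -> float:
    """Per-SD hazard ratio: HR = exp(beta * SD)."""
    return math.exp(beta_per_unit * sd)

sd_grip_kg = 4.9                           # 1 SD of grip strength (kg)
beta_per_kg = math.log(1.17) / sd_grip_kg  # implied per-kg coefficient
print(round(hr_per_sd(beta_per_kg, sd_grip_kg), 2))  # 1.17, i.e. +17% per SD
```

The same identity explains why a per-SD hazard ratio exceeds the per-unit one whenever the SD is greater than 1 unit.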

2.

Objectives

The use of psychotropic drugs in long-term care (LTC) is very common, despite their known adverse effects. The prevalence of opioid use is growing among older adults. This study aimed to investigate trends in the prevalence of psychotropics, opioids, and sedative load in a LTC setting over a 14-year period. We also explored the interaction of psychotropic and opioid use according to residents’ dementia status in nursing home (NH) and assisted living facility (ALF) settings.

Design

Four cross-sectional studies.

Setting

Institutional settings in Helsinki, Finland.

Participants

Older residents in NHs in 2003 (n = 1987), 2011 (n = 1576), and 2017 (n = 791) and in ALFs in 2007 (n = 1377), 2011 (n = 1586), and 2017 (n = 1624).

Measures

Comparable assessments were conducted among LTC residents at 4 time points over 14 years. The prevalence of regular psychotropics, opioids, and other sedatives and data on demographics and diagnoses were collected from medical records.

Results

Disabilities and severity of dementia increased in both settings over time. The prevalence of all psychotropics decreased significantly in NHs (from 81% in 2003 to 61% in 2017), whereas in ALFs there was no similar linear trend (65% in 2007 and 64% in 2017). There was a significant increase in the prevalence of opioids in both settings (30% in NHs and 22% in ALFs in 2017). Residents with dementia used fewer psychotropics and opioids than those without dementia in both settings and at each time point.

Conclusions/Implications

NHs show a favorable trend in psychotropic drug use, but the rates of psychotropic use remain high in both NHs and ALFs. In addition, the rates of opioid use have almost tripled, leading to a high sedative load among LTC residents. Clinicians should carefully consider the risk-to-benefit ratio when prescribing in LTC.

3.

Objectives

This study aimed to investigate the additive effects of sarcopenia and low serum albumin level on the risk of incident disability in older adults.

Design

Prospective cohort study.

Setting

A Japanese community.

Participants

Community-dwelling older adults aged ≥65 years, without disability at baseline (N = 4452).

Measures

Sarcopenia was defined as the presence of both poor muscle function (low physical performance or muscle strength) and low muscle mass. Low serum albumin level was defined as ≤4.0 g/dL. Other potential confounding factors (demographics, medical history, depressive symptoms, and cognitive function) were also assessed. Incident disability was monitored based on Long-Term Care Insurance certification during follow-up.

Results

The median follow-up duration was 30 (interquartile range, 28-32) months. Participants were classified into mutually exclusive groups based on sarcopenia status and serum albumin levels: nonsarcopenia/normal serum albumin (n = 3719), low serum albumin alone (n = 552), sarcopenia alone (n = 132), and sarcopenia/low serum albumin (n = 49). Cox proportional hazards regression showed that the low serum albumin alone [hazard ratio (HR) = 1.71, 95% confidence interval (CI) = 1.26-2.33], sarcopenia alone (HR = 2.74, 95% CI = 1.58-4.77), and sarcopenia/low serum albumin groups (HR = 3.73, 95% CI = 1.87-7.44) had a higher risk of disability than the nonsarcopenia/normal serum albumin group after adjusting for the covariates.

Conclusions/Implications

Sarcopenia and low serum albumin level synergistically increase the risk of incident disability in older adults. Sarcopenia in older adults at risk of malnutrition should be detected early, and appropriate interventions should be implemented.

4.

Objectives

To determine whether environmental rearrangements of the long-term care nursing home can affect disruptive behavioral and psychological symptoms of dementia (BPSD) in residents with dementia.

Design

Prospective 6-month study.

Setting

The study was conducted before (phase 1) and after (phase 2) environmental rearrangements [skylike ceiling tiles in part of the shared premises, progressive decrease of the illuminance at night together with soothing streaming music, reinforcement of the illuminance during the day, walls painted in light beige, oversized clocks in corridors, and night team clothes color (dark blue) different from that of the day team (sky blue)].

Participants

All of the patients (n = 19) of the protected unit were included in the study. They were aged 65 years or older and had an estimated life expectancy above 3 months.

Measures

Number and duration of disruptive BPSD were systematically collected and analyzed over 24 hours or during late hours (6:00-12:00 pm) during each 3-month period.

Results

There was no significant change in the patients' dependency, risk of falls, cognitive or depression indexes, or treatment between phases 1 and 2. Agitation/aggression and screaming were observed mainly outside the late hours, whereas wandering episodes were noticed essentially within the late hours. The number of patients showing wandering over 24 hours was significantly lower during phase 2. The numbers of agitation/physical aggression, wandering, and screaming episodes and the mean duration of wandering episodes were significantly decreased over 24 hours following environmental rearrangements (P = .039, .002, .025, and .026, respectively). Similarly, a significant reduction in the number and mean duration of wandering episodes was noticed during the late hours (P = .031 and .007, respectively).

Conclusions

Our study demonstrates that the prevalence of BPSD can be reduced by simple environmental rearrangements aimed at improving spatial and temporal orientation.

5.

Objectives

To investigate the association between late-life blood pressure and the incidence of cognitive impairment in older adults.

Design

Prospective cohort study.

Setting

Community-living older adults from 22 provinces in China.

Participants

We included 12,281 cognitively normal [Mini-Mental State Examination (MMSE) ≥24] older adults (median age: 81 years) from the Chinese Longitudinal Healthy Longevity Survey. Eligible participants had baseline blood pressure data and at least 1 follow-up cognitive assessment.

Measurements

Baseline systolic (SBP) and diastolic blood pressure (DBP) were measured by trained internists. Cognitive function was evaluated by MMSE. We considered mild/moderate/severe cognitive impairment (MMSE <24, and MMSE decline ≥3) as the primary outcome.

Results

Participants with hypertension had a significantly higher risk of mild/moderate/severe cognitive impairment (hazard ratio [HR] 1.17, 95% confidence interval [CI] 1.10-1.24). Overall, the associations with cognitive impairment appeared hockey stick–shaped for SBP and linear for DBP, though the estimated effects for low SBP/DBP were less precise. High SBP was associated with a gradual increase in the risk of mild/moderate/severe cognitive impairment (P trend < .001). Compared with SBP 120 to 129 mmHg, the adjusted HR was 1.17 (95% CI 1.07-1.29) for SBP 130 to 139 mmHg, increasing to 1.54 (95% CI 1.35-1.75) for SBP ≥180 mmHg. Analyses for high DBP showed the same increasing pattern, with an adjusted HR of 1.09 (95% CI 1.01-1.18) for DBP 90 to 99 mmHg and 1.19 (95% CI 1.02-1.38) for DBP ≥110 mmHg, compared with DBP 70 to 79 mmHg.

Conclusion

Late-life high blood pressure was independently associated with cognitive impairment in cognitively normal Chinese older adults. Prevention and management of high blood pressure may have substantial benefits for cognition among older adults in view of the high prevalence of hypertension in this rapidly growing population.

6.

Background and objective

There is little epidemiologic evidence on the combined effect of dynapenia and low 25-hydroxyvitamin D [25(OH)D] on incident disability. Our aim was to investigate whether the combination of dynapenia and low 25(OH)D serum levels increases the risk of incident disability in activities of daily living (ADL).

Design

Prospective cohort study.

Settings

English Longitudinal Study of Ageing.

Participants

A total of 4630 community-dwelling adults aged 50 years and older without ADL disability at baseline.

Measurements

The baseline sample was categorized into 4 groups [ie, nondynapenic/normal 25(OH)D, low 25(OH)D only, dynapenic only, and dynapenic/low 25(OH)D] according to handgrip strength (<26 kg for men and <16 kg for women) and 25(OH)D level (≤50 nmol/L). The outcome was the presence of any ADL disability 2 years after baseline according to the modified Katz Index. Incidence rate ratios (IRRs) adjusted for sociodemographic, behavioral, and clinical characteristics were estimated using Poisson regression.

Results

The fully adjusted model showed that older adults with dynapenia only, and those with low 25(OH)D serum levels combined with dynapenia, had a higher risk of incident ADL disability than those who were nondynapenic with normal 25(OH)D serum levels. The IRR for low 25(OH)D combined with dynapenia was higher than that for dynapenia only; however, the confidence intervals (CIs) showed a similar effect for these 2 groups. The IRRs were 1.31 for low 25(OH)D only (95% CI 0.99–1.74), 1.77 for dynapenia only (95% CI 1.08–2.88), and 1.94 for combined dynapenia and low 25(OH)D (95% CI 1.28–2.94).

Conclusions

Dynapenia only and dynapenia combined with low 25(OH)D serum levels were important risk factors for ADL disability in middle-aged individuals and older adults over 2 years of follow-up.

7.

Objectives

To examine the potential added value of a simple 5-item questionnaire for sarcopenia screening (SARC-F) to the Fracture Risk Assessment Tool (FRAX) for hip fracture risk prediction, in order to identify at-risk older adults for screening with dual-energy x-ray absorptiometry (DXA).

Design

A prospective cohort study.

Setting and participants

Two thousand Chinese men and 2000 Chinese women aged 65 years or older were recruited from local communities and were prospectively followed up for about 10 years.

Measures

Areal bone mineral density (BMD) of hip and lumbar spine were measured by DXA at baseline. Ten-year FRAX probability of hip fracture was calculated using the baseline risk factors. Information from the baseline questionnaire was extracted to calculate a modified SARC-F score. The independent predictive values of SARC-F and FRAX questionnaire were evaluated using multivariate survival analysis. The added predictive values of SARC-F to FRAX for pre-DXA screening were examined.

Results

During the follow-up, 63 (3.2%) men and 69 (3.5%) women had at least 1 incident hip fracture. SARC-F had predictive value for hip fracture risk independent of FRAX, with adjusted hazard ratios [95% confidence interval (CI)] of 1.24 (1.02, 1.52) in men and 1.15 (0.99, 1.13) in women. Compared with FRAX alone, using SARC-F in conjunction with FRAX increased the sensitivity of prediction from 58.7% to 76.2% in men and from 69.6% to 78.3% in women, with no decrease in the area under the receiver operating characteristic curve (0.67). Prescreening with FRAX in conjunction with SARC-F could save more than half of the DXA assessments compared with no prescreening.

Conclusions/Implications

SARC-F is associated with a modest increase in hip fracture risk, especially in men. Conjoint evaluation for sarcopenia in addition to FRAX screening may help identify older adults at higher risk of hip fracture for more intensive screening and/or preventive interventions.
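The sensitivity figures in this abstract follow directly from case counts: sensitivity is the fraction of incident fractures flagged by the screen. A minimal sketch, where the true-positive counts (37 and 48 of the 63 incident fractures in men) are back-calculated from the reported percentages and are illustrative only:

```python
# Hedged sketch: screening sensitivity as the fraction of incident cases
# flagged. The counts 37 and 48 are back-calculated from the reported
# 58.7% and 76.2% among the 63 male incident fractures; they are
# illustrative, not taken from the study's tables.

def sensitivity(true_positives: int, total_cases: int) -> float:
    """Proportion of incident hip fracture cases flagged by the screen."""
    return true_positives / total_cases

men_fractures = 63
frax_alone = sensitivity(37, men_fractures)       # FRAX alone
frax_plus_sarcf = sensitivity(48, men_fractures)  # FRAX + SARC-F
print(f"{frax_alone:.1%} -> {frax_plus_sarcf:.1%}")  # 58.7% -> 76.2%
```

Note that adding a second screen can only raise sensitivity (more cases flagged) at the possible cost of specificity, which is why the unchanged area under the curve matters here.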

8.

Objectives

The predictive value of frailty and comorbidity, in addition to more readily available information, is not widely studied. We determined the incremental predictive value of frailty and comorbidity for mortality and institutionalization across both short and long prediction periods in persons with dementia.

Design

Longitudinal clinical cohort study with a follow-up of institutionalization and mortality occurrence across 7 years after baseline.

Setting and Participants

331 newly diagnosed dementia patients, originating from 3 Alzheimer centers (Amsterdam, Maastricht, and Nijmegen) in the Netherlands, contributed to the Clinical Course of Cognition and Comorbidity (4C) Study.

Measures

We measured comorbidity burden using the Cumulative Illness Rating Scale for Geriatrics (CIRS-G) and constructed a Frailty Index (FI) based on 35 items. Time-to-death and time-to-institutionalization from dementia diagnosis onward were verified through linkage to the Dutch population registry.

Results

After 7 years, 131 patients were institutionalized and 160 patients had died. Compared with a previously developed prediction model for survival in dementia, our Cox regression model showed a significant improvement in model concordance (U) after the addition of baseline CIRS-G or FI when examining mortality across 3 years (FI: U = 0.178, P = .005, CIRS-G: U = 0.180, P = .012), but not for mortality across 6 years (FI: U = 0.068, P = .176, CIRS-G: U = 0.084, P = .119). In a competing risk regression model for time-to-institutionalization, baseline CIRS-G and FI did not improve the prediction across any of the periods.

Conclusions

Characteristics such as frailty and comorbidity change over time and therefore their predictive value is likely maximized in the short term. These results call for a shift in our approach to prognostic modeling for chronic diseases, focusing on yearly predictions rather than a single prediction across multiple years. Our findings underline the importance of considering possible fluctuations in predictors over time by performing regular longitudinal assessments in future studies as well as in clinical practice.

9.

Objectives

To evaluate the effects of repeated cerebrospinal fluid (CSF) tap procedures in idiopathic normal pressure hydrocephalus (iNPH) patients ineligible for surgical treatment.

Design

Prospective, monocentric, pilot study.

Setting

University hospital.

Participants

Thirty-nine patients aged 75 years and older, ineligible for shunting surgical intervention.

Intervention

Repeated CSF taps.

Measurements

All patients underwent a comprehensive geriatric assessment before and after each CSF tap. Adverse events were recorded.

Results

No major side effects were reported. Eleven patients showed no response to the first CSF tap test and were excluded. In the remaining 28 patients, all physical and cognitive functions improved after the drainage procedures, except for continence (which seemed poorly influenced). According to clinical judgment, the mean time frame of benefit between CSF taps was 7 months. Patients withdrawing from the protocol during clinical follow-up showed a worsening of functional and cognitive performance after the interruption.

Conclusions/Implications

Periodic therapeutic CSF taps are safe, allow better control of iNPH symptoms, and prevent functional decline in geriatric patients.

10.

Objective

The objective of this study was to examine the incidence of new onset depressive symptoms and associated factors over a 1-year period in an older Chinese suburban population.

Design

Prospective cohort study.

Setting and Participants

The sample comprised 691 Chinese community-dwelling participants (304 men; mean age 67.5 ± 5.7 years) without depressive symptoms at baseline, recruited from Chadian of Tianjin, China.

Measures

We documented detailed information regarding sociodemographics, behavioral characteristics, and medical conditions. Sarcopenia was defined according to the Asian Working Group for Sarcopenia (AWGS) criteria. The outcome was new onset depressive symptoms at 1-year follow-up, defined as a score of ≥11 on the 30-item Geriatric Depression Scale.

Results

We found that 83 (12.0%) of the 691 participants without depressive symptoms at baseline had developed depressive symptoms. After multivariate adjustments, it was found that the incidence of new onset depressive symptoms was associated with sarcopenia, type 2 diabetes mellitus, and cardiovascular disease. People with a higher level of muscle mass and better sleep quality were significantly less likely to develop depressive symptoms than their counterparts.

Conclusions/Implications

We found that the incidence of depressive symptoms increased with some chronic diseases, such as sarcopenia, type 2 diabetes mellitus, and cardiovascular disease. In addition, muscle mass was the most relevant protective factor among sarcopenia's 3 diagnostic components—muscle mass, muscle strength, and physical performance. Hence, maintaining sufficient muscle mass could be beneficial in preventing depressive symptoms in older adults.

11.

Objectives

The objective was to test the hypothesis that antihypertensive drugs have a differential effect on cognition in carriers and noncarriers of the apolipoprotein E ε4 (APOE4) polymorphism.

Design

Prospective population-based cohort, France.

Setting and participants

A total of 3359 persons using antihypertensive drugs (median age 74 years, 62% women) were serially assessed up to 10 years follow-up.

Measures

Exposure to antihypertensive drug use was established in the first 2 years. Cognitive function was assessed at baseline, 2, 4, 7, and 10 years with a validated test battery covering global cognition, verbal fluency, immediate visual recognition memory, processing speed, and executive function. Clinically significant change in cognitive function was determined using reliable change indices represented as z scores and analyzed with linear mixed-models.

Results

From 3359 persons exposed to antihypertensive drugs, 653 were APOE4 carriers (5.1% homozygous, 94.9% heterozygous) and median follow-up was 5.2 years (interquartile range 3.7–8.0). In APOE4 carriers, improved general cognitive function over time was associated with exposure to angiotensin converting enzyme inhibitors [β = .14; 95% confidence interval (CI) .06–.23, P = .001] and angiotensin receptor blockers (β = .11; 95% CI .02–.21, P = .019). Improved verbal fluency was associated with angiotensin converting enzyme inhibitors (β = .11; 95% CI .03–.20, P = .012).

Conclusions

Renin-angiotensin-system blockade was associated with improved general cognitive function in APOE4 carriers. Findings did not support renin-angiotensin-system drugs' lipophilicity or ability to cross the blood-brain barrier as potential mechanisms. The findings have implications for selecting the optimal antihypertensive drug in older populations at risk of cognitive decline and dementia.

12.

Objective(s)

To examine the change in physical functional status among persons living with HIV (PLWH) in nursing homes (NHs) and how change varies with age and dementia.

Design

Retrospective cohort study.

Setting

NHs in 14 states in the United States.

Participants

PLWH who were admitted to NHs between 2001 and 2010 and had stays of ≥90 days (N = 3550).

Measurements

We linked Medicaid Analytic eXtract (MAX) and Minimum Data Set (MDS) data for NH residents in the sampled states and years and used them to determine HIV infection. The main outcome was improvement in physical functional status, defined as a decrease of at least 4 points in the activities of daily living (ADL) score within 90 days of NH admission. Independent variables of interest were age and dementia (Alzheimer's disease or other dementia). Multivariate logistic regression was used, adjusting for individual-level covariates.

Results

The average age of PLWH on NH admission was 58 years. Dementia prevalence ranged from 14.5% in the youngest age group (age <40 years) to 38.9% in the oldest group (age ≥70 years). Overall, 44% of PLWH experienced ADL improvement in NHs. Controlling for covariates, dementia was related to a significantly lower likelihood of ADL improvement among PLWH in the oldest age group only: the adjusted probability of improvement was 40.6% among those without dementia and 29.3% among those with dementia (P < .01).

Conclusions/relevance

PLWH, especially younger persons, may be able to improve their ADL function after being admitted into NHs. However, with older age, PLWH with dementia are more physically dependent and vulnerable to deterioration of physical functioning in NHs. More and/or specialized care may be needed to maintain physical functioning among this population. Findings from this study provide NHs with information on care needs of PLWH and inform future research on developing interventions to improve care for PLWH in NHs.

13.

Objective

Discharge to skilled nursing facilities (SNFs) is common in patients with heart failure (HF). It is unknown whether the transition from SNF to home is risky for these patients. Our objective was to study outcomes for the 30 days after discharge from SNF to home among Medicare patients hospitalized with HF who had subsequent SNF stays of 30 days or less.

Design

Retrospective cohort study.

Setting and participants

All Medicare fee-for-service beneficiaries aged 65 and older admitted during 2012-2015 with an HF diagnosis who were discharged to an SNF and subsequently discharged home.

Measures

Patients were followed for 30 days following SNF discharge. We categorized patients by SNF length of stay: 1 to 6 days, 7 to 13 days, and 14 to 30 days. For each group, we modeled time to a composite outcome of unplanned readmission or death after SNF discharge. Our model examined 0-2 days and 3-30 days post-SNF discharge.

Results

Our study included 67,585 HF hospitalizations discharged to SNF and subsequently discharged home. Overall, 16,333 (24.2%) SNF discharges to home were readmitted within 30 days of SNF discharge. The hazard rate of the composite outcome for each group was significantly increased on days 0 to 2 after SNF discharge compared to days 3 to 30, as reflected in their hazard rate ratios: for patients with SNF length of stay 1 to 6 days, 4.60 (4.23-5.00); SNF length of stay 7 to 13 days, 2.61 (2.45-2.78); SNF length of stay 14 to 30 days, 1.70 (1.62-1.78).

Conclusions/implications

The hazard rate of readmission after SNF discharge following HF hospitalization is highest during the first 2 days home. This risk attenuated with longer SNF length of stay. Interventions to improve postdischarge outcomes have primarily focused on hospital discharge. This evidence suggests that interventions to reduce readmissions may be more effective if they also incorporate the SNF-to-home transition.

14.

Objectives

To define the prevalence of dysphagia and its associated factors and to investigate the influence of dysphagia and nutritional therapies performed in dysphagic subjects on clinical outcomes, including nutritional status, pressure ulcers, hospitalization, and mortality.

Design

A prospective observational study.

Setting and participants

Thirty-one Italian nursing homes participating in the ULISSE project and 1490 long-stay nursing home residents, older than 65 years, assessed at baseline and reassessed after 6 and 12 months.

Measures

All participants underwent a standardized comprehensive assessment using the Italian version of the nursing home Minimum Data Set. The activities of daily living Long-Form scale was used to evaluate functional status. Health care professionals assessed dysphagia by means of clinical evaluation. Nutritional status was assessed using the information on weight loss.

Results

The prevalence of dysphagia was 12.8%, and 16% of subjects were treated with artificial nutrition. The mortality rate in subjects with dysphagia was significantly higher than that of nondysphagic subjects (27.7% vs 16.8%; P = .0001). The prevalence of weight loss and pressure ulcers was also higher in dysphagic subjects. In contrast, dysphagia was not associated with a higher hospitalization risk.

Conclusion/Implications

Dysphagia is common in nursing home residents, and it is associated with higher mortality. Therefore, early diagnosis and optimal management of dysphagia should become a priority issue in nursing homes.

15.

Objectives

We aimed to quantify the increased risk of disability associated with cardiovascular risk factors among older adults, and to verify whether this risk may vary by age and functional status.

Design

Longitudinal population-based cohort study.

Setting

Urban area of Stockholm, Sweden.

Participants

Community-dwelling and institutionalized adults ≥60 years in the Swedish National study on Aging and Care in Kungsholmen free of cardiovascular diseases and disability (n = 1756) at baseline (2001-2004).

Measures

Incident disability in activities of daily living (ADL) was ascertained over 9 years. Cardiovascular risk factors (physical inactivity, alcohol consumption, smoking, high blood pressure, diabetes, high body mass index, high levels of total cholesterol, and high C-reactive protein) and walking speed were assessed at baseline. Data were analyzed using Cox proportional hazards models, stratifying by younger-old (age 60-72 years) and older-old (≥78 years).

Results

During the follow-up, 23 and 148 persons developed ADL-disability among the younger- and older-old, respectively. In the younger-old, the adjusted hazard ratio (HR) of developing ADL-disability was 4.10 (95% confidence interval [CI] 1.22-13.76) for physical inactivity and 5.61 (95% CI 1.17-26.82) for diabetes. In the older-old, physical inactivity was associated with incident ADL-disability (HR 1.99, 95% CI 1.36-2.93), and there was a significant interaction between physical inactivity and walking speed limitation (<0.8 m/s): those who were both physically inactive and had a walking speed limitation had a 6-fold higher risk of ADL-disability than those who were active with no limitation, accounting for a population-attributable risk of 42.7%.

Conclusions/Implications

Interventions targeting cardiovascular risk factors may be more important for the younger-old in decreasing the risk of disability, whereas improving physical function and maintaining physical activity may be more beneficial for the older-old.
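Population-attributable risk figures like the 42.7% reported in this abstract are typically computed with Levin's formula, PAR = p(RR - 1) / [1 + p(RR - 1)], where p is the exposure prevalence and RR the relative risk. A minimal sketch with hypothetical inputs, since the abstract does not report the prevalence used:

```python
# Hedged sketch of Levin's population-attributable risk formula.
# The prevalence (0.15) and relative risk (6.0) below are hypothetical
# illustrations, not values reported in the study.

def attributable_risk(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: PAR = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

par = attributable_risk(0.15, 6.0)
print(f"{par:.1%}")  # 42.9%: fraction of cases attributable to the exposure
```

The formula shows why a modest exposure prevalence combined with a large relative risk can account for a substantial share of incident disability in the population.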

16.

Objectives

Protein and energy malnutrition and unintended weight loss are frequently reported in patients with mild cognitive impairment (MCI) and Alzheimer's disease (AD). Possible underlying mechanisms include increased energy expenditure, altered uptake of nutrients, reduced nutritional intake, or a combination of these 3. We aimed to systematically review the literature to examine potential differences in energy and protein intake in patients with MCI and AD compared with controls as a possible mechanism for unintended weight loss.

Design

Systematic review and meta-analysis.

Setting

PubMed and Cochrane electronic databases were searched from inception to September 2017 for case-control studies.

Participants

Patients with MCI or AD compared to cognitive healthy controls, all adhering to a Western dietary pattern.

Measurements

Energy and protein intake.

Results

The search resulted in 7 articles on patients with AD versus controls, and none on patients with MCI. Four articles found no differences in energy and protein intakes, 1 found higher intakes in patients with AD, and 1 found lower intakes in patients with AD compared with controls. One article reported on intakes but did not test differences. A meta-analysis of the results indicated no difference between patients with AD and controls in energy [−8 kcal/d, 95% confidence interval (CI): −97, 81; P = .85] or protein intake (2 g/d, 95% CI: −4, 9; P = .47). However, heterogeneity was high (I² > 70%), and study methodology was generally poor or moderate.

Conclusion

Contrary to frequently reported unintended weight loss, our systematic review does not provide evidence for a lower energy or protein intake in patients with AD compared to controls. High heterogeneity of the results as well as of participant characteristics, setting, and study methods was observed. High-quality studies are needed to study energy and protein intake as a possible mechanism for unintended weight loss and malnutrition in both patients with MCI and AD.
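The pooled mean differences in this abstract come from inverse-variance weighting, in which each study is weighted by 1/SE². A minimal sketch of that standard fixed-effect calculation; the three study estimates and standard errors below are hypothetical illustrations, not the reviewed studies:

```python
# Hedged sketch: fixed-effect inverse-variance pooling of mean differences
# (e.g., energy intake in kcal/d). The estimates and standard errors are
# hypothetical illustrations, not data from the reviewed studies.

def pool_fixed_effect(estimates, standard_errors):
    """Inverse-variance weighted mean difference and its pooled SE."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    pooled = sum(w * x for w, x in zip(weights, estimates)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

md, se = pool_fixed_effect([-50.0, 20.0, 10.0], [40.0, 35.0, 50.0])
ci_low, ci_high = md - 1.96 * se, md + 1.96 * se  # 95% confidence interval
```

With the high heterogeneity reported (I² > 70%), a random-effects model that widens these weights would usually be preferred; the fixed-effect version is shown only because it is the simpler building block.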

17.

Objective

To determine the influence of the Kuchi-kara Taberu (KT) index on rehabilitation outcomes during hospitalized convalescent rehabilitation.

Design

A historically controlled study.

Setting and Participants

A rehabilitation hospital.

Participants

Patients who were admitted to a convalescent rehabilitation ward from June 2014 to May 2017.

Measures

Patients’ background characteristics included age, sex, nutritional status, activities of daily living (ADL) assessed using the Functional Independence Measure (FIM), dysphagia assessed using the Functional Oral Intake Scale (FOIS), and reason for rehabilitation. The following values before (control group) and after initiation of the KT index intervention (intervention group) were compared: FIM gain, length of stay, accumulated rehabilitation time, discharge destination, FOIS gain, body weight (BW) gain, and nutritional intake (energy and protein).

Results

Mean age was 76.4 ± 12.3 years (n = 233). There were no significant differences in the baseline characteristics of patients at admission between the control and intervention groups, except for reason for rehabilitation. The intervention group demonstrated significantly higher values for total (P = .004) and motor FIM gain (P = .003), total (P = .018) and motor FIM efficiency (P = .016), and FOIS gain (P < .001) compared with the control group. The proportion of patients returning home was significantly higher in the intervention group than in the control group (73.4% vs 85.5%; odds ratio 2.135, 95% confidence interval [CI] 1.108-4.113, P = .022). Multivariate analyses indicated that intervention using the KT index was a significant independent factor for increased FIM gain (β coefficient = 0.163, 95% CI 1.379-8.329, P = .006) and returning home (adjusted odds ratio 2.570, 95% CI 1.154-5.724, P = .021).

Conclusions/Implications

A rehabilitation program using the KT index may lead to improvement of inpatient outcomes in post-acute care. Further prospective research is warranted to confirm the efficacy of this program.

18.

Objective

To assess the influence of frailty on cognitive decline.

Design

Population-based prospective cohort study.

Settings/participants

Community-dwelling older adults living in a rural Ecuadorian village who fulfilled the following criteria: age ≥60 years, a baseline Montreal Cognitive Assessment (MoCA) and frailty assessment, a baseline brain magnetic resonance imaging, and a follow-up MoCA performed at least 12 months after baseline.

Measures

Frailty was evaluated with the Edmonton Frail Scale (EFS) and cognitive performance with the MoCA. The relationship between baseline EFS score and MoCA decline was assessed by longitudinal linear and fractional polynomial models, adjusted for relevant confounders. The score of the cognitive component of the EFS was then subtracted, and an alternative fractional polynomial model was fitted to assess the impact of that cognitive item on the model.

Results

A total of 252 individuals, contributing 923.7 person-years of follow-up (mean: 3.7 ± 0.7 years), were included. The mean EFS score was 4.7 ± 2.5 points. The mean baseline MoCA score was 19.5 ± 4.5 points, and that of the follow-up MoCA was 18.1 ± 4.9 points (P = .001). Overall, 154 (61%) individuals had lower MoCA scores at follow-up. The best-fitted longitudinal linear model showed an association between baseline EFS score and MoCA decline (P = .027). MoCA decline increased continuously in persons with an EFS score ≥7 points (a nonlinear relationship); fractional polynomial models captured this effect of the EFS on MoCA decline. For the complete EFS score, the β coefficient was 2.43 (95% confidence interval 1.22-3.63). The effect of the EFS without its cognitive component on MoCA decline remained significant (β 4.86; 95% confidence interval 2.60-7.13).

Conclusions/implications

Over a 3.7-year period, 61% of older adults living in Atahualpa experienced cognitive decline. Such decline was significantly associated with frailty status at baseline. Region-specific risk factors influencing this relationship should be further studied to reduce its burden in rural settings.

19.

Objectives

To establish the prevalence and course of geriatric syndromes from hospital admission up to 3 months postdischarge, and to determine the probability that geriatric syndromes present at admission are retained from discharge until 3 months postdischarge.

Design

Prospective multicenter cohort study conducted between October 2015 and June 2017.

Setting and participants

Acutely hospitalized patients aged 70 years and older recruited from internal, cardiology, and geriatric wards of 6 Dutch hospitals.

Measures

Cognitive impairment, depressive symptoms, apathy, pain, malnutrition, incontinence, dizziness, fatigue, mobility impairment, functional impairment, fall risk, and fear of falling were assessed at admission, discharge, and 1, 2, and 3 months postdischarge. Generalized estimating equation analyses were performed to analyze the course of syndromes and to determine the probability of retaining syndromes.

Results

A total of 401 participants [mean age (standard deviation) 79.7 (6.7) years] were included. At admission, a median of 5 geriatric syndromes were present; the most prevalent were fatigue (77.2%), functional impairment (62.3%), apathy (57.5%), mobility impairment (54.6%), and fear of falling (40.6%). At 3 months postdischarge, an average of 3 syndromes were present, of which mobility impairment (52.7%), fatigue (48.1%), and functional impairment (42.5%) were most prevalent. Tracking analysis showed that geriatric syndromes present at admission were likely to be retained. The following 6 geriatric syndromes were most likely to remain present postdischarge: mobility impairment, incontinence, cognitive impairment, depressive symptoms, functional impairment, and fear of falling.

Implications

Acutely hospitalized older adults exhibit a broad spectrum of highly prevalent geriatric syndromes. Moreover, syndromes present at admission are likely to persist postdischarge. Our study underscores the need to address a wide range of syndromes at admission, the importance of communicating syndromes to the next care provider, and the need for adequate follow-up care and syndrome management postdischarge.

20.

Background

Long-term care (LTC) homes expressed concern that patients had experienced medication incidents after hospital discharge as a result of poor coordination of care.

Objective

The London Transfer Project aimed to reduce LTC medication incidents by 50% within 48 hours of discharge from general medicine units at the London Health Sciences Centre.

Design

This quality improvement study involved 2 hospitals and 5 LTC homes in London, Ontario, Canada. The baseline prevalence of medication incidents was measured and explored for root causes. Two change ideas were tested on general medicine units to improve transfer communication: (1) expediting medication reconciliation and (2) faxing medication plans before discharge.

Measures

Evaluation involved time-series measurement and a comparison of baseline and intervention periods. The primary outcome was medication incidents by omission or commission within 48 hours of discharge, which was determined by dual chart reviews in hospital and LTC homes. Process measures included medication reconciliation and fax completion times. Hospital discharge times were included as a balance measure of the new communication process.

Results

Four hundred seventy-seven LTC transfers were reviewed between 2016 and 2017; 92 transfers were reviewed for medication incidents in participating homes at baseline (January-April 2016) and during implementation (January-April 2017). Medication incidents decreased significantly by 56%, from 44% (22/50) at baseline to 19% (8/42) during implementation (P = .006). Medication reconciliation completion by noon increased from 56% (28/50) to 74% (31/42), but not significantly (P = .076). Faxes sent before discharge increased significantly from 4% (2/50) to 67% (28/42) (P = .015). There was no significant change in hospital discharge time.

Conclusions/Implications

Medication incidents can be significantly reduced during care transitions by taking a systems perspective to explore quality gaps and redesign communication processes. This solution will be scaled to other inpatient services with a high proportion of LTC residents.
