Similar Articles
20 similar articles retrieved.
1.

Objectives

Patients discharged to a skilled nursing facility (SNF) for post-acute care have a high risk of hospital readmission. We aimed to develop and validate a risk-prediction model to prospectively quantify the risk of 30-day hospital readmission at the time of discharge to a SNF.

Design

Retrospective cohort study.

Setting

Ten independent SNFs affiliated with the post-acute care practice of an integrated health care delivery system.

Participants

We evaluated 6032 patients who were discharged to SNFs for post-acute care after hospitalization.

Measurements

The primary outcome was all-cause 30-day hospital readmission. Patient demographics, medical comorbidity, prior health care use, and clinical parameters during the index hospitalization were analyzed using gradient boosting machine multivariable analysis to build a predictive model for 30-day hospital readmission. The area under the receiver operating characteristic curve (AUC) was assessed on out-of-sample observations under 10-fold cross-validation.

Results

Among 8616 discharges to SNFs from January 1, 2009, through June 30, 2014, a total of 1568 (18.2%) were readmitted to the hospital within 30 days. The 30-day hospital readmission prediction model had an AUC of 0.69, a 16% improvement over risk assessment using the Charlson Comorbidity Index alone. The final model included length of stay, abnormal laboratory parameters, and need for intensive care during the index hospitalization; comorbid status; and number of emergency department and hospital visits within the preceding 6 months.

Conclusions and implications

We developed and validated a risk-prediction model for 30-day hospital readmission in patients discharged to a SNF for post-acute care. This prediction tool can be used to risk stratify the complex population of hospitalized patients who are discharged to SNFs to prioritize interventions and potentially improve the quality, safety, and cost-effectiveness of care.

2.

Objective

To determine the influence of the Kuchi-kara Taberu (KT) index on rehabilitation outcomes during hospitalized convalescent rehabilitation.

Design

A historical controlled study.

Setting and Participants

A rehabilitation hospital.

Participants

Patients who were admitted to a convalescent rehabilitation ward from June 2014 to May 2017.

Measures

Patients’ background characteristics included age, sex, nutritional status, activities of daily living (ADL) assessed using the Functional Independence Measure (FIM), dysphagia assessed using the Functional Oral Intake Scale (FOIS), and reasons for rehabilitation. The following values before (control group) and after initiation of the KT index intervention period (intervention group) were compared: gain of FIM, length of stay, accumulated rehabilitation time, discharge destination, gain of FOIS, gain of body weight (BW), and nutritional intake (energy and protein).

Results

Mean age was 76.4 ± 12.3 years (n = 233). There were no significant differences in the baseline characteristics of the patients at admission between the control and intervention groups, except for reason for rehabilitation. The intervention group demonstrated significantly higher total (P = .004) and motor FIM gain (P = .003), total (P = .018) and motor FIM efficiency (P = .016), and FOIS gain (P < .001) compared with the control group. The proportion of patients returning home was significantly higher in the intervention group than in the control group (85.5% vs 73.4%, odds ratio 2.135, 95% confidence interval [CI] 1.108-4.113, P = .022). Multivariate analyses indicated that intervention using the KT index was a significant independent factor for increased FIM gain (β coefficient = 0.163, 95% CI 1.379-8.329, P = .006) and returning home (adjusted odds ratio 2.570, 95% CI 1.154-5.724, P = .021).

Conclusions/Implications

A rehabilitation program using the KT index may lead to improvement of inpatient outcomes in post-acute care. Further prospective research is warranted to confirm the efficacy of this program.

3.

Objectives

To determine whether environmental rearrangements of the long-term care nursing home can affect disruptive behavioral and psychological symptoms of dementia (BPSD) in residents with dementia.

Design

Prospective 6-month study.

Setting

The study was conducted before (phase 1) and after (phase 2) environmental rearrangements [skylike ceiling tiles in part of the shared premises, progressive decrease of the illuminance at night together with soothing streaming music, reinforcement of the illuminance during the day, walls painted in light beige, oversized clocks in corridors, and night team clothes color (dark blue) different from that of the day team (sky blue)].

Participants

All patients (n = 19) in the protected unit were included in the study. They were aged 65 years or older and had an estimated life expectancy above 3 months.

Measures

Number and duration of disruptive BPSD were systematically collected and analyzed over 24 hours or during late hours (6:00-12:00 pm) during each 3-month period.

Results

There was no significant change in the patients' dependency, risk of falls, cognitive or depression indexes, or treatment between phases 1 and 2. Agitation/aggression and screaming were observed mainly outside the late hours, whereas wandering episodes were noticed essentially within the late hours. The number of patients showing wandering was significantly lower over 24 hours during phase 2. The numbers of agitation/physical aggression, wandering, and screaming episodes and the mean duration of wandering episodes were significantly decreased over 24 hours following environmental rearrangements (P = .039, .002, .025, and .026, respectively). Similarly, a significant reduction in the number and mean duration of wandering episodes was noticed during the late hours (P = .031 and .007, respectively).

Conclusions

Our study demonstrates that BPSD prevalence can be reduced by simple environmental rearrangements aimed at improving spatial and temporal orientation.

4.

Objectives

To evaluate the effects of repeated cerebrospinal fluid (CSF) tap procedures in idiopathic normal pressure hydrocephalus (iNPH) patients ineligible for surgical treatment.

Design

Prospective, monocentric, pilot study.

Setting

University hospital.

Participants

Thirty-nine patients aged 75 years and older, ineligible for shunting surgical intervention.

Intervention

Repeated CSF taps.

Measurements

All patients underwent a comprehensive geriatric assessment before and after each CSF tap. Adverse events were recorded.

Results

No major side effects were reported. Eleven patients showed no response to the first CSF tap test and were excluded. In the remaining 28 patients, all physical and cognitive functions improved after the drainage procedures, except for continence (which seemed poorly influenced). According to clinical judgment, the mean time frame of benefit between CSF taps was 7 months. Patients withdrawing from the protocol during clinical follow-up showed a worsening of functional and cognitive performance after the interruption.

Conclusions/Implications

Periodic therapeutic CSF taps are safe, allow better control of iNPH symptoms, and prevent functional decline in geriatric patients.

5.

Objective(s)

To examine the change in physical functional status among persons living with HIV (PLWH) in nursing homes (NHs) and how change varies with age and dementia.

Design

Retrospective cohort study.

Setting

NHs in 14 states in the United States.

Participants

PLWH who were admitted to NHs between 2001 and 2010 and had stays of ≥90 days (N = 3550).

Measurements

We linked Medicaid Analytic eXtract (MAX) and Minimum Data Set (MDS) data for NH residents in the sampled states and years and used them to determine HIV infection. The main outcome was improvement in physical functional status, defined as a decrease of at least 4 points in the activities of daily living (ADL) score within 90 days of NH admission. Independent variables of interest were age and dementia (Alzheimer's disease or other dementia). Multivariate logistic regression was used, adjusting for individual-level covariates.

Results

The average age of PLWH on NH admission was 58 years. Dementia prevalence ranged from 14.5% in the youngest age group (age <40 years) to 38.9% in the oldest group (age ≥70 years). Overall, 44% of the PLWH experienced ADL improvement in NHs. Controlling for covariates, dementia was related to a significantly lower likelihood of ADL improvement among PLWH in the oldest age group only: the adjusted probability of improvement was 40.6% among those without dementia and 29.3% among those with dementia (P < .01).

Conclusions/relevance

PLWH, especially younger persons, may be able to improve their ADL function after being admitted into NHs. However, with older age, PLWH with dementia are more physically dependent and vulnerable to deterioration of physical functioning in NHs. More and/or specialized care may be needed to maintain physical functioning among this population. Findings from this study provide NHs with information on care needs of PLWH and inform future research on developing interventions to improve care for PLWH in NHs.

6.

Background

Long-term care (LTC) homes expressed concern that patients had experienced medication incidents after hospital discharge as a result of poor coordination of care.

Objective

The London Transfer Project aimed to reduce LTC medication incidents by 50% within 48 hours of discharge from general medicine units at the London Health Sciences Centre.

Design

This quality improvement study involved 2 hospitals and 5 LTC homes in London, Ontario, Canada. The baseline prevalence of medication incidents was measured and explored for root causes. Two change ideas were tested on general medicine units to improve transfer communication: (1) expediting medication reconciliation and (2) faxing medication plans before discharge.

Measures

Evaluation involved time-series measurement and a comparison of baseline and intervention periods. The primary outcome was medication incidents by omission or commission within 48 hours of discharge, which was determined by dual chart reviews in hospital and LTC homes. Process measures included medication reconciliation and fax completion times. Hospital discharge times were included as a balance measure of the new communication process.

Results

Four hundred seventy-seven LTC transfers were reviewed between 2016 and 2017; 92 transfers were reviewed for medication incidents in participating homes at baseline (January-April 2016) and implementation (January-April 2017). Medication incidents decreased significantly by 56%, from 44% (22/50) at baseline to 19% (8/42) during implementation (P = .006). Medication reconciliation completion by noon increased from 56% (28/50) to 74% (31/42) but not significantly (P = .076). Faxes sent before discharge increased significantly from 4% (2/50) to 67% (28/42, P = .015). There was no significant change in hospital discharge time.

Conclusions/Implications

Medication incidents can be significantly reduced during care transitions by taking a systems perspective to explore quality gaps and redesign communication processes. This solution will be scaled to other inpatient services with a high proportion of LTC residents.

7.

Objective

The aim of this study was to determine the prevalence of low fluid intake in institutionalized older residents and the associated factors.

Design

This was a cross-sectional study.

Setting and Participants

The study was carried out at a nursing home with a capacity for 156 residents, all of whom were older than 65 years.

Measures

Data were collected on the fluids consumed by each resident over a period of 1 week. Information relating to sociodemographic variables and to residents' health, nutrition, and hydration status was also collected.

Results

Of 53 residents, 34% ingested less than 1500 mL/d. The factors most strongly associated with low fluid intake were cognitive and functional impairment, risk of pressure ulcers, undernutrition, a texture-modified diet, dysphagia, impaired swallowing safety, and the BUN:creatinine ratio.

Conclusions/Implications

The results obtained highlight the scale of low fluid intake in nursing homes and help to identify and understand the factors associated with this problem. These findings could support the development of specific strategies to promote the intake of liquids and thereby reduce the incidence of dehydration in nursing homes.

8.

Objectives

Our article's primary objective is to examine whether rehabilitation providers can predict which patients discharged from skilled nursing facility (SNF) rehabilitation will be successful in their transition to home, controlling for sociodemographic factors and physical, mental, and social health characteristics.

Design

Longitudinal cohort study.

Setting and Participants

One hundred twelve English-speaking adults aged 65 years and older admitted to 2 SNF rehabilitation units.

Measures

Our outcome is time to “failed transition to home,” which identified SNF rehabilitation patients who did not successfully transition from the SNF to home during the study. Our primary independent variable consisted of the prediction of medical providers, occupational therapists, physical therapists, and social workers about the likely success of their patients' SNF-to-home transition. We also examined the association of sociodemographic factors and physical, mental, and social health with a failed transition to home.

Results

The predictions of occupational and physical therapists were associated with whether patients successfully transitioned from the SNF to their homes in bivariate [hazard ratio (HR) = 4.96, P = .014; HR = 10.91, P = .002, respectively] and multivariate (HR = 5.07, P = .036; HR = 53.33, P = .004) analyses. The predictions of medical providers and social workers, however, were not associated with our outcome in either bivariate (HR = 1.44, P = .512; HR = 0.84, P = .794, respectively) or multivariate (HR = 0.57, P = .487; HR = 0.54, P = .665) analyses. Living alone, more medical conditions, lower physical functioning scores, and greater depression scores were also associated with time to failed transition to home.

Conclusions/Implications

These findings suggest that occupational and physical therapists may be better able to predict post-SNF discharge outcomes than are other rehabilitation providers. Why occupational and physical therapists' predictions are associated with the SNF-to-home outcome whereas the predictions of medical providers and social workers are not is uncertain. A better understanding of the factors informing the postdischarge predictions of occupational and physical therapists may help identify ways to improve the SNF-to-home discharge planning process.

9.

Objectives

To investigate if the multicomponent intervention of the COSMOS trial, combining communication, systematic pain management, medication review, and activities, improved quality of life (QoL) in nursing home patients with complex needs.

Design

Multicenter, cluster-randomized, single-blinded, controlled trial.

Setting

Thirty-three nursing homes with 67 units (clusters) from 8 Norwegian municipalities.

Participants

Seven hundred twenty-three patients with and without dementia (≥65 years) were cluster randomized to usual care or intervention in which health care staff received standardized education and on-site training for 4 months with follow-up at month 9.

Measurements

The primary outcome was change in QoL from baseline to month 4, as measured by the QUALIDEM (QoL dementia scale), QUALID (QoL late-stage dementia scale), and EQ-VAS (European QoL visual analog scale). Secondary outcomes were activities of daily living (ADL), total medication, staff distress, and clinical global impression of change (CGIC).

Results

During the active intervention, all 3 QoL measures worsened, 2 significantly (QUALID P = .04; QUALIDEM P = .002). However, follow-up analysis from month 4 to 9 showed an intervention effect for EQ-VAS (P = .003) and QUALIDEM total score (P = .01; care relationship P = .02; positive affect P = .04, social relations P = .01). The secondary outcomes of ADL function, reduction of medication (including psychotropics) and staff distress, improved significantly from baseline to month 4. Intervention effects were also demonstrated for CGIC at month 4 (P = .023) and 9 (P = .009), mainly because of deterioration in the control group.

Conclusion and implications

QoL decreased temporarily in the intervention group, leading to our hypothesis that health care staff may be overwhelmed by the work-intensive COSMOS intervention period. However, the decrease reversed significantly during follow-up, indicating a potential learning effect. Further, the intervention group improved in ADL function and received less medication, and staff reported less distress and judged COSMOS able to bring about clinically relevant change. This suggests that nonpharmacologic multicomponent interventions require long follow-up to ensure uptake and beneficial effects.

10.

Objective

Ultrahigh therapy use has increased in SNFs without concomitant changes in residents' clinical characteristics. It has been suggested that this trend may also have influenced the provision of high-intensity rehabilitation therapies to residents who are at the end of life (EOL). Motivated by this lack of evidence, we examined therapy use and intensity among long-stay EOL residents.

Design

An observational study covering the period 2012-2016.

Setting and participants

New York State nursing homes (N = 647) and their long-stay decedent residents (N = 55,691).

Methods

Data sources included Minimum Data Set assessments, vital statistics, the Nursing Home Compare website, LTCfocus, and the Area Health Resource File. Therapy intensity in the last month of life was the outcome measure. Individual-level covariates were used to adjust for health conditions. Facility-level covariates were the key independent variables of interest. Multinomial logistic regression models with facility random effects were estimated.

Results

Overall, 13.6% (n = 7600) of long-stay decedent residents had some therapy in the last month of life, ranging from 0% to 45% across facilities. Of those, almost 16% had very high/ultrahigh therapy intensity (>500 minutes) before death. Adjusting for individual-level covariates, decedents in for-profit facilities had an 18% higher risk of low/medium therapy [relative risk ratio (RRR) = 1.182, P < .001] and more than double the risk of high/ultrahigh therapy (RRR = 2.126, P < .001), compared with those with no therapy use in the last month of life. In facilities with higher physical therapy staffing, decedents had a higher risk of high/ultrahigh therapy (RRR = 16.180, P = .002), but not of low/medium therapy intensity. The use of high/ultrahigh therapy in this population has increased over time.

Conclusions and Relevance

This is the first study to empirically demonstrate that facility characteristics are associated with the therapy intensity provided to EOL residents. Findings suggest that facilities with a for-profit mission and higher therapist staffing may be more incentivized to maximize therapy use, even among the sickest residents.

11.

Objectives

The use of psychotropic drugs in long-term care (LTC) is very common, despite their known adverse effects. The prevalence of opioid use is growing among older adults. This study aimed to investigate trends in the prevalence of psychotropics, opioids, and sedative load in a LTC setting over a 14-year period. We also explored the interaction of psychotropic and opioid use according to residents’ dementia status in nursing home (NH) and assisted living facility (ALF) settings.

Design

Four cross-sectional studies.

Setting

Institutional settings in Helsinki, Finland.

Participants

Older residents in NHs in 2003 (n = 1987), 2011 (n = 1576), and 2017 (n = 791) and in ALFs in 2007 (n = 1377), 2011 (n = 1586), and 2017 (n = 1624).

Measures

Comparable assessments were conducted among LTC residents at 4 time points over 14 years. The prevalence of regular psychotropics, opioids, and other sedatives and data on demographics and diagnoses were collected from medical records.

Results

Disabilities and severity of dementia increased in both settings over time. The prevalence of all psychotropics decreased significantly in NHs (from 81% in 2003 to 61% in 2017), whereas in ALFs there was no similar linear trend (65% in 2007 and 64% in 2017). There was a significant increase in the prevalence of opioids in both settings (30% in NHs and 22% in ALFs in 2017). Residents with dementia used psychotropics and opioids less than those without dementia in both settings and at each time point.

Conclusions/Implications

NHs show a favorable trend in psychotropic drug use, but the rates of psychotropic use remain high in both NHs and ALFs. In addition, the rates of opioid use have almost tripled, leading to a high sedative load among LTC residents. Clinicians should carefully consider the risk-to-benefit ratio when prescribing in LTC.

12.

Objectives

The aim of this study was to determine the feasibility and efficacy of a 6-month tele-rehabilitation home-based program, designed to prevent falls in older adults with 1 or more chronic diseases (cardiac, respiratory, neuromuscular or neurologic) returning home after in-hospital rehabilitation for their chronic condition. Patients were eligible for selection if they had experienced a fall during the previous year or were at high risk of falling.

Design

Randomized controlled trial. Tele-rehabilitation consisted of a falls prevention program run by a physiotherapist, involving individual home exercise (strength, balance, and walking) and a weekly structured phone call from a nurse inquiring about disease status and symptoms and providing patient support.

Setting and Participants

Two hundred eighty-three patients (age 79 ± 6.6 years; 59% female) at high risk of falls and discharged home after in-hospital rehabilitation were randomized to receive the home-based program (intervention group, n = 141) or conventional care (control group, n = 142).

Measures

Incidence of falls at home in the 6-month period (primary outcome); time free to the first fall and proportion of patients sustaining ≥2 falls (secondary outcomes).

Results

During the 6 months, 85 patients fell at least once: 29 (20.6%) in the intervention group versus 56 (39.4%) in the control group (P < .001). The risk of falls was significantly reduced in the intervention group (relative risk = 0.60, 95% confidence interval 0.44-0.83; P < .001). The mean ± standard deviation time to first fall was significantly longer in the intervention group than in the control group (152 ± 58 vs 134 ± 62 days; P = .001). Significantly fewer patients experienced ≥2 falls in the intervention group than in the control group: 11 (8%) versus 24 (17%), P = .020.

Conclusions

A 6-month tele-rehabilitation home-based program integrated with medical/nursing telesurveillance is feasible and effective in preventing falls in older patients with chronic disease at high risk of falling.

13.

Background

The overall diet quality of individuals and populations can be assessed by dietary indexes based on information from food surveys. Few studies have evaluated the diet quality of individuals with type 2 diabetes or its potential associations with glycemic control.

Objective

To evaluate the relationship between diet quality and glycemic control.

Design

Cross-sectional study with consecutive enrollment from 2013 to 2016.

Participants

Outpatients with type 2 diabetes treated at a university hospital in southern Brazil.

Main outcome measures

Dietary information was obtained by a quantitative food frequency questionnaire validated for patients with diabetes. Overall diet quality was evaluated by the Healthy Eating Index 2010. Glycemic control was assessed by fasting plasma glucose and glycated hemoglobin.

Statistical analyses

A receiver operating characteristic curve was constructed to find the optimal Healthy Eating Index cutoff point to discriminate diet quality, considering good glycemic control as glycated hemoglobin level <7%. Patients were then classified as having lower vs higher diet quality, and the two groups were compared statistically. Logistic regression models were constructed with glycated hemoglobin level ≥7% as the dependent variable, adjusted for age, current smoking, diabetes duration and treatment, physical activity, body mass index, high-density lipoprotein cholesterol level, and energy intake.

Results

A total of 229 patients with type 2 diabetes (median age=63.0 years [interquartile range=58.0 to 68.5 years]; diabetes duration=10.0 years [interquartile range=5 to 19 years]; body mass index 30.8±4.3; and glycated hemoglobin=8.1% [interquartile range=6.9% to 9.7%]) were evaluated. A Healthy Eating Index score >65% yielded the best properties (area under the receiver operator characteristic curve=0.60; sensitivity=71.2%; specificity=52.1%; P=0.018). Patients with lower-quality diets were younger and more likely to be current smokers than patients with higher-quality diets. After adjusting for confounders, patients with lower-quality diets had nearly threefold odds of poorer glycemic control (2.92; 95% CI 1.27 to 6.71; P=0.012) than those in the higher-quality diet group.

Conclusions

Lower diet quality, defined as a Healthy Eating Index 2010 score <65%, was associated with poor glycemic control in this sample of outpatients with type 2 diabetes.

14.

Background

No studies have assessed the relationship between diet quality, using the Healthy Eating Index (HEI), and adiposity, physical activity, and metabolic disease risk factors in a Hispanic college population.

Objective

To assess associations between diet quality and adiposity, metabolic health, and physical activity levels in a Hispanic college freshman population.

Design

This was a cross-sectional study. Measurements were obtained during a 4-hour in-person visit and included demographic information (questionnaire); height, weight, waist circumference, body mass index, and body fat (BodPod); hepatic fat, visceral adipose tissue (VAT), and subcutaneous adipose tissue (magnetic resonance imaging); glucose, insulin, homeostatic model assessment of insulin resistance (HOMA-IR), and lipids (fasting blood draw); physical activity, that is, step counts per day and time spent at different intensity levels (7-day accelerometry); and dietary intake (three to four 24-hour dietary recalls). Dietary quality was calculated using the HEI-2015.

Participants/setting

Hispanic college freshmen (n=92), 18 to 19 years, 49% male, who were enrolled at University of Texas at Austin from 2014 to 2015.

Main outcome measures

Main outcome measures were diet quality and adiposity, metabolic health, and physical activity levels.

Statistical analyses performed

Linear regressions determined whether dietary quality was related to adiposity, metabolic, and physical activity outcomes. A priori covariates included sex, body fat, and body mass index percentile (for metabolic models), and moderate and vigorous physical activity (MVPA; for adiposity and metabolic models).

Results

The average HEI-2015 total score was 54.9±13.4. A 1-point increase in HEI score was associated with 1.5 mL lower VAT (P=0.013); 8 minutes per day more light activity (P=0.008) and 107 more step counts per day (P=0.002); and 0.10 μg/mL lower insulin (P=0.046) and a 0.5-unit lower HOMA-IR (P<0.001).

Conclusion

Results suggest that small improvements in diet quality may be associated with reduced metabolic disease risk during a critical time period in a young person’s life.

15.

Objective

Discharge to skilled nursing facilities (SNFs) is common in patients with heart failure (HF). It is unknown whether the transition from SNF to home is risky for these patients. Our objective was to study outcomes for the 30 days after discharge from SNF to home among Medicare patients hospitalized with HF who had subsequent SNF stays of 30 days or less.

Design

Retrospective cohort study.

Setting and participants

All Medicare fee-for-service beneficiaries 65 and older admitted during 2012-2015 with a HF diagnosis discharged to SNF then subsequently discharged home.

Measures

Patients were followed for 30 days following SNF discharge. We categorized patients by SNF length of stay: 1 to 6 days, 7 to 13 days, and 14 to 30 days. For each group, we modeled time to a composite outcome of unplanned readmission or death after SNF discharge. Our model examined 0-2 days and 3-30 days post-SNF discharge.

Results

Our study included 67,585 HF hospitalizations discharged to SNF and subsequently discharged home. Overall, 16,333 (24.2%) SNF discharges to home were readmitted within 30 days of SNF discharge. The hazard rate of the composite outcome for each group was significantly increased on days 0 to 2 after SNF discharge compared to days 3 to 30, as reflected in their hazard rate ratios: for patients with SNF length of stay 1 to 6 days, 4.60 (4.23-5.00); SNF length of stay 7 to 13 days, 2.61 (2.45-2.78); SNF length of stay 14 to 30 days, 1.70 (1.62-1.78).

Conclusions/implications

The hazard rate of readmission after SNF discharge following HF hospitalization is highest during the first 2 days home. This risk attenuated with longer SNF length of stay. Interventions to improve postdischarge outcomes have primarily focused on hospital discharge. This evidence suggests that interventions to reduce readmissions may be more effective if they also incorporate the SNF-to-home transition.

16.

Background

The Expanded Food and Nutrition Education Program (EFNEP) is a federally funded, community nutrition education program that assists the low-income population in acquiring knowledge and skills related to nutrition, food safety, food resource management, food security, and physical activity. Evaluation of EFNEP includes a 24-hour dietary recall (24HDR) administered by paraprofessional educators, yet protocols for most large-scale nutrition research studies employ registered dietitian nutritionists (RDNs) or individuals with educational backgrounds in nutrition or related fields to collect dietary recalls.

Objective

To compare 24HDRs collected by trained paraprofessional educators with recalls collected by an RDN.

Design

Exploratory cross-over study comparing same-day 24HDRs collected in a one-on-one setting by paraprofessional educators and an RDN. Paired recalls were separated by at least 1 hour.

Participants and setting

The participants (n=41) were volunteer women who were eligible for participation in EFNEP in two states.

Main outcome measures

The 24HDRs were compared for energy, macronutrients, micronutrients, and food groups.

Statistical analysis performed

Mixed-model analysis to account for repeated measures. Intraclass correlation and Spearman correlation coefficients to determine interrater agreement.

Results

No difference in 24HDR was seen when compared by interviewer (paraprofessional vs RDN) or by site (Colorado vs North Carolina). There were significant differences in four components (energy, total fat, saturated fat, and solid fats-added sugar) based on recall order, with a higher intake in the second recall compared with the first.

Conclusion

The results of this preliminary study suggest that a well-trained paraprofessional educator using a valid methodology can collect a 24HDR that is similar to a recall collected by an RDN. The paraprofessional educator can be employed for dietary data collection, allowing the RDN to focus on more advanced aspects of scope of practice, such as data evaluation and program development.

17.

Background

Objective indicators of nutritional status are essential for accurate identification of malnutrition. Previous research has indicated an association between measures of respiratory muscle strength (RMS) and nutritional status. Measurement of RMS—including maximal inspiratory pressure (MIP), maximal expiratory pressure (MEP), and sniff nasal inspiratory pressure (SNIP)—may provide evidence to support the assessment of nutritional status in hospitalized patients.

Objective

The purpose of this study was to determine whether there was a difference in MIP, MEP, and SNIP between well-nourished and malnourished hospitalized patients.

Design

A cross-sectional study was conducted.

Participants/setting

Patients were screened for eligibility criteria on admission by means of electronic medical records in general medical or surgical units at a tertiary care hospital in Chicago, IL, from January 2016 to January 2017. A total of 140 patients were included for analysis.

Main outcomes measured

The primary outcome was detection of differences in measures of RMS between malnourished and well-nourished hospitalized patients. Nutritional status was assessed using subjective global assessment and Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (Academy/ASPEN) criteria recommended to identify malnutrition. The MIP, MEP, and SNIP measures were obtained and reported as absolute values (expressed in centimeters of water) and percent of predicted values.

Statistical analysis

Independent t tests or Mann-Whitney U tests were used to determine differences in RMS measures between patients assessed as well nourished and those assessed as malnourished, depending on normality.
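The normality-gated choice between the two tests described above can be sketched as follows. The SNIP samples, group sizes, and alpha threshold for the Shapiro-Wilk check are all hypothetical assumptions, not the study's data or protocol.

```python
# Minimal sketch: check each group's normality with Shapiro-Wilk, then
# run an independent t test (both normal) or a Mann-Whitney U test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical SNIP values (cm H2O) for two nutritional-status groups
well_nourished = rng.normal(loc=76, scale=28, size=100)
malnourished = rng.normal(loc=59, scale=27, size=100)

def compare_groups(a, b, alpha=0.05):
    # Parametric test only if neither group rejects normality.
    if stats.shapiro(a).pvalue > alpha and stats.shapiro(b).pvalue > alpha:
        return "t test", stats.ttest_ind(a, b).pvalue
    return "Mann-Whitney U", stats.mannwhitneyu(a, b).pvalue

test_name, p_value = compare_groups(well_nourished, malnourished)
print(f"{test_name}: P = {p_value:.4f}")
```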

Results

Compared with well-nourished patients, malnourished patients identified by subjective global assessment criteria had significantly lower absolute SNIP (73.7±28.7 vs 59.5±27.1 cm H2O, P=0.004) and percent of predicted SNIP (78.6%±26.3% vs 64.8%±30.0% predicted, P=0.006). Similarly, compared with well-nourished patients when Academy/ASPEN guidelines were used, malnourished individuals had significantly lower absolute SNIP (76.5±28.6 vs 58.3±26.3 cm H2O, P<0.001), percent of predicted SNIP (81.4%±26.4% vs 63.5%±28.7% predicted, P<0.001), absolute MIP (83.5±34.6 vs 71.1±33.6 cm H2O, P=0.05), and absolute MEP (108.7±36.6 vs 94.2±39.9 cm H2O, P=0.04).

Conclusion

Differences in RMS between well-nourished and malnourished patients were observed when SNIP measures were used. However, there were no differences in MIP and MEP measures. Further research is needed to build on the findings from this study.

18.

Objectives

We aimed to identify the best form of cognitive therapy among 3 main cognitive interventions of Alzheimer's disease (AD) including cognitive training (CT), cognitive stimulation (CS), and cognitive rehabilitation (CR).

Design

Systematic review and Bayesian network meta-analysis.

Setting and Participants

An exhaustive literature search of PubMed, Embase, the Cochrane Central Register of Controlled Trials, PsycINFO, the China National Knowledge Infrastructure database, the Chinese Biomedical Literature database, the Wan Fang database, Web of Science, and other databases identified randomized controlled trials from inception to May 1, 2018. Older adult participants diagnosed with AD were recruited.

Measures

We conducted a Bayesian network meta-analysis (NMA) to rank the included treatments. Cognitive function was measured with the Mini-Mental State Examination (MMSE). A series of analyses and assessments, such as pairwise meta-analysis and risk-of-bias assessment, were performed concurrently.
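Treatment rankings from an NMA of this kind are commonly summarized as SUCRA values computed from a rank-probability matrix. The sketch below shows that computation; the probabilities are illustrative and are not taken from this review.

```python
# SUCRA from an NMA rank-probability matrix: the mean of the cumulative
# rank probabilities over ranks 1..a-1 (a = number of treatments).
import numpy as np

treatments = ["CT", "CR", "CS"]
# rank_probs[i, r] = probability that treatment i is ranked (r+1)-th best
rank_probs = np.array([
    [0.70, 0.25, 0.05],   # CT (hypothetical)
    [0.20, 0.45, 0.35],   # CR (hypothetical)
    [0.10, 0.30, 0.60],   # CS (hypothetical)
])

def sucra(rank_probs):
    a = rank_probs.shape[1]
    # Cumulative P(rank <= b); drop the final column, which is always 1.
    cum = np.cumsum(rank_probs, axis=1)[:, :-1]
    return cum.sum(axis=1) / (a - 1)

scores = sucra(rank_probs)
for name, s in zip(treatments, scores):
    print(f"{name}: SUCRA = {s:.1%}")
```

A SUCRA of 100% would mean a treatment is certain to be best; 0% certain to be worst.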

Results

After a series of rigorous screenings, 22 studies comprising 1368 participants were included in our analysis. No obvious heterogeneity was found in the NMA (I2 = 32.7%, P = .07) after the data were pooled. The mean differences (MDs) of CT (MD = 2.1, 95% confidence interval [CI]: 1.0, 3.2), CS (MD = 0.92, 95% CI: −0.20, 2.0), and CR (MD = 2.0, 95% CI: 0.73, 3.4) showed that CT and CR significantly improved cognitive function as measured by the MMSE in the treatment group, whereas CS was less effective. CT had the highest probability of being best among the 3 cognitive interventions (surface under the cumulative ranking curve [SUCRA] = 84.7%), followed by CR (SUCRA = 50.0%) and CS (SUCRA = 47.4%).

Conclusions/Relevance

Our study indicated that CT might be the best method for improving cognitive function in patients with AD. These findings may be useful for policy makers and service commissioners when choosing among the alternatives.

19.

Objective

To investigate whether depression and/or antidepressants can be a potential risk factor for the development of dementia and mild cognitive impairment (MCI).

Design

Systematic review and meta-analysis of longitudinal studies.

Setting and Participants

Community or clinical settings. Participants included patients with depression, antidepressant users, and the general population.

Measures

Longitudinal studies evaluating the risks of dementia or MCI in patients with depression and/or antidepressant users were identified from the OVID database. The outcomes were the number of patients who developed dementia or MCI among the antidepressant users and nonusers. Relative risk (RR) with 95% confidence interval (95% CI) was used to evaluate the association between the use of antidepressants and the risk of dementia and MCI. Meta-analysis was used for combining the effect sizes of individual studies, and the heterogeneity test was performed. Risk of bias and reporting quality of included studies was assessed. Subgroup analyses were conducted for different types of antidepressants.
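The effect-size pooling and heterogeneity test described above can be sketched as follows. The three RR/CI pairs are hypothetical example studies, and the DerSimonian-Laird random-effects estimator is an assumption; the abstract does not name the model used.

```python
# Inverse-variance pooling of relative risks on the log scale, with
# Cochran's Q, I^2, and DerSimonian-Laird tau^2. Inputs are hypothetical.
import numpy as np

studies = [(1.45, 1.10, 1.91),   # (RR, 95% CI lower, 95% CI upper)
           (1.25, 0.98, 1.60),
           (1.40, 1.05, 1.87)]

log_rr = np.array([np.log(rr) for rr, lo, hi in studies])
# Back out each standard error from the CI width on the log scale.
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for rr, lo, hi in studies])
w = 1 / se**2

fixed = (w * log_rr).sum() / w.sum()            # fixed-effect estimate
q = (w * (log_rr - fixed) ** 2).sum()           # Cochran's Q
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) if q > 0 else 0.0   # heterogeneity statistic
tau2 = max(0.0, (q - df) / (w.sum() - (w**2).sum() / w.sum()))

w_star = 1 / (se**2 + tau2)                     # random-effects weights
pooled = (w_star * log_rr).sum() / w_star.sum()
pooled_se = np.sqrt(1 / w_star.sum())
rr = np.exp(pooled)
ci = (np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se))
print(f"Pooled RR = {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), I^2 = {i2:.0%}")
```

With tau2 = 0 the random-effects estimate collapses to the fixed-effect one, which is why the heterogeneity test matters for interpretation.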

Results

A total of 18 studies with 2,119,627 participants, with mean ages ranging from 55 to 81 years, were included. Among patients with depression, antidepressant users showed a significantly higher risk of dementia (RR = 1.37, 95% CI = 1.11-1.70) and MCI (RR = 1.20, 95% CI = 1.02-1.42) than nonusers. In addition, patients with depression who used antidepressants and those who did not both showed a significantly higher risk of dementia than the general population (RR = 1.50, 95% CI = 1.26-1.78, and RR = 1.31, 95% CI = 1.15-1.51, respectively).

Conclusions/Implications

Depression is associated with a higher risk of dementia, and the use of antidepressants was not shown to be protective against dementia. Further large-scale trials are required to investigate the benefit-risk ratio between depression relapse and dementia when prescribing antidepressants.

20.

Objectives

To evaluate the quality of communication between hospitals and home health care (HHC) clinicians and patient preparedness to receive HHC in a statewide sample of HHC nurses and staff.

Design

A web-based 48-question cross-sectional survey of HHC nurses and staff in Colorado to describe the quality of communication after hospital discharge and patient preparedness to receive HHC from the perspective of HHC nurses and staff. Questions used a Likert scale, with optional free-text responses.

Setting and participants

Between January and June 2017, we sent a web-based survey to individuals from the 56 HHC agencies in the Home Care Association of Colorado that indicated willingness to participate.

Results

We received responses from 50 of 122 individuals (41% individual response rate) representing 14 of 56 HHC agencies (25% agency response rate). Half of the respondents were HHC nurses; the remainder were managers, administrators, or quality assurance clinicians. Among respondents, 60% (n = 30) reported receiving insufficient information to guide patient management in HHC, and 44% (n = 22) reported encountering problems related to inadequate patient information. The communication domain most frequently identified as insufficient was additional tests recommended by hospital clinicians (58%). More than half of respondents (52%) indicated that patient preparation to receive HHC was inadequate, with patient expectations frequently including extended-hours caregiving, housekeeping, and transportation, which are beyond the scope of HHC. Respondents with access to the referring provider's electronic health record (EHR) were less likely to encounter problems related to a lack of information (27% vs 57% without EHR access, P = .04). Respondents with EHR access were also more likely to have sufficient information about medications and contact isolation.
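A two-group comparison of the kind reported above (problems encountered, with vs without EHR access) can be sketched with Fisher's exact test, which suits the small cell counts of a 50-respondent survey. The counts below are hypothetical, chosen only to approximate the reported 27% vs 57% proportions.

```python
# Fisher's exact test on a 2x2 table of survey responses (hypothetical).
from scipy.stats import fisher_exact

# rows: EHR access yes / no; cols: encountered problems yes / no
table = [[6, 16],    # with EHR access: 6/22 ~ 27% reported problems
         [16, 12]]   # without EHR access: 16/28 ~ 57% reported problems
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p:.3f}")
```

An odds ratio below 1 here indicates that respondents with EHR access had lower odds of encountering information problems.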

Conclusions/Implications

Communication between hospitals and HHC is suboptimal, and patients are often not prepared to receive HHC. Providing EHR access for HHC clinicians is a promising solution to improve the quality of communication.
