Similar Articles (20 results)
1.

Importance

Although participation in physical and cognitive activities is encouraged to reduce the risk of dementia, the preventive efficacy of these activities for patients with mild cognitive impairment is unestablished.

Objective

To compare the cognitive and mobility effects of a 40-week program of combined cognitive and physical activity with those of a health education program.

Design

A randomized, parallel, single-blind controlled trial.

Setting

A population-based study of participants recruited from Obu, a residential suburb of Nagoya, Japan.

Participants

Between August 2011 and February 2012, we evaluated 945 adults 65 years or older with mild cognitive impairment, enrolled 308, and randomly assigned them to the combined activity group (n = 154) or the health education control group (n = 154).

Interventions

The combined activity program involved weekly 90-minute sessions for 40 weeks focused on physical and cognitive activities. The control group attended 90-minute health promotion classes thrice during the 40-week trial period.

Measurement

The outcome measures were assessed at the beginning and end of the study by personnel blinded to mild cognitive impairment subtype and group assignment. The primary endpoints were postintervention changes in scores on (1) the Mini-Mental State Examination, as a measure of general cognitive status, and, as measures of memory, (2) the Wechsler Memory Scale-Revised Logical Memory II and (3) the Rey Auditory Verbal Learning Test. We also performed mobility assessments and assessed brain atrophy with magnetic resonance imaging.

Results

Compared with the control group, the combined activity group showed significantly greater scores on the Mini-Mental State Examination (difference = 0.8 points, P = .012) and Wechsler Memory Scale-Revised Logical Memory II (difference = 1.0 point, P = .004), significant improvements in mobility and the nonmemory domains, and reduced left medial temporal lobe atrophy in amnestic mild cognitive impairment (Z-score difference = −31.3, P < .05).

Conclusion

Combined physical and cognitive activity improves or maintains cognitive and physical performance in older adults with mild cognitive impairment, especially the amnestic type.

2.

Objectives

A simple and inexpensive tool for screening of sarcopenia would be helpful for clinicians. The present study was performed to determine whether the SARC-F questionnaire is useful for screening patients with cardiovascular disease (CVD) for impaired physical function.

Design

Cross-sectional study.

Setting

Single university hospital.

Participants

A total of 235 Japanese patients ≥65 years old admitted to our hospital for CVD.

Measurements

SARC-F, handgrip strength, leg strength, respiratory muscle strength, standing balance, usual gait speed, Short Physical Performance Battery (SPPB) score, and 6-minute walking distance were measured before discharge from hospital. The patients were divided into 2 groups according to SARC-F score: SARC-F < 4 (nonsarcopenia group) and SARC-F ≥ 4 (sarcopenia group).
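
The grouping above is a simple threshold rule. As a minimal illustration (not the study's code), the sketch below scores the SARC-F and assigns the two groups; the function names are hypothetical, and the five components (strength, assistance walking, rising from a chair, climbing stairs, falls), each scored 0 to 2, follow the standard SARC-F layout.

```python
# Illustrative sketch of SARC-F scoring and the study's dichotomization.

def sarc_f_total(strength, walking, chair, stairs, falls):
    """Sum the five SARC-F component scores (each 0, 1, or 2; total 0-10)."""
    components = (strength, walking, chair, stairs, falls)
    if any(c not in (0, 1, 2) for c in components):
        raise ValueError("each SARC-F component must be 0, 1, or 2")
    return sum(components)

def sarc_f_group(total, cutoff=4):
    """Assign the study's groups: >= cutoff -> sarcopenia, else nonsarcopenia."""
    return "sarcopenia" if total >= cutoff else "nonsarcopenia"

# Example: a patient scoring 1, 1, 0, 1, 2 totals 5 -> sarcopenia group.
print(sarc_f_group(sarc_f_total(1, 1, 0, 1, 2)))
```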

Results

The sarcopenia prevalence rate was 25.5% and increased with age (P trend < .001). The sarcopenia group (SARC-F score ≥ 4) had significantly lower handgrip strength, leg strength, and respiratory muscle strength, poorer standing balance, slower usual gait speed, lower SPPB score, and shorter 6-minute walking distance compared to the nonsarcopenia group (SARC-F score < 4). Patients in the sarcopenia group had consistently poorer physical function even after adjusting for covariates.

Conclusion

The SARC-F questionnaire is a useful screening tool for impaired physical function in elderly CVD patients. These findings support the use of the SARC-F for screening in hospital settings.

3.

Background

Malnutrition is a significant problem for hospitalized patients. However, the true prevalence of reported malnutrition diagnoses in real-world clinical practice is largely unknown. Using a large collaborative multi-institutional database, the rate of malnutrition diagnosis was assessed and institutional variables associated with higher rates of malnutrition diagnosis were identified.

Objective

The aim of this study was to define the prevalence of malnutrition diagnosis reported among inpatient hospitalizations.

Design

The University Health System Consortium (Vizient) database was retrospectively reviewed for reported rates of malnutrition diagnosis.

Participants/setting

All adult inpatient hospitalizations at 105 member institutions during fiscal years 2014 and 2015 were evaluated.

Main outcome measures

Malnutrition diagnosis based on the presence of an International Classification of Diseases-Ninth Revision diagnosis code.

Statistical analysis

Hospital volume and publicly available hospital rankings and patient satisfaction scores were obtained. Multiple regression analysis was performed to assess the association between these variables and reported rates of malnutrition.
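
As a hedged sketch of this kind of institution-level analysis (not the authors' code), the example below fits an ordinary least squares regression of diagnosis rate on the three institutional variables named above; all values and column names are invented for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Eight illustrative institutions (values invented for the sketch).
institutions = pd.DataFrame({
    "diagnosis_rate": [0.040, 0.052, 0.031, 0.065, 0.048, 0.029, 0.058, 0.044],
    "volume": [41, 67, 25, 90, 55, 22, 78, 50],      # thousand stays per year
    "ranking": [120, 45, 200, 10, 80, 230, 25, 95],  # lower = better ranked
    "satisfaction": [3.2, 4.1, 2.9, 4.5, 3.8, 2.7, 4.3, 3.5],
})

# Multiple regression: each coefficient estimates that variable's
# association with the reported malnutrition-diagnosis rate.
model = smf.ols("diagnosis_rate ~ volume + ranking + satisfaction",
                data=institutions).fit()
print(model.summary())
```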

Results

A total of 5,896,792 hospitalizations were identified from 105 institutions during the 2-year period. It was found that 292,754 patients (5.0%) had a malnutrition diagnosis during their hospital stay. By institution, median rate of malnutrition diagnosis during hospitalization was 4.0%, whereas the rate of severe malnutrition diagnosis was 0.9%. There was a statistically significant increase in malnutrition diagnosis from 4.0% to 4.9% between 2014 and 2015 (P<0.01). Institutional factors associated with increased diagnosis of malnutrition were higher hospital volume, hospital ranking, and patient satisfaction scores (P<0.01).

Conclusions

Missing a malnutrition diagnosis appears to be a universal issue because the rate of malnutrition diagnosis was consistently low across academic medical centers. Institutional variables were associated with the prevalence of malnutrition diagnosis, which suggests that institutional culture influences malnutrition diagnosis. Quality improvement efforts aimed at improved structure and process appear to be needed to improve the identification of malnutrition.

4.

Background

Diet and obesity influence prostate cancer risk and progression, effects that may be mediated through the gut microbiome.

Objective

Our aim was to explore relationships among diet, gut microbes, and Gleason sum in overweight and obese prostate cancer patients enrolled in a presurgical weight-loss trial.

Design

Secondary analysis of a randomized controlled trial (NCT01886677).

Participants/setting

In 2013-2014, 40 prostate cancer patients in the southeastern United States were randomized and allocated equally to weight-loss and wait-list control arms while they awaited prostatectomy; stool samples were collected from a subset of 22 patients.

Intervention

Registered dietitian nutritionists and exercise physiologists provided semi-weekly in-person and telephone-based guidance on calorie-restricted diets and exercise to promote an approximate weight loss of 0.91 kg/wk.

Main outcome measures

Baseline and follow-up 24-hour dietary recalls were conducted and analyzed (using the Automated Self-Administered 24-hour dietary recall system; National Cancer Institute, Bethesda, MD) for macronutrients, micronutrients, and food groups. Microbiome analysis targeting the V4 region of the 16S ribosomal RNA gene was performed on fecal samples. Biopsy Gleason sum data were accessed from diagnostic pathology reports.

Statistical analyses performed

Associations between dietary factors and operational taxonomic units were determined by β-diversity analysis. Wilcoxon signed rank and Mann-Whitney U tests assessed within- and between-arm differences. Associations between Gleason sum and operational taxonomic units, and between diet and operational taxonomic units, were analyzed using Spearman correlations.
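
For readers unfamiliar with these nonparametric tests, the sketch below runs all three in scipy on synthetic arrays; it illustrates the test calls only and uses no study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
gleason = rng.integers(6, 10, size=22)          # per-patient Gleason sums
clostridium = rng.random(22)                    # relative abundance (0-1)

# Spearman rank correlation between Gleason sum and a taxon's abundance.
rho, p = stats.spearmanr(gleason, clostridium)
print(f"Spearman rho={rho:.3f}, P={p:.3f}")

# Between-arm comparison (e.g., weight-loss vs wait-list abundances).
arm_a, arm_b = clostridium[:11], clostridium[11:]
u, p_u = stats.mannwhitneyu(arm_a, arm_b)
print(f"Mann-Whitney U={u:.1f}, P={p_u:.3f}")

# Within-arm paired change (baseline vs follow-up), Wilcoxon signed rank.
followup = arm_a + rng.normal(0, 0.1, size=11)
w, p_w = stats.wilcoxon(arm_a, followup)
print(f"Wilcoxon W={w:.1f}, P={p_w:.3f}")
```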

Results

At baseline, Proteobacteria (median 0.06, interquartile range 0.01 to 0.16) were abundant, with four orders positively associated with Gleason sum. Gleason sum was associated with Clostridium (ρ = 0.579, P = 0.005) and Blautia (ρ = −0.425, P = 0.049). Increased red meat consumption from baseline was associated with Prevotella (ρ = −0.497, P = 0.018) and Blautia (ρ = 0.422, P = 0.039). Men who increased poultry intake had decreased Clostridiales abundance (P = 0.009).

Conclusions

This hypothesis-generating study provides a starting point for investigating the relationships between the fecal microbiome, diet, and prostate cancer. Adequately powered studies are required to further explore and validate these findings.  相似文献   

5.

Background

Rural children consume more calories per day on average than urban children, and they are less likely to consume fruit. Self-service salad bars have been proposed as an effective approach to better meet the National School Lunch Program’s fruit and vegetable recommendations. No studies have examined how rural and urban schools differ in the implementation of school salad bars.

Objective

To compare the prevalence of school-lunch salad bars and differences in implementation between urban and rural Arizona schools.

Design

Secondary analysis of a cross-sectional web-based survey.

Participants/setting

School nutrition managers (N=596) in the state of Arizona.

Main outcomes measured

National Center for Education Statistics locale codes defined rural and urban classifications. Barriers to salad bar implementation were examined among schools that have never had, once had, and currently have a school salad bar. Promotional practices were examined among schools that once had and currently have a school salad bar.

Statistical analyses performed

Generalized estimating equation models were used to compare urban and rural differences in presence and implementation of salad bars, adjusting for school-level demographics and the clustering of schools within districts.
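
The sketch below shows, under assumptions, what such a model can look like in statsmodels: a binomial GEE for salad-bar presence with an exchangeable correlation structure for schools clustered within districts. Data and column names are synthetic stand-ins, not the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
schools = pd.DataFrame({
    "salad_bar": rng.integers(0, 2, n),     # 1 = currently has a salad bar
    "rural": rng.integers(0, 2, n),         # NCES-style locale flag
    "enrollment": rng.integers(100, 2000, n),
    "district_id": rng.integers(0, 60, n),  # schools cluster within districts
})

# GEE logistic model; the exchangeable structure models within-district
# correlation, mirroring "clustering of schools within districts."
model = smf.gee(
    "salad_bar ~ rural + enrollment",
    groups="district_id",
    data=schools,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(model.summary())
```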

Results

After adjustment, the prevalence of salad bars did not differ between urban and rural schools (46.9% ± 4.3% vs 46.8% ± 8.5%, respectively). Compared with urban schools, rural schools without salad bars more often reported perceived food waste and cost of produce as barriers to implementing salad bars, and more often cited funding as a necessary resource for offering a salad bar in the future (P<0.05). No other geographic differences were observed in reported salad bar promotion, challenges, or resources among schools that currently have or once had a salad bar.

Conclusions

After adjustment, salad bar prevalence, implementation practices, and concerns are similar across geographic settings. Future research is needed to investigate methods to address cost and food waste concerns in rural areas.  相似文献   

6.

Objective

To analyze the association between dietary patterns and the 12-year risk of frailty and its components in community-dwelling elderly French adults.

Design

A prospective cohort study.

Setting

The Bordeaux sample of the Three-City Study.

Participants

A total of 972 initially nonfrail nondemented participants (336 men and 636 women) aged 73 years on average, re-examined at least once over 12 years.

Measurements

Five sex-specific dietary clusters were previously derived at baseline. Incident frailty after the baseline visit was defined as meeting at least 3 of the following 5 criteria: unintentional weight loss, exhaustion, low energy expenditure, slowness, and muscle weakness. Multivariate Cox proportional hazards models were used to assess the association between dietary clusters and the risk of frailty and its components.
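
As a minimal sketch of this kind of survival analysis (not the study's code), the example below fits a Cox proportional hazards model with the lifelines library on synthetic data; the column names stand in for the variables described above, with one dietary pattern coded against a reference.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 300
cohort = pd.DataFrame({
    "followup_years": rng.uniform(1, 12, n),  # time at risk
    "frail": rng.integers(0, 2, n),           # 1 = became frail (event)
    "pasta_pattern": rng.integers(0, 2, n),   # 1 = "pasta" vs reference pattern
    "age": rng.normal(73, 5, n),
})

cph = CoxPHFitter()
cph.fit(cohort, duration_col="followup_years", event_col="frail")
cph.print_summary()  # hazard ratios with 95% CIs per covariate
```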

Results

In total, 78 men (3719 person-years) and 221 women (7027 person-years) became frail over the follow-up. In multivariate analyses, men in the “pasta” pattern and women in the “biscuits and snacking” pattern had a significantly higher risk of frailty compared with those in the “healthy” pattern [hazard ratio (HR) 2.2, 95% confidence interval (CI) 1.1–4.4, and HR 1.8, 95% CI 1.2–2.8, respectively; P = .09 and P = .13 for the global tests of risk difference across clusters]. In men, the “biscuits and snacking” and “pasta” patterns were significantly associated with higher risk of muscle weakness (HR 3.3, 95% CI 1.6–7.0, and HR 2.1, 95% CI 1.2–3.7, respectively; P = .003 for global test).

Conclusions

This 12-year prospective population-based study suggests that certain unhealthy dietary patterns may increase the risk of frailty in older adults.

7.

Objectives

This study examined the benefits of and differences between 12 weeks of thrice-weekly supervised balance training and an unsupervised at-home balance activity (using the Nintendo Wii Fit) for improving balance and reaction time and lowering falls risk in older individuals with type 2 diabetes mellitus (T2DM).

Design

Before-after trial.

Setting

University research laboratory, home environment.

Participants

Sixty-five older adults with type 2 diabetes were recruited for this study. Participants were randomly allocated to either supervised balance training (mean age 67.8 ± 5.2 years) or unsupervised training using the Nintendo Wii Fit balance board (mean age 66.1 ± 5.6 years).

Intervention

The training period for both groups lasted for 12 weeks. Individuals were required to complete three 40-minute sessions per week for a total of 36 sessions.

Measurement

The primary outcome measure was falls risk, as derived from the Physiological Profile Assessment. In addition, measures of simple reaction time, lower limb proprioception, postural sway, and knee flexion and knee extension strength were collected. Participants also self-reported any falls in the previous 6 months.

Results

Both training programs resulted in a significant lowering of falls risk (P < .05). The reduced risk was attributable to significant changes in reaction times for the hand (P < .05) and foot (P < .01), in lower-limb proprioception (P < .01), and in postural sway (P < .05).

Conclusions

Overall, training led to a decrease in falls risk, driven by improvements in reaction times, lower limb proprioception, and general balance ability. Interestingly, the reduced falls risk occurred without significant changes in leg strength, suggesting that interventions targeting intrinsic risk factors related to balance control (rather than muscle strength) may benefit older adults with T2DM at risk for falls.

8.

Background

Fear of falling (FoF) is present in 20% to 85% of older adults and may be an early marker of decline in global cognitive functioning (GCF). We tested the hypothesis that FoF is associated with lower levels of GCF (cross-sectional) and greater decline in GCF (prospective) in adults aged 50 and older.

Design

Observational cohort study.

Setting

The Irish Longitudinal Study on Ageing, a population-based study.

Participants

Data were from 4931 participants (mean age 62.9 ± 9.1, range 50–98, 54.3% female).

Measurements

FoF was based on self-report in 2010. GCF was measured with the Montreal Cognitive Assessment (MoCA) and Mini-Mental State Examination (MMSE) in 2010 and 2014. The cross-sectional association was examined using linear regression, unadjusted and after adjustment for demographic and health factors. The prospective association between FoF and the odds of >1-SD decline in GCF was examined using logistic regression. Interaction with age and mediation by social and physical activities were examined.
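
A hedged sketch of the prospective model is shown below: a logistic regression for >1-SD decline in MoCA on synthetic data, with the odds ratio recovered by exponentiating the FoF coefficient. Variable names are hypothetical stand-ins for the measures described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
tilda = pd.DataFrame({
    "fof": rng.integers(0, 2, n),  # 1 = fear of falling at baseline
    "age": rng.normal(63, 9, n),
    "moca_2010": rng.normal(26, 3, n),
})
# Synthetic follow-up scores with slightly larger decline when fof = 1.
tilda["moca_2014"] = tilda["moca_2010"] - rng.normal(0.5 + tilda["fof"], 2)

# Flag a decline of more than 1 SD of the change score.
change = tilda["moca_2014"] - tilda["moca_2010"]
tilda["decline"] = (change < change.mean() - change.std()).astype(int)

model = smf.logit("decline ~ fof + age", data=tilda).fit()
print(np.exp(model.params["fof"]))  # odds ratio for FoF
```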

Results

In 2010, 21.9% of participants reported FoF. In the unadjusted cross-sectional models, those with FoF had lower scores on the MoCA (B −1.15, 95% confidence interval [CI] −1.40 to −0.90) and MMSE (B −0.52, CI −0.67 to −0.37). In the unadjusted prospective models, FoF was associated with greater odds of decline in MoCA (odds ratio [OR] 1.60, CI 1.26–2.04) and MMSE (OR 1.64, CI 1.29–2.08). After adjustment for covariates, all associations were attenuated and no longer statistically significant, except the association with decline in MoCA (OR 1.32, CI 1.01–1.71). No statistically significant interaction with age was found (P > .37). Additional adjustment for social and physical activity did not change the results.

Conclusions

The findings provide weak evidence for FoF as a predictor of cognitive decline.  相似文献   

9.

Background

Unintentional underfeeding is common in patients receiving enteral nutrition (EN) and is associated with an increased risk of malnutrition complications. Protocols for EN in critically ill patients have been shown to enhance adequacy, resulting in better clinical outcomes; however, outside of intensive care unit (ICU) settings, the influence of an EN protocol is unknown.

Objective

To evaluate the efficacy and safety of implementing an EN protocol in a noncritical setting.

Design

Randomized controlled clinical trial.

Participants and settings

This trial was conducted from 2014 to 2016 in 90 adult hospitalized patients (non-ICU) receiving EN exclusively. Patients with carcinomatosis, ICU admission, or <72 hours of EN were excluded.

Intervention

The intervention group received EN according to a protocol, whereas the control group was fed according to standard practice.

Main outcome measures

The proportion of patients receiving ≥80% of their caloric target at Day 4 after EN initiation.

Statistical analyses performed

Student t test or Wilcoxon rank-sum test was used for continuous variables, and the difference between the groups in the time to receipt of the optimal amount of nutrition was analyzed using Kaplan-Meier curves.
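
The sketch below shows, on synthetic data, how such a time-to-adequate-nutrition comparison can be set up with the lifelines library; the log-rank test at the end is a common companion to Kaplan-Meier curves and is an assumption here, not a method stated in the abstract.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

def make_arm(scale_days, n=45):
    """Synthetic arm: days until caloric target reached, with censoring."""
    return pd.DataFrame({
        "days_to_target": rng.exponential(scale_days, n).round() + 1,
        "reached_target": rng.integers(0, 2, n),  # 0 = censored (never reached)
    })

protocol, control = make_arm(3), make_arm(6)

kmf = KaplanMeierFitter()
for name, arm in [("protocol", protocol), ("control", control)]:
    kmf.fit(arm["days_to_target"], event_observed=arm["reached_target"],
            label=name)
    print(name, "median days:", kmf.median_survival_time_)

res = logrank_test(protocol["days_to_target"], control["days_to_target"],
                   event_observed_A=protocol["reached_target"],
                   event_observed_B=control["reached_target"])
print("log-rank P =", res.p_value)
```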

Results

Forty-five patients were randomized to each group. At Day 4 after EN initiation, 61% of patients in the intervention arm had achieved the primary end point compared with 23% in the control group (P=0.001). In malnourished patients, 63% achieved the primary end point in the intervention group compared with 16% in the control group (P=0.003). The cumulative deficit on Day 4 was lower in the intervention arm compared with the control arm: 2,507 kcal (interquartile range [IQR]=1,262 to 2,908 kcal) vs 3,844 kcal (IQR=2,620 to 4,808 kcal) (P<0.001) and 116 g (IQR=69 to 151 g) vs 191 g (IQR=147 to 244 g) protein (P<0.001), respectively. The rates of gastrointestinal complications were not significantly different between groups.

Conclusions

Implementation of an EN protocol outside the ICU significantly improved the delivery of calories and protein when compared with current standard practice without increasing gastrointestinal complications.

10.

Background

Medicare incentivizes the reduction of hospitalizations of nursing facility (NF) residents. The effects of these incentives on resident safety have not been examined.

Objective

Examine safety indicators in NFs participating in a randomized, controlled trial of the INTERACT Quality Improvement Program.

Design

Secondary analysis of a randomized trial in which intervention NFs exhibited a statistically nonsignificant reduction in hospitalizations.

Setting

NFs with adequate on-site medical, radiography, laboratory, and pharmacy services, and capability for online training and data input were eligible.

Participants

264 NFs were randomized into intervention and comparison groups, stratified by previous INTERACT use and self-reported hospital readmission rates.

Intervention

NFs randomized to the intervention group received INTERACT materials, access to online training and a series of training webinars, feedback on hospitalization rates and root-cause analysis data, and monthly telephonic support.

Measures

Minimum Data Set (MDS) data on unintentional weight loss, malnutrition, hip fracture, pneumonia, wound infection, septicemia, urinary tract infection, and falls with injury for the intervention year and the year prior; and, for NFs participating in the intervention, unintentional weight loss, dehydration, and changes in rates of falls, pressure ulcers, severe pain, and unexpected deaths, obtained through monthly telephone calls.

Results

No adverse effects on resident safety and no significant differences in safety indicators between intervention and comparison group NFs were identified, with 1 exception: intervention NFs with high levels of INTERACT tool use reported significantly lower rates of severe pain.

Conclusions/Implications

Resident safety was not compromised during implementation of a quality improvement program designed to reduce unnecessary hospitalization of NF residents.

11.

Objectives

Use of exercise technologies improves gait and balance in community-dwelling older adults, but research on the feasibility of these technologies in various geriatric health care settings is lacking. The current study therefore examined the feasibility of implementing an exercise technology intended to augment rehabilitation in patients receiving post-acute care (PAC) in a skilled nursing facility (SNF). We focused on 3 indicators of feasibility: extent of usage (including predictors of more intense use), patients' acceptability of the technology, and limited efficacy.

Design

Cross-sectional study with data from patients' electronic medical records (EMR), exercise technology portal, and patient interviews.

Setting

SNF.

Participants

A sample of post-acute patients (n = 237).

Measurements

Sociodemographic and health-related variables, time spent using the technology, and 8 items of the Physical Activity Enjoyment Scale (PACES).

Results

Average time spent using the technology varied greatly (range, 1–460 minutes). A regression analysis showed that patients who had a longer length of stay (β = 0.01, P < .05) and were younger (β = −0.01, P < .05) spent significantly more time using the technology. Acceptability of the technology was high among patients. Finally, patients who used the technology had lower 30-day rehospitalization rates.

Conclusion

Exercise technology is feasible to use in supporting rehabilitation in patients receiving PAC in an SNF and seems to have beneficial effects.

12.

Objectives

To examine the association between each type of frailty status and the incidence rate of depressive symptoms among community-dwelling older adults.

Design

Prospective cohort study.

Setting

General communities in Japan.

Participants

Participants comprised 3538 older Japanese adults.

Measurements

We assessed participants' frailty status (physical frailty, cognitive impairment, and social frailty), depressive symptoms (Geriatric Depression Scale score ≥6), and other covariates, and excluded those who showed evidence of depression. After a 4-year interval, we reassessed participants for depressive symptoms. Physical frailty was defined by the Fried criteria; participants meeting 1 or more criteria were considered physically frail. To screen for cognitive impairment, participants scoring below an age- and education-adjusted reference threshold on 1 or more tests were considered cognitively impaired. Finally, social frailty was defined using 5 questions; those who answered positively to 1 or more were considered socially frail.

Results

After multiple imputation, the incidence rate of depressive symptoms after 4 years of follow-up was 7.2%. The incidence rates of depressive symptoms for each frailty status were as follows: 9.6% with physical frailty vs 4.6% without, 9.3% with cognitive impairment vs 6.5% without, and 12.0% with social frailty vs 5.1% without. In multivariable logistic regression analysis, the incidence of depressive symptoms was significantly associated with social frailty (odds ratio 1.55; 95% confidence interval 1.10–2.20) but not with physical frailty or cognitive impairment.

Conclusions

This study revealed that social frailty, in comparison with physical frailty and cognitive impairment, is more strongly associated with incident depressive symptoms among older adults.

13.

Objectives

To investigate the prevalence and factors associated with the use of medications of questionable benefit throughout the final year of life of older adults who died with dementia.

Design

Register-based, longitudinal cohort study.

Setting

Sweden (nationwide).

Participants

All older adults (≥75 years) who died with dementia between 2007 and 2013 (n = 120,067).

Measurements

Exposure to medications of questionable benefit was calculated for each of the last 12 months before death, based on longitudinal data from the Swedish Prescribed Drug Register.

Results

The proportion of older adults with dementia who received at least 1 medication of questionable benefit decreased from 38.6% 12 months before death to 34.7% during the final month before death (P < .001 for trend). Among older adults with dementia who used at least 1 medication of questionable benefit 12 months before death, 74.8% remained exposed until their last month of life. Living in an institution was independently associated with a 15% lower likelihood of receiving ≥1 medication of questionable benefit during the last month before death (odds ratio 0.85, 95% confidence interval 0.83–0.88). Antidementia drugs accounted for one-fifth of the total number of medications of questionable benefit. Lipid-lowering agents were used by 8.3% of individuals during their final month of life (10.2% of community dwellers and 6.6% of institutionalized people, P < .001).

Conclusion

Clinicians caring for older adults with advanced dementia should be provided with reliable tools to help them reduce the burden of medications of questionable benefit near the end of life.

14.

Objectives

To test the association between polypharmacy and 1-year change in physical and cognitive function among nursing home (NH) residents.

Design

Longitudinal multicenter cohort study based on data from the Services and Health for Elderly in Long TERm care (SHELTER) study.

Setting

NHs in Europe (n = 50) and Israel (n = 7).

Participants

3234 older NH residents.

Measurements

Participants were assessed through the interRAI long-term care facility instrument. Polypharmacy was defined as the concurrent use of 5 to 9 drugs and excessive polypharmacy as the use of ≥10 drugs. Cognitive function was assessed through the Cognitive Performance Scale (CPS), and functional status was evaluated through the Activities of Daily Living (ADL) Hierarchy scale. The changes in CPS and ADL scores, based on repeated assessments, were the outcomes, and their associations with polypharmacy were modeled via linear mixed models. The interaction between polypharmacy and time is reported [beta and 95% confidence intervals (95% CIs)].
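
A sketch of such a model is shown below, under assumptions: a linear mixed model in statsmodels with random intercepts per resident, where the polypharmacy-by-time interaction is the term of interest. Data are synthetic and column names hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_residents, n_visits = 200, 2
shelter = pd.DataFrame({
    "resident_id": np.repeat(np.arange(n_residents), n_visits),
    "time": np.tile([0.0, 1.0], n_residents),  # years since baseline
    "polypharmacy": np.repeat(rng.integers(0, 2, n_residents), n_visits),
})
# Synthetic CPS scores with a small extra slope under polypharmacy.
shelter["cps"] = (1.5 + 0.3 * shelter["time"] * shelter["polypharmacy"]
                  + rng.normal(0, 1, len(shelter)))

# Random intercept per resident; time:polypharmacy is the interaction
# reported in the abstract (beta and 95% CI).
model = smf.mixedlm("cps ~ time * polypharmacy", data=shelter,
                    groups=shelter["resident_id"]).fit()
print(model.summary())
```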

Results

A total of 1630 (50%) residents presented with polypharmacy and 781 (24%) with excessive polypharmacy. After adjusting for potential confounders, residents on polypharmacy (beta 0.10, 95% CI 0.01–0.20) and those on excessive polypharmacy (beta 0.13, 95% CI 0.01–0.24) had a significantly greater decline in CPS score compared with those using <5 drugs. No statistically significant (P > .05) change according to polypharmacy status was shown for ADL score.

Conclusions

Polypharmacy is highly prevalent among older NH residents and, over 1 year, it is associated with worsening cognitive function but not functional decline.

15.

Background

Afterschool interventions have been found to improve the nutritional quality of snacks served. However, there is limited evidence on how these interventions affect children’s snacking behaviors.

Objective

Our aim was to determine the impact of an afterschool intervention focused at the school district, site, family, and child levels on dietary consumption of foods and beverages served at snack.

Design

This was a secondary analysis of a group-randomized controlled trial.

Participants/setting

Data were collected from 400 children at 20 afterschool sites in Boston, MA before (fall 2010) and after (spring 2011) intervention implementation.

Intervention

The Out-of-School Nutrition and Physical Activity intervention aimed to promote fruits, vegetables, whole grains, and water, while limiting sugary drinks and trans fats. Researchers worked with district foodservice staff to change snack foods and beverages. Teams of afterschool staff participated in three 3-hour learning collaborative sessions to build skills and created action plans for changing site practices. The intervention included family and child nutrition education.

Main outcome measures

Research assistants observed dietary snack consumption using a validated measure on 2 days per site at baseline and follow-up.

Statistical analyses performed

This study used multivariable regression models, accounting for clustering of observations, to assess the intervention effect, and conducted post hoc stratified analyses by foodservice type.

Results

Children in intervention sites had greater decreases in consumption of juice (−0.61 oz/snack, 95% CI −1.11 to −0.12), beverage calories (−29.1 kcal/snack, 95% CI −40.2 to −18.0), foods with trans fats (−0.12 servings/snack, 95% CI −0.19 to −0.04), and total calories (−47.7 kcal/snack, 95% CI −68.2 to −27.2), and greater increases in consumption of whole grains (0.10 servings/snack, 95% CI 0.02 to 0.18), compared with controls. In post hoc analyses, sites with on-site foodservice had significant improvements for all outcomes (P<0.001), with no effect at sites with satellite foodservice.

Conclusions

Results demonstrate that an afterschool intervention can improve children’s dietary snack consumption, particularly at sites with on-site foodservice.

16.

Background

Little is known about zinc intakes and status during complementary feeding. This is particularly true for baby-led approaches, which encourage infants to feed themselves from the start of complementary feeding, although self-feeding may restrict the intake of zinc-rich foods.

Objective

To determine the zinc intakes, sources, and biochemical zinc status of infants following Baby-Led Introduction to SolidS (BLISS), a modified version of Baby-Led Weaning (BLW), compared with traditional spoon-feeding.

Design

Secondary analysis of the BLISS randomized controlled trial.

Participants/setting

Between 2012 and 2014, 206 community-based participants from Dunedin, New Zealand were randomized to a Control or BLISS group.

Intervention

BLISS participants received eight study visits (antenatal to 9 months) providing education and support regarding BLISS (ie, infant self-feeding from 6 months with modifications to address concerns about iron, choking, and growth).

Main outcome measures

Dietary zinc intakes at 7 and 12 months (weighed 3-day diet records) and zinc status at 12 months (plasma zinc concentration).

Statistical analyses performed

Regression analyses were used to investigate differences in dietary intakes and zinc status by group, adjusted for maternal education and parity and infant age and sex.

Results

There were no significant differences in zinc intakes between BLISS and Control infants at 7 (median: 3.5 vs 3.5 mg/day; P=0.42) or 12 (4.4 vs 4.4 mg/day; P=0.86) months. Complementary food groups contributing the most zinc at 7 months were “vegetables” for Control infants, and “breads and cereals” for BLISS infants, then “dairy” for both groups at 12 months. There was no significant difference in mean±standard deviation plasma zinc concentration between the Control (62.8±9.8 μg/dL [9.6±1.5 μmol/L]) and BLISS (62.8±10.5 μg/dL [9.6±1.6 μmol/L]) groups (P=0.75).

Conclusions

BLISS infants achieved zinc intake and status similar to Control infants. However, the BLISS intervention was modified to increase iron intake, which may have improved zinc intake, so these results should not be generalized to infants following unmodified BLW.

17.

Objective

High dietary sodium intake is a risk factor for cardiovascular events and death. Recently, a J-shaped correlation between sodium intake and adverse outcomes has been shown. The evidence on the association between sodium intake and cardiovascular outcomes in the elderly is scant. The objective of this study was to evaluate the correlation between sodium intake and cardiovascular events and mortality in an elderly population, taking into account frailty status.

Design

Cohort study of community-dwelling older people enrolled in the InCHIANTI (Invecchiare in Chianti, aging in the Chianti area) study from 1998 to 2000 and followed up for 9 years.

Setting

Two communities in Tuscany, Italy.

Participants

A total of 920 participants 65 years of age and older, with 24-hour urinary sodium excretion data.

Measurements

Nine-year mortality and incident cardiovascular events were analyzed using Cox and nonlinear log-binomial models, stratified by frailty status. Sensitivity analysis in participants without hypertension and cardiovascular diseases was performed.

Results

Mean age of the population was 74.5 years (standard deviation 6.99 years); 55.4% were women. There was a bimodal association between sodium excretion and mortality, with risk increasing only below a sodium excretion of 6.25 g/d [hazard ratio (HR) 1.29, 95% confidence interval (CI) 1.20–1.38], confirmed in the adjusted model (HR 1.12, 95% CI 1.04–1.22). These results were confirmed in participants without cardiovascular diseases. After stratification by frailty phenotype, the association was stronger in frail participants (adjusted HR 1.23, 95% CI 1.02–1.50, vs HR 1.11, 95% CI 1.01–1.22 in robust participants). There was no association between 24-hour sodium excretion and the 9-year incidence of cardiovascular diseases (adjusted risk ratio 0.96, 95% CI 0.90–1.02).

Conclusions

Reduced sodium excretion is associated with increased mortality in a sample of community-dwelling older people, especially among frail participants. High levels of sodium excretion are not associated with adverse outcomes in this population; therefore, sodium restriction might not be beneficial in older people.

18.

Objectives

The distinction between dementia and mild cognitive impairment (MCI) relies upon the evaluation of independence in instrumental activities of daily living (IADL). Self- and informant reports are prone to bias. Clinician-based performance tests are limited by long administration times, restricted access, or inadequate validation. To close this gap, we developed and validated a performance-based measure of IADL, the Sydney Test of Activities of Daily Living in Memory Disorders (STAM).

Design

Prospective cohort study (Sydney Memory and Ageing Study).

Setting

Eastern Suburbs, Sydney, Australia.

Participants

554 community-dwelling individuals (54% female) aged 76 and older with normal cognition, MCI, or dementia.

Measurements

Activities of daily living were assessed with the STAM, administered by trained psychologists, and the informant-based Bayer-Activities of Daily Living Scale (B-ADL). Depressive symptoms were measured with the Geriatric Depression Scale (15-item version). Cognitive function was assessed with a comprehensive neuropsychological test battery. Consensus diagnoses of MCI and dementia were made independently of STAM scores.

Results

The STAM showed high interrater reliability (r = 0.854) and test-retest reliability (r = 0.832). It discriminated significantly between the diagnostic groups of normal cognition, MCI, and dementia, with areas under the curve ranging from 0.723 to 0.948. A score of 26.5 discriminated between dementia and nondementia with a sensitivity of 0.831 and a specificity of 0.864. Correlations were low with education (r = 0.230) and depressive symptoms (r = −0.179), moderate with the B-ADL (r = −0.332), and high with cognition (ranging from r = 0.511 to r = 0.594). The mean time to complete the STAM was 16 minutes.
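
To make the cutoff statistics concrete, the sketch below computes an AUC plus the sensitivity and specificity of a 26.5 cutoff on synthetic scores (lower = more impaired), purely as an illustration of the arithmetic; it uses no STAM data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
dementia = rng.normal(22, 4, 80)      # synthetic scores, lower = worse
nondementia = rng.normal(30, 3, 200)

scores = np.concatenate([dementia, nondementia])
labels = np.concatenate([np.ones(80), np.zeros(200)])  # 1 = dementia

# Lower scores indicate dementia, so negate scores for the ROC convention
# that higher values predict the positive class.
print("AUC:", roc_auc_score(labels, -scores))

cutoff = 26.5
sensitivity = (dementia < cutoff).mean()     # dementia correctly below cutoff
specificity = (nondementia >= cutoff).mean() # nondementia correctly at/above
print(f"sensitivity={sensitivity:.3f}, specificity={specificity:.3f}")
```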

Conclusions

The STAM has good psychometric properties. It can be used to differentiate among normal cognition, MCI, and dementia and can be a helpful tool for diagnostic classification in both clinical practice and research.

19.
20.

Objectives

In Parkinson disease (PD), sarcopenia may represent the common downstream pathway that leads from motor and nonmotor symptoms to progressive loss of resilience, frailty, and disability. Here we (1) assessed the prevalence of sarcopenia in older adults with PD using 3 different criteria and tested their agreement, and (2) evaluated the association between PD severity and sarcopenia.

Design

Cross-sectional, observational study.

Setting

Geriatric day hospital.

Participants

Older adults with idiopathic PD.

Measurements

Body composition was evaluated through dual-energy x-ray absorptiometry. Handgrip strength and walking speed were measured. Sarcopenia was operationalized according to the Foundation for the National Institutes of Health, the European Working Group on Sarcopenia in Older Persons, and the International Working Group criteria. The Cohen κ statistic was used to test the agreement among criteria.
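
As a minimal illustration of the agreement statistic, the sketch below computes Cohen's κ for two hypothetical sarcopenia classifications with scikit-learn; the labels are invented and yield a κ of about 0.66, purely for demonstration.

```python
from sklearn.metrics import cohen_kappa_score

# 1 = sarcopenic, 0 = not, per participant, under two different criteria
# (hypothetical labels, not study data).
criteria_a = [1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0]
criteria_b = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1, 0]

kappa = cohen_kappa_score(criteria_a, criteria_b)
print(f"Cohen kappa = {kappa:.2f}")  # 1 = perfect agreement, 0 = chance level
```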

Results

Among the 210 participants (mean age 73 years; 38% women), the prevalence of sarcopenia was 28.5%–40.7% in men and 17.5%–32.5% in women. The prevalence of severe sarcopenia was 16.8%–20.0% in men and 11.3%–18.8% in women. The agreement among criteria was poor. The highest agreement was obtained between the European Working Group on Sarcopenia in Older Persons (severe sarcopenia) and International Working Group criteria (κ = 0.52 in men; κ = 0.65 in women; P < .01 for both). Finally, severe sarcopenia was associated with PD severity (odds ratio 2.30; 95% confidence interval 1.15–4.58).

Conclusions

Sarcopenia is common in PD, with severe sarcopenia diagnosed in 1 in every 5 patients with PD. We found significant disagreement among the 3 criteria evaluated, more in detecting sarcopenia than in ruling it out. Finally, sarcopenia is associated with PD severity. Given its high prevalence, further studies should address the prognosis of sarcopenia in PD.
