Similar Literature
20 similar documents retrieved (search time: 406 ms)
1.
Objective: Heavy alcohol consumption can alter vitamin D status; however, the relationships between alcohol consumption and vitamin D concentrations in pregnant women have not been well studied. The aim of this study was to investigate the vitamin D status in a population of alcohol-exposed (N = 180) and low/unexposed control (N = 179) Ukrainian pregnant women.

Methods: Women who attended prenatal care facilities in 2 regions of Ukraine (Rivne and Khmelnytsky) for a routine prenatal visit were screened for the study. At the time of enrollment (20.4 ± 7.0 weeks of gestation), blood samples and alcohol consumption data (during a typical week around conception and the most recent 2 weeks) were collected. Vitamin D status was assessed by 25-hydroxyvitamin D [25(OH)D] concentrations.

Results: A high prevalence of suboptimal vitamin D status was observed in pregnant Ukrainian women. Overall, 50.1% and 33.4% of the women were classified as vitamin D deficient [25(OH)D < 20 ng/mL] or insufficient [25(OH)D ≥ 20 ng/mL and ≤ 30 ng/mL], respectively, based on 2011 Endocrine Society guidelines. Alcohol-exposed women had significantly lower 25(OH)D concentrations than low/unexposed women in spring (p = 0.006) and winter (p = 0.022). When vitamin D concentrations were grouped into sunny seasons (summer + fall) versus less sunny seasons (winter + spring), there was a significant alcohol-by-season interaction (p = 0.0028), with alcohol-drinking women having lower circulating vitamin D than low/unexposed women in seasons of low sun availability.

Conclusions: These data suggest that when vitamin D concentrations are generally low (e.g., during seasons of low sun availability), alcohol consumption during pregnancy has a negative impact on vitamin D status.


2.
Background: Myofascial pain, which has been associated with cancer and with increased morbidity and mortality in cancer patients, is intrinsically linked to low magnesium and low 25-hydroxyvitamin D (25(OH)D). This physical finding was therefore used as a clinical diagnostic proxy.

Objective: The objective of this study was to assess the association and prevalence of disease in individuals with myofascial pain and low 25(OH)D in a county with low magnesium in the drinking water.

Design: This was a retrospective cross-sectional chart review of 269 subjects, comparing those presenting with myofascial pain (assessed by tender trigger points) and 25(OH)D concentrations below 30 ng/mL, or a history of 25(OH)D deficiency, to those without these exposures.

Results: The combined exposure of low 25(OH)D levels and myofascial pain was examined in relation to all cancers, colon polyps, and tendon ruptures. The odds of having cancer with the combined exposures were 10.14 times the odds for those without either exposure (95% confidence interval [CI], 5.08–20.25; p < 0.001). For adenomatous colon polyps, the odds ratio (OR) was 7.24 (95% CI, 3.83–13.69; p < 0.001), and for tendon rupture, the OR was 8.65 (95% CI, 3.76–19.94; p < 0.001). Of 80 subjects who had both myofascial pain and 25(OH)D less than 30 ng/mL, 74 were tested for red blood cell (RBC) magnesium. Half of those subjects had RBC magnesium concentrations < 4.6 mg/dL, and 23% had levels below the reference range (4.0–6.4 mg/dL).
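Odds ratios with Wald-type confidence intervals like those quoted above can be computed directly from a 2×2 exposure-outcome table. A minimal Python sketch; the cell counts below are hypothetical (the abstract does not report them), chosen only so the point estimate lands at the reported OR of 10.14, so the illustrative CI will not match the published one:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only
or_, lo, hi = odds_ratio_ci(30, 50, 10, 169)
print(round(or_, 2))  # 10.14
```

The CI is computed on the log scale because log(OR) is approximately normal, then exponentiated back.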

Conclusion: Myofascial pain as assessed by tender trigger points and 25(OH)D deficiency showed a significant association with cancer, adenomatous colon polyps, and tendon rupture. Further studies to verify these results are needed, especially in areas where there is low magnesium in the drinking water.


3.
Aim: The aim of this study was to evaluate the effect of vitamin D supplementation in patients with type 2 diabetes mellitus (T2DM) with regard to their glycemic control and lipid profile.

Methods: One hundred subjects with T2DM were recruited and given 4500 IU/day of vitamin D for 2 months. 25-Hydroxyvitamin D [25(OH)D], fasting blood glucose (FBG), glycosylated hemoglobin A1c (HbA1c), and lipid profile were measured pre- and postsupplementation.

Results: There was a significant increase in the mean 25(OH)D level after supplementation (16 ± 5.3 ng/ml at baseline vs. 49.2 ± 17.7 ng/ml postsupplementation, p < 0.05). Both FBG and HbA1c, but not the lipid profile, were significantly decreased after supplementation. However, a univariate general linear model of 25(OH)D percentiles and lipid profile levels showed that diabetic subjects with high 25(OH)D levels (>61 ng/ml) had significantly lower total cholesterol and low-density lipoprotein cholesterol (LDL-C) than those in the low or middle percentiles. Furthermore, participants in a higher percentile had a significantly higher level of high-density lipoprotein cholesterol (HDL-C) than those in the middle percentile. Lipid profile levels were not affected by the supplement except for triglyceride (TG) levels in females, which were significantly decreased.

Conclusions: Vitamin D supplementation may be beneficial to diabetic subjects because it improved glycemic control. Diabetic subjects with high 25(OH)D levels (>61 ng/ml) had better lipid profiles.


4.
Objective: Although a significant positive association of vitamin D deficiency with coronary heart disease has been demonstrated in cross-sectional as well as prospective studies, only a few studies have examined the association of vitamin D deficiency with subclinical atherosclerosis. We examined whether vitamin D deficiency is associated with subclinical atherosclerosis, as measured by coronary artery calcification (CAC) in asymptomatic adults.

Methods: In a population-based cross-sectional study, 195 men aged 40 to 49 years without cardiovascular disease were randomly selected (98 Caucasian and 97 Japanese American men). Liquid chromatography–tandem mass spectrometry was utilized to measure serum vitamin D. CAC was examined by electron beam computed tomography using standardized protocols and read centrally at the University of Pittsburgh using Agatston's methods. To investigate an association between vitamin D deficiency (defined as 25-hydroxyvitamin D [25(OH)D] < 20 ng/mL) and CAC (defined as Agatston score ≥ 10), we utilized multivariable logistic regression models.

Results: Prevalence of CAC and vitamin D deficiency was 27.2% and 10.3%, respectively. Participants with CAC were significantly older, had significantly higher body mass index (BMI), and had higher rates of smoking. Those with CAC were 3.31 times as likely to be vitamin D deficient after adjusting for traditional cardiovascular risk factors (odds ratio [OR] = 3.31; 95% confidence interval [CI], 1.12–9.77).

Conclusions: In this population-based study of healthy middle-aged men, vitamin D deficiency had a significant positive association with the presence of CAC.


5.
Objective: The aim of this study was to evaluate possible effects of food fortification practices on vitamin D intake in adults.

Design and setting: This was designed as a cross-sectional, population-based study.

Subjects: We investigated vitamin D intake in a population-based sample of 5224 adults using a validated food frequency questionnaire. A theoretical model was constructed to evaluate the hypothetical effects of dairy product fortification.

Results: Dairy had the highest mean vitamin D intake among food groups. If all types of milk were fortified with vitamin D (42 IU/100 g of milk), the mean intake of vitamin D would reach 132 ± 148 IU/day (median 92, IQR 180). If both milk and yogurt were fortified to 42 IU/100 g and 89 IU/100 g, respectively, the average mean vitamin D intake from foods in this population would increase from 84 ± 88 IU/day to 308 ± 240 IU/day. As the fortification level increased, the proportions of young people exceeding the recommended daily allowance (RDA) of vitamin D increased from 1.1% to 77.4% in men and from 1.4% to 80% in women, but none of them reached the tolerable upper intake level (UL) of vitamin D.
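The fortification arithmetic behind these projections is simple proportional scaling: 42 IU per 100 g of milk and 89 IU per 100 g of yogurt. A sketch with hypothetical daily intake amounts:

```python
# Fortification levels taken from the text; intakes below are hypothetical
MILK_IU_PER_100G = 42
YOGURT_IU_PER_100G = 89

def fortified_vitamin_d(milk_g, yogurt_g):
    """Vitamin D (IU/day) contributed by fortified milk and yogurt."""
    return milk_g * MILK_IU_PER_100G / 100 + yogurt_g * YOGURT_IU_PER_100G / 100

# e.g. 250 g milk + 150 g yogurt per day
print(fortified_vitamin_d(250, 150))  # 238.5
```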

Conclusion: The proposed fortification scenario would provide adequate vitamin D intake, meeting the RDA, for about 80% of the population aged 18 to 50 years, with none exceeding the UL.


6.
Background: The aim of this community-based study is to ascertain the effect of different obesity phenotypes on the incidence of chronic kidney disease in Iranian adults.

Study Design: A prospective cohort study, the Tehran Lipid Glucose Study (TLGS).

Setting and Participants: Adults aged ≥ 20 years with a mean age of 40.38 years (54.8% female) who were free from chronic kidney disease (CKD) at baseline (phase 1) and were followed up at 3 time stages (phases 2, 3, and 4) for a mean duration of 9.4 years to assess the risk for CKD.

Predictor: Obesity phenotypes.

Outcome: Incidence of chronic kidney disease.

Measurements: Glomerular filtration rate (GFR) was estimated from the simplified equation developed using data from the Modification of Diet in Renal Disease (MDRD) Study.
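The simplified (4-variable) MDRD study equation is conventionally written as eGFR = 186 × SCr⁻¹·¹⁵⁴ × age⁻⁰·²⁰³ × 0.742 (if female) × 1.212 (if Black), with SCr in mg/dL. A Python sketch of this conventional form; the abstract does not state which coefficient variant the TLGS analysis used, so this is an assumption:

```python
def mdrd_gfr(scr_mg_dl, age_years, female, black=False):
    """Simplified (4-variable) MDRD eGFR in mL/min/1.73 m^2.
    Coefficients are the commonly cited 186-based variant."""
    gfr = 186.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        gfr *= 0.742
    if black:
        gfr *= 1.212
    return gfr

# A 40-year-old woman with serum creatinine 1.0 mg/dL
print(round(mdrd_gfr(1.0, 40, female=True), 1))
```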

Results: CKD events occurred in 1162 participants. The prevalence of the 2 known obesity phenotypes (metabolically obese normal weight [MONW] and metabolically healthy but obese [MHO]) in the overall population was 3.5% and 8.8%, respectively. According to Kaplan-Meier curves, rates of freedom from CKD in the MHO and MONW obesity phenotypes were 75.3% and 60.6%, respectively (p < 0.0001). Age- and sex-adjusted (model 1) hazard ratios for participants with MHO or MONW obesity phenotype were 1.14 (95% confidence interval [CI], 0.91–1.43) and 1.43 (95% CI, 1.09–1.88), respectively. After further adjustment for confounder variables (model 2), multivariate-adjusted hazard ratios for CKD for participants with MHO or MONW obesity phenotypes were 1.23 (95% CI, 0.93–1.62) and 1.43 (95% CI, 1.08–1.90), respectively.
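The freedom-from-CKD rates above come from Kaplan-Meier estimation, which multiplies conditional survival probabilities at each observed event time. A minimal sketch; the follow-up times and event indicators below are toy values, not study data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates at each distinct event time.
    times: follow-up times; events: 1 = event (e.g., incident CKD), 0 = censored."""
    s, estimates = 1.0, []
    for t in sorted(set(t for t, e in zip(times, events) if e)):
        at_risk = sum(ti >= t for ti in times)          # still under observation
        d = sum(ti == t and e for ti, e in zip(times, events))  # events at t
        s *= 1 - d / at_risk                             # conditional survival
        estimates.append((t, s))
    return estimates

print(kaplan_meier([2, 3, 3, 5, 8, 9], [1, 0, 1, 1, 0, 0]))
```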

Conclusion: Adults with the MONW obesity phenotype have a higher risk of incident CKD than those with the MHO phenotype. The results indicate that normal weight alone does not protect against incident CKD.


7.
8.
Objective: Curcumin exhibits many beneficial health-promoting characteristics. However, its poor oral absorption precludes its general use. This study assessed the bioavailability of a novel curcumin formulation compared to 95% curcumin and published results for various other curcumin formulations.

Methods: A randomized, crossover, double-blind, comparator-controlled pharmacokinetic study was performed in 12 healthy adult subjects to determine the appearance of free curcumin and its metabolites curcumin sulfate and curcumin glucuronide in plasma after a single dose of a novel proprietary curcumin liquid droplet micromicellar formulation (CLDM) and unformulated 95% curcumin powder in capsule form. An equivalent 400-mg dose of each product was administered. The 95% curcumin contained 323 mg curcumin, and the CLDM contained 64.6 mg curcumin. Blood samples were drawn and plasma was analyzed for curcumin and its 2 conjugates without enzymatic hydrolysis by liquid chromatography–tandem mass spectrometry.

Results: Plasma levels of curcumin sulfate and curcumin glucuronide after 1.5 hours from CLDM were approximately 20 and 300 ng/mL, respectively, whereas the levels for 95% curcumin were near baseline. Free curcumin reached a maximum level of 2 ng/mL for CLDM and 0.3 ng/mL for 95% curcumin at 1.5 hours. For the CLDM, a small secondary free curcumin peak occurred at 12 hours and a tertiary 1.5-ng/mL peak occurred at 24 hours. The total curcumin absorbed as represented by the area under the curve (AUC)/mg administered curcumin for CLDM was 522 times greater than for the 95% curcumin.
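The AUC used to compare total absorption is typically computed with the linear trapezoidal rule over the sampled concentration-time points, then normalized per mg of administered curcumin. A sketch with a hypothetical free-curcumin profile; the concentrations below only loosely echo the values quoted in the Results:

```python
def auc_trapezoid(times_h, conc_ng_ml):
    """Area under the concentration-time curve (ng*h/mL), linear trapezoidal rule."""
    auc = 0.0
    for i in range(1, len(times_h)):
        auc += (times_h[i] - times_h[i - 1]) * (conc_ng_ml[i] + conc_ng_ml[i - 1]) / 2
    return auc

# Hypothetical sampling times (h) and free-curcumin concentrations (ng/mL)
times = [0, 1.5, 4, 12, 24]
concs = [0.0, 2.0, 0.8, 1.0, 1.5]
auc = auc_trapezoid(times, concs)
print(auc)           # total AUC over 0-24 h
print(auc / 64.6)    # AUC per mg administered curcumin (CLDM dose from the text)
```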

Conclusions: The novel CLDM formulation facilitates absorption and produces exceedingly high plasma levels of both conjugated and total curcumin compared to 95% curcumin. A comparison of the Cmax/mg curcumin and AUC/mg of administered curcumin for CLDM with data from pharmacokinetic studies of various enhanced-absorption formulations indicates that the greatest absorption and bioavailability are produced by the novel CLDM formulation.


9.
Objective: Fractures of bones, especially forearm fractures, are very common in children and their number is increasing. This study was designed to determine the impact of vitamin D serum levels and vitamin D receptor (VDR) polymorphisms on the occurrence of low-energy fractures in children.

Methods: The study group consisted of 100 children with clinically relevant bone fractures and a control group consisted of 127 children without fractures. Total vitamin D [25(OH)D3 plus 25(OH)D2] serum concentrations were evaluated in every patient. Genotypes for 4 restriction fragment length polymorphisms of the vitamin D receptor gene (FokI, ApaI, TaqI, and BsmI) were determined by standard polymerase chain reaction–restriction fragment length polymorphism (PCR-RFLP) techniques.

Results: Differences in concentrations of vitamin D were observed between the group with bone fractures (median = 12 ng/ml) and the control group (median = 16 ng/ml; p = 0.000044).

Each unit increase in vitamin D concentration was associated with a 1.06-fold reduction in fracture risk (p = 0.0005). No impact of any particular VDR polymorphism on the occurrence of low-energy fractures in children was detected. However, there were significant differences in the prevalence of FokI polymorphism genotypes between the fracture and control groups (p = 0.05). Furthermore, the recessive “aa” genotype of the ApaI polymorphism and the dominant “TT” genotype of the TaqI polymorphism were associated with higher levels of vitamin D (p = 0.005 and p = 0.036, respectively).

Conclusions: Vitamin D deficiency is an independent risk factor for fractures in children. The recessive “aa” genotype of the ApaI polymorphism and the dominant “TT” genotype of the TaqI polymorphism are associated with higher serum vitamin D levels.


10.
Objective: In recent years, the welfare of workers and the prevention of chronic disabling diseases have become topics of great interest. This study investigates serum levels of total 25-hydroxyvitamin D (25(OH)D) in a cohort of overweight–obese and insulin-resistant northern Italian indoor workers in apparent good health who followed a nutritional education program.

Methods: An observational cross-sectional study on 385 patients (females = 291, males = 94), age range 18–69 years and body mass index (BMI) > 25 kg/m2, was performed at the Department of Occupational Medicine, Milan, Italy (latitude 45.465454 N). We evaluated nutritional intakes, occupational and leisure physical activity, anthropometric measurements, impedance evaluation, blood pressure, and the presence of metabolic syndrome (MetS) and nonalcoholic fatty liver disease (NAFLD) by fatty liver index (FLI). Hematologic and biochemical parameters and 25(OH)D levels were evaluated from fasting blood samples.

Results: Only 10.91% of subjects had optimal 25(OH)D values; of the remaining 89.09%, 17.40% were severely deficient, with no gender difference, and vitamin D intake was insufficient. Only 28% reported leisure physical activity; 39.48% had metabolic syndrome and 62.60% had an FLI > 30. An inverse relationship between 25(OH)D levels and BMI was found, with a significant reduction of total 25(OH)D serum concentrations in winter. The homeostasis model assessment–insulin resistance (HOMA-IR) was positively related to BMI and inversely related to 25(OH)D concentrations. A positive correlation between vitamin D and leisure physical activity was found. On univariate analysis adjusted for age, gender, and BMI, an inverse relationship between vitamin D and FLI was observed in both genders. The correlation between 25(OH)D levels, inflammation markers, BMI, and FLI indicated an increased risk of cardiovascular disease in this cohort of workers.

Conclusion: Our results suggest the rationale for a large-scale vitamin D screening program, followed by easily implementable, low-cost preventive supplementation.


11.
Objectives: More than one-third of hospitalized patients have hyperglycemia. Despite evidence that improving glycemic control leads to better outcomes, achieving recognized targets remains a challenge. The objective of this study was to evaluate the implementation of a computerized insulin order set and titration algorithm on rates of hypoglycemia and overall inpatient glycemic control.

Methods: This was a prospective observational study evaluating the impact of a glycemic order set and titration algorithm on non–critical care medical and surgical inpatients at an academic medical center. The initial intervention was hospital-wide implementation of a comprehensive insulin order set. The secondary intervention was initiation of an insulin titration algorithm in two pilot medicine inpatient units. Point-of-care blood glucose testing reports were analyzed, including rates of hypoglycemia (BG < 70 mg/dL) and hyperglycemia (BG > 200 mg/dL in phase 1; BG > 180 mg/dL in phase 2).

Results: In the first phase of the study, implementation of the insulin order set was associated with decreased rates of hypoglycemia (1.92% vs 1.61%; p < 0.001) and increased rates of hyperglycemia (24.02% vs 27.27%; p < 0.001) from 2010 to 2011. In the second phase, addition of a titration algorithm was associated with decreased rates of hypoglycemia (2.57% vs 1.82%; p = 0.039) and increased rates of hyperglycemia (31.76% vs 41.33%; p < 0.001) from 2012 to 2013.

Conclusions: A comprehensive computerized insulin order set and titration algorithm significantly decreased rates of hypoglycemia. This significant reduction in hypoglycemia was associated with increased rates of hyperglycemia. Hardwiring the algorithm into the electronic medical record may foster adoption.


12.
Background: The aim of this study was to assess the relationship between admission serum phosphate levels and in-hospital mortality in all hospitalized patients.

Methods: All adult hospitalized patients who had admission serum phosphate available between years 2009 and 2013 were enrolled. Admission serum phosphate was categorized based on its distribution into six groups (<2.5, 2.5–3.0, 3.1–3.6, 3.7–4.2, 4.3–4.8 and ≥4.9 mg/dL). The odds ratio (OR) of in-hospital mortality by admission serum phosphate, using the phosphate category of 3.1–3.6 mg/dL as the reference group, was obtained by logistic regression analysis.

Results: 42,336 patients were studied. The lowest incidence of in-hospital mortality was associated with a serum phosphate within 3.1–4.2 mg/dL. A U-shaped curve emerged demonstrating higher in-hospital mortality associated with both serum phosphate <3.1 and >4.2 mg/dL. After adjusting for potential confounders, serum phosphate <2.5 and >4.2 mg/dL remained associated with in-hospital mortality, with ORs of 1.60 (95% CI 1.25–2.05), 1.60 (95% CI 1.29–1.97), and 3.89 (95% CI 3.20–4.74) for serum phosphate <2.5, 4.3–4.8, and ≥4.9 mg/dL, respectively. Among subgroups of patients with chronic kidney disease (CKD) and cardiovascular disease (CVD), the highest mortality was associated with a serum phosphate ≥4.9 mg/dL, with ORs of 4.11 (95% CI 3.16–5.39) in CKD patients and 5.11 (95% CI 3.33–7.95) in CVD patients.

Conclusion: Admission serum phosphate <2.5 and >4.2 mg/dL is associated with an increased risk of in-hospital mortality in hospitalized patients. The highest mortality risk is seen in CKD and CVD patients with admission hyperphosphatemia.


13.
Objective: This preliminary study compared a DSM-IV-TR screening tool for posttraumatic stress symptoms (PTSS) with a modified DSM-5 version for parents of children diagnosed with cancer.

Methods: Caregivers (n = 101) completed the Brief Symptom Inventory (BSI) and Impact of Event Scale-Revised (IES-R). Five BSI items were added to the IES-R to assess whether caregivers met DSM-5 specific posttraumatic stress disorder criteria.

Results: Chi-square analysis revealed three groups: caregivers who (1) did not meet screening criteria for DSM-IV-TR or DSM-5; (2) only met DSM-IV-TR criteria; and (3) met criteria for both DSM-IV-TR and DSM-5, χ²(1, n = 101) = 64.47, p < 0.001. Subgroup 2 had lower overall PTSS than subgroup 3 (p < 0.001), but more than subgroup 1 (p < 0.001).
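The chi-square statistic reported here is the standard Pearson statistic, Σ(O − E)²/E summed over the cells of a contingency table. A sketch with hypothetical 2×2 counts (the abstract does not give the underlying table):

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for row, rt in zip(table, row_totals):
        for obs, ct in zip(row, col_totals):
            expected = rt * ct / n          # expected count under independence
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 counts, for illustration only
print(round(chi_square([[40, 10], [15, 36]]), 2))
```

The p-value would then come from the chi-square distribution with (r−1)(c−1) degrees of freedom.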

Conclusions: A “gap group” evidenced elevated PTSS but did not meet DSM-5 screening criteria. Further research is needed to clarify the prevalence and composition of PTSS among caregivers, and evaluate the clinical implications of the changes in diagnostic criteria.


14.
Objective: The aim of this study was to investigate whether short-term exposure to a Mediterranean diet during a structured abroad experience could influence dietary habits and attitudes.

Design: This study used a cross-sectional design.

Setting: The study was conducted on the Florence University of the Arts (FUA) campus, Italy.

Subjects: Fifty-four (47 females, 7 males; mean age 21.1 ± 1.9 years) college students from 12 different states, mainly located in the central United States, were enrolled in this study.

Measures of Outcome: Outcome measures included adherence score to Mediterranean diet and self-reported perceptions of diet and food availability. A demographic survey was used to collect data regarding personal characteristics, anthropometrics, duration of stay, and residency status.

Analysis: Chi-square test, independent T-test, and Mann-Whitney test were used to perform analyses.

Results: At 3 weeks' follow-up, 94% of the population reported that the availability of foods affected their food choices. Interestingly, students reported that they consumed less meat relative to their usual dietary habits in the United States (p < 0.0001) and reported significantly increased consumption of olive oil, cereals, fruit, and alcohol (p < 0.05). The adherence score to a Mediterranean diet significantly increased by about 1 point, from 9.9 ± 2.4 to 10.9 ± 2.0 (p < 0.05).

Conclusions: After a 3-week stay in Italy, an increase in the adherence score to a Mediterranean diet was observed. Future research should explore the relationship between length of time spent in a foreign country and dietary adherence in a cultural context.


15.
Objectives: There is a paucity of information on the prevalence and clinical implications of malnutrition in patients hospitalised for management of acute exacerbations of chronic obstructive pulmonary disease (AECOPD). This study aimed to fill this gap in knowledge.

Methods: We performed a retrospective observational cohort study of 100 hospitalised AECOPD patients. The Malnutrition Screening Tool (MST) was used to identify patients at risk of malnutrition (MST ≥2). Patient characteristics, length of stay, readmission rate, 12-month survival and overall survival were collected using a proforma.

Results: MST scores were available for 90 patients, of whom 22% had an MST score of ≥2. There were no significant differences in COPD severity, treatment received, or biochemical parameters between patients ‘at risk of malnutrition’ and those ‘not at risk of malnutrition’. Length of hospital stay was longer in patients ‘at risk of malnutrition’ (median (IQR): 3.5 (2–7.5) vs. 3.0 (1–5), p = 0.048). Overall survival was significantly reduced in patients ‘at risk of malnutrition’ compared to those ‘not at risk of malnutrition’ (337 ± 245 vs. 670 ± 292, p < 0.001).

Conclusions: Using the MST, we found that one-fifth of our hospitalised AECOPD patients were ‘at risk of malnutrition’. Moreover, this cohort had worse outcomes both during and beyond hospitalisation compared to patients ‘not at risk of malnutrition’. Our study illustrates the need for routine malnutrition screening of hospitalised AECOPD patients, with implications for potentially reducing morbidity and mortality in COPD.


16.
Background: The ADA 2010 guidelines added HbA1c ≥ 6.5% as a criterion for diagnosing diabetes mellitus type 2.

Objective: To evaluate the HbA1c test in predicting type 2 diabetes in a high risk population.

Methods: A community-based historic cohort study was conducted including 10,201 patients who had not been diagnosed with diabetes and who underwent HbA1c testing during the years 2002–2005. Data were retrieved on diabetes risk factors and the onset of diabetes (according to the ADA 2003 criteria) during a follow-up period of five to eight years.

Results: Mean age was 58.25 ± 15.58 years; mean HbA1c level was 5.59 ± 0.55% and 76.8% had a BMI > 25 kg/m2 (mean: 30.74 ± 8.30). In a Cox proportional hazards regression model, the risk of developing type 2 diabetes was 2.49 (95% CI: 1.29–3.71) for 5.5% ≤ HbA1c < 6% at baseline, 4.82 (95% CI: 2.83–8.20) for 6% ≤ HbA1c < 6.5% at baseline and 7.57 (95% CI: 4.43–12.93) for 6.5% ≤ HbA1c < 7% at baseline, compared to HbA1c < 4.5%. The risk of developing diabetes was 1.14 (95% CI: 1.05–1.25) for male gender, 1.16 (95% CI: 1.04–1.28) for cardiovascular diseases and 2.06 (95% CI: 1.80–2.35) for overweight (BMI > 25 kg/m2) at baseline. Neither age nor low socio-economic status was associated with increased risk of diabetes.

Conclusion: Levels of HbA1c ≥ 5.5% were associated with increased risk of type 2 diabetes during a five-to-eight-year follow-up period. Findings support the use of HbA1c testing as a screening tool in populations at risk of developing diabetes.


17.
Background: Safety protocols are usually neglected in most matchstick factories, rendering laborers prone to various occupational hazards.

Objective: The present study highlights DNA damage among matchstick factory workers (n = 92) against a control group (n = 48) of healthy individuals.

Methods: Genotoxicity was measured in peripheral blood lymphocytes of the test subjects using a Single Cell Gel Electrophoresis assay (SCGE/comet assay).

Results: Our results substantiate a high Total Comet Score (TCS) for factory workers (74.5 ± 47.0) compared to the control group (53.0 ± 25.0) (P ≤ 0.001). Age and duration of occupational exposure had no significant effect (P > 0.05) on TCS values. As for job function, the TCS value was greatest in sweepers (91.0 ± 56.1) and lowest in box-making operators (26.0 ± 25.0), indicating that waste disposal poses the highest risk of DNA damage.

Conclusions: Our study corroborates that matchstick chemicals can potentially damage the DNA of exposed subjects.


18.
Objectives: Chronic obstructive pulmonary disease (COPD), and especially acute exacerbations of COPD (AECOPD), is associated with increased cardiovascular mortality, including sudden cardiac death. Previous studies have reported that ECG abnormalities are common in stable COPD patients. However, the prognostic utility of an ECG taken at the time of AECOPD is not known. In this study, we sought to address this gap in knowledge concerning ECG parameters at the time of AECOPD and overall survival.

Methods: We conducted a retrospective cohort study of patients admitted to our institution with a primary diagnosis of AECOPD. Standard 12-lead ECG obtained at the time of initial presentation was evaluated. The primary outcome was overall survival.

Results: Two hundred and eleven AECOPD patients were considered for the study. Death had occurred in 42 (20%) patients at follow-up. Among the different ECG parameters evaluated, QT dispersion (QTD) and corrected QT dispersion (QTcD) were significantly associated with increased mortality. Receiver operating characteristic analysis identified that QTcD > 48 msec had a sensitivity of 90% and specificity of 55% in predicting death, and QTcD > 48 msec was also associated with worse overall survival in months (mean ± SD: 26 ± 1.0 vs. 30 ± 0.7, p = 0.001).
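Sensitivity and specificity at the QTcD cutoff follow directly from dichotomizing the measurements against the outcome. A minimal sketch; the QTcD values and vital-status indicators below are hypothetical, not study data:

```python
def sens_spec(values, outcomes, threshold):
    """Sensitivity and specificity of the rule 'value > threshold'
    for a binary outcome (1 = event, 0 = no event)."""
    tp = sum(v > threshold and o for v, o in zip(values, outcomes))
    fn = sum(v <= threshold and o for v, o in zip(values, outcomes))
    tn = sum(v <= threshold and not o for v, o in zip(values, outcomes))
    fp = sum(v > threshold and not o for v, o in zip(values, outcomes))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical QTcD values (msec) and death indicators
qtcd = [55, 60, 45, 70, 40, 52, 38, 65]
died = [1, 1, 0, 1, 0, 1, 0, 0]
print(sens_spec(qtcd, died, 48))  # (1.0, 0.75)
```

An ROC curve is traced by sweeping the threshold across the observed values and plotting sensitivity against 1 − specificity.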

Conclusion: QTcD ≥ 48 msec is associated with increased mortality. Further research is required to better understand this association and to identify reversible factors that, if appropriately addressed, may ultimately improve the prognosis of patients with COPD.


19.
Objective: KAMUT khorasan is an ancient grain with widely acclaimed health benefits. The aim of this study was to investigate the effects of a replacement diet with ancient khorasan wheat products in patients with nonalcoholic fatty liver disease (NAFLD), in comparison to a similar replacement diet with control products made from organic semi-whole-grain modern wheat.

Methods: Forty NAFLD patients (12 M/28 F; age 55.2 ± 10.4 years) with mild to moderate liver steatosis were included. The experimental design was a randomized, double-blind, parallel-arm study with 20 participants assigned to consume either KAMUT khorasan or control wheat products (pasta, bread, crackers, biscuits) over a 3-month period. Anthropometric measurements, blood analyses, and ultrasonography examination were performed at both the beginning and end of each dietary intervention.

Results: After the implementation of a general linear model for repeated measurements adjusted for baseline demographic details, risk factors, and medication, alanine aminotransferase (ALT) was significantly reduced by 12%, aspartate aminotransferase (AST) by 14%, alkaline phosphatase (ALP) by 8%, and cholesterol by 6% only in the khorasan group (p < 0.05 for all). Similarly, significant reductions in circulating proinflammatory tumor necrosis factor-alpha by 50%, interleukin-1 receptor antagonist by 37%, interleukin-8 by 24%, and interferon gamma by 24% were evident only in participants who consumed the khorasan products (p < 0.05 for all). Finally, significant improvements in the liver steatosis grading, Doppler perfusion index values, and reactive oxygen species (ROS) production were evident after consumption of both the khorasan and control products.

Conclusions: This study suggests that a short-term replacement diet with ancient KAMUT khorasan products is most effective in reducing metabolic risk factors and ameliorating the liver profile in patients with NAFLD.


20.
Background: Intakes of ready-to-eat cereal (RTEC) have been inversely associated with risk factors of chronic diseases such as cardiovascular disease (CVD), type 2 diabetes, and certain cancers; however, their relations with total and cause-specific mortality remain unclear.

Objective: To prospectively assess the associations of RTEC intakes with all causes and disease-specific mortality risk.

Design: The study included 367,442 participants from the prospective National Institutes of Health (NIH)–AARP Diet and Health Study. Intakes of RTEC were assessed at baseline.

Results: Over an average of 14 years of follow-up, 46,067 deaths were documented. Consumption of RTEC was significantly associated with reduced risk of all-cause mortality and of death from CVD, diabetes, all cancer, and digestive cancer (all p for trend < 0.05). In multivariate models, compared to nonconsumers of RTEC, those in the highest RTEC intake category had a 15% lower risk of all-cause mortality and a 10%–30% lower risk of disease-specific mortality. Among RTEC consumers, total fiber intakes were associated with reduced risk of all-cause mortality and of death from CVD, all cancer, digestive cancer, and respiratory disease (all p for trend < 0.005).

Conclusions: Consumption of RTEC was associated with reduced risk of all-cause mortality and mortality from specific diseases such as CVD, diabetes, and cancer. This association may be mediated via greater fiber intake.



Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号