Similar Articles
20 similar articles found.
1.
Objective: To determine the risk of long-term major adverse cardiovascular events (MACE) when sleep-disordered breathing (SDB) and decreased cardiorespiratory fitness (CRF) co-occur. Methods: We included consecutive patients who underwent symptom-limited cardiopulmonary exercise tests between January 1, 2005, and January 1, 2010, followed by first-time diagnostic polysomnography within 6 months. Patients were stratified based on the presence of moderate-to-severe SDB (apnea/hypopnea index ≥15 per hour) and decreased CRF, defined as <70% predicted peak oxygen consumption (VO2). Long-term MACE was a composite outcome of myocardial infarction (MI), coronary artery bypass graft (CABG), percutaneous coronary intervention (PCI), stroke or transient ischemic attack (TIA), and death, assessed until May 21, 2018. Cox proportional hazards models were adjusted for factors known to influence CRF and MACE. Results: Of 498 included patients (60±13 years, 28.1% female), 175 (35%) had MACE (MI=17, PCI=14, CABG=13, stroke=20, TIA=12, deaths=99) at a median follow-up of 8.7 years (interquartile range, 6.5 to 10.3 years). After adjusting for age, sex, beta blockers, systemic hypertension, diabetes mellitus, coronary artery disease, cardiac arrhythmia, chronic obstructive pulmonary disease, smoking, and use of positive airway pressure (PAP), decreased CRF alone (hazard ratio [HR]=1.91; 95% confidence interval [CI], 1.15 to 3.18; P=.01), but not SDB alone (HR=1.26; 95% CI, 0.75 to 2.13; P=.39), was associated with increased risk of MACE. Those with SDB and decreased CRF had greater risk of MACE compared with patients with decreased CRF alone (HR=1.85; 95% CI, 1.21 to 2.84; P<.005) after accounting for these confounders. The risk of MACE was attenuated in those with reduced CRF alone after additionally adjusting for adequate adherence to PAP (HR=1.59; 95% CI, 0.77 to 3.31; P=.21). Conclusion: The incidence of MACE, especially mortality, was high in this sample. Moderate-to-severe SDB with concurrent decreased CRF was associated with higher risk of MACE than decreased CRF alone. These results highlight the importance of possibly including CRF in the risk assessment of patients with SDB and, conversely, of screening for SDB in patients with low peak VO2.
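The adjusted hazard ratios above come from a multivariable Cox proportional hazards model. The sketch below shows the general shape of such an analysis using the lifelines library on simulated data; the variable names, effect sizes, and sample are illustrative assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "low_crf": rng.integers(0, 2, n),          # peak VO2 < 70% predicted (toy)
    "sdb": rng.integers(0, 2, n),              # AHI >= 15 per hour (toy)
    "age": rng.normal(60, 13, n),
})
# Simulated event times whose hazard rises with the covariates, then censoring.
hazard = 0.05 * np.exp(0.6 * df["low_crf"] + 0.3 * df["sdb"] + 0.02 * (df["age"] - 60))
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(0, 12, n)
df["years"] = np.minimum(event_time, censor_time)
df["mace"] = (event_time <= censor_time).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="mace")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%", "p"]])
```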

2.
Objective: To determine population-attributable risk (PAR) and exposure impact number (EIN) for mortality associated with impaired cardiorespiratory fitness (CRF), physical inactivity, and other risk markers among veteran subjects. Methods: The sample included 5890 male subjects (mean age, 58±15 years) who underwent a maximal exercise test for clinical reasons between January 1, 1992, and December 31, 2014. All-cause mortality was the end point. Cox multivariable hazard models were performed to determine clinical, demographic, and exercise-test determinants of mortality. Population-attributable risks and EIN for the lowest quartile of CRF and for inactive behavior were analyzed, accounting for competing events. Results: There were 2728 deaths during a mean ± standard deviation follow-up period of 9.9±5.8 years. Having low CRF (<5.0 metabolic equivalents [METs]) was associated with an approximately 3-fold higher risk of mortality and a PAR of 12.9%. Each higher MET achieved on the treadmill was associated with a 15% reduction in mortality (hazard ratio [HR]=0.85; 95% confidence interval [CI], 0.83 to 0.88; P<.001). Nearly half the sample was inactive, and these subjects had a 23% higher mortality risk and a PAR of 8.8%. The least-fit quartile (<5.0 METs) had a relative risk of ≈6.0 compared with the most-fit group (HR=5.99; 95% CI, 4.9 to 7.3). The least-active tertile had an ≈2-fold higher risk of mortality vs the most active subjects (HR=1.9; 95% CI, 0.91 to 4.1). The lowest EIN was observed for low fitness (3.8; 95% CI, 3.4 to 4.3; P<.001), followed by diabetes, smoking, hypertension, and physical inactivity (all P<.001 except for diabetes, P=.008). Conclusion: Both higher CRF and physical activity provide protection against all-cause mortality in subjects referred for exercise testing for clinical reasons. Encouraging physical activity with the aim of increasing CRF would have a significant impact on reducing mortality.
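For orientation, the two headline metrics can be written down directly. The snippet below uses Levin's formula for population-attributable risk and one common definition of the exposure impact number (the number of exposed persons among whom one excess event is attributable to the exposure); the input numbers are invented for illustration and are not the study's data.

```python
def population_attributable_risk(p_exposed: float, rr: float) -> float:
    """Levin's formula: fraction of all deaths attributable to the exposure."""
    return p_exposed * (rr - 1.0) / (1.0 + p_exposed * (rr - 1.0))

def exposure_impact_number(risk_exposed: float, risk_unexposed: float) -> float:
    """Number of exposed subjects among whom one excess death is attributable."""
    return 1.0 / (risk_exposed - risk_unexposed)

# Illustrative inputs: 25% of the cohort in the lowest-fitness quartile, relative risk 1.6.
print(round(population_attributable_risk(0.25, 1.6), 3))   # ~0.13
print(round(exposure_impact_number(0.20, 0.10), 1))        # 10.0 exposed per excess death
```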

3.
Objective: To assess long-term survival with repeat coronary artery bypass grafting (RCABG) or percutaneous coronary intervention (PCI) in patients with previous CABG. Methods: From January 1, 2000, through December 31, 2013, 1612 Mayo Clinic patients underwent RCABG (n=215) or PCI (n=1397) after previous CABG. The RCABG cohort was grouped by use of saphenous vein grafts only (n=75) or additional arterial grafts (n=140); the PCI cohort was grouped by stent type into bare metal stents (BMS; n=628) or drug-eluting stents (DES; n=769), and by treated target into native coronary artery (n=943), bypass grafts only (n=338), or both (n=116). Multivariable regression and propensity score analysis (n=280 matched patients) were used. Results: In multivariable analysis, 30-day mortality was increased in RCABG versus PCI patients (hazard ratio [HR], 5.32; 95% CI, 2.34-12.08; P<.001), but overall survival after 30 days improved with RCABG (HR, 0.72; 95% CI, 0.55-0.94; P=.01). Internal mammary arteries were used in 61% (129 of 215) of RCABG patients and improved survival (HR, 0.82; 95% CI, 0.69-0.98; P=.03). Patients treated with drug-eluting stents had better 10-year survival (HR, 0.74; 95% CI, 0.59-0.91; P=.001) than those with bare metal stents alone. In matched patients, RCABG had improved late survival over PCI: 48% vs 33% (HR, 0.57; 95% CI, 0.35-0.91; P=.02). Compared with RCABG, patients with PCI involving bypass grafts (n=60) had increased late mortality (HR, 1.62; 95% CI, 1.10-2.37; P=.01), whereas those having PCI of native coronary arteries (n=80) did not (HR, 1.09; 95% CI, 0.75-1.59; P=.65). Conclusion: RCABG is associated with improved long-term survival after previous CABG, especially compared with PCI involving bypass grafts.
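The matched comparison above rests on propensity score analysis. Below is a rough, generic sketch of 1:1 nearest-neighbor propensity-score matching with scikit-learn on simulated data; the covariates, labels, and matching details are illustrative assumptions rather than the study's actual procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 1000
covariates = rng.normal(size=(n, 3))                 # toy baseline covariates
rcabg = rng.integers(0, 2, n).astype(bool)           # True = RCABG, False = PCI (toy labels)

# Propensity score: probability of RCABG given the covariates.
ps = LogisticRegression().fit(covariates, rcabg).predict_proba(covariates)[:, 1]

# Match each RCABG patient to the PCI patient with the closest propensity score.
pci_idx = np.flatnonzero(~rcabg)
nn = NearestNeighbors(n_neighbors=1).fit(ps[~rcabg].reshape(-1, 1))
_, match = nn.kneighbors(ps[rcabg].reshape(-1, 1))
matched_pci = pci_idx[match.ravel()]
print(f"{rcabg.sum()} RCABG patients matched to {len(matched_pci)} PCI controls")
```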

4.
Objective: To comparatively assess the natural history of patients of different ages undergoing transcatheter aortic valve replacement (TAVR). Patients and Methods: For this study, we used the YOUNG TAVR registry, an international, multicenter registry investigating mortality trends up to 2 years in patients with aortic valve stenosis treated by TAVR, classified according to 3 prespecified age groups: 75 years or younger (n=179), 76 to 86 years (n=602), and older than 86 years (n=221). A total of 1002 patients undergoing TAVR were included. Demographic, clinical, and outcome data in the youngest group were compared with those of patients 76 to 86 years and older than 86 years. Patients were followed up for up to 2 years. Results: Compared with patients 75 years or younger (reference group), patients aged 76 to 86 years and older than 86 years had nonsignificantly different 30-day mortality (odds ratio, 0.76; 95% CI, 0.41-1.38; P=.37 and odds ratio, 1.27; 95% CI, 0.62-2.60; P=.51, respectively) and 1-year mortality (hazard ratio [HR], 0.72; 95% CI, 0.48-1.09; P=.12 and HR, 1.11; 95% CI, 0.88-1.40; P=.34, respectively). Mortality at 2 years was significantly lower among patients aged 76 to 86 years (HR, 0.62; 95% CI, 0.42-0.90; P=.01) but not among the older group (HR, 1.06; 95% CI, 0.68-1.67; P=.79). The Society of Thoracic Surgeons 30-day mortality score was lower in younger patients, who, however, had a significantly higher prevalence of chronic obstructive pulmonary disease (P=.005 vs the intermediate group and P=.02 vs the older group) and bicuspid aortic valves (P=.02 vs both older groups), larger left ventricles, and lower ejection fractions. Conclusion: In the present registry, mortality at 2 years after TAVR among patients 75 years or younger was higher compared with that of patients aged 76 to 86 years and was not markedly different from that of patients older than 86 years. The findings are attributable at least in part to a greater burden of comorbidities in the younger age group that are not entirely captured by current risk assessment tools.

5.
Objective: To evaluate the risks of recurrent stroke and major bleeding events with clopidogrel and aspirin use among patients aged 80 years or older. Patients and Methods: This retrospective cohort study was conducted using the Full Population Data of the Health and Welfare Database in Taiwan. Patients aged 80 years or older who received monotherapy with clopidogrel or aspirin following hospitalization for primary acute ischemic stroke between January 1, 2009, and December 31, 2018, were included. Inverse probability of treatment weighting was used to balance measured covariates between clopidogrel and aspirin users. Measured outcomes included recurrent acute ischemic stroke, acute myocardial infarction, composite cardiovascular events (recurrent stroke or acute myocardial infarction), intracranial hemorrhage, major gastrointestinal tract bleeding, and composite major bleeding events (intracranial hemorrhage or major gastrointestinal tract bleeding). Results: A total of 15,045 patients were included in the study, 1979 of whom used clopidogrel and 13,066 of whom used aspirin following hospitalization for primary acute ischemic stroke. Clopidogrel use was associated with significantly lower risk of recurrent acute ischemic stroke (hazard ratio [HR], 0.89; 95% CI, 0.83 to 0.96; P=.002), composite cardiovascular events (HR, 0.88; 95% CI, 0.82 to 0.95; P<.001), intracranial hemorrhage (HR, 0.71; 95% CI, 0.56 to 0.90; P=.005), and composite major bleeding events (HR, 0.89; 95% CI, 0.80 to 0.99; P=.04) compared with aspirin use. Conclusion: In patients aged 80 years or older with primary acute ischemic stroke, clopidogrel users had lower risks of recurrent stroke and composite cardiovascular events compared with aspirin users. Clopidogrel users also had lower risks of intracranial hemorrhage and composite major bleeding events compared with aspirin users.
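Inverse probability of treatment weighting (IPTW), the balancing method named above, reweights each patient by the inverse of the probability of receiving the treatment they actually received. A minimal sketch on simulated data follows; the covariates and treatment labels are placeholders, not the Taiwanese database.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 2000
covariates = rng.normal(size=(n, 4))                 # toy measured covariates
clopidogrel = rng.integers(0, 2, n)                  # 1 = clopidogrel, 0 = aspirin (toy)

# Propensity score for clopidogrel, then ATE-style weights:
# treated patients weighted by 1/ps, controls by 1/(1 - ps).
ps = LogisticRegression().fit(covariates, clopidogrel).predict_proba(covariates)[:, 1]
weights = np.where(clopidogrel == 1, 1.0 / ps, 1.0 / (1.0 - ps))

# After weighting, covariate means should be similar in the two groups.
treated = clopidogrel == 1
diff = (np.average(covariates[treated], axis=0, weights=weights[treated])
        - np.average(covariates[~treated], axis=0, weights=weights[~treated]))
print(np.round(diff, 3))
```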

6.
Objective: To determine whether fitness could improve mortality risk stratification among older adults compared with cardiovascular disease (CVD) risk factors. Methods: We examined 6509 patients 70 years of age and older without CVD from the Henry Ford ExercIse Testing Project (FIT Project) cohort. Patients performed a physician-referred treadmill stress test between 1991 and 2009. Traditional categorical CVD risk factors (hypertension, hyperlipidemia, diabetes, and smoking) were summed from 0 to 3 or more. Fitness was grouped as low, moderate, and high (<6, 6 to 9.9, and ≥10 metabolic equivalents of task). All-cause mortality was ascertained through US Social Security Death Master files. We calculated age-adjusted mortality rates, multivariable adjusted Cox proportional hazards, and Kaplan-Meier survival models. Results: Patients had a mean age of 75±4 years, and 3385 (52%) were women; during a mean follow-up of 9.4 years, there were 2526 deaths. A higher fitness level (P<.001), not lower CVD risk factor burden (P=.31), was associated with longer survival. The age-adjusted mortality rate per 1000 person-years was 56.7 for patients with low fitness and 0 risk factors compared with 24.9 for high fitness and 3 or more risk factors. Among patients with 3 or more risk factors, the adjusted mortality hazard was 0.68 (95% CI, 0.61 to 0.76) for moderate and 0.51 (95% CI, 0.44 to 0.60) for high fitness compared with the least fit. Conclusion: Among persons aged 70 years and older, there was no significant difference in survival of patients with 0 vs 3 or more risk factors, but a higher fitness level identified older persons with good long-term survival regardless of CVD risk factor burden.
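Mortality here is expressed per 1000 person-years, i.e., deaths divided by accumulated follow-up time. The toy calculation below shows only how that unit is formed (before any age adjustment); the counts are invented, not FIT Project data.

```python
# Crude mortality rate per 1000 person-years from invented counts.
deaths = 120
person_years = 4800.0                       # e.g., 500 patients followed ~9.6 years each
rate = 1000.0 * deaths / person_years
print(f"{rate:.1f} deaths per 1000 person-years")   # 25.0
```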

7.
Objective: To investigate the management strategies, temporal trends, and clinical outcomes of patients with a history of coronary artery bypass graft (CABG) surgery presenting with acute myocardial infarction (MI). Patients and Methods: We undertook a retrospective cohort study using the National Inpatient Sample database from the United States (January 2004 to September 2015), identified all inpatient MI admissions (7,250,768 records), and stratified them according to history of CABG (group 1, CABG-naive [94%]; group 2, prior CABG [6%]). Results: Patients in group 2 were older, less likely to be female, had more comorbidities, and were more likely to present with non-ST-elevation myocardial infarction compared with group 1. More patients underwent coronary angiography (68% vs 48%) and percutaneous coronary intervention (PCI) (44% vs 26%) in group 1 compared with group 2. Following multivariable logistic regression analyses, the adjusted odds ratios (ORs) for group 2 relative to group 1 were similar for in-hospital major adverse cardiovascular and cerebrovascular events (OR, 0.98; 95% CI, 0.95 to 1.005; P=.11), all-cause mortality (OR, 1.00; 95% CI, 0.98 to 1.04; P=.6), and major bleeding (OR, 0.99; 95% CI, 0.94 to 1.03; P=.54). Lower adjusted odds of in-hospital major adverse cardiovascular and cerebrovascular events (OR, 0.64; 95% CI, 0.57 to 0.72; P<.001), all-cause mortality (OR, 0.45; 95% CI, 0.38 to 0.53; P<.001), and acute ischemic stroke (OR, 0.71; 95% CI, 0.59 to 0.86; P<.001) were observed in group 2 patients who underwent PCI compared with those managed medically, without any increased risk of major bleeding (OR, 1.08; 95% CI, 0.94 to 1.23; P=.26). Conclusions: In this national cohort, MI patients with prior CABG had a higher risk profile but similar in-hospital adverse outcomes compared with CABG-naive patients. Prior-CABG patients who received PCI had better in-hospital clinical outcomes compared with those who received medical management.
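The adjusted odds ratios above come from multivariable logistic regression: exponentiating a fitted coefficient gives the OR for that variable holding the others fixed. A hedged sketch with statsmodels on simulated data follows; the variables and effect sizes are invented, not National Inpatient Sample results.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
prior_cabg = rng.integers(0, 2, n)                    # toy exposure indicator
age_decade = rng.normal(0.0, 1.0, n)                  # toy covariate (scaled age)
logit = -2.0 + 0.0 * prior_cabg + 0.3 * age_decade    # true OR for prior CABG set to 1.0
died = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([prior_cabg, age_decade]))
fit = sm.Logit(died, X).fit(disp=False)
or_est = np.exp(fit.params[1])
lo, hi = np.exp(fit.conf_int()[1])
print(f"adjusted OR for prior CABG: {or_est:.2f} (95% CI, {lo:.2f} to {hi:.2f})")
```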

8.
Objective: To study the utility of artificial intelligence (AI)–enabled electrocardiograms (ECGs) in patients with Graves disease (GD) in identifying patients at high risk of atrial fibrillation (AF) and heart failure with reduced ejection fraction (HFrEF), and to study whether AI-ECG can reflect hormonal changes and the resulting menstrual changes in GD. Patients and Methods: Patients diagnosed with GD between January 1, 2009, and December 31, 2019, were included. We considered AF diagnosed 30 days or fewer before, or any time after, GD and de novo HFrEF not explained by ischemia, valve disorder, or other cardiomyopathy at or after GD diagnosis. Electrocardiograms obtained at or after the index condition were excluded. A subset analysis included females younger than 45 years to study the association between ECG-derived female probability and menstrual changes (shorter, lighter, or newly irregular cycles). Results: Among 430 patients (mean age, 50±17 years; 337 [78.4%] female), independent risk factors for AF included ECG probability of AF (hazard ratio [HR], 1.5; 95% CI, 1.2 to 1.6 per 10%; P<.001), older age (HR, 1.05; 95% CI, 1.03 to 1.07 per year; P<.001), and overt hyperthyroidism (HR, 3.9; 95% CI, 1.2 to 12.7; P=.03); the C-statistic was 0.85 for the combined model. Among 495 patients (mean age, 52±17 years; 374 [75.6%] female), independent risk factors for HFrEF were ECG probability of low ejection fraction (HR, 1.4; 95% CI, 1.1 to 1.6 per 10%; P=.001) and presence of AF (HR, 8.3; 95% CI, 2.2 to 30.9; P=.002), with a C-statistic of 0.89 for the combined model. Lastly, of 72 females younger than 45 years, 30 had menstrual changes at the time of GD and had a significantly lower AI ECG–derived female probability (median, 77.3% [IQR, 57.9% to 94.4%] vs 97.7% [IQR, 92.4% to 99.5%]; P<.001). Conclusion: AI-enabled ECG identifies patients at risk for GD-related AF and HFrEF and was associated with menstrual changes in women with GD.
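The C-statistic reported for the combined models is the probability that the model assigns a higher predicted risk to a patient who develops the outcome than to one who does not, i.e., the area under the ROC curve. A minimal sketch with scikit-learn on made-up predictions is shown below.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
outcome = rng.integers(0, 2, 400)                               # 1 = developed AF (toy)
predicted_risk = np.clip(0.4 * outcome + rng.uniform(0.0, 0.6, 400), 0.0, 1.0)
print(f"C-statistic: {roc_auc_score(outcome, predicted_risk):.2f}")
```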

9.
Objective: To synthesize more conclusive evidence on the anti-inflammatory effects of angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin receptor blockers (ARBs). Methods: PubMed, Scopus, and Embase were searched from inception until March 1, 2021. We included randomized controlled trials (RCTs) that assessed the effect of ACEIs or ARBs, compared with placebo, on any of the following markers: C-reactive protein (CRP), interleukin 6 (IL-6), or tumor necrosis factor α (TNF-α). Mean changes in the levels of these markers were pooled as a weighted mean difference (WMD) with a 95% CI. Results: Thirty-two RCTs (n=3489 patients) were included in the final analysis. Overall pooled analysis suggested that ACEIs significantly reduced plasma levels of CRP (WMD, -0.54 [95% CI, -0.88 to -0.21]; P=.002; I2=96%), IL-6 (WMD, -0.84 [95% CI, -1.03 to -0.64]; P<.001; I2=0%), and TNF-α (WMD, -12.75 [95% CI, -17.20 to -8.29]; P<.001; I2=99%). Moreover, ARBs showed a significant reduction only in IL-6 (WMD, -1.34 [95% CI, -2.65 to -0.04]; P=.04; I2=85%) and did not significantly affect CRP (P=.15) or TNF-α (P=.97) levels. The lowering effect of ACEIs on CRP levels remained significant with enalapril (P=.006) and perindopril (P=.01), as well as with a treatment duration of less than 24 weeks (WMD, -0.67 [95% CI, -1.07 to -0.27]; P=.001; I2=94%) and in patients with coronary artery disease (WMD, -0.75 [95% CI, -1.17 to -0.33]; P<.001; I2=96%). Conclusion: Based on this meta-analysis, ACEIs showed a beneficial lowering effect on CRP, IL-6, and TNF-α, whereas ARBs were effective as a class in reduction of IL-6 only.
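The pooled WMDs with high I2 values imply a random-effects model. The sketch below pools three invented study results with the DerSimonian-Laird estimator to show where such a WMD, its CI, and I2 come from; none of the numbers are from the trials in this meta-analysis.

```python
import numpy as np

wmd = np.array([-0.40, -0.80, -0.55])      # per-study mean differences (e.g., CRP change)
se = np.array([0.15, 0.20, 0.10])          # per-study standard errors

w_fixed = 1.0 / se**2
pooled_fixed = np.sum(w_fixed * wmd) / np.sum(w_fixed)
q = np.sum(w_fixed * (wmd - pooled_fixed) ** 2)
df = len(wmd) - 1
tau2 = max(0.0, (q - df) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

w_rand = 1.0 / (se**2 + tau2)              # random-effects weights
pooled = np.sum(w_rand * wmd) / np.sum(w_rand)
se_pooled = np.sqrt(1.0 / np.sum(w_rand))
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
print(f"WMD {pooled:.2f} (95% CI, {pooled - 1.96*se_pooled:.2f} to {pooled + 1.96*se_pooled:.2f}); I2 = {i2:.0f}%")
```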

10.
Objective: To study the complications of hand-assisted laparoscopic living donor nephrectomy (HALLDN) with an emphasis on complications occurring early after hospital discharge, up to 120 days after surgery. Patients and Methods: We retrospectively categorized complications using the Clavien-Dindo classification in 3002 HALLDNs performed at 1 center from January 1, 2000, through December 31, 2019. In addition to overall summaries, modeling was used to identify correlates of complications before and after living donation. Results: Of these donors, 87% were White, 59% were female, the mean age was 45 years (range, 18-77 years), 30.3% had a body mass index of at least 30, and 36.3% had previous abdominopelvic surgery. There were no deaths related to the surgery. The incidence of major complications (intraoperative complications plus Clavien-Dindo grade ≥III postoperatively) was 2.5% (n=74). The overall complication rate was 12.4% (n=371), including 15 intraoperative complications, 76 postoperative complications before discharge, and 280 after discharge up to 120 days. Reoperation was required in 1.8% of patients (n=54), and all but 1 of these were for incision-related problems. Seventy-six percent of all complications occurred after discharge, including 85% of the reoperations. For major complications, no risk factor was identified. Risk factors for any complication included paramedian incision (hazard ratio [HR], 2.54; 95% CI, 1.49 to 4.34; P<.001), a history of abdominopelvic surgery (HR, 1.37; 95% CI, 1.07 to 1.76; P=.01), male sex (HR, 1.37; 95% CI, 1.07 to 1.76; P=.01), non-White race (HR, 1.40; 95% CI, 1.05 to 1.88; P=.02), and an earlier era of the experience. Conclusion: Most major complications of HALLDN occur after discharge, suggesting that close follow-up is warranted and that the current literature may underestimate the true incidence.

11.
Objective: To identify whether, and to what extent, treatment with cardiovascular drugs and neurotropic drugs is associated with postural control and falls in patients with acute stroke. Design: Observational cohort study. Setting: A stroke unit at a university hospital. Participants: A consecutive sample of patients (N=504) with acute stroke. Interventions: Not applicable. Main Outcome Measures: Postural control was assessed using the modified version of the Postural Assessment Scale for Stroke Patients. Data including baseline characteristics, all drug treatments, and falls were derived from medical records. Univariable and multivariable logistic regression and Cox proportional hazards models were used to analyze the association of drug treatment and baseline characteristics with postural control and with falls. Results: In the multivariable logistic regression analysis, factors significantly associated with impaired postural control were treatment with neurotropic drugs (eg, opioids, sedatives, hypnotics, antidepressants) (odds ratio [OR], 1.73; 95% confidence interval [CI], 1.01-2.97; P=.046), treatment with opioids (OR, 9.23; 95% CI, 1.58-54.00; P=.014), age (OR, 1.09; 95% CI, 1.07-1.12; P<.0001), stroke severity as indicated by a higher National Institutes of Health Stroke Scale score (OR, 1.29; 95% CI, 1.15-1.45; P<.0001), and sedentary lifestyle (OR, 4.32; 95% CI, 1.32-14.17; P=.016). No association was found between neurotropic drugs or cardiovascular drugs and falls. Conclusions: Treatment with neurotropic drugs, particularly opioids, in the acute phase after stroke is associated with impaired postural control. Because impaired postural control is the major cause of falls in patients with acute stroke, these results suggest opioids should be used with caution in these patients.

12.
Objective: To compare survival by the presenting parkinsonism symptoms at diagnosis among patients with incident clinically diagnosed synucleinopathies. Patients and Methods: Using the Rochester Epidemiology Project medical records–linkage system, we identified all persons residing in Olmsted County, Minnesota, who received a diagnostic code of parkinsonism from January 1, 1991, through December 31, 2010. A movement disorder specialist reviewed the complete medical records of each individual to confirm the presence of parkinsonism, determine the type of synucleinopathy, and identify the onset dates of each cardinal symptom (tremor at rest, bradykinesia, rigidity, and impaired postural reflexes). We determined the median time from diagnosis to death or censoring (June 30, 2015) for each presenting symptom and the age- and sex-adjusted risk of death. Results: From 1991 through 2010, a total of 433 individuals had a synucleinopathy diagnosed (301 [69.5%], Parkinson disease; 68 [15.7%], dementia with Lewy bodies; 52 [12.0%], Parkinson disease dementia; and 12 [2.8%], multiple system atrophy with parkinsonism). Overall, the risk of death in the tremor-predominant group was less than that in the bradykinesia/rigidity-only group (hazard ratio [HR], 0.59; 95% CI, 0.40-0.87; P=.007). Conversely, the risk of death in the bradykinesia/rigidity-only group was significantly greater than in the tremor-predominant group (HR, 1.75; 95% CI, 1.23-2.51; P=.002) and than in those with tremor before bradykinesia (HR, 1.75; 95% CI, 1.24-2.47; P=.001). Conclusion: Patients with tremor as a presenting symptom have longer survival. In contrast, the presence of bradykinesia/rigidity as a presenting symptom correlates with reduced survival across all types of synucleinopathies.
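Median survival with censoring is usually read off a Kaplan-Meier curve: the first time at which the estimated survival probability drops to 0.5 or below. The self-contained sketch below computes that on invented follow-up data; it is a generic product-limit calculation, not the study's age- and sex-adjusted analysis.

```python
import numpy as np

def km_median_survival(times, events):
    """First time at which the Kaplan-Meier survival estimate falls to <= 0.5."""
    order = np.argsort(times)
    t = np.asarray(times, dtype=float)[order]
    e = np.asarray(events, dtype=int)[order]
    at_risk, surv = len(t), 1.0
    for time_i, event_i in zip(t, e):
        if event_i:                            # death at this time
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1                           # death or censoring leaves the risk set
        if surv <= 0.5:
            return time_i
    return float("nan")                        # median not reached

rng = np.random.default_rng(7)
followup_years = rng.exponential(8.0, 200)     # invented follow-up times
died = rng.integers(0, 2, 200)                 # 1 = died, 0 = censored (invented)
print(f"median survival ~ {km_median_survival(followup_years, died):.1f} years")
```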

13.
Given previous biologic evidence of immunomodulatory effects of coffee, we hypothesized that the association between coffee intake of colorectal cancer patients and survival differs by immune responses. Using a molecular pathologic epidemiology database of 4465 incident colorectal cancer cases, including 1262 cases with molecular data, in the Nurses’ Health Study and the Health Professionals Follow-up Study, we examined the association between coffee intake of colorectal cancer patients and survival in strata of levels of histopathologic lymphocytic reaction and T-cell infiltrates in tumor tissue. We did not observe a significant association of coffee intake with colorectal cancer–specific mortality (multivariable-adjusted hazard ratio [HR] for 1-cup increase of coffee intake per day, 0.93; 95% CI, 0.84 to 1.03). Although statistical significance was not reached at the stringent level (α=.005), the association of coffee intake with colorectal cancer–specific mortality differed by Crohn disease–like lymphoid reaction (P for interaction=.007). Coffee intake was associated with lower colorectal cancer–specific mortality in patients with high Crohn disease–like reaction (multivariable HR for 1-cup increase of coffee intake per day, 0.55; 95% CI, 0.37 to 0.81; P for trend=.002) but not in patients with intermediate Crohn disease–like reaction (the corresponding HR, 1.02; 95% CI, 0.72 to 1.44) or negative/low Crohn disease–like reaction (the corresponding HR, 0.95; 95% CI, 0.83 to 1.07). The associations of coffee intake with colorectal cancer–specific mortality did not significantly differ by levels of other lymphocytic reaction or any T-cell subset (P for interaction>.18). There is suggestive evidence for differential prognostic effects of coffee intake by Crohn disease–like lymphoid reaction in colorectal cancer.

14.
15.
Objective: To examine the prevalence, distribution, and temporal trends of metabolic phenotypes that are jointly determined by obesity and metabolic health status among US adults, overall and in key population subgroups. Participants and Methods: A nationally representative sample of civilian, noninstitutionalized US adults aged 20 years and older from the National Health and Nutrition Examination Survey between 1999-2000 and 2017-2018 was included. Metabolic phenotypes were characterized jointly by body mass index and metabolic health: metabolically healthy underweight, normal weight, overweight, and obese (MH-OB); and metabolically unhealthy underweight, normal weight, overweight, and obese (MU-OB). Metabolic health was defined using the 2009 joint scientific statement for metabolic syndrome from the International Diabetes Federation Task Force on Epidemiology and Prevention, National Heart, Lung, and Blood Institute, American Heart Association, World Heart Federation, International Atherosclerosis Society, and International Association for the Study of Obesity as having 2 or fewer components (primary analysis) or no components (secondary analysis) of the following: waist circumference of 102 cm or greater in men and 88 cm or greater in women, fasting plasma glucose level of 100 mg/dL or greater, blood pressure of 130/85 mm Hg or greater, triglyceride level of 150 mg/dL or greater, and high-density lipoprotein cholesterol level of less than 40 mg/dL in men and less than 50 mg/dL in women. Results: Of 19,941 adults, the mean age was 46.9 years; 10,005 (50.6%) were female. From 1999 to 2018, the prevalence in the primary analysis declined from 33.2% (465 of 1646) to 25.1% (454 of 2058) (difference, -8.09%; 95% CI, -12.5% to -3.70%) for metabolically healthy normal weight, whereas it increased from 9.92% (178 of 1646) to 14.1% (277 of 2058) (difference, 4.17%; 95% CI, 1.13% to 7.21%) for MH-OB (both P<.001 for trend). The prevalence of metabolically healthy underweight and overweight remained stable at about 1.62% (298 of 19,941) (95% CI, 1.38% to 1.89%; P=.34 for trend) and 22.2% (4275 of 19,941) (95% CI, 21.4% to 23.0%; P=.14 for trend), respectively. The prevalence declined from 3.77% (72 of 1646) to 2.10% (68 of 2058) (difference, -1.67%; 95% CI, -3.22% to -0.12%; P=.006 for trend) for metabolically unhealthy normal weight, whereas it increased from 19.0% (343 of 1646) to 26.4% (574 of 2058) (difference, 7.41%; 95% CI, 2.67% to 12.2%; P<.001 for trend) for MU-OB. The prevalence of metabolically unhealthy underweight and overweight remained stable at 0.06% (11 of 19,941) (95% CI, 0.03% to 0.15%; P=.84 for trend) and 11.2% (2528 of 19,941) (95% CI, 10.6% to 11.8%; P=.29 for trend), respectively. Persistent differences in the prevalence of metabolic phenotypes were identified across multiple sociodemographic subgroups. For example, the prevalence of MH-OB increased from 7.58% (53 of 754) to 12.0% (79 of 694) (P<.001 for trend) for non-Hispanic Whites and from 12.2% (60 of 567) to 18.4% (76 of 493) for Hispanics (P=.01 for trend) and remained stable at 22.6% (756 of 3825) for non-Hispanic Blacks (P=.62 for trend and P=.05 for interaction). Results in secondary analyses revealed similar patterns. Conclusion: From 1999 to 2018, US adults experienced major increases in the prevalence of both MH-OB and MU-OB, largely due to decreases in the metabolically healthy normal weight phenotype. The prevalence of MU-OB increased across all subgroups, with higher values observed in older adults and those with lower education and income levels.
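The reported differences in prevalence with 95% CIs are differences between two proportions. The snippet below shows the unweighted Wald form of that calculation on invented round counts; the actual study estimates use NHANES survey weights, so this is only an illustration of the arithmetic, not a reproduction of the published figures.

```python
import math

def prevalence_difference(x1, n1, x2, n2, z=1.96):
    """Difference in proportions (p2 - p1) with an unweighted Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p2 - p1
    return diff, diff - z * se, diff + z * se

# Invented counts: 332/1000 in the early period vs 251/1000 in the late period.
d, lo, hi = prevalence_difference(332, 1000, 251, 1000)
print(f"difference {100*d:.2f}% (95% CI, {100*lo:.2f}% to {100*hi:.2f}%)")
```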

16.
Objective: To determine whether biological effective dose (BED) was predictive of obliteration after stereotactic radiosurgery (SRS) for cerebral arteriovenous malformations (AVMs). Patients and Methods: We studied patients undergoing single-session AVM SRS between January 1, 1990, and December 31, 2014, with at least 2 years of imaging follow-up. Excluded were patients with syndromic AVM, previous SRS or embolization, and patients treated with volume-staged SRS. Biological effective dose was calculated using a mono-exponential model described by Jones and Hopewell. The primary outcome was likelihood of total obliteration defined by digital subtraction angiography or magnetic resonance imaging (MRI). Variables were analyzed as continuous and dichotomous variables, with dichotomization based on the maximum value of (sensitivity − [1 − specificity]). Results: This study included 352 patients (360 AVMs; median follow-up, 5.9 years). The median margin dose prescribed was 18.75 Gy (interquartile range [IQR], 18 to 20 Gy). Two hundred fifty-nine patients (71.9%) had obliteration shown by angiography (n=176) or MRI (n=83) at a median of 36 months after SRS (IQR, 26 to 44 months). Higher BED was associated with increased likelihood of obliteration in univariate Cox regression analyses, when treated as either a dichotomous (≥133 Gy; hazard ratio [HR], 1.52; 95% confidence interval [CI], 1.19 to 1.95; P<.001) or continuous variable (HR, 1.00; 95% CI, 1.0002 to 1.005; P=.04). In multivariable analyses including dichotomized BED and location, BED remained associated with obliteration (P=.001). Conclusion: Biological effective dose ≥133 Gy was predictive of AVM obliteration after single-session SRS within the prescribed margin dose range of 15 to 25 Gy. Further study is warranted to determine whether BED, in addition to treatment dose, should be considered in AVM SRS planning.
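For context, BED-type calculations for single-session radiosurgery typically start from the linear-quadratic model and discount the dose-squared term for sublethal-damage repair that occurs while the dose is being delivered. The sketch below uses the generic Lea-Catcheside time factor with a mono-exponential repair rate; it is not necessarily the exact Jones and Hopewell formulation used in this study, and the α/β ratio, repair half-time, and treatment time are assumed values.

```python
import math

def bed_single_session(dose_gy, treatment_time_min, alpha_beta_gy=2.47, repair_halftime_min=90.0):
    """Generic LQ-based BED with repair during delivery (assumed parameter values)."""
    mu = math.log(2.0) / repair_halftime_min                 # mono-exponential repair rate
    x = mu * treatment_time_min
    g = 2.0 * (x - 1.0 + math.exp(-x)) / x**2                # Lea-Catcheside dose-protraction factor
    return dose_gy * (1.0 + g * dose_gy / alpha_beta_gy)

# Example: an 18.75 Gy margin dose delivered over roughly an hour.
print(f"BED ~ {bed_single_session(18.75, 60.0):.0f} Gy")
```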

17.
Objective: To evaluate the efficacy and safety of progressive resistance exercise (PRE) for patients with total knee arthroplasty (TKA) in a meta-analysis. Data Sources: PubMed, MEDLINE, Cochrane Library, and EMBASE databases. Study Selection: Randomized controlled trials evaluating the effect of PRE on mobility and function in patients with TKA. Data Extraction: A random-effects model was applied if significant heterogeneity was detected; otherwise, a fixed-effects model was applied. Data Synthesis: Seven randomized controlled trials were included. Compared with a rehabilitation program without PRE, physiotherapy including PRE was associated with improvement in the 6-minute walking test (weighted mean difference [WMD], 19.22 m; P=.04) with a wide confidence interval (CI, 0.48 to 37.95). However, sensitivity analysis omitting 1 study with preoperative rehabilitation revealed nonsignificant results (WMD, 15.15 m; P=.16). Moreover, PRE did not significantly improve the maximal walking speed (WMD, 0.05 m/s; 95% CI, 0.00 to 0.11; P=.05). However, PRE was associated with improved knee strength in extension (standardized mean difference [SMD], 0.72; 95% CI, 0.47 to 0.96; P<.001) and flexion (SMD, 0.47; 95% CI, 0.19 to 0.74; P<.001) but not with self-reported physical function (SMD, -0.17; 95% CI, -0.37 to 0.03; P=.10) or changes in pain score (SMD, 0.11; 95% CI, -0.15 to 0.37; P=.40). PRE did not increase the risk of adverse events (risk ratio, 1.19; 95% CI, 0.52 to 2.71; P=.68). Conclusions: PRE may lead to improvements in physical function among patients receiving TKA. PRE leads to higher ultimate strength in the surgical knee and is safe to perform.
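The strength outcomes above are on the standardized mean difference (SMD) scale, i.e., the between-group difference in means divided by the pooled standard deviation. Below is a small sketch of Cohen's d with Hedges' small-sample correction on invented strength data; the group sizes and values are placeholders.

```python
import numpy as np

def hedges_g(treatment, control):
    """Standardized mean difference with Hedges' small-sample correction."""
    nt, nc = len(treatment), len(control)
    pooled_sd = np.sqrt(((nt - 1) * np.var(treatment, ddof=1) +
                         (nc - 1) * np.var(control, ddof=1)) / (nt + nc - 2))
    d = (np.mean(treatment) - np.mean(control)) / pooled_sd
    return d * (1.0 - 3.0 / (4.0 * (nt + nc) - 9.0))

rng = np.random.default_rng(5)
pre_group = rng.normal(111.0, 15.0, 30)     # invented knee-extension strength, PRE group
control = rng.normal(100.0, 15.0, 30)       # invented knee-extension strength, control group
print(f"SMD (Hedges' g): {hedges_g(pre_group, control):.2f}")
```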

18.
Objective: To evaluate how breast cancers come to clinical attention (mode of detection [MOD]) in a population-based cohort, determine the relative frequency of different MODs, and characterize patient and tumor characteristics associated with MOD. Patients and Methods: We used the Rochester Epidemiology Project to identify women aged 40 to 75 years with a first-time diagnosis of breast cancer from May 9, 2017, to May 9, 2019 (n=500) in a 9-county region in Minnesota. We conducted a retrospective medical record review to ascertain the relative frequency of MODs, evaluating differences between screening mammography and all other MODs by breast density and cancer characteristics. Multiple logistic regression was conducted to examine the likelihood of MOD by breast density and stage of disease. Results: In our population-based cohort, 162 of 500 breast cancers (32.4%) were detected by MODs other than screening mammography, including 124 (24.8%) self-detected cancers. Compared with women with mammography-detected cancers, those with MODs other than screening mammography were more frequently younger than 50 years of age (P=.004) and had higher-grade tumors (P=.007), higher numbers of positive lymph nodes (P<.001), and larger tumor size (P<.001). Relative to women with mammography-detected cancers, those with MODs other than screening mammography were more likely to have dense breasts (odds ratio, 1.87; 95% CI, 1.20 to 2.92; P=.006) and advanced cancer at diagnosis (odds ratio, 3.58; 95% CI, 2.29 to 5.58; P<.001). Conclusion: One-third of all breast cancers in this population were detected by MODs other than screening mammography. Increased likelihood of nonmammographic MODs was observed among women with dense breasts and advanced cancer.

19.
Objective: To evaluate whether arthritis predicts the likelihood of advanced hepatic fibrosis in HFE hemochromatosis. Patients and Methods: We conducted a retrospective, cross-sectional analysis of 112 well-characterized patients with HFE hemochromatosis and liver biopsy–validated fibrosis staging recruited between January 1, 1983, and December 31, 2013. Complete clinical, biochemical, and hematologic data and noninvasive serum biochemical indices (aspartate aminotransferase to platelet ratio index [APRI] and fibrosis 4 index [FIB4]) were available. Scheuer fibrosis stages 3 and 4, APRI greater than 0.44, or FIB4 greater than 1.1 were used to define advanced hepatic fibrosis. Comparisons between groups were performed using categorical analysis and unpaired or paired t tests. Results: Male (n=76) and female (n=36) patients were similar in age. Nineteen patients had advanced hepatic fibrosis, and 47 had hemochromatosis arthritis. Arthritis was significantly associated with the presence of advanced hepatic fibrosis as determined by liver biopsy (sensitivity, 84% [95% CI, 62% to 95%]; negative predictive value, 95% [95% CI, 87% to 99%]; relative risk, 7.4 [95% CI, 2.5 to 23]; P<.001), APRI (sensitivity, 75% [95% CI, 55% to 88%]; negative predictive value, 91% [95% CI, 81% to 96%]; relative risk, 4.5 [95% CI, 2.0 to 10.2]; P<.001), or FIB4 (sensitivity, 61% [95% CI, 41% to 78%]; negative predictive value, 67% [95% CI, 68% to 90%]; relative risk, 2.2 [95% CI, 1.1 to 4.6]; P=.03). Mean cell volume values were significantly higher pretreatment in patients with F3-4 fibrosis (96.7±1.1 fL) compared with F0-2 fibrosis (93.4±0.5 fL; P=.004) and declined following treatment (F3-4, 93.2±0.9 fL, P=.01; F0-2, 91.7±0.6 fL, P=.01). Conclusion: Advanced hepatic fibrosis is strongly associated with arthritis in HFE hemochromatosis. The absence of arthritis predicts a low likelihood of advanced hepatic fibrosis, supporting its use as a clinical marker for advanced hepatic fibrosis in HFE hemochromatosis.
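Sensitivity, negative predictive value, and relative risk are all read from a 2×2 cross-tabulation of arthritis against advanced fibrosis. The sketch below uses cell counts chosen only to be roughly consistent with the totals reported above (112 patients, 19 with advanced fibrosis, 47 with arthritis); they are a reconstruction for illustration, not the paper's published table.

```python
def two_by_two_metrics(tp, fp, fn, tn):
    """Treat arthritis as the 'test' and advanced fibrosis as the outcome."""
    sensitivity = tp / (tp + fn)
    npv = tn / (tn + fn)
    relative_risk = (tp / (tp + fp)) / (fn / (fn + tn))
    return sensitivity, npv, relative_risk

# Illustrative reconstruction: 16 arthritis+fibrosis, 31 arthritis only,
# 3 fibrosis without arthritis, 62 with neither (totals 112, 19, 47).
sens, npv, rr = two_by_two_metrics(tp=16, fp=31, fn=3, tn=62)
print(f"sensitivity {sens:.0%}, NPV {npv:.0%}, relative risk {rr:.1f}")
```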

20.
Objective: To examine the association between continuous renal replacement therapy (CRRT) liberation and clinical outcomes among patients with acute kidney injury (AKI) requiring CRRT. Methods: This single-center, retrospective cohort study included adult patients admitted to intensive care units with AKI and treated with CRRT from January 1, 2007, to May 4, 2018. Based on survival and renal replacement therapy (RRT) status at 72 hours after the first CRRT liberation, we classified patients into liberated, reinstituted, and nonsurvival groups. We observed patients for 90 days after CRRT initiation to compare major adverse kidney events at 90 days (MAKE90). Results: Of 1135 patients with AKI, 228 (20%), 437 (39%), and 470 (41%) were assigned to the liberated, reinstituted, and nonsurvival groups, respectively. The MAKE90, mortality, and RRT independence rates of the cohort were 62% (707 cases), 59% (674 cases), and 40% (453 cases), respectively. Compared with reinstituted patients, the liberated group had a lower MAKE90 rate (29% vs 39%; P=.009) and a higher RRT independence rate (73% vs 65%; P=.04) on day 90, but there was no significant difference in 90-day mortality (26% vs 33%; P=.05). After adjustment for confounders, successful CRRT liberation was not associated with lower MAKE90 (odds ratio, 0.71; 95% CI, 0.48 to 1.04; P=.08) but was independently associated with improved kidney recovery at 90-day follow-up (hazard ratio, 1.81; 95% CI, 1.41 to 2.32; P<.001). Conclusion: Our study demonstrated a high occurrence of CRRT liberation failure and poor 90-day outcomes in a cohort of AKI patients treated with CRRT.
