Similar Documents
20 similar documents found (search time: 468 ms)
1.
Background and Aims: Alcohol-related liver disease is a leading cause of liver-related mortality. The effect of alcohol abstinence on the natural history of alcohol-related cirrhosis across distinct stages of portal hypertension has not been thoroughly investigated. In this study, we assessed the clinical implications of abstinence in patients with alcohol-related cirrhosis and clinically significant portal hypertension.

Methods: Alcohol abstinence, hepatic decompensation, and mortality were assessed in patients with alcohol-related cirrhosis who underwent a baseline hepatic venous pressure gradient (HVPG) measurement and were diagnosed with clinically significant portal hypertension (HVPG ≥10 mm Hg).

Results: A total of 320 patients with alcohol-related cirrhosis (median age: 57 [interquartile range (IQR), 49.7-63.1] years; 75.6% male; 87.5% decompensated) and a median HVPG of 20 (IQR, 17-23) mm Hg were followed up for a median of 36 (IQR, 14-80) months. Overall, 241 (75.3%) patients remained abstinent, while 79 (24.7%) patients had active alcohol consumption. Alcohol abstinence was linked to a significantly reduced risk of hepatic decompensation (adjusted hazard ratio [aHR], 0.391; P < .001), as well as liver-related (aHR, 0.428; P < .001) and all-cause (aHR, 0.453; P < .001) mortality, after adjusting for baseline HVPG, MELD, and previous decompensation. Importantly, alcohol abstinence significantly reduced the cumulative incidence of hepatic decompensation in both groups with HVPG 10–19 mm Hg (P < .001) and HVPG ≥20 mm Hg (P = .002). The 3-year decompensation probability was 32.4% vs 60.0% in HVPG 10–19 mm Hg and 57.5% vs 82.6% in HVPG ≥20 mm Hg for abstinent patients vs active drinkers, respectively.

Conclusions: Alcohol abstinence improves prognosis across all stages of portal hypertension in alcohol-related cirrhosis, including in patients who have already progressed to high-risk portal hypertension. (ClinicalTrials.gov number: NCT03267615)
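The adjusted hazard ratios above come from a multivariable Cox proportional hazards model. A minimal sketch of that kind of analysis with the Python lifelines package, on made-up data (all column names and values are illustrative, not the study's registry variables):

```python
# Sketch of an adjusted Cox proportional hazards analysis with lifelines.
# Data and column names are hypothetical; the study adjusted for baseline
# HVPG, MELD, and previous decompensation.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":    [36, 14, 80, 22, 50, 9, 61, 28, 44, 12],  # follow-up time
    "decomp":    [1, 1, 0, 1, 0, 1, 0, 1, 0, 1],           # event indicator
    "abstinent": [1, 0, 1, 0, 1, 0, 1, 1, 0, 0],
    "hvpg":      [18, 22, 12, 25, 16, 21, 14, 20, 17, 23],
    "meld":      [12, 18, 9, 21, 11, 15, 10, 14, 13, 19],
})

# A small penalizer stabilizes the fit on this toy-sized dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="months", event_col="decomp")
print(cph.summary[["exp(coef)", "p"]])  # exp(coef) for "abstinent" is the aHR
```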

2.
Background and Objectives: The COVID-19 pandemic imperiled the global health system. We aimed to determine the impact of COVID-19 on the care continuum of HCV-infected patients.

Material and Methods: Two hundred fifty-six patients who were prescribed a course of DAA therapy at three tertiary medical centers in the US and China between January 1, 2019 and June 30, 2020 were included. We assessed the proportions of patients who completed DAA therapy and had HCV RNA testing during and after the end of therapy. We also assessed the impact of utilization of telemedicine.

Results: The proportion of patients undergoing HCV RNA testing during DAA treatment decreased from >81.7% before the pandemic to 67.8% during the pandemic (P=0.006), with a more prominent decrease in the US. There were significant decreases in HCV RNA testing >12 weeks (P<0.001) and >20 weeks (P<0.001) post-treatment during the COVID-19 era. Compared to the pre-COVID period, post-treatment clinic encounters during the COVID-19 era decreased significantly in China (Xi'an: 13.6% to 7.4%; Nanjing: 16.7% to 12.5%) but increased in the US (12.5% to 16.7%), mainly due to the use of telemedicine. There was a 4-fold increase in utilization of telemedicine in the US.

Conclusions: The COVID-19 pandemic had a profound impact on care for HCV patients in both the US and China. HCV cure rate assessment decreased by half during the COVID era, but the proportion of patients finishing DAA therapy was not significantly affected. Increased utilization of telemedicine led to increased compliance with DAA therapy but did not encourage patients to undergo laboratory assessment for HCV cure.

3.
Objectives: Porphyria cutanea tarda (PCT) is common and usually associated with chronic HCV infection and HFE polymorphisms. Since the availability of IFN-free DAA regimens, SVR for HCV is achieved almost universally, and we wondered whether HCV SVR determines PCT evolution.

Methods: Retrospective observational study including patients with HCV-associated PCT from the Gastroenterology and Infectious Diseases Departments at our hospital, treated with DAA (Apr/2015–Apr/2017). Clinical variables of PCT were collected at PCT diagnosis, after PCT treatment, before DAA use, and after SVR achievement. UROD activity and C282Y/H63D polymorphisms were registered. Statistical analysis was performed with SPSS 22.0.

Results: 13 HCV-PCT patients were included: median age 52.5 years; 4 females; 8 HCV/HIV co-infected (all with undetectable viral load). Classical PCT factors: 12 smoked, 9 alcohol abuse, 6 former IDU. 10 had type I PCT and 1 type II PCT. HFE polymorphisms: 2 cases with C282Y/H63D; H63D polymorphism in 8. PCT manifestations resolved with PCT treatment in 4 patients, resolved almost completely in 7 patients, remained stable in 1 patient, and worsened in 1. After DAA treatment all residual lesions resolved, which in every case allowed interruption of specific PCT treatment.

Conclusions: Our case series of HCV-associated PCT shows that SVR after DAA treatment leads to PCT resolution. Porphyrin level monitoring is not needed after interruption of specific PCT treatment when there are no residual skin lesions in HCV-associated PCT.

4.
Background and Aims: Nonalcoholic steatohepatitis (NASH) may progress to advanced liver disease (AdvLD). This study characterized comorbidities, healthcare resource utilization (HCRU), and associated costs among hospitalized patients with AdvLD due to NASH in Italy.

Methods and Results: Adult nonalcoholic fatty liver disease (NAFLD)/NASH patients from 2011 to 2017 were identified from administrative databases of Italian local health units using ICD-9-CM codes. Development of compensated cirrhosis (CC), decompensated cirrhosis (DCC), hepatocellular carcinoma (HCC), or liver transplant (LT) was identified using the first diagnosis date for each severity cohort (index date). Patients progressing to multiple disease stages were included in >1 cohort. Patients were followed from the index date until the earliest of disease progression, end of coverage, death, or end of study. Within each cohort, per-member-per-month values were annualized to calculate all-cause HCRU and costs (€) in 2017.

Of the 9,729 hospitalized NAFLD/NASH patients identified, 97% were without AdvLD, 1.3% had CC, 3.1% had DCC, 0.8% had HCC, and 0.1% had LT. Comorbidity burden was high across all cohorts. The mean annual number of inpatient services was greater in patients with AdvLD than in those without AdvLD. Similar trends were observed in outpatient visits and pharmacy fills. Mean total annual costs increased with disease severity, driven primarily by inpatient services costs.

Conclusion: NAFLD/NASH patients in Italy have a high comorbidity burden. AdvLD patients had significantly higher costs. The higher prevalence of DCC compared to CC in this population may suggest challenges in effectively screening and identifying NAFLD/NASH patients. Early identification and effective management are needed to reduce the risk of disease progression and subsequent HCRU and costs.
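The cost outcomes are built by annualizing per-member-per-month (PMPM) values over each patient's observed follow-up. A sketch of that arithmetic (function name and inputs are illustrative, not from the paper):

```python
# Hypothetical helper: annualize a per-member-per-month (PMPM) cost.
def annualized_cost(total_cost_eur: float, months_followed: float) -> float:
    pmpm = total_cost_eur / months_followed  # cost per member per month
    return pmpm * 12                         # annualized cost (EUR/year)

# e.g. EUR 4,500 of claims over 9 months of follow-up:
print(annualized_cost(4500.0, 9.0))  # 6000.0 EUR/year
```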

5.
Background: Sudden cardiac death (SCD) is a common initial manifestation of coronary heart disease (CHD); however, SCD risk prediction remains elusive. Coronary artery calcium (CAC) is a marker of plaque burden.

Objectives: To determine whether CAC improves risk stratification for incident SCD beyond atherosclerotic cardiovascular disease (ASCVD) risk factors.

Methods: The authors studied 66,636 primary prevention patients from the CAC Consortium. Multivariable competing-risks regression and C-statistics were used to assess the association between CAC and SCD, adjusting for demographics and traditional risk factors.

Results: The mean age was 54.4 years, 33% were women, 11% were of non-White ethnicity, and 55% had CAC >0. A total of 211 SCD events (0.3%) were observed during a median follow-up of 10.6 years, 91% occurring among those with baseline CAC >0. Compared with CAC = 0, there was a stepwise higher risk (P trend < 0.001) of SCD for CAC 100 to 399 (subdistribution hazard ratio [SHR]: 2.8; 95% CI: 1.6-5.0), CAC 400 to 999 (SHR: 4.0; 95% CI: 2.2-7.3), and CAC >1,000 (SHR: 4.9; 95% CI: 2.6-9.9). CAC provided incremental improvements in the C-statistic for the prediction of SCD among individuals with a 10-year risk <7.5% (ΔC-statistic = +0.046; P = 0.02) and 7.5% to 20% (ΔC-statistic = +0.069; P = 0.003), which were larger than in persons with a 10-year risk >20% (ΔC-statistic = +0.01; P = 0.54).

Conclusions: Higher CAC burden is strongly associated with incident SCD beyond traditional risk factors, particularly among primary prevention patients at low-intermediate risk. Measuring CAC can be useful for SCD risk stratification in the early stages of CHD, identifying patients most likely to benefit from further downstream testing.
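The Methods use competing-risks regression (subdistribution hazard ratios), because death from causes other than SCD competes with the SCD endpoint. lifelines does not ship a Fine-Gray regression, but the competing-risks idea can be sketched with an Aalen-Johansen cumulative incidence estimate on synthetic data:

```python
# Cumulative incidence of SCD with non-SCD death as a competing risk.
# Event coding here is hypothetical: 0 = censored, 1 = SCD, 2 = other death.
from lifelines import AalenJohansenFitter

durations = [10.6, 8.2, 5.1, 10.0, 3.3, 9.7, 7.4, 6.8]  # years of follow-up
events    = [0,    1,   2,   0,    1,   2,   0,   1]

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)  # 1 = SCD
print(ajf.cumulative_density_)  # cumulative incidence function for SCD
```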

6.
Background: In October 2018, the U.S. heart allocation system expanded the number of priority "status" tiers from 3 to 6 and added cardiogenic shock requirements for some heart transplant candidates listed with specific types of treatments.

Objectives: This study sought to determine the impact of the new policy on the treatment practices of transplant centers.

Methods: Initial listing data on all adult heart candidates listed from December 1, 2017 to April 30, 2019 were collected from the Scientific Registry of Transplant Recipients. The status-qualifying treatments (or exception requests) and hemodynamic values at listing of a post-policy cohort (December 2018 to April 2019) were compared with a seasonally matched pre-policy cohort (December 2017 to April 2018). Candidates in the pre-policy cohort were reclassified into the new priority system statuses by using treatment, diagnosis, and hemodynamics.

Results: Comparing the post-policy cohort (N = 1,567) with the pre-policy cohort (N = 1,606), there were significant increases in listings with extracorporeal membrane oxygenation (+1.2%), intra-aortic balloon pumps (+4%), and exceptions (+12%). Listings with low-dose inotropes (−18%) and high-dose inotropes (−3%) significantly decreased. The new priority status distribution had more status 2 (+14%) candidates than expected and fewer status 3 (−5%), status 4 (−4%), and status 6 (−8%) candidates than expected (p values <0.01 for all comparisons).

Conclusions: After implementation of the new heart allocation policy, transplant centers listed more candidates with extracorporeal membrane oxygenation, intra-aortic balloon pumps, and exception requests and fewer candidates with inotrope therapy than expected, leading to significantly more high-priority status listings than anticipated. If these early trends persist, the new allocation system may not function as intended.

7.
Background: Infective endocarditis may affect patients after transcatheter aortic valve replacement (TAVR).

Objectives: The purpose of this study was to provide detailed information on incidence rates, types of microorganisms, and outcomes of infective endocarditis after TAVR.

Methods: Between February 2011 and July 2018, consecutive patients from the SwissTAVI Registry were eligible. Infective endocarditis was classified into early (peri-procedural [<100 days] and delayed-early [100 days to 1 year]) and late (>1 year) endocarditis. Clinical events were adjudicated according to the Valve Academic Research Consortium-2 endpoint definitions.

Results: During the observational period, 7,203 patients underwent TAVR at 15 hospitals in Switzerland. During follow-up of 14,832 patient-years, endocarditis occurred in 149 patients. The incidence of peri-procedural, delayed-early, and late endocarditis after TAVR was 2.59, 0.71, and 0.40 events per 100 person-years, respectively. Among patients with early endocarditis, Enterococcus species were the most frequently isolated microorganisms (30.1%). Among those with peri-procedural endocarditis, 47.9% of patients had a pathogen that was not susceptible to the peri-procedural antibiotic prophylaxis. Younger age (subhazard ratio [SHR]: 0.969; 95% confidence interval [CI]: 0.944 to 0.994), male sex (SHR: 1.989; 95% CI: 1.403 to 2.818), lack of pre-dilatation (SHR: 1.485; 95% CI: 1.065 to 2.069), and treatment in a catheterization laboratory as opposed to a hybrid operating room (SHR: 1.648; 95% CI: 1.187 to 2.287) were independently associated with endocarditis. In a case-control matched analysis, patients with endocarditis were at increased risk of mortality (hazard ratio: 6.55; 95% CI: 4.44 to 9.67) and stroke (hazard ratio: 4.03; 95% CI: 1.54 to 10.52).

Conclusions: Infective endocarditis after TAVR most frequently occurs during the early period, is commonly caused by Enterococcus species, and results in considerable risks of mortality and stroke. (NCT01368250)
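The endocarditis rates are incidence densities: events divided by person-time at risk, scaled to 100 person-years. The arithmetic, using the overall figures from the abstract (the per-period person-time split is not reported, so only the overall rate is computed here):

```python
def rate_per_100py(events: int, person_years: float) -> float:
    """Incidence rate per 100 person-years."""
    return events / person_years * 100

# 149 endocarditis cases over 14,832 patient-years of follow-up:
print(round(rate_per_100py(149, 14_832), 2))  # ~1.0 event per 100 person-years
```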

8.
JACC: Cardiovascular Imaging. 2021;14(12):2337-2349.
Objectives: The aim of this meta-analysis was to assess the diagnostic performance of various CMR imaging parameters for evaluating acute cardiac transplant rejection.

Background: Endomyocardial biopsy is the current gold standard for detection of acute cardiac transplant rejection. Cardiac magnetic resonance (CMR) is uniquely capable of myocardial tissue characterization and may be useful as a noninvasive alternative for the diagnosis of graft rejection.

Methods: PubMed and Web of Science were searched for relevant publications reporting on the use of CMR myocardial tissue characterization for detection of acute cardiac transplant rejection with endomyocardial biopsy as the reference standard. Pooled sensitivity, specificity, and hierarchical modeling-based summary receiver-operating characteristic curves were calculated.

Results: Of 478 papers, 10 studies comprising 564 patients were included. The sensitivity and specificity for the detection of acute cardiac transplant rejection were 84.6 (95% CI: 65.6-94.0) and 70.1 (95% CI: 54.2-82.2) for T1, 86.5 (95% CI: 72.1-94.1) and 85.9 (95% CI: 65.2-94.6) for T2, 91.3 (95% CI: 63.9-98.4) and 67.6 (95% CI: 56.1-77.4) for extracellular volume fraction (ECV), and 50.1 (95% CI: 31.2-68.9) and 60.2 (95% CI: 36.7-79.7) for late gadolinium enhancement (LGE). The areas under the hierarchical modeling-based summary receiver-operating characteristic curve were 0.84 (95% CI: 0.81-0.87) for T1, 0.92 (95% CI: 0.89-0.94) for T2, 0.78 (95% CI: 0.74-0.81) for ECV, and 0.56 (95% CI: 0.51-0.60) for LGE. T2 values demonstrated the highest diagnostic accuracy, followed by native T1, ECV, and LGE (all P values <0.001 for T1, ECV, and LGE vs T2).

Conclusions: T2 mapping demonstrated higher diagnostic accuracy than the other CMR techniques. Native T1 and ECV provide high diagnostic utility but lower diagnostic accuracy compared with T2, related primarily to lower specificity. LGE showed poor diagnostic performance for detection of rejection.
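Pooled sensitivity and specificity combine per-study 2×2 counts. The paper uses hierarchical (bivariate/HSROC) models; as a simplified stand-in, a fixed-effect inverse-variance pooling on the logit scale illustrates the idea (the counts below are made up):

```python
import numpy as np

# Hypothetical per-study 2x2 counts for one index test (e.g. T2 mapping):
# (true positives, false negatives, true negatives, false positives)
studies = [(28, 4, 30, 6), (15, 3, 22, 4), (40, 7, 55, 8)]

def pooled_logit(events, totals):
    """Fixed-effect inverse-variance pooling of proportions on the logit scale."""
    events, totals = np.asarray(events), np.asarray(totals)
    p = events / totals
    logit = np.log(p / (1 - p))
    var = 1 / events + 1 / (totals - events)   # variance of a logit proportion
    w = 1 / var
    pooled = (w * logit).sum() / w.sum()
    return 1 / (1 + np.exp(-pooled))           # back-transform to a proportion

tp, fn, tn, fp = (np.array(x) for x in zip(*studies))
print("pooled sensitivity:", pooled_logit(tp, tp + fn))
print("pooled specificity:", pooled_logit(tn, tn + fp))
```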

9.
Background & Aims: Patients with advanced fibrosis related to nonalcoholic fatty liver disease (NAFLD) are at risk of developing hepatic and extrahepatic complications. We investigated whether, in a large cohort of patients with NAFLD and compensated advanced chronic liver disease, baseline liver stiffness measurements (LSMs) and their changes can be used to identify patients at risk for liver-related and extrahepatic events.

Methods: We performed a retrospective analysis of consecutive patients with NAFLD (n = 1039) with a histologic diagnosis of F3–F4 fibrosis and/or LSMs >10 kPa, followed for at least 6 months, from medical centers in 6 countries. LSMs were made by FibroScan using the M or XL probe and recorded at baseline and within 1 year from the last follow-up examination. Differences between follow-up and baseline LSMs were categorized as: improvement (reduction of more than 20%), stable (reduction of 20% to an increase of 20%), or impairment (an increase of 20% or more). We recorded hepatic events (such as liver decompensation, ascites, encephalopathy, variceal bleeding, jaundice, or hepatocellular carcinoma [HCC]) and overall and liver-related mortality during a median follow-up time of 35 months (interquartile range, 19–63 months).

Results: Based on Cox regression analysis, baseline LSM was independently associated with occurrence of hepatic decompensation (hazard ratio [HR], 1.03; 95% CI, 1.02–1.04; P < .001), HCC (HR, 1.03; 95% CI, 1.00–1.04; P = .003), and liver-related death (HR, 1.02; 95% CI, 1.02–1.03; P = .005). In 533 patients with available LSMs during the follow-up period, change in LSM was independently associated with hepatic decompensation (HR, 1.56; 95% CI, 1.05–2.51; P = .04), HCC (HR, 1.72; 95% CI, 1.01–3.02; P = .04), overall mortality (HR, 1.73; 95% CI, 1.11–2.69; P = .01), and liver-related mortality (HR, 1.96; 95% CI, 1.10–3.38; P = .02).

Conclusions: In patients with NAFLD and compensated advanced chronic liver disease, baseline LSM and change in LSM are associated with risk of liver-related events and mortality.
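The follow-up analysis hinges on categorizing the relative change in LSM. The abstract's rule transcribes directly to code (the function name is illustrative):

```python
def lsm_change_category(baseline_kpa: float, followup_kpa: float) -> str:
    """Categorize liver stiffness change per the abstract's definition:
    improvement = >20% reduction, impairment = >=20% increase, else stable."""
    change = (followup_kpa - baseline_kpa) / baseline_kpa
    if change < -0.20:
        return "improvement"
    if change >= 0.20:
        return "impairment"
    return "stable"

print(lsm_change_category(15.0, 10.5))  # improvement (-30%)
print(lsm_change_category(15.0, 16.0))  # stable (+6.7%)
```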

10.
Background: Several clinical and cardiac magnetic resonance (CMR)-derived parameters have been shown to be associated with death or heart transplant late after the Fontan operation.

Objectives: The objective of this study was to identify the relative importance and interactions of clinical and CMR-based parameters for risk stratification after the Fontan operation.

Methods: Fontan patients were retrospectively reviewed. Clinical and CMR parameters were analyzed using univariable Cox regression. The primary endpoint was time to death or (listing for) heart transplant. To identify the patients at highest risk for the endpoint, classification and regression tree survival analysis was performed, including all significant variables from Cox regression.

Results: The cohort consisted of 416 patients (62% male) with a median age of 16 years (25th, 75th percentiles: 11, 23 years). Over a median follow-up of 5.4 years (25th, 75th percentiles: 2.4, 10.0 years) after CMR, 57 patients (14%) reached the endpoint (46 deaths, 7 heart transplants, 4 heart transplant listings). Lower total indexed end-diastolic volume (EDVi) was the strongest predictor of transplant-free survival. Among patients with dilated ventricles (EDVi ≥156 ml/BSA^1.3), worse global circumferential strain (GCS) was the next most important predictor (73% vs. 44%). In patients with smaller ventricles (EDVi <156 ml/BSA^1.3), New York Heart Association functional class ≥II was the next most important predictor (30% vs. 4%).

Conclusions: In this cohort of patients late after the Fontan operation, increased ventricular dilation was the strongest independent predictor of death or transplant (listing). Patients with both ventricular dilation and worse GCS were at highest risk. These data highlight the value of integrating CMR and clinical parameters for risk stratification in this population.
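The classification-and-regression-tree result reads as a short decision rule. A transcription of the tree reported in the Results (thresholds from the text; the labels and the boolean GCS/NYHA inputs are illustrative simplifications):

```python
def fontan_risk_stratum(edvi_ml_per_bsa13: float,
                        gcs_worse: bool,
                        nyha_ge_2: bool) -> str:
    """Risk tree from the abstract: EDVi splits first, then GCS or NYHA class.
    `gcs_worse` stands for global circumferential strain worse than the
    cohort's cut point (the exact strain threshold is not in the abstract)."""
    if edvi_ml_per_bsa13 >= 156:          # dilated ventricle
        return "highest risk" if gcs_worse else "elevated risk"
    return "intermediate risk" if nyha_ge_2 else "lower risk"

print(fontan_risk_stratum(170, gcs_worse=True, nyha_ge_2=False))  # highest risk
print(fontan_risk_stratum(120, gcs_worse=False, nyha_ge_2=False)) # lower risk
```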

11.
Background: In patients undergoing heart transplantation, significant allosensitization limits access to organs, resulting in longer wait times and high waitlist mortality. Current desensitization strategies are limited in enabling successful transplantation.

Objectives: The purpose of this study was to describe the cumulative experience of combined heart-liver transplantation using a novel heart-after-liver transplant (HALT) protocol resulting in profound immunologic protection.

Methods: Reported are the results of a clinical protocol that was instituted to transplant highly sensitized patients requiring combined heart and liver transplantation at a single institution. Patients were dual-organ listed with perceived elevated risk of rejection or markedly prolonged waitlist time due to high levels of allo-antibodies. Detailed immunologic data and long-term patient and graft outcomes were obtained.

Results: A total of 7 patients (age 43 ± 7 years, 86% women) with high allosensitization (median calculated panel reactive antibody = 77%) underwent HALT. All had significant, unacceptable donor-specific antibodies (DSA) (>4,000 mean fluorescence intensity). Prospective pre-operative flow cytometric T-cell crossmatch was positive in all, and B-cell crossmatch was positive in 5 of 7. After HALT, retrospective crossmatch (B- and T-cell) became negative in all. DSA fell dramatically; at last follow-up, all pre-formed or de novo DSA levels were insignificant at <2,000 mean fluorescence intensity. No patients experienced >1R rejection over a median follow-up of 48 months (interquartile range: 25 to 68 months). There was 1 death due to metastatic cancer and no significant graft dysfunction.

Conclusions: A heart-after-liver transplantation protocol enables successful transplantation via near-elimination of DSA and is effective in preventing adverse immunologic outcomes in highly sensitized patients listed for combined heart-liver transplantation.

12.
Background: Four long-acting muscarinic antagonists (LAMAs), tiotropium, glycopyrronium, aclidinium, and umeclidinium, are currently available for the treatment of stable chronic obstructive pulmonary disease (COPD). However, no integrated analysis has sought to determine the effectiveness of these LAMAs. Thus, we conducted a systematic review and meta-analysis to evaluate the efficacy and safety of LAMA versus placebo in patients with stable COPD.

Methods: A literature search of relevant randomized controlled trials that administered LAMA to stable COPD patients was conducted, and exacerbations, quality of life (QoL), dyspnea score, lung function, and adverse events were evaluated.

Results: A total of 33 studies were included in this meta-analysis. LAMA significantly decreased the frequency of exacerbations compared to placebo (OR 0.75; 95% CI 0.66 to 0.85; P < 0.001). The mean changes in the St George's Respiratory Questionnaire score (mean difference, −3.61; 95% CI, −4.27 to −2.95; P < 0.00001), transitional dyspnea index score (mean difference 1.00; 95% CI 0.83 to 1.17; P < 0.00001), and trough FEV1 (mean difference 0.12; 95% CI 0.11 to 0.13; P < 0.0001) indicated significantly greater improvement in the LAMA group than in the placebo group. The number of withdrawals due to adverse events in the LAMA group was significantly lower than that in the placebo group (risk difference −0.02; 95% CI −0.03 to −0.01; P = 0.002).

Conclusion: LAMA is superior to placebo for stable COPD, with a lower frequency of exacerbations and of withdrawals due to adverse events, and greater improvements in trough FEV1, QoL, and dyspnea score.
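Pooled ORs and mean differences like these typically come from a random-effects model. A compact DerSimonian-Laird sketch over hypothetical per-study log-ORs (the abstract reports only pooled values):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate with DerSimonian-Laird tau^2."""
    y, v = np.asarray(effects), np.asarray(variances)
    w = 1 / v
    y_fixed = (w * y).sum() / w.sum()
    q = (w * (y - y_fixed) ** 2).sum()                       # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
    w_star = 1 / (v + tau2)                                  # RE weights
    return (w_star * y).sum() / w_star.sum()

# Hypothetical log-ORs for exacerbations (LAMA vs placebo) from 4 trials:
log_ors = [np.log(0.70), np.log(0.80), np.log(0.72), np.log(0.78)]
variances = [0.010, 0.020, 0.015, 0.012]
print("pooled OR:", np.exp(dersimonian_laird(log_ors, variances)))
```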

13.
Objectives: This study determined whether flow state classified by stroke volume index (SVi) or transvalvular flow rate (FR) improved risk stratification for all-cause mortality, hospitalization due to heart failure, and aortic valvular interventions in patients with severe aortic stenosis (AS).

Background: SVi is a widely accepted classification for flow state in severe low-flow, low-gradient (LFLG) AS. Recent studies suggest that FR more closely approximates true AS severity and provides more useful prognostication than SVi.

Methods: Patients with severe AS over a 7-year period were subclassified by echocardiographic parameters. LFLG-AS was defined as severe AS (aortic valve area index [AVAi] <0.6 cm2/m2) with a mean transvalvular pressure gradient of <40 mm Hg in the setting of a low flow state (SVi <35 ml/m2 and/or FR <200 ml/s), and was subclassified by preserved (≥50%; paradoxical) or reduced (<50%; classical) left ventricular ejection fraction (LVEF).

Results: Among 621 consecutive patients with severe AS, the proportions of patients classified as LFLG-AS differed between SVi and FR (p < 0.001). Classification using SVi, FR, and LVEF was a strong predictor of the composite endpoint at the 2-year follow-up. The addition of SVi to the echocardiographic and clinical model provided significant improvement in reclassification (net reclassification improvement: 0.089; 95% confidence interval [CI]: 0.045 to 0.133; p = 0.04), whereas addition of FR did not (net reclassification improvement: 0.061; 95% CI: 0.016 to 0.106; p = 0.17). C-statistics indicated improved risk discrimination when AVAi, LVEF, and SVi or FR were added as predictive variables to the clinical model (p = 0.006).

Conclusions: Low SVi or FR was associated with adverse cardiovascular events and showed improvement in discrimination, but only SVi, not FR, significantly improved risk reclassification compared to other conventional clinical and echocardiographic predictors. This suggests that FR is not superior to SVi in distinguishing true-severe from pseudo-severe AS and in identifying patients with LFLG-AS who have worse outcomes.
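The subclassification in the Methods is a set of threshold rules that transcribes directly to code (the function name is illustrative):

```python
def classify_severe_as(avai_cm2_m2: float, mean_gradient_mmhg: float,
                       svi_ml_m2: float, flow_rate_ml_s: float,
                       lvef_pct: float) -> str:
    """Classify severe AS per the abstract: LFLG = AVAi <0.6 cm2/m2 with
    mean gradient <40 mm Hg and low flow (SVi <35 ml/m2 and/or FR <200 ml/s),
    split by LVEF into classical (<50%) vs paradoxical (>=50%)."""
    if avai_cm2_m2 >= 0.6:
        return "not severe AS by AVAi"
    low_flow = svi_ml_m2 < 35 or flow_rate_ml_s < 200
    if mean_gradient_mmhg < 40 and low_flow:
        return "classical LFLG-AS" if lvef_pct < 50 else "paradoxical LFLG-AS"
    return "severe AS, not LFLG"

print(classify_severe_as(0.45, 28, 30, 180, 62))  # paradoxical LFLG-AS
```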

14.
Background & Aims: Eosinophilic esophagitis (EoE) is a chronic, immune-mediated disease for which there is currently no pharmacologic therapy approved by the U.S. Food and Drug Administration.

Methods: In this double-blind, placebo-controlled, phase 3 trial, patients 11–55 years of age with EoE and dysphagia were randomized 2:1 to receive budesonide oral suspension (BOS) 2.0 mg twice daily or placebo for 12 weeks at academic or community care practices. Co-primary endpoints were the proportions of stringent histologic responders (≤6 eosinophils/high-power field) and dysphagia symptom responders (≥30% reduction in Dysphagia Symptom Questionnaire [DSQ] score) over 12 weeks. Changes in DSQ score (key secondary endpoint) and EoE Endoscopic Reference Score (EREFS) (secondary endpoint) from baseline to week 12, and safety parameters, were examined.

Results: Overall, 318 patients (BOS, n = 213; placebo, n = 105) were randomized and received ≥1 dose of study treatment. More BOS-treated than placebo-treated patients achieved a stringent histologic response (53.5% vs 1.0%; Δ53% [95% confidence interval (CI), 43.8%–59.5%]; P < .001) or symptom response (52.6% vs 39.1%; Δ13% [95% CI, 1.6%–24.3%]; P = .024) over 12 weeks. BOS-treated patients also had greater improvements in least-squares mean DSQ scores and EREFS over 12 weeks than placebo-treated patients: DSQ, –13.0 (SEM 1.2) vs –9.1 (SEM 1.5) (Δ–3.9 [95% CI, –7.1 to –0.8]; P = .015); EREFS, –4.0 (SEM 0.3) vs –2.2 (SEM 0.4) (Δ–1.8 [95% CI, –2.6 to –1.1]; P < .001). BOS was well tolerated; most adverse events were mild or moderate in severity.

Conclusions: In patients with EoE, BOS 2.0 mg twice daily was superior to placebo in improving histologic, symptomatic, and endoscopic outcomes over 12 weeks and was well tolerated. ClinicalTrials.gov number: NCT02605837.
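The co-primary endpoints are threshold rules on histology and the DSQ, which transcribe directly to code (the function names are illustrative):

```python
def stringent_histologic_responder(peak_eos_per_hpf: int) -> bool:
    """<=6 eosinophils per high-power field after treatment."""
    return peak_eos_per_hpf <= 6

def dysphagia_symptom_responder(dsq_baseline: float, dsq_week12: float) -> bool:
    """>=30% reduction in Dysphagia Symptom Questionnaire score."""
    return (dsq_baseline - dsq_week12) / dsq_baseline >= 0.30

print(stringent_histologic_responder(4))        # True
print(dysphagia_symptom_responder(30.0, 18.0))  # True (40% reduction)
```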

15.
Background and Aims: The stress hyperglycemia ratio (SHR) is associated with increased in-hospital morbidity and mortality in patients with acute myocardial infarction (AMI). We aimed to investigate the impact of stress hyperglycemia on long-term mortality after AMI in patients with and without diabetes mellitus (DM).

Methods and Results: We included 2,089 patients with AMI between February 2014 and March 2018. SHR was calculated as the fasting glucose divided by the estimated average glucose derived from glycosylated hemoglobin (HbA1c). The primary endpoint was all-cause death. Of the 2,089 patients (mean age: 65.7 ± 12.4 years; 76.7% men) analyzed, 796 (38.1%) had DM. Over a median follow-up of 2.7 years, 141 (6.7%) and 150 (7.2%) all-cause deaths occurred in the diabetic and nondiabetic cohorts, respectively. Compared with participants with low SHR (<1.24 in DM; <1.14 in non-DM), the hazard ratios and 95% confidence intervals for those with high SHR (≥1.24 in DM; ≥1.14 in non-DM) were 2.23 (1.54–3.23) and 1.79 (1.15–2.78) for all-cause mortality, and 2.42 (1.63–3.59) and 2.10 (1.32–3.35) for cardiovascular mortality, in DM and non-DM subjects, respectively. Mortality prediction was improved in diabetic individuals with the incorporation of SHR into the Global Registry of Acute Coronary Events (GRACE) score, with an increase in the continuous net reclassification index of 0.184 (95% CI: 0.003–0.365) and an absolute integrated discrimination improvement of 0.014 (95% CI: 0.002–0.025).

Conclusion: The improvement in the prediction of long-term mortality beyond the GRACE score indicates the potential of SHR as a biomarker for post-MI risk stratification among patients with DM.

Registration number for clinical trials: NCT03533543.
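SHR is a simple ratio; the estimated average glucose is conventionally derived from HbA1c with the ADAG equation, eAG (mg/dL) = 28.7 × HbA1c (%) − 46.7. Assuming the paper uses that standard conversion (the abstract does not say), a sketch:

```python
def stress_hyperglycemia_ratio(fasting_glucose_mg_dl: float,
                               hba1c_pct: float) -> float:
    """SHR = fasting glucose / estimated average glucose (eAG).
    eAG from HbA1c via the ADAG equation (assumed; the abstract only says
    'estimated average glucose derived from HbA1c')."""
    eag = 28.7 * hba1c_pct - 46.7        # mg/dL
    return fasting_glucose_mg_dl / eag

# e.g. admission fasting glucose 180 mg/dL with HbA1c 7.0%:
print(round(stress_hyperglycemia_ratio(180, 7.0), 2))  # ~1.17
```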

16.
Objectives: The aim of this study was to assess the pooled clinical and echocardiographic outcomes of different isolated transcatheter tricuspid valve repair (ITTVR) strategies for significant (moderate or greater) tricuspid regurgitation (TR).

Background: Significant TR is a common valvular heart disease worldwide.

Methods: Published research was systematically searched for studies evaluating the efficacy and safety of ITTVR for significant TR in adults. The primary outcomes were improvement in New York Heart Association (NYHA) functional class and 6-minute walking distance and the presence of severe or greater TR at the last available follow-up of each individual study. Random-effects meta-analysis was performed comparing outcomes before and after ITTVR.

Results: Fourteen studies with 771 patients were included. The mean age was 77 ± 8 years, and the mean European System for Cardiac Operative Risk Evaluation II score was 6.8% ± 5.4%. At a weighted mean follow-up of 212 days, 209 patients (35%) were in NYHA functional class III or IV compared with 586 patients (84%) at baseline (risk ratio: 0.23; 95% CI: 0.13-0.40; P < 0.001). Six-minute walking distance significantly improved from 237 ± 113 m to 294 ± 105 m (mean difference: +50 m; 95% CI: +34 to +66 m; P < 0.001). One hundred forty-seven patients (24%) showed severe or greater TR after ITTVR compared with 616 (96%) at baseline (risk ratio: 0.29; 95% CI: 0.20-0.42; P < 0.001).

Conclusions: Patients undergoing ITTVR for significant TR experienced significant improvements in NYHA functional status and 6-minute walking distance and a significant reduction in TR severity at mid-term follow-up.

17.
Background & Aims: Adiposity, type 2 diabetes, alcohol and coffee consumption, and smoking have been examined in relation to diverticular disease in observational studies. We conducted a Mendelian randomization study to assess the causality of these associations.

Methods: Independent genetic instruments associated with the studied exposures at genome-wide significance were obtained from published genome-wide association studies. Summary-level data for the exposure-associated single nucleotide polymorphisms with diverticular disease were available in the FinnGen consortium (10,978 cases and 149,001 noncases) and the UK Biobank study (12,662 cases and 348,532 noncases).

Results: Higher genetically predicted body mass index and genetic liability to type 2 diabetes and smoking initiation were associated with an increased risk of diverticular disease in meta-analyses of results from the two studies. The combined odds ratio of diverticular disease was 1.23 (95% confidence interval [CI], 1.14–1.33; P < .001) for a 1-standard-deviation (~4.8 kg/m2) increase in body mass index, 1.04 (95% CI, 1.01–1.07; P = .007) for a 1-unit increase in the log-transformed odds ratio of type 2 diabetes, and 1.21 (95% CI, 1.12–1.30; P < .001) for a 1-standard-deviation increase in prevalence of smoking initiation. Coffee consumption was not associated with diverticular disease, whereas the association for alcohol consumption largely differed between the two studies.

Conclusions: This study strengthens the causal associations of higher body mass index, type 2 diabetes, and smoking with an increased risk of diverticular disease. Coffee consumption is not associated with diverticular disease. Whether alcohol consumption affects the risk of diverticular disease needs further investigation.
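Two-sample Mendelian randomization with summary statistics typically uses the inverse-variance-weighted (IVW) estimator, pooling each SNP's Wald ratio (outcome effect / exposure effect) weighted by the precision of its outcome association. A sketch over hypothetical per-SNP summary statistics:

```python
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Inverse-variance-weighted MR estimate (fixed-effect form):
    beta_IVW = sum(w * bx * by) / sum(w * bx^2), with w = 1 / se_y^2."""
    bx, by, se = (np.asarray(a) for a in (beta_exposure, beta_outcome, se_outcome))
    w = 1 / se ** 2
    return (w * bx * by).sum() / (w * bx ** 2).sum()

# Hypothetical per-SNP effects on BMI (SD units) and on diverticular
# disease (log-OR scale), with standard errors of the outcome effects:
bx = [0.05, 0.08, 0.03, 0.06]
by = [0.011, 0.018, 0.006, 0.012]
se = [0.004, 0.005, 0.003, 0.004]
print("OR per 1-SD higher BMI:", np.exp(ivw_mr(bx, by, se)))
```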

18.
Introduction and Objectives: Heart retransplantation (ReHT) is controversial in the current era. The aim of this study was to describe and analyze the results of ReHT in Spain.

Methods: We performed a retrospective cohort analysis from the Spanish Heart Transplant Registry from 1984 to 2018. Data were collected on donors, recipients, surgical procedure characteristics, immunosuppression, and survival. The main outcome was posttransplant all-cause mortality or need for ReHT. We studied differences in survival according to indication for ReHT, the time interval between transplants, and era of ReHT.

Results: A total of 7,592 heart transplants (HT) and 173 (2.3%) ReHT were studied (median age, 52.0 and 55.0 years, respectively). Cardiac allograft vasculopathy was the most frequent indication for ReHT (42.2%), and 59 patients (80.8%) received ReHT >5 years after the initial transplant. Acute rejection and primary graft failure decreased as indications over the study period. Renal dysfunction, hypertension, need for mechanical ventilation or intra-aortic balloon pump, and longer cold ischemia time were more frequent in ReHT. Median follow-up for ReHT was 5.8 years. ReHT had worse survival than HT (weighted HR, 1.43; 95% CI, 1.17-1.44; P < .001). The indication of acute rejection (HR, 2.49; 95% CI, 1.45-4.27; P < .001) was related to the worst outcome. ReHT beyond 5 years after the initial HT portended results similar to primary HT (weighted HR, 1.14; 95% CI, 0.86-1.50).

Conclusions: ReHT was associated with higher mortality than HT, especially when indicated for acute rejection. ReHT beyond 5 years had a prognosis similar to primary HT.

19.
JACC: Cardiovascular Imaging. 2022;15(11):1883-1896.
Background: Global circumferential strain (GCS) and global radial strain (GRS) are reduced with cytotoxic chemotherapy. There are limited data on the effect of immune checkpoint inhibitor (ICI) myocarditis on GCS and GRS.

Objectives: This study aimed to detail the role of GCS and GRS in ICI myocarditis.

Methods: In this retrospective study, GCS and GRS from 75 patients with ICI myocarditis (cases) and 50 ICI-treated patients without myocarditis (controls) were compared. Pre-ICI GCS and GRS were available for 12 cases and 50 controls. Measurements were performed in a core laboratory blinded to group and time. Major adverse cardiovascular events (MACE) were defined as a composite of cardiogenic shock, cardiac arrest, complete heart block, and cardiac death.

Results: Cases and controls were similar in age (66 ± 15 years vs 63 ± 12 years; P = 0.20), sex (male: 73% vs 61%; P = 0.20), and cancer type (P = 0.08). Pre-ICI GCS and GRS were also similar (GCS: 22.6% ± 3.4% vs 23.5% ± 3.8%; P = 0.14; GRS: 45.5% ± 6.2% vs 43.6% ± 8.8%; P = 0.24). Overall, 56% (n = 42) of patients with myocarditis presented with preserved left ventricular ejection fraction (LVEF). GCS and GRS were lower in myocarditis cases than in on-ICI controls (GCS: 17.5% ± 4.2% vs 23.6% ± 3.0%; P < 0.001; GRS: 28.6% ± 6.7% vs 47.0% ± 7.4%; P < 0.001). Over a median follow-up of 30 days, 28 cardiovascular events occurred. A GCS (HR: 4.9 [95% CI: 1.6-15.0]; P = 0.005) or GRS (HR: 3.9 [95% CI: 1.4-10.8]; P = 0.008) below the median was associated with an increased event rate. In receiver-operating characteristic (ROC) analysis, GCS (AUC: 0.80 [95% CI: 0.70-0.91]) and GRS (AUC: 0.76 [95% CI: 0.64-0.88]) showed better performance than cardiac troponin T (cTnT) (AUC: 0.70 [95% CI: 0.58-0.82]), LVEF (AUC: 0.69 [95% CI: 0.56-0.81]), and age (AUC: 0.54 [95% CI: 0.40-0.68]). Net reclassification index and integrated discrimination improvement demonstrated incremental prognostic utility of GRS over LVEF (P = 0.04) and of GCS over cTnT (P = 0.002).

Conclusions: GCS and GRS are lower in ICI myocarditis, and the magnitude of the reduction has prognostic significance.
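The ROC comparisons reduce to AUC estimates, which for a single marker equal the Mann-Whitney concordance probability. A dependency-free sketch on synthetic scores:

```python
def auc(scores_events, scores_nonevents):
    """AUC = P(score_event > score_nonevent), counting ties as 1/2
    (the Mann-Whitney estimator)."""
    pairs = 0.0
    for e in scores_events:
        for n in scores_nonevents:
            pairs += 1.0 if e > n else (0.5 if e == n else 0.0)
    return pairs / (len(scores_events) * len(scores_nonevents))

# Hypothetical risk scores (e.g. negated GCS, so that higher = worse):
events    = [-15.0, -16.5, -14.2, -18.0]
nonevents = [-22.0, -19.5, -23.1, -20.8, -17.9]
print(auc(events, nonevents))  # 0.95 here; the abstract reports 0.80 for GCS
```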

20.
JACC: Cardiovascular Imaging. 2021;14(12):2414-2424.
Objectives: This study aimed to investigate the additional contribution of the coronary artery calcium (CAC) score to the SAFEHEART (Spanish Familial Hypercholesterolemia Cohort Study) risk equation (SAFEHEART-RE) for cardiovascular risk prediction in heterozygous familial hypercholesterolemia (HeFH).

Background: Common cardiovascular risk equations are imprecise for HeFH. Because of the high phenotype variability of HeFH, the CAC score could help to better stratify the risk of atherosclerotic cardiovascular disease (ASCVD).

Methods: REFERCHOL (French Registry of Familial Hypercholesterolemia) and SAFEHEART are 2 ongoing national registries on HeFH. We analyzed data from primary prevention HeFH patients undergoing CAC quantification. We used probability-weighted Cox proportional hazards models to estimate HRs. Area under the receiver-operating characteristic curve (AUC) and net reclassification improvement (NRI) were used to compare the incremental contribution of the CAC score when added to the SAFEHEART-RE for ASCVD prediction. ASCVD was defined as coronary heart disease, stroke or transient ischemic attack, peripheral artery disease, resuscitated sudden death, and cardiovascular death.

Results: We included 1,624 patients (mean age: 48.5 ± 12.8 years; men: 45.7%) from both registries. After a median follow-up of 2.7 years (interquartile range: 0.4-5.0 years), ASCVD occurred in 81 subjects. The presence of a CAC score of >100 was associated with an HR of 32.05 (95% CI: 10.08-101.94) for developing ASCVD as compared with a CAC score of 0. Receiver-operating characteristic curve analysis showed good performance of the CAC score alone in ASCVD prediction (AUC: 0.860 [95% CI: 0.853-0.869]). The addition of log(CAC + 1) to the SAFEHEART-RE resulted in significantly improved prediction of ASCVD (AUC: 0.884 [95% CI: 0.871-0.894] for SAFEHEART-RE + log(CAC + 1) vs AUC: 0.793 [95% CI: 0.779-0.818] for SAFEHEART-RE; P < 0.001). These results were also confirmed when considering only hard cardiovascular endpoints. The addition of the CAC score was associated with an estimated overall net reclassification improvement of 45.4%.

Conclusions: The CAC score proved its usefulness in improving cardiovascular risk stratification and ASCVD prediction in statin-treated HeFH.
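Adding CAC to the risk equation uses the log(CAC + 1) transform, which keeps a zero calcium score defined. A sketch comparing a base model against base + log(CAC + 1) with scikit-learn, where the stand-in for the SAFEHEART-RE linear predictor and all data are simulated:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
base_risk = rng.normal(size=n)                 # stand-in for the SAFEHEART-RE linear predictor
cac = rng.gamma(shape=0.5, scale=200, size=n)  # skewed CAC scores, many near zero
log_cac = np.log(cac + 1)                      # the transform used in the paper

# Simulate events from a logistic model so CAC truly carries signal here:
lin_pred = 0.8 * base_risk + 0.6 * log_cac - 4
y = (rng.random(n) < 1 / (1 + np.exp(-lin_pred))).astype(int)

for name, X in [("base model", base_risk[:, None]),
                ("base + log(CAC+1)", np.column_stack([base_risk, log_cac]))]:
    model = LogisticRegression().fit(X, y)
    print(name, "AUC:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
```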
