Similar Articles (20 results)
1.
Objectives: In this study, the authors tested the hypotheses that the systolic stretch index (SSI), developed by computer modeling and applied using echocardiographic strain imaging, may characterize the electromechanical substrate predictive of outcome following cardiac resynchronization therapy (CRT). They included patients with QRS width 120 to 149 ms or non-left bundle branch block (LBBB), in whom clinical uncertainty about CRT exists. They further tested the hypothesis that global longitudinal strain (GLS) has additional prognostic value.
Background: Response to CRT is variable. Guidelines favor patient selection by electrocardiographic LBBB with QRS width ≥150 ms.
Methods: The authors studied 442 patients enrolled in the 94-site randomized Adaptive CRT trial with New York Heart Association functional class III–IV heart failure, ejection fraction ≤35%, and QRS ≥120 ms. A novel computer program semiautomatically calculated the SSI from strain curves as the sum of the posterolateral prestretch percent before aortic valve opening and the septal rebound stretch percent during ejection. The primary endpoint was hospitalization for heart failure (HF) or death, and the secondary endpoint was death over the 2 years after CRT.
Results: In all patients, high longitudinal SSI (≥ group median of 3.1%) was significantly associated with freedom from the primary endpoint of HF hospitalization or death (hazard ratio [HR] for low SSI: 2.17; 95% confidence interval [CI]: 1.45 to 3.24; p < 0.001) and from the secondary endpoint of death (HR for low SSI: 4.06; 95% CI: 1.95 to 8.45; p < 0.001). Among the 203 patients with QRS 120 to 149 ms or non-LBBB, those with high longitudinal SSI (≥ group median of 2.6%) had significantly fewer HF hospitalizations or deaths (HR for low SSI: 2.08; 95% CI: 1.27 to 3.41; p = 0.004) and longer survival (HR for low SSI: 5.08; 95% CI: 1.94 to 13.31; p < 0.001), similar to patients with LBBB and QRS ≥150 ms. SSI by circumferential strain had similar associations with clinical outcomes, and GLS was additive to SSI in predicting clinical events (p = 0.001).
Conclusions: Systolic stretch by strain imaging characterized the myocardial substrate associated with favorable CRT response, including in the important patient subgroup with QRS width 120 to 149 ms or non-LBBB. GLS had additive prognostic value.

2.
Background: Cardiac resynchronization therapy (CRT) is usually performed by biventricular (BiV) pacing. Previously, the feasibility of transvenous implantation of a lead at the left ventricular (LV) endocardial side of the interventricular septum, referred to as LV septal (LVs) pacing, was demonstrated.
Objectives: The authors sought to compare the acute electrophysiological and hemodynamic effects of LVs pacing with BiV and His bundle (HB) pacing in CRT patients.
Methods: Temporary LVs pacing (transaortic approach) alone or in combination with right ventricular (RV) pacing (LVs+RV), BiV, and HB pacing was performed in 27 patients undergoing CRT implantation. Electrophysiological changes were assessed using electrocardiography (QRS duration), vectorcardiography (QRS area), and multielectrode body surface mapping (standard deviation of activation times [SDAT]). Hemodynamic changes were assessed as the maximum first derivative of LV pressure (LVdP/dtmax).
Results: As compared with baseline, LVs pacing resulted in a larger reduction in QRS area (to 73 ± 22 μVs) and SDAT (to 26 ± 7 ms) than BiV pacing (to 93 ± 26 μVs and 31 ± 7 ms; both p < 0.05) and LVs+RV pacing (to 108 ± 37 μVs, p < 0.05; and 29 ± 8 ms, p = 0.05). The increase in LVdP/dtmax was similar during LVs and BiV pacing (17 ± 10% vs. 17 ± 9%, respectively) and larger than during LVs+RV pacing (11 ± 9%; p < 0.05). There were no significant differences between basal, mid-, or apical LVs levels in LVdP/dtmax and SDAT. In a subgroup of 16 patients, changes in QRS area, SDAT, and LVdP/dtmax were comparable between LVs and HB pacing.
Conclusions: LVs pacing provides short-term hemodynamic improvement and electrical resynchronization at least as good as that during BiV and possibly HB pacing. These results indicate that LVs pacing may serve as a valuable alternative for CRT.

3.
Background: Implantable cardioverter-defibrillators (ICDs) and cardiac resynchronization therapy (CRT) reduce sudden cardiac death and all-cause mortality in patients with heart failure with reduced ejection fraction (HFrEF). Current guidelines do not suggest any upper age limit for ICD and CRT but recommend avoiding ICD and CRT in frail patients with a life expectancy of less than 1 year. It remains unclear whether elderly patients undergoing CRT derive the same additional benefit from ICDs as younger patients. We aimed to assess the use of ICDs in elderly compared with younger patients receiving CRT.
Methods: We searched electronic databases, up to April 11, 2016, for all studies reporting on ICD use stratified by age in patients who received CRT. We used random-effects meta-analysis models to calculate the summarized baseline characteristics and rates of ICD implantation among patients enrolled in the studies.
Results: We retained six observational studies enrolling 613 patients ≥75 years old and 2810 patients <75 years old. The aggregate mean age was 82.7 years for the elderly patients compared with 66.3 years for the younger patients. ICD use was significantly lower in elderly patients than in younger patients (37.9% versus 64.3%; odds ratio: 0.26; 95% confidence interval: 0.14-0.46; p < 0.0001).
Conclusions: ICDs were less frequently used in patients ≥75 years old receiving CRT than in younger patients receiving CRT. Future studies that evaluate the efficacy and effectiveness of ICDs in elderly patients with indications for CRT are needed to guide management of this growing population.

4.
Background: The aim of our study was to compare the effect of interventricular (VV) delay optimisation in CRT recipients, based on the systolic dyssynchrony index (SDI) derived from three-dimensional echocardiography (3DE) versus QRS width assessment, on left ventricular volume reduction at the 12-month follow-up.
Methods: We included 63 patients with recently implanted CRT in this randomised, open-label trial. Patients were randomised to VV delay optimisation according to QRS complex width measurement in group 1 (n = 31), to obtain the narrowest QRS complex, or according to SDI in group 2 (n = 32), to achieve its lowest possible value. We evaluated left ventricular end-systolic volume (LVESV), left ventricular ejection fraction (LVEF) and SDI by 3DE before CRT implantation and at the 12-month follow-up in all patients. We also obtained the New York Heart Association functional class, the 6-minute walk test, the quality-of-life questionnaire and the NT-proBNP level.
Results: The number of volumetric responders was similar in both groups (17 vs. 20, P = 0.786). There were also no significant differences in the reduction of LVESV (−41 ± 55 mL vs. −61 ± 51 mL, P = 0.111), improvement in LVEF (+10.1 ± 10.6% vs. +13.0 ± 9.9%, P = 0.213) or clinical outcomes between the groups at the 12-month follow-up.
Conclusion: CRT optimisation of interventricular delay using SDI compared with QRS width assessment did not reveal any significant difference in volumetric or clinical response at the 12-month follow-up.

5.
Objectives: The purpose of this study was to examine the prognostic value of T1- and T2-mapping techniques in heart transplant patients.
Background: Myocardial characterization using T2 mapping (evaluation of edema/inflammation) and pre- and post-gadolinium-contrast T1 mapping (calculation of the extracellular volume fraction [ECV] for assessment of interstitial expansion/fibrosis) are emerging modalities that have been investigated in various cardiomyopathies.
Methods: A total of 99 heart transplant patients underwent magnetic resonance imaging (MRI) scans including T1- (n = 90) and T2-mapping (n = 79) techniques. Relevant clinical characteristics, MRI parameters including late gadolinium enhancement (LGE), and invasive hemodynamics were collected. Median clinical follow-up duration after the baseline scan was 2.4 to 3.5 years. Clinical outcomes included cardiac events (cardiac death, myocardial infarction, coronary revascularization, and heart failure hospitalization), noncardiac death, and noncardiac hospitalization.
Results: Overall, the global native T1, postcontrast T1, ECV, and T2 were 1,030 ± 56 ms, 458 ± 84 ms, 27 ± 4%, and 50 ± 4 ms, respectively. Top-tercile-range ECV (ECV >29%) independently predicted adverse clinical outcomes compared with bottom-tercile-range ECV (ECV <25%) (hazard ratio [HR]: 2.87; 95% confidence interval [CI]: 1.07 to 7.68; p = 0.04) in a multivariable model with left ventricular end-systolic volume and LGE. Higher T2 (T2 ≥50.2 ms) independently predicted adverse clinical outcomes (HR: 3.01; 95% CI: 1.39 to 6.54; p = 0.005) after adjustment for left ventricular ejection fraction, left ventricular end-systolic volume, and LGE. Additionally, higher T2 (T2 ≥50.2 ms) also independently predicted cardiac events (HR: 4.92; 95% CI: 1.60 to 15.14; p = 0.005) in a multivariable model with left ventricular ejection fraction.
Conclusions: MRI-derived myocardial ECV and T2 mapping in heart transplant patients were independently associated with cardiac and noncardiac outcomes. Our findings highlight the need for larger prospective studies.

6.
Background: Whether to repair nonsevere tricuspid regurgitation (TR) during surgery for ischemic mitral valve regurgitation (IMR) remains uncertain.
Objectives: The goal of this study was to investigate the incidence, predictors, and clinical significance of TR progression and of ≥moderate TR after IMR surgery.
Methods: Patients (n = 492) with untreated nonsevere TR within 2 prospectively randomized IMR trials were included. Key outcomes were TR progression (progression by ≥2 grades, surgery for TR, or severe TR at 2 years) and presence of ≥moderate TR at 2 years.
Results: Patients' mean age was 66 ± 10 years (67% male), and the TR distribution was 60% ≤trace, 31% mild, and 9% moderate. Among 2-year survivors, TR progression occurred in 20 (6%) of 325 patients. Baseline tricuspid annular diameter (TAD) was not predictive of TR progression. At 2 years, 37 (11%) of 323 patients had ≥moderate TR. Baseline TR grade, indexed TAD, and surgical ablation for atrial fibrillation were independent predictors of ≥moderate TR. However, TAD alone had poor discrimination (area under the curve ≤0.65). Presence of ≥moderate TR at 2 years was more common in patients with MR recurrence (20% vs. 9%; p = 0.02) and in those with a permanent pacemaker/defibrillator (19% vs. 9%; p = 0.01). Clinical event rates (composite of an increase of ≥1 New York Heart Association functional class, heart failure hospitalization, mitral valve surgery, and stroke) were higher in patients with TR progression (55% vs. 23%; p = 0.003) and with ≥moderate TR at 2 years (38% vs. 22%; p = 0.04).
Conclusions: After IMR surgery, progression of unrepaired nonsevere TR is uncommon. Baseline TAD is not predictive of TR progression and is poorly discriminative of ≥moderate TR at 2 years. TR progression and presence of ≥moderate TR are associated with clinical events. (Comparing the Effectiveness of a Mitral Valve Repair Procedure in Combination With Coronary Artery Bypass Grafting [CABG] Versus CABG Alone in People With Moderate Ischemic Mitral Regurgitation, NCT00806988; Comparing the Effectiveness of Repairing Versus Replacing the Heart's Mitral Valve in People With Severe Chronic Ischemic Mitral Regurgitation, NCT00807040)

7.
Objectives: The aim of this study was to define risk factors and develop a predictive risk score for new pacemaker implantation (PMI) after transcatheter aortic valve replacement (TAVR).
Background: TAVR has become an accepted treatment alternative for patients with severe aortic stenosis at elevated surgical risk. New PMI is a common occurrence after TAVR and is associated with poorer outcomes.
Methods: All patients without prior valve procedures undergoing elective TAVR with the Edwards SAPIEN 3 at a single institution (n = 1,266) were evaluated. Multivariate analysis was performed to evaluate predictors of PMI in a derivation cohort of patients with complete data (n = 778), and this model was used to develop the Emory risk score (ERS), which was tested in a validation cohort (n = 367).
Results: Fifty-seven patients (7.3%) in the derivation cohort required PMI. In a regression model, history of syncope (odds ratio [OR]: 2.5; p = 0.026), baseline right bundle branch block (OR: 4.3; p < 0.001), QRS duration ≥138 ms (OR: 2.5; p = 0.017), and valve oversizing >15.6% (OR: 1.9; p = 0.041) remained independent predictors of PMI and were included in the ERS. In the validation cohort, the ERS was strongly associated with PMI (OR per point increase: 2.2; p < 0.001), with an area under the receiver-operating characteristic curve of 0.778 (p < 0.001), similar to its performance in the derivation cohort.
Conclusions: A history of syncope, right bundle branch block, longer QRS duration, and a higher degree of oversizing are predictive of the need for PMI after TAVR. Additionally, the ERS for PMI was developed and validated, representing a simple bedside tool to aid in risk stratification of patients undergoing TAVR.
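The four binary predictors and the per-point odds ratio reported above lend themselves to a simple additive score. A minimal sketch, assuming each predictor contributes one point (the abstract does not publish the exact ERS point weighting, so the equal weights and function names here are illustrative, not the authors' method):

```python
def ers_points(syncope: bool, rbbb: bool, qrs_ms: float, oversizing_pct: float) -> int:
    """Count how many of the four reported ERS predictors are present.

    Equal one-point weights are an assumption for illustration only.
    """
    predictors = [
        syncope,                # history of syncope (reported OR: 2.5)
        rbbb,                   # baseline right bundle branch block (OR: 4.3)
        qrs_ms >= 138,          # QRS duration >=138 ms (OR: 2.5)
        oversizing_pct > 15.6,  # valve oversizing >15.6% (OR: 1.9)
    ]
    return sum(predictors)


def relative_odds(points: int, per_point_or: float = 2.2) -> float:
    """Odds of PMI relative to a zero-point patient, using the reported per-point OR."""
    return per_point_or ** points
```

For example, a patient with right bundle branch block and a QRS of 150 ms but no syncope and 10% oversizing would score 2 points, corresponding to roughly 2.2² ≈ 4.8-fold higher odds of PMI than a zero-point patient under this reading.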

8.
Objectives: The aim of this study was to explore the association between mechanical dyssynchrony of the left ventricle before cardiac resynchronization therapy (CRT) and improvement of mitral regurgitation (MR) after CRT.
Background: MR is very frequent among patients with dilated cardiomyopathy and conduction delay.
Methods: Echocardiograms (pre-CRT and 12 ± 3.8 months thereafter) of 314 patients with dilated cardiomyopathy and any degree of MR who underwent CRT device implantation according to guidelines were analyzed. Left ventricular (LV) mechanical dyssynchrony was assessed by apical rocking (ApRock) and septal flash (SF), while MR severity was graded from I to IV on the basis of vena contracta width, regurgitation jet size, and proximal isovelocity surface area.
Results: At baseline, 30% of patients presented with severe MR (grade III or IV). In 62% of patients, MR decreased after CRT; these patients more frequently had left bundle branch block, more severe MR, more dilated left ventricles, lower ejection fractions, and more often had ApRock and SF. Reverse remodeling was more frequent among patients with MR reduction (ΔLV end-systolic volume −35.5% ± 27.2% vs −4.1% ± 33.2%; P < 0.001). In a multivariable logistic stepwise regression, only ApRock (odds ratio [OR]: 3.8; 95% CI: 1.7-8.5; P = 0.001), SF (OR: 3.6; 95% CI: 1.6-7.9; P = 0.002), and baseline MR (OR: 1.4; 95% CI: 1.0-1.9; P = 0.046) remained significantly associated with MR reduction.
Conclusions: ApRock, SF, and severity of MR at baseline are strongly associated with MR reduction after CRT, with LV reverse remodeling as the underlying mechanism. Therefore, in patients with heart failure and LV dyssynchrony on optimal medical treatment, CRT should be the primary treatment attempt for relevant MR.

9.
JACC: Cardiovascular Imaging 2021;14(11):2059-2069
Objectives: This study sought to investigate whether contractile asymmetry between the septum and the left ventricular (LV) lateral wall drives heart failure development in patients with left bundle branch block (LBBB) and whether the presence of lateral wall dysfunction affects the potential for recovery of LV function with cardiac resynchronization therapy (CRT).
Background: LBBB may induce or aggravate heart failure. Understanding the underlying mechanisms is important to optimize the timing of CRT.
Methods: In 76 nonischemic patients with LBBB and 11 controls, we measured strain using speckle-tracking echocardiography and regional work using pressure-strain analysis. Patients with LBBB were stratified according to LV ejection fraction (EF) ≥50% (EFpreserved), 36% to 49% (EFmid), and ≤35% (EFlow). Sixty-four patients underwent CRT and were re-examined after 6 months.
Results: Septal work was successively reduced from controls through EFpreserved, EFmid, and EFlow (all p < 0.005) and showed a strong correlation with left ventricular ejection fraction (LVEF; r = 0.84; p < 0.005). In contrast, LV lateral wall work was numerically increased in EFpreserved and EFmid versus controls and did not significantly correlate with LVEF in these groups. In EFlow, however, LV lateral wall work was substantially reduced (p < 0.005). There was a moderate overall correlation between LV lateral wall work and LVEF (r = 0.58; p < 0.005). In CRT recipients, LVEF normalized (≥50%) in 54% of patients with preserved LV lateral wall work but in only 13% of patients with reduced LV lateral wall work (p < 0.005).
Conclusions: In its early stages, LBBB-induced heart failure is associated with impaired septal function but preserved lateral wall function. The advent of LV lateral wall dysfunction may mark an optimal time point for CRT.

10.
Objectives: This study sought to investigate the impact of computed tomography (CT)-based area and perimeter oversizing on the incidence of paravalvular regurgitation (PVR) and valve hemodynamics in patients treated with the SAPIEN 3 transcatheter heart valve (THV).
Background: The incremental value of considering annular perimeter or left ventricular outflow tract measurements and the impact of THV oversizing on valve hemodynamics are not well defined.
Methods: The PARTNER 3 (Placement of Aortic Transcatheter Valves 3) trial included 495 low-surgical-risk patients with severe aortic stenosis who underwent THV implantation. THV sizing was based on annular area assessed by CT. Area- and perimeter-based oversizing was determined using systolic annular CT dimensions and the nominal dimensions of the implanted THV. PVR, effective orifice area, and mean gradient were assessed on 30-day transthoracic echocardiography.
Results: Of 485 patients with available CT and echocardiography data, mean oversizing was 7.9 ± 8.7% for the annular area and 2.1 ± 4.1% for the perimeter. A very low incidence of ≥moderate PVR (0.6%) was observed, including in patients with minimal annular oversizing. The incidence of ≥mild PVR and the need for procedural post-dilatation were inversely related to the degree of oversizing. For patients with annular dimensions suitable for 2 THV sizes, the larger THV with both area and perimeter oversizing was associated with the lowest incidence of ≥mild PVR (12.0% vs 43.4%; P < 0.0001). Left ventricular outflow tract area oversizing was not associated with PVR. THV prosthesis size, rather than degree of oversizing, had the greatest impact on effective orifice area and mean gradient.
Conclusions: In low-surgical-risk patients, a low incidence of ≥moderate PVR was observed, including in patients with minimal THV oversizing. The degree of prosthesis oversizing had the greatest impact on reducing mild PVR and the incidence of post-dilatation, without impairing valve hemodynamics. In selected patients with annular dimensions between 2 valve sizes, the larger THV device, oversized to both the annular area and perimeter, reduced PVR and optimized THV hemodynamics.

11.
JACC: Cardiovascular Imaging 2021;14(12):2275-2285
Objectives: The aim of this study was to examine the value of first-phase ejection fraction (EF1) in predicting response to cardiac resynchronization therapy (CRT) and clinical outcomes after CRT.
Background: CRT is an important treatment for patients with chronic heart failure. However, even in carefully selected cases, up to 40% of patients fail to respond. EF1, the ejection fraction up to the time of maximal ventricular contraction, is a novel, sensitive echocardiographic measure of early systolic function and might relate to response to CRT.
Methods: An initial retrospective study was performed in 197 patients who underwent CRT between 2009 and 2018 at King's Health Partners in London and were followed to determine clinical outcomes. A validation study (n = 100) was performed in patients undergoing CRT at Barts Heart Centre in London.
Results: The volumetric response rate (reduction in end-systolic volume ≥15%) was 92.3% for patients with EF1 in the highest tertile and 12.1% for those in the lowest tertile (P < 0.001). A cutoff value of 11.9% for EF1 had >85% sensitivity and specificity for prediction of response to CRT; on multivariate binary logistic regression analysis incorporating previously defined predictors, EF1 was the strongest predictor of response (odds ratio [OR]: 1.56 per 1% change in EF1; 95% CI: 1.37-1.78; P < 0.001). EF1 was also the strongest predictor of improvement in clinical composite score (OR: 1.11; 95% CI: 1.04-1.19; P = 0.001). Improvement in EF1 at 6 months after CRT implantation (6.5% ± 5.8% vs 1.8% ± 4.3% in responders vs nonresponders; P < 0.001) was the best predictor of heart failure rehospitalization and death over a median follow-up of 20.3 months (HR: 0.81; 95% CI: 0.73-0.90; P < 0.001). In the validation cohort, EF1 was a similarly strong predictor of response (OR: 1.45; 95% CI: 1.23-1.70; P < 0.001) as in the original cohort.
Conclusions: EF1 is a promising marker to identify patients likely to respond to CRT.

12.
Objectives: This study assessed the impact of right atrial (RA) pacing on left atrial (LA) physiology and clinical outcome.
Background: Data on the effects of RA pacing on LA synchronicity, function, and structure after cardiac resynchronization therapy (CRT) are scarce.
Methods: The effect of RA pacing on LA function, morphology, and synchronicity was assessed in a prospective imaging cohort of heart failure (HF) patients in sinus rhythm with a guideline-based indication for CRT. Additionally, in a retrospective outcome cohort of consecutive HF patients undergoing CRT implantation, the relationship to RA pacing was assessed using various outcome endpoints. High versus low atrial pacing burden was defined as atrial pacing above or below 50% in both cohorts.
Results: A total of 36 patients were included in the imaging cohort (68 ± 11 years of age). Six months after CRT, patients with a high RA pacing burden showed less improvement in LA maximum and minimum volumes and total emptying fraction (p < 0.05). Peak atrial longitudinal strain and reservoir and booster strain rates, but not conduit strain rate, improved after CRT in patients with a low RA pacing burden but worsened in patients with a high RA pacing burden (p < 0.05 for all). A high RA pacing burden induced significant intra-atrial dyssynchrony (maximum opposing wall delay: 44 ± 13 ms vs. 97 ± 17 ms; p = 0.022). A total of 569 patients were included in the outcome cohort. After covariate adjustment, a high RA pacing burden was associated with reduced LV reverse remodeling (β = 8.738; 95% confidence interval [CI]: 3.101 to 14.374; p = 0.002) and new-onset or recurrent atrial fibrillation (AF) (41% vs. 22% at a median follow-up of 31 months [range: 22 to 44 months]; p < 0.001). There were no differences in time to first HF hospitalization or all-cause mortality (p = 0.185) after covariate adjustment. However, in a recurrent-event analysis, HF readmissions were more common in patients exposed to a high RA pacing burden (p = 0.003).
Conclusions: RA pacing in CRT patients negatively influences LA morphology, function, and synchronicity and is associated with worse clinical outcomes, including diminished LV reverse remodeling and increased risk of new-onset or recurrent AF and HF readmission. Strategies reducing RA pacing burden may be warranted.

13.
Background: Little is known about the incidence and clinical relevance of postprocedural acute kidney injury (AKI) in patients undergoing transcatheter edge-to-edge repair (TEER) for tricuspid regurgitation (TR).
Objectives: The aim of this study was to investigate the prognostic impact of postprocedural AKI following TEER for TR.
Methods: Two hundred sixty-eight patients who underwent TEER for TR at 2 centers were retrospectively analyzed. Postprocedural AKI was defined as an increase in serum creatinine of ≥0.3 mg/dL within 48 hours or ≥50% within 7 days after the procedure compared with baseline. The association between AKI and the composite outcome, consisting of all-cause mortality and rehospitalization for heart failure within 1 year after the procedure, was determined.
Results: The mean age of the patients was 79.0 ± 6.8 years, and 43.3% were men. Postprocedural AKI occurred in 42 patients (15.7%). Age, male sex, an estimated glomerular filtration rate of <60 mL/min/1.73 m2, and absence of procedural success were associated with the occurrence of AKI. Patients with AKI had a higher incidence of in-hospital mortality than those without AKI (9.5% vs 0.9%; P = 0.006). Moreover, AKI was associated with the incidence of the composite outcome within 1 year after TEER for TR (adjusted HR: 2.39; 95% CI: 1.45-3.94; P = 0.001).
Conclusions: Postprocedural AKI occurred in 15.7% of patients undergoing TEER for TR, despite the absence of iodinated contrast agents, and was associated with worse clinical outcomes. These findings highlight the clinical impact of AKI following TEER for TR and should help identify patients at high risk for AKI.
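The AKI definition stated above (a serum creatinine rise of ≥0.3 mg/dL within 48 hours, or a relative rise of ≥50% within 7 days, compared with baseline) translates directly into a simple check. A minimal sketch; the function and parameter names are illustrative, not taken from the study:

```python
def is_postprocedural_aki(baseline_cr: float, cr_48h: float, cr_7d: float) -> bool:
    """Apply the study's AKI definition to serum creatinine values (mg/dL).

    baseline_cr: creatinine before the procedure
    cr_48h:      highest creatinine within 48 hours after the procedure
    cr_7d:       highest creatinine within 7 days after the procedure
    """
    absolute_rise = (cr_48h - baseline_cr) >= 0.3   # >=0.3 mg/dL within 48 h
    relative_rise = cr_7d >= 1.5 * baseline_cr      # >=50% rise within 7 days
    return absolute_rise or relative_rise
```

For instance, a rise from 1.0 to 1.3 mg/dL within 48 hours meets the absolute criterion, while a rise from 1.0 to 1.6 mg/dL by day 7 meets the relative criterion; either alone is sufficient under this definition.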

14.
Objectives: The aim of this registry was to evaluate the feasibility and safety of transcatheter tricuspid valve implantation (TTVI) in patients at extreme surgical risk.
Background: Isolated tricuspid regurgitation (TR) surgery is associated with high in-hospital mortality.
Methods: Thirty consecutive patients (mean age 75 ± 10 years; 56% women) from 10 institutions, with symptomatic functional TR, had institutional and notified-body approval for compassionate use of the GATE TTVI system. Baseline, discharge, and 30-day follow-up echocardiographic data and procedural, in-hospital, and follow-up clinical outcomes were collected.
Results: At baseline, all patients had multiple comorbidities, severe or greater TR, and reduced right ventricular function. Technical success was achieved in 26 of 30 patients (87%). Device malpositioning occurred in 4 patients, with conversion to open heart surgery in 2 (5%). Of those who received the device, 100% had a reduction in TR of ≥1 grade, and 75% experienced reductions of ≥2 grades, resulting in 18 of 24 patients (75%) with mild or less TR at discharge. All patients had mild or less central TR. TR grade continued to improve between discharge and 30 days in 15 of 19 patients (79%). In-hospital mortality was 10%. At a mean follow-up of 127 ± 82 days, 4 patients (13%) had died. Of patients alive at follow-up, 62% were in New York Heart Association functional class I or II, with no late device-related adverse events.
Conclusions: Compassionate treatment of severe, symptomatic functional TR using a first-generation TTVI device is associated with significant reduction in TR and improvement in functional status, with acceptable in-hospital mortality. Further studies are needed to determine the appropriate patient population and long-term outcomes with TTVI therapy.

15.
Background: Patients presenting with acute coronary syndrome (ACS) and nonobstructive coronary arteries are a diagnostic dilemma. Cardiac magnetic resonance (CMR) has an overall diagnostic yield of ~75%; however, in ~25% of patients, CMR does not identify any myocardial injury. Identifying the underlying diagnosis has important clinical implications for patients' management and outcome.
Objectives: The authors sought to assess whether combining CMR with peak troponin levels in patients with ACS and nonobstructive coronary arteries would increase diagnostic yield.
Methods: Consecutive patients with ACS and nonobstructive coronary arteries without an obvious cause underwent CMR. The primary endpoint of the study was the diagnostic yield of CMR. The Youden index was used to find the optimal diagnostic cut point for peak troponin T to combine with CMR to improve diagnostic yield. Logistic or Cox regression models were used to estimate predictors of a diagnosis by CMR.
Results: A total of 719 patients met the inclusion criteria. The peak troponin T threshold for optimal diagnostic sensitivity and specificity was 211 ng/L. Overall, CMR had a diagnostic yield of 74%. CMR performed <14 days from presentation with a peak troponin of ≥211 ng/L (n = 198) led to an improved diagnostic yield (94% vs 72%) compared with CMR performed ≥14 days from presentation (n = 245). When CMR was performed <14 days from presentation with a peak troponin of <211 ng/L, the diagnostic yield was 76% (n = 86), compared with 53% (n = 190) when performed ≥14 days from presentation. An increase of 1 decile in peak troponin increased the odds of CMR identifying a diagnosis by 20% (OR: 1.20; 95% CI: 1.05-1.36; P = 0.008).
Conclusions: The combination of CMR performed <14 days from presentation and peak troponin T ≥211 ng/L leads to a very high diagnostic yield (94%) on CMR. The diagnostic yield remains high (72%) even when CMR is performed ≥14 days from presentation, but falls to 53% when peak troponin T is <211 ng/L.

16.
Objectives: This study sought to determine whether combining the Seattle Heart Failure Model (SHFM-D) and cardiac magnetic resonance (CMR) provides complementary prognostic data for patients with cardiac resynchronization therapy (CRT) defibrillators.
Background: The SHFM-D is among the most widely used risk-stratification models for overall survival in patients with heart failure and implantable cardioverter-defibrillators (ICDs), and CMR provides highly detailed information regarding cardiac structure and function.
Methods: CMR displacement encoding with stimulated echoes (DENSE) strain imaging was used to generate the circumferential uniformity ratio estimate with singular value decomposition (CURE-SVD), a circumferential strain dyssynchrony parameter, and the SHFM-D was determined from clinical parameters. Multivariable Cox proportional hazards regression was used to determine adjusted hazard ratios and time-dependent areas under the curve for the primary endpoint of death, heart transplantation, left ventricular assist device implantation, or appropriate ICD therapies.
Results: The cohort consisted of 100 patients (median age 65.5 years [interquartile range: 57.7 to 72.7]; 29% female), of whom 47% met the primary clinical endpoint and 18% had appropriate ICD therapies during a median follow-up of 5.3 years. CURE-SVD and the SHFM-D were independently associated with the primary endpoint (SHFM-D: hazard ratio: 1.47/SD; 95% confidence interval: 1.06 to 2.03; p = 0.02; CURE-SVD: hazard ratio: 1.54/SD; 95% confidence interval: 1.12 to 2.11; p = 0.009). Furthermore, a favorable prognostic group (group A, with CURE-SVD <0.60 and SHFM-D <0.70), comprising approximately one-third of the patients, had a very low rate of appropriate ICD therapies (1.5% per year) and greater (90%) 4-year survival compared with group B (CURE-SVD ≥0.60 or SHFM-D ≥0.70) patients (p = 0.02). CURE-SVD with DENSE had a stronger correlation with CRT response (r = −0.57; p < 0.0001) than CURE-SVD with feature tracking (r = −0.28; p = 0.004).
Conclusions: A combined approach to risk stratification using CMR DENSE strain imaging and a widely used clinical risk model, the SHFM-D, proved effective in this cohort of patients referred for CRT defibrillators. The combined use of CMR and clinical risk models represents a promising and novel paradigm to inform prognosis and device selection in the future.

17.
Objectives: The aim of this study was to investigate whether the degree of aortic angulation (AA) affects outcomes after transcatheter aortic valve replacement (TAVR) using newer-generation transcatheter heart valves (THVs).
Background: AA ≥48° has been reported to adversely influence accurate THV deployment, procedural success, fluoroscopy time, and paravalvular leak (PVL) in patients undergoing TAVR with early-generation self-expanding (SE) THVs.
Methods: A retrospective observational study was conducted among 841 patients across all risk strata who underwent transfemoral TAVR using the balloon-expandable (BE) SAPIEN 3 or the SE CoreValve Evolut PRO from 2015 to 2020. The previously published cutoff of 48° was used to analyze procedural success and in-hospital outcomes according to THV type. Receiver-operating characteristic analysis was performed to investigate the impact of AA on an in-hospital composite outcome (need for >1 THV, more than mild PVL, new permanent pacemaker implantation, stroke, and death).
Results: AA ≥48° did not influence outcomes in patients with BE THVs. Additionally, in patients with SE THVs, AA ≥48° did not influence procedural success (99.1% vs. 99.1%; p = 0.980), the number of THVs used (1.02 vs. 1.04; p = 0.484), or the rates of more than mild PVL (0.4% vs. 0%; p = 0.486), new permanent pacemaker implantation (11.8% vs. 17.1%; p = 0.178), in-hospital stroke (3.9% vs. 1.8%; p = 0.298), or in-hospital death (0.4% vs. 0.9%; p = 0.980). Receiver-operating characteristic analysis demonstrated similar outcomes irrespective of AA, with areas under the curve of 0.5525 for SE THVs and 0.5115 for BE THVs.
Conclusions: AA no longer plays a role with new-generation BE or SE THVs in contemporary TAVR practice. AA ≥48° did not affect procedural success or in-hospital outcomes and should no longer be a consideration when selecting a THV.

18.
ObjectivesThis study aims to establish a computed tomography (CT)–based scoring system for grading mitral annular calcification (MAC) severity and potentially aid in predicting valve embolization during transcatheter mitral valve (MV) replacement using balloon-expandable aortic transcatheter heart valves.BackgroundTranscatheter MV replacement is emerging as an alternative treatment for patients with severe MAC who are not surgical candidates. Although cardiac CT is the imaging modality of choice in the evaluation of candidates for valve-in-MAC (ViMAC), a standardized grading system to quantify MAC severity has not been established.MethodsWe performed a multicenter retrospective review of cardiac CT and clinical outcomes of patients undergoing ViMAC. A CT-based MAC score was created using the following features: average calcium thickness (mm), degrees of annulus circumference involved, calcification at one or both fibrous trigones, and calcification of one or both leaflets. Features were assigned points according to severity (total maximum score = 10), and a severity grade was assigned based on total points (mild ≤3, moderate 4 to 6, and severe ≥7 points). The association between MAC score and device migration/embolization was evaluated.ResultsOf 117 patients in the TMVR in MAC registry, 87 had baseline cardiac CT of adequate quality. Of these, 15 were treated with transatrial access and were not included. The total cohort included 72 patients (trans-septal = 37, transapical = 35). Mean patient age was 74 ± 12 years, 66.7% were female, and the mean Society of Thoracic Surgeons risk score was 15.4 ± 10.5%. The mean MAC score was 7.7 ± 1.4. Embolization/migration rates were lower with higher scores: patients with a MAC score of 7 had a valve embolization/migration rate of 12.5%, those with a MAC score ≥8 had a rate of 8.7%, and those with a MAC score ≥9 had none (p = 0.023). Patients with a MAC score ≤6 had a 60% embolization/migration rate versus 9.7% in patients with a MAC score ≥7 (p < 0.001). 
In multivariable analysis, a MAC score ≤6 was an independent predictor of valve embolization/migration (odds ratio [OR]: 5.86 [95% CI: 1.00 to 34.26]; p = 0.049).ConclusionsThis cardiac CT–based score provides a systematic method to grade MAC severity, which may assist in predicting valve embolization/migration during trans-septal or transapical ViMAC procedures.  相似文献   
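The grading logic above can be sketched in code. The grade thresholds (mild ≤3, moderate 4 to 6, severe ≥7; maximum 10) come from the abstract, but the per-feature point weights and cut-offs below are illustrative assumptions — the abstract names the four features without publishing their exact point values:

```python
def mac_grade(total_points):
    """Severity grade from the total CT score, using the abstract's
    thresholds: mild <=3, moderate 4-6, severe >=7 (maximum 10)."""
    if total_points <= 3:
        return "mild"
    if total_points <= 6:
        return "moderate"
    return "severe"

def mac_score(thickness_mm, circumference_deg,
              trigones_calcified, leaflets_calcified):
    """Illustrative (not published) point weights summing to the
    stated maximum of 10; all cut-offs here are assumptions."""
    pts = 3 if thickness_mm >= 5 else (2 if thickness_mm >= 3 else 1)
    pts += 3 if circumference_deg >= 270 else (2 if circumference_deg >= 180 else 1)
    pts += trigones_calcified  # 0, 1, or 2 trigones calcified
    pts += leaflets_calcified  # 0, 1, or 2 leaflets calcified
    return pts

s = mac_score(5.5, 300, 2, 2)  # heavily calcified annulus
print(s, mac_grade(s))         # → 10 severe
```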

19.
BackgroundIn the ARISTOTLE (Apixaban for Reduction in Stroke and Other Thromboembolic Events in Atrial Fibrillation) trial, patients with atrial fibrillation and ≥2 dose-adjustment criteria (age ≥80 years, weight ≤60 kg, or creatinine ≥1.5 mg/dl [133 μmol/l]) were randomized to receive apixaban 2.5 mg twice daily or warfarin.ObjectivesThe purpose of this study was to describe the effects of apixaban dose adjustment on clinical and pharmacological outcomes.MethodsPatients receiving the correct dose of study drug were included (n = 18,073). The effect of apixaban 2.5 mg twice daily versus warfarin on population pharmacokinetics, D-dimer, prothrombin fragment 1 + 2 (PF1+2), and clinical outcomes was compared with the standard dose (5 mg twice daily).ResultsPatients receiving apixaban 2.5 mg twice daily exhibited lower apixaban exposure (median area under the concentration-time curve at steady state 2,720 ng/ml vs. 3,599 ng/ml; p < 0.0001) than those receiving the standard dose. In patients with ≥2 dose-adjustment criteria, reductions in D-dimer (p interaction = 0.20) and PF1+2 (p interaction = 0.55) were consistent with those observed in the standard-dose population. Patients with ≥2 dose-adjustment criteria (n = 751) were at higher risk for stroke/systemic embolism, major bleeding, and all-cause death than the standard-dose population (0 or 1 dose-adjustment criterion, n = 17,322). The effect of apixaban 2.5 mg twice daily versus warfarin in the ≥2 dose-adjustment criteria population was consistent with that of the standard dose for reductions in stroke or systemic embolism (p interaction = 0.26), major bleeding (p interaction = 0.25), and death (p interaction = 0.72).ConclusionsApixaban drug concentrations were lower in patients receiving 2.5 mg twice daily compared with 5 mg twice daily. 
However, the effects of apixaban dose adjustment to 2.5 mg versus warfarin were consistent for coagulation biomarkers and clinical outcomes, providing reassuring data on efficacy and safety. (Apixaban for the Prevention of Stroke in Subjects With Atrial Fibrillation [ARISTOTLE]; NCT00412984)  相似文献   
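The dose-adjustment rule stated above (reduce to 2.5 mg twice daily when ≥2 of the 3 criteria are met) translates directly into code; a minimal sketch with hypothetical patient values:

```python
def apixaban_bid_dose_mg(age_years, weight_kg, creatinine_mg_dl):
    """ARISTOTLE dose-adjustment rule: 2.5 mg twice daily when >=2
    criteria are met (age >=80 years, weight <=60 kg, creatinine
    >=1.5 mg/dl); otherwise the standard 5 mg twice daily."""
    n_criteria = sum([age_years >= 80,
                      weight_kg <= 60,
                      creatinine_mg_dl >= 1.5])
    return 2.5 if n_criteria >= 2 else 5.0

print(apixaban_bid_dose_mg(82, 58, 1.0))  # age + weight met → 2.5
print(apixaban_bid_dose_mg(75, 72, 1.6))  # only creatinine met → 5.0
```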

20.
ObjectivesThe aim of this study was to determine the impact of delayed high-degree atrioventricular block (HAVB) or complete heart block (CHB) after transcatheter aortic valve replacement (TAVR) using a minimalist approach followed by ambulatory electrocardiographic (AECG) monitoring.BackgroundLittle is known regarding the clinical impact of HAVB or CHB in the early period after discharge following TAVR.MethodsA prospective, multicenter study was conducted, including 459 consecutive TAVR patients without a permanent pacemaker who underwent continuous AECG monitoring for 14 days (median length of hospital stay 2 days; IQR: 1-3 days), using 2 devices (CardioSTAT and Zio AT). The primary endpoint was the occurrence of HAVB or CHB. Patients were divided into 3 groups: 1) no right bundle branch block (RBBB) and no electrocardiographic (ECG) changes; 2) baseline RBBB with no further changes; and 3) new-onset ECG conduction disturbances.ResultsDelayed HAVB or CHB episodes occurred in 21 patients (4.6%) (median 5 days postprocedure; IQR: 4-6 days), leading to permanent pacemaker implantation in 17 (81.0%). HAVB or CHB events were rare in group 1 (7 of 315 [2.2%]), and the incidence increased in group 2 (5 of 38 [13.2%]; P < 0.001 vs group 1) and group 3 (9 of 106 [8.5%]; P = 0.007 vs group 1; P = 0.523 vs group 2). No episodes of sudden or all-cause death occurred at 30-day follow-up.ConclusionsSystematic 2-week AECG monitoring following minimalist TAVR detected HAVB and CHB episodes in about 5% of cases, with no mortality at 1 month. Whereas HAVB or CHB was rare in patients without ECG changes post-TAVR, baseline RBBB and new-onset conduction disturbances conferred an increased risk. These results support tailored management using AECG monitoring and the possibility of longer hospitalization periods in patients at higher risk for delayed HAVB or CHB.  相似文献   
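The three-group stratification above maps to a small classifier. The precedence used here (new-onset disturbances take priority over baseline RBBB when both are present) is an assumption about how overlapping cases were assigned; the abstract does not spell it out:

```python
def conduction_group(baseline_rbbb, new_conduction_changes):
    """Groups from the abstract, with observed delayed HAVB/CHB rates:
    1 = no RBBB, no ECG changes        (2.2%)
    2 = baseline RBBB, no new changes  (13.2%)
    3 = new-onset conduction changes   (8.5%)"""
    if new_conduction_changes:
        return 3  # assumed to take precedence over baseline RBBB
    if baseline_rbbb:
        return 2
    return 1

print(conduction_group(False, False))  # → 1
print(conduction_group(True, False))   # → 2
print(conduction_group(True, True))    # → 3 (assumed precedence)
```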
