Similar Articles
20 similar articles retrieved (search time: 93 ms)
1.
Objective: The study objective was to examine pulmonary function and quality of life improvement after robotic-assisted thoracoscopic tracheobronchoplasty for patients with different degrees of obstructive airway disease.
Methods: We performed a retrospective review of a prospective database of patients who underwent robotic-assisted thoracoscopic tracheobronchoplasty between 2013 and 2020.
Results: A total of 118 patients underwent robotic-assisted thoracoscopic tracheobronchoplasty. Preoperative and postoperative pulmonary function tests were available for 108 patients. Postoperative pulmonary function tests at a median of 16 months demonstrated a significant increase in percent predicted forced expiratory volume in 1 second (preoperative median: 76.76% predicted, postoperative: 83% predicted, P = .002). Preoperative and postoperative St George Respiratory Questionnaires were available for 64 patients, with a significant decrease in postoperative score at a median of 7 months (preoperative median: 61, postoperative: 41.60, P < .001). When stratified by preoperative degree of obstruction, robotic-assisted thoracoscopic tracheobronchoplasty improved forced expiratory volume in 1 second in moderate to very severe obstruction, with a statistically significant improvement in moderate (preoperative median: 63.91% predicted, postoperative median: 73% predicted, P = .001) and severe (preoperative median: 44% predicted, postoperative median: 57% predicted, P = .007) obstruction. St George Respiratory Questionnaire scores improved for all patients. Improvement for mild (preoperative median: 61.27, postoperative median: 36.71, P < .001) and moderate (preoperative median: 57.15, postoperative median: 47.52, P = .03) obstruction was statistically significant.
Conclusions: Robotic-assisted thoracoscopic tracheobronchoplasty improves obstruction and symptoms. With limited follow-up, subgroup analysis showed that forced expiratory volume in 1 second improved in severe preoperative obstruction and quality of life improved in moderate obstruction. Future follow-up is required to determine the effects of robotic-assisted thoracoscopic tracheobronchoplasty on the most severe group, but we cannot conclude that an increased degree of preoperative obstruction precludes surgery.
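The abstract reports paired preoperative/postoperative medians with P values but does not name the statistical test. As a rough illustration, the sketch below runs a Wilcoxon signed-rank test, one common choice for paired, non-normal data; both the test choice and the synthetic FEV1 values are assumptions, not the authors' method.

```python
# Illustration only: Wilcoxon signed-rank test on simulated paired
# FEV1 values (% predicted). Test choice and data are assumptions.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
pre = rng.normal(76, 15, size=108)        # hypothetical preoperative FEV1
post = pre + rng.normal(6, 10, size=108)  # hypothetical postoperative gain

stat, p = wilcoxon(pre, post)
print(f"median pre = {np.median(pre):.1f}%, "
      f"median post = {np.median(post):.1f}%, P = {p:.4g}")
```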

2.
Objective: The study objective was to determine whether donor substance abuse (opioid overdose death, opioid use, cigarette or marijuana smoking) impacts lung acceptance and recipient outcomes.
Methods: Donor offers to a single center from 2013 to 2019 were reviewed to determine whether lung acceptance rates and recipient outcomes were affected by donor substance abuse.
Results: There were 3515 donor offers over the study period. A total of 154 offers (4.4%) involved opioid use and 117 (3.3%) were opioid overdose deaths. A total of 1744 donors (65.0%) smoked cigarettes and 69 donors (2.6%) smoked marijuana. Of smokers, 601 (35.0%) had less than a 20 pack-year history and 1117 (65.0%) had more than a 20 pack-year history. Substance abuse donors were younger (51.5 vs 55.2 years, P < .001), more often male (65.6% vs 54.8%, P < .001), more often White (86.2% vs 68.7%, P < .001), and more often had hepatitis C (8.3% vs 0.8%, P < .001). Donor acceptance was significantly associated with brain-dead donors (odds ratio, 1.56; P < .001), donor smoking history (odds ratio, 0.56; P < .001), hepatitis C (odds ratio, 0.35; P < .001), younger age (odds ratio, 0.98; P < .001), male gender (odds ratio, 0.74; P = .004), and any substance abuse history (odds ratio, 0.50; P < .001), but not opioid use, opioid overdose death, or marijuana use. Recipient survival was equivalent when using lungs from donors who had opioid overdose death, who smoked marijuana, or who smoked cigarettes for less than 20 pack-years or more than 20 pack-years, and was significantly longer in recipients of opioid use lungs. There was no significant difference in time to chronic lung allograft dysfunction for recipients who received lungs from opioid overdose death donors or donors with a history of opioid use, marijuana smoking, or cigarette smoking.
Conclusions: Donor acceptance was impacted by cigarette smoking but not opioid use, opioid overdose death, or marijuana use. Graft outcomes and recipient survival were similar for recipients of lungs from donors who abused substances.

3.
Objective: The aim of this study was to evaluate comparative outcomes for percutaneous coronary intervention (PCI) versus coronary artery bypass grafting (CABG) in patients with reduced ejection fraction.
Methods: All patients from the University of Pittsburgh Medical Center from 2011 to 2018 who had reduced preoperative ejection fraction (<50%) and underwent CABG or PCI for coronary revascularization were included in this study. Patients were risk-adjusted with propensity matching (1:1), and primary outcomes included long-term survival, readmission, and major adverse cardiac and cerebrovascular events (MACCE).
Results: A total of 2000 patients were included in the current study, consisting of CABG (n = 1553) and PCI (n = 447) cohorts with a mean ejection fraction of 35% ± 9.53%. Propensity matching yielded a 1:1 match with 324 patients in each cohort, controlling for all baseline characteristics. Thirty-day mortality was similar for PCI versus CABG (6.2% vs 4.9%; P = .49). Overall mortality over the study follow-up period (median, 3.23 years; range, 1.83-4.98 years) was significantly higher for the PCI cohort (37.4% vs 21.3%; P < .001). Total hospital readmissions (24.1% vs 12.9%; P = .001), cardiac readmissions (20.4% vs 11.1%; P = .001), myocardial infarction events (7.7% vs 1.8%; P = .001), MACCE (41.4% vs 23.8%; P < .001), and repeat revascularization (6.5% vs 2.6%; P = .02) occurred more frequently in the PCI cohort. Freedom from MACCE at 1 year (74.4% vs 87.0%; P < .001) and 5 years (54.5% vs 74.0%; P < .001) was significantly lower for the PCI cohort. On multivariable Cox regression analysis, CABG (hazard ratio, 0.57; 95% confidence interval, 0.44-0.73; P < .001) was significantly associated with improved survival. Prior liver disease, dialysis, diabetes, and peripheral artery disease were the most significant predictors of mortality. The cumulative incidence of hospital readmission was lower for the CABG cohort (hazard ratio, 0.51; 95% confidence interval, 0.37-0.71; P < .001). Multivariable Cox regression for MACCE (hazard ratio, 0.48; 95% confidence interval, 0.39-0.58; P < .001) showed significantly fewer events for the CABG cohort.
Conclusions: Patients with reduced ejection fraction who underwent CABG had significantly improved survival, lower MACCE, and fewer repeat revascularization procedures compared with patients who underwent PCI.
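For readers unfamiliar with the "propensity matching (1:1)" step, below is a minimal greedy nearest-neighbor match on the logit of the propensity score, one common implementation. The column names, covariate list, and 0.2-SD caliper are illustrative assumptions; the abstract does not describe the study's exact algorithm.

```python
# A minimal 1:1 greedy nearest-neighbor propensity match on the logit
# score. Column names, covariates, and the caliper are assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_one_to_one(df: pd.DataFrame, treat_col: str, covariates: list,
                     caliper: float = 0.2):
    ps = LogisticRegression(max_iter=1000).fit(df[covariates], df[treat_col])
    p1 = ps.predict_proba(df[covariates])[:, 1]
    df = df.assign(logit_ps=np.log(p1 / (1 - p1)))
    width = caliper * df["logit_ps"].std()
    treated = df[df[treat_col] == 1]
    control = df[df[treat_col] == 0].copy()
    pairs = []
    for i, row in treated.iterrows():
        if control.empty:
            break
        dist = (control["logit_ps"] - row["logit_ps"]).abs()
        j = dist.idxmin()
        if dist[j] <= width:
            pairs.append((i, j))       # (treated index, matched control index)
            control = control.drop(j)  # match without replacement
    return pairs
```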

4.
Objectives: Lobar lung transplantation (LLTx) from deceased donors is a potential solution for donor–recipient size mismatch in small-sized recipients. We reviewed our institutional experience to compare outcomes after LLTx with standard lung transplantation (LTx).
Methods: We retrospectively reviewed transplants performed at our institution from January 2000 to December 2017. Early and long-term outcomes after LLTx were compared with LTx. Additional analysis of outcomes was performed after dividing the cohort into 2 eras (era 1, 2000-2012; era 2, 2013-2017).
Results: Among the entire cohort (n = 1665), 75 were LLTx (4.5%). Compared with LTx, LLTx recipients were more frequently bridged to transplant with extracorporeal life support or mechanical ventilation and were more often transplanted in a rapidly deteriorating status (respectively, 20% vs 4.4%, P = .001; 22.7% vs 7.9%, P < .001; and 41.3% vs 26.5%, P = .013). LLTx had longer intensive care unit and hospital lengths of stay (median 17 vs 4 days and 45 vs 23 days, respectively; both P < .001) and greater 30-day mortality (13.3% vs 4.3%, P = .001) and 90-day mortality (17.3% vs 7.2%, P = .003). In era 2, despite a significantly greater 30-day mortality (10.8% vs 2.8%, P = .026), there was no significant difference in 90-day mortality between LLTx and LTx (13.5% vs 5.1%, P = .070). Overall survival at 1, 3, and 5 years was not significantly different between LLTx and LTx (73.2% vs 84.4%, 56.9% vs 68.4%, and 50.4% vs 55.8%; P = .088).
Conclusions: Although LLTx is a high-risk procedure, both mid- and long-term survival are comparable with LTx in all cohorts in the modern era. LLTx therefore represents a valuable surgical option for small-sized recipients.

5.
Background: The strategy for intervention remains controversial for patients presenting with acute type A aortic dissection (TAAAD) and cerebral malperfusion with neurologic deficit.
Methods: Surgically managed patients with TAAAD enrolled in the International Registry of Acute Aortic Dissection were evaluated to determine the incidence and prognosis of cerebral malperfusion.
Results: A total of 2402 patients underwent surgical repair of TAAAD. Of these, 362 (15.1%) presented with cerebral malperfusion (CM) and neurologic deficits, and 2040 (84.9%) had no neurologic deficits at presentation. Patients with CM were less likely to present with chest pain (66% vs 86.5%; P < .001) and back pain (35.9% vs 44.4%; P = .008). Patients with CM were more likely to present with syncope (48.4% vs 10.1%; P < .001), peripheral malperfusion (52.7% vs 38.0%; P < .001), and shock (16.2% vs 4.1%; P < .001). There was no difference in the incidence of Marfan syndrome (2.8% vs 3.0%; P = .870) or history of known aortic aneurysm (11.7% vs 13.9%; P = .296). Patients with CM were more likely to have a DeBakey type I dissection (63.8% vs 47.1%; P < .001) and a pericardial effusion (53.8% vs 40.6%; P < .001) on presentation. There was no difference in total arch replacement (21.3% for CM vs 19.5% for no CM; P = .473). Patients with CM had an increased incidence of postoperative cerebrovascular accident (17.5% vs 7.2%; P < .001) and acute kidney injury (28.3% vs 18.1%; P < .001). In-hospital mortality was greater in patients with CM (25.7% vs 12.0%; P < .001).
Conclusions: Fifteen percent of patients with TAAAD presented with CM and neurologic deficits. Although this subset of the population was older and more likely to present with peripheral malperfusion, cardiac tamponade, and shock, nearly 75% of these patients survived to hospital discharge. Surgeons may continue to offer lifesaving surgery for TAAAD to this critically ill cohort of patients with acceptable morbidity and mortality.
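The group comparisons above (for example, chest pain in 66% of 362 CM patients vs 86.5% of 2040 patients without CM) are consistent with a chi-square test on a 2x2 table. The abstract does not name the test, so that choice is an assumption; the counts below are reconstructed from the reported percentages.

```python
# Sketch: chi-square test on a 2x2 table reconstructed from reported
# percentages and group sizes. The test choice itself is an assumption.
from scipy.stats import chi2_contingency

cm_n, no_cm_n = 362, 2040
pain_cm = round(0.66 * cm_n)          # ~239 with chest pain, CM group
pain_no_cm = round(0.865 * no_cm_n)   # ~1765 with chest pain, no-CM group
table = [[pain_cm, cm_n - pain_cm],
         [pain_no_cm, no_cm_n - pain_no_cm]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, P = {p:.2g}")  # expect P < .001, as reported
```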

6.
Objective: To report long-term outcomes after deep hypothermic circulatory arrest (DHCA) with or without perioperative blood or blood product transfusion.
Methods: All patients who underwent proximal aortic surgery with DHCA from 2011 to 2018 were propensity matched according to baseline characteristics. Primary outcomes included short- and long-term mortality. Stratified Cox regression analysis was performed to identify significant associations with survival.
Results: A total of 824 patients underwent aortic replacement requiring circulatory arrest. After matching, there were 224 patients in each arm (transfusion and no transfusion). All baseline characteristics were well matched, with a standardized mean difference (SMD) <0.1. Preoperative hematocrit (41.0 vs 40.6; SMD = 0.05) and ejection fraction (57.5% vs 57.0%; SMD = 0.08) were similar between the no-transfusion and transfusion cohorts. Rates of aortic dissection (42.9% vs 45.1%; SMD = 0.05), hemiarch replacement (70.1% vs 70.1%; SMD = 0.00), and total arch replacement (21.9% vs 23.2%; SMD = 0.03) were not statistically different. Cardiopulmonary bypass and cross-clamp times were higher in the transfusion cohort (P < .001). Operative mortality (9.4% vs 2.7%; P = .003), stroke (7.6% vs 1.3%; P = .001), reoperation rate, pneumonia, prolonged ventilation, and dialysis requirements were significantly higher in the transfusion cohort (P < .001). In stratified Cox regression, transfusion was an independent predictor of mortality (hazard ratio, 2.62; confidence interval, 1.47-4.67; P = .001). One- and 5-year survival were significantly reduced in the transfusion cohort (P < .001).
Conclusions: In patients who underwent aortic surgery with DHCA, perioperative transfusion was associated with poor outcomes despite matching for preoperative baseline characteristics.
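The balance metric used above, the standardized mean difference, has a simple closed form for a continuous covariate: the absolute difference in group means divided by the pooled standard deviation. A minimal sketch (the example values are stand-ins, not study data):

```python
# Standardized mean difference for a continuous covariate; compare the
# result against the 0.1 threshold the abstract treats as well matched.
import numpy as np

def smd(x_a, x_b) -> float:
    a, b = np.asarray(x_a, float), np.asarray(x_b, float)
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return abs(a.mean() - b.mean()) / pooled_sd

# e.g. preoperative hematocrit in the two arms (values hypothetical)
print(smd([41.2, 40.1, 42.0, 40.8], [40.3, 41.0, 40.5, 40.9]))
```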

7.
Objective: The impact of staff turnover during cardiac procedures is unknown. Accurate inventory of sharps (needles/blades) requires attention by surgical teams, and sharp count errors result in delays, can lead to retained foreign objects, and may signify communication breakdown. We hypothesized that increased team turnover raises the likelihood of sharp count errors and may negatively affect patient outcomes.
Methods: All cardiac operations performed at our institution from May 2011 to March 2016 were reviewed for sharp count errors from a prospectively maintained database. Univariate and multivariable analyses were performed.
Results: Among 7264 consecutive cardiac operations, sharp count errors occurred in 723 cases (10%). There were no retained sharps detected by x-ray in our series. Sharp count errors were lower in first-start cases (7.7% vs 10.7%, P < .001). Cases with sharp count errors were longer than those without (7 vs 5.7 hours, P < .001). In multivariable analysis, factors associated with an increase in sharp count errors were non–first-start cases (odds ratio [OR], 1.3; P = .006), weekend cases (OR, 1.6; P = .004), more than 2 scrub personnel (3 scrubs: OR, 1.3; P = .032; 4 scrubs: OR, 2; P < .001; 5 scrubs: OR, 2.4; P = .004), and more than 1 circulating nurse (2 nurses: OR, 1.9; P < .001; 3 nurses: OR, 2; P < .001; 4 nurses: OR, 2.4; P < .001; 5 nurses: OR, 3.1; P < .001). Sharp count errors were associated with higher in-hospital mortality (OR, 1.9; P = .038).
Conclusions: Sharp count errors are more prevalent with increased team turnover and during non–first-start cases or weekends. Sharp count errors may be a surrogate marker for other errors and thus increased mortality. Reducing intraoperative team turnover and optimizing hand-offs may reduce sharp count errors.
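A sketch of the kind of multivariable logistic model that produces odds ratios like those above. All column names are hypothetical, and the original model's exact covariate coding is not given in the abstract.

```python
# Multivariable logistic regression for sharp count errors, reporting
# exponentiated coefficients as odds ratios. Columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_sharp_error_model(df: pd.DataFrame):
    # df assumed to hold: sharp_error (0/1), first_start (0/1),
    # weekend (0/1), n_scrubs (int), n_circulators (int)
    model = smf.logit(
        "sharp_error ~ first_start + weekend + C(n_scrubs) + C(n_circulators)",
        data=df,
    ).fit(disp=False)
    return np.exp(model.params), model  # exponentiated coefficients = ORs
```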

8.
Objective: Despite growing evidence of comparable outcomes in recipients of donation after circulatory death and donation after brain death donor lungs, donation after circulatory death allografts continue to be underused nationally. We examined predictors of nonuse.
Methods: All donors who donated at least 1 organ for transplantation between 2005 and 2019 were identified in the United Network for Organ Sharing registry and stratified by donation type. The primary outcome of interest was use of pulmonary allografts. Organ disposition and refusal reasons were evaluated. Multivariable regression modeling was used to assess the relationship between donor factors and use.
Results: A total of 15,458 donation after circulatory death donors met inclusion criteria. Of 30,916 lungs, 3.7% (1158) were used for transplantation and 72.8% were discarded, primarily because of poor organ function. Consent was not requested in 8.4% of donation after circulatory death offers, with the donation type itself being the leading reason (73.4%). Nonuse was associated with smoking history (P < .001), clinical infection with a blood source (12% vs 7.4%, P = .001), and lower PaO2/FiO2 ratio (median 230 vs 423, P < .001). In multivariable regression, lungs from donors with a PaO2/FiO2 ratio less than 250 were least likely to be transplanted (adjusted odds ratio, 0.03; P < .001), followed by cigarette use (adjusted odds ratio, 0.28; P < .001) and donor age >50 years (adjusted odds ratio, 0.75; P = .031). Recent transplant era was associated with significantly increased use (adjusted odds ratio, 2.28; P < .001).
Conclusions: Nontransplantation of donation after circulatory death lungs was associated with potentially modifiable predonation factors, including organ procurement organizations' consenting behavior, and donor factors, including hypoxemia. Interventions to increase consent and standardize donation after circulatory death donor management, including selective use of ex vivo lung perfusion in the setting of hypoxemia, may increase use and expand the donor pool.

9.
Objectives: The prognosis of patients with locally advanced esophageal squamous cell carcinoma is highly heterogeneous across different recurrence backgrounds. This study aims to explore the effects of recurrence patterns on prognosis.
Methods: The phase III, multicenter, prospective NEOCRTEC5010 trial enrolled 451 patients with stage IIB-III esophageal squamous cell carcinoma randomly assigned to neoadjuvant chemoradiotherapy combined with surgery (NCRT group) or surgery alone (S group), with long-term follow-up. We investigated the effects of recurrence patterns on survival in patients undergoing radical esophagectomy.
Results: In total, 353 patients were included in the study. The 5-year overall survival of patients with different recurrence patterns was significantly different: recurrence versus recurrence-free (17.8% vs 89.2%; P < .001), early recurrence versus late recurrence (4.6% vs 51.2%; P < .001), and distant metastasis versus locoregional recurrence (17.0% vs 20.0%; P = .666). Patients with early recurrence had significantly shorter survival after recurrence than those with late recurrence (hazard ratio, 1.541; 95% confidence interval, 1.047-2.268; P = .028). There was no significant difference in postrecurrence survival between patients with distant metastasis and locoregional recurrence (hazard ratio, 1.181; 95% confidence interval, 0.804-1.734; P = .396). Multivariate logistic analysis showed that pN1 stage, dissection of fewer than 20 lymph nodes, and lack of response to NCRT were independent risk factors for postoperative early recurrence. Multivariate Cox regression suggested that NCRT, age ≥60 years, early recurrence, and pN1 stage were independent risk factors for shortened survival after recurrence.
Conclusions: Prerecurrence primary tumor stage is inaccurate in predicting postrecurrence survival. In contrast, recurrence patterns can guide follow-up while also predicting postrecurrence survival. NCRT prolongs disease-free survival but is associated with a worse prognosis in patients with recurrence, especially early recurrence.

10.
Objective: The objective of this study was to investigate the association between morphological variation and postsurgical pulmonary vein (PV) stenosis (PPVS) in patients with cardiac total anomalous pulmonary venous connection (TAPVC).
Methods: This single-center, retrospective study included 168 pediatric patients who underwent surgical repair of cardiac TAPVC from 2013 to 2019 (connection to the coronary sinus [CS], n = 136; connection directly to the right atrium [RA], n = 32). Three-dimensional computed tomography modeling and geometric analysis were performed to investigate the morphological features and their relevance to PPVS.
Results: Connection type had no association with PPVS (CS type: 18% vs RA type: 19%; P = .89), but the incidence of PPVS was higher in patients with a single PV orifice than in those with more than 1 orifice (P < .001). In the CS type, the confluence-to-total PV area ratio (hazard ratio, 4.78; 95% CI, 1.86-12.32; P = .001) and the length of the drainage route (hazard ratio, 1.22; 95% CI, 1.14-1.31; P < .001) were associated with increased PPVS risk after adjustment for age and preoperative pulmonary venous obstruction. In the RA type, patients with anomalous PV return to the RA roof were more likely to develop PPVS than those with return to the posterior wall of the RA (P < .001).
Conclusions: The number of PV orifices correlated with PPVS development in cardiac TAPVC. The confluence-to-total PV area ratio, length of the drainage route, and anomalous PV return to the RA roof are important predictors of PPVS. Morphological subcategorization in this clinical setting can potentially assist in surgical decision-making.

11.
Objectives: Stereotactic body radiation therapy (SBRT) is increasingly used to treat non–small cell lung cancer. The purpose of this study was to analyze relationships between facility SBRT utilization and surgical patient selection and survival after surgery.
Methods: Data on patients with T1/T2N0M0 lesions and treatment facility characteristics were abstracted from the National Cancer Database, 2008 to 2017. Facilities were stratified using an SBRT/surgery ratio previously associated with a short-term survival benefit for patients treated surgically, and by a previously identified surgical volume threshold. Multiple regression analyses, Cox proportional-hazards regressions, and Kaplan–Meier log-rank tests were employed.
Results: In total, 182,610 patients were included. The proportion of high SBRT:surgery ratio (≥17%) facilities increased from 118 (11.5%) to 558 (48.4%) over the study period. Patients undergoing surgery at high-SBRT facilities had comorbidity scores and tumor sizes comparable to those at low-SBRT facilities, with no clinically significant differences in age, race, or insurance status. Among low-volume surgical facilities, treatment at a high-SBRT facility was associated with decreased 30-day mortality (1.8% vs 1.4%, P < .001) and 90-day mortality (3.3% vs 2.6%, P < .001). At high-volume surgical facilities, no difference was observed. At 5 years, a survival advantage was identified for patients undergoing resection at facilities with high surgical volumes (hazard ratio, 0.91; confidence interval, 0.90-0.93; P < .001) but not at high SBRT-utilizing facilities.
Conclusions: Differences in short-term survival following resection at facilities with high SBRT utilization may be attributable to low surgical volume facilities. Patients treated at high-volume surgical facilities do not demonstrate differences in short-term or long-term survival based on facility SBRT utilization.
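A minimal sketch of the Cox proportional-hazards piece of such an analysis, using the lifelines package. The column names are illustrative stand-ins, not actual National Cancer Database fields.

```python
# Cox proportional-hazards sketch of a facility-level survival
# comparison (lifelines). Column names are illustrative assumptions.
import pandas as pd
from lifelines import CoxPHFitter

def fit_facility_cox(df: pd.DataFrame) -> CoxPHFitter:
    # df assumed to hold: time_months, death (1 = event), plus covariates
    cols = ["time_months", "death", "high_sbrt_facility",
            "high_volume_facility", "age", "tumor_size_cm"]
    cph = CoxPHFitter()
    cph.fit(df[cols], duration_col="time_months", event_col="death")
    return cph  # cph.summary lists hazard ratios (exp(coef)) with 95% CIs
```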

12.
Objective: To discern the impact of depressed left ventricular ejection fraction (LVEF) on the outcomes of open repair of descending thoracic aneurysms (DTA) and thoracoabdominal aortic aneurysms (TAAA).
Methods: Restricted cubic spline analysis was used to identify a threshold of LVEF corresponding to an increase in operative mortality and major adverse events (MAE: operative death, myocardial infarction, stroke, spinal cord injury, need for tracheostomy or dialysis). Logistic and Cox regression were performed to identify independent predictors of MAE, operative mortality, and survival.
Results: DTA/TAAA repair was performed in 833 patients between 1997 and 2018. Restricted cubic spline analysis showed that patients with LVEF <40% (n = 66) had an increased risk of MAE (odds ratio [OR], 2.17; 95% confidence interval [CI], 1.22-3.87; P < .01) and operative mortality (OR, 2.72; 95% CI, 1.21-6.12; P = .02) compared with the group with LVEF ≥40% (n = 767). The group with LVEF <40% had a worse preoperative profile (eg, coronary revascularization, 48.5% vs 17.3% [P < .01]; valvular disease, 82.8% vs 49.39% [P < .01]; renal insufficiency, 45.5% vs 26.1% [P < .01]; respiratory insufficiency, 36.4% vs 21.2% [P = .01]) and worse long-term survival (35.5% vs 44.7% at 10 years; P = .01). Nonetheless, on multivariate regression, depressed LVEF was not an independent predictor of operative mortality, MAE, or survival.
Conclusions: LVEF is not an independent predictor of adverse events in surgery for DTA/TAAA.
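One plausible way to run the restricted-cubic-spline threshold screen described above: model MAE risk as a smooth function of LVEF and inspect where the fitted risk curve turns upward. The sketch uses patsy's cr() natural cubic spline basis (a restricted cubic spline); the column names and df=4 are assumptions.

```python
# Restricted-cubic-spline screen for an LVEF threshold via a logistic
# model with a natural cubic spline basis. Names and df are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def spline_risk_curve(df: pd.DataFrame) -> pd.DataFrame:
    model = smf.logit("mae ~ cr(lvef, df=4)", data=df).fit(disp=False)
    grid = pd.DataFrame({"lvef": np.linspace(15, 70, 56)})
    grid["mae_risk"] = model.predict(grid)
    return grid  # plot mae_risk vs lvef; the abstract locates the rise near 40%
```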

13.
Objective: Prematurity, low birth weight, genetic syndromes, extracardiac conditions, and secondary cardiac lesions are considered high-risk conditions associated with mortality after stage 1 palliation. We report the impact of these conditions on outcomes from a prospective multicenter improvement collaborative.
Methods: The National Pediatric Cardiology Quality Improvement Collaborative Phase II registry was queried. Comorbid conditions were categorized and quantified to determine the cumulative burden of high-risk diagnoses on survival to the first birthday. Logistic regression was applied to evaluate factors associated with mortality.
Results: Of the 1421 participants, 40% (575) had at least 1 high-risk condition. The aggregate high-risk group had lower survival to the first birthday compared with the standard-risk group (76.2% vs 88.1%, P < .001). Presence of a single high-risk diagnosis was not associated with reduced survival to the first birthday (odds ratio, 0.71; confidence interval, 0.49-1.02; P = .066). Incremental increases in high-risk diagnoses were associated with reduced survival to the first birthday, with odds ratios of 0.23 (confidence interval, 0.15-0.36; P < .001) for 2 and 0.17 (confidence interval, 0.10-0.30; P < .001) for 3 to 5 high-risk diagnoses. Additional analysis that included pre–stage 1 palliation characteristics and stage 1 palliation perioperative variables identified multiple high-risk diagnoses, post–stage 1 palliation extracorporeal membrane oxygenation support (odds ratio, 0.14; confidence interval, 0.10-0.22; P < .001), and cardiac reoperation (odds ratio, 0.66; confidence interval, 0.45-0.98; P = .037) as associated with reduced odds of survival to the first birthday.
Conclusions: The presence of 1 high-risk diagnostic category was not associated with decreased survival at 1 year. Cumulative diagnoses across multiple high-risk diagnostic categories were associated with decreased odds of survival. Further patient accrual is needed to evaluate the impact of specific comorbid conditions within the broader high-risk categories.

14.
Objective: Barlow's disease remains challenging to repair, given the complex valvular morphology and lack of quantitative data to compare techniques. Although there have been recent strides in ex vivo evaluation of cardiac mechanics, to our knowledge, there is no disease model that accurately simulates the morphology and pathophysiology of Barlow's disease. The purpose of this study was to design such a model.
Methods: To simulate Barlow's disease, a cross-species ex vivo model was developed. Bovine mitral valves (n = 4) were sewn into a porcine annulus mount to create excess leaflet tissue and elongated chordae. A heart simulator generated physiologic conditions while hemodynamic data, high-speed videography, and chordal force measurements were collected. The regurgitant valves were repaired using nonresectional repair techniques such as neochord placement.
Results: The model successfully imitated the complexities of Barlow's disease, including redundant, billowing bileaflet tissues with notable regurgitation. After repair, hemodynamic data confirmed reduction of mitral leakage volume (25.9 ± 2.9 vs 2.1 ± 1.8 mL, P < .001) and strain gauge analysis revealed lower primary chordae forces (0.51 ± 0.17 vs 0.10 ± 0.05 N, P < .001). In addition, the maximum rate of change of force was significantly lower postrepair for both primary (30.80 ± 11.38 vs 8.59 ± 4.83 N/s, P < .001) and secondary chordae (33.52 ± 10.59 vs 19.07 ± 7.00 N/s, P = .006).
Conclusions: This study provides insight into the biomechanics of Barlow's disease, including sharply fluctuating force profiles experienced by elongated chordae prerepair, as well as restoration of primary chordae forces postrepair. Our disease model facilitates further in-depth analyses to optimize the repair of Barlow's disease.
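The two quantities reported above reduce to simple signal post-processing: regurgitant volume is the integral of backward mitral flow over a cycle, and the maximum rate of change of force is the peak derivative of the strain-gauge trace. A sketch, assuming a 1 kHz sampling rate and hypothetical array names (the simulator's actual acquisition setup is not described in the abstract):

```python
# Post-processing sketch: regurgitant volume from a mitral flow trace
# and peak dF/dt from a chordal force trace. Sampling rate is assumed.
import numpy as np

def max_dfdt(force_n: np.ndarray, fs_hz: float = 1000.0) -> float:
    """Peak |dF/dt| in N/s via central-difference differentiation."""
    return float(np.max(np.abs(np.gradient(force_n, 1.0 / fs_hz))))

def regurgitant_volume_ml(flow_ml_per_s: np.ndarray, fs_hz: float = 1000.0) -> float:
    """Integrate only the backward (negative) mitral flow over one cycle."""
    backward = np.clip(flow_ml_per_s, None, 0.0)
    return float(-np.trapz(backward, dx=1.0 / fs_hz))
```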

15.
Background: Moderate hypothermic circulatory arrest (MHCA) with antegrade cerebral perfusion (ACP) is safe and efficient in total arch replacement (TAR) and frozen elephant trunk (FET) procedures for acute type A aortic dissection (ATAAD). Complications related to hypothermia and ischemia are nevertheless unavoidable. The aortic balloon occlusion (ABO) technique is performed to elevate the lowest nasopharyngeal temperature to 28°C and shorten the circulatory arrest time. In this study, we aimed to evaluate the efficacy of this new technique.
Methods: We reviewed the clinical data of patients with ATAAD who underwent TAR and FET, including 79 who underwent ABO and 109 who underwent MHCA/ACP.
Results: Circulatory arrest time was significantly lower in the ABO group than in the MHCA/ACP group (mean, 4.8 ± 1.2 minutes vs 18.4 ± 3.1 minutes; P < .001). The composite endpoint was comparable in the 2 groups (11.4% for ABO vs 13.8% for MHCA/ACP; P = .631). Fewer patients in the ABO group developed high-grade acute kidney injury (AKI) according to modified RIFLE criteria (22.8% vs 36.7%; P = .041), and the rate of hepatic dysfunction was lower in the ABO group (11.4% vs 28.4%; P = .005). Multivariable logistic analysis showed that the ABO technique was protective against duration of ventilation >24 hours (odds ratio [OR], 0.455; 95% confidence interval [CI], 0.234-0.887; P = .021), hepatic dysfunction (OR, 0.218; 95% CI, 0.084-0.561; P = .002), and grade II-III AKI (OR, 0.432; 95% CI, 0.204-0.915; P = .028).
Conclusions: The ABO technique significantly shortens circulatory arrest time in TAR and FET. Available clinical data suggest a protective effect on the liver and kidneys. Future large-sample studies are warranted to thoroughly evaluate this new technique.

16.
Objective: Valve-sparing root replacement using reimplantation techniques is increasingly applied to bicuspid aortopathy. The long-term durability of cusp repair is unclear. We analyzed midterm results using a conservative approach to cusp repair.
Methods: From 2006 to 2018, 327 patients underwent valve-sparing reimplantation, 66 with bicuspid valves. Leaflets were analyzed after reimplantation. A majority (51/66) required no cusp repair. Fifteen patients had cusp repair limited to closure of an unfused raphe or central plication. Patients were followed by echocardiography.
Results: Mean patient age was 44.7 ± 12.3 years. The cusp repair group had a higher incidence of preoperative moderate (10% vs 40%) or severe (4% vs 33.3%) aortic insufficiency (P < .001). There was no operative mortality or major complication. Mean follow-up was 51.6 ± 40.8 months. On postoperative echocardiography, the incidence of none, trace, or mild aortic insufficiency was 41.3% (19/46), 43.5% (20/46), and 15.2% (7/46) in the no cusp repair group and 40% (6/15), 40% (6/15), and 20% (3/15) in the cusp repair group, respectively (P = .907). Few patients progressed in degree of aortic insufficiency. No patients required reoperation. At 5 years, freedom from any aortic insufficiency was 46.9% versus 15.8% (P = .013), and freedom from greater-than-trace aortic insufficiency was 59.1% versus 36.9% (P = .002), owing to the higher rate of postoperative trace and mild aortic insufficiency with cusp repair. There was no difference in freedom from greater-than-mild aortic insufficiency (92.1% vs 100%; P = .33).
Conclusions: Valve-sparing root replacement is reliably performed with bicuspid aortic valves whether or not cusp reconstruction is necessary. Few patients progress to greater-than-mild aortic insufficiency. Need for reoperation is rare in midterm follow-up.

17.
18.
Objective: Female sex is a known risk factor in most cardiac surgery, including coronary and valve surgery, but its effect in acute type A aortic dissection repair is unknown.
Methods: From 1996 to 2018, 650 patients underwent acute type A aortic dissection repair; 206 (32%) were female and 444 (68%) were male. Data were collected through the Cardiac Surgery Data Warehouse, medical record review, and the National Death Index database.
Results: Compared with men, women were significantly older (65 vs 57 years, P < .0001). The sex distribution inverted with increasing age: women made up 23% of patients younger than 50 years but 65% of those aged 80 years or older. Women had significantly less chronic renal failure (2.0% vs 5.4%, P = .04), acute myocardial infarction (1.0% vs 3.8%, P = .04), and severe aortic insufficiency. Women underwent significantly fewer aortic root replacements with similar aortic arch procedures, had shorter cardiopulmonary bypass times (211 vs 229 minutes, P = .0001) and aortic crossclamp times (132 vs 164 minutes, P < .0001), but required more intraoperative blood transfusion (4 vs 3 units) compared with men. Women had significantly lower operative mortality (4.9% vs 9.5%, P = .04), especially among those aged more than 70 years (4.4% vs 16%, P = .02). The significant risk factors for operative mortality were male sex (odds ratio, 2.2), chronic renal failure (odds ratio, 3.4), and cardiogenic shock (odds ratio, 6.8). Ten-year survival was similar between sexes.
Conclusions: Physicians and women should be cognizant of the risk of acute type A aortic dissection later in life in women. Surgeons should strongly consider operations for acute type A aortic dissection in women, especially in patients aged 70 years or more.

19.
Objective: Left ventricular (LV) distention is a feared complication in patients receiving venoarterial (VA) extracorporeal membrane oxygenation (ECMO). LV unloading can be achieved indirectly with an intra-aortic balloon pump (IABP) or directly with an Impella device (Abiomed, Danvers, Mass). We sought to assess the clinical and hemodynamic effects of IABP and Impella devices on patients supported with VA ECMO.
Methods: We conducted a retrospective review of VA ECMO patients at our institution from January 2015 to June 2020. Patients were categorized as either ECMO alone or ECMO with LV unloading, the latter comprising either ECMO with IABP or ECMO with Impella. We recorded baseline characteristics, survival, complications, and hemodynamic changes associated with device initiation.
Results: During the study, 143 patients received ECMO alone, whereas 140 received ECMO with LV unloading (68 ECMO with IABP, 72 ECMO with Impella). ECMO with Impella patients had a higher incidence of bleeding events compared with ECMO alone or ECMO with IABP (52.8% vs 37.1% vs 17.7%; P < .0001). Compared with ECMO alone, ECMO with IABP patients had better survival at 180 days (log-rank P = .005), whereas survival in ECMO with Impella patients was not different (log-rank P = .66). In a multivariable Cox hazard analysis, age (hazard ratio [HR], 1.02; 95% confidence interval [CI], 1.00-1.03; P = .015), male sex (HR, 0.54; 95% CI, 0.38-0.80; P = .002), baseline lactate (HR, 1.06; 95% CI, 1.02-1.11; P = .004), baseline creatinine (HR, 1.06; 95% CI, 1.00-1.11; P = .032), need for extracorporeal membrane oxygenation–cardiopulmonary resuscitation (HR, 2.09; 95% CI, 1.40-3.39; P = .001), and presence of a pre-ECMO IABP (HR, 0.45; 95% CI, 0.25-0.83; P = .010) were independently associated with mortality. There was no significant difference in hemodynamic changes between the ECMO with IABP and ECMO with Impella cohorts.
Conclusions: Concomitant support with IABP might help reduce morbidity and improve 180-day survival in patients receiving VA ECMO for cardiogenic shock.
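A minimal sketch of the Kaplan-Meier and log-rank comparison reported above, using the lifelines package. The column and group names are hypothetical stand-ins for the study's dataset.

```python
# Kaplan-Meier curves with a log-rank comparison at 180 days
# (lifelines). Column and group names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_180_day_survival(df: pd.DataFrame) -> float:
    ecmo = df[df["group"] == "ecmo_alone"]
    iabp = df[df["group"] == "ecmo_iabp"]
    for sub, label in [(ecmo, "ECMO alone"), (iabp, "ECMO + IABP")]:
        KaplanMeierFitter().fit(sub["time_days"], sub["died"],
                                label=label).plot_survival_function()
    res = logrank_test(ecmo["time_days"], iabp["time_days"],
                       event_observed_A=ecmo["died"],
                       event_observed_B=iabp["died"])
    return res.p_value  # the abstract reports log-rank P = .005 here
```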

20.
Background: Patients with medically treated type B aortic dissection (TBAD) remain at significant risk for late adverse events (LAEs). We hypothesize that not only initial morphological features but also their change over time at follow-up are associated with LAEs.
Materials and Methods: Baseline and 188 follow-up computed tomography (CT) scans, with a median follow-up time of 4 years (range, 10 days to 12.7 years), of 47 patients with acute uncomplicated TBAD were retrospectively reviewed. Morphological features (n = 8) were quantified at baseline and each follow-up. Medical records were reviewed for LAEs, which were defined according to current guidelines. To assess the effects of changes of morphological features over time, linear mixed-effects models were combined with Cox proportional hazards regression for the time-to-event outcome using a joint modeling approach.
Results: LAEs occurred in 21 of 47 patients at a median of 6.6 years (95% confidence interval [CI], 5.1-11.2 years). Among the 8 investigated morphological features, the following 3 showed strong associations with LAEs: increase in partial false lumen thrombosis area (hazard ratio [HR], 1.39; 95% CI, 1.18-1.66 per cm² increase; P < .001), increase in major aortic diameter (HR, 1.24; 95% CI, 1.13-1.37 per mm increase; P < .001), and increase in the circumferential extent of the false lumen (HR, 1.05; 95% CI, 1.01-1.10 per degree increase; P < .001).
Conclusions: In medically treated TBAD, increases in aortic diameter, new or increased partial false lumen thrombosis area, and increases in the circumferential extent of the false lumen are strongly associated with LAEs.
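The study fits a true joint model (linear mixed-effects submodel linked to a Cox submodel). Python has no standard joint-modeling package, so the sketch below is a deliberately simplified two-stage approximation: estimate each patient's rate of change in a feature, then use that slope as a Cox covariate. All column names are hypothetical.

```python
# Two-stage approximation of the paper's joint modeling approach:
# Stage 1 mixed model for a longitudinal feature, Stage 2 Cox model
# with the per-patient slope as covariate. Column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

def two_stage_approximation(long_df: pd.DataFrame,
                            surv_df: pd.DataFrame) -> CoxPHFitter:
    # Stage 1: random intercept and slope for aortic diameter over time
    mixed = smf.mixedlm("diameter_mm ~ years", long_df,
                        groups=long_df["patient_id"],
                        re_formula="~years").fit()
    slopes = {pid: mixed.fe_params["years"] + re["years"]
              for pid, re in mixed.random_effects.items()}
    surv = surv_df.assign(diameter_slope=surv_df["patient_id"].map(slopes))
    # Stage 2: Cox model for late adverse events
    cph = CoxPHFitter()
    cph.fit(surv[["time_years", "lae", "diameter_slope"]],
            duration_col="time_years", event_col="lae")
    return cph
```

A two-stage fit ignores the uncertainty in the estimated slopes and can attenuate hazard ratios, which is precisely what the joint modeling approach used in the study avoids.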
