Similar Literature
A total of 20 similar documents were retrieved.
1.

Background

Roux-en-Y choledochojejunostomy and duct-to-duct anastomosis are potential methods for biliary reconstruction in liver transplantation (LT) for recipients with primary sclerosing cholangitis (PSC). However, there is controversy over which method yields superior outcomes. The purpose of this study was to evaluate the outcomes of duct-to-duct versus Roux-en-Y biliary anastomosis in patients undergoing LT for PSC.

Methods

Studies comparing Roux-en-Y versus duct-to-duct anastomosis during LT for PSC were identified based on systematic searches of 9 electronic databases and multiple sources of gray literature.

Results

The search identified 496 citations; 7 retrospective series comprising 692 patients met the eligibility criteria. The use of duct-to-duct anastomosis was not associated with a significant difference in clinical outcomes, including 1-year recipient survival rates (odds ratio [OR], 1.02; 95% confidence interval [CI], 0.65–1.60; P = .95), 1-year graft survival rates (OR, 1.11; 95% CI, 0.72–1.71; P = .64), risk of biliary leaks (OR, 1.23; 95% CI, 0.59–2.59; P = .33), risk of biliary strictures (OR, 1.99; 95% CI, 0.98–4.06; P = .06), or rate of recurrence of PSC (OR, 0.94; 95% CI, 0.19–4.78; P = .94).
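
The abstract does not state which pooling model the authors used; as background, the sketch below shows one common way such pooled odds ratios and 95% CIs are derived (inverse-variance, fixed-effect pooling of log odds ratios), using hypothetical per-study values rather than the study's data.

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling of log odds ratios.
# The per-study ORs and CIs below are hypothetical placeholders.
import math

studies = [  # (odds ratio, lower 95% CI, upper 95% CI) -- hypothetical
    (0.90, 0.45, 1.80),
    (1.20, 0.60, 2.40),
    (1.05, 0.50, 2.20),
]

log_ors, weights = [], []
for or_, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the 95% CI
    log_ors.append(math.log(or_))
    weights.append(1.0 / se ** 2)                    # inverse-variance weight

pooled_log_or = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
pooled_se = 1.0 / math.sqrt(sum(weights))
print("pooled OR = %.2f (95%% CI %.2f-%.2f)" % (
    math.exp(pooled_log_or),
    math.exp(pooled_log_or - 1.96 * pooled_se),
    math.exp(pooled_log_or + 1.96 * pooled_se)))
```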

Conclusions

There were no significant differences in 1-year recipient survival, 1-year graft survival, risk of biliary complications, or PSC recurrence between Roux-en-Y and duct-to-duct biliary anastomosis in LT for PSC.

2.

Introduction

There is a tendency to favor oversized donor hearts for heart transplant candidates affected by mild to moderate pulmonary hypertension (PHTN). We hypothesize that both undersized and oversized donor hearts fare equally well in this setting.

Methods

A total of 107 cases from 2003 to 2008 were retrospectively reviewed and subsequently divided into those receiving organs from undersized donors (group 1: donor weight/recipient weight ≤0.90, n = 37) and oversized donors (group 2: donor weight/recipient weight ≥1.2, n = 70). PHTN was identified in the perioperative period in those patients with systolic pulmonary artery pressure (SPAP) ≥40 mm Hg. Endpoints of mortality and hemodynamic data were investigated.
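
For illustration only, the grouping rule described above can be expressed as a small helper (hypothetical, not from the study): a donor-to-recipient weight ratio ≤0.90 defines the undersized group, ≥1.2 the oversized group, and perioperative PHTN is flagged at SPAP ≥40 mm Hg.

```python
# Hypothetical helper mirroring the study's grouping thresholds.
def classify_donor(donor_kg: float, recipient_kg: float, spap_mmhg: float):
    ratio = donor_kg / recipient_kg
    if ratio <= 0.90:
        size_group = "undersized (group 1)"
    elif ratio >= 1.2:
        size_group = "oversized (group 2)"
    else:
        size_group = "intermediate (not analyzed)"
    return size_group, spap_mmhg >= 40  # (size group, perioperative PHTN?)

print(classify_donor(63.0, 78.0, 46))  # -> ('undersized (group 1)', True)
```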

Results

Of 107 patients, 37 received undersized donor allografts, with a mean donor-to-recipient weight ratio of 0.8, and 70 received oversized donor allografts, with a mean donor-to-recipient ratio of 1.4. Perioperative PHTN was diagnosed in 20 of the 37 (54%) patients from the undersized group (mean SPAP = 45.9 mm Hg) and 41 of 70 (59%) patients from the oversized group (mean SPAP = 46.5 mm Hg). There was no significant difference in right ventricular function at 1 week, 1 month, or 6 months. Left ventricular function was similar between both groups at 6 months (P = .22). The mean SPAP in the undersized group was 45.9, 33.4, 31.8, and 23.1 mm Hg at the perioperative, 1 week, 1 month, and 6 month time points, respectively. Corresponding mean SPAP for the oversized group was 46.5, 35.0, 29.4, and 26.1 mm Hg. The 1-month, 1-year, and 3-year survival rates were similar in both groups.

Conclusions

Oversized and undersized donor hearts fared equally well in the setting of mild to moderate perioperative PHTN. This finding, together with the propensity for pulmonary hypertension to resolve over time, suggests that the current practice of favoring oversized donor hearts for patients with pre-transplantation PHTN may be unwarranted.

3.

Background

Obesity is thought to be associated with higher rates of morbidity and mortality after liver transplantation (LT); however, its actual impact is difficult to evaluate, in part because of the confounding effects of fluid accumulation on body mass index (BMI).

Objective

We sought to define the effects of conventional BMI (cBMI) and modified BMI (mBMI; calculated by multiplying the BMI by the serum albumin level to compensate for fluid accumulation) on the overall outcomes of LT recipients.
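
As a worked illustration of the two indices, the sketch below computes cBMI and mBMI for a hypothetical recipient; the abstract does not specify the albumin unit convention, so that choice is an assumption.

```python
# Hypothetical worked example of cBMI and mBMI as defined above.
def conventional_bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2            # kg/m2

def modified_bmi(weight_kg: float, height_m: float, albumin: float) -> float:
    # serum albumin (unit convention assumed; not stated in the abstract)
    # partially corrects the BMI for fluid accumulation such as ascites
    return conventional_bmi(weight_kg, height_m) * albumin

cbmi = conventional_bmi(95.0, 1.70)             # ~32.9 kg/m2
mbmi = modified_bmi(95.0, 1.70, 2.8)            # deflated by a low albumin of 2.8
print(round(cbmi, 1), round(mbmi, 1))
```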

Methods

A cohort of 507 patients who underwent LT from April 2000 to August 2006 was analyzed.

Results

Pre-LT diabetes mellitus was seen somewhat more frequently in the higher mBMI group (P = .054), whereas there was no difference across cBMI categories. The recipients at extremes of cBMI (>40 kg/m2 and <18.5 kg/m2) had significantly lower patient and graft survival than other groups (P = .038 and P = .010, respectively); however, no statistically significant differences were found in overall patient and graft survival across mBMI categories. There were no differences in duration of intensive care unit stay, duration of overall hospital stay, and vascular complications after LT among mBMI categories.

Conclusions

Pre-LT obesity alone, when estimated by mBMI rather than by cBMI, should not be a contraindication for LT.

4.

Background

Pneumothoraces are relatively common among trauma patients and can rapidly progress to tension physiology and death if not identified and treated. We sought to develop a reliable and reproducible large animal model of tension pneumothorax and to examine the cardiovascular effects during progression from simple pneumothorax to tension pneumothorax.

Materials and methods

Ten swine were intubated, sedated, and placed on mechanical ventilation. After a midline celiotomy, a 10-mm balloon-tipped laparoscopic trocar was placed through the diaphragm, and a 28F chest tube was placed in the standard position and clamped. Thoracic insufflation was performed in 5-mm increments, and continuous cardiovascular measurements were obtained.

Results

Mean insufflation pressures of 10 mm Hg were associated with a 67% decrease in cardiac output (6.6 L/min versus 2.2 L/min; P = 0.04). An additional increase in the insufflation pressure (mean 15 mm Hg) was associated with an 82% decrease in cardiac output from baseline (6.8 versus 1.2 L/min; P < 0.01). Increasing insufflation pressures were associated with a corresponding increase in central venous pressure (from 7.6 mm Hg to 15.2 mm Hg; P < 0.01) and a simultaneous decrease in the pulmonary artery diastolic pressure (from 15 mm Hg to 12 mm Hg; P = 0.06), with the central venous pressure and pulmonary artery diastolic pressure approaching equalization immediately before the development of major hemodynamic decline. Pulseless electrical activity arrest was induced at a mean insufflation pressure of 20 mm Hg. Tension physiology was immediately reversible with adequate decompression, allowing for multiple repeated trials.

Conclusions

A reliable and highly reproducible model was created for severe tension pneumothorax in a large animal. Major cardiovascular instability progressing to pulseless electrical activity arrest with stepwise insufflation was noted. This model could be highly useful for studying new diagnostic and treatment modalities for tension pneumothorax.

5.

Background

Ex vivo normothermic perfusion (EVNP) can reverse some of the detrimental effects of ischemic injury. However, in kidneys with warm and cold ischemic injury, the optimal perfusion pressure remains undetermined. The aim of this study was to evaluate the effects of two different arterial pressures during EVNP.

Methods

Porcine kidneys underwent static cold storage for 23 h followed by 1 h of EVNP using leukocyte-depleted blood at a mean arterial pressure of either 55 or 75 mm Hg. After this, kidneys were reperfused for 3 h to assess renal function and injury. This was compared with a control group that underwent 24 h of cold storage.

Results

During EVNP, kidneys perfused at 75 mm Hg had a higher renal blood flow, increased oxygen consumption (median 59.9 [range 30.1–78.6] versus 31.8 [8.2–53.8] mL/min/g; P = 0.026), and produced more urine (P = 0.002) than kidneys perfused at 55 mm Hg. During ex vivo reperfusion, renal blood flow was significantly higher in the 75 mm Hg and 55 mm Hg groups compared with the control (area under the curve: median 462 [228–745] for 75 mm Hg and 454 [254–923] for 55 mm Hg versus 262 [215–442] mL/min/100 g·h for the control; P = 0.040). There was a significant loss of renal function (P = 0.001) and an increase in tubular injury (P = 0.007) in the 55 mm Hg group. Levels of endothelin 1 were significantly reduced in the 75 mm Hg group (P = 0.026).
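
For readers unfamiliar with the area-under-the-curve units above (mL/min/100 g·h), the sketch below shows how such a value can be obtained from serial renal blood flow readings over the 3-h reperfusion using the trapezoidal rule; the time points and flows are hypothetical, and the study's exact sampling scheme is not given in the abstract.

```python
# Hypothetical serial readings; trapezoidal integration of flow over time.
import numpy as np

time_h = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])             # hours of reperfusion
flow = np.array([90.0, 140.0, 165.0, 170.0, 160.0, 150.0, 145.0])  # mL/min/100 g

# trapezoidal rule -> area under the curve in mL/min/100 g . h
auc = float(np.sum((flow[1:] + flow[:-1]) / 2.0 * np.diff(time_h)))
print(round(auc, 1))   # ~451, same order as the values reported above
```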

Conclusions

A mean arterial pressure of 75 mm Hg during EVNP resulted in less tubular damage and less endothelial injury during ex vivo reperfusion compared with kidneys perfused at 55 mm Hg.

6.

Background

High ratios of fresh frozen plasma to packed red blood cells in damage control resuscitation (DCR) are associated with increased survival. The impact of the volume and type of resuscitative fluid used during high-ratio transfusion has not been analyzed. We hypothesized that outcomes differ according to the type and quantity of resuscitative fluid used in patients who received high-ratio DCR.

Methods

A matched case-control study of patients who received transfusions of ≥4 units of PRBC during damage control surgery over 4.5 years was conducted at a Level I trauma center. All patients received high-ratio DCR (fresh frozen plasma:packed red blood cells >1:2). Demographics and outcomes were compared according to the type and quantity of resuscitative fluids used in combination with high-ratio DCR. A Kaplan-Meier survival analysis was computed among four groups: colloid (median quantity = 1.0 L), <3 L crystalloid, 3–6 L crystalloid, and >6 L crystalloid.

Results

There were 56 patients included in the analysis (28 in the crystalloid group and 28 in the colloid group). Demographics were statistically similar. Median intraoperative PRBC units were 13 (IQR 8–21) in the crystalloid group versus 16 (IQR 12–19) in the colloid group (P = 0.135); median FFP units were 12 (IQR 7–18) versus 12 (IQR 10–18) (P = 0.440). The OR for 10-d mortality in the crystalloid group was 8.41 (95% CI 1.65–42.76; P = 0.01). Kaplan-Meier survival analysis demonstrated the lowest mortality in the colloid group and higher mortality with increasing amounts of crystalloid (P = 0.029).
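
The abstract does not show the underlying counts; as a hedged illustration, the sketch below computes an odds ratio and a Woolf-type 95% CI for 10-d mortality from a hypothetical 2 × 2 table of crystalloid versus colloid patients.

```python
# Hypothetical 2x2 table (not the study's data): rows are exposure groups,
# columns are (died by day 10, survived).
import math

a, b = 12, 16   # crystalloid: died, survived
c, d = 3, 25    # colloid:     died, survived

or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf standard error
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print("OR = %.2f (95%% CI %.2f-%.2f)" % (or_, lo, hi))
```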

Conclusions

During high-ratio DCR, resuscitation with higher volumes of crystalloids was associated with decreased overall survival, whereas low volumes of colloid were associated with increased survival. To improve outcomes without diluting the survival benefit of hemostatic resuscitation, guidelines should focus on effective low-volume resuscitation when high-ratio DCR is used. A multi-institutional analysis is needed to validate these results.

7.

Background

The T-cell activation Rho GTPase–activating protein (TAGAP) gene has a regulatory role in T cell activation. We have previously suggested a correlation between the TAGAP-associated single nucleotide polymorphism rs212388 and protection from anal sepsis in Crohn's disease (CD) patients. The present study sought to evaluate TAGAP's expression in colonic tissue of CD patients with varying disease severity and location.

Materials and methods

Five transverse, 17 left, and 5 sigmoid colectomy specimens from 27 CD patients with varying disease severity (16 male; mean age at diagnosis 26.4 ± 2.2 y) were evaluated for TAGAP messenger RNA expression. Fisher exact, Mann–Whitney, and Welch two-sample t-tests were used for statistical evaluation. Immunohistochemistry confirmed the results.

Results

Patients whose tissue demonstrated lower TAGAP messenger RNA expression (less than the overall mean) were younger at diagnosis (mean age 21.1 ± 6.3 versus 32.5 ± 13 y, P = 0.009). Increased TAGAP expression was seen in moderately or severely diseased tissue versus tissue with no or mild disease (RQ = 1.3 ± 0.34 versus 0.53 ± 0.09, P = 0.050); this difference was most pronounced in the sigmoid colon (P = 0.041). TAGAP expression was increased in more distal tissue, with a significant difference seen when comparing transverse versus sigmoid colon with moderate or severe disease (0.51 ± 0.14 versus 1.9 ± 0.37, P = 0.049).
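
The abstract reports expression as RQ values without stating the quantification method; the 2^−ΔΔCt convention sketched below is one common choice for qPCR relative expression and is shown here with hypothetical cycle-threshold values, not the study's data.

```python
# Common 2^-(delta-delta Ct) relative quantification; all Ct values hypothetical.
def relative_quantification(ct_target, ct_reference, ct_target_cal, ct_ref_cal):
    delta_ct_sample = ct_target - ct_reference          # sample, normalized to a reference gene
    delta_ct_calibrator = ct_target_cal - ct_ref_cal    # calibrator (e.g., unaffected tissue)
    return 2 ** -(delta_ct_sample - delta_ct_calibrator)

# hypothetical cycle thresholds: severely diseased sigmoid tissue vs calibrator
print(round(relative_quantification(24.1, 18.0, 25.0, 18.0), 2))   # RQ ~ 1.87
```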

Conclusions

Colonic expression of TAGAP in CD patients varied according to disease severity and location and was most elevated in patients with severe disease in the sigmoid colon. Whether changes in TAGAP expression are a result of the disease response or inherent to the disease pathophysiology itself remains to be determined. This gene warrants further investigation for its role in CD.

8.

Background

As the number of liver transplantation (LT) cases has increased, perioperative mortality has decreased and short-term survival has improved. However, long-term complications have not yet been fully elucidated.

Purpose

Chronic complications were analyzed individually to identify risk factors and to improve long-term outcomes after LT.

Subjects

Sixty-three LT recipients followed at our outpatient clinic were included in this study. Among them, 58 had undergone living donor LT and 5 deceased donor LT. The original diseases mainly consisted of hepatitis C virus (HCV; 45.9%) and hepatitis B virus (23.0%) infection.

Findings

The median follow-up was 5.4 ± 3.3 years (range, 0.1–17 years). Overall survival at 2, 3, 5, and 10 years was 89.3%, 83.4%, 81.3%, and 81.3%, respectively. Long-term complications mainly consisted of renal dysfunction (62.7%), dyslipidemia (29.4%), diabetes mellitus (21.6%), and arterial hypertension (21.6%). In univariate analysis, HCV (P = .03) and elapsed years after LT (P = .02) were identified as predictive factors for arterial hypertension, and recipient age >50 years (P = .03) and elapsed years after LT (P = .03) for renal dysfunction. In multivariate Cox regression analysis, HCV (odds ratio [OR] 5.25, 95% confidence interval [CI] 1.05–34.06, P = .04) was identified as a predictive factor for arterial hypertension, and recipient age older than 50 years for renal dysfunction (OR 5.67, 95% CI 1.34–28.88, P = .02). The number of elapsed years after transplantation was also identified as a predictive factor for arterial hypertension (OR 13.88, 95% CI 1.91–298.26, P = .01), dyslipidemia (OR 14.15, 95% CI 2.18–290.78, P = .003), and renal dysfunction (OR 4.10, 95% CI 1.09–18.03, P = .04). Fifty percent of the recipients developed renal dysfunction within 8 years after LT, and fluctuation of the estimated glomerular filtration rate (eGFR) within 3 months after LT was significantly associated with the annual decrease in eGFR (r2 = 0.574, P < .0001).
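
The association between early eGFR fluctuation and annual eGFR decline (r2 = 0.574) is a simple linear fit; the sketch below shows how such a fit could be obtained with SciPy, using hypothetical values rather than the study's data.

```python
# Hypothetical per-recipient values; not the study's data.
from scipy import stats

egfr_fluctuation_3mo = [5.2, 12.8, 3.1, 20.4, 9.7]   # early post-LT eGFR fluctuation, mL/min/1.73 m2
annual_egfr_decline = [1.1, 3.9, 0.8, 6.2, 2.5]      # subsequent annual decline, mL/min/1.73 m2 per year

fit = stats.linregress(egfr_fluctuation_3mo, annual_egfr_decline)
print("slope =", round(fit.slope, 3),
      "r2 =", round(fit.rvalue ** 2, 3),
      "P =", round(fit.pvalue, 4))
```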

Conclusion

Renal dysfunction is the most frequent chronic complication after LT. Because an individual recipient's long-term eGFR decline can now be predicted from its early rate of deterioration, renal protection strategies should be precisely targeted to the recipient strata at risk.

9.

Background

Very large non–small cell lung cancers (NSCLC) remain a therapeutic challenge. The objective of this study was to evaluate the effect of surgery in the presence and absence of neoadjuvant radiation (NRT) on survival of patients with T3N0 >7-cm NSCLCs.

Materials and methods

The Surveillance, Epidemiology, and End Results database was used to identify patients undergoing lobectomy or pneumonectomy for T3N0 NSCLC tumors >7 cm from 1999 to 2008. Patients were categorized into groups based on the type of surgery performed and whether NRT was used. Five-year overall survival (OS) and lung cancer-specific survival (LCSS) were estimated by the Kaplan-Meier method, and comparisons were made using log-rank tests and Cox regression models.
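
As an illustration of the survival methods named above, the sketch below estimates Kaplan-Meier 5-year overall survival and a log-rank comparison with the lifelines package, using a few hypothetical patients rather than SEER records.

```python
# Hypothetical follow-up data (months, death indicator, treatment group).
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months": [14, 60, 32, 60, 8, 41],
    "died":   [1, 0, 1, 0, 1, 1],      # 1 = death observed, 0 = censored
    "group":  ["PST", "PST", "PST", "NRT", "NRT", "NRT"],
})
pst = df[df.group == "PST"]
nrt = df[df.group == "NRT"]

# Kaplan-Meier estimate of 5-year (60-month) overall survival per group
for name, g in (("PST", pst), ("NRT", nrt)):
    kmf = KaplanMeierFitter().fit(g.months, event_observed=g.died, label=name)
    print(name, "5-y OS:", float(kmf.survival_function_at_times(60).iloc[0]))

# log-rank comparison between the two groups
result = logrank_test(pst.months, nrt.months,
                      event_observed_A=pst.died, event_observed_B=nrt.died)
print("log-rank P =", result.p_value)
```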

Results

A total of 1301 patients were evaluated, including 1232 undergoing primary surgical therapy (PST) and 69 receiving NRT. NRT was not associated with improvements in 5-y OS (48% versus 41%, P = 0.062) or LCSS (59% versus 52%, P = 0.116) compared with PST. Lobectomies were associated with better 5-y OS (43% versus 33%; P = 0.006) and LCSS (54% versus 43%, P = 0.005) compared with pneumonectomies. On multivariate analysis, NRT did not produce any significant advantage in OS (P = 0.242) or LCSS (P = 0.208). Pneumonectomies were associated with significantly worse OS (hazard ratio, 1.32; P = 0.007) and LCSS (hazard ratio, 1.38; P = 0.005) when compared with lobectomies.

Conclusions

NRT, which most likely was a combination of chemotherapy and radiation, was not associated with improvements in OS or LCSS in patients with T3N0 >7-cm NSCLC compared with PST. When feasible, lobectomy appears more beneficial than pneumonectomy in terms of long-term survival for very large tumors.

10.

Background

According to International Society of Heart and Lung Transplantation criteria, high body mass index (BMI; ≥30 kg/m2) is a relative contraindication for lung transplantation (LT). On the other hand, low BMI may be associated with worse outcome. We investigated the influence of pre-LT BMI on survival after LT in a single-center study.

Methods

Patients were divided according to the World Health Organization criteria into 4 groups: BMI <18.5 kg/m2 (underweight), BMI 18.5–24.9 kg/m2 (normal weight), BMI 25–29.9 kg/m2 (overweight), and BMI ≥30 kg/m2 (obesity). An additional analysis was made per underlying disease.

Results

BMI was determined in a cohort of 546 LT recipients, of whom 28% had a BMI <18.5 kg/m2. Underweight patients had survival similar to that of the normal-weight group (P = .28). Significantly higher mortality was found in overweight (P = .016) and obese patients (P = .031) compared with the normal-weight group. Subanalysis of either underweight (P = .19) or obese (P = .50) chronic obstructive pulmonary disease (COPD) patients did not reveal worse survival. In patients with interstitial lung disease, obesity was associated with increased mortality (P = .031) compared with the normal-weight group. In cystic fibrosis patients, underweight was not associated with a higher mortality rate (P = .12) compared with the normal-weight group.

Conclusions

Low pre-LT BMI did not influence the survival rate in our cohort, independent of the underlying disease.

11.

Background

The American Society of Anesthesiologists (ASA) physical status classification and the Charlson comorbidity index (CCI) are used to assess patients' physical condition before surgery. Studies suggest that the ASA score and CCI might be prognostic indicators of patient outcome. The aim of this study was to determine whether the ASA classification and CCI can predict the risk of anastomotic leak (AL) in patients undergoing colorectal surgery.

Methods

A retrospective analysis of 505 consecutive colorectal resections with primary anastomoses between 2008 and 2012 was performed at a university hospital. ASA score, CCI, surgical procedure, length of stay, age, body mass index (BMI), comorbidities, and postoperative outcomes were analyzed.

Results

Two hundred sixty-five patients had an ASA score of I or II, 227 patients had an ASA score of III, and 13 patients had an ASA score of IV. A total of 19 patients had an anastomotic leak (ASA I–II: 5 patients, 1.9%; ASA III: 12 patients, 5.58%; ASA IV: 2 patients, 18.18%). A higher ASA score was significantly associated with AL on further analysis (OR: 2.99, 95% CI: 1.345–6.670, P = 0.007). When matched for age, BMI, and CCI on logistic regression analysis, increased ASA level was independently related to an increased likelihood of leak (OR for steroids = 14.35, P < 0.01; OR for ASA III versus I–II = 2.02, P = 0.18; OR for ASA IV versus I–II = 8.45, P = 0.03). There were no statistically significant differences in means between the leak and no-leak patients with respect to age (60.69 versus 65.43, P = 0.17), BMI (28.03 versus 28.96, P = 0.46), or CCI (6.19 versus 7.58, P = 0.09).
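
As a hedged illustration of the adjusted analysis above (not the authors' code), the sketch below fits a logistic model of leak on ASA class, age, BMI, and CCI with statsmodels and reports odds ratios; the data are synthetic and the column names are assumptions.

```python
# Synthetic toy data roughly shaped like the cohort; column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 505
df = pd.DataFrame({
    "asa_class": rng.choice(["I-II", "III", "IV"], size=n, p=[0.52, 0.45, 0.03]),
    "age": rng.normal(65, 12, n),
    "bmi": rng.normal(28, 5, n),
    "cci": rng.integers(0, 12, n),
})
# synthetic leak outcome, more likely with higher ASA class (toy data only)
base_risk = df["asa_class"].map({"I-II": 0.02, "III": 0.055, "IV": 0.18})
df["leak"] = (rng.random(n) < base_risk).astype(int)

model = smf.logit("leak ~ C(asa_class) + age + bmi + cci", data=df).fit(disp=0)
print(np.exp(model.params))      # odds ratios (reference level: ASA I-II)
print(np.exp(model.conf_int()))  # 95% CIs on the OR scale
```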

Conclusions

ASA score, but not CCI, is independently associated with anastomotic leak. Patients with a high ASA class should be closely followed postoperatively for AL after colorectal operations.

12.

Background

We sought to assess the independent effect of concomitant adhesions (CAs) on patient outcome in abdominal surgery.

Materials and methods

Using the American College of Surgeons National Surgical Quality Improvement Program data, we created a uniform data set of all gastrectomies, enterectomies, hepatectomies, and pancreatectomies performed between 2007 and 2012 at our tertiary academic center. American College of Surgeons National Surgical Quality Improvement Program data were supplemented with additional variables (e.g., procedure complexity measured by relative value units). The presence of CAs was detected using the Current Procedural Terminology codes for adhesiolysis (44005, 44180, 50715, 58660, and 58740). Cases where adhesiolysis was the primary procedure (e.g., bowel obstruction) were excluded. Multivariable logistic regression analyses were performed to assess the independent effect of CAs on 30-d morbidity and mortality, while controlling for age, comorbidities, and the type, complexity, approach, and emergency status of surgery.
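
As an illustration of the case-identification step (hypothetical column names and example rows, not the study's extract), the sketch below flags operations whose secondary CPT codes include an adhesiolysis code and excludes cases where adhesiolysis was the primary procedure.

```python
# Hypothetical extract: one row per operation, CPT codes stored as strings.
import pandas as pd

ADHESIOLYSIS_CPT = {"44005", "44180", "50715", "58660", "58740"}  # codes listed above

ops = pd.DataFrame({
    "primary_cpt":   ["47120", "48150", "44005", "43620"],          # example codes
    "secondary_cpt": [["44005"], [], ["44180", "38100"], []],
})

# exclude operations where adhesiolysis itself was the primary procedure
ops = ops[~ops["primary_cpt"].isin(ADHESIOLYSIS_CPT)].copy()

# flag operations with a concomitant adhesiolysis code among secondary procedures
ops["concomitant_adhesions"] = ops["secondary_cpt"].apply(
    lambda codes: any(c in ADHESIOLYSIS_CPT for c in codes))
print(ops[["primary_cpt", "concomitant_adhesions"]])
```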

Results

Adhesiolysis was performed in 875 of 5940 operations (14.7%). Operations with CAs were longer (median duration 3.2 versus 2.7 h, P < 0.001), more complex (median relative value unit 37.5 versus 33.4, P < 0.001), performed in sicker patients (American Society of Anesthesiologists class ≥3 in 49.9% versus 41.2%, P < 0.001), and carried a higher risk of inadvertent enterotomies (3.0% versus 0.9%, P < 0.001). In multivariable analyses, CAs independently predicted higher morbidity (adjusted odds ratio [OR], 1.35; 95% confidence interval, 1.13–1.61; P = 0.001). Specifically, CAs independently correlated with superficial and deep or organ-space surgical site infections (OR = 1.42 [1.02–1.86], P = 0.036 and OR = 1.47 [1.09–1.99], P = 0.013, respectively) and with prolonged postoperative hospital stay (≥7 d; OR = 1.34 [1.11–1.61], P = 0.002). No difference in 30-d mortality was detected.

Conclusions

CAs significantly increase morbidity in abdominal surgery. Risk adjustment for the presence of adhesions is crucial in any effort aimed at quality assessment or benchmarking of abdominal surgery.

13.

Background

The impact of pregnancy on the course of Crohn disease is largely unknown. Retrospective surveys have suggested a variable effect, but there are limited population-based clinical data. We hypothesized that pregnant women with Crohn disease would have rates of surgical disease similar to those of a nonpregnant Crohn disease cohort.

Material and methods

International Classification of Diseases, Ninth Revision, Clinical Modification codes were used to identify female Crohn patients from all patients admitted using the Nationwide Inpatient Sample (1998–2009). Women were stratified as either pregnant or nonpregnant. We defined Crohn-related surgical disease as peritonitis, gastrointestinal hemorrhage, intra-abdominal abscess, toxic colitis, anorectal suppuration, intestinal–intestinal fistulas, intestinal–genitourinary fistulas, obstruction and/or stricture, or perforation (excluding appendicitis).

Results

Of the 92,335 women admitted with a primary Crohn-related diagnosis, 265 (0.3%) were pregnant. Pregnant patients were younger (29 versus 44 y; P < 0.001) and had lower rates of tobacco use (6% versus 13%; P < 0.001). Pregnant women with Crohn disease had higher rates of intestinal–genitourinary fistulas (23.4% versus 3.0%; P < 0.001), anorectal suppuration (21.1% versus 4.1%; P < 0.001), and overall surgical disease (59.6% versus 39.2%; P < 0.001). On multivariate logistic regression analysis controlling for malnutrition, smoking, age, and prednisone use, pregnancy was independently associated with higher rates of anorectal suppuration (odds ratio [OR], 5.2; 95% confidence interval [CI], 3.8–7.0; P < 0.001), intestinal–genitourinary fistulas (OR, 10.4; 95% CI, 7.8–13.8; P < 0.001), and overall surgical disease (OR, 2.9; 95% CI, 2.3–3.7; P < 0.001).

Conclusions

Pregnancy in women with Crohn disease is a significant risk factor for Crohn-related surgical disease, in particular, anorectal suppuration and intestinal–genitourinary fistulas.

14.
15.

Background

The purpose of this study was to investigate the relationship between insurance status and outcomes for trauma patients presenting without vital signs undergoing urgent intervention.

Materials and methods

The National Trauma Data Bank was queried for patients presenting with a systolic blood pressure equal to zero and a Glasgow Coma Scale score of three (“clinically dead”), who underwent urgent thoracotomy and/or laparotomy (UTL). Insured patients were compared with uninsured (INS [−]) patients.

Results

There were 18,171 patients presenting clinically dead who had a payment source documented. INS (−) patients were more likely to undergo UTL (5.4% [416/7704] versus 2.7% [285/10,467]; 1.481 [1.390–1.577]; P < 0.001). Of the 689 patients who underwent UTL and met the inclusion criteria, 416 (60.4%) were INS (−). Patients with insurance demonstrated significantly greater survival (9.9% [27/273] versus 1.7% [7/416]; 5.878 [2.596–13.307]; P < 0.001). Adjusting for mechanism, race, age, injury severity, and comorbidities, insured status was independently associated with survival.

Conclusions

The presence of health insurance is independently associated with survival in trauma patients presenting with cardiovascular collapse who undergo urgent surgical intervention.

16.

Background

There are few published data on the outcomes of blood conservation (BC) patients after noncardiac surgery. The objective of this study was to compare the surgical outcomes of patients enrolled in our BC program with those of the general population of surgical patients.

Methods

BC patients at our institution undergoing various surgical procedures were identified from the 2007–2009 National Surgical Quality Improvement Program database and compared with a cohort of conventional care (CC) patients matched by age, gender, and surgical procedure. Univariate and multiple logistic regression analyses were performed to evaluate 30-d postoperative outcomes.

Results

One hundred twenty BC patients were compared with 238 CC patients. The two groups were similar for all preoperative variables except smoking, which was lower in the BC group. On univariate analysis, BC patients had similar mean operating time (148 versus 155 min; P = 0.5), length of stay (5.9 versus 5.5 d; P = 0.7), and rate of return to the operating room (7.5% versus 5.5%; P = 0.4) compared with CC patients. BC and CC patients had similar 30-d morbidity (18% versus 14%; P = 0.3) and mortality rates (1.6% versus 1.3%; P = 1.0), respectively. On multivariable analysis, enrollment in the BC program had no impact on postoperative 30-d morbidity (odds ratio, 1.78; 95% confidence interval, 0.71–4.47) or 30-d mortality (unadjusted odds ratio, 1.33; 95% confidence interval, 0.22–8.05).

Conclusions

Short-term postoperative outcomes in BC patients are similar to those in the general population, and these patients should not be denied surgical treatment based on their unwillingness to receive blood products.

17.

Background

The malignancy rate after alemtuzumab (C-1H) induction in cardiac transplantation is unknown.

Methods

A single-center retrospective analysis was performed of all patients who underwent cardiac transplantation from January 2000 to January 2011 and had no history of malignancy before transplantation. Patients induced with alemtuzumab were compared with a group of patients receiving thymoglobulin or no induction and were assessed for 4-year cancer-free post–heart transplantation survival.

Results

Of 402 patients included, 185 (46.0%) received alemtuzumab, 56 (13.9%) thymoglobulin, and 161 (40.0%) no induction. Baseline characteristics did not differ between groups: mean age 54.0 years, male 77.1%, white 88.6%, ischemic cardiomyopathy 49.0%. The calcineurin inhibitor was tacrolimus in 98.9% of alemtuzumab patients, 98.2% of thymoglobulin patients, and 87.0% of the noninduced (P < .001). The secondary agent was mycophenolate mofetil in all but 16 noninduced patients (9.9%), who received azathioprine. The 4-year cancer-free survival did not differ between groups: 88.1% alemtuzumab, 87.5% thymoglobulin, 88.2% noninduction; P = .088. The 4-year nonskin cancer–free survival was 96.8% for the alemtuzumab group, 96.4% for the thymoglobulin group, and 95.7% for the noninduced; P = .899.

Conclusions

Neither the 4-year cancer-free survival nor the 4-year nonskin cancer–free survival differed between the alemtuzumab, thymoglobulin, and noninduced groups.

18.

Background

Sternal reconstruction with vascularized flaps is central to the management of sternal wound infections and mediastinitis but carries a high risk of complications. There is a need to identify reliable predictors of complication risk to help inform patients and clinicians in preparation for surgery. Unfortunately, body mass index and serum albumin may not be reliable predictors of complication rates. Analytic morphomics provides a robust quantitative method to measure patients' obesity as it pertains to their risk of complications after sternal reconstruction.

Methods

We identified 34 patients with preoperative computed tomography scans of the abdomen from a cohort of sternal reconstructions performed between 1997 and 2010. Using semiautomated analytic morphomics, we identified the patients' skin and fascia layers between the 9th and 12th thoracic vertebral levels; from these landmarks, we calculated morphomic measurements of the patients' abdomens, including the total body cross-sectional area and the cross-sectional area of subcutaneous fat. We obtained the incidence of complications from chart review and correlated the incidence of complications (including seroma, hematoma, recurrent wounds, mediastinitis, tracheostomy, and death) with the patients' morphomic measurements.
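
As a simplified illustration of one morphomic measurement, the sketch below computes a cross-sectional area from a binary segmentation mask of a single axial CT slice; the mask, pixel spacing, and region are hypothetical, and the study's semiautomated pipeline is more involved.

```python
# Hypothetical single-slice area measurement from a binary segmentation mask.
import numpy as np

def cross_sectional_area_mm2(mask: np.ndarray, spacing_mm=(0.78, 0.78)) -> float:
    """mask: 2-D boolean array, True inside the segmented region (e.g., subcutaneous fat)."""
    pixel_area = spacing_mm[0] * spacing_mm[1]   # mm2 per pixel
    return float(mask.sum()) * pixel_area

demo_mask = np.zeros((512, 512), dtype=bool)
demo_mask[100:300, 150:400] = True               # toy rectangular region, 200 x 250 pixels
print(round(cross_sectional_area_mm2(demo_mask), 1))   # 50,000 px * 0.6084 mm2/px
```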

Results

Sixty-two percent of patients (n = 21) suffered complications after their operation. Those who suffered from complications, relative to those who did not have complications, had increased visceral fat area (12,547.2 mm2 versus 6569.9 mm2, P = 0.0080), subcutaneous fat area (16,520.2 mm2 versus 8020.1 mm2, P = 0.0036), total body area (91,028.6 mm2 versus 67,506.5 mm2, P = 0.0022), fascia area (69,238.4 mm2 versus 56,730.9 mm2, P = 0.0118), total body circumference (1101.8 mm versus 950.2 mm, P = 0.0017), and fascia circumference (967.5 mm versus 868.1 mm, P = 0.0077). We also demonstrated a significant positive correlation between the previously mentioned morphomic measurements and the incidence of complications in multivariate logistic regression models, with odds ratios ranging from 1.19 to 3.10 (P values ranging from 0.010 to 0.022).

Conclusions

Increases in abdominal morphomic measurements correlate strongly with the incidence of complications in patients undergoing sternal reconstruction. This finding may influence preoperative risk stratification and surgical decision making in this patient population.

19.

Background

Due to the shortage of suitable organs, the demand for partial liver transplantation from living donors has increased worldwide. N-acetylcysteine (NAC) has shown protective effects as a free radical scavenger during hypothermic preservation and warm ischemia–reperfusion liver injury; however, no study has reported its effects in partial liver transplantation. The aim of this study was to analyze the impact of NAC on liver graft microcirculation and graft function after partial liver transplantation in rats.

Methods

Orthotopic partial liver transplantations were performed in 40 rats following cold storage in histidine-tryptophan-ketoglutarate solution for 3 h with 20 mM NAC (NAC group, n = 20) or without (control group, n = 20). We assessed portal circulation, graft microcirculation, and biochemical analyses of plasma at 1, 3, 24, and 168 h after portal reperfusion.

Results

Data are reported as control versus NAC (median and range). Portal venous pressure was significantly lower with NAC (P = 0.03). Microcirculation measured by laser Doppler was significantly improved with NAC throughout the time course (P = 0.003). Alanine aminotransferase levels were significantly lower in the NAC group (P < 0.05). Total antioxidative capacity was significantly higher in the NAC group at 1 h after reperfusion (Trolox equivalents: median, 3 μM; range, 2.9–6.7 versus median, 16.45 μM; range, 10.4–18.8). Lipid peroxidation was significantly reduced in the NAC group (median, 177.6 nmol/mL; range, 75.9–398.1 versus median, 71.5 nmol/mL; range, 58.5–79 at 3 h).

Conclusions

This study showed that NAC treatment during cold storage improved the microcirculation and preservation quality of the partial liver graft, likely because of enhanced antioxidant capacity and reduced lipid peroxidation.

20.

Background

Before bariatric surgery, some patients with type 2 diabetes mellitus (T2DM) experience improvement in blood glucose control and reduced insulin requirements while on a preoperative low-calorie diet (LCD). We hypothesized that patients who exhibit a significant glycemic response to this diet are more likely to experience remission of their diabetes in the postoperative period.

Materials and methods

Insulin-dependent T2DM patients undergoing bariatric surgery between August 2006 and February 2011 were eligible for inclusion. Insulin requirements at days 0 and 10 of the LCD were compared. Patients with a ≥50% reduction in the total insulin dosage required to maintain appropriate blood glucose control were considered rapid responders to the preoperative LCD; all others were non–rapid responders. We analyzed T2DM remission rates up to 1 y postoperatively.
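
As shown in the small sketch below (a hypothetical helper, not the authors' code), the classification rule reduces to a ≥50% drop in total daily insulin between day 0 and day 10 of the LCD.

```python
# Hypothetical helper expressing the rapid-responder rule described above.
def is_rapid_responder(insulin_units_day0: float, insulin_units_day10: float) -> bool:
    reduction = (insulin_units_day0 - insulin_units_day10) / insulin_units_day0
    return reduction >= 0.50

print(is_rapid_responder(80, 35))   # True  (56% reduction)
print(is_rapid_responder(80, 55))   # False (31% reduction)
```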

Results

A total of 51 patients met inclusion criteria and 29 were categorized as rapid responders (57%). The remaining 22 were considered non–rapid responders (43%). The two groups did not differ demographically. Rapid responders had greater T2DM remission rates at 6 (44% versus 13.6%; P = 0.02) and 12 mo (72.7% versus 5.9%; P < 0.01). In patients undergoing laparoscopic gastric bypass, rapid responders showed greater excess weight loss at 3 mo (40.1% versus 28.2%; P < 0.01), 6 mo (55.2% versus 40.2%; P < 0.01), and 12 mo (67.7% versus 47.3%; P < 0.01).

Conclusions

Insulin-dependent T2DM bariatric surgery patients who display a rapid glycemic response to the preoperative LCD are more likely to experience early remission of T2DM postoperatively and greater weight loss.
