Similar Articles
20 similar articles found.
1.

Objective

This study investigated the incidence and clinical relevance of the slow-flow phenomenon after infrapopliteal balloon angioplasty.

Methods

This retrospective, single-center study included 161 consecutive patients with critical limb ischemia (173 limbs) who underwent endovascular treatment for infrapopliteal lesions between January 2012 and May 2015. The overall technical success rate was 88%. Of these lesions, 30 limbs presented with slow flow after angioplasty.

Results

Total occlusion (90% vs 63%; P < .01) and severe calcification (43% vs 8%; P < .01) were more common in the slow-flow group. Kaplan-Meier analysis revealed that freedom from major amputation (60% vs 86%; log-rank, P < .01) and wound healing at 2 years (77% vs 91%; log-rank, P = .03) were significantly less common in the slow-flow group. Cox proportional hazards analysis identified Rutherford class 6 (hazard ratio [HR], 6.4; 95% confidence interval [CI], 2.8-15.8; P < .01), the slow-flow phenomenon (HR, 3.9; 95% CI, 1.6-8.9; P < .01), and hemodialysis (HR, 3.2; 95% CI, 1.2-11.1; P = .02) as independent predictors of major amputation, and Rutherford class 6 (HR, 0.3; 95% CI, 0.2-0.6; P < .01), the slow-flow phenomenon (HR, 0.5; 95% CI, 0.3-0.9; P = .02), and pedal arch (HR, 1.6; 95% CI, 1.0-2.5; P = .04) as predictors of wound healing.

Conclusions

The slow-flow phenomenon after infrapopliteal balloon angioplasty occurred in 18.6% of limbs. This phenomenon may result in poor outcomes.
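The survival comparisons above (freedom from major amputation, log-rank tests, Cox hazard ratios) follow a standard workflow. Below is a minimal sketch of that workflow using the lifelines library; the input file and column names (time_months, major_amputation, slow_flow, rutherford6, hemodialysis) are hypothetical stand-ins for the study's variables, not the authors' actual data or code.

```python
# Sketch: freedom-from-amputation analysis comparable to the one above.
# Hypothetical columns: time_months (follow-up), major_amputation (1 = event),
# slow_flow, rutherford6, hemodialysis (binary covariates).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("limbs.csv")  # hypothetical input file

slow = df[df.slow_flow == 1]
normal = df[df.slow_flow == 0]

# Kaplan-Meier estimate of freedom from major amputation in the slow-flow group
kmf = KaplanMeierFitter()
kmf.fit(slow.time_months, slow.major_amputation, label="slow flow")
print(kmf.survival_function_at_times([24]))  # freedom from amputation at 2 years

# Log-rank comparison between groups
lr = logrank_test(slow.time_months, normal.time_months,
                  slow.major_amputation, normal.major_amputation)
print(f"log-rank P = {lr.p_value:.3f}")

# Cox proportional hazards model for candidate predictors
cph = CoxPHFitter()
cph.fit(df[["time_months", "major_amputation", "slow_flow",
            "rutherford6", "hemodialysis"]],
        duration_col="time_months", event_col="major_amputation")
cph.print_summary()  # hazard ratios with 95% CIs
```

An analogous fit with wound healing as the event would produce the second set of hazard ratios.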

2.

Objective

The effect of gender on outcomes after lower extremity revascularization is controversial. The aim of our systematic review and meta-analysis was to evaluate gender-related outcomes after peripheral vascular interventions.

Methods

We systematically searched MEDLINE, Embase, Cochrane Database, and Scopus to identify studies comparing outcomes after revascularization according to gender. A random-effects model was used to pool outcomes. Time-to-event data were reported using hazard ratios (HRs) and dichotomous data were presented using odds ratios (ORs).

Results

Included were 40 studies. Pooling of short-term outcomes after intervention showed that women had significantly increased risks of 30-day mortality (OR, 1.31; 95% confidence interval [CI], 1.11-1.55; P = .001), amputation (OR, 1.07; 95% CI, 1.02-1.12; P = .002), early graft thrombosis (OR, 1.56; 95% CI, 1.28-1.90; P < .0001), embolization (OR, 1.64; 95% CI, 1.24-2.17; P = .0005), incisional site complication (OR, 1.56; 95% CI, 1.34-1.80; P < .0001), cardiac events (OR, 1.21; 95% CI, 1.16-1.26; P < .0001), stroke (OR, 1.35; 95% CI, 1.19-1.53; P < .0001), and pulmonary complication (OR, 1.07; 95% CI, 1.03-1.12; P = .0006). No significant differences were found between women and men for short-term reinterventions (OR, 1.06; 95% CI, 0.73-1.54; P = .74) and renal complications (OR, 1.03; 95% CI, 0.76-1.39; P = .86). No significant differences in long-term outcomes between women and men were found, with similar rates of cumulative survival (HR, 1.10; 95% CI, 0.97-1.24; P = .12), primary patency (HR, 1.14; 95% CI, 1.00-1.30; P = .06), secondary patency (HR, 1.07; 95% CI, 0.86-1.34; P = .54), and limb salvage (HR, 0.93; 95% CI, 0.70-1.24; P = .63). However, in the open surgery subgroup, women had significantly reduced survival compared with men (HR, 1.21; 95% CI, 1.01-1.44; P = .04).

Conclusions

Women have inferior short-term outcomes but similar long-term outcomes compared with men after lower limb revascularization. A higher treatment threshold may be warranted when considering intervening on women with symptomatic peripheral arterial disease owing to the increased risks of postprocedural mortality and complications.
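The random-effects pooling described in Methods can be illustrated compactly. The sketch below implements the DerSimonian-Laird estimator on log odds ratios, recovering each study's standard error from its 95% CI; the three (OR, lower, upper) tuples are placeholders, not studies from this review.

```python
# Sketch: DerSimonian-Laird random-effects pooling of odds ratios.
# Each study contributes an OR and its 95% CI; values are placeholders.
import numpy as np

or_ci = [(1.4, 1.1, 1.8), (1.2, 0.9, 1.6), (1.5, 1.2, 1.9)]  # (OR, lo, hi)

y = np.array([np.log(o) for o, _, _ in or_ci])           # log odds ratios
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96)    # SE from CI width
               for _, lo, hi in or_ci])
w = 1 / se**2                                            # fixed-effect weights

# Between-study variance (tau^2) via the DerSimonian-Laird estimator
ybar_fe = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - ybar_fe) ** 2)
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - (len(y) - 1)) / C)

w_re = 1 / (se**2 + tau2)                                # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se_pooled = np.sqrt(1 / np.sum(w_re))

print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se_pooled):.2f}-"
      f"{np.exp(pooled + 1.96 * se_pooled):.2f})")
```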

3.

Background

Urinary tract infections (UTIs) are the most common infectious complication in kidney transplant recipients (KTRs). No recommendations exist regarding treatment of asymptomatic bacteriuria. We aimed to identify potential risk factors and the microbiological profile of UTIs, the role of treating asymptomatic bacteriuria, and the effect of bacteriuria on graft outcomes within the first year post-transplantation.

Methods

We performed a retrospective analysis of UTIs in KTRs transplanted between January 2012 and December 2013 in 2 transplantation centers. Patients were routinely commenced on prophylactic sulfamethoxazole-trimethoprim. Clinical and microbiological data were analyzed for the first year following transplantation.

Results

In all, 276 KTRs were evaluated; 67% were men, and the mean age was 51 years. At 12 months post-transplantation, 158 (57%) KTRs had no bacteriuria, 75 (27%) had asymptomatic bacteriuria, 21 (8%) had symptomatic UTIs without further complication, and 22 (8%) with UTIs developed either pyelonephritis or urosepsis. The most frequent pathogens identified were Enterococcus faecalis and Escherichia coli, and 36% of organisms were multidrug resistant. Female sex was a risk factor for infection (P = .002), and the presence of a double-J ureteral stent significantly increased the risk of asymptomatic bacteriuria and symptomatic UTIs (P = .003). Diabetes, age, and prior transplantation did not increase risk. Presence of infection was not associated with increased rejection, and renal function at 12 months was similar. Of 420 episodes of bacteriuria (324 asymptomatic), progression to symptomatic UTI with the same organism was significantly more frequent after untreated asymptomatic bacteriuria (n = 185) than after treated asymptomatic bacteriuria (n = 139; P = .002).

Conclusion

Bacteriuria post–kidney transplantation is common, affecting nearly half of KTRs in the first year after transplantation. Treatment of asymptomatic bacteriuria may be beneficial to prevent subsequent episodes of symptomatic UTIs.
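The treated-versus-untreated comparison above reduces to a 2 × 2 contingency test. A minimal sketch with SciPy follows; the progression counts are illustrative placeholders, since the abstract reports only the group sizes (185 untreated, 139 treated) and the P value.

```python
# Sketch: comparing progression to symptomatic UTI between untreated and
# treated asymptomatic bacteriuria. Counts are illustrative placeholders.
from scipy.stats import chi2_contingency

#                 progressed  did not progress
table = [[30, 155],   # untreated asymptomatic bacteriuria (n = 185)
         [8, 131]]    # treated asymptomatic bacteriuria   (n = 139)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, P = {p:.4f}")
```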

4.

Background

Kidney transplantation is known to increase the survival of dialysis patients by ameliorating cardiac status, including both systolic and diastolic function. We aimed to evaluate the role of immunosuppressive drug regimens on the cardiac function of kidney transplant recipients (KTRs).

Methods

We prospectively evaluated 120 KTRs immediately before and 1 year after kidney transplantation, using tissue Doppler echocardiography. Triple immunosuppressive therapy including tacrolimus, mycophenolic acid (MPA), and prednisolone was started for all patients. After 3 to 6 months, the tacrolimus dose was lowered to achieve target serum levels of 5 to 8 ng/mL in both groups. MPA was switched to everolimus, with target levels of 4 to 6 ng/mL, in group 1 (n = 58), whereas group 2 (n = 62) continued with MPA.

Results

No differences in age, sex, or dialysis duration existed between the groups. The prevalence of diabetic or hypertensive nephropathy as the etiology of chronic kidney disease was similar. Blood pressure was strictly controlled. The number of acute rejection episodes did not differ between the groups, and no graft loss was observed in either group. Cardiac parameters, including ejection fraction, left ventricular diastolic diameter, posterior wall thickness, and left ventricular hypertrophy, improved significantly between baseline and 1 year after transplantation. Notably, the improvement in all of the parameters mentioned above was even greater in group 1 than in group 2 (P = .02, P = .03, P = .04, and P = .04, respectively). Multivariate analysis of the significant variables identified by univariate analysis revealed albumin (relative risk [RR] = 1.05, P = .02) and everolimus (RR = 1.07, P = .01) as two independent predictors of improved cardiovascular function.

Conclusions

The greater improvement in cardiovascular function with everolimus may favor the choice of this drug in KTRs.

5.

Objective

The efficacy of atrial fibrillation ablation in rheumatic mitral valve disease has been regarded as inferior to that in nonrheumatic disease. This study aimed to evaluate the net clinical benefit of adding concomitant atrial fibrillation ablation to rheumatic mitral valve surgery.

Methods

Among 1229 consecutive patients with atrial fibrillation undergoing rheumatic mitral valve surgery from 1997 to 2016 (age 54.4 ± 11.7 years; 68.2% female), 812 (66.1%) received concomitant ablation of atrial fibrillation (ablation group) and 417 (33.9%) underwent valve surgery alone (no-ablation group). Death and thromboembolic events were compared between these groups. Mortality was treated as a competing risk when evaluating thromboembolic outcomes. To reduce selection bias, inverse probability of treatment weighting was applied.

Results

Freedom from atrial fibrillation at 5 years was 76.5% ± 1.8% and 5.3% ± 1.1% in the ablation and no-ablation groups, respectively (P < .001). The ablation group had significantly lower risks of death (hazard ratio [HR], 0.69; 95% confidence interval [CI], 0.52-0.93) and thromboembolic events (HR, 0.49; 95% CI, 0.32-0.76) than the no-ablation group. Time-varying Cox analysis revealed that the occurrence of stroke after surgery was significantly associated with death (HR, 3.97; 95% CI, 2.36-6.69). In subgroup analyses, the reduction in the composite risk of death and thromboembolic events was observed in the mechanical replacement (n = 829; HR, 0.53; 95% CI, 0.39-0.73), bioprosthetic replacement (n = 239; HR, 0.67; 95% CI, 0.41-1.08), and repair (n = 161; HR, 0.17; 95% CI, 0.06-0.52) subgroups (P for interaction = .47).

Conclusions

Surgical atrial fibrillation ablation during rheumatic mitral valve surgery was associated with a lower risk of long-term mortality and thromboembolic events. Therefore, atrial fibrillation ablation for rheumatic mitral valve disease may be a reasonable option.
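Inverse probability of treatment weighting, used above to reduce selection bias, fits a propensity model for treatment assignment and then weights each patient by the inverse probability of the treatment actually received. A minimal sketch follows; the file name and covariates (age, lvef, af_duration) are hypothetical, and a real analysis would also check covariate balance and truncate extreme weights.

```python
# Sketch: inverse probability of treatment weighting (IPTW).
# Hypothetical columns: ablation (1 = treated), age, lvef, af_duration,
# time_years, death.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.read_csv("mitral_af.csv")  # hypothetical input file
X = df[["age", "lvef", "af_duration"]]

# Propensity score: probability of receiving ablation given covariates
ps = LogisticRegression(max_iter=1000).fit(X, df.ablation).predict_proba(X)[:, 1]

# Weight = 1/ps for treated patients, 1/(1 - ps) for untreated
df["iptw"] = df.ablation / ps + (1 - df.ablation) / (1 - ps)

# Weighted Cox model for death; robust variance is advisable with weights
cph = CoxPHFitter()
cph.fit(df[["time_years", "death", "ablation", "iptw"]],
        duration_col="time_years", event_col="death",
        weights_col="iptw", robust=True)
cph.print_summary()
```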

6.

Objectives

We conducted propensity score matching to determine whether the use of the right internal thoracic artery (RITA) confers a survival advantage compared with the radial artery (RA) as the second arterial conduit in coronary artery bypass grafting.

Methods

The study population included a highly selected low-risk group of patients who received the RITA (n = 764) or the RA (n = 1990) as second arterial conduit. We obtained 764 matched pairs that were comparable for all pretreatment variables. A time-segmented Cox regression model that stratified on the matched pairs was used to investigate the effect of treatment on late mortality.

Results

After a mean follow-up of 10.2 ± 4.5 years (maximum 17.3 years), survival probabilities at 5, 10, and 15 years were 96.4% ± 0.7% versus 95.4% ± 0.7%, 91.0% ± 1.1% versus 89.1% ± 1.2%, and 82.4% ± 1.9% versus 77.2% ± 2.5% in the RITA and RA groups, respectively. During the first 4 years, RITA and RA were comparable in terms of mortality (hazard ratio [HR], 1.00; 95% confidence interval [CI], 0.56-1.78; P = .98). However, after 4 years RITA was associated with a significant reduction in late mortality (HR, 0.67; 95% CI, 0.48-0.95; P = .02). RITA was superior to RA when the experimental conduit was used to graft the left coronary system (HR, 0.69; 95% CI, 0.47-0.99; P = .04) but not the right coronary system (HR, 0.98; 95% CI, 0.59-1.62; P = .93).

Conclusions

In a highly selected low-risk group of patients, the use of the RITA as second arterial conduit instead of the RA was associated with better survival when used to graft the left but not the right coronary artery.
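A time-segmented Cox model such as the one above lets the treatment effect differ before and after a landmark (here, 4 years) by splitting each patient's follow-up into episodes. The sketch below illustrates the splitting step and a time-varying fit with lifelines; the column names are hypothetical, and the matched-pair stratification used by the authors is omitted for brevity.

```python
# Sketch: time-segmented Cox model with the treatment effect allowed to
# differ before and after 4 years. Hypothetical columns: id, time_years,
# death, rita (1 = RITA as second conduit).
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.read_csv("conduits.csv")  # hypothetical input file
CUT = 4.0  # landmark separating early and late follow-up

rows = []
for r in df.itertuples():
    if r.time_years <= CUT:
        rows.append((r.id, 0.0, r.time_years, r.death, r.rita, 0))
    else:
        rows.append((r.id, 0.0, CUT, 0, r.rita, 0))                 # early, no event
        rows.append((r.id, CUT, r.time_years, r.death, r.rita, 1))  # late segment
long_df = pd.DataFrame(rows, columns=["id", "start", "stop", "event",
                                      "rita", "late"])
# Interaction term: extra (log) hazard of RITA in the late period only;
# the period main effect itself is absorbed by the baseline hazard.
long_df["rita_late"] = long_df.rita * long_df.late

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df[["id", "start", "stop", "event", "rita", "rita_late"]],
        id_col="id", start_col="start", stop_col="stop", event_col="event")
ctv.print_summary()  # rita = early-period HR; rita + rita_late = late period
```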

7.

Objective

Arteriovenous grafts remain reliable substitutes for permanent hemodialysis access in patients without a suitable autogenous conduit. Advances in conduit design and in the endovascular management of access-related complications call into question the preference for synthetic conduits over biologic grafts in contemporary practice. In this study, we compared outcomes between bovine carotid artery (BCA) biologic grafts and expanded polytetrafluoroethylene (ePTFE) grafts for hemodialysis access in a recent cohort of patients.

Methods

This was a single-institution retrospective review of 120 consecutive grafts placed in 98 patients between January 1, 2011, and June 30, 2014. Univariate methods (χ2, analysis of variance, t-test) were used to compare demographic and medical characteristics of patients who received each graft type. Kaplan-Meier, log-rank tests, univariate and multivariate logistic analyses, and Cox regression analyses were used to evaluate patency and graft complications. Outcomes were defined and analyzed according to reporting guidelines published by the Society for Vascular Surgery.

Results

Of the 120 grafts studied, 52 (43%) were BCA and 68 (57%) were ePTFE. Successful graft use for dialysis was 96% (95% confidence interval [CI], 90%-100%) for BCA and 84% (95% CI, 74%-93%) for ePTFE (P = .055). Comparing BCA vs ePTFE, estimates for primary patency were 30% vs 43% at 1 year and 16% vs 29% at 2 years (P = .27). Primary assisted patency was 36% vs 45% at 1 year and 24% vs 35% at 2 years (P = .57). Secondary patency was 67% vs 48% at 1 year and 67% vs 38% at 2 years (P = .05). There were no differences in primary (hazard ratio [HR], 0.70; 95% CI, 0.40-1.28; P = .25) and primary assisted (HR, 0.87; 95% CI, 0.46-1.65; P = .67) patency for BCA compared with ePTFE. However, secondary patency was higher for BCA compared with ePTFE (HR, 2.92; 95% CI, 1.29-6.61; P = .01). Graft infection rates during the study period were 15.4% for BCA and 20.6% for ePTFE (P = .47). The significant predictors of graft failure were higher body mass index (HR, 1.06; 95% CI, 1.00-1.11; P = .04) and hyperlipidemia (HR, 2.94; 95% CI, 1.27-6.76; P = .01).

Conclusions

In this study of a recent cohort of patients who received arteriovenous grafts, primary and primary assisted patencies were similar between BCA and ePTFE grafts. However, secondary patency was higher for BCA, indicating better durability of the biologic graft than of ePTFE grafts in patients whose anatomy precludes placement of an arteriovenous fistula.

8.

Objective

We evaluated whether video-assisted thoracoscopic lobectomy for locally advanced non–small cell lung cancer could be performed safely and with acceptable long-term outcomes using our improved technique, compared with standard thoracotomy lobectomy in a well-balanced population.

Methods

Patients with clinical stage II and IIIA non–small cell lung cancer who underwent lobectomy were reviewed. Video-assisted thoracoscopic lobectomies were all performed with Wang's technique by surgeons who had overcome the learning curve and achieved proficiency. Using propensity-matched analysis, perioperative outcomes and long-term survival were compared.

Results

Matching based on propensity scores produced 120 patients in each group. The conversion rate to thoracotomy was 11.7%. After thoracoscopic lobectomy, hospital length of stay was shorter than after thoracotomy (9.2 vs 12 days; P = .014) despite similar rates of postoperative complications (30/120 [25%] vs 34/120 [28.3%]; P = .56). Disease-free survival (49.1% vs 42.2%; P = .40) and overall survival (55.0% vs 57.1%; P = .73) at 5 years were similar between groups. Although advanced pathologic stage (hazard ratio [HR], 2.018; 95% confidence interval [CI], 1.330-3.062) and no postoperative chemotherapy (HR, 1.880; 95% CI, 1.236-2.858) were independently associated with an increased hazard of death in multivariable Cox regression at each time point in follow-up, thoracoscopic lobectomy was not (HR, 1.075; 95% CI, 0.714-1.620; P = .73).

Conclusions

With continued experience and optimized technique, video-assisted thoracoscopic lobectomy can be performed in the majority of cases without compromising perioperative outcomes and oncologic efficacy.
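Propensity-matched comparisons like this one pair each patient in one group with the closest-scoring patient in the other. Below is a minimal 1:1 nearest-neighbor sketch; the covariates are hypothetical, matching is done with replacement, and a real analysis would add a caliper and balance diagnostics.

```python
# Sketch: 1:1 nearest-neighbor propensity score matching.
# Hypothetical columns: vats (1 = thoracoscopic), age, stage, fev1.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("lobectomy.csv")  # hypothetical input file
X = df[["age", "stage", "fev1"]]

# Propensity score: probability of undergoing the thoracoscopic approach
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df.vats).predict_proba(X)[:, 1]

treated = df[df.vats == 1]
control = df[df.vats == 0]

# For each treated patient, find the control with the closest propensity score
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
dist, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])
print(f"matched cohort: {len(matched)} patients")
```

This sketch matches with replacement (a control can be reused); matching without replacement, within a caliper, is more typical in practice.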

9.

Purpose

Studies have shown that arecoline, the major alkaloid component of betel nuts, alters the activity of enzymes in the cytochrome P450 (CYP-450) family. Tacrolimus, an immunosuppressant that protects against organ rejection in transplant recipients, is mainly metabolized by CYP3A enzymes and has a narrow therapeutic range. We aimed to investigate whether dose-adjusted blood trough levels of tacrolimus differed over time between betel nut-chewing and non–betel nut-chewing liver transplant recipients.

Methods

In this retrospective case-control study, 14 active betel nut-using liver recipients were matched at a 1:2 ratio to 28 non-betel nut-using liver recipients by sex, age, graft source, duration of follow-up after liver transplantation, and estimated glomerular filtration rate. Differences in liver function indices, renal function indices, and dose-adjusted blood trough levels of tacrolimus over an 18-month period were compared between the 2 groups using the generalized estimating equation (GEE) approach.

Results

Dose-adjusted blood trough levels of tacrolimus were significantly lower (P = .04) in betel nut chewers (mean = 0.81; median = 0.7; 95% confidence interval [CI], 0.73 to 0.90) than in nonchewers (mean = 1.12; median = 0.88; 95% CI, 1.03 to 1.22) during the 18-month study period. However, there was no significant difference in renal or liver function indices between the 2 groups.

Conclusion

Liver transplant recipients receiving tacrolimus tend to have lower blood trough levels of the drug over time if they chew betel nuts.
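Because each recipient contributes many trough measurements over 18 months, observations are correlated within patient, which is what the generalized estimating equation approach accommodates. A minimal statsmodels sketch follows; the file and column names are hypothetical.

```python
# Sketch: GEE for repeated dose-adjusted tacrolimus trough levels.
# Hypothetical columns: patient_id, month, chewer (1 = betel nut user),
# trough_per_dose.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("tacrolimus_levels.csv")  # hypothetical input file

# Exchangeable working correlation: measurements within a patient are
# assumed equally correlated with one another.
model = smf.gee("trough_per_dose ~ chewer + month",
                groups="patient_id",
                data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
result = model.fit()
print(result.summary())  # chewer coefficient: mean group difference
```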

10.

Background

The aim of this study was to analyze the long-term immunologic outcomes of living-related kidney transplantation according to the donor-recipient relationship.

Methods

This retrospective single-center study included adult kidney transplant recipients (KTRs) transplanted between 2000 and 2014. Among 1117 KTRs, 178 patients (15.9%) received living-related donations. These patients were further categorized according to the donor-recipient relationship: 65 transplantations between siblings, 39 father-to-child (F-t-C), and 74 mother-to-child (M-t-C) donations. Allograft biopsies were performed for clinically suspected rejections. Data analysis included patient and graft survival, biopsy-proven rejections (T-cell mediated [TCMR] or antibody mediated), and development of de novo donor-specific antibody. Outcome data were assessed over a period of up to 14 years.

Results

There was no significant difference between the groups (F-t-C, M-t-C, and siblings) with regard to HLA mismatches, prior kidney transplantations, time on dialysis, and cold ischemia time. Among KTRs with related donors, the type of relationship had no significant influence on graft survival. F-t-C and M-t-C pairs showed comparable incidences of TCMR at 7 years post-transplantation, both significantly exceeding the rate in sibling-to-sibling pairs (26.2% and 26.8% vs 10%, respectively; P = .043). A multivariate Cox regression analysis adjusted for recipient age, donor age, and HLA (A, B, DR) mismatches identified both M-t-C and F-t-C donations as important independent risk factors for TCMR (hazard ratio, 8.13; P < .001 and hazard ratio, 8.09; P = .001, respectively). There was no significant difference between the groups concerning the incidence of antibody-mediated rejection and de novo donor-specific antibody.

Conclusion

Our results indicate that parent-to-child kidney donation is an independent risk factor for TCMR.

11.

Background

A higher body mass index (BMI) before kidney transplantation (KT) is associated with increased mortality and allograft loss in kidney transplant recipients (KTRs). However, the effect of changes in BMI after KT on these outcomes remains uncertain. The aim of this study was to investigate the effect of baseline BMI and changes in BMI on clinical outcomes in KTRs.

Methods

A total of 869 KTRs were enrolled from a multicenter observational cohort study from 2012 to 2015. Patients were divided into low and high BMI groups before KT based on a BMI cutoff of 23 kg/m2. Differences in acute rejection and cardiovascular disease (CVD) between the 2 groups were analyzed. In addition, clinical outcomes across 4 BMI groups defined by the change in BMI 1 year after KT were compared. Associations between BMI change and laboratory findings were also evaluated.

Results

Patients with a higher BMI before KT showed significantly increased CVD after KT (P = .027) compared with patients with a lower BMI. However, among the KTRs with a higher baseline BMI, only persistently higher BMI was associated with increased CVD during the follow-up period (P = .003). Patients with persistently higher BMI had significantly decreased high-density lipoprotein cholesterol and increased hemoglobin, triglyceride, and hemoglobin A1c levels. Baseline BMI and post-transplantation change in BMI were not related to acute rejection in KTRs.

Conclusions

Both baseline BMI and the change in BMI during the first year after KT were associated with CVD in KTRs. More careful monitoring of obese KTRs whose BMI does not decrease after KT is required.

12.

Aim

The aim of this study was to evaluate risk factors affecting graft and patient survival after kidney transplantation from deceased donors.

Methods

We retrospectively analyzed the outcomes of 186 transplantations from deceased donors performed at our center between 2006 and 2014. The recipients were divided into two groups: group I (141 recipients without graft loss) and group II (45 recipients with graft loss). Kaplan-Meier analysis, the log-rank test, and Cox proportional hazards regression were used.

Results

The characteristics of the two groups were similar except for the renal resistive index at the last follow-up. When graft survival and mortality at the first, third, and fifth years were analyzed, tacrolimus (Tac)-based regimens were superior to cyclosporine (CsA)-based regimens (P < .001). Risk factors for graft loss at the first year included cardiac cause of donor death (versus cerebrovascular accident [CVA]; hazard ratio [HR], 6.36; 95% confidence interval [CI], 1.84–22.05; P = .004), older age at transplantation (HR, 1.05; 95% CI, 1.02–1.08; P < .001), and a high serum creatinine level at 6 months post-transplantation (HR, 1.74; 95% CI, 1.48–2.03; P < .001), whereas younger donor age decreased risk (HR, 0.97; 95% CI, 0.95–1.00; P = .019). The Tac-based regimen also carried a 3.63-fold lower risk than the CsA-based regimen (95% CI, 1.47–8.97; P = .005) and a 2.93-fold lower risk than regimens without calcineurin inhibitors (95% CI, 1.13–7.63; P = .027). When graft survival at 3 years was analyzed, it differed by underlying disease, being lower with diabetes mellitus than with idiopathic causes and pyelonephritis (P = .035). In Cox regression analysis at year 3, older age at transplantation (HR, 1.20; 95% CI, 1.04–1.39; P = .014) and serum creatinine level at month 6 post-transplantation (HR, 1.65; 95% CI, 1.42–1.90; P < .001) were significant risk factors for graft loss; pretransplantation hemodialysis (HD) plus peritoneal dialysis (PD) carried a 2.22-fold higher risk than HD alone (95% CI, 1.08–4.58; P = .03). When graft survival and mortality at year 5 were analyzed, graft survival with diabetes mellitus was lower than with all other underlying diseases. In Cox regression analysis at year 5, younger donor age was protective for graft survival (HR, 0.73; 95% CI, 0.62–0.86; P < .001), whereas older age at transplantation (HR, 1.40; 95% CI, 1.20–1.64; P < .001) and serum creatinine level at month 6 post-transplantation (HR, 1.39; 95% CI, 1.19–1.61; P < .001) were significant risk factors; PD carried a 3.32-fold higher risk than HD (95% CI, 1.28–8.61; P = .014). In Cox regression analysis at year 1, cardiac cause of donor death (versus CVA; HR, 5.28; 95% CI, 1.37–20.31; P = .016), a CsA-based regimen (versus Tac; HR, 4.95; 95% CI, 1.78–13.78; P = .002), HD plus PD (versus HD alone; HR, 3.26; 95% CI, 1.28–8.30; P = .013), older age at transplantation (HR, 1.08; 95% CI, 1.04–1.11; P < .001), serum creatinine level at month 6 post-transplantation (HR, 1.34; 95% CI, 1.11–1.62; P = .003), and low HLA mismatches (HR, 1.67; 95% CI, 1.01–2.70; P = .044) were risk factors for mortality. At year 3, a CsA-based regimen (versus Tac; HR, 3.54; 95% CI, 1.32–9.47; P = .012), PD (versus HD; HR, 5.04; 95% CI, 1.41–18.05; P = .013), HD plus PD (versus HD alone; HR, 3.51; 95% CI, 1.37–9.04; P = .009), and older age at transplantation (HR, 1.27; 95% CI, 1.05–1.53; P = .015) were risk factors for mortality. At year 5, older age at transplantation (HR, 1.47; 95% CI, 1.23–1.77; P < .001), PD (versus HD; HR, 9.21; 95% CI, 3.09–27.45; P < .001), and a CsA-based regimen (versus Tac; HR, 2.75; 95% CI, 1.04–7.23; P = .041) were risk factors for mortality, whereas younger donor age decreased risk (HR, 0.71; 95% CI, 0.56–0.86; P < .001).

Conclusion

Cardiac cause of donor death, a CsA-based immunosuppressive regimen, donor age, serum creatinine level at month 6 post-transplantation, and the pretransplantation renal replacement modality affected mortality and graft survival after deceased-donor kidney transplantation.

13.

Rationale

Pneumonia remains the most common major infection after cardiac surgery despite numerous preventive measures.

Objectives

To prospectively examine the timing, pathogens, and risk factors, including modifiable management practices, for postoperative pneumonia and to estimate its impact on clinical outcomes.

Methods

A total of 5158 adult cardiac surgery patients were enrolled prospectively in a cohort study across 10 centers. All infections were adjudicated by an independent committee. Competing risk models were used to assess the association of patient characteristics and management practices with pneumonia within 65 days of surgery. Mortality was assessed by Cox proportional hazards model and length of stay by a multistate model.

Measurements and Main Results

The cumulative incidence of pneumonia was 2.4%, 33% of which occurred after discharge. Older age, lower hemoglobin level, chronic obstructive pulmonary disease, steroid use, operative time, and left ventricular assist device/heart transplant were risk factors. Ventilation time (24-48 vs ≤24 hours: hazard ratio [HR], 2.83; 95% confidence interval [CI], 1.72-4.66; >48 hours: HR, 4.67; 95% CI, 2.70-8.08), nasogastric tubes (HR, 1.80; 95% CI, 1.10-2.94), and each unit of red blood cells transfused (HR, 1.16; 95% CI, 1.08-1.26) increased the risk of pneumonia. Prophylactic use of second-generation cephalosporins (HR, 0.66; 95% CI, 0.45-0.97) and platelet transfusions (HR, 0.49; 95% CI, 0.30-0.79) were protective. Pneumonia was associated with a marked increase in mortality (HR, 8.89; 95% CI, 5.02-15.75) and a longer length of stay of 13.55 ± 1.95 days (bootstrap 95% CI, 10.31-16.58).

Conclusions

Pneumonia continues to have a major impact on the health of patients after cardiac surgery. After adjustment for baseline risk, several specific management practices were associated with pneumonia, offering targets for quality improvement and further research.
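A competing risk model, as used here, estimates the cumulative incidence of pneumonia while treating death without pneumonia as a competing event rather than as censoring. A minimal sketch with the Aalen-Johansen estimator in lifelines follows; the column names and event coding are hypothetical.

```python
# Sketch: cumulative incidence of pneumonia with death as a competing risk.
# Hypothetical columns: days (time to first event or censoring) and
# event (0 = censored, 1 = pneumonia, 2 = death without pneumonia).
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("cardiac_surgery.csv")  # hypothetical input file

ajf = AalenJohansenFitter()
ajf.fit(df.days, df.event, event_of_interest=1)  # pneumonia is event code 1

# Cumulative incidence of pneumonia at 65 days; a naive 1 - KM estimate
# would overstate it by treating deaths as ordinary censorings.
print(ajf.cumulative_density_.loc[:65].tail(1))
```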

14.

Background

The BETA-2 score, computed from a single fasting blood sample, was developed to estimate beta-cell function after islet transplantation (ITx) and was validated internally at a high-volume ITx center (Edmonton). Our goal was to validate BETA-2 externally, at our center.

Methods

Areas under the receiver operating characteristic curve (AUROCs) were obtained to determine whether the beta score or the BETA-2 score better detected insulin independence and glucose intolerance.

Results

We analyzed values from 48 mixed-meal tolerance tests (MMTTs) in 4 ITx recipients with long-term follow-up of up to 140 months (LT group) and from 54 MMTTs in 13 short-term follow-up patients (ST group). The AUROC for no need of insulin support was 0.776 (95% confidence interval [CI], 0.539–1; P = .02) and 0.922 (95% CI, 0.848–0.996; P < .001) for the beta score, and 0.79 (95% CI, 0.596–0.983; P = .003) and 0.941 (95% CI, 0.86–1; P < .001) for BETA-2, in the LT and ST groups, respectively; the scores did not differ significantly. In the LT group, a BETA-2 score ≥13.03 predicted no need for insulin supplementation with a sensitivity of 98%, specificity of 50%, positive predictive value (PPV) of 93%, and negative predictive value (NPV) of 75%. In the ST group, the optimal cutoff was ≥13.63, with a sensitivity of 92% and specificity, PPV, and NPV of 82% to 95%. For the detection of glucose intolerance, the BETA-2 cutoffs were <19.43 in the LT group and <17.23 in the ST group, with sensitivity >76% and specificity, PPV, and NPV >80% in both groups.

Conclusion

BETA-2 score was successfully validated externally and is a practical tool allowing for frequent and reliable assessments of islet graft function based on a single fasting blood sample.
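Cutoffs such as ≥13.03 are typically read off a ROC curve, often by maximizing the Youden index (sensitivity + specificity − 1). A minimal scikit-learn sketch follows; the score and label arrays are illustrative placeholders, not MMTT data from this study.

```python
# Sketch: AUROC and an optimal cutoff via the Youden index.
# Hypothetical inputs: beta2 (scores), independent (1 = insulin-independent).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

beta2 = np.array([8.2, 11.5, 13.1, 14.0, 16.8, 19.2, 7.4, 12.9])
independent = np.array([0, 0, 1, 1, 1, 1, 0, 1])

auroc = roc_auc_score(independent, beta2)
fpr, tpr, thresholds = roc_curve(independent, beta2)

youden = tpr - fpr                    # Youden J at each candidate threshold
i = np.argmax(youden)
print(f"AUROC = {auroc:.3f}, optimal cutoff >= {thresholds[i]:.2f} "
      f"(sensitivity {tpr[i]:.0%}, specificity {1 - fpr[i]:.0%})")
```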

15.

Background

Precise monitoring of the glomerular filtration rate (GFR) is needed to assess allograft function in kidney transplant recipients (KTRs). The GFR is widely estimated with formulas based on serum cystatin C (SCys) and serum creatinine (SCr) levels. We compared the efficacy of SCys-based equations with that of SCr-based equations in predicting allograft function.

Methods

We calculated the Modification of Diet in Renal Disease (MDRD), Chronic Kidney Disease Epidemiology Collaboration creatinine (CKD-EPI Cr), CKD-EPI creatinine–cystatin C (CKD-EPI Cr/Cys), and CKD-EPI cystatin C (CKD-EPI Cys) equations in 70 KTRs. The measured GFR (mGFR) was defined as the GFR estimated by technetium-99m–diethylene triamine pentaacetic acid (99mTc-DTPA) clearance. The accuracy and precision of the equations were compared against the mGFR. The performance characteristics of SCr and SCys were analyzed with receiver operating characteristic (ROC) curves to ascertain the sensitivity and specificity at a cutoff of <45 mL/min/1.73 m2.

Results

Overall, MDRD and CKD-EPI Cys did not differ significantly from mGFR (P = .05 and P = .077, respectively), whereas CKD-EPI Cr and CKD-EPI Cr/Cys significantly underestimated mGFR (P < .001 and P = .005, respectively). In the subgroup of patients with mGFR <45 mL/min/1.73 m2, CKD-EPI Cys showed little bias (P = .122), whereas MDRD significantly underestimated mGFR (P = .037). The area under the ROC curve for predicting mGFR <45 mL/min/1.73 m2 was 0.80 for SCys, better than the 0.763 for SCr.

Conclusions

Cystatin C–based equations predicted allograft function better than creatinine-based equations in KTRs, including patients with lower GFR. Cystatin C might be a good alternative measurement for monitoring allograft function.
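Of the equations compared above, the cystatin C-based one is simple to compute directly. The sketch below implements the 2012 CKD-EPI cystatin C equation as commonly published; the coefficients are reproduced from memory and should be verified against the original publication before any real use.

```python
# Sketch: the 2012 CKD-EPI cystatin C equation (coefficients as commonly
# published; verify against the original paper before clinical use).
def ckd_epi_cys(scys_mg_l: float, age: int, female: bool) -> float:
    """Estimated GFR in mL/min/1.73 m2 from serum cystatin C (mg/L)."""
    ratio = scys_mg_l / 0.8
    egfr = (133
            * min(ratio, 1.0) ** -0.499   # applies when cystatin C <= 0.8
            * max(ratio, 1.0) ** -1.328   # applies when cystatin C > 0.8
            * 0.996 ** age)
    return egfr * 0.932 if female else egfr

# Example: a 51-year-old man with serum cystatin C of 1.4 mg/L
print(f"{ckd_epi_cys(1.4, 51, female=False):.1f} mL/min/1.73 m2")
```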

16.

Background

The importance of heart rate (HR) measurement as a prognostic factor has been recognized in many clinical conditions, such as hypertension, coronary artery disease, and heart failure. Patients with liver cirrhosis tend to have an increased resting HR as a consequence of hyperdynamic circulation. In the current study, we examined whether an increased pretransplant resting HR is associated with overall mortality in cirrhotic patients following liver transplantation (LT).

Patients and Methods

We retrospectively collected and analyzed the data of 881 liver recipients who underwent LT surgery between October 2009 and September 2012. Patients were categorized into 3 groups by tertile of resting HR as follows: tertile 1 group, HR ≤ 65 beats per minute (bpm); tertile 2 group, HR 66 to 80 bpm; and tertile 3 group, HR > 80 bpm.

Results

Kaplan-Meier analysis showed that the all-cause mortality rate differed significantly across tertiles of HR (P = .016, log-rank test). Multivariate Cox regression analysis showed that the tertile 3 group was significantly associated with a higher risk of all-cause mortality (hazard ratio, 1.83; 95% confidence interval, 1.10–3.07; P = .021) compared with the tertile 1 group, after adjustment for clinically significant variables from univariate analysis.

Conclusions

Our results demonstrate that pretransplant resting tachycardia identifies cirrhotic patients at high risk of death following LT, suggesting that further study is needed to clarify the relationship between HR burden and sympathetic cardiac neuropathy.

17.

Background

Acute kidney injury (AKI) is a common complication in the early period of lung transplantation (LTx). We aimed to describe the incidence and perioperative risk factors associated with AKI following LTx.

Methods

Clinical data of 30 patients who underwent LTx were retrospectively reviewed. Primary outcomes were the development of AKI and patient mortality within 30 postoperative days. Postoperative AKI was determined based on the creatinine criteria of the Acute Kidney Injury Network (AKIN) classification. Secondary outcomes included the association of AKI with demographic and clinical parameters and with treatment modalities in the pre- and postoperative periods.

Results

Of the 30 LTx recipients included, AKI occurred in 16 patients (53.4%) within the first 30 days. Length of intensive care unit stay (P = .06), length of hospital stay (P = .008), and duration of mechanical ventilation (P = .03) were higher in patients with AKI than in those without. Factors independently associated with AKI were intraoperative hypotension (odds ratio [OR], 0.500; 95% confidence interval [CI], 1.145 to 26.412; P = .02), longer duration of mechanical ventilation (OR, 1.204; 95% CI, 0.870 to 1.665; P = .03), and systemic infection in the postoperative period (OR, 8.067; 95% CI, 1.538 to 42.318; P = .014). Short-term mortality was similar in patients with and without AKI.

Conclusion

By the AKIN definition, AKI occurred in half of the patients following LTx. Several variables, including intraoperative hypotension, longer duration of mechanical ventilation, and systemic infection in the postoperative period, independently predicted AKI in LTx recipients.
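The AKIN creatinine criterion used above flags AKI when serum creatinine rises by at least 0.3 mg/dL within 48 hours or to at least 1.5 times baseline. A small sketch of that check follows; it models only the creatinine component (not urine output or staging), with thresholds as commonly summarized from the AKIN classification.

```python
# Sketch: AKIN creatinine criterion for AKI (creatinine component only;
# urine-output criteria and staging are not modeled).
def akin_aki(baseline_cr: float, current_cr: float) -> bool:
    """True if the creatinine change meets the AKIN definition of AKI:
    an absolute rise >= 0.3 mg/dL (within 48 h) or >= 1.5x baseline."""
    return (current_cr - baseline_cr >= 0.3) or (current_cr >= 1.5 * baseline_cr)

print(akin_aki(1.0, 1.4))  # True: absolute rise of 0.4 mg/dL
print(akin_aki(0.8, 1.0))  # False: rise of 0.2 mg/dL and only 1.25x baseline
```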

18.

Background

This study investigated the prevalence of osteoporosis and the risk factors for its progression in kidney transplant recipients (KTRs).

Methods

Dual-energy X-ray absorptiometry was used to prospectively measure changes in bone mineral density (BMD) before kidney transplantation (KT) and 1 year after transplantation in 207 individuals. We also analyzed the risk factors for osteoporosis progression during this period.

Results

Prior to KT, the mean BMD score (T-score of the femur neck area) was −2.1 ± 1.2, and the prevalence of osteoporosis was 41.5% (86/207). At 1 year post-transplantation, the mean BMD score had significantly decreased to −2.3 ± 1.1 (P < .001), and the prevalence of osteoporosis had increased to 47.3% (98/207; P = .277). The BMD score worsened over the study period in 69.1% (143/207) of patients, improved in 24.1% (50/207), and showed no change in 6.8% (14/207). Minimal intact parathyroid hormone (iPTH) improvement after KT was found to be an independent risk factor for osteoporosis progression.

Conclusions

This study demonstrates progressive loss of BMD after KT; sustained secondary hyperparathyroidism might influence the progression of osteoporosis.

19.

Background

Single-ventricle palliation (SVP) for children with unbalanced atrioventricular septal defect (uAVSD) is thought to carry a poor prognosis, but limited data have been reported.

Methods

We performed a retrospective review of children with uAVSD who underwent SVP at a single institution. Data were obtained from medical records and correspondence with general practitioners and cardiologists.

Results

Between 1976 and 2016, a total of 139 patients underwent SVP for uAVSD. A neonatal palliative procedure was performed in 83.5% of these patients (116 of 139), and early mortality occurred in 11.2% (13 of 116). Ninety-four patients underwent stage II palliation, with an early mortality of 6.4% (6 of 94). Eighty patients (57.6%) underwent Fontan completion, with an early mortality of 3.8% (3 of 80). Interstage mortality was 11.7% (12 of 103) between stages I and II and 17.0% (15 of 88) between stage II and Fontan. Long-term survival was 66.5% (95% confidence interval [CI], 57.9%-73.9%) at 5 years, 64.4% (95% CI, 55.5%-72.0%) at 15 years, and 57.8% (95% CI, 47.5%-66.8%) at 25 years. Survival post-Fontan was 94.9% (95% CI, 86.9%-98.0%) at 5 years, 92.0% (95% CI, 80.6%-96.8%) at 15 years, and 82.4% (95% CI, 61.5%-92.6%) at 25 years. Risk factors associated with death or transplantation were aortic atresia (hazard ratio [HR], 5.3; P = .03) and hypoplastic aortic arch (HR, 2.5; P = .02). Atrioventricular valve operations were required in 31.7% of the patients (44 of 139), with 31.8% of them (14 of 44) requiring a further operation.

Conclusions

Children undergoing SVP for uAVSD have substantial mortality, with <60% survival at 25 years. However, survival of children who achieve Fontan completion is better than has been reported previously.

20.

Background

This study was performed to assess the impact of soft tissue imbalance on the knee flexion angle 2 years after posterior stabilized total knee arthroplasty (TKA).

Methods

A total of 329 consecutive varus knees were included to assess the association of the knee flexion angle 2 years after TKA with preoperative, intraoperative, and postoperative variables. All intraoperative soft tissue measurements were performed by a single surgeon under spinal anesthesia in a standardized manner, including the subvastus approach, a reduced patella, and no use of a pneumatic tourniquet.

Results

Multiple linear regression analysis showed no significant correlation of the knee flexion angle 2 years after TKA with intraoperative valgus imbalance at 90-degree flexion or with the difference in soft tissue tension between 90-degree flexion and 0-degree extension (β = −0.039; 95% confidence interval [CI], −0.88 to 0.80; P = .93 and β = 0.015; 95% CI, −0.29 to 0.32; P = .92, respectively). The preoperative flexion angle was significantly correlated with the knee flexion angle 2 years after TKA (β = 0.42; 95% CI, 0.33 to 0.51; P < .0001).

Conclusion

Avoiding valgus imbalance at 90-degree flexion and aiming for strictly equal soft tissue tension between 90-degree flexion and 0-degree extension had little practical value with regard to knee flexion angle 2 years after posterior stabilized TKA.
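The multiple linear regression reported above regresses the 2-year flexion angle on preoperative and intraoperative variables. A minimal statsmodels sketch follows; the file and column names are hypothetical.

```python
# Sketch: multiple linear regression for 2-year knee flexion angle.
# Hypothetical columns: flexion_2y, preop_flexion, valgus_imbalance_90,
# tension_diff_90_0.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tka.csv")  # hypothetical input file

model = smf.ols(
    "flexion_2y ~ preop_flexion + valgus_imbalance_90 + tension_diff_90_0",
    data=df).fit()
print(model.summary())  # beta coefficients with 95% CIs, as reported above
```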
