Similar Articles
20 similar articles found (search time: 31 ms)
1.
Primary graft dysfunction (PGD) after lung transplantation may result from ischemia reperfusion injury (IRI). The innate immune response to IRI may be mediated by Toll-like receptor and IL-1-induced long pentraxin-3 (PTX3) release. We hypothesized that elevated PTX3 levels were associated with PGD. We performed a nested case control study of lung transplant recipients with idiopathic pulmonary fibrosis (IPF) or chronic obstructive pulmonary disease (COPD) from the Lung Transplant Outcomes Group cohort. PTX3 levels were measured pretransplant, and 6 and 24 h postreperfusion. Cases were subjects with grade 3 PGD within 72 h of transplantation and controls were those without grade 3 PGD. Generalized estimating equations and multivariable logistic regression were used for analysis. We selected 40 PGD cases and 79 non-PGD controls. Plasma PTX3 level was associated with PGD in IPF but not COPD recipients (p for interaction < 0.03). Among patients with IPF, PTX3 levels at 6 and 24 h were associated with PGD (OR = 1.6, p = 0.02 at 6 h; OR = 1.4, p = 0.008 at 24 h). Elevated PTX3 levels were associated with the development of PGD after lung transplantation in IPF patients. Future studies evaluating the role of innate immune activation in IPF and PGD are warranted.

2.
The association between pretransplant serum albumin concentration and post-transplant outcomes in kidney transplant recipients is unclear. We hypothesized that in transplant-waitlisted hemodialysis patients, lower serum albumin concentrations are associated with worse post-transplant outcomes. Linking the 5-year patient data of a large dialysis organization (DaVita) to the Scientific Registry of Transplant Recipients, we identified 8961 hemodialysis patients who underwent first kidney transplantation. Mortality or graft failure and delayed graft function (DGF) risks were estimated by Cox regression (hazard ratio [HR]) and logistic regression (odds ratio [OR]), respectively. Patients were 48 ± 13 years old and included 37% women and 27% diabetics. Higher pretransplant serum albumin was associated with lower mortality, graft failure, and DGF risk even after multivariate adjustment for case-mix, malnutrition-inflammation complex, and transplant-related variables. Every 0.2 g/dL higher pretransplant serum albumin concentration was associated with 13% lower all-cause mortality (HR = 0.87 [95% confidence interval: 0.82–0.93]), 17% lower cardiovascular mortality (HR = 0.83 [0.74–0.93]), 7% lower combined risk of death or graft failure (HR = 0.93 [0.89–0.97]), and 4% lower DGF risk (OR = 0.96 [0.93–0.99]). Hence, lower pretransplant serum albumin level is associated with worse post-transplant outcomes. Clinical trials to examine interventions that improve nutritional status in transplant-waitlisted hemodialysis patients and their impact on post-transplant outcomes are indicated.

3.
Children who receive a non-renal solid organ transplant may develop secondary renal failure requiring kidney transplantation. We investigated outcomes of 165 pediatric kidney transplant recipients who previously received a heart, lung, or liver transplant using data from 1988 to 2012 reported to the United Network for Organ Sharing. Patient and allograft survival were compared with 330 matched primary kidney transplant (PKT) recipients. Kidney transplantation after solid organ transplant (KASOT) recipients experienced similar allograft survival: 5- and 10-year graft survival was 78% and 60% in KASOT recipients, compared to 80% and 61% in PKT recipients (p = 0.69). However, KASOT recipients demonstrated worse 10-year patient survival (75% KASOT vs. 97% PKT, p < 0.001). Competing risks analysis indicated that KASOT recipients more often experienced graft loss due to patient death (p < 0.001), whereas allograft failure per se was more common in PKT recipients (p = 0.01). To study more recent outcomes, kidney transplants performed from 2006 to 2012 were separately investigated. Since 2006, KASOT and PKT recipients had similar 5-year graft survival (82% KASOT vs. 83% PKT, p = 0.48), although 5-year patient survival of KASOT recipients remained inferior (90% KASOT vs. 98% PKT, p < 0.001). We conclude that despite decreased patient survival, kidney allograft outcomes in pediatric KASOT recipients are comparable to those of PKT recipients.

4.

Introduction

The neutrophil-lymphocyte ratio (NLR) is an indicator of inflammatory status. We studied the effect of an elevated preoperative NLR in the recipient on the risk of developing delayed graft function (DGF) after kidney transplantation.

Methods

We retrospectively analysed the preoperative white blood cell count of renal transplant recipients between 2003 and 2005. An NLR >3.5 was considered elevated. There were 398 kidney transplant recipients of whom 249 received organs from donors after brain death (DBD), 61 from donors after circulatory death (DCD), and 88 from living donors.

Results

One hundred three patients (26%) developed DGF, of whom 67 (65%) had NLRs >3.5. Of 295 recipients with primary graft function, only 44 (15%) had an elevated NLR. Univariate analysis revealed three factors that significantly influenced graft function: NLR >3.5, cold ischemic time (CIT) >15 hours, and donor type. On multivariate analysis, both donor type (DCD: hazard ratio [HR] = 2.421, confidence interval [CI] = 1.195–4.905, P = .014; LD: HR = 0.289, CI = 0.099–0.846, P = .024) and NLR (HR = 10.673, CI = 6.151–18.518, P < .0001) remained significant.

Conclusions

An elevated recipient preoperative NLR may contribute to an increased risk of developing DGF; this effect appears to be more pronounced in patients receiving grafts from living donors.
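The NLR used above is simply the ratio of absolute neutrophil to lymphocyte counts from a differential white blood cell count. A minimal sketch of the dichotomization described in this abstract (the >3.5 cutoff is the one this study used; the function names are illustrative):

```python
def nlr(neutrophils: float, lymphocytes: float) -> float:
    """Neutrophil-lymphocyte ratio from absolute counts (e.g. 10^9 cells/L)."""
    if lymphocytes <= 0:
        raise ValueError("lymphocyte count must be positive")
    return neutrophils / lymphocytes

def elevated_nlr(neutrophils: float, lymphocytes: float, cutoff: float = 3.5) -> bool:
    """True if the NLR exceeds the study's cutoff (elevated, per this abstract)."""
    return nlr(neutrophils, lymphocytes) > cutoff

print(round(nlr(4.2, 1.2), 2))       # 3.5
print(elevated_nlr(7.0, 2.5))        # False (NLR = 2.8, below the cutoff)
```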

5.
INTRODUCTION: Renal transplant recipients with an elevated body mass index (BMI) have been shown to have inferior patient survival compared to patients with lower BMI. However, previous studies could not establish a link between increased BMI and decreased death-censored graft survival. Obesity in nontransplant patients has been associated with hypertension, hyperlipidemia, type II diabetes, proteinuria, and glomerulopathy. Given this evidence, it is possible that renal transplant recipients with an elevated BMI may have worse long-term graft survival. To investigate this hypothesis, we retrospectively analyzed 51,927 primary, adult renal transplants registered in the USRDS. METHODS: BMI at the date of transplant was calculated for all patients as BMI = body weight (in kg) divided by stature (height, in meters) squared. BMI values were further categorized into 11 categories: below 18, from 18 to 36 at 2-unit increments, and above 36 kg/m2. Primary study end points were graft and patient survival. Secondary study end points were death-censored graft survival, chronic allograft failure, delayed graft function, and acute rejection (AR). Cox proportional hazard and logistic regression models investigated the link between categorized BMI and the study end points, correcting for potential confounding variables. RESULTS: BMI showed a very strong association with outcomes after renal transplantation. The extremes of very high and very low BMI were associated with significantly worse patient and graft survival. The same was true for death-censored graft survival and chronic allograft failure. Elevated BMI was also associated with an increased risk of delayed graft function, while lower BMI was significantly protective. Acute rejection did not show any significant association with BMI. CONCLUSIONS: BMI has a very strong association with outcomes after renal transplantation, independent of most of the known risk factors for patient and graft survival. The extremes of very high and very low BMI before renal transplantation are important risk factors for patient and graft survival. It is important to note that elevated BMI was significantly associated with worse graft survival independent of patient survival. Whether prospective weight adjustment before renal transplantation can favorably affect posttransplant risk needs to be assessed by further studies.
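The BMI formula and the 11-category binning described in the Methods can be sketched as follows. The bin edges (below 18, 2-unit bins from 18 to 36, above 36 kg/m2) come from the abstract; the handling of values exactly on a boundary is my assumption:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    """Bin a BMI into the 11 categories used in the study:
    <18, then 2-unit bins from 18 to 36, then >36 kg/m2.
    Boundary values are assigned to the lower bin (an assumption)."""
    if value < 18:
        return "<18"
    if value > 36:
        return ">36"
    lo = min(34, 18 + 2 * int((value - 18) // 2))  # clamp so 36.0 falls in 34-36
    return f"{lo}-{lo + 2}"

print(round(bmi(70, 1.75), 1))           # 22.9
print(bmi_category(bmi(70, 1.75)))       # 22-24
```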

6.
BACKGROUND: Increased graft ischemic time and donor age are risk factors for early death after heart transplantation, but the effect of these variables on survival after lung transplantation has not been determined in a large, multinational study. METHODS: All recipients of cadaveric lung transplantations performed between October 1, 1987 and June 30, 1997 that were reported to the United Network for Organ Sharing/International Society for Heart and Lung Transplantation (UNOS/ISHLT) Registry were analyzed. Patient survival rates were estimated using Kaplan-Meier methods. Multivariate logistic regression was used to determine the impact of donor and recipient characteristics on patient survival after transplantation. To examine whether the impact of donor age varied with ischemic time, interactions between the 2 terms were examined in a separate multivariate logistic regression model. RESULTS: Kaplan-Meier survival did not differ according to the total lung graft ischemia time, but recipient survival was significantly adversely affected by young (≤10 years) or old (≥51 years) donor age (p = 0.01). On multivariate analysis, neither donor age nor lung graft ischemic time per se was an independent predictor of early survival after transplantation, except when quadratic terms of these variables were included in the model. The interaction between donor age and graft ischemia time, however, predicted 1-year mortality after lung transplantation (p = 0.005), especially if donor age was greater than 55 years and ischemic time was greater than 6 to 7 hours. CONCLUSIONS: Graft ischemia time alone is not a risk factor for early death after lung transplantation. Very young or old donor age was associated with decreased early survival, whereas the interaction between donor age and ischemic time was a significant predictor of 1-year mortality after transplantation. Cautious expansion of donor acceptance criteria (especially as regards ischemic time) is advisable, given the critical shortage of donor lung grafts.

7.
Background

Accumulated knowledge on the outcomes related to size mismatch in lung transplantation derives from predicted total lung capacity equations rather than individualized measurements of donors and recipients. The increasing availability of computed tomography (CT) makes it possible to measure the lung volumes of donors and recipients before transplantation. We hypothesize that CT-derived lung volumes predict a need for surgical graft reduction and primary graft dysfunction.

Methods

Donors from the local organ procurement organization and recipients from our hospital from 2012 to 2018 were included if their CT exams were available. The CT lung volumes and plethysmography total lung capacity were measured and compared with predicted total lung capacity using Bland-Altman methods. We used logistic regression to predict the need for surgical graft reduction and ordinal logistic regression to stratify the risk for primary graft dysfunction.

Results

A total of 315 transplant candidates with 575 CT scans and 379 donors with 379 CT scans were included. The CT lung volumes closely approximated plethysmography lung volumes and differed from the predicted total lung capacity in transplant candidates. In donors, CT lung volumes systematically underestimated predicted total lung capacity. Ninety-four donors and recipients were matched and transplanted locally. Larger donor and smaller recipient lung volumes estimated by CT predicted a need for surgical graft reduction and were associated with higher primary graft dysfunction grade.

Conclusion

The CT lung volumes predicted the need for surgical graft reduction and primary graft dysfunction grade. Adding CT-derived lung volumes to the donor-recipient matching process may improve recipients' outcomes.

8.
Background

In the current practice of lung transplantation, donor and recipient genders are neither directly considered nor matched. However, some data have suggested a possible effect of gender combinations on survival following lung transplantation.

Methods

A total of 249 adult lung transplant recipients at a single center between February 1988 and December 2008 were analyzed retrospectively for donor-recipient gender matching. We compared mortality by calculating long-term survival rates after transplantation using the Kaplan-Meier method, with comparisons using the log-rank (Mantel-Cox) test. Statistical significance of the mean effects of size matching was assessed by paired Student t tests and Wilcoxon signed rank tests.

Results

Kaplan-Meier survival analysis showed that recipient gender (male vs. female) had no effect on outcomes after lung transplantation at 5 years (P = .5379), 10 years (P = .107), 15 years (P = .0841), or 20 years (P = .0711). No effect of gender on lung transplantation outcomes was observed with donor-recipient gender mismatches at 5 years (P = .1804), 10 years (P = .1457), 15 years (P = .0731), or 20 years (P = .0629). Similarly, no differences were observed for each gender combination. The degree of size matching was defined as the ratio of donor-to-recipient predicted total lung capacity. The ratios were similar for donor-recipient gender matches and significantly different for donor-recipient gender mismatches.

Conclusions

These analyses suggest that gender is not a significant independent risk factor affecting survival after lung transplantation. Size mismatch caused by gender mismatch did not increase mortality.

9.

Background

Pretreatment neutrophil-to-lymphocyte ratio (NLR) is a marker of systemic inflammation that has been associated with adverse survival in a variety of malignancies. However, the relationship between NLR and oncologic outcomes following radical cystectomy (RC) for urothelial carcinoma of the bladder (UCB) has not been well studied.

Objective

To evaluate the association of preoperative NLR with clinicopathologic outcomes following RC.

Design, setting, and participants

We identified 899 patients who underwent RC without neoadjuvant therapy at our institution between 1994 and 2005 and who had a pretreatment NLR.

Outcome measurements and statistical analysis

Preoperative NLR (within 90 d prior to RC) was recorded. Recurrence-free, cancer-specific, and overall survival were estimated using the Kaplan-Meier method and compared using the log-rank test. Multivariate Cox proportional hazard and logistic regression models were used to analyze the association of NLR with clinicopathologic outcomes.

Results and limitations

Median postoperative follow-up was 10.9 yr (interquartile range: 8.3–13.9 yr). Higher preoperative NLR was associated with significantly increased risks of pathologic extravesical tumor extension (odds ratio [OR]: 1.07; p = 0.03) and lymph node involvement (OR: 1.09; p = 0.02). On univariate analysis, 10-yr cancer-specific survival was significantly worse among patients with a preoperative NLR ≥2.7 than among those with NLR <2.7 (51% vs. 64%; p < 0.001). Moreover, on multivariate analysis, increased preoperative NLR was independently associated with greater risks of disease recurrence (hazard ratio [HR]: 1.04; p = 0.02), death from bladder cancer (HR: 1.04; p = 0.01), and all-cause mortality (HR: 1.03; p = 0.01).

Conclusions

Elevated preoperative NLR among patients undergoing RC is associated with significantly increased risk for locally advanced disease as well as subsequent disease recurrence, and cancer-specific and all-cause mortality. These data suggest that serum NLR may be a useful prognostic marker for preoperative patient risk stratification, including consideration for neoadjuvant therapy and clinical trial enrollment.
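Many of the studies in this list estimate survival curves with the Kaplan-Meier method and compare them with the log-rank test. A minimal product-limit estimator, shown on toy data that are not from any of the cohorts above:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimates.
    times: follow-up times; events: 1 = event (e.g. death), 0 = censored.
    Returns a list of (time, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    out = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        ties = 0
        # gather all subjects with the same follow-up time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            ties += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk  # multiply by conditional survival
            out.append((t, surv))
        n_at_risk -= ties  # both events and censorings leave the risk set
    return out

# toy data: events at t=1 and t=3, censoring at t=2 and t=4
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0]))
# [(1, 0.75), (3, 0.375)]
```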

10.
BACKGROUND: New immunosuppressive drugs such as anti-interleukin-2 receptor antibodies (aIL2R) and mycophenolate mofetil (MMF) have reduced the incidence of acute rejection after renal transplantation. Whether matching donor and recipient human leukocyte antigen (HLA) antigens is still relevant in patients receiving modern immunosuppression has been questioned. METHODS: We retrospectively analyzed the incidence and risk factors of acute rejection during the first posttransplant year and the impact of acute rejection on long-term graft survival in a cohort of 208 renal transplant patients treated with aIL2R (basiliximab, n=166; daclizumab, n=42), calcineurin inhibitors (tacrolimus, n=180; cyclosporin, n=28), mycophenolate mofetil, and steroids. Graft and patient survival were calculated by the Kaplan-Meier method. Risk factors for acute rejection were analyzed by logistic regression modeling. RESULTS: Twenty-seven patients were treated for acute rejection (26 biopsy-proven) during the first posttransplant year. The Kaplan-Meier estimate of first-year acute rejection was 13.2%. The number of HLA mismatches (odds ratio [OR] 1.65 per HLA mismatch) and long periods of dialysis before transplantation (OR 3.1 for more than 4 years of dialysis) were the only independent risk factors for first-year acute rejection. First-year acute rejection was associated with a significant reduction in overall and death-censored graft survival at 5 years after transplantation. CONCLUSIONS: Although infrequent in patients receiving modern immunosuppressive drugs, acute rejection remains an important risk factor for graft loss after renal transplantation. Our results suggest that better HLA matching and shorter periods of dialysis before transplantation could reduce acute rejection rates and further improve outcomes under current immunosuppressive regimens.

11.
Racial differences in the outcome of simultaneous pancreas and kidney (SPK) transplantation have not been well studied. We compared mortality and graft survival of African American (AA) recipients to other racial/ethnic groups (non-AA) using national data. We studied a total of 6585 adult SPK transplants performed in the United States between January 1, 2000 and December 31, 2007. We performed multivariate logistic regression analyses to determine risk factors associated with early graft failure and immune-mediated late graft loss. We used conditional Kaplan–Meier survival and multivariate Cox regression analyses to estimate late death-censored kidney and pancreas graft failure and death between the groups. Although there was no racial disparity in the first 90 days, AA patients had 38% and 47% higher risk of late death-censored kidney and pancreas graft failure, respectively (p = 0.006 and 0.001). AA patients were roughly twice as likely to lose the kidney and pancreas graft due to rejection (OR 2.31 and 1.86, p = 0.002 and 0.008, respectively). Bladder pancreas drainage was associated with inferior patient survival (HR 1.42, 95% CI 1.15, 1.75, p = 0.001). In the era of modern immunosuppression, AA SPK transplant patients continue to have inferior graft outcomes. Additional studies to explore the mechanisms of such racial disparity are warranted.

12.
BACKGROUND: The shortage of cadaveric donors for kidney transplantation has prompted many centers to use cadaver kidneys from pediatric donors. Use of kidneys from pediatric donors has been shown to have lower graft survival. METHODS: Recipients receiving cadaver kidneys from pediatric and adult donors between 1988 and 1995 were analyzed. The data were obtained from the United Network for Organ Sharing database. Actuarial kidney transplant graft survival was estimated by the Kaplan-Meier method. A logistic regression analysis was used to identify various risk factors for 1-year graft failure. Odds ratios (OR) were estimated for various risk factors. RESULTS: Kidney transplant survival rates for donor age <18 years (n=12,838) at 1, 2, 3, 4, and 5 years were 81.5%, 76.3%, 71.3%, 66.4%, and 61.7%, respectively. The corresponding results for adult donors aged 18 to 50 years (n=35,442) were 83.5%, 78.4%, 73.1%, 67.9%, and 62.4%, respectively (log-rank test P<0.01). Pediatric donors were further divided into three groups according to donor age: group I (0-5 years), group II (6-11 years), and group III (12-17 years). The actuarial survival rates at 1, 3, and 5 years for group I (n=2198) were 73.6%, 63.3%, and 55.6%, respectively. The corresponding values for group II (n=2873) were 78.0%, 67.5%, and 57.8%, and for group III (n=7767) were 85%, 75.0%, and 64.8%, respectively (P<0.01). Although the recipients of group I had lower graft survival, en bloc grafts (n=751) had much better 1-, 3-, and 5-year graft survival rates (76.3%, 67.7%, and 60.7%, respectively) compared with single grafts (n=1447; 72.2%, 61.1%, and 53.2%; P=0.02) from donors 0 to 5 years. Graft thrombosis as a cause of graft failure was seen in 10% of group I compared with 6% in group II and 5% in group III. In group I, lower OR were seen when an en bloc transplant was performed (0.688, P<0.01) and when donor body weight was >15 kg (0.547, P<0.01). However, OR were elevated in recipients of previous transplants (1.556, P<0.01), with prolonged cold ischemic time (1.097, P=0.03), for black recipients (1.288, P=0.03), and for recipients with body mass index ≥25 (1.286, P=0.02). Progressive increase in donor age was associated with lower OR in group II (0.894, P<0.01). CONCLUSIONS: (1) Overall, poorer graft survival was seen in pediatric donor transplants; (2) transplant kidney survival with en bloc kidneys was better than with a single kidney from donors 0-5 years; (3) progressive increase in donor age was associated with improved graft survival when the donors were 6-11 years, whereas progressive increase in donor weight was associated with improved graft survival when the donors were 0-5 years.

13.
BACKGROUND: Recurrent hepatitis C virus (HCV) infection in patients after liver transplantation is an important clinical problem. Because serum cryoglobulins (CG) are known to be associated with an increased incidence of cirrhosis in nontransplant patients, the authors tested the hypothesis that CG would also predict aggressive recurrent HCV in patients after liver transplantation. METHODS: Using a longitudinal database, the outcomes of 105 allografts transplanted into 97 HCV-positive patients from 1991 through 2002 were analyzed on the basis of CG status using a retrospective cohort design. Fifty-nine CG-negative and 38 CG-positive patients were identified. Histologic outcomes and graft survival were analyzed using Kaplan-Meier estimates and Cox univariate and multivariate analyses. Both overall survival and HCV-specific survival (non-HCV-related deaths and graft losses censored) were analyzed. RESULTS: By Kaplan-Meier estimates, CG-positive patients showed earlier graft failure with decreased time to severe histologic activity and fibrosis as compared with CG-negative patients (P<0.05 for all outcomes). By univariate analysis, CG-positive patients had significantly higher risk ratios for shortened HCV-specific graft survival, severe activity-free survival, and severe fibrosis-free survival as compared with CG-negative patients (P<0.05 for all outcomes). In the multivariate model, CG was an independent predictor for severe activity-free, severe fibrosis-free, and HCV-specific graft survival (P<0.05 for all outcomes). CONCLUSIONS: CG-positivity is associated with severe recurrent HCV disease in liver transplant recipients.

14.
With less ischemia, improved donor selection and controlled procedures, living donor liver transplantation (LDLT) might lead to less HLA donor-specific antibody (DSA) formation or fewer adverse outcomes than deceased donor liver transplantation (DDLT). Using the multicenter A2ALL (Adult-to-Adult Living Donor Liver Transplantation Cohort Study) biorepository, we compared the incidence and outcomes of preformed and de novo DSAs between LDLT and DDLT. In total, 129 LDLT and 66 DDLT recipients were identified as having serial samples. The prevalence of preformed and de novo DSAs was not different between DDLT and LDLT recipients (p = 0.93). There was no association between patient survival and the timing (preformed vs. de novo), class (I vs. II) and relative levels of DSA between the groups; however, preformed DSA was associated with higher graft failure only in DDLT recipients (p = 0.01). De novo DSA was associated with graft failure regardless of liver transplant type (p = 0.005) but with rejection only in DDLT (p = 0.0001). On multivariate analysis, DSA was an independent risk factor for graft failure regardless of liver transplant type (p = 0.017, preformed; p = 0.002, de novo). In conclusion, although similar in prevalence, DSA may have more impact in DDLT than LDLT recipients. Although our findings need further validation, future research should more robustly test the effect of donor type and strategies to mitigate the impact of DSA.

15.
Purpose

The neutrophil-lymphocyte ratio (NLR) and platelet-lymphocyte ratio (PLR) are hematologic scores and indicators of the systemic inflammatory response. Elevated NLR and PLR have been associated with poor outcomes in various types of malignancy. We evaluated the effect of NLR and PLR on survival outcomes of nonmetastatic renal cell carcinoma (RCC).

Materials and methods

We retrospectively reviewed 150 patients who had undergone nephrectomy for nonmetastatic RCC between 2006 and 2016. Cancer-specific survival (CSS) was assessed using the Kaplan–Meier method and compared using the log-rank test. We applied univariate and multivariate Cox regression models to analyze the association of NLR and PLR with clinical outcome.

Results

At a median follow-up of 33 months, 45 patients had died. High PLR (>100) was an independent prognostic hematologic marker for CSS (hazard ratio [HR] 2.61, 95% confidence interval [CI], 1.08–6.31; P = 0.034). Univariate analysis identified elevated NLR (p = 0.005) and anemia (p = 0.023) as significantly associated with CSS.

Conclusion

Elevated PLR is a strong hematologic prognostic factor for survival in patients with nonmetastatic RCC undergoing nephrectomy with curative intent. The PLR is an easily obtained biomarker that is useful for preoperative risk stratification.

16.
Pancreas after kidney (PAK) transplantation is one of the accepted pancreas transplant modalities. We studied the impact of time interval between kidney and pancreas transplantation on the outcomes of PAK transplantation. Using OPTN/SRTR data, we included 1853 PAK transplants performed between 1996 and 2005 with follow-up until November 1, 2008. Kaplan-Meier survival and multivariate Cox regression analyses were performed using the time interval between kidney and pancreas transplantation either as a categorical (less than one yr, between one and less than three yr, and greater than or equal to three yr) or as a continuous variable (months) to assess kidney graft and patient survival. Patients who received a pancreas transplant three yr or later after kidney transplantation had higher risk of death-censored kidney graft loss (HR 1.56, 95% CI 1.04, 2.32, p = 0.03). Each month beyond three yr between kidney and pancreas transplantation incurred 1% higher risk of subsequent death-censored kidney graft loss (HR 1.01, 95% CI 1.001, 1.02, p = 0.03). In conclusion, time interval between pancreas and kidney transplantation is an independent risk factor of kidney graft loss following pancreas transplantation. Shortening the time interval between pancreas and kidney transplantation to less than three yr may reduce the risk of kidney graft loss in qualified PAK transplant candidates.

17.
Transplantation Proceedings, 2023, 55(7): 1535-1542
Background

We examined the association between induction type for a second kidney transplant in dialysis-dependent recipients and long-term outcomes.

Methods

Using the Scientific Registry of Transplant Recipients, we identified all second kidney transplant recipients who returned to dialysis before re-transplantation. Exclusion criteria included missing, unusual, or no-induction regimens; maintenance regimens other than tacrolimus and mycophenolate; and positive crossmatch status. We grouped recipients by induction type into 3 groups: the anti-thymocyte group (N = 9899), the alemtuzumab group (N = 1982), and the interleukin 2 receptor antagonist group (N = 1904). We analyzed recipient and death-censored graft survival (DCGS) using the Kaplan-Meier survival function with follow-up censored at 10 years post-transplant. We used Cox proportional hazard models to examine the association between induction and the outcomes of interest. To account for the center-specific effect, we included the center as a random effect. We adjusted the models for the pertinent recipient and organ variables.

Results

In the Kaplan-Meier analyses, induction type did not alter recipient survival (log-rank P = .419) or DCGS (log-rank P = .146). Similarly, in the adjusted models, induction type was not a predictor of recipient or graft survival. Live-donor kidneys were associated with better recipient survival (HR 0.73, 95% CI [0.65, 0.83], P < .001) and graft survival (HR 0.72, 95% CI [0.64, 0.82], P < .001). Publicly insured recipients had worse recipient and allograft outcomes.

Conclusion

In this large cohort of average immunologic-risk, dialysis-dependent second kidney transplant recipients who were discharged on tacrolimus and mycophenolate maintenance, induction type did not influence the long-term outcomes of recipient or graft survival. Live-donor kidneys improved recipient and graft survival.

18.
Background

Interest in donation after cardiocirculatory death (DCD) donors for lung transplantation (LT) has recently been rekindled due to the lung allograft shortage. Clinical outcomes following DCD have proved satisfactory. The aim of this systematic review is to provide a thorough analysis of published experience regarding outcomes of LT after controlled DCD compared with donation after brain death (DBD) donors.

Methods

We performed a literature search in the Cochrane Database of Systematic Reviews, PubMed, and Web of Science using the terms "lung transplantation" AND "donation after circulatory death" on November 1, 2018. The full text of relevant articles was evaluated by two authors independently. Quality assessment was performed using the NIH protocol for case-control and case series studies. Pooled odds ratios (ORs) and mean differences with inverse-variance weighting using DerSimonian-Laird random-effects models were computed to account for between-trial variance (τ2).

Results

Of the 508 articles identified with our search, 9 regarding controlled donation after cardiac death (cDCD) were included in the systematic review, comprising 2973 patients (403 who received a graft from a DCD donor and 2570 from a DBD donor). Both 1-year survival and grade 2 and 3 primary graft dysfunction (PGD) were balanced between the two cohorts (OR = 1.00 and 1.03, respectively); the OR for airway complications was 2.07 against cDCD. We also report an OR = 0.57 for chronic lung allograft dysfunction (CLAD) and an OR = 0.57 for 5-year survival against cDCD.

Conclusions

Our meta-analysis shows no significant difference between recipients after cDCD or DBD regarding 1-year survival, PGD, and 1-year freedom from CLAD. Airway complications and long-term survival were both associated with transplantation after cDCD, but these statistical associations need further research.
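The inverse-variance DerSimonian-Laird pooling named in the Methods can be sketched as follows. The three input studies are invented for illustration only and are not taken from this review:

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooled log-odds-ratio via the DerSimonian-Laird method.
    log_ors: per-study log odds ratios; variances: their sampling variances.
    Returns (pooled_log_or, tau2), where tau2 is the between-study variance."""
    k = len(log_ors)
    w = [1 / v for v in variances]                # fixed-effect (inverse-variance) weights
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)            # DL estimate, truncated at 0
    w_star = [1 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_ors)) / sum(w_star)
    return pooled, tau2

# three hypothetical studies: ORs 1.2, 0.9, 1.5 with log-OR variances as given
logs = [math.log(1.2), math.log(0.9), math.log(1.5)]
pooled, tau2 = dersimonian_laird(logs, [0.04, 0.09, 0.06])
print(round(math.exp(pooled), 2), round(tau2, 4))
```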

19.
Background

End-stage kidney disease secondary to hyperoxaluria presents a major challenge for transplant physicians given concern regarding disease recurrence. Few contemporary studies have reported long-term outcomes following transplantation in this population.

Methods

This study examined the outcomes of all adult patients with end-stage kidney disease secondary to hyperoxaluria who received a kidney or combined liver-kidney transplant in Australia and New Zealand between 1965 and 2015. Patients with hyperoxaluria were propensity score matched to control patients with reflux nephropathy. The primary outcome was graft survival. Secondary outcomes included graft function, acute rejection, and patient survival.

Results

Nineteen transplants performed in 16 patients with hyperoxaluria were matched to 57 transplants in patients with reflux nephropathy. Graft survival was inferior in patients with hyperoxaluria receiving a kidney transplant alone (subhazard ratio [SHR] = 3.83, 95% confidence interval [CI], 1.22-12.08, P = .02) but not in those receiving a combined liver-kidney transplant (SHR = 0.63, 95% CI, 0.08-5.21, P = .67). Graft failure risk was particularly high in patients with hyperoxaluria receiving a kidney transplant alone after more than 1 year of renal replacement therapy (SHR = 8.90, 95% CI, 2.35-33.76, P = .001). Posttransplant estimated glomerular filtration rate was lower in patients with hyperoxaluria (by 10.97 mL/min/1.73 m2, 95% CI, 0.53-21.42, P = .04). There was no difference between groups in the risk of acute rejection or death with a functioning graft.

Conclusion

Compared to reflux nephropathy, hyperoxaluria was associated with inferior graft survival in patients receiving a kidney transplant alone. Long-term graft function was lower in patients with hyperoxaluria, but the risks of acute rejection and death were not different.

20.
The effect of patient travel time to a transplant center on outcomes is unknown. We compared outcomes between patients living >3 hours from our transplant center (Group A) and those living closer (Group B). Outcomes included >90 days to list, listing, survival while listed, transplantation, and posttransplantation survival. Covariates included Model for End-Stage Liver Disease (MELD) score, hepatocellular carcinoma (HCC), alcoholic liver disease, insurance type, and psychosocial score. There were 38 (23%) patients in Group A and 128 (77%) in Group B. Median MELD scores were 14.5 (range, 6-36) for Group A and 14.0 (range, 7-32) for Group B (p = 0.20). Groups were similar for age, gender, diagnosis, psychosocial score, insurance, and HCC variables. Group A status was not independently associated with >90 days to list (odds ratio, 0.98; 95% confidence interval [CI], 0.4-2.4). Kaplan-Meier cumulative probabilities for listing, transplantation, and 1-yr posttransplantation survival were similar (A vs. B: 0.77 vs. 0.83, 0.70 vs. 0.69, and 0.85 vs. 0.86, respectively; all p values >0.05). Being in Group A remained insignificant in terms of probability of listing, transplantation, and posttransplantation survival by Cox proportional hazard modeling. Survival on the list was significantly better for Group A (A: 1.0, B: 0.55; p = 0.02). Fewer patients at high MELD scores in Group A and referral biases may explain this difference. In conclusion, after entering evaluation, patients living >3 hours away from a transplant center have comparable outcomes to those living closer.
