Similar Articles (20 results)
1.

Background

Calcineurin inhibitor (CNI)-free immunosuppression is used increasingly after heart transplantation to avoid CNI toxicity, but in the absence of a randomized trial, concerns remain over an increased rejection risk.

Methods

We studied the incidence of graft rejection episodes among all cardiac graft recipients, beginning with the first introduction of CNI-free protocols. We compared events during CNI-free and CNI-containing immunosuppression among 231 transplant recipients (overall mean age, 55.2 ± 11.8 years), from a mean of 5.2 ± 5.4 years after transplantation through a mean follow-up of 3.1 ± 1.4 years. Acute rejection episodes requiring treatment were defined according to International Society for Heart and Lung Transplantation (ISHLT) criteria.

Results

During the total follow-up of 685 patient-years (CNI-containing, 563; CNI-free, 122), we performed 1,374 biopsies, which identified 78 rejection episodes. More biopsies were performed in CNI-free patients: 0.22 versus 0.13 biopsies per patient-month on CNI-containing therapy (P < .05). The incidence of rejection episodes per patient-month was significantly higher on CNI-free than on CNI therapy among patients switched both early and later after heart transplantation: within 1 year, 0.119 versus 0.035 (P = .02); beyond 1 year, 0.011 versus 0.004 (P = .007); beyond 2 years, 0.007 versus 0.003 (P = .04); and beyond 5 years, 0.00578 versus 0.00173 (P = .04).
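These incidences are simple exposure-adjusted counts. A minimal Python sketch of the calculation; the per-arm episode split used below is hypothetical, since the abstract reports only the 78-episode total:

```python
# Incidence expressed as episodes per patient-month, as in the abstract:
# observed episodes divided by accumulated follow-up converted to months.
def episodes_per_patient_month(episodes: int, patient_years: float) -> float:
    return episodes / (patient_years * 12.0)

# Hypothetical per-arm episode counts (the abstract gives only the total of 78):
print(episodes_per_patient_month(18, 122))  # CNI-free arm, assumed 18 episodes
print(episodes_per_patient_month(60, 563))  # CNI-containing arm, assumed 60
```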

Conclusions

Rejection incidence during CNI-free immunosuppression protocols after heart transplantation was significantly increased in both early and later postoperative periods. Given the potentially long delay to rejection occurrence, patients should be monitored closely for several months after a switch to CNI-free immunosuppressive protocols.

2.

Background

Efforts to improve long-term patient and allograft survival have included use of induction therapies as well as steroid and/or calcineurin inhibitor (CNI) avoidance/minimization.

Methods

This is a retrospective review of kidney transplant recipients transplanted between September 2004 and July 2009. The immune minimization group (group 1; n = 182) received alemtuzumab induction, low-dose CNI, and mycophenolic acid (MPA). The conventional immunosuppression group (group 2; n = 232) received rabbit anti-thymocyte globulin, standard-dose CNI, MPA, and prednisone.

Results

Both groups were followed up for the same length of time (49.4 ± 21.7 months; P = .12). Patient survival was also similar (90% vs 94%; P = .14). Death-censored graft survival was inferior in group 1 compared with group 2 (86% vs 96%, respectively; P = .003). On multivariate analysis, group 1 was an independent risk factor for graft loss (adjusted hazard ratio [aHR], 2.63; 95% confidence interval [CI], 1.32–5.26; P = .006). Biopsy-proven acute rejection occurred more frequently in group 1 than in group 2, driven by late rejections (7% vs 2%; P < .01). Graft function was lower in group 1 than in group 2 from 3 months (49.5 vs 70.7 mL/min; P < .001) through 48 months (48.6 vs 69.4 mL/min; P = .04).

Conclusion

Minimization of maintenance immunosuppression after alemtuzumab induction was associated with higher acute rejection rates and inferior graft survival compared with rabbit anti-thymocyte globulin and conventional triple immunotherapy.

3.

Background

The number of obese patients on transplantation waiting lists is increasing. There are conflicting results regarding the influence of body mass index (BMI) on graft function.

Methods

We performed a single-center, retrospective study of 859 adult patients who received a renal graft from deceased donors. BMI (kg/m2) was calculated from each patient's height and weight at the time of transplantation. Kidney recipients were divided into 4 groups according to BMI: group A (<18.5; n = 57), B (18.6–24.9; n = 565), C (25–29.9; n = 198), and D (>30; n = 39). We analyzed primary graft function, delayed graft function (DGF), acute rejection (AR) episodes, number of reoperations, graft function (expressed as glomerular filtration rate [GFR] and serum creatinine concentration), graft loss, and recipient death. The follow-up period was 1 year.
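As a worked illustration of the grouping above, a minimal Python sketch (the helper names are ours, and the handling of edge values at exactly 25 or 30 kg/m2 is an assumption the abstract's band labels leave open):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index at transplantation: weight / height^2 (kg/m2)."""
    return weight_kg / height_m ** 2

def bmi_group(value: float) -> str:
    """Study bands: A <18.5, B 18.6-24.9, C 25-29.9, D >30 kg/m2.
    Edge values (exactly 25 or 30) are assigned upward by assumption."""
    if value < 18.5:
        return "A"
    if value < 25:
        return "B"
    if value < 30:
        return "C"
    return "D"

print(bmi_group(bmi(85, 1.70)))  # 29.4 kg/m2 -> group C
```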

Results

Grafts in obese patients more frequently failed to develop any function compared with those in their nonobese counterparts (P < .0001; odds ratio [OR], 32.364; 95% CI, 2.174–941.422). Other aspects of the procedure were analyzed to verify this finding: cold ischemia time and number of HLA mismatches affected the frequency of AR (OR, 1.0182 [P = .0029] and OR, 1.1496 [P = .0147], respectively); moreover, median donor serum creatinine concentration (P = .00004) and cold ischemia time (P = .00019) were related to DGF. BMI did not influence the incidence of DGF (P = .08; OR, 1.167; 95% CI, 0.562–2.409), the number of AR episodes (P > .1; OR, 1.745; 95% CI, 0.846–3.575), number of reoperations, GFR (P = .22–.92), or creatinine concentration (P = .09). Neither graft loss (P = .12; OR, 1.8; 95% CI, 0.770–4.184) nor patient death (P = .216; OR, 3.69; 95% CI, 0.153–36.444) was influenced.

Conclusion

Greater recipient BMI at the time of transplantation has a significant influence on the incidence of primary graft failure.

4.
Post-transplantation lymphoproliferative disorders (PTLD) are associated with poor patient and graft survival. The risk of rejection and subsequent graft loss is increased by the reduction of immunosuppression therapy, the cornerstone of PTLD treatment. This multicentre, retrospective, nonrandomized cohort study includes 104 adults who developed PTLD after renal or simultaneous renal/pancreatic transplantation between 1990 and 2007. It examines the effect of calcineurin inhibitor (CNI) withdrawal on long-term graft and patient survival. At 10 years postonset of PTLD, the Kaplan-Meier graft loss rate was 43.9%, and the rate of graft loss or death with a functioning graft was 64.4%. Cox multivariate analysis identified PTLD stage greater than I-II and CNI withdrawal as risk factors for graft loss; for combined graft loss and mortality, these remained risk factors along with age over 60 years. Type and location of PTLD, year of diagnosis, and chemotherapy regimen were not independent risk factors. Multivariate analysis determined CNI withdrawal to be the most important risk factor for graft loss (HR = 3.07, 95% CI: 1.04–9.09; P = 0.04) and death (HR = 4.00, 95% CI: 1.77–9.04; P < 0.001). While long-term stable renal function after definitive CNI withdrawal for PTLD has been reported, this review determined that withdrawal is associated with reduced graft and patient survival.
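The graft-loss figures above are Kaplan-Meier estimates. A minimal sketch of producing such an estimate with the lifelines library (all patient rows are hypothetical, not study data):

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical PTLD cohort: years from PTLD onset to graft loss or censoring.
df = pd.DataFrame({
    "years":      [1.2, 4.5, 10.0, 2.8, 7.3, 10.0, 0.9, 6.1],
    "graft_lost": [1,   1,   0,    1,   0,   0,    1,   1],
})

kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["graft_lost"])
# Kaplan-Meier graft-loss rate at 10 years = 1 - S(10)
print(1 - kmf.predict(10))
```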

5.
The PIRCHE (Predicted Indirectly ReCognizable HLA Epitopes) score is an HLA epitope matching algorithm that estimates the number of T-cell epitopes presented by mismatched HLA. PIRCHE-II numbers are associated with de novo donor-specific antibody (dnDSA) formation following liver transplantation and with kidney allograft survival following renal transplantation. The aim of our study was to assess the PIRCHE-II score in recipients on calcineurin inhibitor (CNI)-free maintenance immunosuppression. This was a retrospective study of 41 liver transplant recipients on CNI-free immunosuppression with available liver allograft biopsies. Donors and recipients were HLA typed. The HLA-derived mismatched peptide epitopes that could be presented by the recipient's HLA-DRB1 molecules were calculated using the PIRCHE-II algorithm. The associations between PIRCHE-II scores and graft immune-mediated events were assessed using receiver operating characteristic curves and subsequent univariate and multivariate analyses. CNI-free patients with cellular rejection, humoral rejection, or severe portal inflammation had higher mean PIRCHE-II scores than patients with normal liver allografts. PIRCHE-II score and donor age were independent risk factors for liver graft survival in CNI-free patients (HR: 8.0, 95% CI: 1.3–49, p = .02; and HR: 0.88, 95% CI: 0.00–0.96, p = .007, respectively). PIRCHE-II scores could be predictive of liver allograft survival in CNI-free patients following liver transplantation. Larger studies are needed to confirm these results.
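In spirit, PIRCHE-II counts mismatched donor-HLA-derived peptides that the recipient's HLA-DRB1 molecules could present. A conceptual Python sketch only, not the published algorithm; the peptide length, dummy sequences, and binding predicate are all stand-ins:

```python
# Conceptual sketch: the real PIRCHE-II algorithm works on full HLA protein
# sequences and uses dedicated HLA-DRB1 binding predictions; the 15-mer
# length and the predicted_to_bind predicate here are stand-ins.
def fifteen_mers(seq: str) -> set:
    return {seq[i:i + 15] for i in range(len(seq) - 14)}

def pirche_ii_like_count(donor_hla_seqs, recipient_hla_seqs, predicted_to_bind):
    """Count donor-HLA-derived peptides absent from the recipient's own HLA
    proteins (i.e., mismatched) and predicted to be presented by the
    recipient's HLA-DRB1 molecules."""
    self_peps = set().union(*(fifteen_mers(s) for s in recipient_hla_seqs))
    donor_peps = set().union(*(fifteen_mers(s) for s in donor_hla_seqs))
    return sum(1 for p in donor_peps - self_peps if predicted_to_bind(p))

donor = ["ACDEFGHIKLMNPQRSTVWYACDEF"]  # dummy 25-aa "donor HLA" sequence
recip = ["ACDEFGHIKLMNPQRSTVWYACDEG"]  # differs at the final residue
print(pirche_ii_like_count(donor, recip, lambda p: True))  # trivial binder
```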

6.
Immunosuppressive maintenance therapy after kidney transplantation leads to various undesired side effects, such as calcineurin inhibitor (CNI)-associated nephrotoxicity or elevated cardiovascular risk due to posttransplantation diabetes and hypertension. These effects negatively impact long-term allograft function as well as patient morbidity and mortality. Therefore, we used an immunosuppressive regimen with early corticosteroid withdrawal (ESW) and maintenance therapy containing tacrolimus, sirolimus (SRL), and mycophenolate sodium for 3 months, followed by a prospective randomized trial comparing CNI-free versus low-dose CNI therapy. The primary endpoint was 6-month graft function. Among 75 patients, ESW was performed after 4 days in 65 patients. Over the following 3 months before randomization to CNI-free maintenance therapy, we experienced a high number (25%) of SRL discontinuations due to adverse events, including leukopenia, anemia, arthritis, and pneumonitis. In addition, there were significantly more allograft rejection episodes in the CNI-free group (P = .017) during the study period, leading to a switch from SRL to a CNI. Despite the higher rate of rejection episodes in the CNI-free group, glomerular filtration rates (GFR) at 6 months were comparable between the study groups (P = .25). After 1 year, only 9.2% (6/65) of all patients treated with SRL remained on this drug. In conclusion, there was an unacceptably high rate of SRL intolerance using an ESW and CNI-free immunosuppressive regimen, combined with a significantly higher rate of rejection episodes.

7.

Objective

The aim of this study was to investigate whether donor age is an independent predictor of outcomes in liver transplantation and to assess its impact in relation to donor-recipient age matching.

Methods

We analyzed prospectively collected data from 221 adult liver transplantations performed from January 2006 to September 2009.

Results

Compared with recipients who received grafts from donors <60 years old, transplantation from older donors was associated with significantly higher rates of graft rejection (9.5% vs 3.5%; P = .05) and worse graft survival (P = .021). When comparing recipient and graft survivals according to age matching, we observed significantly worse values for age-mismatched (P values .029 and .037, respectively) versus age-matched patients. After adjusting for covariates in a multivariate model, age mismatch was an independent risk factor for patient death (hazard ratio [HR] 2.13, 95% confidence interval [CI] 1.1–4.17; P = .027) and graft loss (HR 3.86, 95% CI 1.02–15.47; P = .046).

Conclusions

The results of this study suggest that optimized donor allocation taking into account both donor and recipient ages can maximize survival of liver transplant recipients.

8.

Background

Tacrolimus and cyclosporine are the 2 major immunosuppressants for lung transplantation. Several studies have compared these 2 drugs, but the outcomes were not consistent. The aim of this meta-analysis of randomized controlled trials (RCTs) was to compare the beneficial and harmful effects of tacrolimus and cyclosporine as the primary immunosuppressant for lung transplant recipients.

Methods

We conducted searches of electronic databases and manual bibliographies. We performed a meta-analysis of all RCTs comparing tacrolimus with cyclosporine as primary immunosuppression for lung transplant recipients. Extracted data for mortality, acute rejection, withdrawals, and adverse events were pooled and analyzed using Mantel-Haenszel tests with a random-effects model.
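For orientation, a minimal random-effects pooling sketch using the DerSimonian-Laird estimator with inverse-variance weights (the Mantel-Haenszel weighting named above differs in detail, and the study counts below are hypothetical):

```python
import math

# Hypothetical 2x2 counts per RCT: (events_tac, n_tac, events_csa, n_csa).
studies = [(5, 50, 9, 48), (7, 66, 12, 67), (4, 33, 7, 33)]

def log_or(a, n1, c, n2):
    """Log odds ratio and its variance; 0.5 added to all cells if any is zero."""
    b, d = n1 - a, n2 - c
    if 0 in (a, b, c, d):
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    return math.log(a * d / (b * c)), 1/a + 1/b + 1/c + 1/d

y, v = zip(*(log_or(*s) for s in studies))
w = [1 / vi for vi in v]                                  # fixed-effect weights
mean_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - mean_fe) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(studies) - 1)) / C)             # DerSimonian-Laird tau^2
w_re = [1 / (vi + tau2) for vi in v]                      # random-effects weights
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
print(f"pooled OR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96*se):.2f}-{math.exp(pooled + 1.96*se):.2f})")
```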

Results

Three RCTs including 297 patients were assessed in this study. Mortality at 1 year or more was comparable between lung recipients treated with tacrolimus and cyclosporine (odds ratio [OR], 0.94; 95% confidence interval [CI], 0.42-2.10; P = .88). Tacrolimus-treated patients experienced fewer incidences of acute rejection (mean difference [MD], −0.14; 95% CI, −0.28 to −0.01; P = .04). Pooled analysis showed a trend toward a lower risk of bronchiolitis obliterans syndrome (BOS) among tacrolimus-treated patients, although it did not reach significance (OR, 0.53; 95% CI, 0.25-1.12; P = .10). Fewer patients stopped tacrolimus than cyclosporine (OR, 0.12; 95% CI, 0.03-0.48; P = .003). The rate of new-onset diabetes was higher in the tacrolimus group (OR, 3.69; 95% CI, 1.17-11.62; P = .03). The incidences of hypertension and renal dysfunction were comparable in the 2 groups (OR, 0.24; 95% CI, 0.03-1.70; P = .15; and OR, 1.67; 95% CI, 0.70-3.96; P = .25, respectively). There was also a trend toward a lower risk of malignancy in tacrolimus-treated patients, although it did not reach significance (OR, 0.19; 95% CI, 0.03-1.13; P = .07). The incidence of infection was comparable in the 2 groups (MD, −0.29; 95% CI, −0.68 to 0.11; P = .16).

Conclusion

Using tacrolimus as the primary immunosuppressant for lung transplant recipients resulted in comparable survival and a reduction in acute rejection episodes compared with cyclosporine.

9.
Calcineurin inhibitors (CNI) are the mainstay of immunosuppression after liver transplantation (LT) but are associated with significant nephrotoxicity. Recently, mTOR inhibitors such as sirolimus and everolimus (EVR) have been used with or without CNIs in LT recipients for their renal-sparing effect. We conducted a systematic review and meta-analysis of randomized controlled trials (RCTs) that examined the effect of EVR with CNI minimization or withdrawal on renal function in LT recipients. RCTs of primary adult LT recipients with baseline GFR >30 mL/min who received EVR with CNI minimization or withdrawal were included. Four RCTs (EVR, n = 465; control, n = 428) met these criteria. In three RCTs, EVR was initiated 4 weeks following LT; these studies were used to assess the primary outcome. All four studies were used to assess the secondary outcomes. Based on this study, EVR use with CNI minimization in LT recipients is associated with improved renal function at 12 months, with a GFR benefit of 10.2 mL/min (95% CI: 2.75-17.8). EVR use was not associated with an increased risk of biopsy-proven acute rejection (RR 0.68, 95% CI: 0.31-1.46), graft loss (RR 1.60, 95% CI: 0.51-5.00), or mortality (RR 1.34, 95% CI: 0.62-2.90). However, it was associated with an increased risk of overall infections (RR 1.45, 95% CI: 1.10-1.91).

10.

Background

Roux-en-Y choledochojejunostomy and duct-to-duct anastomosis are potential methods for biliary reconstruction in liver transplantation (LT) for recipients with primary sclerosing cholangitis (PSC). However, there is controversy over which method yields superior outcomes. The purpose of this study was to evaluate the outcomes of duct-to-duct versus Roux-en-Y biliary anastomosis in patients undergoing LT for PSC.

Methods

Studies comparing Roux-en-Y versus duct-to-duct anastomosis during LT for PSC were identified based on systematic searches of 9 electronic databases and multiple sources of gray literature.

Results

The search identified 496 citations; 7 retrospective series comprising 692 patients met eligibility criteria. The use of duct-to-duct anastomosis was not associated with a significant difference in clinical outcomes, including 1-year recipient survival rates (odds ratio [OR], 1.02; 95% confidence interval [CI], 0.65–1.60; P = .95), 1-year graft survival rates (OR, 1.11; 95% CI, 0.72–1.71; P = .64), risk of biliary leaks (OR, 1.23; 95% CI, 0.59–2.59; P = .33), risk of biliary strictures (OR, 1.99; 95% CI, 0.98–4.06; P = .06), or rate of recurrence of PSC (OR, 0.94; 95% CI, 0.19–4.78; P = .94).

Conclusions

There were no significant differences in 1-year recipient survival, 1-year graft survival, risk of biliary complications, or PSC recurrence between Roux-en-Y and duct-to-duct biliary anastomosis in LT for PSC.

11.

Background

No consensus exists as to whether laparoscopic treatment for pancreatic insulinomas (PIs) is safe and feasible. The aim of this meta-analysis was to assess the feasibility, safety, and potential benefits of the laparoscopic approach (LA) for PIs and to compare it with open surgery.

Methods

A systematic literature search (MEDLINE, EMBASE, Cochrane Library, Science Citation Index, and Ovid journals) was performed to identify relevant articles. Articles comparing LA with the open approach for the treatment of PIs, published on or before April 30, 2013, were included in the meta-analysis. The evaluated end points were operative outcomes, postoperative recovery, and postoperative complications.

Results

Seven observational clinical studies recruiting a total of 452 patients were included. The rates of conversion from LA to open surgery ranged from 0% to 41.3%. The meta-analysis revealed that LA for PIs is associated with a reduced length of hospital stay (weighted mean difference, −5.64; 95% confidence interval [CI], −7.11 to −4.16; P < 0.00001). No significant difference was observed between LA and open surgery in terms of operation time (weighted mean difference, 2.57; 95% CI, −10.91 to 16.05; P = 0.71), postoperative mortality, overall morbidity (odds ratio [OR], 0.64; 95% CI, 0.35–1.17; P = 0.14), incidence of pancreatic fistula (OR, 0.86; 95% CI, 0.51–1.44; P = 0.56), or recurrence of hyperglycemia (OR, 1.81; 95% CI, 0.41–7.95; P = 0.43).

Conclusions

Laparoscopic treatment for PIs is a safe and feasible approach associated with a reduced length of hospital stay and rates of postoperative complications comparable to those of open surgery.

12.

Background

Long-term function of the transplanted kidney is a key determinant of quality of life for transplant recipients. The aim of this study was to evaluate the effect of selected factors on the duration of graft function after renal transplantation over 15 years of observation.

Methods

Preoperative and intraoperative factors were analyzed in 232 kidney recipients over a 15-year observation period. The analysis included age, sex, cause of the recipient's renal failure, duration of hemodialysis before transplantation, peak panel reactive antibody level, human leukocyte antigen compatibility, cold ischemia time, occurrence of delayed graft function, duration and number of hemodialyses after transplantation, early graft rejection, creatinine levels at days 1, 3, 7, 30, 90, and 180 after transplantation, and the influence of these factors on the duration of graft function. Statistical analysis was performed using univariate Kaplan-Meier estimates and a multivariate Cox proportional hazards regression model, with P < .05 considered significant.
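As context for the survival methods named above, a minimal Cox proportional hazards fit with the lifelines library (all rows and column names are hypothetical; a Kaplan-Meier sketch appears under entry 4 above):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical recipients: graft survival time (years), failure indicator,
# and two of the study's candidate predictors.
df = pd.DataFrame({
    "graft_years":     [15.0, 3.2, 8.5, 12.1, 1.4, 14.0, 6.7, 9.9],
    "graft_failed":    [0, 1, 1, 0, 1, 0, 1, 0],
    "creat_day90":     [1.8, 2.6, 1.2, 1.2, 3.1, 2.0, 2.2, 1.0],  # mg/dL
    "early_rejection": [0, 1, 0, 0, 1, 1, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="graft_years", event_col="graft_failed")
cph.print_summary()  # per-covariate hazard ratios with 95% CIs
```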

Results

Univariate analysis showed significantly shorter renal graft function in recipients with higher creatinine levels at all analyzed time points and in patients experiencing delayed graft function. The duration and number of hemodialyses after transplantation had a significant negative impact on late transplant results. Multivariate analysis showed that early graft rejection in the postoperative period was an independent factor improving late graft function: P = .002; hazard ratio (HR), 0.49 (95% confidence interval [CI], 0.31–0.78). A higher creatinine level at day 90 after kidney transplantation was a predictive factor of late graft dysfunction: P = .002; HR, 1.68 (95% CI, 1.2–2.35).

Conclusions

Creatinine level at day 90 after renal transplantation is a prognostic factor of long-term kidney function. Early transplant rejection leads to the introduction of a more aggressive immunosuppression protocol, which improves long-term transplant results.

13.
Background

Immunosuppression using calcineurin inhibitors (CNIs) is accompanied by neuropsychiatric side effects, which counteract its longevity and quality-of-life benefits in 10% to 28% of patients. With the availability of mammalian target of rapamycin (mTOR) inhibitors, it became possible to replace CNIs without increasing the risk of acute graft rejection. mTOR, a member of the phosphatidylinositol 3-kinase-related kinase family, is a downstream target of brain-derived neurotrophic factor, which has been implicated in the pathophysiology and treatment of several psychiatric disorders. Preclinical evidence has implicated the mTOR pathway in synaptic plasticity and in fear memory consolidation and reconsolidation.

Methods

In the present study, we prospectively evaluated the psychiatric outcomes of CNI-free immunosuppression in adult maintenance heart transplant recipients (n = 9; age, 66.1 ± 6.1 years) using the Wechsler Memory Scale-Revised (WMS-R), Symptom Checklist-90-Revised (SCL-90-R), Beck Depression Inventory (BDI), Trail Making Tests A and B, Digit Span (DS), and Hamilton Depression Scale (HAMD).

Results

Four weeks after switching to CNI-free immunosuppression using everolimus, BDI (Z = −1.14; P = .048), Trail Making Tests A and B (Z = −2.52; P = .012), WMS-R (Z = 2.37; P = .018), and SCL-90-R (Z = −2.37; P = .018) scores were all significantly improved, while DS (Z = −1.18; P = .236) and HAMD (Z = −0.595; P = .552) remained unchanged.
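The Z statistics above are consistent with paired signed-rank comparisons, though the abstract does not name the test. A minimal sketch under that assumption, with hypothetical paired scores:

```python
from scipy.stats import wilcoxon

# Hypothetical paired BDI scores before and after the everolimus switch;
# the Wilcoxon signed-rank test here is an assumption, not the stated method.
before = [14, 9, 11, 18, 7, 12, 15, 10, 13]
after  = [10, 7, 9, 13, 6, 10, 11, 8, 9]
stat, p = wilcoxon(before, after)
print(stat, p)
```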

Conclusion

This report describes favorable psychiatric outcomes with everolimus in maintenance heart transplant recipients. CNI-free immunosuppression with everolimus might provide significant improvement in memory, concentration, and overall psychiatric symptoms among heart transplant recipients.

14.
Calcineurin-inhibitor-sparing immunosuppressive protocols
Calcineurin inhibitors (CNI) have played an important role in improving graft survival. However, the balance between preventing immunologic allograft losses and managing CNI-related nephrotoxicity remains an issue in renal transplantation. There are three major CNI-sparing strategies.

CNI minimization: The advent of mycophenolate mofetil (MMF) allows cyclosporine (CsA) reduction to ameliorate renal function in patients with chronic renal allograft dysfunction, without increasing acute rejection rates. In combination with mTOR inhibitors, very low CNI levels may be sufficient to prevent acute rejection. However, in this combination, CNI nephrotoxicity is magnified by pharmacokinetic interaction.

CNI withdrawal: CNI withdrawal has been attempted in regimens containing MMF or sirolimus (SRL). Introduction of MMF in patients with chronic allograft nephropathy (CAN) followed by CNI withdrawal resulted in stabilization or improvement of renal function and hypertension profile, although there is some risk of acute rejection. In SRL-based regimens, CNI withdrawal is a safe strategy, achieving sustained improvement of renal function, histology, and graft survival. There is no consensus on whether MMF should be added in patients converted from a CNI to an mTOR inhibitor.

CNI avoidance: Polyclonal-antibody-based regimens with MMF and steroids have shown acceptable acute rejection rates but high rates of cytomegalovirus (CMV) and opportunistic infections. Conversely, anti-IL-2R in combination with MMF and steroids resulted in a 50% incidence of acute rejection, suggesting that CNI avoidance is not feasible in a regimen based on MMF alone. Alternatively, a protocol based on anti-IL-2R induction therapy combined with SRL, MMF, and prednisone has shown efficient prevention of acute rejection, higher creatinine clearance, and a lower rate of CAN in comparison with a CNI-treated group. New strategies using costimulation blockade may help in the development of safe CNI-free regimens.

In summary, the new immunosuppressive medications have made old aspirations in renal transplantation feasible, such as minimization, withdrawal, or even avoidance of CNI.

15.

Background

Osteoporosis can develop and become aggravated in kidney transplant patients; however, the best preventive options for post-transplantation osteoporosis remain controversial.

Methods

We retrospectively analyzed a cohort of 182 renal transplant recipients of mean age 46.7 ± 12.1 years, 47.3% of whom were women. Seventy-three patients received neither vitamin D nor a bisphosphonate after transplantation (group 1). The other patients were classified into the following 3 groups: calcium plus vitamin D (group 2; n = 40); bisphosphonate (group 3; n = 18); and both regimens (group 4; n = 51). Bone mineral density (BMD) was evaluated by dual-energy X-ray absorptiometry at baseline and at 1 year after transplantation.

Results

At 1 year after transplantation, T-scores of the femoral neck and entire femur were significantly decreased in group 1 (−0.23 ± 0.65 [P = .004] and −0.21 ± 0.74 [P = .018], respectively), whereas the lumbar spine T-score was significantly increased in group 4 (0.27 ± 0.79; P = .020). Post hoc analysis demonstrated that the delta T-score was significantly lower in group 1 than in group 4 (P = .009, .035, and .031 for lumbar spine, femoral neck, and entire femur, respectively). In a multivariate analysis adjusted for age, sex, body mass index, dialysis duration, diabetes, calcineurin inhibitors, estimated glomerular filtration rate, and persistent hyperparathyroidism, both group 2 and group 4 showed protective effects against BMD reduction (odds ratio [OR], 0.165; 95% confidence interval [CI], 0.032–0.845 [P = .031]; and OR, 0.169; 95% CI, 0.045–0.626 [P = .008], respectively). However, group 3 did not show a protective effect (OR, 0.777; 95% CI, 0.198–3.054; P = .718), because its incidence of persistent hyperparathyroidism after transplantation (50.0%) was significantly higher than in the other groups (P < .001). The incidence of bone fractures did not differ among the groups.

Conclusions

Combination therapy with vitamin D and bisphosphonate was the most effective regimen to improve BMD among kidney recipients.

16.

Background

One of the problems of cadaveric renal transplantation is that its graft survival rate is lower than that of living donor renal transplantation. We aimed to study the relationships between graft survival and various factors in cadaveric renal transplantation patients.

Materials and Methods

We retrospectively analyzed 350 cadaveric renal transplantation patients treated at our institutions from 1983 to 2011. Kaplan-Meier analysis was performed to evaluate graft survival rates. Using a multivariable Cox regression model, we evaluated the relationship between graft survival and factors such as donor and recipient age and gender, recipient body mass index, duration of hemodialysis, warm ischemic time, and acute rejection (AR).

Results

Among the 235 males and 115 females, the overall mean age was 41 years. Median follow-up was 15 years (range, 2 to 28 years). The graft survival rate was 97% at 1 year, 85% at 5 years, and 71% at 10 years. In the Cox regression model, graft survival was affected by donor age (younger than 60 years; hazard ratio [HR], 1.5; 95% confidence interval [CI], 1.0–2.0; P = .027) and early acute rejection (within 3 months; HR, 2.1; CI, 1.6–2.8; P < .001).

Conclusions

Graft survival in cadaveric renal transplantation patients is affected by donor age and early AR.

17.

Background

A high incidence of delayed graft function (DGF) occurs after deceased donor kidney transplantation in Brazil. The reasons for this have not been adequately studied.

Methods

We performed a retrospective cohort study of 346 kidney transplant recipients from deceased donors. DGF risk factors related to the recipient, donor, and transplantation surgery were analyzed and correlated with graft outcomes. Logistic regression analysis was used to identify independent risk factors, and patient and graft survival were assessed using Kaplan-Meier curves.
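A minimal sketch of the logistic-regression step with hypothetical data (scikit-learn applies L2 shrinkage by default; the paper's unpenalized fit with confidence intervals would differ):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical rows mirroring the study's candidate DGF risk factors:
# donor final creatinine (mg/dL), donor age (years), national offer (0/1).
X = np.array([
    [1.9, 52, 1], [0.8, 34, 0], [2.3, 61, 1], [0.9, 45, 0],
    [1.6, 58, 1], [2.8, 66, 1], [1.0, 38, 0], [2.0, 49, 0],
    [1.5, 50, 1], [1.1, 41, 0],
])
y = np.array([1, 0, 1, 0, 0, 1, 0, 1, 1, 0])  # DGF yes/no

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(np.exp(clf.coef_))  # odds ratios per predictor (L2-shrunk by default)
```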

Results

The incidence of DGF was 70.8% (245 cases). Our final multivariate model showed that DGF was associated (P < .05) with donor final serum creatinine (relative risk [RR], 1.84; 95% confidence interval [CI], 1.26–2.70), donor age (RR, 1.02 [1.0–1.033]), receiving a kidney from a national offer (RR, 2.44 [1.06–5.59]), and need for antibody induction (RR, 2.87 [1.33–6.18]). Outcomes associated with DGF were a longer length of hospital stay (32.5 ± 20.5 vs 18.8 ± 16.3 days; P = .01), a higher incidence of acute rejection (37.8% vs 12.9%; P < .01), worse graft survival at 1 year (83.5% vs 93.9%; P < .01), and higher serum creatinine levels at 3, 6, and 12 months (P < .05). There was no difference in patient survival, and the occurrence of acute rejection did not influence patient or graft survival.

Conclusion

DGF was associated with higher donor final serum creatinine, donor age, receiving a kidney from a national offer, and need for antibody induction. Most importantly, DGF was associated with worse outcomes.

18.

Background

Various studies have reported poorer graft survival among individuals displaying T-cell-positive flow cytometry crossmatches (FCXM). Good outcomes have been observed in immunologically high-risk patients with the use of rituximab, plasmapheresis, and γ-globulin. Because the relevance of B-cell FCXM positivity (BCXM (+)) alone remains controversial, we examined its impact on living donor renal transplantation.

Patients and Methods

We retrospectively studied 146 adult renal transplantation recipients from April 2007 to June 2012, dividing them into BCXM (+) (n = 31) and BCXM (−) (n = 115) recipients. We examined patient and graft survival as well as rejection rates at 0 to 3, 3 to 12, and 12 to 24 months. We also determined the incidence of infectious diseases. We performed stepwise multivariate regression to identify risk factors contributing to rejection episodes.

Results

One-year patient and graft survival were 100% in both groups. The BCXM (−) group had a 16.8% rejection probability, whereas the BCXM (+) group had 33.2% (P = .201). There were no significant differences in the incidence of infectious diseases. Only a history of sensitization was an independent risk factor for a rejection episode.

Conclusion

BCXM (+) showed only a tendency toward more rejection episodes compared with BCXM (−), without a significant impact; short-term graft survival was similar.

19.

Background

Donor renal volume, which can be easily measured by computerized tomographic angiography with 3-dimensional reconstruction, may influence graft outcomes. Low functional renal mass and donor kidney–recipient body size mismatch can lead to progressive renal injury and poor graft function.

Materials and methods

This single-center retrospective analysis of 51 consecutive living donor renal transplantations performed between January 2005 and December 2011 defined transplant renal volume per unit of recipient body surface area (BSA; mL/m2). The patients were divided into 2 groups: group I (n = 31, donor–recipient BSA ratio ≤1) and group II (n = 20, BSA ratio >1). We analyzed the clinical characteristics and laboratory data of donors and recipients to ascertain correlations with renal volumes and graft outcomes.
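A minimal sketch of the volume-per-BSA normalization (the DuBois formula is an assumption; the abstract does not state which BSA formula was used):

```python
def dubois_bsa(height_cm: float, weight_kg: float) -> float:
    """Body surface area in m2 via the DuBois formula -- an assumption,
    since the abstract does not name the BSA formula used."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def volume_per_bsa(renal_volume_ml: float, height_cm: float, weight_kg: float) -> float:
    """Transplant renal volume normalized to recipient BSA (mL/m2)."""
    return renal_volume_ml / dubois_bsa(height_cm, weight_kg)

print(round(volume_per_bsa(160.0, 170.0, 65.0), 1))  # ~91.3 mL/m2
```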

Results

The renal volumes of living donors correlated with estimated glomerular filtration rates (eGFR; r = .314, P = .025). Serum creatinine after renal transplantation correlated inversely with transplanted renal volume at 1, 3, and 12 months (r = −.319, P = .048; r = −.407, P = .010; and r = −.472, P = .002, respectively). Serum eGFR also correlated with transplanted renal volume at 3 and 12 months after renal transplantation (r = .318, P = .049; and r = .388, P = .015). There were no significant differences between groups in acute or chronic rejection, infection, or delayed graft function. However, serum creatinine levels were higher (P = .011, P = .022, and P = .007) and serum eGFR significantly lower in group I at 1, 3, 6, and 12 months after renal transplantation (P = .036, P = .042, P = .042, and P = .049, respectively). There was no significant difference in graft survival.

Conclusions

Renal volume of living donors may reflect renal function and have a significant impact on graft outcomes. Renal volume matching should be considered to select donor–recipient pairs for living donor renal transplantation.
