Similar Articles (20 results)
1.
Although acute rejection rates have fallen over time, how this relates to graft outcomes is not known. Using data from the ANZDATA Registry, we examined associations of rejection within six months of transplantation with graft and patient outcomes among kidney-only transplants performed between April 1997 and December 2004 in Australia and New Zealand. Associations of biopsy histology with outcomes of the rejection episode were also examined. Outcomes were examined among 4325 grafts with 1961 rejection episodes in total. Crude rejection rates fell by one-third over that period, but graft survival rates remained constant. The occurrence of acute rejection was associated with an increased risk of graft loss after 6 months (HR, adjusted for donor and recipient characteristics, 1.69 [1.36-2.11], p<0.001). Late rejection (first rejection ≥90 days) was associated with a higher risk of graft loss (adjusted HR 2.46 [1.70-3.56], p<0.001), as was vascular rejection (HR 2.07 [95% CI 1.60-2.68], p<0.001). The occurrence of acute rejection is associated with an ongoing increased risk of graft loss, particularly if the episode occurred late or included vascular rejection. The reduced rates of rejection have not been accompanied by improved graft survival.
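The ANZDATA analysis above reports donor- and recipient-adjusted hazard ratios from Cox regression. As an illustration only, here is a minimal Python sketch (using the lifelines package) of how such an adjusted HR for graft loss after acute rejection could be estimated; the file name and all column names are hypothetical, not part of the registry.

```python
# Minimal sketch (not the authors' code): adjusted Cox model for graft
# loss after acute rejection, in the spirit of the ANZDATA analysis.
# "grafts.csv" and all column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("grafts.csv")  # one row per graft

cph = CoxPHFitter()
cph.fit(
    df[["time_to_graft_loss", "graft_lost", "acute_rejection",
        "late_rejection", "donor_age", "recipient_age"]],
    duration_col="time_to_graft_loss",
    event_col="graft_lost",
)
cph.print_summary()  # exp(coef) column holds the adjusted hazard ratios
```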

2.
BACKGROUND: Live donors are an increasingly important source of kidneys for transplantation in Australia. The aim of this study was to compare the rate and severity of rejection between patients receiving kidney transplants from live versus cadaveric donors. METHODS: A retrospective analysis was undertaken of all patients receiving live-donor (n=109) and cadaveric-donor (n=389) renal transplants at our institution between April 1, 1994, and March 31, 2000. Follow-up was completed on all patients until graft loss, death, or May 31, 2001. RESULTS: The baseline characteristics of the live-donor and cadaveric groups were similar, except for recipient age (mean ± SD, 36.3 ± 15.6 vs. 44.5 ± 14.4 years, respectively; P<0.001); donor age (46.1 ± 11.3 vs. 36.1 ± 16.4 years, P<0.001); pretransplant dialysis duration (1.36 ± 2.1 vs. 3.4 ± 4.4 years, P<0.001); and the proportions of patients receiving first allografts (95% vs. 88%, respectively; P<0.05), antibody induction (8% vs. 20%, P<0.01), and mycophenolate mofetil (MMF) (60% vs. 37%, P<0.001). Acute rejection was observed in 48 (44%) live-donor and 108 (28%) cadaveric transplants (P=0.001). Cadaveric donor type was independently predictive of less acute rejection both on logistic regression (adjusted odds ratio [AOR], 0.47; 95% confidence interval [CI], 0.30-0.73; P=0.001) and multivariate Cox proportional hazards model analysis (hazard ratio, 0.49; 95% CI, 0.34-0.69; P<0.001). Patients receiving cadaveric-donor transplants were also significantly less likely to receive antibody therapy for rejection (univariate, 18% vs. 9%; P=0.006; multivariate AOR, 0.45; 95% CI, 0.25-0.82; P<0.01), independent of recipient age, gender, race, transplant number, human leukocyte antigen mismatch, sensitization, induction therapy, delayed graft function, MMF use, tacrolimus or cyclosporine A use, sirolimus-everolimus use, year of transplant, donor age, or dialysis duration. However, donor type did not independently influence graft survival, immunologic graft survival, or patient survival. CONCLUSIONS: Live-donor kidney transplant recipients had a higher rate and severity of rejection and a shorter rejection-free period than cadaveric renal transplant recipients. Further consideration of the reasons for this difference and the use of alternative immunosuppressive strategies for live-donor transplants are recommended.
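Study 2 derives adjusted odds ratios for acute rejection from logistic regression. A minimal sketch with the statsmodels formula API follows; the covariates mirror those listed in the abstract, but the data file and variable names are invented for illustration.

```python
# Minimal sketch: adjusted odds ratio for acute rejection by donor type.
# "recipients.csv" and the variable names are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("recipients.csv")

model = smf.logit(
    "acute_rejection ~ cadaveric_donor + recipient_age + hla_mismatch"
    " + mmf_use + antibody_induction",
    data=df,
).fit()
print(np.exp(model.params))      # adjusted odds ratios
print(np.exp(model.conf_int()))  # 95% CIs on the odds-ratio scale
```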

3.
Kidney transplant outcomes that vary by program or geopolitical unit may result from variability in practice patterns or health care delivery systems. In this collaborative study, we compared kidney graft outcomes among 4 countries (United States, United Kingdom, Australia, and New Zealand) on 3 continents. We analyzed transplant and follow-up registry data from 1988-2014 for 379 257 recipients of first kidney-only transplants using Cox regression. Compared to the United States, 1-year adjusted graft failure risk was significantly higher in the United Kingdom (hazard ratio [HR] 1.22, 95% confidence interval [CI] 1.18-1.26, P < .001) and New Zealand (HR 1.29, 95% CI 1.14-1.46, P < .001), but lower in Australia (HR 0.90, 95% CI 0.84-0.96, P = .001). In contrast, long-term adjusted graft failure risk (conditional on 1-year function) was significantly higher in the United States compared to Australia, New Zealand, and the United Kingdom (HR 0.74, 0.75, and 0.74, respectively; each P < .001). Thus long-term kidney graft outcomes are approximately 25% worse in the United States than in 3 other countries with well-developed kidney transplant systems. Case mix differences and residual confounding from unmeasured factors were found to be unlikely explanations. These findings suggest that potentially modifiable country-specific differences in care delivery and/or practice patterns should be sought.
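The long-term comparison in study 3 is "conditional on 1-year function", i.e. only grafts still functioning at one year enter the risk set. One standard way to express that is delayed entry (left truncation) in a Cox model; the sketch below uses lifelines' entry_col for this, with hypothetical file and column names.

```python
# Minimal sketch: Cox model conditional on 1-year graft function, via
# delayed entry (left truncation). File and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("intl_grafts.csv")
cond = df[df["years_to_failure"] > 1.0].copy()
cond["entry"] = 1.0  # each graft enters the risk set at one year

cph = CoxPHFitter()
cph.fit(
    cond[["years_to_failure", "failed", "entry",
          "country_uk", "country_aus", "country_nz"]],
    duration_col="years_to_failure",
    event_col="failed",
    entry_col="entry",
)
cph.print_summary()  # country HRs among grafts functioning at 1 year
```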

4.
Increasing the donor pool using en bloc pediatric kidneys for transplant
OBJECTIVES: En bloc pediatric kidney transplants (EBPKT) are still a subject of controversy. The aim of this study was to determine whether acceptable long-term graft survival and function can be achieved in EBPKT compared with the transplant of single, cadaveric, adult donor kidneys. METHODS: A retrospective review was conducted of 66 recipients of en bloc kidneys from cadaveric pediatric donors and 434 patients who underwent transplantation with a single kidney from an adult donor between January 1990 and May 2002 at the authors' hospital. The recipients were well matched demographically. Both transplant groups were analyzed for short- and long-term performance in terms of transplant outcome and quality of graft function. RESULTS: Overall death-censored actuarial graft survival rates at 1 and 5 years were 89.2% and 84.6% in the adult kidney transplants (AKT) and 83.3% and 81.1% in EBPKT, respectively (P=0.56). In the EBPKT group, graft function was better than that observed in AKT. Vascular thrombosis was the most common cause of graft loss in EBPKT. Acute rejection occurred more frequently in AKT, and Cox regression analysis indicated that undergoing an AKT was a predictive factor for acute vascular rejection (adjusted risk ratio, 3.8; 95% confidence interval, 1.4-10.2; P=0.001). CONCLUSIONS: Overall graft survival was similar in both groups, vascular complications were the main cause of graft loss in EBPKT, and the EBPKT group showed excellent long-term graft function and a low incidence of acute rejection.

5.
Anatomical differences between right and left kidneys could influence transplant outcome. We compared graft function and survival for left and right kidney recipients transplanted from the same deceased organ donor. Adult recipients of 4900 single kidneys procured from 2450 heart-beating deceased donors in Australia and New Zealand from 1995 to 2009 were included in a paired analysis. Right kidneys were associated with more delayed graft function (DGF) (25 vs. 21% for left kidneys, p < 0.001) and, if not affected by DGF, a slower fall in serum creatinine. One-year graft survival was lower for right kidneys (89.1 vs. 91.1% for left kidneys, p = 0.001), primarily attributable to surgical complications (66 failures vs. 35 for left kidneys). Beyond the first posttransplant year, kidney side was not associated with eGFR, graft survival, or patient survival. Receipt of a right kidney is a risk factor for inferior outcomes in the first year after transplantation. The higher incidence of surgical complications suggests the shorter right renal vein may be contributory. The greater susceptibility of right kidneys to injury should be considered in organ allocation.
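Study 5 uses a paired design: left and right kidneys from the same donor. The abstract does not state the exact paired test, but a common choice for a paired binary outcome such as 1-year graft survival is McNemar's test on donor-matched pairs; the counts below are placeholders, not the published data.

```python
# Minimal sketch: McNemar's test on donor-matched left/right kidney pairs
# for 1-year graft survival. The 2x2 counts below are placeholders.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# rows: left kidney survived 1 yr (yes/no); cols: right kidney survived
table = np.array([
    [2100, 180],  # left yes & right yes, left yes & right no
    [120,   50],  # left no  & right yes, left no  & right no
])
result = mcnemar(table, exact=False, correction=True)
print(result.statistic, result.pvalue)  # driven by the discordant pairs
```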

6.
We evaluated outcomes with the sirolimus (SRL) and mycophenolate mofetil (MMF) combination regimen (SRL/MMF) in solitary kidney transplant recipients transplanted between 2000 and 2005 and reported to the Scientific Registry of Transplant Recipients. Three-and-a-half percent received SRL/MMF (n = 2040). Six-month acute rejection rates were higher with SRL/MMF (SRL/MMF: 16.0% vs. other regimens: 11.2%, p < 0.001). Overall graft survival was significantly lower on SRL/MMF. SRL/MMF was associated with twice the hazard for graft loss (AHR = 2.0, 95% CI 1.8-2.2) relative to TAC/MMF, consistent in both living donor transplants (AHR = 2.4, 95% CI 1.9-2.9) and expanded criteria donor transplants (AHR = 2.1, 95% CI 1.7-2.5). Among deceased donor transplants, DGF rates were higher in the SRL/MMF cohort (47% vs. 27%, p < 0.001). However, adjusted graft survival was also significantly inferior with SRL/MMF in DGF-free patients (AHR = 1.9, 95% CI 1.6-2.3). In analyses restricted to patients who remained on the discharge regimen at 6 months posttransplant, conditional graft survival in deceased donor transplants was significantly lower with SRL/MMF than with TAC/MMF or CsA/MMF regimens at 5 years posttransplant (64%, 78%, and 78%, respectively; p = 0.001) and across all patient subgroups. In conclusion, SRL/MMF is associated with inferior renal transplant outcomes compared with other commonly used regimens.

7.
BACKGROUND: Comparison of mortality rates after kidney transplantation with those of patients treated by dialysis is an important factor in assessing treatment options, but is subject to many pitfalls in the selection of appropriate control groups, in particular allowing for varying post-operative risk and recent changes in mortality rates with better immunosuppression and dialysis techniques. We examined the outcomes following cadaveric renal transplantation and compared them with an appropriate control group of dialysis patients, using contemporary national data from Australia and New Zealand and appropriate statistical methods. In particular, we explicitly addressed the changing risks following transplantation, looked at both younger (low-risk) and older (higher-risk) recipients, and examined the effect of attributing deaths in the early period following loss of transplant function to the risk of transplantation. METHODS: We performed a cohort study, initially including 11 560 people aged 15-65 years who began treatment for end-stage renal disease in Australia or New Zealand between 1991 and 2000. Of these, 5144 were recorded at least once as being on an active cadaveric transplant waiting list. Survival was analysed with Cox regression, including time-dependent covariates to allow for the violation of proportional hazards with changing mortality risks post-operatively. We also performed stratified analyses on low-risk recipients (<50 years, without co-morbidity) and older recipients. RESULTS: There was a clear difference in survival between those on the active transplant waiting list and those not listed. Of those who were on the cadaveric transplant waiting list, 2362 (46%) were transplanted in the period to 30 September 2001. Cadaveric transplantation was associated with an initial increase in mortality [during the first 3 months post-transplantation, adjusted HR 2.0 (1.5-2.7), P<0.001]. This fell below the dialysis group at 6 months [adjusted HR 0.27 (0.16-0.47), P<0.001], and from 12 months post-transplantation the reduction in risk of mortality was approximately 80% [adjusted HR 0.19 (0.15-0.24), P<0.001]. A secondary analysis showed the excess risk attributed to the period immediately following transplantation was actually due to deaths in the 60 days after loss of transplant function rather than deaths occurring with a functioning graft. CONCLUSIONS: As well as improved quality of life, cadaveric renal transplantation in Australia and New Zealand is associated with a survival advantage compared with remaining on the waiting list.
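The key methodological point in study 7 is the use of time-dependent covariates, so that a patient contributes waiting-list time until transplantation and post-transplant time afterwards, with separate early and late post-transplant effects. A minimal lifelines sketch of such a model, on hypothetical long-format data, is below.

```python
# Minimal sketch: Cox regression with time-varying covariates, so each
# patient contributes dialysis time until transplant and transplant time
# afterwards. "waitlist_long.csv" is a hypothetical long-format table with
# columns: id, start, stop, died, tx_first_3mo, tx_after_3mo, age, ...
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.read_csv("waitlist_long.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="died",
        start_col="start", stop_col="stop")
ctv.print_summary()  # separate HRs for early vs. late post-transplant time
```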

8.
BACKGROUND: Diltiazem, widely used as a cyclosporine-sparing agent, has been suggested to confer a benefit on graft and patient outcome in kidney transplantation related to immunomodulatory properties. Use of cyclosporine-sparing agents (CsSpA) is routinely recorded by the Australia & New Zealand Dialysis and Transplant (ANZDATA) Registry, and we used these data to examine the associations between CsSpA use and outcomes. METHODS: Graft and patient survival were analyzed for a cohort of 3913 people who received kidney transplants in Australia or New Zealand between 1 April 1993 and 30 March 2001. Patients were followed to death or loss of graft function. Graft and patient survival analyses were performed using Cox proportional hazards models, including a time-varying covariate for CsSpA use in analyses of graft failure. Occurrence of delayed graft function (DGF) and acute rejection were also examined as secondary outcomes. RESULTS: There was no difference in patient survival in the first 12 months post transplantation, but from 12 months onwards there was a survival advantage associated with CsSpA use among cadaveric donor (CD) recipients in both univariate (hazard ratio [HR] 0.56, 95% CI 0.41 to 0.76, P < 0.001) and multivariate (HR 0.56, 95% CI 0.40 to 0.79, P < 0.001) analyses. This was consistent across the subgroups examined. Lower rates of early graft loss (censored for death) were associated with CsSpA use [odds ratio (OR) 0.61, 95% CI 0.50 to 0.75, P < 0.0001]. Lower rates of use of antibody therapy for rejection were also observed, but not lower rates of biopsy-proven rejection. CONCLUSIONS: CsSpA use was associated with improved patient survival after kidney transplantation. Whether this was a direct drug effect or due to other factors associated with diltiazem use cannot be inferred directly from these data, although several plausible mechanisms exist which might mediate a diltiazem effect.

9.
Advances in immunosuppression have facilitated increased use of steroid-avoidance protocols in pediatric kidney transplantation. To evaluate such steroid avoidance, a retrospective cohort analysis of pediatric kidney transplant recipients between 2002 and 2009 in the United Network for Organ Sharing database was performed. Outcomes (acute rejection and graft loss) in steroid-based and steroid-avoidance protocols were assessed in 4627 children who received tacrolimus and mycophenolate immunosuppression and did not have multiorgan transplants. Compared to steroid-based protocols, steroid avoidance was associated with a decreased risk of acute rejection at 6 months posttransplant (8.3% vs. 10.9%, p = 0.02) and improved 5-year graft survival (84% vs. 78%, p < 0.001). However, patients not receiving steroids experienced less delayed graft function (p = 0.01), had less pretransplant dialysis, were less likely to be African-American, and more frequently received a first transplant from a living donor (all p < 0.001). In multivariate analysis, steroid avoidance trended toward decreased acute rejection at 6 months, but this no longer reached statistical significance, and there was no association of steroid avoidance with graft loss. We conclude that, in clinical practice, steroid avoidance appears safe with regard to graft rejection and loss in pediatric kidney transplant recipients at lower immunologic risk.

10.
The relative efficacy of anti-IL-2 receptor antibodies (IL2R Abs) and antilymphocyte antibodies in preventing acute rejection and improving graft survival after renal transplantation is poorly defined. In particular, the benefits of these agents in specific subgroups, such as recipients with different degrees of HLA mismatch, are unknown. Using the SRTR database, we compared IL2R Abs to no induction and to antilymphocyte antibody induction in 48 948 first renal transplant recipients in the United States between 1998 and 2003 with respect to acute rejection and graft failure. IL2R Abs decreased acute rejection at 6 months (OR 0.81, 0.75-0.87) and reduced graft failure (HR 0.90, 0.84-0.95) compared to no induction over a follow-up of 1059 days. Compared to IL2R Abs, antilymphocyte Abs were associated with decreased acute rejection at 1 year (OR 0.90, 0.83-0.99), but were not associated with improved graft survival (OR 1.08, 1.00-1.18) over a follow-up of 732 days. The benefit of IL2R Abs in reducing acute rejection increased significantly with greater HLA mismatch (p = 0.007). IL2R Abs remain an important option in the management of renal transplant patients, and may be particularly useful in specific patient subsets.
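The subgroup finding in study 10 (induction benefit increasing with HLA mismatch, p = 0.007) is an effect-modification result, conventionally tested with an interaction term. A hypothetical statsmodels sketch, with invented file and variable names, follows.

```python
# Minimal sketch: interaction (effect-modification) test for induction
# benefit by HLA mismatch. File and variable names are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("first_tx.csv")
m = smf.logit(
    "rejection_6mo ~ il2r_ab * hla_mismatch + recipient_age + donor_age",
    data=df,
).fit()
print(m.summary())  # the il2r_ab:hla_mismatch row tests effect modification
```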

11.
Background: End-stage kidney disease secondary to hyperoxaluria presents a major challenge for transplant physicians given concern regarding disease recurrence. Few contemporary studies have reported long-term outcomes following transplantation in this population. Methods: This study examined the outcomes of all adult patients with end-stage kidney disease secondary to hyperoxaluria who received a kidney or combined liver-kidney transplant in Australia and New Zealand between 1965 and 2015. Patients with hyperoxaluria were propensity score matched to control patients with reflux nephropathy. The primary outcome was graft survival. Secondary outcomes included graft function, acute rejection, and patient survival. Results: Nineteen transplants performed in 16 patients with hyperoxaluria were matched to 57 transplants in patients with reflux nephropathy. Graft survival was inferior in patients with hyperoxaluria receiving a kidney transplant alone (subhazard ratio [SHR] = 3.83, 95% confidence interval [CI], 1.22-12.08, P = .02) but not in those receiving a combined liver-kidney transplant (SHR = 0.63, 95% CI, 0.08-5.21, P = .67). Graft failure risk was particularly high in patients with hyperoxaluria receiving a kidney transplant alone after more than 1 year of renal replacement therapy (SHR = 8.90, 95% CI, 2.35-33.76, P = .001). Posttransplant estimated glomerular filtration rate was lower in patients with hyperoxaluria (by 10.97 mL/min/1.73 m2, 95% CI, 0.53-21.42, P = .04). There was no difference between groups in the risk of acute rejection or death with a functioning graft. Conclusion: Compared to reflux nephropathy, hyperoxaluria was associated with inferior graft survival in patients receiving a kidney transplant alone. Long-term graft function was lower in patients with hyperoxaluria, but the risks of acute rejection and death were not different.
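Study 11 matches hyperoxaluria cases to reflux-nephropathy controls by propensity score. The abstract does not give the matching algorithm; a common implementation is a logistic propensity model followed by 1:3 nearest-neighbour matching (matching the 19-to-57 ratio reported), sketched below with invented covariates assumed to be numeric-coded.

```python
# Minimal sketch: logistic propensity model plus 1:3 nearest-neighbour
# matching on the score. Covariates are invented and assumed numeric-coded.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("esrd_candidates.csv")
X = df[["age", "sex", "transplant_era", "donor_type"]]

ps_model = LogisticRegression(max_iter=1000).fit(X, df["hyperoxaluria"])
df["score"] = ps_model.predict_proba(X)[:, 1]

cases = df[df["hyperoxaluria"] == 1]
controls = df[df["hyperoxaluria"] == 0]
nn = NearestNeighbors(n_neighbors=3).fit(controls[["score"]])
_, idx = nn.kneighbors(cases[["score"]])
matched = controls.iloc[idx.ravel()]  # three matched controls per case
```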

12.
BACKGROUND: Improvements in long-term kidney graft survival have recently been noted. However, the reasons for this were unclear. This study examined post-transplant renal function within the first year as an independent variable influencing long-term survival. METHODS: The influence of demographic characteristics (age, sex, race), transplant variables (cadaver versus living donor, cold ischemia time, HLA mismatching, delayed graft function, and transplant year), and post-transplant variables (immunosuppressive agents for the prevention of acute rejection, clinical acute rejection, and post-transplant renal function in the first year) on graft survival was analyzed for 105,742 adult renal transplants between 1988 and 1998. Renal function in the first year was expressed as serum creatinine at six months and one year and as delta creatinine (the change in serum creatinine between 6 months and 1 year). Graft half-life was used to measure long-term survival. RESULTS: During this 11-year period, the one-year serum creatinine values for cadaver recipients steadily improved, from 1.82 ± 0.82 mg/dL in 1988 to 1.67 ± 0.82 mg/dL in 1998 (P < 0.001), as did the graft half-life. There was a progressive decline in graft half-life for each incremental increase in six-month, one-year, and delta creatinine for living and cadaver donor transplants, as well as for cadaver transplants with donor age >50 and ≤50 years. The relative hazard (RH) for graft failure was 1.63 (1.61, 1.65; P < 0.0001) with each increment of 1.0 mg/dL of serum creatinine at one year post-transplant, and it increased to 2.26 (2.2, 2.31; P < 0.0001) when the delta creatinine was 0.5 mg/dL. The RH reduction for graft failure was substantially lower for the years 1993, 1996, 1997, and 1998 when post-transplant renal function was not included in the model (P < 0.05). However, the RH reduction per year was not different when post-transplant creatinine was included in the model: 1.01 (0.94 to 1.05; P = 0.89). CONCLUSION: One-year creatinine and delta creatinine values predict long-term renal graft survival. Recent improvements in graft half-life are related to conservation of renal function within the first year post-transplantation.
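Graft half-life, the long-term survival measure in study 12, is often computed assuming a roughly constant (exponential) hazard beyond the first year; under that assumption the half-life follows directly from the annual survival fraction. A small illustrative helper is below; the exponential assumption is ours, not necessarily the paper's exact method.

```python
# Minimal sketch: graft half-life under a constant-hazard (exponential)
# assumption; the assumption is ours, not necessarily the paper's method.
import math

def graft_half_life(annual_survival: float) -> float:
    """Years until half the grafts have failed, given constant annual survival."""
    hazard = -math.log(annual_survival)  # per-year hazard rate
    return math.log(2) / hazard

print(round(graft_half_life(0.95), 1))  # 95%/year survival -> ~13.5 years
```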

13.
Historically, pediatric renal transplantation has been characterized by higher acute rejection rates, earlier first rejection, and an inability to reverse rejection. In recent years, short-term (1-year) graft survival of pediatric renal transplants has steadily improved. To test the hypothesis that these improvements were mediated by changes in acute rejection, we considered the rejection profile of patients who received a renal allograft between 1987 and 1989 (cohort A) and compared it with recipients transplanted between 1997 and 1999 (cohort B). Cohort A comprised 1469 transplants and cohort B comprised 1189 transplants. Restricting the data to the first year of follow-up, rejection ratios were 1.6 and 0.7, respectively (p < 0.001). Sixty per cent of the later cohort (B) were rejection free at 1 year, compared with 29% for the earlier cohort (A) (p < 0.001). Controlling for donor source, the rejection reversal rate for the later cohort was significantly better than that of the early cohort (p < 0.001). The cumulative distribution of times to first rejection was significantly better for cohort B (p < 0.001). One-year graft survival for cohort B, at 94%, was significantly better than the 80% observed for cohort A (p < 0.001). We conclude that the improved short-term graft survival is mediated by improvements in the rejection profile of more recently transplanted patients and that this may translate into a better half-life for pediatric renal transplant recipients who received an allograft in the years 1997-99.
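Study 13 compares cumulative distributions of time to first rejection between eras. That kind of comparison is conventionally done with Kaplan-Meier curves and a log-rank test, sketched here on hypothetical data.

```python
# Minimal sketch: Kaplan-Meier curves of time to first rejection for two
# era cohorts, compared with a log-rank test. Data columns are invented.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("pediatric_tx.csv")
a = df[df["era"] == "1987-89"]
b = df[df["era"] == "1997-99"]

kmf = KaplanMeierFitter()
kmf.fit(b["days_to_first_rejection"], b["rejected"], label="1997-99")
print(kmf.predict(365))  # rejection-free fraction at one year (~0.60)

res = logrank_test(a["days_to_first_rejection"], b["days_to_first_rejection"],
                   event_observed_A=a["rejected"], event_observed_B=b["rejected"])
print(res.p_value)
```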

14.
Pancreas after kidney transplants
BACKGROUND: For certain uremic diabetic patients, a sequential transplant of a kidney (usually from a living donor) followed by a cadaver pancreas has become an attractive alternative to a simultaneous transplant of both organs. The purpose of this study was to compare outcomes with simultaneous pancreas-kidney (SPK) versus pancreas after kidney (PAK) transplants to determine advantages and disadvantages of the two procedures. METHODS: Between January 1, 1994, and June 30, 2000, we performed 398 cadaver pancreas transplants at our center. Of these, 193 were SPK transplants and 205 were PAK transplants. We compared these two groups with regard to several endpoints, including patient and graft survival rates, surgical complications, acute rejection rates, waiting times, length of hospital stay, and quality of life. RESULTS: Overall, surgical complications were more common for SPK recipients. The total relaparotomy rate was 25.9% for SPK recipients versus 15.1% for PAK recipients (P = 0.006). Leaks, intraabdominal infections, and wound infections were all significantly more common in SPK recipients (P = 0.009, P = 0.05, and P = 0.01, respectively, versus PAK recipients). Short-term pancreas graft survival rates were similar between the two groups: at 1 year posttransplant, 78.0% for SPK recipients and 77.9% for PAK recipients (P = not significant). By 3 years, however, pancreas graft survival differed between the two groups (74.1% for SPK and 61.7% for PAK recipients), although this did not quite reach statistical significance (P = 0.15). This difference in graft survival seemed to be due to increased immunologic losses for PAK recipients: at 3 years posttransplant, the incidence of immunologic graft loss was 16.2% for PAK versus 5.2% for SPK recipients (P = 0.01). Kidney graft survival rates were, however, better for PAK recipients. At 3 years after their kidney transplant, kidney graft survival rates were 83.6% for SPK and 94.6% for PAK recipients (P = 0.001). The mean waiting time to receive the pancreas transplant was 244 days for SPK and 167 days for PAK recipients (P = 0.001). CONCLUSIONS: PAK transplants are a viable option for uremic diabetics. While long-term pancreas graft results are slightly inferior to SPK transplants, the advantages of PAK transplants include the possibility of a preemptive living donor kidney transplant, better long-term kidney graft survival, significantly decreased waiting times, and decreased surgical complication rates. Use of a living donor for the kidney transplant expands the donor pool. Improvements in immunosuppressive regimens will hopefully eliminate some of the difference in long-term pancreas graft survival between SPK and PAK transplants.

15.
BACKGROUND: The relative benefit versus safety of induction therapy in live-donor renal transplant recipients is controversial. This paper presents observational data of live-donor recipients who received Thymoglobulin induction and standard maintenance immunosuppressive therapy. METHODS: Review and analysis of clinic records and electronic databases of live-donor renal transplants that received Thymoglobulin induction from May 1996 through 2003. RESULTS: Data analysis included 214 live-donor recipients (146 related, 68 unrelated) with a mean follow-up of 3.0 ± 1.9 years. The average age of recipients was 44 ± 13 years, with a majority being Caucasian (86%) and male (64%). Nineteen (9%) received previous transplants. No patients experienced delayed graft function, and 10 (5%) developed acute rejection. Overall, predicted five-year patient survival was 96% and graft survival was 82%. The rates of CMV infection (5%), malignancy (3%), and lymphoproliferative disorder (0.5%) were low. When compared to live-donor kidney transplant recipients nationwide, the center cohort demonstrated improved five-year patient survival (96% center versus 90% national, P=0.0326) and graft survival (82% center versus 79% national, P=0.0901), and a lower one-year acute rejection rate (2% center versus 21% national, P<0.001). CONCLUSIONS: In this analysis, the use of Thymoglobulin in live-donor renal transplantation was associated with an absence of delayed graft function, low acute rejection rates, and high patient and graft survival without increasing the risk of infection or lymphoproliferative disorder.

16.
The success of cadaveric renal transplants in the first year is determined largely by events that transpire during the transplant hospitalization. This conclusion is based upon analyses of data on 19,525 cadaver donor renal transplants performed since October 1987 and reported to the UNOS Scientific Renal Transplant Registry from more than 200 centers nationwide. Graft survival rates at 1 year differed by 20-30% depending upon whether or not the transplanted kidney functioned immediately and upon whether the patient required dialysis during the first week posttransplant, experienced rejection, or was discharged with a kidney that was functioning well. Recipients whose discharge serum creatinine level was less than 2.6 mg/dl or whose graft was functioning well at the time of discharge had 88% 1-year graft survival. A multistep logistic regression analysis showed cold ischemia time, transfusions, donor age and cause of death, HLA-DR mismatches, and peak sensitization to be significant factors in the first week. Prophylactic antilymphocyte antibodies (ALG/OKT3) reduced the incidence of rejection from 30% to 20% during the transplant hospitalization, but apparently only delayed rejection. By 6 months there was only a 3% reduction with ALG and a 5% reduction with OKT3 in the incidence of reported rejection, and a 2% difference in 1-year graft survival. Although graft and patient survival are important measures of transplant success, graft survival is predicated upon both early and late events. The course of the transplant during the initial hospitalization and the quality of function at discharge were the strongest determinants of 1-year graft survival.

17.
Greater compatibility of human leucocyte antigen (HLA) alleles between kidney donors and recipients may lead to improved graft outcomes. This study aimed to compare the incidence of acute rejection and graft failure in zero-HLA-mismatched recipients of living-related (LD) and deceased donor (DD) kidney transplants. Using data from the Australia and New Zealand Dialysis and Transplant Registry, we compared the risk of any acute rejection, biopsy-proven acute rejection (BPAR), and graft failure in recipients of zero-HLA-mismatched kidneys between LD and DD using logistic and Cox regression models. Of the 931 zero-HLA-mismatched recipients transplanted between 1990 and 2012, 19 (2.0%) received kidneys from monozygotic/dizygotic twins (twin), 500 (53.7%) from nontwin LD, and 412 (44.3%) from DD. Twin kidney transplant recipients did not experience rejection. Compared to DD transplant recipients, the risk of any acute rejection (adjusted odds ratio 0.52, 95% CI 0.34-0.79, P = 0.002) and overall graft failure (adjusted hazard ratio 0.55, 95% CI 0.41-0.73, P < 0.001) was significantly lower in LD recipients independent of initial immunosuppression, but this did not hold for BPAR (adjusted odds ratio 0.52, 95% CI 0.16-1.64, P = 0.263). Zero-HLA-mismatched DD kidney transplant recipients thus have a significantly higher risk of acute rejection episodes and graft loss than zero-HLA-mismatched LD kidney transplant recipients. A cautious approach to reducing immunosuppression appears warranted in this group of transplant recipients.

18.
2,500 living donor kidney transplants: a single-center experience
OBJECTIVE: To review a single center's experience and outcome with living donor transplants. SUMMARY BACKGROUND DATA: Outcome after living donor transplants is better than after cadaver donor transplants. Since the inception of the authors' program, they have performed 2,540 living donor transplants. For the most recent cohort of recipients, improvements in patient care and immunosuppressive protocols have improved outcome. In this review, the authors analyzed outcome in relation to protocol. METHODS: The authors studied patient and graft survival by decade. For those transplanted in the 1990s, the impact of immunosuppressive protocol, donor source, diabetes, and preemptive transplantation was analyzed. The incidence of rejection, posttransplant steroid-related complications, and return to work was determined. Finally, multivariate analysis was used to study risk factors for worse 1-year graft survival and, for those with graft function at 1 year, to study risk factors for worse long-term survival. RESULTS: For each decade since 1960, outcome has improved after living donor transplants. Compared with patients transplanted in the 1960s, those transplanted in the 1990s have better 8-year actuarial patient and graft survival rates. Death with function and chronic rejection have continued to be a major cause of graft loss, whereas acute rejection has become a rare cause of graft loss. Cardiovascular deaths have become a more predominant cause of patient death; infection has decreased. Donor source (e.g., ideally HLA-identical sibling) continues to be important. For living donor transplants, rejection and graft survival rates are related to donor source. The authors show that patients who had preemptive transplants or less than 1 year of dialysis have better 5-year graft survival and more frequently return to full-time employment. Readmission and complications remain problems; of patients transplanted in the 1990s, only 36% never required readmission. Similarly, steroid-related complications remain common. The authors' multivariate analysis shows that the major risk factor for worse 1-year graft survival was delayed graft function. For recipients with 1-year graft survival, risk factors for worse long-term outcome were pretransplant smoking, pretransplant peripheral vascular disease, pretransplant dialysis for more than 1 year, one or more acute rejection episodes, and donor age older than 55. CONCLUSIONS: These data show that the outcome of living donor transplants has continued to improve. However, for living donors, donor source affects outcome. The authors also identify other major risk factors affecting both short- and long-term outcome.

19.
Aim: Pre-emptive renal transplantation has become the preferred first-line therapy for patients with end-stage kidney failure. This study examines allograft and patient survival in pre-emptive transplantation compared with non-pre-emptive transplantation from living donors in Australia and New Zealand. Methods: We performed a retrospective study using the Australia and New Zealand Dialysis and Transplant Registry. Allograft and patient survival were compared at 1, 5 and 10 years in pre-emptive and non-pre-emptive transplantation following a living donor transplant. Results: Allograft survival at 1, 5 and 10 years post pre-emptive transplantation was better than post non-pre-emptive transplantation (multivariate hazard ratio (HR) 0.80 [95% confidence interval 0.64–0.99], P = 0.036). Pre-emptive transplantation was associated with a significant patient survival advantage over non-pre-emptive transplantation when analysed from the time of transplantation and adjusted for age and gender (multivariate HR 0.46 [0.27–0.80], P = 0.006). Patient survival for pre-emptive and non-pre-emptive transplantation was 97% [0.95–0.98] and 93% [0.91–0.94] at 5 years and 93% [0.88–0.96] and 84% [0.82–0.87] at 10 years post transplant, respectively. There was no difference in the overall rejection rate between pre-emptive and non-pre-emptive transplantation. Vascular rejection was less common in pre-emptive transplantation (HR 0.70 [0.50–0.98], P = 0.04). Conclusion: Pre-emptive transplantation from a living donor is associated with better allograft and patient survival than transplantation after a period of dialysis. Pre-emptive transplantation should be the preferred modality of renal replacement therapy in patients who have a living donor.

20.
BACKGROUND: Living unrelated and related kidney transplantation has been shown to have similar allograft survival. However, the effect of donor-recipient relatedness in living-related and unrelated kidney transplantation on graft and patient survival remains uncertain. METHODS: Using the Australia and New Zealand Dialysis and Transplant Registry, primary living renal transplant recipients in Australia between 1995 and 2004 were studied (n=1989). Donors were categorized according to their relationship with recipients: parent (n=606), child (n=103), spouse (n=358), sibling (n=656), other living-related donors (n=81), and other living-unrelated donors (n=185). Outcomes analyzed included the presence of rejection at 6 months, estimated glomerular filtration rate (eGFR) at 1 and 3 years, graft survival, and patient survival. RESULTS: A greater proportion of renal transplant recipients from parental and spousal donors were transplanted preemptively. Donor group had no relationship with graft or patient survival. Parental donors were associated with increased relative odds of acute rejection (odds ratio 1.69, 95% confidence interval 1.13-2.53, P=0.009) and a lower eGFR at both 1 and 3 years (coefficients -2.99 and -5.68, respectively; P<0.0001) compared to the sibling donor reference group. CONCLUSIONS: This study established that donor-recipient relatedness in both related and unrelated living kidney transplantation had no significant effect on graft and patient survival. Parental donors were associated with a higher relative risk of rejection and lower eGFR in transplant recipients, although these findings did not translate to worse graft outcomes.
