Similar Articles (20 results)
1.
Graft survival in the autosomal dominant polycystic kidney disease (ADPKD) transplant population at our center was compared to other end-stage renal disease (ESRD) transplant recipients (excluding diabetics). There were 1512 adult cadaveric renal transplants carried out at our center between 1989 and 2002. After exclusions, 1372 renal grafts were included in the study. Using Kaplan-Meier methods, patient and graft survival were determined and compared between the two groups. Mean age at transplant was significantly older for the ADPKD group of patients. The age-adjusted graft survival at 5 years was 79% for ADPKD patients compared to 68% in the controls. Patient survival for ADPKD patients improved from 89% at 5 years to 95% when age adjusted. Using Cox proportional hazards models to compare ADPKD with other causes of ESRD (including recipient age and other variables) in a multifactorial model, ADPKD was significant at the 5% level (p = 0.036). This study demonstrates a graft and patient survival advantage in ADPKD patients when age-matched compared to other ESRD patients.
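Nearly every study in this list reports survival by Kaplan-Meier methods. As a rough illustration only (not the study's actual analysis), the product-limit estimator can be sketched in pure Python; the follow-up data below are hypothetical:

```python
def kaplan_meier(observations):
    """Product-limit estimate of the survival function.

    observations: list of (time, event) pairs, where event=True marks a
    graft loss/death and event=False a censored follow-up.
    Returns a list of (time, survival_probability) at each event time.
    """
    observations = sorted(observations)
    n_at_risk = len(observations)
    survival = 1.0
    curve = []
    i = 0
    while i < len(observations):
        t = observations[i][0]
        # Count events and total subjects leaving the risk set at time t.
        events = sum(1 for time, ev in observations if time == t and ev)
        removed = sum(1 for time, _ in observations if time == t)
        if events:
            survival *= 1 - events / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical follow-up times in years (event=True means graft failure).
data = [(1, True), (2, True), (3, False), (4, True), (5, False)]
curve = kaplan_meier(data)  # [(1, 0.8), (2, 0.6), (4, 0.3)] up to rounding
```

In practice these analyses are done with dedicated survival packages (e.g. R's survival or Python's lifelines), which also handle tied times, confidence bands, and log-rank comparisons.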

2.
Although the nephrotoxic side effects of cyclosporine are well known, the impact of long-term CsA on renal transplant function is uncertain. We studied 5-10-year renal function in 347 CsA-treated patients, and in 64 randomly selected non-CsA-treated patients who had a minimum of 55 months of graft function. Non-CsA patients had a lower creatinine (Cr) level at one year than CsA patients (P = .001), with no change in renal function over time (P = .6). In CsA-treated patients there was also no suggestion of progressive renal damage, as evidenced by no change in Cr or 1/Cr. Simple linear regression models of 1/Cr vs. time for the first 10 years posttransplant were fit to the data for each patient. Analysis of the Y-intercept estimates from these regressions showed that age (P = .001), sex (P = .001), cyclosporine toxicity (P = .024), and initial cyclosporine dosage (P = .016) significantly affected the one-year serum Cr. Variables not affecting one-year Cr included donor source, early rejection episodes, late rejection episodes, ATN, diabetes, transplant number, HLA ABDR mismatch (for cadaver transplants), maximum PRA, and PRA at transplant. Analysis of the slope estimates from the regressions revealed that only age (P = .001) and late rejection episodes (P = .001) significantly affected the rate of change in 1/Cr over time. We conclude that, in long-term renal transplant patients, there is no evidence of progressive deterioration in renal function due to CsA nephrotoxicity.
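The per-patient regressions of 1/Cr against time described above amount to ordinary least squares, where the intercept reflects one-year function and the slope the rate of change. A minimal sketch, using a hypothetical patient's creatinine values (not data from the study):

```python
def fit_line(times, values):
    """Ordinary least-squares fit: values = intercept + slope * times."""
    n = len(times)
    mean_x = sum(times) / n
    mean_y = sum(values) / n
    sxx = sum((x - mean_x) ** 2 for x in times)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(times, values))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical patient: serum Cr of 1.6, 1.7, 1.6, 1.8 mg/dL at years 1-4.
years = [1, 2, 3, 4]
inv_cr = [1 / cr for cr in (1.6, 1.7, 1.6, 1.8)]
intercept, slope = fit_line(years, inv_cr)
# A slope near zero, as here, is the pattern the authors interpret as
# stable renal function (no progressive decline in 1/Cr).
```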

3.
The purpose of this study was to clarify the selectivity and specificity of noninvasive procedures for diagnosis of clinically suspected posttransplant renovascular hypertension. We prospectively investigated 25 renal transplant recipients with arterial hypertension and clinically suspected stenosis of the graft artery (8 female and 17 male patients; ages 45 +/- 15 years). We performed a captopril test with 25 mg captopril (n = 25), renography with technetium-99m diethylene triamine penta-acetic acid (99mTc-DTPA) before and after angiotensin-converting enzyme (ACE) inhibition with determination of glomerular filtration rate (GFR) and effective renal plasma flow (ERPF) (n = 23), and color-coded duplex ultrasonography of the transplant kidney vessels (n = 24). Renal transplant artery stenosis (RTAS) was excluded by renal arteriography in 20 patients and by operative evaluation or clinical follow-up in 5 patients. We identified 4 patients with RTAS and renovascular hypertension. The noninvasive methods showed the following results (sensitivity/specificity): (1) captopril test: 75%/67%; (2) renography combined with ACE-inhibition: 75%/84%; and (3) color-coded duplex ultrasonography: 100%/75%. We conclude that in patients with clinical evidence of RTAS most noninvasive diagnostic procedures are not sufficiently accurate to exclude the diagnosis. Only color-coded duplex ultrasonography detected all patients with RTAS and may therefore act as a screening test. Intraarterial renal angiography remains the most reliable and as-yet indispensable diagnostic test for transplant recipients to rule out RTAS.
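The sensitivity/specificity pairs reported above come directly from the 2x2 table of test results against the angiographic reference standard. As an illustration, the duplex-ultrasonography counts can be reconstructed from the abstract's figures (4 RTAS patients all detected; 75% specificity among the roughly 20 non-RTAS patients implies about 15 true negatives and 5 false positives; these reconstructed counts are an assumption, not reported data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Reconstructed (assumed) counts for color-coded duplex ultrasonography:
# all 4 RTAS cases detected, 15 of 20 non-RTAS cases correctly negative.
sens, spec = sensitivity_specificity(tp=4, fn=0, tn=15, fp=5)
# sens = 1.0 (100%), spec = 0.75 (75%), matching the abstract's figures.
```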

4.
BACKGROUND: End-stage renal disease is associated with illness-induced disruptions that challenge patients and their families to accommodate and adapt. However, the impact of patients' marital status on kidney transplant outcome has never been studied. This project, based on data from the United States Renal Data System (USRDS), helps to answer how marriage affects renal transplant outcome. METHODS: Data have been collected from the USRDS on all kidney/kidney-pancreas allograft recipients between January 1, 1995 and June 30, 2002, who were 18 years of age or older and had information about their marital status prior to the kidney transplantation (n = 2061). Survival analysis was performed using Kaplan-Meier methods and Cox proportional hazards modeling to control for confounding variables. RESULTS: Overall findings of this study suggest that being married has a significant protective effect on death-censored graft survival [Hazard Ratio (HR) 0.80, p < 0.05] but a non-significant effect on recipient survival (HR 0.85, p = 0.122). When stratified by gender, the effect was still present in males for death-censored graft survival (HR 0.75, p < 0.05), but not for recipient survival (HR 0.86, p = 0.24). The effect was not observed in females, where neither graft (HR 0.90, p = 0.55) nor recipient (HR 0.8, p = 0.198) survival had an association with marital status. In subgroup analysis, a similar association was found in recipients of a single transplant. CONCLUSION: Based on our analysis, being married in the pre-transplant period is associated with a positive outcome for the graft, but not for recipient survival. When analyzed separately, the effect is present in male, but not in female recipients.

5.
Previous studies of the effect of donor factors on renal transplant outcomes have not tested the role of recipient body mass index, donor/recipient weight ratios and age matching, and other factors. We analyzed 20,309 adult (age 16 or older) recipients having solitary cadaveric renal transplants from adult donors from 1 July 1994 to 30 June 1998 in a historical cohort study (the 2000 United States Renal Data System) of death-censored graft loss, using Cox proportional hazards models that corrected for characteristics thought to affect outcomes. The only independently significant findings in Cox regression analysis were a high donor/recipient age ratio (> or = 1.10, e.g. a 55-year-old donor given to a recipient age 50 years or younger, adjusted hazard ratio (AHR) 3.22, 95% confidence interval (CI) 2.36-4.39) and African American donor kidneys (AHR 1.64, 95% CI 1.24-2.17). African American recipients and older donors were not at independently increased risk of graft failure in this model. Among donor factors, older donor kidneys given to younger recipients and African American donor kidneys were independently associated with graft loss in recipients of cadaver kidneys. The task for the transplant community should be to find the best means for managing all donor organs without discouraging organ donation.

6.
Detailed data on living donor age, and its interplay with recipient age, in predicting allograft and recipient outcomes are lacking. We used the Scientific Registry of Transplant Recipients (2000-2009, n = 49,589) to assess the effect of living donor age on delayed graft function (DGF), total graft failure, death-censored graft failure, death with graft function, and graft failure with death as a competing risk using logistic and Cox proportional hazards models. Potential nonlinear associations were modeled using fractional polynomial functions. There was a significant 1.87-fold increase in the adjusted odds of DGF in the oldest versus youngest age groups. The 10-year adjusted hazard ratios (HR) for total graft failure, death-censored graft failure, and death with graft function increased in a nonlinear fashion across the range of living donor age studied. Graft failure was most accentuated in the youngest recipient age groups in competing risk models. Adjustment for renal function at 6 and 12 months post-transplant markedly attenuated the association between living donor age and graft/patient outcomes. Our findings confirm the important influence of living donor age on transplant outcomes and provide detailed estimates of risk across the living donor age continuum.

7.
BACKGROUND: The 'centre effect' has accounted for significant variation in renal allograft outcomes in the United States and Europe. To determine whether similar variation exists in Canada, we analysed mortality and graft failure (GF) rates among Canadian end-stage renal disease patients who received a renal allograft from 1988 to 1997 (n = 5082) across 20 transplant centres. METHODS: Patients were followed from the date of transplantation to the time of GF and/or death. A Cox proportional hazards model was used to estimate mortality and GF hazard ratios (HRs) adjusted for relevant covariates, including centre volume. Centre-specific HRs were derived by comparing each centre's outcome rates against all others. RESULTS: Twenty centres were included in the analysis. There was significant centre-specific variation in recipient and transplant characteristics (e.g. age, diabetes mellitus, donor source and centre volume) as well as covariate-adjusted facility-specific outcome rates. Facility-specific HRs for GF (including death with a functioning graft) ranged from 0.51 to 1.77, while mortality HRs (including death beyond GF) showed a similar spread (0.44-1.84). These HRs represent a 3- to 4-fold difference in transplant outcomes among the 20 centres studied. Centres performing fewer than 200 transplants over the study period were associated with lower graft and patient survival. CONCLUSIONS: These findings demonstrate significant centre-specific variation in the success of renal transplantation in Canada. Further studies are needed to elucidate the causes of this variation, with the goal of developing strategies to minimize the centre effect and ensure the best possible outcomes for all renal transplant recipients.

8.
Histidine-Tryptophan-Ketoglutarate (HTK) solution is increasingly used to flush and preserve organ donor kidneys, with efficacy claimed equivalent to University of Wisconsin (UW) solution. We observed and reported increased graft pancreatitis in pancreata flushed with HTK solution, which prompted this review of transplanting HTK-flushed kidneys. We analyzed outcomes of deceased-donor kidneys flushed with HTK and UW solutions with a minimum of 12 months follow-up, excluding pediatric and multi-organ recipients. We evaluated patient and graft survival, rejection rates, variables that might constitute hazards to graft survival, and renal function. Two-year patient survival, rejection, renal function and graft survival were not different, but early graft loss (<6 months) was worse in HTK-flushed kidneys (p < 0.03). A Cox analysis of donor grade, cold ischemic time, panel reactive antibodies (PRA), donor race, first vs. repeat transplant, rejection and flush solution showed that only HTK use predicted early graft loss (p < 0.04; relative risk = 3.24), almost exclusively attributable to primary non-function (HTK, n = 5 (6.30%); UW, n = 1 (0.65%); p = 0.02). Delayed graft function and early graft loss with HTK occurred only in lower-grade kidneys, suggesting HTK should be used with caution in marginal donors.

9.
BACKGROUND: There is substantial evidence that renal transplant recipients with obesity or body mass index >30 kg/m(2) have increased risk of graft loss. However, few data exist for Asian populations; clinicians have to rely upon indirect evidence by extrapolating results from Caucasians to Asian recipients. STUDY: Using data from a study population of 150 consecutive cadaveric or living related Chinese renal transplant recipients at our center between 1985 and 2002, we examined the effect of a baseline body mass index cut-off of 25 kg/m(2) on transplant outcomes using Kaplan-Meier analysis and Cox proportional hazards analysis. The primary study end point was overall graft survival. RESULTS: The mean body mass index was 22.9 +/- 4.0 kg/m(2). Thirty-seven (25%) patients were classified as overweight and 113 patients as non-overweight, using the cut-off of 25 kg/m(2). After a median follow-up period of nine years, 15 graft losses (41%) occurred in the overweight group, as compared with 14 graft losses (12%) in the non-overweight group. Kaplan-Meier estimates of cumulative graft survival at five years were significantly worse for the overweight group than for the non-overweight group (83.6% versus 92.7%, p = 0.0041). By multivariate Cox proportional hazards analysis, baseline body mass index > or =25 kg/m(2) conferred a significantly higher risk of graft loss (relative risk 4.78, 95% CI = 1.52-15.2) and doubling of serum creatinine (relative risk 3.19, 95% CI = 1.22-8.40). Death with a functioning graft occurred in 11% of overweight recipients, compared with 3% among the non-overweight group (p = 0.041). CONCLUSION: Our results draw attention to a recipient body mass index cut-off value > or =25 kg/m(2), which might confer an increased risk of graft loss in Asian kidney transplant populations.
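The counts given in this abstract (15/37 graft losses in the overweight group versus 14/113 in the non-overweight group) allow a crude unadjusted risk ratio to be computed, which is a useful sanity check even though it differs from the reported Cox-adjusted estimate of 4.78. A minimal sketch, using the standard log-scale confidence interval:

```python
import math

def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Unadjusted risk ratio with an approximate 95% CI on the log scale."""
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    # Standard error of log(RR) for a 2x2 table.
    se = math.sqrt(
        1 / events_exposed - 1 / n_exposed
        + 1 / events_unexposed - 1 / n_unexposed
    )
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Counts from the abstract: 15/37 losses (overweight), 14/113 (non-overweight).
rr, ci = risk_ratio(15, 37, 14, 113)
# rr is about 3.27 unadjusted; the abstract's 4.78 is adjusted for
# covariates in a multivariate Cox model, so the two need not agree.
```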

10.
A racial disparity in graft survival for renal transplant recipients has been documented for both cadaveric and living-donor transplants. In the present single-center study we analyzed graft survival by race for recipients of living-donor kidney transplants in three eras: 1985-89, 1990-94, and 1995-2000. There was an intensification of the immunosuppressant regimen beginning in 1996, such that all patients received cyclosporine or tacrolimus with mycophenolate and prednisone. Graft survival was analyzed using the Cox proportional hazards model. There were 79 black recipients and 210 white recipients with no difference in mean age, degree of HLA matching, or proportion of recipients with diabetes as the cause of end-stage renal disease. Using all data from 1985 to 2000, graft survival was significantly better for whites vs. blacks adjusted for age, gender, diabetes, era of the transplant, and haplotype match (p = 0.05). However, when analyzed by era, there was a temporal trend for a progressive decrease in the racial disparity in graft survival. In confirmation of this effect, there was a significant race-era interaction (p = 0.01) on multivariable Cox proportional hazards analysis. The most recent data from the United States Renal Data System (USRDS) show a similar decrease in the racial difference in 1-year graft survival. We conclude that the influence of race on living-donor graft survival is diminishing over time.

11.
BACKGROUND: Renal insufficiency and end-stage renal disease (ESRD) are important problems in the cardiac transplant population, and are associated with significant morbidity, mortality and financial cost. We undertook this study to define pre-operative or early post-operative predictors of subsequent renal insufficiency and ESRD. METHODS: We studied 370 patients at Brigham and Women's Hospital who received heart transplants between 1984 and 1999, with up to 10-year follow-up. We evaluated 2 time-dependent primary outcomes: early reduction in GFR, and development of ESRD at any timepoint. Cox proportional hazards modeling was used in both univariate and multivariate analyses. RESULTS: The mean estimated glomerular filtration rate (GFR) fell 24% within the first post-transplant year, and remained stable thereafter. By actuarial analysis, 23% of patients developed a 50% reduction in GFR by the third year, and 20% developed ESRD by the tenth year of follow-up. In Cox multivariate analysis, significant predictors of post-transplant ESRD included: GFR <50 ml/min (hazard ratio [HR] 3.69, p = 0.024); high mean cyclosporine trough in the first 6 months (HR 5.10, p = 0.0059); and presence of diabetes (HR 3.53, p = 0.021). Conclusions about the renal insufficiency outcome were limited by difficulties with accurate estimation of GFR and with the definition of renal insufficiency. CONCLUSIONS: The results of this study underscore the magnitude of the problem of renal insufficiency and ESRD in the heart transplant population. In addition, these data suggest that patients at high risk for these outcomes can be identified early, even pre-operatively, to guide post-operative management.

12.
AIM: We prospectively followed a cohort of 202 renal transplant recipients for 5 years to examine the impact of fasting homocysteinemia on long-term patient and renal allograft survival. METHODS: Cox proportional hazards regression analysis was used to identify independent predictors of all-cause mortality and graft loss. RESULTS: Hyperhomocysteinemia (tHcy >15 micromol/L) was present in 48.7% of the 202 patients, predominantly among men (55.8%) as opposed to women (37.1%). At the end of the follow-up period, 13 (6.4%) patients had died, including 10 from cardiovascular disease, and 23 (11.4%) had lost their grafts. Patient death with a functioning allograft was the most prevalent cause of graft loss (13 recipients). Levels of tHcy were higher among patients who died than among survivors (median 23.9 vs 14.3 micromol/L; P = .005). Median tHcy concentration was also higher among the patients who had lost their allografts than those who did not (median 19.0 vs 14.1 micromol/L; P = .001). In a Cox regression model including gender, serum creatinine concentration, transplant duration, traditional cardiovascular risk factors, and associated conditions, such as past cardiovascular disease, only tHcy concentration (ln) (HR = 5.50; 95% CI, 1.56 to 19.36; P = .008) and age at transplantation (HR = 1.07; 95% CI, 1.02 to 1.13; P = .01) were independent predictors of patient survival. After censoring data for patient death, tHcy concentration was not a risk factor for graft loss. CONCLUSIONS: This prospective study shows that tHcy concentration is a significant predictor of mortality, but not of graft loss, after censoring data for patient death.

13.
The objective was to use the United States Renal Data System (USRDS) to quantify the relationship between immunosuppressant therapy (IST) adherence and risk of graft failure among adult renal transplant recipients (RTRs). A secondary objective was to examine the relationship among select patient characteristics and IST adherence. The study sample included adult RTRs who: received a primary transplant between January 1, 1999 and December 31, 2005; experienced graft survival for at least 12 months post-transplant and had at least 12 months of data in the USRDS; utilized Medicare coverage for IST; and were prescribed cyclosporine or tacrolimus. IST adherence was measured by medication possession ratio (MPR). Pearson chi-square tests were used to examine associations between patient characteristics and MPR quartiles. Cox proportional hazards regression was used to assess relationships among time to graft failure, MPR, and patient characteristics. Thirty-one thousand nine hundred and thirteen RTRs met inclusion criteria. Older age, female gender, white race, deceased donors, and tacrolimus were associated with greater adherence (p < 0.001). Cox proportional hazard modeling indicated greater adherence, white race, and having a living donor were significantly associated with longer graft survival (p < 0.05). Future prospective studies should further examine the clinical significance of IST nonadherence as it relates to graft failure.
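The medication possession ratio used above is conventionally defined as total days' supply dispensed divided by the days in the observation window, usually capped at 1.0. A minimal sketch with hypothetical pharmacy-claims data (the study's exact MPR definition and capping rule are not stated in the abstract):

```python
def medication_possession_ratio(days_supplied, observation_days):
    """MPR: total days' supply dispensed / days observed, capped at 1.0.

    days_supplied: list of days'-supply values, one per pharmacy fill.
    observation_days: length of the observation window in days.
    """
    total_supply = sum(days_supplied)
    return min(total_supply / observation_days, 1.0)

# Hypothetical recipient: twelve 30-day tacrolimus fills over one year.
mpr = medication_possession_ratio([30] * 12, 365)
# 360 days of supply over 365 observed days gives an MPR of about 0.986.
```

Adherence studies typically then categorize MPR into quartiles or apply a threshold such as 0.8, as in the chi-square analysis described above.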

14.
This study evaluated the efficacy of primary endovascular stenting in cases of transplant renal artery stenosis (TRAS) from cadaver and non-heart-beating donor kidneys. Patients with TRAS (n = 13) from a single-centre transplant population (n = 476) were treated by primary percutaneous angioplasty and endovascular stenting. The short-term efficacy of this intervention is demonstrated in terms of serum creatinine, glomerular filtration rate (GFR), anti-hypertensive medication requirements, and mean arterial blood pressure control. Stenting for TRAS was performed in male (n = 10) and female (n = 3) recipients. The median age at transplantation was 55 yr (range 10-67 yr). Stenting occurred at a median duration of 410 d post-transplantation (range 84-5799 d). Mean serum creatinine (pre, 247 micromol/L; post, 214 micromol/L; p = 0.002), GFR (pre, 82.6 mL/min; post, 100.9 mL/min; p < 0.001), arterial blood pressure (pre, 104 mmHg; post, 97 mmHg; p = 0.036) and the number of anti-hypertensive medications required (pre, 3.4; post, 3.0; p = 0.002) showed significant improvement after endovascular therapy. There were no serious complications encountered. Primary endovascular stenting of TRAS produces a significant improvement in biochemical parameters of renal graft function and in blood pressure stability, with the benefit of low patient morbidity and a single arterial puncture. Primary endoluminal stenting is a safe and effective procedure for the treatment of TRAS.

15.
16.
Data were collected retrospectively on all 449 first-transplant cadaver renal allograft recipients transplanted at four centers between 1/1/78 and 12/31/82 who had graft failure by 1/1/85. A total of 383 of these patients had information available regarding subsequent disposition. Of these, 182 (47.5%) were placed on an active waiting list for retransplantation. There were no associations found between placement on a waiting list and the following variables: panel reactive antibody (PRA) prior to first transplant or subsequent to graft failure, recipient age at first transplant or at the time of graft failure, recipient race, PRA after first graft loss, or HLA-A, B match of the first transplant. When stratified by level of HLA-A, B match as poor (0-1 antigen, n = 150) or good (2-4 antigens, n = 233), the poorly matched recipients as a group had a significantly lower mean PRA prior to first transplant (9.4 +/- 1.6 vs. 15.5 +/- 1.7, P less than 0.01), but a significantly higher PRA within the first year following graft failure (48.1 +/- 4.8 vs. 36.2 +/- 3.2, P less than 0.04). In addition, the poorly matched (vs. well-matched) group had a significantly higher mean increase in PRA following graft failure (45.1 +/- 4.4 vs. 33.7 +/- 3.5), and a significantly higher percentage of patients with PRA level greater than or equal to 60% within a year after graft failure (40% vs. 25%). Of the 182 patients who were placed on a waiting list, 113 (62.1%) were regrafted. As a group, regrafted patients had a significantly lower PRA within the first year following graft failure compared with the group not regrafted (33.6 +/- 3.9 vs. 54.0 +/- 5.0, P less than 0.002). Patients with a good first transplant HLA match had a higher overall regraft rate compared with those with a poor match (70.0% vs. 50.0%, P less than 0.01). Likewise, the percentage of well-matched patients regrafted within two years of first graft failure was significantly higher (55.5% vs. 32.5%, P less than 0.02).
By multivariate analysis using the Cox proportional hazard model with 13 separate variables and considering all patients, the relative risk (RR) of not being regrafted was significantly (P less than 0.012) associated with poor HLA-A, B matching of the first transplant (RR = 1.7). (ABSTRACT TRUNCATED AT 400 WORDS)

17.
BACKGROUND: The advances in immunotherapy, along with a liberalization of eligibility criteria, have contributed significantly to the ever increasing demand for donor organs. In an attempt to expand the donor pool, transplant programs are now accepting older donors as well as donors from more remote areas. The purpose of this study is to determine the effect of donor age and organ ischemic time on survival following orthotopic heart transplantation (OHT). METHODS: From April 1981 to December 1996, 372 adult patients underwent OHT at the University of Western Ontario. Cox proportional hazards models were used to identify predictors of outcome. Variables affecting survival were then entered into a stepwise logistic regression model to develop probability models for 30-day and 1-year mortality. RESULTS: The mean age of the recipient population was 45.6 +/- 12.3 years (range 18-64 years; 54 were < or = 30, 237 were 31-55, and 91 were > 56 years). The majority (329 patients, 86.1%) were male and the most common indications for OHT were ischemic (n = 180) and idiopathic (n = 171) cardiomyopathy. Total ischemic time (TIT) was 202.4 +/- 84.5 minutes (range 47-457 minutes). In 86 donors TIT was under 2 hours, while it was between 2 and 4 hours in 168, and more than 4 hours in 128 donors. Actuarial survival was 80%, 73%, and 55% at 1, 5, and 10 years respectively. By Cox proportional hazards models, recipient status (Status I-II vs III-IV; risk ratio 1.75; p = 0.003) and donor age, examined as either a continuous or categorical variable ([age < 35 vs > or = 35; risk ratio 1.98; p < 0.001], [age < 50 vs > or = 50; risk ratio 2.20; p < 0.001], [age < 35 vs 35-49 vs > or = 50; risk ratio 1.83; p < 0.001]), were the only predictors of operative mortality. In this analysis, total graft ischemic time had no effect on survival.
However, using the Kaplan-Meier method followed by Mantel-Cox logrank analysis, ischemic time did have a significant effect on survival if donor age was > 50 years (p = 0.009). By stepwise logistic regression analysis, a probability model for survival was then developed based on donor age, the interaction between donor age and ischemic time, and patient status. CONCLUSIONS: Improvements in myocardial preservation and peri-operative management may allow for the safe utilization of donor organs with prolonged ischemic times. Older donors are associated with decreased peri-operative and long-term survival following OHT, particularly if graft ischemic time exceeds 240 minutes and if these donor hearts are transplanted into urgent (Status III-IV) recipients.

18.
AIM: To evaluate factors affecting patient and kidney survival after renal transplant. PATIENTS AND METHODS: Among 361 patients undergoing renal transplant, 52% (n = 189) received a simultaneous pancreas-kidney transplant (SPKT group) and 48% (n = 172) a kidney transplant alone (KT group). Of the 361 patients, 75% (n = 270) were diabetic. The patients were 220 (61%) men and 141 (39%) women of mean age 41 +/- 9 years. The mean time on dialysis was 42 +/- 21 months (range 0 to 126), and the mean duration of diabetes 24 +/- 7 years (range 5 to 51). A Cox regression analysis was done. RESULTS: The multivariate analysis revealed that in the final model diabetes and donor age were significant predictors of kidney graft survival; moreover, diabetes and recipient age were predictors of patient survival. Overall patient survival was significantly greater among nondiabetic patients (P = .002) and among diabetic patients who received SPKT, when compared with diabetics in whom only the kidney was transplanted (P = .001). CONCLUSIONS: Diabetes and donor age were independent prognostic factors affecting kidney graft survival after renal transplant, and recipient age and diabetes were prognostic factors affecting patient survival. Combined pancreas and kidney transplantation should be offered to patients with end-stage diabetic nephropathy.

19.
Domingues EMFL, Matuck T, Graciano ML, Souza E, Rioja S, Falci MC, Monteiro de Carvalho DB, Porto LC. Panel reactive HLA antibodies, soluble CD30 levels, and acute rejection six months following renal transplant.
Clin Transplant 2010: 24: 821-829. © 2009 John Wiley & Sons A/S. Abstract: Background: Specific anti-human leukocyte antigen (HLA) antibodies in the post-transplant period may be present with acute rejection episodes (ARE), and high soluble CD30 (sCD30) serum levels may be a risk factor for ARE and graft loss. Methods: HLA cross-matching, panel reactive antibodies (PRA), and sCD30 levels were determined prior to transplantation in 72 patients. Soluble CD30 levels and PRA were re-assessed at days 7, 14, 21, and 28, and then monthly up to the sixth month. Results: Twenty-four subjects had a positive PRA and 17 experienced ARE. Nine of 17 ARE subjects demonstrated positive PRA and 16 had HLA mismatches. Positive PRA was more frequent in ARE subjects (p = 0.03). Eight subjects with ARE had donor-specific antibodies (DSA) in serum samples pre-transplantation; two subjects developed DSA. Three subjects without ARE had positive PRA only in post-transplantation samples. Pre-transplant sCD30 levels were higher in ARE subjects than in non-ARE subjects (p = 0.03). Post-transplant sCD30 levels were elevated in subjects who experienced rejection and were significantly higher at seven days (p = 0.0004) and six months (p = 0.03). Conclusions: Higher sCD30 levels following transplant were associated with ARE. Elevated sCD30 levels may represent a risk factor for acute rejection.

20.
BACKGROUND: Waiting time on dialysis has been shown to be associated with worse outcomes after living and cadaveric transplantation. To validate and quantify end-stage renal disease (ESRD) time as an independent risk factor for kidney transplantation, we compared the outcome of paired donor kidneys, destined to patients who had ESRD more than 2 years compared to patients who had ESRD less than 6 months. METHODS: We analyzed data available from the U.S. Renal Data System database between 1988 and 1998 by Kaplan-Meier estimates and Cox proportional hazards models to quantify the effect of ESRD time on paired cadaveric kidneys and on all cadaveric kidneys compared to living-donated kidneys. RESULTS: Five- and 10-year unadjusted graft survival rates were significantly worse in paired kidney recipients who had undergone more than 24 months of dialysis (58% and 29%, respectively) compared to paired kidney recipients who had undergone less than 6 months of dialysis (78% and 63%, respectively; P<0.001 each). Ten-year overall adjusted graft survival for cadaveric transplants was 69% for preemptive transplants versus 39% for transplants after 24 months on dialysis. For living transplants, 10-year overall adjusted graft survival was 75% for preemptive transplants versus 49% for transplants after 24 months on dialysis. CONCLUSIONS: ESRD time is arguably the strongest independent modifiable risk factor for renal transplant outcomes. Part of the advantage of living-donor versus cadaveric-donor transplantation may be explained by waiting time. This effect is dominant enough that a cadaveric renal transplant recipient with an ESRD time less than 6 months has the equivalent graft survival of living donor transplant recipients who wait on dialysis for more than 2 years.
