Similar articles (20 results)
1.
We evaluated the incidence and management of vascular complications after live donor renal transplantation. Possible risk factors and their effects on patient and graft survival were also assessed. MATERIALS AND METHODS: A total of 1,200 consecutive live donor renal transplants were performed in 1,152 patients at a single institution. The incidence of each type of vascular complication was determined and correlated with relevant risk factors. The impact on patient and graft survival was also studied. RESULTS: There were 34 vascular complications (2.8%). Stenotic or thrombotic complications were recorded in 11 cases (0.9%), including renal artery stenosis in 5 (0.4%), renal artery thrombosis in 5 (0.4%) and renal vein thrombosis in 1 (0.1%). Hemorrhagic complications were observed in 23 patients (1.9%). Although no risk factors related to stenotic or thrombotic complications could be identified, grafts with multiple renal arteries were significantly associated with hemorrhagic complications (p = 0.04). Stenotic, thrombotic and hemorrhagic complications were all significantly associated with subsequent biopsy-proven acute tubular necrosis (p <0.001). The mean 5-year patient and graft survival rates ± SD for those with vascular complications were 71.9% ± 1.9% and 41.6% ± 8.9%, compared with 86.3% ± 1.1% and 76.8% ± 1.4% for the remainder of our transplant population, respectively (p <0.001). The deleterious impact on survival was observed not only in recipients with thrombotic or stenotic crises but also in those with hemorrhagic sequelae. CONCLUSIONS: Hemorrhagic crises are as serious as stenotic and thrombotic complications in their effect on patient and graft survival. Because multiple renal arteries are a significant factor in the development of hemorrhagic complications, such grafts should be managed with particular care.
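The incidences above are simple proportions. The abstract does not report confidence intervals; as an illustration only, a rough 95% CI can be attached with a normal (Wald) approximation:

```python
from math import sqrt

def incidence_with_ci(events, n, z=1.96):
    """Proportion with a normal-approximation (Wald) 95% CI.

    Illustrative only: for rare events a Wilson or exact interval
    would be preferable.
    """
    p = events / n
    half = z * sqrt(p * (1 - p) / n)
    return p, max(p - half, 0.0), p + half

# Figures from the abstract: 34 vascular complications in 1,200 transplants.
p, lo, hi = incidence_with_ci(34, 1200)
print(f"incidence {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

With these numbers the point estimate reproduces the reported 2.8%, with an approximate interval of about 1.9% to 3.8%.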

2.
Three hundred and eight cadaveric renal transplants were analysed to establish the effects of acute rejection in the first 90 days and delayed graft function (DGF) on graft outcome. There were 120 patients (39%) with no DGF and no rejection (group 1), 101 patients (33%) with rejection but no DGF (group 2), 41 patients (13%) with DGF but no rejection (group 3) and 46 patients (15%) with both rejection and DGF (group 4). The actuarial 4-year graft survival rate fell to 40.4% in group 4. The acute rejection rate was 101/221 (46%) in patients with initial graft function compared with 46/87 (53%) for those with DGF (χ² = 1.02, P = 0.31). Cox stepwise logistic regression analysis demonstrated that DGF was a more powerful predictive factor for poor graft survival (P = 0.001) than acute rejection occurring in the first 90 days post-transplant (P = 0.034). Further efforts at improving graft outcome should concentrate on reducing the incidence of DGF.
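The quoted rejection-rate comparison (101/221 vs. 46/87) is consistent with a 2×2 chi-squared test using Yates' continuity correction; a minimal sketch that reproduces the reported statistic:

```python
def yates_chi2(a, b, c, d):
    """Chi-squared statistic for a 2x2 table [[a, b], [c, d]]
    with Yates' continuity correction."""
    n = a + b + c + d
    num = n * max(abs(a * d - b * c) - n / 2, 0) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Rows: initial graft function (101 rejection, 120 none) vs.
#       DGF (46 rejection, 41 none), as given in the abstract.
chi2 = yates_chi2(101, 120, 46, 41)
print(round(chi2, 2))  # 1.02, matching the reported statistic
```

Without the continuity correction the statistic would be about 1.29, so the abstract's value appears to come from the corrected test.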

3.
In a prospective, randomized and placebo-controlled study we evaluated the influence of treatment with the calcium-channel blocker diltiazem on the course and results of cadaveric kidney transplantation in 39 graft recipients. The grafts were reperfused with Euro-Collins solution containing diltiazem 20 mg/l. All recipients except those on chronic treatment with a calcium-channel blocker received a preoperative bolus of diltiazem or placebo 0.3 mg/kg, and in all patients an infusion of diltiazem or placebo 3 mg/kg/24 h was started preoperatively. After that, diltiazem or placebo was given orally for 3 months. Donors were not treated. Immunosuppressive therapy consisted of prednisone, azathioprine and CsA. There were no significant differences between the groups concerning donor or recipient characteristics, HLA mismatching, or ischaemic time. Thrombosis leading to graft loss occurred in 3 recipients (diltiazem: 2, placebo: 1) and one graft was lost due to septicaemia (diltiazem). For the remaining 35 grafts no beneficial effect of treatment with diltiazem was found for the rate of delayed graft function, the rate of rejections, time to first rejection, whole blood CsA concentration, or graft function. The CsA dose needed to reach the target whole blood concentration was significantly lower in the diltiazem group. In conclusion, our results do not indicate any beneficial effects of treatment with diltiazem in cadaveric kidney transplantation, apart from a reduction in costs owing to a significant reduction in the CsA dosage.

4.

Background

Acute kidney injury (AKI) and renal dysfunction after heart transplantation are common and serious complications. Atrial natriuretic peptide (ANP) has been shown to increase glomerular filtration rate (GFR) and exert renoprotective effects when used for the prevention/treatment of AKI in cardiac surgery. We tested the hypothesis that intraoperative and postoperative administration of ANP could prevent a postoperative decrease in renal function early after heart transplantation.

Methods

Seventy patients were randomized to receive either ANP (50 ng/kg/min) (n = 33) or placebo (n = 37) starting after induction of anesthesia and continued for 4 days after heart transplantation or until treatment with dialysis was started. The primary end-point of the present study was measured GFR (mGFR) at day 4, assessed by plasma clearance of a renal filtration marker. Also, the incidence of postoperative AKI and dialysis were assessed.

Results

Median (IQR) mGFR at day 4 postoperatively was 60.0 (57.0) and 50.1 (36.3) ml/min/1.73 m² for the placebo and ANP groups, respectively (p = .705). During ongoing ANP infusion, the need for dialysis was 21.6% and 9.1% for the placebo and ANP groups, respectively (p = .197). The incidences of AKI for the placebo and ANP groups were 76.5% and 63.6%, respectively (p = .616). The incidences of AKI stage 1 were 32.4% and 21.2% for the placebo and ANP groups, respectively (p = .420), and for AKI stage 2 or 3, 37.8% and 42.4%, respectively (p = .808).

Conclusion

The study did not show that ANP infusion attenuates renal dysfunction or decreases the incidence of AKI after heart transplantation.
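The abstract reports AKI stages without stating the criteria used; one common definition is the KDIGO serum-creatinine criteria. A simplified, illustrative sketch (the 48-hour/7-day timing windows and the urine-output criteria are deliberately omitted, and `kdigo_stage` is a hypothetical helper, not something from the study):

```python
def kdigo_stage(baseline_cr, peak_cr, on_renal_replacement=False):
    """Approximate AKI stage from serum creatinine in mg/dl,
    loosely following the KDIGO creatinine criteria.

    Simplifications (assumption, not from the abstract): timing
    windows are ignored, and a peak >= 4.0 mg/dl is counted as
    stage 3 regardless of how the rise occurred.
    """
    if on_renal_replacement or peak_cr >= 3.0 * baseline_cr or peak_cr >= 4.0:
        return 3
    if peak_cr >= 2.0 * baseline_cr:
        return 2
    if peak_cr >= 1.5 * baseline_cr or peak_cr - baseline_cr >= 0.3:
        return 1
    return 0  # no AKI by creatinine criteria

print(kdigo_stage(1.0, 1.6))  # 1 (>= 1.5x baseline)
print(kdigo_stage(1.0, 2.5))  # 2 (>= 2.0x baseline)
```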

5.
The planned nature of live donor kidney transplantation allows immunosuppression to be initiated in the pretransplant period. The aim of this study was to determine the effect of pre-emptive immunosuppression on acute rejection rates after live donor kidney transplantation. In two consecutive cohorts of live donor kidney transplants, 99 patients received pre-emptive immunosuppression with tacrolimus monotherapy for 2 weeks prior to transplantation (PET group, first era) and 100 patients received tacrolimus-based immunosuppression commencing on the day of transplantation (control group, second era). The main outcome measure was the incidence of biopsy-proven acute rejection (BPAR) in the first 3 months post-transplantation. Tacrolimus levels were significantly higher in the PET group at day 4 post-transplant (PET 9.08 ± 4.57 vs. control 5.92 ± 3.64 ng/ml; P < 0.0001), but there were no significant differences at day 7 (PET 8.22 ± 3.58 vs. control 7.63 ± 3.56 ng/ml; P = 0.2452). BPAR was numerically higher in the PET group, but the difference did not reach statistical significance (PET 13/99 vs. control 6/100; P = 0.097). There were no differences in allograft function measured by serum creatinine at 1 year (PET 130 ± 36 vs. control 142 ± 69 μmol/l; P = 0.6829). Graft survival at 1 year was equivalent in both groups (PET 96.9 vs. control 97.0%; P = 0.9915). This study suggests that there is little role for pre-emptive tacrolimus monotherapy in ABO- and HLA-compatible live donor kidney transplantation in patients on triple maintenance immunosuppression.

6.
BACKGROUND: Schistosomiasis is a major health problem in some areas of the world. Schistosomal-specific nephropathy is a well-known occurrence that eventually leads to end-stage renal failure. Patients with schistosomal infection have been considered suitable recipients for renal transplantation; however, the long-term impact of schistosomiasis on kidney transplantation has not yet been reported. METHODS: The long-term impact of schistosomiasis on patient and graft outcomes was studied by comparing two groups drawn from a total of 243 patients. Group I consisted of cases with schistosomal infection and group II of schistosoma-free controls. Schistosomiasis was documented in group I by identifying schistosoma eggs in urine, stool or rectal mucosal biopsy. In addition, intra-operative biopsies from the bladder mucosa of the graft recipients and from the lower end of the ureter of living donors were examined for schistosoma eggs. RESULTS: Sixty-three cases of schistosomiasis were diagnosed in both recipient and donor, 65 in the recipient only, and eight in the donor only. Infected recipients and donors with active lesions were treated at least 1 month before transplantation with combined antischistosomal drugs (praziquantel and oxamniquine). The 243 patients (136 schistosoma-infected cases and 107 controls) were followed regularly for 10 years after transplantation. There was no significant difference in the incidence of acute or chronic rejection between the groups; however, higher cyclosporin doses were needed in the infected group, with a subsequently higher incidence of both acute and chronic cyclosporin nephrotoxicity. Moreover, the schistosomal group had a significantly higher incidence of urinary tract infection and urological complications, with no evidence of schistosomal re-infection.
CONCLUSIONS: Despite a higher incidence of schistosoma-related complications after renal transplantation, schistosomal infection is not a major risk factor for transplantation. Infected patients can therefore be considered suitable recipients provided they have been properly treated before transplantation.

7.
BACKGROUND: Studies on the effect of recombinant human erythropoietin (rHuEpo) on haematopoiesis in patients with kidney transplants have been limited to progressive chronic graft failure late after transplantation. In the present prospective randomized study, the efficacy of rHuEpo in the correction of anaemia during the first weeks after renal transplantation (RTP) was evaluated. METHODS: Patients were allocated to either an Epo-treated (n=14) or a non-Epo-treated group (n=15). Epo (150 U/kg/week s.c.) was started at a haematocrit (Hct) <30% and was increased at weekly intervals by 30 U/kg/week as long as Hct remained <25%. RESULTS: In the Epo group, Hct increased from a nadir of 22±4% 2 weeks after RTP to 30±4% at week 4 and 36±4% at week 6 (P<0.001 and P<0.0001, respectively, vs week 2). Corresponding values in the non-Epo group were 25±6%, 28±6% (P=NS) and 32±6% (P<0.05 vs week 2) (overall evolution Epo vs non-Epo: P=0.038 by variance analysis). The differences in Hct between the Epo and non-Epo groups were even more marked in patients without major complications (variance analysis P=0.009). The Epo-treated patients required fewer post-surgical blood transfusions (0.005 vs 0.014 per day of follow-up, P<0.05), in spite of greater post-surgical blood losses, especially at day 1 (P<0.05), more major complications (7 vs 4) and a higher number of ganciclovir-treated patients (4 vs 0; P<0.05). The maximum Epo dose after RTP was more than twice that required before RTP (197.1±45.1 vs 85.0±76.0 U/kg/week; P<0.05). CONCLUSIONS: rHuEpo during the first weeks after RTP is of benefit in correcting the Hct in the early post-surgical period, in spite of relative Epo resistance.
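The dose-titration rule described in the Methods (start at 150 U/kg/week once Hct falls below 30%, then step up by 30 U/kg/week weekly while Hct remains below 25%) can be sketched as follows. The abstract gives no explicit stopping or dose-reduction rule, so none is modeled, and `next_epo_dose` is a hypothetical helper:

```python
def next_epo_dose(current_dose, hct):
    """Weekly rHuEpo dose (U/kg/week) per the trial's titration rule.

    current_dose == 0 means treatment has not started yet.
    No taper/stop rule is modeled (not described in the abstract).
    """
    if current_dose == 0:
        # start treatment once Hct drops below 30%
        return 150 if hct < 30 else 0
    if hct < 25:
        # weekly escalation while Hct stays below 25%
        return current_dose + 30
    return current_dose

print(next_epo_dose(0, 28))    # 150: treatment starts
print(next_epo_dose(150, 22))  # 180: escalate
print(next_epo_dose(180, 27))  # 180: hold
```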

8.
9.
BACKGROUND: Epidemiological data indicate that renal transplants from living unrelated donors achieve superior survival rates compared with cadaveric grafts, despite a higher degree of human leukocyte antigen (HLA) mismatching. We undertook a center-based case-control study to identify donor-specific determinants affecting early outcome in cadaveric transplantation. METHODS: The study database consisted of 152 consecutive cadaveric renal transplants performed at our center between June 1989 and September 1998. Of these, 24 patients received a retransplant. Donor kidneys were allocated on the basis of prospective HLA matching according to the Eurotransplant rules of organ sharing. Immunosuppressive therapy consisted of a cyclosporine-based triple-drug regimen. In 67 recipients, at least one acute rejection episode occurred during the first month after transplantation. These were taken as cases, and the remaining 85 patients were the controls. Stepwise logistic regression was performed on donor-specific explanatory variables obtained from standardized Eurotransplant Necrokidney reports. In a secondary evaluation, the impact on graft survival in long-term follow-up was further measured by applying a Cox regression model. The mean follow-up of all transplant recipients was 3.8 years (SD 2.7 years). RESULTS: Donor age [odds ratio (OR) 1.05; 95% CI, 1.02 to 1.08], traumatic brain injury as cause of death (OR 2.75; 95% CI, 1.16 to 6.52), and mismatch on HLA-DR (OR 3.0; 95% CI, 1.47 to 6.12) were associated with an increased risk of acute rejection, whereas donor treatment with dopamine (OR 0.22; 95% CI, 0.09 to 0.51) and/or noradrenaline (OR 0.24; 95% CI, 0.10 to 0.60) independently exerted a significant beneficial effect. In the multivariate Cox regression analysis, both donor treatment with dopamine (HR 0.44; 95% CI, 0.22 to 0.84) and noradrenaline (HR 0.30; 95% CI, 0.10 to 0.87) remained significant predictors of superior graft survival in long-term follow-up.
CONCLUSIONS: Our data strongly suggest that the use of catecholamines in post-mortem organ donors during intensive care has immunomodulating effects and improves graft survival in long-term follow-up. These findings may at least partially be explained by the down-regulating effects of adrenergic substances on the expression of adhesion molecules (VCAM, E-selectin) in the vessel walls of the graft.
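The odds ratios above are per-unit effects from logistic regression. Since OR = exp(β), the effect of a multi-unit change is obtained by exponentiating the scaled coefficient; as an illustration (this calculation is ours, not the paper's), the donor-age OR of 1.05 per year scales to a 10-year age difference as follows:

```python
from math import exp, log

def or_over_delta(or_per_unit, delta):
    """Scale a per-unit odds ratio to a delta-unit change:
    exp(delta * ln(OR)), i.e. OR ** delta."""
    return exp(delta * log(or_per_unit))

# Donor age OR 1.05 per year (from the abstract), over 10 extra years:
print(round(or_over_delta(1.05, 10), 2))  # 1.63
```

So, under the fitted model, a donor ten years older carries roughly a 63% increase in the odds of early acute rejection, other factors held constant.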

10.
11.
INTRODUCTION: We examined the relationship between late acute rejection (LAR) after cadaveric kidney transplantation and medication compliance, using a modified version of the Long-term Medication Behaviour Self-efficacy Scale (LTMBS-scale), a validated patient self-report questionnaire. The original LTMBS-scale uses a five-point scale; however, our pilot study showed that patients found it difficult to discriminate between the five options, so we modified it to a three-point scale. PATIENTS AND METHODS: We carried out a retrospective analysis of all patients who received a kidney transplant in our unit in the cyclosporin (CyA) era. Rejections were divided into early and late based on the time interval after transplantation; graft rejection was confirmed by biopsy, and LAR was defined as acute rejection occurring after 90 d. We retrospectively administered the modified LTMBS-scale to determine individual patient confidence and self-efficacy in taking medications in a variety of situations (home, work, leisure, psychological and physical), and analysed these in relation to compliance behaviour. RESULTS: Twenty-four questionnaires were distributed; 22 (92%) were returned fully completed. Overall, the patients surveyed were not particularly confident (mean score 2.17 out of a maximum of 3) in taking their medications across these contexts. They demonstrated significantly less confidence (mean score 1.0) when experiencing physical (brittle bones, feeling 'ill') and psychological ('sadness') side-effects of medication and emotional reactions to the experience of chronic illness. CONCLUSION: Negative physical and psychological states were related to low self-efficacy in taking immunosuppressive medication, non-compliance and subsequent LAR in our cohort of patients.

12.
Renal transplant recipients have significantly higher mortality than individuals without kidney disease, and the excess mortality is mainly due to cardiovascular causes. In this study, we sought to determine the impact of smoking, a major cardiovascular risk factor, on patient and renal graft survival. The study population included all adult recipients of first cadaveric kidney transplants performed at our institution from 1984 to 1991. By selection, all patients were alive with a functioning graft for at least 1 yr after transplantation. Smoking history was obtained prior to transplantation. The follow-up period was 84.3 ± 41 months, during which 28% of the patients died and 21% lost their graft. By univariate and multivariate analysis, patient survival, censored at the time of graft loss, correlated with the following pre-transplant variables: age (p < 0.0001); diabetes (p = 0.0002); history of cigarette smoking (p = 0.004); time on dialysis prior to the transplant (p = 0.0005); and cardiomegaly by chest X-ray (p = 0.0005). Post-transplant variables did not correlate with patient mortality. By Cox regression, patient survival time was significantly shorter in diabetics (p < 0.0001), smokers (p = 0.0005), and recipients older than 40 yr. However, survival did not differ significantly between smokers, diabetics, and older recipients. Patient death was the most common cause of renal transplant failure in smokers, in patients older than 40 yr, and in diabetics, but these patient characteristics did not correlate with graft survival. The prevalence of different causes of death was not significantly different between smokers and non-smokers. In conclusion, a history of cigarette smoking correlates with decreased patient survival after transplantation, and the magnitude of the negative impact of smoking in renal transplant recipients is quantitatively similar to that of diabetes.

13.
To determine short- and long-term patient and graft survival in obese [body mass index (BMI) ≥30 kg/m²] and nonobese (BMI <30 kg/m²) renal transplant patients, we retrospectively analyzed our national database. Patients 18 years or older receiving a primary transplant after 1993 were included: 1,871 patients in the nonobese group and 196 in the obese group. The obese group contained significantly more females (52% vs. 38.6%, P < 0.01) and its patients were significantly older [52 years (43-59) vs. 48 years (37-58); P < 0.05]. Patient and graft survival were significantly decreased in obese renal transplant recipients (1- and 5-year patient survival 94% vs. 97% and 81% vs. 89%, P < 0.01; 1- and 5-year graft survival 86% vs. 92% and 71% vs. 80%, P < 0.01). Initial BMI was an independent predictor of patient death and graft failure. This large retrospective study shows that both graft and patient survival are significantly lower in obese renal transplant recipients.

14.
Acute rejection (AR) can lead to allograft dysfunction following renal transplantation, despite immunosuppressive treatment. Accumulating evidence points to a role for epigenetic modification in immune responses; however, the mechanism and contribution of DNA methylation to allograft survival remain unclear. In this study, we followed up patients who successively experienced end-stage renal disease, renal transplantation with allograft function or dysfunction, and hemodialysis. Peripheral blood mononuclear cells were collected at different time points for analysis of DNA methylation. Epigenetic modifier analysis was also performed to explore the effect of methylation in a mouse model of AR. Compared with the allograft-stable cohort, patients who experienced AR-induced allograft dysfunction showed more changes in methylation patterns. Pathway analysis revealed that the hypermethylated regions in the allograft dysfunction group were associated with genes related to the mechanistic target of rapamycin (mTOR) signaling pathway. Moreover, in the mouse AR model, treatment with the DNA methyltransferase inhibitor decitabine regulated the Th1/Th2/Th17/regulatory T cell (Treg) immune response through its demethylating role in suppressing the activity of the mTOR pathway, ultimately ameliorating renal allograft-related inflammatory injury. These results reveal that changes in methylation accompany AR-induced allograft dysfunction after renal transplantation. Epigenetics may provide new insights into predicting and improving allograft survival.

15.
Cyclosporin has improved graft survival after renal transplantation, but cyclosporin nephrotoxicity is a severe clinical problem. Conversion from cyclosporin to azathioprine 1 year after transplantation might improve long-term graft survival by avoiding cyclosporin nephrotoxicity. After treatment with cyclosporin and prednisolone during the first year after renal transplantation, 106 patients were consecutively randomized to treatment with either azathioprine and prednisolone or cyclosporin and prednisolone in a prospective, controlled study covering the following 5 years, i.e. up to 6 years after transplantation. Actuarial estimates of graft survival rates after inclusion in the study were obtained by the product-limit method of Kaplan-Meier, and the Mantel-Cox log-rank test was used to compare the two treatment regimens. When the end-points were cessation of graft function or withdrawal of immunosuppressive treatment due to side-effects, with patients alive with graft function or who had died with a functioning graft treated as censored observations, graft survival 5 years after inclusion was 57.7±5.2% overall and did not differ between the azathioprine group (52.4±7.7%) and the cyclosporin group (63.3±6.7%) (log rank=0.40, P=0.53). When cessation of graft function was the only end-point, graft survival 5 years after inclusion was 73.7±5.2% overall, with no significant difference between the two groups (log rank=0.58, P=0.45). Given that cyclosporin and prednisolone were used during the first year after renal transplantation, it can be concluded that conversion to azathioprine and prednisolone does not differ from continued treatment with cyclosporin and prednisolone with regard to long-term graft survival over the following 5 years.
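The product-limit (Kaplan-Meier) method used above can be sketched in a few lines. The encoding is illustrative: `events[i]` is True for graft loss and False for a censored observation (alive with function, or died with a functioning graft, as in the study's censoring rule):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    Returns a list of (time, survival) points, one per event time.
    At tied times, events are processed before censorings
    (the usual convention).
    """
    s = 1.0
    curve = []
    at_risk = len(times)
    for t, e in sorted(zip(times, events), key=lambda x: (x[0], not x[1])):
        if e:  # graft loss at time t
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
        at_risk -= 1  # event or censoring removes one subject from risk
    return curve

# Tiny synthetic example (not study data): losses at 2, 3, 5;
# censored at 3 and 8.
print(kaplan_meier([2, 3, 3, 5, 8], [True, True, False, True, False]))
```

Sequential per-event multiplication at tied times gives the same result as the grouped (n-d)/n formula, since (n-1)/n * (n-2)/(n-1) = (n-2)/n.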

16.
In the early era of transplantation, it was common practice to exclude diabetic patients, since the outcome in such cases was usually poor. At our center in Malmö, Sweden, diabetic nephropathy was never regarded as a contraindication. During the 22-year period from 1972 to 1993, 223 renal allografts were transplanted into 189 uremic diabetics, representing 24% of all renal transplant recipients (n=788). The two subgroups (patients with and without diabetes) did not differ significantly in graft survival rates over the 22-year period, which was characterized by a successive improvement in the success rate that was especially striking in the diabetic nephropathy subgroup. Among transplantations performed before 1988, the overall patient survival rate was significantly lower in the diabetic subgroup than in the remainder. After 1988 (when a series of new procedures had been adopted), the patient survival rate in the diabetic subgroup was similar to that in the nondiabetic subgroup, a similarity that persisted for at least 5 years. The first-year post-transplant mortality rate in diabetic patients fell from 24% before 1988 to 0% in those transplanted after 1988. Over the 22-year period as a whole, cardiovascular or cerebrovascular events were the most common cause of death in both subgroups; the risk of cardiovascular or cerebrovascular death was reduced after 1988, and the rates were similar in both subgroups. The improved success rate of renal transplantation in patients with diabetic nephropathy supports continuation of the renal transplant program, which is based on careful management of the early stages of the disease.

17.
18.
We report the successful regrafting of a transplanted kidney. The donor kidney was first transplanted into a 32-year-old patient with renal atrophy. More than 2 years later, he suffered a severe grand mal seizure with brain edema and met the criteria for brain death. The well-functioning graft was recovered and subsequently transplanted into a 66-year-old woman with chronic glomerulonephritis. Neither the first nor the second recipient experienced any acute rejection. To date, more than 14 years later, she is in good health with excellent graft function. This case report suggests that excellent long-term graft function is achievable in a graft reused 2 years after the initial transplantation.

19.
We report a case of renal graft loss after cadaveric renal transplantation. An episode biopsy performed for renal dysfunction showed plasma cell-predominant inflammatory infiltration in the interstitium without findings of vascular or glomerular rejection, and the case was diagnosed as plasma cell-rich acute rejection (PCAR). Despite intensive immunosuppressive therapy, the renal histology of repeated biopsies showed persistent plasma cell infiltration, and the graft was finally lost. Immunohistological staining and immunoglobulin gene rearrangement studies to assess the clonality of the inflammatory cells revealed that the infiltrating plasma cells were polyclonal in origin, and Epstein-Barr virus was not detected by in situ hybridization. From these results we excluded post-transplant lymphoproliferative disorder (PTLD); however, a precise definition of, and the differential diagnosis between, PCAR and PTLD have not yet been fully established. As the therapeutic regimens for PCAR and PTLD differ, definite guidelines for the diagnosis and treatment of PCAR need to be established.

20.
BACKGROUND: Early graft function (EGF) has an enduring effect on the subsequent course after kidney transplantation. This study compares quantitative parameters of EGF for the prediction of graft survival. METHODS: We included 300 consecutive recipients of deceased-donor transplants from 1989 to 2005. Urine output during the first 24 h post-transplant (UO) and serum creatinine after 1 week (Cr7) were taken as explanatory variables. We generated Kaplan-Meier (K-M) estimates of graft survival by quintiles of each explanatory variable, and Cox regression was applied to control for various recipient factors. RESULTS: K-M survival estimates indicate a threshold effect of UO and Cr7 that discriminates the risk of graft failure. The thresholds, corresponding to the 2nd quintile, were a UO >630 ml and a Cr7 <2.5 mg/dl, and were associated with proportional hazard ratios of 0.52 (95% CI 0.33-0.84) and 0.34 (95% CI 0.18-0.65), respectively. Meeting both thresholds predicted a 5-year graft survival probability >90%, corresponding to a hazard ratio of 0.21 (95% CI 0.09-0.46). The requirement for post-transplant dialysis lost its discriminatory power and was not a significant explanatory variable in the multivariate analysis. CONCLUSION: Routine parameters for monitoring EGF display a threshold effect that allows accurate prediction of 5-year graft survival at the earliest point in time. The quantitative threshold levels for optimum discriminatory power require validation in a larger, preferably multicentre, database.
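The two reported thresholds can be combined into a simple risk classifier. The "intermediate" tier below is an illustrative simplification of ours; the abstract reports hazard ratios only for each threshold individually and for their combination:

```python
def egf_risk_group(uo_24h_ml, cr7_mg_dl):
    """Classify early graft function using the thresholds reported in
    the abstract: 24 h urine output > 630 ml and day-7 serum
    creatinine < 2.5 mg/dl. Meeting both was associated with >90%
    5-year graft survival; the three-tier split is an assumption
    for illustration, not the study's own grouping.
    """
    good_uo = uo_24h_ml > 630
    good_cr = cr7_mg_dl < 2.5
    if good_uo and good_cr:
        return "low risk"
    if good_uo or good_cr:
        return "intermediate risk"
    return "high risk"

print(egf_risk_group(1000, 1.5))  # low risk
print(egf_risk_group(500, 3.0))   # high risk
```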
