Similar Literature
20 similar articles found.
1.

Purpose

Our aim was to evaluate the influence of donor cause of brain death on the results of kidney transplantation.

Methods

This retrospective study included 896 consecutive deceased-donor renal transplantations performed between January 1, 2000, and December 31, 2009. We compared outcomes of grafts from donors after cerebrovascular accident (CVA; n = 371) versus head trauma (HT; n = 525).

Results

Univariate analysis of pretransplantation data showed statistically significant differences (P < .05) among the following variables for the HT versus CVA groups, respectively: recipient age (43.63 ± 13.2 y vs 49.80 ± 12.5 y), donor age (36.06 ± 16.6 y vs 52.57 ± 13.2 y), and time on dialysis (50.67 ± 45.0 mo vs 59.39 ± 46.3 mo). Regarding transplantation results, mean serum creatinine was significantly lower among HT recipients at 1, 3, 6, 12, and 24 months after transplantation (P < .05). Chronic allograft nephropathy (CAN) and delayed graft function were more frequent in the CVA group. HT kidneys showed significantly longer mean survival than CVA kidneys (102.7 ± 3.9 mo vs 94.8 ± 5.6 mo; log-rank P = .04). Upon multivariate analysis, donor cause of death was not identified as an independent risk factor for graft survival or occurrence of CAN.

Conclusions

Transplantation results were better in the HT group. However, multivariate regression analysis indicated that donor cause of death was not an independent risk factor for graft survival or occurrence of chronic allograft nephropathy.

2.

Background

Vascular endothelial dysfunction occurs in the kidney graft from marginal brain death (BD) donors and may be responsible for a low success rate after transplantation.

Methods

BD was induced in 16 dogs for 6 hours. Immediately after inflation of the intracranial balloon, the treated group (n = 8) received a 40 mg/kg bolus followed by a 3 mg/kg/min infusion of L-arginine for 30 minutes. Renal vascular function and hemodynamic and biochemical parameters were determined.

Results

BD caused vasoconstriction, increases in renal venous nitrite (4.9 ± 0.8 versus 2.6 ± 0.1; P < .05) and myeloperoxidase levels (1.43 ± 0.04 versus 2.43 ± 0.23; P < .001), and reduced vasodilatation of the renal artery in response to acetylcholine. L-arginine diminished the renal vasoconstriction induced by 6-hour BD (RVR = 0.92 ± 0.06 versus 1.38 ± 0.003 in controls; P < .05), maintained renal oxygen extraction within the physiological range (17.5 ± 4.6% versus 25.4 ± 2.9% in controls; P < .05), and prevented the rise in myeloperoxidase (1.69 ± 0.19; P < .05 versus controls) and nitrite levels (3.3 ± 0.5; P < .05), with preservation of endothelium-dependent vasodilatation (P < .05 versus controls).

Conclusions

The findings suggest that exogenous L-arginine supplementation may preserve endothelial vascular function in the kidney before procurement from marginal BD donors.

3.

Background

It is generally recognized that living donor kidney transplantation (LDKT) grafts are superior to deceased donor kidney transplantation (DDKT) grafts. We compared survival and functional outcomes of LDKT and DDKT grafts.

Methods

Among 1000 kidneys transplanted from 1995 to 2008, we selected grafts surviving >5 years, excluding pediatric, multi-organ transplantation, and retransplantations (n = 454).

Results

There were 179 kidneys from deceased donors and 275 from living donors. Recipients showed no difference in age, gender, or cause of renal failure. Donors were younger in the DDKT group (30.6 vs 38.5 years; P < .05). There were more male donors in the DDKT group (73.2% vs 54.5%; P < .05). Deceased donors showed a greater mean number of HLA mismatches (4.2 vs 2.7; P < .05). Death-censored graft survival at 10 years showed no difference (DDKT 88.9% vs LDKT 88.9%; P = .99). Mean serum creatinine at 5 years was 1.41 mg/dL for DDKT and 1.44 mg/dL for LDKT (P = .75). Mean estimated glomerular filtration rate at 5 years was 67.8 mL/min/1.73 m2 for DDKT and 62.1 mL/min/1.73 m2 for LDKT (P = .23). Twenty-three DDKT grafts (12.8%) and 47 LDKT grafts (17.1%) experienced acute rejection episodes (P = .22). DDKT recipients showed more cases of viral and bacterial infections compared with LDKT recipients (viral, 11.7% vs 2.2% [P < .05]; bacterial, 21.8% vs 7.3% [P < .05]).

Conclusion

Among kidney grafts surviving >5 years, there was no difference in survival or serum creatinine levels at 5 and 10 years between DDKT and LDKT grafts.

4.

Background

The clinical manifestation of ischemia/reperfusion injury in renal transplantation is delayed graft function (DGF), which is associated with an increase in acute rejection episodes (ARE), costs, and difficulties in immunosuppressive management. We sought to evaluate the impact of DGF after renal transplantation.

Methods

We evaluated a group of 628 patients undergoing deceased-donor renal transplantation between 2002 and 2005 at 3 Brazilian institutions to define the main characteristics of DGF.

Results

DGF incidence was 56.8%, and DGF was associated with elderly donors (P = .02), longer time on dialysis (P = .001), and greater cold ischemia time (CIT; P = .001). Upon multivariate analysis, time on dialysis >5 years increased DGF risk by 42% (P = .02) and CIT >24 hours increased it by 57% (P = .008). DGF was associated with a higher incidence of ARE: 27.7% in DGF versus 18.4% in patients with immediate graft function (P = .047). The ARE risk was 46% higher among individuals with DGF (P = .02), 44% higher among patients >45 years old (P < .001), 50% higher among those with >5 years on dialysis (P = .02), and 47% lower among those who were prescribed mycophenolate instead of azathioprine (P < .001). Patients with DGF showed worse 1-year graft function (54.6 ± 20.3 vs 59.6 ± 19.4 mL/min; P = .004), particularly those with ARE (55.5 ± 19.3 vs 60.7 ± 20.4; P = .009). One-year graft survival was 88.5% among DGF versus 94.0% among non-DGF patients.

Conclusion

The high incidence of DGF was mainly associated with prolonged CIT. DGF was related to ARE and had a negative influence on long-term graft function.

5.

Background

Hypophosphatemia is a common complication after renal transplantation. Hyperparathyroidism has long been thought to be the cause, but hypophosphatemia can persist after high parathyroid hormone (PTH) levels normalize. Furthermore, calcitriol levels remain inappropriately low after transplantation, suggesting that mechanisms other than PTH contribute. Fibroblast growth factor 23 (FGF-23) induces phosphaturia, inhibits calcitriol synthesis, and accumulates in chronic kidney disease. We performed a prospective study to investigate whether FGF-23 contributes to hypophosphatemia early after renal transplantation.

Methods

We measured FGF-23 levels before and at 1, 2, 4, and 12 weeks after transplantation in 20 renal transplant recipients. Serum creatinine, calcium (Ca), phosphate (Pi), intact PTH (PTH), and 1,25-dihydroxy vitamin D (1,25(OH)2VitD) were measured at the same time.

Results

FGF-23 levels decreased by 97% at 4 weeks post-renal transplantation (PRT) (7,471 ± 11,746 vs 225 ± 295 pg/mL; P < .05) but were still above normal. PTH and Pi levels also decreased significantly after transplantation, and Ca and 1,25(OH)2VitD slightly increased. PRT hypophosphatemia (<2.5 mg/dL) developed in 15 patients (75%) at 4 weeks and 12 patients (60%) at 12 weeks. Compared with nonhypophosphatemic patients, hypophosphatemic patients had higher FGF-23 levels at 4 weeks PRT (303 ± 311 vs 10 ± 6.9 pg/mL; P = .02). FGF-23 levels were inversely correlated with Pi (r2 = 0.406; P = .011); PTH was not independently associated with Pi (r2 = 0.132; P = .151).

Conclusions

FGF-23 levels decrease dramatically after renal transplantation, suggesting that FGF-23 is cleared by the kidney; however, residual FGF-23 may contribute to the rapid decrease in Pi during the early PRT period. FGF-23 levels, but not PTH levels, were independently associated with PRT hypophosphatemia.

6.

Background

Hepatitis C virus (HCV) is the most common indication for liver transplantation, but HCV recurrence is frequent after 1 year and is associated with increased morbidity and mortality. Oxidative stress (OxS) is involved in the pathogenesis of HCV, but little is known about its presence prior to disease recurrence.

Aim

To determine whether HCV-positive liver transplant recipients (HCV-OLTs) without recurrence at 6 months were oxidatively stressed.

Methods

We studied 33 HCV-OLTs, 12 controls, and 39 HCV-positive nontransplant patients (HCV-NTs). OxS was assessed using commercial kits measuring liver lipid peroxidation (LPO) and antioxidant potential (AOP). Plasma vitamin E and retinol (HPLC) and vitamin C (spectrophotometry) were assessed. We collected anthropometry and 3-day food records. Analysis was performed by the Kruskal-Wallis test, with data expressed as mean values ± standard errors of the mean.

Results

Waist-hip ratio was higher in both HCV-OLTs and HCV-NTs compared with controls. HCV-OLTs showed higher hepatic LPO (μmol malondialdehyde/g tissue) versus controls (1.4 ± 0.20 vs 0.54 ± 0.10; P = .010) and versus HCV-NTs (0.98 ± 0.17; P = .030). No significant differences were found among the groups in hepatic AOP. However, lower plasma AOP (μmol UEA) was observed in HCV-OLTs (0.07 ± 0.008) versus controls (0.17 ± 0.04; P = .021) and in HCV-NTs (0.08 ± 0.009; P = .015) versus controls. Plasma γ-tocopherol was higher in HCV-OLTs and HCV-NTs compared with controls (P = .001). Vitamin A intake was lower in HCV-OLTs compared with the other two groups (P = .001).

Conclusions

HCV-OLTs without disease recurrence are oxidatively stressed compared with controls and HCV-NTs. Future research is needed to determine the impact of this increased oxidative stress on HCV disease recurrence.

7.

Introduction

The success of simultaneous pancreas-kidney transplantation (SPK) depends to a large degree on avoiding surgical complications in the early postoperative period. The aim of this study was to analyze the Pre-procurement Pancreas Allocation Suitability Score (P-PASS) and the deceased-donor parameters included within it as risk factors for early surgical complications after SPK.

Material and Methods

Forty-six consecutive donors whose kidney and pancreas were simultaneously transplanted were included in the study.

Results

Donors were older among recipients who lost their pancreatic grafts: 30.4 ± 6.9 versus 24.1 ± 6.9 years. Donors were also older among recipients who lost their pancreatic grafts or died compared with those discharged with a functioning graft: 29.3 ± 5.7 versus 24.0 ± 6.9 years. Donor body mass index (BMI) was higher among patients who died compared with those who were discharged: 25.3 ± 1.1 versus 23.2 ± 2.5 kg/m2. P-PASS was higher in patients who lost their pancreatic grafts (17.6 ± 2.1 vs 15.2 ± 1.8), died (17.2 ± 1.9 vs 15.3 ± 1.9), lost their pancreatic grafts or died (17.0 ± 2.2 vs 15.2 ± 1.8), or developed intra-abdominal infections (IAI; 17.1 ± 1.7 vs 15.0 ± 1.8). The incidence of donors ≥30 years old was higher among recipients with IAI (45.4% vs 14.3%; P = .04). A higher rate of donors with P-PASS >16 was found among patients who lost their pancreatic grafts (26.7% vs 3.2%), died (26.7% vs 3.2%), lost their pancreatic grafts or died (33.3% vs 6.4%), or experienced IAI (46.7% vs 9.7%). Multivariate logistic regression analysis revealed P-PASS (odds ratio, 2.57; P = .014) and serum sodium (odds ratio, 0.91; P = .048) to be important predictors of IAI development.

Conclusion

Older donor age and higher donor BMI increased the risk of IAI, pancreatic graft loss, or recipient death after SPK. Transplantation of a pancreas from a donor with a low P-PASS was associated with a lower risk of surgical complications after SPK.

8.

Introduction

Maintenance of target blood levels of immunosuppressive drugs is one of the main factors determining transplant function. The aim of this study was to assess the effect of conversion from twice-daily tacrolimus (Tc) to the prolonged-release form administered once daily (Tc-pr), including the variability of blood concentrations and glomerular filtration rates, in kidney transplantation patients.

Materials and Methods

This retrospective analysis evaluated 52 patients (23 female, 29 male) with established grafts who underwent a scheduled change of treatment from Tc to Tc-pr. We examined data from six consecutive visits before and six visits after conversion.

Results

The average daily dose of Tc was 3.8 ± 2.6 mg/24 h, and the mean coefficient of variation (CoV) calculated from the visits before conversion was 68%. After conversion, the mean total daily dose of Tc-pr (3.2 ± 1.8 mg) was not significantly lower, nor was the mean CoV over the six subsequent visits (57%; P = NS). Blood concentrations in both analyzed periods remained within the target range (Tc 6.7 ± 2.9 ng/mL versus Tc-pr 5.0 ± 1.1 ng/mL), with a lower CoV of the concentrations for Tc-pr than for Tc (22% versus 44%; P < .001). There was no difference in graft function between the analyzed periods. After conversion, lower blood glucose levels were observed: 103.4 ± 28.3 mg/dL versus 95 ± 25.9 mg/dL (P < .03).
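The coefficient of variation reported here is simply the standard deviation of the measured concentrations across visits divided by their mean, expressed as a percentage. A minimal sketch of the calculation, using hypothetical tacrolimus levels rather than the study's data:

```python
from statistics import mean, stdev

def cov_percent(levels):
    """Coefficient of variation of drug levels across visits, as a percentage."""
    return 100 * stdev(levels) / mean(levels)

# Hypothetical trough levels (ng/mL) over six visits
tc_levels = [4.1, 9.0, 5.5, 8.2, 3.9, 7.3]     # twice-daily form (illustrative)
tcpr_levels = [5.2, 6.1, 4.8, 5.6, 5.9, 5.0]   # prolonged-release form (illustrative)

cov_tc = cov_percent(tc_levels)
cov_tcpr = cov_percent(tcpr_levels)
```

In this invented example the prolonged-release series yields the lower CoV, mirroring the direction of the study's finding.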

Conclusions

The slow-release form of tacrolimus provided greater stability of blood drug concentrations than the standard form administered twice daily. The change from Tc to Tc-pr dosing did not affect organ function but seemed to improve glycemic control.

9.

Background

Primary graft failure (PGF) is a severe complication responsible for 42% of in-hospital mortality after heart transplantation. It has been postulated that once 30-day survival is achieved, patients with PGF have no increased risk of death. Levosimendan increases 30-day survival among patients with PGF. Herein we report a 3-year single-center follow-up of a patient cohort including PGF cases treated with levosimendan.

Methods

From September 2005 to December 2006, 53 patients underwent heart transplantation at our institution, including 12 (22.6%) who presented with PGF and were treated with levosimendan as a 24-hour continuous infusion (0.10 μg/kg/min). Risk factors for 1-year and 3-year mortality were analyzed, and 30-day, 1-year, and 3-year survival was compared between patients with and without PGF (n = 41).

Results

There were no significant differences in donor age, weight, height, or serum sodium between the groups. However, ischemia time (259 ± 53 vs 227 ± 50 min; P = .06) and recipient age (51.6 ± 15 vs 41.5 ± 21 years; P = .07) were greater among the PGF patients. The 30-day survival rate was 92% in both groups. At 1 and 3 years, survival was significantly lower in the PGF cohort (50% vs 80.6% and 41.7% vs 80.6%; P < .05), with 86.5% of PGF patients succumbing to noncardiac causes, predominantly infections.

Conclusions

Although treatment of PGF with levosimendan increased 30-day survival, 1-year and 3-year survival rates were reduced in this cohort. PGF was associated with poor long-term outcomes, which may be a consequence of systemic malperfusion during the period of low cardiac output after transplantation.

10.

Background

In Spain, the number of ideal kidney transplant donors has fallen, while the number of older recipients on the waiting list has increased.

Aim

To analyze the results of expanded criteria cadaveric donor kidney transplants into older recipients using grafts selected by kidney biopsy.

Patients and methods

We studied 360 kidney transplant recipients followed to December 2009: 180 in the study group and 180 in a control group of younger patients who received grafts from non-expanded-criteria donors between 1999 and 2006. A paraffin-embedded kidney biopsy was evaluated for the percentage of sclerosed glomeruli and degrees of arteriolar hyalinosis, intimal wall thickening, interstitial fibrosis, and tubular atrophy.

Results

Significant differences were observed in donor age (63.50 ± 5.46 vs 31.90 ± 13.29 years; P < .001) and recipient age (58.40 ± 8.80 vs 40.71 ± 13.23 years; P < .001). Donor renal function was significantly worse in the expanded-criteria group (90.80 vs 108.11 mL/min/1.73 m2; P = .006), remaining so over time in the recipient (at 1 year, 42.08 vs 63.71 [P < .001]; at 3 years, 41.25 vs 62.31 [P < .001]; and at 7 years, 38.17 vs 64.18 [P < .001]). Censored 7-year graft survival was 73% versus 87% (P < .001), with similar patient survival (90.5% vs 95%; P = .39).

Conclusions

Selection of expanded-criteria donors by kidney biopsy resulted in good renal function and good graft and patient survival at 7 years in older recipients.

11.

Background

Calcineurin inhibitor (CNI)-free immunosuppression is used increasingly after heart transplantation to avoid CNI toxicity, but in the absence of a randomized trial, concerns remain over an increased rejection risk.

Methods

We studied the incidence of graft rejection episodes among all cardiac graft recipients from the first introduction of CNI-free protocols. We compared events during CNI-free and CNI-containing immunosuppression among 231 transplant recipients (overall mean age, 55.2 ± 11.8 years), from a mean of 5.2 ± 5.4 years after transplantation through a mean follow-up of 3.1 ± 1.4 years. Acute rejection episodes requiring treatment were defined according to International Society for Heart and Lung Transplantation criteria.

Results

During the total follow-up of 685 patient-years (CNI-containing, 563; CNI-free, 122), we performed 1,374 biopsies, which diagnosed 78 rejection episodes. More biopsies were performed in CNI-free patients (0.22 vs 0.13 biopsies per patient-month; P < .05). The incidence of rejection episodes per patient-month was significantly higher on CNI-free than on CNI-containing therapy, among patients switched both early and later after heart transplantation: within 1 year, 0.119 versus 0.035 (P = .02); beyond 1 year, 0.011 versus 0.004 (P = .007); beyond 2 years, 0.007 versus 0.003 (P = .04); and beyond 5 years, 0.00578 versus 0.00173 (P = .04).

Conclusions

The incidence of rejection during CNI-free immunosuppression after heart transplantation was significantly increased in both the early and later postoperative periods. Given the potentially long delay before rejection occurs, patients should be monitored closely for several months after a switch to a CNI-free immunosuppressive protocol.

12.

Introduction

Statins, although the treatment of choice for dyslipidemia after heart transplantation (HT), are not always well tolerated or effective. In such cases, administration of ezetimibe may be useful.

Aim

The aim of this study was to assess the efficacy and safety of ezetimibe, with or without statins, after HT.

Method

Thirty-six HT patients (97% male; overall mean age, 57 ± 13 years) who were unable to reach target lipid levels with statins alone and/or were intolerant of statins were prescribed ezetimibe, with or without a statin. Efficacy and safety were evaluated after 1, 3, 6, and 12 months.

Results

Thirty-four patients were evaluated at 1 month and at 12 months. Ezetimibe was prescribed to 27 patients (75%) because of statin inefficacy and to 9 patients (25%) because of statin intolerance, manifested as myalgia in 4 cases (11%), hepatotoxicity in 2 cases (6%), and rhabdomyolysis in 3 cases (8%). Lipid levels (mg/dL; baseline vs 1 year) were as follows: total cholesterol, 235 ± 49 versus 167 ± 32 (P = .013); LDL cholesterol, 137 ± 47 versus 89 ± 29 (P = .001); HDL cholesterol, 54 ± 13 versus 51 ± 10 (P = .235); and triglycerides, 243 ± 187 versus 143 ± 72 (P = .022). There were no cases of liver toxicity, renal dysfunction, or significant alteration of immunosuppressive pharmacokinetics. Ezetimibe was withdrawn in 2 patients, because of hand edema in one and asymptomatic recurrence of rhabdomyolysis (first caused by statins) in the other.

Conclusions

With or without a statin, ezetimibe was generally well tolerated, reducing total cholesterol, LDL cholesterol, and triglyceride levels with no long-term alteration of HDL cholesterol levels. CPK surveillance is recommended because of a slight continued risk of adverse effects. Further studies should evaluate any survival benefit.

13.

Background

We compared values of apparent diffusion coefficient (ADC) with renal function indices among a population of kidney transplant recipients who underwent magnetic resonance with diffusion-weighted imaging (DWI) of their grafts.

Materials and Methods

Thirty-five patients with kidneys transplanted into the right iliac fossa were studied using 1.5-T magnetic resonance. Diffusion echo-planar sequences with several b-values were acquired to investigate the transplanted grafts. Patients were divided into 3 groups according to creatinine clearance: Group A, clearance >60 mL/min; Group B, clearance >30 and ≤60 mL/min; and Group C, clearance ≤30 mL/min. ADC values were compared between groups using the Mann-Whitney U test. Receiver operating characteristic (ROC) curves were used to discriminate the normal-function (Group A) and renal-failure (Group C) cohorts.

Results

Comparing mean ADC values between Group A and Group C, we observed a significant difference (P = .0003), with higher ADC values among patients with normal creatinine clearance (>60 mL/min). Comparing Groups B and C showed no significant difference (P = .05), nor did comparing Groups A and B (P = .38). For predicting normal clearance, the Group A ROC curve showed an area under the curve (AUC) of 0.780, with a sensitivity of 92.3% and a specificity of 68.2% at a threshold ADC of ≥2.08 × 10−3 mm2/sec. For predicting low clearance, the Group C ROC curve showed an AUC of 0.846, with a sensitivity of 83.3% and a specificity of 82.6% at a threshold ADC of ≤2.07 × 10−3 mm2/sec.
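At a fixed ADC cutoff, the reported sensitivity and specificity reduce to simple counts over labeled cases. A hedged sketch of that calculation; the patient values below are invented for illustration, not the study's measurements:

```python
def sens_spec_at_threshold(adc_values, is_normal, threshold):
    """Sensitivity/specificity for predicting normal function when ADC >= threshold.

    adc_values: ADC in units of 1e-3 mm^2/sec.
    is_normal: True if creatinine clearance is >60 mL/min (the positive class).
    """
    tp = sum(1 for a, n in zip(adc_values, is_normal) if n and a >= threshold)
    fn = sum(1 for a, n in zip(adc_values, is_normal) if n and a < threshold)
    tn = sum(1 for a, n in zip(adc_values, is_normal) if not n and a < threshold)
    fp = sum(1 for a, n in zip(adc_values, is_normal) if not n and a >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical ADC measurements (x 10^-3 mm^2/sec) and function labels
adc = [2.30, 2.15, 2.10, 2.05, 1.95, 1.90]
normal = [True, True, True, False, False, False]
sens, spec = sens_spec_at_threshold(adc, normal, 2.08)
```

Sweeping the threshold over all observed values and plotting sensitivity against 1 − specificity would trace the ROC curve whose area is reported above.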

Conclusions

Updating our experience with 35 patients, DWI was confirmed to be a promising noninvasive tool to assess renal function; an ADC ≥2.08 × 10−3 mm2/sec may be used as a threshold to predict normal clearance. However, the overlap of ADC values between groups remains a limitation.

14.

Background

Metabolic syndrome (MetS) may represent a risk factor for the long-term renal function of kidneys from living donors. The aim of this study was to evaluate the impact of MetS on renal function in donors.

Methods

Data on the presence or absence of MetS and on renal function, assessed by estimated glomerular filtration rate (eGFR), were obtained from 140 kidney donors before nephrectomy (BN) and at follow-up (AF). Donors were divided into those with MetS (group 1; n = 28) and those without MetS (group 2; n = 112).

Results

We observed a significantly greater reduction in eGFR from BN to AF in the group with MetS: 27.5% (19.3-33.0) versus 21.4% (9.6-34.1); P = .02. In a Cox regression model including age, gender, serum uric acid, body mass index (BMI), and basal eGFR, MetS BN (hazard ratio, 2.2; 95% confidence interval [CI], 1.21-4.01; P = .01) was an independent factor associated with a greater risk of an eGFR <70 mL/min/1.73 m2 at follow-up (P < .001). Additionally, age (hazard ratio, 1.03; 95% CI, 1.01-1.06; P < .001) and female gender (hazard ratio, 1.86; 95% CI, 1.03-3.36; P = .03) were associated with a greater decrease in eGFR. Individuals with MetS BN reached a GFR <70 mL/min/1.73 m2 at a significantly shorter follow-up time (5.6 ± 0.8 years) than those without MetS (12.8 ± 1.0 years; P = .001).

Conclusion

Kidney donors with MetS before nephrectomy experience a significantly greater decrease in eGFR at follow-up.

15.

Introduction

Proteinuria is related to a poor prognosis for graft survival.

Materials and methods

We undertook a retrospective study of renal transplant biopsies performed because of proteinuria between 2006 and 2009. Data were collected on demographic, analytical, and histological characteristics.

Results

The study included 49 biopsies; 65% of the patients were men, and the overall mean age was 52 ± 13 years. The mean time from transplantation to biopsy was 6.5 ± 5.3 years. All cases displayed proteinuria: 2.2 g/24 h (1.2-3.2). In 56% of cases, proteinuria was also associated with a worsening glomerular filtration rate (GFR; MDRDa 33 ± 16 mL/min). In 14% of cases the sample was insufficient to determine glomerular pathology, whereas 51% displayed glomerular disease, including transplant glomerulopathy (40%), glomerulonephritis (48%), and diabetes (12%). Interstitial fibrosis and tubular atrophy (IFTA) was present in 85%: 33% mild, 27% moderate, and 25% severe. Arteriolar hyalinosis was present in 60%. Thirty-four percent of subjects lost their grafts at a mean of 11 ± 9 months after biopsy. GFR at the time of biopsy was worse among subjects who returned to dialysis than among those who retained function (MDRDa 22 ± 7.5 vs 34 ± 15 mL/min; P = .006). Proteinuria was also greater among those who lost their grafts (4.1 ± 3.4 vs 2.1 ± 1.6 g/24 h; P = .007). The absolute increase in the risk of graft loss was 52% among subjects with moderate to severe versus mild IFTA (relative risk [RR], 7; confidence interval [CI], 1.8-28; P < .001). Glomerulosclerosis >50% was also associated with a 48% absolute increase in the risk of graft loss compared with no glomerulosclerosis or <50% (RR, 3; CI, 1.5-12; P = .02). After biopsy, the dose of angiotensin-converting enzyme inhibitors and/or angiotensin receptor antagonists was increased in 90% of cases, and 34% of subjects experienced a change in immunosuppression.
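The relative risk and absolute risk increase quoted above are a ratio and a difference of simple event proportions. A sketch of the arithmetic, with invented counts chosen only to reproduce an RR of 7 (these are not the study's group sizes):

```python
def rr_and_ari(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk and absolute risk increase for a binary outcome.

    The 'exposed' group carries the risk factor (e.g., moderate-severe IFTA);
    the 'unexposed' group does not (e.g., mild IFTA).
    """
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed, risk_exposed - risk_unexposed

# Hypothetical counts: 14/20 graft losses with the risk factor, 2/20 without
rr, ari = rr_and_ari(14, 20, 2, 20)
```

Here the invented counts give a relative risk of 7.0 and an absolute risk increase of 0.6 (60 percentage points).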

Conclusions

Among transplant patients undergoing biopsy because of proteinuria, graft loss was associated with reduced GFR and the amount of proteinuria at the time of biopsy, as well as with the degrees of IFTA and glomerular involvement.

16.

Objective

Over the past years, both donor and recipient profiles have changed in heart transplantation. Satisfactory clinical outcomes with marginal donors in candidates >60 years of age have led us to allocate suboptimal donors to younger recipients as well. We therefore retrospectively reviewed our experience.

Methods

Among 199 patients undergoing heart transplantation from January 2000 to February 2010, 83 (41%) were aged 61-72 years; the other 116 (59%) were aged 18-60 years. According to their clinical conditions as heart transplantation candidates, they were classified into 4 groups: younger recipients (n = 116) of either optimal donors (n = 72; group 1 [G1]) or marginal donors (n = 44; group 2 [G2]), and older recipients (n = 83) of either marginal grafts (n = 70; group 3 [G3]) or optimal grafts (n = 13; group 4 [G4]). Gender distribution, cause of end-stage heart failure, incidence of preoperative pulmonary hypertension, pretransplantation clinical status, and mean follow-up were not significantly different among the 4 groups.

Results

Overall 30-day survival was 90 ± 1%, and the 10-year rate was 78 ± 9%. Thirty-day and 10-year actuarial survival rates were, respectively, 94 ± 4% and 87 ± 1% for G1; 86 ± 5% and 84 ± 7% for G2; 88 ± 4% and 71 ± 7% for G3; and 100% and 82 ± 7% for G4 (P = .7). Among the 4 groups there was no significant difference in freedom from graft failure (P = .3), right ventricular failure (P = .3), acute rejection episodes (P = .2), chronic rejection (P = .2), neoplasia (P = .5), or chronic renal failure (P = .1). Older recipients of marginal donors (G3) had a 4% (n = 3) prevalence of permanent pacemaker implantation versus 3% (n = 2) in G2 (P = .1).

Conclusion

Our results suggest that extended donor and recipient criteria do not compromise clinical outcomes after transplantation.

17.

Background

The increase in indications for liver transplantation has led to acceptance of donors with expanded criteria. The donor risk index (DRI) was validated with the aim of being a predictive model of graft survival based on donor characteristics. Intraoperative arterial hepatic flow and indocyanine green clearance (plasma clearance rate of indocyanine green [ICG-PDR]) are easily measurable variables in the intraoperative period that may be influenced by graft quality. Our aim was to analyze the influence of DRI on intraoperative liver hemodynamic alterations and on intraoperative dynamic liver function testing (ICG-PDR).

Methods

This investigation was an observational study of a single-center cohort (n = 228) with prospective data collection and retrospective data analysis. Measurement of intraoperative flow was made with a VeriQ flowmeter based on measurement of transit time (MFTT). The ICG-PDR was obtained in all patients with a LiMON monitor (Pulsion Medical Systems AG, Munich, Germany). DRI was calculated using a previously validated formula. Normally distributed variables were compared using Student's t test; otherwise, the Mann-Whitney U test or Kruskal-Wallis test was applied, depending on whether there were 2 or more than 2 groups. Qualitative variables and risk measurements were analyzed using the chi-square test. P < .05 was considered statistically significant.

Results

The DRI (mean ± SD) was 1.58 ± 0.31. The group with DRI >1.7 (poorer quality) had an intraoperative arterial flow of 234.2 ± 121.35 mL/min compared with 287.24 ± 156.84 mL/min in the group with DRI <1.7 (higher quality; P = .02). The group with DRI >1.7 had an ICG-PDR of 14.75 ± 6.52%/min at 60 minutes after reperfusion compared with 16.68 ± 6.47%/min in the group with DRI <1.7 (P = .09).

Conclusion

Poor-quality grafts have greater susceptibility to ischemia-reperfusion damage. Decreased intraoperative hepatic arterial flow may represent an increase in intrahepatic resistance early in the intraoperative period.

18.

Purpose

This retrospective analysis evaluated the impacts of sirolimus (SRL), cyclosporine (CsA), and steroids (S) on the occurrence, treatment, and complications of new-onset diabetes after transplantation (NODAT).

Methods

We compared 4 groups: group 1, SRL plus full-exposure CsA/S (n = 118); group 2, full-exposure CsA/S/no SRL ± antiproliferative drug (n = 141); group 3, SRL plus reduced CsA exposure/S (n = 212); and group 4, no SRL/full-exposure CsA/S ± antiproliferative drug (n = 43).

Results

NODAT rates reflected the level of CsA exposure: at 10 years, 54% versus 30% for groups 1 versus 2 (P = .0001); at 5 years, 30% versus 21% for groups 3 versus 4 (P = .3). Overall, 81% of cases were detected within the first year. The lower NODAT rate in group 3 reflected a benefit of reduced CsA exposure (P = .02; hazard ratio [HR], 1.006). Group 1 showed higher CsA (P = .0001) and lower SRL concentrations (P = .016) versus group 3. CsA exposure, which closely correlated with NODAT in group 1 (P = .0001), was the major difference between groups 1 and 3 (P = .04; HR, 0.97). Differences in steroid treatment did not play a significant role in NODAT. Comparing groups 1 and 2, SRL was an independent risk factor for NODAT (P = .004; HR, 3.5).

Conclusions

Our 10-year experience revealed SRL to be an etiologic agent for NODAT, displaying interactive, possibly pharmacokinetic and pharmacodynamic, effects with concomitant CsA in combination treatment.

19.

Background

In the Philippines, maintenance of immunosuppression may not always be affordable, leading to acute rejection and graft loss. The availability of the generic cyclosporine Arpimune could be economically beneficial, but its safety and efficacy should be established.

Methods

This prospective cohort study enrolled 30 renal transplant patients who received Arpimune with mycophenolate/prednisone. Their results over 6 months were compared with those of 30 matched control patients who received Neoral during the same period. Areas under the concentration-time curve (AUC0-4) were determined after intake of Arpimune, and therapeutic drug monitoring was performed using cyclosporine levels drawn 2 hours after dosing. Pearson correlation was used to assess the linearity of the relationship between the generic cyclosporine concentrations and AUC0-4; the chi-square test was used for categorical comparisons.
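The Pearson correlation between a sampling-point concentration and the abbreviated AUC is computed directly from paired measurements. A self-contained sketch with invented values (the r = 0.813 reported in this study comes from its own patient data, not from this example):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired 2-hour levels (ng/mL) and AUC0-4 values (ng*h/mL)
c2 = [600, 800, 950, 1100, 1300]
auc04 = [2500, 3100, 3600, 4300, 4900]
r = pearson_r(c2, auc04)
```

A value of r close to 1 indicates that the single 2-hour sample tracks total early exposure nearly linearly, which is the rationale for using it as a monitoring surrogate.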

Results

The abbreviated concentration AUC of Arpimune was similar to that of Neoral, and the 2-hour sampling point showed the best correlation (r = 0.813; P < .001). Calculated creatinine clearance (mL/min) for Arpimune versus Neoral was 71.36 ± 13 versus 68.03 ± 16.6 (P = .61) at 1 month, 70.4 ± 14.8 versus 64.2 ± 11.4 (P = .12) at 3 months, and 74.02 ± 15.8 versus 62.03 ± 12.1 (P = .002) at 6 months. Two Arpimune versus 4 Neoral patients (P = .67) developed biopsy-proven acute rejection. One septic death occurred in the Arpimune group. Graft survival was 100% in both groups. Hyperlipidemia was the most frequent side effect in both.

Conclusions

The AUC of Arpimune was similar to that of Neoral. The generic cyclosporine Arpimune provided effective immunosuppression in the 6 months after transplantation, with renal allograft function similar to that with Neoral and minimal rates of acute rejection and adverse events.

20.

Background

Left ventricular hypertrophy, an independent factor for cardiovascular mortality, is frequent among renal transplant recipients (RTR). We investigated changes in left ventricular mass (LVM) after grafting and associations with possible causal factors, especially glucose metabolism and oxidative stress.

Methods

We performed a prospective study of 37 RTR without prior diabetes mellitus who were evaluated at three times after transplantation (medians of 0.6, 16, and 28 months) by means of the LVM index (LVMI; an echocardiographic measure of LVM related to body surface area, g/m2), an oral glucose tolerance test, determinations of malondialdehyde and total glutathione (GSH), and the glomerular filtration rate (GFR) estimated by the Modification of Diet in Renal Disease formula. We calculated the overall increment (ΔLVMI) and percent change of LVMI. Patients were diagnosed as prediabetic (PD) or as having new-onset diabetes after transplantation (NODAT) according to ADA criteria.
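For reference, the Modification of Diet in Renal Disease estimate mentioned above can be sketched in its common 4-variable form. The abstract does not specify which MDRD variant or coefficient (175 vs 186) was used, so this is an illustrative assumption:

```python
def egfr_mdrd4(scr_mg_dl, age_years, female, black):
    """4-variable MDRD eGFR (mL/min/1.73 m^2), IDMS-traceable coefficient 175.

    scr_mg_dl: serum creatinine in mg/dL; age_years: age in years;
    female/black: booleans applying the standard correction factors.
    """
    egfr = 175 * (scr_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: a 50-year-old nonblack man with serum creatinine 1.0 mg/dL
egfr = egfr_mdrd4(1.0, 50, female=False, black=False)  # roughly 79 mL/min/1.73 m2
```

Note that the estimate is already normalized to 1.73 m2 of body surface area, unlike measured creatinine clearance.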

Results

The mean LVMI decreased significantly over time in the whole group: baseline, 108.34 ± 27.71 g/m2; middle, 100.03 ± 27.53 g/m2; final, 90.62 ± 24.06 g/m2 (P < .001). However, 13.5% of subjects showed an increased LVMI, and 59.5% showed a decrease of less than 20%. Patients with NODAT at the end of the study showed a positive ΔLVMI, which was negative in nondiabetics (0.24 ± 16.14 versus -19.86 ± 12.61 g/m2; P = .018). Compared with ΔLVMI(−) recipients, patients with ΔLVMI(+) showed greater proportions of PD and NODAT at baseline (60% and 40% versus 18.8% and 12.5%; P = .017), significantly higher fasting glycemia at all times, lower estimated GFR, and greater increments of malondialdehyde and GSH over time. Those with a <20% LVMI decrease experienced progressive GFR impairment over time, as opposed to those with an LVMI decrease >20%, who showed greater and improving GFR throughout the study.

Conclusions

LVMI does not always improve in RTR; the evolution of ventricular mass after renal transplantation is influenced by glucose metabolism disorders, oxidative stress, and graft function.
