Similar Articles
20 similar articles found (search time: 31 ms)
1.
BACKGROUND: Postischemic acute renal transplant failure occurs in approximately one fourth of all deceased-donor transplantations. Uncertainty exists regarding the putative association between the use of angiotensin-converting enzyme inhibitors (ACEIs) or angiotensin II AT1 receptor blockers (ARBs) and kidney transplant graft survival in patients with delayed allograft function. METHODS: We conducted an open cohort study of all 436 patients who experienced acute renal transplant failure out of the 2,031 subjects who received their first kidney transplant at the Medical University of Vienna between 1990 and 2003. Actual and functional graft survival was compared between users and nonusers of ACEI/ARB using exposure propensity score models and time-dependent Cox regression models. RESULTS: Ten-year actual graft survival averaged 44% in the ACEI/ARB group, but only 32% in patients without ACEI/ARB (P=0.002). The hazard ratio of actual graft failure was 0.58 (95% confidence interval: 0.35-0.80, P=0.002) for ACEI/ARB users compared with nonusers. Seventy-one percent of subjects with ACEI/ARB had a functional graft at 10 years versus 64% of ACEI/ARB nonusers (P=0.027). The hazard ratio of functional graft loss was 0.48 (95% confidence interval: 0.24-0.91, P=0.025). CONCLUSIONS: Use of ACEI/ARB in patients experiencing delayed allograft function was associated with longer actual and functional transplant survival.

2.
Whether the use of angiotensin-converting enzyme inhibitors or angiotensin receptor blockers (ACEI/ARB) is beneficial in renal transplant recipients remains controversial. In this retrospective study of 505 renal transplant recipients, we analyzed blood pressure and graft survival according to antihypertensive treatment with ACEI/ARB and/or calcium channel blockers (CCB) over a period of 10 years. Patients were stratified according to their blood pressure 1 year after transplantation [controlled (≤130/80 mmHg; CTR, 181 patients) and noncontrolled (>130/80 mmHg; non-CTR, 324 patients)] and according to antihypertensive treatment (ACEI/ARB and/or CCB taken for at least 2 years). One year after transplantation, 88.4% of CTR and 96.6% of non-CTR received antihypertensive treatment (P < 0.05). Graft survival was longer in CTR than in non-CTR (P < 0.05). Importantly, graft survival was longer in patients who received long-term treatment with ACEI/ARB, CCB, or a combination of ACEI/ARB and CCB (P < 0.001). The beneficial effect of ACEI/ARB therapy was more pronounced in non-CTR than in CTR. We conclude that blood pressure control is a key target for long-term graft survival in renal transplant patients. Long-term ACEI/ARB and CCB therapy is beneficial for graft survival, especially in patients with diabetes and/or albuminuria.

3.
He X, Johnston A. Transplantation. 2005;79(8):953-957.
BACKGROUND: After the introduction of cyclosporine A (CsA), 2-year graft survival of transplanted kidneys improved from less than 60% to more than 80%, but long-term graft survival and graft half-life have shown less change. This study investigates the impact of a range of demographic and treatment factors on long-term graft survival in renal recipients treated with CsA from all renal transplant centers in the United Kingdom. METHODS: Data were obtained from the Long-Term Efficacy and Safety Surveillance study of renal transplant recipients receiving CsA (Neoral; Novartis, Basel, Switzerland). A total of 1,757 de novo patients with a functioning graft at 1 year were evaluated. The endpoints considered were the need for regular dialysis or death. A stepwise stratified Cox model was used to identify the factors associated with outcome. RESULTS: Seven independent risk factors for allograft failure were identified: older recipient (hazard ratio [HR] 1.8, 95% confidence interval [CI] 1.2-2.6), male recipient (HR 1.8, 95% CI 1.2-2.7), younger donor (HR 1.7, 95% CI 1.2-2.5), above-average creatinine (HR 1.9, 95% CI 1.3-2.8), chronic allograft nephropathy (HR 7.0, 95% CI 4.7-10.4), diabetic recipient (HR 2.2, 95% CI 1.2-4.1), and neoplasm after transplant (HR 1.7, 95% CI 1.2-2.6). CONCLUSION: Seven independent risk factors were found to influence graft survival. Only two of these can be modified by clinical intervention: elevated serum creatinine at 1 year and the occurrence of chronic allograft nephropathy. To influence these two factors, optimization of immunosuppressive therapy is essential.

4.
BACKGROUND: Angiotensin-converting enzyme inhibitors (ACEI) or angiotensin II type 1 receptor blockers (ARB) are frequently prescribed to renal transplant recipients with a reduced glomerular filtration rate (GFR). The aim of this study was to investigate the association between ACEI/ARB use and serum potassium levels in renal graft recipients. METHODS: We conducted an open cohort study of 2041 first renal allograft recipients transplanted at the Medical University of Vienna between 1990 and 2003. Serum potassium levels were compared over an observation period of up to 10 years between subjects with versus without ACEI/ARB therapy using a mixed effects general linear model. The analysis was adjusted for several covariables known to influence serum potassium, such as the use of diuretics, beta blockers, calcineurin inhibitor (CNI)-based immunosuppression, estimated GFR, time since renal transplantation, diabetes, years on dialysis, and recipient age. RESULTS: The overall adjusted estimated serum potassium difference between recipients with versus without ACEI/ARB therapy was 0.08 mmol/l (P < 0.001). The use of diuretics was associated with a 0.11 mmol/l (P < 0.001) lower potassium concentration, whereas each GFR decrease of 10 ml/min led to an increase of 0.04 mmol/l (P < 0.001). CNI intake increased serum potassium by 0.06 mmol/l (P = 0.002). Furthermore, serum potassium increased by 0.17 mmol/l within the first decade after transplantation (P < 0.001) while holding the other covariables constant. No effect modification between ACEI/ARB and time since transplantation was observed. Nineteen subjects (2.4%) discontinued ACEI/ARB therapy due to hyperkalaemia. CONCLUSIONS: In summary, relevant hyperkalaemia associated with ACEI/ARB therapy is negligible in renal transplant recipients during long-term follow-up. The hyperkalaemic effect of ACEI/ARB is balanced by the use of diuretics.

5.
ACE-inhibitors and angiotensin receptor blockers (ARB) slow the progression of renal disease in non-transplant patients. A systematic review of randomized trials (n = 21 trials with 1549 patients) was conducted to determine the effect of ACE-inhibitor or ARB use following kidney transplantation. With a median follow-up of 27 months, ACE-inhibitor or ARB use was associated with a significant decrease in glomerular filtration rate (-5.8 mL/min; 95% CI -10.6 to -0.99). ACE-inhibitor or ARB use resulted in a lower hematocrit (-3.5%; 95% CI -6.1 to -0.95) and a reduction in proteinuria (-0.47 g/d; 95% CI -0.86 to -0.08), but no change in serum potassium (0.18 mmol/L; 95% CI -0.03 to 0.40). ACE-inhibitor or ARB use results in clinically important reductions in proteinuria, hematocrit, and glomerular filtration rate in renal transplant recipients, but there are insufficient data to determine the effect on patient or graft survival. Randomized trials of sufficient power and duration that examine these hard outcomes should be conducted. Until such trials are completed, this study provides quantitative estimates of the risks and benefits of ACE-inhibitor or ARB use that can be used by clinicians considering prescribing these medications to kidney transplant recipients or by researchers designing future trials.
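Pooled estimates of the kind this systematic review reports (e.g. -5.8 mL/min; 95% CI -10.6 to -0.99) are typically produced by inverse-variance weighting of per-study effects. As an illustrative sketch only, assuming a fixed-effect model and made-up study values (the review's actual model and data are not given here), a pooled mean difference can be computed from per-study point estimates and their 95% CIs:

```python
import math

def pooled_mean_difference(studies):
    """Fixed-effect inverse-variance pooling of mean differences.

    studies: list of (md, ci_low, ci_high) tuples, each a per-study
    mean difference with its 95% confidence interval.
    Returns (pooled_md, pooled_ci_low, pooled_ci_high).
    """
    weights, weighted_mds = [], []
    for md, lo, hi in studies:
        se = (hi - lo) / (2 * 1.96)   # back-calculate the SE from the 95% CI
        w = 1.0 / se ** 2             # inverse-variance weight
        weights.append(w)
        weighted_mds.append(w * md)
    pooled = sum(weighted_mds) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

# Hypothetical per-study GFR changes in mL/min -- NOT the review's data.
md, lo, hi = pooled_mean_difference([(-5.0, -10.0, 0.0), (-6.0, -8.0, -4.0)])
```

Note how the narrower CI of the second hypothetical study gives it most of the weight, pulling the pooled estimate toward -6 mL/min.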

6.
There have been few studies of patients with renal allografts functioning for more than 20 years. We sought to identify clinical factors associated with ultra long‐term (>20 year) renal allograft survival and to describe the clinical features of these patients. We performed a retrospective analysis of the Irish Renal Transplant Database and included 1174 transplants in 1002 patients. There were 255 (21.74%) patients with graft function for 20 years or more. Multivariate analysis identified recipient age (HR 1.01, CI 1.01–1.02), gender (male HR 1.25, CI 1.08–1.45), acute rejection (HR 1.26, CI 1.09–1.45) and transplant type (living related donor vs. deceased donor) (HR 0.52, CI 0.40–0.66) as significantly associated with long‐term graft loss. Median serum creatinine was 115 μmol/L. The 5‐year graft survival in 20‐year survivors was 74.7%. The mean age at death was 62.7 years (±10.6). The most common causes of death were cardiovascular disease and malignancy. The two major causes of graft loss were death (with function) and interstitial fibrosis/tubular atrophy. Comorbidities included skin cancer (36.1%), coronary heart disease (17.3%) and other malignancies (14.5%). This study identifies factors associated with long‐term allograft survival and a high rate of morbidity and early mortality in long‐term transplant recipients.

7.
AIM OF THE STUDY: Chronic glomerulonephritis (GN) is reported as a common cause of late kidney allograft loss. The aim of this study was to identify risk factors associated with kidney allograft loss in the course of posttransplantation GN. PATIENTS AND METHODS: The study analyzed 75 kidney allograft recipients with biopsy-confirmed posttransplantation GN, including 27 cases of immunoglobulin (Ig)A nephropathy (IgAN), 30 of membranous GN (MGN), 6 of mesangiocapillary GN (MCGN), and 12 of focal segmental glomerulosclerosis (FSGS). The risk factors for kidney allograft loss, defined as dialysis reintroduction after GN onset, were identified through a historical cohort study. CLINICAL FINDINGS: After the onset of posttransplantation GN, the median time to dialysis introduction was 46 months. The risk factors for kidney allograft loss were as follows: male gender (hazard ratio [HR] = 1.92; 95% confidence interval [CI] 1.0-3.70; P = .052), initial unsatisfactory kidney function (HR = 1.86 per 1 mg/dL serum creatinine increment; 95% CI 1.0-3.46; P < .05), graft dysfunction at diagnosis (HR = 1.65 per 1 mg/dL serum creatinine increment; 95% CI 1.32-2.07; P < .001), nephrotic syndrome (HR = 2.3; 95% CI 1.13-4.99; P < .05), late-onset GN (HR = 1.1 per each additional year of observation; 95% CI 1.0-1.21; P < .05), and MCGN as the type of GN. Enhanced immunosuppression increased, whereas ACEI and/or statin treatment decreased, the risk of return to dialysis, respectively: HR = 1.56, 95% CI 0.76-3.18, P = .22; HR = 0.39, 95% CI 0.16-0.98, P = .0037; and HR = 0.367, 95% CI 0.15-0.88, P = .025. CONCLUSIONS: Identification of risk factors can help discover patients who will have a faster progression to kidney allograft loss due to GN. In posttransplantation GN, statins and/or ACEI should be prescribed if there are no contraindications.

8.
Background: Erythrocytosis is relatively common after kidney transplantation and may have adverse consequences. This study examined whether the incidence of erythrocytosis has remained stable over time and explored the impact of this condition on patient outcomes. Methods: This was a retrospective single-center review of an incidence cohort (transplanted between 1993 and 2005). Predictors of erythrocytosis and hemoglobin levels and subsequent patient and allograft survival were examined. Results: Erythrocytosis (hemoglobin >170 g/L for >1 month) was observed in 59 of 511 recipients. Erythrocytosis developed in only 8.1% of those transplanted from 1997 to 2005, compared with 18.7% of those transplanted from 1993 to 1996 (p = 0.0005). Independent predictive factors were use of angiotensin converting enzyme inhibitors/angiotensin receptor blockers (ACEi/ARBs) (HR 0.176, 95% CI 0.040–0.71, p = 0.016), male gender (HR 3.72, 95% CI 1.54–9.0, p = 0.003), and mycophenolic acid agents (HR 0.49, 95% CI 0.237–0.99, p = 0.049). Patients with erythrocytosis had superior overall survival (HR for death 0.105, 95% CI 0.014–0.760, p = 0.026) but a trend toward worse death-censored graft loss (univariate HR 2.06, 95% CI 0.91–4.65, p = 0.084). Conclusions: The incidence of erythrocytosis is falling, likely related to greater ACEi/ARB use and possibly more antiproliferative immunosuppression. Patient survival is excellent in those with erythrocytosis, but long-term graft survival may be compromised.

9.
Risk factors and prognosis for proteinuria in renal transplant recipients
INTRODUCTION: Proteinuria in renal transplant recipients has been recognized as a risk factor for progression of chronic allograft nephropathy and for cardiovascular disease, the main causes of transplant failure. PATIENTS AND METHODS: We analyzed the risk factors for persistent proteinuria (>0.5 g/day) among 337 kidney allograft recipients with a minimum follow-up of 6 months, from a series of 375 transplants performed during a decade, as well as its association with allograft and patient survival. Patients with proteinuria greater than 0.5 g/d were treated with angiotensin-converting enzyme inhibitors (ACEI) and/or angiotensin-receptor blockers. RESULTS: After a mean follow-up of 53.35 +/- 52.63 months, 68 patients (20.17%) had persistent proteinuria greater than 0.5 g/d. Female gender (P = .012), body mass index (BMI) >25 (P = .008), pretransplant HLA sensitization (P = .039), and delayed graft function (DGF; P = .001) were associated with proteinuria. Induction treatment with antithymocyte globulin (P = .030) and treatment with tacrolimus instead of cyclosporine (P = .046) were associated with an increased risk of proteinuria. Multivariate analysis confirmed the independent value of DGF (RR = 2.23; 95% confidence interval [CI] 1.22 to 4.07; P = .009) and BMI >25 (RR = 1.968; 95% CI 1.05 to 3.68; P = .035) in predicting posttransplant proteinuria. The mean values of serum creatinine (P < .001) and systolic blood pressure (P < .05) were persistently higher from the early stages after transplantation in the proteinuric group. Graft survival at 5 years was 69% among patients who developed proteinuria and 93% in those without proteinuria (P < .001), with no difference in patient survival (P = .062). CONCLUSION: Proteinuria in renal transplant recipients was related to immunological and nonimmunological factors, some of which, such as hypertension and obesity, could be modifiable. Proteinuria in renal transplant recipients predicted worse allograft survival despite intensive treatment of hypertension including ACEI/angiotensin-receptor blockers.

10.
The association of serum uric acid (UA) with kidney transplant outcomes is uncertain. We examined the predictive value of UA during the first year posttransplant as a time-varying factor for graft survival after adjustment for time-dependent and time-independent confounding factors. Four hundred and eighty-eight renal allograft recipients transplanted from January 2004 to June 2006 and followed for 41.1 ± 17.7 months were included. Data on UA, estimated glomerular filtration rate (eGFR), tacrolimus level, mycophenolate mofetil (MMF) and prednisone doses, and use of allopurinol, angiotensin-converting enzyme-inhibitor/angiotensin-receptor-blocker (ACEi/ARB) and diuretics at 1, 3, 6, 9 and 12 months were collected. The primary endpoint of the study was graft loss, defined as graft failure or death. Cox proportional hazard models and generalized estimating equations were used for analysis. UA level was associated with eGFR, gender, retransplantation, deceased-donor organ, delayed graft function, diuretics, ACEi/ARB and MMF dose. After adjustment for these confounders, UA was independently associated with an increased risk of graft loss (HR: 1.15, p = 0.003; 95% CI: 1.05–1.27). Interestingly, UA interacted with eGFR (HR: 0.996, p < 0.05; 95% CI: 0.993–0.999 for the interaction term). Here, we report a significant association between serum UA during the first year posttransplant and graft loss, after adjustment for corresponding values of time-varying variables including eGFR, immunosuppressive drug regimen and other confounding factors. Its negative impact seems to be worse at lower eGFR.

11.
OBJECTIVE: HLA mismatches have a strong impact on acute rejection and renal allograft survival. The objective of this study was to evaluate the effectiveness of antibody induction according to the degree of HLA mismatches. METHODS: Of 20,429 deceased donor (DD) transplantations and 12,859 living donor (LD) transplantations reported to the United Network for Organ Sharing (UNOS) between 1999 and 2001, 51% of DD and 45% of LD transplant recipients received induction therapy. Propensity scores (PS) were calculated to indicate independent factors associated with the use of induction. Levels of HLA match examined for DD transplant recipients were 0 ABDR (n = 3239), 0 DR (n = 4210), and DR mismatched transplants (n = 12,980), and 0 (n = 1133), 1 (n = 3836), and 2 (n = 7890) haplotype mismatches for LD transplant recipients. Outcome parameters were reported as hazard ratios (HR) for graft loss and odds ratios (OR) for first-year acute rejection. RESULTS: Recipients with HLA mismatches were more likely to receive induction antibody for DR mismatch in DDs (PS = 1.11, 95% confidence interval [CI] 1.04-1.19) and for haplotype mismatch in LDs (PS = 1.36, 95% CI 1.22-1.52). Induction reduced the likelihood of acute rejection for DD transplant recipients regardless of the level of HLA mismatch (OR = 0.70; 95% CI 0.57-0.85 in 0 ABDR MM; OR = 0.76, 95% CI 0.64-0.89 in 0 DR MM; and OR = 0.69, 95% CI 0.62-0.77 in DR MM), and for 2 haplotype mismatched LD transplant recipients (OR = 0.82, 95% CI 0.70-0.96); in other LD transplant recipients, reductions in acute rejection rates were observed but not statistically significant. Induction reduced the risk of graft loss for DR mismatched DD transplant recipients by about 12% (HR = 0.88; 95% CI 0.80-0.97). CONCLUSIONS: Antibody induction resulted in a significant reduction of acute rejection and graft loss for patients with HLA mismatch.

12.
AIM: We prospectively followed a cohort of 202 renal transplant recipients for 5 years to examine the impact of fasting homocysteinemia on long-term patient and renal allograft survival. METHODS: Cox proportional hazards regression analysis was used to identify independent predictors of all-cause mortality and graft loss. RESULTS: Hyperhomocysteinemia (tHcy >15 micromol/L) was present in 48.7% of the 202 patients, predominantly among men (55.8%) as opposed to women (37.1%). At the end of the follow-up period, 13 (6.4%) patients had died, including 10 from cardiovascular disease, and 23 (11.4%) had lost their grafts. Patient death with a functioning allograft was the most prevalent cause of graft loss (13 recipients). Levels of tHcy were higher among patients who died than among survivors (median 23.9 vs 14.3 micromol/L; P = .005). Median tHcy concentration was also higher among patients who had lost their allografts than among those who had not (median 19.0 vs 14.1 micromol/L; P = .001). In a Cox regression model including gender, serum creatinine concentration, transplant duration, traditional cardiovascular risk factors, and associated conditions, such as past cardiovascular disease, only tHcy concentration (ln) (HR = 5.50; 95% CI, 1.56 to 19.36; P = .008) and age at transplantation (HR = 1.07; 95% CI, 1.02 to 1.13; P = .01) were independent predictors of patient survival. After censoring data for patient death, tHcy concentration was not a risk factor for graft loss. CONCLUSIONS: This prospective study shows that tHcy concentration is a significant predictor of mortality, but not of graft loss after censoring data for patient death.

13.
OBJECTIVE: To investigate the long-term efficacy of combined therapy with the angiotensin receptor blocker valsartan and the angiotensin-converting enzyme inhibitor benazepril for chronic allograft injury in renal transplant recipients. METHODS: Thirty-two nondiabetic renal transplant recipients with posttransplant proteinuria >0.5 g/d or SCr >177 μmol/L (>2 mg/dL) were randomized into two groups: (1) a treatment group of 23 patients (9 men, 14 women; mean age 40 years), with pathological diagnoses of chronic allograft nephropathy (CAN) in 13, cyclosporine toxicity in 3, and glomerular disease in 7; and (2) a control group of 9 patients (4 men, 5 women; mean age 35 years), with CAN in 6, cyclosporine toxicity in 1, and glomerular disease in 2. The treatment group received valsartan (80 mg/d) combined with benazepril (20 mg twice daily) for 3 years; the control group received neither drug. SCr, 24-h urinary protein, and graft survival time were compared between the two groups before and after treatment. RESULTS: After 3 years of follow-up, SCr was (252.2±117.9) μmol/L in the treatment group versus (375.3±203.0) μmol/L in the control group (P<0.05). Among CAN patients, SCr was (282.4±147.3) μmol/L in the treatment group versus (528.7±107.8) μmol/L in the control group (P<0.01). Twenty-four-hour urinary protein was (1.0±0.6) g in the treatment group versus (1.3±0.7) g in the control group (P>0.05). Graft survival time was 76 months in the treatment group versus 71 months in the control group (P>0.05). CONCLUSION: Combined valsartan and benazepril therapy helps preserve allograft function; its effect on proteinuria and long-term graft survival requires further observation.

14.
Our previous study of a group of renal transplant recipients treated with CsA showed significantly faster development of chronic graft failure among patients with gingival hyperplasia (GH) compared with unaffected patients. The aim of the present research was to establish the impact of CsA dose and blood levels on the incidence of chronic graft nephropathy and gingival overgrowth, as well as to assess risk factors for chronic graft nephropathy. The study included 64 renal graft recipients (32 patients with GH and 32 without GH) transplanted between 1989 and 1994. There were no significant differences between the pretransplant demographic and clinical data of the patients with and without GH. Patients with GH received a significantly higher total yearly dosage of CsA compared with those without GH (P < .03). Serum creatinine in the first year posttransplant was 1.9 mg/dL in patients with GH versus 1.6 mg/dL in those without GH. During 9 to 14 years of follow-up, end-stage renal failure due to chronic nephropathy occurred in 18 patients (56%) with GH and eight patients (25%) without GH. Ten-year renal graft survival was 35% in GH patients and 69% in patients without GH. Ten-year patient survival was 69% in the GH group and 91% in the group without GH. CsA dosage was a risk factor for GH and for graft loss, which implies a role of toxic CsA effects in the pathological mechanisms of GH and of chronic allograft nephropathy.

15.
《Transplantation proceedings》2023,55(7):1581-1587
Background: Kidney transplantation is a treatment option for patients with end-stage renal disease (ESRD) who are infected with hepatitis B virus (HBV). However, the impact of nucleos(t)ide analogue use on clinical outcomes in HBV-infected ESRD patients undergoing kidney transplantation is not well understood. This study aimed to assess the outcomes of kidney transplant recipients with HBV infection using real-world data to provide insight into the disease course over time. Methods: A nationwide retrospective longitudinal population-level cohort study was conducted using the National Health Insurance Research Database. The study evaluated patient and allograft survival as well as kidney-related and liver-related events, and identified factors contributing to these events. Results: Of the 4838 renal transplant recipients in the study, there were no significant differences in graft survival between the HBV-infected and non-infected groups (P = .244). However, the HBV-infected group had suboptimal patient survival compared with the non-infected group (hazard ratio [HR] for overall survival, 1.80; 95% CI 1.40-2.30; P < .001). Regarding kidney-associated events, diabetes mellitus was associated with a higher re-dialysis rate (HR, 1.71; 95% CI, 1.38-2.12; P < .001). For liver-associated events, HBV-infected status (HR, 9.40; 95% CI, 5.66-15.63; P < .001) and age >60 years (HR, 6.90; 95% CI, 3.14-15.19; P < .001) were associated with an increased incidence of liver cancer. Conclusions: Hepatitis B-infected renal transplant recipients have comparable graft survival but inferior patient survival outcomes due to pre-existing diseases and increasing liver-related complications. The findings of this study can help optimize treatment strategies and improve long-term outcomes for this patient population.

16.
We undertook this study to assess the rate of poor early graft function (EGF) after laparoscopic live donor nephrectomy (lapNx) and to determine whether poor EGF is associated with diminished long-term graft survival. The study population consisted of 946 consecutive lapNx donor/recipient pairs at our center. Poor EGF was defined as receiving hemodialysis on postoperative day (POD) 1 through POD 7 (delayed graft function [DGF]) or serum creatinine ≥ 3.0 mg/dL at POD 5 without need for hemodialysis (slow graft function [SGF]). The incidence of poor EGF was 16.3% (DGF 5.8%, SGF 10.5%), and it was stable across chronologic tertiles. Poor EGF was independently associated with worse death-censored graft survival (adjusted hazard ratio (HR) 2.15, 95% confidence interval (CI) 1.34–3.47, p = 0.001), worse overall graft survival (HR 1.62, 95% CI 1.10–2.37, p = 0.014), worse acute rejection-free survival (HR 2.75, 95% CI 1.92–3.94, p < 0.001) and worse 1-year renal function (p = 0.002). Even SGF independently predicted worse renal allograft survival (HR 2.54, 95% CI 1.44–4.44, p = 0.001). Risk factors for poor EGF included advanced donor age, high recipient BMI, sirolimus use and prolonged warm ischemia time. In conclusion, poor EGF following lapNx has a deleterious effect on long-term graft function and survival.

17.
The PIRCHE (Predicted Indirectly ReCognizable HLA Epitopes) score is an HLA epitope matching algorithm. The PIRCHE algorithm estimates the level of presence of T-cell epitopes in mismatched HLA. PIRCHE-II numbers are associated with de novo donor-specific antibody (dnDSA) formation following liver transplantation and with kidney allograft survival following renal transplantation. The aim of our study was to assess the PIRCHE-II score in recipients on calcineurin inhibitor (CNI)-free maintenance immunosuppression. This was a retrospective study of forty-one liver transplant recipients on CNI-free immunosuppression with available liver allograft biopsies. Donors and recipients were HLA typed. The HLA-derived mismatched peptide epitopes that could be presented by the recipient's HLA-DRB1 molecules were calculated using the PIRCHE-II algorithm. The associations between PIRCHE-II scores and graft immune-mediated events were assessed using receiver operating characteristic curves and subsequent univariate and multivariate analyses. CNI-free patients with cellular rejection, humoral rejection, or severe portal inflammation had higher mean PIRCHE-II scores than patients with normal liver allografts. PIRCHE-II score and donor age were independent risk factors for liver graft survival in CNI-free patients (HR: 8.0, 95% CI: 1.3–49, p = .02; and HR: 0.88, 95% CI: 0.00–0.96, p = .007, respectively). PIRCHE-II scores could be predictive of liver allograft survival in CNI-free patients following liver transplantation. Larger studies are needed to confirm these results.

18.
INTRODUCTION: A growing number of patients are losing their kidney allografts due to glomerulonephritis. Although posttransplant IgA nephropathy (IgAN) is regarded as benign, it may lead to late allograft loss in a substantial number of patients. The aim of this study was to evaluate the influence of posttransplant IgAN on long-term transplantation outcomes, risk factors for progression of graft dysfunction, and effectiveness of therapeutic interventions. PATIENTS AND METHODS: We evaluated potential risk factors for accelerated graft loss among 27 kidney allograft recipients with posttransplant IgAN, comparing graft survival with a control group matched for population and transplantation-related parameters. We evaluated the effectiveness of therapeutic interventions regarding the immunosuppressive regimen and hypertension control, including angiotensin-converting enzyme inhibitor (ACEI) usage, with Kaplan-Meier curves, Cox proportional hazard plots, and log-rank tests in the statistical analyses. RESULTS: Compared with the control group, patients with IgAN experienced a 6.57-fold higher risk of dialysis dependence (P < .01, 95% CI 1.4 to 30.83). The risk of accelerated graft loss in the course of IgAN was associated with graft dysfunction (RR = 2.16 per additional 1 mg/dL of serum creatinine at glomerulonephritis presentation; P < .03, 95% CI 1.2 to 4.36) and intense proteinuria, as evidenced by RR = 4.67 for the presence of the nephrotic syndrome (P < .05, 95% CI 0.95 to 22.8). Immunosuppression enhancement resulted in a significantly decreased risk of dialysis dependence (RR = 4.76, 95% CI 1.12 to 20, P < .04). With ACEI treatment there was a tendency toward a 2.8-fold decreased risk of dialysis dependence, without reaching statistical significance (P = .14). CONCLUSIONS: Patients with posttransplant IgAN may benefit from intensified maintenance immunosuppression, which slows progression to end-stage graft dysfunction.
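Several of the studies in this listing compare graft survival using Kaplan-Meier curves and log-rank tests. As a minimal, self-contained sketch of the product-limit estimator behind those curves (the follow-up times below are synthetic, not data from any study above):

```python
def kaplan_meier(durations, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    durations: follow-up times; events: True if graft loss was observed
    at that time, False if the observation was censored.
    Returns {event_time: S(t)} for each time at which a loss occurred.
    """
    data = sorted(zip(durations, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = {}
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e)      # losses at t
        c = sum(1 for tt, e in data if tt == t and not e)  # censored at t
        if d:
            surv *= 1 - d / n_at_risk   # multiply by conditional survival
            curve[t] = surv
        n_at_risk -= d + c              # remove losses and censored from risk set
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Synthetic follow-up times (years) and graft-loss indicators -- illustrative only.
curve = kaplan_meier([1, 2, 3, 4, 5], [True, True, False, True, False])
```

Censored patients (events marked False) drop out of the risk set without lowering the curve, which is exactly why these studies must censor for death with a functioning graft when reporting death-censored graft survival.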

19.
It was reported recently that treatment of kidney transplant recipients with angiotensin-converting enzyme inhibitors (ACEI) or angiotensin II type 1 receptor blockers (ARB) is associated with strikingly improved long-term graft and patient survival. This finding has important implications for future posttransplantation therapy recommendations. In an analysis of 17,209 kidney and 1744 heart transplant recipients, an association of treatment with ACEI/ARB with improved transplant outcome could not be confirmed. It is concluded that recommendations for a widespread use of ACEI/ARB treatment in transplant recipients are unwarranted.

20.
Renin-angiotensin system blockade in biopsy-proven allograft nephropathy
Allograft nephropathy leads to progressive renal injury and ultimately graft loss. In native kidney disease, the use of angiotensin-converting enzyme inhibitors (ACEi) or angiotensin receptor blockers (ARB) is beneficial in retarding the decline of renal function. We reviewed a cohort of renal transplant recipients who were prescribed either an ACEi or ARB for biopsy-proven allograft nephropathy. Patients were followed from the time of initiation of ACEi/ARB and were stratified based on biopsy findings. Outcomes of interest included safety, allograft survival, renal function, and the rate of renal function decline pre- and post-ACEi/ARB. The 5-year allograft survival after biopsy was 83%. Mean serum creatinine was 2.2 +/- 1.1 mg/dL (range 1.0 to 4.3) at the time of biopsy and 2.6 +/- 1.2 mg/dL (1.2 to 6.5) at last follow-up. The mean (SD) slope of creatinine versus time was 2.43 (7.93) in the 12 months prior to therapy and 1.45 (3.66) following therapy, with an absolute difference in slope of -3.38 (6.06) (P = .0004). We conclude that treatment with ACEi/ARB is beneficial in the management of allograft nephropathy.
