Similar Articles

20 similar articles retrieved.
1.

Background

Cytomegalovirus (CMV) remains the most critical viral pathogen after kidney transplantation (KTx). Universal prophylaxis, but not pre-emptive therapy, may avoid the wide range of indirect effects induced by CMV infection. This study examined the effect of universal prophylaxis with oral valganciclovir on first-year CMV disease after KTx.

Methods

Universal prophylaxis was started in May 2008. Patients who received KTx between January 2006 and September 2010 were included in the study. Oral valganciclovir (Valcyte) was given for 3 months, with the dosage adjusted by eGFR. CMV disease was defined as typical CMV syndrome with positive viremia or tissue-proven disease. The study end points were episodes of CMV disease and first-year biopsy-proven acute rejection.

Results

In total, 68 KTx patients who received universal prophylaxis for 3 months (study group) and another 50 KTx recipients without universal prophylaxis (control group) were enrolled. The incidence of CMV disease was 8.0% (4 of 50) in the control group. Universal prophylaxis significantly reduced first-year episodes of CMV disease to 0% (0 of 68). There were 8 episodes of biopsy-proven acute rejection (8 of 50, 16%) within 1 year after KTx in the control group, but only 2 episodes of biopsy-proven acute rejection (2 of 68, 2.9%) in the treatment group (P < .05).

Conclusions

Universal prophylaxis with oral valganciclovir for 3 months significantly reduced episodes of first-year CMV disease and biopsy-proven acute rejection in kidney transplant recipients.

2.
BACKGROUND: The presence of a few circulating donor cells in the recipient's blood, known as microchimerism, was first thought to be only an epiphenomenon of solid organ transplantation, but several authors have suggested that these circulating cells may contribute to tolerance induction. This study aimed to assess the rate of microchimerism after kidney transplantation and to determine its influence on acute rejection over a 4-year follow-up. METHODS: A total of 84 single-kidney recipients were included for microchimerism detection and quantification at 2, 6, 12, and 18 months after transplantation by specific detection of non-shared STR, VNTR, human leukocyte antigen-A, -B, -DRB1, and SRY alleles. The kinetics of microchimerism establishment were monitored in a double-kidney transplant recipient for 150 min after declamping and again after 7 days. RESULTS: Microchimerism was detected in 56.2% of kidney recipients 2 months after transplantation (M2); this fell to 30.1% at 12 months. In the calcineurin inhibitor-based immunosuppression cohort (n=73), the microchimerism-negative group (n=32) showed 37.9% biopsy-proven acute rejection (BPAR), whereas no recipient in the microchimerism-positive group (n=41) did (P<0.001). Regardless of immunosuppression, BPAR incidence was 35.6% and 4.9%, respectively (P<0.001). Multivariate analysis showed microchimerism to be a protective factor against BPAR (odds ratio: 8.3; 95% confidence interval: 1.8 to 37.9; P = 0.006), overshadowing other well-known rejection-risk variables. The presence of microchimerism at M2 did not correlate with a multifactorial outcome such as late graft loss. CONCLUSION: Microchimerism was frequent after kidney transplantation and correlated with a significantly lower incidence of rejection. We propose that early microchimerism monitoring could help to identify low rejection-risk recipients early.

3.

Objectives

Kidney transplantation from donation after cardiac death (DCD) donors with terminal acute renal failure (ARF) is not widely accepted because of concerns about organ quality. Here we report our initial clinical outcomes of kidney transplantation from DCD donors with ARF.

Materials and Methods

The results of 29 kidney transplants from ARF DCD donors were compared with those of 60 kidney transplants from non-ARF DCD donors performed at our center from August 2011 to March 2013.

Results

There was no difference in the incidence of delayed graft function and acute rejection between ARF and non-ARF kidneys (27.6% vs 16.7%, 10.3% vs 8.3%, respectively). Estimated glomerular filtration rate at 12 months was similar between ARF and non-ARF kidneys. With a mean follow-up of 18 months (range 7 to 26 months), actual patient and graft survival rates for ARF DCD recipients were 100% and 96.6%, respectively, which were similar to those of the control group of kidney transplants from non-ARF kidneys (98.3% and 95.0%).

Conclusions

Kidneys from DCD donors with terminal ARF have excellent short-term outcomes and may represent another potential method to safely expand the donor pool.

4.

Background

Proteinuria is a major but nonspecific sign of renal disease. It is well known that late-onset proteinuria after renal transplantation is associated with poor allograft outcomes and with mortality. Knowledge about the impact of early proteinuria on various outcomes is limited. We evaluated the utility of measuring early proteinuria in the management of pediatric renal transplant recipients.

Methods

We analyzed the effect of proteinuria at 3 months posttransplantation on allograft rejection, graft loss, and estimated glomerular filtration rate (GFR) at 3 years. Proteinuria was assessed using 24-hour urine protein excretion. Renal biopsy was performed when creatinine levels were elevated during routine follow-up, and acute rejection episodes were biopsy proven.

Results

Sixty-seven pediatric renal transplant recipients were included in the study. Mean follow-up time after transplantation was 48.8 ± 12.1 months. Thirty-nine recipients (58%) had proteinuria >500 mg/d. No relationship was found between proteinuria at posttransplant month 3 and the other outcome parameters, graft loss and lower estimated GFR. A significant positive correlation was found between acute rejection and proteinuria at posttransplant month 3.

Conclusion

We demonstrated that early proteinuria is a common finding in children after transplantation. Early posttransplant proteinuria cannot be used as a long-term prognostic marker of poor renal outcome. However, early proteinuria is associated with a high risk of acute rejection episodes, which may provide an opportunity for early intervention.

5.

Background

Mineral and bone disorder (MBD) is a major complication of chronic kidney disease and remains a major problem even after kidney transplantation. Although early steroid withdrawal protocols have beneficial effects on mineral and bone metabolism, they are also associated with significantly increased rates of acute allograft rejection (AR). Recently, patients have been treated with early rapid corticosteroid reduction protocols, but it is still unclear whether these protocols reduce the rate of MBD. The aim of this study was to evaluate the effects of early rapid corticosteroid reduction on MBD after kidney transplantation.

Methods

We retrospectively evaluated 34 adult kidney transplant recipients who were treated with an early rapid corticosteroid reduction protocol. Glucocorticoid treatment was reduced to methylprednisolone 4 mg/d at 1 month after transplantation.

Results

The AR rate at 3 years after transplantation was 15%. Bone mineral density was slightly decreased in the femur at 4 months after transplantation but returned to the preoperative level by 24 months after transplantation. There was no significant decrease in the bone mineral density of the lumbar spine during the first year after transplantation. Urinary deoxypyridinoline levels and plasma osteocalcin levels returned to the normal range during the follow-up period. Bone mineral density tended to be lower in female patients than in male patients, and in patients who underwent long-term pretransplant dialysis than in those who did not.

Conclusion

The present study found that MBD was temporary in kidney transplant recipients who were treated with an early rapid corticosteroid reduction protocol and that these patients did not have an increased AR rate.

6.

Objective

Blood group incompatibility in kidney transplants from a living donor can be successfully overcome by using various desensitization protocols: intravenous immunoglobulin, plasmapheresis (PP), immunoadsorption, and double filtration PP.

Patients and Methods

From July 2010 to October 2013, we performed 10 ABO-incompatible kidney transplantation (KT) procedures from living donors. The desensitization protocol was based on rituximab and PP + cytomegalovirus immune globulin. All patients received induction with basiliximab, except for 1 case treated with Thymoglobuline® (ATG) because of the simultaneous presence of donor-specific antibodies. Tacrolimus and mycophenolate mofetil were initiated at the time of desensitization and continued after the transplant.

Results

After a mean follow-up of 11.6 ± 10.4 months, all patients were alive with functioning grafts. The mean serum creatinine concentrations at 1 month, 3 months, 6 months, and 1 year were 1.48 ± 0.29, 1.47 ± 0.18, 1.47 ± 0.27, and 1.5 ± 0.27 mg/dL, respectively. Three episodes of acute cellular rejection occurred in 2 patients. There was only 1 case of BK virus infection, treated with reduction of immunosuppressive therapy. The protocol biopsy specimens at 1, 3, and 6 months were C4d positive in the absence of acute rejection.

Conclusions

Desensitization with rituximab, PP, and anti–cytomegalovirus immune globulin allowed us to perform transplants from living donors to ABO-incompatible recipients with excellent results and reduced costs.

7.

Background

Posttransplant anemia (PTA) influences kidney graft function and prognosis; however, there is no consensus regarding target hemoglobin (Hb) levels.

Methods

We examined several cases of PTA to identify any correlation between Hb levels and graft function. We evaluated 84 kidney transplant recipients (50 men and 34 women; mean age, 46.7 years) who were treated at our department between February 2004 and March 2012 and were available for a 2-year post-transplant follow-up.

Results

Hb levels and serum creatinine levels before transplantation and at 1, 3, 6, 12, and 24 months after transplantation were compared, and we examined the correlation between the degree of anemia and renal function. Data were analyzed using Spearman's rank correlation coefficient and the Friedman test. The mean pretransplantation Hb level was 10.4 g/dL, whereas Hb levels at 6, 12, and 24 months after transplantation increased significantly to 11.6, 12.2, and 12.4 g/dL, respectively, indicating an improvement in anemia after transplantation. Correlation analysis between anemia and kidney graft dysfunction revealed significant correlations at 1, 3, 12, and 24 months after transplantation. Subjects were stratified for correlation analysis according to Hb level at 24 months after transplantation: <10, 10–10.9, 11.0–11.9, 12.0–12.9, and ≥13.0 g/dL. A significant improvement in kidney graft function was noted in patients with an Hb level ≥11 g/dL at 2 years after transplantation. Anemia improved significantly by 3 months after transplantation.
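For readers who want to see the analysis pattern concretely, the following is a minimal sketch of a Spearman rank correlation and a Friedman test using SciPy; the hemoglobin and creatinine arrays are hypothetical placeholders, not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical per-patient values at one time point (not study data)
hb = np.array([10.1, 11.6, 12.2, 9.8, 12.4, 11.0, 10.7, 13.1])
creatinine = np.array([1.9, 1.3, 1.1, 2.2, 1.0, 1.4, 1.6, 0.9])

# Spearman's rank correlation between anemia (Hb) and graft function (creatinine)
rho, p_corr = stats.spearmanr(hb, creatinine)
print(f"Spearman rho = {rho:.2f}, P = {p_corr:.3f}")

# Friedman test comparing Hb across repeated time points (e.g., 1, 12, and 24 months)
hb_1m = [10.2, 10.8, 11.0, 9.9, 11.4, 10.6, 10.3, 11.8]
hb_12m = [11.5, 12.0, 12.6, 10.8, 12.9, 11.7, 11.2, 13.0]
hb_24m = [11.9, 12.3, 12.8, 11.1, 13.2, 12.0, 11.6, 13.4]
chi2, p_friedman = stats.friedmanchisquare(hb_1m, hb_12m, hb_24m)
print(f"Friedman chi2 = {chi2:.2f}, P = {p_friedman:.3f}")
```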

Conclusions

A significant correlation between PTA and kidney graft function was apparent, and the prognosis for kidney graft function was poor in patients with Hb levels ≤11 g/dL.

8.

Objective

The aim of this study was to determine whether serum concentrations of interferon-gamma inducible protein-10 (IP-10), fractalkine, and their receptors (CXCR3 and CX3CR1) were predictive of acute allograft rejection in kidney transplant recipients.

Methods

Kidney transplant recipients (n = 52) were enrolled in this study and divided into either the acute rejection (AR, n = 15) or non–acute rejection (NAR, n = 35) group. Serum samples from recipients were collected 1 day before transplantation and on days 1, 3, 5, 7, and 9 post-transplantation. The accuracy of chemokine concentrations for predicting acute rejection episodes was evaluated using receiver operating characteristic (ROC) curves.
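As a rough illustration of the ROC analysis described above, the sketch below uses scikit-learn on hypothetical serum concentrations; the marker values, rejection labels, and the Youden-index cutoff rule are illustrative assumptions, not the study's data or analysis code.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical data: 1 = biopsy-proven acute rejection, 0 = no rejection
rejection = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
# Hypothetical serum chemokine concentrations (e.g., fractalkine or IP-10, pg/mL)
chemokine_level = np.array([820, 760, 910, 680, 410, 390, 700, 300, 450, 360])

# Area under the ROC curve quantifies how well the marker separates AR from NAR
auc = roc_auc_score(rejection, chemokine_level)
fpr, tpr, thresholds = roc_curve(rejection, chemokine_level)

# Pick the cutoff that maximizes Youden's J statistic (sensitivity + specificity - 1)
j = tpr - fpr
best_cutoff = thresholds[np.argmax(j)]
print(f"AUC = {auc:.2f}, suggested cutoff = {best_cutoff:.0f}")
```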

Results

AR was diagnosed in 15 patients based on histologic changes in renal biopsies. AR patients had significantly higher serum fractalkine, CX3CR1, IP-10, and CXCR3 levels compared with those observed in the NAR group and healthy controls. Fractalkine and IP-10 had the largest areas under the ROC curve, at 0.86 (95% confidence interval: 0.77–0.96). Following steroid therapy, chemokine levels decreased, suggesting that they may also serve to predict the therapeutic response to steroid therapy.

Conclusion

Measuring serum levels of fractalkine, IP-10, and their receptors (especially the fractalkine/IP-10 combination) may serve as a noninvasive approach for the early diagnosis of renal allograft rejection.

9.

Objective

Simultaneous pancreas and kidney transplantation (SPKTx) is the most frequently performed multiorgan transplantation. The main source of complications is the transplanted pancreas; as a result, early complications related to the kidney transplant are rarely assessed. The aim of this study was to evaluate the prevalence, types, and severity of postoperative complications attributable to the kidney graft among simultaneous pancreas and kidney recipients.

Methods

Complications related to the transplanted kidney were analyzed among 112 SPKTx recipients. The indication for SPKTx was end-stage diabetic nephropathy due to long-standing type 1 diabetes. The cumulative survival rate for kidney graft function and cumulative freedom from complications on days 60 and 90 after transplantation were assessed. The severity of complications was classified according to the modified Dindo-Clavien scale.

Results

The 12-month cumulative survival rate for the kidney graft was 0.91. Cumulative freedom from complications on the 60th day after transplantation was 0.84. The rates of grade II, IIIA, IIIB, IVA, and IVB complications were 34.9%, 4.3%, 26.1%, 26.1%, and 8.6%, respectively. Acute tubular necrosis and rejection were the most frequent (43.4%) causes of complications. The most frequent reasons for graft nephrectomy were infections (2/7; 28.6%) and vascular thrombosis due to atherosclerosis of the recipient iliac arteries (2/7; 28.6%). The most severe (grade IVB) complications were caused by fungal infection.

Conclusion

The rate and severity of complications attributable to the renal graft after SPKTx were low; however, reducing fungal infections is necessary to prevent the most serious ones.

10.

Background

This single-center study sought to examine the clinical outcomes of kidney transplant recipients from donors displaying acute kidney injury (AKI).

Methods

We retrospectively analyzed the medical records of the donors and recipients of 54 deceased-donor kidney transplantations performed in our center between March 2009 and March 2012.

Results

Among the 54 deceased donors, 36 (66.7%) had AKI; their final mean serum creatinine level measured before graft harvest was 2.66 ± 1.62 mg/dL versus 0.82 ± 0.28 mg/dL among non-AKI donors. The risks of delayed graft function and slow graft function were increased in the AKI versus the non-AKI group in the early post-transplantation period. However, the renal function of recipients at 3, 6, and 12 months after transplantation was not significantly different between the two groups. Moreover, rejection-free survival rates during the study period were similar. Multivariate analysis revealed acute rejection episodes (P = .047) and a lower donor body mass index relative to the recipient (P = .011) to be independent risk factors for poor graft function, defined as a 1-year estimated glomerular filtration rate less than 50 mL/min/1.73 m². Donor AKI with a high creatinine level (>4.0 mg/dL), an increasing creatinine trend, or greater severity according to the Risk, Injury, Failure, Loss, and End-stage kidney disease (RIFLE) classification was not a significant risk factor.
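For orientation only, the sketch below implements a simplified, creatinine-only arm of the RIFLE grading (1.5-, 2-, and 3-fold rises over baseline, with Failure also met by creatinine ≥4.0 mg/dL after an acute rise ≥0.5 mg/dL); it ignores the urine-output and GFR criteria and is not the grading procedure used in the study.

```python
def rifle_creatinine_grade(baseline_cr: float, peak_cr: float) -> str:
    """Simplified creatinine-only RIFLE grading (Risk / Injury / Failure).

    Ignores the urine-output and GFR arms of the full classification.
    """
    ratio = peak_cr / baseline_cr
    if peak_cr >= 4.0 and (peak_cr - baseline_cr) >= 0.5:
        return "Failure"
    if ratio >= 3.0:
        return "Failure"
    if ratio >= 2.0:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return "No AKI"

# Example: hypothetical donor with baseline 0.8 mg/dL rising to 2.66 mg/dL before procurement
print(rifle_creatinine_grade(0.8, 2.66))  # -> "Failure" (3.3-fold rise)
```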

Conclusion

Transplantation of kidneys from AKI donors, that is, donors with severely decreased renal function, showed excellent short-term outcomes. Accordingly, kidney transplantation from deceased donors with AKI should be considered more actively to expand the donor pool in Korea.

11.
12.

Background

Renal cancers commonly occur in the native kidneys of renal transplant recipients, whereas renal cancer in the grafted kidney has been reported only occasionally. In the present case, renal cancer in the grafted kidney occurred 16 years after graft loss, making it an even rarer case.

Case Report

A 60-year-old man who had received a kidney transplant from his mother at the age of 31 years, and had returned to hemodialysis because of chronic rejection at the age of 44 years, presented with right lower abdominal pain. Computerized tomography (CT) showed tumor involvement of the grafted kidney. Positron-emission tomography–CT also showed hot spots in the liver, a cervical vertebra, and a costal bone. Needle biopsies of the grafted kidney and liver tumors were performed, and pathologic findings revealed renal cancer of the grafted kidney and a metastatic liver tumor. Graftectomy was performed, and the renal cancer was diagnosed as spindle cell carcinoma. Irradiation of the cervical bone metastasis was given after the surgery. Two months after the surgery, he complained of abdominal pain and difficulty eating. CT showed a large recurrent tumor and multiple disseminated tumors; the small intestine was involved and obstructed by the main tumor. He died of recurrent renal cancer 3 months after the surgery.

Conclusions

The reported rate of renal cell carcinoma in the grafted kidney is 0.19%–0.5%, occurring at a mean of 12.6 years after renal transplantation. Herein, we report a rare case of renal cancer that occurred 29 years after renal transplantation. Long-term observation is required for recipients who have returned to hemodialysis.

13.

Background

Endomyocardial biopsy to evaluate rejection of the transplanted heart is accepted as the “gold standard.” The complexity of microscopic images suggests using digital methods to evaluate acute rejection episodes precisely with a numerical representation. The aim of the present study was to digitally characterize acute rejection of the transplanted heart using complexity/fractal image analysis.

Material and Methods

Biopsy samples harvested from 40 adult recipients after orthotopic heart transplantation were collected, and the rejection grade was evaluated according to the International Society for Heart and Lung Transplantation grading (0, 1a, 1b, or 3a) on transverse and longitudinal sections. Fifteen representative digital microscope images from each grade were collected and analyzed after Sobel edge detection and binarization.
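The fractal analysis referred to above can be approximated with a standard box-counting estimate applied to a binarized edge image; the NumPy sketch below is a generic illustration under that assumption, not the authors' exact pipeline, and the demo image is synthetic.

```python
import numpy as np

def box_counting_dimension(img: np.ndarray) -> float:
    """Estimate the box-counting fractal dimension of a binary (0/1) image."""
    n = 2 ** int(np.floor(np.log2(min(img.shape))))  # largest power-of-two crop
    img = img[:n, :n] > 0
    sizes = 2 ** np.arange(1, int(np.log2(n)))       # box sizes 2, 4, 8, ...
    counts = []
    for s in sizes:
        # Count boxes of side s that contain at least one foreground (edge) pixel
        blocks = img.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        counts.append(blocks.sum())
    # Slope of log(count) against log(1/size) estimates the fractal dimension
    slope, _intercept = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

# Demo on a synthetic binary image (stand-in for a Sobel-edged, binarized biopsy image)
rng = np.random.default_rng(0)
demo = (rng.random((256, 256)) > 0.7).astype(np.uint8)
print(f"Estimated fractal dimension: {box_counting_dimension(demo):.2f}")
```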

Results

Only the mean fractal dimension showed a progressive and significant increase, and a corresponding correlation, with rejection grade on longitudinal sections. Lacunarity and the number of foreground pixels gave equivocal results.

Conclusion

The mean fractal dimension could serve as an auxiliary digital parameter for grading acute rejection of the transplanted heart.

14.

Background

We evaluated the prevalence of pretransplantation and posttransplantation anemia and its effect on serum creatinine levels among living donor kidney transplant recipients.

Methods

We retrospectively reviewed 170 adult patients who underwent living donor kidney transplantation between 1994 and 2009. We defined anemia as hemoglobin (Hb) ≤12 g/dL for women and ≤13 g/dL for men, with severe anemia defined as Hb <11 g/dL for both men and women (World Health Organization criteria). Patients were also categorized according to Hb levels below or above 10 g/dL for correlation with recipient serum creatinine levels at months 1, 3, 6, and 12.
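A minimal sketch of the classification rules stated above (sex-specific anemia cutoffs with a shared severe-anemia threshold); the function name and example values are illustrative only.

```python
def classify_anemia(hb_g_dl: float, sex: str) -> str:
    """Classify hemoglobin per the study definitions:
    anemia: Hb <= 12 g/dL (women) or <= 13 g/dL (men); severe anemia: Hb < 11 g/dL (both sexes)."""
    if hb_g_dl < 11.0:
        return "severe anemia"
    cutoff = 12.0 if sex == "female" else 13.0
    return "anemia" if hb_g_dl <= cutoff else "no anemia"

print(classify_anemia(10.2, "female"))  # severe anemia
print(classify_anemia(12.8, "male"))    # anemia
print(classify_anemia(13.5, "male"))    # no anemia
```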

Results

Mean recipient and donor ages were 33 ± 10 and 45 ± 12 years, respectively. Mean cold ischemia time was 76 ± 43 minutes. At the time of transplantation, the prevalences of anemia and severe anemia were 86.7% and 58.8%, respectively. Anemia was observed in 64 patients (42.1%) at posttransplantation month 3. Pretransplantation severe anemia was a good predictor of both posttransplantation Hb levels and the presence of anemia. Pretransplantation anemia and severe anemia led to greater requirements for posttransplantation blood transfusions (P < .05). Younger age and female gender were significant risk factors for severe anemia pretransplantation. There was a significant correlation between posttransplantation Hb levels and serum creatinine levels at 12 months (P = .01). Recipient female gender and longer hospital stay were significant risk factors for both anemia and severe anemia posttransplantation. Higher recipient weight and a history of acute rejection episodes were also significant risk factors for posttransplantation severe anemia.

Conclusion

This study indicated that successful kidney transplantation had a positive effect on Hb levels. Posttransplantation anemia predicted worse graft function in the first month after transplantation.

15.

Introduction

Ischemia-reperfusion injury (IRI) causes a high rate of delayed graft function (DGF), the most frequent complication in the immediate postoperative period after cadaveric donor kidney transplantation. Herein we evaluated the impact of donor and recipient characteristics on DGF development, and the impact of DGF on the incidence of acute rejection episodes, hospital stay, renal function, and long-term graft and patient survival.

Materials and Methods

Between February 1998 and July 2011, 761 patients underwent cadaveric donor kidney transplantations. DGF was defined as the need for dialysis in the first week. Patients were subdivided according to initial graft function as immediate graft function (IGF) or DGF.

Results

DGF, observed in 241 patients (31.6%), was independently associated with expanded-criteria donors, extended cold ischemia time, Karpinski histological score, and prior dialysis duration on both univariate and multivariate analysis. The incidence of acute rejection episodes was 18.1% in the DGF group versus 1.3% in the IGF group (P < .01). DGF significantly reduced both graft and patient survival at 6, 12, 36, and 60 months.

Conclusion

DGF was responsible for a longer hospital stay, worse early and long-term renal function, and a higher incidence of acute rejection episodes, as well as reduced graft and patient survival.

16.

Background

The cirrhotic kidney causes activation of the sympathetic nervous system and the renin-angiotensin system, leading to increased vascular resistance and arterial hypertension. The impact of unilateral or bilateral nephrectomy (UN or BN) performed before kidney transplantation on intrarenal resistance of the kidney graft has not yet been assessed. The aim of this study was to assess the intrarenal resistance parameters measured by Doppler ultrasound in the transplanted kidney in nephrectomized and non-nephrectomized kidney transplant recipients.

Methods

Among 686 consecutive successful first cadaveric kidney graft recipients transplanted from 1998 to 2012, we identified 43 patients who underwent BN and 49 patients who underwent UN. Patients with acute rejection episodes in the early post-transplantation period were excluded. We analyzed both the pulsatility index (PI) and the resistance index (RI) measured within the kidney graft before discharge from the hospital.
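The indices referred to above are conventionally derived from the Doppler spectral waveform as RI = (PSV − EDV) / PSV and PI = (PSV − EDV) / TAMV, where PSV, EDV, and TAMV are the peak systolic, end-diastolic, and time-averaged mean velocities; the sketch below applies these standard formulas to hypothetical velocities, not study measurements.

```python
def resistance_index(psv: float, edv: float) -> float:
    """RI = (peak systolic velocity - end-diastolic velocity) / peak systolic velocity."""
    return (psv - edv) / psv

def pulsatility_index(psv: float, edv: float, tamv: float) -> float:
    """PI = (peak systolic velocity - end-diastolic velocity) / time-averaged mean velocity."""
    return (psv - edv) / tamv

# Hypothetical intrarenal artery velocities (cm/s), not study data
psv, edv, tamv = 40.0, 10.0, 20.0
print(f"RI = {resistance_index(psv, edv):.2f}")         # 0.75
print(f"PI = {pulsatility_index(psv, edv, tamv):.2f}")  # 1.50
```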

Results

The prevalence of hypertension in the follow-up period after transplantation was significantly lower in the BN group (65.1% versus 81.0% in the other groups). Neither BN nor UN influenced the PI or RI values. The mean PI and RI values were 1.50 (1.38–1.61) and 0.75 (0.73–0.78) in BN, 1.48 (1.37–1.58) and 0.76 (0.73–0.79) in UN, and 1.47 (1.43–1.50) and 0.74 (0.73–0.75) in non-nephrectomized patients, respectively. Multivariate analysis confirmed the lack of influence of nephrectomy on kidney graft resistive indices.

Conclusion

BN before transplantation resulted in a lower frequency of hypertension, but it did not affect the intrarenal vascular resistance measured in the kidney graft.

17.

Background

Long-term function of the transplanted kidney is the factor that determines quality of life for transplant recipients. The aim of this study was to evaluate the effect of selected factors on the duration of graft function after renal transplantation over 15 years of observation.

Methods

Preoperative and intraoperative factors were analyzed in 232 kidney recipients over a 15-year observation period. The analysis included age, sex, cause of the recipient's renal failure, duration of hemodialysis before transplantation, peak panel reactive antibody level, human leukocyte antigen compatibility, cold ischemia time, occurrence of delayed graft function, duration and number of hemodialysis sessions after transplantation, early graft rejection, creatinine levels at days 1, 3, 7, 30, 90, and 180 after transplantation, and the influence of these factors on the duration of graft function. Statistical analysis was performed using univariate Kaplan-Meier analysis and the multivariate Cox proportional hazards regression model, with P < .05 considered significant.
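As a generic illustration of the survival methodology named above, the sketch below fits a Kaplan-Meier curve and a Cox proportional hazards model with the lifelines package; the DataFrame columns (months, graft_failed, dgf, cr_day90) and all values are hypothetical, not the study dataset.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical follow-up data: graft survival time (months), failure event flag,
# delayed graft function (0/1), and creatinine at day 90 (mg/dL)
df = pd.DataFrame({
    "months":       [120, 96, 60, 150, 30, 84, 110, 45, 132, 70],
    "graft_failed": [0,   1,  1,   0,  1,  0,   0,  1,   0,  1],
    "dgf":          [0,   1,  0,   0,  1,  1,   0,  1,   0,  0],
    "cr_day90":     [1.2, 2.1, 1.9, 1.1, 2.6, 1.8, 1.3, 1.5, 1.2, 1.7],
})

# Univariate view: Kaplan-Meier estimate of graft survival
km = KaplanMeierFitter()
km.fit(df["months"], event_observed=df["graft_failed"])
print(km.survival_function_.tail())  # estimated survival probabilities over time

# Multivariate view: Cox proportional hazards model for the listed covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="graft_failed")
cph.print_summary()  # hazard ratios with 95% confidence intervals
```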

Results

Univariate analysis showed significantly shorter renal graft function in recipients with higher creatinine levels at all of the analyzed time points and in patients experiencing delayed graft function. The duration of hemodialysis after transplantation and the number of dialysis sessions had a significant negative impact on late transplant results. Multivariate analysis showed that early graft rejection in the postoperative period was an independent factor improving late graft function: P = .002; hazard ratio (HR), 0.49 (95% confidence interval [CI], 0.31–0.78). A higher creatinine level at day 90 after kidney transplantation was a predictive factor for late graft dysfunction: P = .002; HR, 1.68 (95% CI, 1.2–2.35).

Conclusions

Creatinine level at day 90 after renal transplantation is a prognostic factor for long-term kidney function. Early transplant rejection leads to the introduction of a more aggressive immunosuppression protocol, which improves long-term transplant results.

18.

Background

The prevention and early detection of post-transplantation rejection and infection are key clinical points to achieve long-term survival after lung transplantation. Although surveillance bronchoscopy (SB) is performed in many transplantation centers, it is still controversial because of its undefined clinical significance and its possible complications. We evaluated the clinical utility of SB after cadaveric lung transplantation in Japan, where bilateral transplantation is officially limited to patients medically requiring bilateral grafts.

Patients and Methods

Twenty-eight patients who underwent cadaveric lung transplantation followed by SB were retrospectively analyzed with reference to the results of bronchoscopy. SB is routinely performed at 1, 2, 3, 6, and 12 months after lung transplantation and annually thereafter. Clinically indicated bronchoscopy (CIB) is considered in patients with suspected rejection or airway infection, and for follow-up examination after treatment for acute rejection.

Results

There were 206 bronchoscopies, including 189 SBs and 17 CIBs, performed in 28 patients who underwent cadaveric lung transplantation between 2000 and 2013 at Osaka University Hospital. Among SBs, 92 (49%) showed positive results of transbronchial lung biopsy (TBLB) or bronchoalveolar lavage (BAL), and intervention was applied following 34 SBs (18%). Among CIBs, 8 (47%) showed positive results of TBLB or BAL, with intervention performed in 3 patients (18%). A2-3 and B2R findings according to the revised International Society for Heart and Lung Transplantation (ISHLT) rejection score and airway infection/colonization were frequently observed within a year following lung transplantation. Cytomegalovirus infection was found in 7 SBs (6%) by TBLB only within 2 months after transplantation. Regarding complications, moderate bleeding occurred in 21 (11%), pneumothorax in 2 (1%), prolonged hypoxemia in 1 (0.5%), and pneumonia in 1 (0.5%) among the 189 SBs.

Conclusion

SB frequently detects rejection and airway infection or colonization with minimum complications, especially within 12 months after cadaveric lung transplantation.

19.

Background

Cyclosporine and tacrolimus (TAC) are the most potent immunosuppressants. TAC is considered less nephrotoxic but may be an important factor in chronic graft dysfunction. The aim of the study was to evaluate kidney function and the cardiovascular risk profile in 2 groups of low-immunological-risk kidney allograft recipients receiving 2 different TAC dosages.

Materials and methods

Patients were randomly assigned to 2 TAC-based treatments (group I [n = 14], standard dose; group II [n = 15], reduced dose). Patient and graft survival, graft function, occurrence of cardiovascular events (cardiac death, myocardial infarction, stroke), incidence of new-onset diabetes mellitus after transplantation, and cardiovascular risk factors were assessed over a 5-year period.

Results

Patient demographics and transplant characteristics did not differ statistically between groups. TAC trough levels were significantly higher in group I for the first 24 months posttransplant. Patient survival did not differ, but there were more acute rejection episodes and graft losses in group II. There were no significant differences in the rate of cardiac events. Graft function, measured as serum creatinine level and calculated glomerular filtration rate, did not differ between groups, nor did the incidence of new-onset diabetes mellitus after transplantation. Office blood pressures were numerically higher in group I up to 24 months, but this difference did not reach significance at any time point. Similar results were obtained for serum lipids.

Conclusions

Immunosuppression based on low-dose tacrolimus seems to be safe in low-immunological-risk patients, but over the 60-month follow-up it did not offer any clear benefit in terms of potential nephrotoxicity or cardiovascular risk.

20.

Background

Donor-specific antibodies (DSAs) play a fundamental role in kidney transplantation. The identification of DSAs is an essential rejection parameter.

Patients and Methods

We evaluated a protocol in 237 patients receiving kidneys from living donors (LDs) and deceased donors (DDs). Recipients were classified as being at low (LR), medium (MR), high (HR), or strong (SR) risk of rejection based on Luminex panel reactive antibody (PRA)–single antigen bead (SAB) testing. Grafts that survived for 1 year were evaluated.

Results

Of the 237 transplanted patients, 129 (54.43%) received a kidney from an LD and 108 (45.57%) from a DD. Of 95 LR recipients receiving kidneys from LDs, 2 lost the graft due to non-immunological causes. Of 34 MR recipients, 13 had rejection episodes; 2 lost the graft to antibody-mediated rejection (AMR) and 1 to cellular rejection (CR). Of the 108 recipients receiving a kidney from a DD, 59 (54.63%) were LR, 31 (28.70%) MR, 11 (10.19%) HR, and 7 (6.48%) SR. Twenty of all transplanted recipients lost their grafts: 4 due to clinical causes, 4 to cellular rejection, and 12 to AMR, with PRA-SAB mean fluorescence intensities of 530 to 12,591. One-year graft survival for LD-transplanted LR and MR patients was 97.6% and 94.1%, respectively (P = .004). In DD recipients, the difference between LR and MR was significant (P = .011), as was that between LR and HR + SR (P = .001); no significant difference was found between MR and HR + SR (P = .323).

Conclusion

Rejections were detected in 51 patients (21.52%). Graft failure occurred in 16 patients (6.75%). A total of 218 (91.98%) recipients maintained good kidney function after 1 year. This protocol, based on a flowchart for risk assessment of AMR, provided fast and precise immunological evaluation of recipients and donors and stratification by immunological risk of AMR.
