Similar Articles (20 results)
1.

Objectives

This study aimed to identify the potential risk factors of acute rejection after deceased donor kidney transplantation in China.

Methods

Adult kidney transplantations from deceased donors in our center from February 2004 to December 2015 were enrolled for retrospective analysis. All deceased donations complied with China's Organ Donation Program. No organs from executed prisoners were used. The incidence of clinical and biopsy-proved acute rejection was assessed with the Kaplan-Meier method, and the Cox proportional hazard model was used for multivariate analysis.
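
As an illustration of the survival methodology described above, the following sketch shows how a Kaplan-Meier estimate and a multivariable Cox proportional hazards model can be fitted in Python with the lifelines package. This is not the authors' code; the file name and column names (time_to_rejection_months, rejection_observed, dialysis_months, pra_positive, hla_mismatch_ge4, tacrolimus) are hypothetical placeholders for this cohort's variables.

```python
# Hypothetical sketch of the Kaplan-Meier / Cox analysis described above.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("kidney_tx_cohort.csv")  # hypothetical cohort file

# Kaplan-Meier estimate of freedom from acute rejection
# (the cumulative incidence of rejection is 1 - S(t)).
kmf = KaplanMeierFitter()
kmf.fit(durations=df["time_to_rejection_months"],
        event_observed=df["rejection_observed"],
        label="freedom from acute rejection")
print(kmf.survival_function_.tail())

# Multivariable Cox proportional hazards model for rejection risk factors
covariates = ["time_to_rejection_months", "rejection_observed",
              "dialysis_months", "pra_positive", "hla_mismatch_ge4",
              "tacrolimus"]
cph = CoxPHFitter()
cph.fit(df[covariates],
        duration_col="time_to_rejection_months",
        event_col="rejection_observed")
cph.print_summary()  # hazard ratios, 95% CIs, and P values
```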

Results

One-year, 2-year, 3-year and 5-year incidences of acute rejection were 12.4%, 14.2%, 14.8%, and 17.1%, respectively. Multivariate analysis demonstrated that longer pre-transplant dialysis duration (hazard ratio [HR] 1.009 per month; 95% confidence interval, 1.003–1.015; P = .003), positive pre-transplant panel reactive antibody (PRA) (positive vs negative HR 3.266; 1.570–6.793; P = .023), and increasing HLA mismatches (≥4 vs < 4 HR 2.136; 1.022–4.465; P = .044) increased the risk of acute rejection, while tacrolimus decreased acute rejection risk compared to cyclosporine (HR 0.317; 0.111–0.906; P = .032).

Conclusion

Longer pre-transplant dialysis duration, HLA mismatch, and positive pre-transplant PRA increase the risk of acute rejection, while tacrolimus helps prevent acute rejection compared to cyclosporine in deceased donor kidney transplantation.

2.

Background

We performed a clinical and pathological analysis of cases of acute vascular rejection (AVR), characterized by intimal arteritis and transmural arteritis (Banff v score) after kidney transplantation, in an attempt to clarify the mechanisms underlying the development and prognostic significance of AVR.

Methods

AVR (Banff score: v >0) was diagnosed in 31 renal allograft biopsy specimens (BS) obtained from 31 renal transplant patients receiving follow-up care at the Department of Urology, Tokyo Women's Medical University, between January 2010 and April 2016.

Results

AVR was diagnosed at a median of 124.6 days after transplantation. Among the 31 BS showing evidence of AVR, AVR was mild (v1 in Banff's classification) in 25 cases, moderate (v2) in 6, and severe (v3) in none. We classified the 31 BS with evidence of AVR by their overall histopathological features as follows: isolated v lesions were observed in 6 BS, acute antibody-mediated rejection (AAMR) in 7, acute T-cell–mediated rejection (ATCMR) in 12, and both ATCMR and AAMR in 6. Loss of the renal allograft occurred during the observation period in 3 patients, and, of the remaining cases with functioning grafts, deterioration of renal allograft function after biopsy occurred in only 2 patients.

Conclusions

The results of our study suggest that ATCMR contributes to AVR in 40% to 60% of cases, AAMR in 20% to 40% of cases, and isolated v lesions in 20% of cases. The prognosis of grafts with AVR was relatively good under the present immunosuppression protocol and current anti-rejection therapies.

3.

Background

De novo complement-binding donor-specific anti-human leukocyte antigen antibodies (DSAs) are reportedly associated with an increased risk of kidney graft failure, but there is little information on preformed complement-binding DSAs. This study investigated the correlation between preformed C1q-binding DSAs and medium-term outcomes in kidney transplantation (KT).

Methods

We retrospectively studied 44 pretransplant DSA-positive patients, including 36 patients who underwent KT between April 2010 and October 2016. There were 17 patients with C1q-binding DSAs and 27 patients without C1q-binding DSAs. Clinical variables were examined in the 2 groups.

Results

Patients with C1q-binding DSAs had a significantly higher rate of blood transfusion history (53.0% vs 18.6%; P = .0174), higher complement-dependent cytotoxicity crossmatch (CDC-XM) positivity (29.4% vs 0%; P = .0012), and higher DSA median fluorescence intensity (MFI) (10,974 vs 2764; P = .0009). Among patients who were not excluded for CDC-XM positivity and underwent KT, there was no significant difference in cumulative biopsy-proven acute rejection rate (32.5% vs 33.5%; P = .8354), cumulative graft survival, or 3-month and 12-month protocol biopsy results between patients with and without C1q-binding DSAs. Although patients with C1q-binding DSAs showed a higher incidence of delayed graft function (54.6% vs 20.0%; P = .0419), multivariate logistic regression showed that DSA MFI (P = .0124), but not C1q-binding DSA status (P = .2377), was an independent risk factor for delayed graft function.

Conclusions

In CDC-XM-negative patients, preformed C1q-binding DSAs were not associated with the incidence of antibody-mediated rejection or with medium-term graft survival after KT. C1q-binding DSAs were highly correlated with DSA MFI and CDC-XM positivity.

4.
Although a nationwide activation system has been developed to increase deceased donor kidney transplantation (DDKT), there is still an enormous discrepancy between transplant need and deceased donor supply in Korea, and waiting time to DDKT therefore remains long. We need to determine the current status of waiting times and the risk factors for long waiting time. We retrospectively analyzed the medical records of patients on the wait list for DDKT at Seoul National University Hospital from 2000 to 2017. Among 2,211 wait-listed patients, 606 (27.5%) received DDKT, and the mean waiting time to DDKT was 45 months. Among them, blood type A was the most prevalent (35.6%) and type AB the least prevalent (14.0%). Panel-reactive antibody (PRA) was positive in 59 patients (11.0%) in the first transplant group and 25 (35.0%) in the retransplant group. Waiting time in PRA-positive recipients was 63 and 66 months in the first transplant and retransplant groups, respectively, whereas waiting time for patients with negative PRA was 42.8 months. Waiting time was shorter for blood type AB (39 months) than for the other types (46 months), and it was shortest in children and adolescents. Among patients who were still on the wait list, retransplantation candidates, especially those with PRA higher than 50%, had longer waiting times than first transplant candidates. In conclusion, non-AB blood type, positive PRA, and adult age were significantly associated with long waiting time. It is therefore necessary to establish a management strategy, such as tailored desensitization for highly sensitized patients on the wait list, to reduce their waiting time.

5.

Introduction

The prevalence and impact of pre-existing and de novo anti-HLA donor-specific antibodies (DSAs) after orthotopic liver transplantation (OLT) is still controversial. We investigated the prevalence of DSAs and their implication in the development of allograft dysfunction after OLT.

Patients and Methods

A total of 65 liver transplant patients were tested for anti-HLA antibodies with single antigen bead technology before transplantation; at 1, 3, 6, and 12 months after transplantation; and annually thereafter, along with assessment of other risk factors. Sixteen of the 65 patients (24.6%) had circulating pre-existing anti-HLA antibodies, and 4 of them (25%) had DSAs. All patients positive for pre-existing anti-HLA antibodies (100%) presented with allograft dysfunction. Fourteen of the 65 patients (21.5%) had circulating de novo DSAs, and 12 of the 14 (85.7%) presented with allograft dysfunction. The risk factors investigated for allograft dysfunction were: recipient and donor age, time on the waiting list, cold ischemia time, cytomegalovirus infection, immunosuppression regimen, de novo DSAs, Model for End-Stage Liver Disease score, aspartate aminotransferase, alanine aminotransferase, gamma-glutamyl transpeptidase (GGT), peak post-transplant direct and total bilirubin, and alkaline phosphatase. The multivariate analysis showed that de novo DSAs and time on the waiting list were independent risk factors for allograft dysfunction.

Conclusion

Our results show that de novo DSAs are an independent risk factor for allograft dysfunction, along with time on the waiting list.

6.

Objectives

To compare the clinical outcome of kidney transplantation from living-related and deceased donors.

Patients and methods

Consecutive adult kidney transplants from living-related or deceased donors performed at a single center from February 2004 to December 2015 were enrolled for retrospective analysis. Estimated glomerular filtration rate (eGFR) was compared using linear mixed models controlling for the effect of repeated measurements at different time points.
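
As a companion to the statistical approach described above, the sketch below shows one way a linear mixed model for repeated eGFR measurements can be fitted in Python with statsmodels. This is an assumption-laden illustration rather than the authors' analysis; the file and column names (patient_id, egfr, donor_type, years_post_tx) are hypothetical.

```python
# Hypothetical sketch of a linear mixed model for repeated eGFR measurements.
import pandas as pd
import statsmodels.formula.api as smf

# One row per patient per visit (long format): patient_id, egfr,
# donor_type (living-related vs deceased), years_post_tx (1, 2, 3, ...).
long_df = pd.read_csv("egfr_long_format.csv")

# A random intercept per patient accounts for the repeated measurements;
# donor type and time since transplant are modeled as fixed effects.
model = smf.mixedlm("egfr ~ donor_type * years_post_tx",
                    data=long_df,
                    groups=long_df["patient_id"])
result = model.fit()
print(result.summary())
```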

Results

There were 536 living-related and 524 deceased donor kidney transplants enrolled. The 1-year, 3-year, and 5-year graft survival rates were 98.8%, 98.5%, and 97.2% in living-related kidney transplantation (KTx), and 94.9%, 91.3%, and 91.3% in deceased donor KTx (log-rank, P < .001). A significantly higher incidence of delayed graft function (DGF) was observed in deceased donor KTx (20.6% vs 2.6%, P < .001). eGFR in deceased donor KTx was significantly higher than that in living-related KTx (68.0 ± 23.7 vs 64.7 ± 17.9 mL/min/1.73 m2 at 1 year postoperation, 70.1 ± 23.3 vs 64.3 ± 19.3 mL/min/1.73 m2 at 2 years postoperation, and 72.5 ± 26.2 vs 65.2 ± 20.4 mL/min/1.73 m2 at 3 years postoperation; P < .001). Donor age was significantly higher in the living-related KTx group (47.5 ± 11.0 vs 31.1 ± 14.4 years, P < .001).

Conclusion

Living-related graft survival is superior to deceased donor graft survival at this center, while better 5-year renal allograft function is obtained in deceased donor KTx patients, which may be attributable to the higher age of living-related donors.

7.

Background

Although tacrolimus is one of the essential drugs used for the prevention of rejection in kidney recipients, target trough levels are not well established. In this study, we aimed to investigate the association between average tacrolimus trough levels (TTLs) of the first month after transplantation and biopsy-proven acute rejection (BPAR) during the first 12 months after transplant.

Methods

A total of 274 patients who underwent kidney-alone transplantation between 2002 and 2014 were enrolled in the study. Average TTLs of the first month were assessed by means of receiver operating characteristic (ROC) curve analysis to discriminate patients with and those without BPAR. Univariate and multivariate Cox proportional hazards models were used to determine the effect of average TTLs of the first month on BPAR.
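
To make the cutoff-finding step concrete, here is a small Python sketch of an ROC-based threshold search using scikit-learn. It is not the study's code: the column names (mean_ttl_first_month, bpar) are hypothetical, and selecting the cutoff by Youden's J index is an assumption, since the abstract does not state which criterion was used.

```python
# Hypothetical sketch: choosing a tacrolimus trough-level cutoff via ROC analysis.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

df = pd.read_csv("tacrolimus_cohort.csv")  # hypothetical data file

# Lower average trough levels are expected to carry higher rejection risk,
# so use the negated trough level as the "risk score" for BPAR.
risk_score = -df["mean_ttl_first_month"]
fpr, tpr, thresholds = roc_curve(df["bpar"], risk_score)
auc = roc_auc_score(df["bpar"], risk_score)

# Youden's J = sensitivity + specificity - 1; its maximum suggests a cutoff.
best = np.argmax(tpr - fpr)
cutoff_ng_ml = -thresholds[best]  # convert the score back to ng/mL
print(f"AUC = {auc:.2f}; suggested cutoff ~ {cutoff_ng_ml:.1f} ng/mL")
```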

Results

According to the ROC curve analysis, the highest area under the curve (AUC) was obtained with a cutoff of 8 ng/mL (AUC = 0.73 ± 0.11; 95% confidence interval [CI], 0.62–0.84). Forty-two (31.8%) of the 132 patients with average TTLs <8 ng/mL and 13 (9.1%) of the 142 patients with average TTLs ≥8 ng/mL had BPAR during the first 12 months after transplant (P < .001). In univariable analysis, average TTLs of the first month <8 ng/mL were associated with a higher risk of BPAR (P < .001), and the significance remained in Cox multivariable analysis (hazard ratio, 2.79; 95% CI, 1.76–3.82; P = .001). No significant differences were observed in glomerular filtration rate, cytomegalovirus, BK viremia, or BK nephropathy between groups at post-transplant month 12.

Conclusions

Keeping the average TTL of the first month after transplantation at ≥8 ng/mL not only helps prevent BPAR but may also minimize the toxic effects associated with relying on a single trough level.

8.
Renal transplant is the best form of treatment for most patients with end-stage renal disease. The aim of this study was to examine the prevalence of eye problems in patients with end-stage renal disease on the kidney transplantation waiting list in regard to their status (active vs temporarily disqualified). The cross-sectional study was conducted on 90 prevalent patients in 1 regional qualification center. There were 24 peritoneally dialyzed patients, 5 patients registered for preemptive transplantation, and 61 hemodialyzed patients. The average age of patients registered on the cadaver kidney waiting list was 50 (± 14) years, with a balanced sex ratio and a median dialysis duration of 38 months. The primary cause of end-stage renal failure was chronic glomerulonephritis in 42 cases, diabetic nephropathy in 10 cases, hypertensive nephropathy in 12 cases, autosomal dominant polycystic kidney disease in 7 cases, and other or unknown in the remaining patients. The major diagnosis was hypertensive angiopathy (related to the presence of long-term hypertension and history of kidney disease) in 56 patients, diabetic retinopathy in 8 patients, blindness in 4 cases (due to solvent intoxication in 1 case), and eyesight abnormalities (myopia, hyperopia, anisometropia) in 7 cases. Cataracts were described in 10 patients in addition to other findings. In 15 patients, predominantly younger ones, the ophthalmology examination was normal. Abnormalities were more common in patients on the inactive list. In the vast majority of potential kidney transplant recipients, ophthalmologic disturbances are primarily related to the underlying disease. The ophthalmology consult is part of the qualification process, but the abnormalities found are not exclusion criteria.

9.
Background

Malignancy is an important cause of mortality in renal transplant recipients. The aim of this study was to evaluate the incidence, prognosis, and survival of patients developing a de novo post-transplant cancer.

Methods

Using a retrospective cohort design, we evaluated the incidence of de novo cancers among kidney transplant patients in our hospital from January 2000 to December 2012. We also evaluated patient survival after tumor diagnosis.

Results

We included 535 kidney transplant recipients with a mean follow-up of 7.8 years; among them, 39 (7.2%) developed malignancies. Median time from transplant to cancer diagnosis was 3 years, with a median age at diagnosis of 60 years. Male patients were significantly older at the time of cancer diagnosis (68.5 years) compared with women (38 years, P < .05), and cancer diagnosis occurred significantly earlier in men (3.5 years since transplantation) than in women (8.5 years, P < .05). Among the 39 patients affected by a de novo post-transplant cancer, 18 (46.2%) died, with an average age at death of 58.5 years. The average time from cancer diagnosis to death was 1.5 years. Among the patients who did not develop a post-transplant cancer, 83 (16.7%) died, with a median age at time of death of 54.5 years (P < .05).

Conclusions

Kidney transplant recipients are at higher risk of developing a post-transplant cancer. Prognosis after cancer diagnosis is poor, probably as a consequence of a more aggressive behavior of cancer in transplant recipients. Intensive screening protocols could allow for an earlier diagnosis, thereby improving the long-term outcome of these patients.

10.

Background

There is no unanimity in the literature regarding the value of transbronchial biopsies (TBBs) performed at a scheduled time after lung transplantation (surveillance TBBs [SBs]) compared to biopsies performed for suspected clinical acute rejection (clinically indicated TBBs [CIBs]). This study presents an assessment of our experience over the last 4 years through a retrospective analysis of the collected data.

Methods

In our center, SBs are performed at 3, 6, and 12 months after a transplant. Data from 110 patients who underwent TBB between January 2013 and November 2017 were collected, including clinical and functional data along with histologic results and complications.

Results

Overall, 251 procedures were performed: 223 for surveillance purposes and 28 for clinical indications. The SB diagnostic rate was 84%. Grade 2 acute rejection (AR) was detected in 9 asymptomatic patients, all of whom were medically treated, with downgrading of AR documented in all cases. The rate of medical intervention in the SB group was 8%. The CIB diagnostic rate was 96%. The rate of AR detected by CIBs was significantly higher than by SBs (36% versus 4%; P < .0001). Overall, the major complication rate was 4%; no patients required transfusions, and no mortality occurred in the patient cohort.

Conclusions

The surveillance protocol did not eliminate the need for CIBs, but early rejection was histologically detected in 8% of patients. The correlation between histologic and clinical data allows a more careful approach to transplanted patients.

11.

Introduction

In Poland there is an average of 2.31 doctors per 1000 population (according to the Organisation for Economic Co-operation and Development, 2016), one of the lowest rates among European Union countries. While the number of specialists remains low, the number of patients and the volume of their medical records keep increasing. To facilitate the work of doctors and to improve patient monitoring, most medical institutions have decided to implement IT systems. At the Department of General and Transplant Surgery of Pomeranian Medical University in Szczecin, we have been using the Asseco Medical Management Solutions (AMMS) system for 5 years. The aim of this study was to evaluate the usefulness of the AMMS system in monitoring and early detection of disorders in renal transplant recipients.

Materials and methods

The retrospective study included 100 patients who underwent surgical treatment between 2012 and 2017. By analyzing laboratory test results (ie, complete blood count), creatinine, urea, and potassium concentrations, and the doses of immunosuppressive drugs, we evaluated the usefulness of the AMMS system.

Results

The clinical usefulness of the AMMS system for evaluating predictors of early graft dysfunction was confirmed, and the system saved time in the analysis of medical records.

Conclusions

The AMMS system improved the management of medical records of patients after surgery and allowed retrospective evaluation of the treatment and its immediate modification. Unification of medical IT systems in Poland would increase the availability of patients' data and support information transfer among all medical institutions. It would also enable multicenter scientific meta-analyses.

12.
The number of recipients waiting for a transplant is increasing. In Japan, organs from expanded-criteria donors (ECDs) after circulatory death are used more frequently. We retrospectively analyzed long-term outcomes of kidney transplantation (KT) from expanded-criteria donation after circulatory death (DCD). From 1995 to 2013, 97 cases of KT from DCD donors were performed in our department. Death-censored graft survival rates of ECD kidneys (n = 50) versus standard-criteria deceased-donor (SCD) kidneys (n = 47) at 1, 5, and 10 years after transplantation were 84.0% vs 97.9%, 74.8% vs 95.6%, and 70.2% vs 81.8%, respectively. No significant difference was found between the 2 groups (P = .102). Kidneys from donors with a history of hypertension (HTN), donors with cerebrovascular events (CVE), and older donors had significantly lower 10-year graft survival rates (P values of .010, .036, and .050, respectively). Cox proportional hazard regression analyses showed donor age to be significantly associated with long-term graft survival independently of other factors. These results suggest that ECD kidneys remain an acceptable alternative to dialysis under certain conditions. Increased donor age was a significant risk factor determining long-term graft function. Moreover, comorbidities of HTN and CVE could become significant risk factors, especially in older donors.

13.
Atypical hemolytic uremic syndrome (aHUS) develops as the result of unregulated complement activation and precipitates de novo thrombotic microangiopathy. Plasma therapy is used to control the progression of the complement cascade, but it is not effective in all patients and is accompanied by a risk of infection and/or allergy. Eculizumab has been reported as an effective therapy for aHUS. We report the case of a 35-year-old woman who underwent effective eculizumab therapy for aHUS recurrence and progressive antibody-mediated rejection (AMR) after renal transplantation with preformed donor-specific antibodies (DSAs). She developed end-stage renal disease due to suspected IgA nephropathy at age 33 years. Kidney transplantation was performed at age 35 years, and aHUS recurred 2 weeks later, leading to progressive hemolytic anemia and renal dysfunction. She therefore underwent plasma therapy several times. Because severe allergy made it difficult to continue plasma therapy, eculizumab was proposed as an alternative therapy. Treatment with eculizumab was initiated 36 days after renal transplantation. After 3 years of eculizumab treatment, without plasma therapy, schistocytes decreased, haptoglobin increased to within normal limits, creatinine levels stabilized, and no further episodes of diarrhea were reported. At protocol biopsy 1 year after transplantation, she was diagnosed with C4d-negative subclinical AMR. However, the pathologic findings had resolved at follow-up biopsy 3 years after transplantation. We conclude that eculizumab alone, without plasma therapy, is sufficient to treat recurrence of aHUS and AMR due to DSAs after renal transplantation and to maintain long-term graft function.

14.

Objective

Bone morphogenetic proteins (BMP) belong to the transforming growth factor beta superfamily of proteins. This study was performed to evaluate the association of BMP gene polymorphisms with acute renal allograft rejection (AR) and graft dysfunction (GD) in Koreans.

Methods

Three hundred thirty-one patients who had kidney transplantation procedures were recruited. Transplantation outcomes were determined in terms of AR and GD criteria. We selected six single nucleotide polymorphisms (SNPs): rs1979855 (5′ near gene), rs1049007 (Ser87Ser), rs235767 (intron), rs1005464 (intron), rs235768 (Arg190Ser), and rs3178250 (3′ untranslated region).

Results

Among the six SNPs tested, the rs235767, rs1005464, and rs3178250 SNPs were significantly associated with AR (P < .05). The rs1049007 and rs235768 SNPs also showed an association with GD (P < .05).

Conclusions

These results suggest that BMP2 gene polymorphisms may be related to the development of AR and GD in kidney transplant recipients.

15.
Renal function after heart transplantation (HTx) typically follows a biphasic pattern, with an initial decline within 1 to 2 years. The trajectory of renal function after HTx is less well reported, especially in Asia. The aim of this cohort study was to describe the changes in HTx recipients' serum creatinine and estimated glomerular filtration rate (eGFR) levels over the 5 years following HTx in Taiwan.

Methods

We retrospectively reviewed 5 years of follow-up data for 440 consecutive adult patients (≥18 years) who underwent a first HTx from June 1987 to December 2014 at the National Taiwan University Hospital.

Results

Among the 422 participants analyzed, induction therapy consisted of intravenous rabbit antithymocyte globulin. We illustrate the trends over the years by dividing the subjects into 2 groups based on the immunosuppressive regimen era of transplantation (1987–2002 and 2003–2014). The pretransplantation median serum creatinine concentration was 1.2 mg/dL, rose to 1.4 mg/dL at 3 months after surgery, and remained steady over the 5 years after HTx. The pretransplant median eGFR was 67 mL/min/1.73 m2. The median serum creatinine concentration and eGFR after transplantation were significantly different from the pretransplantation values (P > .05). These results show an initial steep decline in renal function within 3 months after transplant, which then remained stable through 5 years after HTx.

Conclusion

Renal function deteriorates after HTx: we observed a steep deterioration in serum creatinine level and glomerular filtration rate within the first 3 months after HTx, followed by a slower rate of deterioration over the following months. We found a time-related progressive deterioration in renal function during the 5 years after HTx.

16.

Background

Cardiovascular disease is the leading cause of morbidity and mortality in kidney transplantation (KT) patients. The prevalence of left ventricular hypertrophy increases with the progression of renal insufficiency.

Methods

We investigated the association between the progression of renal insufficiency and left ventricular hypertrophy after KT. We reviewed KT patients at Seoul National University Hospital from January 1973 to December 2009. The creatinine elevation ratio (CER, the percentage change in the creatinine level from 1 month to 5 years after transplant) was calculated as follows: (creatinine level at 5 years minus creatinine level at 1 month)/creatinine level at 1 month × 100.
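
The CER definition above is simple enough to express directly in code; the following sketch is illustrative only, with made-up creatinine values rather than patient data, and it applies the 25% grouping threshold used in the Results.

```python
# Worked example of the creatinine elevation ratio (CER) defined above.
def creatinine_elevation_ratio(cr_1_month: float, cr_5_years: float) -> float:
    """Percentage change in creatinine from 1 month to 5 years after KT."""
    return (cr_5_years - cr_1_month) / cr_1_month * 100.0

def cer_group(cer: float, threshold: float = 25.0) -> str:
    """Classify a patient into the high-CER (>=25%) or low-CER (<25%) group."""
    return "high-CER" if cer >= threshold else "low-CER"

# Illustrative values only (not patient data):
cer = creatinine_elevation_ratio(cr_1_month=1.2, cr_5_years=1.6)
print(f"CER = {cer:.1f}% -> {cer_group(cer)}")  # CER = 33.3% -> high-CER
```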

Results

The study population was classified into a high-CER group (CER ≥25%) and low-CER group (CER <25%). Mean left ventricular mass index (LVMI) values were 135.7 and 134.7 g/m2 before KT and 101.7 and 123.7 g/m2 at 5 years after KT in the low-CER and high-CER groups, respectively. The LVMI before or 1 year after KT was not different between the 2 groups, but the LVMI at 5 years post-transplant was higher in the high-CER group than in the low-CER group. The LVMI increased after its initial decrease in the high-CER group, whereas its reduction was maintained in the low-CER group during the 5 years after KT (P = .009, repeated-measures analysis of variance).

Conclusions

These data suggest that deterioration of renal allograft function is associated with left ventricular remodeling after KT.

17.

Background

Increased cold ischemia time in cadaveric kidney transplants has been associated with a high rate of delayed graft function (DGF) and even with impaired graft survival. Kidney transplantation using in-house donors reduces cold preservation time. The purpose of this study was to compare the clinical outcomes of transplantation from in-house versus imported donors.

Methods

We retrospectively reviewed the medical records of donors and recipients of 135 deceased-donor kidney transplantations performed in our center from March 2009 to March 2016.

Results

Among the 135 deceased-donor kidney transplants, 88 recipients (65.2%) received kidneys from in-house donors. Median cold ischemia time for transplantation from in-house donors was shorter than for imported donors (180.00 vs 300.00 min; P < .001). The risks of DGF and slow graft function were increased for imported versus in-house donor kidneys. An imported kidney was independently associated with greater odds of DGF in multivariate regression analysis (odds ratio, 4.165; P = .038). However, the renal function of recipients at 1, 3, 5, and 7 years after transplantation was not significantly different between the 2 groups.

Conclusions

Transplantation with in-house donor kidneys was significantly associated with a decreased incidence of DGF, but long-term graft function and survival were similar to those obtained with imported donor kidneys.

18.
Purpose

Kidney transplantation from elderly donors with acute kidney injury (AKI) has increased recently due to donor shortage, but the safety and prognosis are not well known. We examined the effect of donor age on the outcomes of kidney transplantation (KT) from donors with histologic AKI.

Materials and methods

We retrospectively analyzed the medical records of 59 deceased-donor KTs with acute tubular necrosis (ATN) on preimplantation donor kidney biopsy between March 2012 and October 2017. Histologic evaluations of ATN, inflammation, glomerulosclerosis (GS), interstitial fibrosis, tubular atrophy, and arterial sclerosis were performed.

Results

Twenty and 39 recipients received kidneys from elderly (>60 years; 68.9 ± 5.0 years) and young (≤60 years; 45.9 ± 9.6 years) donors with ATN, respectively. Among the elderly donors, significantly increased donor creatinine was observed in only 44% of donors, and there were more donors with diabetes, more women, and a higher proportion of GS than among the young donors. Six months after KT, estimated glomerular filtration rate was significantly lower in recipients who received kidneys from elderly donors compared to young donors. Donor creatinine level and AKI severity did not significantly affect recipient outcomes in either group. However, the presence of ATN and GS were significant factors that worsened renal outcomes after KT from elderly donors only. On multivariate analysis, severe ATN was the strongest independent predictor of renal function in recipients of elderly donor kidneys.

Conclusions

Histologic injury may predict renal outcomes in KT from elderly donors. A donor allocation protocol including preimplantation renal histology should be established for KT from elderly donors.

19.

Background

Numerous studies have shown that osteoporosis is common in kidney transplant recipients. However, the change in bone mineral density after kidney transplantation (KT) is not fully understood.

Methods

Thirty-nine kidney transplant recipients who underwent bone densitometry pretransplant and at 24 months after KT were reviewed.

Results

The recipients' median age (44.5 ± 10.7 years) and dialysis duration before KT (4.2 ± 3.4 years) were recorded. The T-scores of the lumbar spine and femur neck at 24 months after KT were positively associated with the respective pretransplant T-scores (P < .001 in the lumbar spine and P < .001 in the femur neck). However, the T-scores after KT did not show significant change (P = .680 in the lumbar spine, P = .093 in the femur neck). Changes in the T-scores of the lumbar spine and femur neck over 24 months (delta T-scores) were negatively associated with the respective pretransplant T-scores (P = .001 in the lumbar spine, P = .026 in the femur neck), and these associations remained after adjustment for other variables.

Conclusion

The change in bone mineral density was related to pretransplant bone mineral density. Careful follow-up with bone densitometry is needed for KT recipients.

20.

Background

This study investigated the prevalence of osteoporosis and the risk factors for its progression in kidney transplant recipients (KTRs).

Methods

Dual energy X-ray absorptiometry was used to prospectively measure changes in bone mineral density (BMD) before kidney transplantation (KT) and 1 year after transplantation in 207 individuals. We also analyzed the risk factors for osteoporosis progression during this period.

Results

Prior to KT, the mean BMD score (T-score of the femur neck area) was −2.1 ± 1.2, and the prevalence of osteoporosis was 41.5% (86/207). At 1 year post-transplantation, the mean BMD score significantly decreased to −2.3 ± 1.1 (P < .001), and the prevalence of osteoporosis increased to 47.3% (98/207; P = .277). The BMD score worsened over the study period in 69.1% (143/207) of patients, improved in 24.1% (50/207), and showed no change in 6.8% (14/207). Minimal intact parathyroid hormone (iPTH) improvement after KT was found to be an independent risk factor for osteoporosis progression.

Conclusions

This study demonstrates progressive loss of BMD after KT and suggests that sustained secondary hyperparathyroidism might influence the progression of osteoporosis.
