Similar Documents
20 similar documents were found.
1.

Background

Kidney transplant recipients are frequently treated for other medical conditions and experience polypharmacy. The aim of our study was to evaluate quality of life in relation to medication burden in these patients.

Methods

We studied 136 unselected patients with mean post-transplant time of 7.2 ± 4.6 years. Quality of life was evaluated using a validated Polish version of the Kidney Disease Quality of Life–Short Form questionnaire. Data concerning the type (generic name) and number of currently prescribed medications were collected by interview survey. The participants were divided into 3 groups: group 1, patients with a maximum of 4 different medications (n = 37); group 2, patients with 4 to 9 medications (n = 76); and group 3, patients receiving at least 10 different medications (n = 23).

Results

The number of medicines taken regularly ranged from 2 to 16. Patients with ≥10 drugs had the highest body mass index and lowest estimated glomerular filtration rate. Compared with patients from the 2 other groups, patients treated with ≥10 drugs presented lower subscale scores for physical functioning (65.9 vs 84.5 in group 1 and 83.4 in group 2, P < .001 for both comparisons), pain (57.2 vs 82.7 and 76.5, respectively, P < .001 for both), social function (66.8 vs 82.1 and 80.4, respectively, P = .04 for both), and energy/fatigue (54.8 vs 67.7, P = .03, and 65.4, P < .05). Multivariate regression analysis revealed that the number of drugs independently influenced the physical functioning, pain, and social function subscales.

Conclusions

Polypharmacy is associated with lower quality of life in patients after successful kidney transplantation. Its negative impact is seen particularly in physical functioning and pain severity.

2.

Background

Nowadays, a reduced initial daily dose of tacrolimus (Tac) (0.1–0.15 mg/kg) is recommended for the majority of kidney transplant recipients (KTRs). The aim of the study was to analyze the safety of such a regimen, including the risk of a first inadequately low Tac blood level, acute rejection (AR), or early graft dysfunction.

Methods

In 2011, we introduced a modified (0.1–0.15 mg/kg/d) initial Tac dosing regimen in older (>55 years) and/or overweight KTRs. To assure the safety of this protocol, we monitored the risk of inadequately low blood Tac level (<6 ng/mL) and incidence of AR or delayed graft function (DGF). The historical cohort with the higher Tac dosing regimen (0.2 mg/kg/d, n = 208) served as a control group.

Results

The mean Tac daily dose in 78 KTRs (group with reduced dosing) was 0.133 (95% confidence interval [CI], 0.130–0.136) mg/kg and was significantly lower than the standard, previously prescribed dose of 0.195 (95% CI, 0.194–0.197) mg/kg. Of note, induction therapy was employed twice as often in the reduced Tac dosing group. The dose reduction resulted in a slight, nonsignificant decrease in the first Tac trough level. The percentages of patients with first Tac troughs <6 ng/mL (5.1% vs 4.8%), AR (6.4% vs 5.8%), and DGF (25.6% vs 31.2%) were similar in the reduced and standard dosing groups.

Conclusion

The currently recommended reduction in initial Tac dosing does not increase the risk of inadequate immunosuppression and does not affect early graft function. Even with the dose reduction, there is still a substantial risk of Tac overdosing in older or overweight KTRs.

3.

Background

Renal transplant candidates present immune dysregulation caused by chronic uremia, and deceased kidney donors present immune activation induced by brain death. Pretransplant donor and recipient immune-related gene expression was examined in the search for novel predictive biomarkers linking recipient and donor pretransplant immune status with transplant outcome.

Materials and methods

This study included 33 low-risk consecutive renal transplant recipients and matched deceased donors. The expression of 29 genes linked to tissue injury, T-cell activation, cell migration, and apoptosis was assessed in postreperfusion kidney biopsies, as was the expression of 14 genes in pretransplant peripheral blood of the kidney recipients. Gene expression was analyzed with real-time polymerase chain reaction on custom-designed low-density arrays.

Results

Donor MMP9 expression was related to delayed graft function occurrence (P = .036) and short-term kidney allograft function (14th day rs = −0.44, P = .012; 1st month rs = −0.46, P = .013). Donor TGFB1 expression was associated with short- and long-term graft function (14th day rs = −0.47, P = .007; 3rd month rs = −0.63, P = .001; 6th month rs = −0.52, P = .010; 12th month rs = −0.45, P = .028; 24th month rs = −0.64, P = .003). Donor TGFB1 expression was not related to donor age (rs = 0.32, P = .081), which was also an independent factor influencing the outcome. Recipient gene expression was not related to graft function but determined the acute rejection risk. Recipient IFNG and, to a lesser extent, IL18 expression were protective against acute rejection (area under the curve [AUC] 0.84, P < .001, and AUC 0.79, P < .001, respectively).
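As an illustration of the two kinds of statistics reported above (Spearman rank correlations with graft function and AUC for rejection prediction), a minimal Python sketch is given below; the arrays are invented example data, not values from the study:

import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import roc_auc_score

# Hypothetical example data (not from the study): donor TGFB1 expression,
# recipient eGFR at month 3, recipient IFNG expression, and rejection labels.
tgfb1_expression = np.array([1.2, 0.8, 2.1, 1.7, 0.5, 1.9, 1.1, 2.4])
egfr_month3      = np.array([62,  71,  38,  45,  80,  41,  66,  35])
ifng_expression  = np.array([0.4, 1.8, 0.3, 2.2, 1.5, 0.2, 1.9, 2.5])
acute_rejection  = np.array([1,   0,   1,   0,   0,   1,   0,   0])  # 1 = rejection

# Spearman rank correlation between donor gene expression and graft function
rs, p_value = spearmanr(tgfb1_expression, egfr_month3)
print(f"Spearman rs = {rs:.2f}, P = {p_value:.3f}")

# AUC for a "protective" marker: higher IFNG should mean lower rejection risk,
# so score with the negated expression (equivalently, report 1 - AUC).
auc = roc_auc_score(acute_rejection, -ifng_expression)
print(f"AUC for IFNG as a predictor of acute rejection: {auc:.2f}")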

Conclusion

Kidney transplant outcome depends on the interplay between donor-related immune factors, which mostly affect allograft function, and the recipient immune milieu, which influences the alloreactive response.

4.

Background

Liver transplantation (LTx) is the only effective treatment for end-stage liver failure. Due to the ongoing lack of organs available for transplantation, there is a tendency to extend liver donor selection criteria. The aim of the study was to determine whether extension of donor acceptance criteria with increasing experience in LTx occurred at our transplant center.

Methods

This retrospective analysis included 288 donors from whom organs were procured between 2005 and 2016. The donors were divided chronologically into 4 equally sized groups. The subsequent groups were assessed according to sex, age, height, body mass index (BMI), cause of death, number of days spent in the intensive care unit, number of episodes of cardiac arrest before organ removal, and results of laboratory and virologic tests.

Results

A statistically significant increase in the age of accepted donors was observed between group 2 and group 4 (median 40 vs 45 years, P < .05). There was a significant increase in the acceptance of anti–HBc-positive donors (0% in group 1 vs 7% in group 4). The remaining parameters did not show statistically significant differences.

Conclusion

Experience acquired by our transplant center during the period of analysis did not lead to extension of liver donor acceptance criteria. Statistically significant differences for liver donor age and virologic profile (anti-HBc) between groups were observed; however, overall analysis did not confirm a clear tendency to extend liver donor acceptance criteria at this center.

5.

Background

Liver transplantation (LTx) is one of the most complex transplant procedures. The aim of the present study was to determine whether the learning process can be observed after the introduction of LTx in a center with extensive previous experience in renal transplantation.

Methods

This retrospective analysis included 264 primary LTx procedures performed with the piggyback technique (2005–2016). The procedures were divided into 4 equal groups. The characteristics of the recipients, data related to the surgery, and the postoperative course and complications were analyzed.

Results

We observed a significant reduction in surgical time and in the anhepatic phase duration between Group 1 and the other groups (median surgical time was 455 minutes vs 415 minutes, 410 minutes, and 387 minutes, respectively, P < .05; median anhepatic phase duration was 75 minutes vs 60 minutes, 62 minutes, and 60 minutes, respectively, P < .05). There was a decrease in the number of transfused blood units (median of 6 packs in Group 1 vs 3 packs in Group 4, P < .05) and a decrease in blood recovered from the operating field using the Cell Saver system (median of 1570 mL in Group 1 vs 1057 mL, 1123 mL, and 1045 mL, respectively, P < .05). A significant reduction in the number of hemorrhages was found (1.5% in Group 4 vs 13.6%, 10.6%, and 7.6% in the other groups, P < .05). Differences in the remaining studied parameters were not statistically significant.

Conclusions

Extensive previous transplantation experience may account for the absence of typical features of a learning process.

6.
The aim of this paper was to describe the differences in vascular endothelial growth factor (VEGF) concentration in porcine kidneys removed from living donors (group I), donors after prior induction of brain death by brain herniation (group II), and donors after cardiopulmonary arrest (group III). Each group consisted of 6 animals that underwent removal of both kidneys; the kidneys were rinsed, stored for 24 hours at 4°C, and rinsed again. Renal specimens (4 g) were collected before and after perfusion (times 0 and 1), after 12 hours (time 2), and after reperfusion (time 3). Western blot was used to evaluate VEGF concentration in the collected tissue homogenates. Additionally, the levels of VEGF, interleukin 1β, tumor necrosis factor α, and endothelial nitric oxide synthase (eNOS) were detected with enzyme-linked immunosorbent assays. Directly after the removal procedure, no significant differences in VEGF levels (IOD) were observed depending on the donor (moderate levels were observed in all groups: 1.51 in group I, 1.48 in group II, and 1.35 in group III). As a consequence of perfusion and 12 hours of storage, a stable concentration was observed in groups I and III, with a gradual increase of VEGF levels in group II (1.23, 2.08, and 1.67 in the respective groups at time 1; 1.49, 2.12, and 1.63 at time 2). After the following 12 hours, a significantly (P < .05) higher level of VEGF was observed in group II (2.34) in comparison to groups I and III (1.58 and 1.81, respectively). In group I, a correlation between VEGF and IL-1β concentrations was observed, while in group II there was a correlation between VEGF and eNOS levels.

7.

Introduction

Primary graft dysfunction (PGD) is a multifactorial syndrome related to the most adverse outcomes after liver transplantation. Ischemia–reperfusion injury is recognized as the predominant cause of this complication. PGD may be subdivided into early allograft dysfunction, diagnosed by the presence of a serum bilirubin level ≥10 mg/dL (171 μmol/L), International Normalized Ratio ≥1.6, or alanine and aspartate transaminase levels ≥2000 IU/L on the seventh postoperative day; and primary nonfunction, defined as either a need for retransplantation or patient death within the first 7 days. We aimed to determine the preoperative and intraoperative risk factors for PGD.
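As a rough illustration of how the early allograft dysfunction and primary nonfunction criteria quoted above could be applied to laboratory data, here is a minimal Python sketch; the function names and example values are hypothetical, while the thresholds are taken directly from the definition in this abstract:

# Hypothetical sketch of the early allograft dysfunction (EAD) rule quoted above:
# bilirubin >= 10 mg/dL, INR >= 1.6, or ALT/AST >= 2000 IU/L on postoperative day 7.
def has_early_allograft_dysfunction(bilirubin_mg_dl: float,
                                    inr: float,
                                    alt_iu_l: float,
                                    ast_iu_l: float) -> bool:
    """Return True if any day-7 criterion for EAD is met."""
    return (bilirubin_mg_dl >= 10.0
            or inr >= 1.6
            or alt_iu_l >= 2000.0
            or ast_iu_l >= 2000.0)

def has_primary_nonfunction(retransplanted_within_7_days: bool,
                            died_within_7_days: bool) -> bool:
    """Primary nonfunction: retransplantation or death within the first 7 days."""
    return retransplanted_within_7_days or died_within_7_days

# Example with made-up day-7 values
print(has_early_allograft_dysfunction(bilirubin_mg_dl=12.3, inr=1.2,
                                      alt_iu_l=850, ast_iu_l=990))   # True
print(has_primary_nonfunction(False, False))                          # False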

Materials and Methods

We enrolled 109 patients who underwent orthotopic liver transplantation between 2012 and 2016. The analysis included, inter alia, biochemical parameters, blood morphology, blood transfusions, and intraoperative fluctuations of blood pressure.

Results

Fourteen percent of patients were diagnosed with PGD. Using multivariate logistic regression together with receiver operating characteristic and area under the curve analysis, a preoperative neutrophil level above 4030/μL (OR = 4.03, P = .012) and a decrease in mean arterial pressure after reperfusion were recognized as the major independent PGD risk factors.

Conclusions

A high preoperative neutrophil level may be a novel recipient-related risk factor for PGD. A decrease in arterial blood pressure after graft reperfusion may also influence the development of PGD.

8.

Objective

To analyze the results of transplantation of kidneys procured from donors after brain death aged 60 years and older (hereafter denoted by “≥60”) compared to kidneys procured from donors after brain death aged 40–59 years (hereafter denoted by “40–59”) over a medium-term follow-up period, and to assess factors that affect recipient and kidney graft survival.

Material and methods

A total of 92 recipients of kidneys procured from donors after brain death aged ≥60 were enrolled in the study. The control group comprised 363 recipients of kidneys procured from donors after brain death aged 40–59.

Results

Mean values of serum creatinine were higher in recipients of kidneys procured from donors after brain death ≥60 compared to controls after 3 years: 168.2 ± 57.5 (n = 59) vs 147.9 ± 65.7 μmol/L (n = 294), P < .05; and after 5 years: 196.2 ± 95.3 (n = 38) vs 157.3 ± 80.0 μmol/L (n = 211), P < .01. Restricted mean recipient survival time was 56.4 (95% confidence interval: 55.0–57.8) vs 52.0 (48.0–56.1) months, P < .05, and restricted mean kidney graft survival time was 51.6 (49.6–53.5) vs 43.9 (39.0–48.9) months, P < .01, in recipients who received kidneys from donors after brain death 40–59 and from donors after brain death ≥60, respectively. In Cox regression, donor death due to cardiovascular disease proved to be a factor increasing the risk of kidney graft loss (hazard ratio 1.553, P < .001).

Conclusions

The survival and function of kidneys procured from donors after brain death aged ≥60 remain worse at medium-term follow-up compared to kidneys from donors aged 40–59, and the donor-dependent risk factor for kidney graft loss is donor death due to cardiovascular disease.

9.
Complement activation is considered one of the mediators of renal ischemia-reperfusion injury. Elevated levels of C5b-9, C3a, and C5a are detected in sera of deceased kidney donors. The goal of the study was to characterize the functional activity of complement pathways in donor sera and to assess their influence on transplant outcome.

Materials and methods

Sixty-four deceased kidney donors (age 45 ± 16 years; 28 female, 36 male) and 27 healthy controls (age 42 ± 12 years; 14 female, 13 male) were enrolled in the study. The results of transplantation for the respective 122 kidney recipients were included in the analysis. The functional activities of classical (CP), lectin (LP), and alternative (AP) pathways were measured using Wielisa-kit (reference normal level = 100%). In most cases, decreased functional activity reflects the activation status of the pathway.

Results

The median (interquartile range) functional activities of the pathways in donor sera were CP 118 (89–150)%, LP 80 (20–127)%, and AP 74 (50–89)%, and did not differ from the control values of CP 110 (102–115)%, LP 81 (26–106)%, and AP 76 (61–88)%. The frequency of pathway activation observed in controls was CP 0%, LP 11%, and AP 0%. Deceased donors did not differ from controls in activation of the classical (11%) and lectin (13%) pathways, but presented a higher rate of alternative pathway activation (19%, P = .03). Neither the functional activity of any pathway nor its activation was shown to influence the transplant outcome.

Conclusion

Complement activation via the alternative pathway was observed in deceased donor sera. No predictive potential of donor complement functional activity for the transplant outcome could be demonstrated.

10.

Background

Optimization of immunosuppressive therapy has reduced the incidence of acute rejection; as a result, vascular complications, including graft thrombosis, have emerged as the main cause of graft loss in the early post-transplant period. A thrombophilic condition may lead to renal graft loss. The aim of the study was to assess renal graft function in thrombophilic renal recipients receiving anticoagulation treatment.

Methods

This is a retrospective study including 29 renal recipients (ktx group) with a history of thrombosis and a confirmed thrombophilic factor. Graft function was evaluated by median serum creatinine concentration at the third month after ktx (SCr1) and at the end of the observation (SCr2) with respect to hypercoagulability (factor V Leiden [FVL], G20210A mutation, antiphospholipid antibodies, deficiency of protein S [PS] or protein C [PC], factor VIII >200%).

Results

Compared with controls, recipients in the ktx group more often underwent retransplantation because of graft thrombosis (P < .001) and urgent transplantation (P = .008), more often received induction therapy (P = .021), more often underwent biopsy for an indication other than protocol (P = .001), and more often experienced acute rejection (P = .042). Differences in graft function (SCr2) were found at the end of observation (ktx group vs controls: 1.9 mg/dL vs 1.3 mg/dL, respectively, P = .014). Multivariate analysis revealed inferior graft function in thrombophilic recipients in the model with SCr1 <2 mg/dL (odds ratio 0.07, 95% confidence interval 0.01–0.57, P = .014) and in the model with SCr2 <2 mg/dL (odds ratio 0.15; 95% confidence interval 0.04–0.54, P = .004). The incidence of antiphospholipid syndrome was 31%; FVIII, 31%; FVL, 24.1%; and PC/PS deficiency, 13.8%. After anticoagulation was introduced, no thromboembolic events or bleeding complications occurred.

Conclusion

Hypercoagulability is not a contraindication to ktx but may worsen graft function. Post-transplant care in thrombophilic recipients is demanding (retransplantation, immunization, protocol biopsy, anticoagulation), but is the only means by which to maintain a graft.

11.
Both Toll-like receptor 4 (TLR4) and monocytes integrate multiple stimuli, and may therefore contribute in different ways to chronic injury of a transplanted kidney.

Aim

The aim of our study was to determine whether monocyte TLR4 expression can serve as a diagnostic tool and possibly a target for therapeutic intervention.

Materials

We studied 143 kidney transplant (KT) patients (88 male, 55 female; 50.3 ± 12.8 years); the median time post KT was 10.4, follow-up was 11.4 months, and 46 patients had a history of delayed graft function (DGF+). Monocyte mRNA TLR4 expression (TLR4ex) was also assessed in a control group of 38 healthy volunteers. DGF+ patients were divided by the median of TLR4ex (−0.1034) into 2 groups: low TLR4 expression (L-TLR4ex) and high TLR4 expression (H-TLR4ex).

Results

We showed that, in comparison with DGF− patients, DGF+ patients had much lower TLR4ex and worse KT function both currently (TLR-day) (serum creatinine [sCr] P = .002; estimated glomerular filtration rate [eGFR] P = .001) and post follow-up (sCr P = .006; eGFR P = .005). Comparison of DGF+ patients with L-TLR4ex vs H-TLR4ex showed no differences in TLR-day KT function but did show differences post follow-up (sCr P = .01; eGFR P = .02; ΔeGFR% P = .001). Regression analysis showed an association of recipient age, tacrolimus concentration, and uremic milieu (ie, TLR-day sCr and eGFR) with TLR4ex. Reverse regression analysis indicated an association of TLR4ex (especially L/H-TLR4ex) with post follow-up parameters of KT function and numeric/qualitative measures of change.

Conclusion

DGF affects the fate of a graft. Within several months after transplantation, TLR4ex of peripheral blood mononuclear cells declines in DGF patients. Low TLR4ex in patients with DGF+ is associated with a poor prognosis for KT function. In patients with DGF+, the proper selection of immunosuppression (tacrolimus dosing) is very important; higher concentrations of tacrolimus may improve prognosis. Analysis of TLR4ex change may be a useful parameter for assessing immunosuppression efficacy. It is important for transplanted organ function that peripheral blood mononuclear cells effectively leave the circulation and remain in the graft.

12.
Previously transplanted highly sensitized patients experience problems with subsequent transplantation. It is also difficult to provide optimal hemodynamic conditions during successive kidney transplantation in heart transplant recipients.

Patient and methods

We present the case of a 56-year-old patient with end-stage renal failure after heart transplantation performed 21 years earlier, hemodialyzed via an arteriovenous fistula. The patient had 69% panel-reactive antibodies, had been on the active waiting list since 2013, and presented 335 positive crossmatches with deceased donors. He also had a positive crossmatch with a potential living donor. Detailed examination of anti-HLA antibodies revealed the absence of IgG donor-specific antibodies and a negative crossmatch with dithiothreitol-treated serum. Transplantation from his wife was performed with a positive crossmatch after 4 plasma exchanges and thymoglobulin induction. Because sympathetic and parasympathetic denervation of the transplanted heart and the presence of the arteriovenous fistula induced volume overload of the right heart, we used central venous pressure (CVP) and the PiCCO2 system for postsurgical assessment of cardiac output.

Results

Monitoring with CVP and other static preload indices obtained by PiCCO2 (extravascular lung water, global end-diastolic volume index), as well as the dynamic parameters obtained by PiCCO2 (pulse pressure variation, stroke volume variation), was not sensitive enough to describe the recipient's volume status. Immediate graft function was observed, and after 11 months a satisfactory estimated glomerular filtration rate was noted with the absence of donor-specific antibodies.

Conclusion

A history of heart transplantation with an existing arteriovenous fistula makes clinical tools such as continuous cardiac output monitoring and CVP inadequate for describing the hemodynamic situation. A high level of panel-reactive antibodies and a positive crossmatch possibly caused by IgM antibodies do not necessarily preclude kidney transplantation.

13.

Background

Renin-angiotensin system (RAS) blocking agents efficiently control hypertension in renal transplant recipients (RTRs), and reduce proteinuria and post-transplant erythrocytosis. A beneficial effect on the retardation of the long-term decline in renal function has not yet been demonstrated. The aim of the study was to evaluate the effects of RAS blockade on allograft function.

Methods

To minimize donor variability and bias, 33 pairs of RTRs receiving grafts from the same donor were included in the retrospective analysis. A total of 66 RTRs were enrolled; in each pair, 1 patient used an angiotensin-converting enzyme inhibitor or angiotensin receptor blocker for a minimum period of 60 months (RAS[+]) and the other did not use one at all (RAS[−]).

Results

There were no differences between RAS(+) and RAS(−) subjects in terms of age, body mass index, number of mismatches, duration of total ischemia, episodes of cytomegalovirus infection, acute rejection, or immunosuppressive treatment. Significantly more RAS(+) patients presented with diabetes and cardiovascular complications. Among RAS(+) patients, angiotensin-converting enzyme inhibitors and angiotensin receptor blockers were used in 28 (84.84%) and 5 (15.15%) patients, at a mean dose of 23.03 ± 16.83% and 30 ± 11.18% of their maximum doses, respectively. There were no significant differences in estimated glomerular filtration rate changes (−0.37 ± 12.68 vs 2.54 ± 20.76 mL/min) or serum creatinine changes (0.05 ± 0.39 vs 0.14 ± 0.79 mg/dL) between RAS(+) and RAS(−) patients during the 60-month follow-up.

Conclusion

Agents inhibiting the RAS did not significantly affect graft function in RTRs during 60 months of observation.

14.

Background

It has been determined that patients with renal allograft failure constitute about 25% of patients on the waiting lists.

Methods

We analyzed 406 patients who received a kidney graft from 2013 to 2015 in a single center. The analysis resulted in 33 pairs of patients: for one recipient in the pair it was the first transplantation and for the other it was the second or a subsequent one. Graft and patient survival, graft function, delayed graft function episodes, primary nonfunction, and acute rejection episodes were analyzed to assess the outcome of kidney retransplantation. The follow-up period was 2 years. Delayed graft function was observed in both groups (P = .3303).

Results

Although there were twice as many episodes of acute rejection in the second group as in the first group (8 vs 4), the difference was not statistically significant (P = .1420). Primary graft dysfunction was observed only in the second group. Five patients in the second group lost their kidney graft during the follow-up period. The probability of graft loss in the second group was 3% on the day of transplantation, 12% after 3 months, and 15% after 13 months. All of the patients survived the 2-year follow-up period. A similar estimated glomerular filtration rate was observed in both groups.

Conclusion

There are no statistically significant differences in kidney graft function between patients with the first transplantation and those with the repeat one. Good kidney transplantation results are attainable in both groups. It seems that retransplantation is the best treatment option for patients with primary graft failure.

15.
Background

Rejection is still a barrier to long-term allograft survival, but there are few reports of clinical outcomes according to rejection type. The purpose of this study was to investigate differences in pathologic features and graft outcomes of rejection after kidney transplant (KT).

Materials and Methods

We retrospectively analyzed 139 kidney transplant recipients diagnosed with rejection by allograft biopsy between 2006 and 2018. We divided the recipients into 3 groups: T cell–mediated rejection (TCMR), antibody-mediated rejection, and mixed rejection. We investigated clinical characteristics, pathologic findings, death-censored graft survival rates, and patient survival rates among the 3 groups.

Results

Mean follow-up duration was 113.5 (SD, 80.6) months. The mixed rejection group was significantly the youngest. There were no significant differences in the proportions of sex, KT type, KT number, number of HLA mismatches, induction immunosuppressant, or maintenance immunosuppressant among the 3 groups. Among the pathologic findings, microvascular inflammation and C4d differed significantly among the 3 groups. Death-censored graft survival was lowest in the mixed rejection group. In multivariate analysis, recipient age, TCMR, and positive C4d were the risk factors associated with graft failure. However, patient survival rates showed no significant differences among the 3 groups.

Conclusions

Our study showed that mixed rejection had a poorer prognosis than the TCMR and antibody-mediated rejection groups, and that TCMR and positive C4d were the most important risk factors for graft survival. Therefore, constant monitoring through allograft biopsy and early treatment of rejection are very important for post-transplant clinical outcomes.

16.
Background

Slow graft function (SGF) is considered to be an intermediate state between immediate graft function (IGF) and delayed graft function (DGF). However, the criteria for SGF are still arbitrary, and the clinical outcomes of SGF are not fully understood.

Methods

A total of 212 deceased donor kidney transplantation recipients were enrolled. Three schemas were adopted, which classified SGF according to the serum creatinine (Cr) level by a given postoperative day (POD). SGF was defined as Cr ≥ 3.0 mg/dL on POD5, Cr ≥ 2.5 mg/dL on POD7, and Cr ≥ 1.5 mg/dL on POD14 without dialysis in schemas I, II, and III, respectively (see the sketch following this abstract). Estimated glomerular filtration rate (eGFR) after transplantation, acute rejection, and graft survival were compared in each schema. Decreased renal function, defined as eGFR less than 30.0 mL/min/1.73 m2, was also compared.

Results

In schemas I and III, SGF had significantly lower eGFR at 3 months after transplantation compared with IGF (P < .017), and only schema III maintained the difference until 36 months after transplantation. The incidence of decreased renal function showed significant differences among groups in schemas I and III (P < .05). Graft survival did not show a significant difference among groups in any schema. However, the SGF and DGF groups showed a higher probability of decreased renal function than the IGF group (P < .017) in schemas I and III.

Conclusions

In deceased donor kidney transplantation, certain definitions of SGF identified significantly worse clinical outcomes compared with IGF, suggesting a similar impact to DGF. It is necessary to reach a consensus on a clearer definition of SGF with further studies.
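The schema thresholds quoted in the Methods above can be written as a simple classification rule. The Python sketch below is a hypothetical illustration only; the function names, the assumed dialysis-based DGF definition, and the example values are not from the study:

# Hypothetical sketch of the SGF schemas described above.
# schema I:   SGF if Cr >= 3.0 mg/dL on POD5  (without dialysis)
# schema II:  SGF if Cr >= 2.5 mg/dL on POD7  (without dialysis)
# schema III: SGF if Cr >= 1.5 mg/dL on POD14 (without dialysis)
SCHEMAS = {
    "I":   {"pod": 5,  "cr_threshold": 3.0},
    "II":  {"pod": 7,  "cr_threshold": 2.5},
    "III": {"pod": 14, "cr_threshold": 1.5},
}

def classify_graft_function(cr_by_pod: dict, dialysis_first_week: bool, schema: str) -> str:
    """Classify a recipient as DGF, SGF, or IGF under the chosen schema.

    cr_by_pod maps postoperative day -> serum creatinine (mg/dL).
    DGF is assumed here to mean dialysis in the first post-transplant week.
    """
    if dialysis_first_week:
        return "DGF"
    rule = SCHEMAS[schema]
    if cr_by_pod[rule["pod"]] >= rule["cr_threshold"]:
        return "SGF"
    return "IGF"

# Example with made-up values: creatinine 2.7 mg/dL on POD7, no dialysis
example_cr = {5: 3.4, 7: 2.7, 14: 1.2}
print(classify_graft_function(example_cr, dialysis_first_week=False, schema="II"))   # SGF
print(classify_graft_function(example_cr, dialysis_first_week=False, schema="III"))  # IGF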

17.

Introduction

Kidney transplantation procedures commonly result in a cold ischemia time (CIT) gap when both kidney grafts are implanted in the same center. Owing to logistics, the procedures are usually performed consecutively, completing one surgery and then the other. CIT constitutes an independent risk factor for the development of delayed graft function (DGF) in kidney transplants. The effect that CIT exerts on graft and patient survival is still unclear. This study evaluates the relation between CIT and transplant outcomes by comparing paired kidney transplants in terms of survival and graft function.

Methods

We performed a retrospective analysis of 402 kidney transplants carried out in our center between 2000 and 2017. We selected all transplants in which both organs from the same donor were implanted at our hospital, establishing 2 study groups (group 1: first graft implanted; group 2: second graft implanted) to compare using paired-data statistical methods.
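Paired-data comparisons of the kind described above can be run with standard tools. The sketch below is a hypothetical illustration, using a Wilcoxon signed-rank test for a continuous outcome and a McNemar test for a binary outcome; the arrays are invented example values, not data from the study:

import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired data (one row per donor): day-5 serum creatinine (mg/dL)
# for the first- and second-implanted kidney, plus DGF yes/no flags.
cr_first   = np.array([2.1, 3.5, 4.0, 1.8, 5.2, 2.9, 3.1, 4.4])
cr_second  = np.array([2.8, 4.1, 4.9, 2.0, 6.0, 3.3, 3.0, 5.5])
dgf_first  = np.array([0, 1, 0, 0, 1, 0, 0, 1])
dgf_second = np.array([0, 1, 1, 0, 1, 1, 0, 1])

# Paired comparison of a continuous outcome (creatinine): Wilcoxon signed-rank test
stat, p = wilcoxon(cr_first, cr_second)
print(f"Wilcoxon signed-rank: statistic = {stat:.1f}, P = {p:.3f}")

# Paired comparison of a binary outcome (DGF): McNemar test on concordant/discordant pairs
table = np.array([
    [np.sum((dgf_first == 1) & (dgf_second == 1)), np.sum((dgf_first == 1) & (dgf_second == 0))],
    [np.sum((dgf_first == 0) & (dgf_second == 1)), np.sum((dgf_first == 0) & (dgf_second == 0))],
])
print(mcnemar(table, exact=True))  # prints the test statistic and P value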

Results

We found an increase in the incidence of DGF in group 2 (42% vs 28.8%; P < .05). Group 2 had significantly worse graft function on day 5 posttransplant (4.7 ± 2.88 vs 3.86 ± 2.8 mg/dL of serum creatinine; P < .05). No significant differences in graft function were found on days 30 and 90 posttransplant. We did not find any difference in graft survival between the groups. Length of hospital stay (17.6 days [±13] vs 21.6 days [±17]) and number of hemodialysis sessions (mean 2.8 [±2] vs 3.6 [±2.2]) were higher in group 2.

Conclusion

CIT acts as an independent risk factor for the development of DGF in kidney transplantation. CIT had no isolated effect on graft survival.

18.

Background

Incidence of malignancy in transplant recipients is higher than in the general population. Malignancy is a major cause of mortality following solid organ transplantation and a major barrier to long-term survival of the kidney graft. The aim of this study was to estimate the incidence of solid organ malignancy (SOM) and melanoma in renal transplant recipients (RTR) transplanted at 2 representative transplant centers in Poland, based on data from the Polish Tumor Registry.

Material and Methods

We analyzed the medical data of 3069 patients who underwent kidney transplantation (KTx) between 1995 and 2015.

Results

In our study, 112 SOM (3.6%) were diagnosed. The majority of affected patients were male (n = 71; 63.4%; P < .01). The mean age at KTx was 48.0 ± 13.1 years and the mean age at the time of cancer diagnosis was 55.9 ± 12.7 years. The average time to malignancy occurrence was 5.9 ± 5.0 years after KTx. SOM was the cause of death in 60 patients (53%). The most common were malignancies of the gastrointestinal tract (25%), urinary tract tumors (23.2%), lung cancer (n = 18; 16%), and lymphoma (13.4%). The percentage of chronic glomerular nephropathy was higher in the SOM group (n = 56; 50%) than that of renal insufficiency of other etiologies.

Conclusions

RTR in Poland are at significant risk of developing malignancy in a variety of organs, primarily urinary tract tumors and lymphoma. Cancers most frequently occurring in the general population, such as lung and colorectal cancer, are also common in our RTR. On this basis, an appropriate tumor screening schedule can be developed in individual countries.

19.
The aim of this study was to determine distinctive risk factors for graft survival of living-related and deceased donor kidney transplantation (KTx).

Methods

A total of 536 consecutive living-related and 524 deceased donor kidney transplant recipients from February 2014 to December 2015 at a single center were enrolled for retrospective analysis. Graft survival was assessed with the Kaplan-Meier method, and the Cox proportional hazards model was used to determine independent risk factors for allograft survival.

Results

One-, 3-, and 5-year graft survival rates were 98.8%, 98.5%, and 97.2%, respectively, in living-related donor KTx and were 94.9%, 91.3%, and 91.3%, respectively, in deceased donor KTx (log-rank, P < .001). Multivariate analysis demonstrated that risk factors for graft survival in living-related donor KTx were pretransplant dialysis duration (hazard ratio [HR], 1.023 per month; P = .046), delayed graft function (HR, 5.785; P = .02), and acute rejection (HR, 2.706; P = .04); risk factors in deceased donor KTx were recipient age (HR, 1.066 per year; P = .004), recipient history of diabetes mellitus (HR, 3.011; P = .03), pretransplant positive panel reactive antibody (HR, 3.353; P = .02), and donor history of hypertension (HR, 2.660; P = .046).

Conclusion

Distinctive risk factors for graft survival of living-related and deceased donor KTx were found.
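For readers who want to reproduce this style of Kaplan-Meier and Cox analysis on their own cohort data, a minimal sketch using the Python lifelines package is shown below; the column names and the small dataset are invented for illustration and are not data from the study:

import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical cohort: follow-up time in months, graft loss indicator,
# and a few candidate risk factors (names invented for illustration).
df = pd.DataFrame({
    "months":          [60, 48, 60, 12, 36, 60, 24, 60],
    "graft_loss":      [0,  1,  0,  1,  0,  0,  1,  0],
    "dialysis_months": [10, 40, 6,  55, 18, 8,  30, 12],
    "recipient_age":   [35, 61, 42, 58, 49, 29, 66, 51],
})

# Kaplan-Meier estimate of graft survival
kmf = KaplanMeierFitter()
kmf.fit(durations=df["months"], event_observed=df["graft_loss"])
print(kmf.survival_function_)

# Cox proportional hazards model for independent risk factors;
# exponentiated coefficients correspond to the hazard ratios reported above.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="graft_loss")
cph.print_summary()  # includes exp(coef), i.e. hazard ratios, and P values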

20.

Objective

B cell activating factor (BAFF) has been shown to play a role in B cell survival, maturation, and activation, and has been linked with renal transplant outcome. BAFF signaling has been associated with plasmablast survival, anti-HLA immunization, and loss of graft function. We aimed to analyze the interplay between BAFF, memory B cells, and plasmablasts in relation to allograft function in long-term kidney transplant (KTx) recipients and their anti-HLA sensitization.

Materials and Methods

This study included 70 long-term KTx recipients on standard immunosuppression 15 ± 6 years post transplantation (44 stable, 26 with chronic allograft dysfunction [CAD]) and 25 healthy volunteers. CD19+ B cells, memory B cells (CD19+CD27+), and plasmablasts (CD19+CD24−CD27++CD38++) were enumerated with flow cytometry. BAFF serum level and anti-HLA antibodies were assessed by Luminex bead arrays.

Results

We found no difference in BAFF levels between KTx recipients and controls (median, interquartile range: 1.67, 1.40–1.97 vs 1.78, 1.63–1.93 ng/mL, P = .478) and no correlation between BAFF level and cell counts. Recipients presented a lower plasmablast count than controls (22.5, 8–57 vs 79, 48–166 cells/mL, P < .001). There was a positive correlation between estimated glomerular filtration rate and plasmablast count (rs = 0.30, P = .013) in recipients. Cell populations and BAFF were not related to the presence of anti-HLA antibodies. None of the parameters investigated was related to deterioration of allograft function during the 2-year follow-up.

Conclusion

BAFF serum level is not related to anti-HLA sensitization, circulating memory B cells, plasmablast count, or allograft function. Circulating plasmablasts are associated with current allograft function but are not prognostic for future course.
