Similar Literature (20 results)
1.

Introduction

Intestinal transplantation has become an accepted therapy for individuals permanently dependent on total parenteral nutrition (TPN) with life-threatening complications. Quality of life and psychological well-being can be seen as important outcome measures of transplantation surgery.

Methods

We evaluated 24 adult intestinal transplant recipients and 24 healthy control subjects. All subjects were administered the Italian version of the Psychological Well-Being Scales (PWB) by C. Ryff, the World Health Organization Quality of Life-Brief (WHOQOL), and the Symptom Questionnaire (SQ), a symptomatology scale by R. Kellner and G.A. Fava. Quality of life and psychological well-being were assessed in transplant recipients in relation to the number of rejections, the number of admissions, and the immunosuppressive protocol.

Results

Intestinal transplant recipients reported significantly higher scores in the “personal growth” category (P = .036) and lower scores in the “positive relations with others” (P = .013) and “autonomy” (P = .007) dimensions of the PWB compared with the controls. In the WHOQOL, the scores of transplant recipients were lower only in the psychological domain (P = .011). Transplant recipients reported significantly higher scores in the “somatic symptom” (P = .027) and “hostility” (P = .018) dimensions of the SQ compared with the controls. Transplant recipients with more than 8 admissions reported higher scores on the “anxiety” (P = .019) and “depression” (P = .021) scales of the SQ, and patients on a daclizumab protocol reported higher scores on the “depression” (P < .001) and “somatic symptom” (P = .008) scales. There were no significant differences with regard to number of rejections or sociodemographic variables.

Conclusion

Improvement of psychological well-being in the transplant population may be related to achievement of the goal of transplantation: recovery of bowel function. However, the data confirm that the transplant experience requires a long and difficult process of adaptation to the new condition of “transplant recipient.”

2.

Background

Extended-release tacrolimus (TAC-ER) was developed to improve treatment compliance through more convenient once-daily dosing and to improve safety by avoiding toxic peak levels. We prospectively evaluated the safety and effectiveness of a 1:1 dose switch from twice-daily tacrolimus to once-daily TAC-ER in stable kidney transplant recipients and assessed their satisfaction with the regimen.

Patients and methods

Tacrolimus was switched to TAC-ER (1:1 dose) in 12 kidney transplant recipients with stable renal function from March 2010 to August 2011. The posttransplantation follow-up period was 7.6 ± 4.3 years (range, 1.5-13.2 years). No patient in this group had diabetes mellitus. We evaluated tacrolimus trough levels, serum creatinine, potassium, glucose, glycohemoglobin (HbA1c), and urine protein concentrations once a month from 6 months before until 1 year after the switch. A satisfaction survey on TAC-ER treatment was performed 3 months after the switch. The questionnaire included compliance-related items such as “forget to take less often,” “easy to carry,” “easy to store,” and “general satisfaction.”

Results

After the switch to TAC-ER, we observed a rapid and sustained 25% decrease in tacrolimus trough levels, from 4.8 ± 1.0 to 3.6 ± 0.8 ng/mL (P = .0002). No significant differences in serum creatinine, potassium, glucose, HbA1c, or urine protein concentration were observed during the 14.6 ± 2.6 months of follow-up. No recipient experienced acute rejection. The satisfaction survey demonstrated that the stable kidney transplant recipients were satisfied with the switch.

Conclusions

A switch from twice-daily tacrolimus to once-daily TAC-ER (1:1 dose) was safe and effective. TAC-ER can improve treatment compliance in stable kidney transplant recipients.

3.

Background

The Luminex Single-Antigen Beads (LSA) assay allows accurate detection and characterization of preexisting donor-specific antibodies (DSA) in kidney transplant candidates. However, the ability of LSA to detect very low antibody levels makes it difficult to predict crossmatch results correctly during donor selection. In this study we retrospectively analyzed the accuracy of our virtual crossmatch (v-XM) protocol, used for the selection of potential kidney transplant recipients, in predicting the results of the actual crossmatch (a-XM) in cadaver-donor renal transplantation. We also investigated the correlation between negative a-XM results and the strength and specificity of preformed DSA.

Methods

The correlation between negative v-XMs and a-XMs performed from 2007 to 2012 at the Regional Transplant Center of the Lazio Region, Italy, was analyzed. In carrying out the v-XM, donor HLA molecules against which patients showed LSA-detected DSA with normalized mean fluorescence intensity (MFI) ≥5,000 were considered “unacceptable DSA,” and LSA-DSA with MFI <5,000 were defined as “acceptable DSA.” All cadaver donors had been typed for HLA-A, -B, -DR, and -DQB molecules by sequence-specific primer methods. On the basis of a negative v-XM, we performed 507 a-XMs between serum samples from 256 renal transplant candidates and T/B lymphocytes from 302 cadaver donors, using both complement-dependent cytotoxicity (CDC) and flow cytometry (FC) methods.
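The decision rule described above reduces to a threshold test on DSA strength per donor antigen. The Python sketch below illustrates how such a virtual crossmatch might be implemented; the function names and data layout are hypothetical, and only the 5,000-MFI "unacceptable DSA" cutoff comes from the protocol as summarized here.

```python
# Minimal sketch of the v-XM decision logic described above. Data structures
# are hypothetical; only the MFI >= 5,000 cutoff is taken from the abstract.

MFI_CUTOFF = 5000  # normalized MFI threshold for "unacceptable" DSA

def virtual_crossmatch(recipient_dsa, donor_hla):
    """Return (negative_vxm, unacceptable, acceptable) for one donor.

    recipient_dsa: dict mapping HLA antigen (e.g. "A2", "DQB1*03:01")
                   to normalized MFI from the Luminex single-antigen assay.
    donor_hla:     set of the donor's typed HLA antigens.
    """
    unacceptable, acceptable = {}, {}
    for antigen in donor_hla:
        mfi = recipient_dsa.get(antigen)
        if mfi is None:
            continue  # no antibody against this donor antigen
        if mfi >= MFI_CUTOFF:
            unacceptable[antigen] = mfi  # predicted positive crossmatch
        else:
            acceptable[antigen] = mfi    # low-level DSA, still transplantable
    # v-XM is negative (donor offered) only if no unacceptable DSA is found
    return (len(unacceptable) == 0, unacceptable, acceptable)

# Example: two low-level DSA against donor antigens -> negative v-XM
negative, bad, ok = virtual_crossmatch(
    {"A2": 1200, "DQB1*03:01": 3400}, {"A2", "B7", "DR4", "DQB1*03:01"})
print(negative, bad, ok)  # True {} {'A2': 1200, 'DQB1*03:01': 3400}
```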

Results

The v-XM negative results showed good correlation with both CDC and FC a-XMs (97% and 90%, respectively). The sensitivity of v-XM was 100%; this high value was related to the lack of false-negative DSA results. The limited specificity with both techniques (CDC-XM, 74%; FC-XM, 79%) was due to the presence of “acceptable” and/or anti-DQA/DPB DSA in some patient sera used to perform the a-XMs. During the study period, 171 (67%) of the 256 sensitized patients received a kidney transplant: 30% of these had “acceptable DSA” and/or anti-DQA/DPB DSA. No antibody-mediated rejection due to preformed HLA-DSA was observed.

Conclusions

Our v-XM protocol showed high sensitivity in predicting donor-recipient immunologic compatibility. The results of this study also demonstrated the importance of evaluating DSA strength for implementing v-XM results in the selection of kidney transplant recipients. Moreover, the finding of anti-DQA/DPB DSA, especially in serum samples that gave positive results with the use of both CDC and FC a-XMs, highlights the importance of defining all of the donor HLA molecules to perform an accurate v-XM.

4.

Background

New-onset diabetes mellitus after transplantation (NODAT) contributes to the risk of cardiovascular disease (CVD) and infection, reducing graft and patient survival in kidney transplant recipients. To reduce CVD and improve outcomes of kidney transplant recipients, it is of great interest to more precisely elucidate the risk factors that contribute to the development of NODAT. A previous study reported that hypomagnesemia is an independent predictor of NODAT. Elevated gamma-glutamyltransferase (GGT) activity increases the risk of incident type 2 diabetes in the general population. The objective of this study was to determine whether magnesium (Mg) and GGT were risk factors for NODAT among our population of kidney transplant recipients.

Methods

We retrospectively analyzed 205 kidney transplant recipients without preexisting diabetes. GGT was measured before transplantation and at months 1, 2, and 12. Mg was measured at months 1, 2, and 12. NODAT was defined at month 12 and at the end of follow-up according to the 2003 international consensus guidelines.

Results

Thirty-six patients (17.5%) had developed NODAT at month 12, and 55 patients (26.8%) by the end of follow-up. We did not observe any significant difference in mean Mg (month 1, 1.73 ± 0.24 vs 1.75 ± 0.30 mg/dL [P = .824]; month 2, 1.71 ± 0.22 vs 1.68 ± 0.26 [P = .565]; month 12, 1.77 ± 0.27 vs 1.80 ± 0.24 [P = .596]) or GGT values (pretransplantation, 32 ± 27 vs 33 ± 85 U/L [P = .866]; month 1, 39 ± 24 vs 48 ± 70 [P = .452]; month 2, 53 ± 96 vs 48 ± 83 [P = .739]; month 12, 40 ± 37 vs 38 ± 53 [P = .830]) between NODAT and non-NODAT patients at month 12 or at the end of follow-up.

Conclusion

Hypomagnesemia and high GGT activity were not risk factors for NODAT development in kidney transplant recipients.

5.

Background

As the disparity between the numbers of available organ donors and patients awaiting transplantation increases, different strategies have been proposed to extend the donor pool. Patients with acute kidney injury (AKI) developing during an intensive care unit (ICU) stay are often considered as donors, but the long-term outcomes of such high-risk kidney transplantations are unknown. We analyzed the renal function and outcomes over 5 years of kidney grafts recovered from deceased donors diagnosed with AKI.

Materials and Methods

We collected data from 61 deceased kidney donors, identified in 1 ICU, and 120 kidney graft recipients who underwent transplantation between January 1999 and December 2006. Donors were stratified according to the RIFLE classification, based on changes in creatinine and urine output between ICU admission and organ procurement. Recipient kidney graft function (estimated glomerular filtration rate [eGFR]), calculated according to the MDRD (Modification of Diet in Renal Disease) equation, was assessed every 6 months.
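The abstract cites the MDRD equation for eGFR without stating which variant was used; below is an illustrative Python sketch of the widely used 4-variable, IDMS-traceable form (coefficient 175). The choice of this variant is an assumption, not a detail from the study.

```python
# Illustrative implementation of the 4-variable MDRD eGFR equation
# (IDMS-traceable form, coefficient 175). The abstract does not specify
# which MDRD variant the authors used.

def mdrd_egfr(scr_mg_dl, age_years, female, black):
    """Estimated GFR in mL/min/1.73 m^2."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: 55-year-old non-Black woman with serum creatinine 1.4 mg/dL
print(round(mdrd_egfr(1.4, 55, female=True, black=False), 1))  # ~39.0
```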

Results

Among the 61 donors, 10 (16.4%) developed AKI: 7 classified as “risk,” 2 as “injury,” and 1 as “failure.” The mean follow-up of kidney graft recipients was 49 ± 18 months. The long-term risk for graft loss was significantly higher for kidneys recovered from donors with AKI (27.8% vs 7.1%; P = .02; log-rank P = .07). Their excretory function was worse over the whole follow-up period.

Conclusion

Recipients of kidney grafts from donors with AKI showed a higher risk for graft loss and worse excretory function on long-term follow-up.

6.

Background

Infections remain a major cause of morbidity and mortality in solid organ transplant recipients. A risk of herpes simplex virus (HSV) reactivation of up to 50% in the first months posttransplantation was well documented in transplant recipients during the pre-cytomegalovirus-prophylaxis era. Previous reports suggest that these patients are likely to experience a more aggressive disease course and a higher rate of acyclovir-resistant HSV. No data currently exist regarding the course of HSV infection in pancreas or pancreas-kidney transplant (PKT) recipients. The goal of this study was to evaluate the incidence and severity of HSV infections in pancreas transplant and PKT recipients.

Study Design

We analyzed the transplant patient database of the Royal Victoria Hospital to identify 137 pancreas or pancreas-kidney transplantations performed between January 1999 and October 2010. A retrospective chart review was subsequently performed to evaluate the incidence and severity of herpetic infections after transplantation.

Results

The incidence of HSV infection in our patients was approximately 10% (10/98 cases). The majority of infections (80%) occurred within the first 2 years after transplantation. Most patients (90%) experienced a uniformly mild disease course and responded well to treatment. One patient died of an unrelated cause. Six patients were treated in hospital, with a mean stay of 12.3 ± 6.35 days. The initial immunosuppressive regimen remained unchanged in half of the affected patients. None of our patients developed drug-resistant HSV.

Conclusion

These findings are intriguing and warrant a larger, multicenter, prospective study. Most important, they suggest that the incidence of HSV reactivation is now much lower in the “cytomegalovirus prophylaxis era” and that, with timely diagnosis and proper treatment, most patients recover well from their HSV infections and respond to current treatment regimens.

7.

Objective

This study examined the current state of information on renal replacement therapy and the educational demands of kidney transplant recipients.

Methods

The study was conducted as a survey, using a questionnaire developed by the researchers and completed by 72 kidney transplant recipients.

Results

The recipients were most frequently informed about hemodialysis (87.5%), followed by kidney transplantation (69.4%) and peritoneal dialysis (48.6%), as a modality of renal replacement therapy at the time of diagnosis of chronic renal failure. Information about kidney transplantation was provided when they were diagnosed with end-stage renal disease (ESRD; 33.3%), right after initiation of dialysis (15.3%), or a few years thereafter (9.7%). They were informed about kidney transplantation mostly by transplantation surgeons (mean score = 3.1 ± 1.3; range, 1-4), followed in order by transplant coordinators, nephrologists, family members, other patients, artificial kidney unit nurses, and the mass media or internet. Regarding the influence of this information on their decision to receive a transplant, the mean score was 3.2 ± 1.2 (range, 1-5). Kidney transplantation was also evaluated as the best renal replacement therapy with regard to work, pregnancy/delivery, traveling, and diet.

Conclusion

Patients diagnosed with ESRD are not fully informed about transplantation as the optimal primary renal replacement therapy for their quality of life.

8.

Background

Viral infections are the most common opportunistic infections after kidney transplantation. Among the hepatotropic viruses that induce kidney graft failure and rejection, hepatitis B virus (HBV) plays a critical role. Extrahepatic HBV-related disorders increase morbidity and mortality in kidney transplant recipients.

Objective

To analyze the molecular prevalence of HBV infection in kidney transplant recipients and donors before and after transplantation.

Patients and Methods

This cross-sectional study included 273 serum samples collected between 2005 and 2008 from 96 kidney transplant recipients and 59 donors. HBV DNA was detected by amplification of the S gene fragment of the HBV genome using a qualitative simple polymerase chain reaction assay. Statistical relationships between HBV infection and laboratory and clinical-demographic data were also analyzed for all kidney transplant donors and recipients.

Results

The HBV genome was detected in 102 of 273 serum samples. Molecular HBV infection was demonstrated in 2 of 13 serum samples (15.4%) from recipients tested before transplantation. HBV DNA was detected in 42 of 96 patients (43.7%) after kidney transplantation. The HBV genome was demonstrated in 21 of 59 donors (35.6%). Significant relationships were observed between HBV infection and hematologic and biochemical indices after kidney transplantation.

Conclusion

The high molecular prevalence of HBV infection detected in kidney transplant recipients reinforces the importance of HBV infection for clinical outcome.

9.

Background

Functional iron deficiency is characterized by the presence of adequate iron stores as defined by conventional criteria, but insufficient iron mobilization to adequately support erythropoiesis. The aim of this study was to assess the prevalence of functional iron deficiency in heart and kidney transplant recipients based on data from recent medical records.

Methods

Using standard laboratory methods, we assessed iron status from routine checkups by measuring serum iron, total iron-binding capacity (TIBC), ferritin, and transferrin saturation (TSAT), as well as complete blood count and creatinine.
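TSAT is derived directly from serum iron and TIBC; the sketch below shows the calculation together with a simple screening rule for functional iron deficiency. The 20% TSAT and 100 ng/mL ferritin cutoffs are common working definitions assumed for illustration, not thresholds reported in this study.

```python
# Sketch: transferrin saturation (TSAT) and a screening rule for functional
# iron deficiency (preserved stores but low saturation). Cutoffs are common
# working assumptions, not values from this study.

def tsat_percent(serum_iron_ug_dl, tibc_ug_dl):
    """TSAT (%) = serum iron / total iron-binding capacity * 100."""
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

def functional_iron_deficiency(serum_iron, tibc, ferritin_ng_ml,
                               tsat_cutoff=20.0, ferritin_floor=100.0):
    """Adequate stores (ferritin) but insufficient mobilization (low TSAT)."""
    return (ferritin_ng_ml >= ferritin_floor
            and tsat_percent(serum_iron, tibc) < tsat_cutoff)

# Example: serum iron 40 ug/dL, TIBC 300 ug/dL (TSAT ~13%), ferritin 250 ng/mL
print(functional_iron_deficiency(40, 300, 250))  # True
```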

Results

Iron parameters were available for 62% of heart transplant recipients but only 26% of kidney transplant recipients. Absolute iron deficiency was observed in 35% of the heart and 8% of the kidney transplant recipients (P < .001). Functional iron deficiency was present in 4% of the heart and 6% of the kidney transplant recipients and was associated with significantly higher serum ferritin and lower TSAT. In addition, although their hemoglobin values did not differ significantly, heart transplant recipients with absolute iron deficiency showed lower erythrocyte counts, were younger, and had a shorter time after transplantation.

Conclusions

Iron parameters are assessed infrequently, particularly among kidney transplant recipients. Iron deficiency was present in a considerable proportion of heart transplant recipients. This population should be carefully screened for reversible causes of iron deficiency to slow or minimize the development of anemia.

10.

Introduction

New-onset diabetes mellitus after kidney transplantation and type 2 diabetes mellitus (T2DM) share common risk factors and antecedents in impaired insulin secretion and action. Several genetic polymorphisms have been shown to be associated with T2DM. We hypothesized that transplant recipients who carry risk alleles for T2DM are “tipped over” to develop diabetes mellitus in the posttransplant milieu.

Methods

We investigated the association between genetic and traditional risk factors present before transplantation and the development of new-onset diabetes mellitus after kidney transplantation (NODAT). Markers in 8 known T2DM-linked genes were genotyped in the study cohort using either the iPLEX assay or allelic discrimination (AD)-PCR and tested for association with NODAT. We used univariate and multivariate logistic regression models to assess the association of pretransplant nongenetic and genetic variables with the development of NODAT.
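As an outline of the multivariate logistic regression analysis described here, the Python sketch below fits NODAT status against pretransplant variables using statsmodels. The simulated data, variable names, and coefficients are hypothetical stand-ins, not the study's data.

```python
# Sketch of a multivariate logistic regression of NODAT on pretransplant
# variables, using statsmodels. All data below are simulated placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
age = rng.normal(50, 10, n)            # years
family = rng.binomial(1, 0.3, n)       # family history of T2DM (0/1)
bmi = rng.normal(27, 4, n)             # kg/m^2
alleles = rng.integers(0, 5, n)        # count of T2DM risk alleles carried

# Hypothetical true model used only to simulate the NODAT outcome
logit = -8 + 0.08 * age + 1.0 * family + 0.1 * bmi + 0.3 * alleles
nodat = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, family, bmi, alleles]))
model = sm.Logit(nodat, X).fit(disp=False)  # multivariate model
print(np.exp(model.params))  # odds ratios per covariate
```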

Results

The study cohort included 91 kidney transplant recipients with at least 1 year of posttransplant follow-up, 22 of whom developed NODAT. Increased age, a family history of T2DM, pretransplant obesity, and hypertriglyceridemia were associated with NODAT development. In addition, we observed positive, although statistically nonsignificant, trends for association between T2DM-associated genes and NODAT.

Conclusions

These findings demonstrate an increased NODAT risk among patients with a positive family history of T2DM, which, together with the observed positive predictive trends of known T2DM-associated genetic polymorphisms, suggests a genetic predisposition to NODAT.

11.

Objective

Hepatitis B virus core antibody (HBcAb)-positive organ donors have the potential to transmit infection to transplant recipients.

Patients and Methods

We investigated the use of a single dose of 2000 IU of hepatitis B immunoglobulin in 18 patients among a population of 54 kidney transplant recipients from HBcAb-positive deceased donors.

Results

Twelve recipients were HBcAb-positive before transplantation. Among the other 42 patients, 5 (11.9%) seroconverted from HBcAb-negative to HBcAb-positive, whereas 1 HBcAb-positive recipient became hepatitis B virus surface antigen-positive, with clinical signs of active hepatitis, 6 years after transplantation. In the 18 patients who received prophylaxis, we did not observe any seroconversion or hepatitis B virus (HBV) transmission. Graft and patient survival of HBcAb-positive kidney transplants did not differ significantly from those of a matched population of HBcAb-negative transplantations.

Conclusion

These results suggest that kidney transplantation from HBcAb-positive donors is safe, with a low rate of HBV transmission. Prophylaxis with a single dose of hepatitis B immunoglobulin may be effective in reducing the risk of HBV seroconversion or reactivation and may be considered in all naïve or HBcAb-positive transplant recipients.

12.

Background

Renalase is an enzyme that catabolizes catecholamines such as adrenaline and noradrenaline in the circulation. The human kidney releases this protein into the bloodstream to regulate blood pressure. In kidney transplant recipients, the prevalence of hypertension is 60%-80%.

Objective

The aim of our study was to assess possible correlations between renalase, blood pressure, and kidney function among 89 prevalent kidney allograft recipients. To obtain normal ranges, we also studied renalase levels in 27 healthy volunteers.

Methods

Complete blood counts, urea, serum lipids, fasting glucose, and creatinine were measured by standard laboratory methods in the hospital central laboratory. Renalase was assessed with the use of a commercially available kit.

Results

In kidney transplant recipients, renalase was significantly higher than in healthy volunteers (P < .001). In kidney transplant recipients, renalase correlated with age (r = 0.29; P < .05), time after transplantation (r = 0.34; P < .01), systolic blood pressure (r = 0.28; P < .05), diastolic blood pressure (r = 0.27; P < .05), serum creatinine (r = 0.49; P < .001), estimated glomerular filtration rate (Chronic Kidney Disease Epidemiology Collaboration: r = −0.44; P < .0001; Modification of Diet in Renal Disease: r = −0.43; P < .001; Cockcroft-Gault: r = −0.39; P < .01), and serum phosphate (r = 0.34; P < .05). In multiple regression analysis, 70% of the variance in renalase was explained by age (β = 0.21; P = .043), time after transplantation (β = 0.22; P = .037), serum creatinine (β = 0.50; P = .016), and diastolic blood pressure (β = 0.33; P = .027).

Conclusions

Renalase is markedly elevated in kidney transplant recipients, depending predominantly on kidney function, which deteriorates with time after transplantation and with age. Further studies are needed to establish its putative role in the pathogenesis of posttransplant hypertension and possible novel targeted therapies.

13.
Double kidney transplantation is an accepted strategy to increase the donor pool. For older donor kidneys, protocols for deciding between dual and single transplantation are based mainly on preimplantation biopsies. The aim of our study was to evaluate the long-term graft and patient survival of our Dual Kidney Transplant program; patients who lost one of their grafts peritransplantation were used as controls. A total of 203 patients underwent kidney transplantation from December 1996 to January 2008 in our “old for old” renal transplantation program. We excluded 21 patients because of a nonfunctioning kidney, hyperacute rejection, or patient death with a functioning graft within the first month. Seventy-nine of the 182 kidney transplantations in the “old for old” program were dual kidney transplantations (DKT). Fifteen of the 79 patients lost one of their kidney grafts (the uninephrectomized [UNX] group). At 1 year, renal function was lower and proteinuria greater in the UNX group than in the DKT group. Patient survival was similar in both groups. However, death-censored graft survival was lower in UNX than in DKT patients: the 5-year graft survival rate was 70% in the UNX versus 93% in the DKT cohort (P = .04). In conclusion, given the kidney shortage, our results raise the question of whether the excellent transplant outcomes of DKT counterbalance the reduction of the donor pool, which could otherwise provide acceptable outcomes for more patients through single kidney transplantation.

14.

Introduction

Despite an improved quality of life after transplant, kidney transplant recipients in the United States are employed at lower rates than the general population. Employment after kidney transplantation is an important marker of clinically significant individual health recovery. Furthermore, employment status in the posttransplant period has been shown to have a strong and independent association with patient and graft survival.

Materials and Methods

Using the United Network for Organ Sharing (UNOS) database, we identified all adults (18 to 64 years of age) who underwent kidney transplantation between 2004 and 2011. Patients with stable renal allograft function and full 1-, 3-, and 5-year follow-up were included. For recipients of multiple transplants, the most recent transplant was considered the target transplant. The data collected included the employment rate after kidney transplantation among recipients employed and unemployed before transplant, stratified by insurance payer (private, Medicaid, and Medicare). Categorical variables are reported as percentages, and comparisons between groups were performed using the χ2 test with Yates continuity correction or the Fisher exact test when appropriate.
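The group comparisons described here are standard 2×2 contingency-table tests; a minimal Python sketch with scipy follows. The example table is fabricated for illustration and does not reproduce the study's counts.

```python
# Sketch of the categorical comparisons described above: chi-square test
# with Yates continuity correction, falling back to the Fisher exact test
# for small expected counts. The 2x2 table below is illustrative only.
from scipy.stats import chi2_contingency, fisher_exact

# rows: private vs public insurance; columns: employed vs unemployed
table = [[232, 768],
         [100, 900]]

chi2, p, dof, expected = chi2_contingency(table, correction=True)  # Yates
if expected.min() < 5:                       # common rule of thumb
    odds_ratio, p = fisher_exact(table)      # exact test for small counts
print(f"P = {p:.4g}")
```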

Results

The UNOS database available for this study included a total of 100,521 patients. The employment rate at the time of transplant was 23.1% (n = 23,225) under private insurance and 10% (n = 10,032) under public insurance (Medicaid and Medicare; P < .01 vs private insurance). Among 29,809 recipients who were alive with stable renal allograft function and working at the time of transplantation, the employment rate was 47% (n = 14,010), 44% (n = 13,115), and 43% (n = 12,817) at 1, 3, and 5 years after transplant under private insurance, and 16% (n = 4769), 14% (n = 4173), and 12% (n = 3567), respectively, under public insurance (P < .01 vs private insurance). Among 46,363 recipients alive with stable renal function who were not working at the time of transplant, the employment rate was 5.3% (n = 2457), 5.6% (n = 2596), and 6.2% (n = 2874) at 1, 3, and 5 years after transplant under private insurance, and 6.5% (n = 3013), 7.8% (n = 3616), and 7.5% (n = 3477), respectively, under public insurance (P < .01 vs private insurance).

Conclusion

Employment rates at the time of transplant in the United States are generally low, although privately insured patients are significantly more likely to be employed than patients with public insurance. Only a portion of these patients return to work after transplantation, and for patients unemployed at the time of transplantation, the chance of finding a job afterward is quite low even among the privately insured. A concerted effort should be made by the transplant community to improve the ability of successful kidney transplant recipients to return to work or find new employment, particularly because employment status in the posttransplant period has a strong and independent association with graft and recipient survival.

15.

Background

The number of obese kidney transplant candidates has been growing. However, there are conflicting results regarding the effect of obesity on kidney transplantation outcomes. The aim of this study was to investigate the association between body mass index (BMI) and graft survival, using continuous versus categoric BMI values as an independent risk factor in renal transplantation.

Methods

We retrospectively reviewed 376 kidney transplant recipients to compare graft and patient survival among patients who were of normal weight, overweight, or obese at the time of transplantation, considering BMI as a categoric variable.
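A sketch of how BMI might be computed and grouped into the categories used in this comparison follows. The WHO cutoffs shown (18.5, 25, and 30 kg/m²) are standard assumptions; the abstract does not state the exact boundaries the authors used.

```python
# Sketch of the BMI categorization used for the group comparison. The WHO
# cutoffs (18.5, 25, 30 kg/m^2) are assumed; the abstract does not list the
# authors' exact boundaries.

def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def bmi_category(value):
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal weight"
    if value < 30.0:
        return "overweight"
    return "obese"

print(bmi_category(bmi(95, 1.75)))  # 95 kg at 1.75 m -> BMI ~31 -> "obese"
```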

Results

Obese patients were more likely to be male and were older than normal-weight recipients (P = .021 and P = .002, respectively). Graft loss was significantly higher among obese than nonobese recipients. Obese patients displayed significantly lower survival than nonobese subjects at 1 year (76.9% vs 35.3%; P = .024) and 3 years (46.2% vs 11.8%; P = .035).

Conclusions

Obesity may represent an independent risk factor for graft loss and patient death. Careful patient selection with pretransplantation weight reduction is mandatory to reduce the rate of early posttransplantation complications and to improve long-term outcomes.

16.

Background

Long-term function of the transplanted kidney is the main factor determining quality of life for transplant recipients. The aim of this study was to evaluate the effect of selected factors on the duration of graft function after renal transplantation over 15 years of observation.

Methods

Preoperative and intraoperative factors were analyzed in 232 kidney recipients over a 15-year observation period. The analysis included age, sex, cause of the recipient's renal failure, duration of hemodialysis before transplantation, peak panel-reactive antibody level, human leukocyte antigen compatibility, cold ischemia time, occurrence of delayed graft function, number and duration of hemodialysis sessions after transplantation, early graft rejection, and creatinine levels at days 1, 3, 7, 30, 90, and 180 after transplantation, and the influence of these factors on the duration of graft function. Statistical analysis was performed with univariate (Kaplan-Meier) and multivariate (Cox proportional hazards regression) methods, with P < .05 considered significant.
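The survival analysis described can be reproduced in outline with the lifelines library in Python; the sketch below fits a Kaplan-Meier curve and a Cox model. The tiny data frame and its column names are hypothetical placeholders for the study variables, used only to make the example runnable.

```python
# Sketch of the survival analysis described above (Kaplan-Meier estimate and
# Cox proportional hazards model) using lifelines. The data frame is a tiny
# hypothetical placeholder, not the study's data.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months_functioning": [12, 60, 96, 30, 150, 84, 45, 120],
    "graft_lost":         [1, 0, 1, 1, 0, 0, 1, 0],   # event indicator
    "creatinine_d90":     [2.4, 1.1, 1.2, 2.8, 1.0, 2.0, 1.9, 1.3],
    "early_rejection":    [0, 1, 1, 0, 1, 0, 0, 1],
})

kmf = KaplanMeierFitter()
kmf.fit(df["months_functioning"], event_observed=df["graft_lost"])

cph = CoxPHFitter()  # remaining columns enter as covariates
cph.fit(df, duration_col="months_functioning", event_col="graft_lost")
print(cph.hazard_ratios_)  # HR per covariate, e.g. creatinine at day 90
```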

Results

Univariate analysis showed significantly shorter renal graft function among recipients with higher creatinine levels at all analyzed time points and among patients experiencing delayed graft function. The number and duration of hemodialysis sessions after transplantation had a significant negative impact on late transplant results. In multivariate analysis, early graft rejection in the postoperative period was an independent factor improving late graft function (P = .002; hazard ratio [HR], 0.49; 95% confidence interval [CI], 0.31-0.78), and a higher creatinine level at day 90 after kidney transplantation was a predictive factor for late graft dysfunction (P = .002; HR, 1.68; 95% CI, 1.2-2.35).

Conclusions

Creatinine level at day 90 after renal transplantation is a prognostic factor for long-term kidney graft function. Early transplant rejection leads to the introduction of a more aggressive immunosuppression protocol, which improves long-term transplant results.

17.

Background

Acute rejection (AR) is a common medical problem among kidney transplant recipients and may significantly affect patient and allograft survival. Kidney allograft biopsy remains the “gold standard” for assessing the cause of kidney transplant dysfunction. However, allograft biopsy has limitations, including the risk of bleeding, injury to adjacent organs, and the possibility of sampling error leading to misdiagnosis.

Methods

We conducted a comprehensive review of the literature and the main published data on the most relevant serum and urine biomarkers of acute allograft dysfunction, along with their clinical significance.

Results

Several important biomarkers that correlate with biopsy findings, clinical outcomes, and possibly graft survival have been identified. Proteomic and genomic approaches have been applied in this area with some success, along with a growing number of cytokine surrogate markers.

Summary

The discovery of surrogate biomarkers in kidney transplantation is an evolving field of crucial importance that may change the way we practice transplant nephrology in the future.

18.

Background

The purpose of this study was to investigate the effect of liver compliance on computed tomography (CT) volumetry and to determine its association with postoperative small-for-size syndrome (SFSS).

Patients and methods

Unenhanced, arterial, and venous phase CT images of 83 consecutive living liver donors who underwent graft hepatectomy for adult-to-adult living donor liver transplantation (ALDLT) were prospectively subjected to three-dimensional (3-D) CT liver volume calculation and virtual 3-D liver partitioning. Graft volume estimates based on 3-D volumetry, with the intrahepatic vascular volume subtracted, were obtained from the “smallest” (native unenhanced) and “largest” (venous) CT phases and subsequently compared with intraoperative graft weights. Calculated (preoperative) graft volume-to-body weight ratios (GVBWR) and intraoperatively measured graft weight-to-body weight ratios (GWBWR) were analyzed with respect to postoperative SFSS.
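The ratios at the center of this analysis are simple quotients of graft size to recipient body weight. The sketch below assumes the conventional definitions (graft weight in grams divided by body weight in grams, expressed as a percentage, with CT volume converted via an assumed density of about 1 g/mL); only the 0.8 small-for-size cutoff is taken from the abstract.

```python
# Sketch of the graft-size ratios discussed above, under conventional
# definitions: GWBWR (%) = graft weight (g) / body weight (g) * 100, and
# GVBWR computed analogously from CT-estimated graft volume (assumed density
# ~1 g/mL). Only the 0.8 small-for-size cutoff comes from the abstract.

SFS_CUTOFF = 0.8  # grafts with real GWBWR < 0.8 were small-for-size here

def gwbwr(graft_weight_g, body_weight_kg):
    return 100.0 * graft_weight_g / (body_weight_kg * 1000.0)

def gvbwr(graft_volume_ml, body_weight_kg, density_g_per_ml=1.0):
    # CT volumetry gives a volume; convert to weight via the assumed density
    return gwbwr(graft_volume_ml * density_g_per_ml, body_weight_kg)

# Example: a 520-mL right-lobe graft for a 72-kg recipient
ratio = gvbwr(520, 72)
print(f"GVBWR = {ratio:.2f}, small-for-size: {ratio < SFS_CUTOFF}")
# GVBWR = 0.72, small-for-size: True
```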

Results

Significant differences in minimum versus maximum total liver volumes, graft volumes, and GVBWR calculations were observed between the largest (venous) and the smallest (unenhanced) CT phases. SFSS occurred in 6% (5/83) of recipients, with a mortality rate of 80% (4/5). In four cases with postoperative SFSS (3 lethal, 1 reversible), a small-for-size graft had been transplanted (real GWBWR < 0.8). The three small-for-size grafts with lethal SFSS showed a nonsignificant volume “compliance,” with a maximum GVBWR < 0.83. This contrasts with the seven recipients of small-for-size grafts with reversible or no SFSS, who showed a “safe” maximum GVBWR of 0.92 to 1.16.

Conclusion

The recognition and precise assessment of each individual's liver compliance, as displayed by the minimum and maximum GVBWR values, are critical for accurate prediction of functional liver mass and prevention of SFSS in ALDLT.

19.

Introduction

The aortic calcification index (ACI) is reported to be closely associated with renal dysfunction and cardiovascular events; however, its implications for renal transplant recipients have not been well examined. In this study, we investigated the relationships between pretransplant ACI, ACI progression, post-transplant renal function, and post-transplant cardiovascular events in renal transplant recipients.

Patients and methods

The study included 61 renal transplant recipients (living donors, 47; cadaveric donors, 14) who underwent transplantation from June 1996 to January 2012. The median follow-up period was 60 months. ACI was quantitatively measured on abdominal computed tomography. The relationships between age, dialysis period, estimated glomerular filtration rate (eGFR), and pre- and post-transplant ACI were evaluated longitudinally. Risk factors for post-transplant ACI progression were determined by logistic regression analysis. Patient background and the incidence of post-transplant cardiovascular events were also assessed.

Results

The pretransplant ACI (median, 4.2%) significantly correlated with age at transplantation, dialysis period, and diabetes mellitus. ACI gradually increased, reaching up to 2.8 times the baseline value at 10 years after transplantation. Post-transplant eGFR significantly correlated with ACI progression in patients with chronic kidney disease of stage ≥3. Logistic regression analyses showed that age at transplantation, post-transplant period, cadaveric donation, and post-transplant chronic kidney disease stage 3 were risk factors for post-transplant ACI progression. The pretransplant ACI was higher (median, 66%) in the 3 patients who experienced post-transplant cardiovascular events.

Conclusions

ACI progression closely correlates with age and post-transplant renal function. A high pretransplant ACI is a risk factor for post-transplant cardiovascular events in renal transplant recipients.

20.

Objective

To determine the incidence of colon cancer in lung transplant recipients with cystic fibrosis (CF) and review screening colonoscopic findings in other recipients with CF.

Methods

A retrospective chart review was performed for all patients with CF who underwent transplantation at the University of Wisconsin Hospital and Clinics (January 1994 through December 2010).

Results

Four of 70 transplant recipients with CF developed fatal colon carcinoma following transplantation, and the cancer was advanced in all 4 recipients (ages 31, 44, 44, and 64 years) at the time of diagnosis. In contrast, only 1 of 287 recipients transplanted for non-CF indications developed colon cancer. Of the recipients with CF who did not develop colon cancer, 20 underwent screening colonoscopy 1 to 12 years after transplantation. Seven (35%) of these screened recipients (ages 36, 38, 40, 41, 43, 49, and 51 years) had colonic polyps, located from the cecum to the sigmoid colon and up to 3 cm in diameter.

Conclusions

In contrast to non-CF recipients, patients with CF displayed a significant incidence of colon cancer (4 of 70 recipients; 5.7%), with onset ranging from 246 days to 9.3 years post-transplant, which may be due to a combination of their underlying genetic disorder and the intense, sustained immunosuppression following lung transplantation. Colonoscopic screening may identify patients with premalignant colonic lesions and prevent progression to colonic malignancy.
