Similar Articles (20 results)
1.

Objective

We sought to investigate the clinical courses of renal transplant recipients with plasma BK virus (BKV) loads >10⁴ copies/mL.

Methods

A single-center retrospective review was performed of 88 kidney transplant patients in whom high BK viremia (defined as a plasma BKV load >10⁴ copies/mL) was detected more than once from January 1, 2004, to December 31, 2011.

Results

At the time of transplantation, the mean recipient and donor ages were 44.5 ± 11.1 and 43.9 ± 11.3 years, respectively, and 59 subjects (67.0%) were male. The median times to first BK positivity and to high BK viremia after transplantation were 44 and 136 days, respectively. Fifty-six cases of high BK viremia (63.6%) were detected within 3 months after transplantation. The mean duration of high BK viremia was 8.2 ± 7.7 months. When the plasma BKV load first exceeded 4 log10 copies/mL, the mean log BKV load was 5.50 ± 1.11 log copies/mL, which rose to a maximum of 5.82 ± 1.11; mean serum creatinine concentrations at these 2 time points were 1.67 ± 0.79 and 2.64 ± 2.78 mg/dL, respectively. Of the 51 patients (58%) who underwent biopsy, 31 (35% of the cohort) had biopsy-proven BK nephropathy. Treatment modalities included discontinuation or dose reduction of mycophenolic acid drugs (n = 68), switch from tacrolimus to cyclosporine (n = 9), cidofovir (n = 9), and leflunomide (n = 3). Based on the serum creatinine elevation after high BK viremia, patients were divided into group 1 (n = 27; 30.1%), whose maximal creatinine change was <0.5 mg/dL, and group 2, whose change was greater. On multivariate logistic regression analysis, the maximal plasma BK viral load was significantly associated with a greater serum creatinine elevation (P < .001).
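For readers who want to reproduce an analysis of this form, below is a minimal sketch of the multivariate logistic regression step using Python's statsmodels. The simulated data frame, variable names, and covariate list are illustrative assumptions, not the study's actual dataset or model specification.

```python
# Hedged sketch: logistic regression of a binary creatinine-rise outcome on
# maximal log10 plasma BK viral load, with simulated stand-in data (n = 88).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "cr_rise_ge_0_5": rng.binomial(1, 0.7, 88),   # 1 = group 2 (rise >=0.5 mg/dL)
    "max_log_bkv": rng.normal(5.82, 1.11, 88),    # log10 copies/mL (maximal load)
    "recipient_age": rng.normal(44.5, 11.1, 88),  # hypothetical covariate
})

X = sm.add_constant(df[["max_log_bkv", "recipient_age"]])
fit = sm.Logit(df["cr_rise_ge_0_5"], X).fit(disp=0)
print(fit.summary())
print("OR per 1-log10 increase in BKV:", np.exp(fit.params["max_log_bkv"]))
```

The odds ratio per log10 increase in viral load is obtained by exponentiating the fitted coefficient.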

Conclusions

High BK viremia mostly occurred within 3 months after kidney transplantation. About 30% of renal allograft recipients with high BK viremia maintained stable renal function. Maximal plasma BK viral load was the only independent risk factor for high serum creatinine elevation.

2.
We undertook a prospective, matched cohort study of patients with Staphylococcus aureus bacteremia (SAB) and gram-negative bacteremia (GNB) to compare the characteristics, outcomes, and chemokine and cytokine responses of transplant recipients with those of immunocompetent, nontransplant patients. Fifty-five transplant recipients (GNB n = 29; SAB n = 26) and 225 nontransplant recipients (GNB n = 114; SAB n = 111) were included for clinical analysis. Transplant GNB had a significantly lower incidence of septic shock than nontransplant GNB (10.3% vs 30.7%, p = .03). Thirty-day mortality did not differ significantly between transplant and nontransplant recipients with GNB (10.3% vs 15.8%, p = .57) or SAB (0.0% vs 11.7%, p = .13). Next, transplant patients were matched 1:1 with nontransplant patients for the chemokine and cytokine analysis. Five cytokines and chemokines were significantly lower in transplant GNB vs nontransplant GNB: IL-2 (median [IQR]: 7.1 pg/ml [7.1, 7.1] vs 32.6 pg/ml [7.1, 88.0]; p = .001), MIP-1β (30.7 pg/ml [30.7, 30.7] vs 243.3 pg/ml [30.7, 344.4]; p = .001), IL-8 (32.0 pg/ml [5.6, 53.1] vs 59.1 pg/ml [39.2, 119.4]; p = .003), IL-15 (12.0 pg/ml [12.0, 12.0] vs 12.0 pg/ml [12.0, 126.7]; p = .03), and IFN-α (5.1 pg/mL [5.1, 5.1] vs 5.1 pg/ml [5.1, 26.3]; p = .04). Regulated upon Activation, Normal T Cell Expressed and Secreted (RANTES) was higher in transplant SAB vs nontransplant SAB (mean [SD]: 750.2 pg/ml [194.6] vs 656.5 pg/ml [147.6]; p = .046).
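The abstract reports cytokines as medians with IQRs but does not name the comparison test; a nonparametric rank test such as the Mann-Whitney U is one conventional choice for assay data clustered at a detection floor. The sketch below, with simulated values, is therefore an assumption about the analysis, not the authors' code.

```python
# Simulated IL-2 values: transplant GNB patients sit at the assay floor
# (7.1 pg/ml), nontransplant GNB patients are right-skewed above it.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
il2_transplant = np.full(29, 7.1)                       # all at detection limit
il2_nontransplant = 7.1 + rng.lognormal(2.5, 1.2, 29)   # skewed spread above floor

stat, p = mannwhitneyu(il2_transplant, il2_nontransplant, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4f}")
```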

3.

Background

Hyperuricemia may be associated with the development of new cardiovascular events and graft loss in renal transplant recipients. This study was conducted to clarify whether hyperuricemia is a persistent, independent predictor of long-term graft survival and patient outcome.

Methods

Renal allograft recipients (n = 880) who underwent transplantation from December 1999 to March 2013 were included. Participants were divided into 2 groups: a hyperuricemic group (n = 389) and a normouricemic group (n = 491). The mean serum uric acid (UA) level was obtained by averaging all measurements, once per month for 3 months, before the study began. Clinical and laboratory data were collected. We investigated the role of hyperuricemia in the primary endpoint of graft failure by using time-varying analysis and Kaplan-Meier plots. All-cause mortality in renal transplant recipients was also surveyed.
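A minimal sketch of the Kaplan-Meier comparison described above, using the Python lifelines package, is shown below; the toy data frame and column names are hypothetical stand-ins for the 880-patient dataset.

```python
# Graft survival stratified by hyperuricemia status, with a log-rank test.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months": [12, 60, 156, 30, 90, 156],   # follow-up time to failure/censoring
    "graft_failed": [1, 0, 0, 1, 1, 0],     # 1 = graft failure, 0 = censored
    "hyperuricemic": [1, 1, 0, 1, 0, 0],    # group indicator
})

kmf = KaplanMeierFitter()
for label, grp in df.groupby("hyperuricemic"):
    kmf.fit(grp["months"], grp["graft_failed"], label=f"hyperuricemic={label}")
    kmf.plot_survival_function()

hu, nu = df[df.hyperuricemic == 1], df[df.hyperuricemic == 0]
res = logrank_test(hu["months"], nu["months"],
                   event_observed_A=hu["graft_failed"],
                   event_observed_B=nu["graft_failed"])
print("log-rank p =", res.p_value)
```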

Results

During a mean follow-up of 43.3 ± 26.3 months, the 389 patients with hyperuricemia were characterized by male predominance (62.98%), high serum UA at entry (7.70 mg/dL; range 6.70–8.80), more hypertension (92.29%), prior hemodialysis (29.56%), hepatitis C infection (24.42%), more frequent use of UA-lowering agents (43.44%), and more frequent use of drugs that raise serum UA (17.74%). After 12 months, the hyperuricemic group had persistently higher serum UA (7.66 ± 2.00 vs 6.17 ± 1.60 mg/dL, P < .001) and poorer renal function (serum creatinine 2.96 ± 3.20 vs 1.61 ± 1.96 mg/dL, P < .001) than the normouricemic group. Survival analysis showed poorer graft survival in the hyperuricemic group (60.47%) than in the normouricemic group (75.82%, P = .0069) after 13 years of follow-up. However, there was no difference in all-cause mortality between the 2 groups.

Conclusion

Persistently high serum UA seems to be implicated in elevation of serum creatinine, which could increase the risk for allograft dysfunction.

4.

Introduction

Despite improved quality of life after transplant, kidney recipients in the United States are employed at lower rates than the general population. Employment after kidney transplantation is an important marker of clinically meaningful individual health recovery. Furthermore, employment status in the post-transplant period has been shown to have a strong, independent association with patient and graft survival.

Materials and Methods

Using the United Network for Organ Sharing (UNOS) database, we identified all adults (18 to 64 years of age) who underwent kidney transplantation between 2004 and 2011. Patients with stable renal allograft function and full 1-, 3-, and 5-year follow-up were included. For recipients of multiple transplants, the most recent transplant was considered the target transplant. The data collected included the employment rate after kidney transplantation among recipients employed and unemployed before transplant. Employment data were stratified by insurance payer (private, Medicaid, and Medicare). Categorical variables are reported as percentages; comparisons between groups were performed with the χ2 test with Yates continuity correction or the Fisher exact test, as appropriate.
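The two categorical tests named here can be sketched as follows; the 2 × 2 counts are invented for illustration and are not UNOS data.

```python
# Chi-square with Yates continuity correction, falling back to Fisher's exact
# test when any expected cell count is small (< 5).
from scipy.stats import chi2_contingency, fisher_exact

table = [[140, 60],   # hypothetical counts: employed / not employed, private
         [55, 145]]   #                      employed / not employed, public

chi2, p, dof, expected = chi2_contingency(table, correction=True)  # Yates
if (expected < 5).any():
    _, p = fisher_exact(table)
print(f"p = {p:.4f}")
```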

Results

The UNOS database available for this study included a total of 100,521 patients. The employment rate at the time of transplant was 23.1% (n = 23,225) under private insurance and 10% (n = 10,032) under public insurance (Medicaid and Medicare; P < .01 vs private insurance). Among the 29,809 recipients who were alive with stable renal allograft function and working at the time of transplantation, the employment rate was 47% (n = 14,010), 44% (n = 13,115), and 43% (n = 12,817) at 1, 3, and 5 years after transplant under private insurance, and 16% (n = 4769), 14% (n = 4173), and 12% (n = 3567), respectively, under public insurance (P < .01 vs private insurance). Among the 46,363 recipients who were alive with stable renal function and not working at the time of transplant, the employment rate was 5.3% (n = 2457), 5.6% (n = 2596), and 6.2% (n = 2874) at 1, 3, and 5 years after transplant under private insurance, and 6.5% (n = 3013), 7.8% (n = 3616), and 7.5% (n = 3477), respectively, under public insurance (P < .01 vs private insurance).

Conclusion

Employment rates at the time of transplant in the United States are generally low, although privately insured patients are significantly more likely than patients with public insurance to be employed. Only a portion of these patients return to work after transplantation. For patients unemployed at the time of transplantation, the chance of finding a job afterward is quite low, even among the privately insured. A concerted effort should be made by the transplant community to improve the ability of successful kidney transplant recipients to return to work or find new employment. Employment status in the post-transplant period has been shown to have a strong, independent association with graft and recipient survival.

5.
Introduction

Bone mineral density (BMD) is significantly lower in heart failure patients. Our aim was to evaluate the relationship between BMD and fasting serum N-terminal pro-B-type natriuretic peptide (NT-proBNP) concentration in renal transplant recipients.

Methods

Fasting blood samples were obtained from 69 renal transplant recipients. BMD was measured by dual energy x-ray absorptiometry in the lumbar vertebrae (L2–L4). Serum NT-proBNP levels were measured by electrochemiluminescence immunoassay.

Results

Among the renal transplant recipients, 8 patients (11.6%) had osteoporosis and 28 (40.6%) had osteopenia; 33 had normal BMD. Increased serum NT-proBNP (P < .001) and decreased body mass index (P = .033) and body weight (P = .010) were significantly associated with lower lumbar T-score categories (normal, osteopenia, and osteoporosis). Women had lower lumbar BMD than men (P = .013). Menopause (P = .005), use of tacrolimus (P = .020), and use of cyclosporine (P = .046) were associated with lower lumbar BMD. Multivariate forward stepwise linear regression analysis of the significant variables revealed that log-transformed NT-proBNP (β = −0.545; R2 = 0.331; P < .001) and body weight (β = 0.273; R2 = 0.104; P = .005) were independent predictors of lumbar BMD among the renal transplant recipients.

Conclusions

Serum NT-proBNP concentrations correlate negatively with lumbar BMD among renal transplant recipients and may be an alternative to dual energy x-ray absorptiometry for identifying patients at risk of osteoporosis.
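As a rough illustration of the final regression reported above (lumbar BMD on log-transformed NT-proBNP and body weight), here is a hedged sketch with simulated data; the forward stepwise selection itself is omitted, and only the final two-predictor fit is shown.

```python
# Simulated data built with a negative NT-proBNP effect and a positive weight
# effect, so the fitted signs mirror the reported betas (-0.545 and +0.273).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
log_nt = rng.normal(2.2, 0.4, 69)       # log10 NT-proBNP, pg/mL
weight = rng.normal(62, 11, 69)         # kg
bmd = 1.1 - 0.12 * log_nt + 0.003 * weight + rng.normal(0, 0.08, 69)

df = pd.DataFrame({"bmd": bmd, "log_ntprobnp": log_nt, "weight": weight})
fit = smf.ols("bmd ~ log_ntprobnp + weight", data=df).fit()
print(fit.summary())
```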

6.

Background

Routine microbiologic surveillance is a method of infection control, but its clinical significance in transplant recipients is not known. We analyzed microbiologic data to evaluate the relationship between organisms recovered on surveillance cultures and subsequent infectious episodes in liver transplant recipients.

Methods

We performed surveillance cultures of sputum and peritoneal fluid in liver transplant recipients from January 2009 to December 2011 at the time of transplantation (T1) and at 5 days (T2) and 10 days (T3) postoperatively.

Results

Of the 179 recipients, 32.9% had a positive sputum culture result and 37.4% had a positive peritoneal culture result during surveillance. In sputum surveillance, 37 organisms were isolated from 35 recipients at T1, the most common being Staphylococcus aureus (n = 13). At T2, 45 organisms were isolated from 39 recipients, including Klebsiella pneumoniae (n = 10), S aureus (n = 8), and Acinetobacter baumannii (n = 6). At T3, 18 organisms were isolated from 15 patients, including Stenotrophomonas maltophilia (n = 5) and K pneumoniae (n = 4). In peritoneal fluid, 11 organisms were isolated from 10 recipients at T1, including Pseudomonas aeruginosa (n = 2) and Enterococcus species (n = 2). At T2, 39 organisms were isolated from 36 recipients, including coagulase-negative Staphylococcus species (CNS; n = 8) and Enterococcus species (n = 7). At T3, 54 organisms were isolated from 51 recipients, including CNS (n = 17) and Candida species (n = 8). Among the 59 patients with positive sputum surveillance cultures, 16.9% developed pneumonia caused by the same organisms. Among the 67 patients with positive peritoneal fluid cultures, 16.4% developed an intra-abdominal infection caused by the same organisms. Recipients with positive surveillance cultures had a higher risk of pneumonia (20.3% [12/59] vs 1.6% [2/120]; P < .001) and intra-abdominal infection (31.3% [21/67] vs 18.7% [21/112]; P = .05).

Conclusions

Periodic microbiologic surveillance may be useful in the prediction of post-transplantation pneumonia and intra-abdominal infection and could offer a potential target for empirical antimicrobial therapy in cases of infection.

7.

Background

In this study, we used a single-center database to examine the risks of renal transplantation in patients with diabetes mellitus (DM). We aimed to compare 1-year outcomes of survival and morbidity after renal transplantation among recipients with and without DM.

Methods

We retrospectively reviewed 1211 adult patients who underwent renal transplantation from January 2001 to December 2010. The patients were divided into 2 groups: those with (33%) and those without (67%) pretransplant diabetes. Unpaired Student's t tests and χ2 tests were used to compare outcomes between diabetic and nondiabetic renal transplant recipients. We analyzed survival, renal function, development of proteinuria, rejection, and infection (requiring hospitalization).

Results

Patients with diabetes were older, had a greater body mass index (mean, 29.5 vs 25.3 kg/m2; P < .05), and had lower creatinine clearance (44.2 ± 11.4 vs 56.0 ± 18.2 mL/min; P = .01). Forty-one patients died in hospital (3.4%; P = nonsignificant), and survival rates were similar between the 2 groups, although there was a trend toward decreased 1-year survival with DM (80.4% vs 88.7%; P = .20). Mean follow-up time was 3.2 years. The infection rate within 6 months was greater among those with DM (19% vs 5%; odds ratio, 6.25). Freedom from rejection at 3 years was similar (75.2% vs 76.8%; P = .57). Multivariate analysis showed increased baseline creatinine level to be a significant risk factor for survival; body mass index >30 kg/m2 was a significant risk factor among patients with DM.

Conclusion

We found an increased risk of serious infections in patients with DM, particularly within the first 6 months. However, our data suggest that diabetes is not associated with worse 1-year survival or higher morbidity in renal transplant patients, as long as good blood glucose control is maintained.

8.

Background

Metabolic syndrome (MS) may affect patient and graft survival in renal transplant recipients. However, the evolution of MS during prospective follow-up remains uncertain.

Methods

Renal transplant patients were recruited for a study of MS in 2010 and then prospectively followed for 2 years. The modified Adult Treatment Panel III criteria adopted for Asian populations were used to define MS.

Results

A total of 302 cases (male:female = 154:148), a mean of 10.5 ± 5.7 years after transplantation, were enrolled. At enrollment, 71 cases (23.5%) fulfilled the criteria for MS. By the end of follow-up, 11 cases had died and 21 had graft failure; 9 cases had insufficient data for reclassification. The remaining 261 cases completed the 2-year follow-up, and the prevalence of MS was 26.1% at the end of the study. Among patients without MS at baseline, 7.79% (18 cases) developed new-onset MS; conversely, 16.9% (12 cases) of those with MS at baseline were free of MS at the end of the study (P = .362). MS was associated with older age (57.1 ± 10.4 vs 52.6 ± 12.4 y; P = .006), more chronic allograft nephropathy (17.4% vs 7.1%; P = .01), more proteinuria (22.5% vs 10.8%; P = .012), and use of more antihypertensive agents (1.49 ± 0.86 vs 0.80 ± 0.98; P < .0001). There was no significant change in serum creatinine in either subgroup.

Conclusions

The status of MS in renal transplant patients is dynamic. MS was associated with more chronic allograft nephropathy and proteinuria.

9.
Most renal transplant recipients display vitamin D deficiency or insufficiency. The KDIGO guidelines suggest that this deficit be treated as in the general population. Because there are few studies on the effects of cholecalciferol in de novo renal transplant recipients, we sought to assess these effects in long-term kidney graft recipients. Among 37 renal transplant recipients (19 males, 18 females) at a mean of 105 ± 82 months posttransplantation, vitamin D insufficiency or deficiency was treated with cholecalciferol (400-800 IU/d) plus calcium supplements (600-1200 mg/d of elemental calcium). These subjects were compared with 37 untreated recipients over a period of 6 to 12 months. At baseline, there were no differences between the groups in age at transplantation, sex, length of follow-up after grafting, graft function measured by estimated glomerular filtration rate (44.4 ± 16.8 treated vs 42.0 ± 15.0 mL/min/1.73 m2 untreated; P = .527), iPTH (157 ± 103 treated vs 176 ± 118 pg/mL untreated; P = .461), 25OHD (14.7 ± 4.7 treated vs 15.7 ± 9.7 ng/mL untreated; P = .584), or 1,25OHD (34.1 ± 26.0 treated vs 34.0 ± 13.0 pg/mL untreated; P = .950). Compared with baseline values, iPTH (157 ± 103 vs 144 ± 89 pg/mL; P = .11) and 1,25OHD levels at 6 months (34.1 ± 26.0 vs 35.9 ± 26.3 pg/mL; P = .282) showed no change, but 25OHD levels (14.7 ± 4.7 vs 22.6 ± 7.4 ng/mL; P < .001) and tubular reabsorption of phosphate (64% ± 17% baseline vs 69% ± 14% at 6 months; P = .030) increased in the treated patients. There were no differences in the parameters studied in untreated patients. Among the 27 recipients followed to 12 months, iPTH was decreased compared with baseline values (157 ± 103 vs 124 ± 62 pg/mL; P = .024), and 25OHD remained stable with respect to the 6-month values (21.1 ± 5.3 ng/mL). No adverse effects of cholecalciferol, such as increased urinary calcium excretion, were observed. Low doses of cholecalciferol improved vitamin D status and decreased iPTH levels at 12 months. Doses higher than those used in our study are needed to raise serum 25OHD concentrations above 30 ng/mL.

10.
Earlier detection of and intervention for chronic renal allograft injury (CRAI) remain major challenges for transplantation physicians. Endocan plays a key role in the regulation of cell adhesion, inflammatory disorders, and tumor progression. We conducted this cross-sectional study of 97 renal transplant (RT) recipients, with a mean RT duration of 7.0 ± 5.7 years, to determine whether Endocan could serve as a diagnostic and prognostic marker. The patients' mean age was 43.6 ± 13.2 years, and 55.7% (54/97) were male. Higher Endocan levels were found at more advanced chronic kidney disease (CKD) stages in a dose-dependent manner. Interestingly, after 3 months of follow-up, the Endocan ≥643.19 pg/mL group had higher creatinine (1.6 ± 1.1 vs 1.2 ± 0.4 mg/dL; P = .029) and lower estimated glomerular filtration rate (54.4 ± 22.0 vs 67.8 ± 23.8 mL/min; P = .006) than the Endocan <643.19 pg/mL group. Linear regression analysis found that tumor necrosis factor (TNF)-α correlated well with Endocan. To elucidate the response of endothelium activation, we stimulated human umbilical vein endothelial cells (HUVECs) with TNF-α in vitro and found that levels of Endocan (P = .022) and transforming growth factor (TGF)-β1 (P = .034) increased with time, whereas interleukin (IL)-10 decreased (P = .013). In summary, Endocan may reflect the degree of endothelial cell injury in renal allografts and showed a trend toward elevation in late-stage CKD. The in vitro study demonstrated that TNF-α–activated HUVECs secrete high levels of Endocan and TGF-β1, which could lead to a better understanding of the role of the endothelium in immune balance. In conclusion, Endocan may have potential as a useful long-term indicator of CRAI in RT recipients, but further study is needed to verify our findings.

11.
Transplantation Proceedings 2021;53(7):2204-2205
Background

The aim of this study was to assess the impact of the Belfast Protocol for enhanced recovery after surgery on hospital length of stay (LOS) after kidney transplant.

Methods

A prospectively collected database was analyzed for all consecutive renal transplant recipients in 2010 and compared with consecutive renal transplant recipients in 2018, before and immediately after the full implementation of the Belfast Protocol.

Results

There were 73 renal transplants in 2010 and 115 in 2018. Between 2010 and 2018 there was a significant decrease in LOS from 12 to 7 days (P < .0001). Compared with 2010, in 2018 there was a significant increase in donor age (47 vs 54 years, P < .0001) and kidney transplantation from donation after circulatory death donors (0% vs 9%, P < .0001). Although there was no change in the proportion of living donors (59% vs 50%, P = .32), in 2018 there were more blood group–incompatible living donors (0% vs 7%, P = .21). Compared with 2010, in 2018 there was a significant increase in recipient age (43 vs 54 years, P = .0002), diabetic nephropathy (5% vs 16%, P = .03), and recipient body mass index >35 kg/m2 (0% vs 9%, P = .02).

Conclusions

Implementation of the Belfast Protocol has decreased LOS in renal transplant recipients despite increasingly complex donor and recipient profiles.

12.

Background

Monitoring cell-mediated immunity (CMI) can be used to estimate the risk of viral infections in kidney transplant recipients. The Immuknow (IMK) assay measures CD4+ T-cell adenosine triphosphate activity, assesses patient CMI status, and assists clinicians in determining the risk of viral infection.

Methods

We retrospectively analyzed 224 IMK values in 39 kidney transplant recipients at our institution from April 2012 to January 2013. We analyzed the relationship between IMK value and viral infection during the early and late post-transplantation periods. Multiple regression analyses were performed to determine which factors affected the results of the IMK assay.

Results

Eight patients developed viral infections, including BK virus, cytomegalovirus, herpes simplex, and shingles. Five infections occurred in the early post-transplantation period (<50 d) and 3 in the late period (>120 d). The IMK levels in patients who developed an infection in the early period were within normal limits, whereas those in the late period were significantly lower than 200 ng/mL (421.0 ± 62.6 early vs 153.7 ± 72.7 late; P = .02). Our multiple regression analyses indicated that peripheral white blood cell and neutrophil counts affected IMK values (P = .03 and P = .02, respectively).

Conclusions

The IMK assay is a useful test for identifying patients at risk for post-transplantation viral infections in the late transplant period.

13.
Background and Objective

Variable age thresholds are often used at transplant centers for simultaneous heart and kidney transplantation (HKT). We hypothesized that selected older recipients enjoy outcomes comparable to those of younger recipients in the current era of HKT.

Methods

We performed a retrospective analysis of HKT outcomes in the United Network for Organ Sharing (UNOS) registry from 2006 to 2018, classifying patients by age at transplant as ≥65 or <65 years. The primary outcome was patient death. Secondary outcomes included all-cause kidney graft failure and death-censored kidney allograft failure. A Cox proportional hazards model was used, as sketched after this section.

Results

Of 973 patients, 774 (80%) were younger than 65 years (mean 52 ± 10 years) and 199 (20%) were 65 years or older (mean 67 ± 2 years). The older HKT cohort had fewer blacks (22% vs 35%, P = .01) and women (12% vs 18%, P = .04). Fewer older patients received dialysis (30% vs 54%, P < .001) and mechanical support (36% vs 45%, P = .03) before HKT. Older recipients received organs from slightly older donors. The median follow-up time was shorter for patients 65 years or older than for the younger group (2.3 vs 3.3 years, P < .001). Patient survival was similar between the groups (mean 8.8 vs 9.8 years, P = .3), with the most common causes of death being cardiovascular (29%) and infectious (28%) complications. There was no difference in all-cause kidney graft survival (mean 8.7 vs 9.3 years, P = .8). Most commonly, recipients died with a functioning renal allograft (59.8%), and this occurred more often in older patients (81.4% vs 54.8%, P = .001). Cox proportional hazards modeling showed that higher donor age (hazard ratio [HR] 1.015, P = .01; HR 1.022, P = .02) and use of pre-transplant dialysis (HR 1.5, P = .004; HR 1.8, P = .006) increased the risk for all-cause and death-censored kidney allograft failure, respectively.

Conclusions

Our study showed that carefully selected older patients have outcomes similar to those of a younger cohort, and it argues for comprehensive evaluation of recipients with age as part of the comorbidity assessment rather than use of an arbitrary age threshold for candidacy.
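A minimal sketch of the Cox proportional hazards step described in the Results is given below, using the lifelines package; the toy data and column names are hypothetical, not UNOS registry fields.

```python
# Kidney graft failure modeled on donor age and pre-transplant dialysis;
# hazard ratios are exp(coef), cf. the reported HR 1.015 per donor year.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years": [2.3, 3.3, 1.0, 5.2, 4.1, 0.8],   # follow-up time
    "graft_failed": [1, 0, 1, 0, 0, 1],        # 1 = failure, 0 = censored
    "donor_age": [35, 50, 62, 41, 58, 66],
    "pre_tx_dialysis": [1, 0, 1, 0, 1, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="graft_failed")
cph.print_summary()
```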

14.
The impact of borderline rejection on renal grafts remains controversial. The aim of this study was to analyze the presence of C4d deposits in peritubular capillaries and macrophage infiltration in renal biopsies with a diagnosis of borderline rejection and their effect on graft function. Thirty-one renal transplant recipients with a diagnosis of borderline rejection were included. Initial and sequential biopsies were analyzed for morphology, C4d, and macrophage staining and compared with clinical data. Twelve of the initial biopsies were C4d positive, which was associated with a higher incidence of delayed graft function, earlier post-transplantation time, higher acute tubular necrosis score, capillaritis, and glomerular macrophage infiltration, and a lower level of tubulitis, interstitial fibrosis, and tubular atrophy compared with the C4d-negative samples. In sequential biopsies, 5 patients from the negative group turned C4d positive. Patients with ≥1 positive C4d biopsy (n = 17) showed lower renal graft function at 6 months (1.8 ± 0.8 vs 1.4 ± 0.5 mg/dL; P < .01), 1 year (2.1 ± 1 vs 1.5 ± 0.5 mg/dL; P < .01), and 2 years (2.3 ± 1.3 vs 1.5 ± 0.7 mg/dL; P < .05) of follow-up. The expression of C4d in the peritubular capillaries of renal biopsies classified as borderline rejection was associated with a worse prognosis for the renal allograft.

15.
Introduction

Effective workup and listing of end-stage renal disease (ESRD) patients for renal transplantation, often with multiple comorbidities, poses a challenge for transplant teams. Obesity is a common comorbidity associated with adverse outcomes in ESRD and kidney transplant (KT) recipients. Bariatric and metabolic surgery (BMS) has long been established as a safe and effective treatment for morbid obesity. In this study, the authors aimed to evaluate the strength of evidence for both the efficacy and the safety of bariatric surgery in patients with ESRD or kidney transplantation.

Methods

A literature search was performed using key terms including "transplantation", "kidney", "renal", "obesity", and "bariatric". Databases searched included MEDLINE, EMBASE, and Web of Science from inception to date (April 2021). Methodological quality was assessed using the Newcastle-Ottawa tool. Selected articles were then categorised into patients awaiting waiting-list acceptance, patients awaiting transplantation, patients undergoing simultaneous BMS + KT, and patients undergoing BMS following a previous renal transplant. Summary effects are presented with a level of statistical significance and 95% confidence intervals.

Results

A total of 28 articles were selected following the literature search: fourteen studies on patients awaiting listing (n = 1903), nine on patients on the KT waiting list (n = 196), a single study on simultaneous BMS and KT, and ten studies on patients undergoing BMS following KT (n = 198). Mean change in BMI was −11.3 kg/m2 (95% CI: −15.3 to −7.3, p < 0.001) for patients awaiting listing, −11.2 kg/m2 (95% CI: −12.9 to −9.5, p < 0.001) for patients listed for KT, and −11.0 kg/m2 (95% CI: −14.9 to −7.09, p < 0.001) for patients with prior KT. The combined mortality rate for patients who had undergone both BMS and KT was 4% (n = 15).

Discussion

This review demonstrates that BMS is both safe and efficacious in patients with ESRD prior to KT and in those post KT. It would give difficult-to-list obese recipients the possibility of undergoing transplantation and should be considered as part of the workup process.
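To make "summary effects with 95% confidence intervals" concrete, below is a small sketch of inverse-variance (fixed-effect) pooling. It reuses the three pooled BMI-change estimates quoted above purely to illustrate the arithmetic; the review pooled individual studies, so this is not a reproduction of its analysis.

```python
# Inverse-variance pooling: SE is recovered from each 95% CI, weights are
# 1/SE^2, and the pooled mean is the weighted average.
import numpy as np

means = np.array([-11.3, -11.2, -11.0])    # kg/m2
ci_low = np.array([-15.3, -12.9, -14.9])
ci_high = np.array([-7.3, -9.5, -7.09])

se = (ci_high - ci_low) / (2 * 1.96)       # SE implied by a 95% CI
w = 1 / se**2                              # inverse-variance weights
pooled = np.sum(w * means) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
print(f"pooled = {pooled:.2f} kg/m2 "
      f"(95% CI {pooled - 1.96 * pooled_se:.2f} to {pooled + 1.96 * pooled_se:.2f})")
```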

16.

Background

This study was designed to compare donors who underwent open donor nephrectomy (ODN) versus retroperitoneoscopic donor nephrectomy (RDN) in terms of intraoperative oxidative stress and recipients' graft function in the early postoperative period.

Methods

Among 40 patients who underwent donor nephrectomy, 23 were operated on via the open method and 17 via the retroperitoneoscopic method. To analyze oxidative stress, we measured plasma levels of malondialdehyde (MDA), protein carbonyl, and protein sulfhydryl moieties in donor venous blood before induction of anesthesia and at 0, 6, and 24 hours postoperatively. The influence of oxidative stress on graft function was evaluated by means of recipient creatinine on postoperative day 5, estimated glomerular filtration rate (eGFR) calculated with the Modification of Diet in Renal Disease (MDRD) formula, and delayed graft function (DGF) status.
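The MDRD eGFR used here can be computed from serum creatinine, age, sex, and race. The sketch below uses the standard 4-variable, IDMS-traceable coefficients; which exact MDRD variant the authors used is our assumption.

```python
# 4-variable MDRD study equation (IDMS-traceable form, constant 175).
def egfr_mdrd(scr_mg_dl: float, age_years: float, female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2."""
    egfr = 175.0 * scr_mg_dl**-1.154 * age_years**-0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: a 45-year-old male recipient with serum creatinine 1.1 mg/dL.
print(round(egfr_mdrd(1.1, 45, female=False, black=False), 1))
```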

Results

ODN patients showed significantly higher 24-hour mean levels of MDA (6,139 ± 1,854 vs 4,813 ± 1,771 nmol/L; P = .01), protein carbonyl (366 ± 64 vs 311 ± 62 μmol/L; P = .01), and protein sulfhydryl (468 ± 110 vs 386 ± 75 μmol/L; P = .01) moieties than RDN patients. However, ODN and RDN recipients were similar in terms of day-5 mean creatinine and eGFR (1.1 ± 0.3 vs 1.4 ± 0.8 mg/dL and 69.15 ± 12.24 vs 56.31 ± 25.2 mL/min/1.73 m2, respectively) and DGF status (4.4% [1/23] vs 5.9% [1/17], respectively).

Conclusions

Although ODN donors were more prone to intraoperative oxidative stress than RDN donors, based on significantly higher levels of oxidative stress markers, this difference does not appear to significantly influence recipients' early graft function.

17.
Background

Belatacept is employed alongside calcineurin inhibitor (CNI) therapy to prevent graft rejection in kidney transplant patients who are Epstein-Barr virus (EBV) seropositive. Preliminary data suggested that rates of post-transplant lymphoproliferative disorder (PTLD) were higher in individuals treated with belatacept than with CNI therapy alone.

Methods

The records of 354 adults who underwent kidney-only transplantation from January 2015 through September 2021 at one medical center were evaluated. Patients underwent treatment with either low doses of mycophenolate, tacrolimus, and sirolimus (B0, n = 235) or low doses of mycophenolate, tacrolimus, and belatacept (B1, n = 119). All recipients underwent induction with antithymocyte globulin and a rapid glucocorticosteroid taper. Relevant donor and recipient information was analyzed, and endpoints of PTLD were assessed.

Results

There were no cases of PTLD in either cohort within the study period. Recipients in the belatacept cohort had lower estimated glomerular filtration rates at 12 months (B0: 67.48 vs. B1: 59.10, p = 0.0014). Graft failure was similar at 12 months (B0: 1.28% vs. B1: 0.84%, p = 1.0) and 24 months (B0: 2.55% vs. B1: 0.84%, p = 0.431). There was no difference in rejection rates at 12 months (B0: 1.27% vs. B1: 2.52%, p = 0.408) or 24 months (B0: 2.12% vs. B1: 2.52%, p = 1.000). Both groups had similar rates of malignancy, mortality, and CMV/BK viremia.

Conclusion

Non-belatacept (MMF, tacrolimus, and sirolimus) and belatacept-based (MMF, tacrolimus, and belatacept) regimens do not appear to pose any increased risk of early-onset PTLD. Both cohorts benefited from low rates of rejection, malignancy, mortality, and graft failure. Recipients will continue to be monitored, as PTLD can manifest as a long-term complication.

18.
19.

Background

Liver resection (LR) in liver transplant (OLT) recipients is an extremely rare situation; we report on 8 recipients who underwent LR.

Methods

This retrospective analysis of prospectively collected data concerned 8 LR cases (0.66%) among 1198 OLTs performed from 1997 to 2011. We analyzed demographic data, surgical indications, and postoperative courses.

Results

The indications were resectable recurrent hepatocellular carcinoma (HCC, n = 3), persistent fistula from a posterior sectorial duct (n = 1), recurrent cholangitis due to an anastomotic stricture on the posterior sectorial duct (n = 1), hydatid cyst (n = 1), left hepatic arterial thrombosis with secondary ischemic cholangitis (n = 1), and a large symptomatic biliary cyst (n = 1). The mean interval to liver resection was 23.7 months (range, 5–47). LR included right hepatectomy (n = 1), right posterior hepatectomy (n = 1), left lobectomy (n = 4), pericystectomy (n = 1), or biliary fenestration (n = 1). Although there was no postoperative mortality, the overall morbidity rate was 62% (5/8). The mean follow-up after LR was 92 months (range, 11–156). No patient required retransplantation. None of the 3 patients who underwent LR for HCC showed a recurrence.

Conclusions

LR in OLT recipients is safe but associated with a high morbidity rate. This procedure can avoid retransplantation in highly selected patients, presenting a possible option particularly for transplanted patients with resectable recurrent HCC.

20.

Purpose

The patent covering mycophenolate mofetil (MMF) in Korea has expired, and thus several generic MMF agents are now commercially available. The supply of Cellcept (Roche Korea) was interrupted at the end of 2011, so use of a generic MMF was inevitable. During this period, we performed a prospective pilot study to examine the safety and efficacy of a generic mycophenolate agent (Myconol; Hanmi Pharmaceutical, Seoul, Korea) for use as conversion maintenance therapy in stable liver transplant (OLT) recipients.

Methods

OLT recipients who were treated with MMF on an outpatient basis from January 2012 to March 2012 attended follow-up interviews. The patients had undergone OLT ≥2 years before the study, had tolerated Cellcept, and showed stable liver function. Fifty-three patients were followed up for more than 3 months after conversion to the same dose of Myconol.

Results

After conversion to Myconol, 6 patients (11.3%) experienced new side effects, which disappeared when they reverted to Cellcept (n = 5) or stopped taking Myconol medication (n = 1). The side effects associated with Myconol included gastrointestinal symptoms (indigestion and diarrhea; n = 3), skin eruptions (n = 1), pruritus (n = 1), and insomnia (n = 1). The mean mycophenolic acid levels were 1.71 ± 0.88 μg/mL for Cellcept and 1.83 ± 0.91 μg/mL for Myconol, which showed a strong correlation (r2 = 0.92, P < .001).
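A hedged sketch of the level-correlation analysis (r2 = 0.92): the paired trough levels below are simulated, and scipy's pearsonr returns r, so r2 is obtained by squaring.

```python
# Simulated paired mycophenolic acid troughs on Cellcept vs after conversion
# to Myconol, built to be strongly correlated for illustration only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
mpa_cellcept = rng.normal(1.71, 0.88, 53).clip(min=0.1)    # ug/mL
mpa_myconol = mpa_cellcept + rng.normal(0.12, 0.25, 53)    # ug/mL

r, p = pearsonr(mpa_cellcept, mpa_myconol)
print(f"r^2 = {r**2:.2f}, p = {p:.3g}")
```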

Conclusions

Myconol showed pharmacokinetics similar to those of Cellcept, but a small proportion of patients experienced agent-specific side effects; therefore, patients should be closely monitored when taking Myconol. Further studies with a greater number of patients are required to identify the full spectrum of drug-associated side effects.

