Similar Articles
 A total of 20 similar articles were retrieved (search time: 31 ms).
1.
Abstract:  The shortage of kidney donors has led to broadening of the acceptance criteria for deceased donor organs beyond the traditional use of young donors. We determined long-term post-transplant outcomes in recipients of dual expanded criteria donor kidneys (dECD, n = 44) and compared them to recipients of standard criteria donor kidneys (SCD, n = 194) and single expanded criteria donor kidneys (sECD, n = 62). We retrospectively reviewed these 300 deceased donor kidney transplants, performed at our center from 1996 to 2003, excluding cases of primary non-function (PNF) or death in the first two wk. The three groups were similar in baseline characteristics. Kidney allograft survival and patient survival (nine yr) were similar in the three respective donor groups, SCD, sECD and dECD (60% vs. 59% vs. 64% and 82% vs. 73% vs. 73%). Acute rejection in the first three months was 23.2%, 16.1%, and 22.7% in SCD, sECD and dECD, respectively (p = 0.49), and delayed graft function was 25.2%, 31.9% and 17.1% in the three groups, respectively (p = 0.28). When PNF and death within the first two wk were included, there was no significant difference in graft survival between the three groups. In our population, recipients of dECD transplants have acceptable patient and graft survival with kidneys that would usually have been discarded.

2.
The use of kidneys from hepatitis C virus (HCV)-positive (D+) deceased donors for HCV-negative recipients (R−) might increase the donor pool. We analyzed the national Organ Procurement and Transplantation Network (OPTN) registry from 1994 to 2014 to compare the outcomes of HCV D+/R− kidney transplants (n = 421) with propensity-matched HCV-negative donor (D−)/R− kidney transplants, as well as with waitlisted patients who never received a transplant, in a 1:5 ratio (n = 2105 per matched group). Both 5-year graft survival (44% vs 66%; P < .001) and patient survival (57% vs 79%; P < .001) were inferior for the D+/R− group compared to D−/R−. Nevertheless, 5-year patient survival from the time of wait listing was superior for D+/R− when compared to waitlisted controls (68% vs 43%; P < .001). Of the 126 D+/R− recipients with available post-transplant HCV testing, HCV seroconversion, likely donor-derived, was confirmed in 62 (49%). Five-year outcomes were similar between D+/R− recipients who seroconverted and those who did not (n = 64). Our analysis shows inferior outcomes for D+/R− patients, although detailed data on pretransplant risk factors were not available. Limited data suggest that HCV transmission occurred in half of HCV D+/R− patients, although this might not have been the primary factor contributing to the poor observed outcomes.
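The registry comparison above rests on 1:5 propensity matching. The sketch below is a minimal illustration of that general approach, not the authors' OPTN analysis code; the DataFrame column names (`hcv_pos_donor`, `age`, `diabetes`, `dialysis_years`) are hypothetical placeholders.

```python
# Minimal sketch of 1:5 nearest-neighbour propensity-score matching.
# Column names are hypothetical placeholders, not OPTN registry fields.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_1_to_k(df: pd.DataFrame, treat_col: str, covars: list[str], k: int = 5) -> pd.DataFrame:
    # 1. Fit a propensity model: P(treated | covariates).
    X = df[covars].to_numpy()
    y = df[treat_col].to_numpy()
    ps = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    df = df.assign(pscore=ps)

    treated = df[df[treat_col] == 1]
    control = df[df[treat_col] == 0]

    # 2. For each treated record, take the k controls with the closest
    #    propensity score (matching without replacement is omitted for brevity).
    nn = NearestNeighbors(n_neighbors=k).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_controls = control.iloc[idx.ravel()]

    return pd.concat([treated, matched_controls])

# Usage with hypothetical column names:
# matched = match_1_to_k(df, "hcv_pos_donor", ["age", "diabetes", "dialysis_years"], k=5)
```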

3.
Abstract:  We examined a group of SPK recipients who had early (<90 d post-transplant) pancreas graft failure caused by a technical complication, and looked at outcomes of the kidney graft in these recipients. Of 289 SPK transplants, 36 (12.5%) had early pancreas graft failure because of a technical complication: thrombosis (n = 16), leak (n = 5), infection (n = 14), and pancreatitis (n = 1). Once the pancreas was lost, there was a high incidence of subsequent kidney graft failure. Kidney graft survival in these 36 recipients was 71.4% at one yr and 59.5% at three yr, significantly inferior to that of recipients who did not have early failure of the pancreas (86% at one yr and 82% at three yr, p < 0.001). Of the 36 recipients with early pancreas loss, 18 have gone on to failure of the kidney graft. Causes included thrombosis (n = 3), infection (n = 1), death with function (n = 6), chronic rejection (n = 4), ischemia (n = 1), and other (n = 3). Of the 18 kidney graft failures, nine occurred within three months after loss of the pancreas graft, usually because of either graft thrombosis or patient death (usually from systemic sepsis). Multivariate analysis showed technical failure of the pancreas to be the most significant risk factor for kidney graft loss (HR = 2.08, p = 0.006).

4.
Abstract:  Mycophenolate mofetil (MMF) and sirolimus (SRL) are effective immunosuppressive drugs with distinct safety profiles.
Methods:  Kidney transplant recipients receiving a tacrolimus (TAC)-based immunosuppressive regimen were randomized to receive fixed daily doses of MMF (2 g/d, n = 50) or SRL (one 15-mg loading dose, then 5 mg/d until day 7 and 2 mg/d thereafter, n = 50) without induction therapy.
Results:  No differences were observed in the incidence of the composite end-point (biopsy-confirmed acute rejection, graft loss or death; 18% vs. 16%, p = 1.000), biopsy-confirmed acute rejection (12% vs. 14%, p = 1.000), one-yr patient survival (94% vs. 98%, p = 0.308), graft survival (92% vs. 98%, p = 0.168), or death-censored graft survival (98% vs. 100%, p = 0.317) comparing patients receiving MMF or SRL, respectively. Patients receiving SRL showed worse safety outcomes: higher mean creatinine (1.6 ± 0.5 mg/dL vs. 1.4 ± 0.3 mg/dL, p = 0.007), a higher proportion of patients with proteinuria (52.0% vs. 10.7%, p = 0.041), higher mean urinary protein concentrations (0.3 ± 0.5 g/L vs. 0.1 ± 0.2 g/L, p = 0.012), higher mean cholesterol concentration (217 mg/dL vs. 190 mg/dL, p = 0.030), and a higher proportion of patients prematurely discontinued from randomized therapy (26% vs. 8%, p = 0.031).
Conclusion:  In patients receiving TAC, MMF produced similar efficacy but a superior safety profile compared with SRL.
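The arm-to-arm comparisons of binary outcomes in this trial (e.g., the composite end-point in 9/50 vs. 8/50 patients) are the kind of contrast commonly tested with Fisher's exact test. The snippet below is a generic illustration of how such a p-value is obtained, not the trial's own statistical code.

```python
# Generic two-proportion comparison with Fisher's exact test, using the
# published event rates (9/50 vs. 8/50 for the composite end-point) purely
# as an illustration of how such p-values are obtained.
from scipy.stats import fisher_exact

def compare_proportions(events_a: int, n_a: int, events_b: int, n_b: int):
    table = [[events_a, n_a - events_a],
             [events_b, n_b - events_b]]
    odds_ratio, p_value = fisher_exact(table)
    return odds_ratio, p_value

orr, p = compare_proportions(9, 50, 8, 50)   # 18% vs. 16% composite end-point
print(f"OR = {orr:.2f}, two-sided p = {p:.3f}")
```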

5.
SIMCER was a 6-mo, multicenter, open-label trial. Selected de novo liver transplant recipients were randomized (week 4) to everolimus with low-exposure tacrolimus discontinued by month 4 (n = 93) or to tacrolimus-based therapy (n = 95), both with basiliximab induction and enteric-coated mycophenolate sodium with or without steroids. The primary end point, change in estimated GFR (eGFR; MDRD formula) from randomization to week 24 after transplant, was superior with everolimus (mean eGFR change +1.1 vs. −13.3 mL/min per 1.73 m2 for everolimus vs. tacrolimus, respectively; difference 14.3 [95% confidence interval 7.3–21.3]; p < 0.001). Mean eGFR at week 24 was 95.8 versus 76.0 mL/min per 1.73 m2 for everolimus versus tacrolimus (p < 0.001). Treatment failure (treated biopsy-proven acute rejection [BPAR; rejection activity index score >3], graft loss, or death) from randomization to week 24 was similar (everolimus 10.0%, tacrolimus 4.3%; p = 0.134). BPAR was more frequent between randomization and month 6 with everolimus (10.0% vs. 2.2%; p = 0.026); the rate of treated BPAR was 8.9% versus 2.2% (p = 0.055). Sixteen everolimus-treated patients (17.8%) and three tacrolimus-treated patients (3.2%) discontinued the study drug because of adverse events. In conclusion, early introduction of everolimus at an adequate exposure level with gradual calcineurin inhibitor (CNI) withdrawal after liver transplantation, supported by induction therapy and mycophenolic acid, is associated with a significant renal benefit versus CNI-based immunosuppression but more frequent BPAR.
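The renal end point above is eGFR calculated with the MDRD formula. For reference, a commonly used four-variable MDRD form (the IDMS-traceable re-expression with coefficient 175) is sketched below; this illustrates the formula itself and is not the SIMCER study's analysis code, which may have used a different re-expression.

```python
# Four-variable MDRD estimate of GFR (mL/min per 1.73 m^2), IDMS-traceable
# re-expression (coefficient 175). Illustrative only.
def egfr_mdrd(creatinine_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: 55-year-old non-black man with serum creatinine 1.2 mg/dL
print(round(egfr_mdrd(1.2, 55, female=False, black=False), 1))
```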

6.
Complement-dependent cytotoxicity cross-match (CDCXM) is used for evaluation of preformed HLA-specific antibodies in patients undergoing heart transplantation. Flow cytometry cross-match (FCXM) is a more sensitive assay and is used with increasing frequency. To determine the clinical relevance of a positive FCXM in the context of a negative CDCXM in heart transplantation, the United Network for Organ Sharing (UNOS) database was analyzed. Kaplan-Meier analysis and Cox proportional hazards modeling were used to assess graft survival for three patient cohorts defined by cross-match results: T-cell and B-cell CDCXM+ ("CDCXM+" cohort), CDCXM− but T-cell and/or B-cell FCXM+ ("FCXM+" cohort), and T-cell/B-cell CDCXM− and FCXM− ("XM−" cohort). During the study period, 2558 patients met inclusion criteria (10.7% CDCXM+, 18.8% FCXM+, 65.5% XM−). CDCXM+ patients had significantly decreased graft survival compared to the FCXM+ and XM− cohorts (P = .003 and <.001, respectively). CDCXM− and FCXM+ patients did not have decreased graft survival compared to XM− patients (P = .09). In multivariate analysis, only CDCXM+ was associated with decreased graft survival (HR 1.22, 95% CI 1.01-1.49). In conclusion, positive FCXM in the context of negative CDCXM does not confer increased risk of graft failure. Further study is needed to understand the implications of CDCXM and FCXM testing in heart transplant recipients.
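Graft survival in the three cross-match cohorts above was compared with Kaplan-Meier estimates and Cox proportional-hazards modeling. The sketch below shows how such an analysis is typically set up with the `lifelines` package; the DataFrame columns (`time_to_event`, `graft_failed`, `cohort`) are hypothetical stand-ins, not UNOS field names.

```python
# Sketch of a Kaplan-Meier / Cox PH comparison across cross-match cohorts.
# All column names are hypothetical; this is not the UNOS analysis code.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

def compare_cohorts(df: pd.DataFrame):
    # Kaplan-Meier curve per cohort (e.g. "CDCXM+", "FCXM+", "XM-")
    for name, grp in df.groupby("cohort"):
        KaplanMeierFitter().fit(grp["time_to_event"], grp["graft_failed"],
                                label=name).plot_survival_function()

    # Pairwise log-rank test, e.g. CDCXM+ vs. XM-
    a = df[df["cohort"] == "CDCXM+"]
    b = df[df["cohort"] == "XM-"]
    lr = logrank_test(a["time_to_event"], b["time_to_event"],
                      a["graft_failed"], b["graft_failed"])
    print("log-rank p =", lr.p_value)

    # Cox model with cohort indicators; assumes remaining columns are
    # numeric adjustment covariates.
    model_df = pd.get_dummies(df, columns=["cohort"], drop_first=True, dtype=float)
    cph = CoxPHFitter().fit(model_df, duration_col="time_to_event",
                            event_col="graft_failed")
    cph.print_summary()   # hazard ratios with 95% CIs
```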

7.
Building on studies showing that ischemia–reperfusion (I/R) injury is complement dependent, we tested links among complement activation, transplantation-associated I/R injury, and murine cardiac allograft rejection. We transplanted BALB/c hearts subjected to 8-h cold ischemic storage (CIS) into cytotoxic T-lymphocyte associated protein 4 (CTLA4)Ig-treated wild-type (WT) or c3−/− B6 recipients. Whereas allografts subjected to 8-h CIS rejected in WT recipients with a median survival time (MST) of 37 days, identically treated hearts survived >60 days in c3−/− mice (p < 0.05, n = 4–6/group). Mechanistic studies showed recipient C3 deficiency prevented induction of intragraft and serum chemokines/cytokines and blunted the priming, expansion, and graft infiltration of interferon-γ–producing, donor-reactive T cells. MST of hearts subjected to 8-h CIS was >60 days in mannose binding lectin (mbl1−/−mbl2−/−) recipients and 42 days in factor B (cfb−/−) recipients (n = 4–6/group, p < 0.05, mbl1−/−mbl2−/− vs. cfb−/−), implicating the MBL (not alternative) pathway. To pharmacologically target MBL-initiated complement activation, we transplanted BALB/c hearts subjected to 8-h CIS into CTLA4Ig-treated WT B6 recipients with or without C1 inhibitor (C1-INH). Remarkably, peritransplantation administration of C1-INH prolonged graft survival (MST >60 days, p < 0.05 vs. controls, n = 6) and prevented CI-induced increases in donor-reactive, IFNγ-producing spleen cells (p < 0.05). These new findings link donor I/R injury to T cell–mediated rejection through MBL-initiated complement activation and support testing C1-INH administration to prevent CTLA4Ig-resistant rejection of deceased donor allografts in human transplant patients.

8.
The purpose of this study was to sequentially monitor anti-HLA antibodies and correlate the results with antibody-mediated rejection (AMR), graft survival (GS), and graft function (GF). We collected sera from 111 kidney transplant recipients on transplant days 0, 7, 14, 30, 60, 90, 180, and 360 and analyzed PRA levels by ELISA. DSAs were analyzed by single-antigen beads in rejecting kidneys. Pre-transplant, 79.3% of the patients were non-sensitized (PRA = 0%) and 20.7% were sensitized (PRA > 1%). After transplant, patients were grouped by PRA profile: no anti-HLA antibodies pre- or post-transplant (group HLApre−/post−; n = 80); de novo anti-HLA antibodies post-transplant (group HLApre−/post+; n = 8); sensitized pre-transplant/increased PRA post-transplant (group HLApre+/post↑; n = 9); and sensitized pre-transplant/decreased PRA post-transplant (group HLApre+/post↓; n = 14). De novo anti-HLA antibodies were detected at 7–180 d. In sensitized patients, PRA levels changed within the first 30 d post-transplant. The incidence of AMR was higher in the HLApre−/post+ and HLApre+/post↑ groups than in the HLApre−/post− and HLApre+/post↓ groups (p < 0.001). One-yr death-censored GS was 36% in group HLApre+/post↑, compared with 98%, 88%, and 100% in groups HLApre−/post−, HLApre−/post+, and HLApre+/post↓, respectively (p < 0.001). Excluding first-year graft losses, GF and GS were similar among the groups. In conclusion, post-transplant antibody monitoring can identify recipients at higher risk of AMR.

9.
Neostigmine reverses non-depolarising neuromuscular blockade, but may cause muscle weakness when administered after full recovery of neuromuscular function. We hypothesised that neostigmine in therapeutic doses impairs muscle strength and respiratory function in awake healthy volunteers. Twenty-one volunteers were randomised to receive two doses of either intravenous (i.v.) neostigmine 2.5 mg with glycopyrrolate 450 μg (neostigmine group, n = 14) or normal saline 0.9% (placebo group, n = 7). The first dose was administered immediately after obtaining baseline measurements, and the second dose was administered 15 min later. All 14 volunteers in the neostigmine group received the first dose, mean (SD) 35 (5.8) μg.kg−1, but only nine of these volunteers agreed to receive the second dose, 34 (3.5) μg.kg−1. The primary outcome was hand grip strength. Secondary outcomes were train-of-four ratio, single twitch height, forced expiratory volume in 1 s, forced vital capacity, forced expiratory volume in 1 s/forced vital capacity ratio, oxygen saturation, heart rate and mean arterial pressure. The first dose of intravenous neostigmine with glycopyrrolate resulted in reduced grip strength compared with placebo, −20 (20) % vs. +4.3 (9.9) %, p = 0.0016; depolarising neuromuscular blockade with decreased single twitch height, −14 (11) % vs. −3.8 (5.6) %, p = 0.0077; a restrictive spirometry pattern with decreased predicted forced expiratory volume in 1 s, −15 (12) % vs. −0.47 (3.4) %, p = 0.0011; and predicted forced vital capacity, −20 (12) % vs. −0.59 (3.2) %, p < 0.0001 at 5 min after administration. The second dose of neostigmine with glycopyrrolate further decreased grip strength, mean (SD) −41 (23) % vs. +1.0 (15) %, p = 0.0004; single twitch height, −25 (15) % vs. −2.5 (6.6) %, p = 0.0030; predicted forced expiratory volume in 1 s, −23 (24) % vs. −0.7 (4.4) %, p = 0.0063; and predicted forced vital capacity, −27.1 (22.0) % vs. −0.66 (3.9) %, p = 0.0010. Train-of-four ratio remained unchanged (p = 0.22). In healthy volunteers, therapeutic doses of neostigmine induced significant and dose-dependent muscle weakness, demonstrated by a decrease in maximum voluntary hand grip strength and a restrictive spirometry pattern secondary to depolarising neuromuscular blockade.

10.
Abstract:  The relationship between global economic indicators and kidney allograft and patient survival is unknown. To investigate possible relationships between the two, we analyzed kidney transplant recipients who received transplants between January 1995 and December 2002 in the USA (n = 105 181) using Cox regression models. We found that the Dow Jones Industrial Average had a negative association with outcome at one year post-transplant (HR 1.03 and 1.06, p < 0.001 for graft and recipient survival, respectively) but changed to a protective effect in the late period (HR 0.77, p < 0.001, and HR 0.83, p < 0.001 for graft and recipient survival, respectively, five yr after transplantation). Unemployment rate had a protective effect at the time of transplantation (HR 0.97, p < 0.005) and at one year after transplantation (HR 0.95, p < 0.005) but changed to the opposite in the late period, at the fifth post-transplant year (HR 1.35, p < 0.001, and HR 1.20, p < 0.001, for graft and recipient survival, respectively). The Consumer Price Index measured at different post-transplant time points seems to have had a protective effect on graft (HR 0.77, p < 0.001 at five yr) and recipient (HR 0.83, p < 0.001 at five yr) survival. Beyond three yr after transplantation, when some recipients lose Medicare benefits, economic downturns might have a negative association with kidney graft and recipient survival.

11.
Appropriate recipient selection for simultaneous liver/kidney transplantation (SLKT) remains controversial. In particular, data on liver graft survival in hepatitis C virus-infected (HCV+) SLKT recipients are lacking. We conducted a single-center, retrospective study of HCV+ SLKT recipients (N = 25) in comparison with HCV− SLKT recipients (N = 26) and HCV+ liver transplantation alone (LTA, N = 296). Although the backgrounds of HCV+ and HCV− SLKT recipients were similar, HCV+ SLKT demonstrated significantly impaired 5-year liver graft survival of 35% (HCV− SLKT, 79%, P = 0.004). Compared with HCV+ LTA, induction immunosuppression was used more frequently in HCV+ SLKT. The 5-year liver graft survival rate for HCV+ SLKT was significantly lower than that for LTA (35% vs. 74%, respectively, P < 0.001). The adjusted hazard ratio of liver graft loss in HCV+ SLKT was 4.9 (95% confidence interval 2.0–12.1, P = 0.001). HCV+ SLKT recipients were more likely to succumb to recurrent HCV and sepsis compared with LTA recipients (32% vs. 8.8%, P < 0.001, and 24% vs. 8.8%, P = 0.030, respectively). Ten HCV+ SLKT recipients underwent anti-HCV therapy for recurrent HCV; only 1 achieved a sustained virological response. HCV+ SLKT is associated with a significantly worse long-term prognosis compared with HCV− SLKT and HCV+ LTA.

12.
Cytotoxic T-lymphocyte antigen-4 (CTLA-4) is a cell surface protein that down-regulates the immune response via the CTLA-4/CD28/B7 pathway. We aimed to investigate the influence of the −318C/T, +49A/G, −1661A/G, and CT60A/G CTLA-4 gene polymorphisms on acute rejection of kidney allografts in Turkish patients. The study design was a case–control study consisting of three groups: Group 1 (n = 34) comprised kidney transplant (Ktx) recipients who experienced acute rejection, Group 2 (n = 47) was randomly assigned Ktx recipients without acute rejection, and Group 3 (n = 50) consisted of healthy volunteers to evaluate the normal genomic distribution. The polymerase chain reaction–restriction fragment length polymorphism technique was used to determine the polymorphisms. Genotype and allele frequencies among the three groups showed similar distributions for +49A/G, −1661A/G, and CT60A/G. Conversely, the −318C/T genotype was three times more frequent in the acute rejection group than in the non-rejection group (OR = 3.45; 95% CI = 1.18–10.1, p = 0.015) and two times more frequent than in the healthy control group (OR = 2.45; 95% CI = 0.98–6.11, p = 0.047). Additionally, having a T allele at the −318 position was significantly associated with acute rejection (0.147 vs. 0.043, OR = 3.45; 95% CI = 1.13–10.56, p = 0.02). The −318C/T gene polymorphism and its T allelic variant were found to be associated with increased acute rejection risk in Turkish kidney allograft recipients.
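The genotype associations above are reported as odds ratios with 95% confidence intervals from a case–control comparison. The snippet below shows, generically, how an OR and a Wald-type CI are derived from a 2×2 table; the counts are invented for illustration and are not the study's genotype data.

```python
# Odds ratio and 95% Wald confidence interval from a 2x2 case-control table.
# Counts below are illustrative placeholders, not the study's genotype data.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a/b: exposed/unexposed among cases; c/d: exposed/unexposed among controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, (lo, hi)

# Hypothetical example: 10/24 carriers among rejectors vs. 5/42 among non-rejectors
print(odds_ratio_ci(10, 24, 5, 42))
```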

13.
Despite a variety of urinary tract reconstructive techniques, urinary complications are the most frequent technical adverse event following kidney transplantation. We examined outcomes of two ureteroneocystostomy techniques, the full-thickness (FT) technique and the Lich–Gregoir (LG) technique, in 634 consecutive kidney-alone transplants (327 FT and 307 LG) between December 2006 and December 2010. Urological complications at one yr post-transplantation occurred in 27 cases (4.3%), including 16 ureteral strictures (2.5%), four ureteral obstructions (0.6%) owing to donor-derived stones or intrinsic hematoma, and seven urine leaks (1.1%). Compared with LG, the FT technique was associated with similar proportions of ureteral complications overall (3.9% vs. 4.6%, p = 0.70), ureteral strictures (3.7% vs. 1.3%, p = 0.08), urinary stones/hematoma (1.0% vs. 0.3%, p = 0.36), and overall urinary leaks (1.6% vs. 0.6%, p = 0.22); however, the FT technique was associated with somewhat fewer urine leaks at the ureterovesical junction (0% vs. 1.3%, p = 0.05). There were no differences between the two groups in terms of length of stay, delayed graft function, urinary tract infection within the first post-transplant year, estimated glomerular filtration rate, or overall graft and patient survival. The FT technique of ureteroneocystostomy is technically simple to perform and has a similar incidence of urinary complications compared with the LG technique.

14.
Abstract:  Nocturnal home hemodialysis (NHD) is a novel dialysis strategy associated with multiple advantages over conventional hemodialysis (CHD). Short- and long-term clinical outcomes of NHD patients after kidney transplantation are unknown. We hypothesized that the incidence of delayed graft function (DGF), patient and graft survival, and post-transplant estimated glomerular filtration rate (eGFR) are better among CHD-transplanted individuals than among those having received NHD. Of 231 NHD patients, 36 underwent renal transplantation between 1994 and 2006 and were matched to 68 transplanted CHD patients, with a maximum follow-up of 11.7 yr. The incidence of DGF was not different between the two groups [NHD: 15/35 (42.9%) vs. CHD: 25/68 (36.8%), p = 0.43]. In modeling eGFR, pre-transplant weight, donor age, and recipient race were most predictive. Dialysis modality prior to transplantation influenced neither the level of eGFR post-transplantation (p = 0.34) nor the rate of eGFR decline. Patient survival was comparable between the NHD and CHD groups (log-rank p = 0.91). Based on this analysis, the incidence of DGF was similar between NHD- and CHD-transplanted patients, and pre-transplant modality did not impact the level or rate of deterioration of post-transplant eGFR.

15.
Epstein-Barr virus (EBV)-induced post-transplant lymphoproliferative disorder (PTLD) occurs frequently when rabbit antithymocyte globulin (ATG) is used in hematopoietic cell transplant (HCT) conditioning. We retrospectively studied 554 patients undergoing ATG-conditioned myeloablative HCT. The strategies used to minimize mortality due to PTLD were either therapy of biopsy-diagnosed PTLD in the absence of EBV DNAemia monitoring (n = 266) or prompt therapy of presumed PTLD (based on clinical/radiologic signs and high EBV DNAemia, in the setting of weekly EBV DNAemia monitoring) (n = 199). Both strategies resulted in similar mortality due to PTLD (0.7% vs 1% at 2 years, P = .43) and similar overall survival (63% vs 67% at 2 years, P = .23), even though there was a trend toward higher PTLD incidence with prompt therapy. Donor-positive/recipient-negative EBV (D+R−) serostatus was a risk factor for developing PTLD. Older patient age, HLA-mismatched donor, and graft-versus-host disease were not associated with increased risk of PTLD. In summary, in ATG-conditioned HCT, D+R− serostatus, but not older age, mismatched donor, or GVHD, is a risk factor for developing PTLD. EBV DNAemia monitoring may be a weak risk factor for developing/diagnosing PTLD; the monitoring coupled with prompt therapy does not improve survival.

16.
We aimed to determine the role of cytomegalovirus (CMV)-infected donor cells in the development of a CMV-specific immune response in kidney transplant recipients. We assessed the CMV pp65-specific immune response by using interferon-γ ELISPOT and dextramers in peripheral blood mononuclear cells from 115 recipients (D+R− 31, D+R+ 44, D−R+ 40) late after transplantation (mean 59 ± 42 months). Receiving a kidney from a D+ donor resulted in a higher number of IFN-γ-producing anti-CMV T cells (P = .004). This effect disappeared in the absence of shared HLA class I specificities between donors and recipients (P = .430). To confirm the role of donor cells in stimulating the expansion of newly developed CMV-specific CD8+ T cells after transplantation, we compared the number of HLA-A2–restricted CMV-specific CD8+ T cells in primo-infected recipients who received an HLA-A2 or non–HLA-A2 graft. The median number of anti-CMV pp65 T cells restricted by HLA-A2 was very low for patients who received a non–HLA-A2 graft vs an HLA-A2 graft (300 [0-14638] vs. 17972 [222-85594] anti-CMV pp65 CD8+ T cells/million CD8+ T cells, P = .001). This adds new evidence that CMV-infected kidney donor cells present CMV peptides and drive an inflation of memory CMV-specific CD8+ T cells, likely because of frequent CMV replication within the graft.

17.
Abstract:  The impact of a three-drug regimen including mycophenolate mofetil (MMF) vs. a two-drug (no MMF) regimen on progressive renal dysfunction (PRD) in liver transplant recipients with hepatitis C virus (HCV) infection has not been well described. Adults with HCV who received a primary liver transplant between January 1, 2000 and December 31, 2005 and were discharged from the hospital on a three-drug regimen [CNI + MMF + steroids (S)] (n = 4 946) were compared with those discharged on a two-drug regimen (CNI + S) (n = 3 884). Time to PRD (defined by a post-transplant 25% decline in estimated GFR, based on the four-variable MDRD equation) and recipient death were evaluated using Kaplan–Meier analysis. Cox proportional hazards regression was used to estimate the risk for post-transplant PRD and death after controlling for baseline characteristics and extended steroid use. The two groups were similar in baseline characteristics. The percentage of recipients without PRD at three yr post-transplant was higher on the three- vs. two-drug regimen (36.8% vs. 31.9%, p < 0.001); three-drug therapy was associated with a 6% lower adjusted risk of PRD. The death rate and adjusted risk for death were lower for recipients on a three- vs. two-drug regimen. Liver transplant recipients with HCV on an MMF-containing regimen are at lower risk for PRD and death compared with recipients on a regimen not including MMF.

18.
19.
Delayed graft function (DGF) following deceased donor kidney transplantation is associated with inferior outcomes. Delayed graft function following living-donor kidney transplantation is less common, but its impact on graft survival is unknown. We therefore sought to determine risk factors for DGF following living-donor kidney transplantation and the effect of DGF on living-donor kidney graft survival. We analyzed living-donor kidney transplants performed between 2000 and 2014 in the UNOS dataset. A total of 64 024 living-donor kidney transplant recipients were identified, of whom 3.6% developed DGF. Cold ischemic time, human leukocyte antigen mismatch, donor age, panel reactive antibody, recipient diabetes, donor and recipient body mass index, recipient race and gender, right nephrectomy, open nephrectomy, dialysis status, ABO incompatibility, and previous transplants were independent predictors of DGF in living-donor kidney transplants. Five-year graft survival among living-donor kidney transplant recipients with DGF was significantly lower than graft survival in those without DGF (65% and 85%, respectively, P < 0.001). DGF more than doubled the risk of subsequent graft failure (hazard ratio = 2.3, 95% confidence interval: 2.1–2.6; P < 0.001). DGF after living-donor kidney transplantation is associated with inferior allograft outcomes. Minimizing modifiable risk factors may improve outcomes in living-donor kidney transplantation.
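The DGF risk factors and the DGF-to-graft-failure hazard ratio above reflect, respectively, a multivariable model for DGF and a survival model for graft loss. A minimal sketch of that two-step workflow is given below; the column names are hypothetical and this is not the registry analysis code.

```python
# Sketch of how independent predictors of DGF and the DGF -> graft-failure
# hazard ratio are typically estimated. Column names are hypothetical, not
# actual UNOS variable names.
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

def dgf_models(df: pd.DataFrame, predictors: list[str]):
    # 1. Multivariable logistic regression: which factors predict DGF?
    X = sm.add_constant(df[predictors].astype(float))
    logit = sm.Logit(df["dgf"].astype(float), X).fit(disp=0)
    print(logit.summary())          # odds ratios = exp(coefficients)

    # 2. Cox model: effect of DGF on subsequent graft survival.
    cox_df = df[["graft_survival_years", "graft_failed", "dgf"]].astype(float)
    cph = CoxPHFitter().fit(cox_df, duration_col="graft_survival_years",
                            event_col="graft_failed")
    cph.print_summary()             # hazard ratio for DGF with 95% CI
```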

20.
The objective of this review was to assess whether dual kidney transplantation (DKT) is better than single kidney transplantation (SKT) for optimizing the use of expanded criteria donor kidneys. We did a systematic literature search and meta-analyses where possible, pooling data to calculate relative risks (RR) of major outcomes. Twenty-five studies met the inclusion criteria. One-year serum creatinine was better after DKT vs. SKT (mean difference −0.27 [−0.37, −0.17], P < 0.001), with a lower incidence of acute rejection (RR 0.66 [0.52, 0.85], P < 0.001) and without differences at five years. A trend toward less DGF was seen in DKT (RR 0.88 [0.76, 1.02], P = 0.09). Mortality at 1 and 3 years was similar after dual or single KT, but mortality at five years was lower after DKT (RR 0.71 [0.53, 0.94], P = 0.02). One-year graft loss was similar between dual (n = 4158) and single KT (n = 51 800) (RR 0.97 [0.87, 1.09], P = 0.62). Three- and five-year graft loss was not considered because of high heterogeneity between studies. In conclusion, short-term graft function and long-term patient survival are better in recipients receiving DKT vs. SKT. However, these differences are based on a few retrospective reports with a relatively low number of cases. Good-quality randomized controlled trials are needed to assess whether the investment of two kidneys in one recipient is justified in the face of the current organ shortage.
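The pooled relative risks above come from a meta-analysis of study-level estimates. The sketch below shows a generic inverse-variance (fixed-effect) pooling of RRs from each study's point estimate and 95% CI; the review may well have used a random-effects model, and the three studies in the example are invented.

```python
# Inverse-variance (fixed-effect) pooling of study-level relative risks.
# Input: (RR, CI_low, CI_high) per study; the numbers below are placeholders.
import math

def pool_rr(studies):
    weights, log_rrs = [], []
    for rr, lo, hi in studies:
        log_rr = math.log(rr)
        # SE recovered from the 95% CI: (ln(hi) - ln(lo)) / (2 * 1.96)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        weights.append(1 / se ** 2)
        log_rrs.append(log_rr)

    pooled_log = sum(w * lr for w, lr in zip(weights, log_rrs)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    rr = math.exp(pooled_log)
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    return rr, ci

# Hypothetical three-study example
print(pool_rr([(0.70, 0.50, 0.98), (0.75, 0.55, 1.02), (0.68, 0.45, 1.03)]))
```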
