Similar Articles
20 similar articles found (search time: 31 ms)
1.
Hepatitis C virus (HCV) infection has been the most common etiology in HCC-related liver transplantation (LT). Since 2014, direct-acting antivirals (DAAs) have dramatically improved HCV cure rates. We aimed to study the changing pattern of etiologies and the impact on outcomes in HCC-related LT according to HCV treatment era through retrospective analysis of the Scientific Registry of Transplant Recipients (SRTR) database (1987-2017). A total of 27 855 HCC-related liver transplants were performed (median age 59 years, 77% male). In the DAA era (2014-2017), there has been a 14.6% decrease in LT for HCV-related HCC; however, HCV remains the most common etiology, at 50% of cases. In the same era, there has been a 50% increase in LT for NAFLD-related HCC. Overall survival was significantly worse for HCV-related HCC compared to NAFLD-related HCC during the pre-DAA era (2002-2013; P = .031), but these differences disappeared in the DAA era. In addition, HCV patients had a significant improvement in survival when comparing the DAA era with the IFN era (P < .001). Independent predictors of survival differed between the pre-DAA era (HCV, AFP, diabetes) and the DAA era (tumor size). HCV-related HCC continues to be the main indication for LT in the DAA era, but patients' survival has significantly improved and is comparable to that of NAFLD-related HCC.

2.
The benefit of direct-acting antivirals (DAAs) for hepatitis C virus (HCV) on clinical outcomes is unclear. We examined temporal trends in liver transplant (LT) listings, receipt of LT, re-LT, and survival between the pre-DAA (2009-2012) and DAA (2013-2016) eras using the UNOS database. Of 32 319 first adult LTs, 15 049 (47%) were performed for HCV. Listings, first LTs, and re-LTs for HCV decreased by 23%, 20%, and 21%, respectively, in the DAA era compared to the pre-DAA era (P < 0.0001). One-year liver graft and patient survival among HCV LTs improved in the DAA era (90% vs. 86% and 92% vs. 88%, respectively, P < 0.0001). Non-HCV LTs showed no improvement in survival (89% vs. 89% and 92% vs. 92.4%, P = NS). On Cox regression, compared to non-HCV LTs in the DAA era, LT for HCV in the pre-DAA era had worse patient survival (HR 1.56 [1.04-2.35]). The outcome was similar for LTs for HCV in the DAA era and for non-HCV LTs in the pre-DAA era. The burden of HCV on the LT waitlist and in LT is declining in the DAA era, with improved posttransplant outcomes, more so in the later than the earlier DAA era. Our findings counter the recent Cochrane meta-analysis on DAA therapy and encourage studies examining HCV clinical outcomes outside the LT setting.

3.
Acute kidney injury (AKI) is common after lung transplantation, but molecular markers remain poorly studied. The endothelial activation markers soluble thrombomodulin (sTM), protein C, and plasminogen activator inhibitor-1 (PAI-1) are implicated in kidney microcirculatory injury in animal models of AKI. We tested the association of 6-hour postreperfusion plasma levels of these markers with posttransplant AKI severity in patients enrolled in the Lung Transplant Outcomes Group prospective cohort study at the University of Pennsylvania during two eras: 2004-2006 (n = 61) and 2013-2015 (n = 67). We defined AKI stage through postoperative day 5 using Kidney Disease: Improving Global Outcomes (KDIGO) creatinine criteria. We used multivariable ordinal logistic regression to determine the association of each biomarker with AKI, adjusted for primary graft dysfunction and extracorporeal life support. AKI occurred in 57 (45%) patients across both eras: 28 (22%) stage 1 and 29 (23%) stage 2-3. Higher sTM and lower protein C plasma levels were associated with AKI stage in each era and remained so in multivariable models utilizing both eras (sTM: OR 1.76 [95% CI 1.19-2.60] per standard deviation, P = .005; protein C: OR 0.54 per standard deviation, P = .003). We conclude that 6-hour postreperfusion plasma sTM and protein C levels are associated with early post-lung transplant AKI severity.
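The KDIGO creatinine criteria used above to stage AKI can be sketched as a small staging function. This is an illustrative simplification, not the study's actual implementation: it applies only the creatinine-ratio rules and the absolute ≥0.3 mg/dL rise for stage 1, and omits the urine-output criteria and the guideline's timing windows.

```python
def kdigo_aki_stage(baseline_scr, peak_scr, on_rrt=False):
    """Assign a KDIGO AKI stage from serum creatinine (mg/dL).

    Simplified sketch: creatinine-ratio criteria plus the absolute
    >=0.3 mg/dL rise for stage 1; urine-output criteria and the
    48-hour/7-day timing windows are omitted.
    """
    ratio = peak_scr / baseline_scr
    if on_rrt or ratio >= 3.0 or peak_scr >= 4.0:
        return 3  # stage 3: >=3x baseline, SCr >=4.0, or renal replacement
    if ratio >= 2.0:
        return 2  # stage 2: 2.0-2.9x baseline
    if ratio >= 1.5 or (peak_scr - baseline_scr) >= 0.3:
        return 1  # stage 1: 1.5-1.9x baseline or >=0.3 mg/dL rise
    return 0      # no AKI by creatinine criteria
```

A stage of 0 corresponds to the "no AKI" group in the ordinal regression above, with stages 1 and 2-3 as the higher outcome levels.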

4.
Direct-acting antiviral medications (DAAs) have revolutionized care for hepatitis C positive (HCV+) liver (LT) and kidney (KT) transplant recipients. Scientific Registry of Transplant Recipients registry data were integrated with national pharmaceutical claims (2007-2016) to identify HCV treatments before January 2014 (pre-DAA) and after (post-DAA), stratified by donor (D) and recipient (R) serostatus and payer. Pre-DAA, 18% of HCV+ LT recipients were treated within 3 years, without differences by donor serostatus or payer. Post-DAA, only 6% of D−/R+ recipients, 19.8% of D+/R+ recipients with public insurance, and 11.3% with private insurance were treated within 3 years (P < .0001). LT recipients treated for HCV pre-DAA experienced higher rates of graft loss (adjusted hazard ratio [aHR] 1.85, 95% CI 1.34-2.10, P < .0001) and death (aHR 1.68, 95% CI 1.47-1.91, P < .0001). Post-DAA, HCV treatment was not associated with death (aHR 0.67, 95% CI 0.34-1.32, P = .25) or graft failure (aHR 0.64, 95% CI 0.32-1.26, P = .20) in D+/R+ LT recipients. Treatment increased in D+/R+ KT recipients (5.5% pre-DAA vs 12.9% post-DAA) but did not differ by payer status. DAAs reduced the risk of death after D+/R+ KT by 57% (aHR 0.43, 95% CI 0.19-0.95, P = .04) and of graft loss by 46% (aHR 0.54, 95% CI 0.27-1.07, P = .08). HCV treatment with DAAs appears to improve HCV+ LT and KT outcomes; however, access to these medications appears limited in both LT and KT recipients.

5.
Although aortohepatic conduits (AHCs) provide an effective technique for arterialization in liver transplantation (LT) when the native recipient artery is unusable, various publications report higher occlusion rates and impaired outcomes compared to conventional anastomoses. This systematic review and meta-analysis investigates the published evidence on the outcomes and risks of AHCs in LT using bibliographic databases and following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Primary and secondary outcomes were artery occlusion and graft and patient survival. Twenty-three retrospective studies were identified, with a total of 22 113 LT patients, of whom 1900 (9%) received an AHC. An AHC was used in 33% of retransplantations. Early artery occlusion occurred in 7% (3%-16%) of patients with AHCs, compared to 2% (1%-3%) without a conduit (OR 3.70; 1.63-8.38; P = .001). The retransplantation rate after occlusion was not significantly different between the groups (OR 1.46; 0.67-3.18; P = .35). Graft (HR 1.38; 1.17-1.63; P < .001) and patient (HR 1.57; 1.12-2.20; P = .009) survival were significantly lower in the AHC group than in the nonconduit group. In contrast, graft survival in retransplantations was comparable (HR 1.00; 0.82-1.22; P = .986). Although AHCs provide an important rescue option when regular revascularization is not feasible during LT, transplant surgeons should be alert to the potential risk of inferior outcomes.
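Pooled odds ratios like the OR 3.70 (1.63-8.38) above are conventionally combined across studies by inverse-variance weighting on the log scale. A minimal fixed-effect sketch follows; this is not the paper's analysis code, and the 2×2 tables in the test are invented for illustration.

```python
import math

def log_or_and_se(a, b, c, d):
    """Log odds ratio and its standard error from a 2x2 table.

    a/b: events/non-events in the exposed group; c/d: in controls.
    All cells must be nonzero (no continuity correction applied).
    """
    lor = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return lor, se

def pooled_or_fixed(tables):
    """Fixed-effect (inverse-variance) pooled OR with a 95% CI."""
    num = den = 0.0
    for t in tables:
        lor, se = log_or_and_se(*t)
        w = 1.0 / se ** 2          # inverse-variance weight
        num += w * lor
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))
```

A random-effects version would add a between-study variance term to each weight, which widens the interval when studies disagree.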

6.
The average age of renal transplant recipients in the United States has increased over the past decade. The implications, however, have not been fully investigated. We explored predictors of success and demographic variables related to outcomes in elderly live donor transplantation. Retrospective analysis was performed using the UNOS database between 2001 and 2016. Donor characteristics and the graft failure rate of recipients above and below 70 years of age were compared across four eras: 2001-2004, 2005-2008, 2009-2012, and 2013-2016. There was a steady increase in average donor age from the first era to the fourth (40 to 44 years), which was more evident among the septuagenarian patients (43 to 50 years) (P < .001). The 2-year graft survival rate improved from 92% in the first era to 96% in the fourth era (P < .001), and this was also more prominent in the >70 population (87% to 93%) (P < .001). Recipients over 70 were more likely to be non-Hispanic white (80.1% vs 65.1%, P < .001) and male (70.1% vs 61.0%, P < .001). Their donors were more likely to be non-Hispanic white and female. Live donation in the elderly is justified based on graft and patient survival. However, racial and gender differences exist in septuagenarian recipients and their donors.

7.
We conducted this study using the updated 2005-2016 Organ Procurement and Transplantation Network database to assess clinical outcomes of retransplant after allograft loss as a result of BK virus-associated nephropathy (BKVAN). Three hundred forty-one patients had first graft failure as a result of BKVAN, whereas 13 260 had first graft failure from other causes. At a median follow-up of 4.70 years after the second kidney transplant, death-censored graft survival at 5 years for the second renal allograft was 90.6% for the BK group and 83.9% for the non-BK group. In adjusted analysis, there was no difference in death-censored graft survival (P = .11), acute rejection (P = .49), or patient survival (P = .13) between the 2 groups. When we further compared death-censored graft survival among the specific causes of first graft failure, the BK group had better graft survival than patients with prior allograft failure as a result of acute rejection (P < .001) or disease recurrence (P = .003), but survival was similar to that of patients with chronic allograft nephropathy (P = .06) and other causes (P = .05). The survival advantage of the BK group over the acute rejection and disease recurrence groups remained after adjusting for potential confounders. History of allograft loss as a result of BKVAN should not be a contraindication to retransplant among candidates who are otherwise acceptable.

8.
There is a paucity of data on long-term outcomes following visceral transplantation in the contemporary era. This is a single-center retrospective analysis of all visceral allograft recipients who underwent transplant between November 2003 and December 2013 with at least 3 years of follow-up data. Clinical data from a prospectively maintained database were used to assess outcomes, including patient and graft survival. Of 174 recipients, 90 were adults and 84 were pediatric patients. Types of visceral transplant were isolated intestinal transplant (56.3%), combined liver-intestinal transplant (25.3%), multivisceral transplant (16.1%), and modified multivisceral transplant (2.3%). Three-, 5-, and 10-year overall patient survival was 69.5%, 66%, and 63%, respectively, while 3-, 5-, and 10-year overall graft survival was 67%, 62%, and 61%, respectively. In multivariable analysis, significant predictors of survival included pediatric recipient status (P = .001), donor/recipient weight ratio <0.9 (P = .008), absence of severe acute rejection episodes (P = .021), cold ischemia time <8 hours (P = .014), and shorter hospital stay (P = .0001). In conclusion, visceral transplantation remains a good option for the treatment of end-stage intestinal failure with parenteral nutrition complications. Proper graft selection, shorter cold ischemia time, and improved immunosuppression regimens could significantly improve long-term survival.

9.
Socioeconomic deprivation is associated with poorer outcomes in chronic diseases. The aim of this study was to investigate the effect of socioeconomic deprivation on outcomes following pancreas transplantation among patients transplanted in England. We included all 1270 pancreas recipients transplanted between 2004 and 2012. We used the English Index of Multiple Deprivation (EIMD) score to assess the influence of socioeconomic deprivation on patient and pancreas graft survival; higher scores indicate greater deprivation. Median EIMD score was 18.8, 17.7, and 18.1 in patients who received simultaneous pancreas and kidney (SPK), pancreas after kidney (PAK), and pancreas transplant alone (PTA), respectively (P = .56). Pancreas graft survival (censored for death) was dependent on donor age (P = .08), cold ischemic time (CIT; P = .0001), type of pancreas graft (SPK vs. PAK or PTA, P = .0001), and EIMD score (P = .02). The 5-year pancreas graft survival of the most deprived patient quartile was 62%, compared to 75% among the least deprived (P = .013); the difference was especially evident in the SPK group. EIMD score also correlated with patient survival (P = .05). Examining the individual domains of deprivation, we found that the "Environment" (P = .037) and "Health and Disability" (P = .035) domains had a significant impact on pancreas graft survival. Socioeconomic deprivation, as expressed by the EIMD, is an independent factor for pancreas graft and patient survival.

10.
Limited data exist regarding the impact of donation after circulatory death (DCD) allografts on outcomes following liver transplantation in fulminant hepatic failure (FHF). Utilizing the Scientific Registry of Transplant Recipients (SRTR), we compared outcomes after DCD in FHF to donation after brain death (DBD) in FHF and DCD in non-FHF over a 15-year period. Primary outcome measures were graft and patient survival. A total of 117, 3437, and 4379 recipients underwent DCD-FHF, DBD-FHF, and DCD-non-FHF transplantation, respectively. One-year graft survival in DCD-FHF was inferior to DBD-FHF (72.9% vs. 83.8%, p = .002) but comparable to DCD-non-FHF (72.9% vs. 82.7%, p = .23). However, 3- and 5-year graft survival in DCD-FHF was comparable to DBD-FHF (67.9% vs. 77.6%, p = .63; 57.8% vs. 73.2%, p = .27) and DCD-non-FHF (67.9% vs. 72.9%, p = .44; 57.8% vs. 66.6%, p = .06). One-, 3-, and 5-year patient survival was also comparable among the three groups. Graft and patient survival in DCD-FHF improved over the study period. Multivariable analysis identified recipient age, male gender, African American ethnicity, donor age, and cold ischemia time as predictors of graft and patient survival in FHF, while DCD status was predictive of graft survival only. Long-term graft and patient survival in DCD-FHF are comparable to DBD-FHF and DCD-non-FHF. Consideration of DCD in FHF could help expand the donor pool for this subset of critically ill patients.

11.
Full-left/full-right split liver transplantation (FSLT) for adult recipients may increase the availability of liver grafts, reduce waitlist time, and benefit recipients with below-average body weight. However, FSLT may lead to impaired graft and patient survival. This study aims to assess outcomes after FSLT. Five databases were searched to identify studies concerning FSLT. Incidences of complications and graft and patient survival were assessed. Discrete data were pooled with random-effects models. Graft and patient survival after FSLT were compared with whole liver transplantation (WLT) according to the inverse variance method. Vascular complications were reported in 25/273 patients after FSLT (pooled proportion: 6.9%, 95% CI: 3.1-10.7%, I2: 36%). Biliary complications were reported in 84/308 patients after FSLT (pooled proportion: 25.6%, 95% CI: 19-32%, I2: 44%). Pooled proportions of graft and patient survival after 3 years of follow-up were 72.8% (95% CI: 67.2-78.5%, n = 231) and 77.3% (95% CI: 66.7-85.8%, n = 331), respectively. Compared with WLT, FSLT was associated with increased graft loss (pooled HR: 2.12, 95% CI: 1.24-3.61, P = 0.006, n = 189) and patient mortality (pooled HR: 1.81, 95% CI: 1.17-2.81, P = 0.008, n = 289). FSLT was associated with high incidences of vascular and biliary complications. Nevertheless, long-term patient and graft survival appear acceptable and justify the transplant benefit in selected patients.
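The random-effects pooling of proportions with I² reported above is commonly done with the DerSimonian-Laird estimator. A minimal sketch follows; note this pools raw proportions for simplicity, whereas published analyses often work on a logit or arcsine scale, and the event counts in the test are hypothetical, not the study's data.

```python
import math

def dersimonian_laird(events, totals):
    """DerSimonian-Laird random-effects pooled proportion with I^2 (%).

    Pools raw proportions with binomial within-study variance; assumes
    no study has 0 or 100% events (which would give zero variance).
    """
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]  # within-study variance
    w = [1 / vi for vi in v]                             # fixed-effect weights
    fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - fixed) ** 2 for wi, pi in zip(w, p))  # Cochran's Q
    df = len(p) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_star = [1 / (vi + tau2) for vi in v]               # random-effects weights
    pooled = sum(wi * pi for wi, pi in zip(w_star, p)) / sum(w_star)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # heterogeneity
    return pooled, i2
```

When the studies are homogeneous (Q ≤ df), tau² and I² collapse to zero and the result equals the fixed-effect pooled proportion.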

12.
The United States opioid use epidemic over the past decade has coincided with an increase in hepatitis C virus (HCV) positive donors. Using propensity score matching and the Organ Procurement and Transplantation Network data files from January 2015 to June 2019, we analyzed the short-term outcomes of adult deceased donor kidney transplants in HCV-uninfected recipients from two distinct groups of HCV-positive donors (HCV seropositive, nonviremic, n = 352; viremic, n = 196) compared to those performed using HCV-uninfected donors (n = 36 934). Compared to the reference group, transplants performed using HCV seropositive, nonviremic and viremic donors had a lower proportion of delayed graft function (35.2% vs 18.9%, P < .001 [HCV seropositive, nonviremic donors] and 36.2% vs 16.8%, P < .001 [HCV viremic donors]). Recipients of HCV viremic donors had better allograft function at 6 months posttransplant (eGFR 54.1 vs 68.3 mL/min/1.73 m2, P = .004). Furthermore, there was no statistically significant difference in overall graft failure risk at 12 months posttransplant by propensity score-matched multivariable Cox proportional hazards analysis (HR = 0.60, 95% CI 0.23 to 1.29 [HCV seropositive, nonviremic donors] and HR = 0.85, 95% CI 0.25 to 2.96 [HCV viremic donors]). Further studies are required to determine the long-term outcomes of these transplants and address unanswered questions regarding the use of HCV viremic donors.
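Propensity score matching as used above typically fits a model for donor-group membership and then pairs recipients with similar scores. A minimal greedy 1:1 nearest-neighbor matcher on precomputed scores is sketched below; the 0.05 caliper, the IDs, and the greedy strategy are illustrative assumptions, not details from the study.

```python
def greedy_caliper_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated/controls: lists of (id, score) tuples. Each control is used
    at most once; a treated unit whose nearest available control lies
    farther away than the caliper is left unmatched.
    """
    pairs = []
    available = dict(controls)  # id -> score, shrinks as controls are used
    for tid, ts in sorted(treated, key=lambda x: x[1]):
        if not available:
            break
        # nearest available control by absolute score distance
        cid = min(available, key=lambda c: abs(available[c] - ts))
        if abs(available[cid] - ts) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs
```

Matching quality is then usually checked by comparing covariate balance (e.g., standardized mean differences) between the matched groups before estimating treatment effects.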

13.
We aimed to evaluate the influence of urological complications occurring within the first year after kidney transplantation on long-term patient and graft outcomes, and to examine the impact of the management approach to ureteral strictures on long-term graft function. We collected data on urological complications occurring within the first year posttransplant. Graft survival, patient survival, and rejection rates were compared between recipients with and without urological complications. Male recipient gender, delayed graft function, and donor age were significant risk factors for urological complications after kidney transplantation (P < .05). Death-censored graft survival analysis showed that, among the complications, only ureteral strictures had a negative impact on long-term graft survival (P = .0009). Death-censored graft survival was significantly shorter in kidney recipients managed initially with a minimally invasive approach than in recipients with no stricture (P = .001). However, graft survival was not statistically different in patients managed initially with open surgery (P = .47). Ureteral strictures following kidney transplantation appear to have a strong negative impact on long-term graft survival. Our analysis suggests that kidney recipients with ureteral stricture should be managed initially with open surgery, which was associated with better long-term graft survival.

14.
Pediatric kidney transplant outcomes associated with expanded-criteria donor (ECD) and high Kidney Donor Profile Index (KDPI) kidneys are unknown. We reviewed Scientific Registry of Transplant Recipients data from 1987-2017 to identify 96 ECD and 92 KDPI >85 kidney recipients (<18 years). Using propensity scores, we created comparison groups of 375 non-ECD and 357 KDPI ≤85 recipients for comparison with the ECD and KDPI >85 transplants, respectively. We used Cox regression for patient/graft survival and a sequential Cox approach for the survival benefit of ECD and KDPI >85 transplantation vs remaining on the waitlist. After adjustment, ECD recipients were at significantly increased risk of graft failure (adjusted hazard ratio [aHR] = 1.6; P = .001) but not of mortality (aHR = 1.33; P = .15) compared with non-ECD recipients. We observed no survival benefit of ECD transplants vs remaining on the waitlist (aHR = 1.05; P = .83). We found no significant difference in graft failure (aHR = 1.27; P = .12) or mortality (aHR = 1.41; P = .13) risks between KDPI >85 and KDPI ≤85 recipients. However, KDPI >85 transplants were associated with a survival benefit vs remaining on the waitlist (aHR = 0.41; P = .01). ECD transplantation in children is associated with a high graft loss risk and no survival benefit, whereas KDPI >85 transplantation is associated with a survival benefit for children vs remaining on the waitlist.

15.
Increased risk donors (IRDs) may inadvertently transmit blood-borne viruses to organ recipients through transplant. Rates of IRD kidney transplants in children and the associated outcomes are unknown. We used the Scientific Registry of Transplant Recipients to identify pediatric deceased donor kidney transplants performed in the United States between January 1, 2005 and December 31, 2015. We used Cox regression analysis to compare patient and graft survival between IRD and non-IRD recipients, and a sequential Cox approach to evaluate the survival benefit after IRD transplant compared with remaining on the waitlist and never accepting an IRD kidney. We studied 328 recipients with and 4850 without IRD transplants. Annual IRD transplant rates ranged from 3.4% to 13.2%. IRDs were more likely to be male (P = .04), black (P < .001), and to have died from head trauma (P = .006). IRD recipients had a higher mean cPRA (0.085 vs 0.065, P = .02). After multivariate adjustment, patient survival after IRD transplant was significantly higher than with remaining on the waitlist (adjusted hazard ratio [aHR]: 0.48, 95% CI: 0.26-0.88, P = .018); however, patient (aHR: 0.93, 95% CI: 0.54-1.59, P = .79) and graft survival (aHR: 0.89, 95% CI: 0.70-1.13, P = .32) were similar between IRD and non-IRD recipients. We recommend that IRDs be considered for transplant in children.

16.
The current Banff scoring system was not developed to predict graft loss and may not be ideal for use in clinical trials aimed at improving allograft survival. We hypothesized that scoring histologic features of digitized renal allograft biopsies using a continuous, more objective, computer-assisted morphometric (CAM) system might be more predictive of graft loss. We performed a nested case-control study in kidney transplant recipients with a surveillance biopsy obtained 5 years after transplantation. Patients who developed death-censored graft loss (n = 67) were matched 2:1 on age, gender, and follow-up time to controls with surviving grafts (n = 134). The risk of graft loss was compared between CAM-based models and a model based on Banff scores. Both Banff and CAM identified chronic lesions associated with graft loss (chronic glomerulopathy, arteriolar hyalinosis, and mesangial expansion). However, the CAM-based models predicted graft loss better than the Banff-based model, both overall (c-statistic 0.754 vs 0.705, P < .001) and in biopsies without chronic glomerulopathy (c-statistic 0.738 vs 0.661, P < .001), where CAM identified more features predictive of graft loss (% luminal stenosis and % mesangial expansion). Using 5-year renal allograft surveillance biopsies, CAM-based models predict graft loss better than Banff models and might be developed into biomarkers for future clinical trials.
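The c-statistics compared above measure discrimination: the probability that a randomly chosen graft-loss case receives a higher predicted risk than a randomly chosen control. For a binary outcome this can be computed directly from case-control score pairs, as in this minimal sketch (the matched design and the model fitting themselves are omitted):

```python
def c_statistic(scores, labels):
    """Concordance statistic (c-statistic / AUC) for a binary outcome.

    Fraction of (event, non-event) pairs in which the event carries the
    higher predicted risk; tied scores count one half.
    """
    cases = [s for s, y in zip(scores, labels) if y == 1]
    ctrls = [s for s, y in zip(scores, labels) if y == 0]
    if not cases or not ctrls:
        raise ValueError("need both events and non-events")
    concordant = sum(1.0 if c > k else 0.5 if c == k else 0.0
                     for c in cases for k in ctrls)
    return concordant / (len(cases) * len(ctrls))
```

A value of 0.5 is chance-level discrimination and 1.0 is perfect separation, which is why the gap between 0.754 and 0.705 above represents a meaningful improvement.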

17.
Prior single-center and registry studies have shown that living donor liver transplantation (LDLT) decreases waitlist mortality and offers superior patient survival over deceased donor liver transplantation (DDLT). The aim of this study was to compare outcomes for adult LDLT and DDLT via systematic review. A meta-analysis was conducted to examine patient survival, graft survival, MELD, waiting time, technical complications, and postoperative infections. Out of 8600 abstracts, 19 international studies comparing adult LDLT and DDLT published between January 2005 and December 2017 were included. U.S. outcomes were analyzed using registry data. Overall, 4571 LDLT and 66 826 DDLT patients were examined. LDLT was associated with lower mortality at 1, 3, and 5 years posttransplant (5-year HR 0.87 [95% CI 0.81-0.93], p < .0001), similar graft survival, lower MELD at transplant (p < .04), shorter waiting time (p < .0001), and lower risk of rejection (p = .02), with a higher risk of biliary complications (OR 2.14, p < .0001). No differences were observed in rates of hepatic artery thrombosis. In meta-regression analysis, the MELD difference was significantly associated with posttransplant survival (R2 0.56, p = .02). In conclusion, LDLT is associated with improved patient survival, shorter waiting time, and lower MELD at LT, despite a higher risk of biliary complications that did not affect posttransplant survival.

18.
The impact of cytomegalovirus (CMV) serostatus on kidney transplant outcomes in an era when CMV prophylactic and preemptive strategies are used routinely is not clearly established. Using United Network for Organ Sharing/Organ Procurement and Transplantation Network data, recipients of a first deceased donor kidney transplant (≥18 years, 2010-2015) were stratified into 4 groups in the main cohort: CMV-seronegative donor (D−)/CMV-seronegative recipient (R−), CMV-seropositive donor (D+)/R−, D+/CMV-seropositive recipient (R+), and D−/R+. In a paired kidney cohort, we identified 2899 pairs of D− kidney transplants with discordant recipient serostatus (D−/R− vs D−/R+) and 4567 pairs of D+ kidney transplants with discordant recipient serostatus (D+/R− vs D+/R+). In the main cohort, D+/R− was associated with a higher risk of graft failure (hazard ratio [HR] = 1.17, P = .01), all-cause mortality (HR = 1.18, P < .001), and infection-related mortality (HR = 1.38, P = .03) compared with D−/R−. In the paired kidney analysis, D+/R− was an independent risk factor for all-cause mortality (HR = 1.21, P = .003) and infection-related mortality (HR = 1.47, P = .04) compared with D+/R+. There was no difference in graft loss between D+/R− and D+/R+. CMV mismatch remains an independent risk factor for graft loss and patient mortality, and the negative impact of D+/R− serostatus on mortality persists after fully matching for donor factors.

19.
This study compared pancreas graft survival rates in two groups of pancreas and kidney transplant recipients prospectively randomized to treatment with either sirolimus or MMF. From 2002 to 2013, 238 type 1 diabetic recipients with end-stage kidney disease were randomized 1:1 to sirolimus or MMF treatment. Noncensored pancreas survival at 5 years was 76.4% and 71.6% for the sirolimus and MMF groups, respectively (P > .05). Death-censored pancreas survival was better in the sirolimus group (P = .037). After removal of early graft losses, pancreas survival did not differ between groups (MMF 83.1% vs sirolimus 91.6%, P = .11). Nonsignificantly more grafts were lost to rejection in the MMF group (10 vs 5; P = .19). Cumulative 5-year patient survival was 96% in the MMF group and 91% in the sirolimus group (P > .05). Five-year cumulative noncensored kidney graft survival rates did not differ statistically (85.6% in the sirolimus group and 88.8% in the MMF group). Recipients treated with MMF had significantly more episodes of gastrointestinal bleeding (7 vs 0, P = .007). More recipients in the sirolimus group required corrective surgery for incisional hernias (21 vs 12, P = .019). ClinicalTrials.gov No.: NCT03582878.

20.
Revisiting posttransplant survival for hepatitis C virus recipients
Hepatitis C virus (HCV) is the most frequent indication for adult liver transplantation in Europe and the United States. Posttransplantation HCV recurrence is universal, and previous studies have reported reduced survival in comparison to non-HCV recipients. We report the findings of a comparative survival analysis of adult recipients with HCV (n = 12 434) from two eras using the United Network for Organ Sharing database. Cox regression modeling was used to compare the eras (A: 1994-1998 and B: 1999-2003). The 1-, 3-, and 5-year adjusted graft survival rates for era A (n = 5215) and era B (n = 7519) were 80%, 69%, and 62% versus 84%, 72%, and 64%, respectively (P < .001), whereas the 1-, 3-, and 5-year adjusted patient survival rates were 86%, 77%, and 70% for era A versus 87%, 78%, and 70% for era B (P = .79). This comparative analysis of posttransplant outcomes for HCV recipients suggests an improvement in graft survival in the later years.

