Similar Articles
20 similar articles found (search time: 15 ms)
1.
BACKGROUND: As more expanded-criteria organ donors are used to bridge the widening gap between organ supply and demand, non-heart-beating (NHB) donors will become increasingly important. The purpose of this study was to analyze renal transplant outcomes using this source of cadaveric (CAD) organs and compare the results with heart-beating organ sources. METHODS: Data from 98,698 adult CAD renal transplant recipients and 34,531 living donor renal transplant recipients registered in the U.S. Renal Data System database between January 1993 and June 2000 were analyzed. Kaplan-Meier survival curves were used to compare graft and patient survival rates between NHB, CAD, and living donor transplant recipients. Cox proportional hazards models were used to identify risk factors for NHB donor recipients, while adjusting for potential confounding variables. RESULTS: Recipients of NHB donor organs experienced nearly twice the incidence of delayed graft function (DGF) compared with heart-beating donors (42.4% vs. 23.3%, respectively). NHB donor transplants experienced comparable allograft survival when compared with CAD transplants at 6 years (73.2% vs. 72.5%, respectively; P=NS); patient survival was greater at 6 years for NHB compared with CAD renal transplant recipients (80.9% vs. 77.8%, respectively; P=NS). Significant factors for allograft loss for NHB donor organ recipients included the following: organ used for repeat transplants; DGF; donor age older than 35 years; and head trauma as a cause of initial injury (relative risk 2.74, 1.90, 1.78, and 1.41, respectively). CONCLUSIONS: Although exhibiting elevated DGF rates, allograft and patient survival rates of transplants from NHB donor sources are equivalent to those from conventional CAD sources. Donor age, recipient transplant number, female recipient, mechanism of injury, and DGF were the most pertinent variables leading to poor outcomes.
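The Kaplan-Meier comparison described above can be sketched with the product-limit estimator; the function and toy cohort below are illustrative only, not the registry data:

```python
# Kaplan-Meier product-limit estimator in plain Python. At each
# distinct event time t, survival is multiplied by (1 - d_t / n_t),
# where d_t is the number of graft losses at t and n_t the number
# still at risk. Censored subjects leave the risk set without an event.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    times  : follow-up time for each subject
    events : 1 if graft loss observed at that time, 0 if censored
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(e for tt, e in pairs if tt == t)
        n_with_t = sum(1 for tt, _ in pairs if tt == t)
        if deaths > 0:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= n_with_t
        i += n_with_t
    return curve

# toy cohort: times in years, 1 = graft failure, 0 = censored
print(kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 1]))
```

Groups (e.g. NHB vs. heart-beating donors) would each get their own curve, compared with a log-rank test in the actual analysis.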

2.
The Survival Benefit of Liver Transplantation
Demand for liver transplantation continues to exceed donor organ supply. Comparing recipient survival to that of comparable candidates without a transplant can improve understanding of transplant survival benefit. Waiting list and post-transplant mortality was studied among a cohort of 12 996 adult patients placed on the waiting list between 2001 and 2003. Time-dependent Cox regression models were fitted to determine relative mortality rates for candidates and recipients. Overall, deceased donor transplant recipients had a 79% lower mortality risk than candidates (HR = 0.21; p < 0.001). At Model for End-stage Liver Disease (MELD) 18-20, mortality risk was 38% lower (p < 0.01) among recipients compared to candidates. Survival benefit increased with increasing MELD score; at the maximum score of 40, recipient mortality risk was 96% lower than that for candidates (p < 0.001). In contrast, at lower MELD scores, recipient mortality risk during the first post-transplant year was much higher than for candidates (HR = 3.64 at MELD 6-11, HR = 2.35 at MELD 12-14; both p < 0.001). Liver transplant survival benefit at 1 year is concentrated among patients at higher risk of pre-transplant death. Futile transplants among severely ill patients are not identified under current practice. With 1 year post-transplant follow-up, patients at lower risk of pre-transplant death do not have a demonstrable survival benefit from liver transplant.
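The abstract reports hazard ratios as percent changes in mortality risk (e.g. HR = 0.21 corresponds to 79% lower risk). A one-line sketch of that arithmetic, with a helper name of our own choosing:

```python
# Converting a hazard ratio to the percent change in mortality rate
# it implies: HR < 1 means lower risk, HR > 1 means higher risk.

def pct_risk_change(hazard_ratio):
    """Percent change in mortality rate implied by a hazard ratio."""
    return (hazard_ratio - 1.0) * 100.0

print(round(pct_risk_change(0.21)))  # -79: 79% lower risk overall for recipients
print(round(pct_risk_change(3.64)))  # 264: early post-transplant excess risk at MELD 6-11
```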

3.
A prognostic index to predict survival after liver transplantation could address several clinical needs. Here, we devised a scoring system that predicts recipient survival after pediatric liver transplantation. We used univariate and multivariate analysis on data from 4565 pediatric liver transplant recipients and identified independent recipient and donor risk factors for posttransplant mortality at 3 months. Multiple imputation was used to account for missing variables. We identified five factors as significant predictors of recipient mortality after pediatric liver transplantation: previous transplantation (OR 5.88, CI 2.88–12.01 for two previous transplants; OR 2.54, CI 1.75–3.68 for one), life support (OR 3.68, CI 2.39–5.67), renal insufficiency (OR 2.66, CI 1.84–3.84), recipient weight under 6 kilograms (OR 1.67, CI 1.12–2.36), and cadaveric technical variant allograft (OR 1.38, CI 1.03–1.83). The Survival Outcomes Following Pediatric Liver Transplant score assigns weighted risk points to each of these factors in a scoring system to predict 3-month recipient survival after liver transplantation with a C-statistic of 0.74. Although the index is quite accurate when compared with other posttransplant survival models, we would not advocate its individual clinical application.
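The C-statistic of 0.74 quoted above summarizes discrimination: for a binary 3-month outcome it is the probability that a randomly chosen recipient who died scored higher than one who survived, with score ties counted as half-concordant. A minimal computation on made-up scores (not the study data):

```python
# Concordance (C-statistic) of risk scores against a binary outcome,
# computed by enumerating all (died, survived) pairs.

def c_statistic(scores, died):
    """Fraction of died/survived pairs ranked correctly by the score."""
    pos = [s for s, d in zip(scores, died) if d]      # recipients who died
    neg = [s for s, d in zip(scores, died) if not d]  # recipients who survived
    if not pos or not neg:
        raise ValueError("need both outcomes to compute concordance")
    concordant = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                concordant += 1.0
            elif p == n:
                concordant += 0.5  # ties count half
    return concordant / (len(pos) * len(neg))

# toy risk-point totals and 3-month deaths
print(c_statistic([9, 7, 5, 5, 2], [1, 0, 1, 0, 0]))  # 0.75
```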

4.
The growing imbalance between the number of cadaveric organ donors and recipients has led to an increasing use of high-risk donors as an option to expand the donor pool. The aim of this study was to evaluate our experience with the use of older liver (donor >50 years of age) allografts. We reviewed the medical records, postreperfusion biopsies, and laboratory results of the 393 patients who underwent orthotopic liver transplantation between 1986 and 1997. The outcome of the 61 patients who received older livers (OL) was compared to that of the other 332 recipients. Increasing use of OL was evident from 1992 onwards. Recipients of OL were older than recipients of younger livers (YL; p<0.001) and more commonly had underlying chronic viral hepatitis (CVH) or fulminant hepatic failure (p<0.05). Patient and allograft survival were only slightly lower in recipients of OL versus YL (p=NS). Although postreperfusion biopsies showed more damage in OL than YL allografts (p<0.05), this was not associated with increased primary graft failure. OL allografts can be transplanted with acceptable results into recipients without the concern of early allograft loss. SUMMARY OF ARTICLE: This report of one centre's experience with 61 recipients of older donor liver allografts identifies recipient factors that may also have a negative impact on allograft outcome. These factors include a diagnosis of either CVH or fulminant hepatic failure at the time of transplantation. Postreperfusion biopsies of older donor allografts tend to show more damage, but this is not associated with primary non-function.

5.
The restoration of kidney function by transplantation improves the common finding of chronic inflammation in patients with end-stage renal disease (ESRD). The C-reactive protein (CRP) level is a reliable marker of inflammation in renal transplant recipients. We analyzed the predictive value of posttransplant CRP surges on renal allograft survival among 141 ESRD patients who underwent renal transplantation between May 1999 and September 2001 at our institution. The transplants were from 27 cadaveric and 114 living donors. The subjects' demographic, clinical, and laboratory data were recorded. The renal transplant recipients were divided into three groups defined by the pattern of serum CRP elevation: a normal, intermittently high, or consistently high serum CRP concentration. Renal allograft survival rates were 90.0% among recipients with normal serum CRP concentrations, 72.6% among those with intermittently high concentrations, and 11.1% in those with consistently high concentrations. A Cox regression analysis of factors that affect allograft survival showed that acute rejection, advanced recipient age, and consistently high serum CRP concentrations were associated with a high risk of renal allograft loss. Intermittent elevations in the serum CRP level were not associated with an increased risk of allograft loss, according to the Cox regression model. We concluded that consistently high serum CRP concentrations in renal allograft recipients showed a high negative predictive value for renal allograft survival. In recipients who exhibited an ongoing inflammatory process in the 5-year posttransplant period, additional efforts are necessary to manage inflammation and therefore prolong renal allograft survival.
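The three-way grouping above (normal, intermittently high, consistently high) can be sketched as a simple classifier over serial CRP measurements. The 5 mg/L threshold below is an assumed, illustrative cutoff; the abstract does not state the study's actual value:

```python
# Classify a recipient's serial CRP measurements into the three
# patterns used by the study. CRP_CUTOFF is HYPOTHETICAL, chosen
# only to make the sketch runnable.

CRP_CUTOFF = 5.0  # mg/L, assumed for illustration

def crp_pattern(serial_crp):
    """Return 'normal', 'intermittently high', or 'consistently high'."""
    high = [c > CRP_CUTOFF for c in serial_crp]
    if not any(high):
        return "normal"
    if all(high):
        return "consistently high"
    return "intermittently high"

print(crp_pattern([1.2, 2.0, 3.1]))   # normal
print(crp_pattern([8.0, 2.0, 9.5]))   # intermittently high
print(crp_pattern([7.4, 12.0, 6.1]))  # consistently high
```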

6.
The use of extended criteria liver donors (ECD) is controversial, especially in the setting of retransplantation. The aims of this study are to investigate the effects of ECD grafts on retransplantation and to develop a predictive mortality index in liver retransplantation based on the previously established donor risk index. The United Network for Organ Sharing (UNOS) liver transplant dataset was analyzed for all adult, non-status 1, liver retransplantations occurring in the United States since February 2002. All donors were categorized for multiple characteristics of ECD, and, using multivariate survival models, a retransplant donor risk index (ReTxDRI) was developed. A total of 1327 retransplants were analyzed. There were 611 (46%) recipients who received livers with at least one ECD criterion. The use of ECD grafts in recipients with HCV did not incur worse survival than the non-ECD grafts. The addition of the cause of recipient graft failure to the donor risk index formed the ReTxDRI. After adjusting for multiple recipient factors, the ReTxDRI was predictive of overall recipient survival and was a strongly independent predictor of death after retransplantation (HR 2.49, 95% CI 1.89–3.27, p < 0.0001). The use of the ReTxDRI can improve recipient and donor matching and help to optimize posttransplant survival in liver retransplantation.

7.
Cholesterol atheroembolic renal disease is a rare cause of renal allograft dysfunction. Two recipients of cadaveric kidney transplants from the same donor are discussed, with presumed graft failure due to cholesterol emboli of donor origin. A review of the literature summarizes the reported cases in renal transplant recipients. While cholesterol embolization of presumed donor origin seems to have a poor renal outcome, cholesterol emboli originating in the recipient have a more favorable prognosis. As donors and recipients of increasing age or prominent atherosclerosis are accepted for transplantation, cholesterol atheroembolic renal disease may become more prevalent and should be considered in patients with renal allograft dysfunction.

8.
We aimed to identify recipient, donor and transplant risk factors associated with graft failure and patient mortality following donation after cardiac death (DCD) liver transplantation. These estimates were derived from Scientific Registry of Transplant Recipients data from all US liver-only DCD recipients between September 1, 2001 and April 30, 2009 (n = 1567) and Cox regression techniques. Three years post-DCD liver transplant, 64.9% of recipients were alive with functioning grafts, 13.6% required retransplant and 21.6% died. Significant recipient factors predictive of graft failure included: age ≥ 55 years, male sex, African-American race, HCV positivity, metabolic liver disorder, transplant MELD ≥ 35, hospitalization at transplant and the need for life support at transplant (all, p ≤ 0.05). Donor characteristics included age ≥ 50 years and weight >100 kg (all, p ≤ 0.005). Each hour increase in cold ischemia time (CIT) was associated with a 6% higher graft failure rate (HR 1.06, p < 0.001). Donor warm ischemia time ≥ 35 min significantly increased graft failure rates (HR 1.84, p = 0.002). Recipient predictors of mortality were age ≥ 55 years, hospitalization at transplant and retransplantation (all, p ≤ 0.006). Donor weight >100 kg and CIT also increased patient mortality (all, p ≤ 0.035). These findings are useful for transplant surgeons creating DCD liver acceptance protocols.
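A per-hour hazard ratio like the HR 1.06 for cold ischemia time compounds multiplicatively under the proportional hazards assumption: h extra hours imply a combined HR of 1.06**h. A small sketch of that compounding (function name is ours):

```python
# Combined hazard ratio for additional hours of cold ischemia,
# assuming a log-linear per-hour effect as the reported HR implies.

def cit_hazard_ratio(extra_hours, per_hour_hr=1.06):
    """Combined HR for `extra_hours` of added cold ischemia time."""
    return per_hour_hr ** extra_hours

print(round(cit_hazard_ratio(5), 3))   # about a 34% higher failure rate
print(round(cit_hazard_ratio(12), 3))  # roughly double the failure rate
```

The second line illustrates why prolonged CIT matters clinically: at 6% per hour, twelve extra hours roughly doubles the graft failure rate.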

9.
BACKGROUND: Mechanisms by which delayed allograft function reduces renal allograft survival are poorly understood. This study evaluated the relationship of delayed allograft function to acute rejection and long-term survival of cadaveric allografts. METHODS: 338 recipients of cadaveric allografts were followed until death, resumption of dialysis, retransplantation, loss to follow-up, or the study's end, whichever came first. Delayed allograft function was defined by dialysis during the first week following transplantation. Multivariate Cox proportional hazards survival analysis was used to assess the relationship of delayed allograft function to rejection and allograft survival. RESULTS: Delayed allograft function, recipient age, preformed reactive antibody levels, prior kidney transplantation, recipient race, rejection during the first 30 days, and rejection subsequent to 30 days following transplantation were predictive of allograft survival in multivariate survival models. Delayed allograft function was associated with shorter allograft survival after adjustment for acute rejection and other covariates (relative rate of failure [RR] = 1.72 [95% CI, 1.07, 2.76]). The adjusted RR of allograft failure associated with any rejection during the first 30 days was 1.99 (1.23, 3.21), and for rejection subsequent to the first 30 days was 3.53 (2.08, 6.00). The impact of delayed allograft function did not change substantially (RR = 1.84 [1.15, 2.95]) in models not controlling for acute rejection. These results were stable among several subgroups of patients and using alternative definitions of allograft survival and delayed allograft function. CONCLUSIONS: This study demonstrates that delayed allograft function and acute allograft rejection have important independent and deleterious effects on cadaveric allograft survival. These results suggest that the effect of delayed allograft function is mediated, in part, through mechanisms not involving acute clinical rejection.

10.
BACKGROUND: In hepatitis C virus (HCV)-positive liver transplant recipients, infection of the allograft and recurrent liver disease are important problems. Increased donor age has emerged as an important variable affecting patient and graft survival; however, specific age cutoffs and risk ratios for poor histologic outcomes and graft survival are not clear. METHODS: A longitudinal database of all HCV-positive patients transplanted at our center during an 11-year period was used to identify 111 patients who received 124 liver transplants. Graft survival and histological endpoints (severe activity and fibrosis) of HCV infection in the allografts were compared as a function of donor age at transplantation. RESULTS: By Kaplan-Meier analyses, older allografts showed earlier failure and decreased time to severe histological activity and fibrosis as compared with allografts from younger donors. By Cox proportional hazards analysis, older allografts were at greater risk for all severe histologic features and decreased graft survival as compared with younger allografts (P≤0.02 for all outcomes). Analysis of donor age as a dichotomous variable showed that donor age greater than 60 years conferred high risk for deleterious histologic outcomes and graft failure. An age cutoff of 60 years showed a sensitivity of 94% and specificity of 67% for worse graft survival by receiver operating characteristic curve. CONCLUSIONS: Advanced donor age is associated with more aggressive recurrent HCV and early allograft failure in HCV-positive liver transplant recipients. Consideration of donor age is important for decisions regarding patient selection, antiviral therapy, and organ allocation.
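The sensitivity and specificity quoted for the age-60 cutoff come from dichotomizing donor age against the outcome. A sketch of that computation on toy data (not the study cohort):

```python
# Sensitivity and specificity of the rule "donor age > cutoff"
# for predicting a poor outcome, from a labeled cohort.

def sens_spec(ages, poor_outcome, cutoff):
    """Return (sensitivity, specificity) of `age > cutoff`."""
    tp = sum(1 for a, y in zip(ages, poor_outcome) if a > cutoff and y)
    fn = sum(1 for a, y in zip(ages, poor_outcome) if a <= cutoff and y)
    tn = sum(1 for a, y in zip(ages, poor_outcome) if a <= cutoff and not y)
    fp = sum(1 for a, y in zip(ages, poor_outcome) if a > cutoff and not y)
    return tp / (tp + fn), tn / (tn + fp)

# invented donor ages and poor-outcome flags
ages         = [70, 65, 62, 55, 45, 40, 35, 68]
poor_outcome = [1,  1,  0,  0,  0,  0,  0,  1]
print(sens_spec(ages, poor_outcome, 60))
```

Sweeping `cutoff` over all observed ages and plotting the resulting (1 - specificity, sensitivity) pairs yields the ROC curve the abstract refers to.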

11.
The association of donor and recipient age with survival following adult heart transplantation has not been well characterized. The purpose of this study was to examine the impact of the relationship between donor and recipient age on post-transplant survival. We retrospectively reviewed the 2005–2018 UNOS heart transplant database for all adult recipients undergoing first-time isolated heart transplantation. The impact of donor and recipient age on survival was analyzed with Cox proportional hazards modeling using restricted cubic splines. A total of 25 480 heart transplant donor and recipient pairs met inclusion criteria. Unadjusted and adjusted Cox proportional hazards modeling demonstrated a near-linear association between increasing donor age and decreased survival; in addition, older and younger recipient age was associated with decreased survival. After adjustment, there was no significant interaction between donor and recipient age. Older donors decreased survival similarly in both older and younger recipients. Increasing donor age and both younger and older recipient age are independently associated with worsened post-heart transplant survival. The relationship between donor and recipient age does not significantly affect survival following heart transplant.

12.
BACKGROUND: Clinical outcome of renal transplantation among systemic lupus erythematosus (SLE) patients remains a topic of controversy. Most of the previous reports were based upon small single-centre studies that were not always well-designed. METHODS: We conducted the retrospective analysis using data from USRDS and UNOS databases. Patients were divided into five groups based on the cause of end-stage renal disease (ESRD): diabetes mellitus (DM), SLE, glomerulonephritis, hypertension and other causes. Between 1990 and 1999, 2886 renal transplantation recipients with ESRD due to SLE were identified from a total of 92 844 patients. RESULTS: The mean follow-up period of this study was 4.7 ± 2.4 years. While unadjusted analysis using Kaplan-Meier curves demonstrated an association between SLE and improved allograft survival compared with DM, in multivariate analysis the SLE group had worse allograft [hazard ratio (HR) 1.09, P < 0.05] and recipient (HR 1.19, P < 0.05) survival compared with the DM group. Subgroup analysis based on the type of donor showed that SLE patients who received deceased donor allograft had worse allograft and recipient survival (HR 1.14, P = 0.002 and HR 1.30, P = 0.001, respectively) compared with non-SLE deceased donor allograft recipients. Among living allograft recipients, there were no significant differences in either allograft or recipient survival compared with non-SLE recipients. CONCLUSIONS: SLE as a cause of ESRD in renal transplant recipients is associated with worse allograft and recipient survival compared with DM; this association is true for the entire population and for the recipients of deceased donor (but not living donor) transplant. Deceased donor allograft recipients have worse outcomes compared with living allograft recipients.

13.
Numerous donor and recipient risk factors interact to influence the probability of survival after liver transplantation. We developed a statistic, D-MELD, the product of donor age and preoperative MELD, calculated from laboratory values. Using the UNOS STAR national transplant database, we analyzed survival for first liver transplant recipients with chronic liver failure receiving organs from deceased donors after brain death. Preoperative D-MELD score effectively stratified posttransplant survival. Using a cutoff D-MELD score of 1600, we defined a subgroup of donor–recipient matches with significantly poorer short- and long-term outcomes as measured by survival and length of stay (LOS). Avoidance of D-MELD scores above 1600 improved results for subgroups of high-risk patients with donor age ≥60 and those with preoperative MELD ≥30. D-MELD ≥1600 accurately predicted worse outcome in recipients with and without hepatitis C. There is significant regional variation in average D-MELD scores at transplant; however, regions with larger numbers of high D-MELD matches do not have higher survival rates. D-MELD is a simple, highly predictive tool for estimating outcomes after liver transplantation. This statistic could assist surgeons and their patients in making organ acceptance decisions. Applying D-MELD to liver allocation could eliminate many donor/recipient matches likely to have inferior outcome.
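D-MELD is simple enough to compute directly from the definition in the abstract: donor age multiplied by the recipient's preoperative laboratory MELD, with 1600 flagging higher-risk matches. The helper names below are ours:

```python
# D-MELD = donor age x preoperative laboratory MELD.
# The abstract identifies D-MELD >= 1600 as the higher-risk subgroup.

DMELD_CUTOFF = 1600

def d_meld(donor_age, lab_meld):
    """Product of donor age (years) and preoperative lab MELD."""
    return donor_age * lab_meld

def high_risk_match(donor_age, lab_meld):
    """True if this donor-recipient match falls in the high-risk subgroup."""
    return d_meld(donor_age, lab_meld) >= DMELD_CUTOFF

print(d_meld(60, 30), high_risk_match(60, 30))  # 1800 True
print(d_meld(45, 20), high_risk_match(45, 20))  # 900 False
```

The appeal of the statistic, as the abstract notes, is exactly this simplicity: it can be evaluated at the bedside at the moment of an organ offer.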

14.
In several murine models of transplantation, the “cross-dressing” of recipient antigen presenting cells (APCs) with intact donor major histocompatibility complex (MHC) derived from allograft-released small extracellular vesicles (sEVs) has been recently described as a key mechanism in eliciting and sustaining alloimmune responses. Investigation of these processes in clinical organ transplantation has, however, been hampered by the lack of sensitivity of conventional instruments and assays. We have employed advanced imaging flow cytometry (iFCM) to explore the kinetics of allograft sEV release and the extent to which donor sEVs might induce cross-dressing following liver and kidney transplantation. We report for the first time that recipient APC cross-dressing can be transiently detected in the circulation shortly after liver, but not kidney, transplantation in association with the release of HLA-bearing allograft-derived sEVs. In liver transplant recipients the majority of circulating cells exhibiting donor HLA are indeed cross-dressed cells and not passenger leukocytes. In keeping with experimental animal data, the downstream functional consequences of the transfer of circulating sEVs harvested from human transplant recipients vary depending on the type of transplant and time posttransplant. sEVs released shortly after liver, but not kidney, transplantation exhibit immunoinhibitory effects that could influence liver allograft immunogenicity.

15.
BACKGROUND: Liver transplantation is an important health care issue for Canadians. Very few studies have assessed survival and determinants of survival in liver transplant patients in Canada. METHODS: We carried out an epidemiological analysis of 1 year survival and determinants of 1 year survival in liver transplant patients, using Canadian Organ Replacement Registry data (1997-2002). Survival curves were plotted by the Kaplan-Meier method. Cox proportional hazards analysis was applied to evaluate hazard ratios with different age groups, gender, ethnicity, blood groups, donor type, pretransplantation medical status, and HBV infection status. RESULTS: A total of 1164 liver transplant patients were included in the analysis. The one-year survival rate was 84.7%. Male recipients had a 21% higher risk of developing organ failure than females. Recipients over 60 years of age had a 5% lower survival probability in comparison with recipients below 20 years of age. Pacific Islanders and Aboriginals had 32% and 9% lower survival probabilities, respectively, in comparison with Caucasians. Type B blood recipients had a 12% higher survival probability, whereas type AB blood recipients had a 7% lower survival probability compared with type O blood recipients. The 26 living donor organ recipients had a 40% higher survival probability than the 1138 cadaveric organ recipients. Patients with fulminant hepatitis (status 3F) had the highest survival, while patients with fulminant failure in ICU with intubation/ventilation (status 4F) had the lowest survival. The 167 HBsAg-positive recipients showed a 10% lower survival probability than the 997 HBsAg-negative cases. CONCLUSION: In Canada, the first-year survival rate is about 85%, which is comparable with other industrialized countries. Type of donor organs and recipient gender, ethnicity, ABO blood group, pretransplantation medical status, and HBV infection status had significant effects on recipient survival.

16.
Standardization in allocation of kidneys for transplant simultaneous with livers and the creation of a “safety net” for kidney transplant after liver transplant alone (LTA) was designed to encourage clinicians to list patients for LTA when the likelihood of renal recovery and the necessity of simultaneous liver and kidney (SLK) transplant were unclear. We analyzed the United Network for Organ Sharing database of SLK recipients starting January 1, 2015. Organs from one deceased donor were used in each individual case. Univariate analysis was used to analyze recipient and donor characteristics against patient and graft survival of at least 1 year. Cox regression was employed for multivariable analysis controlling for donor risk index variables. SLK recipients who failed to achieve 1 year of post-transplant survival were more likely to be older, have higher Model for End-Stage Liver Disease scores, have diabetes, have received dialysis within one week of transplant, and require intensive care unit admission at transplantation. Patients who failed to survive for at least 1 year after SLK were more likely to have received organs from donors who were older with a higher kidney donor profile index. Using national data we identified SLK donor and recipient characteristics associated with poor post-transplant outcome. Clinicians involved in the decision to list patients with liver failure for LTA or SLK may use these associations to help guide decision making.

17.
BACKGROUND: Waiting time on dialysis has been shown to be associated with worse outcomes after living and cadaveric transplantation. To validate and quantify end-stage renal disease (ESRD) time as an independent risk factor for kidney transplantation, we compared the outcome of paired donor kidneys, destined to patients who had ESRD more than 2 years compared to patients who had ESRD less than 6 months. METHODS: We analyzed data available from the U.S. Renal Data System database between 1988 and 1998 by Kaplan-Meier estimates and Cox proportional hazards models to quantify the effect of ESRD time on paired cadaveric kidneys and on all cadaveric kidneys compared to living-donated kidneys. RESULTS: Five- and 10-year unadjusted graft survival rates were significantly worse in paired kidney recipients who had undergone more than 24 months of dialysis (58% and 29%, respectively) compared to paired kidney recipients who had undergone less than 6 months of dialysis (78% and 63%, respectively; P<0.001 each). Ten-year overall adjusted graft survival for cadaveric transplants was 69% for preemptive transplants versus 39% for transplants after 24 months on dialysis. For living transplants, 10-year overall adjusted graft survival was 75% for preemptive transplants versus 49% for transplants after 24 months on dialysis. CONCLUSIONS: ESRD time is arguably the strongest independent modifiable risk factor for renal transplant outcomes. Part of the advantage of living-donor versus cadaveric-donor transplantation may be explained by waiting time. This effect is dominant enough that a cadaveric renal transplant recipient with an ESRD time less than 6 months has the equivalent graft survival of living donor transplant recipients who wait on dialysis for more than 2 years.

18.
Because of the shortage of deceased donor organs, transplant centers accept organs from marginal deceased donors, including older donors. Organ-specific donor risk indices have been developed to predict graft survival with various combinations of donor and recipient characteristics. Here we review the kidney donor risk index (KDRI) and the liver donor risk index (LDRI) and compare and contrast their strengths, limitations, and potential uses. The KDRI has a potential role in developing new kidney allocation algorithms. The LDRI allows a greater appreciation of the importance of donor factors, particularly for hepatitis C virus-positive recipients; as the donor risk index increases, the rates of allograft and patient survival among these recipients decrease disproportionately. The use of livers with high donor risk indices is associated with increased hospital costs that are independent of recipient risk factors, and the transplantation of livers with high donor risk indices into patients with Model for End-Stage Liver Disease scores < 15 is associated with lower allograft survival; the use of the LDRI has limited this practice. Significant regional variations in donor quality, as measured by the LDRI, remain in the United States. We also review other potential indices for liver transplantation, including donor-recipient matching and the retransplant donor risk index. Although substantial progress has been made in developing donor risk indices to objectively assess donor variables that affect transplant outcomes, continued efforts are warranted to improve these indices to enhance organ allocation policies and optimize allograft survival.
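Indices like the KDRI and LDRI share a common multiplicative structure: each donor factor contributes a coefficient to a linear predictor, and the index is the exponential of the sum, so a reference donor scores 1.0. The weights below are hypothetical placeholders for illustration only, not the published KDRI or LDRI coefficients:

```python
# Generic donor-risk-index structure: exp(sum of weighted factors).
# ASSUMED_WEIGHTS are HYPOTHETICAL values chosen to make the sketch
# runnable; the real KDRI/LDRI use published regression coefficients.

import math

ASSUMED_WEIGHTS = {
    "age_over_50": 0.20,                   # indicator (0 or 1)
    "dcd": 0.15,                           # donation after cardiac death
    "partial_or_split": 0.30,              # split/partial graft
    "cold_ischemia_per_hour_over_8": 0.01, # hours beyond 8
}

def donor_risk_index(factors):
    """exp(linear predictor); `factors` maps factor name -> magnitude."""
    lp = sum(ASSUMED_WEIGHTS[k] * v for k, v in factors.items())
    return math.exp(lp)

reference = donor_risk_index({})  # 1.0 by construction
risky = donor_risk_index({"age_over_50": 1, "dcd": 1,
                          "cold_ischemia_per_hour_over_8": 4})
print(reference, round(risky, 3))
```

The multiplicative form is what makes the indices easy to interpret: a value of 1.5 means a hazard about 50% above the reference donor, regardless of which factors produced it.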

19.
In certain regions of the United States in which organ donor shortages are persistent and competition is high, recipients wait longer and are critically ill with Model for End-Stage Liver Disease (MELD) scores ≥40 when they undergo liver transplantation. Recent implementation of Share 35 has increased the percentage of recipients transplanted at these higher MELD scores. The purpose of our study was to examine national data of liver transplant recipients with MELD scores ≥40 and to identify risk factors that affect graft and recipient survival. During the 12-year study period, 5002 adult recipients underwent deceased donor whole-liver transplantation. The 1-, 3-, 5- and 10-year graft survival rates were 77%, 69%, 64% and 50%, respectively. The 1-, 3-, 5- and 10-year patient survival rates were 80%, 72%, 67% and 53%, respectively. Multivariable analysis identified previous transplant, ventilator dependence, diabetes, hepatitis C virus, age >60 years and prolonged hospitalization prior to transplant as recipient factors increasing the risk of graft failure and death. Donor age >30 years was associated with an incrementally increased risk of graft failure and death. Recipients after implementation of Share 35 had shorter waiting times and higher graft and patient survival compared with pre–Share 35 recipients, demonstrating that some risk factors can be mitigated by policy changes that increase organ accessibility.

20.
Split-liver transplantation (SLT) effectively expands the cadaveric donor pool for children. The remaining right trisegmental (RTS) graft can be transplanted into adults. Limited information exists regarding the outcomes of RTS allografts. Sixty-five RTS graft recipients from five adult transplant programs in Texas were identified. Donor and recipient information was analyzed retrospectively. Most livers (75%) were originally allocated to pediatric recipients. Liver splitting occurred via the in situ (72%) and ex situ (28%) techniques. Arterial reconstruction of RTS grafts was common (52%). Patient and graft survival at 3 months were comparable for the in situ and ex situ techniques (p = 0.2). Cox regression showed only in situ splitting to be a predictor of outcome beyond 3 months posttransplant. Sharing of grafts between centers was frequent (37% of total). One-year patient and allograft survival (87.1% and 85.4%) were excellent with no cases of primary nonfunction. SLT consistently generates two functional liver allografts with excellent recipient survival. In situ splitting of the liver is the preferred technique. Decreased survival is observed with RTS graft use in higher risk recipients. Broader application of SLT with increased sharing is feasible and safely expands the number of liver allografts that can be transplanted.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)