Similar Articles
20 similar articles found (search time: 31 ms)
1.
Observation that 1,25-Dihydroxyvitamin-D3 has an immunomodulatory effect on innate and adaptive immunity raises the possibility of an effect on clinical graft outcome. The aim of this study was to evaluate the correlation of biopsy-proven acute rejection (BPAR), CMV infection, and BKV infection with 1,25-Dihydroxyvitamin-D3 deficiency, and the benefit of calcitriol supplementation before and during transplantation. Risk factors and kidney graft function were also evaluated. All renal transplant recipients (RTRs) received induction therapy with basiliximab, cyclosporine, mycophenolic acid, and steroids. During the first year, the incidence of BPAR (4% vs 11%, P=.04), CMV infection (3% vs 9%, P=.04), and BKV infection (6% vs 19%, P=.04) was significantly lower in calcitriol users compared to controls. By multivariate Cox regression analysis, 1,25-Dihydroxyvitamin-D3 deficiency and no calcitriol exposure were independent risk factors for BPAR (HR=4.30, P<.005 and HR=3.25, P<.05), for CMV infection (HR=2.33, P<.05 and HR=2.31, P=.001), and for BKV infection (HR=2.41, P<.05 and HR=2.45, P=.001). After one year, users had better renal function: eGFR was 62.5±6.7 mL/min vs 51.4±7.6 mL/min (P<.05). Only one user developed polyomavirus-associated nephropathy vs 15 controls. Two users lost their graft vs 11 controls. Deficient circulating 1,25(OH)2-D3 levels increased the risk of BPAR, CMV infection, and BKV infection after kidney transplantation. Administration of calcitriol is a way to obtain adequate 1,25(OH)2-D3 circulating levels.
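The multivariate Cox model described above can be sketched as follows. This is an illustrative reconstruction rather than the authors' code; the file name and column names (time_to_bpar, bpar, vitd_deficient, no_calcitriol, and the adjustment covariates) are hypothetical.

```python
# Minimal sketch of a multivariate Cox proportional hazards model for time to BPAR.
# Hypothetical columns; the study's actual covariate set is not reproduced here.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("rtr_cohort.csv")            # one row per renal transplant recipient
cox_df = df[["time_to_bpar", "bpar",          # follow-up time and BPAR indicator (0/1)
             "vitd_deficient",                # 1,25(OH)2-D3 deficiency (0/1)
             "no_calcitriol",                 # no calcitriol exposure (0/1)
             "recipient_age", "donor_age"]]   # example adjustment covariates

cph = CoxPHFitter()
cph.fit(cox_df, duration_col="time_to_bpar", event_col="bpar")
cph.print_summary()   # the exp(coef) column gives hazard ratios, e.g. HR = 4.30 for deficiency
```

Swapping the duration and event columns for CMV or BKV infection would yield the corresponding hazard ratios in the same way.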

2.
Cirrhotic cardiomyopathy causes variable degrees of systolic and diastolic dysfunction (DD) and conduction abnormalities. The primary aim of our study was to determine whether pre-transplant DD and prolonged corrected QT (QTc) predict a composite of mortality, graft failure, and major cardiovascular events after liver transplantation. We also evaluated the reversibility of cirrhotic cardiomyopathy after transplantation. Adult patients who underwent liver transplantation at our institution from January 2007 to March 2009 were included. Data were obtained from an institutional registry, medical record review, and evaluation of echocardiographic images. Among 243 patients, 113 (46.5%) had grade 1 DD, 16 (6.6%) had grade 2 DD, and none had grade 3 DD. The mean pre-transplant QTc was 453 milliseconds. After a mean post-transplant follow-up of 5.2 years, 75 (31%) patients met the primary composite outcome. Cox regression analysis did not show any significant association between DD and the composite outcome (P=.17). However, longer QTc was independently associated with the composite outcome (HR: 1.01, 95% confidence interval: 1.00–1.02, P=.05). DD (P<.001) and left ventricular mass index (P=.001) worsened after transplantation. In conclusion, QTc prolongation appears to be associated with worse outcomes. Although DD did not impact outcomes, it significantly worsened after transplantation.

3.
Invasive micropapillary carcinoma (IMPC) of the breast is a rare and highly aggressive subtype of breast cancer. In this study, we aimed to investigate differences between pure and mixed IMPCs of the breast in terms of clinicopathologic features, and also to analyze the prognostic significance of ARID1A and bcl-2 expression. Sixty-nine IMPCs (21 pure and 48 mixed type) with complete follow-up data, diagnosed at the Pathology Department of Istanbul Medical Faculty between 2000 and 2011, were collected to analyze ARID1A and bcl-2 expression immunohistochemically in relation to prognosis. The median follow-up period was 94 months. No significant difference was found between pure and mixed type IMPC, or within luminal subgroups, in terms of prognostic and clinicopathologic features. ARID1A and human epidermal growth factor receptor-2 (Her-2) status were found to be independent prognostic factors for both overall survival (OS) (HR=6.1, 95% CI 1.4-26.6, P=.02; HR=15.9, 95% CI 3.5-71.5, P<.0001, respectively) and disease-free survival (DFS) (HR=4, 95% CI 1.1-14.9, P=.04; HR=7.2, 95% CI 2-25.4, P=.002, respectively) in multivariate Cox regression analysis. Loss of ARID1A expression was significantly related to 10-year OS (P=.001) and 10-year DFS (P=.05). A statistically significant effect of ARID1A expression on DFS and OS was also observed in the Luminal B group (P=.05 and P=.001, respectively). Pure and mixed type IMPCs are similar in terms of clinicopathologic and prognostic features. Loss of ARID1A expression and Her-2 positivity have a significant adverse effect on the clinical outcomes of IMPC patients.

4.
De novo donor-specific antibodies (dnDSAs) have been associated with reduced graft survival. Tacrolimus (TAC)–based regimens are the most common among immunosuppressive approaches used in clinical practice today, yet an optimal therapeutic dose to prevent dnDSAs has not been established. We evaluated mean TAC C0 (tacrolimus trough concentration) and TAC time in therapeutic range for the risk of dnDSAs in a cohort of 538 patients in the first year after kidney transplantation. A mean TAC C0 < 8 ng/mL was associated with dnDSAs by 6 months (odds ratio [OR] 2.51, 95% confidence interval [CI] 1.32–4.79, P = .005) and by 12 months (OR 2.32, 95% CI 1.30–4.15, P = .004), and there was a graded increase in risk with lower mean TAC C0. TAC time in the therapeutic range of <60% was associated with dnDSAs (OR 2.05, 95% CI 1.28-3.30, P = .003) and acute rejection (hazard ratio [HR] 4.18, 95% CI 2.31–7.58, P < .001) by 12 months and with death-censored graft loss by 5 years (HR 3.12, 95% CI 1.53–6.37, P = .002). TAC minimization may come at the cost of higher rates of dnDSAs, and TAC time in therapeutic range may be a valuable strategy to stratify patients at increased risk of adverse outcomes.
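Below is a minimal sketch of how the two exposures above (mean TAC C0 and time in therapeutic range) could be computed from serial trough measurements and related to dnDSA status. It is illustrative only: the 8-12 ng/mL target window, the fraction-of-troughs definition of time in range (rather than an interpolated calculation), and all file and column names are assumptions, not the study's protocol.

```python
# Illustrative derivation of mean TAC C0 and time in therapeutic range (TTR),
# followed by a logistic model for dnDSA status at 12 months.
import numpy as np
import pandas as pd
import statsmodels.api as sm

troughs = pd.read_csv("tac_troughs.csv")      # hypothetical: patient_id, c0_ng_ml per draw
per_pt = troughs.groupby("patient_id")["c0_ng_ml"].agg(
    mean_c0="mean",
    ttr=lambda c0: ((c0 >= 8) & (c0 <= 12)).mean(),   # fraction of troughs within 8-12 ng/mL
)

outcomes = pd.read_csv("dnDSA_12m.csv").set_index("patient_id")  # hypothetical: dnDSA (0/1)
data = per_pt.join(outcomes)
data["low_mean_c0"] = (data["mean_c0"] < 8).astype(int)
data["low_ttr"] = (data["ttr"] < 0.60).astype(int)

X = sm.add_constant(data[["low_mean_c0", "low_ttr"]])
fit = sm.Logit(data["dnDSA"], X).fit()
print(np.exp(fit.params))      # odds ratios, analogous to the ORs reported above
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```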

5.
Body composition after kidney transplantation is linked to glucose metabolism, and impaired glucose metabolism is associated with increased risk of cardiovascular events and death. One year after transplantation, we examined 150 patients for post-transplant diabetes, performing oral glucose tolerance tests and body composition measurements, including visceral adipose tissue (VAT) content from dual-energy X-ray absorptiometry scans. We found that glucose metabolism was generally improved over the first year post-transplant, and that the levels of VAT and the percentage of VAT in total body fat mass (VAT%totBFM) were lowest in those with normal glucose tolerance and highest in those with post-transplant diabetes mellitus. In a multivariable linear regression analysis, 87.4% of the variability in fasting glucose concentration was explained by insulin resistance (P<.001, HOMA-IR index), beta cell function (P<.001, HOMA-beta), VAT%totBFM (P=.007), and body mass index (BMI; P=.015; total model P<.001), while insulin resistance (P<.001) and beta cell function (P<.001) explained 31.9% of the variability in 2-hour glucose concentration in a multivariable model (total model P<.001). VAT was associated with glucose metabolism to a larger degree than BMI. In conclusion, VAT is associated with hyperglycemia one year after kidney transplantation, and insulin resistance and beta cell function estimates are the most robust markers of glucose metabolism.
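The variance-explained figures above come from multivariable linear regression; a minimal sketch with hypothetical variable and file names is shown below.

```python
# Sketch of the multivariable linear model for fasting glucose described above.
# All variable and file names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("post_tx_metabolic.csv")   # one row per patient at 1 year post-transplant
fit = smf.ols("fasting_glucose ~ homa_ir + homa_beta + vat_pct_totbfm + bmi", data=df).fit()
print(fit.rsquared)   # proportion of variance explained (87.4% in the study's model)
print(fit.pvalues)    # per-covariate P values, e.g. for VAT%totBFM and BMI
```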

6.
Transplant glomerulopathy is mainly due to chronic antibody-mediated rejection and currently represents a major cause of long-term allograft failure. The lack of effective treatment remains a serious problem in transplantation. A retrospective single-center study was performed in 48 kidney allograft recipients with transplant glomerulopathy between January 2010 and December 2015. Median time to diagnosis was 7.1 (3.6-11.8) years post-transplant. Light microscopy showed severe transplant glomerulopathy in the majority of patients (cg1=10.4%; cg2=20.8%; cg3=68.8%). Moderate microvascular inflammation was present in 56.3% (g+ptc≥2), and about half of recipients (51.1%) were C4d positive on immunofluorescence. Female gender (P=.001), age (P=.043), renal dysfunction (P=.002), acute rejection episodes (P=.026), and anti-HLA class II antibodies (P=.004) were associated with kidney allograft failure. Treatment of transplant glomerulopathy was performed in 67.6% of patients. The histologic and laboratory features that led to a therapeutic intervention were the ptc score (P=.021), C4d (P=.03), and the presence of anti-HLA antibodies (P=.029), whereas the ah score (P=.005) was associated with conservative management. The overall cumulative kidney allograft survival at 10 years was 75%. Treatment of transplant glomerulopathy was ineffective in improving long-term kidney allograft survival.

7.
Kidney transplant (KT) programs have extended recipient eligibility to those who were previously excluded due to advanced age. We aimed to determine the outcomes of patients ≥70 years undergoing KT and to investigate factors predicting survival. Two thousand six hundred and twenty-four KT patients between 2003 and 2013 at two institutions were divided into two groups: those ≥70 years (n=300) and those <70 years (n=2324) at the time of KT. Patient survival at 1, 3, and 5 years was 95%, 86%, and 77% in the ≥70 years group and 98%, 95%, and 90% in the <70 years group (P<.001). When graft loss due to death was censored, graft survival was not significantly different between the two groups (P=.18). On multivariable analysis, the significant predictors of inferior survival in patients ≥70 years included: body mass index (BMI) >30 kg/m2 (hazard ratio [HR] 1.07; P=.01), panel reactive antibody (PRA) >20% (HR 2.38; P=.01), previous coronary artery bypass grafting (CABG; HR 1.95; P=.03), and peripheral vascular disease (PVD; HR 2.60; P=.04). Acceptable outcomes can be achieved in KT recipients ≥70 years. Caution should be used when listing these patients if they have BMI >30 kg/m2, PRA >20%, prior CABG, or PVD.
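A minimal sketch of the survival comparison above (Kaplan-Meier estimates at 1, 3, and 5 years plus a log-rank test) is given below; the data file and column names are assumptions.

```python
# Illustrative Kaplan-Meier comparison of recipients >=70 vs <70 years at transplant.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("kt_recipients.csv")       # hypothetical: time_years, died (0/1), age_ge_70 (0/1)
old = df[df["age_ge_70"] == 1]
young = df[df["age_ge_70"] == 0]

kmf = KaplanMeierFitter()
kmf.fit(old["time_years"], event_observed=old["died"], label=">=70 years")
print(kmf.survival_function_at_times([1, 3, 5]))    # e.g. 95%, 86%, 77% reported above

res = logrank_test(old["time_years"], young["time_years"],
                   event_observed_A=old["died"], event_observed_B=young["died"])
print(res.p_value)                                   # reported as P < .001
```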

8.
Thirty percent of kidney transplant recipients are readmitted in the first month posttransplantation. Those with donor-specific antibody requiring desensitization and incompatible live donor kidney transplantation (ILDKT) constitute a unique subpopulation that might be at higher readmission risk. Drawing on a 22-center cohort, 379 ILDKTs with Medicare primary insurance were matched to compatible transplant controls and to waitlist-only controls on panel reactive antibody, age, blood group, renal replacement time, prior kidney transplantation, race, gender, diabetes, and transplant date/waitlisting date. Readmission risk was determined using multilevel, mixed-effects Poisson regression. In the first month, ILDKTs had a 1.28-fold higher readmission risk than compatible controls (95% confidence interval [CI] 1.13-1.46; P < .001). Risk peaked at 6-12 months (relative risk [RR] 1.67, 95% CI 1.49-1.87; P < .001), attenuating by 24-36 months (RR 1.24, 95% CI 1.10-1.40; P < .001). ILDKTs had a 5.86-fold higher readmission risk (95% CI 4.96-6.92; P < .001) in the first month compared to waitlist-only controls. At 12-24 (RR 0.85, 95% CI 0.77-0.95; P = .002) and 24-36 months (RR 0.74, 95% CI 0.66-0.84; P < .001), ILDKTs had a lower risk than waitlist-only controls. These findings, a higher readmission risk for ILDKTs relative to compatible controls but a lower readmission risk after the first year relative to waitlist-only controls, should be considered in regulatory/payment schemas and in planning clinical care.
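The readmission rate ratios above come from multilevel, mixed-effects Poisson regression on the matched cohort; a deliberately simplified, single-level Poisson sketch (no center random effects, no matching weights) is shown below just to illustrate how a rate ratio such as RR = 1.28 is estimated. File and column names are hypothetical.

```python
# Simplified rate-ratio sketch: Poisson GLM with a log person-time offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("readmissions_month1.csv")  # hypothetical: readmissions, days_at_risk, ildkt (0/1)
fit = smf.glm("readmissions ~ ildkt", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["days_at_risk"])).fit()
print(np.exp(fit.params["ildkt"]))            # rate ratio, ILDKT vs compatible controls
print(np.exp(fit.conf_int().loc["ildkt"]))    # 95% confidence interval
```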

9.
Tuberculosis (TB) mortality is high among kidney transplant (KT) recipients. Although local epidemiology is an important factor, diagnostic/therapeutic challenges and immunosuppressive therapy (ISS) may influence outcomes. We analyzed the cumulative incidence (CumI) of TB in KT recipients receiving a variety of ISS with long-term follow-up. Our retrospective single-center cohort study included all KT procedures performed between January 1, 1998, and August 31, 2014, with follow-up until August 31, 2014. Induction therapy was based on perceived immunological risk; maintenance ISS included prednisone and a calcineurin inhibitor (CNI) plus azathioprine (AZA), mycophenolic acid (MPA), or a mechanistic target of rapamycin inhibitor (mTORi). Thirty-four patients received belatacept/MPA. KT was performed in 11 453 patients, with follow-up of 1989 (IQR 932 to 3632) days. Among these, 152 patients were diagnosed with TB (CumI 1.32%). Median time from KT to TB was 18.8 (IQR 7.2 to 60) months, with 59% of patients diagnosed after the first year. Unadjusted analysis revealed an increasing cumulative incidence of TB across regimens (0.94% CNI/AZA vs 1.6% CNI/MPA [HR = 1.62, 95% confidence interval (CI) = 1.13 to 2.34, P = .009] vs 2.85% CNI/mTORi [HR = 2.45, 95% CI = 1.49 to 4.32, P < .001] vs 14.7% belatacept/MPA [HR = 13.14, 95% CI = 5.27 to 32.79, P < .001]). Thirty-seven (24%) patients died, and 39 (25.6%) patients experienced graft loss. Cytomegalovirus infection (P = .02) and definitive ISS discontinuation (P < .001) were associated with death. Rejection (P = .018) and ISS discontinuation (P = .005) were associated with graft loss. TB occurred at any time after KT and was influenced by ISS.

10.
Kidney transplant in patients with liver cirrhosis and nondialysis chronic kidney disease (CKD) is controversial. We report 14 liver cirrhotic patients who had persistently low MDRD-6 estimated glomerular filtration rate (e-GFR) <40 mL/min/1.73 m2 for ≥3 months and underwent either liver transplant alone (LTA; n=9) or simultaneous liver-kidney transplant (SLKT; n=5). Pretransplant, patients with LTA compared with SLKT had lower serum creatinine (2.5±0.73 vs 4.6±0.52 mg/dL, P=.001), higher MDRD-6 e-GFR (21.0±7.2 vs 10.3±2.0 mL/min/1.73 m2, P=.002), higher 24-hour urine creatinine clearance (34.2±8.8 vs 18.0±2.2 mL/min, P=.002), lower proteinuria (133.2±117.7 vs 663±268.2 mg/24 h, P=.0002), and relatively normal kidney biopsy and ultrasound findings. Post-LTA, the e-GFR (mL/min/1.73 m2) increased in all nine patients, with mean e-GFR at 1 month (49.8±8.4), 3 months (49.6±8.7), 6 months (49.8±8.1), 12 months (47.6±9.2), 24 months (47.9±9.1), and 36 months (45.1±7.3) significantly higher compared to pre-LTA e-GFR (P≤.005 at all time points). One patient developed end-stage renal disease 9 years post-LTA and another patient expired 7 years post-LTA. The low e-GFR alone in the absence of other markers or risk factors of CKD should not be an absolute criterion for SLKT in patients with liver cirrhosis.

11.
Transplantation Proceedings, 2021, 53(10): 3007-3015
Identification of risk factors for biliary stricture after liver transplant and its potential prevention is crucial to improve outcomes and reduce complications. We retrospectively analyzed donor and recipient characteristics along with intraoperative and postoperative parameters to identify the risk factors for development of post-transplant anastomotic and nonanastomotic biliary strictures, with additional analysis of the time of onset of those strictures. A total of 412 patients were included in this study. Mean (SD) follow-up time was 79 (35) months (range, 1-152 months). Biliary stricture was diagnosed in 84 patients (20.4%). Multivariate analysis indicated that postoperative biliary leakage (odds ratio [OR], 3.94; P = .001), acute cellular rejection (OR, 3.05; P < .001), donor age older than 47.5 years (OR, 2.05; P = .032), preoperative recipient platelet value < 77.5 × 10³/mL (OR, 1.91; P = .023), University of Wisconsin solution (OR, 1.73; P = .041), recipient male sex (OR, 1.78; P = .072), portal/arterial flow ratio > 4 (OR, 1.76; P = .083), and intraoperative bleeding > 2850 mL (OR, 1.70; P = .053) were independent risk factors for biliary stricture regardless of the time of their appearance. Multiple risk factors for biliary stricture were determined in this study. Some of these risk factors are preventable, and implementation of strategies to eliminate some of those factors should reduce the development of post-transplant biliary stricture.

12.
Differentiation between systemic inflammatory response syndrome and sepsis in surgical patients is of crucial significance. Procalcitonin (PCT) and C-reactive protein (CRP) are widely used biomarkers, but PCT becomes compromised after antithymocyte globulin (ATG) administration, and CRP exhibits limited specificity. Presepsin has been suggested as an alternative biomarker of sepsis. This study aimed to demonstrate the role of presepsin in patients after heart transplantation (HTx). Plasma presepsin, PCT, and CRP were measured in 107 patients serially for up to 10 days following HTx. Time responses of the biomarkers were evaluated for both noninfected (n=91) and infected (n=16) patients. Areas under the concentration curve differed between the two groups of patients for presepsin (P<.001), PCT (P<.005), and CRP (P<.001). The effect of time and infection was significant for all three biomarkers (P<.05 for all). In contrast to PCT, presepsin was not influenced by ATG administration. More than 25% of noninfected patients had PCT above 42 μg/L on the first day, and the peak concentration of CRP in infected patients was reached on the third post-transplant day (median 135 mg/L). Presepsin seems to be as valuable a biomarker as PCT or CRP in the evaluation of infectious complications in patients after HTx.

13.
Recipients of liver allografts from diabetic donors have decreased graft survival. However, limited data exist on the effects of donor HbA1c. We hypothesized that allografts from nondiabetic donors with elevated HbA1c would be associated with decreased survival. Liver transplant recipients in the UNOS database who received allografts from nondiabetic donors were stratified into two groups: euglycemic (HbA1c<6.5) and hyperglycemic (HbA1c≥6.5). Propensity score matching (10:1) was used to adjust for donor and recipient characteristics. Kaplan-Meier analysis was used to assess survival. Donors of hyperglycemic allografts were older (49 vs 36, P<.001), were more likely to be non-white, had a higher BMI (29.8 vs 26.2, P<.001), were more likely to engage in heavy cigarette use (1.5% vs 1.3%, P=.004), had higher serum creatinine levels (1.3 vs 1.0, P=.002), and were more likely to be an expanded-criteria donor (35.8% vs 14.4%, P<.001). After propensity matching to account for these differences, allograft survival was significantly decreased in the recipients of hyperglycemic allografts (P=.049), and patient survival showed a trend toward reduction (P=.082). These findings suggest that HbA1c may be a simple and inexpensive test with potential utility for better organ risk stratification.
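The propensity-matched comparison above can be sketched roughly as below: a logistic model for receiving a hyperglycemic-donor (HbA1c ≥ 6.5) allograft, then 10 euglycemic-donor matches per hyperglycemic-donor recipient. The covariates, matching with replacement, and the absence of a caliper are simplifying assumptions, not the study's exact procedure.

```python
# Rough 10:1 propensity-score matching sketch (with replacement, no caliper).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("unos_liver_extract.csv")   # hypothetical extract
covars = ["donor_age", "donor_bmi", "donor_creatinine", "recipient_age", "meld"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["hyperglycemic_donor"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]   # propensity score

treated = df[df["hyperglycemic_donor"] == 1]
control = df[df["hyperglycemic_donor"] == 0]
nn = NearestNeighbors(n_neighbors=10).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])               # 10 nearest controls per treated recipient
matched_controls = control.iloc[idx.ravel()]          # matched group for subsequent Kaplan-Meier analysis
```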

14.
Desensitization has enabled incompatible living donor kidney transplantation (ILDKT) across HLA/ABO barriers, but added immunomodulation might put patients at increased risk of infections. We studied 475 recipients from our center from 2010 to 2015, categorized by desensitization intensity: none/compatible (n = 260), low (0-4 plasmaphereses, n = 47), moderate (5-9, n = 74), and high (≥10, n = 94). The 1-year cumulative incidence of infection was 50.1%, 49.8%, 66.0%, and 73.5% for recipients who received none, low, moderate, and high-intensity desensitization (P < .001). The most common infections were UTI (33.5% of ILDKT vs. 21.5% compatible), opportunistic (21.9% vs. 10.8%), and bloodstream (19.1% vs. 5.4%) (P < .001). In weighted models, a trend toward increased risk was seen in low (wIRR = 1.40, 95% CI 0.77-2.56, P = .3) and moderately (wIRR = 1.35, 95% CI 0.88-2.06, P = .2) desensitized recipients, with a statistically significant 2.22-fold (wIRR = 2.22, 95% CI 1.33-3.72, P = .002) increased risk in highly desensitized recipients. Recipients with ≥4 infections were at higher risk of prolonged hospitalization (wIRR = 3.57, 95% CI 2.62-4.88, P < .001) and death-censored graft loss (wHR = 4.01, 95% CI 1.15-13.95, P = .03). Post–KT infections are more common in desensitized ILDKT recipients. A subset of highly desensitized patients is at ultra-high risk for infections. Strategies should be designed to protect patients from the morbidity of recurrent infections, and to extend the survival benefit of ILDKT across the spectrum of recipients.

15.
Biliary complications (BC) significantly affect morbidity and mortality after orthotopic liver transplantation (OLT). The aim of this study was to analyze the incidence and types of biliary complications after OLT in Hungary. We retrospectively analyzed data of 471 adult liver transplant recipients between 1995 and 2011. Biliary complications occurred in 28% of patients. The most frequent BCs were bile duct stricture/stenosis (19%), biliary leakage (12%), and bile duct necrosis (BN; 6.4%). Biliary complications were associated with the incidence of acute rejection (51% vs 31%; P = .003), hepatic artery thrombosis (43% vs 11%; P < .001), and hepatic artery stenosis (26% vs 11%; P = .002). When cold ischemic time was longer than 12 hours, leakage (10% vs 3%; P = .043), ischemic-type biliary lesion (20% vs 3.4%; P = .05), and BN (12% vs 3%; P = .067) were more often diagnosed post-OLT. Most of the biliary complications were treated by radiologic interventions (70%). Bile duct necrosis was associated with lower graft and patient survival. In conclusion, acute rejection, hepatic artery thrombosis/stenosis, and cold ischemic time longer than 12 hours increase the incidence of BCs. Successful management of these risk factors can reduce the incidence of biliary complications and reduce mortality.

16.
Although experimental studies have reported that hepatic ischemia-reperfusion injury promotes tumor growth and metastases, the impact of graft hemodynamics on the recurrence of hepatocellular carcinoma (HCC) after liver transplantation (LT) is unclear. To investigate the association between graft hemodynamics and HCC recurrence after LT, we conducted a retrospective analysis of 279 patients who underwent LT for HCC. Graft hemodynamics including portal vein flow (PVF), hepatic artery flow (HAF), and total hepatic flow (THF) was analyzed as a predictor of HCC recurrence, using competing risk regression analyses. The cutoff values of PVF, HAF, and THF were set at the lower quartile of distribution. A cumulative recurrence curve demonstrated that low THF (<1511 mL/min, P = .005) was significantly associated with increased recurrence, whereas neither low PVF (<1230 mL/min, P = .150) nor low HAF (<164 mL/min, P = .110) was significant. On multivariate analysis, outside Milan criteria (sub-hazard ratio [SHR] = 3.742; P < .001), microvascular invasion (SHR = 3.698; P < .001), and low THF (SHR = 2.359; P = .010) were independently associated with increased HCC recurrence. In conclusion, our findings suggest that graft hemodynamics may play an important role in HCC recurrence after LT.
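The analysis above uses competing risk regression (the sub-hazard ratios suggest a Fine-Gray model). A minimal sketch of the descriptive part, cumulative incidence of recurrence with death treated as a competing event via the Aalen-Johansen estimator, is shown below; the regression itself is not reproduced, and the file and column names are assumptions.

```python
# Cumulative incidence of HCC recurrence with death as a competing event.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("lt_hcc.csv")   # hypothetical: time_months, event (0=censored, 1=recurrence, 2=death), thf_ml_min
low_thf = df[df["thf_ml_min"] < 1511]
ajf = AalenJohansenFitter()
ajf.fit(low_thf["time_months"], low_thf["event"], event_of_interest=1)
print(ajf.cumulative_density_)   # cumulative incidence of recurrence in the low-THF group
```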

17.
The role of antithymocyte globulin (ATG) in patients with hematologic diseases undergoing umbilical cord blood transplantation (UCBT) remains controversial. This systematic review and meta-analysis was conducted to comprehensively evaluate this issue. PubMed, Embase, and the Cochrane Library were systematically searched. Clinical studies reporting the impact of ATG- vs non-ATG-containing conditioning regimens on transplantation outcomes were identified. Twenty-five studies were included. ATG significantly prevented grade II-IV and grade III-IV acute graft-vs-host disease (GVHD) (11 studies, 5020 patients, HR: 0.49, 95% CI: 0.42-0.56, P < .001; 5 studies, 5490 patients, HR: 0.60, 95% CI: 0.46-0.80, P < .001) but not chronic GVHD (8 studies, 5952 patients, HR: 0.78, 95% CI: 0.51-1.20, P = .266). However, use of ATG was associated with increased transplantation-related mortality and inferior overall survival (9 studies, 4244 patients, HR: 1.79, 95% CI: 1.38-2.33, P < .001; 8 studies, 5438 patients, HR: 1.96, 95% CI: 1.56-2.46, P < .001). Our findings do not support routine use of ATG in UCBT. Individualizing the ATG timing and dose based on patient characteristics, to retain the prophylactic effect of ATG on GVHD without compromising the survival of UCBT recipients, may be reasonable.
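The pooled hazard ratios above are inverse-variance summaries across studies. The sketch below shows a generic DerSimonian-Laird random-effects pooling of study-level HRs; the inputs are placeholders, not the included studies' actual estimates.

```python
# Generic random-effects (DerSimonian-Laird) pooling of hazard ratios.
import numpy as np

def pool_hr(hr, lo, hi):
    y = np.log(hr)                                   # log hazard ratios
    se = (np.log(hi) - np.log(lo)) / (2 * 1.96)      # SE recovered from the 95% CI
    w = 1.0 / se**2                                  # fixed-effect (inverse-variance) weights
    y_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fe) ** 2)                  # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return np.exp([y_re, y_re - 1.96 * se_re, y_re + 1.96 * se_re])

# Placeholder inputs (three hypothetical studies): pooled HR with its 95% CI
print(pool_hr(hr=np.array([0.45, 0.55, 0.50]),
              lo=np.array([0.35, 0.40, 0.38]),
              hi=np.array([0.58, 0.76, 0.66])))
```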

18.
We aimed to evaluate the influence of urological complications occurring within the first year after kidney transplantation on long-term patient and graft outcomes, and sought to examine the impact of the management approach for ureteral strictures on long-term graft function. We collected data on urological complications occurring within the first year posttransplant. Graft survival, patient survival, and rejection rates were compared between recipients with and without urological complications. Male gender of the recipient, delayed graft function, and donor age were found to be significant risk factors for urological complications after kidney transplantation (P < .05). Death-censored graft survival analysis showed that only ureteral strictures had a negative impact on long-term graft survival (P = .0009) compared to other complications. Death-censored graft survival was significantly shorter in kidney recipients managed initially with a minimally invasive approach when compared to recipients with no stricture (P = .001). However, graft survival was not statistically different in patients managed initially with open surgery (P = .47). Ureteral strictures following kidney transplantation appear to be strongly negatively correlated with long-term graft survival. Our analysis suggests that kidney recipients with ureteral stricture should be managed initially with open surgery, which is associated with better long-term graft survival.

19.
The Model for End-Stage Liver Disease (MELD) score predicts higher transplant healthcare utilization and costs; however, the independent contribution of functional status towards costs is understudied. The study objective was to evaluate the association between functional status, as measured by Karnofsky Performance Status (KPS), and liver transplant (LT) costs in the first posttransplant year. In a cohort of 598 LT recipients from July 1, 2009 to November 30, 2014, multivariable models assessed associations between KPS and outcomes. LT recipients needing full assistance (KPS 10%-40%) vs being independent (KPS 80%-100%) were more likely to be discharged to a rehabilitation facility after LT (22% vs 3%) and be rehospitalized within the first posttransplant year (78% vs 57%), all P < .001. In adjusted generalized linear models, in addition to MELD (P < .001), factors independently associated with higher 1-year post-LT transplant costs were older age, poor functional status (KPS 10%-40%), living donor LT, pre-LT hemodialysis, and the donor risk index (all P < .001). One-year survival for patients in the top cost decile was 83% vs 93% for the rest of the cohort (log rank P < .001). Functional status is an important determinant of posttransplant resource utilization; therefore, standardized measurements of functional status should be considered to optimize candidate selection and outcomes.
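The cost analysis above uses adjusted generalized linear models. The abstract does not state the family or link, so the sketch below assumes a gamma family with log link (a common choice for right-skewed cost data); all file and variable names are hypothetical.

```python
# Sketch of a GLM for 1-year post-LT costs (gamma family, log link assumed).
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("lt_costs.csv")   # hypothetical: cost_1y, meld, age, kps_10_40, ldlt, pre_lt_hd, dri
fit = smf.glm("cost_1y ~ meld + age + kps_10_40 + ldlt + pre_lt_hd + dri",
              data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(fit.summary())   # coefficients are on the log scale; exponentiate for multiplicative cost effects
```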

20.
The success of direct-acting antiviral (DAA) therapy has led to near-universal cure for patients chronically infected with hepatitis C virus (HCV) and improved post–liver transplant (LT) outcomes. We investigated the trends and outcomes of retransplantation in HCV and non-HCV patients before and after the introduction of DAA. Adult patients who underwent re-LT were identified in the Organ Procurement and Transplantation Network/United Network for Organ Sharing database. Multiorgan transplants and patients with >2 total LTs were excluded. Two eras were defined: pre-DAA (2009-2012) and post-DAA (2014-2017). A total of 2112 re-LT patients were eligible (HCV: n = 499 pre-DAA and n = 322 post-DAA; non-HCV: n = 547 pre-DAA and n = 744 post-DAA). HCV patients had both improved graft and patient survival after re-LT in the post-DAA era. One-year graft survival was 69.8% pre-DAA and 83.8% post-DAA (P < .001). One-year patient survival was 73.1% pre-DAA and 86.2% post-DAA (P < .001). Graft and patient survival was similar between eras for non-HCV patients. When adjusted, the post-DAA era represented an independent positive predictive factor for graft and patient survival (hazard ratio [HR]: 0.67; P = .005, and HR: 0.65; P = .004) only in HCV patients. The positive post-DAA era effect was observed only in HCV patients whose first graft was lost to disease recurrence (HR: 0.31; P = .002 and HR: 0.32; P = .003, respectively). Among HCV patients, receiving a re-LT in the post-DAA era was associated with improved patient and graft survival.
