Similar Articles
20 similar articles found.
1.
Background: We investigated the impact of de novo donor-specific anti-human leukocyte antigen antibodies (dnDSAs) on long-term death-censored graft survival and renal allograft rejection in kidney transplant recipients. Methods: The sample for this retrospective cohort study comprised 121 recipients of kidney transplants with negative complement-dependent cytotoxicity crossmatches to their deceased donors. Recipients were divided into two groups: dnDSAs+ (n = 31) and dnDSAs- (n = 90). We evaluated rejection and long-term graft survival rates in the recipients along with pathologic changes in the transplanted kidneys. Results: dnDSAs were identified in 31/121 patients (25.6%). The graft survival rate in the dnDSAs+ group was 87.1% (27/31) and that of the dnDSAs- group was 97.8% (88/90); the dnDSAs+ group had lower graft survival rates than patients without dnDSAs (p = 0.007). There was no difference in graft survival rates between patients with high DSA mean fluorescence intensity (≥4000) and those with low intensity (<4000) (p = 0.669), nor among patients with HLA class I, class II, and class I + II dnDSAs (p = 0.571). The presence of dnDSAs in serum was associated with a higher incidence of antibody- and T-cell–mediated rejection (p < 0.0001). Banff scores for arterial fibrointimal and arteriolar hyaline thickening, as well as C4d deposition, differed between the dnDSAs+ and dnDSAs- groups (p < 0.05). Conclusion: dnDSAs were associated with decreased long-term graft survival rates and increased rejection rates, often accompanied by C4d deposition.
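The headline fractions in this abstract can be reproduced from the reported counts; a minimal sketch is given below, using Fisher's exact test as one plausible way to compare the two survival proportions (the abstract does not state which test produced p = 0.007, so this is an illustration, not the authors' analysis).

```python
# Minimal sketch: recompute the reported graft survival fractions and compare
# the two groups with Fisher's exact test (illustrative only; the paper's
# p = 0.007 may come from a different test, e.g., log-rank).
from scipy.stats import fisher_exact

surviving_pos, total_pos = 27, 31   # dnDSAs+ group -> 87.1%
surviving_neg, total_neg = 88, 90   # dnDSAs- group -> 97.8%

print(f"dnDSAs+ graft survival: {surviving_pos / total_pos:.1%}")
print(f"dnDSAs- graft survival: {surviving_neg / total_neg:.1%}")

# 2x2 table: rows = group, columns = (graft surviving, graft lost)
table = [[surviving_pos, total_pos - surviving_pos],
         [surviving_neg, total_neg - surviving_neg]]
_, p_value = fisher_exact(table)
print(f"Fisher's exact p = {p_value:.3f}")
```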

2.
Preventing conversion of donor-specific anti-HLA antibodies (DSAs) from IgM to IgG could be a way to prevent chronic rejection. We evaluated whether belatacept-treated patients (belatacept less-intensive [LI] or more-intensive [MI] regimens) have a lower rate of conversion than do cyclosporine A (CsA)-treated patients. We included 330 HLA-mismatched patients from 2 phase 3 trials with either (a) complete donor/recipient HLA-A, -B, -DR, and -DQ loci typing or (b) incomplete HLA typing with IgG DSAs detected pretransplant or posttransplant. IgM and IgG DSAs were tested with single antigen beads at 0, 6, 12, 24, and 36 months posttransplant. The overall (preexisting or de novo) rates of IgM- and IgG-positive DSAs were 29% and 34%, respectively. The pretransplant IgM and IgG DSA-positive frequencies were similar between treatment groups. The IgG-positive dnDSA rate was significantly higher in the CsA-treated group (34%) than in the belatacept-LI (8%) and belatacept-MI (11%) groups (P < .001). In IgM-positive dnDSA patients, the rate of conversion to IgG-positive dnDSA was 2.8 times higher in the CsA group than in the combined belatacept groups (P = .006). However, the observed association between belatacept treatment and more limited conversion of IgM to IgG dnDSAs was based on a limited number of patients and requires further validation.

3.
Transplantation Proceedings, 2021, 53(6): 1865-1871
Background: Renal allograft survival is negatively affected by the development of de novo posttransplant donor-specific antibodies (dnDSA). We sought to determine whether treatment with intravenous immunoglobulin (IVIG) could remove dnDSA or reduce their intensity. Methods: In this single-center study, 12 recipients with dnDSA and stable graft function who received IVIG 1 g/kg monthly for 6 months were compared with a contemporaneous cohort of 24 recipients with dnDSA who did not receive IVIG. Results: The median time to first dnDSA was 6 months (interquartile range [IQR], 1-12), and follow-up was 83 months (IQR, 58-94) posttransplant. Resolution of dnDSA occurred in 27% of IVIG vs 46% of control recipients (P = .48). Fifty-eight percent of recipients in both cohorts demonstrated a reduction in the intensity of the dominant DSA at last follow-up (P = 1.0). A reduction in the number of dnDSAs occurred in 58% vs 62% of the IVIG and control cohorts, respectively (P = .81). After dnDSA detection, acute rejection occurred in 8% of the IVIG group vs 42% of the control group (P = .06). Forty-two percent of IVIG-treated vs 49% of control recipients had a deterioration in graft function from first dnDSA detection until most recent follow-up (P = .81). Actuarial graft survival was equivalent between groups. Conclusions: IVIG treatment of dnDSA in recipients with stable graft function had no impact on DSA clearance or reduction in mean fluorescence intensity (MFI), although this may reflect the limited sample size. Larger studies or alternative dosing regimens may be required to determine whether there is any role for IVIG as a treatment for dnDSA.

4.
The relationship between blood transfusion following kidney transplantation (KT) and the development of de novo donor-specific antibodies (dnDSA) is controversial. We investigated this by conducting a meta-analysis of studies on patients who underwent KT with or without blood transfusion and by evaluating the effect of post-KT blood transfusion on the clinical outcomes of kidney transplant recipients. Relevant studies in the PubMed, EMBASE, and Cochrane Library databases were identified from inception to July 1, 2022. Two reviewers independently extracted data from the selected articles and assessed study quality. A fixed effects or random effects model was used to pool data according to the heterogeneity among studies. The meta-analysis included data from 11 studies with a total of 19,543 patients, 6191 with and 13,352 without blood transfusion post-KT. We assessed the pooled associations between blood transfusion and the occurrence of dnDSA and the clinical outcomes of transplant recipients. Blood transfusion was strongly correlated with the development of dnDSA (relative risk [RR] = 1.40, 95% confidence interval [CI]: 1.17–1.67; P < 0.05). Patients with blood transfusion had a higher risk of developing anti-human leukocyte antigen (HLA) class I dnDSA than non-transfused patients (RR = 1.75, 95% CI: 1.14–2.69; P < 0.05) as well as significantly higher rates of antibody-mediated rejection (AMR) (RR = 1.41, 95% CI: 1.21–2.35; P < 0.05) and graft loss (RR = 1.75, 95% CI: 1.30–2.35; P < 0.05). There were no statistically significant differences between the two groups in the development of anti-HLA antibodies, anti-HLA class II dnDSA, or anti-HLA class I and II dnDSA; delayed graft function; T-cell-mediated rejection; acute rejection; borderline rejection; or patient death. Our results suggest that blood transfusion is associated with dnDSA development in KT recipients and that post-KT blood transfusion recipients have a higher risk of AMR and graft loss than non-transfused patients. Evidence from this meta-analysis indicates that post-KT blood transfusion is associated with a significantly higher risk of immunologic sensitization. More high-quality results from large randomized controlled trials are still needed to inform clinical practice.
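The pooled relative risks quoted above come from standard inverse-variance pooling; the sketch below shows a fixed-effect version of that calculation on placeholder study-level values (not data extracted from this meta-analysis).

```python
# Fixed-effect (inverse-variance) pooling of log relative risks -- a sketch of
# the pooling step behind results like RR = 1.40 (95% CI 1.17-1.67).
# The per-study values below are placeholders, not the meta-analysis data.
import numpy as np

studies = [(1.5, 1.1, 2.0), (1.3, 0.9, 1.9), (1.4, 1.0, 2.0)]  # (RR, CI low, CI high)

log_rr = np.log([rr for rr, lo, hi in studies])
# standard error recovered from the CI width on the log scale
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for rr, lo, hi in studies])

weights = 1.0 / se**2                                   # inverse-variance weights
pooled = np.sum(weights * log_rr) / np.sum(weights)     # pooled log RR
pooled_se = np.sqrt(1.0 / np.sum(weights))

ci = np.exp(pooled + np.array([-1.96, 1.96]) * pooled_se)
print(f"pooled RR = {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```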

5.
De novo donor-specific antibodies (dnDSAs) have been associated with reduced graft survival. Tacrolimus (TAC)-based regimens are the most common immunosuppressive approach in clinical practice today, yet an optimal therapeutic dose to prevent dnDSAs has not been established. We evaluated mean TAC trough concentration (C0) and TAC time in therapeutic range as risk factors for dnDSAs in a cohort of 538 patients in the first year after kidney transplantation. A mean TAC C0 < 8 ng/mL was associated with dnDSAs by 6 months (odds ratio [OR] 2.51, 95% confidence interval [CI] 1.32–4.79, P = .005) and by 12 months (OR 2.32, 95% CI 1.30–4.15, P = .004), and there was a graded increase in risk with lower mean TAC C0. TAC time in therapeutic range <60% was associated with dnDSAs (OR 2.05, 95% CI 1.28–3.30, P = .003) and acute rejection (hazard ratio [HR] 4.18, 95% CI 2.31–7.58, P < .001) by 12 months and with death-censored graft loss by 5 years (HR 3.12, 95% CI 1.53–6.37, P = .002). TAC minimization may come at the cost of higher rates of dnDSAs, and TAC time in therapeutic range may be a valuable metric for stratifying patients at increased risk of adverse outcomes.
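The "time in therapeutic range" exposure used above can be computed from serial trough levels; the sketch below uses Rosendaal-style linear interpolation between draws and an assumed 8–12 ng/mL target range, which may not match the study's exact definition.

```python
# Rosendaal-style time-in-therapeutic-range (TTR) for tacrolimus troughs.
# The 8-12 ng/mL target range and the sample trough values are assumptions
# for illustration; the study's definition of TTR may differ.
import numpy as np

days   = np.array([7, 30, 60, 90, 180, 365])        # day of each trough draw
trough = np.array([6.5, 9.0, 7.5, 10.0, 8.5, 7.0])  # TAC C0 in ng/mL
low, high = 8.0, 12.0

in_range_days = 0.0
for d0, d1, c0, c1 in zip(days[:-1], days[1:], trough[:-1], trough[1:]):
    # assume the concentration changes linearly between draws and count the
    # fraction of each interval spent inside [low, high]
    c = np.linspace(c0, c1, 101)
    in_range_days += (d1 - d0) * np.mean((c >= low) & (c <= high))

ttr = in_range_days / (days[-1] - days[0])
print(f"mean TAC C0 = {trough.mean():.1f} ng/mL, TTR = {ttr:.0%}")  # e.g., flag TTR < 60%
```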

6.

Background

De novo complement-binding donor-specific anti-human leukocyte antigen antibodies (DSAs) are reportedly associated with an increased risk of kidney graft failure, but there is little information on preformed complement-binding DSAs. This study investigated the correlation between preformed C1q-binding DSAs and medium-term outcomes in kidney transplantation (KT).

Methods

We retrospectively studied 44 pretransplant DSA-positive patients, including 36 patients who underwent KT between April 2010 and October 2016. There were 17 patients with C1q-binding DSAs and 27 patients without C1q-binding DSAs. Clinical variables were examined in the 2 groups.

Results

Patients with C1q-binding DSAs had a significantly higher rate of prior blood transfusion (53.0% vs 18.6%; P = .0174), complement-dependent cytotoxicity crossmatch (CDC-XM) positivity (29.4% vs 0%; P = .0012), and DSA median fluorescence intensity (MFI) (10,974 vs 2764; P = .0009). Among patients who were not excluded for CDC-XM positivity and underwent KT, there was no significant difference in cumulative biopsy-proven acute rejection rate (32.5% vs 33.5%; P = .8354), cumulative graft survival, or 3-month and 12-month protocol biopsy results between patients with and without C1q-binding DSAs. Although patients with C1q-binding DSAs showed a higher incidence of delayed graft function (54.6% vs 20.0%; P = .0419), multivariate logistic regression showed that DSA MFI (P = .0124), but not C1q-binding DSA status (P = .2377), was an independent risk factor for delayed graft function.

Conclusions

In CDC-XM-negative patients, preformed C1q-binding DSAs were not associated with the incidence of antibody-mediated rejection or with medium-term graft survival after KT. C1q-binding DSAs were highly correlated with DSA MFI and CDC-XM positivity.

7.
Background: Complement-binding donor-specific human leukocyte antigen (HLA) antibodies in kidney recipients have been associated with a higher risk of allograft rejection and loss. The objective of this meta-analysis was to investigate the correlation between C1q-binding donor-specific antibodies (DSAs) and clinical outcomes in kidney transplantation (KT) recipients. Methods: We conducted systematic searches in the PubMed, EMBASE, and Cochrane Library databases to identify all studies from inception to August 2021 that compared clinical outcomes between C1q-positive (C1q+) DSA and C1q-negative (C1q−) DSA patients who underwent KT. Data were independently extracted by two reviewers who assessed the risk of bias. Data were summarized with fixed effects or random effects models according to heterogeneity. We assessed clinical outcomes including graft loss, rejection, delayed graft function (DGF), and all-cause patient death. Results: Twenty-six studies with a total of 1337 patients were included: 485 with C1q-binding DSAs and 850 without C1q-binding DSAs. Compared with the C1q− DSA group, the C1q+ DSA group had significant increases in antibody-mediated rejection (AMR) (relative risk [RR] = 2.09, 95% confidence interval [CI], 1.53–2.86; P < 0.00001), graft loss (RR = 2.40, 95% CI, 1.66–3.47; P < 0.00001), and death (RR = 3.13, 95% CI, 1.06–9.23; P = 0.04). The C1q+ DSA and C1q− DSA groups did not show significant differences in T-cell-mediated rejection, acute rejection, acute cellular rejection, mixed rejection, or DGF. Conclusion: The findings of this systematic review suggest that KT recipients with C1q+ DSAs have a higher risk of AMR, graft loss, and death than C1q− DSA patients. Monitoring C1q-binding DSAs allows risk stratification of recipients and can guide physician management.
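The abstract notes that data were pooled with fixed- or random-effects models depending on heterogeneity; the sketch below shows the DerSimonian–Laird random-effects step (Cochran's Q, tau², re-weighting) on placeholder study-level values, not data from this review.

```python
# DerSimonian-Laird random-effects pooling of log relative risks: compute
# Cochran's Q, estimate between-study variance tau^2, then re-pool with
# adjusted weights. Study-level values are placeholders for illustration.
import numpy as np

log_rr = np.log(np.array([2.3, 1.8, 2.6, 1.5]))  # hypothetical per-study RRs
se     = np.array([0.30, 0.25, 0.40, 0.35])      # hypothetical SEs (log scale)

w_fixed = 1.0 / se**2
fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)

q = np.sum(w_fixed * (log_rr - fixed) ** 2)                   # Cochran's Q
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)                  # DL tau^2 estimate

w_rand = 1.0 / (se**2 + tau2)
pooled = np.sum(w_rand * log_rr) / np.sum(w_rand)
pooled_se = np.sqrt(1.0 / np.sum(w_rand))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * pooled_se)
print(f"random-effects RR = {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```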

8.
Background: Borderline changes suspicious for acute T-cell-mediated rejection (BC) are frequently seen on biopsy specimens, but their clinical significance and management are still controversial. Our goal was to compare clinical outcomes of kidney transplant recipients with biopsy-proven BC vs acute T-cell-mediated rejection (aTCMR) and to assess the influence of treating BC on graft outcomes. Methods: A retrospective cohort study was performed in all kidney transplant recipients with biopsy-proven BC and aTCMR between January 2012 and December 2018, according to Banff 2017 criteria; patients with concomitant antibody-mediated rejection were excluded. Results: We included 85 patients, 30 with BC (35.3%) and 55 with aTCMR (64.7%). There was no difference between groups regarding demographics, HLA matching and sensitization, immunosuppression, or time of transplant. Treatment with steroids was started in 15 patients with BC (50%) and in all patients with aTCMR, with 4 of the latter additionally receiving thymoglobulin (7.2%). At 1 year post biopsy, overall graft survival was 71%, and despite presenting a better estimated glomerular filtration rate (eGFR) at biopsy (33.3 ± 23.4 vs 19.9 ± 13.2 mL/min/1.73 m2, P = .008), patients in the BC group had the same graft survival as the aTCMR group according to Kaplan-Meier survival curves. Within the BC group (n = 30), comparing patients who were treated (n = 15) vs those managed conservatively (n = 15), graft survival at 1 year was 87% for treated patients and 73% for nontreated patients (P = .651), with no difference in eGFR among patients with a functioning graft. However, at longer follow-up, survival curves showed a trend toward better graft survival in treated patients (70.2 ± 9.2 vs 38.4 ± 8.4 months, P = .087). Conclusion: Our study showed that patients with BC did not have better graft survival or graft function at 1 year after biopsy or at follow-up compared with the aTCMR group, despite a better eGFR at diagnosis. We found a trend toward better graft survival in patients with BC treated with steroids compared with a conservative approach. These results reinforce the relevance of borderline changes to graft outcomes and suggest that the decision to treat can influence long-term outcomes.

9.
Transplantation Proceedings, 2021, 53(10): 2879-2887
Background: The aim of the study was to assess the influence of pretransplant body mass index (BMI, calculated as weight in kilograms divided by the square of height in meters) on 5- and 10-year graft and patient survival. Methods: Our study group consisted of 706 patients who received their kidney transplant after the year 2000. Results: Just over half of the patients (51.9%, n = 372) had BMI < 25, and 47.6% (n = 336) had BMI ≥ 25. Patients who were overweight or obese were significantly older than patients with BMI < 25 (P = .01). Five-year recipient survival was significantly better in the BMI < 25 group (n = 291, 79.5%) than in the BMI ≥ 25 group (n = 238, 70.2%, P < .05). In addition, 10-year recipient survival was better in the BMI < 25 group (n = 175, 47.8%) than in the BMI ≥ 25 group (n = 127, 37.5%, P < .05). Similarly, 5-year graft survival was better in the BMI < 25 group (66.9%, n = 242) than in the BMI ≥ 25 group (61.1%, n = 204, P < .05). However, the difference in 10-year graft survival was not statistically significant (P = .08). Regarding the impact of diabetes on survival, patients with diabetes mellitus had worse survival in all groups (P = .009). Conclusions: Recipient graft survival was affected by diabetes mellitus independently of overweight status. In the current study, we demonstrated that pretransplant overweight or obesity affects recipient and graft short-term survival, but long-term comparison of overweight or obese patients with patients of normal BMI revealed minimal differences in recipient survival and no difference in graft survival. Although many studies report that obesity and overweight predict poor recipient survival after kidney transplantation, our research did not fully confirm this. Diabetes mellitus was associated with worse outcomes in all patient groups.
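For reference, the BMI definition and the 25 kg/m² grouping cutoff described above reduce to a one-line formula; the example values are arbitrary.

```python
# BMI = weight (kg) / height (m)^2, with the conventional cutoff of 25 used
# above to split recipients into BMI < 25 and BMI >= 25 groups.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_group(weight_kg: float, height_m: float) -> str:
    return "BMI >= 25" if bmi(weight_kg, height_m) >= 25 else "BMI < 25"

print(f"{bmi(82, 1.75):.1f}", bmi_group(82, 1.75))  # 26.8 BMI >= 25
```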

10.
While de novo donor-specific HLA antibodies (dnDSAs) have a detrimental impact on kidney graft outcome, the clinical significance of de novo non-donor-specific antibodies (dnNDSAs) is more controversial. We retrospectively evaluated antibody development and the characteristics of dnNDSAs in serially collected posttransplant sera and, when available, graft biopsy eluates from 144 non-sensitized, primary pediatric kidney recipients consecutively transplanted at a single center between 2003 and 2017, using HLA class I and class II single-antigen flow-bead assays (SAB). The results were compared with clinical-pathologic data from HLA antibody-negative and HLA dnDSA-positive patients. Forty-five of the 144 patients developed dnNDSAs (31%). Among the dnNDSA-positive patients, 86% displayed one or more class I/II antibodies recognizing antigens included in the CREG/shared-epitope groups that also comprise the mismatched donor HLA antigens. Despite potential pathogenicity, as suggested by their occasional presence within the graft, dnNDSAs displayed significantly lower MFI and limited complement-binding and graft-homing properties compared with dnDSAs. In parallel, the graft survival probability was significantly lower in patients with dnDSAs than in those with dnNDSAs or without HLA antibodies (p < 0.005). Indeed, the dnNDSA-positive patients who remained dnDSA-negative throughout the posttransplant period did not develop clinical antibody-mediated rejection or graft loss and maintained good graft function at a median follow-up of 9 years. The biological characteristics of dnNDSAs may account for their lower graft-damaging capability compared with dnDSAs.

11.
Donor-specific antibodies (DSA) increase the risk of allograft rejection and graft failure. They may be present before transplantation or develop de novo afterward. Here, we studied the evolution of preformed DSA and their impact on graft outcome in kidney transplant recipients. Using the Luminex Single Antigen assay, we analyzed day-of-transplantation sera from 239 patients who received a kidney transplant. Thirty-seven patients (15.5%) had pre-existing DSA detected on the day of transplantation. After 5 years, the pre-existing DSA had disappeared in 22 patients and persisted in 12. Variables associated with DSA persistence were age <50 years (P = 0.009), a history of previous transplantation (P = 0.039), the presence of class II DSA (P = 0.009), an MFI of preformed DSA >3500 (P < 0.001), and the presence of two or more DSA (P < 0.001). DSA persistence was associated with a higher risk of graft loss and antibody-mediated rejection. Previously undetected preformed DSA are deleterious to graft survival only when they persist after transplantation.

12.
We conducted this study using the updated 2005-2016 Organ Procurement and Transplantation Network database to assess clinical outcomes of retransplantation after allograft loss as a result of BK virus-associated nephropathy (BKVAN). Three hundred forty-one patients had first graft failure as a result of BKVAN, whereas 13,260 had first graft failure from other causes. At a median follow-up of 4.70 years after the second kidney transplant, death-censored graft survival at 5 years for the second renal allograft was 90.6% for the BK group and 83.9% for the non-BK group. In adjusted analysis, there was no difference in death-censored graft survival (P = .11), acute rejection (P = .49), or patient survival (P = .13) between the 2 groups. When we further compared death-censored graft survival among the specific causes of first graft failure, the BK group had better graft survival than patients whose prior allograft failed as a result of acute rejection (P < .001) or disease recurrence (P = .003), but survival was similar to that of patients whose grafts failed from chronic allograft nephropathy (P = .06) or other causes (P = .05). The graft survival advantage of the BK group over the acute rejection and disease recurrence groups remained after adjusting for potential confounders. A history of allograft loss as a result of BKVAN should not be a contraindication to retransplantation among candidates who are otherwise acceptable.

13.
Introduction: The appearance of de novo donor-specific anti-human leukocyte antigen antibodies (dnDSAs) after kidney transplantation is independently associated with poor long-term allograft outcomes. The objective of the present study was to evaluate the predictive value of a flow cytometry crossmatch (FC-XM) assay performed after the appearance of dnDSAs for antibody-mediated allograft rejection (ABMR) after kidney transplantation. Materials and methods: A total of 89 recipients with dnDSAs after transplantation were included. The crossmatch results were compared with the dnDSA profile (mean fluorescence intensity [MFI], complement-binding activity, and IgG subclass profile) and with morphologic biopsy features. Results: Of the 89 patients, 59 (66%) were positive in the FC-XM assay, 17 (19%) had complement-binding DSAs, 55 (62%) were positive for IgG1 and/or IgG3 in a solid phase assay, and 45 (51%) had morphologic biopsy features linked to ABMR. Conclusion: The FC-XM assay was unable to discriminate between cases with and without ABMR on biopsy; it had a low positive predictive value (<70%) and a low negative predictive value (<42.9%), taking into account the sensitivity of our assay (limit of detection: DSAs with an MFI >3000). In this context, the dnDSA MFI level alone might be enough to provide a high positive predictive value for ABMR, and additional testing for complement-binding activity can remain optional.
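The positive and negative predictive values quoted above come from a 2x2 cross-tabulation of FC-XM result against biopsy-proven ABMR; the sketch below shows that calculation on hypothetical counts, not the study's data.

```python
# PPV/NPV (plus sensitivity/specificity for context) from a 2x2 table of
# FC-XM result vs biopsy-proven ABMR. Counts are hypothetical, chosen only
# to illustrate the arithmetic.
tp = 30   # FC-XM positive, ABMR on biopsy
fp = 18   # FC-XM positive, no ABMR
fn = 10   # FC-XM negative, ABMR on biopsy
tn = 12   # FC-XM negative, no ABMR

ppv = tp / (tp + fp)
npv = tn / (tn + fn)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"PPV {ppv:.1%}, NPV {npv:.1%}, "
      f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```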

14.
Transplantation Proceedings, 2021, 53(10): 2841-2852
Background: Indiana University performed its first kidney transplant in 1964; its immunosuppression protocol was steroid-based until a steroid-free protocol was adopted in 2004. We describe clinical outcomes in patients managed with an early steroid withdrawal (ESW) protocol (5 days) compared with a historical cohort (HC) on chronic steroid-based immunosuppression. Methods: We performed a retrospective study of kidney transplant recipients at the Indiana University program transplanted between 1993 and 2003 (HC, n = 1689) and between 2005 and 2016 (ESW cohort, n = 2097), with a median follow-up of 10.5 years and 6.1 years, respectively. Primary outcomes were patient and death-censored graft survival at 1, 3, and 5 years in both study cohorts. Secondary outcomes were 1-year rates of biopsy-proven acute rejection; graft function at 1, 3, and 5 years; and risk of posttransplant infection (BK virus and cytomegalovirus) in the ESW cohort. Cox proportional hazards models and Kaplan-Meier estimates were used to estimate survival probabilities. Fisher exact tests were used to compare episodes of acute rejection in the ESW cohort. Results: No difference was observed in patient survival between the ESW and HC cohorts (P = .13). Death-censored graft survival was significantly worse in the HC than in the ESW cohort (5-year: 86.4% vs 90.6%, log-rank P < .001). One-year acute rejection, reported for the ESW cohort only, was 15.7% and was significantly higher in Black patients and younger patients (P < .05). Conclusions: In this sizeable single-center cohort study with significant ethnic diversity, ESW is a viable alternative to steroid-based immunosuppression in kidney transplant recipients.
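As an illustration of the survival methods named above (Kaplan–Meier estimates of death-censored graft survival with a log-rank comparison), here is a sketch using the lifelines package; cohort labels, column names, and the handful of records are invented for the example.

```python
# Kaplan-Meier estimates of death-censored graft survival per cohort and a
# log-rank comparison, using lifelines. The tiny dataset is invented.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "years": [1.2, 5.0, 3.4, 5.0, 0.8, 5.0, 2.1, 5.0],  # follow-up, capped at 5 y
    "graft_lost": [1, 0, 1, 0, 1, 0, 0, 0],             # death-censored event flag
    "cohort": ["HC", "HC", "HC", "HC", "ESW", "ESW", "ESW", "ESW"],
})

for name, grp in df.groupby("cohort"):
    km = KaplanMeierFitter().fit(grp["years"], grp["graft_lost"], label=name)
    print(name, "5-year graft survival:", round(float(km.predict(5.0)), 3))

hc, esw = df[df.cohort == "HC"], df[df.cohort == "ESW"]
result = logrank_test(hc["years"], esw["years"], hc["graft_lost"], esw["graft_lost"])
print("log-rank p =", result.p_value)
```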

15.
Introduction: De novo donor-specific antibodies (DSAs) increase the risk of chronic lung allograft dysfunction (CLAD) in lung transplant recipients (LTRs). Both carfilzomib (CFZ) and rituximab (RTX) lower the mean fluorescence intensity (MFI) of DSAs, but comparative data are lacking. We compared CLAD-free survival and the degree and duration of DSA depletion after treatment of LTRs with CFZ or RTX. Methods: LTRs who received CFZ or RTX for DSA depletion between 08/01/2015 and 08/31/2020 were included. The primary outcome was CLAD-free survival. Secondary outcomes were the change in MFI at the corresponding loci within 6 months of treatment (ΔMFI), time to DSA rebound, and change in % predicted FEV1 6 months after treatment (ΔFEV1). Results: Forty-four LTRs were identified, 7 of whom had ≥2 drug events; therefore, 53 drug events were divided into 2 groups, CFZ (n = 17) and RTX (n = 36). Use of plasmapheresis, immunoglobulin, and mycophenolate augmentation was equivalent in both groups. CLAD-free survival with a single RTX event was superior to that after ≥2 drug events (p = 0.001) but comparable to that with a single CFZ event (p = 0.399). Both drugs significantly lowered the MFI at the DQ locus, and the median ΔMFI was comparable. Compared with the RTX group, the CFZ group had a shorter median interval to DSA rebound (p = 0.015) and a lower ΔFEV1 at 6 months (p = 0.014). Conclusion: Although both CFZ and RTX reduced the MFI of circulating DSAs, RTX prolonged the time to DSA rebound. Despite a more pronounced improvement in FEV1 with RTX, comparable CLAD-free survival between the 2 groups suggests that both drugs offer a reasonable treatment strategy for DSAs in LTRs.

16.
Background: Donor-specific HLA antibodies are important risk factors for antibody-mediated rejection and graft loss after renal transplantation and are associated with higher rejection rates and lower graft survival. Most de novo donor-specific antibodies (dnDSA) after renal transplantation are directed toward donor HLA-DQ antigens. An HLA-DQ antigen is a heterodimer consisting of an alpha and a beta chain. Traditionally, HLA-DQA1 typing has not been part of the pretransplant evaluation, so DQ alpha proteins are not usually considered in the interpretation of HLA-DQ antibody reactions. Methods: The renal transplant recipient in this case had a panel reactive antibody of 0% pretransplant. Two years after transplantation, he developed abdominal distension and bilateral lower extremity edema. Histopathologic findings on renal allograft biopsy showed a combination of type IIA acute T-cell-mediated rejection and antibody-mediated rejection with a trend toward chronicity in the transplanted kidney. DSAs were investigated by HLA class I (HLA-A/B) and HLA class II (HLA-DRB1/DQA1/DQB1) single antigen bead (SAB) assays. HLA typing by PCR-SSO and sequencing-based typing (SBT) was performed to explain the antibody reactivity patterns, and HLAMatchmaker analysis was performed to identify the responsible eplets. Results: HLA class II SAB analysis of the patient's serum at the time of rejection showed positive reactions with high mean fluorescence intensity (MFI) against all DQB1*03:03-carrying beads; however, DQB1*03:03 was not a dnDSA antigen. High-resolution HLA typing revealed that HLA-DQA1*05:01 and DQA1*03:02 were mismatched donor antigens. HLAMatchmaker analysis demonstrated reactivity toward the 130R and 116V eplets on DQA1 and DQB1. Conclusions: This case highlights antibodies specific to DQ alpha chains after renal transplantation.

17.
The success of direct-acting antiviral (DAA) therapy has led to near-universal cure for patients chronically infected with hepatitis C virus (HCV) and has improved post-liver transplant (LT) outcomes. We investigated the trends and outcomes of retransplantation in HCV and non-HCV patients before and after the introduction of DAAs. Adult patients who underwent re-LT were identified in the Organ Procurement and Transplantation Network/United Network for Organ Sharing database. Multiorgan transplants and patients with >2 total LTs were excluded. Two eras were defined: pre-DAA (2009-2012) and post-DAA (2014-2017). A total of 2112 re-LT patients were eligible (HCV: n = 499 pre-DAA and n = 322 post-DAA; non-HCV: n = 547 pre-DAA and n = 744 post-DAA). HCV patients had improved graft and patient survival after re-LT in the post-DAA era. One-year graft survival was 69.8% pre-DAA and 83.8% post-DAA (P < .001). One-year patient survival was 73.1% pre-DAA and 86.2% post-DAA (P < .001). Graft and patient survival were similar between eras for non-HCV patients. In adjusted analyses, the post-DAA era was an independent predictor of improved graft and patient survival (hazard ratio [HR]: 0.67; P = .005, and HR: 0.65; P = .004, respectively) only in HCV patients. This positive post-DAA era effect was observed only in HCV patients whose first graft was lost to disease recurrence (HR: 0.31; P = .002 and HR: 0.32; P = .003, respectively). Among HCV patients, receiving a re-LT in the post-DAA era was associated with improved patient and graft survival.

18.

Background

Efforts to improve long-term patient and allograft survival have included use of induction therapies as well as steroid and/or calcineurin inhibitor (CNI) avoidance/minimization.

Methods

This is a retrospective review of kidney transplant recipients transplanted between September 2004 and July 2009. The immune-minimization group (group 1; n = 182) received alemtuzumab induction, low-dose CNI, and mycophenolic acid (MPA). The conventional immunosuppression group (group 2; n = 232) received rabbit anti-thymocyte globulin, standard-dose CNI, MPA, and prednisone.

Results

Both groups were followed up for a similar length of time (49.4 ± 21.7 months; P = .12). Patient survival was also similar (90% vs 94%; P = .14). Death-censored graft survival was inferior in group 1 compared with group 2 (86% vs 96%, respectively; P = .003). On multivariate analysis, group 1 was an independent risk factor for graft loss (adjusted hazard ratio [aHR] = 2.63; 95% confidence interval [CI], 1.32–5.26; P = .006). Biopsy-proven acute rejection occurred more often in group 1 than in group 2, driven by late rejections (7% vs 2%; P < .01). Graft function was lower in group 1 than in group 2 from 3 months (49.5 mL/min vs 70.7 mL/min, respectively; P < .001) through 48 months (48.6 mL/min vs 69.4 mL/min, respectively; P = .04).

Conclusion

Minimization of maintenance immunosuppression after alemtuzumab induction was associated with higher acute rejection rates and inferior graft survival compared with thymoglobulin induction and conventional triple immunosuppression.

19.
Background: The presence of intimal arteritis (v) in renal allograft biopsy specimens establishes the diagnosis of acute T-cell-mediated rejection (TCMR), grade IIa-III, according to the Banff classification of rejection. The clinical significance of isolated v1 lesions, characterized by arteritis alone, compared with lesions of arteritis with tubulointerstitial inflammation (i-t-v) has been controversial. Methods: We performed a retrospective review of 280 patients undergoing renal transplantation between 2005 and 2015 who received a "for cause" transplant biopsy, graded using the Banff 2013 classification. Patients with TCMR grade IIa (n = 83) were subdivided into groups with isolated v1 arteritis and with i-t-v. Pre- and postoperative renal function, graft survival, and overall survival were evaluated in all patients. Results: Donor and recipient demographics were similar between groups. One month following treatment of rejection, patients with isolated v1 disease had superior recovery of glomerular filtration rate vs patients with i-t-v (P < .002). At a median follow-up of 41 months from transplantation, death-censored graft survival was 92% vs 79% (P = .04), and overall survival was 98% vs 79% (P < .004) in the isolated v1 and i-t-v groups, respectively. Conclusion: Despite an identical Banff classification of TCMR IIa, our results indicate that graft survival in patients with isolated v1 rejection is superior to that in patients with i-t-v. Following corroboration with data from other centers, modification of the Banff classification scheme should be considered.

20.
Transplantation Proceedings, 2021, 53(7): 2188-2196
Introduction: Matching for HLA-DQB1 molecules and anti-DQ donor-specific antibodies (DSAs) has been little studied in the allocation of deceased donor transplants in developed countries. The aim of this study was to evaluate clinical outcomes (allograft function, loss, and survival) in 519 kidney transplant recipients, with emphasis on the effects of HLA-DQB1 DSAs, at a minimum of 10 years' follow-up. Methods: Five hundred nineteen kidney transplant patients were allocated into 3 groups (G) by immunologic profile, namely G1 (SPI-SAB HLA-DQ negative [DQ−]), G2 (SPI-SAB HLA-DQ positive, DSA negative [DQ+/DSA−]), and G3 (SPI-SAB HLA-DQ DSA positive [DQ+/DSA+]), and outcomes were reported up to 10 years after transplantation. Results: The proportion of rejection episodes was higher in G3 (25.0% and 26.32%, respectively) than in G1 (8.63% and 6.82%, respectively) and G2 (10.0% and 0%, respectively; P = .047 and P = .014, respectively). In G3, 3 patients lost their grafts to antibody-mediated rejection. Among recipients of deceased donor kidneys, G3 showed worse graft survival rates than G1 (P = .001), and patients in G3 had a 2.18-fold higher risk of graft loss than patients in G1 (P = .028). Conclusion: Allograft function was worse in G3 than in G2 or G1. Graft losses from T-cell-mediated rejection were more frequent in G1, whereas graft losses from antibody-mediated rejection were similar in G1 and G3, attributable to HLA class I (A1, A11, B8, B52) and HLA class II (DR7 and DQ2, 5, 9) DSAs, respectively. Allograft survival was decreased in patients with HLA-DQB1 DSAs. The risk of graft loss was 1.75-fold that of patients who received transplants from living donors.
