Similar Literature
A total of 20 similar articles were retrieved (search time: 31 ms)
1.
Preventing the conversion of donor-specific anti-HLA antibodies (DSAs) from IgM to IgG could be a way to prevent chronic rejection. We evaluated whether belatacept-treated patients (belatacept less-intensive [LI] or more-intensive [MI] regimens) have a lower rate of conversion than do cyclosporine A (CsA)-treated patients. We included 330 HLA-mismatched patients from 2 phase 3 trials with either (a) complete donor/recipient HLA-A, -B, -DR, and -DQ loci typing or (b) incomplete HLA typing with IgG DSAs detected pretransplant or posttransplant. IgM and IgG DSAs were tested with single antigen beads at 0, 6, 12, 24, and 36 months posttransplant. The overall (preexisting or de novo) rates of IgM- and IgG-positive DSAs were 29% and 34%, respectively. The pretransplant IgM and IgG DSA-positive frequencies were similar between treatment groups. The IgG-positive dnDSA rate was significantly higher in the CsA-treated group (34%) compared with the belatacept-LI (8%) and belatacept-MI (11%) groups (P < .001). In IgM-positive dnDSA patients, the rate of conversion to IgG-positive dnDSA was 2.8 times higher in the CsA group than in the combined belatacept groups (P = .006). However, the observed association between belatacept treatment and more limited conversion of IgM to IgG dnDSAs was based on a limited number of patients and requires further validation.

2.
Gender differences regarding antibody-mediated rejection (AMR) after heart transplantation have been described. However, no study has accounted for the presence of preformed donor-specific antibodies (pfDSA), a known risk factor for AMR that is more common among women than men. In a single-institution 6-year cohort (2010-2015), time to AMR was assessed, comparing men with women by survival analysis with a 1-year death-censored follow-up. All AMRs were biopsy proven. Confounding variables that were accounted for included the mean fluorescence intensity (MFI) of pfDSA, recipient age, and HLA, size, and sex mismatch. A total of 463 patients were included. The overall incidence of AMR was 10.3% at 1 year. After adjusting for confounding variables, independent risk factors for AMR were female recipient gender (adjusted hazard ratio [adj. HR] = 1.78 [1.06-2.99], P = .03) and the presence of pfDSA (adj. HR = 3.20 [1.80-5.70], P < .001). This association remained significant when considering pfDSA by their MFI: female recipient gender had an adj. HR = 2.2 (P = .026), and MFI of pfDSA (per 1-MFI increase) an adj. HR = 1.0002 (P < .0001). In this cohort, women were at higher risk of AMR than men, and this increase in risk was additive to that of pfDSA. These findings may suggest a gender-related difference in the severity of pfDSA.
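To put the per-unit hazard ratio above in perspective, the short sketch below (illustrative only, assuming the multiplicative scaling of a standard Cox proportional hazards model and using hypothetical MFI differences, not study data) shows how an adj. HR of 1.0002 per 1-MFI increase compounds over larger MFI differences.

```python
# Illustrative sketch: how a per-1-MFI hazard ratio of 1.0002 compounds
# under the multiplicative (proportional hazards) assumption.
# The MFI differences below are hypothetical examples, not study data.
per_unit_hr = 1.0002

for delta_mfi in (1000, 5000, 10000):
    cumulative_hr = per_unit_hr ** delta_mfi
    print(f"MFI difference {delta_mfi:>6}: cumulative HR ~ {cumulative_hr:.2f}")

# Under this assumption, a 5000-MFI difference corresponds to a roughly
# 2.7-fold hazard, which is comparable in size to the other reported risk factors.
```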

3.
Anti-denatured HLA-Cw antibodies are highly prevalent, whereas anti-native HLA-Cw antibodies seem to lead to random flow cytometry crossmatch results. We aimed to reassess crossmatch prediction for anti-HLA-Cw using 2 types of single antigen flow beads (classical beads and beads with diminished expression of denatured HLA), and to compare the pathogenicity of preformed anti-denatured and anti-native HLA-Cw antibodies in kidney transplantation. We performed 135 crossmatches with sera reacting against donor HLA-Cw (classical bead fluorescence ≥500); only 20.6% were positive. Forty-three (31.6%) were anti-denatured HLA antibodies (fluorescence <300 on beads with diminished expression of denatured HLA); all were crossmatch negative. The correlation between classical bead fluorescence and the crossmatch ratio was low (ρ = 0.178) and only slightly higher with beads with diminished expression of denatured HLA (ρ = 0.289). We studied 52 kidney recipients with preformed anti-HLA-Cw donor-specific antibodies. Those with anti-native HLA antibodies experienced more acute and chronic antibody-mediated rejections (P = .006 and .03, respectively) and displayed lower graft survival (P = .04). Patients with anti-native HLA-Cw antibodies more frequently had previous sensitizing events (P < .000001) or an antibody profile that was plausible according to known anti-native HLA-Cw eplets (P = .0001). Anti-native, but not anti-denatured, HLA-Cw antibodies are deleterious, which underscores the need for reagents with diminished expression of denatured HLA.

4.
Antibody-mediated rejection (AMR) driven by the development of donor-specific antibodies (DSA) directed against mismatched donor human leukocyte antigens (HLA) is a major risk factor for graft loss in cardiac transplantation. Recently, the relevance of non-HLA antibodies has become more prominent, as AMR can be diagnosed in the absence of circulating DSA. Here, we assessed a single-center cohort of 64 orthotopic heart transplant recipients transplanted between 1994 and 2014. Sera collected from patients with ≥ pAMR1 (n = 43) and non-AMR patients (n = 21) were tested for reactivity against a panel of 44 non-HLA autoantigens. The AMR group had a significantly greater percentage of patients with elevated reactivity to autoantigens compared to the non-AMR group (P = .002) and healthy controls (n = 94, P < .0001). DSA-positive AMR patients exhibited greater reactivity to autoantigens compared to DSA-negative patients (P < .0001), and AMR patients with DSA and PRA > 10% were identified as the subgroup with significantly elevated responses. Reactivity to 4 antigens, vimentin, beta-tubulin, lamin A/C, and apolipoprotein L2, was significantly different between AMR and non-AMR patients. Moreover, increased reactivity to these antigens was associated with graft failure. These results suggest that antibodies to non-HLA antigens are associated with DSA-positive AMR, although their specific role in mediating allograft injury is not yet understood.

5.
Graft survival seems to be worse in HLA-incompatible (HLAi, i.e., positive-crossmatch) than in ABO-incompatible (ABOi) transplantation. However, it is not entirely clear why these differences exist. Sixty-nine ABOi, 27 HLAi, and 10 combined ABOi+HLAi patients were included in this retrospective study to determine whether the frequency, severity, and outcome of active antibody-mediated rejection (AMR) were different. Five-year death-censored graft survival was better in ABOi than in HLAi and ABOi+HLAi patients (99%, 69% and 64%, respectively, P = 0.0002). Features of AMR were found in 38%, 95% and 100% of the ABOi, HLAi and ABOi+HLAi patients who had a biopsy, respectively (P = 0.0001 and P = 0.001). After active AMR, a declining eGFR and graft loss were observed more frequently in HLAi and HLAi+ABOi than in ABOi patients. The poorer prognosis after AMR in HLAi and ABOi+HLAi transplantations was not explained by a higher severity of histological lesions or by a less aggressive treatment. In conclusion, ABOi transplantation offers better results than HLAi transplantation, partly because AMR occurs less frequently but also because the outcome after AMR is distinctly better. HLAi and combined ABOi+HLAi transplantations appear to have the same outcome, suggesting there is no synergistic effect between anti-A/B and anti-HLA antibodies.

6.
Graft failure and survival are the major concerns for patients with aplastic anemia undergoing hematopoietic stem cell transplantation (HSCT). Previous studies showed that anti-HLA antibodies negatively impact engraftment in HSCT. This retrospective study of 51 pediatric patients with acquired aplastic anemia who underwent allogeneic HSCT at a single institution between 2006 and 2012 investigated the influence of anti-HLA antibodies on the outcome of HSCT. Serum samples collected before HSCT were tested for the presence of anti-HLA antibodies. Pre-existing anti-HLA antibodies were detected in 54.9% (28/51) of patients, among whom 39.2% (20/51) had anti-HLA class I antibodies. Anti-HLA antibodies were associated with worse five-yr survival (78.6% vs. 100%, p = 0.021) and higher treatment-related mortality (21.4% vs. 0%, p = 0.028) compared with antibody-negative patients. Anti-HLA class I antibody-positive patients had poorer five-yr survival (75.0%) than anti-HLA class I&II antibody-positive and antibody-negative patients (87.5% and 100.0%, respectively, p = 0.039). Presence of anti-HLA class I antibodies (p = 0.024) and older age (10 yr or more; p = 0.027) significantly increased the risk of post-HSCT mortality. Pre-existing anti-HLA antibodies negatively affect the outcome of HSCT in pediatric patients with aplastic anemia. Routine testing for anti-HLA antibodies, together with effective treatment, should be conducted prior to HSCT.

7.
There is increasing evidence that de novo anti-HLA antibodies, and more specifically de novo donor-specific antibodies (DSA), following solid organ transplantation may be associated with negative outcomes, including rejection in the first year and graft loss. Limited data are available in pediatric heart transplant recipients. We sought to prospectively determine the incidence, class, and early impact of de novo anti-HLA antibodies in a cohort of pediatric heart transplant recipients. Serial panel reactive antibody testing posttransplant was performed in 25 patients (14 males) transplanted between January 2008 and June 2010. Five patients were sensitized pretransplant; all patients had a negative direct crossmatch. Seventy-two percent developed de novo anti-HLA antibodies at a median of 2.6 weeks (IQR 1.2 weeks to 6.2 months) posttransplant; 67% of these were DSA. The majority of recipients in our cohort developed de novo anti-HLA antibodies within the first year posttransplant, with two-thirds being donor-specific. Acute cellular rejection, though frequent, did not differ in patients with antibody development regardless of class or specificity, and there was no antibody-mediated rejection, graft loss, or early cardiac allograft vasculopathy.

8.
The recent recognition of complex and chronic phenotypes of T cell-mediated rejection (TCMR) has fostered the need to better evaluate the response of acute TCMR, a condition previously considered to lack relevant consequences for allograft survival, to the standard of care. In a prospective cohort of kidney recipients (n = 256) with biopsy-proven acute TCMR receiving corticosteroids, we investigated clinical, histological, and immunological phenotypes at the time of acute TCMR diagnosis and 3 months posttreatment. Independent posttreatment determinants of allograft loss included the glomerular filtration rate (GFR) (HR = 0.94; 95% CI = 0.92-0.96; P < .001), proteinuria (HR = 1.40; 95% CI = 1.10-1.79; P = .007), time since transplantation (HR = 1.02; 95% CI = 1.00-1.03; P = .016), peritubular capillaritis (HR = 2.27; 95% CI = 1.13-4.55; P = .022), interstitial inflammation in sclerotic cortical parenchyma (i-IF/TA) (HR = 1.87; 95% CI = 1.08-3.25; P = .025), and donor-specific anti-HLA antibodies (DSAs) (HR = 2.67; 95% CI = 1.46-4.88; P = .001). Prognostic value was improved using a composite evaluation of response to treatment versus clinical parameters only (cNRI = 0.68; 95% CI = 0.41-0.95; P < .001). A classification tree for allograft loss identified five patterns of response to treatment based on the posttreatment GFR, i-IF/TA, and anti-HLA DSAs (cross-validated accuracy = 0.80). Compared with responders (n = 155, 60.5%), nonresponders (n = 101, 39.5%) had a higher incidence of de novo DSAs, antibody-mediated rejection, and allograft loss at 10 years (P < .001 for all comparisons). Thus, clinical, histological, and immunological assessment of the response to treatment of acute TCMR revealed distinct profiles with distinct outcomes.

9.
Clinical Trials in Organ Transplantation-18 (CTOT-18) is a follow-up analysis of the 200-subject multicenter heart transplant CTOT-05 cohort. CTOT-18 aimed to identify clinical, epidemiologic, and biologic markers associated with adverse clinical events beyond 1 year posttransplantation. We examined various candidate biomarkers, including serum antibodies, angiogenic proteins, blood gene expression profiles, and T cell alloreactivity. The composite endpoint (CE) included death, retransplantation, coronary stent placement, myocardial infarction, and cardiac allograft vasculopathy. The mean follow-up was 4.5 ± 1.1 (SD) years. Subjects with serum anti-cardiac myosin (CM) antibody detected at transplantation and at 12 months had a higher risk of meeting the CE compared to those without anti-CM antibody (hazard ratio [HR] = 2.9, P = .046). Pretransplant plasma VEGF-A and VEGF-C levels were associated with the CE (odds ratio [OR] = 13.24, P = .029; and OR = 0.13, P = .037, respectively). Neither early intravascular ultrasound findings nor the other candidate biomarkers were associated with the study outcomes. In conclusion, anti-CM antibody and plasma levels of VEGF-A and VEGF-C were associated with an increased risk of adverse events. Although this multicenter report supports further evaluation of the mechanisms through which anti-CM antibody and plasma angiogenesis proteins lead to allograft injury, we could not identify additional markers of adverse events or potential novel therapeutic targets.

10.
Cytokine expression profiling revealed that IL-1β is highly upregulated in the rejecting skin of limb allografts. We investigated the effect of intragraft treatment with a neutralizing IL-1β antibody in limb transplantation. Following allogeneic hind-limb transplantation, Lewis rats were either left untreated (group 1), treated with anti-lymphocyte serum + tacrolimus (baseline, group 2), or given baseline immunosuppression plus anti-IL-1β (1 mg/kg once per week, 6-8 subcutaneous injections) into the transplanted (group 3) or contralateral (group 4) limb. The endpoint was grade III rejection or day 100. Graft rejection was assessed by histology, immunohistochemistry, flow cytometry phenotyping of immune cells, and monitoring of cytokine expression. Anti-IL-1β injections into the allograft or the contralateral limb resulted in a significant delay of rejection onset (controls: 58.60 ± 0.60; group 3: 75.80 ± 10.87, P = .044; group 4: 73.00 ± 6.49, P = .008) and prolongation of graft survival (controls: 64.60 ± 0.87; group 3: 86.60 ± 5.33, P = .002; group 4: 93.20 ± 3.82, P = .002), compared to controls. Although the phenotype of the graft-infiltrating immune cells did not differ between groups, significantly decreased skin protein levels of IL-1β, IL-4, IL-13, IP-10, MCP-1, and MCP-3 in long-term survivors indicate an overall decrease in chemoattraction and infiltration of immune cells as the immunosuppressive mechanism of anti-IL-1β. Inhibition of IL-1β together with short-term systemic immunosuppression prolongs limb allograft survival and represents a promising target for immunosuppression in extremity transplantation.

11.
The diagnostic criteria for antibody-mediated rejection (ABMR) after small bowel transplantation (SBT) are not clearly defined, although the presence of donor-specific antibodies (DSAs) has been reported to be deleterious for graft survival. We aimed to determine the incidence and prognostic value of DSAs and C4d in pediatric SBT and to identify the histopathologic features associated with C4d positivity. We studied all intestinal biopsies (IBx) obtained in the first year posttransplantation (N = 345) in a prospective cohort of 23 children. DSAs and their capacity to fix C1q were identified by using Luminex technology. Eighteen patients (78%) had DSAs, and 9 had the capacity to fix C1q. Seventy-eight IBx (22.6%) were C4d positive. The independent determinants of C4d positivity were capillaritis grades 2 and 3 (odds ratio [OR] 4.02, P = .047 and OR 5.17, P = .003, respectively), mucosal erosion/ulceration (OR 2.8, P = .019), lamina propria inflammation grades 1 and 2/3 (OR 1.95, P = .043 and OR 3.1, P = .016, respectively), and chorion edema (OR 2.16, P = .028). Complement-fixing DSAs and repeated C4d-positive IBx were associated with poor outcome (P = .021 and P = .001, respectively). Our results support that capillaritis should be considered as a feature of ABMR in SBT and identify C1q-fixing DSAs and repeated C4d positivity as potential markers of poor outcome.

12.
Lung transplant recipients are at increased risk of complicated diverticular disease. We aimed to assess the rate of diverticular surgery in a post-lung transplantation population and to identify risk factors for surgery. We performed a retrospective cohort study of lung transplant recipients from 2007 to 2011. Demographic variables were evaluated with the Mann-Whitney U and chi-squared tests. Cox regression was performed to evaluate 1- and 2-year landmark survival, assess predictor variables of diverticular surgery, and evaluate the impact of surgery on the development of chronic lung allograft dysfunction (CLAD). Seventeen of 158 patients (10.7%) underwent diverticular-related surgery. Surgical patients had significantly worse survival than nonsurgical patients in the 1-year [aHR 2.93 (1.05-8.21), P = 0.041] and 2-year [aHR 4.17 (1.26-13.84), P = 0.020] landmark analyses. Transplant indications of alpha-1 antitrypsin deficiency (A1AT) and cystic fibrosis (CF) were significantly associated with the need for diverticular surgery. Emergent surgery was associated with poorer survival [aHR 5.12 (1.00-26.27), P = 0.050]. Lung transplant patients requiring surgery for complicated diverticular disease have significantly poorer survival than those who do not require surgery. Surgery was more common in patients transplanted for A1AT and CF. Optimal assessment and risk stratification of diverticular disease are necessary to prevent excessive morbidity and mortality following transplantation.

13.
In 2005, the Lung Allocation Score (LAS) was implemented as the allocation system for lungs in the US. We sought to compare 5-year lung transplant outcomes before and after the institution of the LAS. Between 2000 and 2011, 501 adult patients were identified, with 132 from January 2000 to April 2005 (pre-LAS era) and 369 from May 2005 to December 2011 (post-LAS era). The Kruskal-Wallis or chi-squared test was used to determine significance between groups. Survival was censored at 5 years. Overall, the post-LAS era was associated with more restrictive lung disease, higher LAS values, shorter wait-list times, more preoperative immunosuppression, and more single lung transplantation. In addition, post-LAS patients had higher O2 requirements with greater preoperative pulmonary impairment. Postoperatively, 30-day mortality improved in the post-LAS era (1.6% vs 5.3%, P = .048). During the pre- and post-LAS eras, 5-year survival was 52.3% and 55.3%, respectively (P = .414). The adjusted risk of mortality was not different in the post-LAS era (P = .139). Freedom from chronic lung allograft dysfunction (CLAD) was significantly higher in the post-LAS era (P = .002). In this single-center report, implementation of the LAS has led to allocation to sicker patients without decrement in short- or medium-term outcomes. Freedom from CLAD at 5 years is improving after LAS implementation.

14.
Recent OPTN proposals to address geographic disparity in liver allocation have involved circular boundaries: the policy selected in December 2017 allocated livers to 150-mile circles in addition to DSAs/regions, and the policy selected in December 2018 allocated livers to 150-mile circles, eliminating DSA/region boundaries. However, methods to reduce geographic disparity remain controversial within the OPTN and the transplant community. To inform ongoing discussions, we studied center-level supply/demand ratios using SRTR data (07/2013-06/2017) for 27,334 transplanted deceased donor livers and 44,652 incident waitlist candidates. Supply was the number of donors from an allocation unit (DSA or circle), allocated proportionally (by waitlist size) to the centers drawing on these donors. We measured geographic disparity as the variance in the log-transformed supply/demand ratio, comparing allocation based on DSAs, fixed-distance circles (150- or 400-mile radius), and fixed-population (12- or 50-million) circles. The recently proposed 150-mile-radius circles (variance = 0.11, P = .9) or 12-million-population circles (variance = 0.08, P = .1) did not reduce geographic disparity compared to DSA-based allocation (variance = 0.11). However, geographic disparity decreased substantially to 0.02 with both larger fixed-distance (400-mile, P < .001) and larger fixed-population (50-million, P < .001) circles (P = .9 comparing fixed distance and fixed population). For allocation circles to reduce geographic disparities, they must be larger than a 150-mile radius; additionally, fixed-population circles are not superior to fixed-distance circles.
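The disparity metric described in this abstract lends itself to a short illustration. The sketch below is a minimal, hypothetical example (made-up center names, waitlist sizes, and donor counts, not SRTR data): it allocates each unit's donors to its centers in proportion to waitlist size and then reports geographic disparity as the variance of the log-transformed supply/demand ratio.

```python
import math
from statistics import pvariance

# Hypothetical centers with their incident waitlist sizes (demand).
# All names and numbers are invented for illustration only.
waitlist = {"center_1": 400, "center_2": 250, "center_3": 350}

# Hypothetical allocation units (DSAs or circles): each has a donor count
# and a list of the centers that draw on its donors.
units = {
    "unit_A": {"donors": 300, "centers": ["center_1", "center_2"]},
    "unit_B": {"donors": 120, "centers": ["center_2", "center_3"]},
}

# Supply: allocate each unit's donors to its centers proportionally to
# waitlist size, following the definition given in the abstract.
supply = {center: 0.0 for center in waitlist}
for unit in units.values():
    total_waitlist = sum(waitlist[c] for c in unit["centers"])
    for c in unit["centers"]:
        supply[c] += unit["donors"] * waitlist[c] / total_waitlist

# Geographic disparity: variance of the log-transformed supply/demand ratio.
log_ratios = [math.log(supply[c] / waitlist[c]) for c in waitlist]
disparity = pvariance(log_ratios)
print(f"Geographic disparity (variance of log supply/demand): {disparity:.3f}")
```

Larger circles pool more donors across centers, which pushes the center-level supply/demand ratios toward a common value and thus shrinks this variance, which is the intuition behind the abstract's finding that only the larger circles reduced disparity.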

15.
Sensitization is common in pediatric heart transplant candidates, and waitlist mortality is high. Transplantation across a positive crossmatch may reduce wait time but is considered high risk. We prospectively recruited consecutive candidates at eight North American centers. At transplantation, subjects were categorized as nonsensitized or sensitized (presence of ≥1 HLA antibody with MFI ≥1000 using single antigen beads). Sensitized subjects were further classified as complement-dependent cytotoxicity crossmatch (CDC-crossmatch) positive or negative and as donor-specific antibody (DSA) positive or negative. Immunosuppression was standardized. CDC-crossmatch-positive subjects also received perioperative antibody removal, maintenance corticosteroids, and intravenous immunoglobulin. The primary endpoint was the 1-year incidence rate of a composite of death, retransplantation, or rejection with hemodynamic compromise. A total of 317 subjects were screened, 290 were enrolled, and 240 were transplanted (51 with pretransplant DSA, 11 with a positive CDC-crossmatch). The incidence rates of the primary endpoint did not differ statistically between groups: nonsensitized 6.7% (CI: 2.7%, 13.3%), sensitized crossmatch positive 18.2% (CI: 2.3%, 51.8%), sensitized crossmatch negative 10.7% (CI: 5.7%, 18.0%), P = .2354. The primary endpoint also did not differ by DSA status. Freedom from antibody-mediated and cellular rejection was lower in the crossmatch-positive group and/or in the presence of DSA. Follow-up will determine whether acceptable outcomes can be achieved long term.

16.
Prophylaxis of graft-versus-host disease (GVHD) after allogeneic hematopoietic stem cell transplantation (HCT) remains challenging. Because prospective randomized trials of in vivo T cell depletion using anti-T-lymphocyte globulin (ATLG) in addition to a calcineurin inhibitor and methotrexate (MTX) led to conflicting outcome results, we evaluated the impact of ATLG on clinical outcome, lymphocyte and immune reconstitution, and survival models. In total, 1500 consecutive patients with hematologic malignancies received matched unrelated donor (MUD) HCT with cyclosporine and MTX (N = 723, 48%) or with additional ATLG (N = 777, 52%). In the ATLG cohort, the incidences of grade III-IV acute GVHD (12% vs 23%) and extensive chronic GVHD (18% vs 34%) were significantly reduced (P < .0001). Nonrelapse mortality (27% vs 45%) and relapse (30% vs 22%) also differed significantly. Event-free and overall survival estimates at 10 years were 44% and 51% with ATLG and 33% and 35% without ATLG (P < .002 and P < .0001, respectively). A dose-dependent ATLG effect on lymphocyte and neutrophil reconstitution was observed. Lymphocyte counts at the time of ATLG exposure were associated with survival through a logarithmically increasing function. In this survival model, the optimum range of the lymphocyte count at exposure was between 0.4 and 1.45/nL (P = .001). This study supports additional ATLG immune prophylaxis and is the first to associate optimal lymphocyte counts with survival after MUD-HCT.

17.
In the phase II IM103-100 study, kidney transplant recipients were first randomized to belatacept more-intensive-based (n = 74), belatacept less-intensive-based (n = 71), or cyclosporine-based (n = 73) immunosuppression. At 3-6 months posttransplant, belatacept-treated patients were re-randomized to receive belatacept every 4 weeks (4-weekly, n = 62) or every 8 weeks (8-weekly, n = 60). Patients initially randomized to cyclosporine continued to receive cyclosporine-based immunosuppression. Cumulative rates of biopsy-proven acute rejection (BPAR) from first randomization to year 10 were 22.8%, 37.0%, and 25.8% for belatacept more-intensive, belatacept less-intensive, and cyclosporine, respectively (belatacept more-intensive vs cyclosporine: hazard ratio [HR] = 0.95; 95% confidence interval [CI] 0.47-1.92; P = .89; belatacept less-intensive vs cyclosporine: HR = 1.61; 95% CI 0.85-3.05; P = .15). Cumulative BPAR rates from second randomization to year 10 for belatacept 4-weekly, belatacept 8-weekly, and cyclosporine were 11.1%, 21.9%, and 13.9%, respectively (belatacept 4-weekly vs cyclosporine: HR = 1.06, 95% CI 0.35-3.17, P = .92; belatacept 8-weekly vs cyclosporine: HR = 2.00, 95% CI 0.75-5.35, P = .17). Renal function trends were estimated using a repeated-measures model. Estimated mean GFR values at year 10 for belatacept 4-weekly, belatacept 8-weekly, and cyclosporine were 67.0, 68.7, and 42.7 mL/min per 1.73 m2, respectively (P < .001 for overall treatment effect). Although not statistically significant, the rate of BPAR was 2-fold higher in patients administered belatacept every 8 weeks vs every 4 weeks.

18.
There is a paucity of data on long-term outcomes following visceral transplantation in the contemporary era. This is a single-center retrospective analysis of all visceral allograft recipients who underwent transplant between November 2003 and December 2013 with at least 3-year follow-up data. Clinical data from a prospectively maintained database were used to assess outcomes including patient and graft survival. Of 174 recipients, 90 were adults and 84 were pediatric patients. Types of visceral transplants were isolated intestinal transplant (56.3%), combined liver-intestinal transplant (25.3%), multivisceral transplant (16.1%), and modified multivisceral transplant (2.3%). Three-, 5-, and 10-year overall patient survival was 69.5%, 66%, and 63%, respectively, while 3-, 5-, and 10-year overall graft survival was 67%, 62%, and 61%, respectively. In multivariable analysis, significant predictors of survival included pediatric recipient (P = .001), donor/recipient weight ratio <0.9 (P = .008), no episodes of severe acute rejection (P = .021), cold ischemia time <8 hours (P = .014), and shorter hospital stay (P = .0001). In conclusion, visceral transplantation remains a good option for treatment of end-stage intestinal failure with parenteral nutritional complications. Proper graft selection, shorter cold ischemia time, and improvement of immunosuppression regimens could significantly improve the long-term survival.

19.
The purpose of this study was to examine whether postoperative anti-blood type antibody rebound is associated with kidney allograft rejection in ABO blood type-incompatible (ABO-I) living-related kidney transplantation (KTx). A total of 191 ABO-I recipients who received ABO-I living-related KTx between 2001 and 2013 were divided into two groups according to the levels of the rebounded anti-blood type antibodies within 1 year after transplantation: Group 1 consisted of patients with low rebound (≤1:32, N = 170), and Group 2 consisted of patients with high rebound (≥1:64, N = 21). No prophylactic treatment for rejection was administered for elevated anti-blood type antibodies, regardless of the levels of the rebounded antibodies. Within 1 year after transplantation, T-cell-mediated rejection was observed in 13 of 170 recipients (8%) in Group 1 and in 2 of 21 recipients (10%) in Group 2 (Group 1 vs. 2, P = 0.432). Antibody-mediated rejection was observed in 15 of 170 recipients (9%) and 2 of 21 recipients (10%) in Groups 1 and 2, respectively (P = 0.898). In this study, we found no correlation between postoperative anti-blood type antibody rebound and the incidence of acute rejection. We concluded that no treatment is necessary for rebounded anti-blood type antibodies.

20.
Epstein-Barr virus (EBV)-associated posttransplant lymphoproliferative disorder (EBV-PTLD) is a serious complication in lung transplant recipients (LTRs) and is associated with significant mortality. We performed a single-center retrospective study to evaluate the risks for PTLD in LTRs over a 7-year period. Of 611 evaluable LTRs, we identified 28 cases of PTLD, an incidence of 4.6%. Kaplan-Meier analysis showed decreased freedom from PTLD in idiopathic pulmonary fibrosis (IPF) LTRs (P < .02). Using a multivariable Cox proportional hazards model, we identified IPF (hazard ratio [HR] 3.51, 95% confidence interval [CI] 1.33-8.21, P = .01) and alemtuzumab induction therapy (HR 2.73, 95% CI 1.10-6.74, P = .03) as risk factors for PTLD, along with EBV mismatch (HR 34.43, 95% CI 15.57-76.09, P < .0001). Early PTLD (within the first year) was associated with alemtuzumab use (P = .04), whereas IPF was a predictor of late PTLD (after the first year) (P = .002), after controlling for age and sex. Kaplan-Meier analysis revealed a shorter time to death from PTLD in IPF LTRs compared to other patients (P = .04). The use of alemtuzumab in EBV-mismatched recipients was found to particularly increase PTLD risk. Together, our findings identify IPF LTRs as a population susceptible to PTLD. Further studies are required to understand the mechanisms driving PTLD in IPF LTRs and to develop strategies to mitigate risk.
