Similar Literature
20 similar documents found (search time: 15 ms).
1.
Background: Sensitized lung transplant recipients are at increased risk of developing donor-specific antibodies, which have been associated with acute and chronic rejection. Perioperative intravenous immune globulin has been used in sensitized individuals to down-regulate antibody production. Methods: We compared patients with a pre-transplant calculated panel reactive antibody ≥25% who did not receive preemptive immune globulin therapy with a historical control group that did. The cohort included 59 patients: 17 received no immune globulin therapy and 42 received it. Results: Donor-specific antibody development was numerically higher in the non-immune globulin group than in the immune globulin group (58.8% vs 33.3%; odds ratio 2.80, 95% confidence interval [0.77, 10.79], p = 0.13). Median time to antibody development was 9 days (Q1, Q3: 7, 19) in the non-immune globulin group and 28 days (Q1, Q3: 7, 58) in the immune globulin group. There was no significant difference between groups in the incidence of primary graft dysfunction at 72 h post-transplant, or in acute cellular rejection, antibody-mediated rejection, or chronic lung allograft dysfunction at 12 months. Conclusion: These findings are hypothesis-generating and emphasize the need for larger, randomized studies to determine the association of immune globulin therapy with clinical outcomes.
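As an illustration of the unadjusted effect estimate above, the sketch below computes an odds ratio with a Woolf (log-method) 95% confidence interval from a 2×2 table. The counts are reconstructed from the reported percentages (10/17 vs 14/42) and are an assumption; the published interval [0.77, 10.79] was presumably derived with a different (e.g., exact or adjusted) method.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: a/b = events/non-events in group 1, c/d = in group 2.
    Returns (OR, lower, upper) using the Woolf log method."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Counts inferred from the reported rates (assumed, not taken from the paper):
# non-immune globulin 10/17 with DSA; immune globulin 14/42 with DSA.
print(odds_ratio_ci(10, 7, 14, 28))  # ~ (2.86, 0.90, 9.11)
```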

2.
Background: There is growing evidence of the important role of non-human leukocyte antigen (HLA) antibodies in lung and heart transplant rejection. Because data on the prevalence and clinical significance of non-HLA antibodies in the Asian population are scarce, we analyzed non-HLA antibodies in heart and lung transplant patients. Methods: We used the Luminex method to measure non-HLA antibodies in patients who underwent heart transplantation (N = 28) or lung transplantation (N = 36) between 2016 and 2019, and evaluated the association between pre-existing non-HLA antibodies and acute rejection-free days in these recipients. Results: Of 64 patients, 27 (42.2%) experienced rejection: 26 (40.6%) acute cellular rejection and one (1.6%) acute antibody-mediated rejection. Among the 33 different non-HLA antibodies identified, only the anti-glutathione S-transferase theta-1 (GSTT1) antibody positivity rate was significantly higher in patients with acute rejection than in those without (14.8% vs 0%, p = 0.016). The anti-angiotensin II type 1 receptor antibody positivity rate did not differ significantly between the two groups (40% vs 18.5%, p = 0.129). In multivariate Cox regression analysis, anti-GSTT1 antibody-positive patients had a higher risk of acute allograft rejection (hazard ratio, 4.19; 95% confidence interval [CI], 1.41-12.49; p = 0.010). The Kaplan-Meier curve showed that anti-GSTT1 antibody-positive patients had fewer acute rejection-free days (χ2 = 7.892; p = 0.005). Additionally, patients who underwent platelet transfusion before transplantation (odds ratio, 1.49; 95% CI, 1.16-1.91; p = 0.002) were more likely to be anti-GSTT1 antibody-positive. Conclusion: Patients with antibodies against GSTT1 before heart or lung transplantation have an increased risk of acute rejection.
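The rejection-free-days comparison above rests on Kaplan-Meier estimates. A minimal sketch of how such a curve is built follows; the toy data are invented for illustration, and the log-rank statistic (χ2 = 7.892) itself is not computed here.

```python
def kaplan_meier(times, events):
    """times: follow-up in days; events: 1 = acute rejection, 0 = censored.
    Returns [(time, S(t))] points of the rejection-free survival curve."""
    pairs = sorted(zip(times, events))
    s, curve = 1.0, [(0, 1.0)]
    for t in sorted({t for t, e in pairs if e == 1}):  # distinct event times
        at_risk = sum(1 for tt, _ in pairs if tt >= t)
        d = sum(1 for tt, e in pairs if tt == t and e == 1)
        s *= 1 - d / at_risk  # product-limit step at each event time
        curve.append((t, s))
    return curve

# Toy cohort: rejections at days 30 and 90, four recipients censored
print(kaplan_meier([30, 90, 120, 200, 365, 365], [1, 1, 0, 0, 0, 0]))
```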

3.
Transplantation Proceedings 2023;55(7):1487-1494
Background: Potential organ donors have often suffered anoxic and/or traumatic brain injury, during which they may have experienced aspiration of gastric material (AGM). Evaluation of such donors typically includes a screening bronchoscopic examination during which determinations of aspiration are made. The efficacy of this visual screening and its relationship to post-transplant allograft function are unknown. Methods: Before procurement, bronchoscopy was performed on donors, during which bronchoalveolar lavage fluid (BALF) was collected and a visual inspection was made. As a marker of AGM, BALF specimens were analyzed for the presence of bile salts. Data collected on the corresponding recipients included primary graft dysfunction (PGD) score, post-transplant spirometry, acute rejection scores (ARS), and overall survival. Results: Of 31 donors evaluated, bronchoscopy revealed visual evidence of AGM in only 2, whereas BALF analysis for bile salts indicated AGM in 14. As such, screening bronchoscopy had a sensitivity of only 7.1%. Visual detection of AGM via bronchoscopy was not associated with any resulting grade of PGD (χ2 = 2.96, P = .23); however, AGM defined by detection of bile salts was (χ2 = 7.56, P = .02). Over the first post-transplant year, recipients of lungs with and without AGM experienced similar improvement in allograft function (χ2 = 1.63, P = .69), similar ARS (P = .69), and similar survival (P = .24). Conclusion: Visual inspection during a single bronchoscopic examination of lung donors underestimates the prevalence of AGM. The detection of bile salts in donor BALF is associated with early allograft dysfunction in the corresponding recipients but not with later allograft function, acute rejection responses, or 1-year post-transplant survival.
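The 7.1% figure is a standard screening-sensitivity calculation against the bile-salt reference. A minimal sketch, assuming one of the 14 bile-salt-positive donors was also flagged visually (1/14 ≈ 7.1%); the exact true-positive split is inferred, not stated in the abstract.

```python
def sensitivity(tp, fn):
    """Sensitivity = true positives / (true positives + false negatives)."""
    return tp / (tp + fn)

# Reference (bile salts) positive in 14 donors; visual inspection is assumed
# to have caught 1 of them, missing 13.
print(f"{sensitivity(1, 13):.1%}")  # 7.1%
```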

4.
Background: Induction therapy improves graft outcomes in kidney transplant recipients (KTRs). We aimed to compare the incidences of antibody-mediated rejection (AMR) and acute cellular rejection (ACR), as well as graft and patient outcomes, in KTRs who underwent induction with alemtuzumab versus rabbit antithymocyte globulin (r-ATG). Methods: This was a single-center retrospective study of patients who underwent kidney transplantation between January 2009 and December 2011 after receiving induction therapy with either alemtuzumab or r-ATG. Maintenance immunosuppression included tacrolimus and mycophenolate mofetil (MMF) with early steroid withdrawal. Acute rejection was diagnosed using allograft biopsy. Results: Among the 108 study patients, 68 received alemtuzumab and 40 received r-ATG. The incidence of AMR was significantly higher (15% vs 2.5%; P = .008) and the incidence of ACR similar (4.4% vs 10%; P = .69) for the alemtuzumab versus r-ATG groups. One-year serum creatinine levels (1.68 ± 0.8 mg/dL vs 1.79 ± 1.8 mg/dL; P = .66) as well as graft (91.1 ± 3.5% vs 94.5 ± 3.8%; P = .48) and patient (93.8 ± 3.0% vs 96.4 ± 3.5%; P = .92) survivals were similar for the alemtuzumab versus r-ATG groups. Conclusion: Our study showed a higher incidence of AMR and a similar incidence of ACR in KTRs who underwent induction with alemtuzumab compared with those who received r-ATG, all maintained on tacrolimus and MMF. This was despite a lower HLA mismatch in the alemtuzumab group. One-year graft survival, patient survival, and allograft function were similar. Inadequate B-cell suppression by alemtuzumab, as well as altered phenotypic and functional properties of repopulating B cells, could contribute to the heightened risk of AMR in these patients.

5.
Purpose: To describe the evolution of serum levels of soluble HLA-G (s-HLA-G) during the first 12 months after heart transplantation (HT) and to correlate it with clinical outcomes. Methods: Observational study based on a single-center cohort of 59 patients who underwent HT between December 2003 and March 2010. Soluble HLA-G levels were measured in serum samples extracted before HT and at 1, 3, 6, and 12 months after HT. The cumulative burden of s-HLA-G expression during the first post-transplant year was assessed by means of the area under the curve (AUC) of s-HLA-G levels over time and correlated with the acute rejection burden (as assessed by a rejection score), the presence of cardiac allograft vasculopathy (CAV) grade ≥ 1, and infections during the first post-transplant year, as well as with long-term patient and graft survival. Mean follow-up was 12.4 years. Results: Soluble HLA-G levels decreased over the first post-transplant year (p = 0.020). The AUC of s-HLA-G levels during the first post-transplant year was higher among patients with infections than among those without (p = 0.006). No association was found between the AUC of s-HLA-G levels and the burden of acute rejection or the development of CAV. Overall long-term survival, long-term survival free of late graft failure, and cancer-free survival did not differ significantly between patients with an AUC of s-HLA-G levels above versus below the median of the study population. Conclusions: Soluble HLA-G levels decreased over the first year after HT. Higher s-HLA-G expression was associated with a higher frequency of infections, but not with the burden of acute rejection, the development of CAV, or long-term patient or graft survival.
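The "cumulative burden" metric above is an area under a concentration-time curve. A minimal sketch using the trapezoidal rule over the stated sampling schedule; the s-HLA-G values (and their units) are invented for illustration.

```python
def auc_trapezoid(times, levels):
    """Cumulative exposure as area under the concentration-time curve
    (trapezoidal rule), e.g., s-HLA-G sampled at fixed post-HT months."""
    return sum((t2 - t1) * (y1 + y2) / 2
               for (t1, y1), (t2, y2) in zip(zip(times, levels),
                                             zip(times[1:], levels[1:])))

# Hypothetical s-HLA-G levels (ng/mL) at 0, 1, 3, 6, 12 months post-HT
months = [0, 1, 3, 6, 12]
hla_g  = [42.0, 35.0, 30.0, 24.0, 18.0]
print(auc_trapezoid(months, hla_g))  # 310.5 ng/mL x months
```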

6.
Background: Immune monitoring of transplanted patients may provide a reliable basis for the individualization of immunosuppressive therapy. In addition, it might enable early, non-invasive diagnosis of acute allograft rejection. Methods: Percentages of CD4+IL-17+ T cells (Th17) and CD4+CD25+CD127dim/− T cells (Treg), as well as serum levels of interleukin (IL)-17 and transforming growth factor (TGF)-β1, were evaluated in 30 stable patients using flow cytometry and ELISA before and six months after liver transplantation. The same cells and cytokines were also quantified in 10 recipients with acute allograft rejection. Results: Six months post-transplant, the percentages of Th17 and Treg cells in the peripheral blood of stable liver transplant recipients decreased significantly, but the Th17/Treg ratios were comparable to the pre-transplant period (1.24 vs 1.56); however, the Th17/Treg ratio in the rejection group was significantly higher than in the stable recipients (4.06 vs 1.56, P = 0.001). Stable patients showed decreased serum IL-17, which was remarkably lower than in the rejection group (P = 0.01). Moreover, there was a significant correlation between the serum level of IL-17 and the percentage of Th17 cells (P < 0.001). Th17 frequency was negatively associated with liver allograft function. Notably, TGF-β1 levels differed neither between pre- and post-transplant samplings nor between the stable and rejection groups. Conclusion: Six months after liver transplantation, the mean Th17/Treg ratio in stable recipients remained comparable to pre-transplant values; however, it was significantly elevated in patients with acute allograft rejection, suggesting the Th17/Treg ratio as a probable predictor of acute rejection.

7.
Transplantation Proceedings 2022;54(8):2317-2324
Background: Most lung transplantation centers prefer triple immunosuppressive therapy with tacrolimus, mycophenolate mofetil, and corticosteroids. However, to prevent complications and comorbidities caused by tacrolimus, replacing the drug with everolimus has been considered. Methods: This retrospective observational study investigated switching to everolimus for different indications. The population was divided into 3 groups: chronic lung allograft dysfunction (CLAD), kidney impairment, and malignant neoplasm. We investigated whether the goal of the switch was achieved and recorded the frequency of rejection, cytomegalovirus and fungal infections, and everolimus adverse effects. Results: Nineteen patients received everolimus therapy: 5 for CLAD, 7 for tacrolimus nephrotoxicity, and 7 for explant/de novo malignant neoplasm. Patients were followed up on therapy for a mean (SD) of 30 (16.7) months. The numbers of acute cellular rejection, cytomegalovirus infection, and aspergillosis cases were 7, 13, and 2, respectively, before the switch and 7, 2, and 3 afterward. Mean creatinine and estimated glomerular filtration rate for the whole population improved after the switch without reaching statistical significance, whereas the improvement was significant in the tacrolimus nephrotoxicity group. Three patients in the CLAD group remained stable after switching, whereas 2 progressed. Only 1 of the 7 patients with malignant neoplasms had a recurrence during a median follow-up of 31.1 (16.5) months. Eleven everolimus adverse events occurred in 9 patients (47.3%), with 2 (10.5%) withdrawal events. Kidney impairment (P = .02) and age (P = .05) stood out as significant risk factors for drug adverse effects. Conclusions: After lung transplant, everolimus can be a safe alternative for immunosuppression, with acceptable adverse effects.

8.
Transplantation Proceedings 2019;51(6):1994-2001
Background: Lifelong adherence to post-transplant immunosuppression is challenging, and nonadherence is associated with greater acute rejection (AR) risk. Methods: This retrospective study evaluated conversion from immediate-release tacrolimus (IRT) to prolonged-release tacrolimus (PRT) between January 2008 and December 2012 in stable adult heart transplant recipients. The cumulative incidence rate (IR) of AR and infection pre- and postconversion, safety, tacrolimus dose and trough levels, concomitant immunosuppression, and PRT discontinuation were analyzed (intention-to-treat population). Results: Overall, 467 patients (mean age, 59.3 [SD, 13.3] years) converted to PRT at 5.1 (SD, 4.9) years post-transplant and were followed for 3.4 (SD, 1.5) years. During the 6 months postconversion, 5 patients (1.1%; 95% CI, 0.35%-2.48%) had an AR episode, and the IR was 2.2/100 patient-years (95% CI, 0.91-5.26). The incidence of rejection preconversion varied by time from transplant to conversion. The infection IR was similar post- and preconversion (9.2/100 patient-years [95% CI, 7.4-11.3] vs 10.6/100 patient-years [95% CI, 8.8-12.3], respectively; P = .20). Safety variables remained similar postconversion. The IR of mortality/graft loss was 2.3/100 patient-years (95% CI, 1.7-3.1). Conclusions: Conversion from IRT to PRT in heart transplant recipients in Spain raised no new safety concerns and provided appropriate immunosuppressive effectiveness.
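Event rates above are expressed per 100 patient-years. A minimal sketch of the underlying arithmetic; the ~227 patient-years of exposure is back-calculated from the reported 5 events at 2.2/100 patient-years, so it is an assumption, not a figure from the paper.

```python
def incidence_rate(events, patient_years, per=100.0):
    """Crude incidence rate per `per` patient-years of follow-up."""
    return events * per / patient_years

# 5 AR episodes over an assumed ~227 patient-years in the 6-month window
print(round(incidence_rate(5, 227), 1))  # ~2.2 per 100 patient-years
```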

9.
Background: Belatacept is employed alongside calcineurin inhibitor (CNI) therapy to prevent graft rejection in kidney transplant patients who are Epstein-Barr virus (EBV) seropositive. Preliminary data suggested that rates of post-transplant lymphoproliferative disorder (PTLD) were higher in individuals treated with belatacept than with CNI therapy alone. Methods: The records of 354 adults who underwent kidney-only transplantation from January 2015 through September 2021 at one medical center were evaluated. Patients were treated with either low doses of mycophenolate, tacrolimus, and sirolimus (B0, n = 235) or low doses of mycophenolate, tacrolimus, and belatacept (B1, n = 119). All recipients underwent induction with antithymocyte globulin and a rapid glucocorticosteroid taper. Relevant donor and recipient information was analyzed, and PTLD endpoints were assessed. Results: There were no cases of PTLD in either cohort within the study period. Recipients in the belatacept cohort had lower estimated glomerular filtration rates at 12 months (B0: 67.48 vs B1: 59.10 mL/min/1.73 m2, p = 0.0014). Graft failure was similar at 12 months (B0: 1.28% vs B1: 0.84%, p = 1.0) and 24 months (B0: 2.55% vs B1: 0.84%, p = 0.431). There was no difference in rejection rates at 12 months (B0: 1.27% vs B1: 2.52%, p = 0.408) or 24 months (B0: 2.12% vs B1: 2.52%, p = 1.000). Both groups had similar rates of malignancy, mortality, and CMV/BK viremia. Conclusion: Neither the non-belatacept (mycophenolate, tacrolimus, and sirolimus) nor the belatacept-based (mycophenolate, tacrolimus, and belatacept) regimen appears to pose an increased risk of early-onset PTLD. Both cohorts benefited from low rates of rejection, malignancy, mortality, and graft failure. Recipients will continue to be monitored, as PTLD can manifest as a long-term complication.

10.
Transplantation Proceedings 2021;53(6):1998-2003
Background: Although effective for curtailing alloimmune responses, calcineurin inhibitors (CNIs) have an adverse-effect profile that includes nephrotoxicity. In lung transplant (LTx) recipients, the optimal serum levels of the CNI tacrolimus necessary to control alloimmune responses while minimizing nephrotoxicity are unknown. Methods: This retrospective, single-center study reviewed tacrolimus whole-blood trough levels (BTLs), grades of acute cellular rejection (ACR), acute rejection scores, and creatinine clearance (CrCl) in LTx recipients within the first year after their transplant procedure. Comparisons were made between the first 90 days post-LTx (when tacrolimus BTLs were maintained >10 µg/L) and the remainder of the first post-LTx year (when BTLs were <10 µg/L). Results: Although mean tacrolimus BTLs were higher during the first 90 days post-LTx than during the remainder of the first post-LTx year (10.4 ± 0.3 µg/L vs 9.5 ± 0.3 µg/L, P < .0001), there was no association with lower grades of ACR (P = .24). The intensity of ACR (as determined by acute rejection scores) did not correlate with mean tacrolimus BTLs at any time during the first post-transplant year (P = .79). During the first 90 days post-LTx there was a significant decline in CrCl and a correlation between increasing mean tacrolimus BTLs and declining CrCl (r = −0.26, P = .03); this correlation was not observed during the remainder of the year (r = −0.09, P = .52). Conclusions: In LTx recipients, maintaining BTLs of the CNI tacrolimus >10 µg/L did not result in superior control of acute rejection responses but was associated with declining renal function.
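The r values above are Pearson correlations between paired trough levels and CrCl. A minimal sketch with invented paired data (statistics.correlation requires Python 3.10+); only the negative direction, not the reported magnitude, is reproduced.

```python
from statistics import correlation  # Python 3.10+

# Hypothetical paired values: mean tacrolimus BTL (ug/L) vs CrCl (mL/min)
btl  = [9.0, 9.8, 10.4, 11.1, 11.9, 12.6]
crcl = [80, 88, 70, 84, 66, 77]
print(round(correlation(btl, crcl), 2))  # negative r, cf. the reported -0.26
```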

11.
Background: Belatacept has been demonstrated to be an effective alternative immunosuppressant in kidney transplant recipients. This study focuses on outcomes of early versus late conversion to belatacept-based immunosuppression after kidney transplant. Materials and methods: This retrospective analysis of a prospectively collected database included all adult kidney transplant patients at SUNY Upstate Medical Hospital from 1 January 2014 to 30 December 2022. Early conversion was defined as conversion <6 months after kidney transplantation, and late conversion as conversion >6 months after kidney transplantation. Results: Of the 61 patients included in this study, 33 (54%) were in the early conversion group and 28 (46%) in the late conversion group. Mean eGFR in the early conversion group improved from 26.73 ± 16.26 mL/min/1.73 m2 before conversion to belatacept to 45.3 ± 21.01 mL/min/1.73 m2 at one year post-conversion (p = 0.0006). In contrast, the eGFR change in the late conversion group was not significant: 46.30 ± 15.65 mL/min/1.73 m2 before conversion and 44.76 ± 22.91 mL/min/1.73 m2 after one year of follow-up (p = 0.72). All four biopsy-proven allograft rejections in the early conversion group were acute T-cell-mediated rejections (ATMR). In the late conversion group, of three biopsy-proven rejections, one was chronic antibody-mediated rejection (CAMR), one was ATMR, and one was mixed ATMR/CAMR. All four patients with ATMR received mycophenolic acid (MPA) as part of their immunosuppressive regimen, and none received tacrolimus. The one-year post-conversion allograft survival rate was 100% in both groups, whereas the one-year post-conversion patient survival rate was 90.9% in the early conversion group and 100% in the late conversion group (P = 0.11). Conclusions: Early post-transplant conversion to belatacept improved eGFR more meaningfully than late conversion. Patients who receive belatacept and MPA rather than tacrolimus may have increased rates of T-cell-mediated rejection.

12.
Calcineurin inhibitors (CNIs) are the backbone of traditional immunosuppressive regimens for lung transplant recipients (LTRs). Both CNIs are narrow-therapeutic-index drugs with significant interpatient and intrapatient variability, requiring therapeutic drug monitoring to ensure safety and effectiveness. We hypothesized that tacrolimus time-in-therapeutic range (TTR) affects acute and chronic rejection rates in LTRs. This was a single-center, observational, cross-sectional study of 292 adult LTRs. Subjects who received tacrolimus for the first post-transplant year were included. TTR was calculated at 1 year using protocol goal ranges (12-15 ng/mL for months 0-6; 10-12 ng/mL for months 7-12). The primary outcome was acute cellular rejection (ACR) burden at 1 year. Chronic lung allograft dysfunction (CLAD), mortality, and infection rate were assessed as secondary outcomes at 1 year. Primary and secondary outcomes were assessed using logistic regression. Each 10% increase in TTR was associated with a significantly lower likelihood of high-burden ACR at 1 year on univariable (OR 0.46, 95% CI 0.40-0.54, P < .001) and multivariable (OR 0.64, 95% CI 0.47-0.86, P = .003) assessment, controlling for age and induction agent. Each 10% increase in TTR was also associated with lower rates of CLAD (P < .001) and mortality (P < .001) at 1 year. Prospective studies confirming these findings appear warranted.
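TTR itself is a simple computation once a method is fixed. The abstract does not state which method was used; a common choice is Rosendaal linear interpolation, sketched below against the stated 12-15 ng/mL early-phase range (the trough series and visit days are invented).

```python
def ttr_rosendaal(days, troughs, lo, hi):
    """Percent time-in-therapeutic-range by Rosendaal linear interpolation:
    assume the level moves linearly between successive trough measurements
    and count the fraction of each interval spent inside [lo, hi]."""
    in_range = total = 0.0
    for (d1, c1), (d2, c2) in zip(zip(days, troughs), zip(days[1:], troughs[1:])):
        span = d2 - d1
        total += span
        if c1 == c2:
            in_range += span if lo <= c1 <= hi else 0.0
        else:
            # fraction of the segment c(t) = c1 + (c2 - c1) * t inside [lo, hi]
            f1, f2 = sorted(((lo - c1) / (c2 - c1), (hi - c1) / (c2 - c1)))
            in_range += span * max(0.0, min(f2, 1.0) - max(f1, 0.0))
    return 100 * in_range / total

# Hypothetical month 0-6 troughs (ng/mL) against the 12-15 ng/mL protocol range
days    = [0, 14, 30, 60, 90, 120, 180]
troughs = [8.0, 11.0, 13.0, 16.0, 14.0, 12.5, 13.5]
print(round(ttr_rosendaal(days, troughs, 12.0, 15.0), 1))  # ~73.9 (%)
```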

13.
Tacrolimus, the major immunosuppressant in post-heart transplant (HTx) therapy, is a narrow-therapeutic-index drug. Hence, achieving stable therapeutic steady-state plasma concentrations is essential to ensure efficacy while avoiding toxicity. Whether high variability in steady-state concentrations is associated with poor outcomes is unknown. We investigated the association between tacrolimus trough level variability during the first year post-HTx and outcomes during and beyond the first postoperative year. Overall, 72 patients were analyzed for mortality, of whom 65 and 61 were available for rejection analysis during and beyond the first year post-HTx, respectively. Patients were divided into high (>28.8%, the cohort median) and low (<28.8%) tacrolimus level variability groups. Mean tacrolimus levels did not differ between the groups (12.7 ± 3.4 ng/mL vs 12.8 ± 2.4 ng/mL, P = .930). Patients in the high-variability group exhibited a higher long-term rejection rate (median total rejection score: 0.33 vs 0, P = .04), with no difference in rejection scores within the first year post-HTx. Multivariate analysis showed that high tacrolimus trough level variability was associated with a >8-fold increased risk of any rejection beyond the first year post-HTx (P = .011). Mortality was associated only with cardiovascular complications (P = .018), with no effect of tacrolimus trough level variability.

14.
Transplantation Proceedings 2021;53(10):3022-3029
Background: The aim of this review is to provide a consensus estimate of the impact of anti-human leukocyte antigen (anti-HLA) de novo donor-specific antibodies (dnDSA) on pancreatic allograft loss. Methods: We systematically searched electronic databases through August 2020 using Preferred Reporting Items for Systematic Reviews and Meta-Analyses methodology. Articles that provided or allowed estimation of the odds ratio (OR) and 95% confidence interval (CI) for pancreatic allograft loss in patients with and without anti-HLA dnDSA were included. Results: Eight studies with a total of 1434 patients were included. Patients with anti-HLA dnDSA had significantly higher odds of graft failure (OR = 4.42, 95% CI [3.15-6.22], I² = 38%). Pooled data on graft rejection showed that patients with anti-HLA dnDSA also had significantly higher odds of rejection (OR = 3.35, 95% CI [2.28-4.91], I² = 38%). Conclusion: The results of our meta-analysis show that anti-HLA dnDSA is strongly associated with pancreas graft failure and rejection. Surveillance for anti-HLA dnDSA is an important component of post-transplant immune monitoring.
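Pooled ORs like these are typically obtained by inverse-variance weighting of per-study log odds ratios. A minimal fixed-effect sketch; the three study estimates are invented placeholders, not the eight studies in this meta-analysis.

```python
import math

def pooled_or_fixed(study_data):
    """Inverse-variance fixed-effect pooling of per-study odds ratios.
    study_data: list of (OR, ci_lower, ci_upper) with 95% CIs."""
    num = den = 0.0
    for or_, lo, hi in study_data:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # recover SE from CI
        w = 1 / se ** 2
        num += w * math.log(or_)
        den += w
    pooled_log, se_pooled = num / den, math.sqrt(1 / den)
    ci = (math.exp(pooled_log - 1.96 * se_pooled),
          math.exp(pooled_log + 1.96 * se_pooled))
    return math.exp(pooled_log), ci

# Hypothetical per-study estimates (OR, lower, upper)
studies = [(3.8, 2.1, 6.9), (5.5, 2.9, 10.4), (4.0, 1.8, 8.9)]
print(pooled_or_fixed(studies))
```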

15.
Background: Diagnostic tools to measure the response to individual immunosuppressive drugs in transplant patients are currently lacking. We previously developed the blood-based Immunobiogram bioassay for in vitro characterization of the pharmacodynamic response of patients' own immune cells to a range of immunosuppressants. Here we used the Immunobiogram to examine the association between patients' sensitivity to their prescribed immunosuppressants and clinical outcome. Methods: We conducted an international, multicenter, observational study in a kidney transplant population undergoing maintenance immunosuppressive therapy. Patients were selected for a poor clinical course (PCC, N = 53: renal dysfunction plus rejection signs on biopsy and/or an increase in DSA strength in the last 12 months) versus a good clinical course (GCC, N = 50: stable renal function and treatment, no rejection, and no DSA titers). Immunobiogram dose-response curve parameters were compared between the two subgroups in patients treated with mycophenolate, tacrolimus, corticosteroids, cyclosporine A, or everolimus. Parameters with significant inter-group differences were further analyzed by univariate and subsequent multivariate logistic regression. Results: Clinical outcome was associated with the area over the curve (AOC) in the mycophenolate-treated subgroup, the 25% inhibitory response (ID25) in the tacrolimus-treated subgroup, and the 50% inhibitory response (ID50) in the corticosteroid-treated subgroup. These statistically significant associations persisted in the mycophenolate (OR 0.003, 95% CI <0.001-0.258; p = 0.01) and tacrolimus (OR <0.0001, 95% CI <0.00001-0.202; p = 0.016) subgroups after adjusting for concomitant corticosteroid treatment, and in the corticosteroid subgroup after adjusting for concomitant mycophenolate or tacrolimus treatment (OR 0.003; 95% CI <0.0001-0.499; p = 0.026). Conclusions: Our results highlight the potential of the Immunobiogram as a tool to test the pharmacodynamic response to individual immunosuppressive drugs.

16.
Background: Extended-release LCP-tacrolimus (LCPT) allows once-daily dosing in transplant recipients. Its improved bioavailability may be beneficial for simultaneous pancreas-kidney (SPK) recipients. Methods: This is a study of 39 SPK recipients on standard immediate-release tacrolimus (IR-TAC, n = 21) or LCPT (n = 18). The coefficient of variability (CV = 100 × standard deviation / mean) was calculated to assess drug levels. Hemoglobin A1c (HbA1c), tacrolimus, and creatinine levels were measured postoperatively. Results: There was no difference in tacrolimus CV between the IR-TAC and LCPT groups at 1 month or 3 months postoperatively; however, a greater difference was observed at 1 year (41.0% vs 33.1%; p = 0.19). There were six episodes of acute rejection in the IR-TAC group compared with zero in the LCPT group (p = 0.01). HbA1c was significantly higher in the IR-TAC group than in the LCPT group at 3 (5.5% vs 4.9%, p = 0.01), 6 (5.6% vs 4.9%, p = 0.01), and 12 months (5.8% vs 5.1%, p = 0.07). Conclusions: Significantly lower rates of rejection were observed in patients receiving LCPT. Once-daily dosing may facilitate medication adherence and result in improved long-term outcomes.
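The variability metric above is the ordinary coefficient of variation given in the Methods. A minimal sketch; the trough series is invented for illustration.

```python
from statistics import mean, stdev

def cv_percent(levels):
    """Coefficient of variability: CV = 100 x SD / mean."""
    return 100 * stdev(levels) / mean(levels)

# Hypothetical tacrolimus troughs (ng/mL) for one SPK recipient
print(round(cv_percent([8.2, 11.5, 6.9, 10.1, 12.8, 7.4]), 1))  # ~25.0 (%)
```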

17.
Background: VAV1 is an intracellular signal-transduction protein that plays a significant role in T-cell signaling, and several studies suggest that VAV1 signaling contributes to allograft rejection. The aim of this study was to examine the association between VAV1 gene polymorphisms and renal allograft function. Methods: The study included 270 patients after renal allograft transplantation. We examined the associations between VAV1 gene polymorphisms and complications after transplantation, such as delayed graft function, acute rejection, and chronic allograft dysfunction. Results: There were no statistically significant associations between VAV1 genotypes and delayed graft function or chronic allograft dysfunction. Among patients with acute allograft rejection, we observed decreased frequencies of the VAV1 rs2546133 TT and CT genotypes (P = .03) and T allele (P = .02), as well as of the VAV1 rs2617822 GG and AG genotypes (P = .05) and G allele (P = .04). In multivariate regression analysis, a higher number of VAV1 rs2546133 T alleles showed a protective effect against acute rejection in kidney allograft recipients. Conclusions: The results of our study suggest that polymorphisms in the VAV1 gene are associated with kidney allograft rejection.

18.
Purpose: Induction immunosuppression has improved long-term outcomes after kidney transplant. This study explores the association of different induction immunosuppression agents (basiliximab vs alemtuzumab vs rabbit antithymocyte globulin) used at the time of kidney transplant with the development of de novo donor-specific HLA antibodies (DSA) in the first 12 months post-transplant. Methods: A total of 390 consecutive kidney transplant recipients (KTRs), between 2016 and 2018, were included in the analysis: 104 (26.6%) received basiliximab, 186 (47.6%) received alemtuzumab, and 100 (25.6%) received rabbit antithymocyte globulin (rATG) for induction. All recipients had a negative flow cytometry crossmatch before transplant. Serum samples at 4 and 12 months post-transplant were assessed for the presence of de novo HLA DSA. Kidney allograft function was compared among the three groups using creatinine clearance calculated from 24-h urine collection. Results: De novo HLA DSA were detected in a total of 81 (20.8%) patients within 12 months post-transplant: in 12/104 (11.5%), 43/186 (23.1%), and 26/100 (26%) of KTRs who received basiliximab, alemtuzumab, and rATG, respectively (p = 0.006). KTRs who received basiliximab were significantly older, and their last follow-up creatinine clearance was significantly lower, at 42 mL/min, compared with KTRs who received alemtuzumab or rATG (p = 0.006). Conclusion: Induction immunosuppression with basiliximab was associated with a significant reduction in the development of de novo DSA within the first 12 months post kidney transplant but with lower creatinine clearance on long-term follow-up.
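The three-way comparison of dnDSA rates is a test of independence on a 3×2 table. A minimal Pearson chi-square sketch on the counts above; note that this statistic (about 7.7 on 2 df) corresponds to p ≈ 0.02, so the abstract's p = 0.006 presumably comes from a different test or comparison.

```python
def chi2_independence(table):
    """Pearson chi-square statistic and df for an r x c contingency table."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    n = sum(row_tot)
    chi2 = sum((obs - rt * ct / n) ** 2 / (rt * ct / n)
               for row, rt in zip(table, row_tot)
               for obs, ct in zip(row, col_tot))
    dof = (len(table) - 1) * (len(col_tot) - 1)
    return chi2, dof

# Rows = induction agent, columns = [dnDSA+, dnDSA-]
table = [[12, 92],    # basiliximab: 12/104
         [43, 143],   # alemtuzumab: 43/186
         [26, 74]]    # rATG:        26/100
print(chi2_independence(table))  # ~ (7.7, 2)
```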

19.
Transplantation Proceedings 2019;51(6):1791-1795
Background: The 2013 Banff meeting updated the requirements for the diagnosis of acute/active antibody-mediated rejection (AAMR) in kidney allografts. There has been speculation that the changes lower the threshold for diagnosing AAMR and may lead to unnecessary and expensive treatment. Methods: We compared the 2013 Banff classification for AAMR with the previous 2007 Banff classification to determine whether the number of patients receiving a diagnosis of AAMR increased and whether the diagnosis affected allograft survival and post-biopsy 3-month and 6-month creatinine and eGFR values. Results: A total of 212 renal allograft biopsies were evaluated against both the 2007 and 2013 Banff classification requirements for AAMR. Ten patients (11 biopsies) met the 2007 criteria; an additional 15 patients (20 biopsies) met the 2013 criteria. These 2 groups showed no statistically significant demographic differences. Applying the 2013 Banff classification, we observed a 2.5-fold increase in the number of AAMR cases. One-year post-transplant allograft survival was higher in the 2013 group (0.85 vs 0.55), and the 3-month and 6-month post-biopsy creatinine values were significantly lower for the 2013 group (1.6 ± 0.6 vs 3.3 ± 2.2 mg/dL, P = .01, and 1.7 ± 0.6 vs 3.4 ± 2.8 mg/dL, P = .03). The 3-month and 6-month eGFR values were higher in the 2013 group, although not statistically significantly so. Conclusions: These results suggest that use of the Banff 2013 criteria in place of Banff 2007 may result in diagnosing milder and earlier cases of AAMR, with the possibility of initiating earlier treatment and improving graft outcomes.

20.
Background: Complement-binding donor-specific human leukocyte antigen (HLA) antibodies in kidney recipients have been associated with a higher risk of allograft rejection and loss. The objective of this meta-analysis was to investigate the correlation between C1q-binding donor-specific antibodies (DSAs) and clinical outcomes in kidney transplantation (KT) recipients. Methods: We conducted systematic searches in the PubMed, EMBASE, and Cochrane Library databases from inception to August 2021 to identify all studies that compared clinical outcomes between C1q-positive (C1q+) DSA and C1q-negative (C1q−) DSA patients who underwent KT. Data were independently extracted by two reviewers who assessed the risk of bias, and were summarized with fixed-effects or random-effects models according to heterogeneity. We assessed clinical outcomes including graft loss, rejection, delayed graft function (DGF), and all-cause patient death. Results: Twenty-six studies with a total of 1337 patients were included: 485 with C1q-binding DSAs and 850 without. Compared with the C1q− DSA group, the C1q+ DSA group had significant increases in antibody-mediated rejection (AMR) (relative risk [RR] = 2.09, 95% confidence interval [CI], 1.53-2.86; P < 0.00001), graft loss (RR = 2.40, 95% CI, 1.66-3.47; P < 0.00001), and death (RR = 3.13, 95% CI, 1.06-9.23; P = 0.04). The C1q+ DSA and C1q− DSA groups did not differ significantly in T-cell-mediated rejection, acute rejection, acute cellular rejection, mixed rejection, or DGF. Conclusion: The findings of this systematic review suggest that C1q+ DSA KT recipients have a higher risk of AMR, graft loss, and death than C1q− DSA patients. Monitoring C1q-binding DSAs allows risk stratification of recipients and can guide physician management.
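The fixed- vs random-effects choice above is usually driven by Cochran's Q and Higgins' I². A minimal sketch on the log-RR scale; the four study estimates are invented placeholders, not the 26 included studies.

```python
import math

def heterogeneity(log_effects, ses):
    """Cochran's Q and Higgins' I^2 (%) for k study effects on the log scale."""
    w = [1 / se ** 2 for se in ses]
    pooled = sum(wi * yi for wi, yi in zip(w, log_effects)) / sum(w)
    q = sum(wi * (yi - pooled) ** 2 for wi, yi in zip(w, log_effects))
    df = len(log_effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical per-study log-RRs and standard errors
log_rr = [math.log(x) for x in (1.8, 2.4, 2.0, 3.1)]
se     = [0.30, 0.25, 0.40, 0.35]
print(heterogeneity(log_rr, se))
```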
