Similar Articles
20 similar articles found (search time: 31 ms)
1.
Background: Complement-binding donor-specific human leukocyte antigen (HLA) antibodies in kidney recipients have been associated with a higher risk of allograft rejection and loss. The objective of this meta-analysis was to investigate the correlation between C1q-binding donor-specific antibodies (DSAs) and clinical outcomes in kidney transplantation (KT) recipients. Methods: We conducted systematic searches in the PubMed, EMBASE, and Cochrane Library databases to identify all studies, from inception to August 2021, that compared clinical outcomes between C1q-positive (C1q+) DSA and C1q-negative (C1q−) DSA patients who underwent KT. Data were independently extracted by two reviewers, who also assessed the risk of bias, and were pooled with fixed-effects or random-effects models according to heterogeneity. Clinical outcomes included graft loss, rejection, delayed graft function (DGF), and all-cause patient death. Results: Twenty-six studies with a total of 1337 patients were included: 485 with C1q-binding DSAs and 850 without. Compared with the C1q− DSA group, the C1q+ DSA group had significant increases in antibody-mediated rejection (AMR) (relative risk [RR] = 2.09; 95% confidence interval [CI], 1.53–2.86; P < 0.00001), graft loss (RR = 2.40; 95% CI, 1.66–3.47; P < 0.00001), and death (RR = 3.13; 95% CI, 1.06–9.23; P = 0.04). The two groups did not differ significantly in T-cell-mediated rejection, acute rejection, acute cellular rejection, mixed rejection, or DGF. Conclusion: The findings of this systematic review suggest that C1q+ DSA KT recipients have a higher risk of AMR, graft loss, and death than C1q− DSA patients. Monitoring C1q-binding DSAs allows risk stratification of recipients and can guide management.
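The pooled relative risks above come from standard inverse-variance meta-analysis. As a rough illustration of the mechanics — using made-up 2×2 counts, not data from the review — a fixed-effect pooling of log relative risks can be sketched as:

```python
import math

def log_rr(events_exp, n_exp, events_ctl, n_ctl):
    """Log relative risk and its variance for one 2x2 study."""
    rr = (events_exp / n_exp) / (events_ctl / n_ctl)
    # Standard variance of log(RR) for a single study
    var = 1 / events_exp - 1 / n_exp + 1 / events_ctl - 1 / n_ctl
    return math.log(rr), var

# Hypothetical studies: (events, total) in exposed arm, then control arm
studies = [(20, 50, 10, 50), (30, 100, 12, 100)]

logs = [log_rr(*s) for s in studies]
weights = [1 / v for _, v in logs]                # inverse-variance weights
pooled_log = sum(w * l for (l, _), w in zip(logs, weights)) / sum(weights)
se = math.sqrt(1 / sum(weights))                 # SE of the pooled log(RR)

pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * se), math.exp(pooled_log + 1.96 * se))
print(f"RR = {pooled_rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

A random-effects pool (as used when heterogeneity is high) would additionally add a between-study variance term to each weight, which widens the confidence interval.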

2.
Transplantation Proceedings, 2021, 53(10): 3022-3029
Background: The aim of this review is to provide consensus on the impact of anti-human leukocyte antigen (anti-HLA) de novo donor-specific antibodies (dnDSA) on pancreatic allograft loss. Methods: We systematically searched electronic databases through August 2020 using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology. Articles that provided, or allowed estimation of, the odds ratio (OR) and 95% confidence interval (CI) for pancreatic allograft loss in patients with and without anti-HLA dnDSA were included. Results: Eight studies with a total of 1434 patients were included. Patients with anti-HLA dnDSA had significantly higher odds of graft failure (OR = 4.42; 95% CI, 3.15-6.22; I² = 38%). Pooled data on graft rejection showed that patients with anti-HLA dnDSA also had significantly higher odds of rejection than patients without (OR = 3.35; 95% CI, 2.28-4.91; I² = 38%). Conclusion: The results of our meta-analysis show that anti-HLA dnDSA is strongly associated with pancreas graft failure and rejection. Surveillance for anti-HLA dnDSA is an important component of post-transplant immune monitoring.

3.
De novo donor-specific antibodies (dnDSA) play an important role in antibody-mediated rejection (ABMR) and graft failure, yet their development in kidney transplant (KTx) recipients of higher immunological risk has not been characterized. We prospectively determined the incidence of dnDSA at 3 and 12 months posttransplant and assessed their associations with outcomes in recipients stratified by low, moderate, and high immunological risk. Adult KTx recipients were screened for DSA pretransplant, at months 3 and 12 posttransplant, and when clinically indicated. Outcomes included incidence of dnDSA, death-censored graft survival (DCGS), and ABMR. Of 371 recipients, 154 (42%) were transplanted across a pretransplant DSA, which became undetectable by 12 months posttransplant in 78% of cases. dnDSA were detected in 16% (95% confidence interval [CI]: 12-20%) by 3 months and 23% (95% CI: 18-29%) by 12 months posttransplant. Incidence at 12 months was higher in the moderate-risk (30%) and high-risk (29%) groups than in the low-risk group (16%). dnDSA were associated with an increased risk of ABMR (hazard ratio [HR] 2.2; 95% CI: 1.1-4.4; P = .04) but were not an independent risk factor for DCGS. In conclusion, dnDSA were more frequent in transplant recipients of higher immune risk and were associated with an increased risk of ABMR.

4.
Background: Preemptive living donor kidney transplantation (P-LDKT) has shown a better prognosis than nonpreemptive living donor KT (NP-LDKT) or deceased donor KT (DDKT). However, the association between KT type and de novo donor-specific antibody (dnDSA) is uncertain. Materials: We retrospectively analyzed 1114 patients who underwent KT between 1994 and 2020 and investigated the clinical significance of dnDSA by KT type. Results: Mean follow-up was 131.5 ± 89.5 months. Mean recipient age, number of mismatched human leukocyte antigens, and incidence of delayed graft function were significantly higher in the DDKT group than in the P-LDKT and NP-LDKT groups. There were no significant differences in the incidence of dnDSA or acute rejection within 1 year among the groups. The death-censored graft survival rate was significantly lower in all groups with dnDSA than without. Among dnDSA-positive recipients, the NP-LDKT and DDKT groups tended to have lower death-censored graft survival than the P-LDKT group, and there was a significant interaction between KT type and dnDSA (P = .010). Independent risk factors were acute rejection within 1 year (hazard ratio [HR], 4.341; 95% CI, 1.758-10.720; P = .001), dnDSA positivity (HR, 3.170; 95% CI, 1.364-7.371; P = .007), and eGFR at 12 months after KT (HR, 3.701; 95% CI, 2.049-6.686; P < .001). Conclusions: The incidence of dnDSA did not differ by KT type, but allograft survival was poor in all recipients with dnDSA, and NP-LDKT and DDKT recipients with dnDSA had a worse prognosis than P-LDKT recipients with dnDSA. Continuous and rigorous surveillance of DSA is therefore needed among NP-LDKT and DDKT recipients.

5.
Introduction: HLA eplet mismatches (eMM) have been associated with negative kidney outcomes after transplantation, such as the development of de novo donor-specific antibody (dnDSA), antibody-mediated rejection (ABMR), and early graft loss. This study aimed to evaluate the clinical effects of the HLA eMM load on dnDSA development, ABMR, renal function, allograft survival, and graft loss. Material and methods: This retrospective study involved 159 living donor kidney transplant patients categorized into groups based on antigen HLA mismatches assessed traditionally and on HLA eMM load. Patients were followed for at least one year. The EpViX online program was used to evaluate the HLA eMM load. Cox models were constructed to assess the risk of graft loss, and Kaplan-Meier survival curves were generated. Analyses were performed in R, and p < 0.05 was considered significant. Results: Of the 159 patients, 28 (17.6%) lost their allografts. Rejection episodes occurred in 37.1% of patients, 13.6% of which were ABMR. Patients with rejection episodes had a higher HLA-AB (p = 0.032) and HLA-DR (p = 0.008) eMM load, more HLA-AB (p = 0.006) and HLA-DR (p = 0.009) antigen mismatches, and higher proportions of the following eMM in the HLA-DR locus: 70R (p = 0.015), 70RE (p = 0.015), 74E (p = 0.015), and 48Q (p = 0.047). In multiple models, the presence of the HLA-DR 70qq eMM (HR 3.75; 95% CI 1.47-9.55) and an increase in creatinine levels at 1 year (HR 3.87; 95% CI 2.30-6.53) were associated with the risk of graft loss. Conclusion: The HLA eMM load was related to episodes of rejection and allograft loss, with HLA-DR eMM more strongly associated with worse immunologic outcomes than HLA-AB eMM.
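The Kaplan-Meier curves referred to above are product-limit estimates of graft survival. A minimal sketch of the estimator, using toy follow-up times rather than the study's data:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  : follow-up time for each graft (e.g. months)
    events : 1 = graft loss observed, 0 = censored
    Returns a list of (event_time, survival_probability) steps.
    """
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        # Risk set just before t; ties follow the usual convention that
        # events at t precede censoring at t, so both stay in the risk set.
        n = sum(1 for x in times if x >= t)
        d = sum(1 for x, e in zip(times, events) if x == t and e)
        if d:
            surv *= 1 - d / n
            curve.append((t, surv))
    return curve

# Toy cohort of five grafts: losses at 2, 3 and 5 months; censoring at 3 and 7
curve = kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0])
print(curve)
```

Each step multiplies the running survival by (1 − d/n), so censored subjects shrink future risk sets without forcing a drop in the curve.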

6.
Performing third or fourth kidney transplantation (3KT and 4KT) in older patients is rare due to surgical and immunologic challenges. We aimed to analyze and compare the outcomes of younger (18–64 years) and older (≥65 years) recipients of 3KT and 4KT. Between 1990 and 2016, we identified 5816 recipients of 3KTs (153 older) and 886 recipients of 4KTs (18 older). The incidences of delayed graft function (24.3% vs. 24.8%, p = .89), primary non-function (3.2% vs. 1.3%, p = .21), 1-year acute rejection (18.6% vs. 14.8%, p = .24), and 5-year death-censored graft failure (DCGF) (24.8% vs. 17.9%, p = .06) did not differ between younger and older recipients of 3KT. However, 5-year mortality was higher in older recipients (14.0% vs. 33.8%, p < .001) and remained so after adjustment (aHR = 3.21, 95% CI: 2.59–3.99). Similar patterns were noted in the 4KT cohort. When compared with waitlisted patients, 3KT and 4KT were associated with a lower risk of mortality (aHR = 0.37, 95% CI: 0.33–0.41 and aHR = 0.31, 95% CI: 0.24–0.41, respectively). This survival benefit did not differ by recipient age (younger vs. older, p for interaction: .49 for 3KT and .58 for 4KT). In the largest cohort described to date, we report a survival benefit of 3KT and 4KT even among older patients. Although this is a highly selected cohort, our results support improving access to 3KT and 4KT.

7.
The required intensity of monitoring for antibody-mediated rejection (AMR) after ABO-incompatible (ABOi) kidney transplantation is not clearly formalized. We retrospectively evaluated a single-center cohort of 115 ABOi kidney transplant recipients, of whom 32% were also HLA incompatible (ABOi/HLAi) with their donors. We used an adjusted negative binomial model to evaluate risk factors for late AMR and risk-stratified patients into high- and low-risk groups for its development. Twenty-six percent of patients had at least one AMR episode; 49% of AMR episodes occurred within 30 days after transplant and were considered early AMR. Patients with an early AMR episode had a 5.5-fold greater incidence of developing late AMR (IRR = 5.5; 95% CI: 1.5–19.3; P = 0.01). ABOi/HLAi recipients trended toward increased late AMR risk (IRR = 1.9; 95% CI: 0.5–6.6; P = 0.3). High-risk recipients (those with an early AMR episode or those who were ABOi/HLAi) had a sixfold increased incidence of late AMR (IRR = 6.3; 95% CI: 1.6–24.6; P = 0.008) versus low-risk recipients; the overall incidence of late AMR was 20.8% in high-risk vs. 1.5% in low-risk recipients. Changes in anti-A/B titer did not correlate with late AMR (IRR = 0.9 per log titer increase, P = 0.7). This risk-stratification scheme uses information available within 30 days of ABOi transplantation to determine the risk of late AMR and can help direct longitudinal follow-up for individual patients.
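An incidence rate ratio (IRR) like those reported above compares event counts per unit of person-time between two groups. A minimal sketch with a Wald confidence interval on the log scale, using illustrative counts rather than the cohort's data:

```python
import math

def irr_ci(events_a, time_a, events_b, time_b, z=1.96):
    """Incidence rate ratio with a Wald 95% CI computed on the log scale."""
    irr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, (lo, hi)

# Hypothetical: 12 late-AMR episodes over 100 patient-years in a high-risk
# group vs 4 episodes over 120 patient-years in a low-risk group
irr, ci = irr_ci(12, 100, 4, 120)
print(f"IRR = {irr:.1f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

Note how the standard error depends only on the event counts, so with few events (as in small AMR cohorts) the interval is very wide even when the point estimate is large.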

8.
Although interest in the role of donor-specific antibodies (DSAs) in kidney transplant rejection, graft survival, and histopathological outcomes is increasing, their impact in steroid-avoidance or steroid-minimization renal transplant populations is poorly understood. Primary outcomes of graft survival, rejection, and histopathological findings were assessed in 188 patients who received transplants between 2012 and 2015 at the Scripps Center for Organ Transplantation, which follows a steroid avoidance protocol. Analyses were performed using data from the United Network for Organ Sharing. Cohorts included kidney transplant recipients with de novo DSAs (dnDSAs; n = 27), preformed DSAs (pfDSAs; n = 15), and no DSAs (nDSAs; n = 146). Median time to dnDSA development (classes I and II) was shorter (102 days) than in previous studies. Rejection of any type was associated with DSAs to class I HLA (P < .05) and class II HLA (P < .01) but not with graft loss. Although mean fluorescence intensity (MFI) independently showed no association with rejection, an MFI >5000 showed a trend toward more antibody-mediated rejection (P = .06), though graft loss was not independently associated. Banff chronic allograft nephropathy scores and a modified chronic injury score were increased in the dnDSA cohort at 6 months but not at 2 years (P < .001 and P = .08, respectively). Our data suggest that dnDSAs and pfDSAs affect short-term rejection rates but do not negatively affect graft survival or histopathological outcomes at 2 years. Periodic protocol post-transplant DSA monitoring may preemptively identify patients who develop dnDSAs and are at higher risk for rejection.

9.
Transplantation Proceedings, 2022, 54(9): 2457-2461
Background: BK polyomavirus infection (BKVi) is an important cause of kidney transplant (KT) loss, but there is scarce evidence on the impact of BK plasma viral load on graft function and long-term KT survival. Methods: We performed a retrospective cohort study including all KT recipients with BKVi (BK viremia identified in ≥3 consecutive samples by polymerase chain reaction) in our center from January 2010 to December 2020, together with a 1:2 case-control analysis. Cases were grouped by their highest peak viral load: low-level viremia (<10,000 copies/mL) and high-level viremia (≥10,000 copies/mL). Logistic regression was used to identify risk factors for BKVi, and multivariable Cox regression to describe risk factors for graft loss. Results: A total of 849 KTs were performed, and 67 recipients presented BKVi (low-level viremia, n = 35; high-level viremia, n = 26). In logistic regression analysis, male sex (odds ratio [OR], 4.226; 95% CI, 1.660-10.758; P = .002), age (OR, 1.047; 95% CI, 1.008-1.088; P = .018), and retransplant (OR, 4.162; 95% CI, 1.018-17.015; P = .047) were predictors of BKVi. Acute rejection was more frequent in the BKVi group (18% vs 4.9%, P = .004), and graft survival was lower in patients with BKVi and high-level viremia (P = .027). In Cox regression analysis, BKVi (hazard ratio, 3.657; 95% CI, 1.146-11.670; P = .029) and, specifically, high-level BK polyomavirus (BKV) viremia (hazard ratio, 1.988; 95% CI, 1.012-3.907; P = .046) were predictors of shorter graft survival. Conclusions: High-level BKV viremia was associated with BKV nephropathy and poorer graft survival, and acute rejection was more frequent after BKVi. Safe and effective strategies need to be developed for these patients.

10.
Transplantation Proceedings, 2021, 53(6): 1865-1871
Background: Renal allograft survival is negatively affected by the development of de novo posttransplant donor-specific antibodies (dnDSA). We sought to determine whether treatment with intravenous immunoglobulin (IVIG) could remove or reduce the intensity of dnDSA. Methods: In a single-center study, 12 recipients with dnDSA and stable function who received IVIG 1 g/kg monthly for 6 months were compared with a contemporaneous cohort of 24 recipients with dnDSA who did not receive IVIG. Results: The median time to first dnDSA was 6 months (interquartile range [IQR], 1-12), and follow-up was 83 months (IQR, 58-94) posttransplant. Resolution of dnDSA occurred in 27% of IVIG vs 46% of control recipients (P = .48). Fifty-eight percent of recipients in both cohorts demonstrated a reduction in the intensity of the dominant DSA at last follow-up (P = 1.0). A reduction in the number of dnDSAs occurred in 58% vs 62% of the IVIG and control cohorts, respectively (P = .81). After dnDSA development, acute rejection occurred in 8% of the IVIG group vs 42% of the control group (P = .06). Forty-two percent of IVIG-treated vs 49% of control recipients had a deterioration in function from first dnDSA until the most recent follow-up (P = .81). Actuarial graft survival was equivalent between groups. Conclusions: IVIG treatment of dnDSA in recipients with stable graft function had no impact on DSA clearance or MFI reduction, although this outcome may reflect the small sample size. Larger studies or alternate dosing regimens may be required to determine whether there is any role for IVIG as a treatment for dnDSA.

11.
Spinal surgery has long been considered to carry an elevated risk of perioperative blood loss, with significant associated blood transfusion requirements. However, great variability exists in the blood loss and transfusion requirements of differing patients and differing procedures in spinal surgery. We performed a retrospective study of all patients undergoing spinal surgery who required a transfusion of ≥1 U of red blood cells (RBC) at the National Spinal Injuries Unit (NSIU) at the Mater Misericordiae University Hospital over a 10-year period. The purpose of this study was to identify risk factors associated with significant perioperative transfusion, allowing the early recognition of patients at greatest risk, and to improve existing transfusion practices, allowing safer, more appropriate blood product allocation. A total of 1,596 surgical procedures were performed at the NSIU over the 10-year period, of which 25.9% (414/1,596) required a blood transfusion. Surgical groups with a significant risk of requiring a transfusion >2 U RBC included deformity surgery (RR = 3.351, 95% CI 1.123–10.006, p = 0.03), tumor surgery (RR = 3.298, 95% CI 1.078–10.089, p = 0.036), and trauma surgery (RR = 2.444, 95% CI 1.183–5.050, p = 0.036). Multivariable logistic regression analysis identified multilevel surgery (>3 levels) as a significant risk factor for requiring a transfusion >2 U RBC (RR = 4.682, 95% CI 2.654–8.261, p < 0.0001). Several risk factors corresponding to significant transfusion requirements were identified in the spinal surgery patient. Greater awareness of the risk factors associated with transfusion is required to optimize patient management.

12.
Transplantation Proceedings, 2022, 54(7): 1786-1794
Background: The aim of this study was to evaluate the effect of recipient obesity on posttransplant complications and on patient and graft survival. Methods: A single-institution retrospective study was performed on obese renal transplant recipients (BMI ≥ 30 kg/m2, n = 102) transplanted from January 2010 to December 2018, matched with nonobese recipients (BMI < 30 kg/m2, n = 204); for every obese patient, we selected 2 nonobese patients of similar age, sex, and transplantation period. The comparative analysis included patient and graft survival as primary outcomes and graft function and postoperative complications as secondary outcomes. Results: Recipient demographics were comparable in both groups except for a higher prevalence of diabetic nephropathy in obese patients (P = .0006). Obesity was strongly related to poorer patient survival (risk ratio [RR] = 2.83; 95% CI, 1.14-7.04; P = .020), but no difference in graft survival was observed (P = .6). Early graft function was inferior in the obese population (RR = 2.41; 95% CI, 1.53-3.79; P = .00016), but no statistically significant difference between the groups was observed during late follow-up (P = .36). Obese recipients had a significantly higher risk of delayed graft function (RR = 1.93; 95% CI, 1.19-3.1; P = .0077), myocardial infarction (RR = 7; 95% CI, 1.68-29.26; P = .0042), wound infections (RR = 8; 95% CI, 1.96-32.87; P = .0015), aggravation of diabetes (RR = 3.13; 95% CI, 1.29-7.6; P = .011), and surgical revision for eventration (RR = 8; 95% CI, 1.22-52.82; P = .026) compared with nonobese recipients. Conclusions: Despite inferior early kidney graft function in obese recipients, no difference was observed at long-term follow-up. However, recipient obesity had a negative effect on patient survival and postoperative complications.

13.
To investigate risk factors for invasive aspergillosis (IA) after kidney transplantation (KT), we conducted a systematic search in PubMed and EMBASE to identify studies published until June 2020. We included case-control or cohort studies comprising KT recipients with a diagnosis of IA, defined according to the European Organization for Research and Treatment of Cancer/Invasive Fungal Infections Cooperative Group criteria, and assessed risk factors for the development of IA. Random-effects meta-analysis was used to pool data. We identified eleven case-control studies (319 IA cases and 835 controls). There was an increased risk of IA among recipients with underlying chronic lung diseases (odds ratio [OR] = 7.26; 95% confidence interval [CI] = 1.05-50.06) and among those with diabetic nephropathy (OR = 1.65; 95% CI = 1.10-2.48). Requiring posttransplant hemodialysis (OR = 3.69; 95% CI = 2.13-6.37) or surgical reintervention (OR = 6.28; 95% CI = 1.67-23.66) was also associated with an increased risk. Moreover, a positive link was identified between IA and posttransplant bacterial infection (OR = 7.51; 95% CI = 4.37-12.91), respiratory tract viral infection (OR = 7.75; 95% CI = 1.60-37.57), cytomegalovirus infection or disease (OR = 2.67; 95% CI = 1.12-6.32), and acute graft rejection (OR = 3.01; 95% CI = 1.78-5.09). In contrast, receiving a kidney from a living donor was associated with a reduced risk (OR = 0.65; 95% CI = 0.46-0.93). KT recipients who accumulate several of these conditions should be closely monitored, and a low threshold of suspicion for IA should be maintained. Future studies should explore the benefit of mold-active prophylaxis in this subgroup of KT recipients at highest risk.

14.

Objective

Many studies have compared the safety and efficacy of calcineurin inhibitor (CNI) avoidance or CNI withdrawal regimens with standard CNI regimens, but the results remain controversial. The aim of this systematic review and meta-analysis is to provide a thorough review and objective appraisal of the safety and efficacy of CNI avoidance and CNI withdrawal protocols.

Methods

We searched PubMed, EMBASE, and the reference lists of retrieved studies to identify randomized controlled trials (RCTs) that examined CNI-free regimens, CNI avoidance, or CNI withdrawal in kidney transplantation. Eight publications involving 27 different RCTs and a total of 3953 patients were included in the analysis.

Results

Use of mammalian target of rapamycin inhibitors, namely sirolimus (SRL), in combination with mycophenolate conserves graft function at 1 year (glomerular filtration rate [GFR]: mean difference [MD] 6.21, 95% CI 0.02–12.41, P = .05; serum creatinine: MD −0.11, 95% CI −0.19 to −0.03, P = .01) and at 2 years post-transplant (GFR: MD 13.96, 95% CI 7.32–20.60, P < .0001). Similarly, early withdrawal (≤6 months) of CNIs protects graft function at 1 year after transplant (GFR: MD 7.03, 95% CI 4.84–9.23, P < .00001; serum creatinine: MD −0.21, 95% CI −0.22 to −0.19, P < .00001). CNI avoidance and withdrawal strategies are associated with a higher incidence of acute rejection at 1 year post-transplant (odds ratio [OR] 1.74, 95% CI 1.08–2.81, P = .02; OR 1.78, 95% CI 1.35–2.34, P < .0001, respectively), whereas at 2 years after transplant the differences were no longer significant (OR 0.92, 95% CI 0.33–2.51, P = .86; OR 2.42, 95% CI 1.01–5.82, P = .05, respectively). Meanwhile, neither adverse events nor patient/graft survival differed significantly between the CNI-free and CNI protocols at 1 and 2 years. Regarding long-term results in the published RCTs, CNI-free and CNI withdrawal regimens achieved better renal function than CNI regimens, with no significant difference in patient and graft survival, acute rejection, or most reported adverse events.

Conclusions

In conclusion, this systematic review and meta-analysis suggests that renal recipients with early withdrawal of CNI drugs, or with CNI avoidance using SRL, better preserve graft function at 1 and 2 years post-transplant. Although CNI regimens perform no better with respect to acute rejection at 2 years, they greatly decrease its incidence during the first year after transplantation. CNI avoidance and withdrawal regimens improve long-term renal function and perform similarly with respect to acute rejection, patient and graft survival, and adverse events. Given the limited number of long-term studies, more high-quality RCTs are needed.

15.
The PIRCHE (Predicted Indirectly ReCognizable HLA Epitopes) score is an HLA epitope matching algorithm that estimates the number of T-cell epitopes presented by mismatched HLA. PIRCHE-II numbers are associated with de novo donor-specific antibody (dnDSA) formation after liver transplantation and with kidney allograft survival after renal transplantation. The aim of our study was to assess the PIRCHE-II score in recipients on calcineurin inhibitor (CNI)-free maintenance immunosuppression.

This was a retrospective study of forty-one liver transplant recipients on CNI-free immunosuppression with available liver allograft biopsies. Donors and recipients were HLA typed, and the HLA-derived mismatched peptide epitopes that could be presented by the recipient's HLA-DRB1 molecules were calculated with the PIRCHE-II algorithm. Associations between PIRCHE-II scores and immune-mediated graft events were assessed using receiver operating characteristic curves and subsequent univariate and multivariate analyses.

CNI-free patients with cellular rejection, humoral rejection, or severe portal inflammation had higher mean PIRCHE-II scores than patients with normal liver allografts. PIRCHE-II score and donor age were independent risk factors for liver graft survival in CNI-free patients (HR: 8.0, 95% CI: 1.3–49, p = .02; and HR: 0.88, 95% CI: 0.00–0.96, p = .007, respectively).

PIRCHE-II scores could be predictive of liver allograft survival in CNI-free patients after liver transplantation. Larger studies are needed to confirm these results.

16.
Introduction and aim: Angiotensin II type 1 receptor antibodies (AT1R-Ab) are associated with graft rejection and poor graft outcomes in kidney transplantation (KT). We aimed to assess the frequency of preformed AT1R-Ab and their impact on graft function and survival at 1 year after KT in a low immunological risk cohort. Methods: We performed a prospective, observational cohort study in 67 adult KT recipients transplanted between 2018 and 2019. A cut-off value >10 U/mL was used for AT1R-Ab detection. Results: The frequency of preformed AT1R-Ab was 10.4%, with a median level of 8.4 U/mL (IQR: 6.8–10.4). Donor-specific anti-human leukocyte antigen antibodies (HLA-DSA) were absent, no case of biopsy-proven rejection was reported, and the incidence of graft failure was 7.5%. Estimated glomerular filtration rate (eGFR) was significantly reduced in the AT1R-Ab group (35 [29.8–55.2] vs 56.1 [41.3–66.5] mL/min, p = 0.02) at 1 year after KT. In multivariate linear regression analysis, preformed AT1R-Ab were an independent determinant of eGFR at 1 year after KT (β: −15.395; 95% CI: −30.49 to −0.30; p = 0.04). In Cox multivariate regression analysis, preformed AT1R-Ab were not associated with graft failure (HR: 1.36; 95% CI: 0.10–14.09; p = 0.80). Conclusion: Preformed AT1R-Ab are an independent determinant of graft function but did not affect graft survival at 12 months after transplantation in this prospective low immunological risk cohort of KT recipients.

17.
Tuberculosis (TB) mortality is high among kidney transplant (KT) recipients. Although local epidemiology is an important factor, diagnostic/therapeutic challenges and immunosuppressive therapy (ISS) may influence outcomes. We analyzed the cumulative incidence (CumI) of TB in KT recipients receiving a variety of ISS regimens, with long-term follow-up. Our retrospective single-center cohort study included all KT procedures performed between January 1, 1998, and August 31, 2014, with follow-up until August 31, 2014. Induction therapy was based on perceived immunological risk; maintenance ISS included prednisone and a calcineurin inhibitor (CNI) plus azathioprine (AZA), mycophenolic acid (MPA), or a mechanistic target of rapamycin inhibitor (mTORi); thirty-four patients received belatacept/MPA. KT was performed in 11,453 patients, followed for 1989 (IQR 932 to 3632) days. Among these, 152 patients were diagnosed with TB (CumI 1.32%). Median time from KT to TB was 18.8 (IQR 7.2 to 60) months, with 59% of patients diagnosed after the first year. Unadjusted analysis revealed an increasing cumulative incidence of TB across regimens (0.94% CNI/AZA vs 1.6% CNI/MPA [HR = 1.62, 95% CI = 1.13 to 2.34, P = .009] vs 2.85% CNI/mTORi [HR = 2.45, 95% CI = 1.49 to 4.32, P < .001] vs 14.7% belatacept/MPA [HR = 13.14, 95% CI = 5.27 to 32.79, P < .001]). Thirty-seven (24%) patients died, and 39 (25.6%) experienced graft loss. Cytomegalovirus infection (P = .02) and definitive ISS discontinuation (P < .001) were associated with death; rejection (P = .018) and ISS discontinuation (P = .005) were associated with graft loss. TB occurred at any time after KT and was influenced by ISS.

18.
Donor-specific antibodies (DSA) are integral to the development of antibody-mediated rejection (AMR). Chronic AMR is associated with high mortality and an increased risk of cardiac allograft vasculopathy (CAV). Anti-donor HLA antibodies are present in 3–11% of patients at the time of heart transplantation (HTx), with de novo DSA (predominantly anti-HLA class II) developing post-transplant in 10–30% of patients. DSA are associated with lower graft and patient survival after HTx, with one study suggesting a three-fold increase in mortality in patients who develop de novo DSA (dnDSA). DSA against HLA class II, notably DQ, carry a particularly high risk of graft loss. Although detection of DSA is not a criterion for the pathologic diagnosis of AMR, circulating DSA are found in almost all cases of AMR. MFI thresholds of approximately 5000 for DSA against class I antigens, 2000 against class II antigens, or an overall cut-off of 5000–6000 for any DSA have been suggested as predictive of AMR. There is no firm consensus on pre-transplant strategies to treat HLA antibodies or on the elimination of antibodies after a diagnosis of AMR. Minimizing the risk of dnDSA is rational, but data on risk factors in HTx are limited. The effect of different immunosuppressive regimens is largely unexplored in HTx, but studies in kidney transplantation emphasize the importance of adherence and of maintaining adequate immunosuppression. One study has suggested a reduced risk of dnDSA with rabbit antithymocyte globulin induction. Management of DSA pre- and post-HTx varies, but most centers typically rely on plasmapheresis or immunoadsorption, with or without rituximab and/or intravenous immunoglobulin. Based on the literature and a multi-center survey, an algorithm for a suggested surveillance and therapeutic strategy is provided.

19.
Injury, 2022, 53(2): 294-300
Aim: The objective of this study was to assess the efficacy and safety of intravenous TXA administration in elderly patients undergoing hip fracture surgery, focusing on the effect of various dosages. Methods: A systematic search of PubMed, Embase, and the Cochrane Library was conducted up to February 2021. Our primary outcome was perioperative total blood loss; secondary outcomes included transfusion rate, mean count of transfused RBC units, and the incidence of thromboembolic events. A subgroup analysis was performed with respect to TXA dosage. Results: Out of 146 records identified, 10 randomized controlled studies met the selection criteria. Data synthesis revealed that TXA significantly reduced total blood loss, by 229.45 mL in favor of TXA (95% CI: 189.5-269.4), and the transfusion rate, by 40% (RR = 0.60; 95% CI: 0.47-0.78). No increase in the rate of thromboembolic events was observed (RR = 1.08; 95% CI: 0.68-1.69). Furthermore, sub-analysis with respect to TXA dosage showed no significant difference in total blood loss reduction between single-dose and multiple-dose studies (223 vs 233.5 mL, p = 0.85), while a trend toward a lower complication rate was observed in patients receiving a single dose of ≤15 mg/kg. Conclusions: This meta-analysis provides strong evidence that TXA is a safe and effective agent for reducing perioperative blood loss in hip fracture surgery. Compared with higher dosages, a single dose of 15 mg/kg is associated with a non-significant reduction in adverse events while achieving comparable outcomes.

20.
Transplant Immunology, 2014, 30(1-4): 22-27
Historic red blood cell transfusion (RBCT) may induce anti-HLA antibody which, if donor specific (DSA), is associated with increased antibody-mediated rejection (AMR). Whether post-operative RBCT influences this risk is unknown. We examined the RBCT history of 258 renal transplant recipients stratified according to prevalent recipient HLA antibody status (DSA, Non-DSA, or No Antibody). AMR occurred more frequently in patients who received RBCT both pre- and post-transplant than in all other groups (Pre + Post-RBCT 21%, Pre-RBCT 4%, Post-RBCT 6%, No-RBCT 6%; HR 4.1, p = 0.004). Of the 63 patients who received Pre + Post-RBCT, 65% (13/20) of those with DSA developed AMR, compared with 0/6 in the Non-DSA group and 2/37 (5%) in the No-Antibody group (HR 13.9, p < 0.001). In patients who received No-RBCT, Pre-RBCT, or Post-RBCT, there was no difference in AMR between patients with DSA, Non-DSA, or No Antibody. Graft loss was independently associated with Pre + Post-RBCT (HR 6.5, p = 0.001), AMR (HR 23.9, p < 0.001), and non-AMR causes (HR 6.0, p = 0.003) after adjusting for DSA and delayed graft function. Re-exposure to RBCT at the time of transplant was associated with increased AMR only in patients with preformed DSA, suggesting that RBCT provides additional allostimulation. Patients receiving Pre + Post-RBCT also had an increased risk of graft loss independent of AMR or DSA. Both pre- and post-procedural RBCT in renal transplantation are associated with modification of immunological risk and warrant additional study.


Copyright © Beijing Qinyun Technology Development Co., Ltd. (京ICP备09084417号)