Similar Articles
20 similar articles found.
1.
Kidney transplant (KT) outcomes for HIV-infected (HIV+) persons are excellent, yet acute rejection (AR) is common and optimal immunosuppressive regimens remain unclear. Early steroid withdrawal (ESW) is associated with AR in other populations, but its utilization and impact are unknown in HIV+ KT. Using SRTR data, we identified 1225 HIV+ KT recipients between January 1, 2000, and December 31, 2017, without AR, graft failure, or mortality during the KT admission, and compared those with ESW to those with steroid continuation (SC). We quantified associations between ESW and AR using multivariable logistic regression and interval-censored survival analysis, and with graft failure and mortality using Cox regression, adjusting for donor, recipient, and immunologic factors. ESW utilization was 20.4%, with more zero-HLA-mismatch transplants (8% vs 4%), living donors (26% vs 20%), and lymphodepleting induction (64% vs 46%) than in the SC group. ESW utilization varied widely across 129 centers, with less use at high- versus moderate-volume centers (6% vs 21%, P < .001). AR was more common with ESW by 1 year (18.4% vs 12.3%; aOR: 1.61, 95% CI: 1.08–2.41, P = .04) and over the study period (aHR: 1.39, 95% CI: 1.02–1.90, P = .03), without differences in death-censored graft failure (aHR: 0.91, 95% CI: 0.60–1.36, P = .33) or mortality (aHR: 1.15, 95% CI: 0.75–1.77, P = .45). To reduce AR after HIV+ KT, tailoring of ESW utilization is reasonable.

2.
Outcomes of locally rejected kidneys transplanted at other centers (import KTX) are unknown. SRTR data from 2000 to 2009 of deceased‐donor KTXs excluding 0‐mismatch, paybacks, and other mandatory shares were compared by location of KTX at local (n = 48 165), regional (n = 4428) or national (n = 4104) centers using multivariable regression models. Compared to nonmandatory share local transplants, import KTX were associated with significantly higher overall risks of patient death (regional aHR 1.15, p < 0.01; national aHR 1.14, p < 0.01) and graft failure (regional aHR 1.17, p < 0.01; national aHR 1.21, p < 0.01). In paired analysis, the risk of delayed graft function (DGF) for import KTX was higher compared to locally transplanted mates (regional aOR 1.53, p < 0.01; national aOR 2.14, p < 0.01); however, despite longer ischemia times, overall graft survival was similar. Mean cold ischemia times (CIT) pre‐ and post‐DonorNet® were similar for local and regional transplants, but significantly higher for national transplants (28.9 ± 9.9 vs. 29.9 ± 9.7 h, respectively, p = 0.01). Import KTX is associated with increased risks of graft failure, patient death and DGF. In the DonorNet® era, cold ischemia times of kidneys imported to regional centers have not improved compared to pre‐DonorNet®, and those of kidneys imported to national centers are significantly prolonged.

3.
Kidney paired donation (KPD) is an important tool to facilitate living donor kidney transplantation (LDKT). Concerns remain over prolonged cold ischemia times (CIT) associated with shipping kidneys long distances through KPD. We examined the association between CIT and delayed graft function (DGF), allograft survival, and patient survival for 1267 shipped and 205 nonshipped/internal KPD LDKTs facilitated by the National Kidney Registry in the United States from 2008 to 2015, compared to 4800 unrelated, nonshipped, non‐KPD LDKTs. Shipped KPD recipients had a median CIT of 9.3 hours (range = 0.25‐23.9 hours), compared to 1.0 hour for internal KPD transplants and 0.93 hours for non‐KPD LDKTs. Each hour of CIT was associated with a 5% increased odds of DGF (adjusted odds ratio: 1.05, 95% confidence interval [CI]: 1.02‐1.09, P < .01). However, there was no significant association between CIT and all‐cause graft failure (adjusted hazard ratio [aHR]: 1.01, 95% CI: 0.98‐1.04, P = .4), death‐censored graft failure (aHR: 1.02, 95% CI: 0.98‐1.06, P = .4), or mortality (aHR: 1.00, 95% CI: 0.96‐1.04, P > .9). This study of KPD‐facilitated LDKTs found no evidence that long CIT is a concern for reduced graft or patient survival. Studies with longer follow‐up are needed to refine our understanding of the safety of shipping donor kidneys through KPD.
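As a rough illustration of the per-hour effect reported above (a sketch under simplifying assumptions, not the authors' analysis): if, as the logistic model implies, each hour of CIT multiplies the odds of DGF by the adjusted odds ratio, the cumulative odds multiplier over a given CIT can be compounded directly. The function name and the comparison values below are illustrative only.

```python
# Sketch: compounding a per-hour adjusted odds ratio for DGF over cold
# ischemia time, assuming log-odds are linear in CIT (as an aOR of 1.05
# per hour from a logistic model implies). Not the study's code.

def compounded_odds_ratio(or_per_hour: float, hours: float) -> float:
    """Odds multiplier for `hours` of cold ischemia relative to zero hours."""
    return or_per_hour ** hours

# Median CIT: 9.3 h for shipped KPD kidneys vs ~1.0 h for internal transplants.
shipped = compounded_odds_ratio(1.05, 9.3)
internal = compounded_odds_ratio(1.05, 1.0)
print(f"shipped: {shipped:.2f}x odds, internal: {internal:.2f}x odds")
```

Under this assumption the median shipped CIT corresponds to roughly a 1.6-fold increase in the odds of DGF versus zero cold ischemia, consistent in spirit with the higher DGF but similar graft survival reported above.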

4.
Several clinical and experimental models have underlined the role of the CXCR3‐binding chemokines in immune‐mediated kidney diseases. This study aimed to investigate the predictive value of measuring pretransplant CXCL9 levels for acute rejection (AR) onset and kidney transplantation outcome. Pretransplant serum levels of CXCL9 were tested retrospectively in 252 kidney graft recipients; stratification into two groups according to CXCL9 level (<272.1 pg/ml vs. >272.1 pg/ml) showed highly significant differences in 5‐year survival rates (97.7% vs. 73.3%; P < 0.001). Multivariate analysis demonstrated that among the analysed variables, CXCL9 [relative risk (RR) 11.708] and AR (RR 3.604) had the highest predictive power for graft loss. Accordingly, patients with AR (254.4 ± 22.1; P < 0.05) and, even more so, those with anti‐thymoglobulin (ATG)‐treated AR showed increased pretransplant serum CXCL9 levels (319.3 ± 28.1, P < 0.001). Moreover, CXCL9 expression and distribution were investigated in tissue specimens obtained from 10 patients affected by AR, and wide CXCL9 expression was detected not only in infiltrating inflammatory cells but also in vascular and tubular structures. Measurement of pretransplant serum CXCL9 levels might represent a clinically useful parameter to identify subjects at high risk of AR and graft failure. These findings might be used for the individualization of immunosuppressive therapies.

5.
Direct‐acting antiviral medications (DAAs) have revolutionized care for hepatitis C positive (HCV+) liver (LT) and kidney (KT) transplant recipients. Scientific Registry of Transplant Recipients registry data were integrated with national pharmaceutical claims (2007‐2016) to identify HCV treatments before January 2014 (pre‐DAA) and after (post‐DAA), stratified by donor (D) and recipient (R) serostatus and payer. Pre‐DAA, 18% of HCV+ LT recipients were treated within 3 years, with no differences by donor serostatus or payer. Post‐DAA, only 6% of D‐/R+ recipients, 19.8% of D+/R+ recipients with public insurance, and 11.3% with private insurance were treated within 3 years (P < .0001). LT recipients treated for HCV pre‐DAA experienced higher rates of graft loss (adjusted hazard ratio [aHR] 1.85, 95% CI 1.34‐2.10, P < .0001) and death (aHR 1.68, 95% CI 1.47‐1.91, P < .0001). Post‐DAA, HCV treatment was not associated with death (aHR 0.67, 95% CI 0.34‐1.32, P = .25) or graft failure (aHR 0.64, 95% CI 0.32‐1.26, P = .20) in D+/R+ LT recipients. Treatment increased in D+/R+ KT recipients (5.5% pre‐DAA vs 12.9% post‐DAA), but did not differ by payer status. DAAs reduced the risk of death after D+/R+ KT by 57% (aHR 0.43, 95% CI 0.19‐0.95, P = .04) and of graft loss by 46% (aHR 0.54, 95% CI 0.27‐1.07, P = .08). HCV treatment with DAAs appears to improve HCV+ LT and KT outcomes; however, access to these medications appears limited in both LT and KT recipients.

6.
A recent study reported that kidney transplant recipients of offspring living donors had higher graft loss and mortality. This seemed counterintuitive, given the excellent HLA matching and younger age of offspring donors; we were concerned about residual confounding and other study design issues. We used Scientific Registry of Transplant Recipients data 2001‐2016 to evaluate death‐censored graft failure (DCGF) and mortality for recipients of offspring versus nonoffspring living donor kidneys, using Cox regression models with interaction terms. Recipients of offspring kidneys had lower DCGF than recipients of nonoffspring kidneys (15‐year cumulative incidence 21.2% vs 26.1%, P < .001). This association remained after adjustment for recipient and transplant factors (adjusted hazard ratio [aHR] 0.77, 95% CI 0.73‐0.82, P < .001), and was attenuated among African American donors (aHR 0.85, 95% CI 0.77‐0.95; interaction P = .01) and female recipients (aHR 0.84, 95% CI 0.77‐0.91, P < .001). Although offspring kidney recipients had higher mortality (15‐year mortality 56.4% vs 37.2%, P < .001), this difference largely disappeared with adjustment for recipient age alone (aHR 1.06, 95% CI 1.02‐1.10, P = .002) and was nonsignificant after further adjustment for other recipient characteristics (aHR 0.97, 95% CI 0.93‐1.01, P = .1). Kidneys from offspring donors were associated with lower graft failure and comparable mortality. An otherwise eligible donor should not be dismissed because they are the offspring of the recipient, and we encourage continued individualized counseling for potential donors.

7.
Delayed graft function (DGF) has a negative impact on graft survival in donation after brain death (DBD) but not in donation after cardiac death (DCD) kidneys. However, older donor age is associated with graft loss in DCD transplants. We sought to examine the interaction between donor age and DGF in DBD kidneys. This is a single‐center, retrospective review of 657 consecutive DBD recipients transplanted between 1990 and 2005. We stratified the cohort by decades of donor age and studied the association between DGF and graft failure using Cox models. The risk of graft loss associated with DGF was not significantly increased for donor age below 60 years (adjusted hazard ratio [aHR] 1.12, 1.51, and 0.90, respectively, for age <40, 41–50 and 51–60 years) but was significantly increased above 60 years (aHR 2.67; P = 0.019). Analysis of death‐censored graft failure yielded similar results for donor age below 60 years and showed a substantially increased risk with donors above 60 years (aHR 6.98, P = 0.002). This analysis reveals an unexpectedly strong impact of older donor age on the association between DGF and renal transplant outcomes. Further research is needed to determine the best use of kidneys from donors above 60 years old, where DGF is expected.

8.
Donor‐specific antibodies (DSA) increase the risk of allograft rejection and graft failure. They may be present before transplant or develop de novo after transplantation. Here, we studied the evolution of preformed DSA and their impact on graft outcome in kidney transplant recipients. Using the Luminex Single Antigen assay, we analyzed the day‐of‐transplantation sera of 239 patients who received a kidney transplant. Thirty‐seven patients (15.5%) had pre‐existing DSA detected on the day of transplantation. After 5 years, the pre‐existing DSA had disappeared in 22 patients, whereas they persisted in 12. Variables associated with DSA persistence were age <50 years (P = 0.009), a history of previous transplantation (P = 0.039), the presence of class II DSA (P = 0.009), an MFI of preformed DSA >3500 (P < 0.001), and the presence of two or more DSA (P < 0.001). DSA persistence was associated with a higher risk of graft loss and antibody‐mediated rejection. Previously undetected preformed DSA are deleterious to graft survival only when they persist after transplantation.

9.
Donor‐specific alloantibodies (DSA) have been associated with rejection and shorter graft survival after orthotopic liver transplantation (OLT). We examined the role of DSA in nonanastomotic biliary strictures (NAS) after OLT. Patients receiving first OLT who developed NAS (n = 68) and a control group without NAS (n = 83), with pre‐OLT and 12 months post‐OLT serum samples, were included. DSA were specified using the Luminex single antigen test. Risk factors for NAS and graft survival were analyzed. The presence of preformed DSA was not significantly different between patients with NAS and controls (P = .89). After 12 months, 26.5% of NAS patients and 16.9% of controls had generated de novo DSA (P = .15). Neither de novo class I DSA nor de novo class II DSA were associated with NAS. De novo DSA generally developed after the diagnosis of NAS. Time‐dependent regression analysis identified both NAS (aHR 8.05, CI 3.28–19.77, P < .01) and de novo class II DSA (aHR 2.84, CI 1.38–5.82, P < .01) as independent risk factors for graft loss. Preformed or de novo DSA were not associated with the development of NAS. However, NAS as well as de novo class II DSA were independent risk factors for graft loss after OLT.

10.
Study Type – Prognosis (inception cohort). Level of Evidence: 2a. What's known on the subject? and What does the study add? There is little data on the utility of digital rectal examination (DRE) as a diagnostic tool in the era of prostate‐specific antigen (PSA) testing. Using a population‐based database, we found that detection of prostate cancer while still localized among men with high‐grade, PSA‐occult disease may confer a survival benefit.

OBJECTIVE

  • To determine whether detection of high‐grade prostate cancer while still clinically localised on digital rectal examination (DRE) can improve survival in men with a normal prostate‐specific antigen (PSA) level.

PATIENTS AND METHODS

  • From the Surveillance, Epidemiology and End Results database, 166 104 men with prostate cancer diagnosed between 2004 and 2007 were identified.
  • Logistic regression was used to identify factors associated with the occurrence of palpable, PSA‐occult (PSA level <2.5 ng/mL), Gleason score 8–10 prostate cancer.
  • Fine and Gray's and Cox multivariable regressions were used to analyse whether demographic, treatment, and clinicopathological factors were associated with the risk of prostate cancer‐specific mortality (PCSM) and all‐cause mortality (ACM), respectively.

RESULTS

  • Both increasing age (adjusted odds ratio [aOR] 1.02, 95% confidence interval [CI] 1.01–1.03; P < 0.001) and White race (aOR 1.26, 95% CI 1.03–1.54; P = 0.027) were associated with palpable, Gleason 8–10 prostate cancer. Of 166 104 men, 685 (0.4%) had this subset of prostate cancer.
  • Significant factors associated with risk of PCSM included PSA level (adjusted hazard ratio [aHR] 0.71, 95% CI 0.51–0.99; P = 0.04), higher Gleason score (aHR 2.20, 95% CI 1.25–3.87; P = 0.006), and T3–T4 vs T2 disease (aHR 3.11, 95% CI 1.79–5.41; P < 0.001).
  • Significant factors associated with risk of ACM included age (aHR 1.03, 95% CI 1.01–1.06; P = 0.006), higher Gleason score (aHR 2.05, 95% CI 1.36–3.09; P < 0.001), and T3–T4 vs T2 disease (aHR 2.11, 95% CI 1.38–3.25; P < 0.001).

CONCLUSIONS

  • Clinically localised disease on DRE among men with PSA‐occult, high‐grade prostate cancer was associated with improved PCSM and ACM, suggesting that DRE in this cohort (older age and White race) may have the potential to improve survival.

11.

Background

This study aimed to investigate the impact of non‐anatomical liver resection (NAR) versus anatomical resection (AR) in patients with colorectal liver metastasis (CRLM), with regard to perioperative and long‐term outcomes.

Methods

Analysis of prospectively collected data for patients with CRLM who underwent either AR or NAR between January 1993 and August 2011 was performed. The impact of AR and NAR on morbidity, mortality, margin positivity, redo liver resections, overall survival (OS) and disease free survival (DFS) was analysed.

Results

A total of 1574 resections for CRLM were performed. Of these, 249 were redo resections and 334 patients underwent combined AR and NAR; hence, 583 were excluded. In total, 582 AR and 409 NAR were analysed. The median age was 66 years (range 23.8–91.8). Median follow‐up was 32.2 months (interquartile range 17.5–56.9). The need for postoperative transfusion (11.6% versus 2.2%, P < 0.0001), overall complications (25% versus 10.7%, P < 0.0001) and 90‐day mortality (4.9% versus 1.2%, P < 0.0001) were higher in the AR group. R1 (positive‐margin) resection rates (AR 26.2% versus NAR 25%, P = 0.69) and the number of patients with intrahepatic recurrence (AR 17.5% versus NAR 22%, P = 0.08) were similar between the two groups. However, the need for redo liver surgery was higher in the NAR group (15.4% versus 8.7%, P < 0.001). OS (NAR 34.1 months versus AR 31.4 months, P = 0.002) and DFS (NAR 18.8 months versus AR 16.9 months, P = 0.031) were longer in the NAR group.

Conclusions

Parenchymal‐preserving surgery (NAR) is associated with lower complication rates and better OS and DFS when compared with AR, without compromising margin status. However, NAR increases the need for repeat liver resections.

12.
Transplant eligibility for candidates who use tobacco and/or marijuana varies among transplant centers. This study compared the impact of marijuana use and tobacco use on kidney transplant recipient outcomes. Kidney transplant recipients at a single center from 2001 to 2015 were reviewed for outcomes of all‐cause graft loss, infection, biopsy‐proven acute rejection, and estimated glomerular filtration rate across four groups: marijuana‐only users, marijuana and tobacco users, tobacco‐only users, and nonusers. The cohort (N = 919) included 48 (5.2%) marijuana‐only users, 45 (4.8%) marijuana and tobacco users, 136 (14.7%) tobacco‐only users, and 690 (75%) nonusers. Smoking status was not significantly associated with acute rejection, estimated glomerular filtration rate, or pneumonia within one year post‐transplant in an adjusted model. Compared to nonuse, combined marijuana and tobacco use and tobacco‐only use were significantly associated with increased risk of graft loss (aHR 1.68, P = .034 and aHR 1.52, P = .006, respectively). Patients with isolated marijuana use had similar overall graft survival compared to nonusers (aHR 1.00, P = .994). Marijuana use should not be an absolute contraindication to kidney transplant.

13.
Increased risk donors (IRDs) may inadvertently transmit blood‐borne viruses to organ recipients through transplant. Rates of IRD kidney transplants in children and the associated outcomes are unknown. We used the Scientific Registry of Transplant Recipients to identify pediatric deceased donor kidney transplants that were performed in the United States between January 1, 2005 and December 31, 2015. We used Cox regression analysis to compare patient and graft survival between IRD and non‐IRD recipients, and a sequential Cox approach to evaluate survival benefit after IRD transplants compared with remaining on the waitlist and never accepting an IRD kidney. We studied 328 recipients with and 4850 without IRD transplants. The annual IRD transplant rates ranged from 3.4% to 13.2%. IRDs were more likely to be male (P = .04), black (P < .001), and to have died from head trauma (P = .006). IRD recipients had higher mean cPRA (0.085 vs 0.065, P = .02). After multivariate adjustment, patient survival after IRD transplants was significantly higher compared with remaining on the waitlist (adjusted hazard ratio [aHR]: 0.48, 95% CI: 0.26‐0.88, P = .018); however, patient (aHR: 0.93, 95% CI: 0.54‐1.59, P = .79) and graft survival (aHR: 0.89, 95% CI: 0.70‐1.13, P = .32) were similar between IRD and non‐IRD recipients. We recommend that IRDs be considered for transplant in children.

14.
Delayed graft function (DGF) following deceased donor kidney transplantation is associated with inferior outcomes. Delayed graft function following living‐donor kidney transplantation is less common, but its impact on graft survival is unknown. We therefore sought to determine risk factors for DGF following living‐donor kidney transplantation and the effect of DGF on living‐donor kidney graft survival. We analyzed living‐donor kidney transplants performed between 2000 and 2014 in the UNOS dataset. A total of 64 024 living‐donor kidney transplant recipients were identified, of whom 3.6% developed DGF. Cold ischemic time, human leukocyte antigen mismatch, donor age, panel reactive antibody, recipient diabetes, donor and recipient body mass index, recipient race and gender, right nephrectomy, open nephrectomy, dialysis status, ABO incompatibility, and previous transplants were independent predictors of DGF in living‐donor kidney transplants. Five‐year graft survival among living‐donor kidney transplant recipients with DGF was significantly lower than in those without DGF (65% vs 85%, respectively, P < 0.001). DGF more than doubled the risk of subsequent graft failure (hazard ratio = 2.3, 95% confidence interval: 2.1–2.6; P < 0.001). DGF after living‐donor kidney transplantation is associated with inferior allograft outcomes. Minimizing modifiable risk factors may improve outcomes in living‐donor kidney transplantation.

15.
Beyond the first posttransplant year, 3% of kidney transplants fail annually. In a prospective, multicenter cohort study, we tested the relative impact of early versus late events on risk of long‐term death‐censored graft failure (DCGF). In grafts surviving at least 90 days, early events (acute rejection [AR] and delayed graft function [DGF] before day 90) were recorded; serum creatinine (Cr) at day 90 was defined as baseline. Thereafter, a 25% rise in serum Cr or new‐onset proteinuria triggered graft biopsy (index biopsy, IBx), allowing comparison of risk of DCGF associated with early events (AR, DGF, baseline serum Cr >2.0 mg/dL) to that associated with later events (IBx). Among 3678 patients followed for 4.7 ± 1.9 years, 753 (20%) had IBx at a median of 15.3 months posttransplant. Early AR (HR = 1.77, P < .001) and elevated Cr at Day 90 (HR = 2.56, P < .0001) were associated with increased risk of DCGF; however, later‐onset dysfunction requiring IBx had far greater impact (HR = 13.8, P < .0001). At 90 days, neither clinical characteristics nor early events distinguished those who subsequently did or did not undergo IBx or suffer DCGF. To improve long‐term kidney allograft survival, management paradigms should promote prompt diagnosis and treatment of both early and later events.
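To put the headline figure above in perspective (a simplifying sketch, not the study's analysis): if the 3% annual failure rate were constant beyond the first posttransplant year, cumulative death-censored graft survival would compound geometrically. The function below is illustrative only.

```python
# Sketch: cumulative graft survival under an assumed constant 3% annual
# failure rate (a simplification of the abstract's "3% fail annually").

def cumulative_survival(annual_failure: float, years: int) -> float:
    """Fraction of grafts still surviving after `years` at a constant
    annual failure rate, starting from the end of the first year."""
    return (1.0 - annual_failure) ** years

for y in (5, 10):
    print(f"{y} years: {cumulative_survival(0.03, y):.1%} surviving")
```

Under this constant-hazard assumption, roughly a quarter of one-year survivors would be lost by year 10, which is why the abstract emphasizes detecting both early and later events.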

16.
Cirrhosis is a significant marker of adverse postoperative outcome. A large national database was analyzed for abdominal wall hernia repair outcomes in cirrhotic vs. non-cirrhotic patients. Data from cirrhotics and non-cirrhotics undergoing inpatient repair of abdominal wall hernias (excluding inguinal) from 1999 to 2004 were obtained from the University HealthSystem Consortium (UHC) database. Differences were assessed using standard statistical methods, with P < 0.05 considered significant. Inpatient hernia repair was performed in 30,836 non-cirrhotic (41.5% male) and 1,197 cirrhotic patients (62.7% male; P < 0.0001). Cirrhotics had a higher age distribution (P < 0.0001), no race differences (P = 0.64), underwent ICU admission more commonly (15.9% vs. 6%; P < 0.0001), and had a longer LOS (5.4 vs. 3.7 days), higher morbidity (16.5% vs. 13.8%; P = 0.008), and higher mortality (2.5% vs. 0.2%; P < 0.0001) compared to non-cirrhotics. Several comorbidities had a higher associated mortality in cirrhosis: functional impairment, congestive heart failure, renal failure, nutritional deficiencies, and peripheral vascular disease. The complications with the highest associated mortality in cirrhotics were aspiration pneumonia, pulmonary compromise, myocardial infarction, pneumonia, and metabolic derangements. Cirrhotics underwent emergent surgery more commonly than non-cirrhotics (58.9% vs. 29.5%; P < 0.0001), with longer LOS regardless of elective or emergent surgery. Although elective surgical morbidity in cirrhotics was no different from non-cirrhotics (15.6% vs. 13.5%; P = 0.18), emergent surgery morbidity was (17.3% vs. 14.5%; P = 0.04). While differences in elective surgical mortality in cirrhotics approached significance (0.6% vs. 0.1%; P = 0.06), mortality was 7-fold higher in emergencies (3.8% vs. 0.5%; P < 0.0001). Patients with cirrhosis carry a significant risk of adverse outcome after abdominal wall hernia repair compared to non-cirrhotics, particularly with emergent surgery. It may, however, be safer than previously thought. Ideally, patients with cirrhosis should undergo elective hernia repair after medical optimization.

17.
Infections continue to be a major cause of post‐transplant morbidity and mortality, requiring increased health services utilization, but the magnitude of this impact has not been well quantified. Using national administrative databases, we compared mortality, acute care health services utilization, and costs in solid organ transplant (SOT) recipients versus nontransplant patients using a retrospective cohort of hospitalizations in Canada (excluding Manitoba/Quebec) between April 2009 and March 2014 with a diagnosis of pneumonia, urinary tract infection (UTI), or sepsis. Costs were analyzed using multivariable linear regression. We examined 816 324 admissions in total: 408 352 pneumonia; 328 066 UTIs; and 128 275 sepsis. Unadjusted mean costs were greater in SOT compared to non‐SOT patients with pneumonia [(C$14 923 ± C$29 147) vs. (C$11 274 ± C$18 284)] and sepsis [(C$23 434 ± C$39 685) vs. (C$20 849 ± C$36 257)]. Mortality (7.6% vs. 12.5%; P < 0.001), long‐term care transfer (5.3% vs. 16.5%; P < 0.001), and mean length of stay (11.0 ± 17.7 days vs. 13.1 ± 24.9 days; P < 0.001) were lower in SOT. More SOT patients could be discharged home (63.2% vs. 44.3%; P < 0.001), but they required more specialized care (23.5% vs. 16.1%; P < 0.001). Adjusting for age and comorbidities, hospitalization costs for SOT patients were 10% (95% CI: 8–12%) lower compared to non‐SOT patients. Increased absolute hospitalization costs for these infections are tempered by lower adjusted costs and favorable clinical outcomes.

18.
Current research is focusing on identifying bioclinical parameters for risk stratification of renal allograft loss, largely due to antibody‐mediated rejection (AMR). We retrospectively investigated graft outcome predictors in 24 unsensitized pediatric kidney recipients developing HLA de novo donor‐specific antibodies (dnDSAs), and treated for late AMR with plasmapheresis + low‐dose IVIG + Rituximab or high‐dose IVIG + Rituximab. Renal function and DSA properties were assessed before and longitudinally post treatment. The estimated GFR (eGFR) decline after treatment was dependent on a negative % eGFR variation in the year preceding treatment (P = 0.021) but not on eGFR at treatment (P = 0.74). At a median follow‐up of 36 months from AMR diagnosis, 10 patients lost their graft. Altered eGFR (P < 0.001) and presence of C3d‐binding DSAs (P = 0.005) at treatment, and failure to remove DSAs (P = 0.01) were negatively associated with graft survival in the univariable analysis. Given the relevance of DSA removal for therapeutic success, we analyzed antibody properties dictating resistance to anti‐humoral treatment. In the multivariable analysis, C3d‐binding ability (P < 0.05), but not C1q‐binding, and high mean fluorescence intensity (P < 0.05) were independent factors characterizing DSAs scarcely susceptible to removal. The poor prognosis of late AMR is related to deterioration of graft function prior to treatment and failure to remove C3d‐binding and/or high‐MFI DSAs.

19.
Alemtuzumab (AZ) induction in hepatitis C‐seropositive (HCV+) kidney transplant (KTX) recipients may negatively affect patient survival; however, available information is scant. Using US registry data from 2003 to 2010 of adult HCV+ deceased‐donor KTXs (n = 4910), we examined outcomes by induction agent – AZ (n = 294), other T cell‐depleting agents (n = 2033; T cell), IL‐2 receptor blockade (n = 1135; IL‐2RAb), and no induction (n = 1448). On multivariate analysis, induction therapy was associated with significantly better overall patient survival with AZ [adjusted hazards ratio (aHR) 0.64, 95% confidence interval (CI) 0.45, 0.92], T cell (aHR 0.52, 95% CI 0.41, 0.65) or IL‐2RAb (aHR 0.67, 95% CI 0.53, 0.87), compared to no induction. A significant protective effect was also seen with AZ (aHR 0.63, 95% CI 0.40, 0.99), T cell (aHR 0.62, 95% CI 0.49, 0.78), and IL‐2RAb (aHR 0.62, 95% CI 0.47, 0.82) in terms of death‐censored graft survival relative to no induction. There were 88 HIV+/HCV+ coinfected recipients. Compared to noninduction, any induction (i.e. the three induction groups combined) was associated with similar overall patient survival (P = 0.2255) on univariate analysis. Induction therapy with AZ, other T cell‐depleting agents, or IL‐2RAb in HCV+ KTX is associated with better patient and death‐censored graft survival compared to noninduction. In HCV/HIV coinfected patients, induction is not contraindicated.

20.
Understanding the economic implications of induction and maintenance immunosuppression (ISx) is important in developing personalized kidney transplant (KTx) care. Using data from a novel integrated data set including financial records from the University Health System Consortium, Medicare, and pharmacy claims (2007-2014), we estimated the differences in the impact of induction and maintenance ISx regimens on transplant hospitalization costs and Medicare payments from KTx to 3 years. Use of thymoglobulin (TMG) significantly increased transplant hospitalization costs ($12 006; P = .02), compared with alemtuzumab and basiliximab. TMG resulted in lower Medicare payments in posttransplant years 1 (−$2058; P = .05) and 2 (−$1784; P = .048). Patients on steroid-sparing ISx incurred relatively lower total Medicare spending (−$10 880; P = .01) compared with patients on triple therapy (tacrolimus, antimetabolite, and steroids). MPA/AZA-sparing, mammalian target of rapamycin inhibitors-based, and cyclosporine-based maintenance ISx regimens were associated with significantly higher payments. Alternative ISx regimens were associated with different KTx hospitalization costs and longer-term payments. Future studies of clinical efficacy should also consider cost impacts to define the economic effectiveness of alternative ISx regimens.
