Similar Articles
20 similar articles retrieved.
1.
Excellent outcomes have been demonstrated in primary human immunodeficiency virus (HIV)–positive (HIV+) kidney transplant recipients, but a subset will lose their graft and seek retransplantation (re-KT). To date, no study has examined outcomes among HIV+ re-KT recipients. We studied risk for death and graft loss among 4149 (22 HIV+ vs. 4127 HIV-negative [HIV−]) adult re-KT recipients reported to the Scientific Registry of Transplant Recipients (SRTR) (2004–2013). Compared to HIV− re-KT recipients, HIV+ re-KT recipients were more commonly African American (63.6% vs. 26.7%, p < 0.001), more commonly infected with hepatitis C (31.8% vs. 5.0%, p < 0.001), and had a longer median time on dialysis (4.8 years vs. 2.1 years, p = 0.02). There was no significant difference in the interval between primary KT and re-KT by HIV status (1.5 years vs. 1.4 years, p = 0.52). HIV+ re-KT recipients experienced a 3.11-fold increased risk of death (adjusted hazard ratio [aHR]: 3.11, 95% confidence interval [CI]: 1.82–5.34, p < 0.001) and a 1.96-fold increased risk of graft loss (aHR: 1.96, 95% CI: 1.14–3.36, p = 0.01) compared to HIV− re-KT recipients. Re-KT among HIV+ recipients was associated with increased risk of mortality and graft loss. Future research is needed to determine whether a survival benefit is achieved with re-KT in this vulnerable population.
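The adjusted hazard ratios above come from multivariable Cox proportional hazards models. A minimal sketch of that kind of model on synthetic toy data (not SRTR records), assuming the lifelines package is available:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic placeholder data: time to death (years), death indicator,
# HIV serostatus, and one illustrative adjustment covariate.
df = pd.DataFrame({
    "years_to_event": [1.2, 3.4, 0.8, 5.1, 2.2, 4.0, 0.5, 6.3],
    "died":           [1,   0,   1,   0,   0,   1,   1,   0],
    "hiv_positive":   [1,   0,   1,   0,   1,   0,   1,   0],
    "age_at_rekt":    [45,  52,  38,  60,  41,  55,  47,  50],
})

# Fit the Cox model; exp(coef) for hiv_positive is the adjusted
# hazard ratio for death by HIV status.
cph = CoxPHFitter().fit(df, duration_col="years_to_event", event_col="died")
print(cph.hazard_ratios_)
```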

2.
For some patient subgroups, human immunodeficiency virus (HIV) infection has been associated with worse outcomes after kidney transplantation (KT); potentially modifiable factors may be responsible. The study goal was to identify factors that predict a higher risk of graft loss among HIV-positive KT recipients compared with a similar transplant among HIV-negative recipients. In this study, 82 762 deceased donor KT recipients (HIV positive: 526; HIV negative: 82 236) reported to the Scientific Registry of Transplant Recipients (SRTR) (2001–2013) were studied by interaction term analysis. Compared to HIV-negative recipients, hepatitis C virus (HCV) coinfection amplified the risk of graft loss 2.72-fold among HIV-positive KT recipients (adjusted hazard ratio [aHR]: 2.72, 95% confidence interval [CI]: 1.75–4.22, p < 0.001). Forty-three percent of the excess risk was attributable to the interaction between HIV and HCV (attributable proportion of risk due to the interaction [AP]: 0.43, 95% CI: 0.23–0.63, p = 0.02). Among HIV-positive recipients with more than three HLA mismatches (MMs), risk was amplified 1.80-fold compared to HIV-negative recipients (aHR: 1.80, 95% CI: 1.31–2.47, p < 0.001); 42% of the excess risk was attributable to the interaction between HIV and more than three HLA MMs (AP: 0.42, 95% CI: 0.24–0.60, p = 0.01). High-HIV-risk recipients (HIV-positive/HCV-positive with more than three HLA MMs) had a 3.86-fold increased risk compared to low-HIV-risk recipients (HIV-positive/HCV-negative with three or fewer HLA MMs) (aHR: 3.86, 95% CI: 2.37–6.30, p < 0.001). Avoidance of more than three HLA MMs in HIV-positive KT recipients, particularly among coinfected patients, may mitigate the increased risk of graft loss associated with HIV infection.
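The attributable proportion due to interaction (AP) reported above follows the standard additive-interaction formula AP = RERI / HR11, where RERI = HR11 − HR10 − HR01 + 1 and HR11 is the hazard ratio for joint exposure. A minimal sketch; the two single-exposure hazard ratios below are made up solely to illustrate the arithmetic:

```python
def attributable_proportion(hr_both, hr_a_only, hr_b_only):
    """AP = RERI / HR11, with RERI = HR11 - HR10 - HR01 + 1
    (relative excess risk due to interaction)."""
    reri = hr_both - hr_a_only - hr_b_only + 1.0
    return reri / hr_both

# Joint HIV+/HCV+ hazard ratio from the abstract (2.72); the
# single-exposure values 1.20 and 1.35 are hypothetical.
print(round(attributable_proportion(2.72, 1.20, 1.35), 2))  # 0.43
```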

3.
Excellent outcomes among HIV+ kidney transplant (KT) recipients have been reported by the NIH consortium, but it is unclear whether experience with HIV+ KT is required to achieve these outcomes. We studied associations between experience measures and outcomes in 499 HIV+ recipients (SRTR data 2004–2011). Experience measures examined included: (1) center-level participation in the NIH consortium; (2) the KT experiential learning curve; and (3) transplant era (2004–2007 vs. 2008–2011). There was no difference in outcomes between centers early in their experience (first 5 HIV+ KTs) and centers that had performed >6 HIV+ KTs (graft survival [GS] adjusted hazard ratio [aHR]: 1.05, 95% CI: 0.68–1.61, p = 0.82; patient survival [PS] aHR: 0.93, 95% CI: 0.56–1.53, p = 0.76), and participation in the NIH study was not associated with better outcomes (GS aHR: 1.08, 95% CI: 0.71–1.65, p = 0.71; PS aHR: 1.13, 95% CI: 0.68–1.89, p = 0.63). Transplant era was strongly associated with outcomes; HIV+ KTs performed in 2008–2011 had a 38% lower risk of graft loss (aHR: 0.62; 95% CI: 0.42–0.92, p = 0.02) and a 41% lower risk of death (aHR: 0.59; 95% CI: 0.39–0.90, p = 0.01) than those performed in 2004–2007. Outcomes after HIV+ KT have improved over time, but center-level experience and consortium participation are not necessary to achieve excellent outcomes, supporting continued expansion of HIV+ KT in the US.

4.
Optimization of maintenance immunosuppression (mIS) regimens in the transplant recipient requires a balance between sufficient potency to prevent rejection and avoidance of excessive immunosuppression to prevent toxicities and complications. The optimal regimen after simultaneous liver-kidney (SLK) transplantation remains unclear, but small single-center reports have shown success with steroid-sparing regimens. Using the Scientific Registry of Transplant Recipients, we studied 4184 adult SLK recipients, transplanted from March 1, 2002, to February 28, 2017, who were on tacrolimus-based regimens at 1 year post-transplant. We determined the association between mIS regimen and mortality and graft failure using Cox proportional hazard models. The use of steroid-sparing regimens increased post-transplant, from 16.1% at discharge to 88.0% at 5 years. Using multilevel logistic regression modeling, we found center-level variation to be the major contributor to choice of mIS regimen (ICC 44.5%; 95% CI: 36.2%-53.0%). In multivariate analysis, use of a steroid-sparing regimen at 1 year was associated with a 21% decreased risk of mortality compared to steroid-containing regimens (aHR 0.79, P = .01) and a 20% decreased risk of liver graft failure (aHR 0.80, P = .01), without differences in kidney graft loss risk (aHR 0.92, P = .6). Among SLK recipients, the use of a steroid-sparing regimen appears to be safe and effective, without adverse effects on patient or graft survival.
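The reported ICC of 44.5% quantifies how much of the variation in regimen choice sits at the center level. For a random-intercept logistic model, the ICC is commonly computed on the latent scale as σ²_center / (σ²_center + π²/3). A minimal sketch, with a hypothetical center-level variance chosen only so the result lands near the reported estimate:

```python
import math

def logistic_icc(var_center):
    """Latent-scale ICC for a random-intercept logistic model:
    var_center / (var_center + pi^2 / 3)."""
    return var_center / (var_center + math.pi ** 2 / 3)

# Hypothetical center-level intercept variance (not a fitted value).
print(round(logistic_icc(2.64), 3))  # ≈ 0.445
```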

5.
Direct-acting antiviral medications (DAAs) have revolutionized care for hepatitis C positive (HCV+) liver (LT) and kidney (KT) transplant recipients. Scientific Registry of Transplant Recipients registry data were integrated with national pharmaceutical claims (2007-2016) to identify HCV treatments before January 2014 (pre-DAA) and after (post-DAA), stratified by donor (D) and recipient (R) serostatus and payer. Pre-DAA, 18% of HCV+ LT recipients were treated within 3 years, with no differences by donor serostatus or payer. Post-DAA, only 6% of D−/R+ recipients, 19.8% of D+/R+ recipients with public insurance, and 11.3% with private insurance were treated within 3 years (P < .0001). LT recipients treated for HCV pre-DAA experienced higher rates of graft loss (adjusted hazard ratio [aHR] 1.85, 95% CI 1.34-2.10, P < .0001) and death (aHR 1.68, 95% CI 1.47-1.91, P < .0001). Post-DAA, HCV treatment was not associated with death (aHR 0.67, 95% CI 0.34-1.32, P = .25) or graft failure (aHR 0.64, 95% CI 0.32-1.26, P = .20) in D+/R+ LT recipients. Treatment increased in D+/R+ KT recipients (5.5% pre-DAA vs 12.9% post-DAA), but did not differ by payer status. DAAs reduced the risk of death after D+/R+ KT by 57% (aHR 0.43, 95% CI 0.19-0.95, P = .04) and graft loss by 46% (aHR 0.54, 95% CI 0.27-1.07, P = .08). HCV treatment with DAAs appears to improve HCV+ LT and KT outcomes; however, access to these medications appears limited in both LT and KT recipients.

6.
Increased risk donors (IRDs) may inadvertently transmit blood-borne viruses to organ recipients through transplant. Rates of IRD kidney transplants in children and the associated outcomes are unknown. We used the Scientific Registry of Transplant Recipients to identify pediatric deceased donor kidney transplants that were performed in the United States between January 1, 2005 and December 31, 2015. We used Cox regression analysis to compare patient and graft survival between IRD and non-IRD recipients, and a sequential Cox approach to evaluate survival benefit after IRD transplants compared with remaining on the waitlist and never accepting an IRD kidney. We studied 328 recipients with and 4850 without IRD transplants. The annual IRD transplant rates ranged from 3.4% to 13.2%. IRDs were more likely to be male (P = .04), black (P < .001), and to have died from head trauma (P = .006). IRD recipients had a higher mean cPRA (0.085 vs 0.065, P = .02). After multivariate adjustment, patient survival after IRD transplants was significantly higher compared with remaining on the waitlist (adjusted hazard ratio [aHR]: 0.48, 95% CI: 0.26-0.88, P = .018); however, patient survival (aHR: 0.93, 95% CI: 0.54-1.59, P = .79) and graft survival (aHR: 0.89, 95% CI: 0.70-1.13, P = .32) were similar between IRD and non-IRD recipients. We recommend that IRD kidneys be considered for transplant in children.

7.
Outcomes of old-donor simultaneous pancreas–kidney transplantation (SPKT) have not been thoroughly studied. Scientific Registry of Transplant Recipients data reported for SPKT candidates receiving dialysis wait-listed between 1993 and 2008 (n = 7937) were analyzed for outcomes among those who remained listed (n = 3301) and among SPKT recipients (n = 4636) using multivariable time-dependent regression models. Recipients were stratified by donor/recipient age (cutoff 40 years) into: young-to-young (n = 2099), young-to-old (n = 1873), old-to-young (n = 293), and old-to-old (n = 371). Overall mortality was 12%, 14%, 20%, and 24%, respectively, for those transplanted, and 50% for those remaining on the waiting list. On multivariable analysis, old-donor SPKT was associated with significantly higher overall risks of patient death and of death-censored pancreas and kidney graft failure in both young (73%, 53%, and 63% increased risk, respectively) and old (91%, 124%, and 85% increased risk, respectively) recipients. The adjusted relative mortality risk was similar for recipients of old-donor SPKT compared with wait-listed patients, including those who subsequently received young-donor transplants (aHR 0.95; 95% CI 0.78–1.12), except for candidates in organ procurement organizations (OPOs) with waiting times ≥604 days (aHR 0.65, 95% CI 0.45–0.94). Old-donor SPKT results in significantly worse graft survival and patient mortality without any waiting-time benefit as compared to young-donor SPKT, except for candidates with expected long waiting times.

8.
Prediction models for post-kidney transplantation (KT) mortality have had limited success (C-statistics ≤0.70). Adding objective measures of potentially modifiable factors may improve prediction and, consequently, KT survival through intervention. The Short Physical Performance Battery (SPPB) is an easily administered objective test of lower extremity function consisting of three parts (balance, walking speed, chair stands), each scored 0–4, for a composite score of 0–12, with higher scores indicating better function. SPPB performance and frailty (Fried frailty phenotype) were assessed at admission for KT in a prospective cohort of 719 KT recipients at Johns Hopkins Hospital (8/2009 to 6/2016) and University of Michigan (2/2013 to 12/2016). The independent associations of SPPB impairment (composite score ≤10) and composite score with post-KT mortality were tested using adjusted competing risks models treating graft failure as a competing risk. The 5-year posttransplantation mortality for impaired recipients was 20.6% compared to 4.5% for unimpaired recipients (p < 0.001). Impaired recipients had a 2.30-fold (adjusted hazard ratio [aHR] 2.30, 95% confidence interval [CI] 1.12–4.74, p = 0.02) increased risk of post-KT mortality compared to unimpaired recipients. Each one-point decrease in SPPB score was independently associated with a 1.19-fold (95% CI 1.09–1.30, p < 0.001) higher risk of post-KT mortality. SPPB-derived lower extremity function is a potentially highly useful and modifiable objective measure for pre-KT risk prediction.
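The SPPB composite and impairment cutoff described above are straightforward to operationalize. A minimal sketch using the scoring rules given in the abstract (three components scored 0–4, impairment defined as a composite ≤10):

```python
def sppb_composite(balance, walking_speed, chair_stands):
    """Sum of three 0-4 component scores; composite ranges 0-12,
    with higher scores indicating better lower extremity function."""
    for score in (balance, walking_speed, chair_stands):
        if not 0 <= score <= 4:
            raise ValueError("each SPPB component is scored 0-4")
    return balance + walking_speed + chair_stands

def is_impaired(composite, cutoff=10):
    """Study definition of SPPB impairment: composite score <= 10."""
    return composite <= cutoff

score = sppb_composite(4, 3, 2)
print(score, is_impaired(score))  # 9 True
```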

9.
Introduction: The number of HIV-infected children and adolescents requiring second-line antiretroviral treatment (ART) is increasing in low- and middle-income countries (LMIC). However, the effectiveness of paediatric second-line ART and potential risk factors for virologic failure are poorly characterized. We performed an aggregate analysis of second-line ART outcomes for children and assessed the need for paediatric third-line ART. Methods: We performed a multicentre analysis by systematically reviewing the literature to identify cohorts of children and adolescents receiving second-line ART in LMIC, contacting the corresponding study groups, and including patient-level data on virologic and clinical outcomes. Kaplan–Meier survival estimates and Cox proportional hazard models were used to describe cumulative rates and predictors of virologic failure. Virologic failure was defined as two consecutive viral load measurements >1000 copies/ml after at least six months of second-line treatment. Results: We included 12 cohorts representing 928 children on second-line protease inhibitor (PI)-based ART in 14 countries in Asia and sub-Saharan Africa. After 24 months, 16.4% (95% confidence interval (CI): 13.9–19.4) of children experienced virologic failure. Adolescents (10–18 years) had failure rates of 14.5 (95% CI 11.9–17.6) per 100 person-years compared to 4.5 (95% CI 3.4–5.8) for younger children (3–9 years). Risk factors for virologic failure were adolescence (adjusted hazard ratio [aHR] 3.93, p < 0.001) and short duration of first-line ART before treatment switch (aHR 0.64 and 0.53, p = 0.008, for 24–48 months and >48 months of first-line ART, respectively, compared to <24 months). Conclusions: In LMIC, paediatric PI-based second-line ART was associated with relatively low virologic failure rates. However, adolescents showed exceptionally poor virologic outcomes in LMIC, and optimizing their HIV care requires urgent attention. In addition, 16% of children and adolescents failed PI-based treatment and will require integrase inhibitors to construct salvage regimens. These drugs are currently not available in LMIC.
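The failure definition above (two consecutive viral loads >1000 copies/ml, after at least six months of second-line ART) translates directly into a scanning rule. A minimal sketch with hypothetical measurements:

```python
def virologic_failure(measurements, threshold=1000, min_months=6):
    """measurements: iterable of (months_on_second_line_art, viral_load).
    Returns True if two consecutive loads taken after `min_months`
    months both exceed `threshold` copies/ml (the study definition)."""
    eligible = [vl for months, vl in sorted(measurements)
                if months >= min_months]
    return any(v1 > threshold and v2 > threshold
               for v1, v2 in zip(eligible, eligible[1:]))

# Hypothetical child: suppressed at month 3, then two loads >1000.
print(virologic_failure([(3, 400), (7, 1500), (10, 2200)]))  # True
```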

10.
Although neutropenia is a common complication after lung transplant, its relationship with recipient outcomes remains understudied. We evaluated a retrospective cohort of 228 adult lung transplant recipients between 2008 and 2013 to assess the association of neutropenia and granulocyte colony-stimulating factor (GCSF) treatment with outcomes. Neutropenia was categorized as mild (absolute neutrophil count 1000-1499), moderate (500-999), or severe (<500) and as a time-varying continuous variable. Associations with survival, acute rejection, and chronic lung allograft dysfunction (CLAD) were assessed with the use of Cox proportional hazards regression. The impact of GCSF therapy on survival, CLAD, and acute rejection development was analyzed by propensity score matching. Of 228 patients, 101 (42.1%) developed neutropenia. Recipients with severe neutropenia had higher mortality rates than recipients with no (adjusted hazard ratio [aHR] 2.97, 95% confidence interval [CI] 1.05-8.41, P = .040), mild (aHR 4.58, 95% CI 1.58-13.34, P = .018), or moderate (aHR 3.27, 95% CI 0.89-12.01, P = .074) neutropenia. Surprisingly, GCSF treatment was associated with a higher risk for CLAD in mildly neutropenic patients (aHR 3.49, 95% CI 0.93-13.04, P = .063), although it did decrease death risk in severely neutropenic patients (aHR 0.24, 95% CI 0.07-0.88, P = .031). Taken together, our data point to an important relationship between neutropenia severity and GCSF treatment in lung transplant outcomes.
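The GCSF analysis above relies on propensity score matching. A minimal, greedy 1:1 nearest-neighbor matching sketch on synthetic data (not the study cohort or its covariate set), using scikit-learn for the propensity model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic placeholders: X = baseline covariates, t = GCSF exposure.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
t = rng.integers(0, 2, size=200)

# Propensity score: estimated P(treated | covariates).
ps = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbor matching without replacement.
treated = np.where(t == 1)[0]
available = set(np.where(t == 0)[0])
pairs = []
for i in treated:
    if not available:
        break
    j = min(available, key=lambda c: abs(ps[c] - ps[i]))
    pairs.append((i, j))
    available.remove(j)

print(len(pairs), "matched pairs")
```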

11.
Frailty, a measure of physiologic reserve, is associated with poor outcomes and mortality among kidney transplant (KT) candidates and recipients. There are no national estimates of frailty in this population, which may help patient counseling and resource allocation at transplant centers. We studied 4616 KT candidates and 1763 recipients with Fried frailty phenotype measurements in our multicenter prospective cohort (2008-2018). Using Scientific Registry of Transplant Recipients (SRTR) data (KT candidates = 560 143 and recipients = 243 508), we projected the national prevalence of frailty (for KT candidates and recipients separately) using standardization through inverse probability weighting, accounting for candidate/recipient, donor, and transplant factors. In our multicenter cohort, 13.3% of KT candidates were frail at evaluation; 8.2% of LDKT recipients and 17.8% of DDKT recipients were frail at transplantation. Projected nationally, our modeling strategy estimated that 91 738 KT candidates, or 16.4% (95% confidence interval [CI] 14.4%-18.4%) of all KT candidates during the study period, were frail, and that 34 822 KT recipients, or 14.3% (95% CI 12.3%-16.3%) of all KT recipients, were frail (LDKT = 8.2%; DDKT = 17.8%). Given the estimated national prevalence of frailty, transplant programs should consider assessing the condition during KT evaluation to improve patient counseling and resource allocation and to identify recipients at risk for poor outcomes.
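The national projection standardizes the multicenter cohort to the registry population via inverse probability weighting: each cohort subject is weighted by the inverse of their estimated probability of cohort membership given covariates, and the weighted frailty prevalence is taken. A minimal sketch with hypothetical probabilities (in practice these would come from a cohort-vs-registry membership model):

```python
import numpy as np

# Hypothetical frailty indicators for ten cohort subjects and their
# estimated probabilities of appearing in the cohort given covariates.
frail = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])
p_cohort = np.array([0.8, 0.5, 0.6, 0.7, 0.4, 0.5, 0.6, 0.9, 0.5, 0.4])

# Inverse probability weights up-weight subjects underrepresented in
# the cohort relative to the national registry.
weights = 1.0 / p_cohort
print(round(np.average(frail, weights=weights), 3))
```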

12.
Primary central nervous system lymphoma (PCNSL) risk is greatly increased in immunosuppressed human immunodeficiency virus–infected people. Using data from the US transplant registry linked with 17 cancer registries (1987-2014), we studied PCNSL and systemic non-Hodgkin lymphoma (NHL) in 288 029 solid organ transplant recipients. Transplant recipients had elevated incidence of PCNSL compared with the general population (standardized incidence ratio [SIR] = 65.1; N = 168), and this elevation was stronger than for systemic NHL (SIR = 11.5; N = 2043). Compared to kidney recipients, PCNSL incidence was lower in liver recipients (adjusted incidence rate ratio [aIRR] = 0.52), similar in heart and/or lung recipients, and higher in other/multiple organ recipients (aIRR = 2.45). PCNSL incidence was higher in Asians/Pacific Islanders than non-Hispanic whites (aIRR = 2.09); after induction immunosuppression with alemtuzumab (aIRR = 3.12), monoclonal antibodies (aIRR = 1.83), or polyclonal antibodies (aIRR = 2.03); in recipients who were Epstein-Barr virus–seronegative at the time of transplant and at risk of primary infection (aIRR = 1.95); and within the first 1.5 years after transplant. Compared to other recipients, those with PCNSL had increased risk of death (adjusted hazard ratio [aHR] = 11.79) or graft failure/retransplantation (aHR = 3.24). Recipients with PCNSL also had higher mortality than those with systemic NHL (aHR = 1.48). In conclusion, PCNSL risk is highly elevated among transplant recipients, and it carries a poor prognosis.
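A standardized incidence ratio (SIR) such as the 65.1 reported for PCNSL is the number of observed cases divided by the number expected if general-population rates applied to the cohort's person-time. A minimal sketch; the person-years and background rates below are hypothetical:

```python
def standardized_incidence_ratio(observed, person_years_by_stratum,
                                 background_rates):
    """SIR = observed / expected, where expected sums each stratum's
    general-population rate times the cohort person-years in it."""
    expected = sum(py * rate for py, rate in
                   zip(person_years_by_stratum, background_rates))
    return observed / expected

# 168 observed PCNSL cases (from the abstract); two hypothetical
# strata with made-up person-years and background rates.
print(round(standardized_incidence_ratio(168, [500_000, 250_000],
                                         [2e-6, 6e-6]), 1))  # ≈ 67.2
```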

13.
Alemtuzumab (AZ) induction in hepatitis C-seropositive (HCV+) kidney transplant (KTX) recipients may negatively affect patient survival; however, available information is scant. Using US registry data from 2003 to 2010 on adult HCV+ deceased-donor KTXs (n = 4910), we examined outcomes by induction agent: AZ (n = 294), other T cell-depleting agents (n = 2033; T cell), IL-2 receptor blockade (n = 1135; IL-2RAb), and no induction (n = 1448). On multivariate analysis, induction therapy was associated with significantly better overall patient survival with AZ (adjusted hazard ratio [aHR] 0.64, 95% confidence interval [CI] 0.45, 0.92), T cell (aHR 0.52, 95% CI 0.41, 0.65), or IL-2RAb (aHR 0.67, 95% CI 0.53, 0.87), compared to no induction. A significant protective effect was also seen with AZ (aHR 0.63, 95% CI 0.40, 0.99), T cell (aHR 0.62, 95% CI 0.49, 0.78), and IL-2RAb (aHR 0.62, 95% CI 0.47, 0.82) in terms of death-censored graft survival relative to no induction. There were 88 HIV+/HCV+ coinfected recipients. Compared to noninduction, any induction (i.e., the three induction groups combined) was associated with similar overall patient survival (P = 0.2255) on univariate analysis. Induction therapy with AZ, other T cell-depleting agents, or IL-2RAb in HCV+ KTX is associated with better patient and death-censored graft survival compared to noninduction. In HCV+/HIV+ coinfected patients, induction is not contraindicated.

14.
A recent study reported that kidney transplant recipients of offspring living donors had higher graft loss and mortality. This seemed counterintuitive, given the excellent HLA matching and younger age of offspring donors; we were concerned about residual confounding and other study design issues. We used Scientific Registry of Transplant Recipients data 2001-2016 to evaluate death-censored graft failure (DCGF) and mortality for recipients of offspring versus nonoffspring living donor kidneys, using Cox regression models with interaction terms. Recipients of offspring kidneys had lower DCGF than recipients of nonoffspring kidneys (15-year cumulative incidence 21.2% vs 26.1%, P < .001). This association remained after adjustment for recipient and transplant factors (adjusted hazard ratio [aHR] 0.77, 95% CI 0.73-0.82, P < .001), and was attenuated among African American donors (aHR 0.85, 95% CI 0.77-0.95; interaction: P = .01) and female recipients (aHR 0.84, 95% CI 0.77-0.91, P < .001). Although offspring kidney recipients had higher mortality (15-year mortality 56.4% vs 37.2%, P < .001), this difference largely disappeared with adjustment for recipient age alone (aHR 1.06, 95% CI 1.02-1.10, P = .002) and was nonsignificant after further adjustment for other recipient characteristics (aHR 0.97, 95% CI 0.93-1.01, P = .1). Kidneys from offspring donors were associated with lower graft failure and comparable mortality. An otherwise eligible donor should not be dismissed because they are the offspring of the recipient, and we encourage continued individualized counseling for potential donors.

15.
Although recipient body mass index (BMI) and age are known risk factors for mortality after heart transplantation, how they interact to influence survival is unknown. Our study utilized the UNOS registry from 1997 to 2012 to define the interaction between BMI and age and its impact on survival after heart transplantation. Recipients were stratified by BMI: underweight (<18.5), normal weight (18.5-24.99), overweight (25-29.99), and either moderate (30-34.99), severe (35-39.99), or very severe (≥40) obesity. Recipients were secondarily stratified by age: 18-40 (younger recipients), 40-65 (reference group), and ≥65 (advanced age recipients). Among younger recipients, being underweight was associated with improved adjusted survival (HR 0.902; p = 0.010), while higher mortality was seen in younger overweight recipients (HR 1.260; p = 0.005). However, no differences in adjusted survival were seen for underweight or overweight advanced age recipients. Obesity (BMI ≥30) was associated with increased adjusted mortality in reference-age (40-65) recipients (HR 1.152; p = 0.021), and even more so in younger (HR 1.576; p < 0.001) and advanced age recipients (HR 1.292; p = 0.001). These results demonstrate that BMI and age interact to affect survival, as age modifies the BMI–mortality curves, particularly among younger and advanced age recipients.
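The BMI strata above map directly to numeric cutoffs. A minimal classification sketch using the study's categories:

```python
def bmi_category(bmi):
    """Study strata: underweight, normal weight, overweight, then
    moderate, severe, or very severe obesity."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal weight"
    if bmi < 30:
        return "overweight"
    if bmi < 35:
        return "moderate obesity"
    if bmi < 40:
        return "severe obesity"
    return "very severe obesity"

print(bmi_category(32.4))  # moderate obesity
```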

16.
Steatotic donor livers (SDLs) (macrosteatosis ≥30%) represent a possible donor pool expansion, but are frequently discarded due to a historical association with mortality and graft loss. However, changes in recipient/donor demographics, allocation policy, and clinical protocols might have altered utilization and outcomes of SDLs. We used Scientific Registry of Transplant Recipients data from 2005 to 2017 and adjusted multilevel regression to quantify temporal trends in discard rates (logistic) and posttransplant outcomes (Cox) of SDLs, accounting for Organ Procurement Organization–level variation. Of 4346 recovered SDLs, 58.0% were discarded in 2005, versus only 43.1% in 2017 (P < .001). SDLs were always substantially more likely to be discarded than non-SDLs, although this difference decreased over time (adjusted odds ratio in 2005-2007: 15.28, 95% CI 13.15-17.74; 2008-2011: 13.41, 95% CI 11.77-15.29; 2012-2014: 11.37, 95% CI 9.87-13.10; 2015-2017: 8.89, 95% CI 7.79-10.15; P < .001 for all). Conversely, posttransplant outcomes of recipients of SDLs improved over time: recipients of SDLs from 2012 to 2017 had a 46% lower risk of mortality (adjusted hazard ratio [aHR]: 0.54, 95% CI 0.43-0.68, P < .001) and a 47% lower risk of graft loss (aHR: 0.53, 95% CI 0.42-0.67, P < .001) compared to 2005 to 2011. In fact, in 2012 to 2017, recipients of SDLs had mortality (aHR: 1.04, 95% CI 0.90-1.21, P = .6) and graft loss (aHR: 1.04, 95% CI 0.90-1.20, P = .6) equivalent to recipients of non-SDLs. Increasing utilization of SDLs might be a reasonable strategy to expand the donor pool.

17.
Kidney paired donation (KPD) is an important tool to facilitate living donor kidney transplantation (LDKT). Concerns remain over prolonged cold ischemia times (CIT) associated with shipping kidneys long distances through KPD. We examined the association between CIT and delayed graft function (DGF), allograft survival, and patient survival for 1267 shipped and 205 nonshipped/internal KPD LDKTs facilitated by the National Kidney Registry in the United States from 2008 to 2015, compared to 4800 unrelated, nonshipped, non-KPD LDKTs. Shipped KPD recipients had a median CIT of 9.3 hours (range 0.25-23.9 hours), compared to 1.0 hour for internal KPD transplants and 0.93 hours for non-KPD LDKTs. Each hour of CIT was associated with 5% increased odds of DGF (adjusted odds ratio: 1.05, 95% confidence interval [CI]: 1.02-1.09, P < .01). However, there was no significant association between CIT and all-cause graft failure (adjusted hazard ratio [aHR]: 1.01, 95% CI: 0.98-1.04, P = .4), death-censored graft failure (aHR: 1.02, 95% CI: 0.98-1.06, P = .4), or mortality (aHR: 1.00, 95% CI: 0.96-1.04, P > .9). This study of KPD-facilitated LDKTs found no evidence that long CIT is a concern for reduced graft or patient survival. Studies with longer follow-up are needed to refine our understanding of the safety of shipping donor kidneys through KPD.
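A per-hour odds ratio compounds multiplicatively with hours of CIT, so the reported 1.05 per hour implies roughly 1.05^9.3 ≈ 1.57-fold higher odds of DGF at the shipped-kidney median of 9.3 hours than at zero hours, all else equal. A minimal arithmetic check:

```python
# Compound a per-hour odds ratio over the median shipped CIT.
or_per_hour = 1.05   # adjusted OR for DGF per hour of CIT (abstract)
median_cit = 9.3     # median CIT in hours for shipped KPD kidneys
print(round(or_per_hour ** median_cit, 2))  # ≈ 1.57
```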

18.
Transplant candidates who accept a kidney labeled increased risk for disease transmission (IRD) accept a low risk of window period infection, yet those who decline must wait for another offer that might harbor other risks or never come at all. To characterize the survival benefit of accepting IRD kidneys, we used 2010-2014 Scientific Registry of Transplant Recipients data to identify 104 998 adult transplant candidates who were offered IRD kidneys that were eventually accepted by someone; the median (interquartile range) Kidney Donor Profile Index (KDPI) of these kidneys was 30 (16-49). We followed patients from the offer decision until death or end of study. After 5 years, only 31.0% of candidates who declined IRDs had received non-IRD deceased donor kidney transplants; the median KDPI of these non-IRD kidneys was 52, compared to 21 for the IRDs they had declined. After a brief risk period in the first 30 days following IRD acceptance (adjusted hazard ratio [aHR] accept vs decline: 2.06, 95% CI 1.22-3.49, P = .008; absolute mortality 0.8% vs. 0.4%), those who accepted IRDs were at 33% lower risk of death 1-6 months postdecision (aHR 0.67, 95% CI 0.50-0.90, P = .006), and at 48% lower risk of death beyond 6 months postdecision (aHR 0.52, 95% CI 0.46-0.58, P < .001). Accepting an IRD kidney was associated with substantial long-term survival benefit; providers should consider this benefit when counseling patients on IRD offer acceptance.

19.
Transplant eligibility for tobacco- and/or marijuana-using candidates varies among transplant centers. This study compared the impact of marijuana use and tobacco use on kidney transplant recipient outcomes. Kidney transplant recipients at a single center from 2001 to 2015 were reviewed for outcomes of all-cause graft loss, infection, biopsy-proven acute rejection, and estimated glomerular filtration rate across four groups: marijuana-only users, marijuana and tobacco users, tobacco-only users, and nonusers. The cohort (N = 919) included 48 (5.2%) marijuana-only users, 45 (4.8%) marijuana and tobacco users, 136 (14.7%) tobacco-only users, and 690 (75.1%) nonusers. Smoking status was not significantly associated with acute rejection, estimated glomerular filtration rate, or pneumonia within one year post-transplant in an adjusted model. Compared to nonuse, combined marijuana and tobacco use and tobacco-only use were significantly associated with increased risk of graft loss (aHR 1.68, P = .034 and aHR 1.52, P = .006, respectively). Patients with isolated marijuana use had overall graft survival similar to nonusers (aHR 1.00, P = .994). Marijuana use should not be an absolute contraindication to kidney transplant.

20.
The replication kinetics of nonpathogenic anelloviruses belonging to the Alphatorquevirus genus (such as torque teno virus) might reflect the overall state of posttransplant immunosuppression. We analyzed 221 kidney transplant (KT) recipients in whom plasma alphatorquevirus DNA load was quantified by real-time polymerase chain reaction at baseline and regularly through the first 12 posttransplant months. Study outcomes included posttransplant infection and a composite of opportunistic infection and/or de novo malignancy (immunosuppression-related adverse event [iRAE]). Alphatorquevirus DNA loads at month 1 were higher among patients who subsequently developed posttransplant infection (P = .023) or iRAE (P = .009). Likewise, those with iRAE beyond months 3 and 6 also exhibited higher peak viral loads over the preceding periods. Areas under the curve for log10 alphatorquevirus DNAemia estimated by months 1 or 6 were significantly higher in patients experiencing study outcomes. Alphatorquevirus DNA loads above 3.15 and 4.56 log10 copies/mL at month 1 predicted the occurrence of posttransplant infection (adjusted hazard ratio [aHR]: 2.88; 95% confidence interval [CI]: 1.13-7.36; P = .027) and iRAE (aHR: 5.17; 95% CI: 2.01-13.33; P = .001), respectively. In conclusion, posttransplant monitoring of plasma alphatorquevirus DNA kinetics may be useful to identify KT recipients at increased risk of immunosuppression-related complications.
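The month-1 cutoffs are given on the log10 scale: 3.15 and 4.56 log10 copies/mL correspond to roughly 1400 and 36 000 copies/mL (10^3.15 ≈ 1413; 10^4.56 ≈ 36 308). A minimal conversion sketch:

```python
import math

def to_log10_copies(copies_per_ml):
    """Convert a raw qPCR load to log10 copies/mL, the scale of the
    study's month-1 cutoffs (3.15 for infection, 4.56 for iRAE)."""
    return math.log10(copies_per_ml)

# A hypothetical load of 40,000 copies/mL exceeds the iRAE cutoff.
print(to_log10_copies(40_000) > 4.56)  # True
```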
