Similar Documents
20 similar documents retrieved.
1.
To implement split liver transplantation (SLT), a mandatory-split policy has been in force in Italy since August 2015: donors aged 18-50 years at standard risk are offered for SLT, yielding a left-lateral segment (LLS) graft for children and an extended-right graft (ERG) for adults. We aimed to analyze the impact of the new mandatory-split policy on the liver transplantation (LT) waiting list and on SLT outcomes, compared with the old allocation policy. Between August 2015 and December 2016, of 413 potentially "splittable" donors, 252 (61%) were proposed for SLT; 53 (21%) of these donors were accepted for SLT, whereas 101 (40.1%) were excluded because of donor characteristics and 98 (38.9%) for absence of suitable pediatric recipients. The SLT rate increased from 6% to 8.4%. The proportion of children undergoing SLT rose from 49.3% to 65.8% (P = .009), and pediatric LT waiting time fell (229 [10-2121] vs 80 [12-2503] days, P = .045). Pediatric (4.5% vs 2.5%, P = .398) and adult (9.7% vs 5.2%, P < .001) LT waiting-list mortality decreased, and SLT outcomes remained stable. Retransplantation (HR = 2.641, P = .035) and recipient weight >20 kg (HR = 5.113, P = .048) in LLS grafts, and ischemic time >8 hours (HR = 2.475, P = .048) in ERGs, were identified as predictors of graft failure. A national mandatory-split policy maximizes the SLT donor pool, whose selection criteria can be safely expanded, with a favorable impact on the pediatric LT waiting list and priority preserved for sick adult LT candidates.
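The pediatric waiting-time comparison above (median 229 vs. 80 days, P = .045) is the kind of skewed-duration contrast typically tested with a nonparametric rank test. A minimal sketch of such a comparison, using invented waiting-time vectors rather than the Italian registry data, and assuming a Mann-Whitney U test (the abstract does not name the method):

```python
# Illustrative only: compares two skewed waiting-time distributions with a
# Mann-Whitney U test, as is typical for median waiting-list times.
# The numbers below are invented placeholders, not the Italian registry data.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
days_old_policy = rng.exponential(scale=250, size=80)   # hypothetical pre-policy waits
days_new_policy = rng.exponential(scale=120, size=80)   # hypothetical post-policy waits

stat, p_value = mannwhitneyu(days_old_policy, days_new_policy, alternative="two-sided")
print(f"median old = {np.median(days_old_policy):.0f} d, "
      f"median new = {np.median(days_new_policy):.0f} d, p = {p_value:.4f}")
```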

2.
The United States opioid use epidemic over the past decade has coincided with an increase in hepatitis C virus (HCV) positive donors. Using propensity score matching and Organ Procurement and Transplantation Network data files from January 2015 to June 2019, we analyzed the short-term outcomes of adult deceased donor kidney transplants into HCV-uninfected recipients from two distinct groups of HCV-positive donors (HCV seropositive, nonviremic, n = 352; viremic, n = 196) compared with transplants from HCV-uninfected donors (n = 36 934). Compared with the reference group, transplants from HCV seropositive, nonviremic donors and from HCV viremic donors had a lower proportion of delayed graft function (35.2% vs 18.9%, P < .001, and 36.2% vs 16.8%, P < .001, respectively). Recipients of kidneys from HCV viremic donors had better allograft function at 6 months posttransplant (eGFR 54.1 vs 68.3 mL/min/1.73 m2, P = .004). Furthermore, there was no statistically significant difference in overall graft failure risk at 12 months posttransplant in propensity score-matched multivariable Cox proportional hazards analysis (HR = 0.60, 95% CI 0.23 to 1.29 for HCV seropositive, nonviremic donors; HR = 0.85, 95% CI 0.25 to 2.96 for HCV viremic donors). Further studies are required to determine the long-term outcomes of these transplants and to address unanswered questions regarding the use of HCV viremic donors.
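The design described here, propensity score matching of HCV-positive-donor kidney transplants to HCV-negative-donor controls followed by Cox regression, can be sketched roughly as below. The column names, the greedy 1:1 nearest-neighbour matching with replacement, and the covariate list are illustrative assumptions, not the authors' actual OPTN analysis pipeline.

```python
# Sketch of 1:1 propensity-score matching followed by Cox regression.
# Column names (donor_age, kdpi, ...) are hypothetical; this is not the OPTN code.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

def match_and_fit(df, covariates, exposure="hcv_donor",
                  time="months_to_failure", event="graft_failure"):
    # 1) Propensity score: probability of receiving an HCV-positive donor kidney
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df[exposure])
    df = df.assign(ps=ps_model.predict_proba(df[covariates])[:, 1])

    # 2) Greedy 1:1 nearest-neighbour match on the score (controls may be reused)
    treated, control = df[df[exposure] == 1], df[df[exposure] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    matched = pd.concat([treated, control.iloc[idx.ravel()]])

    # 3) Cox proportional hazards model on the matched cohort
    cph = CoxPHFitter()
    cph.fit(matched[[time, event, exposure] + covariates],
            duration_col=time, event_col=event)
    return cph

# cph = match_and_fit(optn_df, ["donor_age", "kdpi", "recipient_age"])
# cph.print_summary()   # hazard ratios with 95% CIs
```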

3.
Organ donor contraindications are frequently reassessed for their impact on recipient outcomes in an attempt to meet the demand for transplantation. This study retrospectively analyzed the United Network for Organ Sharing (UNOS) registry for adult heart transplants from 1987 to September 2016 to characterize the impact of donor malignancy history in heart transplantation. Kaplan-Meier estimates illustrated 10-year survival. Propensity score matching was used for 1:1 matching of donors with and without a history of malignancy, and Cox proportional hazards and logistic regressions were used to analyze the matched population. Of 38 781 heart transplants, 622 (1.6%) had a donor history of malignancy. Cox regression showed that donor malignancy predicted increased 10-year mortality (HR = 1.16 [1.01-1.33]), but this difference did not persist when conditioned on 1-year posttransplant survival (log-rank P = .643). Cox regression of the propensity score-matched population (455 pairs) found no association between donor malignancy and 10-year mortality (HR = 1.02 [0.84-1.24]). Older age and higher rates of hypertension were observed in donors with a history of malignancy whose recipients died within the first year posttransplant. The increased recipient mortality is therefore likely due to donor characteristics beyond malignancy, creating the potential for expanded donor selection.
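The statement that the mortality difference "did not persist when conditioned on 1-year posttransplant survival" describes a landmark analysis: survival is re-estimated only among recipients still alive at the landmark time, with the clock reset. A hedged sketch with hypothetical column names (not the UNOS analysis code):

```python
# Landmark analysis sketch: compare survival by donor malignancy history among
# recipients who survived at least 1 year. Column names are assumptions.
from lifelines.statistics import logrank_test

def landmark_logrank(df, landmark_years=1.0):
    # Keep only recipients still under follow-up at the landmark, reset the clock
    lm = df[df["years_followup"] >= landmark_years].copy()
    lm["years_followup"] -= landmark_years

    g1 = lm[lm["donor_malignancy"] == 1]
    g0 = lm[lm["donor_malignancy"] == 0]
    result = logrank_test(g1["years_followup"], g0["years_followup"],
                          event_observed_A=g1["died"], event_observed_B=g0["died"])
    return result.p_value

# p = landmark_logrank(unos_df)   # non-significance beyond the landmark suggests
#                                 # the excess risk is concentrated in year 1
```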

4.
A recent study reported that kidney transplant recipients of offspring living donor kidneys had higher graft loss and mortality. This seemed counterintuitive, given the excellent HLA matching and younger age of offspring donors; we were concerned about residual confounding and other study design issues. We used Scientific Registry of Transplant Recipients data from 2001-2016 to evaluate death-censored graft failure (DCGF) and mortality for recipients of offspring versus nonoffspring living donor kidneys, using Cox regression models with interaction terms. Recipients of offspring kidneys had lower DCGF than recipients of nonoffspring kidneys (15-year cumulative incidence 21.2% vs 26.1%, P < .001). This association remained after adjustment for recipient and transplant factors (adjusted hazard ratio [aHR] = 0.77, 95% CI 0.73-0.82, P < .001), and was attenuated among African American donors (aHR = 0.85, 95% CI 0.77-0.95; interaction P = .01) and female recipients (aHR = 0.84, 95% CI 0.77-0.91, P < .001). Although offspring kidney recipients had higher mortality (15-year mortality 56.4% vs 37.2%, P < .001), this difference largely disappeared with adjustment for recipient age alone (aHR = 1.06, 95% CI 1.02-1.10, P = .002) and was nonsignificant after further adjustment for other recipient characteristics (aHR = 0.97, 95% CI 0.93-1.01, P = .1). Kidneys from offspring donors were associated with lower graft failure and comparable mortality. An otherwise eligible donor should not be dismissed because they are the offspring of the recipient, and we encourage continued individualized counseling for potential donors.
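The attenuation of the offspring-donor effect among African American donors and female recipients was assessed with "Cox regression models with interaction terms." A minimal sketch of coding such an interaction by hand; the variable names are illustrative, not SRTR field names.

```python
# Sketch: death-censored graft failure model with an offspring x female-recipient
# interaction term. Variable names are hypothetical.
from lifelines import CoxPHFitter

def fit_interaction_model(df):
    df = df.copy()
    df["offspring_x_female"] = df["offspring_donor"] * df["recipient_female"]
    cols = ["years_to_dcgf", "dcgf", "offspring_donor",
            "recipient_female", "offspring_x_female", "recipient_age"]
    cph = CoxPHFitter().fit(df[cols], duration_col="years_to_dcgf", event_col="dcgf")
    return cph

# The aHR for offspring donors among female recipients is then
# exp(beta_offspring + beta_interaction); cph.summary holds both coefficients.
```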

5.
Among the factors determining long-term kidney allograft outcome, pretransplant renal replacement therapy (RRT) is the most easily modifiable. Previous studies analysing the impact of RRT modality on patient and graft survival are conflicting, and studies of allograft function are scarce and lack sufficient size and follow-up. We retrospectively studied patient and allograft survival together with allograft function and its decline in 2277 allograft recipients during 2000-2014. Pretransplant RRT modality for ≥60 days was grouped into "no RRT" (n = 136), "haemodialysis (HD)" (n = 1847), "peritoneal dialysis (PD)" (n = 159), and "HD + PD" (n = 135). Kaplan-Meier analysis demonstrated superior 5-/10-/15-year patient survival (93.0/81.8/73.1% vs. 86.2/71.6/49.8%), death-censored graft survival (90.8/85.4/71.5% vs. 84.4/75.2/63.2%), and 1-year rejection-free graft survival (73.8% vs. 63.8%) in PD versus HD patients. Adjusted Cox regression revealed a 34.5% [1.5-56.5%] lower hazard of death, whereas death-censored graft loss was similar [HR = 0.707 (0.469-1.064)] and rejection was less frequent [HR = 0.700 (0.508-0.965)]. Allografts showed higher 1-/3-/5-year estimated glomerular filtration rate (eGFR) in the "PD" versus the "HD" group. The benefit of living donation for allograft function was most pronounced in the "no RRT" and "PD" groups. Functional allograft decline (eGFR slope) was lowest for "PD". Allograft recipients on pretransplant PD versus HD demonstrated superior all-cause patient and rejection-free graft survival along with better allograft function (eGFR).
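The Kaplan-Meier comparison of survival across pretransplant RRT modalities can be sketched as follows; the group labels mirror the abstract, while the data frame and column names are assumptions for illustration.

```python
# Sketch: Kaplan-Meier curves and a log-rank test across pretransplant RRT groups.
# 'rrt', 'years', and 'died' are hypothetical column names.
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

def plot_survival_by_rrt(df):
    ax = plt.subplot(111)
    for modality in ["no RRT", "HD", "PD", "HD + PD"]:
        grp = df[df["rrt"] == modality]
        KaplanMeierFitter(label=modality).fit(
            grp["years"], grp["died"]).plot_survival_function(ax=ax)
    test = multivariate_logrank_test(df["years"], df["rrt"], df["died"])
    ax.set_xlabel("years after transplantation")
    ax.set_title(f"log-rank p = {test.p_value:.3f}")
    return ax
```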

6.
Long-term survival in orthotopic liver transplant (OLT) recipients remains impaired because of many contributing factors, including low pretransplant muscle mass (sarcopenia). However, the influence of posttransplant muscle mass on survival is currently unknown. We hypothesized that the posttransplant urinary creatinine excretion rate (CER), an established noninvasive marker of total body muscle mass, is associated with long-term survival after OLT. In a single-center cohort study of 382 adult OLT recipients, mean ± standard deviation CER at 1 year posttransplantation was 13.3 ± 3.7 mmol/24 h in men and 9.4 ± 2.6 mmol/24 h in women. During a median follow-up of 9.8 years (interquartile range 6.4-15.0 years), 104 (27.2%) OLT recipients died and 44 (11.5%) developed graft failure. In Cox regression analyses with CER as a continuous variable, low CER was associated with increased risk of mortality (HR = 0.43, 95% CI: 0.26-0.71, P = .001) and graft failure (HR = 0.42, 95% CI: 0.20-0.90, P = .03), independent of age, sex, and body surface area. Similarly, OLT recipients in the lowest CER tertile had an increased risk of mortality (HR = 2.69, 95% CI: 1.47-4.91, P = .001) and graft failure (HR = 2.77, 95% CI: 1.04-7.39, P = .04) compared with OLT recipients in the highest tertile. We conclude that low total body muscle mass at 1 year posttransplant is associated with long-term risk of mortality and graft failure in OLT recipients.
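CER was analysed both as a continuous variable and by tertile. A rough sketch of the tertile construction and an adjusted Cox model follows; column names are hypothetical, and tertiles are computed on the whole cohort here for brevity rather than with the sex-specific cut-offs a study like this would likely use.

```python
# Sketch: tertiles of 24-h urinary creatinine excretion, then an adjusted Cox
# model for mortality. Column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

def fit_cer_tertile_model(df):
    df = df.copy()
    df["cer_tertile"] = pd.qcut(df["cer_mmol_24h"], 3, labels=["low", "mid", "high"])
    df["lowest_tertile"] = (df["cer_tertile"] == "low").astype(int)
    cols = ["years_followup", "died", "lowest_tertile", "age", "male", "bsa"]
    return CoxPHFitter().fit(df[cols], duration_col="years_followup", event_col="died")
```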

7.
Nondirected kidney donors can initiate living donor chains that end with patients on the waitlist. We compared 749 National Kidney Registry (NKR) waitlist chain-end transplants with other transplants recorded by the NKR and the Scientific Registry of Transplant Recipients between February 2008 and September 2020. Compared with other NKR recipients, chain-end recipients were older (53 vs. 52 years), more often black (32% vs. 15%), more often publicly insured (71% vs. 46%), and had spent longer on dialysis (3.0 vs. 1.0 years). Similar differences were noted between chain-end recipients and non-NKR living donor recipients. Black patients received chain-end kidneys at a rate approaching that of deceased donor kidneys (32% vs. 34%). Chain-end donors were older (52 vs. 44 years) and had slightly lower glomerular filtration rates (93 vs. 98 ml/min/1.73 m2) than other NKR donors. Chain-end recipients had an elevated risk of graft failure and mortality compared with control living donor recipients (both p < .01) but lower graft failure (p = .03) and mortality (p < .001) compared with deceased donor recipients. Sharing nondirected donors within a multicenter network may improve the diversity of waitlist patients who benefit from living donation.

8.
Immunosuppression and comorbidities might place solid organ transplant (SOT) recipients at higher risk from COVID-19, as suggested by recent case series. We compared 45 SOT vs. 2427 non-SOT patients admitted with COVID-19 to our health-care system (March 1, 2020 - August 21, 2020), evaluating hospital length of stay and inpatient mortality using competing-risks regression. We compared trajectories of the WHO COVID-19 severity scale using mixed-effects ordinal logistic regression, adjusting for severity score at admission. SOT and non-SOT patients had comparable age, sex, and race, but SOT recipients were more likely to have diabetes (60% vs. 34%, p < .001), hypertension (69% vs. 44%, p = .001), HIV (7% vs. 1.4%, p = .024), and peripheral vascular disorders (19% vs. 8%, p = .018). There were no statistically significant differences between SOT and non-SOT patients in maximum illness severity score (p = .13), length of stay (sHR = 1.1, 95% CI 0.9-1.4, p = .5), or mortality (sHR = 0.4, 95% CI 0.1-1.6, p = .19), although the severity score on admission was slightly lower for SOT (median [IQR] 3 [3-4]) than for non-SOT patients (median [IQR] 4 [3-4]) (p = .042). Despite a higher risk profile, SOT recipients had a faster decline in disease severity over time (OR = 0.81, 95% CI 0.76-0.86, p < .001) compared with non-SOT patients. These findings have implications for transplant decision-making during the COVID-19 pandemic and offer insights into the impact of SARS-CoV-2 on immunosuppressed patients.
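Length of stay and inpatient mortality were analysed with "competing-risks regression" (discharge alive competes with in-hospital death). The regression itself (e.g., a Fine-Gray sub-distribution hazard model) is not shown here; the sketch below only illustrates the nonparametric piece, cumulative incidence in the presence of a competing event, and its event coding is an assumption.

```python
# Sketch: cumulative incidence of in-hospital death, with discharge alive treated
# as a competing event (Aalen-Johansen estimator). Event coding is hypothetical:
# 0 = censored/still hospitalized, 1 = died in hospital, 2 = discharged alive.
from lifelines import AalenJohansenFitter

def cumulative_incidence_of_death(days_in_hospital, outcome_code):
    ajf = AalenJohansenFitter()
    ajf.fit(days_in_hospital, outcome_code, event_of_interest=1)
    return ajf.cumulative_density_   # CIF of death, accounting for discharge

# cif_sot = cumulative_incidence_of_death(sot["days"], sot["outcome"])
# cif_non = cumulative_incidence_of_death(non_sot["days"], non_sot["outcome"])
```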

9.
Background: There is little literature concerning clinical outcomes following revision joint arthroplasty in solid organ transplant recipients. The aims of this study are to (1) analyze postoperative outcomes and mortality following revision hip and knee arthroplasty in renal transplant recipients (RTRs) compared to non-RTRs and (2) characterize common indications and types of revision procedures among RTRs. Methods: A retrospective Medicare database review identified 1020 RTRs who underwent revision joint arthroplasty (359 revision total knee arthroplasty [TKA] and 661 revision total hip arthroplasty [THA]) from 2005 to 2014. RTRs were compared to their respective matched control groups of nontransplant revision arthroplasty patients for hospital length of stay, readmission, major medical complications, infections, septicemia, and mortality following revision. Results: Renal transplantation was significantly associated with increased length of stay (6.12 ± 7.86 vs 4.33 ± 4.29 days, P < .001), septicemia (odds ratio [OR], 2.52; 95% confidence interval [CI], 1.83-3.46; P < .001), and 1-year mortality (OR, 2.71; 95% CI, 1.51-4.53; P < .001) following revision TKA. Among revision THA patients, RTR status was associated with increased hospital readmission (OR, 1.23; 95% CI, 1.03-1.47; P = .023), septicemia (OR, 1.82; 95% CI, 1.41-2.34; P < .001), and 1-year mortality (OR, 2.65; 95% CI, 1.88-3.66; P < .001). The most frequent primary diagnoses associated with revision TKA and THA among RTRs were mechanical complications of the prosthetic implant. Conclusion: Prior renal transplantation among revision joint arthroplasty patients is associated with increased morbidity and mortality when compared to nontransplant recipients.

10.
Allografts from living kidney donors with hypertension may carry subclinical kidney disease from the donor to the recipient and thus lead to adverse recipient outcomes. We examined eGFR trajectories and all-cause allograft failure in recipients from donors with versus without hypertension, using mixed-linear and Cox regression models stratified by donor age. We studied a US cohort from 1/1/2005 to 6/30/2017: 49 990 recipients of allografts from younger (<50 years old) donors, including 597 with donor hypertension, and 21 130 recipients of allografts from older (≥50 years old) donors, including 1441 with donor hypertension. Donor hypertension was defined as documented predonation use of antihypertensive therapy. Among recipients from younger donors with versus without hypertension, the annual eGFR decline was −1.03 versus −0.53 ml/min/1.73 m2 (P = 0.002), and 13-year allograft survival was 49.7% vs. 59.0% (adjusted allograft failure hazard ratio [aHR] 1.23; 95% CI 1.05–1.43; P = 0.009). Among recipients from older donors with versus without hypertension, the annual eGFR decline was −0.67 versus −0.66 ml/min/1.73 m2 (P = 0.9), and 13-year allograft survival was 48.6% versus 52.6% (aHR 1.05; 95% CI 0.94–1.17; P = 0.4). In secondary analyses, our inferences remained similar for the risk of death-censored allograft failure and mortality. Hypertension in younger, but not older, living kidney donors is associated with worse recipient outcomes.
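The eGFR trajectories were modelled with "mixed-linear … regression models stratified by donor age." One way such a model can be written with statsmodels is sketched below; the formula, the random-effects structure, and the column names are illustrative assumptions, not the authors' specification.

```python
# Sketch: linear mixed model for annual eGFR decline by donor hypertension status,
# with a random intercept and slope per recipient. Column names are hypothetical.
import statsmodels.formula.api as smf

def fit_egfr_trajectory(df):
    # years_post_tx : time of each eGFR measurement (years after transplant)
    # donor_htn     : 1 if the living donor was on antihypertensive therapy
    model = smf.mixedlm("egfr ~ years_post_tx * donor_htn",
                        data=df,
                        groups=df["recipient_id"],
                        re_formula="~years_post_tx")
    fit = model.fit()
    # The years_post_tx:donor_htn coefficient is the difference in annual slope
    return fit

# fit_young = fit_egfr_trajectory(df[df["donor_age"] < 50])   # stratified by donor age
```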

11.
The average age of renal transplant recipients in the United States has increased over the past decade, but the implications have not been fully investigated. We explored predictors of success and demographic variables related to outcomes in elderly live donor transplantation. Retrospective analysis was performed using the UNOS database between 2001 and 2016. Donor characteristics and the graft failure rate of recipients above and below 70 years of age were compared across four eras: 2001-2004, 2005-2008, 2009-2012, and 2013-2016. There was a steady increase in average donor age from the first era to the fourth era (40 to 44 years), which was more evident among the septuagenarian patients (43 to 50 years) (P < .001). The 2-year graft survival rate improved from 92% in the first era to 96% in the fourth era (P < .001), and this improvement was also more prominent in the >70 population (87% to 93%, P < .001). Recipients older than 70 years were more likely to be non-Hispanic white (80.1% vs 65.1%, P < .001) and male (70.1% vs 61.0%, P < .001). In the >70 population, donors were more likely to be non-Hispanic white and female. Live donation in the elderly is justified on the basis of graft survival and patient survival; however, racial and gender differences exist among septuagenarian recipients and their donors.

12.
Hearts from older donors are increasingly utilized for transplantation because of unmet demand. Conflicting evidence exists regarding the prognosis of recipients of advanced-age donor hearts, especially in young recipients. A retrospective analysis was performed on 11 433 patients aged 18 to 45 years who received a cardiac transplant from 2000 to 2017. Overall, 10 279 patients received hearts from donors younger than 45 years and 1145 from donors older than 45 years. Recipients of older-donor hearts were older (37 vs. 34 years, P < .01) and had higher rates of inotropic dependence (48% vs. 42%, P < .01). However, the groups were similar in terms of comorbidities and dependence on mechanical circulatory support. Median survival for recipients of older-donor hearts was reduced by 2.6 years (12.6 vs. 15.2 years, P < .01). Multivariable analysis demonstrated donor age greater than 45 years to be a predictor of mortality (HR 1.18 [1.05-1.33], P = .01). However, when the analysis was restricted to patients whose donor had a negative preprocurement angiogram, donor age had only a borderline association with mortality (HR 1.20 [0.98-1.46], P = .06). Older donor hearts in young recipients are associated with decreased long-term survival; however, this risk is reduced with donors without atherosclerosis. The long-term hazard of this practice should be carefully weighed against the risk of waitlist mortality.

13.
Performing a third or fourth kidney transplantation (3KT and 4KT) in older patients is rare because of surgical and immunologic challenges. We aimed to analyze and compare the outcomes of younger (18-64 years) and older (≥65 years) recipients of 3KT and 4KT. Between 1990 and 2016, we identified 5816 recipients of 3KTs (153 were older) and 886 recipients of 4KTs (18 were older). The incidences of delayed graft function (24.3% vs. 24.8%, p = .89), primary nonfunction (3.2% vs. 1.3%, p = .21), 1-year acute rejection (18.6% vs. 14.8%, p = .24), and 5-year death-censored graft failure (DCGF) (24.8% vs. 17.9%, p = .06) did not differ between younger and older recipients of 3KT. However, 5-year mortality was higher in older recipients (14.0% vs. 33.8%, p < .001), and this difference remained significant after adjustment (aHR = 3.21, 95% CI: 2.59-3.99). Similar patterns were noted in the 4KT cohort. When compared with waitlisted patients, 3KT and 4KT were associated with a lower risk of mortality (aHR = 0.37, 95% CI: 0.33-0.41 and aHR = 0.31, 95% CI: 0.24-0.41, respectively). This survival benefit did not differ by recipient age (younger vs. older, p for interaction: .49 for 3KT and .58 for 4KT). In the largest cohort described to date, we report that there is a survival benefit of 3KT and 4KT even among older patients. Although this is a highly selected cohort, our results support improving access to 3KT and 4KT.

14.
Although the use of induction therapy has reduced the risk of acute rejection after heart transplantation, its use may be associated with other adverse outcomes. We aimed to examine the effect of no induction (NoInd), induction with basiliximab (BAS), or induction with antithymocyte globulin (ATG) on outcome after heart transplantation. We analyzed data from the International Society for Heart and Lung Transplantation (ISHLT) registry for adult heart transplants performed between 2000 and 2013. The primary outcome was cumulative all-cause mortality, and the secondary outcome was cause-specific death. We identified 27 369 transplants whose recipients received NoInd (n = 15 688), ATG (n = 6830), or BAS (n = 4851). Over a median follow-up of 1497 days, overall 30-day mortality was 5% and 1-year mortality was 11%. Survival after transplant was similar in patients treated with NoInd compared with ATG. Survival was better with NoInd than with BAS (log-rank P = .040; adjusted HR = 1.11, 95% CI, 1.04-1.19). Compared with NoInd, BAS was associated with a higher risk of graft failure-related death (HR = 1.27; 95% CI, 1.02-1.58), and ATG was associated with a higher risk of malignancy-related death (HR = 1.18; 95% CI, 1.01-1.39). Survival of patients who received NoInd was similar to that with ATG and better than with BAS. Furthermore, the use of ATG may be associated with increased malignancy-related mortality compared with NoInd.

15.
Recent studies have found that metastatic castration-resistant prostate cancer (mCRPC) positive for androgen receptor splice variant 7 (AR-V7) may have a poor prognosis during endocrine therapy or chemotherapy, but the specific mechanism remains unclear. We completed a literature search of PubMed, the Web of Science database, and Embase in March 2019; the final results are presented here. The pooled results showed that AR-V7 status predicted PSA progression-free survival (PSA-PFS; HR = 4.31, 95% CI: 2.57-7.24, p < .001), radiographic PFS (rPFS; HR = 2.39, 95% CI: 1.28-4.48, p = .006), and overall survival (OS; HR = 4.27, 95% CI: 3.22-5.66, p < .001) in mCRPC patients after endocrine therapy or chemotherapy. Subgroup analysis by treatment revealed a significant association between positive AR-V7 and OS in mCRPC patients treated with chemotherapy (HR = 2.82, 95% CI: 1.72-4.62, p < .001) and in those treated with endocrine therapy (HR = 4.78, 95% CI: 3.33-6.86, p < .001). Our study demonstrates that AR-V7-positive mCRPC patients may have a worse prognosis and that AR-V7 may be an independent prognostic factor for endocrine therapy or chemotherapy in patients with mCRPC.
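Pooled hazard ratios of this kind come from inverse-variance meta-analysis on the log scale. A worked sketch of fixed-effect pooling from per-study HRs and 95% CIs, using invented study values rather than the papers included in this meta-analysis:

```python
# Fixed-effect (inverse-variance) pooling of hazard ratios on the log scale.
# The three (HR, lower, upper) tuples below are invented examples, not the
# studies pooled in the cited meta-analysis.
import numpy as np

studies = [(3.9, 2.1, 7.2), (4.8, 2.6, 8.9), (4.1, 2.4, 7.0)]

log_hr = np.log([s[0] for s in studies])
se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)
w = 1.0 / se**2                                   # inverse-variance weights

pooled = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
print(f"pooled HR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

A random-effects model would additionally estimate between-study heterogeneity (e.g., a DerSimonian-Laird tau-squared) before computing the weights.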

16.
Islet transplantation has become a well-established therapy for select patients with type 1 diabetes. Viability and engraftment can be compromised by the oxidative stress generated during isolation and culture. We evaluated whether administration of BMX-001 (MnTnBuOE-2-PyP5+ [Mn(III) meso-tetrakis(N-n-butoxyethylpyridinium-2-yl)porphyrin]) or its earlier derivative, BMX-010 (MnTE-2-PyP [Mn(III) meso-tetrakis(N-methylpyridinium-2-yl)porphyrin]), could improve islet function and engraftment outcomes. Human islets cultured long-term with BMX-001, but not BMX-010, exhibited preserved in vitro viability. Murine islets isolated and cultured for 24 hours with 34 μmol/L BMX-001 exhibited improved insulin secretion in response to glucose relative to control islets (n = 3 isolations, P < .05). In addition, 34 μmol/L BMX-001-supplemented murine islets exhibited significantly reduced apoptosis, as indicated by terminal deoxynucleotidyl transferase dUTP nick end labeling, compared with nontreated control islets (P < .05). When syngeneic murine islets were transplanted under the kidney capsule at a marginal dose of 150 islets, 58% of recipients of 34 μmol/L BMX-001-treated islets became euglycemic (n = 11 of 19), compared with 19% of recipients of nontreated control islets (n = 3 of 19, P < .05). Of murine recipients receiving a marginal dose of human islets cultured with 34 μmol/L BMX-001, 92% (n = 12 of 13) achieved euglycemia, compared with 57% of control recipients (n = 8 of 14, P = .11). These results demonstrate that administration of BMX-001 enhances in vitro viability and augments engraftment of a marginal murine islet mass.
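The marginal-mass engraftment contrast (11/19 vs. 3/19 euglycemic recipients) is a small 2×2 comparison of proportions. The abstract does not state which test was used, so the Fisher's exact test below is only an illustration built from the reported counts.

```python
# Fisher's exact test on the murine marginal-mass engraftment counts reported in
# the abstract (11/19 euglycemic with BMX-001-treated islets vs. 3/19 controls).
# The choice of test is an assumption for illustration.
from scipy.stats import fisher_exact

table = [[11, 19 - 11],   # BMX-001-treated: euglycemic, not euglycemic
         [3, 19 - 3]]     # control:         euglycemic, not euglycemic
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.1f}, p = {p_value:.3f}")
```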

17.
The optimal timing of chemotherapy relative to resection of synchronous colorectal liver metastases (SCRLM) is not known. The objective of this retrospective multi-institutional study was to assess the influence of chemotherapy administered before and after hepatic resection on long-term outcomes among patients with initially resectable SCRLM treated from 1995 to 2005. Clinicopathologic data, treatments, and long-term outcomes of patients with initially resectable SCRLM who underwent partial hepatectomy at three hepatobiliary centers were reviewed. Four hundred ninety-nine consecutive patients underwent resection; 297 (59.5%) and 264 (52.9%) were treated with chemotherapy before and after resection, respectively. Chemotherapy strategies included pre-hepatectomy alone (n = 148, 24.7%), post-hepatectomy alone (n = 115, 23.0%), perioperative (n = 149, 29.0%), and no chemotherapy (n = 87, 17.4%). Male gender (p = 0.0029, HR = 1.41 [1.12–1.77]), node-positive primary tumor (p = 0.0046, HR = 1.40 [1.11–1.77]), four or more SCRLM (p = 0.0005, HR = 1.65 [1.24–2.18]), and post-hepatectomy chemotherapy for 6 months or longer (p = 0.039, HR = 0.75 [0.57–0.99]) were associated with recurrence-free survival after discovery of SCRLM. Carcinoembryonic antigen >200 ng/ml (p = 0.0003, HR = 2.33 [1.48–3.69]), extrahepatic metastatic disease (p = 0.0025, HR = 2.34 [1.35–4.05]), four or more SCRLM (p = 0.033, HR = 1.43 [1.03–2.00]), and post-hepatectomy chemotherapy for 2 months or longer (p < 0.0001, HR = 0.59 [0.45–0.76]) were associated with overall survival. Pre-hepatectomy chemotherapy was not associated with recurrence-free or overall survival. Patients treated with perioperative chemotherapy had outcomes similar to those of patients treated with post-hepatectomy chemotherapy only. We conclude that chemotherapy administered after, but not before, resection of SCRLM was associated with improved recurrence-free and overall survival. However, prospective randomized trials are needed to determine the optimal timing of chemotherapy.

18.
Desensitization has enabled incompatible living donor kidney transplantation (ILDKT) across HLA/ABO barriers, but the added immunomodulation might put patients at increased risk of infection. We studied 475 recipients from our center from 2010 to 2015, categorized by desensitization intensity: none/compatible (n = 260), low (0-4 plasmapheresis sessions, n = 47), moderate (5-9, n = 74), and high (≥10, n = 94). The 1-year cumulative incidence of infection was 50.1%, 49.8%, 66.0%, and 73.5% for recipients who received no, low-, moderate-, and high-intensity desensitization, respectively (P < .001). The most common infections were urinary tract infection (33.5% of ILDKT vs. 21.5% of compatible recipients), opportunistic infection (21.9% vs. 10.8%), and bloodstream infection (19.1% vs. 5.4%) (P < .001). In weighted models, a trend toward increased risk was seen in recipients with low-intensity (weighted incidence rate ratio [wIRR] = 1.40, 95% CI 0.77-2.56, P = .3) and moderate-intensity (wIRR = 1.35, 95% CI 0.88-2.06, P = .2) desensitization, with a statistically significant 2.22-fold increased risk in highly desensitized recipients (wIRR = 2.22, 95% CI 1.33-3.72, P = .002). Recipients with ≥4 infections were at higher risk of prolonged hospitalization (wIRR = 3.57, 95% CI 2.62-4.88, P < .001) and death-censored graft loss (wHR = 4.01, 95% CI 1.15-13.95, P = .03). Posttransplant infections are more common in desensitized ILDKT recipients, and a subset of highly desensitized patients is at ultra-high risk for infection. Strategies should be designed to protect patients from the morbidity of recurrent infections and to extend the survival benefit of ILDKT across the spectrum of recipients.
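The weighted incidence rate ratios (wIRRs) reported here correspond to a weighted Poisson-type model for infection counts with person-time as an offset. A hedged sketch with statsmodels follows; the weighting scheme (inverse probability weights are assumed), the exposure coding, and the column names are illustrative, not the authors' exact model.

```python
# Sketch: incidence rate ratios for infections by desensitization intensity,
# from a weighted Poisson GLM with log person-time as offset.
# 'n_infections', 'person_years', 'ipw', and the exposure dummies are hypothetical.
import numpy as np
import statsmodels.api as sm

def weighted_irr(df, exposure_cols):
    X = sm.add_constant(df[exposure_cols])
    model = sm.GLM(df["n_infections"], X,
                   family=sm.families.Poisson(),
                   offset=np.log(df["person_years"]),
                   freq_weights=df["ipw"])
    fit = model.fit()
    return np.exp(fit.params), np.exp(fit.conf_int())   # IRRs with 95% CIs

# irr, ci = weighted_irr(cohort, ["desens_low", "desens_moderate", "desens_high"])
```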

19.
Introduction: Donor nephrectomy (DN) is a procedure performed to provide recipients with a kidney to treat end-stage renal disease. The following analysis evaluated depression diagnoses in DN patients compared with controls. Methods: DN patients and matched controls were identified between 2000 and 2009 from the Statewide Planning and Research Cooperative System database. The cohorts were tracked for incident depression. Multivariable logistic regression was used to determine independent predictors of a postoperative depression diagnosis. Results: The total study cohort included 2108 DN cases and 2108 controls. In both donors and controls, the baseline rate of depression was 0.95% (n = 20). The 5-year incidence of a depression diagnosis after exposure increased in both cohorts (donors: 2.5%, n = 53; controls: 7.2%, n = 152; P < .001). The 5-year relative risk of developing depression was 2.65 (CI 1.59-4.42, P = .0002) in donors and 7.60 (CI 4.79-12.07, P < .001) in controls. On multivariable regression, donor status was associated with a reduced risk of developing postoperative depression (OR = 0.322, CI 0.233-0.445, P < .001), and the greatest risk factor for postoperative depression was a prior depression diagnosis (OR = 7.811, CI 3.814-15.997, P < .001). Conclusion: Our analysis shows that the strongest risk factor for depression was a prior diagnosis of depression. However, willingness to undergo donor nephrectomy is associated with less subsequent depression than in the control population, suggesting that kidney donors may be a more resilient cohort.
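The multivariable logistic model for postoperative depression (odds ratios with 95% CIs for donor status and prior depression) can be sketched as below; the predictor set and column names are assumptions for illustration, not the SPARCS variables.

```python
# Sketch: logistic regression for a 5-year postoperative depression diagnosis.
# Predictors and column names are illustrative, not the database field names.
import numpy as np
import statsmodels.formula.api as smf

def depression_model(df):
    fit = smf.logit("depression_5y ~ donor + prior_depression + age + female",
                    data=df).fit()
    odds_ratios = np.exp(fit.params)     # e.g. an OR < 1 for donor status
    or_ci = np.exp(fit.conf_int())       # 95% confidence intervals on the OR scale
    return odds_ratios, or_ci
```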

20.
The results of studies on the association between donor-recipient sex mismatch and survival after heart transplantation are conflicting. We analyzed data from the Spanish Heart Transplantation Registry. Of 4625 recipients, 3707 (80%) were men. The donor was female for 943 male recipients (25%) and male for 481 female recipients (52%). Recipients of male hearts had a higher body mass index (25.9 ± 4.1 vs. 24.3 ± 3.7; P < 0.01), and male donors were younger than female donors (33.4 ± 12.7 vs. 38.2 ± 12.3 years; P < 0.01). No further relevant differences related to donor sex were detected. In univariate analysis, mismatch was associated with mortality in men (hazard ratio [HR], 1.18; 95% confidence interval [CI], 1.06–1.32; P = 0.003) but not in women (HR, 0.91; 95% CI 0.74–1.12; P = 0.4), and a significant interaction was detected between sex mismatch and recipient gender (P = 0.02). In multivariate analysis, sex mismatch was associated with long-term mortality (HR, 1.14; 95% CI 1.01–1.29; P = 0.04), and there was a tendency toward significance for the interaction between sex mismatch and recipient gender (P = 0.08). In male recipients, mismatch increased mortality mainly during the first month and in patients with a pulmonary gradient >13 mmHg. Sex mismatch therefore appears to be associated with mortality after heart transplantation in men but not in women.
