Similar Articles
20 similar articles found (search time: 781 ms)
1.
The epidemiology of infection after liver transplantation for hilar cholangiocarcinoma has not been systematically investigated. In this study of 124 patients, 255 infections occurred in 105 patients during the median follow‐up of 4.2 years. The median time to first infection was 15.1 weeks (IQR 1.6‐62.6). The most common sites were the abdomen, bloodstream, and musculoskeletal system. Risk factors for any post‐transplant infection were pre‐transplant VRE colonization (Hazard Ratio [HR] 1.9, P=.002), living donor transplantation (HR 6.6, P<.001), longer cold ischemia time (HR 1.05 per 10 minutes, P<.001), donor CMV seropositivity (HR 2.2, P<.001), hepatic artery thrombosis (HR 2.6, P=.005), biliary stricture (HR 3.8, P=.002), intra‐abdominal fluid collection (HR 4.2, P<.001), and re‐operations within 1 month after transplantation (HR 1.7, P=.020). Abdominal infections were independently associated with hemodialysis requirement within 1 month after transplantation (HR 5.6, P=.006), hepatic artery thrombosis (HR 3.3, P=.007), biliary stricture (HR 5.2, P<.001), and abdominal fluid collection (HR 3.7, P=.0002). Bloodstream infections were independently associated with allograft ischemia (HR 17.8, P<.001), biliary stricture (HR 6.5, P=.005), and recipient VRE colonization (HR 4, P<.001). Abdominal infections (HR 2.3, P=.02) and Clostridium difficile infections (HR 4.6, P=.01) were independently associated with increased mortality.

2.
We modified the previously described D‐MELD score for deceased donor liver transplantation to (D+10)MELD, to account for living donors being about 10 years younger than deceased donors, and tested it on living donor liver transplantation (LDLT) recipients. Five hundred consecutive LDLT, performed between July 2010 and December 2012, were retrospectively analyzed to assess the effect of (D+10)MELD on patient and graft survival. Donor age alone did not influence survival. Recipients were divided into six classes based on the (D+10)MELD score: Class 1 (0‐399), Class 2 (400‐799), Class 3 (800‐1199), Class 4 (1200‐1599), Class 5 (1600‐1999), and Class 6 (≥2000). One‐year patient survival (97.1, 88.8, 87.6, 76.9, and 75% across Classes 1‐5, P=.03) and graft survival (97.1, 87.9, 82.3, 76.9, and 75%; P=.04) differed significantly among the classes. The study population was then divided into two groups at a (D+10)MELD cutoff of 860. Group 1 had significantly better 1‐year patient (90.4% vs 83.4%; P=.02) and graft survival (88.6% vs 80.2%; P=.01). While donor age alone does not predict recipient outcome, the (D+10)MELD score is a strong predictor of recipient and graft survival, and may help in better recipient/donor selection and matching in LDLT.
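The (D+10)MELD arithmetic above can be sketched in code. This is a minimal illustration, assuming the published D‐MELD definition (donor age × recipient laboratory MELD) and the 400‐point class bins given in the abstract; the example donor age and MELD values are invented:

```python
# Hedged sketch of the (D+10)MELD score described above.
# D-MELD (deceased donors) = donor age x recipient MELD;
# the modification adds 10 years to the living donor's age.
# Class boundaries follow the abstract's 400-point bins.

def d_plus_10_meld(donor_age: int, recipient_meld: int) -> int:
    """(donor age + 10) x recipient laboratory MELD."""
    return (donor_age + 10) * recipient_meld

def score_class(score: int) -> int:
    """Map a (D+10)MELD score to Class 1-6 using 400-point bins."""
    return min(score // 400 + 1, 6)

# Hypothetical example: a 35-year-old living donor, recipient MELD 18
score = d_plus_10_meld(35, 18)    # (35 + 10) * 18 = 810
print(score, score_class(score))  # 810 falls in Class 3 (800-1199)
```

A score of 810 lands in Class 3 and below the abstract's 860 cutoff, placing this hypothetical pair in the better-surviving group.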

3.
The average age of renal transplant recipients in the United States has increased over the past decade, but the implications have not been fully investigated. We explored predictors of success and demographic variables related to outcomes in elderly live donor transplantation. Retrospective analysis was performed using the UNOS database between 2001 and 2016. Donor characteristics and the graft failure rate of recipients above and below 70 years of age were compared across four eras: 2001-2004, 2005-2008, 2009-2012, and 2013-2016. There was a steady increase in average donor age from the first era to the fourth era (40 to 44 years), which was more evident among the septuagenarian patients (43 to 50 years) (P < .001). The 2-year graft survival rate improved from 92% in the first era to 96% in the fourth era (P < .001), and the improvement was more prominent in the >70 population (87%-93%) (P < .001). Recipients over 70 were more likely to be non-Hispanic white (80.1% vs 65.1%, P < .001) and male (70.1% vs 61.0%, P < .001). Their donors were more likely to be non-Hispanic white and female. Live donation in the elderly is justified based on graft survival and patient survival. However, racial and gender differences exist in septuagenarian recipients and their donors.

4.
This clinical study evaluates end‐ischemic hypothermic machine perfusion (eHMP) in expanded criteria donors (ECD) kidneys. eHMP was initiated upon arrival of the kidney in our center and continued until transplantation. Between 11/2011 and 8/2014 eHMP was performed in 66 ECD kidneys for 369 (98‐912) minutes after 863 (364‐1567) minutes of cold storage (CS). In 49 of 66 cases, the contralateral kidney from the same donor was preserved by static CS only and accepted by another Eurotransplant (ET) center. Five (10.2%) of these kidneys were ultimately judged as “not transplantable” by the accepting center and discarded. After exclusion of early unrelated graft losses, 43 kidney pairs from the same donor were eligible for direct comparison of eHMP vs CS only: primary non‐function and delayed graft function (DGF) were 0% vs 9.3% (P=.04) and 11.6% vs 20.9% (P=.24). There was no statistically significant difference in 1‐year graft survival (eHMP vs CS only: 97.7% vs 88.4%, P=.089). In a multivariate analysis, eHMP was an independent factor for prevention of DGF (OR: 0.28, P=.041). Development of DGF was the strongest risk factor for 1‐year graft failure (Renal resistance: 38.2, P<.001). In summary, eHMP is a promising reconditioning technique to improve the quality and acceptance rate of suboptimal grafts.

5.
The observation that 1,25‐dihydroxyvitamin D3 has an immunomodulatory effect on innate and adaptive immunity suggests a possible effect on clinical graft outcome. The aim of this study was to evaluate the correlation of biopsy‐proven acute rejection (BPAR), CMV infection, and BKV infection with 1,25‐dihydroxyvitamin D3 deficiency, and the benefit of calcitriol supplementation before and during transplantation. Risk factors and kidney graft function were also evaluated. All renal transplant recipients (RTRs) received induction therapy with basiliximab, cyclosporine, mycophenolic acid, and steroids. During the first year, the incidence of BPAR (4% vs 11%, P=.04), CMV infection (3% vs 9%, P=.04), and BKV infection (6% vs 19%, P=.04) was significantly lower in calcitriol users than in controls. By multivariate Cox regression analysis, 1,25‐dihydroxyvitamin D3 deficiency and lack of calcitriol exposure were independent risk factors for BPAR (HR=4.30, P<.005 and HR=3.25, P<.05), for CMV infection (HR=2.33, P<.05 and HR=2.31, P=.001), and for BKV infection (HR=2.41, P<.05 and HR=2.45, P=.001). After one year, users had better renal function: eGFR was 62.5±6.7 mL/min vs 51.4±7.6 mL/min (P<.05). Only one user developed polyomavirus‐associated nephropathy vs 15 controls, and two users lost their graft vs 11 controls. Deficient circulating levels of 1,25(OH)2‐D3 increased the risk of BPAR, CMV infection, and BKV infection after kidney transplantation. Administration of calcitriol is a way to obtain adequate 1,25(OH)2‐D3 circulating levels.

6.
The Model for End‐Stage Liver Disease (MELD) score predicts higher transplant healthcare utilization and costs; however, the independent contribution of functional status towards costs is understudied. The study objective was to evaluate the association between functional status, as measured by Karnofsky Performance Status (KPS), and liver transplant (LT) costs in the first posttransplant year. In a cohort of 598 LT recipients from July 1, 2009 to November 30, 2014, multivariable models assessed associations between KPS and outcomes. LT recipients needing full assistance (KPS 10%‐40%) vs being independent (KPS 80%‐100%) were more likely to be discharged to a rehabilitation facility after LT (22% vs 3%) and be rehospitalized within the first posttransplant year (78% vs 57%), all P < .001. In adjusted generalized linear models, in addition to MELD (P < .001), factors independently associated with higher 1‐year post‐LT transplant costs were older age, poor functional status (KPS 10%‐40%), living donor LT, pre‐LT hemodialysis, and the donor risk index (all P < .001). One‐year survival for patients in the top cost decile was 83% vs 93% for the rest of the cohort (log rank P < .001). Functional status is an important determinant of posttransplant resource utilization; therefore, standardized measurements of functional status should be considered to optimize candidate selection and outcomes.

7.
Objective: Despite growing evidence of comparable outcomes in recipients of donation after circulatory death (DCD) and donation after brain death donor lungs, DCD allografts continue to be underused nationally. We examined predictors of nonuse. Methods: All donors who donated at least 1 organ for transplantation between 2005 and 2019 were identified in the United Network for Organ Sharing registry and stratified by donation type. The primary outcome of interest was use of pulmonary allografts. Organ disposition and refusal reasons were evaluated. Multivariable regression modeling was used to assess the relationship between donor factors and use. Results: A total of 15,458 DCD donors met inclusion criteria. Of 30,916 lungs, 3.7% (1158) were used for transplantation and 72.8% were discarded, primarily due to poor organ function. Consent was not requested in 8.4% of DCD offers, with DCD status itself being the leading reason (73.4%). Nonuse was associated with smoking history (P < .001), clinical infection with a blood source (12% vs 7.4%, P = .001), and lower PaO2/FiO2 ratio (median 230 vs 423, P < .001). In multivariable regression, lungs with a PaO2/FiO2 ratio less than 250 were least likely to be transplanted (adjusted odds ratio, 0.03; P < .001), followed by cigarette use (adjusted odds ratio, 0.28; P < .001) and donor age >50 (adjusted odds ratio, 0.75; P = .031). Recent transplant era was associated with significantly increased use (adjusted odds ratio, 2.28; P < .001). Conclusions: Nontransplantation of DCD lungs was associated with potentially modifiable predonation factors, including organ procurement organizations' consenting behavior, and donor factors, including hypoxemia. Interventions to increase consent and standardize DCD donor management, including selective use of ex vivo lung perfusion in the setting of hypoxemia, may increase use and the donor pool.
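The adjusted odds ratios reported above come from multivariable logistic regression, where each OR is the exponential of a model coefficient, OR = exp(β). A minimal sketch of that relationship; the coefficient values below are illustrative, not the study's fitted estimates:

```python
import math

# Hedged sketch: how adjusted odds ratios like those above relate to
# logistic-regression coefficients (OR = exp(beta)). The numbers used
# here are for illustration only.

def odds_ratio(beta: float) -> float:
    """Convert a logistic-regression coefficient to an odds ratio."""
    return math.exp(beta)

def coefficient(or_value: float) -> float:
    """Recover the log-odds coefficient from a reported odds ratio."""
    return math.log(or_value)

# An OR of 0.28 (cigarette use) corresponds to beta = ln(0.28),
# i.e. a negative coefficient: smoking lowers the log-odds of use.
beta = coefficient(0.28)
print(round(beta, 2))               # about -1.27
print(round(odds_ratio(beta), 2))   # round-trips back to 0.28
```

An OR below 1 therefore always corresponds to a negative coefficient, and an OR of exactly 1 (no association) to a coefficient of zero.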

8.
Livers from older donors (OLDs; age ≥70) are risky and often declined; however, it is likely that some candidates will benefit from OLDs versus waiting for younger ones. To characterize the survival benefit of accepting OLD grafts, we used 2009‐2017 SRTR data to identify 24 431 adult liver transplant (LT) candidates who were offered OLD grafts eventually accepted by someone. Outcomes from the time of offer were compared between candidates who accepted an OLD graft and matched controls within MELD ± 2 who declined the same offer. Candidates who accepted OLD grafts (n = 1311) were older (60.5 vs. 57.8 years, P < .001), had a higher median MELD score (25 vs. 22, P < .001), and were less likely to have hepatitis C cirrhosis (14.9% vs. 31.2%, P < .001). Five‐year cumulative mortality among those who accepted versus declined the same OLD offer was 23.4% versus 41.2% (P < .001). Candidates who accepted OLDs experienced an almost twofold reduction in mortality (aHR 0.52, 95% CI 0.45‐0.59, P < .001) compared to those who declined the same offer, especially among the highest MELD (35‐40) candidates (aHR 0.24, 95% CI 0.10‐0.55, P = .001). Accepting an OLD offer provided substantial long‐term survival benefit compared to waiting for a better organ offer, notably among candidates with MELD 35‐40. Providers should consider these benefits as they evaluate OLD graft offers.
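Hazard‐ratio confidence intervals like those reported above are constructed on the log scale, CI = exp(log HR ± 1.96·SE). A hedged sketch that backs the standard error out of a reported interval (using an estimate of 0.52 with 95% CI 0.45 to 0.59 as the worked example) and reproduces it:

```python
import math

# Hedged sketch: CIs for hazard ratios are formed on the log scale,
# CI = exp(log(HR) +/- 1.96 * SE). Given a reported estimate and its
# 95% CI, the standard error of log(HR) can be recovered and the
# interval rebuilt (up to rounding in the published figures).

def se_from_ci(lower: float, upper: float, z: float = 1.96) -> float:
    """Back out the SE of log(HR) from a 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def hr_ci(hr: float, se: float, z: float = 1.96):
    """95% CI for a hazard ratio from its log-scale SE."""
    return (math.exp(math.log(hr) - z * se),
            math.exp(math.log(hr) + z * se))

se = se_from_ci(0.45, 0.59)
lo, hi = hr_ci(0.52, se)
print(round(lo, 2), round(hi, 2))
```

The rebuilt interval matches the published one only approximately, because the point estimate is rounded independently of the bounds.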

9.
The Value of Pulmonary Function Testing Prior to Bariatric Surgery   (Cited by: 1; self‐citations: 1; citations by others: 0)
Hamoui N, Anthone G, Crookes PF. Obesity Surgery 2006;16(12):1570‐1573
Background: Pulmonary function tests (PFTs) are often abnormal in the morbidly obese and improve after bariatric surgery. Our aim was to determine the utility of obtaining preoperative PFTs in assessing postoperative risk. Methods: 146 consecutive patients undergoing open bariatric surgery were analyzed. Patients were divided into those who had postoperative complications (Group A, n=27) and those who did not (Group B, n=119). PFTs and BMI were compared between Groups A and B. PFT parameters are reported as the median percentage of age-matched controls. Results: Patients in Group A were heavier (BMI 58 vs 51 kg/m², P=.001) and older (46 vs 40 years, P=.02) than those in Group B. They had reduced forced vital capacity (80% vs 97%, P<.001) and forced expiratory volume in 1 second (84% vs 99%, P=.002). They also had reduced vital capacity (VC, 85% vs 102%, P<.001) and total lung capacity (89% vs 99%, P=.01). They had decreased maximal voluntary ventilation (93% vs 106%, P=.003). They had lower arterial pO2 (76 mmHg vs 85 mmHg, P=.001) and a higher alveolar-arterial gradient (23 vs 17, P=.007). The strongest predictors of postoperative complications on multivariate analysis were reduced VC (RR 2.29 for each 10% decrease in VC, P=.0007) and age (RR 6.4 for age >40 years, P=.01). Conclusions: PFTs help to predict complications after bariatric surgery. The greatest reduction in VC may occur in patients with central obesity, reflecting increased intra-abdominal pressure and diminished chest wall compliance.

10.
ABO blood group antigen incompatibility (ABO mismatch) is not an obstacle to allogeneic stem cell transplantation (allo‐SCT); however, its impact on clinical outcome after allo‐SCT remains controversial. We analyzed 512 patients after allogeneic peripheral blood SCT (allo‐PBSCT) for an association of ABO mismatch with transfusion requirements, myeloid and platelet engraftment, the incidence of GvHD, relapse, transplant‐related mortality (TRM), and overall survival (OS). A total of 260 patients underwent ABO‐mismatched transplantation, and the control group consisted of 252 patients with ABO‐matched allo‐PBSCT. We found a significant association between major‐O ABO mismatch (group O recipient/group A, B, or AB donor) and increased red blood cell (RBC) and platelet transfusion requirements (both P<.001) as well as delayed platelet engraftment (P<.001). Minor‐A (group A recipient/group O donor) and minor‐AB (group AB recipient/group O, A, or B donor) ABO mismatch were significantly associated with increased TRM after allo‐PBSCT (P=.001 and P=.02). In multivariate Cox regression analysis, minor ABO mismatch appeared as an independent risk factor for TRM after allo‐PBSCT. No association was found for ABO mismatch with the incidence of GvHD, relapse, or OS. Our results suggest that ABO blood group mismatch has a significant impact on outcome and that minor‐A and minor‐AB mismatch represent risk factors for increased TRM after allo‐PBSCT.

11.
Invasive micropapillary carcinoma (IMPC) of the breast is a highly aggressive and rare subtype of breast cancer. In this study, we aimed to investigate differences between pure and mixed IMPCs of the breast in terms of clinicopathologic features, and to analyze the prognostic significance of ARID1A and bcl‐2 expression. Sixty‐nine IMPCs (21 pure and 48 mixed type) diagnosed at the Pathology Department of Istanbul Medical Faculty between 2000 and 2011, all with complete follow‐up data, were analyzed immunohistochemically for ARID1A and bcl‐2 expression in relation to prognosis. The median follow‐up period was 94 months. No significant difference was found between pure and mixed type IMPC, or within luminal subgroups, in terms of prognostic and clinicopathologic features. ARID1A and human epidermal growth factor receptor‐2 (Her‐2) status were found to be independent prognostic factors for both overall survival (OS) (HR=6.1, 95% CI 1.4‐26.6, P=.02; HR=15.9, 95% CI 3.5‐71.5, P<.0001, respectively) and disease‐free survival (DFS) (HR=4, 95% CI 1.1‐14.9, P=.04; HR=7.2, 95% CI 2‐25.4, P=.002, respectively) in multivariate Cox regression analysis. The loss of ARID1A expression was significantly related to 10‐year OS (P=.001) and 10‐year DFS (P=.05). A statistically significant effect of ARID1A expression on DFS and OS was also observed in the Luminal B group (P=.05 and P=.001, respectively). Pure and mixed type IMPCs are similar in terms of clinicopathologic and prognostic features. The loss of ARID1A expression and Her‐2 positivity have a significant adverse effect on clinical outcomes of IMPC patients.

12.
Inflammation in fibrosis areas (i‐IF/TA) of kidney allografts is associated with allograft loss; however, its diagnostic significance remains to be determined. We investigated the clinicohistologic phenotype and determinants of i‐IF/TA in a prospective cohort of 1539 kidney recipients undergoing evaluation of i‐IF/TA and tubulitis in atrophic tubules (t‐IF/TA) on protocol allograft biopsies performed at 1 year posttransplantation. We considered donor, recipient, and transplant characteristics, immunosuppression, and histological diagnoses in 2260 indication biopsies performed within the first year posttransplantation. Nine hundred forty‐six (61.5%) patients presented with interstitial fibrosis/tubular atrophy (IF/TA Banff grade > 0) at 1 year posttransplant, among whom 394 (41.6%) showed i‐IF/TA. i‐IF/TA correlated with concurrent t‐IF/TA (P < .001), interstitial inflammation (P < .001), tubulitis (P < .001), total inflammation (P < .001), peritubular capillaritis (P < .001), interstitial fibrosis (P < .001), and tubular atrophy (P = .02). The independent determinants of i‐IF/TA were previous T cell–mediated rejection (TCMR) (P < .001), BK virus nephropathy (P = .007), steroid therapy (P = .039), calcineurin inhibitor therapy (P = .011), inosine‐5′‐monophosphate dehydrogenase inhibitor therapy (P = .011), HLA‐B mismatches (P = .012), and HLA‐DR mismatches (P = .044). TCMR patients with i‐IF/TA on posttreatment biopsy (N = 83/136, 61.0%) exhibited accelerated progression of IF/TA over time (P = .01) and decreased 8‐year allograft survival (70.8% vs 83.5%, P = .038) compared to those without posttreatment i‐IF/TA. Our results support that i‐IF/TA may represent a manifestation of chronic active TCMR.

13.
Preoperative risk assessment of potential kidney transplant recipients often fails to adequately balance risk related to underlying comorbidities with the beneficial impact of kidney transplantation. We sought to develop a simple scoring system, based on factors known at the time of assessment for waitlist placement, to predict the likelihood of severe adverse events 1 year post‐transplant. The tool includes four components: age, cardiopulmonary factors, functional status, and metabolic factors. Pre‐transplant factors strongly associated with severe adverse events include diabetes (OR: 3.76, P<.001), coronary artery disease (OR: 3.45, P<.001), history of CABG/PCI (OR: 3.1, P=.001), and peripheral vascular disease (OR: 2.74, P=.008). The score was evaluated by calculation of the concordance index. The C statistic of 0.74 for the risk stratification group was considered good discrimination in the validation cohort (N=127) compared to the development cohort (N=368). The pre‐transplant risk group was highly predictive of severe adverse events (OR: 2.36, P<.001). Patients stratified into the above‐average‐risk group were four times more likely to experience severe adverse events than average‐risk patients, while patients in the high‐risk group were nearly 11 times more likely. The pre‐transplant risk stratification tool is a simple scoring scheme using easily obtained preoperative characteristics that can meaningfully stratify patients in terms of post‐transplant risk and may ultimately guide patient selection and inform the counseling of potential kidney transplant recipients.
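The C statistic of 0.74 cited above measures discrimination. For a binary outcome it equals the fraction of event/non‐event pairs in which the patient with the event carries the higher risk score (ties count half). A minimal sketch on invented toy data:

```python
# Hedged sketch of the concordance (C) statistic used above to assess
# discrimination: the fraction of event/non-event pairs in which the
# event patient has the higher risk score, counting ties as half.
# Scores and outcomes below are illustrative, not study data.

def c_statistic(scores, events):
    """Pairwise C statistic for risk scores vs a 0/1 outcome."""
    pairs = concordant = 0.0
    for s_event, e in zip(scores, events):
        if not e:
            continue
        for s_none, n in zip(scores, events):
            if n:
                continue
            pairs += 1
            if s_event > s_none:
                concordant += 1
            elif s_event == s_none:
                concordant += 0.5
    return concordant / pairs

scores = [0.9, 0.8, 0.6, 0.4, 0.3]   # hypothetical risk scores
events = [1,   0,   1,   0,   0]     # 1 = severe adverse event
print(c_statistic(scores, events))   # 5 of 6 pairs concordant
```

A value of 0.5 indicates no discrimination (coin flip) and 1.0 perfect ranking; 0.74 sits in the range conventionally called "good".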

14.
Complement component 3 (C3) presents both slow (C3S) and fast (C3F) variants, which can be locally produced and activated by immune system cells. We studied C3 recipient variants in 483 liver transplant patients by RT‐PCR‐HRM to determine their effect on graft outcome during the first year post‐transplantation. Allograft survival was significantly decreased in C3FF recipients (C3SS 95% vs C3FS 91% vs C3FF 83%; P=.01) or C3F allele carriers (C3F absence 95% vs C3F presence 90%, P=.02). C3FF genotype or presence of C3F allele independently increased risk for allograft loss (OR: 2.38, P=.005 and OR: 2.66, P=.02, respectively). C3FF genotype was more frequent among patients whose first infection was of viral etiology (C3SS 13% vs C3FS 18% vs C3FF 32%; P=.04) and independently increased risk for post‐transplant viral infections (OR: 3.60, P=.008). On the other hand, C3FF and C3F protected from rejection events (OR: 0.54, P=.03 and OR: 0.63, P=.047, respectively). Differences were not observed in hepatitis C virus recurrence or patient survival. In conclusion, we show that, independently from C3 variants produced by donor liver, C3F variant from recipient diminishes allograft survival, increases susceptibility to viral infections, and protects from rejection after transplantation. C3 genotyping of liver recipients may be useful to stratify risk.

15.
A recent study reported that kidney transplant recipients of offspring living donors had higher graft loss and mortality. This seemed counterintuitive, given the excellent HLA matching and younger age of offspring donors; we were concerned about residual confounding and other study design issues. We used Scientific Registry of Transplant Recipients data 2001‐2016 to evaluate death‐censored graft failure (DCGF) and mortality for recipients of offspring versus nonoffspring living donor kidneys, using Cox regression models with interaction terms. Recipients of offspring kidneys had lower DCGF than recipients of nonoffspring kidneys (15‐year cumulative incidence 21.2% vs 26.1%, P < .001). This association remained after adjustment for recipient and transplant factors (adjusted hazard ratio [aHR] 0.77, 95% CI 0.73‐0.82, P < .001), and was attenuated among African American donors (aHR 0.85, 95% CI 0.77‐0.95; interaction P = .01) and female recipients (aHR 0.84, 95% CI 0.77‐0.91, P < .001). Although offspring kidney recipients had higher mortality (15‐year mortality 56.4% vs 37.2%, P < .001), this difference largely disappeared with adjustment for recipient age alone (aHR 1.06, 95% CI 1.02‐1.10, P = .002) and was nonsignificant after further adjustment for other recipient characteristics (aHR 0.97, 95% CI 0.93‐1.01, P = .1). Kidneys from offspring donors were associated with lower graft failure and comparable mortality. An otherwise eligible donor should not be dismissed because they are the offspring of the recipient, and we encourage continued individualized counseling for potential donors.

16.
Objective: The study objective was to determine whether donor substance abuse (opioid overdose death, opioid use, cigarette or marijuana smoking) impacts lung acceptance and recipient outcomes. Methods: Donor offers to a single center from 2013 to 2019 were reviewed to determine whether lung acceptance rates and recipient outcomes were affected by donor substance abuse. Results: There were 3515 donor offers over the study period. A total of 154 offers (4.4%) involved opioid use and 117 (3.3%) were opioid overdose deaths. A total of 1744 donors (65.0%) smoked cigarettes and 69 donors (2.6%) smoked marijuana. Of smokers, 601 (35.0%) had less than a 20 pack-year history and 1117 (65.0%) had more than a 20 pack-year history. Substance abuse donors were younger (51.5 vs 55.2 years, P < .001), more often male (65.6% vs 54.8%, P < .001), more often White (86.2% vs 68.7%, P < .001), and more often had hepatitis C (8.3% vs 0.8%, P < .001). Donor acceptance was significantly associated with brain-dead donors (odds ratio, 1.56; P < .001), donor smoking history (odds ratio, 0.56; P < .001), hepatitis C (odds ratio, 0.35; P < .001), younger age (odds ratio, 0.98; P < .001), male gender (odds ratio, 0.74; P = .004), and any substance abuse history (odds ratio, 0.50; P < .001), but not opioid use, opioid overdose death, or marijuana use. Recipient survival was equivalent when using lungs from donors who had opioid overdose death, who smoked marijuana, or who smoked cigarettes for less than or more than 20 pack-years, and was significantly longer in recipients of opioid-use lungs. There was no significant difference in time to chronic lung allograft dysfunction for recipients who received lungs from opioid overdose death or with a history of opioid use, marijuana smoking, or cigarette smoking. Conclusions: Donor acceptance was impacted by cigarette smoking but not opioid use, opioid overdose death, or marijuana use. Graft outcomes and recipient survival were similar for recipients of lungs from donors who abused substances.

17.
In the United States, kidney donation from international (noncitizen/nonresident) living kidney donors (LKDs) is permitted; however, given the heterogeneity of healthcare systems, concerns remain regarding the international LKD practice and recipient outcomes. We studied a US cohort of 102 315 LKD transplants from 2000‐2016, including 2088 international LKDs, as reported to the Organ Procurement and Transplantation Network. International LKDs were more tightly clustered among a small number of centers than domestic LKDs (Gini coefficient 0.76 vs 0.58, P < .001). Compared with domestic LKDs, international LKDs were more often young, male, Hispanic or Asian, and biologically related to their recipient (P < .001). Policy‐compliant donor follow‐up was substantially lower for international LKDs at 6, 12, and 24 months postnephrectomy (2015 cohort: 45%, 33%, 36% vs 76%, 71%, 70% for domestic LKDs, P < .001). Among international LKDs, Hispanic (aOR 0.36, 95% CI 0.23‐0.56, P < .001) and biologically related (aOR 0.59, 95% CI 0.39‐0.89, P < .01) donors were more compliant in donor follow‐up than white and unrelated donors. Recipients of international living donor kidney transplant (LDKT) had similar graft failure (aHR 0.89, 95% CI 0.78‐1.02, P = .1) but lower mortality (aHR 0.62, 95% CI 0.53‐0.72, P < .001) compared with the recipients of domestic LDKT after adjusting for recipient, transplant, and donor factors. International LKDs may provide an alternative opportunity for living donation. However, efforts to improve international LKD follow‐up and engagement are warranted.
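The Gini coefficient used above summarizes how concentrated transplant volume is across centers: 0 means volume is spread evenly, and values near 1 mean a few centers perform nearly all procedures. A sketch using the mean‐absolute‐difference formula, with made‐up center counts:

```python
# Hedged sketch of the Gini coefficient used above to measure how
# tightly transplants cluster among centers. Formula: the sum of
# absolute pairwise differences divided by (2 * n * total volume).
# Center counts below are invented for illustration.

def gini(counts):
    """Gini coefficient via the mean-absolute-difference formula."""
    n = len(counts)
    total = sum(counts)
    if total == 0:
        return 0.0
    abs_diffs = sum(abs(x - y) for x in counts for y in counts)
    return abs_diffs / (2 * n * total)

even = [10, 10, 10, 10]        # every center does the same volume
concentrated = [97, 1, 1, 1]   # one center dominates
print(gini(even), gini(concentrated))  # 0.0 vs 0.72
```

The study's 0.76 vs 0.58 contrast says international LKD volume looks more like the concentrated profile than domestic volume does.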

18.
Hearts from older donors are increasingly utilized for transplantation due to unmet demand. Conflicting evidence exists regarding the prognosis of recipients of advanced‐age donor hearts, especially in young recipients. A retrospective analysis was performed on 11 433 patients aged 18 to 45 who received a cardiac transplant from 2000 to 2017. Overall, 10 279 patients received hearts from donors younger than 45 and 1145 from donors older than 45. Recipients of older donor hearts were older (37 vs. 34 years, P < .01) and had higher rates of inotropic dependence (48% vs. 42%, P < .01). However, the groups were similar in terms of comorbidities and dependence on mechanical circulatory support. Median survival for recipients of older donor hearts was reduced by 2.6 years (12.6 vs. 15.2, P < .01). Multivariable analysis demonstrated donor age greater than 45 to be a predictor of mortality (HR 1.18 [1.05‐1.33], P = .01). However, when the analysis was restricted to patients whose donor had a negative preprocurement angiogram, donor age had only a borderline association with mortality (HR 1.20 [0.98‐1.46], P = .06). Older donor hearts in young recipients are associated with decreased long‐term survival; however, this risk is reduced in donors without atherosclerosis. The long‐term hazard of this practice should be carefully weighed against the risk of waitlist mortality.

19.
After bilateral lung and heart–lung transplantation in adults with pulmonary hypertension, hemodynamic and oxygenation deficiencies are life‐threatening complications that are increasingly managed with extracorporeal life support (ECLS). The primary aim of this retrospective study was to assess 30‐day and 1‐year survival rates in patients managed with vs without post‐operative venoarterial ECLS in 2008–2013. The secondary endpoints were the occurrence rates of nosocomial infection, bleeding, and acute renal failure. Of the 93 patients with pulmonary hypertension who received heart‐lung (n=29) or bilateral lung (n=64) transplants, 28 (30%) required ECLS, initiated a median of 0 [0–6] hours after surgery completion and continued for a median of 3.0 [2.0–8.5] days. Compared to ECLS patients, controls had higher survival at 30 days (95.0% vs 78.5%; P=.02) and 1 year (83% vs 64%; P=.005), fewer nosocomial infections (48% vs 79%; P=.0006), and fewer bleeding events (17% vs 43%; P=.008). The need for renal replacement therapy did not differ between groups (11% vs 17%; P=.54). Venoarterial ECLS is effective in treating pulmonary graft dysfunction with hemodynamic failure after heart‐lung or bilateral lung transplantation. However, ECLS use was associated with higher rates of infection and bleeding.
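Survival percentages at fixed time points, like the 30‐day and 1‐year rates above, are typically Kaplan‐Meier (product‐limit) estimates that account for censored follow‐up. A minimal sketch on invented toy data (times in days):

```python
# Hedged sketch of the Kaplan-Meier product-limit estimator behind
# fixed-time survival figures like those above. At each event time t,
# survival is multiplied by (at_risk - deaths) / at_risk; censored
# patients leave the risk set without an event. Toy data only.

def kaplan_meier(times, events):
    """Return [(time, survival)] at each observed event time.

    times  : follow-up in days
    events : 1 = death observed, 0 = censored
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        same_time = [e for tt, e in data if tt == t]
        deaths = sum(same_time)
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= len(same_time)
        i += len(same_time)
    return curve

times  = [5, 12, 12, 30, 45, 60]   # hypothetical follow-up (days)
events = [1,  1,  0,  0,  1,  0]   # 1 = death, 0 = censored
print(kaplan_meier(times, events))
```

With this toy data the estimate steps down at days 5, 12, and 45; the censored patients at days 12, 30, and 60 shrink the risk set without triggering a step.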

20.
Racial disparities in living donor kidney transplantation (LDKT) persist, but the most effective target to eliminate these disparities remains unknown. One potential target could be delays during completion of the live donor evaluation process. We studied racial differences in progression through the evaluation process for 247 African American (AA) and 664 non‐AA living donor candidates at our center between January 2011 and March 2015. AA candidates were more likely to be obese (38% vs 22%; P < .001), biologically related (66% vs 44%; P < .001), and live ≤50 miles from the center (64% vs 37%; P < .001) than non‐AAs. Even after adjusting for these differences, AAs were less likely to progress from referral to donation (aHR for AA vs non‐AA 0.47, 95% CI 0.26‐0.83; P = .01). We then assessed racial differences in completion of each step of the evaluation process and found disparities in progression from medical screening to in‐person evaluation (aHR 0.62, 95% CI 0.41‐0.94; P = .02) and from clearance to donation (aHR 0.51, 95% CI 0.28‐0.91; P = .02), compared with progression from referral to medical screening (aHR 1.02, 95% CI 0.78‐1.33; P = .95) and from in‐person evaluation to clearance (aHR 0.93, 95% CI 0.59‐1.44; P = .54). Delays may be a manifestation of the transplant candidate's social network; thus, targeted efforts to optimize networks for identification of donor candidates may help address LDKT disparities.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号