Similar Articles
20 similar articles retrieved.
1.
Usage of “large‐for‐size” left lateral segment (LLS) liver grafts in children with a high graft‐to‐recipient weight ratio (GRWR) is controversial due to concerns about increased recipient complications. During the study period, 77 pediatric living donor liver transplantations (LDLTs) with LLS grafts were performed. We compared recipients with GRWR ≥2.5% (GR‐High = 50) vs GRWR <2.5% (GR‐Low = 27). Median age was higher in the GR‐Low group (40 vs 8 months, P < .0001). Graft (GR‐High: 98%, 98%, 98% vs GR‐Low: 96%, 93%, 93%) and patient (GR‐High: 98%, 98%, 98% vs GR‐Low: 100%, 96%, 96%) survival at 1, 3, and 5 years was similar between groups (P = NS). Overall complications were also similar (34% vs 30%; P = .8). Hepatic artery and portal vein thrombosis following transplantation did not differ (P = NS). Delayed abdominal fascia closure was more common in GR‐High patients (17 vs 1; P = .002). Subgroup analysis comparing recipients with GRWR ≥4% (GR‐XL = 20) to GRWR <2.5% (GR‐Low = 27) revealed that delayed abdominal fascia closure was more common in the GR‐XL group, but postoperative complications and graft and patient survival were similar. We conclude that pediatric LDLT with large‐for‐size LLS grafts is associated with excellent clinical outcomes. There is an increased need for delayed abdominal closure with no compromise of long‐term outcomes. The use of high‐GRWR grafts expands the donor pool and improves timely access to the benefits of transplantation without extra risk.
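For context on the grouping above: GRWR is conventionally defined as graft weight divided by recipient body weight, expressed as a percentage. A minimal sketch of that calculation and the study's cut-offs (function names are illustrative, not from the paper):

```python
def grwr_percent(graft_weight_g: float, recipient_weight_kg: float) -> float:
    """Graft-to-recipient weight ratio: graft weight (g) / recipient weight (g) x 100."""
    return graft_weight_g / (recipient_weight_kg * 1000) * 100

def grwr_group(grwr: float) -> str:
    """Grouping used in the study: GR-High if GRWR >= 2.5%; GR-XL subgroup if >= 4%."""
    if grwr >= 4.0:
        return "GR-XL"    # subset of GR-High examined in the subgroup analysis
    if grwr >= 2.5:
        return "GR-High"
    return "GR-Low"

# Example: a 250 g left lateral segment graft into a 6 kg infant -> GRWR ~4.2%
print(grwr_group(grwr_percent(250, 6)))   # "GR-XL"
```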

2.
To assess whether biopsy‐guided selection of kidneys from very old brain‐dead donors enables more successful transplantations, the authors of this multicenter, observational study compared graft survival between 37 recipients of 1 or 2 histologically evaluated kidneys from donors older than 80 years and 198 reference‐recipients of non–histologically evaluated single grafts from donors aged 60 years and younger (transplantation period: 2006‐2013 at 3 Italian centers). During a median (interquartile range) of 25 (13‐42) months, 2 recipients (5.4%) and 10 reference‐recipients (5.1%) required dialysis (crude and donor age‐ and sex‐adjusted hazard ratio [95% confidence interval] 1.55 [0.34‐7.12], P = .576 and 1.41 [0.10‐19.54], P = .798, respectively). Shared frailty analyses confirmed similar outcomes in a 1:2 propensity score study comparing recipients with 74 reference‐recipients matched by center, year, donor, and recipient sex and age. Serum creatinine was similar across groups during 84‐month follow‐up. Recipients had remarkably shorter waiting times than did reference‐recipients and matched reference‐recipients (7.5 [4.0‐19.5] vs 36 [19‐56] and 40 [24‐56] months, respectively, P < .0001 for both comparisons). Mean (± SD) kidney donor risk index was 2.57 ± 0.32 in recipients vs 1.09 ± 0.24 and 1.14 ± 0.24 in reference‐recipients and matched reference‐recipients (P < .0001 for both comparisons). Adverse events were similar across groups. Biopsy‐guided allocation of kidneys from octogenarian donors permits further expansion of the donor organ pool and faster access to a kidney transplant, without increasing the risk of premature graft failure.

3.
The impact of postreperfusion syndrome (PRS) during liver transplantation (LT) using donor livers with significant macrosteatosis is largely unknown. Clinical outcomes of all patients undergoing LT with donor livers with moderate macrosteatosis (30%‐60%) (N = 96) between 2000 and 2017 were compared to propensity score matched cohorts of patients undergoing LT with donor livers with mild macrosteatosis (10%‐29%) (N = 96) and no steatosis (N = 96). Cardiac arrest at the time of reperfusion was seen in eight (8.3%) of the patients in the moderate macrosteatosis group compared to one (1.0%) of the patients in the mild macrosteatosis group (P = .02) and zero (0%) of the patients in the no steatosis group (P = .004). Patients in the moderate macrosteatosis group had higher rates of PRS (37.5% vs 18.8%; P = .004), early allograft dysfunction (EAD) (76.4% vs 25.8%; P < .001), renal dysfunction requiring continuous renal replacement therapy following transplant (18.8% vs 8.3%; P = .03), and return to the OR within 30 days (24.0% vs 7.3%; P = .002) than the no steatosis group. Neither long‐term patient survival (P = .30 and P = .08) nor graft survival (P = .15 and P = .12) differed statistically when comparing the moderate macrosteatosis group to the mild macrosteatosis and no steatosis groups. Recipients of LT using livers with moderate macrosteatosis are at significantly increased risk of PRS. If patients are able to overcome the initial increased perioperative risk of using these donor livers, long‐term graft survival does not appear to differ from that of matched recipients receiving grafts with no steatosis.
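For readers unfamiliar with the study design: propensity score matching pairs each treated subject (here, a recipient of a moderately macrosteatotic liver) with the control whose estimated probability of treatment, given baseline covariates, is closest. A generic 1:1 nearest-neighbor sketch, not the authors' actual matching procedure or variables:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_match(X_treated, X_control):
    """Greedy 1:1 nearest-neighbor matching on the logit of the propensity
    score, without replacement. Rows are subjects, columns are covariates."""
    X = np.vstack([X_treated, X_control])
    y = np.r_[np.ones(len(X_treated)), np.zeros(len(X_control))]
    ps = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    logit = np.log(ps / (1 - ps))
    t, c = logit[: len(X_treated)], logit[len(X_treated):]
    available = set(range(len(X_control)))
    pairs = []
    for i in np.argsort(t):                       # deterministic order
        if not available:
            break
        j = min(available, key=lambda k: abs(t[i] - c[k]))
        pairs.append((i, j))                      # (treated index, control index)
        available.remove(j)
    return pairs
```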

4.
Thirty percent of kidney transplant recipients are readmitted in the first month posttransplantation. Those with donor‐specific antibody requiring desensitization and incompatible live donor kidney transplantation (ILDKT) constitute a unique subpopulation that might be at higher readmission risk. Drawing on a 22‐center cohort, 379 ILDKTs with Medicare primary insurance were matched to compatible transplant matched controls and to waitlist‐only matched controls on panel reactive antibody, age, blood group, renal replacement time, prior kidney transplantation, race, gender, diabetes, and transplant date/waitlisting date. Readmission risk was determined using multilevel, mixed‐effects Poisson regression. In the first month, ILDKTs had a 1.28‐fold higher readmission risk than compatible controls (95% confidence interval [CI] 1.13‐1.46; P < .001). Risk peaked at 6‐12 months (relative risk [RR] 1.67, 95% CI 1.49‐1.87; P < .001), attenuating by 24‐36 months (RR 1.24, 95% CI 1.10‐1.40; P < .001). ILDKTs had a 5.86‐fold higher readmission risk (95% CI 4.96‐6.92; P < .001) in the first month compared to waitlist‐only controls. At 12‐24 (RR 0.85, 95% CI 0.77‐0.95; P = .002) and 24‐36 months (RR 0.74, 95% CI 0.66‐0.84; P < .001), ILDKTs had a lower risk than waitlist‐only controls. These findings, a higher readmission risk for ILDKTs than for compatible controls but a lower readmission risk after the first year than for waitlist‐only controls, should be considered in regulatory/payment schemas and in planning clinical care.
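The readmission analysis above used multilevel mixed-effects Poisson regression; the core idea of estimating a readmission rate ratio can be illustrated with a simplified single-level Poisson GLM with a follow-up-time offset. The data below are simulated, and the sketch omits the matching and random effects of the actual study:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
ildkt = rng.integers(0, 2, n).astype(float)   # 1 = incompatible LDKT, 0 = compatible control
followup = rng.uniform(1, 36, n)              # months at risk (exposure time)
# Simulate readmission counts with a true rate ratio of exp(0.25) ~ 1.28 for ILDKT
counts = rng.poisson(0.05 * followup * np.exp(0.25 * ildkt))

fit = sm.GLM(counts, sm.add_constant(ildkt),
             family=sm.families.Poisson(),
             offset=np.log(followup)).fit()
print(np.exp(fit.params[1]))   # estimated readmission rate ratio, ILDKT vs compatible
```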

5.
Organ shortage continues to challenge the field of transplantation. One potential group of donors consists of those who have been transplant recipients themselves, or Organ Donation After Transplant (ODAT) donors. We conducted a retrospective cohort study to describe ODAT donors and to compare outcomes of ODAT grafts versus conventional grafts. From October 1, 1987 to June 30, 2015, 517 former recipients successfully donated 803 organs for transplant. Former kidney recipients generally survived a median of approximately 4 years before becoming an ODAT donor, whereas liver, lung, and heart recipients generally survived less than a month prior to donation. In the period June 1, 2005 to December 31, 2014, liver grafts from ODAT donors had a significantly higher risk of graft failure compared to non‐ODAT liver transplants (P = .008). Kidney grafts donated by ODAT donors whose initial transplant occurred >1 year prior were associated with significantly increased graft failure (P = .012). Despite the increased risk of graft failure amongst certain ODAT grafts, 5‐year survival was still high. ODAT donors should be considered another form of expanded criteria donor under these circumstances.

6.
Livers from older donors (OLDs; age ≥70) are risky and often declined; however, it is likely that some candidates will benefit from OLDs versus waiting for younger ones. To characterize the survival benefit of accepting OLD grafts, we used 2009‐2017 SRTR data to identify 24 431 adult liver transplant (LT) candidates who were offered OLD grafts eventually accepted by someone. Outcomes from the time of offer were compared between candidates who accepted an OLD graft and matched controls within MELD ± 2 who declined the same offer. Candidates who accepted OLD grafts (n = 1311) were older (60.5 vs. 57.8 years, P < .001), had a higher median MELD score (25 vs. 22, P < .001), and were less likely to have hepatitis C cirrhosis (14.9% vs. 31.2%, P < .001). Five‐year cumulative mortality among those who accepted versus declined the same OLD offer was 23.4% versus 41.2% (P < .001). Candidates who accepted OLDs experienced an almost twofold reduction in mortality (aHR 0.52, 95% CI 0.45‐0.59, P < .001) compared to those who declined the same offer, especially among the highest‐MELD (35‐40) candidates (aHR 0.24, 95% CI 0.10‐0.55, P = .001). Accepting an OLD offer provided substantial long‐term survival benefit compared to waiting for a better organ offer, notably among candidates with MELD 35‐40. Providers should consider these benefits as they evaluate OLD graft offers.

7.
Normothermic ex vivo kidney perfusion (NEVKP) represents a novel approach for graft preservation and functional improvement in kidney transplantation. We investigated whether NEVKP also allows graft quality assessment before transplantation. Kidneys from 30‐kg pigs were recovered in a model of heart‐beating donation (group A) or after 30 minutes (group B) or 60 minutes (group C) of warm ischemia (n = 5/group). After 8 hours of NEVKP, contralateral kidneys were resected, grafts were autotransplanted, and the pigs were followed for 3 days. After transplantation, renal function measured based on peak serum creatinine differed significantly among groups (P < .05). Throughout NEVKP, intrarenal resistance was lowest in group A and highest in group C (P < .05). Intrarenal resistance at the initiation of NEVKP correlated with postoperative renal function (P < .001 at NEVKP hour 1). Markers of acid‐base homeostasis (pH, HCO3, base excess) differed among groups (P < .05) and correlated with posttransplantation renal function (P < .001 for pH at NEVKP hour 1). Similarly, lactate and aspartate aminotransferase were lowest in noninjured grafts versus donation after circulatory death kidneys (P < .05) and correlated with posttransplantation kidney function (P < .001 for lactate at NEVKP hour 1). In conclusion, assessment of perfusion characteristics and clinically available perfusate biomarkers during NEVKP allows the prediction of posttransplantation graft function. Thus, NEVKP might allow decision‐making regarding whether grafts are suitable for transplantation.

8.
The incidence of delayed graft function (DGF) is very high in our center (70%‐80%), and we usually receive a kidney for transplant after more than 22 hours of static cold ischemia time (CIT). Inadequate donor care also contributes to the high rate of DGF. We decided to test whether machine perfusion (MP) after a period of CIT improved the outcome of our transplant patients. We analyzed the incidence of DGF, its duration, and the length of hospital stay (LOS) in patients who received a kidney preserved with MP after a period of CIT (hybrid perfusion, HP). We included 54 deceased donor kidneys preserved with HP transplanted from Feb/13 to Jul/14, and compared them to 101 kidney transplants preserved by static cold storage (CS) from Nov/08 to May/12. The median pumping time was 11 hours. DGF incidence was 61.1% vs 79.2% (P = .02), median DGF duration was 5 vs 11 days (P < .001), and median LOS was 13 vs 18 days (P = .011) for the HP group compared to the CS group. The observed reduction of DGF with machine perfusion did not occur in donors over 50 years old. In multivariate analysis, risk factors for DGF, adjusted for CIT, were donor age (OR, 1.04; P = .005) and nonuse of MP (OR, 1.54; P = .051). In conclusion, the use of HP contributed to faster recovery of renal function and to a shorter length of hospital stay.

9.
Steatotic donor livers (SDLs) (macrosteatosis ≥30%) represent a possible donor pool expansion, but are frequently discarded due to a historical association with mortality and graft loss. However, changes in recipient/donor demographics, allocation policy, and clinical protocols might have altered utilization and outcomes of SDLs. We used Scientific Registry of Transplant Recipients data from 2005 to 2017 and adjusted multilevel regression to quantify temporal trends in discard rates (logistic) and posttransplant outcomes (Cox) of SDLs, accounting for Organ Procurement Organization–level variation. Of 4346 recovered SDLs, 58.0% were discarded in 2005, versus only 43.1% in 2017 (P < .001). SDLs were always substantially more likely to be discarded than non‐SDLs, although this difference decreased over time (adjusted odds ratio 15.28 [95% CI 13.15‐17.74] in 2005‐2007; 13.41 [95% CI 11.77‐15.29] in 2008‐2011; 11.37 [95% CI 9.87‐13.10] in 2012‐2014; 8.89 [95% CI 7.79‐10.15] in 2015‐2017; P < .001 for all). Conversely, posttransplant outcomes of recipients of SDLs improved over time: recipients of SDLs from 2012 to 2017 had 46% lower risk of mortality (adjusted hazard ratio [aHR] 0.54, 95% CI 0.43‐0.68, P < .001) and 47% lower risk of graft loss (aHR 0.53, 95% CI 0.42‐0.67, P < .001) compared to 2005 to 2011. In fact, in 2012 to 2017, recipients of SDLs had equivalent mortality (aHR 1.04, 95% CI 0.90‐1.21, P = .6) and graft loss (aHR 1.04, 95% CI 0.90‐1.20, P = .6) to recipients of non‐SDLs. Increasing utilization of SDLs might be a reasonable strategy to expand the donor pool.

10.
We aimed to determine the long‐term outcomes of eculizumab‐treated, positive crossmatch (+XM) kidney transplant recipients compared with +XM and age‐matched negative crossmatch (−XM) controls. We performed an observational retrospective study and examined allograft survival, histologic findings, long‐term B‐cell flow cytometric XM (BFXM), and allograft‐loss–associated factors. The mean (SD) posttransplant follow‐up was 6.3 (2.5) years in the eculizumab group; 7.6 (3.5), +XM control group; and 7.9 (2.5), −XM control group. The overall and death‐censored allograft survival rates were similar in the +XM groups (P = .73 and P = .48) but reduced compared with −XM control patients (P < .001 for both). In the eculizumab‐treated group, 57.9% (11/19) of the allografts had chronic antibody‐mediated rejection, but death‐censored allograft survival was 76.6% at 5 years and 75.4% at 7 years. Baseline IgG3 positivity and BFXM ≥300 were associated with allograft loss. C1q positivity was also associated with allograft loss but did not reach statistical significance. Donor‐specific antibodies appeared to decrease in eculizumab‐treated patients. After excluding patients with posttransplant plasmapheresis, 42.3% (9/21) had negative BFXMs and 31.8% (7/22) had completely negative single‐antigen beads 1 year posttransplant. Eculizumab‐treated +XM patients had reduced allograft survival compared with −XM controls but similar survival to +XM controls. BFXM and complement‐activating donor‐specific antibodies (by IgG3 and C1q testing) may be used for risk stratification in +XM transplantation.

11.
The aim of this study was to produce a prognostic model to help predict posttransplant survival in patients transplanted with grade‐3 acute‐on‐chronic liver failure (ACLF‐3). Patients with ACLF‐3 who underwent liver transplantation (LT) between 2007 and 2017 in 5 transplant centers were included (n = 152). Predictors of 1‐year mortality were retrospectively screened and tested on a single‐center training cohort and subsequently tested on an independent multicenter cohort composed of the 4 other centers. Four independent pretransplant risk factors were associated with 1‐year mortality after transplantation in the training cohort: age ≥53 years (P = .044), pre‐LT arterial lactate level ≥4 mmol/L (P = .013), mechanical ventilation with PaO2/FiO2 ≤ 200 mm Hg (P = .026), and pre‐LT leukocyte count ≤10 G/L (P = .004). A simplified version of the model was derived by assigning 1 point to each risk factor: the Transplantation for ACLF‐3 Model (TAM) score. A cut‐off at 2 points distinguished a high‐risk group (score >2) from a low‐risk group (score ≤2), with 1‐year survival of 8.3% vs 83.9%, respectively (P < .001). This model was subsequently validated in the independent multicenter cohort. The TAM score can help stratify posttransplant survival and identify an optimal transplantation window for patients with ACLF‐3.
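Because the abstract fully specifies the scoring rule, the TAM score can be transcribed directly; variable names in this sketch are mine, not the authors':

```python
def tam_score(age_years: float, lactate_mmol_per_l: float,
              ventilated_pf_le_200: bool, leukocytes_g_per_l: float) -> int:
    """Transplantation for ACLF-3 Model (TAM): one point per pretransplant risk factor."""
    return (
        int(age_years >= 53)
        + int(lactate_mmol_per_l >= 4)
        + int(ventilated_pf_le_200)        # mechanical ventilation with PaO2/FiO2 <= 200 mm Hg
        + int(leukocytes_g_per_l <= 10)
    )

def tam_risk_group(score: int) -> str:
    """Cut-off at 2 points: >2 high risk (8.3% 1-y survival), <=2 low risk (83.9%)."""
    return "high risk" if score > 2 else "low risk"

# Example: 60 years old, lactate 5.1 mmol/L, ventilated with PaO2/FiO2 150, WBC 8 G/L -> score 4
print(tam_risk_group(tam_score(60, 5.1, True, 8)))   # "high risk"
```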

12.

Background

We analyzed our outcomes utilizing imported allografts as a strategy to shorten wait list time for pancreas transplantation.

Methods

This is an observational retrospective cohort study of 26 recipients who received either a locally procured (n = 16) or an imported (n = 10) pancreas graft at our center between January 2014 and May 2017. Wait list times of this cohort were compared to those of UNOS Region 9 (New York State and Western Vermont). Hospital financial data were also reviewed to analyze the cost‐effectiveness of this strategy.

Results

Imported pancreas grafts had significantly longer cold ischemia times (CIT) and higher peak lipase (PL) levels than locally procured grafts (CIT: 827 vs 497 minutes, P = .001; PL: 563 vs 157 U/L, P = .023). There were no differences in graft or patient survival. The median wait time was significantly shorter for simultaneous kidney‐pancreas transplants at our center (518 days, n = 21) than in Region 9 (1001 days, n = 65; P = .038). Despite financial concerns, the cost of transport for imported grafts was offset by lower standard acquisition costs.

Conclusions

Imported pancreas grafts may be a cost‐effective strategy to increase organ utilization and shorten wait times in regions with longer waiting times.

13.
Hearts from older donors are increasingly utilized for transplantation due to unmet demand. Conflicting evidence exists regarding the prognosis of recipients of advanced‐age donor hearts, especially in young recipients. A retrospective analysis was performed on 11 433 patients aged 18 to 45 who received a cardiac transplant from 2000 to 2017. Overall, 10 279 patients received hearts from donors younger than 45 and 1145 from donors older than 45. Recipients of older donor hearts were older (37 vs. 34 years, P < .01) and had higher rates of inotropic dependence (48% vs. 42%, P < .01). However, the groups were similar in terms of comorbidities and dependence on mechanical circulatory support. Median survival for recipients of older donor hearts was reduced by 2.6 years (12.6 vs. 15.2, P < .01). Multivariable analysis demonstrated donor age greater than 45 to be a predictor of mortality (HR 1.18 [1.05‐1.33], P = .01). However, when restricting the analysis to patients who received a heart from a donor with a negative preprocurement angiogram, donor age had only a borderline association with mortality (HR 1.20 [0.98‐1.46], P = .06). Older donor hearts in young recipients are associated with decreased long‐term survival; however, this risk is reduced with donors without atherosclerosis. The long‐term hazard of this practice should be carefully weighed against the risk of waitlist mortality.

14.
US Pediatric Heart Allocation Policy was recently revised, deprioritizing candidates with cardiomyopathy while maintaining status 1A eligibility for congenital heart disease (CHD) candidates on “high‐dose” inotropes. We compared waitlist characteristics and mortality around this change. Status 1A listings decreased (70% to 56%, P < .001) and CHD representation increased among status 1A listings (48% vs 64%, P < .001). Waitlist mortality overall (subdistribution hazard ratio [SHR] 0.96, P = .63) and among status 1A candidates (SHR 1.16, P = .14) were unchanged. CHD waitlist mortality trended better (SHR 0.82, P = .06) but was unchanged for CHD candidates listed status 1A (SHR 0.92, P = .47). Status 1A listing exceptions increased 2‐ to 3‐fold among hypertrophic and restrictive cardiomyopathy candidates and 13.5‐fold among dilated cardiomyopathy (DCM) candidates. Hypertrophic (SHR 6.25, P = .004) and restrictive (SHR 3.87, P = .03) cardiomyopathy candidates without status 1A exception had increased waitlist mortality, but those with DCM did not (SHR 1.26, P = .32). Ventricular assist device (VAD) use increased only among DCM candidates ≥1 year old (26% vs 38%, P < .001). Current allocation policy has increased CHD status 1A representation but has not improved their waitlist mortality. Excessive DCM status 1A listing exceptions and continued status 1A prioritization of children on stable VADs potentially diminish the intended benefits of the policy revision.

15.
We tested the efficacy of religiously tailored and ethically balanced education upon living kidney organ donation intent among Muslim Americans. Efficacy was judged by pre-post changes in participants' stage of change, preparedness, and likelihood to donate. Among 137 participants, mean stage of change toward donation appeared to improve (0.59; SD ± 1.07, P < .0001), as did the group's preparedness to make a donation decision (0.55; SD ± 0.86, P < .0001) and likelihood to donate a kidney (0.39; SD ± 0.85, P < .0001). Mean change in likelihood to encourage a loved one, a co-worker, or a mosque community member with ESRD to seek a living donor also increased (0.22; SD ± 0.84, P = .0035; 0.23; SD ± 0.82, P = .0021; 0.33; SD ± 0.79, P < .0001, respectively). Multivariate ordered logistic regression models revealed that gains in biomedical knowledge regarding organ donation increased the odds of positive change in preparedness (OR = 1.20; 95% CI 1.01-1.41, P = .03), while increasing age was associated with lower odds of positive change in stage of change (OR = 0.98, 95% CI 0.96-0.998, P = .03), and prior registration as an organ donor lowered the odds of an increase in likelihood to donate a kidney (OR = 0.22; 95% CI 0.08-0.60, P = .003). Our intervention appears to enhance living kidney donation-related intent among Muslim Americans [Clinicaltrials.gov number: NCT04443114].

16.
Inflammation in fibrosis areas (i‐IF/TA) of kidney allografts is associated with allograft loss; however, its diagnostic significance remains to be determined. We investigated the clinicohistologic phenotype and determinants of i‐IF/TA in a prospective cohort of 1539 kidney recipients undergoing evaluation of i‐IF/TA and tubulitis in atrophic tubules (t‐IF/TA) on protocol allograft biopsies performed at 1 year posttransplantation. We considered donor, recipient, and transplant characteristics, immunosuppression, and histological diagnoses in 2260 indication biopsies performed within the first year posttransplantation. Nine hundred forty‐six (61.5%) patients presented with interstitial fibrosis/tubular atrophy (IF/TA Banff grade > 0) at 1 year posttransplant, among whom 394 (41.6%) showed i‐IF/TA. i‐IF/TA correlated with concurrent t‐IF/TA (P < .001), interstitial inflammation (P < .001), tubulitis (P < .001), total inflammation (P < .001), peritubular capillaritis (P < .001), interstitial fibrosis (P < .001), and tubular atrophy (P = .02). The independent determinants of i‐IF/TA were previous T cell–mediated rejection (TCMR) (P < .001), BK virus nephropathy (P = .007), steroid therapy (P = .039), calcineurin inhibitor therapy (P = .011), inosine‐5′‐monophosphate dehydrogenase inhibitor therapy (P = .011), HLA‐B mismatches (P = .012), and HLA‐DR mismatches (P = .044). TCMR patients with i‐IF/TA on posttreatment biopsy (N = 83/136, 61.0%) exhibited accelerated progression of IF/TA over time (P = .01) and decreased 8‐year allograft survival (70.8% vs 83.5%, P = .038) compared to those without posttreatment i‐IF/TA. Our results support that i‐IF/TA may represent a manifestation of chronic active TCMR.

17.
To implement split liver transplantation (SLT), a mandatory‐split policy has been in place in Italy since August 2015: donors aged 18‐50 years at standard risk are offered for SLT, yielding a left‐lateral segment (LLS) graft for children and an extended‐right graft (ERG) for adults. We aimed to analyze the impact of the new mandatory‐split policy on the liver transplantation (LT) waiting list and SLT outcomes, compared to the old allocation policy. Between August 2015 and December 2016, of 413 potentially “splittable” donors, 252 (61%) were proposed for SLT; 53 (21%) of these donors were accepted for SLT, whereas 101 (40.1%) were excluded because of donor characteristics and 98 (38.9%) for absence of suitable pediatric recipients. The SLT rate increased from 6% to 8.4%. Children undergoing SLT increased from 49.3% to 65.8% (P = .009) and the pediatric LT waiting list time dropped (229 [10‐2121] vs 80 [12‐2503] days; P = .045). Pediatric (4.5% vs 2.5%; P = .398) and adult (9.7% vs 5.2%; P < .001) LT waiting list mortality decreased; SLT outcomes remained stable. Retransplantation (HR = 2.641, P = .035) and recipient weight >20 kg (HR = 5.113, P = .048) for LLS grafts, and ischemic time >8 hours (HR = 2.475, P = .048) for ERGs, were identified as predictors of graft failure. A national mandatory‐split policy maximizes SLT donor resources, whose selection criteria can be safely expanded, with a favorable impact on the pediatric LT waiting list and priority for sick adult LT candidates.

18.
Intra‐patient variability (IPV) of tacrolimus trough levels has been associated with poor outcome after kidney transplantation. These findings were derived from single‐center analyses and restricted mainly to measurements early after transplantation. We analyzed in a multicenter effort whether high IPV of tacrolimus levels at posttransplant years 1, 2, and 3 was associated with impaired clinical outcome. More than 6600 patients who received a deceased donor kidney transplant during 2000‐2014 and had a functioning graft for >3 years were studied. Graft survival was significantly impaired with increasing IPV (P < .001). As compared to patients with a low IPV of <30%, the risk of graft loss during years 4‐6 increased by 32% in patients with an IPV of 30% to 44% and by 66% in patients with an IPV of ≥45% (P = .002 and P < .001, respectively). About one third of patients showed an IPV of ≥30%, with substantially impaired outcome. Even in patients with good outcomes during the first 3 posttransplant years, a high IPV was associated with inferior graft survival. Our data indicate that a fluctuating tacrolimus trough level at years 1, 2, and 3 posttransplant is a major problem in kidney transplantation.
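The abstract does not state how IPV was computed; in this literature it is usually the coefficient of variation (CV) of a patient's tacrolimus trough levels. A sketch under that assumption, using the study's risk bands:

```python
import statistics

def ipv_percent(trough_levels):
    """Intra-patient variability as coefficient of variation (%):
    sample SD of tacrolimus trough levels divided by their mean."""
    mean = statistics.fmean(trough_levels)
    return statistics.stdev(trough_levels) / mean * 100

def ipv_band(ipv):
    """Bands used in the study: <30% low, 30%-44% intermediate, >=45% high."""
    if ipv < 30:
        return "<30%"
    return "30-44%" if ipv < 45 else ">=45%"

# Fluctuating troughs (ng/mL) -> CV ~34%, i.e. the intermediate-risk band
print(ipv_band(ipv_percent([6.2, 9.8, 4.5, 11.0, 7.3])))   # "30-44%"
```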

19.
In the United States, kidney donation from international (noncitizen/nonresident) living kidney donors (LKDs) is permitted; however, given the heterogeneity of healthcare systems, concerns remain regarding the international LKD practice and recipient outcomes. We studied a US cohort of 102 315 LKD transplants from 2000‐2016, including 2088 international LKDs, as reported to the Organ Procurement and Transplantation Network. International LKDs were more tightly clustered among a small number of centers than domestic LKDs (Gini coefficient 0.76 vs 0.58, P < .001). Compared with domestic LKDs, international LKDs were more often young, male, Hispanic or Asian, and biologically related to their recipient (P < .001). Policy‐compliant donor follow‐up was substantially lower for international LKDs at 6, 12, and 24 months postnephrectomy (2015 cohort: 45%, 33%, 36% vs 76%, 71%, 70% for domestic LKDs, P < .001). Among international LKDs, Hispanic (aOR 0.36, 95% CI 0.23‐0.56, P < .001) and biologically related (aOR 0.59, 95% CI 0.39‐0.89, P < .01) donors were more compliant in donor follow‐up than white and unrelated donors. Recipients of international living donor kidney transplants (LDKTs) had similar graft failure (aHR 0.89, 95% CI 0.78‐1.02, P = .1) but lower mortality (aHR 0.62, 95% CI 0.53‐0.72, P < .001) compared with recipients of domestic LDKTs, after adjusting for recipient, transplant, and donor factors. International LKDs may provide an alternative opportunity for living donation. However, efforts to improve international LKD follow‐up and engagement are warranted.

20.
The impact of a new national kidney allocation system (KAS) on access to the national deceased‐donor waiting list (waitlisting) and racial/ethnic disparities in waitlisting among US end‐stage renal disease (ESRD) patients is unknown. We examined waitlisting pre‐ and post‐KAS among incident (N = 1 253 100) and prevalent (N = 1 556 954) ESRD patients from the United States Renal Data System database (2005‐2015) using multivariable time‐dependent Cox and interrupted time‐series models. The adjusted waitlisting rate among incident patients was 9% lower post‐KAS (hazard ratio [HR]: 0.91; 95% confidence interval [CI], 0.90‐0.93), although preemptive waitlisting increased from 30.2% to 35.1% (P < .0001). The waitlisting decrease is largely due to a decline in inactively waitlisted patients. Pre‐KAS, blacks had a 19% lower waitlisting rate vs whites (HR: 0.81; 95% CI, 0.80‐0.82); following KAS, the disparity declined to 12% (HR: 0.88; 95% CI, 0.85‐0.90). In adjusted time‐series analyses of prevalent patients, waitlisting rates declined by 3.45/10 000 per month post‐KAS (P < .001), resulting in ≈146 fewer waitlisting events/month. Shorter dialysis vintage was associated with greater decreases in waitlisting post‐KAS (P < .001). The racial disparity reduction was due in part to a steeper decline in inactive waitlisting among minorities and a greater proportion of actively waitlisted minority patients. Waitlisting and racial disparity in waitlisting declined post‐KAS; however, disparity remains.
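The prevalent-patient estimate above (a decline of 3.45 waitlistings per 10 000 per month post-KAS) is the kind of coefficient an interrupted time-series (segmented) regression produces: the monthly rate is modeled on time, a post-policy indicator, and their interaction. A minimal sketch with simulated data (not USRDS data; the policy-change index is illustrative):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
months = np.arange(132)                    # 2005-2015, monthly
post = (months >= 119).astype(float)       # KAS implemented December 2014 (illustrative index)
# Simulated waitlisting rate per 10,000 patients with a post-KAS level and slope change
rate = 50 + 0.1 * months - 8 * post - 0.5 * post * (months - 119) + rng.normal(0, 2, 132)

X = sm.add_constant(np.column_stack([months, post, post * (months - 119)]))
fit = sm.OLS(rate, X).fit()
print(fit.params)   # [baseline level, pre-KAS slope, level change, post-KAS slope change]
```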
