Similar Articles
20 similar articles found (search time: 171 ms)
1.
Despite continuous total quality management (TQM) and traditional weekly morbidity and mortality (M&M) conferences, our liver transplant program survival rates became lower than expected according to national benchmarking standards. Data from the U.S. Scientific Registry of Transplant Recipients (SRTR), the organization contracted by the federal government to manage statistics reported by all U.S. transplant centers, showed that during 7/1/98 to 6/30/00, our 1-year graft (76.86%, P = 0.230) and patient (80.61%, P = 0.016) survival was lower than expected compared with national expected rates (graft 81.89% and patient 88.30%). In response, our program added root cause analysis (RCA) to our quality improvement process. RCA is a method of identifying causal factors that underlie variation in performance. Using RCA, two of our liver transplant surgeons performed a systematic review of all liver transplant patient deaths in our center from 1/95 to 8/00 (86 of 372 patients transplanted). All phases of the transplant process, including recipient and donor selection, transplant procedure, and follow-up care (including psychosocial issues), were reviewed to determine specific events from each phase that led to an adverse outcome. For the 86 deaths, 162 root causes were identified. The apportionment was as follows: recipient and donor selection, 58%; transplant procedure, 24%; follow-up care, 18%. The top four root causes were obesity, surgical anatomy issues, pulmonary events, and cardiac events, all relating to patient selection. Multiple root causes in the cases reviewed led to futile liver transplants. Early in 2001, our program conducted in-services and instituted protocol changes according to RCA findings. In April 2004, SRTR data revealed that for patients transplanted between 1/01/01 and 6/30/03, our 1-year liver graft survival of 90.73% is now significantly higher (P = 0.018) compared to the national expected rate of 84.48%. Our 1-year patient survival rate of 92.66% is higher than the expected rate of 89.29%, although not significantly (P = 0.285). In conclusion, periodic RCA of adverse events should be added to the TQM efforts and M&M conferences of programs encompassing multiple medical services.
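The benchmarking comparison described above reduces to testing whether a center's observed 1-year survival differs from a national expected rate. A minimal sketch in Python follows, using a simple exact binomial test with hypothetical patient counts; SRTR's actual evaluations are risk-adjusted, so this is illustrative only:

```python
# Sketch: compare a center's observed 1-year patient survival with the
# national expected rate. SRTR's real evaluations are risk-adjusted;
# this exact binomial test is a simplified stand-in. Counts are hypothetical.
from scipy.stats import binomtest

n_patients = 98          # hypothetical number at risk in the report window
survivors = 79           # hypothetical count, ~80.6% observed survival
expected_rate = 0.8830   # national expected 1-year patient survival

result = binomtest(survivors, n_patients, p=expected_rate, alternative="less")
print(f"observed survival: {survivors / n_patients:.2%}")
print(f"one-sided P vs expected {expected_rate:.2%}: {result.pvalue:.3f}")
```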

2.
Every 6 months, the Scientific Registry of Transplant Recipients (SRTR) publishes evaluations of every solid organ transplant program in the United States, including evaluations of 1-year patient and graft survival. The Centers for Medicare & Medicaid Services (CMS) and the Organ Procurement and Transplantation Network (OPTN) Membership and Professional Standards Committee (MPSC) use SRTR's 1-year evaluations for regulatory review of transplant programs. Concern has been growing that the regulatory scrutiny of transplant programs with lower-than-expected outcomes is harmful, causing programs to undertake fewer high-risk transplants and leading to unnecessary organ discards. As a result, CMS raised its threshold for a "Condition-Level Deficiency" designation of observed relative to expected 1-year graft or patient survival from 1.50 to 1.85. Exceeding this threshold in the current SRTR outcomes report and in one of the four previous reports leads to scrutiny that may result in loss of Medicare funding. For its part, OPTN is reviewing a proposal from the MPSC to also change its performance criteria thresholds for program review, to review programs with "substantive clinical differences." We review the details and implications of these changes in transplant program oversight.
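The CMS rule summarized above can be stated as a simple check on observed-to-expected (O/E) event ratios across reporting periods. A minimal sketch, with made-up event counts; the `oe_ratio` and `condition_level_flag` helpers are hypothetical names, not CMS or SRTR code:

```python
# Sketch of the CMS flagging rule described above. A program is flagged when
# observed/expected (O/E) 1-year graft or patient failure exceeds 1.85 in the
# current SRTR report and in one of the four previous reports. Helper names
# and event counts are hypothetical.
THRESHOLD = 1.85  # raised by CMS from 1.50

def oe_ratio(observed_events: float, expected_events: float) -> float:
    """Observed-to-expected event ratio for one reporting period."""
    return observed_events / expected_events

def condition_level_flag(current: float, previous_four: list) -> bool:
    """True if current O/E and at least one of the prior four exceed 1.85."""
    return current > THRESHOLD and any(r > THRESHOLD for r in previous_four)

history = [1.20, 1.92, 1.41, 1.60]             # O/E from the last 4 reports
print(condition_level_flag(oe_ratio(13, 6.5), history))  # 2.00 now -> True
```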

3.
BACKGROUND: Sirolimus (SRL) may increase the incidence of or prolong delayed graft function (DGF) after cadaveric renal transplantation. This study compares transplant outcomes of SRL-based induction immunosuppression (IS) with other calcineurin-inhibitor (CNI) sparing regimens in the DGF setting. METHODS: Adult cadaveric renal-transplant recipients who received transplants between January 1, 1997 and June 30, 2001 and experienced DGF (n=132) were divided into three groups by induction IS: A, depleting antibody (n=41); B, SRL (n=49); and C, neither (n=42). All recipients also received steroids and mycophenolate mofetil with delayed initiation of CNIs when good renal function returned. Patient survival, graft survival, and time to rejection within 1 year of transplantation were assessed by Kaplan-Meier analysis. One-year graft function was compared using Kruskal-Wallis and Fisher's exact tests. RESULTS: The SRL group had longer DGF duration (P=0.01). The three groups had comparable patient (P=0.27) and graft survival (P=0.69), but the depleting antibody group experienced less rejection (P=0.004). There were no clinically significant differences in 1-year graft function. CONCLUSIONS: In our analysis of a large and modern cohort of adult cadaveric transplant recipients with DGF, induction immunosuppression with a depleting antibody preparation reduced rejection, whereas SRL prolonged DGF duration. All three CNI-sparing induction IS regimens resulted in comparable patient survival, graft survival, and graft function.
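This abstract names its statistical toolkit directly: Kaplan-Meier estimates for survival, Kruskal-Wallis for 1-year graft function, and Fisher's exact test for categorical outcomes. A minimal sketch on synthetic data, assuming the `lifelines` and `scipy` packages; none of the numbers below are the study's data:

```python
# Sketch of the named analyses on synthetic data: Kaplan-Meier survival per
# induction group, Kruskal-Wallis for 1-year creatinine, Fisher's exact test
# for rejection. Requires the lifelines and scipy packages.
import numpy as np
from lifelines import KaplanMeierFitter
from scipy.stats import kruskal, fisher_exact

rng = np.random.default_rng(0)
group_sizes = {"A_depleting_ab": 41, "B_sirolimus": 49, "C_neither": 42}

for name, n in group_sizes.items():
    t = rng.exponential(1200, n)               # synthetic days to graft loss
    event = t < 365                            # loss observed within 1 year
    kmf = KaplanMeierFitter()
    kmf.fit(np.minimum(t, 365), event_observed=event, label=name)
    print(name, "1-yr graft survival ~", round(kmf.predict(365.0), 3))

creatinine = [rng.normal(1.5, 0.4, n) for n in group_sizes.values()]
print("Kruskal-Wallis P =", round(kruskal(*creatinine).pvalue, 3))

# 2x2 table: rejection yes/no in depleting-antibody vs. other recipients
print("Fisher exact P =", round(fisher_exact([[8, 33], [25, 66]])[1], 3))
```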

4.
BACKGROUND: Hepatitis C virus (HCV) recurrence in HCV+ liver transplant recipients is almost inevitable and may be promoted by immunosuppression. We compared the amount of liver damage with regard to usage of steroids and basiliximab. METHODS: A total of 140 HCV+ adult liver transplant recipients were randomly allocated to basiliximab + steroids or basiliximab + placebo (plus cyclosporine and azathioprine). Primary endpoint: hepatitis C histological recurrence (liver damage, defined as an Ishak grading score ≥8 on biopsy at 12 months); secondary endpoints: treatment failure (death, graft loss, patient withdrawal), biopsy-proven acute rejection (BPAR), treated acute rejection (tAR), and allograft and patient survival rates at 12 months. RESULTS: No significant difference was observed in the 12-month hepatitis C histological recurrence rate (41.2% basiliximab + steroids, 37.5% basiliximab + placebo, P = 0.354). The treatment failure rate was significantly higher with basiliximab + steroids (28.8%) than with basiliximab + placebo (15.6%), P = 0.03; the combination test for the evaluation of the joint hypothesis resulted in a borderline nonsignificant overall result (P = 0.059). The BPAR rate was significantly lower in the group treated with steroids (24.3% basiliximab + steroids, 39.4% basiliximab + placebo, P = 0.04), while the tAR rate was similar (29.7% basiliximab + steroids and 37.9% basiliximab + placebo). No significant differences in 1-year graft and patient survival rates were observed (72.9% and 84.8% basiliximab + steroids; 81.5% and 89.0% basiliximab + placebo). CONCLUSIONS: Results suggest that steroid-free therapy is associated with a significantly lower treatment failure rate, although the histological recurrence rate of hepatitis C is similar in the two groups. This benefit is not offset by an evident increase in graft rejection rate requiring treatment.

5.
Biliary atresia is the most common indication for orthotopic liver transplantation (OLT) in the pediatric population. The outcomes of liver transplantation for biliary atresia, however, have not been formally examined on a national scale. The objective of this study was to identify pretransplant variables that predict patient survival after primary liver transplantation for biliary atresia. A cohort of 1,976 pediatric patients undergoing primary liver transplantation for biliary atresia from 1/1988 to 12/2003 was enrolled from the United Network for Organ Sharing database after excluding patients with a history of multiorgan transplant or previous liver transplant. Follow-up data up to 16 years post-OLT were available. The 5- and 10-year actuarial survival rates of patients who underwent liver transplantation for biliary atresia in the United States are 87.2% and 85.8%, respectively, and the 5- and 10-year graft actuarial survival rates are 76.2% and 72.7%, respectively. Early deaths (≤90 days post-OLT) were more often caused by graft failure (P = 0.01), whereas late deaths (>90 days post-OLT) were more often due to malignancy (P < 0.01). An analysis of outcomes over time demonstrated a decrease in post-OLT survival and an increase in the number of OLTs done for biliary atresia at an increasing number of centers. A multivariate analysis revealed that cadaveric partial/reduced liver grafts, a history of life support at the time of OLT, and decreased age were independent predictors of increased post-OLT mortality. In conclusion, OLT is an effective treatment for biliary atresia. Certain pretransplant variables may help predict patient survival following liver transplantation for biliary atresia.
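A multivariate survival analysis of the kind described, identifying independent predictors of post-OLT mortality, is commonly run as a Cox proportional-hazards model; the abstract does not state which model was used, so treat the choice as an assumption. A sketch on synthetic data with stand-in covariate names:

```python
# Sketch: a Cox proportional-hazards model as one plausible way to run the
# multivariate mortality analysis described above. The study's actual model
# is not stated here; covariate names and data are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1976
df = pd.DataFrame({
    "partial_graft": rng.integers(0, 2, n),    # cadaveric partial/reduced graft
    "life_support": rng.integers(0, 2, n),     # on life support at OLT
    "age_years": rng.uniform(0.2, 18.0, n),
})
risk = 0.5 * df.partial_graft + 0.7 * df.life_support - 0.05 * df.age_years
df["duration"] = rng.exponential(4000 * np.exp(-risk))
df["event"] = df["duration"] < 3650            # death within 10 years
df["duration"] = df["duration"].clip(upper=3650)

cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
print(cph.summary[["exp(coef)", "p"]])         # hazard ratios and P values
```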

6.
BACKGROUND: Biliary reconstruction represents one of the most challenging parts of right lobe (RL) living donor liver transplantations (LDLTs). Different causes, surgical techniques, and treatments have been suggested but are incompletely defined. METHODS: Between June 1999 and January 2002, 96 RL LDLTs were performed in our center. We reviewed the incidence of biliary complications in all the recipients. RESULTS: Roux-en-Y reconstruction was performed in 53 cases (55.2%) and duct-to-duct was performed in 39 cases (40.6%). Both procedures were performed in 4 cases (4.2%). Multiple ducts (≥2) were found in 58 grafts (60.4%). Thirty-nine recipients (40.6%) had 43 biliary complications: 21 had bile leaks, 22 had biliary strictures, and 4 had both complications. Patients with multiple ducts had a higher incidence of bile leaks than those patients with a single duct (P=0.049). No significant differences in complications were found between Roux-en-Y or duct-to-duct reconstructions. Freedom from biliary complications was 59% at 1 year and 55% at 2 years. The overall 1-year and 2-year survival rates for patients were 86% and 81%, respectively. The overall 1-year and 2-year survival rates for grafts were 80% and 77%, respectively. Occurrence of bile leaks affected patient and graft survival (76% and 65% 2-year patient and graft survival, respectively, vs. 89% and 85% for those without biliary leaks, P=0.07). CONCLUSIONS: Despite technical modifications and application of various surgical techniques, biliary complications remain frequent after RL LDLT. Patients with multiple biliary reconstructions had a higher incidence of bile leaks. Patients who developed leaks had lower patient and graft survival rates.

7.
A multivariate analysis of prognostic factors for graft failure was performed on patients in the International Pancreas Transplant Registry. The analysis was restricted to the period January 1978 to June 1987 and included 764 patients. All patients had at least 1 year of follow-up. The following variables were studied: transplant year, continent (N. America, Europe, others), type of donor (cadaver, living related mismatched, living related HLA-identical), donor mismatch at the HLA A, B loci, donor mismatch at the DR loci, preservation time, kidney association (pancreas transplant alone, simultaneous pancreas and kidney transplant, pancreas after kidney transplant), whole versus segmental pancreatic transplant, graft duct management technique (polymer injection, enteric drainage, stomach drainage, bladder drainage), and immunosuppression. By stepwise logistic regression analysis, we found that the following factors were predictive for 1-year graft function: donor mismatch at the DR loci (P = 0.0003), kidney association (P < 0.0001), type of donor (P = 0.04), and immunosuppression (P = 0.0002). For donor mismatch at the DR loci, we found an odds ratio for success of 2.2 for 0 versus 2 mismatches. The odds for success were 2.9 for simultaneous pancreas and kidney transplant versus pancreas transplant alone. The best results (79% 1-year graft survival) were obtained for the combination of 0 mismatches at the DR loci, pancreas after kidney transplant, living related HLA-identical donor, and the immunosuppressive regimen consisting of cyclosporin, azathioprine, and prednisone. Patients receiving a pancreas transplant alone with 0 mismatches at the DR loci, a living related HLA-identical donor, and the triple immunosuppressive regimen had a predicted 1-year graft survival of 71%.
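The odds ratios reported above come out of logistic regression: for a fitted coefficient beta, the odds ratio per unit change is exp(beta). A toy sketch on synthetic data (the registry's model was stepwise with more covariates, so this only illustrates the mechanics):

```python
# Sketch: odds ratios from a logistic model are exp(beta). The registry
# analysis reports e.g. OR 2.2 for 0 vs 2 DR mismatches; here we fit a toy
# logistic regression on synthetic data and read ORs off the coefficients.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 764
dr_mm = rng.integers(0, 3, n)                  # 0, 1, or 2 HLA-DR mismatches
spk = rng.integers(0, 2, n)                    # simultaneous pancreas-kidney
lin = 0.8 - 0.4 * dr_mm + 0.5 * spk            # synthetic true log-odds
success = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

X = sm.add_constant(np.column_stack([dr_mm, spk]).astype(float))
fit = sm.Logit(success, X).fit(disp=0)
or_per_mm = np.exp(fit.params[1])              # OR per added DR mismatch
print("OR per DR mismatch:", round(or_per_mm, 2))
print("OR for 0 vs 2 mismatches:", round(or_per_mm ** -2, 2))
```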

8.
In this retrospective study of hepatitis C virus (HCV)-infected transplant recipients in the 9-center Adult to Adult Living Donor Liver Transplantation Cohort Study, graft and patient survival and the development of advanced fibrosis were compared among 181 living donor liver transplant (LDLT) recipients and 94 deceased donor liver transplant (DDLT) recipients. Overall 3-year graft and patient survival were 68% and 74% in LDLT, and 80% and 82% in DDLT, respectively. Graft survival, but not patient survival, was significantly lower for LDLT compared to DDLT (P = 0.04 and P = 0.20, respectively). Further analyses demonstrated lower graft and patient survival among the first 20 LDLT cases at each center (LDLT ≤20) compared with later cases (LDLT >20; P = 0.002 and P = 0.002, respectively) and with DDLT recipients (P < 0.001 and P = 0.008, respectively). Graft and patient survival in LDLT >20 and DDLT were not significantly different (P = 0.66 and P = 0.74, respectively). Important predictors of graft loss in HCV-infected patients were limited LDLT experience, pretransplant HCC, and higher MELD at transplantation.

9.
BACKGROUND/AIMS: The aim of this retrospective study is to analyze the prognostic impact of the Model for End-Stage Liver Disease (MELD) score in patients undergoing liver transplantation (OLT) with suboptimal livers. METHODS: Between January 2002 and January 2006, 160 adult patients with liver cirrhosis received a whole liver for primary OLT at our institution, including 81 with a suboptimal liver (SOL group) versus 79 with an optimal liver (OL group). The definition of suboptimal liver was: one major criterion (age >60 years, steatosis >20%) or at least two minor criteria: sodium >155 mEq/L, Intensive Care Unit stay >7 days, dopamine >10 microg/kg/min, abnormal liver tests, and relevant hemodynamic instability. RESULTS: Baseline recipient characteristics were comparable in the two study groups. The SOL group had a significantly greater number of early graft deaths (<30 days) than the OL group, while the 3-year Kaplan-Meier patient survivals were similar. Using logistic regression, MELD score was significantly related to patient death only in the SOL group (P = .01), and the receiver operating characteristic curve method identified 17 as the best MELD cutoff, with 3-year survival of 93% versus 85% for MELD ≤17 versus >17, respectively, in the OL group (P > .05). In comparison, it was 94% and 72% in the SOL group (P < .05). Similarly, MELD >17 was significantly associated with early graft death rates only in the SOL group. CONCLUSION: This study advises surgeons not to use suboptimal livers for patients with advanced MELD scores, thus supporting a donor-recipient matching policy.
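Picking a "best" cutoff from a receiver operating characteristic curve, as done here for MELD = 17, is commonly a Youden-index maximization, max(TPR - FPR); the abstract does not state its criterion, so that choice is an assumption. A sketch on synthetic data:

```python
# Sketch: choosing a "best" MELD cutoff from an ROC curve via the Youden
# index, max(TPR - FPR). The study's actual criterion is not stated above,
# so this choice is an assumption; data are synthetic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)
meld = rng.integers(6, 40, 81)                 # synthetic MELD at transplant
death = rng.random(81) < (meld - 6) / 60       # risk increasing with MELD

fpr, tpr, thresholds = roc_curve(death, meld)
best = int(np.argmax(tpr - fpr))               # Youden's J statistic
print("best MELD cutoff:", thresholds[best])
```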

10.
Pediatric renal transplants--results with sequential immunosuppression.
Cyclosporine has improved the results of renal transplantation. In 1984, we began using it as part of a sequential immunosuppression protocol (MALG, AZA, P, and delayed administration of CsA) in our pediatric renal transplant recipients. We studied the outcome of the 131 pediatric renal transplants (≤18 years of age at transplant) performed at our institution between June 1984 and March 1991. We compared these results with the 144 similar transplants performed since January 1980 that did not involve CsA immunosuppression. In the sequential immunosuppression group, there were 97 (74%) primary (26 [27%] cadaver [CAD], 71 [73%] living donor [LD]) and 34 (26%) retransplant (23 [68%] CAD, 11 [32%] LD) recipients. Age at transplant (mean +/- SD) was 7.4 +/- 5.5 years. Overall, 1-year actuarial graft survival was 93%; 1-year patient survival was 100%. The mean number of hospital readmissions was 3.0 +/- 3.5; 26 (20%) were readmission-free. The mean number of rejection episodes was 0.87 +/- 1.3 per patient; 73 (56%) were rejection-free. Importantly, LD (vs. CAD) recipients had fewer rejection episodes (P = 0.06). In the first post-transplant year, the serum creatinine level was significantly lower in primary (vs. retransplant) recipients and in LD (vs. CAD) recipients (P < 0.05). In the 144 patients not receiving CsA, there were 129 (90%) primary (27 CAD, 102 LD) and 15 (10%) retransplant (7 CAD, 8 LD) recipients. Age at transplant was 6.9 +/- 5.3 years. The 1-year actuarial graft survival rate was 82%; the 1-year patient survival rate was 94%. The mean number of hospital readmissions was 3.3 +/- 2.3; 5 (8%) were readmission-free. The mean number of rejection episodes was 1.2 +/- 1.5; 27 (45%) were rejection-free. There was no difference in the serum creatinine level based on donor source or transplant number. Sequential immunosuppression has significantly improved patient (P = 0.003) and graft survival (P = 0.004) rates. Comparing sequential vs. non-CsA immunosuppression, there was no difference in the number of readmissions (P = 0.47), number of rejection episodes (P = 0.17), or serum creatinine level. The number of rejection-free patients was significantly lower in LD (vs. CAD) recipients (P < 0.05). There was no evidence of progressive deterioration in renal function in the sequential (vs. non-CsA) recipients.

11.
《Transplantation proceedings》2022,54(10):2621-2626
Background: The role of advanced practice providers (APPs) in an academic transplant surgical acute care setting remains to be defined. We sought to evaluate the impact of a transplant surgeon–APP (TSAPP) practice model on patient access and outcomes in the care of critically ill patients with end-stage liver disease (ESLD) in an academic transplant center. Methods: A retrospective analysis evaluated the effect of practice model evolution over an 11-year period on hospital access of patients with ESLD to an academic liver transplantation center and on survival outcomes. We compared 3 practice models: era 1 (transplant surgeon–general surgery resident; January 2009 to September 2012) vs era 2 (transition from transplant surgeon–general surgery resident to TSAPP; October 2012 to December 2016) vs era 3 (TSAPP; January 2017 to December 2020). Results: Patient access to hospitalization and the inpatient service census increased significantly over time with the TSAPP model (P < .01). At the time of liver transplant, the median Model for End-Stage Liver Disease score increased across era 1 (25), era 2 (33), and era 3 (34), P < .01, as did the proportion of patients requiring intensive care unit care: era 1 (7.1%), era 2 (44.8%), and era 3 (56.4%), P < .01. The overall 1-year patient survival rates remained comparable across all eras: era 1 (93.88%), era 2 (93.11%), and era 3 (94.06%), P = .77. Conclusions: APPs play an integral role in clinical transplantation practice. The integration of APPs into the transplant surgical workforce increased access of high-acuity patients with ESLD to the transplantation center. In addition, it provided excellent patient and graft survival outcomes after liver transplant.

12.
BACKGROUND: Highly sensitised renal transplant candidates (HSP) have a reduced chance of receiving a transplant. In Eurotransplant (ET), two special allocation programs have been made available for such patients: the Highly Immunised Tray (HIT) program and the Acceptable Mismatch (AM) program, albeit with different inclusion and exclusion criteria (HIT: current PRA ≥85%; AM: current and/or historical PRA ≥85%). When a suitable kidney becomes available for a patient included in these special programs, the offer is mandatory. In contrast, in the point score system of the standard ET kidney allocation procedure (ETKAS), HSP (PRA ≥85%) only get a marginal bonus according to their current sensitisation. It was tested whether the allocation priority of the two special allocation programs is justified from the perspective of transplant outcome. METHODS: The post-transplant outcomes of recent consecutive cohorts of AM, HIT and HSP-ETKAS transplants were compared. The end points were initial graft function, rejection episodes during the first three months post-transplant, and 1-year kidney graft outcome. RESULTS: Between January 1, 1997 and June 30, 1998, 101 HSP received a kidney-only transplant: 29 via AM, 39 via HIT and 33 via ETKAS. HLA-A,B,DR matching was more favourable in the AM and HIT allocation groups, and their waiting times until transplantation were much shorter than those of the HSP-ETKAS allocation group. The incidence of initial graft non-function was similar among the three HSP allocation groups, averaging 50%. Recovery of the initial non-function was more likely for AM and HIT transplants. No difference was present with regard to the percentage of patients who experienced at least one rejection episode during the first three months post-transplant, averaging 43%. However, the AM group had less severe and/or less recurrent rejection episodes. The 1-year kidney graft survival, censored for death with a functioning graft, was 96% for AM, 82% for HIT and 75% for HSP-ETKAS transplants (p = 0.04). CONCLUSIONS: The two special allocation programs for HSP do yield adequate results and offer a shorter waiting time compared with the standard kidney allocation procedure. The AM approach might be preferred because of the smoother post-transplant management and the better graft survival, keeping the HIT approach as a back-up. Since the allocation priority is justified in view of efficiency, the renal transplant community should support the incorporation of a special allocation program for HSP in their respective organ exchange programs.

13.
BACKGROUND: Calcineurin inhibitor (CNI) toxicity is a common cause of chronic allograft nephropathy. Although de novo sirolimus (SRL) with CNI minimization may provide better graft function, studies in Asian recipients are lacking. AIM: We sought to determine the 1-year outcomes of renal transplant patients who received a de novo SRL-based regimen with CNI minimization. PATIENTS AND METHODS: A single-center, prospective study of a de novo SRL-based, reduced-dose cyclosporine regimen was performed from 2004 to 2007. The control group was a historical cohort of a cyclosporine-based regimen (cyclosporine, prednisolone, and mycophenolate mofetil). The 1-year outcome parameters included renal function, rate of acute rejection, biopsy-proven CNI toxicity, and graft and patient survivals. RESULTS: The SRL-based regimen achieved 100% 1-year graft and patient survivals. The renal function was comparable between the SRL-based and CNI-based regimens (serum creatinine 1.32 +/- 0.45 and 1.45 +/- 0.43 mg/dL; P = .27). The rate of biopsy-proven acute rejection was comparable (9.5% and 13%; P = .68). The SRL-based regimen had a higher rate of biopsy-proven CNI toxicity (28.5% and 9.7%; P = .03). CONCLUSIONS: De novo SRL-based regimen with CNI minimization provides excellent transplant outcomes. The strategy to minimize or withdraw CNIs may achieve excellent graft function. A prospective study targeting lower CNI trough levels in Asian transplant recipients is required.

14.
BACKGROUND/PURPOSE: Liver transplantation is standard therapy for children with a variety of liver diseases. The current shortage of organ donors has led to aggressive use of reduced or split grafts and living-related donors to provide timely liver transplants to these children. The purpose of this study is to examine the impact of these techniques on graft survival in children currently treated with liver transplantation. METHODS: Data were obtained on all patients less than 21 years of age treated with isolated liver transplants performed after January 1, 1996 in an integrated statewide pediatric liver transplant program, which encompasses 2 high-volume centers. Nonparametric tests of association and life table analysis were used to analyze these data (SAS v 6.12). RESULTS: One hundred twenty-three children received 147 grafts (62 at the University of Florida, 85 at the University of Miami). Fifty-two (36%) children were less than 1 year of age at time of transplant, and 80 (55%) were less than 2 years of age. Patient survival rate was identical in the 2 centers (1-year actuarial survival rate, 88.4% and 87.1%). Twenty-five (17%) grafts were reduced, 28 (19%) were split, 6 were from living donors (4%), and 88 (60%) were whole organs. One-year graft survival rate was 80% for whole grafts, 71.6% for reduced grafts, and 64.3% for split grafts (P =.06). Children who received whole organs (mean age, 6.1 years) were older than those who received segmental grafts (mean age, 2.5 years; P <.01). Multifactorial analysis suggested that patient age, gender, and use of the graft for retransplant did not influence graft survival, nor did the type of graft used influence patient survival. CONCLUSIONS: The survival rate of children after liver transplantation is excellent independent of graft type. Use of current techniques to split grafts between 2 recipients is associated with an increased graft loss and need for retransplantation. Improvement in graft survival of these organs could reduce the morbidity and cost of liver transplantation significantly in children.

15.
BACKGROUND: We studied patient and graft survival rates in adult liver transplant recipients, analyzing outcomes based on donor source (deceased donor [DD] vs. living donor [LD]) and graft type (whole liver vs. partial liver). METHODS: A retrospective database analysis of all adult liver transplants performed at our center over a 7-year period. RESULTS: Between 1999 and 2005, 384 liver transplants were performed in adult recipients, either as a whole liver from a deceased donor (DD-WL, n=284), a split liver from a DD (DD-SL, n=31), or a partial transplant from a living donor (LD, n=69). DD-SL transplants were performed with a full right or left lobe graft, while LD transplants used the right lobe. Demographic differences among the three groups were most noticeable for lower model for end-stage liver disease scores in LD recipients (P<0.001) and younger donor age in DD-SL recipients (P<0.001). Superior graft survival results were seen in LD recipients versus either DD-WL recipients or DD-SL recipients (P=0.02 and P=0.05, respectively). Multivariate analysis showed hepatitis C (HR=1.53, P=0.05) and hepatocellular carcinoma (HR=1.74, P=0.03) to be significant risk factors for patient survival. Hepatitis C (HR=1.61, P=0.03) and donor age more than 50 (HR=1.64, P=0.04) were significant risk factors for graft survival. However, neither graft type nor donor source was a significant independent risk factor for patient or graft survival. CONCLUSIONS: Our data suggest that the status of the recipient is probably a more important determinant of outcome than graft type or donor source.

16.
BACKGROUND: Few data are available about the long-term outcome of renal transplantation in patients with systemic lupus erythematosus (SLE). METHODS: We retrospectively studied all lupus nephritis patients who received kidney allografts in our center between June 1989 and 2006. Patient and allograft outcomes were compared with those of 60 controls. RESULTS: Mean follow-up after renal transplantation was 87 +/- 39 months for patients with lupus and 88 +/- 54 months for controls. Actuarial 10-year patient (83% vs 85%; P = .62) and death-censored graft survival rates (73% vs 69%; P = .36) were not significantly different between lupus patients and controls. Intravascular thrombotic events occurred in 4 patients with SLE (17.4%) and 3 controls (5%; P < .05). Recurrence of lupus nephritis was documented in 1 renal allograft (4.3%). CONCLUSION: Long-term patient and graft survivals were similar in SLE and non-SLE renal transplant recipients. The risk for thrombotic complications was greater among SLE patients.

17.
BACKGROUND: Long-term follow-up of heart, liver, and lung transplantation has led to an increased recognition of secondary end-stage renal failure (ESRF) in transplant recipients. This study examines our center's experience with renal transplantation following previous solid organ transplantation. METHODS: From January 1, 1992, to September 30, 1999, our center performed 18 renal transplants in previous solid organ recipients. During the same period, 815 total renal transplants were performed. One- and 3-year graft and patient survival, recipient demographics, donor type, and reason for transplantation were compared between these groups. RESULTS: Of the 18 recipients, 7 had prior heart transplants, 4 had prior liver transplants, and 7 had prior lung transplants. Cyclosporine toxicity contributed to renal failure in 17 (94.4%) of the patients, either as the sole factor (11 patients) or in combination with hypertension, renal artery stenosis, or tacrolimus toxicity (6 patients). Kaplan-Meier 1- and 3-year patient survival was 82.9% and 73.7%, compared with 95.5% and 90.7% in all renal transplant recipients. No surviving patient has suffered renal allograft loss. The mean current creatinine level is 1.4 mg/dL. CONCLUSIONS: Renal transplantation is an excellent therapy for ESRF following prior solid organ transplantation. One- and 3-year patient and graft survival rates demonstrate the utility of renal transplantation in this patient population.

18.
The MELD/PELD (M/P) system for liver allocation was implemented on February 27, 2002, in the United States. Since then sufficient time has elapsed to allow for assessment of posttransplant survival rates under this system. We analyzed 4163 deceased donor liver transplants performed between February 27, 2002, and December 31, 2003, for whom follow-up reporting was 95% and 67% complete at 6 and 12 months, respectively. Kaplan-Meier survival analysis revealed 1-year patient and graft survival rates for status 1 of 76.9% and 70.4%, respectively, and 87.3% and 82.9% for patients prioritized by M/P (P < .0001 for status 1 vs M/P). When adult candidates were stratified by MELD score quartile at transplant, 1-year survival rates were 89.5%, 88.3%, 86.6%, and 78.1% for lowest to highest quartile (P = .0002) and graft survival rates were similarly distributed (85.0%, 84.5%, 82.7%, 73.0%, P < .0001). Candidates with hepatocellular cancer (89.6%) and other MELD score exceptions (88.8%) had slightly higher 1-year survival rates compared with standard MELD recipients (86.0%), which did not reach statistical significance (P = .089). Pediatric recipients had slightly better patient (88.7%) and graft (86.5%) survival rates at 1 year than adults but there were no significant differences among the PELD strata due to small numbers of patients in each PELD quartile. We conclude that patient and graft survival have remained excellent since implementation of the MELD/PELD system. Although recipients with MELD scores in the highest quartile have reduced survival compared with other quartiles, their 1-year survival rate is acceptable when their extreme risk of dying without a transplant is taken into consideration.
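The quartile analysis above, stratifying recipients by MELD at transplant and comparing survival across strata, maps onto a multi-group log-rank test. A minimal sketch with synthetic stand-in data:

```python
# Sketch of the MELD-quartile comparison: stratify by quartile at transplant
# and compare survival with a multi-group log-rank test. Synthetic data
# stand in for the 4163 transplants analyzed above.
import numpy as np
import pandas as pd
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(4)
n = 4163
meld = rng.integers(6, 41, n)
quartile = pd.qcut(meld, 4, labels=["Q1", "Q2", "Q3", "Q4"])

time = rng.exponential(3000 - 50 * meld)       # higher MELD, shorter survival
event = time < 365                             # death within 1 year
time = np.minimum(time, 365)                   # censor at 1 year

res = multivariate_logrank_test(time, quartile, event)
print("log-rank P across MELD quartiles:", res.p_value)
```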

19.
INTRODUCTION: The use of extended criteria donors (ECDs) could minimize shortage of suitable donor livers for transplantation. In 3 years, the aggressive use of ECD livers has reduced the wait list at our center from 257 to 30 patients with a median wait time of 18 days without using living donors. This study compares the graft/patient survival from standard (SD) and ECD for our transplant population between 2001 and 2005. METHODS: Records of all adult liver transplant recipients over 4 years were reviewed (n = 571). ECD criteria included: age >59 years, BMI >34.9, maximum AST/ALT >500, maximum bilirubin >2.0, peak serum sodium >170, HBV/HCV/HTLV reactive, donation after cardiac death, cold ischemia time >12 hours, ICU stay >5 days, 3 or more pressors simultaneously, extensive alcohol abuse, cancer history (nonskin), active meningitis/bacteremia, or significant donor liver trauma. Outcomes included graft and patient survival at 90 days, 1 year, and 2 years. RESULTS: Sixty-eight percent of recipients (n = 388) received ECD livers. Primary factors accounting for ECD-liver status included: elevated liver function tests (20%), hypernatremia (12.6%), and extensive alcohol abuse (11.4%). Graft survival was (SD, ECD): 90-day 91%, 88%; 1-year 84%, 80%; 2-year 78%, 77%; patient survival was: 90-day 93%, 90%; 1-year 87%, 82%; 2-year 83%, 79%. Kaplan-Meier survival analysis failed to demonstrate an overall difference in graft or patient survival at any time point. Only donor age >60 years was associated with decreased graft and patient survival. CONCLUSIONS: Liver grafts from ECD can be used to dramatically reduce wait list time with outcomes comparable to those for SD without resorting to living donor liver transplantation.
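The ECD definition enumerated above is a one-of-many rule set, so it can be encoded as a simple predicate. A sketch with hypothetical field names; thresholds follow the abstract:

```python
# Sketch: the ECD definition above as a one-of-many predicate. Field names
# are hypothetical; thresholds follow the abstract.
def is_ecd(donor: dict) -> bool:
    return any([
        donor.get("age", 0) > 59,
        donor.get("bmi", 0) > 34.9,
        donor.get("max_ast_alt", 0) > 500,
        donor.get("max_bilirubin", 0) > 2.0,
        donor.get("peak_sodium", 0) > 170,
        donor.get("hbv_hcv_htlv_reactive", False),
        donor.get("dcd", False),               # donation after cardiac death
        donor.get("cold_ischemia_hours", 0) > 12,
        donor.get("icu_days", 0) > 5,
        donor.get("simultaneous_pressors", 0) >= 3,
        donor.get("extensive_alcohol_abuse", False),
        donor.get("cancer_history_nonskin", False),
        donor.get("active_meningitis_or_bacteremia", False),
        donor.get("significant_liver_trauma", False),
    ])

print(is_ecd({"age": 63}))                                # True: age >59
print(is_ecd({"age": 45, "cold_ischemia_hours": 9}))      # False
```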

20.
In the United States, the Scientific Registry of Transplant Recipients (SRTR) provides publicly available quality report cards. These reports have historically rated transplant programs using a 3-tier system. In 2016, the SRTR temporarily transitioned to a 5-tier system, which classified more programs as under-performing. As part of a larger survey about transplant quality metrics, we surveyed members of the American Society of Transplant Surgeons and American Society of Transplantation (N = 280 respondents) on transplant center experiences with patient and payer responses to the 5-tier SRTR ratings. Over half of respondents (n = 137, 52.1%) reported ≥1 negative effect of the new 5-tier ranking system, including losing patients, losing insurers, increased concern among patients, and increased concern among referring providers. Few respondents (n = 35, 13.7%) reported any positive effects of the 5-tier ranking system. Lower SRTR-reported scores on the 5-tier scale were associated with increased risk of reporting at least one negative effect in a logistic model (P < 0.01). The change to a more granular rating system provoked an immediate response in the transplant community that may have long-term implications for transplant hospital finances and patient options for transplantation.
