Similar Literature

20 similar documents found (search time: 15 ms)
1.

Purpose

Donor age is a well-known factor influencing graft function after deceased donor liver transplantation (DDLT). However, the effect of donors older than recipients on graft outcomes remains unclear. This study investigated the relationship between the donor–recipient age gradient (DRAG) and posttransplant outcomes after DDLT.

Methods

We included 164 adult recipients who underwent DDLT between May 1996 and April 2011. Patients were divided into 2 groups according to the value of DRAG: Negative (DRAG −20 to −1; n = 99) versus positive (DRAG 0–20; n = 65). Medical records were reviewed and laboratory data were retrospectively collected.

Results

The median age of donors and recipients was 43 (range, 10–80) and 46 (range, 19–67) years, respectively. The mean follow-up time was 57.4 months. A positive DRAG had a negative effect on levels of alkaline phosphatase until 2 weeks after transplantation. However, the positive group showed a lower incidence of hepatitis B viral disease recurrence. The 1-, 3-, and 5-year graft survival rates were 80.4%, 76.8%, and 71.4% in the negative group, and 65.8%, 58.4%, and 56.3% in the positive group, respectively. The positive DRAG group showed significantly inferior graft survival compared with the negative DRAG group (P = .036).

Conclusion

This study demonstrated that donors older than recipients had a deleterious effect on graft outcomes. DRAG could be a meaningful determinant of graft survival among DDLT recipients.
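The DRAG value used above is simple arithmetic: donor age minus recipient age, with recipients grouped by sign. A minimal sketch of that grouping (the function names and example ages are our own illustration, not study data):

```python
def drag(donor_age, recipient_age):
    """Donor-recipient age gradient: donor age minus recipient age, in years."""
    return donor_age - recipient_age

def drag_group(donor_age, recipient_age):
    """Classify into the study's two groups: negative (donor younger) vs positive."""
    return "negative" if drag(donor_age, recipient_age) < 0 else "positive"

# Hypothetical ages, not patient data:
print(drag_group(60, 45))  # donor 15 years older -> "positive"
print(drag_group(30, 50))  # donor 20 years younger -> "negative"
```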

2.

Objective

Our aim was to evaluate whether the reduction in spleen volume at 6 months after living donor liver transplantation (LDLT) was affected by the size of the right lobe liver graft.

Patients and Methods

We analyzed 87 adult recipients of right lobe liver grafts who displayed preoperative splenomegaly: spleen volume >500 cm3 by computed tomographic (CT) volumetry. The recipients were grouped according to the graft weight-to-recipient weight ratio (GRWR): GRWR >1 versus GRWR <1. The 2 groups were compared at 6 months after LDLT for mean postoperative spleen volume (SV) and mean SV change ratio, defined as [(SVpreop − SV6m)/SVpreop] × 100%, where SVpreop and SV6m represent SV calculated from CT examinations preoperatively and at the 6-month follow-up after LDLT, respectively.

Results

The GRWR ranged from 0.77 to 1.66. There were 53 patients with GRWR >1 and 34 with GRWR <1. Our analysis showed significant hepatic graft volume regeneration and SV reduction at 6 months after LDLT. The SV change ratio weakly but significantly correlated with the transplanted liver graft weight (Pearson correlation coefficient, r = 0.274; P < .009). In the GRWR >1 group, the mean postoperative SV was 632 ± 220 cm3 and the mean SV change ratio was a decrease of 32 ± 11%; in the GRWR <1 group, the corresponding values were 598 ± 188 cm3 and a decrease of 34 ± 13%. There were no differences in mean postoperative SV or mean SV change ratio between the 2 groups.

Conclusion

LDLT using a right lobe graft resulted in a significant reduction of SV at 6 months after surgery, but there were no significant differences between recipients who received different-sized right lobe liver grafts.
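The SV change ratio defined in the Methods above is a percent reduction. A minimal sketch of the computation (the example volumes are hypothetical, not patient data):

```python
def sv_change_ratio(sv_preop, sv_6m):
    """Percent reduction in spleen volume: [(SVpreop - SV6m) / SVpreop] * 100."""
    return (sv_preop - sv_6m) / sv_preop * 100

# Hypothetical spleen volumes in cm^3 (preoperative and at 6 months):
print(round(sv_change_ratio(900, 600), 1))  # -> 33.3 (% reduction)
```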

3.
4.
《Transplantation proceedings》2019,51(7):2473-2477
Purpose

The variation of multiple bile ducts in a living donor graft is not infrequent; however, the literature on the impact of the number of bile ducts on postoperative biliary complications is scarce. We investigated whether the number of biliary duct anastomoses affects the rate of postoperative biliary complications in patients undergoing living donor liver transplantation (LDLT).

Materials and Methods

Between January 2016 and January 2018, all patients who underwent LDLT were reviewed. The patients were divided into 2 groups according to the number of bile duct anastomoses (single duct [group A, n = 78] or multiple ducts [group B, n = 94]). Data collection included demographic features, Child-Pugh score (CPS), graft-recipient weight ratio (GRWR), surgical data including technique of biliary anastomosis (duct-to-duct, duct-to-sheath, double duct-to-duct, and hepaticojejunostomy), and postoperative morbidity and mortality.

Results

Duct-to-duct anastomosis was the most commonly performed technique in group A, whereas double duct-to-duct and duct-to-sheath anastomoses were significantly more frequent in group B. Operating time was longer in group B than in group A (438 ± 72 vs 420 ± 61 minutes; P = .05). Regarding biliary complications (n = 40, 23.2%), the rates of biliary leakage (n = 17, 9.9%) and stricture (n = 25, 14.5%) were similar in both groups (P = .164 and .773, respectively). CPS was positively correlated with biliary stricture (for Child B and C, odds ratio [OR]: 10.669 and 17.866, respectively), whereas GRWR was negatively correlated (OR: 9.530). Increased risk of bile leakage was observed with younger donors (OR: .929). Although the overall mortality rate was 9.8% (n = 17), only 5 of the patients (29%) died of biliary complications.

Conclusion

The number of biliary ducts and anastomoses did not affect the rate of complications. However, CPS, GRWR, and young donor age were found to be predisposing factors for postoperative biliary complications. Mortality was mostly due to causes other than biliary complications.

5.
《Liver transplantation》2000,6(5):582-587
Alagille syndrome (AGS) is frequently associated with growth failure, which has been attributed to concurrent congenital anomalies, cholestasis, and malabsorption and/or malnutrition. However, the underlying cause of the growth failure is not well understood. Our objective was to analyze the growth pattern in 26 patients with AGS and the possible effect that orthotopic liver transplantation (OLT) may have on this pattern. The standardized height, weight, and growth velocity of 26 pair-matched patients with AGS were compared. Thirteen patients underwent OLT. Repeated-measures ANOVA methods were used for the statistical analysis. The overall mean standardized height (z score) was −2.92 in the OLT group versus −1.88 in the non-OLT group (P = .03). The overall mean standardized weight was −1.21 in the non-OLT group and −1.67 in the OLT group (P = .23). In 15 patients, birth weight was 2.82 ± 0.4 kg, for a mean standardized weight of −0.95, and weight at diagnosis was 4.53 ± 2.12 kg, for a mean standardized weight of −1.56. Bone age was delayed in the 9 patients who underwent bone-age analysis. Growth hormone therapy administered to 2 patients did not improve growth. Patients with AGS had growth failure secondary to other factors in addition to liver disease. Growth failure beginning in the prenatal period supports a genetic basis for this feature. Growth improvement up to normal levels should not be expected as a benefit of OLT in these patients. Growth failure as a primary indication for OLT should be cautiously examined in patients with AGS. (Liver Transpl 2000;6:582-587.)

6.
Since the upper age limit for organ donors has been raised, a higher incidence of preexisting organ damage and functional impairment is to be expected. Coronary artery sclerosis increases with age and can only be diagnosed with certainty by coronary angiography. Since contrast medium administration may cause renal damage when risk factors are present, this study sought to establish whether angiography negatively influenced the early postoperative function of kidney grafts. We compared the clinical courses of 36 recipients of kidneys from donors who had undergone coronary angiography or levography with those of 36 recipients of kidneys from donors who had not received contrast medium. The results showed that the administration of contrast medium had no influence on renal function at 3 or 6 months after transplantation. In conclusion, fears that donor kidneys might be harmed by contrast medium therefore appear to be unfounded.

7.
《Transplantation proceedings》2019,51(9):3116-3119
Background

Large-for-size (LFS) grafts should be avoided when performing adult deceased donor liver transplantation (DDLT), as they are associated with abdominal compartment syndrome, severe graft injury, and primary graft nonfunction. When an LFS graft is encountered unexpectedly intraoperatively, the most commonly reported approach has been surgical reduction of the right lobe, despite its technical difficulty and the ongoing coagulopathy after graft reperfusion. We report a case in which we performed a left lateral sectionectomy instead of a right lobe modification.

Case Report

A 44-year-old, 58.4-kg female patient was admitted with drug-induced acute hepatic failure and underwent an emergency DDLT. The donor was a 51-year-old, 60.0-kg man. At the time of procurement, the liver was noted to be hypertrophic. The estimated graft-to-recipient weight ratio was 3.49%. After completing the vascular and bile duct anastomoses, the abdomen could not be closed because of the large graft size. Because of the hypertrophic left lateral lobe and ongoing coagulopathy, we decided to perform an in situ left lateral sectionectomy rather than a right posterior sectionectomy or right hemihepatectomy. The next day, liver function had failed to improve and the patient's blood pressure began to decline gradually. Computed tomography showed severe compression of the inferior vena cava (IVC) by the graft, and the patient underwent transjugular IVC stent placement. Soon after, the patient's blood pressure improved and liver function gradually normalized. The patient was discharged uneventfully on postoperative day 45.

Conclusion

Under specific conditions, in situ left lateral sectionectomy is a solution for an unexpected LFS graft during DDLT.

8.

Introduction

Autoimmune hepatitis and cholestatic liver diseases have more favorable outcomes after liver transplantation than viral hepatitis and alcoholic liver diseases. However, only a few reports have compared outcomes of living donor liver transplants (LDLT) and deceased donor liver transplants (DDLT) for these conditions.

Aim

We aimed to study the survival outcomes of patients undergoing liver transplantation for autoimmune and cholestatic diseases and to identify possible risk factors influencing survival. We also compared survival outcomes for LDLT versus DDLT for these diseases.

Patients and Methods

A retrospective analysis of the UNOS database was performed for patients transplanted between February 2002 and October 2006 for autoimmune hepatitis (AIH), primary sclerosing cholangitis (PSC), and primary biliary cirrhosis (PBC). Survival outcomes for LDLT and DDLT patients were analyzed and factors influencing survival were identified.

Results

Among all recipients, the estimated patient survival at 1, 3, and 5 years was 95.5%, 93.6%, and 92.5% for LDLT and 90.9%, 86.5%, and 84.9% for DDLT, respectively (P = .002). The estimated graft survival at 1, 3, and 5 years was 87.9%, 85.4%, and 84.3% for LDLT and 85.9%, 80.3%, and 78.6% for DDLT, respectively (P = .123). On multivariate proportional hazard regression analysis, after adjusting for age and MELD score, the effect of donor type was not found to be significant.

Conclusion

The overall survival outcomes of LDLT were similar to those of DDLT in our patients with autoimmune and cholestatic liver diseases. It appears from our study that, after adjusting for age and MELD score, donor type does not significantly affect the outcome.

9.
Hepatorenal syndrome (HRS) is a reversible, functional renal failure that occurs in patients with advanced hepatic failure. However, the reported rates of complete recovery of renal function and patient survival after orthotopic liver transplantation (OLT) are variable. The aim of this study was to compare the outcomes after OLT between patients with HRS and those without HRS (no-HRS). We established exclusion criteria to select study patients who underwent OLT in a single center between January 2005 and October 2008. The exclusion criteria included the following: (1) malignancy, (2) <18 years of age, (3) other than primary OLT, (4) ABO mismatch or hemophilia, (5) no liver cirrhosis, and (6) survival <1 month after OLT. We selected 71 subjects, including 8 HRS and 63 no-HRS patients. No significant differences were observed in the estimated glomerular filtration rate (eGFR) between the 2 groups except for a lower eGFR in the HRS group on the day of OLT and at 1 month after OLT: 108.3 ± 40.5 versus 31.4 ± 14.1 mL/min and 85.4 ± 15.0 versus 57.3 ± 12.1 mL/min (P = .000 and P = .014, respectively). The renal function of 6/7 HRS patients who survived >1 year improved. The 1-year patient survival rate after OLT in HRS patients was similar to that in patients without HRS: 95% versus 86% (P = .37). We concluded that HRS had minimal effects on patient survival and return of acceptable renal function.

10.
As inflammation has come to be recognized as a major contributor to the pathogenesis of atherosclerosis, we evaluated patients who developed mediastinitis, a long-standing inflammatory process, after coronary artery bypass grafting. Many studies have focused on graft patency, but to date no study has examined the effect of mediastinitis on graft patency. We therefore aimed to determine the effect of mediastinitis on graft patency in patients who have undergone coronary artery bypass surgery. Sixteen of 45 patients who underwent coronary artery bypass surgery and developed mediastinitis, which was treated with open drainage and mediastinal irrigation with late wound closure, were included in the study. The mean age of the patients was 55 ± 11 years (range, 35-69), and 9 of the patients were male. Graft patency was evaluated with control coronary angiography after a mean period of 30.42 ± 43.17 months (range, 1-132). The left internal thoracic artery was patent in all patients (100%). The right internal thoracic artery patency rate was 50% (1/2). One individually bypassed radial artery was patent, whereas the sequentially bypassed graft was occluded; the patency ratio of radial artery anastomoses was 33% (1/3). Twelve of the 17 saphenous vein grafts were patent (70.58%). The total number of patent distal anastomoses was 30/38 (78.94%). Compared with the graft patency of patients without infection, mediastinitis did not adversely affect graft patency rates.

11.

Introduction

There is a paucity of data on long-term outcomes of older kidney recipients. Our aim was to compare the early and long-term outcomes of deceased donor kidney transplantation in patients aged ≥60 years with outcomes in younger recipients.

Materials and Methods

From 1998 to 2005, we performed 271 deceased donor kidney transplants. There were 76 recipients (28.1%) aged >60 years. Older candidates were carefully selected based on their physiologic, cardiac, and performance status. Demographic data, including clinical characteristics, early complications, mortality, and patient and graft survival rates, were collected and analyzed.

Results

Older patients had perioperative mortality and morbidity, incidence of delayed graft function (DGF), length of stay, and readmission rates comparable to those of younger patients. The rates of acute rejection and major infections were also comparable between the 2 study groups. Among older recipients, 25/76 (32.1%) received extended criteria donor kidneys, compared with only 35/195 (17.9%) of younger patients (P < .001). Nevertheless, equivalent 1-, 3-, and 5-year allograft survival rates were observed in elderly and young patients: 91.5% versus 92.5%, 78.5% versus 81.9%, and 75.6% versus 78.5%, respectively. Overall patient survival was also comparable in both groups.

Conclusion

Kidney transplantation in appropriately selected elderly recipients provides equivalent outcomes compared with those observed in younger patients. These observations support the notion that older recipients should not lose access to deceased donor kidney transplantation in the effort to achieve a perceived gain in social utility.

12.

Background

We investigated whether the age of donor kidneys influences the incidence of nocturnal polyuria in patients with successful renal transplantation (RTX).

Methods

Eighty-five patients (45 men and 40 women) undergoing RTX (median age, 47 years) were included in this study. Twenty-four-hour bladder diaries were kept for 3 days, and nocturnal polyuria was defined as a nocturnal polyuria index (nocturnal urine volume/24-hour urine volume) of >0.33. Risk factors for nocturnal polyuria were analyzed in patients with RTX by means of the Mann-Whitney U test, χ2 test, and a logistic regression analysis.

Results

End-stage renal disease (ESRD) developed from diabetes mellitus in 16 patients (19%). Sixty-five patients (76%) received pre-transplant dialysis, with a median duration of 5 years. The median serum creatinine level and body mass index at the most recent visit were 1.2 mg/dL and 21.2 kg/m2, respectively. On the basis of the 24-hour bladder diaries, nocturnal polyuria was identified in 48 patients (56%). A logistic regression analysis revealed that diabetes mellitus as the original disease for ESRD was the only risk factor for nocturnal polyuria (odds ratio, 8.95; 95% confidence interval, 2.01–65.3; P = .0028). The age of donor kidneys at examination did not affect the incidence of nocturnal polyuria (P = .9402).

Conclusions

Nocturnal polyuria was not uncommon in patients with successful RTX. Diabetes mellitus as the original disease for ESRD was the only risk factor for nocturnal polyuria, whereas the age of donor kidneys at examination did not affect the incidence of nocturnal polyuria. Thus, nocturnal polyuria is caused by recipient factors but not donor factors.
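The nocturnal polyuria index used above is a simple ratio with a fixed threshold. A minimal sketch of the definition (the diary volumes below are hypothetical, not study data):

```python
def nocturnal_polyuria_index(nocturnal_ml, total_24h_ml):
    """Nocturnal polyuria index = nocturnal urine volume / 24-hour urine volume."""
    return nocturnal_ml / total_24h_ml

def has_nocturnal_polyuria(nocturnal_ml, total_24h_ml, threshold=0.33):
    """The study defines nocturnal polyuria as an index > 0.33."""
    return nocturnal_polyuria_index(nocturnal_ml, total_24h_ml) > threshold

# Hypothetical bladder-diary volumes (mL):
print(has_nocturnal_polyuria(700, 1800))  # index ~0.39 > 0.33 -> True
print(has_nocturnal_polyuria(500, 1800))  # index ~0.28 -> False
```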

13.
14.

Purpose

Advanced donor age is a well-known risk factor for poor graft function after living donor liver transplantation (LDLT). In addition, advanced recipient age has a significant impact because of the high prevalence of comorbidities. We investigated the relationship between donor–recipient age gradient (DRAG) and the posttransplant outcomes in LDLT.

Methods

We included 821 consecutive adult recipients who underwent LDLT from June 1997 to May 2011. According to the value of DRAG, they were divided into 2 groups: negative years (the donor was younger than the recipient) and positive years (the donor was older than the recipient). These groups were further divided into subgroups (≤−21, −20 to −1, 0 to 20, and ≥21 years). We retrospectively collected patient characteristics, laboratory results, medical and surgical complications, and graft loss.

Results

The positive DRAG group had higher levels of posttransplant alkaline phosphatase but a lower incidence of biliary complications. The negative DRAG group, particularly DRAG ≤ −21 years, was associated with superior 1-, 3-, 5-, and 10-year graft survival. Recipients with DRAG ≥ 21 showed persistently inferior graft survival during the observation period. Among transplants from young donors, those with a lower DRAG (young donors to older recipients) showed more favorable graft survival than young-to-young transplants.

Conclusion

This study demonstrated that DRAG and a fixed donor age limit could be significant factors to predict graft survival after LDLT. Patients should carefully consider the worse graft survival if the donor is older than the recipient by ≥20 years.

15.
16.
Lung transplantation is limited by donor pool shortage. Despite efforts to extend graft acceptability through repeated reformulation of donor criteria, previous cardiothoracic surgery is still considered a contraindication. A donor who has undergone cardiac surgery could potentially provide an ideal lung, but high intraoperative risks and intrinsic technical challenges are expected during graft harvesting. The purpose of this study is to present our dedicated protocol and four clinical cases of successful lung procurement from donors who had undergone previous major cardiac surgery. One donor had ascending aortic root (AAR) replacement, another had mitral valve replacement, and two had coronary artery bypass surgery. All other eligibility criteria for organ allocation, such as ABO compatibility, PaO2/FiO2 ratio, and absence of aspiration or sepsis, were met. In one of the cases with previous coronary bypass grafting, the donor was on veno-arterial extracorporeal membrane oxygenation support; consequently, the grafts required ex vivo lung perfusion evaluation. We report the technical details of procurement and the postoperative courses of the recipients. All procurements were uneventful, without lung damage or loss of abdominal organs due to catastrophic intraoperative events. All recipients had a successful clinical outcome. We believe that successful transplantation is achievable even in a complicated setting, as cases involving donors with previous cardiac surgery frequently are. Facing lung donor shortage, we strongly support any effort to avoid the loss of potentially acceptable lungs. In particular, previous major cardiac surgery does not necessarily imply poor lung quality or unsustainable graft procurement.

17.
18.
The clinical benefit of machine perfusion (MP) was recently assessed in a 1-year Brazilian multicenter prospective randomized trial, which showed that the use of MP was associated with a reduced incidence of delayed graft function (DGF) compared to static cold storage (SCS) in kidney transplant recipients (45% vs 61%). The objective of the present analysis was to assess the cost-effectiveness of MP relative to SCS based on clinical data from this Brazilian cohort. A decision tree model was constructed to simulate a population of 1000 kidney transplant recipients based on data derived from this multicenter clinical trial. The model accounts for different health state utilities to estimate the cost-effectiveness of deceased donor kidney transplantation in Brazil, comparing 2 kidney preservation methods: MP and SCS. The model accounts for 3 possible graft outcomes at 1 year post-transplantation: success (an immediately functioning kidney), failure (primary nonfunction requiring a return to dialysis), or DGF. MP provided 612 total quality-adjusted life years (QALYs; 0.61 QALYs per patient) compared to SCS (553 total QALYs; 0.55 QALYs per patient). MP was cost-effective relative to SCS (US$22,117/QALY; R$70,606/QALY). The use of MP also resulted in more functioning grafts than SCS (821 vs 787), leading to a cost per functioning graft of US$38,033 (R$121,417). In conclusion, this analysis indicates that, despite the initial added cost associated with MP, the use of MP results in more functioning grafts and higher patient quality of life relative to SCS in Brazil.
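The cost-per-QALY figure above is a standard incremental cost-effectiveness ratio (ICER): the extra cost divided by the extra QALYs gained. A minimal sketch, where the per-patient incremental cost below is a hypothetical placeholder for illustration only, not a trial figure:

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return delta_cost / delta_qaly

# Per-patient QALYs from the model: 0.61 (MP) vs 0.55 (SCS).
delta_qaly = 0.61 - 0.55
# HYPOTHETICAL per-patient incremental cost of MP (US$), illustration only:
delta_cost = 1_327.0
print(round(icer(delta_cost, delta_qaly)))  # -> 22117 (US$/QALY)
```

A lower ICER means the new method buys each additional QALY more cheaply; health systems compare it against a willingness-to-pay threshold.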

19.

Background  

The purpose of this original article was to evaluate the impact of the Pringle maneuver on the survival of patients with colorectal liver metastases.

20.

Background

Multiple renal artery kidneys still represent a special challenge for surgeons, during both nephrectomy for organ donation and transplantation. Recognition of anatomical conditions with advanced imaging methods is one of the most important elements of the preoperative evaluation process.

Aim

The purpose of the current study was to assess if anatomical abnormalities affect the outcomes of living kidney donor transplantation procedures.

Patients and Methods

A retrospective analysis of 60 living kidney donors and their recipients was performed. Patients were assigned to 2 groups: pairs with a single allograft vessel (group I) and pairs with any anatomical abnormalities of the transplanted organ (group II). The impact of anatomical abnormalities on the initial and long-term outcomes of transplantation was analyzed.

Results

The analyzed study group consisted of 60 pairs (35 in group I and 25 in group II). Immediate graft function was observed in 65.7% vs 64% of individuals, respectively (n.s.). Mean serum creatinine concentration was 1.6, 1.46, and 1.44 mg/dL (group I) vs 1.78, 1.78, and 1.65 mg/dL (group II) at 1, 6, and 12 months posttransplant, respectively (n.s.). Glomerular filtration rate (estimated using the Chronic Kidney Disease Epidemiology Collaboration equation) was 54.3, 59.9, and 61.0 mL/min/1.73 m2 (group I) vs 59.8, 57.6, and 59.8 mL/min/1.73 m2 (group II) at the same time points, respectively (n.s.).

Conclusions

Presence of a single renal vessel was not a predictor of immediate graft function in living-donor kidney transplantation. Transplantation outcomes for kidneys with anatomical anomalies did not differ from those for organs with typical anatomy. Multiple renal arteries did not impact initial graft function provided that precise surgical technique and proper preoperative diagnostics were used.
