Similar articles (20 results)
1.

Objectives

In adults undergoing living donor liver transplantation (LDLT), the transplanted livers are partial grafts, and the portal venous pressure is higher than that observed with whole-liver grafts. In patients with concomitant splenomegaly undergoing LDLT, portal venous flow is often diverted to collateral vessels, leading to a high risk of portal vein thrombosis. In such cases, occlusion of the collateral veins is important; however, complete occlusion of all collaterals without blocking the blood flow through the splenic artery causes portal hypertension and liver failure. We aimed to examine the effect of performing a splenectomy concomitant with LDLT to reduce portal vein complications.

Methods

Between 1991 and 2017, we performed 170 LDLT operations, including 83 in adults. For this cohort study, the adult cases were divided into 2 groups: Group I comprised those who underwent LDLT without splenectomy (n = 60), and Group II comprised those who underwent LDLT with splenectomy for the reduction of portal hypertension (n = 23). We investigated the incidence rates of complications, including blood loss, lethal portal vein thrombosis (intrahepatic thrombosis), and acute rejection, among other parameters. We also investigated the survival rates in both groups.

Results

The incidence rate of lethal portal vein thrombosis in Group II was significantly lower than that observed in Group I (4.4% vs 21.7%, respectively; P = .0363). There were no statistically significant differences between the groups with respect to blood loss, survival rates, or other parameters.
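A two-group incidence comparison of this kind is typically tested with Fisher's exact test. The Python sketch below shows how such a comparison might be run with scipy; the event counts are back-calculated from the reported percentages (roughly 1 of 23 vs 13 of 60) and are an assumption for illustration, not figures taken from the study.

```python
from scipy.stats import fisher_exact

# Event counts reconstructed from the reported percentages (an assumption for
# illustration, not figures taken directly from the study):
#   Group II (LDLT with splenectomy): ~1 of 23 with lethal portal vein thrombosis
#   Group I (LDLT without splenectomy): ~13 of 60
events_ii, n_ii = 1, 23
events_i, n_i = 13, 60

table = [
    [events_ii, n_ii - events_ii],  # Group II: thrombosis / no thrombosis
    [events_i, n_i - events_i],     # Group I: thrombosis / no thrombosis
]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.3f}, two-sided P = {p_value:.4f}")
```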

Conclusion

LDLT concomitant with splenectomy might effectively reduce the occurrence of portal vein complications in adults.

2.

Background

Biliary complications are among the major donor complications during and after hepatectomy in living donor liver transplantation (LDLT). We evaluated risk factors for donor biliary complications in adult-to-adult LDLT.

Patients and Methods

From March 2002 to November 2016, 126 consecutive patients who underwent donor hepatectomy in adult-to-adult LDLT were divided into 2 groups according to biliary complications: a nonbiliary complication (non-BC) group (n = 114) and a biliary complication (BC) group (n = 12).

Results

Among 126 donor hepatectomies, 35 patients (28%) experienced perioperative complications, including 10 (7.9%) with Clavien-Dindo classification grade III. Biliary complications occurred in 12 patients (9.5%): bile leakage in 10 and intraoperative bile duct injury in 2. Additional computed tomography- and/or ultrasound-guided drainage or exchange of the original drain was required in 7 patients. In the comparison between the BC and non-BC groups, future remnant liver volume was significantly higher in the BC group than in the non-BC group (63% vs 40%; P = .02). In multivariate analysis, larger future remnant liver volume (P = .005) and shorter operating time (P = .02) were identified as independent risk factors for biliary complications. Two patients had intraoperative bile duct injury; both were successfully treated by duct-to-duct biliary anastomosis with insertion of a biliary stent or T-tube.

Conclusion

Large remnant liver volume was a significant risk factor for biliary complications, especially biliary leakage, after donor hepatectomy. For intraoperative bile duct injury, duct-to-duct anastomosis with a biliary stent is a feasible method of repair.

3.

Objective

In patients who have undergone living donor liver transplantation (LDLT), late-onset complications sometimes develop because of the long-term use of immunosuppressive drugs. One such immunosuppressive drug-related complication is de novo malignancy, which results in reduced survival.

Patients and Methods

Among 153 patients undergoing LDLT, we retrospectively reviewed the medical records of 97 adult recipients (February 2002 to May 2017) who had been followed up at our hospital for more than 1 year after LDLT. The median age was 52 years (range, 20–70), and the median observation period was 6.9 years (range, 2.4–15.3).

Results

De novo malignancy after adult LDLT developed in 11.3% (11/97) of patients, including posttransplantation lymphoproliferative disorder (PTLD) (n = 4; 2 in the brain and 2 in abdominal lymph nodes), lung cancer (n = 1), pancreatic cancer (n = 1), gastric cancer (n = 1), laryngeal cancer (n = 1), lower gingival cancer (n = 1), bladder cancer (n = 1), and melanoma (n = 1). Age at cancer diagnosis ranged from 36 to 70 years, with an average of 61 years. The interval from LDLT to cancer diagnosis was 8.3 years (3.9–12.2). Four patients (36.4%), including those with PTLD (n = 2), lung cancer (n = 1), and pancreatic cancer (n = 1), died of cancer, and all of them had been diagnosed with cancer within 10 years after LDLT. Six patients were diagnosed with cancer more than 10 years after LDLT, and all of them survived after treatment of the cancer.

Conclusion

De novo malignancy was found in 11.3% of LDLT patients, and more than half of these patients developed tumors more than 10 years after LDLT. Close long-term follow-up should be performed with all types of de novo malignancy taken into consideration.

4.

Background

The relationship between smoking cessation and weight gain is well recognized. Examining the link between smoking cessation and weight gain in donor candidates for living donor liver transplantation (LDLT) is an important topic because of the influence of weight gain on the liver. This study assessed body weight (BW) changes after smoking cessation in donor candidates for LDLT.

Methods

The 27 donor candidates were retrospectively analyzed. The smoking status was determined based on questionnaires administered at the initial presentation, and the candidates were divided into 2 groups: recent quitters and nonsmokers. The changes in BW were compared between the groups.

Results

The recent quitters group included 10 (37.0%) candidates, and the nonsmokers group included 17 (63.0%). In the nonsmokers group, 1 candidate had gained weight since the initial presentation. In contrast, in the recent quitters group, 70.0% of candidates had gained weight since the initial presentation (P < .01). The change in BW from the initial presentation was greater in recent quitters than in nonsmokers (+1.6 kg [+2.4%] vs −0.5 kg [−0.9%]; P < .01). Two candidates in the recent quitters group gained ≥5 kg [8%] of weight. One of these 2 candidates was judged ineligible for donation because of the appearance of fatty liver.

Conclusions

Weight gain due to smoking cessation was observed in donor candidates for LDLT. The amount of weight gain after smoking cessation is highly individualized, so everyone concerned with LDLT must be alert to its potential development.

5.

Background

Liver transplantation from donors after cardiac death (DCD) might increase the pool of available organs. Recently, some investigators reported the potential use of mesenchymal stem cells (MSCs) to improve the outcome of liver transplantation from DCD. The aim of this study was to evaluate the cytoprotective effects and safety of MSC transplantation on liver grafts from DCD.

Methods

Rats were divided into 4 groups (n = 5 each) as follows: (1) the heart-beating group, in which liver grafts were retrieved from heart-beating donors; (2) the DCD group, in which liver grafts were retrieved from DCD that had experienced apnea-induced agonal conditions; and (3) the MSC-1 group and (4) the MSC-2 group, in which liver grafts were retrieved as in the DCD group but were infused with MSCs (2.0 × 10⁵ or 1.0 × 10⁶, respectively). The retrieved livers were perfused with oxygenated Krebs-Henseleit bicarbonate buffer (37°C) through the portal vein for 2 hours after 6 hours of cold preservation. Perfusate, bile, and liver tissues were then investigated.

Results

Bile production in the MSC-2 group was significantly improved compared with that in the DCD group. Based on histologic findings, narrowing of the sinusoidal space in both MSC groups was improved compared with that in the DCD group.

Conclusions

MSCs could protect the function of liver grafts from warm ischemia-reperfusion injury and improve the viability of DCD liver grafts. In addition, we found that the infusion of 1.0 × 10⁶ MSCs does not obstruct the hepatic sinusoids of grafts from DCD.

6.

Objectives

This study aims to investigate postdonation outcomes of adult living donor liver transplantation donors and remnant liver regeneration in different graft types.

Methods

A total of 236 adult living donor liver transplantation donors were classified in two ways: by remnant liver volume, into donors with <35% remnant liver volume (group A; n = 56) and donors with ≥35% remnant liver volume (group B; n = 180); and by graft type, into left lobe grafts including the middle hepatic vein (MHV) (LLG group; n = 98) and right lobe grafts without the MHV (RLG group; n = 138). The 98 LLG group donors were further classified into 2 subgroups based on hepatic venous drainage patterns: MHV-dominant (n = 20) and non-MHV-dominant (n = 78). The demographic data, postoperative laboratory data, complications, graft weight, remnant liver volume, remnant liver growth rate, and remnant liver regeneration rate (RLRR) after partial liver donation were analyzed.
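The grouping criterion and the regeneration metrics are simple volumetric ratios. The Python sketch below illustrates them under explicitly assumed definitions (remnant volume as a percentage of the preoperative total liver volume, and growth of the remnant relative to its volume at donation); the study's exact formulas for the growth rate and RLRR may differ.

```python
# Illustrative volumetric ratios used to group donors and track regeneration.
# The formulas below are assumed definitions for this sketch; the study's exact
# definitions of the growth rate and RLRR may differ.

def remnant_volume_percent(remnant_ml: float, total_liver_ml: float) -> float:
    """Remnant liver volume as a percentage of the preoperative total liver volume."""
    return 100.0 * remnant_ml / total_liver_ml

def remnant_growth_rate(followup_remnant_ml: float, initial_remnant_ml: float) -> float:
    """Percentage growth of the remnant between donation and follow-up imaging."""
    return 100.0 * (followup_remnant_ml - initial_remnant_ml) / initial_remnant_ml

# Hypothetical donor: a 380 mL remnant out of a 1200 mL liver is ~31.7%,
# which would fall into group A (<35% remnant liver volume).
print(remnant_volume_percent(380.0, 1200.0))
print(remnant_growth_rate(820.0, 380.0))
```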

Results

The postoperative aspartate aminotransferase, alanine aminotransferase, and total bilirubin levels, intensive care unit stays, and hospitalization stays were higher in group A and RLG group donors. All of the donor complications in our series were minor. The postoperative complication rate was also higher in groups A and RLG but failed to reach statistical significance. There was no significant difference in RLRR between the RLG/LLG and A/B groups. However, the MHV-dominant group had a significantly lower RLRR than the non-MHV-dominant group (P < .05).

Conclusions

Small remnant liver volume donors (<35% remnant liver) have higher risks of developing postdonation minor complications. Left lobe liver donation in MHV-dominant donor candidates is a major concern.

7.

Background

Rotator cuff tears are believed to coexist with cervical spine lesions. In cases of preexisting neuropathy, such as a cervical spine lesion, fatty degeneration has likely already occurred because of the neuropathy. In such cases, rotator cuff tears are thought to occur more easily because of the preexisting extensive fatty degeneration and tendon degeneration caused by the neuropathy. This study aimed to evaluate the effects of paralysis due to neuropathy proximal to the suprascapular nerve on the supraspinatus and infraspinatus tendons using a rat model of brachial plexus paralysis.

Methods

This study included fifteen 8-week-old Sprague–Dawley rats. The left shoulder was included in the paralysis group, and the contralateral shoulder constituted the sham group. Biomechanical testing (evaluating maximum tear force, maximum displacement, and Young's modulus; n = 10) and histological analyses (using the Bonar scale; n = 5) were performed at 12 weeks postoperatively to confirm degeneration of the tendon.

Results

The mean maximum tear force was significantly lower in the paralysis group than in the sham group (P = 0.008), indicating that rotator cuff tears occurred with a lower force in the paralysis group. Additionally, the average Young's modulus was significantly greater in the paralysis group than in the sham group (P = 0.003), indicating that the rotator cuff muscle became hard and inflexible in the paralysis group. The Bonar scale scores on histological analysis were significantly higher in the paralysis group (total score = 7.04 ± 0.61) than in the sham group (total score = 0) (P < 0.0001).

Conclusions

If neuropathy proximal to the suprascapular nerve, such as a cervical spine or brachial plexus lesion, exists, weakness and degeneration of the rotator cuff tendon and stiffness of the rotator cuff muscle develop. Neuropathy is therefore likely a cause of rotator cuff tears.

8.

Background

Living donor liver transplantation (LDLT) is a definitive procedure for splenomegaly caused by liver cirrhosis and portal hypertension, but splenomegaly persists in some patients. The aim of this study was to clarify the long-term changes in the spleen volume after LDLT.

Methods

The 13 pediatric patients who survived for >8 years after LDLT were retrospectively analyzed. We calculated the spleen volume/standard spleen volume (SV/SSV) ratio by automated computed tomography (CT) volumetry. We assessed the spleen volumes before LDLT, at roughly postoperative week (POW) 4, at postoperative year (POY) 1, at POY 5, and at POY 10.
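The SV/SSV ratio is a straightforward volumetric quotient. As a rough illustration, the Python sketch below computes an organ volume from a binary CT segmentation mask (voxel count × voxel volume) and divides it by a standard spleen volume supplied by the caller; the segmentation step and the formula used to derive SSV (typically from body size) are outside the sketch and are assumptions, not the study's implementation.

```python
import numpy as np

def volume_cm3_from_mask(mask: np.ndarray, voxel_spacing_mm=(0.7, 0.7, 1.0)) -> float:
    """Organ volume in cm^3 from a binary CT segmentation mask:
    voxel count x voxel volume in mm^3, converted to cm^3."""
    voxel_mm3 = float(np.prod(voxel_spacing_mm))
    return float(mask.astype(bool).sum()) * voxel_mm3 / 1000.0

def sv_ssv_ratio(spleen_mask: np.ndarray, standard_spleen_volume_cm3: float,
                 voxel_spacing_mm=(0.7, 0.7, 1.0)) -> float:
    """Measured spleen volume divided by the patient's standard spleen volume (SSV).
    How SSV itself is derived (typically from body size) is left to the caller."""
    return volume_cm3_from_mask(spleen_mask, voxel_spacing_mm) / standard_spleen_volume_cm3

# Toy example: a fake 'spleen' of 500,000 voxels at 0.49 mm^3 each (~245 cm^3).
mask = np.zeros((200, 200, 50), dtype=bool)
mask[50:150, 50:150, 0:50] = True
print(sv_ssv_ratio(mask, standard_spleen_volume_cm3=120.0))
```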

Results

With regard to SV as evaluated by CT volumetry, there were no consistent trends, with median values as follows: before LDLT, 282.5 (71–641) cm3; POW 4, 252 (109–798) cm3; POY 1, 222.5 (97–948) cm3; POY 5, 263.5 (123–564) cm3; and POY 10, 377 (201–1080) cm3. In contrast, the SV/SSV ratio decreased chronologically as follows: before LDLT, 5.0 (0.7–6.0); POW 4, 3.7 (2.3–4.3); POY 1, 2.2 (1.7–6.3); POY 5, 1.7 (1.1–5.4); and POY 10, 1.4 (1.1–6.9). In the remote phase after LDLT, many cases showed a trend toward an improved SV/SSV ratio, but splenomegaly was prolonged without improvement in 3 cases (23.1%) with portal vein complications and advanced fibrosis. Furthermore, all 3 cases showed a decreased platelet count due to hypersplenism.

Conclusion

Splenomegaly takes a long time to improve after LDLT. In cases without improvement of splenomegaly, abnormalities in the graft liver and portal hemodynamics should be suspected.

9.

Background

Lumbar decompression surgery is often used to treat neurological symptoms of the lower extremities resulting from lumbar disease. This procedure can also improve the accompanying low back pain (LBP). We studied the extent of LBP improvement after lumbar decompression surgery without fusion and the associated preoperative factors.

Methods

Patients (n = 140) with lumbar spinal stenosis (n = 90) or lumbar disc herniation (n = 50) were included. To evaluate the change in LBP, visual analog scale (VAS) scores and Oswestry Disability Index scores were measured before surgery and at 2 weeks, 3 months, and 6 months after surgery. The predictors of residual LBP were investigated using logistic regression analyses.
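A logistic regression of this kind can be sketched in Python with statsmodels. The data below are simulated purely for illustration, with the outcome loosely tied to the three reported predictors; the variable names and effect sizes are assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 140  # same cohort size as the study, but the data below are simulated

preop_vas = rng.uniform(0, 10, n)     # preoperative VAS for LBP
scoliosis = rng.integers(0, 2, n)     # degenerative scoliosis (0/1)
cobb_angle = rng.uniform(0, 30, n)    # Cobb angle in degrees

# Simulated outcome (residual LBP: VAS > 1 at 6 months), loosely tied to the
# reported predictors; the coefficients are arbitrary illustration values.
linpred = -3.0 + 0.25 * preop_vas + 1.0 * scoliosis + 0.05 * cobb_angle
residual_lbp = rng.binomial(1, 1.0 / (1.0 + np.exp(-linpred)))

X = sm.add_constant(pd.DataFrame({
    "preop_vas": preop_vas,
    "degenerative_scoliosis": scoliosis,
    "cobb_angle_deg": cobb_angle,
}))
model = sm.Logit(residual_lbp, X).fit(disp=0)
print(model.summary())        # coefficients and P values
print(np.exp(model.params))   # odds ratios
```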

Results

In total, 140 patients were examined. The VAS scores for LBP before surgery and at 2 weeks, 3 months, and 6 months after surgery were 4.4 ± 3.0 (mean ± standard deviation), 1.1 ± 1.5, 1.3 ± 1.8, and 1.9 ± 2.2, respectively. LBP significantly improved 2 weeks after surgery (P < 0.001), stabilized between 2 weeks and 3 months after surgery, but was significantly aggravated 3–6 months after surgery (P < 0.001). At 6 months after surgery, 67 (47.9%) patients had a VAS score of >1. The predictors of residual LBP included severe preoperative LBP, degenerative scoliosis, and the size of the Cobb angle. The independent predictors, determined by multivariate analysis, were degenerative scoliosis and the size of the Cobb angle.

Conclusions

LBP was alleviated at 2 weeks after lumbar decompression surgery for lumbar disc herniation and lumbar spinal stenosis. The predictors of residual LBP after decompression included more severe LBP at baseline, degenerative scoliosis, and the size of the Cobb angle.

Level of evidence

Level 3.

10.

Background

The effectiveness of everolimus (EVR) for ABO-incompatible (ABOi) kidney transplantation is unknown. We evaluated outcomes of conversion from steroid to EVR in ABOi kidney transplant recipients.

Methods

We performed a retrospective observational cohort study of 33 consecutive de novo adult ABOi living donor kidney transplant recipients. Desensitization was performed using 0 to 4 sessions of plasmapheresis and 1 to 2 doses of 100 mg rituximab according to the anti-A/B antibody titer. ABOi recipients were administered a combination of tacrolimus, mycophenolate mofetil, and methylprednisolone. Diabetic patients were converted from methylprednisolone to EVR at 1 to 15 months post-transplantation to prevent diabetes progression. Graft outcomes, hemoglobin A1c (HbA1c) levels, and cytomegalovirus infection rates were compared between the EVR (n = 11) and steroid (n = 22) groups.

Results

Mean postoperative duration was 814 and 727 days in the EVR and steroid groups, respectively (P = .65). Between the 2 groups, the graft survival rate (100% vs 95.5%, P > .99), acute rejection rate (9.1% vs 18.2%, P = .64), and serum creatinine levels (1.46 mg/dL vs 1.68 mg/dL, P = .66) were comparable. Although HbA1c levels rose significantly in the steroid group (from 5.47% to 5.87%; P = .003), no significant deterioration was observed in the EVR group without additional insulin administration (from 6.10% to 6.47%; P = .21). The cytomegalovirus infection rate was significantly lower in the EVR group than in the steroid group (18.2% vs 63.6%, P = .026).

Conclusion

Conversion from steroid to EVR in ABOi kidney transplant recipients maintained excellent graft outcomes and avoided diabetes progression and cytomegalovirus infection.

11.

Objective

The goal of this study was to evaluate whether pretransplant serum hyaluronic acid (HA) levels can predict outcomes after adult-to-adult living donor liver transplantation (LDLT).

Methods

In study I, 21 patients who underwent LDLT (March 2002-February 2004) were divided into 2 groups: the H-I group (HA ≥500 ng/mL; n = 12) and the L-I group (HA <500 ng/mL; n = 9). The influence of pretransplantation HA levels on short-term surgical outcome was investigated. In study II, 77 LDLT patients (May 2004-December 2014) were also divided into 2 groups: the H-II group (HA ≥500 ng/mL; n = 40) and the L-II group (HA <500 ng/mL; n = 37). We compared long-term survival and investigated prognostic factors.
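Survival comparisons between two groups defined by a biomarker threshold such as this are commonly made with Kaplan-Meier estimates and a log-rank test. The Python sketch below assumes the lifelines package and uses small made-up follow-up times rather than the study's data; it only illustrates the general pattern of the analysis.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Made-up follow-up times in years (1 = death, 0 = censored); not the study's data.
high_ha_time = [0.4, 1.2, 2.0, 3.5, 5.0, 5.0]   # HA >= 500 ng/mL group
high_ha_event = [1, 1, 1, 1, 0, 0]
low_ha_time = [1.0, 2.5, 5.0, 5.0, 5.0, 5.0]    # HA < 500 ng/mL group
low_ha_event = [1, 0, 0, 0, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(high_ha_time, event_observed=high_ha_event, label="HA >= 500 ng/mL")
print(kmf.survival_function_)

result = logrank_test(high_ha_time, low_ha_time,
                      event_observed_A=high_ha_event, event_observed_B=low_ha_event)
print(f"log-rank P = {result.p_value:.3f}")
```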

Results

In study I, HA levels significantly decreased after LDLT, and those in the H-I group were significantly higher than those in the L-I group at 1, 3, 5, and 7 days after LDLT. There were significant differences in postoperative peak total bilirubin levels (H-I vs L-I, 17.2 vs 6.2 mg/dL; P = .013), peak ascitic fluid volume (1327 vs 697 mL/d; P = .005), and hepatocyte growth factor levels at 3 days after LDLT (1879 vs 1092 pg/mL; P = .03). In study II, the 1- and 5-year survival rates were significantly lower in the H-II group than in the L-II group (H-II vs L-II, 65.0% and 48.5% vs 86.5% and 80.8%; P = .004). In multivariate analysis, the significant prognostic factors were preoperative HA ≥500 ng/mL (P = .004) and graft-to-recipient body weight ratio <0.8 (P = .042).

Conclusions

Preoperative HA level can be a prognostic risk factor. Patients with high HA levels are vulnerable and should be carefully managed after LDLT.

12.

Background

The purposes of this study were to quantitatively analyze osteophyte formation of the distal radius following scaphoid nonunion and to investigate how fracture locations relate to osteophyte formation patterns.

Methods

Three-dimensional surface models of the scaphoid and distal radius were constructed from computed tomographic images of both wrists of 17 patients with scaphoid nonunion. The scaphoid nonunions were classified into 3 types according to the location of the fracture line: distal extra-articular (n = 6), distal intra-articular (n = 5), and proximal (n = 6). The osteophyte models of the radius were created by subtracting the mirror image of the contralateral radius model from the affected radius model using a Boolean operation. The osteophyte locations on the radius were divided into 5 areas: styloid process, dorsal scaphoid fossa, volar scaphoid fossa, dorsal lunate fossa, and volar lunate fossa. Osteophyte volumes were compared among the areas and types of nonunion. The presence or absence of dorsal intercalated segment instability (DISI) deformity was also determined.
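The core of the osteophyte-model construction is a Boolean subtraction of the mirrored healthy radius from the affected radius. The study works with triangulated surface models; the Python sketch below illustrates the same idea on binary voxel masks with NumPy (a simplification and an assumption, not the authors' pipeline), including the voxel-count volume measurement used for comparison.

```python
import numpy as np

def osteophyte_mask(affected: np.ndarray, mirrored_contralateral: np.ndarray) -> np.ndarray:
    """Boolean subtraction on binary voxel masks: bone present in the affected
    radius but absent in the mirrored, registered contralateral radius is kept
    as the 'osteophyte' region."""
    return np.logical_and(affected.astype(bool),
                          np.logical_not(mirrored_contralateral.astype(bool)))

def mask_volume_mm3(mask: np.ndarray, voxel_spacing_mm=(0.5, 0.5, 0.5)) -> float:
    """Osteophyte volume as voxel count x voxel volume."""
    return float(mask.sum()) * float(np.prod(voxel_spacing_mm))

# Toy example: the affected model has extra voxels not present in the mirror image.
affected = np.zeros((60, 60, 60), dtype=bool)
affected[10:50, 10:50, 10:50] = True
mirrored = np.zeros_like(affected)
mirrored[10:50, 10:50, 10:45] = True
print(mask_volume_mm3(osteophyte_mask(affected, mirrored)))  # volume of the difference
```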

Results

The distal intra-articular type exhibited significantly larger osteophytes in the styloid process than the distal extra-articular type. Furthermore, the proximal type exhibited significantly larger osteophytes in the dorsal scaphoid fossa than the distal extra-articular type. Finally, the distal intra- and extra-articular types were more frequently associated with DISI deformity and tended to have larger osteophytes in the lunate fossa than the proximal type.

Conclusion

The pattern of osteophyte formation in the distal radius determined using three-dimensional computed tomography imaging varied among the different types of scaphoid nonunion (distal extra-articular, distal intra-articular, and proximal). The results of this study are clinically useful in determining whether additional resection of osteophytes or the radial styloid is necessary during treatment of scaphoid nonunion.

13.

Background

The main challenge with cytomegalovirus (CMV) prophylaxis in IgG donor-positive/recipient-negative (D+/R−) kidney transplant recipients is late-onset CMV disease. We evaluated a novel protocol for the prevention of late-onset CMV infection and disease in D+/R− organ recipients.

Methods

Our prospective, observational cohort study included 100 adult kidney transplant recipients. Prophylaxis with low-dose valganciclovir (450 mg/d, 3 times a week for 6 months) was administered to D+/R− recipients. Risk factors for CMV infection and disease were identified. Renal function and the outcomes of CMV infection and disease were compared between D+/R− (n = 15) and recipient-positive (R+; n = 81) organ recipients.

Results

D+/R− recipients showed significant independent risk factors with high hazard ratios for CMV infection (2.04) and disease (10.3). The proportion of CMV infection in D+/R− and R+ recipients was 80% and 46% (P = .023), respectively, and that of CMV disease was 33% and 6.2% (P = .008), respectively. D+/R− recipients developed CMV infection and disease within 6 months after transplantation. However, both CMV infection-free and disease-free survival rates beyond 1 year post-transplantation (defined as late onset) remained stable in D+/R− recipients. Moreover, serum creatinine levels at 1 year post-transplantation were comparable between D+/R− and R+ recipients (1.45 ± 0.71 vs 1.16 ± 0.35 mg/dL, P = .26).

Conclusion

Our novel protocol prevented late-onset CMV infection and disease beyond 1 year post-transplantation in D+/R− recipients.

14.

Background

This study aimed to determine the appropriate administration duration of edoxaban 15 mg (a factor Xa inhibitor) for the prevention of deep vein thrombosis (DVT) after total knee arthroplasty (TKA).

Methods

Our study comprised 202 patients who underwent TKA (excluding bilateral TKA) at our institution between 2014 and 2015. The subjects received edoxaban 15 mg daily for 1 (n = 93) or 2 (n = 109) weeks; group assignment was random. B-mode ultrasonography was performed 7 and 14 days post-TKA for the detection of DVT. We compared the incidence of DVT between the groups and examined for side effects.

Results

The demographic data of the patients in the 1- and 2-week administration groups were similar at baseline. DVT incidence did not differ significantly between the groups at 1 week post-TKA. However, it was significantly lower in the 2-week administration group (n = 0) than in the 1-week administration group (n = 7; P = 0.004) at 2 weeks post-TKA. Neither group exhibited symptomatic DVT. A total of six patients withdrew during the study period because of hepatic dysfunction.

Conclusions

Our results show that the administration of edoxaban 15 mg is more effective in preventing DVT after TKA when administered for 2 weeks than for 1 week.

15.

Background

It is unclear whether simultaneous surgery for posterior ankle impingement syndrome (PAIS) and concomitant ankle disorders, such as anterior ankle impingement syndrome (AAIS), lateral ankle instability (LAI), and osteochondral lesion of the talus (OLT), allows for early return to athletic activity.

Methods

Ninety-seven patients who engaged in athletic activity (mean age 27 [range 18–43] years) and were treated by a hindfoot endoscopic approach for PAIS alone or simultaneously for PAIS and concomitant ankle disorders were included in this study. The patients were divided into four groups: PAIS alone (group A, n = 61), PAIS with AAIS (group B, n = 8), PAIS with LAI with or without AAIS (group C, n = 20), and PAIS with OLT with or without AAIS/LAI (group D, n = 8). In all patients, the concomitant ankle disorder was treated simultaneously by arthroscopic debridement for AAIS, bone marrow stimulation or autologous cancellous bone transplantation for OLT, and anterior talofibular ligament repair or reconstruction for LAI. American Orthopedic Foot and Ankle Society (AOFAS) Ankle–Hindfoot Scale scores before and 2 years after surgery and times from surgery to resuming training and athletic activity were compared between the groups.

Results

Mean AOFAS score improved significantly after surgery in all groups (groups A and C, P < .0001; groups B and D, P < .05). The time taken to return to training was significantly longer in group D than in groups A, B, and C (all P < .01), as was the time taken to return to athletic activity in groups C and D when compared with group A (P < .01); however, there were no significant differences in this regard between groups B and C.

Conclusion

Concomitant surgery for AAIS and LAI with PAIS did not delay the postoperative start of training; however, concomitant surgery for LAI and OLT delayed the return to athletic activity when compared with PAIS surgery alone.

Study design

Clinical Retrospective Comparative Study.

16.

Background

There have been no prospective studies comparing anterior and posterior surgical methods in terms of long-term outcomes. The purpose of this study is to clarify whether there is any difference in the long-term clinical and radiologic outcomes of anterior decompression with fusion (ADF) and laminoplasty (LAMP) for the treatment of cervical spondylotic myelopathy (CSM).

Methods

Ninety-five patients were prospectively treated with ADF or LAMP for CSM in our hospital from 1996 through 2003. On alternate years, patients were enrolled to receive ADF (1997, 1999, 2001, and 2003: ADF group, n = 45) or LAMP (1996, 1998, 2000, and 2002: LAMP group, n = 50). We excluded 19 patients who died during follow-up, and 25 who were lost to follow-up. Clinical outcomes were evaluated by the recovery rate of the Japanese Orthopaedic Association (JOA) score between the two groups. Sagittal alignment of the C2–7 lordotic angle and range of motion (ROM) in flexion and extension on plain X-ray were measured.

Results

Mean age at the time of surgery was 58.3 years in the ADF group and 57.9 years in the LAMP group. Mean preoperative JOA score was 10.0 and 10.5, respectively. Mean recovery rate of the JOA score at 3–5 years postoperatively was significantly higher in the ADF group (p < 0.05). Reoperation was required in 1 patient for pseudarthrosis and in 1 patient for recurrence of myelopathy in the ADF group; no patient in the LAMP group underwent a second surgery. There was a significant difference in maintenance of the lordotic angle in the ADF group compared with the LAMP group (p < 0.05), but not in ROM.

Conclusions

Both ADF and LAMP provided similarly good outcomes at the 10-year time point, whereas ADF achieved more satisfactory outcomes and better sagittal alignment at the mid-term. However, the incidences of reoperation and complications in the ADF group were higher than those in the LAMP group.

Study design

A prospective comparative study (not randomized).

17.

Background

In patients eligible for organ transplantation, the Kidney Disease Improving Global Outcomes (KDIGO) guidelines specifically recommend avoiding red blood cell transfusions (RBCT) when possible to minimize the risk of allosensitization.

Objective

To assess the effect of perioperative RBCT on outcomes in living-related kidney transplantation (LRKT) recipients.

Methods

We retrospectively assessed 97 patients who underwent LRKT and whose data were evaluable at our institution between March 2009 and May 2016. We measured serum creatinine levels and calculated the estimated glomerular filtration rate (eGFR) at 3 months, 6 months, and 1 year after kidney transplantation (KTx). We evaluated the rejection rate within a year after KTx. We compared the renal function and rejection rate between those who received blood transfusions (n = 21) and those who did not (n = 76) during the perioperative period.

Results

Among patient characteristics, the rate of ABO-incompatible KTx and the mean hemoglobin levels before KTx differed significantly between the groups. The serum creatinine levels and eGFR within 1 year after KTx did not differ significantly between the two groups. The rejection rate in those who received blood transfusions and those who did not was 28.6% (6/21 patients) and 25.0% (19/76 patients) (P = .741), respectively.

Conclusions

We found that the rejection rate was slightly higher in patients who received perioperative RBCT than in those who did not, but the difference was not significant within a year after KTx. Perioperative RBCT may not affect renal function within a year after KTx.

18.

Objective

The aim of this study is to determine whether post-transarterial chemoembolization imaging (computed tomography or magnetic resonance imaging) can accurately predict tumor necrosis on pathologic specimens.

Background

Transarterial chemoembolization with drug-eluting beads has been proven to be an effective way to bridge patients with hepatocellular carcinomas to liver transplantation.

Materials and methods

From September 2012 to June 2017, 59 patients with a total of 78 hepatocellular carcinomas who received transarterial chemoembolization with drug-eluting beads before liver transplantation in Kaohsiung Chang Gung Memorial Hospital were included in the study. All patients and hepatocellular carcinomas had pre- and post-transarterial chemoembolization images (computed tomography or magnetic resonance imaging) and pathological findings available for correlation. Tumor response was evaluated according to the modified Response Evaluation Criteria in Solid Tumors. The ranges of necrotic percentage were 100%, 91%-99%, 51%-90%, and <50%.
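A simple helper can map a pathologic necrosis percentage onto the four reported ranges. The Python sketch below only illustrates that bucketing; the reported ranges leave small gaps (for example, exactly 50% or 90.5%), so the boundary handling here is an assumption rather than the study's rule.

```python
def necrosis_category(necrosis_percent: float) -> str:
    """Map a pathologic necrosis percentage onto the ranges reported in the study.
    Values falling in the small gaps between the reported ranges (e.g., exactly
    50% or 90.5%) are grouped with the adjacent lower category here, which is an
    assumption for illustration."""
    if necrosis_percent >= 100:
        return "100%"
    if necrosis_percent >= 91:
        return "91-99%"
    if necrosis_percent > 50:
        return "51-90%"
    return "<50%"

for value in (100, 95, 70, 30):
    print(value, "->", necrosis_category(value))
```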

Results

The accuracy rate of the imaging-pathology correlation was 40% for computed tomography and 42% for magnetic resonance imaging. The recurrence rate was 11.5% in the complete response group, 16.0% in the partial response group, and 28.6% in the stationary group.

Conclusion

The sensitivity of computed tomography and magnetic resonance imaging is not satisfactory for microscopic evaluation of residual tumors after transarterial chemoembolization with drug-eluting beads. However, survival after liver transplantation was good regardless of the microscopic findings.

19.

Background

The frequency of renal transplants from elderly living donors has increased because of a shortage of donors. However, the results of renal transplantation using aged kidney grafts have yet to be determined conclusively.

Methods

We evaluated 45 patients who underwent living donor kidney transplantation at our institution. The patients were categorized according to donor age at the time of the transplant: ≥60 years (elderly donor group, n = 21) and <60 years (young donor group, n = 24). We reviewed the renal function of the recipients and pathologic findings of the graft, including interstitial fibrosis score, tubular atrophy score, tubular atrophy and interstitial fibrosis grades, and arteriosclerosis, up to 2 years posttransplantation.

Results

Significant differences were observed in the preoperative creatinine clearance of the donor, prevalence of hypertension in the donor, and age of the recipient. Serum creatinine levels in the elderly donor group were significantly higher from 2 months to 1 year posttransplantation, and the estimated glomerular filtration rate was significantly lower from 7 days to 1 year posttransplantation. However, the decline in estimated glomerular filtration rate from 14 days to up to 2 years posttransplantation was similar in the 2 groups. There was no significant difference in the renal biopsy findings between the 2 groups except for arteriosclerosis 1 year posttransplantation.

Conclusion

Kidney grafts from elderly living donors were not associated with a deterioration in renal function, and their pathologic findings were comparable with those of young donors for up to 2 years posttransplantation.

20.
Background

Preceding solo kidney transplantation for type 1 diabetes with end-stage renal failure is controversial because pancreatic graft survival is lower in pancreas transplantation after kidney transplantation (PAK) than in simultaneous pancreas and kidney transplantation (SPK).

Methods

To study the effectiveness of preceding solo kidney transplantation for type 1 diabetes with end-stage renal failure, a comparative retrospective analysis was performed between SPK (n = 232) and PAK (n = 39) procedures performed up to December 2016.

Results

At 1, 3, and 5 years, pancreatic graft survival was 87.5%, 86.4%, and 82.8%, respectively, in SPK and 87.1%, 65.0%, and 49.1%, respectively, in PAK, indicating poorer long-term graft survival in PAK than in SPK. Because 10 of 16 pancreatic graft losses (62.5%) in PAK involved rejection, roughly 3 times the proportion seen in SPK, control of rejection is very important; rejection episodes were reduced by rabbit antithymocyte globulin induction, resulting in improved graft survival. Five-year patient survival was 88.0% in SPK and 96.6% in PAK.

Conclusion

Considering patient survival, preceding solo kidney transplantation for type 1 diabetes with end-stage renal failure should be performed if a donor is available.
