Similar Documents
20 similar documents found.
1.
Transplantation Proceedings, 2019, 51(5): 1555-1558
Objectives: To compare mini-incision donor nephrectomy (MDN) with laparoscopic donor nephrectomy (LDN) performed by the same surgical team, with respect to short- and long-term outcomes.
Methods: A total of 305 patients who underwent donor nephrectomy at our institution were compared: 141 underwent MDN (January 1998 to November 2011) and 164 underwent LDN (June 2010 to December 2017).
Results: Mean operative time for MDN (120 ± 29 minutes) did not differ significantly from LDN (113 ± 34 minutes), but comparing the first 50 LDNs with the 50 most recent showed a reduction in procedure duration. Laparoscopic donors had a shorter warm ischemia time (229 vs 310 seconds, P = .01), particularly the 50 most recent, a shorter hospital stay (4.3 vs 5.9 days, P < .001), and fewer postoperative complications (P = .03). The incidence of graft acute tubular necrosis (ATN) was higher with MDN (89% vs 25%, P < .001), although first-year serum creatinine (SCr) and glomerular filtration rate (GFR) did not differ significantly between the 2 groups (SCr 1.38 vs 1.33 mg/dL; GFR 63.7 vs 63.1 mL/min). Long-term graft survival did not differ significantly between groups, and there was no relationship between postoperative ATN events and long-term graft function.
Conclusions: With the growing experience of high-volume centers and specialized teams, LDN can be considered the most suitable technique for living donor nephrectomy, with better short-term results (warm ischemia time, hospital stay, and postoperative complications) and no difference in long-term outcomes.

2.
Objective: We report our experience with laparoscopic donor nephrectomy (LDN) compared with open donor nephrectomy (ODN), and identify prognostic factors associated with adverse outcomes in LDN.
Methods: From January 2000 to December 2009, 243 consecutive live-donor nephrectomies were performed, including 129 LDNs and 114 ODNs. We compared patient demographics, perioperative outcomes, and recipient graft function between groups. Prognostic factors for adverse outcomes in LDN were investigated using uni- and multivariate analyses.
Results: Patient demographics, except mean donor age (P = .032), were similar between groups. Mean operative time (219 vs 163 minutes; P < .001) and warm ischemia time (WIT; 3.1 vs 1.7 minutes; P < .001) were significantly longer for LDN. Conversely, mean analgesic requirement (9.2 vs 14.7 mg morphine; P < .001) and postoperative hospital stay (6.5 vs 7.1 days; P = .003) were significantly lower with LDN. Mean estimated blood loss (EBL) was slightly lower with LDN (P = .15). There were 7 conversions from LDN to ODN. Complication rates were similar between the groups (P = .38). Delayed graft function (10.9% vs 1.7%; P = .016) and mean serum creatinine level at 1 month (1.47 vs 1.3 mg/dL; P = .04) were higher for LDN. However, 5-year allograft survival was not inferior with LDN (90% vs 85%; P = .42). Mean operative time (268 to 175 minutes; P < .001), EBL (316 to 66 mL; P < .001), and complication incidence (8 to 0 cases; P < .002) gradually decreased from the initial 43 cases to the last 43 cases of LDN. Among surgeons who had performed at least 30 LDNs, mean operative time and WIT were 197 minutes and 2.8 minutes, respectively.
Conclusions: Based on our evidence, LDN was a feasible and safe surgical option for live-donor nephrectomy, even in a small-volume center. Better results can be achieved after a learning curve for both the surgeon and the institution.

3.
Background: Induction therapy improves graft outcomes in kidney transplant recipients (KTRs). We aimed to compare the incidences of antibody-mediated rejection (AMR) and acute cellular rejection (ACR), as well as graft and patient outcomes, in KTRs who underwent induction with alemtuzumab versus rabbit antithymocyte globulin (r-ATG).
Methods: This was a single-center retrospective study of patients who underwent kidney transplantation between January 2009 and December 2011 after receiving induction therapy with either alemtuzumab or r-ATG. Maintenance immunosuppression included tacrolimus and mycophenolate mofetil (MMF) with early steroid withdrawal. Acute rejection was diagnosed by allograft biopsy.
Results: Among the 108 study patients, 68 received alemtuzumab and 40 received r-ATG. The incidence of AMR was significantly higher (15% vs 2.5%; P = .008) and the incidence of ACR similar (4.4% vs 10%; P = .69) for the alemtuzumab versus r-ATG groups. One-year serum creatinine levels (1.68 ± 0.8 vs 1.79 ± 1.8 mg/dL; P = .66) as well as graft (91.1 ± 3.5% vs 94.5 ± 3.8%; P = .48) and patient (93.8 ± 3.0% vs 96.4 ± 3.5%; P = .92) survivals were similar between groups.
Conclusion: Our study showed a higher incidence of AMR and a similar incidence of ACR in KTRs who underwent induction with alemtuzumab compared with those who received r-ATG and were maintained on tacrolimus and MMF, despite a lower HLA mismatch in the alemtuzumab group. One-year graft survival, patient survival, and allograft function were similar. Inadequate B-cell suppression by alemtuzumab, as well as altered phenotypic and functional properties of repopulating B cells, could contribute to the heightened risk of AMR in these patients.

4.
Background: The diagnostic threshold for mild autonomous cortisol secretion on low-dose overnight dexamethasone suppression testing (DST) is widely recognized as a serum cortisol ≥1.8 mcg/dL. The degree to which these patients require postoperative glucocorticoid replacement is unknown.
Methods: We reviewed adult patients with corticotropin (ACTH)-independent hypercortisolism who underwent unilateral laparoscopic adrenalectomy for benign disease with a DST result ≥1.8 mcg/dL at our institution from 1996 to 2018. Patients with a DST of 1.8 to 5 mcg/dL were compared with those with a DST >5 mcg/dL.
Results: We compared 68 patients with a preoperative DST of 1.8 to 5 mcg/dL to 53 patients with a preoperative DST >5 mcg/dL. Preoperative serum ACTH (mean 10.0 vs 9.2 pg/mL), adenoma size (mean 3.4 vs 3.5 cm), and side of adrenalectomy (37% and 47% right) were similar between groups (P > .05 each). Patients with a DST of 1.8 to 5 mcg/dL were older (58 ± 11 vs 52 ± 16 years; P = .01), less likely to be female (63% vs 81%; P = .03), had greater body mass indexes (33.1 ± 8.4 vs 29.1 ± 5.6; P = .01), and had lower 24-hour preoperative urine cortisol excretion (32.6 ± 26.7 vs 76.1 ± 129.4 mcg; P = .03). Postoperative serum cortisol levels were compared in 22 patients with a DST of 1.8 to 5 mcg/dL and 14 patients with a DST >5 mcg/dL. Those with a DST of 1.8 to 5 mcg/dL had greater postoperative serum cortisol levels (8.0 ± 5.7 vs 5.0 ± 2.6 mcg/dL; P = .03), were less likely to be discharged on glucocorticoid replacement (59% vs 89%; P = .003), and had a shorter duration of treatment (4.4 ± 3.8 vs 10.7 ± 18.0 months; P = .04).
Conclusion: Assessment of early postoperative adrenal function in patients with mild autonomous cortisol secretion is necessary to minimize unnecessary glucocorticoid replacement.

5.
Transplantation Proceedings, 2019, 51(9): 2868-2872
Background: The gap between organ availability and patients on the waiting list for deceased donor kidney transplants has led to wide use of extended criteria donors (ECDs). We aimed to compare the surgical outcomes of single kidney transplantation (KT) performed at our institute with standard criteria donor (SCD) versus ECD grafts, according to the Organ Procurement and Transplantation Network definition.
Patients and Methods: We retrospectively analyzed 115 adult recipients of KT performed at our institute from January 2016 to July 2018, with kidney grafts procured from adult donors after brain or circulatory death. Between the 2 recipient groups, we compared the incidence of early graft loss, delayed graft function, hospitalization, and surgical complications. Time to early graft loss was evaluated with Kaplan-Meier estimators and curves; the hypothesis of no difference in time to graft loss between the 2 groups was tested using the log-rank statistic.
Results: Of the 103 deceased donor kidney transplants during the study period, 129 grafts were used after the regional network sharing allocation. ECDs more frequently had a greater body mass index than SCDs (SCD 25.2 ± 3.9 vs ECD 27.7 ± 5.0; P = .005) and type 2 diabetes mellitus (0% vs 18%; P = .002). KT recipients who received an ECD graft (73, 63.5%) were older (59.8 ± 9.8 vs 45.2 ± 15.4 years; P < .001) and presented a higher rate of delayed graft function (56% vs 24%; P = .001). Post-transplant graft loss did not differ between the 2 groups.
Conclusion: Based on clinical experience in a single transplant center, ECD use for KT is crucial in facing the organ shortage and does not impair post-transplant outcomes.

6.
Introduction: Cardiovascular disease (CVD) is the leading cause of mortality in chronic kidney disease (CKD) patients. Fibroblast growth factor-23 (FGF-23) is associated with atherosclerosis and cardiovascular mortality in CKD patients and healthy subjects, but data in renal transplant recipients (RTR) are scarce. We aimed to determine factors associated with FGF-23 and to explore its relationship to atherosclerosis.
Methods: Forty-six patients and 44 controls were included. FGF-23 was measured from plasma. Carotid intima-media thickness (CIMT) was evaluated ultrasonographically.
Results: Patients had higher waist circumference (WC; 92.2 ± 14.9 vs 85.3 ± 11.0 cm; P < .05), glucose (99.8 ± 17.2 vs 90.3 ± 6.5 mg/dL; P < .01), creatinine (1.43 ± 0.6 vs 0.86 ± 0.1 mg/dL; P < .01), triglycerides (160.4 ± 58.9 vs 135.6 ± 59.8 mg/dL; P < .05), white blood cell count (WBC; 7938.6 ± 2105.2 vs 6715.7 ± 1807.5 WBC/mm3; P < .01), ferritin (217.0 ± 255.8 vs 108.3 ± 142.4 ng/mL; P < .05), uric acid (6.5 ± 1.6 vs 4.7 ± 1.3 mg/dL; P < .01), C-reactive protein (CRP; 8.2 ± 18.2 vs 5.3 ± 7.9 mg/L; P < .01), parathyroid hormone (PTH; 89.7 ± 59.2 vs 44.1 ± 16.7 pg/mL; P < .01), and alkaline phosphatase (ALP; 162.5 ± 86.6 vs 74.2 ± 21.9 U/L; P < .01). FGF-23 was higher in patients (11.7 ± 7.2 vs 9.6 ± 6.8 pg/mL; P < .05). CIMT was similar (0.58 ± 0.09 vs 0.57 ± 0.1 mm; P > .05). WC, creatinine, and uric acid were positively correlated with FGF-23, whereas albumin showed a negative correlation. On multivariate analysis, only creatinine and uric acid were determinants of FGF-23.
Conclusion: FGF-23 levels are associated with uric acid in RTR. Larger studies are needed to confirm this finding.

7.
Background: Kidney transplantation is the treatment of choice for patients with end-stage renal disease. In recent years, donor criteria have changed to increase the percentage of expanded-criteria donors (ECDs). The aim of this study was to analyze transplants from ECDs performed at our institution from 2010 to 2012. We studied ECD comorbidity, the preimplantation histologic study, renal function, and survival of transplanted grafts.
Patients and Methods: Eighty ECDs (160 kidneys) were analyzed. Forty-nine grafts were not implanted owing to macroscopic lesions (37 kidneys) or histologic findings on preimplantation biopsy (12 kidneys). Finally, 60 grafts from ECDs were implanted in our center. We analyzed graft characteristics (kidney function, creatinine clearance) and compared the data with a control group of allografts from standard-criteria donors (n = 14).
Results: The median age of the ECD group was 72 years (range, 65-77). No differences were found between the ECDs whose kidneys were or were not implanted in hypertension, diabetes, creatinine at the time of donation, or proteinuria. However, there were differences in donor age (75 vs 67 years; P = .043), preimplantation biopsy score (6.8 ± 1.3 vs 4.8 ± 1.1; P = .041), and the percentage with cardiovascular disease (62.5% vs 43%; P = .038). Comparison of ECD and non-ECD grafts showed a lower creatinine clearance at 1 year (50 ± 0.5 vs 69 ± 9.6 mL/min; P < .001) and 2 years (50 ± 0.7 vs 67 ± 7.4 mL/min; P < .001) after transplantation. There were no differences in delayed graft function or graft survival between the 2 groups at 2 years after transplantation (95% vs 100%; P = .38).
Conclusions: We found no differences in graft survival from ECDs compared with the control group of standard-criteria donors. The evaluation of grafts from ECDs may be a strategy to increase the number of kidney transplants.

8.
With over 80,000 patients in the United States awaiting kidney transplantation, renal transplant surgery continues to evolve, with attractive surgical options for living donation that include laparoscopic donor nephrectomy (LDN) and robotic-assisted laparoscopic donor nephrectomy (RALDN). LDN is currently accepted as the gold-standard procedure for living donor nephrectomy; RALDN is an evolving technique and may emerge as a preferred procedure over time. We present our initial experience with RALDN from December 2007 to August 2008. Thirty-five patients who underwent RALDN were retrospectively analyzed and compared with 35 age- and time (year)-matched patients who underwent LDN. The parameters analyzed were length of hospital stay (3.2 ± 0.9 days, P = 0.59), estimated blood loss (146 ± 363 mL, P = 0.36), operating time (149 ± 44 min, P = 0.23), cold ischemic time (135 ± 202 min, P = 0.19), preoperative creatinine (0.82 ± 0.26 mg/dL, P = 0.46), and postoperative creatinine (1.44 ± 1.03 mg/dL, P = 0.20). There was no statistical difference between RALDN patients with a single renal artery (n = 27) and those with more than one renal artery (n = 8). There was one serious complication requiring conversion to open laparotomy to control a bleeding renal artery stump following extraction of the kidney. One-year graft survival for the 35 recipients of RALDN kidneys was 97.1%. RALDN is feasible and compares favorably to the standard LDN procedure, with good graft survival. Robotic-assisted transplant surgery is an emerging technique with potential benefits to both surgeon and patient.

9.
Background and Aims: Hyperglycemia, a major side effect in patients receiving total parenteral nutrition (PN), is associated with higher mortality in critically ill patients. The aim of this study was to determine whether elevated blood glucose levels are associated with worse outcomes in patients receiving PN.
Methods: This retrospective study included postoperative patients admitted to our surgical intensive care unit (SICU) from July 2008 to June 2009. Data collected included blood glucose levels, length of stay, and outcome measures. Correlations among daily average, maximum, and minimum blood glucose levels and outcome measures were calculated.
Results: Sixty-nine patients were enrolled and divided into PN (n = 40) and non-PN (n = 29) groups. The initial mean blood glucose levels were 138.4 ± 63.1 mg/dL and 123.2 ± 41.8 mg/dL for the PN and non-PN groups, respectively. The mean blood glucose concentration increased significantly in the PN group (ΔBS = 44.8 ± 57.3 mg/dL; p < 0.001) compared with the non-PN group (ΔBS = 39.4 ± 67.0 mg/dL; p = 0.004). Blood glucose concentration increased significantly, and consequently insulin consumption increased, on the 2nd day of ICU admission. When the daily maximum blood glucose level was >250 mg/dL, the risk of mortality increased by a factor of 1.3 (OR = 1.30, 95% CI = 1.07-1.59, p = 0.010) for each 10 mg/dL increase in blood glucose level. There were no deaths in the current study when blood glucose levels were controlled below 180 mg/dL. The mean blood glucose level in patients receiving PN was higher in those with diabetes than in those without (215.5 ± 42.8 vs 165.8 ± 42.0 mg/dL; p = 0.001).
Conclusion: Blood glucose level was associated with patient outcome and should be intensively monitored in critically ill surgical patients. We suggest that blood glucose levels be controlled below 180 mg/dL in postoperative critically ill patients.
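The dose-response claim above (OR = 1.30 per 10 mg/dL when the daily maximum glucose exceeds 250 mg/dL) compounds multiplicatively on the odds scale. A minimal sketch, assuming only the reported odds ratio; the function and the example increase are illustrative, not study code:

```python
# Hedged illustration of how a per-unit odds ratio compounds.
# The abstract reports OR = 1.30 per 10 mg/dL increase in daily
# maximum blood glucose (when > 250 mg/dL).
OR_PER_10_MGDL = 1.30

def mortality_odds_multiplier(glucose_increase_mgdl: float) -> float:
    """Multiplier applied to the odds of mortality for a given glucose rise."""
    return OR_PER_10_MGDL ** (glucose_increase_mgdl / 10.0)

# Example: a 50 mg/dL rise multiplies the odds by about 1.30^5 ~= 3.7.
print(round(mortality_odds_multiplier(50.0), 2))
```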

10.
Background: Alternate methods for characterizing the oral glucose tolerance test (OGTT) have emerged as superior to the 2-hour glucose for identifying individuals at risk for type 2 diabetes. The significance of these methods in cystic fibrosis (CF) is unclear. We compared 3 OGTT classifications in youth with CF: (1) curve shape (biphasic vs monophasic); (2) time to glucose peak (≤30 minutes vs >30 minutes); and (3) 1-hour glucose (1hG) <155 mg/dL vs ≥155 mg/dL, against traditional OGTT criteria, to determine which best identifies lower oral disposition index (oDI), pulmonary function, and body mass index (BMI).
Methods: Youth 10-18 years of age with CF, not on insulin, underwent a 2-hour OGTT. Glucose responses were classified by traditional criteria and the 3 alternate methods as normal (biphasic curve, glucose peak ≤30 minutes, and/or 1hG <155 mg/dL) or abnormal (monophasic curve, glucose peak >30 minutes, and/or 1hG ≥155 mg/dL). oDI was calculated as [1/fasting insulin × (ΔInsulin0-30 min/ΔGlucose0-30 min)]. Mean oDI, BMI, forced expiratory volume in 1 second (FEV1), and forced vital capacity (FVC) were compared by OGTT classification.
Results: Fifty-two youth with CF participated (mean ± SD age 13 ± 4 years; 37% male; BMI z-score 0.0 ± 0.8; FEV1 88 ± 16.3%; FVC 97 ± 14.8%). Late time to peak glucose and 1hG ≥155 mg/dL identified individuals with lower oDI (p = 0.01); traditional OGTT criteria for prediabetes did not. No OGTT classification identified individuals with worse BMI or pulmonary function. oDI was not associated with BMI, FEV1, or FVC.
Conclusions: Alternate OGTT measures, including time to peak glucose and 1hG, identify oDI abnormalities better than traditional criteria. Further studies are required to determine whether these alternate methods identify individuals with CF at risk for future clinical decline.
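The oDI formula in the Methods maps directly to code. A minimal sketch of the calculation as defined in the abstract; the example values are illustrative, not study data:

```python
def oral_disposition_index(fasting_insulin: float, insulin_0: float,
                           insulin_30: float, glucose_0: float,
                           glucose_30: float) -> float:
    """oDI = (1 / fasting insulin) * (delta-insulin 0-30 min / delta-glucose
    0-30 min), per the abstract's definition. Units follow the source assay
    (insulin in uIU/mL, glucose in mg/dL)."""
    delta_insulin = insulin_30 - insulin_0
    delta_glucose = glucose_30 - glucose_0
    return (1.0 / fasting_insulin) * (delta_insulin / delta_glucose)

# Illustrative OGTT values only (not study data):
print(oral_disposition_index(fasting_insulin=8.0, insulin_0=8.0,
                             insulin_30=60.0, glucose_0=90.0,
                             glucose_30=150.0))
```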

11.
The aim of this study was to provide a systematic review and meta-analysis of reports comparing laparoendoscopic single-site (LESS) living-donor nephrectomy (LDN) vs standard laparoscopic LDN (LLDN). A systematic review of the literature was performed in September 2013 using the PubMed, Scopus, Ovid, and Cochrane Library databases. Article selection proceeded according to a search strategy based on Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria. Weighted mean differences (WMDs) were used to measure continuous variables and odds ratios (ORs) to measure categorical ones. Nine publications meeting the eligibility criteria were identified, including 461 LESS LDN and 1006 LLDN cases. There were more left-side cases in the LESS LDN group (96.5% vs 88.6%, P < .001). Meta-analysis of extractable data showed that LLDN had a shorter operative time (WMD 15.06 min, 95% confidence interval [CI] 4.9-25.1; P = .003), without a significant difference in warm ischaemia time (WMD 0.41 min, 95% CI −0.02 to 0.84; P = .06). Estimated blood loss was lower for LESS LDN (WMD −22.09 mL, 95% CI −29.5 to −14.6; P < .001); however, this difference was not clinically significant. There was a greater likelihood of conversion for LESS LDN (OR 13.21, 95% CI 4.65-37.53; P < .001). Hospital stay was similar (WMD −0.11 days, 95% CI −0.33 to 0.12; P = .35), as was the visual analogue pain score at discharge (WMD −0.31, 95% CI −0.96 to 0.35; P = .36), but the analgesic requirement was lower for LESS LDN (WMD −2.58 mg, 95% CI −5.01 to −0.15; P = .04). Moreover, there was no difference in the postoperative complication rate (OR 1.00, 95% CI 0.65-1.54; P = .99). Recipient renal function, based on creatinine levels at 1 month, was similar between groups (WMD 0.10 mg/dL, 95% CI −0.09 to 0.29; P = .29). In conclusion, LESS LDN represents an emerging option for living kidney donation. This procedure offers surgical and early functional outcomes comparable to conventional LLDN, with a lower analgesic requirement. However, it is more technically challenging than LLDN, as shown by a greater likelihood of conversion. The role of LESS LDN remains to be defined.
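Pooling a WMD across studies, as done here, is typically an inverse-variance weighted average of per-study mean differences. A hedged sketch under that standard fixed-effect assumption; the abstract does not state the exact model, and the study inputs below are placeholders, not the reviewed data:

```python
import math

def pooled_wmd(studies):
    """Fixed-effect inverse-variance pooling.
    studies: list of (mean_a, sd_a, n_a, mean_b, sd_b, n_b) tuples."""
    num = den = 0.0
    for mean_a, sd_a, n_a, mean_b, sd_b, n_b in studies:
        diff = mean_a - mean_b                  # per-study mean difference
        var = sd_a**2 / n_a + sd_b**2 / n_b     # variance of that difference
        weight = 1.0 / var                      # inverse-variance weight
        num += weight * diff
        den += weight
    wmd = num / den
    se = math.sqrt(1.0 / den)
    return wmd, (wmd - 1.96 * se, wmd + 1.96 * se)  # estimate, 95% CI

# Placeholder operative-time data (mean, SD, n) for two fake studies:
print(pooled_wmd([(220, 40, 50, 205, 35, 60),
                  (190, 30, 45, 178, 28, 55)]))
```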

12.
Purpose: To retrospectively assess the ability of the direct bilirubin serum level to predict mortality and complications in patients undergoing transarterial chemoembolization (TACE) for hepatocellular carcinoma (HCC), and to compare it with the predictive value of the currently used total bilirubin serum level.
Materials and Methods: A total of 219 patients who underwent TACE for 353 HCCs at a single institution were included. There were 165 men and 54 women, with a mean age of 61.4 ± 7.6 (SD) years [range: 27-86 years]. The patients' electronic medical records were evaluated, and patients were divided into cohorts based on total bilirubin (<2, 2-3, and >3 mg/dL) as well as direct bilirubin (<1 and 1-2 mg/dL).
Results: The direct bilirubin serum level was significantly greater in patients who did not survive than in those who survived to 6 months (0.58 ± 0.46 [SD] mg/dL; range: <0.1-1.8 mg/dL vs 0.40 ± 0.31 [SD] mg/dL; range: <0.1-1.6 mg/dL; P = 0.04) and to 12 months (0.49 ± 0.38 [SD] mg/dL; range: <0.1-1.8 mg/dL vs 0.38 ± 0.32 [SD] mg/dL; range: <0.1-1.6 mg/dL; P = 0.03). While the total bilirubin serum level did not differ significantly between those who did not and did survive 6 months (1.54 ± 0.99 [SD] mg/dL; range: 0.3-3.9 mg/dL vs 1.27 ± 0.70 [SD] mg/dL; range: 0.3-3.75 mg/dL; P = 0.16), it did differ for 12-month survival (1.46 ± 0.87 [SD] mg/dL; range: 0.3-3.9 mg/dL vs 1.22 ± 0.65 [SD] mg/dL; range: 0.3-3.9 mg/dL; P = 0.03). Akaike information criterion (AIC) analysis revealed that the direct bilirubin level more accurately predicted overall survival (AIC = 941.19 vs 1000.51) and complications (AIC = 352.22 vs 357.42) than the total bilirubin serum level.
Conclusion: The direct bilirubin serum level appears to outperform the total bilirubin concentration for predicting complications and overall survival in patients undergoing TACE. Patients with relatively preserved direct bilirubin levels should be considered for TACE, particularly in the setting of bridging to transplant.
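The AIC comparison above follows AIC = 2k − 2 ln L, with the lower value preferred. A small sketch; the parameter count and log-likelihoods are placeholders chosen only to roughly reproduce the reported AICs, not values from the study:

```python
# Hedged sketch of the model-comparison logic: the model with the lower
# AIC (here, the direct bilirubin predictor) is preferred.
def aic(k_params: int, log_likelihood: float) -> float:
    return 2 * k_params - 2 * log_likelihood

aic_direct = aic(k_params=2, log_likelihood=-468.6)  # ~941.2, near the reported 941.19
aic_total = aic(k_params=2, log_likelihood=-498.3)   # ~1000.6, near the reported 1000.51
print("prefer direct bilirubin" if aic_direct < aic_total else "prefer total bilirubin")
```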

13.
Objective: It has been suggested that a greater number of outflow targets improves the patency and outcomes of bypass grafts. Our objective was to examine this in a multicenter contemporary series of axillary-to-femoral artery grafts.
Methods: The Vascular Quality Initiative database was queried for all axillary-unifemoral (AxUF) and axillary-bifemoral (AxBF) bypass grafts performed between 2010 and 2017 for claudication, rest pain, and tissue loss. Patients with acute limb ischemia were excluded. Patient demographics and comorbidities, as well as operative details and outcomes, were recorded. Univariable, multivariable, and Kaplan-Meier analyses were used to assess long-term outcomes.
Results: There were 412 (32.9%) AxUF grafts and 839 (67.1%) AxBF grafts identified. Overall, the mean age of the patients was 68.3 years, 51.1% were male, and 84.7% were white. Compared with AxBF grafts, AxUF grafts were more often performed for urgent cases; in patients who were younger, male, nonambulatory, and diabetic; and in those with preoperative anticoagulation, critical limb ischemia, prior bypass, aneurysm repair, peripheral vascular intervention, or major amputation (P < .05 for all). There were no significant differences between AxUF and AxBF grafts in perioperative wound complications (4.2% vs 2.9%; P = .23), cardiac complications (7.3% vs 10.4%; P = .08), pulmonary complications (4.1% vs 6%; P = .18), early stenosis/occlusion (0.2% vs 0.8%; P = .22), perioperative mortality (2.9% vs 3.2%; P = .77), and length of stay (6.4 ± 5.6 vs 6.7 ± 8 days; P = .29). Mean estimated blood loss (268.1 vs 348.6 mL; P < .001) and mean operative time (201 vs 224.1 minutes; P < .001) were significantly lower for AxUF grafts. Kaplan-Meier analysis showed that AxUF and AxBF grafts had similar freedom from graft occlusion (62.6% vs 71.8%; P = .074), major adverse limb event-free survival (57.1% vs 66.6%; P = .052), and survival (86% vs 86%; P = .897) at 1 year. Major amputation-free survival was lower for AxUF grafts (63.7% vs 73%; P = .028). Multivariable analysis also showed that the type of graft configuration did not independently predict occlusion/death (hazard ratio [HR], 1.06; 95% confidence interval [CI], 0.77-1.46; P = .72), amputation/death (HR, 1.12; 95% CI, 0.83-1.51; P = .45), major adverse limb event/death (HR, 0.97; 95% CI, 0.73-1.3; P = .85), or mortality (HR, 0.91; 95% CI, 0.65-1.26; P = .55). Three-year survival after placement of AxUF and AxBF grafts was similar (75.1% vs 78.2%; P = .414).
Conclusions: AxUF and AxBF grafts have similar perioperative and 1-year outcomes. Graft patency did not differ significantly between AxBF and AxUF grafts at 1 year. Overall, patients treated with these reconstructions have many comorbidities and low long-term survival.
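Kaplan-Meier estimation with a log-rank comparison, as used in this study, can be reproduced with the Python lifelines package (an assumption; the authors do not name their software). The durations and event indicators below are synthetic placeholders, not VQI data:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
# Synthetic months-to-occlusion and event flags (True = occluded).
t_axuf = rng.exponential(30.0, 100)
t_axbf = rng.exponential(36.0, 200)
e_axuf = rng.random(100) < 0.5
e_axbf = rng.random(200) < 0.4

kmf = KaplanMeierFitter()
kmf.fit(t_axuf, event_observed=e_axuf, label="AxUF")
freedom_12mo = kmf.predict(12)  # estimated freedom from occlusion at 12 months

res = logrank_test(t_axuf, t_axbf,
                   event_observed_A=e_axuf, event_observed_B=e_axbf)
print(f"AxUF 12-month freedom from occlusion: {freedom_12mo:.2f}, "
      f"log-rank P = {res.p_value:.3f}")
```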

14.
Transplantation Proceedings, 2021, 53(7): 2180-2187
Background: Postmortem organ donor rates remain low in Germany, while donor age has increased considerably in recent decades. As a consequence of low donation rates, older and more marginal donor kidneys are accepted for transplantation. Kidneys procured from very old and/or marginal donors may be considered unsuitable for transplantation as a single organ and subsequently be discarded. Dual transplantation of both kidneys from such donors, however, may provide an opportunity to use these organs for renal transplantation nonetheless, providing twice the nephron mass of a single kidney transplantation.
Methods: In this retrospective analysis, we compared the outcomes of 10 recipients of dual kidney transplantation (DKT) with 40 matched recipients of single kidney transplantation (SKT). Recipients were matched for donor and recipient age (ie, a maximum age difference of ±10 years, in a ratio of 1:4 for DKT vs SKT recipients). In addition, a second control group of 10 SKT recipients, each transplanted immediately before a DKT recipient with a kidney from a donor aged ≥65 years, was used for comparison. All renal transplant recipients were followed for up to 3 years or until July 31, 2020.
Results: Mean donor and recipient ages were 77.2 ± 4.6/75.1 ± 6.6/82.1 ± 7.9 years and 66.4 ± 5.8/66.1 ± 6.0/64.8 ± 8.4 years for SKT group 1/SKT group 2/DKT, respectively. Procurement serum creatinine concentrations were significantly higher in the DKT group than in SKT control group 1 (P = .019), as was the rate of transplant artery atherosclerosis (P = .021). Furthermore, the Kidney Donor Profile Index and Kidney Donor Risk Index were significantly higher in the DKT group than in SKT groups 1 and 2 (P = .0138/P = .064 and P < .001/P = .038). Rates of acute rejection and delayed graft function were not significantly different between groups, though biopsy-proven acute rejection was numerically higher in the SKT groups. Patient survival and overall and death-censored graft survival rates were also not significantly different between groups, although they tended to be higher after DKT.
Conclusions: DKT provides an opportunity to successfully use postmortem kidneys for renal transplantation even from donors aged >80 years with a Kidney Donor Profile Index ≥95%. DKT may thereby increase the available donor pool to better serve patients with end-stage renal disease on the waiting list.
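The 1:4 matching described in the Methods (±10 years on both donor and recipient age) can be implemented greedily. A hypothetical sketch, not the authors' actual matching procedure; the field names and the greedy strategy are assumptions for illustration:

```python
def match_1_to_4(dkt_cases, skt_pool):
    """Greedily pick up to 4 unused SKT controls per DKT case.
    Each case/control is a dict with 'donor_age' and 'recipient_age' keys."""
    used = set()
    matches = {}
    for i, case in enumerate(dkt_cases):
        controls = []
        for j, ctrl in enumerate(skt_pool):
            if j in used:
                continue
            # Both donor and recipient ages must differ by at most 10 years.
            if (abs(case["donor_age"] - ctrl["donor_age"]) <= 10 and
                    abs(case["recipient_age"] - ctrl["recipient_age"]) <= 10):
                controls.append(j)
                used.add(j)
                if len(controls) == 4:
                    break
        matches[i] = controls
    return matches

# Usage with tiny illustrative records:
dkt = [{"donor_age": 82, "recipient_age": 65}]
skt = [{"donor_age": 78, "recipient_age": 66}, {"donor_age": 60, "recipient_age": 40}]
print(match_1_to_4(dkt, skt))  # {0: [0]} -- only the first control qualifies
```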

15.
Objectives: To assess the effect of DuraGraft (Somahlution Inc, Jupiter, Fla), an intraoperative graft treatment, on saphenous vein grafts in patients undergoing isolated coronary artery bypass grafting.
Methods: Within each patient, 2 saphenous vein grafts were randomized to DuraGraft or heparinized saline. Multidetector computed tomography angiography at 1, 3, and 12 months assessed change in wall thickness (primary end point at 3 months), lumen diameter, and maximum narrowing for the whole graft and the proximal 5-cm segment. Safety end points included graft occlusion, death, myocardial infarction, and repeat revascularization.
Results: At 3 months, no significant differences were observed between DuraGraft- and saline-treated grafts (125 each) in wall thickness, lumen diameter, or maximum narrowing. At 12 months, DuraGraft-treated grafts demonstrated smaller mean wall thickness overall (0.12 ± 0.06 vs 0.20 ± 0.31 mm; P = .02) and in the proximal segment (0.11 ± 0.03 vs 0.21 ± 0.33 mm; P = .01). Changes in wall thickness were greater in the proximal segment of saline-treated grafts (0.09 ± 0.29 vs 0.00 ± 0.03 mm; P = .04). The increase in maximum graft narrowing was larger in the proximal segment of saline-treated grafts (4.7% ± 12.7% vs 0.2% ± 3.8%; P = .01). Nine DuraGraft and 11 saline grafts occluded or thrombosed. One myocardial infarction was associated with a saline graft occlusion. No deaths or revascularizations were observed.
Conclusions: DuraGraft demonstrated a favorable effect on wall thickness at 12 months, particularly in the proximal segment. Longer-term follow-up in larger studies is needed to evaluate the effect on clinical outcomes.

16.
Journal of Vascular Surgery, 2020, 71(6): 2056-2064
Objective: Limited data exist comparing atherectomy (At) with balloon angioplasty for infrapopliteal peripheral arterial disease. The objective of this study was to compare the outcomes of infrapopliteal At with angioplasty versus angioplasty alone in patients with critical limb ischemia.
Methods: This is a retrospective, single-center, longitudinal study comparing patients who underwent either infrapopliteal At with angioplasty or angioplasty alone for critical limb ischemia between January 2014 and October 2017. The primary outcome was the primary patency rate. Secondary outcomes were reintervention rates, assisted primary patency, secondary patency, major adverse cardiac events, major adverse limb events, amputation-free survival, overall survival, and wound healing rates. Data were analyzed in multivariate generalized linear models, with log-rank tests to determine survival in Kaplan-Meier curves.
Results: There were 342 infrapopliteal interventions performed on 290 patients: 183 percutaneous balloon angioplasties (PTA; 54%) and 159 atherectomies (At) with PTA (46%). Mean age was 67 ± 12 years, and 61% of the patients were male. The PTA and At/PTA groups had similar demographics, tissue loss (79% vs 84%; P = .26), ischemic rest pain (21% vs 16%; P = .51), mean follow-up (19 ± 9 vs 20 ± 9 months; P = .32), mean number of vessels treated (1.7 ± 0.8 vs 1.9 ± 0.8; P = .08), and mean lesion length treated (6.55 ± 5.00 vs 6.02 ± 4.00 cm; P = .08), respectively. Similar 3-month (96 ± 1% vs 94 ± 1%), 6-month (85 ± 2% vs 86 ± 3%), 12-month (68 ± 3% vs 69 ± 4%), and 18-month (57 ± 4% vs 62 ± 4%) primary patency rates were seen in the two groups (P = .87). At/PTA patients had significantly higher reintervention rates than PTA patients (28% vs 16%; P = .02). Similar assisted primary patency rates (67 ± 4% vs 69 ± 4%; P = .78) and secondary patency rates (61 ± 4% vs 66 ± 4%; P = .98) were seen in the PTA and At/PTA groups at 18 months. The 30-day major adverse cardiac event rates (3% vs 2%; P = .13) and 30-day major adverse limb event rates (5% vs 4%; P = .2) were similar in both groups. Wound healing rates (72 ± 3% vs 75 ± 2%; P = .12), 1-year amputation-free survival (68 ± 4.1% vs 70 ± 2%; P = .5), and 1-year overall survival (76 ± 4% vs 78 ± 4%; P = .39) did not differ between the PTA and At/PTA groups. The At/PTA group had higher local complication rates (7 [4%] vs 1 [0.5%]; P = .03).
Conclusions: At with angioplasty provides patency rates similar to angioplasty alone for infrapopliteal peripheral arterial disease but is associated with higher reintervention and local complication rates. Further appropriately designed studies are required to determine the exact role of At in this subset of patients.

17.
Transplantation Proceedings, 2021, 53(7): 2238-2241
Background: The purpose of this study was to identify factors influencing changes in the body mass index (BMI) of kidney transplant (KT) patients and to provide data for managing BMI in patients who have undergone KT.
Methods: The participants were 106 patients who underwent KT at a single center from August 2014 to June 2017. BMIs at 6 months and 24 months after KT were compared and analyzed, and details were collected through medical records. Analysis was performed between 2 groups, one with increased BMI and the other without. Multivariate logistic regression analysis was performed to identify factors related to an increase in BMI.
Results: BMI increased from 22.60 ± 2.72 kg/m2 at 6 months to 23.18 ± 3.06 kg/m2 at 2 years after KT. The group with increased BMI (n = 39) included more patients with high low-density lipoprotein (LDL) cholesterol levels at the time of KT (LDL cholesterol ≥100 mg/dL; 34 [54.0%] vs 10 [26.3%]; P = .008) and more patients without statin use than the other group (n = 67) (statin use, 48 [70.6%] vs 34 [87.2%]; P = .044). Multiple logistic regression analysis showed that age >50 years (odds ratio [OR] = 2.942; 95% confidence interval [CI], 1.075-8.055; P = .036), LDL >100 mg/dL at KT (OR = 6.618; 95% CI, 2.225-19.682; P = .001), and no statin use (OR = 5.094; 95% CI, 1.449-17.911; P = .011) were risk factors for an increased BMI after KT.
Conclusions: To prevent an increase in BMI after KT, clinicians should strongly recommend drugs to treat hyperlipidemia, especially in elderly patients with high LDL levels before KT.
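A multivariate logistic regression of this form can be sketched with statsmodels (an assumption; the authors' software is not stated). The data below are synthetic: the generating coefficients are the logs of the reported odds ratios, so the fitted, exponentiated coefficients should land roughly near OR ≈ 2.9, 6.6, and 5.1:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 106  # same cohort size as the study, but fake binary predictors
df = pd.DataFrame({
    "age_gt_50": rng.integers(0, 2, n),
    "ldl_ge_100": rng.integers(0, 2, n),
    "no_statin": rng.integers(0, 2, n),
})
# Linear predictor using ln(2.942), ln(6.618), ln(5.094) as true effects.
lin = (-1.5 + np.log(2.942) * df.age_gt_50
       + np.log(6.618) * df.ldl_ge_100
       + np.log(5.094) * df.no_statin)
df["bmi_increased"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

X = sm.add_constant(df[["age_gt_50", "ldl_ge_100", "no_statin"]])
fit = sm.Logit(df["bmi_increased"], X).fit(disp=0)
print(np.exp(fit.params))  # exponentiated coefficients approximate the ORs
```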

18.
Background: Adherent perinephric fat (APF) is a known risk factor for surgical difficulty during laparoscopic donor nephrectomy (LDN). The Mayo Adhesive Probability (MAP) score predicts APF accurately. The aim of this study was to identify the association between MAP score and operative time in LDN.
Methods: We retrospectively evaluated 154 kidney donors who underwent surgery from December 2017 to December 2019 at İstanbul Aydın University Hospital and İstinye University Hospital. All operations were performed by 3 senior surgeons using a fully laparoscopic method. The MAP score was derived from computed tomography scans by 1 blinded reader. Demographic data, body mass index (BMI), MAP score, side selection, estimated glomerular filtration rate (eGFR), number of arteries and veins, operative time, hospital stay, and complications were recorded. Single- and multiple-variable analyses were used to evaluate the correlation between operative time and MAP score, BMI, side selection, and number of vascular structures.
Results: A total of 154 patients (79 men, 75 women) with a mean age of 44.4 ± 12.72 years were included in this study. No case was converted to open nephrectomy, and there were no major complications. Mean BMI was 27.59 ± 4.32 kg/m2, mean MAP score was 0.69 ± 1.15, and mean operative time was 40.25 ± 9.81 minutes. Although mean BMI was higher in women (28.19 ± 4.52 vs 27.03 ± 4.07; P < .05), their mean MAP score was lower than in men (0.35 ± 0.86 vs 1.03 ± 1.29; P < .001). Older age, higher BMI, higher MAP score, and the presence of multiple renal arteries were associated with longer operative time in LDN. The MAP score was associated with older age, male sex, and higher BMI.
Conclusions: This study showed that several risk factors can affect operative time in LDN. The MAP score was significantly associated with longer operative time, especially in men, so it can be useful for predicting surgical difficulty in kidney donors.
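The multiple-variable analysis of operative time described above corresponds to an ordinary least-squares regression on the candidate predictors. A hedged sketch with synthetic placeholder data; statsmodels is an assumption, not the authors' stated software, and the coefficients below are invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 154  # same cohort size as the study, but fake predictor values
df = pd.DataFrame({
    "map_score": rng.integers(0, 6, n),          # MAP score ranges 0-5
    "bmi": rng.normal(27.6, 4.3, n),             # matches reported mean/SD
    "age": rng.normal(44.4, 12.7, n),
    "n_arteries": rng.integers(1, 4, n),
})
# Invented linear relationship plus noise, for demonstration only.
df["op_time"] = (25 + 2.5 * df.map_score + 0.3 * df.bmi
                 + 0.1 * df.age + 2.0 * df.n_arteries
                 + rng.normal(0, 5, n))

fit = smf.ols("op_time ~ map_score + bmi + age + n_arteries", data=df).fit()
print(fit.params)  # coefficients mirror the kind of associations reported
```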

19.
Background: Borderline changes suspicious for acute T-cell-mediated rejection (BC) are frequently seen in biopsy specimens, but their clinical significance and management remain controversial. Our goal was to compare clinical outcomes of kidney transplant recipients with biopsy-proven BC versus acute T-cell-mediated rejection (aTCMR), and to assess the influence of treating BC on graft outcomes.
Methods: A retrospective cohort study was performed on all kidney transplant recipients with biopsy-proven BC or aTCMR between January 2012 and December 2018, according to Banff 2017 criteria; patients with concomitant antibody-mediated rejection were excluded.
Results: We included 85 patients, 30 with BC (35.3%) and 55 with aTCMR (64.7%). There was no difference between groups in demographics, HLA matching and sensitization, immunosuppression, or time since transplant. Treatment with steroids was started in 15 patients with BC (50%) and in all patients with aTCMR, with 4 of the latter additionally receiving thymoglobulin (7.2%). At 1 year post biopsy, overall graft survival was 71%, and despite a better estimated glomerular filtration rate (eGFR) at biopsy (33.3 ± 23.4 vs 19.9 ± 13.2 mL/min/1.73 m2; P = .008), patients in the BC group had the same graft survival as the aTCMR group on Kaplan-Meier survival curves. Within the BC group (n = 30), comparing treated patients (n = 15) with those managed conservatively (n = 15), graft survival at 1 year was 87% for treated and 73% for nontreated patients (P = .651), with no difference in eGFR among patients with a functioning graft. At longer follow-up, however, survival curves showed a trend toward better graft survival in treated patients (70.2 ± 9.2 vs 38.4 ± 8.4 months; P = .087).
Conclusion: Our study showed that patients with BC did not have better graft survival or graft function at 1 year after biopsy or at follow-up compared with the aTCMR group, despite a better eGFR at diagnosis. We found a trend toward better graft survival in patients with BC treated with steroids compared with a conservative approach. These results reinforce the importance of borderline changes for graft outcomes and show that the decision to treat can influence long-term outcomes.

20.
Background: This study sought to determine the total amount of time committed to planned and unplanned episodes of care related to primary, unilateral total joint arthroplasty (TJA) in the context of growth in outpatient TJA.
Methods: All primary, unilateral TJA procedures performed over a 7-year period by a single surgeon at a single institution were retrospectively reviewed. Time dedicated to planned work was calculated over each episode of care, from surgery scheduling to 90 days postoperatively. All telephone inquiries and readmissions involving the surgeon's direct input over the episode of care constituted time dedicated to unplanned work.
Results: Between 2012 and 2018, as the proportion of outpatient TJAs increased, the average planned episode-of-care time per patient decreased from 412 to 361 minutes. Despite a 108% increase in the total number of outpatient TJAs between 2017 and 2018 (51/432 [11.8%] to 106/555 [19.1%]; P = .002), neither the average number of unplanned telephone inquiries (4.6 ± 3.8 vs 4.2 ± 3.7; P = .124) nor the mean time per patient required to respond to calls (23.1 ± 19.4 vs 21.2 ± 18 minutes; P = .135) differed. Between 2017 and 2018, the average total episode-of-care time per patient decreased from 403 minutes (376 planned + 27 unplanned) to 387 minutes (361 planned + 26 unplanned).
Conclusion: Despite an increase in outpatient TJA, the average time required for planned and unplanned patient care remained relatively constant. The growth of outpatient TJA nationally should not trigger a change in Centers for Medicare and Medicaid Services benchmarks.
