Similar Documents
1.

Background

Surface electromyography is a noninvasive technique for detecting the activity of skeletal muscles, in particular the respiratory muscles, namely the diaphragm and rectus abdominis. This study compares these muscles in healthy individuals, patients with liver disease, and patients after abdominal surgery.

Objective

To study muscle activity of the right diaphragm and right rectus abdominis by surface electromyography (root mean square, RMS), and muscle strength by manovacuometry (maximal inspiratory pressure, MIP; maximal expiratory pressure, MEP).

Results

We evaluated 246 subjects divided into 3 groups: healthy (65), liver disease (171), and post-surgery (10). In the liver disease group, BMI was significantly higher in patients with ascites (P = .001). In the post-surgery group, RMS of the rectus abdominis (P = .0001) and RMS of the diaphragm (P = .030) were increased, and inspiratory and expiratory pressures were decreased (P = .0001). A multivariate analysis showed a tendency for increased BMI in the liver disease and post-surgery groups to correlate with increased RMS of the rectus abdominis and lower MIP/MEP (P = .11). The receiver operating characteristic curve showed that the rectus abdominis RMS was capable of discriminating liver disease and post-surgery patients from healthy subjects (area = 0.63; 95% CI 0.549–0.725).

Conclusion

As observed by surface electromyography and muscle strength testing, muscle activity in normal individuals is lower than in subjects with muscle deficits, because less effort is necessary to overcome the same resistance.
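As a side note on the RMS measure used in this abstract, the following is a minimal, hypothetical Python sketch of how the root mean square of a surface EMG signal can be computed over a moving window. The sampling rate, window length, and synthetic data are illustrative assumptions, not parameters reported in the study, and this is not the authors' processing pipeline.

```python
import numpy as np

def emg_rms(signal, fs=1000, window_ms=250):
    """Moving-window root mean square of a surface EMG signal.

    fs (sampling rate, Hz) and window_ms are illustrative assumptions,
    not values reported in the study.
    """
    window = int(fs * window_ms / 1000)
    squared = np.asarray(signal, dtype=float) ** 2
    # moving average of the squared signal, then square root
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(squared, kernel, mode="valid"))

# Example on a synthetic 2-second EMG-like trace (arbitrary amplitude)
rng = np.random.default_rng(0)
fake_emg = rng.normal(0, 0.1, size=2000)
print(emg_rms(fake_emg).mean())
```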

2.
Introduction: Muscle dysfunction is one of the most extensively studied manifestations of COPD. Metabolic changes in muscle are difficult to study in vivo, due to the lack of non-invasive techniques. Our aim was to evaluate metabolic activity simultaneously in various muscle groups in COPD patients. Methods: Thirty-nine COPD patients and 21 controls with normal lung function, due to undergo computed axial and positron emission tomography for staging of localized lung lesions, were included. After administration of 18-fluorodeoxyglucose, images of 2 respiratory muscles (costal and crural diaphragm, and rectus abdominis) and 2 peripheral muscles (brachial biceps and quadriceps) were obtained, using the standard uptake value as the glucose metabolism index. Results: Standard uptake value was higher in both portions of the diaphragm than in the other muscles of all subjects. Moreover, the crural diaphragm and rectus abdominis showed greater activity in COPD patients than in the controls (1.8 ± 0.7 vs 1.4 ± 0.8, and 0.78 ± 0.2 vs 0.58 ± 0.1, respectively; P < .05). A similar trend was observed with the quadriceps. In COPD patients, uptake in the two respiratory muscles and the quadriceps correlated directly with air trapping (r = 0.388, 0.427, and 0.361, respectively; P < .05). Conclusions: There is greater glucose uptake and metabolism in the human diaphragm compared to other muscles when the subject is at rest. Increased glucose metabolism in the respiratory muscles (with a similar trend in the quadriceps) of COPD patients is confirmed quantitatively, and is directly related to the mechanical loads confronted.

3.
Background: Bioelectrical impedance analysis is a simple, noninvasive method of assessing body composition. Dialysis modality and selection of buffer type may have an impact on body composition. The aim of our study was to compare the body compositions of patients on the waiting list for cadaveric renal transplantation according to dialysis modality. Methods: We examined a total of 152 patients (110 hemodialysis [HD] and 42 continuous ambulatory peritoneal dialysis [CAPD]). Demographic data were collected from patient charts. Routine laboratory evaluations from the last 6 months, including hemoglobin, serum creatinine, intact parathyroid hormone, albumin, C-reactive protein, calcium, and phosphorus, were collected. Body compositions were measured using the Tanita BC-420MA Body Composition Analyzer (Tanita, Tokyo, Japan). We performed a subanalysis of the CAPD group according to buffer choice: lactate-buffered (n = 16) and bicarbonate/lactate–buffered (n = 26) solution users. Results: Body weight (P = .022), body mass index (BMI; 25.8 ± 4.7 vs 23.4 ± 4.9 kg/m2, P = .009), muscle mass (P = .01), fat-free mass (P = .013), and visceral fat ratio (9.5 ± 5.4 vs 7.3 ± 4.1%, P = .022) were significantly higher in the CAPD group. Total body water of CAPD patients was also higher (P = .003), but total body water ratios of the HD and CAPD groups were similar. Fat and fat-free mass ratios of the patient groups were also similar. Comparing CAPD subgroups, we observed that patients using bicarbonate/lactate–buffered solutions had higher body weights (P = .038), BMI values (27.1 ± 5 vs 23.7 ± 3.5 kg/m2, P = .018), and visceral fat ratios (8.0 ± 5.2 vs 4.6 ± 2.5%, P = .023). These patients also tended to have higher fat mass, without statistical significance (P = .074). Fat, muscle, fat-free mass, and total body water ratios of the peritoneal dialysis subgroups were similar. Conclusion: We believe that body composition analysis should be used as a complementary method for assessing the nutritional status of PD and CAPD patients, as body weight and BMI measurements do not reflect fat mass, muscle mass, or visceral fat ratio in these patients. Stable, well-nourished CAPD patients should be closely observed and encouraged to increase daily exercise and/or decrease calorie intake from other sources to reduce the risks associated with abdominal obesity.

4.
Background: This study compared (1) perioperative outcomes, (2) postoperative complications, and (3) reoperation rates after primary total hip arthroplasty (THA) between short stature patients and matched control patients. Methods: A review of primary THA patients from 2012 to 2017 using an institutional database was conducted. This yielded 12,850 patients, of which 108 were shorter than 148 cm. These patients were matched 1:1 by age (P = .527), gender (P = .664), and body mass index (P = .240) to controls. The final study population with minimum 1-year follow-up included for analysis comprised 47 patients in the short stature cohort and 57 patients in the control cohort. The following outcomes/complications were compared: operative times, lengths of stay (LOSs), intraoperative fractures, minor complications, 90-day readmissions, and revisions. Results: Operative times were significantly longer in the short stature cohort than in the matched control cohort (133 ± 65 minutes vs 104 ± 30 minutes, P = .005). In addition, hospital LOS was slightly longer in the short stature group than in the matched control group (3.2 ± 1.5 days vs 2.6 ± 1.0 days, P = .017). Rates of intraoperative fractures (P = 1.000), minor complications (P = .406), 90-day readmissions (P = .500), and revisions (P = .202) were similar between the short stature and control cohorts. Conclusion: Patients with disproportionately short stature had longer operative times and slightly longer LOS. However, complication and readmission rates were similar. Future studies with larger sample sizes are warranted to confirm these findings and further evaluate implant survivorship in this unique THA patient population.

5.
《Journal of vascular surgery》2020,71(6):2056-2064
Objective: Limited data exist comparing atherectomy (At) with balloon angioplasty for infrapopliteal peripheral arterial disease. The objective of this study was to compare the outcomes of infrapopliteal At with angioplasty vs angioplasty alone in patients with critical limb ischemia. Methods: This is a retrospective, single-center, longitudinal study comparing patients undergoing either infrapopliteal At with angioplasty or angioplasty alone for critical limb ischemia between January 2014 and October 2017. The primary outcome was primary patency rate. Secondary outcomes were reintervention rates, assisted primary patency, secondary patency, major adverse cardiac events, major adverse limb events, amputation-free survival, overall survival, and wound healing rates. Data were analyzed in multivariate generalized linear models, with log-rank tests to determine survival in Kaplan-Meier curves. Results: There were 342 infrapopliteal interventions, 183 percutaneous balloon angioplasties (PTA; 54%) and 159 atherectomies with PTA (At/PTA; 46%), performed on 290 patients with a mean age of 67 ± 12 years; 61% of the patients were male. The PTA and At/PTA groups had similar demographics, tissue loss (79% vs 84%; P = .26), ischemic rest pain (21% vs 16%; P = .51), mean follow-up (19 ± 9 vs 20 ± 9 months; P = .32), mean number of vessels treated (1.7 ± 0.8 vs 1.9 ± 0.8; P = .08), and mean lesion length treated (6.55 ± 5.00 cm vs 6.02 ± 4.00 cm; P = .08), respectively. Similar 3-month (96 ± 1% vs 94 ± 1%), 6-month (85 ± 2% vs 86 ± 3%), 12-month (68 ± 3% vs 69 ± 4%), and 18-month (57 ± 4% vs 62 ± 4%) primary patency rates were seen in the two groups (P = .87). At/PTA patients had significantly higher reintervention rates than PTA patients (28% vs 16%; P = .02). Similar assisted primary patency rates (67 ± 4% vs 69 ± 4%; P = .78) and secondary patency rates (61 ± 4% vs 66 ± 4%; P = .98) were seen in the PTA and At/PTA groups at 18 months. The 30-day major adverse cardiac event rates (3% vs 2%; P = .13) and 30-day major adverse limb event rates (5% vs 4%; P = .2) were similar in both groups. Wound healing rates (72 ± 3% vs 75 ± 2%; P = .12), 1-year amputation-free survival (68 ± 4.1% vs 70 ± 2%; P = .5), and 1-year overall survival (76 ± 4% vs 78 ± 4%; P = .39) did not differ between the PTA and At/PTA groups. The At/PTA group had higher local complication rates (7 [4%] vs 1 [0.5%]; P = .03). Conclusions: At with angioplasty provides similar patency rates compared with angioplasty alone for infrapopliteal peripheral arterial disease, but is associated with higher reintervention and local complication rates. Further appropriately designed studies are required to determine the exact role of At in this subset of patients.

6.
Background: Patients with end-stage renal disease (ESRD) experience erectile dysfunction (ED). Although it is a benign disorder, ED is related to physical and psychosocial health, and it has a significant impact on quality of life (QOL). The objective of the present study was to investigate the effects of different renal replacement therapies on ED. Methods: A total of 100 ESRD patients and 50 healthy men were recruited to the present cross-sectional study. The study consisted of 53 renal transplantation (RT) patients (group I; mean age, 39.01 ± 7.68 years; mean duration of follow-up, 97.72 ± 10.35 months) and 47 hemodialysis (HD) patients (group II; mean age, 38.72 ± 9.12 years; mean duration of follow-up, 89.13 ± 8.65 months). The control group consisted of 50 healthy men (group III; mean age, 39.77 ± 8.51 years). Demographic data and laboratory values were obtained. All groups were evaluated with the following scales: International Index of Erectile Function (IIEF)-5 and Short Form (SF)-36 questionnaires, and the Beck Depression Inventory (BDI). Patients with an IIEF score ≤21 were accepted as having ED. Results: The mean ages of the groups were similar (P > .05). Total IIEF-5 scores of men in groups I, II, and III were 19.5 ± 4.5, 16.4 ± 5.9, and 22.5 ± 3.4, respectively. The mean total IIEF-5 score of the control group was higher than those of groups I and II (P < .001). The posttransplant group mean total IIEF-5 score was also higher than that of the HD group (P < .05). Groups I and II significantly differed from the control group in terms of presence of ED (IIEF score ≤21: group I, n = 28 [52.8%]; group II, n = 29 [61.7%]; group III, n = 12 [24%]; P < .001), whereas there was no difference between groups I and II. In logistic regression analysis (variables included age, BDI, and renal replacement therapy [HD and transplantation]), ED was independently associated with age (odds ratio [OR], 1.1; 95% confidence interval [CI], 1.05–1.2) and BDI (OR, 1.1; 95% CI, 1.01–1.13). ED was not associated with renal replacement therapy (OR, 1.46; 95% CI, 0.60–3.57). The physiologic health domain of the SF-36 was significantly better in healthy controls (P < .001). Patient groups were similar in terms of BDI score (P > .05). ED score was negatively correlated with BDI (r = −0.368; P < .001) and positively correlated with SF-36 (r = 0.495; P < .001) in all patient groups. Conclusion: Patients with ESRD had significantly lower sexual function and lower QOL scores than healthy control men. Notably, the mode of renal replacement therapy had no impact on male sexual function.

7.
Background: Our study determined the long-term (up to 27 years) results of fixed-bearing vs mobile-bearing total knee arthroplasties (TKAs) in patients <60 years with osteoarthritis. Methods: This study included 291 patients (582 knees; mean age 58 ± 5 years) who received a mobile-bearing TKA in one knee and a fixed-bearing TKA in the other. The mean duration of follow-up was 26.3 years (range, 24-27). Results: At the latest follow-up, the differences between the 2 groups in mean Knee Society knee scores (91 ± 9 vs 89 ± 11 points, P = .383), Western Ontario and McMaster Universities Osteoarthritis Index (35 ± 7 vs 37 ± 6 points, P = .165), range of knee motion (128° ± 13° vs 125° ± 15°, P = .898), and University of California, Los Angeles activity score (6 ± 4 vs 6 ± 4 points, P = 1.000) were below the level of clinical significance. Revision of mobile-bearing and fixed-bearing TKA occurred in 16 knees (5.5%) and 20 knees (6.9%), respectively. The rate of survival at 27 years for mobile-bearing and fixed-bearing TKA was 94.5% (95% confidence interval 89-100) and 93.1% (95% confidence interval 88-98), respectively, and no significant differences were observed between the groups. Osteolysis was identified in 4 knees (1.4%) in each group. Conclusion: There were no significant differences in functional outcomes, rate of loosening, osteolysis, or survivorship between the 2 groups.

8.
Background: Acute kidney injury (AKI) is commonly associated with HIV infection. Objectives: To describe the profile of AKI in HIV-infected versus non-infected persons. Patients and methods: This prospective study was carried out from January 2010 to December 2015 in the department of nephrology-internal medicine D of Treichville University Hospital (Côte d’Ivoire). Results: The prevalence of HIV infection was 35.2% in the AKI population. The average age of patients was 42 ± 18 years in the HIV-positive group versus 51 ± 18 years in the HIV-negative group (P = 0.0001). Etiologies were infections in 65.1% of the HIV-positive group versus 38.8% of the HIV-negative group (P = 0.0001), and water loss in 24.7% versus 7.8% (P = 0.0001). Factors such as AIDS stage (P = 0.002), severe sepsis (P = 0.002), and acute pyelonephritis (P = 0.001) were associated with mortality in HIV-positive patients, versus severe anemia (P = 0.0001) and severe sepsis (P = 0.0001) in the HIV-negative group. Conclusion: HIV-positive patients were younger, with a female predominance. The mortality rate was identical in both groups.

9.
Background: Chronic kidney disease (CKD) is common in patients with chronic liver disease (CLD), as is acute kidney injury (AKI). The differentiation between CKD and AKI is often difficult, and sometimes both may coexist. A combined kidney–liver transplant (CKLT) may result in a kidney transplant in patients whose renal function is likely to recover, or who at least have stable renal function post-transplant. We retrospectively enrolled 2742 patients who underwent living donor liver transplant at our center from 2007 to 2019. Methods: This audit was carried out in liver transplant recipients with CKD stages 3 to 5 who underwent either liver transplant alone (LTA) or CKLT, to examine outcomes and the long-term evolution of renal function. Forty-seven patients met the medical eligibility criteria for CKLT. Of the 47 patients, 25 underwent LTA and the remaining 22 underwent CKLT. The diagnosis of CKD was made according to the Kidney Disease: Improving Global Outcomes classification. Results: Preoperative renal function parameters were comparable between the 2 groups. However, CKLT patients had significantly lower glomerular filtration rates (P = .007) and higher proteinuria (P = .01). Postoperatively, renal function and comorbidities were comparable between the 2 groups. Survival was similar at 1, 3, and 12 months (log-rank P = .84, .81, and .96, respectively). At the end of the study period, 57% of surviving patients in the LTA group had stabilized renal function (creatinine 1.8 ± 0.6 mg/dL). Conclusions: Liver transplant alone is not inferior to CKLT in living donor situations. Renal dysfunction stabilizes in the long term in most patients, whereas long-term dialysis may be required in others. Living donor liver transplantation alone is not inferior to CKLT for cirrhotic patients with CKD.

10.
《Transplantation proceedings》2023,55(5):1193-1198
Background: Patients with liver graft failure have an extremely low chance of finding a cadaveric graft in countries with a scarcity of deceased donors. We compared the outcomes of liver re-transplantation with living-donor liver grafts (re-LDLT) and deceased-donor liver grafts (re-DDLT) in adult patients (>18 years). Methods: The medical records of 1513 patients (1417 [93.6%] LDLT and 96 [6.3%] DDLT) who underwent liver transplantation at Memorial Hospital between January 2011 and October 2022 were reviewed. Forty patients (24 adults and 16 pediatric) were re-transplanted (2.84%). The 24 adult patients (2.72%; 25 re-LDLTs, including 1 patient with a second re-LDLT) were divided into 2 groups: re-DDLT (n = 6) and re-LDLT (n = 18). The groups were compared in demographics, pre-, peri-, and postoperative characteristics, and outcomes. Results: The overall survival rates were 91.7%, 79.2%, 75.0%, and 75% at <30 days, 31 to 90 days, 1 year, and 3 years, respectively. The re-LDLT group was significantly younger (P = .022), had smaller graft weight (P = .03) and shorter mechanical ventilation (P = .036), but had a longer operation time (P = .019) and hospitalization period (P = .003). The groups were otherwise comparable. There was no statistically significant difference in survival rates between the groups (P = .058), although the re-LDLT group had an evidently higher survival rate (88.9% and 83.3% vs 50.0%). Conclusion: Re-LDLT showed outcomes comparable to re-DDLT, if not better (approaching significance, P = .058). These results may encourage performing re-LDLT in patients with indications for re-transplantation, without concern about low chances of survival, especially in countries with limited sources of deceased donors.

11.
Objectives: Mesenteric ischaemia/reperfusion (IR) may lead to liver mitochondrial dysfunction and multiple organ failure. We determined whether gut IR induces early impairment of liver mitochondrial oxidative activity and whether methylene blue (MB) might afford protection. Design: Controlled animal study. Materials and methods: Rats were randomised into three groups: controls (n = 18), a gut IR group (mesenteric ischaemia [60 min]/reperfusion [60 min]) (n = 18), and a gut IR + MB group (15 mg kg−1 MB intra-peritoneally) (n = 16). Study parameters were: serum liver function markers, blood lactate, standard histology and DNA fragmentation (apoptosis) in intestinal and liver tissue, maximal oxidative capacity of liver mitochondria (state 3), and activity of complexes II, III and IV of the respiratory chain, measured using a Clark oxygen electrode. Results: Gut IR increased lactate dehydrogenase (+982%), aspartate and alanine aminotransferases (+43% and +74%, respectively), and lactate levels (+271%). It induced segmental loss of intestinal villi and crypt apoptosis. It reduced liver state 3 respiration by 30%, from 50.1 ± 3 to 35.2 ± 3.5 μM O2 min−1 g−1 (P < 0.01), and reduced the activity of complexes II, III and IV of the mitochondrial respiratory chain. Early impairment of liver mitochondrial respiration was related to blood lactate levels (r2 = 0.45). MB restored liver mitochondrial function. Conclusions: MB protected against gut IR-induced liver mitochondrial dysfunction.

12.
Purpose: In recent years, the increasing number of obese individuals in Japan has sometimes forced transplant teams to select candidates with a high body mass index (BMI) as marginal donors in living donor liver transplantation. However, data are lacking regarding the impact of a high BMI on outcomes for liver donors, particularly over the long term. Here, we aimed to clarify the impact of a high BMI on postoperative short- and long-term outcomes in liver donors. Methods: We selected 80 cases with complete 5-year data available from hepatectomies performed from 2005 to 2015 in our institute. We divided donors into overweight (BMI ≥ 25 kg/m2, n = 16) and normal-weight (BMI < 25, n = 64) groups. Results: Preoperatively, the overweight group had significantly higher levels of serum alanine aminotransferase and γ-glutamyl transpeptidase and a larger liver volume than the normal-weight group. Although the overweight group had significantly greater intraoperative blood loss (660 ± 455 vs 312 ± 268 mL, P = .0018) and longer operation times (463 ± 88 vs 386 ± 79 min, P = .0013), the groups showed similar frequencies of postoperative complications. At 1 year post hepatectomy, liver regeneration and spleen enlargement ratios did not significantly differ between the 2 groups. Remarkably, the overweight group showed significantly higher serum γ-glutamyl transpeptidase levels over the long term. Conclusions: Overweight status alone was not a risk factor for either short- or long-term postoperative outcomes after donor hepatectomy. However, donors with elevated γ-glutamyl transpeptidase levels, which were frequent among overweight donors, may require special attention.
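Since the donor grouping above hinges on a BMI cutoff, here is a minimal, hypothetical Python sketch that computes BMI from weight and height (the standard kg/m2 definition) and applies the study's ≥ 25 kg/m2 threshold; the function names and example values are illustrative and not data from the study.

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2 (standard definition)."""
    return weight_kg / (height_m ** 2)

def donor_group(weight_kg, height_m, cutoff=25.0):
    """Classify a donor as 'overweight' (BMI >= 25 kg/m^2) or
    'normal-weight' (BMI < 25), the cutoff used in this study."""
    return "overweight" if bmi(weight_kg, height_m) >= cutoff else "normal-weight"

# Example with hypothetical values: 78 kg, 1.70 m -> BMI ~27.0 -> overweight
print(round(bmi(78, 1.70), 1), donor_group(78, 1.70))
```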

13.
Background: In 1992, a landmark study demonstrated clinical deterioration in respiratory function and nutritional status prior to the onset of cystic fibrosis-related diabetes (CFRD). We re-evaluated this outcome. Methods: The Montreal Cystic Fibrosis Cohort is a prospective CFRD screening study. We performed a 6-year retrospective analysis of nutritional parameters and FEV1 (%) in subjects who developed incident CFRD and in controls who maintained normoglycemia (NG). In the former group, data were collected over the 6 years prior to diabetes onset. Results: Subjects (n = 86) had a mean age of 31.7 ± 8.1 years, BMI of 23.0 ± 4.0 kg/m2, and FEV1 of 70.1 ± 24.2%. Eighty-one percent had pancreatic insufficiency (PI). Patients were grouped as follows: NG+PS (pancreatic sufficient) (n = 16), NG+PI (pancreatic insufficient) (n = 21), CFRD+PS (n = 3), and CFRD+PI (n = 46). At their most recent screen, NG+PS subjects had significantly greater BMI than the NG+PI and CFRD+PI groups (26.2 ± 3.6 kg/m2 vs 22.6 ± 4.2 kg/m2 vs 22.1 ± 3.5 kg/m2, p = 0.0016). FEV1 was significantly greater in the NG+PS group (91.5 ± 16.8% vs 67.8 ± 25.3% vs 63.5 ± 22.2%, p = 0.0002). The rates of change in weight, BMI, fat mass (%), and FEV1 prior to the most recent visit (NG+PS and NG+PI groups) or to the diagnosis of de novo CFRD were similar between groups. Conclusion: In a contemporary context, CFRD onset is not preceded by deterioration in BMI, fat mass, or pulmonary function. Low BMI and FEV1 are more closely associated with PI than with a pre-diabetic state.

14.
Background/objective: The aim of the present study was to compare the operative and early postoperative results of del Nido cardioplegia solution (dNCS) with traditional blood cardioplegia (BC) in adult aortic surgery. Methods: A retrospective single-center study was performed on 118 patients who underwent aortic surgery with cardiopulmonary bypass (CPB) between January 2016 and June 2020. Patients were divided into two groups according to the type of cardioplegia solution used. Cardiac arrest was achieved in Group 1 (n = 65) with traditional BC and in Group 2 (n = 53) with dNCS. Operative and postoperative outcomes were compared between the two groups. Results: Patient demographic characteristics were similar between the two groups. The dNCS group showed significantly lower aortic cross-clamp (ACC) time (73.3 vs. 87.5 min, P = 0.001), cardioplegia volume (1323.9 ± 368.5 vs. 2773.8 ± 453.8 ml, P < 0.001), defibrillation rate (44.4% vs. 69.2%, P = 0.006), drainage amount (412 ± 73.2 vs. 446.9 ± 95.1 ml, P = 0.026), and need for inotropic support (37% vs. 55.3%, P = 0.046). The dNCS group also had significantly lower high-sensitivity troponin I (hsTnI) levels at the 6th (203.5 ± 68.6 vs. 275.7 ± 76.2 ng/L, P < 0.001) and 24th (253.1 ± 101 vs. 293.4 ± 80.1 ng/L, P = 0.017) postoperative hours, and significantly higher hematocrit levels at the 6th (25.1 ± 3.2 vs. 22.5 ± 2.5%, P < 0.001) and 24th (25.8 ± 2.7 vs. 24.6 ± 2.8%, P = 0.024) postoperative hours. Intensive care unit stay, intubation duration, and hospital stay times were similar in both groups. There was no significant difference in postoperative ejection fraction values (P = 0.714). Conclusion: Compared with conventional BC, dNCS provided significantly shorter ACC times, reduced the need for intraoperative defibrillation, and lowered postoperative hsTnI levels, with comparable early clinical outcomes for adult patients undergoing aortic surgery. dNCS is a safe and efficient alternative to traditional BC in adult aortic surgery.

15.
Purpose: The purpose of this study was to determine the local progression rate and identify factors that may predict local progression in patients who achieve a complete response (CR) radiologically after undergoing transarterial chemoembolization (TACE) for hepatocellular carcinoma (HCC). Materials and methods: One hundred forty-seven patients who achieved CR of 224 HCCs after TACE were retrospectively reviewed. There were 109 men and 38 women, with a mean age of 61.6 ± 6.8 (SD) years (range: 45.4–86.9 years). Logistic mixed-effects and Cox regression models were used to evaluate associations between clinical factors and local progression. Results: A total of 75 patients (75/147; 51%) and 99 lesions (99/224; 44.2%) showed local progression at a median of 289.5 days (Q1: 125, Q3: 452; range: 51–2245 days). Pre-treatment international normalized ratio (INR) (1.17 ± 0.15 [SD] vs. 1.25 ± 0.16 [SD]; P < 0.001), model for end-stage liver disease score (9.4 ± 2.6 [SD] vs. 10.6 ± 3.2 [SD]; P = 0.010), and Child-Pugh score (6 ± 1 [SD] vs. 6.4 ± 1.3 [SD]; P = 0.012) were significantly lower, while serum albumin level (3.4 ± 0.62 [SD] vs. 3.22 ± 0.52 [SD]; P = 0.033) was significantly greater, in those who showed local progression compared with those who did not. In terms of local recurrence-free survival, the number of TACE treatments (hazard ratio [HR]: 2.05 [95% CI: 1.57–2.67]; P < 0.001), INR (HR: 0.13 [95% CI: 0.03–0.61]; P = 0.010), and type of TACE (P = 0.003) were significant. Patients with local progression of any tumor did not differ from those without local progression in terms of overall survival (P = 0.072); however, they were less likely to be transplanted (20/75; 26.7%) than those without local progression (33/72; 36.1%) (P = 0.016). Conclusion: A significant number of patients who achieve CR of HCC after TACE have local progression. This emphasizes the importance of long-term follow-up.

16.
Background: Liver transplantation (LT) is limited by graft shortage. Therefore, to increase the donor pool, even marginal grafts are being transplanted, depending on the recipient's condition. This study was conducted to analyze post-LT prognosis using discarded liver grafts. Methods and materials: From January 2010 to September 2020, deceased-donor LT was performed in 160 patients in our center. Among them, 121 patients (allocated group) were preferentially allocated to our center, and the remaining 39 patients (24.4%, discarded group) received liver grafts that had been discarded by prioritized centers. Results: The preoperative model for end-stage liver disease scores were 27.0 ± 10.41 and 27.0 ± 11.79 in the two groups (P = .99). There were no differences between the 2 groups in operation time (P = .06) or intraoperative packed red cell transfusion (P = .90). There were also no differences between the 2 groups in early allograft dysfunction (P = .48) or hospital stay (P = .26) after deceased-donor LT. In-hospital mortality occurred in 10 patients (8.3%) in the allocated group and 4 patients (10.3%) in the discarded group. Only the length of intensive care unit stay was significantly longer in the discarded group (P = .04). The 5-year survival rate was 73.8% in the allocated group and 72.2% in the discarded group. Conclusions: The outcome of the discarded group was no worse than that of the allocated group. Deceased-donor LT using a discarded graft can be acceptable; as a result, the number of discarded grafts can be reduced.

17.
《Journal of vascular surgery》2020,71(6):2089-2097
Objective: Plateletcrit (PCT) reflects the total platelet mass in blood and can be calculated from a complete blood count. We examined the effect of PCT on outcomes of endovascular and open interventions for chronic limb ischemia. Methods: Patients who underwent revascularization for chronic limb ischemia (Rutherford categories 3-6) between June 2001 and December 2014 were retrospectively identified. PCT on admission was recorded. Patients and limbs were divided into tertiles of low (0.046-0.211), medium (0.212-0.271), and high (0.272-0.842) PCT. Patency, limb salvage, major adverse limb event, major adverse cardiac event, and survival rates were calculated using Kaplan-Meier analysis and compared with the log-rank test. Cox regression analysis was used for multivariate analysis. Results: A total of 1431 limbs (1210 patients) were identified and divided into low PCT (477 limbs in 407 patients), medium PCT (477 limbs in 407 patients), and high PCT (477 limbs in 396 patients) groups. The patients in the high tertile were 2 years older than the patients in the other two tertiles (P = .009). Five-year primary patency was 65% ± 3% in the low PCT group compared with 55% ± 3% and 51% ± 3% in the medium and high PCT groups, respectively (P = .004). Five-year secondary patency was 81% ± 2% in the low PCT group compared with 82% ± 2% and 72% ± 3% in the medium and high PCT groups, respectively (P = .02). The five-year limb salvage rate was 86% ± 2% in the low PCT group compared with 79% ± 3% and 74% ± 3% in the medium and high PCT groups, respectively (P = .004). Multivariate regression analysis showed that low PCT was independently associated with primary patency after endovascular interventions (hazard ratio, 0.67 [0.47-0.95]; P = .02) but not after open interventions (hazard ratio, 0.72 [0.43-1.21]; P = .21). Conclusions: High PCT is associated with poor patency and limb salvage rates after interventions for lower extremity chronic limb ischemia. Multivariate regression analysis confirmed the association of low PCT with improved primary patency after endovascular interventions but not after open interventions. High PCT may be a marker of increased platelet reactivity and could be used to identify patients at high risk for early thrombosis and failure after interventions.
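To illustrate the tertile grouping described in the Methods above, here is a minimal, hypothetical Python sketch that assigns a PCT value to the low/medium/high ranges reported in the abstract; the function name and example value are illustrative and not part of the study's analysis code.

```python
def pct_tertile(pct):
    """Assign a plateletcrit value to the tertiles reported in the study.

    Cutoffs are taken from the abstract (low 0.046-0.211,
    medium 0.212-0.271, high 0.272-0.842); values outside the
    observed range are flagged rather than classified.
    """
    if 0.046 <= pct <= 0.211:
        return "low"
    if 0.212 <= pct <= 0.271:
        return "medium"
    if 0.272 <= pct <= 0.842:
        return "high"
    return "out of observed range"

print(pct_tertile(0.25))  # -> "medium"
```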

18.
19.
《Surgery》2023,173(2):350-356
Background: The significant decrease in elective surgery during the COVID-19 pandemic prompted fears that there would be an increase in emergency or urgent operations for certain disease states. The impact of COVID-19 on ventral hernia repair is unknown. This study aimed to compare volumes of elective and nonelective ventral hernia repairs performed pre–COVID-19 with those performed during the COVID-19 pandemic. Methods: Admissions captured in a prospective database from 8 hospitals, identified by International Classification of Diseases, Tenth Revision Procedure Coding System codes for ventral hernia repair from January 2017 through June 2021, were included. The COVID-19 period was defined as March 2020 or later. Results: Comparing 3,558 ventral hernia repairs pre–COVID-19 with 1,228 during COVID-19, there was a significant decrease in the mean number of elective ventral hernia repairs per month during COVID-19 (pre–COVID-19: 61 ± 5 vs during COVID-19: 39 ± 11; P < .001), and this persisted after excluding the initial 3-month COVID-19 surge (61 ± 5 vs 42 ± 9; P < .001). There were fewer nonelective cases during the initial 3-month COVID-19 surge (32 ± 9 vs 24 ± 4; P = .031), but, excluding the initial surge, there was no difference in nonelective volume (32 ± 9 vs 33 ± 8; P = .560). During COVID-19, patients had lower rates of congestive heart failure (elective: 9.0% vs 6.6%, P = .0047; nonelective: 17.7% vs 11.6%, P < .001) and chronic obstructive pulmonary disease (elective: 13.7% vs 10.2%, P = .017; nonelective: 17.9% vs 12.0%, P < .001) and underwent fewer component separations (10.2% vs 6.4%; P ≤ .001). Intensive care unit admissions decreased for elective ventral hernia repairs (7.7% vs 5.0%; P = .016). Length of stay, cost, and readmission were similar between groups. Conclusion: Elective ventral hernia repair volume decreased during COVID-19, whereas nonelective ventral hernia repairs transiently decreased before returning to baseline. During COVID-19, patients appeared to be lower risk and less complex. The possible impact of more complex patients delaying surgery is yet to be seen.

20.
《Transplantation proceedings》2022,54(8):2236-2242
Background: To establish a new and accurate model for standard liver volume (SLV) estimation and graft size prediction in liver transplantation for Chinese adults. Methods: In this study, data on morphologic indices and liver volume (LV) were retrospectively obtained for 507 cadaveric liver transplantation donors between June 2017 and September 2020 at Shulan (Hangzhou) Hospital. Linear regression analysis was performed to evaluate the impact of each parameter and develop a new SLV formula. The new formula was then validated prospectively on 97 donors between October 2020 and June 2021, and its prediction accuracy was compared with previous formulas. Results: The average LV in all subjects was 1445.68 ± 309.94 mL. Body weight (BW) showed the strongest correlation (r = 0.453, P < .001). By stepwise multiple linear regression analysis, BW and age were the only 2 independent correlation factors for LV. The Shulan estimation model was derived as: SLV (mL) = 13.266 × BW (kg) − 4.693 × age + 797.16 (R2 = 0.236, P < .001). In the validation cohort, the new model showed no significant difference between the estimated SLV and the actual LV (P > .05) and had the lowest mean percentage error, 0.33%. The proportions of estimated SLV within ±20%, ±15%, and ±10% of the actual LV were 69.1%, 55.7%, and 40.2%, respectively. Discussion: The Shulan SLV estimation model predicted LV more accurately than previous formulas for Chinese adults and could serve as a simple screening tool during the initial assessment of graft volume for potential donors.
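As a quick illustration of the published formula above, the following minimal Python sketch simply evaluates the Shulan SLV model for a given body weight and age; the function name and the example donor (60 kg, 45 years) are hypothetical and not taken from the study.

```python
def shulan_slv(body_weight_kg, age_years):
    """Standard liver volume (mL) from the Shulan estimation model
    reported in the abstract: SLV = 13.266 * BW - 4.693 * age + 797.16."""
    return 13.266 * body_weight_kg - 4.693 * age_years + 797.16

# Example: a hypothetical 60 kg, 45-year-old donor -> ~1381.9 mL
print(round(shulan_slv(60, 45), 1))
```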
