Similar Articles (20 results)
1.

Background

Cancer stem cells may be associated with tumor progression and prognosis for colon cancer. We hypothesized that expression of Aldehyde dehydrogenase 1 (ALDH1) would increase with tumor progression and be associated with survival.

Methods

Tissue was obtained from resection specimens for isolation of cancer stem cells. In addition, paraffin blocks from resected colon cancers with normal colon, primary tumor, and lymph node and liver metastasis from 2000 to 2010 were identified and stained with ALDH1.

Results

In in vitro models (adherent and tumor sphere cultures), ALDH1+ cells grew more efficiently than ALDH1− cells. ALDH1 expression was highest in peritumoral crypt cells (median 0.137 μm², 95% confidence interval [CI] 0.125–0.356) and normal crypts (median 0.091 μm², 95% CI 0.064–0.299), followed by lymph node metastases (median 0.025 μm², 95% CI 0–0.131) and the primary cancers (median 0.014 μm², 95% CI 0.0123–0.154). Samples were divided into high and low ALDH1 expression. Survival was associated with ALDH1 expression in the primary tumor (9 versus 23 mo, P = 0.0016) but not with expression in peritumoral tissue (21 versus 20.5 mo, P = 0.32), normal colon (19 versus 27 mo, P = 0.289), or lymph node metastases (23 versus 21 mo, P = 0.69). On univariate analysis, ALDH1 expression and grade were associated with survival, whereas age, number of lymph node metastases, and race were not. On multivariate analysis, only ALDH1 status remained associated with survival (odds ratio 4.4, P = 0.011).

Conclusions

ALDH1 is indicative of stemness and is a biomarker in colon cancer. Expression did not increase with progression from normal colon to primary tumor and metastasis.

2.

Introduction

The incidence of recurrent primary hyperparathyroidism (PHPT) has been reported to be between 1% and 10%. The purpose of this study was to examine whether patients with multigland disease have a different recurrence rate.

Methodology

A retrospective analysis of a prospectively collected database was performed on patients with PHPT who underwent parathyroidectomy at one institution between 2001 and 2013. Patients who underwent initial parathyroidectomy with at least 6 mo of follow-up were included and were divided into three groups according to operative notes: single adenoma (SA), double adenoma (DA), and hyperplasia (HP). An elevated postoperative serum calcium level within 6 mo of surgery was defined as persistent disease, whereas an elevated calcium level after 6 mo was defined as recurrence.

Results

In total, 1402 patients met inclusion criteria, and the success rate of parathyroidectomy was 98.4%. The mean age was 60 ± 14 y, and 78.5% were female. Among them, 1097 patients (78%) had SA, 124 (9%) had DA, and 181 (13%) had HP. The rate of persistent PHPT was higher among patients with DA (4%) than among those with SA (1.3%) or HP (2.2%) (P = 0.0049). Moreover, the recurrence rate was higher among patients with DA (7.3%) than among those with SA (1.7%) or HP (4.4%) (P = 0.0005), with similar median follow-up times: 11 mo for SA, 12.5 mo for DA, and 12 mo for HP (P = 0.1603).

Conclusions

Recurrent and persistent PHPT occur more frequently in patients with DA. These data suggest that DA in some cases could represent asymmetric or asynchronous hyperplasia. Therefore, patients with DA may warrant more rigorous intraoperative scrutiny and more vigilant monitoring after parathyroidectomy.

3.

Objective

Due to the shortage of cadaver liver grafts in Asia, more than 90% of biliary atresia (BA) patients require living donor liver transplantation (LDLT), but the factors that influence liver graft regeneration in pediatric patients are still unclear. The aim of this study was to evaluate the potential predisposing factors that encourage liver graft regeneration in pediatric liver transplantation (LT).

Methods

Case notes and Doppler ultrasound and computed tomography studies performed before and 6 months after transplantation were reviewed for 103 BA patients who underwent LDLT. Potential predisposing factors for liver graft regeneration were analyzed statistically and included age, gender, body weight and height, spleen size, graft weight–to–recipient weight ratio (GRWR), post-transplantation total portal flow, and vascular complications.

Results

Seventy-two pediatric recipients were enrolled in this study. The liver graft regeneration rate was 29.63 ± 36.61% (range, −29.53% to 126.27%). Spleen size (P = .001), post-transplantation portal flow (P = .004), and age (P = .04) correlated linearly with the regeneration rate. The GRWR was negatively correlated with the regeneration rate (P = .001) and was the only independent factor affecting it. When the GRWR was >3.4, patients tended to have poor or negative graft regeneration (P = .01).
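
As an illustration of the linear correlation analysis described above, the sketch below computes Pearson's r between GRWR and the regeneration rate with scipy; the values and variable names are hypothetical, not the study data.

```python
# Minimal sketch: Pearson correlation between GRWR and graft regeneration rate.
# All values below are hypothetical illustrations, not the study data.
from scipy import stats

regeneration_rate = [45.2, 12.8, -10.5, 88.1, 30.4, 5.9, 61.0, -2.3]  # % change in graft volume
grwr = [1.8, 2.9, 3.6, 1.2, 2.4, 3.1, 1.5, 3.5]                       # graft weight-to-recipient weight ratio (%)

r, p_value = stats.pearsonr(grwr, regeneration_rate)
print(f"Pearson r = {r:.2f}, P = {p_value:.3f}")  # a negative r mirrors the inverse GRWR relation reported above
```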

Conclusion

Large-for-size grafts have a negative effect on regeneration rates because liver grafts that are too large can compromise total portal flow and increase vascular complications, especially when the GRWR is >3.4. Thus, optimal graft size matters more than other factors in pediatric LDLT.

4.

Background

Before bariatric surgery, some patients with type 2 diabetes mellitus (T2DM) experience improvement in blood glucose control and reduced insulin requirements while on a preoperative low-calorie diet (LCD). We hypothesized that patients who exhibit a significant glycemic response to this diet are more likely to experience remission of their diabetes in the postoperative period.

Materials and methods

Insulin-dependent T2DM patients undergoing bariatric surgery between August 2006 and February 2011 were eligible for inclusion. Insulin requirements at day 0 and 10 of the LCD were compared. Patients with a ≥50% reduction in total insulin dosage to maintain appropriate blood glucose control were considered rapid responders to the preoperative LCD. All others were non–rapid responders. We analyzed T2DM remission rates up to 1 y postoperatively.

Results

A total of 51 patients met inclusion criteria and 29 were categorized as rapid responders (57%). The remaining 22 were considered non–rapid responders (43%). The two groups did not differ demographically. Rapid responders had greater T2DM remission rates at 6 (44% versus 13.6%; P = 0.02) and 12 mo (72.7% versus 5.9%; P < 0.01). In patients undergoing laparoscopic gastric bypass, rapid responders showed greater excess weight loss at 3 mo (40.1% versus 28.2%; P < 0.01), 6 mo (55.2% versus 40.2%; P < 0.01), and 12 mo (67.7% versus 47.3%; P < 0.01).

Conclusions

Insulin-dependent T2DM bariatric surgery patients who display a rapid glycemic response to the preoperative LCD are more likely to experience early remission of T2DM postoperatively and greater weight loss.

5.

Background

Surface electromyography is a noninvasive technique for detecting the activity of skeletal muscles, in particular the muscles of respiration, namely the diaphragm and rectus abdominis. This study compares these muscles in healthy individuals, patients with liver disease, and patients after abdominal surgery.

Objective

To study muscle activity by surface electromyography of the right diaphragm and right rectus abdominis (root mean square, RMS) and muscle strength by manovacuometry (maximal inspiratory pressure, MIP; maximal expiratory pressure, MEP).

Results

We evaluated 246 subjects divided into three groups: healthy (65), liver disease (171), and post-surgery (10). In the liver disease group, BMI was significantly higher in patients with ascites (P = .001); in the post-surgery group, RMS of the rectus (P = .0001) and RMS of the diaphragm (P = .030) were increased, and inspiratory and expiratory pressures were decreased (P = .0001). A multivariate analysis showed a tendency for higher BMI in the liver disease and post-surgery groups to correlate with increased RMS of the rectus and lower MIP/MEP (P = .11). The receiver operating characteristic curve showed that RMS of the rectus discriminated liver disease and post-surgery patients from healthy subjects (area under the curve = 0.63; 95% CI 0.549–0.725).
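
The discriminative ability reported for RMS of the rectus (area under the curve 0.63) corresponds to a standard receiver operating characteristic analysis; a minimal scikit-learn sketch on hypothetical values is shown below (1 = liver disease or post-surgery, 0 = healthy).

```python
# Minimal sketch: ROC analysis of RMS rectus for discriminating patients from healthy subjects.
# The arrays are hypothetical illustrations, not the study data.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

y_true = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0])  # 1 = liver disease / post-surgery, 0 = healthy
rms_rectus = np.array([0.42, 0.35, 0.20, 0.28, 0.50, 0.22, 0.31, 0.19, 0.44, 0.25])  # RMS amplitude (mV)

auc = roc_auc_score(y_true, rms_rectus)
fpr, tpr, thresholds = roc_curve(y_true, rms_rectus)  # operating points along the curve
print(f"Area under the ROC curve = {auc:.2f}")
```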

Conclusion

Muscle activity in healthy individuals is lower than in subjects with muscle deficits because less effort is needed to overcome the same resistance, as observed by surface electromyography and muscle strength testing.

6.

Background

Shockwave therapy has been shown to enhance the healing of anterior cruciate ligament (ACL) reconstruction in rabbits. This study evaluated the effect of extracorporeal shockwave therapy (ESWT) on ACL reconstruction in human subjects. We hypothesized that ESWT would improve the results of human ACL reconstruction.

Methods

Fifty-three patients were randomized into two groups: 26 patients in the ESWT group and 27 in the control group. The ESWT group underwent single-bundle hamstring autograft ACL reconstruction and received ESWT immediately after surgery. The control group underwent ACL surgery without ESWT. Both groups received the same rehabilitation postoperatively. Evaluations included the Lysholm score, IKDC score, KT-1000 arthrometry, radiography, bone mineral density, and magnetic resonance imaging.

Results

The ESWT group showed significantly better Lysholm scores than the control group at 1 and 2 y postoperatively (P < 0.001 and 0.001, respectively). No significant difference was noted in IKDC scores between the two groups (P = 0.080 and 0.076, respectively). KT-1000 values were significantly better in the ESWT group than in the control group at 2 y postoperatively (P = 0.027). The tibial tunnel on radiographs was significantly smaller in the ESWT group than in the control group at 2 y (P = 0.018). Bone mineral density showed no discernible difference between the two groups at 6 mo and 2 y (P = 0.522 and 0.984, respectively). On magnetic resonance imaging, the ESWT group showed significantly less tibial tunnel enlargement at 6 mo and 2 y than the control group (P = 0.024 and <0.001, respectively).

Conclusions

ESWT significantly improves the subjective Lysholm score and decreases middle-third tibial tunnel enlargement after single-bundle hamstring autograft ACL reconstruction.

7.

Introduction

Patients with a history of prior sternotomy may have poorer outcomes after heart transplantation, but the associated risk has not been well quantified. The United Network for Organ Sharing (UNOS) database was analyzed to assess early and late survival and predictors of outcome in adult heart transplant recipients with and without prior sternotomy.

Methods

A total of 11,266 adults who underwent a first, heart-only transplantation from 1997 to 2011 were divided into 2 groups: those without prior sternotomy (first sternotomy group; n = 6006, 53.3%) and those with at least 1 prior sternotomy (redo sternotomy group; n = 5260, 46.7%). A multivariable Cox model was used to identify predictors of mortality.

Results

Survival was lower in the redo group at 60 days (92.6% vs 95.9%; hazard ratio [HR] 1.83, 95% confidence interval [CI]: 1.56–2.15; P < .001). Conditional 5-year survival in 60-day survivors was similar in the 2 groups (HR = 1.01, 95% CI 0.90–1.12, P = .90). During the first 60 days post-transplant, the redo group had more cardiac reoperations (12.3% vs 8.8%, P = .0008), a higher frequency of dialysis (8.9% vs 5.2%, P < .0001), a greater percentage of drug-treated infections (23.2% vs 19%, P = .003), and a higher percentage of strokes (2.5% vs 1.4%, P = .0001). A multivariable Cox proportional hazards model identified prior sternotomy as a significant independent predictor of mortality, in addition to age, female gender, congenital cardiomyopathy, need for ventilation, mechanical circulatory support, dialysis prior to transplant, pretransplant serum bilirubin (≥3 mg/dL), and preoperative serum creatinine (≥2 mg/dL).
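
A multivariable Cox proportional hazards model of the kind described above can be sketched with the lifelines package; the data frame and column names below are hypothetical, not UNOS variables.

```python
# Minimal sketch: multivariable Cox model for post-transplant mortality.
# The data frame is hypothetical; column names are illustrative, not UNOS fields.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":           [3, 60, 24, 1, 48, 36, 2, 60, 12, 54, 30, 6],   # follow-up time
    "died":             [1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1],           # event indicator
    "prior_sternotomy": [1, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1],
    "age":              [62, 45, 58, 66, 50, 61, 70, 39, 55, 48, 44, 67],
    "dialysis_pretx":   [0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()  # hazard ratios and 95% CIs for each covariate
```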

Conclusions

Prior sternotomy is associated with an excess 3.3% mortality and higher morbidity within the first 60 days after heart transplantation, as measured by frequency of dialysis, drug-treated infections, and strokes. Conditional 5-year survival after 60 days is unaffected by prior sternotomy. These findings should be taken into account for risk assessment of patients undergoing heart transplantation.

8.

Background

Infectious complications are major contributors to morbidity and mortality in liver transplant recipients. To establish a proper strategy for reducing infectious complications, we analyzed the epidemiology of and risk factors for post-transplant infections.

Methods

We analyzed the medical records of 231 consecutive liver transplant recipients from December 2007 to November 2011, each with at least 1 year of follow-up, for comparison with data from 1996 to 2005.

Results

Among the 231 patients, 126 (54.5%) experienced 244 infectious episodes, a rate of 1.05 per patient. Overall mortality was 9.9% (23/231), and infections were more prevalent among patients who died (P = .04). Predominant infections were postoperative intra-abdominal problems (36.1%), peritonitis (15.2%), pneumonia (13.5%), bacteremia (4.1%), wound complications (1.6%), viral etiologies (18.0%), and other causes (11.5%). Causative organisms were bacterial (68.9%), viral (14.7%), fungal (7.0%), and unproven (9.4%). Multivariate analysis of risk factors for infection showed significant effects of Model for End-stage Liver Disease score (P = .027; odds ratio [OR], 1.04), post-transplant biliary complications (P < .001; OR, 3.50), and rejection episodes (P = .023; OR, 3.39). On univariate analysis, mortality was related to retransplantation (P = .003), post-transplant dialysis (P = .006), and infection (P = .056), none of which remained significant in multivariate analysis. Compared with the previous period, overall and infection-related mortality decreased from 24.5% to 9.9% and from 52.9% to 26.1%, respectively. There were no significant changes in the types of infection or the rate of drug-resistant bacteria, but candidal infections and cytomegalovirus reactivation were more prevalent.

Conclusion

Our data suggest that current perioperative antimicrobial regimens need not be changed; however, new strategies are needed to reduce infectious complications after liver transplantation, particularly by reducing biliary complications and properly managing rejection episodes.

9.

Purpose

Donor age is a well-known factor influencing graft function after deceased donor liver transplantation (DDLT). However, the effect of donors older than recipients on graft outcomes remains unclear. This study investigated the relationship between the donor–recipient age gradient (DRAG) and posttransplant outcomes after DDLT.

Methods

We included 164 adult recipients who underwent DDLT between May 1996 and April 2011. Patients were divided into 2 groups according to the value of DRAG: Negative (DRAG −20 to −1; n = 99) versus positive (DRAG 0–20; n = 65). Medical records were reviewed and laboratory data were retrospectively collected.

Results

The median age of donors and recipients was 43 (range, 10–80) and 46 (range, 19–67) years, respectively. The mean follow-up time was 57.4 months. A positive DRAG had a negative effect on levels of alkaline phosphatase until 2 weeks after transplantation. However, the positive group showed a lower incidence of hepatitis B viral disease recurrence. The 1-, 3-, and 5-year graft survival rates were 80.4%, 76.8%, and 71.4% in the negative group, and 65.8%, 58.4%, and 56.3% in the positive group, respectively. The positive DRAG group showed significantly inferior graft survival compared with the negative DRAG group (P = .036).

Conclusion

This study demonstrated that donors older than recipients had a deleterious effect on graft outcomes. DRAG could be a meaningful determinant of graft survival among DDLT recipients.

10.

Objective

Many studies have compared the safety and efficacy of calcineurin inhibitor (CNI) avoidance or CNI withdrawal regimens with conventional CNI regimens, but the results remain controversial. The aim of this systematic review and meta-analysis was to provide a thorough review and objective appraisal of the safety and efficacy of CNI avoidance and CNI withdrawal protocols.

Methods

We searched PubMed, EMBASE, and the reference lists of retrieved studies to identify randomized controlled trials (RCTs) of CNI-free regimens, CNI avoidance, or CNI withdrawal in kidney transplantation. Eight publications covering 27 different RCTs and a total of 3953 patients were included in the analysis.

Results

Use of mammalian target of rapamycin inhibitors, namely sirolimus (SRL), in combination with mycophenolate conserved graft function at 1 year (glomerular filtration rate [GFR]: mean difference [MD] 6.21, 95% CI 0.02–12.41, P = .05; serum creatinine: MD −0.11, 95% CI −0.19 to −0.03, P = .01) and 2 years post-transplant (GFR: MD 13.96, 95% CI 7.32–20.60, P < .0001). Similarly, early withdrawal (≤6 months) of CNIs protected graft function at 1 year after transplant (GFR: MD 7.03, 95% CI 4.84–9.23, P < .00001; serum creatinine: MD −0.21, 95% CI −0.22 to −0.19, P < .00001). CNI avoidance and withdrawal strategies were associated with a higher incidence of acute rejection at 1 year post-transplant (odds ratio [OR] 1.74, 95% CI 1.08–2.81, P = .02; OR 1.78, 95% CI 1.35–2.34, P < .0001, respectively), whereas at 2 years after transplant there was no significant difference (OR 0.92, 95% CI 0.33–2.51, P = .86; OR 2.42, 95% CI 1.01–5.82, P = .05, respectively). Meanwhile, neither adverse events nor patient/graft survival differed significantly between the CNI-free and CNI protocols at 1 and 2 years. With respect to long-term results in the published RCTs, CNI-free and CNI-withdrawal regimens achieved better renal function than CNI regimens, with no significant differences in patient and graft survival, acute rejection, or most reported adverse events.
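
The pooled mean differences and odds ratios above come from standard meta-analytic pooling. The sketch below shows inverse-variance fixed-effect pooling of per-trial mean differences; the trial estimates are hypothetical and the fixed-effect model is an assumption (the review may have used a random-effects model).

```python
# Minimal sketch: inverse-variance fixed-effect pooling of per-trial mean differences (e.g., GFR).
# The trial estimates are hypothetical, and a fixed-effect model is assumed for simplicity.
import math

trials = [(5.0, 3.1), (8.2, 4.0), (4.1, 2.5), (9.7, 5.2)]  # (mean difference, standard error) per trial

weights = [1 / se ** 2 for _, se in trials]                                 # inverse-variance weights
pooled = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)  # weighted average effect
se_pooled = math.sqrt(1 / sum(weights))

lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"Pooled MD = {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```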

Conclusions

In conclusion, this systematic review and meta-analysis suggests that kidney transplant recipients managed with early CNI withdrawal or CNI avoidance using SRL better conserve graft function at 1 and 2 years post-transplant. Although CNI-based regimens show no advantage in acute rejection at 2 years, they substantially reduce the incidence of acute rejection during the first year after transplantation. CNI avoidance and withdrawal regimens improve long-term renal function and perform similarly with respect to acute rejection, patient and graft survival, and adverse events. Given the limited number of long-term studies, more high-quality RCTs are needed.

11.

Background

Efforts to improve long-term patient and allograft survival have included use of induction therapies as well as steroid and/or calcineurin inhibitor (CNI) avoidance/minimization.

Methods

This is a retrospective review of kidney transplant recipients transplanted between September 2004 and July 2009. The immune minimization group (group 1; n = 182) received alemtuzumab induction, low-dose CNI, and mycophenolic acid (MPA). The conventional immunosuppression group (group 2; n = 232) received rabbit anti-thymocyte globulin, standard-dose CNI, MPA, and prednisone.

Results

Both groups were followed for a similar length of time (49.4 ± 21.7 months; P = .12). Patient survival was also similar (90% vs 94%; P = .14). Death-censored graft survival was inferior in group 1 compared with group 2 (86% vs 96%; P = .003). On multivariate analysis, group 1 was an independent risk factor for graft loss (aHR = 2.63; 95% confidence interval [CI], 1.32–5.26; P = .006). Biopsy-proven acute rejection occurred more frequently in group 1 than in group 2, driven by late rejections (7% vs 2%; P < .01). Graft function was lower in group 1 than in group 2 from 3 months (49.5 vs 70.7 mL/min; P < .001) to 48 months (48.6 vs 69.4 mL/min; P = .04).

Conclusion

Minimization of maintenance immunosuppression after alemtuzumab induction was associated with higher acute rejection rates and inferior graft survival compared with rabbit anti-thymocyte globulin induction and conventional triple immunosuppression.

12.

Background

Studies have proposed a neuroprotective role for alcohol (ETOH) in traumatic brain injury (TBI). We hypothesized that ETOH intoxication is associated with mortality in patients with severe TBI.

Methods

Version 7.2 of the National Trauma Data Bank (2007–2010) was queried for all patients with isolated blunt severe TBI (head Abbreviated Injury Score ≥4) and a blood ETOH level recorded on admission. The primary outcome measure was mortality. Multivariate logistic regression analysis was performed to assess factors predicting mortality and in-hospital complications.

Results

A total of 23,983 patients with severe TBI were evaluated, of whom 22.8% (n = 5461) tested positive for ETOH intoxication. ETOH-positive patients were more likely to have in-hospital complications (P = 0.001) and had a higher mortality rate (P = 0.01). ETOH intoxication was an independent predictor of mortality (odds ratio: 1.2, 95% confidence interval: 1.1–2.1, P = 0.01) and of in-hospital complications (odds ratio: 1.3, 95% confidence interval: 1.1–2.8, P = 0.009) in patients with isolated severe TBI.
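
The multivariable logistic regression used above to identify independent predictors of mortality can be sketched with statsmodels; the rows and column names below are hypothetical, not actual National Trauma Data Bank fields.

```python
# Minimal sketch: multivariable logistic regression of mortality on ETOH status and covariates.
# The data are hypothetical; column names are illustrative, not NTDB variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "died":     [0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "etoh_pos": [0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0],
    "age":      [25, 60, 34, 71, 45, 29, 66, 52, 40, 33, 58, 47],
    "head_ais": [4, 5, 4, 5, 5, 4, 4, 4, 5, 4, 5, 5],
})

X = sm.add_constant(df[["etoh_pos", "age", "head_ais"]])
fit = sm.Logit(df["died"], X).fit(disp=0)  # disp=0 suppresses the iteration log
print(np.exp(fit.params))                  # odds ratios
print(np.exp(fit.conf_int()))              # 95% confidence intervals
```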

Conclusions

ETOH intoxication is an independent predictor of mortality in patients with severe TBI and is associated with higher complication rates. Our results from the National Trauma Data Bank differ from those previously reported. The proposed neuroprotective role of ETOH needs further clarification.

13.

Background

The aim of this study was to evaluate the safety and efficacy of the Harmonic ACE scalpel (HS) and the LigaSure Precise (LS) instrument in conventional thyroidectomy.

Materials and methods

A prospective, randomized controlled trial was performed. Between August 2011 and June 2012, 832 patients who required thyroidectomy for papillary thyroid cancer were randomized into groups treated with either the HS or the LS instrument. Operative time and surgical morbidities were analyzed.

Results

A total of 320 patients (HS group, N = 164; LS instrument group, N = 156) were included in the analysis according to the intention-to-treat principle. There were no statistically significant differences between the two groups in operative time (HS versus LS: 71.93 ± 18.26 versus 75.15 ± 20.13 min; P = 0.423), postoperative transient hypoparathyroidism (13.4% versus 14.1%; P = 0.858), or permanent recurrent laryngeal nerve injury.

Conclusions

In this study, both hemostatic devices were safe and effective, with no differences in postoperative results or complications.

14.

Introduction

Sedation and pain management for mechanically ventilated, critically ill surgical patients pose many challenges for the intensivist. Although daily interruption of sedatives and opioids is appropriate in medical intensive care unit (ICU) patients, it may not be feasible in surgical patients with pain from surgical incisions or trauma. We therefore developed an analgesia/sedation-based protocol for the surgical ICU population.

Methods

We performed a two-phase prospective observational study evaluating a prescriber-driven analgesia/sedation protocol (ASP) in a 12-bed surgical ICU. The pre-ASP group was sedated as usual (n = 100), and the post-ASP group was managed with the new ASP (n = 100). Each phase of the study lasted 5 mo. Comparisons between the two groups were performed with the χ2 or Fisher's exact test for categorical variables and the Mann-Whitney test for nonparametric variables. A P value <0.05 was considered statistically significant.

Results

We found a significant reduction in the use of fentanyl (P < 0.001) and midazolam (P = 0.001). Sedation goals were achieved 86.8% of the time in the post-ASP group compared with 74.4% in the pre-ASP group (P < 0.001). Mean mechanical ventilation days were 5.9 in the pre-ASP group versus 3.8 in the post-ASP group (P = 0.033).
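
The between-group comparisons described above (χ2 or Fisher's exact test for categorical variables, Mann-Whitney for continuous variables) can be sketched with scipy; the counts and ventilator-day values below are hypothetical.

```python
# Minimal sketch: the group comparisons described above, on hypothetical data.
from scipy import stats

# Categorical outcome (e.g., sedation at goal: yes/no) as a 2x2 table of counts
table = [[87, 13],   # post-ASP: at goal / not at goal
         [74, 26]]   # pre-ASP:  at goal / not at goal
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
odds_ratio, p_fisher = stats.fisher_exact(table)  # preferred when expected counts are small

# Continuous, nonparametric outcome (e.g., mechanical ventilation days per patient)
pre_asp = [7, 5, 9, 4, 6, 12, 3, 8]
post_asp = [4, 3, 5, 2, 6, 4, 3, 5]
u_stat, p_mw = stats.mannwhitneyu(pre_asp, post_asp, alternative="two-sided")

print(f"chi-square P = {p_chi2:.3f}, Fisher P = {p_fisher:.3f}, Mann-Whitney P = {p_mw:.3f}")
```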

Conclusion

In our cohort of critically ill surgical patients, implementation of an ASP resulted in reduced use of continuously infused benzodiazepines and opioids, a decline in cumulative benzodiazepine and analgesic dosages, and a greater percentage of Richmond Agitation Sedation Scale scores at goal. We also observed fewer mechanical ventilation days.

15.

Background

Previous studies among cancer patients have demonstrated that religious patients receive more aggressive end-of-life (EOL) care. We sought to examine the effect of religious affiliation on EOL care in the intensive care unit (ICU) setting.

Materials and methods

We conducted a retrospective review of all patients admitted to any adult ICU at a tertiary academic center in 2010 who required at least 2 d of mechanical ventilation. EOL patients were those who died within 30 d of admission. Hospital charges, ventilator days, hospital days, and days until death were used as proxies for intensity of care among the EOL patients. Multivariate analyses using multiple linear regression, zero-truncated negative binomial regression, and Cox proportional hazards models were performed.

Results

A total of 2013 patients met inclusion criteria, of whom 1355 (67%) affirmed a religious affiliation. The EOL group had 334 patients, 235 (70%) of whom affirmed a religious affiliation. Affiliated and nonaffiliated patients had similar levels of acuity. Controlling for demographic and medical confounders, religiously affiliated patients in the EOL group incurred 23% (P = 0.030) more hospital charges, 25% (P = 0.035) more ventilator days, 23% (P = 0.045) more hospital days, and a 30% (P = 0.036) longer time until death than their nonaffiliated counterparts. Among all included patients, survival did not differ significantly between affiliated and nonaffiliated patients (log-rank test P = 0.317), nor was religious affiliation associated with a difference in survival on multivariate analysis (hazard ratio of death for religious versus nonreligious patients 0.95, P = 0.542).
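
The survival comparison reported above (log-rank test between affiliated and nonaffiliated patients) can be sketched with the lifelines package; the follow-up times and event indicators below are hypothetical.

```python
# Minimal sketch: Kaplan-Meier estimate and log-rank comparison by religious affiliation.
# All times (days) and event indicators below are hypothetical.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

affiliated_t = [5, 30, 12, 30, 8, 30, 21, 30]
affiliated_e = [1, 0, 1, 0, 1, 0, 1, 0]          # 1 = died, 0 = censored at 30 d
nonaffiliated_t = [7, 30, 15, 30, 10, 30]
nonaffiliated_e = [1, 0, 1, 0, 1, 0]

result = logrank_test(affiliated_t, nonaffiliated_t,
                      event_observed_A=affiliated_e, event_observed_B=nonaffiliated_e)
print(f"log-rank P = {result.p_value:.3f}")

kmf = KaplanMeierFitter()
kmf.fit(affiliated_t, event_observed=affiliated_e, label="affiliated")
print(kmf.survival_function_)  # Kaplan-Meier survival estimate for the affiliated group
```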

Conclusions

Compared with nonaffiliated patients, religiously affiliated patients receive more aggressive EOL care in the ICU. However, this high-intensity care does not translate into any significant difference in survival.

16.

Introduction

Metastatic disease is generally considered an absolute contraindication to liver transplantation. However, owing to their relatively low aggressiveness and slow progression, liver metastases from neuroendocrine tumors (NETs) are an exception to this rule. Given the scarcity of available data, the purpose of this study was to evaluate long-term outcomes following liver transplantation for NET metastases.

Material and Methods

Twelve primary liver transplantations were performed in patients with NET metastases among 1334 liver transplantations carried out in the Department of General, Transplant and Liver Surgery (Medical University of Warsaw) between December 1989 and October 2013. Overall survival (OS) and disease-free survival (DFS) were the primary and secondary outcome measures, respectively.

Results

Median follow-up was 7.9 years. For all patients, OS rate was 78.6% at 10 years and DFS rate was 15.5% at 9 years. Intraoperative transfusions of packed red blood cells (P = .021), Ki-67 proliferative index more than 2% (P = .048), and grade 2 tumors (P = .037) were identified as factors significantly associated with worse DFS. Notably, loss of E-cadherin expression (P = .444), mitotic rate (P = .771), extent of liver involvement (P = .548), primary tumor site (P = .983), and recipient age (P = .425) were not significantly associated with DFS.

Conclusions

Excellent long-term OS rates support liver transplantation for unresectable NET metastases despite almost universal post-transplantation tumor recurrence. Selecting patients with G1 tumors and a Ki-67 index not exceeding 2%, and reducing the need for intraoperative blood transfusion, might improve DFS rates.

17.

Background

The receptor for advanced glycation end products (RAGE) has been implicated in cancer progression in several human cancers. In this study, we investigated the clinical impact of RAGE expression in patients with hepatocellular carcinoma (HCC) after hepatectomy.

Materials and methods

Sixty-five consecutive patients who underwent initial hepatectomy for HCC were investigated. The relationships between immunohistochemical expression of RAGE and clinicopathologic features and clinical outcomes (overall survival [OS] and disease-free survival [DFS]) were evaluated.

Results

Cytoplasmic expression of RAGE in HCC cells was observed in 46 patients (70.8%) and correlated with histologic grade (poorly differentiated versus moderately differentiated HCC, P = 0.021). Five-year OS in the RAGE-positive and RAGE-negative groups was 72% and 94%, respectively, whereas 5-y DFS was 29% and 55%, respectively; both differences were significant (P = 0.018 and 0.031, respectively). Multivariate analysis indicated that RAGE was an independent predictor of both OS and DFS (P = 0.048 and 0.032, respectively).

Conclusions

Our data suggest for the first time a positive correlation between RAGE expression and poor therapeutic outcome. Furthermore, RAGE downregulation may provide a novel therapeutic target for HCC.

18.

Background

Disparities in socioeconomic factors have been associated with higher rates of perforated appendicitis. Because an equal-access health care system theoretically removes these barriers, we aimed to determine whether remaining differences in demographics, education, and pay result in disparate rates of perforated appendicitis.

Materials and methods

All patients undergoing appendectomy for acute appendicitis (November 2004–October 2009) at a tertiary care equal access institution were categorized by demographics and perioperative data. Rank of the sponsor was used as a surrogate for economic status. A multivariate logistic regression model was performed to determine patient and clinical characteristics associated with perforated appendicitis.

Results

A total of 680 patients (mean age 30 ± 16 y; 37% female) were included. The majority were Caucasian (56.4% [n = 384]), with 5.6% African American (n = 38), 1.9% Asian (n = 13), and 36.0% other (n = 245); 87.2% were enlisted. Overall, 6.4% presented with perforation, with rates of 6.6%, 5.8%, and 6.7% (P = 0.96) for officers, enlisted soldiers, and contractors, respectively. There was no difference in perforation when stratified by junior or senior status for either officers or enlisted personnel (9.3% junior versus 4.4% senior officers, P = 0.273; 6.6% junior versus 5.5% senior enlisted, P = 0.369). On multivariate analysis, parameters such as leukocytosis and temperature, as well as race and rank, were not associated with perforation (P = 0.7). Only age was associated, with individuals aged 66–75 y having higher perforation rates (odds ratio, 1.04; 95% confidence interval, 1.02–1.05; P < 0.001).

Conclusions

In an equal-access health care system, older age, not socioeconomic factors, correlated with increased appendiceal perforation rates.

19.
20.

Background

Previous studies have indicated that clinical pathways may shorten hospital length of stay (HLOS) among patients undergoing distal pancreatectomy (DP). Here, we evaluate an institutional standardized care pathway (SCP) for patients undergoing DP.

Materials and methods

A retrospective review of patients undergoing DP from November 2006 to November 2012 was completed. Patients treated before and after implementation of the SCP were compared. Multivariable linear regression was then performed to identify independent predictors of HLOS.

Results

There were no differences in patient characteristics between SCP (n = 50) and pre-SCP (n = 100) patients. Laparoscopic technique (62% versus 13%, P < 0.001), splenectomy (52% versus 38%, P = 0.117), and concomitant major organ resection (24% versus 13%, P = 0.106) were more common among SCP patients. Overall, important complication rates were similar (24% versus 26%, P = 0.842). SCP patients resumed a normal diet earlier (4 versus 5 d, P = 0.025) and had a shorter HLOS (6 versus 7 d, P = 0.026). There was no increase in 30-d reoperation or readmission. In univariate comparison, SCP, cancer diagnosis, intraductal papillary mucinous neoplasm diagnosis, neoadjuvant therapy, operative technique, major organ resection, and feeding tube placement were associated with HLOS; however, after multivariable adjustment, only laparoscopic technique (−33%, P = 0.001), concomitant major organ resection (+38%, P < 0.001), and feeding tube placement (+68%, P < 0.001) were independent predictors of HLOS.
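
The percentage effects reported for the multivariable model suggest a regression on log-transformed HLOS; that transformation is our assumption. A minimal statsmodels sketch on hypothetical data is shown below.

```python
# Minimal sketch: multivariable linear regression of log(HLOS) on perioperative factors.
# A log-transformed outcome is assumed because effects above are reported as percentages;
# all data and column names below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "hlos":         [5, 9, 6, 14, 7, 4, 11, 6, 8, 5],   # hospital length of stay, days
    "laparoscopic": [1, 0, 1, 0, 0, 1, 0, 1, 0, 1],
    "organ_resect": [0, 1, 0, 1, 0, 0, 1, 0, 1, 0],
    "feeding_tube": [0, 1, 0, 1, 1, 0, 1, 0, 0, 0],
})
df["log_hlos"] = np.log(df["hlos"])

fit = smf.ols("log_hlos ~ laparoscopic + organ_resect + feeding_tube", data=df).fit()
print((np.exp(fit.params) - 1).round(2))  # approximate % change in HLOS per predictor
```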

Conclusions

Implementation of a clinical pathway did not improve HLOS at our institution. The increasing use of laparoscopy likely accounts for shorter HLOS in the SCP cohort. In the future, it will be important to identify clinical scenarios most likely to benefit from implementation of a clinical pathway.
