Similar Articles
1.

BACKGROUND

In modern clinical research, the accepted minimum follow-up for patient-reported outcome measures (PROMs) after lumbar spine surgery is 24 months, particularly after fusion. Recently, this minimum requirement has been called into question.

PURPOSE

We aim to quantify the concordance of 1- and 2-year PROMs to evaluate the importance of long-term follow-up after elective lumbar spine surgery.

STUDY DESIGN

Retrospective analysis of data from a prospective registry.

PATIENT SAMPLE

We identified all patients in our prospective institutional registry who underwent degenerative lumbar spine surgery with complete baseline, 12-month, and 24-month follow-up for ODI and numeric rating scales for back and leg pain (NRS-BP and NRS-LP).

OUTCOME MEASURES

Oswestry Disability Index (ODI) and NRS-BP and NRS-LP at 1 year and at 2 years.

METHODS

We evaluated the concordance of 1- and 2-year change scores using Pearson's product-moment correlation and performed logistic regression to assess whether achieving the minimum clinically important difference (MCID) at 12 months predicted 24-month MCID. Odds ratios (OR) with their 95% confidence intervals (CI), as well as model areas under the curve, were obtained.
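The pipeline described here (correlation of change scores, then logistic regression summarized by an AUC) can be sketched as follows. This is an illustration only: the data, the MCID threshold, and all effect sizes are synthetic stand-ins, not values from the registry.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 210  # cohort size reported in the abstract

# Synthetic 12-month ODI change scores, with 24-month values tracking them closely
odi_change_12m = rng.normal(20, 10, n)
odi_change_24m = odi_change_12m + rng.normal(0, 5, n)

r, p = pearsonr(odi_change_12m, odi_change_24m)

# Does reaching MCID at 12 months predict reaching it at 24 months?
MCID = 12.8  # hypothetical ODI threshold; the abstract does not state the one used
x = (odi_change_12m >= MCID).astype(int).reshape(-1, 1)
y = (odi_change_24m >= MCID).astype(int)

model = LogisticRegression().fit(x, y)
odds_ratio = float(np.exp(model.coef_[0, 0]))
auc = roc_auc_score(y, model.predict_proba(x)[:, 1])
print(f"r={r:.2f}  OR={odds_ratio:.1f}  AUC={auc:.2f}")
```

With real registry data, the change scores and the registered MCID thresholds would replace the synthetic arrays above.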

RESULTS

A total of 210 patients were included. We observed excellent correlation between 12- and 24-month ODI (r=0.88), NRS-LP (r=0.76), and NRS-BP (r=0.72; all p<.001). Similar results were obtained when stratifying for discectomy, decompression, or fusion. Patients achieving 12-month MCID were likely to achieve 24-month MCID for ODI (OR: 3.3, 95% CI: 2.4–4.1), NRS-LP (OR: 2.99, 95% CI: 2.2–4.2), and NRS-BP (OR: 3.4, 95% CI: 2.7–4.2; all p<.001), with excellent area-under-the-curve values of 0.81, 0.77, and 0.84, respectively. Concordance rates between MCID at both follow-ups were 87.2%, 83.8%, and 84.2%. A post-hoc power analysis demonstrated sufficient statistical power.

CONCLUSIONS

Irrespective of the surgical procedure, 12-month PROMs for functional disability and pain severity accurately reflect those at 24 months. In support of previous literature, our results suggest that 12 months of follow-up may be sufficient for evaluating spinal patient care in clinical practice as well as in research.

2.

BACKGROUND CONTEXT

Patients with pyogenic vertebral osteomyelitis (PVO) are expected to have an increased risk of bone loss. Early bisphosphonate therapy may therefore be clinically effective for PVO patients with osteoporosis.

PURPOSE

This study aimed to investigate the effect of bisphosphonate on clinical outcomes of PVO patients with osteoporosis.

STUDY DESIGN/SETTING

A retrospective comparative study.

PATIENT SAMPLE

PVO patients with osteoporosis.

OUTCOME MEASURES

Four events of interest for the Cox proportional hazards model: surgical treatment, recurrence of infection, subsequent fracture of adjacent vertebral bodies, and death.

METHODS

PVO patients were divided into three groups: group A (initiation of bisphosphonate within 6 weeks after PVO diagnosis), group B (initiation of bisphosphonate between 6 weeks and 3 months after PVO diagnosis), and group C (no treatment for osteoporosis). Cox proportional hazard model was used for the four events of interest.
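The study fits a Cox proportional hazards model; as a simplified stand-in, a crude incidence rate ratio with a normal-approximation 95% CI on the log scale captures the same group-versus-group comparison. All event counts and person-time values below are hypothetical, not the study's data.

```python
import math

def rate_ratio(events_a, persontime_a, events_c, persontime_c):
    """Crude rate ratio of group A vs group C with a log-scale 95% CI."""
    rr = (events_a / persontime_a) / (events_c / persontime_c)
    se = math.sqrt(1 / events_a + 1 / events_c)  # SE of log(rate ratio)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical subsequent-fracture counts: early bisphosphonate (A) vs no treatment (C)
rr, lo, hi = rate_ratio(events_a=6, persontime_a=120.0,
                        events_c=20, persontime_c=118.0)
print(f"RR={rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A Cox model additionally handles censoring and covariates; the crude ratio above only conveys the direction and rough scale of such a comparison.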

RESULTS

A total of 360 PVO patients with osteoporosis were investigated for the four events of interest. Group A had significantly lower hazard ratios for undergoing later surgery (>6 weeks after diagnosis) than group C (p=.014) despite similar occurrences of overall surgery. A significant difference was also observed in the occurrence of subsequent fractures at adjacent vertebral bodies (p=.001 for model 1 and p=.002 for model 2). Groups A and B had significantly lower hazard ratios for subsequent fracture than group C. No significant differences were observed in the hazard ratios of recurrence and death among the three groups.

CONCLUSIONS

Early bisphosphonate treatment in PVO patients with osteoporosis was associated with a significantly lower occurrence of subsequent vertebral fracture at adjacent vertebral bodies and lower occurrence of subsequent surgery.

3.

BACKGROUND CONTEXT

Data on the long-term outcome after fusion for isthmic spondylolisthesis are scarce.

PURPOSE

To study patient-reported outcomes and adjacent segment degeneration (ASD) after fusion for isthmic spondylolisthesis and to compare patient-reported outcomes with a control group.

STUDY DESIGN/SETTING

A prospective study including a cross-sectional control group.

PATIENT SAMPLE

Patients with isthmic spondylolisthesis underwent posterior lumbar interbody fusion (PLIF) (n=86) or posterolateral fusion (PLF) (n=77). Patient-reported outcome data were available for 73 patients in the PLIF group and 71 in the PLF group at a mean of 11 (range 5–16) years after baseline. Seventy-seven patients in the PLIF group and 54 in the PLF group had radiographs at a mean of 14 (range 9–19) years after baseline. One hundred thirty-six randomly selected persons from the population served as controls for the patient-reported outcomes.

OUTCOME MEASURES

Patient-reported outcomes include the following: global outcome, Oswestry Disability Index, Disability Rating Index, and Short Form 36. The ASD was determined from radiographs using the University of California Los Angeles (UCLA) grading scale.

METHODS

The chi-square test or analysis of covariance (ANCOVA) was used for group comparisons. The ANCOVA was adjusted for follow-up time, smoking, Meyerding slippage grade, teetotaler status (yes/no) and, if available, the baseline level of the dependent variable.

RESULTS

There were no significant patient-reported outcome differences between the PLIF group and the PLF group. The prevalence of ASD was 42% (32/77) in the PLIF group and 26% (14/54) in the PLF group (p=.98). The patient-reported outcome data indicated lower physical function and more pain in individuals with surgically treated isthmic spondylolisthesis compared to the controls.

CONCLUSIONS

PLIF and PLF groups had similar long-term patient-reported and radiological outcomes. Individuals with isthmic spondylolisthesis have lower physical function and more pain several years after surgery when compared to the general population.

4.

BACKGROUND CONTEXT

Pedicle screw loosening is common after spinal fusion and can be associated with pseudoarthrosis and pain. When loosening is suspected on standard radiographs, CT is currently considered the advanced imaging modality of choice. MRI with new metal artifact reduction techniques holds potential to be sensitive in the detection of screw loosening. The sensitivity and specificity of these imaging modalities are not yet clear.

PURPOSE

To evaluate the sensitivity and specificity of three different image modalities (standard radiographs, CT, and MRI) for detection of pedicle screw loosening.

STUDY DESIGN/SETTING

Cross-sectional diagnostic study.

PATIENT SAMPLE

Forty-one patients (159 pedicle screws) undergoing revision surgeries after lumbar spinal fusion between August 2014 and April 2017 with preoperative radiographs, CT, and MRI with spinal metal artifact reduction (STIR WARP and TSE high bandwidth sequences).

OUTCOME MEASURES

Sensitivity and specificity in detection of screw loosening for each imaging modality.

METHODS

Screw torque force was measured intraoperatively and compared with preoperative screw loosening signs such as peri-screw edema in MRI and peri-screw osteolysis in CT and radiographs. A torque force of less than 60 Ncm was used to define a screw as loosened.

RESULTS

Sensitivity and specificity in the detection of screw loosening were 43.9% and 92.1% for MRI, 64.8% and 96.7% for CT, and 54.2% and 83.5% for standard radiographs, respectively.
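As a reminder of how such figures are derived from the intraoperative torque reference standard (loose if <60 Ncm) against each imaging read, a minimal sketch follows. The counts are hypothetical, chosen only to illustrate the formulas, not the study's raw data.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)  # loose screws correctly flagged by imaging
    specificity = tn / (tn + fp)  # solid screws correctly cleared by imaging
    return sensitivity, specificity

# Hypothetical CT reads over 159 screws (tp+fn+tn+fp = 159)
sens, spec = sens_spec(tp=35, fn=19, tn=101, fp=4)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```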

CONCLUSIONS

Despite the improvements offered by metal artifact reduction MRI techniques, CT remains the modality of choice. Even so, CT fails to detect all loosened pedicle screws.

5.

BACKGROUND CONTEXT

Transforaminal lumbar interbody fusion (TLIF) is a widely accepted surgical procedure, but cage migration (CM) and cage retropulsion (CR) are associated with poor outcomes.

PURPOSE

This study seeks to identify risk factors associated with these serious events.

STUDY DESIGN

A prospective observational longitudinal study.

PATIENT SAMPLE

Over a 5-year period, 881 lumbar levels in 784 patients were treated using TLIF at three spinal surgery centers.

OUTCOME MEASURES

We evaluated the odds ratio of the risk factors for CM with and without subsidence and CR in multivariate analysis.

METHODS

Our study classified CM into two subgroups: CM without subsidence and CM with subsidence. Cases of spinal canal and/or foramen intrusion of the cage were defined separately as CR. Patient records, operative notes, and radiographs were analyzed for factors potentially related to CM with subsidence, CM without subsidence, and CR.

RESULTS

Of 881 lumbar levels treated with TLIF, CM without subsidence was observed in 20 levels (2.3%) and CM with subsidence in 36 levels (4.1%). Among the CM cases, CR was observed in 17 (17/56, 30.4%). The risk factors of CM without subsidence were osteoporosis (OR 8.73, p < .001) and use of a unilateral single cage (OR 3.57, p < .001). Osteoporosis (OR 5.77, p < .001) and endplate injury (OR 26.87, p < .001) were found to be significant risk factors for CM with subsidence. Risk factors of CR were osteoporosis (OR 7.86, p < .001), pear-shaped disc (OR 8.28, p = .001), endplate injury (OR 18.70, p < .001), unilateral single cage use (OR 4.40, p = .03), and posterior cage position (OR 6.45, p = .04). A difference in overall fusion rates was identified, with a rate of 97.1% (801 of 825) for no CM, 55.0% (11 of 20) for CM without subsidence, 41.7% (15 of 36) for CM with subsidence, and 17.6% (3 of 17) for CR at 1.5 years postoperatively.

CONCLUSIONS

Our results suggest that osteoporosis is a significant risk factor for both CM and CR. In addition, a pear-shaped disc, posterior positioning of the cage, the presence of endplate injury, and the use of a single cage were correlated with CM (with and without subsidence) and CR.

6.

Background Context

Posterior lumbar fusion (PLF) is a commonly performed procedure. The evolution of bundled payment plans is beginning to require physicians to more closely consider patient outcomes up to 90 days after an operation. Current quality metrics and other databases often consider only 30 postoperative days. The relatively new Healthcare Cost and Utilization Project Nationwide Readmissions Database (HCUP-NRD) tracks patient-linked hospital admissions data for up to one calendar year.

Purpose

To identify readmission rates within 90 days of discharge following PLF and to put this in context of 30 day readmission and baseline readmission rates.

Study Design

Retrospective study of patients in the HCUP-NRD.

Patient Sample

All patients undergoing PLF in the first 9 months of 2013 were identified in the HCUP-NRD.

Outcome Measures

Readmission patterns up to a full calendar year after discharge.

Methods

PLFs performed in the first 9 months of 2013 were identified in the HCUP-NRD. Patient demographics and readmissions were tracked for 90 days after discharge. To estimate the average admission rate in an untreated population, the average daily admission rate in the last quarter of the year was calculated for a subset of PLF patients who had their operation in the first quarter of the year. This study was deemed exempt by the institution's Human Investigation Committee.

Results

Of 26,727 PLFs, 1,580 patients (5.91%) were readmitted within 30 days of discharge and 2,603 patients (9.74%) were readmitted within 90 days of discharge. Of all readmissions within 90 days, 54.56% occurred in the first 30 days. However, if only counting readmissions above the baseline admission rate of a matched population from the 4th quarter of the year (0.08% of population/day), 89.78% of 90 day readmissions occurred within the first 30 days.
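The baseline-adjustment arithmetic can be sketched with the figures quoted above. Clipping negative excess at zero is a simplifying assumption of mine, not the study's stated method, so the exact published percentages are not reproduced, but the direction of the effect is.

```python
# Figures quoted in the abstract: 26,727 PLFs, 1,580 readmitted by 30 days,
# 2,603 by 90 days, baseline admission rate 0.08% of the population per day.
n_patients = 26_727
readmit_30 = 1_580
readmit_90 = 2_603
baseline_daily = 0.0008

# Raw share of 90-day readmissions occurring in the first 30 days
raw_share = readmit_30 / readmit_90

# Share after subtracting expected baseline admissions in each window;
# clipping negative excess at zero is my simplification, not the study's method
excess_early = max(0.0, readmit_30 - baseline_daily * n_patients * 30)
excess_late = max(0.0, (readmit_90 - readmit_30) - baseline_daily * n_patients * 60)
adjusted_share = excess_early / (excess_early + excess_late)

print(f"raw share={raw_share:.1%}, baseline-adjusted share={adjusted_share:.1%}")
```

The point of the adjustment: once ordinary background admissions are netted out, the surgery-attributable readmission burden concentrates even more heavily in the first 30 days.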

Conclusions

The current study delineates readmission rates after PLF and puts this in the context of 30-day readmission rates and baseline readmission rates for those undergoing PLF. These results are important for patient counseling, planning, and preparing for potential bundled payments in spine surgery.

7.

BACKGROUND CONTEXT

Spinal epidural lipomatosis (SEL) is a condition in which excess lumbar epidural fat (EF) deposition often leads to compression of the cauda equina or nerve root. Although SEL is often observed in obese adults, no systematic research investigating the potential association between SEL and metabolic syndrome has been conducted.

PURPOSE

To elucidate potential association between SEL and metabolic syndrome.

STUDY DESIGN

An observational study using data from medical checkups.

PATIENT SAMPLE

We retrospectively reviewed data from consecutive subjects undergoing medical checkups. A total of 324 subjects (174 men and 150 women) were enrolled in this study.

OUTCOME MEASURES

The correlation of EF accumulation with demographic data and metabolic-related factors was evaluated.

METHODS

The degree of EF accumulation was evaluated based on the axial views of lumbar magnetic resonance imaging. Visceral and subcutaneous fat areas were measured at the navel level using abdominal computed tomography. Metabolic syndrome was diagnosed according to the criteria of the Japanese Society of Internal Medicine. The correlation of SEL with metabolic syndrome and metabolic-related conditions was statistically evaluated.

RESULTS

The degree of EF accumulation demonstrated a significant correlation with body mass index, abdominal circumference, and visceral fat area. However, age, body fat percentage, and subcutaneous fat area showed no correlation with the degree of EF accumulation. Logistic regression analysis revealed that metabolic syndrome (odds ratio [OR]=3.8, 95% confidence interval [CI]=1.5–9.6) was significantly associated with SEL. Among the diagnostic criteria for metabolic syndrome, visceral fat area ≥100 cm2 (OR=4.8, 95% CI=1.5–15.3) and hypertension (OR=3.5, 95% CI=1.1–11.8) were observed to be independently associated with SEL.
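How a logistic regression coefficient becomes an odds ratio with a 95% CI, as reported above: exponentiate the coefficient and its confidence bounds. The standard error below is a hypothetical value chosen for illustration, not the study's estimate.

```python
import math

beta = math.log(3.8)  # hypothetical coefficient for metabolic syndrome
se = 0.47             # hypothetical standard error of that coefficient

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(f"OR={odds_ratio:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")
```

Note that the CI is symmetric on the log-odds scale, which is why published intervals around an OR of 3.8 look skewed (e.g. 1.5 to roughly 9.5) on the natural scale.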

CONCLUSION

This is the first study to demonstrate that metabolic syndrome is associated with SEL in a relatively large, unbiased population. Our data suggest that metabolic-related conditions are potentially related to EF deposition and that SEL could be a previously unrecognized manifestation of metabolic syndrome.

8.

BACKGROUND CONTEXT

Surgical treatment of cervical ossification of the posterior longitudinal ligament (OPLL) has a high risk of various complications. Anterior decompression with fusion (ADF) and laminoplasty (LAMP) are the most representative surgical procedures. However, few studies have compared the two procedures in terms of perioperative surgical complications.

PURPOSE

To compare the perioperative complications post-ADF and LAMP for cervical OPLL using a large national inpatient database.

STUDY DESIGN

A retrospective cohort study with propensity score matching analysis.

PATIENT SAMPLE

Overall, 8,718 patients (ADF/LAMP: 1,333/7,485) who underwent surgery for cervical OPLL from April 1, 2010 to March 31, 2016 in hospitals participating in the Diagnosis Procedure Combination database were analyzed.

OUTCOME MEASURES

The occurrence of postoperative complications during hospitalization.

METHODS

We compared the perioperative systemic and local complications, reoperation rates, and costs between ADF and LAMP using propensity score matching analysis.

RESULTS

One-to-one matching resulted in 1,192 pairs of patients who underwent ADF and LAMP. The postoperative cardiovascular event rate was significantly higher (ADF/LAMP=1.9/0.8%, p=.013) in the ADF group. The incidence rates of dysphagia (similarly, 2.4/0.2%, p<.001), pneumonia (1.0/0.3%, p=.045), and spinal fluid leakage (2.4/0.4%, p<.001) were also higher in the ADF group, even after matching. The costs were also higher in the ADF group. However, surgical site infection (2.0/3.4%, p=.033) was significantly lower in the ADF group. No significant difference in the reoperation rates was found between the groups.

CONCLUSION

The present study, using a large nationwide database, demonstrated that perioperative complications were more common in the ADF group, but that surgical site infection (SSI) was more frequently observed in the LAMP group.

9.

BACKGROUND CONTEXT

Postdischarge care is a significant source of cost variability after posterior lumbar fusion surgery. However, there remains limited evidence associating postdischarge inpatient services and improved postoperative outcomes, despite the high cost of these services.

PURPOSE

To determine the association between posthospital discharge to inpatient care facilities and postoperative complications.

STUDY DESIGN

A retrospective review of all 1- to 3-level primary posterior lumbar fusion cases in the 2010-2014 National Surgical Quality Improvement Program registry was conducted. Propensity scores for discharge destination were determined based on observable baseline patient characteristics. Multivariable propensity-adjusted logistic regressions were performed to determine associations between discharge destination and postdischarge complications, with adjusted odds ratios (OR) and 95% confidence intervals (CI).
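Propensity-score estimation followed by 1:1 matching, the general technique named here, can be sketched as follows. The data are synthetic and the covariates, coefficients, and greedy nearest-neighbor matching rule are illustrative assumptions, not the study's specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1_000
age = rng.normal(60, 10, n)
comorbidity = rng.integers(0, 2, n)
X = np.column_stack([age, comorbidity])

# Hypothetical assignment: older, sicker patients more often go to facilities
p_facility = 1 / (1 + np.exp(-(-8 + 0.12 * age + 0.8 * comorbidity)))
treated = rng.random(n) < p_facility

# Propensity score: probability of facility discharge given covariates
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbor matching on the propensity score
controls = list(np.where(~treated)[0])
pairs = []
for t in np.where(treated)[0]:
    j = min(controls, key=lambda c: abs(ps[c] - ps[t]))
    pairs.append((t, j))
    controls.remove(j)  # match without replacement

print(f"{int(treated.sum())} treated matched to {len(pairs)} controls")
```

In the study, the matched or propensity-adjusted sample is then used in the outcome regression, so that discharge-destination effects are compared between patients with similar baseline characteristics.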

RESULTS

A total of 18,652 posterior lumbar fusion cases were identified, 15,234 (82%) were discharged home, and 3,418 (18%) were discharged to continued inpatient care. Multivariable propensity-adjusted analysis demonstrated that being discharged to inpatient facilities was independently associated with higher risk of thromboembolic complications (OR [95% CI]: 1.79 [1.13–2.85]), urinary complications, (1.79 [1.27–2.51]), and unplanned readmissions (1.43 [1.22–1.68]).

CONCLUSIONS

Discharge to continued inpatient care versus home after primary posterior lumbar fusion is independently associated with higher odds of certain major complications. To optimize clinical outcomes as well as cost savings in an era of value-based reimbursements, clinicians and hospitals should carefully investigate which patients might be better served by home discharge after surgery.

10.

BACKGROUND CONTEXT

Health literacy, defined as “the degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions,” has been demonstrated to affect access to care and appropriate healthcare utilization.

PURPOSE

To determine the impact of health literacy in the evaluation and management of patients with chronic low back pain.

STUDY DESIGN

Cross-sectional.

PATIENT SAMPLE

Patients seen at a multisurgeon spine specialty clinic.

OUTCOME MEASURES

Oswestry Disability Index, EQ-5D, and Numeric Rating Scales (0–10) for back and leg pain.

METHODS

The Newest Vital Sign (NVS) and Health Literacy Survey, Oswestry Disability Index, EQ-5D and pain scales were administered to patients undergoing evaluation and treatment for lumbar degenerative disease in the outpatient setting. Patients were surveyed regarding their use of medication, therapy, and pain management modalities.

RESULTS

Of 201 patients approached for participation, 186 completed the health literacy surveys. Based on their NVS scores, 30 patients (17%) were assessed as having limited literacy, 52 (28%) as possibly having limited literacy, and 104 (56%) as having adequate literacy. The cohort with low NVS scores also had low Health Literacy Survey scores. Patients with limited literacy had worse back and leg pain scores compared with patients with possibly limited literacy and adequate literacy. Patients with adequate health literacy were more likely to use medications (80% vs. 53%, p=.017) and were more likely to see a specialist (34% vs. 17%) compared with those with limited literacy. Patients with limited health literacy were not more likely to see a chiropractor (7% vs. 7%), but reported more visits (19 vs. 8).

CONCLUSIONS

Patients with lower health literacy reported worse back and leg pain scores, indicating either more severe disease or a fundamental difference in their responses to standard health-related quality of life measures. This study also suggests that patients with limited health literacy may underutilize some resources and overutilize other resources. Further study is needed to clarify these patterns, and to examine their impact on health status and clinical outcomes.

11.

BACKGROUND CONTEXT

Degenerative lumbar scoliosis (DLS) is an increasingly common spinal disorder of which current management is characterized by a substantial variety in treatment advice. To improve evidence-based clinical decision-making and increase uniformity and transparency of care, the Scoliosis Research Society established appropriateness criteria for surgery for DLS. In these criteria, however, the patient perspective was not formally incorporated. Since patient perspective is an increasingly important consideration in informed decision-making, embedding patient-reported outcome measures (PROMs) in the appropriateness criteria would allow for an objective and transparent patient-centered approach.

PURPOSE

To evaluate the extent that patient perspective is integrated into the appropriateness criteria of surgery for DLS.

STUDY DESIGN

Single-center, retrospective cohort study.

PATIENT SAMPLE

150 patients with symptomatic degenerative lumbar scoliosis.

OUTCOME MEASURES

The association between appropriateness for surgery and various PROMs [Visual Analogue Scale for pain, Short Form 36 (SF-36), Pain Catastrophizing Scale (PCS), Hospital Anxiety Depression Scale (HADS), and Oswestry Disability Index (ODI)].

METHODS

Medical records of all patients with symptomatic DLS were reviewed and scored according to the appropriateness criteria. To assess the association between the appropriateness criteria and the validated PROMs, analysis of variance was used to test for differences in PROMS for each of the three categories resulting from the appropriateness criteria. To assess how well PROMs can discriminate between appropriate and inappropriate, we used a logistic regression analysis. Discriminative ability was subsequently determined by computing the area under the curve (AUC), resulting from the logistic regression analysis. Spearman rank analysis was used to establish a correlation pattern between the PROMs used and the appropriateness criteria.

RESULTS

There was a significant association between the appropriateness of surgery and the PROMs. The discriminative ability for appropriateness of surgery for PROMs as a group was strong (AUC of 0.83). However, when considered in isolation, the predictive power of any individual PROMs was poor. The different categories of the appropriateness criteria significantly coincided with the PROMs used.

CONCLUSION

There is a statistically significant association between the appropriateness criteria of surgery for DLS and PROMs. Implementation of PROMs into the appropriateness criteria may lead to more transparent, quantifiable and uniform clinical decision making for DLS.

12.

Background

In bariatric surgery, preoperative very low-calorie diets (VLCD) may better meet the technical demands of surgery by shrinking the liver. However, diets may affect tissue healing and influence bowel anastomosis in an as-yet-undefined manner.

Objective

This randomized controlled trial aimed to examine the effect of a 4-week VLCD on collagen deposition in wounds in patients before laparoscopic gastric bypass.

Setting

University hospital.

Methods

The trial was undertaken in patients undergoing laparoscopic Roux-en-Y gastric bypass, with a control group (n=10) on a normal diet and an intervention group (n=10) on a VLCD (800 kcal) for 4 weeks. The primary outcome measured was expression of collagen I and III in skin wounds, with biopsies taken before and after the diet and 7 days postoperatively as a surrogate of anastomotic healing. Secondary outcome measures included liver volume and fibrosis score, body composition, operating time, blood loss, hospital stay, and complications.

Results

Patients in both groups were similar in age, sex, body mass index (53.4 versus 52.8 kg/m2), co-morbidities, liver volume, and body composition. Expression of mature collagen type I was significantly decreased in diet patients compared with controls after 4 weeks of diet and 7 days after surgery. There was a significant decrease in liver volume (23% versus 2%, P=.03) but no difference in operating times (129 versus 139 min, P=.16), blood loss, length of stay, or incidence of complications.

Conclusions

Preoperative diets shrink liver volume and decrease expression of mature collagen in wounds after surgery. Whether the latter has a detrimental effect on clinical outcomes requires further evaluation.

13.

Background

For a number of years the laparoscopic adjustable gastric band has been one of the leading bariatric procedures with good short-term outcomes. However, inadequate weight loss, weight regain, and other band-related complications in the long term led to an increase in revisional Roux-en-Y gastric bypass (RYGB) procedures. Lengthening the biliopancreatic limb, a relatively simple and safe adjustment of the standard technique, could improve the results of the revisional procedure.

Objectives

The aim of this randomized controlled trial was to evaluate the effect of a long biliopancreatic limb RYGB (LBP-GB) and standard RYGB (S-GB) as revisional procedure after laparoscopic adjustable gastric band.

Setting

General hospital specialized in bariatric surgery.

Methods

One hundred forty-six patients were randomized into 2 groups: 73 patients underwent S-GB (alimentary/biliopancreatic limb 150/75 cm), and 73 patients underwent LBP-GB (alimentary/biliopancreatic limb 75/150 cm). Weight loss, remission of co-morbidities, quality of life, and complications were assessed over a period of 4 years.

Results

Baseline characteristics between the groups were comparable. At 48 months the follow-up rate was 95%. Mean total weight loss after 24 months was 27% for LBP-GB versus 22% for S-GB (P=.015); mean total weight loss after 48 months was 23% and 18%, respectively (P=.036). No significant differences in other parameters were found between the groups.
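Percent total weight loss, the outcome metric reported here, is weight lost as a share of baseline weight. A minimal sketch with a hypothetical patient (the weights are invented for illustration):

```python
def percent_total_weight_loss(baseline_kg, current_kg):
    """%TWL: weight lost expressed as a percentage of baseline weight."""
    return 100.0 * (baseline_kg - current_kg) / baseline_kg

# Hypothetical patient: 130 kg at revision, 100.1 kg at follow-up
twl = percent_total_weight_loss(130.0, 100.1)
print(f"%TWL = {twl:.0f}%")
```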

Conclusions

An LBP-GB as a revisional procedure after a failing laparoscopic adjustable gastric band improves short- and long-term total weight loss compared with an S-GB. Together with future modifications, this technically simple adjustment of the RYGB could significantly improve disappointing results after revisional surgery.

14.

BACKGROUND CONTEXT

Fungal spinal epidural abscess (FSEA) is a rare entity with high morbidity and mortality. Reports describing the clinical features, diagnosis, treatment, and outcomes of FSEA are scarce in the literature.

PURPOSE

This study aimed to describe the clinical features, diagnosis, treatment, and outcomes of FSEA.

STUDY DESIGN

This study is designed as a retrospective clinical case series.

PATIENT SAMPLE

A consecutive series of patients with the diagnosis of FSEA who presented at our institution from 1993 to 2016.

METHODS

We reviewed the electronic medical records of patients with SEA who were treated within our hospital system from 1993 to 2016. We only included SEA cases that were due to fungi. We also reviewed FSEA cases in the English language literature from 1952 to 2017 to analyze the features of FSEA.

RESULTS

From a database of 1,053 SEA patients, we identified 9 patients with FSEA. Aspergillus fumigatus was isolated from 2 (22%) patients, and Candida species were isolated from 7 (78%). Focal spine pain, neurologic deficit, and fever were demonstrated in 89%, 50%, and 44% of FSEA cases, respectively. Five of nine cases involved the thoracic spine, and eight were located anterior to the thecal sac. Three cases had fungemia, six had long symptom duration (>2 weeks) prior to presentation, seven had concurrent immunosuppression, and eight had vertebral osteomyelitis. Additionally, one case had residual motor deficit at last follow-up, one had S1 sensory radicular symptoms, two suffered recurrent FSEA, two died within hospitalization, and two died within 90 days after discharge.

CONCLUSIONS

In summary, the classic diagnostic triad (focal spine pain, neurologic deficit, and fever) is not of great clinical utility for FSEA. Biopsy, intraoperative tissue culture, and blood culture can be used to diagnose FSEA. The most common pathogens of FSEA are Aspergillus and Candida species; empiric treatment for FSEA should therefore cover these species while definitive identification is pending. FSEA occurs in patients with poor baseline health status, which is a key driver of its high mortality.

15.

Background

The essence of the enhanced recovery after surgery (ERAS) program is its multimodal approach, and many authors have demonstrated the safety and feasibility of fast-track bariatric surgery.

Objectives

The aim of this study was to evaluate the postoperative pain after the implementation of an ERAS protocol in Roux-en-Y gastric bypass and to compare it with the application of a standard care protocol.

Setting

University Hospital Rey Juan Carlos, Madrid, Spain.

Methods

A prospective randomized clinical trial of all patients undergoing Roux-en-Y gastric bypass was performed. Patients were randomized into 2 groups: those managed under an ERAS program and those managed under a standard care protocol. Postoperative pain, nausea or vomiting, morbidity, mortality, hospital stay, and acute phase reactant levels 24 hours after surgery were evaluated.

Results

One hundred eighty patients were included in the study, 90 in each group. Postoperative pain (16 versus 37 mm; P < .001), nausea or vomiting (8.9% versus 2.2%; P=.0498), and hospital stay (1.7 versus 2.8 d; P < .001) were significantly lower in the ERAS group. There were no significant differences in complications, mortality, and readmission rates. White blood cell count, serum fibrinogen, and C-reactive protein levels were significantly lower in the ERAS group 24 hours after surgery.

Conclusion

The implementation of an ERAS protocol was associated with lower postoperative pain, a reduced incidence of postoperative nausea or vomiting, lower levels of acute phase reactants, and earlier hospital discharge. Complications, reinterventions, mortality, and readmission rates were similar to those obtained with a standard care protocol.

16.

Background

Obesity is associated with an increased risk of atrial fibrillation (AF). Bariatric surgery results in substantial long-term weight loss and the amelioration of several chronic comorbidities. We hypothesized that weight reduction with bariatric surgery would reduce the long-term incidence of AF.

Objectives

To assess the association between bariatric surgery and AF prevention.

Setting

University Hospital, United States.

Methods

All patients who underwent bariatric surgery at a single institution from 1985 to 2015 (n=3,572) were propensity score matched 1:1 to a control population of obese patients with outpatient appointments (n=45,750) in our clinical data repository. Patients with a prior diagnosis of AF were excluded. Demographics, relevant comorbidities, and insurance status were collected and a chart review was performed for all patients with AF. Paired univariate analyses were used to compare the two groups.

Results

After propensity score matching, 5,044 total patients were included (2,522 surgical, 2,522 non-surgical). There were no differences in preoperative body mass index (BMI) (47.1 vs 47.7 kg/m2, P=0.76) or medical comorbidities between groups. The incidence of AF was lower among surgical patients (0.8% vs 2.9%, P=0.0001). In patients ultimately diagnosed with AF, time from enrollment to development of AF did not differ between groups; however, surgical patients with AF experienced a significantly greater reduction in excess BMI compared to non-surgical patients with AF (57.9% vs −3.8%, P<0.001).

Conclusion

The incidence of AF was lower among patients who underwent bariatric surgery compared to their medically managed counterparts. Weight reduction with bariatric surgery may reduce the long-term incidence of AF.

17.

BACKGROUND

Rebound pain refers to the pattern, seen on a linear graph of visual analogue scale (VAS) scores, of pain worsening between 2 days and 2 weeks after a selective nerve root block (SNRB).

PURPOSE

The purpose of this study was to determine if sodium hyaluronate and carboxymethyl cellulose solution (HA-CMC sol) injection could reduce the occurrence of rebound pain at 3 days to 2 weeks after SNRB in patients with radiculopathy compared with injection with corticosteroids and local anesthetics alone.

STUDY DESIGN/SETTING

Double-blinded randomized controlled clinical trial.

PATIENT SAMPLE

A total of 44 patients (23 of 24 patients in the Guardix group and 21 of 24 patients in the control group) who finished the follow-up session were subjects of this study.

OUTCOME MEASUREMENT

Patients were asked to write down their average VAS pain scores daily for 12 weeks. Functional outcomes were assessed by the Oswestry Disability Index, Roland Morris Disability Questionnaire, and Short Form-36.

METHOD

Study participants were randomized into one of two treatment regimens. A cocktail of corticosteroids, 1% lidocaine, 0.5% bupivacaine, and 1 mL of normal saline was used for the control group, whereas a cocktail of corticosteroids, 1% lidocaine, 0.5% bupivacaine, and 1 mL of HA-CMC solution was used for the Guardix group. Participants were followed up for 3 months.

RESULTS

VAS score at 2 weeks after the procedure was 4.19±1.32 in the control group, significantly (p<.05) higher than that in the Guardix group (2.43±1.24). VAS score at 6 weeks after the procedure was 4.00±1.23 in the control group and 3.22±1.45 in the Guardix group, showing no significant (p=.077) difference between the two groups. There were no significant differences in functional outcomes at 6 or 12 weeks after the procedure.

CONCLUSIONS

Compared with the conventional cocktail used for SNRB, the addition of HA-CMC sol provided effective control of rebound pain at 3 days to 2 weeks after the procedure.

18.

PURPOSE

To characterize the gross, histologic, and systemic changes caused by implantation of metal fragments commonly used in commercial bullets into the intervertebral disc.

BACKGROUND CONTEXT

Long-term complications of retained bullet fragments in the spine have been documented in the literature; however, the impact of different metal projectiles on the intervertebral disc has not been described. This study was performed to assess the local effects of the metallic bullet fragments on the intervertebral disc and their systemic effects regarding metal ion concentrations in serum and solid organs.

STUDY DESIGN

Animal model study.

METHODS

Funding for this project was provided by the Cervical Spine Research Society in the amount of $10,000. Copper, lead, and aluminum alloys from commercially available bullets were surgically implanted into sequential intervertebral discs in the lumbar spine of six canines. Kirschner wire implantation and a sham operation were performed as controls. Radiographs were performed to confirm the location of the bullets. Animals were sacrificed at 4, 6, and 9 months postimplantation. Whole blood, plasma, cerebrospinal fluid, kidney tissue, and liver tissue samples were analyzed for copper and lead concentrations. Histologic and gross samples were examined at the time of sacrifice.

RESULTS

Significant tissue reactions were noted in the discs exposed to copper and lead. Copper resulted in significantly more severe disc degeneration than either the lead or aluminum alloy. Within the short follow-up interval of this study, no statistically significant trend was observed in whole blood, plasma, cerebrospinal fluid, or tissue metal levels.

CONCLUSION

This study demonstrates that the canine intervertebral disc is differentially susceptible to metallic fragments depending on their composition. Trends toward increasing levels of lead and copper in liver tissue samples were noted, although statistical significance could not be reached owing to the short time interval and small sample size. The metallic composition of retained fragments can be a determining factor when deciding on surgical intervention.

19.

Background

Bariatric surgery is an effective and durable treatment for obesity. However, the number of patients who progress to bariatric surgery after initial evaluation remains low.

Objectives

The purpose of this study was to identify factors influencing a qualified patient's successful progression to surgery in a U.S. metropolitan area.

Setting

Academic, university hospital.

Methods

A single-institution retrospective chart review was performed from 2003 to 2016. Patient demographics and follow-up data were compared between those who did and did not progress to surgery. A follow-up telephone survey was performed for patients who failed to progress. Univariate analyses were performed and statistically significant variables of interest were analyzed using a multivariable logistic regression model.

Results

A total of 1102 patients were identified as eligible bariatric surgery candidates. Four hundred ninety-eight (45%) patients progressed to surgery and 604 (55%) did not. Multivariable analysis showed that patients who did not progress were more likely to be male (odds ratio [OR] 2.2, confidence interval [CI]: 1.2–4.2, P < .05), to be smokers (OR 2.4, CI: 1.1–5.4, P < .05), to have attended more nutrition appointments (OR 2.1, CI: 1.5–2.8, P < .0001), to have attended fewer total preoperative appointments (OR .41, CI: .31–.55, P < .0001), and to reside in-state rather than out of state (OR .39, CI: .22–.68, P < .05). The top 3 patient self-reported factors influencing nonprogression were fear of complications, financial hardship, and insurance coverage.

Conclusions

Multiple patient factors, together with the self-reported factors of fear of complications and financial hardship, influenced progression to bariatric surgery in a U.S. metropolitan population. Bariatric surgeons and centers should consider and address these factors when assessing patients.

20.

Background

Numerous studies have shown that Roux-en-Y gastric bypass (RYGB) and sleeve gastrectomy (SG) differently affect metabolic disorders associated with obesity. While bariatric surgery has been shown to improve nonalcoholic fatty liver disease, very few studies have compared liver parameters after both procedures.

Objectives

To compare the evolution of liver parameters after SG and RYGB and their relationships with improvement of metabolic disorders.

Methods

Metabolic parameters and abdominal ultrasonography were recorded before and 1 year after bariatric surgery in all patients who underwent SG or RYGB between 2004 and 2016 in our institution.

Setting

University hospital, Colombes, France.

Results

Five hundred thirty-three patients (15% men, age 43 ± 11 yr) were analyzed, including 326 who underwent RYGB and 207 who underwent SG. Before surgery, body mass index (44.7 ± 5.7 versus 44.4 ± 7.4 kg/m²) and metabolic parameters were not significantly different. One year after surgery, RYGB induced greater weight loss (31.9 ± 7.7 versus 28.6 ± 8.3%, P < .001). Metabolic parameters improved in both groups, but fasting insulin, low-density lipoprotein cholesterol, C-reactive protein, and ferritin were lower after RYGB (P < .001). In contrast, transaminases were higher after RYGB compared with SG (alanine aminotransferase: 31.6 ± 18.7 versus 22.6 ± 7.7 IU/L; P < .001). The persistence of alanine aminotransferase >34 IU/L (27% versus 7% of patients, P < .001) was independent of the persistence of steatosis on ultrasonography (39% versus 37% of patients) 1 year after RYGB and SG, respectively.

Conclusion

Despite a greater improvement of metabolic disorders, RYGB has a less beneficial effect on liver parameters than SG. Further studies are required to define the mechanisms explaining these differences between the two procedures.
