1.

Objective

To measure the effect of nurse practitioner and pharmacist consultations on the appropriate use of medications by patients.

Design

We studied patients in the intervention arm of a randomized controlled trial. The main trial intervention was provision of multidisciplinary team care and the main outcome was quality and processes of care for chronic disease management.

Setting

Patients were recruited from a single publicly funded family health network practice of 8 family physicians and associated staff serving 10 000 patients in a rural area near Ottawa, Ont.

Participants

A total of 120 patients 50 years of age or older who were on the practice roster and who were considered by their family physicians to be at risk of experiencing adverse health outcomes.

Intervention

A pharmacist and 1 of 3 nurse practitioners visited each patient at his or her home, conducted a comprehensive medication review, and developed a tailored plan to optimize medication use. The plan was developed in consultation with the patient and the patient’s doctor. We assessed medication appropriateness at the study baseline and again 12 to 18 months later.

Main outcome measures

We used the medication appropriateness index to assess medication use. We examined associations between personal characteristics and inappropriate use at baseline and with improvements in medication use at the follow-up assessment. We recorded all drug problems encountered during the trial.

Results

At baseline, 27.2% of medications were inappropriate in some way and 77.7% of patients were receiving at least 1 medication that was inappropriate in some way. At the follow-up assessments these percentages had dropped to 8.9% and 38.6%, respectively (P < .001). Patient characteristics that were associated with receiving inappropriate medication at baseline were being older than 80 years of age (odds ratio [OR] = 5.00, 95% CI 1.19 to 20.50), receiving more than 4 medications (OR = 6.64, 95% CI 2.54 to 17.4), and not having a university-level education (OR = 4.55, 95% CI 1.69 to 12.50).
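The odds ratios reported above come from the study's models, but the mechanics of an unadjusted odds ratio with a 95% Wald confidence interval can be sketched from a 2×2 table. The counts below are hypothetical, for illustration only; the abstract does not report the underlying cell counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome.
    The CI is computed on the log-odds scale and exponentiated."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts (NOT the study's data): 20 of 30 patients over 80
# on an inappropriate medication, versus 40 of 90 younger patients.
or_, lo, hi = odds_ratio_ci(20, 10, 40, 50)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # → OR = 2.50 (95% CI 1.05 to 5.94)
```

Wide intervals such as the 1.19 to 20.50 reported for the over-80 group are typical of small cell counts, since the standard error grows as any cell shrinks.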

Conclusion

We observed large improvements in the appropriate use of medications during this trial. This might provide a mechanism to explain some of the reductions in mortality and morbidity observed in other trials of counseling and advice provided by pharmacists and nurses.

Trial registration number

NCT00238836 (ClinicalTrials.gov).

3.

Introduction

Critically ill patients with respiratory failure undergoing bronchoscopy have an increased risk of hypoxaemia-related complications. Previous studies have shown that in awake, hypoxaemic patients non-invasive ventilation (NIV) is helpful in preventing gas exchange deterioration during bronchoscopy. An alternative and increasingly used means of oxygen delivery is its application via high-flow nasal cannula (HFNC). This study was conducted to compare HFNC with NIV in patients with acute hypoxaemic respiratory failure undergoing flexible bronchoscopy.

Methods

A prospective randomised trial in which 40 critically ill patients with hypoxaemic respiratory failure were randomised to receive either NIV or HFNC during bronchoscopy in the intensive care unit.

Results

After the initiation of NIV and HFNC, oxygen levels were significantly higher in the NIV group than in the HFNC group. Two patients were unable to proceed to bronchoscopy after the institution of HFNC owing to progressive hypoxaemia. During bronchoscopy, one patient on HFNC deteriorated because of intravenous sedation, requiring non-invasive ventilatory support. Bronchoscopy was well tolerated by all other patients. There were no significant differences between the two groups in heart rate, mean arterial pressure, or respiratory rate. Three patients in the NIV group and one patient in the HFNC group were intubated within 24 hours after the end of bronchoscopy (P = 0.29).

Conclusions

The application of NIV was superior to HFNC with regard to oxygenation before, during and after bronchoscopy in patients with moderate to severe hypoxaemia. In patients with stable oxygenation under HFNC, subsequent bronchoscopy was well tolerated.

Trial registration

ClinicalTrials.gov NCT01870765. Registered 30 May 2013.

4.

Background

Limited evidence suggests that dietary interventions may offer a promising approach for migraine. The purpose of this study was to determine the effects of a low-fat plant-based diet intervention on migraine severity and frequency.

Methods

Forty-two adult migraine sufferers were recruited from the general community in Washington, DC, and divided randomly into two groups. This 36-week crossover study included two treatments: dietary instruction and placebo supplement. Each treatment period was 16 weeks, with a 4-week washout between them. During the diet period, a low-fat vegan diet was prescribed for 4 weeks, after which an elimination diet was used. Participants were assessed at the beginning, midpoint, and end of each period. Significance was determined using Student's t tests.

Results

Worst headache pain in the last 2 weeks, as measured by a visual analog scale, was initially 6.4/10 cm (SD 2.1 cm), and declined by 2.1 cm during the diet period and by 0.7 cm during the supplement period (p=0.03). Average headache intensity (0–10 scale) was initially 4.2 (SD 1.4), and this declined by 1.0 during the diet period and by 0.5 during the supplement period (p=0.20). Average headache frequency was initially 2.3 (SD 1.8) headaches per week, and this declined by 0.3 during the diet period and by 0.4 during the supplement period (p=0.61). The Patient’s Global Impression of Change showed greater improvement in pain during the diet period (p<0.001).
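In a crossover design such as this one, each participant is observed under both treatments, so period effects are usually compared with a paired Student's t test on the within-subject differences. A minimal sketch with hypothetical scores (not the study's data):

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired (within-subject) t statistic for a crossover design:
    one observation per participant under each treatment period."""
    d = [xi - yi for xi, yi in zip(x, y)]
    n = len(d)
    t = mean(d) / (stdev(d) / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# Hypothetical pain-reduction scores (cm on a visual analog scale),
# one pair per participant: (diet period, supplement period).
diet = [2.5, 1.8, 3.0, 2.2, 1.5, 2.8]
supplement = [0.9, 0.6, 1.2, 0.8, 0.4, 0.5]
t, df = paired_t(diet, supplement)
print(f"t = {t:.2f} on {df} df")  # → t = 8.70 on 5 df
```

The resulting t statistic is referred to a t distribution with n - 1 degrees of freedom to obtain the p value.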

Conclusions

These results suggest that a nutritional approach may be a useful part of migraine treatment, but that methodologic issues necessitate further research.

Trial registration

ClinicalTrials.gov, NCT01699009 and NCT01547494.

5.

Introduction

Several single-center studies and meta-analyses have shown that perioperative goal-directed therapy may significantly improve outcomes in general surgical patients. We hypothesized that using a treatment algorithm based on pulse pressure variation, cardiac index trending by radial artery pulse contour analysis, and mean arterial pressure in a study group (SG), would result in reduced complications, reduced length of hospital stay and quicker return of bowel movement postoperatively in abdominal surgical patients, when compared to a control group (CG).

Methods

A total of 160 patients undergoing elective major abdominal surgery were randomized to the SG (79 patients) or to the CG (81 patients). In the SG, hemodynamic therapy was guided by pulse pressure variation, cardiac index trending, and mean arterial pressure. In the CG, hemodynamic therapy was performed at the discretion of the treating anesthesiologist. Outcome data were recorded up to 28 days postoperatively.

Results

The total number of complications was significantly lower in the SG (SG: 52 vs. CG: 72 complications, p = 0.038). In particular, infection complications were significantly reduced (SG: 13 vs. CG: 26 complications, p = 0.023). There were no significant differences between the two groups in return of bowel movement (SG: 3 vs. CG: 2 days postoperatively, p = 0.316), duration of post-anesthesia care unit stay (SG: 180 vs. CG: 180 minutes, p = 0.516), or length of hospital stay (SG: 11 vs. CG: 10 days, p = 0.929).

Conclusions

This multi-center study demonstrates that hemodynamic goal-directed therapy using pulse pressure variation, cardiac index trending and mean arterial pressure as the key parameters leads to a decrease in postoperative complications in patients undergoing major abdominal surgery.

Trial registration

ClinicalTrials.gov, NCT01401283.

6.

Introduction

Patients with distributive shock who require high dose vasopressors have a high mortality. Angiotensin II (ATII) may prove useful in patients who remain hypotensive despite catecholamine and vasopressin therapy. The appropriate dose of parenteral angiotensin II for shock is unknown.

Methods

In total, 20 patients with distributive shock and a cardiovascular Sequential Organ Failure Assessment score of 4 were randomized to either ATII infusion (n = 10) or placebo (n = 10) plus standard of care. ATII was started at a dose of 20 ng/kg/min and titrated to maintain a mean arterial pressure (MAP) of 65 mmHg. The infusion (either ATII or placebo) was continued for 6 hours and then titrated off. The primary endpoint was the effect of ATII on the standing dose of norepinephrine required to maintain a MAP of 65 mmHg.

Results

ATII resulted in a marked reduction in norepinephrine dosing in all patients. The mean hour-1 norepinephrine dose for the placebo cohort was 27.6 ± 29.3 mcg/min versus 7.4 ± 12.4 mcg/min for the ATII cohort (P = 0.06). The most common adverse event attributable to ATII was hypertension, which occurred in 20% of patients receiving ATII. Thirty-day mortality was similar in the ATII and placebo cohorts (50% versus 60%, P = 1.00).
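The abstract does not name the test behind P = 1.00, but that value is what a two-sided Fisher exact test returns for the underlying mortality counts (5/10 deaths on ATII versus 6/10 on placebo). A stdlib-only sketch:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact P value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)

    def p_table(x):
        # probability that cell (1,1) equals x, with margins fixed
        return comb(row1, x) * comb(n - row1, col1 - x) / denom

    p_obs = p_table(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Mortality counts from the cohorts above: ATII 5/10, placebo 6/10.
p = fisher_exact_2x2(5, 5, 6, 4)
print(f"P = {p:.2f}")  # → P = 1.00
```

A production analysis would use a validated implementation (e.g. scipy.stats.fisher_exact), which agrees with this direct enumeration for 2×2 tables.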

Conclusion

Angiotensin II is an effective rescue vasopressor agent in patients with distributive shock requiring multiple vasopressors. The initial dose range of ATII that appears to be appropriate for patients with distributive shock is 2 to 10 ng/kg/min.

Trial registration

ClinicalTrials.gov NCT01393782. Registered 12 July 2011.

7.

Background

Nuclear magnetic resonance (NMR) imaging and spectroscopy have been applied to assess skeletal muscle oxidative metabolism. Therefore, in-vivo NMR may enable the characterization of ischemia-reperfusion injury. The goal of this study was to evaluate whether NMR could detect the effects of ischemic preconditioning (IPC) in healthy subjects.

Methods

Twenty-three participants were included in two randomized crossover protocols in which the effects of IPC were measured by NMR and muscle force assessments. Leg ischemia was administered for 20 minutes, with or without a subsequent impaired reperfusion for 5 minutes (stenosis model). IPC was administered 4 or 48 hours prior to ischemia. Changes in phosphorus-31 (31P) NMR spectroscopy and blood oxygen level-dependent (BOLD) signals were recorded. 3-Tesla NMR data were compared to those obtained for isometric muscular strength.

Results

The phosphocreatine (PCr) signal decreased robustly during ischemia and recovered rapidly during reperfusion. In contrast to PCr, the recovery of muscular strength was slow. During post-ischemic stenosis, PCr increased only slightly. The BOLD signal intensity decreased during ischemia, ischemic exercise and post-ischemic stenosis but increased during hyperemic reperfusion. IPC 4 hours prior to ischemia significantly increased the maximal PCr reperfusion signal and mitigated the peak BOLD signal during reperfusion.

Conclusions

Ischemic preconditioning positively influenced muscle metabolism during reperfusion; this resulted in an increase in PCr production and higher oxygen consumption, thereby mitigating the peak BOLD signal. In addition, an impairment of energy replenishment during the low-flow reperfusion was detected in this model. Thus, functional NMR is capable of characterizing changes in reperfusion and in therapeutic interventions in vivo.

Trial Registration

ClinicalTrials.gov: NCT00883467.

8.

Introduction

Tailoring interventions to address identified barriers to change may be an effective strategy for implementing guidelines and improving practice. However, there are insufficient data to inform the optimal method or level of tailoring. Consequently, we conducted the PERFormance Enhancement of the Canadian nutrition guidelines by a Tailored Implementation Strategy (PERFECTIS) study to determine the feasibility of a multifaceted, interdisciplinary, tailored intervention aimed at improving adherence to critical care nutrition guidelines for the provision of enteral nutrition.

Methods

A before-after study was conducted in seven ICUs from five hospitals in North America. During a 3-month pre-implementation phase, each ICU completed a nutrition practice audit to identify guideline-practice gaps and a barriers assessment to identify obstacles to practice change. During a one-day meeting, the results of the audit and barriers assessment were reviewed and used to develop a site-specific tailored action plan. The tailored action plan was then implemented over a 12-month period that included bi-monthly progress meetings. Compliance with the tailored action plan was determined by the proportion of items in the action plan that were completely implemented. We examined the acceptability of the intervention through staff responses to an evaluation questionnaire. In addition, the nutrition practice audit and barriers survey were repeated at the end of the implementation phase to determine changes in barriers and nutrition practices.

Results

All five sites successfully completed all aspects of the study. However, their ability to fully implement their developed action plans varied, from 14% to 75% compliance. Nurses, on average, rated the study-related activities and resources as ‘somewhat useful’, and a third of respondents ‘agreed’ or ‘strongly agreed’ that their nutrition practice had changed as a result of the intervention. We observed a statistically significant 10% decrease in overall barriers score (site range -4.3% to -26.0%), and non-significant increases of 6% (site range -1.5% to 17.9%) and 4% (site range -8.3% to 18.2%) in the adequacy of total nutrition from calories and protein, respectively.

Conclusions

The multifaceted tailored intervention appears to be feasible but further refinement is warranted prior to testing the effectiveness of the approach on a larger scale.

Trial registration

ClinicalTrials.gov NCT01168128. Registered 21 July 2010.

9.

Introduction

Glucose measurement in intensive care medicine is performed intermittently with the risk of undetected hypoglycemia. The workload for the ICU nursing staff is substantial. Subcutaneous continuous glucose monitoring (CGM) systems are available and may be able to solve some of these issues in critically ill patients.

Methods

In a randomized controlled design in a mixed ICU in a teaching hospital, we compared the use of subcutaneous CGM with frequent point-of-care (POC) glucose measurements to guide insulin treatment. Adult critically ill patients with an expected stay of more than 24 hours and a need for insulin therapy were included. All patients received subcutaneous CGM. CGM data were blinded in the control group, whereas in the intervention group these data were used to feed a computerized glucose regulation algorithm. The same algorithm was used in the control group, fed by intermittent POC glucose measurements. Safety was assessed by the incidence of severe hypoglycemia (<2.2 mmol/L), and efficacy by the percentage of time in the target range (5.0 to 9.0 mmol/L). In addition, we assessed nursing workload and costs.

Results

In this study, 87 patients were randomized to the intervention group and 90 to the control group. After exclusions for CGM device failure, 78 patients in each group remained for analysis. The incidence of severe hypoglycemia and the percentage of time within the target range were similar in both groups. A significant reduction in daily nursing workload for glucose control was found in the intervention group (17 versus 36 minutes; P < 0.001). Mean daily costs per patient were significantly reduced, by EUR 12 (95% CI −32 to −18, P = 0.02), in the intervention group.

Conclusions

Subcutaneous CGM to guide insulin treatment in critically ill patients is as safe and effective as intermittent point-of-care measurements and reduces nursing workload and daily costs. A new algorithm designed for frequent measurements may lead to improved performance and should precede clinical implementation.

Trial registration

ClinicalTrials.gov, NCT01526044. Registered 1 February 2012.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0453-9) contains supplementary material, which is available to authorized users.

10.

Introduction

Critical illness polyneuropathy and/or myopathy (CIPNM) is a severe complication of critical illness. Retrospective data suggest that early application of IgM-enriched intravenous immunoglobulin (IVIG) may prevent or mitigate CIPNM. Therefore, the primary objective was to assess the effect of early IgM-enriched IVIG versus placebo to mitigate CIPNM in a prospective setting.

Methods

In this prospective, randomized, double-blinded, placebo-controlled trial, 38 critically ill patients with multiple organ failure (MOF), systemic inflammatory response syndrome (SIRS)/sepsis, and early clinical signs of CIPNM were included. Patients were randomly assigned to be treated with either IgM-enriched IVIG or placebo over a period of three days. CIPNM was measured by the CIPNM severity sum score, based on electrophysiological stimulation of the median, ulnar, and tibial nerves on days 0, 4, 7, and 14 and on histological evaluation of muscle biopsies on days 0 and 14; the score ranged from 0 (no CIPNM) to 8 (very severe CIPNM).

Results

A total of 38 critically ill patients were included and randomized to receive either IgM-enriched IVIG (n = 19) or placebo (n = 19). Baseline characteristics were similar between the two groups. CIPNM could not be improved by IVIG treatment, represented by similar CIPNM severity sum scores on day 14 (IVIG vs. placebo: 4.8 ± 2.0 vs. 4.5 ± 1.8; P = 0.70). CIPNM severity sum score significantly increased from baseline to day 14 (3.5 ± 1.6 vs. 4.6 ± 1.9; P = 0.002). After an interim analysis the study was terminated early due to futility in reaching the primary endpoint.

Conclusions

Early treatment with IVIG did not mitigate CIPNM in critically ill patients with MOF and SIRS/sepsis.

Trial registration

ClinicalTrials.gov: NCT01867645.

11.

Introduction

Among critically ill patients with acute kidney injury (AKI) needing continuous renal replacement therapy (CRRT), the effect of convective (via continuous venovenous hemofiltration [CVVH]) versus diffusive (via continuous venovenous hemodialysis [CVVHD]) solute clearance on clinical outcomes is unclear. Our objective was to evaluate the feasibility of comparing these two modes in a randomized trial.

Methods

This was a multicenter open-label parallel-group pilot randomized trial of CVVH versus CVVHD. Using concealed allocation, we randomized critically ill adults with AKI and hemodynamic instability to CVVH or CVVHD, with a prescribed small solute clearance of 35 mL/kg/hour in both arms. The primary outcome was trial feasibility, defined by randomization of >25% of eligible patients, delivery of >75% of the prescribed CRRT dose, and follow-up of >95% of patients to 60 days. A secondary analysis using a mixed-effects model examined the impact of therapy on illness severity, defined by sequential organ failure assessment (SOFA) score, over the first week.

Results

We randomized 78 patients (mean age 61.5 years; 39% women; 23% with chronic kidney disease; 82% with sepsis). Baseline SOFA scores (mean 15.9, SD 3.2) were similar between groups. We recruited 55% of eligible patients, delivered >80% of the prescribed dose in each arm, and achieved 100% follow-up. SOFA scores tended to decline more over the first week in CVVH recipients (-0.8, 95% CI -2.1 to +0.5), driven by a reduction in vasopressor requirements. Mortality (54% CVVH; 55% CVVHD) and dialysis dependence among survivors (24% CVVH; 19% CVVHD) at 60 days were similar.

Conclusions

Our results suggest that a large trial comparing CVVH with CVVHD would be feasible. There was a trend toward reduced vasopressor requirements among CVVH-treated patients over the first week of treatment.

Trial Registration

ClinicalTrials.gov: NCT00675818.

12.

Introduction

The aim of the study was to assess whether adults admitted to hospitals with both Intensive Care Units (ICU) and Intermediate Care Units (IMCU) have lower in-hospital mortality than those admitted to ICUs without an IMCU.

Methods

An observational multinational cohort study performed on patients admitted to participating ICUs during a four-week period. IMCU was defined as any physically and administratively independent unit open 24 hours a day, seven days a week providing a level of care lower than an ICU but higher than a ward. Characteristics of hospitals, ICUs and patients admitted to study ICUs were recorded. The main outcome was all-cause in-hospital mortality until hospital discharge (censored at 90 days).

Results

One hundred and sixty-seven ICUs from 17 European countries enrolled 5,834 patients. Overall, 1,113 (19.1%) patients died in the ICU, and a total of 1,397 (23.9%) died in hospital. Illness severity was higher for patients in ICUs with an IMCU (median Simplified Acute Physiology Score (SAPS) II: 37) than for patients in ICUs without an IMCU (median SAPS II: 29, P < 0.001). After adjustment for patient characteristics at admission, such as illness severity, and for ICU and hospital characteristics, the odds ratio of mortality was 0.63 (95% CI 0.45 to 0.88, P = 0.007) in favour of the presence of an IMCU. The protective effect of the IMCU was absent in patients who were admitted for basic observation, for example, after surgery (odds ratio 1.15, 95% CI 0.65 to 2.03, P = 0.630), but was strong in patients admitted to an ICU for other reasons (odds ratio 0.54, 95% CI 0.37 to 0.80, P = 0.002).

Conclusions

The presence of an IMCU in the hospital is associated with significantly reduced adjusted hospital mortality for adults admitted to the ICU. This effect applies to patients requiring full intensive treatment.

Trial registration

ClinicalTrials.gov NCT01422070. Registered 19 August 2011.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0551-8) contains supplementary material, which is available to authorized users.

13.

Introduction

Recombinant human erythropoietin (EPO) is known to provide organ protection against ischemia-reperfusion injury through its pleiotropic properties. The aim of this single-site, randomized, placebo-controlled, double-blind study was to investigate the effect of pre-emptive EPO administration on the incidence of postoperative acute kidney injury (AKI) in patients with risk factors for AKI undergoing complex valvular heart surgery.

Methods

We studied ninety-eight patients with preoperative risk factors for AKI. The patients were randomly allocated to either the EPO group (n = 49) or the control group (n = 49). The EPO group received 300 IU/kg of EPO intravenously after anesthetic induction. The control group received an equivalent volume of normal saline. AKI was defined as an increase in serum creatinine >0.3 mg/dl or >50% from baseline. Biomarkers of renal injury were serially measured until five days postoperatively.

Results

Patient characteristics and operative data, including the duration of cardiopulmonary bypass, were similar between the two groups. Incidence of postoperative AKI (32.7% versus 34.7%, P = 0.831) and biomarkers of renal injury including cystatin C and neutrophil gelatinase-associated lipocalin showed no significant differences between the groups. The postoperative increase in interleukin-6 and myeloperoxidase was similar between the groups. None of the patients developed adverse complications related to EPO administration, including thromboembolic events, throughout the study period.

Conclusions

Intravenous administration of 300 IU/kg of EPO did not provide renal protection in patients who are at increased risk of developing AKI after undergoing complex valvular heart surgery.

Trial registration

ClinicalTrials.gov, NCT01758861.

14.

Introduction

Few data exist about patients with cardiogenic shock (CS) who survive the early phase of acute myocardial infarction (AMI). The aim of this study was to assess long-term (5-year) mortality among early survivors of AMI according to the presence of CS at the acute stage.

Methods

We analyzed 5-year follow-up data from the French registry of Acute ST-elevation and non-ST-elevation Myocardial Infarction (FAST-MI) 2005 registry, a nationwide French survey including consecutive patients admitted for ST or non-ST-elevation AMI at the end of 2005 in 223 institutions.

Results

Of 3670 patients enrolled, shock occurred in 224 (6.1%), and 3411 survived beyond 30 days or to hospital discharge, including 99 (2.9%) with shock. Early survivors with CS had a more severe clinical profile and more frequent concomitant in-hospital complications, and were less often managed invasively than those without CS. Five-year survival was 59% in patients with shock versus 76% in those without (adjusted hazard ratio (HR) = 1.72 [1.24 to 2.38], P = 0.001). The excess of deaths associated with CS, however, was observed only during the first year (one-year survival: 77% vs 93%; adjusted HR: 2.87 [1.85 to 4.46], P < 0.001), while survival from one to 5 years was similar (76% vs 82%; adjusted HR: 1.06 [0.64 to 1.74]). Propensity score-matched analyses yielded similar results.

Conclusions

In patients surviving the early phase of AMI, CS at the initial stage carries an increased risk of death up to one year after the acute event. Beyond one year, however, mortality is similar to that of patients without shock.

Trial registration

ClinicalTrials.gov, NCT00673036. Registered May 5, 2008.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0516-y) contains supplementary material, which is available to authorized users.

15.

Background

Risk scores for cardiovascular disease (CVD) are in common use to integrate multiple cardiovascular risk factors in order to identify individuals at greatest risk for disease. The purpose of this study was to determine if individuals at greater cardiovascular risk have T1 mapping indices by cardiovascular magnetic resonance (CMR) indicative of greater myocardial fibrosis.

Methods

CVD risk scores for 1208 subjects (50.8% men) aged 55-94 years were evaluated in the Multi-Ethnic Study of Atherosclerosis (MESA) at six centers. T1 times were determined at 1.5 Tesla before and after gadolinium administration (0.15 mmol/kg) using a modified Look-Locker pulse sequence. The relationships between CMR measures (native T1, 12- and 25-minute post-gadolinium T1, partition coefficient, and extracellular volume fraction [ECV]) and 14 different established cardiovascular risk scores were determined using regression analysis. Bootstrapping analysis with analysis of variance was used to compare the different CMR measures. CVD risk scores were significantly different for men and women (p < 0.001).

Results

In men, the 25-minute post-gadolinium T1 time showed statistically significant associations with more risk scores (10/14 scores, 71%) than the other CMR indices (e.g., native T1: 7/14 scores, 50%; partition coefficient: 7/14, 50%). In women, risk scores, particularly the new 2013 AHA/ASCVD risk score, did not correlate with any CMR fibrosis index.

Conclusions

Men with greater CVD risk had greater CMR indices of myocardial fibrosis. T1 times at greater delay time (25 minutes) showed better agreement with commonly used risk score indices compared to ECV and native T1 time.

Clinical trial registration

http://www.mesa-nhlbi.org/, NCT00005487.

16.

Introduction

We sought to investigate whether the use of balanced solutions reduces the incidence of hyperchloraemic acidosis without increasing the risk for intracranial hypertension in patients with severe brain injury.

Methods

We conducted a single-centre, two-arm, randomised, double-blind, pilot controlled trial in Nantes, France. Patients with severe traumatic brain injury (Glasgow Coma Scale score ≤8) or subarachnoid haemorrhage (World Federation of Neurosurgical Societies grade III or higher) who were mechanically ventilated were randomised within the first 12 hours after brain injury to receive either isotonic balanced solutions (crystalloid and hydroxyethyl starch; balanced group) or isotonic sodium chloride solutions (crystalloid and hydroxyethyl starch; saline group) for 48 hours. The primary endpoint was the occurrence of hyperchloraemic metabolic acidosis within 48 hours.

Results

Forty-two patients were included, of whom one patient in each group was excluded (one consent withdrawal and one use of a forbidden therapy). Nineteen patients (95%) in the saline group and thirteen (65%) in the balanced group presented with hyperchloraemic acidosis within the first 48 hours (hazard ratio = 0.28, 95% confidence interval [CI] = 0.11 to 0.70; P = 0.006). In the saline group, pH (P = 0.004) and strong ion deficit (P = 0.047) were lower, and chloraemia was higher (P = 0.002), than in the balanced group. Intracranial pressure did not differ between the study groups (mean difference 4 mmHg [-1 to 8]; P = 0.088). Seven patients (35%) in the saline group and eight (40%) in the balanced group developed intracranial hypertension (P = 0.744). Three patients (14%) in the saline group and five (25%) in the balanced group died (P = 0.387).

Conclusions

This study provides evidence that balanced solutions reduce the incidence of hyperchloraemic acidosis in brain-injured patients compared with saline solutions. Although the study was not sufficiently powered for this endpoint, intracranial pressure did not appear to differ between groups.

Trial registration

EudraCT 2008-004153-15 and ClinicalTrials.gov NCT00847977. The work in this trial was performed at Nantes University Hospital in Nantes, France.

17.

Citation

Casaer MP, Mesotten D, Hermans G, et al. Early versus late parenteral nutrition in critically ill adults. N Engl J Med. 2011;365:506-517.

Background

Controversy exists about the timing of the initiation of parenteral nutrition (PN) in critically ill adults in whom caloric targets cannot be met by enteral nutrition (EN) alone.

Methods

Objective

To compare early-initiation of PN (European guidelines) with late-initiation (American and Canadian guidelines) in adults who are receiving insufficient enteral nutrition in the intensive care unit (ICU).

Design

Prospective, randomized, controlled, parallel-group, multicenter clinical trial.

Setting

Seven multidisciplinary ICUs in Belgium.

Subjects

All adults admitted to participating ICUs with a nutritional risk score of 3 or more who did not meet any exclusion criteria.

Intervention

After enrollment, 2312 patients were randomized to receive PN within 48 hours after ICU admission (early-initiation group) and 2328 patients were randomized to receive PN on day 8 (late-initiation group). Both groups received early EN using a standardized protocol. PN was continued until EN met 80% of calorific goals or until oral nutrition was resumed, and was restarted if enteral or oral feeding fell below 50% of calculated calorific needs.

Outcomes

The primary end point was the duration of dependency on intensive care, defined as the number of intensive care days and the time to discharge from the ICU.

Results

The median stay in the ICU was one day shorter for the late-initiation group (3 v. 4 days; p = 0.02). The late-initiation group had a relative increase of 6.3% in the likelihood of being discharged alive and earlier from the ICU (hazard ratio 1.06; 95% confidence interval [CI] 1.00-1.13; p = 0.04). Rates of death in the ICU and survival at 90 days were similar between the two groups. The late-initiation group, as compared with the early-initiation group, had fewer ICU infections (22.8% v. 26.2%; p = 0.008), fewer days of renal replacement therapy (7 days [interquartile range (IQR) 3-16] v. 10 days [IQR 5-23]; p = 0.008), and fewer patients requiring more than 2 days of mechanical ventilation (36.3% v. 40.2%; p = 0.006).

Conclusions

Late initiation of PN was associated with faster recovery and fewer complications than early initiation.

Trial Registration

NCT00512122 (ClinicalTrials.gov).

18.

OBJECTIVE

To evaluate the cost-effectiveness of Anticipatory and Preventive Team Care (APTCare).

DESIGN

Analysis of data drawn from a randomized controlled trial.

SETTING

A family health network in a rural area near Ottawa, Ont.

PARTICIPANTS

Patients 50 years of age or older at risk of experiencing adverse health outcomes. Analysis of cost-effectiveness was performed for a subsample of participants with at least 1 of the chronic diseases used in the quality of care (QOC) measure (74 intervention and 78 control patients).

INTERVENTIONS

At-risk patients were randomly assigned to receive usual care from their family physicians or APTCare from a collaborative team.

MAIN OUTCOME MEASURES

Cost-effectiveness and the net benefit to society of the APTCare intervention.

RESULTS

Costs not directly associated with delivery of the intervention were similar in the 2 arms: $9121 for the APTCare arm and $9222 for the control arm. Costs directly associated with the program added $3802 per patient, for total costs per patient of $12 923 and $9222, respectively (P = .033). A 1% improvement in QOC was estimated to cost $407 per patient. Analysis of the net benefit to society in absolute dollars found a break-even threshold of $750 when statistical significance was required; that is, society must place a value of at least $750 on a 1% improvement in QOC for the intervention to be socially worthwhile. By any of the metrics used, the APTCare intervention was not cost-effective, at least not in a population whose baseline QOC was high.

CONCLUSION

Although our calculations suggest that the APTCare intervention was not cost-effective, our results come with the following caveats. The costs of a newly introduced intervention are bound to be higher than those of an established, up-and-running program. Furthermore, some benefits of the secondary preventive measures might not have been captured within this limited 12- to 18-month study, or were simply not measured.

TRIAL REGISTRATION NUMBER

NCT00238836 (CONSORT).

19.

Objective

To investigate the effectiveness of patient self-management (PSM) of anticoagulation using warfarin in a typical primary care site in Canada and to determine the feasibility of conducting a future large-scale trial in this setting.

Design

An 8-month pragmatic open-label randomized crossover trial.

Setting

A typical Canadian primary care practice in British Columbia.

Intervention

Patients were randomized to PSM or physician management for 4 months, after which allocation was reversed. Members of the PSM group were instructed to have their international normalized ratio (INR) measured at community laboratories and to adjust their warfarin doses independently using the nomograms provided. Education on warfarin dose adjustment was limited to a single 15-minute office visit.

Main outcome measures

The primary outcome was the proportion of INR values in the therapeutic range among participants. Feasibility outcomes included proportion of eligible patients consenting, patients’ preference of management strategy, patients’ satisfaction, and visits or phone communication with physicians regarding dose adjustment. Safety outcomes included bleeding or thromboembolic events.

Results

Eleven patients completed the trial, contributing 99 patient-months of monitoring and 122 INR measurements. The mean proportion of INR values in the therapeutic range was 82% in the PSM group and 80% in the physician-management group (P = .82). The improvement in patient satisfaction with PSM was not significant. Ten of the 11 patients preferred PSM to physician management and elected to continue with this strategy after study completion (P = .001). No calls or visits were made to the physician regarding dose adjustment during the PSM period. There were no episodes of major bleeding and no thromboembolic events.

Conclusion

Patient self-management was not demonstrated to be superior to standard care, but was easily implemented and was the method preferred by patients. Our feasibility outcomes justify a larger trial and suggest that subject recruitment and protocol adherence would not pose barriers for such a study.

Trial registration number

NCT00925028 (ClinicalTrials.gov).

20.

Background

The insertion of urinary catheters during urologic surgical interventions may lead to catheter-related bladder discomfort (CRBD) in the postoperative period.

Objective

We aimed to evaluate the effect of single-dose intravenous paracetamol on CRBD.

Methods

In this randomized, controlled, double-blind study, 64 patients (age >18 years, American Society of Anesthesiologists Physical Status I–II) requiring urinary bladder catheterization for percutaneous nephrolithotomy were assigned to receive either intravenous paracetamol 15 mg/kg (group P) or NaCl 0.9% solution (control group [group C]) 30 minutes before the end of surgery. Patients received patient-controlled analgesia postoperatively (10-mg meperidine bolus, no background infusion, 20-minute lockout). CRBD and pain status were assessed at 30 minutes and at 1, 2, 4, 6, and 12 hours postoperatively. Postoperative meperidine requirement and patient and surgeon satisfaction were also assessed.

Results

Compared with group C, group P had significantly lower CRBD scores at all time points except 12 hours postoperatively (P < 0.05). Total meperidine consumption was significantly higher in group C (P < 0.05). Patient and surgeon satisfaction scores were significantly higher in group P (P < 0.05).

Conclusions

Intraoperative single-dose paracetamol was effective in reducing the severity of CRBD and pain in urologic surgery. We suggest that it may be an efficient, reliable, easy-to-administer drug for CRBD.

Trial Registration

NCT01652183 (ClinicalTrials.gov).

Key words: catheter-related bladder discomfort, intravenous paracetamol, urologic surgery
