Similar Documents

20 similar documents found (search time: 46 ms)
1.

Introduction

Acute heart failure (AHF) is characterized by inadequate cardiac output (CO), congestive symptoms, poor peripheral perfusion and end-organ dysfunction. Treatment often includes a combination of diuretics, oxygen, positive pressure ventilation, inotropes and vasodilators or vasopressors. Lactate is a marker of illness severity but is also an important metabolic substrate for the myocardium at rest and during stress. We tested the effects of half-molar sodium lactate infusion on cardiac performance in AHF.

Methods

We conducted a prospective, randomised, controlled, open-label, pilot clinical trial in 40 patients fulfilling two of the following three criteria for AHF: (1) left ventricular ejection fraction <40%, (2) acute pulmonary oedema or respiratory failure of predominantly cardiac origin requiring mechanical ventilation and (3) currently receiving vasopressor and/or inotropic support. Patients in the intervention group received a 3 ml/kg bolus of half-molar sodium lactate over the course of 15 minutes followed by 1 ml/kg/h continuous infusion for 24 hours. The control group received only a 3 ml/kg bolus of Hartmann’s solution without continuous infusion. The primary outcome was CO assessed by transthoracic echocardiography 24 hours after randomisation. Secondary outcomes included a measure of right ventricular systolic function (tricuspid annular plane systolic excursion (TAPSE)), acid-base balance, electrolyte and organ function parameters, along with length of stay and mortality.

Results

The infusion of half-molar sodium lactate increased (mean ± SD) CO from 4.05 ± 1.37 L/min to 5.49 ± 1.9 L/min (P < 0.01) and TAPSE from 14.7 ± 5.5 mm to 18.3 ± 7 mm (P = 0.02). Plasma sodium and pH increased (136 ± 4 to 146 ± 6 and 7.40 ± 0.06 to 7.53 ± 0.03, respectively; both P < 0.01), but potassium, chloride and phosphate levels decreased. There were no significant differences in the need for vasoactive therapy, respiratory support, renal or liver function tests, duration of ICU and hospital stay or 28- and 90-day mortality.

Conclusions

Infusion of half-molar sodium lactate improved cardiac performance and led to metabolic alkalosis in AHF patients without any detrimental effects on organ function.

Trial registration

Clinicaltrials.gov NCT01981655. Registered 13 August 2013.

2.

Introduction

Severe acute pancreatitis remains a potentially life-threatening disease with high mortality. The aim of this prospective animal study was to evaluate the therapeutic effect of thoracic epidural anaesthesia (TEA) on survival, microcirculation, tissue oxygenation and histopathologic damage in an experimental model of severe acute pancreatitis.

Methods

In this study, 34 pigs were randomly assigned to two treatment groups. After severe acute pancreatitis was induced by intraductal injection of glycodeoxycholic acid, Group 1 (n = 17) received bupivacaine (0.5%; 2 ml bolus, then continuous infusion at 4 ml/h) via TEA, whereas Group 2 (n = 17) received no TEA. During the 6 hours after induction, pancreatic tissue oxygen tension (tpO2) and pancreatic microcirculation were assessed. The animals were then observed for 7 days, followed by sacrifice and histopathologic examination.

Results

Survival at 7 days was 82% in Group 1 (TEA) versus 29% in Group 2 (control) (P < 0.05). Group 1 (TEA) also showed significantly superior microcirculation (1,608 ± 374 AU versus 1,121 ± 510 AU; P < 0.05) and tissue oxygenation (215 ± 64 mmHg versus 138 ± 90 mmHg; P < 0.05) compared with Group 2 (control). Correspondingly, tissue damage in Group 1 was reduced on histopathologic scoring (5.5 (3 to 8) versus 8 (5.5 to 10); P < 0.05).

Conclusions

TEA led to improved survival, enhanced microcirculatory perfusion and tissue oxygenation, and less histopathologic tissue damage in an experimental animal model of severe acute pancreatitis.

3.

Introduction

Patients with distributive shock who require high dose vasopressors have a high mortality. Angiotensin II (ATII) may prove useful in patients who remain hypotensive despite catecholamine and vasopressin therapy. The appropriate dose of parenteral angiotensin II for shock is unknown.

Methods

In total, 20 patients with distributive shock and a cardiovascular Sequential Organ Failure Assessment score of 4 were randomized to either ATII infusion (N =10) or placebo (N =10) plus standard of care. ATII was started at a dose of 20 ng/kg/min, and titrated for a goal of maintaining a mean arterial pressure (MAP) of 65 mmHg. The infusion (either ATII or placebo) was continued for 6 hours then titrated off. The primary endpoint was the effect of ATII on the standing dose of norepinephrine required to maintain a MAP of 65 mmHg.

Results

ATII resulted in a marked reduction in norepinephrine dosing in all patients. The mean hour-1 norepinephrine dose was 27.6 ± 29.3 mcg/min in the placebo cohort versus 7.4 ± 12.4 mcg/min in the ATII cohort (P = 0.06). The most common adverse event attributable to ATII was hypertension, which occurred in 20% of patients receiving ATII. Thirty-day mortality was similar in the ATII and placebo cohorts (50% versus 60%, P = 1.00).

Conclusion

Angiotensin II is an effective rescue vasopressor agent in patients with distributive shock requiring multiple vasopressors. The initial dose range of ATII that appears to be appropriate for patients with distributive shock is 2 to 10 ng/kg/min.

Trial registration

Clinicaltrials.gov NCT01393782. Registered 12 July 2011.

4.

Introduction

Low plasma glutamine levels are associated with worse clinical outcome. Intravenous glutamine infusion dose-dependently increases plasma glutamine levels, thereby correcting hypoglutaminemia. Glutamine may be transformed to glutamate, which might limit its application at higher doses in patients with severe traumatic brain injury (TBI). To date, the optimal glutamine dose required to normalize plasma glutamine levels without increasing plasma and cerebral glutamate has not been defined.

Methods

Changes in plasma and cerebral glutamine, alanine, and glutamate, as well as indirect signs of metabolic impairment reflected by increased intracranial pressure (ICP), lactate, lactate-to-pyruvate ratio, and electroencephalogram (EEG) activity, were determined before, during, and after continuous intravenous infusion of 0.75 g/kg per day of L-alanine-L-glutamine, given either for 24 hours (group 1, n = 6) or for 5 days (group 2, n = 6) in addition to regular enteral nutrition. Laboratory values including nitrogen balance, urea and ammonia were determined daily.

Results

Continuous L-alanine-L-glutamine infusion significantly increased plasma and cerebral glutamine as well as alanine levels, and these increases were mostly sustained during the 5-day infusion phase (plasma glutamine: from 295 ± 62 to 500 ± 145 μmol/l; brain glutamine: from 183 ± 188 to 549 ± 120 μmol/l; plasma alanine: from 327 ± 91 to 622 ± 182 μmol/l; brain alanine: from 48 ± 55 to 89 ± 129 μmol/l; p < 0.05, ANOVA, post hoc Dunn’s test). Plasma glutamate remained unchanged and cerebral glutamate was decreased, without any signs of cerebral impairment. Urea and ammonia increased significantly but remained within normal limits, without signs of organ dysfunction (urea: from 2.7 ± 1.6 to 5.5 ± 1.5 mmol/l; ammonia: from 12 ± 6.3 to 26 ± 8.3 μmol/l; p < 0.05, ANOVA, post hoc Dunn’s test).

Conclusions

High-dose L-alanine-L-glutamine infusion (0.75 g/kg/day for up to 5 days) increased plasma and brain glutamine and alanine levels. This was not associated with elevated glutamate or signs of potential glutamate-mediated cerebral injury. The increased nitrogen load should be considered in patients with renal and hepatic dysfunction.

Trial registration

Clinicaltrials.gov NCT02130674. Registered 5 April 2014.

5.

Introduction

Acute renal failure (ARF) requiring renal replacement therapy (RRT) occurs frequently in ICU patients and significantly affects mortality rates. To date, few large clinical trials have investigated the impact of RRT modality on patient outcomes. Here we investigated the effect of two major RRT strategies (intermittent hemodialysis (IHD) and continuous veno-venous hemofiltration (CVVH)) on mortality and renal-related outcome measures.

Methods

This single-center prospective randomized controlled trial (“CONVINT”) included 252 critically ill patients (159 male; mean age, 61.5 ± 13.9 years; Acute Physiology and Chronic Health Evaluation (APACHE) II score, 28.6 ± 8.8) with dialysis-dependent ARF treated in the ICUs of a tertiary care academic center. Patients were randomized to receive either daily IHD or CVVH. The primary outcome measure was survival at 14 days after the end of RRT. Secondary outcome measures included 30-day-, intensive care unit-, and intrahospital mortality, as well as course of disease severity/biomarkers and need for organ-support therapy.

Results

At baseline, no differences in disease severity, distributions of age and gender, or suspected reasons for acute renal failure were observed. Survival rates at 14 days after RRT were 39.5% (IHD) versus 43.9% (CVVH) (odds ratio (OR), 0.84; 95% confidence interval (CI), 0.49 to 1.41; P = 0.50). Fourteen-day, 30-day, and intrahospital all-cause mortality rates did not differ between the two groups (all P > 0.5). No differences were observed in days on RRT, vasopressor days, days on ventilator, or ICU/intrahospital length of stay.

Conclusions

In a monocentric RCT, we observed no statistically significant differences between the investigated treatment modalities regarding mortality, renal-related outcome measures, or survival at 14 days after RRT. Our findings add to mounting data demonstrating that intermittent and continuous RRTs may be considered equivalent approaches for critically ill patients with dialysis-dependent acute renal failure.

Trial registration

Clinicaltrials.gov NCT01228123

6.

Introduction

Long-term ventilated intensive care patients frequently require tracheostomy. Although overall risks are low, serious immediate and late complications still arise. Real-time ultrasound guidance has been proposed to decrease complications and improve the accuracy of the tracheal puncture. We aimed to compare the procedural safety and efficacy of real-time ultrasound guidance with the traditional landmark approach during percutaneous dilatational tracheostomy (PDT).

Methods

A total of 50 patients undergoing PDT for clinical indications were randomly assigned, after obtaining informed consent, to have the tracheal puncture procedure carried out using either traditional anatomical landmarks or real-time ultrasound guidance. Puncture position was recorded via bronchoscopy. Blinded assessors determined in a standardised fashion the deviation of the puncture off midline and whether appropriate longitudinal position between the first and fourth tracheal rings was achieved. Procedural safety and efficacy data, including complications and number of puncture attempts required, were collected.

Results

In total, 47 data sets were evaluable. Real-time ultrasound guidance resulted in significantly more accurate tracheal puncture. Mean deviation from midline was 15 ± 3° versus 35 ± 5° (P = 0.001). The proportion of appropriate punctures, defined a priori as 0 ± 30° from midline, was significantly higher: 20 (87%) of 23 versus 12 (50%) of 24 (RR = 1.74; 95% CI = 1.13 to 2.67; P = 0.006). First-pass success rate was 20 (87%) of 23 in the ultrasound group and 14 (58%) of 24 in the landmark group (RR = 1.49; 95% CI = 1.03 to 2.17; P = 0.028). The observed decrease in procedural complications was not statistically significant: 5 (22%) of 23 in the ultrasound group versus 9 (37%) of 24 in the landmark group (RR = 0.58; 95% CI = 0.23 to 1.47; P = 0.24).
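The risk ratios and confidence intervals reported above can be reproduced from the raw counts. A minimal sketch of the standard log-normal (Katz) interval, checked against the reported 20/23 versus 12/24 appropriate punctures:

```python
import math

def risk_ratio_ci(a, n1, c, n2, z=1.96):
    """Risk ratio of event proportions a/n1 vs c/n2 with a
    log-normal 95% confidence interval (Katz method)."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Appropriate punctures: 20/23 (ultrasound) vs 12/24 (landmark)
rr, lo, hi = risk_ratio_ci(20, 23, 12, 24)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # 1.74 1.13 2.67
```

The same call with 20/23 versus 14/24 reproduces the first-pass RR of 1.49.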

Conclusions

Ultrasound guidance significantly improved the rate of first-pass puncture and puncture accuracy. Fewer procedural complications were observed; however, this did not reach statistical significance. These results support wider general use of real-time ultrasound guidance as an additional tool to improve PDT.

Trial registration

Australian New Zealand Clinical Trials Registry ID: ACTRN12611000237987 (registered 4 March 2011)

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0514-0) contains supplementary material, which is available to authorized users.

7.

Introduction

The objective of this study was to describe the pharmacokinetics of vancomycin in ICU patients and to examine whether contemporary antibiotic dosing results in concentrations that have been associated with favourable response.

Methods

The Defining Antibiotic Levels in Intensive Care (DALI) study was a prospective, multicentre pharmacokinetic point-prevalence study. Antibiotic dosing was as per the treating clinician either by intermittent bolus or continuous infusion. Target trough concentration was defined as ≥15 mg/L and target pharmacodynamic index was defined as an area under the concentration-time curve over a 24-hour period divided by the minimum inhibitory concentration of the suspected bacteria (AUC0–24/MIC ratio) >400 (assuming MIC ≤1 mg/L).
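The two targets defined above can be checked mechanically. A hedged sketch: it assumes the usual steady-state approximation AUC0–24 ≈ daily dose / clearance (not necessarily the estimation method used in DALI), and the patient values below are hypothetical, not study data:

```python
def vanco_targets(daily_dose_mg, clearance_l_per_h, cmin_mg_l, mic_mg_l=1.0):
    """Check the two DALI-style targets: trough >= 15 mg/L and
    AUC0-24/MIC > 400. AUC0-24 (mg*h/L) is approximated at steady
    state as daily dose / clearance (one-compartment assumption)."""
    auc_0_24 = daily_dose_mg / clearance_l_per_h
    trough_ok = cmin_mg_l >= 15
    auc_ok = (auc_0_24 / mic_mg_l) > 400
    return trough_ok, auc_ok

# Hypothetical patient: 2,000 mg/day, CL = 4 L/h, trough 18 mg/L
print(vanco_targets(2000, 4, 18))  # (True, True)
```

Note that with augmented renal clearance (say CL = 8 L/h) the same daily dose yields AUC0–24 = 250 mg·h/L and the AUC/MIC target is missed.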

Results

Data from 42 patients in 26 ICUs were eligible for analysis. A total of 24 patients (57%) received vancomycin by continuous infusion. The daily vancomycin dose was 27 mg/kg (interquartile range (IQR) 18 to 32) and did not differ between patients receiving intermittent and continuous infusion. Trough concentrations were highly variable (median 27, IQR 8 to 23 mg/L). Target trough concentrations were achieved in 57% of patients, but more frequently in patients receiving continuous infusion (71% versus 39%; P = 0.038). The target AUC0–24/MIC ratio was also reached more frequently in patients receiving continuous infusion (88% versus 50%; P = 0.008). Multivariable logistic regression analysis with adjustment by the propensity score could not confirm continuous infusion as an independent predictor of an AUC0–24/MIC >400 (odds ratio (OR) 1.65, 95% confidence interval (CI) 0.2 to 12.0) or a Cmin ≥15 mg/L (OR 1.8, 95% CI 0.4 to 8.5).

Conclusions

This study demonstrated large interindividual variability in vancomycin pharmacokinetics and in pharmacodynamic target attainment in ICU patients. These data suggest that current vancomycin dosing recommendations for critically ill patients need to be re-evaluated so that sufficient vancomycin exposure is achieved more rapidly and consistently.

8.

Background

Adenosine cardiovascular magnetic resonance (CMR) can accurately quantify myocardial perfusion reserve. While regadenoson is increasingly employed due to ease of use, imaging protocols have not been standardized. We sought to determine the optimal regadenoson CMR protocol for quantifying myocardial perfusion reserve index (MPRi) – more specifically, whether regadenoson stress imaging should be performed before or after rest imaging.

Methods

Twenty healthy subjects underwent CMR perfusion imaging during resting conditions, during regadenoson-induced hyperemia (0.4 mg), and after 15 min of recovery. In 10/20 subjects, recovery was facilitated with aminophylline (125 mg). Myocardial time-intensity curves were used to obtain left ventricular cavity-normalized myocardial up-slopes. MPRi was calculated in two different ways: as the up-slope ratio of stress to rest (MPRi-rest), and the up-slope ratio of stress to recovery (MPRi-recov).
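The up-slope-ratio calculation described above can be sketched as follows. The sliding-window slope estimator and the window length are illustrative assumptions, not the study's actual post-processing pipeline:

```python
import numpy as np

def upslope(t, intensity, window=5):
    """Steepest up-slope of a time-intensity curve, taken as the
    maximum linear-fit slope over any `window` consecutive frames."""
    slopes = [np.polyfit(t[i:i + window], intensity[i:i + window], 1)[0]
              for i in range(len(t) - window + 1)]
    return max(slopes)

def mpri(t, myo_stress, lv_stress, myo_rest, lv_rest):
    """Myocardial perfusion reserve index: the ratio of LV
    cavity-normalized myocardial up-slopes, stress over rest."""
    stress = upslope(t, myo_stress) / upslope(t, lv_stress)
    rest = upslope(t, myo_rest) / upslope(t, lv_rest)
    return stress / rest

# Ideal linear curves: stress normalized slope 0.2, rest 0.1 -> MPRi 2.0
t = np.arange(20.0)
print(round(mpri(t, 2 * t, 10 * t, 1 * t, 10 * t), 3))  # 2.0
```

Substituting the recovery curve for the rest curve in the denominator is exactly what produces the MPRi-recov underestimation the study quantifies.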

Results

In all 20 subjects, MPRi-rest was 1.78 ± 0.60. Recovery up-slope did not return to resting levels, regardless of aminophylline use. Among patients not receiving aminophylline, MPRi-recov was 36 ± 16% lower than MPRi-rest (1.13 ± 0.38 vs. 1.82 ± 0.73, P = 0.001). In the 10 patients whose recovery was facilitated with aminophylline, MPRi-recov was 20 ± 24% lower than MPRi-rest (1.40 ± 0.35 vs. 1.73 ± 0.43, P = 0.04), indicating incomplete reversal. In 3 subjects not receiving aminophylline and 4 subjects receiving aminophylline, up-slope at recovery was greater than at stress, suggesting delayed maximal hyperemia.

Conclusions

MPRi measurements from regadenoson CMR are underestimated if recovery perfusion is used as a substitute for resting perfusion, even when recovery is facilitated with aminophylline. True resting images should be used to allow accurate MPRi quantification. The delayed maximal hyperemia observed in some subjects deserves further study.

Trial registration

ClinicalTrials.gov NCT00871260

9.

Introduction

Optimal feeding of critically ill patients in the ICU is controversial. Existing guidelines rest on rather weak evidence. Whole body protein kinetics may be an attractive technique for assessing optimal protein intake. In this study, critically ill patients were investigated during hypocaloric and normocaloric IV nutrition.

Methods

Neurosurgical patients on mechanical ventilation (n = 16) were studied during a 48-hour period. In random order 50% and 100% of measured energy expenditure was given as IV nutrition during 24 hours, corresponding to hypocaloric and normocaloric nutrition, respectively. At the end of each period, whole body protein turnover was measured using d5-phenylalanine and 13C-leucine tracers.

Results

The phenylalanine tracer indicated that whole-body protein synthesis was lower during hypocaloric feeding, while whole-body protein degradation and amino acid oxidation were unaltered, which resulted in a more negative protein balance, namely −1.9 ± 2.1 versus −0.7 ± 1.3 mg phenylalanine/kg/h (P = 0.014). The leucine tracer indicated that whole body protein synthesis and degradation and amino acid oxidation were unaltered, but the protein balance was negative during hypocaloric feeding, namely −0.3 ± 0.5 versus 0.6 ± 0.5 mg leucine/kg/h (P < 0.001).
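The balance figures above follow from the usual steady-state tracer bookkeeping. A sketch under those standard assumptions (the numbers in the example are illustrative, not study data):

```python
def whole_body_kinetics(ra, intake, oxidation):
    """Steady-state whole-body tracer model (all rates in mg/kg/h).
    Rate of appearance Ra = degradation + exogenous intake, so
    degradation = Ra - intake; synthesis = Ra - oxidation
    (non-oxidative disposal); net balance = synthesis - degradation,
    which reduces to intake - oxidation."""
    degradation = ra - intake
    synthesis = ra - oxidation
    balance = synthesis - degradation
    return synthesis, degradation, balance

# Illustrative: Ra 10, IV amino acid intake 3, oxidation 4 mg/kg/h
print(whole_body_kinetics(10, 3, 4))  # (6, 7, -1)
```

This makes the study's finding intuitive: halving intake with unchanged oxidation necessarily pushes the balance more negative, whichever tracer is used.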

Conclusion

In the patient group studied, hypocaloric feeding was associated with a more negative protein balance, but amino acid oxidation was not different. The protein kinetics measurements and the study’s investigational protocol were useful for assessing the efficacy of nutrition support on protein metabolism in critically ill patients.

10.

Introduction

The combination of adenosine (A), lidocaine (L) and Mg2+ (M) (ALM) has demonstrated cardioprotective and resuscitative properties in models of cardiac arrest and hemorrhagic shock. This study evaluated whether ALM also demonstrates organ-protective properties in an endotoxemic porcine model.

Methods

Pigs (37 to 42 kg) were randomized to either 1) control (n = 8) or 2) ALM (n = 8), followed by lipopolysaccharide infusion (1 μg/kg/h) for five hours. ALM treatment consisted of 1) a high-dose bolus (A 0.82 mg/kg, L 1.76 mg/kg, M 0.92 mg/kg), 2) a one-hour continuous infusion (A 300 μg/kg/min, L 600 μg/kg/min, M 336 μg/kg/min) and 3) three hours at a lower dose (A 240 μg/kg/min, L 480 μg/kg/min, M 268 μg/kg/min); controls received normal saline. Hemodynamic, cardiac, pulmonary, metabolic and renal functions were evaluated.
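For orientation, the cumulative drug exposure implied by this three-phase protocol can be tallied per animal. This sketch assumes a 40 kg pig and that the lower-dose adenosine rate is 240 μg/kg/min (matching the units of the other rates):

```python
def alm_totals(weight_kg=40):
    """Total drug delivered per pig (mg): bolus (mg/kg), then 60 min
    at the high infusion rate, then 180 min at the lower rate
    (rates in ug/kg/min). Assumes the lower adenosine rate is
    240 ug/kg/min, consistent with the other units."""
    bolus = {'A': 0.82, 'L': 1.76, 'M': 0.92}   # mg/kg
    high = {'A': 300, 'L': 600, 'M': 336}       # ug/kg/min, 60 min
    low = {'A': 240, 'L': 480, 'M': 268}        # ug/kg/min, 180 min
    totals = {}
    for d in 'ALM':
        infused_ug = (high[d] * 60 + low[d] * 180) * weight_kg
        totals[d] = bolus[d] * weight_kg + infused_ug / 1000
    return totals

print(alm_totals(40))
```

For a 40 kg pig this works out to roughly 2.5 g adenosine, 5.0 g lidocaine and 2.8 g magnesium over the four hours of infusion.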

Results

ALM lowered mean arterial pressure (Mean value during infusion period: ALM: 47 (95% confidence interval (CI): 44 to 50) mmHg versus control: 79 (95% CI: 75 to 85) mmHg, P <0.0001). After cessation of ALM, mean arterial pressure immediately increased (end of study: ALM: 88 (95% CI: 81 to 96) mmHg versus control: 86 (95% CI: 79 to 94) mmHg, P = 0.72). Whole body oxygen consumption was significantly reduced during ALM infusion (ALM: 205 (95% CI: 192 to 217) ml oxygen/min versus control: 231 (95% CI: 219 to 243) ml oxygen/min, P = 0.016). ALM treatment reduced pulmonary injury evaluated by PaO2/FiO2 ratio (ALM: 388 (95% CI: 349 to 427) versus control: 260 (95% CI: 221 to 299), P = 0.0005). ALM infusion led to an increase in heart rate while preserving preload recruitable stroke work. Creatinine clearance was significantly lower during ALM infusion but reversed after cessation of infusion. ALM reduced tumor necrosis factor-α peak levels (ALM 7121 (95% CI: 5069 to 10004) pg/ml versus control 11596 (95% CI: 9083 to 14805) pg/ml, P = 0.02).

Conclusion

ALM infusion induced a reversible hypotensive and hypometabolic state, attenuated tumor necrosis factor-α levels and improved cardiac and pulmonary function; it also led to a transient drop in renal function that reversed after the treatment was stopped.

11.

Introduction

Rewarming from deep hypothermic circulatory arrest (DHCA) produces calcium desensitization through troponin I (cTnI) phosphorylation, which results in myocardial dysfunction. This study investigated the acute overall hemodynamic and metabolic effects of epinephrine and of levosimendan, a calcium sensitizer, on myocardial function after rewarming from DHCA.

Methods

Forty male Wistar rats (400 to 500 g) underwent cardiopulmonary bypass (CPB) through central cannulation and were cooled to a core temperature of 13°C to 15°C within 30 minutes. After DHCA (20 minutes) and CPB-assisted rewarming (60 minutes) rats were randomly assigned to 60 minute intravenous infusion with levosimendan (0.2 μg/kg/min; n = 15), epinephrine (0.1 μg/kg/min; n = 15) or saline (control; n = 10). Systolic and diastolic functions were evaluated at different preloads with a conductance catheter.

Results

The slope of the left ventricular end-systolic pressure-volume relationship (Ees) and preload recruitable stroke work (PRSW) recovered significantly better with levosimendan than with epinephrine (Ees: 85 ± 9% versus 51 ± 11%, P < 0.003; PRSW: 78 ± 5% versus 48 ± 8%, P < 0.005; baseline: 100%). Levosimendan, but not epinephrine, reduced left ventricular stiffness, shown by the end-diastolic pressure-volume relationship, and improved ventricular relaxation (Tau). Levosimendan preserved myocardial ATP content as well as energy charge and reduced plasma lactate concentrations. In normothermia experiments, epinephrine, in contrast to levosimendan, increased cTnI phosphorylation 3.5-fold. After rewarming from DHCA, cTnI phosphorylation increased 4.5-fold in the saline and epinephrine groups compared with normothermia but remained unchanged with levosimendan.

Conclusions

By preventing calcium desensitization caused by cTnI phosphorylation, levosimendan is more effective than epinephrine for the treatment of myocardial dysfunction after rewarming from DHCA.

12.

Introduction

Liberal and overaggressive use of vasopressors during the initial period of shock resuscitation may compromise organ perfusion and worsen outcome. When transiently applying the concept of permissive hypotension, it would be helpful to know at which arterial blood pressure terminal cardiovascular collapse occurs.

Methods

In this retrospective cohort study, we aimed to identify the arterial blood pressure associated with terminal cardiovascular collapse in 140 patients who died in the intensive care unit while being invasively monitored. Demographic data, co-morbid conditions and clinical data at admission and during the 24 hours before and at the time of terminal cardiovascular collapse were collected. The systolic, mean and diastolic arterial blood pressures immediately before terminal cardiovascular collapse were documented. Terminal cardiovascular collapse was defined as an abrupt (<5 minutes) and exponential decrease in heart rate (>50% compared to preceding values) followed by cardiac arrest.
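The collapse definition above is effectively an algorithm over the monitoring record. A minimal sketch; treating "preceding values" as the earliest heart-rate reading within the prior 300 s is an assumption of this illustration:

```python
def collapse_onset(times_s, heart_rates):
    """Return the index of the first reading where heart rate has
    fallen by more than 50% within a <5-minute (300 s) window,
    per the study's definition of terminal cardiovascular collapse;
    None if no such abrupt decrease occurs."""
    for i, (t, hr) in enumerate(zip(times_s, heart_rates)):
        for j in range(i):
            # earliest prior reading still within the 300 s window
            if t - times_s[j] <= 300:
                if hr < 0.5 * heart_rates[j]:
                    return i
                break
    return None

# 80 bpm for 4 minutes, then an abrupt drop to 35 bpm -> flagged
print(collapse_onset([0, 60, 120, 180, 240], [80, 80, 80, 80, 35]))  # 4
```

A gradual decline over hours would never trigger the flag, which matches the paper's requirement that the decrease be abrupt and exponential.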

Results

The mean ± standard deviation (SD) values of the systolic, mean and diastolic arterial blood pressures associated with terminal cardiovascular collapse were 47 ± 12 mmHg, 35 ± 11 mmHg and 29 ± 9 mmHg, respectively. Patients with congestive heart failure (39 ± 13 mmHg versus 34 ± 10 mmHg; P = 0.04), left main stem stenosis (39 ± 11 mmHg versus 34 ± 11 mmHg; P = 0.03) or acute right heart failure (39 ± 13 mmHg versus 34 ± 10 mmHg; P = 0.03) had higher arterial blood pressures than patients without these risk factors. Patients with severe valvular aortic stenosis had the highest arterial blood pressures associated with terminal cardiovascular collapse (systolic, 60 ± 20 mmHg; mean, 46 ± 12 mmHg; diastolic, 36 ± 10 mmHg), but this difference was not significant. Patients with sepsis and patients exposed to sedatives or opioids during the terminal phase exhibited lower arterial blood pressures than patients without sepsis or administration of such drugs.

Conclusions

The arterial blood pressure associated with terminal cardiovascular collapse in critically ill patients was very low and varied with individual co-morbid conditions (for example, congestive heart failure, left main stem stenosis, severe valvular aortic stenosis, acute right heart failure), drug exposure (for example, sedatives or opioids) and the type of acute illness (for example, sepsis).

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0719-2) contains supplementary material, which is available to authorized users.

13.

Background

Diffuse myocardial fibrosis (DMF) is important in cardiovascular disease but until recently could be assessed only by invasive biopsy. We hypothesised that DMF measured by T1 mapping is elevated in isolated systemic hypertension.

Methods

In a study of well-controlled hypertensive patients from a specialist tertiary centre, 46 hypertensive patients (median age 56, range 21 to 78; 52% male) and 50 healthy volunteers (median age 45, range 28 to 69; 52% male) underwent clinical CMR at 1.5 T with T1 mapping (ShMOLLI), using the equilibrium contrast technique for extracellular volume (ECV) quantification. Patients underwent 24-hour automated blood pressure monitoring (ABPM), echocardiographic assessment of diastolic function, aortic stiffness assessment and measurement of NT-pro-BNP and collagen biomarkers.

Results

Late gadolinium enhancement (LGE) revealed significant unexpected underlying pathology in 6 of 46 patients (13%; myocardial infarction, n = 3; hypertrophic cardiomyopathy (HCM), n = 3); these patients were subsequently excluded. Limited, non-ischaemic LGE patterns were seen in 11 of the remaining 40 patients (28%). Hypertensives on therapy (mean 2.2 agents) had a mean ABPM of 152/88 mmHg, but only 35% (14/40) had left ventricular hypertrophy (LVH; LV mass male >90 g/m2, female >78 g/m2). Native myocardial T1 was similar in hypertensives and controls (955 ± 30 ms versus 965 ± 38 ms, p = 0.16). The difference in ECV did not reach significance (0.26 ± 0.02 versus 0.27 ± 0.03, p = 0.06). In the subset with LVH, the ECV was significantly higher (0.28 ± 0.03 versus 0.26 ± 0.02, p < 0.001).

Conclusion

In well-controlled hypertensive patients, conventional CMR discovered significant underlying diseases (chronic infarction, HCM) not detected by echocardiography previously or even during this study. T1 mapping revealed increased diffuse myocardial fibrosis, but the increases were small and occurred only with LVH.

14.

Introduction

Assist delivered in unison with the patient’s inspiratory neural effort, combined with feedback-controlled limitation of lung distension, may allow neurally adjusted ventilatory assist (NAVA) to reduce the negative effects of mechanical ventilation on right ventricular function.

Methods

Heart–lung interaction was evaluated in 10 intubated patients with impaired cardiac function using esophageal balloons, pulmonary artery catheters and echocardiography. An adequate NAVA level, identified by titration to the breathing pattern (NAVAal), together with 50% NAVAal and 200% NAVAal, and an adequate pressure support level (PSVal, defined clinically), together with 50% PSVal and 150% PSVal, were each implemented at constant positive end-expiratory pressure for 20 minutes.

Results

NAVAal was 3.1 ± 1.1 cmH2O/μV and PSVal was 17 ± 2 cmH2O. At all NAVA levels, negative esophageal pressure deflections were observed during inspiration, whereas this pattern was reversed during PSVal and PSVhigh. Compared with expiration, the inspiratory right ventricular outflow tract velocity-time integral (a surrogate for stroke volume) was 103 ± 4%, 109 ± 5%, and 100 ± 4% for NAVAlow, NAVAal, and NAVAhigh and 101 ± 3%, 89 ± 6%, and 83 ± 9% for PSVlow, PSVal, and PSVhigh, respectively (p < 0.001 for the level-mode interaction, ANOVA). Right ventricular systolic isovolumetric pressure increased from 11.0 ± 4.6 mmHg at PSVlow to 14.0 ± 4.6 mmHg at PSVhigh but remained unchanged across NAVA levels (11.5 ± 4.7 mmHg at NAVAlow and 10.8 ± 4.2 mmHg at NAVAhigh; level-mode interaction p = 0.005). Both findings indicate progressively increasing right ventricular outflow impedance with increasing pressure support ventilation (PSV), but no change with increasing NAVA level.

Conclusions

Right ventricular performance is less impaired during NAVA compared to PSV as used in this study. Proposed mechanisms are preservation of cyclic intrathoracic pressure changes characteristic of spontaneous breathing and limitation of right-ventricular outflow impedance during inspiration, regardless of the NAVA level.

Trial registration

Clinicaltrials.gov Identifier: NCT00647361, registered 19 March 2008

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0499-8) contains supplementary material, which is available to authorized users.

15.

Introduction

Delirium is a common occurrence in critically ill patients and is associated with an increase in morbidity and mortality. Septic patients with delirium may differ from a general critically ill population. The aim of this investigation was to study the relationship between systemic inflammation and the development of delirium in septic and non-septic critically ill patients.

Methods

We performed a prospective cohort study in a 20-bed mixed intensive care unit (ICU) including 78 (delirium = 31; non-delirium = 47) consecutive patients admitted for more than 24 hours. At enrollment, patients were allocated to septic or non-septic groups according to internationally agreed criteria. Delirium was diagnosed using the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) during the first 72 hours of ICU admission. Blood samples were collected within 12 hours of enrollment for determination of tumor necrosis factor (TNF)-α, soluble TNF Receptor (STNFR)-1 and -2, interleukin (IL)-1β, IL-6, IL-10 and adiponectin.

Results

Of all the biomarkers analyzed, only STNFR1 (P = 0.003), STNFR2 (P = 0.005), adiponectin (P = 0.005) and IL-1β (P < 0.001) levels were higher in patients with delirium. After adjustment for sepsis and sedation, these biomarkers remained independently associated with the occurrence of delirium. However, none of them was significantly influenced by sepsis.

Conclusions

STNFR1, STNFR2, adiponectin and IL-1β were associated with delirium. Sepsis did not modify the relationship between these biomarkers and the occurrence of delirium.

16.

Introduction

Current practice in the delivery of caloric intake (DCI) in patients with severe acute kidney injury (AKI) receiving renal replacement therapy (RRT) is unknown. We aimed to describe calorie administration in patients enrolled in the Randomized Evaluation of Normal vs. Augmented Level of Replacement Therapy (RENAL) study and to assess the association between DCI and clinical outcomes.

Methods

We performed a secondary analysis in 1456 patients from the RENAL trial. We measured the dose and evolution of DCI during treatment and analyzed its association with major clinical outcomes using multivariable logistic regression, Cox proportional hazards models, and time adjusted models.

Results

Overall, mean DCI during treatment in the ICU was low, at only 10.9 ± 9 kcal/kg/day for non-survivors and 11 ± 9 kcal/kg/day for survivors. Among patients with a lower DCI (below the median), 334 of 729 (45.8%) had died at 90 days after randomization, compared with 316 of 727 (43.3%) patients with a higher DCI (above the median) (P = 0.34). On multivariable logistic regression analysis, mean DCI carried an odds ratio of 0.95 (95% confidence interval (CI): 0.91 to 1.00; P = 0.06) per 100 kcal increase for 90-day mortality. DCI was not associated with significant differences in renal replacement therapy (RRT)-free days, mechanical ventilation-free days, ICU-free days or hospital-free days. These findings remained essentially unaltered after time-adjusted analysis and Cox proportional hazards modeling.
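Because logistic-regression odds ratios are multiplicative in the covariate, the reported per-100-kcal odds ratio can be rescaled to any intake difference. A one-line sketch (the function name is ours, and the 0.95 point estimate carries the wide CI reported above):

```python
def mortality_or(delta_kcal, or_per_100=0.95):
    """Scale a per-100-kcal logistic-regression odds ratio to an
    arbitrary difference in mean daily caloric intake, using the
    multiplicative property of odds ratios: OR(d) = OR_100 ** (d/100)."""
    return or_per_100 ** (delta_kcal / 100)

# A 500 kcal/day higher mean intake implies OR = 0.95 ** 5
print(round(mortality_or(500), 3))  # 0.774
```

This is point-estimate arithmetic only; given that the 95% CI for the per-100-kcal OR reaches 1.00, the scaled estimate is equally compatible with no effect.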

Conclusions

In the RENAL study, mean DCI was low. Within the limits of such low caloric intake, greater DCI was not associated with improved clinical outcomes.

Trial registration

ClinicalTrials.gov number, NCT00221013

17.

Introduction

The aim of this study was to determine if there are differences between patients with pre-existing left ventricular dysfunction and those with normal antecedent left ventricular function during a sepsis episode in terms of in-hospital mortality and mortality risk factors when treated in accordance with a sepsis treatment algorithm.

Methods

We performed a retrospective case-control analysis of patients selected from a quality improvement database of 1,717 patients hospitalized with sepsis between 1 January 2005 and 30 June 2010. In this study, 197 patients with pre-existing left ventricular systolic dysfunction and sepsis were compared to 197 case-matched patients with normal prior cardiac function and sepsis.

Results

In-hospital mortality rates (P = 0.117) and intubation rates at 24 hours (P = 0.687) were not significantly different between cases and controls. There was no correlation between the amount of intravenous fluid administered over the first 24 hours and the PaO2/FiO2 ratio at 24 hours in either cases or controls (r2 = 0.019 and r2 = 0.001, respectively). Mortality risk factors for cases included intubation status (P = 0.016, OR = 0.356 for no intubation), compliance with a sepsis bundle (P = 0.008, OR = 3.516 for failed compliance), a source of infection other than the lung (P = 0.019, OR = 2.782), and the initial mixed venous oxygen saturation (P = 0.004, OR = 0.997). Risk factors for controls were the initial platelet count (P = 0.028, OR = 0.997) and the serum lactate level (P = 0.048, OR = 1.104). Patients with pre-existing left ventricular dysfunction who died had a lower initial mean mixed venous oxygen saturation than those who survived (61 ± 18% versus 70 ± 16%, P = 0.002).

Conclusions

Clinical outcomes were not different between septic patients with pre-existing left ventricular dysfunction and those with no cardiac disease. There was no correlation between fluid administration and oxygenation at 24 hours in either cohort. The mortality risk factor profile of patients with pre-existing left ventricular dysfunction was different when compared with control patients, and may be related to oxygen delivery determinants.

18.

Background

Whether T1-mapping cardiovascular magnetic resonance (CMR) can accurately quantify the area-at-risk (AAR) as delineated by T2 mapping and assess myocardial salvage at 3T in reperfused ST-segment elevation myocardial infarction (STEMI) patients is not known and was investigated in this study.

Methods

18 STEMI patients underwent CMR at 3T (Siemens Biograph mMR) at a median of 5 (4–6) days after primary percutaneous coronary intervention, using native T1 (MOLLI) and T2 mapping (WIP #699; Siemens Healthcare, UK). Matching short-axis T1 and T2 maps covering the entire left ventricle (LV) were assessed by two independent observers using manual, Otsu and 2-standard-deviation thresholds. Inter- and intra-observer variability, and the correlation and agreement between the T1 and T2 mapping techniques on a per-slice and per-patient basis, were assessed.
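The Otsu threshold referenced here selects the intensity cutoff that maximizes the between-class variance of the pixel-value histogram, so the choice is data-driven rather than observer-dependent. A minimal sketch of the generic algorithm (not the vendor implementation used in the study) might look like:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the cutoff maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()        # probability mass per bin
    centers = (edges[:-1] + edges[1:]) / 2.0   # bin midpoints
    w0 = np.cumsum(p)                          # weight of the "below" class
    w1 = 1.0 - w0                              # weight of the "above" class
    mu = np.cumsum(p * centers)                # cumulative mean intensity
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mu_total * w0 - mu) ** 2 / (w0 * w1)
    var_between[~np.isfinite(var_between)] = 0.0  # empty-class candidates
    return centers[np.argmax(var_between)]
```

Applied to a map with two well-separated intensity populations, the returned cutoff falls between them, splitting the pixels into the two classes.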

Results

A total of 125 matching T1 and T2 mapping short-axis slices were available for analysis from 18 patients. The acquisition times were identical for the T1 maps and T2 maps. 18 slices were excluded due to suboptimal image quality. Both mapping sequences were equally prone to susceptibility artifacts in the lateral wall and were equally likely to be affected by microvascular obstruction requiring manual correction. The Otsu thresholding technique performed best in terms of inter- and intra-observer variability for both T1 and T2 mapping CMR. The mean myocardial infarct size was 18.8 ± 9.4 % of the LV. There was no difference in either the mean AAR (32.3 ± 11.5 % of the LV versus 31.6 ± 11.2 % of the LV, P = 0.25) or myocardial salvage index (0.40 ± 0.26 versus 0.39 ± 0.27, P = 0.20) between the T1 and T2 mapping techniques. On a per-slice analysis, there was an excellent correlation between T1 mapping and T2 mapping in the quantification of the AAR with an R2 of 0.95 (P < 0.001), with no bias (mean ± 2SD: bias 0.0 ± 9.6 %). On a per-patient analysis, the correlation and agreement remained excellent with no bias (R2 0.95, P < 0.0001, bias 0.7 ± 5.1 %).

Conclusions

T1 mapping CMR at 3T performed as well as T2 mapping in quantifying the AAR and assessing myocardial salvage in reperfused STEMI patients, thereby providing an alternative CMR measure of the AAR.

19.

Introduction

Plasma selenium (Se) concentrations are reduced in critically ill surgical patients, and lower plasma Se concentrations are associated with worse outcomes. We investigated whether adjuvant Se supplementation in the form of sodium selenite could improve outcomes in surgical patients with sepsis.

Methods

In this retrospective study, all adult patients admitted to a 50-bed surgical ICU with severe sepsis between January 2004 and April 2010 were included and analysed according to whether they had received adjuvant Se supplementation, which was given at the discretion of the attending physician. When prescribed, Se was administered as sodium selenite pentahydrate (Na2SeO3∙5H2O), in which 100 μg of Se corresponds to 333 μg of sodium selenite. A bolus of sodium selenite corresponding to 1,000 μg of Se was injected intravenously through a central venous line over 30 minutes, followed by a continuous infusion of 1,000 μg of Se per 24 hours for up to 14 days or until ICU discharge or death. We performed logistic regression analysis to investigate the impact of adjuvant Se supplementation on hospital mortality.
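The stated equivalence (100 μg of elemental Se per 333 μg of salt) follows from the molar masses when the pentahydrate is weighed. A quick check, using standard atomic weights (the specific values below are assumptions, not taken from the abstract):

```python
# Molar masses in g/mol (standard atomic weights -- assumed, not from the study)
M_NA, M_SE, M_O, M_H2O = 22.990, 78.971, 15.999, 18.015

# Na2SeO3 . 5 H2O: two Na, one Se, three O, plus five waters of crystallization
m_pentahydrate = 2 * M_NA + M_SE + 3 * M_O + 5 * M_H2O   # ~263 g/mol
se_fraction = M_SE / m_pentahydrate                       # ~0.30 by mass

# Micrograms of pentahydrate salt needed to deliver 100 ug of elemental Se
salt_per_100ug_se = 100 / se_fraction
print(round(salt_per_100ug_se))  # → 333
```

So the 333 μg figure corresponds to the pentahydrate salt; weighing the anhydrous salt instead would give a smaller mass for the same Se dose.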

Results

Adjuvant Se was administered to 413 (39.7%) of the 1,047 patients admitted with severe sepsis. Age and sex were similar between patients who received adjuvant Se and those who did not. Compared with patients who did not receive adjuvant Se supplementation, those who did had higher Simplified Acute Physiology Score II scores, had a greater prevalence of cancer upon admission to the ICU, and were more commonly admitted after abdominal surgery. They also had higher hospital mortality rates (46% versus 39.1%; P = 0.027), longer median (interquartile range (IQR)) ICU stays (15 days (6 to 24) versus 11 days (4 to 24); P = 0.01) and longer hospital lengths of stay (33 days (21 to 52) versus 28 days (17 to 46); P = 0.001). In multivariable analysis, adjuvant Se supplementation was not independently associated with a favourable outcome (odds ratio = 1.19, 95% confidence interval = 0.86 to 1.65; P = 0.288).

Conclusions

In this retrospective analysis of a large cohort of surgical ICU patients with severe sepsis, adjuvant Se supplementation in the form of sodium selenite had no impact on in-hospital death rates after adjustment for confounders.

20.

Introduction

Several methods have been proposed to evaluate neurological outcome in out-of-hospital cardiac arrest (OHCA) patients. Blood lactate has been recognized as a reliable prognostic marker for trauma, sepsis, or cardiac arrest. The objective of this study was to examine the association between initial lactate level or lactate clearance and neurologic outcome in OHCA survivors who were treated with therapeutic hypothermia.

Methods

This retrospective cohort study included patients who underwent protocol-based 24-hour therapeutic hypothermia after OHCA between January 2010 and March 2012. Serum lactate levels were measured at the start of therapy (0 hours) and after 6, 12, 24, 48 and 72 hours, and the 6-hour and 12-hour lactate clearances were then calculated. Neurologic outcome was assessed one month after cardiac arrest; a good neurological outcome was defined as a Cerebral Performance Category of one or two. The primary outcome was the association between initial lactate level and good neurologic outcome. The secondary outcome was the association between lactate clearance and good neurologic outcome in patients with an initial lactate level >2.5 mmol/L.

Results

Of the 76 patients enrolled, 34 (44.7%) had a good neurologic outcome. The initial lactate level did not differ significantly between the good and poor neurologic outcome groups (6.07 ± 4.09 mmol/L versus 7.13 ± 3.99 mmol/L, P = 0.42). However, lactate levels at 6, 12, 24 and 48 hours were lower in the good neurologic outcome group than in the poor neurologic outcome group (3.81 ± 2.81 versus 6.00 ± 3.22, P < 0.01; 2.95 ± 2.07 versus 5.00 ± 3.49, P < 0.01; 2.17 ± 1.24 versus 3.86 ± 3.92, P < 0.01; and 1.57 ± 1.02 versus 2.21 ± 1.35, P = 0.03, respectively). The secondary analysis showed that the 6-hour and 12-hour lactate clearances were higher in patients with a good neurologic outcome (35.3 ± 34.6% versus 6.89 ± 47.4%, P = 0.01, and 54.5 ± 23.7% versus 25.6 ± 43.7%, P < 0.01, respectively). After adjusting for potential confounding variables, the 12-hour lactate clearance remained statistically significant (P = 0.02).
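Lactate clearance in studies of this kind is conventionally the percent change from the initial value; the abstract does not state its exact formula, so the sketch below assumes that standard definition:

```python
def lactate_clearance(initial, later):
    """Percent lactate clearance relative to the initial value.

    Positive values mean lactate fell; negative values mean it rose.
    """
    if initial <= 0:
        raise ValueError("initial lactate must be positive")
    return (initial - later) / initial * 100.0

# Illustrative values only: an initial lactate of 8.0 mmol/L
# falling to 2.0 mmol/L by 12 hours gives 75% clearance.
print(lactate_clearance(8.0, 2.0))  # → 75.0
```

Note that a patient whose lactate rises gets a negative clearance, which is why mean clearances such as 6.89 ± 47.4% can have large standard deviations.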

Conclusion

The lactate clearance rate, and not the initial lactate level, was associated with neurological outcome in OHCA patients after therapeutic hypothermia.
