Similar Articles (20 results)
1.

Introduction

The aim of this study was to describe the population pharmacokinetics of vancomycin in critically ill patients treated with and without extracorporeal membrane oxygenation (ECMO).

Methods

We retrospectively reviewed data from critically ill patients treated with ECMO and matched controls who received a continuous infusion of vancomycin (a 35 mg/kg loading dose over 4 hours, followed by a daily infusion adapted to creatinine clearance (CrCl)). The pharmacokinetics of vancomycin were described using non-linear mixed-effects modeling.
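To give a concrete sense of how such a regimen behaves, here is a minimal one-compartment sketch in Python. It assumes first-order elimination and uses the median Vd and clearance reported in the Results below; the function name, regimen shape, and parameter values are illustrative and do not reproduce the authors' fitted population model.

```python
import numpy as np

def vanco_concentration(t_h, weight_kg=70.0, vd_l=92.0, cl_l_h=2.4,
                        loading_mg_per_kg=35.0, loading_hours=4.0,
                        daily_dose_mg=2000.0):
    """Serum vancomycin concentration (mg/L) under a regimen shaped like the
    study's: a loading infusion over 4 h followed by a continuous daily
    infusion. One-compartment model with first-order elimination; all
    parameter values are illustrative, not the paper's estimates."""
    k = cl_l_h / vd_l                                       # elimination rate constant (1/h)
    r_load = loading_mg_per_kg * weight_kg / loading_hours  # loading infusion rate (mg/h)
    r_maint = daily_dose_mg / 24.0                          # maintenance infusion rate (mg/h)
    t = np.asarray(t_h, dtype=float)
    # During a zero-order infusion at rate R starting from C0:
    #   C(t) = R/CL * (1 - exp(-k*t)) + C0 * exp(-k*t)
    c_end_load = r_load / cl_l_h * (1 - np.exp(-k * loading_hours))
    during = r_load / cl_l_h * (1 - np.exp(-k * t))
    after = (r_maint / cl_l_h * (1 - np.exp(-k * (t - loading_hours)))
             + c_end_load * np.exp(-k * (t - loading_hours)))
    return np.where(t <= loading_hours, during, after)

print(vanco_concentration([4, 12, 24]))  # illustrative levels at 4, 12 and 24 h
```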

Results

We compared 11 patients treated with ECMO with 11 well-matched controls. Drug dosing was similar between groups. The median (interquartile range, IQR) vancomycin concentrations in ECMO and non-ECMO patients were 51 (28 to 71) versus 45 (37 to 71) mg/L at 4 hours; 23 (16 to 38) versus 29 (21 to 35) mg/L at 12 hours; and 20 (12 to 36) versus 23 (17 to 28) mg/L at 24 hours (ANOVA, P = 0.53). Median (range) volume of distribution (Vd) was 99.3 (49.1 to 212.3) versus 92.3 (22.4 to 149.4) L in ECMO and non-ECMO patients, respectively, and clearance was 2.4 (1.7 to 4.9) versus 2.3 (1.8 to 3.6) L/h (not significant). Insufficient drug concentrations (that is, drug levels <20 mg/L) were more common in the ECMO group. The pharmacokinetic model was prospectively validated in five additional ECMO-treated patients over a 6-month period. Linear regression comparing the observed concentrations with those predicted by the model showed good correlation (r² = 0.67; P < 0.001).

Conclusions

Vancomycin concentrations were similar between ECMO and non-ECMO patients in the early phase of therapy. ECMO treatment was not associated with significant changes in Vd or drug clearance compared with the control patients.

2.

Introduction

Acute respiratory failure (ARF) is the main reason for intensive care unit (ICU) admissions in patients with hematologic malignancies (HMs). We report the first series of adult patients with ARF and HMs treated with extracorporeal membrane oxygenation (ECMO).

Methods

This is a retrospective cohort study of 14 patients with HMs (aggressive non-Hodgkin lymphoma (NHL), n = 5; highly aggressive NHL (that is, acute lymphoblastic leukemia or Burkitt lymphoma), n = 5; Hodgkin lymphoma, n = 2; acute myeloid leukemia, n = 1; multiple myeloma, n = 1) receiving ECMO support because of ARF (all data as medians and interquartile ranges: age, 32 years (22 to 51); Simplified Acute Physiology Score II (SAPS II), 51 (42 to 65)). The etiology of ARF was pneumonia (n = 10), thoracic manifestation of NHL (n = 2), sepsis of nonpulmonary origin (n = 1), and transfusion-related acute lung injury (n = 1). The diagnosis of HM was established during ECMO in four patients, and five patients first received (immuno)chemotherapy while on ECMO.

Results

Before ECMO, the PaO2/FiO2 ratio was 60 (53 to 65). Three patients received venoarterial ECMO because of acute circulatory failure in addition to ARF; all other patients received venovenous ECMO. All patients needed vasopressors, and five needed hemofiltration. Thrombocytopenia occurred in all patients (lowest platelet count, 20 (11 to 21) G/L). Five major bleeding events were noted. ECMO duration was 8.5 (4 to 16) days. ICU and hospital survival was 50%. All survivors were alive at follow-up (36 (10 to 58) months); five patients were in complete remission, one was in partial remission, and one had relapsed.

Conclusions

ECMO therapy is feasible in selected patients with HMs and ARF and can be associated with long-term disease-free survival.

3.

Introduction

The use of standard doses of β-lactam antibiotics during continuous renal replacement therapy (CRRT) may result in inadequate serum concentrations. The aim of this study was to evaluate the adequacy of unadjusted drug regimens (i.e., similar to those used in patients with normal renal function) in patients treated with CRRT and the influence of CRRT intensity on drug clearance.

Methods

We reviewed data from 50 consecutive adult patients admitted to our Department of Intensive Care in whom routine therapeutic drug monitoring (TDM) of broad-spectrum β-lactam antibiotics (ceftazidime or cefepime, CEF; piperacillin/tazobactam, TZP; meropenem, MEM) was performed using unadjusted regimens (CEF, 2 g q8h; TZP, 4 g q6h; MEM, 1 g q8h). Serum drug concentrations were measured twice during the elimination phase by high-performance liquid chromatography (HPLC-UV). Therapy was considered adequate when serum drug concentrations remained between 4 and 8 times the minimal inhibitory concentration (MIC) of Pseudomonas aeruginosa for an optimal fraction of the dosing interval (≥70% for CEF; ≥50% for TZP; ≥40% for MEM); the window itself is sketched in the example below. TDM was classified as early phase (ET) if performed within 48 hours of antibiotic initiation, and as late phase (LT) thereafter.
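The sketch below classifies a single measured level against the study's 4× to 8× MIC window. It deliberately leaves aside the time-fraction criterion, and the MIC values are assumed, EUCAST-style breakpoints for Pseudomonas aeruginosa rather than values taken from the paper.

```python
# Hypothetical helper mirroring the study's adequacy window: a level is
# "adequate" between 4x and 8x the MIC, "insufficient" below 4x MIC, and
# "excessive" above 8x MIC. The MIC values are assumed illustrative
# breakpoints for P. aeruginosa, not figures from the paper.
ASSUMED_MIC_MG_L = {"CEF": 8.0, "TZP": 16.0, "MEM": 2.0}

def classify_level(drug: str, concentration_mg_l: float) -> str:
    mic = ASSUMED_MIC_MG_L[drug]
    if concentration_mg_l < 4 * mic:
        return "insufficient"
    if concentration_mg_l > 8 * mic:
        return "excessive"
    return "adequate"

print(classify_level("MEM", 12.0))  # between 8 and 16 mg/L -> "adequate"
```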

Results

We collected 73 serum samples from 50 patients (age, 58 ± 13 years; Acute Physiology and Chronic Health Evaluation II (APACHE II) score on admission, 21 (17 to 25)): 35 during ET and 38 during LT. Drug concentrations were above 4 times the MIC in 63 samples (90%) but above 8 times the MIC in 39 (53%). The proportions of patients with adequate drug concentrations during ET and LT were similar. We found a weak but significant correlation between β-lactam antibiotic clearance and CRRT intensity.

Conclusions

In septic patients undergoing CRRT, doses of β-lactam antibiotics similar to those given to patients with normal renal function achieved drug levels above the target threshold in 90% of samples. Nevertheless, 53% of samples were associated with very high drug levels, and daily drug regimens may need to be adapted accordingly.

4.

Background

Microvascular obstruction (MVO) describes suboptimal tissue perfusion despite restoration of infarct-related artery flow. There are scarce data on infarct size (IS) and MVO in relation to the mode and timing of reperfusion. We sought to characterise the prevalence and extent of microvascular injury and IS using cardiovascular magnetic resonance (CMR), in relation to the mode of reperfusion following acute ST-elevation myocardial infarction (STEMI).

Methods

CMR infarct characteristics were measured in 94 STEMI patients (age, 61.0 ± 13.1 years) at 1.5 T. Seventy-three received reperfusion therapy: primary percutaneous coronary intervention (PPCI, n = 47), thrombolysis (n = 12), rescue PCI (R-PCI, n = 8), and late PCI (n = 6). Twenty-one patients presented late (>12 hours) and did not receive reperfusion therapy.

Results

IS was smaller in the PPCI (19.8 ± 13.2% of LV mass) and thrombolysis (15.2 ± 10.1%) groups than in the late PCI (40.0 ± 15.6%) and R-PCI (34.2 ± 18.9%) groups, p < 0.001. The prevalence of MVO was similar across all groups: MVO was seen at least as frequently in the non-reperfused group as in patients receiving early reperfusion therapy (15/21 [76%] vs 33/59 [56%], p = 0.21), and to a similar magnitude (1.3 [0.0 to 2.8] vs 0.4 [0.0 to 2.9]% LV mass, p = 0.36). In the 73 reperfused patients, time to reperfusion, ischaemic area at risk, and TIMI grade post-PCI were the strongest independent predictors of IS and MVO.

Conclusions

In patients with acute STEMI, CMR-measured MVO is not exclusive to reperfusion therapy and is primarily related to ischaemic time. This finding has important implications for clinical trials that use CMR to assess the efficacy of therapies to reduce reperfusion injury in STEMI.

5.

Background

Established heart failure in thalassaemia major has a poor prognosis and optimal management remains unclear.

Methods

A 1-year prospective study comparing deferoxamine (DFO) monotherapy with DFO combined with deferiprone (DFP) in patients with a left ventricular ejection fraction (LVEF) <56% was conducted by the Thalassemia Clinical Research Network (TCRN). All patients received DFO at 50–60 mg/kg, 12–24 hours/day, subcutaneously or intravenously, 7 times weekly, combined with either DFP at 75 mg/kg/day (combination arm) or placebo (DFO monotherapy arm). The primary endpoint was the change in LVEF by CMR.

Results

Improvement in LVEF was significant in both study arms at 6 and 12 months (p = 0.04), with ventricular function normalizing in 9/16 evaluable patients. With combination therapy, the LVEF increased from 49.9% to 55.2% (+5.3%, p = 0.04; n = 10) at 6 months and to 58.3% at 12 months (+8.4%, p = 0.04; n = 7). With DFO monotherapy, the LVEF increased from 52.8% to 55.7% (+2.9%, p = 0.04; n = 6) at 6 months and to 56.9% at 12 months (+4.1%, p = 0.04; n = 4). The LVEF trend did not differ significantly between study arms (p = 0.89). Heart failure deteriorated fatally in 2 patients on DFO monotherapy during the study and in 1 patient on combined therapy during follow-up. The study was originally powered for 86 participants to detect a 5% difference in LVEF improvement between treatments; it was prematurely terminated because of slow recruitment, and with the achieved sample size of 20 patients there was 80% power to detect an 8.6% difference in EF, which was not demonstrated. Myocardial T2* improved in both arms (combination, +1.9 ± 1.6 ms, p = 0.04; DFO monotherapy, +1.9 ± 1.4 ms, p = 0.04), with no significant difference between treatments (p = 0.65). Liver iron (p = 0.03) and ferritin (p < 0.001) decreased significantly only in the combination group.

Conclusions

Both treatments significantly improved LVEF and myocardial T2*. Although this is the largest and only randomized study in patients with LV decompensation, further prospective evaluation is needed to identify optimal chelation management in these high-risk patients.

6.

Introduction

Risk stratification in patients undergoing extracorporeal membrane oxygenation (ECMO) support after cardiovascular surgery remains challenging, because data on specific outcome predictors are limited. Serum butyrylcholinesterase demonstrated a strong inverse association with all-cause and cardiovascular mortality in non-critically ill patients. We therefore evaluated the predictive value of preoperative serum butyrylcholinesterase levels in patients undergoing venoarterial ECMO support after cardiovascular surgery.

Methods

We prospectively included 191 patients undergoing venoarterial ECMO therapy after cardiovascular surgery at a university-affiliated tertiary care center in our registry.

Results

All-cause and cardiovascular mortality were defined as the primary study end points. During a median follow-up of 51 months (IQR, 34 to 71), corresponding to 4,197 overall months of follow-up, 65% of patients died. Cox proportional hazards regression analysis revealed a significant and independent inverse association between higher butyrylcholinesterase levels and all-cause mortality, with an adjusted hazard ratio (HR) of 0.44 (95% CI, 0.25 to 0.78; P = 0.005), as well as cardiovascular mortality, with an adjusted HR of 0.38 (95% CI, 0.21 to 0.70; P = 0.002), comparing the third with the first tertile. Survival rates were higher in patients within the third tertile of butyrylcholinesterase than in patients within the first tertile at 30 days (68% versus 44%) as well as at 6 years (47% versus 21%).

Conclusions

The current study identified serum butyrylcholinesterase as a strong and independent inverse predictor of all-cause and cardiovascular mortality in patients undergoing venoarterial ECMO therapy after cardiovascular surgery. These findings advance the limited knowledge on risk stratification in this population and represent a valuable addition to comprehensive decision-making before ECMO implantation.

7.

Introduction

Rewarming from deep hypothermic circulatory arrest (DHCA) produces calcium desensitization through troponin I (cTnI) phosphorylation, which results in myocardial dysfunction. This study investigated the acute overall hemodynamic and metabolic effects of epinephrine and levosimendan, a calcium sensitizer, on myocardial function after rewarming from DHCA.

Methods

Forty male Wistar rats (400 to 500 g) underwent cardiopulmonary bypass (CPB) through central cannulation and were cooled to a core temperature of 13°C to 15°C within 30 minutes. After DHCA (20 minutes) and CPB-assisted rewarming (60 minutes), rats were randomly assigned to a 60-minute intravenous infusion of levosimendan (0.2 μg/kg/min; n = 15), epinephrine (0.1 μg/kg/min; n = 15), or saline (control; n = 10). Systolic and diastolic function was evaluated at different preloads with a conductance catheter.

Results

The slope of the left ventricular end-systolic pressure-volume relationship (Ees) and preload recruitable stroke work (PRSW) recovered significantly better with levosimendan than with epinephrine (Ees: 85 ± 9% vs 51 ± 11%, P < 0.003; PRSW: 78 ± 5% vs 48 ± 8%, P < 0.005; baseline, 100%). Levosimendan, but not epinephrine, reduced left ventricular stiffness, as shown by the end-diastolic pressure-volume relationship, and improved ventricular relaxation (Tau). Levosimendan preserved myocardial ATP content as well as energy charge and reduced plasma lactate concentrations. In normothermia experiments, epinephrine, in contrast to levosimendan, increased cTnI phosphorylation 3.5-fold. After rewarming from DHCA, cTnI phosphorylation increased 4.5-fold in the saline and epinephrine groups compared with normothermia but remained unchanged with levosimendan.

Conclusions

By preventing calcium desensitization through cTnI phosphorylation, levosimendan is more effective than epinephrine for the treatment of myocardial dysfunction after rewarming from DHCA.

8.

Introduction

Low plasma selenium concentrations are frequent in critically ill patients. However, whether this is due to systemic inflammation, a deficient nutritional state or both is still not clear. We aimed to determine the factors associated with low plasma selenium in critically ill children while considering the inflammatory response and nutritional status.

Methods

A prospective study was conducted in 173 children (median age, 34 months) with systemic inflammatory response in whom plasma selenium concentrations were assessed 48 hours after admission and on the 5th day of ICU stay. The normal reference range was 0.58 to 1.6 μmol/L. The outcome variable was ‘low plasma selenium’, defined as a plasma selenium value below the distribution median during this period. The main explanatory variables were age, malnutrition, sepsis, C-reactive protein (CRP), and clinical severity scores. The data were analyzed using a binomial Generalized Estimating Equations (GEE) model, which accounts for the correlation between each child's admission and 5th-day responses (a minimal sketch follows).
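The sketch below shows the model structure the abstract names: a binomial GEE with an exchangeable working correlation linking the two measurements per patient. The data are synthetic stand-ins and the column names (patient_id, malnourished, crp, low_selenium) are hypothetical, not the authors' dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in data: two rows (admission, day 5) per patient.
rng = np.random.default_rng(0)
n = 173
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n), 2),
    "malnourished": np.repeat(rng.integers(0, 2, n), 2),
    "crp": rng.gamma(shape=2.0, scale=30.0, size=2 * n),
})
# Simulated outcome with a malnutrition-by-CRP interaction, as in the paper.
logit = (-1.0 + 1.2 * df.malnourished + 0.01 * df.crp
         - 0.008 * df.malnourished * df.crp)
df["low_selenium"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Binomial GEE; the exchangeable working correlation ties each child's
# admission and day-5 responses together.
model = smf.gee("low_selenium ~ malnourished * crp", groups="patient_id",
                data=df, family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())  # odds ratios are exp(coef)
```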

Results

Malnutrition and CRP were associated with low plasma selenium, and the interaction between these two variables was significant. When CRP values were less than or equal to 40 mg/L, malnutrition was associated with low plasma selenium (odds ratio (OR) = 3.25, 95% confidence interval (CI) 1.39 to 7.63, P = 0.007; OR = 2.98, 95% CI 1.26 to 7.06, P = 0.013; and OR = 2.49, 95% CI 1.01 to 6.17, P = 0.049, for CRP = 10, 20 and 40 mg/L, respectively). This effect decreased as CRP concentrations increased, and significance was lost when CRP values were >40 mg/L. Similarly, the effect of CRP on low plasma selenium was significant for well-nourished patients (OR = 1.13; 95% CI 1.06 to 1.22, P < 0.001) but not for the malnourished (OR = 1.03; 95% CI 0.99 to 1.08, P = 0.16).

Conclusions

There is a significant interaction between the magnitude of the inflammatory response and malnutrition on low plasma selenium. This interaction should be considered when interpreting plasma concentrations as an index of selenium status in patients with systemic inflammation, as well as in decisions on selenium supplementation.

9.

Introduction

Indications for renal replacement therapy (RRT) have not been generally standardized and vary among intensive care units (ICUs). We aimed to assess the proportion, indications, and modality of RRT, as well as the association between the proportion of RRT use and 90-day mortality in patients with septic shock in Finnish adult ICUs.

Methods

We identified patients with septic shock from the prospective observational multicenter FINNAKI study conducted between 1 September 2011 and 1 February 2012. We divided the ICUs into high-RRT and low-RRT ICUs according to the median proportion of RRT-treated patients with septic shock. Differences in the indications for and modality of RRT between ICU groups were assessed. Finally, we performed an adjusted logistic regression analysis to evaluate the possible association of ICU group (high- vs. low-RRT) with 90-day mortality.

Results

Of the 726 patients with septic shock, 131 (18.0%, 95% CI 15.2 to 20.9%) were treated with RRT. The proportion of RRT-treated patients varied from 3% to 36% (median, 19%) among ICUs. The high-RRT group comprised nine ICUs (354 patients) and the low-RRT group eight ICUs (372 patients). In the high-RRT ICUs, patients with septic shock were older (P = 0.04), had more cardiovascular (P < 0.001) and renal (P = 0.003) failure on the first day in the ICU, were more often mechanically ventilated, and received higher maximum doses of norepinephrine (0.25 vs. 0.18 μg/kg/min, P < 0.001) than in the low-RRT ICUs. No significant differences in the indications for or modality of RRT existed between the ICU groups. The crude 90-day mortality rate for patients with septic shock was 36.2% (95% CI 31.1 to 41.3%) in the high-RRT ICUs compared with 33.9% (95% CI 29.0 to 38.8%) in the low-RRT ICUs, P = 0.5. In the adjusted logistic regression analysis, ICU group (high- or low-RRT) was not associated with 90-day mortality.

Conclusions

Patients with septic shock in ICUs with a high proportion of RRT use had more severe organ dysfunction and received more organ-supportive treatments. Importantly, ICU group (high- or low-RRT) was not associated with 90-day mortality.

10.

Introduction

Several methods have been proposed to evaluate neurological outcome in out-of-hospital cardiac arrest (OHCA) patients. Blood lactate has been recognized as a reliable prognostic marker for trauma, sepsis, or cardiac arrest. The objective of this study was to examine the association between initial lactate level or lactate clearance and neurologic outcome in OHCA survivors who were treated with therapeutic hypothermia.

Methods

This retrospective cohort study included patients who underwent protocol-based 24-hour therapeutic hypothermia after OHCA between January 2010 and March 2012. Serum lactate levels were measured at the start of therapy (0 hours) and after 6, 12, 24, 48 and 72 hours. The 6-hour and 12-hour lactate clearances were then calculated (see the sketch below). Neurologic outcome was assessed one month after cardiac arrest; a good neurological outcome was defined as a Cerebral Performance Category of 1 or 2. The primary outcome was the association between initial lactate level and good neurologic outcome. The secondary outcome was the association between lactate clearance and good neurologic outcome in patients with an initial lactate level >2.5 mmol/l.
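The sketch below implements the conventional percent-clearance formula, (initial − later) / initial × 100, assumed to be the calculation the authors used. The example plugs in the group means reported in the Results; because a mean of per-patient ratios differs from a ratio of means, the output only approximates the reported mean clearance.

```python
def lactate_clearance(initial_mmol_l: float, later_mmol_l: float) -> float:
    """Percent lactate clearance between the initial sample and a later one:
    (initial - later) / initial * 100."""
    return (initial_mmol_l - later_mmol_l) / initial_mmol_l * 100.0

# Good-outcome group means at 0 h and 6 h from the Results:
print(lactate_clearance(6.07, 3.81))  # ~37%, near the reported 35.3% mean
```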

Results

Of the 76 patients enrolled, 34 (44.7%) had a good neurologic outcome. The initial lactate level did not differ significantly between the good and poor neurologic outcome groups (6.07 ± 4.09 vs 7.13 ± 3.99 mmol/L, P = 0.42). However, lactate levels at 6, 12, 24, and 48 hours were lower in the good neurologic outcome group than in the poor outcome group (3.81 ± 2.81 vs 6.00 ± 3.22, P < 0.01; 2.95 ± 2.07 vs 5.00 ± 3.49, P < 0.01; 2.17 ± 1.24 vs 3.86 ± 3.92, P < 0.01; and 1.57 ± 1.02 vs 2.21 ± 1.35, P = 0.03, respectively). The secondary analysis showed that the 6-hour and 12-hour lactate clearances were higher in patients with a good neurologic outcome (35.3 ± 34.6% vs 6.89 ± 47.4%, P = 0.01, and 54.5 ± 23.7% vs 25.6 ± 43.7%, P < 0.01, respectively). After adjusting for potential confounding variables, the 12-hour lactate clearance remained statistically significant (P = 0.02).

Conclusion

The lactate clearance rate, and not the initial lactate level, was associated with neurological outcome in OHCA patients after therapeutic hypothermia.

11.

Background

Diffuse myocardial fibrosis (DMF) is important in cardiovascular disease; however, until recently it could only be assessed by invasive biopsy. We hypothesised that DMF measured by T1 mapping is elevated in isolated systemic hypertension.

Methods

In a study of well-controlled hypertensive patients from a specialist tertiary centre, 46 hypertensive patients (median age, 56 years; range, 21 to 78; 52% male) and 50 healthy volunteers (median age, 45 years; range, 28 to 69; 52% male) underwent clinical CMR at 1.5 T with T1 mapping (ShMOLLI), using the equilibrium contrast technique for extracellular volume (ECV) quantification (the standard relationship is sketched below). Patients underwent 24-hour ambulatory blood pressure monitoring (ABPM), echocardiographic assessment of diastolic function, aortic stiffness assessment, and measurement of NT-pro-BNP and collagen biomarkers.
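For readers less familiar with ECV quantification, the sketch below implements the standard partition-coefficient relationship, ECV = (1 − Hct) × ΔR1_myo / ΔR1_blood with R1 = 1/T1, which underlies the equilibrium contrast technique. The numbers in the example are illustrative, not patient data from this study.

```python
def ecv_fraction(t1_myo_pre_ms, t1_myo_post_ms,
                 t1_blood_pre_ms, t1_blood_post_ms, hematocrit):
    """Extracellular volume fraction from pre-/post-contrast T1 (ms) and
    hematocrit: ECV = (1 - Hct) * dR1_myo / dR1_blood, where R1 = 1/T1."""
    d_r1_myo = 1.0 / t1_myo_post_ms - 1.0 / t1_myo_pre_ms
    d_r1_blood = 1.0 / t1_blood_post_ms - 1.0 / t1_blood_pre_ms
    return (1.0 - hematocrit) * d_r1_myo / d_r1_blood

# Illustrative values only:
print(round(ecv_fraction(960, 450, 1550, 320, 0.42), 3))  # -> 0.276
```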

Results

Late gadolinium enhancement (LGE) revealed significant unexpected underlying pathology in 6 of 46 patients (13%; myocardial infarction, n = 3; hypertrophic cardiomyopathy (HCM), n = 3); these were subsequently excluded. Limited, non-ischaemic LGE patterns were seen in 11 of the remaining 40 patients (28%). Hypertensives on therapy (mean, 2.2 agents) had a mean ABPM of 152/88 mmHg, but only 35% (14/40) had left ventricular hypertrophy (LVH; LV mass >90 g/m² in males, >78 g/m² in females). Native myocardial T1 was similar in hypertensives and controls (955 ± 30 ms versus 965 ± 38 ms, p = 0.16). The difference in ECV did not reach significance (0.26 ± 0.02 versus 0.27 ± 0.03, p = 0.06). In the subset with LVH, the ECV was significantly higher (0.28 ± 0.03 versus 0.26 ± 0.02, p < 0.001).

Conclusion

In well-controlled hypertensive patients, conventional CMR discovered significant underlying disease (chronic infarction, HCM) not detected by echocardiography previously or even during this study. T1 mapping revealed increased diffuse myocardial fibrosis, but the increases were small and occurred only with LVH.

12.

Introduction

Acute renal failure (ARF) requiring renal replacement therapy (RRT) occurs frequently in ICU patients and significantly affects mortality rates. Few large clinical trials have previously investigated the impact of RRT modality on patient outcomes. Here we investigated the effect of two major RRT strategies, intermittent hemodialysis (IHD) and continuous veno-venous hemofiltration (CVVH), on mortality and renal-related outcome measures.

Methods

This single-center prospective randomized controlled trial (“CONVINT”) included 252 critically ill patients (159 male; mean age, 61.5 ± 13.9 years; Acute Physiology and Chronic Health Evaluation (APACHE) II score, 28.6 ± 8.8) with dialysis-dependent ARF treated in the ICUs of a tertiary care academic center. Patients were randomized to receive either daily IHD or CVVH. The primary outcome measure was survival at 14 days after the end of RRT. Secondary outcome measures included 30-day, ICU, and intrahospital mortality, as well as the course of disease severity/biomarkers and the need for organ-support therapy.

Results

At baseline, there were no differences in disease severity, age or sex distribution, or suspected causes of acute renal failure. Survival rates at 14 days after RRT were 39.5% (IHD) versus 43.9% (CVVH) (odds ratio (OR), 0.84; 95% confidence interval (CI), 0.49 to 1.41; P = 0.50). The 14-day, 30-day, and all-cause intrahospital mortality rates did not differ between the two groups (all P > 0.5). No differences were observed in days on RRT, vasopressor days, days on ventilator, or ICU/intrahospital length of stay.

Conclusions

In this single-center RCT, we observed no statistically significant differences between the investigated treatment modalities in mortality, renal-related outcome measures, or survival at 14 days after RRT. Our findings add to mounting evidence that intermittent and continuous RRT may be considered equivalent approaches for critically ill patients with dialysis-dependent acute renal failure.

Trial registration

NCT01228123, clinicaltrials.gov

13.

Background

In hypertrophic cardiomyopathy (HCM), autopsy studies have revealed increased focal as well as diffuse deposition of collagen fibers. Late gadolinium enhancement (LGE) imaging detects focal fibrosis but is unable to depict interstitial fibrosis. We hypothesized that T1 mapping, which is employed to determine the myocardial extracellular volume fraction (ECV), can detect diffuse interstitial fibrosis in HCM patients.

Methods

T1 mapping with a modified Look-Locker inversion recovery (MOLLI) pulse sequence was used to calculate ECV in patients with manifest HCM (n = 16) and in healthy controls (n = 14). ECV was determined in areas where focal fibrosis had been excluded by LGE.

Results

The HCM group as a whole showed no significant difference in mean ECV relative to controls (0.26 ± 0.03 vs 0.26 ± 0.02, p = 0.83). Moreover, ECV in LGE-positive HCM patients was comparable to that in LGE-negative HCM patients (0.27 ± 0.03 vs 0.25 ± 0.03, p = 0.12).

Conclusions

This study showed that HCM patients have a similar ECV (that is, a similar degree of interstitial fibrosis) in myocardium without LGE as healthy controls. The additional clinical value of T1 mapping in HCM therefore seems limited, but larger future studies are needed to establish the clinical and prognostic potential of this new technique in HCM.

14.

Background

Late gadolinium enhancement (LGE) is frequently identified in left ventricular non-compaction (LVNC); however, data on the features of this finding are limited. The purpose of the present study was to describe the frequency and distribution of LGE in patients meeting criteria for LVNC, as assessed by cardiovascular magnetic resonance (CMR).

Methods

Forty-seven patients (37 males and 10 females; mean age, 39 ± 18 years) considered to meet standard CMR criteria for LVNC were studied. LGE images were obtained 15 ± 5 min after the injection of 0.2 mmol/kg of gadolinium-DTPA using an inversion-recovery sequence and were analyzed using a 17-segment model.

Results

The mean number of non-compacted segments per patient was 7.4 ± 2.5, and the non-compacted-to-compacted (NC:C) myocardium ratio was 3.2 ± 0.7. Non-compaction was most commonly noted in the apical segments in all patients. LGE was present in 19 of the 47 patients (40%) and was most often located in the ventricular septum. Across the 90 LGE-positive segments, the distribution of LGE was subendocardial (n = 5; 6%), mid-myocardial (n = 61; 68%), subepicardial (n = 10; 11%), and transmural (n = 14; 15%).

Conclusions

In patients considered to meet criteria for LVNC, the visible LGE distributions were strikingly heterogeneous, with appearances potentially attributable to three or more distinct cardiomyopathic processes. This may be in keeping with previous suggestions that the criteria may be of low specificity. Further work is needed to determine whether conditions such as dilated cardiomyopathy, previous myocarditis or ischaemic heart disease increase the apparent depth of non-compacted relative to compacted myocardium.

15.

Background

Cardiovascular magnetic resonance (CMR) provides non-invasive and more accurate assessment of right ventricular (RV) function than echocardiography. A recent study demonstrated that echocardiographic assessment of RV function was an independent predictor of mortality in patients with interstitial lung disease (ILD). The purpose of this study was to determine the prognostic significance of CMR-derived RV ejection fraction (RVEF) in ILD patients.

Methods

We enrolled 76 patients with ILD and 24 controls. Using a 1.5 T CMR scanner equipped with a 32-channel cardiac coil, we performed steady-state free precession cine CMR to assess the RVEF. RV systolic dysfunction (RVSD) was defined as an RVEF ≤45.0% calculated from long-axis slices. Pulmonary hypertension (PH) was defined as a mean pulmonary artery pressure (mPAP) of more than 25 mmHg at rest in the setting of a pulmonary capillary wedge pressure ≤15 mmHg.

Results

The median RVEF was 59.2% in controls (n = 24), 53.8% in ILD patients without PH (n = 42), and 43.1% in ILD patients with PH (n = 13) (p < 0.001 by one-way ANOVA). During a mean follow-up of 386 days, the 18 patients with RVSD had 11 severe events (3 deaths, 3 episodes of right heart failure, 3 exacerbations of dyspnea requiring oxygen, and 2 pneumonias requiring hospitalization). In contrast, only 2 exacerbations of dyspnea requiring oxygen were observed among the 58 patients without RVSD. Multivariate Cox regression analysis showed that RVEF independently predicted future events after adjusting for age, sex and echocardiographic RV fractional area change (RVFAC) (hazard ratio, 0.889; 95% confidence interval, 0.809 to 0.976; p = 0.014).

Conclusions

The current study demonstrated that RVSD in ILD patients can be clearly detected by cine CMR. Importantly, the low prevalence of PH (17%) indicates that the population included many patients with mild ILD. CMR-derived RVEF might be useful for the risk stratification and clinical management of ILD patients.

16.

Background

We sought to identify cardiovascular magnetic resonance (CMR) parameters associated with successful univentricular to biventricular conversion in patients with small left hearts.

Methods

Patients with small left heart structures and a univentricular circulation who underwent CMR prior to biventricular conversion were retrospectively identified and divided into 2 anatomic groups: 1) borderline hypoplastic left heart structures (BHLHS), and 2) right-dominant atrioventricular canal (RDAVC). The primary outcome variable was transplant-free survival with a biventricular circulation.

Results

In the BHLHS group (n = 22), 16 patients (73%) survived with a biventricular circulation over a median follow-up of 40 months (range, 4 to 84). Survival was associated with a larger CMR left ventricular (LV) end-diastolic volume (EDV) (p = 0.001), a higher LV-to-right-ventricle (RV) stroke volume ratio (p < 0.001), and a higher mitral-to-tricuspid inflow ratio (p = 0.04). For predicting biventricular survival, the addition of CMR threshold values to echocardiographic LV EDV improved sensitivity from 75% to 93% while maintaining specificity at 100%. In the RDAVC group (n = 10), 9 patients (90%) survived with a biventricular circulation over a median follow-up of 29 months (range, 3 to 51). The minimum CMR values were an LV EDV of 22 ml/m² and an LV-to-RV stroke volume ratio of 0.19.

Conclusions

In BHLHS patients, a larger LV EDV, LV-to-RV stroke volume ratio, and mitral-to-tricuspid inflow ratio were associated with successful biventricular conversion, and the addition of CMR parameters to echocardiographic measurements improved the sensitivity for predicting successful conversion. In RDAVC patients, the high success rate precluded discriminant analysis, but a range of CMR parameters permitting biventricular conversion was identified.

17.

Introduction

Cefepime, a broad-spectrum antibiotic, is commonly prescribed in intensive care units (ICUs) and may be an overlooked cause of neurologic symptoms such as encephalopathy, myoclonus, seizures, and coma. We aimed to characterize cefepime neurotoxicity in the ICU.

Methods

We performed a retrospective study of adult ICU patients treated with intravenous cefepime for at least 3 days between January 1, 2009 and December 31, 2011. The primary outcome was the development of cefepime neurotoxicity, with the likelihood of causality ascribed via a modified Delphi method.

Results

This study included 100 patients with a mean age of 65.8 ± 12.7 years. The median average daily dose of cefepime was 2.5 g (IQR, 2.0 to 3.5), and the median treatment duration was 6 days (IQR, 4 to 10). Renal failure in any form was present in 84 patients: chronic kidney disease affected 40 patients, and 77 had acute kidney injury. Cefepime neurotoxicity occurred in 15 patients; of these, seven were considered definite cases, three probable, and five possible. Neurotoxic symptoms included impaired consciousness (n = 13), myoclonus (n = 11), disorientation (n = 6), and nonconvulsive status epilepticus (n = 1). The dose of cefepime was appropriately adjusted for renal clearance in 64 patients (75.3%) without cefepime neurotoxicity versus four patients (28.6%) with neurotoxicity (P = 0.001). Chronic kidney disease was present in 30 patients (35.3%) without neurotoxicity and in 10 (66.7%) of those with neurotoxicity (P = 0.04).

Conclusions

Critically ill patients with chronic kidney disease are particularly susceptible to cefepime neurotoxicity. Myoclonus and impaired consciousness are the predominant clinical manifestations. Neurotoxic symptoms occur more often when the cefepime dose is not adjusted for renal function, but can still occur despite those modifications.

18.

Background

Identifying the subset of females with Turner syndrome who face an especially high risk of aortic dissection is difficult, and better risk assessment is pivotal to improving outcomes. This study aimed to provide comprehensive, dynamic mathematical models of aortic disease in Turner syndrome using cardiovascular magnetic resonance (CMR).

Methods

A prospective framework of long-term aortic follow-up was used, comprising diameters of the thoracic aorta assessed at nine positions by CMR at three time points (baseline [n = 102, age 38 ± 11 years], follow-up [after 2.4 ± 0.4 years, n = 80] and end-of-study [after 4.8 ± 0.5 years, n = 78]). Mathematical models were created that cohesively integrated all measurements at all positions, from all visits and for all participants; using these models, cohesive risk factor analyses were conducted, based on which predictive modelling was performed (a sketch of this type of model follows).
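The abstract does not give the authors' model equations, but a longitudinal model that "cohesively integrates" repeated measurements per participant is typically a mixed-effects regression. The sketch below is one plausible form under that assumption, with synthetic stand-in data and hypothetical column names mirroring the risk factors the abstract lists; it is not the authors' model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in: one row per diameter measurement (patient x position
# x visit). Columns are hypothetical placeholders, not the study dataset.
rng = np.random.default_rng(1)
n_pat, n_pos, n_vis = 102, 9, 3
rows = n_pos * n_vis
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n_pat), rows),
    "position": np.tile(np.repeat(np.arange(n_pos), n_vis), n_pat),
    "age": np.repeat(rng.normal(38, 11, n_pat), rows),
    "bsa": np.repeat(rng.normal(1.7, 0.2, n_pat), rows),
    "coarctation": np.repeat(rng.integers(0, 2, n_pat), rows),
})
df["diameter_mm"] = (20 + 0.1 * df.age + 3 * df.coarctation
                     + 2 * df.bsa + rng.normal(0, 1.5, len(df)))

# Linear mixed model: fixed effects for risk factors and aortic position,
# random intercept per participant to tie repeated measurements together.
model = smf.mixedlm("diameter_mm ~ age + bsa + coarctation + C(position)",
                    data=df, groups="patient_id")
print(model.fit().summary())
```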

Results

The cohesive models showed that the variables with an effect on aortic diameter were aortic coarctation (P < 0.0001), bicuspid aortic valve (P < 0.0001), age (P < 0.0001), diastolic blood pressure (P = 0.0008), body surface area (P = 0.015) and antihypertensive treatment (P = 0.005). Oestrogen replacement therapy had an effect of borderline significance (P = 0.08). From these data, mathematical models were created that enabled prediction of aortic dilation from CMR-derived aortic diameters in scenarios both with and without known risk factors. The fit of the models to the actual data was good.

Conclusion

The presented cohesive model for prediction of aortic diameter in Turner syndrome could help identify females with rapid growth of aortic diameter and may enhance clinical decision-making based on serial CMR.

19.

Background

Quantitative cardiovascular magnetic resonance (CMR) techniques have attracted considerable interest in CMR research. Myocardial T2 mapping is thought to be helpful in the diagnosis of acute myocardial conditions associated with myocardial edema. In this study we aimed to establish a technique for myocardial T2 mapping based on gradient-spin-echo (GraSE) imaging.

Methods

The local ethics committee approved this prospective study, and written informed consent was obtained from all subjects prior to CMR. A modified GraSE sequence allowing myocardial T2 mapping in a single breath-hold per slice, using ECG-triggered acquisition of a black-blood multi-echo series, was developed at 1.5 Tesla. The myocardial T2 relaxation time (T2-RT) was determined by maximum likelihood estimation from the magnitude phased-array multi-echo data (a simplified fitting sketch follows). Four GraSE sequence variants with varying numbers of acquired echoes and resolutions were evaluated in vitro and in 20 healthy volunteers. Inter-study reproducibility was assessed in a subset of five volunteers. The sequence with the best overall performance was further evaluated by assessment of intra- and inter-observer agreement in all volunteers, and then implemented into the clinical CMR protocol of five patients with acute myocardial injury (myocarditis, takotsubo cardiomyopathy and myocardial infarction).
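The sketch below illustrates the core of T2 mapping: fitting the mono-exponential decay S(TE) = S0 · exp(−TE/T2) to a multi-echo series. For simplicity it uses a log-linear least-squares fit rather than the paper's maximum likelihood estimator, which additionally models the noise of magnitude phased-array data; echo times and signal values are synthetic.

```python
import numpy as np

def fit_t2_ms(echo_times_ms, signal):
    """Pixel-wise T2 estimate from a multi-echo series via a log-linear
    least-squares fit of S(TE) = S0 * exp(-TE / T2). A simplified stand-in
    for the paper's maximum likelihood estimator."""
    te = np.asarray(echo_times_ms, dtype=float)
    s = np.asarray(signal, dtype=float)
    slope, _intercept = np.polyfit(te, np.log(s), 1)  # log S = log S0 - TE/T2
    return -1.0 / slope

# Synthetic 6-echo example with T2 = 52 ms (about the healthy mean reported):
te = np.array([10, 25, 40, 55, 70, 85], dtype=float)
print(fit_t2_ms(te, 1000 * np.exp(-te / 52.0)))  # -> 52.0
```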

Results

In-vitro studies revealed the need for well-defined sequence settings to obtain accurate T2-RT measurements with GraSE. An optimized 6-echo GraSE sequence yielded excellent agreement with the gold-standard Carr-Purcell-Meiboom-Gill sequence. The global myocardial T2 relaxation time in healthy volunteers was 52.2 ± 2.0 ms (mean ± standard deviation). The mean difference between repeated examinations (n = 5) was −0.02 ms, with 95% limits of agreement (LoA) of [−4.7; 4.7] ms. Intra-reader and inter-reader agreement was excellent, with mean differences of −0.1 ms (95% LoA, [−1.3; 1.2] ms) and 0.1 ms (95% LoA, [−1.5; 1.6] ms), respectively (n = 20). In patients with acute myocardial injury, global myocardial T2-RTs were prolonged (mean, 61.3 ± 6.7 ms).

Conclusion

Using an optimized GraSE sequence, CMR allows robust, reliable and fast myocardial T2 mapping and quantitative tissue characterization. Clinically, GraSE-based T2 mapping has the potential to complement qualitative CMR in patients with acute myocardial injury.

Electronic supplementary material

The online version of this article (doi:10.1186/s12968-015-0127-z) contains supplementary material, which is available to authorized users.

20.

Introduction

Septic shock is a major cause of morbidity and mortality throughout the world. Unfortunately, the optimal fluid management of septic shock is unknown, and current practice is empirical.

Methods

A retrospective analysis was performed at Barnes-Jewish Hospital (St. Louis, Missouri). Consecutive patients (n = 325) hospitalized with septic shock who had echocardiographic examinations performed within 24 hours of shock onset were enrolled.

Results

A total of 163 patients (50.2%) with septic shock died during hospitalization. Non-survivors had a significantly larger positive net fluid balance within the 24-hour window after septic shock onset (median (IQR): 4,374 ml (1,637 to 7,260 ml) vs. 2,959 ml (1,639.5 to 4,769.5 ml), P = 0.004). The greatest quartiles of positive net fluid balance at 24 hours and at eight days post-shock onset both predicted hospital mortality, and the greatest quartile at eight days was an independent predictor of hospital mortality (adjusted odds ratio (AOR), 1.66; 95% CI, 1.39 to 1.98; P = 0.004). Survivors were significantly more likely to have mild left ventricular dysfunction as evaluated by bedside echocardiography, and non-survivors had a slightly elevated left ventricular ejection fraction, which was also an independent predictor of outcome.

Conclusions

Our data confirm the importance of fluid balance and cardiac function as outcome predictors in patients with septic shock. A clinical trial to determine the optimal administration of intravenous fluids to patients with septic shock is needed.
