Similar Articles
20 similar articles retrieved.
1.

Purpose

To evaluate the utility of contrast-enhanced sonography in the study of pediatric liver transplant recipients and its potential impact in reducing the need for invasive diagnostic procedures.

Materials and methods

From October 2002 to December 2003 we performed routine color Doppler ultrasound and contrast-enhanced ultrasound studies on 30 pediatric patients who had undergone liver transplantation. Findings indicative of complications were confirmed with invasive studies (angiography, computed tomography, and PTC).

Results

Contrast-enhanced sonography correctly identified four of the five cases of hepatic artery thrombosis and all those involving the portal (n = 6) and hepatic vein (n = 3) thrombosis. It failed to identify one case of hepatic artery thrombosis characterized by collateral circulation arising from the phrenic artery and the single case of hepatic artery stenosis. The latter was more evident on color Doppler, which revealed a typical tardus parvus waveform. The use of contrast offered no significant advantages in the study of biliary complications although it did provide better visualization of bile leaks.

Conclusions

Contrast-enhanced sonography improves diagnostic confidence and reduces the need for more invasive imaging studies in the postoperative follow-up of pediatric liver transplant recipients.

2.

Introduction

Liver dysfunction can result from severe sepsis and may be associated with poor prognosis. However, diagnosis of septic liver dysfunction is challenging due to a lack of appropriate tests. Measurement of maximal liver function capacity (LiMAx test) has been successfully evaluated as a new diagnostic test in liver resection and transplantation. The aim of this study was to evaluate the LiMAx test during sepsis in comparison to biochemical tests and the indocyanine green plasma disappearance rate (ICG-PDR).

Methods

We prospectively investigated 28 patients (8 female and 20 male, age range 35 to 80 years) suffering from sepsis in a surgical ICU. All patients received routine resuscitation from septic shock (surgery, fluids, catecholamines, antibiotic drugs). The first LiMAx test and ICG-PDR were carried out within the first 24 hours after onset of septic symptoms, followed by measurements on days 2, 5 and 10. Other biochemical parameters and scores determining the severity of illness were measured daily. Clinical outcome parameters were examined after 90 days or at the end of treatment. For analysis, the population was divided into two groups (group A: non-survivors or ICU length of stay (ICU-LOS) >30 days; group B: survivors with ICU-LOS <30 days).

Results

Epidemiological baseline characteristics of both groups were similar. Group A patients had significantly lower LiMAx and ICG-PDR values than patients in group B. Determination of ICG-PDR by finger probe failed in 14.3% of tests due to insufficient peripheral pulses. Respiratory, renal and hepatic dysfunction (LiMAx and ICG-PDR) were associated with prolonged ICU-LOS. Only LiMAx <100 μg/kg/h and respiratory dysfunction were associated with increased mortality. For LiMAx <100 μg/kg/h, receiver operating characteristic analysis revealed 100% sensitivity and 77% specificity for death.

Conclusions

Sepsis-related hepatic dysfunction can be diagnosed early and effectively by the LiMAx test. The extent of LiMAx impairment is predictive of patient morbidity and mortality. The sensitivity and specificity of the LiMAx test were superior to those of ICG-PDR for the prediction of mortality.

3.

Background and purpose

Sonoelastography (SE) is a new technique that can assess differences in tissue stiffness. This study investigated the performance of SE for detecting supraspinatus (SSP) tendon alterations in tendinopathy, compared with magnetic resonance imaging (MRI) and conventional ultrasonography (US).

Methods

One hundred and eighteen consecutively registered patients with symptoms and MRI findings of SSP tendinopathy were assessed with US and SE. Coronal images of the SSP tendon were obtained using US and SE. Increased signal intensity on T2-weighted images in the coronal plane was graded according to the extent of the signal changes from ventral to dorsal. SE images were evaluated by reviewers using an experimentally proven color grading system.

Results

Using SE, 7.6 % of the SSP tendons were categorized as grade 0, 30.5 % as grade 1, 19.5 % as grade 2, and 42.4 % as grade 3. Evaluation of the interobserver reliability of the SE findings showed “almost perfect agreement”, with a weighted kappa coefficient of 0.83. MRI and SE grades were positively correlated (r = 0.829, p < 0.001), as were US and SE grades (r = 0.723, p < 0.001).

Conclusions

SE is valuable in the detection of the intratendinous and peritendinous alterations of the SSP tendon and has excellent interobserver reliability and excellent correlation with MRI findings and conventional ultrasonography findings.

4.

Abstract

Cardiorenal syndrome type 1 (CRS-1) is acute kidney dysfunction caused by an acute worsening of cardiac function. CRS-1 is the consequence of renal vasoconstriction secondary to renin–angiotensin system (RAS) activation. No animal models of CRS-1 have been described in the literature.

Purpose

To characterize a murine model of CRS-1 by using a high-resolution ultrasound echo-color Doppler system (VEVO2100).

Materials

Post-ischemic heart failure was induced by ligation of the left anterior descending coronary artery (LAD) in seven CD1 mice. Fifteen and thirty days after surgery, mice underwent cardiac and renal echo-color Doppler. Serum creatinine and plasma renin activity were measured after the animals were killed. Animals were compared to seven CD1 control mice.

Results

Heart failure with left ventricle dilatation (end-diastolic area, p < 0.05 vs. controls) and significantly reduced ejection fraction (EF; p < 0.01 vs. controls) was evident 15 days after LAD. We measured significant renal vasoconstriction in infarcted mice, characterized by an increased renal pulsatility index (PI; p < 0.05 vs. controls) associated with increased creatinine and renin levels (p < 0.05 vs. controls).

Conclusions

The murine LAD model is a good model of CRS-1, evaluable by Doppler sonography and characterized by renal vasoconstriction due to activation of the renin–angiotensin system secondary to heart failure.

5.

Objective

The purpose of this study was to evaluate sonoelastography (SE) in the assessment of the long head of the biceps tendon (LHBT) in patients with symptoms of biceps tendinitis or tendinosis and in patients without biceps lesions. The findings were compared with those obtained at clinical examination and with ultrasonography (US).

Materials and methods

Thirty-six shoulders of 34 consecutively registered patients with clinical symptoms and US findings of biceps tendinitis or tendinosis, and 114 shoulders of 98 patients without biceps lesions, were assessed with SE. Transverse and longitudinal images of the LHBT were obtained using SE. SE images were evaluated by reviewers using an experimentally proven color grading system.

Results

The transverse SE images showed a mean sensitivity of 69.4 %, a mean specificity of 95.6 % and a mean accuracy of 89.3 %; good correlation with conventional ultrasound findings was found (p < 0.001, r = 0.763). The longitudinal SE images showed a mean sensitivity of 94.4 %, a mean specificity of 92.1 % and a mean accuracy of 92.7 %; good correlation with conventional ultrasound findings was found (p < 0.001, r = 0.585). Inter-observer reliability of SE showed “almost perfect agreement”, with a weighted kappa coefficient of 0.84.

Conclusions

SE has potential to be clinically useful in the detection of the intratendinous and peritendinous alterations of the LHBT and has excellent accuracy and excellent correlation with conventional ultrasound findings.

6.

Background

Liver cirrhosis has been shown to affect cardiac performance. However, cardiac dysfunction may only be revealed under stress conditions. The value of non-invasive stress tests in diagnosing cirrhotic cardiomyopathy is unclear. We sought to investigate the response to pharmacological stimulation with dobutamine in patients with cirrhosis using cardiovascular magnetic resonance.

Methods

Thirty-six patients and eight controls were scanned using a 1.5 T scanner (Siemens Symphony TIM; Siemens, Erlangen, Germany). Conventional volumetric and feature tracking analysis using dedicated software (CMR42; Circle Cardiovascular Imaging Inc, Calgary, Canada and Diogenes MRI; Tomtec; Germany, respectively) were performed at rest and during low to intermediate dose dobutamine stress.

Results

Whilst volumetry-based parameters were similar between patients and controls at rest, patients had a smaller increase in cardiac output during stress (p = 0.015). The increase in ejection fraction was impaired in patients during 10 μg/kg/min dobutamine compared to controls (6.9 % vs. 16.5 %, p = 0.007), but not with 20 μg/kg/min (12.1 % vs. 17.6 %, p = 0.12). This was paralleled by an impaired improvement in circumferential strain with low-dose (median increase of 14.4 % vs. 30.9 %, p = 0.03), but not with intermediate-dose dobutamine (median increase of 29.4 % vs. 33.9 %, p = 0.54). Longitudinal strain increase was impaired in patients compared to controls during both low-dose (median increase of 6.6 % vs. 28.6 %, p < 0.001) and intermediate-dose dobutamine (median increase of 2.6 % vs. 12.6 %, p = 0.016). Radial strain response to dobutamine was similar in patients and controls (p > 0.05).

Conclusion

Cirrhotic cardiomyopathy is characterized by an impaired cardiac pharmacological response that can be detected with magnetic resonance myocardial stress testing. Deformation analysis parameters may be more sensitive in identifying abnormalities in inotropic response to stress than conventional methods.

7.

Introduction

Current international sepsis guidelines recommend low-dose enteral nutrition (EN) for the first week. This contradicts other nutrition guidelines for heterogeneous groups of ICU patients. Data on the optimal dose of EN in septic patients are lacking. Our aim was to evaluate the effect of the amount of energy and protein given by EN on clinical outcomes in a large cohort of critically ill septic patients.

Methods

We conducted a secondary analysis of pooled data collected prospectively from international nutrition studies. Eligible patients had a diagnosis of sepsis and/or pneumonia and were admitted to the ICU for ≥3 days, mechanically ventilated within 48 hours of ICU admission and only receiving EN. Patients receiving parenteral nutrition were excluded. Data were collected from ICU admission up to a maximum of 12 days. Regression models were used to examine the impact of calorie and protein intake on 60-day mortality and ventilator-free days.

Results

Of the 13,630 patients included in the dataset, 2,270 met the study inclusion criteria. Patients received a mean amount of 1,057 kcal/d (14.5 kcal/kg/day) and 49 g protein/day (0.7 g/kg/d) by EN alone. Patients were mechanically ventilated for a median of 8.4 days and 60-day mortality was 30.5%. An increase of 1,000 kcal was associated with reduced 60-day mortality (odds ratio (OR) 0.61; 95% confidence interval (CI) 0.48 to 0.77, P <0.001) and more ventilator-free days (2.81 days, 95% CI 0.53 to 5.08, P = 0.02) as was an increase of 30 g protein per day (OR 0.76; 95% CI 0.65 to 0.87, P <0.001 and 1.92 days, 95% CI 0.58 to 3.27, P = 0.005, respectively).

Conclusions

In critically ill septic patients, calorie and protein delivery by EN closer to recommended amounts in the early phase of the ICU stay was associated with a more favorable outcome.

8.

OBJECTIVE

Nocturnal hypoglycemia can cause seizures and is a major impediment to tight glycemic control, especially in young children with type 1 diabetes. We conducted an in-home randomized trial to assess the efficacy and safety of a continuous glucose monitor–based overnight predictive low-glucose suspend (PLGS) system.

RESEARCH DESIGN AND METHODS

In two age-groups of children with type 1 diabetes (11–14 and 4–10 years of age), a 42-night trial for each child was conducted wherein each night was assigned randomly to either having the PLGS system active (intervention night) or inactive (control night). The primary outcome was percent time <70 mg/dL overnight.

RESULTS

Median time at <70 mg/dL was reduced by 54% from 10.1% on control nights to 4.6% on intervention nights (P < 0.001) in 11–14-year-olds (n = 45) and by 50% from 6.2% to 3.1% (P < 0.001) in 4–10-year-olds (n = 36). Mean overnight glucose was lower on control versus intervention nights in both age-groups (144 ± 18 vs. 152 ± 19 mg/dL [P < 0.001] and 153 ± 14 vs. 160 ± 16 mg/dL [P = 0.004], respectively). Mean morning blood glucose was 159 ± 29 vs. 176 ± 28 mg/dL (P < 0.001) in the 11–14-year-olds and 154 ± 25 vs. 158 ± 22 mg/dL (P = 0.11) in the 4–10-year-olds, respectively. No differences were found between intervention and control in either age-group in morning blood ketosis.

CONCLUSIONS

In 4–14-year-olds, use of a nocturnal PLGS system can substantially reduce overnight hypoglycemia without an increase in morning ketosis, although overnight mean glucose is slightly higher.

9.

Introduction

Current sepsis guidelines recommend antimicrobial treatment (AT) within one hour after onset of sepsis-related organ dysfunction (OD) and surgical source control within 12 hours. The objective of this study was to explore the association between initial infection management according to sepsis treatment recommendations and patient outcome.

Methods

In a prospective observational multi-center cohort study in 44 German ICUs, we studied 1,011 patients with severe sepsis or septic shock regarding times to AT, source control, and adequacy of AT. Primary outcome was 28-day mortality.

Results

Median time to AT was 2.1 hours (IQR 0.8 to 6.0) and median time to surgical source control was 3 hours (IQR -0.1 to 13.7). Only 370 (36.6%) patients received AT within one hour after OD, in compliance with the recommendation. Among 422 patients receiving surgical or interventional source control, those who received source control later than 6 hours after onset of OD had a significantly higher 28-day mortality than patients with earlier source control (42.9% versus 26.7%, P < 0.001). Time to AT was significantly longer in ICU and hospital non-survivors; no linear relationship was found between time to AT and 28-day mortality. Regardless of timing, the 28-day mortality rate was lower in patients with adequate than with non-adequate AT (30.3% versus 40.9%, P < 0.001).

Conclusions

A delay in source control beyond 6 hours may have a major impact on patient mortality. Adequate AT is associated with improved patient outcome, but compliance with guideline recommendations requires improvement. There was only indirect evidence about the impact of timing of AT on sepsis mortality.

10.

Introduction

Acute kidney injury (AKI) following acute myocardial infarction (AMI) is associated with unfavorable prognosis. Endothelial activation and injury were found to play a critical role in the development of both AKI and AMI. This pilot study aimed to determine whether the plasma markers of endothelial injury and activation could serve as independent predictors for AKI in patients with AMI.

Methods

This prospective study was conducted from March 2010 to July 2012 and enrolled 132 consecutive patients with AMI receiving percutaneous coronary intervention (PCI). Plasma levels of thrombomodulin (TM), von Willebrand factor (vWF), angiopoietin (Ang)-1, Ang-2, Tie-2, and vascular endothelial growth factor (VEGF) were measured on day 1 of AMI. AKI was defined as an elevation of serum creatinine of more than 0.3 mg/dL within 48 hours.

Results

In total, 13 of 132 (9.8%) patients with AMI developed AKI within 48 hours. Compared with patients without AKI, patients with AKI had increased plasma levels of Ang-2 (6338.28 ± 5862.77 versus 2412.03 ± 1256.58 pg/mL, P = 0.033) and TM (7.6 ± 2.26 versus 5.34 ± 2.0 ng/mL, P < 0.001), and a lower estimated glomerular filtration rate (eGFR) (46.5 ± 20.2 versus 92.5 ± 25.5 mL/min/1.73 m2, P < 0.001). Furthermore, the areas under the receiver operating characteristic curves demonstrated that plasma TM and Ang-2 levels on day 1 of AMI had modest discriminative power for predicting AKI development following AMI (0.796, P < 0.001 and 0.833, P < 0.001, respectively).

Conclusions

Endothelial activation, quantified by plasma levels of TM and Ang-2, may play an important role in the development of AKI in patients with AMI.

11.

Background

Ectopic accumulation of fat accompanies visceral obesity with detrimental effects. Lipid oversupply to cardiomyocytes leads to cardiac steatosis, and in animal studies lipotoxicity has been associated with impaired left ventricular (LV) function. In humans, studies have yielded inconclusive results. The aim of the study was to evaluate the role of epicardial, pericardial and myocardial fat depots on LV structure and function in male subjects with metabolic syndrome (MetS).

Methods

A study population of 37 men with MetS and 38 men without MetS underwent cardiovascular magnetic resonance and proton magnetic resonance spectroscopy at 1.5 T to assess LV function, epicardial and pericardial fat area, and myocardial triglyceride (TG) content.

Results

All three fat deposits were greater in the MetS than in the control group (p <0.001). LV diastolic dysfunction was associated with MetS as measured by absolute (471 mL/s vs. 667 mL/s, p = 0.002) and normalized (3.37 s⁻¹ vs. 3.75 s⁻¹, p = 0.02) LV early diastolic peak filling rate and the ratio of early diastole (68% vs. 78%, p = 0.001). The amount of epicardial and pericardial fat correlated inversely with LV diastolic function. However, myocardial TG content was not independently associated with LV diastolic dysfunction.

Conclusions

In MetS, accumulation of epicardial and pericardial fat is linked to the severity of structural and functional alterations of the heart. The role of increased intramyocardial TG in MetS is more complex and merits further study.

12.

Objective

Sonoelastography (SE) is one of the new functional ultrasound imaging techniques developed in recent years; it can map the distribution of elasticity in tissues. Using magnetic resonance imaging (MRI) as the standard of reference, the purpose of this study was to evaluate the ability of SE to assess fatty degeneration of the supraspinatus (SSP) and to compare it with MRI and conventional ultrasonography (US) findings.

Materials and methods

The institutional review board approved the study, and a retrospective analysis between January 2013 and September 2013 was performed on 101 shoulders of 98 consecutive patients using MRI, US, and SE for the evaluation of shoulder lesions. Oblique sagittal images of the SSP were obtained using SE. The SE images were evaluated by reviewers using an experimentally proven color grading system.

Results

When comparing SE to standard MRI findings, the mean sensitivity of SE was 95.6 %, the specificity 87.5 %, and the accuracy 91.1 %. The interobserver reliability of the SE findings was “almost perfect agreement”, with a weighted kappa coefficient of 0.81. MRI and SE grades had a positive correlation (r = 0.855, P < 0.001), as did US and SE grades (r = 0.793, P < 0.001).

Conclusion

SE is valuable in the quantitative assessment of the severity of fatty atrophy of the supraspinatus and has excellent accuracy, excellent correlation with MRI and conventional US, and excellent interobserver reliability.

13.

OBJECTIVE

We used fast-gradient magnetic resonance imaging (MRI) to determine the longitudinal associations between the hepatic fat content (HFF), glucose homeostasis, and a biomarker of hepatocellular apoptosis in obese youth.

RESEARCH DESIGN AND METHODS

Baseline and longitudinal liver and abdominal MRI were performed with an oral glucose tolerance test in 76 obese youth followed for an average of 1.9 years. Cytokeratin-18 (CK-18) was measured at baseline and follow-up as a biomarker of hepatic apoptosis. The relationship between baseline HFF and metabolic parameters and circulating levels of CK-18 at follow-up were assessed using a bivariate correlation.

RESULTS

At baseline, 38% had hepatic steatosis based on %HFF ≥5.5%, with alterations in indices of insulin sensitivity and secretion. At follow-up, BMI increased in both groups, and baseline %HFF correlated strongly with follow-up %HFF (r = 0.81, P < 0.001). Over time, markers of insulin sensitivity and 2-h glucose improved significantly in the group without fatty liver, in contrast with the persistence of insulin resistance and associated correlates in the fatty liver group. Baseline HFF correlated with 2-h glucose (r = 0.38, P = 0.001), whole-body insulin sensitivity (r = −0.405, P = 0.001), adiponectin (r = −0.44, P < 0.001), CK-18 levels (r = 0.63, P < 0.001), and disposition index (r = −0.272, P = 0.021) at follow-up. In a multivariate analysis, baseline HFF was an independent predictor of 2-h glucose and whole-body insulin sensitivity.

CONCLUSIONS

In obese youth, the phenotype of MRI-measured hepatic steatosis is persistent. Baseline HFF strongly modulates 2-h blood glucose, biomarkers of insulin resistance, and hepatocellular apoptosis over time.

Concurrent with the soaring rates of childhood obesity, nonalcoholic fatty liver disease (NAFLD) has emerged as the most common liver disease in children in the U.S. (1). NAFLD includes a wide spectrum of pathologies, ranging from simple steatosis (also called NAFL) to steatohepatitis (NASH) to fibrosis/cirrhosis (2,3). NAFLD is associated with hyperlipidemia, insulin resistance, and type 2 diabetes. Thirty to forty percent of obese youth have NAFLD, and ∼10% of them develop NASH, characterized by inflammation and hepatocyte ballooning on a background of hepatic steatosis (1,46). Although simple hepatic steatosis usually has a “benign course,” NASH may progress to end-stage liver disease. Progression to more deleterious stages occurs more rapidly in children than in adults, as described by Feldstein et al. (7).

Accurate diagnosis and staging of NAFL/NASH requires liver biopsy. Due to its associated risks, high cost, and poor acceptability in pediatrics, liver biopsy is a roadblock limiting advancement in understanding the pathogenesis and natural history of the disease in children. However, two imaging techniques (1H-nuclear magnetic resonance [1H-NMR] and fast magnetic resonance imaging [fast MRI]) have been proven to accurately quantify fatty liver content in both adults and children and are thus increasingly being used in clinical research (810). A hepatic fat content (HFF) ≥5.5% is consistent with the diagnosis of hepatic steatosis (810). In our group, fast MRI was strongly correlated with 1H-NMR measures of steatosis and with macrovesicular steatosis/NASH seen on liver biopsy in obese children (11). However, information on advanced stages, such as inflammation and fibrosis, cannot be obtained with these imaging techniques.

There is a high prevalence of hepatic steatosis in obese children and adolescents. The potential for progression to deleterious stages of liver disease and its association with type 2 diabetes creates a need to accurately identify those children and to track the putative metabolic changes associated with hepatic steatosis. Although there is a vast literature on the effect of fatty liver on metabolic deterioration in adults, little is known about this process in pediatrics. Therefore, the aim of this study was to follow a multiethnic group of obese children and adolescents with and without liver steatosis and track changes in metabolic parameters in relation to baseline HFF measured by fast MRI. We hypothesized that baseline HFF would strongly modulate the changes in glucose metabolism and insulin resistance over time in obese youth. Furthermore, circulating levels of cytokeratin-18 (CK-18), a biomarker of hepatocellular apoptosis known to be linked to steatohepatitis (12), were measured to follow the putative association between steatosis and hepatocellular damage longitudinally.

14.

Purpose

Contrast-enhanced ultrasound (CEUS) is the application of ultrasound contrast agents (UCAs) to traditional medical sonography. The development of UCAs has made it possible to overcome some of the limitations of conventional B-mode and Doppler ultrasound techniques and enables display of the parenchymal microvasculature. The purpose of this paper is to delineate the elements of a solid, science-based technique for the execution of urinary bladder CEUS.

Methods

We describe the technical execution of urinary bladder CEUS and the use of perfusion software to perform quantitative analysis of contrast enhancement, with generation of time–intensity curves from regions of interest.
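The time–intensity curve analysis described above can be sketched as follows. This is a minimal illustration, not the paper's perfusion software: the function name and the wash-in threshold (signal rising above 10 % of the peak) are assumptions made for the example.

```python
# Sketch: extracting basic perfusion parameters from a CEUS
# time-intensity curve (TIC) sampled in a region of interest.
# Function name and the 10%-of-peak wash-in threshold are illustrative.

def tic_parameters(times, intensities, baseline_frac=0.1):
    """Return (wash_in_time, time_to_peak, peak_intensity).

    wash_in_time: first time the signal rises above baseline_frac of
    the peak; time_to_peak: time at which the maximum intensity occurs.
    """
    peak = max(intensities)
    time_to_peak = times[intensities.index(peak)]
    threshold = baseline_frac * peak
    wash_in_time = next(t for t, s in zip(times, intensities) if s > threshold)
    return wash_in_time, time_to_peak, peak

# toy curve (arbitrary units), peaking at t = 35 s, then washing out
times = list(range(0, 100, 5))
intensities = [0, 1, 5, 20, 35, 44, 48, 50, 49, 45,
               40, 34, 28, 22, 17, 13, 10, 8, 6, 5]
wi, ttp, peak = tic_parameters(times, intensities)
```

Real perfusion software fits a model curve to the raw TIC before reading off such parameters; the direct extraction above only conveys what each parameter means.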

Results

During CEUS, the normal bladder wall shows a wash-in time of 13 s, a time to peak (TTP) >40 s, a signal intensity (SI) <45 % and a wash-out time >80 s; low-grade urothelial cell carcinoma (UCC) shows a wash-in time of 13 s, a TTP >28 s, an SI <45 % and a wash-out time of 40 s; high-grade UCC shows a wash-in time of 13 s, a TTP >28 s, an SI >50 % and a wash-out time of 58 s.

Conclusions

CEUS is a useful tool for accurate characterization of bladder UCC, although it has some drawbacks. To avoid misunderstandings, a widely accepted classification and standardized terminology for the most significant parameters of this application should be adopted in the near future.

15.

Introduction

Several methods have been proposed to evaluate neurological outcome in out-of-hospital cardiac arrest (OHCA) patients. Blood lactate has been recognized as a reliable prognostic marker for trauma, sepsis, or cardiac arrest. The objective of this study was to examine the association between initial lactate level or lactate clearance and neurologic outcome in OHCA survivors who were treated with therapeutic hypothermia.

Methods

This retrospective cohort study included patients who underwent protocol-based 24-hour therapeutic hypothermia after OHCA between January 2010 and March 2012. Serum lactate levels were measured at the start of therapy (0 hours) and after 6, 12, 24, 48 and 72 hours; the 6-hour and 12-hour lactate clearances were then calculated. Patients' neurologic outcome was assessed one month after cardiac arrest; good neurological outcome was defined as Cerebral Performance Category one or two. The primary outcome was the association between initial lactate level and good neurologic outcome. The secondary outcome was the association between lactate clearance and good neurologic outcome in patients with an initial lactate level >2.5 mmol/L.
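The abstract does not give the clearance formula, so the sketch below uses the common definition (an assumption): clearance at time t is the percent drop from the initial level.

```python
# Sketch of the lactate clearance calculation. The formula is the
# commonly used definition, assumed here since the abstract omits it:
#   clearance(%) = (initial - later) / initial * 100

def lactate_clearance(initial, later):
    """Percent clearance relative to the initial lactate level.
    Negative values mean the lactate level rose instead of falling."""
    return (initial - later) / initial * 100.0

# illustrative values in mmol/L (not patient data):
# initial 6.0, 3.9 at 6 hours -> 35% cleared
clearance_6h = lactate_clearance(6.0, 3.9)
```

Under this definition a higher clearance means faster lactate normalization, which is the quantity the study associates with good neurologic outcome.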

Results

Of the 76 patients enrolled, 34 (44.7%) had a good neurologic outcome. The initial lactate level showed no significant difference between the good and poor neurologic outcome groups (6.07 ± 4.09 mmol/L vs 7.13 ± 3.99 mmol/L, P = 0.42). However, lactate levels at 6, 12, 24, and 48 hours were lower in the good than in the poor neurologic outcome group (3.81 ± 2.81 vs 6.00 ± 3.22, P < 0.01; 2.95 ± 2.07 vs 5.00 ± 3.49, P < 0.01; 2.17 ± 1.24 vs 3.86 ± 3.92, P < 0.01; 1.57 ± 1.02 vs 2.21 ± 1.35, P = 0.03, respectively). The secondary analysis showed that the 6-hour and 12-hour lactate clearances were higher in patients with a good neurologic outcome (35.3 ± 34.6% vs 6.89 ± 47.4%, P = 0.01; 54.5 ± 23.7% vs 25.6 ± 43.7%, P < 0.01, respectively). After adjusting for potential confounding variables, the 12-hour lactate clearance remained statistically significant (P = 0.02).

Conclusion

The lactate clearance rate, not the initial lactate level, was associated with neurological outcome in OHCA patients after therapeutic hypothermia.

16.

Purpose

Point-of-care ultrasound evaluates inferior vena cava (IVC) and internal jugular vein (IJV) measurements to estimate intravascular volume status. The reliability of the IVC and IJV collapsibility index during increased thoracic or intra-abdominal pressure remains unclear.

Methods

Three phases of sonographic scanning were performed: a spontaneous breathing phase, an increased thoracic pressure phase via positive pressure ventilation (PPV), and an increased intra-abdominal pressure (IAP) phase via laparoscopic insufflation to 15 mmHg. IVC measurements were taken 1–2 cm below the diaphragm and IJV measurements at the level of the cricoid cartilage during a complete respiratory cycle. The collapsibility index was calculated as (max diameter − min diameter)/max diameter × 100 %. Chi-square, t-test, correlation (CORR) and Fisher's exact analyses were performed.
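The collapsibility index formula given in the methods is straightforward to compute; a minimal sketch (the diameters used below are illustrative, not study data):

```python
# Collapsibility index as defined in the methods:
#   (max diameter - min diameter) / max diameter * 100 %

def collapsibility_index(d_max, d_min):
    """Percent collapse of a vessel over one respiratory cycle."""
    return (d_max - d_min) / d_max * 100.0

# illustrative IVC diameters in mm: 20.6 max, 12.4 min -> about 39.8 %
ci = collapsibility_index(20.6, 12.4)
```

A fully distended vessel (min equals max) gives 0 %, while a vessel that collapses completely gives 100 %, which is why the index is used as a surrogate for intravascular volume status.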

Results

A total of 144 scans of the IVC and IJV were completed in 16 patients who underwent laparoscopic surgery. Mean age was 46 ± 15 years, with 75 % female and 69 % African-American. IVC and IJV collapsibility correlated in the setting of spontaneous breathing (r2 = 0.86, p < 0.01). IVC collapsibility had no correlation with the IJV in the setting of PPV (r2 = 0.21, p = 0.52) or IAP (r2 = 0.26, p = 0.42). Maximal IVC diameter was significantly smaller during increased IAP (16.5 ± 4.9 mm) compared to spontaneous breathing (20.6 ± 4.8 mm, p = 0.04) and PPV (21.8 ± 5.6 mm, p = 0.01).

Conclusion

IJV and IVC collapsibility correlated during spontaneous breathing, but there was no statistically significant correlation during increased thoracic or intra-abdominal pressure. Increased intra-abdominal pressure was associated with a significantly smaller maximal IVC diameter, which calls into question the reliability of the IVC diameter in clinical settings associated with intra-abdominal hypertension or abdominal compartment syndrome.

17.

Introduction

Inflammation and coagulation are closely linked, and both can be triggered by endotoxin. Thrombelastometry and impedance aggregometry are of diagnostic and predictive value in critically ill patients. In this observational study we investigated the correlation of endotoxin activity with thrombelasometric and aggregometric variables in patients with systemic inflammation.

Methods

Based on daily screening in a tertiary academic surgical ICU, patients were included as soon as they fulfilled two or more criteria for systemic inflammatory response syndrome (SIRS). In whole blood we performed an endotoxin activity (EA) assay, thrombelastometry (ROTEM®) and impedance aggregometry (Multiplate®).

Results

In total, 49 patients were included, with a broad spread of EA levels (median (minimum to maximum): 0.27 (0.01 to 0.72)), allowing expedient correlative analysis. Clot formation time (CFT) (263 s (60 to 1,438 s)) and clotting time (CT) (1,008 s (53 to 1,481 s)) showed significant negative correlations with EA level (r = -0.38 (P < 0.005) and r = -0.29 (P < 0.05)). Positive correlations were found for alpha-angle (50° (17 to 78°), r = 0.40 (P < 0.005)) and maximum clot firmness (MCF) (55 mm (5 to 76 mm), r = 0.27 (P < 0.05)). No significant correlation was found between the Lysis Index at 60 minutes (LI60) and EA levels. There was no correlation between EA level and aggregometric values or classical coagulation parameters.

Conclusions

In patients with systemic inflammation, increasing endotoxin concentrations correlate with increased clot formation.

18.

Background

Myocardial fibrosis imaging using late gadolinium enhancement (LGE) cardiac magnetic resonance (CMR) has been validated as a quantitative predictive marker for response to medical, surgical, and device therapy. To date, all such studies have examined conventional, non-phase-corrected magnitude images. However, contemporary practice has rapidly adopted phase-corrected image reconstruction. We sought to investigate the existence of any systematic bias between threshold-based scar quantification performed on conventional magnitude inversion recovery (MIR) and matched phase-sensitive inversion recovery (PSIR) images.

Methods

In 80 patients with confirmed ischemic (n = 40) or non-ischemic (n = 40) myocardial fibrosis, and in a healthy control cohort (n = 40) without fibrosis, myocardial late enhancement was quantified using the Signal Threshold versus Reference Myocardium (STRM) technique at ≥2, ≥3, and ≥5 SD thresholds, and also using the Full Width at Half Maximum (FWHM) technique. This was performed on both MIR and PSIR images, and values were compared using linear regression and Bland-Altman analyses.
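The two quantification rules can be stated compactly: STRM labels a voxel as scar when its signal exceeds the reference-myocardium mean by n standard deviations, while FWHM uses half the maximum signal as the cutoff. A minimal sketch on a hypothetical 1-D intensity profile (arbitrary units, not patient data):

```python
import numpy as np

def strm_mask(signal, reference, n_sd):
    """Scar = pixels >= mean(reference) + n_sd * std(reference)."""
    thr = reference.mean() + n_sd * reference.std()
    return signal >= thr

def fwhm_mask(signal):
    """Scar = pixels >= 50% of the maximum signal intensity."""
    return signal >= 0.5 * signal.max()

# Hypothetical intensities: remote (reference) myocardium ~100 a.u.,
# dense scar ~300 a.u., with a few borderline pixels in between.
reference = np.array([98., 102., 99., 101., 100., 100.])
signal = np.array([100., 103., 105., 150., 290., 310., 300., 120., 99.])

for n in (2, 3, 5):
    print(f">= {n} SD: {strm_mask(signal, reference, n).sum()} scar pixels")
print(f"FWHM: {fwhm_mask(signal).sum()} scar pixels")
```

The higher the SD threshold, the fewer borderline pixels are counted as scar, which is why the reported bias shrinks from the ≥2 to the ≥5 SD threshold.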

Results

Linear regression analysis demonstrated excellent correlation of scar volumes between MIR and PSIR images at all three STRM signal thresholds for the ischemic (N = 40; r = 0.96, 0.95, 0.88 at ≥2, ≥3, and ≥5 SD; p < 0.0001 for all regressions) and non-ischemic (N = 40; r = 0.86, 0.89, 0.90 at ≥2, ≥3, and ≥5 SD; p < 0.0001 for all regressions) cohorts. FWHM analysis demonstrated good correlation in the ischemic population (N = 40, r = 0.83, p < 0.0001). Bland-Altman analysis demonstrated a systematic bias, with MIR images showing higher values than PSIR, for the ischemic (3.3 %, 3.9 % and 4.9 % at ≥2, ≥3, and ≥5 SD thresholds, respectively) and non-ischemic (9.7 %, 7.4 % and 4.1 % at ≥2, ≥3, and ≥5 SD thresholds, respectively) cohorts. Background myocardial signal measured in the control population demonstrated a similar bias of 4.4 %, 2.6 % and 0.7 % of the LV volume at the ≥2, ≥3 and ≥5 SD thresholds, respectively. The bias observed using FWHM analysis was −6.9 %.
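The bias figures above are, in Bland-Altman terms, the mean of the paired MIR-minus-PSIR differences, with limits of agreement at that mean ± 1.96 SD of the differences. A sketch on hypothetical paired scar volumes (% of LV, illustrative only):

```python
import numpy as np

def bland_altman(a, b):
    """Return (bias, lower LoA, upper LoA) for paired measurements a, b."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical scar volumes (% LV) from MIR and PSIR in the same patients.
mir = np.array([12.0, 25.0, 8.0, 30.0, 18.0])
psir = np.array([9.0, 21.0, 5.0, 26.0, 14.0])

bias, lo, hi = bland_altman(mir, psir)
print(f"bias = {bias:.1f} %, limits of agreement = [{lo:.1f}, {hi:.1f}] %")
```

A positive bias, as here, corresponds to the study's finding that MIR systematically reads higher than PSIR.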

Conclusions

Scar quantification using phase-corrected (PSIR) images yields values highly correlated with those obtained from non-corrected (MIR) images. However, a systematic bias exists that appears exaggerated in non-ischemic cohorts. This bias should be considered when comparing or translating knowledge between MIR- and PSIR-based imaging.

19.

Introduction

Circulatory failure during resuscitation of brain-dead organ donors compromises the recovery of organs. Combined administration of steroid, thyroxine and vasopressin has been proposed to optimize the management of brain-deceased donors before organ recovery. However, the administration of hydrocortisone alone has not been rigorously evaluated in any trial.

Methods

In this prospective multicenter cluster study, 259 subjects were included. Those who received low-dose steroids composed the steroid group (n = 102); the remaining subjects served as controls.

Results

Although more patients in the steroid group received norepinephrine before brain death (80% vs. 66%; P = 0.03), the mean vasopressor dose administered after brain death was significantly lower than in the control group (1.18 ± 0.92 mg/h vs. 1.49 ± 1.29 mg/h; P = 0.03), the duration of vasopressor support was shorter (874 min vs. 1,160 min; P < 0.0001), and norepinephrine weaning before aortic clamping was more frequent (33.8% vs. 9.5%; P < 0.0001). Using a survival approach, the probability of norepinephrine weaning differed significantly between the two groups (P < 0.0001), being 4.67 times higher in the steroid group than in the control group (95% CI: 2.30 to 9.49).
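The "survival approach" here treats norepinephrine weaning as the event of interest, with donors never weaned treated as censored. A minimal Kaplan-Meier-style estimator of the cumulative weaning probability, on hypothetical times in minutes (illustrative only, not study data):

```python
import numpy as np

def km_event_prob(times, events, t):
    """Kaplan-Meier estimate of P(event by time t).

    times  -- time to weaning, or to censoring (e.g. aortic clamping)
    events -- 1 if weaned, 0 if censored
    """
    surv = 1.0
    for ti in sorted(set(times[events == 1])):
        if ti > t:
            break
        at_risk = np.sum(times >= ti)
        d = np.sum((times == ti) & (events == 1))
        surv *= 1 - d / at_risk  # "survival" = still on norepinephrine
    return 1 - surv

# Hypothetical donors: four weaned, two censored before weaning.
times = np.array([200., 400., 600., 800., 900., 1000.])
events = np.array([1, 1, 0, 1, 0, 1])

print(f"P(weaned by 850 min) = {km_event_prob(times, events, 850):.2f}")
```

The 4.67-fold figure in the abstract is a hazard-type ratio comparing two such weaning curves; with real data a Cox model (e.g. lifelines' `CoxPHFitter`) would estimate it directly.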

Conclusions

Although no benefit of steroid administration was observed on primary function recovery of transplanted grafts, administration of glucocorticoids should be part of the resuscitation management of deceased donors with hemodynamic instability.

20.

Introduction

We developed a protocol to initiate surgical source control immediately after admission (early source control) and to perform initial resuscitation using early goal-directed therapy (EGDT) for gastrointestinal (GI) perforation with associated septic shock. This study evaluated the relationship between the time from admission to initiation of surgery and outcome under the protocol.

Methods

This prospective observational study involved 154 patients with GI perforation and associated septic shock. We statistically analyzed the relationship between time to initiation of surgery and 60-day outcome, examined the change in 60-day outcome associated with each 2-hour delay in surgery initiation, and determined a target time for 60-day survival.

Results

Logistic regression analysis demonstrated that time to initiation of surgery (hours) was significantly associated with 60-day outcome (odds ratio (OR) 0.31; 95% confidence interval (CI) 0.19 to 0.45; P < 0.0001). Time to initiation of surgery was also selected as an independent factor for 60-day outcome in multiple logistic regression analysis (OR 0.29; 95% CI 0.16 to 0.47; P < 0.0001). The survival rate fell as surgery initiation was delayed and was 0% when the time exceeded 6 hours.
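Because time was entered in hours, the fitted OR of 0.29 multiplies the odds of 60-day survival once per hour of delay. A sketch of how survival probability would scale with delay under this model; the baseline odds at zero delay is hypothetical, since the abstract reports only the OR:

```python
def survival_odds(delay_h, baseline_odds=9.0, or_per_hour=0.29):
    """Odds of 60-day survival after delay_h hours of delay, assuming
    the per-hour odds ratio applies multiplicatively.  baseline_odds
    (the odds at zero delay) is hypothetical, not from the abstract."""
    return baseline_odds * or_per_hour ** delay_h

for h in (0, 2, 4, 6):
    odds = survival_odds(h)
    prob = odds / (1 + odds)
    print(f"delay {h} h: P(60-day survival) = {prob:.2f}")
```

Even from a generous starting point, the modeled survival probability collapses within a few hours, consistent with the observed 0% survival beyond 6 hours.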

Conclusions

For patients with GI perforation and associated septic shock, the time from admission to initiation of surgery for source control is a critical determinant of outcome, provided surgery is supported by hemodynamic stabilization. The target time for a favorable outcome may be within 6 hours of admission. Initiation of EGDT-assisted surgery should not be delayed in patients complicated by septic shock.
