Similar Articles

20 similar articles found.
1.

Introduction

Indications for renal replacement therapy (RRT) have not been generally standardized and vary among intensive care units (ICUs). We aimed to assess the proportion, indications, and modality of RRT, as well as the association between the proportion of RRT use and 90-day mortality in patients with septic shock in Finnish adult ICUs.

Methods

We identified patients with septic shock from the prospective observational multicenter FINNAKI study conducted between 1 September 2011 and 1 February 2012. We divided the ICUs into high-RRT and low-RRT ICUs according to the median of the proportion of RRT-treated patients with septic shock. Differences in indications and modality of RRT between the ICU groups were assessed. Finally, we performed an adjusted logistic regression analysis to evaluate the possible association of the ICU group (high- vs. low-RRT) with 90-day mortality.

Results

Of the 726 patients with septic shock, 131 (18.0%, 95% CI 15.2 to 20.9%) were treated with RRT. The proportion of RRT-treated patients varied from 3% to 36% (median 19%) among ICUs. The high-RRT group comprised nine ICUs (354 patients) and the low-RRT group eight ICUs (372 patients). In the high-RRT ICUs, patients with septic shock were older (P = 0.04), had more cardiovascular (P <0.001) and renal (P = 0.003) failures on the first day in the ICU, were more often mechanically ventilated, and received higher maximum doses of norepinephrine (0.25 μg/kg/min vs. 0.18 μg/kg/min, P <0.001) than in the low-RRT ICUs. No significant differences in indications for or modality of RRT existed between the ICU groups. The crude 90-day mortality rate for patients with septic shock was 36.2% (95% CI 31.1 to 41.3%) in the high-RRT ICUs compared to 33.9% (95% CI 29.0 to 38.8%) in the low-RRT ICUs (P = 0.5). In an adjusted logistic regression analysis, the ICU group (high-RRT or low-RRT) was not associated with 90-day mortality.

Conclusions

Patients with septic shock in ICUs with a high proportion of RRT use had more severe organ dysfunction and received more organ-supportive treatments. Importantly, the ICU group (high-RRT or low-RRT) was not associated with 90-day mortality.

2.

Introduction

The role of systemic hemodynamics in the pathogenesis of septic acute kidney injury (AKI) has received little attention. The purpose of this study was to investigate the association between systemic hemodynamics and new or persistent AKI in severe sepsis.

Methods

A retrospective study covering 2006 to 2010 was performed in a surgical ICU of a teaching hospital. AKI was defined as new (developing) or persistent AKI during the five days following admission, based on the Acute Kidney Injury Network (AKIN) criteria. We studied the association between AKI and the following hemodynamic targets within 24 hours of admission: central venous pressure (CVP), cardiac output (CO), mean arterial pressure (MAP), diastolic arterial pressure (DAP), and central venous (ScvO2) or mixed venous (SvO2) oxygen saturation.

Results

This study included 137 ICU septic patients. Of these, 69 had new or persistent AKI. AKI patients had a higher Simplified Acute Physiology Score (SAPS II) (57 (46 to 67) vs. 45 (33 to 52), P < 0.001) and higher mortality (38% vs. 15%, P = 0.003) than those with no AKI or improving AKI. MAP, ScvO2 and CO were not significantly different between groups. Patients with AKI had lower DAP and higher CVP (P = 0.0003). The CVP value was associated with the risk of developing new or persistent AKI even after adjustment for fluid balance and positive end-expiratory pressure (PEEP) level (OR = 1.22 (1.08 to 1.39), P = 0.002). A linear relationship between CVP and the risk of new or persistent AKI was observed.

Conclusions

We observed no association between most systemic hemodynamic parameters and AKI in septic patients. The association between elevated CVP and AKI suggests a role of venous congestion in the development of AKI. The paradigm that targeting a high CVP may reduce the occurrence of AKI should probably be revised. Furthermore, DAP should be considered a potentially important hemodynamic target for the kidney.

3.

Introduction

The development of acute kidney injury (AKI) is associated with poor outcome. The modified RIFLE (risk, injury, failure, loss of kidney function, and end-stage renal failure) classification for AKI, which assigns patients who need renal replacement therapy to the RIFLE failure class, improves the predictive value of AKI in patients undergoing cardiac surgery. Our aim was to assess risk factors for post-operative AKI and the impact of renal function on short- and long-term survival among all AKI subgroups using the modified RIFLE classification.

Methods

We prospectively studied 2,940 consecutive cardiac surgery patients between January 2004 and July 2009. AKI was defined according to the modified RIFLE system. Pre-operative, operative and post-operative variables measured on admission and during the ICU stay, including the main outcomes, were recorded together with cardiac surgery and ICU scores. These data were evaluated for association with AKI and with staging in the different RIFLE groups by means of multivariable analyses. Survival was analyzed via Kaplan-Meier curves and a risk-adjusted Cox proportional hazards regression model. Complete follow-up (mean 6.9 ± 4.3 years) was achieved in 2,840 patients up to April 2013.

Results

Of those patients studied, 14% (n = 409) were diagnosed with AKI. We identified one intra-operative (higher cardiopulmonary bypass time) and two post-operative (a longer need for vasoactive drugs and higher arterial lactate 24 hours after admission) predictors of AKI. The worst outcomes, including in-hospital mortality, were associated with the worst RIFLE class. Kaplan-Meier analysis showed survival of 74.9% in the RIFLE risk group, 42.9% in the RIFLE injury group and 22.3% in the RIFLE failure group (P <0.001). Classification at RIFLE injury (Hazard ratio (HR) = 2.347, 95% confidence interval (CI) 1.122 to 4.907, P = 0.023) and RIFLE failure (HR = 3.093, 95% CI 1.460 to 6.550, P = 0.003) were independent predictors for long-term patient mortality.

Conclusions

AKI development after cardiac surgery is associated mainly with post-operative variables, which may ultimately lead to a worse RIFLE class. Staging at the RIFLE injury or RIFLE failure class is associated with higher short- and long-term mortality in our population.

4.

Introduction

Recently, the Kidney Disease: Improving Global Outcomes (KDIGO) proposed a new definition and classification of acute kidney injury (AKI) on the basis of the RIFLE (Risk, Injury, Failure, Loss of kidney function, and End-stage renal failure) and AKIN (Acute Kidney Injury Network) criteria, but comparisons of the three criteria in critically ill patients are rare.

Methods

We prospectively analyzed a clinical database of 3,107 adult patients who were consecutively admitted to one of 30 intensive care units of 28 tertiary hospitals in Beijing from 1 March to 31 August 2012. AKI was defined by the RIFLE, AKIN, and KDIGO criteria. Receiver operating characteristic curves were used to compare the predictive ability for mortality, and logistic regression analysis was used for the calculation of odds ratios and 95% confidence intervals.

Results

The rates of incidence of AKI using the RIFLE, AKIN, and KDIGO criteria were 46.9%, 38.4%, and 51%, respectively. KDIGO identified more patients than did RIFLE (51% versus 46.9%, P = 0.001) and AKIN (51% versus 38.4%, P <0.001). Compared with patients without AKI, in-hospital mortality was significantly higher for those diagnosed as AKI by using the RIFLE (27.8% versus 7%, P <0.001), AKIN (32.2% versus 7.1%, P <0.001), and KDIGO (27.4% versus 5.6%, P <0.001) criteria, respectively. There was no difference in AKI-related mortality between RIFLE and KDIGO (27.8% versus 27.4%, P = 0.815), but there was significant difference between AKIN and KDIGO (32.2% versus 27.4%, P = 0.006). The areas under the receiver operator characteristic curve for in-hospital mortality were 0.738 (P <0.001) for RIFLE, 0.746 (P <0.001) for AKIN, and 0.757 (P <0.001) for KDIGO. KDIGO was more predictive than RIFLE for in-hospital mortality (P <0.001), but there was no difference between KDIGO and AKIN (P = 0.12).
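The creatinine-based staging logic shared by these criteria is mechanical enough to sketch in code. Below is a simplified illustration of KDIGO-style staging from serum creatinine alone; the urine-output criteria and the 48-hour window for the 0.3 mg/dL absolute rise are omitted, the function and variable names are illustrative, and this is not a clinical tool:

```python
def kdigo_stage_by_creatinine(scr, baseline, on_rrt=False):
    """Simplified KDIGO AKI staging from serum creatinine (mg/dL) only.

    Illustrative sketch: omits urine-output criteria and the 48-hour
    timing requirement for the 0.3 mg/dL absolute rise.
    """
    ratio = scr / baseline
    if on_rrt or ratio >= 3.0 or scr >= 4.0:
        return 3  # stage 3: >=3x baseline, SCr >=4.0 mg/dL, or RRT started
    if ratio >= 2.0:
        return 2  # stage 2: 2.0-2.9x baseline
    if ratio >= 1.5 or scr - baseline >= 0.3:
        return 1  # stage 1: 1.5-1.9x baseline or absolute rise >=0.3 mg/dL
    return 0      # no AKI by creatinine criteria
```

Differences in incidence between RIFLE, AKIN and KDIGO arise from exactly such threshold choices (relative versus absolute creatinine changes and the reference baseline).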

Conclusions

A higher incidence of AKI was diagnosed according to the KDIGO criteria. Patients diagnosed with AKI had significantly higher in-hospital mortality than non-AKI patients, regardless of which criteria were used. Compared with the RIFLE criteria, KDIGO was more predictive of in-hospital mortality, but there was no significant difference between AKIN and KDIGO.

5.

Introduction

Acute kidney injury (AKI) occurs frequently after liver transplantation and is associated with significant morbidity and mortality. Recent evidence has linked the predominant usage of ‘chloride-liberal’ intravenous fluids, such as 0.9% saline, to the development of renal dysfunction in general critically ill patients. We compared the effects of perioperative fluid types on AKI in liver transplant recipients.

Methods

An observational analysis of liver transplant recipients over a 33-month period, between January 2010 and September 2013, was performed. The intensive care unit database and patient records were analyzed for determinants of early postoperative AKI. Univariate and multivariate regression analyses were carried out, with a two-tailed P value of less than 0.05 taken to indicate significance. The institutional Research Ethics Committee approved the study methodology (RAC no. 2131 073).

Results

One hundred and fifty-eight liver transplants were performed, and AKI developed in 57 (36.1%) patients: 39 (68.4%) fully recovered, 13 (22.8%) developed chronic renal failure and 10 (17.5%) required long-term hemodialysis. On univariate regression analysis, AKI was significantly associated with infusion of greater than 3,200 ml of chloride-liberal fluids within the first postoperative day (hazard ratio (HR) 5.9, 95% CI 2.64 to 13.2, P <0.001), greater than 1,500 ml of colloids received in the operating room (HR 1.97, 95% CI 1.01 to 3.8, P = 0.046), vasopressor requirement for 48 hours posttransplant (HR 3.34, 95% CI 1.55 to 7.21, P = 0.002), hyperchloremia at day 2 (HR 1.09, 95% CI 1.01 to 1.18, P = 0.015) and preoperative model for end-stage liver disease (MELD) score (HR 1.08, 95% CI 1.03 to 1.13, P <0.001). After stepwise multivariate regression, infusion of greater than 3,200 ml of chloride-liberal fluids (HR 6.25, 95% CI 2.69 to 14.5, P <0.001) and preoperative MELD score (HR 1.08, 95% CI 1.02 to 1.15, P = 0.004) remained significant predictors of AKI.

Conclusions

In this sample of liver transplant recipients, infusion of higher volumes of chloride-liberal fluids and worse preoperative status were associated with an increased risk of postoperative AKI.

6.

Introduction

Acute renal failure (ARF) requiring renal replacement therapy (RRT) occurs frequently in ICU patients and significantly affects mortality rates. To date, few large clinical trials have investigated the impact of RRT modality on patient outcomes. Here we investigated the effect of two major RRT strategies, intermittent hemodialysis (IHD) and continuous veno-venous hemofiltration (CVVH), on mortality and renal-related outcome measures.

Methods

This single-center prospective randomized controlled trial (“CONVINT”) included 252 critically ill patients (159 male; mean age, 61.5 ± 13.9 years; Acute Physiology and Chronic Health Evaluation (APACHE) II score, 28.6 ± 8.8) with dialysis-dependent ARF treated in the ICUs of a tertiary care academic center. Patients were randomized to receive either daily IHD or CVVH. The primary outcome measure was survival at 14 days after the end of RRT. Secondary outcome measures included 30-day-, intensive care unit-, and intrahospital mortality, as well as course of disease severity/biomarkers and need for organ-support therapy.

Results

At baseline, no differences in disease severity, age or sex distribution, or suspected causes of acute renal failure were observed. Survival rates at 14 days after the end of RRT were 39.5% (IHD) versus 43.9% (CVVH) (odds ratio (OR), 0.84; 95% confidence interval (CI), 0.49 to 1.41; P = 0.50). The 14-day, 30-day, and all-cause intrahospital mortality rates did not differ between the two groups (all P > 0.5). No differences were observed in days on RRT, vasopressor days, days on ventilator, or ICU/intrahospital length of stay.

Conclusions

In a monocentric RCT, we observed no statistically significant differences between the investigated treatment modalities regarding mortality, renal-related outcome measures, or survival at 14 days after RRT. Our findings add to mounting data demonstrating that intermittent and continuous RRTs may be considered equivalent approaches for critically ill patients with dialysis-dependent acute renal failure.

Trial registration

NCT01228123, clinicaltrials.gov

7.

Introduction

The aim of this study was to examine whether albumin reduced mortality when employed for the resuscitation of adult patients with severe sepsis and septic shock compared with crystalloid by meta-analysis.

Methods

We searched and gathered data from the MEDLINE, Elsevier, Cochrane Central Register of Controlled Trials and Web of Science databases. Studies were eligible if they compared the effects of albumin versus crystalloid therapy on mortality in adult patients with severe sepsis and septic shock. Two reviewers extracted data independently, and disagreements were resolved by discussion with two other reviewers until consensus was achieved. Extracted data included mortality, the numbers of patients with severe sepsis and with septic shock, and resuscitation endpoints. Data were analyzed with the methods recommended by the Cochrane Collaboration, using Review Manager 4.2 software.

Results

A total of 5,534 records were identified through the initial search. Five studies compared albumin with crystalloid. In total, 3,658 patients with severe sepsis and 2,180 with septic shock were included in the meta-analysis. Heterogeneity was non-significant (P = 0.86, I2 = 0%). Compared with crystalloid, a trend toward reduced 90-day mortality was observed in severe sepsis patients resuscitated with albumin (odds ratio (OR) 0.88; 95% CI, 0.76 to 1.01; P = 0.08). However, the use of albumin for resuscitation significantly decreased 90-day mortality in septic shock patients (OR 0.81; 95% CI, 0.67 to 0.97; P = 0.03). Compared with saline, albumin resuscitation showed a nonsignificant trend toward improved outcome in severe sepsis patients (OR 0.81; 95% CI, 0.64 to 1.08; P = 0.09).
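For context, fixed-effect pooling of this kind operates on log odds ratios with inverse-variance weights, and heterogeneity is summarized by Cochran's Q and the I2 statistic. A minimal sketch using hypothetical study-level data (the function name and the recovery of standard errors from the 95% CIs are illustrative assumptions, not the review's actual code):

```python
import math

def pooled_or_fixed(ors, ci_lowers, ci_uppers):
    """Inverse-variance fixed-effect pooling of odds ratios.

    Standard errors are recovered from the 95% CIs on the log scale;
    returns the pooled OR, its 95% CI, and the I^2 statistic (%).
    """
    logs = [math.log(o) for o in ors]
    # SE(log OR) from a 95% CI: (ln(upper) - ln(lower)) / (2 * 1.96)
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96)
           for l, u in zip(ci_lowers, ci_uppers)]
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * x for w, x in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    # Cochran's Q and I^2 = max(0, (Q - df) / Q) * 100
    q = sum(w * (x - pooled)**2 for w, x in zip(weights, logs))
    df = len(ors) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    ci = (math.exp(pooled - 1.96 * se_pooled),
          math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci, i2
```

An I2 of 0%, as reported above, means the between-study variability in effect estimates is no larger than expected by chance alone.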

Conclusions

In this meta-analysis, a trend toward reduced 90-day mortality was observed in severe sepsis patients resuscitated with albumin compared with crystalloid or saline. Moreover, albumin resuscitation significantly decreased the 90-day mortality of patients with septic shock.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0702-y) contains supplementary material, which is available to authorized users.

8.

Introduction

Septic syndromes remain the leading cause of mortality in intensive care units (ICU). Septic patients rapidly develop immune dysfunctions, the intensity and duration of which have been linked with deleterious outcomes. Decreased mRNA expressions of major histocompatibility complex (MHC) class II-related genes have been reported after sepsis. We investigated whether their mRNA levels in whole blood could predict mortality in septic shock patients.

Methods

A total of 93 septic shock patients were included. On the third day after shock onset, the mRNA expressions of five MHC class II-related genes (CD74, HLA-DRA, HLA-DMB, HLA-DMA, CIITA) were measured by qRT-PCR, and monocyte human leukocyte antigen-DR (mHLA-DR) expression was measured by flow cytometry.

Results

A significant correlation was found among the MHC class II-related gene expressions. Among the mRNA markers, the best prognostic value was obtained for CD74 (HLA-DR antigen-associated invariant chain). For this parameter, the area under the receiver operating characteristic curve (AUC = 0.67, 95% confidence interval (CI) = 0.55 to 0.79; P = 0.01) and the optimal cut-off value were calculated. After stratification based on this threshold, survival curves showed that a decreased CD74 mRNA level was associated with increased mortality after septic shock (log-rank test, P = 0.0043; hazard ratio = 3.0, 95% CI: 1.4 to 6.5). Importantly, this association remained significant in a multivariate logistic regression analysis including usual clinical confounders (that is, severity scores; P = 0.026, odds ratio = 3.4, 95% CI: 1.2 to 9.8).

Conclusion

Decreased CD74 mRNA expression significantly predicts 28-day mortality after septic shock. After validation in a larger multicenter study, this biomarker could become a robust predictor of death in septic patients.

9.

Introduction

Patients with severe acute kidney injury (AKI) who are hospitalized at centers that do not provide renal replacement therapy (RRT) are frequently subjected to inter-hospital transfer for the provision of RRT. It is unclear whether such transfers are associated with worse patient outcomes as compared with the receipt of initial care in a center that provides RRT. This study examined the relationship between inter-hospital transfer and 30-day mortality among critically ill patients with AKI who received RRT.

Methods

We conducted a retrospective cohort study of all critically ill patients who commenced RRT for AKI at two academic hospitals in Toronto, Canada. The exposure of interest was inter-hospital transfer for the administration of RRT. We evaluated the relationship between transfer status and 30-day mortality (primary outcome) and RRT dependence at 30 days following RRT initiation (secondary outcome), by using multivariate logistic regression with adjustment for patient demographics, clinical factors, biochemical indices, and severity of illness.

Results

Of 370 patients who underwent RRT for AKI, 82 (22.2%) were transferred for this purpose from another hospital. Compared with non-transferred patients who started RRT, transferred patients were younger (61 ± 15 versus 65 ± 15 years, P = 0.03) and had a higher serum creatinine concentration at RRT initiation (474 ± 295 versus 365 ± 169 μmol/L, P = 0.002). Inter-hospital transfer was not associated with mortality (adjusted odds ratio 0.61, 95% confidence interval 0.33 to 1.12) or RRT-dependence (adjusted odds ratio 1.64, 95% confidence interval 0.70 to 3.81) at 30 days.

Conclusions

Within the limitations of this observational study and the potential for residual confounding, inter-hospital transfer of critically ill patients with AKI was not associated with a higher risk of death or dialysis dependence 30 days after the initiation of acute RRT.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0513-1) contains supplementary material, which is available to authorized users.

10.

Introduction

Current practice in the delivery of caloric intake (DCI) in patients with severe acute kidney injury (AKI) receiving renal replacement therapy (RRT) is unknown. We aimed to describe calorie administration in patients enrolled in the Randomized Evaluation of Normal vs. Augmented Level of Replacement Therapy (RENAL) study and to assess the association between DCI and clinical outcomes.

Methods

We performed a secondary analysis of 1,456 patients from the RENAL trial. We measured the dose and evolution of DCI during treatment and analyzed its association with major clinical outcomes using multivariable logistic regression, Cox proportional hazards models, and time-adjusted models.

Results

Overall, mean DCI during treatment in the ICU was low: 10.9 ± 9 kcal/kg/day for non-survivors and 11 ± 9 kcal/kg/day for survivors. Among patients with a lower DCI (below the median), 334 of 729 (45.8%) had died at 90 days after randomization, compared with 316 of 727 (43.3%) patients with a higher DCI (above the median) (P = 0.34). On multivariable logistic regression analysis, mean DCI carried an odds ratio for 90-day mortality of 0.95 (95% confidence interval (CI): 0.91 to 1.00; P = 0.06) per 100 kcal increase. DCI was not associated with significant differences in renal replacement therapy (RRT)-free days, mechanical ventilation-free days, ICU-free days or hospital-free days. These findings remained essentially unaltered after time-adjusted analysis and Cox proportional hazards modeling.

Conclusions

In the RENAL study, mean DCI was low. Within the limits of such low caloric intake, greater DCI was not associated with improved clinical outcomes.

Trial registration

ClinicalTrials.gov number, NCT00221013

11.

Introduction

Neutrophil gelatinase-associated lipocalin (NGAL) is a biomarker of acute kidney injury (AKI), and levels reflect severity of disease in critically ill patients. However, continuous venovenous hemofiltration (CVVH) may affect plasma levels by clearance or release of NGAL by activated neutrophils in the filter, dependent on the anticoagulation regimen applied. We therefore studied handling of NGAL by CVVH in patients with AKI.

Methods

Immediately before initiation of CVVH, prefilter blood was drawn. After 10, 60, 180, and 720 minutes of CVVH, samples were collected from pre- and postfilter (in- and outlet) blood and ultrafiltrate. CVVH with the following anticoagulation regimens was studied: no anticoagulation in case of a high bleeding tendency (n = 13), unfractionated heparin (n = 8), or trisodium citrate (n = 21). NGAL levels were determined with enzyme-linked immunosorbent assay (ELISA).

Results

Concentrations of NGAL at inlet and outlet were similar, and concentrations did not change over time in any of the anticoagulation groups; thus no net removal or production of NGAL occurred. Concentrations of NGAL at inlet correlated with disease severity at initiation of CVVH and at the end of a CVVH run. Concentrations of NGAL in the ultrafiltrate were lower with citrate-based CVVH (P = 0.03) and decreased over time, irrespective of anticoagulation administered (P < 0.001). The sieving coefficient and clearance of NGAL were low and decreased over time (P < 0.001).
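The sieving coefficient and clearance reported here follow the standard definitions: the ultrafiltrate concentration divided by the mean plasma concentration across the filter, and that ratio multiplied by the ultrafiltration rate. A minimal sketch of these formulas (function and variable names are illustrative):

```python
def sieving_coefficient(c_ultrafiltrate, c_inlet, c_outlet):
    """Sieving coefficient: ultrafiltrate concentration divided by the
    mean plasma concentration across the filter (inlet/outlet average).
    A value of 1 means the solute passes the membrane freely; near 0
    means it is largely retained."""
    return c_ultrafiltrate / ((c_inlet + c_outlet) / 2)

def filter_clearance(c_ultrafiltrate, c_inlet, c_outlet, uf_rate_ml_min):
    """Filter clearance (mL/min) = sieving coefficient x ultrafiltration
    rate, i.e. the virtual plasma volume cleared of the solute per minute."""
    return sieving_coefficient(c_ultrafiltrate, c_inlet, c_outlet) * uf_rate_ml_min
```

A low and falling sieving coefficient, as observed for NGAL here, implies low filter clearance and hence little effect of CVVH on the plasma level.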

Conclusions

The plasma level, and hence the biomarker value, of NGAL in critically ill patients with AKI is not affected by CVVH, because clearance by the filter was low. Furthermore, there is no evidence of intrafilter release of NGAL by neutrophils, irrespective of the anticoagulation method applied.

12.
Introduction

Use of hydroxyethyl starch (HES) in septic patients is reported to increase mortality and the incidence of renal replacement therapy (RRT). However, whether use of HES induces the same result in non-septic patients in the intensive care unit (ICU) remains unclear. The objective of this meta-analysis was to evaluate 6% HES versus other fluids for non-septic ICU patients.

Methods

Randomized controlled trials (RCTs) published before November 2013 were searched from the Pubmed, OvidSP and Embase databases and the Cochrane Library. A meta-analysis was performed on the effect of 6% HES versus other fluids in non-septic ICU patients, including mortality, RRT incidence, bleeding volume, red blood cell (RBC) transfusion and fluid administration.

Results

Twenty-two RCTs were included, involving 6,064 non-septic ICU patients. Compared with the other fluids, 6% HES was not associated with decreased overall mortality (RR = 1.03, 95% CI: 0.09 to 1.17; P = 0.67; I2 = 0). There were no significant differences in RRT incidence, bleeding volume or red blood cell transfusion between the 6% HES group and the other fluid groups. However, patients in the HES group received less total intravenous fluid than those receiving crystalloids during the first day in the ICU (SMD = −0.84; 95% CI: −1.39 to −0.30; P = 0.003, I2 = 74%).

Conclusions

This meta-analysis found no increase in mortality, RRT incidence, bleeding volume or RBC transfusion with 6% HES in non-septic ICU patients, but the sample sizes were small and the studies were generally of poor quality.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-015-0833-9) contains supplementary material, which is available to authorized users.

13.

Introduction

Sepsis is the leading cause of acute kidney injury (AKI) in critically ill patients. The optimal timing of initiating renal replacement therapy (RRT) in septic AKI patients remains controversial. The objective of this study was to determine the impact of early versus late initiation of RRT, as defined using the simplified RIFLE (risk, injury, failure, loss of kidney function, and end-stage renal failure) classification (sRIFLE), on hospital mortality among septic AKI patients.

Methods

Patients with sepsis and AKI requiring RRT in surgical intensive care units were enrolled between January 2002 and October 2009. The patients were divided into early (sRIFLE-0 or -Risk) or late (sRIFLE-Injury or -Failure) initiation of RRT by sRIFLE criteria. Cox proportional hazard ratios for in-hospital mortality were determined to assess the impact of the timing of RRT.

Results

Among the 370 patients, 192 (51.9%) underwent early RRT and 259 (70.0%) died during hospitalization. The mortality rates in the early and late RRT groups were 70.8% and 69.7%, respectively (P > 0.05). Early dialysis was not related to hospital mortality in a Cox proportional hazards model (P > 0.05). Patients with heart failure, male gender, higher admission creatinine, or surgery were more likely to be in the late RRT group. A Cox proportional hazards model adjusted for the propensity score for late RRT, including all patients, likewise showed that early dialysis was not related to hospital mortality. A further model matching patients 1:1 according to their propensity for late RRT also showed no difference in hospital mortality, with comparable demographic data between the matched groups (P > 0.05).

Conclusions

The sRIFLE classification poorly predicted any benefit of early versus late RRT in the context of septic AKI. In the future, more physiologically meaningful markers with which to determine the optimal timing of RRT initiation should be identified.

14.

Purpose

To determine whether there are unique patterns in the urine biochemistry profile of septic compared with non-septic acute kidney injury (AKI), and whether urinary biochemistry predicts worsening AKI, the need for renal replacement therapy, and mortality.

Materials and Methods

Prospective cohort study of critically ill patients with septic and non-septic AKI, defined by the RIFLE (Risk, Injury, Failure, Loss, End-Stage) criteria. Urine biochemistry parameters were compared between septic and non-septic AKI and were correlated with neutrophil gelatinase-associated lipocalin (NGAL), worsening AKI, renal replacement therapy (RRT), and mortality.

Results

Eighty-three patients were enrolled, 43 (51.8%) with sepsis. RIFLE class was not different between groups (P = .43). Urine sodium (UNa) < 20 mmol/L, fractional excretion of sodium (FeNa) < 1%, and fractional excretion of urea (FeU) < 35% were observed in 25.3%, 57.8%, and 33.7% of patients, respectively. Septic AKI had lower UNa compared with non-septic AKI (P = .04). There were no differences in FeNa or FeU between groups. Urine NGAL was higher for FeNa ≥ 1% compared to FeNa < 1% (177.4 ng/mL [31.9-956.5] vs 48.0 ng/mL [21.1-232.4], P = .04). FeNa showed low correlation with urine NGAL (P = .05) and plasma NGAL (P = .14). There was poor correlation between FeU and urine NGAL (P = .70) or plasma NGAL (P = .41). UNa, FeNa, and FeU showed poor discrimination for worsening AKI, RRT, and mortality.
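The fractional excretions used above are simple ratios of paired urine and plasma measurements. A minimal sketch of the standard formulas (function names are illustrative; the units cancel, so any consistent units work):

```python
def fe_na(urine_na, plasma_na, urine_cr, plasma_cr):
    """Fractional excretion of sodium (%):
    FeNa = (U_Na x P_Cr) / (P_Na x U_Cr) x 100.
    Values < 1% classically suggest a prerenal (sodium-avid) state."""
    return (urine_na * plasma_cr) / (plasma_na * urine_cr) * 100

def fe_urea(urine_urea, plasma_urea, urine_cr, plasma_cr):
    """Fractional excretion of urea (%):
    FeU = (U_urea x P_Cr) / (P_urea x U_Cr) x 100.
    Values < 35% are the threshold used in this study."""
    return (urine_urea * plasma_cr) / (plasma_urea * urine_cr) * 100
```

The study's point is that these derived indices, however easy to compute, discriminated poorly between septic and non-septic AKI and did not predict outcomes.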

Conclusion

Urine biochemical profiles do not discriminate septic from non-septic AKI. UNa, FeNa, and FeU do not reliably predict biomarker release, worsening AKI, RRT, or mortality. These data imply limited utility for these measures in clinical practice in critically ill patients with AKI.

15.

Introduction

Cystatin C (Cysc) could be affected by thyroid function both in vivo and in vitro and thereby may have limited ability to reflect renal function. We aimed to assess the association between Cysc and thyroid hormones as well as the effect of thyroid function on the diagnostic accuracy of Cysc to detect acute kidney injury (AKI).

Methods

A total of 446 consecutive intensive care unit (ICU) patients were screened for eligibility in this prospective AKI observational study. Serum Cysc, thyroid hormones and serum creatinine (Scr) were measured upon entry to the ICU. We also collected each patient's baseline characteristics including the Acute Physiology and Chronic Health Evaluation II (APACHE-II) score. The diagnostic performance of Cysc was assessed from the area under the receiver operator characteristic curve (AUC) in each quartile of thyroid hormone(s).

Results

A total of 114 (25.6%) patients had a clinical diagnosis of AKI upon entry to the ICU. Free thyroxine (FT4) values ranged from 4.77 to 39.57 pmol/L. Multivariate linear regression showed that age (standardized beta = 0.128, P < 0.0001), baseline Scr level (standardized beta = 0.290, P < 0.0001), current Scr (standardized beta = 0.453, P < 0.0001), albumin (standardized beta = -0.086, P = 0.006), and FT4 (standardized beta = 0.062, P = 0.039) were associated with Cysc. Patients were divided into four quartiles based on FT4 levels. The AUCs for Cysc in detecting AKI were 0.712 in quartile I, 0.754 in quartile II, 0.829 in quartile III and 0.797 in quartile IV, with no significant difference between any two quartiles (all P > 0.05). The optimal cut-off value of Cysc for diagnosing AKI increased across FT4 quartiles (1.15 mg/L in quartile I, 1.15 mg/L in quartile II, 1.35 mg/L in quartile III and 1.45 mg/L in quartile IV).

Conclusions

There was no significant impact of thyroid function on the diagnostic accuracy of Cysc to detect AKI in ICU patients. However, the optimal cut-off value of Cysc to detect AKI could be affected by thyroid function.

16.

Introduction

Elevated plasma B-type natriuretic peptide (BNP) levels in patients with critical sepsis (severe sepsis and septic shock) may indicate septic cardiomyopathy. However, multiple heterogeneous conditions may also be involved in increased BNP level. In addition, the prognostic value of BNP in sepsis remains debatable. In this study, we sought to discover potential independent determinants of BNP elevation in critical sepsis. The prognostic value of BNP was also evaluated.

Methods

In this observational study, we enrolled mechanically ventilated, critically septic patients requiring hemodynamic monitoring through a pulmonary artery catheter. All clinical, laboratory and survival data were prospectively collected. Plasma BNP concentrations were measured daily for five consecutive days. Septic cardiomyopathy was assessed on day 1 on the basis of left and right ventricular ejection fractions (EF) derived from echocardiography and thermodilution, respectively. Mortality was recorded at day 28.

Results

A total of 42 patients with severe sepsis (N = 12) or septic shock (N = 30) were ultimately enrolled. Daily BNP levels were significantly higher in septic shock patients than in those with severe sepsis (P ≤0.002). Critical illness severity (assessed by Acute Physiology and Chronic Health Evaluation II and maximum Sequential Organ Failure Assessment scores) and peak noradrenaline dose on day 1 were independent determinants of BNP elevation (P <0.05). Biventricular EFs were inversely correlated with longitudinal BNP measurements (P <0.05), but not independently. Pulmonary capillary wedge pressure (PCWP) and volume expansion showed no correlation with BNP. In septic shock, increased central venous pressure (CVP) and CVP/PCWP ratio were independently associated with early BNP values (P <0.05). Twenty-eight-day mortality was 47.6% (20 of 42 patients). Daily BNP values poorly predicted outcome; BNP on day 1 >800 pg/ml (the best cutoff point) fairly predicted mortality, with a sensitivity of 65%, a specificity of 64% and an area under the curve of 0.70 (95% confidence interval = 0.54 to 0.86; P = 0.03). Plasma BNP levels declined faster in survivors than in nonsurvivors in both critical sepsis and septic shock (P ≤0.002). In septic shock, a BNP/CVP ratio >126 pg/mmHg/ml on day 2 and an inability to reduce BNP below 500 pg/ml implied increased mortality (P ≤0.036).
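Sensitivity and specificity at a fixed cutoff, such as the day-1 BNP threshold of 800 pg/ml used here, reduce to simple counts over the outcome labels. A minimal sketch with hypothetical data (the function name and example values are illustrative, not the study's dataset):

```python
def sensitivity_specificity(values, outcomes, cutoff):
    """Classify each value as test-positive if it exceeds `cutoff`;
    `outcomes` holds True when the event occurred (e.g. 28-day death).

    Returns (sensitivity, specificity)."""
    tp = sum(1 for v, o in zip(values, outcomes) if v > cutoff and o)
    fn = sum(1 for v, o in zip(values, outcomes) if v <= cutoff and o)
    tn = sum(1 for v, o in zip(values, outcomes) if v <= cutoff and not o)
    fp = sum(1 for v, o in zip(values, outcomes) if v > cutoff and not o)
    # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)
    return tp / (tp + fn), tn / (tn + fp)
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 − specificity yields the ROC curve whose area (0.70 here) summarizes discrimination.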

Conclusions

The severity of critical illness, rather than septic cardiomyopathy, is probably the major determinant of BNP elevation in patients with critical sepsis. Daily BNP values are of limited value in predicting 28-day mortality; however, a rapid decline in BNP over time and a decrease in BNP below 500 pg/ml may imply a favorable outcome.

17.
Renal replacement therapy can be applied either intermittently or continuously in severe acute kidney injury. To date, no modality has been shown to consistently improve patient survival. In the study recently reported by Sun and colleagues, continuous application of renal replacement therapy was associated with improved renal recovery, defined as a lower risk of long-term need for chronic dialysis. This association between nonrecovery and intermittent renal replacement therapy may be explained by a higher rate of hypotensive episodes and a lower capacity for fluid removal during the first 72 hours of therapy. Altogether, this study adds to the growing body of evidence suggesting an improved likelihood of recovery of kidney function with continuous renal replacement modalities in critically ill survivors of AKI.

In recent years, there has been increased interest in the long-term outcomes of patients who survive an episode of critical illness. For survivors who experienced severe acute kidney injury (AKI) during their critical illness, renal recovery is of utmost importance. Indeed, nonrecovery or incomplete recovery of renal function can translate into a need for long-term dialysis – a treatment associated with low quality of life and a major burden for healthcare systems.

In the previous issue of Critical Care, Sun and colleagues compared the outcomes of 145 patients who required renal replacement therapy (RRT) for sepsis-related AKI [1]. Their findings suggest that recovery of kidney function to dialysis independence at 60 days was strongly associated with the initial RRT modality. Indeed, application of RRT in a continuous fashion (continuous venovenous hemodiafiltration (CVVHDF)) was associated with a higher rate of renal recovery than its application in a prolonged intermittent fashion (extended daily hemofiltration (EDHF)).
After accounting for relevant confounding variables in multivariable analysis, initial treatment with CVVHDF was associated with significantly (3.8-fold) higher odds of recovery of kidney function compared with initial therapy with EDHF. This difference was evident despite the fact that patients receiving CVVHDF had significantly lower initial mean arterial pressures, more oliguria and lower serum pH at the time of RRT initiation. There was no difference in adjusted mortality rates between the two modalities.

The findings of Sun and colleagues are consistent with those of several large cohort studies [2-4] in which higher rates of recovery to dialysis independence were found in survivors of critical illness complicated by AKI who were initially treated with continuous renal replacement therapy (CRRT) rather than intermittent renal replacement therapy (IRRT). In a systematic review including 50 studies reporting dialysis dependence in AKI survivors [5], IRRT as an initial modality was associated with a 1.7 times greater risk of dialysis dependence compared with CRRT (odds ratio, 1.73; 95% confidence interval, 1.35 to 1.68). However, these results are susceptible to treatment allocation bias, as the effect was largely driven by observational studies and was nonsignificant when the analysis was restricted to randomized controlled trials (odds ratio, 1.15; 95% confidence interval, 0.73 to 1.68; n = 7). Recently, a large, population-based Canadian study (not included in the meta-analysis) similarly compared renal recovery among survivors of severe AKI according to initial RRT modality, including a propensity-matched analysis to adjust for treatment allocation [6].
In this study, initial treatment with IRRT, compared with CRRT, was also associated with a significantly higher likelihood of dialysis dependence at 90 days (21% vs 16%) and during long-term follow-up (27% vs 21%).

One shortcoming of these studies was that their data were sourced from administrative databases or population registries, which could not provide patient-level detail. The study by Sun and colleagues therefore provides further insight and granularity on patient characteristics at the time of RRT initiation and details of treatment provision [1]. Their data offer plausible explanations for the higher rate of renal recovery associated with initial treatment with CRRT.

First, use of CRRT was associated with a trend toward fewer episodes of hypotension (15% vs 26%, P = 0.112) and a higher average mean arterial pressure during the first 72 hours of therapy (89.7 mmHg vs 83.8 mmHg, P = 0.137) compared with IRRT. Although these differences failed to reach statistical significance, they are clinically important. The greater frequency of hypotension with IRRT is presumably due to the higher hourly ultrafiltration rate needed to achieve fluid homeostasis targets compared with CRRT (241 ml/hour in the EDHF group vs 149 ml/hour in the CVVHDF group, P <0.001). Indeed, Conger and colleagues long ago established that autoregulation of renal blood flow is lost in AKI and that hypotension contributes to further histological damage [7,8]. The increased occurrence of iatrogenic hypotension induced by IRRT to achieve fluid removal targets can therefore be expected to contribute to a delayed or reduced likelihood of renal recovery.

Second, despite the higher hourly ultrafiltration rate, use of EDHF was associated with a lower ability to remove fluid during the first 72 hours, as demonstrated by the net negative fluid balance achieved in the CVVHDF group but not in the EDHF group (–0.46 l vs +0.15 l, P = 0.019).
The capacity to safely remove fluid in critically ill oligoanuric patients is thus more limited with IRRT than with CRRT. Fluid accumulation and overload are now recognized as important complications of critical illness and are associated with adverse outcomes [9,10] and reduced renal recovery [10,11]. A direct causal relationship between positive fluid balance and worse renal outcome has yet to be established; these factors may be related to increased pressure exerted by extravascular fluid within an encapsulated organ [12].

Finally, the continuous application of RRT mathematically translated into almost double the delivered RRT dose during the first 72 hours (replacement flow: 4.64 l/day in the CVVHDF group vs 2.65 l/day in the EDHF group, P <0.001). Whether this higher and more consistently delivered dose early in a patient's course of critical illness translates into better metabolic homeostasis remains speculative and should be explored in randomized trials, although prior trials have shown no significant impact of delivered dose on survival or recovery [13-15]. The role of enhanced clearance of inflammatory molecules in renal recovery also remains to be evaluated.

Overall, the study by Sun and colleagues adds to the growing body of evidence suggesting that initial treatment with CRRT, compared with IRRT, may confer an increased likelihood of renal recovery in critically ill patients with AKI. The physiologic reasoning is biologically plausible: CRRT is associated with better hemodynamic stability through fewer episodes of hypotension and improved fluid homeostasis. In the absence of a suitably powered randomized trial with renal recovery as a primary endpoint, the evidence on the superiority of initial treatment with CRRT with renal recovery in mind will continue to be derived from observational data such as these.
However, given the current weight of evidence suggesting better recovery with CRRT as the initial therapy, physicians should seriously consider moving away from potentially deleterious therapies before proof of their non-inferiority is established. While the lower healthcare costs associated with intermittent hemodialysis are often cited to advocate its primary use in critically ill patients with AKI, formal economic analyses that take long-term dialysis costs into account are needed to establish the true cost of IRRT in the ICU.
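The dialysis-dependence odds ratios and confidence intervals quoted in this commentary come from 2×2 tables of outcome by initial modality. A minimal sketch using the standard Woolf (log) method, with hypothetical counts loosely mirroring the 21% vs 16% dependence rates (not the actual study tables):

```python
# Sketch: odds ratio and Woolf 95% confidence interval from a 2x2 table of
# dialysis dependence by initial RRT modality. Counts are hypothetical.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
    a = IRRT dialysis-dependent, b = IRRT independent,
    c = CRRT dialysis-dependent, d = CRRT independent."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 21 of 100 IRRT vs 16 of 100 CRRT survivors dialysis-dependent
or_, lo, hi = odds_ratio_ci(21, 79, 16, 84)
```

Note that with these small hypothetical counts the interval crosses 1, illustrating why the observational point estimates above need the much larger cohorts cited to reach significance.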

18.

Introduction

In a previous report, we demonstrated a favorable trend for supplementation with antithrombin (AT) concentrate at a dosage of 3,000 IU/day over 1,500 IU/day for the treatment of sepsis-associated disseminated intravascular coagulation (DIC) in patients with an AT activity of 70% or less. Because the survival difference did not reach statistical significance, we planned to examine the effects in a larger number of patients with more severe disease.

Methods

We performed a non-randomized multi-institutional survey. In total, 307 patients with septic DIC who had an AT activity of less than 40% and had undergone AT substitution at a dose of either 1,500 IU/day or 3,000 IU/day for three consecutive days were analyzed. Of these, 259 patients received 1,500 IU/day (AT1500 group) and 48 patients received 3,000 IU/day (AT3000 group). The primary efficacy endpoints were recovery from DIC by day 7 and all-cause mortality at day 28. Adverse bleeding events were also examined. A logistic regression analysis was conducted using age, sex, body weight, initial AT activity, DIC score, platelet count, coadministration of heparin, recombinant thrombomodulin, suspected source of infection, surgery, and supplemented AT dose as covariates.

Results

Supplementation decreased the DIC score significantly more in the AT3000 group, leading to a superior rate of DIC resolution compared with the AT1500 group (66.7% versus 45.2%, P = 0.007). In addition, the AT3000 group exhibited better survival than the AT1500 group (77.1% versus 56.4%, P = 0.010). Bleeding events were observed in 6.96% of the AT1500 group (severe bleeding, 3.04%) and 6.52% of the AT3000 group (severe bleeding, 4.35%) (P = 1.000; severe bleeding, P = 0.648). Logistic regression analysis revealed that use of AT3000 (odds ratio (OR), 2.419; P = 0.025), a higher initial platelet count (OR, 1.054; P = 0.027), and patient age (OR per year, 0.977; P = 0.045) were significantly associated with survival.

Conclusions

The AT3000 group exhibited significantly improved rates of survival and recovery from DIC, without an increased risk of bleeding, compared with the AT1500 group among patients with sepsis-associated DIC and an AT activity of less than 40%.

19.

Introduction

We sought to investigate whether treatment of subnormal (<70%) central venous oxygen saturation (ScvO2) with inotropes or red blood cell (RBC) transfusion during early goal-directed therapy (EGDT) for septic shock is independently associated with in-hospital mortality.

Methods

We retrospectively analyzed a prospective EGDT patient database drawn from 21 emergency departments using a single standardized EGDT protocol. Patients were included if, during EGDT, they concomitantly achieved a central venous pressure (CVP) of ≥8 mm Hg and a mean arterial pressure (MAP) of ≥65 mm Hg while registering an ScvO2 <70%. Treatment propensity scores for RBC transfusion and for inotrope administration were determined separately from independent patient sub-cohorts. Propensity-adjusted logistic regression analyses were conducted to test for associations between treatments and in-hospital mortality.

Results

Of 2,595 EGDT patients, 572 (22.0%) met the study inclusion criteria. The overall in-hospital mortality rate was 20.5%. Inotropes or RBC transfusions were administered for an ScvO2 <70% in 51.9% of patients. Patients were not statistically more likely to achieve an ScvO2 of ≥70% if treated with RBC transfusion alone (29/59, 49.2%, P = 0.19), inotropic therapy alone (104/226, 46.0%, P = 0.15), or both RBC transfusion and inotropic therapy (7/12, 58.3%, P = 0.23), as compared with no therapy (108/275, 39.3%). After adjustment for treatment propensity score, RBC transfusion was associated with a decreased adjusted odds ratio (aOR) of in-hospital mortality among patients with hemoglobin values less than 10 g/dL (aOR 0.42, 95% CI 0.18 to 0.97, P = 0.04), while inotropic therapy was not associated with in-hospital mortality among patients with hemoglobin values of 10 g/dL or greater (aOR 1.16, 95% CI 0.69 to 1.96, P = 0.57).
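One common way to realize a propensity-adjusted comparison like the one described is to stratify patients on the propensity score and pool the stratum-specific 2×2 tables with the Mantel-Haenszel odds ratio. A generic sketch with hypothetical strata, not the authors' exact regression model:

```python
# Sketch: Mantel-Haenszel pooled odds ratio across propensity-score strata.
# The two strata below (low/high propensity of transfusion) are hypothetical.
def mantel_haenszel_or(strata):
    """strata: list of (a, b, c, d) 2x2 tables per propensity stratum, where
    a = treated deaths, b = treated survivors,
    c = untreated deaths, d = untreated survivors."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

strata = [(4, 36, 12, 48),   # low-propensity stratum
          (10, 30, 20, 40)]  # high-propensity stratum
pooled_or = mantel_haenszel_or(strata)   # < 1 here: treatment looks protective
```

Stratifying before pooling is what keeps confounding by treatment propensity from distorting the crude treatment-mortality association.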

Conclusions

Among patients with septic shock treated with EGDT in the setting of subnormal ScvO2 values despite meeting CVP and MAP target goals, treatment with RBC transfusion may be independently associated with decreased in-hospital mortality.

20.

Introduction

Early protein and energy feeding in critically ill patients is heavily debated, and early protein feeding in particular has hardly been studied.

Methods

This study used a prospective database of mixed medical-surgical critically ill patients with prolonged mechanical ventilation (>72 hours) and measured energy expenditure. Logistic regression analysis was used to analyse the relation of day-4 protein intake group (cutoffs 0.8, 1.0, and 1.2 g/kg), energy overfeeding (energy intake/measured energy expenditure ratio >1.1), and an admission diagnosis of sepsis with hospital mortality, after adjustment for the APACHE II (Acute Physiology and Chronic Health Evaluation II) score.
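The grouping and overfeeding definitions above translate directly into code. A minimal sketch; the function names, return labels, and boundary handling at the exact cutoffs are our own assumptions, since the abstract does not state them:

```python
# Sketch of the study's exposure definitions: day-4 protein intake group
# (cutoffs 0.8, 1.0, 1.2 g/kg) and energy overfeeding (ratio > 1.1).
# Boundary handling at exact cutoff values is an assumption.
def protein_group(intake_g_per_kg):
    """Ordinal day-4 protein intake group using the 0.8/1.0/1.2 g/kg cutoffs."""
    if intake_g_per_kg < 0.8:
        return "<0.8"
    if intake_g_per_kg < 1.0:
        return "0.8-1.0"
    if intake_g_per_kg < 1.2:
        return "1.0-1.2"
    return ">=1.2"

def energy_overfed(energy_intake_kcal, measured_expenditure_kcal):
    """Overfeeding as defined in the study: intake/expenditure ratio > 1.1."""
    return energy_intake_kcal / measured_expenditure_kcal > 1.1

group = protein_group(1.25)            # ">=1.2" category
overfed = energy_overfed(2300, 2000)   # 2300/2000 = 1.15 > 1.1, so True
```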

Results

A total of 843 patients were included, of whom 117 had sepsis. Of the 736 non-septic patients, 307 were overfed. Mean day-4 protein intake was 1.0 g/kg pre-admission weight per day, and hospital mortality was 36%. In the total cohort, day-4 protein intake group (odds ratio (OR) 0.85; 95% confidence interval (CI) 0.73 to 0.99; P = 0.047), energy overfeeding (OR 1.62; 95% CI 1.07 to 2.44; P = 0.022), and sepsis (OR 1.77; 95% CI 1.18 to 2.65; P = 0.005) were independent risk factors for mortality in addition to APACHE II score. In patients with sepsis or energy overfeeding, day-4 protein intake was not associated with mortality. For non-septic, non-overfed patients (n = 419), mortality decreased with higher protein intake group: 37% for <0.8 g/kg, 35% for 0.8 to 1.0 g/kg, 27% for 1.0 to 1.2 g/kg, and 19% for ≥1.2 g/kg (P = 0.033). In these patients, a protein intake of ≥1.2 g/kg was significantly associated with lower mortality (OR 0.42, 95% CI 0.21 to 0.83, P = 0.013).

Conclusions

In non-septic critically ill patients, early high protein intake was associated with lower mortality, and early energy overfeeding with higher mortality. In septic patients, early high protein intake had no beneficial effect on mortality.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-014-0701-z) contains supplementary material, which is available to authorized users.
