Similar articles

20 similar articles found.
1.
2.

Introduction

The use of standard doses of β-lactam antibiotics during continuous renal replacement therapy (CRRT) may result in inadequate serum concentrations. The aim of this study was to evaluate the adequacy of unadjusted drug regimens (i.e., similar to those used in patients with normal renal function) in patients treated with CRRT and the influence of CRRT intensity on drug clearance.

Methods

We reviewed data from 50 consecutive adult patients admitted to our Department of Intensive Care in whom routine therapeutic drug monitoring (TDM) of broad-spectrum β-lactam antibiotics (ceftazidime or cefepime, CEF; piperacillin/tazobactam, TZP; meropenem, MEM) was performed using unadjusted regimens (CEF = 2 g q8h; TZP = 4 g q6h; MEM = 1 g q8h). Serum drug concentrations were measured twice during the elimination phase by high-performance liquid chromatography (HPLC-UV). Therapy was considered adequate when serum drug concentrations remained between 4 and 8 times the minimal inhibitory concentration (MIC) of Pseudomonas aeruginosa for the optimal fraction of the dosing interval for each drug (≥70% for CEF, ≥50% for TZP, ≥40% for MEM). TDM was classified as early phase (ET) if performed within 48 hours of antibiotic initiation and as late phase (LT) thereafter.
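For orientation only, the sketch below shows one simple way such a threshold check can be computed: the fraction of a dosing interval spent above a target concentration (for example, 4 × MIC), assuming mono-exponential elimination fitted through two elimination-phase samples. The drug, MIC, concentrations, and sampling times are hypothetical; the study's own calculation method is not reproduced here.

```python
import math

def fraction_above_target(c1, c2, t1, t2, tau, target):
    """Fraction of the dosing interval tau (h) spent above `target` (mg/L),
    assuming mono-exponential (first-order) elimination fitted through two
    elimination-phase samples: c1 at t1 and c2 at t2 (hours after the dose)."""
    ke = math.log(c1 / c2) / (t2 - t1)      # elimination rate constant (1/h)
    c0 = c1 * math.exp(ke * t1)             # back-extrapolated concentration at t = 0
    if c0 <= target:
        return 0.0
    t_above = math.log(c0 / target) / ke    # time for the concentration to decay to the target
    return min(t_above, tau) / tau

# Hypothetical example: meropenem 1 g q8h, samples at 2 h and 6 h,
# target = 4 x MIC for P. aeruginosa (assumed MIC 2 mg/L, so target 8 mg/L).
ft = fraction_above_target(c1=24.0, c2=6.0, t1=2.0, t2=6.0, tau=8.0, target=8.0)
print(f"fT > 4 x MIC = {ft:.0%} (adequate for MEM if >= 40% of the interval)")
```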

Results

We collected 73 serum samples from 50 patients (age 58 ± 13 years; Acute Physiology and Chronic Health Evaluation II (APACHE II) score on admission 21 (17–25)): 35 during ET and 38 during LT. Drug concentrations were above 4 times the MIC in 63 samples (90%) but above 8 times the MIC in only 39 samples (53%). The proportions of patients with adequate drug concentrations were similar during ET and LT. We found a weak but significant correlation between β-lactam antibiotic clearance and CRRT intensity.

Conclusions

In septic patients undergoing CRRT, doses of β-lactam antibiotics similar to those given to patients with normal renal function achieved drug levels above the target threshold in 90% of samples. Nevertheless, 53% of samples were associated with very high drug levels, and daily drug regimens may need to be adapted accordingly.

3.
Continuous renal replacement therapy (CRRT) for pediatric acute kidney dysfunction has evolved in recent decades; however, little objective data exist on the complications associated with CRRT. Santiago and colleagues are among the first to document four categories of CRRT-related complications in critically ill children with acute kidney dysfunction: catheter insertion complications, hypotension, hemorrhage, and electrolyte disturbances. They reported that electrolyte disturbance (50.6%) and hypotension at connection (41.3%) were the leading complications. Although the study is limited by its small sample size and the outcome variables measured, it is an important first step in assessing the outcomes of CRRT in children. A prospective multicenter randomized trial will be needed to fully delineate the complications and define the risk/benefit ratio of CRRT in children.

4.
Purpose

The purpose of our study was to investigate the timing of continuous renal replacement therapy (CRRT), based on the interval between the start of early goal-directed therapy (EGDT) and CRRT initiation, and to ascertain whether this timing was an independent predictor of mortality in patients with septic acute kidney injury (AKI).

Materials and methods

An observational retrospective cohort study was conducted of 60 patients (>18 years old) who had been admitted to the emergency department, received resuscitation according to the standard EGDT algorithm for severe sepsis and septic shock, and were treated with CRRT for septic AKI between June 2008 and February 2013 at a tertiary hospital in Seoul, Korea. The patients were divided into two groups based on the median interval between the start of EGDT and the commencement of CRRT. The main outcome was 28-day all-cause mortality, and a multivariate Cox analysis was used to evaluate the independent impact of early CRRT.

Results

The mean patient age was 66.3 years, and 52 patients (86.7%) were male. The most common comorbid disease was diabetes mellitus (35.0%), followed by malignancy (26.7%). The median interval between the start of EGDT and the commencement of CRRT was 26.4 hours. During the study period, 28-day mortality was 43.3% (26 of 60 patients). The 28-day all-cause mortality rate was significantly higher in the late CRRT group than in the early CRRT group (56.7% vs 30.0%, P = .037). Furthermore, the higher mortality risk in the late group remained significant even after adjusting for diabetes mellitus, liver failure, and Acute Physiology and Chronic Health Evaluation II score (hazard ratio, 2.461; 95% confidence interval, 1.044–5.800; P = .026).

Conclusion

Early initiation of CRRT may be of benefit. Given the complex nature of this intervention and the ongoing controversy regarding early versus late initiation of therapy in acute and chronic settings, well-designed clinical trials are needed to provide definitive answers.
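As a purely illustrative sketch of the kind of multivariate Cox analysis described above, the snippet below fits a proportional hazards model on a small synthetic dataset using the lifelines package (an assumed tool; the abstract does not state which software was used). The covariate names echo the abstract, but all values are invented, and the actual model also adjusted for liver failure.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Purely synthetic toy data -- none of these values come from the study.
# time = days to death or censoring (capped at 28), death = event indicator,
# late_crrt = 1 if CRRT started after the median 26.4-h interval from EGDT start.
df = pd.DataFrame({
    "time":      [28, 5, 14, 12, 20, 28, 3, 28, 9, 28],
    "death":     [0, 1, 1, 1, 1, 0, 1, 0, 1, 0],
    "late_crrt": [0, 1, 0, 1, 1, 0, 1, 0, 1, 1],
    "apache_ii": [18, 29, 22, 25, 27, 20, 31, 17, 24, 22],
})

# A small ridge penalty keeps the fit stable on such a tiny illustrative dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="time", event_col="death")
cph.print_summary()  # exp(coef) for late_crrt is the adjusted hazard ratio
```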

5.

Purpose

In acute kidney injury patients, metabolic acidosis is common. Its severity, duration, and associated changes in mean arterial pressure (MAP) and vasopressor therapy may be affected by the intensity of continuous renal replacement therapy (CRRT). We aimed to compare key aspects of acidosis, MAP, and vasopressor therapy in patients treated with two different CRRT intensities.

Methods

We studied a nested cohort of 115 patients treated with lower-intensity (LI) or higher-intensity (HI) CRRT in two tertiary intensive care units (ICUs) participating in a large multicenter randomized controlled trial.

Results

Levels of metabolic acidosis at randomization were similar [base excess (BE) of −8 ± 8 vs. −8 ± 7 mEq/l; p = 0.76]. Speed of BE correction did not differ between the two groups. However, the HI group had a greater increase in MAP from baseline to 24 h (7 ± 3 vs. 0 ± 3 mmHg; p < 0.01) and a greater decrease in norepinephrine dose (from 12.5 to 3.5 vs. 5 to 2.5 μg/min; p < 0.05). The correlation (r) coefficients between absolute change in MAP and norepinephrine (NE) dose versus change in BE were 0.05 and −0.37, respectively.

Conclusions

Overall, LI and HI CRRT have similar acid–base effects in patients with acidosis. However, HI was associated with greater improvements in MAP and vasopressor requirements (clinical trial no. NCT00221013).

6.
7.
8.
Objectives

Critical illness increases the tendency to both coagulation and bleeding, complicating anticoagulation for continuous renal replacement therapy (CRRT). We analyzed strategies for anticoagulation in CRRT with respect to implementation, efficacy, and safety in order to provide evidence-based recommendations for clinical practice.

Methods

We carried out a systematic review of the literature published before June 2005. Studies were rated at five levels to create recommendation grades from A to E, A being the highest. Grades are labeled with a minus if the study design was limited by size or comparability of groups. Data extracted were those on implementation, efficacy (circuit survival), safety (bleeding), and monitoring of anticoagulation.

Results

Owing to the quality of the studies, recommendation grades are low. If bleeding risk is not increased, unfractionated heparin (activated partial thromboplastin time, APTT, 1–1.4 times normal) or low-molecular-weight heparin (anti-Xa 0.25–0.35 IU/ml) is recommended (grade E). If facilities are adequate, regional anticoagulation with citrate may be preferred (grade C). If bleeding risk is increased, anticoagulation with citrate is recommended (grade D). CRRT without anticoagulation can be considered when coagulopathy is present (grade D). If clotting tendency is increased, predilution or the addition of prostaglandins to heparin may be helpful (grade C).

Conclusion

Anticoagulation for CRRT must be tailored to patient characteristics and local facilities. The implementation of regional anticoagulation with citrate is worthwhile to reduce bleeding risk. Future trials should be randomized and should have sufficient power and well-defined endpoints to compensate for the complexity of critical illness-related pro- and anticoagulant forces. An international consensus to define clinical endpoints is advocated.

9.
Correct antibiotic treatment is of utmost importance for infections in critically ill patients, not only in terms of spectrum and timing but also in terms of dosing. However, this is a real challenge for the clinician because the pathophysiology (such as shock, augmented renal clearance, and multiple organ dysfunction) has a major impact on the pharmacokinetics of hydrophilic antibiotics. The presence of extracorporeal circuits, such as continuous renal replacement therapy, may further complicate this difficult exercise. Standard dosing may result in inadequate concentrations, but unadjusted dosing regimens may lead to toxicity. Recent studies confirm the variability in concentrations, and the wide variation in dialysis techniques used certainly contributes to these findings. Well-designed clinical studies are needed to provide the data from which robust dosing guidance can be developed. In the meantime, non-adjusted dosing in the first 1 to 2 days of antibiotic therapy during continuous renal replacement therapy, followed by dose reduction later on, seems to be a prudent approach.

This commentary discusses the findings of Beumier and colleagues [1] on beta-lactam concentrations during continuous renal replacement therapy (CRRT). In recent years, the pharmacokinetics (PK) of antibiotics in critically ill patients has been studied intensely. Standard antibiotic dosing apparently results in highly variable antibiotic concentrations in the blood and tissues. A recent multicenter point-prevalence study found that 20% of patients are underdosed (when aiming for the most conservative target) and could link antibiotic underexposure to worse clinical outcome [2]. The causes of this variability in effective antibiotic concentration are multiple, but an increased volume of distribution and enhanced elimination from the circulation, mostly via the kidneys in patients with augmented renal clearance, are the most important contributors.

Antibiotic dosing is also challenging when the kidneys fail. When kidney function deteriorates, standard dosing may lead to drug accumulation, and antibiotic doses are therefore often adapted to kidney function. Likewise, when renal replacement therapy (RRT) is indicated, antibiotic doses are often decreased.

RRT does, however, affect antibiotic concentrations, as antibiotics are also eliminated from the circulation. The rate at which this occurs depends on the physicochemical characteristics of the drug and the type and intensity of RRT used, among other factors. Antibiotics with a small volume of distribution and low protein binding will be affected more. It can be expected that continuous techniques are more efficient in removing antibiotics from the blood.

Several studies have addressed this issue in recent years. Seyler and colleagues [3] found that patients treated with CRRT had overall low target attainment in the early phase of therapy when recommended doses are used and when four times the minimum inhibitory concentration (MIC) of the least susceptible microorganism is aimed for. Later during therapy, concentrations were higher for meropenem, piperacillin, and ceftazidime; in all patients, these drugs were administered as intermittent infusions. Bauer and colleagues [4] found important interindividual variability in piperacillin concentrations and that CRRT dose was the only factor associated with piperacillin elimination. This variability was confirmed by Roberts and colleagues [5] in a similar patient cohort, with both under- and overdosing occurring in about 10% of patients, depending on the PK/pharmacodynamic (PD) target considered. In all of these studies, the variability of the antibiotic doses administered is striking; on the one hand, this reflects the clinician's uncertainty about how RRT may affect antibiotic concentrations but, on the other hand, it adds a level of complexity when it comes to analyzing target attainment across studies. Furthermore, the intervals between the start of antibiotic therapy and sampling, as well as between RRT initiation and sampling, are additional confounders when target attainment is analyzed in these studies. In a systematic review, Vaara and colleagues [6] found the overall quality of studies investigating the effect of RRT to be moderate, as many important RRT parameters were often not reported.

In this issue of Critical Care, Beumier and colleagues [1] report that standard, unadjusted dosing results in very high concentrations in about half of the patients. Although the study includes a large number of patients and reflects clinical practice in their intensive care unit, there may have been selection bias, with only patients at risk selected for therapeutic drug monitoring (TDM). Also, the timing was variable, and it is not clear whether patients who were treated with antibiotics long before RRT was initiated had an increased risk of overdosing as defined in this study.

With these new data, drawing firm conclusions regarding antibiotic dosing during RRT remains difficult. However, it can be concluded that higher-than-standard dosing in this patient population is probably unwarranted. From previous studies, it seems that underdosing is more common during the first days of treatment and that overdosing may occur later (48 hours or later) when doses are not adjusted. This is probably because factors other than RRT, such as an enlarged volume of distribution, determine the concentrations in the first 24 to 48 hours of therapy, whereas elimination determines the subsequent concentrations. Furthermore, it should be kept in mind that the susceptibility of the microorganism is equally important; target attainment is at risk only when borderline-susceptible microorganisms with higher-than-average MICs are involved.

Clearly, more research using well-designed studies with prospective data collection is needed. Moreover, the impact of other RRT strategies, such as intermittent hemodialysis and sustained low-efficiency daily dialysis, needs our urgent attention. In the meantime, optimizing antibiotic therapy is as essential for patients undergoing CRRT as for other patients. Though rarely suggested in CRRT, prolonged and extended infusion may be sufficient to overcome underdosing in many patients. In difficult cases, TDM, when available, could be a solution, although many questions surrounding the application of TDM remain unanswered. Most importantly, it is not clear what PK/PD target should be aimed for to treat infections during critical illness, but TDM could at least detect extremes of concentration and allow dose adaptation in cases of severe under- and overdosing.

10.
Introduction

Intravenous (IV) voriconazole is not recommended in patients with creatinine clearance <50 ml/min, to avoid potentially toxic accumulation of sulfobutylether-β-cyclodextrin (SBECD). The purpose of this study was to evaluate the pharmacokinetics of SBECD, voriconazole, and voriconazole N-oxide in critically ill patients undergoing continuous renal replacement therapy (CRRT) and to determine whether CRRT removes SBECD sufficiently to allow the use of IV voriconazole without significant risk of SBECD accumulation.

Methods

This prospective, open-label pharmacokinetic study enrolled patients >18 years old receiving IV voriconazole for a known or suspected invasive fungal infection while undergoing CRRT. Serial blood and effluent samples were collected on days 1, 3, 5, 7, and every 3 to 5 days thereafter. SBECD, voriconazole, and voriconazole N-oxide plasma and effluent concentrations were measured by liquid chromatography-tandem mass spectrometry. Pharmacokinetic, pharmacodynamic, and pharmacogenetic analyses were conducted.

Results

Ten patients (mean ± standard deviation (SD) 53 ± 11 years old, 50% male, 81 ± 14 kg, with Acute Physiology and Chronic Health Evaluation II (APACHE II) scores of 31.5 ± 3.8) were evaluated. All patients underwent continuous venovenous hemofiltration (CVVH) with a median predilution replacement fluid rate of 36 (interquartile range (IQR) 32 to 37) ml/kg/hr and a total ultrafiltration rate of 38 (IQR 34 to 39) ml/kg/hr. Mean ± SD voriconazole and SBECD dosages administered were 8.1 ± 2.1 mg/kg/day and 129 ± 33 mg/kg/day, respectively. Voriconazole plasma trough concentrations were >1 mg/L in all patients, with CVVH accounting for only 15% of total body clearance. CVVH accounted for 86% of the total body clearance of SBECD, with the majority of the dose recovered in the effluent. Only minimal increases in dose-normalized SBECD area under the concentration-time curve from 0 to 12 hours (AUC0-12) (4,484 ± 4,368 to 4,553 ± 2,880 mg·hr/L; P = 0.97) were observed after study day 1.

Conclusions

CVVH effectively removed SBECD at a rate similar to the ultrafiltration rate. Voriconazole clearance by CVVH was not clinically significant. Standard dosages of IV voriconazole can be used in patients undergoing CVVH without significant risk of SBECD accumulation.
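As background, extracorporeal clearance during CVVH is often estimated from paired plasma and effluent concentrations as the sieving coefficient multiplied by the effluent (ultrafiltration) rate, with a dilution correction when replacement fluid is given in predilution. The sketch below illustrates that textbook relationship only; the concentrations and flow rates are hypothetical and do not reproduce the study's analysis.

```python
def cvvh_clearance(c_effluent, c_plasma, q_uf, q_blood=None, q_predilution=0.0):
    """Extracorporeal clearance (mL/min) during CVVH.

    CL = sieving coefficient (C_effluent / C_plasma) x ultrafiltration rate.
    With predilution replacement fluid, delivered clearance is reduced by the
    factor Q_blood / (Q_blood + Q_predilution). All flows are in mL/min."""
    sieving = c_effluent / c_plasma
    clearance = sieving * q_uf
    if q_blood and q_predilution:
        clearance *= q_blood / (q_blood + q_predilution)
    return clearance

# Hypothetical numbers: a freely filtered solute such as SBECD (sieving close
# to 1) is cleared at roughly the ultrafiltration rate, as reported above.
print(f"{cvvh_clearance(c_effluent=48.0, c_plasma=50.0, q_uf=40.0):.1f} mL/min")
```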

Trial registration

ClinicalTrials.gov NCT01101386. Registered 6 April 2010.

Electronic supplementary material

The online version of this article (doi:10.1186/s13054-015-0753-8) contains supplementary material, which is available to authorized users.

11.
12.
Heparin is the most commonly prescribed anticoagulant for continuous renal replacement therapy. There is, however, increasing evidence questioning its safety, particularly in the critically ill. Heparin confers its anticoagulant effect mainly by binding to antithrombin. However, heparin also binds to numerous other proteins and cells, compromising its efficacy and safety. Owing to antithrombin consumption and degradation, and to the binding of heparin to acute-phase proteins and to apoptotic and necrotic cells, critical illness confers heparin resistance. The nonspecific binding of heparin further leads to unpredictable interference with inflammatory pathways, the microcirculation, and the phagocytic clearance of dead cells, with possibly deleterious consequences for patients with sepsis and systemic inflammation. Regional anticoagulation with citrate does not increase the patient's risk of bleeding. The benefits of citrate further include a longer or similar circuit life and possibly better patient and kidney survival; this needs to be confirmed in larger randomized controlled multicenter trials. The use of citrate might be associated with less inflammation and has useful bioenergetic implications. With inadequate use, however, citrate can cause metabolic derangements. The full advantages of citrate can be realized only if its risks are well controlled. These observations suggest a greater role for citrate.

13.
14.
15.
Objectives

The study aimed to characterize the pharmacokinetics (PK) of four β-lactams (piperacillin, ceftazidime, cefepime, and meropenem) in patients comedicated with amikacin (AMK), and to confirm the predictive performance of AMK data obtained from therapeutic drug monitoring (TDM) on these PK, using a population modeling approach.

Design and methods

Serum samples were collected from 88 critically ill septic patients. For each β-lactam, the covariate model was optimized using renal function. Furthermore, the predictive performance of AMK concentrations and PK parameters for β-lactam PK was assessed.

Results

A two-compartment model with first-order elimination best fitted the β-lactam data. The results supported the superiority of AMK concentrations, over renal function and AMK PK parameters, for predicting β-lactam PK.

Conclusion

The study confirmed the significant link between exposure to AMK and to β-lactams, and presented population models able to guide β-lactam dosage adjustments using renal biomarkers or TDM-derived aminoglycoside data.
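To make the structural model concrete, the sketch below simulates serum concentrations from a generic two-compartment model with first-order elimination after an intravenous bolus. The parameter values, dose, and bolus assumption are illustrative only and are not the population estimates reported by the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-compartment model with first-order elimination.
# All parameter values are hypothetical, not the study's population estimates.
CL, V1, Q, V2 = 6.0, 12.0, 8.0, 15.0   # clearance (L/h), central volume (L),
                                        # intercompartmental clearance (L/h), peripheral volume (L)
dose = 4000.0                           # mg, given as an IV bolus for simplicity

def two_compartment(t, A):
    A1, A2 = A                          # drug amount (mg) in central / peripheral compartment
    dA1 = -(CL / V1) * A1 - (Q / V1) * A1 + (Q / V2) * A2
    dA2 = (Q / V1) * A1 - (Q / V2) * A2
    return [dA1, dA2]

t_eval = np.linspace(0, 8, 81)          # one 8-hour dosing interval
sol = solve_ivp(two_compartment, [0, 8], [dose, 0.0], t_eval=t_eval)
conc = sol.y[0] / V1                    # central (serum) concentration, mg/L
print(f"C(0.5 h) = {np.interp(0.5, t_eval, conc):.1f} mg/L, trough C(8 h) = {conc[-1]:.1f} mg/L")
```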

16.
17.
18.

Introduction

Altered pharmacokinetics (PK) in critically ill patients can result in insufficient serum β-lactam concentrations when standard dosages are administered. Previous studies on β-lactam PK have generally excluded the most severely ill patients, or were conducted during the steady-state period of treatment. The aim of our study was to determine whether the first dose of piperacillin-tazobactam, ceftazidime, cefepime, and meropenem would result in adequate serum drug concentrations in patients with severe sepsis and septic shock.

Methods

Open, prospective, multicenter study in four Belgian intensive care units. All consecutive patients with a diagnosis of severe sepsis or septic shock, in whom treatment with the study drugs was indicated, were included. Serum concentrations of the antibiotics were determined by high-pressure liquid chromatography (HPLC) before administration and 1, 1.5, 4.5, and 6 or 8 hours after administration.

Results

Eighty patients were treated with piperacillin-tazobactam (n = 27), ceftazidime (n = 18), cefepime (n = 19), or meropenem (n = 16). Serum concentrations remained above 4 times the minimal inhibitory concentration (T > 4 × MIC), corresponding to the clinical breakpoint for Pseudomonas aeruginosa defined by the European Committee on Antimicrobial Susceptibility Testing (EUCAST), for 57% of the dosage interval for meropenem (target concentration 8 μg/mL), 45% for ceftazidime (32 μg/mL), 34% for cefepime (32 μg/mL), and 33% for piperacillin-tazobactam (64 μg/mL). The number of patients who attained the target PK profile was 12/16 (75%) for meropenem, 5/18 (28%) for ceftazidime, 3/19 (16%) for cefepime, and 12/27 (44%) for piperacillin-tazobactam.

Conclusions

Serum concentrations of the antibiotics after the first dose were acceptable only for meropenem. Standard dosage regimens of piperacillin-tazobactam, ceftazidime, and cefepime may therefore be insufficient to empirically cover less susceptible pathogens in the early phase of severe sepsis and septic shock.

19.
Background

The best fluid replacement strategy and the role of albumin in immunocompromised patients with sepsis are unclear.

Methods

We performed a secondary analysis of immunocompromised patients enrolled in the ALBIOS trial, which randomized patients with severe sepsis or septic shock to receive either 20% albumin (target 30 g per liter or more) plus crystalloid or crystalloid alone during the ICU stay.

Results

Of the 1818 patients originally enrolled, 304 (16.4%) were immunocompromised. One hundred thirty-nine patients (45.7%) were randomized to the albumin group and 165 (54.3%) to the crystalloid group. At 90 days, 69 patients (49.6%) in the albumin group and 89 (53.9%) in the crystalloid group had died (hazard ratio [HR] 0.94; 95% CI 0.69–1.29). No differences were observed with regard to 28-day mortality, SOFA score (and sub-scores), length of stay in the ICU and in the hospital, the proportion of patients who developed acute kidney injury or received renal replacement therapy, or the duration of mechanical ventilation. Albumin was not independently associated with higher or lower 90-day mortality compared with crystalloid (HR 0.979, 95% CI 0.709–1.352).

Conclusion

Albumin replacement during the ICU stay, as compared with crystalloids alone, did not affect clinical outcomes in a cohort of immunocompromised patients with sepsis.

20.