Similar Articles
20 similar articles found.
1.
Objective To investigate the relationship between intra-abdominal hypertension (IAH) and acute renal failure (ARF) in critically ill patients. Design and setting Prospective, observational study in a general intensive care unit. Patients Patients consecutively admitted for > 24 h during a 6-month period. Interventions None. Measurements and results Intra-abdominal pressure (IAP) was measured through the urinary bladder pressure measurement method. IAH was defined as an IAP ≥ 12 mmHg in at least two consecutive measurements performed at 24-h intervals. ARF was defined as the failure class of the RIFLE classification. Of 123 patients, 37 (30.1%) developed IAH. Twenty-three patients developed ARF (an overall incidence of 19%), 16 (43.2%) in the IAH group and 7 (8.1%) in the non-IAH group (p < 0.05). Shock (p < 0.001), IAH (p = 0.002) and low abdominal perfusion pressure (APP; p = 0.046) emerged as the best predictive factors for ARF. The optimum cut-off point of IAP for ARF development was 12 mmHg, with a sensitivity of 91.3% and a specificity of 67%. The best cut-off values of APP and filtration gradient (FG) for ARF development were 52 and 38 mmHg, respectively. Age (p = 0.002), cumulative fluid balance (p = 0.002) and shock (p = 0.006) were independent predictive factors of IAH. The raw hospital mortality rate was significantly higher in patients with IAH; however, risk-adjusted and O/E ratio mortality rates did not differ between groups. Conclusions In critically ill patients IAH is an independent predictive factor of ARF at IAP levels as low as 12 mmHg, although the contribution of impaired systemic haemodynamics should also be considered.
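The abstract quotes cut-offs for IAP, APP and FG without spelling out how APP and FG are derived. The sketch below applies the conventional definitions from the abdominal compartment syndrome literature (APP = MAP - IAP; FG = MAP - 2·IAP), which the abstract does not state explicitly; the function names and the use of mean arterial pressure (MAP) as an input are illustrative assumptions.

```python
def abdominal_perfusion_pressure(map_mmhg: float, iap_mmhg: float) -> float:
    # conventional definition (assumed, not stated in the abstract): APP = MAP - IAP
    return map_mmhg - iap_mmhg

def filtration_gradient(map_mmhg: float, iap_mmhg: float) -> float:
    # conventional definition: FG = (MAP - IAP) - IAP = MAP - 2 * IAP
    return map_mmhg - 2 * iap_mmhg

def arf_risk_flags(map_mmhg: float, iap_mmhg: float) -> dict:
    """Flag the cut-offs for ARF development reported in this study."""
    return {
        "IAH (IAP >= 12 mmHg)": iap_mmhg >= 12,
        "low APP (<= 52 mmHg)": abdominal_perfusion_pressure(map_mmhg, iap_mmhg) <= 52,
        "low FG (<= 38 mmHg)": filtration_gradient(map_mmhg, iap_mmhg) <= 38,
    }

# e.g. MAP 70 mmHg with IAP 15 mmHg -> APP 55 mmHg, FG 40 mmHg: IAH flag only
print(arf_risk_flags(map_mmhg=70, iap_mmhg=15))
```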

2.
Objective To determine the incidence of Clostridium difficile (CD) diarrhoea in feed-intolerant, critically ill patients who received erythromycin-based prokinetic therapy. Design and setting Prospective observational study in a mixed intensive care unit. Methods The development of diarrhoea (> 3 loose, liquid stools per day with an estimated total volume ≥ 250 ml/day) was assessed in 180 consecutive critically ill patients who received prokinetic therapy (erythromycin only, n = 53; metoclopramide, n = 37; combination erythromycin/metoclopramide, n = 90) for feed intolerance. Stool microscopy, culture and CD toxin assay were performed in all patients who developed diarrhoea during and after prokinetic therapy. Diarrhoea was deemed to be related to CD infection if CD toxin was detected. Results Demographics, antibiotic use and admission diagnosis were similar amongst the three patient groups. Diarrhoea developed in 72 (40%) patients, 9.9 ± 0.8 days after commencement of therapy, none of whom was positive for CD toxin or bacterial infection. Parasitic infections were found in four aboriginal men from an area endemic for these infections. Diarrhoea was most prevalent in patients who received combination therapy (49%), compared with erythromycin alone (30%) or metoclopramide alone (32%). Diarrhoea was short-lasting, with a mean duration of 3.6 ± 1.2 days. Conclusions In critical illness, diarrhoea following the administration of erythromycin at prokinetic doses is not associated with CD but may be related to the pro-motility effects of the agent. Prokinetic therapy should be stopped at the onset of diarrhoea, and prophylactic use should be strictly avoided.

3.
Objective To assess the relationship between blood glucose concentrations (BSL) and intolerance to gastric feeding in critically ill patients. Design Prospective, case-control study. Patients and participants Two-hourly BSL and insulin requirements over the first 10 days after admission were assessed in 95 consecutive feed-intolerant (NG aspirate > 250 ml during feeding) critically ill patients and 50 age-matched, feed-tolerant patients who received feeds for at least 3 days. Patients with diabetes mellitus were excluded. A standard insulin protocol was used to maintain BSL at 5.0–7.9 mmol/l. Measurements and results Peak BSLs were significantly higher before and during enteral feeding in feed-intolerant patients. The mean and trough BSLs were, however, similar between the two groups on admission, 24 h prior to feeding and for the first 4 days of feeding. The variation in BSLs over 24 h before and during enteral feeding was significantly greater in feed-intolerant patients. A BSL greater than 10 mmol/l was more prevalent in patients with feed intolerance during enteral feeding. The time taken to develop feed intolerance was inversely related to the admission BSL (r = –0.40). The amount of insulin administered before and during enteral feeding was similar between the two groups. Conclusions Feed intolerance in critically ill patients is associated with a greater degree of glycaemic variation and a greater number of patients with transient hyperglycaemia. These data suggest more intensive insulin therapy may be required to minimize feed intolerance, an issue that warrants further study. Supported by the National Health and Medical Research Council of Australia.

4.
Objective Little information is available regarding current practice in continuous renal replacement therapy (CRRT) for the treatment of acute renal failure (ARF) and the possible clinical effect of practice variation. Design Prospective observational study. Setting A total of 54 intensive care units (ICUs) in 23 countries. Patients and participants A cohort of 1006 ICU patients treated with CRRT for ARF. Interventions Collection of demographic, clinical and outcome data. Measurements and results All patients except one were treated with venovenous circuits, most commonly as venovenous hemofiltration (52.8%). Approximately one-third (33.1%) received CRRT without anticoagulation. Among patients who received anticoagulation, unfractionated heparin (UFH) was the most common choice (42.9%), followed by sodium citrate (9.9%), nafamostat mesilate (6.1%), and low-molecular-weight heparin (LMWH; 4.4%). Hypotension related to CRRT occurred in 19% of patients and arrhythmias in 4.3%. Bleeding complications occurred in 3.3% of patients. Treatment with LMWH was associated with a higher incidence of bleeding complications (11.4%) compared to UFH (2.3%, p = 0.0083) and citrate (2.0%, p = 0.029). The median dose of CRRT was 20.4 ml/kg/h. Only 11.7% of patients received a dose of > 35 ml/kg/h. Most (85.5%) survivors recovered to dialysis independence at hospital discharge. Hospital mortality was 63.8%. Multivariable analysis showed that no CRRT-related variables (mode, filter material, drug for anticoagulation, and prescribed dose) predicted hospital mortality. Conclusions This study supports the notion that, worldwide, CRRT practice is quite variable and not aligned with best evidence. Electronic supplementary material The online version of this article (doi:) contains supplementary material, which is available to authorized users. This article is discussed in the editorial available at: . This study was supported by the Austin Hospital Anaesthesia and Intensive Care Trust Fund.

5.
Objective Several studies have reported a close relationship between an increased dose of dialysis and survival in patients treated for acute renal failure. Unfortunately, the quantification of dialysis in critically ill patients based on the urea nitrogen formula Kt/V is not readily applicable. Ionic dialysance is a new parameter, calculated in real time from the dialysate conductivity, that correlates with the effective urea clearance in chronic hemodialysis patients. The aim of our study was to evaluate ionic dialysance for the quantification of dialysis in critically ill patients with acute renal failure. Design Prospective open-label study. Setting An 18-bed medical intensive care unit. Patients Thirty-one patients with multiple organ dysfunction syndrome and acute renal failure requiring intermittent hemodialysis were included. Measurements Using the first dialysis session of each patient, we compared the delivered dose of dialysis based on ionic dialysance measurement (KtID) with the well-accepted gold standard method based on fractional dialysate sampling (Ktdialysate). The data were analyzed using linear regression and Bland–Altman analysis. Results Thirty-one intermittent hemodialysis sessions were performed in 31 critically ill patients (mean age 58 ± 12 years, SAPS II score 56 ± 10). We found a close correlation between Ktdialysate and KtID (Ktdialysate = 36.3 ± 11.4 l; KtID = 38.4 ± 11.8 l; r = 0.96) with excellent limits of agreement (–2.2 l; 6.4 l). Conclusion Dialysis quantification based on ionic dialysance is feasible in the critically ill patient, and the method is a simple and accurate tool for determining the dialysis dose in critically ill patients.
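For readers wanting to reproduce the agreement analysis cited in this abstract, a minimal Bland–Altman computation over paired Kt values could look like the following sketch; the variable names and example data are hypothetical, but the method (bias plus 1.96 SD limits of agreement) is the one the study names.

```python
import numpy as np

def bland_altman(kt_id, kt_dialysate):
    """Bias and 95% limits of agreement between paired Kt estimates (litres)."""
    diff = np.asarray(kt_id, float) - np.asarray(kt_dialysate, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical paired sessions; the study's limits (-2.2 l; 6.4 l) imply a bias
# of roughly +2.1 l with an SD of ~2.2 l across its 31 sessions
bias, (lo, hi) = bland_altman([40.1, 35.2, 44.0], [38.0, 33.5, 41.2])
print(f"bias = {bias:.2f} l, limits of agreement = ({lo:.2f}, {hi:.2f}) l")
```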

6.
Objective To examine the occurrence of feed intolerance in critically ill patients with previously diagnosed type II diabetes mellitus (DM) who received prolonged gastric feeding. Design and setting Retrospective study in a level 3 mixed ICU. Patients All mechanically ventilated, enterally fed patients (n = 649), with (n = 118) and without type II DM (n = 531), admitted between January 2003 and July 2005. Interventions Patients with at least 72 h of gastric feeding were identified by review of case notes and ICU charts. The proportion that developed feed intolerance was determined. All patients received insulin therapy. Results The proportion of patients requiring gastric feeding for at least 72 h was similar between patients with and without DM (42%, 50/118, vs. 42%, 222/531). Data from patients with DM were also compared with a group of 50 patients matched for age, sex and APACHE II score, selected from the total non-diabetic group. The occurrence of feed intolerance (DM 52% vs. matched non-DM 50% vs. unselected non-diabetic 58%) and the time taken to develop feed intolerance (DM 62.6 ± 43.8 h vs. matched non-DM 45.3 ± 54.6 h vs. unselected non-diabetic 50.6 ± 59.5 h) were similar amongst the three groups. Feed intolerance was associated with greater use of morphine/midazolam and vasopressor support, a lower feeding rate and a longer ICU length of stay. Conclusions In critically ill patients who require prolonged enteral nutrition, a prior history of type II DM does not appear to be a further risk factor for feed intolerance. This research was supported by the National Health and Medical Research Council of Australia. An erratum to this article can be found at

7.
Objective To investigate the pattern of pituitary-adrenal responses to human corticotropin-releasing hormone (hCRH) in critically ill patients and to examine the relation between responses and clinical outcome. Design and setting Prospective study in consecutive critically ill patients in a general intensive care unit in a teaching hospital. Patients The study included 37 critically ill, mechanically ventilated patients with diverse underlying diagnoses (28 men, 9 women; median age 56 years). Interventions A morning blood sample was obtained to measure baseline cortisol, corticotropin (ACTH), and cytokines. Patients were then injected with 100 μg hCRH, and plasma cortisol and ACTH were measured over a period of 2 h. Measurements and results In the overall patient population baseline and peak cortisol concentrations following hCRH were 16 ± 5 and 21 ± 5 μg/dl, respectively, and median baseline and peak ACTH levels were 23 and 65 pg/ml, respectively. Higher ACTH levels and more prolonged cortisol release were noted in nonsurvivors (n = 18) than in survivors (n = 19). Furthermore, nonsurvivors had higher concentrations of interleukin 8 (115 vs. 38 pg/ml) and interleukin 6 (200 vs. 128 pg/ml) than survivors. Conclusions Critically ill patients demonstrate altered pituitary-adrenal axis responses to hCRH. This is particularly evident in the sickest patients, with the most pronounced inflammatory profile, who ultimately do not survive.

8.
Objectives Cancer patients are at high risk for acute kidney injury (AKI), which is associated with high mortality when renal replacement therapy is required. Because physicians might be reluctant to offer dialysis to patients with malignancies, we sought to appraise outcomes in critically ill cancer patients (mainly with hematological malignancies) who received renal replacement therapy for AKI complicating cancer management. Design Cohort study including consecutive patients who received renal replacement therapy for AKI complicating cancer management over a 42-month period. Their mortality was compared with that of non-cancer patients who received renal replacement therapy in the same center over the same study period (control group). Setting A 12-bed medical intensive care unit in a university hospital. Results 94 critically ill cancer patients met the inclusion criteria. Median SAPS II was 53 (IQR 40–75) and median Logistic Organ Dysfunction score was 7 (IQR 5–10). The etiology of AKI was multiple in most patients (248 identified factors in 93 patients). Hospital mortality was 51.1%. Two variables were independently associated with hospital mortality: the severity of associated organ failures at ICU admission (OR 1.33 per point; 95% CI 1.11–1.59) and renal function deterioration after ICU admission (OR 5.42; 95% CI 1.62–18.11). Characteristics of the malignancy were not associated with hospital mortality. The presence of cancer had no detectable influence on hospital mortality after adjustment for gender, age, acute severity as assessed by the SAPS II score, and chronic health status (OR 1.2, 95% CI 0.63–2.27; p = 0.57). Conclusion ICU admission should be considered in selected critically ill cancer patients with AKI requiring renal replacement therapy.

9.
Objective To evaluate the reliability of mini-bronchoalveolar lavage (mini-BAL) for the measurement of tobramycin concentrations in epithelial lining fluid (ELF) in comparison with conventional bronchoscopic bronchoalveolar lavage (BAL). Design Prospective, open-label study. Setting An intensive care unit and research ward in a university hospital. Patients Twelve critically ill adult patients with ventilator-associated pneumonia (VAP). Interventions All subjects received intravenous infusions of tobramycin 7–10 mg/kg once daily. After 2 days of therapy, the steady-state serum and ELF concentrations (obtained from BAL and mini-BAL) of tobramycin were determined by means of high-performance liquid chromatography. Measurements and results We observed poor penetration of tobramycin into ELF (≈ 12%), with ELF peak concentrations of ≈ 3 mg/l with both methods. Good agreement in Bland–Altman analysis (mean ± SD bias = 0.04 ± 0.38 mg/l) was observed between the two methods of sampling. Conclusion Our results suggest that tobramycin 7–10 mg/kg once daily in critically ill patients with VAP might provide insufficient lung concentrations in the case of difficult-to-treat pathogens. In addition, mini-BAL, which is simple, non-invasive and easily repeatable at the bedside, appears to be a reliable method for the measurement of antibiotic concentrations in ELF in comparison with bronchoscopic BAL in critically ill patients with VAP. This article is discussed in the editorial available at: . Support was provided only by institutional sources.
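The abstract does not describe how ELF concentrations were derived from lavage fluid; the usual approach in such studies is a urea-dilution correction, with serum urea as the dilution marker. The sketch below is an assumption-labelled illustration of that correction, not this paper's documented method, and the numeric inputs are invented to match the reported magnitudes.

```python
def elf_concentration(drug_bal_mg_l: float,
                      urea_serum_mmol_l: float,
                      urea_bal_mmol_l: float) -> float:
    # urea-dilution correction (assumed method, common for ELF drug levels):
    # C_ELF = C_BAL * (urea_serum / urea_BAL)
    return drug_bal_mg_l * (urea_serum_mmol_l / urea_bal_mmol_l)

def penetration_percent(c_elf: float, c_serum: float) -> float:
    # the study reports ~12% penetration of tobramycin into ELF
    return 100.0 * c_elf / c_serum

# hypothetical values: a lavage drug level of 0.06 mg/l diluted ~50-fold
c_elf = elf_concentration(0.06, urea_serum_mmol_l=5.0, urea_bal_mmol_l=0.1)
print(c_elf, penetration_percent(c_elf, c_serum=25.0))  # -> 3.0 mg/l, 12.0 %
```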

10.
Australian Critical Care 2014;27(4):188–193
Enteral nutrition (EN) for critically ill and mechanically ventilated patients can be administered by either the continuous or the bolus method. However, there is insufficient evidence as to which of these methods carries a lower risk of aspiration and gastrointestinal (GI) complications. This study was conducted to identify the incidence of aspiration and GI complications with continuous enteral nutrition (CEN) and bolus enteral nutrition (BEN) in critically ill patients at the Rafik Hariri University Hospital (RHUH), Beirut, Lebanon. Methods A pseudo-randomised controlled trial was conducted on 30 critically ill mechanically ventilated patients receiving EN for more than 72 h. Patients were randomly assigned to an experimental group that received CEN and a control group that received BEN. Patients' health characteristics as well as the incidence of aspiration and GI complications (high gastric residual volume (HGRV), vomiting, diarrhoea, and constipation) were collected. Results There were no statistically significant differences between the CEN and BEN groups in the occurrence of aspiration, HGRV, diarrhoea, or vomiting (P > 0.05). However, constipation was significantly more frequent in patients receiving CEN (10 patients, 66.7%) than in those receiving BEN (3 patients, 20%) (P = 0.025). Conclusion The CEN and BEN methods did not differ in the incidence of aspiration, HGRV, vomiting or diarrhoea. However, the incidence of constipation was significantly greater in patients receiving CEN.
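The constipation comparison (10/15 with CEN vs 3/15 with BEN) can be checked with a Fisher's exact test; the abstract does not name the test the authors used, but the call below reproduces the reported P = 0.025.

```python
from scipy.stats import fisher_exact  # SciPy assumed available

# 2x2 table from the abstract: rows = CEN / BEN, columns = constipated / not
table = [[10, 5],
         [3, 12]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.1f}, p = {p_value:.3f}")  # p ~= 0.025, as reported
```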

11.
Objective To establish the incidence of central venous catheter erosion in a patient cohort receiving total parenteral nutrition and to examine risk factors and complications of vascular erosion. Design and setting Review of prospectively collected intravenous nutrition service audit records in a tertiary university hospital. Results Records of 1,499 patients (2,992 catheters) were studied over the 14-year period 1991–2005. Fisher's exact test was used to determine statistical significance. Five erosions occurred, representing an incidence of 0.17% per catheter or 0.28 per 1,000 catheter days. One of the five patients died from ensuing complications. Mean time to onset of symptoms was 3.6 days following catheter insertion. Symptoms/signs included dyspnoea (n = 5), chest pain (n = 2) and pleural effusion (n = 5). Diagnosis was delayed by a mean of 1.6 days. Three erosions occurred in left subclavian catheters (n = 583); two in left internal jugular catheters (n = 453). None occurred in right-sided catheters (n = 1,956). The relative risk of erosion in left-sided catheters compared with right-sided catheters was 2.9 (95% CI 2.76–3.00; p = 0.009). There was no statistically significant excess risk of vascular erosion in subclavian compared with internal jugular catheters (relative risk 0.9; p = 1.0). Older age was a statistically significant risk factor (p = 0.009); female sex was not (p = 0.18). Conclusion In patients receiving total parenteral nutrition via central venous catheters, erosion has an incidence of 0.17% per catheter and is more likely to occur in left-sided catheters and elderly patients.

12.
Objective To demonstrate the diagnostic yield, therapeutic role and safety of flexible bronchoscopy via an intensivist-led service in critically ill children. Design Retrospective chart review. Setting Regional paediatric intensive care unit. Measurements and results One hundred forty-eight flexible bronchoscopies were performed by two intensivists on 134 patients (median age 16.5 months) over a 2.5-year period. Eighty-eight percent of patients required mechanical ventilation, and 22% were receiving inotropes. The case mix included general (n = 77), cardiac surgery (n = 18), cardiology (n = 13), ear-nose-and-throat surgery (n = 17), oncology (n = 8) and renal (n = 1) patients. The indication for bronchoscopy was defined a priori according to one of four categories: suspected upper airway disease (n = 32); lower airway disease (n = 70); investigation of pulmonary disease (n = 25); and extubation failure (n = 21). Bronchoscopy was generally performed soon after PICU admission, at a median time of 1.5 days for the former three categories and 4 days for the extubation failure group. A positive yield from bronchoscopy (a diagnosis that explained the clinical condition or influenced patient management) was obtained in 113 of 148 (76%) procedures, varying within groups from 44% (pulmonary disease) to 90% (extubation failure). Ten percent of patients developed a fall in oxygen saturation of > 20% during the procedure, and 17% required a bolus of at least 10 ml/kg of 0.9% saline for hypotension. Conclusions Critically ill patients with respiratory problems may benefit from a PICU-led bronchoscopy service, as the yield of positive bronchoscopic findings is high, particularly for upper airway problems or extubation failure.

13.
Objective To investigate the potential beneficial and adverse effects of early post-pyloric feeding compared with gastric feeding in critically ill adult patients with no evidence of impaired gastric emptying. Design Randomised controlled studies comparing gastric and post-pyloric feeding in critically ill adult patients from the Cochrane Controlled Trial Register (2005 issue 3), EMBASE and MEDLINE databases (1966 to 1 October 2005), without any language restriction, were included. Two reviewers assessed the quality of the studies and performed data extraction independently. Measurements and results Eleven randomised controlled studies with a total of 637 critically ill adult patients were considered. Mortality (relative risk [RR] 1.01, 95% CI 0.76–1.36, p = 0.93; I² = 0%) and the risk of aspiration or pneumonia (RR 1.28, 95% CI 0.91–1.80, p = 0.15; I² = 0%) were not significantly different between patients treated with gastric or post-pyloric feeding. The effect of post-pyloric feeding on the risk of pneumonia or aspiration was similar when studies were stratified into those with and those without concurrent gastric decompression (RR ratio 0.95, 95% CI 0.48–1.91, p = 0.89). The risk of diarrhoea and the length of intensive care unit stay (weighted mean difference –1.46 days, 95% CI –3.74 to 0.82, p = 0.21; I² = 24.6%) were not statistically different. The gastric feeding group had a much lower risk of feeding tube placement difficulties or blockage (0% vs 9.6%, RR 0.13, 95% CI 0.04–0.44, p = 0.001; I² = 0%). Conclusions Early use of post-pyloric feeding instead of gastric feeding in critically ill adult patients with no evidence of impaired gastric emptying was not associated with significant clinical benefits. This study was solely funded by the Department of Intensive Care, Royal Perth Hospital. No financial support was received from pharmaceutical companies or other private companies in the form of grants or awards.
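The pooled relative risks and I² values quoted above come from a standard meta-analytic computation. As a rough illustration, a fixed-effect (inverse-variance) pooling of log relative risks with Cochran's Q and I² could be sketched as follows; the event counts are hypothetical, and the review's actual software and model are not stated in the abstract.

```python
import numpy as np

def pooled_rr(events_t, n_t, events_c, n_c):
    """Fixed-effect (inverse-variance) pooled relative risk, 95% CI and I^2."""
    et, nt = np.asarray(events_t, float), np.asarray(n_t, float)
    ec, nc = np.asarray(events_c, float), np.asarray(n_c, float)
    log_rr = np.log((et / nt) / (ec / nc))
    var = 1 / et - 1 / nt + 1 / ec - 1 / nc   # variance of log RR per study
    w = 1 / var                               # inverse-variance weights
    pooled = (w * log_rr).sum() / w.sum()
    se = np.sqrt(1 / w.sum())
    q = (w * (log_rr - pooled) ** 2).sum()    # Cochran's Q
    i2 = max(0.0, (q - (len(w) - 1)) / q) * 100 if q > 0 else 0.0
    ci = (np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se))
    return np.exp(pooled), ci, i2

# hypothetical three-study example: deaths in post-pyloric vs gastric arms
rr, ci, i2 = pooled_rr([10, 8, 12], [50, 40, 60], [9, 9, 11], [50, 40, 60])
print(f"RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {i2:.0f}%")
```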

14.
Objective To determine the incidence and duration of adrenal inhibition induced by a single dose of etomidate in critically ill patients. Design Prospective, observational cohort study. Setting Three intensive care units in a university hospital. Patients Forty critically ill patients without sepsis who received a single dose of etomidate to facilitate endotracheal intubation. Measurements and main results Serial serum cortisol and 11β-deoxycortisol samples were taken at baseline and 60 min after a corticotropin stimulation test (250 μg 1–24 ACTH) at 12, 24, 48, and 72 h after etomidate administration. Etomidate-related adrenal inhibition was defined by the combination of a rise in cortisol of less than 250 nmol/l (9 μg/dl) after ACTH stimulation and an excessive accumulation of serum 11β-deoxycortisol at baseline. At 12 h after etomidate administration, 32/40 (80%) patients fulfilled the diagnostic criteria for etomidate-related adrenal insufficiency. This incidence was significantly lower at 48 h (9%) and 72 h (7%). The cortisol to 11β-deoxycortisol ratio (F/S ratio), reflecting the intensity of the 11β-hydroxylase enzyme blockade, improved significantly over time. Conclusions A single bolus of etomidate resulted in widespread adrenal inhibition in critically ill patients. However, this alteration was reversible by 48 h after drug administration. The empirical use of steroid supplementation for 48 h following a single dose of etomidate in ICU patients without septic shock should thus be considered. Concomitant serum cortisol and 11β-deoxycortisol measurements are needed to provide evidence of adrenal insufficiency induced by etomidate in critically ill patients. Electronic supplementary material The online version of this article (doi:) contains supplementary material, which is available to authorized users. Financial support: All of the authors have disclosed that they have no financial relationship with or interest in any commercial companies.

15.
Background and aims Continuous veno-venous haemofiltration (CVVH) is an established treatment for acute renal failure (ARF). Recently, extended intermittent dialytic techniques have been proposed for the treatment of ARF. The aim of this study was to compare these two approaches. Setting Intensive care unit of a tertiary hospital. Subjects Sixteen critically ill patients with ARF. Design Randomised controlled trial. Intervention We randomised sixteen patients to three consecutive days of treatment with either CVVH (n = 8) or extended daily dialysis with filtration (EDDf; n = 8) and compared small-solute, electrolyte and acid-base control. Results There was no significant difference between the two therapies in urea or creatinine levels over the 3 days. Of 80 electrolyte measurements taken before treatment, 19 were abnormal. All abnormal values were corrected by treatment, except in one patient in the CVVH group who developed hypophosphataemia (0.54 mmol/l) at 72 h. After 3 days of treatment, there was a mild but persistent metabolic acidosis in the EDDf group compared with the CVVH group (median bicarbonate: 20 mmol/l vs. 29 mmol/l, p = 0.039; median base deficit: –4 mEq/l vs. –2.1 mEq/l, p = 0.033). Conclusions CVVH and EDDf as prescribed achieved similar control of urea, creatinine and electrolytes. Acidosis was better controlled with CVVH. Electronic supplementary material The online version of this article (doi:) contains supplementary material, which is available to authorized users.

16.
Objective To determine the steady-state trough serum and epithelial lining fluid (ELF) concentrations of teicoplanin 12 mg/kg per day in critically ill patients with ventilator-associated pneumonia. Design and setting Prospective, pharmacokinetic study in the surgical intensive care unit of a university hospital. Patients Thirteen adult patients with nosocomial bacterial pneumonia on mechanical ventilation were enrolled. Interventions All subjects received a 30-min intravenous infusion of 12 mg/kg teicoplanin every 12 h for 2 consecutive days, followed by 12 mg/kg once daily. Teicoplanin concentrations in serum and ELF were determined simultaneously 4–6 days after antibiotic administration started. Measurements and results The median total and free trough concentrations of teicoplanin in serum were 15.9 μg/ml (range 8.8–29.9) and 3.7 μg/ml (range 2.0–5.4), respectively. The concentration in ELF was 4.9 μg/ml (range 2.0–11.8). Conclusions In critically ill patients with ventilator-associated pneumonia, the administration of high teicoplanin doses is required to reach sufficient trough antibiotic concentrations in lung tissue at steady state. At that point, trough free concentrations of teicoplanin in serum and ELF are comparable. This work was financially supported by a grant from Laboratoires Aventis, Paris, France.

17.
Objective To explore the effect of renal function, estimated by measured creatinine clearance (ClCR), on the trough serum concentration (Cmin) of piperacillin given to critically ill patients. Design Prospective observational study. Setting An intensive care unit and research ward in a university hospital. Patients Seventy critically ill patients, including 22 with severe trauma. Interventions All subjects received an intravenous infusion of piperacillin 4 g three times (n = 61) or four times (n = 9) per day. Piperacillin Cmin values were determined 24 h after treatment started and compared to the French breakpoint defining antibiotic susceptibility against Enterobacteriaceae (8 mg/l) or Pseudomonas sp. (16 mg/l). Results Median (range) piperacillin Cmin was 11.9 (< 1–156.3) mg/l, with great variability among patients. Although the median value was close to the breakpoints, sub-therapeutic plasma levels were frequently observed. Piperacillin Cmin was lower than the breakpoint for Enterobacteriaceae in 37% of patients, and lower than the breakpoint for P. aeruginosa in 67%. A strong relationship was observed between piperacillin Cmin and ClCR: the higher the ClCR, the lower the piperacillin Cmin in serum. For patients with a ClCR < 50 ml/min, an adequate piperacillin Cmin was achieved in most patients with 12 g piperacillin per day. For patients with higher ClCR values, a piperacillin daily dose of 16 g or more may be warranted. Conclusions In critically ill patients, therapeutic drug monitoring must be part of routine care, and knowledge of the ClCR value may be useful for choosing an adequate initial piperacillin dose. No funding source.
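Measured ClCR in studies like this one is typically obtained from a timed urine collection; the sketch below shows that arithmetic together with a literal reading of the study's dosing observation. The function names, units and the dose-selection helper are illustrative assumptions, not a clinical recommendation and not the paper's documented protocol.

```python
def measured_clcr_ml_min(urine_cr_umol_l: float, urine_vol_ml: float,
                         serum_cr_umol_l: float, collection_h: float) -> float:
    # timed-collection creatinine clearance: ClCR = (U_cr * V) / (P_cr * t)
    return (urine_cr_umol_l * urine_vol_ml) / (serum_cr_umol_l * collection_h * 60)

def piperacillin_daily_dose_g(clcr_ml_min: float) -> int:
    # literal reading of the abstract: ClCR < 50 ml/min -> 12 g/day usually
    # adequate; higher clearances -> 16 g/day or more may be warranted
    return 12 if clcr_ml_min < 50 else 16

clcr = measured_clcr_ml_min(6000, 1500, 90, collection_h=24)  # ~69 ml/min
print(f"ClCR = {clcr:.0f} ml/min -> consider {piperacillin_daily_dose_g(clcr)} g/day")
```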

18.
Objective In critically ill patients, energy requirements are frequently calculated as a multiple of total body weight, presuming a linear relationship between total body weight and resting energy expenditure (REE). It is doubtful, however, whether this estimation of energy needs should be applied to all patients, particularly overweight patients, since adipose tissue contributes little to REE. This study was undertaken to test the hypothesis that REE adjusted for total body weight decreases with increasing body mass index in critically ill patients. Additionally, measured REE was compared with three predictive equations. Design and setting Clinical study in a university hospital intensive care unit. Patients One hundred critically ill patients admitted to the intensive care unit. Measurements and results Patients were divided into four groups according to their body mass index (normal weight, pre-obese, obese, and morbidly obese). Measured REE was assessed using indirect calorimetry. Energy needs were calculated using the basal metabolic rate, the Consensus Statement of the American College of Chest Physicians (REEacs), and 25 kcal/kg of ideal body weight (REEibw). Adjusted REE was 24.8 ± 5.5 kcal/kg in normal weight, 22.0 ± 3.7 kcal/kg in pre-obese, 20.4 ± 2.6 kcal/kg in obese, and 16.3 ± 2.3 kcal/kg in morbidly obese patients (p < 0.01). The basal metabolic rate underestimated measured REE in normal weight and pre-obese patients. REEacs and REEibw over- and underestimated measured REE in overweight patients, respectively. Conclusions The predictive equations were not able to estimate measured REE adequately in all patients. Adjusted REE decreased with increasing body mass index; thus, a body mass index group-specific adaptation should be applied when estimating energy needs.
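The three predictive estimates compared in this study can be reproduced approximately as below. The Harris-Benedict equations are the classic basal-metabolic-rate formulas (the abstract does not name which BMR equation was used), the ACCP figure is commonly taken as 25 kcal/kg of actual weight, and the Devine ideal-body-weight formula is an assumption for illustration.

```python
def harris_benedict_kcal_day(sex: str, weight_kg: float,
                             height_cm: float, age_y: float) -> float:
    # classic Harris-Benedict BMR; assumed here, the study does not name its equation
    if sex == "m":
        return 66.47 + 13.75 * weight_kg + 5.0 * height_cm - 6.76 * age_y
    return 655.1 + 9.56 * weight_kg + 1.85 * height_cm - 4.68 * age_y

def ideal_body_weight_kg(sex: str, height_cm: float) -> float:
    # Devine formula (illustrative assumption, not stated in the abstract)
    inches_over_5ft = max(0.0, height_cm / 2.54 - 60)
    return (50.0 if sex == "m" else 45.5) + 2.3 * inches_over_5ft

def ree_estimates(sex: str, weight_kg: float, height_cm: float, age_y: float) -> dict:
    return {
        "BMR (Harris-Benedict)": harris_benedict_kcal_day(sex, weight_kg, height_cm, age_y),
        "REEacs (25 kcal/kg actual weight)": 25 * weight_kg,
        "REEibw (25 kcal/kg ideal weight)": 25 * ideal_body_weight_kg(sex, height_cm),
    }

# for a 120 kg, 175 cm, 60-year-old man the two 25 kcal/kg rules diverge widely
# (about 3000 vs about 1760 kcal/day), which is the study's point about obesity
print(ree_estimates("m", 120, 175, 60))
```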

19.
Objective To evaluate a blind ‘active’ technique for the bedside placement of post-pyloric enteral feeding tubes in a critically ill population with proven gastric ileus. Design and setting An open study to evaluate the success rate and duration of the technique in the cardiothoracic and general intensive care units of a tertiary referral hospital. Patients Twenty consecutive, ventilated patients requiring enteral nutrition in whom feeding had failed via the gastric route. Interventions A previously described insertion technique for post-pyloric enteral feeding tube placement (the Corpak 10-10-10 protocol), modified by insufflation of air into the stomach to promote pyloric opening if placement had not been achieved after 20 min. Measurements and results A standard protocol and a set method to identify final tube position were used in each case. In 90% (18/20) of cases tubes were placed on the first attempt, with an additional tube successfully placed on the second attempt. The median time for tube placement was 18 min (range 3–55 min). In 20% (4/20) of cases insufflation of air was required to aid trans-pyloric passage. Conclusions The previously described technique, modified by insufflation of air into the stomach when attempts at trans-pyloric passage were prolonged, proved an effective and cost-efficient method for placing post-pyloric enteral feeding tubes. The technique, even in the presence of gastric ileus, could be adopted by any critical care facility without additional equipment or costs, avoiding the expense, time delays and patient transfers out of the ICU required by the more traditional techniques of endoscopy and radiographic screening.

20.
Objective The transpulmonary thermo-dye dilution technique enables assessment of cardiac index (CI), intrathoracic blood volume index (ITBVI) and extravascular lung water index (EVLWI). Since the extent of lung edema may influence the reliability of CI measurement by transpulmonary thermodilution through loss of indicator, we analyzed the impact of EVLWI on transpulmonary thermodilution-derived CI in critically ill patients. Design Retrospective, clinical study. Setting Surgical intensive care unit in a university hospital. Patients and methods With ethics approval we analyzed data from 57 patients (38 men, 19 women; age range 18–79 years) who, for clinical indications, underwent hemodynamic monitoring by transpulmonary thermo-dye dilution and pulmonary artery thermodilution (572 measurements). All patients were mechanically ventilated and had received a femoral artery thermo-fiberoptic catheter and a pulmonary artery catheter, which were connected to a computer system (Cold-Z021, Pulsion Medical Systems, Munich, Germany). For each measurement, 15–17 ml indocyanine green (4–6 °C) was injected via a central vein. Injections were made manually and independently of the respiratory cycle. Linear regression was used for statistical analysis. Interventions and main results The difference between transpulmonary and pulmonary artery thermodilution CI was not correlated with EVLWI, either for all measurements (n = 572, r = 0.01, p = 0.76) or when using only the first simultaneous measurement per patient (n = 57, r = 0.08, p = 0.56). Furthermore, EVLWI was not correlated with transpulmonary thermodilution CI (n = 572, r = 0.07, p = 0.08). The coefficient of variation for transpulmonary thermodilution CI was 7.7 ± 4.3%. Conclusion Measurement of cardiac output by transpulmonary thermodilution is not influenced by EVLWI in critically ill patients, and loss of indicator as a potential confounder is probably overestimated.
