Similar Articles
 20 similar articles found (search time: 62 ms)
1.

Background and objectives

Home dialysis is often recognized as a first-choice therapy for patients initiating dialysis. However, studies directly comparing clinical outcomes between peritoneal dialysis and home hemodialysis remain limited.

Design, setting, participants, & measurements

This Australia and New Zealand Dialysis and Transplantation Registry study assessed all Australian and New Zealand adult patients receiving home dialysis on day 90 after initiation of RRT between 2000 and 2012. The primary outcome was overall survival. The secondary outcomes were on-treatment survival, patient and technique survival, and death-censored technique survival. All results were adjusted with three prespecified models: multivariable Cox proportional hazards model (main model), propensity score quintile–stratified model, and propensity score–matched model.

Results

The study included 10,710 patients on incident peritoneal dialysis and 706 patients on incident home hemodialysis. Treatment with home hemodialysis was associated with better patient survival than treatment with peritoneal dialysis (5-year survival: 85% versus 44%, respectively; log-rank P<0.001). Using multivariable Cox proportional hazards analysis, home hemodialysis was associated with superior patient survival (hazard ratio for overall death, 0.47; 95% confidence interval, 0.38 to 0.59) as well as better on-treatment survival (hazard ratio for on-treatment death, 0.34; 95% confidence interval, 0.26 to 0.45), composite patient and technique survival (hazard ratio for death or technique failure, 0.34; 95% confidence interval, 0.29 to 0.40), and death-censored technique survival (hazard ratio for technique failure, 0.34; 95% confidence interval, 0.28 to 0.41). Similar results were obtained with the propensity score models as well as sensitivity analyses using competing risks models and different definitions for technique failure and lag period after modality switch, during which events were attributed to the initial modality.

Conclusions

Home hemodialysis was associated with superior patient and technique survival compared with peritoneal dialysis.

2.

Summary

Background and objectives

Despite significant advances in the epidemiology of acute kidney injury (AKI), prognostication remains a major clinical challenge. Unfortunately, no reliable method to predict renal recovery exists. The discovery of biomarkers to aid in clinical risk prediction for recovery after AKI would represent a significant advance over current practice.

Design, setting, participants, & measurements

We conducted the Biological Markers of Recovery for the Kidney study as an ancillary to the Acute Renal Failure Trial Network study. Urine samples were collected on days 1, 7, and 14 from 76 patients who developed AKI and received renal replacement therapy (RRT) in the intensive care unit. We explored whether levels of urinary neutrophil gelatinase-associated lipocalin (uNGAL), urinary hepatocyte growth factor (uHGF), urinary cystatin C (uCystatin C), IL-18, neutrophil gelatinase-associated lipocalin/matrix metalloproteinase-9, and urine creatinine could predict subsequent renal recovery.

Results

We defined renal recovery as being alive and free of dialysis at 60 days from the start of RRT. Patients who recovered had higher uCystatin C on day 1 (7.27 versus 6.60 ng/mg·creatinine) and lower uHGF on days 7 and 14 (2.97 versus 3.48 ng/mg·creatinine; 2.24 versus 3.40 ng/mg·creatinine). Decreasing uNGAL and uHGF over the first 14 days was associated with greater odds of renal recovery. The most predictive model combined relative changes in biomarkers with clinical variables and yielded an area under the receiver operating characteristic curve of 0.94.
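The abstract reports discrimination as an area under the receiver operating characteristic curve (AUC). As a generic illustration only (not the study's model or data), the sketch below computes an AUC from binary recovery labels and predicted scores using the rank-sum (Mann-Whitney) identity; all labels and scores are hypothetical.

```python
def roc_auc(labels, scores):
    """AUC via the rank-sum identity: the probability that a randomly
    chosen positive case scores higher than a randomly chosen negative
    case, counting ties as 0.5. Assumes both classes are present."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical example: 1 = recovered (alive, dialysis free at day 60).
labels = [1, 1, 1, 0, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.2, 0.1, 0.3, 0.4]
print(roc_auc(labels, scores))  # prints 0.875
```

The rank-sum form is numerically identical to trapezoidal integration of the empirical ROC curve, so it is a convenient way to sanity-check a reported AUC from raw predictions.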

Conclusions

We showed that a panel of urine biomarkers can augment clinical risk prediction for recovery after AKI.

3.

Background and objective

ABO blood types are determined by antigen modifications on glycoproteins and glycolipids and associated with altered plasma levels of inflammatory and endothelial injury markers implicated in AKI pathogenesis. We sought to determine the association of ABO blood types with AKI risk in critically ill patients with trauma or sepsis.

Design, setting, participants, & measurements

We conducted two prospective cohort studies at an urban, academic, level I trauma center and tertiary referral center; 497 patients with trauma admitted to the surgical intensive care unit between 2005 and 2010 with an injury severity score >15 and 759 patients with severe sepsis admitted to the medical intensive care unit between 2008 and 2013 were followed for 6 days for the development of incident AKI. AKI was defined by Acute Kidney Injury Network creatinine and dialysis criteria.

Results

Of 497 patients with trauma, 134 developed AKI (27%). In multivariable analysis, blood type A was associated with higher AKI risk relative to type O among patients of European descent (n=229; adjusted risk, 0.28 versus 0.14; risk difference, 0.14; 95% confidence interval, 0.03 to 0.24; P=0.02). Of 759 patients with sepsis, AKI developed in 326 (43%). Blood type A again conferred higher AKI risk relative to type O among patients of European descent (n=437; adjusted risk, 0.53 versus 0.40; risk difference, 0.14; 95% confidence interval, 0.04 to 0.23; P=0.01). Findings were similar when analysis was restricted to those patients who did not develop acute respiratory distress syndrome or were not transfused. We did not detect a significant association between blood type and AKI risk among individuals of African descent in either cohort.

Conclusions

Blood type A is independently associated with AKI risk in critically ill patients of European descent with trauma or severe sepsis, suggesting a role for ABO glycans in AKI susceptibility.

4.

Background and objectives

Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false-positive rates because of the inherent laboratory and biologic variability of creatinine.

Design, setting, participants, & measurements

We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false-positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient's true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria.

Results

Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements accounting for laboratory variability calculated from assay characteristics and 4.4% biologic variability determined from the clinical cohort and publicly available data, the overall false-positive rate for AKI diagnosis was 8.0% (interquartile range, 7.9%–8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false-positive AKI diagnosis rate of 30.5% (interquartile range, 30.1%–30.9%) versus 2.0% (interquartile range, 1.9%–2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001).
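To make the simulation design concrete, here is a minimal sketch (not the study's code) of a Monte Carlo false-positive estimate for a patient whose true creatinine never changes. The 4.4% biologic variability is taken from the abstract; the laboratory coefficient of variation, the Gaussian error model, and the simplification that all draws fall within one 48-hour window are assumptions.

```python
import random

def fp_rate(true_cr, n_draws=4, bio_cv=0.044, lab_cv=0.025,
            n_patients=20000, seed=42):
    """Monte Carlo estimate of the false-positive AKI rate for patients
    whose true serum creatinine (mg/dl) never changes. Each draw adds
    independent biologic plus laboratory error; AKI is (falsely) flagged
    when any later draw exceeds an earlier one by >=0.3 mg/dl, a
    simplified creatinine criterion assuming all draws lie within 48 h.
    bio_cv=4.4% is from the abstract; lab_cv is an assumed value."""
    rng = random.Random(seed)
    # Combine independent coefficients of variation in quadrature.
    sd = true_cr * (bio_cv ** 2 + lab_cv ** 2) ** 0.5
    false_pos = 0
    for _ in range(n_patients):
        draws = [rng.gauss(true_cr, sd) for _ in range(n_draws)]
        if any(draws[j] - draws[i] >= 0.3
               for i in range(n_draws) for j in range(i + 1, n_draws)):
            false_pos += 1
    return false_pos / n_patients

# A fixed 0.3 mg/dl threshold with percentage-scale variability means
# false positives concentrate at higher baseline creatinine values.
print(fp_rate(1.0) < fp_rate(2.5))  # prints True
```

This reproduces the qualitative finding of the abstract: the same absolute 0.3 mg/dl threshold is crossed by measurement noise far more often when the baseline creatinine, and hence the absolute spread of repeated measurements, is higher.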

Conclusions

Use of small serum creatinine changes to diagnose AKI is limited by high false-positive rates caused by the inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies.

5.

Summary

Background and objectives

Prior studies have examined long-term outcomes of a single acute kidney injury (AKI) event in hospitalized patients. We examined the effects of AKI episodes during multiple hospitalizations on the risk of chronic kidney disease (CKD) in a cohort with diabetes mellitus (DM).

Design, setting, participants, & measurements

A total of 4082 patients with diabetes were followed from January 1999 until December 2008. The primary outcome was reaching stage 4 CKD (GFR<30 ml/min per 1.73 m2). AKI during hospitalization was defined as an increase in creatinine of >0.3 mg/dl or a 1.5-fold increase relative to the admission value. Cox survival models examined the effect of the first AKI episode, and of up to three episodes, as time-dependent covariates on the risk of stage 4 CKD. Covariates included demographic variables, baseline creatinine, and diagnoses of comorbidities including proteinuria.

Results

Of the 3679 patients who met eligibility criteria (mean age = 61.7 years [SD, 11.2]; mean baseline creatinine = 1.10 mg/dl [SD, 0.3]), 1822 required at least one hospitalization during the time under observation (mean = 61.2 months [SD, 25]). Five hundred thirty of 1822 patients experienced one AKI episode; 157 of 530 experienced ≥2 AKI episodes. In multivariable Cox proportional hazards models, any AKI versus no AKI was a risk factor for stage 4 CKD (hazard ratio [HR], 3.56; 95% confidence interval [CI], 2.76, 4.61); each AKI episode doubled that risk (HR, 2.02; 95% CI, 1.78, 2.30).

Conclusions

AKI episodes are associated with a cumulative risk for developing advanced CKD in diabetes mellitus, independent of other major risk factors of progression.

6.

Background and objectives

AKI is a risk factor for development or worsening of CKD. However, diagnosis of renal dysfunction by serum creatinine could be confounded by loss of muscle mass and creatinine generation after critical illness.

Design, setting, participants, & measurements

A retrospective, single-center analysis of serum creatinine in patients surviving to hospital discharge after an intensive care unit admission of 5 or more days between 2009 and 2011 was performed.

Results

In total, 700 cases were identified, with a 66% incidence of AKI. In 241 patients without AKI, creatinine was significantly lower (P<0.001) at hospital discharge than admission (median, 0.61 versus 0.88 mg/dl; median decrease, 33%). In 160 patients with known baseline, discharge creatinine was significantly lower than baseline in all patients except those patients with severe AKI (Kidney Disease Improving Global Outcomes category 3), who had no significant difference. In a multivariable regression model, median duration of hospitalization was associated with a predicted 30% decrease (95% confidence interval, 8% to 45%) in creatinine from baseline in the absence of AKI; after allowing for this effect, AKI was associated with a 29% (95% confidence interval, 10% to 51%) increase in predicted hospital discharge creatinine. Using a similar model to exclude the confounding effect of prolonged major illness on creatinine, 148 of 700 patients (95% confidence interval, 143 to 161) would have eGFR<60 ml/min per 1.73 m2 at hospital discharge compared with only 63 of 700 patients using eGFR based on unadjusted hospital creatinine (a 135% increase in potential CKD diagnoses; P<0.001).

Conclusion

Critical illness is associated with significant falls in serum creatinine that persist to hospital discharge, potentially causing inaccurate assessment of renal function at discharge, particularly in survivors of AKI. Prospective measurements of GFR and creatinine generation are required to confirm the significance of these findings.

7.

Background and objectives

AKI is frequent and is associated with poor outcomes. There is limited information on the epidemiology of AKI worldwide. This study compared patients with AKI in emerging and developed countries to determine the association of clinical factors and processes of care with outcomes.

Design, setting, participants, & measurements

This prospective observational study was conducted among intensive care unit patients from nine centers in developed countries and five centers in emerging countries. AKI was defined as an increase in creatinine of ≥0.3 mg/dl within 48 hours.

Results

Between 2008 and 2012, 6647 patients were screened, of whom 1275 (19.2%) developed AKI. A total of 745 (58% of those with AKI) agreed to participate and had complete data. Patients in developed countries had more sepsis (52.1% versus 38.0%) and higher Acute Physiology and Chronic Health Evaluation (APACHE) scores (mean±SD, 61.1±27.5 versus 51.1±25.2); those from emerging countries had more CKD (54.3% versus 38.3%), GN (6.3% versus 0.9%), and interstitial nephritis (7.0% versus 0.6%) (all P<0.05). Patients from developed countries were less often treated with dialysis (15.5% versus 30.2%; P<0.001) and started dialysis later after AKI diagnosis (2.0 [interquartile range, 0.75–5.0] days versus 0 [interquartile range, 0–5.0] days; P=0.02). Hospital mortality was 22.0%, and 13.3% of survivors were dialysis dependent at discharge. Independent risk factors associated with hospital mortality included older age, residence in an emerging country, use of vasopressors (emerging countries only), dialysis and mechanical ventilation, and higher APACHE score and cumulative fluid balance (developed countries only). A lower probability of renal recovery was associated with residence in an emerging country, higher APACHE score (emerging countries only) and dialysis, while mechanical ventilation was associated with renal recovery (developed countries only).

Conclusions

This study contrasts the clinical features and management of AKI and demonstrates worse outcomes in emerging than in developed countries. Variations in care may explain these findings and should be considered in future trials.

8.

Background and objectives

Disease biomarkers require appropriate clinical context to be used effectively. Combining clinical risk factors with small changes in serum creatinine has been proposed to improve the assessment of AKI and to identify the risk of AKI early in a patient's clinical course. We set out to assess the performance of this combination approach.

Design, setting, participants, & measurements

A secondary analysis of data from a prospective multicenter intensive care unit cohort study (September 2009 to April 2010) was performed. Patients were classified as high risk by the combination approach if they had an early increase in serum creatinine of 0.1–0.4 mg/dl, with the threshold depending on the number of clinical factors predisposing to AKI. AKI was defined and staged using the Acute Kidney Injury Network criteria. The primary outcome was evolution to severe AKI (Acute Kidney Injury Network stages 2 and 3) within 7 days in the intensive care unit.

Results

Of 506 patients, 214 (42.2%) patients had early creatinine elevation and were deemed at high risk for AKI. This group was more likely to subsequently develop the primary endpoint (16.4% versus 1.0% [not at high risk], P<0.001). The sensitivity of this grouping for severe AKI was 92%, the specificity was 62%, the positive predictive value was 16%, and the negative predictive value was 99%. After adjustment for Sequential Organ Failure Assessment score, serum creatinine, and hazard tier for AKI, early creatinine elevation remained an independent predictor for severe AKI (adjusted relative risk, 12.86; 95% confidence interval, 3.52 to 46.97). Addition of early creatinine elevation to the best clinical model improved prediction of the primary outcome (area under the receiver operating characteristic curve increased from 0.75 to 0.83, P<0.001).

Conclusion

Critically ill patients at high AKI risk, based on the combination of clinical factors and early creatinine elevation, are significantly more likely to develop severe AKI. As initially hypothesized, the high-risk combination group methodology can also identify patients at low risk for severe AKI, in whom AKI biomarker testing may be expected to have low yield, and could therefore allow clinicians to optimize biomarker use.

9.

Summary

Background and objectives

To date there is no reliable marker for the differentiation of prerenal and intrinsic acute kidney injury (AKI). We investigated whether urinary calprotectin, a mediator protein of the innate immune system, may serve as a diagnostic marker in AKI.

Design, setting, participants, & measurements

This was a cross-sectional study of 101 subjects, including 86 patients with AKI (34 prerenal, 52 intrinsic, including 23 patients with urinary tract infection) and 15 healthy controls. Urinary calprotectin concentration was assessed by ELISA, and kidney biopsy specimens were examined by immunohistochemistry using a calprotectin antibody. The inclusion criterion was admission to hospital for AKI stage 1 to 3 (Acute Kidney Injury Network); exclusion criteria were prior renal transplantation and obstructive uropathy.

Results

Median urinary calprotectin was 60.7 times higher in intrinsic AKI (1692 ng/ml) than in prerenal AKI (28 ng/ml; P<0.01). Urinary calprotectin in prerenal disease was not significantly different from that in healthy controls (45 ng/ml; P=0.25). Receiver operating characteristic curve analysis revealed a high accuracy of calprotectin (area under the curve, 0.97) in predicting intrinsic AKI. A cutoff level of 300 ng/ml provided a sensitivity of 92.3% and a specificity of 97.1%. Calculating urinary calprotectin/creatinine ratios did not further increase accuracy. Immunostainings of kidney biopsies were positive for calprotectin in intrinsic AKI and negative in prerenal AKI.

Conclusions

Accuracy of urinary calprotectin in the differential diagnosis of AKI is high. Whereas calprotectin levels in prerenal disease are comparable with those in healthy controls, intrinsic AKI leads to highly increased calprotectin concentrations.

10.

Objective

Acute kidney injury (AKI) frequently occurs after catheter-based interventional procedures and increases mortality. However, the implications of AKI before thoracic endovascular aneurysm repair (TEVAR) of type B acute aortic dissection (AAD) remain unclear. This study evaluated the incidence, predictors, and in-hospital outcomes of AKI before TEVAR in patients with type B AAD.

Methods

Between 2009 and 2013, 76 patients who received TEVAR for type B AAD within 36 h of symptom onset were retrospectively evaluated. The patients were classified into no-AKI and AKI groups, and the severity of AKI was further staged before TEVAR according to Kidney Disease: Improving Global Outcomes criteria.

Results

The incidence of preoperative AKI was 36.8%. The rate of in-hospital complications was significantly higher in patients with preoperative AKI than in those without (50.0% versus 4.2%; P<0.001), including acute renal failure (21.4% versus 0; P<0.001), and complications increased with severity of AKI (P<0.001). The maximum levels of body temperature and white blood cell count were significantly related to the maximum serum creatinine level before TEVAR. Multivariate analysis showed that systolic blood pressure on admission (odds ratio, 1.023; 95% confidence interval, 1.003 to 1.044; P=0.0238) and bilateral renal artery involvement (odds ratio, 19.076; 95% confidence interval, 1.914 to 190.164; P=0.0120) were strong predictors of preoperative AKI.

Conclusions

Preoperative AKI occurred frequently in patients with type B AAD and correlated with higher in-hospital complications and an enhanced inflammatory reaction. Systolic blood pressure on admission and bilateral renal artery involvement were major risk factors for AKI before TEVAR.

11.

Summary

Background and objectives

Experimental acute kidney injury (AKI) activates the HMG-CoA reductase (HMGCR) gene, producing proximal tubule cholesterol loading. AKI also causes sloughing of proximal tubular cell debris into tubular lumina. This study tested whether these two processes culminate in increased urinary pellet cholesterol content, and whether the latter has potential AKI biomarker utility.

Design, setting, participants, & measurements

Urine samples were collected from 29 critically ill patients with (n=14) or without (n=15) AKI, 15 patients with chronic kidney disease, and 15 healthy volunteers. Centrifuged urinary pellets underwent lipid extraction, and the extracts were assayed for cholesterol content (factored by membrane phospholipid phosphate content). In vivo HMGCR activation was sought by measuring levels of RNA polymerase II (Pol II), and of a gene-activating histone mark (H3K4m3), at exon 1 of the HMGCR gene (chromatin immunoprecipitation assay of urine chromatin samples).

Results

AKI+ patients had an approximate doubling of urinary pellet cholesterol content compared with normal control urine samples (P<0.001). The values correlated significantly (r=0.5; P<0.01) with serum, but not urine, creatinine concentrations. Conversely, neither critical illness without AKI nor chronic kidney disease raised pellet cholesterol levels. Increased HMGCR activity in the AKI+ patients was supported by three- to fourfold increased levels of Pol II, and of H3K4m3, at the HMGCR gene (versus controls or AKI− patients).

Conclusions

(1) Clinical AKI, like experimental AKI, induces HMGCR gene activation; (2) urinary pellet cholesterol levels increase as a result; and (3) urine pellet cholesterol may have utility as an AKI biomarker, which will require testing in a large prospective trial.

12.

Summary

Background and objectives

Low bone mineral density and coronary artery calcification (CAC) are highly prevalent among chronic kidney disease (CKD) patients, and both conditions are strongly associated with higher mortality. The study presented here aimed to investigate whether reduced vertebral bone density (VBD) was associated with the presence of CAC in the earlier stages of CKD.

Design, setting, participants, & measurements

Seventy-two nondialyzed CKD patients (age 52 ± 11.7 years, 70% male, 42% diabetics, creatinine clearance 40.4 ± 18.2 ml/min per 1.73 m2) were studied. VBD and CAC were quantified by computed tomography.

Results

CAC >10 Agatston units (AU) was observed in 50% of the patients (median, 120 AU [interquartile range, 32 to 584 AU]), and a calcification score ≥400 AU was found in 19% (736 AU [527 to 1012 AU]). VBD (190 ± 52 Hounsfield units) correlated inversely with age (r = −0.41, P < 0.001) and calcium score (r = −0.31, P = 0.01); no correlation was found with gender, creatinine clearance, proteinuria, lipid profile, mineral parameters, body mass index, or diabetes. Patients in the lowest tertile of VBD had markedly higher calcium scores than the middle and highest tertile groups. In multiple logistic regression analysis adjusting for confounding variables, low VBD was independently associated with the presence of CAC.

Conclusions

Low VBD was associated with CAC in nondialyzed CKD patients. The authors suggest that low VBD might constitute another nontraditional risk factor for cardiovascular disease in CKD.

13.

Background and objectives

Observational evidence has suggested that RRT modality may affect recovery after AKI. It is unclear whether initial choice of intermittent hemodialysis or continuous RRT affects renal recovery, survival, or development of ESRD in critically ill patients when modality choice is made primarily on hemodynamics.

Design, setting, participants, & measurements

We performed a retrospective cohort study examining adults (≥18 years old) admitted to intensive care units from 2000 to 2008 who received RRT for AKI and survived to hospital discharge or 90 days. We analyzed renal recovery (alive and not requiring RRT) and reasons for nonrecovery (death or ESRD) at 90 and 365 days. Conditional multivariable logistic regression was used to assess differences in renal recovery at 90 and 365 days between continuous RRT and intermittent hemodialysis. Models were stratified by propensity for continuous RRT and adjusted for age and reference creatinine.

Results

Of 4738 patients with Kidney Disease Improving Global Outcomes stage 3 AKI, 1338 (28.2%) received RRT, and 638 (47.7%) survived to hospital discharge (353 intermittent hemodialysis and 285 continuous RRT). Recovery from AKI was lower for intermittent hemodialysis versus continuous RRT at 90 days (66.6% intermittent hemodialysis versus 75.4% continuous RRT; P=0.02) but similar at 365 days (54.1% intermittent hemodialysis versus 59.6% continuous RRT; P=0.17). In multivariable analysis, there was no difference in odds of recovery at 90 or 365 days for patients initially treated with continuous RRT versus intermittent hemodialysis (90 days: odds ratio, 1.19; 95% confidence interval, 0.91 to 1.55; P=0.20; 365 days: odds ratio, 0.93; 95% confidence interval, 0.72 to 1.2; P=0.55).

Conclusions

We found no significant difference in the odds of nonrecovery or in reasons for nonrecovery (mortality or ESRD) with intermittent hemodialysis versus continuous RRT. These results suggest that, when the initial RRT modality is chosen primarily on hemodynamics, renal recovery and clinical outcomes in survivors are similar between intermittent hemodialysis and continuous RRT.

14.

Summary

Background and objectives

Management of volume status in patients with acute kidney injury (AKI) is complex, and the role of diuretics is controversial. The primary objective was to elucidate the association between fluid balance, diuretic use, and short-term mortality after AKI in critically ill patients.

Design, setting, participants, & measurements

Using data from the Fluid and Catheter Treatment Trial (FACTT), a multicenter, randomized controlled trial evaluating a conservative versus liberal fluid-management strategy in 1000 patients with acute lung injury (ALI), we evaluated the association of post-renal injury fluid balance and diuretic use with 60-day mortality in patients who developed AKI, as defined by the AKI Network criteria.

Results

A total of 306 patients developed AKI in the first 2 study days and were included in our analysis: 137 in the fluid-liberal arm and 169 in the fluid-conservative arm (P = 0.04). Baseline characteristics were similar between groups. Post-AKI fluid balance was significantly associated with mortality in both crude and adjusted analyses. Higher post-AKI furosemide doses had an apparent protective effect on mortality that was no longer significant after adjustment for post-AKI fluid balance. There was no threshold dose of furosemide above which mortality increased.

Conclusions

A positive fluid balance after AKI was strongly associated with mortality. Post-AKI diuretic therapy was associated with 60-day patient survival in FACTT patients with ALI; this effect may be mediated by fluid balance.

15.

Summary

Background and objectives

Dialysis patients are at high risk for low-trauma bone fracture. Bone density measurements using dual-energy x-ray absorptiometry (DXA) do not reliably differentiate between patients with and without fractures. The aim of this study was to identify differences in bone microarchitecture between patients with and without a history of fracture using high-resolution peripheral quantitative computed tomography (HR-pQCT).

Design, setting, participants, & measurements

Seventy-four prevalent hemodialysis patients were recruited for measurements of areal bone mineral density (aBMD) by DXA and bone microarchitecture by HR-pQCT. Patients with a history of trauma-related fracture were excluded. Forty healthy volunteers served as controls. Blood levels of parathyroid hormone, vitamin D, and markers of bone turnover were determined.

Results

Dialysis patients, particularly women, had markedly impaired bone microarchitecture. Patients with fractures had significantly reduced cortical and trabecular microarchitecture compared with patients without fractures. aBMD tended to be lower in patients with fractures, but the differences were not statistically significant. The strongest determinant of fracture was the HR-pQCT-measured trabecular density of the tibia, which also had the highest discriminatory power to differentiate patients according to fracture status. Radial DXA had a lower discriminatory power than trabecular density.

Conclusions

Bone microarchitecture is severely impaired in dialysis patients and even more so in patients with a history of fracture. HR-pQCT can identify dialysis patients with a history of low-trauma fracture.

16.

Background

The aim of this study was to identify risk factors for acute kidney injury (AKI) in overweight patients who underwent surgery for acute type A aortic dissection (TAAD).

Methods

A retrospective study of 108 consecutive overweight patients (body mass index [BMI] ≥24) treated between December 2009 and April 2013 at Beijing Anzhen Hospital was performed. AKI was defined by the Acute Kidney Injury Network (AKIN) criteria, which are based on serum creatinine (sCr) or urine output.

Results

The mean age of the patients was 43.69±9.66 years. Seventy-two patients (66.7%) developed AKI during the postoperative period. Logistic regression analysis identified two independent risk factors for AKI: elevated preoperative sCr level and 72-h drainage volume. Renal replacement therapy (RRT) was required in 15 patients (13.9%). The overall postoperative mortality rate was 7.4% (8.3% in the AKI group and 5.6% in the non-AKI group); the difference between the two groups was not statistically significant (P=0.32).

Conclusions

A high incidence of AKI (66.7%) in overweight patients with acute TAAD was confirmed. The logistic regression model identified elevated preoperative sCr level and 72-h drainage volume as independent risk factors for AKI in overweight patients. Greater attention should be paid to preventing AKI in overweight patients with TAAD.

17.

Background and objectives

The secular trend toward dialysis initiation at progressively higher levels of eGFR is not well understood. This study compared temporal trends in eGFR at dialysis initiation within versus outside the Department of Veterans Affairs (VA)—the largest non–fee-for-service health system in the United States.

Design, setting, participants, & measurements

The study used linked data from the US Renal Data System, VA, and Medicare to compare temporal trends in eGFR at dialysis initiation between 2000 and 2009 (n=971,543). Veterans who initiated dialysis within the VA were compared with three groups who initiated dialysis outside the VA: (1) veterans whose dialysis was paid for by the VA, (2) veterans whose dialysis was not paid for by the VA, and (3) nonveterans. Logistic regression was used to estimate average predicted probabilities of dialysis initiation at an eGFR≥10 ml/min per 1.73 m2.

Results

The adjusted probability of starting dialysis at an eGFR≥10 ml/min per 1.73 m2 increased over time for all groups but was lower for veterans who started dialysis within the VA (0.31; 95% confidence interval [95% CI], 0.30 to 0.32) than for those starting outside the VA, including veterans whose dialysis was (0.36; 95% CI, 0.35 to 0.38) and was not (0.40; 95% CI, 0.40 to 0.40) paid for by the VA and nonveterans (0.39; 95% CI, 0.39 to 0.39). Differences in eGFR at initiation within versus outside the VA were most pronounced among older patients (P for interaction <0.001) and those with a higher risk of 1-year mortality (P for interaction <0.001).

Conclusions

Temporal trends in eGFR at dialysis initiation within the VA mirrored those in the wider United States dialysis population, but eGFR at initiation was consistently lowest among those who initiated within the VA. Differences in eGFR at initiation within versus outside the VA were especially pronounced in older patients and those with higher 1-year mortality risk.

18.

Background and objectives

Patients on dialysis are at high risk of inadequate nutrition, and their nutritional status is particularly susceptible to deterioration during intercurrent events such as hospitalization. This study was conducted to improve understanding of the temporal evolution of nutritional parameters as a foundation for rational and proactive nutritional intervention.

Design, setting, participants, & measurements

A retrospective cohort study was performed to investigate the temporal evolution of nutritional parameters (serum albumin, serum phosphate, serum creatinine, equilibrated normalized protein catabolic rate, and interdialytic weight gain) and a composite nutritional score derived from these parameters, in two populations: (1) incident hemodialysis (HD) patients who started HD between January 2006 and December 2011 and were followed for up to 54 months (median 16.3), and (2) prevalent patients with HD vintage ≥2.5 years who were hospitalized between January 2006 and December 2011 and followed from 6 months before to 6 months after hospitalization.
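The abstract does not specify how the composite nutritional score is constructed. One plausible sketch, consistent with the percentile units reported in the results, is to rank each parameter within the cohort and average the percentile ranks. This is a hypothetical construction, assuming higher raw values indicate better nutrition and ignoring ties:

```python
def percentile_rank(values):
    # Map each value to its percentile (0-100) within the cohort.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for position, i in enumerate(order):
        ranks[i] = 100.0 * position / (len(values) - 1)
    return ranks

def composite_score(parameters):
    # parameters: one list per nutritional parameter (e.g. albumin,
    # phosphate, creatinine, enPCR, IDWG), aligned by patient; higher raw
    # values are assumed here to indicate better nutritional status.
    per_param_ranks = [percentile_rank(p) for p in parameters]
    n_patients = len(parameters[0])
    return [sum(r[i] for r in per_param_ranks) / len(per_param_ranks)
            for i in range(n_patients)]
```

Under such a construction, the "24th percentile at the start of HD" reading in the results corresponds to a composite score of 24 on this 0-100 scale.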

Results

In incident patients (n=126,964), each of the nutritional parameters improved after HD initiation, with a mean composite nutritional score at the 24th percentile at the start of HD and reaching a plateau at the 57th percentile toward the end of the second year on dialysis. Nutritional parameters increased more rapidly and reached higher values among patients who survived longer. In hospitalized patients (n=14,193), the nutritional parameters and the composite score began to decline 1–2 months before hospitalization, reached their lowest level in the month after hospitalization, and then partially recovered in the subsequent 5 months. The degree of recovery of the nutritional score was inversely related to the number of rehospitalizations.

Conclusions

This study increases the understanding of nutritional resilience and its determinants in HD patients. Application of the nutritional score, pending further validation, may facilitate targeted and timely interventions to avert the negative consequences of inadequate nutrition in chronic HD patients.

19.

Summary

Background and objectives

Acute kidney injury (AKI) complicating cardiopulmonary bypass (CPB) results in increased morbidity and mortality. Semiquantitative mass spectrometry (SELDI-TOF-MS) has previously shown that urinary hepcidin-25 is elevated in patients who do not develop AKI after CPB. The goals of this study were to validate these findings quantitatively with ELISA and to evaluate the diagnostic performance of hepcidin-25 for AKI.

Design, setting, participants, & measurements

A nested case-control analysis of urinary hepcidin-25 in AKI (n = 22) and non-AKI (n = 22) patients was conducted to validate the SELDI-TOF-MS data at the following times: preoperatively; at the start of CPB; after 1 hour on CPB; on arrival in the intensive care unit; and on postoperative days (POD) 1 and 3 to 5. The diagnostic performance of hepcidin-25 was then evaluated in the entire prospective observational cohort (n = 338) at POD 1. AKI was defined as an increase in serum creatinine of >50% from baseline within 72 hours postoperatively.

Results

Urinary hepcidin-25/Cr ratio was significantly elevated in all patients at POD 1 compared with baseline (P < 0.0005) and was also significantly elevated in non-AKI versus AKI patients at POD 1 (P < 0.0005). Increased log10 hepcidin-25/Cr ratio was strongly associated with avoidance of AKI on univariate analysis. On multivariate analysis, the log10 hepcidin-25/Cr ratio (P < 0.0001) was associated with avoidance of AKI with an area under the curve of 0.80, sensitivity 0.68, specificity 0.68, and negative predictive value 0.96.
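The diagnostic statistics reported above (AUC, sensitivity, specificity, negative predictive value) can all be computed from case labels, marker values, and a decision threshold. A minimal sketch with made-up toy numbers rather than the study's data, treating "avoidance of AKI" as the positive class since the marker predicts avoidance:

```python
def diagnostic_metrics(y_true, scores, threshold):
    # y_true: 1/True for the positive class (here, avoided AKI);
    # scores: marker values (e.g. log10 hepcidin-25/Cr ratio).
    pos = [s for s, t in zip(scores, y_true) if t]
    neg = [s for s, t in zip(scores, y_true) if not t]
    # AUC as the Mann-Whitney probability that a random positive case
    # scores higher than a random negative case (ties count half).
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    auc = wins / (len(pos) * len(neg))
    pred = [s >= threshold for s in scores]
    tp = sum(p and t for p, t in zip(pred, y_true))
    fn = sum((not p) and t for p, t in zip(pred, y_true))
    tn = sum((not p) and (not t) for p, t in zip(pred, y_true))
    fp = sum(p and (not t) for p, t in zip(pred, y_true))
    return {"auc": auc,
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "npv": tn / (tn + fn)}
```

With the study's threshold choice, this function would reproduce the reported AUC 0.80, sensitivity 0.68, specificity 0.68, and NPV 0.96 from the raw cohort data.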

Conclusions

Elevated urinary hepcidin-25 on POD 1 is a strong predictor of the avoidance of AKI beyond postoperative day 1.

20.

Background

There is considerable controversy regarding the diagnosis of acute kidney injury (AKI), for which more than 30 different definitions have been proposed.

Objective

To evaluate the incidence of, and risk factors for, AKI following cardiac surgery according to the RIFLE, AKIN, and KDIGO criteria, and to compare the prognostic power of these criteria.

Methods

A cross-sectional study of 321 consecutive patients (median age 62 [53-71] years; 140 men) undergoing cardiac surgery between June 2011 and January 2012. Patients were followed for up to 30 days for a composite outcome (mortality, need for dialysis, and extended hospitalization).

Results

The incidence of AKI ranged from 15% to 51%, according to the diagnostic criterion adopted. Although age was associated with the risk of AKI under all three criteria, the remaining risk factors varied. During follow-up, 89 patients developed the composite outcome, and all three criteria were associated with increased risk in univariate Cox analysis and after adjustment for age, gender, diabetes, and type of surgery. However, after further adjustment for extracorporeal circulation and the presence of low cardiac output, only AKI diagnosed by the KDIGO criterion maintained a significant association (HR = 1.89 [95% CI: 1.18-3.06]).
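Of the three criteria compared, the creatinine-based KDIGO staging rules can be sketched as follows. This is a simplified illustration, not the study's implementation: the urine-output criteria are omitted, and the stage 3 "sCr ≥ 4.0 mg/dL" rule is approximated by additionally requiring a ≥1.5-fold rise; consult the KDIGO guideline for the full definition.

```python
def kdigo_stage(baseline_scr, peak_scr, rise_48h=None, on_rrt=False):
    """Approximate creatinine-based KDIGO AKI stage (0 = no AKI).

    baseline_scr, peak_scr: serum creatinine in mg/dL
    rise_48h: largest absolute sCr rise within any 48-h window, if known
    on_rrt: whether renal replacement therapy was initiated
    """
    ratio = peak_scr / baseline_scr
    # Stage 3: RRT, >=3.0x baseline, or sCr >= 4.0 mg/dL with an acute rise.
    if on_rrt or ratio >= 3.0 or (peak_scr >= 4.0 and ratio >= 1.5):
        return 3
    # Stage 2: 2.0-2.9x baseline.
    if ratio >= 2.0:
        return 2
    # Stage 1: 1.5-1.9x baseline, or an absolute rise >= 0.3 mg/dL in 48 h.
    if ratio >= 1.5 or (rise_48h is not None and rise_48h >= 0.3):
        return 1
    return 0
```

The absolute-rise clause is what chiefly separates KDIGO and AKIN from RIFLE at the mild end of the spectrum, which is one reason the measured incidence of AKI varies so widely across the three criteria.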

Conclusion

The incidence of and risk factors for AKI after cardiac surgery vary significantly according to the diagnostic criteria used. In our analysis, the KDIGO criterion was superior to the AKIN and RIFLE criteria with regard to prognostic power.
