Abstract

Introduction: Patients with liver cirrhosis and septic shock have a significantly higher risk of mortality and morbidity than non-cirrhotic patients. The peripheral blood lymphocyte-to-monocyte ratio (LMR) can help determine the prognosis of cirrhotic patients. Our study aimed to investigate the usefulness of the LMR as a predictive marker of mortality risk in cirrhotic patients with septic shock.

Methods: This single-center, retrospective case-control study included adult patients who visited the emergency department between January 1, 2018 and June 30, 2020 and were diagnosed with liver cirrhosis and septic shock. Patients were divided into survivor and non-survivor groups according to their survival status at the 60-day follow-up. We used a Cox proportional hazards regression model to identify independent factors associated with mortality risk and tested the discriminative ability of those factors for mortality using the area under the receiver operating characteristic curve (AUC).

Results: A total of 93 patients were eligible for this study. Compared with patients in the survivor group, those in the non-survivor group had significantly higher Child-Pugh (11 ± 2 vs. 9 ± 2, p < 0.001) and MELD (29 ± 6 vs. 22 ± 8, p < 0.001) scores; higher serum international normalized ratio (1.7 vs. 1.4, p = 0.03), bilirubin (6.0 vs. 3.3 mg/dL, p = 0.02), lactate (5.4 vs. 2.7 mmol/L, p < 0.01), and creatinine (2.2 vs. 1.6 mg/dL, p = 0.04); a higher neutrophil-to-lymphocyte ratio (13.0 vs. 10.3, p = 0.02); and a lower LMR (1.1 vs. 2.3, p < 0.01). The LMR (adjusted hazard ratio [aHR] = 1.54, p = 0.01) and lactate (aHR = 1.03, p < 0.01) were identified as independent predictors of mortality in the multivariate regression model. Furthermore, the LMR (AUC: 0.87) showed superior discriminative ability for mortality prediction compared with the Child-Pugh (AUC: 0.72) and MELD (AUC: 0.76) scores.

Conclusions: The LMR can be used to predict mortality risk in cirrhotic patients with septic shock.