Anastomotic leakage (AL) represents a serious complication after abdominal surgery. Therefore, it is important to detect it early before it becomes clinically apparent. The predictive value of C-reactive protein (CRP) as a marker of infective postoperative complications, particularly in the form of anastomotic leakage, has been investigated by several authors with promising results. The aim of this study was to evaluate the diagnostic accuracy of C-reactive protein in predicting anastomotic leakage.
Methods
The serum CRP level, white blood cell (WBC) count, and body temperature (BT) of 156 patients who underwent elective abdominal surgery with primary anastomosis were monitored daily until postoperative day (POD) 7. We recorded all postoperative complications and analyzed the data. Diagnostic accuracy of CRP with regard to development of AL was assessed by receiver operating characteristic curve analysis.
Results
Fifteen patients (9.6 %) developed anastomotic leakage. CRP was significantly higher every day during the first 7 postoperative days in patients who developed AL compared with those patients who did not develop complications, whereas the WBC count and BT were not. A CRP cutoff value of 135 mg/l on POD 3 yielded a sensitivity of 73 %, a specificity of 73 %, and a negative predictive value of 95.4 % for the detection of AL.
Conclusions
According to our results, values of CRP less than 135 mg/l on POD 3 may contribute to a safe discharge from hospital. Patients with CRP values higher than 135 mg/l on POD 3 require prolonged hospitalization and an intensive search for infective complications, particularly AL.
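The cutoff evaluation reported above (sensitivity, specificity, and negative predictive value at a CRP threshold of 135 mg/l) can be sketched as follows. This is a minimal illustration on synthetic data, not the study cohort; the function name and the example values are invented for demonstration.

```python
import numpy as np

def diagnostic_metrics(values, has_leak, cutoff):
    """Sensitivity, specificity, and NPV of a biomarker at a given cutoff.

    values:   biomarker measurements (e.g. CRP in mg/l on POD 3)
    has_leak: boolean array, True if the patient developed AL
    A test is 'positive' when the value exceeds the cutoff.
    """
    values = np.asarray(values, dtype=float)
    has_leak = np.asarray(has_leak, dtype=bool)
    positive = values > cutoff

    tp = np.sum(positive & has_leak)    # leak correctly flagged
    fn = np.sum(~positive & has_leak)   # leak missed by the test
    tn = np.sum(~positive & ~has_leak)  # no leak, test negative
    fp = np.sum(positive & ~has_leak)   # false alarm

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)  # probability of no leak given a negative test
    return sensitivity, specificity, npv

# Illustrative synthetic cohort (NOT the study data).
crp = [80, 110, 150, 200, 90, 250, 120, 140, 60, 300]
leak = [False, False, True, True, False, True, False, False, False, True]
sens, spec, npv = diagnostic_metrics(crp, leak, cutoff=135)
```

In a ROC analysis such as the one described in the Methods, the same three quantities would be computed at every candidate cutoff and the threshold chosen to balance sensitivity and specificity.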
Acute liver failure (ALF) or fulminant hepatitis is a rare yet severe outcome of infection with hepatitis B virus (HBV) that carries a high mortality rate. The occurrence of a life-threatening condition upon infection with a prevalent virus in individuals without known risk factors is suggestive of pathogen-specific immune dysregulation. In the absence of established differences in HBV virulence, we hypothesized that ALF upon primary infection with HBV could be due to rare deleterious variants in the human genome. To search for such variants, we performed exome sequencing in 21 previously healthy adults who required liver transplantation upon fulminant HBV infection and 172 controls who were positive for anti-HBc and anti-HBs but had no clinical history of jaundice or liver disease. After a series of hypothesis-driven filtering steps, we searched for putatively pathogenic variants that were significantly associated with case-control status. We did not find any causal variant or gene, a result that does not support the hypothesis of a shared monogenic basis for human susceptibility to HBV-related ALF in adults. This study represents a first attempt at deciphering the human genetic contribution to the most severe clinical presentation of acute HBV infection in previously healthy individuals.
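A per-variant case-control association of the kind described above is commonly tested with a Fisher exact test on carrier counts; the abstract does not name the exact statistic used, so the following is an illustrative sketch, with hypothetical counts.

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for a 2x2 carrier table:

                 carriers  non-carriers
        cases        a          b
        controls     c          d

    Tests for enrichment of variant carriers among cases by summing
    hypergeometric tail probabilities P(X >= a).
    """
    n = a + b + c + d
    row1 = a + b   # total cases
    col1 = a + c   # total carriers
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# Hypothetical example at the study's sample sizes (21 cases, 172 controls):
# 5 of 21 cases carry a candidate variant vs. 4 of 172 controls.
p = fisher_exact_one_sided(5, 16, 4, 168)
```

With only 21 cases, such a test is powered to detect only very large effects, which is consistent with the study's framing as a search for a shared monogenic cause.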
BACKGROUND: Laparoscopic surgery is widely recognized as a well-tolerated and effective method for cholecystectomy. It is also considered cost saving because it has been associated with a decreased hospital length of stay. Variables that might lead to increased costs in laparoscopic surgery are the technique and drugs used in anesthesia. OBJECTIVE: The goal of this study was to compare the costs of 2 anesthetic techniques used in laparoscopic cholecystectomy (LC)--balanced versus IV anesthesia--from the standpoint of an outpatient surgical department, with a time horizon of 1 year. METHODS: Patients scheduled to undergo elective LC were enrolled in this prospective study. Patients were randomly allocated to receive balanced anesthesia, administered as low fresh gas flow (LFGF) with inhalational sevoflurane and IV sufentanil in a target controlled infusion (LFGF SS group), or IV anesthesia, administered as IV propofol/sufentanil in a target controlled infusion (TCI group). We used a microcosting procedure to measure health care resource utilization in individual patients to detect treatment differences. The costs of medications used for the induction and maintenance of anesthesia during surgery were considered for LFGF SS and TCI. Other end points included duration of anesthesia; mean times to early emergence, tracheal extubation, orientation, and postanesthesia discharge (PAD); pain intensity before first analgesia; number of analgesics required in the first 24 hours after surgery; and prevalences of nausea, vomiting, and agitation. RESULTS: A total of 60 patients were included in this analysis (male/female ratios in the LFGF SS and TCI groups: 11/19 and 12/18, respectively; mean [SD] ages, 48 [7.9] and 47 [8.6] years; and mean [SD] body mass indexes, 26 [2.0] and 26 [3.0] kg/m²). The costs of anesthetics were significantly lower with LFGF SS compared with TCI (€17.40 [€2.66] vs €22.01 [€2.50] [2006 euros]).
Times to early emergence and tracheal extubation were significantly shorter with LFGF SS than TCI (5.97 [1.16] vs 7.73 [1.48] minutes and 7.57 [1.07] vs 8.87 [1.45] minutes, respectively). There were no significant between-group differences in mean duration of anesthesia; times to orientation and PAD; pain intensity before first analgesia; number of analgesics required in the first 24 hours; or prevalences of nausea, vomiting, and agitation. Because no clinically significant differences in the anesthetic results were observed, a cost-minimization analysis was conducted and found that using LFGF SS, the outpatient surgical department could realize a budget savings of €454 per 100 patients. For the nearly 1000 expected patients per year, the savings for the department were calculated as €4540. CONCLUSION: The results from this cost analysis in these patients who underwent elective LC suggest that the use of sevoflurane through the LFGF technique would be cost saving in this outpatient surgical department.
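When two strategies are clinically equivalent, as above, a cost-minimization analysis reduces to multiplying the per-patient cost difference by the expected volume. A minimal sketch of that arithmetic follows; the function name is invented, and the inputs are the study's reported mean drug costs.

```python
def cost_minimization_savings(mean_cost_a, mean_cost_b, n_patients):
    """Projected budget savings from choosing the cheaper of two
    clinically equivalent strategies (cost-minimization analysis).

    Returns the cohort savings when strategy A (cheaper) replaces
    strategy B, given their mean per-patient costs.
    """
    per_patient_difference = mean_cost_b - mean_cost_a
    return per_patient_difference * n_patients

# Reported mean anesthetic drug costs (2006 euros): LFGF SS 17.40 vs TCI 22.01.
# Note: the rounded means imply about EUR 461 per 100 patients; the study
# reports EUR 454, presumably computed from unrounded patient-level costs.
savings_per_100 = cost_minimization_savings(17.40, 22.01, 100)
```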
To assess the presence of subclinical left ventricular myocardial dysfunction in subjects with high-normal blood pressure (BP) and untreated arterial hypertension, using three-dimensional (3D) echocardiography strain analysis. This cross-sectional study included 49 subjects with optimal BP, 50 subjects with high-normal BP, and 50 newly diagnosed untreated hypertensive patients matched by gender and age. All the subjects underwent 24 h blood pressure monitoring and complete two-dimensional and 3D echocardiography examination. The enrolled subjects were grouped according to 24 h systolic BP values, dividing the subjects with optimal BP from those with high-normal BP and the hypertensive patients (cut-off values were 120 and 130 mmHg, respectively). 3D global longitudinal strain was significantly lower in the high-normal BP group and the hypertensive patients, in comparison with the optimal BP group (−20.5 ± 3.3 vs. −18.7 ± 2.8 vs. −17.6 ± 2.7 %, p < 0.001). Similar results were obtained for 3D global circumferential strain (−18.6 ± 3 vs. −17.1 ± 2.9 vs. −16 ± 2.5 %, p < 0.001), as well as for 3D global radial strain (49.4 ± 9.5 vs. 44.7 ± 8.1 vs. 43.5 ± 7.8 %, p = 0.002) and global area strain (−31.2 ± 4.8 vs. −28.7 ± 4.2 vs. −27.1 ± 4.5 %, p < 0.001). LV twist was increased in the hypertensive patients in comparison with the high-normal and the optimal BP groups (10.1° ± 2.4° vs. 10.8° ± 2.6° vs. 13.8° ± 3.1°, p < 0.01), whereas untwisting rate significantly and gradually decreased from the optimal BP group, across the high-normal BP group, to the hypertensive patients (−135 ± 35 vs. −118 ± 31 vs. −102 ± 27°/s, p < 0.001). 3D echocardiography revealed that the subjects with high-normal BP suffered subclinical impairment of LV mechanics similar to that of the hypertensive patients.
To assess whether implant macrodesign parameters, interacting with implant time in function (Tf), could influence peri-implantitis occurrence.
Materials and methods
One hundred and two patients (55.17 ± 11.2 years old) with diagnosed early/moderate peri-implantitis around endosseous implants with implant-supported prosthetic reconstructions (n = 139) were recruited. Implant macrodesign (implant shape, thread number, implant collar), clinical parameters (peri-implant probing depth (PPD), clinical attachment level (CAL), keratinised tissue width (KTW), plaque index, bleeding on probing), implant placement localisation and region, and Tf were assessed and compared.
Results
Peri-implantitis occurred approximately 6.1 ± 3.38 years after implant loading. There was a significant positive correlation between implant macrodesign and Tf. Peri-implantitis rates were statistically significantly higher in implants with a cylindrical shape and triple threads in the posterior part of the mandible (p = 0.037 and 0.012, respectively). Thread number and implant shape interacting with Tf showed statistically significant influences on CAL and PPD increase (p < 0.05). The results indicated a statistically significant positive interaction between Tf and KTW decrease around implants with a microthreaded collar (p < 0.001).
Conclusion
Peri-implantitis may be characterized as a time-dependent disease. Implant-based factors, such as Tf and implant macrodesign, could influence peri-implantitis occurrence, exacerbate clinical parameters, and promote progressive bone loss.
Clinical relevance
Peri-implantitis can be affected by implant macrodesign and Tf. The implant body shape, thread number, and design of the implant collar may be considered peri-implantitis-related risk indicators that should be taken into account in proper implant planning and therapy.