Similar Articles (20 results found)
1.
Neuroendocrine tumors (NET) are a rare and heterogeneous group of neoplasms with variable biological behavior. They frequently metastasize to the liver, requiring active, multimodality treatment. Surgical resection, possible in only a minority of cases, was until recently the only potentially curative option. For unresectable NET with liver metastases, liver transplantation (LT) emerged as a potentially curative treatment due to the relatively slow growth and indolent behavior of the metastases. In this case series with literature review, we retrospectively analyzed the characteristics of 12 highly selected patients treated in our center with metastatic NET as an indication for LT. We also summarized the proposed prognostic factors, and evaluated and compared the existing selection criteria. The main poor prognostic factors in our patients were high-grade NET and a primary tumor in the pancreas. Inconsistent liver transplantation outcome parameters make it difficult to standardize patient selection criteria. There is a need for further studies that would fully elucidate the curative potential of LT in patients diagnosed with NET.

Generally perceived as an indolent, non-aggressive disease, neuroendocrine tumors (NET) comprise a heterogeneous group with variable malignant potential (1-4). These tumors most often arise from the gastrointestinal (60.9%) and bronchopulmonary system (27.4%), and around 50% are functional, producing various hormone-mediated syndromes (1,5). At diagnosis, only 40% present as localized disease. Untreated patients with metastases, most often in the liver (40%-93%) and bone (12%-20%), have a five-year survival of 20%-40% (2,6-9), which calls for a more active therapeutic approach (2,10,11). The pattern of hepatic involvement is in most cases (approximately 80%) not amenable to curative liver resection, leaving room for various ablative and systemic therapies: hepatic artery embolization procedures, peptide receptor radiotherapy, somatostatin analogues, and chemotherapy and molecular-targeted protocols (1,12-15). The relatively slow growth of the metastases and their long confinement to the liver make transplantation a reasonable, potentially curative long-term option (3,12,14,16,17). Due to disease rarity, heterogeneous tumor parameters, and a lack of clear and prospectively validated patient selection criteria, transplantation results are highly variable and insufficient to support definitive recommendations (9,17-20).

In this case series with literature review, we retrospectively analyzed the characteristics of 12 highly selected patients treated in our center with metastatic NET as an indication for liver transplantation (LT). We critically reviewed pertinent prognostic factors and selection criteria for patients with NET undergoing LT, with the aim of further clarifying the true benefit of this complex and potentially curative procedure.

2.
Aim: To investigate the risk factors and the outcomes of extracorporeal membrane oxygenation (ECMO) in pediatric patients treated at the University Hospital Center Zagreb, the largest center in Croatia providing pediatric ECMO. Methods: This retrospective study enrolled all pediatric patients who required extracorporeal cardiopulmonary resuscitation (E-CPR) from 2011 to 2019. Demographic data, cardiac anatomy, ECMO indications, ECMO complications, and neurodevelopmental status at hospital discharge were analyzed. Results: In the investigated period, E-CPR was used in 16 children, and the overall survival rate was 37.5%. Six patients were in the neonatal age group, 5 in the infant group, and 5 in the “older” group. There was no significant difference between the sexes. Four patients had an out-of-hospital arrest and 12 had an in-hospital arrest. Twelve of the 16 patients experienced renal failure and needed hemodialysis: 4 of the 6 survivors and 8 of the 10 non-survivors. Survivors and non-survivors did not differ in E-CPR duration, lactate levels before ECMO, time to lactate normalization, or pH levels before and after the start of ECMO. Conclusion: The similarity of our results to those obtained by other studies indicates that the ECMO program in our hospital should be maintained and improved.

The use of extracorporeal cardiopulmonary resuscitation (E-CPR) is increasing (1). E-CPR is defined as the initiation of extracorporeal membrane oxygenation (ECMO) during active chest compressions. Its main goal is to provide immediate cardiovascular support to patients who do not respond to CPR (2) and to improve survival and neurological outcome (3). After more than 30 minutes of conventional CPR, survival ranges between 0% and 5% (4,5).

The most recent systematic review by the International Liaison Committee on Resuscitation from 2015 recommended that E-CPR be considered for children with underlying cardiac conditions who have an in-hospital cardiac arrest when appropriate protocols, expertise, and equipment are available (6). According to the Extracorporeal Life Support Organization (ELSO) registry from 2017 (7), more than 60 000 people received extracorporeal life support (ECLS) between 2009 and 2015, with an overall survival rate of 61% (7). Pediatric ECMO experience in Slovenia shows that ECMO programs may be incorporated in smaller hospitals in the region (8-10). The ELSO database includes data on all reported pediatric ECMO runs, including those conducted with E-CPR and those in patients after congenital heart surgery and in neonates with diaphragmatic hernia or meconium aspiration syndrome. During this 6-year period, 3005 E-CPR runs were reported, with an overall survival to hospital discharge of 43% (7). A survival rate of 31% was reported by Ergün et al (11) and in E-CPR patients with severe burn injury (12). The longer the CPR duration, the lower the survival-to-discharge rate: Matos et al reported an E-CPR survival-to-discharge rate of 33% after >35 min of chest compressions (13). Other studies reported that the overall survival rate of pediatric E-CPR cases was growing, with better neurological outcomes than among patients receiving CPR only (14). Pilar et al found that in 73 pediatric cardiac patients requiring cardiopulmonary resuscitation for >30 min, survival to hospital discharge was 43.8%, with three quarters of the patients having normal neurological function or mild neurological disability (15). Based on the ELSO registry, approximately 10% of all ECMO patients meet brain death criteria (7). One of the biggest single-center studies, involving 184 pediatric E-CPR patients (16), showed successful ECMO weaning in 63% of the patients and an overall survival rate to hospital discharge of 43%. In the same study, the risk factors linked to increased mortality were pre-support pH<7.1, mechanical complications, and neurological complications (16). E-CPR can involve many complications, not necessarily linked to factors preceding cardiac arrest, such as low cardiac output syndrome or irreversible respiratory failure (17). Furthermore, common complications of ECMO treatment are fluid overload and acute kidney injury (18). Many studies showed renal replacement therapy (RRT) to be negatively associated with survival (15,16,18,19).

This study assessed the risk factors and the outcomes of ECMO in the largest Croatian center providing pediatric E-CPR over a nine-year period and compared the survivor and the non-survivor groups.

3.
Aim: To evaluate liver stiffness (LS) by real-time two-dimensional shear wave elastography (RT-2D-SWE) and to assess its correlation with the mean arterial pressure (MAP) in patients on maintenance hemodialysis (MHD). The secondary aim was to identify biological and biochemical parameters associated with elevated LS. Methods: This study enrolled patients treated with MHD at the Split University Hospital from December 2017 through February 2018. LS was measured after a hemodialysis (HD) session using RT-2D-SWE. Mean arterial pressure was measured before RT-2D-SWE was performed. Results: The study enrolled 47 patients with a mean ± standard deviation age of 68.48 ± 14.33 years. Arterial hypertension was diagnosed in 70.2% of patients. Liver stiffness >7 kPa, suggesting clinically relevant fibrosis, was found in 59.5% of patients. Mean arterial pressure was significantly correlated with LS (ρ = 0.38, P = 0.008). C-reactive protein (ρ = 0.548, P = 0.023), parathyroid hormone (ρ = 0.507, P = 0.038), and total bilirubin (ρ = 0.423, P = 0.020) were correlated with elevated LS. Conclusion: Mean arterial pressure is correlated with increased LS in patients on MHD. Our results emphasize the importance of proper regulation of arterial blood pressure and indicate that LS should always be interpreted in combination with laboratory parameters. Further prospective studies with larger series are needed.

Chronic kidney disease (CKD) is an important public health problem. The number of CKD patients in the world has increased from approximately 10 000 in 1973 to 703 243 in 2015 (1,2). All CKD stages and end-stage renal disease (ESRD) are associated with high morbidity, increased health care utilization, and mortality (3).

Patients on long-term maintenance hemodialysis (MHD) have an increased risk for hepatitis B and C viral infection and chronic liver inflammation. Acute or chronic liver inflammation is frequently followed by the development of liver fibrosis (LF) (4-6). The clinical presentation of LF differs from that of cirrhosis. Mild LF, as well as early-stage liver cirrhosis, is asymptomatic in most patients, which underscores the importance of an early diagnosis. Appropriate staging of LF is important for prognosis and for therapeutic decision-making.

Percutaneous liver biopsy is still the gold standard for the evaluation and staging of LF (7). It is an invasive procedure with substantial limitations and serious potential side effects, primarily post-procedural bleeding. The risk is even higher in dialysis patients, who have a bleeding tendency due to platelet dysfunction caused by the uremic state. No less important, liver biopsy carries the risk of pneumothorax and hemothorax (8,9). Another limitation is the possibility of a significant sampling error, since the biopsy specimen represents only about 1/50 000 of the liver parenchyma (10,11). Because of these limitations, many non-invasive tools for the assessment of LF have recently been investigated, with promising results (11,12). The most frequently applied novel methods, which use ultrasound waves to measure tissue elasticity, are transient elastography and real-time two-dimensional shear wave elastography (RT-2D-SWE) (13,14). RT-2D-SWE is a fast, quantitative method for assessing LF by measuring liver stiffness (LS) in real time. Acoustic push pulses induce shear waves, and their speed is displayed as a color-coded image in the region of interest (ROI). Velocity (m/s) is determined by measuring the waves passing through the examined tissue, and these values are converted into tissue elasticity expressed in kilopascals (Young's modulus) (15). 2D-SWE has demonstrated high resolution and excellent reproducibility for assessing LF (16-19).

LS can be affected by different parameters. LS values are elevated in healthy people with BMI>30 kg/m2 and in patients with viral hepatitis, primary biliary cholangitis, alcoholic liver disease, and cholestatic liver diseases, independent of the degree of fibrosis (20-23). LS correlates significantly with portal pressure and arterial pressure (24-27) and is affected by volume overload, whether due to heart failure or liver failure with ascites (11,28). To avoid volume overload, LS measurement should be performed after a hemodialysis (HD) session (29-31). It remains unexplored to what extent blood pressure contributes to LS (25). Little is known about biological and biochemical factors associated with LS changes in ESRD patients on MHD. To the best of our knowledge, most such studies have been conducted on patients with underlying liver disease (32,33). Only a few have focused on factors that influence LS (measured by transient elastography) and fibrosis in the HD population (28). In dialysis patients, active chronic inflammation possibly affects the liver and changes LS (34). Another pathophysiological mechanism could be chronic hepatic congestion with subsequent idiopathic noncirrhotic portal hypertension (35).

This study aimed to investigate the correlation between the mean arterial pressure (MAP) and LS in patients on MHD. The secondary aim was to identify biological and biochemical parameters correlated with elevated LS. Finally, we hypothesized that MAP influences LS and that C-reactive protein (CRP), parathyroid hormone (PTH), and total bilirubin correlate with the LS value.
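The mean arterial pressure and the conversion of shear-wave speed into kilopascals mentioned above rest on simple relations that the excerpt does not spell out. As a minimal illustrative sketch (not the authors' code), the conventional formulas are MAP ≈ DBP + (SBP − DBP)/3 and, for shear wave elastography, Young's modulus E = 3ρc², assuming a tissue density of about 1000 kg/m³:

```python
# Illustrative sketch only: conventional formulas, not the study's own code.
# Assumes tissue density rho ~ 1000 kg/m^3, as is customary in SWE devices.

def mean_arterial_pressure(systolic_mmHg: float, diastolic_mmHg: float) -> float:
    """MAP approximated as DBP + (SBP - DBP) / 3."""
    return diastolic_mmHg + (systolic_mmHg - diastolic_mmHg) / 3.0

def shear_wave_velocity_to_kpa(velocity_m_s: float, rho_kg_m3: float = 1000.0) -> float:
    """Convert shear-wave speed (m/s) to Young's modulus E = 3 * rho * c^2, in kPa."""
    return 3.0 * rho_kg_m3 * velocity_m_s ** 2 / 1000.0  # Pa -> kPa

if __name__ == "__main__":
    print(round(mean_arterial_pressure(140, 80), 1))   # 100.0 mmHg
    print(round(shear_wave_velocity_to_kpa(1.6), 2))   # 7.68 kPa, above the 7 kPa cut-off
```

With these conventions, a shear-wave speed of roughly 1.53 m/s corresponds to the 7 kPa cut-off for clinically relevant fibrosis used above.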

4.
5.
Aim: To explore the effects of an anterior quadratus lumborum block (QLB) on opioid consumption, pain, and postoperative nausea and vomiting (PONV) after ambulatory laparoscopic cholecystectomy. Methods: This randomized controlled study recruited 70 patients scheduled for ambulatory laparoscopic cholecystectomy from January 2018 to March 2019. The participants were randomly allocated to one of the following groups: 1) anterior QLB (n = 25) with preoperative ropivacaine 3.75 mg/mL, 20 mL bilaterally; 2) placebo QLB (n = 22) with preoperative isotonic saline, 20 mL bilaterally; and 3) controls (n = 23) given only standard intravenous and oral analgesia. The primary endpoint was opioid analgesic consumption. The secondary endpoints were pain (numeric rating scale 0-10) and PONV (scale 0-3, where 0 = no PONV and 3 = severe PONV). Assessments were made up to 48 hours postoperatively. Results: The groups did not significantly differ in opioid consumption or reported pain at 1, 2, 24, and 48 hours postoperatively. PONV in the anterior QLB group was lower than in the placebo and control groups. Conclusion: Preoperative anterior QLB for laparoscopic cholecystectomy did not affect postoperative opioid requirements and pain. However, anterior QLB may decrease PONV. Clinical trial number: NCT03437187; January 22, 2018.

Inadequate postoperative pain control may lead to adverse outcomes, such as prolonged hospitalization, a higher incidence of re-operation, re-admission, and higher treatment costs (1-3). After laparoscopic cholecystectomy, 17%-41% of patients have been shown to suffer insufficient pain relief (4). However, this may be counteracted by an effective postoperative analgesic treatment.

The evolution of nerve stimulation techniques and ultrasound-guided regional anesthesia has greatly enhanced the success and quality of peripheral nerve blocks (5,6). In the area of thoraco-abdominal surgery, a variety of nerve blocks has been introduced. A detailed injection technique of the earliest variants of the quadratus lumborum block (QLB) was described in 2013 (7). All thoracolumbar blocks are injected close to the thoracolumbar fascia (TLF) (8,9), but the target area differs, as do the spread of local anesthetics and the clinical effects (10). The QLB is described in different anatomical locations: lateral, posterior, and anterior (11,12). In the lateral QLB, the anatomical target is any point lateral to the QL muscle, while in the posterior QLB, the target is a point between the muscle and the middle layer of the TLF. The anterior QLB anatomical target is between the QL and the psoas major muscle, in the anterior layer of the TLF (13), from where local anesthetics have been shown to spread into the thoracic paravertebral space (14).

QLB provides early and rapid pain relief and allows early ambulation in certain patient populations. Multiple case studies have also confirmed the QLB to be a useful rescue block after different surgical procedures (13,15,16). Complications associated with the performance of abdominal wall blocks are fortunately very rare (17). However, studies on the effect of the anterior QLB on postoperative opioid consumption are scarce.

The primary aim of this study was to evaluate the effect of the anterior QLB in ambulatory laparoscopic cholecystectomy, as measured by opioid consumption (primary outcome), experienced pain, and postoperative nausea and vomiting (PONV) (secondary outcomes). The anterior approach to the QLB was chosen because studies have shown that the spread of local anesthesia to the thoracic paravertebral space reduces visceral pain (13,18-22) and that the block has a longer duration than other QLB approaches (23).

6.
Aim: To analyze the difference in the salivary cortisol response to psychosocial stress between patients with a first episode of psychosis (FEP) and a control group. Methods: We performed a cross-sectional analysis of the baseline measurements of a prospective cohort study conducted from 2015 to 2018 at two Croatian psychiatric hospitals. The study consecutively enrolled 53 patients diagnosed with FEP and 63 healthy controls. The primary outcome was the difference in the change of salivary cortisol concentration during the stress test. The secondary outcome was the difference in the baseline levels of salivary cortisol between patients with FEP and controls. The tertiary outcomes were the correlations of salivary cortisol levels with the results of the Positive and Negative Syndrome Scale for Schizophrenia, the Rosenberg Self-Esteem Scale, and the International Personality Item Pool. Results: Patients with FEP had significantly higher baseline salivary cortisol than controls, but their salivary cortisol increased significantly less during the stress test. Conclusion: Patients with FEP respond differently to stressful stimuli than controls, as shown by the increased baseline salivary cortisol and blunted cortisol response, possibly indicating a greater vulnerability to psychosocial stress.

Schizophrenia is one of the most complex psychiatric disorders (1) and, in most cases, a long-term condition characterized by alternating periods of acute psychosis and remission. The first episode of psychosis (FEP) is usually preceded by a prodromal phase with non-specific symptoms, followed by psychotic symptoms pertaining to five dimensions: positive, negative, affective, cognitive, and psychomotor. While the pathogenesis is largely unknown, different streams of research consistently show that schizophrenia results from a complex gene-environment interaction (2), in line with the stress-diathesis model of schizophrenia etiology (3). According to this model, genetic and environmental factors lead to vulnerability to schizophrenia (2), and schizophrenia occurs when the vulnerable person experiences “enough” stress (4). For the majority of people, significant distress is associated with major stressors, eg, death in the family, serious illness, or separation (5). However, in persons prone to psychosis, significant distress is caused even by minor stressors, also known as daily hassles (6).

Response to stress involves the activation of the hypothalamic-pituitary-adrenal (HPA) axis: the hypothalamus secretes corticotropin-releasing factor (CRF), which stimulates the secretion of adrenocorticotropic hormone (ACTH) from the anterior pituitary gland. ACTH then stimulates the release of cortisol from the adrenal gland, triggering cascading effects across several bodily systems (immune, neuroendocrine, inflammatory, etc) (7). In the central nervous system, increased cortisol levels can sensitize the dopaminergic response to stress, which can result in excessive dopamine release and psychotic symptoms (6). Indeed, various indicators of an altered stress response have been confirmed in the population at risk for psychosis compared with the healthy population (8,9).

While it has been suggested that persons with schizophrenia show a blunted cortisol response following experimentally induced psychosocial stress (10,11), these findings should be further elucidated considering the heterogeneous research results. Some of these discrepancies may arise from the heterogeneity of the samples and different confounding factors, but also from different stress responses across illness types and stages (12).

Generally, persons with FEP show higher basal cortisol values than the healthy population (8,13,14), a finding that possibly indicates increased baseline HPA axis activity, which contributes to their higher vulnerability to stressors. HPA axis hyperactivity indicated by elevated basal cortisol levels in individuals with psychosis would be expected to also produce a higher acute elevation of cortisol concentrations in response to an acute stressful situation (11). However, studies on FEP are not uniform in their findings (5-7). A small study with eleven medication-naive patients found an attenuated cortisol response to psychosocial stress (15). Furthermore, Pruessner et al (16) reported significantly lower cortisol levels and systolic blood pressure during the psychosocial stress task in an ultra-high-risk group compared with controls; the lower cortisol levels were associated with higher self-rated stress in the past year. The authors suggested that in these patients the attenuated stress response reflects vulnerability to stressors. A recent study among patients with FEP obtained contrary results, ie, lower salivary cortisol levels at baseline but no difference in the cortisol response during a psychosocial stress challenge test compared with healthy controls (17). Another study found no differences in baseline salivary cortisol levels between patients with FEP and controls, but a blunted cortisol response to stressors in FEP, both on and off medication, compared with controls (18). Finally, individuals with schizophrenia have shown lower cortisol levels both in anticipation of and after exposure to social stress compared with controls, even though there were no differences in cortisol production rate (19).

Of note, people with posttraumatic stress disorder (20) and those with high trait anxiety (21) showed higher baseline salivary cortisol but a blunted response to psychosocial stress compared with healthy controls (22), which suggests a deficient acute neuroendocrine stress response.

Thus, the aim of our study was to assess the difference in the salivary cortisol response to psychosocial stress between patients with FEP and healthy controls. We enrolled only patients with FEP who were homogeneous in age, duration of illness, and phase of illness, and who had minimal exposure to medication, to limit some of the major confounding factors from other studies. We hypothesized that patients with FEP would show higher baseline salivary cortisol levels, indicating increased baseline HPA axis activity, and a lower cortisol increase in response to a psychosocial stressor compared with healthy controls, as an indicator of an aberrant response to psychosocial stress. Our exploratory aim was to analyze the correlation of other clinical factors, including psychopathology, personality traits, and stressful life events, with salivary cortisol levels.
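The primary outcome above is the change in salivary cortisol during the stress test, but the excerpt does not state how that change was quantified. A hedged sketch of two summaries commonly used in this literature, the baseline-to-peak increase and the area under the curve with respect to increase (AUCi, trapezoidal rule), is shown below; the sampling times and values are hypothetical.

```python
# Hedged sketch: generic cortisol-reactivity summaries, not the study's actual analysis.

def delta_cortisol(samples_nmol_l: list[float]) -> float:
    """Baseline-to-peak increase: maximum post-baseline value minus the baseline value."""
    baseline = samples_nmol_l[0]
    return max(samples_nmol_l[1:]) - baseline

def auc_increase(samples_nmol_l: list[float], times_min: list[float]) -> float:
    """Area under the curve with respect to increase: trapezoidal AUC minus the baseline area."""
    auc_ground = sum(
        (samples_nmol_l[i] + samples_nmol_l[i + 1]) / 2 * (times_min[i + 1] - times_min[i])
        for i in range(len(samples_nmol_l) - 1)
    )
    baseline_area = samples_nmol_l[0] * (times_min[-1] - times_min[0])
    return auc_ground - baseline_area

if __name__ == "__main__":
    cortisol = [8.0, 9.5, 12.0, 10.0]   # nmol/L at each sampling point (hypothetical values)
    times = [0.0, 15.0, 30.0, 45.0]     # minutes relative to stress-test onset
    print(delta_cortisol(cortisol))             # 4.0 nmol/L
    print(round(auc_increase(cortisol, times), 1))
```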

7.
Aim: To assess the use of personal protective equipment (PPE) and related knowledge and attitudes during the coronavirus disease 2019 epidemic in Croatia. Methods: The online survey, conducted on social media in May 2020, yielded 1393 responses across the country (66% from the Adriatic area). The questionnaire consisted of socio-demographic questions and questions on the knowledge, attitudes, and behaviors related to PPE use. The χ2 test, t test, and multivariate logistic regression were used in data analysis. Results: As many as 84.0% of participants reported compliance with social distancing measures, while 52.8% reported using PPE (mask and/or gloves) when shopping or visiting friends and family. Participants demonstrated good knowledge (a mean of 10.4 [95% CI 10.3-10.4] correct answers out of 13 questions) and neutral to moderately positive attitudes about PPE use (a mean of 36.6 [36.1-37.1] out of 50 points). Participants with higher education, women, and health care workers had a greater probability of having a high knowledge score. Women, older individuals, public transport users, people with a more positive attitude toward PPE use, and those who complied with social distancing had a higher probability of PPE use, while health care workers and highly educated participants had a lower probability of PPE use in public. Conclusions: Croatians had good knowledge and neutral to moderately positive attitudes about PPE use. Nevertheless, health authorities need to promote positive attitudes about PPE use in order to retain trust and compliance with epidemiological measures.

The coronavirus disease 2019 pandemic is one of the major health and economic crises in modern history (1). The novel coronavirus (SARS-CoV-2), first recognized in the Chinese city of Wuhan in December 2019, has rapidly spread worldwide, causing respiratory infections in humans (2). On January 30, 2020, the World Health Organization (WHO) declared an international public health emergency with the aim of establishing international collaboration and controlling the outbreak (3). Among the main WHO recommendations for preventing virus spread were early detection and isolation of sick persons, contact tracing, physical distancing measures, and the use of personal protective equipment (PPE) (4).

This pandemic has become the topic of intensive research, with an unprecedented number of articles published in a very short period (5). Many studies have investigated compliance with epidemiological measures and protective equipment use. Studies conducted worldwide among the general population and health care workers between March and July 2020 have shown high knowledge and positive attitudes toward preventive practices but also variable actual practices (6).

The first case of SARS-CoV-2 infection in Croatia was confirmed on February 25, 2020 (7). By the end of March, Croatia had introduced some of the strictest epidemiological measures in the world (8,9), such as banning public gatherings and sports events, closing restaurants and non-essential shops, and closing the borders (lockdown period). Educational institutions switched to online teaching on March 16 (10). Epidemiological measures lasted 30 days, after which the Civil Protection Directorate of the Ministry of the Interior of Croatia (CPD) began lifting restrictions in three phases (Supplementary Figure 1) (11,12).

From February 25 to May 25, the CPD held a daily press conference, at which it issued epidemic-related information, such as the number of new COVID-19 cases, and announced new restrictions.

Knowledge and attitudes are important determinants of behavior (13-15), which is of special importance in pandemic outbreaks. The aim of our study was to investigate the level of knowledge and attitudes about PPE use in Croatia during a spring period characterized by a declining epidemic curve. Additionally, we aimed to assess the characteristics associated with adherence to epidemiological measures in the general population, and to compare knowledge, attitudes, and behaviors between health care and non-health care workers.

8.
Aim: To compare the efficacy of different components of online and contact anatomy classes as perceived by medical students. Methods: An anonymous course evaluation survey was conducted at the end of the academic year 2019/2020. The organization of classes due to the SARS-CoV-2 pandemic provided our students with a unique opportunity to compare online and contact classes. Students’ responses were analyzed according to the type of data obtained (ratio, ordinal, and categorical). Results: The response rate was 95.58%. Approximately 90% of students found anatomical dissection and practical work in general to be the most important aspect of teaching, which could not be replaced by online learning. During online classes, students missed the interaction with other students the most, followed by the interaction with student teaching assistants and teaching staff. Very few students found contact lectures useful, with most students reporting that they could be replaced with recorded video lectures. In contrast, recorded video lectures were perceived as extremely helpful for studying. Regular weekly quizzes were essential during online classes, as they gave students adequate feedback and guided their learning process. Students greatly benefited from additional course materials and interactive lessons, which were made easily available via the e-learning platform. Conclusions: Anatomical dissection and interaction during contact classes remain the most important aspects of teaching anatomy. However, online teaching increases learning efficiency by allowing alternative learning strategies and by substituting for certain components of contact classes, thus freeing up more time for practical work.

From the middle of the last century, lecturers in anatomy courses for medical students have faced two major challenges. The first has been how to incorporate the rapidly expanding new medical knowledge into the curricula. This has required a reorganization of the existing curricula, and anatomy in particular has been under pressure to reduce teaching hours and the student load (1-3). The second challenge has been how to modernize the teaching approach and didactically redesign the anatomy course. There has been pressure to replace cadaver work because of its high costs and high organizational demands. In many medical schools, authorities have advocated the idea that cadaver work can be replaced by other learning approaches with identical final outcomes (4). This pressure has become particularly notable in recent years, supported by advancements in new digital technologies such as augmented and virtual reality (5).

Anatomy is one of the fundamental and most demanding courses in any medical school curriculum. A frequent point of discussion is how to approach teaching anatomy and facilitate students’ comprehension of difficult concepts and memorization of vast amounts of new information. Universities worldwide adopt different teaching approaches. Modern teaching usually combines several teaching methods within integrated and multimodal approaches to anatomy teaching (6,7). Six techniques for anatomy education have been proposed: in-person lectures, cadaveric dissection, inspection of prosected specimens, models, radiological and living anatomy teaching, and computer-assisted learning (8). Some universities have implemented curricular changes, especially since the time allotted to anatomy education in Europe, the United States of America, and Australia has considerably declined (9). The majority of schools have switched from a completely traditional cadaver-based curriculum toward more interactive, custom-made approaches that better fit the learning strategies of new generations and that make use of technologies such as augmented and virtual reality, social networks, and imaging for better understanding (7,10,11). Cadaver dissection, considered the gold standard for teaching anatomy (12), still remains widely used. While occasionally contested, its importance in different aspects of anatomy education has been proven by schools that returned to cadaver dissection after having temporarily abandoned it (3,13). However, meta-analyses suggest that educators should appreciate and reevaluate each instructional method in order to meet all the students’ needs, since none has so far been proven superior to any other (14).

At the University of Zagreb School of Medicine (UZSM), we deliver a cadaver dissection-oriented curriculum with additional teaching methods and tools, such as prosection and instructions/demonstrations on cadavers and artificial anatomical models. In recent years, we have enhanced the e-learning we provide by vastly expanding the materials and activities available on our online platform for communication and teaching. We have also implemented a new, functionally oriented textbook (15,16). These changes aimed to enhance awareness of the subject's clinical relevance and to increase the students’ active involvement in the course. Our Department has been systematically assessing students' satisfaction with the Anatomy course through anonymous surveys (student evaluation of teaching) after course completion.

The Anatomy course is taught during two semesters in the first year of medical school. In the first semester of the academic year 2019/2020, we finished the planned curricular activities as scheduled using our usual multimethod approach. In the second semester, the SARS-CoV-2 (COVID-19) pandemic forced us to switch to exclusively online teaching for an extended period of time (17,18). Online teaching was prolonged because of the heavy damage sustained by the UZSM buildings in the earthquake that hit Zagreb on March 22, 2020 (19), immediately after the introduction of the first lockdown. We organized only a very short practical revision on cadavers and models in June, at the end of the academic year.

Such an organization of classes in the academic year 2019/2020 allowed our students to provide unique feedback about the perceived advantages and disadvantages of different components of contact and online classes. It also allowed them to evaluate the significance of these classes for meeting the anatomy course’s aims and to give feedback on the overall teaching approach of the faculty. Thus, we conducted a survey with the aim of analyzing the efficacy of contact and online classes in covering the anatomy course material. We also analyzed how students’ success on continuous assessment during the academic year related to the way they responded to different survey questions and whether there were significant differences in those responses.

9.
Aim: To validate the Croatian version of the Zarit Burden Interview (ZBI) and to investigate the predictors of perceived burden. Methods: This cross-sectional study involved 131 dyads of one informal caregiver family member and one patient with dementia visiting primary care practices (Health Care Center Zagreb-West; 10/2017-9/2018). Patient-related data were collected with the Mini-Mental State Examination, Barthel index, and Neuropsychiatric Inventory Questionnaire (NPI-Q); caregiver-related data with the ZBI; and general information on caregivers and patients with a structured questionnaire. Principal axis factoring with varimax rotation was used for factor analysis. Results: The caregivers' mean age was 62.1 ± 13 years. They were mostly women (67.9%) and patients' children (51.1%). Four dimensions of the ZBI, corresponding to personal strain, frustration, embarrassment, and guilt, were identified and explained 56% of the variance in burden. Internal consistency of the ZBI (α = 0.87) and its dimensions (α1 = 0.88, α2 = 0.83, α3 = 0.72, α4 = 0.75) was good. Stronger cognitive and functional impairment of patients was associated only with personal strain, whereas more pronounced neuropsychiatric symptoms and the need for daily care were associated with more dimensions. Longer caregiver education suppressed embarrassment and promoted guilt. Guilt was higher in younger caregivers, caregivers of female patients, patients' children, and non-retired caregivers. In multivariate analysis, significant predictors of a higher overall burden were male sex of the patient, higher NPI-Q score, the need for daily-care services, shorter duration of caregiving, non-spouse relationship, a higher number of caregiving hours per week, and anxious-depressive symptoms in the caregiver. Conclusion: The Croatian version of the ZBI is reliable and valid. Our data confirm that the ZBI is a multidimensional construct. Caregivers may benefit from individually tailored interventions.

Dementia is an increasing health care problem associated with population aging (1-3). Worldwide, 8.9 million persons care for patients with dementia older than 50 years (4). Caregiving for a family member with dementia substantially affects all aspects of informal caregivers' lives and demands lifestyle reorganization and adaptation. Usually, one family member becomes the dominant caregiver, devoting three quarters of a day to caregiving tasks, an amount of time that increases with disease progression (5). Caregivers often neglect their own needs and health problems and become increasingly exposed to physical, emotional, financial, and other loads, all of which can be subsumed under the term caregiver burden (6).

Patients with dementia frequently experience neuropsychiatric symptoms, which become an increasingly difficult problem, often worse than the cognitive deterioration itself (7-9). These symptoms can render informal caregivers unable to care for the patient within their own family and can increase the perceived caregiving burden.

The most commonly used tool for the assessment of caregiver burden is the Zarit Burden Interview (ZBI) (10). The original 29-item version was shortened to a 22-item version, which is currently the most widely used form. Several author groups showed that the ZBI is a multidimensional construct and that caregivers with the same total score might be differently affected by different aspects of burden (11-13). In addition, different burden dimensions might be differently affected by caregiver-related factors such as age, socio-economic factors, family relationship, availability of social support, etc (14-16). The most notable patient-related factors that affect caregiver burden are the presence of neuropsychiatric symptoms (especially irritability, agitation, sleep disorders, anxiousness, apathy, and delusions) and loss of cognitive function (17-20). These considerations have important implications for the planning of appropriate caregiver-oriented interventions.

The number of patients with dementia in Croatia ranges from 67 000 (21) to 85 000 (22) (estimates from 2013 and 2010, respectively), approximately 15 000 of whom reside in the wider Zagreb area. Due to population aging and migrations, these numbers are probably increasing. However, there is currently no official registry of patients with dementia or informal caregivers in Croatia that would provide a direct insight into the real magnitude of the problem. The population of informal caregivers of patients with dementia in Croatia has not been extensively studied so far. A high proportion of caregivers were shown to suffer from anxious and depressive symptoms (23). In addition, in comparison with professionals, informal caregivers were more anxious and depressed, especially if they were older and lived in the same household with the patient (24). Caregiving burden was identified as a contributor to satisfaction with social support (25).

There is currently no version of the ZBI validated for the Croatian population. Thus, the aims of our study were to validate the Croatian version of the ZBI, to evaluate the validity and internal consistency of the questionnaire, and to assess the relationship of caregivers' and patients' characteristics with the total and different aspects of caregiver burden.
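The abstract above reports the internal consistency of the ZBI and its four dimensions as Cronbach's α; the authors' software pipeline is not described in this excerpt. As a minimal sketch under that caveat, α can be computed directly from the item-score matrix as α = k/(k−1) × (1 − Σ item variances / variance of the total score):

```python
# Minimal sketch of Cronbach's alpha from an items-by-respondents matrix.
# Hypothetical random data; not the study's dataset or code.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: 2-D array, rows = respondents, columns = questionnaire items (e.g. ZBI 0-4 Likert)."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_zbi = rng.integers(0, 5, size=(131, 22)).astype(float)  # 131 caregivers x 22 items
    print(round(cronbach_alpha(fake_zbi), 2))
```

Random demo data like this will give an α near zero; the actual 22-item ZBI data in the study yielded α = 0.87.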

10.
11.
Aim: To assess the correlations of regulatory B cells (Bregs) and monocyte subsets in peripheral blood with the National Institutes of Health (NIH)-consensus-defined clinical manifestations of chronic graft-vs-host disease (cGVHD), in an attempt to establish their role as cellular biomarkers. Methods: This multidisciplinary prospective study enrolled adult cGVHD patients treated at the University Hospital Center Zagreb and the University of Zagreb School of Medicine. Immunophenotypic subpopulations of CD24highCD38high Bregs (CD27-, CD27+, and total) and monocyte (classical, intermediate, and non-classical) counts were correlated with demographic, transplant, and cGVHD-related data. Bivariate correlation analysis was performed to evaluate the correlations of Breg and monocyte subsets with cGVHD organ involvement, as well as with cGVHD severity and immunosuppression intensity. Results: Twenty-two adult patients (54.5% female) with cGVHD were enrolled. The median (range) age was 44.5 years (24-65). All patients had been transplanted for hematologic malignancies, and 40.9% had a severe NIH cGVHD global score. The median time from cGVHD diagnosis to the analysis was 16.6 months (0-176). The organs most frequently affected by cGVHD were the eyes (68.2%), skin (45.5%), lungs (45.5%), and liver (40.9%). Lower total and CD27- Breg counts were correlated with worse cGVHD severity, higher immunosuppression intensity, and lung cGVHD in terms of cell counts, and with skin cGVHD in terms of percentages. Patients with liver and joint/fascia cGVHD had a lower percentage of non-classical monocytes, and patients with a more severe global NIH score had a higher classical monocyte count. Conclusion: Different organs affected by cGVHD are differently associated with different subpopulations of Bregs and monocytes.

Allogeneic hematopoietic cell transplantation (alloHCT) is a curative treatment option for an array of severe malignant diseases and non-malignant conditions. Chronic graft-vs-host disease (cGVHD) is a major late complication of alloHCT, influencing immune reconstitution and affecting different organs and tissues (1). cGVHD poses a significant morbidity and mortality burden on long-term survivors, a portion of whom require prolonged systemic immunosuppression (2).

Since cGVHD is a complex multisystemic allo- and autoimmune disease, different clinical cGVHD presentations might be underpinned by distinct pathophysiological processes. Recently, major insights have been gained into the pathophysiology of cGVHD and its clinical manifestations (3,4). However, research on specific immunological events and their association with specific phenotypic manifestations is still lacking. The National Institutes of Health (NIH) cGVHD Diagnosis and Staging criteria provided much-needed guidelines for clinicians and researchers, facilitating cGVHD drug approvals and novel therapeutic approaches (5). The proposed three-phase pathophysiological model consists of an initial inflammatory phase, a dysregulated immunity phase, and a tissue fibrosis phase (4). The initial tissue injury of the hematopoietic stem-cell recipient, caused by the chemotherapeutic regimen, acute GVHD, or infection, produces a number of antigens, which are processed by components of the innate and adaptive immune system. The cascade of immunologic events induces the differentiation of pathogenic Th17 cells. The chronic inflammation occurring in cGVHD results from the inability of regulatory mechanisms to control donor-derived effector immune mechanisms. Pathological fibrosis (excessive accumulation of extracellular matrix) is the pathophysiological mechanism underlying some of the characteristic clinical features of cGVHD, such as superficial and deep skin sclerosis, but other organs can also be involved. Many attempts have been made to find cGVHD biomarkers, whether molecular or cellular, in peripheral blood to improve early diagnosis, prognostication, or therapy monitoring (6,7). However, due to insufficient understanding of the disease pathophysiology and the complexity of its clinical manifestations, developing reliable biomarkers for clinical use in cGVHD remains an active research aim.

B cells play an important role in cGVHD pathogenesis (8), and different B cell subsets have already been associated with clinical cGVHD manifestations (9,10). Regulatory B cells (Bregs) secrete IL-10 and have immunosuppressive activity, but they do not share a specific immunophenotype, as different subsets of B lymphocytes are capable of differentiating into Bregs (11). The CD24highCD38high B cell subset has been associated with immunosuppressive capacity and regulatory functions in autoimmune diseases (12,13).

Monocytes are mononuclear phagocytes that originate in the bone marrow and have a short life span in the circulation before they migrate into tissue and differentiate into macrophages and dendritic cells (14). Three different subtypes of monocytes have been recognized according to the expression of the surface markers CD14 (co-receptor for lipopolysaccharide) and CD16 (FcγRIII receptor) (15): non-classical, intermediate, and classical monocytes. The relative percentages of these three subtypes have been correlated with the activity of various cardiovascular, infectious, and autoimmune diseases (16), but rarely in cGVHD.

This pilot study assessed the subpopulations of peripheral blood regulatory CD24highCD38high B lymphocytes (Bregs: CD27-, CD27+, and total Bregs) and monocytes (classical: CD14++CD16-, intermediate: CD14++CD16+, and non-classical: CD14+CD16++) in patients with different clinical manifestations and severity of cGVHD.

12.
Aim: To explore the prognostic value of the modified Discriminant Function (mDF), Glasgow Alcoholic Hepatitis Score (GAHS), Model for End-Stage Liver Disease (MELD), Age-Bilirubin-International Normalized Ratio-Creatinine (ABIC) score, and the Lille Model for the 28- and 90-day mortality in patients with alcoholic hepatitis (AH). Methods: This retrospective study enrolled patients treated for AH in Dubrava University Hospital between January 2014 and May 2018. The diagnosis was established based on histology findings or the combination of a patient's history of ongoing alcohol consumption before hospitalization, serum bilirubin above 50 μmol/L, and an aspartate transaminase to alanine transaminase ratio greater than 1.5. We calculated mDF, MELD, GAHS, and ABIC on the first and seventh day of hospitalization (including the Lille Model). Results: In total, 70 patients were enrolled. ABIC at admission most accurately predicted the 28-day mortality, with a cut-off of 9.92 (AUC 0.727; 95% CI 0.608-0.827, P = 0.0119), while GAHS most accurately predicted the 90-day mortality, calculated both at admission (cut-off >7, AUC 0.765, 95% CI 0.639-0.864, P < 0.0001) and after seven days of hospitalization (cut-off >8, AUC 0.835, 95% CI 0.716-0.918, P < 0.0001). Modified DF predicted the 28- and 90-day mortality only when calculated after seven days of hospitalization. Conclusion: There is a need for better prognostic indicators for patients with AH.

Alcoholic hepatitis (AH) is a clinical entity characterized by a sudden onset of jaundice and coagulopathy, often accompanied by elements of the systemic inflammatory response syndrome, such as pyrexia and leukocytosis. About 35% of patients with alcohol-related liver disease (ALD) develop AH, with steatohepatitis as the main histologic feature, whereas most patients who present with a severe form of AH (SAH) have already developed cirrhosis (1-3). High mortality rates of 16% and 30% at one and three months, respectively, with an overall five-year survival of 56%, indicate the importance of early recognition and adequate management of patients with SAH (4,5).

Historically, SAH was defined as a modified Discriminant Function (mDF) ≥32, a cut-off above which patients had significantly higher mortality rates and benefited from methylprednisolone therapy (6). Indeed, a recent randomized controlled trial of more than 1100 patients confirmed that steroid therapy decreased the 28-day mortality in these patients. This benefit, however, was observed only in a subgroup of patients with SAH without overt sepsis or gastrointestinal hemorrhage at presentation. Furthermore, corticosteroid therapy beyond 28 days yielded no survival benefit (4). Consequently, some authors emphasized that mDF suffers from a major limitation: it is highly sensitive but not as specific. This results in the overexposure of some patients to steroid therapy and subsequently higher infection rates, without a clear therapeutic benefit (7,8).

Alternatives to mDF have shown better prognostic value (7). The Model for End-Stage Liver Disease (MELD), originally developed (published in 2000) to predict the outcome of cirrhotic patients undergoing elective transjugular intrahepatic portosystemic shunt placement, was subsequently shown to be an independent survival predictor in various cohorts of cirrhotic patients (9,10). It is comparable to mDF in predicting the outcome in patients with AH. Furthermore, MELD is easier to apply than mDF since it uses the international normalized ratio (INR) instead of prothrombin time (PT) in seconds (11). Still, MELD has the drawback of using creatinine, which needs to be adjusted in the context of severe hyperbilirubinemia. In 2005, Forrest et al proposed the Glasgow Alcoholic Hepatitis Score (GAHS), an even simpler tool calculated from independent factors associated with increased mortality (age, white cell count, urea, INR, and bilirubin), which showed better results than mDF (12). GAHS has been advocated as an alternative to mDF by the updated guidelines of the European Association for the Study of the Liver, whereas the American College of Gastroenterology has advocated the use of MELD (13,14). The Lille Model (which uses age, albumin, bilirubin at baseline and after 7 days, creatinine, and PT) is another prognostic tool, introduced in 2007, used to assess the efficacy of corticosteroid therapy in patients with SAH, ie, to predict poor survival in corticosteroid-treated patients (15). It was shown to outperform mDF, GAHS, and MELD in accuracy. Lastly, the Age-Bilirubin-International Normalized Ratio-Creatinine (ABIC) score, validated in 2008, can further stratify patients into three risk categories (low, intermediate, and high risk of death) using cut-off values of 6.71 and 9. Dominguez et al proposed that patients with an intermediate and high risk of death receive corticosteroid therapy (16). In these patients, ABIC was shown to better predict longer-term (90-day) mortality than mDF (16).

The aim of our study was to evaluate the prognostic value of these scoring systems in a group of AH patients from a single tertiary center.
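The scores compared in this study are computed from routine laboratory values, but the formulas are not restated in the excerpt. As a hedged illustration using the commonly cited published forms (which may differ in detail from the authors' implementation), the Maddrey modified Discriminant Function and the classical MELD score can be computed as follows:

```python
# Hedged sketch of two of the scores discussed: commonly cited published formulas,
# not necessarily the exact implementation used in the study.
import math

def maddrey_mdf(pt_patient_s: float, pt_control_s: float, bilirubin_mg_dl: float) -> float:
    """Modified Discriminant Function: 4.6 * (patient PT - control PT, in seconds) + bilirubin (mg/dL).
    A value >= 32 conventionally defines severe alcoholic hepatitis."""
    return 4.6 * (pt_patient_s - pt_control_s) + bilirubin_mg_dl

def meld(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float) -> float:
    """Classical MELD: 3.78*ln(bilirubin) + 11.2*ln(INR) + 9.57*ln(creatinine) + 6.43.
    Inputs below 1.0 are usually floored at 1.0 before taking the logarithm."""
    b = max(bilirubin_mg_dl, 1.0)
    i = max(inr, 1.0)
    c = max(creatinine_mg_dl, 1.0)
    return 3.78 * math.log(b) + 11.2 * math.log(i) + 9.57 * math.log(c) + 6.43

if __name__ == "__main__":
    # Hypothetical patient values, for illustration only.
    print(round(maddrey_mdf(pt_patient_s=25.0, pt_control_s=12.0, bilirubin_mg_dl=10.0), 1))  # 69.8
    print(round(meld(bilirubin_mg_dl=10.0, inr=2.0, creatinine_mg_dl=1.2), 1))
```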

13.
Aim: To describe the SARS-CoV-2 epidemic pattern in Croatia during February-September 2020 and to compare the case fatality ratio (CFR) between spring and summer. Methods: National data were used to calculate the weekly and monthly CFRs, stratified by three age groups: 0-64, 65-79, and 80+ years. We also calculated standardized mortality ratios (SMR) to adjust for differences in age composition. Results: The epidemic consisted of the initial wave, a trough in June, and two conjoined summer waves, yielding 17 206 coronavirus disease 2019 cases and 290 deaths. While the number of confirmed cases nearly quadrupled during summer, case fatality estimates decreased; the CFR in spring was 4.81 (95% confidence interval 3.91-5.71), compared with 1.24 (1.06-1.42) in summer. The SMR for summer was 0.45 (0.37-0.55), suggesting that the case fatality risk more than halved compared with spring. Cardiovascular comorbidity was an important risk factor for case fatality (SMR 2.63 [2.20-3.13] during spring and 1.28 [1.02-1.59] during summer). The risk of death in ventilated patients remained unchanged (SMR 0.98 [0.77-1.24]). Conclusions: The epidemic dynamics suggest a summer decline in case fatality, except in ventilated patients. While the effect of comorbidity also decreased, cardiovascular comorbidity remained an important risk factor for death even during summer. A plethora of possible confounders and an ever-changing landscape of the SARS-CoV-2 epidemic in Croatia require constant monitoring and evaluation, with the aim of preventing the uncontrolled spread of the virus and a disruption of health care functioning.

Understanding the seasonal pattern of SARS-CoV-2 spread and of coronavirus disease 2019 (COVID-19) symptom severity is not only of academic significance but plays a central role in epidemic preparedness (1-4). For the Northern hemisphere, the issue currently of the greatest importance is the prediction of autumn and winter disease spread and mortality (5). Previous articles have most commonly confirmed the expected role of seasonality (6), often with the explanation that higher temperatures reduce virus spread during summer (7-9). On the other hand, some studies have shown that we may not expect a substantial seasonality pattern, suggesting that SARS-CoV-2 may not behave like a common seasonal respiratory pathogen (10-12). A recent systematic review confirmed the existence of some degree of seasonality, but also stated that it may not be the only or the critical variable defining the pandemic patterns (13).

There are various possible causes underlying the seasonal pattern of the pandemic, including changes in temperature and humidity (14-17), or wind speed and air pollution (18). Additionally, unfavorable conditions were reported to impair the ability of the respiratory mucosa to fend off pathogens (19), which suggests a possible synergistic effect among numerous factors. An important distinction needs to be made by separately addressing viral spread and case fatality, since the two might not exhibit similar patterns. The recognition of such differences may shed more light on the disease pathogenesis and possibly even point toward mechanisms of epidemic control. However, studies that made this distinction often report the expectation that, for Europe, an increase in humidity will lower the number of new cases and deaths, while a rise in temperature will increase these parameters (20). Although such predictions are very sensitive to numerous modeling input parameters and assumptions, none can be superior to an epidemiological ex post facto analysis. Several such articles suggested a lowering of summer case fatality (9,21), but this finding was not replicated systematically across all studies (22). Therefore, the aim of this study was to explore the epidemic pattern in Croatia, with a special focus on understanding the difference in disease spread and case fatality during the spring and summer of 2020.
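The case fatality ratios with confidence intervals and the age-standardized mortality ratios described above follow generic epidemiological calculations whose details are not given in the excerpt. A minimal, hedged sketch (normal-approximation CI for a proportion and an indirectly standardized SMR as observed/expected deaths, with hypothetical age strata) might look like this:

```python
# Hedged sketch: generic CFR and SMR calculations, not the authors' exact code.
import math

def cfr_with_ci(deaths: int, cases: int, z: float = 1.96) -> tuple[float, float, float]:
    """Case fatality ratio (%) with a normal-approximation 95% confidence interval."""
    p = deaths / cases
    se = math.sqrt(p * (1 - p) / cases)
    return 100 * p, 100 * (p - z * se), 100 * (p + z * se)

def smr(observed_deaths: int, cases_by_age: dict[str, int], reference_cfr_by_age: dict[str, float]) -> float:
    """Indirectly standardized mortality ratio: observed / expected deaths, where expected deaths
    apply reference (e.g. spring) age-specific CFRs to the comparison period's case mix."""
    expected = sum(cases_by_age[age] * reference_cfr_by_age[age] for age in cases_by_age)
    return observed_deaths / expected

if __name__ == "__main__":
    print(cfr_with_ci(deaths=290, cases=17206))                  # overall CFR for the whole period
    summer_cases = {"0-64": 9000, "65-79": 1500, "80+": 500}     # hypothetical age breakdown
    spring_cfr = {"0-64": 0.005, "65-79": 0.05, "80+": 0.15}     # hypothetical reference CFRs
    print(round(smr(observed_deaths=120, cases_by_age=summer_cases, reference_cfr_by_age=spring_cfr), 2))
```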

14.
Aim: To assess the differences in the way Slovenian and Croatian health care professionals (HCPs) confront ethical dilemmas and perceive the role of hospital ethics committees (HECs). Methods: This cross-sectional, survey-based study involved HCPs from three Slovenian and five Croatian university medical centers (UMC). The final sample sizes were 308 (244 or 79.2% women) for Slovenia and 485 (398 or 82.1% women) for Croatia. Results: Compared with Croatian physicians, Slovenian physicians reported a higher share of ethical dilemmas regarding waiting periods for diagnostics or treatment, suboptimal working conditions due to interpersonal relationships in the ward, and end-of-life treatment withdrawal, and a lower share regarding access to palliative care and patient information protection. Compared with Croatian nurses, Slovenian nurses reported a lower share of ethical dilemmas regarding the distribution of limited resources, recognizing the patient’s best interests, and access to palliative care. Compared with Croatian other HCPs, Slovenian other HCPs reported a lower burden of ethical dilemmas regarding waiting periods for diagnostics or treatment, distribution of limited resources, and access to palliative care. When encountering an ethical dilemma, all HCPs in both countries would first consult their colleagues. Slovenian and Croatian HCPs recognized the importance of HECs to a similar extent but viewed their role differently. Conclusion: Croatian and Slovenian HCPs are confronted with different ethical dilemmas and perceive the role of HECs differently.

An ethical dilemma arises when we are confronted with a situation offering two morally justifiable solutions, neither of which is entirely satisfactory (1). In the course of their daily work, health care professionals (HCPs) encounter a broad range of ethical dilemmas (2-4), which often result in moral distress for HCPs (5,6). A critical requirement for a successful response to an ethical dilemma is a strong foundation in medical professionalism, cultivated during medical training and consolidated during professional work experience and career development (7-9).

Slovenia and Croatia, previously the westernmost republics of the former Yugoslavia and now European Union members, share the same historical, geopolitical, economic, and religious background. A recent survey in the largest Slovenian tertiary hospital, the University Medical Center Ljubljana, found that the most important contexts giving rise to ethical dilemmas among HCPs were waiting periods for diagnostics and treatment, suboptimal working conditions due to poor interpersonal relationships, and preserving patients' dignity, while the least important contexts were biomedical research, organ transplantation, and vaccine hesitancy (10). A study at the University Medical Center Rijeka found similar main ethical dilemmas in Croatian nurses and physicians, which included limiting life-sustaining therapy, euthanasia, and physician-assisted suicide (11).

Apart from these two studies, little to nothing is known about the ethical dilemmas of HCPs in Slovenia and Croatia. In response to this limited evidence, we conducted a survey with the primary objective of assessing the differences in the share of ethical dilemmas among different categories of HCPs (physicians, nurses, and other HCPs) in Slovenian and Croatian tertiary hospitals (university medical centers, UMCs). The UMCs were purposively selected because these hospitals manage complicated cases, usually referred from other health care institutions for complex diagnostic and therapeutic procedures, which can often raise ethical issues. The secondary objectives of our survey were to study differences in opinion on the existence of standard procedures for HCPs facing an ethical dilemma; to determine whom HCPs consult when facing an ethical dilemma; and to identify opinions on the importance of hospital ethics committees (HECs) and their role in Slovenia and Croatia.

15.
16.
Aim: To evaluate the relationship between the neurological outcome, neonatal epileptic seizures, and the signal-intensity visibility of the frontal and parietal periventricular crossroads of pathways on brain magnetic resonance imaging (MRI) in preterm infants at term-equivalent age. Methods: The study enrolled 48 preterm infants born between 2012 and 2016. The signal-intensity characteristics of the frontal and parietal periventricular crossroads were evaluated and classified into four grades. A non-favorable outcome was defined as a motor and functional disorder with developmental delay and/or cerebral palsy. Results: Neonatal seizures, epilepsy, pathological EEG and brain ultrasound findings, and brain MRI abnormalities were found mostly in neonates with non-favorable outcomes. Visible frontal and parietal periventricular crossroads were associated with a normal neurological outcome (P = 0.0004 and P = 0.0009, respectively). Not-visible or slightly visible frontal crossroads were associated with non-favorable outcomes (P = 0.036), as were not-visible frontal and parietal crossroads (P = 0.001 and P = 0.015, respectively). The visibility of the frontal and parietal periventricular crossroads was associated with the absence of neonatal epileptic seizures (P = 0.03 and P = 0.02, respectively). The frontal crossroads were more frequently slightly visible, whereas the parietal periventricular crossroads were more frequently visible. Conclusion: Poor visibility of the frontal and parietal crossroads of pathways on MRI is associated with neonatal epileptic seizures and poor neurological outcomes in preterm infants at term-equivalent age.

The advancement of neonatal medicine has increased the survival of preterm children with neonatal white matter brain injury, who carry a higher risk of poor developmental outcomes, including motor and cognitive deficits with impaired social development. Approximately 10% of preterm neonates (PN) develop cerebral palsy, while 40% experience cognitive impairment, language development problems, and consequently lower educational attainment (1-3). The prevalence of neurodevelopmental difficulties in PNs is higher than in full-term infants and increases with lower gestational age (4,5). Neonatal brain MRI is an essential clinical tool for predicting the long-term outcome of brain injury in PNs (6,7). However, a significant number of PNs have neurodevelopmental difficulties without brain abnormalities structurally visible on MRI. A biological substrate of these disabilities could be abnormal brain maturation. The level of cerebral maturity in PNs is estimated by assessing the MRI characteristics of transient fetal compartments (periventricular crossroads areas, subplate, and von Monakow segments) that persist at term-equivalent age (8). According to von Monakow's division of the cerebral white matter into five compartments, the periventricular crossroads of pathways belong to segment II (9). These transient fetal compartments are sites of intense developmental processes during the second half of gestation, and they are possibly unrecognized cellular and topographic areas affected by perinatal brain lesions. The periventricular crossroads are located in the fetal white matter along the lateral ventricles and correspond to the intersections of the main cortical projection, associative, and commissural fibers. These areas, characterized by a hydrophilic extracellular matrix, can be well recognized on T2w MRI (10). We hypothesized that the imaging characteristics of these intersections on brain MRI are biomarkers of neurodevelopmental outcomes. The original aim of the study was to assess the relationship between the neurological outcome and neuroimaging characteristics. As we obtained encouraging results regarding the brain crossroads, we decided to retrospectively review the standard T2w brain MRI scans to specifically assess these structures. Accordingly, we analyzed the relationship between the visibility of the frontal and parietal crossroads and the occurrence of neonatal epileptic seizures and neurological outcomes.
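The abstract above does not state which statistical test produced the reported P values. Purely as an illustrative sketch of how an association between crossroad visibility and neurological outcome could be tested, the Python snippet below computes a two-sided Fisher exact P value for a 2 × 2 table; both the choice of test and the counts used are assumptions for illustration, not data or methods from the study.

    from math import comb

    def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
        """Two-sided Fisher exact P value for the 2 x 2 table [[a, b], [c, d]],
        summing the probabilities of all tables with the same margins that are
        no more likely than the observed one."""
        row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

        def prob(x: int) -> float:
            # Hypergeometric probability of x counts in the top-left cell
            return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

        p_obs = prob(a)
        lo, hi = max(0, col1 - row2), min(col1, row1)
        return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-9))

    # Hypothetical counts: rows = frontal crossroads visible / poorly visible,
    # columns = normal / non-favorable neurological outcome.
    print(f"P = {fisher_exact_two_sided(20, 4, 6, 18):.4f}")

The helper is self-contained (standard library only) so the same calculation can be checked against any statistical package.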

17.
Aim: To examine the characteristics of pregnancies at a very advanced maternal age and the effect of parity on adverse obstetric outcomes. Methods: We retrospectively reviewed the records of women who gave birth at the Obstetrics and Gynecology Department of Okmeydanı Training and Research Hospital between January 2012 and December 2019. Overall, 22 448 women were younger than 40 and 593 were aged 40 and older. Women aged 40 and older were divided into a primiparous (52 or 8.77%) and a multiparous group (541 or 91.23%). Results: Significantly more women aged 40 and older had a cesarean section. The most common indications for a secondary cesarean delivery in both age groups were a previous cesarean procedure or uterine operation. The most frequent indication for a primary cesarean section in both groups was fetal distress. Cesarean sections due to non-progressive labor, fetal distress, and preeclampsia were significantly more frequent in primiparous than in multiparous women aged 40 and older. In primiparous women, fetal birth weight was lower and the frequency of preeclampsia/gestational hypertension was higher. Conclusion: Since primiparity was a risk factor for lower fetal birth weight and preeclampsia/gestational hypertension in women aged 40 and older, more attention should be paid to the follow-up and treatment of these patients.

Due to social and economic problems, career priorities, and prolonged education, an increasing number of women choose to give birth at an advanced age. Their choice is facilitated by the availability and efficacy of contraceptive methods and assisted reproductive technology (ART) (1). At the same time, the rate of nulliparity at an advanced age is increasing, while the parity rate is decreasing (2). The International Federation of Gynecology and Obstetrics (FIGO) uses the term “advanced maternal age” for pregnancies at the age of 35 and over and the term “very advanced maternal age” for pregnancies at the age of 40 and over. In these pregnancies, chronic diseases and medical problems are more common, and these women constitute a high-risk patient group (3). Most studies have shown that advanced maternal age increases the risk of hypertension, gestational diabetes mellitus, postpartum hemorrhage, premature birth, cesarean procedures, intrauterine growth retardation, and perinatal mortality (4,5). However, studies comparing the outcomes of primiparous and multiparous women pregnant at an advanced maternal age are scarce, and the evidence for some of the outcomes is conflicting. Therefore, the aim of our study was to examine the characteristics of pregnancies at a very advanced maternal age and to assess the effect of parity on adverse obstetric outcomes.

18.
Aim: To describe the epidemiological and clinical features of children and adolescents with PCR-confirmed SARS-CoV-2 infection in Croatia and to compare the first and second pandemic waves. Methods: Data on patients aged ≤19 years with a positive SARS-CoV-2 PCR test recorded in the periods March 12-May 12 (first wave) and June 19-July 19, 2020 (second wave) were retrospectively analyzed. The two periods were separated by several weeks with no incident cases. Results: We analyzed data on 289 children and adolescents (6.5% of all cases; incidence rate [IR] = 3.54, 95% confidence interval [CI] 3.14-3.97 per million person-days), 124 in the first wave (IR = 2.27) and 165 in the second wave (IR = 6.37); IRR second/first = 2.71 (95% CI 2.13-3.44). During the first wave, the incidence was highest in infants (IR = 3.48), while during the second wave it progressively increased with age, reaching IR = 7.37 in 15-19-year-olds. Family members were the key epidemiological contacts (72.6% of cases), particularly during the first wave (95.8% vs 56.3%). Overall, 41.3% of patients were asymptomatic: 25.3% in the first and 52.6% in the second wave. Age 15-19 years (vs younger age) was associated with a higher probability of being symptomatic (RR = 1.26, 95% CI 1.02-1.54), whereas infection during the second wave was associated with a lower probability (RR = 0.66, 95% CI 0.53-0.81). The most common symptoms were fever, cough, and rhinorrhea; in children aged ≥7 years, headache, anosmia/ageusia, and sore throat were also recorded. Only one child suffered severe disease. All but 18 (7.8%) children were treated only symptomatically, and all fully recovered. Conclusion: A large proportion of SARS-CoV-2 PCR-positive children and adolescents were asymptomatic. The associated disease was predominantly mild, comparably so in the first and second pandemic wave.

Since late December 2019, coronavirus disease 2019 (COVID-19), caused by the novel severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has spread rapidly worldwide and as of early December 2020 accounts for more than 65 million cases diagnosed in more than 200 countries (1). At this point, the most affected countries in Europe are Russia, Spain, France, the United Kingdom (UK), and Italy, which consequently also have the highest mortality rates. The first case in Croatia was reported in late February 2020, and within the next two months the infection spread nationwide. During this first epidemic wave, Croatia was under a one-month lockdown, which rapidly decreased the disease incidence, and only a few newly diagnosed cases were reported between May 25 and June 18, 2020. The easing of restrictions increased the incidence in late June, causing a second wave of COVID-19 in Croatia, with >147 000 cases reported so far (1,2). Over the last two decades, there were two other coronavirus outbreaks. Severe acute respiratory syndrome coronavirus appeared in 2002, affecting around 8000 people, with 10% mortality; children (4 months-17 years) accounted for <0.02% of total cases, and no deaths were reported in this age group. During the outbreak of the Middle East respiratory syndrome coronavirus, around 2300 people were infected, and children (<19 years of age) were also rarely affected (2% of total cases; 2 reported deaths) (3,4). COVID-19 has exhibited a similar epidemiological pattern. Although early reports from China, Italy, and the United States (US) suggested that children and adolescents accounted for only 1%-2% of all COVID-19 cases (5-7), later reports from around the world indicated a higher proportion of pediatric cases, between 1% and 8% (8-10). Children of all ages can be affected by SARS-CoV-2 infection, but in contrast to other respiratory viruses, they usually have a mild or asymptomatic infection. Compared with adults, severe infections and fatal outcomes in children are rare, and several immunopathological mechanisms could be responsible for these differences in disease severity (11). Although many studies have described the features of adults with COVID-19, data on pediatric cases are scarce; most come from China and the US, with only a few studies describing the disease in children from European countries. We aimed to describe the epidemiological and clinical features of children and adolescents with COVID-19 confirmed by the polymerase chain reaction (PCR) test for SARS-CoV-2 in Croatia and to assess potential differences between the first (March-May 2020) and second (ongoing) pandemic wave (June-July 2020).
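As a purely illustrative sketch of how the incidence rates and the incidence rate ratio (IRR) quoted in the abstract above are typically derived, the Python snippet below computes rates per million person-days and a Poisson-based Wald confidence interval for the IRR. The case counts are taken from the abstract, but the person-day denominators are hypothetical placeholders chosen only so that the rates fall in the reported range; nothing here reproduces the study's actual calculations.

    import math

    def incidence_rate(cases: int, person_days: float) -> float:
        """Incidence rate per million person-days."""
        return cases / person_days * 1_000_000

    def irr_with_ci(cases1: int, pd1: float, cases2: int, pd2: float, z: float = 1.96):
        """IRR (period 2 vs period 1) with a Wald-type 95% CI on the log scale,
        assuming Poisson-distributed case counts."""
        irr = incidence_rate(cases2, pd2) / incidence_rate(cases1, pd1)
        se_log = math.sqrt(1 / cases1 + 1 / cases2)
        return irr, (irr * math.exp(-z * se_log), irr * math.exp(z * se_log))

    # Case counts from the abstract; person-day denominators are hypothetical.
    cases_w1, pd_w1 = 124, 54.6e6   # roughly 2.3 cases per million person-days
    cases_w2, pd_w2 = 165, 25.9e6   # roughly 6.4 cases per million person-days

    irr, (lo, hi) = irr_with_ci(cases_w1, pd_w1, cases_w2, pd_w2)
    # Output will differ slightly from the abstract (rounded rates, placeholder denominators).
    print(f"IRR second/first = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")

In practice such estimates are often obtained from Poisson regression with person-time as an offset, which gives the same point estimate and a very similar interval for a simple two-period comparison.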

19.
20.