Assessment of potentially traumatic events and related psychological symptoms in refugee youth is common in epidemiological and intervention research. The objective of this study is to characterize reactions to assessments of trauma exposure and psychological symptoms, including traumatic stress, in refugee youth and their caregivers. Eighty-eight Somali youth and their caregivers participated in a screening and baseline interview for a psychological intervention in three refugee camps in Ethiopia. Participants were asked about their levels of distress prior to, immediately after, and approximately two weeks after completing the interview. Other quantitative and qualitative questions inquired about specific reactions to interview questions and procedures. Children and caregivers became increasingly relaxed over the course of the interview, on average. Few children (5.3%) or caregivers (6.5%) who reported being relaxed at the beginning of the interview became upset by the end of the interview. Some children and caregivers reported that certain assessment questions were upsetting and that feeling upset interfered with their activities. Despite some participants reporting persistent negative reactions, most reported liking and benefitting from the interview. While the majority of refugee youth and their caregivers reported positive experiences associated with completing trauma-related assessments, some reported negative reactions. Researchers and practitioners must consider the necessity, risks, and benefits of including questions about potentially traumatic events and related symptoms that are particularly upsetting in screening, survey research, and clinical assessment. When included, it is important that researchers and practitioners monitor negative reactions to these assessments and connect participants who become distressed with appropriate services.
The reasons why patients with haematological malignancies die in hospital more often than those with other cancers are the subject of much speculation. We examined variations in place of death by disease sub-type and time from diagnosis to death, to identify groups of ‘at-risk’ patients.
Methods
The study is based in the United Kingdom within the infrastructure of the Haematological Malignancy Research Network (HMRN), a large ongoing population-based cohort including all patients newly diagnosed with haematological malignancies in the north of England. Diagnostic, demographic, prognostic, treatment and outcome data are collected for each patient and individuals are ‘flagged’ for death. This study includes all adults (≥18 years) diagnosed 1st September 2004 to 31st August 2010 (n = 10,325), focussing on those who died on/before 31st August 2012 (n = 4,829).
Results
Most deaths occurred in hospital (65.9%), followed by home (15.6%), nursing home (11.0%) and hospice (7.5%), and there was little variation by diagnostic sub-type overall. Differences in place of death were, however, observed by time from diagnosis to death, and this was closely related to sub-type; 87.7% of deaths within a month of diagnosis happened in hospital and these largely occurred in patients with acute myeloid leukaemia, diffuse large B-cell lymphoma and myeloma. Patients surviving longer, and particularly beyond 1 year, were less likely to die in hospital and this corresponded with an increase in the proportion of home deaths.
Conclusions
Time from diagnosis to death was clearly a major determinant of place of death, and many patients who died within three months of diagnosis did so in hospital. This was closely related to disease sub-type, with early deaths occurring most notably in the more aggressive diseases. This is likely to be due to a combination of factors including acute presentation, rapid disease progression without transition to a palliative approach to care, and complications of treatment. Nonetheless, hospital deaths also occurred frequently in indolent diseases, suggesting that other factors were likely to contribute to the large proportion of hospital deaths overall. More evidence is needed to fully understand these complex cancers.
Systemic sclerosis (SSc) is an autoimmune disease that can cause disfiguring changes in appearance. This study examined the structural validity, internal consistency reliability, convergent validity, and measurement equivalence of the Social Appearance Anxiety Scale (SAAS) across SSc disease subtypes.
Methods
Patients enrolled in the Scleroderma Patient‐centered Intervention Network Cohort completed the SAAS and measures of appearance‐related concerns and psychological distress. Confirmatory factor analysis (CFA) was used to examine the structural validity of the SAAS. Multiple‐group CFA was used to determine whether SAAS scores can be compared across patients with limited and diffuse disease subtypes. Cronbach's alpha was used to examine internal consistency reliability. Correlations of SAAS scores with measures of body image dissatisfaction, fear of negative evaluation, social anxiety, and depression were used to examine convergent validity. SAAS scores were hypothesized to be positively associated with all convergent validity measures, with correlations significant and moderate to large in size.
Results
A total of 938 patients with SSc were included. CFA supported a 1‐factor structure (Comparative Fit Index 0.92, Standardized Root Mean Residual 0.04, and Root Mean Square Error of Approximation 0.08), and multiple‐group CFA indicated that the scalar invariance model best fit the data. Internal consistency reliability was good in the total sample (α = 0.96) and in disease subgroups. Overall, evidence of convergent validity was found with measures of body image dissatisfaction, fear of negative evaluation, social anxiety, and depression.
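The abstract reports internal consistency reliability via Cronbach's alpha (α = 0.96). For reference, the statistic has a simple closed form that can be computed from an item-score matrix; the sketch below is a generic illustration of that formula, not the authors' analysis code, and the function name is hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Values near 1 (as reported for the SAAS) indicate that items covary strongly relative to their individual variances.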
Conclusion
The SAAS can be reliably and validly used to assess fear of appearance evaluation in patients with SSc, and SAAS scores can be meaningfully compared across disease subtypes.
Coronary stents are commonly deployed using high pressure. However, the optimal duration of balloon inflation during deployment remains to be determined.
Vallurupalli and coworkers, in this issue of CCI, show that the stent system takes an average of 33 seconds to “accommodate” its pressure during in vitro deployment. In patients, the mean stent inflation time to achieve pressure stability was 104 seconds, ranging from 30 to 380 seconds.
These results challenge a rapid inflation/deflation approach for stent deployment. It is suggested that the duration of the inflation might be individualized, in a case‐by‐case approach.
However, the findings must be interpreted with caution, as they cannot be directly extrapolated to more diverse clinical, angiographic, and interventional scenarios.
Widespread coverage of vulnerable populations with insecticide-treated nets (ITNs) constitutes an important component of the Roll Back Malaria (RBM) strategy to control malaria. The Abuja Targets call for 60% coverage of children under 5 years of age and pregnant women by 2005; but current coverage in Africa is unacceptably low. The RBM 'Strategic Framework for Coordinated National Action in Scaling-up Insecticide-Treated Netting Programmes in Africa' promotes coordinated national action and advocates sustained public provision of targeted subsidies to maximise public health benefits, alongside support and stimulation of the private sector. Several countries have already planned or initiated targeted subsidy schemes either on a pilot scale or on a national scale, and have valuable experience which can inform future interventions. The WHO RBM 'Workshop on mapping models for delivering ITNs through targeted subsidies' held in Zambia in 2003 provided an opportunity to share and document these country experiences. This paper brings together experiences presented at the workshop with other information on experiences of targeting subsidies on ITNs, net treatment kits and retreatment services (ITN products) in order to describe alternative approaches, highlight their similarities and differences, outline lessons learnt, and identify gaps in knowledge. We find that while there is a growing body of knowledge on different approaches to targeting ITN subsidies, there are significant gaps in knowledge in crucial areas. Key questions regarding how best to target, how much it will cost and what outcomes (levels of coverage) to expect remain unanswered. High quality, well-funded monitoring and evaluation of alternative approaches to targeting ITN subsidies is vital to develop a knowledge base so that countries can design and implement effective strategies to target ITN subsidies.
We previously found that contrast-induced nephropathy (CIN) complicating percutaneous coronary intervention adversely affects patients with chronic kidney disease (CKD). Therefore, we further investigated whether the predictors and outcome of CIN after percutaneous coronary intervention differ among patients with versus without CKD. Among 7,230 consecutive patients, CIN (≥25% or ≥0.5 mg/dl increase in preprocedure serum creatinine 48 hours after the procedure) developed in 381 of 1,980 patients (19.2%) with baseline CKD (estimated glomerular filtration rate [eGFR] <60 ml/min/1.73 m(2)) and in 688 of 5,250 patients (13.1%) without CKD. Decreased eGFRs, periprocedural hypotension, higher contrast media volumes, lower baseline hematocrit, diabetes, pulmonary edema at presentation, intra-aortic balloon pump use, and ejection fraction <40% were the most significant predictors of CIN in patients with CKD. Apart from intra-aortic balloon pump use, predictors of CIN in patients without CKD were the same as mentioned, plus older age and type of contrast media. Regardless of baseline renal function, CIN correlated with longer in-hospital stay and higher rates of in-hospital complications and 1-year mortality compared with patients without CIN. By multivariate analysis, CIN was 1 of the most powerful predictors of 1-year mortality in patients with preexisting CKD (odds ratio 2.37, 95% confidence interval 1.63 to 3.44) or preserved eGFR (odds ratio 1.78, 95% confidence interval 1.22 to 2.60). Thus, regardless of the presence of CKD, baseline characteristics and periprocedural hemodynamic parameters predict CIN, and this complication is associated with worse in-hospital and 1-year outcomes.
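The CIN and CKD definitions used in the study above are simple computable rules (a ≥25% relative or ≥0.5 mg/dl absolute rise in serum creatinine at 48 hours; eGFR below 60 ml/min/1.73 m²). The sketch below illustrates those two criteria only; it is not the study's analysis code, and the function names are hypothetical.

```python
def is_cin(pre_creatinine: float, post_creatinine: float) -> bool:
    """CIN per the study definition: serum creatinine 48 h post-procedure
    rises by >=0.5 mg/dl (absolute) or >=25% (relative to baseline)."""
    rise = post_creatinine - pre_creatinine
    return rise >= 0.5 or rise >= 0.25 * pre_creatinine

def has_ckd(egfr: float) -> bool:
    """Baseline CKD per the study definition: eGFR < 60 ml/min/1.73 m^2."""
    return egfr < 60.0
```

For example, a rise from 1.0 to 1.3 mg/dl meets the 25% relative threshold even though the absolute increase is under 0.5 mg/dl.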