Similar Documents
20 similar documents found (search time: 46 ms)
2.
Medical error reduction is an international issue, as is the implementation of patient care information systems (PCISs) as a potential means to achieving it. As researchers conducting separate studies in the United States, The Netherlands, and Australia, using similar qualitative methods to investigate PCIS implementation, the authors have encountered many instances in which PCIS applications seem to foster errors rather than reduce their likelihood. The authors describe the kinds of silent errors they have witnessed and, from their different social science perspectives (information science, sociology, and cognitive science), interpret the nature of these errors. The errors fall into two main categories: those in the process of entering and retrieving information, and those in the communication and coordination process that the PCIS is supposed to support. The authors believe that with a heightened awareness of these issues, informaticians can educate, design systems, implement, and conduct research in ways that avoid the unintended consequences of these subtle silent errors.

Medical error reduction is an international issue. The Institute of Medicine's report on medical errors1 dramatically called attention to dangers inherent in the U.S. medical care system that might cause up to 98,000 deaths in hospitals and cost approximately $38 billion per year. In the United Kingdom, the chief medical officer of the newly established National Patient Safety Agency estimates that “850,000 incidents and errors occur in the NHS each year.”2 In The Netherlands, the exact implications of the U.S. figures for the Dutch health care scene are much debated. There as well, however, patient safety is on its way to becoming a political priority.
Medication errors alone have been estimated to cause 80,000 hospital admissions per year in Australia, costing $350 million.3

In much of the literature on patient safety, patient care information systems (PCISs) are lauded as one of the core building blocks for a safer health care system.4 PCISs are broadly defined here as applications that support the health care process by allowing health care professionals or patients direct access to order entry systems, medical record systems, radiology information systems, patient information systems, and so on. With fully accessible and integrated electronic patient records, and with instant access to up-to-date medical knowledge, faulty decision making resulting from a lack of information can be significantly reduced.5 Likewise, computerized provider order entry (CPOE) systems and automated reminder systems can reduce errors by eliminating illegible orders, improving communication, improving the tracking of orders, checking for inappropriate orders, and reminding professionals of actions to be undertaken. In this way, these systems can contribute to preventing under-, over-, or misuse of diagnostic or therapeutic interventions.6,7,8 Among the broad array of health informatics applications, CPOE systems, and especially medication systems, have received the most attention.9,10,11,12

PCISs are complicated technologies, often encompassing millions of lines of code written by many different individuals. The interaction space13 within which clinicians carry out their work can also be immensely complex, because individuals can execute their tasks by communicating across rich social networks.
When such technologies become an integral part of health care work practices, we are confronted with a large sociotechnical system in which many behaviors emerge out of the sociotechnical coupling, and the behavior of the overall system in any new situation can never be fully predicted from the individual social or technical components.13,14,15,16,17

It is not surprising, therefore, that authors have started to describe some of the unintended consequences that the implementation of PCISs can trigger.18 For instance, professionals could trust the decision support suggested by the seemingly objective computer more than is actually called for.15,19 Also, PCISs could impose additional work tasks on already heavily burdened professionals,20,21 and the tasks are often clerical and therefore economically inefficient.17 They can upset smooth working relations and communication routines.13,22 Also, given their complexity, PCISs could themselves contain design flaws “that generate specific hazards and require vigilance to detect.”23,24 As a consequence, PCISs might not be as successful in preventing errors as is generally hoped. Worse still, PCISs could actually generate new errors.25(p.511),26,27

It is obvious that PCISs will ultimately be a necessary component of any high-quality health care delivery system. Yet, in our research in three different countries, we have each encountered many instances in which PCIS applications seemed to foster errors rather than reduce their likelihood. In health care practices in the United States, Europe, and Australia alike, we have seen situations in which the system of people, technologies, organizational routines, and regulations that constitutes any health care practice seemed to be weakened rather than strengthened by the introduction of the PCIS application.
In other words, we frequently observed instances in which the intended strengthening of one link in the chain of care unwittingly led to the weakening or deletion of others.

We argue that many of these errors are the result of highly specific failures in PCIS design and/or implementation. We do not focus on errors that are the result of faulty programming or other technical dysfunctions. Hardware problems and software bugs are more common than they should be, especially in a high-risk field such as medicine. However, these problems are well known and can theoretically be dealt with through testing before implementation. Similarly, we do not discuss errors that are the result of obvious individual or organizational dysfunction, such as a physician refusing to seek information in the computer system “because that is not his task,” or a health care delivery organization cutting training programs for a new PCIS for budgetary reasons.

We focus instead on those often latent or silent errors that result from a mismatch between the functioning of the PCIS and the real-life demands of health care work. Such errors are not easily found by a technical analysis of the PCIS design, or even suspected after the first encounter with the system in use. They emerge only when the technical system is embedded in a working organization, and they can vary from one organization to the next. Yet, in failing to take seriously some by now well-recognized features of health care work, some PCISs are designed or implemented in such a way that error can arguably be expected to result. Only when thoughtful consideration is given to these issues, we argue, will PCISs be able to fulfill their promise.

3.
Objectives: To determine clinicians' (doctors', nurses', and allied health professionals') “actual” and “reported” use of a point-of-care online information retrieval system, and to assess the extent to which use is related to direct patient care by testing two hypotheses. Hypothesis 1: clinicians use online evidence primarily to support clinical decisions relating to direct patient care. Hypothesis 2: clinicians use online evidence predominantly for research and continuing education.

Design: Web-log analysis of the Clinical Information Access Program (CIAP), an online, 24-hour, point-of-care information retrieval system available to 55,000 clinicians in public hospitals in New South Wales, Australia; and a statewide mail survey of 5,511 clinicians.

Measurements: Rates of online evidence searching per 100 clinicians for the state and for the 81 individual hospitals studied; reported use of CIAP by clinicians through a self-administered questionnaire; and correlations between evidence searches and patient admissions.

Results: Monthly rates of 48.5 “search sessions” per 100 clinicians and 231.6 text hits to single-source databases per 100 clinicians (n = 619,545); 63% of clinicians reported that they were aware of CIAP, and 75% of those had used it. Eighty-eight percent of users reported that CIAP had the potential to improve patient care, and 41% reported direct experience of this. Clinicians' use of CIAP on each day of the week was highly positively correlated with patient admissions (r = 0.99, p < 0.001). This was also true for all ten randomly selected hospitals.

Conclusion: Clinicians' online evidence use increases with patient admissions, supporting the hypothesis that clinicians' use of evidence is related to direct patient care. Patterns of evidence use and clinicians' self-reports also support this hypothesis.

Current literature1,2,3 indicates that clinicians do not routinely use the available evidence to support clinical decisions.
Several studies have shown that simply disseminating evidence, for example in the form of practice guidelines, does not lead to increased use of that information to inform clinical decisions.4 Clinicians apparently pursue answers to only a minority of their questions5,6 and, when they do so, they rely most heavily on colleagues for answers.5 Lack of easy access to up-to-date evidence is cited by clinicians as a barrier to evidence-based practice.7,8

Online clinical information resources have the potential to support clinicians who adopt an evidence-based approach by providing them with the information they need when they need it.9 We have some evidence that searches of bibliographic databases such as Medline are efficacious.5,10 Given sufficient time, clinicians are able to retrieve research evidence relevant to their clinical questions.11 Training in online searching techniques enhances the quality of evidence retrieved,12 whereas education in critical appraisal increases clinicians' abilities to apply the information obtained.13

However, measuring the actual uptake of online information retrieval systems is problematic, and few studies have been attempted. Studies of intranet provision of online resources report utilization rates of 30 to 720 searches per 100 person-months.14 However, most studies report use rates that exclude clinicians who have access to the system but do not use it. Consequently, these studies do not provide a measure of actual uptake by the clinical population.

It is also difficult to measure the impact that online access to evidence has on clinical practice. Assessments of the impact of Medline information on decision making and patient care have relied primarily on clinicians' self-reports. Haynes and McKibbon15 provided training and access to online Medline to a group of 158 U.S. physicians.
For 92% of searches related to a patient's care, clinicians reported that the information retrieved resulted in “at least some improvement” in care. Using the critical incident technique, Lindberg et al.16 interviewed U.S. clinicians who used Medline about their searches. Of the 1,158 searches described, 41% were classified as affecting decisions regarding patient care. A survey of U.K. general practitioners who used the ATTRACT system, which provides rapid access to evidence-based summaries in response to clinical queries, found that 60% (24 of 40 doctors) reported that the information gained had changed their practice.17

Based on the assumption that providing clinicians with easy access to “evidence” will support decision making and result in improvements in patient care, in 1997 the State Health Department in New South Wales (NSW), Australia, implemented the Clinical Information Access Program (CIAP; <http://www.ciap.health.nsw.gov.au/>). CIAP is a Web site providing point-of-care, 24-hour, online access to a wide range of bibliographic and other resource databases for the approximately 55,000 clinicians (doctors, nurses, and allied health staff) employed by the state and primarily working in public hospitals.

Qualitative data from case studies indicated that clinicians perceived a range of factors as influencing their use of the system, including support from facility managers and direct supervisors, access to computer terminals, training and skills (appraisal of evidence, database searching, and computer skills), and the place of evidence-based practice in their professional work.18

We sought to test hypotheses generated from these case studies using Web-log and survey data. We aimed to determine the rates of “actual” and “reported” use of online evidence by the population of clinicians working in the public health system in NSW, and to assess the extent to which system use was related to direct patient care. We posed and tested two competing hypotheses.
These hypotheses were formulated following a qualitative study examining clinicians' use of online evidence18: hypothesis 1—clinicians use online evidence primarily to support clinical decisions relating to direct patient care; and hypothesis 2—clinicians use online evidence predominantly for research and continuing education. These hypotheses were tested by the following methods.

Examination of Patterns of Online Evidence Use by Clinicians by Time, Day, and Location of Searches

Hypothesis 1 would be supported by a pattern of use that coincided with patient care, peaking between the core working hours of 9 am and 5 pm, with most use originating from within hospitals (Figure 1). Hypothesis 2 would be supported by a relatively wide distribution of use across the times of the day, with sustained rates of use into the evening when clinicians have more free time for research, and with a high proportion of access occurring from outside hospitals, e.g., at home (Figure 1).

Figure 1. Hypotheses regarding patterns of online evidence use by clinicians. H = hypothesis.
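The time-and-location analysis described above can be sketched in a few lines of log processing. This is a hypothetical illustration only: the record format, field names, and sample values below are invented for the example and are not taken from the CIAP web logs.

```python
from collections import Counter
from datetime import datetime

# Hypothetical web-log records: (timestamp, origin) pairs.
logs = [
    ("2003-03-03 09:15", "hospital"),
    ("2003-03-03 14:40", "hospital"),
    ("2003-03-03 21:05", "external"),
    ("2003-03-04 10:30", "hospital"),
]

def usage_profile(records):
    """Bucket search sessions by hour of day and by origin."""
    by_hour = Counter()
    by_origin = Counter()
    for stamp, origin in records:
        hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").hour
        by_hour[hour] += 1
        by_origin[origin] += 1
    # Hypothesis 1 predicts that most sessions fall between 9 am and
    # 5 pm and originate from within hospitals.
    core = sum(n for h, n in by_hour.items() if 9 <= h < 17)
    return core / sum(by_hour.values()), by_origin

core_share, origins = usage_profile(logs)
print(f"core-hours share: {core_share:.0%}, origins: {dict(origins)}")
```

A high core-hours share with mostly in-hospital origins would count as evidence for hypothesis 1; a flatter distribution with substantial external access would favor hypothesis 2.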

Measuring the Association between Hospital Admissions and Use of Online Evidence

A significant positive correlation between patient admissions to hospitals and online evidence searches would support hypothesis 1 by suggesting that CIAP is used primarily to inform patient care rather than to meet research or continuing-education information needs. Absence of a correlation would support hypothesis 2.
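The association test described here amounts to computing a Pearson correlation over matched daily counts of admissions and searches. A minimal sketch, using invented illustrative numbers rather than the study's actual CIAP data:

```python
from math import sqrt

# Illustrative daily counts for one week (Mon..Sun); not the study's data.
admissions = [410, 395, 388, 402, 380, 150, 140]
searches   = [520, 505, 490, 515, 480, 190, 175]

def pearson(x, y):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(admissions, searches)
print(f"r = {r:.3f}")  # a strong positive r would favor hypothesis 1
```

In the fabricated data above, searches track admissions closely across weekdays and the weekend dip, so r comes out near 1, which is the pattern the study reports (r = 0.99).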

4.
There are constraints embedded in medical record structure that limit use by patients in self-directed disease management. Through a systematic review of the literature from a critical perspective, four characteristics that either enhance or mitigate the influence of medical record structure on patient utilization of an electronic patient record (EPR) system have been identified: environmental pressures, physician centeredness, collaborative organizational culture, and patient centeredness. An evaluation framework is proposed for use when considering adaptation of existing EPR systems for online patient access. Exemplars of patient-accessible EPR systems from the literature are evaluated using the framework. From this study, it appears that traditional information system research and development methods may not wholly capture many pertinent social issues that arise when expanding access of EPR systems to patients. Critically rooted methods such as action research can directly inform development strategies so that these systems may positively influence health outcomes.

Electronic patient record (EPR) systems fundamentally change the way health information is structured. An EPR is a dynamic entity, affording greater efficiency and quality control to the work processes of clinicians by providing data entry at the point of care, logical information access capabilities, efficient information retrieval, user friendliness, reliability, information security, and a capacity for expansion as needs arise.1,2

An EPR system promotes patient participation in care to a greater extent than paper records because of its capacity for interaction.
Patients can transmit real-time vital signs and other forms of data from their bedside, home, or office and receive up-to-date supportive information customized and contextualized to their individual needs.3,4

In this journal, Ross and Lin recently presented a comprehensive review of the world literature on the effects of patient access to medical records, noting a potential for modest benefits and minimal risk, while also citing that the impact of access may vary depending on the patient population in question.5 This is consistent with findings in the information system literature that systems fail when inadequate attention is paid to stakeholder needs and work processes during design6 or when assumptions are made about how well a system fits with the user's role within the organization during implementation.7

Medical records are structured primarily for the use of clinicians and administrators. Patients typically are not counted among the primary users of an EPR system; they tend to be given access sometime after the system is implemented in the organization. Structural concessions and decisions made when the system is first implemented, such as fragmented data entries and foreign lexicons, can make the information difficult for patients to follow and the records all but impossible for them to use effectively.8

5.
Syndromic surveillance refers to methods relying on detection of individual and population health indicators that are discernible before confirmed diagnoses are made. In particular, prior to the laboratory confirmation of an infectious disease, ill persons may exhibit behavioral patterns, symptoms, signs, or laboratory findings that can be tracked through a variety of data sources. Syndromic surveillance systems are being developed locally, regionally, and nationally. The efforts have been largely directed at facilitating the early detection of a covert bioterrorist attack, but the technology may also be useful for general public health, clinical medicine, quality improvement, patient safety, and research. This report, authored by developers and methodologists involved in the design and deployment of the first wave of syndromic surveillance systems, is intended to serve as a guide for informaticians, public health managers, and practitioners who are currently planning deployment of such systems in their regions.

Bioterrorism preparedness has been the subject of concentrated national effort1 that has intensified since the events of fall 2001.2 In response to these events, the biomedical, public health, defense, and intelligence communities are developing new approaches to real-time disease surveillance in an effort to augment existing public health surveillance systems. New information infrastructure and methods to support timely detection and monitoring,3,4,5,6,7 including the discipline of syndromic surveillance, are evolving rapidly. The term syndromic surveillance refers to methods relying on detection of clinical case features that are discernible before confirmed diagnoses are made. In particular, prior to the laboratory confirmation of an infectious disease, ill persons may exhibit behavioral patterns, symptoms, signs, or laboratory findings that can be tracked through a variety of data sources.
If the attack involved anthrax, for example, a syndromic surveillance system might detect a surge in influenza-like illness, thus providing an early warning and a tool for monitoring an ongoing crisis.

Unlike traditional systems, which generally rely on voluntary reports from providers to acquire data, contemporary syndromic surveillance relies on an approach in which data are continuously acquired through protocols or automated routines. The real-time nature of these syndromic systems makes them valuable for bioterrorism-related outbreak detection, monitoring, and investigation. These systems augment the capabilities of the alert frontline clinician who, although an invaluable resource for outbreak detection, is generally better at recognizing individual cases than patterns of cases over time and across a region. Syndromic surveillance technology may be useful not only for bioterrorism event detection, but also for general public health, clinical medicine, quality improvement, patient safety, and research. This report, authored by developers and methodologists involved in the design and deployment of the first wave of syndromic surveillance systems, is intended to serve as a guide for informaticians, public health managers, and practitioners who may be planning deployment of such systems in their regions.
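As a rough illustration of how an automated routine might flag a surge in influenza-like illness, the toy detector below raises an alert when a day's count exceeds the mean of a trailing baseline window by several standard deviations. This sketch is not drawn from the report itself; operational syndromic surveillance systems use considerably more sophisticated statistical detectors, and the counts shown are invented.

```python
from statistics import mean, stdev

def surge_alerts(daily_counts, window=7, z=3.0):
    """Flag day indices whose count exceeds the trailing-window mean
    by more than z standard deviations (a toy aberration detector)."""
    alerts = []
    for i in range(window, len(daily_counts)):
        base = daily_counts[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma and daily_counts[i] > mu + z * sigma:
            alerts.append(i)
    return alerts

# Illustrative daily influenza-like-illness counts with a surge at index 9.
ili = [12, 14, 11, 13, 12, 15, 13, 14, 12, 41]
print(surge_alerts(ili))
```

The point of the example is the continuous, protocol-driven acquisition and evaluation of counts, as opposed to waiting for a clinician's voluntary report of an unusual cluster.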

6.

Objective

Home telemonitoring is a patient management approach that combines various information technologies for monitoring patients at a distance. This study presents a systematic review of the nature and magnitude of outcomes associated with telemonitoring of four types of chronic illness: pulmonary conditions, diabetes, hypertension, and cardiovascular diseases.

Methods

A comprehensive literature search was conducted on Medline and the Cochrane Library to identify relevant articles published between 1990 and 2006. A total of 65 empirical studies were obtained (18 pulmonary conditions, 17 diabetes, 16 cardiac diseases, 14 hypertension) mostly conducted in the United States and Europe.

Results

The magnitude and significance of the telemonitoring effects on patients’ conditions (e.g., early detection of symptoms, decrease in blood pressure, adequate medication, reduced mortality) remain inconclusive for all four chronic illnesses. However, the results of this study suggest that regardless of their nationality, socioeconomic status, or age, patients comply with telemonitoring programs and the use of the technologies. Importantly, the telemonitoring effects on clinical effectiveness outcomes (e.g., decreases in emergency visits, hospital admissions, and average hospital length of stay) are more consistent in pulmonary and cardiac studies than in diabetes and hypertension studies. Lastly, economic viability of telemonitoring was observed in very few studies, and in most cases no in-depth cost-minimization analyses were performed.

Conclusion

Home telemonitoring of chronic diseases seems to be a promising patient management approach that produces accurate and reliable data, empowers patients, influences their attitudes and behaviors, and potentially improves their medical conditions. Future studies need to build evidence related to its clinical effects, cost-effectiveness, impacts on services utilization, and acceptance by health care providers.

Continued advances in science and technology and general improvements in environmental and social conditions have increased life expectancy around the world.1 As a result, the world’s population is aging. Over the last 50 years, the number of people aged 60 years or over has tripled, and it is expected to triple again to almost two billion by 2050.2 Population ageing is a global phenomenon affecting all regions. Globally, the proportion of older people was 8% in 1950 and 10% in 2000, and is projected to reach 21% in 2050.3 China is the region where the increase is likely to be most spectacular, from 6.9% in 2000 to 22.7% in 2050.3 Population ageing is profound, with major consequences and implications for all facets of human life, including health and health care. Indeed, as we age, the incidence and prevalence of chronic diseases, such as cardiovascular disease, chronic obstructive pulmonary disease (COPD), and diabetes, continue to increase.1,4 Chronic diseases have become major causes of death in almost all countries. It was estimated that, by the end of 2005, 60% of all deaths would be due to chronic diseases.5 Such prevalence of chronic diseases is one reason why expenditures on health care are skewed: in most health care delivery systems, 5% of patients are responsible for 50% of costs.6 The economic burden of chronic diseases is profound, accounting for 46% of the global burden of disease.7 The losses in national income for 2005 due to deaths from heart disease, stroke, and diabetes were estimated (in international dollars) to be $18 billion in China, $1.6 billion in the United Kingdom, and $1.2 billion in Canada.5 In the United States, chronically ill patients account for 78% of all medical costs nationally.8 The increasing burden of chronic disease on health care resources and costs provides a powerful incentive to find more effective ways to care for these patients.

The challenge is even more complex because of the supply-and-demand curve in health care.4 At the same time as we face dramatic increases in the numbers of chronically ill patients, there are global provider shortages. An acute nursing shortage exists in many developed countries, including the United States, United Kingdom, Australia, and Canada, and there is no realistic prospect that this situation will change in the near future.9–11 Furthermore, some countries have to cope with reductions in the number of persons entering the nursing profession.12–14 Several studies have also suggested a substantial physician shortage, which is expected to develop in the coming years in various countries.15–17 Dramatic increases in the numbers of chronically ill patients, in the face of shrinking provider numbers and significant cost pressures, mean that a fundamental change is required in the process of care. We need to identify patient management approaches that ensure appropriate monitoring and treatment of patients while reducing the cost involved in the process. Provision of care directly in the patient's home represents one alternative. It may be perceived as a substitute for acute hospitalization, an alternative to long-term institutionalization, a complementary means of maintaining individuals in their own community, and an alternative to conventional hospital outpatient or physician visits.1

Information technology can play a crucial role in providing care in the home, and telehealth technologies have been growing dramatically. More precisely, home telemonitoring is a way of responding to the new needs of home care in an ageing population. In this regard, Meystre18 recently concluded that long-term disease monitoring of patients at home currently represents the most promising application of telemonitoring technology for delivering cost-effective, quality care. Yet, to be able to comprehensively assess and determine the benefits of home telemonitoring, it is essential to perform a systematic review that critically synthesizes the results of the various studies in this area and provides a solid ground for clinical and policy decision making.19 This article provides a systematic review of experimental and quasi-experimental studies involving home telemonitoring of chronic patients with pulmonary conditions, diabetes, hypertension, and cardiovascular diseases. Specifically, it describes the nature and magnitude of the outcomes or impacts associated with telemonitoring programs across the world.

9.
Intestinal lymphangiectasia (IL) is a rare disease characterized by dilatation of intestinal lymphatics. It can be classified as primary or secondary according to the underlying etiology. The clinical presentations of IL include pitting edema, chylous ascites, pleural effusion, acute appendicitis, diarrhea, lymphocytopenia, malabsorption, and intestinal obstruction. The diagnosis is made by intestinal endoscopy and biopsies. Dietary modification is the mainstay in the management of IL, with a variable response. Here we report 2 patients with IL in Bahrain who showed a positive response to dietary modification.

Intestinal lymphangiectasia (IL) is a rare1-4 benign disease characterized by focal or diffuse dilation of the mucosal, submucosal, and subserosal lymphatics.2,5 In addition to being an important cause of protein-losing enteropathy (PLE),6 IL is frequently associated with extraintestinal lymphatic abnormalities.5 Depending on the underlying pathology, IL can be classified as primary or secondary disease.1,2,4,5 Primary IL (PIL) probably represents a congenital disorder of mesenteric lymphatics.1,3 IL can be secondary to diseases such as constrictive pericarditis, lymphoma, sarcoidosis, and scleroderma.1 A secondary disorder should always be ruled out before labeling IL as primary, by testing for proteinuria and for rheumatic, neoplastic, and parasitic diseases.1,3 Recently, a functional form of PIL with typical endoscopic and pathological findings but without clinical symptoms has been reported.3 The clinical presentations of IL are pitting edema, chylous ascites, pleural effusion, acute appendicitis, diarrhea, lymphocytopenia, malabsorption, and intestinal obstruction.1,2,4 Palliative treatment with lifelong dietary modification is the most effective and widely prescribed therapy.6 Limiting dietary fat intake reduces chyle flow and, therefore, protein loss.1 Once the protein level is within the normal range, recurrence of enteric protein loss can be prevented by
total parenteral nutrition (TPN) and medium-chain triglycerides (MCT).1 In cases of secondary IL, treating the underlying primary disorder may be curative.2 Although the therapeutic approach for this disorder has gained much attention lately, few studies have considered the therapeutic effects, nutritional condition, and long-term results in PIL patients.4 Here, we report 2 patients with PIL who were diagnosed by endoscopy and biopsy and who showed a positive response to dietary modifications. We present these particular cases to highlight the effect of dietary modifications on the clinical status of patients with IL.

10.
Up to 50% of hospitalized patients worldwide are malnourished or at risk of malnutrition. Guidelines recommend nutritional screening of all patients on hospital admission. Results from studies of hospitalized patients show that screening, with follow-up nutritional assessment and care when indicated, can improve patients’ clinical outcomes and reduce healthcare costs. Despite compelling evidence, attention to nutritional care remains suboptimal in clinical settings worldwide. The feedM.E. Global Study Group developed a simple, stepwise Nutrition Care Pathway to facilitate best-practice nutrition care. This pathway guides clinicians to screen patients’ nutritional status on hospital admission or at initiation of care; intervene promptly with nutrition care when needed; and supervene, or follow up routinely, with adjustment and reinforcement of nutrition care plans. The feedM.E. Middle East Study Group seeks to extend this program to our region. We advise clinicians to adopt and adapt the Nutrition Care Pathway, bringing quality nutrition care to everyday practice.

Up to 50% of hospitalized patients are reported to be at risk of malnutrition or actually malnourished.1,2 Clinical studies in healthcare settings worldwide have shown that disease-related malnutrition is exceedingly common,3-7 especially in older patients.8,9 The prevalence of disease-related malnutrition (nutritional inadequacy with an inflammatory component10) is similarly high in hospitals of both emerging and industrialized nations.
This prevalence remains as high now as it was a decade ago in almost every country.11-14 Patients with poor nutritional status are susceptible to disease progression and complications, and their recovery from illness or injury is often prolonged.1,15,16 A key barrier to best-practice nutrition care is limited hospital resources; clinicians report that too little time and not enough money constrain staff training on how to recognize and treat malnutrition.17,18

While educational training and nutrition interventions have financial costs, so do the consequences of malnutrition. Disease-related malnutrition increases costs of care due to higher rates of complications (infections, pressure ulcers, falls), longer hospital stays, and more frequent readmissions.19-28 By contrast, clinical study results show that attention to nutrition care during hospitalization can improve patients’ health outcomes and cut healthcare costs.29-39 Nutrition planning and follow-up nutrition care can also provide both health and financial paybacks, whether the patient is living in the community, preparing for surgery, or ready to be discharged from the hospital.40-43 Yet disease-related malnutrition continues to be overlooked and under-treated.

Despite compelling evidence of benefits from nutrition care29,30,32,34-38 and clearly stated nutrition care guidelines,43-46 nutrition interventions for people with disease-related malnutrition still fall far short of best practice.3,4,47 To address this shortfall, clinicians worldwide have issued a “call to action” for increased recognition of nutrition’s role in improving patient outcomes.42,48-50 To take action, clinical nutrition experts from Asia, Europe, the Middle East, and North and South America formed the feedM.E.
(Medical Education) Global Study Group and put together a working program to increase awareness and improve nutrition care around the world.1 The global feedM.E. initiative introduced the mantra “screen, intervene, and supervene” to cue the steps of a straightforward Nutrition Care Pathway.1 To support the feedM.E. global educational initiative, we formed a feedM.E. Middle East Study Group, which includes nutrition leaders from Egypt, Saudi Arabia, Turkey, and the United Arab Emirates.

In our current article, we, the members of the feedM.E. Middle East Study Group, emphasize the screen-intervene-supervene strategy for nutrition care, which is further defined for incorporation into practice as a Nutrition Care Pathway. For Middle East healthcare, we advise that this pathway can be adapted to meet cultural differences among Middle East countries, and can be followed for patients in the community, in the hospital, and after discharge to home or to long-term care centers.

Malnutrition in the Middle East

Countries of the Middle East region are highly diverse in ecology (green valleys and dry yellow deserts), political structures (republics, monarchies), government stability or instability (conflicts, civil wars, unrest), and economic status (including some of the world’s richest and poorest countries). This diversity creates marked differences in the health and nutritional status of people in regional populations.51 In some cases, rapid urbanization and social development have occurred in the absence of economic growth.51 For adults in the Middle East and worldwide, malnutrition is most often related to illness, including in people with limited physical or mental function. Disease-related malnutrition occurs in people of all ages and circumstances but is notably common in older people.9 Disease-related malnutrition is evident at hospital admission, during hospitalization, and in the periods before admission and after discharge. With all these influences, the prevalence of disease-related malnutrition varies widely across the Middle East; from 6% to 58% of hospitalized patients are malnourished or at risk of malnutrition (Table 2).52

Table 2

Reports of hospital- and community-based malnutrition prevalence in Middle East countries.

Malnutrition predicts poor health outcomes

Poor nutrition is a predictor of poor outcomes, as shown by results of a large multicenter collaboration including hospitals in 3 countries of the Middle East (Lebanon, Egypt, and Libya) and 9 countries in Europe.53 This prospective study enrolled 5,051 patients; of these, 33% were found to be ‘at risk’ of malnutrition (Nutritional Risk Screening [NRS] 2002 score). The proportion of ‘at risk’ patients generally reflected the severity of the underlying illness or injury in the population studied. For patients in the Middle East, risk for malnutrition by hospital department was: internal medicine (11%), oncology (37%), surgery (55%), and intensive care (97%).53 Patients ‘at risk’ had significantly more complications, longer lengths of hospital stay, and higher rates of mortality.53 Further, of those who were discharged, fewer ‘at risk’ patients were discharged to home, and more were sent to nursing homes or to other hospital care sites, as compared with patients who were adequately nourished.53

Malnutrition is under-treated

Even when nutrition problems are identified, studies have found that such problems are not adequately treated. In one Middle Eastern study, 34 Turkish hospitals from 19 cities contributed data from 29,139 patients.54 On admission, 15% of patients were found to be at nutritional risk; risk was highest in intensive care unit patients (52%). Of those identified to be ‘at nutritional risk’ in this study, only around half received nutrition intervention.54 Studies carried out in Australian and European hospitals reported similar shortfalls in treating malnutrition or reaching nutritional targets.3,47

Nutrition care improves outcomes and lowers costs

Nutrition interventions, including food fortification or oral nutrition supplements (ONS), tube-fed enteral nutrition, and parenteral nutrition, are recognized to have significant clinical and economic benefits across patient groups and in different settings. Specifically, nutrition interventions were associated with fewer in-hospital complications,30 reduced pressure ulcer incidence,37 achievement of higher functional status in recovery,30 improved quality of life,34,55 and reduced risk of mortality,56 as shown by results of randomized, controlled trials and by meta-analyses. Nutrition interventions to prevent or treat disease-related malnutrition also show resource savings; reports have shown reduced length of hospital stay,57 fewer readmissions,38,55 and lowered hospital-related costs.35,36 Few studies have considered the cost of hospital-based malnutrition in Middle East countries. However, a recent survey of neurologists from 8 tertiary centers in Turkey examined current practice related to the treatment of patients recovering from strokes.58,59 The researchers determined that the overall one-year costs of care were higher for malnourished patients compared with those who were adequately nourished ($5201 versus $3618; p=0.09). Of the total costs, ONS costs were $868 in patients with malnutrition and $501 in patients without malnutrition, whereas all other costs were $4334 and $3117, respectively. Investment in ONS as a treatment for malnutrition was thus supported as a way to decrease the cost of illness.
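The cost comparison above can be checked with simple arithmetic. A minimal sketch, using the component figures from the cited Turkish stroke survey (the malnourished-group total differs from the sum of its reported parts only by rounding):

```python
# Cost components from the cited Turkish stroke survey (USD, one-year costs).
ons_cost = {"malnourished": 868, "well_nourished": 501}      # oral nutrition supplements
other_cost = {"malnourished": 4334, "well_nourished": 3117}  # all other care

for group in ("malnourished", "well_nourished"):
    total = ons_cost[group] + other_cost[group]
    print(f"{group}: ONS ${ons_cost[group]} + other ${other_cost[group]} = ${total}")

# The extra ONS spending in the malnourished group is small relative to the
# extra non-nutrition care costs, which is the basis of the argument that
# investing in ONS can lower the overall cost of illness.
extra_ons = ons_cost["malnourished"] - ons_cost["well_nourished"]
extra_other = other_cost["malnourished"] - other_cost["well_nourished"]
print(f"extra ONS spend: ${extra_ons}; extra other-care spend: ${extra_other}")
```

The breakdown shows roughly $367 of additional ONS spending against roughly $1,217 of additional non-nutrition care costs in the malnourished group.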

Malnutrition definition

For caregivers to provide best-practice nutrition care, it is important to be aware of the current definition of malnutrition. Malnutrition is now recognized as 3 clinical syndromes, which are characterized by underlying illness or injury and varying degrees of inflammation.10 These malnutrition syndromes are: 1) starvation-related malnutrition, namely, a form of malnutrition without inflammation; 2) chronic disease-related malnutrition, namely, nutritional inadequacy associated with chronic conditions that impose sustained inflammation of a mild-to-moderate degree; and 3) acute disease- or injury-related malnutrition, namely, undernutrition related to conditions that elicit marked inflammatory responses. Inflammation is a component of the underlying disease in several chronic disease states, such as kidney disease and heart failure, and thus increases the risk of malnutrition,60 even among patients who are overweight or obese.61 Severe acute health crises such as severe infection, surgery, burn injury, or sepsis are associated with marked inflammation, which contributes to and intensifies the risk for severe malnutrition.60 Adult undernutrition is further described as a condition characterized by 2 or more of 6 criteria: unintentional weight loss, inadequate energy intake, loss of muscle mass, loss of subcutaneous fat, fluid accumulation, and functional decline (for example, decreased hand-grip strength).62
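The "2 or more of 6 criteria" rule described above is essentially a counting check. A minimal sketch (criterion names are taken from the cited consensus definition; the helper function and its inputs are our illustration, not a clinical tool):

```python
# Sketch of the "at least 2 of 6 criteria" rule for adult undernutrition.
# The criterion list comes from the consensus definition cited in the text;
# this function and the example findings are illustrative only.
CRITERIA = (
    "unintentional_weight_loss",
    "inadequate_energy_intake",
    "loss_of_muscle_mass",
    "loss_of_subcutaneous_fat",
    "fluid_accumulation",
    "functional_decline",  # e.g., decreased hand-grip strength
)

def meets_undernutrition_definition(findings: dict) -> bool:
    """Return True when at least 2 of the 6 criteria are present."""
    present = sum(1 for c in CRITERIA if findings.get(c, False))
    return present >= 2

# Weight loss plus reduced grip strength meets the definition;
# weight loss alone does not.
print(meets_undernutrition_definition(
    {"unintentional_weight_loss": True, "functional_decline": True}))  # True
print(meets_undernutrition_definition(
    {"unintentional_weight_loss": True}))  # False
```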

The feedM.E. Nutrition Care Pathway

The feedM.E. Global Study Group recently introduced screen-intervene-supervene as a guide for delivering prompt and complete nutrition care (Appendix 1A).1 We members of the feedM.E. Middle East Study Group support this overall strategy, and we advise the use of the Nutrition Care Pathway to bring this strategy to everyday practice. To facilitate broad use of the Nutrition Care Pathway throughout the Middle East, we provide versions in Turkish and Arabic (Appendices 1B and 1C). For complete uptake, specific aspects of nutrition care may need adjustments to accommodate country-to-country differences in lifestyle, food availability, and genetic factors, as was the case with a diabetes nutrition program.63,64

Nutrition Care Pathway: screen for malnutrition risk

Screening patients for malnutrition on admission to the hospital is now a standard of care. In the Middle East, we advise that routine nutrition screening is likewise appropriate in rehabilitation facilities, long-term care centers, and community healthcare settings. To determine nutritional risk, we advise screening with (1) the 2 questions of the Malnutrition Screening Tool (MST)65,66 and (2) a quick clinical decision on whether the patient’s illness or injury carries risk for malnutrition.10

In the Middle East, as is the case elsewhere, admitting nurses are often the first contacts for patients, and we suggest that nurses conduct the initial screen for nutritional risk. If risk is found, we advise immediate intervention with nutrition advice, an increase in the quantity or protein density of food, and/or use of protein-containing oral nutrition supplements. When risk is recognized, particularly when the patient is unable to take food orally, refer to a trained clinician (dietitian, nutrition specialist) for further assessment and specific treatment.
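The two-part screen described above can be sketched as a small decision rule. This is an illustration only, assuming the commonly published MST scoring (weight-loss question scored 0-4, with "unsure" scoring 2; appetite question scored 0-1; total score of 2 or more indicating risk); the function names are ours:

```python
# Illustrative sketch of the screening step: the 2-question Malnutrition
# Screening Tool plus a quick yes/no clinical judgment on whether the
# illness itself carries malnutrition risk. Scoring assumed from the
# commonly published MST; not a substitute for the validated tool.
WEIGHT_LOSS_SCORES = {"none": 0, "unsure": 2, "1-5 kg": 1,
                      "6-10 kg": 2, "11-15 kg": 3, ">15 kg": 4}

def mst_score(weight_loss: str, eating_poorly: bool) -> int:
    """Sum the weight-loss and appetite question scores."""
    return WEIGHT_LOSS_SCORES[weight_loss] + (1 if eating_poorly else 0)

def at_nutritional_risk(weight_loss: str, eating_poorly: bool,
                        high_risk_illness: bool) -> bool:
    """Flag the patient if MST >= 2 OR the illness itself carries risk."""
    return mst_score(weight_loss, eating_poorly) >= 2 or high_risk_illness

# "Unsure" weight loss alone already reaches the assumed MST cutoff:
print(at_nutritional_risk("unsure", False, False))  # True
# No MST points, but a high-risk illness still flags the patient:
print(at_nutritional_risk("none", False, True))     # True
print(at_nutritional_risk("none", False, False))    # False
```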

Nutrition Care Pathway: intervene with targeted nutrition

The intervention portion of the Nutrition Care Pathway includes assessment of nutritional status, diagnosis of malnutrition, and implementation of treatment. For nutrition assessment, the Subjective Global Assessment (SGA) is widely used for most adults,67 and the Mini Nutritional Assessment (MNA) is used for older persons;68 other tools are available.52 To facilitate malnutrition diagnosis and help standardize malnutrition care, experts from the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.) and the Academy of Nutrition and Dietetics (AND) defined specific criteria for malnutrition diagnosis.62 Guidelines support prompt intervention, namely, targeted nutrition therapy within 24 to 48 hours of admission.43-46 Implementation of treatment involves decisions on how much to feed, how and when to feed, and what to feed, as discussed in detail for the feedM.E. global initiative.1

Nutrition Care Pathway: supervene

The next step of the Nutrition Care Pathway is to supervene, or follow up with continuing attention to meeting nutrition needs. Individuals receiving nutrition therapy should also be monitored regularly to ensure feeding tolerance and adequate supplies of energy with sufficient protein, vitamins, and minerals.69 For those patients who are initially well-nourished, rescreening should occur at regularly determined intervals, especially when clinical status changes.70 An effective nutrition plan considers multiple aspects of care.43 It requires that the patient have cognitive competence, social and functional abilities, and economic access to food; alternatively, some patients need a caregiver and other social support programs to meet their needs. The nutrition plan should be prepared for and discussed with the patient, modified as needed to meet personal and cultural preferences, and include ongoing measures/assessment of the patient’s nutritional status.

To ensure best-practice nutrition care in the Middle East, we recommend continued efforts to prevent and treat malnutrition among patients who have been discharged from the hospital into long-term care centers or into the community. Such efforts include nutrition education for the patient or their caregivers and individualized dietary advice on the use of food enrichment and/or oral nutrition supplements. We also emphasize the importance of routine rescreening for malnutrition risk. We call on regional and local health authorities to endorse nutritional risk assessment as an integral part of routine medical care.

In conclusion, attention to nutrition is fundamental to good clinical practice. As members of the feedM.E. Middle East Group on nutrition in healthcare, we call healthcare providers in our region to action. To do so, we recommend use of the Nutrition Care Pathway that includes 3 key steps: screen always, intervene promptly when needed, and supervene or follow up routinely. 
Because of wide socioeconomic differences among Middle Eastern countries, we recognize that feedM.E. global strategies may need to be adapted to meet country-specific needs, and we propose testing pilot models for feedM.E. training in each country.

11.
12.

Objectives:

To assess knowledge, perceptions, and attitudes toward antimicrobial prescribing among physicians practicing in Riyadh, Saudi Arabia.

Methods:

A questionnaire was developed and distributed to physicians working in hospitals in Riyadh, Saudi Arabia between June and August 2013. The results were analyzed using Stata 12 software.

Results:

Two hundred and twelve (84.8%) full responses were returned. Most respondents perceived antimicrobial resistance as a significant problem in their daily practice (119, 56.1%) and at a national level (148, 69.8%). Inappropriate empirical therapy (101, 47.6%) and excessive use of antimicrobials in healthcare settings (66, 31.1%) were believed to be the main contributors to increasing bacterial resistance. Respondents favored treating infection rather than colonization (98, 46.2%) and physician education (74, 34.9%) as the most effective interventions to reduce antimicrobial resistance. Many respondents (95, 44.8%) did not feel confident in their knowledge of antimicrobial prescribing. Two-thirds of the respondents (135, 63.7%) had local antimicrobial guidelines, of whom 90 (66.7%) found them useful. Most respondents (160, 75.5%) considered their local infectious diseases service to be very helpful.
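Survey proportions like these are point estimates; their precision can be illustrated with a standard normal-approximation confidence interval. A sketch using one figure from the results above (95 of 212 respondents not confident in their prescribing knowledge); the interval itself is our addition, not something the abstract reports:

```python
import math

# Normal-approximation 95% confidence interval for a survey proportion.
# Illustrated with 95/212 (respondents lacking prescribing confidence).
def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Return (point estimate, lower bound, upper bound)."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of a proportion
    return p, p - z * se, p + z * se

p, lo, hi = proportion_ci(95, 212)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

The estimate of 44.8% carries an interval of roughly 38% to 52%, a useful reminder of the sampling uncertainty behind single-survey percentages.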

Conclusion:

There are considerable unmet training and education needs for physicians in the area of antimicrobial prescribing. Local antimicrobial guidelines need revision to ensure they are more relevant and helpful for medical practitioners.

The importance of judicious clinical use of antimicrobial agents and increasing rates of antimicrobial resistance have been the subject of numerous studies in the last decade.1-6 These studies involved either health care workers such as physicians, medical students, or pharmacists, or the general public. Several factors may contribute to inappropriate antimicrobial usage, including doctors’ knowledge and experiences, uncertain diagnosis, patients’ expectations, pharmaceutical marketing influences, and unregulated antibiotic dispensing.1 Despite continuous efforts to improve antimicrobial prescribing and address issues such as self-prescribing, unnecessary use for viral infections, dosing errors, and excessive treatment durations, rates of antimicrobial-resistant infections continue to rise globally.7-10 Investigators from different parts of the world have identified knowledge gaps regarding antimicrobial prescribing and growing concern among healthcare workers over increasing antimicrobial resistance.6,11,12 The development and implementation of wide-ranging educational programs for both physicians and the general population are among the commonly recommended strategies to help address those concerns.13 In Saudi Arabia, accurate, denominated antimicrobial prescribing data are not available. 
It is, however, important to note that antimicrobials are the third most commonly prescribed group of medications in the country.14 Furthermore, antibiotics are prescribed to 44-88% of patients who present to primary healthcare centers with upper respiratory tract infections (URTI).15 In dental practice, Al-Harthi et al16 found that healthcare workers believed that antimicrobials are excessively used and that their participants did not find hospital guidelines as helpful as other resources. Family and caretaker beliefs, especially among parents of young children, along with peer pressure, are also significant drivers of antimicrobial misuse.17 A better understanding of physicians’ knowledge, perceptions, and attitudes toward antimicrobial prescribing is essential for formulating effective antimicrobial stewardship programs. The objective of this study was to assess knowledge, perceptions, and attitudes in relation to antibiotic prescribing among physicians practicing in hospitals in Riyadh, Saudi Arabia.

13.

Aim

To assess the glucose tolerance of South Asian and Caucasian women with previous gestational diabetes mellitus (GDM).

Method

A retrospective follow‐up study of 189 women diagnosed with GDM between 1995 and 2001. Glucose tolerance was reassessed by oral glucose tolerance test at a mean duration since pregnancy of 4.38 years.

Results

South Asian women comprised 65% of the GDM population. Diabetes developed in 36.9% of the population, affecting more South Asian (48.6%) than Caucasian women (25.0%). Women developing diabetes were older at follow‐up (mean (SD) 38.8 (5.7) vs 35.9 (5.6) years; p<0.05) and had been heavier (body mass index 31.4 (6.3) vs 27.7 (6.7) kg/m2; p<0.05), more hyperglycaemic (fasting glucose 6.5 (1.7) vs 5.2 (1.1) mmol/l, p<0.01; 2‐hour glucose 11.4 (3.3) vs 9.6 (1.8) mmol/l, p<0.01; HbA1c 6.4 (1.0) vs 5.6 (0.7), p<0.01), and more likely to require insulin during pregnancy (88.1% vs 34.0%; p<0.01). Future diabetes was associated with and predicted by HbA1c taken at GDM diagnosis in both South Asian (odds ratio 4.09, 95% confidence interval 1.35 to 12.40; p<0.05) and Caucasian women (OR 9.15, 95% CI 1.91 to 43.87; p<0.01) as well as by previously reported risk factors of increasing age at follow‐up, pregnancy weight, increasing hyperglycaemia, and insulin requirement during pregnancy.

Conclusion

GDM represents a significant risk factor for future DM development regardless of ethnicity. Glycated haemoglobin values at GDM diagnosis have value in predicting future diabetes mellitus.

Gestational diabetes mellitus (GDM) is defined as abnormal carbohydrate tolerance that is diagnosed or first recognised in pregnancy1 and affects approximately 5% of pregnancies.2 However, the prevalence depends on the population studied and the diagnostic criteria used,3 with an increased frequency of GDM when less stringent diagnostic criteria are used and in ethnic groups who traditionally have a higher rate of type 2 diabetes.4,5,6 Differences in the prevalence of GDM reflect the background susceptibility of individual ethnic groups2,7 and possibly a different stage within the natural history of diabetes at the time of pregnancy.8

Previous GDM confers an increased risk of subsequent diabetes mellitus, such that 50% of women will have diabetes mellitus after 10 years.9,10 Several antenatal and maternal factors have been shown to predict this11,12,13 and identification of these during the screening of women with GDM may lead to more effective targeting of strategies for primary prevention of diabetes in local populations.3,14 Glycated haemoglobin (HbA1c), while convenient to measure, has little sensitivity in making the diagnosis of GDM15 and has been little studied as a risk marker for predicting future diabetes.

A number of studies have suggested that diabetes following GDM develops more rapidly in non‐Caucasian groups.5,16,17 A recent meta‐analysis, however, suggested that differences between the ethnic groups studied could largely be explained by standardising diagnostic criteria, duration of follow‐up and patient retention.18 The Leicestershire population consists of a significant minority of women from the Indian subcontinent who have higher rates of glucose intolerance both in and out of pregnancy.19 This study examined the development of glucose intolerance and its pregnancy 
associations in this ethnically mixed population.

14.

Objectives:

To elucidate the contribution of x-ray repair cross-complementing (XRCC) protein 1 399Gln, XRCC3 241M, and XRCC3-5’-UTR polymorphisms to the susceptibility of breast cancer (BC) in a Jordanian population.

Methods:

Forty-six formalin fixed paraffin embedded tissue samples from BC diagnosed female patients, and 31 samples from the control group were subjected to DNA sequencing. Samples were collected between September 2013 and December 2014.

Results:

The XRCC1 Arg399Gln genotype did not exhibit any significant correlation with the susceptibility of BC (odds ratio [OR]=1.45, 95% confidence interval [CI]: 0.60-3.51) (p=0.47). Likewise, XRCC3 M241T genotype did not show significant correlation with BC (OR=2.02, 95% CI: 0.50-8.21) (p=0.40). However, distribution of XRCC3-5’UTR (rs1799794 A/G) genotype showed a significant difference between the patient and control group (OR=0.73, 95% CI: 0.06-8.46) (p=0.02).
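Association results like those above come from odds ratios computed on 2x2 case-control tables. A generic sketch of the calculation with a normal-approximation 95% confidence interval on the log odds ratio; the counts below are invented for illustration and are NOT the study's genotype data:

```python
import math

# Odds ratio and 95% CI from a 2x2 case-control table.
# a = exposed cases, b = exposed controls,
# c = unexposed cases, d = unexposed controls.
# The example counts are hypothetical, not from the cited study.
def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(20, 10, 26, 21)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

When the interval spans 1 (as in this hypothetical example), the association is not statistically significant at the 5% level, which is how CIs such as those reported above for XRCC1 and XRCC3 are read.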

Conclusion:

The XRCC3-5’UTR (rs1799794) G allele frequency was higher in cancer patients while XRCC1 (rs25487) and XRCC3 (rs861539) did not show any significant correlation with susceptibility of BC in the selected Jordanian population. Contribution of other environmental factors should be studied in future works, as well as the response to cancer therapy.

Breast cancer (BC) incidence in Jordan has been estimated at 1,237 cases in 2012, with a prevalence of 4,260 cases over 5 years, and a mortality rate of up to 426 cases.1 Genetic predisposition contributes to less than 10% of BC cases, which raises a demand for further research into new genetic markers of BC risk.2 Fewer than 5% of BC cases have been found to be mutated at the breast cancer 1 (BRCA1) early onset and BRCA2 genes, and approximately 40% of familial BC families have been identified for genetic predisposition.3 Mammalian cells are habitually exposed to genotoxic agents, such as ionizing radiation, that can lead to DNA damage. Many double-strand break4 and single-strand break (SSB) repair proteins have been identified, including the RAD recombinases and the x-ray repair cross-complementing (XRCC) family proteins.5 Deficiency in these repair systems might contribute to cancer development through the loss of genetic integrity and genome instability.6 Mutation in DNA repair proteins is very rare.7 Therefore, many studies have been conducted to evaluate the role of allelic polymorphisms in DNA repair genes involved in cancer development.8,9 Genetic polymorphisms in the DNA repair genes XRCC1 and XRCC3 have been screened to find an association with the risk of BC.10-12 Studies have demonstrated an association between XRCC1 and XRCC3 polymorphisms and certain cancers, including colorectal cancer,13 lung cancer,14 pancreatic cancer,15 head and neck cancer,16 gastric cancer,17 esophageal cancer,18 melanoma skin cancer,19 oral squamous cell carcinomas,20 lung cancer risk,21 bladder cancer,22 and 
BC.23 Furthermore, a meta-analysis study supported the contribution of the XRCC1 Arg399Gln polymorphism to susceptibility of BC in the American population.24 On the other hand, no relationship has been found between XRCC1 and XRCC3 polymorphisms and the risk of BC,25 lung cancer,26 bladder cancer,27 prostate cancer,28 lung cancer risk,29 or cutaneous malignant melanoma;30 furthermore, these polymorphisms may decrease the risk of myeloblastic leukemia31 and non-melanoma skin cancer.32 Alcoholism, abortion, and non-breastfeeding have been associated with increased risk of BC with contribution of the XRCC1 399Gln and XRCC3 T241M polymorphisms.11 Moreover, family history,12 age group,33 polycyclic aromatic hydrocarbon-DNA adducts, fruit, vegetable, and antioxidant intake, and non-smoking status have been suggested to be associated with the risk of BC in interaction with XRCC1 or XRCC3 polymorphisms.34 The aim of the current study was to elucidate the contribution of the XRCC1 399Gln, XRCC3 241M, and XRCC3-5’-UTR polymorphisms to the susceptibility of BC in the Jordanian population. This study is intended to establish a reference point for future single nucleotide polymorphism (SNP) studies in the Jordanian population, which may contribute to the development of a national cancer database.

15.

Objectives:

To examine the relationship between the quality of marital relationship and anxiety among women with breast cancer (BC) in the Kingdom of Saudi Arabia (KSA).

Methods:

This cross-sectional study recruited a consecutive series of 49 married women with BC seen in the Al-Amoudi Breast Cancer Center of Excellence at King Abdulaziz University, Jeddah, KSA in early 2013. Participants completed the Hospital Anxiety and Depression Scale, Spouse Perception Scale, and Quality of Marriage Index forms, and answered questions on demographic and cancer characteristics.

Results:

Anxiety symptoms indicating “possible” anxiety disorder were present in 10.4% and “probable” anxiety disorder in 14.6% (25% total). No significant relationship was found between the quality of marital relationship and anxiety symptoms (B=-0.04, standard error=0.05, t=-0.81, p=0.42). Anxiety was primarily driven by low education, poor socioeconomic status, and young age.

Conclusion:

Anxiety symptoms are prevalent among married women with BC seen in a university-based clinic in the KSA. Further research is needed to determine whether a diagnosis of BC adversely affects marital relationship, and whether this is the cause for anxiety in these women.

Breast cancer (BC) is the most common cause of cancer death in women worldwide,1 and the Kingdom of Saudi Arabia (KSA) is no exception.2 Breast cancer has become a particular problem in Arab countries due to its late stage at presentation and its increased occurrence among young women.3 Both during and after treatment, even if the cancer goes into remission, concerns regarding recurrence, effect on the marital relationship, and frequent medical visits for monitoring, often result in high levels of anxiety (including post-traumatic stress-like symptoms).4-8 Anxiety and other mood symptoms are not benign in women with BC, as they are associated with increased mortality and cancer recurrence.9,10

Studies in Western countries (United States, Canada, England, Australia, and Germany) indicate a prevalence of significant anxiety ranging from 4-45% in BC patients, depending on anxiety measure, cutoff score, geographical region, and time since diagnosis11-14 (compared with 15-37% of cancer patients in general with anxiety during the first year after diagnosis).15 The most commonly used measure of anxiety symptoms in BC patients is the Hospital Anxiety and Depression Scale (HADS), which assesses for “possible” and “probable” anxiety disorder (with a sensitivity and specificity of approximately 80%).12,16,17 Using this measure, the prevalence of “probable” anxiety disorder in BC patients ranges from 2-23% and “possible” anxiety disorder is present in an additional 19-22% (21-45% combined).11,13,18 Although factors that increase risk of anxiety in women with BC are poorly understood, a few studies largely from Western countries report more symptoms in younger persons and Caucasians, immigrants, those with lower 
education, later disease stage, and lower social support.8,11,13,19 In one of the few studies from an Eastern country,20 anxiety levels among BC patients from Bangkok, Thailand, were significantly higher among those with poor problem solving skills, more pain and fatigue, and poorer family functioning. Although research is limited almost entirely to the US and other Western countries, studies indicate that support from a spouse (especially emotional support) improves the adjustment of women to BC,21-25 and may even impact survival.26 Not all studies, however, report that having a marital partner buffers against the stress of BC.27,28 The demands of caregiving, the effects of BC and its treatments on sexual relationship, and coping with psychological changes in a BC patient can all lead to lower well-being in a spouse, and decrease his ability to provide support.24 Our exhaustive review of the literature uncovered several studies that have examined the prevalence of emotional reactions to BC in the Middle East, finding that 19-73% of women had significant anxiety symptoms.22,29-34 In those studies, anxiety was associated with poorer physical functioning, the presence of metastatic disease, higher education, lower social support, duration of marriage, and spouse’s level of anxiety. With regard to KSA, there has been a significant increase in the incidence of BC, which occurs at a younger age than in Western countries.35 A recent review of research on coping with BC, however, revealed not a single study from KSA.36 Our review identified only 2 studies37,38 that examined the prevalence or correlates of anxiety in Saudi cancer patients (none specifically in BC), and only one study39 that examined attitudes of Saudi males toward BC. 
The first study examined anxiety in 30 hospitalized patients with cancer (9 with BC) at the King Khalid National Guard Hospital in Jeddah, KSA.37 Researchers found that anxiety symptoms assessed by the Hamilton Anxiety Scale were significantly higher in cancer patients compared with 39 patients with a range of chronic illnesses; 3 patients with cancer (10%) had a clinical diagnosis of generalized anxiety disorder based on Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) criteria. The second study examined non-pain symptoms in 124 cancer patients (27% with BC) at King Faisal Specialist Hospital in Riyadh, KSA.38 The most frequently reported non-pain symptoms were fatigue (80%), loss of appetite (72%), dry mouth (69%), and anxiety (61%). Finally, researchers examined attitudes toward BC among males accompanying female patients to outpatient clinics at King Abdulaziz Hospital in Jeddah, KSA. When men were asked what they would do if their wives were diagnosed with BC, 9.4% said they would leave their wives.39

Given the current knowledge gap on this subject in KSA, we decided to: 1) determine the prevalence of anxiety symptoms in married women seen in an urban-based university outpatient clinic in Jeddah; 2) identify the correlates of anxiety symptoms (especially marital quality [MQ]); and 3) determine whether the relationship between MQ and anxiety differed between Saudi nationals and immigrants. We hypothesized that anxiety symptoms would be prevalent, that higher MQ would be strongly and inversely related to anxiety symptoms, and that this relationship would be particularly strong in women who were Saudi nationals (where cultural factors might have the most influence).

16.
Yu W  Li W  Wang Z  Ye X  Li N  Li J 《Postgraduate medical journal》2007,83(977):187-191

Objective

Percutaneous transhepatic gallbladder drainage (PTGD) was compared with endoscopic treatment (within 72 hours after the onset of symptoms) in patients with severe biliary pancreatitis to evaluate the effectiveness of PTGD in preventing biliary tract complications in these patients.

Methods

Eligible patients were randomised to receive early treatment with PTGD or endoscopic treatment. If the initial emergency endoscopic or PTGD treatment failed, patients received another drainage treatment within 24 hours. From November 2001 to August 2005, 101 patients were randomly assigned to early PTGD (n = 51) or endoscopic retrograde cholangiopancreatography (ERCP) (n = 50). Overall mortality, mortality due to pancreatitis and complications were compared in these two groups.

Results

48 of 52 patients were successfully treated with ERCP and 53 of 55 with PTGD. Seven patients (6.9%; three in the endoscopic treatment group and four in the PTGD group) died within four months after the onset of pancreatitis (p = 0.798); three patients in the endoscopic group and three in the PTGD group died from acute biliary pancreatitis. The overall rate of complications was similar in the two groups and there were no major differences in the incidence of local or systemic complications.

Conclusions

PTGD treatment is a simple, convenient, and effective treatment of severe gallstone-associated acute pancreatitis when endoscopic treatment fails.

In China, gallstones account for approximately 50–70% of cases of acute pancreatitis.1,2 The pathogenesis of acute biliary pancreatitis is not fully understood and may be multifactorial.3,4,5,6,7,8 Most gallstones that initiate an episode of acute pancreatitis pass spontaneously through the ampulla of Vater into the duodenum and can subsequently be recovered in the faeces within a few days. It is uncertain whether or not early surgical or endoscopic removal of gallstones is beneficial in mild gallstone-associated acute pancreatitis (GAP).9,10 However, early endoscopic retrograde cholangiopancreatography (ERCP) with or without endoscopic sphincterotomy has been advocated to reduce complications in patients with a severe attack of GAP.9,10,11,12

Although early endoscopic intervention is the procedure of choice in patients with stone impaction and cholangitis, some patients are unable to tolerate endoscopic intervention, especially those with early severe pancreatitis who have developed organ failure within 72 hours after the onset of pain.13 Therefore, it is not sufficient to rely solely on endoscopic treatment to relieve biliary obstruction in severe GAP. We need to find alternatives for those patients who fail to be cured by endoscopic treatment.

Percutaneous transhepatic gallbladder drainage (PTGD) has been widely accepted as an alternative to operative decompression in critically ill patients with cholecystitis or cholangitis, especially in the elderly.14,15,16 However, little is known about the factors affecting the outcome of PTGD in severe GAP. In clinical practice, we have been successful in using PTGD in some patients who were unable to receive endoscopic intervention and had complicating obstructive jaundice. 
In this study, we have performed a prospective, randomised trial to compare early PTGD (within 72 hours after the onset of symptoms) with endoscopic treatment in patients with severe GAP.

17.
This article describes the algorithms implemented in the Essie search engine, which currently serves several Web sites at the National Library of Medicine. Essie is a phrase-based search engine with term and concept query expansion and probabilistic relevancy ranking. Essie’s design is motivated by the observation that query terms are often conceptually related to terms in a document without actually occurring in the document text. Essie’s performance was evaluated using data and standard evaluation methods from the 2003 and 2006 Text REtrieval Conference (TREC) Genomics track. Essie was the best-performing search engine in the 2003 TREC Genomics track and achieved results comparable to those of the highest-ranking systems on the 2006 TREC Genomics track task. Essie shows that a judicious combination of exploiting document structure, phrase searching, and concept-based query expansion is a useful approach for information retrieval in the biomedical domain.

A rapidly increasing amount of biomedical information in electronic form is readily available to researchers, health care providers, and consumers. However, readily available does not mean conveniently accessible. The large volume of literature makes finding specific information ever more difficult. Development of effective search strategies is time consuming,1 requires experienced and educated searchers2 well versed in biomedical terminology,3 and is beyond the capability of most consumers.4 Essie, a search engine developed and used at the National Library of Medicine, incorporates a number of strategies aimed at alleviating the need for sophisticated user queries. These strategies include a fine-grained tokenization algorithm that preserves punctuation, concept searching utilizing synonymy, and phrase searching based on the user’s query. This article describes related background work, the Essie search system, and the evaluation of that system.
The Essie search system is described in detail, including its indexing strategy, query interpretation and expansion, and ranking of search results.
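As a rough illustration of two of the techniques the abstract names, concept-based query expansion via synonymy and phrase matching with relevance ranking, the following Python sketch shows the general idea. The synonym table, weights, and documents are invented for demonstration; this is not Essie's actual data or algorithm.

```python
# Toy sketch of concept expansion + phrase-based scoring (not NLM's Essie).
# A query phrase is expanded to all phrases naming the same concept; an
# exact phrase match is weighted above a synonym match.

SYNONYMS = {
    "heart attack": {"heart attack", "myocardial infarction"},
    "cancer": {"cancer", "neoplasm", "tumor"},
}

def expand(query: str) -> set[str]:
    """Map a query phrase to the set of phrases naming the same concept."""
    q = query.lower()
    for variants in SYNONYMS.values():
        if q in variants:
            return variants
    return {q}

def score(doc: str, query: str) -> float:
    """Score a document; the exact query phrase outranks any synonym."""
    text = doc.lower()
    best = 0.0
    for phrase in expand(query):
        if phrase in text:
            best = max(best, 1.0 if phrase == query.lower() else 0.8)
    return best

docs = [
    "Risk factors for myocardial infarction in older adults.",
    "Exercise and heart attack prevention.",
    "Dietary fiber and colon health.",
]
ranked = sorted(docs, key=lambda d: score(d, "heart attack"), reverse=True)
```

A document containing only the synonym ("myocardial infarction") still matches the query "heart attack", but ranks below a document containing the exact phrase, mirroring the abstract's point that relevant documents often use conceptually related terms rather than the query terms themselves.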

18.
Pemphigus is a group of immune-mediated bullous disorders that cause fragile blisters and extensive lesions of the skin or mucous membranes, such as in the mouth. The disease can be life-threatening in some cases. During pregnancy the condition becomes more complicated because of changes in maternal hormone levels and the effects of drug therapy on both the mother and her fetus, making the clinical manifestations harder to identify and the treatment plan harder to establish. In this article, we present a comprehensive review of pemphigus and pregnancy by analyzing 47 cases of pemphigus reported between 1966 and 2014 with diagnosis before or during pregnancy. The aim of this study is to provide organized and reliable information for obstetricians, dermatologists, physicians, and oral medicine specialists.

Pemphigus is characterized by widely distributed bullae and erosions on the skin and mucous membranes. There are mainly 3 types of pemphigus: pemphigus vulgaris (PV), pemphigus foliaceus (PF), and other variants.1,2 The pathogenesis of pemphigus is associated with autoantibodies directed against transmembrane glycoproteins of desmosomes, which cause steric hindrance to homophilic adhesion of desmogleins and result in the formation of Dsg1-depleted desmosomes in PF and Dsg3-depleted desmosomes in PV.3,4 Pemphigus usually affects the elderly, and genetics plays an important role in predisposition.5,6 Pemphigus can involve one or more mucosae, and PV often shows extensive lesions of the oral mucosa.7,8 When it occurs in pregnancy, the condition becomes more complex.9 Early diagnosis and individually adjusted therapy are needed to avoid any risk to mother or child.10 The purpose of this article is to present a comprehensive review of pemphigus and pregnancy and to provide organized and reliable information for clinicians.

Basic demographics

The existing research mainly consists of case reports and retrospective studies. References were retrieved with the electronic search strategy “(pemphigus [MeSH Terms]) AND pregnancy [MeSH Terms] Filters: Case Reports” on PubMed, and a total of 62 cases were reviewed. Of the 62 cases, 14 were excluded on the basis of the abstract, which indicated discussion of gestational pemphigoid, and 7 were excluded because they were not in English. Finally, we included 41 relevant case reports according to their titles and abstracts. These 41 case reports, published between 1966 and 2014, involved 47 women identified with pemphigus before (n=21) or during (n=26) pregnancy. Cases of pemphigus and pregnancy have been reported in different populations, with more in Asia, Europe, and North America than in Africa, South America, and Oceania (Figure 1). A recent study from the United Kingdom suggested an incidence of PV of 0.68 cases per 100,000 persons per year. The incidence varies by region, PV being more common in the Near and Middle East than in Western Europe and North America.11-14

Figure 1. Regional distribution of 47 cases of pemphigus and pregnancy between 1966 and 2014.

We analyzed the characteristics of the 21 patients with pemphigus diagnosed before pregnancy. Among them, 71.4% were diagnosed with PV, 19% with PF, and 4.8% with pemphigus vegetans, while the remainder were indefinite. The age of onset of pemphigus was generally 20-42 years (mean 27.35±5.73), with a mean interval of 3.16±2.11 years between disease onset and pregnancy. During pregnancy the pemphigus course was characterized by exacerbation (61.9%), improvement (9.5%), or stability (28.6%). Newborn status is central to our conclusions: the incidence of neonatal pemphigus was as high as 57.1% (38.1% PV and 19% PF), whereas only 33.3% of neonates were healthy, a figure that may reflect publication bias.15-31
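The quoted PubMed strategy can also be issued programmatically through NCBI's E-utilities esearch endpoint. The sketch below only builds the request URL rather than fetching it; translating the interface's "Case Reports" filter into a [Publication Type] tag is our assumption, not the authors' recorded query.

```python
# Build an NCBI E-utilities esearch URL for the review's PubMed strategy.
# The URL is constructed but not fetched; retrieving it returns matching
# PMIDs as XML.  The exact query-string translation of the "Case Reports"
# filter is an assumption for illustration.
from urllib.parse import urlencode

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_url(term: str, retmax: int = 100) -> str:
    """Return an esearch URL for a PubMed query string."""
    params = {"db": "pubmed", "term": term, "retmax": retmax}
    return BASE + "?" + urlencode(params)

# MeSH-based strategy from the text, restricted to case reports:
term = ("pemphigus[MeSH Terms] AND pregnancy[MeSH Terms] "
        "AND case reports[Publication Type]")
url = esearch_url(term)
```

Fetching the resulting URL (e.g. with urllib.request) would return the PMIDs whose titles and abstracts could then be screened, mirroring the manual inclusion/exclusion process described above.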

Table 1

Characteristics of 21 patients with pemphigus diagnosed between 1966 and 2013.

Pregnancy as a triggering factor of PV seems to be quite a rare phenomenon.4,13,14,19,32-52

Table 2

Characteristics of 26 patients with pemphigus diagnosed during pregnancy between 1966 and 2013.

Effects on the mother

A pregnant woman who falls ill (for example, with pemphigus) is more likely to suffer from disorders of the neuroendocrine and immune systems because of the state of high stress.53 According to the current literature, the mother’s condition may exacerbate, enter remission, or remain stable during pregnancy.54 The disease is most likely to be aggravated during the first and second trimesters and postpartum, and to be relieved during the third trimester.15 This may be due to the increased level of endogenous corticosteroid hormones produced by the chorion and the subsequent immunosuppression.40,55 Although some reports describe postpartum flares of pemphigus due to the rapid drop in corticosteroid hormone levels, the postpartum status in our study was favorable: only 2 cases (9.5%) of pemphigus diagnosed before pregnancy and 8 cases (30.8%) of pemphigus diagnosed during pregnancy exacerbated after delivery.19,56 However, some patients with pemphigus during pregnancy may not show any obvious changes, especially those in remission.9,15

Effects on the mode of delivery

Goldberg et al32 and Fainaru et al14 indicated that the trauma of vaginal delivery can extend and worsen existing lesions. Cesarean section carries increased surgical risk in patients receiving long-term steroid therapy, and both the disease itself and corticosteroid therapy may complicate wound healing. Delivery by cesarean section therefore offers no additional benefit, and in the absence of obstetric contraindications vaginal delivery is recommended. Although there is a potential risk that local blisters may result in passive transfer of antibodies to the infant through breast milk, breastfeeding is not contraindicated.

Effects on the pregnancy outcome

Pregnancy outcomes include live birth, stillbirth, spontaneous abortion, and induced abortion.57 Pemphigus vulgaris in pregnancy may result in abortion, fetal growth retardation, intrauterine death, premature delivery, and neonatal PV in approximately 30% of newborns.58 In this article, we discuss the 3 most common outcomes of pemphigus in pregnancy: normal fetal outcome, neonatal pemphigus, and stillbirth.

Normal fetal outcome

Most patients with pemphigus can give birth to a normal full-term, healthy newborn through vaginal delivery or cesarean section, depending on the collaborative efforts of the dermatologist and obstetrician.56 In our study, although only 7 (33.3%) healthy neonates were born to patients diagnosed with pemphigus before pregnancy, we consider this likely to be an underestimate, because successful deliveries are reported less often than adverse neonatal outcomes.

Neonatal pemphigus

Neonatal pemphigus is a rarely reported, transitory autoimmune blistering disease. It is clinically characterized by transient flaccid blisters and erosions on the skin and, rarely, on the mucous membranes.17 The disease is self-limiting within 2-3 weeks without special treatment and has no long-term clinical significance: no new vesicles or bullae appear in the newborn after birth, and neonatal PV has never been reported to persist beyond the neonatal period or progress to adult disease.17,34,35,39 Neonatal pemphigus is mainly due to transplacental transmission of antibodies; only a very small amount of immunoglobulin G (IgG) is synthesized by the neonate itself.36,59 Pemphigus IgG is found both in the fetal circulation and fixed to the fetal epidermis in a characteristic intercellular distribution, while IgA, IgM, IgE, and IgD generally do not participate in the passive transport.60 In contrast to PV, PF in pregnant women rarely leads to neonatal skin lesions.61 The absence of skin disease in these newborns may be due to low placental transfer of IgG4 autoantibodies and to the “immunosorbent” effect of the placenta, which contains desmosomes and desmogleins.62-65 This reflects differences in the distribution and cross-compensation of the pemphigus antigens desmoglein 3 and desmoglein 1 between neonatal and adult skin and mucosa.60

Stillbirth

In the literature, the rate of stillbirth in pemphigus during pregnancy has been reported to range from 1.4-27%.18,33,56,66 In contrast to the high percentages in some previous observations, pregnancy ended in stillbirth in only one case (4.8%) of pemphigus diagnosed before pregnancy and 2 cases (7.7%) of pemphigus diagnosed during pregnancy in our study. The occurrence of stillbirth nonetheless emphasizes the management problems encountered when a patient with pemphigus becomes pregnant.56,66 No relationship has been demonstrated between maternal treatment regimen and fetal outcome.38,67 Rather than particular medications, adverse pregnancy outcomes seem to correlate more closely with poor maternal disease control and with higher antibody titers in maternal serum and umbilical cord blood.38

Treatment options

Almost all patients with pemphigus experience severe worsening of the disease after delivery if it is left untreated during pregnancy.66 Treatment is often required to control both maternal disease and fetal outcome.68 Current studies suggest that standard therapy gives priority to systemic glucocorticoids, alone or in combination with other agents such as immunosuppressants, intravenous immunoglobulin (IVIg), or plasmapheresis.15,32,69 If the disease worsens during the first trimester, medical termination of pregnancy may be considered; if it worsens during the second or third trimester, corticosteroids are a safe treatment.20 The treatment of patients diagnosed with pemphigus during pregnancy is similar to the treatment of those diagnosed before pregnancy.38

Glucocorticoids

The use of systemic steroids is considered safe in pregnancy, and glucocorticoids remain the first-line agents, used at low dosages in mildly ill patients.70 Corticosteroids such as prednisone (FDA pregnancy category B), which act quickly and have a strong pharmacological effect, can be used safely as immunosuppressive drugs during pregnancy because they do not readily cross the placenta. Prednisone is the safest choice compared with less commonly used glucocorticoids such as dexamethasone and betamethasone.71,72 The dose of prednisone/prednisolone should be reduced to the lowest effective dose; standardized dosing remains to be established.15,19,32,56

Immunosuppressants

Steroid-sparing immunosuppressants are needed when pemphigus would otherwise have to be controlled with larger doses of glucocorticoids. Azathioprine is the most widely used steroid-sparing agent for pemphigus.73,74 Cyclosporine is believed to be less effective in the treatment of pemphigus, but it is the safest corticosteroid-sparing agent in pregnancy.38,69 Mycophenolate mofetil, cyclophosphamide, and methotrexate are strongly discouraged or even contraindicated in pregnancy.38,72

Intravenous immunoglobulin

There is moderate evidence that IVIg is effective and safe as an adjuvant therapy in pregnant patients with pemphigus.75-77 Therefore, when pregnancy is associated with significant medical problems or disease states, clinicians may need to consider using IVIg early.78

Plasmapheresis

Plasmapheresis is a useful alternative immunomodulatory therapy in pregnancy; it can be used as adjuvant therapy combined with systemic corticosteroids, reducing the required glucocorticoid dose.21

In conclusion, patients may suffer from pemphigus before or during pregnancy. Pemphigus and pregnancy can interact with each other, making treatment and prognosis more complicated and presenting challenges for the clinician. Pregnancy may precipitate or aggravate pemphigus, and newborn babies of such patients may have a normal outcome, neonatal pemphigus, or, rarely, stillbirth may occur. Current treatment of pemphigus coexisting with pregnancy prioritizes systemic glucocorticoids, alone or in combination with other agents such as immunosuppressants, IVIg, or plasmapheresis. The number of reported cases of pemphigus in pregnancy is too small to predict the course for an individual patient. In summary, pemphigus in pregnancy remains an ill-defined area that requires collaborative work by obstetricians, dermatologists, neonatologists, endocrinologists, and oral medicine specialists to establish multidisciplinary treatment.

19.
Should face transplants be undertaken? This article examines the ethical problems involved from the perspective of the recipient, looking particularly at the question of identity, the donor and the donor's family, and the disfigured community and society more generally. Concern is expressed that full face transplants are going ahead.

So it has happened. According to the New Scientist,i the case was made out by a team in Louisville, Kentucky, USA, in 2004,1 and the first partial face transplant was performed in Lyon, France, in November 2005 on a woman whose face was mutilated by a dog.2ii Full face transplants are to go ahead at the Royal Free Hospital in Hampstead, North London, UK.3iii It has been called “the boldest cut”4 and “a glorified sewing job”.4 The Royal College of Surgeons in the UK5,6 and the Comité Consultatif National d'Ethique pour les Sciences de la Vie et de la Santé in France7 have cautioned against it. But even before the flurry of activity in late 2005, it was clear that this latest venture in transplantation technology would proceed: there were statements of intent—a prospectus, almost—from clinicians and researchers, most notably from a University of Louisville team.1 It is sometimes suggested that face transplants are morally analogous to limb transplants.1,8 There have been hand transplants—some 20 since the first was performed on Clint Hallam in 1998.9,10 Many of the problems are the same, but the ethical dilemmas that surround face transplants are, arguably, of a different dimension.11 And caution is rightly being urged in the case of hand transplants too.12,13 There must be concern that scientific advance is being pressed before legal frameworks can be established, and before ethicists have debated the morality of the surgery.iv There may also be a danger that clinicians will seek to legitimate what they propose to do by finding justifications from ethicists on whom they can rely—even if no one else does.v This paper
considers the ethical issues of face transplants. Another paper will explore some of the legal questions. In considering the ethical issues, it is important to separate those that principally concern the recipient from those that affect the donor and the donor's family. But we must not ignore the interests of the disfigured community and those of society at large, and consideration will be given to those two interest groups as well.

20.

Objective

Although demand for information about the effectiveness and efficiency of health care information technology grows, large-scale resource-intensive randomized controlled trials of health care information technology remain impractical. New methods are needed to translate more commonly available clinical process measures into potential impact on clinical outcomes.

Design

The authors propose a method for building mathematical models based on published evidence that provides an evidence bridge between process changes and resulting clinical outcomes. This method combines tools from systematic review, influence diagramming, and health care simulations.

Measurements

The authors apply this method to create an evidence bridge between retinopathy screening rates and incidence of blindness in diabetic patients.

Results

The resulting model uses changes in eye examination rates and other evidence-based population parameters to generate clinical outcomes and costs in a Markov model.

Conclusion

This method may serve as an alternative to more expensive study designs and provide useful estimates of the impact of health care information technology on clinical outcomes through changes in clinical process measures.

The announcement1 and reaffirmation2 of the federal commitment to advancing health care information technology (HIT) has been further bolstered by events in the Gulf South after the recent hurricane seasons.3 This commitment creates both opportunities and challenges for health services and clinical informatics researchers. Clinicians, policy makers, lobbyists, economists, and the media demand evidence-based recommendations for HIT. To make decisions that will affect millions of lives and billions of dollars, decision makers require more than efficacy studies—they require results that indicate both the effectiveness and the efficiency of HIT solutions. The ability of the informatics research community to respond to this need with useful and credible evidence will determine our relevance to the debate.

Many evaluations of health services focus primarily on process measures.4 For example, numerous studies in the disease management literature report the impact of technology on the rate of annual eye or foot examinations for diabetic patients.5-14 However, few published studies evaluate HIT’s impact on the rate of blindness or amputations. Despite the increasing demand for credible clinical outcomes evidence, many studies in HIT lack the power to detect changes in clinical outcomes, a product of limited time and resources.15 In addition, the rapid evolution of new technologies makes the study subject itself, HIT, a moving target: by the time a large-scale trial is completed, the state of the art will have moved on.16 Evaluations in HIT therefore tend to be relatively brief studies comparing convenient measures, more often made in a laboratory environment, or in potentially idiosyncratic academic environments, than in real-world clinical settings, thus limiting generalizability. These studies would be classified by Fuchs and Garber17 as stage 1 and 2 technology assessments—evaluating the performance of the technology itself and perhaps its impact on processes of care. However, the demand for outcomes evidence mandates that future HIT research be conducted at the level of stage 3 technology assessments, in which comprehensive clinical, economic, and social outcomes are evaluated to determine both the effectiveness and efficiency of the intervention. The importance of linking process measures to clinical outcomes has been described previously, but progress has been limited.18 We propose an approach to maximize the ability of HIT evaluation research to report clinical and financial outcomes.
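The evidence-bridge idea, translating a change in a process measure (screening rate) into an estimated change in a clinical outcome (blindness), can be sketched as a small cohort Markov model. All transition probabilities below are invented placeholders for illustration, not the authors' published parameters.

```python
# Hedged sketch of a cohort Markov model linking retinopathy-screening
# rates to cumulative blindness.  States: healthy -> retinopathy -> blind.
# Screening detects retinopathy, and detected (treated) cases progress to
# blindness at a lower annual rate.  All probabilities are assumptions.

def simulate(screen_rate: float, years: int = 20,
             cohort: float = 10_000.0) -> float:
    """Run annual Markov cycles; return cumulative cases of blindness."""
    p_onset = 0.04            # healthy -> retinopathy per year (assumed)
    p_blind_untreated = 0.05  # undetected retinopathy -> blind (assumed)
    p_blind_treated = 0.01    # detected, treated retinopathy -> blind (assumed)

    healthy, sick, blind = cohort, 0.0, 0.0
    for _ in range(years):
        new_sick = healthy * p_onset
        detected = sick * screen_rate
        undetected = sick - detected
        new_blind = detected * p_blind_treated + undetected * p_blind_untreated
        healthy -= new_sick
        sick += new_sick - new_blind
        blind += new_blind
    return blind

# Compare two screening rates, e.g. before and after an HIT intervention:
low = simulate(screen_rate=0.4)
high = simulate(screen_rate=0.8)
```

Comparing `low` and `high` gives the modeled reduction in blindness attributable to the improved screening rate; with costs attached to each state, the same structure yields the economic outcomes the abstract describes.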
