Similar Articles
20 similar articles found (search time: 31 ms)
1.
Medical error reduction is an international issue, as is the implementation of patient care information systems (PCISs) as a potential means to achieving it. As researchers conducting separate studies in the United States, The Netherlands, and Australia, using similar qualitative methods to investigate implementing PCISs, the authors have encountered many instances in which PCIS applications seem to foster errors rather than reduce their likelihood. The authors describe the kinds of silent errors they have witnessed and, from their different social science perspectives (information science, sociology, and cognitive science), they interpret the nature of these errors. The errors fall into two main categories: those in the process of entering and retrieving information, and those in the communication and coordination process that the PCIS is supposed to support. The authors believe that with a heightened awareness of these issues, informaticians can educate, design systems, implement, and conduct research in such a way that they might be able to avoid the unintended consequences of these subtle silent errors.

Medical error reduction is an international issue. The Institute of Medicine's report on medical errors1 dramatically called attention to dangers inherent in the U.S. medical care system that might cause up to 98,000 deaths in hospitals and cost approximately $38 billion per year. In the United Kingdom, the chief medical officer of the newly established National Patient Safety Agency estimates that “850,000 incidents and errors occur in the NHS each year.”2 In The Netherlands, the exact implications of the U.S. figures for the Dutch health care scene are much debated. There as well, however, patient safety is on its way to becoming a political priority.
Medication errors alone have been estimated to cause 80,000 hospital admissions per year in Australia, costing $350 million.3

In much of the literature on patient safety, patient care information systems (PCISs) are lauded as one of the core building blocks for a safer health care system.4 PCISs are broadly defined here as applications that support the health care process by allowing health care professionals or patients direct access to order entry systems, medical record systems, radiology information systems, patient information systems, and so on. With fully accessible and integrated electronic patient records, and with instant access to up-to-date medical knowledge, faulty decision making resulting from a lack of information can be significantly reduced.5 Likewise, computerized provider order entry (CPOE) systems and automated reminder systems can reduce errors by eliminating illegible orders, improving communication, improving the tracking of orders, checking for inappropriate orders, and reminding professionals of actions to be undertaken. In this way, these systems can contribute to preventing under-, over-, or misuse of diagnostic or therapeutic interventions.6,7,8 Among the broad array of health informatics applications, CPOE systems, and especially medication systems, have received the most attention.9,10,11,12

PCISs are complicated technologies, often encompassing millions of lines of code written by many different individuals. The interaction space13 within which clinicians carry out their work can also be immensely complex, because individuals can execute their tasks by communicating across rich social networks.
When such technologies become an integral part of health care work practices, we are confronted with a large sociotechnical system in which many behaviors emerge out of the sociotechnical coupling, and the behavior of the overall system in any new situation can never be fully predicted from the individual social or technical components.13,14,15,16,17

It is not surprising, therefore, that authors have started to describe some of the unintended consequences that the implementation of PCISs can trigger.18 For instance, professionals could trust the decision support suggested by the seemingly objective computer more than is actually called for.15,19 Also, PCISs could impose additional work tasks on already heavily burdened professionals,20,21 and the tasks are often clerical and therefore economically inefficient.17 They can upset smooth working relations and communication routines.13,22 Also, given their complexity, PCISs could themselves contain design flaws “that generate specific hazards and require vigilance to detect.”23,24 As a consequence, PCISs might not be as successful in preventing errors as is generally hoped. Worse still, PCISs could actually generate new errors.25(p.511),26,27

It is obvious that PCISs will ultimately be a necessary component of any high-quality health care delivery system. Yet, in our research in three different countries, we have each encountered many instances in which PCIS applications seemed to foster errors rather than reduce their likelihood. In health care practices in the United States, Europe, and Australia alike, we have seen situations in which the system of people, technologies, organizational routines, and regulations that constitutes any health care practice seemed to be weakened rather than strengthened by the introduction of the PCIS application.
In other words, we frequently observed instances in which the intended strengthening of one link in the chain of care actually led unwittingly to a deletion or weakening of others.

We argue that many of these errors are the result of highly specific failures in PCIS design and/or implementation. We do not focus on errors that are the result of faulty programming or other technical dysfunctions. Hardware problems and software bugs are more common than they should be, especially in a high-risk field such as medicine. However, these problems are well known and can theoretically be dealt with through testing before implementation. Similarly, we do not discuss errors that are the result of obvious individual or organizational dysfunctioning such as a physician refusing to seek information in the computer system “because that is not his task,” or a health care delivery organization cutting training programs for a new PCIS for budgetary reasons.

We do focus on those often latent or silent errors that are the result of a mismatch between the functioning of the PCIS and the real-life demands of health care work. Such errors are not easily found by a technical analysis of the PCIS design, or even suspected after the first encounter with the system in use. They can only emerge when the technical system is embedded into a working organization and can vary from one organization to the next. Yet, in failing to take seriously some by now well-recognized features of health care work, some PCISs are designed or implemented in such a way that error can arguably be expected to result. Only when thoughtful consideration is given to these issues, we argue, will PCISs be able to fulfill their promise.

2.
Objective: To determine the availability of inpatient computerized physician order entry in U.S. hospitals and the degree to which physicians are using it.
Design: Combined mail and telephone survey of 964 randomly selected hospitals, contrasting 2002 data and results of a survey conducted in 1997.
Measurements: Availability: computerized order entry has been installed and is available for use by physicians; inducement: the degree to which use of computers to enter orders is required of physicians; participation: the proportion of physicians at an institution who enter orders by computer; and saturation: the proportion of total orders at an institution entered by a physician using a computer.
Results: The response rate was 65%. Computerized order entry was not available to physicians at 524 (83.7%) of 626 hospitals responding, whereas 60 (9.6%) reported complete availability and 41 (6.5%) reported partial availability. Of 91 hospitals providing data about inducement/requirement to use the system, it was optional at 31 (34.1%), encouraged at 18 (19.8%), and required at 42 (46.2%). At 36 hospitals (45.6%), more than 90% of physicians on staff use the system, whereas six (7.6%) reported 51–90% participation and 37 (46.8%) reported participation by fewer than half of physicians. Saturation was bimodal, with 25 (35%) hospitals reporting that more than 90% of all orders are entered by physicians using a computer and 20 (28.2%) reporting that less than 10% of all orders are entered this way.
Conclusion: Despite increasing consensus about the desirability of computerized physician order entry (CPOE) use, these data indicate that only 9.6% of U.S. hospitals presently have CPOE completely available. In those hospitals that have CPOE, its use is frequently required.
In approximately half of those hospitals, more than 90% of physicians use CPOE; in one-third of them, more than 90% of orders are entered via CPOE.

In an editorial in American Medical News, legibility, remote access, and the potential “to make users better doctors” were described as the upsides of computerized physician order entry (CPOE) use, but the downsides of typing, system rigidity, and time were cited as making implementation of CPOE systems a highly controversial topic.1 We define CPOE as a process that allows a physician to use a computer to directly enter medical orders. Physicians are not the only members of the health care team who might enter orders into a computerized system, but they are the focus of this particular study. Hospitals are being encouraged by outside forces to implement CPOE in an effort to reduce medical errors. We conducted a survey in 1997, with results published in 1998,2 to discover what percentage of U.S. hospitals had CPOE at that time and to determine how heavily used it was in hospitals that had it. We found that one-third of hospitals claimed to have CPOE available but that it was little used at these sites.
An earlier survey with a small response rate had found that 20% of surveyed institutions had CPOE,3 and a study published in 2000 that was limited to inpatient medication ordering by physicians reported that less than 10% of hospitals or health systems had such systems.4 A survey of hospital information systems in Japan discovered that order-entry systems for laboratory, imaging, and pharmacy were available at fewer than 20% of reporting hospitals, but this was not necessarily physician order entry.5 A 2003 report by the Leapfrog Group (a coalition of public and private organizations founded by the Business Roundtable, which is an association of chief executive officers of Fortune 500 companies) stated that 4.1% of the reporting hospitals in a recent survey had CPOE fully implemented,6 but the sample was primarily limited to certain demographics. During the five years since the results of our last survey were published, there have been numerous publications about the benefits of CPOE7,8,9,10 and about some of the difficulties encountered by hospitals implementing it.11,12,13 Several governmental agencies and other bodies such as the Leapfrog Group have made efforts to encourage CPOE use.14,15,16 To aid organizations during planning and implementation, a number of guides and manuals have been published as well.17,18,19,20,21 Although much attention is being focused on CPOE, no recent nationwide figures on hospital installations have been published. Therefore, we decided to send the same survey to the same sample population in 2002 that we did in 1997. The questions to be addressed here are: how widespread is the implementation of CPOE in hospitals across the United States, where is it available, and how much is it used?

3.
4.
There are constraints embedded in medical record structure that limit use by patients in self-directed disease management. Through systematic review of the literature from a critical perspective, four characteristics that either enhance or mitigate the influence of medical record structure on patient utilization of an electronic patient record (EPR) system have been identified: environmental pressures, physician centeredness, collaborative organizational culture, and patient centeredness. An evaluation framework is proposed for use when considering adaptation of existing EPR systems for online patient access. Exemplars of patient-accessible EPR systems from the literature are evaluated utilizing the framework. From this study, it appears that traditional information system research and development methods may not wholly capture many pertinent social issues that arise when expanding access of EPR systems to patients. Critically rooted methods such as action research can directly inform development strategies so that these systems may positively influence health outcomes.

Electronic patient record (EPR) systems fundamentally change the way health information is structured. An EPR is a dynamic entity, affording greater efficiency and quality control to the work processes of clinicians by providing data entry at the point of care, logical information access capabilities, efficient information retrieval, user friendliness, reliability, information security, and a capacity for expansion as needs arise.1,2 An EPR system promotes patient participation in care to a greater extent than paper records because of its capacity for interaction.
Patients can transmit real-time vital signs and other forms of data from their bedside, home, or office and receive up-to-date supportive information customized and contextualized to their individual needs.3,4

In this journal, Ross and Lin recently presented a comprehensive review of the world literature on the effects of patient access to medical records, noting a potential for modest benefits and minimal risk, while also citing that the impact of access may vary depending on the patient population in question.5 This is consistent with findings in the information system literature that systems fail when inadequate attention is paid to stakeholder needs and work processes during design6 or when assumptions are made about how well a system fits with the user's role within the organization during implementation.7

Medical records are structured primarily for the use of clinicians and administrators. Patients typically are not counted among the primary users of an EPR system. They tend to be given access sometime after the system is implemented in the organization. Structural concessions and decisions made when the system is first implemented, such as fragmented data entries and foreign lexicons, can make the information difficult for patients to follow and the records all but impossible for them to effectively use.8

5.
Objectives: To determine clinicians' (doctors', nurses', and allied health professionals') “actual” and “reported” use of a point-of-care online information retrieval system; and to make an assessment of the extent to which use is related to direct patient care by testing two hypotheses: hypothesis 1: clinicians use online evidence primarily to support clinical decisions relating to direct patient care; and hypothesis 2: clinicians use online evidence predominantly for research and continuing education.
Design: Web-log analysis of the Clinical Information Access Program (CIAP), an online, 24-hour, point-of-care information retrieval system available to 55,000 clinicians in public hospitals in New South Wales, Australia. A statewide mail survey of 5,511 clinicians.
Measurements: Rates of online evidence searching per 100 clinicians for the state and for the 81 individual hospitals studied; reported use of CIAP by clinicians through a self-administered questionnaire; and correlations between evidence searches and patient admissions.
Results: Monthly rates of 48.5 “search sessions” per 100 clinicians and 231.6 text hits to single-source databases per 100 clinicians (n = 619,545); 63% of clinicians reported that they were aware of CIAP and 75% of those had used it. Eighty-eight percent of users reported CIAP had the potential to improve patient care and 41% reported direct experience of this. Clinicians' use of CIAP on each day of the week was highly positively correlated with patient admissions (r = 0.99, p < 0.001). This was also true for all ten randomly selected hospitals.
Conclusion: Clinicians' online evidence use increases with patient admissions, supporting the hypothesis that clinicians' use of evidence is related to direct patient care. Patterns of evidence use and clinicians' self-reports also support this hypothesis.

Current literature1,2,3 indicates that clinicians do not routinely use the available evidence to support clinical decisions.
Several studies have shown that simply disseminating evidence, for example, in the form of practice guidelines, does not lead to increased use of that information to inform clinical decisions.4 Clinicians apparently pursue answers to only a minority of their questions5,6 and, when they do so, they rely most heavily on colleagues for answers.5 Lack of easy access to up-to-date evidence is cited as a barrier to evidence-based practice by clinicians.7,8

Online clinical information resources have the potential to support clinicians who adopt an evidence-based approach by providing them with the information they need when they need it.9 We have some evidence that searches of bibliographic databases such as Medline are efficacious.5,10 Given sufficient time, clinicians are able to retrieve research evidence relevant to their clinical questions.11 Training in online searching techniques enhances the quality of evidence retrieved,12 whereas education in critical appraisal increases clinicians' abilities to apply the information obtained.13

However, measuring the actual uptake of online information retrieval systems is problematic and few studies have been attempted. Studies of intranet provision of online resources report monthly utilization rates of 30 to 720 searches per 100 person-months.14 However, most studies only report use rates that exclude clinicians who have access to the system but do not use it. Consequently, these studies do not provide a measure of actual uptake by the clinical population.

It is also difficult to measure the impact that online access to evidence has on clinical practice. Assessments of the impact of Medline information on decision making and patient care have relied primarily on self-reports of clinicians. Haynes and McKibbon15 provided training and access to online Medline to a group of 158 U.S. physicians.
For 92% of searches related to a patient's care, clinicians reported that the information retrieved resulted in “at least some improvement” in care. Using the critical incident technique, Lindberg et al.16 interviewed U.S.-clinician Medline users about their searches. Of the 1,158 searches described, 41% were classified as affecting decisions regarding patient care. A survey of U.K. general practitioners who used the ATTRACT system, which provides rapid access to evidence-based summaries to clinical queries, found 60% (24 of 40 doctors) reported that the information gained had changed their practice.17

Based on the assumption that providing clinicians with easy access to “evidence” will support decision making and result in improvements in patient care, in 1997 the State Health Department in New South Wales (NSW), Australia, implemented the Clinical Information Access Program (CIAP; <http://www.ciap.health.nsw.gov.au/>). CIAP is a Web site providing point-of-care, 24-hour, online access to a wide range of bibliographic and other resource databases for the approximately 55,000 clinicians (doctors, nurses, and allied health staff) employed by the state and primarily working in public hospitals.

Qualitative data from case studies indicated that clinicians perceived a range of factors influenced their use of the system, including support from facility managers and direct supervisors, access to computer terminals, training and skills (appraisal of evidence, database searching, and computer skills), and the place of evidence-based practice in their professional work.18

We sought to test hypotheses generated as a result of these case studies using Web log and survey data. We aimed to determine the rates of “actual” and “reported” use of online evidence by the population of clinicians working in the public health system in NSW and to assess the extent to which system use was related to direct patient care. We posed and tested two competing hypotheses.
These hypotheses were formulated following a qualitative study examining clinicians' use of online evidence18: hypothesis 1—clinicians use online evidence primarily to support clinical decisions relating to direct patient care; and hypothesis 2—clinicians use online evidence predominantly for research and continuing education. These hypotheses were tested by the following methods.

Examination of Patterns of Online Evidence Use by Clinicians by Time, Day, and Location of Searches

Hypothesis 1 would be supported by a pattern of use that coincided with patient care and peaked between the core working hours of 9 am and 5 pm, with most use originating from within hospitals. Hypothesis 2 would be supported by a relatively wide distribution of use across the times of the day with sustained rates of use into the evening, when clinicians have more free time for research. A high proportion of access would be expected to occur from outside hospitals, e.g., at home.

Figure 1. Hypotheses regarding patterns of online evidence use by clinicians. H = hypothesis.

Measuring the Association between Hospital Admissions and Use of Online Evidence

A significant positive correlation between patient admissions to hospitals and online evidence searches would provide support for hypothesis 1 by demonstrating that CIAP is likely to be used primarily to inform patient care, as opposed to meeting research or continuing education information needs. Absence of a correlation would support hypothesis 2.
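The correlation test described above (daily online evidence searches against daily patient admissions) can be sketched in a few lines. The weekly figures below are hypothetical placeholders, not the study's data; they simply illustrate the kind of weekday/weekend pattern that would support hypothesis 1.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical daily counts (Mon..Sun): admissions fall at weekends,
# and search sessions track them closely, as hypothesis 1 predicts.
admissions = [410, 395, 400, 405, 380, 150, 120]
searches = [2050, 1990, 2010, 2030, 1900, 800, 650]

r = pearson_r(admissions, searches)
```

With data like this, `r` lands close to 1, the pattern the study reports (r = 0.99); a flat search series across the week would instead point toward hypothesis 2.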

6.

Objective

Home telemonitoring represents a patient management approach combining various information technologies for monitoring patients at distance. This study presents a systematic review of the nature and magnitude of outcomes associated with telemonitoring of four types of chronic illnesses: pulmonary conditions, diabetes, hypertension, and cardiovascular diseases.

Methods

A comprehensive literature search was conducted on Medline and the Cochrane Library to identify relevant articles published between 1990 and 2006. A total of 65 empirical studies were obtained (18 pulmonary conditions, 17 diabetes, 16 cardiac diseases, 14 hypertension) mostly conducted in the United States and Europe.

Results

The magnitude and significance of the telemonitoring effects on patients’ conditions (e.g., early detection of symptoms, decrease in blood pressure, adequate medication, reduced mortality) still remain inconclusive for all four chronic illnesses. However, the results of this study suggest that regardless of their nationality, socioeconomic status, or age, patients comply with telemonitoring programs and the use of technologies. Importantly, the telemonitoring effects on clinical effectiveness outcomes (e.g., decreases in emergency visits, hospital admissions, and average hospital length of stay) are more consistent in pulmonary and cardiac studies than in diabetes and hypertension studies. Lastly, economic viability of telemonitoring was observed in very few studies and, in most cases, no in-depth cost-minimization analyses were performed.

Conclusion

Home telemonitoring of chronic diseases seems to be a promising patient management approach that produces accurate and reliable data, empowers patients, influences their attitudes and behaviors, and potentially improves their medical conditions. Future studies need to build evidence related to its clinical effects, cost effectiveness, impacts on services utilization, and acceptance by health care providers.

Continued advances in science and technology and general improvements in environmental and social conditions have increased life expectancy around the world. 1 As a result, the world’s population is aging. Over the last 50 years, the number of people age 60 years or over has tripled, and is expected to triple again to almost two billion by 2050. 2 Population ageing is a global phenomenon affecting all regions. Globally, the proportion of older people was 8% in 1950 and 10% in 2000, and is projected to reach 21% in 2050. 3 China is the region where the increase is likely to be most spectacular, from 6.9% in 2000 to 22.7% in 2050. 3 Population ageing is profound, having major consequences and implications for all facets of human life, including health and health care. Indeed, as we age, the incidence and prevalence of chronic diseases, such as cardiovascular disease, chronic obstructive pulmonary disease (COPD), and diabetes, continue to increase. 1,4 Chronic diseases have become major causes of death in almost all countries. It was estimated that, by the end of 2005, 60% of all deaths would be due to chronic diseases. 5 Such prevalence of chronic diseases is one reason why expenditures on health care are skewed: in most health care delivery systems, 5% of patients are responsible for 50% of costs. 6 The economic burden of chronic diseases is profound, accounting for 46% of the global burden of disease.
7 The losses in national income for 2005 due to deaths from heart disease, stroke, and diabetes were estimated (in international dollars) to be $18 billion in China, $1.6 billion in the United Kingdom, and $1.2 billion in Canada. 5 In the United States, chronically ill patients account for 78% of all medical costs nationally. 8 The increasing burden of chronic disease on health care resources and costs provides a powerful incentive to find more compelling ways to care for these patients.

The challenge is even more complex because of the supply-and-demand curve in health care. 4 Indeed, at the same time as we face dramatic increases in the numbers of chronically ill patients, there are global provider shortages. An acute nursing shortage exists in many developed countries, including the United States, United Kingdom, Australia, and Canada, and there is no realistic prospect that this situation will change in the near future. 9–11 Furthermore, some countries have to cope with reductions in the number of persons entering the nursing profession. 12–14 Several studies have also suggested a substantial physician shortage, which is expected to develop in the coming years in various countries. 15–17 Dramatic increases in the numbers of chronically ill patients in the face of shrinking provider numbers and significant cost pressures mean that a fundamental change is required in the process of care. We need to identify patient management approaches that would ensure appropriate monitoring and treatment of patients while reducing the cost involved in the process. Provision of care directly to the patient home represents an alternative. It may be perceived as a substitute for acute hospitalization, an alternative to long-term institutionalization, a complementary means of maintaining individuals in their own community, and an alternative to conventional hospital outpatient or physician visits.
1 Information technology can play a crucial role in providing care to the home, and telehealth technologies have been growing dramatically. More precisely, home telemonitoring is a way of responding to the new needs of home care in an ageing population. In this regard, Meystre 18 recently concluded that long-term disease monitoring of patients at home currently represents the most promising application of telemonitoring technology for delivering cost-effective quality care. Yet, to be able to comprehensively assess and determine the benefits of home telemonitoring, it is essential to perform a systematic review that can critically synthesize the results of various studies in this area and provide a solid ground for clinical and policy decision making. 19 This article provides a systematic review of experimental and quasi-experimental studies involving home telemonitoring of chronic patients with pulmonary conditions, diabetes, hypertension, and cardiovascular diseases. Specifically, it reveals the nature and magnitude of the outcomes or impacts associated with telemonitoring programs across the world.
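The monitoring loop at the heart of such programs is often a rule-based check on transmitted readings. The sketch below illustrates the idea only; the thresholds and measurement names are invented for illustration, since real telemonitoring programs use clinician-configured, per-patient limits.

```python
# Hypothetical alert thresholds, for illustration only.
THRESHOLDS = {
    "systolic_bp": (90, 160),    # mmHg
    "heart_rate": (50, 110),     # beats per minute
    "weight_gain_24h": (0, 2),   # kg; rapid gain can signal fluid retention
}

def flag_readings(readings):
    """Return the names of measurements that fall outside their allowed range."""
    alerts = []
    for name, value in readings.items():
        low, high = THRESHOLDS[name]
        if not (low <= value <= high):
            alerts.append(name)
    return alerts

# A home unit transmits today's readings; out-of-range values are flagged
# for a nurse or physician to review.
alerts = flag_readings({"systolic_bp": 172, "heart_rate": 84, "weight_gain_24h": 2.6})
```

In a deployed system the flagged readings would be routed to clinical staff rather than acted on automatically; the value of the approach lies in early detection of symptoms, which the review identifies as one of the monitored outcomes.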

7.
This article describes the algorithms implemented in the Essie search engine that is currently serving several Web sites at the National Library of Medicine. Essie is a phrase-based search engine with term and concept query expansion and probabilistic relevancy ranking. Essie’s design is motivated by an observation that query terms are often conceptually related to terms in a document, without actually occurring in the document text. Essie’s performance was evaluated using data and standard evaluation methods from the 2003 and 2006 Text REtrieval Conference (TREC) Genomics track. Essie was the best-performing search engine in the 2003 TREC Genomics track and achieved results comparable to those of the highest-ranking systems on the 2006 TREC Genomics track task. Essie shows that a judicious combination of exploiting document structure, phrase searching, and concept-based query expansion is a useful approach for information retrieval in the biomedical domain.

A rapidly increasing amount of biomedical information in electronic form is readily available to researchers, health care providers, and consumers. However, readily available does not mean conveniently accessible. The large volume of literature makes finding specific information ever more difficult. Development of effective search strategies is time consuming, 1 requires experienced and educated searchers 2 well versed in biomedical terminology, 3 and is beyond the capability of most consumers. 4 Essie, a search engine developed and used at the National Library of Medicine, incorporates a number of strategies aimed at alleviating the need for sophisticated user queries. These strategies include a fine-grained tokenization algorithm that preserves punctuation, concept searching utilizing synonymy, and phrase searching based on the user’s query.

This article describes related background work, the Essie search system, and the evaluation of that system.
The Essie search system is described in detail, including its indexing strategy, query interpretation and expansion, and ranking of search results.
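The combination of phrase searching and concept (synonym) expansion that the abstract describes can be illustrated with a toy sketch. To be clear, the synonym table, the scoring weights, and the substring matching below are invented for illustration and are not Essie's actual algorithm, which uses fine-grained tokenization and probabilistic relevancy ranking.

```python
# Tiny illustrative synonym table; a real system would draw on a
# terminology resource such as the UMLS rather than a hand-built dict.
SYNONYMS = {
    "heart attack": ["myocardial infarction"],
    "hypertension": ["high blood pressure"],
}

def expand_query(query):
    """Return the query phrase plus any synonymous phrases (concept expansion)."""
    q = query.lower()
    variants = {q}
    variants.update(SYNONYMS.get(q, []))
    # Map back as well: if the query is a listed synonym, add its preferred term.
    for preferred, syns in SYNONYMS.items():
        if q in syns:
            variants.add(preferred)
            variants.update(syns)
    return variants

def score(doc, query):
    """Toy ranking: 2 for an exact phrase hit, 1 for a synonym hit, 0 otherwise."""
    text = doc.lower()
    if query.lower() in text:
        return 2
    if any(v in text for v in expand_query(query) - {query.lower()}):
        return 1
    return 0
```

The key intuition from the paper survives even in this sketch: a document about "myocardial infarction" is retrieved for the query "heart attack" although the query terms never occur in the text, while an exact phrase match still ranks higher.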

8.
9.

Objective

Although demand for information about the effectiveness and efficiency of health care information technology grows, large-scale resource-intensive randomized controlled trials of health care information technology remain impractical. New methods are needed to translate more commonly available clinical process measures into potential impact on clinical outcomes.

Design

The authors propose a method for building mathematical models based on published evidence that provides an evidence bridge between process changes and resulting clinical outcomes. This method combines tools from systematic review, influence diagramming, and health care simulations.

Measurements

The authors apply this method to create an evidence bridge between retinopathy screening rates and incidence of blindness in diabetic patients.

Results

The resulting model uses changes in eye examination rates and other evidence-based population parameters to generate clinical outcomes and costs in a Markov model.

Conclusion

This method may serve as an alternative to more expensive study designs and provide useful estimates of the impact of health care information technology on clinical outcomes through changes in clinical process measures.

The announcement 1 and reaffirmation 2 of the federal commitment to advancing health care information technology (HIT) has been further bolstered by events in the Gulf South after the recent hurricane seasons. 3 This commitment creates both opportunities and challenges for health services and clinical informatics researchers. Clinicians, policy makers, lobbyists, economists, and the media demand evidence-based recommendations for HIT. To make decisions that will affect millions of lives and billions of dollars, decision makers require more than efficacy studies—they require results that indicate both the effectiveness and the efficiency of HIT solutions. The ability of the informatics research community to respond to this need with useful and credible evidence will determine our relevance to the debate.

Many evaluations of health services focus primarily on process measures. 4 For example, there are numerous studies in the disease management literature that report the impact of technology on the rate of annual eye or foot examinations for diabetic patients. 5–14 However, there are few published studies that evaluate HIT’s impact on the rate of blindness or amputations. Despite the increasing demand for credible clinical outcomes evidence, many studies in HIT lack the power to detect changes in clinical outcomes, a product of limited time and resources. 15 In addition, the rapid evolution of new technologies makes the study subject itself, HIT, a moving target. By the time a large-scale trial is completed, the state of the art will have moved on.
16 Evaluations in HIT therefore tend to be relatively brief studies comparing convenient measures, more often made in a laboratory environment, or potentially idiosyncratic academic environments, than in real-world clinical settings, thus limiting generalizability. These studies would be classified by Fuchs and Garber 17 as stage 1 and 2 technology assessments—evaluating the performance of the technology itself and perhaps the impact of the technology on processes of care. However, the demand for outcomes evidence mandates that future HIT research be at the level of stage 3 technology assessments, in which comprehensive clinical, economic, and social outcomes are evaluated to determine both the effectiveness and efficiency of the intervention. The importance of linking process measures to clinical outcomes has been previously described, but progress has been limited. 18 We propose an approach to maximize the ability of HIT evaluation research to report clinical and financial outcomes.  相似文献   
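The approach this abstract describes, translating a change in a process measure (the annual eye-examination rate) into clinical outcomes via a Markov cohort model, can be illustrated with a minimal sketch. All states, transition probabilities, and the assumed screening effect below are invented for demonstration; they are not the paper's parameters.

```python
import numpy as np

# Hypothetical illustration: an HIT-driven change in a process measure
# (eye-exam rate) shifts the annual probability of progressing from
# retinopathy to blindness in a four-state cohort Markov model.
STATES = ["no_retinopathy", "retinopathy", "blind", "dead"]

def transition_matrix(exam_rate):
    # Assumed effect: higher screening detects and treats retinopathy
    # earlier, lowering the annual progression-to-blindness probability.
    p_blind = 0.020 * (1 - 0.6 * exam_rate)
    return np.array([
        [0.95, 0.04, 0.0,             0.01],  # no retinopathy
        [0.00, 0.97 - p_blind, p_blind, 0.03],  # retinopathy
        [0.00, 0.00, 0.97,            0.03],  # blind
        [0.00, 0.00, 0.00,            1.00],  # dead (absorbing)
    ])

def run_cohort(exam_rate, cycles=20, cohort=10_000):
    # Start with 80% of the cohort free of retinopathy, 20% affected.
    dist = np.array([cohort * 0.8, cohort * 0.2, 0.0, 0.0])
    M = transition_matrix(exam_rate)
    for _ in range(cycles):
        dist = dist @ M  # advance one annual cycle
    return dist

low = run_cohort(exam_rate=0.40)
high = run_cohort(exam_rate=0.70)
print(f"blind after 20 y at 40% screening: {low[2]:.0f}")
print(f"blind after 20 y at 70% screening: {high[2]:.0f}")
```

Attaching costs and utilities to each state would then yield the economic outputs the abstract mentions; the point of the sketch is only that an evidence-based change in a process rate propagates to a clinical outcome through the transition matrix.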

12.
Intestinal lymphangiectasia (IL) is a rare disease characterized by dilatation of intestinal lymphatics. It can be classified as primary or secondary according to the underlying etiology. The clinical presentations of IL are pitting edema, chylous ascites, pleural effusion, acute appendicitis, diarrhea, lymphocytopenia, malabsorption, and intestinal obstruction. The diagnosis is made by intestinal endoscopy and biopsies. Dietary modification is the mainstay in the management of IL, with a variable response. Here we report 2 patients with IL in Bahrain who showed a positive response to dietary modification.

Intestinal lymphangiectasia (IL) is a rare1-4 benign disease characterized by focal or diffuse dilation of the mucosal, submucosal, and subserosal lymphatics.2,5 In addition to being an important cause of protein losing enteropathy (PLE),6 IL is frequently associated with extraintestinal lymphatic abnormalities.5 Depending on the underlying pathology, IL can be classified as primary or secondary disease.1,2,4,5 Primary IL (PIL) probably represents a congenital disorder of mesenteric lymphatics.1,3 The IL can be secondary to diseases like constrictive pericarditis, lymphoma, sarcoidosis, and scleroderma.1 A secondary disorder should always be ruled out before labeling IL as primary, by testing for proteinuria and for rheumatic disease, neoplasia, and parasitic infection.1,3 Recently, a functional form of PIL with typical endoscopic and pathological findings but without clinical symptoms has been reported.3 The clinical presentations of IL are pitting edema, chylous ascites, pleural effusion, acute appendicitis, diarrhea, lymphocytopenia, malabsorption, and intestinal obstruction.1,2,4 Palliative treatment with lifelong dietary modification is the most effective and widely prescribed therapy.6 Limiting the dietary fat intake reduces chyle flow and therefore protein loss.1 Once the protein level is within the normal range, recurrence of enteric protein loss can be prevented by total parenteral nutrition (TPN) and medium chain triglycerides (MCT).1 In cases of secondary IL, treating the underlying primary disorder may be curative.2 Although the therapeutic approach for this disorder has gained considerable attention lately, few studies have considered the therapeutic effects, nutritional condition, and long-term results in PIL patients.4 Here, we report 2 patients with PIL who were diagnosed by endoscopy and biopsy and showed a positive response to dietary modifications. We present these particular cases to highlight the effect of dietary modifications on the clinical status of patients with IL.

13.

Aim

To assess the glucose tolerance of South Asian and Caucasian women with previous gestational diabetes mellitus (GDM).

Method

A retrospective follow‐up study of 189 women diagnosed with GDM between 1995 and 2001. Glucose tolerance was reassessed by oral glucose tolerance test at a mean duration since pregnancy of 4.38 years.

Results

South Asian women comprised 65% of the GDM population. Diabetes developed in 36.9% of the population, affecting more South Asian (48.6%) than Caucasian women (25.0%). Women developing diabetes were older at follow‐up (mean (SD) 38.8 (5.7) vs 35.9 (5.6) years; p<0.05) and had been heavier (body mass index 31.4 (6.3) vs 27.7 (6.7) kg/m2; p<0.05), more hyperglycaemic (G0 6.5 (1.7) vs 5.2 (1.1) mmol/l, p<0.01; G120 11.4 (3.3) vs 9.6 (1.8) mmol/l, p<0.01; HbA1c 6.4 (1.0) vs 5.6 (0.7), p<0.01) and more likely to require insulin during pregnancy (88.1% vs 34.0%; p<0.01). Future diabetes was associated with and predicted by HbA1c taken at GDM diagnosis in both South Asian (odds ratio 4.09, 95% confidence interval 1.35 to 12.40; p<0.05) and Caucasian women (OR 9.15, 95% CI 1.91 to 43.87; p<0.01) as well as by previously reported risk factors of increasing age at follow‐up, pregnancy weight, increasing hyperglycaemia and insulin requirement during pregnancy.

Conclusion

GDM represents a significant risk factor for future DM development regardless of ethnicity. Glycated haemoglobin values at GDM diagnosis have value in predicting future diabetes mellitus.

Gestational diabetes mellitus (GDM) is defined as abnormal carbohydrate tolerance that is diagnosed or first recognised in pregnancy1 and affects approximately 5% of pregnancies.2 However, the prevalence depends on the population studied and the diagnostic criteria used3 with an increased frequency of GDM when less stringent diagnostic criteria are used and in ethnic groups who traditionally have a higher rate of type 2 diabetes.4,5,6 Differences in the prevalence of GDM reflect the background susceptibility of individual ethnic groups2,7 and possibly a different stage within the natural history of diabetes at the time of pregnancy.8

Previous GDM confers an increased risk of subsequent diabetes mellitus such that 50% of women will have diabetes mellitus after 10 years.9,10 Several antenatal and maternal factors have been shown to predict this11,12,13 and identification of these during the screening of women with GDM may lead to more effective targeting of strategies for primary prevention of diabetes in local populations.3,14 Glycated haemoglobin (HbA1c), while convenient to measure, has little sensitivity in making the diagnosis of GDM15 and has been little studied as a risk marker for predicting future diabetes.

A number of studies have suggested that diabetes following GDM develops more rapidly in non‐Caucasian groups.5,16,17 A recent meta‐analysis, however, suggested that differences between the ethnic groups studied could largely be explained by standardising diagnostic criteria, duration of follow‐up and patient retention.18 The Leicestershire population consists of a significant minority of women from the Indian subcontinent who have higher rates of glucose intolerance both in and out of pregnancy.19 This study examined the development of glucose intolerance and its pregnancy
associations in this ethnically mixed population.

14.
Should face transplants be undertaken? This article examines the ethical problems involved from the perspective of the recipient, looking particularly at the question of identity, the donor and the donor''s family, and the disfigured community and society more generally. Concern is expressed that full face transplants are going ahead.

So it has happened. According to the New Scientist,i the case was made out by a team in Louisville, Kentucky, USA, in 2004,1 and the first partial face transplant was performed in Lyon, France, in November 2005 on a woman whose face was mutilated by a dog.2ii Full face transplants are to go ahead at the Royal Free Hospital in Hampstead, North London, UK.3iii It has been called “the boldest cut”,4 and “a glorified sewing job”.4 The Royal College of Surgeons in the UK5,6 and the Comité Consultatif National d''Ethique Pour les Sciences de la Vie et de la Santé in France7 have cautioned against it. But even before the flurry of activity in late 2005, it was clear that this latest venture in transplantation technology would proceed: there were statements of intent—a prospectus almost—from clinicians and researchers, most notably from a University of Louisville team.1

It is sometimes suggested that face transplants are morally analogous to limb transplants.1,8 There have been hand transplants—some 20 transplants since the first was performed on Clint Hallam in 1998.9,10 Many of the problems are the same, but the ethical dilemmas that surround face transplants are, arguably, of a different dimension.11 And caution is rightly being argued in the case of hand transplants too.12,13 There must be concern that scientific advance is being urged before legal frameworks can be established, and before ethicists have debated the morality of the surgery.iv There may also be a danger that clinicians will seek to legitimate what they propose to do by finding ethicists on whom they can rely for justification—even if no one else does.v

This paper considers the ethical issues of face transplants. Another paper will explore some of the legal questions. In considering the ethical issues, it is important to separate those which principally concern the recipient from those which affect the donor and the donor''s family. But we must not ignore the interests of the disfigured community and those of society at large, and consideration will be given to those two interest groups as well.

15.
Described are the changes to ICD-10-CM and PCS and potential challenges regarding their use in the US for financial and administrative transaction coding under HIPAA in 2013. Using author-constructed derivative databases for ICD-10-CM and PCS, it was found that ICD-10-CM''s overall term content is seven times larger than ICD-9-CM: only 3.2 times larger in those chapters describing disease or symptoms, but 14.1 times larger in injury and cause sections. A new multi-axial approach in ICD-10-PCS increased its size 18-fold from the prior version. The new ICD-10-CM and PCS reflect a corresponding improvement in specificity and content. The forthcoming required national switch to these new administrative codes, coupled with the nearly simultaneous widespread introduction of clinical systems and terminologies, requires substantial changes in US administrative systems. Through coordination of terminologies, the systems using them, and healthcare objectives, we can maximize the improvement achieved and engender beneficial data reuse for multiple purposes, with minimal transformations.

In April 2004, President Bush directed the US healthcare system to adopt and use electronic health records (EHR) such that coverage extends to the ‘entire population within a decade’.1 In 2009, President Obama affirmed that announcement and Congress committed in excess of US$17 billion via the American Recovery and Reinvestment Act to ensure that it happens.2 Since that time the US informatics community has focused on providing definitional infrastructure on the successful attributes those systems should have through emerging definitions for ‘meaningful use’ and requirements for certification of systems.3

Meanwhile, following an almost identical timeline, the financial and administrative reporting structure as defined by the Health Insurance Portability and Accountability Act of 1996 (HIPAA)4 has implemented a generational update to the reporting structure and code sets used.
Those providers implementing the newly prescribed EHR for their clinical use will also provide for a path in their administrative systems to accommodate the change in reporting structure from ASC X12 4010a to 5010 5 and move to the International Classification of Disease (ICD) version 10 revisions6 7 of the reporting code sets. Wide variation exists in the total implementation cost estimates but, based on final federal cost estimates, direct costs to providers for the administrative system updates will range between US$9.5 and US$16.9 billion, with ICD-10 changes amounting to between US$2.3 and US$2.7 billion.

The tension described above exists in our view of the code sets associated with the EHR and administrative transactions. Tang et al8 have outlined the difficulties in using administrative classifications, primarily ICD variants, for clinical information. Chute and colleagues9 10 have noted many of the inherent difficulties in using classifications to represent clinical content. Cimino and Clayton11 have described how to use a reference terminology to manage some of the issues and then succinctly summarized these issues in the desiderata.12 Brown et al13 extended the work of Cimino by using SNOMED-CT as a reference terminology. This decade-plus long excursion clearly shows the difficulty the informatics community has with using classifications to express clinical content, despite the fact that the classifications are the only terminologies most likely encountered and used by most practitioners outside of research settings.

The clinical terminology community has also found difficulties in developing and deploying new products.14 During the same time, from approximately 1995 to now, intense work on the development of clinical terminologies such as Read Codes Version 3,15 SNOMED-RT16 17 and the GALEN project18 19 occurred, all trying to apply logic constructs to develop new inferences.
Over this time, the National Health Service in the UK invested billions of pounds in the Read Codes Version 3 and SNOMED-CT,20 now an international product of the International Health Terminology Standard Development Organization (IHTSDO) for intended future universal deployment.21 Like the classifications, SNOMED-CT is not without criticism regarding completeness or logical construction.22–24 The USA entered into an agreement with the intellectual property holder of SNOMED-CT, first the College of American Pathologists and now the IHTSDO, to make it freely and universally available in US healthcare.25 It is still in limited use, primarily in those locations that used it before the US license, such as Kaiser Permanente26 and the University of Nebraska. The only major national use now proposed is for the problem list. A SNOMED-CT subset of problems that are seen frequently in a variety of healthcare systems is available.25 It remains to be seen whether this subset will be used in ways that take advantage of SNOMED-CT''s logical inference structure or—perhaps more likely—be employed simply as a large term list.

It is imperative that the informatics community understands the structure and potential impact of the new classifications so we can help guide successful introduction. They currently serve as the most used source for inferential decisions regarding health care and will continue to do so for at least the next decade, until and if a clinical code system providing internal logical support emerges and is universally accepted. Given the immediate widespread use that will occur on adoption of the new generation HIPAA classifications and the subsequent long-term effects, it is surprising that little detail on their terminological construction or changes from the previous version is available. This article will focus on these areas and suggest areas for focus and coordination.

16.
Yu W, Li W, Wang Z, Ye X, Li N, Li J. Postgraduate Medical Journal 2007;83(977):187-191.

Objective

Percutaneous transhepatic gallbladder drainage (PTGD) was compared with endoscopic treatment (within 72 hours after the onset of symptoms) in patients with severe biliary pancreatitis to evaluate the curative effect of PTGD in preventing biliary tract complications in these patients.

Methods

Eligible patients were randomised to receive early treatment with PTGD or endoscopic treatment. If the initial emergency endoscopic or PTGD treatment failed, patients received another drainage treatment within 24 hours. From November 2001 to August 2005, 101 patients were randomly assigned to early PTGD (n = 51) or endoscopic retrograde cholangiopancreatography (ERCP) (n = 50). Overall mortality, mortality due to pancreatitis and complications were compared in these two groups.

Results

48 of 52 patients were successfully treated with ERCP and 53 of 55 with PTGD. Seven patients (6.9%; three in the endoscopic treatment group and four in the PTGD group) died within four months after the onset of pancreatitis (p = 0.798); three patients in the endoscopic group and three in the PTGD group died from acute biliary pancreatitis. The overall rate of complications was similar in the two groups and there were no major differences in the incidence of local or systemic complications.

Conclusions

PTGD treatment is a simple, convenient and effective treatment of severe gallstone associated acute pancreatitis when endoscopic treatment fails.

In China, gallstones account for approximately 50–70% of cases of acute pancreatitis.1,2 The pathogenesis of acute biliary pancreatitis is not fully understood and may be multifactorial.3,4,5,6,7,8 Most gallstones that initiate an episode of acute pancreatitis pass spontaneously through the ampulla of Vater into the duodenum and can subsequently be recovered in the faeces within a few days. It is uncertain whether or not early surgical or endoscopic removal of gallstones is beneficial in mild gallstone associated acute pancreatitis (GAP).9,10 However, early endoscopic retrograde cholangiopancreatography (ERCP) with or without endoscopic sphincterotomy has been advocated to reduce complications in patients with a severe attack of GAP.9,10,11,12

Although early endoscopic intervention is the procedure of choice in patients with stone impaction and cholangitis, some patients are unable to tolerate endoscopic intervention, especially those with early severe pancreatitis who have developed organ failure within 72 hours after the onset of pain.13 Therefore, it is not sufficient to rely solely on endoscopic treatment to release bile matter in severe GAP. Alternatives are needed for those patients who cannot be cured by endoscopic treatment.

Percutaneous transhepatic gallbladder drainage (PTGD) has been widely accepted as an alternative to operative decompression in critically ill patients with cholecystitis or cholangitis, especially in the elderly.14,15,16 However, little is known about the factors affecting the outcome of PTGD in severe GAP. In clinical practice, we have been successful in using PTGD in some patients who were unable to receive endoscopic intervention and had complicating obstructive jaundice.
In this study, we have performed a prospective, randomised trial to compare early PTGD (within 72 hours after the onset of symptoms) with endoscopic treatment in patients with severe GAP.
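The trial's headline comparison (three deaths among the endoscopic patients versus four among the PTGD patients) is a comparison of two small proportions. As a hypothetical sketch of the mechanics of such a test, the following computes a two-sided Fisher's exact p-value from a 2x2 table using only the standard library. The abstract reports p = 0.798, presumably from a different test; this sketch is not an attempt to reproduce that exact figure.

```python
from math import comb

# Two-sided Fisher's exact test on a 2x2 table:
# [[a, b], [c, d]] = [[deaths_arm1, survivors_arm1],
#                     [deaths_arm2, survivors_arm2]]
def fisher_exact_two_sided(a, b, c, d):
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):
        # Hypergeometric probability of a table with x in the top-left
        # cell, given fixed row and column totals.
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    # Sum probabilities of all tables no more likely than the observed one.
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Deaths vs survivors in each arm, using the abstract's counts
# (3 of 50 endoscopic, 4 of 51 PTGD).
p = fisher_exact_two_sided(3, 47, 4, 47)
print(f"two-sided p = {p:.3f}")  # near 1: no detectable difference
```

With proportions this similar the exact test, like the trial's own analysis, finds no evidence of a mortality difference between arms.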

18.
Kimura disease is a chronic inflammatory disease that mainly manifests as a lump in the cervical region. Although the underlying pathophysiology is not yet clear, the diagnosis can be established based on specific histopathological characteristics. The first case of this disease was described in China, and the majority of subsequent cases were also reported in Far East countries, making Kimura disease traditionally a disease of adult patients of Asian descent. This report describes the occurrence of Kimura disease in a pediatric non-Asian patient with a similar clinicopathologic presentation.

Although Kimura disease can be grouped under inflammatory diseases of a chronic nature, the underlying cause is still to be investigated. The disease usually presents with an enlarged but painless cervical lymph node or subcutaneous masses in the cervical region.1,2 Clinical and histological characteristics of Kimura disease (a primary allergic reaction or an alteration of immune regulation) help to differentiate it from angiolymphoid hyperplasia with eosinophilia (an arteriovenous malformation with secondary inflammation mostly involving dermal or subcutaneous parts), which was previously thought to be the same disease.1,2 Most cases have been reported in adult patients from the Far East of Asia.1,2 Elevation of inflammatory mediators that are usually raised in autoimmune disorders has made hypersensitivity a possible underlying pathophysiological mechanism of this disease.1,2 Patients usually present with a non-tender mass in the cervical region, an elevated eosinophil count, and high levels of serum immunoglobulin type E (IgE).2 Unfortunately, there are no specific radiological characteristics of this disease.2 The only way to diagnose Kimura disease is through its histopathologic features, which necessitates a surgical biopsy.1,2 Treatment usually starts with medical therapy; if that fails or shows no spontaneous resolution, surgical excision would be the choice at that point, with radiotherapy reserved for selected cases.1,2

The main objective of presenting this case report is to emphasize that Kimura disease can involve pediatric Saudi patients, in contrast to what was historically described as a disease of Asian adults only. Secondarily, it supports previous reports of the occurrence of the disease in non-Asian patients with a clinicopathologic presentation similar to that of Asian patients.2,3

19.
Mucormycosis is an uncommon acute invasive fungal infection that affects immunocompromised patients. It progresses rapidly and has a poor prognosis if diagnosed late. Early detection, control of the underlying condition with aggressive surgical debridement, administration of systemic and local antifungal therapies, and hyperbaric oxygen as adjunctive treatment improve prognosis and survivability.

Mucormycosis, also known as zygomycosis and phycomycosis, is an uncommon, opportunistic, aggressive, and potentially fatal fungal infection caused by fungi of the order Mucorales, occurring frequently among immunocompromised patients. This fungal infection begins in the sinonasal mucosa after inhalation of fungal spores; the aggressive and rapid progression of the disease may lead to orbital and brain involvement.1-4 In the past, the mortality rate of the rhino-cerebral type was 88%, but recently the survival rate of rhino-cerebral mucormycosis averages 21-73% depending on the circumstances.1 Mucormycosis is classified according to anatomical site into rhino-cerebral, which is the most common, central nervous system, pulmonary, cutaneous, disseminated, and miscellaneous forms.1,2,4-6 The rhino-orbito-cerebral is the most common form of mucormycosis.3 The most common predisposing factor is uncontrolled diabetes mellitus (DM), especially when the patient has a history of ketoacidosis, as these species thrive best in a glucose-rich and acidic environment.3,4,6,7 Immunosuppressive drugs such as steroids, neutropenia, acquired immune deficiency syndrome, dialysis on deferoxamine, malnutrition, hematologic malignancy, and organ transplantation also put patients at risk of infection by these fungi.1,4-7 This case report describes a case of rhino-orbital mucormycosis affecting a diabetic female with a good prognosis and satisfactory healing. Our objective in presenting this particular case is to emphasize that early diagnosis and proper management lead to a good prognosis and high survivability.

20.

Objectives:

To elucidate the contribution of x-ray repair cross-complementing (XRCC) protein 1 399Gln, XRCC3 241M, and XRCC3-5’-UTR polymorphisms to the susceptibility of breast cancer (BC) in a Jordanian population.

Methods:

Forty-six formalin fixed paraffin embedded tissue samples from BC diagnosed female patients, and 31 samples from the control group were subjected to DNA sequencing. Samples were collected between September 2013 and December 2014.

Results:

The XRCC1 Arg399Gln genotype did not exhibit any significant correlation with the susceptibility of BC (odds ratio [OR]=1.45, 95% confidence interval [CI]: 0.60-3.51) (p=0.47). Likewise, XRCC3 M241T genotype did not show significant correlation with BC (OR=2.02, 95% CI: 0.50-8.21) (p=0.40). However, distribution of XRCC3-5’UTR (rs1799794 A/G) genotype showed a significant difference between the patient and control group (OR=0.73, 95% CI: 0.06-8.46) (p=0.02).

Conclusion:

The XRCC3-5’UTR (rs1799794) G allele frequency was higher in cancer patients, while XRCC1 (rs25487) and XRCC3 (rs861539) did not show any significant correlation with susceptibility of BC in the selected Jordanian population. The contribution of other environmental factors, as well as the response to cancer therapy, should be studied in future work.

Breast cancer (BC) incidence in Jordan has been estimated at 1,237 cases in 2012, with a prevalence of 4,260 cases over 5 years and up to 426 deaths.1 Genetic predisposition contributes to less than 10% of BC cases, which raises a demand for further research into new genetic markers of BC risk.2 Fewer than 5% of BC cases have been found to carry mutations in the breast cancer early onset genes BRCA1 and BRCA2, and approximately 40% of familial BC families have been identified for genetic predisposition.3 Unfortunately, mammalian cells are habitually exposed to genotoxic agents, such as ionizing radiation, that can lead to DNA damage. Many double-strand break4 and single-strand break (SSB) repair proteins have been identified, including DNA repair protein homologs, RAD recombinases, and the x-ray repair cross-complementing (XRCC) family proteins.5 Deficiency in the repair system might contribute to cancer development due to the loss of genetic integrity and genome instability.6 Mutation in DNA repair proteins is very rare.7 Therefore, many studies have been conducted to evaluate the role of allelic polymorphisms in DNA repair genes involved in cancer development.8,9 Genetic polymorphisms in the DNA repair genes XRCC1 and XRCC3 have been screened to find an association with the risk of BC.10-12 Studies have demonstrated an association between XRCC1 and XRCC3 polymorphisms and certain cancers, including colorectal cancer,13 lung cancer,14 pancreatic cancer,15 head and neck cancer,16 gastric cancer,17 esophageal cancer,18 melanoma skin cancer,19 oral squamous cell carcinomas,20 lung cancer risk,21 bladder cancer,22 and BC.23 Furthermore, a meta-analysis supported the contribution of the XRCC1 Arg399Gln polymorphism to susceptibility to BC in the American population.24 On the other hand, no relationship has been found between XRCC1 and XRCC3 polymorphisms and the risk of BC,25 lung cancer,26 bladder cancer,27 prostate cancer,28 lung cancer risk,29 or cutaneous malignant melanoma;30 furthermore, such polymorphisms may decrease the risk for myeloblastic leukemia31 and non-melanoma skin cancer.32 Alcoholism, abortion, and non-breast feeding have been associated with an increased risk of BC with the contribution of XRCC1 399Gln and XRCC3 T241M polymorphisms.11 Moreover, family history,12 age group,33 polycyclic aromatic hydrocarbon-DNA adducts, fruit, vegetable, and antioxidant intake, and non-smoking status have been suggested to be associated with the risk of BC in interaction with XRCC1 or XRCC3 polymorphisms.34 The aim of the current study was to elucidate the contribution of XRCC1 399Gln, XRCC3 241M, and XRCC3-5’-UTR polymorphisms to the susceptibility of BC in the Jordanian population. This study is intended to establish a reference point for future single nucleotide polymorphism (SNP) studies in the Jordanian population, which may contribute to the development of a national cancer database.
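The odds ratios with 95% confidence intervals reported in this abstract (e.g., OR = 1.45, 95% CI 0.60-3.51) are the standard output of a 2x2 genotype-by-disease table. As a hypothetical sketch, the following computes an odds ratio and a Wald confidence interval on the log scale; the counts are invented for illustration and are not the study's data.

```python
from math import log, exp, sqrt

# Odds ratio and Wald 95% CI from a 2x2 table.
# a, b = carriers among cases / controls
# c, d = non-carriers among cases / controls
def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf formula.
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Invented example: 20 of 46 cases and 10 of 31 controls carry the allele.
or_, lo, hi = odds_ratio_ci(20, 10, 26, 21)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

With cohorts as small as the study's (46 cases, 31 controls) the interval is wide, which is consistent with the abstract's CIs spanning 1 for the non-significant polymorphisms.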
