Similar Articles
 Found 20 similar articles (search time: 31 ms)
1.

Objective

To ascertain if outpatients with moderate chronic kidney disease (CKD) had their condition documented in their notes in the electronic health record (EHR).

Design

Outpatients with CKD were selected based on a reduced estimated glomerular filtration rate and their notes extracted from the Columbia University data warehouse. Two lexical-based classification tools (classifier and word-counter) were developed to identify documentation of CKD in electronic notes.

Measurements

The tools categorized patients' individual notes on the basis of the presence of CKD-related terms. Patients were categorized as appropriately documented if their notes contained reference to CKD when CKD was present.

Results

The sensitivities of the classifier and word-count methods were 95.4% and 99.8%, respectively. The specificity of both was 99.8%. Categorization of individual patients as appropriately documented was 96.9% accurate. Of 107 patients with manually verified moderate CKD, 32 (22%) lacked appropriate documentation. Patients whose CKD had not been appropriately documented were significantly less likely to be on renin-angiotensin system inhibitors or have urine protein quantified, and had the illness for half as long (15.1 vs 30.7 months; p<0.01) compared to patients with documentation.

Conclusion

Our studies show that lexical-based classification tools can accurately ascertain whether appropriate documentation of CKD is present in an EHR. Using this method, we demonstrated under-documentation of patients with moderate CKD. Under-documented patients were less likely to receive CKD guideline-recommended care. A tool that prompts providers to document CKD might shorten the time to implementing guideline-based recommendations.
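The word-counter approach described in this abstract can be sketched in a few lines. The term list below is hypothetical, since the abstract does not give the study's actual lexicon:

```python
import re

# Hypothetical CKD lexicon for illustration; the study's actual term list
# is not given in the abstract.
CKD_TERMS = [
    "chronic kidney disease",
    "ckd",
    "chronic renal insufficiency",
]

def note_mentions_ckd(note):
    """Word-counter style check: does any CKD-related term occur in the note?"""
    text = note.lower()
    return any(re.search(r"\b" + re.escape(term) + r"\b", text) for term in CKD_TERMS)

def appropriately_documented(notes):
    """A patient counts as appropriately documented if any note mentions CKD."""
    return any(note_mentions_ckd(n) for n in notes)
```

Under this scheme, a patient whose eGFR indicated moderate CKD but whose notes never triggered a match would be flagged as under-documented.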

2.
3.

Aims

To audit the safety of differing protocol‐driven early‐discharge policies, from two sites, for low‐risk acute upper gastrointestinal (GI) bleeding and determine if default early (<24 h) in‐patient endoscopy is necessary.

Methods

All patients with low‐risk acute upper GI bleeding presenting to two separate hospital sites in Leeds from August 2002 to March 2005 were identified. Both hospitals operate nurse‐led process‐driven protocols for discharge within 24 h, but only one includes default endoscopy. Relevant information was obtained from patients' notes, patient administration systems, discharge letters and endoscopy records.

Results

120 patients were admitted to site A and 74 to site B. Median length of stay on the clinical decisions unit was 12.6 h at site A and 9.4 h at site B (p = 0.045). Oesophagogastroduodenoscopy was performed on 89/120 (74%) patients at site A compared with only 7/74 (9%) at site B (p<0.001). Six of 120 (5%) patients from site A were admitted to hospital for further observation compared with 6/74 (8%) from site B (p = 0.38). Of the remaining patients, all were discharged within 24 h, and 8/114 (7%) at site A vs 17/68 (25%) at site B were given hospital clinic follow‐up (p<0.001). None of the 194 patients had further bleeding or complications within 30 days.

Conclusions

Patients admitted with low‐risk acute upper GI bleeding can be managed safely by a nurse‐led process‐driven protocol, based on readily available clinical and laboratory variables, with early discharge (<24 h). Avoiding in‐patient endoscopy appears to be safe, but at the price of greater clinic follow‐up.

4.

Background

Failure or delay in diagnosis is a common preventable source of error. The authors sought to determine the frequency with which high-information clinical findings (HIFs) suggestive of a high-risk diagnosis (HRD) appear in the medical record before HRD documentation.

Methods

A knowledge base from a diagnostic decision support system was used to identify HIFs for selected HRDs: lumbar disc disease, myocardial infarction, appendicitis, and colon, breast, lung, ovarian and bladder carcinomas. Two physicians reviewed at least 20 patient records retrieved from a research patient data registry for each of these eight HRDs and for age- and gender-compatible controls. Records were searched for HIFs in visit notes that were created before the HRD was established in the electronic record and in general medical visit notes for controls.

Results

25% of records reviewed (61/243) contained HIFs in notes before the HRD was established. The mean duration between HIFs first occurring in the record and time of diagnosis ranged from 19 days for breast cancer to 2 years for bladder cancer. In three of the eight HRDs, HIFs were much less likely in control patients without the HRD.

Conclusions

In many records of patients with an HRD, HIFs were present before the HRD was established. Reasons for delay include non-compliance with recommended follow-up, unusual presentation of a disease, and system errors (eg, lack of laboratory follow-up). The presence of HIFs in clinical records suggests a potential role for the integration of diagnostic decision support into the clinical workflow to provide reminder alerts to improve the diagnostic focus.

5.

Objective

To assess intensive care unit (ICU) nurses' acceptance of electronic health records (EHR) technology and examine the relationship between EHR design, implementation factors, and nurse acceptance.

Design

The authors analyzed data from two cross-sectional survey questionnaires distributed to nurses working in four ICUs at a northeastern US regional medical center, 3 months and 12 months after EHR implementation.

Measurements

Survey items were drawn from established instruments used to measure EHR acceptance and usability, and the usefulness of three EHR functionalities, specifically computerized provider order entry (CPOE), the electronic medication administration record (eMAR), and a nursing documentation flowsheet.

Results

On average, ICU nurses were more accepting of the EHR at 12 months as compared to 3 months. They also perceived the EHR as being more usable and both CPOE and eMAR as being more useful. Multivariate hierarchical modeling indicated that EHR usability and CPOE usefulness predicted EHR acceptance at both 3 and 12 months. At 3 months postimplementation, eMAR usefulness predicted EHR acceptance, but its effect disappeared at 12 months. Nursing flowsheet usefulness predicted EHR acceptance but only at 12 months.

Conclusion

As the push toward implementation of EHR technology continues, more hospitals will face issues related to acceptance of EHR technology by staff caring for critically ill patients. This research suggests that factors related to technology design have strong effects on acceptance, even 1 year following the EHR implementation.

6.

Introduction

Doctors in all specialties are involved in making “do not attempt resuscitation” (DNAR) decisions; this can be a difficult and challenging process. Guidelines exist to provide an ethical and legal framework for the process and documentation of these decisions.

Objective

To audit the documentation of resuscitation decisions in a sample of medical inpatients from two district general hospitals.

Method

A retrospective case note audit of 50 medical inpatients in whom a DNAR decision had been made (28 from hospital 1, 22 from hospital 2).

Results

Average age was 78.9 years (48% male, 52% female). In both hospitals DNAR decisions were usually discussed with relatives (84%), documented in nursing notes (100%) and made by senior team members (90%). Although the decision was usually dated and clearly documented (98%), abbreviations were commonly used in hospital 2 (45.5% vs 0% in hospital 1, p<0.05). Decisions regarding other treatment were not consistently documented (78.6% and 72.7%, respectively) and there was little evidence that decisions were reviewed (14.3% and 31.8%). The decision was rarely discussed with the patient (6% of all patients), although 66% of patients were not in a position to have a discussion.

Conclusions

Specific forms for recording DNAR decisions improve the clarity of documentation. Current recommendations to discuss resuscitation with patients are controversial and not followed. However, many patients are not in a position to hold a discussion when the need arises, and the guidelines should advocate early discussion during a hospital admission in appropriate patients, prior discussion with family, and/or wider use of advance directives.

Since the introduction of cardiopulmonary resuscitation (CPR) in the 1960s, its use has widened to many situations in which it has been shown to have little benefit. In the UK, there are joint guidelines on decisions relating to CPR from the British Medical Association, the UK Resuscitation Council and the Royal College of Nursing; these were updated in 2001, partly to incorporate the implications of the Human Rights Act 1998 (in particular, the right to life (article 2) and the right to hold opinions and receive information (article 10)).1,2 The guidelines acknowledge that it is essential to identify patients for whom CPR is inappropriate and those who competently refuse it.1 It is appropriate to consider a do not attempt resuscitation (DNAR) order where:
  • attempting CPR will not restart the patient's heart and breathing,
  • there is no clear benefit in restarting the patient's heart and breathing, or
  • the expected benefit is outweighed by the burdens.

7.

Aim

To evaluate left ventricular (LV) diastolic function dynamics, assessed by tissue Doppler imaging (TDI), in patients after acute myocardial infarction (AMI) who underwent combined coronary artery bypass grafting with LV aneurysmectomy (CABG + AE).

Methods

Forty patients after AMI underwent Doppler echocardiography (EchoCG) with TDI and M‐mode colour‐flow imaging before and at 3 and 12 months after CABG + AE. Mitral annulus (MA) TDI velocity indices were measured in four LV segments.

Results

Conventional transmitral diastolic Doppler indices remained unchanged after CABG + AE. TDI showed significant improvement in LV systolic function (systolic movement velocity S: 6.1±0.8, 7.4±1.2 and 6.9±1.3 cm/s before and at 3 and 12 months after the operation, respectively; p<0.01) and diastolic function (MA early diastolic movement velocity e': 7.3±2.1, 8.4±1.5 and 8.9±1.8 cm/s; ratio of transmitral early‐flow velocity (E) to MA early diastolic movement velocity (E/e'): 18.4±2.2, 12.3±1.8 and 11.5±2.3; ratio of E to diastolic flow propagation velocity (E/Vp): 3.1±0.45, 2.2±0.38 and 1.8±0.16 before and at 3 and 12 months after the operation, respectively; p<0.01).

Conclusions

Results of the study demonstrate significant improvement of LV diastolic function in patients after CABG + AE according to TDI, regardless of transmitral flow pattern. TDI is a more sensitive, preload-independent method of evaluating LV myocardial function.

8.

Background

Health records are essential for good health care. Their quality depends on accurate and prompt documentation of the care provided and regular analysis of content. This study assessed the quantitative properties of inpatient health records at the Federal Medical Centre, Bida, Nigeria.

Method

A retrospective study was carried out to assess the documentation of 780 paper-based health records of inpatients discharged in 2009.

Results

732 patient records were reviewed from the departments of obstetrics (45.90%), pediatrics (24.32%), and other specialties (29.78%). Documentation performance was very good (98.49%) for prompt recording of care within the first 24 h of admission, fair (58.80%) for proper entry of the patient unit number (unique identifier), and very poor (12.84%) for utilization of discharge summary forms. Surgery records always (100%) documented care promptly, obstetrics records were consistent (80.65%) in entering patients' names in notes, and the principal diagnosis was properly documented in all (100%) completed discharge summary forms in medicine. 454 (62.02%) folders were chronologically arranged, 456 (62.29%) were properly held together with file tags, and most (80.60%) discharged folders were reviewed, analyzed, and assigned appropriate code numbers.

Conclusions

Inadequacies were found in clinical documentation, especially gross underutilization of discharge summary forms. However, some forms were properly documented, suggesting that hospital healthcare providers possess the necessary skills for quality clinical documentation but lack the will. There is a need to institute a clinical documentation improvement program and promote quality clinical documentation among staff.

9.

Objective

To determine the effect of introducing an acute medical admissions unit (AMAU) on key quality, efficiency, and outcome indicators compared between medical teams, as assessed by funnel plots.

Methods

A retrospective analysis was performed of data relating to emergency medical patients admitted to St James's Hospital, Dublin, between 1 January 2002 and 31 December 2004, using data on discharges from hospital recorded in the hospital in‐patient enquiry system. The base year was 2002, during which patients were admitted to a variety of wards under the care of a named consultant physician. In 2003, two centrally located wards were reconfigured to function as an AMAU, and all emergency patients were admitted directly to this unit. The quality indicators examined between teams were length of stay (LOS) <30 days, LOS >30 days, and readmission rates.

Results

Introduction of the AMAU reduced overall hospital LOS from 7 days in 2002 to 5 days in 2003/04 (p<0.0001). There was no change in readmission rates between teams over the 3 year period, with all teams displaying expected variability within control (95%) limits. Overall, performance in LOS, both short term and long term, improved significantly (p<0.0001) and varied less between medical teams between 2002 and 2003/04.

Conclusions

Introduction of the AMAU improved performance among medical teams in LOS, both short term and long term, with no change in readmissions. Funnel plots are a powerful graphical technique for presenting quality performance indicator variation between teams over time.
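The funnel-plot control limits used in this kind of team comparison can be computed directly. This is a minimal sketch; the abstract does not specify the exact limit formula, so the 1.96 z-value and the normal approximation to the binomial are assumptions:

```python
from math import sqrt

def funnel_limits(overall_rate, n, z=1.96):
    """Approximate 95% control limits for a proportion at denominator n,
    using the normal approximation to the binomial."""
    se = sqrt(overall_rate * (1 - overall_rate) / n)
    return max(0.0, overall_rate - z * se), min(1.0, overall_rate + z * se)

def within_limits(events, n, overall_rate):
    """Is a team's observed rate inside the 95% funnel at its volume?"""
    lo, hi = funnel_limits(overall_rate, n)
    return lo <= events / n <= hi
```

Plotting each team's rate against its admission volume, with these limits drawn as curves that narrow toward the overall rate as volume grows, produces the funnel shape; teams outside the curves show more variability than chance alone would explain.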

10.

Objective

To compare the therapeutic effect (alleviation of vascular-type headache) and side effects of a slow intravenous metoclopramide infusion over 15 min with those of a bolus intravenous infusion over 2 min in patients with recent-onset vascular-type headache.

Material and methods

All adults treated with metoclopramide for vascular-type headache were eligible for entry into this randomised, double-blind clinical trial, which compared the effects of two different rates of intravenous infusion of metoclopramide over a period of 13 months at a university hospital emergency department. During the trial, side effects and headache scores were recorded at baseline (0 min) and at 5, 15, 30 and 60 min. Repeated-measures analysis of variance was used to compare the medication's efficacy and side effects.

Results

A total of 120 patients presenting to the emergency department met the inclusion criteria. Of these, 62 patients (51.7%) were given 10 mg metoclopramide as a slow intravenous infusion over 15 min (SIG group) and 58 patients (48.3%) were given 10 mg metoclopramide as a bolus intravenous infusion over 2 min (BIG group). 17 of the 58 patients (29.3%) in the BIG group and 4 of the 62 patients (6.5%) in the SIG group developed akathisia (p = 0.001). There were no significant differences between the BIG and SIG groups in mean headache scores (p = 0.34), and there were no adverse reactions in the study period. Metoclopramide successfully relieved the headache symptom(s) of patients in both groups.

Conclusion

Slowing the infusion rate of metoclopramide is an effective strategy for relieving headache while reducing the incidence of akathisia in patients with vascular-type headache.

11.

Objective

Advances in healthcare information technology have provided opportunities to present data in new, more effective ways. In this study, we designed a laboratory display that features small, data-dense graphics called sparklines, which have recently been promoted as effective representations of medical data but have not been well studied. The effect of this novel display on physicians' interpretation of data was investigated.

Design

Twelve physicians talked aloud as they assessed laboratory data from four patients in a pediatric intensive care unit with the graph display and with a conventional table display.

Measurements

Verbalizations were coded based on the abnormal values and trends identified for each lab variable. The correspondence of interpretations of variables in each display was evaluated, and patterns were investigated across participants. Assessment time was also analyzed.

Results

Physicians completed assessments significantly faster with the graphical display (3.6 min vs 4.4 min, p=0.042). When compared across displays, 37% of interpretations did not match. Graphs were more useful when the visual cues in tables did not provide trend information, while slightly abnormal values were easier to identify with tables.

Conclusions

Data presentation format can affect how physicians interpret laboratory data. Graphic displays have several advantages over numeric displays but are not always optimal. User, task and data characteristics should be considered when designing information displays.

12.

Background

Syndrome Z describes the interaction of obstructive sleep apnoea (OSA) with the metabolic syndrome.

Purpose of study

A pilot study to determine the prevalence of syndrome Z in a teaching hospital in Singapore.

Methods

Patients (age ⩾18 years) recruited for this prospective study had to satisfy three of the following five inclusion criteria: fasting glucose >6.1 mmol/l, blood pressure ⩾130/85 mm Hg, HDL cholesterol <1.04 mmol/l in men and <1.2 mmol/l in women, triglycerides ⩾1.7 mmol/l, and a waist circumference >102 cm in men and >88 cm in women. All subjects underwent standard overnight polysomnography. Overnight fasting glucose and lipid levels were measured and baseline anthropometric data recorded. All sleep studies were scored and reported by a sleep physician. OSA was deemed to be present if the respiratory disturbance index (RDI) was ⩾5, with mild, moderate and severe categories classified according to the Chicago criteria.

Results

There were 24 patients (19 male, five female), of whom 10 were Chinese, eight Malay, five of Indian origin, and one of other ethnicity. Mean age was 48±13.5 years, mean body mass index was 34.9±6.1 kg/m2 and mean waist circumference was 111.3±15.7 cm. 23 (95.8%) of the patients had OSA, with a mean RDI of 39.6±22.4 events/h; 15 patients (62.5%) were in the severe category. The five patients who fulfilled all five criteria for diagnosis of the metabolic syndrome had severe OSA.

Conclusion

The prevalence of OSA in our studied population exhibiting the metabolic syndrome is very high. Therefore, a polysomnogram should always be considered for this subset of patients.

13.

Background

There is an increased prevalence of coeliac disease (CD) among relatives of those with the disease.

Aims

To compare the clinical features in patients with CD detected via family screening with those in patients diagnosed routinely.

Methods

Information on screening was provided to relatives of patients. Those who wished to be screened were tested for endomysial and/or tissue transglutaminase antibodies. Duodenal biopsy was performed in those with positive antibodies. The clinical details of the relative screening group were compared with those of 105 patients diagnosed routinely.

Results

183 relatives underwent screening, of whom 32 had positive serology; 24 had histology diagnostic of CD, six had normal biopsies and two declined duodenal biopsy. Patients in the relative screening group were younger, with a median age of 33 years (range 17–72 years), compared with a median of 54 years (range 25–88 years) in the routine group. In the relative screening group there was a male preponderance (M:F ratio 16:8), anaemia at presentation was significantly less common (13% v 58%; p<0.001) and osteoporosis was less frequent (9% v 22%; p = 0.244) compared with the routine group. 65% of the relative screening group had gastrointestinal symptoms or anaemia at diagnosis.

Conclusions

Patients detected by family screening are younger with a male preponderance, but fewer had anaemia and osteoporosis.

14.

Objective

To evaluate the time to communicate laboratory results to health centers (HCs) between the e-Chasqui web-based information system and the pre-existing paper-based system.

Methods

Cluster randomized controlled trial in 78 HCs in Peru. In the intervention group, 12 HCs had web access to results via e-Chasqui (point-of-care HCs) and forwarded results to 17 peripheral HCs. In the control group, 22 point-of-care HCs received paper results directly and forwarded them to 27 peripheral HCs. Baseline data were collected for 15 months. Post-randomization data were collected for at least 2 years. Comparisons were made between intervention and control groups, stratified by point-of-care versus peripheral HCs.

Results

For point-of-care HCs, the intervention group took less time to receive drug susceptibility tests (DSTs) (median 9 vs 16 days, p<0.001) and culture results (4 vs 8 days, p<0.001) and had a lower proportion of ‘late’ DSTs taking >60 days to arrive (p<0.001) than the control. For peripheral HCs, the intervention group had similar communication times for DST (median 22 vs 19 days, p=0.30) and culture (10 vs 9 days, p=0.10) results, as well as proportion of ‘late’ DSTs (p=0.57) compared with the control.

Conclusions

Only point-of-care HCs with direct access to the e-Chasqui information system had reduced communication times and fewer results with delays of >2 months. Peripheral HCs had no benefits from the system. This suggests that health establishments should have point-of-care access to reap the benefits of electronic laboratory reporting.

15.

Objective

Renal transplantation has dramatically improved the survival rate of hemodialysis patients. However, with a growing proportion of marginal organs and improved immunosuppression, it is necessary to verify that the established allocation system, mostly based on human leukocyte antigen matching, still meets today's needs. The authors turn to machine-learning techniques to predict, from donor–recipient data, the estimated glomerular filtration rate (eGFR) of the recipient 1 year after transplantation.

Design

The patient's eGFR was predicted using donor–recipient characteristics available at the time of transplantation. Donors' data were obtained from Eurotransplant's database, while recipients' details were retrieved from the Charité Campus Virchow-Klinikum's database. A total of 707 renal transplantations from cadaveric donors were included.

Measurements

Two separate datasets were created, taking features with <10% missing values for one and <50% missing values for the other. Four established regressors were run on both datasets, with and without feature selection.

Results

The authors obtained a Pearson correlation coefficient between predicted and real eGFR (COR) of 0.48. The best model was a Gaussian support vector machine with recursive feature elimination on the more inclusive dataset. All results are available at http://transplant.molgen.mpg.de/.

Limitations

At present, missing values in the data must be imputed. The performance is not as high as hoped, but the dataset appears to be the main limiting factor.

Conclusions

Predicting the outcome is possible with the dataset at hand (COR=0.48). Valuable features include age and creatinine levels of the donor, as well as sex and weight of the recipient.
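The COR metric reported above is simply the Pearson correlation between predicted and observed eGFR, and is straightforward to compute. A self-contained sketch with illustrative numbers (the study's own predictions are not reproduced here):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Illustrative (not study) values: predicted vs observed eGFR, ml/min/1.73 m2
predicted = [52, 61, 47, 70, 58]
observed = [49, 66, 40, 75, 60]
cor = pearson_r(predicted, observed)
```

A COR of 0.48, as reported, corresponds to a model that explains roughly a quarter of the variance in 1-year eGFR, consistent with the authors' view that the dataset limits performance.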

16.

Objectives

To determine the incidence and character of drink spiking in an urban population of patients within the UK presenting to an emergency department concerned they had consumed a deliberately contaminated drink.

Study design

Prospective case series determining the presence and quantity of sedative and illicit drugs, and ethanol in biological samples (blood and urine) obtained from consenting patients >18 years of age presenting to a large inner city London emergency department alleging they had consumed a spiked drink within the previous 12 h.

Results

Biological samples were obtained from 67 (blood) and 75 (urine) of 78 study participants. 82% of participants were female, mean age 24 years. Mean time from alleged exposure to biological sampling was 5.9 h (range 1–12 h). Ethanol was detected in 89.7% of participants. Mean serum ethanol concentration was 1.65 g/l (range 0.04–3.1 g/l); 60% of participants had a serum ethanol concentration associated with significant intoxication (>1.5 g/l). Illicit drugs were detected in 12 (15%) participants; 7 denied intentional exposure (3 methylenedioxymethamphetamine, 3 cannabis, 1 γ‐hydroxybutyrate). Medicinal drugs were detected in 13 participants; only 1 exposure was unexplained (benzodiazepine). Overall illicit or medicinal drugs of unexplained origin were detected in 8 (10%) participants. Unexplained sedative drug exposure was detected in only 2 (3%) participants.

Conclusions

Use of sedative drugs to spike drinks may not be as common as reported in the mainstream media. A large number of study participants had serum ethanol concentrations associated with significant intoxication; the source (personal over‐consumption or deliberate drink spiking) is unclear.

17.

Objective

To test the hypothesis that an acute increase in plasma homocysteine produced by methionine is associated with an acute increase in pulse wave velocity.

Design

A double-blind, crossover, placebo-controlled design was used; pulse wave velocity, plasma homocysteine, total cholesterol:high-density lipoprotein ratio, plasma triglycerides, oxidised low-density lipoprotein cholesterol, apolipoproteins A1 and B, and C-reactive protein were measured between 12.5 and 20 hours after methionine loading or placebo.

Results

Between 12.5 and 20 hours after the methionine loading test, arterial pulse wave velocity showed no significant difference compared with placebo. At 12 hours after the methionine loading test, on a controlled diet, triglyceride concentration increased significantly by 32.6% (p<0.02) and the cholesterol:high-density lipoprotein ratio by 22.5% (p<0.05) compared with placebo. Simultaneously, systolic blood pressure increased significantly by 4.9% (p<0.02).

Conclusion

In elderly volunteers, acute hyperhomocysteinaemia induced by methionine loading resulted in no overall significant delayed reduction in peripheral arterial distensibility. A significant deterioration in the lipid profile and an increase in blood pressure were seen during acute hyperhomocysteinaemia.

18.

Objective

Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval.

Materials and methods

A ‘learn by example’ approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance.

Results

Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks.

Discussion

With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was no single best configuration, supporting an iterative approach to model creation.

Conclusion

Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.

19.

Objective

To evaluate the utility of N terminal pro brain natriuretic peptide (NT‐proBNP) as a diagnostic marker for diastolic dysfunction or failure, systolic dysfunction, and significant valve disorders in patients over 75 years.

Design

Cohort study.

Setting

Outpatient echocardiography service in a district general hospital.

Participants

100 consecutive patients.

Main outcome measures

Sensitivity, specificity, positive predictive values, negative predictive values, and area under receiver operating characteristic curve for NT‐proBNP assay in the diagnosis of left ventricular diastolic dysfunction or failure, systolic dysfunction, and significant valve disorders.

Results

For the diagnosis of systolic dysfunction, an NT‐proBNP level of 424 pg/ml had a sensitivity of 96%, specificity of 45%, positive predictive value of 36%, and negative predictive value of 96%. The area under the curve was 0.71 (95% confidence interval 0.69 to 0.89). In valve heart disease, a level of 227 pg/ml had a sensitivity of 91%, specificity of 43%, positive predictive value of 40%, and negative predictive value of 92%. Patients with diastolic dysfunction/failure had lower plasma concentrations.

Conclusions

This study showed that NT‐proBNP had excellent negative predictive value for systolic dysfunction and significant valve disorders in very elderly patients. NT‐proBNP increased significantly in systolic dysfunction, valve heart disease, and atrial fibrillation. It is not useful in the diagnosis of diastolic dysfunction or diastolic heart failure by standard echocardiography criteria.
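The predictive values quoted above all follow from a standard 2×2 diagnostic table. A minimal sketch with hypothetical counts (the abstract does not report the underlying table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 diagnostic table."""
    return {
        "sensitivity": tp / (tp + fn),  # disease present, test positive
        "specificity": tn / (tn + fp),  # disease absent, test negative
        "ppv": tp / (tp + fp),          # positive tests that are true positives
        "npv": tn / (tn + fn),          # negative tests that are true negatives
    }

# Hypothetical counts for illustration only
m = diagnostic_metrics(tp=90, fp=20, fn=10, tn=80)
```

The pattern in the study, high sensitivity and NPV with modest specificity and PPV, is typical of a rule-out test: a low NT-proBNP makes systolic dysfunction unlikely, while a high value needs echocardiographic confirmation.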

20.

Background

Electronic health record (EHR) users must regularly review large amounts of data in order to make informed clinical decisions, and such review is time-consuming and often overwhelming. Technologies like automated summarization tools, EHR search engines and natural language processing have been shown to help clinicians manage this information.

Objective

To develop a support vector machine (SVM)-based system for identifying EHR progress notes pertaining to diabetes, and to validate it at two institutions.

Materials and methods

We retrieved 2000 EHR progress notes from patients with diabetes at the Brigham and Women's Hospital (1000 for training and 1000 for testing) and another 1000 notes from the University of Texas Physicians (for validation). We manually annotated all notes and trained an SVM using a bag-of-words approach. We then applied the SVM to the testing and validation sets and evaluated its performance with the area under the curve (AUC) and F statistics.

Results

The model accurately identified diabetes-related notes in both the Brigham and Women's Hospital testing set (AUC=0.956, F=0.934) and the external University of Texas Faculty Physicians validation set (AUC=0.947, F=0.935).

Discussion

Overall, the model we developed was quite accurate. Furthermore, it generalized, without loss of accuracy, to another institution with a different EHR and a distinct patient and provider population.

Conclusions

It is possible to use an SVM-based classifier to identify EHR progress notes pertaining to diabetes, and the model generalizes well.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号