Similar Articles
1.
2.
3.
In the past, taking a lateral radiograph of the mandible required combining a filming bed with an angled wooden box, a setup with two defects: the operation was overly complex, and it could not prevent the contralateral shoulder from blocking the X-ray beam. Through years of practice, the author developed a new way to take lateral radiographs of the mandible that is simpler, more convenient, and suitable for routine operation.

4.
Time-location sampling (TLS) is useful for collecting information on a hard-to-reach population (such as men who have sex with men [MSM]) by sampling locations where persons of interest can be found, and then sampling those who attend. These studies have typically been analyzed as a simple random sample (SRS) from the population of interest. If this population is the source population, as we assume here, such an analysis is likely to be biased, because it ignores possible associations between outcomes of interest and frequency of attendance at the locations sampled, and is likely to underestimate the uncertainty in the estimates, as a result of ignoring both the clustering within locations and the variation in the probability of sampling among members of the population who attend sampling locations. We propose that TLS data be analyzed as a two-stage sample survey using a simple weighting procedure based on the inverse of the approximate probability that a person was sampled and using sample survey analysis software to estimate the standard errors of estimates (to account for the effects of clustering within the first stage [locations] and variation in the weights). We use data from the Young Men's Survey Phase II, a study of MSM, to show that, compared with an analysis assuming a SRS, weighting can affect point prevalence estimates and estimates of associations and that weighting and clustering can substantially increase estimates of standard errors. We describe data on location attendance that would yield improved estimates of weights. We comment on the advantages and disadvantages of TLS and respondent-driven sampling.
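As a minimal sketch of the weighting idea (not the survey software the authors used), the snippet below weights each respondent by the inverse of his venue-attendance frequency, a rough proxy for his probability of being intercepted, and computes a venue-clustered standard error via an intercept-only weighted regression. All column names and values are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical TLS data: one row per respondent, with the venue (cluster)
# where he was recruited and his self-reported venue-attendance frequency.
df = pd.DataFrame({
    "venue": [1, 1, 1, 2, 2, 3, 3, 3, 3, 4],
    "visits_per_month": [2, 8, 4, 1, 6, 3, 3, 12, 2, 5],  # attendance frequency
    "outcome": [0, 1, 0, 0, 1, 0, 0, 1, 0, 0],            # binary outcome of interest
})

# A frequent attender is more likely to be sampled, so down-weight him:
df["w"] = 1.0 / df["visits_per_month"]

naive = df["outcome"].mean()                              # SRS-style estimate
weighted = np.average(df["outcome"], weights=df["w"])     # inverse-probability weighted
print(f"naive={naive:.3f}, weighted={weighted:.3f}")

# Intercept-only WLS reproduces the weighted mean; a cluster-robust covariance
# treats venues as the first-stage sampling units when estimating its SE.
X = np.ones((len(df), 1))
fit = sm.WLS(df["outcome"], X, weights=df["w"]).fit(
    cov_type="cluster", cov_kwds={"groups": df["venue"]})
print(f"weighted prevalence={fit.params[0]:.3f}, clustered SE={fit.bse[0]:.3f}")
```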

5.
Optimization of Cyclosporine Therapy with Predicted AUC in Renal Transplant Patients

6.
Food consumption surveys are performed in many countries. Comparison of results from those surveys across nations is difficult because of differences in methodological approaches. While consensus about the preferred methodology associated with national food consumption surveys is increasing, no inventory of methodological aspects across continents is available. The aims of the present review are (1) to develop a framework of key methodological elements related to national food consumption surveys, (2) to create an inventory of these properties for surveys performed in North America, South America, Asia and Australasia, and (3) to discuss and compare these methodological properties cross-continentally. A literature search was performed using a fixed set of search terms in different databases. The inventory was completed with all accessible information from the retrieved publications, and corresponding authors were asked to provide additional information where it was missing. Surveys from ten individual countries, originating from four continents, are listed in the inventory. The results are presented according to six major aspects of food consumption surveys. The most common dietary intake assessment method used in food consumption surveys worldwide is the 24-h dietary recall (24-HDR), occasionally administered repeatedly, mostly using interview software. Only three countries have incorporated their national food consumption surveys into continuous national health and nutrition examination surveys.

7.
8.
To explore the potential of an integrated outpatient electronic health record (EHR) for preconception health optimization, an automated case-finding EHR-derived algorithm was designed to identify women of child-bearing age having outpatient encounters in an 85-site integrated health system. The algorithm simultaneously cross-referenced multiple discrete data fields to identify selected preconception factors (obesity, hypertension, diabetes, teratogen use including ACE inhibitors, multivitamin supplementation, anemia, renal insufficiency, untreated sexually transmitted infection, HIV positivity, and tobacco, alcohol or illegal drug use). Surveys were mailed to a random sample of patients to obtain their self-reported health profiles for these same factors. Concordance was assessed between the algorithm output, survey results, and manual data abstraction. Between 8/2010 and 2/2012, 107,339 female outpatient visits were identified, from which 29,691 unique women were presumed to have child-bearing potential. Of these, 19,624 (66%) and 8,652 (29%) had 1 or ≥2 health factors, respectively, while only 1,415 (5%) had none. Using the patient survey results as a reference point, health-factor agreement was similar between the algorithm (85.8%) and the chart abstraction (87.2%) results. Incorrect or missing data entries in the EHR encounters were largely responsible for the discordances observed. Preconception screening using an automated algorithm in a system-wide EHR identified a large group of women with potentially modifiable preconception health conditions. The issue most responsible for limiting algorithm performance was incomplete point-of-care documentation. Accurate data capture during patient encounters should be a focus for quality improvement, so that novel applications of system-wide data mining can be reliably implemented.
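A minimal sketch of rule-based case-finding over discrete EHR fields is shown below, assuming hypothetical field names and illustrative clinical thresholds; the study's actual algorithm cross-referenced many more data elements than this.

```python
# Illustrative, partial list; the study's teratogen logic was broader.
ACE_INHIBITORS = {"lisinopril", "enalapril", "ramipril"}

def preconception_factors(patient: dict) -> set:
    """Flag selected preconception factors from one encounter record."""
    flags = set()
    if patient.get("bmi", 0) >= 30:                       # assumed obesity cutoff
        flags.add("obesity")
    if patient.get("systolic_bp", 0) >= 140 or patient.get("diastolic_bp", 0) >= 90:
        flags.add("hypertension")                          # assumed BP cutoffs
    if "diabetes" in patient.get("problem_list", []):
        flags.add("diabetes")
    if ACE_INHIBITORS & {m.lower() for m in patient.get("med_list", [])}:
        flags.add("teratogen_use")
    if patient.get("smoker"):
        flags.add("tobacco_use")
    return flags

# Example encounter record (entirely hypothetical):
pt = {"bmi": 32.1, "systolic_bp": 128, "diastolic_bp": 92,
      "problem_list": ["diabetes"], "med_list": ["Lisinopril"], "smoker": False}
print(preconception_factors(pt))
# -> {'obesity', 'hypertension', 'diabetes', 'teratogen_use'}
```

Note how incomplete point-of-care documentation degrades such an algorithm directly: a missing `bmi` or an empty `med_list` silently fails to flag a factor, which matches the discordance pattern the study reports.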

9.
The fatty acid content, total lipid, refractive index, and peroxide, iodine, acid and saponification values of Iranian linseed oil (Linum usitatissimum) were studied. To optimize extraction conditions, the oil was extracted with solvents (petroleum benzene and methanol–water–petroleum benzene) at 1:2, 1:3 and 1:4 ratios for 2, 5 and 8 h, and its fatty acid content, omega-3 content and extraction yield were determined. According to the statistical analysis, petroleum benzene at a ratio of 1:3 for 5 h was chosen for its higher fatty acid content, higher extraction yield, and economic feasibility. To preserve the ω-3 ingredients, oil with the specified characteristics, containing 46.8% ω-3, was kept under a nitrogen atmosphere at −30°C for 0, 7, 30, 60 and 90 days, and its peroxide value was determined. Statistical analysis showed a significant difference in the average peroxide value only during the first 7 days of storage, and its increase (8.30%) conformed to the international standard.

10.
11.
12.
13.
Congener-specific PCB analysis allows the use of TCDD-based toxic equivalency (TEQ) risk assessment approaches when analytical methods are sufficiently sensitive. Many efforts to analyze fish samples for PCB congeners report the majority of samples as non-detects; these data are of little use for human health risk assessment if the limits of analytical detection exceed levels of potential health concern. However, increasing analytical sensitivity is costly and technically difficult. An approach to assess analytical sensitivity needs for risk assessment by defining toxicological endpoints of concern and acceptable risk levels is presented. This framework was applied to assessment of potential PCB TEQ cancer risks to the general United States population and tribal consumers of Columbia River fish, but may be easily adjusted for other situations. A probabilistic model was used to calculate the necessary analytical sensitivity for PCB TEQ cancer risk assessment using the Environmental Protection Agency's new draft cancer risk slope factor for TCDD and fish consumption data. Desired levels of analytical sensitivity were estimated for the congener expected to contribute the most to PCB TEQ, PCB 126, and compared to limits of detection for various analytical methods. The financial and health value of methods with different levels of analytical sensitivity were compared using a value-of-information approach, which includes analytical cost and the cost of potential health outcomes, and a proposed risk assessment utility approach which considers the relative health protectiveness of analytical options non-monetarily. Sensitivity analyses indicate that average consumption rate, cancer risk slope factor choice, and knowledge of existing PCB contamination are important considerations for planning PCB congener analysis.
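The TEQ arithmetic and the back-calculation of a required detection limit can be sketched as follows. The TEFs are the published WHO values for dioxin-like PCBs, but the risk target, slope factor, ingestion rate, and body weight below are illustrative placeholders, not the study's inputs.

```python
# WHO 2005 toxic equivalency factors (TEFs); PCB 126 dominates most fish TEQs.
TEF = {"PCB77": 0.0001, "PCB126": 0.1, "PCB169": 0.03}

def teq(conc_mg_per_kg: dict) -> float:
    """TEQ = sum over congeners of concentration x TEF."""
    return sum(c * TEF[k] for k, c in conc_mg_per_kg.items())

# Back-calculate the fish concentration corresponding to a target cancer risk:
#   risk = C * IR * CSF / BW   =>   C_max = risk * BW / (IR * CSF)
# All four parameter values below are assumptions for illustration only.
target_risk = 1e-6   # acceptable excess lifetime cancer risk
BW = 70.0            # body weight, kg
IR = 0.0175          # fish ingestion rate, kg/day
CSF = 1.0e6          # illustrative TCDD-TEQ cancer slope factor, (mg/kg-day)^-1

C_max = target_risk * BW / (IR * CSF)   # max allowable TEQ conc, mg/kg fish
# Dividing by the PCB 126 TEF gives the detection limit that congener-specific
# analysis would need to achieve for that congener alone.
DL_needed = C_max / TEF["PCB126"]
print(f"max TEQ conc: {C_max:.2e} mg/kg; PCB126 DL needed: {DL_needed:.2e} mg/kg")
```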

14.
Five different extraction methods, namely hydrodistillation; solvent extraction using hexane, diethyl ether and chloroform; and supercritical fluid extraction (SFE) using supercritical carbon dioxide, were used to extract lemon oil from an inclusion complex of β-cyclodextrin and lemon oil. The extraction efficiency and the volatile profiles of 10 major selected volatiles in the extracted oil were compared using GC–MS. Volatile flavour components of the complex were successfully extracted by hydrodistillation under carefully controlled conditions. All three solvents (hexane, diethyl ether and chloroform) can be used to extract volatiles from the complex; however, hexane was the most efficient and easiest to handle, with an organic phase that was easy to separate. Chloroform was the least efficient in terms of phase separation. The SFE method was not successful, extracting only 33% of the total encapsulated oil after 7 h at 45°C and 210 kg/cm². The volatile profiles of the solvent-extracted oils were similar; the SFE profile was significantly different from the other four methods, and the hydrodistilled oil was also different, though closer to the solvent-extracted oils.

15.
Objectives: To qualitatively describe the adoption of strategies and the challenges experienced by intervention facilities participating in a study targeted at improving quality of care in nursing homes "in need of improvement", and to describe how staff use federal quality indicator/quality measure (QI/QM) scores and reports and quality improvement methods and activities, and how staff supported and sustained the changes recommended by their quality improvement teams. Design/setting/participants: A randomized, two-group, repeated-measures design was used to test a 2-year intervention for improving quality of care and resident outcomes in facilities "in need of improvement". The intervention group (n = 29) received an experimental multilevel intervention designed to help them: (1) use quality-improvement methods, (2) use team and group process for direct-care decision-making, (3) focus on accomplishing the basics of care, and (4) maintain more consistent nursing and administrative leadership committed to communication and active participation of staff in decision-making. Results: A qualitative analysis revealed a subgroup of homes likely to continue quality improvement activities, and readiness indicators of homes likely to improve: (1) a leadership team (nursing home administrator, director of nurses) interested in learning how to use their federal QI/QM reports as a foundation for improving resident care and outcomes; (2) one of the leaders being a "change champion" who makes sure that current QI/QM reports are consistently printed and shared monthly with each nursing unit; (3) leaders willing to involve all staff in the facility in educational activities to learn about the QI/QM process and the reports that show how their facility compares with others in the state and nation; (4) leaders willing to plan for and continuously educate new staff about the MDS and federal QI/QM reports and how to do quality improvement activities; (5) leaders willing to continuously involve all staff in quality improvement committee and team activities so they "own" the process and are responsible for change. Conclusions: Results of this qualitative analysis can help allocate expert nurse time to facilities that are actually ready to improve. Widespread adoption of this intervention is feasible and could be enabled by nursing home medical directors in collaborative practice with advanced practice nurses.

16.
OBJECTIVE: To assess the impact of changes in relative health maintenance organization (HMO) penetration on changes in the physician-to-population ratio in California counties when changes in the economic conditions in California counties relative to the U.S. average are taken into account. DATA SOURCES: Data on physicians who practiced in California at any time from 1988 to 1998 were obtained from the AMA Masterfile. The analysis was restricted to active, patient care physicians, excluding medical residents. Data on other covariates in the model were obtained from the Bureau of Economic Analysis, InterStudy, the Area Resource File, and the California state government. Data were merged using county FIPS codes. STUDY DESIGN: Changes in the physician-to-population ratio in California counties include the effects of both intrastate migration and interstate migration. A reduced-form model was estimated using the Arellano-Bond dynamic panel estimator. Economic conditions in California relative to the U.S. were measured as the ratio of county-level real per capita income to national-level real per capita income. Relative HMO penetration in California was measured as the ratio of county-level HMO penetration to HMO penetration in the U.S. Relative HMO penetration was instrumented using five identifying variables to address potential endogeneity. Omitted-variable bias was controlled for by first differencing the model. The model also incorporated eight other covariates that may be associated with the demand for physicians: the percentage of the population enrolled in Medicaid, beds in short-term hospitals per 100,000 population, the percentage of the population that is black, the percentage of the population that is Hispanic, the percentage of the population that is Asian, the percentage of the population that is below age 18, the percentage of the population that is aged 65 and older, and the percentage of the population that is new legal immigrants in a given year. All of the above variables were lagged one period. The lagged physician-to-population ratio was also included to control for the supply of physicians. Separate equations were estimated for primary care physicians and specialist physicians. PRINCIPAL FINDINGS: Changes in lagged relative HMO penetration are negatively associated with changes in specialist physicians per 100,000 population. However, this effect of HMO penetration is attenuated and at times reversed in areas where the magnitude of the difference in relative economic conditions is sufficiently large. We did not find any statistically significant effects for primary care physicians. CONCLUSIONS: Consistent with prior studies, we find that changes in physician supply are associated with changes in relative HMO penetration. Relative economic conditions are an important moderator of the effect of changes in relative HMO penetration on physician migration.
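The core data manipulation behind this design, lagging regressors one period and first-differencing the panel to sweep out time-invariant county effects, can be sketched as below on a synthetic panel. This is not the study's estimation code: full Arellano-Bond estimation additionally instruments the lagged differenced dependent variable with deeper lagged levels via GMM, which needs a dedicated implementation; all column names here are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic county-year panel standing in for the 1988-1998 California data.
rng = np.random.default_rng(0)
panel = pd.DataFrame({
    "county": np.repeat(np.arange(10), 11),
    "year": np.tile(np.arange(1988, 1999), 10),
    "phys_rate": rng.normal(200, 20, 110),    # physicians per 100,000
    "rel_hmo": rng.uniform(0.5, 2.0, 110),    # county/US HMO penetration ratio
    "rel_income": rng.uniform(0.8, 1.3, 110), # county/US per capita income ratio
}).sort_values(["county", "year"])

# Lag regressors (and the dependent variable) one period within each county.
for col in ["phys_rate", "rel_hmo", "rel_income"]:
    panel[f"{col}_lag"] = panel.groupby("county")[col].shift(1)

# First-difference within counties to remove fixed county effects.
diffed = panel.set_index(["county", "year"]).groupby("county").diff().dropna()

# OLS on the differenced equation with county-clustered errors; with a lagged
# dependent variable this is biased, which is the motivation for Arellano-Bond GMM.
y = diffed["phys_rate"]
X = sm.add_constant(diffed[["phys_rate_lag", "rel_hmo_lag", "rel_income_lag"]])
groups = diffed.index.get_level_values("county")
print(sm.OLS(y, X).fit(cov_type="cluster", cov_kwds={"groups": groups}).summary())
```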

17.
In their recent Health Services Research article titled "Squeezing the Balloon: Propensity Scores and Unmeasured Covariate Balance," Brooks and Ohsfeldt (2013) addressed an important topic on the balancing property of the propensity score (PS) with respect to unmeasured covariates. They concluded that PS methods that balance measured covariates between treated and untreated subjects exacerbate imbalance in unmeasured covariates that are unrelated to measured covariates. Furthermore, they emphasized that for PS algorithms, an imbalance on unmeasured covariates between treated and untreated subjects is a necessary condition to achieve balance on measured covariates between the groups. We argue that these conclusions are the result of their assumptions on the mechanism of treatment allocation. In addition, we discuss the underlying assumptions of PS methods, their advantages compared with multivariate regression methods, as well as the interpretation of the effect estimates from PS methods. The use of propensity score (PS) methods in observational studies of medical treatments to adjust for measured confounding has increased substantially during the last decade (Shah et al. 2005). The PS is defined as a subject's probability of treatment given his or her characteristics. For groups of subjects with the same PS, measured covariates that were used to construct the score tend to be balanced across treatment groups (Rosenbaum and Rubin 1983). However, unlike random assignment of treatments in a randomized trial, covariates that were not measured (and thus not included in the PS model) will not necessarily be balanced when conditioning on the PS. Hence, imbalances in unmeasured covariates are not addressed by PS methods. In a recent study, Brooks and Ohsfeldt assessed the balancing property of PSs with respect to unmeasured covariates (Brooks and Ohsfeldt 2013) and wondered why subjects with the same PS may receive different treatments. Essentially, they stated that for two subjects with the same PS, for example, a PS of 0.65 (i.e., a probability of 0.65 of receiving treatment), of whom one received treatment and the other did not, there must be a reason why one received treatment and the other did not. They argued that, apparently, this reason was not included in the PS model (thus being unmeasured covariates, potentially confounders). Thereafter, they concluded that PS methods balance measured confounders at the cost of exacerbating any imbalance in unmeasured covariates that are independent of the measured covariates. Further extended, if these unmeasured covariates are confounders (related to both treatment and outcome), PS methods can exacerbate the bias in treatment effect estimates. We do not agree with their main conclusions for reasons outlined below. In the following paragraphs, we focus on three topics: (1) the assumptions underlying PS methods; (2) the conceptual advantage of PS methods in contrast to classical regression techniques; and (3) the estimand (treatment effect estimate) obtained using multivariable regression and the different PS approaches.
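A minimal sketch of the balancing property under discussion follows: on synthetic data where treatment depends only on measured covariates, subjects with the same estimated PS still receive different treatments (assignment is stochastic given the PS), yet measured covariates balance within PS strata. Nothing in the construction touches covariates omitted from the model.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Synthetic observational data: treatment probability driven by x1 and x2.
rng = np.random.default_rng(1)
n = 5000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
ps_true = 1.0 / (1.0 + np.exp(-(0.8 * x1 - 0.5 * x2)))
treat = rng.binomial(1, ps_true)      # same PS, different realized treatments
X = np.column_stack([x1, x2])

# The PS is the fitted probability of treatment given measured covariates.
ps = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]

# Within PS quintiles, the treated and untreated means of x1 are close,
# illustrating balance on the measured covariate used to build the score.
df = pd.DataFrame({"x1": x1, "treat": treat,
                   "stratum": pd.qcut(ps, 5, labels=False)})
print(df.groupby(["stratum", "treat"])["x1"].mean().unstack())
```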

18.
Scientific evidence shows that the composition, baking methods, and types of bread can have health impacts. Bread, as a major part of the Iranian diet, has significant potential as a target for health promotion. Healthy Food for Healthy Communities (HFHC) was a project of the Isfahan Healthy Heart Program (IHHP), consisting of a wide variety of strategies, such as the Healthy Bread (HB) Initiative. The HB Initiative was designed to improve the behaviour of both producers and consumers, mainly aiming at making high-fibre, low-salt bread, eliminating the use of baking soda, providing enough rest time for the dough before baking (at least one hour), and allowing enough baking time (at least one minute in the oven). A workshop was held for volunteer bakers, and a baker-to-baker training protocol under direct supervision was designed for future volunteers. The Cereal Organization was persuaded to provide less refined flour containing more bran. Health messages in support of the new breads were disseminated by the media and at bakeries by health professionals. Evaluation of the HB Initiative was done using before-after assessments and population surveys. While HB was baked in 1 (0.01%) bakery at baseline, 402 (41%) bakeries in the intervention area had joined the HB Initiative by 2009. Soda was completely eliminated, and fibre significantly increased from 4±0.4 g% before the study to 12±0.6 g% after the intervention (p<0.001). Preparation and baking times increased remarkably. Wastage of bread decreased from 13±1.8 g% to 2±0.5 g% and was cited by consumers as the most important advantage of this initiative. People living in Isfahan city consumed whole bread 6 times more often than those living in the reference area, Arak (p<0.001). The HB Initiative managed to add new breads as a healthy choice compatible with local dishes and provided a model for solving the long-standing problems of bread. It used various health promotion approaches but was most consistent with Beattie's model.

19.
INTRODUCTION: Current methods of assessing routes taken during active transport rely on subjective recall of trip length and barriers encountered en route, or on objective measures (Geographic Information Systems [GIS]) that may not represent actual travel patterns. This study examined the utility of Global Positioning Systems (GPS) to measure actual routes taken, compared with GIS-estimated travel distance and barriers encountered. METHODS: Comparisons between GPS and GIS routes were performed for 59 of 75 children who wore a GPS during the journey to school on a single occasion. Home and school addresses were reported by parents and geocoded in GIS. Children were provided with a GPS and were instructed to travel their normal route to and from school. Data were collected between March and November 2005 and exported to the GIS to determine travel distance, the number of busy streets crossed, and the ratio of busy streets to the total streets traveled on. Data analysis was performed in August 2006. RESULTS: No differences were observed between GPS-measured journeys to and from school on any of the examined variables. No differences were observed between GIS and GPS measures of travel distance (p>0.05). GIS-estimated travel routes crossed a significantly (p<0.05) higher number of busy streets (GIS: 1.68+/-0.12 vs GPS: 1.19+/-0.11) and traveled on a higher ratio of busy streets to total streets traveled on (GIS: 0.46+/-0.03 vs GPS: 0.35+/-0.04) (p<0.05) compared with GPS-measured actual travel routes. CONCLUSIONS: Geographic Information Systems provide estimates of travel distance similar to GPS-measured actual travel distances. Travel routes estimated by GIS are not representative of actual routes measured by GPS, which indicates that GIS may not provide an accurate estimate of barriers encountered. The continued use of GPS in active transport research is encouraged.
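Travel distance from a GPS track is typically computed by summing great-circle distances between consecutive fixes. A minimal sketch using the haversine formula follows; the track coordinates are hypothetical, and the study's actual processing was done inside the GIS.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def route_distance_m(track):
    """Sum segment distances over an ordered list of (lat, lon) GPS fixes."""
    return sum(haversine_m(*a, *b) for a, b in zip(track, track[1:]))

# Hypothetical journey-to-school track (only a few fixes shown):
track = [(43.6500, -79.3800), (43.6512, -79.3790), (43.6525, -79.3781)]
print(f"{route_distance_m(track):.0f} m")
```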

20.

Introduction

Many epidemiological methods for analysing follow-up studies require the calculation of rates based on accumulating person-time and events, stratified by various factors. Managing this stratification and accumulation is often the most difficult aspect of this type of analysis.

Tutorial

We provide a tutorial on accumulating person-time and events, stratified by various factors, i.e., creating event-time tables. We show how to efficiently generate event-time tables for many different outcomes simultaneously. We also provide a new vocabulary to characterise and differentiate time-varying factors. The tutorial is focused on using a SAS macro to perform most of the common tasks in the creation of event-time tables. All the most common types of time-varying covariates can be generated and categorised by the macro. It can also provide output suitable for other types of survival analysis (e.g. Cox regression). The aim of our methodology is to support the creation of bug-free, readable, efficient, capable and easily modified programs for making event-time tables. We briefly compare analyses based on event-time tables with Cox regression and nested case-control studies for the analysis of follow-up data.
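The core accumulation step, splitting each subject's follow-up across strata and tallying person-years and events per cell, can be illustrated with a minimal sketch. The paper's tool is a SAS macro handling many outcomes and time-varying factors at once; the Python below shows only the idea for a single factor (age band), with illustrative cut points and made-up subjects.

```python
from datetime import date

AGE_BANDS = [(0, 40), (40, 50), (50, 60), (60, 200)]  # illustrative cut points

def tabulate(subjects):
    """subjects: iterable of (birth, entry, exit, event_age_or_None).
    Returns person-years and event counts per age band."""
    table = {band: {"py": 0.0, "events": 0} for band in AGE_BANDS}
    for birth, entry, exit_, event_age in subjects:
        age_in = (entry - birth).days / 365.25   # age at entry
        age_out = (exit_ - birth).days / 365.25  # age at exit/censoring
        for lo, hi in AGE_BANDS:
            # Person-time contributed to this band is the overlap of
            # [age_in, age_out) with [lo, hi).
            table[(lo, hi)]["py"] += max(0.0, min(age_out, hi) - max(age_in, lo))
            if event_age is not None and lo <= event_age < hi:
                table[(lo, hi)]["events"] += 1
    return table

subjects = [
    (date(1950, 6, 1), date(1995, 1, 1), date(2005, 1, 1), 52.3),   # event at 52.3
    (date(1962, 3, 15), date(1995, 1, 1), date(2005, 1, 1), None),  # censored
]
for band, cell in tabulate(subjects).items():
    rate = cell["events"] / cell["py"] if cell["py"] else float("nan")
    print(band, f"py={cell['py']:.1f}", f"events={cell['events']}", f"rate={rate:.4f}")
```

The resulting table of stratified rates is exactly the input needed for Poisson-regression-style analysis, which is what makes this representation an alternative to Cox regression or nested case-control sampling for follow-up data.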

Conclusion

Anyone working with time-varying covariates, particularly from large detailed person-time data sets, would gain from having these methods in their programming toolkit.
