Similar Documents
20 similar documents retrieved.
1.
This paper illustrates how a microcomputer spreadsheet package can be used by epidemiologists to facilitate the computation of multiple logistic regression (MLR) probabilities, as well as odds ratios and associated confidence intervals, given the coefficients of the MLR model. By formatting a spreadsheet, data entry is greatly simplified, and computations are accomplished without any arithmetic manipulations on the part of the user. This approach makes it feasible for clerical support staff to assist in the computation of seemingly complex expressions. The increasing availability of microcomputers in clinical and research settings suggests that numerous analytic applications are amenable to this approach, thereby decreasing reliance on mainframe computers and desk-top calculators.
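The spreadsheet formulas themselves are not reproduced in the abstract; as a rough illustration of the same arithmetic, a minimal Python sketch (with hypothetical coefficients and standard errors, not values from the paper) might look like this:

```python
import math

# Hypothetical MLR coefficients (intercept + two risk factors) and standard errors;
# in practice these come from a previously fitted multiple logistic regression model.
coef = {"intercept": -2.30, "smoking": 0.85, "age_over_60": 0.40}
se   = {"smoking": 0.30, "age_over_60": 0.18}

def predicted_probability(covariates):
    """P(disease) = 1 / (1 + exp(-(b0 + b1*x1 + ...)))."""
    linear = coef["intercept"] + sum(coef[k] * v for k, v in covariates.items())
    return 1.0 / (1.0 + math.exp(-linear))

def odds_ratio_ci(term, z=1.96):
    """Odds ratio exp(b) with a 95% Wald confidence interval exp(b +/- z*SE)."""
    b, s = coef[term], se[term]
    return math.exp(b), (math.exp(b - z * s), math.exp(b + z * s))

print(predicted_probability({"smoking": 1, "age_over_60": 0}))
print(odds_ratio_ci("smoking"))
```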

2.
A Poisson regression model for chain binomial data on infectious diseases (cited 1 time: 0 self-citations, 1 by others)
Objective: To introduce the application of Poisson regression models to the analysis of infectious disease data with a chain structure. Methods: Using maximum likelihood, five Poisson regression models were fitted to common cold data from households of five, with "generation" entering the models as dummy variables and "number already infected" entering as a continuous indicator variable. Results: The models introduced here include the Greenwood and Reed-Frost chain binomial models as special cases, and the results from the Poisson regression models are usually very close to those of the corresponding chain binomial models. Conclusion: Poisson regression is a simpler and more practical way to handle and analyze chain binomial data on infectious diseases, and covariate effects can be analyzed at the same time when needed.
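As a hedged sketch of the kind of model described (not the paper's actual data or code), a Poisson regression with generation dummies, the current number of infectives as a covariate, and the number of susceptibles as a log offset could be fitted as follows; all numbers below are invented for illustration:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical aggregated household data: each row is one generation of one chain.
# cases      = new cases observed in that generation
# at_risk    = susceptibles exposed at the start of the generation (log offset)
# generation = chain generation number (entered as dummy variables below)
# infectives = number already infected / currently infectious (continuous term)
cases      = np.array([3, 2, 1, 4, 2, 0, 1, 1])
at_risk    = np.array([4, 3, 2, 4, 3, 3, 2, 1])
generation = np.array([1, 2, 3, 1, 2, 3, 2, 3])
infectives = np.array([1, 3, 2, 1, 4, 2, 1, 2])

# Design matrix: intercept, dummies for generations 2 and 3, and the infective count.
X = np.column_stack([
    np.ones_like(cases),
    (generation == 2).astype(float),
    (generation == 3).astype(float),
    infectives,
])

model = sm.GLM(cases, X, family=sm.families.Poisson(), offset=np.log(at_risk))
print(model.fit().summary())
```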

3.
Despite the increasing emphasis on computers and quantitative methods in health services programs, health services administration students are denied access to many of the most powerful tools of systems analysis, including discrete event simulation, because they lack the necessary background in computer programming, simulation methodology, and stochastic processes. This article presents an approach to the modeling of the growth and decline of population groups and their attributes that can be used by students who do not have the extensive quantitative background required to develop the usual discrete event simulation models. The underlying theory, which is based on the behavior of the expectation process of vector Galton-Watson branching processes, can be explained quite easily. An example is presented that uses an age and sex specific model of population growth to investigate policy questions related to the feasibility of the construction of a long-term care facility for a defined population group. Planning decisions are based on the growth and decline of the numbers of individuals in the various age and sex groups. Extensions of the basic methodology are possible that would include projections of the variance-covariance matrix of the population sizes for each year of the projection process. In addition, the model can be extended to include projections of the impact of infectious and communicable diseases on a defined population group together with the effect of categorical disease screening and control programs. Given the basic data utilized in the model, the implementation of the calculations required by the model can be made on modern microcomputer hardware using any of the standard spreadsheet programs.
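The expectation process referred to above evolves by repeated multiplication of the population vector by a mean (projection) matrix, which is exactly the calculation a spreadsheet would carry out. A minimal Python sketch with an assumed three-age-group matrix (not the article's data):

```python
import numpy as np

# Hypothetical mean (projection) matrix for three age groups of one sex:
# entry (i, j) = expected number of individuals in group i next year per
# individual in group j this year (births on the first row, ageing/survival below).
M = np.array([
    [0.00, 0.05, 0.02],   # expected births into the youngest group
    [0.95, 0.80, 0.00],   # survival/ageing into the middle group
    [0.00, 0.15, 0.90],   # survival/ageing into the oldest group
])

population = np.array([1200.0, 900.0, 400.0])   # assumed starting counts

# Expectation of a vector Galton-Watson process: E[X_{t+1}] = M @ E[X_t].
for year in range(1, 11):
    population = M @ population
    print(year, np.round(population, 1))
```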

4.
Microcomputers can be powerful teaching tools if educators learn to develop effective computer-assisted instruction (CAI). This paper reports on an allied health faculty development project that incorporated hands-on workshops and guided individual instruction led by a specialist in the educational uses of microcomputers. Faculty participants gained basic computer literacy skills and learned to assess the salient characteristics of quality software. They also learned specialized skills for designing their own CAI packages. The positive change in participants' knowledge about and attitude toward microcomputers as instructional tools was measured both subjectively by the authors and by a participants' self-report questionnaire. This project can serve as a model for helping allied health educators become computer literate and gain the skills necessary to evaluate and author quality computer-assisted instruction.

5.
Model-based standardization enables adjustment for confounding of a population-averaged exposure effect on an outcome. It requires either a model for the probability of the exposure conditional on the confounders (an exposure model) or a model for the expectation of the outcome conditional on the exposure and the confounders (an outcome model). The methodology can also be applied to estimate averaged exposure effects within categories of an effect modifier and to test whether these effects differ or not. Recently, we extended that methodology for use with complex survey data, to estimate the effects of disability status on cost barriers to health care within three age categories and to test for differences. We applied the methodology to data from the 2007 Florida Behavioral Risk Factor Surveillance System Survey (BRFSS). The exposure modeling and outcome modeling approaches yielded two contrasting sets of results. In the present paper, we develop and apply to the BRFSS example two doubly robust approaches to testing and estimating effect modification with complex survey data; these approaches require that only one of these two models be correctly specified. Furthermore, assuming that at least one of the models is correctly specified, we can use the doubly robust approaches to develop and apply goodness-of-fit tests for the exposure and outcome models. We compare the exposure modeling, outcome modeling, and doubly robust approaches in terms of a simulation study and the BRFSS example. Copyright © 2012 John Wiley & Sons, Ltd.
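As an illustration of the doubly robust idea (ignoring the survey weights and effect-modifier strata that the paper handles), a minimal augmented inverse-probability-weighted sketch on simulated data is given below; it is not the authors' estimator, only the generic construction that is consistent if either the exposure model or the outcome model is correctly specified:

```python
import numpy as np
import statsmodels.api as sm

def doubly_robust_means(y, a, X):
    """AIPW estimates of the standardized outcome means under exposure a=1 and a=0.

    y: binary outcome, a: 0/1 exposure, X: confounder design matrix (with intercept).
    Survey weights and effect-modifier strata are omitted for brevity."""
    # Exposure model: P(A=1 | X)
    ps = sm.GLM(a, X, family=sm.families.Binomial()).fit().fittedvalues
    # Outcome models: E[Y | A=a, X], fitted separately among exposed and unexposed
    m1 = sm.GLM(y[a == 1], X[a == 1], family=sm.families.Binomial()).fit().predict(X)
    m0 = sm.GLM(y[a == 0], X[a == 0], family=sm.families.Binomial()).fit().predict(X)
    # Augmented inverse-probability-weighted (doubly robust) means
    mu1 = np.mean(a * (y - m1) / ps + m1)
    mu0 = np.mean((1 - a) * (y - m0) / (1 - ps) + m0)
    return mu1, mu0

# Hypothetical data: 500 subjects, one confounder.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
X = np.column_stack([np.ones(500), x])
a = rng.binomial(1, 1 / (1 + np.exp(-0.5 * x)))
y = rng.binomial(1, 1 / (1 + np.exp(-(-1 + 0.8 * a + 0.5 * x))))
mu1, mu0 = doubly_robust_means(y, a, X)
print("standardized risk difference:", mu1 - mu0)
```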

6.
To derive meaningful conclusions in a changing medical setting, medical decision-support systems must represent and reason about the temporal nature of the clinical environments they attempt to model. Because all difficult medical problems have significant temporal features, designers of medical decision support systems must recognize the unique problems caused by representing and reasoning with temporal concepts. This report has three goals: 1) to describe a set of fundamental issues in creating and reasoning with computer models of a changing clinical environment, 2) to present a taxonomy for characterizing the temporal characteristics of computer models of temporal reasoning, and 3) to use this taxonomy to compare the models of time used in some implemented medical decision-support programs. From this examination, it is argued that computational models of time based on a single uniform representational or inferential method are limited by the expressive power of that method. Multiple modeling formalisms that express different temporal properties of the domain task and that work cooperatively are required to capture the subtlety and diversity of temporal features used in expert clinical problem solving. As an example of this approach, the author describes a program called TOPAZ that contains two temporal models that represent different temporal features of the clinical domain.

7.
Administrators have a unique and two-dimensional role in regard to the use of computers in occupational therapy. First, they must understand and advocate for appropriate purchase, accessibility, and application of computers for their overall occupational therapy program. Secondly, they must be adept in using the organizational computer data base and microcomputers in their administrative duties. To assist in the decision-making process for the occupational therapy administrator who feels foreign to computer systems and their functions, this paper identifies the administrative requirements of "computerizing" an occupational therapy program. Additionally, this paper describes the computer's contribution to specific administrative functions in occupational therapy.

8.
A study and improvement of the Reed-Frost model (cited 3 times: 0 self-citations, 3 by others)
The authors present an academic critique of the Reed-Frost model and construct an improved version of it. The improved model fully accounts for the generation of inapparent infections, their infectiousness, and their conversion to immunes. Applying this model, the measles epidemic in New England and a varicella outbreak in a nursery in Shanghai, China were simulated, with quite satisfactory results. Because the model contains the essential epidemic factors, it can be used to explain the epidemic process and to help epidemiologists understand the role of inapparent infection in the spread of infectious diseases.
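The improved model is not specified in detail in the abstract; the following is only a minimal stochastic Reed-Frost sketch extended with an assumed fraction of inapparent infections, to illustrate the kind of chain binomial simulation involved:

```python
import numpy as np

rng = np.random.default_rng(1)

def reed_frost_with_inapparent(S0, I0, p, q_inapparent, generations=15):
    """Stochastic Reed-Frost chain: each susceptible escapes every current infective
    independently with probability (1 - p); a fraction of new infections is inapparent
    but (in this sketch) equally infectious, and all infectives become immune after one
    generation.  The authors' actual improved model may differ in detail."""
    S, apparent, inapparent = S0, I0, 0
    history = []
    for t in range(generations):
        infectives = apparent + inapparent
        if infectives == 0 or S == 0:
            break
        p_infect = 1 - (1 - p) ** infectives          # Reed-Frost infection probability
        new_cases = rng.binomial(S, p_infect)
        inapparent = rng.binomial(new_cases, q_inapparent)
        apparent = new_cases - inapparent
        S -= new_cases
        history.append((t + 1, apparent, inapparent, S))
    return history

# Assumed parameters: 30 susceptibles, 1 introductory case,
# effective contact probability 0.05, 20% of infections inapparent.
for generation in reed_frost_with_inapparent(30, 1, 0.05, 0.20):
    print(generation)
```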

9.
Objective: To evaluate the effectiveness of emergency rubella vaccination and provide a basis for controlling rubella outbreaks. Methods: An epidemiological investigation was carried out on rubella outbreaks in 8 primary schools of a city. A Reed-Frost model of the outbreak was first built from data on one school where the outbreak ran its natural course; this model was then used to predict the theoretical number of rubella cases in the other 7 schools that received emergency vaccination, and thereby to estimate the preventive effect of emergency vaccination. Results: The theoretical number of overt rubella cases in the 7 schools was 1000, and emergency vaccination prevented 925 cases. Conclusion: Emergency rubella vaccination should be carried out as early as possible in primary schools where a rubella outbreak occurs, in order to limit the spread of the outbreak.

10.
The authors present an academic critique of the Reed-Frost model and construct an improved version of it. The improved model fully accounts for the generation of inapparent infections, their infectiousness, and their conversion to immunes. Applying this model, the measles epidemic in New England and a varicella outbreak in a nursery in Shanghai, China were simulated, with quite satisfactory results. The model contains the essential epidemic factors, so it can be used to explain the epidemic process and to help epidemiologists understand the role of inapparent infection in the spread of infectious diseases.

11.
This paper applies the Reed-Frost mathematical model to a measles outbreak in a closed setting, making a theoretical exploration of the susceptible children and other relevant epidemic factors. The fitted effective contact rate was P = 0.0065, with an excellent goodness of fit, allowing the outbreak to be evaluated.
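As a hedged illustration of how such an effective contact rate might be estimated (with invented chain data, not the outbreak data of the paper), the Reed-Frost per-generation likelihood can be maximized numerically:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

# Hypothetical observed chain: (susceptibles, infectives, new cases) per generation;
# these are illustrative numbers only.
chain = [(150, 1, 2), (148, 2, 3), (145, 3, 4), (141, 4, 3)]

def neg_log_likelihood(p):
    ll = 0.0
    for S, I, C in chain:
        p_infect = 1 - (1 - p) ** I          # Reed-Frost per-generation risk
        ll += binom.logpmf(C, S, p_infect)
    return -ll

result = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 0.5), method="bounded")
print("estimated effective contact probability:", result.x)
```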

12.
13.
J E Till. Health Physics, 1988, 55(2): 331-338
Assessing risk from manmade and naturally occurring radionuclides in the environment has long been of primary interest in radiation protection. Early investigations and decisions relied on direct measurements of radiation in environmental media; however, these techniques were inadequate for determining exposure to humans and biota from very low levels of radiation and in predicting exposures from future releases. The Plowshare Program in 1957 investigated the use of nuclear explosives for peaceful purposes and created an immediate need for predicting the dispersion and ultimate fate of radionuclides that might be vented to the atmosphere or enter groundwater and expose man. As a result, modeling the behavior of radionuclides in the outdoor environment became a field of vigorous interest, merging disciplines of mathematics, biology and physics, among others. Environmental models and, more specifically, models predicting radiation dose from radionuclides in the environment have evolved rapidly and have increased significantly in sophistication, complexity and scope. Application of these models is commonly known as radiological assessment. This presentation examines the current status, future direction and weaknesses of radiological assessment models used in making decisions about radiation. The trend in recent years has been toward more complex models, which has not necessarily improved the accuracy of dose estimates and, in certain cases, has had the opposite effect. The presentation also discusses the future of model development, with particular emphasis on simple techniques and the expanded use of microcomputers, which will make radiological assessment models widely available for risk assessment and for practical applications such as the design of efficient monitoring programs and engineering calculations on the clean-up of contaminated soils. The screening models developed by the National Council on Radiation Protection and Measurements are offered as an example of a powerful yet simple method for demonstrating compliance with standards. Finally, the paper reviews two major areas of challenge for the future: defining uncertainty associated with radiological assessment models and the potential for converting radiological assessment models for use with nonradioactive environmental pollutants such as chemicals. Modeling radionuclides in the environment is adopting a new perspective. Future models will be less complex than their predecessors and will be adaptable to a much broader range of users. Key challenges remain in applying the techniques used for radionuclides to nonradioactive pollutants.

14.
With the advent of dense single nucleotide polymorphism genotyping, population-based association studies have become the major tools for identifying human disease genes and for fine gene mapping of complex traits. We develop a genotype-based approach for association analysis of case-control studies of gene-environment interactions in the case when environmental factors are measured with error and genotype data are available on multiple genetic markers. To directly use the observed genotype data, we propose two genotype-based models: genotype effect and additive effect models. Our approach offers several advantages. First, the proposed risk functions can directly incorporate the observed genotype data while modeling the linkage disequilibrium information in the regression coefficients, thus eliminating the need to infer haplotype phase. Compared with the haplotype-based approach, an estimating procedure based on the proposed methods can be much simpler and significantly faster. In addition, there is no potential risk due to haplotype phase estimation. Further, by fitting the proposed models, it is possible to analyze the risk alleles/variants of complex diseases, including their dominant or additive effects. To model measurement error, we adopt the pseudo-likelihood method by Lobach et al. [2008]. Performance of the proposed method is examined using simulation experiments. An application of our method is illustrated using a population-based case-control study of the association between calcium intake and the risk of colorectal adenoma development.
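As a rough sketch of the additive-effect coding (without the measurement-error correction via the pseudo-likelihood of Lobach et al. that the paper actually uses), a logistic model with a gene-environment interaction term on simulated data might look like this:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000

# Hypothetical marker data: additive coding counts copies of the minor allele (0/1/2).
genotype = rng.binomial(2, 0.3, size=n)
environment = rng.normal(size=n)            # e.g. an exposure, assumed error-free here
logit = -1.0 + 0.4 * genotype + 0.2 * environment - 0.3 * genotype * environment
case = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Additive-effect model with a gene-environment interaction term.
X = np.column_stack([np.ones(n), genotype, environment, genotype * environment])
fit = sm.GLM(case, X, family=sm.families.Binomial()).fit()
print(fit.params)           # log odds ratios for G, E and G x E

# A genotype-effect model would instead enter the three genotypes as two dummy
# variables, e.g. columns (genotype == 1) and (genotype == 2), allowing dominance.
```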

15.
Many HIV/AIDS (acquired immunodeficiency syndrome) models have been developed to help our understanding of the dynamics and interrelationships of the determinants of HIV (human immunodeficiency virus) spread and/or to develop reliable estimates of the eventual extent of such spread. These models range from very simple to very complex. WHO has developed a simple model for short-term projections of AIDS, details of which are presented here along with results obtained using the model to estimate and project AIDS cases for the USA, sub-Saharan Africa, and south/south-east Asia. WHO has also developed, based on the model described in this paper, a computer program (Epi Model), which will enable the user to easily change the values of any of the variables required by the WHO model.

16.
Regression calibration has been described as a means of correcting effects of measurement error for normally distributed dietary variables. When foods are the items of interest, true distributions of intake are often positively skewed, may contain many zeroes, and are usually not described by well-known statistical distributions. The authors considered the validity of regression calibration assumptions where data are non-Gaussian. Such data (including many zeroes) were simulated, and use of the regression calibration algorithm was evaluated. An example used data from Adventist Health Study 2 (2002-2008). In this special situation, a linear calibration model does (as usual) at least approximately correct the parameter that captures the exposure-disease association in the "disease" model. Poor fit in the calibration model does not produce biased calibrated estimates when the "disease" model is linear, and it produces little bias in a nonlinear "disease" model if the model is approximately linear. Poor fit will adversely affect statistical power, but more complex linear calibration models can help here. The authors conclude that non-Gaussian data with many zeroes do not invalidate regression calibration. Irrespective of fit, linear regression calibration in this situation at least approximately corrects bias. More complex linear calibration equations that improve fit may increase power over that of uncalibrated regressions.
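A minimal sketch of the regression calibration algorithm on simulated skewed data with many zeroes (illustrative only, not the Adventist Health Study 2 analysis): fit a linear calibration model for the true exposure given the error-prone measure in a validation subset, replace the observed measure by its calibrated value, and fit the disease model on the calibrated exposure.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000

# Hypothetical skewed intake with many zeroes, plus a noisy "questionnaire" measure.
true_intake = np.where(rng.random(n) < 0.4, 0.0, rng.gamma(shape=1.5, scale=2.0, size=n))
observed = true_intake + rng.normal(scale=1.0, size=n)

# Outcome generated from the true intake (linear disease model, for simplicity).
y = 1.0 + 0.5 * true_intake + rng.normal(scale=2.0, size=n)

# Step 1: linear calibration model E[true | observed], fitted in a validation
# subsample where the true intake is known (here: the first 300 subjects).
v = slice(0, 300)
calib = sm.OLS(true_intake[v], sm.add_constant(observed[v])).fit()
calibrated = calib.predict(sm.add_constant(observed))

# Step 2: use the calibrated exposure in the disease model for everyone.
naive     = sm.OLS(y, sm.add_constant(observed)).fit()
corrected = sm.OLS(y, sm.add_constant(calibrated)).fit()
print("naive slope:", naive.params[1], " calibrated slope:", corrected.params[1])
```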

17.

Objective

To propose a more realistic model for disease cluster detection, through a modification of the spatial scan statistic to account simultaneously for inflated zeros and overdispersion.

Introduction

Spatial Scan Statistics [1] usually assume Poisson or Binomial distributed data, which is not adequate in many disease surveillance scenarios. For example, small areas distant from hospitals may exhibit a smaller number of cases than expected under those simple models. Also, underreporting may occur in underdeveloped regions, due to inefficient data collection or the difficulty of accessing remote sites. Those factors generate excess zero case counts or overdispersion, inducing a violation of the statistical model and also increasing the type I error (false alarms). Overdispersion occurs when the data variance is greater than that predicted by the model in use; since the Poisson model constrains the variance to equal the mean, accommodating overdispersion requires an extra parameter.

Methods

Tools like the Generalized Poisson (GP) and the Double Poisson [2] may be better options for this kind of problem, modeling the mean and variance separately so that each can easily be adjusted by covariates. When excess zeros occur, the Zero Inflated Poisson (ZIP) model is used, although ZIP's estimated parameters may be severely biased if the nonzero counts are overdispersed relative to the Poisson distribution. In this case the zero-inflated versions of the Generalized Poisson (ZIGP), Double Poisson (ZIDP) and Negative Binomial (ZINB) could be good alternatives for the joint modeling of excess zeros and overdispersion. On the one hand, Zero Inflated Poisson (ZIP) models were proposed using the spatial scan statistic to deal with the excess zeros [3]. On the other hand, another spatial scan statistic was based on a Poisson-Gamma mixture model for overdispersion [4]. In this work we present a model, based on the ZIDP model, which includes inflated zeros and overdispersion simultaneously. Let the parameter p indicate the zero inflation. Because the remaining parameters of the observed case map and the parameter p are not independent, the likelihood maximization process is not straightforward; it becomes even more complicated when we include covariates in the analysis. To solve this problem we introduce a vector of latent variables in order to factorize the likelihood, which facilitates the maximization process via the E-M (Expectation-Maximization) algorithm. We derive the formulas to maximize the likelihood iteratively, and implement a computer program using the E-M algorithm to estimate the parameters under the null and alternative hypotheses. The p-value is obtained via the Fast Double Bootstrap Test [5]. A sketch of the basic E-M step appears below.
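As a hedged illustration of the E-M idea for zero inflation (the plain ZIP building block only, without the Double Poisson dispersion parameter, covariates, or the scan statistic itself), the latent structural-zero indicators can be iterated as follows:

```python
import numpy as np

def zip_em(counts, max_iter=200, tol=1e-8):
    """EM for a zero-inflated Poisson: with probability p a count is a structural
    zero, otherwise Poisson(lam).  The latent variable z_i = 1 marks structural
    zeros; it factorizes the likelihood and makes each M-step closed form."""
    counts = np.asarray(counts, dtype=float)
    p, lam = 0.3, counts.mean() + 0.1          # crude starting values
    for _ in range(max_iter):
        # E-step: posterior probability that each observed zero is structural.
        z = np.where(counts == 0,
                     p / (p + (1 - p) * np.exp(-lam)),
                     0.0)
        # M-step: update the mixing weight and the Poisson mean.
        p_new = z.mean()
        lam_new = np.sum((1 - z) * counts) / np.sum(1 - z)
        if abs(p_new - p) + abs(lam_new - lam) < tol:
            break
        p, lam = p_new, lam_new
    return p, lam

# Hypothetical case counts with excess zeros.
rng = np.random.default_rng(4)
data = np.where(rng.random(500) < 0.35, 0, rng.poisson(2.5, size=500))
print(zip_em(data))
```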

Results

Numerical simulations are conducted to assess the effectiveness of the method. We present results for Hanseniasis surveillance in the Brazilian Amazon in 2010 using this technique. We obtain the most likely spatial clusters for the Poisson, ZIP, Poisson-Gamma mixture and ZIDP models and compare the results.

Conclusions

The Zero Inflated Double Poisson Spatial Scan Statistic for disease cluster detection incorporates the flexibility of previous models, accounting for inflated zeros and overdispersion simultaneously. The Hanseniasis case map, with its excess of zero case counts in many municipalities of the Brazilian Amazon and the presence of overdispersion, was a good benchmark for testing the ZIDP model. The results obtained are easier to understand than those of each of the previous spatial scan statistic models, the Zero Inflated Poisson (ZIP) model and the Poisson-Gamma mixture model for overdispersion, taken separately. The E-M algorithm and the Fast Double Bootstrap test are computationally efficient for this type of problem.

18.
Autoregressive and cross-lagged models have been widely used to understand the relationship between bivariate commensurate outcomes in social and behavioral sciences, but not much work has been carried out in modeling bivariate non-commensurate (e.g., mixed binary and continuous) outcomes simultaneously. We develop a likelihood-based methodology combining ordinary autoregressive and cross-lagged models with a shared subject-specific random effect in the mixed-model framework to model two correlated longitudinal non-commensurate outcomes. The estimates of the cross-lagged and the autoregressive effects from our model are shown to be consistent with smaller mean-squared error than the estimates from the univariate generalized linear models. Inclusion of the subject-specific random effects in the proposed model accounts for between-subject variability arising from the omitted and/or unobservable, but possibly explanatory, subject-level predictors. Our model is not restricted to the case with an equal number of events per subject, and it can be extended to different types of bivariate outcomes. We apply our model to an ecological momentary assessment study with complex dependence and sampling data structures. Specifically, we study the dependence between condom use and sexual satisfaction based on the data reported in a longitudinal study of sexually transmitted infections. We find a negative cross-lagged effect between these two outcomes and a positive autoregressive effect within each outcome. Copyright © 2017 John Wiley & Sons, Ltd.
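The abstract does not give the model equations; one plausible specification (an assumption, not the authors' exact model) for a binary outcome Y1 and a continuous outcome Y2 with a shared subject-level random effect is:

```latex
% One possible autoregressive/cross-lagged specification (assumed, not the paper's):
% Y_{1it} binary, Y_{2it} continuous, b_i a shared subject-level random effect.
\begin{align*}
\operatorname{logit}\Pr(Y_{1it}=1 \mid Y_{1i,t-1}, Y_{2i,t-1}, b_i)
  &= \alpha_0 + \alpha_1 Y_{1i,t-1} + \alpha_2 Y_{2i,t-1} + \lambda_1 b_i,\\
E(Y_{2it} \mid Y_{1i,t-1}, Y_{2i,t-1}, b_i)
  &= \beta_0 + \beta_1 Y_{2i,t-1} + \beta_2 Y_{1i,t-1} + \lambda_2 b_i,
  \qquad b_i \sim N(0, \sigma_b^2).
\end{align*}
```

Here alpha_1 and beta_1 are the autoregressive effects, alpha_2 and beta_2 the cross-lagged effects, and lambda_1, lambda_2 scale the shared random effect that induces correlation between the two outcome processes.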

19.
The objective of this communication is to introduce a conceptual framework for a study that applies a rigorous systems approach to rural disaster preparedness and planning. System Dynamics is a well-established computer-based simulation modeling methodology for analyzing complex social systems that are difficult to change and predict. This approach has been applied for decades to a wide variety of issues of healthcare and other types of service capacity and delivery, and more recently, to some issues of disaster planning and mitigation. The study will use the System Dynamics approach to create computer simulation models as "what-if" tools for disaster preparedness planners. We have recently applied the approach to the issue of hospital surge capacity, and have reached some preliminary conclusions--for example, on the question of where in the hospital to place supplementary nursing staff during a severe infectious disease outbreak--some of which we had not expected. Other hospital disaster preparedness issues well suited to System Dynamics analysis include sustaining employee competence and reducing turnover, coordination of medical care and public health resources, and hospital coordination with the wider community to address mass casualties. The approach may also be applied to preparedness issues for agencies other than hospitals, and could help to improve the interactions among all agencies represented in a community's local emergency planning committee. The simulation models will support an evidence-based approach to rural disaster planning, helping to tie empirical data to decision-making. Disaster planners will be able to simulate a wide variety of scenarios, learn responses to each and develop principles or best practices that apply to a broad spectrum of disaster scenarios. These skills and insights would improve public health practice and be of particular use in the promotion of injury and disease prevention programs and practices.
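As a toy illustration of the stock-and-flow bookkeeping underlying such "what-if" models (all rates invented; a real System Dynamics model adds feedback loops, delays, and many more stocks), a few lines of Python suffice:

```python
# Minimal stock-and-flow sketch of a hospital surge "what-if" scenario (assumed rates).
arrival_rate   = 12.0      # new patients per hour during the event
treatment_rate = 0.20      # fraction of waiting patients admitted per hour
discharge_rate = 0.05      # fraction of occupied beds freed per hour
beds_total     = 200

waiting, occupied = 0.0, 150.0
dt = 0.5                                   # hours per simulation step (Euler integration)
for step in range(int(72 / dt)):           # simulate 72 hours
    admissions = min(treatment_rate * waiting, (beds_total - occupied) / dt)
    discharges = discharge_rate * occupied
    waiting  += dt * (arrival_rate - admissions)
    occupied += dt * (admissions - discharges)
    if step % int(12 / dt) == 0:
        print(f"hour {step * dt:5.1f}: waiting {waiting:6.1f}, occupied {occupied:6.1f}")
```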

20.
Pre-school professionals need to be able to assess the role of microcomputers in education. Young children are able to interact with a microcomputer through a keyboard or other input devices and show by their interest and attention that they enjoy using one. There is some software of the Computer-Assisted-Learning type suitable for this age group, but in many cases, these programs do not use the potential of the computer as effectively as they could. Observations of children under 5 years using the LOGO language with simplified graphics commands show that this provides opportunities for cognitive development, stimulating problem solving and experimentation. Creativity and social and language skills are also likely to be promoted. Microcomputers can be useful for administration as well as providing valuable educational experiences in pre-school settings.
