Related Articles
20 related articles found.
1.
2.
In cohort studies, binary outcomes are very often analyzed by logistic regression. However, it is well known that when the goal is to estimate a risk ratio, logistic regression is inappropriate if the outcome is common. In these cases, a log-binomial regression model is preferable. On the other hand, estimating the regression coefficients of the log-binomial model is difficult owing to the constraints that must be imposed on these coefficients. Bayesian methods allow a straightforward approach for log-binomial regression models and produce smaller mean squared errors in the estimation of risk ratios than frequentist methods, and the posterior inferences can be obtained using the software WinBUGS. However, the Markov chain Monte Carlo methods implemented in WinBUGS can lead to large Monte Carlo errors in the approximations to the posterior inferences because they produce correlated simulations, and the accuracy of the approximations is inversely related to this correlation. To reduce correlation and improve accuracy, we propose a reparameterization based on a Poisson model and a sampling algorithm coded in R. Copyright © 2015 John Wiley & Sons, Ltd.
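As a rough, self-contained illustration of fitting a log-binomial model by MCMC, the Python sketch below uses a random-walk Metropolis sampler on simulated data; it is not the authors' Poisson reparameterization or their WinBUGS/R code, and the cohort, flat priors, and tuning constants are assumptions made only for the example.

import numpy as np

rng = np.random.default_rng(1)

# Simulated cohort (illustrative): one binary exposure, true risk ratio = 1.5,
# baseline risk 0.4, so the outcome is common and a log-binomial model applies.
n = 2000
x = rng.binomial(1, 0.5, n)
p = 0.4 * 1.5 ** x
y = rng.binomial(1, p)

def log_post(b0, b1):
    """Log-posterior of a log-binomial model with flat priors.
    Returns -inf whenever the linear predictor implies a risk >= 1."""
    risk = np.exp(b0 + b1 * x)
    if np.any(risk >= 1):
        return -np.inf
    return np.sum(y * np.log(risk) + (1 - y) * np.log(1 - risk))

# Random-walk Metropolis; the step size 0.05 is an assumed tuning constant.
b = np.array([np.log(0.3), 0.0])
lp = log_post(*b)
draws = []
for _ in range(20000):
    prop = b + rng.normal(0, 0.05, 2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        b, lp = prop, lp_prop
    draws.append(b.copy())

rr = np.exp(np.array(draws)[5000:, 1])   # posterior draws of the risk ratio
print("posterior mean RR:", rr.mean(), " 95% CI:", np.percentile(rr, [2.5, 97.5]))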

3.
Outreach immunization services, in which health workers immunize children in their own communities, are indispensable for improving vaccine coverage in rural areas of developing countries. One of the challenges faced by these services is how to reduce high levels of vaccine wastage, in particular the open vial wastage (OVW) that results from the vaccine doses remaining in a vial after the time for safe use (counted from opening the vial) has elapsed. This wastage is highly dependent on the choice of vial size and the expected number of participants for which the outreach session is planned (i.e., the session size). The use of single-dose vials results in zero OVW, but it increases the vaccine purchase, transportation, and holding costs per dose compared with those resulting from larger vial sizes. The OVW also decreases when more people are immunized in a session. However, the actual number of people who show up to an outreach session in rural areas of developing countries depends largely on factors beyond the control of the immunization planners. This paper integrates a binary integer-programming model with a Monte Carlo simulation method to determine the choice of vial size and the optimal reorder point for implementing an (nQ, r, T) lot-sizing policy that provides the best tradeoff between procurement costs and wastage.
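A minimal Python sketch of the open-vial wastage mechanism described above; the Poisson attendance model, the planned session size, and the vial sizes are illustrative assumptions, not the paper's optimization model.

import numpy as np

rng = np.random.default_rng(0)

def simulate_ovw(vial_size, planned_session_size, n_sessions=10000):
    """Monte Carlo estimate of open-vial wastage per dose administered.
    Attendance at each outreach session is assumed to be Poisson around
    the planned session size (an illustrative assumption)."""
    attendance = rng.poisson(planned_session_size, n_sessions)
    vials_opened = np.ceil(attendance / vial_size)
    doses_wasted = vials_opened * vial_size - attendance
    return doses_wasted.sum() / max(attendance.sum(), 1)

for vial_size in (1, 5, 10, 20):
    rate = simulate_ovw(vial_size, planned_session_size=15)
    print(f"vial size {vial_size:2d}: wastage fraction {rate:.3f}")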

4.
5.
6.
In a multi-hospital surveillance system for detecting increases in congenital malformation rates, the performance of the set technique applied individually to hospitals and of the cumulative sum (Cusum) technique applied to the aggregate of hospitals has been compared using a Monte Carlo method. The Cusum technique appears to be both more sensitive to real increases and less likely to issue false alarms.
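A minimal Python sketch of a Cusum detector applied to hospital-aggregated malformation counts, in the spirit of the comparison above; the baseline rate, the change point, and the reference value k and decision limit h are values assumed only for illustration.

import numpy as np

rng = np.random.default_rng(2)

# Illustrative data: 10 hospitals, baseline malformation rate doubles at week 60.
weeks, hospitals = 120, 10
baseline_mean = 0.3                       # expected malformations per hospital per week
counts = rng.poisson(baseline_mean, (weeks, hospitals))
counts[60:] = rng.poisson(2 * baseline_mean, (weeks - 60, hospitals))

# Cusum on the aggregate of hospitals: S_t = max(0, S_{t-1} + x_t - k), alarm when S_t > h.
agg = counts.sum(axis=1)
k, h = hospitals * baseline_mean + 1.0, 5.0   # assumed reference value and decision limit
s, alarm_week = 0.0, None
for t, x in enumerate(agg):
    s = max(0.0, s + x - k)
    if s > h:
        alarm_week = t
        break
print("Cusum alarm at week:", alarm_week)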

7.
Eberly LE, Carlin BP. Statistics in Medicine 2000;19(17-18):2279-2294.
The marked increase in popularity of Bayesian methods in statistical practice over the last decade owes much to the simultaneous development of Markov chain Monte Carlo (MCMC) methods for the evaluation of requisite posterior distributions. However, along with this increase in computing power has come the temptation to fit models larger than the data can readily support, meaning that often the propriety of the posterior distributions for certain parameters depends on the propriety of the associated prior distributions. An important example arises in spatial modelling, wherein separate random effects for capturing unstructured heterogeneity and spatial clustering are of substantive interest, even though only their sum is well identified by the data. Increasing the informative content of the associated prior distributions offers an obvious remedy, but one that hampers parameter interpretability and may also significantly slow the convergence of the MCMC algorithm. In this paper we investigate the relationship among identifiability, Bayesian learning and MCMC convergence rates for a common class of spatial models, in order to provide guidance for prior selection and algorithm tuning. We are able to elucidate the key issues with relatively simple examples, and also illustrate the varying impacts of covariates, outliers and algorithm starting values on the resulting algorithms and posterior distributions.

8.
In this work, a Monte Carlo simulation in a novel approach is used to study the outbreak and spread dynamics of the new COVID-19 pandemic. In particular, our goal was to generate epidemiological data based on the natural transmission mechanism of this disease, assuming random interactions of a large but finite number of individuals over very short distance ranges. The simulation also takes into account the stochastic character of the individuals in a finite population and given population densities. In addition, we include in the simulation appropriate statistical distributions for the parameters characterizing this disease. An important outcome of our work, besides the generated epidemic curves, is a methodology for determining the effective reproductive number during the main part of the daily new cases of the epidemic. Since this quantity constitutes a fundamental parameter of SIR-based epidemic models, we also studied how it is affected by small variations of the incubation time and the crucial distance distributions and, furthermore, by the degree of quarantine measures. Finally, we compare our qualitative results with those of selected real epidemiological data.
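The toy agent-based Monte Carlo below illustrates the kind of mechanism-level simulation described (random short-range contacts in a finite population, with a crude effective reproductive number read off the simulated daily cases); the population size, contact rate, transmission probability, and recovery time are all assumed values, not the paper's fitted distributions.

import numpy as np

rng = np.random.default_rng(3)

# Illustrative parameters for a toy outbreak in a finite population.
N, days = 5000, 120
p_transmit = 0.05            # transmission probability per close contact (assumed)
contacts_per_day = 8         # mean close contacts per infectious person per day (assumed)
recovery_days = 10           # assumed infectious period

state = np.zeros(N, dtype=int)          # 0 susceptible, 1 infectious, 2 recovered
days_infected = np.zeros(N, dtype=int)
state[rng.choice(N, 5, replace=False)] = 1

new_cases = []
for _ in range(days):
    infectious = np.flatnonzero(state == 1)
    n_contacts = rng.poisson(contacts_per_day, infectious.size)
    todays = 0
    for i, c in zip(infectious, n_contacts):
        partners = rng.integers(0, N, c)
        hit = np.unique(partners[(state[partners] == 0) & (rng.random(c) < p_transmit)])
        state[hit] = 1
        todays += hit.size
    days_infected[state == 1] += 1
    state[(state == 1) & (days_infected >= recovery_days)] = 2
    new_cases.append(todays)

# Crude effective reproductive number around the epidemic peak.
peak = int(np.argmax(new_cases))
before = sum(new_cases[max(peak - recovery_days, 0):peak])
after = sum(new_cases[peak:peak + recovery_days])
print("peak day:", peak, " crude R_eff near peak:", round(after / max(before, 1), 2))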

9.
In 2009 in the United States, breast cancer was the most common cancer in women, and colorectal cancer was the third most common cancer in both men and women. Currently, over 40% of these cancers are diagnosed at an advanced stage, which results in higher morbidity and mortality than would occur with optimal cancer screening utilization. To provide information that might improve these cancer outcomes, we use spatial analysis to answer questions related to both Why and Where disparities in late-stage cancer diagnoses are observed. In examining Why, we include state-level characteristics reflecting states' cancer control planning, insurance markets, and managed care environments to help model the spatial heterogeneity from place to place. To answer questions related to Where disparities are observed, we generate county-level predictions of late-stage cancer rates from a random-intercept multilevel model estimated on the population data from 11 pooled SEER Registries. The findings allow for comparisons across states that reveal logical starting points for a national effort to control cancer.

10.
Advances in marker technology have made a dense marker map a reality. If each marker is considered separately, and separate tests for association with a disease gene are performed, then multiple testing becomes an issue. A common solution uses a Bonferroni correction to account for multiple tests performed. However, with dense marker maps, neighboring markers are tightly linked and may have associated alleles; thus tests at nearby marker loci may not be independent. When alleles at different marker loci are associated, the Bonferroni correction may lead to a conservative test, and hence a power loss. As an alternative, for tests of association that use family data, we propose a Monte Carlo procedure that provides a global assessment of significance. We examine the case of tightly linked markers with varying amounts of association between them. Using computer simulations, we study a family-based test for association (the transmission/disequilibrium test), and compare its power when either the Bonferroni or Monte Carlo procedure is used to determine significance. Our results show that when the alleles at different marker loci are not associated, using either procedure results in tests with similar power. However, when alleles at linked markers are associated, the test using the Monte Carlo procedure is more powerful than the test using the Bonferroni procedure. This proposed Monte Carlo procedure can be applied whenever it is suspected that markers examined have high amounts of association, or as a general approach to ensure appropriate significance levels and optimal power.
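A small Python/NumPy sketch of the contrast drawn above: a Monte Carlo (max-statistic) global significance threshold versus a Bonferroni threshold for correlated tests at tightly linked markers. The number of markers, the correlation, and the multivariate-normal null are illustrative assumptions, not the family-based transmission/disequilibrium test of the paper.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Illustrative setup: 50 tightly linked markers with correlated test statistics.
m, rho, alpha = 50, 0.8, 0.05
cov = rho * np.ones((m, m)) + (1 - rho) * np.eye(m)

# Monte Carlo null distribution of the maximum |z| across markers,
# preserving the correlation structure (the idea behind a global assessment).
null_max = np.abs(rng.multivariate_normal(np.zeros(m), cov, 20000)).max(axis=1)
mc_threshold = np.quantile(null_max, 1 - alpha)

bonferroni_threshold = stats.norm.ppf(1 - alpha / (2 * m))
print(f"Monte Carlo |z| threshold: {mc_threshold:.2f}")
print(f"Bonferroni |z| threshold:  {bonferroni_threshold:.2f}  (more conservative)")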

11.
A Monte Carlo evaluation of three statistical methods used in path analysis
Results of a Monte Carlo study to investigate the properties of three statistical methods used extensively in path analysis of family data are presented. All three methods are based on the maximum likelihood principle and involve the assumptions of multivariate normality and large-sample (asymptotic) statistical properties. The methods differ, however, in the specification of the likelihood function. Given a set of correlation estimates, method 1 maximizes the likelihood function under the stipulation that the estimates are independent. Method 2 differs from the former by allowing for covariances among the correlation estimators. Method 3 involves (direct) maximization of the likelihood function for the individual family observations, assuming multivariate normality for the vector of family observations. The Monte Carlo study investigated the validity of the test statistics and confidence intervals and evaluated the relative efficiency and bias of the parameter estimates based on 1,000 replications of each of several simulation conditions. The effects of violating the two basic assumptions, multivariate normality and asymptotic theory, were investigated by comparing results for non-normally vs normally distributed family data and for small vs large sample sizes. It is shown that method 3 provides valid statistical inferences under multivariate normality and that it is generally robust against minor departures from normality. Method 2 is also robust against minor deviations from normality, but it is sensitive to small sample sizes. Method 1 yields highly conservative test statistics under all conditions studied.

12.
Probabilistic sensitivity analysis (PSA) is required to account for uncertainty in cost-effectiveness calculations arising from health economic models. The simplest way to perform PSA in practice is by Monte Carlo methods, which involves running the model many times using randomly sampled values of the model inputs. However, this can be impractical when the economic model takes appreciable amounts of time to run. This situation arises, in particular, for patient-level simulation models (also known as micro-simulation or individual-level simulation models), where a single run of the model simulates the health care of many thousands of individual patients. The large number of patients required in each run to achieve accurate estimation of cost-effectiveness means that only a relatively small number of runs is possible. For this reason, it is often said that PSA is not practical for patient-level models. We develop a way to reduce the computational burden of Monte Carlo PSA for patient-level models, based on the algebra of analysis of variance. Methods are presented to estimate the mean and variance of the model output, with formulae for determining optimal sample sizes. The methods are simple to apply and will typically reduce the computational demand very substantially.
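The toy decomposition below illustrates, in Python, the general idea of separating parameter uncertainty from patient-level Monte Carlo noise in a two-level simulation; it is not the paper's analysis-of-variance formulae or sample-size expressions, and the model, variances, and sample sizes are assumptions chosen only to make the example run.

import numpy as np

rng = np.random.default_rng(5)

# Toy patient-level model: net benefit per patient depends on an uncertain
# parameter theta plus patient-level noise (purely illustrative).
def run_model(theta, n_patients):
    return rng.normal(theta, 5.0, n_patients).mean()

K, n = 200, 500                           # parameter draws (outer) and patients per run (inner)
thetas = rng.normal(10.0, 2.0, K)         # draws from the parameter-uncertainty distribution
outputs = np.array([run_model(t, n) for t in thetas])

# ANOVA-style decomposition: the variance of the run means mixes genuine
# parameter uncertainty with Monte Carlo noise from the finite patient sample.
total_var = outputs.var(ddof=1)
within_var = 5.0 ** 2 / n                 # patient-level variance / patients per run (assumed known here)
between_var = max(total_var - within_var, 0.0)
print(f"estimated parameter-uncertainty variance: {between_var:.3f} (true value: {2.0 ** 2})")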

13.
Objectives

To evaluate different non-continuous temperature-monitoring practices for detecting out-of-range temperatures (above or below the recommended range of 2–8 °C for refrigeration units), which are called excursions, within vaccine storage units.

Methods

Simulations were based on temperature data collected by 243 digital data loggers operated in vaccine storage units at health-care providers who participated in a CDC-sponsored continuous temperature monitoring pilot project from 2012 to 2015. In the primary analysis, we evaluated: (1) twice-daily current temperature readings without minimum and maximum readings (min/max), (2) twice-daily current temperature readings with once-daily min/max, and (3) twice-daily current temperature readings with twice-daily min/max.

Results

Recording current temperature twice daily without min/max resulted in the detection of 4.8–6.4% of the total number of temperature excursions. When min/max readings were introduced, the percentage of detected temperature excursions increased to 27.8–96.6% with once-daily min/max and to 34.8–96.7% with twice-daily min/max.

Conclusions

Including min/max readings improves the ability of a temperature monitoring practice to detect temperature excursions. No combination of the non-continuous temperature monitoring practices was able to consistently detect all simulated temperature excursions.
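A small Python sketch of the comparison reported above: a simulated refrigerator temperature trace with one short excursion, checked first with twice-daily spot readings and then with a daily min/max review. The trace, noise level, and excursion timing are invented for illustration and are not the pilot-project logger data.

import numpy as np

rng = np.random.default_rng(6)

# Illustrative 30-day refrigerator trace, one reading per 5 minutes,
# with a short overnight excursion above 8 degrees C on day 10.
per_day = 24 * 12
temps = rng.normal(5.0, 0.8, 30 * per_day)
temps[10 * per_day + 30 : 10 * per_day + 60] += 6.0   # 2.5-hour excursion

excursion_occurred = np.any((temps < 2) | (temps > 8))

# Twice-daily current readings (e.g., 8:00 and 16:00) can miss short excursions...
spot_idx = [d * per_day + h * 12 for d in range(30) for h in (8, 16)]
spot_detects = np.any((temps[spot_idx] < 2) | (temps[spot_idx] > 8))

# ...whereas a once-daily min/max review sees the extremes of the whole day.
daily = temps.reshape(30, per_day)
minmax_detects = np.any((daily.min(axis=1) < 2) | (daily.max(axis=1) > 8))

print("excursion occurred:", excursion_occurred)
print("detected by twice-daily spot readings:", spot_detects)
print("detected by once-daily min/max review:", minmax_detects)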


14.
Kim CH, Reece WD. Health Physics 2002;83(2):243-254.
In a steam generator channel head, it was not unusual to see radiation workers wearing as many as twelve dosimeters over the surface of the body to avoid a possible underestimation of effective dose equivalent (H(E)) or effective dose (E). This study shows that only one or two dosimeters can be used to estimate H(E) and E without a significant underestimation. MCNP and a point-kernel approach were used to model various exposure situations in a steam generator channel head. The single-dosimeter approach (on the chest) was found to underestimate H(E) and E significantly for a few exposure situations, i.e., when the major portion of the radiation source is located behind the radiation worker. In this case, the photons from the source pass through the body and are attenuated before reaching the dosimeter on the chest. To assure that a single dosimeter provides a good estimate of worker dose, these few exposure situations cannot dominate a worker's exposure. On the other hand, the two-dosimeter approach (on the chest and back) predicts H(E) and E very well, hardly ever underestimating these quantities by more than 4% considering all worker positions and contamination situations in a steam generator channel head. This study shows that two dosimeters are adequate for an accurate estimation of H(E) and E in a steam generator channel head.

15.
OBJECTIVES: This study used Monte Carlo (MC) simulation to examine the influence of uncertainty on an exposure model and to determine whether a difference exists between two worker groups in a ceramic fiber manufacturing plant. METHODS: Data on work practices and conditions were gathered in interviews with long-serving employees. With the use of previously developed deterministic modeling techniques and likely distributions for model parameters, MC simulations generated exposure profiles for the two job titles. RESULTS: The exposure profiles overlapped considerably, although the average estimated exposure for one job was approximately double that of the other. However, when the correlation between the model parameters in the two jobs was considered, it was concluded that there was a significant difference in the two estimates. CONCLUSIONS: Models are increasingly being used to estimate exposure. Different work situations inevitably result in different exposure estimates. However, it is difficult to determine whether such differences in estimated exposure between worker groups are simply the result of uncertainty with respect to the model parameters or whether they reflect real differences between occupational groups. This study demonstrates the value of MC simulation in helping define the uncertainty in deterministic model estimates.
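A minimal Python sketch of this kind of Monte Carlo comparison: a toy deterministic exposure model evaluated with uncertain inputs for two job titles, where shared (correlated) parameters make the paired difference much clearer than the overlapping marginal profiles. The model form C = G/(Q·V) and every distribution below are illustrative assumptions, not the plant data or the authors' model.

import numpy as np

rng = np.random.default_rng(7)
n = 20000

# Toy deterministic exposure model C = G / (Q * V). The two job titles are
# assumed to share the same room parameters (Q, V) but differ in the
# generation rate G; all distributions here are illustrative assumptions.
Q = rng.lognormal(np.log(5.0), 0.3, n)       # air changes per hour (shared)
V = rng.lognormal(np.log(300.0), 0.2, n)     # room volume, m^3 (shared)
G_a = rng.lognormal(np.log(2.0), 0.4, n)     # generation rate, job A
G_b = rng.lognormal(np.log(4.0), 0.4, n)     # generation rate, job B
exp_a, exp_b = G_a / (Q * V), G_b / (Q * V)

# The marginal exposure profiles overlap, but the paired difference (which
# accounts for the shared, correlated parameters) separates the two jobs.
print("mean exposures:", exp_a.mean(), exp_b.mean())
print("P(exposure A < exposure B), paired draws:", np.mean(exp_a < exp_b))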

16.
Matched cohort analyses are becoming increasingly popular for estimating treatment effects in observational studies. However, in the applied biomedical literature, analysts and authors are inconsistent regarding whether to terminate follow-up among members of a matched set once one member is no longer under observation. This paper focused on time-to-event outcomes and used Monte Carlo simulation methods to determine the optimal approach. We found that the bias of the estimated treatment effect was negligible under both approaches and that the percentage of censoring had no discernible effect on the magnitude of bias. The mean model-based standard error of the treatment estimate was consistently higher when we terminated observation within matched pairs. Furthermore, the type 1 error rate was consistently lower when we did not terminate follow-up within matched pairs. In conclusion, when the focus was on time-to-event outcomes, we demonstrated that there was no advantage to terminating follow-up within matched pairs. Continuing follow-up on each subject until their observation was naturally complete was superior to terminating a subject's observation time once its matched pair had ceased to be under observation. Given the frequency with which these analyses are conducted in the applied literature, our results provide important guidance to analysts and applied researchers as to the preferred analytic approach. Copyright © 2015 John Wiley & Sons, Ltd.

17.
The GATE Monte Carlo simulation platform has good application prospects in treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distributions using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server, and the cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, time decreases of 41× and 32× were achieved compared with the single worker node case and the single-threaded case, respectively. A test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
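The sketch below shows the split-and-aggregate pattern behind a Hadoop Streaming run of this kind, written as a Python mapper/reducer pair; run_gate(), the voxel keys, and the sub-macro file names are hypothetical stand-ins so the pipeline can be exercised without GATE installed, and they are not part of GATE or the paper's implementation.

import random
import sys

def run_gate(sub_macro_path):
    # Stand-in for launching GATE on one self-contained sub-macro and parsing
    # its dose output; here it just returns reproducible fake per-voxel doses.
    rng = random.Random(sub_macro_path)
    return {voxel: rng.uniform(0.0, 1.0) for voxel in range(5)}

def mapper():
    # Each input line names one sub-macro; emit partial doses keyed by voxel.
    for line in sys.stdin:
        for voxel, dose in run_gate(line.strip()).items():
            print(f"{voxel}\t{dose}")

def reducer():
    # Hadoop Streaming sorts mapper output by key, so all partial doses for
    # one voxel arrive together and are summed into the final dose map.
    totals = {}
    for line in sys.stdin:
        voxel, dose = line.rstrip("\n").split("\t")
        totals[voxel] = totals.get(voxel, 0.0) + float(dose)
    for voxel, dose in sorted(totals.items()):
        print(f"{voxel}\t{dose}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()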

18.
Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs), and their advantages and shortcomings in calculating elastic scattering cross sections, can be found in the literature. However, the issues related to their implementation in the Monte Carlo (MC) sampling for coherently scattered photons are still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method for drawing the integrated values of squared RFFs/MFFs over squared momentum transfer. In the current study, the role and issues of RFFs/MFFs and LIT in the MC sampling for coherent scattering were analyzed. The results showed that the relative probability density curves sampled on the basis of MFFs are unable to reveal any extra scientific information, as both the RFFs and MFFs produced the same MC sampled curves. Furthermore, no relationship was established between the multiple small peaks and irregular step shapes (i.e., statistical noise) in the PDFs and either the RFFs or the MFFs. In fact, the noise in the PDFs appeared owing to the use of LIT. The density of the noise depends upon the interval length between two consecutive points in the input data table and has no scientific background. The probability density function curves became smoother as the interval lengths were decreased. In conclusion, these statistical noises can be efficiently removed by introducing more data points in the data tables.

19.
The effects of radiation backscattered from the secondary collimators into the monitor chamber in an Elekta linac (producing 6 and 10 MV photon beams) are investigated using BEAMnrc Monte Carlo simulations. The degree and effects of this backscattered radiation are assessed by evaluating the changes to the calculated dose in the monitor chamber, and by determining a correction factor for those changes. Additionally, the fluence and energy characteristics of particles entering the monitor chamber from the downstream direction are evaluated by examining BEAMnrc phase-space data. It is shown that the proportion of particles backscattered into the monitor chamber is small (<0.35%) for all field sizes studied. However, when the backscatter plate is removed from the model linac, these backscattered particles generate a noticeable increase in dose to the monitor chamber (up to ≈2.4% for the 6 MV beam and up to 4.4% for the 10 MV beam). With its backscatter plate in place, the Elekta linac (operating at 6 and 10 MV) is subject to negligible variation of monitor chamber dose with field size. At these energies, output variations in photon beams produced by the clinical Elekta linear accelerator can be attributed to head scatter alone. Corrections for field-size dependence of monitor chamber dose are not necessary when running Monte Carlo simulations of the Elekta linac operating at 6 and 10 MV.

20.
Irradiation of experimental cell cultures on plates with various wells can cause a risk of underdosage as a result of the existence of multiple air–water interfaces. The objective of our study was to quantify this error in culture plates with multiple wells. Radiation conditions were simulated with the GAMOS code, based on the GEANT4 code, and the results were compared with a simulation performed with PENELOPE and with measured data. We observed a slight underdosage of ∼4% in the most superficial half of the culture medium. We believe that this underdosage does not have a significant effect on the dose received by culture cells deposited in a monolayer and adhered to the base of the wells.

