1.
The GATE Monte Carlo simulation platform has good application prospects for treatment planning and quality assurance. However, accurate dose calculation using GATE is time consuming. The purpose of this study is to implement a novel cloud computing method for accurate GATE Monte Carlo simulation of dose distribution using MapReduce. An Amazon Machine Image installed with Hadoop and GATE is created to set up Hadoop clusters on Amazon Elastic Compute Cloud (EC2). Macros, the input files for GATE, are split into a number of self-contained sub-macros. Through Hadoop Streaming, the sub-macros are executed by GATE in Map tasks and the sub-results are aggregated into final outputs in Reduce tasks. As an evaluation, GATE simulations were performed in a cubical water phantom for X-ray photons of 6 and 18 MeV. The parallel simulation on the cloud computing platform is as accurate as the single-threaded simulation on a local server. The cloud-based simulation time is approximately inversely proportional to the number of worker nodes. For the simulation of 10 million photons on a cluster with 64 worker nodes, speedups of 41× and 32× were achieved compared to the single worker node case and the single-threaded case, respectively. The test of Hadoop's fault tolerance showed that the simulation correctness was not affected by the failure of some worker nodes. The results verify that the proposed method provides a feasible cloud computing solution for GATE.
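The map/reduce aggregation described above can be sketched in a few lines. The following is an illustrative Hadoop Streaming style reducer, not the authors' code: the assumption that each map task runs one self-contained sub-macro and emits tab-separated voxel_id/dose pairs is ours.

```python
#!/usr/bin/env python3
"""Hadoop Streaming style reducer sketch: merge per-voxel doses from GATE sub-jobs.

Assumption (not from the paper): each map task emits lines of the form
"voxel_id<TAB>dose"; summing contributions over sub-jobs reproduces the
full-statistics result.
"""
import sys


def reduce_doses(lines):
    """Sum dose contributions per voxel from key-sorted streaming input."""
    current_key, total = None, 0.0
    for line in lines:
        if not line.strip():
            continue
        key, value = line.rstrip("\n").split("\t")
        if key != current_key:
            if current_key is not None:
                yield current_key, total
            current_key, total = key, 0.0
        total += float(value)
    if current_key is not None:
        yield current_key, total


if __name__ == "__main__":
    for voxel, dose in reduce_doses(sys.stdin):
        print(f"{voxel}\t{dose:.6e}")
```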
2.
The second-generation scintillation gamma camera gamma KC-2 has recently entered serial production. Compared with the first Soviet model, gamma KC-1, it offers improved spatial resolution, maximum count rate, and image uniformity. A built-in microprocessor applies distortion corrections in real time, providing a marked gain in serviceability and expanded diagnostic capacity.
4.
The paper describes the development of chemical modules simulating the prechemical and chemical stages of charged particle tracks in pure liquid water. These calculations are based on our physical track structure codes for electrons and ions (KURBUC, LEPHIST and LEAHIST), which provide the initial spatial distribution of H2O+, H2O* and subexcitation electrons at approximately 10^-15 s. We considered 11 species and 26 chemical reactions. A step-by-step Monte Carlo approach was adopted for the chemical stage between 10^-12 s and 10^-6 s. The chemistry codes made it possible to simulate the non-homogeneous chemistry of electron, proton and alpha-particle tracks of various linear energy transfers (LET). Time-dependent yields of chemical species produced by electrons and ions of different energies were calculated. The calculated primary yields (G values at 10^-6 s) of 2.80 for OH and 2.59 for e_aq^- for 1 MeV electrons are in good agreement with published values. G values at 10^-6 s were obtained for a wide range of LETs, from 0.2 to 235 keV µm^-1. The calculations show the LET dependence of the OH and H2O2 yields. Electron penetration ranges were calculated in order to discuss the role of low-energy electrons.
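A heavily simplified sketch of a step-by-step chemical-stage Monte Carlo is given below. It tracks only OH radicals and the single reaction OH + OH -> H2O2; the diffusion coefficient, reaction radius, time step and initial spur geometry are illustrative round numbers, not the parameters of the KURBUC-based chemistry codes.

```python
"""Minimal step-by-step chemical-stage Monte Carlo sketch (not the paper's code).

Only OH radicals and the reaction OH + OH -> H2O2 are modelled; all numbers
below are assumed round values for illustration.
"""
import numpy as np

D_OH = 2.8e-9             # m^2/s, assumed diffusion coefficient of OH
REACTION_RADIUS = 0.5e-9  # m, assumed reaction radius for OH + OH
DT = 1e-12                # s, time step at the start of the chemical stage
N_STEPS = 1000            # a real run would extend (with growing steps) to 1e-6 s

rng = np.random.default_rng(1)
oh = rng.normal(scale=5e-9, size=(200, 3))  # initial OH positions in a spur (assumed)
n_h2o2 = 0

for _ in range(N_STEPS):
    # Brownian jump: Gaussian step with std sqrt(2 D dt) per Cartesian component
    oh = oh + rng.normal(scale=np.sqrt(2 * D_OH * DT), size=oh.shape)
    alive = np.ones(len(oh), dtype=bool)
    for i in range(len(oh)):
        if not alive[i]:
            continue
        d = np.linalg.norm(oh[i + 1:] - oh[i], axis=1)
        hits = np.where((d < REACTION_RADIUS) & alive[i + 1:])[0]
        if hits.size:
            j = i + 1 + hits[0]          # react the closest eligible partner
            alive[i] = alive[j] = False
            n_h2o2 += 1
    oh = oh[alive]

print(f"OH remaining: {len(oh)}, H2O2 formed: {n_h2o2}")
```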
5.
OBJECTIVES: This study used Monte Carlo (MC) simulation to examine the influence of uncertainty on an exposure model and to determine whether a difference exists between two worker groups in a ceramic fiber manufacturing plant. METHODS: Data on work practices and conditions were gathered in interviews with long-serving employees. With the use of previously developed deterministic modeling techniques and likely distributions for model parameters, MC simulations generated exposure profiles for the two job titles. RESULTS: The exposure profiles overlapped considerably, although the average estimated exposure for one job was approximately double that of the other. However, when the correlation between the model parameters in the two jobs was considered, it was concluded that there was a significant difference in the two estimates. CONCLUSIONS: Models are increasingly being used to estimate exposure. Different work situations inevitably result in different exposure estimates. However, it is difficult to determine whether such differences in estimated exposure between worker groups are simply the result of uncertainty with respect to the model parameters or whether they reflect real differences between occupational groups. This study demonstrates the value of MC simulation in helping define the uncertainty in deterministic model estimates.
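The basic structure of such a Monte Carlo uncertainty propagation can be sketched as follows. The exposure model (generation rate times handling time over ventilation), the lognormal parameter distributions and the choice to correlate the two jobs through a shared ventilation parameter are assumptions for illustration, not the study's actual model or data.

```python
"""Monte Carlo uncertainty sketch for a deterministic exposure model (illustrative only)."""
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# shared (hence correlated) parameter: both job titles work under the same ventilation
ventilation = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n)

# job-specific parameters (assumed distributions)
gen_a = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)
gen_b = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)
time_a = rng.lognormal(mean=np.log(4.0), sigma=0.3, size=n)
time_b = rng.lognormal(mean=np.log(4.0), sigma=0.3, size=n)

exposure_a = gen_a * time_a / ventilation
exposure_b = gen_b * time_b / ventilation

ratio = exposure_a / exposure_b  # the shared ventilation term cancels, tightening the comparison
print(f"mean A = {exposure_a.mean():.2f}, mean B = {exposure_b.mean():.2f}")
print(f"P(A > B) = {(ratio > 1).mean():.3f}")
print("95% interval for A/B:", np.percentile(ratio, [2.5, 97.5]).round(2))
```

Comparing the ratio rather than the two marginal distributions is what lets the shared (correlated) parameters cancel, which is the point made in the results above.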
6.
Objective: Using Streptococcus isolates collected through the national Bloodstream Infection Bacterial Resistance Surveillance Consortium (BRICS) platform, to evaluate dosing regimens of ceftriaxone, levofloxacin and moxifloxacin and provide a basis for rational antibiotic use by clinicians. Methods: Susceptibility to ceftriaxone, levofloxacin and moxifloxacin was determined by the agar dilution method, and Monte Carlo simulation was used to study the probability of target attainment and the cumulative fraction of response (CFR) of different dosing regimens of the three drugs. Results: The minimum inhibitory concentration (MI...
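A minimal sketch of the probability-of-target-attainment (PTA) and CFR calculation is shown below. The one-compartment ceftriaxone-like regimen, the population PK distributions, the 50 %fT>MIC target and the MIC frequencies are assumed round numbers, not the BRICS isolate data or the regimens evaluated in the study.

```python
"""PTA / CFR sketch for a hypothetical ceftriaxone-like regimen (illustrative only)."""
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

DOSE_MG, TAU_H, FU = 2000.0, 24.0, 0.1       # 2 g q24h IV bolus, 10% unbound (assumed)
CL = rng.lognormal(np.log(1.0), 0.3, n)      # clearance, L/h (assumed population PK)
V = rng.lognormal(np.log(12.0), 0.25, n)     # volume, L (assumed)
k = CL / V


def pta(mic):
    """Probability that the free concentration exceeds mic for >= 50% of the interval."""
    c0 = FU * (DOSE_MG / V) / (1.0 - np.exp(-k * TAU_H))  # steady-state free peak, mg/L
    t_above = np.where(c0 > mic, np.minimum(TAU_H, np.log(c0 / mic) / k), 0.0)
    return np.mean(t_above / TAU_H >= 0.5)


# assumed MIC distribution of the isolates (fractions sum to 1)
mic_freq = {0.03: 0.40, 0.06: 0.25, 0.12: 0.15, 0.25: 0.10, 0.5: 0.06, 1.0: 0.04}
cfr = sum(freq * pta(mic) for mic, freq in mic_freq.items())

for mic in mic_freq:
    print(f"MIC {mic:>5} mg/L  PTA = {pta(mic):.3f}")
print(f"CFR = {cfr:.3f}")
```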
7.
Objectives: Evaluate different non-continuous temperature-monitoring practices for detection of out-of-range temperatures (above or below the recommended temperature range of 2–8 °C for refrigeration units), called excursions, within vaccine storage units. Methods: Simulations based on temperature data collected by 243 digital data loggers operated in vaccine storage units at health-care providers who participated in a CDC-sponsored continuous temperature monitoring pilot project from 2012 to 2015. In the primary analysis, we evaluate: (1) twice-daily current temperature readings without minimum and maximum readings (min/max), (2) twice-daily current temperature readings with once-daily min/max, and (3) twice-daily current temperature readings with twice-daily min/max. Results: Recording the current temperature twice daily without min/max resulted in the detection of 4.8–6.4% of the total number of temperature excursions. When min/max readings were introduced, the percentage of detected temperature excursions increased to 27.8–96.6% with once-daily min/max and to 34.8–96.7% with twice-daily min/max. Conclusions: Including min/max readings improves the ability of a temperature monitoring practice to detect temperature excursions. No combination of the non-continuous temperature monitoring practices was able to consistently detect all simulated temperature excursions.
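The simulation logic can be illustrated with synthetic logger data. The 5-minute temperature trace, the injected excursions and the 09:00/16:00 reading times below are assumptions; the study itself used the real data-logger records from the pilot project.

```python
"""Sketch of scoring non-continuous temperature checks against a continuous trace."""
import numpy as np

rng = np.random.default_rng(7)
LOW, HIGH = 2.0, 8.0                         # recommended storage range, deg C

# one week of 5-minute readings around 5 degC with a few warm excursions (assumed)
t = np.arange(0, 7 * 24 * 60, 5)             # minutes since start
temp = 5.0 + rng.normal(0, 0.6, t.size)
for start in rng.choice(t.size - 24, size=5, replace=False):
    temp[start:start + 24] += 4.0            # roughly two-hour excursions above 8 degC

out = (temp < LOW) | (temp > HIGH)
edges = np.flatnonzero(np.diff(np.concatenate(([0], out.astype(int), [0]))))
excursions = list(zip(edges[0::2], edges[1::2]))                     # (start, stop) index pairs

checks = np.flatnonzero(np.isin(t % (24 * 60), [9 * 60, 16 * 60]))   # twice-daily checks


def detected_by_current(start, stop):
    return np.any((checks >= start) & (checks < stop))   # a check falls inside the excursion


def detected_by_minmax(start, stop):
    return np.any(checks >= start)                        # the next check's min/max captures it


for name, rule in [("current only", detected_by_current), ("with min/max", detected_by_minmax)]:
    hits = sum(rule(a, b) for a, b in excursions)
    print(f"{name}: {hits}/{len(excursions)} excursions detected")
```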
8.
The literature contains both endorsements of, and advice against, the use of protective apparel in nuclear medicine procedures. The main issues are, first, whether the shielding that can be provided by a protective garment light enough to wear (0 to 0.6 mm lead equivalent at the gamma energies commonly encountered in nuclear medicine) is enough to warrant its use, and, more recently, whether the dose enhancement behind the protective garment from electron scatter in lead is sufficient to be of concern. In this work, the Monte Carlo code EGSnrc was used to investigate the effectiveness of lead of thicknesses of 0 to 0.6 mm in shielding staff from photons of energies of 140 and 511 keV. Dose escalation behind the lead was also investigated. Reasonable dose reductions are obtained at 140 keV with protective garments of 0.5 mm lead equivalence, which perhaps warrants their use in certain circumstances. At 511 keV, the reduction in dose is less than 10%, and their use is probably not justified from an ALARA point of view, given the weight that has to be carried. It should be noted that protective garments designed for X-ray shielding will generally not have the same lead equivalence at the gamma energies used in nuclear medicine, and that protective garments which do not contain lead do not always attenuate as much as their stated lead equivalence claims. Dose escalation does occur, but the penetration depth of the scattered electrons beyond the exit side of the lead shielding is such that it is highly unlikely that a significant dose would be delivered to viable tissue in wearers of protective garments.
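A rough narrow-beam estimate reproduces the qualitative picture. The mass attenuation coefficients below are approximate tabulated values, and the calculation ignores scatter and buildup, which the EGSnrc simulations in the paper account for.

```python
"""Narrow-beam transmission through thin lead at 140 and 511 keV (rough check only)."""
import numpy as np

RHO_PB = 11.35                     # g/cm^3, density of lead
MU_RHO = {140: 2.4, 511: 0.16}     # cm^2/g, approximate mass attenuation coefficients

thicknesses_mm = np.arange(0.0, 0.7, 0.1)
for energy_kev, mu_rho in MU_RHO.items():
    mu = mu_rho * RHO_PB                                   # linear attenuation, 1/cm
    transmission = np.exp(-mu * thicknesses_mm / 10.0)     # thickness converted to cm
    reduction = 100.0 * (1.0 - transmission)
    line = ", ".join(f"{t:.1f} mm: {r:4.1f}%" for t, r in zip(thicknesses_mm, reduction))
    print(f"{energy_kev} keV dose reduction (narrow beam): {line}")
```

Even this crude estimate gives a large reduction at 140 keV for 0.5 mm of lead but under 10% at 511 keV, consistent with the conclusions above.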
9.
We have applied the technique of Monte Carlo simulation to the determination of sample size for a partially completed clinical trial of chemotherapy for breast cancer. Simulations based on results observed after the entry of 243 patients in 2 years indicated a power greater than that predicted by the calculations made before the protocol was activated, and allowed a recommendation for an eventual trial closure earlier than would have been permitted by traditional methods. Both estimative and predictive approaches to the simulation of expected survival times for censored patients are presented. The use of simulation is recommended as an aid in reassessing the exact nature of the underlying survival distributions (as these affect the sample size calculations) and in optimizing stopping rules relating to patient accrual to a clinical trial in progress.
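The flavour of such a simulation-based power reassessment is sketched below. Exponential survival, uniform accrual, the assumed hazard ratio and the Wald test on the log hazard ratio are all illustrative choices; the paper's estimative and predictive approaches used the trial's observed interim data.

```python
"""Simulation-based power check for a two-arm survival trial (illustrative sketch)."""
import numpy as np

rng = np.random.default_rng(3)


def one_trial(n_per_arm=150, median_ctrl=36.0, hazard_ratio=0.65,
              accrual_months=24.0, followup_months=36.0):
    lam_c = np.log(2) / median_ctrl
    lam_t = lam_c * hazard_ratio
    results = []
    for lam in (lam_c, lam_t):
        entry = rng.uniform(0, accrual_months, n_per_arm)
        death = rng.exponential(1 / lam, n_per_arm)
        admin_censor = accrual_months + followup_months - entry   # time to study end
        observed = np.minimum(death, admin_censor)
        events = (death <= admin_censor).sum()
        results.append((events, observed.sum()))                  # events, person-time
    (d_c, t_c), (d_t, t_t) = results
    if d_c == 0 or d_t == 0:
        return False
    log_hr = np.log((d_t / t_t) / (d_c / t_c))
    se = np.sqrt(1 / d_c + 1 / d_t)          # Wald SE of the log rate ratio
    return abs(log_hr / se) > 1.96           # two-sided 5% test


power = np.mean([one_trial() for _ in range(2000)])
print(f"estimated power: {power:.2f}")
```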
12.
In a multi-hospital surveillance system for detecting increases in congenital malformation rates, the performance of the set technique applied individually to hospitals and of the cumulative sum (Cusum) technique applied to the aggregate of hospitals has been compared using a Monte Carlo method. The Cusum technique appears to be both more sensitive to real increases and less likely to issue false alarms.
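A Monte Carlo evaluation of the aggregated Cusum can be sketched as follows. The baseline rate, the simulated increase and the Cusum parameters are illustrative, and the per-hospital sets technique compared in the paper is not reproduced here.

```python
"""Monte Carlo check of a one-sided Poisson Cusum on aggregated counts (sketch)."""
import numpy as np

rng = np.random.default_rng(11)
BASELINE = 2.0        # expected malformations per month, all hospitals combined (assumed)
ELEVATED = 3.0        # rate after a real increase (assumed)
K, H = 2.5, 5.0       # Cusum reference value and decision limit (assumed)


def months_to_signal(rate, max_months=240):
    """Run the Cusum on monthly Poisson counts and return the month of the first alarm."""
    s = 0.0
    for month in range(1, max_months + 1):
        s = max(0.0, s + rng.poisson(rate) - K)
        if s >= H:
            return month
    return max_months


n = 5000
false_alarm = np.array([months_to_signal(BASELINE) for _ in range(n)])
detection = np.array([months_to_signal(ELEVATED) for _ in range(n)])

print(f"median run length with no increase (false alarms): {np.median(false_alarm):.0f} months")
print(f"median time to detect the simulated increase:      {np.median(detection):.0f} months")
```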
14.
Two kinds of error are considered, namely Berkson and classical measurement error. The true values of the measurands will never be known. Possibly true sets of values are generated by Monte Carlo simulation as part of the uncertainty analysis. This is straightforward for Berkson errors but requires modeling of the statistical dependence between measured values and errors in the classical case. A method is presented that enables this dependence modeling as part of the uncertainty analysis. Practical examples demonstrate the applicability of the method. Two "quick fixes" are also discussed together with their shortcomings. The uncertainty analysis of a small computer model from the area of dose reconstruction illustrates, by example, the effect both kinds of error can have on model results such as individual dose values and the mean and standard deviation of the population dose distribution.
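The contrast between the two error models can be sketched as below. The error sizes and the toy linear dose model are assumptions, and the dependence between measured values and errors in the classical case is handled here with a simple normal conditional (shrinkage) step, which is only one way of doing the modelling the paper describes.

```python
"""Generating possibly-true exposure values under Berkson vs classical error (sketch)."""
import numpy as np

rng = np.random.default_rng(5)
n_people, n_realisations = 500, 1000

# reported exposures: assigned values in the Berkson case, noisy measurements in the classical case
measured = rng.normal(10.0, 2.0, n_people)
sigma_e = 1.5                                           # error SD (assumed)

# Berkson: true = measured + independent error
true_berkson = measured[None, :] + rng.normal(0.0, sigma_e, (n_realisations, n_people))

# Classical: measured = true + error, so true | measured shrinks toward the population mean;
# the prior variance of the true values is estimated by deconvolution
mu0 = measured.mean()
sigma0_sq = max(measured.var(ddof=1) - sigma_e**2, 1e-6)
post_var = 1.0 / (1.0 / sigma0_sq + 1.0 / sigma_e**2)
post_mean = post_var * (mu0 / sigma0_sq + measured / sigma_e**2)
true_classical = rng.normal(post_mean[None, :], np.sqrt(post_var), (n_realisations, n_people))


def dose(exposure):
    return 0.2 * exposure                               # toy linear dose model (assumed)


for label, true in [("Berkson", true_berkson), ("classical", true_classical)]:
    pop_mean = dose(true).mean(axis=1)                  # population mean dose per realisation
    pop_sd = dose(true).std(axis=1, ddof=1)
    print(f"{label:9s} mean dose {pop_mean.mean():.2f} +/- {pop_mean.std():.2f}, "
          f"population SD {pop_sd.mean():.2f}")
```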
15.
In this work, a Monte Carlo simulation is used in a novel approach to study the outbreak and spread dynamics of the COVID-19 pandemic. In particular, our goal was to generate epidemiological data based on the natural mechanism of transmission of this disease, assuming random interactions of a large but finite number of individuals at very short distance ranges. The simulation also takes into account the stochastic character of the individuals in a finite population with given population densities, and includes appropriate statistical distributions for the parameters characterizing the disease. An important outcome of our work, besides the generated epidemic curves, is a methodology for determining the effective reproductive number during the main part of the daily-new-cases curve of the epidemic. Since this quantity is a fundamental parameter of SIR-based epidemic models, we also studied how it is affected by small variations of the incubation time and the crucial distance distributions, and by the degree of quarantine measures. In addition, we compare our qualitative results with selected real epidemiological data.
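The idea of recovering an effective reproductive number from a generated daily-cases curve can be illustrated with a minimal stochastic sketch. The branching-process generator, the serial-interval weights and the renewal estimator R_t = I_t / sum_s w_s I_{t-s} are assumptions standing in for the paper's agent-level simulation of distances and quarantine, not its actual method.

```python
"""Minimal stochastic epidemic sketch plus a simple effective-R estimator (illustrative)."""
import numpy as np

rng = np.random.default_rng(2)
POP = 100_000
R0 = 2.5
w = np.array([0.05, 0.15, 0.25, 0.25, 0.15, 0.10, 0.05])  # assumed serial-interval weights, days 1..7

days = 120
cases = np.zeros(days)
cases[0] = 10
cum = cases[0]

for t in range(1, days):
    recent = cases[max(0, t - len(w)): t][::-1]            # most recent day first
    pressure = np.dot(w[: len(recent)], recent)
    r_eff = R0 * max(0.0, 1.0 - cum / POP)                  # susceptible depletion
    cases[t] = rng.poisson(r_eff * pressure)
    cases[t] and None
    cum += cases[t]

# recover R_t from the generated curve with the renewal-equation estimator
r_hat = np.full(days, np.nan)
for t in range(len(w), days):
    denom = np.dot(w, cases[t - len(w): t][::-1])
    if denom > 0:
        r_hat[t] = cases[t] / denom

peak = int(np.argmax(cases))
print(f"peak day {peak}, peak daily cases {cases[peak]:.0f}")
print(f"estimated R_eff around the peak: {np.nanmean(r_hat[peak - 3: peak + 4]):.2f}")
```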
17.
In a steam generator channel head, it was not unusual to see radiation workers wearing as many as twelve dosimeters over the surface of the body to avoid a possible underestimation of effective dose equivalent (H(E)) or effective dose (E). This study shows that only one or two dosimeters can be used to estimate H(E) and E without a significant underestimation. MCNP and a point-kernel approach were used to model various exposure situations in a steam generator channel head. The single-dosimeter approach (on the chest) was found to underestimate H(E) and E significantly for a few exposure situations, i.e., when the major portion of radiation source is located in the backside of a radiation worker. In this case, the photons from the source pass through the body and are attenuated before reaching the dosimeter on the chest. To assure that a single dosimeter provides a good estimate of worker dose, these few exposure situations cannot dominate a worker's exposure. On the other hand, the two-dosimeter approach (on the chest and back) predicts H(E) and E very well, hardly ever underestimating these quantities by more than 4% considering all worker positions and contamination situations in a steam generator channel head. This study shows that two dosimeters are adequate for an accurate estimation of H(E) and E in a steam generator channel head.
18.
In this work, a study of the energy fluence of the photon beam produced by a commercial irradiator that uses a single collimated 137Cs source is performed by employing the Monte Carlo code PENELOPE. A set of lead attenuators is placed at the exit window of the irradiator to vary the air kerma rate as required to cover the instrument scales at a particular calibration distance. A possible variation in response due to this beam modification is also investigated for LiF (TLD-100) dosimeters and for a secondary-standard radiation-protection-level ionization chamber. The results show an important enhancement of the beam mean energy from 633 to 642 keV as the lead attenuators increase in thickness. For this energy range, a maximum response change of 45% was found for LiF and 4.4% for the ionization chamber. These results reinforce the idea that a single source may very well be a practical solution for calibration laboratories without compromising the overall uncertainties acceptable for this application.
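The beam-hardening effect can be illustrated with a crude spectral model. The two-component spectrum (662 keV primaries plus an assumed scattered tail) and the approximate lead attenuation coefficients below are not the PENELOPE model of the paper; they only show why the fluence-weighted mean energy rises with attenuator thickness.

```python
"""Beam-hardening sketch: mean energy of a 137Cs beam after lead filtration (illustrative)."""
import numpy as np

RHO_PB = 11.35                                               # g/cm^3
energies = np.array([100, 200, 300, 400, 500, 662])          # keV
fluence = np.array([0.01, 0.02, 0.03, 0.02, 0.02, 0.90])     # assumed relative fluence
mu_rho = np.array([5.5, 1.0, 0.40, 0.23, 0.16, 0.11])        # cm^2/g, approximate values

for t_mm in [0.0, 2.0, 5.0, 10.0, 20.0]:
    transmitted = fluence * np.exp(-mu_rho * RHO_PB * t_mm / 10.0)
    mean_e = np.sum(energies * transmitted) / np.sum(transmitted)
    print(f"lead {t_mm:4.1f} mm: mean energy {mean_e:5.1f} keV")
```

Because the low-energy scattered component is attenuated much more strongly than the 662 keV primaries, thicker attenuators harden the spectrum and the mean energy climbs, in line with the 633 to 642 keV shift reported above.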
19.
Advances in marker technology have made a dense marker map a reality. If each marker is considered separately, and separate tests for association with a disease gene are performed, then multiple testing becomes an issue. A common solution uses a Bonferroni correction to account for multiple tests performed. However, with dense marker maps, neighboring markers are tightly linked and may have associated alleles; thus tests at nearby marker loci may not be independent. When alleles at different marker loci are associated, the Bonferroni correction may lead to a conservative test, and hence a power loss. As an alternative, for tests of association that use family data, we propose a Monte Carlo procedure that provides a global assessment of significance. We examine the case of tightly linked markers with varying amounts of association between them. Using computer simulations, we study a family-based test for association (the transmission/disequilibrium test), and compare its power when either the Bonferroni or Monte Carlo procedure is used to determine significance. Our results show that when the alleles at different marker loci are not associated, using either procedure results in tests with similar power. However, when alleles at linked markers are associated, the test using the Monte Carlo procedure is more powerful than the test using the Bonferroni procedure. This proposed Monte Carlo procedure can be applied whenever it is suspected that markers examined have high amounts of association, or as a general approach to ensure appropriate significance levels and optimal power.
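The Monte Carlo max-statistic idea can be sketched as follows. The synthetic transmission data are invented for illustration; the key step, flipping each informative parent's transmissions jointly across markers so that between-marker association is preserved under the null, follows the logic described in the abstract, though the implementation details are assumptions.

```python
"""Monte Carlo global significance for TDT at several tightly linked markers (sketch)."""
import numpy as np

rng = np.random.default_rng(9)
n_parents, n_markers = 300, 10

# synthetic transmission indicators: 1 = allele "1" transmitted by a heterozygous parent.
# A shared latent haplotype makes the transmissions correlated across markers (assumed).
latent = rng.random(n_parents) < 0.58                 # slight excess transmission (assumed)
flip = rng.random((n_parents, n_markers)) < 0.15      # marker-specific discordance (assumed)
transmitted = np.where(flip, ~latent[:, None], latent[:, None]).astype(int)


def max_tdt(trans):
    """Maximum over markers of the TDT statistic (b - c)^2 / (b + c)."""
    b = trans.sum(axis=0).astype(float)               # transmissions of allele "1"
    c = trans.shape[0] - b                            # non-transmissions
    return np.max((b - c) ** 2 / (b + c))


observed = max_tdt(transmitted)

n_mc = 5000
null_stats = np.empty(n_mc)
for i in range(n_mc):
    # flip whole parental transmission vectors with probability 1/2: this enforces the
    # null of no excess transmission while preserving the correlation between markers
    flips = rng.random(n_parents) < 0.5
    null_stats[i] = max_tdt(np.where(flips[:, None], 1 - transmitted, transmitted))

p_global = (np.sum(null_stats >= observed) + 1) / (n_mc + 1)
print(f"observed max TDT = {observed:.2f}, Monte Carlo global p = {p_global:.4f}")
```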
20.
Studies on the household distribution of trachoma have reached conflicting conclusions. This paper describes a cross-sectional survey of endemic trachoma in a Gambian village. Cases of active trachoma were mapped, and the compound and household distribution of the disease was analysed by a Monte Carlo simulation procedure which takes into account differences in the size and age distribution within individual households. Significant clustering of active trachoma cases both by village compound (p < 0.0001) and by bedroom (p < 0.05) was detected, supporting the concept that intra-familial transmission of trachoma is important. There was no evidence of spatial clustering of rooms with higher than expected prevalence of trachoma. Clustering of disease in space or time provides important evidence of infectious aetiology and route of transmission. The methods discussed here are generally applicable to the study of other infectious diseases.
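A simplified version of such a clustering test is sketched below. The synthetic village layout is invented, and, unlike the paper's procedure, the permutation here does not condition on household size or age distribution.

```python
"""Monte Carlo test for household clustering of cases (simplified sketch)."""
import numpy as np

rng = np.random.default_rng(13)

# synthetic village: 60 households of 3-10 people; cases concentrated in a few households
household_sizes = rng.integers(3, 11, size=60)
household = np.repeat(np.arange(60), household_sizes)
n_people = household.size
case = np.zeros(n_people, dtype=bool)
for h in rng.choice(60, size=8, replace=False):        # affected households (assumed)
    members = np.flatnonzero(household == h)
    case[rng.choice(members, size=max(1, len(members) // 2), replace=False)] = True


def within_household_pairs(case_flags):
    """Number of pairs of cases living in the same household."""
    counts = np.bincount(household[case_flags], minlength=60)
    return int(np.sum(counts * (counts - 1) // 2))


observed = within_household_pairs(case)
n_cases = int(case.sum())

n_mc = 5000
null = np.empty(n_mc, dtype=int)
for i in range(n_mc):
    permuted = np.zeros(n_people, dtype=bool)
    permuted[rng.choice(n_people, size=n_cases, replace=False)] = True
    null[i] = within_household_pairs(permuted)

p = (np.sum(null >= observed) + 1) / (n_mc + 1)
print(f"observed within-household case pairs: {observed}, Monte Carlo p = {p:.4f}")
```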