Similar Articles
20 similar articles found (search time: 46 ms)
1.
Computer models are being increasingly used to provide an efficient cost-effective means of evaluating the fate and behavior of chemicals in the environment. These models can be used in lieu of or in conjunction with field studies. Because of the increasing reliance on models for critical regulatory decision making, the need arose to assess the validity of regulatory models via an analysis of the correlation of model response estimates with measured data. In conjunction with the evaluation of the correlation of model response estimates and measured field data, a rigorous statistically based validation was also warranted. Because of the unique nature of the correlative exercise using modeled and measured data, standard statistical analyses, while informative, failed to encompass factors associated with the uncertainty of measured environmental fate data and potential model inputs. In an effort to evaluate this uncertainty, an initial sensitivity analysis was performed where key model input parameters for runoff and leaching simulations were identified. Once the sensitive input parameters were identified, a Monte Carlo-based preprocessor was developed whereby the sampling distributions of these parameters were used to propagate uncertainty in the input parameters into error in model predictions. Importantly, assumptions about parameter distributions for input into the Monte Carlo tool were made only after a formal detailed site-specific analysis of measured field data. Employing the functionality of the Crystal Ball Pro development environment, the pesticide root zone model (PRZM) 3.12 was run iteratively for 500 trials, and model output was collated and analyzed. The model predictions were considered reasonably accurate for most regulatory requirements, and the model prediction error was considered acceptable.
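The Monte Carlo preprocessing described above (sample input distributions, run the model repeatedly, collate outputs) can be sketched in a few lines. This is a minimal illustration, not the authors' Crystal Ball/PRZM tool: the model function, parameter names, and distributions below are all hypothetical stand-ins.

```python
import random
import statistics

def toy_model(decay_rate, sorption_coeff):
    # Stand-in response surface for a fate model's leached fraction;
    # a real model such as PRZM is far more complex.
    return max(0.0, 1.0 - decay_rate) * (1.0 / (1.0 + sorption_coeff))

random.seed(42)
trials = 500  # matches the 500 iterations described in the abstract

outputs = []
for _ in range(trials):
    # Sample inputs from assumed distributions (chosen site-specifically in practice).
    decay_rate = random.lognormvariate(mu=-2.0, sigma=0.5)
    sorption_coeff = random.uniform(0.5, 5.0)
    outputs.append(toy_model(decay_rate, sorption_coeff))

# Collate the propagated uncertainty as a central estimate plus a 90% band.
s = sorted(outputs)
mean_pred = statistics.mean(outputs)
p05, p95 = s[int(0.05 * trials)], s[int(0.95 * trials)]
print(f"mean={mean_pred:.3f}, 90% band=({p05:.3f}, {p95:.3f})")
```

The spread of `outputs` is then compared against the variability of the field measurements, which is the basis of the validation described above.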

2.
Environmental fate modeling results are often used in risk assessment without adequately considering uncertainty in exposure predictions. Sensitivity analysis is fundamental to model validation and error prediction since sensitive model input parameters account for the largest variance in model prediction. Once identified, sensitive model input parameters can be used to propagate parametric uncertainty in numerical predictions. Output sensitivity to variation in input parameters was investigated for the pesticide root zone model (PRZM 3) using Plackett-Burman analysis for six runoff and leaching data sets. The analysis utilized an incomplete block factorial design with even parameter weighting and uniform proportional input perturbation. Timing and duration of key period rainfall were assumed a priori to be dominant sensitive inputs. Thus, meteorological data were fixed, allowing identification of additional input components contributing to model sensitivity. Results validated expert modeler assumptions concerning parameters most critical for model validation. For leaching data sets, the application rate, soil bulk density (an indicator of available water-holding capacity), chemical partition coefficient, and pesticide degradation rates were commonly the most sensitive inputs. For runoff data sets, the in-crop runoff curve number was the most significant input governing pesticide loss in runoff and erosion flux. The chemical partition coefficient, soil and foliar decay rates, and soil bulk density were also common sensitive components for runoff predictions. These commonly observed sensitive components for runoff and leaching prediction need to be carefully considered in the design and conduct of relevant field studies, modeling assessment of such studies, and future improvements in algorithms for environmental transport modeling.
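A Plackett-Burman screen like the one above evaluates the model at a small orthogonal set of high/low parameter settings and ranks factors by their main-effect contrasts. The following sketch builds the standard 12-run design (up to 11 factors) and screens a toy response; the "model" and the two influential factors are hypothetical, not PRZM itself.

```python
# 12-run Plackett-Burman design for up to 11 two-level factors, built by
# cyclic shifts of a standard generator row plus a final row of -1s.
GEN = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]

def pb12():
    rows = [[GEN[(j - i) % 11] for j in range(11)] for i in range(11)]
    rows.append([-1] * 11)
    return rows

def main_effects(design, responses):
    # Main effect of factor j = mean response at +1 minus mean response at -1.
    k = len(design[0])
    effects = []
    for j in range(k):
        hi = [y for row, y in zip(design, responses) if row[j] == +1]
        lo = [y for row, y in zip(design, responses) if row[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Toy response: only factors 0 and 3 matter (imagine application rate and Kd),
# with a small uniform contribution from everything else.
design = pb12()
responses = [5.0 * r[0] + 2.0 * r[3] + 0.1 * sum(r) for r in design]
effects = main_effects(design, responses)
ranked = sorted(range(11), key=lambda j: abs(effects[j]), reverse=True)
print("most sensitive factors:", ranked[:2])
```

Twelve model runs suffice to rank eleven inputs, which is why this design is attractive when each run of the fate model is expensive.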

3.
4.
A dynamic or level IV multimedia model is described and illustrated by application to the fate of three polychlorinated biphenyl (PCB) congeners in the United Kingdom over a 60-year period from their introduction into commerce until the present. Models of this type are shown to be valuable for elucidating the time response of environmental systems to increasing, decreasing, or pulse inputs. The suggestion is made that in addition to the outputs of time-dependent concentrations (which can be compared with monitoring data for validation purposes), it is useful to examine masses, fugacities, and fugacity ratios between media. The relative importance of processes is best evaluated by compiling cumulative intermedia fluxes and quantities lost by reaction and advection and examining the corresponding process rate constants or their reciprocals, the characteristic times. It is further suggested that uncertainty and sensitivity analyses are desirable, but it must be appreciated that relative sensitivities of input parameters may change during the simulation period, so a single sensitivity analysis conducted at one point in time can be misleading. The use of the model for forecasting future trends in concentration is illustrated. Given the uncertainties in emission and advective inflow rates, the simulation of PCB fate in the United Kingdom is regarded as showing time trends that are in satisfactory agreement with monitoring data.

5.
A user interface to the U.S. Environmental Protection Agency pesticide root zone model (PRZM) was constructed to allow Monte Carlo sampling of input parameter distributions. The interface was constructed employing the Visual Basic for Applications development environment, along with the functionality of the Crystal Ball Professional forecasting and risk analysis package. The tool has been utilized by the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) Environmental Model Validation Task Force to perform detailed statistical analyses of model input parameter uncertainty and the propagation of this uncertainty on the model outputs as well as comparisons of modeled and field-measured data.

6.
7.
Individuals from the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) Environmental Model Validation Task Force (FEMVTF) Statistics Committee periodically met to discuss the mechanism for conducting an uncertainty analysis of Version 3.12 of the pesticide root zone model (PRZM 3.12) and to identify those model input parameters that most contribute to model prediction error. This activity was part of a larger project evaluating PRZM 3.12. The goal of the uncertainty analysis was to compare site-specific model predictions and field measurements using the variability in each as a basis of comparison. Monte Carlo analysis was used as an integral tool for judging the model's ability to predict accurately. The model was judged on how well it predicts measured values, taking into account the uncertainty in the model predictions. Monte Carlo analysis provides the tool for inferring model prediction uncertainty. We argue that this is a fairer test of the model than a simple one-to-one comparison between predictions and measurements. Because models are known to be imperfect predictors prior to running the model, the inaccuracy in model predictions should be considered when models are judged for their predictive ability. Otherwise, complex models can easily fail a validation test. Few complex models, such as PRZM 3.12, would pass a typical model validation exercise. This paper describes the approaches to the validation of PRZM 3.12 used by the committee and discusses issues in sampling distribution selection and appropriate statistics for interpreting the model validation results.

8.
Bias from unmeasured confounding is a persistent concern in observational studies, and sensitivity analysis has been proposed as a solution. In recent years, probabilistic sensitivity analysis using either Monte Carlo sensitivity analysis (MCSA) or Bayesian sensitivity analysis (BSA) has emerged as a practical analytic strategy when there are multiple bias parameters. BSA uses Bayes theorem to formally combine evidence from the prior distribution and the data. In contrast, MCSA samples bias parameters directly from the prior distribution. Intuitively, one would think that BSA and MCSA ought to give similar results. Both methods use similar models and the same (prior) probability distributions for the bias parameters. In this paper, we illustrate the surprising finding that BSA and MCSA can give very different results. Specifically, we demonstrate that MCSA can give inaccurate uncertainty assessments (e.g. 95% intervals) that do not reflect the data's influence on uncertainty about unmeasured confounding. Using a data example from epidemiology and simulation studies, we show that certain combinations of data and prior distributions can result in dramatic prior-to-posterior changes in uncertainty about the bias parameters. This occurs because the application of Bayes theorem in a non-identifiable model can sometimes rule out certain patterns of unmeasured confounding that are not compatible with the data. Consequently, the MCSA approach may give 95% intervals that are either too wide or too narrow and that do not have 95% frequentist coverage probability. Based on our findings, we recommend that analysts use BSA for probabilistic sensitivity analysis.
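The MCSA/BSA distinction above can be made concrete with a deliberately crude sketch: MCSA subtracts prior-sampled bias from the observed estimate regardless of plausibility, while a Bayesian treatment discards bias values the data assign zero likelihood (here, via a simple compatibility constraint). All numbers and the constraint are hypothetical, and the rejection step is only a caricature of a full posterior computation.

```python
import random

random.seed(1)

# Observed association (e.g., a risk difference) and its standard error.
obs, se = 0.30, 0.05

def mcsa(draws=20000):
    # MCSA: sample bias b from its prior and adjust, ignoring what the
    # data say about b.
    return [random.gauss(obs, se) - random.uniform(-0.2, 0.4)
            for _ in range(draws)]

def bsa(draws=20000):
    # Crude Bayesian analogue: keep only draws compatible with the data,
    # here the (hypothetical) constraint that the adjusted effect is >= 0.
    out = []
    while len(out) < draws:
        est = random.gauss(obs, se) - random.uniform(-0.2, 0.4)
        if est >= 0.0:  # zero likelihood otherwise
            out.append(est)
    return out

def interval(xs, lo=0.025, hi=0.975):
    xs = sorted(xs)
    return xs[int(lo * len(xs))], xs[int(hi * len(xs))]

mcsa_int = interval(mcsa())
bsa_int = interval(bsa())
print("MCSA 95% interval:", mcsa_int)
print("BSA  95% interval:", bsa_int)
```

The MCSA interval extends into a region the data rule out, illustrating how its intervals can fail to have the advertised coverage.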

9.
The European Union System for the Evaluation of Substances (EUSES) is a computerized model system to facilitate and harmonize health and environmental risk assessment of previously notified and new substances. For calculation of regional background exposure, a multimedia distribution model is used. In the present study, the uncertainty of this regional model is analyzed. Environmental parameters were collected for North Rhine Westphalia (Germany), which resembles the standard region of EUSES. Probability distribution functions of various types (uniform, triangular, normal, log normal) depending on data availability were derived for environmental input parameters, including geometric parameters. Generic log-normal distribution functions with fixed standard deviations were chosen for solubility in air, water, and n-octanol as well as for degradation half-lives. Monte Carlo simulations were carried out for 10 reference substances having different properties. Contribution of environmental parameter uncertainty to total output uncertainties is higher than that of substance parameters. Range of output uncertainty, defined as the ratio of the logarithms of the 90th and 10th percentiles of the cumulative probability distribution function, shows an increase from air and water to soil. The highest-occurring range is 1.4 orders of magnitude, which means that total uncertainty of the regional model is relatively low and, usually, is lower than the range of measured values. The median of output probability distributions lies above the point estimate. Influence of input parameters was estimated as their rank correlation coefficients to output uncertainty. Substance and environmental parameters contribute differently to output variance depending on individual substance properties and environmental compartment. Hence, the present study underlines the need to perform uncertainty analyses instead of either using a set of simple rules or just looking at certain parameters.
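Ranking inputs by their rank correlation with the output, as done above, needs only the Monte Carlo sample itself. The sketch below computes Spearman rank correlations from scratch for a toy two-input model; the input names, exponents, and distributions are invented for illustration and have nothing to do with the actual EUSES parameterization.

```python
import random

random.seed(2)

def rankdata(xs):
    # Assign ranks 1..n (no tie handling; inputs here are continuous).
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for r, i in enumerate(order):
        ranks[i] = r + 1
    return ranks

def spearman(xs, ys):
    # Pearson correlation computed on the ranks.
    rx, ry = rankdata(xs), rankdata(ys)
    n = len(xs)
    mean = (n + 1) / 2
    num = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    den = (sum((a - mean) ** 2 for a in rx)
           * sum((b - mean) ** 2 for b in ry)) ** 0.5
    return num / den

# Toy multimedia-style output dominated by the half-life input.
half_life, solubility, output = [], [], []
for _ in range(2000):
    hl = random.lognormvariate(3.0, 0.8)
    sol = random.lognormvariate(0.0, 1.0)
    half_life.append(hl)
    solubility.append(sol)
    output.append(hl ** 1.5 * sol ** 0.1 * random.lognormvariate(0.0, 0.1))

print(f"half-life rank corr.:  {spearman(half_life, output):.2f}")
print(f"solubility rank corr.: {spearman(solubility, output):.2f}")
```

Rank (rather than linear) correlation is the usual choice here because multimedia model outputs respond monotonically but nonlinearly to many inputs.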

10.
Using a conceptual hydraulic model, a one-dimensional dynamic river water quality model has been developed to assess the short-term fate of linear alkylbenzene sulfonates (LAS) in the river compartments (water and benthic sediment). The model assumes local equilibrium sorption and that both dissolved and sorbed chemical are available for biodegradation. To investigate the interaction of nutrient dynamics and organic contaminant fate, the model is coupled with a basic water quality model. On the basis of the Lambro River (Italy) as a case study, the results show that the model predictions agree well with the measured data set. The model output sensitivity to model parameters has been tested, and the results show that the model is highly sensitive to the biodegradation parameters. Also, a comparison of a steady state with a dynamic simulation and the effect of nutrient dynamics on the LAS fate in the Lambro River as a scenario analysis are presented. The results indicate the usefulness of the proposed model for the short-term simulation of organic contaminant fate in unsteady environmental conditions.

11.
Probabilistic sensitivity analysis (PSA) is required to account for uncertainty in cost-effectiveness calculations arising from health economic models. The simplest way to perform PSA in practice is by Monte Carlo methods, which involves running the model many times using randomly sampled values of the model inputs. However, this can be impractical when the economic model takes appreciable amounts of time to run. This situation arises, in particular, for patient-level simulation models (also known as micro-simulation or individual-level simulation models), where a single run of the model simulates the health care of many thousands of individual patients. The large number of patients required in each run to achieve accurate estimation of cost-effectiveness means that only a relatively small number of runs is possible. For this reason, it is often said that PSA is not practical for patient-level models. We develop a way to reduce the computational burden of Monte Carlo PSA for patient-level models, based on the algebra of analysis of variance. Methods are presented to estimate the mean and variance of the model output, with formulae for determining optimal sample sizes. The methods are simple to apply and will typically reduce the computational demand very substantially.
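The ANOVA idea above can be illustrated with a two-level simulation: an outer loop over sampled parameter sets and an inner loop over simulated patients. The key step is subtracting the patient-level Monte Carlo noise from the raw between-set variance so that only genuine parameter uncertainty remains. The toy patient model and all numbers are hypothetical, and this is not the authors' formulae for optimal sample sizes.

```python
import random
import statistics

random.seed(7)

def simulate_patients(mean_cost, n_patients):
    # Toy patient-level model: per-patient cost varies around a
    # parameter-driven mean (true patient-level s.d. of 5).
    return [random.gauss(mean_cost, 5.0) for _ in range(n_patients)]

outer, inner = 200, 50
set_means, set_vars = [], []
for _ in range(outer):
    mean_cost = random.gauss(100.0, 10.0)   # sampled input parameter set
    costs = simulate_patients(mean_cost, inner)
    set_means.append(statistics.mean(costs))
    set_vars.append(statistics.variance(costs))

# ANOVA-style decomposition: the raw variance of per-set means mixes
# parameter uncertainty with patient-level sampling noise, so the
# within-set term (mean within-variance / inner sample size) is removed.
raw_between = statistics.variance(set_means)
within = statistics.mean(set_vars)
parameter_variance = raw_between - within / inner

print(f"estimated parameter-uncertainty variance: {parameter_variance:.1f}")
```

Because the noise term shrinks as `1/inner`, the decomposition also suggests how to trade off outer runs against patients per run, which is the substance of the optimal-sample-size formulae mentioned above.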

12.
Conventional confidence intervals reflect uncertainty due to random error but omit uncertainty due to biases, such as confounding, selection bias, and measurement error. Such uncertainty can be quantified, especially if the investigator has some idea of the amount of such bias. A traditional sensitivity analysis produces one or more point estimates for the exposure effect hypothetically adjusted for bias, but it does not provide a range of effect measures given the likely range of bias. Here the authors used Monte Carlo sensitivity analysis and Bayesian bias analysis to provide such a range, using data from a US silica-lung cancer study in which results were potentially confounded by smoking. After positing a distribution for the smoking habits of workers and referents, a distribution of rate ratios for the effect of smoking on lung cancer, and a model for the bias parameter, the authors derived a distribution for the silica-lung cancer rate ratios hypothetically adjusted for smoking. The original standardized mortality ratio for the silica-lung cancer relation was 1.60 (95% confidence interval: 1.31, 1.93). Monte Carlo sensitivity analysis, adjusting for possible confounding by smoking, led to an adjusted standardized mortality ratio of 1.43 (95% Monte Carlo limits: 1.15, 1.78). Bayesian results were similar (95% posterior limits: 1.13, 1.84). The authors believe that these types of analyses, which make explicit and quantify sources of uncertainty, should be more widely adopted by epidemiologists.
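The mechanics of a Monte Carlo bias adjustment like the one above are simple: sample a confounding rate ratio from a posited bias distribution, sample the observed estimate from its random-error distribution, and divide. The bias distribution below is invented for illustration and is not the smoking model used in the study; only the observed SMR and CI are taken from the abstract.

```python
import math
import random

random.seed(3)

smr, lcl, ucl = 1.60, 1.31, 1.93   # observed silica-lung cancer SMR, 95% CI
# Back out the log-scale standard error from the reported CI.
log_se = (math.log(ucl) - math.log(lcl)) / (2 * 1.96)

adjusted = []
for _ in range(50000):
    # Hypothetical bias model: confounding rate ratio due to smoking
    # differences, centered on 1.10 (illustrative only).
    bias_rr = random.lognormvariate(math.log(1.10), 0.05)
    obs = random.lognormvariate(math.log(smr), log_se)  # random error
    adjusted.append(obs / bias_rr)

adjusted.sort()
n = len(adjusted)
print("adjusted SMR (95% MC limits): "
      f"{adjusted[n // 2]:.2f} "
      f"({adjusted[int(0.025 * n)]:.2f}, {adjusted[int(0.975 * n)]:.2f})")
```

The resulting interval reflects both random error and the posited confounding, which is exactly what a conventional CI omits.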

13.
Species sensitivity distributions (SSDs) quantify fractions of species potentially affected in contaminated environmental compartments using test species sensitivity data. The present study quantitatively describes associations between predicted and observed ecological impacts of contaminant mixtures, based on monitoring data of benthic macroinvertebrates. Local mixture toxic pressures (multisubstance potentially affected fraction of species [msPAF]) were quantified based on measured concentrations of 45 compounds (eight metals, 16 chlorinated organics, mineral oil, 16 polycyclic aromatic hydrocarbons, four polychlorinated biphenyls), using acute as well as chronic 50%-effective concentration-based SSD-modeling combined with bioavailability and mixture modeling. Acute and chronic toxic pressures were closely related. Generalized linear models (GLMs) were derived to describe taxon abundances as functions of environmental variables (including acute toxic pressure). Acute toxic pressure ranged from 0 to 42% and was related to abundance for 74% of the taxa. Habitat-abundance curves were generated using the GLMs and Monte Carlo simulation. Predicted abundances for the taxa were associated with acute mixture toxic pressure in various ways: negative, positive, and optimum abundance changes occurred. Acute toxic pressure (msPAF) was associated almost 1:1 with the observed fraction of taxa exhibiting an abundance reduction of 50% or more. The findings imply that an increase of mixture toxic pressure associates to increased ecological impacts in the field. This finding is important, given the societal relevance of SSD model outputs in environmental policies. Environ. Toxicol. Chem. 2012; 31: 2175-2188. © 2012 SETAC.

14.
A Monte Carlo uncertainty analysis with correlations between parameters is applied to a Markov-chain model that is used to support the choice of a replacement heart valve. The objective is to quantify the effects of uncertainty in and of correlations between probabilities of valve-related events on the life expectancies of four valve types. The uncertainty in the logit- and log-transformed parameters (mostly representing probabilities and durations) is modeled as a multivariate normal distribution. The univariate distributions are obtained through values for the median and the 0.975 quantile of each parameter. Correlations between parameters are difficult to quantify. A sensitivity analysis is suggested to study their influences on the uncertainty in valve preference prior to further elicitation efforts. The results of the uncertainty analysis strengthen the conclusions from a preceding study, which did not include uncertainty in the model parameters, where the homograft turned out to be the best choice. It is concluded that the influence of correlations is limited in most cases. Preference statements become more certain when the correlation between valve types increases.

15.
16.
The NORMTOX model predicts the lifetime-averaged exposure to contaminants through multiple environmental media, that is, food, air, soil, drinking water, and surface water. The model was developed to test the coherence of Dutch environmental quality objectives (EQOs). A set of EQOs is called coherent if simultaneous exposure to different environmental media that are all polluted up to their respective EQOs does not result in exceeding the acceptable or tolerable daily intake (ADI or TDI). The aim of the present study is to separate the impact of uncertainty and interindividual variability in coherence predictions with the NORMTOX model. The method is illustrated in a case study for chlorfenvinphos, mercury, and nitrate. First, ANOVA was used to calculate interindividual variability in input parameters. Second, nested Monte Carlo simulation was used to propagate uncertainty and interindividual variability separately. Lifetime-averaged exposure to chlorfenvinphos, mercury, and nitrate was modeled for the Dutch population. Output distributions specified the population fraction at risk, due to a particular exposure, and the reliability of this risk. The case study showed that at lifelong exposure to all media polluted up to their standard, 100% of the Dutch population exceeds the ADI for chlorfenvinphos, 15% for mercury, and 0% for nitrate. Variance in exposure to chlorfenvinphos, mercury, and nitrate is mostly caused by interindividual variability instead of true uncertainty. It is concluded that the likelihood that ADIs of chlorfenvinphos and mercury will be exceeded should be further explored. If exceedance is likely, decision makers should focus on identification of high-risk subpopulations, rather than on additional research to obtain more accurate estimates for particular parameters.
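Nested (two-dimensional) Monte Carlo, as used above, separates the two kinds of randomness by looping: the outer loop draws the uncertain quantities, and for each draw the inner loop samples interindividual variability, yielding a distribution over "fraction of the population at risk". The intake model, ADI value, and distributions below are hypothetical illustrations, not NORMTOX.

```python
import random

random.seed(11)

ADI = 1.0  # tolerable daily intake (arbitrary units, illustrative)

outer_n, inner_n = 200, 1000
fractions_at_risk = []
for _ in range(outer_n):
    # Outer loop: true-but-unknown quantities (uncertainty),
    # e.g. the population-mean intake.
    mean_intake = random.lognormvariate(-0.5, 0.3)
    exceed = 0
    for _ in range(inner_n):
        # Inner loop: interindividual variability,
        # e.g. differences in consumption patterns.
        person_intake = mean_intake * random.lognormvariate(0.0, 0.6)
        exceed += person_intake > ADI
    fractions_at_risk.append(exceed / inner_n)

fractions_at_risk.sort()
median = fractions_at_risk[outer_n // 2]
lo = fractions_at_risk[int(0.05 * outer_n)]
hi = fractions_at_risk[int(0.95 * outer_n)]
print(f"fraction exceeding ADI: median={median:.2f}, 90% CI=({lo:.2f}, {hi:.2f})")
```

The spread of `fractions_at_risk` reflects uncertainty; the level of each fraction reflects variability. Collapsing the two loops into one would conflate them, which is precisely what the nested design avoids.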

17.
Monte Carlo techniques are increasingly used in pesticide exposure modeling to evaluate the uncertainty in predictions arising from uncertainty in input parameters and to estimate the confidence that should be assigned to the modeling results. The approach typically involves running a deterministic model repeatedly for a large number of input values sampled from statistical distributions. In the present study, six modelers made choices regarding the type and parameterization of distributions assigned to degradation and sorption data for an example pesticide, the correlation between the parameters, the tool and method used for sampling, and the number of samples generated. A leaching assessment was carried out using a single model and scenario and all data for sorption and degradation generated by the six modelers. The distributions of sampled parameters differed between the modelers, and the agreement with the measured data was variable. Large differences were found between the upper percentiles of simulated concentrations in leachate. The probability of exceeding 0.1 microg/L ranged from 0 to 35.7%. The present study demonstrated that subjective choices made in Monte Carlo modeling introduce variability into probabilistic modeling and that the results need to be interpreted with care.

18.
In probabilistic sensitivity analyses, analysts assign probability distributions to uncertain model parameters and use Monte Carlo simulation to estimate the sensitivity of model results to parameter uncertainty. The authors present Bayesian methods for constructing large-sample approximate posterior distributions for probabilities, rates, and relative effect parameters, for both controlled and uncontrolled studies, and discuss how to use these posterior distributions in a probabilistic sensitivity analysis. These results draw on and extend procedures from the literature on large-sample Bayesian posterior distributions and Bayesian random effects meta-analysis. They improve on standard approaches to probabilistic sensitivity analysis by allowing a proper accounting for heterogeneity across studies as well as dependence between control and treatment parameters, while still being simple enough to be carried out on a spreadsheet. The authors apply these methods to conduct a probabilistic sensitivity analysis for a recently published analysis of zidovudine prophylaxis following rapid HIV testing in labor to prevent vertical HIV transmission in pregnant women.

19.
Bayesian methods for cluster randomized trials with continuous responses
Bayesian methods for cluster randomized trials extend the random-effects formulation by allowing both the use of external evidence on parameters and straightforward relaxation of the standard normality and constant variance assumptions. Care is required in specifying prior distributions on variance components, and a number of different options are explored with implied prior distributions for other parameters given in closed form. Markov chain Monte Carlo (MCMC) methods permit the fitting of very general models and the introduction of parameter uncertainty into power calculations. We illustrate these ideas using a published example in which general practices were randomized to intervention or control, and show that different choices of supposedly 'non-informative' prior distributions can have substantial influence on conclusions. We also illustrate the use of forward simulation methods in power calculations with uncertainty on multiple inputs. Bayesian methods have the potential to be very useful but guidance is required as to appropriate strategies for robust analysis. Our current experience leads us to recommend a standard 'non-informative' prior distribution for the within-cluster sampling variance, and an independent prior on the intraclass correlation coefficient (ICC). The latter may exploit background evidence or, as a reference analysis, be a uniform ICC or a 'uniform shrinkage' prior.

20.
There has been an increasing interest in using expected value of information (EVI) theory in medical decision making, to identify the need for further research to reduce uncertainty in decisions and as a tool for sensitivity analysis. Expected value of sample information (EVSI) has been proposed for determination of optimum sample size and allocation rates in randomized clinical trials. This article derives simple Monte Carlo, or nested Monte Carlo, methods that extend the use of EVSI calculations to medical decision applications with multiple sources of uncertainty, with particular attention to the form in which epidemiological data and research findings are structured. In particular, information on key decision parameters such as treatment efficacy is invariably available on measures of relative efficacy such as risk differences or odds ratios, but not on model parameters themselves. In addition, estimates of model parameters and of relative effect measures in the literature may be heterogeneous, reflecting additional sources of variation besides statistical sampling error. The authors describe Monte Carlo procedures for calculating EVSI for probability, rate, or continuous variable parameters in multiparameter decision models and approximate methods for relative measures such as risk differences, odds ratios, risk ratios, and hazard ratios. Where prior evidence is based on a random effects meta-analysis, the authors describe different EVSI calculations, one relevant for decisions concerning a specific patient group and the other for decisions concerning the entire population of patient groups. They also consider EVSI methods for new studies intended to update information on both baseline treatment efficacy and the relative efficacy of two treatments. Although there are restrictions regarding models with prior correlation between parameters, these methods can be applied to the majority of probabilistic decision models. Illustrative worked examples of EVSI calculations are given in an appendix.
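A nested Monte Carlo EVSI calculation like the one described above compares the expected value of deciding now against the expected value of deciding after seeing a simulated study. The sketch below uses a deliberately simple setup (one Beta-distributed efficacy parameter, a linear net-benefit function, a conjugate update); the prior, costs, and study size are all hypothetical, not taken from the article.

```python
import random

random.seed(5)

# Net benefit of adopting treatment 1 over treatment 0 as a function of the
# uncertain response probability p; prior: p ~ Beta(4, 6) (illustrative).
def net_benefit(decision, p):
    return 1000.0 * p - 450.0 if decision == 1 else 0.0

def expected_nb(decision, p_draws):
    return sum(net_benefit(decision, p) for p in p_draws) / len(p_draws)

prior = [random.betavariate(4, 6) for _ in range(5000)]
value_now = max(expected_nb(0, prior), expected_nb(1, prior))

# EVSI for a proposed study of n patients: the outer loop simulates study
# outcomes; the conjugate Beta update gives the posterior mean, and with a
# linear net benefit the posterior mean determines the optimal decision.
n_study, outer = 50, 2000
total = 0.0
for _ in range(outer):
    p_true = random.betavariate(4, 6)
    successes = sum(random.random() < p_true for _ in range(n_study))
    post_mean = (4 + successes) / (4 + 6 + n_study)  # conjugate update
    total += max(0.0, 1000.0 * post_mean - 450.0)
value_with_sample = total / outer
evsi = value_with_sample - value_now
print(f"EVSI estimate: {evsi:.1f}")
```

When the net benefit is nonlinear in the parameters, the inner expectation must itself be estimated by simulation, which is the fully nested case the article addresses.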


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号