Similar literature
 20 similar records found
1.
Chemical risk assessment: a review   (Cited by: 2; self-citations: 0; citations by others: 2)
People are exposed to a staggering assortment of chemicals and foreign substances, and potential health risks accompany these exposures. Intelligent, informed decisions are needed on which risks can and should be reduced, eliminated, or simply ignored. Therefore, a method of determining the human health risks involved in chemical exposure is necessary. This need has driven the evolution of the risk assessment process, which was developed to aid in identifying, characterizing and quantifying risks. Risk assessment today is an essential component of regulatory decision-making. In the context of chemical exposure, risk assessment is an evaluation of the risk posed by human exposure to chemicals in the environment. Quantitative risk assessment (QRA) is the use of experimental laboratory data and/or human epidemiological data to derive a quantitative estimate of the probability of harm occurring to exposed human populations. It is a sophisticated process involving an array of techniques that can be used to identify potential risks to human health. There are four components in the formalized risk assessment process: hazard identification, toxicity assessment, exposure assessment and risk characterization. These four steps collectively address each of six key areas identified as essential in characterizing a risk situation involving chemical exposure. The process of risk estimation involves uncertainties because there are always gaps in knowledge or an incomplete understanding of mechanisms. These gaps are filled by extrapolations, models or assumptions. The uncertainties inherent in the risk assessment process are the basis of arguments against its use. Many of these sources of uncertainty are examined herein, including, but not limited to, modeling methods, understanding of mechanisms and pharmacodynamics, exposure data, assumptions and extrapolations. Some new techniques and approaches being applied to the risk assessment process are also examined, including improved models for extrapolating data and quantifying risks, improved laboratory techniques for investigating pharmacodynamic and mechanistic pathways, and advances in the quality and application of epidemiological data. Finally, the concept of uncertainty itself is being examined, and attempts are being made to directly address, quantify and manage it.

2.
Human exposure limits (HELs) for chemicals with a toxicological threshold are traditionally derived using default assessment factors that account for variations in exposure duration, species sensitivity and individual sensitivity. The present paper elaborates a probabilistic approach for human hazard characterization and the derivation of HELs. It extends the framework for evaluating and expressing uncertainty in hazard characterization recently proposed by WHO-IPCS, i.e. by the incorporation of chemical-specific data on human variability in toxicokinetics. The incorporation of human variability in toxicodynamics was based on the variation between adverse outcome pathways (AOPs). Furthermore, sources of interindividual variability and uncertainty are propagated separately throughout the derivation process. The outcome is a two-dimensional human dose distribution that quantifies the population fraction exceeding a pre-selected critical effect level with an estimate of the associated uncertainty. This enables policy makers to set separate standards for the fraction of the population to be protected and the confidence level of the assessment. The main sources of uncertainty in the human dose distribution can be identified in order to plan new research for reducing uncertainty. Additionally, the approach enables quantification of the relative risk for specific subpopulations. The approach is demonstrated for two pharmaceuticals, i.e. the antibiotic ciprofloxacin and the antineoplastic methotrexate. For both substances, the probabilistic HEL is mainly influenced by uncertainty originating from: (1) the point of departure (PoD), (2) extrapolation from sub-acute to chronic toxicity and (3) interspecies extrapolation. However, when assessing the tails of the two-dimensional human dose distributions, i.e. the section relevant for the derivation of human exposure limits, interindividual variability in toxicodynamics also becomes important.
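As a rough illustration of the two-dimensional approach sketched in this abstract, the following Python snippet propagates uncertainty (outer loop) and interindividual variability (inner loop) separately through a nested Monte Carlo simulation. All distributions, factor values and the dose metric are hypothetical placeholders, not values from the paper.

```python
# Minimal two-dimensional Monte Carlo sketch: uncertainty (outer loop) and
# interindividual variability (inner loop) are propagated separately.
# All distributions and parameter values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(42)
N_UNCERTAINTY, N_VARIABILITY = 1000, 2000

# Outer loop: uncertain quantities (illustrative lognormal distributions)
pod = rng.lognormal(np.log(10.0), 0.3, N_UNCERTAINTY)          # animal PoD, mg/kg bw/day
f_subchronic = rng.lognormal(np.log(2.0), 0.4, N_UNCERTAINTY)  # sub-acute -> chronic extrapolation
f_interspecies = rng.lognormal(np.log(4.0), 0.4, N_UNCERTAINTY)  # animal -> average human

protected_fraction = 0.99  # policy choice: fraction of the population to protect
hel_samples = np.empty(N_UNCERTAINTY)

for i in range(N_UNCERTAINTY):
    median_human_dose = pod[i] / (f_subchronic[i] * f_interspecies[i])
    # Inner loop: interindividual variability in toxicokinetics and toxicodynamics
    tk = rng.lognormal(0.0, 0.35, N_VARIABILITY)
    td = rng.lognormal(0.0, 0.25, N_VARIABILITY)
    individual_thresholds = median_human_dose / (tk * td)
    # Dose below which only (1 - protected_fraction) of individuals would be affected
    hel_samples[i] = np.quantile(individual_thresholds, 1.0 - protected_fraction)

# Report the exposure limit at a chosen confidence level (e.g. 95%)
print("HEL protecting 99% of the population, 95% confidence:",
      round(np.quantile(hel_samples, 0.05), 4), "mg/kg bw/day")
```

Changing the protected fraction and the confidence level illustrates how the two policy choices described in the abstract can be set independently.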

3.
4.
5.
Health risk assessments have been so widely adopted in the United States that their conclusions are a major factor in many environmental decisions. The procedure by which these assessments are conducted has evolved over the past 10-15 years, and a number of shortcomings have been widely recognized. Unfortunately, improvements in the process have often occurred more slowly than advancements in technology or scientific knowledge. Recent significant advances for more accurately estimating the risks posed by environmental chemicals are likely to have a dramatic effect on the regulation of many substances. Each of the four portions of risk assessment (hazard identification, dose-response assessment, exposure assessment, and risk characterization) has undergone significant refinement since 1985. This paper reviews some of the specific changes and explains the likely benefits as well as the implications. Emphasis is placed on the improved techniques for (a) identifying those chemicals which may pose a human cancer or developmental hazard, (b) using statistical approaches which account for the distribution of interindividual biological differences, (c) using lognormal statistics when interpreting environmental data, (d) using physiologically based pharmacokinetic models for estimating delivered dose and for scaling up rodent data, (e) using biologically based cancer models to account for the seven or more apparently different mechanisms of chemical carcinogenesis, (f) describing the severity of the public health risks by considering those portions of the population exposed to various concentrations of a contaminant, and (g) reviewing how criteria for acceptable risk have been influenced by the number of exposed persons. The net benefit of these improvements should be a reduction in the uncertainty inherent in current estimates of the health risks posed by low-level exposure to carcinogens and developmental toxicants.
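As a minimal illustration of point (c) above, the snippet below fits a lognormal distribution to a small set of made-up environmental concentration measurements and estimates the fraction expected to exceed a guideline value; the concentrations and the guideline are hypothetical.

```python
# Treat environmental monitoring data as lognormal and estimate the fraction
# expected to exceed a guideline. All numbers are hypothetical placeholders.
import numpy as np
from math import erf, sqrt

concentrations = np.array([0.8, 1.2, 0.5, 2.4, 1.1, 0.9, 3.0, 0.7, 1.6, 1.3])  # ug/L, hypothetical
guideline = 4.0  # ug/L, hypothetical

log_c = np.log(concentrations)
mu, sigma = log_c.mean(), log_c.std(ddof=1)

def lognormal_exceedance(threshold, mu, sigma):
    """P(C > threshold) for a lognormal with log-mean mu and log-sd sigma."""
    z = (np.log(threshold) - mu) / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

print(f"Geometric mean concentration: {np.exp(mu):.2f} ug/L")
print(f"Estimated exceedance of the guideline: {100 * lognormal_exceedance(guideline, mu, sigma):.2f}%")
```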

6.
The objectives of REACH cannot be achieved under the current risk assessment approach. A change in mindset among all the relevant stakeholders is needed: risk assessment should move away from a labor-intensive and animal-consuming approach towards intelligent and pragmatic testing, by combining exposure and hazard data effectively and by trying to group chemicals (category approaches). The focus should be on reducing the overall uncertainties for the full set of 30,000 chemicals while acknowledging the uncertainty paradox: reducing uncertainty in the assessment of individual chemicals following the classical chemical-by-chemical approach of previous decades will prolong uncertainty for the group of 30,000 chemicals as a whole. With the first REACH registration deadline (2010) rapidly approaching, this change in mindset is urgently needed. We can speed up the regulatory acceptance process, starting with the maximum use of currently available exposure and hazard data, tools and models. Optimal use should also be made of experimental exposure and hazard data generated under REACH. Only such an approach will make it possible to obtain a sufficient level of information within the time frame of REACH. A much more intensive dialogue between stakeholders is necessary.

7.
The development and use of emerging technologies such as nanomaterials can bring both benefits and risks to society. Emerging materials may promise many technological advantages but may not be well characterized in terms of their production volumes, magnitude of emissions, behaviour in the environment and effects on living organisms. This uncertainty presents challenges to the scientists developing these materials and to those responsible for defining and measuring their adverse impacts. Human health risk assessment is a method of identifying the intrinsic hazard of a chemical and quantifying its dose–response relationship and exposure in order to arrive at an estimate of risk. Commonly applied deterministic approaches may not adequately estimate and communicate the likelihood of risks from emerging technologies whose uncertainty is large. Probabilistic approaches allow parameters in the risk assessment process to be defined by distributions instead of single deterministic values whose uncertainty could undermine the value of the assessment. A probabilistic approach was applied to the dose–response and exposure assessment of a case study involving the production of titanium dioxide nanoparticles in seven different exposure scenarios. Only one exposure scenario, dumping high volumes of nano-TiO2 powders into an open vessel with no personal protective equipment, showed a statistically significant level of risk. The probabilistic approach provided not only the likelihood of the estimated risk but also its major contributing factors (e.g. emission potential).
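A bare-bones sketch of the probabilistic idea described here, with exposure and the tolerable level both represented by distributions rather than single values; the exposure model, distributions and parameter values are hypothetical and do not reproduce the nano-TiO2 case study.

```python
# Probabilistic risk characterization: P(exposure > safe level) plus a crude
# rank-correlation sensitivity measure. All inputs are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical inhalation exposure model for one handling scenario
emission_potential = rng.lognormal(np.log(0.5), 0.6, N)   # mg/m3 released
mitigation_factor = rng.lognormal(np.log(0.3), 0.4, N)    # dilution / controls
exposure = emission_potential * mitigation_factor         # mg/m3 at breathing zone

# Hypothetical probabilistic "no-effect" concentration
safe_level = rng.lognormal(np.log(0.6), 0.5, N)           # mg/m3

rcr = exposure / safe_level                                # risk characterization ratio
print(f"P(exposure exceeds safe level) = {np.mean(rcr > 1.0):.3f}")

# Which input drives the estimated risk?
for name, x in [("emission potential", emission_potential),
                ("mitigation factor", mitigation_factor),
                ("safe level", safe_level)]:
    r = np.corrcoef(np.log(x), np.log(rcr))[0, 1]
    print(f"  correlation of log {name} with log RCR: {r:+.2f}")
```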

8.
In the most recent risk assessment for bisphenol A (BPA), the European Food Safety Authority conducted a multi-route aggregate exposure assessment for the first time. This assessment includes exposure via dietary sources as well as contributions from the most important non-dietary sources. Both average and high aggregate exposures were calculated by source-to-dose modeling (forward calculation) for different age groups and compared with estimates based on urinary biomonitoring data (backward calculation). The aggregate exposure estimates obtained by forward and backward modeling are of the same order of magnitude, with forward modeling yielding higher estimates associated with larger uncertainty. Yet only forward modeling can indicate the relative contribution of different sources. Dietary exposure, especially via canned food, appears to be the most important exposure source and, based on the central aggregate exposure estimates, contributes around 90% of internal exposure to total (conjugated plus unconjugated) BPA. Dermal exposure via thermal paper and, to a lesser extent, via cosmetic products may contribute around 10% for some age groups. The uncertainty around these estimates is considerable, but since dermally absorbed BPA is not subject to first-pass metabolism by conjugation, dermal sources may be of equal or even higher toxicological relevance than dietary sources.
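The following sketch shows the forward (source-to-dose) aggregation and the backward (biomonitoring-based) check in their simplest deterministic form; all intake values, absorption fractions and urinary parameters are hypothetical placeholders rather than EFSA figures.

```python
# Forward source-to-dose aggregation across routes, then a backward estimate
# from urinary biomonitoring. All numbers are hypothetical placeholders.

sources = {
    # external intake (ug/kg bw/day) and route-specific absorbed fraction
    "canned food (oral)":     {"intake": 0.120, "absorbed_fraction": 1.00},
    "other diet (oral)":      {"intake": 0.030, "absorbed_fraction": 1.00},
    "thermal paper (dermal)": {"intake": 0.090, "absorbed_fraction": 0.10},
    "cosmetics (dermal)":     {"intake": 0.020, "absorbed_fraction": 0.10},
}

internal = {k: v["intake"] * v["absorbed_fraction"] for k, v in sources.items()}
total_internal = sum(internal.values())
for source, dose in internal.items():
    print(f"{source:25s} {dose:.3f} ug/kg bw/day ({100 * dose / total_internal:.0f}%)")
print(f"Aggregate internal dose (forward): {total_internal:.3f} ug/kg bw/day")

# Backward: assuming near-complete urinary excretion of total BPA,
# daily intake ~ urinary concentration x urine volume / body weight.
urinary_conc = 2.0      # ug/L total BPA, hypothetical
urine_volume = 1.6      # L/day
body_weight = 60.0      # kg
backward_dose = urinary_conc * urine_volume / body_weight
print(f"Aggregate internal dose (backward, biomonitoring): {backward_dose:.3f} ug/kg bw/day")
```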

9.
Risk characterization comprises hazard characterization and exposure assessment. Hazard characterization can seldom be based on human data alone, as these data (1) are seldom available, (2) are quite insensitive in identifying hazards, and (3) mostly lack reliable exposure-response information. Epidemiological information therefore needs to be complemented with information from experimental animals and in vitro systems. These observations require species-to-species extrapolation, which is often based on weakly founded generic default values. Default values may be replaced by chemical-specific uncertainty factors, but these need to be applied cautiously and preferably within a predetermined framework with transparent guidance on what constitutes reliable evidence. Structure-activity relationships (SARs) are useful in setting priorities for hazard characterization and data generation, but seldom constitute a sufficient basis for quantitative hazard characterization on their own. Little progress has been made in the assessment of hazards from multiple simultaneous or successive exposures. Information on the exposure of the population whose risks are to be assessed relies predominantly on models of varying complexity. In the assessment of exposure to elements, speciation and bioavailability are important parameters for which information is often limited.

10.
On a global scale, pathogenic contamination of drinking water poses the most significant health risk to humans, and there have been countless numbers of disease outbreaks and poisonings throughout history resulting from exposure to untreated or poorly treated drinking water. However, significant risks to human health may also result from exposure to nonpathogenic, toxic contaminants that are often globally ubiquitous in waters from which drinking water is derived. With this latter point in mind, the objective of this commission paper is to discuss the primary sources of toxic contaminants in surface waters and groundwater, the pathways through which they move in aquatic environments, factors that affect their concentration and structure along the many transport flow paths, and the relative risks that these contaminants pose to human and environmental health. In assessing the relative risk of toxic contaminants in drinking water to humans, we have organized our discussion to follow the classical risk assessment paradigm, with emphasis placed on risk characterization. In doing so, we have focused predominantly on toxic contaminants that have had a demonstrated or potential effect on human health via exposure through drinking water. In the risk assessment process, understanding the sources and pathways for contaminants in the environment is a crucial step in addressing (and reducing) uncertainty associated with estimating the likelihood of exposure to contaminants in drinking water. More importantly, understanding the sources and pathways of contaminants strengthens our ability to quantify effects through accurate measurement and testing, or to predict the likelihood of effects based on empirical models. Understanding the sources, fate, and concentrations of chemicals in water, in conjunction with assessment of effects, not only forms the basis of risk characterization, but also provides critical information required to render decisions regarding regulatory initiatives, remediation, monitoring, and management. Our discussion is divided into two primary themes. First we discuss the major sources of contaminants from anthropogenic activities to aquatic surface and groundwater and the pathways along which these contaminants move to become incorporated into drinking water supplies. Second, we assess the health significance of the contaminants reported and identify uncertainties associated with exposures and potential effects. Loading of contaminants to surface waters, groundwater, sediments, and drinking water occurs via two primary routes: (1) point-source pollution and (2) non-point-source pollution. Point-source pollution originates from discrete sources whose inputs into aquatic systems can often be defined in a spatially explicit manner. Examples of point-source pollution include industrial effluents (pulp and paper mills, steel plants, food processing plants), municipal sewage treatment plants and combined sewage-storm-water overflows, resource extraction (mining), and land disposal sites (landfill sites, industrial impoundments). Non-point-source pollution, in contrast, originates from poorly defined, diffuse sources that typically occur over broad geographical scales. Examples of non-point-source pollution include agricultural runoff (pesticides, pathogens, and fertilizers), storm-water and urban runoff, and atmospheric deposition (wet and dry deposition of persistent organic pollutants such as polychlorinated biphenyls [PCBs] and mercury). 
Within each source, we identify the most important contaminants that have either been demonstrated to pose significant risks to human health and/or aquatic ecosystem integrity, or which are suspected of posing such risks. Examples include nutrients, metals, pesticides, persistent organic pollutants (POPs), chlorination by-products, and pharmaceuticals. Due to the significant number of toxic contaminants in the environment, we have necessarily restricted our discussion to those chemicals that pose risks to human health via exposure through drinking water. A comprehensive and judicious consideration of the full range of contaminants that occur in surface waters, sediments, and drinking water would be a large undertaking and clearly beyond the scope of this article. However, where available, we have provided references to relevant literature to assist the reader in undertaking a detailed investigation of their own. The information collected on specific chemicals within major contaminant classes was used to determine their relative risk using the hazard quotient (HQ) approach. Hazard quotients are the most widely used method of assessing risk in which the exposure concentration of a stressor, either measured or estimated, is compared to an effect concentration (e.g., no-observed-effect concentration or NOEC). A key goal of this assessment was to develop a perspective on the relative risks associated with toxic contaminants that occur in drinking water. Data used in this assessment were collected from literature sources and from the Drinking Water Surveillance Program (DWSP) of Ontario. For many common contaminants, there was insufficient environmental exposure (concentration) information in Ontario drinking water and groundwater. Hence, our assessment was limited to specific compounds within major contaminant classes including metals, disinfection by-products, pesticides, and nitrates. For each contaminant, the HQ was estimated by expressing the maximum concentration recorded in drinking water as a function of the water quality guideline for that compound. There are limitations to using the hazard quotient approach of risk characterization. For example, HQs frequently make use of worst-case data and are thus designed to be protective of almost all possible situations that may occur. However, reduction of the probability of a type II error (false negative) through the use of very conservative application factors and assumptions can lead to the implementation of expensive measures of mitigation for stressors that may pose little threat to humans or the environment. It is important to realize that our goal was not to conduct a comprehensive, in-depth assessment of risk for each chemical; more comprehensive assessments of managing risks associated with drinking water are addressed in a separate issue paper by Krewski et al. (2001a). Rather, our goal was to provide the reader with an indication of the relative risk of major contaminant classes as a basis for understanding the risks associated with the myriad forms of toxic pollutants in aquatic systems and drinking water. For most compounds, the estimated HQs were < 1. This indicates that there is little risk associated with exposure from drinking water to the compounds tested. There were some exceptions. For example, nitrates were found to commonly yield HQ values well above 1 in many rural areas. Further, lead, total trihalomethanes, and trichloroacetic acid yielded HQs > 1 in some treated distribution waters (water distributed to households).
These latter compounds were further assessed using a probabilistic approach; these assessments indicated that the maximum allowable concentrations (MAC) or interim MACs for the respective compounds were exceeded less than 5% of the time. In other words, the probability of finding these compounds in drinking water at levels that pose a risk to humans through ingestion is low. Our review has been carried out in accordance with the conventional principles of risk assessment. Application of the risk assessment paradigm requires rigorous data on both exposure and toxicity in order to adequately characterize potential risks of contaminants to human health and ecological integrity. Weaknesses arising from poor or missing data in either the exposure or effects stages of the risk assessment process significantly reduce the confidence that can be placed in the overall risk assessment. Overall, while our review suggested selected instances of potential risks to human health from exposure to contaminants in drinking water, we also noted a distinct paucity of information on exposure levels for many contaminants in this matrix. We suggest that this represents a significant limitation to conducting sound risk assessments and introduces considerable uncertainty with respect to the management of water quality. In this context, future research must place greater emphasis on targeted monitoring and assessment of specific contaminants (e.g., pharmaceuticals) in drinking water for which there is currently little information. This could be conducted using a tiered risk approach, beginning with, for example, a hazard quotient assessment. Potentially problematic compounds identified in these preliminary assessments would then be subjected to more comprehensive risk assessments using probabilistic methods, if sufficient data exist to do so. On this latter point, adequate assessment of potential risks for many contaminants in drinking water is currently limited by a paucity of toxicological information. Generating this important information is a critical research need and would reduce the uncertainty associated with conducting risk assessments.
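The hazard quotient screening step described above reduces to a simple ratio, as the sketch below illustrates for a few contaminants; the concentrations and guideline values are placeholders, not DWSP data.

```python
# Hazard quotient (HQ) screening: maximum observed concentration in drinking
# water divided by the water quality guideline. All values are illustrative.
contaminants = {
    # name: (max observed concentration, guideline), both in mg/L
    "nitrate (as N)":        (14.0, 10.0),
    "lead":                  (0.012, 0.010),
    "total trihalomethanes": (0.120, 0.100),
    "atrazine":              (0.0008, 0.005),
}

for name, (c_max, guideline) in contaminants.items():
    hq = c_max / guideline
    flag = "potential concern" if hq > 1.0 else "little risk indicated"
    print(f"{name:25s} HQ = {hq:5.2f}  ->  {flag}")
```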

11.
Although risk assessment, which evaluates the potential harm of each particular exposure to a substance, is desirable, it is not feasible in many situations. Risk assessment comprises hazard identification, hazard characterisation, and exposure assessment. In the absence of risk assessment, the purpose of classification is to give broad guidance (through the label) on the suitability of a chemical in a range of use situations. Hazard classification in the EU is a process involving identification of the hazards of a substance, followed by comparison of those hazards (including degree of hazard) with defined criteria. Classification should therefore give guidance on degree of hazard as well as hazard identification. Potency is the most important indicator of degree of hazard and should therefore be included in classification. This is already done for acute lethality and general toxicity by classifying on the dose required to cause the effect. The EU classification for carcinogenicity and reproductive toxicity, however, does not discriminate across the wide range of potencies (six orders of magnitude) seen for carcinogenicity, developmental toxicity and effects on fertility. Therefore potency should be included in the classification process. The methodology in the EU classification guidelines for deriving specific concentration limits is a rigorous process that incorporates potency to assign substances causing tumours, developmental toxicity or infertility in experimental animals to high, medium or low degree-of-hazard categories. Methods are suggested for how the degree of hazard so derived could be used in the EU classification process to improve hazard communication and in downstream risk management.

12.
While probabilistic methods gain attention in hazard characterization and are increasingly used in exposure assessment, full use of the available probabilistic information in risk characterization is still uncommon. Usually, after probabilistic hazard characterization and/or exposure assessment, percentiles from the obtained distributions are used as point estimates in risk characterization. In this way, all information on variability and uncertainty is lost, while these aspects are crucial in any risk assessment. In this paper, we present a method to integrate the entire distributions from probabilistic hazard characterization and exposure assessment into one risk characterization plot. This method is illustrated using di(2-ethylhexyl) phthalate as an example. The final result of this probabilistic risk assessment is summarized in a single plot, containing two pieces of information: the confidence we may have in concluding there is no risk, and the fraction of the population this conclusion applies to. This information leads to a better informed conclusion on the risk of a substance, and may be very useful to define the necessary measures for risk reduction.
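A compact sketch of how the two pieces of information mentioned above (the confidence in a "no risk" conclusion and the population fraction it applies to) can be computed by combining a probabilistic hazard characterization with a probabilistic exposure assessment; the distributions below are hypothetical and do not reproduce the DEHP example.

```python
# For each realization of the uncertain quantities, compute the fraction of the
# population whose exposure stays below the critical dose; across realizations,
# read off the confidence attached to that fraction. Hypothetical inputs only.
import numpy as np

rng = np.random.default_rng(7)
N_UNC, N_POP = 2000, 10_000

protected_fractions = np.empty(N_UNC)
for i in range(N_UNC):
    # Uncertainty: critical human dose and median population exposure
    critical_dose = rng.lognormal(np.log(50.0), 0.5)      # ug/kg bw/day
    median_exposure = rng.lognormal(np.log(5.0), 0.3)     # ug/kg bw/day
    # Variability: exposure across individuals in the population
    exposures = rng.lognormal(np.log(median_exposure), 0.8, N_POP)
    protected_fractions[i] = np.mean(exposures < critical_dose)

for confidence in (0.50, 0.90, 0.95):
    frac = np.quantile(protected_fractions, 1.0 - confidence)
    print(f"With {confidence:.0%} confidence, at least {frac:.1%} of the population "
          f"is below the critical dose")
```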

13.
Reducing uncertainty in estimated risks is always desirable. While regulatory assumptions and policies regarding risk assessment of chemicals can be debated, these are the rules under which many risk assessments are currently conducted. Methods for reducing the uncertainty in risk estimates generated under those rules are therefore useful, whether or not one agrees with the models and the underlying assumptions that comprise those rules. The guidance for risk assessment of mixtures of chemicals used by EPA was reexamined to determine methods for reducing the uncertainty in the cumulative risk estimates. It was found that the uncertainty could be significantly reduced if the assumptions concerning the assessment of mixtures of chemicals were combined with the assumptions for evaluating the risks of individual chemicals. Methods are proposed for reducing the uncertainty in risk estimates for mixtures of chemicals whose individual constituents are evaluated by cancer potency factors, hazard quotients, or margins of exposure. The methods developed do not require data beyond what would be needed to generate the current risk estimates for the individual chemicals. These analyses also demonstrated that some of the assumptions currently in use for regulatory risk assessment may lead to inconsistencies that should be reevaluated. As the proposed methods do not require any change in the current assumptions to reduce uncertainty in the risk estimates, they should prove useful until such time as the assumptions are reevaluated and possibly changed.
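For orientation, the default cumulative-risk arithmetic that such assessments start from can be written down in a few lines: cancer risks are summed across constituents, and non-cancer hazard quotients are summed into a hazard index. The doses, slope factors and reference doses below are hypothetical, and the paper's proposed refinements are not reproduced here.

```python
# Default cumulative-risk arithmetic for a chemical mixture: summed cancer
# risks and a hazard index. All values are hypothetical placeholders.
mixture = [
    # (name, chronic daily intake mg/kg-day, cancer slope factor (mg/kg-day)^-1, reference dose mg/kg-day)
    ("chemical A", 1.0e-4, 0.05, None),
    ("chemical B", 5.0e-4, None, 2.0e-3),
    ("chemical C", 2.0e-4, 0.50, 1.0e-3),
]

cancer_risk = sum(dose * csf for _, dose, csf, _ in mixture if csf is not None)
hazard_index = sum(dose / rfd for _, dose, _, rfd in mixture if rfd is not None)

print(f"Summed excess lifetime cancer risk: {cancer_risk:.1e}")
print(f"Hazard index (sum of hazard quotients): {hazard_index:.2f}")
```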

14.
Risk characterization is the final step of the risk assessment process as practiced in the U.S. EPA. In risk characterization, the major scientific evidence and "bottom-line" results from the other components of the risk assessment process, hazard identification, dose-response assessment, and exposure assessment, are evaluated and integrated into an overall conclusion about the risks posed by a given situation. Risk characterization is also an iterative process; the results of a specific step may require re-evaluation or additional information to finalize the risk assessment process. Risks posed by atmospheric emissions are an example of an involuntary human health risk which typically receives a great deal of public attention. Characterization of the risks posed by atmospheric emissions typically requires the use of mathematical models to evaluate: 1) the environmental fate of emitted pollutants, 2) exposures to these pollutants, and 3) human dose-response. Integration of these models results in quantitative risk estimates. The confidence in a quantitative risk estimate is examined by evaluating uncertainty and variability within individual risk assessment components. Variability arises from the true heterogeneity in characteristics within a population or an event; on the other hand, uncertainty represents lack of knowledge about the true value used in a risk estimate. U.S. EPA's 1997 Mercury Study will illustrate some aspects of the risk characterization process as well as the uncertainty and variability encountered in the risk assessment process.

15.
Risk characterization is the final step of the risk assessment process as practiced in the U.S. EPA. In risk characterization, the major scientific evidence and "bottom-line" results from the other components of the risk assessment process, hazard identification, dose-response assessment, and exposure assessment, are evaluated and integrated into an overall conclusion about the risks posed by a given situation. Risk characterization is also an iterative process; the results of a specific step may require re-evaluation or additional information to finalize the risk assessment process. Risks posed by atmospheric emissions are an example of an involuntary human health risk which typically receives a great deal of public attention. Characterization of the risks posed by atmospheric emissions typically requires the use of mathematical models to evaluate: 1) the environmental fate of emitted pollutants, 2) exposures to these pollutants, and 3) human dose-response. Integration of these models results in quantitative risk estimates. The confidence in a quantitative risk estimate is examined by evaluating uncertainty and variability within individual risk assessment components. Variability arises from the true heterogeneity in characteristics within a population or an event; on the other hand, uncertainty represents lack of knowledge about the true value used in a risk estimate. U.S. EPA's 1997 Mercury Study will illustrate some aspects of the risk characterization process as well as the uncertainty and variability encountered in the risk assessment process.

16.
The setting of standards for individual substances is an important tool in the protection of human health. However, it has been a topic of discussion for many years whether this type of standard-setting alone will meet common goals for health protection, since humans are exposed to a large variety of chemical substances from many different sources, in variable concentrations and by different routes of exposure. The complexity of this problem makes the question difficult to answer and almost impossible to quantify. It is common knowledge that combined exposure to chemical substances may cause synergism, but such examples generally involve relatively high levels of exposure. In present environmental and occupational practice, exposure to individual chemicals has usually been reduced to acceptable levels. The key question is whether exposure to mixtures at levels of the single components near or below no-observed-adverse-effect levels can still cause adverse effects. Few countries have incorporated procedures concerning combination toxicity into their chemicals policy, and where they have, the uncertainty in these procedures is considerable because of a lack of relevant data. This usually leads to a conservative approach. Roughly two approaches can be distinguished for systemic toxicants: introduction of an (extra) uncertainty factor or application of the additivity principle. In the Netherlands, a safety factor for systemic toxicants was introduced in 1989 to account for combination effects. Problems related to this approach have led to adaptations in procedures and a reconsideration of the chosen safety factor.

17.
Exposure reconstruction for substances of interest to human health is a process that has been used, with various levels of sophistication, as far back as the 1930s. The importance of robust and high-quality exposure reconstruction has been recognized by many researchers. It has been noted that misclassification of reconstructed exposures is relatively common and can result in potentially significant effects on the conclusions of a human health risk assessment or epidemiology study. In this analysis, a review of the key exposure reconstruction approaches described in over 400 papers in the peer-reviewed literature is presented. These approaches have been critically evaluated and classified according to quantitative, semiquantitative, and qualitative approaches. Our analysis indicates that much can still be done to improve the overall quality and consistency of exposure reconstructions and that a systematic framework would help to standardize the exposure reconstruction process in the future. The seven recommended steps in the exposure reconstruction process include identifying the goals of the reconstruction, organizing and ranking the available data, identifying key data gaps, selecting the best information sources and methodology for the reconstruction, incorporating probabilistic methods into the reconstruction, conducting an uncertainty analysis, and validating the results of the reconstruction. Influential emerging techniques, such as Bayesian data analysis, are highlighted. Important issues that will likely influence the conduct of exposure reconstruction into the future include improving statistical analysis methods, addressing the issue of chemical mixtures, evaluating aggregate exposures, and ensuring transparency with respect to variability and uncertainty in the reconstruction effort.
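As a small illustration of the Bayesian techniques highlighted above, the snippet below updates a prior estimate of a worker group's geometric mean exposure with a handful of historical measurements, using a conjugate normal update on the log scale; all numbers, including the assumed log-scale standard deviation, are hypothetical.

```python
# Bayesian exposure reconstruction sketch: conjugate normal update of the log
# geometric mean exposure, with the log-scale SD assumed known. Hypothetical data.
import math

measurements = [0.42, 0.85, 0.31, 0.60]        # mg/m3, hypothetical historical samples
log_x = [math.log(v) for v in measurements]

sigma = 0.7                                     # assumed known log-scale SD of measurements
prior_mu, prior_sd = math.log(0.3), 0.5         # prior on the log geometric mean (e.g. from a model)

n = len(log_x)
xbar = sum(log_x) / n
post_var = 1.0 / (1.0 / prior_sd**2 + n / sigma**2)
post_mu = post_var * (prior_mu / prior_sd**2 + n * xbar / sigma**2)

print(f"Prior geometric mean:     {math.exp(prior_mu):.2f} mg/m3")
print(f"Posterior geometric mean: {math.exp(post_mu):.2f} mg/m3 "
      f"(95% interval {math.exp(post_mu - 1.96 * math.sqrt(post_var)):.2f}"
      f"-{math.exp(post_mu + 1.96 * math.sqrt(post_var)):.2f})")
```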

18.
Factors influencing the precision of an acceptable daily intake (ADI) are discussed in this paper. The same principles apply to the tolerable daily intake (TDI) and the provisional tolerable weekly intake (PTWI), so even where these are not specifically mentioned, the discussion also refers to TDI and PTWI. The allocation of an ADI is in principle based on the most critical (often the lowest) no-observed-(adverse-)effect level [NO(A)EL] established in toxicological studies in experimental animals or in humans, to which an uncertainty factor is applied for extrapolation from animals or study populations to the general human population (and for extrapolation from high to low intake levels). As the ADI predicts a virtually safe intake level for lifetime exposure, the toxicological database used to establish a NO(A)EL should in general include long-term studies; otherwise only a provisional ADI will be allocated, to which a higher uncertainty factor is applied. The validity of an ADI greatly depends on the precision of the toxicological studies considered for the safety of a food additive or contaminant. The precision of the ADI is also inversely related to the uncertainty factors applied, although these factors are not totally independent of the completeness and precision of the toxicological data from which a NO(A)EL is derived. This paper focuses on the precision of the toxicological data and the established NO(A)EL. Human data on the toxicity of a chemical, which are preferred for safety evaluation or hazard assessment, are frequently unavailable or incomplete with respect to quantitative dose–response assessment. Epidemiological studies have inherent difficulties for hazard assessment, such as possible confounders, the restricted number of toxicological end points that can be studied, and limited quantitative data on oral exposure levels. Case reports share the same limitations, but in addition the exposure data are usually very imprecise because the possible dose level(s) must be reconstructed; case reports of intoxication are moreover mainly restricted to acute and at best subacute effects. Controlled human exposure studies (human volunteer studies) are restricted in their experimental design, for example in the dose levels and toxicological end points that can be used, for medical-ethical reasons. Therefore, a safety evaluation of a food chemical will only rarely be based solely on human data. In the practice of hazard assessment of chemicals in foods, experimental animal studies will be wholly or partly the basis for establishing an ADI. In these toxicological studies in animals there are many experimental variables which can affect the precision of an ADI, such as (1) the duration of the experiment, dose ranges, and identity and purity of the substance; (2) the parameters and toxicological end points studied; (3) the species and strain used; (4) the gut microflora of the test animals; (5) dietary composition; (6) the statistics performed; and (7) knowledge about the kinetic behavior and metabolism of the chemical considered (e.g., elimination half-life and bioavailability of the chemical and its main metabolites). How these factors can influence the precision of a NO(A)EL, and hence of the ADI, is illustrated by several examples. With respect to incidental excursions above an ADI, it can be concluded that, given the variation in the precision of the underlying experiments, slight incidental excursions would not lead to an increased risk.
However, it is not possible to answer in general how often and/or by how much the total intake of a chemical in food may exceed the ADI; this should be considered case by case. To answer such a question, the precision of the studies representative of incidental excursions should be considered. Other factors to be considered are (1) the type of effect on which the ADI was based, (2) the mechanism of toxicity, (3) toxicokinetics and metabolism, and (4) the difference between the NO(A)ELs from short-term toxicity studies and the NO(A)EL on which the ADI was based. Nevertheless, slight excursions of intake above the ADI for a substance should always trigger additional consideration and, if possible, additional studies with high precision.
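The core relationship underlying this discussion can be made concrete with a one-line calculation: the ADI follows from the most critical NO(A)EL divided by an overall uncertainty factor. The values below are purely illustrative.

```python
# ADI derivation in its simplest form. All values are illustrative placeholders.
noael = 5.0              # mg/kg bw/day, most critical NO(A)EL from a chronic animal study
uf_interspecies = 10.0   # animal -> human
uf_intraspecies = 10.0   # average human -> sensitive human
uf_database = 1.0        # would be >1 if, e.g., long-term studies were missing (provisional ADI)

adi = noael / (uf_interspecies * uf_intraspecies * uf_database)
print(f"ADI = {noael} / {uf_interspecies * uf_intraspecies * uf_database:.0f} = {adi:.3f} mg/kg bw/day")
```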

19.
REACH requires health risk management for workers and the general population and introduced the concept of the Derived No-Effect Level (DNEL). DNELs must be derived for all substances that are classified as health hazards. As with analogous health-risk-based guidance values, such as reference doses (RfDs) and tolerable daily intakes (TDIs), risk to health is considered negligible if the actual exposure is less than the DNEL. Exposure assessment is relatively simple for occupational situations but more complex for the general public, in which exposure may occur via multiple pathways, routes, and media. For such complex or partially defined exposure scenarios, human biomonitoring gives a snapshot of the internal or absorbed dose of a chemical and is often the most reliable exposure assessment methodology, as it integrates exposures from all routes. For human risk management, human biomonitoring data can be interpreted using the recently developed concept of Biomonitoring Equivalents (BE). Basically, a BE translates an established reference value into a biomarker concentration using toxicokinetic data. If an exposure assessment using human biomonitoring indicates that measured levels are below the DNEL-based BE (BE(DNEL)), the combined exposure via all potential exposure routes is unlikely to pose a risk to human health and health risk management measures might not be needed. Hence, BEs do not challenge existing risk assessments but rather build upon them to support risk management, the ultimate goal of any risk assessment. A challenge in implementing this approach is the limited availability of toxicokinetic information for many substances. However, methodologies such as generic physiologically-based toxicokinetic models, which allow estimation of biomarker concentrations based on physicochemical properties, are being developed for less data-rich chemicals. Use of BEs by regulatory authorities will allow initial screening of population exposure to chemicals to identify those requiring more detailed risk and exposure assessment, assisting in priority setting and ultimately leading to improved product stewardship and risk management.
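As a rough sketch of the BE concept described above, the snippet below translates a hypothetical DNEL into a urinary biomarker concentration using a simple steady-state mass-balance assumption; the excretion fraction, urine volume and body weight are illustrative assumptions, not values from any actual BE derivation.

```python
# Translate an external reference value into a urinary biomarker concentration
# under a steady-state mass-balance assumption. All values are hypothetical.
dnel = 0.004                   # mg/kg bw/day, external reference value (hypothetical)
body_weight = 70.0             # kg
urinary_excretion_frac = 0.7   # fraction of the absorbed dose excreted as the biomarker
daily_urine_volume = 1.6       # L/day

# At steady state, daily biomarker excretion ~ dnel * bw * excreted fraction,
# diluted in the daily urine volume.
be_urine = dnel * body_weight * urinary_excretion_frac / daily_urine_volume
print(f"BE(DNEL) ~ {be_urine * 1000:.0f} ug/L in urine")
```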

20.
Hakkinen PJ. Toxicology, 2001, 160(1-3): 59-63
There are many challenges and opportunities in being a global and/or “local” toxicologist, risk assessor, or risk manager. These include the information-gathering approaches used to address human hazard identification, exposure assessment, risk characterization, risk management, risk perception, and the needs for risk communication. Finding the best ways to keep current with the literature and other information associated with the many aspects of toxicology and risk analysis is another challenge and opportunity, as are access to good sources of training and awareness of the applicable regulations in various countries and regions. Fortunately, a rapidly increasing number of people throughout the world will have access to the Internet and World Wide Web; these systems provide round-the-clock access to numerous valuable sources of information and to opportunities for training and information sharing.
