Similar Literature
20 similar records found
1.
The current revision of the European policy for the evaluation of chemicals (REACH) has led to a controversy regarding the need for additional animal safety testing. To avoid increases in animal testing, and also to save time and resources, alternative in silico or in vitro tests for the assessment of the toxic effects of chemicals are advocated. The draft of the original document, issued on 29 October 2003 by the European Commission, foresees the use of alternative methods but gives no further specification of which methods should be used. Computer-assisted prediction models, so-called predictive tools, will, besides in vitro models, likely play an essential role in the proposed repertoire of "alternative methods". The current discussion has prompted the Advisory Committee of the German Toxicology Society to present its position on the use of predictive tools in toxicology. Acceptable prediction models already exist for toxicological endpoints based on well-understood mechanisms, such as mutagenicity and skin sensitization, whereas mechanistically more complex endpoints such as acute, chronic or organ toxicities currently cannot be satisfactorily predicted. A potential strategy for assessing such complex toxicities lies in dissecting them into models for the different steps or pathways leading to the final endpoint; integrating these models should yield higher predictivity. Despite these limitations, computer-assisted prediction tools already play a complementary role in the assessment of chemicals for which no data are available or for which toxicological testing is impractical because sufficient quantities of the compound are not available for testing. Furthermore, predictive tools offer support in screening and the subsequent prioritization of compounds for further toxicological testing, as expected within the scope of the European REACH program. This program will also lead to the collection of high-quality data, which will broaden the database for further (Q)SAR approaches and in turn increase the predictivity of predictive tools.

2.
There is a great need for rapid testing strategies for reproductive toxicity that avoid animal use. The EU Framework Programme 7 project ChemScreen aimed to fill this gap in a pragmatic manner, preferably using validated existing tools and placing them in an innovative alternative testing strategy. In our approach we combined knowledge of the critical processes affected by reproductive toxicants with knowledge of the mechanistic basis of such effects. We used in silico methods to prescreen chemicals for relevant toxic effects, aiming to reduce testing needs. For those chemicals that do need testing, we set up an in vitro screening panel that includes mechanistic high-throughput methods as well as lower-throughput assays that measure more integrative endpoints. In silico pharmacokinetic modules were developed for rapid exposure predictions via diverse exposure routes. These modules, which match in vitro and in vivo exposure levels, greatly improved the predictivity of the in vitro tests. As a further step, we generated examples of how to predict the reproductive toxicity of chemicals using available data. We executed formal validations of panel constituents and also used more innovative, mechanism-based approaches to validate the test panel. We are actively engaged in promoting regulatory acceptance of the tools developed as an essential step towards practical application, including case studies for read-across purposes. With this approach, a significant saving in animal use and associated costs appears very feasible.

3.
Toxicology continues to rely heavily on animal testing to predict the potential for toxicity in humans. Where mechanisms of toxicity have been elucidated, for example endocrine disruption by xenoestrogens binding to the estrogen receptor, in vitro assays have been developed as surrogate assays for toxicity prediction. This mechanistic information can be combined with other data, such as exposure levels, to inform a risk assessment for the chemical. However, there remains a paucity of such mechanistic assays, due at least in part to the lack of methods for determining the specific mechanisms of toxicity of many toxicants. One means of addressing this deficiency lies in utilizing the vast repertoire of tools developed by the drug discovery industry for interrogating the bioactivity of chemicals. This review describes the application of high-throughput screening assays as experimental tools for profiling the toxic potential of chemicals and understanding the underlying mechanisms. The accessibility of broad panels of assays covering an array of protein families permits evaluation of chemicals for their ability to directly modulate many potential targets of toxicity. In addition, advances in cell-based screening have yielded tools capable of reporting the effects of chemicals on numerous critical cell signaling pathways and cell health parameters. Novel, more complex cellular systems are being used to model mammalian tissues and the consequences of compound treatment. Finally, high-throughput technology is being applied to model-organism screens to understand mechanisms of toxicity. However, a number of formidable challenges remain to be overcome before these methods are widely applicable. Integration of successful approaches will contribute towards building a systems approach to toxicology that provides a mechanistic understanding of the effects of chemicals on biological systems and aids in rational risk assessments.

4.
Abstract

Identification of the potential hazards of chemicals has traditionally relied on studies in laboratory animals, in which changes in clinical pathology and histopathology relative to untreated controls defined an adverse effect. In the past decades, increased consistency has been reached in the definition of adversity for chemically induced effects in laboratory animals, as well as in the assessment of human relevance. More recently, a paradigm shift in toxicity testing has been proposed, driven mainly by concerns over animal welfare but also enabled by the development of new methods. Currently, in vitro approaches, toxicogenomic technologies and computational tools are available to provide mechanistic insight into the toxicological mode of action (MOA) underlying the adverse effects observed in laboratory animals. The vision described as Tox21c (Toxicity Testing in the 21st Century) aims to predict in vivo toxicity using a bottom-up approach, starting with an understanding of MOA based on in vitro data and ultimately predicting adverse effects in humans. At present, practical application of the Tox21c vision is still far away. While moving towards toxicity prediction based on in vitro data, a stepwise reduction of in vivo testing is foreseen by combining in vitro with in vivo tests. Furthermore, newly developed methods will increasingly be applied in conjunction with established methods in order to gain trust in the new methods. This confidence rests on a critical scientific prerequisite: the establishment of a causal link between data obtained with new technologies and adverse effects manifested in repeated-dose in vivo toxicity studies. It is proposed to apply the principles described in the WHO/IPCS MOA framework to obtain this link. Finally, an international database of MOAs established in laboratory animals using data-rich chemicals will facilitate regulatory acceptance and could further help in the validation of the toxicity pathway and adverse outcome pathway concepts.

5.
The current testing requirements for both adult and developmental neurotoxicity evaluation are based on in vivo animal models, and the neurotoxic potency of compounds is determined mainly by neurobehavioural and neuropathological effects. In vitro studies are considered complementary to animal tests because they provide an understanding of the molecular and cellular mechanisms involved in neurotoxicity. However, relevant in vitro neuronal/glial-specific endpoints, applied to various neuronal cellular models, must be selected carefully to build reliable and feasible testing strategies, since these endpoints usually have to be tested in several complementary in vitro systems. The requirements for applying a more complex test strategy, in which toxicokinetic aspects are included together with different tools to compensate for the lack of in vitro metabolic competence, are discussed. In light of the recent European Commission chemical legislation concerning the registration, evaluation and authorisation of chemicals (REACH), it has become a priority to develop new intelligent testing strategies that integrate computational models with in vitro assays based on cell culture models and on endpoints amenable to adaptation for high-throughput screening, so that large numbers of chemicals can be tested.

6.
Toxicity testing is essential for the protection of human health from exposure to toxic environmental chemicals. As traditional toxicity testing is carried out using animal models, mammalian cell culture models are becoming an increasingly attractive alternative to animal testing. Combining mammalian cell culture models with screening-style molecular profiling technologies, such as metabolomics, can uncover previously unknown biochemical bases of toxicity. We used a mass spectrometry-based untargeted metabolomics approach to characterize, for the first time, the changes in the metabolome of the B50 cell line, an immortalised rat neuronal cell line, following acute exposure to two known neurotoxic chemicals that are common environmental contaminants: the pyrethroid insecticide permethrin and the organophosphate insecticide malathion. B50 cells were exposed to either the dosing vehicle (methanol) or an acute dose of permethrin or malathion for 6 or 24 hours. Intracellular metabolites were profiled by gas chromatography-mass spectrometry. Using principal components analysis, we selected the key metabolites whose abundance was altered by chemical exposure. By considering the major fold changes in abundance (>2.0 or <0.5 relative to control) across these metabolites, we were able to elucidate important cellular events associated with toxic exposure, including disrupted energy metabolism and attempted protective mechanisms against excitotoxicity. Our findings illustrate the ability of mammalian cell culture metabolomics to detect the finer metabolic effects of acute exposure to known toxic chemicals, and support the need for further development of this approach for trace-level-dose and chronic toxicity studies and for the toxicity testing of unknown chemicals.
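The fold-change screen described in this abstract (flagging metabolites whose mean abundance changes by more than 2.0-fold or less than 0.5-fold relative to control) can be sketched in a few lines. The abundance values and function names below are our own illustrative assumptions, not data or code from the study:

```python
def fold_changes(control, treated):
    """Per-metabolite fold change of mean abundance (treated / control)."""
    n_met = len(control[0])
    fc = []
    for j in range(n_met):
        ctrl_mean = sum(row[j] for row in control) / len(control)
        trt_mean = sum(row[j] for row in treated) / len(treated)
        fc.append(trt_mean / ctrl_mean)
    return fc

def select_altered(fc, low=0.5, high=2.0):
    """Indices of metabolites whose fold change falls outside [low, high]."""
    return [j for j, x in enumerate(fc) if x > high or x < low]

# Hypothetical abundances: rows = replicate cultures, columns = metabolites.
control = [[10.0, 5.0, 8.0],
           [12.0, 4.0, 9.0]]
treated = [[25.0, 2.0, 8.5],
           [23.0, 1.5, 9.0]]
fc = fold_changes(control, treated)
print(select_altered(fc))  # -> [0, 1]: metabolite 0 up >2x, metabolite 1 down <0.5x
```

In a real analysis this filter would be applied after PCA-based selection of exposure-responsive metabolites, as the abstract describes.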

7.
Validated alternative test methods are urgently needed for the safety testing of drugs, chemicals and cosmetics. Whereas some animal tests for topical toxicity have been successfully replaced by alternative methods, systemic toxicity testing requires new test strategies in order to achieve an adequate safety level for the consumer. Substantial numbers of animals are required for the current in vivo assays for drugs, chemicals and cosmetics, and a broad range of pioneering alternative methods has already been developed. These prerequisites motivate the development of a tiered testing strategy based on alternative tests for reproductive toxicity hazard. In the Integrated Project ReProTect, a consortium set up by the European Centre for the Validation of Alternative Methods (ECVAM) is leading the development of a testing strategy in the area of reproductive toxicity. The reproductive cycle can be broken down into well-defined sub-elements, namely male and female fertility, implantation, and pre- and postnatal development. In this project, in vitro, in silico and sensor technologies will be developed, leading to testing strategies that will be implemented and disseminated.

8.
This review critically examines the data on claimed endocrine-mediated adverse effects of chemicals on wildlife populations. It focuses on the effects of current-use chemicals and compares their apparent scale and severity with those of legacy chemicals that have been withdrawn from sale or use, although these may still be present in the environment. The review concludes that the effects on wildlife of many legacy chemicals with endocrine activity are generally greater than those caused by current-use chemicals, with the exception of ethinylestradiol and other estrogens found in sewage effluents, which are causing widespread effects on fish populations. Current chemical testing regimes and risk assessment procedures, at least those to which pesticides and biocides are subjected, are considered partly responsible for this improvement. This is noteworthy, as most ecotoxicological testing for regulatory purposes currently focuses on characterizing apical adverse-effect endpoints rather than on identifying the mechanism(s) responsible for any observed effects. Furthermore, a suite of internationally standardized ecotoxicity tests sensitive to potential endocrine-mediated effects is now in place, or under development, which should ensure further characterization of substances with these properties so that they can be adequately regulated.

9.
Humans are unavoidably exposed to a variety of environmental toxicants, and to combinations thereof, resulting in an increased risk of a number of diseases. The emerging field of toxicoproteomics has been boosted by quantitative and qualitative proteomic technologies and their increasing application in toxicology research. The discipline focuses on proteomic studies of the toxicity caused by toxic chemicals and environmental exposures, covering both episodes of acute exposure to toxicants and the long-term development of disease. Toxicoproteomics harnesses the discovery potential of proteomics in toxicology research by applying global protein measurement technologies to biofluids and tissues after host exposure to injurious agents. The field is also challenging, largely due to the sheer size of the proteome and the massive datasets generated. Hence, improved toxicoproteomic studies applying advanced methodologies must be carried out to pave the way for a new phase in toxicology research. In this regard, we review recent studies applying proteomic analysis to toxicological research, along with the proteomic technologies and their capabilities, illustrated with exemplary studies from biology and medicine. Expanding the repertoire of identified predictive biomarkers of toxicant exposure through toxicoproteomic studies will provide critical tools for the evaluation of toxicant safety and the design of appropriate measures to minimize adverse effects.

10.
The upcoming European chemicals legislation REACH (Registration, Evaluation, and Authorisation of Chemicals) will require the risk assessment of many thousands of chemicals. It is therefore necessary to develop intelligent testing strategies to ensure that chemicals of concern are identified whilst minimising the testing of chemicals in animals. Xenobiotics may perturb the reproductive cycle, and for this reason several reproductive studies are recommended under REACH. One of the endpoints assessed in this battery of tests is mating performance and fertility. Animal tests that address this endpoint use a relatively large number of animals and are also costly in terms of resource, time, and money. If it can be shown that data from non-reproductive studies, such as in vitro or repeat-dose toxicity tests, can generate reliable alerts for effects on fertility, then some animal testing may be avoided. Available rat sub-chronic and fertility data for 44 chemicals that have been classified by the European Union as toxic to fertility were therefore analysed for concordance of effects. Because it was considered appropriate to read across data for some chemicals, these data sets were considered relevant for 73 of the 102 chemicals currently classified as toxic to reproduction (fertility) under this system. For all but five of these chemicals it was considered that a well-performed sub-chronic toxicity study would have detected pathology in the male and, in some cases, the female reproductive tract. Three showed evidence of direct interaction with oestrogen or androgen receptors (linuron, nonylphenol, and fenarimol). The remaining chemicals (quinomethionate and azafenidin) act by modes of action that do not require direct interaction with steroid receptors. However, both these materials caused in utero deaths in pre-natal developmental toxicity studies, and the relatively low NOAELs and the nature of the hazard identified in the sub-chronic tests provide an alert for possible effects on fertility (or early embryonic development), the biological significance of which can be ascertained in a littering (e.g. 2-generation) study. From the chemicals reviewed it would appear that, where there are no alerts from a repeat-dose toxicity study, a pre-natal developmental toxicity study, or sex steroid receptor binding assays, the priority for animal studies to address the fertility endpoint is low. The ability of these types of tests to provide alerts for effects on fertility clearly depends on the mode of action of the toxicant in question. Further work should therefore be performed to determine the 'failure rate' of this type of approach when applied to a larger group of chemicals with diverse modes of action.

11.
This paper reviews regulatory requirements and recent case studies to illustrate how the risk assessment (RA) of chemical mixtures is conducted, considering both effects on human health and effects on the environment. A broad range of chemicals, regulations and RA methodologies is covered, in order to identify mixtures of concern, gaps in the regulatory framework, data needs, and further work to be carried out. The current and potential future use of novel tools (Adverse Outcome Pathways, in silico tools, toxicokinetic modelling, etc.) in the RA of combined effects is also reviewed. The assumptions made in the RA, the predictive model specifications and the choice of toxic reference values can greatly influence the assessment outcome, and should therefore be specifically justified. Novel tools could support mixture RA mainly by providing a better understanding of the underlying mechanisms of combined effects. Nevertheless, their use is currently limited by a lack of guidance, data, and expertise, and more guidance is needed to facilitate their application. As far as the authors are aware, no prospective RA covering chemicals from multiple regulatory sectors has been performed to date, even though numerous chemicals are registered under several regulatory frameworks.

12.
Concerns, legislation and research needs have precipitated developments such as the mode of action concept, the Tox21 strategy, the concept of pathways of toxicity and the adverse outcome pathway framework. New technologies and paradigms are currently transforming these concepts into applicable animal-free toxicity testing systems. The adverse outcome pathway framework provides a structure for collecting, organizing and evaluating the available data that describe the compound and the events resulting in an adverse outcome at a given level of biological organization. This chapter provides a non-exhaustive review of (i) our current understanding of the molecular mechanisms driving the key events in the mode of action of sensitization induction by chemicals, (ii) the tools that have been developed on the basis of the available knowledge and (iii) the major gaps that remain to be filled.

13.
This study was conducted to evaluate the utility of a selection of commercially and freely available non-testing tools and to analyse how REACH registrants can apply them as a prioritisation tool for low-volume chemicals. The analysis was performed on a set of organic industrial chemicals and pesticides with extensive peer-reviewed risk assessment data. The in silico model systems analysed included Derek Nexus, Toxtree, QSAR Toolbox, LAZAR, TEST and VEGA, and their results were compared with expert-judged risk classification according to the classification, labelling and packaging (CLP) regulation. The most reliable results were obtained for carcinogenicity; less reliable predictions were derived for mutagenicity and reproductive toxicity. A group of compounds frequently predicted as false negatives was identified. These were relatively small molecules with low structural complexity, for example benzene derivatives with hydroxyl, amino or aniline substituents. A rat liver S9 metabolite simulator was applied to illustrate the importance of considering metabolism in the risk assessment procedure. We also discuss the outcome of combining predictions from multiple model systems and advise on how to apply in silico tools. These models are proposed as a means of prioritising low-volume chemicals for testing within the REACH legislation, and we conclude that further guidance is needed so that industry can select and apply models in a reliable, systematic and transparent way.
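One simple way to combine predictions from multiple model systems, as discussed in this abstract, is a conservative majority vote over per-model hazard calls. This is only an illustrative sketch: the voting rule, the tie-breaking choice, and the 'pos'/'neg' labels are our assumptions, not the combination scheme used in the study:

```python
from collections import Counter

def consensus_call(predictions):
    """Majority vote over per-model hazard calls ('pos'/'neg').

    Ties are resolved conservatively as 'pos', on the reasoning that a
    missed hazard (false negative) is worse than extra testing."""
    counts = Counter(predictions)
    return 'pos' if counts['pos'] >= counts['neg'] else 'neg'

# Hypothetical calls from three in silico tools for one chemical.
print(consensus_call(['pos', 'neg', 'pos']))  # -> pos
```

A conservative tie-break of this kind reduces the false-negative problem the study highlights for small, structurally simple molecules, at the cost of more false positives.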

14.
The developing immune system is among the most sensitive targets of environmental insult and carries a risk of chronic disease, including cancer. Health risks associated with developmental immunotoxicity (DIT) include not only pediatric diseases such as childhood asthma and type 1 diabetes, but also multi-disease "patterns" of conditions linked to the initial immune dysfunction. DIT contributes to ever-increasing health care costs, increased reliance on drugs, and reduced quality of life. Drug discovery efforts using cutting-edge immunology produce effective tools for the management of allergic, autoimmune and inflammatory diseases; in stark contrast, required immunotoxicity testing clings to an outdated understanding of the immune system and its relationship to disease. As currently required, the immune safety evaluation of drugs and chemicals cannot protect against the most prevalent pediatric immune-dysfunction-based diseases. For this reason, mandatory and relevant DIT testing is needed for all drugs and chemicals where pregnant women and children are at risk.

15.
Categorizing chemicals is an approach with the potential to reduce animal testing for the hazard assessment of chemicals. In this study we investigated the category approach for testing the hemolytic effects of ethylene glycol alkyl ethers (EGAEs) in repeated-dose toxicity (RDT). Using mechanistic information on the hemolytic effects of ethylene glycol butyl ether, a toxicologically meaningful category was built on the basis of similarity of metabolism, mode of action and the hemolytic effects of several EGAEs and related chemicals. The developed category was then evaluated with analogs from a different data source. Given all the structural information on the category chemicals, the category can finally be defined as EGAEs (alkyl chain carbon number: 1-4) and their acetates. Current RDT test data suggest that EGAEs with 3 or 4 alkyl carbons primarily cause hemolytic effects, while EGAEs with 1 or 2 alkyl carbons show toxicity to the testis before demonstrating any hemolytic effects. Hence, the category approach appears applicable to the hemolytic effects of EGAEs with 3 or 4 alkyl carbons, and of their acetates, for estimating the no observed adverse effect level (NOAEL) in RDT. The approach consists of three steps: structure-based primary screening of untested chemicals, categorization of compounds that form hemolytic alkoxyacetic acids by predicting how they are metabolized, and finally estimation of hemolytic levels by read-across. Our results clearly demonstrate the usefulness of the category approach for predicting the hemolytic effects of untested EGAEs and their acetates in RDT.
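The final read-across step above, estimating a NOAEL for an untested chemical from tested category members, can be illustrated with a toy example. The chain-length-based similarity rule and all NOAEL values below are hypothetical, chosen only to show the mechanics, and are not data from the study:

```python
def read_across_noael(analogs, target_carbons):
    """Estimate a NOAEL for an untested category member by averaging
    the NOAELs of tested analogs with the closest alkyl chain length
    (a deliberately crude similarity rule for illustration)."""
    best = min(abs(c - target_carbons) for c, _ in analogs)
    nearest = [noael for c, noael in analogs
               if abs(c - target_carbons) == best]
    return sum(nearest) / len(nearest)

# Hypothetical (alkyl chain length, NOAEL in mg/kg bw/day) pairs
# for tested category members.
analogs = [(3, 100.0), (4, 80.0), (4, 60.0)]
print(read_across_noael(analogs, 4))  # -> 70.0
```

In practice the category boundary (here, 3-4 alkyl carbons and their acetates) and the metabolism-based membership check would gate which analogs enter this calculation.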

16.
The effects of several drugs and neurotoxins on schedule-controlled responding are reviewed for a number of species. In general, the behavioral effects of these chemicals in different species differ quantitatively more frequently than qualitatively. The sensitivity of schedule-controlled behavior to chemical effects across species shows no obvious relationship to position on the phylogenetic tree. Pigeons may be more sensitive, less sensitive, or equally sensitive compared with other species, depending on the chemical. Because pigeons are inexpensive, have a long life span and are easy to train and handle, they deserve serious consideration as a species of choice for the behavioral testing of potential neurobehavioral toxins.

17.
The design of toxicological testing strategies aimed at identifying the toxic effects of chemicals with no (or minimal) recourse to animal experimentation is an important issue for toxicological regulation and for industrial decision-making. This article describes an original approach that enables the design of substance-tailored testing strategies with a specified performance in terms of false-positive and false-negative rates. The outcome of toxicological testing is simulated differently from previously published articles on the topic: toxicological outcomes are simulated not only as a function of the performance of the toxicological tests but also as a function of the physico-chemical properties of the chemicals.

18.
Exposure to occupational and environmental contaminants is a major contributor to human health problems. Inhalation of gases, vapors, aerosols, and mixtures of these can cause a wide range of adverse health effects, ranging from simple irritation to systemic disease. Despite significant achievements in the risk assessment of chemicals, the toxicological database, particularly for industrial chemicals, remains limited. Considering that there are approximately 80,000 chemicals in commerce, and an extremely large number of chemical mixtures, in vivo testing on this scale is unachievable from both economic and practical perspectives. While in vitro methods are capable of rapidly providing toxicity information, regulatory agencies in general remain cautious about replacing whole-animal methods with new in vitro techniques. Although studying the toxic effects of inhaled chemicals is a complex subject, recent studies demonstrate that in vitro methods may have significant potential for assessing the toxicity of airborne contaminants. In this review, current toxicity test methods for the risk evaluation of industrial chemicals and airborne contaminants are presented. To evaluate the potential applications of in vitro methods for studying respiratory toxicity, more recent models developed for the toxicity testing of airborne contaminants are discussed.

19.
OECD test strategies and methods for endocrine disruptors
Gelbke HP, Kayser M, Poole A. Toxicology 2004;205(1-2):17-25
The question of whether (man-made or natural) chemical substances may have adverse effects on the endocrine system has gained high visibility in the public as well as in the scientific community. This relates to possible effects on the environment as well as on human health for chemicals with (anti)estrogenic, (anti)androgenic or (anti)thyroid activity. Given the broad universe of chemicals to which humans or the environment may be exposed, a sound testing strategy and robust test methods are urgently needed. Both subjects have been addressed by a specific OECD working group (the EDTA, Endocrine Disruptor Testing and Assessment Task Force) involving regulatory agencies, the scientific community, the chemical industry and NGOs. Like other organizations, the OECD has adopted a tiered testing strategy in which the first tier uses screening assays as quick and inexpensive tools for generating alerts to potential endocrine activity; these alerts can be used to prioritize substances for definitive tests, which then determine the toxicological consequences of endocrine toxicity. The efforts of the OECD have therefore concentrated on the validation of specific screening and testing guidelines, such as the uterotrophic, the Hershberger, and the "enhanced TG 407" tests. The experimental testing necessary for this validation procedure is complete for the uterotrophic and "enhanced TG 407" tests and near completion for the Hershberger assay. The data obtained so far have been published (for the uterotrophic assay) or will be submitted to the EDTA working group for final evaluation. Overall, the validation program has been very successful and should suffice for establishing OECD test guidelines for these experimental procedures. This will add substantially to the "tool-box" of OECD test methods available internationally to regulatory agencies and the chemical industry for the identification and assessment of possible endocrine disruptors. Despite this success, it is well recognized that the methodological "tool-box" should be supplemented by further screening and testing procedures addressing effects on human health and the environment.

20.
Recent concerns about the potential of certain chemicals to modulate estrogen-regulated processes have led to questions as to how chemicals should be tested for such effects. The AIHC has therefore developed a comprehensive, resource-efficient, and flexible tiered strategy for estrogen modulation (EM) testing. In Tier 0, exposure, along with alerts based on structure-activity, persistence, bioaccumulation, and other data, is assessed to prioritize chemicals for preliminary testing. In Tier I, short-term in vitro, ex vivo, and/or in vivo assays are used to obtain a preliminary indication of EM potential; among these, an in vivo response assay is currently considered the most reliable. None of these tests is intended for risk assessment; rather, they aid in choosing chemicals for further testing and in guiding the extent of that testing. Tier II is aimed at risk assessment and involves whole-animal tests that include EM-sensitive endpoints (e.g. a two-generation reproduction study). Tier III consists of hypothesis-driven research reserved for situations where targeted research can reduce levels of uncertainty. This tiered approach provides a framework for the strategic and effective application of EM test methods to address specific information needs on a case-by-case basis.
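The tiered routing described above can be sketched as a small decision function. The function, its boolean inputs, and its return strings are our illustration of the flow between tiers, not part of the AIHC scheme itself:

```python
def next_step(exposure_alert, tier1_positive, high_uncertainty):
    """Route a chemical through a hypothetical tiered EM scheme:
    Tier 0 alerts gate entry into Tier I screens; Tier I positives
    proceed to Tier II whole-animal risk assessment; high residual
    uncertainty is reserved for Tier III targeted research."""
    if not exposure_alert:
        return "Tier 0: low priority, no testing triggered"
    if not tier1_positive:
        return "Tier I negative: deprioritise"
    if high_uncertainty:
        return "Tier III: hypothesis-driven research"
    return "Tier II: whole-animal risk assessment"

print(next_step(True, True, False))  # -> Tier II: whole-animal risk assessment
```

The ordering encodes the resource-efficiency argument of the abstract: cheap Tier 0/I evaluations filter chemicals before costly whole-animal Tier II work.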
