Similar Literature
20 similar documents retrieved (search time: 718 ms)
1.
2.
Mechanistic toxicology has evolved by relying, to a large extent, on methodologies that substitute for or complement traditional animal tests. The biotechnology and informatics revolutions of recent decades have made such technologies broadly available and useful, but regulatory toxicology has been slow to embrace these new approaches. Major validation efforts, however, have delivered the evidence that new approaches do not lower safety standards and can be integrated into regulatory safety assessments. Particularly in the EU, political pressures, such as the REACH legislation and the 7th Amendment to the cosmetics legislation, have prompted the need for new approaches. In the US, the NRC vision report calling for a toxicology for the 21st century (and its most recent adaptation by the EPA for its toxicity testing strategy) has initiated a debate about how to create a novel approach based on human cell cultures, lower species, high-throughput testing, and modeling. Lessons learned from the development, validation, and acceptance of alternative methods support the creation of a new approach based on identified toxicity pathways. Conceptual steering and an objective assessment of current practices by evidence-based toxicology (EBT) are required. EBT is modeled on evidence-based medicine, which has demonstrated that rigorous systematic reviews of current practices and meta-analyses of studies are powerful tools for providing health care professionals and patients with the best current scientific evidence. Similarly, a portal for high-quality reviews of toxicological approaches, together with tools for quantitative meta-analysis of data, promises to serve as a door opener for a new regulatory toxicology.

3.
Progress in applying genomics in drug development
Lord PG. Toxicology Letters 2004, 149(1-3): 371-375
Genomics has had an impact on two areas of drug development: "predictive" toxicology and mechanism-based risk assessment. Predictive toxicology studies are aimed at identifying the potential for a compound to be toxic. By developing databases of expression profiles for a wide variety of toxic compounds and toxic models, it has been possible to create statistical and computational methods that indicate the toxic potential of a drug from the pattern of gene expression changes it elicits in in vitro or in vivo systems. Because gene expression is central to many responses to xenobiotics, genomic approaches lend themselves readily to mechanistic toxicology studies. By examining changes in gene expression in cells and tissues in response to drugs, it is possible to generate hypotheses as to the underlying mechanism and, in some cases, to evaluate hypotheses of toxic mechanism. Some concerns remain about the use of the technology, but toxicogenomics can no longer be regarded as a "new" technology in drug development. The investments made in applying the technology are maturing, and there is a determined effort to bring its full power into drug development.
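The database-driven classification idea described above can be sketched as a simple correlation-based nearest-centroid classifier: a query expression profile is assigned to the reference toxicant class it correlates with most strongly. All gene signatures, class names, and values below are invented for illustration; real toxicogenomics databases involve thousands of genes and more sophisticated statistics.

```python
import math

# Hypothetical reference "signatures": mean log2 fold changes for a handful of
# marker genes per toxicant class in an illustrative expression database.
REFERENCE_PROFILES = {
    "peroxisome_proliferator": [2.1, 1.8, -0.3, 0.1, 1.5],
    "genotoxin": [-0.2, 0.4, 2.3, 1.9, -1.1],
    "inert": [0.0, 0.1, -0.1, 0.0, 0.1],
}

def pearson(a, b):
    """Pearson correlation between two expression vectors."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    sd_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (sd_a * sd_b)

def classify(profile):
    """Assign a query profile to the most-correlated reference class."""
    scores = {name: pearson(profile, ref) for name, ref in REFERENCE_PROFILES.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# Query: expression changes elicited by an unknown compound.
label, score = classify([1.9, 1.6, -0.1, 0.2, 1.2])
```

In practice the same nearest-profile logic runs over full microarray or sequencing panels, and the correlation score doubles as a confidence measure for the predicted toxicity class.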

4.
The applied use of in silico technologies (a.k.a. computational toxicology, in silico toxicology, computer-assisted tox, e-tox, i-drug discovery, predictive ADME, etc.) for predicting preclinical toxicological endpoints, clinical adverse effects, and metabolism of pharmaceutical substances has become of high interest to the scientific community and the public. The increased accessibility of these technologies for scientists and recent regulations permitting their use for chemical risk assessment support this notion. The scientific community is interested in the appropriate use of such technologies as a tool to enhance product development and the safety of pharmaceuticals and other xenobiotics, while ensuring the reliability and accuracy of in silico approaches for the toxicological and pharmacological sciences. For pharmaceutical substances, this means that active ingredients and impurities in the drug product may be screened using specialized software and databases designed to cover these substances through a chemical structure-based screening process and an algorithm specific to a given software program. A major goal of these software programs is to enable industry scientists not only to enhance the discovery process but also to ensure the judicious use of in silico tools to support risk assessments of drug-induced toxicities and safety evaluations. However, a great amount of applied research is still needed, and the many limitations of these approaches are described in this review. Currently, there is a wide range of endpoints available from predictive quantitative structure-activity relationship (QSAR) models driven by many different computational software programs and data sources, and this range is only expected to grow. For example, there are models based on non-proprietary and/or proprietary information for assessing potential rodent carcinogenicity, in silico screens for ICH genetic toxicity assays, reproductive and developmental toxicity, theoretical prediction of human drug metabolism, mechanisms of action for pharmaceuticals, and newer models for predicting human adverse effects. How accurate these approaches are is both a statistical issue and a challenge for toxicology. In this review, fundamental concepts and the current capabilities and limitations of this technology are critically addressed.
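As a minimal illustration of the structure-based screening these programs perform, the toy sketch below flags a molecule when its SMILES string contains a fragment from a hypothetical alert list. Real in silico systems use proper substructure matching (e.g. SMARTS queries) and trained QSAR models; plain substring search is only a stand-in for the idea, and all alert names and fragments here are illustrative.

```python
# Hypothetical "structural alert" fragments, keyed by an illustrative name.
# Real screens match substructures chemically, not as raw text.
STRUCTURAL_ALERTS = {
    "aromatic_nitro": "[N+](=O)[O-]",
    "nitroso": "N=O",
    "epoxide": "C1CO1",
}

def screen(smiles):
    """Return the names of alerts whose fragment occurs in the SMILES string."""
    return sorted(name for name, frag in STRUCTURAL_ALERTS.items() if frag in smiles)

hits = screen("c1ccccc1[N+](=O)[O-]")   # nitrobenzene-like query structure
clean = screen("CCO")                    # ethanol: no alerts expected
```

A production screen would layer statistical models on top of such rule hits to estimate the probability of a positive assay result per endpoint.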

5.
6.
Expectations are high that the use of proteomics, gene arrays, and metabonomics will improve risk assessment and enable prediction of toxicity early in drug development. These molecular profiling techniques may be used to classify compounds and to identify predictive markers that can be used to screen large numbers of chemicals. One of the challenges for the scientific community is to discriminate between changes in gene/protein expression and metabolic profiles that reflect physiological/adaptive responses and changes related to pathology and toxicology. In these proceedings we provide a brief overview of the technologies, with a focus on proteomics, and of their possible applications to mechanistic and predictive toxicology. The discussion also covers the strengths and limitations of molecular profiling technologies.

7.
Monitoring the exposure of a drug and its metabolites in humans and preclinical species during drug development is required to ensure that the safety of drug-related components in humans is adequately assessed in standard toxicology studies. The recently published FDA guidance on metabolites in safety testing (MIST) has generated broad discussion from various perspectives. Most of the opinions and experiences shared among the scientific community are scientifically sound and practical. There are various approaches to assessing the metabolite exposure margin between toxicology species and humans: direct or indirect comparison, and qualitative or quantitative comparison. The choice of when and how to pursue metabolite assessment is based on the overall development strategy for the compound. It is therefore important to understand the utility and limitations of analytical instruments in order to apply the appropriate analytical tool to the specific questions posed at different stages of drug development. The urgency of metabolite monitoring depends on the intrinsic nature of the compound, the therapeutic intent, and the objective of the clinical development. The strategy for assessing metabolite exposure in humans should be a holistic approach that considers the clinical situation and cumulative knowledge of the drug's metabolism in order to appropriately address metabolite safety in humans. A one-size-fits-all approach is rarely the best use of resources.

8.
Emerging technologies in the regulatory field encompass a group of technologies used in addition to, or in place of, the standard toxicology studies conducted to support an Investigational New Drug Application (IND) or New Drug Application (NDA). The standard package includes general toxicology studies of various durations, safety pharmacology studies, genetic toxicology studies, and reproductive toxicology studies. New and emerging technologies applied to the regulation of new drugs include the use of novel biomarkers, transfected cells and transgenic animals, and the "omics" technologies (toxicogenomics, proteomics, and metabonomics). These technologies are at various stages of regulatory development and acceptance. For example, the use of transgenic animals has gained acceptance by regulatory authorities as a replacement for a 2-year carcinogenicity assay. In contrast, the "omics" technologies are not sufficiently advanced to achieve regulatory acceptance as replacements, although these assays have a role early in drug development and may prove useful as supplements to standard studies. Data from these assays have been used, in combination with standard toxicology assays, to address specific mechanistic questions.

9.
Recent regulatory guidance suggests that drug metabolites identified in human plasma should be present at equal or greater levels in at least one of the animal species used in safety assessments (MIST). Synthetic standards for the metabolites often do not exist, which introduces multiple challenges for the quantitative comparison of metabolites between humans and animals. Various bioanalytical approaches are described for evaluating metabolite exposure in animals versus humans. A simple LC/MS/MS peak area ratio comparison is the most facile and applicable approach for a first assessment of whether metabolite exposures in animals exceed those in humans. In most cases, this measurement is sufficient to demonstrate that an animal toxicology study of the parent drug has covered the safety of the human metabolites. Methods by which metabolites can be quantified in the absence of chemically synthesized authentic standards are also described. Only in rare cases, where an actual exposure measurement of a metabolite is needed, will a validated or qualified method requiring a synthetic standard be necessary. The rigor of the bioanalysis is increased accordingly based on the results of the animal:human ratio measurements. This data-driven bioanalysis strategy for addressing MIST issues within standard drug development processes is described.
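The first-tier peak-area-ratio comparison described above amounts to simple arithmetic, sketched below with invented metabolite names and peak areas. The threshold of 1.0 mirrors the idea that metabolite exposure in the toxicology species should meet or exceed that in humans; it is not a specific regulatory cutoff.

```python
# Illustrative LC/MS/MS peak areas for two hypothetical metabolites (M1, M2)
# measured in pooled animal and pooled human plasma samples.
ANIMAL_AREAS = {"M1": 5200.0, "M2": 310.0}
HUMAN_AREAS = {"M1": 2600.0, "M2": 940.0}

def exposure_ratios(animal, human):
    """Animal/human peak-area ratio per metabolite; a ratio >= 1 suggests the
    animal species' exposure covered the human exposure for that metabolite."""
    return {m: animal[m] / human[m] for m in human}

def flag_uncovered(ratios, threshold=1.0):
    """Metabolites whose animal exposure falls short of the human exposure."""
    return sorted(m for m, r in ratios.items() if r < threshold)

ratios = exposure_ratios(ANIMAL_AREAS, HUMAN_AREAS)
needs_followup = flag_uncovered(ratios)   # these would get more rigorous bioanalysis
```

Only the flagged metabolites would escalate to quantitation with synthesized standards, matching the tiered, data-driven strategy the abstract describes.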

10.
Progress in pharmacoproteomics research
蒋宁, 周文霞, 张永祥. 中国新药杂志 (Chinese Journal of New Drugs) 2005, 14(12): 1391-1394
Pharmacoproteomics is the bridge between genomics and drug discovery, and it is now widely applied across clinical and biomedical fields. Its preclinical research includes constructing molecular pharmacology screening models, screening for drug targets, and studying mechanisms of drug action and toxicity; its clinical research includes using disease-specific proteins as markers for disease classification, subtyping, and diagnosis, as well as for evaluating efficacy and predicting disease prognosis and outcome.

11.
The frequency of detection of cocaine and/or its major metabolite, benzoylecgonine, in toxicology screening of a university medical center patient population was evaluated by retrospective review of the results of the 2,200 toxicology screens performed during 1986 on either urine alone or urine in conjunction with blood. Of these screens, 234 (10%) were positive for cocaine and/or its metabolite, a substantial increase from the 1% noted for 1978 at this medical center. Men and women were represented equally, with the most common age range being 21 to 30 years for both. Most adults (64%) were located on either the obstetrics or the trauma service. In 37 instances cocaine was detected in neonates, presumably due to transplacental transmission. Cocaine and/or its metabolite were found either alone or in combination with other drugs with about equal frequency. The most common other drugs were ethanol, morphine, amphetamine, methamphetamine, and phencyclidine. Cocaine detection increased throughout the study period, with 68% of positives occurring from July through December 1986. Analysis for cocaine and/or benzoylecgonine should be an integral part of toxicology screening performed on a university medical center patient population.

12.
Discovering new schistosome drug targets: the role of transcriptomics

13.
Early assessment of the toxicity potential of new molecules in the pharmaceutical industry is a multi-dimensional task involving predictive systems and screening approaches that aid in the optimization of lead compounds prior to their entry into the development phase. Because of the industry's high attrition rate in the last few years, it has become imperative for the nonclinical toxicologist to focus on novel approaches that can support early screening of drug candidates. Toxicologists need to change their classical approach to a more investigative one. This review discusses the developments that allow toxicologists to anticipate safety problems and plan ways to address them earlier than ever before. These include progress in the fields of in vitro models, surrogate models, molecular toxicology, 'omics' technologies, translational safety biomarkers, stem-cell-based assays, and preclinical imaging. The traditional boundaries between teams focusing on efficacy/safety and preclinical/clinical aspects in the pharmaceutical industry are disappearing, and translational research-centric organizations with a focused vision of bringing drugs forward safely and rapidly are emerging. Today's toxicologist should collaborate with medicinal chemists, pharmacologists, and clinicians, and these value-adding contributions will change traditional toxicologists from side-effect identifiers into drug development enablers.

14.
Drug-induced toxicity remains one of the major reasons for failures of new pharmaceuticals and for the withdrawal of approved drugs from the market. Efforts are being made to reduce attrition of drug candidates and to minimize their bioactivation potential in the early stages of drug discovery in order to bring safer compounds to the market. Therefore, in addition to potency and selectivity, drug candidates are now selected on the basis of acceptable metabolism/toxicology profiles in preclinical species. To support this, new approaches have been developed, including extensive in vitro methods using human and animal hepatic cellular and subcellular systems, recombinant human drug-metabolizing enzymes, increased automation for higher-throughput screens, sensitive analytical technologies, and in silico computational models to assess the metabolism of new chemical entities. Using these approaches, many compounds likely to cause serious adverse reactions are effectively eliminated before reaching clinical trials; however, some toxicities, such as those caused by idiosyncratic responses, are not detected until a drug is in the late stages of clinical trials or has reached the market. One proposed mechanism for the development of idiosyncratic drug toxicity is the bioactivation of drugs to reactive metabolites by drug-metabolizing enzymes. This review discusses the different approaches to, and benefits of using, existing in vitro techniques for the detection of reactive intermediates in order to minimize bioactivation potential in drug discovery.

15.
Felbamate is an antiepileptic drug that is associated with minimal toxicity in preclinical species such as the rat and dog but has an unacceptable incidence of serious idiosyncratic reactions in humans. Idiosyncratic reactions account for over half of toxicity-related drug failures in the marketplace, and improving the preclinical detection of idiosyncratic toxicities is thus of paramount importance to the pharmaceutical industry. The formation of reactive metabolites is common among most drugs associated with idiosyncratic drug reactions and may cause deleterious effects through covalent binding and/or oxidative stress. In the present study, felbamate was compared with several other antiepileptic drugs (valproic acid, carbamazepine, phenobarbital, and phenytoin), using covalent binding of radiolabeled drugs and hepatic gene expression responses to evaluate oxidative stress/reactive metabolite potential. Despite causing only very mild effects on covalent binding parameters, felbamate produced robust effects on a previously established oxidative stress/reactive metabolite gene expression signature. The other antiepileptic drugs and acetaminophen are known hepatotoxicants at high doses in the rat, and all increased covalent binding to liver proteins in vivo and/or to liver microsomes from human and rat. With the exception of acetaminophen, valproic acid exhibited the highest covalent binding in vivo, whereas carbamazepine exhibited the highest levels in vitro. Pronounced effects on oxidative stress/reactive metabolite-responsive gene expression were observed after carbamazepine, phenobarbital, and phenytoin administration. Valproic acid had only minor effects on the oxidative stress/reactive metabolite indicator genes. The ease with which felbamate was identified as a potential oxidative stressor/reactive metabolite former on the basis of gene expression results in rat liver indicates that this approach may be useful in screening for potential idiosyncratic toxicity. Together, measurements of gene expression along with covalent binding should improve the safety assessment of candidate drugs.

16.

Although an overwhelmingly large portion of the resources in toxicologic research is devoted to single chemical studies, the toxicology of chemical mixtures, not single chemicals, is the real issue regarding health effects of environmental and/or occupational exposure to chemicals. The relative lack of activities in the area of toxicology of chemical mixtures does not suggest ignorance of the importance of the issue by the toxicology community. Instead, it is a reflection of the difficulty, complexity, and controversy surrounding this area of research.

Until recently, much of the literature on the toxicology of chemical mixtures has been either very focused on certain specific interaction studies or slanted toward broad-based, relatively vague theoretical deliberation. The typical interaction study involved binary mixtures at relatively high dose levels with acute toxicities as endpoints. Although the theoretical papers have been valuable contributions, little is available on actual, practical experimental approaches toward a systematic solution of this immensely complex area of research.

We present here a broad discussion of the important issues in the toxicology of chemical mixtures. First, we provide background information on the problem and significance of the toxicology of chemical mixtures in relation to real-life issues. Second, we review and compare the existing experimental approaches relevant to toxicologic interactions of chemical mixtures. Third, we propose three integrated approaches that combine physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) modeling with: (1) Monte Carlo simulation, (2) the median effect principle (MEP), and (3) response surface methodology (RSM). These modeling approaches, coupled with very focused, mechanistically based toxicology studies, could be the basis for solving the problems of toxicology and risk assessment of chemical mixtures.
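The first proposed combination, PBPK/PD modeling with Monte Carlo simulation, can be shown in miniature. The sketch below replaces a full PBPK model with a one-compartment IV-bolus model (C0 = dose / Vd) and samples interindividual variability in the volume of distribution from a lognormal distribution; every parameter value is invented for illustration.

```python
import math
import random
import statistics

def peak_concentration(dose_mg, vd_l):
    """Peak plasma concentration (mg/L) after an IV bolus: C0 = dose / Vd."""
    return dose_mg / vd_l

def monte_carlo_peaks(dose_mg, n=10000, seed=1):
    """Sample peak concentrations across a simulated population in which the
    volume of distribution varies lognormally (median 42 L, sigma 0.25)."""
    rng = random.Random(seed)
    return [peak_concentration(dose_mg, rng.lognormvariate(math.log(42.0), 0.25))
            for _ in range(n)]

peaks = monte_carlo_peaks(100.0)                     # 100 mg dose
median_peak = statistics.median(peaks)
p95_peak = statistics.quantiles(peaks, n=20)[-1]     # ~95th percentile individual
```

A real application would replace the one-compartment model with a multi-compartment PBPK/PD system and sample many correlated physiological parameters, but the population-percentile output is the same kind of quantity used to characterize uncertainty in mixture risk assessment.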

17.
The increasing demands placed on toxicology by large-scale risk assessment programmes for chemicals and by emerging or expanding areas of chemical use suggest that it is timely to review the toxicological toolbox. As in clinical medicine, where evidence-based medicine (EBM) is critically reviewing traditional approaches, toxicology has the opportunity to reshape and enlarge its methodology and approaches on the basis of compounded scientific knowledge. Such a revision would have to be based on structured reviews of current practice, i.e., assessment of test performance characteristics, mechanistic understanding, extended quality assurance, formal validation, and the use of integrated testing strategies. This form of revision could optimize the balance between safety, costs, and animal welfare, explicitly stating and, where possible, quantifying uncertainties. After a self-critical reassessment of current practices and an evaluation of the information thus generated, such an evidence-based toxicology (EBT) promises to make better use of resources and to increase the quality of results, facilitating their interpretation. It will open up hazard and risk assessments to new technologies, flexibly accommodating current and future mechanistic understanding. An EBT will be better prepared to answer the continuously growing safety demands of modern societies.

18.
High-throughput omics strategies delineate the molecular mechanisms of toxicity, predict the toxicity of newer drugs and chemicals, and identify individuals at high risk on the basis of expression patterns of messenger RNAs, genes, and proteins, and the detection of intermediary metabolites. Despite being a developing country, India is one of the fastest-growing nations in the usage and application of omics technologies. Several differentially expressed genes and proteins under various pathological and toxicant-exposed conditions have been identified, and many association studies of genetic polymorphisms with toxicant-induced diseases have been conducted for predictive and mechanistic purposes. To date, omics-driven approaches have identified some novel fingerprints associated with disease risk/protection and with the prediction of toxicity of newer chemicals. Although the contributions of such findings to mechanistic toxicology have been immense, their predictive value in toxicology has been limited. In this review, the current status of omics-based research and its future possibilities at the Indian Institute of Toxicology Research (IITR), Lucknow, India, are discussed.

19.
New treatments are currently required for the common metabolic diseases obesity and type 2 diabetes. The identification of physiological and biochemical factors that underlie the metabolic disturbances observed in obesity and type 2 diabetes is a key step in developing better therapeutic outcomes. The discovery of new genes and pathways involved in the pathogenesis of these diseases is critical to this process; however, identifying genes that contribute to the risk of developing them represents a significant challenge, as obesity and type 2 diabetes are complex diseases with many genetic and environmental causes. A number of diverse approaches have been used to discover and validate potential new targets for obesity and diabetes. To date, DNA-based approaches using candidate gene and genome-wide linkage analysis have had limited success in identifying genomic regions or genes involved in the development of these diseases. Recent advances in the ability to evaluate linkage data from large family pedigrees using variance-components-based linkage analysis show great promise for robustly identifying genomic regions associated with the development of obesity and diabetes. RNA-based technologies such as cDNA microarrays have identified many genes differentially expressed in tissues of healthy and diseased subjects. Using a combined approach, we are endeavouring to focus attention on differentially expressed genes located in chromosomal regions previously linked with obesity and/or diabetes. Using this strategy, we have identified Beacon as a potential new target for obesity and diabetes.

20.
Wu KM, Farrelly JG. Toxicology 2007, 236(1-2): 1-6
Many therapeutic agents are prepared as prodrugs, which are classified into Types I and II and subtypes A and B based on their sites of conversion. Recently, an increasing number of INDs have appeared as Type II prodrugs, which often contain dual tracks of toxicity profile exploration, one on the prodrug and another on the active drug. A comparative toxicology analysis is introduced here to help reviewers evaluate the dual toxicity profiles effectively. The analysis helps determine which toxicity is contributed by the prodrug itself, its intermediates, or the active drug. As prodrug INDs, or any other new molecular entity (NME) INDs, progress into advanced phases of toxicology development, analysis of the time-dependent component of toxicity expression, regarding the emergence of new target organs over time, becomes more significant. A strategy is developed to address pharm/tox issues such as what duration is required for a toxicity to emerge at the exposure level achieved or dose studied, how many animals in the group are affected, whether the toxicity is a cross-species phenomenon, and whether it is reversible. In conclusion, dual-track comparative toxicology can be useful for understanding a Type II prodrug's mechanism of toxicity, and time-dependent toxicology analysis offers a means of detecting new toxicity emergence over time. Both approaches could significantly facilitate secondary and tertiary review processes during IND development of a prodrug or NME.
