Advances in molecular biology and toxicology are paving the way for major improvements in the evaluation of the hazards posed by the large number of chemicals found at low levels in the environment. The National Research Council was asked by the U.S. Environmental Protection Agency to review the state of the science and create a far-reaching vision for the future of toxicity testing. The book finds that developing, improving, and validating new laboratory tools based on recent scientific advances could significantly improve our ability to understand the hazards and risks posed by chemicals. This new knowledge would lead to much more informed environmental regulations and dramatically reduce the need for animal testing because the new tests would be based on human cells and cell components. Substantial scientific efforts and resources will be required to leverage these new technologies to realize the vision, but the result will be a more efficient, more informative, and less costly system for assessing the hazards posed by industrial chemicals and pesticides.
From the use of personal products to our consumption of food, water, and air, people are exposed to a wide array of agents each day, many with the potential to affect health. Exposure Science in the 21st Century: A Vision and A Strategy investigates the contact of humans or other organisms with those agents (that is, chemical, physical, and biologic stressors) and their fate in living systems. The concept of exposure science has been instrumental in helping us understand how stressors affect human and ecosystem health, and in efforts to prevent or reduce contact with harmful stressors. In this way exposure science has played an integral role in many areas of environmental health, and can help meet growing needs in environmental regulation, urban and ecosystem planning, and disaster management. Exposure Science in the 21st Century: A Vision and A Strategy explains that there are increasing demands for exposure science information, for example to meet the need for data on the thousands of chemicals introduced into the market each year and to better understand the health effects of prolonged low-level exposure to stressors. Recent advances in tools and technologies, including sensor systems, analytic methods, molecular technologies, computational tools, and bioinformatics, have provided the potential for more accurate and comprehensive exposure science data than ever before. This report also provides a roadmap for taking advantage of these technological innovations and strategic collaborations to move exposure science into the future.
Risk assessment has become a dominant public policy tool for making choices, based on limited resources, to protect public health and the environment. It has been instrumental to the mission of the U.S. Environmental Protection Agency (EPA) as well as other federal agencies in evaluating public health concerns, informing regulatory and technological decisions, prioritizing research needs and funding, and developing approaches for cost-benefit analysis. However, risk assessment is at a crossroads. Despite advances in the field, it faces a number of significant challenges, including lengthy delays in making complex decisions; a lack of data, leading to significant uncertainty in risk assessments; and a marketplace full of chemicals that have never been evaluated, alongside emerging agents that require assessment. Science and Decisions makes practical scientific and technical recommendations to address these challenges. This book is a complement to the widely used 1983 National Academies book Risk Assessment in the Federal Government (also known as the Red Book). The earlier book established a framework for the concepts and conduct of risk assessment that has been adopted by numerous expert committees, regulatory agencies, and public health institutions. The new book embeds these concepts within a broader framework for risk-based decision-making. Together, these are essential references for those working in the regulatory and public health fields.
Toxicity testing in laboratory animals provides much of the information used by the Environmental Protection Agency (EPA) to assess the hazards and risks associated with exposure to environmental agents that might harm public health or the environment. The data are used to establish maximum acceptable concentrations of environmental agents in drinking water, set permissible limits on worker exposure, define labeling requirements, establish tolerances for pesticide residues on food, and set other kinds of limits on the basis of risk assessment. Because the number of regulations that require toxicity testing is growing, EPA called for a comprehensive review of established and emerging toxicity-testing methods and strategies. This interim report reviews current toxicity-testing methods and strategies and near-term improvements in toxicity-testing approaches proposed by EPA and others. It identifies several recurring themes and questions in the various reports reviewed. The final report will present a long-range vision and strategic plan to advance the practices of toxicity testing and human health assessment of environmental contaminants.
The History of Alternative Test Methods in Toxicology uses a chronological approach to demonstrate how the use of alternative methods has evolved from their conception as adjuncts to traditional animal toxicity tests to replacements for them. This volume in the History of Toxicology and Environmental Health series explores the history of alternative test development, validation, and use, with an emphasis on humanity and good science, in line with the Three Rs (Replacement, Reduction, Refinement) concept expounded by William Russell and Rex Burch in 1959 in their now-classic volume, The Principles of Humane Experimental Technique. The book describes the historical development of technologies that have influenced the application of alternatives in toxicology and safety testing. These range from single-cell monocultures to sophisticated, miniaturised and microfluidic organism-on-a-chip devices, and also include molecular modelling, chemoinformatics and QSAR analysis, and the use of stem cells, tissue engineering and hollow fibre bioreactors. This has been facilitated by the wider availability of human tissues, advances in tissue culture, analytical and diagnostic methods, increases in computational processing capabilities, and a greater understanding of cell biology and molecular mechanisms of toxicity. These technological developments have enhanced the range and information content of the toxicity endpoints detected, and therefore the relevance of test systems and data interpretation, while new techniques for non-invasive diagnostic imaging and high-resolution detection methods have permitted an increased role for human studies. Several key examples of how these technologies are being harnessed to meet 21st century safety assessment challenges are provided, including their deployment in integrated testing schemes in conjunction with kinetic modelling, and in specialised areas, such as inhalation toxicity studies.
Modelling and simulation technologies have grown dramatically more sophisticated over the past decade, and their applications in toxicity prediction and risk assessment are of critical importance. The integration of predictive toxicology approaches will become increasingly necessary as new industrial chemicals and pharmaceuticals enter the market. In this comprehensive discussion of predictive toxicology and its applications, leading experts express their views on the technologies currently available and the potential for future developments. The book covers a wide range of topics, including the in silico, in vitro, and in vivo approaches used in the safety assessment of chemical substances, as well as the animal models currently used to study various toxicities, including target-mediated toxicities. Recent regulatory initiatives to improve the safety assessment of chemicals are also discussed. Together, the chapters survey the current status and future directions of predictive toxicology and reflect the growing and urgent need to strengthen our ability to predict the safety and risks posed by industrial and pharmaceutical chemicals, to ensure product safety and protect public health.
Toxicity testing is used to assess the safety or hazards presented by substances such as industrial chemicals, consumer products, and pharmaceuticals. At present, many methods involve laboratory animals. Alternative procedures, some involving human cell-based technologies, are now being developed that reduce, refine, or replace animal use and minimize the pain and distress caused. These new tests must protect public health and the environment at least as well as currently accepted methods. This book describes the ever-expanding "toolbox" of methods available to assess toxicity. Such techniques often result from our growing understanding of the biochemical and cellular pathways that mediate toxicity, and that understanding permits information generated from several sources to be evaluated together as a "weight of evidence". By combining in silico, in vitro, and ex vivo methods with biochemical- and cell-based in vitro assays, toxicologists are developing mechanistically based alternatives to live animal experimentation. The text also explores the complexities associated with adequate validation and the assessment of test reliability and relevance. It provides an essential reference source for postgraduates, academics, and industrialists working in this rapidly changing area.
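As a rough sketch of how a "weight of evidence" evaluation might combine such sources numerically, the Python snippet below averages toxicity scores from hypothetical in silico, in vitro, and ex vivo lines of evidence, weighted by an assumed reliability for each. The function name, scores, and weights are all illustrative assumptions, not a scheme taken from the book.

```python
# A toy "weight of evidence" combination: each line of evidence yields a
# toxicity score in [0, 1] plus a reliability weight, and the combined
# score is their weighted average. All names and numbers are illustrative.

def weight_of_evidence(evidence):
    """Combine (name, score, weight) tuples into one weighted score."""
    total = sum(w for _, _, w in evidence)
    return sum(s * w for _, s, w in evidence) / total

# Hypothetical evidence for one chemical: a QSAR prediction, a cell-based
# assay result, and an ex vivo tissue result, weighted by assumed reliability.
lines = [
    ("in silico QSAR", 0.7, 1.0),
    ("in vitro assay", 0.4, 2.0),
    ("ex vivo tissue", 0.5, 1.5),
]
print(f"combined score: {weight_of_evidence(lines):.2f}")  # -> 0.50
```

Real weight-of-evidence frameworks are qualitative as much as quantitative, so a weighted average like this is at best a caricature of how toxicologists integrate evidence in practice.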
Tens of thousands of chemicals are released into the environment every day. High-throughput screening (HTS) offers a more efficient and cost-effective alternative to traditional toxicity tests: it can profile these chemicals for potential adverse effects, with the aim of prioritizing a manageable number for more in-depth testing and providing clues to their mechanisms of toxicity. The Tox21 program, a collaboration between the National Institute of Environmental Health Sciences (NIEHS)/National Toxicology Program (NTP), the U.S. Environmental Protection Agency’s (EPA) National Center for Computational Toxicology (NCCT), the National Institutes of Health (NIH) National Center for Advancing Translational Sciences (NCATS), and the U.S. Food and Drug Administration (FDA), has generated quantitative high-throughput screening (qHTS) data on a library of 10K compounds, including environmental chemicals and drugs, against a panel of nuclear receptor and stress response pathway assays during its production phase (phase II). The Tox21 Challenge, a worldwide modeling competition, asked a “crowd” of researchers to use these data to elucidate the extent to which compound interference with biochemical and cellular pathways can be inferred from chemical structure data. Participants were asked to model twelve assays related to nuclear receptor and stress response pathways, using the data generated against the Tox21 10K compound library as the training set. The computational models built within this Challenge are expected to improve the community’s ability to prioritize novel chemicals with respect to potential concern to human health. This research topic presents the computational models from the Challenge that achieved good predictive performance.
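To make the Challenge task concrete, the sketch below shows one common structure-to-activity baseline, assuming RDKit and scikit-learn are installed: featurize compounds as Morgan fingerprints and train a random forest to predict activity in a single assay. The SMILES strings and labels are placeholder data standing in for the Tox21 10K library, not the actual Challenge dataset.

```python
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles_list):
    """Convert SMILES strings to 2048-bit Morgan fingerprint arrays."""
    fps, kept = [], []
    for i, smi in enumerate(smiles_list):
        mol = Chem.MolFromSmiles(smi)
        if mol is None:  # skip unparseable structures
            continue
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
        arr = np.zeros((2048,), dtype=np.int8)
        DataStructs.ConvertToNumpyArray(fp, arr)
        fps.append(arr)
        kept.append(i)
    return np.array(fps), kept

# Placeholder training data: SMILES strings and 0/1 activity calls
# for one hypothetical nuclear receptor assay endpoint.
smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
labels = [0, 1, 0, 1]

X, kept = featurize(smiles)
y = np.array(labels)[kept]

# Random forests on fingerprints are a standard QSAR baseline;
# class_weight="balanced" compensates for the rare-active imbalance
# typical of HTS data.
model = RandomForestClassifier(n_estimators=500, class_weight="balanced",
                               random_state=0)
model.fit(X, y)
print(model.predict_proba(X)[:, 1])  # predicted probability of activity
```

A competitive Challenge entry would use a far richer feature set, careful cross-validation, and evaluation on held-out compounds, but the same structure-to-activity pipeline underlies most submissions.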
This book collects protocols from different areas of knowledge to assist in the identification of toxic effects exerted by different xenobiotics. Classical techniques are presented alongside modern techniques that use alternatives to animal models. Given the ever-increasing exposure to different compounds and their effects on population health, the assessment of multiple endpoints is of utmost importance for better risk assessment, and this collection addresses that need. Written for the highly successful Methods in Molecular Biology series, chapters include introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and tips on troubleshooting and avoiding known pitfalls. Authoritative and practical, Toxicity Assessment: Methods and Protocols aims to serve researchers in this vast field of science as they seek to better understand the mechanisms of action of different xenobiotics.
Animal Experimentation: Working Towards a Paradigm Change critically appraises current animal use in science and discusses ways in which we can contribute to a paradigm change towards human biology-based approaches.