Office of Food Additive Safety, Center for Food Safety and Applied Nutrition, U.S. Food and Drug Administration, College Park, Maryland.
Department of Environmental and Occupational Health Sciences, School of Public Health, University of Washington, Seattle, Washington.
Toxicol Sci. 2018 Jan 1;161(1):5-22. doi: 10.1093/toxsci/kfx186.
Toxicology has made steady advances over the last 60+ years in understanding the mechanisms of toxicity at increasingly finer levels of cellular organization. Traditionally, toxicological studies have used animal models. However, the general adoption of the 3Rs principles (Replacement, Reduction, Refinement) provided the impetus for the development of in vitro models in toxicity testing. The present commentary is an attempt to briefly discuss the transformation in toxicology that began around 1980. Many genes important in cellular protection and the metabolism of toxicants were cloned and characterized in the 1980s, and gene expression studies also became feasible. The development of transgenic and knockout mice provided valuable animal models for investigating the role of specific genes in producing the toxic effects of chemicals or in protecting the organism from them. Further developments in toxicology came from the incorporation of the tools of "omics" (genomics, proteomics, metabolomics, interactomics), epigenetics, systems biology, computational biology, and in vitro biology. Collectively, the advances in toxicology made during the last 30-40 years are expected to provide more innovative and efficient approaches to risk assessment. A goal of experimental toxicology going forward is to reduce animal use while still being able to conduct appropriate risk assessments and make sound regulatory decisions using alternative methods of toxicity testing. In that respect, Tox21 has provided a big-picture framework for the future. Currently, regulatory decisions involving drugs, biologics, food additives, and similar compounds still rely on data from animal testing and human clinical trials. In contrast, the prioritization of environmental chemicals for further study can be made using in vitro screening and computational tools.