Saturday, March 28, 2009

USEPA Offers "Strategic Plan" for Evaluating the Toxicity of Chemicals


This week the USEPA released a "Strategic Plan for Evaluating the Toxicity of Chemicals," which departs from "the traditional risk assessment approach that relies heavily on data generated through the intentional dosing of experimental animals." As most people know, animal welfare concerns have led to a desire to find non-animal testing methods. At the same time, there is pressure to provide data that adequately characterize the hazards and risks of industrial and consumer chemicals.

According to EPA, while the traditional approach
"has provided EPA with sound science to support regulatory decision making over the past several decades, EPA must address ever-increasing demands, including consideration of complex issues such as cumulative exposures, life-stage vulnerabilities, and genetic susceptibilities, not to mention the increasing number of chemicals and cost of toxicity testing. A new approach is proposed to address these demands, an approach based on the application of advances in molecular biology and computational sciences to transform toxicity testing and risk assessment practices."

Drawing on a 2007 report by the National Research Council (NRC) of the National Academies, "Toxicity Testing in the 21st Century: A Vision and a Strategy," an Agency workgroup produced the new Strategic Plan, which "focuses on identifying and evaluating 'toxicity pathways,' i.e., cellular response pathways responsible for adverse health effects when sufficiently perturbed by environmental agents under realistic exposure conditions."

While EPA expects that the new paradigm will "create more efficient and cost-effective means to screen and prioritize for further assessment the tens of thousands of chemicals that are already found in the environment," there is some question as to whether that expectation is realistic. For example, I see four major challenges.

1) Developing and validating the methodologies: Development of new methods generally takes many iterations to determine the conditions that provide the most information with reliability and repeatability. Thus, it will likely be quite a few years before the methods being developed can be used for decision-making. Can we wait that long?

2) Translating responses to exposure at the gene, protein, molecular, and cellular levels into effects at the target organ and whole organism level: Whereas a standard animal study provides easily interpretable and widely accepted measures of toxicity (e.g., death, loss of body weight, reduced reproduction), the new methods yield much more nuanced results whose toxicological significance may be very difficult to establish. These subtle responses may simply be adaptive rather than indicative of diminished capacity.

3) Communicating why this is better than the current QSAR-based screening methods: The new methods will serve only as screening tools for prioritizing chemicals for further review; the final risk management decisions may still rest on the established standard testing methods. EPA will need to explain why these new screening-level methods are better than the current US approach to screening chemicals. That may be especially difficult given that a Canadian prioritization program has already reviewed the 23,000 chemicals on its existing chemicals inventory based largely on existing study data and QSAR analysis. (A toy illustration of this kind of QSAR-based prioritization appears after this list.)

4) Funding: The 2007 NRC report "Toxicity Testing in the 21st Century" suggested that transitioning to this new computational, informational, and molecular-based strategy would require $100M in funding every year for a period of 10-20 years, i.e., on the order of $1-2 billion in total. Given the current economic situation and competing issues such as climate change, TSCA reform, green chemistry, endocrine disruption, and others, it is hard to imagine that adequate funding can be made available for this endeavor.
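For readers unfamiliar with how QSAR-based prioritization works in practice, here is a minimal sketch of the general flavor: a few predicted properties and a crude exposure surrogate are combined into a single score used to rank an inventory for further review. The descriptors, weights, cutoffs, and chemical names below are entirely hypothetical, invented for illustration only; they do not represent EPA's or Canada's actual screening models.

# Purely illustrative sketch of QSAR-style screening prioritization.
# Descriptors, weights, and thresholds are hypothetical, not any agency's model.
from dataclasses import dataclass

@dataclass
class Chemical:
    name: str
    log_kow: float            # octanol-water partition coefficient (bioaccumulation proxy)
    predicted_ld50: float     # QSAR-predicted acute toxicity, mg/kg (lower = more toxic)
    production_volume: float  # tonnes/year, as a crude exposure surrogate

def priority_score(chem: Chemical) -> float:
    """Combine hazard and exposure surrogates into a single screening score."""
    hazard = 1.0 / max(chem.predicted_ld50, 1.0)    # more toxic -> higher score
    bioaccumulation = max(chem.log_kow - 3.0, 0.0)  # counts only above a cutoff
    exposure = chem.production_volume ** 0.5        # dampen the effect of volume
    return hazard * 1000 + bioaccumulation + exposure / 100

inventory = [
    Chemical("Chemical A", log_kow=5.2, predicted_ld50=50, production_volume=10_000),
    Chemical("Chemical B", log_kow=1.1, predicted_ld50=2_000, production_volume=500),
]

# Rank the inventory so the highest-scoring chemicals are reviewed first.
for chem in sorted(inventory, key=priority_score, reverse=True):
    print(f"{chem.name}: score = {priority_score(chem):.2f}")

In a real program the hazard side would come from validated QSAR models and existing study data, and the exposure side from production volume and use pattern information, but the ranking logic is conceptually about this simple.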
