Saturday, March 28, 2009

USEPA Offers "Strategic Plan" for Evaluating the Toxicity of Chemicals


This week the USEPA released a "Strategic Plan for Evaluating the Toxicity of Chemicals," which departs from "the traditional risk assessment approach that relies heavily on data generated through the intentional dosing of experimental animals." As most people know, animal welfare issues have led to a desire to find non-animal testing methods. At the same time there is pressure to provide data that adequately characterizes the hazards and risks of industrial and consumer chemicals.

According to EPA, while the traditional approach
"has provided EPA with sound science to support regulatory decision making over the past several decades, EPA must address ever-increasing demands, including consideration of complex issues such as cumulative exposures, life-stage vulnerabilities, and genetic susceptibilities, not to mention the increasing number of chemicals and cost of toxicity testing. A new approach is proposed to address these demands, an approach based on the application of advances in molecular biology and computational sciences to transform toxicity testing and risk assessment practices."

Based on a 2007 report by the National Research Council (NRC) of the National Academies, "Toxicity Testing in the 21st Century: A Vision and a Strategy," an Agency workgroup produced the new Strategic Plan, which "focuses on identifying and evaluating 'toxicity pathways,' i.e., cellular response pathways responsible for adverse health effects when sufficiently perturbed by environmental agents under realistic exposure conditions."
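To make the "toxicity pathway" idea a little more concrete, here is a minimal sketch in Python. Everything in it is my own illustrative assumption rather than anything from the Strategic Plan: the Hill-curve response, its parameters, the tested concentrations, and the 10% "perturbation" threshold are all invented placeholders. It simply shows the kind of question these assays ask: at what concentration does an in vitro pathway response become "sufficiently perturbed"?

# Illustrative only: a toy model of a cellular "toxicity pathway" response.
# Parameters and the 10% threshold are invented for demonstration; they are
# not taken from EPA's Strategic Plan or any real assay.

def pathway_response(conc_uM, top=100.0, ec50_uM=10.0, hill_n=1.5):
    """Percent of maximal pathway activation at a given concentration (Hill curve)."""
    return top * conc_uM**hill_n / (ec50_uM**hill_n + conc_uM**hill_n)

# Hypothetical tested concentrations (micromolar).
concentrations = [0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0]

# Treat a response above 10% of maximum as a "sufficient perturbation."
threshold = 10.0
perturbing = [c for c in concentrations if pathway_response(c) > threshold]

if perturbing:
    print(f"Lowest perturbing concentration: {min(perturbing)} uM")
else:
    print("No tested concentration perturbed the pathway above the threshold.")

Even granting such a toy model, the hard part is deciding what a perturbation at the cellular level means for a whole organism, which is exactly the translation problem I raise in the challenges below.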

While EPA expects that the new paradigm will "create more efficient and cost-effective means to screen and prioritize for further assessment the tens of thousands of chemicals that are already found in the environment," there is some question as to whether that expectation is realistic. For example, I see four major challenges.

1) Developing and validating the methodologies: Development of new methods generally takes many iterations to determine the conditions that provide the most information with reliability and repeatability. Thus, it will likely be quite a few years before the methods being developed can be used for decision-making. Can we wait that long?

2) Translating responses measured at the gene, protein, molecular, and cellular levels to the target-organ and whole-organism level: Whereas a standard animal study provides easily interpretable and accepted measures of toxicity (e.g., death, loss of body weight, reduced reproduction), the new methods produce much more nuanced results whose toxicological significance may be very difficult to establish. These subtle responses may simply be adaptive rather than signs of diminished capacity.

3) Communicating why this is better than the current QSAR-based screening methods: The new methods will serve only as screening tools for prioritizing chemicals for further review (a toy illustration of this kind of screen appears after this list). Ultimately, final risk management decisions may still rest on the established standard testing methods. EPA will need to explain why these new screening-level methods are better than the current US approach to screening chemicals. This may be especially difficult given that a Canadian prioritization program has already reviewed the roughly 23,000 chemicals on its existing chemicals inventory based largely on existing study data and QSAR analysis.

4) Funding: The 2007 NRC report "Toxicity Testing in the 21st Century" suggested that transitioning to this new computational, informational, and molecular-based strategy would require roughly $100 million in funding every year for a period of 10-20 years, i.e., on the order of $1-2 billion in total. Given the current economic situation and competing issues such as climate change, TSCA reform, green chemistry, endocrine disruption, and others, it is hard to imagine that adequate funding can be made available for this endeavor.
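For readers who have not worked with QSAR, here is an equally minimal Python sketch of what a structure-based screening and prioritization step looks like. The chemical names, descriptor values, and weights below are invented placeholders, not any agency's actual model; a real QSAR would be fit to measured toxicity data for structurally related chemicals.

# Illustrative only: a toy QSAR-style screen that ranks chemicals for further
# review using a few molecular descriptors and hypothetical weights.

chemicals = {
    "Chemical A": {"logKow": 5.2, "mol_weight": 390.0, "vapor_pressure": 1e-6},
    "Chemical B": {"logKow": 1.1, "mol_weight": 120.0, "vapor_pressure": 2e-2},
    "Chemical C": {"logKow": 3.8, "mol_weight": 250.0, "vapor_pressure": 5e-4},
}

# Hypothetical weights relating descriptors to a relative priority score.
weights = {"logKow": 0.5, "mol_weight": 0.002, "vapor_pressure": -10.0}

def priority_score(descriptors):
    """Crude linear combination of descriptors into a screening priority score."""
    return sum(weights[name] * value for name, value in descriptors.items())

# Rank chemicals from highest to lowest screening priority.
ranked = sorted(chemicals.items(), key=lambda item: priority_score(item[1]), reverse=True)

for name, descriptors in ranked:
    print(f"{name}: priority score = {priority_score(descriptors):.2f}")

The point of the sketch is the burden of proof it implies: the new molecular and pathway-based methods will have to show that they prioritize chemicals meaningfully better than this kind of simple descriptor-based screen before agencies move away from the approaches already in use.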

Tuesday, March 24, 2009

Scientists as political activists?


I mentioned previously the climate change conference held in Copenhagen, which was designed to stimulate some activism on the part of scientists in preparation for December's much-anticipated international climate change conference. Despite this goal, however, one point made repeatedly was that "formulating an action plan to curb climate change is not the job of scientists."

Herein lies a dilemma (wrapped inside a conundrum, or is it the other way around?).

Scientists traditionally prefer to do the science and leave policy development to others. For one thing, public policy must consider many more things than just the science: there are sociological, economic, political, and pragmatic considerations. But at the same time, politicians are asking scientists, for example climate change scientists, what action they should take to address the problem. Unfortunately, while scientists can predict what may happen, it is even more difficult for them to provide guidance on what to do about it. This is true for a couple of reasons.

First, models can never provide a perfect prediction of how and where the climate will change. One participant in Copenhagen noted: "Tell me what the stock market will do in 100 years and I will tell you what the climate will do." Second, most climate scientists will tell you that their role does not include policy formulation. They can provide scenarios of what will happen if emissions hit certain thresholds, but when politicians ask what is the absolute maximum amount of CO2 we should allow, there is no easy answer. In the end, it depends on how much risk we are willing to take. That, and how good we are at predicting tipping points.

So the organizers of the Copenhagen conference hoped that they could encourage scientists to take a more active role and speak not as scientists but as concerned citizens. Some may feel uncomfortable with "blurring the line between science and activism," but they also know that no one understands the risks better than they do and no one is better placed to give informed opinions.

One positive note is that the current Obama administration has placed scientists in key appointee positions, jobs that too often in the past have gone to political friends whether they knew anything about science or not. These scientists, acting as risk managers, are "people who are willing and able to weigh up the risks, costs and benefits of various degrees of action."

Sunday, March 22, 2009

Global Warming Denialists Turning on Themselves as Their Climate Change Titanic Sinks


I did a post a while back about a "Climate Skeptics Conference" held in New York City. According to Andrew Revkin, writing in the Scotland on Sunday newspaper, it may just have been a last hurrah for the global warming denialist industry.

Case against climate change melting away

For example, from the linked article:

"But large corporations such as Exxon Mobil, which in the past financed the Heartland Institute and other groups that challenged the climate consensus, have reduced support. Many such companies no longer dispute that the greenhouse gases produced by burning fossil fuels pose risks.

From 1998 to 2006, Exxon Mobil, for example, contributed more than $600,000 (£414,000) to Heartland, according to annual reports of charitable contributions from the company and company foundations.

Alan Jeffers, a spokesman for Exxon Mobil, said by e-mail that the company had ended support "to several public policy research groups whose position on climate change could divert attention from the important discussion about how the world will secure the energy required for economic growth in an environmentally responsible manner"."


And:

"But Kert Davies, a climate campaigner for Greenpeace, said that the experts giving talks were "a shrinking collection of extremists" and that they were "left talking to themselves"."


And this long piece from the article above:

"But Lindzen also criticised widely publicised assertions by other sceptics that variations in the sun were driving temperature changes in recent decades. To attribute short-term variation in temperatures to a single cause, whether human-generated gases or something else, was erroneous, he said.

Speaking of the sun's slight variability, he said: "Acting as though this is the alternative [to blaming greenhouse gases] is asking for trouble."

S Fred Singer, a physicist often referred to by critics and supporters alike as the dean of climate contrarians, said: "As a physicist, I am concerned that some sceptics, a very few, are ignoring the physical basis.

"There is one who denies that CO2 is a greenhouse gas, which goes against actual data," Singer said, adding that other sceptics wrongly contend that "humans are not responsible for the measured increase in atmospheric CO2."

There were notable absences from the conference this year. Russell Seitz, a physicist from Cambridge, Massachusetts, delivered a speech at last year's meeting. But Seitz, who has lambasted environmental campaigners for distorting climate science, now warns that the sceptics are in danger of doing the same thing.

The most strident advocates on either side of the global warming debate, he said, are "equally oblivious to the data they seek to discount or dramatise".

John Christy, an atmospheric scientist at the University of Alabama who has long publicly questioned projections of dangerous global warming, most recently at a House committee hearing last month, said he had skipped both Heartland conferences to avoid the potential for "guilt by association"."


As we have seen, many of the global warming denialists have been pushing the very "party line" that their own lauded "shills and charlatans" (plus, of course, the real scientists with real concerns) are now criticizing as misguided, ideologically based opinions.

Keep in mind, this was at the "Skeptics Conference" and these are the words of the "skeptics" themselves. Apparently skeptics are beginning their "every man for himself" race to the lifeboats as their Titanic sinks (perhaps after hitting a melting iceberg).