Science, policy, and politics. Focus on science communication and climate change. The Dake Page offers news, analysis and book reviews.
Saturday, November 28, 2009
Climate Denialists Still Lying About CRU Emails - The CRU Responds
A statement from the University of East Anglia Climatic Research Unit can be read here.
Using quotes from the cited link (reproduced below), note that:
1) The denialists are lying about not having access to the CRU data:
“It is well known within the scientific community and particularly those who are sceptical of climate change that over 95% of the raw station data has been accessible through the Global Historical Climatology Network for several years. We are quite clearly not hiding information which seems to be the speculation on some blogs and by some media commentators,” commented the University’s Pro-Vice-Chancellor, Research Enterprise and Engagement Professor Trevor Davies.
2) The denialists are lying when they say that the emails reveal a conspiracy to adjust data.
There is nothing in the stolen material which indicates that peer-reviewed publications by CRU, and others, on the nature of global warming and related climate change are not of the highest-quality of scientific investigation and interpretation.
3) The denialists are lying when they say that the data don't support the scientific consensus. Obviously the scientific consensus was reached AFTER the data overwhelmingly led scientists to that conclusion. And all of the data from other sources all support this conclusion, not just the CRU data.
Our global temperature series tallies with those of other, completely independent, groups of scientists working for NASA and the National Climate Data Center in the United States, among others. Even if you were to ignore our findings, theirs show the same results. The facts speak for themselves; there is no need for anyone to manipulate them.
4) The denialists are lying when they claim there are "many" emails that "prove illegal behavior." There is no such thing. There are only a handful of emails, cherry-picked from the more than 1,000 stolen, in which the denialists creatively interpret the wording to fit their preconceived non-scientific views.
A selection of these emails have been taken out of context and misinterpreted as evidence that CRU has manipulated climate data to present an unrealistic picture of global warming. This conclusion is entirely unfounded and the evidence from CRU research is entirely consistent with independent evidence assembled by various research groups around the world.
5) The denialists are lying when they say the evidence is not there, when in fact the evidence of human-induced global warming is overwhelming.
The Intergovernmental Panel on Climate Change (IPCC) in its 4th Assessment Report (AR4) published in 2007 concluded that the warming of the climate system was unequivocal. This conclusion was based not only on the observational temperature record, although this is the key piece of evidence, but on multiple strands of evidence. These factors include: long-term retreat of glaciers in most alpine regions of the world; reductions in the area of the Northern Hemisphere (NH) snow cover during the spring season; reductions in the length of the freeze season in many NH rivers and lakes; reduction in Arctic sea-ice extent in all seasons, but especially in the summer; increases in global average sea level since the 19th century; increases in the heat content of the ocean and warming of temperatures in the lower part of the atmosphere since the late 1950s.
Thursday, November 26, 2009
The Copenhagen Diagnosis: Climate Science Report (or What's Been Happening Since the Last IPCC Report?)
‘The Copenhagen Diagnosis’ is a special report prepared by 26 climate researchers, most of whom are authors of published IPCC reports. In it they conclude "that several important aspects of climate change are occurring at the high end or even beyond the expectations of only a few years ago."
The text of the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) was drafted more than three years ago, and since then "many hundreds of papers have been published on a suite of topics related to human-induced climate change." Therefore, the purpose of this new report is to "synthesize the most policy-relevant climate science published" in that time. According to the authors, the rationale is two-fold:
First, the report "serves as an interim evaluation of the evolving science midway through an IPCC cycle - IPCC AR5 is not due for completion until 2013."
Second, and in the authors' view most important, the report "serves as a handbook of science updates that supplements the IPCC AR4 in time for Copenhagen in December 2009, and any national or international climate change policy negotiations that follow."
The report purposefully targets policy-makers, stakeholders, the media and the broader public, and each section "begins with a set of key points that summarises the main findings." The authors note that the "science contained in the report is based on the most credible and significant peer-reviewed literature available at the time of publication."
From the Executive Summary, the most significant recent climate change findings are:
Surging greenhouse gas emissions: Global carbon dioxide emissions from fossil fuels in 2008 were nearly 40 percent higher than those in 1990. Even if global emission rates are stabilized at present-day levels, just 20 more years of emissions would give a 25 percent probability that warming exceeds 2°C, even with zero emissions after 2030. Every year of delayed action increases the chances of exceeding the 2°C warming.
Recent global temperatures demonstrate human-induced warming: Over the past 25 years temperatures have increased at a rate of 0.19°C per decade, in very good agreement with predictions based on greenhouse gas increases. Even over the past 10 years, despite a decrease in solar forcing, the trend continues to be one of warming. Natural, short-term fluctuations are occurring as usual, but there have been no significant changes in the underlying warming trend. (A sketch of how such a trend is computed follows these findings.)
Acceleration of melting of ice-sheets, glaciers and ice-caps: A wide array of satellite and ice measurements now demonstrate beyond doubt that both the Greenland and Antarctic ice sheets are losing mass at an increasing rate. Melting of glaciers and ice caps in other parts of the world has also accelerated since 1990.
Rapid Arctic sea ice decline: Summertime melting of Arctic sea ice has accelerated far beyond the expectations of climate models. The area of sea ice melt during 2007-2009 was about 40 percent greater than the average prediction from IPCC AR4 climate models.
Current sea level rise underestimated: Satellites show recent global average sea level rise (3.4 millimeters per year over the past 15 years) to be around 80 percent above past IPCC predictions. This acceleration in sea level rise is consistent with a doubling in contribution from melting of glaciers, ice caps and the Greenland and West-Antarctic ice sheets. (A quick arithmetic check of the 80 percent figure follows these findings.)
Sea level predictions revised: By 2100, global sea level is likely to rise at least twice as much as projected by Working Group 1 of the IPCC AR4; for unmitigated emissions it may well exceed one meter. The upper limit has been estimated as about two meters of sea level rise by 2100. Sea level will continue to rise for centuries after global temperatures have been stabilized, and several meters of sea level rise must be expected over the next few centuries.
Delay in action risks irreversible damage: Several vulnerable elements in the climate system (e.g. continental ice sheets, Amazon rain forest, West African monsoon and others) could be pushed towards abrupt or irreversible change if warming continues in a business-as-usual way throughout this century. The risk of transgressing critical thresholds (“tipping points”) increases strongly with ongoing climate change. Thus waiting for higher levels of scientific certainty could mean that some tipping points will be crossed before they are recognized.
The turning point must come soon: If global warming is to be limited to a maximum of 2°C above preindustrial values, global emissions need to peak between 2015 and 2020 and then decline rapidly. To stabilize climate, a de-carbonized global society – with near-zero emissions of CO2 and other long-lived greenhouse gases – needs to be reached well within this century. More specifically, the average annual per-capita emissions will have to shrink to well under one metric ton CO2 by 2050. This is 80-95 percent below the per-capita emissions in developed nations in 2000. (A rough arithmetic check of this range follows these findings.)
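A few of the numbers above can be illustrated or checked directly. First, the temperature trend: to show how a figure like 0.19°C per decade is derived from annual global mean temperature anomalies, here is a minimal Python sketch. The anomaly series below is synthetic, generated around an assumed trend purely for illustration; real analyses use observational records such as the HadCRUT, NASA GISS, or NCDC series.

import numpy as np

# Synthetic annual global mean temperature anomalies (deg C) for 1984-2008,
# built from an assumed underlying trend of 0.019 deg C/year plus random noise
years = np.arange(1984, 2009)
rng = np.random.default_rng(0)
anomalies = 0.019 * (years - years[0]) + rng.normal(0.0, 0.08, years.size)

# Ordinary least-squares linear fit; the slope comes out in deg C per year
slope, intercept = np.polyfit(years, anomalies, 1)
print(f"Trend: {slope * 10:.2f} deg C per decade")  # close to 0.19 for this synthetic series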
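Second, the sea level figure: if the observed rate of 3.4 millimeters per year is around 80 percent above past IPCC predictions, the implied predicted rate is roughly 3.4 / 1.8 ≈ 1.9 millimeters per year. That baseline is my inference from the stated percentage rather than a number given in the report, but it is consistent with the roughly 2 millimeters per year central projections of earlier IPCC reports.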
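Third, the per-capita target: a rough arithmetic check of the 80-95 percent range, using round figures that are my own approximations rather than numbers from the report. US fossil-fuel CO2 emissions in 2000 were roughly 20 metric tons per person, so a 1-ton target amounts to a 95 percent cut (1 is 5 percent of 20); Western European emissions were roughly 8-10 tons per person, so the same target is a cut of about 87-90 percent.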
Clearly the time to act is now.
Wednesday, November 25, 2009
Senate to Hold Hearing on TSCA Chemical Control Reform
The Senate has announced that it will hold a joint hearing of both the full committee and the Subcommittee on Superfund, Toxics and Environmental Health next Wednesday, December 2, 2009. This is part of the Senate Committee on Environment and Public Works, which Barbara Boxer chairs and on which James Inhofe is the ranking minority member.
The title of the hearing will be "Oversight Hearing on the Federal Toxic Substances Control Act."
It is scheduled to begin at 2:30 pm EST in the EPW Hearing Room - 406 Dirksen Senate Office Building.
So far no witnesses have been announced. I'll provide an update here once they are, or you can check the Committee's web site.
Last week the House held a hearing on TSCA reform.
Tuesday, November 24, 2009
Beth Bosley Testimony at Congressional Hearing on TSCA Chemical Control
Today I continue with the testimony of the final witness present at the November 17, 2009 Congressional hearings on "Prioritizing Chemicals for Safety Determination." Earlier I gave an overall summary. Beth Bosley is Managing Director of the Boron Specialties company, but spoke on behalf of the Society of Chemical Manufacturers and Affiliates (SOCMA). SOCMA is a trade association representing the batch and custom chemical industry, which typically comprises small to medium-sized businesses that may not have the more extensive resources of the larger chemical firms.
Bosley noted that SOCMA agrees with the idea that TSCA needs to be modernized, but said that this should be done in a way that "doesn't devastate a strategic American industry that is already facing recession and foreign competition." She offered two essential principles for "a sustainable chemical management law that won't eliminate jobs, economic growth, or products."
1) Priorities must be based on risk: Bosley noted that this means "basing priorities and regulatory criteria on the scientific evaluation of toxicological response and exposure factors." She gave an example of a chemical that may be highly toxic but is used only in strictly controlled industrial environments or in small quantities, and as such would actually pose a fairly small risk to public health.
2) Proven regulatory mechanisms should be the basis for modernization: Bosley insisted that the modernized TSCA must rely on leveraging regulatory mechanisms that work in the US. She said that applying an approach like the European Union's REACH regulation in the US "would devastate small and medium sized companies...and do so unnecessarily since a more practical alternative is available." She suggested that "the Canadian approach to chemicals management has systematically prioritized that nation's inventory and is, therefore, much farther ahead of the EU with respect to evaluating chemicals in commerce."
Bosley also said that SOCMA supports the idea of an "inventory reset," which was part of EPA's Chemical Assessment and Management Program (ChAMP), a program that was recently discontinued. She noted that of the "over 80,000 chemicals now listed on the inventory, data suggest that only about 1/3 of these are presently in commerce." Thus, "resetting" the inventory to include only those now in production would significantly reduce the number of chemicals that need to be prioritized and categorized. Bosley suggested that "ChAMP should not have been abandoned, because it will just have to be reinstituted under another name."
Bosley said that we should "embrace TSCA mechanisms that have worked well, like the New Chemicals Program," which through its PreManufacture Notice (PMN) process reviews over 1,000 new chemicals every year. She also encouraged everyone to recognize "the massive amount of data that was generated by EPA's High Production Volume Program and leverage that data in making initial determinations of risk." She felt that "with reasonable amendments," TSCA could provide an easier mechanism to collect data from manufacturers and users related to a) volumes manufactured, processed or used, b) health effects, and c) exposure characteristics.
Finally, she noted that the "safety" standard used by EPA to make determinations should involve:
1) Not overlooking the basic principle of risk (i.e., the evaluation of both hazard and exposure, not just hazard),
2) Not letting EPA become burdened with having to determine that each chemical is safe for its intended use (i.e., chemicals need to be prioritized so that only those truly of concern are evaluated in depth, and only for the uses that are of concern), and
3) Ensuring that EPA is adequately funded no matter what approach Congress takes in modernizing TSCA. Bosley noted that "the biggest shortcoming of the TSCA program today is lack of resources, not lack of authority."
Well, that summarizes all of the witnesses who testified at the November 17, 2009 House subcommittee hearing. This is the follow-up to a House hearing held back on February 26, 2009, which I discussed earlier.
In addition to the in-person witnesses, there was some written testimony provided to the subcommittee. There were also statements published by several other interested stakeholders, such as the American Chemistry Council, the Environmental Defense Fund, and the Environmental Working Group. I'll be reviewing these in the coming days. I'll also be providing some critical analysis contrasting the different viewpoints and looking at what a final bill might look like.
Sunday, November 22, 2009
Bill Greggs Testimony at Congressional Hearing on TSCA Chemical Control
This is a continuation of my series taking a closer look at the November 17, 2009 Congressional hearings on "Prioritizing Chemicals for Safety Determination." Earlier I gave an overall summary. Today I look at the testimony of William (Bill) Greggs, who is now a private Chemical and Environmental Policy Consultant but for 37 years was a chemical engineer and global chemical policy expert with the Procter & Gamble Company. Greggs was testifying on behalf of three processor and user organizations, namely the Consumer Specialty Products Association (CSPA), the Grocery Manufacturers Association (GMA), and the Soap and Detergent Association (SDA).
Greggs noted that "CSPA, GMA and SDA are committed to manufacturing and marketing safe and innovative products." He emphasized that the organizations agree that TSCA needs to be modernized and that it is critical that there be "development of a mechanism by which EPA will prioritize existing chemicals for review and assessment." He believes that "a priority setting process developed by Congress must be risk-based, taking into account both a chemical's hazards and potential exposures. Chemicals identified as the high priorities should be those substances with both the highest hazards and the highest potential exposures." (emphasis in original)
He noted that the three organizations have "collaborated with various industry representatives on the development of a risk-based and efficient tool that EPA can use to prioritize chemical substances." They recommend a framework that accounts for increasing levels of hazard on one axis and increasing levels of exposure on the other. On the hazard axis the criteria could include whether the substance is a carcinogen, mutagen or reproductive toxicant (CMR) or persistent, bioaccumulative and toxic (PBT), among others. On the exposure axis the indicators could include the use pattern of the chemical, biomonitoring findings (e.g., the CDC data), industrial releases as reported through the Toxics Release Inventory (TRI), and whether the chemical is found in water or air.
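To make the two-axis concept concrete, here is a minimal Python sketch of how such a prioritization matrix could work. The 1-to-5 scoring scales, the multiplicative score, the bin boundaries, and the example substances are all hypothetical illustrations of the framework Greggs describes, not the actual industry tool:

from dataclasses import dataclass

@dataclass
class Chemical:
    name: str
    hazard: int    # 1 (low) to 5 (high); CMR or PBT status would push this up
    exposure: int  # 1 (low) to 5 (high); from use pattern, biomonitoring, TRI releases

def priority(chem: Chemical) -> str:
    """Combine the two axes; highest hazard AND highest exposure ranks first."""
    score = chem.hazard * chem.exposure
    if score >= 16:
        return "high"    # near the top on both axes: immediate safety assessment
    if score >= 6:
        return "medium"  # moderate combined risk: scheduled review
    return "low"         # low combined risk: deferred

inventory = [
    Chemical("Substance A", hazard=5, exposure=4),  # hypothetical CMR with wide consumer use
    Chemical("Substance B", hazard=4, exposure=1),  # hypothetical: toxic but tightly controlled
    Chemical("Substance C", hazard=2, exposure=2),
]

for chem in sorted(inventory, key=lambda c: c.hazard * c.exposure, reverse=True):
    print(f"{chem.name}: {priority(chem)}")

Note that a multiplicative score keeps a high-hazard, low-exposure substance like "Substance B" out of the top bin, which mirrors the risk-based principle that both Greggs and Bosley argued for in their testimony.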
The belief is that this process would identify the chemicals with greatest potential risk for prioritized review. Greggs asserted that stakeholders must be allowed to review and comment on the draft assessments, and that they must be given the opportunity to "provide additional information enabling a more informed decision or to remedy erroneous results of the priority setting process." He noted that "this is a critical component Congress must include that will significantly improve the results of this very important exercise."
Greggs felt that "done properly, this priority setting process would rank all chemicals from highest to lowest in a relatively short period of time." He noted that "the complete priority setting process will take EPA some time to accomplish," and therefore he encouraged Congress "to develop an additional mechanism that will enable EPA to identify the chemicals of highest priority for immediate assessment." To accomplish this, he recommended a process "that would require EPA to screen the data from the most recent Inventory Update Rule (IUR) submissions to identify chemicals that have the highest hazards and highest potential exposures." (emphasis in original) Through this process he believes "EPA could identify 50 to 100 chemicals that could quickly move into EPA's safety assessment process while the Agency works on prioritizing the remaining chemicals in commerce" using the tool described above.
Tomorrow I will take a look at the testimony of Beth Bosley, a consultant for the Society of Chemical Manufacturers and Affiliates (SOCMA).