John Christy on Summer Heat and James Hansen’s PNAS Study


In a recent study published in Proceedings of the National Academy of Sciences (PNAS), NASA scientist James Hansen and two colleagues find that whereas “extremely hot” summer weather “practically did not exist” during 1951-1980, such weather affected between 4% and 13% of the Northern Hemisphere land area during 2006-2011. The researchers infer that human-caused global warming is “loading” the “climate dice” towards extreme heat anomalies. They conclude with a “high degree of confidence” that the 2003 European heat wave, the 2010 Russian heat wave, and the 2011 Texas-Oklahoma drought were a “consequence of global warming” and have (as Hansen put it in a recent op-ed) “virtually no explanation other than climate change.”

In a recent post, I reviewed studies finding that the aforementioned anomalies were chiefly due to natural variability. In another post, I summarized an analysis by Patrick Michaels and Chip Knappenberger, who conclude that “the 2012 drought conditions, and every other [U.S.] drought that has come before, is the result of natural processes, not human greenhouse gas emissions.”

But what about the very hot weather afflicting much of the U.S. this summer? Greenhouse gas concentrations keep rising, heat spells are bound to become more frequent and severe as the world warms, and the National Oceanic and Atmospheric Administration (NOAA) reports that July 2012 was the hottest July in the U.S. instrumental record. Isn’t this summer what greenhouse warming “looks like”? What else could it be?

University of Alabama in Huntsville (UAH) climatologist John Christy addressed these questions last week in a two-part column. In Part 1, Christy argues that U.S. daily mean temperature (TMean) data, on which NOAA based its report, “do not represent the deep atmosphere where the enhanced greenhouse effect should be detected, so making claims about causes is unwise.” A better measure of the greenhouse effect is daily maximum temperature (TMax), and TMax records set in the 1930s remain unbroken. In Part 2, Christy argues that Hansen’s 10% estimate of the portion of land affected by extreme heat during 2006-2011 shrinks down to 2.9% when anomalies are measured against a longer, more representative climate baseline. 

NOAA’s claim that July 2012 was the hottest July on record is based on daily mean temperature (TMean) data. TMean is the average of the daytime maximum and nighttime minimum temperatures: TMean = (TMax + TMin)/2. Whereas TMax “represents the temperature of a well-mixed lower tropospheric layer, especially in summer,” TMin “can warm over time due to an increase in turbulent mixing” near the surface. Land-use changes such as urbanization, agriculture, and forestry tend to disrupt the natural formation of a shallow layer of cool nighttime air. There has been substantial population growth and development in the U.S. since 1980, the last year of Hansen’s baseline period. Not coincidentally, most of the U.S. surface warming during the past three decades has been in TMin rather than TMax (see the second graph in Christy’s column).

The point? TMin warming is not primarily due to the accumulation of heat in the deep atmosphere (i.e. the greenhouse effect). Consequently, averaging TMin with TMax produces a composite (TMean) that inflates the appearance of the greenhouse effect.
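To make the arithmetic concrete, here is a minimal Python sketch using hypothetical numbers rather than real station data: a flat TMax combined with a warming TMin still yields a positive TMean trend, even when the TMin warming reflects near-surface mixing and land-use effects rather than heat accumulating in the deep atmosphere.

```python
import numpy as np

# Hypothetical series (not real observations): flat TMax, TMin warming
# at +0.3 deg C/decade, both with interannual noise.
years = np.arange(1980, 2012)
rng = np.random.default_rng(0)

tmax = 32.0 + rng.normal(0, 0.5, years.size)                          # no trend
tmin = 18.0 + 0.03 * (years - 1980) + rng.normal(0, 0.5, years.size)  # +0.3 C/decade

tmean = (tmax + tmin) / 2.0   # NOAA-style daily mean

# Least-squares trends, converted to deg C per decade
for name, series in [("TMax", tmax), ("TMin", tmin), ("TMean", tmean)]:
    slope_per_decade = np.polyfit(years, series, 1)[0] * 10
    print(f"{name} trend: {slope_per_decade:+.2f} deg C/decade")
```

By construction, half of any TMin-only warming flows straight into TMean, which is why Christy regards TMean as a poor proxy for greenhouse warming.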

Christy’s colleague Roy Spencer produced a chart of TMax using the same weather stations as NOAA. Spencer found that July 2012 was very hot, but not as hot as the summers of 1936 and 1934. More importantly, far more all-time TMax records were set in the 1930s than in any recent decade.

In contrast, about as many TMin records were set in recent years as in the 1930s.

Christy comments:

There has been a relatively steady rise in high TMin records (i.e. hot nights) which does not concur with TMax, and is further evidence that TMax and TMin are not measuring the same thing. They really are apples and oranges. As indicated above, TMin is a poor proxy for atmospheric heat content, and it inflicts this problem on the popular TMean temperature record which is then a poor proxy for greenhouse warming too.
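The record-counting behind charts like Spencer’s is simple to sketch. The illustration below is hedged: `station_series` is a hypothetical mapping from station ID to (year, TMax) pairs, not the actual quality-controlled NOAA data, and the metric is the decade in which each station’s still-standing all-time high was set.

```python
from collections import Counter

def decade_of_record(series):
    """Return the decade in which a station's all-time-high TMax was set.

    `series` is an iterable of (year, tmax) pairs; ties go to the earliest
    year, matching the convention that a record stands until it is exceeded.
    """
    record_year, _ = max(series, key=lambda yt: (yt[1], -yt[0]))
    return (record_year // 10) * 10

def records_by_decade(station_series):
    """Tally how many stations set their standing all-time TMax record per decade."""
    return Counter(decade_of_record(s) for s in station_series.values())

# Hypothetical toy data for two stations (not real observations):
station_series = {
    "STN_A": [(1934, 41.1), (1936, 42.3), (2012, 41.9)],  # record stands from 1936
    "STN_B": [(1936, 39.0), (2011, 39.8), (2012, 40.4)],  # record set in 2012
}
print(records_by_decade(station_series))  # Counter({1930: 1, 2010: 1})
```

Running the same tally on TMin instead of TMax is what produces the contrast Christy describes: the high-TMax records cluster in the 1930s, while the high-TMin records keep accumulating.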

Although TMax is a better proxy than TMin for the greenhouse effect, only satellites can provide “direct and robust” measurements of the heat content of the global atmosphere. UAH satellite data do show that the Earth has been in a long-term warming trend (+0.14°C per decade since November 1978). However, the data also show that July 2012 was not the hottest July in the 34-year satellite record for the continental U.S., the Northern Hemisphere, or the world.

Christy finds two main weaknesses in Hansen’s study. First, it assumes that changes in TMean accurately represent the effect of extra greenhouse gases. Second, it assumes that the distribution (bell curve) of weather anomalies during a single 30-year period (1951-1980) represents natural climate variability over the past 10,000 years or so.

As discussed above, TMean “misrepresents the response of the climate system to extra greenhouse gases.” So Christy uses TMax data to estimate trends in hot weather anomalies. In addition, he calculates the spatial extent of Northern Hemisphere extreme heat anomalies during 2006-2011 using both Hansen’s baseline (1951-1980) and a somewhat longer baseline that includes the 1930s and 1940s (1931-1980). Christy’s results are much less dramatic than Hansen’s.

In Christy’s figure, the top line (black-filled circles) shows the percentage of the Northern Hemisphere land area that the Hansen team calculated to have experienced anomalously high heat during 2006-2011. The next line (gray-filled circles) assumes the same base period (1951-1980) for gauging anomalies, but uses TMax from the quality-controlled Berkeley Earth Surface Temperature (BEST) station data. Although the “correlation between the two is high,” the spatial coverage drops by more than half, “from Hansen’s 6-year average of 12 percent to this analysis at 5 percent.”

The third line (open circles) gauges TMax anomalies in 2006-2011 against a 1931-1980 baseline. The result is that 2.9% of the Northern Hemisphere land area experienced extreme heat anomalies — about a quarter of the Hansen team’s results. “In other words,” says Christy, “the results change quite a bit simply by widening the window back into a period with even less greenhouse forcing for an acceptable base-climate.”

The lowest line (open boxes) uses an 80-year baseline (1931-2010) to identify extreme hot-weather anomalies during 2006-2011. In this case, only 1.3% of the Northern Hemisphere land area experienced anomalously high heat.
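The baseline sensitivity Christy describes can be reproduced in miniature. The sketch below uses synthetic data rather than the BEST record, and assumes a Hansen-style definition of “extremely hot” as a seasonal anomaly exceeding roughly +3 standard deviations of the base period; both the mean and the standard deviation are recomputed for each candidate baseline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_years = 5000, 81          # hypothetical grid, years 1931..2011
years = np.arange(1931, 2012)

# Synthetic field: modest warming trend, a hot 1930s decade, interannual noise.
trend = 0.008 * (years - years[0])
decadal = np.where((years >= 1931) & (years <= 1940), 0.6, 0.0)
temps = trend + decadal + rng.normal(0, 0.6, (n_cells, n_years))

def pct_extreme(temps, years, base, target=(2006, 2011), k=3.0):
    """Percent of cell-years in `target` exceeding +k sigma of the `base` period."""
    b = (years >= base[0]) & (years <= base[1])
    t = (years >= target[0]) & (years <= target[1])
    mu = temps[:, b].mean(axis=1, keepdims=True)
    sd = temps[:, b].std(axis=1, keepdims=True)
    z = (temps[:, t] - mu) / sd
    return 100 * (z > k).mean()

for base in [(1951, 1980), (1931, 1980), (1931, 2010)]:
    print(base, f"{pct_extreme(temps, years, base):.2f}% of cell-years > +3 sigma")
```

Because both the baseline mean and its standard deviation move with the chosen window, pulling the hot 1930s into the base period raises the yardstick and shrinks the flagged fraction — the mechanism behind the drop from 12 percent to 2.9 percent and 1.3 percent above.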

One might object that the 80-year baseline includes the most recent 30 years of greenhouse warming and, thus, masks the impact of greenhouse gas emissions on the ‘natural’ climate. However, excluding the most recent 30 years, as Hansen does, is question-begging — it assumes what Hansen sets out to prove, namely, that the current climate is outside the range of natural variability. That assumption conflicts with studies finding that the Northern Hemisphere was warmer than present for several decades during the Medieval Warm Period and the Roman Warm Period, and for thousands of years during the Holocene Optimum. Christy asks:

What is an accurate expression of the statistics of the interglacial, non-greenhouse-enhanced climate? Or, what is the extent of anomalies that Mother Nature can achieve on her own for the “natural” climate system from one 30-year period to the next? I’ll bet the variations are much greater than depicted by 1951-1980 alone, so this choice by Hansen as the base climate is not broad enough. In the least, there should be no objection to using 1931-1980 as a reference-base for a non-enhanced-greenhouse climate.
