
Sunday, November 15, 2009

A Critique of the October 2009 NCAR Study Regarding Record Maximum to Minimum Ratios


This is a critique of the October 2009 NCAR publication regarding the increasing ratio of new daily maximum temperature records to new daily minimum temperature records.

The NCAR [National Center for Atmospheric Research] study titled “The relative increase of record high maximum temperatures compared to record low minimum temperatures in the U.S.” was published October 19, 2009.

"Abstract

The current observed value of the ratio of daily record high maximum temperatures to record low minimum temperatures averaged across the U.S. is about two to one. This is because records that were declining uniformly earlier in the 20th century following a decay proportional to 1/n (n being the number of years since the beginning of record keeping) have been declining less slowly for record highs than record lows since the late 1970s. Model simulations of U.S. 20th century climate show a greater ratio of about four to one due to more uniform warming across the U.S. than in observations. Following an A1B emission scenario for the 21st century, the U.S. ratio of record high maximum to record low minimum temperatures is projected to continue to increase, with ratios of about 20 to 1 by mid-century, and roughly 50 to 1 by the end of the century."
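The 1/n decay mentioned in the abstract is the textbook result for record-setting in an unchanging climate: in year n of record keeping, the chance that the year sets a new record is 1/n. A quick simulation (my own illustration in Python, not part of the study) confirms the point:

```python
import numpy as np

rng = np.random.default_rng(0)
years, trials = 60, 20000                 # 60-year record at many independent "stations"
data = rng.normal(size=(trials, years))   # stationary climate: i.i.d. annual values

# A new record high occurs in year n when that year's value exceeds all earlier years.
running_max = np.maximum.accumulate(data, axis=1)
is_record = data == running_max           # year n equals the running max -> new record
empirical = is_record.mean(axis=0)        # fraction of stations setting a record in year n

for n in (1, 5, 10, 30, 60):
    print(f"year {n:2d}: simulated {empirical[n-1]:.3f}   vs   1/n = {1/n:.3f}")
```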
The following is a graphic representation of the study from the UCAR website:
[Figure: UCAR graphic of the ratio of record daily highs to record daily lows by decade]

"This graphic shows the ratio of record daily highs to record daily lows observed at about 1,800 weather stations in the 48 contiguous United States from January 1950 through September 2009. Each bar shows the proportion of record highs (red) to record lows (blue)for each decade. The 1960s and 1970s saw slightly more record daily lows than highs, but in the last 30 years record highs have increasingly predominated, with the ratio now about two-to-one for the 48 states as a whole."

While this study does use sound methodology regarding the data it has included, it falls short of being both statistically and scientifically complete. Hence, the conclusions from this study are prone to significant bias and any projections from this study are likely incorrect.

Study's Methodology

"We use a subset of quality controlled NCDC US COOP network station observations of daily maximum and minimum temperatures, retaining only those stations with less than 10% missing data (in fact the median number of missing records over all stations is 2.6%, the mean number is 1.2%) . All stations record span the same period, from 1950 to 2006, to avoid any effect that would be introduced by a mix of shorter and longer records. The missing data are filled in by simple averages from neighboring days with reported values when there are no more than two consecutive days missing, or otherwise by interpolating values at the closest surrounding stations. Thus we do not expect extreme values to be introduced by this essentially smoothing procedure. In addition our results are always presented as totals over the entire continental U.S. region or its East and West portions, with hundreds of stations summed up. It is likely that record low minima for some stations are somewhat skewed to a cool bias (e.g. more record lows than should have occurred) due to changes of observing time (see discussion in Easterling, 2002, and discussion in auxiliary material), though this effect is considered to be minor and should not qualitatively change the results. Additionally, at some stations two types of thermometers were used to record maximum and minimum temperatures. The switch to the Max/Min Temperature System (MMTS) in the 1980s at about half the stations means that thermistor measurements are made for maximum and minimum. This has been documented by Quayle et al. (1991), and the effect is also considered to be small. To address this issue, an analysis of records within temperature minima and within temperature maxima shows that the record minimum temperatures are providing most of the signal of the increasing ratio of record highs to record lows (see further discussion in auxiliary material)." [lines 82-102 NCAR publication]
Comments

The NCAR Study contains at least two biases:
  1. The selection of 1950 through 2006 significantly biases the outcome of this study because the U.S. was entering a cooling period in the 1960s and 1970s, which creates the illusion of unusual subsequent warming from 1980 through 2006.
  2. During the last decade, a large reduction of rural reporting stations in the U.S. has biased records toward urbanized and urbanizing areas. Land use changes as well as deterioration of urban siting versus NOAA standards [http://www.surfacestations.org/] have resulted in a bias toward over-reporting/erroneous reporting of high temperature records and an under-reporting of low temperature records.
The charts below show the results for nearly four-fifths of the U.S. weather stations and demonstrate a significant bias toward errors greater than +2°C, or about three times the total trend reported for global warming. This instrumentation issue and the bias toward urbanization contribute to significant doubt about the conclusions that can be reached concerning temperature trends in the second half of the 20th century.

[Chart: surfacestations.org station siting quality (CRN) ratings for surveyed U.S. stations]

Reality Check

As stated earlier, this critique does not question the statistical methodology applied to the data used in the study. Rather, it challenges the underlying selection of the data and the quality of the data.

The most obvious deficiency to anyone familiar with tracking U.S. temperature records is the omission of data from the 1880s through the 1940s. By selecting a period of cooling as the starting point, the NCAR study “stacks the deck” in favor of a warming trend. The 1880s pose the same problem as a starting point for the longer-term trend, but over 130 years the shorter cyclical variations temper the data somewhat. There is no full climate cycle in the NCAR study.
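A toy simulation (entirely hypothetical, using an invented oscillating climate with no long-term trend) shows how strongly a fitted trend depends on the chosen starting year:

```python
import numpy as np

# Purely hypothetical climate: a 60-year oscillation with no long-term trend at all.
years = np.arange(1880, 2007)
temp = 0.5 * np.sin(2 * np.pi * (years - 1910) / 60.0)   # arbitrary amplitude and phase

def trend_per_century(start_year):
    mask = years >= start_year
    slope = np.polyfit(years[mask], temp[mask], 1)[0]     # linear fit, degrees per year
    return 100.0 * slope

for start in (1880, 1910, 1950, 1975):
    print(f"start {start}: fitted trend {trend_per_century(start):+.2f} deg/century")
```

The point of the sketch is only that, in an oscillating series, the sign and size of the fitted trend are largely an artifact of where you start.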

To demonstrate this point, we will compare the record of statewide monthly temperature records for the longer period with the approximately 60 years of data used in the NCAR study. Obviously, comparing monthly statewide records to daily single-point records can be argued to be a case of apple seeds and watermelons, but from an analytical perspective it is reasonable, as shall be shown.

The real difference is that we are comparing area climates with microclimates. Monthly maximum and minimum records for an area climate will rarely be affected by spurious readings, whereas daily microclimate data depend entirely on proper siting and quality control at a specific location, including avoidance of external heat sources, issues that have been raised at the Surface Stations website.

In February 2007, I published an analysis titled, “Extreme Temperatures – Where’s The Global Warming” and a subsequent data update in January 2009 titled “Where Is The Global Warming... Extreme Temperature Records Update January 2009.” Dr. Roger Pielke, Sr. provided suggestions and a review in his weblog, “Climate Science.” In February 2009, these data were summarized graphically with this animation and data table in a post titled, “Decadal Occurrences Of Statewide Maximum Temperature Records”:

Each state can have only 12 statewide, monthly records for the 13 decades tracked here... hence, they are "all-time" records for a state for a month.

[Animation: Graphs 1880s-2000s High Temperature Frequency]
Range goes from 0 [white] to 8 [dark red]. Indiana had the highest frequency of records in one decade with 8 still standing from the 1930s. See the table below for the actual count by decade. Old records are replaced if tied or surpassed by subsequent readings.
The 1930s experienced the highest number of maximum extreme temperatures for which records have not been tied or surpassed subsequently. While the late 1990s did have a very brief hot period associated with El Nino, the 1990s were a rather ordinary period for extreme temperatures in the contiguous 48 states.
I have excluded Alaska and Hawaii from this animation because they are distinct and separate climate zones. For the record, however, Alaska's decade of most frequent high temperature records was the 1970s with 4. Hawaii's decade of most records was the 1910s. Those data are included in the table below.
The 1990s were particularly hot, as reflected in these records, only in New England and Idaho. That heat was far more geographically restricted than the widespread heat of the 1930s.

This animation goes to the heart of my arguments regarding global warming as it is reflected in U.S. temperature data.
  • The trendline used by those claiming long-term warming begins in a very cool climate period. Consequently, any trend from that point will be upward.

  • The late 1990s were an aberration and not indicative of the general climate oscillations presented in these records.
Statewide Monthly Maximum Temperature Records

Each state can have only 12 maximum or minimum all-time monthly records. The methodology requires that whenever a previous record is tied or exceeded, the old record is replaced by the newer one. Therefore, this approach has a slight bias toward more recent records due to the “tie goes to the later” rule.
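For clarity, the counting rule just described could be sketched like this (my own illustration; the function name and example values are invented):

```python
from collections import defaultdict

def records_by_decade(readings):
    """readings: iterable of (state, month, year, temp) tuples.
    Returns {decade: count} of still-standing statewide monthly maximum records,
    applying the 'tie or exceedance replaces the old record' rule described above."""
    best = {}                                    # (state, month) -> (temp, year of record)
    for state, month, year, t in sorted(readings, key=lambda r: r[2]):
        key = (state, month)
        if key not in best or t >= best[key][0]:     # the tie goes to the later reading
            best[key] = (t, year)
    counts = defaultdict(int)
    for _, year in best.values():
        counts[(year // 10) * 10] += 1
    return dict(counts)

# Hypothetical example: a July record set in 1936 and tied in 2005.
print(records_by_decade([("IN", 7, 1936, 116), ("IN", 7, 2005, 116)]))
# -> {2000: 1}: the tie moves the standing record into the 2000s
```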

Comparison of Statewide Monthly Temperature Records with the NCAR Study

Going back to the NCAR graphic we can see a trend which may correspond to their conclusion:
For later in the 21st century, the model indicates that as warming continues (following the A1B scenario), the ratio of record highs to record lows will continue to increase, with values of about 20 to 1 by mid-century, and roughly 50 to 1 by late century.

Two factors contribute to this increase as noted by Portman et al (2009a): 1) increases in temperature variance in a future warmer climate (as noted in the model by Meehl and Tebaldi, 2004), and 2) a future increasing trend of temperatures over the U.S. (model projections shown in Meehl et al., 2006). Since the A1B mid-range scenario is used, a lower forcing scenario (e.g. B1) produces reduced values of the ratio in the 21st century, and a higher forcing scenario (e.g. A2) produces greater values. The model cannot represent all aspects of unforced variability that may have influenced the observed changes of record temperatures to date, and the model over-estimates warming over the U.S. in the 20th century. The future projections may also reflect this tendency and somewhat over-estimate the future increase in the ratio. Under any future scenario that involves increases of anthropogenic greenhouse gases and corresponding increases in temperature, the ratio of record high maximum to record low minimum temperatures will continue to increase above the current value.

[Figure: NCAR graphic of record daily highs to record daily lows by decade, repeated from above]

Before making the direct comparison to the statewide monthly records, let us look at their extrapolation [FROM THE NCAR STUDY - "is projected to continue to increase, with ratios of about 20 to 1 by mid-century, and roughly 50 to 1 by the end of the century."] [my graphic]:
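Since the graphic is essentially the three values quoted above plotted against time, a minimal sketch (my own, simply connecting the stated anchor points rather than reproducing the model's actual trajectory) recovers its shape:

```python
import matplotlib.pyplot as plt

# Anchor points taken from the study's abstract (about 2:1 now, ~20:1 by mid-century,
# ~50:1 by late century); the line drawn through them is only my own interpolation.
years = [2009, 2050, 2100]
ratio = [2, 20, 50]

plt.plot(years, ratio, "o-")
plt.xlabel("Year")
plt.ylabel("Record high maxima : record low minima (U.S.)")
plt.title("Projected ratio, anchor points from the NCAR study")
plt.show()
```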

Some of you might find that shape familiar.

Before we can judge the extrapolation of maximum to minimum ratios in this study, we should understand how the data used by NCAR fits into the “big picture.”

Let us look at the statewide monthly maximum and minimum records by decade beginning in 1880. Those records are used to calculate a ratio of maximum to minimum records by decade, which is then compared with the NCAR study ratios.
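The ratio calculation itself is straightforward; a minimal sketch (with made-up placeholder counts, not the actual table values) looks like this:

```python
def decadal_ratios(max_counts, min_counts):
    """Ratio of standing maximum records to standing minimum records per decade.
    The counts used below are made-up placeholders; the post's actual table values
    would be substituted where available."""
    return {d: max_counts[d] / min_counts[d]
            for d in max_counts if min_counts.get(d)}

# Hypothetical counts only, to show the calculation:
max_counts = {1930: 120, 1990: 25, 2000: 20}
min_counts = {1930: 60, 1990: 15, 2000: 8}
print(decadal_ratios(max_counts, min_counts))
# -> roughly {1930: 2.0, 1990: 1.7, 2000: 2.5}
```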


There are two aspects of this comparison that jump out to the reader:
  1. The 1930s were, by far, the hottest period in the time frame. The ratio of maximum to minimum temperatures is greater in the 2000s, but the absolute number of monthly statewide extreme records is far smaller, which makes the ratio far less significant.
  2. The general pattern of ratios for the monthly records follows reasonably closely to the pattern of the daily individual location records, on a decadal basis.
Now let us take the two data sets, plot the ratios, and see what conclusions we might draw, remembering that in absolute terms the 1930s had a much higher frequency of maximum temperature extremes than the 1990s, the 2000s, or the two decades combined.

If you were to only look at the ratios of maximum to minimum monthly temperature records, it would be understandable to draw the conclusion that, while there was a cyclical pattern to the ratios, the ratios were increasing exponentially toward a much hotter environment.

It is only when you compare the absolute number of all-time records that you appreciate that ratios as a measure of global warming may be quite deficient. You should also note a significant divergence in the numerical trend of record high monthly versus daily records.
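A purely hypothetical example shows why: a decade with 120 maximum and 60 minimum records and a later decade with 10 maximum and 2 minimum records have ratios of 2:1 and 5:1 respectively, yet the later decade contains only 12 new records against 180. The rising ratio, by itself, says nothing about the collapse in the absolute number of new extremes.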

Keep in mind that there are significant biases toward recording warmer daily temperatures due to the closure of thousands of rural stations and the expansion of urban areas into previously rural or suburban areas, creating enormous "heat sinks" that prevent nighttime temperatures from dropping as far as they would in a nearby rural setting. Hence, more stations report record maximum temperatures more frequently and record minimums less frequently.

Conclusion

The oscillating or cyclical nature of our climate is completely overlooked when one takes a small time frame such as 1950 through 2006 and extrapolates a full century beyond it. Even 130 years of data, starting from a relatively cold period, gives a very brief look at climate history for the U.S., and certainly not one sufficient to extrapolate a general warming trend, much less an accelerating one.

In the face of the recent decline in new all-time monthly statewide maximum records, it is more probable that we are facing a cyclical decline in our overall temperature, and something similar to the 1960s and 1970s may be a far more realistic projection. Our most recent winters have been notably colder than long-term averages, and minimal sunspot activity may be another harbinger of this normal cyclical variation in our relatively stable climate.

For more information about this and related topics, please check out the following blogs:
The authors have contributed ideas and advice for this post.
