Temperature Data: Sufficient to Make
the Case for Global Warming?

In general, the case that global climate change is now occurring rests on the preponderance of evidence that exists. In particular, the procedure one would use to establish the most credible case is simply to find measurable indicators of climate change. But this is hard, as it requires:

  • robust definition of what constitutes a climate

  • set of procedures for measuring change

  • instrumental precision sufficient to measure change

For these reasons, there is no single indicator (smoking gun) of global climate change. Rather, as we will see, an aggregate of all forms of climate data provides the preponderance of evidence. For now, however, let's consider just global temperature data.

We begin with the famous hockey stick diagram and give two representations of it below:

Without deciding yet whether or not these kinds of data make a convincing case that global warming is occurring, we need to ask some important scientific questions:

  • Is the global annual average temperature of the Earth a meaningful number? What does it represent?

  • How does one measure it? How many thermometers should be used? Where should those thermometers be located?

  • Are the thermometers used in 1860 the same as those used in 2007?

  • Have the 20th century data been properly corrected for the effects of urbanization? In many cases, the location of the thermometer has not changed for 80 years, but in those 80 years the cow pasture that the thermometer used to be located in has become a parking lot or is now surrounded by buildings. The image below shows the environment of the official station in Lampasas, Texas:

    So there are significant issues of data reliability when it comes to using only the global mean temperature of the Earth as an indicator of climate change.

    Now, concerning the top hockey stick figure: what is typically plotted on the Y-axis is not the actual mean global temperature but a deviation from an average value that is determined by some baseline. While this form of data presentation is scientifically valid, the amplitude of the deviations does depend upon how the baseline is chosen. It is now standard to use the period 1961-1990 as the baseline (in most climate studies, 30 years is taken to be the time period for establishing average behavior). Therefore, the mean global temperature of the Earth over the period 1961-1990 is subtracted from each yearly data point. The problem with this approach is that there is no guarantee that this period is representative; furthermore, if this period was characterized by either general cooling or general warming, then it does not represent a flat baseline. What you choose as your baseline period does affect the data. In the example below, we compare the 1961-1990 baseline-subtracted data against data that use the period 1885-1915 for the baseline:

    Clearly, by 2007, the amplitude of the warming (the value on the Y-axis) is larger in the case of the 1885-1915 baseline. The overall shape of the curve does not change, of course, because you are just shifting it up or down in Y by a constant, where that constant is the average global temperature within the baseline period.
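    To make the baseline point concrete, here is a minimal sketch of the anomaly calculation under the two baselines. The array names and the stand-in temperature series are illustrative, not the actual record; the point is only that changing the baseline shifts the whole curve by a constant without changing its shape.

```python
import numpy as np

# Stand-in yearly global mean temperatures (deg F), indexed by calendar year.
# These are NOT real data; they only illustrate the anomaly bookkeeping.
years = np.arange(1850, 2008)
temps = 57.5 + 0.01 * (years - 1850) + np.random.normal(0, 0.2, years.size)

def anomaly(temps, years, base_start, base_end):
    """Subtract the mean over the chosen baseline period from every yearly value."""
    in_baseline = (years >= base_start) & (years <= base_end)
    return temps - temps[in_baseline].mean()

anom_1961_1990 = anomaly(temps, years, 1961, 1990)
anom_1885_1915 = anomaly(temps, years, 1885, 1915)

# The two anomaly series differ only by a constant vertical offset:
offset = anom_1961_1990 - anom_1885_1915
print(np.allclose(offset, offset[0]))   # True: same shape, different zero point
```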

    More recent Hockey Sticks:

    2008 Anomalies

    A lower-amplitude signal is seen when land and ocean data are combined. There has been a lot of blog discussion about the flattening in recent years being an indication that global warming has stopped.

    Monthly Anomaly Analysis (new, by GDB; not usually done)

    Raw Data

    Smoothed with El Niño Cycle

    The recent downturn is most likely just part of the short-term El Niño/La Niña cycle. There is no evidence from this waveform that global warming has stopped.
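    As a rough illustration of what that kind of smoothing does, the sketch below applies a centered moving average to a monthly anomaly series. The series and the window length (about three years, longer than a typical El Niño/La Niña swing) are illustrative assumptions, not the actual analysis used here.

```python
import numpy as np

def moving_average(x, window=37):
    """Centered moving average; a ~3-year window suppresses ENSO-scale wiggles."""
    kernel = np.ones(window) / window
    # mode='same' keeps the length, but zero padding biases the first/last ~window/2 points.
    return np.convolve(x, kernel, mode='same')

# Stand-in monthly anomaly series (deg C): slow warming drift plus an ENSO-like
# oscillation (~45-month period) plus noise. Illustrative only.
months = np.arange(360)
monthly_anomaly = (0.002 * months
                   + 0.2 * np.sin(2 * np.pi * months / 45)
                   + np.random.normal(0, 0.1, months.size))

smoothed = moving_average(monthly_anomaly)
# The short-term dips largely disappear in 'smoothed', while the slow upward drift remains.
```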

    The most scientifically accurate representation of the hockey stick is the one shown below. In this case, the +/- 2 standard deviation errors are indicated (labeled as 5-95% decadal error bars), and the data have been averaged over a long enough window to suppress much of the inherent noise. The four colored lines indicate the linear slopes obtained over different time periods, in units of degrees C per decade. Presented in this way, the data reveal an increasing slope when the most recent data are used, and that is cause for concern.
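    The slope calculation behind those colored lines is just a least-squares fit over each window, converted to degrees C per decade. Here is a minimal sketch of that computation; the window start years and the stand-in (deliberately accelerating) anomaly series are illustrative assumptions.

```python
import numpy as np

# Stand-in yearly anomalies (deg C) with a mild acceleration built in,
# so later windows show a steeper slope (illustrative only).
years = np.arange(1850, 2008)
anom = 2e-5 * (years - 1850) ** 2 + np.random.normal(0, 0.1, years.size)

def trend_per_decade(years, anom, start, end):
    """Least-squares slope over [start, end], in degrees C per decade."""
    mask = (years >= start) & (years <= end)
    slope_per_year = np.polyfit(years[mask], anom[mask], 1)[0]
    return 10.0 * slope_per_year

for start in (1850, 1900, 1950, 1980):
    print(f"{start}-2007: {trend_per_decade(years, anom, start, 2007):+.3f} C/decade")
```

    With an accelerating series, the fitted slope grows as the window start moves later, which is the qualitative behavior described above.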

    We can now also apply the Z-test to these global data. For instance, we can ask the question: is the average mean temperature of the Earth over the period 1980-2007 significantly different from that over the period 1900-1980?

    The actual global data is here – but here is a table of the results:

    Time Period    Avg. Temp (F)   Std. Deviation   Error in Mean
    1900 - 1980    58.2            0.37             0.04
    1980 - 2007    59.4            0.31             0.06

    Z-statistic: 16.7 (!)

    So yes, a highly significant difference exists if one wanted to present the data in this manner. But is that difference due to urbanization and thermometer location or is it indicative of a real change in the climate system?
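    As a quick check on that number, the Z-statistic here is just the difference of the two period means divided by the combined standard error, assuming the two periods are independent and the quoted errors are standard errors of the mean:

```python
# Difference of means over the combined standard error of the two means.
mean_early, sem_early = 58.2, 0.04   # 1900-1980
mean_late,  sem_late  = 59.4, 0.06   # 1980-2007

z = (mean_late - mean_early) / (sem_early**2 + sem_late**2) ** 0.5
print(round(z, 1))   # ~16.6, consistent with the 16.7 in the table
```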

    The scientific bottom line is that collapsing all the data down to a one-dimensional global average temperature yields an ambiguous and unreliable measure of global warming.

    A far more convincing case arises when the actual locations of the temperature measurements are used. Simple models easily show that heat flows from the equator to the poles. Since there is much less surface area at the poles of a sphere, the heat flux per unit area in the Earth's polar regions will be higher than in the equatorial regions.
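    The geometry is easy to quantify: the area of a latitude band on a sphere is A = 2*pi*R^2*(sin(lat2) - sin(lat1)), which shrinks rapidly toward the poles, so the same poleward heat transport is spread over much less area there. A minimal sketch, with Earth's radius as the only physical input:

```python
import numpy as np

R_EARTH_KM = 6371.0

def band_area(lat1_deg, lat2_deg, R=R_EARTH_KM):
    """Surface area (km^2) of the latitude band between two latitudes."""
    lat1, lat2 = np.radians([lat1_deg, lat2_deg])
    return 2 * np.pi * R**2 * abs(np.sin(lat2) - np.sin(lat1))

print(f"0-10 deg band:  {band_area(0, 10):.2e} km^2")    # ~4.4e7 km^2
print(f"70-80 deg band: {band_area(70, 80):.2e} km^2")   # ~1.2e7 km^2, roughly 4x smaller
```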

    This leads to a simple prediction: warming should be higher in the high-latitude regions of the Earth.

    Here are the data that strongly support this:

    Here the temperature data are sliced into four different time periods for analysis. Panel (c) clearly shows what is known as the mid-century cooling period (seen in the hockey stick diagram over the period 1940-1965). The alarming trends, however, appear in panel (d): throughout the northern hemisphere there are significant warming trends, with increases as high as 1 degree C per decade! It is this form of data slicing and representation that is, by far, the most scientifically convincing evidence that global warming is now seen in the actual land temperature data.

    Further slicing of the data shows that this high-latitude signal is strongest in the winter months. This has grave implications for permafrost melting and methane release, which we will discuss later:


    Finally, a new regional climate record has been analyzed in seasonal terms, revealing fairly significant summertime warming in Central Europe:

    The upward trend in these temperatures (beginning around 1980) has an amplitude of about 0.9 degrees C per decade (which is large!). The well-known August 2003 heat wave in Europe is clearly evident as well.

    Similar kinds of temperature trends are also seen in the American West:

    There is also the issue of extreme weather events (such as the European heat wave of 2003 or the Iowa floods of 2008) and whether or not they are increasing with time. The statistical analysis of extreme events is difficult to perform in a convincing manner. Whether it is significant that 2007 was the second hottest year on record, and that 8 of the hottest years measured have occurred since 1998, is unclear; it is certainly consistent with global warming, but establishing its statistical significance is difficult (see the sketch below).
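    As a rough illustration of the kind of argument involved, the sketch below asks how often, under a no-trend null where yearly temperature ranks are a random permutation, at least 8 of the 10 hottest years in a ~150-year record would land in the most recent decade. The record length, the 10-hottest framing, and the independence assumption are all illustrative; real yearly temperatures are autocorrelated, which is precisely what makes the honest version of this calculation hard.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, top_k, window, need = 150, 10, 10, 8   # record length, hottest-K, recent decade, required hits
trials = 100_000

hits = 0
for _ in range(trials):
    ranks = rng.permutation(n_years)              # each year's temperature rank under the null
    hottest_idx = np.argsort(ranks)[-top_k:]      # indices (years) of the 10 hottest
    hits += int((hottest_idx >= n_years - window).sum() >= need)

print(hits / trials)   # essentially 0 under this (over-simplified) independent null
```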

    Conceptually, increasing climate volatility can be represented as follows:

    Panel (a) shows the expectation if the new climate is simply a direct shift in the average relative to the old climate, while the variation around the average stays the same. This is likely too simplified a notion.

    Panel (b) shows the case where the new climate simply shows more extreme variations around the same average values as the old climate.

    Panel (c) shows the case where there is both a shift in the average and an increase in the volatility (i.e., the variance around the average). That situation would predict the greatest amount of record heat.
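    Here is a minimal numerical sketch of the three panels, using normal distributions as stand-ins for the old and new climates; the specific means, standard deviations, and the "record heat" threshold are illustrative assumptions.

```python
from scipy.stats import norm

old = norm(loc=0.0, scale=1.0)        # old climate, in arbitrary temperature units
record = old.ppf(0.999)               # a "record heat" level rarely reached in the old climate

cases = {
    "(a) shifted mean only":      norm(loc=0.5, scale=1.0),
    "(b) more variance only":     norm(loc=0.0, scale=1.5),
    "(c) shift + more variance":  norm(loc=0.5, scale=1.5),
}
for label, new in cases.items():
    # Probability that a given year exceeds the old record level under the new climate.
    print(f"{label}: {new.sf(record):.4f}")
```

    With these illustrative parameters, case (c) gives the largest exceedance probability, matching the statement above that it would predict the most record heat.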

    Currently the data are consistent with either Panel (b) or Panel (c). For instance, the 2008 winter in China was the coldest in 150 years, and while record cold does not get as much media coverage as record warmth, there have been a number of record cold spells over the last 15 years as well.

    Overall, the weather just seems chaotic. Here is a snippet from March 2008:

    The global land surface temperature was the warmest on record for March, 3.3F above the 20th century mean of 40.8F. Temperatures more than 8F above average covered much of the Asian continent. Two months after the greatest January snow cover extent on record on the Eurasian continent, the unusually warm temperatures led to rapid snow melt, and March snow cover extent on the Eurasian continent was the lowest on record.