The uncertainties of averages

Dr Vincent Gray

Those who provide us with the supposed Mean Annual Global Temperature Anomaly (graph shown below) treat the annual points in their graph as if they were constants. The points on the graph do not represent actual observations; they are processed versions of actual observations, and they are subject to statistical uncertainties.

The latest CRU paper to calculate these uncertainties is Brohan, P., J.J. Kennedy, I. Harris, S.F.B. Tett and P.D. Jones (2006). “Uncertainty estimates in regional and global observed temperature changes: a new dataset from 1850.” J. Geophys. Res. 111: D12106, doi:10.1029/2005JD006546.

This paper combines many sources of uncertainties and the final figures vary from year to year, but are typically about ±0.2 ºC on a 95% confidence basis. Some versions of their graph include these figures as “error bars” attached to the data points.

Brohan et al. even admit that they do not include “unknown unknowns”, referring to the internationally recognised expert on this subject, Donald Rumsfeld.

It is surprising that they have left out of their discussion the most important source of uncertainty in their figures, one which is “known” to every person who has studied statistics. It is the uncertainty which arises every time you take an average.

[Figure: CRU global air temperature anomaly, January 2010]

The actual experimental observations upon which the final figures on the graph are based are the daily measurements of the maximum and the minimum temperature at weather stations all over the world. In order to obtain the annual mean maximum or minimum it is necessary to average 365 daily measurements (366 in a leap year).

According to every one of the several textbooks on statistics that I possess, the equation for obtaining the uncertainty of a single mean is as follows:

Uncertainty = ± t × SD/√n, where SD is the standard deviation and n is the number of observations.

The value for t is obtained from the tables of the t distribution given in the textbooks. For 95% confidence limits and numbers of observations above 50 it is close to 2. The square root of 365 is 19.1.
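The formula can be sketched in Python; as the text notes, t is close to 2 for 95% confidence once the number of observations exceeds about 50, so a fixed t = 2 is used here rather than a t-table lookup:

```python
import math

def mean_uncertainty_95(sd, n, t=2.0):
    """95% confidence half-width of a mean: t * SD / sqrt(n).
    For more than ~50 observations, t is close to 2."""
    return t * sd / math.sqrt(n)

# The square root of 365 daily observations, as quoted in the article:
print(round(math.sqrt(365), 1))  # 19.1
```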

Kerkin (personal communication) recently downloaded a large number of daily maximum and minimum measurements from the NIWA database and calculated the standard deviation for two weather stations: Albert Park, Auckland, and Te Aroha in the North Island of New Zealand.

For Albert Park the SD for the maximum was 3.8 ºC and for the minimum 3.7 ºC.

For Te Aroha the SD for the maximum was 4.8 ºC and for the minimum 5.1 ºC.

I do not know how typical of the whole world these might be, but I expect that for countries with a continental climate the SD figures would be much higher. But, anyway, let us take an SD of 4.3 ºC for the maximum and 4.4 ºC for the minimum and try it in the formula.

The 95% confidence limits for the average are therefore ± 2 × 4.3/19.1 = ±0.45 ºC for the maximum and ± 2 × 4.4/19.1 = ±0.46 ºC for the minimum.

These figures are about double the uncertainties calculated by Brohan et al from all the other possible sources of error.

It is assumed that the average temperature is the mean of the maximum and the minimum, so the individual uncertainties must be added, giving ±0.91 ºC for the mean.
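The arithmetic above can be checked with a short script; the SD values are the illustrative ones quoted earlier, and the two uncertainties are added linearly, following the article's approach:

```python
import math

t = 2.0                      # t-value for 95% confidence, n > 50
n = 365                      # daily observations in one year
sd_max, sd_min = 4.3, 4.4    # illustrative SDs quoted in the text

u_max = t * sd_max / math.sqrt(n)   # 95% limits for the annual mean maximum
u_min = t * sd_min / math.sqrt(n)   # 95% limits for the annual mean minimum
u_mean = u_max + u_min              # added linearly, as in the article

print(round(u_max, 2), round(u_min, 2), round(u_mean, 2))  # 0.45 0.46 0.91
```

(For reference, conventional error propagation for the mean of two values would combine the two limits in quadrature and halve the result; the linear addition here is the article's.)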

But that ain’t all. There is an additional uncertainty from choosing such a bad method for calculating the average. As far as I am aware, no figures have been published from attempts to calculate the error of doing this, or its uncertainty. However, NIWA have published a set of hourly temperature figures from 24 New Zealand weather stations, for a typical summer’s day and a typical winter’s day, at their web page “Meteorologist for a Day” (2010).

I have calculated, from the 48 figures supplied, the average difference between the Maximum/Minimum mean and the 24 Hour Mean as 0.2ºC with a standard deviation of 0.8 ºC.

The 95% uncertainty can again be calculated as ± 2 × 0.8/√48, which gives ±0.23 ºC. This amount is about the same as all the uncertainties calculated by Brohan et al.
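The two estimators being compared, and the ±0.23 ºC figure, can be sketched as follows (the two small functions are illustrative definitions, not NIWA code):

```python
import math

def minmax_mean(hourly):
    """The max/min estimate of the daily mean, as used at weather stations."""
    return (max(hourly) + min(hourly)) / 2

def true_mean(hourly):
    """The 24-hour mean of hourly readings."""
    return sum(hourly) / len(hourly)

# 95% uncertainty of the average max/min-vs-24-hour difference,
# from the 48 NIWA figures: SD 0.8 C, n = 48
u = 2 * 0.8 / math.sqrt(48)
print(round(u, 2))  # 0.23
```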

If the 95% confidence limits are all added together (0.2 + 0.45 + 0.46 + 0.23), the total comes to ±1.34 ºC on each data point.

This is well above the 0.9 ºC claimed as the global, or the New Zealand, temperature rise over the last 100 years, which means that this figure has a very low probability of being correct.

12 Thoughts on “The uncertainties of averages”

  1. Doug Proctor on February 25, 2011 at 5:47 am said:

    All these averages assume that the global changes are equal, as if random in location and time. Insolation heating events are not, for a start, due to orbital eccentricity, axial tilt, hemispheric differences in land/sea/ice proportions and, as a varying function of several of these and of cosmic rays, cloud cover. Regional differences (as for the Arctic) are significant through time.

    ARGO sea data, year to year and by latitude, longitude and hemisphere, show regional differences behind the general patterns that show up as averages. Since it is the small pieces of the action that are claimed to be the dominatrix of climate change, it would be reasonable to see if the local effects (in time as well as space) are enough to distort the global averages.

    Uncertainty in global averages covers up uncertainties in regional impacts. If, for example, the 1988 El Niño effectively heats just a portion of the Pacific, then the high heat event shown as a “global” spike is not global, but a distortion introduced by a regional anomaly. Interpretation of a series of local distortions is not the same as the interpretation of a global phenomenon.

    The uncertainties of the IPCC are mathematically correct, in that the methods chosen and the assumptions behind them lead to those levels of uncertainty. But is the uncertainty of what is going on in the world globally the same as the mathematical artefact? I’d say not.

    Breaking down where and when things get warmer is not a subject of much discussion as it is messy. One number, one planet: simplistic but Gore-worthy.

    Graphing temperature changes globally along with regionally shows that the regional variations are far greater than the global. Global uncertainties are therefore mathematically cute, but they hide the uncertainty of what the world experiences.

  2. Clarence on February 25, 2011 at 11:18 am said:

    If I have one foot encased in ice and the other on fire, then my average condition is ‘comfortable’.

  3. QuentinF on February 25, 2011 at 12:22 pm said:

    Gerlich et al. state categorically: “There are no calculations to determine an average surface temperature of a planet”.

  4. Richard C (NZ) on February 25, 2011 at 7:52 pm said:

    Something to think about.

    CCG regulars will be used to seeing the early NZT7 data adjusted down, but what happens when the reference is swapped from the 2010 end to the 1910 end?

    This is what happens to the Auckland series

    Year  Raw (Mangere)  Adjusted (Albert Park terms)
    2002 15.4 16.05
    2003 15.5 16.09
    2004 14.9 15.55
    2005 16.1 16.69
    2006 15.3 15.91
    2007 15.6 16.26
    2008 15.7 16.33
    2009 15.1 15.76

    These raw temperatures were measured at the Mangere treatment plant, but are adjusted to be in terms of Albert Park by reversing the sign of NIWA’s cumulative step-change total and adding it to the entire series.

    The adjusted temperatures could be checked by setting up a temporary station at the same spot as the original Albert Park station and taking a year’s measurements. The same could be done, where sensible, for every other location of the NZT7.

  5. outtheback on February 26, 2011 at 5:42 am said:

    The only reason to keep the temperature comparisons on a global level is that it is almost impossible for the average punter to check. Make it regional, or smaller still by general climate area within a country, and the believers will be drowned out by all those who check the temperature daily for their own purposes, either hobby or work, and can easily see that it has not changed for them. It would blow the myth out of the water in no time. Not quite what politicians and others have in mind. No matter what the real intentions are or were, support for the myth would never eventuate. In business it goes like this: if you don’t know the numbers, you can’t control them. That is exactly what they try to do here: the public at large cannot check the global data, so they can never be in control. A politician’s dream.

  6. Richard C (NZ) on February 26, 2011 at 9:25 am said:

    Oops, the cumulative step total is 0.62, not 0.66.

    So every year on average, the temperatures at Albert Park are 0.62 C warmer than Mangere treatment plant if we accept NIWA’s cumulative step change method for the NZT7.

    Believe it or not.

  7. Richard C (NZ) on February 26, 2011 at 2:07 pm said:

    The same goes for Masterton. If the reference is swapped from the East Taratahi AWS end to the Workshop Road end by adding the 0.55 ºC cumulative step-change total to the NZT7 Masterton series, the latest East Taratahi AWS temperatures are adjusted up to be in terms of Workshop Road as follows:


    Year  Raw (E. Taratahi)  Adjusted (Workshop Rd terms)  Diff
    1991 12.15 12.74 0.59
    1992 11.2 11.75 0.55
    1993 11.4 11.94 0.54
    1994 12.4 12.97 0.57
    1995 12.7 13.27 0.57
    1996 12.3 12.88 0.58
    1997 11.9 12.48 0.58
    1998 13.5 14.08 0.58
    1999 13.1 13.63 0.53
    2000 12.5 13.02 0.52
    2001 12.7 13.32 0.62
    2002 12.5 13.02 0.52
    2003 12.5 13.09 0.59
    2004 12.3 12.88 0.58
    2005 13.0 13.58 0.58
    2006 12.4 12.96 0.56
    2007 12.6 13.19 0.59
    2008 12.9 13.45 0.55

    So within the same Masterton location, the original site was (and hopefully still is) on average 0.55 C warmer than the latest open site using NIWA’s cumulative step change method (not taking into account any warming 1919-1991).

    The adjustment could be checked with a temporary station at Workshop Rd taking a year’s data. The two sites are separated by approx 8 km.
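The reference swap described in this comment is, at bottom, a constant offset applied to a whole series. A minimal sketch, assuming the method is simply raw + step (the 0.55 ºC figure and the 1992 value come from the table above; the function name is illustrative, and note that the table's year-to-year differences are not exactly constant):

```python
STEP = 0.55  # cumulative step-change total, East Taratahi AWS -> Workshop Road terms

def to_old_reference(series, step):
    """Express a raw series in terms of the original site by adding the step total."""
    return {year: round(temp + step, 2) for year, temp in series.items()}

raw = {1992: 11.2}                  # raw East Taratahi value from the table
print(to_old_reference(raw, STEP))  # {1992: 11.75}
```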

  8. Of possible interest is the new temperature record project at Berkeley

  9. Richard C (NZ) on February 28, 2011 at 6:56 pm said:

    NIWA’s justification for their method, from page 13 of the NIWA/BOM review report:

    “it is a simple matter in principle [4] to adjust temperatures from one site to the same base level as at another site”

    Footnote 4 says:

    “It would be almost as simple in practice too, were it not for missing data.”

    So NIWA’s method is based on “a simple matter in principle” – not on an empirical scientific study that proves the “principle”, note.

  10. Doug Proctor on March 1, 2011 at 6:37 am said:

    The global record is a proportional summed average of regions. If we were to deconstruct the global GISTemp historical record into oceanic and non-oceanic, and then subdivide those into their component parts, we would see that only some areas are warming, while others are cooling.

    I can say this with certainty, as the contiguous American landmass does not show the warming of the globe. Neither does the oceanic subdivision, as the SST and, more recently, the ARGO data have lesser trends than GISTemp. Also, the GISTemp records are similar in trend but different in amount to the UAH and RSS satellite records. The same goes for HadCRUT.

    Breaking out these different databases appears to me to show that the land records in northern Canada and Europe have a far larger high-low range than predicted/projected by AGW theory. The thermal inertia of the oceans, being greater than that of the land or air, means that the small change in the SST data MUST be counteracted by a very large change in the land data for the average to be what GISTemp shows it to be. At the same time, all global maps for any given year show large areas that are cooler than the reference average. Other areas must be abnormally warm for the average to survive as stated.

    What would happen to the global average if, for instance, the Arctic were to be removed from the databases? If, without a regional hotspot, we had minimal “global” heating, one would suspect that the regional hotspot area, not the globe, was undergoing change. If the hotspot moved around too much, one would suspect heat transference distorting the appearance.

    As has been noted by many, the heat energy in a gram of water is larger than that of air. The oceans are 70.1% of the area of Earth. Should the energy in a 0.1 K temperature drop in an ocean be transferred to winds moving over lands, a high cumulative energy transference will occur. Will this distort the temperature records to make us think the world is warming? I’d say so.

  11. Richard C (NZ) on March 1, 2011 at 8:54 am said:

    “Should the energy in a 0.1 K temperature drop in an ocean be transferred to winds moving over lands, a high cumulative energy transference will occur. Will this distort the temperature records to make us think the world is warming? I’d say so”

    I think you are right Doug. If the ocean gives up heat, it has to pass through the atmosphere in order to dissipate to space. The effect of that heat on the temperature of the air is far greater than the effect it had on the temperature of the ocean.

    So although the ocean might be cooling, air temps will continue to stay up for some time until all the heat that the ocean gives up has passed through the atmosphere.

    There’s a heat-wave in WA, Australia, right now, and if you look at the SST anomaly there’s an ocean hot spot directly off the coast of WA. The heat-wave heat has to come from somewhere; I suggest that a large amount has been given up by the ocean and is being blown across the land, as you suggest.

  12. Richard C (NZ) on March 1, 2011 at 9:01 am said:

    I’ve checked the “principle” empirically using a 1963-1983 Auckland Aero/Albert Park overlap by adding the 1976 0.65 step to the Auck Aero series and comparing the adjusted series to the Albert Park series.

    Both sets correspond well so the principle seems to be valid for Auckland AA and AP.
