Seventy years is plenty

Unadjusted NZ temperature history

Barry Brill makes a strong case that the New Zealand temperature record should ignore the period before 1930. In essence, he says that a 70-year-long record is plenty long enough to establish a trend, and in any case the early data is either missing or unreliable — just chuck it out! He says it at greater length and more politely than that in a sometimes tongue-in-cheek article that makes sly digs at NIWA for the mistakes or naked bias that have given us a deeply suspect temperature “history.”    – Richard Treadgold

— by Barry Brill, Chairman of the New Zealand Climate Science Coalition

Climate Change policy is driven by forecasts of temperatures over the next 100 years. But the computer models need to be checked against the actual temperature trends of the last 100 years. If backcasts are wrong, then forecasts will also be wrong.

The NZ temperature record averages seven weather stations — Auckland, Masterton, Wellington, Nelson, Hokitika, Lincoln and Dunedin — through the twentieth century. But there are many gaps and flaws up to about 1930 and, apart from these seven, there are very few other records to use as benchmarks.
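For readers who want to see the arithmetic, a series of this kind is simply the year-by-year mean across the stations, and the headline figure is a least-squares trend fitted to that composite. Here is a minimal sketch with invented numbers — not NIWA’s data or their actual method:

```python
# Illustrative sketch only: synthetic station data, not NIWA's.
# A "seven-station series" is the yearly mean across stations; the
# headline trend is the least-squares slope of that composite.

def composite_mean(stations):
    """Average same-length yearly series across stations."""
    n = len(stations[0])
    return [sum(s[i] for s in stations) / len(stations) for i in range(n)]

def ols_slope(years, values):
    """Ordinary least-squares slope (degrees C per year)."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = list(range(1900, 2000))
# Seven made-up stations, each warming 0.009 C/yr from different baselines.
stations = [[base + 0.009 * (y - 1900) for y in years]
            for base in (11.2, 12.8, 12.1, 12.5, 11.7, 11.9, 10.6)]
nz_mean = composite_mean(stations)
trend_per_century = ols_slope(years, nz_mean) * 100  # degrees C per century
```

With seven synthetic stations each warming at 0.009 °C/yr, the composite trend comes out at 0.9 °C/century; the method is unremarkable — the argument here is about the quality of what goes into it.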

First 30 years: a chequered history

Auckland: Moved from the Museum to Albert Park in late 1909, and was affected by rapid tree growth and urbanisation during the next 20 years.

Masterton: Early unofficial records were maintained fitfully by various individuals, but the first visit by the MetService was in 1928. In his 1981 thesis dealing with the reliability of weather station data, Dr J Salinger describes the pre-1920 record as “only fair”, one that “must be viewed with caution”.

Wellington: Moved from Bolton St to Buckle St (location uncertain) to Thorndon to Kelburn during 1906-1927, all with different aspects, exposures, elevations and urban heat island (UHI) effects. There were no nearby stations for comparisons.

Nelson: The station at the Vicarage 1907-1919 was “very much sheltered, practically under a hedge overgrown by large trees”, so readings were taken over by the Cawthron Institute. These were “not very good at first” but were approved in 1928, before the station moved out to Appleby in 1931.

Hokitika: Was noted for grossly inaccurate readings in 1912, as well as 1919 and 1926. The enclosure was twice criticised as too small and was greatly expanded in 1928.

Lincoln: Dr Salinger’s thesis warns: “The record from 1927 is reasonable for further climatic change analysis; the record prior to 1927 should be used with caution”.

Dunedin: Moved from Leith Valley to the Botanical Gardens in 1913. Dr Salinger noted that “these sites have widely varying thermal properties … a complex pattern of local microclimates … even though the move may be small in distance the homogeneity of the record will be quite dramatically disturbed.”

In most of these cases, the problem is just poor data, and nothing can be done about that a century later. The only potential remedy is to excise the flawed periods and replace them with intelligent guesses about what the readings ought to have shown. But a little bit of this goes a long way — too many guesses and the record becomes a hypothesis rather than empirical data.

Dodgy “adjustments” to sparse data

So how would an analyst go about estimating the missing temperature data in, say, Masterton or Nelson? A technique described in Peterson et al (1998) — the main authority provided on NIWA’s website — would manipulate data from several neighbouring stations that are climatologically similar to the subject station. But in those early years, no such benchmark stations existed in the Wairarapa or the top of the South Island — or anywhere else in New Zealand for that matter.
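The neighbour-station idea can be sketched in a few lines. This is a loose illustration of the general approach, not Peterson et al.’s actual algorithm: a missing year at the subject station is estimated by applying the average year-to-year change at climatologically similar neighbours to the subject’s last good reading — which is exactly why the technique fails when, as here, no such neighbours existed.

```python
# Hedged sketch of the neighbour-station idea (not Peterson et al.'s
# exact algorithm): estimate a subject station's missing year from the
# average year-to-year change at nearby, climatologically similar
# stations, applied to the subject's last good reading.

def infill_from_neighbours(subject, neighbours):
    """Fill None gaps in `subject` using the mean first-differences of
    `neighbours` (lists of equal length, no gaps)."""
    filled = list(subject)
    for i in range(1, len(filled)):
        if filled[i] is None and filled[i - 1] is not None:
            diffs = [n[i] - n[i - 1] for n in neighbours]
            filled[i] = filled[i - 1] + sum(diffs) / len(diffs)
    return filled

subject = [12.1, None, None, 12.4]          # two missing years
neighbours = [[11.0, 11.2, 11.1, 11.3],     # invented neighbour records
              [13.0, 13.1, 13.2, 13.4]]
filled = infill_from_neighbours(subject, neighbours)
```

The estimate is only as good as the neighbours: with none available pre-1930, there is nothing to difference against.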

In his paper “The NIWA Seven-station Temperature Series,” Dr B. Mullan notes that 238 stations currently report temperatures to NIWA, but “prior to the 1930s, a lot fewer observations are available.”

In his peer-reviewed 1980 paper “Apparent Trends of Mean Temperature in New Zealand Since 1930,” JWD Hessell found that exposures of most New Zealand weather stations “have been affected by changes in shelter, screenage and/or urbanisation, all of which tend to increase the observed mean temperature. A systematic analysis … reveals that no important change in annual mean temperature since 1930 has been found at those [few] stations where the above factors are negligible.”

Hessell notes that “many New Zealand climatological stations were established about 1930, there being only a few with unchanged sites and unbroken records before that date.” The clear implication is that data is just too scarce in the pre-1930 period.

Wasteland of missing data

The difficulty with early-period records is further illustrated by NIWA’s Eleven-station Series. Although a representative sample would include at least six South Island stations, NIWA was unable to cobble together even three until 1944, and had to wait until mid-1948 for a fourth. It was then another six years before data from these stations was available for any two successive years.

Using some legerdemain known only to Government scientists, NIWA claims that the 11SS begins in 1931. The 7SS should begin at the same time.

So why does NIWA bother stretching the temperature record over this wasteland of missing and unreliable data, when all seven stations are isolated islands? What purpose is served by the extra decades? A time series commencing in 1930 would offer 70 years — more than sufficient to fine-tune the computer models.

Other points in favour of a 70-year record:

  • The series would start in 1930. As this is one of the coldest years in the entire record, the start date should help the warming trend NIWA is seeking.
  • The 7SS could coincide with the (alleged) 11SS.
  • All of the post-1975 years of alleged CO2 effects on temperatures would still be included.
  • The number of dodgy adjustments could be reduced from 49 to a mere 16.
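The first bullet — that a cold start year helps a warming trend — is easy to demonstrate with toy numbers. In this entirely synthetic series (no real NZ data), fitting from the cold year 1930 roughly doubles the fitted trend compared with fitting over the whole record:

```python
# Toy demonstration with made-up numbers: the fitted trend of a
# temperature series depends on where the fit starts, and starting
# at an unusually cold year steepens it.

def ols_slope(xs, ys):
    """Ordinary least-squares slope."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1910, 2001))
temps = []
for y in years:
    if y < 1930:
        temps.append(12.3)                        # flat early period
    elif y == 1930:
        temps.append(11.8)                        # unusually cold start year
    else:
        temps.append(12.0 + 0.008 * (y - 1930))   # steady warming after 1930

full = ols_slope(years, temps) * 100               # 1910-2000, degC/century
from_1930 = ols_slope(years[20:], temps[20:]) * 100  # 1930-2000, degC/century
```

Both slopes are in °C/century; the only thing that changed between them is the start year.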


25 Thoughts on “Seventy years is plenty”

  1. Rodney Hide on 20/09/2010 at 10:11 pm said:


    You have completed NIWA’s review for them. How come they’re taking so long?


  2. Richard C on 20/09/2010 at 11:58 pm said:

    “A time series commencing in 1930 would offer 70 years”

    Wouldn’t that be 80 years and counting?

    On the subject of graphs, here are some alternatives to the govt’s Climate Change office graph from RT’s “Don’t lie to me Nick Smith -1” post for Nick to ponder, thanks to Alan Cheetham at Global Warming Science

    • Richard C on 21/09/2010 at 9:47 am said:

      Where has all

      the warming gone

      long time passing

      Where has all

      the warming gone

      long time ago

      The first graph above would look great framed and hanging on the Climate Change office wall. They’ll have to do an office make-over now it’s the Climate Disruption office anyway.

      I’ve looked at clouds from both sides now

      from up and down

      but still somehow

      it’s modeling illusions I recall

      I really don’t know clouds

      at all.

    • Bob D on 21/09/2010 at 10:12 am said:

      Perfect! You’ve a good ear, Richard C.

    • Richard C on 21/09/2010 at 9:05 pm said:

      Just read Wyatt 2006.

      Super-parameterization using a cloud resolving model (CRM) produces negative net cloud forcing of −1.77 W/m2 from the SP-CAM, but that doesn’t seem to change model climate sensitivity much.

      “In fact, SP-CAM has only slightly higher climate sensitivity than the least sensitive of the models presented in C89”

      Okay, CS is less than most conventional models (including NIWA’s UM I think) but still on the fringe (C89 is Cess 1989).

      If they would just drop the AGW premise for once we might get a useful comparison. Also, it didn’t occur to them to run a −2 K simulation (global cooling) against the control; they only did +2 K (global warming).

      “[8] The control simulation was integrated starting on September 1st using the monthly-mean climatological sea surface temperature (SST). The sea-ice climatology was also prescribed. The perturbation simulation is identical to the control except that the SST is uniformly increased by 2 K. The control simulation was run for 3.67 years and the +2 K simulation for 5.25 years, with the first 6 months considered as spin-up and therefore discarded from the analysis in each case.”

    • Richard C on 22/09/2010 at 11:46 am said:

      A SIMULATION that is NOT SIMILAR to the observed condition is NOT a SIMULATION.

      The 19 natural-forcings-only “simulations” that were submitted to IPCC AR4 were unable to replicate the 1930s and 1990s warming.

      From Simulations of the 20th Century, AR4, WGl:

      “The fact that climate models are only able to reproduce observed global mean temperature changes over the 20th century when they include anthropogenic forcings, and that they fail to do so when they exclude anthropogenic forcings, is evidence for the influence of humans on global climate. Further evidence is provided by spatial patterns of temperature change. Figure 9.6 compares observed near-surface temperature trends over the globe (top row) with those simulated by climate models when they include anthropogenic and natural forcing (second row) and the same trends simulated by climate models when only natural forcings are included (third row)”

      “fact” ? – Bollox !

      See the IPCC sleight-of-hand here

      The reason the “natural forcings only simulations” failed was the inept, selective and deceptive use of said forcings (the only IPCC natural RF is solar).

      From “Have Changes In Ocean Heat Falsified The Global Warming Hypothesis? – A Guest Weblog by William DiPuccio”

      “Most scientists who oppose the conclusions of the IPCC have been outspoken in their advocacy of cyclical heating and cooling caused primarily by natural processes, and modified by long-term human climate forcings such as land use change and aerosols. These natural forcings include ocean cycles (PDO, AMO), solar cycles (sunspots, total irradiance), and more speculative causes such as orbital oscillations, and cosmic rays.”

      The Earthshine project shows how variations in cloudiness change the Earth’s albedo. For example, a reduction in albedo 1994-1997 let in more TSI to heat the oceans resulting in subsequent atmospheric warming. The implications are analyzed here:

      Earth’s Albedo Tells an Interesting Story

      But TSI was also declining over 1994-1997, so the IPCC solar RF will not produce the requisite warming for the next period.

      Climate dynamics, IPCC shortcomings and the importance of natural forcing variation are the subject of this paper by Dr Theodor Landscheidt:


      More solar and natural forcing analysis here by Alan Cheetham (note the correlation of Arctic temperatures with TSI as opposed to CO2):

      The IPCC’s natural forcings inadequacies are documented here:

      Reflected Sunlight Shines On IPCC Deceptions And Gross Inadequacies

    • Richard C on 22/09/2010 at 11:51 am said:

      Oops, wrong link.

      Earth’s Albedo Tells an Interesting Story

    • Richard C on 22/09/2010 at 2:24 pm said:

      More on IPCC simulation assumptions and climatology complexity.

      IPCC Studies And Reports Have Nothing to Do with Climate Change

      “The IPCC approach is the antithesis of science. They have predetermined a cause and set about proving it by narrowly defining climate change, limited selection of variables, manipulation of data, and working to prove rather than falsify the hypothesis. It is unquestionably the biggest scam in history because it has been deliberate.”

    • Richard C on 22/09/2010 at 12:23 pm said:

      For a tropospheric “missing heat” discussion germane to model simulations see:

      Scroll down to:

      Temperature change above Equator

      View the divergence of modeled vs measured tropospheric temperature change on:

      “Diagram showing observed decadal temperature change at surface, 300 hPa and 200 hPa, between 20°N and 20°S, since 1979”

      The simulations differed from the AR4 submissions in this way:

      “These model runs differ from those that were run for the IPCC in that the models were simplified to isolate the effects of CO2 forcing and climate feedbacks (Lindzen 2007). Also the models were run until equilibrium was established rather than run in a transient mode in order to simulate the past. Thus, they tend to isolate greenhouse warming from other things that might be going on.”

      Not a good look for CO2.

    • Richard C on 22/09/2010 at 2:03 pm said:

      More on the “missing heat” and model simulations.

      From Knox and Douglass 2010 Discussion and Summary (easier read at THE HOCKEY SCHTICK link below because the important parts are in bold type)

      ” As many authors have noted, knowing FOHC [ocean heat content] is important because of its close relationship to FTOA, the net inward radiative flux at the top of the atmosphere. Wetherald et al. [13] and Hansen et al. [14] believe that this radiative imbalance in Earth’s climate system is positive, amounting recently [14] to approximately 0.9 W/m2. Pielke [15] has pointed out that at least 90% of the variable heat content of Earth resides in the upper ocean. Thus, to a good approximation, FOHC may be employed to infer the magnitude of FTOA, and the positive radiation imbalance should be directly reflected in FOHC (when adjusted for geothermal flux [9]; see Table 1 caption). The principal approximations involved in using this equality, which include the neglect of heat transfers to land masses and those associated with the melting and freezing of ice, estimated to be of the order of 0.04 W/m2 [14], have been discussed by the present authors [9].
      In steady state, the state of radiative balance, both quantities FTOA and FOHC should be zero. If FTOA > FOHC, “missing energy” is being produced if no sink other than the ocean can be identified. We note that one recent deep-ocean analysis [16], based on a variety of time periods generally in the 1990s and 2000s, suggests that the deeper ocean contributes on the order of 0.09 W/m2. This is not sufficient to explain the discrepancy.
      Trenberth and Fasullo (TF) [2] believe that missing energy has been accumulating at a considerable rate since 2005. According to their rough graph, as of 2010 the missing energy production rate is about 1.0 W/m2, which represents the difference between FTOA ~ 1.4 and FOHC ~ 0.4 W/m2. It is clear that the TF [Trenberth & Fasullo] missing-energy problem is made much more severe if FOHC is negative or even zero. In our opinion, the missing energy problem is probably caused by a serious overestimate by TF of FTOA, which, they state, is most accurately determined by modeling.
      In summary, we find that estimates of the recent (2003–2008) OHC rates of change are preponderantly negative. This does not support the existence of either a large positive radiative imbalance or a “missing energy.” ”

    • Richard C on 22/09/2010 at 1:22 pm said:

      Post just up at THE HOCKEY SCHTICK

      CO2 is a bit player in Global Warming

      “Since clouds operate as both powerful heat-trapping agents, overriding others, and a reflector of the sun’s energy, they may be the key factor in the regulation of the average global temperature. At the present time, they are one of the least measured parameters in the computer models predicting future climate changes.”

    • Richard C on 22/09/2010 at 4:25 pm said:

      From the essay

      Carbon Heat Trapping: Merely a Bit Player in Global Warming
      Richard J. Petschauer, Senior Member IEEE

      of which “CO2 is a bit player in Global Warming” is a commentary:-

      With such dire predictions about global warming due to carbon “heat trapping” and the drastic actions being proposed to reduce it, it seems only prudent to question and more fully verify what if any additional man-made carbon dioxide has to do with the recent warming trends. The results of this paper indicate the future temperature increases from CO2 will be minor and are not a cause for concern. It is recommended that we enlist besides climate specialists, practical engineers and people experienced in applied physics and chemistry to evaluate the work done here and pursue further practical evaluations of only increased CO2. Workers that managed the United Nations IPCC project should not be included in this project. Their minds, or at least their spokespersons, are already made up. Very large parts of the IPCC reports are excellent, and many competent scientists contributed to it. However it appears that only a few people put together quantitative numbers that estimated past CO2 and temperature trends and possible future changes from increases in CO2. These models predict such widely varying amounts of changes one must question their validity. The IPCC should publish for each of the models used the resulting average values of the radiation from the atmosphere to the earth in watts per square meters for both the present level of CO2 and that estimated when CO2 is doubled. This would allow atmosphere measurements as proposed in the appendix to see if they are reasonable and would be one way to test some of these models’ features before putting any faith in them.

    • Richard C on 21/09/2010 at 11:04 am said:

      All of the warming in the last 30 years occurred in a single year.

      Computer Aided Science Hoax – CA$H

    • Can anybody declare this statistically valid, or does it break a rule?

    • Richard C on 21/09/2010 at 8:35 pm said:

      Not so much statistical but more a climatological phenomenon.

      Climate Regime Shifts

      I ‘m actually more for Pielke’s Ocean Heat Content Q as a better metric than mean atmospheric temp in C.

      Ergo – ARGO.

      Or even below ground temperature.

      Heat into the Ground

      Nice smooth sine curves

      If the ocean drives temperature (not CO2), we should look there first then look to see what that does to atmospheric temperature IMHO.

    • Richard C on 22/09/2010 at 1:43 pm said:

      Re Ocean Heat Content climate metric and ARGO.

      From Climate Science: Roger Pielke Sr.

      New Paper “Recent Energy Balance Of Earth” By Knox and Douglass 2010

      Another commentary on the same paper at THE HOCKEY SCHTICK:

      Paper: Global Cooling began in 2003

      “Climate scientist Roger Pielke Sr has posted today an in-press paper which demonstrates that ocean temperatures flattened in 2001-2002 and have been on a negative trend since. The ocean temperature trend is far more important than the hopelessly adjusted & flawed land temperature record to assess global warming, as noted by Dr. Pielke. During this period, CO2 levels have steadily climbed, which according to the IPCC should have caused a positive radiative imbalance resulting in about .16C warming. The fact that ocean temperatures have instead been cooling falsifies the entire anthropogenic global warming hypothesis.”

      Knox and Douglass 2010
      ABSTRACT: A recently published estimate of Earth’s global warming trend is 0.63 ± 0.28 W/m2, as calculated from ocean heat content anomaly data spanning 1993–2008. This value is not representative of the recent (2003–2008) warming/cooling rate because of a “flattening” that occurred around 2001–2002. Using only 2003–2008 data from Argo floats, we find by four different algorithms that the recent trend ranges from –0.010 to –0.160 W/m2 with a typical error bar of ±0.2 W/m2. These results fail to support the existence of a frequently-cited large positive computed radiative imbalance.
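For reference, the arithmetic that links an OHC trend to a radiative imbalance in W/m2 is straightforward. This sketch uses invented OHC anomalies, and dividing by the whole-Earth surface area (~5.1 × 10^14 m2) is an assumed convention; Knox and Douglass state their own choices in the paper:

```python
# Sketch of the arithmetic behind an OHC-derived radiative imbalance:
# fit a straight line to ocean-heat-content anomalies (synthetic here,
# in units of 1e22 J), then convert J/yr to W/m^2. Dividing by the
# whole-Earth surface area is one common convention; it is an
# assumption here, not necessarily the paper's.

SECONDS_PER_YEAR = 3.156e7
EARTH_AREA_M2 = 5.1e14

def ols_slope(xs, ys):
    """Ordinary least-squares slope."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

years = [2003, 2004, 2005, 2006, 2007, 2008]
ohc_1e22_joules = [10.0, 9.85, 9.7, 9.5, 9.4, 9.2]  # made-up anomalies

slope_joules_per_year = ols_slope(years, ohc_1e22_joules) * 1e22
imbalance_w_per_m2 = slope_joules_per_year / SECONDS_PER_YEAR / EARTH_AREA_M2
# A slightly declining OHC gives a small negative imbalance, of the
# order of -0.1 W/m^2 -- the magnitude debated in the paper.
```

The sign and size of the fitted slope is the whole argument: a flat or declining OHC series cannot support a large positive imbalance.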

    • Richard C on 22/09/2010 at 9:14 am said:

      Not forgetting that:

      Global Warming is Not Global

      The climate regime shift is only made apparent by NH data; it’s imperceptible (to me) in the SH data. The SH plot is more like the unadjusted plot at the top of your post.

      So NIWA “adjust” to be consistent with NH data – not SH.

      But you know that.

  3. Bob D on 21/09/2010 at 10:20 am said:

    A question I have on the 11SS is: why did they not use Te Aroha? It spans all the years from 1888 to 2000, and was specifically mentioned by Hessell as a site unaffected by urbanisation, screen changes or sheltering.

    Perhaps the answer lies here: the long-term trend of this station is just 0.23°C/century.

    • Richard C on 21/09/2010 at 11:01 am said:

      Yes, and that station may be adequate for the entire NZ series given Hansen can extrapolate 1200 km in the Arctic.

  4. Quentin F on 21/09/2010 at 2:18 pm said:

    Well that graph looks pretty flat to me. I bet NIWA wouldn’t include the recent (temps due to) snowfalls in Southland 😀

  5. Quentin F on 21/09/2010 at 4:15 pm said:

    I will add that the peak at the end (of the top graph) looks like the 1998 El Niño period.

  6. Clarence on 23/09/2010 at 9:47 am said:

    So what is the trend for the 1930-2000 period of the Seven-station Series spreadsheet? Is it still 0.9°C/century?

  7. Clarence on 23/09/2010 at 9:51 am said:

    So, if the first 30 years are omitted, what effect does that have on the Seven-station Series?
