Wherein we rebut Points 4, 5 & 6
In What Mullan actually says, posted on 7 November, I answered the Hot Topic post Danger Dedekind heartbreak blah blah of 5 November, in which Mr Gareth Renowden, presumably advised by Dr Brett Mullan, principal climate scientist at NIWA, levelled criticisms at the recently published reanalysis of the NZ temperature record. I set out to identify the clear, falsifiable statements Renowden (or Mullan) was making. There were nine debating points, which you can find in What Mullan actually says, and we promised that every one would be rebutted.
Dedekind should be aware that NIWA did consider max and min temperatures — which is essential if you are only going to apply adjustments if they achieve statistical significance. The evidence is there in the Technical Notes supplied to his co-author Barry Brill two years before dFDB 2014 was submitted to EMA. It’s even in the 7SS Review document NIWA produced explaining the process they used to create the latest 7SS. The Review may emphasise the mean temperature shifts but NIWA obviously had to have calculated the max and min shifts for the Review to mention them at all. Mullan (2012) also considers max and min temperatures when applying RS93, and shows why it is important to do so.
Dedekind should, therefore, be well aware that NIWA did not use “old” techniques for the new 7SS, and that they calculated adjustments for maximum and minimum temperatures as well as mean temperatures.
Debating point 4. In compiling Mullan et al. (2010) minimum/maximum data were looked at as well as the mean data used for the publication. These calculations were supplied to BOM.
Min/max records are useful on occasion, but it is the mean that is adjusted up or down and therefore it is the effect on the mean that is of interest to readers. This is why all 170 pages of M10 discuss adjustments to the mean data and only mention min/max data in passing (in relation to Albert Park station).
It is of course possible to prepare a full adjusted minimum or maximum chart along with the adjusted mean temperature chart, but our purpose in the paper was to prepare a mean temperature reanalysis.
We are very interested to learn that the min/max calculations were supplied to BOM. Did BOM agree or did they reject the analysis (as, we suspect, they rejected other aspects of M10, such as significance tests and making adjustments less than 0.3°C)? NIWA has been hugely secretive about its discussions with BOM and I have been waiting nearly four years for the Ombudsman to rule on NIWA’s rejection of my Official Information Act request—an extraordinary delay. What has influenced the Ombudsman in this?
Shifts to maximum and minimum temperatures were calculated by NIWA for the 2010 Review. The statistical significance of all shifts was calculated too. The significance tests were done relative to each comparison (reference) site, rather than evaluating an overall significance level after combining sites as RS93 did.
Debating point 5. The statistical significance of each adjustment was calculated during preparation of the 2010 Review.
The statistical significance of each adjustment may have been calculated, but the results were never published, nor does it appear that any significance test was used to decide whether an adjustment should or shouldn’t be applied. Note the Waingawa vs Christchurch calculation from M10, reproduced below.
The shift was calculated at 0.05°C. No confidence interval is given, but eyeballing the graph it seems unlikely that these data could yield a 95% confidence interval tighter than ±0.05°C. NIWA nevertheless goes on to use this 0.05°C value in calculating the average shift.
We ask NIWA to provide us with the significance test results from M10, along with a list of those comparison tests that failed.
The important point, though, is that, regardless of what M10 did or didn’t do, RS93 uses significance tests on the computed mean shift, so that is what we did in our paper.
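The kind of test RS93 prescribes can be sketched as follows. This is a simplified, single-reference-site illustration under our own assumptions — not NIWA’s code nor the actual implementation in dFDB 2014, and the function name and data are hypothetical. Given monthly candidate-minus-reference temperature differences for two years before and two years after a site change (RS93’s k = 2), the shift is the change in the mean difference, and a two-sample t statistic decides whether that shift is significant enough to apply:

```python
from statistics import mean, stdev
from math import sqrt

def shift_and_significance(diff_before, diff_after):
    """Estimate a site-change shift from monthly candidate-minus-reference
    temperature differences (e.g. 24 months before and 24 after the change)
    and compute a simple two-sample t statistic for it.

    Simplified illustration only: RS93 combines several reference sites
    and handles seasonality; this sketch does neither.
    Returns (shift_estimate, t_statistic)."""
    shift = mean(diff_after) - mean(diff_before)
    n1, n2 = len(diff_before), len(diff_after)
    s1, s2 = stdev(diff_before), stdev(diff_after)
    # Pooled standard error of the difference between the two means.
    sp = sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    se = sp * sqrt(1 / n1 + 1 / n2)
    return shift, shift / se

# With 24 months either side (df = 46), |t| greater than roughly 2.0
# indicates significance at the 95% level, so the shift is applied;
# otherwise no adjustment is made.
```

The point of the test is exactly the one at issue above: a small computed shift with a wide confidence interval fails the test and produces no adjustment, rather than being carried into an average regardless.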
Estimating anomalies is certainly the correct approach in place of using climatology. But it doesn’t appear Dedekind has done this for Masterton in dFDB 2014. Table 3 in the paper shows no adjustment made for the 1920 site move, but if you apply RS93 k=2 — their preferred method — this would change to -0.3°C and have to be applied because it meets their statistical significance test.
Debating point 6. The de Freitas et al. (2014) paper does not seem to use correct techniques for infilling missing data at Masterton in May 1920.
RS93 has no procedure for in-filling missing data, for the simple reason that missing data should not be used. When the Audit was published, we attempted to reproduce as much of NIWA’s work as possible, and so performed calculations even where we knew the data were poor.
However, in the paper, our stated aim was to follow RS93 as closely as possible. In RS93, the following statement is made regarding missing data:
“The method of section 2.2.1 was applied, with k = 2. Table II shows the time (year and decimal) of each site change, the estimate of the effect of the site change on mean daily minimum temperature based on 2 years of data before and after, the standard error of the estimate, the number of monthly differences used, and the neighbouring stations used for the adjustment. The stations used were those that had complete data and no site changes of their own for 2 years before and after the site change. In some cases no estimate was possible due to insufficient data.” (Emphasis added)
In RS93 Table II, there are three site changes (Christchurch Airfield 1953, Christchurch Gardens 1905, and Eyrewell 1951) that carry the designation “No estimate” under the column “Estimated Shift”.
Usually a month is classed as missing because data are absent for only one or two days, in which case the monthly temperature can still be estimated from the remaining days. In the case of Masterton, however, the entire month of May 1920 (the month of the station move) is missing. Also, M10 states:
Salinger (1981) noted that by comparison to observations at other stations, the Masterton temperature record prior to 1920 was only ‘fair’ and should be viewed with caution.
In contrast to M10, RS93 takes a conservative view when determining whether or not to make any adjustment. Our paper again follows the RS93 example and does not estimate a shift at all in the case of May 1920 at Masterton.
That is why the adjustment is zero, not because of an incorrect in-filling technique.
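The missing-month logic described above can be sketched in a few lines. This is a minimal illustration under our own assumptions — the function name and the minimum-days threshold are hypothetical and are not taken from RS93, M10, or dFDB 2014:

```python
def monthly_mean(daily_temps, min_days=25):
    """Estimate a monthly mean temperature from daily values, with None
    marking a missing day.

    If only a day or two is absent, the mean of the remaining days is an
    acceptable estimate. If too few days survive -- as for Masterton in
    May 1920, where the whole month is absent -- no estimate is made.
    The min_days threshold is illustrative only."""
    present = [t for t in daily_temps if t is not None]
    if len(present) < min_days:
        return None  # insufficient data: make no estimate at all
    return sum(present) / len(present)
```

A month with one or two missing days still yields an estimate; a wholly missing month yields none, which is the conservative RS93-style outcome described above.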