In Renowden’s latest apologia at Hot Topic it is quite difficult to discern Brett Mullan’s arguments through the thicket of abuse and misdirection he creates. But I think these are the debating points he’s trying to make, lined up with the passages in which he makes them.
When he says:
Let me pose a question. What does Dedekind think Rhoades and Salinger were doing in their 1993 paper? Indulging in a purely theoretical exercise? In fact, they developed their techniques by working on what became the Seven Station Series (7SS), and from 1992 onwards the 7SS was compiled using RS93 methods properly applied.
We’ll call that Debating point 1. From 1992 onwards the 7SS was recalculated using the Rhoades & Salinger (1993) adjustment techniques.
Just to be clear, when I said in the original post that the use of one or two year periods is not adequate, I was using the RS93 terminology of k=1 and k=2; that is, k=2 means 2 years before and after a site change (so 4 years in total, but a 2-year difference series which is tested for significance against another 2-year difference series).
Page 3 of the 1992 NZ Met Service report by Salinger et al.: the final paragraph clearly states that k=2 and k=4 were used.
Top of page 1508 in Peterson et al. 1998, “Homogeneity adjustments of in situ atmospheric climate data: a review”, International J. Climatology, 18: 1493-1517, clearly states that k=1, 2 and 4 were considered.
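The k-year before/after comparison described above can be sketched in code. This is a minimal illustration only, assuming annual-mean series and a plain two-sample t-test against a single reference site; RS93’s actual procedure also combines and weights multiple comparison sites, which is omitted here. The function name and all data are invented.

```python
# Minimal sketch of an RS93-style k-year difference-series comparison.
# Assumptions: annual means, one reference site, plain two-sample t-test.
import numpy as np
from scipy import stats

def shift_estimate(target, reference, change_idx, k):
    """Compare k years of (target - reference) differences before and
    after a site change at index change_idx; return the estimated shift
    and the p-value of a two-sample t-test between the two short series."""
    diff = np.asarray(target) - np.asarray(reference)
    before = diff[change_idx - k:change_idx]   # k-year series before the change
    after = diff[change_idx:change_idx + k]    # k-year series after the change
    shift = after.mean() - before.mean()
    t, p = stats.ttest_ind(after, before)
    return shift, p
```

With k=2 each difference series holds only two points, which is why the post argues such short comparison periods are a weak basis for deciding whether an adjustment reaches statistical significance.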
Debating point 2. From 1992 onwards the 7SS was recalculated using k=2 and k=4 for the comparison periods.
During the discovery process before the High Court proceedings, Barry Brill and Vincent Gray examined a set of storage boxes at NIWA — dubbed the “Salinger recall storage boxes” — that contained (amongst other things) all of Jim Salinger’s original calculations for the 1992 reworking of the 7SS.
Debating point 3. All of Jim Salinger’s original calculations for the 1992 version were made available during discovery in the High Court proceedings.
Dedekind should be aware that NIWA did consider max and min temperatures — which is essential if you are only going to apply adjustments if they achieve statistical significance. The evidence is there in the Technical Notes supplied to his co-author Barry Brill two years before dFDB 2014 was submitted to EMA. It’s even in the 7SS Review document NIWA produced explaining the process they used to create the latest 7SS. The Review may emphasise the mean temperature shifts but NIWA obviously had to have calculated the max and min shifts for the Review to mention them at all. Mullan (2012) also considers max and min temperatures when applying RS93, and shows why it is important to do so.
Dedekind should, therefore, be well aware that NIWA did not use “old” techniques for the new 7SS, and that they calculated adjustments for maximum and minimum temperatures as well as mean temperatures.
Debating point 4. In compiling Mullan et al. (2010) minimum/maximum data were looked at as well as the mean data used for the publication. These calculations were supplied to BOM.
Shifts to maximum and minimum temperatures were calculated by NIWA for the 2010 Review, and the statistical significance of all shifts was calculated too. The significance tests were done relative to each comparison (reference) site, rather than evaluating an overall significance level after combining sites as RS93 did.
Debating point 5. The statistical significance of each adjustment was calculated during preparation of the 2010 Review.
Estimating anomalies is certainly the correct approach in place of using climatology. But it doesn’t appear Dedekind has done this for Masterton in dFDB 2014. Table 3 in the paper shows no adjustment made for the 1920 site move, but if you apply RS93 k=2 — their preferred method — this would change to -0.3°C and have to be applied because it meets their statistical significance test.
Debating point 6. The de Freitas et al. (2014) paper does not seem to use correct techniques for infilling missing data at Masterton in May 1920.
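The anomaly-based infilling mentioned above can be sketched as follows: estimate the missing month from the target site’s climatology plus the reference site’s anomaly for that month, rather than substituting climatology alone. The function, station values, and years are invented for illustration; a real infill (e.g. for Masterton in May 1920) would draw on several reference sites.

```python
# Minimal sketch of anomaly-based infilling for one missing monthly value.
# All station data here are invented for illustration.
import numpy as np

def infill(target_month_values, ref_month_values, missing_year):
    """Both arguments are dicts {year: monthly mean} for one calendar
    month. Returns an estimate of the target's value in missing_year."""
    years = [y for y in target_month_values if y != missing_year]
    target_clim = np.mean([target_month_values[y] for y in years])
    ref_clim = np.mean([ref_month_values[y] for y in years])
    # Transfer the reference site's anomaly in the missing year to the target
    ref_anom = ref_month_values[missing_year] - ref_clim
    return target_clim + ref_anom
```

Using climatology alone would return `target_clim` and discard the information that the missing month was unusually warm or cold, which is the flaw the anomaly approach corrects.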
Dedekind tries [sic] hand wave away the 11SS as having been “thoroughly debunked elsewhere”, but doesn’t link to any debunking. The fact is that the raw station data from rural sites with long records that require no adjustments show strong warming. In other words, the warming seen in the 7SS is not an artefact of site changes or urban warming. That is an important matter, and should have been addressed in dFDB 2014.
Debating point 7. The 11SS is an important record and has not been debunked.
Brett Mullan’s 2012 paper Applying the Rhoades and Salinger Method to New Zealand’s “Seven Stations” Temperature series (Weather & Climate, 32(1), 24-38) deals with the correct application of the methodology described in Rhoades and Salinger’s 1993 paper.
At the very least, dFDB 2014 should have addressed the existence of Mullan’s paper, and explained why the application of RS93 in that paper is not preferable to their interpretation of it.
Debating point 8. The de Freitas et al. (2014) paper should have discussed the relevant literature, including Mullan (2012).
Dedekind makes much of the fact that the paper does refer to one paper on SSTs around New Zealand — but skips over the essential point: that the SST evidence confirms that warming is occurring faster than they calculate.
Debating point 9. SST around New Zealand is warming faster than the 0.28°C/century shown in the de Freitas et al. (2014) paper.
None of these claims stand up to scrutiny. They will all be disproven here in the next few days.