Does this destroy sceptical arguments?

This is surely too good to be true for the warmists.

In the last few days of a failing international conference, here’s a paper carrying strong confirmation of global warming. It’s not attribution, of course, but nobody will notice that. Proof of warming is enough to tweak the guilt nerve.

The Washington Post says:

The global temperature series is one of the clearest pieces of evidence that the planet is heating up. Over the past century, it’s easy to see from, say, NASA’s data that surface temperatures have risen dramatically. But there’s also a fair bit of short-term natural fluctuation from year to year, which can sometimes obscure what, exactly, is going on.

Annual averages of the adjusted data.

The paper’s abstract states:

When the data are adjusted to remove the estimated impact of known factors on short-term temperature variations (El Niño/southern oscillation, volcanic aerosols and solar variability), the global warming signal becomes even more evident as noise is reduced.

Would their “estimated impact” stand up to a robust challenge? The paper states: “It is worthy of note that for all five adjusted data sets, 2009 and 2010 are the two hottest years on record.” This is meaningless when you see that in the raw data for the CRU, RSS and UAH records, the peak temperatures are visibly highest at the 1998 El Niño and in 2010.

Raw data for five major global temperature records.

Since the records are “adjusted” to remove the influence of three “major factors” they no longer reflect reality, so it hardly matters what temperature the team calculates after the adjustments.
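
For readers wondering what such an “adjustment” looks like in practice, here is a minimal sketch (mine, not the authors’ code) of the general approach Foster and Rahmstorf describe: regress the temperature anomalies on a linear trend plus ENSO, volcanic and solar indices, then subtract the fitted “natural” terms. The input names (mei, aod, tsi) stand for whichever index series one chooses, and the simple no-lag least-squares fit shown is illustrative only, since the paper fits lags for each data set.

```python
import numpy as np

# A sketch of an F&R-style "adjustment": regress the temperature anomaly on a
# linear trend plus ENSO, volcanic and solar covariates, then subtract the
# fitted covariate terms. The inputs are placeholder 1-D arrays of equal
# length (monthly values assumed); the paper also fits lags, omitted here.
def adjust_temperature(temp, mei, aod, tsi):
    n = len(temp)
    t = np.arange(n) / 12.0                        # time in years
    X = np.column_stack([np.ones(n), t, mei, aod, tsi])
    coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
    natural = X[:, 2:] @ coef[2:]                  # fitted ENSO + volcanic + solar part
    return temp - natural                          # "adjusted" series: trend + residual
```

Whether the resulting “adjusted” series means anything then rests entirely on whether those three regressors, fitted this way, really capture the short-term variability, which is the question asked above.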


22 Thoughts on “Does this destroy sceptical arguments?”

  1. PeterM on 08/12/2011 at 8:53 pm said:

    Sceptics are simply asking for proof that there is anthropogenic global warming. Forget the statistical 1/10thC temp games. Where is the missing heat? Why do some governments consider it necessary to punish their populace with a ridiculous ETS tax? What is NZ up to at COP17? Lots of questions but no answers.

  2. “Since the records are “adjusted” to remove the influence of three “major factors” they no longer reflect reality…”

    That is vague. There is no reason to assume the adjustments are proper, that they have in fact competently “removed the influence” of just those “major factors”, or that they have even identified all the pertinent physical factors. And they chop off the data at 2010, instead of showing the temperatures declining quickly through 2011 and continuing the flatlining apparent in the measured temperatures over the last decade. As a scientist, I consider this criminal behavior, and an indictment of the entire system that allows propaganda to be presented to the public as bona fide science.

  3. Richard C (NZ) on 09/12/2011 at 7:58 am said:

    Alternative title:-

    Inquisition update: Chief data torturers obtain confession, new method successful.

    Grant Foster (Tamino) and Stefan Rahmstorf are notorious (and innovative) data torturers. I’ve been busy countering a citation (?) in Herald comments (Brian Fallow’s “Durban acid test for the planet’s future” article) of a Tamino post on sea levels (So-what? post) where he applied the same techniques that he’s used in this study (remove dollops of data, torture the residuals), except in his sea level inquisition he also spliced disparate satellite data to a tide gauge series to obtain a recent “acceleration”. Problem was that his data source (Domingues et al 2008) specifically stated that the satellite data was rising inexplicably faster than the tide gauge data.

    So if adherents to the man-made warming theme are quick to cite a dodgy Tamino blog post as sure evidence, you can be sure that peer-reviewed Foster and Rahmstorf 2011 will be cited as conclusive proof of it in EVERY forum henceforth.

    First impression of F&R11 is that they follow the IPCC lead that TSI is the only solar influence, so there’s an immediate deficiency.

    Second impression is that this de-trending exercise has been done before by CRU and Scafetta with HadCRUT3v (a series that F&T11 used), but using different techniques, so it’s nothing new. By removing cyclical components they arrive at an underlying trend that Scafetta describes with a quadratic equation in “Empirical evidence for a celestial origin of the climate oscillations and its implications” (Scafetta 2010). The major difference between F&T11 and the CRU and Scafetta analyses is that CRU and Scafetta took their analyses back prior to 1900, so there’s another deficiency in F&R11.

    Being something of an amateur data torturer myself, I took the opportunity to compare Scafetta’s HadCRUT3 quadratic trend to the CO2 Keeling curve by normalizing both series on the same plot and zeroing at 1850. At 1979, the start of the F&T11 analysis, the HadCRUT3 quadratic trend LEADS the Keeling curve by approximately 25 years i.e. GAT leads CO2.

    Does Foster and Rahmstorf 2011 destroy sceptical arguments? Not if their GAT leads CO2 it doesn’t, and after their initial flush of self-congratulatory smugness they will realize that they’ve shot themselves in all four feet.
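
    For anyone who wants to redo the comparison outside Excel, here is a rough sketch of the normalise-and-zero-at-1850 step described above, plus a crude lead/lag estimate. The two curves below are stand-ins with made-up coefficients, not the actual Scafetta 2010 equation or a real Keeling fit, so the printed number is illustrative only:

```python
import numpy as np

# Stand-in curves with made-up coefficients; substitute the actual Scafetta 2010
# quadratic and a fit to the Keeling/ice-core CO2 record before drawing conclusions.
def gat_quadratic(year):
    return 1e-5 * (year - 1850) ** 2               # placeholder temperature trend

def co2_curve(year):
    return 285.0 + 2e-4 * (year - 1850) ** 2.3     # placeholder CO2 concentration (ppm)

years = np.arange(1850, 2011)
gat, co2 = gat_quadratic(years), co2_curve(years)

# Normalise both series to [0, 1] and zero them at 1850, as in the comparison above.
def normalise(series):
    s = series - series[0]
    return s / s.max()

gat_n, co2_n = normalise(gat), normalise(co2)

# Crude lead/lag estimate: for a range of levels reached by both curves, how many
# years earlier does the temperature trend get there than the CO2 curve?
levels = np.linspace(0.05, 0.6, 12)
leads = [np.interp(v, co2_n, years) - np.interp(v, gat_n, years) for v in levels]
print("mean lead of GAT over CO2: %.0f years" % np.mean(leads))
```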

    • Richard C (NZ) on 09/12/2011 at 8:07 am said:

      Gahhh, F&T11 should read F&R11 – got Tamino on the brain at the moment.

    • Richard C (NZ) on 09/12/2011 at 9:22 am said:

      The Excel data and formula for GAT vs CO2 can be viewed (and copied and plotted by anyone except Phil Jones) here:-

      https://docs.google.com/spreadsheet/ccc?key=0Ao_i4MX8e3UadFpWN0s1WnVkNnVGWGRIQVR2NGV4aGc&authkey=COjQlrAO&hl=en_US#gid=0

      Ignore the plot; for some reason it doesn’t work in Google Docs, and if anyone can help me with the algebra in the formulas I would appreciate some tips.

      BTW, to copy the formulas click “Show all formulas” and Ctrl+C to copy.

    • Jim McK on 09/12/2011 at 10:14 am said:

      Hi Richard,
      I have had a quick look. Who put this construction together and for what purpose?

    • Richard C (NZ) on 09/12/2011 at 11:26 am said:

      I did, simply to make the comparison. It’s a natural progression: we’ve had the Keeling curve for a while (dodgy as it is) and it’s used to initialize all the models, so when I saw the Scafetta equation for the GAT trend I thought “let’s see where CO2 fits into the picture”. Turns out it fits a little belatedly.

    • Andy on 09/12/2011 at 12:38 pm said:

      Hi Richard
      I’ve started using Dropbox to share documents and it works well as an alternative to Google Docs. You can share native Excel files that way, and hence preserve the graphs etc.

      http://www.dropbox.com

    • Richard C (NZ) on 09/12/2011 at 1:58 pm said:

      OK, let’s try Dropbox. I’ve given you access, Andy, to the folder CO2 vs GAT that contains the Excel workbook CO2 vs GAT R2.xls as a trial run, and hopefully you received the invitation to collaborate.

      If anyone else wants to try this leave your name below to get access to the Folder here (motives permitting):-

      https://www.dropbox.com/home/CO2%20vs%20GAT#:::85176273

      If that doesn’t work, leave your email below if I don’t already have it, or send it to me via Richard T, and I’ll give you access (motives permitting).

    • Andy on 09/12/2011 at 2:25 pm said:

      I got the shared xls from Richard. You can also make it public by putting it in the Public folder. You can create a URL for the document on the Dropbox website.

    • Jim McK on 09/12/2011 at 3:15 pm said:

      Hi Richard,

      Sorry, I wasn’t being critical, just wanted to know the background.

      It looks like the CO2 curve is a best-fit type quadratic. There is a lot of good data available to build such a curve and I presume it does not have built-in bias. I am going to do some testing on it over the weekend.

      However the temperature equation is a simple function of time squared, i.e. parabolic. I can see no reason for the equation being other than linear. That series artificially builds in acceleration in projected temperature with no obvious logic.

    • Richard C (NZ) on 09/12/2011 at 4:52 pm said:

      Jim, I think you may have missed my introductory comment up-thread. The temperature quadratic is from Scafetta 2010, a peer-reviewed paper.

      The CO2 curve is my own best-fit 4th-order polynomial. I could have just plotted the raw data, but the poly is as good a fit as you will get, I think, because it follows the Keeling curve almost perfectly; certainly good enough to make the comparison, plus it makes for a good-looking plot.

      Don’t worry about criticism – I invite it, it’s the sceptical way. BTW, you will probably have an easier time if you can get to the Workbook in Dropbox.
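
      For what it’s worth, the 4th-order polynomial fit itself is only a couple of lines outside Excel. A minimal sketch, assuming an annual-mean CO2 file with year and ppm columns (the file name and layout are assumptions, not a description of the actual workbook):

```python
import numpy as np

# Assumed input: a two-column file (year, annual mean CO2 in ppm), e.g. the
# annual Mauna Loa record; "co2_annual.csv" is a placeholder file name.
year, co2 = np.loadtxt("co2_annual.csv", delimiter=",", unpack=True)

# Best-fit 4th-order polynomial, the same order as used in the workbook above.
coeffs = np.polyfit(year, co2, deg=4)
fitted = np.polyval(coeffs, year)

rms = np.sqrt(np.mean((fitted - co2) ** 2))
print("RMS misfit of the 4th-order polynomial: %.2f ppm" % rms)
```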

    • Richard C (NZ) on 09/12/2011 at 5:26 pm said:

      Good plan Andy, I’ve copied the Workbook to the Public folder and the URL is as follows:-

      http://dl.dropbox.com/u/52688456/CO2%20vs%20GAT%20R2.xls

      I’ve got a backup of the original in case changes to the Dropbox “R2” version by collaborators mean there are several different versions of it. Not sure how this works yet.

    • Richard C (NZ) on 09/12/2011 at 5:56 pm said:

      Jim, Here’s the link to Scafetta 2010:-

      http://www.fel.duke.edu/~scafetta/pdf/scafetta-JSTP2.pdf

      Might save you some time.

    • Jim McK on 09/12/2011 at 9:08 pm said:

      Thanks Richard,

      I see where both series come from now and it is a good approach to what happened over the last 100 years – well done.

      The temptation would be to use these series for future predictions. While your CO2 polynomial would probably work well for quite a while, I do not believe that was why Scafetta developed his quadratic. I also doubt whether he would claim it as a useful tool for prediction of future temperature.

    • Richard C (NZ) on 10/12/2011 at 8:19 am said:

      Actually, in the latter half of his paper (see link up-thread), Scafetta DOES use the quadratic trend in combination with the cyclic components for prediction purposes. You need to read the paper in its entirety, Jim; his analysis of celestial cycles is one of the best around, if not the best.

      Nicola Scafetta and Craig Loehle have since put out a paper based on Scafetta 2010. From WUWT:-

      Loehle, C. and N. Scafetta. 2011. Climate Change Attribution Using Empirical Decomposition of Historical Time Series. Open Atmospheric Science Journal 5:74-86.

      The study is available via free open access at http://benthamscience.com/open/toascj/articles/V005/74TOASCJ.htm (links to full paper and supplemental information, both PDF, follow at the end of this post)

      http://wattsupwiththat.com/2011/07/25/loehle-and-scafetta-calculate-0-66%C2%B0ccentury-for-agw/

    • Jim McK on 13/12/2011 at 3:59 pm said:

      Hi Richard,

      Excellent article (Bentham), very impressive and very convincing on background 60/20 year cycles. Amazing that the tropospheric aerosols completely disappear.

      On page 78/79 he breaks out the elements of his model: a 60-year cycle + a 20-year cycle + a long-term natural linear trend of 0.2C per century + a post-1942 linear trend of 0.66C per century. It produces a remarkably good fit and each element is well justified from external observation. After removing the cycles, the total post-1942 rate of warming he puts at 0.86C per century, but linear. The quadratic has become linear in this later work.

      Worth noting, though, that the 0.66C per century he refers to as anthropogenic includes any heat-island or measurement problems. He mentions that other sources attribute anywhere from 0.3C per century up to all of it to measurement problems. Therefore he concludes that this is the top end of the possible range for actual anthropogenic warming.
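
      For concreteness, the decomposition described above can be written down directly. The sketch below has the same form as the Loehle and Scafetta model (60-year cycle + 20-year cycle + 0.2C/century background trend + 0.66C/century post-1942 trend), but the amplitudes and phases are placeholders one would have to take from the paper itself:

```python
import numpy as np

def ls_model(year, a60=0.10, p60=2000.0, a20=0.04, p20=2000.0):
    """Same form as the Loehle & Scafetta (2011) decomposition: a 60-year cycle
    + a 20-year cycle + a 0.2C/century background trend + a 0.66C/century trend
    after 1942. Amplitudes (a60, a20) and phase years (p60, p20) are
    placeholders, not the fitted values from the paper."""
    cycles = a60 * np.cos(2 * np.pi * (year - p60) / 60.0) \
           + a20 * np.cos(2 * np.pi * (year - p20) / 20.0)
    background = 0.002 * (year - 1850)                     # 0.2C per century
    post_1942 = 0.0066 * np.clip(year - 1942, 0, None)     # 0.66C per century after 1942
    return cycles + background + post_1942

years = np.arange(1850, 2011)
print(ls_model(years)[-5:])   # model values for 2006-2010 (relative units)
```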

    • Richard C (NZ) on 13/12/2011 at 4:49 pm said:

      I’m glad you had a look, Jim; that reaction is along the lines of others I’ve coerced into reading Scafetta10 or Loehle and Scafetta11.

      L&S11’s “A 21st Century forecast suggests that climate may remain approximately steady until 2030-2040” was originally (unlike the IPCC) an alternative scenario to the continuing quadratic trend in Scafetta10, taking into account the last decade’s hiatus. That scenario is now consistent with the astrophysics prediction of a solar grand minimum, except theirs includes the possibility of steady or cooler climate further out, to 2080.

  4. Doug Proctor on 09/12/2011 at 8:06 am said:

    I see 0.4C of warming in 30 years, or 1.33C/century. And that is with the “multiple” for H2O that the warmists claim.

    So far, pull out all the “other stuff” and you get 1.33C/century. Without acceleration.

    What part of the alarming news am I missing here?

    Oh, right: for some reason, like birds can’t fly, fish can’t swim etc., life in Alberta will end if Alberta becomes like …. North Dakota.

    Where are the dead North Dakotans, by the way? Can’t be in California, apparently that much additional heat causes human self-combustion, which is lucky, for apparently it is too hot there for vegetables, fruit and grapes to grow, based on peer-reviewed science and “scenarios” by the IPCC.

  5. Mike Jowsey on 09/12/2011 at 8:45 am said:

    Jo Nova highlights this very interesting paper (Liu Y, Cai Q F, Song H M, et al.) analysing Tibetan tree rings showing that today’s warmer temperatures are completely within the bounds of natural variation over the last 2500 years.
    http://joannenova.com.au/2011/12/chinese-2485-year-tree-ring-study-shows-shows-sun-controls-climate-temps-will-cool-til-2068/
    The regularity of 600-year temperature increases and 600-year decreases (Figure 3) suggest that the temperature will continue to increase for another 200 years, since it has only been about 400 years since the LIA. However, a decrease in temperature for a short period controlled by century- scale cycles cannot be excluded.

  6. Australis on 09/12/2011 at 11:06 pm said:

    David Whitehouse has a critique of this paper at http://thegwpf.org/the-observatory/4502-global-temperature-evolution-1979-2010.html

    He was less than impressed. A sample:

    “Looking at their figure showing global temperatures with the El Nino, volcanic and solar effects removed I must say that I don’t think their removal has been very good, as many of the features associated with El Nino and volcanic effects are still very obvious in the processed data. What’s more, these effects are large, tenths of a degree to many tenths of a degree, and I have no great confidence that inadequately removing them to reveal a tiny ‘natural’ trend of between 0.014 deg C and 0.018 deg C a year is a robust result.

    Curiously, the authors’ analysis shows that in their adjusted data set 2009 and 2010 are the hottest years. 2010 was warmer (though not statistically significantly so) than some previous years because of an El Nino. Taking it away (and adding a linear trend) seems to have made it even warmer!”
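
    One way to test Whitehouse’s point quantitatively would be to correlate the adjusted series against the ENSO index again; if the adjustment had really removed the El Niño signal, the correlation of the detrended residuals with the index should be near zero. A minimal sketch, assuming the adjusted series and an ENSO index such as the MEI are already loaded as equal-length arrays (the names here are placeholders):

```python
import numpy as np

# Hypothetical check: if the adjustment really removed the ENSO signal, the
# detrended "adjusted" series should be roughly uncorrelated with the ENSO index.
# "adjusted" and "mei" are placeholder arrays of equal length (monthly values).
def residual_enso_correlation(adjusted, mei):
    t = np.arange(len(adjusted))
    detrended = adjusted - np.polyval(np.polyfit(t, adjusted, 1), t)
    return np.corrcoef(detrended, mei)[0, 1]
```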
