Met Office cover-up “crime against science”

Here’s the mainstream media strongly reproaching a pillar of the global warming myth with apparently nary a second thought. Yay! It’s great to see. People serving in public bodies of any country are much improved when publicly expected to justify what they say. It inevitably hatches humility, or at least trims their hubris. This is the modern equivalent of the stocks, whereby citizens got to hurl herbage at miscreants. The only difference now is that we fling verbiage, and millions, not dozens, witness the humiliation. Modern times are good.

The Daily Mail raises sharp questions about some long-standing and troubling behaviour by the Met Office, whose apologists around the world should themselves pay heed to these questions and to how they reflect on the science behind the predictions of global warming. One of the lessons here is that warmists are deceitful in claiming that the debate is over, for there is much to debate: every month there is more doubt over the future course of the climate. More and more people are voicing questions about the predictions of warming, and what a wonderful thing that they are no longer ashamed to do so, for never in the field of scientific inquiry have so many been silenced for so long by so few. Perhaps the end is beginning.

Editorial, Daily Mail, 10 Jan 2013

To put it mildly, it is a matter of enormous public interest that the Met Office has revised its predictions of global warming, whispering that new data suggest there will be none for the next five years.

After all, the projection implies that by 2017, despite a colossal increase in carbon emissions, there will have been no rise in the planet’s surface temperature for almost two decades.

Why, then, did the Met Office choose to sneak out this intriguing information on Christmas Eve, knowing there would be no newspapers the next day?

Isn’t the inescapable suspicion that our national forecaster was anxious not to shake confidence in its Messianic belief that we are destroying our own planet?

This paper keeps an open mind on climate change – and accepts that the Met Office’s revised prediction doesn’t prove the scientific establishment and its staunch disciples at the BBC wrong.

At the very least, however, it adds to the debate, lending support to those who argue that the threat to the environment has been greatly exaggerated.

Meanwhile, ministers stake gargantuan sums of public money on their faith in the alarmists, scarring the landscape with wind farms, forcing up energy bills and threatening to shut down almost all our fossil fuel-dependent economy.

This is why it is so vital that every scrap of scientific data is fully debated and dispassionately analysed.

The Met Office’s clumsy attempt to hush up an inconvenient truth was a crime against science and the public.


— from GWPF — h/t Barry Brill


37 Thoughts on “Met Office cover-up ‘crime against science’”

  1. Simon on 18/01/2013 at 9:39 am said:

    To put things in perspective, the UK Met Office has reduced its estimate for the period 2012–16
    from 0.54°C (0.36–0.72°C) to 0.43°C (0.28–0.59°C) above the baseline period 1971–2000. The change is due to a more recent start point and a newer model called HadGEM3.

    • Richard C (NZ) on 18/01/2013 at 10:20 am said:

      To put things in perspective, Simon, it now means the anomaly is little changed by 2017 from what it was in 2012, i.e. the HadGEM3 model (which at last, maybe, accounts for natural variability) projects a continuation of the flat 2002–2012 trajectory instead of the steeply rising HadCM3 projection.

      The new flat trajectory is also at odds with the “background” warming trajectory of Foster, Rahmstorf and Cazenave that Skeptical Science and every warmist/alarmist blog has been touting. It is also at odds with the latest temperature update put out by NASA GISS’s Hansen, Sato and Ruedy (also “background” warming).

      The basic 2011 HadCM3 vs 2012 HadGEM3 comparison:-

      http://climaterealists.com/attachments/ftp/The%20australian%20400.jpg

      2011 HadCM3 projection and hindcasts:-

      http://notalotofpeopleknowthat.files.wordpress.com/2013/01/image_thumb19.png?w=504&h=426

      2012 HadGEM3 projection and (radically) revised hindcasts:-

      http://notalotofpeopleknowthat.files.wordpress.com/2013/01/image_thumb18.png?w=1008&h=778

      Rahmstorf, Foster, and Cazenave 2012, Figure 1:-

      http://www.skepticalscience.com/pics/FR11_All.gif

      CMIP5/AR5 ensemble vs observations:-

      http://wattsupwiththat.files.wordpress.com/2013/01/clip_image0042.jpg

      Note that HadGEM3 was too late for CMIP5, rendering UKMO’s prior CMIP5 simulations obsolete, as are all the other runs that don’t mimic the observations (all but 2 or 3). The Russian Academy of Sciences INM-CM (blue line) was the most successful CMIP5 model, and now with HadGEM3 there are only 3 models worth consideration; the rest are of no consequence whatsoever.

      Note too that the 2012 UKMO synthesis plot of 10 HadGEM3 runs (spaghetti plot) is on a downward phase. Where does it go after 2017 i.e. the rest of the decadal projection to 2022?

      Are we finally seeing the real impact of solar recession, with a cooler regime on the way, as sceptics have been saying for some time now?

    • Richard C (NZ) on 20/01/2013 at 11:04 am said:

      >”The new flat trajectory is also at odds with the “background” warming trajectory of Foster, Rahmstorf and Cazenave…[Fig 1]”

      Looks like I’m wrong here. The RF&C Fig 1 trajectory does intersect roughly with the 2016 peak in the UKMO 2012 projection.

      So the “true background anthropogenic global warming signal” in both SkS/RF&C and HadGEM3 will be tested over the next 5 yrs.

      And if elevated temperature levels don’t eventuate to perpetuate the “signal”, AGW is dead.

    • Richard C (NZ) on 20/01/2013 at 12:18 pm said:

      Apologies for hijacking both of these threads at 2 blogs (NZCC and CCG), but I’m on a roll here and I think this is a very important juncture in the test of AGW validity (or not).

      Up to 2010, Foster, Rahmstorf and Cazenave had the liberty of “taking out” “exogenous factors” (e.g. ENSO; a bogus method, but we’ll go with it here), and that worked to their advantage in fixing “the true background anthropogenic global warming signal”. It had the effect of pulling down all levels prior to their pivot point at 2010, a manipulation that has also been going on in major observation series. Hindcasting that RF&C Fig 1 trajectory leads to very silly levels prior to 1980:-

      http://www.skepticalscience.com/pics/FR11_All.gif

      But now that RF&C have immutably fixed the “signal” (according to them), they will NOT be able to pull down their 2010 pivot point level as new data comes in, because they no longer have that liberty; the “signal” was fixed in their Fig 1.

      And for 2013–2017, RF&C will also have to “take out” any El Nino that may occur, to be consistent in their method. Any continuation of their “signal” will have to be anthro-only – no El Ninos allowed.

      Neither will they be able to “add in” any imaginary warming after their fixed 2010 pivot point to perpetuate the “signal” if observations, after “taking out” El Ninos, do not produce higher levels relative to their 2010 Fig 1 pivot.

      Over the next 5 years, Foster, Rahmstorf and Cazenave (and SkS) are about to hoist themselves with their own petard unless there’s some radical warming in that time.
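
      For anyone unfamiliar with the method being argued over here: the Foster & Rahmstorf-style “taking out” is, mechanically, a multiple regression of the temperature series on the exogenous indices plus a linear trend, with the fitted index contributions then subtracted. Here’s a minimal sketch of that mechanic using purely synthetic data (the real papers also include volcanic aerosol and solar terms with fitted lags, and use the actual MEI series, none of which is reproduced here):

      import numpy as np

      rng = np.random.default_rng(42)
      years = np.arange(1979, 2013)
      mei = rng.standard_normal(years.size)   # stand-in for the MEI (ENSO) index
      # synthetic "observed" temperature: trend + ENSO imprint + weather noise
      temp = 0.016 * (years - 1979) + 0.10 * mei + 0.05 * rng.standard_normal(years.size)

      # design matrix: intercept, linear trend, ENSO covariate
      X = np.column_stack([np.ones(years.size), years - 1979, mei])
      coef, *_ = np.linalg.lstsq(X, temp, rcond=None)

      adjusted = temp - coef[2] * mei         # temperature with ENSO "taken out"
      print(f"fitted trend {coef[1] * 10:.3f} C/decade, ENSO coefficient {coef[2]:.3f}")

      The consistency point made above follows directly: once the regression coefficients are published, the same subtraction has to be applied to future years, so any future El Nino boost gets removed before it can be claimed as a continuation of the “signal”.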

  2. Richard C (NZ) on 18/01/2013 at 10:26 am said:

    Vindication for the much-maligned David Rose at the Daily Mail (Mail on Sunday), and the beginnings of some backpedalling and butt-covering from the likes of the UKMO, whose institutional posturing is obviously divorced from reality.

    This changes everything for “as the planet continues to warm” reports and “the true background global warming signal” meme now.

  3. Richard C (NZ) on 18/01/2013 at 10:58 am said:

    My question over the UKMO 2012 HadGEM3 projection is this: what is the mechanism that produces the peak at 2016?

    The 2016 peak is at about the same level as the 2010 peak that was produced by a strong El Nino (a real-world phenomenon), but the UKMO’s peak is merely an artifact of synthesizing 10 runs of spaghetti, i.e. they do NOT have a real-world driver as the basis for the 2016 peak.

    2012 was basically an ENSO-neutral year so there are 2 scenarios from 2012 onwards:-

    1st scenario: ENSO-neutral flat trajectory for the next 5 years from 2012 but with minor downwards inflexion becoming evident due to solar recession.

    2nd scenario: a) either El Nino predominating (rising/warming trajectory as per UKMO) or b) La Nina predominating (falling/cooling trajectory) also with the downward inflexion from solar recession.

    The UKMO scenario only fits scenario 2a), but the PDO is now negative, and it is quite possible, looking at the MEI (linked below), that the regime has changed from El Nino dominant to La Nina dominant:-

    http://www.esrl.noaa.gov/psd/enso/mei/ts.gif

    If so, even this revised UKMO projection will be found to be rubbish over the next 5 years. In any event, the acid test is now squarely on warming predictions for 2013–2017.

    • The Nir Shaviv video I posted on the other thread is worth watching and relevant here. He discusses the influence of cosmic rays on cloud formation, and estimates CS to be around 1 degree of warming for a doubling of CO2.

      So here’s that link again

      http://www.sciencebits.com/Munich-2012

    • Richard C (NZ) on 18/01/2013 at 12:31 pm said:

      Nir links to his post ‘The oceans as a calorimeter’

      By Nir Shaviv, Sun, 2009-04-12 21:48

      http://www.sciencebits.com/calorimeter

      Quoting:-

      “Over the 11 or so year solar cycle, solar irradiance changes by typically 0.1%. i.e., about 1 W/m2 relative to the solar constant of 1360 W/m2. Once one averages for the whole surface of earth (i.e., divide by 4) and takes away the reflected component (i.e., times 1 minus the albedo), it comes out to be about 0.17 W/m2 variations relative to the 240 W/m2. Thus, if only solar irradiance variations are present, Earth’s sensitivity has to be pretty high to explain the solar-climate correlations (see the collapsed box below).

      However, if solar activity is amplified by some mechanism (such as hypersensitivity to UV, or indirectly through sensitivity to cosmic ray flux variations), then in principle, a lower climate sensitivity can explain the solar-climate links, but it would mean that a much larger heat flux is entering and leaving the system every solar cycle.

      Now, is there a direct record which measures the heat flux going into the climate system? The answer is that over the 11-year solar cycle, a large fraction of the flux entering the climate system goes into the oceans. However, because of the high heat capacity of the oceans, this heat content doesn’t change the ocean temperature by much. And as a consequence, the oceans can be used as a “calorimeter” to measure the solar radiative forcing. Of course, the full calculation has to include the “calorimetric efficiency” and the fact that the oceans do change their temperature a little (such that some of the heat is radiated away, thereby reducing the calorimetric efficiency).

      It turns out that there are three different types of data sets from which the ocean heat content can be derived. The first data set is that of direct measurements using buoys. The second is the ocean surface temperature, while the third is that of the tide gauge record, which reveals the thermal expansion of the oceans. Each one of the data sets has different advantages and disadvantages.

      The ocean heat content is a direct measurement of the energy stored in the oceans. However, it requires extended 3D data, the holes in which contribute systematic errors. The sea surface temperature is only time-dependent 2D data, but it requires solving for the heat diffusion into the oceans, which of course has its uncertainties (primarily the vertical turbulent diffusion coefficient). Last, because ocean basins equilibrate over relatively short periods, the tide gauge record is inherently integrative. However, it has several systematic uncertainties, for example, a non-negligible contribution from glacial melting (which on the decadal time scale is still secondary).

      Nevertheless, the beautiful thing is that within the errors in the data sets (and estimate for the systematics), all three sets give consistently the same answer, that a large heat flux periodically enters and leaves the oceans with the solar cycle, and this heat flux is about 6 to 8 times larger than can be expected from changes in the solar irradiance only. This implies that an amplification mechanism necessarily exists. Interestingly, the size is consistent with what would be expected from the observed low altitude cloud cover variations.”

      References:
      1) Nir J. Shaviv (2008); Using the oceans as a calorimeter to quantify the solar radiative forcing, J. Geophys. Res., 113, A11101, doi:10.1029/2007JA012989. Local Copy [hotlinked]

      Collapsed box from above.

      The IPCC’s small solar forcing and the emperor’s new clothes.

      With the years, the IPCC has tried to downgrade the role of the sun. The reason is stated above – a large solar forcing would necessarily imply a lower anthropogenic effect and lower climate sensitivity. This includes perpetually doubting any non-irradiance amplification mechanism, and even emphasizing publications which downgrade long term variations in the irradiance. In fact, this has been done to such an extent that clear solar/climate links such as the Maunder minimum are basically impossible to explain with any reasonable climate sensitivity. Here are the numbers.

      According to the IPCC (AR4), the solar irradiance is responsible for a net radiative forcing increase between the Maunder Minimum and today of 0.12 W/m2 (0.06 to 0.60 at 90% confidence). We know however that the Maunder minimum was about 1°C colder (e.g., from direct temperature measurements of boreholes – e.g., this summary). This requires a global sensitivity of 1.0/0.12 ≈ 8.3°C/(W/m2). Since doubling the CO2 is thought to induce a 3.8 W/m2 change in the radiative forcing, irradiance/climate correlations require a CO2 doubling temperature of ΔTx2 ~ 31°C !! Besides being at odds with other observations, any sensitivity larger than ΔTx2 ~ 10°C would cause the climate to be unconditionally unstable (see box here).

      Clearly, the IPCC scientists don’t comprehend that their numbers add up to a totally inconsistent picture. Of course, the real story is that solar forcing, even just the irradiance change, is larger than the IPCC values.

      # # #

      “…ocean basins equilibrate over relatively short periods” is debatable – what time frame?

      Scafetta 2009 estimates a 1.5 W/m2 difference between solar grand minimum and grand maximum over the Holocene.

      Nir’s article doesn’t address Alec Rawls’ contention that the higher TSI levels of the late 20th century don’t require a trend to elevate temperatures (oceanic energy accumulation and thermal lag do that), as solar detractors have demanded. Nor does it address the problems with the downward PMOD TSI trend (uncompensated degradation – ACRIM and IRMB are slightly rising or flat), from which the solar detractors (in their ignorance) demand an instantaneous downward temperature response (cooling from the 1980s, as per SkS posts among others).
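
      As a quick sanity check on the two headline numbers in the quote – the ~0.17 W/m2 irradiance-only forcing variation and the absurd ~31°C implied doubling sensitivity – here is the arithmetic spelled out (the 0.3 albedo is my assumption, consistent with the quoted 240 W/m2 absorbed figure):

      # Back-of-envelope check of the numbers quoted from Shaviv above.
      TSI_VARIATION = 1.0   # W/m2 over the ~11-yr solar cycle (about 0.1% of 1360 W/m2)
      ALBEDO = 0.3          # assumed; consistent with 1360/4 * 0.7 ~ 240 W/m2

      # Globally averaged absorbed variation: divide by 4 (sphere vs disc),
      # then remove the reflected fraction.
      forcing = TSI_VARIATION / 4 * (1 - ALBEDO)
      print(f"irradiance-only forcing variation: {forcing:.2f} W/m2")   # ~0.17

      # Implied sensitivity from the Maunder Minimum: ~1 C colder for the
      # IPCC's central 0.12 W/m2 forcing change, scaled to 3.8 W/m2 for 2xCO2.
      print(f"implied 2xCO2 response: {1.0 / 0.12 * 3.8:.1f} C")        # ~31.7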

    • Niff on 23/01/2013 at 1:29 pm said:

      Richard,
      thank you for quoting this concise explanation of the TSI/OHC issue, which illuminates Alec Rawls’ comments about the AR5 draft. One wonders how such logic can be wholly ignored by alarmists in pursuit of “the idea I started with”. Even with half a brain you can see that being proven wrong isn’t going to be that difficult…unless you have the MSM and everyone including the emperor admiring the new clothes. And right on cue, every commentator on Rawls’ comments willfully misinterprets and distracts with arguments about GCR….oh well. If the settled science ever gets a critical examination it shouldn’t take too long to debunk CAGW.

  4. Clarence on 18/01/2013 at 12:09 pm said:

    The BBC interpreted the Met Office graph as forecasting that there will be a two-decade standstill by 2017.

    The BBC is not the only organ applying the handbrake to its former ebullience regarding DAGW. The NYT has disbanded its Environment desk (2 editors, 7 journalists), formerly the largest single source of DAGW stories. The 5 major US newspapers now have only 7 full-time environmental reporters amongst them.

    Even Reuters ruminates that GW might be good for us. And the UK Independent ran a hagiographic article about David Bellamy.

    Is the worm turning?

  5. Richard C (NZ) on 18/01/2013 at 1:15 pm said:

    From HT:-

    David Lewis January 17, 2013 at 6:03 pm

    […]

    Given the data available, Hansen has chosen to emphasize that there has been a revolutionary change in how certain we can be about if the planet is warming now that there are so many Argo floats measuring the heat content of the global ocean. Whatever role aerosols play, if more energy is coming into the planetary system than is going out, the excess will be apparent in the global ocean, where 90% of the incoming heat, if there is any, should be accumulating. And, sure enough, it is.

    In Earth’s Energy Imbalance and Implications, Hansen explains that von Schuckmann et al. have come up with what looks like a good analysis of the Argo data, and if they have, there is no question the planet is warming.

    There isn’t any “standstill” in the rate of accumulation of heat in the global ocean.

    I’ve got a problem with the way Hansen expresses himself.

    In this latest paper Gareth is pointing to, Hansen uses the words “global warming standstill”, words an ordinary human being might be expected to take to mean “global warming has stopped”, to describe what is happening to the 5 year running mean of the global average surface temperature chart.

    Although 90% of the heat that is accumulating in the planetary system is going into the global ocean, it is not directly measured or described by the global average surface temperature chart. Tiny changes in heat distribution in the global ocean, i.e. ENSO (El Nino/La Nina) have a profound effect on the global average surface temperature chart precisely because so much of the heat in the overall system is in the ocean. If you want to know if the planet is heating up, you study the ocean. If you want to describe the system in terms like has there been a “standstill” in the “global warming”, you’d have to point to an indicator that included measurements of the global ocean.

    http://hot-topic.co.nz/people-talkin-10/#comment-36377

    # # #

    >”If you want to describe the system in terms like has there been a “standstill” in the “global warming”, you’d have to point to an indicator that included measurements of the global ocean.”

    Yes, and you also have to look at the “adjustments” Josh Willis made to NODC 0-700m OHC by throwing out ARGO floats that were “impossibly cold” (does he still do that?). NODC had an opposite trend to UKMO EN3, which, when noticed by Bob Tisdale, was hastily taken down from the UKMO website:-

    http://bobtisdale.files.wordpress.com/2012/05/figure-7.png

    • Richard C (NZ) on 18/01/2013 at 1:17 pm said:

      David Lewis goes on:-

      “Yet even Hansen, when he looks at the global average surface temperature chart, sees that the 5 year running means have been fairly flat for a decade, and he pronounces there has been a “global warming standstill”.

      It’s preposterous.

      Obviously, Hansen understands what he is talking about, but I think he could be more precise in his choice of words to describe his thoughts.”

      # # #

      “Preposterous” indeed.

    • Richard C (NZ) on 18/01/2013 at 1:37 pm said:

      Even if global warming is defined in terms of OHC as David Lewis requires – fine. The likes of Roger Pielke Snr have been asking for that for years.

      And whatever OHC accumulation there is in NODC composites – valid or not – is solar sourced. It isn’t anthropogenic global warming, because there’s no anthropogenic OHC (aOHC) mechanism documented that the IPCC recognizes anyway.

      These AGW guys (David Lewis’ redirection to ocean warming and SkS’ insistence on “background” atmospheric warming) have run out of wriggle-room now.

  6. Meanwhile, Stuff reports

    “World average temperature rising”
    http://www.stuff.co.nz/science/8190815/World-temperature-average-rising

    The world’s temperature will keep rising and it’s 21st century activity that is boosting the thermometer, according to Nasa’s Goddard Institute for Space Studies (GISS).

    A visualisation released by GISS in New York, shows the average global temperature has risen about 0.8 degrees Celsius since 1880.

    According to the new analysis, 2012 was one of the hottest on record with a worldwide average of 14.6C and also the ninth-warmest year since 1880 when record keeping began.

    Last year’s global average temperature was 14.4C. New Zealand, however, averaged 12.5C.

    But the institute reports that each year after 2000 has been among the 14 warmest years ever and 2012 was the warmest La Nina year yet, which is unusual as La Nina weather patterns usually result in lower-than-average global temperatures.

    GISS reports 2010 and 2005 ranked as the hottest years on record with the exception of 1998.

    No mention of Jimmy’s flatlining

    • Richard C (NZ) on 18/01/2013 at 4:30 pm said:

      >”2012 was the warmest La Nina year yet…..”

      Huh? Baloney. There was a transition from La Nina to El Nino and then to neutral, but if anything it was a weak El Nino year not a La Nina year, even according to NOAA.

      http://www.esrl.noaa.gov/psd/enso/mei/comp.png

      “In the context of the short-lived transition from La Niña into El Niño in 2012, this section features a comparison figure with weak-to-moderate El Niño events that followed La Niña conditions earlier in the same calendar year”

      http://www.esrl.noaa.gov/psd/enso/mei/

      >”New Zealand, however, averaged 12.5C.”

      13.1C 2010 (moderate El Nino)
      12.8C 2011 (strong double-dip La Nina)
      12.5C 2012 (weak El Nino)

      http://www.esrl.noaa.gov/psd/enso/mei/ts.gif

      >”…which is unusual as La Nina weather patterns usually result in lower-than-average global temperatures.”

      But locally, the 2012 weak El Nino year was cooler than the strong double-dip 2011 La Nina year.

      Except for that, 2012 was nothing exceptional either globally or locally. It’s probably representative of the 21st century regime. Question is: where to from here?

  7. TV3 also join the pack.

    “Australian heatwave is connected to climate change – IPCC”

    http://www.3news.co.nz/Australian-heatwave-is-connected-to-climate-change—IPCC/tabid/1160/articleID/283424/Default.aspx

    Contrary to claims made by an Auckland scientist earlier in the week, scientists from around the world and New Zealand are saying the recent Australian heatwave is part of a global trend.

    Intergovernmental Panel on Climate Change (IPCC) chairman Rajendra Pachauri has told media the Australia heatwave is part of a warming trend around the world and says more heatwaves are likely.

    “We already are getting more frequent and more intense heatwaves, and we are also going to get extreme precipitation events.”

    The IPCC, which is currently meeting in Hobart, warned on Tuesday that heatwaves in Australia could become six times more frequent within 30 years.

    But earlier this week associate professor of climate and environment science Chris de Freitas of the University of Auckland said there was no evidence to suggest the Australian heatwave was connected to climate change.

    “It’s speculation, not fact,” he told 3 News, saying there was no evidence humans were to blame for rising global temperatures.

    However James Renwick, a climate scientist at Victoria University of Wellington attending the IPCC conference in Hobart, says although it is difficult to pin one event to climate change, there probably is a connection.

    • Richard C (NZ) on 18/01/2013 at 4:33 pm said:

      >”…part of a warming trend around the world”

      There it is again.

    • Andy on 18/01/2013 at 4:51 pm said:

      Hansen probably got told off by Gavin about his inappropriate language that might confuse the public into thinking that global warming had stopped and therefore we can all carry on leading our decadent bourgeois lifestyles.

    • Richard C (NZ) on 18/01/2013 at 4:53 pm said:

      Renwick’s touting “background” warming and “taking out” ENSO, just with different words (he can’t bring himself to concede a standstill either):-

      In his criticism, Dr de Freitas also referred to a leaked draft of the IPCC five-year assessment report which appeared to claim global temperatures have not risen in the past 16 years.

      Mr Renwick says this is probably correct.

      “That’s right, globally temperatures over the last 10 or 15 years have not risen as fast as you would expect.”

      But he says this doesn’t mean climate change is not happening. He says the late 1990s were very warm because of the warm El Nino climate system, but the 2000s have been cooler because of La Nina, which acts to cool the climate.

      “Once you take account of that effect, behind that, temperatures are going up.”


      What Renwick DOESN’T say is that El Nino was dominant for a much longer period from mid 1970s to mid 2000s:-

      http://www.esrl.noaa.gov/psd/enso/mei/ts.gif

    • Andy on 18/01/2013 at 5:10 pm said:

      I think Bob Tisdale made a similar comment on WUWT yesterday. They are using these excuses to explain the lack of warming and then forgetting to factor these same issues in when the warming was actually happening.

    • Richard C (NZ) on 18/01/2013 at 7:30 pm said:

      Yes I saw Bob Tisdale’s comment too. There’s some very questionable consideration and application of ENSO for narrative purposes. Another example, Phil Scadden at HT:-

      “Very little can be inferred from the noise (eg ENSO)”

      http://hot-topic.co.nz/people-talkin-10/#comment-36422

      True to a degree, but what about when a positive cycle of “noise” (El Nino) dominates for 30 yrs, as it did mid 70s to mid 00s (Tisdale’s comment referred to this, I think)? And what if that cycle has actually entered a 30 yr period of negative phase domination (La Nina) from the mid 00s (as it might have), or there was a 30 yr negative phase prior to the mid 70s (as there was, near enough)?

      And David Lewis doesn’t realize that “ENSO (El Nino/La Nina)” “noise” is overlaid on the ocean oscillations that determine domination by either El Nino or La Nina:-

      “Tiny changes in heat distribution in the global ocean, i.e. ENSO (El Nino/La Nina) have a profound effect on the global average surface temperature chart precisely because so much of the heat in the overall system is in the ocean.”

      http://hot-topic.co.nz/people-talkin-10/#comment-36377

      It’s the general domination one way or the other over long periods (30 yrs, say) that’s the important factor, not the 1 or 2 year “tiny changes”, even though the effect of those is radical. The PDO is now negative, so Phil and David should anticipate some negative ENSO noise dominating for a while. That’s a significant factor, and it won’t translate to any warming over the next couple of decades plus, if the cycle is being repeated.

      The UKMO might now be accounting for some of that natural cyclicity, but I don’t know yet exactly what the changes in HadGEM3 are.

  8. Andy on 18/01/2013 at 5:32 pm said:

    By the way, a lot of the Nic Lewis work on sensitivity is starting to make a lot more sense now that I have had time to do some background research on the fundamentals.

    Dave Frame seems to feature quite a lot in work on selection of priors for Bayesian analysis of CS.

    • Richard C (NZ) on 18/01/2013 at 5:36 pm said:

      You’ve had time for background research already with all the other stuff going on, not to mention work commitments?

      I admire your time management Andy.

    • Andy on 18/01/2013 at 5:43 pm said:

      The family is away for 10 days. Just me, the dog, and the Internet…

    • Australis on 19/01/2013 at 2:34 am said:

      “…selection of priors for Bayesian analysis of CS”..

      Now, what does that mean Andy?

      I presume:
      – there is no direct observable evidence of sensitivity, so that probabilities of various levels are guesstimated by Bayesian means;
      – some prior inputs are required for the guesstimates – possibly papers prepared by a mass of different analysts;
      – Frame has authored some such papers.

      Did I get that right?

    • Australis – I’ll try to fill in the details as I understand them – I am still learning a lot of new stuff.

      The thread at Bishop Hill contains a lot of useful info

      http://www.bishop-hill.net/blog/2013/1/12/lewis-on-schmidt-on-climate-sensitivity.html

      The discussion centres around Nic Lewis’s recalculations of climate sensitivity (S), which were subsequently reported in the WSJ by Matt Ridley.
      Lewis’s later work focused on recalculating S using updated figures for aerosol forcing. This brought S down to a value of 1.75 degC.

      The other recent work by Nic Lewis was the discussion at Judith Curry’s about the use of uniform priors on Forster and Gregory 06 (I’ll say a bit more about Bayes and priors at the end of this comment for the newcomer)

      http://judithcurry.com/2011/07/05/the-ipccs-alteration-of-forster-gregorys-model-independent-climate-sensitivity-results/

      FG2006 was a paper based on empirical observations alone, unlike other IPCC studies that use models to some degree or other.

      The original graph showed a tightly constrained probability density function (PDF) for S, with a modal (most likely) value of around 1 degree, which you can see in Fig 1 on the Curry page.

      The IPCC then decided to modify the PDF by applying a uniform prior on S. What this means is that they made a prior assumption (a “prior”, in Bayesian terms) that S was constrained to lie between 0 and 18.5 degrees, with an equal probability of any of those values being correct.
      By then applying Bayes’ Rule, the IPCC found a posterior PDF (the one that comes out of Bayes by applying the prior to your evidence – the actual climate data) that was much more flattened out, with a much fatter tail. This also had the effect of pushing up the modal value of S (at the peak of the graph), and the median too.

      http://curryja.files.wordpress.com/2011/07/fig4_influence-of-prior.jpg

      There are a number of issues with this approach, and Nic Lewis explains them a lot better than I can. One is that FG2006 was not a Bayesian study, so using priors may not be appropriate. Secondly, choosing a uniform prior on S may not be appropriate. As Lewis points out, the relationship between S and the feedback parameter Y is inverse, so a uniform prior in Y would give a completely different result.

      Where Dave Frame comes into the picture is his 2005 paper
      “Constraining climate forecasts: The role of prior assumptions”
      http://www.climateprediction.net/science/pubs/2004GL022241.pdf

      which seems to get cited a lot in the literature.

      Nic Lewis, in response to a comment by me at BH, writes thus:

      Andy Scrase
      “Frame et al seems a well cited paper on priors in climate prediction”

      Or badly cited, depending on one’s viewpoint! It is the only place I have ever read the proposal that the prior used for estimating climate sensitivity (or indeed any parameter in any field) should depend on the purpose to which the estimate is to be put. That seems to me to go against both probability theory and scientific principles.

      I can understand multiplying a probabilistic parameter estimate (a posterior PDF, in a Bayesian context) by different loss functions, according to the use of the estimate, but not changing the estimate itself – which is what Frame et al (2005) advocate, through varying the prior used.

      I would be interested in the views on this point of any scientists and/or statisticians who read this.

      So this is where I am up to – trying to figure out the literature and reading some textbooks too.

      As a postscript, for anyone not familiar with Bayes, then the book recommended to me at BH is a great historical introduction to the subject

      “The Theory That Would Not Die: How Bayes’ Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy”
      http://www.amazon.com/dp/0300188226

      In general terms, Bayes’ Rule is a simple rule that combines prior assumptions with observed facts to produce an improved model or PDF, the so-called posterior.
      Bayes gets used in things like anti-spam filters, where we can start with a uniform prior (any message equally likely to be spam or not) and then “train” the model by flagging spam. Bayes then works out the probability that a certain email is spam, and this gets used iteratively, so that the posterior becomes the new prior.
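
      To make the prior-sensitivity point concrete, here is a minimal numerical sketch. The “likelihood” is a synthetic stand-in (a Gaussian in Y = 1/S), not the actual FG2006 result, but it shows the mechanism Nic Lewis objects to: re-weighting the very same likelihood with a uniform-in-S prior instead of a uniform-in-Y prior fattens the upper tail and pushes the mode and median up:

      import numpy as np

      S = np.linspace(0.01, 18.5, 5000)        # sensitivity grid (C), the IPCC's 0-18.5 range
      dS = S[1] - S[0]
      # synthetic likelihood: Gaussian in the feedback parameter Y = 1/S
      likelihood = np.exp(-0.5 * ((1.0 / S - 1.0 / 1.6) / 0.15) ** 2)

      def posterior(prior):
          p = likelihood * prior
          return p / (p.sum() * dS)            # normalise to a PDF on S

      flat_in_S = posterior(np.ones_like(S))   # IPCC-style uniform prior on S
      flat_in_Y = posterior(1.0 / S ** 2)      # uniform prior on Y, transformed to S

      for name, pdf in [("uniform in S", flat_in_S), ("uniform in Y", flat_in_Y)]:
          cdf = np.cumsum(pdf) * dS
          median = S[np.searchsorted(cdf, 0.5)]
          print(f"{name}: mode {S[np.argmax(pdf)]:.2f} C, median {median:.2f} C")

      Running this, the uniform-in-S posterior shows the higher mode and median and a fat upper tail, while the uniform-in-Y posterior stays tightly constrained: same data, different prior, different answer, which is exactly why the choice of prior matters.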

  9. Richard C (NZ) on 18/01/2013 at 8:34 pm said:

    Quick observation re GCRs.

    Argument over counteracting forcings in recent times (since the 70s, say) is so complex, interacting and uncertain that it is difficult to argue in debate. Whether sceptic or warmist, you can spin a line or debunk depending on what composite series you select and what you leave in and leave out, much of which are modulating factors rather than something that explains Holocene maximums and minimums.

    For example, GCRs are discounted over the last 50 yrs because there’s no trend in cosmic ray intensity, and that may be valid, but it depends on dataset integrity. Search Google Images and you can find this, which supports the recent cloud-cosmic disconnect after 2004:-

    http://larvatusprodeo.net/files/2011/10/cosmic_ray_cloud_data_600.png

    Compare to the sun/cosmic/cloud/temperature plots in the Sun and Cosmic Rays section here (note that several are out of date):-

    http://members.shaw.ca/sch25/FOS/Climate_Change_Science.html

    But there are different datasets to consider, and it gets very murky. For example, there are different cloud datasets, and adjustments have been made that make all the difference in 50 year studies; Svensmark points out those adjustments in response to the criticism of the post-2004 low cloud/GCR divergence.

    On the other hand, note that in that Sun and Cosmic Rays section there’s a 150 yr GCR study by Dr. U.R. Rao and this image:-

    http://members.shaw.ca/sch25/FOS/Rao_CR_HMF.jpg

    Definitely a 150 yr trend in that GCR composite. The Rao paper is here:-

    ‘Contribution of changing galactic cosmic ray flux to global warming’

    http://www.friendsofscience.org/assets/documents/Rao-GCR_GW.pdf

    So we have at least one “long-term” trend in 150 yr terms but not a short-term 50 yr trend using datasets that include significant adjustments. Even the requirement for a trend over the last 50 years is questionable given the global dimming/brightening changes. There are some very good (better than CO2) cosmic ray/temperature correlations however:-

    http://calderup.files.wordpress.com/2012/03/101.jpg

    Not sure what I’m getting at, but I think my point is that the divergence-based arguments used to dismiss solar/GCR/cloud vs temperature (valid or not) can equally be applied to CO2 vs temperature divergence (e.g. 1940s, 2000s), i.e. there’s a correlation breakdown there too. So it’s a double standard to dismiss one hypothesis on a certain basis where big dataset uncertainties exist, but not dismiss another hypothesis on the same basis. The AGW hypothesis cannot be upheld in this way: what has to hold for one hypothesis must hold for the other.

    And dataset uncertainties exist for CO2 vs temperature too. The splicing of Law Dome ice core CO2 onto Mauna Loa atmospheric CO2 required an adjustment of about 50 yrs (I think it was) to get the fit. The resulting curve is highly uncertain.

    Disclaimer: I don’t regard GCR as the answer to everything since 1750, say, but as a contributing factor to be considered among very complex interactions over the intervening period, with a great deal of uncertainty as to whether the GCR factor has any significance or not. There’s still a massive amount of science to be undertaken to resolve that, but by the time it is, the conclusion may be moot anyway if DAGW hasn’t survived in the meantime.

    • Andy on 18/01/2013 at 9:19 pm said:

      Thanks for the details. My point at HT was that GCR is another parameter to be considered alongside CO2, black carbon, aerosols, and natural cycles. There is a heck of a lot we don’t know about many of these parameters, so any talk of settled science is bunk.

    • Richard C (NZ) on 19/01/2013 at 12:02 am said:

      Yup, it’s not a clear-cut CO2 vs GCR or CO2 vs solar contest, where if just GCR and/or solar can be falsified – albeit with considerable uncertainties and mis-attribution – then, ignoring all other parameter considerations, CO2 somehow wins by default; but that is how the debate is swayed.

  10. Richard C (NZ) on 19/01/2013 at 12:11 am said:

    The UKMO decadal prediction revision was a clear demonstration that radical modification of outlook, along with improved modeling of observations, ensues from experimentation with parameters, at a time when the science was supposed to be settled but sceptics were crying foul. The sceptics have been vindicated by that revision alone, because it goes some way to implementing what sceptics were saying was missing.

    But that same learning process, experimentation and modification will have to continue in view of how the climate plays out over this 2013–2017 period, and parameters will again have to be reassessed accordingly. I think that was as much the reason for a 5 yr projection instead of a 10 yr projection as the stated computing resource constraints were. The UKMO modelers realized some time back, when the standstill set in, that HadCM3 was not a realistic configuration, hence the new HadGEM3 configuration. And they realized too, I’m guessing, that there’s no point in a 10 yr projection if parameter reassessment will be needed in the interim, even within the 5 yr period.

    In other words, the science won’t be settled by 2017 either.

    If UKMO had just come out and said they were addressing concerns over poor parameterization, in the way they must have been doing over the HadCM3–HadGEM3 transition period, they could have saved a huge amount of column space. I don’t recall any such announcement making headlines, even though it may have been in UKMO website updates. It was never going to happen, obviously, given the Christmas Eve revision posting; it was a PR no-go.

    UKMO have made fundamental changes to their modeling approach, not just their parameters, in view of the standstill. Other groups not mimicking observations (most of them) will now be forced to do the same, and that includes adopting more circumspect time horizons. UKMO know they have to take a more incremental approach as understanding evolves of the contribution that hitherto poorly accounted for (or unaccounted for) phenomena make to climate.

    That makes a mockery of AR5 projections to 2100 though.

    • Andy on 19/01/2013 at 8:47 am said:

      Richard Betts works at UKMO and occasionally comments at Bishop Hill. It might be worth keeping an eye on what he has to say about this. He is a fairly approachable and friendly bloke.

    • Richard C (NZ) on 19/01/2013 at 9:42 am said:

      Yes I should check in to BH more. I’m certainly not tarring everyone at UKMO with the same brush. There’s obviously been some fundamental re-thinking going on by non-PR tech types there. Richard Betts seems representative of them from what I can see. As you said at HT re the continuous improvement process “what’s not to like”.

      BTW, it occurred to me that NIWA’s regional climate model is derived from UKMO’s Unified Model (UM). Does HadGEM3 make that obsolete, I wonder?

    • Jan 10 comment from Betts:

      The white lines show hindcasts, ie: model simulations started from older initial conditions and then run onwards, and compared with the observations to see how well the model does. The point here is that the hindcasts with the new model (HadGEM3) compare better with the observations than the old model (HadCM3) and so this gives us more confidence in the new model.

      These decadal forecasts use “initialised forecasting” techniques, ie: the models are started at the observed state for the current time – as distinct from the long-term climate projections that start back in pre-industrial times, run through the 20th Century and then on into the 21st Century, meaning that they can’t be expected to capture the exact year-by-year variations that the initialised forecasts are attempting to capture. Because the initialised forecasts are started off at, say, the right place in an ENSO cycle, they potentially can capture the natural variations arising from ENSO and other modes of variability. This is still early days of course, there is still a lot more work to do, but you can see from the 2012 figure that the hindcasts show the model agreeing with the observations reasonably well (and better than the HadCM3 hindcasts as shown in the 2011 figure).

      The first time that these initialised forecasting techniques were used for decadal forecasting was this paper published in 2007. So this was the first time there was actually a proper forecast looking forward in time – anything before then is a hindcast. This is the case for all versions of the decadal forecast that you might find.

      http://www.bishop-hill.net/blog/2013/1/10/spot-the-difference.html
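
      A toy illustration of the distinction Betts draws – this is just an AR(1) persistence process plus noise, nothing like a real GCM, but it shows why a forecast started from the observed current state can track near-term wiggles that a free-running projection cannot:

      import numpy as np

      rng = np.random.default_rng(7)
      PHI = 0.8                          # year-to-year persistence of variability

      # "observed" internal variability up to the forecast start date
      hist = np.zeros(40)
      for t in range(1, 40):
          hist[t] = PHI * hist[t - 1] + 0.1 * rng.standard_normal()

      def forecast(start_state, steps):
          # expected variability from a given start state (decays toward zero)
          return start_state * PHI ** np.arange(1, steps + 1)

      errs_init, errs_free = [], []
      for _ in range(500):               # average over possible futures
          truth, x = np.empty(5), hist[-1]
          for t in range(5):
              x = PHI * x + 0.1 * rng.standard_normal()
              truth[t] = x
          # initialised: starts from the observed current state
          errs_init.append(np.mean(np.abs(forecast(hist[-1], 5) - truth)))
          # uninitialised: starts from an arbitrary internal state of its own
          errs_free.append(np.mean(np.abs(forecast(0.5 * rng.standard_normal(), 5) - truth)))

      print(f"initialised mean error:   {np.mean(errs_init):.3f}")
      print(f"uninitialised mean error: {np.mean(errs_free):.3f}")

      On average the initialised forecast has the smaller near-term error, for exactly the reason Betts gives: it starts off at the right place in the cycle. The advantage decays with lead time (here as PHI^t), which is also why initialisation matters for decadal forecasts but not for century-scale projections.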

    • Richard C (NZ) on 19/01/2013 at 11:40 am said:

      I suspect that NIWA’s use of their model is mostly near-term, and only seldom for more than a decade, e.g. the long-term alpine snow forecast (haven’t heard anything from them since that). UKMO’s description of decadal forecasting is here:-

      Decadal Forecasting – What is it and what does it tell us?

      Decadal forecasts are designed to forecast fluctuations in the climate system over the next few years through knowledge of the current climate state and multi-year variability of the oceans. This item provides some more detail on what they are, and what they can tell us.

      >>>>>>>>

      http://www.metoffice.gov.uk/research/news/decadal-forecasting

      Figure 5 is rubbish because the blue 2012 spaghetti has been placed far too high compared to Figure 1 and other synthesis plots elsewhere.

      So if UKMO had to develop HadGEM3 in order to realistically mimic observations for decadal forecasting, the same rationale should hold for the long-term models. At this juncture there are only two CMIP5 groups that can rest on their laurels; the rest (and that would include UKMO) have to re-think and come up with new configurations. Even the two apparently valid groups will have to stand the test of time over the next 5 yrs, as I said earlier.

      The UKMO says this:-

      “The latest decadal forecast, issued in December 2012, show that the Earth is expected to maintain the record warmth that has been observed over the last decade,”

      That’s for 5 yrs, and I think that’s reasonable, but with a solar recession upon us it won’t last forever.

      “….and furthermore a substantial proportion of the forecasts show that new record global temperatures may be reached in the next 5 years.”

      As I asked up-thread: why? By what mechanism? There is no single mechanism evident in the spaghetti. This seems to imply that the 5 yrs will be dominated by El Ninos, but this is what they say about that:-

      “….we know that the predictability of phenomena like El Nino/La Nina is limited to at most a year in advance.”

      They contradict “new record global temperatures” with:-

      “Similarly relative cooling in the north-east and south-east Pacific Ocean compared to other parts of the global oceans is also evident in Figure 3. The pattern of this cooling is similar to that observed in the Pacific Decadal Oscillation, and has been a notable difference in ocean temperatures between the 1990s and 2000s (Figure 4). Were this cold phase to continue as forecast, this would act to moderate global warming in the next few years, as it has over the last decade.”

      So where does the record level arise? Oh yes:-

      “Decadal forecasting is immensely valuable, scientifically, because it represents a stringent test of how well the model simulates natural variability and also how well it captures the longer term anthropogenic warming trend”

      But what if that “longer term anthropogenic warming trend” they assume is non-existent, because it is based on a fallacy, as below?

      http://tallbloke.files.wordpress.com/2010/07/eggert-co2.png

      We’ll see over the next 5 yrs for sure.

    • Richard C (NZ) on 19/01/2013 at 12:08 pm said:

      >”…the rest have to re-think (that would include UKMO)”

      Unless UKMO is the as-yet-unidentified model that tracks the land-based observations. It could well be, I suppose, given their HadGEM3 development. That work may have been transferred to their CMIP5 model in some way:-

      “The Met Office has been the first centre to start producing data and to have finished all the experiments using our state-of-the-art “Earth System” model, HadGEM2-ES”

      http://www.metoffice.gov.uk/research/news/cmip5

  11. Richard C (NZ) on 20/01/2013 at 9:43 am said:

    Placed this comment at New Zealand Climate Change ‘The Halt in Warming’ post:-

    >”The models might even improve. But will this change the tune that is sung?”

    The acid test will be on this decadal projection over the next 5 yrs, given that there is improved accounting for natural variability. But although the trajectory is radically modified downwards from 2011, it is still for “record” levels above 2010’s strong El Nino level and 1998’s super-strong El Nino level. From what I can gather, though, the record levels do not arise primarily from natural factors but from the assumed “anthropogenic global warming trend”, because there’s no natural factor on the horizon, short of a super-strong El Nino, to boost temperatures to record levels.

    2012 was essentially an ENSO-neutral year (a little more El Nino than La Nina), so temperatures should oscillate above and below a 2012 neutral level for this current decadal regime. See RSS (note the different baseline to the UKMO HadCRUT 2012 projection plot – they’re not directly comparable):-

    http://junksciencearchive.com/MSU_Temps/RSSglobe.png

    If we take the average of 2012 as roughly a 0.2 anomaly, then “record” levels in RSS terms are at least 0.4 higher (2010) or 0.6 higher (1998) than 2012 ENSO-neutral levels. The lower confidence level, if it eventuates, would still be 0.25 higher than 2012.

    If this projected boost does not occur over the next 5 yrs, there cannot be a “background” anthro signal of any significance. And say La Nina dominates: that would mean oscillation BELOW 2012 levels, i.e. cooling. There’s already a La Nina possibly on the way for 2013, and the PDO is in cold mode, so La Nina could dominate for the next 20 yrs plus, as El Nino did for the last 30 yrs with the PDO in warm mode.

    Then there’s the solar recession upon us, which can’t boost warming. It’s only lagged ocean heat that will keep temperatures elevated, but that won’t do any boosting. With solar forcing reduced, temperatures must decline at some point in the future – when?

    So if over the next 5 yrs temperatures even just maintain 2012 neutral levels on average, the tune that is sung will of necessity have to change, because the assumed “background” anthropogenic signal demands a rise of at least 0.25 C at the lower limit, and 0.4 – 0.6 C at mid confidence, by UKMO’s forecast in RSS terms.

    The probabilities that could be assigned to the scenarios would, however, provide a prediction more like this, in my view, going by natural indicators (note these are probability weights, not anomalies):-

    0.2 warming (anthropogenic forcing above neutral levels, El Nino prevailing)
    0.3 no change (neutral conditions prevailing)
    0.5 cooling (La Nina conditions prevailing, solar recession)

    The next 5 yrs are crucial for DAGW now.

    http://newzealandclimatechange.wordpress.com/2013/01/19/the-halt-in-warming/#comment-1155
