Telling guilt from global warming

Climate Etc

Judith Curry draws a radical conclusion from this radical paper. The authors state that climate forcings from human influence and natural variation are likely of similar magnitude, which is the first time since the IPCC was created that the climate establishment has expressed that possibility. Then they admit that telling the difference between them is difficult (the science is not settled). That’s only the second time that has been said (the first was in an early IPCC report). Since “human influence” has become a hot-button code word for guilt, perhaps the guilt might now subside. Finally, Judith has a plea for the IPCC authors: “No more ‘unequivocals’ or ‘very likelys’ in the AR5, please.” Amazing: you must read this and share it with everyone you know or don’t know. It’s sober and persuasive evidence that a tide is turning – a belief in dangerous warming no longer holds a trump card in climate studies. Make the politicians face this new scientific reality or they’ll go on for years with their ETS and carbon taxes. – h/t Barry Brill

Separating natural and anthropogenically-forced decadal climate variability

The issue of separating natural from anthropogenically forced variability, particularly in the context of the attribution of 20th century climate change, has been the topic of several previous threads at Climate Etc. The issue of natural vs anthropogenically forced climate variability/change has been a key point of contention between the climate establishment and skeptics. There are some encouraging signs that the climate establishment is maturing in its consideration of this issue.

Distinguishing the Roles of Natural and Anthropogenically Forced Decadal Climate Variability: Implications for Prediction

Amy Solomon, Lisa Goddard, Arun Kumar, James Carton, Clara Deser, Ichiro Fukumori, Arthur M. Greene, Gabriele Hegerl, Ben Kirtman, Yochanan Kushnir, Matthew Newman, Doug Smith, Dan Vimont, Tom Delworth, Gerald A. Meehl, and Timothy Stockdale.

This is an adopted article.

Abstract. Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on regional scales, it is envisioned that initialized decadal predictions will provide important information for climate-related management and adaptation decisions. Such predictions are presently one of the grand challenges for the climate community. This requires identifying those physical phenomena—and their model equivalents—that may provide additional predictability on decadal time scales, including an assessment of the physical processes through which anthropogenic forcing may interact with or project upon natural variability. Such a physical framework is necessary to provide a consistent assessment (and insight into potential improvement) of the decadal prediction experiments planned to be assessed as part of the IPCC’s Fifth Assessment Report.

Citation: Solomon, Amy, and Coauthors, 2011: Distinguishing the Roles of Natural and Anthropogenically Forced Decadal Climate Variability. Bull. Amer. Meteor. Soc., 92, 141–156. doi: 10.1175/2010BAMS2962.1

Link to the complete article.

JC comment: The first sentence of the abstract really caught my attention: Given that over the course of the next 10–30 years the magnitude of natural decadal variations may rival that of anthropogenically forced climate change on regional scales… I don’t recall the climate establishment “giving” this one before. The implication is that the warming from 1970 or 1980 to 2000 should be operating under the same givens as well.

From the Introduction:

As the science of decadal prediction is in its infancy, one would like to assess and understand the following:

  1. the expectations for added regional climate information and skill achievable from initialized decadal predictions;
  2. what physical processes or modes of variability are important for the decadal predictability and prediction problem, and whether their relevance may evolve and change with time;
  3. what elements of the observing system are important for initializing and verifying decadal predictions; and
  4. in terms of attribution, to what extent are regional changes in the current climate due to natural climate variations and thus transitory, and to what extent are they due to anthropogenic forcing and thus likely to continue.

The purpose of this paper is to describe existing methodologies to separate decadal natural variability from anthropogenically forced variability, the degree to which those efforts have succeeded, and the ways in which the methods are limited or challenged by existing data. Note that the separation of decadal natural variability from anthropogenically forced variability goes beyond what has already been accomplished in previous studies that focused primarily on the detection of a long-term anthropogenic signal (Hegerl et al. 2007b) because on decadal time scales anthropogenic effects may be nonmonotonic, regionally dependent, and/or convolved with natural variability.
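
For readers who want the mechanics, one common approach (alluded to above but not spelled out in the excerpt, so take this as an illustrative assumption, not the paper’s method) treats the average across an ensemble of simulations as the forced component, since internal variability tends to cancel in the mean, and treats each run’s residual as natural variability. A minimal sketch with synthetic data:

```python
# Sketch: ensemble-mean separation of forced and natural variability.
# Synthetic stand-in data; real studies use multi-model or
# initial-condition ensembles (e.g. the CMIP archives).
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_years = 20, 100
years = np.arange(n_years)

forced = 0.01 * years                                 # assumed common forced trend (deg C)
internal = rng.normal(0.0, 0.15, (n_runs, n_years))   # run-specific natural variability
runs = forced + internal                              # each row: one simulated realization

forced_est = runs.mean(axis=0)     # ensemble mean approximates the forced component
natural_est = runs - forced_est    # residuals approximate internal variability

print("estimated forced trend (deg C/yr):", np.polyfit(years, forced_est, 1)[0])
```

The catch, as the paper notes, is that on decadal time scales the forced response need not be a smooth common trend, so the ensemble mean can still mix forced and natural components.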

JC comment: the long-term signal from anthropogenic forcing was detected in the AR4 basically for the period 1970 or 1980 to 2000, without accounting for this: because on decadal time scales anthropogenic effects may be nonmonotonic, regionally dependent, and/or convolved with natural variability.

Observational uncertainties

Verification of the forced component of twentieth-century climate trends simulated in model experiments depends on the existence of accurate estimates of these trends in observations. Given the limited sampling in both space and time of the observations and proxy records, these verifications must be handled carefully. In particular, knowledge of the spatial patterns and magnitudes of climate trends over the oceans is hampered by the uneven and changing distribution of commercial shipping routes and other observational inputs as well as different approaches to merging analyses of the observations (Rayner et al. 2011).

An example of the impact of observational uncertainties on the interpretation of twentieth-century SST trends is shown in Fig. 7, based on an uninterpolated dataset [version 2 of the Hadley Centre SST dataset (HadSST2); Rayner et al. 2006] and two optimally interpolated reconstructions [the Hadley Centre Sea Ice and SST dataset (HadISST; Rayner et al. 2003) and version 3 of the National Oceanic and Atmospheric Administration’s (NOAA’s) extended reconstructed SST (ERSSTv3; Smith et al. 2008)]. Although trends from the three datasets have many features in common, such as a strengthening of the equatorial Pacific zonal temperature gradient (Karnauskas et al. 2009), there are also differences. Most notably, the eastern equatorial Pacific shows cooling in HadISST and warming in HadSST2 and ERSSTv3 (see also Vecchi et al. 2008). However, independently measured but related variables, such as nighttime marine air temperatures, provide some evidence that the eastern Pacific trends represented in the HadSST2 and ERSSTv3 datasets may be the more realistic ones (Deser et al. 2010b). These observational sampling issues underscore the challenge of providing a robust target for model validation of twentieth-century surface marine climate trends, and perhaps the need to consider a suite of complementary measures for poorly sampled variables and/or regions.
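
For anyone wanting to reproduce this kind of comparison, the underlying operation is a least-squares trend at every grid point over the analysis period. The sketch below runs on synthetic anomalies; an actual comparison would load the gridded HadSST2, HadISST and ERSSTv3 files, whose formats are not assumed here.

```python
# Sketch: per-grid-point linear SST trends, the quantity compared across
# HadSST2, HadISST and ERSSTv3 in the paper's Fig. 7. Synthetic stand-in data.
import numpy as np

rng = np.random.default_rng(1)
n_years, n_lat, n_lon = 100, 36, 72
years = np.arange(n_years)

# Fake anomaly field: weak warming plus noise (deg C), shape (time, lat, lon).
sst = 0.005 * years[:, None, None] + rng.normal(0, 0.3, (n_years, n_lat, n_lon))

# Least-squares slope at every grid point, vectorized over space.
flat = sst.reshape(n_years, -1)
slopes = np.polyfit(years, flat, 1)[0].reshape(n_lat, n_lon)  # deg C per year

print("domain-mean trend (deg C/century):", 100 * slopes.mean())
```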

A limitation of the instrumental record is that it spans at most a few realizations of decadal variability. Paleoclimate records—derived from tree rings, corals, lake sediments, or other “proxies”—have been used to extend this record to hundreds of years or more and are generally believed to be free of anthropogenic influence prior to the industrial age (Brook 2009; Jansen et al. 2007), thus constituting a potential means of model verification.

JC comment: with all these uncertainties in the observations of ocean temperature, “unequivocal” and “very likely” in the AR4 seem over-confident.

Modelling uncertainties

The spatial structure and dominant time scales of natural variations differ across models (see discussion of Fig. 5). Additionally, coupled climate models produce a range of responses, in space and time, to anthropogenic radiative forcing (Fig. 8). Such differences in model estimates of internal variability and response to external forcing limit our understanding of the potential of decadal climate predictions.

As an example, the historical changes and future response of the tropical Pacific mean state have been subjects of debate. Different proposed mechanisms disagree on the expected sign of change in the zonal SST gradient in the tropical Pacific in response to anthropogenic forcing. The observational record does little to clarify the situation, as trends in different observed SST records differ even in sign (see Fig. 7). Models that simulate the largest El Niño–like response have the least realistic simulations of ENSO variability, while models with the most realistic simulations of ENSO project little change in the Pacific zonal SST gradient (Collins 2005). These differences in tropical Pacific interannual variability and change have implications for Pacific decadal variability through their impact on large-scale changes in the atmospheric circulation (e.g., Alexander et al. 2002; Vimont 2005).
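
The quantity in dispute, the zonal SST gradient, is simply the difference between western and eastern equatorial Pacific box averages, so the disagreement between datasets comes down to the sign of the trend in that difference. A hedged sketch (the box boundaries below are common conventions assumed for illustration, not the paper’s definitions):

```python
# Sketch: trend in the equatorial Pacific zonal SST gradient
# (west-box mean minus east-box mean). Box edges are conventional
# choices assumed for illustration, not taken from the paper.
import numpy as np

def zonal_gradient_trend(sst, lats, lons, years):
    """sst: anomalies shaped (time, lat, lon); lons in degrees east.
    Returns the linear trend of the west-minus-east gradient per year."""
    def box_mean(lat_lo, lat_hi, lon_lo, lon_hi):
        la = (lats >= lat_lo) & (lats <= lat_hi)
        lo = (lons >= lon_lo) & (lons <= lon_hi)
        return sst[:, la][:, :, lo].mean(axis=(1, 2))
    west = box_mean(-5, 5, 130, 170)   # warm-pool box (assumed bounds)
    east = box_mean(-5, 5, 210, 270)   # cold-tongue box (assumed bounds)
    return np.polyfit(years, west - east, 1)[0]

# Demo on random data, just to exercise the function:
lats, lons = np.linspace(-89, 89, 36), np.linspace(0, 355, 72)
years = np.arange(50)
sst = np.random.default_rng(2).normal(0, 0.3, (50, 36, 72))
print(zonal_gradient_trend(sst, lats, lons, years))
```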

Conclusion

The main conclusion drawn from the body of work reviewed in this paper is that distinguishing between natural and externally forced variations is a difficult problem that is nevertheless key to any assessment of decadal predictability and decadal prediction skill. Note that all the techniques are limited by some assumption intrinsic to their analysis, such as the spatial characteristics of the anthropogenic signal, independence of noise from signal, or statistical stationarity.
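
To make those assumptions concrete: the textbook detection approach regresses the observations onto a model-derived “fingerprint” pattern. Estimating the scaling factor presumes the spatial shape of the anthropogenic signal is known, that the noise is independent of it, and that the noise statistics are stationary. A minimal single-fingerprint sketch (the generic regression form, not the paper’s specific method):

```python
# Sketch: single-fingerprint detection by least squares.
# Model: obs = beta * fingerprint + noise. Estimating beta assumes the
# fingerprint's shape is correct, the noise is independent of it, and the
# noise statistics are stationary: the assumptions the paper flags.
import numpy as np

rng = np.random.default_rng(3)
n = 500                                       # flattened space-time points
fingerprint = rng.normal(size=n)              # assumed-known forced pattern
obs = 0.8 * fingerprint + rng.normal(0, 1.0, size=n)  # synthetic observations

beta = (fingerprint @ obs) / (fingerprint @ fingerprint)  # LS scaling factor
resid = obs - beta * fingerprint
print(f"beta = {beta:.2f} (true value 0.8); residual std = {resid.std():.2f}")
```

If any of those assumptions fails, the estimated scaling (and hence the attribution) is biased, which is the point the conclusion is making.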

JC summary: The authors of this paper are members of the climate establishment, in terms of being involved with the WCRP CLIVAR Programme and also the IPCC. This paper arguably provides more fodder for skepticism of the AR4 conclusions than anything I have seen from the climate establishment (the authors may not realize this). The issues surrounding natural internal decadal-scale variability are a huge challenge for separating natural from forced climate change. The same issues and challenges raised for future projections also apply to the warming in the last few decades of the 20th century. Sorting this out is the key challenge. No more unequivocals or very likelys in the AR5, please.

6 Thoughts on “Telling guilt from global warming”

  1. Quentin F on 13/04/2011 at 9:22 pm said:

    No human influence.

    • Yes, perhaps. Certainly no detectable human influence. Local influences are common, from clearing or planting forests, crops and grassland, and from building all manner of structures and paved areas. These can measurably affect humidity, rainfall and particularly temperature. It’s reasonable to propose a global influence from all of these local effects, but so far as I know nobody has reported detecting one. Not even after spending about US$50 billion over about 30 years. (I forget the exact figure quoted by Jo Nova.) If anyone can shed more light on this, please speak up.

      UPDATE:

      Found it. On 1 Dec 2009, Jo posted We paid to find a “crisis”, part of which says:

      Since 1989 the US government has given nearly $80 billion to the climate change industry.

      Thousands of scientists have been funded to find a connection between human carbon dioxide emissions and the climate. Hardly any have been funded to find the opposite. Throw billions of dollars at one question and how could bright, dedicated people not find 800 pages worth of connections, links, predictions, projections and scenarios? What’s amazing is what they haven’t found: empirical evidence.

      The BBC says “there is a consensus and thus no need to give equal time to other theories”. Which means they are not weighing up the arguments, they’re just counting papers. This is not journalism. It’s PR. If the IPCC is wrong, if there is a bias, you’re guaranteed not to hear about it from any organisation that thinks a consensus is scientific.

      When ExxonMobil pays just $23 million to skeptics the headlines run wild. But when $79 billion is poured into one theory, it doesn’t rate a mention.

    • Richard C (NZ) on 16/04/2011 at 12:12 pm said:

      The paper title uses the term “anthropogenically-forced”, which includes the land use changes you describe. But the term “anthropogenic radiative forcing”, used in the body of the paper under “Modelling uncertainties”, is the IPCC construct and methodology that really needs to be highlighted as the greatest uncertainty of all (and the most deficient). Yet it is the circular reasoning of that notion that the entire alarm industry, and the resulting ETS and carbon tax rationale, is based on.

      It amazes me that major NZ, and more especially Australian, carbon dioxide emitters don’t take a class action to force proof that the IPCC assumptions and RF methodology are valid. Now those emitters have morphed into carbon “polluters” and still no exception is taken. If it wasn’t for the economic benefits of these companies, I would say they all deserve to be taxed into oblivion for their wimpish and incredibly ill-informed stance.

      See:- About 50 high polluters to bear carbon tax brunt, Greg Combet tells Press Club

      http://www.theaustralian.com.au/national-affairs/about-50-high-polluters-to-bear-carbon-tax-brunt-greg-combet-tells-press-club/story-fn59niix-1226038504193

      Also:- Australia’s top 50 carbon emitters.

      http://resources.news.com.au/files/2011/04/13/1226038/651649-aus-news-file-top-carbon-emitters.gif

      It seems that as far as they are concerned, a carbon tax is just another cost to be passed on and no research is necessary to investigate the validity of the basis for it even though they have enormous resources at their disposal to do so.

    • Australis on 16/04/2011 at 6:39 pm said:

      Exporters (and import substituters) must be fully compensated, or the result would simply be carbon leakage.

      That means all 50 businesses “bearing the brunt” will be non-trade-exposed – which means they are cost-plus businesses that simply pass on any cost increases. But they add a margin before passing it on, so they (the big polluters) have a nice upside but no downside.

      However, the whole cost structure of Australia will have moved up a notch on a permanent basis. The country will be that much less competitive. There will be first and second rounds of induced inflation, which must lead to interest rate rises – which raise the exchange rate.

      So, the much-maligned coal-burning generators will be better off than before. The householder will pay more for electricity, and would have an incentive to reduce usage if electricity were price-elastic (which it clearly isn’t). But the householder won’t pay any more for coal power than wind power, so there will be no mode-switching.

      What’s the point of it all – apart from politics?

    • Richard C (NZ) on 17/04/2011 at 4:09 am said:

      The “much-maligned coal-burning generators” only have themselves to blame for that status; they are too complacent, due to the nature of the commodity. Why aren’t they pro-active in communicating their mitigation of real pollution (and shame on them if they are not)? If they are managing that aspect well (have reached the limit, or beyond, of the sensible and acceptable mitigation that can be achieved economically as returns diminish), then dealing with the CO2 issue should be a doddle.

      An interesting comparison and case study is the Kinleith pulp and paper mill. That plant has a plethora of environmental and real pollution issues to deal with, and problems have occurred in both water and air emissions, prompting a review. See “Review of science relating to discharges from the Kinleith pulp and paper mill”:

      http://www.waikatoregion.govt.nz/publications/Technical-Reports/Review-of-science-relating-to-discharges-from-the-Kinleith-pulp-and-paper-mill/

      But nowhere in the review is “carbon dioxide” or “CO2” categorized as a pollutant at issue.

      There are now far more stringent requirements for discharge consents than in the early days of the plant, as the review indicates. Although I’m not setting it up as a role model, I do think real polluters would benefit from studying the case. In the early days, water discharges turned the Waikato River brown, but the river is now clear enough to see the bottom from the bridges. Particulate emissions have been reduced and toxic flue discharges are scrubbed; much of this was voluntarily undertaken, as I understand it.

      If these operators have a commendable record of real pollution mitigation effort, and the public know about it, then they will have respect and can take exception to the “carbon polluter” tag without image damage. Their culture would ensure that no stone is left unturned in the investigation of the CO2 issue. I suspect that companies which resort to the carbon leakage route would be looking to circumvent environmental regulations and consents anyway, so leaving those constraints behind would not be just because of carbon taxes.

      I notice that Gillard is now under pressure from the labour unions to fully compensate a number of sectors (steel, aluminium, cement etc.), so the whole rationale is becoming a political farce – which does seem to be the point of the exercise: Green tails wagging larger party dogs, at the expense of the base constituents and at a cost to the economy, with no gain to the environment. Can it get any dumber?

  2. Clarence on 14/04/2011 at 6:00 pm said:

    It’s a step forward when “the establishment” clearly admits that the model simulations of AGW need to be verified by observations.

    But the paper complains that knowledge of “trends over the oceans is hampered by the uneven and changing distribution of commercial shipping routes”. Then they complain about disagreements between the datasets offered by Hadley, NOAA, etc.

    If they want accurate models, why don’t they use satellite records? They now have access to two 30-year-long records which cover the lower troposphere of the whole globe.

    Also enlightening is the admission that the best available models project little change in the sea surface temperatures of the Pacific. The scare stories about warming oceans come from models that “have the least realistic simulations of ENSO variability.”
