Climate Models

This thread is for discussion of computer climate models, or General Circulation Models (GCMs).

Bob D

Well, well, it looks like someone got the models wrong again.

How often have we heard that droughts will increase due to global warming? It’s the single most-quoted effect that alarmists use when discussing Africa, for example.

Seems they were wrong.

We find no evidence in our analysis of a positive feedback—that is, a preference for rain over wetter soils—at the spatial scale (50–100 kilometres) studied. In contrast, we find that a positive feedback of soil moisture on simulated precipitation does dominate in six state-of-the-art global weather and climate models—a difference that may contribute to excessive simulated droughts in large-scale models.

This is why these blokes should have checked their models before shouting about the end of the world.

Richard C (NZ)

New paper shows negative feedback from clouds ‘may damp global warming’ A paper published today in The Journal of Climate uses a combination of two modelling techniques to find that negative feedback from clouds could result in “a 2.3-4.5% increase in [model projected] cloudiness” over the next century, and that “subtropical stratocumulus [clouds] may damp global warming in a way not captured by the [Global Climate Models] studied.” This strong negative feedback from clouds could alone negate the 3C alleged anthropogenic warming projected by the IPCC. As Dr. Roy Spencer points out in his book, “The most obvious way for warming to be caused naturally is for small, natural fluctuations in the circulation patterns of the atmosphere and ocean to result in a 1% or 2% decrease in global cloud cover. Clouds are the Earth’s sunshade, and if cloud cover changes for any reason, you have global warming — or global cooling.” According to the authors of this new paper, current global climate models “predict a robust increase of 0.5-1 K in EIS over the next century, resulting in a 2.3-4.5% increase in [mixed layer model] cloudiness.” EIS or estimated inversion strength has… Read more »

Richard C (NZ)

Climate change research gets petascale supercomputer 1.5-petaflop IBM Yellowstone system runs 72,288 Intel Xeon cores Computerworld – Scientists studying Earth system processes, including climate change, are now working with one of the largest supercomputers on the planet. The National Center for Atmospheric Research (NCAR) has begun using a 1.5 petaflop IBM system, called Yellowstone, that is among the top 20 supercomputers in the world, at least until the global rankings are updated next month. For NCAR researchers it is an enormous leap in compute capability — a roughly 30 times improvement over its existing 77 teraflop supercomputer. Yellowstone is a 1,500 teraflops system capable of 1.5 quadrillion calculations per second. The NCAR-Wyoming Supercomputing Center in Cheyenne, where this system is housed, says that with Yellowstone, it now has “the world’s most powerful supercomputer dedicated to geosciences.” Along with climate change, this supercomputer will be used on a number of geoscience research issues, including the study of severe weather, oceanography, air quality, geomagnetic storms, earthquakes and tsunamis, wildfires, subsurface water and energy resources. […] Scientists will be able to use the supercomputer to model the regional impacts of climate change. A model that is… Read more »

Richard C (NZ)

I queried John Christy as to which modeling group it was that has mimicked absolute temperature and trajectory this century so far in his EPS statement Figure 2.1. This was his reply:-


This model labeled 27 should be inmcm4 (Russia)

John C.

John R. Christy
Director, Earth System Science Center
Distinguished Professor, Atmospheric Science
University of Alabama in Huntsville
Alabama State Climatologist

Richard C (NZ)

What did the Russians do that everyone else didn’t in CMIP5 for AR5? Did they ramp GHG forcing down to zero I wonder? They do say there were “some changes in the formulation”


The INMCM3.0 climate model has formed the basis for the development of a new climate-model version: the INMCM4.0. It differs from the previous version in that there is an increase in its spatial resolution and some changes in the formulation of coupled atmosphere-ocean general circulation models. A numerical experiment was conducted on the basis of this new version to simulate the present-day climate. The model data were compared with observational data and the INMCM3.0 model data. It is shown that the new model adequately reproduces the most significant features of the observed atmospheric and oceanic climate. This new model is ready to participate in the Coupled Model Intercomparison Project Phase 5 (CMIP5), the results of which are to be used in preparing the fifth assessment report of the Intergovernmental Panel on Climate Change (IPCC).

# # #

Good to see a modeling group validating their model against observations (GCM group that is, RTM groups do this religiously) – this is a major breakthrough.

Richard C (NZ)

Simulating Present-Day Climate with the INMCM4.0 Coupled Model of the Atmospheric and Oceanic General Circulations, E. M. Volodin, N. A. Dianskii, and A. V. Gusev, 2010, Institute of Numerical Mathematics, Russian Academy of Sciences, ul. Gubkina 8, Moscow, 119991 Russia

Page 2: "This makes it possible to analyze systematic errors in simulating the present-day climate and to assess the range of its possible changes caused, for example, by anthropogenic forcing."

Page 3: "On the basis of this model, a numerical experiment was carried out to simulate the modern climate. To this end, the concentrations specified for all radiatively active gases and aerosols corresponded to those in 1960."

Page 4: Air temperature at the surface (°C): Observations 14.0 ± 0.2 [34]; INMCM3.0 13.0 ± 0.1; INMCM4.0 13.7 ± 0.1

Page 4: "The 1951–2000 NCEP reanalysis data [31] were used to compare the model atmospheric dynamics with observational data, and data from [32–41] were used to compare the integral atmospheric characteristics."

Page 3: "The parameterizations of the basic physical processes in the model have changed only slightly; namely, some of the tuning parameters have changed. Among these are the parameterizations of radiation… Read more »

Richard C (NZ)

5.2 Heat emission on page 43 of Volodin, Dianskii, and Gusev gives the formulae, share of emissions across the spectrum, and references tables of coefficients.

Richard C (NZ)

Description of the CCM INM RAS and model experiments

Description of the atmospheric climate model inmcm4.0.(new) [hotlink]

Short description of the coupled climate model inmcm3.0 and model experiments. [hotlink]

Timetable of the model experiments.

Selected publications [hotlinked]

Volodin E.M., Diansky N.A.. “Prediction of the climate change in 19-22th centuries using coupled climate model”.

Volodin E.M., Diansky N.A. “ENSO reconstruction in the Coupled Climate Model”.

Volodin E.M.”Simulation of the modern climate. Comparison with observations and data of other climate models”.

Volodin E.M. “Reliability of the future climate change forecasts”.

Volodin E.M., Diansky N.A., Gusev A.V. “Simulating Present Day Climate with the INMCM4.0 Coupled Model of the Atmospheric and Oceanic General Circulations”.(new)

Volodin E.M. “Atmosphere-Ocean General Circulation Model with the Carbon Cycle”.(new)


“Pinatubo Climate Sensitivity and Two Dogs that didn’t bark in the night”

Interesting article on climate sensitivity over at Lucia’s

Richard C (NZ)

Lucia’s blog analysis makes Nuccitelli et al’s DK12 Comment look somewhat ordinary.

For about 2 yrs of data and "a single ocean heat capacity model" (one-heat-sink), Lucia's model "is "seeing" an ocean capacity of 53 watt-months/deg C/m2 – equivalent to about 30 to 40m water depth". Further down the page, the model is "(still) "seeing" a total ocean heat capacity corresponding to about the top 30-40m of ocean". This is for 60S to 60N only.

According to Nuccitelli et al, that’s all “noise” and 5 yr smoothed data should be used down to 2000m.

Can’t say I’m convinced by globally averaged approximations for these calculations. I think the 0-GCM approach using observed ocean heat climatology (which one?) corresponding to TOA satellite observations cell-by-cell is about the only way to arrive at anything anywhere near meaningful. Not that I know what it is about at Lucia-level.
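As a sanity check on the "53 watt-months/deg C/m2 is equivalent to about 30 to 40m water depth" conversion quoted above, here is a rough back-of-envelope sketch (standard water properties assumed; these are textbook values, not Lucia's exact inputs):

```python
# Convert an effective heat capacity quoted in W-months/K/m^2 into an
# equivalent depth of water with the same heat capacity per unit area.
rho = 1000.0                      # kg/m^3, density of water
cp = 4186.0                       # J/(kg K), specific heat of water
seconds_per_month = 30.44 * 86400  # average month length in seconds

C = 53 * seconds_per_month        # J/K/m^2
depth = C / (rho * cp)            # metres of water with that capacity
print(round(depth, 1))            # ~33 m, inside the quoted 30-40 m range
```

So the quoted equivalence holds up: 53 watt-months/K/m2 is the heat capacity of roughly the top 33 m of ocean.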

Richard C (NZ)

AR5 Chapter 11; Hiding the Decline (Part II)

Figure 11.33: Synthesis of near-term projections of global mean surface air temperature. a), b) and c):-

They hid the decline! In the first graph, observational data ends about 2011 or 2012. In the second graph, though, it ends about 2007 or 2008. There are four or five years of observational data missing from the second graph. Fortunately the two graphs are scaled identically, which makes it very easy to use a highly sophisticated tool called “cut and paste” to move the observational data from the first graph to the second graph and see what it should have looked like:

Well, oops. Once one brings the observational data up to date, it turns out that we are currently below the entire range of models in the 5% to 95% confidence range across all emission scenarios. The light gray shading is for RCP 4.5, the most likely emission scenario. But we’re also below the dark gray, which covers all emission scenarios for all models, including the ones where we strangle the global economy.

+ + +

Also John Christy’s preliminary plot (incomplete) of CMIP5 RCP4.5 vs observations (UAH/RSS):-

Richard C (NZ)

The controversy by Anastassia Makarieva, Victor Gorshkov, Douglas Sheil, Antonio Nobre, Larry Li Thanks to help from blog readers, those who visited the ACPD site and many others who we have communicated with, our paper has received considerable feedback. Some were supportive and many were critical. Some have accepted that the physical mechanism is valid, though some (such as JC) question its magnitude and some are certain it is incorrect (but cannot find the error). Setting aside these specific issues, most of the more general critical comments can be classified as variations on, and combinations of, three basic statements: 1. Current weather and climate models (a) are already based on physical laws and (b) satisfactorily reproduce observed patterns and behaviour. By inference, it is unlikely that they miss any major processes. 2. You should produce a working model more effective than current models. 3. Current models are comprehensive: your effect is already there. Let’s consider these claims one by one. Models and physical laws […] Thus, while there are physical laws in existing models, their outputs (including apparent circulation power) reflect an empirical process of calibration and fitting. In this sense models are… Read more »

Richard C (NZ)

New paper finds IPCC climate models unable to reproduce solar radiation at Earth’s surface A new paper published in the Journal of Geophysical Research – Atmospheres finds the latest generation of IPCC climate models were unable to reproduce the global dimming of sunshine from the ~ 1950s-1980s, followed by global brightening of sunshine during the 1990’s. These global dimming and brightening periods explain the observed changes in global temperature over the past 50-60 years far better than the slow steady rise in CO2 levels. The authors find the models underestimated dimming by 80-85% in comparison to observations, underestimated brightening in China and Japan as well, and that “no individual model performs particularly well for all four regions” studied. Dimming was underestimated in some regions by up to 7 Wm-2 per decade, which by way of comparison is 25 times greater than the alleged CO2 forcing of about 0.28 Wm-2 per decade. The paper demonstrates climate models are unable to reproduce the known climate change of the past, much less the future, that the forcing from changes in solar radiation at the Earth surface is still far from being understood and dwarfs any alleged… Read more »

Richard C (NZ)

‘Global warming slowdown retrospectively “predicted” ‘ By Ashutosh Jogalekar When I was in graduate school I once came across a computer program that’s used to predict the activities of as yet unsynthesized drug molecules. The program is “trained” on a set of existing drug molecules with known activities (the “training set”) and is then used to predict those of an unknown set (the “test set”). In order to make learning the ropes of the program more interesting, my graduate advisor set up a friendly contest between me and a friend in the lab. We were each given a week to train the program on an existing set and find out how well we could do on the unknowns. After a week we turned in our results. I actually did better than my friend on the existing set, but my friend did better on the test set. From a practical perspective his model had predictive value, a key property of any successful model. On the other hand my model was one that still needed some work. Being able to “predict” already existing data is not prediction, it’s explanation. Explanation is important, but a model… Read more »

Richard C (NZ)

‘The “ensemble” of models is completely meaningless, statistically’ Posted on June 18, 2013 by Anthony Watts This comment from rgbatduke, who is Robert G. Brown of the Duke University Physics Department, on the “No significant warming for 17 years 4 months” thread has gained quite a bit of attention [e.g. reproduced by Dr Judith Curry at Climate Etc] because it speaks clearly to truth. So that all readers can benefit, I’m elevating it to a full post. rgbatduke says: June 13, 2013 at 7:20 am Last two paragraphs: “It would take me, in my comparative ignorance, around five minutes to throw out all but the best 10% of the GCMs (which are still diverging from the empirical data, but arguably are well within the expected fluctuation range on the DATA side), sort the remainder into top-half models that should probably be kept around and possibly improved, and bottom half models whose continued use I would defund as a waste of time. That wouldn’t make them actually disappear, of course, only mothball them. If the future climate ever magically popped back up to agree with them, it is a matter of a few… Read more »

Richard C (NZ)

One of the first jobs for NIWA’s High Performance Computing Facility (HPCF) was snow modeling partly funded by the Ski Areas Association of New Zealand:

‘New Zealand snow areas confident they will adapt to any risks from climate change’

16 December 2010

New climate modelling shows seasonal snow levels at New Zealand ski areas will be reduced by the effects of climate change in the coming years, but the good news is the loss may actually be less than originally anticipated and we should be able to continue to make snow, even under a more extreme climate scenario

A lot less. I’ve just seen a newsclip from Mt Hutt (I think it was) where they were saying the 3m base was the most they had ever seen.

Richard C (NZ)

‘New Weather Service supercomputer faces chaos’ By Steve Tracton The National Weather Service is currently in the process of transitioning its primary computer model, the Global Forecast System (GFS), from an old supercomputer to a brand new one [Weather and Climate Operational Supercomputer System (WCOSS)]. However, before the switch can be approved, the GFS model on the new computer must generate forecasts indistinguishable from the forecasts on the old one. One expects that ought not to be a problem, and to the best of my 30+ years of personal experience at the NWS, it has not been. But now, chaos has unexpectedly become a factor and differences have emerged in forecasts produced by the identical computer model but run on different computers. This experience closely parallels Ed Lorenz’s experiments in the 1960s, which led serendipitously to the development of chaos theory (aka the “butterfly effect”). What Lorenz found – to his complete surprise – was that forecasts run with identically the same (simplistic) weather forecast model diverged from one another as forecast length increased solely due to even minute differences inadvertently introduced into the starting analyses (“initial conditions”). […] So what lay behind the chaotic… Read more »
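The Lorenz result described above is easy to reproduce: run the same toy model twice with initial conditions differing by one part in 10^8 and watch the forecasts part company. A minimal sketch using the classic Lorenz (1963) equations with a simple Euler step (the parameter values are the standard textbook ones; nothing here comes from the NWS system):

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) system."""
    return (x + sigma * (y - x) * dt,
            y + (x * (rho - z) - y) * dt,
            z + (x * y - beta * z) * dt)

def max_divergence(eps, steps=5000):
    """Run two copies of the model whose initial x differs only by eps;
    return the largest separation in x seen along the way."""
    a = (1.0, 1.0, 1.0)
    b = (1.0 + eps, 1.0, 1.0)
    worst = 0.0
    for _ in range(steps):
        a = lorenz_step(*a)
        b = lorenz_step(*b)
        worst = max(worst, abs(a[0] - b[0]))
    return worst

# An initial difference of one part in 1e8 grows to order-of-the-attractor
# size well before 5000 steps - the two "forecasts" have fully diverged:
print(max_divergence(1e-8))
```

The same mechanism explains the WCOSS situation: two machines rounding the identical model's arithmetic even slightly differently act like a tiny perturbation to the initial conditions.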

Richard C (NZ)

‘Policy Implications of Climate Models on the Verge of Failure’ By Paul C. Knappenberger and Patrick J. Michaels Center for the Study of Science, Cato Institute, Washington DC [converted from a poster displayed at the AGU Science Policy Conference, Washington, June 24-26] INTRODUCTION Assessing the consistency between real-world observations and climate model projections is a challenging problem but one that is essential prior to making policy decisions which depend largely on such projections. National and international assessments often mischaracterize the level of consistency between observations and projections. Unfortunately, policymakers are often unaware of this situation, which leaves them vulnerable to developing policies that are ineffective at best and dangerous at worst. Here, we find that at the global scale, climate models are on the verge of failing to adequately capture observed changes in the average temperature over the past 10 to 30 years—the period of the greatest human influence on the atmosphere. At the regional scale, specifically across the United States, climate models largely fail to replicate known precipitation changes both in sign as well as magnitude. […] CONCLUSIONS: It is impossible to present reliable future projections from a collection of climate models… Read more »

Richard C (NZ)

‘Climate change: The forecast for 2018 is cloudy with record heat’ Efforts to predict the near-term climate are taking off, but their record so far has been patchy. * Jeff Tollefson In August 2007, Doug Smith took the biggest gamble of his career. After more than ten years of work with fellow modellers at the Met Office’s Hadley Centre in Exeter, UK, Smith published a detailed prediction of how the climate would change over the better part of a decade. His team forecast that global warming would stall briefly and then pick up speed, sending the planet into record-breaking territory within a few years. The Hadley prediction has not fared particularly well. Six years on, global temperatures have yet to shoot up as it projected. Despite this underwhelming result, such near-term forecasts have caught on among many climate modellers, who are now trying to predict how global conditions will evolve over the next several years and beyond. Eventually, they hope to offer forecasts that will enable humanity to prepare for the decade ahead just as meteorologists help people to choose their clothes each morning. These near-term forecasts stand in sharp contrast to the… Read more »

Richard C (NZ)

Two GCM papers appear to be creating a “buzz” at present.

First paper:

‘Recent global warming hiatus tied to equatorial Pacific surface cooling’

Yu Kosaka and Shang-Ping Xie

[Judith Curry] “….the same natural internal variability (primarily PDO) that is responsible for the pause is a major and likely dominant cause (at least at the 50% level) of the warming in the last quarter of the 20th century”

[John Michael Wallace of the University of Washington] “It argues that not only could the current hiatus in the warming be due to natural causes: so also could the rapidity of the warming from the 1970s until the late 1990s”

Second paper:

‘Overestimated global warming over the past 20 years’

Opinion & Comment by Fyfe, Gillett and Zwiers

[Judith Curry] “Their conclusion This difference might be explained by some combination of errors in external forcing, model response and internal climate variability is right on the money IMO”

[The Hockey Schtick] “The authors falsify the models at a confidence level of 90%, and also find that there has been no statistically significant global warming for the past 20 years”

# # #

“Pause”, “hiatus”, and “divergence” now standard climatological terms in the literature apparently.

Richard C (NZ)

Twitter / BigJoeBastardi: Now “climate researchers” will …

Now “climate researchers” will want huge grants to tell us that when pdo warms in 20 years, warming will resume,after drop to late 70s temps

Twitter / RyanMaue: Cold-phase of PDO means …

Cold-phase of PDO means “hiatus/less/pause/plateau” of warming. We need a Nature article w/climate models to prove this?

Twitter / RyanMaue: I already blamed lack of global …

I already blamed lack of global TC activity from 2007-2012 on colder Pacific conditions. I thought it was so apparent to be non-publishable

Twitter / BigJoeBastardi: The arrogance and ignorance …

The arrogance and ignorance of these guys, now “discovering” what many have forecasted to happen due to cold PDO is stunning

Richard C (NZ)

Tisdale re Kosaka and Xie:

“Anyone with a little common sense who’s reading the abstract and the hype around the blogosphere and the Meehl et al papers will logically now be asking: if La Niña events can stop global warming, then how much do El Niño events contribute? 50%? The climate science community is actually hurting itself when they fail to answer the obvious questions.”

‘Global warming pause caused by La Nina’

The researchers said similar decade-long pauses could occur in future, but the longer-term warming trend was “very likely to continue with greenhouse gas increases”.

Read more:

# # #

Or “…the longer-term warming trend was “very likely to [turn to cooling] with [solar decreases]”

It all depends on the (correct) attribution.

Richard C (NZ)

Settled science: the heat is hiding in the ocean while the Pacific Ocean cools, and it’s “pretty straightforward” and “complicated”, and “a chicken vs. egg problem” dogs the finding:

‘Pacific Ocean cools, flattening global warming’

“Really, this seems pretty straightforward. The climate is complicated, and natural variability can mask trends seen over century-long timescales,” says climate scientist David Easterling of the National Oceanic and Atmospheric Administration’s National Climatic Data Center in Asheville, N.C.

MIT’s Susan Solomon is more skeptical of the Pacific Ocean cooling as an explanation for the flattening, saying “a chicken vs. egg problem” dogs the finding. “Did the sea surface temperatures cool on their own, or were they forced to do so by, for example, changes in volcanic or pollution aerosols, or something else? This paper can’t answer that question.”

Richard C (NZ)

New paper finds ‘up to 30% discrepancy between modeled and observed solar energy absorbed by the atmosphere’ More problems for the climate models: A paper published today in Geophysical Research Letters finds that there is “up to 30% discrepancy between the modeled and the observed solar energy absorbed by the atmosphere.” The authors attribute part of this large discrepancy, which would alone have a greater radiative forcing effect than all of the man-made CO2 in the atmosphere, to water vapor absorption in the near UV region [see hotlink], “But the magnitude of water vapor absorption in the near UV region at wavelengths shorter than 384 nm is not known.” The authors note, “Water vapor is [the most] important greenhouse gas in the earth’s atmosphere” and set out to discover [apparently for the first time] “The effect of the water vapor absorption in the 290-350 nm region on the modeled radiation flux at the ground level.” ‘The influence of water vapor absorption in the 290-350 nm region on solar radiance: Laboratory studies and model simulation’ Juan Du, Li Huang, Qilong Min, Lei Zhu Abstract [1] Water vapor is an important greenhouse gas in the earth’s atmosphere. Absorption… Read more »

Richard C (NZ)

‘Leaked SPM AR5: Multi-decadal trends’ Data Comparisons Written by: lucia […] A way into the section, the draft states: “Models do not generally reproduce the observed reduction in surface warming trend over the last 10–15 years…” […] Earlier in the draft we find: “There is very high confidence that climate models reproduce the observed large-scale patterns and multi-decadal trends in surface temperature, especially since the mid-20th century” So evidently the AR5 will admit that they have not reproduced observed warming in the past 10–12 years, and speculate that it might be unpredictable climate variability, solar, volcanic or aerosol forcings, or possibly due to “too strong a response to increasing greenhouse-gas forcings”, which mostly amounts to excess climate sensitivity. That said, reading the leaked draft, I can’t help but wonder about their definition of “multi-decadal”. Generally, I assume that means “two or more decades”. So, I ran my script to get roughly 15, 20 and 25 year trends, comparing the observed earth trend to the spread in trends in the ‘AR5′ models forced using the rcp45 scenario. […] As you can see, while the 15 year trend (discussed in the leaked draft SPM) are just… Read more »
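For readers wanting to replicate this kind of comparison, the core of lucia's trend calculation is just an ordinary least-squares slope over the trailing N months of an anomaly series. A minimal sketch (not her actual script; the synthetic series and the helper name are illustrative):

```python
def trend_per_decade(anomalies, months):
    """OLS slope of the final `months` values of a monthly anomaly
    series, converted from per-month to per-decade units."""
    y = anomalies[-months:]
    n = len(y)
    x = list(range(n))
    xbar = sum(x) / n
    ybar = sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    return slope * 120.0  # 120 months per decade

# Synthetic check: a series warming at exactly 0.1 K/decade should
# return a 15-year (180-month) trend of 0.1:
series = [0.1 * m / 120.0 for m in range(300)]
print(round(trend_per_decade(series, 180), 3))  # -> 0.1
```

Running the same function on observed anomalies and on each model realisation, then comparing the observed value against the spread of model values, reproduces the structure of the 15/20/25-year comparison described in the post.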

Richard C (NZ)

‘Viewpoints: Reactions to the UN climate report’


Professor John Shepherd, ocean & earth science, University of Southampton

“….no-one ever claimed that climate models could predict all these decadal wiggles”

# # #

Successive decadal wiggles are what make multidecadal projections. And Kosaka and Xie (2013) modeled (in retrospect) the present decadal wiggle when constrained by natural oceanic variation.

Therefore, natural variation (e.g. PDO/AMO) must be integrated in the models before realistic projections can be made – the sceptics’ argument for yonks.

Richard C (NZ)

‘New paper finds simple laptop computer program reproduces the flawed climate projections of supercomputer climate models’

The Hockey Schtick

A new paper finds a simple climate model based on just three variables “and taking mere seconds to run on an ordinary laptop computer, comes very close to reproducing the results of the hugely complex climate models.” and “The [laptop computer] model was based on three key processes: how much energy carbon dioxide prevents from escaping to space (radiative forcing), the relationship between rate of warming and temperature, and how rapidly the ocean takes up heat (ocean thermal diffusivity).”

Actually, you only need one independent variable [CO2 levels] to replicate what the highly complex supercomputer climate models output. This has been well demonstrated by Dr. Murry Salby in his lecture, which shows 1:1 agreement between the supercomputer-simulated global temperature and CO2 levels over the 21st century: [see graph]
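The three-process laptop model described above is essentially a small energy-balance model. A sketch of that general type, using a two-box formulation (a mixed layer exchanging heat with a deep ocean); every parameter value below is an illustrative assumption, not taken from the paper:

```python
import math

F2X = 3.7       # W/m^2 per CO2 doubling (standard approximation)
LAM = 1.2       # W/m^2/K climate feedback parameter (assumed)
C_MIXED = 8.0   # mixed-layer heat capacity, W*yr/m^2/K (assumed)
KAPPA = 0.7     # deep-ocean heat uptake efficiency, W/m^2/K (assumed)
C_DEEP = 100.0  # deep-ocean heat capacity, W*yr/m^2/K (assumed)

def project(co2_growth_per_year, years):
    """Step a two-box energy-balance model forward one year at a time.
    co2_growth_per_year is the annual CO2 concentration ratio, e.g. 1.01."""
    T, T_deep = 0.0, 0.0
    for yr in range(1, years + 1):
        # Radiative forcing is logarithmic in the CO2 ratio:
        F = F2X * math.log(co2_growth_per_year ** yr) / math.log(2.0)
        uptake = KAPPA * (T - T_deep)          # heat flowing to the deep
        T += (F - LAM * T - uptake) / C_MIXED  # surface-box response
        T_deep += uptake / C_DEEP              # slow deep-ocean warming
    return T

# 1% per year CO2 growth for 70 years (roughly a doubling):
print(round(project(1.01, 70), 2))
```

With these (assumed) parameters the 70-year warming lands in the 1.5-2.5 K range, which is the sense in which a three-parameter model "comes very close" to the GCM ensemble: for smooth forcing scenarios the GCM-mean output is well approximated by forcing, feedback, and ocean uptake alone.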


I remember, even more clearly now 20 yrs later, when I was studying Electrical Engineering at University, doing Philosophy 101. We were the first year that had philosophy added to the course, with the intention of opening the eyes of potential engineers to a frame of reference for the decisions we may one day make as engineers. The topics I remember were:

• Value judgments – how our personal values influence what’s right
• The energy crisis – over-reliance on fossil fuels
• Global warming, the Greenhouse effect, man-made CO2 emissions

I very much enjoyed the Value judgments topic. Why do we make bridges only 2.5 times stronger than their maximum loading? What makes your decisions and values more important than others’? Does it take into account the potential of natural disasters? Excellent stuff! The Energy Crisis topic didn’t make as much sense to me. So many loaded learnings. We weren’t philosophizing, we were being brainwashed. I could understand that fossil fuel is a finite resource, but I also knew, at the same time I was being brainwashed, the constant discovery of more and larger deposits that technology was helping us find were… Read more »

Richard C (NZ)

‘Does the Uptick in Global Surface Temperatures in 2014 Help the Growing Difference between Climate Models and Reality?’ Bob Tisdale / January 16, 2015 CLOSING As illustrated and discussed, while global surface temperatures rose slightly in 2014, the minor uptick did little to overcome the growing difference between observed global surface temperature and the projections of global surface warming by the climate models used by the IPCC. In comments: Simon January 16, 2015 at 9:06 am Quote from NOAA’s annual summary… “This is the first time since 1990 the high temperature record was broken in the absence of El Niño conditions at any time during the year in the central and eastern equatorial Pacific Ocean, as indicated by NOAA’s CPC Oceanic Niño Index. This phenomenon generally tends to increase global temperatures around the globe, yet conditions remained neutral in this region during the entire year and the globe reached record warmth despite this.” As much as this article has tried to imply this record year is not significant, the paragraph above would say otherwise. Reply Bob Tisdale January 16, 2015 at 9:19 am NOAA is playing games, Simon. They well know… Read more »

Richard C (NZ)

‘Warmest year’, ‘pause’, and all that by Judith Curry, January 16, 2015 […] El Nino? One of the key aspects of the hype about the ‘warmest year in 2014′ was that 2014 was not even an El Nino year. Well, there has been a great deal of discussion about this issue on the Tropical ListServ. Here is what I have taken away from that discussion: A global circulation response pattern to Pacific convection with many similarities to El Niño has in fact been present since at least June. Convection to the east of New Guinea is influencing zonal winds in the upper troposphere across the Pacific and Atlantic, looking similar to an El Nino circulation response. So, is it El Niño? Not quite, according to some conventional indices, but a broader physical definition might be needed to capture the different flavors of El Nino. A number of scientists are calling for modernizing the ENSO identification system. So I’m not sure how this event might eventually be identified, but for many practical purposes (i.e. weather forecasting), this event is behaving in many ways like an El Nino. What does this mean for interpreting the… Read more »

Richard C (NZ)

‘Peer-reviewed pocket-calculator climate model exposes serious errors in complex computer models and reveals that Man’s influence on the climate is negligible’

Anthony Watts / January 16, 2015

What went wrong?

A major peer-reviewed climate physics paper in the first issue (January 2015: vol. 60 no. 1) of the prestigious Science Bulletin (formerly Chinese Science Bulletin), the journal of the Chinese Academy of Sciences and, as the Orient’s equivalent of Science or Nature, one of the world’s top six learned journals of science, exposes elementary but serious errors in the general-circulation models relied on by the UN’s climate panel, the IPCC. The errors were the reason for concern about Man’s effect on climate. Without them, there is no climate crisis.

Thanks to the generosity of the Heartland Institute, the paper is open-access. It may be downloaded free; click on “PDF” just above the abstract.


Richard C (NZ)

‘Questioning the robustness of the climate modeling paradigm’

by Judith Curry, February 2, 2015

Are climate models the best tools? A recent Ph.D. thesis from The Netherlands provides strong arguments for ‘no’.

Richard C (NZ)

Remote Sensing Systems (RSS) – Climate Analysis Atmospheric Temperature […] The troposphere has not warmed as fast as almost all climate models predict. To illustrate this last problem, we show several plots below. Each of these plots has a time series of TLT temperature anomalies using a reference period of 1979-2008. In each plot, the thick black line is the measured data from RSS V3.3 MSU/AMSU Temperatures. The yellow band shows the 5% to 95% envelope for the results of 33 CMIP-5 model simulations (19 different models, many with multiple realizations) that are intended to simulate Earth’s Climate over the 20th Century. For the time period before 2005, the models were forced with historical values of greenhouse gases, volcanic aerosols, and solar output. After 2005, estimated projections of these forcings were used. If the models, as a whole, were doing an acceptable job of simulating the past, then the observations would mostly lie within the yellow band. For the first two plots (Fig. 1 and Fig 2), showing global averages and tropical averages, this is not the case. Only for the far northern latitudes, as shown in Fig. 3, are the observations within… Read more »

Richard C (NZ)

‘Winters in Boston Becoming Drier’ Written by Dr. Roy Spencer on 13 February 2015. Much has been said in recent weeks about how bigger snowstorms in Boston are (supposedly) just what climate models have predicted. “Global warming” is putting more water vapor into the air, leading to more “fuel” for winter storms and more winter precipitation. While this general trend is seen in climate models for global average conditions (warming leads to more precipitation), what do the models really predict for Boston? And what has actually been observed in Boston? The following plot shows that the observed total January precipitation in Boston has actually decreased since the 1930s, contrary to the average “projections” (in reality, hindcasts) from a total of 42 climate models, at the closest model gridpoint to Boston: [See graph] Note that even the forecast increase in January precipitation is so small that it probably would never be noticed if it actually occurred. During the same period, January temperatures in Boston have seen a statistically insignificant +0.1 deg. F per decade warming, in contrast to the 2.5-times-faster average warming produced by the 42 climate models: [See graph] What is very… Read more »

Richard C (NZ)

‘Are Climate Modelers Scientists?’ by Pat Frank February 24, 2015 For going on two years now, I’ve been trying to publish a manuscript that critically assesses the reliability of climate model projections. The manuscript has been submitted twice and rejected twice from two leading climate journals, for a total of four rejections. All on the advice of nine of ten reviewers. More on that below. The analysis propagates climate model error through global air temperature projections, using a formalized version of the “passive warming model” (PWM) GCM emulator reported in my 2008 Skeptic article. Propagation of error through a GCM temperature projection reveals its predictive reliability. […] I will give examples of all of the following concerning climate modelers: They neither respect nor understand the distinction between accuracy and precision. They understand nothing of the meaning or method of propagated error. They think physical error bars mean the model itself is oscillating between the uncertainty extremes. (I kid you not.) They don’t understand the meaning of physical error. They don’t understand the importance of a unique result. Bottom line? Climate modelers are not scientists. Climate modeling is not a branch of physical science.… Read more »
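Frank’s propagation argument, as summarised above, can be illustrated with a toy calculation: if each step of a projection carries an independent per-step uncertainty u, root-sum-square propagation gives a total uncertainty of u·sqrt(n) after n steps. A minimal sketch (the ±0.1 K per-step figure is a hypothetical magnitude for illustration, not Frank’s number):

```python
import math

def propagated_uncertainty(u_step: float, n_steps: int) -> float:
    """Root-sum-square propagation of an independent per-step error:
    sigma_n = sqrt(n) * u_step."""
    return u_step * math.sqrt(n_steps)

# Hypothetical per-step uncertainty of +/-0.1 K per projection year
u = 0.1
for n in (1, 10, 100):
    print(f"after {n:3d} steps: +/-{propagated_uncertainty(u, n):.2f} K")

# The uncertainty envelope widens as sqrt(n) while the central projection
# stays smooth: the error bars describe ignorance about the result, not a
# model oscillating between the extremes.
```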

Richard C (NZ)

‘On Steinman et al. (2015) – Michael Mann and Company Redefine Multidecadal Variability And Wind Up Illustrating Climate Model Failings’

Bob Tisdale / February 26, 2015

Some good comments too e.g. Dr Norman Page:

“That the Steinman et al paper got through peer review for Science Magazine says much about the current state of establishment science. However, in a short comment on the paper in the same Science issue, Ben Booth of the Hadley Centre does sound a refreshingly cautionary (for Science Mag and Hadley) note, saying that the paper is only useful if the current models accurately represent both the external drivers of past climate and the climate responses to them, and that there is reason to be cautious in both of these areas. This comment is an encouraging sign that empirical reality may finally be making an impression on the establishment consciousness.”

Richard C (NZ)

INMCM4 (Russian Academy of Sciences) in Judith Curry’s post: ‘Climate sensitivity: lopping off the fat tail’. There is one climate model that falls within the range of the observational estimates: INMCM4 (Russian). I have not looked at this model, but on a previous thread RonC makes the following comments. “On a previous thread, I showed how one CMIP5 model produced historical temperature trends closely comparable to HADCRUT4. That same model, INMCM4, was also closest to the Berkeley Earth and RSS series. Curious about what makes this model different from the others, I consulted several comparative surveys of CMIP5 models. There appear to be 3 features of INMCM4 that differentiate it from the others.”

1. INMCM4 has the lowest CO2 forcing response, at 4.1 K for 4xCO2. That is 37% lower than the multi-model mean.
2. INMCM4 has by far the highest climate system inertia: deep ocean heat capacity in INMCM4 is 317 W yr m-2 K-1, 200% of the mean (which excluded INMCM4 because it was such an outlier).
3. INMCM4 exactly matches observed atmospheric H2O content in the lower troposphere (215 hPa), and is biased low above that. Most others are biased high.

So the model that most closely… Read more »
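RonC’s trend comparison rests on ordinary least-squares fitting: regress each anomaly series against time and compare the slopes. A minimal sketch with synthetic stand-ins (the series below are invented, not INMCM4, HadCRUT4, Berkeley Earth or RSS output):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1900, 2015)

# Synthetic stand-ins: an "observed" series and two "model" series.
# Invented numbers -- not real model or observational data.
obs = 0.008 * (years - 1900) + rng.normal(0.0, 0.1, years.size)
mod_a = 0.008 * (years - 1900) + rng.normal(0.0, 0.1, years.size)  # close match
mod_b = 0.020 * (years - 1900) + rng.normal(0.0, 0.1, years.size)  # runs hot

def decadal_trend(series):
    """Least-squares linear trend, in degrees per decade."""
    return 10.0 * np.polyfit(years, series, 1)[0]

for name, s in [("obs", obs), ("model A", mod_a), ("model B", mod_b)]:
    print(f"{name:8s} {decadal_trend(s):+.3f} deg/decade")
```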

Richard C (NZ)

Tom Nelson tweet (he’s hardly missed a beat despite suspension, and is blogging again too):

“Ok, so maybe the Canadian Climate Model isn’t quite matching reality”

See graph

Richard C (NZ)

‘Open Letter to U.S. Senators Ted Cruz, James Inhofe and Marco Rubio’ Bob Tisdale / April 14, 2015 Subject: Questions about Climate Model-Based Science From: Bob Tisdale – Independent Climate Researcher To: The Honorable Ted Cruz, James Inhofe and Marco Rubio Dear Senators Cruz, Inhofe and Rubio: I am writing you as chairs of the Subcommittee on Space, Science, and Competitiveness, of the Senate Environment and Public Works Committee, and of the Committee on Oceans, Atmosphere, Fisheries, and Coast Guard. I am an independent researcher who studies global warming and climate change, and I am probably best known for my articles at the science weblog WattsUpWithThat, where I would be considered an investigative reporter. I have a few very basic questions for you about climate model-based science. They are: # Why are taxpayers funding climate model-based research when those models are not simulating Earth’s climate? # Why are taxpayers funding climate model-based research when each new generation of climate models provides the same basic answers? # Redundancy: why are taxpayers funding 5 climate models in the U.S.? # Why aren’t climate models providing the answers we need? Example: Why didn’t the consensus of… Read more »

Richard C (NZ)

Deceit by the University of New South Wales ARC Centre of Excellence for Climate System Science. Paper: Robust warming projections despite the recent hiatus by Matthew H. England, Jules B. Kajtar and Nicola Maher, published in Nature Climate Change, doi:10.1038/nclimate2575. Commentary: “The peer-reviewed study, published today in Nature Climate Change, compared climate models that capture the current slowdown in warming to those that do not.” And, “This shows that the slowdown in global warming has no bearing on long-term projections – it is simply due to decadal variability. Greenhouse gases will eventually overwhelm this natural fluctuation,” said lead author and Chief Investigator with the ARC Centre of Excellence for Climate System Science, Prof Matthew England. # # # 1) The HadCRUT4 series is heavily smoothed in their graph compared to the model runs: HadCRUT4 unsmoothed actually looks like this: Somewhat at odds with the profiles of the model runs selected. The graph caption states “The future projections have been appended to corresponding historical runs at 2006”. 2006 corresponds to the start of the models-observations divergence in the graph. 2) The climate models selected DO NOT capture the full extent… Read more »

Richard C (NZ)

Underlying architecture of selected climate models by Kaitlin Alexander, PhD student in climate science at the University of New South Wales in Sydney, Australia.

Diagram key
COSMOS 1.2.1
Model E (17/06/2011)
HadGEM3 (03/08/2009)
CESM 1.0.3
UVic ESCM 2.9

All have direct incidence of “solar radiation” to the atmosphere module – correct.

Not one has direct incidence of solar radiation to either ocean or land, contrary to Trenberth et al.’s ‘Global Energy Flows’ (see below) and conventional radiation-matter physics. Apparently there is no direct incidence: the ocean and land receive solar energy via interaction with the atmosphere. They do receive some energy that way, but that is the minor “diffuse” component (neglected by Trenberth et al.); the major component is direct, as shown:

Global Energy Flows, Trenberth et al., 2009

The climate science modeling world sure is a strange place. And internally inconsistent and contradictory.


Useful (for us) for seeing what aspects are being factored in, but it is curious that she has chosen to go to so much trouble counting lines of code and ranking the models on that basis. I don’t imagine that the impact of each aspect has much to do with how many lines of code there are.

Perhaps the more lines of code the better climate scientist you are.

Richard C (NZ)

‘Update of Model-Observation Comparisons’ [HadCRUT4 & RSS]

Steve McIntyre, posted on Jan 5, 2016 at 12:27 PM

Richard C (NZ)

‘A TSI-Driven (solar) Climate Model’ February 8, 2016 by Jeff Patterson “The fidelity with which this model replicates the observed atmospheric CO2 concentration has significant implications for attributing the source of the rise in CO2 (and by inference the rise in global temperature) observed since 1880. There is no statistically significant signal of an anthropogenic contribution to the residual plotted in Figure 3c. Thus the entirety of the observed post-industrial rise in atmospheric CO2 concentration can be directly attributed to the variation in TSI, the only forcing applied to the system, whose output accounts for 99.5% (r2 = 0.995) of the observational record. How then does this naturally occurring CO2 impact global temperature? To explore this we will develop a system model which, when combined with the CO2 generating system of Figure 4, can replicate the decadal scale global temperature record with impressive accuracy. Researchers have long noted the relationship between TSI and global mean temperature.[5] We hypothesize that this too is due to the lagged accumulation of oceanic heat content, the delay being perhaps the transit time of the thermohaline circulation. A system model that implements this hypothesis is shown in Figure 5.” “The… Read more »

Richard C (NZ)

STATISTICAL FORECASTING: How fast will future warming be? Terence C. Mills © Copyright 2016 The Global Warming Policy Foundation. Summary: The analysis and interpretation of temperature data is clearly of central importance to debates about anthropogenic global warming (AGW). Climatologists currently rely on large-scale general circulation models to project temperature trends over the coming years and decades. Economists used to rely on large-scale macroeconomic models for forecasting, but in the 1970s an increasing divergence between models and reality led practitioners to move away from such macro modelling in favour of relatively simple statistical time-series forecasting tools, which were proving to be more accurate. In a possible parallel, recent years have seen growing interest in the application of statistical and econometric methods to climatology. This report provides an explanation of the fundamental building blocks of so-called ‘ARIMA’ models, which are widely used for forecasting economic and financial time series. It then shows how they, and various extensions, can be applied to climatological data. An emphasis throughout is that many different forms of a model might be fitted to the same data set, with each one implying different forecasts or uncertainty levels, so readers should understand the… Read more »
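The simplest building block of the ‘ARIMA’ family described in the report is an AR(1) process, x_t = c + phi*x_(t-1) + e_t. Fitting one by least squares and forecasting with horizon-widening uncertainty can be sketched in a few lines (synthetic data; the parameter values are arbitrary illustrations, not fits to any temperature record):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a synthetic AR(1) series: x_t = c + phi * x_(t-1) + noise
phi_true, c_true, sigma = 0.7, 0.1, 0.2
x = np.zeros(500)
for t in range(1, x.size):
    x[t] = c_true + phi_true * x[t - 1] + rng.normal(0.0, sigma)

# Least-squares fit of c and phi from lagged pairs
X = np.column_stack([np.ones(x.size - 1), x[:-1]])
c_hat, phi_hat = np.linalg.lstsq(X, x[1:], rcond=None)[0]
resid_sd = np.std(x[1:] - X @ np.array([c_hat, phi_hat]))

# h-step-ahead forecast; its variance accumulates as
# sigma^2 * (1 + phi^2 + ... + phi^(2(h-1))), so intervals widen with horizon
forecasts, last = [], x[-1]
for h in range(1, 11):
    last = c_hat + phi_hat * last
    se = resid_sd * np.sqrt(np.sum(phi_hat ** (2.0 * np.arange(h))))
    forecasts.append((last, se))

print(f"fitted phi = {phi_hat:.2f}, c = {c_hat:.2f}")
print(f"1-step forecast: {forecasts[0][0]:.2f} +/- {2 * forecasts[0][1]:.2f}")
```

This is the report’s central caution in miniature: the same data admit many model forms, and each implies its own forecast and uncertainty band.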

Richard C (NZ)

MUST READ (yes MUST) Gavin Schmidt and Reference Period “Trickery” Steve McIntyre, posted on Apr 19, 2016 In the past few weeks, I’ve been re-examining the long-standing dispute over the discrepancy between models and observations in the tropical troposphere. My interest was prompted in part by Gavin Schmidt’s recent attack on a graphic used by John Christy in numerous presentations (see recent discussion here by Judy Curry). Schmidt made the sort of offensive allegations that he makes far too often: @curryja use of Christy’s misleading graph instead is the sign of partisan not a scientist. YMMV. tweet; @curryja Hey, if you think it’s fine to hide uncertainties, error bars & exaggerate differences to make political points, go right ahead. tweet. As a result, Curry decided not to use Christy’s graphic in her recent presentation to a congressional committee. In today’s post, I’ll examine the validity (or lack thereof) of Schmidt’s critique. Schmidt’s primary dispute, as best as I can understand it, was about Christy’s centering of model and observation data to achieve a common origin in 1979, the start of the satellite period, a technique which (obviously) shows a greater discrepancy at the end… Read more »
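McIntyre’s centering point can be demonstrated directly: aligning a model series and an observed series at a single start year, versus over a multi-year reference period, changes the apparent gap at the end of the record. A deterministic toy example (the trend values are invented, not Christy’s or Schmidt’s data):

```python
import numpy as np

years = np.arange(1979, 2016)

# Noiseless toy trends (deg per year), chosen so the "model" warms faster
# than the "observations"; invented numbers, not real data
model = 0.025 * (years - 1979)
obs = 0.012 * (years - 1979)

def end_gap(ref_mask):
    """Model-minus-obs gap in the final year, after subtracting each
    series' mean over the chosen reference period."""
    m = model - model[ref_mask].mean()
    o = obs - obs[ref_mask].mean()
    return m[-1] - o[-1]

origin_1979 = years == 1979                   # common origin at start year
baseline = (years >= 1979) & (years <= 2008)  # 30-year reference period

print(f"end gap, 1979 origin:        {end_gap(origin_1979):.3f} K")
print(f"end gap, 1979-2008 baseline: {end_gap(baseline):.3f} K")
# Centering at a single start year yields the larger apparent divergence
```

The trend difference is identical in both cases; only the visual offset at the end of the record changes with the centering choice.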

Richard C (NZ)

Gareth S. Jones (UK Met Office, Jones, Lockwood, and Stott (2012) cited AR5 Chap 9 Radiative Forcing, contributing author AR5 Chap 10 Detection and Attribution) Tweets:

Gareth S Jones ‏@GarethSJones1

Update of comparison of simulated past climate (CMIP5) [RCP4.5] with observed global temperatures (HadCRUT4)

[See graph]


# # #

Schmidt chimes in. Jones’ tweet was obviously meant to get everyone fizzed up (except Barry Woods in the thread) because the El Nino spike lies in the central 50% red zone of the climate models.

Except that the ENSO-neutral data is OUTSIDE the red zone. The spike will be back down again before the end of the year, ahead of an impending La Nina. As Barry Woods puts it:

“An El Nino Step Up, in temps, or a peak to be followed by cooler years? (for a few yrs)”

We won’t have to wait long to find out. Some climate scientists, led by Schmidt, are headed for a fall, I’m picking.

Richard C (NZ)

For the record in conjunction with post: ‘IPCC Ignores IPCC Climate Change Criteria’ (not published yet as of this comment date) The earth’s energy budget has no LW flux into the surface once the net of OLR and DLR is arrived at (-52.4 W.m-2). Energy accumulation at the surface, i.e. the surface imbalance (+0.6), is therefore simply the residual of solar ingress after all egress is subtracted. Solar ingress is the greater (+188 vs -187.4). LW nomenclature in ocean surface energy budgets varies a little in oceanography papers, but the only radiative LW energy transfer flux tabulated by the definitive Fairall et al. (1996) paper is “Rnl” (net LW radiation), which is an outgoing transfer (flux) upwards from the surface. But climate models bypass the physics of the AO interface; instead, the models allocate energy transfer at the surface by the IPCC forcing assumption. Proof of this is in IPCC AR4 WG1 Chapter 2 on this page: 2.9.5 Time Evolution of Radiative Forcing and Surface Forcing [see Figure 2.23 Surface Forcing] Figure 2.23. Globally and annually averaged temporal evolution of the instantaneous all-sky RF (bottom panel) and surface forcing (top panel) due to… Read more »
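The surface-imbalance arithmetic quoted above reduces to a one-line residual check (figures taken from the comment, in W.m-2):

```python
# Figures quoted in the comment above, in W.m-2: total solar ingress at the
# surface and total egress (all outgoing fluxes, including the -52.4 net LW)
solar_ingress = 188.0
total_egress = -187.4

# The surface imbalance is simply the residual of ingress plus egress
surface_imbalance = solar_ingress + total_egress
print(f"surface imbalance: {surface_imbalance:+.1f} W.m-2")
```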

Richard C (NZ)

‘Global climate models and the laws of physics’