Another View on Climate

My Own View of Global Warming

Archive for the ‘Models’ Category

Models, Predictions, IPCC

Posted by greg2213 on November 9, 2011

Update 7/25/12: Just a note: By its actions, the IPCC Admits Its Past Reports Were Unreliable. There seems to be some indication that the IPCC reports were a little more advocacy and a little less science.

Original post:

A key issue with climate models is whether they have any capability to actually predict anything, such as the behavior of the amazingly complicated climate system of this planet.

The hysterics place great value on the models, particularly those that produce the “We’re all gonna die!!!” results.

Willis Eschenbach on WUWT posts some thoughts on scientific models in general:

There’s a lovely 2005 paper I hadn’t seen, put out by the Los Alamos National Laboratory entitled “Our Calibrated Model has No Predictive Value” (PDF).


The paper’s abstract says it much better than I could:

Abstract: It is often assumed that once a model has been calibrated to measurements then it will have some level of predictive capability, although this may be limited. If the model does not have predictive capability then the assumption is that the model needs to be improved in some way.

Using an example from the petroleum industry, we show that cases can exist where calibrated models have no predictive capability. This occurs even when there is no modelling error present. It is also shown that the introduction of a small modelling error can make it impossible to obtain any models with useful predictive capability.

We have been unable to find ways of identifying which calibrated models will have some predictive capacity and those which will not.

There are three results in there, one expected and two unexpected.

Interesting stuff. Read it all (and some 300 comments) here: A Modest Proposal—Forget About Tomorrow.

Another post asks:

Are Computer Models Reliable – can they be used to predict the future climate?

No, says the IPCC (Chapter 14, Working Group 1, The Scientific Basis):

Third Assessment Report: “In sum, a strategy must recognise what is possible. In climate research and modeling, we should recognize that we are dealing with a coupled nonlinear chaotic system, and therefore that long-term prediction of future climate states is not possible.”


All the computer models do is produce scenarios based on the assumptions and the limited understanding of climate programmed into them. They do not do clouds, some of the assumptions are estimates that vary by orders of magnitude, and we have different scenarios ranging from 1.0 C to 6.0 C, with some scarier scenarios of 10 to 12 C, for the next century. These assume a ‘climate sensitivity’: ‘feedbacks’ due to CO2 which amplify the simple physics of ~1.0 C per doubling of CO2 in the atmosphere into the various accelerated global warming scenarios produced by positive feedbacks… Yet the order of magnitude of the sensitivity is unknown.

In fact the sign is unknown: it is assumed to be positive, but it could equally be a negative feedback; it is currently uncertain.

I do hope that some scientists somewhere are doing some actual experimentation with observable data to establish this. The fact that the Earth has not experienced this behaviour in the past, at points when CO2 levels were much higher naturally, would seem to demonstrate that these assumptions of strong positive feedbacks are wrong.

The original post, well worth reading: Are Computer Models Reliable – Can they Predict Climate?
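The feedback arithmetic the quoted passage alludes to can be made concrete with the standard linear-feedback relation ΔT = ΔT0 / (1 − f), where ΔT0 is the no-feedback warming per CO2 doubling (~1.0 C, per the quote) and f is the net feedback factor whose size, and even sign, is in dispute. This is a minimal sketch, not anything from the original post; the f values below are purely illustrative:

```python
# Sketch of how the assumed feedback factor f changes projected warming
# per doubling of CO2, using the standard relation dT = dT0 / (1 - f).
# The ~1.0 C no-feedback value is from the quoted post; the f values
# are hypothetical examples, not measured quantities.

DT0 = 1.0  # no-feedback warming per CO2 doubling, degrees C

def warming_per_doubling(f):
    """Equilibrium warming per doubling, given net feedback factor f (f < 1)."""
    if f >= 1:
        raise ValueError("f >= 1 implies a runaway response; the relation breaks down")
    return DT0 / (1.0 - f)

for f in (-0.5, 0.0, 0.5, 0.75):
    print(f"f = {f:+.2f} -> {warming_per_doubling(f):.2f} C per doubling")
```

A negative f damps the response below 1.0 C, while f approaching 1 produces the scarier high-end scenarios, which is why the sign and size of the feedback matter so much.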


But then…  a commenter on the WUWT post adds:

“Sir, you say the IPCC does not make predictions. The IPCC says the world is going to warm. I call that a prediction.”

Posted in Models | Leave a Comment »

The Need for Better Theory

Posted by greg2213 on September 11, 2011

The Reference Frame discusses the need for good theory in science. The remarks on climate models, below, are just a small part of an essay that covers black holes, quantum mechanics, the need for a better understanding of math and theory among “we the people,” and a few other things.

So the reliance on the climate models is due to a shortage of proper theory, not an excess of it. Those people just don’t understand the things themselves. But they think that if they have an access to expensive computers, these computers may compensate for their personal ignorance. Except that they can’t.

The machines aren’t miraculous and the programs were written by some people. If you can’t do certain calculations without a computer, not even approximately, you won’t even be able to design the tests that will decide whether the models behave properly (at least not when you only claim that your models only reproduce some overall properties of a chaotic system).

A religious belief that the model is omniscient won’t help you. If the model is wrong, other people – better theorists than you – may ultimately see that the model is wrong, regardless of the strength of your beliefs. And if you believe some things despite the evidence, then you are a demonstrable bigot.

Posted in Models, Scientists Say | Leave a Comment »

Models are the Truth

Posted by greg2213 on June 10, 2011

Since the Warm Side believes that models have predictive power, over and above what real data tell us, we should use models for many more things. This quote puts it nicely:

If we can rely so squarely on computer modeling to confirm global weather patterns decades into the future then why not expand this miraculous technology into every aspect of scientific and safety testing?

Why conduct expensive drug trials for example, when we could simply create computer models to demonstrate if a new drug is safe?

Why worry about physically testing new materials for safety at all? New car crash tests, for example, could be modeled instead. Better yet, let’s allow the car companies to generate their own crash test computer models based on data they collected themselves with no oversight. Then the car companies could submit their models to prove a new vehicle’s safety. There would be no need to verify their data; the computer models would tell us everything we needed to know.

source: Global Warming Hoax Weekly Round-Up, June 9th 2011

Posted in Models | Leave a Comment »

But it Worked in the Simulation!

Posted by greg2213 on June 7, 2011

One of the issues in climate science, or any other science, is the belief in the perfection of the models. Inconvenient real-world data is often ignored or downplayed if it doesn’t match the model.

Academia is particularly prone to this. So many beautiful theories; how could they possibly be wrong? Yet so many of those beautiful, well-thought-out theories die a gruesome death when exposed to the real world.

It’s not because the creators of the theories are necessarily evil, wrong-headed, incompetent, or anything. It’s because the real world is sloppy, messy, chaotic, non-linear, and vastly more complex than the theories. So they break down.

The models and theories don’t work because they can’t. Not right off the bat, anyway, and not without lots of testing and comparing to what the real world tells us.

There’s a wonderful post which discusses this issue:

People often assume that theoretical mathematical calculations and computer simulations will work well enough that machines or experiments will work successfully the first time or at most within a few tries (or similar levels of performance in other contexts). This belief is often implicit in the promotion of scientific and engineering megaprojects such as the NASA Ares/Constellation program or CERN’s Large Hadron Collider (LHC).

One of the reasons for this belief is the apparent success of theoretical mathematical calculations and primitive computer simulations during the Manhattan Project which invented the first atomic bombs in World War II, as discussed in the previous article “The Manhattan Project Considered as a Fluke”.

This belief occurs in many contexts. In the debate over the Comprehensive Test Ban Treaty (CTBT) which bans all nuclear tests on Earth, proponents (sincerely or not) argued that sophisticated computer simulations could substitute for actual tests of nuclear weapons in the United States nuclear arsenal.

After the terrorist attacks of September 11, 2001, federal, state, and local government officials apparently decided to dispose of most of the wreckage of the World Trade Center and rely on computer simulations to determine the cause of the three major building collapses that occurred (instead of physically reconstructing the buildings as has been done in other major accident investigations).

Space entrepreneur Elon Musk apparently believed he could achieve a functioning orbital rocket on the first attempt; he did not succeed until the fourth attempt, recreating a known but extremely challenging technology. This article discusses the many reasons why theoretical mathematical calculations and computer simulations often fail, especially in frontier engineering and science where many unknowns abound.

The rest is here: But It Worked in the Computer Simulation!

Original link and interesting discussion on WUWT.

One commenter says:

Zero defect or 6 sigma techniques can be applied to a lot of complex issues to greatly reduce error rates, but seem to be little known in the scientific community.

In making complex microchips, with hundreds of process steps, each subject to some degree of variation, thousands of chips per wafer and low millions of transistors or gates per chip, we learned how to ship end products at defect rates below 3 parts per million.

It was impossible to test in this level of outgoing quality, it had to be designed and built in, with every step in design and process involved. When we applied the same techniques to accounting we lowered journal entry defects by 4 orders of magnitude in less than one year.

I don’t know how well these methods would apply to writing software, but have to believe they could be very helpful.

Posted in Models | Tagged: | Leave a Comment »

A Climate Model That Works

Posted by greg2213 on May 2, 2011

Looks like they finally got their acts together.

In a shock result, a new climate model produced results that make sense. The new NCTCCFAFM* model shows that future projected temperatures are closely tied to…

Go here: Climate model finally produces meaningful results

Posted in Humor, Models | Leave a Comment »

No, The Climate Models Can’t Do That

Posted by greg2213 on May 1, 2011

From the amazing Willis E at WUWT, a James Hansen quote on everyone’s favorite models:

Total solar irradiance (TSI) is the dominant driver of global climate, whereas both natural and anthropogenic aerosols are climatically important constituents of the atmosphere also affecting global temperature. Although the climate effects of solar variability and aerosols are believed to be nearly comparable to those of the greenhouse gases (GHGs; such as carbon dioxide and methane), they remain poorly quantified and may represent the largest uncertainty regarding climate change. …

The analysis by Hansen et al. (2005), as well as other recent studies (see, e.g., the reviews by Ramaswamy et al. 2001; Kopp et al. 2005b; Lean et al. 2005; Loeb and Manalo-Smith 2005; Lohmann and Feichter 2005; Pilewskie et al. 2005; Bates et al. 2006; Penner et al. 2006), indicates that the current uncertainties in the TSI and aerosol forcings are so large that they preclude meaningful climate model evaluation by comparison with observed global temperature change. These uncertainties must be reduced significantly for uncertainty in climate sensitivity to be adequately constrained (Schwartz 2004).

Here’s the rest: Reality Leaves A Lot To The Imagination

As Mr. E. says, “…it does make it clear that at this point the models are not suitable for use as the basis for billion dollar decisions.”

But they are suitable if your goals are power and money.

More on the Subject:

A poll of climate scientists, working within climate research institutes, has some interesting results. Most of them feel that the models aren’t quite there yet: fewer than 3 or 4 percent said they “strongly agree” that computer models produce reliable predictions of future temperatures, precipitation, or other weather events.

Posted in Models | Tagged: | Leave a Comment »

The Models Fail, Simple Physics, & a Primer on the Oceans

Posted by greg2213 on December 6, 2010

No, they don’t accurately hindcast (retrocast) and they don’t accurately predict, and it’s probably because such predictions are impossible due to the extreme number of variables and the chaotic nature of the system.

So here’s the discussion on WUWT

Update: While the “model” below is interesting, to me, WUWT has a new post on simple physics and how it fails with regard to the climate discussion.

But as every engineer knows, these simple laws often do not work when reality gets messy, as it usually is. Simple physics says that if I drop a ball and a feather they will fall at the same rate.

In reality my feather blew up into a tree.

Here’s the rest: Simple Physics – In reality my feather blew up into a tree

Update: This also explains the “simple physics” quite nicely:


And yes, “simple physics” adds disclaimers along the lines of “…the same rate, in a vacuum…” This, of course, proves the point that there is nothing simple about the climate system. It’s entirely possible that climate cannot be modeled on anything other than a general scale.


Thought model:

I think we can look at it this way, though this is probably a bad analogy…

Take a jar (or other holding device) of 99 marbles of differing sizes plus one golf ball.

The exact mass of each marble and the ball are known to an arbitrary precision, as are the exact dimensions and height of the jar.

Take a 4×8 plywood sheet, lay it on the floor, and add walls to keep the marbles from rolling off. The exact properties of the sheet are known, as is the height of the walls. Heck, make the sheet marble, titanium, glass, or soft sand.

Place the jar at an exact position and pour the marbles onto the plywood at an exact rate in an exact manner.

Freeze time.

Write a computer program with all the known data and use it to predict the exact final position of the golf ball. Add in the exact position of each of the 100 objects as they start to fall. Heck, predict the exact final positions of all 100 objects.

Restart time.

Wait for the marbles to settle. Check the exact position of the golf ball and marbles.

Do this 10,000 times. How many times was your model correct? Obviously, if the definition of “correct” is “some position on a 32 square foot plywood/marble/sand surface,” then it’ll be correct every time. If your definition of “correct” is a one-inch square, then chances are you’ll be right only a very small number of times, since there are over 4,600 such squares (32 × 144 = 4,608).

We’ll assume that, with the hard surface, the golf ball has an equal chance of landing anywhere on the surface, which may or may not be the case, since it may or may not bounce all over the place. If it isn’t the case, then add the odds to your program. Obviously, with a sand surface it won’t be the case.

With a one-foot square you will likely be right about 10,000/32 ≈ 312 times. With a sand surface you can probably ignore a lot of squares, since the balls won’t roll very much. Increase the precision of the prediction and the number of times that you are “correct” will decline, though you may be “almost” correct often enough. Define “almost.”

Personally, I don’t think modeling climate is any simpler than this. While the surface temperature seems to be bounded to ±10 C or so, there may be more marbles, and you’re trying to predict an exact outcome. Good luck.
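The hit-rate arithmetic above can be checked with a toy Monte Carlo, using the text’s own simplifying assumption that on a hard surface the golf ball lands uniformly at random on the 4×8 sheet. The grid sizes and the 10,000 trials come from the text; everything else is illustrative:

```python
import random

# Toy Monte Carlo for the marble thought experiment: if the golf ball
# lands uniformly at random on a 4 ft x 8 ft sheet, how often does a
# fixed predicted cell of a given size contain it? This assumes the
# uniform-landing simplification made above for the hard surface.

random.seed(42)
WIDTH_IN, HEIGHT_IN = 96.0, 48.0  # 8 ft x 4 ft sheet, in inches
TRIALS = 10_000

def hit_rate(cell_inches):
    """Fraction of trials in which the ball lands inside one predicted cell."""
    hits = 0
    for _ in range(TRIALS):
        x = random.uniform(0, WIDTH_IN)
        y = random.uniform(0, HEIGHT_IN)
        # Predict the cell at the origin; by uniformity, any cell is equivalent.
        if x < cell_inches and y < cell_inches:
            hits += 1
    return hits / TRIALS

print(f"1-inch cell: ~{hit_rate(1.0):.4%} of trials (theory: 1/4608)")
print(f"1-foot cell: ~{hit_rate(12.0):.4%} of trials (theory: 1/32)")
```

The simulated rates land near the 1/4,608 and 1/32 figures computed above, and tightening the cell size shrinks the hit rate in proportion to its area.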

So why not take a flying triple-spin leap? My prediction is that, until such time as we really drop into the next ice age, the global climate number will be within ±2 C of where it is now, with +2 being vastly better for humankind (and plant/critter kind) than −2, and also better than it is now.

Ok, enough silliness. Back to work.


The Earth is essentially a water planet.

Over 70% of its surface area is covered by oceans, seas, and lakes, while a further 5% or so is covered by glaciers and ice caps resting on land areas. More than two-thirds of this water area is located in the southern hemisphere, and the ocean masses are typically 4 to 5 kilometres deep. With the Earth being over 75% covered by water in one form or another, it follows that the response of this 75% to any increase in greenhouse gases will be decisive in determining to what extent a warming, if any, will occur.

The atmosphere cannot warm until the underlying surface warms first. This is because the transparency of the atmosphere to solar radiation (which is a key element in the greenhouse warming scenario) prevents the lower atmosphere itself being significantly warmed by direct sunlight alone.

The surface atmosphere therefore gets its warmth from direct contact with the oceans, from infra-red radiation off its surface being absorbed by greenhouse gases, and from the removal of latent heat from the ocean through evaporation. This means, therefore, that the temperature of the lower atmosphere is largely determined by the temperature of the ocean. In other words, it is necessary for the oceans to warm up first before the overlying atmosphere can warm.


On the ocean being a giant heat engine: The Constructal Law of Flow Systems

That article is based, in part, on this paper.

Posted in Models | Leave a Comment »