But it Worked in the Simulation!
Posted by greg2213 on June 7, 2011
One of the issues in climate science, or any other science, is faith in the perfection of the models. Inconvenient real-world data is often ignored or downplayed when it doesn't match the model.
Academia is particularly prone to this. So many beautiful theories; how could they possibly be wrong? Yet many of those beautiful, well-thought-out theories die a gruesome death when exposed to the real world.
It's not that the creators of the theories are necessarily evil, wrongheaded, or incompetent. It's that the real world is sloppy, messy, chaotic, non-linear, and vastly more complex than the theories. So the theories break down.
The models and theories don't work because they can't. Not right off the bat, anyway, and not without lots of testing and comparison against what the real world tells us.
There’s a wonderful post on math-blog.com which discusses this issue:
People often assume that theoretical mathematical calculations and computer simulations will work well enough that machines or experiments will work successfully the first time or at most within a few tries (or similar levels of performance in other contexts). This belief is often implicit in the promotion of scientific and engineering megaprojects such as the NASA Ares/Constellation program or CERN’s Large Hadron Collider (LHC).
One of the reasons for this belief is the apparent success of theoretical mathematical calculations and primitive computer simulations during the Manhattan Project which invented the first atomic bombs in World War II, as discussed in the previous article “The Manhattan Project Considered as a Fluke”.
This belief occurs in many contexts. In the debate over the Comprehensive Test Ban Treaty (CTBT) which bans all nuclear tests on Earth, proponents (sincerely or not) argued that sophisticated computer simulations could substitute for actual tests of nuclear weapons in the United States nuclear arsenal.
After the terrorist attacks of September 11, 2001, federal, state, and local government officials apparently decided to dispose of most of the wreckage of the World Trade Center and rely on computer simulations to determine the cause of the three major building collapses that occurred (instead of physically reconstructing the buildings as has been done in other major accident investigations).
Space entrepreneur Elon Musk apparently believed he could achieve a functioning orbital rocket on the first attempt; he did not succeed until the fourth attempt, recreating a known but extremely challenging technology. This article discusses the many reasons why theoretical mathematical calculations and computer simulations often fail, especially in frontier engineering and science where many unknowns abound.
The rest is here: But It Worked in the Computer Simulation!
Zero-defect and Six Sigma techniques can be applied to many complex processes to greatly reduce error rates, but they seem to be little known in the scientific community.
In making complex microchips, with hundreds of process steps, each subject to some degree of variation, thousands of chips per wafer, and low millions of transistors or gates per chip, we learned how to ship end products at defect rates below 3 parts per million.
It was impossible to test this level of outgoing quality in; it had to be designed and built in, with every step of design and process involved. When we applied the same techniques to accounting, we lowered journal-entry defects by four orders of magnitude in less than a year.
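The arithmetic behind "you can't test quality in" is easy to sketch. With a few hypothetical numbers (300 process steps and a 99.9% per-step yield are illustrative assumptions, not figures from the original process), compounding shows why each individual step must be nearly perfect to hit a 3-ppm outgoing defect rate:

```python
# Why 3-ppm outgoing quality must be built in, not tested in.
# Step counts and yields below are illustrative assumptions only.

def final_yield(per_step_yield: float, steps: int) -> float:
    """Yield after a chain of independent process steps."""
    return per_step_yield ** steps

def required_step_defect(final_defect: float, steps: int) -> float:
    """Per-step defect rate needed so the whole chain meets final_defect."""
    return 1.0 - (1.0 - final_defect) ** (1.0 / steps)

steps = 300  # "hundreds of process steps"

# Even 99.9% yield per step compounds to only ~74% good product overall.
print(final_yield(0.999, steps))

# To ship at 3 defects per million, each step may contribute only
# about one defect per hundred million, i.e. ~0.01 ppm.
print(required_step_defect(3e-6, steps))
```

The point of the sketch is that end-of-line testing cannot rescue a process whose steps are merely "pretty good": the tolerable per-step defect rate is hundreds of times tighter than the final spec.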
I don’t know how well these methods would apply to writing software, but have to believe they could be very helpful.