Chaos Found In Weather Forecast - Same Code, Different Computer, Different Results
Written by Mike James   
Monday, 29 July 2013

We all know that the weather is a chaotic system - recall the origin of the phrase "the butterfly effect". Even so, it comes as a shock to learn that you can take the same model code and run it on different machines and get very different answers.

Notice that we are not talking about differences in initial conditions, which is the usual way that chaotic systems present a problem to computation. Sensitivity to initial conditions is a simple consequence of the equations magnifying small differences as the system evolves.

Consider the problem of predicting where a pencil balanced on its point will fall. Tiny changes in its initial condition, i.e. its initial position, will produce big changes in where it falls. 

The same sensitivity to initial conditions is a characteristic of the equations that control the weather. It is this that makes weather forecasting so difficult. Some initial conditions produce a fairly stable forecast. Think of a pencil that starts tipped over at a large angle. Small changes to its starting position won't modify where it falls too much. Some initial conditions produce a situation that is much more sensitive. Consider a pencil balanced almost perfectly vertical. Now very small changes produce big differences in where it falls. 
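To get a feel for how quickly such sensitivity builds up, here is a minimal Python sketch - a toy Lorenz '63 system, nothing like a full weather model - that runs the same equations twice from starting states that differ by one part in a billion:

# Two runs of the toy Lorenz '63 system from almost identical states.
# This only illustrates sensitivity to initial conditions; it is not a weather model.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz '63 equations
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def run(x0, y0, z0, steps=5000):
    x, y, z = x0, y0, z0
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

print(run(1.0, 1.0, 1.0))
print(run(1.0, 1.0, 1.0 + 1e-9))  # starting state perturbed by one part in a billion

After a few thousand steps the two trajectories bear no resemblance to each other - the pencil has fallen in a completely different direction.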

Weather forecasters generally use a method called "ensemble forecasting". This is a form of Monte Carlo analysis in which the forecast is run a number of times with slightly different starting conditions, and sometimes with different models. The average of the runs gives you a better estimate of the forecast, and the spread of the runs gives you an estimate of how sensitive it is.
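As a rough illustration of the idea - using a toy chaotic map as a stand-in for a real forecast model - an ensemble might be sketched in Python like this:

import random
import statistics

def toy_model(x, steps=200, r=3.9):
    # Stand-in "forecast model": iterate the chaotic logistic map
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

def ensemble_forecast(analysis, members=50, noise=1e-6):
    # Run the model from slightly perturbed copies of the starting state
    finals = [toy_model(analysis + random.gauss(0.0, noise))
              for _ in range(members)]
    return statistics.mean(finals), statistics.stdev(finals)

mean, spread = ensemble_forecast(0.4)
print("ensemble mean:  ", mean)
print("ensemble spread:", spread)  # a large spread means a sensitive, low-confidence forecast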

So far so good. 

However, a new paper puts the cat among the pigeons with results that illustrate that chaos and sensitivity manifest themselves in another surprising way. Song-You Hong (of the Department of Atmospheric Sciences at South Korea's Yonsei University) et al. ran tests using the GRIMs (Global/Regional Integrated Model System) weather model on a range of different systems. Starting from the same initial conditions, they found that the results differed by about as much as those of a traditional ensemble prediction. That is, the same initial conditions and the same model code produced different results on different machines with different operating systems. The suggestion is that numerical rounding errors slowly accumulate to produce the observed divergence, and that the rounding errors depend on the hardware and the system software.

It is important to realize that the computations were parallelized, so we might even get differences due simply to the order of computation. Even so, what is surprising is that the effect of these differences is as great as that produced by varying the initial conditions in an ensemble forecast on a single machine/system. The runs differed in hardware architecture, Fortran compiler, parallel library and optimization level.
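The order effect is easy to demonstrate because floating point addition isn't associative - summing the same numbers in a different order, which is exactly what different machines, compilers and thread counts produce, gives answers that differ in the last few bits. A minimal Python sketch, standing in for the Fortran reductions inside a real model:

import random

random.seed(1)
values = [random.uniform(-1.0, 1.0) for _ in range(100000)]

forward = sum(values)                 # one reduction order
backward = sum(reversed(values))      # same numbers, opposite order

# A mock parallel reduction: partial sums per "thread", then combined
chunks = [sum(values[i:i + 1000]) for i in range(0, len(values), 1000)]
chunked = sum(chunks)

print("forward - backward:", forward - backward)  # typically a small non-zero value
print("forward - chunked: ", forward - chunked)

In a chaotic model those last few bits are then amplified in exactly the same way as a perturbed initial condition.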

The results apply to a ten-day forecast, but their relevance to the climate change argument hasn't gone unnoticed - or should that be unexploited? Various news reports are emphasising the angle that such results put climate change predictions, which forecast decades ahead, in serious doubt. Some are even going as far as to claim that global warming is just a rounding error.

Anyone aware of the difficulties of keeping floating point calculations under control will not be so surprised by the results, and we should all have been skeptical of the accuracy implied by long term climate models, in which there is much more to be uncertain about than rounding errors. Long term climate models have to take into account earth system factors, such as the carbon cycle, and are much more difficult to get right from the point of view of missing or inaccurate mechanisms.

The paper is behind a paywall erected by the American Meteorological Society, even though the work was funded by the public. It is time to stop this scandal, which in this case can only contribute to the misunderstanding of climate change by limiting access.

Sometimes a forecast is ensemble stable
Photo: A View Of Madrid

 

More Information

Song-You Hong, Myung-Seo Koo, Jihyeon Jang, Jung-Eun Esther Kim, Hoon Park, Min-Su Joh, Ji-Hoon Kang, and Tae-Jin Oh
Monthly Weather Review, 2013
doi: http://dx.doi.org/10.1175/MWR-D-12-00352.1

  

Related Articles 

The Programmer's Guide to Chaos       

Data Analytics for Weathering Storms       

Weather API platform launched       

Google Maps Weather Layer API       

Floating Point Numbers       

Going Further Into Complexity With Santa Fe Institute       

The Programmer's Guide to Fractals  

The Monte Carlo Method            

 


Last Updated ( Monday, 29 July 2013 )