My name is Ria Persad, and I am the President of StatWeather. For the last 13 years, I have been tracking forecaster accuracy--both publicly available weather forecasters and private vendors. Accurate weather forecasts are vital to the energy and commodities sectors where a one degree shift can mean differentials of millions of dollars.
When I first started tracking forecaster accuracy, The Weather Channel and AccuWeather were tied for the bottom of the heap. Over the years, they have both improved their forecasting and are now frequently more accurate than even the private vendors.
These days, it is really a toss-up whether the public or private forecasters are more accurate. In certain weeks or months, certain forecasters will lead the pack. In other months, they will all demonstrate considerable bias and help throw off an entire market. StatWeather tracks these shifts so that companies can recognize arbitrage opportunities in the marketplace.
Short-range forecasts in the 1- to 5-day range are, on average for the United States, about 2.5 degrees off (give or take), which means a 5-degree error range. That is only an average; the daily error can be as much as 10 or 15 degrees. In my 13 years of tracking, I have seen major forecasters deviate by as much as 20+ degrees on a day-ahead forecast. I have also seen forecasters run consistently 10 degrees too warm for a particular city for an entire month straight.
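The two numbers behind this kind of tracking are the average size of the miss (mean absolute error) and the consistent warm or cold lean (bias). A minimal sketch of how either could be computed from daily forecast/actual pairs -- all temperatures here are invented for illustration, not StatWeather data:

```python
# Minimal sketch of forecast-accuracy tracking (illustrative numbers only).
# Each pair is (forecast_degF, actual_degF) for one day. From these we
# compute mean absolute error (average size of the miss) and bias
# (signed average miss; positive = forecasts ran too warm).

def score_forecasts(pairs):
    errors = [f - a for f, a in pairs]
    mae = sum(abs(e) for e in errors) / len(errors)
    bias = sum(errors) / len(errors)
    return mae, bias

# A hypothetical week of day-ahead forecasts vs. observed highs:
week = [(72, 70), (68, 71), (75, 73), (80, 77), (65, 66)]
mae, bias = score_forecasts(week)
print(f"MAE:  {mae:.1f} degF")    # -> MAE:  2.2 degF
print(f"Bias: {bias:+.1f} degF")  # -> Bias: +0.6 degF
```

A month of day-ahead forecasts that were "too warm by 10 degrees" would show up here as a bias near +10 -- which is exactly the kind of systematic error a single anecdote never reveals.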
This brings me to another point: do public or private forecasters check themselves? The answer is that not enough of them do; those that do are not consistent about it; and fewer still release this information to their clients or to the public.
If you were to hire a baseball player for a world-class team, wouldn't you want to know their batting average? Smart companies keep a daily record of how the forecasts they use are performing--including their in-house support. When millions of dollars hinge upon a few degrees' difference in temperature, this information is vital.
In the early 2000s, I used to track the accuracy of the in-house meteorology support at Duke Energy and compare it against the private vendors. The in-house support would be sweating bullets every time I ran these reports, but in the end, the in-house support was more accurate than the vendors. It gave the traders reassurance that they could rely on their in-house support.
No private forecaster should be trusted without the hard-and-fast mathematical facts of their performance. Anecdotal evidence is not evidence, and nothing can really replace analytics. No matter how friendly, how cocky, or how popular a forecaster might be, their accuracy might not be any better than a free forecast.
So in all fairness, what is StatWeather's accuracy? StatWeather specializes in long-range forecasting, and over a 10-year period of record, our year-ahead forecasts have been as accurate as a 1- to 5-day forecast, which has been a shocking conclusion. In other words, if we forecast 55 degrees Fahrenheit for Cincinnati for next March (say, six months out), the actual temperature would likely range from 53 to 57 degrees. This beats using climatological normals. If a customer wants to see weekly accuracy for any city in the country, the report is there with 100% transparency.
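"Beats using climatological normals" can be put on a number line with a simple skill score: 1 minus the ratio of the forecast's mean absolute error to the climatology's, so that anything above zero beats the normals and 1.0 would be a perfect forecast. A hedged sketch with invented numbers (not StatWeather's actual verification method or data):

```python
# Sketch of a forecast-vs-climatology skill score (illustrative data only).
# skill = 1 - MAE(forecast) / MAE(climatology); values above 0 mean the
# forecast misses by less than simply using the long-term normal.

def mae(predictions, actuals):
    return sum(abs(p - a) for p, a in zip(predictions, actuals)) / len(actuals)

actuals     = [53, 57, 55, 54, 56]  # observed temperatures (degF)
forecasts   = [55, 55, 55, 55, 55]  # hypothetical year-ahead forecast
climatology = [50, 50, 50, 50, 50]  # hypothetical climatological normal

skill = 1 - mae(forecasts, actuals) / mae(climatology, actuals)
print(f"Skill vs. climatology: {skill:.2f}")  # -> Skill vs. climatology: 0.76
```

The same ratio works for any baseline; the point is that a forecaster's value is measured against the free alternative, not in isolation.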
When I started the StatWeather company, I wanted to bring our reporting standards up to the level of common engineering standards such as Six Sigma Black Belt, applying the process-improvement and quality-control standards you might see in the manufacture of space shuttles or weapons systems. Transparency leaves us more vulnerable to criticism, but ultimately, if our system works, companies soon recognize the benefits. If our system isn't working well, then it's back to the drawing board for StatWeather, and we improve our process.
To view StatWeather's accuracy reports for Summer 2012, Winter 2011-2012, or the last 10 years, email email@example.com.