ForecastAdvisor Weather Forecast Accuracy Blog

A Blog Documenting the Intersection of Computers, Weather Forecasts, and Curiosity
 

 

Thursday, December 22, 2005


Best Places To Live If...

In August of this year, we published a fun paper called "Best Places to Live or Work If You Need to Know What the Weather Will Be Like Tomorrow." It was a serious look at how weather forecast accuracy differs depending on where you live. It's still available (for free, I might add) right here.

The paper ranked nearly 700 U.S. cities on two criteria:

  1. How accurate temperature forecasts for the city are
  2. How much temperatures vary day-to-day

The thought was that if you needed to know what tomorrow's weather would bring, you'd want to live somewhere where tomorrow's temperatures are much like today's and where the weather forecasts are the most accurate. Honolulu, Hawaii came in first, while Williston, North Dakota came in last.
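
The paper's exact scoring isn't reproduced here, but the idea is easy to sketch. Below is a minimal, hypothetical Python illustration that ranks a few cities on each criterion and combines the two ranks by summing them; the city numbers, the rank() helper, and the rank-sum combination are all invented for illustration and are not the paper's actual methodology.

    # Hypothetical sketch only: made-up numbers, not the paper's real scoring.
    def rank(values):
        """1-based rank of each value (lowest value gets rank 1)."""
        order = sorted(values)
        return [order.index(v) + 1 for v in values]

    # city -> (mean absolute high-temp forecast error in F,
    #          mean day-to-day high-temp change in F)
    cities = {
        "Honolulu, HI":  (1.8, 1.6),
        "Campo, CA":     (4.2, 3.1),
        "Williston, ND": (4.8, 9.5),
    }

    names = list(cities)
    error_rank = rank([cities[c][0] for c in names])  # forecast accuracy criterion
    swing_rank = rank([cities[c][1] for c in names])  # day-to-day variability criterion

    # One simple way to combine the criteria: sum the two ranks (lower is better).
    combined = sorted(zip(names, (e + s for e, s in zip(error_rank, swing_rank))),
                      key=lambda pair: pair[1])
    for name, score in combined:
        print(name, score)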

Not included in the original paper is the following map, which shows the results graphically. The more blue an area is, the less predictable weather it has. The more red, the more predictable. The dots are the cities in the report.

(Click here for a larger version)

The map graphically shows that the Southeast and far West are the most predictable, mainly due to the moderating effect of large bodies of water nearby. The upper Plains are the least predictable, as they are often the battleground between cold air from the north and warm air from the Gulf, which makes wide temperature swings common and keeps weather forecasters up at night. Dr. John W. Enz, North Dakota's state climatologist, noted that temperature variation is perhaps the most important feature of North Dakota's climate.

There are a few oddities. For example, notice the dot of cyan in a sea of red in southern California near the Mexican border. That's Campo, California (overall rank 416). The city ranks 126th and 196th for high and low temperature persistence (in the top third), but nearly at the bottom for high and low temperature forecasting at 682nd and 681st. It looks like both the National Weather Service and Accuweather are having a hard time forecasting for the city. However, the other forecasters, including The Weather Channel, Intellicast, and CustomWeather, seem to be forecasting fine. Compare this to nearby San Diego, California, where Accuweather and the National Weather Service seem to be doing fine. San Diego is 13th and 4th for high and low temperature persistence, and 42nd and 40th for high and low temperature forecast accuracy.

permalink

Monday, December 12, 2005


Home Field Advantage

Do weather forecasters do better forecasting their "local" area than a distant city, all other things being equal? Does being "local" improve a weather forecast? These are questions I occasionally hear, and I've heard both sides. Proponents say that knowing the quirks of local weather helps a forecaster improve his or her forecast, and that the only way a forecaster can learn those quirks is by living with them. Opponents say that computers do much of the work of forecasting, and that weather works on a large enough scale that the location of the actual forecaster doesn't much matter.

I thought I would look at the ForecastWatch database to see if it could help me answer the question. ForecastWatch collects data on over 800 U.S. and international cities for the major forecasters: Accuweather (located in State College, PA), Weather Channel (located in Atlanta, GA), CustomWeather (located in San Francisco, CA), Intellicast (located in Andover, MA), and the National Weather Service, with many centers around the U.S. Intellicast and The Weather Channel are owned by the same company.

I decided to look at one- to three-day-out high temperature forecast accuracy for all of this year so far. There were 800 U.S. cities where accuracy was calculated. In each city, the forecasters were ranked by their average absolute high temperature error. The National Weather Service came in first in 252 of the 800 cities. That means the NWS had the lowest average absolute error in high temperature forecasting this year (2005, through October) in those 252 cities. Next was The Weather Channel with 248, Intellicast with 125, Accuweather with 122, and finally CustomWeather with 53 cities.
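
For the curious, here is a rough Python sketch of that win count. The forecasts list, its field layout, and the toy rows are hypothetical stand-ins; this is not ForecastWatch's actual database schema or scoring code.

    # Hedged sketch of the city-by-city win count described above.
    from collections import defaultdict

    # (city, provider, forecast_high_F, observed_high_F), one row per verified
    # one- to three-day-out high temperature forecast (toy rows only)
    forecasts = [
        ("Columbus, OH", "NWS", 41, 43),
        ("Columbus, OH", "The Weather Channel", 44, 43),
        # ...
    ]

    # Average absolute high temperature error per (city, provider)
    errs = defaultdict(list)
    for city, provider, predicted, actual in forecasts:
        errs[(city, provider)].append(abs(predicted - actual))

    # The provider with the lowest average error "wins" each city
    best = {}
    for (city, provider), e in errs.items():
        mae = sum(e) / len(e)
        if city not in best or mae < best[city][1]:
            best[city] = (provider, mae)

    # Count how many cities each provider won
    wins = defaultdict(int)
    for provider, _ in best.values():
        wins[provider] += 1
    print(dict(wins))  # {"The Weather Channel": 1} for the toy rows above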

But that's not really informative. What we really want to see is who won each forecaster's "home" state. Accuweather was founded and has deep roots in State College, Pennsylvania. Many of its meteorologists came up through the renowned Penn State meteorology program. If there is a "local" advantage to meteorology, Accuweather should have it for Pennsylvania. And indeed that is the case. Of the 23 cities tracked in Pennsylvania, Accuweather came out on top with the least error for 15 of those cities. So while Accuweather came out on top in only 15% of the cities nationwide, it won 65% of the cities in Pennsylvania. Accuweather also came out on top in neighboring Maryland, and tied with The Weather Channel in New Jersey.
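
The same winner-per-city idea, cut by state, is what produces that 65% versus 15% comparison. Here is a small, self-contained Python sketch; the winners dictionary, the state_of mapping, and the win_share() helper are made up for illustration and are not the real ForecastWatch results.

    # Illustrative only: made-up winners, used to show the home-state comparison.
    winners = {
        "Pittsburgh, PA": "Accuweather",
        "Philadelphia, PA": "Accuweather",
        "Erie, PA": "NWS",
        "Columbus, OH": "NWS",
        "Cleveland, OH": "The Weather Channel",
    }
    state_of = {city: city.rsplit(", ", 1)[1] for city in winners}

    def win_share(provider, state=None):
        """Fraction of tracked cities (optionally within one state) won by provider."""
        tracked = [c for c in winners if state is None or state_of[c] == state]
        won = sum(1 for c in tracked if winners[c] == provider)
        return won / len(tracked) if tracked else 0.0

    print(win_share("Accuweather", "PA"))  # home-state share (0.67 here; ~0.65 in the post)
    print(win_share("Accuweather"))        # nationwide share (0.4 here; ~0.15 in the post)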

How about The Weather Channel? Headquartered in Atlanta, and with most of their meteorological staff there, would they come out tops in Georgia? They do! While they get the blue ribbon in 31% of the cities nationwide, they come out on top in about half the cities in Georgia, leading all providers with 8 of the 17 cities tracked (the NWS is next with 5). So it looks like they have a home field advantage as well.

Intellicast should win Massachusetts then, right? Wrong. It does rather poorly there, and also in New Hampshire, which is close to their headquarters in Andover. They are owned by the same company as The Weather Channel. Perhaps there is little local meteorological support in Andover? Or maybe there really isn't a "local" weather advantage. Maybe the statistics are just coincidental.

Let's look at CustomWeather, then. CustomWeather is the smallest of the tracked forecasters, but is very competitive in a lot of statistics. Unfortunately, they led in less than 7% of the cities, and didn't win their home state of California. But wait...they dominate Alaska and Hawaii, states probably forgotten about by the East Coast forecasters. CustomWeather won 75% of Alaska's 18 tracked cities, and 80% of Hawaii's five. In fact, 24 of the 53 cities CustomWeather came out on top in (almost half) are in Alaska, Hawaii, and California. Not home field advantage, but pretty close.


United States map showing where weather forecasters had the most sites ranked number one in high temperature accuracy for one to three days out

(Click here for a larger version)

I don't think this data proves anything, but it is interesting. Maybe there is something to the assertion that being local gives weather forecasters an edge.

permalink
 

 