Published February 13, 2014, 12:15 PM

Grand Forks forecasters talk about how the field has advanced, and how it hasn’t

By: Tu-Uyen Tran, Forum News Service, INFORUM

GRAND FORKS - Click. There’s an elevation map of the Red River Valley, and, believe it or not, the flat valley actually has elevation.

Click. The map shows wind speed and direction.

Click. There are four supercomputer simulations of the next day’s weather for North America.

With the click of a mouse or a few taps on a keyboard, Grand Forks meteorologist Mark Ewens has access to a world of weather data collected by thousands of sensors across the country and around the globe.

For a guy who became a weather forecaster 40 years ago, when he and his colleagues would plot weather data with pens and a paper map, the ability to access all this information is amazing stuff, and he shows it off with obvious pride.

“All of this information is new,” he said, gesturing to the four computer screens in front of him. “All of the stuff we’re seeing on the satellite is new from 40 years ago, just the sheer amount of weather data.”

Though there is the occasional woefully wrong forecast, which Ewens readily admits, the science of weather has advanced by leaps and bounds in the past several decades thanks to the abundance of data and computing power.

Getting it right

Even in the space of a decade, the weather service has reported major improvements. In its State of the National Weather Service report in August 2012, the agency reported that the lead time for its winter storm warnings had increased, on average, from 13 hours in fiscal year 2001 to 20 hours in fiscal year 2011.

Tornado warning lead times have increased from 10 minutes to 15 minutes in that time.

Ewens said before Doppler radars were introduced in the 1990s, meteorologists had a hard time even detecting tornadoes. The small tornadoes that occur in North Dakota were practically invisible to earlier-model radars, and the first clue that one existed was when someone saw it touch down, he said.

Day-to-day forecasts of rain and snow have improved, too, with the agency’s threat score increasing from 26 percent to 34 percent over the same period. The score measures how close the weather service comes to a perfect forecast, one that gets the amount and location of the next day’s precipitation exactly right.
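In standard forecast verification, that threat score is the Critical Success Index: hits divided by the sum of hits, misses and false alarms, so that correctly predicting a dry day neither helps nor hurts. A minimal sketch in Python, using made-up yes/no rain forecasts for ten verification points:

```python
# Threat score (Critical Success Index) on made-up yes/no precipitation
# forecasts. Correct "no" forecasts count neither for nor against the score.
def threat_score(forecast, observed):
    """hits / (hits + misses + false alarms), as a fraction of 1."""
    pairs = list(zip(forecast, observed))
    hits = sum(f and o for f, o in pairs)
    misses = sum(o and not f for f, o in pairs)
    false_alarms = sum(f and not o for f, o in pairs)
    return hits / (hits + misses + false_alarms)

# Ten verification points: did the forecast call for rain, and did it rain?
forecast = [True, True, False, True, False, False, True, False, True, False]
observed = [True, False, False, True, False, True, True, False, True, False]
print(f"threat score: {threat_score(forecast, observed):.2f}")  # 4/(4+1+1) = 0.67
```

A perfect forecast scores 1.0; the agency reports the fraction as a percentage.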

Wealth of data

To appreciate how much weather science has changed, consider what it was like in 1974 when Ewens began his career as a weather forecaster in the Air Force.

There were really no automated weather sensors then. Meteorologists had to gather the information from the instruments themselves, checking thermometers and rain gauges. They’d go outside every hour to look at the sky, identify the clouds and estimate cloud height.

In North Dakota, there were all of eight observation stations, including one in Grand Forks, he said. Today, there are 30 such stations in the state and nearly 1,000 more around the nation, most of them unmanned.

To gather data higher in the atmosphere, the meteorologists of 1974 relied on balloons released twice a day and satellites, as meteorologists do now. But there wasn’t as much data then.

Twice a day, an extra-large fax machine would print out the latest satellite image, and Ewens and his colleagues would use pens to plot their data on top of it. He pointed to a wall where they used to hang the map, now occupied by two large flat-screen displays.

Today, computers gather images from many weather satellites, stitch them into time-lapse videos, layer the images with data and send them over the Internet — the images and videos are even available to the general public.

Data from the upper atmosphere are also now available from sensors mounted on commercial aircraft, providing information around the clock.

Supercomputing

Weather forecasters in 1974 were assisted by computers, as they are today, but the small amount of data available and the relatively low computing power meant the simulations were nowhere near as fine-grained as today’s, according to Ewens and Mark Frazier, the meteorologist in charge at the Grand Forks weather service office.

What the computers do is take the data they have about conditions in specific areas and extrapolate what conditions might be in areas where they don’t have data. Forty years ago, the computers could only simulate general conditions within individual grid squares measuring hundreds of square miles, and they could only do so for a couple of layers of atmosphere.

Today, the weather service’s supercomputers can create models that simulate many layers of atmosphere within grid-squares of 2.5 square miles.
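The weather service’s real machinery for this, data assimilation, is far more elaborate, but the basic fill-in idea can be sketched with something as simple as inverse-distance weighting: a point far from every station gets a blend of all of them, while a point near one station mostly inherits its reading. The stations, readings and grid below are invented for illustration:

```python
# Toy illustration of filling in a grid from scattered observations using
# inverse-distance weighting. The real models use far more sophisticated
# data assimilation; the stations and readings here are made up.
STATIONS = [  # (x_miles, y_miles, temperature_F)
    (0.0, 0.0, 10.0),
    (60.0, 15.0, 14.0),
    (20.0, 80.0, 5.0),
]

def estimate(x, y):
    """Distance-weighted average of the station readings at point (x, y)."""
    num = den = 0.0
    for sx, sy, value in STATIONS:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value  # exactly at a station: use its reading
        num += value / d2
        den += 1.0 / d2
    return num / den

# A coarse 5x5 grid over a 100-by-100-mile box, one estimate per grid square.
for j in range(5):
    print("  ".join(f"{estimate(i * 25.0, j * 25.0):5.1f}" for i in range(5)))
```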

“The model performance today versus the 1980s into the early 1990s is monumental,” said Frazier. “The accuracy we have today, for example, at Day 7 is closer to what we used to have 25 years ago for Days 1 to 3.”

He began with the weather service in 1990, when supercomputers had already taken over the plotting of weather maps from meteorologists.

Besides its own simulations, the weather service also gets simulations from other weather agencies, such as Canada’s and Japan’s, allowing forecasters to pick and choose the ones that most accurately represent current conditions.
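One simple way to picture that picking and choosing: score each simulation against the latest observations and keep the closest match. In the sketch below the model names are real, but the numbers and the root-mean-square-error rule are illustrative assumptions, not the forecasters’ actual procedure:

```python
# Scoring competing model runs against the latest observations and keeping
# the closest match. Model names are real; all numbers and the RMSE-based
# selection rule are illustrative assumptions, not NWS procedure.
observations = [12.0, 14.5, 11.0, 9.5]  # e.g., temperatures at four stations

simulations = {
    "U.S. GFS": [13.0, 15.0, 10.0, 9.0],
    "Canadian GEM": [11.5, 14.0, 11.5, 10.0],
    "Japanese GSM": [15.0, 17.0, 13.0, 12.0],
}

def rmse(predicted, observed):
    """Root-mean-square error of one simulation against the observations."""
    return (sum((p - o) ** 2 for p, o in zip(predicted, observed))
            / len(observed)) ** 0.5

for name, predicted in simulations.items():
    print(f"{name}: RMSE {rmse(predicted, observations):.2f}")
best = min(simulations, key=lambda name: rmse(simulations[name], observations))
print(f"closest to current conditions: {best}")
```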

And all of this is available free of charge to the public and to what Ewens calls the “industrial meteorological community,” by which he means private forecasters ranging from the local TV weatherman to the Weather Channel.

Skills count

But, while the availability of data and supercomputing power has increased dramatically, the skill of the meteorologist still matters.

“The computers are really good but they’re not perfect,” Ewens said.

The grid squares are still not small enough, and hidden within them may be small storms that the computers would dismiss as noise from bad instruments, he said. During severe weather, he said, a meteorologist is always keeping an eye on things to catch what the computers miss.

The simulations don’t always account for local factors either, according to Ewens and Frazier.

For example, the pine forests of Minnesota’s lake country can increase the likelihood of thunderstorms, Ewens said. Thunderstorms form when there is enough moisture, rising air and atmospheric instability, he said, and the moist air rising out of the forests provides two of those factors.

Terrain can also affect the weather, hence the importance of that elevation map of the Red River Valley.

While it might not seem like much of a valley, it is deep enough to act as a funnel: when the wind roars down from the north at the right angle, the funneling increases the wind speed, Ewens said.

Weather models, like people, can be biased, too, and meteorologists pay close attention, according to Frazier.

“How do the models handle that arctic front? Do they handle the wind velocity behind the front correctly? Are they too low? Too high?” he said during a blizzard warning a few weeks ago. “Those are the things you learn, those biases the model has, by forecasting in a certain area for several years.”

More data ahead

As technology marches on, so does the science of weather.

As sophisticated as supercomputers are today, they’re really just looking at layers of atmosphere in two dimensions, according to Frazier. He said he’s looking forward to true three-dimensional modeling, which will require much more computing power.

Weather data are also expected to get better, and perhaps to become even more plentiful.

Frazier said the Doppler radar network may one day be replaced by more sophisticated sensors called phased-array radars, which can scan an area five to six times faster. That makes a difference when a storm is moving 60 mph, he said.
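The arithmetic behind that point is short. At 60 mph a storm covers a mile a minute, so the time a radar needs to complete a full scan sets how far the storm slips between looks; the five-minute figure for a conventional volume scan below is an assumption for illustration:

```python
# How far a storm travels between radar looks. The five-minute volume scan
# for conventional Doppler radar is an assumption for illustration; the
# 5-to-6-times speedup and the 60 mph storm come from the article.
storm_speed_mph = 60.0
scan_minutes = {
    "conventional Doppler": 5.0,
    "phased-array": 5.0 / 5.5,  # roughly five to six times faster
}
for radar, minutes in scan_minutes.items():
    miles = storm_speed_mph * (minutes / 60.0)
    print(f"{radar}: storm moves {miles:.1f} miles between scans")
```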

Ewens said existing sensors don’t do a good job of detecting moisture in three dimensions. Satellites, for example, can only measure whatever moisture they can sense from above in two dimensions, he said. But he has heard of experiments in which signals from the Global Positioning System can be used to measure moisture in three dimensions.

It would be an exponential increase in data requiring yet more supercomputing power to make sense of it all.

But meteorologists will probably remain humble even then because they know they’ll never get it 100 percent right.

“The gentleman who came up with the ‘butterfly effect,’ Edward Lorenz, interestingly enough, is a meteorologist and a statistician,” Ewens said. “His basic premise of the butterfly effect is that it will be impossible to accurately forecast the weather with any consistent outcome because the inherent noise in the atmosphere cannot be modeled properly on a small enough scale. His claim is that you’d have to be able to measure down to the free path of an electron basically to catch all the nuances of the chaos that is part of nature.”
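Lorenz made that argument with a three-variable model of convection published in 1963, and his point is easy to reproduce: run the model twice from starting points that differ by one part in a million and watch the two runs part ways. A minimal sketch with his standard parameters:

```python
# Lorenz's 1963 convection model, the system behind the "butterfly effect,"
# with its standard parameters. Two runs that start one part in a million
# apart diverge until they are effectively unrelated. Plain Euler stepping
# is crude but good enough for the demonstration.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT = 0.001  # time step

def step(x, y, z):
    """One Euler step of: dx/dt = s(y-x), dy/dt = x(r-z)-y, dz/dt = xy-bz."""
    return (x + DT * SIGMA * (y - x),
            y + DT * (x * (RHO - z) - y),
            z + DT * (x * y - BETA * z))

a = (1.0, 1.0, 1.0)       # reference run
b = (1.000001, 1.0, 1.0)  # perturbed by one part in a million

for t in range(1, 40001):  # integrate out to t = 40
    a, b = step(*a), step(*b)
    if t % 10000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {t * DT:2.0f}: separation = {gap:.6f}")
```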
