Frustrated scientists carry an extra burden as they watch wildfires torch southern California: Days ago, their instruments told them this destruction would happen.

But once the fires ignited, their computers could not forecast where or how long the fires would burn, or suggest where fire crews might contain their spread.

Researchers say it will be several years before computer programs can accurately forecast fire behavior the way TV weathermen tell viewers where a hurricane or a blizzard is heading.

"Last Friday, we saw there would be very high winds over the Simi Valley and San Diego," said geographer Dar Roberts, principal investigator at the Southern California Wildfire Hazard Center (search) at the University of California-Santa Barbara.

By Tuesday fires had erupted in the area.

"It's a scary moment because you know that if somebody starts a fire, it's going to be tough," Roberts said. "We could see this happening."

Researchers at several universities and the National Interagency Fire Center in Boise, Idaho, integrate information about weather, fire dangers and available fire crews. Their reports go to government land managers and city public safety departments, especially in fire-prone western and southern states.

In addition to weather reports and data from hundreds of automated ground-based climate stations, they use atmospheric computer models that predict conditions above likely fire spots.

As fires burn, the agencies send updates on key factors, such as when the Santa Ana winds might let up.

But what about whether a fire will jump a river or crest a certain hillside in the next hour? Should a neighborhood be evacuated? For those strategic decisions, fire managers cannot turn to scientists and computers.

They still must rely on their own experience and frontline reports from exhausted "ground-pounders" — their fire crews armed with shovels and pickaxes.

Computer simulations of fires are "an extremely valuable tool for lessons learned that can be applied to future fires, but it is not an operational model," said Rich Wagoner of the National Center for Atmospheric Research in Boulder, Colo.

Today's best wildfire models are so complicated they run only on supercomputers normally reserved for calculations related to global warming and nuclear weapons simulations.

They are called "coupled models" because they combine data about the atmosphere and weather with data on fire conditions, such as elevation and soil and plant moisture.

But models bog down on the details that really determine a fire's behavior: whether a mountain slope is cool and north-facing or warm and south-facing, whether 60 mph winds are gusty or constant, and finer details still, such as the dimensions of the leaves and needles on specific plants.

The best models simulate a wildfire only crudely, over cells of roughly half a square mile. At their finest, the resolution sharpens only to an area equal to a few football fields.

And there is the speed issue. Supercomputers operate at trillions of calculations per second, but they can't keep pace with wildfires, which create their own hurricane-force winds at blast furnace temperatures.

"It takes several days to run a model to get an hour or two of forecast," Wagoner said.

Scientists say it will cost $25 million over five years to develop a faster, portable wildfire model.

In September, researchers at NCAR and the Rochester Institute of Technology won a $300,000 federal grant to begin translating remote-sensing data and satellite photographs into "mini-movies" predicting a wildfire's behavior for 60 minutes. They hope fire managers can download the animations onto laptop computers at the scene of a blaze.