No one wants to believe that the March 11, 2011, Japan earthquake and tsunami could happen again. But it will. Somewhere.
One likely place for a repeat performance is the west coast of North America. The question is when.
Geophysicists from seven research institutions across the country are probing that question like never before, through a five-year, $4.6-million project just getting underway. Combining 1,000-times-faster computing speeds with improved methodologies, the team is creating the first unified simulation of earthquakes all along western North America.
“One of the project goals is to improve our short- and long-term earthquake forecasting capabilities,” project leader James Dieterich of the University of California, Riverside, said in a press release. “More accurate forecasting has practical advantages—earthquake insurance, for example, relies heavily on forecasts.”
The new simulation will help scientists determine the interplay between the two very different fault systems that exist along the U.S. West Coast. Californians have known to expect great lurches along the San Andreas Fault and its counterparts ever since the Great San Francisco Earthquake of 1906. But inhabitants of the U.S. Pacific Northwest face a double threat: Lurking offshore is a 600-mile-long gash in the seafloor, the Cascadia Subduction Zone, that is prone to mega-thrust earthquakes and tsunamis on the order of last year’s 9.0 tsunami-generating temblor in Japan. The last great Cascadia quake occurred in 1700, uncomfortably long ago when you consider that such events occur every 300 to 500 years.
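To get a rough sense of what a 300-to-500-year recurrence interval implies, a back-of-envelope calculation helps. The sketch below uses a simple memoryless (Poisson) model, which is an illustrative assumption on my part and not the project's forecasting method; the 400-year figure is just the midpoint of the quoted range.

```python
import math

# Illustrative Poisson (memoryless) model: if great Cascadia quakes recur
# on average every `mean_interval_years`, the chance of at least one
# occurring within the next `window_years` is 1 - e^(-window/mean).
def prob_at_least_one(mean_interval_years, window_years):
    return 1.0 - math.exp(-window_years / mean_interval_years)

# Midpoint of the 300-to-500-year recurrence range, over a 50-year window.
p = prob_at_least_one(400.0, 50.0)
print(round(p, 3))  # ≈ 0.118, i.e. roughly a 1-in-8 chance
```

Real forecasts account for the time elapsed since the 1700 event, which a memoryless model ignores; that is one reason the project's physics-based simulations are expected to do better than simple statistics.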
“Observations of earthquakes go back to only about 100 years, resulting in a relatively short record,” Dieterich said. “If we get the physics right, our simulations of plate boundary fault systems—at one-kilometer resolution for California—will span more than 10,000 years of plate motion and consist of up to a million discrete earthquake events, giving us abundant data to analyze.”
From all that new data, Dieterich and his colleagues hope to locate clues regarding the long-term processes that condition fault systems to fail in great earthquakes (greater than magnitude 8). One condition the team will be watching closely is the effect of so-called slow-slip events, a special class of gradual plate motion among the many small earthquakes that go unfelt at the earth’s surface.
The existence of slow-slip events, which scientists discovered only recently, is turning out to be especially important for accurate forecasting. In a slow-slip event, movement is gradual enough not to create detectable ground motion, but the energy release can be the equivalent of a normal magnitude 6 earthquake. The big question is whether these events may transfer stress to portions of a subduction zone most prone to a violent jolt, Dieterich explained in a recent talk on the UC Riverside campus.
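The gap between a magnitude 6 slow-slip event and a magnitude 9 megathrust quake is worth quantifying. The sketch below uses the standard Gutenberg-Richter energy relation to compare the two; the specific numbers are my illustration, not figures from the project.

```python
# Gutenberg-Richter energy relation: log10(E) = 1.5 * M + 4.8, with E in joules.
# This is a standard seismological rule of thumb for radiated energy.
def quake_energy_joules(magnitude):
    return 10.0 ** (1.5 * magnitude + 4.8)

# Each whole-magnitude step multiplies energy by 10^1.5 ≈ 31.6, so the
# three-magnitude gap between M6 and M9 is a factor of about 31,600.
ratio = quake_energy_joules(9.0) / quake_energy_joules(6.0)
print(round(ratio))  # ≈ 31623
```

In other words, a slow-slip event releases only a tiny fraction of a great quake's energy; its suspected importance lies not in the energy itself but in where that released stress ends up.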
He pointed out that scientists reviewing the seismic recordings from Japan in early 2011 noted that a slow-slip event occurred between the foreshock and the main shock on March 11.
Likewise, an analysis of the 1960 Chile earthquake revealed that a slow-slip event appears to have occurred between it and its largest aftershock. And dozens of slow-slip events have been detected in the Cascadia Subduction Zone deep beneath Washington and Oregon.
So does that increase the chances for another great Cascadia quake? That’s exactly what Dieterich and his colleagues at UCR, Brown University, Columbia University, the University of Southern California, San Diego State University, UC San Diego and the US Geological Survey hope to discern.
Their motives are simple: an increasing fraction of the world’s population lives in regions prone to great earthquakes and therefore faces high seismic risk. While preparedness is crucial to dealing with earthquakes, better forecasting of these natural disasters can save more lives.