Simulating 800,000 Years Of California Earthquake History To Pinpoint Risks – Watts Up With That?

From the Texas Advanced Computing Center

Posted on January 25th, 2021 by Aaron Dubrow

A randomly selected 3,000-year segment of the physics-based simulated earthquake catalog for California, created on Frontera. (Credit: Kevin Milner, University of Southern California)

Massive earthquakes are, fortunately, rare events. But that very scarcity of data leaves us somewhat blind to their risks, especially when it comes to determining the risk for a specific location or structure.

"We missed most of the possible events that could cause great damage," said Kevin Milner, a computer scientist and seismological researcher at the University of Southern California's Southern California Earthquake Center (SCEC).

"Using southern California as an example, we haven't had a really big earthquake since 1857 – this was the last time southern San Andreas broke into a 7.9 magnitude earthquake. An earthquake in San Andreas could affect a much larger area than that the 1994 Northridge earthquake, and other major earthquakes may occur. We are concerned about that. "

The traditional way to get around this lack of data is to dig trenches to learn about past ruptures, to combine information from many earthquakes around the world into a statistical hazard model, or to use supercomputers to simulate a specific earthquake in a specific place with a high degree of fidelity.
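For context on the statistical route, hazard models of that kind typically rest on an empirical magnitude-frequency relationship such as the Gutenberg-Richter law. The sketch below is a hypothetical illustration, not code from the study; the `a_value` and `b_value` parameters are placeholder assumptions rather than values fitted to any real catalog.

```python
def gutenberg_richter_rate(magnitude, a_value=4.0, b_value=1.0):
    """Annual rate of earthquakes at or above `magnitude` under the
    Gutenberg-Richter law log10(N) = a - b*M. The a/b values are
    illustrative placeholders, not fitted to any real catalog."""
    return 10 ** (a_value - b_value * magnitude)

# Example: implied annual rate and average recurrence interval for
# M >= 7 events under these placeholder parameters.
rate_m7 = gutenberg_richter_rate(7.0)
print(f"annual rate of M>=7: {rate_m7:.4f}")
print(f"average recurrence interval: {1.0 / rate_m7:.0f} years")
```

The limitation the article points to is visible here: for the rarest, largest events, such a fit extrapolates far beyond what the short observed record can constrain.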

3D view of a particularly complex multi-fault rupture from the synthetic earthquake catalog. (Credit: Kevin Milner, University of Southern California)

However, a new framework for predicting the likelihood and impact of earthquakes across an entire region, developed over the past decade by a team of researchers affiliated with SCEC, offers a middle ground and possibly a better way to determine risk.

A new study by Milner and Bruce Shaw of Columbia University, published in the Bulletin of the Seismological Society of America in January 2021, presents results from a prototype rate-state earthquake simulator, RSQSim, that simulates hundreds of thousands of years of earthquake history in California. Coupled with another code, CyberShake, the framework can calculate the amount of shaking that would occur for each quake. Their results compare well with historical earthquakes and with the results of other methods, and they display a realistic distribution of earthquake probabilities.

According to the developers, the new approach improves the ability to pinpoint how big an earthquake might occur at a given location, allowing building code developers, architects, and civil engineers to design more resilient buildings that can survive earthquakes at a specific site.

"For the first time we have a whole pipeline from start to finish where earthquakes and ground motion simulations are based on physics," said Milner. "It can simulate up to 100,000 years on a really complicated fault system."

APPLYING MASSIVE COMPUTING POWER TO BIG PROBLEMS

RSQSim transforms mathematical representations of the geophysical forces at play in earthquakes – the standard model of how ruptures nucleate and propagate – into algorithms, and then solves them on some of the most powerful supercomputers in the world. The computationally intensive research was made possible over several years by government-sponsored supercomputers, including Frontera at the Texas Advanced Computing Center – the most powerful system at any university in the world – Blue Waters at the National Center for Supercomputing Applications, and Summit at the Oak Ridge Leadership Computing Facility.
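The "rate-state" in RSQSim's name refers to rate-and-state friction, the laboratory-derived description of fault friction that underlies this class of simulator. As a point of reference (the article does not spell out which variant the code uses), a standard form with the Dieterich "aging" state-evolution law is:

```latex
\mu(V,\theta) = \mu_0 + a\,\ln\!\left(\frac{V}{V_0}\right) + b\,\ln\!\left(\frac{V_0\,\theta}{D_c}\right),
\qquad
\frac{d\theta}{dt} = 1 - \frac{V\,\theta}{D_c}
```

Here \mu is the friction coefficient, V the slip rate, \theta a state variable describing the contact history, V_0 a reference slip rate, D_c a characteristic slip distance, and a and b empirical constants; the sign of a - b determines whether slip on a fault patch is stable or can nucleate an earthquake.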

"One way to better predict risk is through physics-based modeling, using the capabilities of systems like Frontera to run simulations," said Milner. "Instead of an empirical statistical distribution, we simulate the occurrence of earthquakes and the propagation of their waves."

"We have made great strides at Frontera in determining what type of earthquake we can expect, when and how often, when it fails," said Christine Goulet, executive director of applied science at SCEC, who was also involved on the work. "We don't prescribe or tell the code when the earthquakes will happen. We run a simulation of hundreds of thousands of years and just let the code transfer the burden from one bug to another."

The simulations began with the geological topography of California and simulated, over 800,000 virtual years, how stresses build up and dissipate as tectonic forces act on the Earth. From these simulations, the framework generated a catalog – a record stating that an earthquake occurred at a particular place with a particular magnitude and attributes at a particular time. The catalog that the SCEC team produced on Frontera and Blue Waters is among the largest ever made, Goulet said. The outputs of RSQSim were then fed into CyberShake, which in turn used computer models of geophysics to predict how much shaking (in terms of ground acceleration, velocity, and duration) would occur as a result of each quake.
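To make the data flow concrete, here is a minimal, hypothetical sketch of how a long synthetic catalog can be turned into a site-specific hazard estimate. The record fields, the `toy_ground_motion` function, and every number below are illustrative assumptions; CyberShake actually simulates full 3D wave propagation rather than using a simple attenuation formula like this one.

```python
import math
import random
from dataclasses import dataclass

@dataclass
class CatalogEvent:
    """One record from a synthetic catalog: when and where a simulated
    rupture occurred and how big it was. Field names are illustrative."""
    year: float         # simulated year of occurrence
    magnitude: float    # moment magnitude
    distance_km: float  # distance from the site of interest

def toy_ground_motion(event: CatalogEvent) -> float:
    """Placeholder peak-ground-acceleration estimate (in g) from a crude
    attenuation form. This stands in only to show the data flow; it is
    not the physics CyberShake uses."""
    return 10 ** (0.3 * event.magnitude
                  - 1.5 * math.log10(event.distance_km + 10.0)
                  - 1.0)

def exceedance_rate(catalog, catalog_years, pga_threshold):
    """Annual rate at which shaking at the site exceeds `pga_threshold`,
    estimated by counting exceedances over the simulated time span."""
    count = sum(1 for ev in catalog if toy_ground_motion(ev) > pga_threshold)
    return count / catalog_years

# Build a fake 100,000-year catalog just to exercise the functions.
random.seed(0)
catalog = [CatalogEvent(year=random.uniform(0, 100_000),
                        magnitude=random.uniform(6.0, 8.0),
                        distance_km=random.uniform(5.0, 200.0))
           for _ in range(2_000)]

print(f"annual rate of PGA > 0.2 g: {exceedance_rate(catalog, 100_000, 0.2):.2e}")
```

The appeal of the physics-based approach is that the event list itself emerges from simulated fault physics rather than from an assumed statistical distribution, so rare, complex multi-fault ruptures can appear in the catalog naturally.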

"The framework outputs a complete slip-time history: where a break occurs and how it has grown," explained Milner. "We found that it creates realistic ground motion, which tells us that the physics implemented in the model is working as intended." Further work is planned to validate the results, which will be critical for design applications before adoption.

Read the full article here.
