ALCF supercomputers advance earthquake modeling efforts

RSQSim simulation

Southern California defines cool. The perfect climes of San Diego, the glitz of Hollywood, the magic of Disneyland. The geology is pretty spectacular, as well.

“Southern California is a prime natural laboratory to study active earthquake processes,” says Tom Jordan, a professor in the Department of Earth Sciences at the University of Southern California (USC). “The desert allows you to observe the fault system very nicely.”

The fault system to which he is referring is the San Andreas, among the more famous fault systems in the world. With roots deep in Mexico, it scars California from the Salton Sea in the south to Cape Mendocino in the north, where it then takes a westerly dive into the Pacific.

Situated as it is at the heart of the San Andreas Fault System, Southern California does make an ideal location to study earthquakes. That it is home to nearly 24 million people makes for a more urgent reason to study them.

Jordan and a team from the Southern California Earthquake Center (SCEC) are using the supercomputing resources of the Argonne Leadership Computing Facility (ALCF), a U.S. Department of Energy Office of Science User Facility, to advance modeling for the study of earthquake risk and how to reduce it. 

Headquartered at USC, the center is one of the largest collaborations in geoscience, engaging over 70 research institutions and 1,000 investigators from around the world.

The team relies on a century’s worth of data from instrumental records as well as regional and national seismic hazard models to develop new tools for understanding earthquake hazards. Working with the ALCF, they have used this information to improve their earthquake rupture simulator, RSQSim.

RSQ is a reference to rate- and state-dependent friction in earthquakes — a friction law that can be used to study the nucleation, or initiation, of earthquakes. RSQSim models both nucleation and rupture processes to understand how earthquakes transfer stress to other faults.
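Simulators of this kind build on the standard Dieterich-Ruina rate- and state-dependent friction formulation, in which the friction coefficient depends on slip rate and a state variable. The sketch below illustrates that widely used law in Python; the parameter values are illustrative assumptions, not values from RSQSim or the SCEC team’s runs.

import numpy as np

def friction_coefficient(slip_rate, state, mu0=0.6, a=0.01, b=0.015,
                         v0=1e-6, d_c=1e-2):
    # Standard rate-and-state form: mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc)
    return mu0 + a * np.log(slip_rate / v0) + b * np.log(v0 * state / d_c)

def state_evolution_rate(slip_rate, state, d_c=1e-2):
    # "Aging law" for the state variable: d(theta)/dt = 1 - V*theta/Dc
    return 1.0 - slip_rate * state / d_c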

ALCF staff were instrumental in adapting the code to Mira, the facility’s 10-petaflops supercomputer, to allow for the larger simulations required to model earthquake behavior in very complex fault systems like the San Andreas, an effort that led to the team’s biggest discovery.

Shake, rattle, and code

The SCEC, in partnership with the U.S. Geological Survey, had already developed the Uniform California Earthquake Rupture Forecast (UCERF), an empirically based model that integrates theory, geologic information, and geodetic data, like GPS displacements, to determine spatial relationships between faults and slippage rates of the tectonic plates that created those faults.

Though more traditional in its approach, the newest version, UCERF3, is considered the best available representation of California earthquake ruptures. The picture it portrays, however, is still not as accurate as researchers would like.

“We know a lot about how big earthquakes can be, how frequently they occur, and where they occur, but we cannot predict them precisely in time,” notes Jordan.

The team turned to Mira to run RSQSim to determine whether they could achieve more accurate results more quickly. A physics-based code, RSQSim produces long-term synthetic earthquake catalogs that comprise dates, times, locations, and magnitudes for predicted events.
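At heart, such a catalog is a long list of simulated events. The hypothetical record layout below only illustrates the kinds of fields the article describes; RSQSim’s actual output format is not specified here, so the field names are assumptions.

from dataclasses import dataclass

@dataclass
class CatalogEvent:
    # One synthetic earthquake in a long-term catalog (illustrative fields only)
    time_years: float   # simulated time since the start of the catalog
    latitude: float
    longitude: float
    depth_km: float
    magnitude: float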

Using simulation, researchers impose stresses upon some representation of a fault system, which changes the stress throughout much of the system and thus changes the way future earthquakes occur. Trying to model these powerful stress-mediated interactions is particularly difficult for complex fault systems like the San Andreas.

“We just let the system evolve and create earthquake catalogs for a hundred thousand or a million years. It’s like throwing a grain of sand in a set of cogs to see what happens,” explains Christine Goulet, a team member and executive science director for special projects with SCEC.

The end result is a more detailed picture of the possible hazard, which forecasts a sequence of earthquakes of various magnitudes expected to occur on the San Andreas Fault over a given time range.
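As a rough illustration of how a long synthetic catalog becomes a long-term forecast, one can count the events at or above each magnitude and divide by the catalog length to get annual exceedance rates. This sketch assumes a simple list of magnitudes and is not the SCEC team’s analysis code.

import numpy as np

def annual_exceedance_rates(magnitudes, catalog_years, thresholds=None):
    # Annual rate of events at or above each magnitude threshold
    if thresholds is None:
        thresholds = np.arange(5.0, 8.1, 0.5)
    mags = np.asarray(magnitudes)
    return {float(m): np.count_nonzero(mags >= m) / catalog_years
            for m in thresholds}

# Example with a hypothetical 100,000-year catalog:
# rates = annual_exceedance_rates(catalog_magnitudes, catalog_years=100_000)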

The group tried to calibrate RSQSim’s numerous parameters to replicate UCERF3, but eventually decided to run the code with its default parameters. While the initial intent was to evaluate the magnitude of differences between the models, they discovered, instead, that both models agreed closely on their forecasts of future seismologic activity.

“So it was an a-ha moment. Eureka,” recalls Goulet. “The results were a surprise because the group had thought carefully about optimizing the parameters. The decision not to change them from their default values made for very nice results.”

The researchers noted that the mutual validation of the two approaches could prove extremely productive in further assessing seismic hazard estimates and their uncertainties.

Information derived from the simulations will help the team compute the strong ground motions generated by faulting that occurs at the surface — the characteristic shaking that is synonymous with earthquakes. To do this, the team couples the earthquake rupture forecasts, UCERF and RSQSim, with different models that represent the way waves propagate through the system. Called ground motion prediction equations, these are standard equations used by engineers to calculate the shaking levels from earthquakes of different sizes and locations.
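In their simplest form, ground motion prediction equations give a median shaking level as a function of earthquake magnitude and distance. The toy example below uses a generic functional form with made-up coefficients purely for illustration; it does not reproduce any published GMPE or the specific models used by the SCEC team.

import numpy as np

def toy_gmpe_pga(magnitude, distance_km, c0=-3.5, c1=1.0, c2=1.3, c3=10.0):
    # Generic GMPE form: ln(PGA) = c0 + c1*M - c2*ln(R + c3), with PGA in g
    return np.exp(c0 + c1 * magnitude - c2 * np.log(distance_km + c3))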

One of those models is the dynamic rupture and wave propagation code Waveqlab3D (Finite Difference Quake and Wave Laboratory 3D), which is the focus of the SCEC team’s current ALCF allocation.

“These experiments show that the physics-based model RSQSim can replicate the seismic hazard estimates derived from the empirical model UCERF3, but with far fewer statistical assumptions,” notes Jordan. “The agreement gives us more confidence that the seismic hazard models for California are consistent with what we know about earthquake physics. We can now begin to use these physics to improve the hazard models.”

This project was awarded computing time and resources at the ALCF through DOE's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program. The team's research is also supported by the National Science Foundation, the U.S. Geological Survey, and the W.M. Keck Foundation.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.

The U.S. Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.
