Research Highlight

Titan Simulates Earthquake Physics Necessary for Safer Building Design

Snapshots of 10-Hz rupture propagation (slip rate) and surface wavefield (strike-parallel component) for a crustal model (top) without and (bottom) with a statistical model of small-scale heterogeneities.
Researchers use GPUs to conduct an unprecedented study of damaging, high-frequency shaking

When the last massive earthquake shook the San Andreas Fault in 1906—causing fires that burned down most of San Francisco and leaving half the city’s population homeless—no one would hear about “plate tectonics” for another 50 years, and the Richter scale was still a generation away. Needless to say, by today’s standards, only primitive data survive to help engineers prepare southern California for an earthquake of similar magnitude.

“We haven’t had a really big rupture since the city of Los Angeles existed,” said Thomas Jordan, Southern California Earthquake Center (SCEC) director.

Scientists predict this is just the calm before the storm for cities like San Francisco and Los Angeles and other communities lining the San Andreas.

“We think the San Andreas Fault is locked and loaded, and we could face an earthquake of 7.5-magnitude or bigger in the future,” Jordan said. “But the data accumulated from smaller earthquakes in southern California over the course of the last century is insufficient to predict the shaking associated with such large events.”

To prepare California for the next “big one,” a team of SCEC researchers—including computational scientist Yifeng Cui of the University of California, San Diego, and geophysicist Kim Olsen of San Diego State University—is using Titan, the world’s most powerful supercomputer for open science research, to simulate earthquakes at high frequencies, producing the more detailed predictions that structural engineers need.

Titan, which is managed by the Oak Ridge Leadership Computing Facility (OLCF) located at Oak Ridge National Laboratory (ORNL), is a 27-petaflop Cray XK7 machine with a hybrid CPU/GPU architecture. GPUs, or graphics processing units, are accelerators that can rapidly perform calculation-intensive work while CPUs carry out more complex commands. The computational power of Titan enables users to produce simulations—comprising millions of interacting molecules, atoms, galaxies, or other systems difficult to manipulate in the lab—that are often the largest and most complex of their kind.

The SCEC’s high-frequency earthquake simulations are no exception.

“It’s a pioneering study,” Olsen said, “because nobody has really managed to get to these higher frequencies using fully physics-based models.”

Many earthquake studies hinge largely on historical and observational data, an approach that assumes future earthquakes will behave as past ones did (even if the rupture site, the geological features, or the built environment is different).

“For example, there have been lots of earthquakes in Japan, so we have all this data from Japan, but analyzing this data is a difficult task because scientists and engineers preparing for earthquakes in California have to ask ‘Is Japan the same as California?’ The answer is in some ways yes, and in some ways no,” Jordan said.

The physics-based model calculates wave propagation and ground motions radiating from the San Andreas Fault through a 3-D model approximating the Earth’s crust. Essentially, the simulations unleash the laws of physics on the region’s specific geological features to improve predictive accuracy.
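
For readers curious what that looks like in practice, the sketch below shows a bare-bones 1-D velocity-stress finite-difference scheme in Python, the same basic idea a 3-D anelastic wave-propagation code builds on. The material properties, grid, and source parameters here are illustrative assumptions, not values from the SCEC model.

    import numpy as np

    # Minimal 1-D velocity-stress finite-difference sketch (illustrative only;
    # the SCEC code solves the full 3-D anelastic equations).
    nx, dx = 2000, 20.0          # grid points and spacing (m), assumed values
    dt, nt = 0.002, 4000         # time step (s) and number of steps
    rho = np.full(nx, 2700.0)    # density (kg/m^3), uniform for simplicity
    vs = np.full(nx, 3000.0)     # shear-wave speed (m/s), uniform for simplicity
    mu = rho * vs**2             # shear modulus

    v = np.zeros(nx)             # particle velocity
    s = np.zeros(nx)             # shear stress

    f0, t0, src = 5.0, 0.5, nx // 2   # Ricker source: peak frequency (Hz), delay (s), location

    for it in range(nt):
        t = it * dt
        arg = (np.pi * f0 * (t - t0)) ** 2
        s[src] += (1.0 - 2.0 * arg) * np.exp(-arg) * dt      # inject source as stress
        v[1:-1] += dt / rho[1:-1] * (s[2:] - s[1:-1]) / dx   # velocity update
        s[1:-1] += dt * mu[1:-1] * (v[1:-1] - v[:-2]) / dx   # stress update

A production code layers attenuation, 3-D heterogeneity, absorbing boundaries, and fault rupture physics on top of this core update loop.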

Seismic wave frequency, which is measured in Hertz (cycles per second), is important to engineers who are designing buildings, bridges, and other infrastructure to withstand earthquake damage. Low-frequency waves, which cycle less than once per second (1 Hertz), are easier to model, and engineers have largely been able to build in preparation for the damage caused by this kind of shaking.

“Building structures are sensitive to different frequencies,” Olsen said. “It’s mostly the big structures like highway overpasses and high-rises that are sensitive to low-frequency shaking, but smaller structures like single-family homes are sensitive to higher frequencies, even up to 10 Hertz.”

But high-frequency waves (in the 2–10 Hertz range) are more difficult to simulate than low-frequency waves, and there has been little information to give engineers on shaking up to 10 Hertz.

“The engineers have hit a wall as they try to reduce their uncertainty about how to prevent structural damage,” Jordan said. “There are more concerns than just building damage there, too. If you have a lot of high-frequency shaking it can rip apart the pipes, electrical systems, and other infrastructure in hospitals, for example. Also, very rigid structures like nuclear power plants can be sensitive to higher frequencies.”

A better understanding of the effects of high-frequency waves on critical facilities could inform disaster response in addition to structural engineering.

High-frequency waves are computationally more daunting because they oscillate much faster and have much shorter wavelengths, which must be resolved with a far finer grid and smaller time steps. And in the case of the SCEC’s simulations on Titan, the ground model is extremely detailed, representing a chunk of terrain one-fifth the size of California, down to a depth of 41 kilometers, at a spatial resolution of 20 meters. The ground models include detailed 3-D structural variations—both larger features such as sedimentary basins and small-scale variations on the order of tens of meters—through which seismic waves must travel.
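
A standard rule of thumb makes the link between target frequency and grid spacing concrete: the grid must supply several points per shortest wavelength, and the shortest wavelength is the slowest wave speed divided by the highest frequency. The minimum velocity and points-per-wavelength values below are illustrative assumptions, not the SCEC team’s published parameters.

    # Rough rule of thumb linking target frequency to grid spacing
    # (assumed values, not the SCEC model's actual parameters).
    f_max = 10.0     # highest frequency to resolve, Hz
    v_min = 1000.0   # assumed slowest shear-wave speed in the model, m/s
    ppw = 5          # assumed grid points per shortest wavelength

    shortest_wavelength = v_min / f_max   # 100 m
    dx = shortest_wavelength / ppw        # 20 m
    print(f"required grid spacing: about {dx:.0f} m")

Under these assumptions the spacing lands near the 20 meters used in the Titan runs; softer sediments or stricter accuracy requirements would push it finer still.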

Along the San Andreas, the Earth’s surface is a mix of hard bedrock and pockets of clay and silt sands.

“The Los Angeles region, for example, sits on a big sedimentary basin that was formed over millions of years as rock eroded out of mountains and rivers, giving rise to a complex layered structure,” Jordan said.

Soft ground like Los Angeles’s sedimentary basin amplifies incoming waves, causing these areas to shake harder and longer than rocky ground, which means some areas farther from the rupture site could actually experience more infrastructure damage.

The entire simulation totaled 443 billion grid points. At every point, 28 variables—including different wave velocities, stress, and anelastic wave attenuation (how waves lose energy to heat as they move through the crust)—were calculated.
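
Those two numbers imply an enormous amount of state. A quick back-of-the-envelope check, assuming single-precision storage (a detail the article does not give), shows how the figures hang together:

    # Back-of-the-envelope check on the problem size.
    grid_points = 443e9
    variables_per_point = 28
    bytes_per_value = 4                        # assumed 32-bit floats

    depth_levels = 41_000 / 20                 # 41 km deep at 20 m spacing -> 2,050 levels
    horizontal_points = grid_points / depth_levels
    horizontal_area_km2 = horizontal_points * (20 * 20) / 1e6

    memory_tb = grid_points * variables_per_point * bytes_per_value / 1e12

    print(f"horizontal footprint: ~{horizontal_area_km2:,.0f} km^2")   # roughly one-fifth of California
    print(f"simulation state: ~{memory_tb:.0f} TB")                    # tens of terabytes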

“High-frequency ground motion modeling is a complex problem that requires a much larger scale of computation,” Jordan said. “With the capabilities that we have on Titan, we can approach those higher frequencies.”

Back in 2010, the SCEC team used the OLCF’s 1.75-petaflop Cray XT5 Jaguar supercomputer to simulate a magnitude-8 earthquake along the San Andreas Fault. Those simulations peaked at 2 Hertz. At the time the Jaguar simulations were conducted, doubling wave frequency would have required a 16-fold increase in computational power.
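
The 16-fold figure follows from how a uniform finite-difference grid scales: doubling the frequency halves the grid spacing in each of the three spatial dimensions (8 times as many points) and roughly halves the time step (twice as many steps). A toy calculation:

    # Rough cost scaling for a uniform 3-D finite-difference grid:
    # frequency ratio to the 4th power (3 space dimensions + time).
    def relative_cost(frequency_ratio):
        return frequency_ratio ** 4

    print(relative_cost(2))   # 16   (e.g., 2 Hz -> 4 Hz)
    print(relative_cost(5))   # 625  (e.g., 2 Hz -> 10 Hz)

That steep scaling is why reaching 10 Hertz took both a larger machine and a faster code, not just more hours on the old one.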

But on Titan in 2013, the team was able to run simulations of a magnitude-7.2 earthquake up to their goal of 10 Hertz, which can better inform performance-based building design. By adapting their Anelastic Wave Propagation code, developed by Olsen, Steven Day, and Cui and known as AWP-ODC, from its original CPU design to run on GPUs, the team achieved a significant speedup: the simulations ran 5.2 times faster than they would have on a comparable CPU machine without GPU accelerators.

“We redesigned the code to exploit high performance and throughput,” Cui said. “We made some changes in the communications schema and reduced the communication required between the GPUs and CPUs, and that helped speed up the code.”
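
The article does not spell out the new communication scheme, but a common pattern in GPU stencil codes is to keep each subdomain resident in GPU memory and move only thin boundary “halo” slabs to the host for exchange with neighboring processors. The sketch below illustrates that idea only, with NumPy arrays standing in for device memory; it is not AWP-ODC’s actual implementation.

    import numpy as np

    # Illustrative halo-exchange idea: ship only boundary slabs, not the whole field.
    nx = ny = nz = 256
    halo = 2                                              # stencil half-width
    subdomain = np.zeros((nx, ny, nz), dtype=np.float32)  # stands in for a GPU-resident field

    def boundary_slabs(field):
        """Only the thin slabs a neighboring process actually needs."""
        return [field[:halo, :, :], field[-halo:, :, :]]

    full_mb = subdomain.nbytes / 1e6
    halo_mb = sum(slab.nbytes for slab in boundary_slabs(subdomain)) / 1e6
    print(f"full field: {full_mb:.0f} MB, halo slabs: {halo_mb:.1f} MB")   # ~67 MB vs ~1 MB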

The SCEC team anticipates simulations on Titan will help improve its CyberShake platform, which is an ongoing sweep of millions of earthquake simulations that model many rupture sites across California.

“Our plan is to develop the GPU codes so the codes can be migrated to the CyberShake platform,” Jordan said. “Overcoming the computational barriers associated with high frequencies is one way Titan is preparing for this progression.”

Using hybrid CPU/GPU machines in the future promises to substantially reduce the computational time required for each simulation, which would enable faster analyses and hazard assessments. And it is not only processor-hours that matter but real time as well. The 2010 San Andreas Fault simulations took 24 hours to run on Jaguar, but the higher-frequency, higher-resolution simulations took only five and a half hours on Titan.

And considering the “big one” could shake California anytime from the next few years to the next few decades, accelerating our understanding of the potential damage is crucial to SCEC researchers.

“We don’t really know what happens in California during these massive events, since we haven’t had one for more than 100 years,” Jordan said. “And simulation is the best technique we have for learning and preparing.”