Research Highlight

Titan Simulates Earthquake Physics Necessary for Safer Building Design

Image: Snapshots of 10-Hz rupture propagation (slip rate) and surface wavefield (strike-parallel component) for a crustal model without (top) and with (bottom) a statistical model of small-scale heterogeneities.
Researchers use GPUs to conduct an unprecedented study of damaging, high-frequency shaking

When the last massive earthquake shook the San Andreas Fault in 1906, causing fires that burned down most of San Francisco and leaving half the city's population homeless, no one would hear about "plate tectonics" for another 50 years, and the Richter scale was still a generation away. Needless to say, by today's standards, only primitive data survive to help engineers prepare southern California for an earthquake of similar magnitude.

"We haven't had a really big rupture since the city of Los Angeles existed," said Thomas Jordan, Southern California Earthquake Center (SCEC) director.

Scientists predict this is just the calm before the storm for cities such as San Francisco and Los Angeles, along with other regions lining the San Andreas.

"We think the San Andreas Fault is locked and loaded, and we could face an earthquake of magnitude 7.5 or bigger in the future," Jordan said. "But the data accumulated from smaller earthquakes in southern California over the course of the last century is insufficient to predict the shaking associated with such large events."

To prepare California for the next "big one," SCEC researchers, including computational scientist Yifeng Cui of the University of California, San Diego and geophysicist Kim Olsen of San Diego State University, are using Titan, the world's most powerful supercomputer for open science research, to simulate earthquakes at high frequencies and produce the more detailed predictions structural engineers need.

Titan, which is managed by the Oak Ridge Leadership Computing Facility (OLCF) at Oak Ridge National Laboratory (ORNL), is a 27-petaflop Cray XK7 machine with a hybrid CPU/GPU architecture. GPUs, or graphics processing units, are accelerators that rapidly perform calculation-intensive work while CPUs carry out more complex commands. Titan's computational power enables users to produce simulations, comprising millions of interacting molecules, atoms, galaxies, or other systems difficult to manipulate in the lab, that are often the largest and most complex of their kind.

The SCEC's high-frequency earthquake simulations are no exception.

"It's a pioneering study," Olsen said, "because nobody has really managed to get to these higher frequencies using fully physics-based models."

Many earthquake studies hinge largely on historical and observational data, an approach that assumes future earthquakes will behave as they did in the past (even if the rupture site, the geological features, or the built environment is different).

"For example, there have been lots of earthquakes in Japan, so we have all this data from Japan, but analyzing this data is a difficult task because scientists and engineers preparing for earthquakes in California have to ask 'Is Japan the same as California?' The answer is in some ways yes, and in some ways no," Jordan said.

The physics-based model calculates wave propagation and ground motion radiating from the San Andreas Fault through a 3-D model approximating the Earth's crust. Essentially, the simulations unleash the laws of physics on the region's specific geological features to improve predictive accuracy.
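
The production code solves the full 3-D anelastic wave equations on billions of grid points; the toy sketch below only illustrates the physics-based idea in one dimension, with made-up parameters (the grid size, wave speeds, and the "basin" layer are illustrative and are not taken from the SCEC model).

```python
import numpy as np

# Toy 1-D acoustic wave propagation by explicit finite differences.
# Illustrative only: the SCEC code solves the full 3-D anelastic
# equations on hundreds of billions of grid points.
nx, dx = 2000, 20.0              # number of grid points and spacing (m), illustrative
c = np.full(nx, 3000.0)          # wave speed (m/s); spatial variation mimics 3-D structure
c[1200:1600] = 1500.0            # a slower "sedimentary basin" layer (made up)
dt = 0.5 * dx / c.max()          # time step chosen to satisfy the CFL stability limit

u_prev = np.zeros(nx)            # wavefield at time t - dt
u = np.zeros(nx)                 # wavefield at time t
u[nx // 4] = 1.0                 # impulsive source standing in for the rupture

for _ in range(3000):
    # Second-order update of the wave equation u_tt = c^2 * u_xx
    u_next = np.zeros(nx)
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + (c[1:-1] * dt / dx) ** 2
                    * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next

print("peak displacement in the slow layer:", np.abs(u[1200:1600]).max())
```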

Seismic wave frequency, which is measured in Hertz (cycles per second), is important to engineers who design buildings, bridges, and other infrastructure to withstand earthquake damage. Low-frequency waves, which cycle less than once per second (below 1 Hertz), are easier to model, and engineers have largely been able to build in preparation for the damage caused by this kind of shaking.

"Building structures are sensitive to different frequencies," Olsen said. "It's mostly the big structures like highway overpasses and high-rises that are sensitive to low-frequency shaking, but smaller structures like single-family homes are sensitive to higher frequencies, even up to 10 Hertz."

But high-frequency waves, in the 2 to 10 Hertz range, are more difficult to simulate than low-frequency waves, and engineers have had little information about shaking at frequencies up to 10 Hertz.

"The engineers have hit a wall as they try to reduce their uncertainty about how to prevent structural damage," Jordan said. "There are more concerns than just building damage there, too. If you have a lot of high-frequency shaking, it can rip apart the pipes, electrical systems, and other infrastructure in hospitals, for example. Also, very rigid structures like nuclear power plants can be sensitive to higher frequencies."

A better understanding of the effects of high-frequency waves on critical facilities could inform disaster response in addition to structural engineering.

High-frequency waves are computationally more daunting because their shorter wavelengths demand a much finer simulation grid and smaller time steps. And in the case of the SCEC's simulations on Titan, the ground model is extremely detailed: it represents a chunk of terrain one-fifth the size of California, extending to a depth of 41 kilometers, at a spatial resolution of 20 meters. The ground models include detailed 3-D structural variations, both larger features such as sedimentary basins and small-scale variations on the order of tens of meters, through which seismic waves must travel.
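
As a rough sanity check of that problem size, the sketch below assumes an approximate area for California of 424,000 square kilometers (an added assumption); the one-fifth, 41-kilometer, and 20-meter figures come from the text, and the result is an order-of-magnitude estimate only.

```python
# Back-of-the-envelope estimate of the grid size (order of magnitude only).
california_area_km2 = 424_000                     # assumed approximate area of California
domain_area_m2 = california_area_km2 / 5 * 1e6    # one-fifth of the state, converted to m^2
depth_m = 41_000                                  # 41 km depth
spacing_m = 20                                    # 20 m grid spacing

grid_points = domain_area_m2 * depth_m / spacing_m**3
print(f"{grid_points:.1e} grid points")           # ~4.3e11, consistent with 443 billion
```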

Along the San Andreas, the Earth鈥檚 surface is a mix of hard bedrock and pockets of clay and silt sands.

"The Los Angeles region, for example, sits on a big sedimentary basin that was formed over millions of years as rock eroded out of mountains and rivers, giving rise to a complex layered structure," Jordan said.

Soft ground like Los Angeles's sedimentary basin amplifies incoming waves, causing these areas to shake harder and longer than rocky ground, which means some areas farther from the rupture site could actually suffer more infrastructure damage.

The entire simulation totaled 443 billion grid points. At every point, 28 variables were calculated, including different wave velocities, stress, and anelastic wave attenuation (how waves lose energy to heat as they move through the crust).
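
A rough memory estimate gives a sense of why this demands a leadership-class machine; the single-precision (4-byte) storage assumed below is an illustration, since the article does not state the precision or data layout the code actually uses.

```python
# Rough memory footprint of the simulation state (4-byte floats assumed).
grid_points = 443e9
variables_per_point = 28
bytes_per_value = 4                     # single precision, assumed for illustration

total_bytes = grid_points * variables_per_point * bytes_per_value
print(f"about {total_bytes / 1e12:.0f} TB of state")   # roughly 50 TB
```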

"High-frequency ground motion modeling is a complex problem that requires a much larger scale of computation," Jordan said. "With the capabilities that we have on Titan, we can approach those higher frequencies."

Back in 2010, the SCEC team used the OLCF's 1.75-petaflop Cray XT5 Jaguar supercomputer to simulate a magnitude-8 earthquake along the San Andreas Fault. Those simulations peaked at 2 Hertz. At the time the Jaguar simulations were conducted, doubling the wave frequency would have required a 16-fold increase in computational power.
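
That 16-fold figure follows from a standard scaling argument for explicit 3-D wave solvers: resolving waves of twice the frequency halves the required grid spacing in each of the three dimensions, and the stability (CFL) condition then halves the time step as well. A quick sketch of the arithmetic:

```python
# Why doubling the resolved frequency costs roughly 16x more computation.
frequency_ratio = 2                       # target frequency doubles
spatial_factor = frequency_ratio ** 3     # grid spacing halves in each of 3 dimensions
temporal_factor = frequency_ratio         # CFL condition: time step halves too

print("cost multiplier:", spatial_factor * temporal_factor)   # 16
```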

But on Titan in 2013, the team was able to run simulations of a magnitude-7.2 earthquake up to their goal of 10 Hertz, which can better inform performance-based building design. By adapting their code, the Anelastic Wave Propagation by Olsen, Steven Day, and Cui (AWP-ODC), originally written for CPUs, to run on GPUs, they achieved a significant speedup: the simulations ran 5.2 times faster than they would have on a comparable CPU machine without GPU accelerators.

"We redesigned the code to exploit high performance and throughput," Cui said. "We made some changes in the communications schema and reduced the communication required between the GPUs and CPUs, and that helped speed up the code."

The SCEC team anticipates simulations on Titan will help improve its CyberShake platform, which is an ongoing sweep of millions of earthquake simulations that model many rupture sites across California.

"Our plan is to develop the GPU codes so the codes can be migrated to the CyberShake platform," Jordan said. "Overcoming the computational barriers associated with high frequencies is one way Titan is preparing for this progression."

Utilizing hybrid CPU/GPU machines promises to substantially reduce the computational time required for each simulation, enabling faster analyses and hazard assessments. And it is not only processor-hours that matter but wall-clock time as well: the 2010 San Andreas Fault simulations took 24 hours to run on Jaguar, while the higher-frequency, higher-resolution simulations took only five and a half hours on Titan.

And considering the "big one" could shake California anytime from the next few years to the next few decades, accelerating our understanding of the potential damage is crucial to SCEC researchers.

"We don't really know what happens in California during these massive events, since we haven't had one for more than 100 years," Jordan said. "And simulation is the best technique we have for learning and preparing."