
Supercomputer Simulations of Earthquakes in Real Time

By Jillian Kunze

As different research areas impose new workloads on supercomputers, the field of computational science and engineering is changing, and the integration of simulation, data, and learning is becoming increasingly important. During a minisymposium presentation at the 2021 SIAM Conference on Computational Science and Engineering, which took place virtually last week, Kengo Nakajima of the University of Tokyo described a new supercomputing software platform and its applications in earthquake simulation. The work was a joint effort between the University of Tokyo’s Information Technology Center and Earthquake Research Institute.

Figure 1. A diagram of how the Wisteria/BDEC-01 supercomputer system will operate.
The supercomputing center at the University of Tokyo currently operates three supercomputing systems. To promote the integration of simulation, data, and learning, the center is now introducing the Big Data & Extreme Computing (BDEC) system called Wisteria/BDEC-01. This system is slated to begin operations in May 2021 and will include both simulation nodes and data/learning nodes; some of the data/learning nodes will be connected directly to external nodes (see Figure 1). These node groups make the system hierarchical, hybrid, and heterogeneous (h3). Nakajima and his collaborators are working to optimize procedures for simulations on Wisteria/BDEC-01, especially in nonlinear cases.

The supercomputing center is developing a new, innovative software platform called h3-Open-BDEC for the Wisteria/BDEC-01 system. This five-year project has been supported by the Japanese government since 2019, with a budget of 1.41 million U.S. dollars. The new platform includes several innovations: it utilizes new principles for numerical analysis through adaptive precision, automatic tuning, and accuracy verification, and it incorporates a hierarchical data-driven approach that is based on machine learning.
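Nakajima did not go into the details of these numerical innovations, but the general idea behind adaptive precision can be sketched: perform the bulk of an iterative computation in lower precision and promote to higher precision only when accuracy demands it. The minimal Python sketch below illustrates that principle with a Jacobi solver; it is a generic illustration under assumed behavior, not the h3-Open-BDEC implementation, and the promotion rule (switch when the float32 residual stops shrinking) is just one plausible heuristic.

```python
# Generic illustration of adaptive (mixed) precision -- not the h3-Open-BDEC
# code. A Jacobi iteration runs in float32 until its residual stops shrinking,
# then promotes the working precision to float64 to reach the requested accuracy.
import numpy as np

def adaptive_precision_jacobi(A, b, tol=1e-10, max_iter=10_000):
    work = np.float32                       # current working precision
    A_w, b_w = A.astype(work), b.astype(work)
    D = np.diag(A_w)                        # Jacobi uses the diagonal of A
    x = np.zeros_like(b_w)
    prev_res = np.inf
    for _ in range(max_iter):
        # One Jacobi sweep in the working precision: x <- D^{-1}(b - (A - D)x).
        x = (b_w - (A_w @ x - D * x)) / D
        # The residual is always checked in float64 (a simple form of verification).
        res = np.linalg.norm(b - A @ x.astype(np.float64))
        if res < tol:
            break
        if work == np.float32 and res > 0.99 * prev_res:
            # float32 has stalled: promote everything to float64 and continue.
            work = np.float64
            A_w, b_w = A.astype(work), b.astype(work)
            D, x = np.diag(A_w), x.astype(work)
        prev_res = res
    return x, res

# Example on a small diagonally dominant system (guarantees Jacobi converges).
n = 100
rng = np.random.default_rng(0)
A = rng.random((n, n)) + n * np.eye(n)
b = rng.random(n)
x, res = adaptive_precision_jacobi(A, b)
```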

There are a number of possible applications for h3-Open-BDEC’s simulation, data, and learning framework. Two typical examples are (i) models that include data assimilation and (ii) atmosphere-ocean coupling in weather and climate simulations. The framework could also be used for real-time disaster simulations of events such as floods or tsunamis. Nakajima focused on a related application for the remainder of his talk: earthquake simulations with real-time data assimilation.

Figure 2. A map of the earthquakes that occur in Japan.
Japan is prone to earthquakes, and more than 300 occur there every day (see Figure 2). Most of these are small quakes, but larger earthquakes do occur with disastrous results. “The task of our simulations is to understand earthquake dynamics,” Nakajima said. However, earthquake simulations always have a large amount of uncertainty due to fluctuations and unknown underground structures. The integration of real observations with simulations is therefore essential. Researchers have traditionally employed forward simulations for this type of modeling, but new methods that combine simulations with real-time data assimilation are under development. 

A recent body of research has been moving towards integrating earthquake simulations with observations to model the propagation of seismic waves. Nakajima’s team used a forecasting model based on data assimilation called Seism3D/OpenSWPC-DAF, which was developed by Takashi Furumura of the Earthquake Research Institute at the University of Tokyo. The seismic observations for the model were provided by the Japan Data Exchange Network, a network of earthquake data that is made available in real time through the Science Information NETwork and produces data on the order of 100 gigabytes per day. To assimilate all of these data, h3-Open-BDEC used a simple linear technique called optimal interpolation. The model began by simultaneously assimilating observations and simulating the earthquake, then switched to pure simulation after a certain amount of time.
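Optimal interpolation is a standard linear assimilation update: the forecast state is corrected toward the observations by a gain matrix built from prescribed error covariances. The sketch below shows the textbook form of that update; the observation operator and covariance choices here are placeholders, and this is not the Seism3D/OpenSWPC-DAF code.

```python
# Textbook optimal-interpolation update -- a generic sketch, not the actual
# Seism3D/OpenSWPC-DAF implementation. The forecast x_f is corrected toward the
# observations y with a fixed gain built from prescribed error covariances.
import numpy as np

def optimal_interpolation(x_f, y, H, B, R):
    """x_f: forecast state (n,); y: observations (m,);
    H: observation operator (m, n); B, R: background/observation covariances."""
    S = H @ B @ H.T + R                     # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)          # fixed gain K = B H^T (H B H^T + R)^-1
    return x_f + K @ (y - H @ x_f)          # analysis = forecast + weighted innovation

# Tiny illustration: a five-variable state observed at two of its components.
n, m = 5, 2
x_f = np.zeros(n)                               # background forecast
H = np.zeros((m, n)); H[0, 1] = H[1, 3] = 1.0   # "stations" observe variables 1 and 3
B = 0.5 * np.eye(n)                             # assumed background error covariance
R = 0.1 * np.eye(m)                             # assumed observation error covariance
y = np.array([1.0, -0.5])                       # incoming observations
x_a = optimal_interpolation(x_f, y, H, B, R)
```

Because the gain is fixed rather than evolved in time, the update is cheap enough to keep pace with the roughly 100 gigabytes of observations arriving per day.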

Figure 3. Results of h3-Open-BDEC as applied to simulating the 2007 Niigata earthquake. A+S refers to data assimilation and simulation occurring simultaneously, while Pure S refers to simulation only. ATW refers to the time at which the model switched between those two paradigms. The resulting surface waves show that a longer ATW produces results that more closely match the actual event.
Nakajima presented an example of h3-Open-BDEC’s earthquake simulation that used data from the magnitude 6.6 earthquake that struck near Niigata, Japan in 2007. There were almost 350 observation points for this earthquake, providing plenty of data for the model to work with. He found that a longer assimilation time window (ATW), i.e., a later switch from assimilation plus simulation to pure simulation, produced more accurate results with less error (see Figure 3). The trade-off, of course, is that a longer ATW means a longer run time.
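Conceptually, the A+S/Pure S workflow in Figure 3 is a single time-stepping loop with an assimilation cut-off at the ATW. The toy sketch below shows where the ATW enters the loop; a one-dimensional wavefield and a simple nudging step stand in for the real seismic solver and assimilation routine, and all names, stations, and parameters are illustrative rather than taken from the actual system.

```python
# Schematic of the A+S -> Pure S switch controlled by the assimilation time
# window (ATW). The wave solver and assimilation step are toy stand-ins.
import numpy as np

def step_wave_propagation(u, u_prev, c=1.0, dx=1.0, dt=0.5):
    """Leapfrog update of a 1-D wave equation (toy stand-in for the seismic solver)."""
    lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)
    return 2 * u - u_prev + (c * dt / dx) ** 2 * lap, u

def assimilate(u, obs, idx, weight=0.5):
    """Nudge the simulated field toward observed values at station indices."""
    u = u.copy()
    u[idx] += weight * (obs - u[idx])
    return u

def run_forecast(u0, observe, stations, n_steps, atw_steps):
    u, u_prev = u0.copy(), u0.copy()
    for k in range(n_steps):
        u, u_prev = step_wave_propagation(u, u_prev)
        if k < atw_steps:
            # A+S phase: correct the wavefield with (here, synthetic) observations.
            u = assimilate(u, observe(k), stations)
        # After atw_steps the loop continues unchanged: pure simulation (Pure S).
    return u

# Example: 200-point domain, three "stations", assimilation during the first
# 40 of 100 steps (the ATW), then pure simulation for the remainder.
rng = np.random.default_rng(0)
u0 = np.exp(-0.01 * (np.arange(200) - 100.0) ** 2)             # initial pulse
stations = np.array([50, 100, 150])
observe = lambda k: rng.normal(0.0, 0.05, size=stations.size)  # synthetic data
forecast = run_forecast(u0, observe, stations, n_steps=100, atw_steps=40)
```

Lengthening the ATW in this picture simply means spending more steps in the corrected (A+S) branch before letting the model run freely, which mirrors the accuracy-versus-runtime trade-off Nakajima described.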

This work was done offline, using filtered observations of past earthquake events. The process of moving towards forecasts of current earthquake events is ongoing and requires the integration of filtering, data assimilation, and forecasting. These real-time simulations must run significantly faster than the real phenomenon to be of use in emergency situations, and achieving that speed consumes a great deal of computing resources. Nakajima and his collaborators have been doing preliminary work on the Oakbridge-CX (OBCX) supercomputer at the University of Tokyo, and will switch to Wisteria/BDEC-01 when it begins operations in May. Simulating 60 seconds of earthquake propagation in just six seconds of real time, i.e., ten times faster than the actual event, requires a substantial share of OBCX’s resources: more than 70 nodes during data assimilation and around 360 nodes during pure simulation.

In the future, Nakajima hopes to use simulation, data, and learning to further enhance the accurate prediction of seismic wave propagation through real-time data observation and assimilation. “Emergency information for safer evacuation is one of the goals for this system,” Nakajima said. He also hopes to improve the three-dimensional model of the underground environment for earthquake simulations. This is challenging, as the underground structure is heterogeneous and hard to observe directly, but machine learning may be able to accelerate the process.

Jillian Kunze is the associate editor of SIAM News.

 
