SIAM News Blog

The Future of Scientific Computation

By Bruce Hendrickson

Attempts to predict the future have a long and inglorious history. Cultures from time immemorial have devoted their very best technologies to the task, utilizing apparatuses such as tea leaves, crystal balls, and animal entrails. But as Niels Bohr famously observed, “prediction is very difficult, especially if it’s about the future.”

Certainly, many aspects of the world are inherently unpredictable. But humans have made enormous progress in recent decades regarding certain kinds of predictions. We can forecast weather many days in advance, predict the ecological impact of different policy options, anticipate the effect of interest rate changes on the economy, and foresee the implications of greenhouse gas emissions on the climate in 2100.

This predictive power comes not from new insight into supernatural divination, but from advances in applied mathematics and computational science. In fact, mathematicians make predictions all the time; we usually call it “time integration.” Are there lessons from applied mathematics that might provide fresh insight into thinking about the future — particularly the future of scientific computing itself? In an invited talk at the 2018 SIAM Annual Meeting, I will argue that there are.
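As a minimal illustration of prediction-as-time-integration (a sketch, not from the talk), consider stepping a hypothetical logistic growth model forward with the forward Euler method: given today's state and a law of motion, we compute tomorrow's state, and repeat.

```python
# Illustrative sketch: "prediction" as time integration.
# Forward Euler applied to logistic growth dy/dt = r*y*(1 - y/K);
# the model, rates, and names here are hypothetical.
def forecast(y0, r=0.5, K=100.0, dt=0.01, t_end=10.0):
    """March the state y forward from t=0 to t_end."""
    y, t = y0, 0.0
    while t < t_end:
        y += dt * r * y * (1.0 - y / K)  # one Euler step
        t += dt
    return y

# Starting from y(0) = 5, the forecast approaches the carrying capacity K.
print(forecast(5.0))
```

The scheme is deliberately the simplest possible one; the point is only that a forecast is nothing more than repeated application of the system's update rule.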

Applied mathematicians typically develop or utilize sets of equations that describe the system being modeled. These equations are commonly the result of conservation laws: temporal invariants that greatly constrain the system’s possible future evolution. Might there be analogous invariants, albeit less rigorous than conservation of mass, that constrain the evolution of scientific fields?
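A toy sketch of how a temporal invariant constrains evolution (the model and names are illustrative, not from the article): in a closed two-box exchange system, mass can move between the boxes, but the total is conserved, so every reachable future state must lie on the line a + b = constant.

```python
# Hypothetical closed two-box exchange model. Mass flows between
# boxes at rate k, but total mass is a temporal invariant.
def step(a, b, k=0.1, dt=0.1):
    """One explicit step; the flux is proportional to the difference."""
    flux = k * (a - b) * dt
    return a - flux, b + flux  # whatever leaves box a enters box b

a, b = 9.0, 1.0
total0 = a + b
for _ in range(1000):
    a, b = step(a, b)

# The dynamics equilibrate the boxes, but the conservation law pins
# the trajectory to a + b = total0 throughout.
assert abs((a + b) - total0) < 1e-9
print(a, b)  # both boxes approach 5.0
```

The invariant does not determine the future uniquely, but it rules out most conceivable futures, which is exactly the role conjectured for "invariants" of scientific fields.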

One constant in scientific computing is the steady advance of mathematical models, numerical formulations, and efficient, robust algorithms, which together underpin every computation. But unlike most scientific disciplines, scientific computing benefits from another “invariant” known as Moore’s Law: Gordon Moore’s empirical observation, first made in 1965, that the transistor density of integrated circuits grows exponentially. The natural corollary of this observation for the scientific computing community is that the computers of the next decade will be dramatically superior to those of today, even if the precise technology path is unclear. This “invariant” looks likely to hold for at least another decade. As Isaac Newton observed, scientific communities advance by standing on the shoulders of the giants who preceded them. But Moore’s Law has given scientific computing an additional turbo boost; it is as if our giants were riding up an escalator!
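The arithmetic behind the "escalator" is simple compounding. Assuming a two-year doubling period (a commonly cited reading of Moore's Law; the exact period is an assumption here, not a claim from the article), a fixed budget buys roughly 32 times the transistor count after one decade:

```python
# Sketch of exponential capability growth under Moore's Law.
# The doubling_period is an assumed parameter, not a fixed constant.
def projected_growth(years, doubling_period=2.0):
    """Relative growth factor after `years` of steady doubling."""
    return 2.0 ** (years / doubling_period)

print(projected_growth(10))  # 32.0: one decade at a two-year doubling period
```

Even if the true doubling period drifts, any roughly exponential trend yields the qualitative conclusion in the text: next decade's machines will be dramatically more capable than today's.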

Applied mathematics offers a glimpse into the future of scientific computing. Bruce Hendrickson will address specifics at the 2018 Annual Meeting.

Ever-faster computers have enabled the consistent progress of scientific computing in a manner that is quite clear in hindsight. Research focus areas in one decade have often become the building blocks for more ambitious goals in the subsequent decade. For example, intense research on scalable linear solvers in the 1980s and early 1990s enabled very large-scale implicit simulations by the turn of the century. The increased speed of these highly resolved forward simulations then allowed the community to focus on outer-loop questions, such as uncertainty quantification, design optimization, and inverse problems. This consistent, rapid progress has enabled researchers to continually shift focus to more expansive questions that require new advances in mathematical formulations and algorithms.

Armed with this perspective, I believe it is possible to integrate forward in time from today and gain insight into likely trends for the next decade of scientific computing. I will share the conclusions of this projection at the 2018 SIAM Annual Meeting.

Bruce Hendrickson is the associate director for computation at Lawrence Livermore National Laboratory.