SIAM News Blog

Scientific Computing, Machine Learning, and Data Science: Recurring Themes at CSE19

By Paul Davis

To its nearly 2,000 attendees, the 2019 SIAM Conference on Computational Science and Engineering (CSE19), which took place from February 25-March 1 in Spokane, Wash., may have seemed more like a lively music festival than a specialized scientific meeting. On the central stage, prize lecturers reprised beloved favorites while up-and-comers debuted newer hits. A host of minisymposia on the numerous smaller platforms drew their own devoted audiences. The overall effect was an engaging mix of robust scientific computing, innovative machine learning, and insightful data science. Old barriers fell while new ones were scaled.

Jack Dongarra, recipient of the SIAM/ACM Prize in Computational Science and Engineering, delivered a prize lecture that offered a numerical nostalgia (if not magical mystery!) tour of the singular value decomposition (SVD), which has been the “working horse” of linear algebra for nearly half a century. The importance of efficient SVD algorithms is difficult to overstate. For example, SVD is a key tool for constructing reduced-order models — the subject of more than 20 minisymposia at this meeting alone.
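To see why the SVD is central to reduced-order modeling, consider a minimal sketch (the snapshot matrix and its size are purely illustrative): truncating the SVD gives the best low-rank approximation of a data matrix, which is the starting point of many model-reduction techniques.

```python
import numpy as np

# Hypothetical snapshot matrix: 100 state variables sampled at 30 time steps,
# constructed here so that its true rank is 3.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 30))

# Compute the (thin) SVD and keep only the k dominant singular triplets.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 3
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By the Eckart-Young theorem, A_k is the best rank-k approximation of A
# in both the Frobenius and spectral norms.
print(np.linalg.norm(A - A_k))  # essentially zero here, since A has exact rank 3
```

In a genuine reduced-order model, the columns of `U[:, :k]` would serve as a low-dimensional basis onto which the full dynamics are projected.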

Alistair Adcroft of Princeton University and the U.S. National Oceanic and Atmospheric Administration speaks about ocean modeling during the 2019 SIAM Conference on Computational Science and Engineering, held in Spokane, Wash., earlier this year. SIAM photo.
Dongarra’s tour began with EISPACK and its circa 1970 element-wise Fortran operations. It then proceeded through various versions of LAPACK and increasingly complex arrangements of Basic Linear Algebra Subprograms (BLAS) before concluding with the latest and greatest multiprocessors. Along the way, many attendees likely heard at least one tune that they knew. Most could probably recall the first time they had “played” that tune, as well as the machine they used then.

Dongarra, who holds appointments at the University of Tennessee and Oak Ridge National Laboratory, made his point convincingly: a steady stream of software hits arises from an artful coupling of the right mathematical formulation with algorithms and software shaped to exploit evolving computational architectures [1].

A segue from Dongarra’s computational retrospective could have led to the two-part computational environment spawned partly by his and his colleagues’ work: one class of languages for algorithmic prototypes and another for highly efficient production software. Jeffrey Bezanson, Stefan Karpinski, and Viral Shah of Julia Computing, Inc. received the James H. Wilkinson Prize for Numerical Software for their success in harmonizing the two. Their corresponding talk surveyed some of the ways in which Julia solves that two-language problem in scientific computing and machine learning [2].1

Another player might have turned from Dongarra’s study of matrices to the potential role of tensors—matrices’ higher-order cousins—in machine learning. During her invited lecture, Anima Anandkumar of NVIDIA and the California Institute of Technology did exactly that. Instead of the pairwise correlations that matrices capture, tensors can encode third-order correlations — to detect topics in texts through the co-occurrence of word triplets, for instance. Anandkumar observed that BLAS successively evolved from Level I vector-vector computations to Level II matrix-vector and finally Level III matrix-matrix computations. Might we now, she conjectured, expect BLAS Level IV tensor-tensor computations?
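The word-triplet idea can be sketched in a few lines. The toy corpus and counting scheme below are illustrative assumptions, not details from the talk; they simply show how a third-order tensor generalizes the familiar pairwise co-occurrence matrix.

```python
import numpy as np
from itertools import combinations, permutations

# Toy corpus of three short "documents" (purely illustrative).
docs = [["ocean", "model", "eddy"],
        ["ocean", "eddy", "heat"],
        ["model", "data", "learning"]]
vocab = sorted({w for d in docs for w in d})
idx = {w: i for i, w in enumerate(vocab)}
n = len(vocab)

# Third-order tensor T[i, j, k] counts co-occurrences of word triplets
# within a document, symmetrized over all orderings of the triplet.
T = np.zeros((n, n, n))
for d in docs:
    for triple in combinations(sorted(set(d)), 3):
        for a, b, c in permutations(triple):
            T[idx[a], idx[b], idx[c]] += 1
```

Where a pairwise co-occurrence matrix can only separate topics under restrictive conditions, decompositions of such a tensor can recover richer latent structure — which is exactly the appeal of tensor methods in machine learning.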

Nitty-gritty computational modeling was well represented on the main stage. Michael Ferris of the University of Wisconsin, Madison described how stochastic optimization illuminates the policy decisions—and sometimes the unexpected consequences—of New Zealand’s bold commitment to 100 percent renewable electrical energy. As another example, Boyce Griffith of the University of North Carolina at Chapel Hill explored methods, models, and applications for fluid-structure interactions in medicine and biology.

On a vastly different scale, Alistair Adcroft of Princeton University and the U.S. National Oceanic and Atmospheric Administration outlined a critical need to improve studies of the role of the world’s oceans in climate warming. The oceans absorb and circulate significant amounts of excess solar heat, but practical ocean models currently cannot resolve the dynamics of convecting eddies.

Michael Ferris of the University of Wisconsin, Madison addresses a crowded lecture hall during his invited talk on renewable electricity at the 2019 SIAM Conference on Computational Science and Engineering, which took place earlier this year in Spokane, Wash. SIAM photo.
Steven Brunton of the University of Washington addressed data-driven discovery of underlying physical laws in both a minisymposium presentation and his SIAG/CSE Early Career Prize lecture. He suggested that the challenge of learning physics from data—a recurring theme at CSE19—draws upon many aspects of CSE, including data science, machine learning, computational modeling, high-performance computing, and optimization. Brunton and his colleagues judiciously choose coordinates and measurements to derive interpretable and generalizable models by simultaneously identifying a model’s structure and its parameters’ values.
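The core of that simultaneous identification can be conveyed with a tiny sparse-regression sketch in the spirit of Brunton and colleagues' approach. Everything below — the one-dimensional dynamics, the candidate library, and the threshold value — is an illustrative assumption, not their actual implementation.

```python
import numpy as np

# "Measurements" of a state x and its derivative for the (assumed) true
# dynamics dx/dt = 0.5*x - 1.5*x**3, sampled without noise for simplicity.
rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 200)
dxdt = 0.5 * x - 1.5 * x**3

# Library of candidate terms: [1, x, x^2, x^3]. The model structure is
# unknown a priori; we only assume it is sparse in this library.
Theta = np.column_stack([np.ones_like(x), x, x**2, x**3])

# Least squares followed by hard thresholding of small coefficients:
# the surviving nonzero entries reveal both the structure of the model
# and its parameter values at the same time.
xi, *_ = np.linalg.lstsq(Theta, dxdt, rcond=None)
xi[np.abs(xi) < 0.1] = 0.0
print(xi)  # approximately [0, 0.5, 0, -1.5]
```

The recovered coefficient vector is interpretable by construction: each nonzero entry names a physical term in the governing equation, which is what distinguishes this style of model discovery from a black-box fit.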

Opening a minisymposium on scientific machine learning, Nathan Baker of Pacific Northwest National Laboratory summarized a workshop on physics-based machine learning that was conducted through the Department of Energy’s Advanced Scientific Computing Research program. The workshop aimed to extract from the larger mass of machine learning challenges those that are specific to science and engineering — or as Baker aptly put it, to distinguish the problems of managing “the electric grid from targeting ads for diapers.” For instance, many machine learning methods lack the mathematical foundation necessary for understanding their robustness and sensitivity. Those matters become more critical when they guide higher-stakes decisions.

Risk and uncertainty quantification constitute yet another technically difficult but vitally important genre within CSE, especially when it comes to choosing models and methods. Tobin Isaac of the Georgia Institute of Technology, a rising young performer speaking from the main stage, addressed these problems in his SIAG/CSE Best Paper Prize lecture. He described his and his colleagues’ careful identification of the methodological components needed to model uncertainty propagation in problems like predicting the flow of the Antarctic ice sheet [3].

The data sciences were no less ubiquitous. For example, Deanna Needell of the University of California, Los Angeles and Giseon Heo of the University of Alberta worked with the Association for Women in Mathematics to organize a two-part minisymposium on data science. The research reported during these sessions ranged from detecting data anomalies to automating the diagnosis of sleep apnea. The eight papers in the minisymposium were coauthored by a variety of teams that involved 16 distinct individuals in total. The teams began their collaborations in 2017 during a one-week summer research workshop at Brown University’s Institute for Computational and Experimental Research in Mathematics. That week was certainly effective, as those 16 collaborators represented almost one-third of workshop participants!

Other minisymposia talks addressed some wide-ranging data science problems. Two examples are Microsoft’s efforts to automate farmers’ management of their fields and Vanguard’s studies of online financial advice driven by artificial intelligence.

No SIAM meeting would be complete without formal and informal attention to the experiences of individual members. SIAM President Lisa Fauci organized and moderated a panel on recruiting strategies for diversity and inclusion. A minisymposium highlighted the exciting work of younger underrepresented mathematicians. And multiple panels explored workplace issues and experiences in the data sciences.

Be sure to check out future issues of SIAM News for more detailed coverage of these and other hits from CSE19. Additionally, video recordings of all CSE19 invited and prize presentations are available from SIAM; slides with synchronized audio for select minisymposia, along with PDFs of the slides, are also available.

1 Bezanson, Karpinski, and Shah acknowledged their fourth collaborator, Alan Edelman of the Massachusetts Institute of Technology.

References
[1] Dongarra, J., Gates, M., Haidar, A., Kurzak, J., Luszczek, P., Tomov, S., & Yamazaki, I. (2018). The Singular Value Decomposition: Anatomy of Optimizing an Algorithm for Extreme Scale. SIAM Rev., 60(4), 808-865.
[2] Edelman, A. (2016, March). Julia: A Fast Language for Numerical Computing. SIAM News, 49(2), p. 5. 
[3] Isaac, T., Petra, N., Stadler, G., & Ghattas, O. (2015). Scalable and efficient algorithms for the propagation of uncertainty from data through inference to prediction for large-scale problems, with application to flow of the Antarctic ice sheet. J. Comput. Phys., 296, 348-368.

Paul Davis is professor emeritus of mathematical sciences at Worcester Polytechnic Institute.
