
Panelists Talk Machine Learning and the Future of Mathematics at ICIAM 2019

By Hans De Sterck, Gitta Kutyniok, James Nagy, and Eitan Tadmor

The excitement and activity surrounding the field of machine learning were clearly evident at the 9th International Congress on Industrial and Applied Mathematics (ICIAM 2019), which took place this summer in Valencia, Spain. Over 25 minisymposia—as well as several prize lectures and invited talks—touched on the theme of “learning,” while other invited presentations addressed important mathematical research challenges necessary to advance the field.

A special panel on the future of mathematics in the age of machine learning explored the topic in detail. Panelists Hans De Sterck (University of Waterloo), Gitta Kutyniok (Technische Universität Berlin), James Nagy (Emory University), and Eitan Tadmor (University of Maryland, College Park) represented various core areas of computational and applied mathematics that develop and utilize machine learning techniques, including computational science and engineering, imaging science, linear algebra, and partial differential equations.

Discussion broached a variety of issues surrounding machine learning, such as the concern that the field will remain, as machine learning researcher Ali Rahimi put it, “an area comparable to alchemy” without new mathematical understanding and developments. Deep learning is among the most transformative technologies of our time, and its many potential applications—from driverless cars to drug discovery—can have tremendous societal impact. Yet although deep learning attracts significant public interest, lay people are largely unaware of the key mathematical and computational challenges in the field. These challenges are particularly acute when many layers are required to interpret highly complex data patterns. Unfortunately, few theoretical results currently support the wealth of practical experience showing that deep learning algorithms can produce remarkable results for high-dimensional data.

Multiple presenters at ICIAM 2019 are attempting to address this gap in mathematical theory by developing novel ways of interpreting deep learning as a dynamic optimal control problem with ordinary differential equation (ODE) and partial differential equation (PDE) models. These interpretations allow concepts from applied mathematics to provide a rigorous theoretical basis for designing and training deep neural networks, and to offer insight into how such networks arrive at their predictions. A number of minisymposia at the conference focused on the mathematical foundations of deep learning.
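As a concrete, intentionally simplified illustration of this viewpoint, consider the well-known observation that a residual layer of the form x_new = x + h*f(x, theta) is exactly a forward Euler step for the ODE dx/dt = f(x(t), theta(t)); choosing the weights to minimize a loss on the final state is then a discretized optimal control problem. The sketch below makes this correspondence explicit in Python. The layer function, step size, and dimensions are illustrative assumptions on our part, not details drawn from any particular ICIAM talk.

```python
import numpy as np

def f(x, W, b):
    """Illustrative layer vector field: a tanh nonlinearity
    (an assumption for this sketch, not a method from the article)."""
    return np.tanh(W @ x + b)

def resnet_forward(x0, weights, biases, h=0.1):
    """A residual network read as forward Euler applied to
    dx/dt = f(x(t), theta(t)): each layer adds h * f(x, theta_k)."""
    x = x0
    for W, b in zip(weights, biases):
        x = x + h * f(x, W, b)   # one residual block == one Euler step
    return x

# Toy usage: 10 "layers" acting on a 4-dimensional state.
rng = np.random.default_rng(0)
weights = [0.1 * rng.standard_normal((4, 4)) for _ in range(10)]
biases = [np.zeros(4) for _ in range(10)]
print(resnet_forward(rng.standard_normal(4), weights, biases))
```

Viewed this way, questions about network depth, stability, and training become questions about discretization, well-posedness, and control of the underlying differential equation, which is precisely what makes ODE and PDE theory applicable.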

Similarly, computational mathematics plays an increasingly important role in machine learning by providing new, efficient optimization algorithms and scalable parallel numerical methods for deep network training. These techniques are essential when training very large networks on enormous amounts of data at a scale that demands high-end parallel computing infrastructure, thus pushing the boundaries of machine learning’s capabilities. Several minisymposia at ICIAM 2019 emphasized these novel mathematical developments and their applications in materials science, finance, signal and image processing, molecular dynamics, and inverse problems.
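To make the parallel-training point concrete, the following sketch shows the basic structure of data-parallel stochastic gradient descent: each worker computes a gradient on its own shard of the data, and the gradients are averaged before every parameter update (the step that an all-reduce performs on a real parallel machine). This is a generic illustration using a simple least-squares model and simulated workers; it is not a description of any specific method presented at the congress.

```python
import numpy as np

def local_gradient(w, X, y):
    """Least-squares gradient computed on one worker's data shard."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def data_parallel_sgd(shards, dim, lr=0.1, steps=100):
    """Each 'worker' holds one (X, y) shard; their gradients are
    averaged before the shared parameters are updated."""
    w = np.zeros(dim)
    for _ in range(steps):
        grads = [local_gradient(w, X, y) for X, y in shards]
        w -= lr * np.mean(grads, axis=0)   # stands in for an all-reduce
    return w

# Toy usage: 4 simulated workers, 3 parameters.
rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0, 0.5])
shards = []
for _ in range(4):
    X = rng.standard_normal((50, 3))
    shards.append((X, X @ w_true))
print(data_parallel_sgd(shards, dim=3))
```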

The panel generated lively conversation on a multitude of issues, including the current limitations of machine learning. For example, more than one attendee expressed concern that machine learning might only be useful “when being wrong is not dangerous.” Others noted that limitations revolve primarily around data quality (human involvement is typically essential for data labeling, which can introduce biases) and the need for massive computing resources. Because current machine learning models are highly susceptible to errors, poor or biased data can induce significant failures. Moreover, learning and verification are time consuming, and since models demand continual retraining, the associated computational requirements will only grow. Furthermore, if machine learning algorithms are to run on handheld devices (without the need for cloud computing resources, which can suffer from significant latency), new approaches that require less data and computation are also necessary. Overcoming many of these limitations requires advances in core areas of computational and applied mathematics, including linear algebra, PDEs, optimization, inverse problems, high-performance computing, statistics, and uncertainty quantification.

At the same time, mathematical disciplines like inverse problems and numerical analysis of PDEs are themselves being impacted by machine learning techniques. Such methods—foremost deep neural networks—can quickly lead to state-of-the-art approaches, particularly for inverse problems in the imaging sciences. This is due in part to the fact that complete physical models are often lacking in imaging science, which makes data-driven methods quite effective. From a numerical standpoint, a particularly promising conceptual approach is to use model-based methods for as long as they are reliable and to exploit learning-type methodologies when they are not. Developing a mathematical underpinning for machine learning and for such hybrid approaches to inverse problems is one main direction of future research. The field of PDEs was slower to embrace machine learning methods, but theoretical results in the high-dimensional regime—typically demonstrating that deep neural networks can overcome the curse of dimensionality—are already available. Research in all of these directions is accelerating.
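One way to picture such a hybrid strategy, purely as a schematic of our own and not a method advocated by any particular speaker, is to apply a classical model-based reconstruction first and a learned correction afterward. In the sketch below, the model-based step is Tikhonov-regularized least squares, while the “learned” correction is a placeholder function that marks where a trained network would enter.

```python
import numpy as np

def tikhonov_reconstruct(A, y, alpha=1e-2):
    """Model-based step: solve min_x ||A x - y||^2 + alpha ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

def learned_correction(x):
    """Placeholder for a trained network that removes remaining artifacts.
    Here it is just a mild smoothing filter (purely illustrative)."""
    return np.convolve(x, np.array([0.25, 0.5, 0.25]), mode="same")

def hybrid_reconstruct(A, y):
    """Use the physical model where it is reliable, then let the
    data-driven component clean up what the model misses."""
    return learned_correction(tikhonov_reconstruct(A, y))

# Toy usage: recover a smooth 1D signal from noisy "integrated" data.
rng = np.random.default_rng(2)
x_true = np.sin(np.linspace(0, np.pi, 40))
A = np.tril(np.ones((40, 40))) / 40.0          # crude integration operator
y = A @ x_true + 0.01 * rng.standard_normal(40)
print(np.linalg.norm(hybrid_reconstruct(A, y) - x_true))
```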

Students pursuing degrees in areas of data science, including machine learning—whether in mathematics or computer science—must be trained in the relevant mathematical fields. The machine learning revolution and its surrounding excitement may slow down in the coming years, but the technology itself is not going to disappear. We can therefore expect the number of undergraduate and graduate students enrolled in computational and applied mathematics courses to increase significantly in the next decade and remain high for the foreseeable future. Moreover, the mathematical community should consider adapting some of its core curricula to include additional topics related to the mathematics of data science. This would be especially beneficial both in courses for mathematics students and in mathematics courses taken by students from the natural sciences and engineering.

Knowledge of machine learning methods is becoming increasingly important even for humanities students. In this sense—and from an educational viewpoint—many ICIAM 2019 attendees are expecting a paradigm shift. Data science and machine learning are rapidly evolving into the leading quantitative and computational endeavors of our time, transforming the way in which society functions. Mathematics, statistics, and mathematics-based algorithms are foundational building blocks of this revolution, and the role and influence of further mathematical developments will only increase as the revolution continues to unfold.

Hans De Sterck is a professor of applied and computational mathematics at the University of Waterloo in Canada. His research focuses on numerical methods for computational science and data science. Gitta Kutyniok is Einstein Professor of Mathematics and head of the Applied Functional Analysis Group at the Technische Universität Berlin. Her research focuses on applied harmonic analysis, compressed sensing, deep learning, imaging science, inverse problems, and numerical analysis of partial differential equations (PDEs). James Nagy is Samuel Candler Dobbs Professor and chair of the Department of Mathematics at Emory University. His research focuses on numerical linear algebra, structured matrix computations, numerical solution of inverse problems, and image processing. Eitan Tadmor is a Distinguished University Professor at the University of Maryland, College Park. His research focuses on theory and computation of PDEs with applications to shock waves, kinetic transport, incompressible flows, image processing, and self-organized collective dynamics.