
Trends in Linear Algebra

By Jennifer Pestana

As secretary of the SIAM Activity Group on Linear Algebra (SIAG/LA), I have had the pleasure of becoming better acquainted with our members and their work. The breadth of new and innovative linear algebra research—showcased at the most recent SIAM Annual Meeting, International Linear Algebra Society (ILAS) Conference, and SIAM Conference on Applied Linear Algebra—is far greater than I can convey here. Instead, I will touch on some of the ways our community is handling one of our most pressing issues—ever-larger matrices and, increasingly, tensors—as well as some of the applications to which linear algebra can contribute and from which it can learn.

Some of our most visible recent success stories—randomized numerical linear algebra (RandNLA), and low-rank and data-sparse approximations—have found new ways to reduce complexity in matrix and tensor problems. The key idea in RandNLA, the subject of the 2015 Gene Golub SIAM Summer School, is to use random sampling to obtain a simpler problem than the original [3, 10]. Randomized algorithms for solving linear systems, computing low-rank approximations, and preconditioning have all been proposed in recent years. However, a number of interesting open problems remain, including a fuller understanding of when randomized algorithms outperform deterministic methods, and of when and how to combine randomized and deterministic computations.
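To make the sampling idea concrete, the sketch below implements one of the standard RandNLA templates in NumPy: a randomized range finder followed by an SVD of the small projected matrix [3, 10]. The matrix, target rank, and oversampling parameter are illustrative choices only.

```python
import numpy as np

def randomized_low_rank(A, rank, oversample=10, rng=None):
    """Basic randomized low-rank approximation (range finder + small SVD).

    Draw a Gaussian test matrix, sketch Y = A @ Omega in one pass over A,
    orthonormalize Y to capture an approximate range of A, then compute the
    SVD of the small projected matrix. Returns U, s, Vt with
    U @ diag(s) @ Vt approximately equal to A.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    k = min(rank + oversample, n)

    Omega = rng.standard_normal((n, k))   # random test matrix
    Y = A @ Omega                         # sketch of the column space
    Q, _ = np.linalg.qr(Y)                # orthonormal basis for range(Y)

    B = Q.T @ A                           # small (k x n) projected matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Toy usage: approximate a matrix of exact rank 40.
rng = np.random.default_rng(0)
A = rng.standard_normal((500, 40)) @ rng.standard_normal((40, 300))
U, s, Vt = randomized_low_rank(A, rank=40)
print(np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A))
```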

Low-rank matrix decompositions are also receiving growing attention. Although the singular value decomposition (SVD), which underpins principal component analysis (PCA), is a well-established tool for obtaining low-rank approximations, it can be sensitive to outliers. Additionally, in certain applications it is far easier to interpret low-rank factors that satisfy constraints, e.g., entrywise nonnegativity. These shortcomings of the SVD have motivated new low-rank decompositions, such as robust PCA, sparse PCA [7], and nonnegative factorizations [4]. These decompositions have also influenced, and been influenced by, low-rank matrix completion, which aims to fill in missing matrix entries using low-rank factors. Matrix completion additionally draws on advances in compressed sensing, another area to which linear algebra has made valuable contributions [1].
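As a concrete illustration of factorization under constraints, here is a toy sketch of the classical multiplicative updates for a nonnegative factorization A ≈ WH. The matrix, rank, and iteration count are arbitrary choices, and practical nonnegative factorization algorithms [4] are considerably more refined.

```python
import numpy as np

def nmf_multiplicative(A, rank, iters=200, eps=1e-10, rng=None):
    """Toy nonnegative matrix factorization A ~ W @ H with W, H >= 0,
    via the classical multiplicative updates for the Frobenius objective."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(iters):
        H *= (W.T @ A) / (W.T @ W @ H + eps)   # update keeps H entrywise nonnegative
        W *= (A @ H.T) / (W @ H @ H.T + eps)   # update keeps W entrywise nonnegative
    return W, H

# Toy usage on a random nonnegative matrix.
rng = np.random.default_rng(1)
A = rng.random((60, 40))
W, H = nmf_multiplicative(A, rank=5)
print(np.linalg.norm(A - W @ H) / np.linalg.norm(A))
```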

Closely related to low-rank approximations are rank-structured and hierarchically rank-structured matrices, which compactly represent (approximately) low-rank submatrices. Examples of these data-sparse representations are H- and H2-matrices and hierarchically semiseparable (HSS) matrices [6, 9]. Research here has focused both on the rank-structured decompositions themselves and on lower-complexity algorithms for working with these matrices.
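The observation these formats exploit fits in a few lines of NumPy: the block of a kernel matrix coupling two well-separated point clusters is numerically low rank and can be stored cheaply. The kernel and tolerance below are illustrative choices; real H-, H2-, and HSS codes apply the idea recursively over a hierarchy of blocks.

```python
import numpy as np

# Two well-separated 1D point clusters and the kernel block coupling them.
x = np.linspace(0.0, 1.0, 400)                        # cluster 1
y = np.linspace(2.0, 3.0, 400)                        # cluster 2
B = 1.0 / (1.0 + np.abs(x[:, None] - y[None, :]))     # off-diagonal kernel block

# A truncated SVD reveals a small numerical rank, so the block can be stored
# as two thin factors with O(r * (m + n)) entries instead of m * n.
U, s, Vt = np.linalg.svd(B, full_matrices=False)
r = int(np.sum(s > 1e-8 * s[0]))                      # numerical rank at relative tol 1e-8
B_lr = (U[:, :r] * s[:r]) @ Vt[:r, :]

print(r, np.linalg.norm(B - B_lr) / np.linalg.norm(B))
```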

Attendees at the 20th Conference of the International Linear Algebra Society (ILAS) held in Leuven, Belgium in July.

Low-rank and data-sparse representations become even more imperative as we move from matrices to tensors, since the number of elements grows exponentially with the tensor order. The importance of beating this “curse of dimensionality” to the linear algebra community is clear, with seven of the top 20 papers in the SIAM Journal on Matrix Analysis and Applications (SIMAX) focusing on tensors. Much research has targeted the understanding and computation of low-rank decompositions, including the CP, Tucker, tensor-train, and hierarchical Tucker formats [5, 8], which have the potential to benefit many applications. For example, tensor methods could have a role to play in solving currently intractable high-dimensional problems in chemistry and physics.
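As one concrete example of such a format, the sketch below implements a bare-bones TT-SVD, which builds a tensor-train representation through sequential truncated SVDs of matricizations [5, 8]. The toy tensor and the rank cap are illustrative only.

```python
import numpy as np

def tt_svd(X, max_rank):
    """Toy TT-SVD: decompose a d-way array X into tensor-train cores by
    sequential truncated SVDs of matricizations. Returns cores G[k] of
    shape (r_{k-1}, n_k, r_k)."""
    dims, d = X.shape, X.ndim
    cores, r_prev = [], 1
    C = X.reshape(r_prev * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        C = (s[:r, None] * Vt[:r, :]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract TT cores back to a full array (only sensible for toy sizes)."""
    full = cores[0]
    for G in cores[1:]:
        full = np.tensordot(full, G, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))

# Toy usage: a 4-way tensor with small TT ranks is recovered exactly.
rng = np.random.default_rng(2)
X = np.einsum('i,j,k,l->ijkl', *[rng.random(n) for n in (6, 7, 8, 9)])
cores = tt_svd(X, max_rank=3)
print(np.linalg.norm(tt_to_full(cores) - X) / np.linalg.norm(X))
```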

A complementary research effort is developing efficient linear algebra routines for an increasingly diverse array of hardware platforms and programming languages. Our ongoing challenge on the software side is to ensure that state-of-the-art algorithms are implemented and used in languages like Python and Julia. In terms of hardware, one of the major difficulties is adapting to the growing cost of data storage and movement on many-core machines. Communication-avoiding linear algebra [2] has made significant inroads, particularly for dense matrices, but reducing, overlapping, hiding, and predicting the cost of data storage and movement remains a focus for continued research.
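One small example of the communication-avoiding philosophy is the “tall-skinny QR” (TSQR) factorization [2]: each row block is factorized locally, and only the small triangular factors are combined, so far less data has to move. The NumPy sketch below runs the blocks sequentially purely for illustration; on real hardware each local factorization would execute on its own processor.

```python
import numpy as np

def tsqr(A, num_blocks):
    """Toy one-level TSQR for a tall-skinny matrix A (m >> n).

    Each row block is factorized independently (in parallel on real hardware,
    communicating only its small n-by-n R factor); a single QR of the stacked
    R factors then combines them. Returns Q (m x n) and R (n x n)."""
    blocks = np.array_split(A, num_blocks, axis=0)

    # Local QRs; only the small R factors would need to be communicated.
    local = [np.linalg.qr(blk) for blk in blocks]
    R_stack = np.vstack([R1 for _, R1 in local])

    # Combine step: QR of the stacked R factors.
    Q2, R = np.linalg.qr(R_stack)

    # Assemble the global Q by applying the relevant rows of Q2 to each local Q.
    Q_parts, offset = [], 0
    for Q1, R1 in local:
        k = R1.shape[0]
        Q_parts.append(Q1 @ Q2[offset:offset + k, :])
        offset += k
    return np.vstack(Q_parts), R

# Toy usage: factorize a tall-skinny matrix and check the residual and orthogonality.
rng = np.random.default_rng(3)
A = rng.standard_normal((10000, 20))
Q, R = tsqr(A, num_blocks=8)
print(np.linalg.norm(Q @ R - A) / np.linalg.norm(A),
      np.linalg.norm(Q.T @ Q - np.eye(20)))
```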

So what are the linear algebra problems that are motivating this research? They certainly include “traditional” problems from PDEs, optimization, and statistics, but also many post-forward and industrial problems, and new application areas such as complex networks and big data.

Linear algebra is contributing to a number of post-forward problems, including data assimilation, inverse and parameter estimation problems, PDE-constrained optimization and control, and uncertainty quantification (UQ). Take, for example, UQ, which formalizes existing research in statistics, applied mathematics, numerical analysis, and computer science. The multiparameter equations associated with stochastic PDEs, which are often a bottleneck in computations, can generally be formulated as tensor equations or huge matrix equations, and advances in these areas could significantly speed up UQ analyses. The recent CIME Summer School and Workshop on Matrix Equations and Tensors show that the linear algebra community is interested in these problems, while the number of talks involving matrices and tensors at the recent SIAM Conference on Uncertainty Quantification indicates that our community is already contributing to this field.
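The following toy example gives a flavor of the matrix-equation viewpoint mentioned above: a Kronecker-structured linear system, a much simplified stand-in for the huge systems that arise from such discretizations, is equivalent to a Sylvester equation AX + XB = C and can be solved with m·n unknowns rather than by forming an (mn) × (mn) matrix. The sizes and random data are purely illustrative.

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Solve the Sylvester equation A X + X B = C directly ...
rng = np.random.default_rng(4)
m, n = 60, 40
A = rng.standard_normal((m, m)) + m * np.eye(m)   # shifts keep the problem well posed
B = rng.standard_normal((n, n)) + n * np.eye(n)
C = rng.standard_normal((m, n))

X = solve_sylvester(A, B, C)                       # works with m*n unknowns
print(np.linalg.norm(A @ X + X @ B - C) / np.linalg.norm(C))

# ... and check that the same X solves the equivalent Kronecker-structured
# system (I kron A + B^T kron I) vec(X) = vec(C), whose matrix is (mn) x (mn).
K = np.kron(np.eye(n), A) + np.kron(B.T, np.eye(m))
print(np.linalg.norm(K @ X.flatten(order='F') - C.flatten(order='F')))
```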

As for new applications, many continue to be provided by industrial problems in science and engineering. Two other topics that are particularly prominent at present are network analysis and big data. In network problems, linear algebra has helped to develop and compare measures of important graph properties, such as centrality and communicability. These advances are having an impact in many areas, including physics, chemistry, biology, engineering, and the social sciences. In terms of big data, advances in RandNLA and tensor methods can aid the analysis of large data sets. Additionally, stronger interaction with statisticians and machine learning experts could yield benefits for both communities; the upcoming GAMM/ANLA Workshop will focus on linear algebra challenges in machine learning.
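For a flavor of how matrix computations enter network analysis, the sketch below evaluates two standard matrix-function-based measures on a toy graph: communicability and subgraph centrality from the exponential of the adjacency matrix, and Katz centrality from a shifted linear solve. The five-node graph and the damping parameter are illustrative choices only.

```python
import numpy as np
from scipy.linalg import expm

# Adjacency matrix of a small undirected toy graph (illustrative only).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 1],
              [0, 1, 1, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
n = A.shape[0]

# Communicability: (e^A)_{ij} sums walks from i to j, weighting length-k walks
# by 1/k!; the diagonal entries are the subgraph centralities of the nodes.
E = expm(A)
subgraph_centrality = np.diag(E)

# Katz centrality: x = (I - alpha*A)^{-1} 1, with alpha below 1/rho(A) so that
# the underlying walk series converges.
alpha = 0.5 / np.max(np.abs(np.linalg.eigvals(A)))
katz = np.linalg.solve(np.eye(n) - alpha * A, np.ones(n))

print(subgraph_centrality)
print(katz)
```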

Looking to the future, there are many open questions in matrix analysis and numerical linear algebra related to the techniques and applications mentioned above. One of our biggest opportunities as a community, however, is to engage better with others in academia and industry. Although we are fortunate that linear algebra is relatively accessible, dissemination of our results to other fields will be crucial. Importantly, our research can also benefit from contributions from other disciplines. Take, for example, low-rank approximation, where modern algorithms increasingly draw on statistics, optimization, and machine learning. SIAM provides an excellent forum for these interdisciplinary discussions, and the SIAG/LA would welcome suggestions for forging new links.

The author gratefully acknowledges thoughtful comments from Michele Benzi, Melina Freitag, Nicolas Gillis, Stefan Güttel, Tammy Kolda, Jim Nagy, Valeria Simoncini and Martin Stoll.

References

[1] Candès, E.J. (2014). Mathematics of sparsity (and a few other things). Proceedings of the International Congress of Mathematicians. Seoul, South Korea.

[2] Demmel, J. (2013). Communication-avoiding algorithms for linear algebra and beyond. Proc. Int. Parallel and Distributed Processing Symp. Boston, MA.

[3] Drineas, P., & Mahoney, M.W. (2016). RandNLA: Randomized Numerical Linear Algebra. Communications of the ACM, 59, 80-90.

[4] Gillis, N. (2014). The why and how of nonnegative matrix factorization. In J.A. Suykens, M. Signoretto, & A. Argyriou (Eds.), Regularization, Optimization, Kernels, and Support Vector Machines. New York, NY: CRC Press.

[5] Grasedyck, L., Kressner, D., & Tobler, C. (2013). A literature survey of low-rank tensor approximation techniques. GAMM-Mitt., 36, 53-78.

[6] Hackbusch, W. (2015). Hierarchical Matrices: Algorithms and Analysis. Berlin: Springer.

[7] Jolliffe, I.T., & Cadima, J. (2016). Principal component analysis: a review and recent developments. Philos. Trans. A, 374.

[8] Kolda, T.G., & Bader, B.W. (2009). Tensor Decompositions and Applications. SIAM Rev., 51, 455-500.

[9] Vandebril, R., Van Barel, M., & Mastronardi, N. (2008). Matrix Computations and Semiseparable Matrices. Baltimore, MD: The Johns Hopkins University Press.

[10] Woodruff, D.P. (2014). Sketching as a Tool for Numerical Linear Algebra. Found. Trends Theor. Comput. Sci., 10, 1-157.

Jennifer Pestana is a lecturer in mathematics at the University of Strathclyde. She is the current secretary of the SIAM Activity Group on Linear Algebra.
