Nearly three centuries after Leonhard Euler’s discovery of the principal axes of quadratic forms, eigenvalues and eigenvectors are center stage in the study of matrices and linear operators. Arguably, no other equation has come so close to describing our physical world as succinctly as the eigenvalue equation. The spiritual powers seem to concur, as the Dirac equation (a relativistic form of the Schrödinger equation) is the only equation engraved on the floor of Westminster Abbey.
Eigenvalue equations model physical systems at subatomic, molecular, and human scales. As tools, eigenvalues and the related singular value decomposition (SVD) help us analyze stability and partition graphs, obtain low-rank approximations, and discover information in data sciences. Many of these applications give rise to large, sparse matrices that are often Hermitian, and for which we must compute a small part of the spectrum. Iterative methods are the computational tool for this calculation.
Hermitian eigenproblems have well-conditioned, real eigenvalues that enable the development of many efficient methods. In turn, practitioners have responded by demanding more eigenvalues of ever-larger matrices. Today, it is not uncommon to see eigenproblems of size greater than one billion.
With larger matrix sizes and denser spectra, the Lanczos method, which is theoretically optimal over all polynomial methods in terms of the number of matrix-vector multiplications, becomes computationally less attractive than other restarted methods. ARPACK, a remarkable theoretical and software development of the 1990s, is a widely used, restarted eigensolver with minimal tuning requirements. It is present in virtually all problem-solving environments (PSEs) for scientific computing. However, the problems’ increasing difficulty requires eigensolvers with advanced features, such as preconditioning and inner-outer methods, that must also perform well on current computer architectures. Methods that offer these key features are not strictly Krylov, and often require expert tuning to achieve their potential.
PRIMME, pronounced “prime,” stands for PReconditioned Iterative MultiMethod Eigensolver and was first released in 2005 as a C software package for finding a few selected eigenpairs of large Hermitian matrices. It was developed with the goal of bringing state-of-the-art preconditioned eigenvalue methods from “bleeding edge” to production with the best possible robustness and efficiency, while simultaneously offering a highly usable interface that requires minimal or no tuning. Over the last few years, PRIMME’s functionality has grown significantly, with the software now supporting the computation of singular value triplets (the first such package to support preconditioning), more efficient algorithms, high performance computing (HPC) improvements, a C99 code base, and interfaces to a host of modern problem-solving environments.
Figure 1. Three examples of PRIMME’s use in Python, R, and MATLAB/Octave with only basic parameter settings.
PRIMME differs from other eigenvalue libraries in several crucial ways. First, it focuses only on Hermitian eigensolvers, where algorithmic expertise can translate to highly optimized code. Second, it implements a parametrized inner-outer method that, with appropriate settings, can reproduce entire families of methods, including near-optimally restarted versions of generalized Davidson and Jacobi-Davidson. The parametrization also controls many algorithmic components, such as blocking, locking, projections, and the extraction method, all of which can be combined to provide a continuum between traditional methods. This is one of the most complete sets of algorithmic features available in today’s eigensolvers.
Although expert users may cherish PRIMME’s power and extended control, we have focused much of our efforts on creating a simple yet expressive interface that requires no parameter setting to provide almost the entire span of benefits to non-expert users. Depending on problem size, the number of eigenvalues needed, and the tolerance required, all parameters receive default values based on theoretical arguments and/or our experience. If the user provides additional parameters, the rest of the defaults are set accordingly. In addition, runtime measurements allow the method to adapt to the given problem. The resulting default dynamic method runs consistently within five to ten percent of the time of the optimal choice.
PRIMME carries a 3-clause BSD license for integration into a variety of open source and commercial platforms. It has a minimal dependency on the widely-available BLAS and LAPACK libraries through wrappers that could be reassigned. The interface uses a simple C struct and function pointers, rather than the object-oriented approaches of other libraries that depend on particular ecosystems, such as PETSc or Trilinos. Together with the lightweight dependencies, this helps reduce the amount of effort required to port the full interface of PRIMME to other applications.
An increasing number of scientific and engineering computations are currently performed in PSEs. Such PSEs use optimized numerical libraries and offer multithreading and even distributed parallelism, blurring the line with traditional HPC. Therefore, besides the C and Fortran interfaces of PRIMME, we have recently developed interfaces for both eigenvalue and singular value functionality in MATLAB/Octave, Python (compatible with SciPy), and R (see examples of usage in Figures 1 and 2). We are also aware of the availability of an independent Julia interface. These interfaces, which can be viewed as extensions of the eigs() and svds() routines in MATLAB, have generated substantial interest in the package and a wider user base; both factors are important for the code’s sustainability on new architectures.
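To illustrate the SciPy-compatible call pattern that these interfaces follow, the minimal sketch below computes the five smallest eigenvalues of a 1-D discrete Laplacian with SciPy’s own eigsh. Assuming the PRIMME Python bindings keep the same signature (e.g., a hypothetical drop-in primme.eigsh), only the module name would change.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# 1-D discrete Laplacian: tridiagonal matrix with 2 on the diagonal, -1 off it
n = 200
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format='csr')

# Five smallest eigenvalues ('SA' = smallest algebraic), in the spirit of Figure 1
evals, evecs = eigsh(A, k=5, which='SA')

# Analytic eigenvalues of this Laplacian: 2 - 2*cos(j*pi/(n+1)), j = 1..n
exact = 2 - 2 * np.cos(np.pi * np.arange(1, 6) / (n + 1))
print(np.sort(evals))
```

The known closed-form spectrum of the Laplacian provides an easy correctness check for the computed values.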
Figure 2. Various options for computing the 20 smallest singular values of sls, the second-largest matrix (by number of nonzero elements) from a least squares problem in the University of Florida Sparse Matrix Collection. Runtimes are reported on a machine with an Intel® Core™ i7-6700K CPU and 16 GiB of physical memory. We could not run MATLAB’s svds because more than 16 GiB is required to compute the sparse QR factorization, which MATLAB uses for inverse iteration on R⁻¹.
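The singular value computations in Figure 2 follow the svds() call pattern that the PRIMME interfaces extend. As a small, self-contained sketch (using SciPy’s svds on a synthetic matrix with known singular values, not the sls matrix from the figure), requesting the smallest singular values looks like this:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import svds

# Tall sparse test matrix with known singular values: A^T A = diag(1^2, ..., 60^2)
m, n = 100, 60
A = sp.diags(np.arange(1.0, n + 1), 0, shape=(m, n), format='csr')

# Request the three smallest singular values, analogous to the
# smallest-singular-value computations reported in Figure 2
u, s, vt = svds(A, k=3, which='SM')
print(np.sort(s))
```

Because the test matrix is diagonal by construction, the three smallest singular values are exactly 1, 2, and 3, which makes the result easy to verify.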
PRIMME is also based on performance-aware software practices that make state-of-the-art theoretical advancements widely accessible. Of course, this is not the end of the road. Support for the generalized eigenvalue problem and algorithmic improvements for the SVD are under development, and we are experimenting with polynomial filtering options, with an eye to exascale. Improved kernels (such as block orthogonalization and blocked inner solvers) and a graphics processing unit (GPU)-enabled version are also in the works. With PRIMME’s multiplatform availability, we welcome new users and would love to hear about their use of PRIMME in exciting applications.
Download PRIMME here. Documentation is available here.
Randall J. LeVeque (firstname.lastname@example.org) of the University of Washington, Seattle, is the editor of the Software and Programming column.