Among the 23 new fellows named by the MacArthur Foundation in 2012 is Daniel Spielman of Yale University. In the article that follows, Shang-Hua Teng of the University of Southern California draws on his longtime collaboration with Spielman to give readers of SIAM News a glimpse of some of the problems Spielman has taken on to date in his wide-ranging, highly interdisciplinary work in the theory of computing.
Daniel Spielman of Yale University has been named a 2012 MacArthur Foundation Fellow.
The 2012 group included a second SIAM member—Maria Chudnovsky of Columbia University, who has joint appointments in the Departments of Industrial Engineering and Operations Research and Mathematics and works in graph theory and combinatorial optimization. A focus of her research, according to the MacArthur citation, is on a deepening of “the connections between graph theory and other major branches of mathematics, such as linear programming, geometry, and complexity theory.”
MacArthur fellows, who receive no-strings $500,000 awards, are selected on the basis of “extraordinary originality and dedication in their creative pursuits,” along with “a marked capacity for self-direction.”
Dan Spielman is a visionary and a pioneer. A prescient scholar, he has built an exceptional record in discovering fundamental research topics and areas, as well as ways to answer outstanding questions that have challenged researchers for years. He is a gifted problem solver, resourceful in making conjectures that set up promising leads. He is also a fabulous mathematical writer and educator, and his talent for making complicated concepts simple has helped give a general audience an appreciation for our field.
Dan’s work has fundamentally transformed nearly all aspects of the theory of computing—from communication to optimization, from complexity theory to algorithm design and analysis, from graph algorithms to numerical methods. In fact, many of his landmark achievements resulted from his extraordinary ability to connect seemingly unrelated disciplines, and therefore have simultaneously advanced multiple areas of computing.
As a Ph.D. student in the MIT mathematics department, for example, Dan began to use techniques from coding theory in designing the nearly linear-size holographic proofs sought by researchers in complexity theory. In an amazing turn of events, he not only successfully applied coding theory to complexity theory, but also discovered a breakthrough algorithm, using concepts from theoretical computer science and discrete mathematics, such as expanders and superconcentrators; this work fundamentally elevated the state of the art in coding theory. His papers “Expander Codes” (with Mike Sipser) and “Linear-Time Encodable and Decodable Error-Correcting Codes” initiated a new generation of error-correcting codes, such as erasure codes, that are better suited than classic codes to modern Internet applications. For a subsequent work, “Improved Low-Density Parity-Check Codes Using Irregular Graphs,” he and co-authors Michael Luby, Michael Mitzenmacher, and Amin Shokrollahi received the 2002 Information Theory Society Paper Award. This family of codes has been used in the Digital Video Broadcasting Standard.
Since 1995, when he completed his Ph.D. thesis—which won the ACM Doctoral Dissertation Award for contributions in complexity and information theory—Dan has gradually shifted his focus to algorithm design and analysis. I have been very fortunate to be one of his collaborators on this exciting journey. Significant accomplishments along the way include smoothed analysis, a nearly linear-time Laplacian solver, and maximum-flow approximation; most are thoughtfully covered in the four articles listed at the end of this article.
Expanding on those essays in the rest of this article, I focus on a piece of Dan’s earlier work and the critical role it played in his 15-year effort to close the gap between theory and practice in algorithm design and analysis.
A Bridge Between Numerical and Combinatorial Thinking
Back in 1995, Dan was drawn to work of Fiedler that established a connection between the connectivity of a graph and the second-smallest eigenvalue of its Laplacian matrix, and to work of Donath and Hoffman that used eigenvectors for partitioning graphs. At the time, spectral graph partitioning was widely popular: it was used throughout scientific computing and had been demonstrated experimentally to work extremely well in a wide range of applications, including image segmentation, VLSI design, sparse matrix computation, combinatorial optimization, and parallel processing. However, the quality of the partitions that spectral methods produce on graphs from these important applications had eluded rigorous analysis.
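Fiedler’s connection can be seen in a few lines of code (an illustrative sketch of my own, not from the article): compute the eigenvector of the graph Laplacian belonging to the second-smallest eigenvalue—the Fiedler vector—and split the vertices by the signs of its entries. On a toy graph with an obvious two-cluster structure, this recovers the clusters:

```python
import numpy as np

# Build the Laplacian L = D - A of a small graph: two triangles
# joined by a single bridge edge (a natural two-cluster structure).
edges = [(0, 1), (1, 2), (0, 2),      # first triangle
         (3, 4), (4, 5), (3, 5),      # second triangle
         (2, 3)]                      # bridge between them
n = 6
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
L = np.diag(A.sum(axis=1)) - A

# eigh returns eigenvalues of the symmetric matrix L in ascending
# order, so column 1 of the eigenvector matrix is the Fiedler vector.
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]

# Partition the vertices by the sign of their Fiedler-vector entry;
# here this separates the two triangles.
part = fiedler > 0
print(sorted(np.flatnonzero(part)), sorted(np.flatnonzero(~part)))
```

Real spectral partitioners refine this sign cut—for example, by sweeping over thresholds on the Fiedler vector—but the sign split already conveys the idea.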
Intrigued by this algorithm at the intersection of graph theory and matrix theory, Dan dedicated himself to the development of “numerical thinking” for graph-theoretical problems and “combinatorial thinking” for numerical problems. Indeed, he was fascinated by this algorithm: It was an elegant method based on linear algebra and physical intuitions; it used a continuous method to solve a discrete problem; and it was a successful practical heuristic whose theoretical performance analysis was still lacking.
Applying differential geometry and Brouwer’s fixed-point theorem, Dan’s work established a novel connection between graph eigenvalues and graph embeddings in Euclidean space. Using this connection, he proved a spectral theorem for planar graphs that, together with the famous discrete Cheeger inequality, showed that spectral methods can find partitions as good as those achievable with the classic Lipton–Tarjan planar separator theorem. He also proved a similar statement for finite-element meshes, thus providing the first mathematical analysis of the amazing practical performance of spectral partitioning methods in domains that include scientific computing, image processing, and VLSI design.
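For readers unfamiliar with it, one standard form of the discrete Cheeger inequality (stated here for the normalized Laplacian; the article itself does not fix a normalization) bounds the conductance $\phi(G)$ of a graph on both sides by its second-smallest normalized Laplacian eigenvalue $\lambda_2$:

```latex
\[
\frac{\lambda_2}{2} \;\le\; \phi(G) \;\le\; \sqrt{2\lambda_2},
\qquad
\phi(G) \;=\; \min_{S \,:\, \mathrm{vol}(S)\le \mathrm{vol}(V)/2}
\frac{|E(S,\bar{S})|}{\mathrm{vol}(S)},
\]
```

where $\mathrm{vol}(S)$ is the sum of the degrees of the vertices in $S$. A small $\lambda_2$ thus certifies a sparse cut, which is what links eigenvalue bounds for planar graphs to separator guarantees.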
Like many creative thinkers, Dan celebrates his major breakthroughs as the beginning of new research paths, rather than as the end of a pursuit. His work not only provided the avidly sought answer to the question of why the spectral partitioning method works for planar-like graphs and finite-element meshes, but also served as a launch pad. This research instigated a long trek that achieved three landmark results connecting numerical thinking and combinatorial thinking:
- Smoothed analysis of algorithms and its application to the simplex algorithm for linear programming and Gaussian elimination for linear systems
- Spectral graph sparsification and a nearly linear-time algorithm for solving Laplacian and symmetric diagonally dominant linear systems
- The electrical flow framework for maximum flow approximation.
Each of these results built on Dan’s insightful observations, his perseverance, and his foresight in mapping the direction of the explorations. He is never willing to let opportunities, however small, pass by without understanding what they could inspire. As an example, intrigued by the classic Steinitz connection between three-connected planar graphs and the convex polytopes in three dimensions, he painstakingly advanced the aforementioned research in spectral analysis of planar graphs in several directions—to the mathematical study of high-dimensional polytopes and the smoothed analysis of the performance of the simplex algorithm for linear programming, to the development of spectral sparsification and the Laplacian solver, and then to an electrical-flow-based algorithm for approximating the maximum flow in a graph. The essays listed below highlight these accomplishments.
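The electrical-flow idea mentioned above can be illustrated with a small sketch (my own illustrative example, not from the article): treat each edge as a unit resistor, inject one unit of current at a source and extract it at a sink, and obtain edge flows from the vertex potentials that solve a Laplacian linear system. The fast algorithms rely on nearly linear-time Laplacian solvers; this toy version uses a dense pseudo-inverse on a 4-cycle:

```python
import numpy as np

# Unit-resistance electrical flow on a 4-cycle (vertices 0-1-2-3-0).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian L = D - A

# Inject one unit of current at source 0, extract it at sink 2.
b = np.zeros(n)
b[0], b[2] = 1.0, -1.0

# Vertex potentials solve L x = b; L is singular (constant vectors
# are in its kernel), so take the pseudo-inverse solution.
x = np.linalg.pinv(L) @ b

# The electrical flow on edge (u, v) is the potential difference.
flow = {(u, v): x[u] - x[v] for u, v in edges}
print(flow)
```

By symmetry, the unit of current splits evenly over the two paths from 0 to 2, so each edge carries flow of magnitude 1/2. The maximum-flow approximation algorithms repeatedly compute such flows with adjusted edge resistances, which is where the Laplacian solver enters.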
Congratulations to Dan! We look forward to watching as his lively mind, uncanny talent in selecting research directions, ability to break interdisciplinary barriers, and courage to work on challenging problems lead him to the next big thing.
For Further Reading