
Obituaries: Michael J.D. Powell

By Arieh Iserles

Michael J.D. Powell, 1936-2015. Photo by Lin Wang, courtesy of the Chinese Academy of Sciences.
Michael James David Powell, who passed away on April 19, was one of the giants who established numerical analysis as a major discipline and created its current intellectual landscape. From his life’s work have emerged both the mathematical foundations and the practical algorithms of nonlinear optimization, as well as decisive contributions to approximation theory.

Mike Powell was born in London and went to school at Eastbourne. In 1957, having completed his National Service, he went to Cambridge to read mathematics. The standard duration of the Mathematical Tripos at Cambridge is three years, but in those more flexible times Mike completed it in just two, followed by a one-year diploma in Numerical Analysis and Computing. Then, instead of staying in academia and working toward a doctorate, he joined the Atomic Energy Research Establishment (AERE) at Harwell, where he stayed for seventeen years.

In his first few years at AERE Harwell, Mike worked on questions in computational chemistry. Then, in 1962, came his first paper on optimization, a subject he would make his own. Historically, there were two “master methods” for optimization without constraints: firstly, the Newton algorithm, clearly impractical for large-scale computations with many variables because of the prohibitive cost of evaluating the Hessian matrix at each iteration and of the consequent linear algebra; secondly, the method of steepest descent, which iterates locally in a direction determined by the gradient and stands as the ultimate demonstration that locally optimal decisions can be disastrous globally.

In a 1959 paper, Bill Davidon proposed an algorithm that used an approximate Hessian, now called a “variable-metric method.” For Mike the paper was a revelation. In 1963, he and a younger colleague, Roger Fletcher, published an extremely influential paper on what is now known as the DFP algorithm, acknowledging Davidon’s pioneering contribution. This marked the start of a lifelong journey for Mike and, arguably, the beginning of modern optimization, and it was followed by extensive further research into many aspects of (mostly, but not always, unconstrained) optimization: from the convergence of the DFP and BFGS (Broyden-Fletcher-Goldfarb-Shanno) variable-metric methods, to trust-region methods, local line searches, conjugate gradient methods for nonlinear problems, augmented Lagrangian functions, sequential quadratic programming, derivative-free methods, and so forth. Mike was engaged in this work until the last week of his life.
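For the reader unfamiliar with the idea, here is a standard statement of the variable-metric update in modern notation (a sketch for illustration, not drawn from the 1963 paper itself). With \(s_k = x_{k+1} - x_k\) and \(y_k = \nabla f(x_{k+1}) - \nabla f(x_k)\), the DFP formula updates an approximation \(H_k\) to the inverse Hessian by

\[
H_{k+1} \;=\; H_k \;-\; \frac{H_k y_k y_k^{\top} H_k}{y_k^{\top} H_k y_k} \;+\; \frac{s_k s_k^{\top}}{y_k^{\top} s_k},
\]

so that curvature information is accumulated from gradient differences alone, without ever forming or factorising the true Hessian.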

Numerical analysts tend to divide into two classes: those who subject numerical algorithms to hard analysis and full mathematical treatment, yet regard practical programs as an afterthought, best left to others; and those who focus on software issues and the practicalities of implementation, while regarding analysis as an often unnecessary encumbrance – if it works, who needs a proof? Mike Powell was an exception. He firmly believed that hard analysis and beautifully written programs go hand in hand, and that his responsibility as a numerical analyst was both to produce deep and challenging mathematics (his proofs of the convergence of the DFP and BFGS algorithms for convex functions are a striking example of truly difficult, nonintuitive, and often counterintuitive rigorous mathematics) and to create (and freely share with the community) professionally written software of the highest quality.

The Harwell terms of engagement, “peaceful use of atomic energy,” allowed Mike a great deal of freedom to plough his own furrow, first and foremost in optimization, but also in approximation theory, and he was instrumental in setting up the Harwell library of numerical subroutines. Then, in 1976, he returned to Cambridge (receiving a Doctor of Science degree in 1979) as the John Humphrey Plummer Professor of Applied Numerical Analysis. 

This was a momentous change in many ways. At Harwell Mike spent all his time on research, surrounded by kindred souls – Roger Fletcher, Alan Curtis, John Reid, Iain Duff, and others. At Cambridge he was expected to undertake the numerous duties of a “proper” professor—teaching, supervision of research students, administration, committee work—which he often regarded as a drain on time best spent doing research. Still worse, while Cambridge has had a glorious tradition in numerical analysis, from Isaac Newton onwards, by the 1960s this tradition had essentially died out. Thus, Mike was expected to establish numerical analysis from scratch in the Department of Applied Mathematics and Theoretical Physics, in an atmosphere in which anything but fluid dynamics was often seen as an improper occupation for a true applied mathematician. It is fair to say that Mike was an outlier in a large department, in what was then a wasteland betwixt the pure and applied mathematics departments at Cambridge. Until his retirement in 2001, Mike led a small group – ultimately, just two “teaching officers” (Cambridgese for “faculty”) and a small cohort of research students, postdocs, and visitors. 

Mike’s interest in approximation theory started at Harwell, first in connection with least-squares calculations and \(\ell_1\) and \(\ell_\infty\) approximations, and then in his very influential work on splines. But what may be his most memorable and influential work in this area, on radial basis functions, was done at Cambridge. The spur was a beautiful paper of Charlie Micchelli proving that, regardless of dimension, the interpolation problem for radial basis functions is nonsingular under fairly broad conditions. This created the promise of an exceedingly powerful interpolation method for multivariate scattered data, but it also opened a host of questions about the quality of such approximation. These questions have been addressed – and in large measure answered – by Mike and his research students, thereby laying the groundwork for the many subsequent applications of radial basis functions, not least in the numerical solution of partial differential equations.
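To state the setting in standard notation (a sketch for illustration, not necessarily Mike’s own formulation): given scattered data \((x_j, f_j)\), \(j = 1, \dots, n\), in \(\mathbb{R}^d\), one seeks an interpolant

\[
s(x) \;=\; \sum_{j=1}^{n} \lambda_j\, \phi(\|x - x_j\|), \qquad s(x_i) = f_i, \quad i = 1, \dots, n,
\]

and Micchelli’s theorem guarantees, for radial functions such as the multiquadric \(\phi(r) = \sqrt{r^2 + c^2}\), that the interpolation matrix with entries \(\phi(\|x_i - x_j\|)\) is nonsingular for any set of distinct centres, whatever the dimension \(d\).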

Mike Powell was a unique character. He readily confessed to disliking administration, bureaucracy, paperwork, committees, and even teaching – anything that ate into valuable research time. Indeed, he retired early to focus more on his research (and on his golf handicap). Yet his sense of duty was such that, once unhappily compelled to spend time on any of these chores (and although a perfect English gentleman, Mike was never good at hiding his dislike or impatience), he discharged them with total commitment, in an exemplary fashion. In particular, his teaching (like his talks) was always crystal-clear and immaculately prepared: not a word, not a symbol out of place, everything logical and in the right sequence. 

This sense of duty and Mike’s total integrity made him a terrible academic politician: everybody knew that, when push came to shove, Mike Powell would support what he believed was right, so there was little point in horse-trading or exchanging favours with him. He did not believe that his role as an academic was to build an empire or demonstrate formal “academic leadership”: he led strictly by example, producing world-class research, educating his students well, and inspiring others.

Academic honours duly arrived. In 1982 Mike was awarded (jointly with Terry Rockafellar) the inaugural SIAM George Dantzig Prize, and a year later he was elected a Fellow of the Royal Society. He received, among other honours, both the Naylor and Senior Whitehead Prizes of the London Mathematical Society (becoming the only person ever to receive two senior LMS prizes), the IMA Gold Medal and its Catherine Richards Prize, foreign membership in the U.S. National Academy of Sciences, corresponding fellowship of the Australian Academy of Science, and a PhD honoris causa from the University of East Anglia.

On a personal note, Mike was a colleague, a neighbour, and a friend for 37 years. He was fiercely competitive, but also generous to a fault. With Caroline, he was a wonderful host. He was also a mentor for a young and inexperienced Junior Research Fellow, and a shining example thereafter. His standards were always high and demanding, for those around him but in particular for himself. His students were relatively few, but he trained them ever so well and pushed them to excel themselves. But he also genuinely cared about them and their lives; in return, they demonstrated fierce loyalty, as did his many friends worldwide. 

As numerical analysts, we take so much of this for granted because it is simply our reality: from variable-metric algorithms to methods for multivariate approximation, but also the very idea that a numerical algorithm is a creature with a double personality – a mathematical entity that invites rigorous analysis and a computational scheme that must be programmed and implemented with a similarly high level of cleverness. Mike Powell, in his life’s work and attitudes, demonstrated these twin motives of numerical analysis, and their underlying unity, at their very best. He will be missed.

Arieh Iserles is an emeritus professor in numerical analysis of differential equations at the University of Cambridge. 
