SIAM News Blog

The Perils of Beautiful Mathematics

By James Case

Lost in Math: How Beauty Leads Physics Astray. By Sabine Hossenfelder. Basic Books, New York, NY, June 2018. 304 pages, $30.00.

Sabine Hossenfelder is inclined to blame the dearth of meaningful progress in theoretical physics since the 2012 detection of the Higgs boson on an ill-considered quest for mathematical beauty. A theoretical physicist, Hossenfelder acknowledges the aesthetic appeal of the known laws of physics in Lost in Math: How Beauty Leads Physics Astray, but doubts that beauty alone ever led to their discovery or is likely to inspire further innovations.

What seems to discourage her most is the failure of CERN’s Large Hadron Collider (LHC) to detect even a few of the new fundamental particles predicted by the “supersymmetry” theory — “susy” for short. Post-World War II physicists, led by Murray Gell-Mann and Richard Feynman, assembled what has come to be known as the Standard Model of the subatomic world. Comprising 25 presumably fundamental particles from which all other particles may be constructed, the model explains virtually every known subatomic interaction in terms of “gauge symmetries” — symmetries of the Lagrange equations governing the quantum fields of three of nature’s four fundamental forces. Only gravity remains unaccounted for by the Standard Model.

Susy postulates the existence of a “partner” for each of the 25 particles in the Standard Model, and perhaps a few others. No such partners were found at the Large Electron-Positron Collider (LEP), which ran until 2000, or at the Tevatron, which reached higher energies than the LEP and ran until 2011. Even the powerful LHC, which reuses the LEP’s tunnel and has been running off and on since 2008, has failed to divulge any evidence of the elusive susy partners. The simplest explanation is that the unseen partners are far more massive than expected, requiring even higher-energy colliders for detection.

Unsurprisingly, particle physicists are lobbying for such colliders. Some have proposed a Chinese Circular Collider (CCC) that would reach collision energies approaching 100 trillion electron volts (100 TeV). The Japanese have expressed interest in building an almost equally powerful International Linear Collider, while CERN has plans for a super-LHC with a circumference of 100 kilometers that would reach energies comparable to those expected of the CCC. But many physicists had anticipated the discovery of at least a few susy partners at collision energies as low as 2 TeV, within reach of the LEP, Tevatron, and original LHC incarnation. Who is to say, Hossenfelder asks, that more powerful colliders will succeed where others have failed? Where do leaders in the field stand on the matter? Are they deliberately misleading their governments about the prospects of increasingly costly experiments?

String theory is another source of distress to Hossenfelder. She points out that the field has yet to generate a single testable hypothesis after 30 years of development. Worse still, it has spawned a willingness in some circles to modify—if not abandon—the scientific method itself. Hossenfelder references Austrian philosopher Richard Dawid’s recommendation to amend the scientific method to allow for the evaluation of scientific hypotheses on purely theoretical grounds. In his book, String Theory and the Scientific Method, Dawid specifically cites three non-empirical arguments already in use by string theorists: (i) the absence of alternative explanations, (ii) the use of previously successful mathematics, and (iii) the discovery of unexpected connections. According to Hossenfelder, string theorists welcome such philosophical support while most other physicists refrain from doing so. What, she wonders, will become of everyone if climate scientists rely on non-empirical criteria to evaluate their models?

In part because of the high cost of field experiments, physicists have developed criteria to identify the proposed theories most likely to survive empirical testing. The most obvious, of course, is simplicity. A simple theory is always preferable to a complicated one that explains the same observations. So is one that extends an established theory, since it automatically explains the same observations and more. But the physics community has gone further, developing a “theory of theories” situated in something called “theory space.”

To introduce this idea, Hossenfelder points out that theoretical physics is an amalgam of weakly related theories operating on different scales. Small-scale (high-resolution) physical theories tend to imply larger-scale (lower-resolution) physical theories. For instance, Newton’s laws of motion—developed at the level of an apple falling from a tree—imply Kepler’s theory of planetary motion. Likewise, atomic-scale quantum mechanics suggests a theory of large-scale chemical reactions and another of fingernail-size computer chips. And so on.

Hossenfelder likens theory space to a box (see Figure 1). Each point in the box represents a different theory, and curves depict chains of theories related by implication, with high-resolution theories implying low-resolution ones. The totality of such curves, presumably capable of branching and/or merging, is called the “flow” of theories.

Figure 1. Sabine Hossenfelder depicts theory space as a box where chains of related theories form curves, with high-resolution theories implying low-resolution ones. Image courtesy of Basic Books.

If the curves emanating from different versions of a particular high-resolution theory appear to converge on a low-resolution theory of interest—marked by \(X\) in Figure 1—the latter is termed “natural,” since all versions of the high-resolution theory yield essentially the same low-resolution conclusions. Such conclusions “follow naturally” from the assumptions. But if the curves diverge and only a few pass near the low-resolution theory of interest, the latter is “fine-tuned” because it follows from only a handful of versions of the high-resolution theory. These must be adjusted just right to suggest the low-resolution theory. In this case, one must know every detail of the high-resolution theory with precision to justify confidence in the low-resolution one.
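Fine-tuning can be made concrete with a toy numerical sketch (my own illustration, not an example from the book, and with all numbers invented for the purpose): a low-resolution observable that emerges as the difference of two large high-resolution parameters is exquisitely sensitive to those parameters, whereas a "natural" observable of the same order as its inputs is not.

```python
import random

def observable(a, b):
    """A 'low-resolution' prediction computed from 'high-resolution' inputs."""
    return a - b

# A "natural" observable: comparable in size to the inputs it comes from.
natural = observable(3.0, 1.0)        # 2.0 -- same order as a and b

# A "fine-tuned" observable: a tiny residue of two huge, nearly equal inputs.
a, b = 1.0e15, 1.0e15 - 1.0
fine_tuned = observable(a, b)         # 1.0, but only by delicate cancellation

def perturbed(f, a, b, eps=1e-6):
    """Re-evaluate f after jiggling each input by about one part in a million."""
    return f(a * (1 + eps * random.uniform(-1, 1)),
             b * (1 + eps * random.uniform(-1, 1)))

random.seed(0)
print(perturbed(observable, 3.0, 1.0))  # stays close to 2.0
print(perturbed(observable, a, b))      # lands nowhere near 1.0
```

In Hossenfelder's language, the first observable "follows naturally": every nearby version of the high-resolution theory yields essentially the same low-resolution conclusion. The second requires the inputs to be adjusted just right, so confidence in it demands knowing the high-resolution parameters to one part in a quadrillion.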

The greater part of Lost in Math consists of interviews with leading physicists regarding their opinions on significant issues facing the discipline. Why do they find the Standard Model unsatisfactory? Is susy the only viable alternative? Why is naturalness beautiful but fine-tunedness unattractive? Will the quest for beauty produce something better? Is the long-sought “theory of everything” within reach?

Hossenfelder is a skilled interviewer, with a talent for drawing her subjects out on topics of mutual interest and an admirable distaste for trivial gossip. She is also humorous at times. Better still, she seems well-schooled in the subject’s history, including the disputes that have disrupted it over the years. Hossenfelder’s take on these is refreshing, as it rebuts the prevailing view whereby “progress in the sciences is made at the funerals of scientists.”

In hindsight, many historic scientific disputes—such as those surrounding heliocentrism or the reality of atoms and molecules—seem foolishly one-sided. Yet, says Hossenfelder, this was not always the case. In almost every instance, good arguments seemed to exist on both sides of the disputed issue for many years. In time, the preponderance of evidence came to rest on one side or the other. But until scientists gathered decisive proof, the eventual outcome remained unpredictable.

The argument surrounding heliocentrism is a prime example. Copernicus’ contemporaries found the model hard to accept because Earth’s annual journey around the sun should make the fixed stars appear to shift in the sky as Earth moves from its nearest approach to a given star to its farthest remove. The magnitude of this apparent shift, known as “parallax,” depends on the star’s distance from Earth: the farther the star, the smaller the shift.
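A back-of-the-envelope calculation shows just how hopeless the observation was before modern instruments. The sketch below uses the standard small-angle relation (not anything from the review): the parsec is defined so that a star one parsec away shows an annual parallax of one arcsecond, and the nearest stars lie farther than that.

```python
import math

# A star at distance d sees Earth's orbital radius (1 AU) subtend an angle
# p = atan(1 AU / d); by construction of the parsec, p in arcseconds is
# approximately 1 / (d in parsecs).
AU_M = 1.495978707e11        # astronomical unit, in metres
PARSEC_M = 3.0857e16         # parsec, in metres (approximate)

def parallax_arcsec(distance_parsec):
    """Annual parallax of a star at the given distance, in arcseconds."""
    theta_rad = math.atan(AU_M / (distance_parsec * PARSEC_M))
    return math.degrees(theta_rad) * 3600.0

# The nearest star system, Alpha Centauri, lies roughly 1.3 parsecs away,
# so even its parallax is well under one arcsecond.
p = parallax_arcsec(1.3)
print(f"{p:.2f} arcsec")

# Naked-eye angular resolution is on the order of an arcminute (60 arcsec),
# so pre-telescopic astronomers had no hope of detecting the shift.
print(p < 60)
```

Even the best pre-telescopic measurements, good to perhaps an arcminute, fell short of the required precision by roughly two orders of magnitude.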

The stars do indeed change position slightly over the course of a year. But because astronomers could not detect such minuscule changes until the 19th century, generations of observers had to conclude that either Earth remained stationary or the fixed stars were exceedingly far away. Moreover, since a beam of light from a distant star that passes through a circular aperture (such as an eye or a telescope) “smears out” and appears magnified, those distant stars seemed gigantic in comparison with other celestial bodies, including the sun. Scientists did not understand this magnification until the 19th century either, so earlier generations of astronomers had to either agree with Ptolemy that the stars remain fixed in the “celestial sphere” or conclude that they were unimaginably large and distant. How surprising is it that many found the more familiar teaching both simpler and easier to accept? After all, simplicity has long been regarded as a reliable indicator of beauty.

Hossenfelder quotes Paul Krugman to the effect that “The economics profession went astray because economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth” [1]. Her interviews are meant to discover the extent to which physicists are in danger of making the same mistake. After failing to identify a consensus beyond “time will tell” among her impressive roster of respondents, she elects to close on a positive note. “The next breakthrough in physics will occur in this century,” Hossenfelder writes. “It will be beautiful.”

[1] Krugman, P. (2009). How Did Economists Get It So Wrong? The New York Times Magazine.

James Case writes from Baltimore, Maryland.
