SIAM News Blog

A Comprehensive Exploration of the Path to Modern Computing

By Paul Messina

A New History of Modern Computing. By Thomas Haigh and Paul E. Ceruzzi. MIT Press, Cambridge, MA, September 2021. 544 pages, $40.00.

The 2021 major update to Paul Ceruzzi’s A History of Modern Computing—originally published by MIT Press in 1998 and expanded upon in 2003—provides a comprehensive description of almost every facet of computing. In A New History of Modern Computing, Ceruzzi and Thomas Haigh use an approachable and engaging narrative style with in-line definitions of words and concepts in layman’s terms. The book is a captivating and enjoyable read, though 41 pages of notes and a 28-page bibliography certainly make it a scholarly tome.

The account begins with the commencement of ENIAC’s (Electronic Numerical Integrator and Computer) operation at the University of Pennsylvania in 1945, though there are occasional references to previous developments. The authors chose ENIAC as the starting point because it is typically considered the “first electronic, general purpose, programmable computer;” they base their definition of “computer” upon these three attributes. Although the text focuses on computing practices and technology rather than research, it does amply describe numerous research results.

One feature that stands out is Haigh and Ceruzzi’s inclusion of the etymology for words like “subroutine,” “assembler,” “stored program,” and “operating system.” For instance, the term “memory” for a device that stores information originated from John von Neumann’s use of biological language to describe computer systems. He broke the structure into “organs” and referred to switching circuits as “neurons.” “Memory” is the only biological term that stuck. The authors take care to identify the origins of technologies, protocols, and products that were based on concepts or prototypes from previous decades. A striking instance of such forward-looking vision is Vannevar Bush’s 1945 article in The Atlantic, which predicted (to some extent) many technologies that were ultimately invented long after its publication, including hypertext, personal computers, the Internet, speech recognition, and online encyclopedias such as Wikipedia [1]. Bush’s vision in the article is awe-inspiring.

Given my background in the field of computing, I was pleasantly surprised to learn new facts when reading A New History of Modern Computing. For instance, I discovered that small- and medium-scale computers sold more than a thousand units in the 1950s; small-scale computers cost as little as $30,000, while large-scale units ranged from $500,000 to $1,000,000.

I was also fascinated to learn that France deployed an experimental packet-switched network called Cyclades in 1974 that routed packets through host computers — an approach that the Internet later adopted. In the early 1980s, France Télécom developed Minitel: one of the first implementations of an end-user information system. Cheap Minitel terminals—which had small monochrome screens and a keyboard—connected over telephone lines and were widely adopted, with nearly 6.5 million units in use by 1993. The text explains that “[d]uring the late 1980s, more French people were using online services for banking, shopping, news, and email than the rest of the world put together.”

In the 1950s, the Semi-Automatic Ground Environment (SAGE) network for air defense against Soviet bomber aircraft emerged. It employed computers to process information from radar, ships, aircraft, telephone links, and radio. According to the authors, “SAGE introduced more fundamentally new features to computing than any other project of its era, including networking computers and sensors and the development of interactive computer graphics.” In other words, it pioneered what is now called edge computing; ironically, a National Science Foundation-funded project about edge computing is also called SAGE: Cyberinfrastructure for AI at the Edge.

Towards the beginning of the text, Haigh and Ceruzzi state that “[t]he history of computing is the story of repeated redefinitions of the nature of the computer itself, as it opened new markets, new applications, and new places in the social order.” Since advances in technologies and applications often result from the evolution of several factors, the authors made the interesting and effective choice to present the book’s 14 chapters based on topic rather than in global chronological order; every chapter then contains its own chronology according to types of computers and their uses. Consequently, each chapter’s chronology overlaps with that of several others. The authors frequently note the dependence of new capabilities or computer uses on prior advances. For example, they expand upon the theme of graphical tools with references to video games and interactivity.

Chapter one, “Inventing the Computer,” carefully dissects the features of the so-called von Neumann architecture (or stored program architecture) versus the ones created by J. Presper Eckert, John Mauchly, and the team that designed the ENIAC and EDVAC (Electronic Discrete Variable Automatic Computer). The text clearly documents the impact of the EDVAC and the IAS family of computers. I was previously unaware that the IAS family included 18 hand-built computers and 29 production line models from the U.S., Sweden, Israel, Australia, Japan, and Denmark [2].

Chapter one also notes that Eckert and Mauchly’s UNIVAC (Universal Automatic Computer) turned a scientific instrument into a commercial business tool that quickly saw many non-scientific applications, such as linear programming, prediction of the 1952 election results, payroll, and logistics. General Electric (GE) purchased the first UNIVAC model in 1954 for tasks that were previously handled by punch card machines and stated that speed of computing was only of tertiary importance. This sentiment was unsurprising given that input/output was a bottleneck for the UNIVAC, which initially printed output via a 10-character-per-second typewriter. But even in the 1950s, GE sought to use computers for advanced tasks like long-range planning, logistics for inventory management and shipping, and market forecasting based on demographic data. Product design applications soon followed.

Despite its title, “The Computer Becomes a Scientific Supertool,” chapter two only covers supercomputer systems through the Cray-1 — which was superseded by the Cray X-MP in 1982. The end of the chapter abandons the topic of supercomputers with the sentiment that “[b]y then, scientific computer users were switching to interactive operating systems and the new modes of computing.” As a result, A New History of Modern Computing merely addresses the first three decades of scientific supercomputing. I found this constraint disappointing as a reader and would have liked to see as detailed a history of supercomputing as there is of computer gaming.

On a positive note, the authors frequently credit supercomputers with being the first machines to develop or require architectural advances that were eventually incorporated into mainframes, minicomputers, and even smartphones. Yet there is almost no coverage of parallel computer architectures and programming issues, even though all high-performance computing (HPC) systems have such architectures and several large parallel computers existed as early as the 1970s and 80s. Chapter 13—“The Computer Becomes a Network”—does mention Oak Ridge National Laboratory’s Titan and Summit systems and their highly parallel architectures, including Summit’s IBM POWER9 processors and both machines’ NVIDIA graphics processing units (which themselves have high parallelism). However, the book neglects to acknowledge the challenges of utilizing such systems or producing relevant algorithms and software.

Moving on, chapter 10—“The PC Becomes a Minicomputer”—includes a fascinating history of the effect of personal computer (PC) case standards and software availability on the evolution of processors and motherboards. It also discusses the use of scientific computer architecture features in microprocessors. The digital camera section in chapter 11, “The Computer Becomes a Universal Media Device,” offers a nice overview of hardware for images and architectural tradeoffs. Equally interesting are the descriptions of digital media, storage technologies, and the transition to digital music. This chapter also features ample detail on games, game hardware, and game development companies.

Chapter 12, “The Computer Becomes a Publishing Platform,” includes insightful explanations of the successes and failures of certain technologies or products. For example, Tim Berners-Lee and Robert Cailliau developed three unassuming yet effective standards that defined the web: the URL, HTTP, and HTML. All three standards were based on existing technologies and provide a compelling example of the value of design simplicity. Finally, the closing sentence of chapter 13 neatly summarizes that chapter’s title, “The Computer Becomes a Network.” By 2020, the advent of online applications and cloud storage meant that “[t]he PC has become a network computer and the network has finally become the computer.”
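The simplicity of those three standards can be sketched in a few lines of Python; this is purely an illustrative aside (the URL and HTML fragment below are hypothetical examples, not drawn from the book). A URL names a resource, HTTP is a plain-text protocol for requesting it, and HTML marks up the returned document with hyperlinks to further URLs:

```python
from urllib.parse import urlparse

# A URL decomposes into a scheme, a host, and a path to a resource.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)
print(parts.scheme)  # "http"
print(parts.netloc)  # "info.cern.ch"
print(parts.path)    # "/hypertext/WWW/TheProject.html"

# HTTP is a plain-text request/response protocol; a minimal GET request
# (HTTP/1.0 style) is just a few lines of ASCII sent over a socket.
request = f"GET {parts.path} HTTP/1.0\r\nHost: {parts.netloc}\r\n\r\n"
print(request)

# HTML marks up the returned document; the <a> tag's href attribute
# embeds another URL, which is what makes the documents a "web."
html = '<html><body><a href="http://example.org/">a link</a></body></html>'
```

Each layer is human-readable and built on prior technology (DNS host names, TCP connections, SGML-style markup), which is much of why the web spread so quickly.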

Two topics that I believe would have added value to A New History of Modern Computing are computational science and the international competition at the high end of supercomputing. The text appropriately contains many references to Turing Award winners, but it would have been nice to see several acknowledgments of Nobel Prizes that were largely based on computing — e.g., Max Ferdinand Perutz and John Cowdery Kendrew’s 1962 Nobel Prize in Chemistry, which required the University of Cambridge’s EDSAC (Electronic Delay Storage Automatic Calculator) to determine the structure of myoglobin. A history of HPC’s perceived importance to nations around the world would also have been an interesting addition. This idea was widely publicized when Japan’s Earth Simulator supercomputer became operational and was ranked #1 in the June 2002 Top500 list — it was five times faster than the #2 computer: ASCI White at Lawrence Livermore National Laboratory. Though this was not the first time that a computer from outside the U.S. had placed first, the Earth Simulator’s ranking spurred U.S. Congressional hearings and led to the enactment of the High-End Computing Revitalization Act of 2004, which authorized the Secretary of Energy to carry out a research and development program to advance high-end computing.

Despite that Act, Japanese and Chinese computers were ranked first at various points in the following years. These rankings motivated the 2015 U.S. executive order for a National Strategic Computing Initiative, which established a whole-of-government effort to create a cohesive, multi-agency strategic vision and federal investment strategy to maximize HPC’s benefits in the U.S. The European Commission has also funded numerous projects on various aspects of computing to ultimately create a state-of-the-art European chip ecosystem and unify the European Union’s world-class research, design, and testing capacities.

A New History of Modern Computing closes with a somewhat brooding epilogue—“A Tesla in the Valley”—which is not surprising given that it was published in 2021: a year that was fraught with the COVID-19 pandemic and revelations of the detriments of social media and its role in the so-called post-truth society. In summary, this text is well worth reading in its entirety. Interested researchers will find that its detailed index provides a handy reference for almost any topic in computing.

[1] Bush, V. (1945, July). As we may think. The Atlantic.
[2] Deane, J. (2003). The IAS Computer Family Scrapbook. Australian Computer Museum Society Inc.

Paul Messina is a computational scientist who is retired from the California Institute of Technology and Argonne National Laboratory. 