
Communication Networks and the Spread of Misinformation

By James Case

The Misinformation Age: How False Beliefs Spread. By Cailin O’Connor and James Owen Weatherall. Yale University Press, New Haven, CT, December 2018. 280 pages, $26.00.

Social scientists Cailin O’Connor and James Owen Weatherall are faculty members at the University of California, Irvine, with secondary appointments at the university’s Institute for Mathematical Behavioral Sciences. Their new book, The Misinformation Age: How False Beliefs Spread, explores the communication networks through which false and misleading information travels—often with lasting effects—at speeds that depend on the nature of the networks in question.

The introduction and first chapter present a series of anecdotes that concern particular pieces of (mis/dis)information, while the second and third chapters analyze the properties of specific communication networks with the aid of a few simple diagrams. Many of these networks connect scientists, because their connections are comparatively easy to document; scientific (mis/dis)belief seems to spread in much the same way as other forms of (mis/dis)belief. The fourth and final chapter of The Misinformation Age applies the methods of the middle chapters to the role of (mis)information in public life.

The authors’ lead anecdote concerns a tale that reached 14th-century Europe by way of an English knight named Sir John Mandeville. Upon his return from Asia Minor, he spoke of a tree that bore fruit containing tiny sheep. He claimed to have tasted the flesh of these “vegetable lambs” and found it “wondirfulle.” Confirmation soon arrived, and Sir John’s story began to circulate among “naturalists,” eventually appearing in scholarly books. Not until 1683 did King Charles XI of Sweden direct naturalist Engelbert Kaempfer to undertake the exhaustive search of Asia Minor that was necessary to establish that no such tree existed, or ever had. Belief in “the vegetable lambs of Tartary” overcame the doubts of skeptics to persist among scholars for more than three centuries.

A more recent fake news story appeared in September 2016 on a conservative website known as ETF News under the headline “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement.” While the number of readers who actually believed the story is unknown, it was liked or shared 960,000 times on Facebook between the day it was posted and the election. A skeptic would argue that because the Pope is a public figure, The New York Times, The Washington Post, The Wall Street Journal, and any number of other media outlets would have reported any such endorsement. However, this story was but one of many false reports. O’Connor and Weatherall calculate that the top 20 fake news stories in the three months before the 2016 U.S. presidential election were liked or shared a total of 8.7 million times on Facebook, while the top 20 genuine news stories during the same period garnered only 7.3 million likes or shares.

A third anecdote concerns the treatment of stomach ulcers, long believed to be caused by bacteria. This belief went largely unchallenged until 1954, when gastroenterologist E.D. Palmer biopsied the stomachs of more than 1,000 patients without finding evidence of bacteria. The obvious conclusion was that bacteria cannot survive in stomach acid and thus cannot cause ulcers. Instead, physicians believed that ulcers were caused by the acids themselves and could be cured by acid neutralization. In the years that followed, many ulcer patients were “successfully treated” with antacids, though their ulcers displayed a distressing tendency to recur.

Roughly 30 years after Palmer published his results, Australian researcher J. Robin Warren detected a new strain of bacteria in biopsies near the sites of stomach ulcers. His colleague Barry Marshall isolated the new strain, proving that bacteria can in fact dwell in the human stomach. In 2005, the duo received the Nobel Prize in Physiology or Medicine for convincing their fellow scientists that bacteria can and do cause stomach ulcers in humans.

O’Connor and Weatherall begin their analysis of communication networks by explaining the concept of a model. The models in their book are adaptations of one that economists Venkatesh Bala and Sanjeev Goyal introduced in 1998 [1]. In 2007, Kevin Zollman utilized this model to analyze the networks through which scientists interact [2]; the authors’ treatment closely mirrors his work.

Bala-Goyal models describe a collection of “agents” that are attempting to choose one of just two possible conclusions: A or B. They use information gathered by both themselves and others to make their choice. For instance, conclusion A might be that excess stomach acid causes ulcers, which one should treat with antacids; conclusion B might be that bacteria cause ulcers, which are better treated with antibiotics. Because the conclusions of interest assert the superiority of one action over another, there is no need to distinguish between “conclusion A (resp. B)” and “action A (resp. B).”

Over successive rounds of data gathering, the agents update their tentative conclusions in response to information from the latest round. They initially have little data to analyze and minimal confidence in their verdicts. But as time passes, the agents accumulate, share, and digest data, thus enabling increased confidence in their (still tentative) conclusions.

One can visualize such models as graphs, in which each node represents an agent or group of agents and each edge represents a channel of communication between two nodes. Every node is associated with a number between 0 and 1, called a “credence”; this represents a level of certainty that action B is superior to action A. For example, a node of credence 0.7 indicates that a particular agent is 70 percent certain that B is superior to A, while a credence smaller than 0.5 signifies that the agent in question favors the opposite conclusion. With luck, the “credence vector” will converge to a vector of 1s after multiple rounds of data gathering. If not, the process could fail to converge at all or end in a stalemate, with some agents clinging to each belief.
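The review leaves the likelihoods implicit, but in Zollman-style versions of the model the inferior action is typically a known 50-50 gamble, while the superior one succeeds slightly more often. Under that assumption, the following minimal Python sketch shows how a single credence gets revised; the edge EPS and the numbers in the example are illustrative, not values from the book.

```python
import math

EPS = 0.1  # assumed edge of the superior action over a fair coin

def binom_pmf(k, n, p):
    """Probability of exactly k successes in n independent trials."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def update_credence(credence, successes, trials):
    """Bayes' rule: revise the credence that B is superior after
    observing `successes` out of `trials` attempts at action B."""
    like_better = binom_pmf(successes, trials, 0.5 + EPS)
    like_worse = binom_pmf(successes, trials, 0.5 - EPS)
    numerator = credence * like_better
    return numerator / (numerator + (1 - credence) * like_worse)

# An agent who is 70 percent certain that B is superior watches a
# neighbor score 7 successes in 10 trials of action B.
print(update_credence(0.7, 7, 10))  # ~0.92: noticeably more confident
```

Repeated application of this rule over successive rounds of independent trials is what drives the credence vector toward 0 or 1.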

Figure 1. Three copies of a graph on six vertices. Image courtesy of Yale University Press.

Figure 1 displays three copies of a graph on six vertices. The fractions beside the nodes in 1a are the agents’ initial credences. Light nodes correspond to agents who plan to take action A because they expect A to outperform B, and dark nodes correspond to agents who plan to take action B. The numbers beside the nodes in 1b indicate how many times each agent’s chosen action succeeded in a series of 10 independent trials. Finally, the fractions beside the nodes in 1c represent the updated credences that are obtained by application of Bayes’ rule. These credences indicate that all but one agent are fairly certain that B is superior to A after a single round of data gathering. The sole agent in disagreement does not update because he is not in communication with any of the agents that chose action B. A second round of testing will likely produce unanimity.
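One full round of this process is easy to script with the update rule sketched above. In the hedged example below, the adjacency list, starting credences, and random seed are hypothetical stand-ins rather than the book’s exact figure values; the point is simply that evidence flows only along edges.

```python
import math
import random

random.seed(1)
EPS, TRIALS = 0.1, 10  # illustrative edge and trials per round

def update(credence, k, n):
    """Bayes' rule from the previous sketch."""
    lb = math.comb(n, k) * (0.5 + EPS)**k * (0.5 - EPS)**(n - k)
    lw = math.comb(n, k) * (0.5 - EPS)**k * (0.5 + EPS)**(n - k)
    return credence * lb / (credence * lb + (1 - credence) * lw)

# Hypothetical six-agent communication graph and starting credences.
neighbors = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4],
             3: [1, 5], 4: [2, 5], 5: [3, 4]}
credence = {0: 0.7, 1: 0.4, 2: 0.6, 3: 0.8, 4: 0.3, 5: 0.55}

# Agents who currently favor B (credence > 0.5) each run 10 trials.
# Suppose B really is superior, succeeding with probability 0.5 + EPS.
results = {a: sum(random.random() < 0.5 + EPS for _ in range(TRIALS))
           for a in credence if credence[a] > 0.5}

# Each agent updates on every test it can see: its own and its
# neighbors'. An agent with no B-testers in view would keep its old
# credence, like the lone holdout in Figure 1.
for a in credence:
    for tester in [a] + neighbors[a]:
        if tester in results:
            credence[a] = update(credence[a], results[tester], TRIALS)

print({a: round(c, 2) for a, c in credence.items()})
```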

Another of the authors’ telling anecdotes concerns Lady Mary Wortley Montagu, whose husband became British ambassador to the Ottoman Empire. There she encountered a practice called variolation — a primitive version of inoculation that involved scratching a person’s arm and rubbing a scab or fluid from a smallpox pustule into the wound. Though a few patients did indeed contract smallpox and die, most experienced a mild form of the illness while developing immunity. A smallpox survivor herself, Lady Mary successfully variolated her own young son.

Upon returning to England, Lady Mary sought to popularize variolation among the British aristocracy but met resistance from English doctors. She turned to her friend Caroline of Ansbach, the Princess of Wales, for help. Though Lady Mary’s information was accurate all along, the practice did not spread among the English nobility until the Princess’s two young daughters were successfully variolated.

O’Connor and Weatherall explain Princess Caroline’s influence in terms of a sequence of graphs on seven vertices (agents), arranged in a ring of six around a central “queen bee.” The latter communicates with everyone else, but they each communicate with her alone. Should the queen happen to revise an opinion—as she does in Figure 2b—the agents are likely to do so as well. Such “star networks” seldom occur by themselves but often lie hidden within larger networks.
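A few lines of code make that asymmetry concrete. In this hypothetical adjacency list (again illustrative, not taken from the book), the hub’s trial results are visible to all seven agents, while each spoke’s results reach only itself and the hub:

```python
# Star ("queen bee") topology: the hub hears all six spokes, but each
# spoke hears only the hub.
hub, spokes = 0, [1, 2, 3, 4, 5, 6]
neighbors = {hub: spokes, **{s: [hub] for s in spokes}}

# An agent's audience: itself plus everyone who listens to it, i.e.,
# everyone whose credence its trial results can move.
audience = {a: 1 + sum(a in nbrs for nbrs in neighbors.values())
            for a in neighbors}
print(audience)  # {0: 7, 1: 2, 2: 2, ...}: the hub's evidence reaches all
```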

Figure 2. The spread of a belief in a star network. Dark nodes represent one action and light nodes represent another action. Image courtesy of Yale University Press.

Once identified or created, these subnetworks can be of considerable value to propagandists who seek to influence important events. For instance, the Russian military appears to have made subtle use of the star network concept in its alleged attempts to influence the outcome of the 2016 U.S. presidential election. Facebook has since revealed that Russian-produced political content reached as many as 126 million U.S. users.

A Facebook “group” is meant to facilitate discussion among its members, while a “page” is designed for an organization or celebrity to communicate with “followers.” A “community page” lies somewhere in between: its creator attempts to attract followers by posting messages that are of interest to a target audience, but registered followers may also post messages that everyone sees.

Well before the 2016 election, the Russians apparently began to create community pages of potential interest to a wide variety of existing affinity groups, including the LGBTQ community, Black Lives Matter activists, gun rights supporters, anti-immigration zealots, and even animal lovers. They did so by posting messages that subtly reaffirmed the target audience’s beliefs to gain trust and solidify their position as “queen bee” within a star-like communication network. Only then did they begin to inject a few tenuously related fake news stories into the daily flow of community give-and-take.

The Misinformation Age treats extremely sensitive material in an entirely scholarly manner, with 27 pages of notes and a 36-page bibliography. It is also highly informative and—at least to this reviewer—a genuine page-turner. 


References
[1] Bala, V., & Goyal, S. (1998). Learning from neighbours. Rev. Econ. Stud., 65(3), 595-621.
[2] Zollman, K.J.S. (2007). The communication structure of epistemic communities. Phil. Sci., 74(5), 574-587.

About the Author

James Case writes from Baltimore, Maryland.