
When Software Harms, What You Reap Is What You Sow

By Matthew R. Francis

As of July 2020, the CalGang database contained the names and personal details of nearly 90,000 people in California who were suspected of belonging to gangs or associating with gang members. Despite its stated purpose of providing law enforcement agencies with accurate intelligence, audits and independent investigations revealed that the database was riddled with errors, falsified material, racial profiling, and other serious problems.

Databases and algorithms are ubiquitous parts of our interconnected world, but CalGang illustrates a major way in which they can fail people. If a streaming service suggests a movie that you do not like, no real harm is done. But if your name appears in CalGang, you may face consequences such as increased police harassment or harsher sentencing if you are charged with a crime.

“[Most of] the people creating these technologies are not affected in negative ways,” Seny Kamara, a computer scientist at Brown University, said. “But if you’re a young Black male growing up in Chicago or New York or California, you know that you may end up as a false positive in a gang database, and that affects your life in a completely different way.”

During his presentation at the 2021 American Association for the Advancement of Science (AAAS) Annual Meeting, which took place virtually this February, Kamara used CalGang as a specific example of the disconnect between people who design software and those who are harmed by it. Although these databases are frequently promoted as more objective than human decision making, they often perpetuate the same prejudices and harms.

“One of the rationales for risk assessment tools [in criminal justice] is that judges can be inconsistent or make decisions that are not always suitable,” Suresh Venkatasubramanian, a computer scientist at the University of Utah who also spoke at the AAAS session, said. “An algorithmic process might help us, [but] then there is a concern that these systems—by merely looking at prior data—are amplifying patterns of bias, especially racial bias. The question then becomes, ‘Why are we building a risk assessment tool in the first place?’ These are always being commissioned and built by the folks who are putting people in jail.”

In other words, improving software like CalGang still accepts the necessity of its existence, and the technologists who create it are complicit in the abuses of the carceral system, regardless of their own intentions. Both Kamara and Venkatasubramanian argued that computer scientists often choose a side merely by accepting work that is commissioned by the police, military, or private companies. Though taking a position is not intrinsically bad, the researchers agreed that one must understand the ethics and consequences that are associated with building algorithms, databases, and other software tools.

On the surface, CalGang—and other applications like it—may seem necessary. In practice, however, police and other law enforcement entities have great leeway when it comes to adding people to the database; as a result, teachers, coaches, and relatives of suspected gang members can themselves end up listed. Kamara pointed out that investigations found 42 infants under the age of one listed in CalGang, along with other minors whose families were never informed of their inclusion, as the law requires. Once added to the database, individuals have had trouble getting themselves removed, or even learning that they are on the list [3].

Due to these issues, the Los Angeles Police Department announced in 2020 that it would no longer use CalGang [2]. However, other California police departments still utilize the application, and Immigration and Customs Enforcement (ICE) employs the similarly problematic ICEGangs database as part of its deportation decision process.

Know Your Enemy

Part of the difficulty in combating such technological problems is that they often work as designed; their intrinsically unequal effects are part of the package. “If you’re an American living in the U.S., drones are fantastic,” Kamara said, referencing package delivery and recreational use. “But if you live in a different part of the world, like in Pakistan, you have a very different feeling about drones. They affect you very differently [because] they’re weapons of war.”

This disparity also applies to other types of robots, whose development is heavily funded by the military and law enforcement. Even though these machines—such as the “dancing dog” robots of viral video fame—may have life-saving and life-enhancing applications, they are already being weaponized. People’s opinions of such robots largely depend on whether the technology is helping or harming them.

To better explain the situation, Kamara used a metaphor from his own specialization of cryptography: adversarial models, wherein someone attempts to learn the contents of an encrypted message. The adversary in this analogy is a person or group who wants to discover other people’s secrets or harm them in some way.

“When you’re designing a system, you have multiple kinds of adversaries with different powers and different goals,” Kamara said. “If I’m young and Black and live in St. Louis or New York City, odds are that the police are part of my adversarial model. When I leave my house, I have to think about my interactions with police, how to survive those interactions, or whether I’m going to be harassed.”

This is not the case for many white people, whose primary interactions with police are neutral or positive. They thus have different adversarial models than Black people. Similarly, women have different adversarial models than men, immigrants—particularly immigrants of color—have different adversarial models than citizens, and so forth.

“Technology is produced with certain adversarial models in mind,” Kamara said. “But all of these other groups and communities just don’t come up, so the problems that they face are not being met by the technology being produced. This is definitely the case with respect to privacy, security, and safety.”

Why Stand on a Silent Platform?

Some programmers may be loath to accept that one does not have to be ideologically racist to produce something that can be used in racist ways. Both Kamara and Venkatasubramanian emphasized that combating a racist and exploitative system requires active opposition rather than ideological neutrality.

“In computer science, we don’t think of ourselves as being part of a larger system of societal governance,” Venkatasubramanian said. “We think of ourselves as tool builders. One of the difficulties has been to realize that 90 percent of the hammers we’re building are being used to beat on people, and only 10 percent are being used to beat on nails.”

Acknowledging that neutrality is not a binary state is one step toward understanding unintended consequences. “It’s not helpful to talk about the artifact itself as being neutral or not,” Venkatasubramanian continued. “It’s an end stage of a whole process that involves people who have their own judgments about what they should be doing.”

For obvious reasons, technologists like to create technological solutions for every issue, regardless of whether such solutions are appropriate. While Kamara admitted to this bias himself, he also acknowledged that a more diverse field would help counteract it. “A huge part of the solution is that we just need more diversity,” he said. “We need people with different life experiences. They know what the problems are and can come up with good solutions.”

However, hiring more white women and people of color to write code will not solve these problems if the field’s established professionals do not admit that the problems exist. To that end, a broader education that encompasses history, sociology, and policy would greatly benefit technologists. Furthermore, software that harms other people today might well harm white male computer scientists in the future [1].

Problematic technologies like CalGang and surveillance drones are unlikely to vanish, given the ample funding that continues to flow toward their creation. However, understanding their potential to hurt certain groups of people can shift value systems in computer science, much as the natural sciences have had to grapple with questions of funding sources and research independence. “Not only do we need to participate [in change] because of what we have wrought,” Venkatasubramanian said. “We should participate because we could do so much more.”


References
[1] Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code. Medford, MA: Polity.
[2] Chabria, A., & Miller, L. (2020, June 24). Reformers want California police to stop using a gang database seen as racially biased. Los Angeles Times. Retrieved from https://www.latimes.com/california/story/2020-06-24/california-police-urged-to-stop-using-gang-database-deemed-biased.
[3] Winston, A. (2016, March 23). You may be in California’s gang database and not even know it. Reveal. Retrieved from https://revealnews.org/article/you-may-be-in-californias-gang-database-and-not-even-know-it.

Matthew R. Francis is a physicist, science writer, public speaker, educator, and frequent wearer of jaunty hats. His website is BowlerHatScience.org.
