Crime Prediction Keeps Society Stuck in the Past

One of the most remarkable examples of the use of predictive technology is the story of Robert McDaniel, detailed by journalist Matt Stroud in the Verge in May 2021. McDaniel is a resident of Austin, a Chicago neighborhood that saw 72 homicides, nearly 10 percent of the city’s total, in 2020 alone. Despite the fact that McDaniel had no record of violence (he had been arrested for selling pot and shooting dice), a Chicago Police Department predictive policing program determined in 2013 that he was a “person of interest”—literally. In the 2011–16 CBS crime drama of that name, “the machine,” created by the show’s protagonist, can determine only that a person will be either the victim or the perpetrator of a violent crime, but not which. Similarly, the algorithm used by the CPD indicated that McDaniel was more likely than 99.9 percent of Chicago’s population to be involved in a shooting, though which side of the weapon he’d be on was unknown.

Equipped with this “knowledge,” Chicago police officers placed McDaniel on their Strategic Subject List, later known as the “heat list,” and kept a close watch on him, despite his not being under suspicion of involvement in any specific crime. Because some of that surveillance was overt, it suggested to others in his neighborhood that he might have some kind of connection to the police—that he was perhaps an informant, a tremendously damaging reputation. 

Predictably enough, McDaniel has been shot twice since he was first identified by the CPD: first in 2017, perhaps partly due to publicity generated by his appearance that year in a German documentary, Pre-Crime, that he hoped would help to clear his name; and more recently in 2020. He told the Verge that both shootings were due to the CPD surveillance itself, and the resulting suspicion that he was cooperating with law enforcement. “In McDaniel’s view,” Stroud writes, “the heat list caused the harm its creators hoped to avoid: It predicted a shooting that wouldn’t have happened if it hadn’t predicted the shooting.”

That is true enough, but there is a deeper pattern to observe here as well. Because of police data from the past, McDaniel’s neighborhood, and therefore the people in it, came to be labeled as violent. The program then said that the future would be the same—that is, that there would not be a future, but merely reiterations of the past, more or less identical with it. This is not merely a self-fulfilling prophecy, though it certainly is that: It is a system designed to bring the past into the future, and thereby prevent the world from changing.

The program that identified McDaniel appears to have been developed specifically for the CPD by an engineer at the Illinois Institute of Technology, according to earlier reporting by Stroud. It identified around 400 individuals deemed most likely to be involved in violent crime and put them on its heat list. The program ran from 2012 until it was discontinued in 2019, as disclosed that year in a Chicago city government watchdog report that raised concerns about it, including the accuracy of its findings and its policies on sharing data with other agencies. The custom CPD algorithm reportedly focused on individuals, and it likely resembles a wide range of programs used by law enforcement and militaries of which the public has little knowledge. In 2018, for instance, journalist Ali Winston reported in the Verge that the surveillance company Palantir, founded by Peter Thiel, had been secretly testing similar technology in New Orleans since 2012 without informing many city officials.
