
Precrime: Racism and Algorithms

    Previously we have written about how algorithms can be classist; today we will consider the curious case of Pasco County, Florida to reflect on the ways in which algorithms can be racist.

    In 1956 the US author Philip K. Dick published a science fiction novella called The Minority Report, set at an unspecified point in the future. In the novella, three mutants with precognitive abilities foresee murders before they happen, enabling the police to stop would-be killers before they act. In this future society the murder rate is consequently zero. The creator of this division of the police force, dubbed ‘precrime’, is John Anderton, who serves as our protagonist.

    The Minority Report was originally published in Fantastic Universe Science Fiction magazine

    John receives a shock when the precrime process predicts that he will murder a man in cold blood in the coming week. Rather than submit to his own system, he becomes convinced that there is a massive conspiracy, and of course he is correct. We won’t spoil the ending; for our purposes we have all we need.

    The Minority Report reflects Dick’s anxieties about the Cold War and the increasingly authoritarian measures justified on the grounds that (supposedly) they keep people safe. The novella also plays on our unease about the role of technology and automation in our lives. Until 2014, Facebook’s company motto was “Move fast and break things”, but of course it’s different when your things are the ones being broken, or when your own precrime process targets you.

    Mark Zuckerberg in front of his company motto

    In 2011, Chris Nocco took office as Sheriff of Pasco County, Florida. Inspired by Minority Report, and by the data-driven ethos of other media such as Moneyball, Nocco wanted to build his own precrime division and stop serious crimes before they happened. Of course he did not have access to precognitive mutants, so how did he do it?

    The short answer is that he didn’t. Two weeks ago, on September 3rd, 2020, The Tampa Bay Times published an explosive exposé of Nocco’s programme. Lacking access to precognition, the Sheriff opted for the next best thing: a complex algorithm fed colossal amounts of data in order to decide which citizens of Pasco County were likely to commit crimes.

    It began with fairly conventional criminological techniques: using statistics to identify geographic areas with a higher rate of crime than their neighbouring areas and then policing those places more heavily. These techniques have been subjected to severe and relentless critique by sociologists, legal scholars, and other academics, as they usually result in the over-policing of already poor or disadvantaged areas and merely ensure that the cycle of disadvantage continues for another generation. The racialization of crime data is well established and deeply problematic.
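    To make the logic concrete, here is a minimal sketch of this kind of hotspot analysis, written in Python. The area names, incident counts and the 1.5x threshold are all illustrative assumptions on our part; they are not taken from Pasco County's actual method.

```python
# Illustrative sketch of hotspot-style analysis: flag areas whose crime rate
# is well above the average of their neighbours. Area names, counts, and the
# 1.5x threshold are assumptions, not the Pasco County method.

incidents_per_1000 = {          # reported incidents per 1,000 residents
    "area_a": 12.0,
    "area_b": 34.5,
    "area_c": 15.2,
    "area_d": 11.8,
}

neighbours = {                  # which areas border which
    "area_a": ["area_b", "area_c"],
    "area_b": ["area_a", "area_c", "area_d"],
    "area_c": ["area_a", "area_b", "area_d"],
    "area_d": ["area_b", "area_c"],
}

THRESHOLD = 1.5                 # flag areas at 1.5x their neighbours' average


def hotspots(rates, adjacency, threshold=THRESHOLD):
    flagged = []
    for area, rate in rates.items():
        neighbour_rates = [rates[n] for n in adjacency[area]]
        neighbour_avg = sum(neighbour_rates) / len(neighbour_rates)
        if rate > threshold * neighbour_avg:
            flagged.append(area)
    return flagged


print(hotspots(incidents_per_1000, neighbours))   # ['area_b']
```

    The critique above applies directly to even this simple version: the flagged areas are then policed more heavily, which produces more recorded incidents in exactly those areas and feeds the next round of statistics.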

    However, beginning in 2014, this intelligence-led policing initiative was expanded and began to operate algorithmically. Problematic citizens were identified and given a ‘score’ reflecting how likely the system judged them to reoffend. People receive these scores based on their criminal record, and a score is maintained at the same level even if the charges are dropped. As the exposé says:

    “The manual says people’s scores are “enhanced” — it does not say by how much — if they miss court dates, violate their probation or appear in five or more police reports, even if they were listed as a witness or the victim.”
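    The manual does not say how much each factor is worth, so the weights in the following sketch are placeholders; it is only meant to show the shape of the scoring logic described in the excerpt, not the county's actual formula.

```python
# Rough sketch of the scoring logic described in the manual excerpt above.
# The Times reports that scores are "enhanced" without saying by how much,
# so every weight here is a placeholder, not the county's actual formula.

from dataclasses import dataclass


@dataclass
class PersonRecord:
    arrests: int = 0                 # counted even if charges are later dropped
    missed_court_dates: int = 0
    probation_violations: int = 0
    police_report_mentions: int = 0  # includes appearances as witness or victim


ARREST_WEIGHT = 1.0                  # placeholder weights
ENHANCEMENT_WEIGHT = 1.0
REPORT_MENTION_THRESHOLD = 5         # "five or more police reports"


def risk_score(record: PersonRecord) -> float:
    score = ARREST_WEIGHT * record.arrests
    score += ENHANCEMENT_WEIGHT * record.missed_court_dates
    score += ENHANCEMENT_WEIGHT * record.probation_violations
    if record.police_report_mentions >= REPORT_MENTION_THRESHOLD:
        score += ENHANCEMENT_WEIGHT
    # Note what is absent: nothing here ever lowers the score when charges
    # are dropped or when a person appears in reports only as a victim.
    return score


print(risk_score(PersonRecord(arrests=2, missed_court_dates=1,
                              police_report_mentions=6)))   # 4.0
```

    Note that nothing in this logic ever lowers a score, which matches the reporting that scores are maintained even when charges are dropped.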

    In response, citizens with high scores were checked on. Deputies would call at their homes, ask them questions and make sure they were staying out of trouble. Then they would come back. Again, and again, and again. The practice was to make life unbearable for those deemed algorithmically likely to commit a serious crime, until they either moved or sued. This suggests that the real concern was not the impending possibility of crime but a callous NIMBYism: if people want to commit crimes somewhere other than Pasco County, Florida, and they move elsewhere, then apparently there is no problem.

    Previously we have discussed the importance of human decision-making in algorithms. At a superficial level, algorithms may seem objective, offering the emotionless clarity unique to mathematics; but if you look hard enough, and for long enough, you will always find a human somewhere in the process who has made a decision informed by their values and worldview.

    It could be said that the precrime algorithm of Pasco County merely renders a list of likely criminals, but does it even achieve this? In this instance we would point to the very human, and very intentional, decision to base most of the algorithm’s weighting on arrests, and, even more intentionally, to maintain a negative score even for those whose charges are dropped. In the United States a black person is more than twice as likely to be arrested as a white person, and five times as likely to be stopped. According to the NAACP, 5% of illicit drug users are African American, and yet African Americans account for 29% of drug arrests and 33% of those incarcerated for drug offences. To see a full rendering of the scope of the problem, check out this graphic, which gets into the issue of crimes that are never reported or prosecuted.
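    A toy simulation makes the point about arrest-based weighting. Assume two groups with identical underlying offending behaviour, but where one group is arrested roughly twice as often per offence (the disparity cited above); every other number below is ours, chosen purely for illustration.

```python
# Toy simulation of the argument above: if two groups offend at the same rate
# but one is arrested roughly twice as often per offence (the disparity cited
# in the text), an arrest-count-based score flags that group far more often.
# All numbers are illustrative assumptions, not Pasco data.

import random

random.seed(0)

OFFENCE_RATE = 0.10              # identical underlying behaviour in both groups
ARREST_PROB = {"group_a": 0.20,  # probability an offence leads to an arrest
               "group_b": 0.45}  # roughly twice as high for group_b
PEOPLE_PER_GROUP = 10_000
YEARS = 5
FLAG_THRESHOLD = 1               # flagged once any arrest is on record


def flagged_share(group: str) -> float:
    flagged = 0
    for _ in range(PEOPLE_PER_GROUP):
        arrests = 0
        for _ in range(YEARS):
            if random.random() < OFFENCE_RATE and random.random() < ARREST_PROB[group]:
                arrests += 1
        if arrests >= FLAG_THRESHOLD:
            flagged += 1
    return flagged / PEOPLE_PER_GROUP


print(f"group_a flagged: {flagged_share('group_a'):.1%}")   # roughly 10%
print(f"group_b flagged: {flagged_share('group_b'):.1%}")   # roughly 20%
```

    Even with identical behaviour, an arrest-count feature flags roughly twice as many people in the more heavily arrested group, and that is before any feedback effect from extra policing is taken into account.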

    By choosing to focus on arrests, the algorithm builds in a racial bias, since nonwhite Americans are so much more likely to be arrested; and it does not even remove people from consideration once their names have been cleared. In this specific instance, the exposé did not reveal serious problems of racialization, as the algorithm seems predominantly to have targeted juvenile offenders.

    However, we would argue that it is the precedent that matters, as the flaw lies in the very construction of the algorithm. The Pasco County algorithm was ended in November 2019, but its legacy lives on in the controversies it has generated and the lawsuits that are still ongoing. It is not the first poorly designed algorithm to be used in an essential part of public life, and unfortunately it will not be the last.