Software Is A Racist Matrix

We can just file this under that infinitely dense file called “More Bullsh*t Black People Have to Deal With.” A recent investigation by ProPublica is a classic case of flawed data within an algorithm producing a flawed result. In this case, the virus within the machine is racism for the digital age.

Like a scene out of Minority Report, the investigation revealed that computer-generated assessment scores, used across the nation to estimate the probability that a convicted person will reoffend, predicted the wrong outcome for Black defendants at twice the rate they did for White defendants.

BS in, BS out. Any programmer will tell you that. But apparently the United States Sentencing Commission didn’t see fit to keep that in mind.

ProPublica found that the assessment software often introduces bias against Black defendants. Duh! The study examined 7,000 risk scores assigned to individuals arrested in Broward County, Florida, in 2013 and 2014, then checked to see whether those individuals committed new crimes over the following two years. The scores were produced by an algorithm developed by Northpointe, whose motto is “Advancing Justice, Embracing Community.”
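To see what “wrongly flagged at twice the rate” means in practice, here is a minimal sketch of the kind of false-positive-rate comparison the study performed: among defendants who did *not* go on to reoffend, what fraction of each group was flagged as high risk? The numbers below are invented purely for illustration — this is not ProPublica’s actual data or Northpointe’s algorithm.

```python
# Hypothetical sketch of a false-positive-rate comparison by group.
# Data here is invented for illustration only.
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of (group, flagged_high_risk, reoffended) tuples.

    Returns, per group, the share of non-reoffenders who were wrongly
    flagged as high risk (the false positive rate)."""
    non_reoffenders = defaultdict(int)
    wrongly_flagged = defaultdict(int)
    for group, flagged, reoffended in records:
        if not reoffended:
            non_reoffenders[group] += 1
            if flagged:
                wrongly_flagged[group] += 1
    return {g: wrongly_flagged[g] / non_reoffenders[g] for g in non_reoffenders}

# Toy numbers chosen only to mirror a roughly 2:1 disparity.
sample = (
    [("Black", True, False)] * 45 + [("Black", False, False)] * 55 +
    [("White", True, False)] * 23 + [("White", False, False)] * 77
)
rates = false_positive_rates(sample)
print(rates)  # Black FPR = 0.45, White FPR = 0.23 — about a 2:1 disparity
```

The point of the computation is that a system can be “equally accurate” overall while still distributing its mistakes unequally across groups — which is exactly the disparity ProPublica reported.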


But whose justice are they advancing? And what community are they embracing?


Northpointe’s widely used assessment software inaccurately flagged both Black and White defendants as potential future criminals. But Black defendants were wrongly flagged at twice the rate of White defendants. Here’s Northpointe’s official statement refuting ProPublica’s findings:

“…does not agree that the results of your analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model.”

Former U.S. Attorney General Eric Holder warned of the potential for inaccuracies within this type of software back in 2014, stating:


“Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice.”

Despite his warnings, more and more courtrooms use this software, and a sentencing reform bill currently pending Congressional approval would mandate its use in federal prisons.


I think former Attorney General Holder was being rather coy with that “best of intentions” line. It had to be tongue-in-cheek. This software is how the machine justifies feeding itself more Black and Brown people.

Though this proprietary software’s algorithm is not fully disclosed to the public, Northpointe does factor in education level and whether the defendant has a job. The software draws on 137 questions that are either answered directly by the defendant or pulled from criminal records.

Questions include: “Was one of your parents ever sent to jail or prison?” “How many of your friends/acquaintances are taking drugs illegally?” and “How often did you get in fights while at school?”

The questionnaire also asks people to agree or disagree with statements such as “A hungry person has a right to steal” and “If people make me angry or lose my temper, I can be dangerous.”

This is the part of the Prison Industrial Complex that we rarely get a glimpse of. It is what they’re now using to justify increased sentences and penalties. It is a monster. Make no mistake. This Northpointe software has needlessly upended the lives of thousands in Broward County and will soon go national like some surreal version of Skynet.



This is what the fight is about, ladies and gentlemen. The stench of racism permeates everything the criminal justice system touches. Even the purest optimist has to admit this may be irreparable. The entire criminal justice system is rigged against people of color.