Did you know that a commercial software program is being used by judges to predict whether or not some criminal defendants should be punished more harshly than others?
A product called COMPAS, sold by Northpointe Inc., is routinely used by courts to predict which criminal defendants will be repeat offenders and which ones won’t. And some experts have concluded that if the defendant is black, the predictions are consistently biased against them. What is even worse is that the algorithms the software uses to predict future criminal behavior are kept secret, so defendants and their attorneys have no way to examine the basis for the “evidence” being used against them. A software prediction can sway a judge from granting you probation to sentencing you to a prison term, and it is happening – frequently.
The U.S. Supreme Court has been asked to review just such a case, in which the use of the software contributed to a longer prison term for the defendant.
This is scary, on many levels. Self-driving cars are one thing, but now the judge in your case may be taking advice from a PC, or perhaps even delegating the decision to a commercial app, before sentencing.
Some people may think it is good to use supposedly objective software, rather than a human being, to predict your behavior. I disagree. Software is written by people, and human subjectivity always finds a way into human endeavors.
Racist software? Bigoted bytes? Secret algorithms? Minority Report or 1984, or both?
Read the articles below and see if what you discover doesn’t totally creep you out.