Algorithms and Criminal Justice Reform

It’s no big secret that the United States has a prison problem. We lock up people at higher rates than any other nation, and there are huge racial disparities in who we lock up. According to a study from The Sentencing Project, African Americans are incarcerated in state prisons at more than five times the rate of whites. There are lots of reasons why we may see these racial disparities, including law enforcement practices, inequitable access to resources, and punitive sentencing policies. Keeping so many people in prison is also really expensive: it costs roughly $80 billion a year, far more than we spend on many other essential public services. California, for instance, spends more than $70,000 per inmate but less than $12,000 per K-12 student.

As a result, there’s lots of momentum to reform our prison system. One popular strategy that has been gaining traction as part of this reform effort is the use of risk assessment tools. Like Netflix, these tools use algorithms to make predictions about behavior. While Netflix uses data to predict what videos you would like to watch, risk assessment tools use data to predict whether a person will commit a future crime. Depending on how the tools are used, they could help reduce prison and jail populations. For example, people who are less likely to commit a future crime could be released from prison early, receive shorter prison sentences, or avoid jail time in the first place. Rehabilitative programs and services could also be prioritized for those who are at a higher risk of committing future crimes.

Assessment Tools

While the tools may vary in purpose and how they are used, they all essentially work by assessing an individual’s risk factors. A risk factor is a characteristic that is associated with an increased likelihood of future criminal behavior. The tools are typically presented as questionnaires, and they range in complexity from simple questions about age, criminal history, and education level to more complicated questions about personality and criminal attitudes.

The tools compare an individual’s answers to a database of past offenders who have been tracked over time to see if and when they commit another crime. Based on the outcomes of past offenders who share that individual’s risk factors, the tool predicts whether that person is likely to reoffend.
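To make that comparison concrete, here is a minimal, hypothetical sketch of the basic idea: look up past offenders who share an individual’s risk factors and report how often that group reoffended. The field names, categories, and data below are all invented for illustration; real tools use far more factors and more sophisticated statistical models.

```python
# Hypothetical sketch: score an individual by comparing their risk-factor
# answers to a database of past offenders with the same answers.
# All field names and records here are made up for illustration.

def risk_score(individual, past_offenders):
    """Return the fraction of matching past offenders who reoffended,
    or None if no past offender shares this individual's risk factors."""
    matches = [o for o in past_offenders
               if all(o[factor] == answer
                      for factor, answer in individual.items())]
    if not matches:
        return None  # no comparable cases in the database
    return sum(o["reoffended"] for o in matches) / len(matches)

# Invented database of tracked past offenders.
past_offenders = [
    {"age_group": "18-25", "prior_convictions": "2+", "reoffended": True},
    {"age_group": "18-25", "prior_convictions": "2+", "reoffended": False},
    {"age_group": "18-25", "prior_convictions": "2+", "reoffended": True},
    {"age_group": "40+",   "prior_convictions": "0",  "reoffended": False},
]

# An individual's questionnaire answers.
individual = {"age_group": "18-25", "prior_convictions": "2+"}

print(risk_score(individual, past_offenders))  # 2 of 3 matches reoffended
```

Even this toy version hints at the concerns discussed below: the prediction is only as good as the historical database it is compared against, and any bias in who was arrested and tracked in the past carries straight through to the score.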

Without the use of these tools, judges and parole boards often take many of the same considerations into account when making decisions about sentencing and parole: things like criminal history, employment status, the nature of the crime, and a person’s attitude toward their crime.

While not new, the tools have been receiving more and more media attention over concerns about their fairness, accuracy and reliability.  

One concern stems from how the tools are tested and validated. Not all risk assessment tools in use in the United States have been independently reviewed. In some cases, the companies that make the tools are the ones evaluating how good they are at predicting whether someone is likely to reoffend. Experts point out that it’s important to validate tools in local settings in order to ensure they work for that setting.

There are also concerns about how these tools may affect racial disparities already seen in the criminal justice system. While race is not explicitly considered as a risk factor in the tools, critics caution that some risk factors, like a person’s neighborhood, may serve as a stand-in for race. For the most part, these tools have not been extensively studied to see if they exhibit racial bias, or how they may affect the racial disparities already present in our criminal justice system.

As risk assessment tools gain popularity in prison reform efforts, many are advocating for greater testing, validation and transparency in order to build public trust and ensure they do not further contribute to racial inequality in the prison system.

Learn More….

ARTICLE: How Algorithms Are Used in the Criminal Justice System (The Lowdown/KQED)

ARTICLE: Sent to Prison by a Software Program’s Secret Algorithms (The New York Times)

ARTICLE: A Computer Program Used for Bail and Sentencing Decisions Was Labeled Biased Against Blacks. It’s Actually Not That Clear. (The Washington Post)

ARTICLE: Exploring the Use of Algorithms in the Criminal Justice System (Stanford Engineering)


ABOVE THE NOISE, a new YouTube series from KQED, follows young journalists as they investigate real world issues that impact young people’s lives. These short videos prompt critical thinking with middle and high school students to spark civic engagement. Join hosts Myles Bess and Shirin Ghaffary for new episodes published every Wednesday on YouTube.


Can Algorithms Help Wind Down Mass Incarceration? | 24 August 2017 | Lauren Farrar


Author

Lauren Farrar

Lauren has a background in biology, education, and filmmaking. She has had the privilege to work on a diverse array of educational endeavors and is currently a producer for KQED Learning’s YouTube series Above the Noise. Lauren’s career has taken her to the deepest parts of the ocean to film deep-sea hydrothermal vents for classroom webcasts, into the pool to film synchronized swimmers to teach about the pH scale, and onto roller coasters to create a video about activation energy. And she’s done it all for the sake of education. Lauren loves communicating science! Follow her on Twitter @LFarrarAtWork.
