The Justice Department pledges to address racial bias in an algorithm that determines early release.
The U.S. Department of Justice is reviewing the algorithm it uses to determine whether people in federal custody are eligible for early release. The algorithm, which estimates an individual’s risk of reoffending, exhibits racial bias.
The United States has one of the highest recidivism rates in the world. About half of all people released from federal prisons are rearrested within eight years. In light of such stark data, experts conclude that U.S. prisons do a poor job of rehabilitating people and preparing them to reenter society successfully. Research demonstrates that U.S. prisons do not adequately treat or address root causes of criminality, such as mental health issues, addiction, and lack of education and job training.
In addition, when people are released from government custody, they can face challenges obtaining employment and housing, as people with criminal records are frequently subject to legally sanctioned discrimination. Difficulties obtaining legal employment and reintegrating into civil society can lead individuals to reengage in crime.
The First Step Act, a bipartisan law passed in 2018, is aimed at reducing recidivism by addressing these root causes. The law allows certain people in federal custody to earn time credit toward early release for participating in approved recidivism reduction programs, which include vocational and apprenticeship training, educational programs, mental health programs, and substance use treatment.
Earlier this year, the Federal Bureau of Prisons issued a rule implementing earned time credits for participation in approved programs, as the First Step Act requires. By completing recidivism reduction programs, people in the Bureau’s custody can earn up to 54 days of credit for each year of their imposed sentence. Some incarcerated people can apply these credits “toward earlier placement in pre-release custody” arrangements, such as home confinement. In addition, those who earn enough time credit can apply up to 12 months of it toward supervised release at the discretion of the Bureau’s Director.
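For readers who want to see the arithmetic, the sketch below illustrates only the figures reported here: up to 54 days of credit for each year of the imposed sentence, with at most 12 months of credit applicable toward supervised release. The function names and the flat per-year maximum are simplifying assumptions; the Bureau’s actual calculation rules are more detailed.

```python
# A minimal sketch of the time-credit arithmetic described above.
# The 54-days-per-year maximum and the 12-month cap come from this article;
# everything else (names, the flat per-year formula) is an assumption.

DAYS_PER_YEAR_MAX = 54          # maximum credit per year of imposed sentence
SUPERVISED_RELEASE_CAP = 365    # at most 12 months may go toward supervised release


def max_earned_credit(sentence_years: float) -> int:
    """Upper bound on earnable time credit, in days."""
    return int(sentence_years * DAYS_PER_YEAR_MAX)


def credit_toward_supervised_release(earned_days: int) -> int:
    """Portion of earned credit that may be applied toward supervised release."""
    return min(earned_days, SUPERVISED_RELEASE_CAP)


if __name__ == "__main__":
    sentence = 10  # hypothetical 10-year sentence
    earned = max_earned_credit(sentence)                 # 540 days
    applied = credit_toward_supervised_release(earned)   # capped at 365 days
    print(f"Maximum earnable credit: {earned} days; "
          f"applicable toward supervised release: {applied} days")
```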
There is, however, a catch. Early release is available only to incarcerated people whom the Justice Department determines to have a minimum or low risk of returning to crime.
To make this determination, the Justice Department developed an algorithm called the Prisoner Assessment Tool Targeting Estimated Risk and Needs (PATTERN). The algorithm uses factors such as current age, the nature of the offense of conviction, education status, incident reports, and financial responsibility to assess the likelihood that individuals will engage in crime upon their release. Based on the point score PATTERN assigns, individuals are classified as minimum, low, medium, or high risk.
Although only those in the minimum-risk and low-risk categories are eligible for early release, individuals can become eligible while incarcerated through good behavior and participation in recidivism reduction programs. People in custody are periodically reassessed to determine if their PATTERN risk score has sufficiently decreased.
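To make the mechanics concrete, the sketch below shows how a point-based instrument of this kind generally works: points are assigned for each factor, summed, and compared against cut points that define the risk categories. The weights and cut points here are hypothetical, not the Justice Department’s published values; the sketch simply assumes PATTERN’s general additive structure.

```python
# A hedged sketch of a point-based risk classification in the style of PATTERN.
# The factor points and category cut points are hypothetical, for illustration only.

from typing import Dict

# Hypothetical cut points: a total score at or below each threshold falls in that category.
CUT_POINTS = [(10, "minimum"), (30, "low"), (45, "medium")]


def score(factors: Dict[str, int]) -> int:
    """Sum the points assigned to each factor (weights are illustrative only)."""
    return sum(factors.values())


def classify(total_score: int) -> str:
    """Map a point total to a risk category using ascending cut points."""
    for threshold, category in CUT_POINTS:
        if total_score <= threshold:
            return category
    return "high"


if __name__ == "__main__":
    # Hypothetical individual: points per factor, not real PATTERN values.
    person = {"age": 12, "offense": 10, "education": 5,
              "incident_reports": 6, "financial_responsibility": 2}
    total = score(person)
    print(f"Initial score: {total}, category: {classify(total)}")      # 35 -> medium

    # Periodic reassessment: completing programs and avoiding incident reports
    # lowers the score, which can move someone into an eligible category.
    person["incident_reports"] = 0
    person["education"] = 0
    total = score(person)
    print(f"Reassessed score: {total}, category: {classify(total)}")   # 24 -> low
```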
Since the Justice Department announced the creation of its risk assessment tool, advocates have raised concerns about racial disparities inherent in predictive algorithms. Predictive algorithms such as PATTERN rely on data that can reflect and perpetuate historic patterns of discrimination and bias.
Months after the Justice Department implemented PATTERN, researchers from the National Institute of Justice reported disparities in the algorithm’s ability to predict recidivism for racial minorities. Among other disparities, the tool overpredicts the risk that Black, Hispanic, and Asian people will reoffend or violate their parole. This overprediction means that relatively fewer Black, Hispanic, and Asian incarcerated individuals are eligible for early release than their similarly situated white peers.
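One way researchers quantify this kind of overprediction is to compare, across racial groups, how often people who did not in fact reoffend were nonetheless scored above the low-risk threshold. The sketch below illustrates that comparison on invented records; it is not the National Institute of Justice’s methodology or data.

```python
# Illustrative only: measuring overprediction as the share of non-recidivists
# in each group who were nonetheless predicted to be above low risk.
# The records below are synthetic, not NIJ data.

from typing import List, Tuple

# (group, predicted_above_low_risk, actually_recidivated)
records: List[Tuple[str, bool, bool]] = [
    ("white", False, False), ("white", True, True), ("white", False, False),
    ("Black", True, False), ("Black", True, True), ("Black", True, False),
]


def overprediction_rate(rows: List[Tuple[str, bool, bool]], group: str) -> float:
    """Share of non-recidivists in `group` who were predicted above low risk."""
    non_recidivists = [r for r in rows if r[0] == group and not r[2]]
    if not non_recidivists:
        return 0.0
    return sum(r[1] for r in non_recidivists) / len(non_recidivists)


for g in ("white", "Black"):
    print(f"{g}: overprediction rate = {overprediction_rate(records, g):.2f}")
```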
For example, PATTERN uses factors such as criminal history and education without controlling for systemic biases in policing, prosecution, and access to education. As Melissa Hamilton, a law professor at the University of Surrey, has reportedly noted, the Justice Department’s failure to “correct for systemic biases” inevitably leads to disparities.
In a recent report to Congress, the Justice Department acknowledged these inequities and committed to revising PATTERN “to ensure that racial disparities are reduced to the greatest extent possible.” Consistent with those plans, the agency implemented a new version of the tool last week. Although that version, PATTERN 1.3, includes adjustments that the Justice Department claims increase the accuracy of the algorithm, it “will neither exacerbate nor solve” the existing “racial bias issues.”
Even as the Justice Department works to address racial disparities within PATTERN 1.3, it has pledged to consider other legal options for reducing the algorithm’s disparate outcomes. For example, the Justice Department has said that, alongside version 1.3, it will revise its point criteria for qualifying in the low-risk category. Department officials hope this change will expand opportunities for incarcerated people to earn early release and “mitigate the effects of various racial and ethnic disparities associated with previous risk groupings.”
Although the Justice Department has committed to revising PATTERN, it remains to be seen whether its actions will allay concerns about the use of algorithms in the criminal law context. Advocates continue to call for immediate action and more significant changes to the tool that determines who can, and who cannot, leave prison early.