Recidivism

Algorithmic risk assessment tools used to predict recidivism are one domain in which bias frequently occurs. In some criminal justice systems, algorithms are used to predict an individual’s likelihood of committing a future crime. Unless such an algorithm accounts for the fact that women consistently have lower recidivism rates than men, it will not predict with equal accuracy for people of all genders.
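To make the base-rate problem concrete, here is a minimal Python sketch on entirely synthetic data (not real criminal-justice records); the helper `make_group`, the base rates, and the single "risk factor" feature are all illustrative assumptions. A single pooled, gender-blind model is fitted to two groups with different underlying rates, and its average predicted risk is compared with each group's actual rate.

```python
# Synthetic illustration of unequal accuracy under different base rates.
# All numbers and feature names are assumptions, not empirical findings.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, base_rate):
    """Simulate one group: a noisy generic risk factor plus an outcome
    label whose prevalence is set by base_rate."""
    y = (rng.random(n) < base_rate).astype(int)
    x = rng.normal(loc=y.astype(float), scale=1.5, size=n)
    return x.reshape(-1, 1), y

# Assumed base rates mirroring the lower female recidivism rates noted above.
X_m, y_m = make_group(5000, base_rate=0.45)
X_w, y_w = make_group(5000, base_rate=0.25)

# Pool both groups and fit one "gender-blind" model.
X = np.vstack([X_m, X_w])
y = np.concatenate([y_m, y_w])
model = LogisticRegression().fit(X, y)

# Compare each group's actual rate with the model's mean predicted risk.
for name, Xg, yg in [("men", X_m, y_m), ("women", X_w, y_w)]:
    pred = model.predict_proba(Xg)[:, 1].mean()
    print(f"{name}: actual rate {yg.mean():.2f}, mean predicted risk {pred:.2f}")
```

In a typical run of this sketch, the pooled model over-predicts risk for the lower-base-rate group and under-predicts it for the higher one, which is precisely the unequal predictive accuracy described above.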

Researchers have found that algorithms designed to be gender-neutral can still produce unfair output. “Gender-neutral” predictors such as criminal history, mental health, age and substance abuse yield higher predicted recidivism risk for men. In contrast, research suggests that gender-responsive factors such as parenting and relationship stress, victimisation and trauma yield more accurate predictions for women. The question remains whether gender should be taken into account in criminal justice decision-making, as some would argue that risk assessment based on gender-responsive factors amounts to unacceptable gender profiling.
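The accuracy claim can likewise be illustrated with a hedged simulation. In the sketch below, a hypothetical “trauma” score drives outcomes for women in the simulated cohort, while the baseline model sees only a “gender-neutral” criminal-history feature; the feature names, effect sizes and data-generating process are assumptions for illustration, not findings from real data.

```python
# Hedged simulation: a gender-responsive feature improves accuracy for women.
# The data-generating process below is an illustrative assumption.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 4000

criminal_history = rng.normal(size=n)   # "gender-neutral" predictor
trauma = rng.normal(size=n)             # hypothetical gender-responsive predictor
is_woman = rng.random(n) < 0.5

# Assumed process: history matters for everyone; trauma carries
# most of the signal for women.
logits = 0.8 * criminal_history + np.where(is_woman, 1.5 * trauma, 0.0) - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_neutral = criminal_history.reshape(-1, 1)
X_responsive = np.column_stack([criminal_history, trauma])

m_neutral = LogisticRegression().fit(X_neutral, y)
m_responsive = LogisticRegression().fit(X_responsive, y)

# Compare discrimination (AUC) for women under each feature set.
w = is_woman
for label, m, X in [("neutral", m_neutral, X_neutral),
                    ("responsive", m_responsive, X_responsive)]:
    auc = roc_auc_score(y[w], m.predict_proba(X[w])[:, 1])
    print(f"{label} model, women: AUC = {auc:.2f}")
```

Under these assumed conditions, the model with the gender-responsive feature scores markedly higher for women, illustrating why such factors can improve predictive ability even as their use raises the profiling concern noted above.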

In sum, biased output leads to difficult ethical questions. Are we making fair and equitable decisions grounded in reality? Or are we repackaging an unfair status quo cloaked in technological innovation? The more widely algorithmic assessment tools are applied across society, the more pressing these questions become.