
Dutch police have been using algorithms to predict whether minors will become full-fledged crooks after contact with law enforcement. However, the margin of error is staggering: one in three is mislabeled.
The algorithm, dubbed "Preselect Recidive," targets young people aged 12 to 18, attempting to evaluate whether they will reoffend. If this sounds familiar, well, it is. The general idea closely resembles Steven Spielberg's 2002 film Minority Report.
And just as the action flick tried to warn us, predicting the future is hardly an exact science. According to Dutch investigative journalists at FTM, the algorithm produces a high rate of false positives, with one in three children mislabeled as likely to reoffend.
The use of algorithms by law enforcement and the justice system gets a lot murkier once you know which metrics are used to assign youngsters a score. "Preselect Recidive" analyses an individual's age, their household's police interactions, and their personal police interactions, which, surprisingly, include whether a youngster was a bystander witness to a crime.
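The actual model behind "Preselect Recidive" has not been published. As a rough illustration only, a points-based risk score built from the input types FTM reports could look like the sketch below. Every weight, threshold, and field name here is hypothetical, invented for this example, and not the Ministry's.

```python
# Hypothetical sketch only: the real "Preselect Recidive" model is not public.
# All weights, thresholds, and field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class YouthRecord:
    age: int                        # 12-18, the algorithm's target range
    personal_police_contacts: int   # prior contacts with police as a suspect
    household_police_contacts: int  # police contacts involving family members
    bystander_witness_count: int    # reportedly also feeds the score

def risk_score(record: YouthRecord) -> float:
    """Toy additive score over the input types the investigation describes."""
    score = 0.0
    score += 2.0 * record.personal_police_contacts
    score += 1.0 * record.household_police_contacts  # a relative's history raises the score
    score += 0.5 * record.bystander_witness_count    # even witnessing a crime counts
    score += 0.3 * max(0, 18 - record.age)           # hypothetical: younger contact weighs more
    return score

def flag_for_attention(record: YouthRecord, threshold: float = 3.0) -> bool:
    # With a one-in-three false-positive rate, many flagged youths never reoffend.
    return risk_score(record) >= threshold
```

Even this toy version shows the problem the article describes: factors entirely outside a youngster's control, such as a relative's police record, can push the score over the threshold.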
The score's ingredients make for some strange situations: in a group of youths with similar backgrounds, one could be singled out just because a relative faced police attention in the past.
What's more, the algorithm's score is no small deal, FTM claims, as it determines whether a young person will receive rehabilitation through Dutch state programs or face criminal prosecution. Meanwhile, at least some Dutch experts believe the high number of false positives makes the algorithm as good as random chance.
At the same time, various Dutch municipalities have used the tool to compile lists of young people deemed likely to attract police attention in the future. The investigation revealed that, while other factors were at play, "Preselect Recidive" played a large part in determining who was predicted to offend.
The approach risks creating a vicious circle: youths selected by the algorithm end up incriminated because of increased attention from law enforcement, not because they are more prone to crime. Peers assigned a lower score could commit the same number of offenses and go unregistered, since police are focused on individuals with higher scores.
To make matters worse, neither the young people scored nor their lawyers knew that Dutch law enforcement employed the algorithm at all. As with any score, be it credit or criminal, interested parties may have used it to decide whether, for example, to employ someone.
The Dutch Ministry of Justice and Security, contacted by FTM, acknowledged the limitations of its algorithm but reiterated that the system is good enough for the time being.