Research reveals significant bias in how driverless cars detect pedestrians, posing serious safety risks to vulnerable groups.
Self-driving cars are on track to become a dominant mode of transport. However, the systems they currently rely on for navigation contain flaws that could lead to serious harm or even death for pedestrians and passengers alike.
A team of researchers from the UK and China has discovered that the pedestrian detectors used by autonomous cars exhibit bias across both age and skin tone.
Detection miss rates were nearly 20% higher for children than for adults, and there was a disparity of more than 7% in miss rates between individuals with darker skin and those with lighter skin.
The gap was particularly pronounced at night: detection performance for the darker-skinned group degraded more under low-brightness, low-contrast conditions than it did for the lighter-skinned group.
Pedestrian detection in autonomous cars is handled by AI systems, which tell the car whether or not it is approaching a pedestrian.
According to the researchers, a major cause of the discrepancy is that the image datasets used to train these AI systems feature far more people with light skin than dark skin, reflecting a lack of balance and fairness in the training process.
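In practice, the kind of disparity the researchers measured comes down to a gap in per-group miss rates. The Python sketch below is purely illustrative and is not the researchers' actual code; the group labels and toy results are hypothetical, but the calculation mirrors the basic idea of comparing how often a detector misses pedestrians from each demographic group.

```python
# A minimal sketch of a per-group fairness audit for a pedestrian
# detector. All data and labels here are hypothetical examples.
from collections import defaultdict

# Each record: (group_label, detected) for one annotated pedestrian
# that the detector was evaluated against.
results = [
    ("adult", True), ("adult", True), ("adult", False),
    ("child", True), ("child", False), ("child", False),
]

def miss_rates(records):
    """Return the fraction of pedestrians the detector missed, per group."""
    totals, misses = defaultdict(int), defaultdict(int)
    for group, detected in records:
        totals[group] += 1
        if not detected:
            misses[group] += 1
    return {group: misses[group] / totals[group] for group in totals}

rates = miss_rates(results)
print(rates)  # e.g. {'adult': 0.333..., 'child': 0.666...}

# The disparities reported in the study correspond to gaps between
# per-group miss rates like this one:
print(f"disparity: {abs(rates['child'] - rates['adult']):.2%}")
```

The same comparison can be run over any demographic attribute recorded in the evaluation data, such as skin tone, which is how a gap like the 7% figure above would be surfaced.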
“It is essential for policy makers to enact laws and regulations that safeguard the rights of all individuals and address these concerns appropriately,” concluded the researchers.