Disturbing issue with driverless cars
THERE are concerns that self-driving cars may pose a serious threat to some members of the public after a study found that the cars are less likely to detect people with darker skin.
Researchers from the Georgia Institute of Technology found automated vehicles were better at detecting people with lighter skin tones.
The paper, titled Predictive Inequity in Object Detection, raised concerns that pedestrians with dark skin could have a higher risk of being run over by these vehicles.
There have already been cases of pedestrians being hit by self-driving cars, including the woman who was killed in the US by a driverless Uber in 2018.
Incidents such as this show there are already risks to pedestrians that need to be considered.
"This behaviour suggests that future errors made by autonomous vehicles may not be evenly distributed across different demographic groups," the study read.
"A natural question to ask is which pedestrians these systems detect with lower fidelity, and why they display this behaviour."
The study used the Fitzpatrick scale, a system that classifies different skin tones, to sort images of pedestrians into groups, then fed those images into the type of object detection technology used in driverless cars.
Researchers then analysed how often the system correctly detected the presence of people in the dark-skinned group compared to those in the light-skinned group.
The results proved unsettling, revealing that, on average, the technology was 5 per cent less accurate at detecting people with dark skin tones.
The academics also accounted for factors like time of day, lighting and other objects that may have obstructed the detection system's view.
This means the disparity in detections is not simply a result of dark-skinned people in the images "appearing in more difficult scenes for detection", according to the study.
One of the authors of the study, Jamie Morgenstern, told Vox the companies using these detection technologies may need to have other systems in place as well.
"The main takeaway from our work is that vision systems that share common structures to the ones we tested should be looked at more closely," Ms Morgenstern said.
Though the findings were concerning, it has been pointed out that there were some flaws in the research.
Researchers were unable to test the object detection models actually being used by self-driving car manufacturers, as these companies don't make their proprietary data available for that purpose.
This means they instead had to use publicly available datasets and test models that have already been used by other researchers.