Facebook disabled a recommendation feature after it labeled a video of black men with the topic "primates". The company apologized and said it would investigate what happened, in the latest episode of software errors related to facial recognition of non-white people.
A Facebook spokesperson acknowledged that it was “a clearly unacceptable mistake” and said the recommendation software involved had been removed from the social network. “We apologize to anyone who may have seen these offensive recommendations,” Facebook said in response to a question from AFP.
"We turned off all topic recommendation functionality as soon as we realized this was happening so we could investigate the cause and prevent this from happening again." Facial recognition software has been heavily criticized by civil rights advocates, who point to accuracy problems, particularly with regard to non-white people.
The case was triggered by a video from the British tabloid Daily Mail featuring black men, beneath which an automatic message asked whether the user would like to "keep seeing videos about Primates". The June 2020 video in question was titled "White Man Calls Cops Because of Black Men at Marina."
A screenshot of the recommendation was shared on Twitter by former Facebook content design manager Darci Groves. "This 'keep watching' message is unacceptable," Groves said, addressing the message to former colleagues at Facebook.