A Facebook recommendation algorithm asked users whether they wanted to keep seeing "primate videos" beneath a British tabloid video showing Black men, The New York Times revealed on Friday.
The Daily Mail video, more than a year old, is titled "White man calls cops on black men at marina". It shows only people, not monkeys.
Below the video, the prompt "see more videos on primates?", with the options "Yes / Dismiss", appeared on some users' screens, according to a screenshot posted on Twitter by Darci Groves, a former designer at the social media giant.
"It's outrageous," she commented, urging her former colleagues at Facebook to escalate the matter.
"This is clearly an unacceptable error," a Facebook spokesperson responded when contacted by AFP. "We apologize to anyone who has seen these offensive recommendations."
The California-based company disabled the recommendation feature for this topic "as soon as we noticed what was happening, in order to investigate the causes of the problem and prevent it from happening again," the spokesperson said.
"As we have said, although we have improved our artificial intelligence systems, we know they are not perfect and that we still have progress to make," the spokesperson continued.
The incident highlights the limits of artificial intelligence technologies, which the platform regularly touts in its efforts to build a feed personalized for each of its nearly 3 billion monthly users.
Facebook also relies heavily on AI for content moderation, to identify and block problematic messages and images before they are even seen.
But Facebook, like its competitors, is regularly accused of not doing enough to combat racism and other forms of hatred and discrimination.
The subject is all the more sensitive as many civil society organizations accuse social networks and their algorithms of contributing to the division of American society, against the backdrop of the Black Lives Matter protests.