A Facebook algorithm asked a number of users whether they wanted to watch more videos of monkeys beneath a video, published by a British tabloid, showing Black men, the New York Times revealed Friday.
The clip, published by the Daily Mail more than a year ago under the title "A white man seeks help from the police against black men in the marina," shows only people, not monkeys.
Yet the prompt "Watch more primate videos?" appeared on some users' screens beneath the clip, offering a choice between accepting and dismissing it, according to a screenshot posted on Twitter by Darcy Groves, a former designer at the social media giant.
"It's outrageous," Groves wrote in her post, calling on her former colleagues at Facebook to escalate the issue.
A Facebook spokeswoman told AFP, "This is clearly an unacceptable error," adding, "We apologize to everyone who saw these offensive recommendations."
She explained that Facebook disabled the recommendation feature in question "as soon as it realized what was happening, in order to investigate the cause of the problem and prevent it from recurring."
"As we have said, even though we have made improvements to our AI systems, we know they are not perfect and we have more progress to make" in this area, she added.
The case highlights the limits of the artificial intelligence techniques the network routinely deploys in an effort to tailor its service to the preferences of each of its three billion monthly users.
Facebook also relies heavily on these technologies to moderate content, monitoring and blocking problematic posts and images before users even see them.
But the network and its competitors regularly face accusations of failing to combat racism and other forms of hatred and discrimination.