Facebook apologized on Friday after facial recognition software it developed recommended "similar content" about primates (apes such as chimpanzees and gorillas) to users who had watched a video featuring Black men.
The embarrassing bug in Facebook's facial recognition software was discovered in recent days by US civil rights organizations, which have pointed to accuracy problems in the software's recommendations when it comes to people who are not white. Viewers of the video were shown the prompt "Do you want to see more videos about primates?", The New York Times reported.
Facebook announced yesterday (Friday) in response that it had disabled its topic-recommendation feature for videos. A Facebook spokesman called it a "clearly unacceptable mistake".
"We apologize to anyone who has seen these offensive recommendations," Facebook said in a statement to AFP. "We disabled the entire topic-recommendation feature as soon as we realized this was happening, so we could investigate the cause and prevent it from happening again."