Facebook has apologized after its artificial intelligence (AI) software asked users watching a video featuring Black men if they wanted to see more “videos about primates.” The social media giant has since disabled the topic recommendation feature and says it is investigating the cause of the error, but the video had been online for more than a year, reports NPR.
A Facebook spokesperson told The New York Times, which first reported the story on Friday, that the automated prompt was an “unacceptable error,” and apologized to anyone who came across the offensive suggestion.
The video, uploaded by the Daily Mail on June 27, 2020, documented an encounter between a White man and a group of Black men who were celebrating a birthday. The clip captures the White man allegedly calling 911 to report that he is “being harassed by a bunch of Black men,” before cutting to an unrelated video that showed police officers arresting a Black tenant at his own home.
Former Facebook employee Darci Groves tweeted about the error on Thursday after a friend clued her in on the misidentification. She shared a screenshot of the video that captured Facebook’s “Keep seeing videos about Primates?” message.
“This ‘keep seeing’ prompt is unacceptable, @Facebook,” she wrote. “And despite the video being more than a year old, a friend got this prompt yesterday. Friends at [Facebook], please escalate. This is egregious.”
This is not Facebook’s first time in the spotlight for major technical errors. Last year, Chinese President Xi Jinping’s name appeared as “Mr. S*hole” on its platform when translated from Burmese to English. The translation hiccup appeared to be Facebook-specific and didn’t occur on Google, Reuters reported.