Facebook Apologizes After Its AI Labels Black Men As 'Primates'


Oops.  

https://www.npr.org/2021/09/04/1034368231/facebook-apologizes-ai-labels-black-men-primates-racial-bias

Facebook issued an apology on behalf of its artificial intelligence software that asked users watching a video featuring Black men if they wanted to see more "videos about primates." The social media giant has since disabled the topic recommendation feature and says it's investigating the cause of the error, but the video had been online for more than a year.

A Facebook spokesperson told The New York Times, which first reported the story on Friday, that the automated prompt was an "unacceptable error," and apologized to anyone who came across the offensive suggestion.

The video, uploaded by the Daily Mail on June 27, 2020, documented an encounter between a white man and a group of Black men who were celebrating a birthday. The clip captures the white man allegedly calling 911 to report that he is "being harassed by a bunch of Black men," before cutting to an unrelated video that showed police officers arresting a Black tenant at his own home.

Former Facebook employee Darci Groves tweeted about the error on Thursday after a friend clued her in on the misidentification. She shared a screenshot of the video that captured Facebook's "Keep seeing videos about Primates?" message.

"This 'keep seeing' prompt is unacceptable, @Facebook," she wrote. "And despite the video being more than a year old, a friend got this prompt yesterday. Friends at [Facebook], please escalate. This is egregious."

This is not Facebook's first time in the spotlight for major technical errors. Last year, Chinese President Xi Jinping's name appeared as "Mr. S***hole" on its platform when translated from Burmese to English. The translation hiccup appeared to be Facebook-specific and didn't occur on Google, Reuters had reported.


Early on, an AI picture-analysis program was tasked with identifying dogs. It started making mistakes, labeling poodles and lap dogs as wild wolves. Because it was a machine-learning program, they had to reverse-engineer it to find out what went wrong. It turned out the program had only been shown pictures of wolves in the wild, and almost all of those pictures were in snow. So the AI concluded that a dog in snow must be a wolf. I bet the AI in this case got it wrong in the same way.
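The failure mode described above (a classifier latching onto the background instead of the animal) is easy to reproduce in miniature. Below is a toy sketch with entirely made-up synthetic data, not anything from Facebook's actual system: a simple perceptron is trained on a biased set where every "wolf" photo has snow and no "dog" photo does, so it learns the snow feature and then misfires on a dog photographed in snow.

```python
# Toy illustration of shortcut learning: the training set is biased so that
# "snow in background" perfectly predicts the label, and the classifier
# learns that instead of the animal itself. All data here is invented.

def train_perceptron(data, labels, epochs=20, lr=0.1):
    """Train a single-layer perceptron on 2-feature binary examples."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = y - pred
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def classify(x, w, b):
    return "wolf" if w[0] * x[0] + w[1] * x[1] + b > 0 else "dog"

# Features: [snow_in_background, wolf_like_build]; label 1 = wolf, 0 = dog.
# Biased training set: every wolf photo has snow, no dog photo does.
train_x = [[1, 1], [1, 1], [1, 1], [0, 0], [0, 1], [0, 0]]
train_y = [1, 1, 1, 0, 0, 0]

w, b = train_perceptron(train_x, train_y)

print(classify([1, 0], w, b))  # a poodle photographed in snow -> "wolf"
print(classify([0, 1], w, b))  # a wolf standing on green grass -> "dog"
```

With this training set the learned weights put all the decision power on the snow feature, so both test photos come out wrong, mirroring the wolves-in-snow story.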


4 hours ago, Carlos Danger said:

Early on, an AI picture-analysis program was tasked with identifying dogs. It started making mistakes, labeling poodles and lap dogs as wild wolves. Because it was a machine-learning program, they had to reverse-engineer it to find out what went wrong. It turned out the program had only been shown pictures of wolves in the wild, and almost all of those pictures were in snow. So the AI concluded that a dog in snow must be a wolf. I bet the AI in this case got it wrong in the same way.

A Chinese zoo had a wolf in the wolf pen, and it died. So the knock-off pros put a fucking dog in its place.

 

China zoo 'tries to pass dog off as wolf' - BBC News


