Platinum Contributing Member Highmark Posted September 8, 2021

Oops. https://www.npr.org/2021/09/04/1034368231/facebook-apologizes-ai-labels-black-men-primates-racial-bias

Facebook issued an apology on behalf of its artificial intelligence software, which asked users watching a video featuring Black men if they wanted to see more "videos about primates." The social media giant has since disabled the topic recommendation feature and says it's investigating the cause of the error, but the video had been online for more than a year.

A Facebook spokesperson told The New York Times, which first reported the story on Friday, that the automated prompt was an "unacceptable error" and apologized to anyone who came across the offensive suggestion.

The video, uploaded by the Daily Mail on June 27, 2020, documented an encounter between a white man and a group of Black men who were celebrating a birthday. The clip captures the white man allegedly calling 911 to report that he is "being harassed by a bunch of Black men," before cutting to an unrelated video that showed police officers arresting a Black tenant at his own home.

Former Facebook employee Darci Groves tweeted about the error on Thursday after a friend clued her in on the misidentification. She shared a screenshot of the video that captured Facebook's "Keep seeing videos about Primates?" message. "This 'keep seeing' prompt is unacceptable, @Facebook," she wrote. "And despite the video being more than a year old, a friend got this prompt yesterday. Friends at [Facebook], please escalate. This is egregious."

This is not Facebook's first time in the spotlight for major technical errors. Last year, Chinese President Xi Jinping's name appeared as "Mr. S***hole" on its platform when translated from Burmese to English. The translation hiccup seemed to be Facebook-specific, and didn't occur on Google, Reuters had reported.
Zambroski Posted September 8, 2021
CFM Posted September 8, 2021

Always "Oops, technical error." Same peeps who say the kids were "just joking."
Carlos Danger Posted September 8, 2021

Early on, an AI image-analysis program was tasked with identifying dogs. It started making mistakes, labeling poodles and lap dogs as wild wolves. Because it was a learning program, they had to reverse-engineer it to find out what went wrong. It turned out the program had only been shown pictures of wolves in the wild, and almost all of those pictures were in snow. So the AI concluded that a dog in snow must be a wolf. I bet the AI in this case got it wrong in the same way.
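The failure mode described above (a model latching onto the background instead of the animal) can be shown with a toy sketch. Everything here is made up for illustration: two hand-built numeric "features" stand in for real image pixels, and a minimal perceptron stands in for a real vision model. The point is only that when every wolf photo in training is snowy and no dog photo is, the learner rewards the snow cue.

```python
import random

random.seed(0)

def make_example(is_wolf, snowy):
    # feature 0: noisy "animal shape" cue (wolves score a bit higher on average)
    # feature 1: clean "snowy background" cue
    shape = random.gauss(1.0 if is_wolf else 0.0, 1.5)
    snow = random.gauss(1.0 if snowy else 0.0, 0.1)
    return [shape, snow], int(is_wolf)

# Biased training set: every wolf photo is snowy, no dog photo is.
train = ([make_example(True, True) for _ in range(200)]
         + [make_example(False, False) for _ in range(200)])

# Minimal perceptron trainer, no external libraries.
w, b = [0.0, 0.0], 0.0
for _ in range(100):
    for x, y in train:
        pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
        err = y - pred  # -1, 0, or +1
        w[0] += 0.1 * err * x[0]
        w[1] += 0.1 * err * x[1]
        b += 0.1 * err

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Dogs photographed in snow: the model leans on the background cue
# it learned from the biased data, so most of them get labeled "wolf".
wolf_count = sum(predict(make_example(False, True)[0]) for _ in range(100))
print(f"{wolf_count}/100 snowy dogs labeled as wolves")
```

Because the snow feature separates the biased training set cleanly while the shape feature is noisy, the learned weights end up dominated by the background cue, which is exactly the wolves-in-snow story.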
Platinum Contributing Member steve from amherst Posted September 8, 2021

4 hours ago, Carlos Danger said: Early on, an AI image-analysis program was tasked with identifying dogs. It started making mistakes, labeling poodles and lap dogs as wild wolves. Because it was a learning program, they had to reverse-engineer it to find out what went wrong. It turned out the program had only been shown pictures of wolves in the wild, and almost all of those pictures were in snow. So the AI concluded that a dog in snow must be a wolf. I bet the AI in this case got it wrong in the same way.

A Chinese zoo had a wolf in the wolf pen and it died. So the knockoff pros put a fucking dog in its place. China zoo 'tries to pass dog off as wolf' - BBC News
Frostynuts Posted September 8, 2021

Beware of the wolf dressed up in a dog's skin. Very, very sneaky bunch.