MAHK ZUCKENBERGER!!! You got some 'splainin' to do, my guy.
The New York Times reports that Facebook has apologized after users who watched a video posted on the site by The Daily Mail in 2020 saw automated prompts asking if they wanted to "keep seeing videos about Primates."
There was nary a primate, Most Valuable or otherwise, in the video. Instead, it was a clip of a Black man interacting with officers after a white man called the police on him at a marina. According to the Times, Facebook says this was an error on the part of an artificial intelligence feature on the site.
From the Times:
Darci Groves, a former content design manager at Facebook, said a friend had recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company's video service, called it "unacceptable" and said the company was "looking into the root cause."
Ms. Groves said the prompt was "horrifying and egregious."
Dani Lever, a Facebook spokeswoman, said in a statement: "As we have said, while we have made improvements to our A.I., we know it's not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations."
Here we have another example of how artificial intelligence programs routinely demonstrate biases against people of color. We've seen stories on how this Minority Report-ass facial recognition software has led to innocent Black people being arrested or discriminated against due to computer errors.
Then there's the racial bias found in voice recognition programs and Twitter cropping Black faces out of photos at a higher rate than it does white faces.
So yeah, it's safe to say this problem ain't nothing new. The question is, what are Big Tech companies like Facebook and the others going to do about it?
Based on what Groves told the Times, so far the answer is "not enough."
Ms. Groves, who left Facebook over the summer after four years, said in an interview that a series of missteps at the company suggested that dealing with racial problems wasn't a priority for its leaders.
"Facebook can't keep making these mistakes and then saying, 'I'm sorry,'" she said.