Facebook recently apologized after an A.I. program mistakenly labeled a video featuring Black men as "about Primates." 

The video, posted by The Daily Mail on June 27, 2020, shows clips of Black men and police officers. An automatic prompt asked users if they would like to "keep seeing videos about Primates," even though the video had no connection to primates.

"As we have said, while we have made improvements to our A.I., we know it’s not perfect, and we have more progress to make," Facebook said in a statement to The New York Times. "We apologize to anyone who may have seen these offensive recommendations." 

A former content designer at Facebook flagged the issue after a friend forwarded a screenshot of the prompt. A product manager for Facebook Watch reportedly called the error "unacceptable" and said that the company would look "into the root cause." 

"This was clearly an unacceptable error and we disabled the entire topic recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again," Facebook spokesperson Dani Lever said in a statement to USA Today.

The topic recommendation feature remains disabled while the company investigates.

Technology companies have dealt with similar issues in the past, with some critics claiming facial recognition technology is biased against people of color. 

Google apologized for a similar error in 2015, when Google Photos mistakenly labeled pictures of Black people as "gorillas," and said it would fix the problem. Wired later found that the fix was simply to block the words "gorilla," "chimp," "chimpanzee" and "monkey" from searches.

Facebook did not respond to a Fox News request for comment.