
Hey, Facebook, it's time to put our safety before your algorithms


The murder of an innocent man in Cleveland, shown on Facebook by the perpetrator, is but the latest distressing outrage exposing the horrific flipside of social media. Apparently, the grisly implications of the monstrous deed -- including, G-d forbid, copycat crimes -- have led Facebook CEO Mark Zuckerberg to announce a review of his company’s policies.

Some argue that this murderer would have acted on his evil impulses with or without the Internet. They may be right in this case, but we will never know, since the perpetrator committed suicide.

Two other recent incidents, however, confirm that social media giants live in a self-imposed bubble that recklessly empowers terrorists and bigots and needlessly endangers us all.

First came word that major international companies had pulled millions of advertising dollars from Google/YouTube after discovering that their brands’ logos were appearing alongside terrorist and bigoted videos broadcast by the video behemoth. This came just after the latest high-profile terrorist attack, at the entrance to the House of Commons in London.

Fearing a cascading financial hemorrhage, Philipp Schindler, Google's chief business officer, blogged a mea culpa, vowing that the company would step up its efforts to block ads on "hateful, offensive and derogatory" videos. "We know that this is unacceptable to the advertisers and agencies who put their trust in us," he wrote.

But Google/YouTube failed to address one other constituency: We the People. And they left unanswered one critical question: Why does the social media giant allow such content to be posted in the first place? Some pro-terrorist videos had already incorporated visuals from the London carnage into their social media recruitment drives. And Americans have learned what havoc can be wreaked by one or two lone wolves inspired by such hate and violence.

The questions are not new. As founder of the Simon Wiesenthal Center’s Digital Terrorism and Hate Project, I meet regularly with social media companies in Silicon Valley. In this connection, we release an annual report that provides governments, the media, community activists, and the Internet companies a snapshot of how extremists—from ISIS to David Duke—leverage the most powerful marketing tool ever created to push their terrorist and hate agendas.

Last week, we also released our annual report card grading companies on their efforts, if any, to degrade the online hate and terror mongers.

Google/YouTube did not fare well, earning a C- on Terrorism and a D on Hate. There is no reason why real-time videos celebrating and promoting terrorism, or demeaning and attacking minorities, should be allowed on YouTube in the first place. In addition, despite our many protests and their own rules, YouTube still carries an expanding library of how-to videos on using readily available materials to build a bomb, a remote cellphone detonator, or a homemade flamethrower, or to make napalm.

Along with Twitter’s spotty record, YouTube’s failure to tackle hateful postings in a consistent and serious way helped our domestic bigots have an outsized impact during last year’s contentious and divisive presidential campaign.

Social media companies did not invent hate, but they have yet to take seriously their responsibility to keep online bigots from further infecting the mainstream of our culture.

There is one additional category on our Digital Terrorism and Hate 2017 Report Card—messaging apps. These apps have emerged as critical tools for terrorists seeking to evade authorities. By using off-the-shelf encryption, terrorists in France and Belgium went dark before they unleashed their carnage.

These companies have done almost nothing to stop terrorists from co-opting their products: grades ranged from a B- to an F. Now comes word that WhatsApp has refused to cooperate with British authorities, who discovered that the House of Commons terrorist, Khalid Masood, used the encrypted app for his last communications before launching his murderous outrage.

Bottom line: It is past time for social media service providers to take their responsibilities seriously. Algorithms are no substitute for ethics. If they fail to take more effective action, look for Washington to begin considering the dreaded R word—regulation.

Rabbi Abraham Cooper is associate dean of the Simon Wiesenthal Center in Los Angeles. Follow the Simon Wiesenthal Center on Facebook and on Twitter.