Eboni Williams: When the First Amendment and terror collide

Nohemi Gonzalez, an ambitious 23-year-old student of fashion and design at California State University, Long Beach, was eating at a restaurant on the rue de Charonne during her fall semester in Paris when gunmen attacked and murdered her. She was the only known American among the 129 people who were killed in the Nov. 13, 2015, Paris terror attacks.

Last week Nohemi’s father, Reynaldo Gonzalez, filed a federal lawsuit in U.S. District Court in the Northern District of California against Twitter, Facebook and Google, alleging that they are liable for his daughter’s death because they “knowingly permitted the terrorist group ISIS to use their social networks as a tool for spreading extremist propaganda.” The suit calls this “material support” for terrorism, saying that without social media, the growth of ISIS “would not have been possible.”

Gonzalez makes a powerful emotional argument. The world, and America in particular, is exhausted from the increasing number of terror attacks that have left us with far more questions than answers, and many of us are ready to stop asking questions and start assigning accountability.

But Gonzalez’ lawsuit faces major legal hurdles, and he is unlikely to find relief from these social media companies.

Do Twitter, Facebook and Google have a duty to protect society?

The first issue here is whether Twitter, Facebook and Google have a duty to protect society, and whether they breached that duty. The answer (in the broadest application of that question) is yes. But the next element is causation: Did the breach by these companies cause Nohemi Gonzalez’ murder by ISIS terrorists? This is where Gonzalez runs into a real problem. It’s almost impossible to prove a causal link between the companies’ passive conduct and the intentional, gruesome acts of the terrorists themselves.

Gonzalez’ argument rests on the notion that by allowing the videos to remain online, these companies were implicit enablers. But that argument collides head-on with the First Amendment’s guarantee of free speech. That protection extends to hate speech, which is why these companies don’t immediately take down every propaganda video.

Once that speech crosses into inciting violence or terrorism, or in the most extreme cases actually depicts horrific acts such as the beheading of American journalist James Foley, then and only then can the authorities act.

Essentially, the immediacy that Gonzalez and so many others seek is delayed by a frustrating but necessary constitutional analysis: Is the propaganda posted by terror groups merely protected hate speech, or does it cross the line into unprotected speech that can be acted upon?

That analysis takes time, and evaluating the content of each video is likely what delays taking it down.

So now what? While reconciling First Amendment protections with basic public safety seems daunting, it’s not hopeless.

In the wake of the Foley murder in August 2014 and the San Bernardino attack in December 2015, the social media companies took noticeably faster action.

Twitter has accelerated efforts to quash terrorist dialogue, suspending 125,000 accounts since the middle of last year. In a statement, Twitter said it has worked to “strike a balance” between supporting freedom of speech and opposing terrorism.

It’s a solid start, but it will bring little comfort or satisfaction to Gonzalez and the countless other victims of terrorism.

What we all can do is continue to call out terror propaganda, so that these companies neither drag their feet nor struggle to distinguish protected free speech from the abhorrent, dangerous and deadly rhetoric and imagery that continue to take too many of our sons and daughters from us.