By James Rogers
Published August 06, 2019
The shooting was live-streamed by the gunman on Facebook and shared across social media, sparking a scramble by tech giants to remove the horrific footage. The gunman reportedly broadcast 17 minutes of the attack.
However, Eric Feinberg, a founding partner of deep web analysis company GIPEC, identified two posts, one on YouTube and one on Instagram, that contained footage of the mass shooting.
“It is astonishing that almost five months since the Christchurch Mosque attack, raw video of this attack can still be found on YouTube and Instagram,” he told Fox News, via email.
Instagram removed the video from its platform Tuesday. “We continue to automatically detect and prevent new uploads of this content on our platforms, using a database of more than 900 visually unique versions of this video," a spokesperson said, in an emailed statement to Fox News. "When we identify isolated instances of newly edited versions of the video being uploaded, we take it down and add it to our database to prevent future uploads of the same version being shared.”
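The matching the spokesperson describes is often done with perceptual hashing: each known version of a video is reduced to a compact fingerprint, and new uploads are compared against that database. The sketch below is purely illustrative, not Facebook's or YouTube's actual system; the function names, the average-hash scheme, and the match threshold are all assumptions for the example.

```python
# Illustrative sketch of perceptual-hash matching against a database of
# known video versions. This is NOT any platform's real pipeline; the
# hashing scheme (average hash) and threshold are assumptions.

def average_hash(frame, size=8):
    """Fingerprint a grayscale frame (2D list of 0-255 ints): crudely
    downsample to size x size, then set one bit per cell depending on
    whether that cell is brighter than the frame's mean."""
    h, w = len(frame), len(frame[0])
    cells = []
    for i in range(size):
        for j in range(size):
            # sample one pixel per cell (crude downsampling)
            cells.append(frame[i * h // size][j * w // size])
    mean = sum(cells) / len(cells)
    return sum(1 << k for k, v in enumerate(cells) if v > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(frame, known_hashes, threshold=10):
    """True if the frame's hash is within `threshold` bits of any hash
    in the database of known versions; small edits (re-encoding, slight
    brightness changes) usually stay within the threshold."""
    h = average_hash(frame)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

Because the fingerprint tolerates small pixel-level changes, a re-encoded or slightly brightened copy still matches, which is why platforms add each "newly edited version" they find back into the database rather than relying on exact file matches.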
YouTube also removed the mosque shooting video that was posted to its platform. "YouTube has strict policies that prohibit violent content," explained a YouTube spokesperson, in an emailed statement. "We quickly remove videos violating our policies when flagged by our users.”
Google, which owns YouTube, is bringing the number of people working to remove violative content across the company to 10,000.
The Christchurch shooting is in the spotlight after the suspect in Saturday’s mass shooting that left 22 dead in El Paso reportedly cited the New Zealand attack in a post on the controversial online forum 8chan.
The first sentence of the online rant posted on the 8chan message board expressed support for the man accused of killing 51 people at two New Zealand mosques in March, who had posted a 74-page document promoting a white supremacist conspiracy theory called "the great replacement."
“The El Paso shooter says he was inspired by the Christchurch shooter,” Feinberg told Fox News.
Feinberg noted that social media companies are protected by Section 230 of the Communications Decency Act, which says that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
“Since social media platforms cannot be held liable for what others post due to Section 230 of the CDA, there is no incentive to remove these videos,” he said.
Part of the reason YouTube struggled to remove the Christchurch shooting video is that the company was no match for the countless versions of the despicable footage that were uploaded in real time and evaded detection by the tech giant's artificial intelligence.
Fox News' Christopher Carbone contributed to this article.
Correction: An earlier version of this story incorrectly stated that YouTube is boosting its content moderator team by 10,000 people. Google, which owns YouTube, is bringing the number of people working to remove violative content to 10,000 people across the company.
Follow James Rogers on Twitter @jamesjrogers