A U.K. Parliamentary committee has cried foul over what it perceives to be a lack of action by social media companies to take down illegal content.
In a report released Monday, the Home Affairs Committee states that social media firms must “be held accountable for removing extremist and terrorist propaganda hosted on their networks.” The report goes on to state:
“Social media companies currently face almost no penalties for failing to remove illegal content. We recommend that the Government consult on a system of escalating sanctions to include meaningful fines for social media companies…”
The committee acknowledges the current ‘report and take down’ standard that Facebook, Google and Twitter follow. Under this system, employees rely on users to flag potentially offensive material, which is then reviewed as a candidate for removal.
The committee says this is not enough. It wants social media platforms to put their money where their mouth is and pay for their failures to act. The Home Affairs Committee believes social media companies should abide by the same rules as soccer teams, which “pay for policing in their stadiums.” According to the report, the burden of online policing currently rests with taxpayers, meaning social media companies are effectively passing the buck.
The committee is not accusing Facebook, Twitter or Google of inaction; it simply does not think the companies are doing enough. In fact, the committee states that the companies are “shamefully far from taking sufficient action.”
Fox News reached out to the three social media giants regarding the committee’s claims. Facebook’s Director of Policy Simon Milner agrees with the committee, stating, “there is more we [Facebook] can do to disrupt people wanting to spread hate and extremism online. We look forward to engaging with the new Government and parliament on these important issues after the election.” Milner adds that Facebook is working with experts to improve its approach.
The U.K.’s general election is scheduled for June 8, 2017.
Twitter’s U.K. Head of Public Policy Nick Pickles echoes Facebook’s collaborative sentiment and says Twitter is also working with different companies and using technology to find offensive content.
“The majority of accounts that Twitter removes for promoting terrorism are detected and removed by technological means. All the accounts that CTIRU [Counter Terrorism Internet Referral Unit] have submitted to us in recent reports have been suspended prior to them submitting their reports.”
Google is taking the issue very seriously and is committed to dealing with it. In a statement to Fox, Google’s Press Team said, “We’ve recently tightened our advertising policies and enforcement; made algorithmic updates; expanded our partnerships with and funding for creators dedicated to countering extremism, hate speech and harassment; and are broadening our partnerships with specialist organisations working in this field.”
The Home Affairs Committee maintains in its report that it welcomes the companies’ “wanting to do more,” but that the “interpretation and implementation of the community standards in practice is too often slow and haphazard.”