With the midterms less than a month away, Facebook is stepping up its efforts to prevent voter suppression.
The tech giant, which in 2016 began prohibiting posts that misrepresent the dates, locations or times for casting a ballot, is now banning any misinformation about how to vote, a significant step for a platform typically reluctant to ban content outright.
For instance, any posts claiming you can vote by text message, or claiming that you can't vote in the general election if you already voted in the primary, would be deleted by Facebook.
The Menlo Park, Calif.-based company also introduced a new option for users to flag voting information that may be false or misleading, along with a separate reporting channel that lets state election officials do the same.
"Expanding our policy is just one of the steps we're taking to strengthen the integrity of elections around the world," said Jessica Leinwand, public policy manager, in a blog post announcing the changes. "We're also getting better at detecting and removing fake accounts and increasing transparency across political and issue ads on the platform."
However, links to reports about polling-place problems, which may be exaggerated or misleading and thus discourage voting, will be referred to the tech giant's third-party fact-checkers. If the content is rated "false," it will be ranked lower in News Feed but won't be removed.
Critics have said such partial measures are still not enough.
Director of National Intelligence Dan Coats said Tuesday that Russia, and potentially other foreign parties, are already making "pervasive" efforts to interfere in upcoming U.S. elections.
According to Reuters, Facebook executives are debating whether to echo Twitter's recent policy change banning posts that link to hacked material. In 2016, thousands of Democratic National Committee emails, hacked by operatives connected to Russia, were released on social media by WikiLeaks.
Sources also told Reuters that top Facebook executives even considered banning all political ads — which reportedly generate less than 5 percent of the company's revenue — but ultimately rejected the idea: product managers did not want to leave money on the table, and policy staffers felt a ban would favor incumbents, who can afford more expensive television ads.
In a lecture at Stanford University on Tuesday evening, former Facebook security chief Alex Stamos called for more education and research into how technology and society intersect.
"One of the real reasons that I believe we’re not well prepared for a secure election in 2018 or 2020 is because we are not working off of the same set of facts that we had in 2016,” said Stamos, adding that he is part of a team at Stanford that is studying intelligence failures that took place during the 2016 U.S. election.
Stamos also praised California's new digital privacy initiative, passed in June, and called for new federal legislation to protect users' data.