Facebook, which has come under fire for the proliferation of fake news on its platform, is ramping up its efforts against misinformation.
Facebook’s ability to police content on its platform is under intense scrutiny after the Cambridge Analytica data sharing scandal and U.S. probes into alleged Russian meddling in the 2016 U.S. presidential election.
The company’s controls over data collection are also in the spotlight after it recently confirmed that Mail.Ru, a Russian tech company which is reported to have Kremlin links, was among a number of companies granted an extension to collect data on app users’ friends beyond a May 2015 cut-off date.
Virginia Democratic Sen. Mark Warner, vice chairman of the Senate Intelligence Committee, this week criticized the social network’s ability to safeguard user data. “In the last 6 months we’ve learned that Facebook had few controls in place to control the collection and use of user data by third parties. Now we learn that the largest technology company in Russia, whose executives boast close ties to Vladimir Putin, had potentially hundreds of apps integrated with Facebook, collecting user data. We need to determine what user information was shared with mail.ru and what may have been done with the captured data.”
Facebook told CNN that only two apps developed by Mail.Ru were granted an extension, which lasted two weeks.
With regard to Facebook’s fight against fake news, the company told Fox News that it has established partnerships with fact-checkers who can identify dishonest news stories in users’ feeds. The process, which involves both human reviewers and machine learning technology, is designed to catch misleading information. Facebook says the actual review of a claim is done by humans. If a story is found to contain false information, it will be flagged with a blue button and may be pushed down in a user’s news feed. The content will be deleted from the platform entirely if it violates Facebook’s code of conduct.
“When things call for violence, bullying, harassment - those violate our community standards and our viewpoint is you shouldn’t see those things on Facebook,” John Hegeman, vice president of the Facebook News Feed team, told Fox News. “Most of the things going into the ranking are just trying to understand what this person would want to see.”
“There are a lot of ways people use the platform for good,” added Sara Su, a product manager on the Facebook News Feed team. “But there are also bad actors that can abuse the platform, so the integrity team is focused on identifying problematic content.”
Jason Mollica, a professor at American University’s School of Communication, believes Facebook’s attempt to fact-check content on its platform will help: “as many eyes as you get - unbiased eyes - on news like this will crack down on fake news angles.”
This is part of a series of efforts Facebook is making to become more transparent and accountable after misinformation plagued social media platforms throughout the 2016 presidential election. Facebook recently rolled out a new political ad policy that requires political ads to say who paid for them and those purchasing ads to go through a verification process.
The social network is firmly in the political spotlight at the moment. Facebook CEO Mark Zuckerberg recently testified before Congress about the company’s data collection policies in the wake of the Cambridge Analytica scandal.
The demand for more transparency comes at a time when Congress is drafting the Honest Ads Act, which aims to improve disclosure requirements for online political advertisements. Facebook now supports this legislation, but it is not waiting for Congress to finalize the bill before changing how content is handled on its platform.
“I think regulation can be a good thing and can be really important, and so the important thing is that it’s well-crafted. Our general philosophy there is we’re trying to proactively do the things that we think are right. I think those are probably good steps for everyone to take, but I think we’re trying to do what we can to lead the way there,” said Hegeman.
“I don’t think they need congressional oversight,” said Mollica, noting that social media is still relatively young and Facebook needs time to work out any issues it has. “Congress and government need to pump the tires on doing an overarching law to make sure Facebook and other social networks are living up to government standards.”
As the most influential social network, Facebook is working to remove fake accounts and promote news literacy among its users, applying a journalistic standard of ethics to how it evaluates information.
“We really feel like it’s a shared responsibility since we’re the platform,” said Su.
“We’re not trying to go after political opinion with this work,” said Hegeman regarding the new fact-checking system. “We’re trying to go after things that are not true and could be harmful.”
The fact-checking system currently operates in fourteen countries around the world and focuses on examining articles and links, although Facebook is also looking into ways to check for dishonest photos and videos.
“We can reduce the likelihood of certain types of content the community doesn’t want to see. We can reduce that and lower it in the news feed so there is less of an incentive for the financially motivated spammer. We think this is the right way to strike a balance between promoting free expression and promoting a safe and authentic community, so that’s a tough nut to crack,” said Su. “We’re never going to be 100 percent perfect, so it’s really important that in addition to the ranking work we invest in building things out in the user interface,” she added.
At the end of the day, Mollica says it’s up to individuals to figure out what is true and what isn’t; the responsibility shouldn’t fall entirely on employees at Facebook. “We need to be better as individuals to hold social networks to a higher standard but give them time to up their game and say what they are going to do.”
Fox News’ James Rogers contributed to this report.