By Christopher Carbone
Published April 16, 2018
A contrite Mark Zuckerberg has been in damage control mode this week in Washington, D.C., as Facebook's future remains under a cloud of uncertainty following revelations that Facebook data from up to 87 million users may have been improperly accessed by data-mining firm Cambridge Analytica.
"We didn't do enough to prevent these tools from being used for harm," the CEO and co-founder said in his remarks before a rare joint hearing of the Senate Commerce and Judiciary Committees on Tuesday. "We didn't take a broad enough view of our responsibility, and that was a big mistake. It was my mistake, and I'm sorry."
Zuckerberg, who dozens of times during the two hearings said he "didn't know" or would have "his team" follow up, did not announce any new initiatives that would fundamentally alter the network's business model, nor was he able to explain why users weren't notified after Facebook learned in December 2015 that their data had been misused.
His own testimony showcased the many reasons the 33-year-old billionaire's creation has a long way to go before it is once again viewed not as an enemy, but as a place where you spend time with your friends.
British political consulting firm Cambridge Analytica did work for the 2016 U.S. presidential campaigns of both Ted Cruz and Donald Trump and accessed the data thanks to Cambridge University researcher Aleksandr Kogan’s third-party app, "This Is Your Digital Life." His app was a personality quiz that 270,000 Facebook users took without knowing it would vacuum up all their data and all of their friends' data. Facebook learned about the incident in 2015, but didn’t publicly acknowledge it until articles were printed last month on both sides of the Atlantic Ocean.
Cambridge Analytica has repeatedly denied any wrongdoing in the matter.
Fallout from the growing data security scandal has knocked roughly $80 billion from the social giant’s market value, led to a class-action lawsuit and prompted fierce questioning from some lawmakers on Capitol Hill.
The tech company now faces a new FTC investigation into allegations that it may have violated a 2011 consent order requiring Facebook to “establish and maintain a comprehensive privacy program” and get users’ “affirmative express consent before enacting changes that override their privacy preferences.” It also prohibited the social network from misrepresenting the privacy or security of its users’ data.
“This was an institutional failure on the part of American regulators,” said Sam Lester, a consumer privacy fellow at the Electronic Privacy Information Center (EPIC).
Although Facebook COO Sheryl Sandberg recently said the company was "too idealistic" in its handling of privacy issues and abuse, an examination of the company's growth and previous run-ins with regulators shows that both she and Zuckerberg might have known better.
EPIC has warned the federal government since 2009 that Facebook was playing fast and loose with its users’ data and that strong regulatory oversight was needed. That oversight may be coming, thanks in part to the efforts of regulators in Europe, where the stringent General Data Protection Regulation (GDPR), which takes effect on May 25, will empower users with new control over their data in four key areas and impose stiff fines on Facebook or any company that violates the rules.
Among other features, the GDPR will give users a fully inclusive “right to be forgotten” that not only includes deleting your data but also preventing it from being further disseminated or used by third parties; it requires users to be notified of any data breach within 72 hours of a company learning about it; and would impose a fine of up to 4 percent of annual revenue for each violation. In Facebook’s case, that would mean a fine of approximately $1.6 billion per violation in Europe.
If the FTC finds that Facebook violated the 2011 consent order, the agency could fine it up to $41,484 per affected user, which in theory would mean a total fine in the trillions of dollars. Facebook is unlikely to be fined anywhere near that amount, but some industry watchers have speculated that a potential FTC penalty could run to billions of dollars.
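The figures behind those potential penalties follow from simple arithmetic. A quick sketch, assuming Facebook's 2017 global revenue of roughly $40.6 billion (a figure not stated in this article) and the 87 million affected users reported here:

```python
# Rough arithmetic behind the potential fines discussed above.
# ASSUMPTION: Facebook's 2017 global revenue of about $40.6 billion;
# the 87 million affected users comes from the reporting itself.

FB_ANNUAL_REVENUE = 40.6e9      # assumed 2017 revenue, USD
AFFECTED_USERS = 87_000_000     # users whose data may have been accessed
GDPR_RATE = 0.04                # GDPR cap: 4 percent of annual revenue per violation
FTC_PER_USER = 41_484           # maximum FTC penalty per violation, USD

gdpr_fine = GDPR_RATE * FB_ANNUAL_REVENUE
ftc_max = FTC_PER_USER * AFFECTED_USERS

print(f"GDPR fine per violation: ~${gdpr_fine / 1e9:.1f} billion")   # ~$1.6 billion
print(f"Theoretical FTC maximum: ~${ftc_max / 1e12:.1f} trillion")   # ~$3.6 trillion
```

The theoretical FTC maximum of about $3.6 trillion is why "a total fine in the trillions" is arithmetically accurate even though no one expects a penalty of that size.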
A coalition of consumer groups, in an open letter on Monday to Zuckerberg, urged Facebook’s co-founder and CEO to adopt GDPR as the “baseline standard for all Facebook services” globally. In recent comments to reporters, the Facebook chief executive said he would like to make GDPR protections available to users in North America and other non-EU regions, but he stopped short of explaining exactly how that would work.
Forty-six percent of U.S. adults want to see more regulation of how the tech industry handles user privacy, according to a recent Reuters poll.
Steven Berman, managing partner of Hagens Berman, which on Monday filed a class-action lawsuit on behalf of those whose data was allegedly harvested by Cambridge Analytica, said in a statement that Facebook has “repeatedly failed to uphold its privacy agreements” and “brazenly neglected the security of the billions of those who use its social media service.”
The lawsuit seeks compensation and injunctive relief.
In addition, advocacy groups for Muslims and African-Americans have called for a "civil rights audit" of the social network to ensure it is free from discrimination and hate speech.
“Governments have a responsibility to protect people’s privacy and security by solving problems any tech company and industry can’t alone—that’s when regulation comes into play,” Denelle Dixon, COO of Mozilla, told Fox News.
Her company, which makes the web browser Firefox, was the first brand to pull its advertising from Facebook when the data scandal first broke.
Although Facebook has said repeatedly that users freely share a multitude of information and are aware of its privacy rules posted online, many people do not read the fine print or are discouraged by having to click through multiple screens. The data scandal could change that behavior.
Facebook, for its part, has recently revamped its privacy settings pages to make them more user-friendly.
“Consumers need to use the privacy tools and controls they have today. Those controls don’t solve all of our privacy and security challenges,” said Dixon. “If the existing tools and controls don’t meet your needs, then people have to demand what they need.”
At Mozilla, where privacy has been a core principle for years, the company has long maintained a policy of “no surprises” and “only collect what you need.”
Facebook’s sheer scale is undeniable. For countless people outside of Europe and North America, Facebook is the Internet. The tech giant owns Instagram, Messenger and WhatsApp, which have over 800 million, 1 billion and 1.5 billion monthly active users, respectively. It has incorporated features from competitors like Snapchat, seemingly dampening their growth, and, along with Google, controlled more than 80 percent of the global digital advertising market in 2017, according to a report in the Financial Times.
Some lawmakers, including Sen. Ron Wyden (D-Ore.), have even floated the idea that Facebook could be broken up for antitrust violations if it does not get the burgeoning scandal under control.
“I don’t see there being a Bell/AT&T-like breakup for Facebook, Instagram, et al. There is plenty of competition to go around online. Breaking it up would only hurt the future of social networks,” Jason Mollica, a lecturer at American University’s School of Communication, told Fox News.
Mozilla's Dixon believes that the pressure from users and advertisers is having an impact on Facebook.
“To be clear, Facebook can and will still monetize user data, as will other companies, but events like this hopefully will force companies to have a more clear, transparent value exchange with users and to curtail activity that doesn’t really put users first,” Dixon added.
However, this isn’t the tech giant’s first time in the spotlight, nor is it the first time it has promised to do better. Lester said that users should be skeptical when Zuckerberg and other Facebook executives say they'll take care of things.
“We don’t want to leave everything in the hands of Facebook,” Lester, who worked as a law clerk with the FTC’s Bureau of Consumer Protection while at Georgetown University, told Fox News. “Their business model relies on collecting all this info and developing these secret profiles.”
A SurveyMonkey poll in the wake of the privacy scandal found that 56 percent of users who know about the topic do not trust the social network to protect user privacy, and 60 percent plan to share less personal information with Facebook.
“Facebook offers a service people value, otherwise so many people wouldn’t continue to use it. There’s nothing inherently wrong with that,” said Dixon. “The problem is that users don’t see or understand what they are giving up in exchange for that service. There’s something out of balance with the value exchange. Users are being last rather than first.”
Regulators are paying attention.
In addition to the FTC’s probe confirmed on March 26, the Federal Election Commission recently announced that political advertisements online will soon have to identify their sponsors. This happened in the wake of Russia’s disinformation campaign, which harnessed Facebook and other platforms.
Facebook executives have said this week that they will implement the requirements of the Honest Ads Act, a bipartisan bill that would improve transparency and increase regulations around online political ads.
“Cambridge Analytica sparked a moment because people are looking more broadly at the power that Facebook has in many areas,” said Lester. “We need public oversight and accountability.”
Cambridge Analytica said in a Monday statement that it did not illegally or inappropriately collect or share Facebook data with anyone else, that it has not broken FEC regulations and that it did not use the Facebook data during the 2016 U.S. presidential election. The company also launched a new site to dispel what it claims are falsehoods surrounding its role in the data scandal.
“It’s not just Facebook—time and time again, the companies with whom we trust with our most sensitive data have shown that privacy and security are optional,” Dixon said.
Indeed, Google collects a vast amount of user data through search, Gmail, Maps and more. This week, a coalition of privacy groups accused the company of illegally collecting data from underage children on YouTube.
“The dismay of users around the world has really underlined that such an approach is untenable, and we, as a technology community, need to build products and services that put the user at the center, that empower them to make informed choices, that meet their expectations around privacy,” she continued, “where they are the customer, not the product.”