The Federal Trade Commission announced today (July 24) that Facebook will pay a historic $5 billion penalty for repeated privacy violations, ending a 16-month probe by regulators into how the tech giant lost control over its users' personal data. The company must also submit to new restrictions and a modified corporate structure, and remain under government supervision for the next 20 years, the FTC said in its press release.
This is the largest fine in the FTC's history, and almost 20 times greater than the biggest data-security penalty ever imposed worldwide. The FTC concluded that Facebook misled its users about how much control they had over their personal information, violating a 2012 FTC order.
“Despite repeated promises to its billions of users worldwide that they could control how their personal information is shared, Facebook undermined consumers’ choices,” said FTC Chairman Joe Simons.
The settlement requires Facebook to create an independent privacy oversight committee within its board of directors. Compliance officers must also monitor the company's handling of user data. Those appointees, along with Facebook founder and CEO Mark Zuckerberg, must certify on a quarterly basis that they are upholding their end of the settlement. Otherwise, Facebook will face civil or criminal penalties, the FTC said.
Facebook first said in April that it expected to pay billions to settle the FTC's investigation into the company's entanglement with the Cambridge Analytica affair, in which the personal data of millions of Facebook users was transferred to an unauthorized third party. On July 12, the FTC voted to approve the fine, but today's official announcement provides the scope of the penalties.
Is $5 billion enough?
While $5 billion sounds like a lot, Facebook took in more than $55 billion in revenue in 2018. FTC commissioners Rohit Chopra and Rebecca Kelly Slaughter, the two Democrats on the FTC, both voted against the announced penalties.
Chopra explained on Twitter today that he feels the settlement doesn't go far enough, and that Facebook can pay the fine out of its profits and won't need to change its ways.
"The FTC can seek civil penalties in addition to unjust gains," Chopra wrote. "I believe the company’s potential exposure is likely far greater."
He cited the FTC's 2012 action against Google, where the FTC obtained a penalty of more than five times the company’s "unjust gains." Chopra said Facebook's mandated fine "is a departure from that approach."
You can read Chopra's full dissent here.
Remind me, what did Facebook do wrong?
In 2012, Facebook assured the FTC that it would improve its efforts to protect user data after a string of privacy shortcomings.
But then, in 2014, Facebook permitted an academic-research firm to conduct sociological studies using Facebook data, as it had many times before. Users had to opt into a Facebook survey that informed them their data would be used. About 270,000 Facebook users signed up.
However, because of the way Facebook's permissions worked at the time, the researchers were also able to see personal data for all of the Facebook "friends" of the 270,000 users who voluntarily enrolled. That meant that as many as 50 million Facebook users had their data exposed.
The academic researchers later sold the entire trove of data to Cambridge Analytica, a political data-mining firm, which later did some work for the 2016 election campaign of now-president Donald Trump. Facebook said it learned as early as 2015 that the data was being misused.
This information came to light when Christopher Wylie, a Canadian data-mining expert who worked for Cambridge Analytica in 2013 and 2014, offered his side of the story to the New York Times and the Observer. Wylie said he decided to come forward after the head of Cambridge Analytica, Alexander Nix, testified that his company never used Facebook data for political purposes. Wylie said Nix lied.
It took until 2018 for Facebook to concede that Cambridge Analytica actually mined and misused the data from 87 million users' profiles.
Facebook's response to the FTC agreement
On Facebook Newsroom, the company issued a response to the FTC agreement.
"The privacy program we are building will be a step change in terms of how we handle data,' the post states. "We will be more robust in ensuring that we identify, assess and mitigate privacy risk. We will adopt new approaches to more thoroughly document the decisions we make and monitor their impact. And we will introduce more technical controls to better automate privacy safeguards."
Mark Zuckerberg also commented on today's news in a Facebook post.
"Overall, these changes go beyond anything required under U.S. law today," the Facebook founder wrote. "The reason I support them is that I believe they will reduce the number of mistakes we make and help us deliver stronger privacy protections for everyone."
What does this mean for Facebook?
This means Facebook will (hopefully) improve accountability at all levels. Aside from the internal regulators, the FTC's order strengthens the ability of third-party assessors to evaluate Facebook's privacy practices.
As part of Facebook’s FTC-mandated privacy program, which also covers Facebook-owned WhatsApp and Instagram, Facebook must conduct a thorough privacy review of every new or modified product, service, or digital practice before it's released to the public.
Facebook's relentless run of bad news may not be over, but by answering growing demands for greater transparency and accountability from technology companies, the FTC has set a new, improved tone for data privacy practices.
"The relief is designed not only to punish future violations but, more importantly, to change Facebook’s entire privacy culture to decrease the likelihood of continued violations," Simons said.