Update: In an exclusive TV interview with CNN, Mark Zuckerberg said he will testify before Congress if it is "the right thing to do."
He also suggested that he is not opposed to regulation, saying, "I'm not sure we shouldn't be regulated." Instead, Zuckerberg said the focus should be on what is the right regulation.
This article was originally published at 9:45 p.m. on March 21, 2018.
After days of silence from Facebook's top executives, Mark Zuckerberg and Sheryl Sandberg have finally commented on the major misuse of user data that occurred on the platform and has dominated the news cycle over the past week.
In a post on his Facebook page, Zuckerberg acknowledges that the company made a major mistake: "We have a responsibility to protect your data, and if we can't then we don't deserve to serve you."
While Zuckerberg's response is nothing out of the ordinary — Facebook accepts responsibility, he explains what happened as best he can, and outlines steps for improvement — one of the most alarming parts of his statement is the acknowledgement that Facebook didn't learn about the breach through its own security systems. Zuckerberg says the company learned about it from journalists at The Guardian, The New York Times, and Channel 4, who have been investigating Cambridge Analytica since 2015.
The theme of Zuckerberg's note largely focuses on broken trust, both between Facebook and its users as well as between Facebook and Cambridge Analytica.
He breaks down a timeline of events leading to this "breach of trust," writing that he has also been "working to understand exactly what happened." That timeline starts in 2007, when Facebook first launched the Facebook Platform, which gave developers permission to create third-party apps that could be integrated with the site. Then, Zuckerberg jumps to 2013, when the Cambridge Analytica data drama began.
He outlines much of what the public already knows from recent media reports: A Cambridge University researcher created a personality quiz app that gained access not only to the data of the few hundred thousand people who installed it, but also to that of all their friends. In 2014, Facebook changed its policy so that third-party apps could no longer access information about a person's friends unless those friends also gave permission, but the damage had already been done.
Zuckerberg writes that Facebook learned in 2015 that the Cambridge researcher had violated the company's policies, when journalists revealed the data had been shared with the voter-profiling company Cambridge Analytica. At that time, the researcher's app was banned, and both he and Cambridge Analytica provided certification that they had deleted all user data. It was only last week, Zuckerberg says, that Facebook learned this may not have been the case.
"This was a breach of trust between Kogan, Cambridge Analytica and Facebook," Zuckerberg writes. "But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that."
He says Facebook is working with regulators and that the company will conduct a forensic audit of Cambridge Analytica. But users' trust in Facebook has taken a major hit, as has Facebook's stock, and far more damage control needs to be done.
Zuckerberg says that many of the measures that could prevent a situation like this from occurring again were already taken in 2014, when Facebook revised its third-party app policies. However, he says the company is taking three more steps to bolster security now: Facebook is investigating other apps that were created before 2014's policy revision and will audit any suspicious activity; it is restricting third-party apps' access to data; and it is creating a new tool to help users see what permissions they have given apps.
Of those three steps, the second — restricting access to data — is the most promising, but it also feels like something that should have happened in 2015, when Facebook first learned about Cambridge Analytica. Zuckerberg ends his post, much of which Sandberg reiterates on her own feed, with a promise to "learn from this experience to secure our platform further" and a message of thanks to members of the Facebook community. Sandberg says she "deeply regret[s] that we didn't do enough to deal with it." Whether these words are too little, too late is something users will need to decide for themselves.
Zuckerberg's note also did not address the matter of Joseph Chancellor, a co-founder of Global Science Research, the firm that harvested Facebook data for Cambridge Analytica. According to The Guardian, Chancellor is currently a Facebook employee. Earlier this week, a Facebook spokesperson told CNN the company was looking into the matter. In an effort to clear the air (and to address the #WheresZuck hashtag), Zuckerberg will be appearing tonight on CNN to answer further questions.