Updated 3/21 at 9:30 PM ET

On Saturday, an investigation published by The New York Times and The Guardian revealed that data analysis firm Cambridge Analytica had accessed an appalling amount of Facebook users' data in 2013. Thing is, it wasn't a data breach; Facebook granted the access that made it possible. Now, Facebook's first spin is here.

Speculation's been in a frothy swirl since the initial story dropped. Facebook's valuation plummeted. Congressional committees and the Federal Trade Commission launched investigations. Throughout, Facebook was eerily silent. Zuckerberg was AWOL.

Today, Zuckerberg reemerged. And in a lengthy Facebook post, he wrote:

We have a responsibility to protect your data, and if we can't then we don't deserve to serve you. I've been working to understand exactly what happened and how to make sure this doesn't happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there's more to do, and we need to step up and do it.

Zuckerberg then walked readers through a detailed timeline of how we got to where we are, beginning in 2007 with the launch of the Facebook platform. In 2013, he acknowledged, Aleksandr Kogan, a Cambridge University researcher who later shared data with Cambridge Analytica, did indeed access information from 300,000 or so Facebook users, along with that of their friends.

"Given the way our platform worked at the time this meant Kogan was able to access tens of millions of their friends' data," Zuckerberg writes. Facebook revisited the kind of data that third-party apps like Cambridge Analytica were able to access in 2014, he writes, and Kogan was suspended from the platform in 2015.

Zuckerberg continues:

In this case, we already took the most important steps a few years ago in 2014 to prevent bad actors from accessing people's information in this way. But there's more we need to do and I'll outline those steps here:

Those steps, in brief: investigate apps that had access to large amounts of user information before 2014, further restrict developers' access to user data, and help users take control of what third parties can see.

For users who feel shocked and violated by the amount of data accessed by Cambridge Analytica (and, likely, other third parties), Zuckerberg doesn't offer much salve. He didn't vow to stop sharing user data with third parties. And it sounded an awful lot like he thinks the problem was fixed in 2014.

Later, in a scheduled interview on CNN, Zuckerberg repeated many of the points he had outlined in his Facebook post. In retrospect, he said, it was a mistake not to notify the public of Cambridge Analytica's misconduct when Facebook first learned of it. Facebook plans to do a full audit of any third parties that might have had access to as much data as Cambridge Analytica did, he added. And Facebook shouldn't have trusted Cambridge Analytica's claim that it had deleted the data, he said; the company will be more reluctant to extend that kind of trust to other third parties in the future.

Part two of the interview took a broader view of the challenges facing the company, touching on Russian interference in the 2016 U.S. election (the company "wasn't as on top of the issues as it should have been," and has been deploying AI tools to prevent similar meddling by malicious actors in other elections), whether Zuckerberg will testify before Congress (he will "if it's the right thing to do"), and whether regulation is called for (it might be).

As the company works to fulfill Zuck's promises and tackle the issues confronting it, users will have to decide: is this enough to trust Facebook again?

