On October 27, 2012, Facebook CEO Mark Zuckerberg wrote an email to his then head of product development. For years, Facebook had allowed third-party apps to access data on its users' unwitting friends, and Zuckerberg was weighing whether giving away all that information was risky. In his email, he suggested it was not: "I'm generally skeptical that there is as much data leak strategic risk as you think," he wrote at the time. "I just can't think of any instances where that data has leaked from developer to developer and caused a real issue for us."
If Zuckerberg had a time machine, he could have used it to go back to that moment. Who knows what would have happened if the young executive had taken that risk seriously back in 2012.
But Zuckerberg couldn't see what was right in front of him, and neither could the rest of the world, really, until March 17, 2018, when a pink-haired whistleblower named Christopher Wylie told The New York Times and The Guardian/Observer about a firm called Cambridge Analytica. Cambridge Analytica had purchased Facebook data on tens of millions of Americans without their knowledge to build a "psychological warfare" tool, which it unleashed on US voters to help elect Donald Trump as president. Just before the news broke, Facebook banned Wylie, Cambridge Analytica, its parent company SCL, and Aleksandr Kogan, the researcher who collected the data, from the platform. But those moves came too late and couldn't stem the outrage of users, lawmakers, privacy advocates, and the media. Immediately, Facebook's stock price fell and boycotts began. Zuckerberg was called to testify before Congress, and a year of contentious international debates about consumers' privacy rights online commenced. On Friday, Kogan filed a lawsuit against Facebook.
Wylie's words caught fire, even though much of what he said was already a matter of public record. In 2013, two University of Cambridge researchers published a paper explaining how they could predict people's personalities and other sensitive details from their freely accessible Facebook likes. These predictions, the researchers warned, could "pose a threat to an individual's well-being, freedom, or even life." Cambridge Analytica's predictions were based largely on this research. Two years later, in 2015, a Guardian writer named Harry Davies reported that Cambridge Analytica had collected data on millions of American Facebook users without their permission, and used their likes to create personality profiles for the 2016 US election. But in the heat of the primaries, with so many polls, news stories, and tweets to dissect, most Americans didn't pay attention.
The difference was that when Wylie told this story in 2018, people knew how it ended: with the election of Donald J. Trump.
This is not to say that the backlash was, as Cambridge Analytica's former CEO Alexander Nix has claimed, merely a bad-faith campaign by anti-Trumpers dissatisfied with the election results. There is more than enough evidence of the company's unscrupulous business practices to justify all the scrutiny it has received. But it's also true that politics can be destabilizing, like transporting nitroglycerin. Despite all the theories and suppositions about how data might be misused, for many people it took Trump's election, Cambridge Analytica's involvement in it, and Facebook's role in both to see that this squishy, intangible thing called privacy has real consequences.
Cambridge Analytica may have been the perfect poster child for how data can be misused. But the Cambridge Analytica scandal, as it came to be known, was never just about the company and its work. In fact, the Trump campaign has repeatedly insisted that it didn't use Cambridge Analytica's information, only its data scientists. And some academics and political practitioners doubt that personality profiling is anything more than snake oil. Instead, the scandal and backlash grew to encompass the ways companies, including but not limited to Facebook, take more human data than they need and give away more than they should, often asking for consent only in the fine print, if they ask at all.
One year after it became front-page news, Cambridge Analytica's leaders are still being called before Congress to answer for their actions during the 2016 election. But the conversation about privacy has largely moved on from the now-defunct company, which shuttered its offices in May. That's a good thing. As Cambridge Analytica has faded into the background, other important questions have emerged, such as whether Facebook gave device makers special access to user data, or why Google tracks people's location even after they've turned location tracking off.
There is growing recognition that companies can no longer be trusted to regulate themselves, and some states have begun to act on it. Vermont implemented a new law requiring data brokers who buy and sell third-party data to register with the state. In California, a law will take effect in January that will, among other things, allow residents to opt out of the sale of their data. Other states have introduced similar bills in recent months alone. On Capitol Hill, Congress is contemplating the outlines of a federal privacy law, though progress is, as always in Washington, slow.
These scandals and the ensuing blowback have done serious damage to Facebook, and arguably to the entire tech industry. If Zuckerberg had trouble seeing the "risk" of lax privacy protections back in 2012, those risks should be all too familiar to him now. Facebook faces a potential record fine from the Federal Trade Commission, and this week news broke that the company is under criminal investigation over its data-sharing practices.
At the same time, the fallout from the Cambridge Analytica scandal has pushed Facebook to change its ways, at least in some respects. Last week, in a hotly debated blog post, Zuckerberg declared that Facebook's future depends on privacy. He said Facebook will add end-to-end encryption to both Facebook Messenger and Instagram Direct as part of a broader plan to create a new social network for private communication.
Critics have debated whether Zuckerberg has finally seen the light or whether he is actually motivated by more mercenary interests. Still, encrypting those chats would instantly improve the privacy of billions of people's personal messages worldwide. Of course, it could also do a lot of damage, creating even more dark spaces on the internet for misinformation to spread and criminal activity to fester. Just last week, one of Zuckerberg's most trusted allies, Facebook's chief product officer Chris Cox, announced he was leaving the company, a decision reportedly having much to do with these very issues.
One year after the Cambridge Analytica story broke, none of these privacy questions have easy answers for companies, regulators, or consumers who want the internet to remain convenient and free, yet also want control over their information. But the scandal has at least forced these conversations, once purely academic and wonky, into the mainstream.
If only the world had seen it coming sooner.