Published by The Lawfare Institute
On March 6, Facebook CEO Mark Zuckerberg announced a long-term road map to turn Facebook into a “privacy-focused communications platform.” His “principles” for this transformation include auto-deleting old user content and choosing “not to build data centers in countries that have a track record of violating human rights like privacy or freedom of expression,” even if that gets Facebook blocked from lucrative markets such as China or Russia.
But Zuckerberg devoted the most space to a promised deployment of end-to-end encryption for private messages: “People's private communications should be secure. End-to-end encryption prevents anyone—including us—from seeing what people share on our services.” As precedent he cited WhatsApp, the popular messaging service purchased by Facebook in 2014, which made headlines in 2016 when it rolled out end-to-end encryption across all its services.
Some of these ideas would be legitimate improvements to Facebook (especially for dissidents in repressive regimes who fear abusive government surveillance). But for the vast majority of Facebook users, little of what’s gotten the company in trouble over the past few years has anything to do with the lack of encryption. Consider, for example, Facebook’s role in Russia’s disinformation campaign during the 2016 U.S. elections. Or the consistent complaints that Facebook’s content-moderation practices are both excessive (as to politically charged speech) and insufficient (as to abusive and threatening content), as well as inscrutable and capricious. Or the rising body of evidence that Facebook, and social media in general, is addictive and harms mood and attention, especially among the young.
Solve these problems and you might actually achieve a “privacy-focused communications platform.” But you won’t do it by encrypting Facebook Messenger. What Zuckerberg is instead attempting is a form of reputation laundering: Rather than meaningfully engage with Facebook’s real issues, Zuckerberg invokes encryption to redirect the public from the difficult—and costly—reforms that would actually fix the social media platform. Call it “privacy laundering.”
For example, Zuckerberg claims that encryption is “decentralizing,” because “it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information.” But in Facebook’s case, the decentralization would be marginal at best: the company would still manage the communications of billions of people inside a walled garden that it controls. Zuckerberg rightly notes that “some people worry that our services could access their messages and use them for advertising or in other ways they don't expect.” Yet encryption will do little to fix the problem of targeted ads. As Zuckerberg has admitted, Facebook doesn’t need access to user content to sell its users’ attention to advertisers. Apparently, (unencrypted) demographic information is more than sufficient.
This is why Zuckerberg’s sudden rediscovery of the virtues of privacy has been rightly met with broad skepticism. Facebook’s business model is the quintessential example of “surveillance capitalism,” with user data serving as the main product that Facebook sells to its advertisers. Zuckerberg’s announcement was conspicuously silent on changes to this core business model, which suggests that none of the changes he lays out will improve user privacy in a meaningful way. It’s a bit like McDonald’s announcing a menu revamp built around kale, or ExxonMobil staking its future on power-generating hamster wheels. Anything’s possible, but I’d like to see the business model first.
Ironically, the closest Zuckerberg gets to linking encryption to an actual issue confronting Facebook is an area in which the argument cuts against widespread deployment of end-to-end encryption: Facebook’s (in)ability to monitor its service to stop the spread of harmful content. For example, Facebook has been criticized for allowing its services, including WhatsApp, to be used to foment anti-Muslim violence in South Asia. As Zuckerberg admits, encryption will make it harder for Facebook to police its own platform, since it won’t have access to the content its users communicate.
But according to Zuckerberg, this is an acceptable risk: Facebook already “will never find all of the potential harm we do today when our security systems can see the messages themselves.” One would think this argues for devoting more resources to content moderation, not (as broader encryption would entail) fewer.
There are good reasons for a company to encrypt user data, and there are good reasons not to. The issue is a complex one (as has been discussed extensively on Lawfare). But privacy laundering is an unequivocally bad reason for a company—whether Facebook or any other tech giant—to push encryption.