Why Facebook’s ‘Values’ Update Matters

Evelyn Douek
Monday, September 16, 2019, 12:05 PM

Facebook CEO Mark Zuckerberg on stage at Facebook's F8 2017 Developers Conference. (Flickr/Maurizio Pesce, CC BY 2.0)

Amid privacy scandals, sweeping disinformation operations and links to ethnic cleansing, a reasonable person could be forgiven for wondering lately: “What is the point of Facebook?” Now the world has Facebook’s answer to that question in the form of an “update” to the values that inform the company’s Community Standards, the rules for what Facebook allows to be posted on its platform. The blog post is short and could be mistaken for any number of the company’s public relations press releases over the years. Mark Zuckerberg, Facebook’s founder and CEO, has released periodic “manifestos” in the form of blog posts laying out his vision for the company—or, as Zuckerberg prefers to call it, the “Community.” But this latest update, written by Monika Bickert, Facebook’s vice president of global policy management, is far more substantive than mere corporate buzzwords. It may have a significant impact on the platform and, therefore, on online speech.

Under the heading “Expression,” the new values set out Facebook’s vision for the platform and its rules:

The goal of our Community Standards is to create a place for expression and give people voice. Building community and bringing the world closer together depends on people’s ability to share diverse views, experiences, ideas and information. We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable. In some cases, we allow content which would otherwise go against our Community Standards—if it is newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments.

The update then goes on to note that there are four values that may justify limiting expression: authenticity, safety, privacy and dignity.

There is a lot to unpack in this very short post. A few things are especially worth noting: the prioritization of “voice” as the overarching value, the understanding that the purpose of this voice is to “build community and bring the world closer together,” and the explicit incorporation of international human rights standards. But before looking at what this could mean in practice, it’s necessary to understand why this update matters and why Facebook has announced it now.

Why Now?

Bickert’s post does not give any clues as to the reason for the update. But the post comes as Facebook is finalizing its plans for an independent Oversight Board, which will be able to review and overrule Facebook’s content moderation decisions. When Facebook released its Draft Charter for the new board, it noted that the board would base its decisions on Facebook’s Community Standards and “a set of values, which will include concepts like voice, safety, equity, dignity, equality and privacy.” As I wrote at the time, “This sounds good, but a list of priorities that includes everything prioritizes nothing.” Facebook had to make difficult choices about what the point of Facebook is in order to guide the board in cases of ambiguity. The quiet update to its values last week represents this important step.

As Facebook readies itself to outsource ultimate decisions about its rules to an independent, external body, these values serve both as a set of instructions to the board about the ambit of its role and as a commitment binding Facebook to the mast of the values it has now expressly laid down.

What Effects Will This Have?

In many cases, the Oversight Board will be asked to decide whether Facebook’s content moderators correctly applied the Community Standards in choosing to remove certain content or allow it to remain. But over the course of the public consultation process, a strong consensus emerged that the board should also have the power to determine whether the Community Standards themselves are good policy. Both in reviewing the Community Standards and in evaluating individual moderation decisions, there needs to be something to guide the board’s interpretative process. The board members will be independent, but their decisions should be grounded in something more than mere gut feeling or personal preference. After all, if the U.S. Supreme Court were to determine whether a statute is constitutionally valid based merely on whether a majority of the justices think the statute is a “good” rule, the court would be mocked; instead, the justices refer to the Constitution to determine what is within Congress’s power, and this (to most, and in most cases) legitimizes the court’s decisions. Similarly, the Oversight Board needs an overarching framework to resolve cases of ambiguity and to determine whether the rules serve the platform’s underlying purposes.

A clear example of how this might operate is Facebook’s decision to prioritize “voice” over other values. This is not a surprise. After all, “voice” on Facebook means people posting “content,” which is the lifeblood of the platform. As law professor Noah Feldman said in a conversation with Mark Zuckerberg about the Oversight Board, “No voice, no Facebook, right?” Furthermore, Facebook is a platform that has grown up acculturated in American First Amendment norms, which are famously and exceptionally speech-protective. And freedom of expression is a universal human right, which Facebook is at least in principle bound to uphold.

But Facebook’s choice to reflect this in its “paramount” commitment to voice was not inevitable. For all the talk of Facebook and social media more generally as the new public square, the platform is also a product made by a private company. Prioritizing voice and expression, even when it is ugly and objectionable, has risks. It is foreseeable, for example, that some of Facebook’s current restrictions on adult nudity, which are presumably responsive to perceived market demand about what people want to see when they log in to Facebook, might be found to restrict voice without being necessary for the purposes of authenticity, safety, privacy or dignity (at least where the content is consensually shared).

Likewise, it is risky for Facebook, as a private business, not to choose “safety” as its guiding value. Of course, Facebook has said that expression can be curbed for the purposes of safety and “making Facebook a safe place.” But under international human rights law, now expressly incorporated in Facebook’s values, such restrictions need to be necessary and proportionate, in the sense that they impose the least possible burden on speech necessary to achieve the stated purpose of the restriction. Therefore, in cases of ambiguity, when the extent of risk posed by a certain category of speech is not necessarily clear, Facebook will not—and, if the Oversight Board project works, cannot—err on the side of caution and simply take that content down “just in case.” This reflects a certain degree of risk tolerance on Facebook’s part—which will no doubt be praised by those committed to a robust marketplace of ideas. It also perhaps reflects the changing role of these private companies that facilitate so much public discourse: They are not literally, technically or legally a new public square, but their systemic importance means that society is coming to have expectations of their responsibility to the public that go beyond mere consumer satisfaction.

The values update also reflects just how far Facebook has come from its First Amendment roots. Committing to the ability to restrict speech for the purposes of authenticity, safety, privacy and dignity is a departure from the U.S. legal tradition, where unprotected speech is limited to a small set of narrowly drawn categories and the right to anonymity (which is at odds with Facebook’s requirement of “authenticity”) is jealously protected. While Facebook has given voice priority, there will be practical ramifications of the inclusion of values such as authenticity and dignity as countervailing considerations that can justify removing content. As former Justice of the Constitutional Court of South Africa Albie Sachs has observed:

To Americans, the firstness of the First Amendment is axiomatic. It is seen as a source of enlightenment, as being the most constitutive and defining element of the whole constitutional order. The legal cultures of Germany and South Africa, however, have a profoundly different foundational element. It is not free speech, but human dignity. What is axiomatic to an American lawyer could be problematic to us. What is axiomatic to us could be problematic to an American.

Sachs goes on to describe how this has had a material impact on South Africa’s jurisprudence and led to different outcomes than in the U.S. in areas such as hate speech and defamation. One of the key tasks for the Oversight Board, then, will be deciding how to strike a balance in all of these cases: for example, what weight to give dignity interests while still respecting the overarching commitment to voice. Speech decisions are often “zero-sum” controversies in which different values come into conflict: voice vs. dignity; authenticity vs. privacy. There will never be consensus on the correct way to decide these trade-offs, but Facebook’s values statement gives the board something to hold on to as it tackles these difficult problems.

The explicit statement that Facebook looks to international human rights standards to make judgments in hard cases is also notable. Calls for social media platforms to make this commitment have grown louder and louder ever since David Kaye, the U.N. special rapporteur on the promotion and protection of the right to freedom of opinion and expression, called for such a measure in a 2018 report. Many questions remain about how to operationalize this. How is a private company to evaluate interests such as “national security” and “public morals”—which international law says may be weighed against freedom of expression in deciding whether to restrict speech? To what extent can a private platform restrict speech on the basis of product design decisions (for example, a knitting community that wants to restrict certain types of political speech)? But many people are thinking hard about these issues right now, and the area represents an exciting frontier for human rights scholarship and practice.

Community Standards Realism

Of course, none of the values Facebook has set out are technically binding. Facebook could theoretically change its values the day after it gets an Oversight Board decision it doesn’t like. The values themselves are highly indeterminate (“dignity,” for example, can be particularly hard to define).

But the point of the Oversight Board experiment is to garner greater public legitimacy for Facebook’s content moderation decisions through a commitment to transparency and explanation of Facebook’s decision-making. The board’s existence is fundamentally a bet that this kind of legitimacy matters to users’ perceptions of the company and their decisions about whether to keep using the platform—as well as to regulators pondering what to do about the tech giants. Facebook’s decision to spell out its values is part of this venture, and the company would undermine its own wager by failing to stand by these commitments. But it will be the public’s job, too, to hold the company and its new Oversight Board to a fair and justifiable reading of what these commitments entail.


Evelyn Douek is an Assistant Professor of Law at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. She holds a doctorate from Harvard Law School on the topic of private and public regulation of online speech. Prior to attending HLS, Evelyn was an Associate (clerk) to the Honourable Chief Justice Susan Kiefel of the High Court of Australia. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.
