Facebook’s ‘Draft Charter’ for Content Moderation: Vague, But Promising

Evelyn Douek
Thursday, January 31, 2019, 11:39 AM

The Charter of the United Nations begins by listing a number of lofty goals, including, “We the peoples of the United Nations [have] determined to save succeeding generations from the scourge of war, which twice in our lifetime has brought untold sorrow to mankind.” This echoes the more famous and similarly aspirational preamble to the U.S. Constitution, which talks of the establishment of a “more perfect union.” On Monday, Facebook published a decidedly less soaring but more defensive draft charter for its proposed “Oversight Board for Content Decisions”: “Every day, teams at Facebook make difficult decisions about what content should stay up and what should come down. … [W]e have come to believe that Facebook should not make so many of those decisions on its own.”

The document gives some detail and shape to founder and CEO Mark Zuckerberg’s announcement late last year that Facebook would be setting up an independent body to which users could appeal content moderation decisions. Establishing such a body has the potential to revolutionize speech online. But until now, as a joint Oxford-Stanford report released earlier this month stated, “[a]ll the major questions” about what such a body would look like and how it would work “remain[ed] unanswered.”

The draft charter makes some progress in answering these questions, but acknowledges that it is still only a “starting point.” Facebook is going to be conducting a worldwide series of workshops and consultations in the next six months as it continues to think about the institutional design of the board. Still, the new details presented in the draft charter are mostly promising. Overall, the document shows that Facebook is listening and open to innovation in designing the body to be an effective mechanism for accountability and transparency. However, so much is left open that it remains far from clear whether this promise will be fulfilled.

Composition

The draft charter announces that the Oversight Board will be made up of “up to 40 global experts,” selected on the basis of publicly available qualifications as well as geographical, cultural, personal and professional diversity. Members will serve for fixed three-year terms, which are automatically renewable for one additional term, and can be removed if, but only if, the member violates the terms of their appointment. Those terms would include to-be-established rules on anti-lobbying, recusal in cases of conflicts of interest, and upholding the secrecy of the board’s private deliberations.

From this rough description, what is apparent is how little the Oversight Board will resemble the U.S. Supreme Court, which Mark Zuckerberg originally invoked when announcing the proposal. Most of the differences are laudable, especially the commitment to diversity in membership across various axes—a trait which the U.S. Supreme Court is famously lacking. The global nature of the disputes that the board will have to adjudicate—and the history of Facebook’s content-moderation decisions having harsh, if inadvertent, effects on minority or disadvantaged communities—make diversity essential to the body’s legitimacy and effectiveness. However, Facebook plans to have rotating panels of unspecified size hear cases. For this reason, it is unclear whether the overall diversity of the board will ensure that the necessary perspectives are heard in any given case.

This concern may be tempered by another very welcome design commitment: Facebook will allow the board to “call upon experts to ensure it has all supplementary linguistic, cultural and sociopolitical expertise necessary to make a decision.” The board’s ability to seek out the materials it needs is a departure from the requirement, typical in adversarial systems, that parties present the materials relied upon by the court. Ideally, it will help ensure that the board does not make decisions without relevant materials. This more inquisitorial model of the proceedings may also correct the obvious potential power and resource imbalance created where an individual petitions the board for reconsideration of a decision made by the well-resourced perpetual defendant—that is, Facebook.

However, it is not clear that the board will be able to meaningfully take advantage of this mechanism given the time constraints set by Facebook. According to the charter, the board will be required to issue an explanation of its decisions within two weeks (although it is unclear when this period would start running). The two-week deadline seems somewhat arbitrary: A functional board would have to balance speed in decision-making, which preserves the possibility of a substantive remedy, against the need to review cases carefully. It is unclear why a strict standard of two weeks strikes a good balance—a fortnight is an age in terms of the internet zeitgeist (justice delayed is virality denied) but perhaps not long enough for a multi-member board to gather and consider all the materials it needs.

The three- to six-year terms of board members are more in line with the international norm of around 9–14 years for judicial appointments than with the lifetime appointments enjoyed by U.S. federal judges. The risk of these relatively short terms is that members will leave the board just as they develop the necessary expertise, which Facebook acknowledges is a trade-off against the desire for “fresh perspectives.”

What Facebook does not address is how such short, part-time appointments might affect current members’ independence, given that members are likely to still be concerned with the impact their decisions might have on their careers and reputations after serving. This is especially so given that it is by no means clear that serving on the board will carry with it the same level of professional prestige that usually attends appointment to other final courts of appeal. Potentially with an eye to this, and more generally to protect board members, Facebook has announced that decisions will not be attributed to individual panel members and will be “issued on behalf of the board.” This is in keeping with practice in many civil law jurisdictions, but Facebook has said it will still retain the common law tradition of permitting dissents.

Jurisdiction

Worryingly, the draft charter suggests that Facebook will define the board’s jurisdiction in narrow terms. Facebook says that the issues that the board will discuss are “[o]ur most difficult and contested decisions about taking down or leaving up content.” Content moderation decisions of this nature are the most obvious way Facebook decides what users see online, and such decisions are often the source of the loudest public outcries against the platform. But they are hardly the most pervasive, or necessarily the most consequential, means by which Facebook controls the online environment. Putting aside the subjective and manipulable standard of “most difficult and contested decisions,” Facebook also downranks a large amount of content, reducing the number of users who see it in their news feed without taking that content down completely—and the company has said that it intends to take this step with more and more content going forward. Under the draft charter’s definition, Facebook might avoid review by the board simply by severely limiting a piece of content’s circulation, instead of removing it.

Or consider the application of Facebook’s rules around political advertising, which have caused controversy when the company blocked nonpolitical ads from news sites and nonprofit organizations, while waving through fraudulent ads in the names of sitting senators. Another apparent exemption covers decisions made on Instagram, which is a troubling omission: Reports commissioned by the Senate Intelligence Committee found that Instagram was a main vector of the Russian influence operations in the 2016 election and is likely to be a key battleground on an ongoing basis. These categories are just a few examples of the wide range of content-moderation decisions that Facebook makes, and they highlight how a narrow jurisdiction over individual take-down decisions hardly empowers the board to bring true transparency, accountability and coherence to Facebook’s content moderation practices.

Facebook is not the only entity that will have de facto control over the board’s jurisdiction. The draft charter says that the Oversight Board will not decide cases where reversing Facebook’s decision would violate the law, in keeping with Facebook’s longstanding policy of respecting the laws of the local jurisdictions where the company operates. As Facebook’s Head of Policy Management Monika Bickert has written, obeying local requests to remove content “may also be the essential component in companies retaining operating privileges in countries with restrictive speech laws, where [for example] a blasphemous post could lead a government to fine a company, arrest its employees, or block its service altogether.” However, as more and more countries take action against the perceived threat of problematic content on social media, this may end up being a substantial limitation on the Oversight Board’s work. This practice also tends to undermine Zuckerberg’s vision of a truly “global” community, where Facebook’s Community Standards “apply around the world to all types of content.” And it will interfere with Facebook’s obligation to uphold the human rights of its users, including their rights to free speech, “regardless of frontiers.”

Similarly, it seems the board will be restricted to assessing whether Facebook’s Community Standards have been correctly enforced in individual cases and not whether those Community Standards are themselves appropriate, although Facebook “can incorporate the board's decisions in the policy development process. In addition, Facebook may request policy guidance from the board.” Some have suggested that limiting the board’s ability to influence broad areas of policy would fail to meaningfully devolve power from Facebook. This remains to be seen. But several suggestions expand the board’s possible jurisdiction: The charter states that the board may have an advisory role or “abstract jurisdiction” (where Facebook can request policy guidance on difficult questions outside the context of a particular dispute) and perhaps even a concrete review jurisdiction (where moderators might escalate matters for review in hard cases where they are uncertain about how the Standards should be applied, in advance of a user-initiated “appeal,” potentially heading off controversy). This would be another welcome departure from the model of U.S. courts, incorporating innovations from other jurisdictions, such as the possibility of referring questions to courts for advisory opinions without requiring a live dispute concerning individual claimants.

Substantive values

When Zuckerberg first announced the establishment of the board, I wrote that one of the most critical questions the company would face was the substantive rules that such a body would be charged with interpreting. In applying Community Standards in individual cases, what principles will the board refer to in resolving ambiguity? Facebook has answered this question by saying it will publish a final charter that includes a “set of values, which will include concepts like voice, safety, equity, dignity, equality and privacy.”

This sounds good, but a list of priorities that includes everything prioritizes nothing. How should the board balance these values? The divergence between American hate speech jurisprudence and that of other countries, particularly in Europe, is heavily influenced by different weightings of individual voice against human dignity. Hopefully, the final charter will include greater guidance than merely a long list of human rights buzzwords.

Will the Oversight Board fulfill its promise?

Whether the Oversight Board becomes an avenue for true transparency and accountability in Facebook’s notoriously opaque content moderation processes depends on institutional design. This is not just a matter of a checklist approach to formal features of “independence.” A great deal of comparative law scholarship, especially regarding authoritarian states, shows that such features are insufficient safeguards if those in power are not meaningfully committed to accepting the limits that an independent check places on their power. Skeptics, therefore, may point to a recent Motherboard report suggesting that Facebook is introducing these changes not to pursue meaningful reform but merely to protect its public image—down to asking moderators to flag pieces of content that might prove to be “PR fires.”

Facebook’s true motivations for ceding some of its power to an independent body might be varied and complex. But good institutional design is a necessary, if not sufficient, requirement for bringing greater checks and more balance into the Facebook content moderation ecosystem.


Evelyn Douek is an Assistant Professor of Law at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. She holds a doctorate from Harvard Law School on the topic of private and public regulation of online speech. Prior to attending HLS, Evelyn was an Associate (clerk) to the Honourable Chief Justice Susan Kiefel of the High Court of Australia. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.
