Published by The Lawfare Institute
On Jan. 7, the day after the riot in the U.S. Capitol, Facebook CEO Mark Zuckerberg announced the “indefinite” suspension of Donald Trump’s Facebook and Instagram accounts. It remained unclear whether Trump would be banned from the platforms forever: Zuckerberg wrote at the time that Facebook would block Trump at least until the inauguration, “until the peaceful transition of power is complete.” Since then, the ultimate fate of Trump’s accounts has hinged, more or less, on how the Facebook CEO happened to feel whenever he got around to considering whether the decision should be permanent.
But now, the decision is out of Zuckerberg’s hands. Facebook has referred the suspension to the Facebook Oversight Board: an external court-like institution that Facebook set up to review its “most difficult and significant content decisions” and that, under its bylaws, has the power to issue binding decisions on the platform. The board now has more power over the former president’s future ability to communicate with a large part of his base than either Zuckerberg or Trump himself.
You could be forgiven for not wanting to think about Trump's Facebook account ever again. This is one small part of Trump’s legacy that can quite literally be erased—so why not just delete the account and move on? After four years of being on edge about the president’s posts, I would understand the impulse to want to tune this out. But Facebook’s decision to refer the case to the board is, as I argued last week, the right one—and it has ramifications far beyond this single account. So what should onlookers expect from this case, and why should they care?
What’s on the Table
Facebook has asked the board two questions. First, was Facebook’s decision to indefinitely suspend Donald Trump’s account correct, and should that suspension—which Facebook has said it will keep in place until the board issues its decision—be upheld or overturned? And second, does the board have any observations or recommendations about how Facebook should treat account suspensions when a user is a political leader?
The board’s decision on the first question is directly binding: Facebook must follow the board’s instructions on what to do with this particular account—including, for example, reinstating it if the board so orders. Trump’s suspension is not necessarily as straightforward a decision as it might seem, and it raises the question of how far Facebook should take into account the broader social context, and events off its own platforms, in deciding to ban a political figure’s account. The platform’s decision to ban Trump was more a result of the events at the Capitol and the surrounding political context than of anything the president had posted directly prior to the suspension. Taking context into account is the only way to effectively evaluate speech—but how should Facebook assess that context, and when does context require suspending an account whose posts on the platform itself might technically remain within the rules?
The second question Facebook has referred is a request for policy guidance. The board’s response to this question will not be directly binding on the company but is substantively much more interesting. Facebook’s treatment of public figures has been one of its most controversial policies. The platform treats speech from politicians as inherently newsworthy and therefore in the public interest to be seen, even if it breaks the platform’s community standards. A good argument can be made—indeed, I’ve made it—that democracy requires voters to know who their candidates really are and what they believe, even (or, perhaps, especially) when those beliefs are abhorrent. (This does not and should not apply with respect to incitements to violence.) Others argue that, to the contrary, public figures with significant influence should, if anything, be held to higher standards than other users.
There isn’t an obvious or simple answer to this trade-off between the benefit of holding political figures accountable and keeping their words accessible to the public, weighed against the risks of speech short of incitement that nevertheless may be harmful or offensive. At the very least, this is exactly the kind of “difficult and significant” question the board was set up to address.
Given that the board’s policy guidance will not be binding, it remains to be seen how seriously Facebook takes the board’s recommendation. If Facebook doesn’t like what it hears, it could simply respond, “Thanks for your input!” and ignore the board’s advice. But the reputational costs of doing so would be significant given the high-profile nature of this referral and the effort Facebook has gone to—often unsuccessfully—to try to convince people that the board has real power. At the least, the referral suggests Facebook wants someone else to try answering the question.
What Happens Now
Facebook has elected to refer Trump’s case under its normal referral process, rather than the “expedited review” process available for exceptional cases. This left open the possibility that the board could decline the case, but it announced immediately that it would accept, noting that Facebook’s decision “has driven intense global interest.”
Now that the board has accepted the referral, it will have 90 days to consider the case. Don’t expect anything like a traditional courtroom drama; there will be no cross-examinations of Trump. Essentially the entire process will proceed by writing, and many of the documents will not necessarily be made public.
Facebook will prepare a set of submissions for the board that include basic information about the content that led to its decision and Facebook’s rationale. The board can then request further information from Facebook, such as about the engagement and reach of Trump’s posts—that is, how many people saw and interacted with them—or how much of the content was flagged by users as violating the rules. Trump will have an opportunity to submit a user statement explaining why he believes Facebook’s content moderation decisions should be overturned. The board can also, at its discretion, consult outside experts and commission issue briefs from advocacy organizations.
For its initial cases, the board has set up a public comment process through which anyone can submit comments on individual cases. But these comments aren’t public, so there’s no way to know the extent to which people have engaged with this process so far. (After this piece was published, the Oversight Board informed me that public comments will be made available, with consent, in an appendix to each case at the time the decision is released.) The last two weeks of op-ed pages, though, suggest that many people might have thoughts about Trump’s case. Perhaps German Chancellor Angela Merkel, who found Trump’s ban from Twitter a “problematic” breach of the “fundamental right to free speech,” could make a submission.
The board currently has 20 members, but only a five-person panel will initially consider the case. Unlike an appellate court panel, though, the names of the panelists will not be released publicly. Four of the five will be assigned at random. At least one panel member will be from the U.S., because every panel must include a representative from the region the decision primarily affects—and while that region technically includes Canada, the board does not currently have any Canadian members. After reviewing the information about the case, the panel will draft a written decision, which may include any concurring or dissenting views if the panel cannot reach consensus.
The 20-member board as a whole will then review this panel’s draft decision. If the board is not satisfied with the decision, a majority of the members can decide to send the case back for re-review, convening a new panel and starting the process again on an expedited timeline. There is currently no mechanism to break the infinite loop that may occur if a majority of the board is consistently dissatisfied with panel decisions.
Once the decision is final, it will be published on the board’s website. Facebook will publicly respond to the board’s decision and must implement the board’s decision on Trump’s account within seven days. The company will provide a public response regarding the board’s politician policy recommendation within 30 days.
The Bigger Picture
Whether Trump gets to use Facebook in the future will no doubt help shape his post-presidential influence in politics. But this referral, and the board’s decision, could have two more enduring impacts.
The first impact is the global ripple effects. When social media platform after social media platform banned Trump’s accounts following the Capitol riot, the world was watching—and many people asked what this meant for politicians in other countries who have overseen or incited violence, but whose social media accounts remain alive and well. Right now, there’s an apparent inconsistency in how Facebook treated Trump’s account as opposed to how it treats accounts of other leaders: Philippine President Rodrigo Duterte, for example, is still on Facebook despite, among other things, suggesting that drug dealers should be summarily executed. Whether and in what form Trump’s suspension represents a precedent is not merely an academic question. The board’s policy guidance, if taken seriously by Facebook, could have dramatic consequences for politics and societies from the Philippines to Brazil to Ethiopia and beyond.
There is no greater question in content moderation right now than whether Trump’s deplatforming represents the start of a new era in how companies police their platforms, or whether it will be merely an aberration. The past few weeks have also shown that what one platform does can ripple across the internet, as other platforms draft in the wake of first movers and fall like dominoes in banning the same accounts or content. For all these reasons, the board’s decision on Trump’s case could affect far more than one Facebook page.
The second impact of the referral is more inchoate, relating to the board’s uphill battle to build legitimacy and trust. Since the organization was announced in May 2020, there have been two possible paths before it. It could become an important institution that operates as a meaningful check and balance on the extraordinary power Facebook exercises over the public sphere—or, alternatively, it could be a total flop. Past experiments that went down this latter route include Google’s AI ethics board, disbanded after only a week, and Facebook’s failed 2009 attempt to allow democratic input on its policies, which it abandoned when only 0.3 percent of users voted.
A handful of people, including myself, have been cautiously optimistic about the board experiment. But I’ve been discouraged in recent months as the board’s power has become steadily more circumscribed. Like Schrödinger’s cat, it’s still unclear if the Oversight Board is dead or alive—but had Facebook not referred one of the most controversial and high-profile decisions in its history, this would have been a bad sign. Referring Trump’s suspension instead suggests that Facebook has faith in the board experiment and may well put its money where its mouth is on consequential decisions. This could, in turn, build broader external trust in the board.
Before the board stepped in, the bottom line on Trump’s account was simply that Mark Zuckerberg would decide what to do. If you believe that “Mark decides” is a bad governance model for the future of speech online—regardless of whether Mark occasionally happens to decide correctly—this referral is good news.
Editor's note: This piece has been updated to include a clarification from the Oversight Board on the public comment process.