Cybersecurity & Tech

Regulating Online Speech: Ze German Way

Matthias C. Kettemann, Torben Klausa
Monday, September 20, 2021, 8:01 AM

Prohibiting platforms from self-governing is becoming more widespread. German law provides for a different approach, with clearer rules and more rule of law in content moderation practices.

The Federal Court of Justice, Bundesgerichtshof. Source: Wikimedia/Steffen Prößdorf


“Moving fast and breaking things” is a scary approach to innovation. It is also a challenging normative approach in Europe, and especially in Germany, where rules are seen as something digital behemoths should stick to, not ignore. On the other extreme of the spectrum, authoritarian and illiberal leaders in Brazil, Poland, Texas and Florida have taken to enjoining platforms from doing what they need to do more of: moderate content, keep the good, delete the bad and downrank the ugly. Such injunctions make it more difficult for social networks to remove harmful but not illegal content, and they run afoul of platforms’ free speech rights and autonomy. European and especially German platform rules go a different route, allowing platforms to moderate content while increasing their responsibilities for enforcing the rules.

Most recently, the German Federal Court of Justice (FCJ), Germany’s highest civil court, handed down two judgments (cases III ZR 179/20 and III ZR 192/20) that affirm prior rulings holding that national law grants platforms flexibility in setting their own rules, but not without conditions.

Two users posted right-leaning content involving immigrants and insinuations of rape and murder. Shortly after, Facebook took down the posts under the hate speech policy in its Community Standards. The users then sued to get their posts reinstated, an option available under German law (and that of most European states), which lacks a broad Section 230-like clause. Their judicial endeavor eventually reached the FCJ and gave the court its first opportunity to answer this question: Can Facebook set its own content moderation rules and block content that is awful but lawful? Or is the private social network obliged to keep all content online—as long as it does not violate German law?

To be fair, “Why should a private company be forced to keep any third-party content on its own platform?” might seem a legitimate question. But things are not so easy: Certain values and principles of Germany’s constitution, together labeled the “free and democratic foundational order” (freiheitliche demokratische Grundordnung), are so central to German jurisprudence that, under certain conditions, they bind not only the state but private actors as well. This concept is probably, apart from Schadenfreude, Germany’s most important linguistic contribution to global legal discourse: Drittwirkung (the horizontal application of fundamental rights, including between private parties).

Too Big or Not Too Big? The Question of Drittwirkung

In 1958, the Federal Constitutional Court (FCC) established its famous Drittwirkung doctrine, roughly translated as third-party effect and also known as the indirect horizontal effect of fundamental rights. Under this doctrine, the FCC argues that because Germany’s constitution is not value-free, its values—embedded, for example, in fundamental rights—radiate into nonpublic fields of law and private law relationships, like contracts. Courts therefore have to take these rights into account when interpreting private law, potentially resulting in an indirect application of fundamental rights to private actors. There are, of course, limits to this. I cannot invoke my right to physical integrity (Art. 2(2)(1) Basic Law) against my violin-practicing neighbor, even if they butcher Mozart. However, especially in situations, or “constellations,” with (extremely) disparate power relationships, the FCC and other German courts use the Drittwirkung doctrine to balance opposing interests that would otherwise play out one-sidedly. (One could think of Drittwirkung as the German “public function” exception to the state action doctrine—but on steroids.)

In recent years, the FCC has refined the Drittwirkung doctrine, especially with regard to public discourse and freedom of expression. In 2011, the FCC ruled in its “Fraport” decision that the mostly state-owned company operating Frankfurt Airport (Fraport) was directly bound by fundamental rights and had to allow a demonstration on its premises. In 2015, the court then decided that the freedom of assembly indirectly binds private actors as well, if their premises are “created and operated [...] as places to linger, meet, stroll, consume and spend leisure time.” This ruling became known as the “Beer Can Flashmob” decision, as the claimants planned to protest for their civil liberties by individually drinking a can of beer on the privately owned square. Leave it to Germany to turn drinking beer into an act of civic virtue.

In 2019, the FCC then expanded its Drittwirkung doctrine from the physical into the digital realm. The court ruled that the question of whether Facebook is bound by the indirect horizontal effects of fundamental rights is at least “open” to discussion. In another 2019 decision, “Right to Be Forgotten I,” the court—referring to social networking sites—further explained that:

when private companies take on a position that is so dominant as to be similar to the state’s position, or where they provide the framework for public communication themselves, the binding effect of the fundamental right on private actors can ultimately be close, or even equal to, its binding effect on the state.

Is Facebook—in the market for social networks and for modern communication in general—in “a position that is so dominant as to be similar to the state’s position”? While Germany’s Constitutional Court has still not committed itself to a clear answer, the Federal Court of Justice has now voiced a clear opinion: No, Facebook is not dominant enough to be considered a state for the purposes of the horizontal application of fundamental rights. However, as the court clarified, due to its size it cannot simply delete posts and block accounts as it pleases. Facebook has rights, but its users do, too. Facebook’s rights, so the FCJ has ruled, have to be weighed against the rights of its users.

Online Speech Is a Triangle: Weighing Competing Rights on Social Media

On the user’s side, the FCJ has referred to the freedom of expression and the equality principle in past cases. (All users have to be treated equally, and no arbitrary differences in deletion/nondeletion are allowed.) While the relevance of expression rights is clear, the right to equal treatment allows the court to address structural imbalances in certain cases. As the court said in Right to Be Forgotten I, “in special constellations … the Principle of Equal Treatment may have an impact on the relationship between private parties, as well.”

To the FCJ, Facebook is a paradigm for such a special constellation. Given Facebook’s number of users, “the network is an important social communication platform, access to which, at least for parts of the population, determines participation in social life to a considerable extent. [...] Anyone who is excluded from this [...] can no longer participate in the internal group or public discussions that take place there.” The court argues that switching to another network is not easily feasible and points to the lock-in effect.

On Facebook’s side, the court identified four different fundamental rights in July’s III ZR 179/20 and III ZR 192/20 cases (sorry, German cases don’t always have memorable names): First, Facebook’s freedom of occupation (basically the right to conduct one’s business) includes its “commercial interest in creating an attractive communication and advertising environment for both its users and its advertising partners, in order to be able to further collect user data and sell advertising space.” By developing and applying Community Standards, Facebook, in the court’s reading, tries to avoid alienating users and advertising partners.

The second right is Facebook’s own freedom of expression: To the FCJ, the social network operator Facebook “is an indispensable intermediary” for the process of communication as such. However, the company is not a mere “technical distributor” of opinions. Instead, it influences the users’ communication with its Community Standards and their implementation through human and algorithmic tools—Facebook’s activities fall under the freedom of expression as a result. In addition, Facebook’s targeted removal of user content to enforce its Community Standards is an expression of opinion in itself.

Third, Facebook can invoke not only its own fundamental rights: By maintaining a certain quality of online discourse, the company serves the interests of the other users on its platform as well. Finally, the FCJ acknowledges Facebook’s rights-based interest in not being liable for third-party content. Because assessing whether content is illegal can be very difficult, Facebook has an interest in erring on the safe side and removing, under its own standards, content that is not necessarily illegal under German law. Some overblocking is therefore fine.

By now, the court has mapped out the rather difficult-to-differentiate rights and interests at stake in online content moderation. And while adhering to its prior dictum that Facebook is not big enough to be bound by fundamental rights directly, the FCJ now has to balance the network’s own rights against its dominance in the market.

With Great Power Comes Great Bureaucracy

So how does the FCJ do it? Yes, Facebook may develop its own Community Standards and take action to enforce them—but when doing so, the company has the legal obligation to take its own users’ fundamental rights into account. Or, in the words of the court’s recent ruling (Right to Be Forgotten I): “Facebook’s fundamental rights are to be balanced with those of the users in such a way that the users’ fundamental rights have the greatest possible effect.” The FCJ derived two key requirements from this ruling.

First, there must be objective reasons for the removal of content and the blocking of user accounts. Although Facebook is used for general communication and information exchange, the network may not arbitrarily ban certain (for example, political) opinions. Banning political ads, however, would most likely still be fine. Banning political speech altogether would, in this reading, be an infringement of users’ communicative rights. Accordingly, the Community Standards have to include objective and clear rules that leave little to no room for interpretation. In other words, if Facebook wants to ban something, everyone should be able to understand what that something is. This is, incidentally, something the Facebook Oversight Board has also asked the company to do, inter alia in the “Goebbels quote” case. Bad and unclear rules, however, are not something only platforms produce. The Austrian Constitutional Court famously criticized the government for passing a law whose content could only be grasped “with subtle knowledge, extraordinary methodological skills and a certain desire to solve brain teasers[.]”

The world is difficult enough; at least the rules need to be clear. The court’s second requirement for content moderation homes in on Facebook’s size: If a company wants to restrict the freedom of expression of millions of people (like a state), it has to adhere to due process requirements (like a state). The FCJ calls this “procedural protection of fundamental rights” (Grundrechtsschutz durch Verfahren). Protecting users’ freedom of opinion requires that the network clarify the relevant facts as accurately as possible, within reasonable limits.

The court therefore found it necessary that Facebook commit itself to (a) informing a user about any removal of their content and about any intention to block their account; (b) informing the user of the reason for the action; and (c) giving the user an opportunity to respond, followed by (d) a new decision with the chance of reinstating the removed content, akin to an appeal review. Due to the risk of further dissemination of potentially harmful content, the hearing can take place after the content has been removed—but it has to happen before a user’s account is blocked.

But don’t all of these procedural safeguards require a lot of additional work and resources at Facebook’s expense? Yes, they do, says the FCJ. But it is an effort that is a necessary part of the company’s business model—and it “does not impose any effort on [Facebook] that would economically jeopardize or disproportionately complicate the operation of its social network.” In short: Content moderation is messy, difficult, and costs a lot of money—and Facebook has to pay for it and get better at it.

Procedural Regulation as the Way Out

The FCJ’s Facebook verdict is a classic “Yes, but …” ruling: Yes, Facebook can make its own rules for its own platform, but whatever rules it comes up with will have to be enforced in a certain way. Yes, Facebook is not bound directly by fundamental rights like a state, but they still apply to a degree and the company has the responsibility to meaningfully uphold them. This “Yes, but …” approach and the weighing of the competing fundamental rights enables the court to dissect the social network into its two integral roles: the private role of a company doing business on its own terms and aimed at maximizing profits; and the public role of a digital marketplace of ideas that is too important for Germany’s 21st century democracy to be left unwatched by courts.

This balancing of interests shows that Facebook is in principle entitled to require the users of its network to comply with certain communication standards that go beyond the requirements of criminal law. Facebook may reserve the right to remove posts and block the user account in question in the event of a breach of its Community Standards. However, in order to strike a balance between the conflicting fundamental rights in a manner that is in line with the interests of the parties, it is necessary that Facebook undertakes in its terms and conditions to inform the user concerned at least subsequently about the removal of a contribution and in advance about an intended blocking of the user’s account, to inform the user of the reason for the action, and to grant the user an opportunity to respond, followed by a new decision.

Procedural regulation, it seems, is the way out: a way to bolster the rule of law online without imposing stricter rules on free speech itself. In the German case, Facebook will now have to either include thorough appeal procedures in its Community Standards or challenge the FCJ ruling before the Constitutional Court (or even the European Court of Human Rights), arguing that the verdict violates the company’s own fundamental rights. However, given the FCC’s faible (soft spot) for the horizontal effect of fundamental rights, it is unlikely that a ruling by the FCC would be more beneficial for Facebook than the one by the FCJ. The European Court of Human Rights is likely to be similarly unmoved—it ruled on Sept. 2 that a politician has to promptly remove illegal hate speech from his Facebook comment section and that a French fine of 3,000 euros for failing to comply did not violate his freedom of expression.

In conclusion, the important insight from recent German jurisprudence on platform rules may well be that Germany (and Europe) is entering a more normative platform age. The days of platforms with no (or barely any) moderation have passed. But at least in Germany, platforms’ ability to freely delete content as they see fit is over, too. (In Europe, similar rules may come with the proposed Digital Services Act.) In Germany, the need for more law, clearer rules, and shared responsibilities for better content governance has become apparent. Enjoining platforms from moderating is the wrong approach. Rather, governments should continue to share good practices in holding platforms more accountable to their content governance rules and should require platforms to review deplatforming decisions and share their rationales as well.


Matthias C. Kettemann is professor of innovation law, legal theory and philosophy of law at the University of Innsbruck, Austria, and a research program head at the Leibniz Institute for Media Research | Hans-Bredow-Institut.
Torben Klausa is a PhD student at the University of Bielefeld and a research fellow at the Berlin Social Science Center (WZB).
