
Social Media Content Moderation: The German Regulation Debate

Nele Achten
Thursday, December 27, 2018, 10:10 AM


The Bundestag, Berlin, Germany (Source: Wikimedia/Jorge Royan)


Over the past year, lawmakers from Brussels to Washington have discussed whether and how to regulate social media platforms. In Germany, a central question has been whether such platforms—which Germans call social network providers (SNPs)—should be held liable if they fail to delete or remove illegal content. In 2017, the German Bundestag answered this question when it enacted the Network Enforcement Act (NEA), which came into effect in January 2018. The law requires SNPs with more than two million registered users in Germany to remove “manifestly unlawful” content within 24 hours of receiving a complaint and to respond to all other complaints within seven days. This post outlines the law’s objectives, the criticisms that have been leveled against it, and how it has operated during its first year.


Background and Objective of the German Network Enforcement Act


In 2015, the German government created a task force, which included companies and civil society organizations, to address the increased dissemination of hate speech and other illegal content on social network platforms. The companies on the task force, in particular Facebook, Twitter and YouTube, voluntarily committed to implement complaint mechanisms; to have legally and linguistically qualified teams verify the illegality of reported content within 24 hours; and to delete the content if the company considered it illegal.


In 2017, the Federal Ministry for Family Affairs, Senior Citizens, Women and Youth accused SNPs of failing to adequately delete illegal comments on their platforms, pointing to a study conducted by the public office for youth protection on the internet. In that study, the office submitted complaints to Twitter, Facebook and YouTube about content that it had previously identified as illegal under German law. In response to complaints made through the accounts of individual users, YouTube deleted 90 percent of the allegedly illegal content, Facebook 39 percent and Twitter only one percent. After the office submitted complaints from its accredited account, YouTube deleted another seven percent and Twitter another 63 percent.


The German government proposed the NEA partially in response to the inadequate complaint mechanisms of SNPs. The new legislation sought to force SNPs to respond more quickly and comprehensively to complaints of illegal content.


Main Points of Regulation


Section 1(1) of the NEA defines SNPs as service providers that, for profit-making purposes, operate internet platforms designed to enable users to share any content with other users or to make such content available to the public. The law explicitly excludes platforms offering journalistic or editorial content, along with platforms designed to enable individual communication or the dissemination of specific content.


The core provisions of the NEA, Sections 3(1) and 4(1)(2), impose obligations on SNPs to provide a complaint mechanism for individuals or official authorities whose place of residence or seat is located in the Federal Republic of Germany. Section 2(1) requires companies to publish a transparency report on the handling of complaints about unlawful content on their platforms. A violation of these obligations can result in a fine of up to 500,000 euros under Section 4(1)(1) and (2). SNPs have an obligation to remove “manifestly unlawful” content within 24 hours and to respond to all other complaints within seven days (see Section 3(2)(2) and (3)), but a violation of this obligation cannot be punished with a regulatory fine.


Individuals who have submitted a complaint to an SNP but were unsatisfied with the company’s decision to leave the content online can submit a complaint to the Federal Office of Justice, the public authority responsible for imposing potential fines. Under Section 4(5), the office can then seek a judicial decision establishing the unlawfulness of the content.


Criticism


Before the enactment of the law, a number of civil society groups and digital technology associations criticized the NEA in a “declaration on freedom of expression,” arguing that the government, rather than private companies, should be responsible for determining the legality of online content. The U.N. special rapporteur on the promotion and protection of the right to freedom of opinion and expression also raised concerns over the law, in particular the strict time periods within which SNPs must assess and remove illegal content. Critics feared that SNPs would, out of caution, delete or block content that was close to the line of illegality but would prove lawful after a comprehensive legal examination.


In a hearing at the German Bundestag, experts pointed out the risk that companies might prefer to “overblock” content in case of doubt instead of risking fines. The executive director of Reporters Without Borders, Christian Mihr, warned that the German law could inspire authoritarian regimes to enact similar legislation.


Then-Minister of Justice and Consumer Protection Heiko Maas, whose ministry first proposed the NEA, defended the law. He denied that it would violate basic rights and reiterated his position that freedom of expression does not excuse spreading criminal content. The NEA, he argued, would instead guarantee freedom of expression and put an end to the “law of the jungle” on social networks.


Implementation


In January 2018, just days after the NEA came into effect, Facebook and Twitter blocked prominent posts and accounts—mostly from the far right, but also satirical comments from journalists. The deleted tweets all concerned refugees in Germany. What follows is a more detailed description of some of the deleted content.


On New Year’s Eve 2017, the Cologne police posted a “happy new year” tweet in German, English, French and Arabic. Beatrix von Storch, the deputy leader of the far-right Alternative for Germany (AfD) and a member of the German Bundestag, responded on Twitter and Facebook: “What the hell is going on in this country? Why does the official police website in NRW (North Rhine-Westphalia) tweet in Arabic? Do you believe that it soothes the barbaric, Muslim, gang-raping hordes of men?”


Von Storch was referring to the clashes of New Year’s Eve 2015 between the Cologne police and groups of young men who sexually harassed women in the city. Facebook and Twitter immediately deleted von Storch’s comment. The satirical German magazine Titanic soon responded to her post with the following tweet: “Do you know how to say Twitter in Arabic, dear @polizei_nrw_k? Yes? Yuck! I don’t know it, because the last thing I want is soothed barbaric, Muslim, gang-raping hordes of men!”


Twitter also took down this tweet and temporarily blocked the Titanic account. After these events, AfD co-leader Alexander Gauland described the NEA as a “censorship law.”


Transparency Reports by Social Network Providers


The first comprehensive reports about the handling of complaints were issued in July 2018, six months after the NEA went into effect. YouTube and Twitter each reported over 200,000 individual complaints, while Facebook reported only around 800. Over 70 percent of complaints did not result in removal of the content in question because the companies found that the content violated neither their Community Guidelines nor the NEA.


The YouTube transparency report made clear that much of the removed content would have violated the company’s Community Guidelines even in the absence of the NEA. Content related to hate speech, political extremism, terrorism, and defamation or insults, however, was often considered not to violate YouTube’s Community Guidelines and yet was taken down under the NEA. YouTube, Twitter and Facebook also reported that they often needed more than 48 hours to decide on the legality of content alleged to constitute hate speech, defamation or insults.


On the basis of these reports, Reporters Without Borders expressed continued worry about overblocking, as did the opposition party Alliance 90/The Greens. But overblocking is hard to assess. In June, Jimmy Schulz, a member of parliament and chairman of the Committee for Digital Policy, argued that proof of overblocking could be inferred from the small number of individual complaints submitted to the Federal Office of Justice: The small number would mean that SNPs had deleted the majority of contentious posts before anyone could complain to the office about their non-removal. By November 2018, only 704 individuals had registered complaints about insufficient takedowns by SNPs, though Maas had expected 25,000 cases per year. Konstantin von Notz, a member of parliament and the Greens’ spokesperson for digital policy, argued that the small number of individual complaints proved only that the process for submitting complaints to SNPs needs improvement. His party has also called for a put-back procedure for content taken down without justification and has suggested establishing a public institution to resolve disputes.


The Challenge of Deleted Content


Since enforcement of the NEA began, courts have had to decide on the legality of deleting lawful content. Individuals who believe that their content has been wrongly deleted can, under Section 1004(1), sentence 2, of the German Civil Code, seek an injunction requiring the SNP to refrain from removing their content. The increase in civil cases corresponding with enforcement of the NEA might also show that companies prefer to delete content in cases of doubt.


A number of German regional and district courts have had to decide whether an SNP may take down content that is actually legal under domestic German law, or whether doing so would impermissibly restrict freedom of expression. In the majority of cases, the courts ruled that the SNP could take down content that was legal under German law but violated the platform’s community standards (for example: Regional Court of Karlsruhe, Regional Court Dresden, Regional Court Stuttgart, District Court of Frankfurt). Even though constitutional rights impose direct obligations only on the state, the courts wrote, those rights might also have indirect effects on private entities, insofar as the freedom of one person must be balanced against the freedom of another.


The Regional Court Dresden found that in striking this balance, users’ freedom of expression is of particular importance because Facebook holds a near-monopoly among non-professional social network platforms in Germany. Based on this quasi-monopoly, the court ruled that Facebook would not be allowed to take down posts voicing a particular political viewpoint or to block accounts arbitrarily. In weighing the company’s constitutionally protected rights against users’ freedom of expression in the concrete case, however, the court held that the company’s constitutional right of property and its freedom to act outweigh the users’ rights. The court considered in particular the risk of an SNP receiving a fine under the NEA: In order to remain operational, an SNP must be allowed to delete content that it considers illegal under its community standards even if that content later proves compliant with domestic German law.


The Regional Court of Stuttgart ruled similarly, factoring the risk of an NEA fine into its balancing of constitutional rights. The court ruled in favor of Facebook, reasoning that the user would be prevented from expressing his or her opinion only on this particular platform.


Only a few court decisions held that a takedown based solely on a violation of community standards was illegal and that the SNP had to restore the posted content. The District Court Cologne ruled that Facebook’s community standards were formulated such that the company would always be allowed to take down content under its own rules but would never be obliged to do so. If an SNP were allowed to take down content based only on a violation of community standards, the court wrote, it would be impossible for the state to oversee online content and ensure necessary takedowns.


Conclusion


Despite criticism of the law, the main structure of the NEA is most likely to remain. The coalition agreement between the political parties currently in power states that the NEA was a “right and important step for the fight against hate crimes and other punishable statements in social networks.” Meanwhile, Alliance 90/The Greens proposes fostering the creation of a self-regulating private institution under state oversight, which would resolve disputes about the legality of content rather than leaving companies to decide what to take down. Chancellor Angela Merkel has said that it might be necessary to amend the law, but a first evaluation of the NEA is planned only for 2020. By then, the government expects to have a better picture of the law’s effectiveness and consequences.


Nele Achten is affiliated with the Berkman Klein Center for Internet and Society at Harvard University and a PhD candidate at the University of Exeter, where she focuses on the debate around cyber-specific due diligence obligations and assesses key normative questions relevant to the development of a framework that protects cyberspace as a safe and secure environment. Last year, Nele was a visiting researcher at Harvard Law School and affiliated with the Cybersecurity Project at Harvard Kennedy School. She holds an LL.M. from the Geneva Academy of International Humanitarian Law and Human Rights and is qualified to practice law in Germany.
