Published by The Lawfare Institute
Section 230 of the Communications Decency Act is critical to Facebook’s existence. Under Section 230, the platform is immune from liability that might otherwise be incurred from its users’ posts. Without this immunity, which extends to any other “interactive computer service” but not to traditional media outlets, the platform could be held liable any time one of its users made a defamatory statement. But could Facebook lose this protection if it does not operate as a completely neutral forum? And must Facebook sacrifice its Section 230 immunity to keep its own First Amendment rights?
Sen. Ted Cruz, the Republican from Texas, suggested as much while questioning Facebook CEO Mark Zuckerberg during last week’s congressional hearings. But Cruz’s representation of Section 230 is misleading. There is no requirement that a platform remain neutral in order to maintain Section 230 immunity. And Facebook does not have to choose between the protections of Section 230 and those of the First Amendment; it can have both.
When asked by Cruz if Facebook considered itself to be a “neutral public forum,” Zuckerberg’s consistent response was that Facebook considered itself to be a “platform for all ideas.” Cruz insisted that a yes-or-no answer mattered: “It’s just a simple question. The predicate for Section 230 immunity under the CDA is that you’re a neutral public forum. Do you consider yourself a neutral public forum, or are you engaged in political speech, which is your right under the First Amendment.” And slightly earlier, Cruz asked, “Are you a First Amendment speaker expressing your views, or are you a neutral public forum allowing everyone to speak?”
This line of questioning suggests that Facebook must either declare itself to be a neutral public forum to preserve its Section 230 immunity, or concede that it is an actor engaged in political speech to preserve its First Amendment rights. Neither option would be good for the company. The first implies that Facebook’s practice of restricting content in accordance with the company’s guidelines could be reviewed for “neutrality”—a word with little meaning in this context. The second requires Facebook to take a politically difficult position given its efforts to portray itself as a platform for all ideas: As Zuckerberg said during the hearing, the company’s “goal is certainly not to engage in political speech.” And the implication that Facebook must choose one option to the exclusion of the other suggests that Facebook cannot simultaneously enjoy the protections of Section 230 and those of the First Amendment—which is not the case. This post addresses each of these suggestions in turn.
Section 230 Immunity
By asserting that Facebook must be a neutral public forum to benefit from Section 230 immunity, Cruz suggested that Facebook’s practice of content moderation could put this immunity at risk. But Section 230 was adopted precisely to encourage this moderation. As the statute’s proponents explained at the time, the hope was to incentivize online service providers to “help ... control” what enters the home through the “portals of our computer.” Protecting “anyone ... who takes steps to screen indecency and offensive material for their customers” was an explicit goal. And the drafters did not look favorably on government review of these discretionary decisions: “We do not wish to have content regulation by the Federal Government of what is on the internet, ... a Federal Computer Commission with an army of bureaucrats regulating the Internet.” By moderating its website, Facebook doesn’t risk running afoul of Section 230; it lives up to it.
Facebook’s shield from civil suit on the basis of user posts comes from Section 230(c)(1). This subsection says that no interactive computer service provider (like Facebook) “shall be treated as the publisher or speaker of any information provided by another information content provider.” Courts have interpreted this provision as a substantive protection that forecloses civil suits (including, importantly, defamation suits) against service providers on the basis of their users’ speech.
Section 230 also gives Facebook wide latitude to moderate the content posted to its site. Subsection 230(c)(2) explicitly contemplates such moderation: It protects Facebook from civil liability on the basis of any “good faith” restrictions the company places on access to “material that [it] or [the] user considers to be ... objectionable, whether or not such material is constitutionally protected.” There is no neutrality provision and no review provision. The text explicitly says that the decision about whether material is objectionable is the service provider’s to make.
Though Section 230 itself says nothing about neutrality, some cases do discuss it, beginning with the Ninth Circuit’s decision in Fair Housing Council v. Roommates.com. The en banc court found that the housing website could have violated the Fair Housing Act because it “force[d] subscribers to divulge protected characteristics and discriminatory preferences” (namely, preferences regarding roommates’ sex and family status) as a condition of registration. The opinion refers to Section 230’s protection of “neutral tools” as opposed to tools that themselves encourage discrimination on the basis of a protected category. The court reasoned that though Roommates.com was an interactive computer service provider, it had acted as an information content provider insofar as it developed the potentially discriminatory tool. Section 230 would not immunize it for the discriminatory nature of its own actions.
So while Roommates.com did discuss neutrality, that discussion concerned only the narrow question of whether the website itself had violated the Fair Housing Act. A requirement that tools be neutral with respect to protected categories to satisfy the Fair Housing Act does not establish the political neutrality described by Cruz as a general condition of Section 230 immunity.
First Amendment Protections
Furthermore, Facebook does not need to portray itself as a political actor to maintain its First Amendment rights. Corporations have free speech rights under Citizens United v. Federal Election Commission, whether or not they are political actors. These rights are important to Facebook because they protect Facebook’s ability to moderate the content on its site. Free speech includes the right not to speak, so the government cannot force Facebook to host content that the company does not want to host. Any such obligation would likely amount to a form of compelled speech in violation of the First Amendment.
As a matter of company policy, Facebook could make claims to neutrality without putting its First Amendment rights at risk. But these claims would not have the legal significance that Cruz suggests. There does exist a public forum doctrine in First Amendment law: If the government opens up access to public property for speech, it can only very rarely exclude speech from that property on the basis of the speech’s content. But despite Zuckerberg’s proposal for a “Supreme Court” of Facebook, his company is not the government, and so this doctrine does not restrict its ability to moderate.
Prior to Citizens United, the Supreme Court occasionally found limits on a company’s ability to restrict speech on its private property. In the 1946 case of Marsh v. Alabama, it held that a company town—because it functions just like any other town, despite being run by a private entity—could not limit people’s First Amendment rights as a condition of entry. And in 1980, the court held in Pruneyard Shopping Center v. Robins that California could prevent a private shopping mall from excluding expressive activity. But the speakers had no federal First Amendment right to speak in the mall just because it was open to the public.
Would a similar law applied to Facebook be constitutional? Maybe, but it seems unlikely. The Pruneyard court found the restriction on the mall’s rights to be constitutional because the speech at issue did not interfere with the mall’s commercial function and because the mall could easily “disclaim ... sponsorship” of the message. It explained that an analogous statute applied to a newspaper would unconstitutionally “intru[de] into the function of editors,” but “these concerns obviously are not present” in the case of a shopping mall. As for Facebook? The company’s ability to moderate the site seems quite integral to Facebook’s value as a space for discourse; a Facebook overrun with trolls would be a different space entirely. And if moderating a social media site is integral to its operation, as it would seem to be, it’s hard to see how a law that dictated the terms of Facebook’s ability to moderate would not intrude into the function of the site’s operators. As the concurrence in Pruneyard observed, “state action that transforms privately owned property into a forum for the expression of the public’s views could raise serious First Amendment questions,” even if it didn’t in the specific case of the Pruneyard mall.
In sum, the dichotomy Cruz presented to Zuckerberg was a false one. Facebook may still have difficult decisions ahead—but contrary to Cruz’s suggestions, it risks losing neither its Section 230 immunity nor its First Amendment rights if it continues its current practices of content moderation.