
A Zuck Takes on Meta

Paul M. Barrett
Friday, May 10, 2024, 1:30 PM

A professor seeks to turn Silicon Valley’s legal shield into a sword.

Smartphone Showing Facebook Application (Pixabay; Public Domain)

Published by The Lawfare Institute

An unusual lawsuit filed against Meta on May 1 by the Knight First Amendment Institute at Columbia University could clear the way for meaningful reform of the social media industry. If successful, the suit brought on behalf of internet policy expert Ethan Zuckerman could accomplish more than any of the many failed legislative attempts by Congress to rein in Silicon Valley.

The legal action aims to give social media users more control over what they see when they log in, potentially shifting power to shape online experiences from a handful of giant companies to millions, if not billions, of individuals worldwide. It could spark a new cottage industry of firms offering those individuals tools to filter out hateful, conspiratorial, or divisive content that they now cannot avoid encountering in their feeds. In theory, it could also undercut current social media business models based on the collection of users’ personal data to target advertising and algorithms that promote sensationalistic material.

The suit seeks to accomplish these ambitious goals by invoking a generally overlooked provision of Section 230 of the Communications Decency Act, a 28-year-old federal law that was designed to protect 1990s online message boards from costly lawsuits, while also encouraging them to block pornography. To the dismay of many critics, today’s social media behemoths have deployed the statute widely to thwart litigation seeking to hold them accountable for all manner of third-party content on their platforms. If this lawsuit works, the industry’s main courtroom shield would become a sword in the hands of Zuckerman and other internet reformers.

The filing of Zuckerman v. Meta Platforms in the U.S. District Court for the Northern District of California came only six days after President Biden signed a law that requires the Chinese company ByteDance to sell TikTok or see the popular short-video platform banned in the United States. But the TikTok sale-or-ban statute stems not from an attempt to improve how social media companies generally operate, but from a specific fear that the Chinese government could use its influence over ByteDance to spread political disinformation and gain access to American users’ personal information.

Zuckerman v. Meta Platforms could have much broader implications for the entire industry, including not just Meta’s Facebook and Instagram platforms but also YouTube and X, formerly Twitter. To be sure, the suit could run aground for technical legal reasons. But if it clears those obstacles, the complaint could succeed based on its notably public-spirited argument that ordinary individuals—not profit-driven corporate algorithms—should determine what they see online.

A Professor’s Quest

Ethan Zuckerman, a professor at the University of Massachusetts Amherst, directs a research center devoted to “building a more civic-minded internet.” In academic papers and panel discussions, he has argued that, as his lawsuit puts it, “the platforms’ engagement-driven algorithms contribute to the spread of false, extreme, or polarizing content, while also stoking division and violence offline.” Many users want more control over what they’re exposed to, but companies like Meta have blocked this user agency. Zuckerman wants to address this lack of control by introducing digital tools that allow users “to tailor what they see on social media to their own preferences.”

He has designed a browser extension called Unfollow Everything 2.0, which would let users effectively turn off the newsfeeds assembled for them by Facebook’s algorithms. The tool would do this by blocking content they do not want to see, while still letting them stay connected to the friends and family they choose.

Here’s how it works: when a user activates the browser extension, also known as a plug-in, Unfollow Everything 2.0 would cause the user’s browser to retrieve the user’s list of followed friends, groups, and pages. The tool would then comb through that list, directing the browser to ask Facebook to unfollow each entry. Users could select friends, groups, and pages to refollow, or keep their newsfeed blank and view only content they seek out. The tool would also encrypt the “followed” list and save it locally on the user’s device, allowing the user to keep the list private while still being able to automatically reverse the unfollowing process.
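The retrieve-unfollow-backup flow described above can be sketched in a few lines of JavaScript. This is an illustrative approximation only, not the extension’s actual source code: the function names and data shapes are assumptions, and a simple reversible base64 encoding stands in for the real encryption (an actual extension would call Facebook’s endpoints from the browser and use the WebCrypto API).

```javascript
// Illustrative sketch of the Unfollow Everything 2.0 flow; all names and
// data shapes are assumptions, not Meta's APIs or the extension's real code.

// Step 1: turn the user's "followed" list into one unfollow action per entry.
function buildUnfollowRequests(followedList) {
  return followedList.map((entry) => ({
    action: "unfollow",
    targetId: entry.id,
    targetType: entry.type, // "friend", "group", or "page"
  }));
}

// Step 2: keep a locally stored, reversible copy of the list so the user can
// undo the mass unfollow. A real extension would encrypt with WebCrypto;
// base64 stands in here only to keep the sketch self-contained and runnable.
function backUpFollowedList(followedList, storage) {
  const encoded = Buffer.from(JSON.stringify(followedList)).toString("base64");
  storage.set("followedBackup", encoded);
}

// Step 3: decode the saved list, e.g. to refollow everything automatically.
function restoreFollowedList(storage) {
  const encoded = storage.get("followedBackup");
  return JSON.parse(Buffer.from(encoded, "base64").toString("utf8"));
}
```

A user who later wanted the old feed back would have the extension replay the restored list as refollow requests, which is what makes the process reversible without the list ever leaving the device.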

Facebook currently allows users to cut off unwanted sources of content, but this requires a cumbersome manual process of unfollowing groups, pages, and friends one-by-one. Unfollow Everything 2.0, which Zuckerman plans to make available free of charge, would automate and streamline the process. 

In 2022, Facebook introduced a tab that lets users view a strictly reverse-chronological feed, stripped of the posts that appear only because the algorithm recommends them. But users can’t make this alternative feed their default, and even if they select the reverse-chronological option, the app reverts to the algorithmically ranked feed the next time they open Facebook.

Part of Zuckerman’s motivation in bringing the suit is that he wants to study how Unfollow Everything 2.0 affects users’ Facebook experience. He would like people using the tool to provide him with anonymized data—on a purely voluntary, opt-in basis—that would let him try to answer such questions as whether shaping their own version of Facebook makes the app less “addictive” or affects how much time they spend on the platform. (A study published in 2023 in the journal Science found that users who were moved from an algorithmically ranked feed to a reverse-chronological alternative “spent dramatically less time on Facebook and Instagram.”)

But the professor has not yet introduced Unfollow Everything 2.0, according to his lawsuit, “because of the near certainty that Meta will pursue legal action against him for doing so.” That near certainty is based on Meta’s history of shutting down research projects and tools like Zuckerman’s. 

Cease and Desist

In July 2021, the company sent a cease-and-desist letter to the British developer of the original Unfollow Everything, on which Zuckerman’s 2.0 version is modeled. Meta permanently banned the developer, Louis Barclay, from Facebook and Instagram and threatened to sue him for seeking to gain “intentional and unauthorized access to its protected computer networks.” Barclay took down his tool.

Meta has issued similar legal threats to other developers and researchers. The company in August 2021 issued a separate cease-and-desist letter to, and suspended the accounts of, researchers at New York University who had created a browser extension to study disinformation on Facebook. That tool, called Ad Observer, collected anonymized information about advertising shown to users, including how the ads were targeted. The company’s response effectively ended the NYU research project.

“Professor Zuckerman is unwilling to subject himself and his team to the risk of legal action,” according to his suit, which seeks a declaratory judgment from the district court that Unfollow Everything 2.0 would not violate Facebook’s terms of service, the federal Computer Fraud and Abuse Act, or California’s version of the federal law. Meta has not yet filed a response in court and did not respond to emails seeking comment for this article. 

The suit is not a sure winner. For one thing, the court could dismiss it based on Zuckerman’s lack of “standing,” meaning that he hasn’t yet suffered any harm. In response to such a dismissal, Zuckerman could, of course, go ahead and introduce Unfollow Everything 2.0, which might result in Meta threatening to file suit and banning him from the platform. If Meta did sue, it would presumably argue that Zuckerman intentionally violated the 1986 federal Computer Fraud and Abuse Act because his browser extension intruded on the operation of the Facebook platform without the company’s authorization. 

Deploying Section 230

But it’s clear that Zuckerman and his lawyers at Columbia’s Knight Institute want to strike first and define the terms of this battle. And for that effort, they have constructed an intriguing legal argument under Section 230 of the Communications Decency Act. 

Enacted in 1996 to protect nascent interactive internet businesses, Section 230 ordinarily is invoked by social media companies as a defense against litigation seeking to hold them responsible for harms allegedly associated with content posted on their platforms by third-party users. This defense, which has been notably effective in court, stems from Section 230(c)(1), which states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Critics of the social media industry have called for 230(c)(1) to be modified or erased so that platform companies can “be treated as the publisher or speaker” of content posted by others and thereby held liable for harm, such as defamation, caused by that content.

The Zuckerman suit points to another, rarely litigated part of 230—namely, (c)(2)(B), which immunizes from legal liability any “provider or user of an interactive computer service” for actions “taken to enable or make available to information content providers or others the technical means to restrict access to material” deemed by a provider or a user to be “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” As convoluted as this language is, it indicates that “Congress intended to promote the development of filtering tools that enable users to curate their online experiences and avoid content they would rather not see,” according to the suit. That’s what Unfollow Everything 2.0 aims to do: “Users would remain free to navigate to their friends’ profiles, but without first being presented with a feed that Facebook has designed to maximize user engagement.”

Meta doubtless will try to refute this interpretation of Section 230(c)(2)(B). For one thing, the company likely will argue that Zuckerman is not a “provider or user of an interactive computer service.” There isn’t a wealth of case law on this point, but the most relevant precedent appears to favor Zuckerman.

Ninth Circuit Precedent

In 2009, the U.S. Court of Appeals for the Ninth Circuit, which oversees the federal district court in California where Zuckerman v. Meta Platforms will be heard, ruled largely on the basis of Section 230(c)(2)(B) in favor of Kaspersky Lab, the U.S. distributor of a Russian company’s tools intended to block malicious software (malware). Kaspersky Lab had been sued by Zango, a now-defunct marketer of videos, games, and other content, for allegedly interfering with customers’ use of Zango products. Kaspersky Lab asserted that Section 230(c)(2)(B) protected it from the suit, and a federal trial judge agreed, dismissing the case.

A three-judge panel of the Ninth Circuit affirmed the dismissal after going on an arduous textual safari. The panel cited other parts of the law—230(f)(2) and (4)—to determine that the definition of “interactive computer service provider” includes “access software providers”—a category that includes “a provider of software (including client or server software), or enabling tools that … filter, screen, allow, or disallow content.” The panel concluded:

Thus, a provider of software or enabling tools that filter, screen, allow, or disallow content that the provider or user considers … objectionable may not be held liable for any action taken to make available the technical means to restrict access to that material, so long as the provider enables access by multiple users to a computer server.

The Ninth Circuit judges in Zango v. Kaspersky Lab also pointed to 230(b), in which lawmakers laid out the purposes of the law, one of which was “to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services.” That sounds a lot like the stated goal of Unfollow Everything 2.0.

A Trojan Horse?

Despite Zango v. Kaspersky, even savvy observers have expressed surprise over how the Zuckerman suit proposes to make use of Section 230(c)(2)(B). “It could be a trojan horse that no one noticed in Section 230 that effectively bars websites from taking legal action against middleware providers who are providing technical means for people to filter or screen content on their feed,” writes Mike Masnick, author of the Techdirt blog and a prominent voice on platform liability. 

By “middleware,” Masnick means software that users can install to exert greater control over their social media experience. Francis Fukuyama, a political scientist at Stanford, among others, has advocated for the creation of a competitive market of firms offering a range of middleware options that individuals could use to customize platforms like Facebook or YouTube. On his own blog, Zuckerman describes his lawsuit as a possible way to “open up a channel for developing” such a middleware marketplace.

Masnick adds an important caveat, though. Even if the Zuckerman suit survived a standing challenge and Meta’s counterarguments on the meaning of both Section 230(c)(2)(B) and the Computer Fraud and Abuse Act, the company could still put in place technical barriers to try to thwart Unfollow Everything 2.0 and the rush to create middleware services. One such barrier could be automatically banning anyone from the platform who tries to use a filtering device. “But that’s very different from threatening or filing civil suits,” Masnick notes. 

In the meantime, a court fight over Unfollow Everything 2.0 and its potential progeny could amplify in a salutary way the public debate about the possibilities for users who want to reshape their social media experiences. Therefore, no matter its result, Zuckerman v. Meta Platforms merits close attention.

Barrett is the deputy director and senior research scholar at the Center for Business and Human Rights at New York University’s Stern School of Business. He writes about the effects of technology on democracy.
