
Members of the House Are on Notice: No Tweeting Deepfakes

Evelyn Douek, Quinta Jurecic, Jacob Schulz
Thursday, January 30, 2020, 3:54 PM

The House Ethics Committee has announced that members who share deepfakes or “other audio-visual distortions intended to mislead the public” could face sanctions. It’s a small but noteworthy step.

A screenshot comparing a real video of President Barack Obama with a deepfaked version, Jul. 2019 (YouTube/UC Berkeley/Reuse Allowed)

Published by The Lawfare Institute

You might have missed it amid all the sound and fury of impeachment, but it’s been a busy week for disinformation. Twitter, the Wall Street Journal reports, will start removing posts that it determines are “misleading about an election.” Elizabeth Warren’s campaign rolled out a plan on “Fighting Digital Disinformation,” including a pledge not to “knowingly use or spread false or manipulated information.” And the House Ethics Committee announced that members of the House of Representatives who share deepfakes on social media might face sanctions from the House itself. The ethics announcement attracted the least attention of all, but it’s actually an important step in creating standards for how elected representatives should use social media.

This news might seem trivial compared to the drama of the impeachment trial going on just down the street. But as technology companies and governments alike grapple with how to address the spread of online falsehoods, the committee’s memo is noteworthy. Four years after the shock to the system of 2016, everyone agrees that disinformation and misinformation are problems that need to be dealt with, but the question remains who is best positioned to accept responsibility. The House of Representatives appears to be, in some small way, beginning to take on the task.

The committee’s “pink sheet”—an advisory memorandum on House rules—alerts members of the House to the dangers of posting deepfakes on social media, warning that “manipulation of images and videos that are intended to mislead the public can harm … discourse and reflect discreditably on the House.” For this reason, disseminating “deep fakes or other audio-visual distortions intended to mislead the public” could violate the House’s Code of Official Conduct, which governs the behavior of the chamber’s members and employees.

This might sound like the committee is addressing a problem that doesn’t exist yet. As far as we know, there have not been any cases in which a member of Congress—or any other prominent American political figure—has tweeted a genuine deepfake, meaning doctored audio or video generated through machine learning that can produce extremely lifelike and misleading results. After all, deepfakes—though concerning—just aren’t all that common in politics (yet).

Politicians have, however, published plenty of what the memorandum describes as “other audio-visual distortions”—that is, photos or video deceptively manipulated in a less sophisticated manner than a deepfake. Sometimes the manipulation is obvious: President Trump recently tweeted a picture altered to depict him putting a Medal of Honor around the neck of a dog that played a role in the raid on Islamic State leader Abu Bakr al-Baghdadi (the original photo showed the president presenting the medal to a Vietnam War medic). But U.S. political figures have published more deceptive images, too. Three days after the strike that killed Iranian general Qassem Soleimani, Rep. Paul Gosar tweeted a photo appearing to show President Obama shaking hands with Iranian President Hassan Rouhani; it took 40 minutes before he acknowledged that the picture was actually a fake, a doctored version of a shot from a 2011 meeting between Obama and then-Indian Prime Minister Manmohan Singh.

Evelyn Douek is an Assistant Professor of Law at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. She holds a doctorate from Harvard Law School on the topic of private and public regulation of online speech. Prior to attending HLS, Evelyn was an Associate (clerk) to the Honourable Chief Justice Susan Kiefel of the High Court of Australia. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.
Quinta Jurecic is a fellow in Governance Studies at the Brookings Institution and a senior editor at Lawfare. She previously served as Lawfare's managing editor and as an editorial writer for the Washington Post.
Jacob Schulz is the former managing editor of Lawfare. He previously served as deputy managing editor and associate editor. He hails from Pennsylvania and attended Amherst College.
