Published by The Lawfare Institute
Our interview is with Kim Zetter, author of the best analysis to date of the weird messaging from the National Security Agency (NSA) and Cyber Command about the domestic “blind spot” or “gap” in their cybersecurity surveillance. I ask Kim whether this is a prelude to new NSA domestic surveillance authorities (definitely not, at least under this administration), why the gap can’t be filled with the broad emergency authorities under the Foreign Intelligence Surveillance Act (FISA) and the criminal intercept laws (they don’t fit, quite) and how the gap is being exploited by Russian (and soon other) cyberattackers. My most creative contribution: maybe Amazon Web Services, where most of the domestic machines are being spun up, would trade faster cooperation in targeting such machines for a break on the know-your-customer rules it may otherwise have to comply with. And if you haven’t subscribed to Kim’s (still free for now) Substack newsletter, you’re missing out.
In the news roundup, we give a lick and a promise to today’s Supreme Court decision in the fight between Oracle and Google over application programming interface copyrights, but Mark MacCarthy takes us deep on the Supreme Court’s decision cutting the heart out of most class actions for robocalling. Echoing Congressional Democrats, Mark thinks the court’s decision is too narrow. I think it’s exactly right. We both expect Congress to revisit the law soon.
Considering what a debacle the Google and Apple contact-tracing effort turned into, with a lot of help from privacy zealots, I’m pleased that Nick and I agree that the fight over vaccine credentials is a tempest in a teapot. Paper vax records are likely to be just fine most of the time. That won’t prevent privacy advocates from trying to set unrealistic and unnecessary standards for any electronic vax records system, more or less guaranteeing that it will fall of its own weight.
Speaking of unrealistic privacy advocates, Charles-Albert Helleputte explains why the much-touted General Data Protection Regulation privacy regime is grinding to a near halt as it moves from theory to practice. Needless to say, I am not surprised.
Mark and I scratch the surface of Facebook’s Fairness Flow for policing artificial intelligence bias. Like anything Facebook does, it’s attracted heavy criticism from the left, but Mark thinks it’s a useful, if limited, tool for spotting bias in machine learning algorithms. I’m half inclined to agree, but I am deeply suspicious of the confession in one “model card” that the designers of an algorithm for identifying toxic speech seem to have juiced their real-life data with what they call “synthetic data” because “real data often has disproportionate amounts of toxicity directed at specific groups.” That sure sounds as though the algorithm relying on real data wasn’t politically correct, so the researchers just made up data that fit their ideology and pretended it was real—an appalling step for scientists to take with little notice. I welcome informed contradiction.
Nick explains why there’s no serious privacy problem with the IRS subpoena to Circle, asking for the names of everyone who has more than $20,000 in cryptocurrency transactions. Short answer: everybody who doesn’t deal in cryptocurrency already has their transactions reported to the IRS without a subpoena.
Charles-Albert and I note that the EU is on the verge of finding that South Korea’s data protection standards are “adequate” by EU standards. The lesson for the U.S. and China is simple: The Europeans aren’t looking for compliance; they’re looking for assurances of compliance. As Fleetwood Mac once sang, “Tell me lies, tell me sweet little lies.”
Mark and I note the extreme enthusiasm with which the FBI used every high-tech tool to identify even people who simply trespassed in the Capitol on Jan. 6. The tech is impressive, but we suspect a backlash is coming. Nick weighs in to tell me I’m wrong when I argue that we didn’t see these tools used this way against Antifa’s 2020 rioters.
Nick thinks we haven’t paid enough attention to the Accellion breach, and I argue that companies are getting a little too comfortable with aggressive lawyering of their public messages after a breach. I predict that one result will be a new executive order imposing breach notification (and other cybersecurity) obligations on government contractors.
And Charles-Albert and I talk about the UK’s plan to take another bite out of end-to-end encryption services, essentially requiring them to show they can still protect kids from sexual exploitation without actually reading users’ texts and pictures.
Good luck with that!
You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!
The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.