Published by The Lawfare Institute
We’ve heard this story before: A terrorist takes a legally purchased gun and goes on a shooting spree. After the shooter’s demise, the FBI is left holding an encrypted iPhone belonging to the attacker, which it then tries to compel Apple to unlock.
This happened in 2015-2016 after the terror attack in San Bernardino. And now it’s happening again following the attack by a Saudi military officer training in Pensacola, Florida. The officer died in a shootout after killing three service members and wounding eight, and federal investigators are now seeking to unlock two iPhones belonging to him.
But there’s an important difference between San Bernardino and Pensacola. After the 2015 attack, Apple was in a position to help the FBI—it just refused to. (The bureau was eventually able to unlock the phone without the company’s help.) This time, though, it is technically impossible for Apple to assist.
First, some background. The iPhone, even an older model like the iPhone 5, has impressive password-based security. Within the central processing unit (CPU) itself is a hardware encryption engine, designed to encrypt and decrypt memory quickly. This encryption engine can use multiple keys, one of which is particularly special.
This special “device key” is a random key generated by the phone during its final assembly: The phone itself writes it into its CPU. Once written, even the phone itself can’t see this key. Instead, the phone tells the hardware encryption engine to use the device key when the need arises. Newer phones do all this in the “secure enclave,” a separate processor within the main chip, but the process is effectively the same whichever model you’re using.
The device key is used when you go to unlock your phone. When you enter your passcode, the CPU combines it with the device key by repeatedly encrypting the passcode. After roughly a tenth of a second, the resulting random-looking value is used to decrypt the master user key, which protects all personal data on the phone. If you enter the wrong passcode, the phone can't decrypt the master key and asks you to try again.
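The derivation described above can be sketched in a few lines. This is a conceptual illustration only: the real engine performs iterated AES in hardware, while this sketch stands in stdlib PBKDF2 (iterated HMAC-SHA256) for the repeated encryption, with the device key playing the role of the salt. All names and the iteration count are illustrative assumptions.

```python
import hashlib
import os

# Stand-in for the random key the phone burns into its CPU at assembly.
# On a real iPhone this key is never readable, even by the phone itself.
DEVICE_KEY = os.urandom(32)

def derive_unlock_key(passcode: str, iterations: int = 100_000) -> bytes:
    # Tangle the passcode with the device key by iterating a keyed hash;
    # on the phone, the iteration count is tuned so one attempt takes
    # roughly a tenth of a second on its hardware.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_KEY, iterations)

# A wrong passcode yields a completely different key, so the master
# user key simply fails to decrypt; no "wrong password" flag is stored.
assert derive_unlock_key("123456") != derive_unlock_key("123457")
```

Because the device key never leaves the chip, this derivation can only be run on the phone itself, which is what forces forensic tools to guess passcodes at the phone's own (deliberately slow) pace.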
The FBI has already purchased at least one of two forensic tools for accessing locked iPhones: GrayKey and Cellebrite. Both work by first compromising the phone's operating system, either the main operating system (in the case of an iPhone 5) or the secure enclave (in the case of more modern phones). The software then programs the processor to keep trying passcodes at roughly 10 per second. At this rate, the software should unlock a phone protected by a six-digit PIN in roughly a day. However, if the user chose a really strong password, GrayKey or Cellebrite might need to keep working until long after the sun burns out.
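The arithmetic behind those estimates is straightforward. A short back-of-the-envelope calculation, using the article's figure of 10 guesses per second (the rate and password sizes below are assumptions for illustration):

```python
RATE = 10  # passcode guesses per second, per the article's estimate

def worst_case_days(keyspace: int) -> float:
    """Days needed to try every passcode in the keyspace at RATE guesses/sec."""
    return keyspace / RATE / 86_400  # 86,400 seconds in a day

six_digit_pin = 10 ** 6        # 000000 through 999999
print(worst_case_days(six_digit_pin))    # about 1.16 days, matching "roughly a day"

strong_password = 62 ** 10     # 10 random upper/lower/digit characters
print(worst_case_days(strong_password))  # ~9.7e11 days, billions of years
```

This is why the tools' effectiveness depends entirely on the user's passcode choice: a short numeric PIN falls in a day, while a modest random password pushes the search past any useful timescale.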
In the San Bernardino case, the FBI wanted Apple to provide a custom version of the operating system that would allow the FBI to continuously try passwords, providing the functionality of GrayKey—nothing more—without the need for a preexisting vulnerability to compromise the operating system. Apple strongly objected, arguing that creating a malicious operating system update was simply too dangerous a precedent. After the FBI managed to access the iPhone without Apple’s assistance, Apple decided to prevent such a request from ever happening again.
Now, in order to update the iPhone, you have to do one of two things: erase the phone back to factory specifications, meaning that all the data will be wiped, or already know the passcode. The strategy is best described as, “To rekey the lock, you must first unlock the lock.” Absent a vulnerability, the phone’s operating system update path is locked out even to Apple.
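The rule Apple adopted can be captured in a few lines. This is a hypothetical sketch of the policy the article describes, not Apple's actual implementation; the function and parameter names are invented for illustration:

```python
def can_install_update(passcode_verified: bool, erase_first: bool) -> tuple:
    """Returns (update_allowed, user_data_preserved) under the
    "to rekey the lock, you must first unlock the lock" policy."""
    if passcode_verified:
        return (True, True)    # owner authorized: update installs, data intact
    if erase_first:
        return (True, False)   # factory reset: update installs, all data wiped
    return (False, True)       # neither: the update path is simply locked out

# A phone the FBI can't unlock and won't wipe cannot accept new software.
assert can_install_update(False, False) == (False, True)
```

Either branch that allows an update is useless to an investigator: one requires the passcode being sought, and the other destroys the evidence being sought.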
In the Pensacola case, there is at least one vulnerability for GrayKey and Cellebrite to use: the shooter's phones are known to have an unpatchable flaw (checkm8) of exactly the type these tools need, and GrayKey appears to work even on the latest iPhones. That leaves the FBI's request in a bind. Absent a vulnerability, Apple can't do anything to install new software on a locked phone to help the bureau. And with a vulnerability, Cellebrite and GrayKey already do what the FBI wanted "San Bernardino iOS" to do.
With this in mind, the FBI’s request for Apple’s help is puzzling. If the government takes Apple to court, as it did in the San Bernardino case, the judiciary can’t compel Apple to provide any meaningful assistance—because there is no more meaningful assistance Apple can provide. San Bernardino was about Apple not wanting to assist the FBI. This is about Apple being incapable of assisting the FBI.
So even if the case winds up in court, real help from Apple is not going to come out of a judicial battle. The meaningful fight is in the court of public opinion.