In light of Apple’s response to the FBI’s request to gain access to San Bernardino shooter Syed Farook’s iPhone, I thought I would share some of my thoughts on this. There appears to be some confusion connecting this specific FBI request with the bigger government debate over encryption and backdoors.
Let me attempt to break this down a little in the hopes of clearing some of that confusion:
Apple has positioned the request from the FBI to be a request to install a “backdoor” in their product. This is not correct. The FBI request is pretty specific and is not asking for a universal key or backdoor to Apple products.
The FBI request should be interpreted as a lawful request to Apple to help construct a forensics recovery tool for a specific product with a unique serial number.
The phone in question is an Apple iPhone 5C, and the method of access requested by the FBI is actually an exploitation of a security vulnerability in this (older) product. The vulnerability does not exist in the current generation of Apple iPhones.
It is easy to argue that the vector of exploitation the FBI is asking for is a known security vulnerability, one that Apple subsequently patched in later releases of the product. Whether the initial vulnerability was intentional (in which case the 5C already has a “backdoor”) or was an omission that was later corrected (i.e. a security weakness) could be debated. Regardless, it is a security flaw that does not exist in subsequent iPhone versions.
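To see why this particular flaw matters, consider that the device is protected by a short numeric passcode, and the platform’s real defense is the limit on failed attempts (after which the device erases itself). The toy Python sketch below is purely illustrative, not Apple’s firmware or the FBI’s tool; the function names and the 10-attempt threshold are assumptions for the example. It shows that once the retry limit is disabled, a 4-digit PIN falls to brute force almost trivially.

```python
def brute_force_pin(check, digits=4, wipe_protection=True, wipe_threshold=10):
    """Try every possible PIN against a checker function.

    If wipe_protection is on, give up once the failed-attempt
    threshold is reached (the device would have erased itself).
    Returns (pin, attempts) on success, or None.
    """
    attempts = 0
    for n in range(10 ** digits):
        candidate = f"{n:0{digits}d}"
        if wipe_protection and attempts >= wipe_threshold:
            return None  # auto-erase would have triggered
        attempts += 1
        if check(candidate):
            return candidate, attempts
    return None

# Hypothetical secret PIN, standing in for the device passcode.
check = lambda pin: pin == "7391"

# With the erase limit in force, brute force fails almost immediately.
print(brute_force_pin(check, wipe_protection=True))    # None

# With the limit disabled, a 4-digit PIN is found within 10,000 tries.
print(brute_force_pin(check, wipe_protection=False))   # ('7391', 7392)
```

The entire value of the protection lies in that small retry counter, which is why a firmware change that removes it amounts to defeating the lock.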
As Apple has stated previously, it will and frequently does work with law enforcement agencies around the world to provide assistance with legal cases, and it has developed numerous forensics tools and platforms to aid lawful investigation. This current FBI request is in line with the scope of those prior requests.
Apple likely fears that complying with this request (creating a custom patch for a vulnerable phone) will open the door to subsequent law enforcement requests for support in investigations of similarly vulnerable (old) iPhones. Given the nature of the vulnerability, there is no “universal key” approach, and each legal request would likely require substantial involvement from Apple. This does not appear to scale well and could be financially demanding.
The current generation of iPhones does not (apparently) allow for this vector of attack, and there are no known ways of recovering data from them when the built-in protection mechanisms are used.
The bigger debate over whether technology manufacturers should install backdoors into their products for lawful investigation is a much larger and more demanding discussion. In particular, weakening encryption or installing a backdoor on devices simply makes it easier for criminals to exploit them: there is no guarantee that such “secrets” could be kept, and once uncovered, they would expose the “keys to the kingdom.”
I’m concerned that because Apple has framed its denial of the FBI request around “backdoors,” the repercussions for the entire security industry could be extensive should it lose this legal argument. If the denial were phrased in terms of exploiting a (previously fixed) vulnerability, the prospect of a lost appeal would be greatly limited, and the arguments against government backdoors and law enforcement keys could be made in their correct context, instead of being bound up in appeals over anti-terrorism and a specific instance of a horrific crime.
The debate is largely moot anyway. Even if vendors were required to install backdoors or include recoverable keys in the encryption they use, there is a near-endless number of applications and software add-ons that users can install to render those backdoors irrelevant. Even if U.S. companies were legally required to provide backdoor keys to their hardware or software, there are plenty of countries around the world with companies only too eager to supply the add-on tools to protect consumer and corporate data.
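The point about user-side encryption can be made concrete with a toy sketch. The one-time pad below is an illustration I’ve chosen for simplicity, not any specific product: if the user encrypts data with a key the vendor never sees before it ever touches the vendor’s platform, then a backdoor into that platform yields only ciphertext.

```python
import secrets


def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a fresh random key of equal length.

    The key stays with the user; the platform stores only the ciphertext.
    """
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key


def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))


message = b"meet at noon"
ciphertext, user_key = otp_encrypt(message)

# A backdoor into the platform exposes only `ciphertext`; without
# `user_key`, the plaintext is information-theoretically unrecoverable.
recovered = otp_decrypt(ciphertext, user_key)
```

Real tools would use a standard cipher rather than a one-time pad, but the layering principle is the same: a mandated backdoor at the vendor level cannot reach data the user encrypted independently.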