In particular, the Federal Bureau of Investigation wants Apple's help in unlocking two phones belonging to Mohammed Saeed Alshamrani, the Saudi Air Force trainee who killed three people last month at Pensacola Naval Air Station. Apple says it has turned over all the data it possesses but refuses to go further and create a backdoor past the encryption that protects its devices.
Presumably the issue is headed for the courts.
We've traveled this road before - and the path is instructive. In 2016, the FBI demanded that Apple develop special software that would allow it to unlock an iPhone 5C used by Syed Rizwan Farook, one of two shooters in a terror attack that killed 14 people in San Bernardino. When Apple refused, the government obtained a court order. Most of big tech weighed in on Apple's side. Before the company's appeal could be heard, however, the FBI surprised everybody with the announcement that it had unlocked Farook's phone.
The DOJ's inspector general later found that the FBI had not exhausted all possibilities before taking Apple to court. In particular - and it's important to follow the rabbit down the hole here - the FBI's Cryptographic and Electronic Analysis Unit had not asked for the assistance of the Remote Operations Unit of the Technical Surveillance Section of its own Operational Technology Division.
This mouthful of alphabet-soup matters because, as it turns out, the head of the Remote Operations Unit knew of a "vendor" that was "almost 90% of the way" to finding a way to break into a locked iPhone. Upon learning of this, the department invited the vendor to demonstrate the capability. The next day, the suit against Apple was dropped.
Presumably in-house communications have been better this time around. Even so, one can understand why the FBI is back to asking Apple for help. Back in 2016, techies agreed that whatever trick the DOJ used would work only once. Apple would find out how the unnamed vendor broke the encryption, and close that vulnerability in the next generation of phones. Besides, Farook's device was an iPhone 5C. As far as security goes, that's practically the horse-and-buggy days. Quite likely, then, none of those alphabet-soup players have yet figured out how to defeat the encryption on the newer devices.
Why does Apple continue to resist? And why do so many of us, notwithstanding our fears about terrorism, think Apple is right?
Here's one reason: The company does not currently have a means of breaking into a locked phone. Forced to develop one, Apple would most likely create a software update that, once sent to the device, would allow the phone to be unlocked through some means other than a password (or facial recognition or fingerprint).
But the mere existence of such a technology is inconsistent with the basis on which the phone is sold. The company proudly trumpets its own inability to recover data from a locked iPhone once the user has exhausted 10 tries at entering the password. The value of this encryption is priced into the device.
Even if we assume that the value of this feature to the consumer is quite small - perhaps no more than one percent of the sale price - the total value is quite considerable to Apple. In the 12 months ended September 28, 2019, Apple's total revenue from selling iPhones was a bit over $142 billion. Thus a one percent security premium would come to $1.4 billion - not pocket change even for a company whose market cap is currently thirteen figures.
Even if the value of the encryption to the buyer is only one half of one percent of the price of the phone, the loss to Apple is $700 million. If, on the other hand, you think the value of the security component is greater than one percent - very much my own suspicion - well, you can do the arithmetic.
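The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope estimate using the column's own figures (roughly $142 billion in iPhone revenue for fiscal 2019); the function name and the particular percentages tried are illustrative, not anything Apple reports:

```python
# Rough estimate of the "security premium" embedded in iPhone revenue,
# assuming the premium scales linearly with the fraction of the sale
# price that buyers attribute to encryption.

IPHONE_REVENUE = 142e9  # fiscal-2019 iPhone revenue, a bit over $142 billion


def security_premium(fraction):
    """Dollar value of the encryption feature if it accounts for
    `fraction` of the sale price (and hence of revenue)."""
    return IPHONE_REVENUE * fraction


for pct in (0.005, 0.01, 0.02):
    print(f"{pct:.1%} of revenue -> ${security_premium(pct) / 1e9:.2f} billion")
```

At one half of one percent the premium is about $0.71 billion, matching the $700 million figure above; at one percent it is about $1.42 billion.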
In any case, lots of users are attracted to the notion that Apple does not possess any secret way into the iPhone. (I certainly am.) The government, aware of this concern, insists it's not asking Apple to create a backdoor; it only seeks a way to extract all the data on a pair of phones. This bizarre bit of linguistic legerdemain is meaningless.
To borrow from one of my mentors, you can call it Thucydides or you can call it banana peel, but it's a backdoor all the same. Whatever the label, software that enables recovery of data without the password would mean a lot less privacy for users.
Still, perhaps you're wary of absolutes; maybe you believe that in a particular case, the need to prevent crimes - particularly acts of terrorism - should outweigh the individual's right to privacy. Fair enough. But do ask yourself this: Does history teach that the federal government, once in possession of a surveillance tool, will remain discreet and humble in its use?
Sadly, the record isn't good. That's why we're back here again. And why this time around, the fight will likely be to the finish.