I stand with the idealized version of a company that innovates and stands up for our privacy and civil liberties. On TV we don’t often see this company by name (unless they’re a sponsor), but we know who we’re talking about when we see a piece of fruit on a laptop. The Simpsons was cutting it pretty close with “Mapple.”
The real Apple, I’m not so sure about.
These are very technical issues that I’m presenting in a colloquial way. But I think this situation is something that can be understood and thought about by anyone.
The FBI requested* Apple’s assistance in unlocking the phone of one of the San Bernardino gunmen. Specifically they asked Apple to do the following:
- Create a way for the FBI to guess the phone’s code by brute force (trying all possible codes) without having to type in the numbers or letters.
- Disable any delay countermeasures that space out the time between guesses (i.e., make it so the FBI doesn’t have to wait half an hour after five bad guesses).
- Create a program that prevents the phone from erasing its contents after a set number of bad guesses.
- Do all of these things without modifying the contents of the phone.
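Put together, those requests amount to letting a computer hammer on the lock as fast as it can. Here’s a toy sketch of that brute-force loop in Python. To be clear, the `check` function and the lockout behavior in the comments are illustrative stand-ins, not Apple’s actual code:

```python
import itertools

def try_passcodes(check, codes):
    """Brute-force every code, assuming the lockout logic is disabled."""
    for attempt, code in enumerate(codes, start=1):
        # With the countermeasures intact, runs of bad guesses trigger
        # escalating delays (minutes, then hours), and enough failures
        # could erase the phone entirely.
        if check(code):
            return code, attempt
    return None, 0

# Every four-digit PIN, delivered electronically rather than typed by hand.
pins = ("".join(digits) for digits in itertools.product("0123456789", repeat=4))

# Hypothetical phone whose passcode happens to be "1234".
found, tries = try_passcodes(lambda code: code == "1234", pins)
```

With the delays and the erase threshold switched off, the loop is limited only by how fast the phone can check a guess.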
In a letter, Apple CEO Tim Cook stated Apple’s opposition to the court order**. Their basic argument is the following:
- A program to crack an encrypted iPhone does not currently exist. If Apple writes one, there is a possibility, however small, that the program will find its way into the wild and be misused, whether by hackers or by unchecked government surveillance.
- If you write a program to crack this one iPhone, you’ve written it to crack all iPhones. In Apple’s words “The government is asking us to hack our users.”
- Complying with the order may set a dangerous precedent and allow for government overreach.
Here’s one thing Apple didn’t say: We CAN’T comply with the order.
If you’re technically inclined you should read this article from the Trail of Bits Blog. It does a good job of explaining encryption at all the various layers, what the government asked Apple to do, and how Apple could do it.
Basically there are two locks on a phone: the passcode, which the FBI is trying to break, and the phone’s hardware key (stored in different places depending on the model of phone). You need both keys to decrypt the data, but entering the right passcode unlocks the use of the hardware key. Trying to decrypt the data without these keys is basically impossible.
What Apple is saying is that they CAN write a program that will let you brute-force the first key, which gives you the second key and access to the phone. The phone’s security CAN be bypassed. According to Trail of Bits’ estimates, it would take half an hour to retrieve a four-digit PIN, a few hours to get a six-digit PIN, and up to 5.5 years to guess a six-character alphanumeric passcode. The FBI hasn’t mentioned which type of code they’re trying to crack on this particular phone.
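The arithmetic behind estimates like these is just the size of the keyspace times the cost of a guess. Here’s a back-of-the-envelope sketch, assuming roughly 80 milliseconds of key-derivation work per guess and a 36-character alphabet (digits plus lowercase letters) for the alphanumeric case. Both assumptions are mine, so treat the outputs as ballpark figures, not Trail of Bits’ exact numbers:

```python
# Rough worst-case brute-force times. 0.08s per guess is an assumed
# key-derivation delay; real per-guess cost varies by hardware.
SECONDS_PER_GUESS = 0.08

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to try every possible passcode of the given shape."""
    return alphabet_size ** length * SECONDS_PER_GUESS

for label, size, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("6-char alphanumeric", 36, 6),
]:
    secs = worst_case_seconds(size, length)
    print(f"{label}: {secs / 3600:.1f} hours "
          f"({secs / (365 * 24 * 3600):.2f} years)")
```

These are worst cases; on average you’d hit the right code about halfway through. Notice how fast the numbers blow up as the passcode gets longer: each extra character multiplies the work by the whole alphabet size.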
Is this a bad thing?
Well generally, yes.
Apple’s taking a stand that they won’t write this program because to do so would expose their phones. But the fact that it’s possible for them to comply with the order suggests that someone else could write this program and expose iPhones in the same way. Apple didn’t give any estimates on how long it would take to engineer a program like this.
Basically, if Apple cared about privacy as much as they say they do, they’d make a phone they couldn’t crack even if someone asked nicely (though that sounds a lot easier than it is).
So why doesn’t the government write this program themselves if it’s so easy to write?
They need Apple’s digital signatures and knowledge of the iOS operating system. A rogue program doesn’t just run on your phone; it needs to be verified first, and only Apple can produce something the phone will recognize as authentic.
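That “recognized as authentic” step can be sketched like this. Real iOS code signing uses asymmetric cryptography, where Apple signs with a private key the phone never holds and the phone checks with a public one, so the shared-secret HMAC below is a deliberate simplification of the verify-before-run idea (and the key is obviously a made-up stand-in):

```python
# Toy model of "the phone only runs what Apple signed."
import hashlib
import hmac

SIGNING_KEY = b"stand-in for Apple's closely guarded key"

def apple_sign(firmware: bytes) -> bytes:
    """Only someone holding the signing key can produce this value."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def phone_accepts(firmware: bytes, signature: bytes) -> bool:
    """The phone recomputes the signature and refuses anything else."""
    return hmac.compare_digest(apple_sign(firmware), signature)

official = b"legitimate iOS update"
assert phone_accepts(official, apple_sign(official))
assert not phone_accepts(b"rogue cracking tool", apple_sign(official))
```

The point of the model: it doesn’t matter how clever a rogue program is; without a valid signature, the phone simply won’t run it. That’s why the government needs Apple, and why a leaked signing key would be such a disaster.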
So it’s not so easy?
Well, here’s the problem. Apple and the government are both very security-conscious organizations, and both have been hacked. That’s why Apple says they’re worried about bringing a program like this into the world: it could always get out. So can digital signatures. And engineers can always be personally targeted. Many data breaches work at the human level, not the technological one.
What do you think?
I think this is one of the worst possible cases for Apple to have to take a stand on. This was a terrible act of terror, the phone is owned by the shooter’s employer who has agreed to let the FBI try to crack it, and it’s possible the FBI could learn about other terrorists or even future attacks from the phone’s contents (though, then again, maybe not).
Tim Cook’s tone is alarmist and a bit strident. Even if you agree that privacy is important, you probably also think that law enforcement should have some ability to get information it needs.
But privacy really is important. Sure, we want to be able to crack the bad guys’ phones. But if we create tools to crack those phones, who’s to say those tools won’t be used to crack the phones of people trying to do real good in countries with oppressive regimes?
Maybe what the FBI needs is the assistance of one Benedict Cumberbatch. He was able to guess the passcode of Irene Adler’s phone, thus removing any leverage she might have had over him. Her phone was literally “Sher-locked.” No, seriously. Check out Wikipedia if you think I’m lying.
I think Apple’s right to challenge the court order. These sorts of things shouldn’t be followed blindly without at least some public discussion of practical limits, and an understanding of the potential risks. But I don’t think either side has the clear moral high ground.
So I stand with Pear. They make a great uncrackable myPhone, even though the fruit is terrible.
PS. Thanks to Adam for his great thoughts on this issue and for starting a conversation that led to a lot of good articles.
* I’m using a nice term, it was a court order actually.
** If you read the whole letter, Apple’s pretty clear about how horrible the San Bernardino shooting is, and how they’ve made every reasonable effort to assist law enforcement.