There is currently a national debate going on in the United States about security vs. security. Some are calling it privacy vs. security, but that’s not the real issue, as I’ll get to shortly. In the wake of the San Bernardino shooting, the FBI has gone to court and demanded, via the All Writs Act, that Apple create a special insecure version of iOS, the iPhone’s operating system.
As with the naming of the core problem (privacy or security), we need to understand exactly what the FBI is asking for here. Many media outlets continue to report this as the FBI asking Apple to decrypt or unlock the phone. That is not the request: they are asking Apple to create a special version of the software, load it onto the phone, and then the FBI will brute-force the passcode. In the past Apple has cooperated with the government to retrieve customer data pursuant to a warrant, but it has never fundamentally weakened iPhone security at the request of the US or any other government.
I usually try to avoid blogging about political issues as well as national security ones, because they’re complicated and the state does indeed have a legitimate interest in security activities as long as they’re constitutionally supported. In this particular case the press has been all over the place and frequently misreporting the basic facts, so I feel it’s important to keep people informed so they can make better decisions. On the other hand, a lot of people are making wild claims about what the government is actually asking for. There are some very interesting issues at play in this particular drama – let’s take a look.
Articles such as today’s editorial in The New York Times are written by people who are either ignorant of the technical details or willfully misleading the public. They refer to “unlocking” the iPhone and “using the front door”. In both cases the phrasing is designed to mislead by suggesting there is no downside. A more accurate description is that the FBI is asking Apple to make sure the front door is easy to break into, which of course wouldn’t sell well with the public.
As to the ramifications of the issue, The Verge noted:
The FBI has a locked phone and they want it to be unlocked. Getting there will mean building some dangerous tools and setting some troubling precedents, but that’s basically what the bureau wants.
Privacy vs Security
This issue has been framed in the media as privacy vs. security which is misleading.
The security vs. privacy debate is an interesting and worthwhile topic, and I’ll certainly dive into it at a future date, but the dispute between the FBI and Apple isn’t that debate at all. Make no mistake: this isn’t about the privacy of the data on a particular iPhone, it’s about the security of all iPhones. The government is specifically not asking for the data; they’re asking Apple to make the phone less secure.
John Adams, a former head of security at Twitter noted today in the LA Times:
“They try to use the horrors of the world to erode civil liberties and privacy, but the greater good — having encryption, more privacy for more people — is always going to trump small isolated incidents.”
Some have noted that the FBI doesn’t want Apple’s encryption keys, which is true. What they want is for Apple to make it easier to brute-force the login.
“In practice, encryption isn’t usually defeated by cryptographic attacks anyway. Instead, it’s defeated by attacking something around the encryption: taking advantage of humans’ preference for picking bad passwords, tricking people into entering their passwords and then stealing them, that kind of thing. Accordingly, the FBI is asking for Apple’s assistance with the scheme’s weak spot—not the encryption itself but Apple-coded limits to the PIN input system.”
In other words, let’s take the weakest link in phone security and, rather than make it stronger, make it weaker. Again, this is my point that this IS about security, not privacy. When someone gets into your phone, they get everything – passwords, bank credentials, personal private information that can be leveraged, and so on. Are we seriously arguing that that’s OK? Consider how many smartphones and identities are stolen every day.
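To make the brute-force point concrete, here’s a back-of-the-envelope sketch. The numbers are my own illustrative assumptions, not anything from the court filings: a 4-digit numeric PIN, and roughly 80 ms per guess for the hardware-tied key derivation (a figure Apple has published for iOS, but treat it as approximate).

```python
# Why the software-enforced retry limits are the thing that matters:
# without them, the PIN keyspace is trivially small.

ATTEMPT_TIME_S = 0.08   # assumed hardware key-derivation cost per guess
pin_space = 10 ** 4     # 10,000 possible 4-digit PINs

worst_case_s = pin_space * ATTEMPT_TIME_S
print(f"Worst case with no software limits: {worst_case_s / 60:.1f} minutes")
# → Worst case with no software limits: 13.3 minutes

# With Apple's escalating delays and the optional erase-after-10-failures
# setting, brute force is effectively off the table. Remove those limits,
# which is what the FBI's request amounts to, and it's minutes of work.
```

The hardware cost per guess is fixed; everything that makes the phone resistant to guessing is the software layer the FBI wants weakened.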
Chances of dying in a road accident: 1 in 25,000,000.
Chances of dying in a terrorist attack worldwide: 1 in 9,300,000 (if you live in the US, the chances are even lower).
Chances of having your phone stolen: about 1 in 42.
Chances of being a victim of some form of identity theft: about 1 in 14.
So if you’re worried about something that will actually happen, you should hope that Apple comes out on top in this case. Identity theft affects about 15 million Americans each year and smartphone theft affects about 3 million Americans each year.
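Taking the quoted odds at face value, the gap is worth spelling out:

```python
# The article's figures, expressed as "1 in N" odds.
terror_n = 9_300_000   # dying in a terrorist attack, worldwide
phone_theft_n = 42     # having your phone stolen
id_theft_n = 14        # some form of identity theft

print(f"Phone theft is {terror_n / phone_theft_n:,.0f}x more likely than a terror death")
print(f"Identity theft is {terror_n / id_theft_n:,.0f}x more likely than a terror death")
# → Phone theft is 221,429x more likely than a terror death
# → Identity theft is 664,286x more likely than a terror death
```

Five orders of magnitude: that is the asymmetry between the risk the FBI’s request addresses and the risks it would aggravate.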
Some have suggested that the FBI has asked for a backdoor. That’s an interesting topic we could spend hours on, just trying to define what a backdoor is. For the moment, let’s just say that whether or not it’s a backdoor, it certainly weakens the security of the device, specifically to make it vulnerable to brute-force attacks.
There’s another way
Let’s begin by understanding that this is NOT about this particular phone or incident. The government has chewed the data to death on this one and doesn’t really expect to find anything on this phone. If it were just about this phone, there would be other ways.
First of all, this phone happened not to be a personal phone, but one owned by Farook’s employer. Ironically, the employer is a government agency. The agency had mobile device management (MDM) software, but it was either not installed on the phone or not configured correctly. Had it been in use, this whole issue would be moot: the MDM software would have allowed the county to access the phone and mandate a configuration that met its security needs.
Next, another misstep: sometime in the first 24 hours after the incident, the county reset the iCloud password associated with the phone. Had this not been done, investigators could have taken the phone to a previously known network, such as Farook’s home or office, and tried to trigger an iCloud auto-backup.
Using law enforcement mistakes as a reason to weaken phone security is a poor argument. The better one is to make sure law enforcement knows how to handle phones, when to get manufacturers involved, and so on. This request to rewrite the firmware is so extreme that Apple said:
The Apple executive also made a point of saying that no other government—not even China or Russia—has ever asked what American prosecutors have asked the company to do this week.
Offers and advice on accessing the phone have come from many directions. For example, noted cybersecurity figure John McAfee has offered to break in for free. He believes he can accomplish it within a couple of weeks without fundamentally weakening the smartphone infrastructure.
Snowden pointed to a novel technique called de-capping, which uses acid and lasers to read the memory chip directly.
These offers have not been accepted because the FBI isn’t all that interested in what’s on this phone. They believe the under-informed will side with them in this case, allowing them to set a precedent. The government claims this won’t set a precedent, but of course it will; already people are saying “Apple has cooperated before, why not now?” The whole reason the government is going after this phone IS to create a precedent – they don’t really expect to find anything useful on the phone.
Others have noted the ramifications: weaknesses specifically built into a device at the government’s request will of course be requested again in the future, both by the US and by other governments, including those we may find objectionable. In fact, as I wrote this, it came out that the Justice Department is already asking Apple to unlock 12 other phones. At this point we can pretty much put the “no precedent” argument to bed.
There are those arguing this would help prevent an attack. Note that this isn’t the position of the FBI, but of some in the Senate who are trying to kill encryption. This case is specifically about having access to the phone after you have a warrant; it would not have prevented this attack.
It’s not about finding out who committed the attack; we know that as well. It’s not about finding out who they communicated with; that comes from email and phone logs and doesn’t require the phone. Really it’s just a red herring that lets the anti-encryption crowd further their agenda. Under the guise of making us safer, they’d make us all less safe, since, as I’ve noted above, the chances of suffering identity theft or a stolen phone are MUCH MUCH higher than the chances of being caught in any terror attack.
Legal and International Impact
If we choose to go down the path that the FBI is demanding, we need to think about the ramifications of this approach. I’ll break down the personal security and privacy vs police interests in the near future. For the moment we can set them aside and focus on what the impact could be.
I have to ask why the government is prescribing the “how” rather than the “what”. By this I mean that if they want the phone’s data, the request should ask for the data. Of course, they could go to others like McAfee and at least try to get the phone opened. But it isn’t about the phone; it’s about the precedent. That’s why they’ve prescribed how they want Apple to respond. The bigger legal question is whether the government actually has the right to force a software vendor to write a specific piece of software.
If the government succeeds in their request, what will it mean overall? Can the government go after other security features, eventually asking Apple to backdoor their encryption methods as well? Why just Apple, why not all other smartphone vendors as well? And your desktop computer too.
And if the US government can force this, what about foreign governments? Will our security policies end up being defined by oppressive regimes? Some say that it’s about human rights because of how oppressive regimes handle their spying.
I know we all hate hearing the slippery slope argument, but in this case there is actually very little upside in the FBI demand and a whole lot of downside.
An Unreported Security Vulnerability
There’s one more issue here that scares me as a security professional: the fact that a new version of the OS can be loaded onto a locked phone at all. This is certainly a problematic issue from a device-security perspective, and I wonder whether Apple will be plugging this hole in the near future.
There is some security around this method because the new software needs to be digitally signed by Apple, but this is certainly an attack surface that bad actors will be taking a more in-depth look at. What if this build of iOS gets out in the wild somehow? Why wouldn’t people try to steal it? Can people try to figure out how to push their own OTA onto a device? How hard is it to bypass the Apple signature?
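To illustrate the accept/reject decision that signature requirement enforces, here’s a deliberately simplified sketch. Real iOS firmware validation uses Apple’s asymmetric keys plus per-device personalization; the HMAC below is merely a stand-in for “only Apple can produce a valid signature”, and all names here are hypothetical.

```python
import hashlib
import hmac

# Stand-in for Apple's private signing key; in reality this is an
# asymmetric key pair, and only the public half lives on the device.
SIGNING_KEY = b"stand-in for Apple's signing key"

def sign_firmware(image: bytes) -> bytes:
    """What only the vendor can do: produce a valid signature."""
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """What the locked device does at load time: verify before booting."""
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"iOS build with retry limits intact"
sig = sign_firmware(official)
print(device_accepts(official, sig))                    # True
print(device_accepts(b"attacker-modified build", sig))  # False
```

The check itself is sound; the worry is what happens if a signed, deliberately weakened build exists at all, since the signature would then validate exactly the software an attacker wants.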
I’m certain future versions of iOS will take into account everything learned during this case. As always, we can expect jailbreaks to keep getting more difficult as Apple tightens its security.
What’s the answer?
I need to reiterate that I recognize the legitimate need of law enforcement to investigate crimes. But the public also has a legitimate interest in protecting its information, finances, and property. We have to ask ourselves whether making everyone’s phone less secure is the best way to achieve greater security.
[Update 2016-02-24 16:45 – More info has become available since this morning. An article in today’s New York Times discusses how Apple is apparently planning to fix the update security loophole I mentioned above. Not surprising, since it’s definitely something a bad actor could also use.
There is also an article at TechDirt explaining how demanding that Apple circumvent security technology may actually violate the Communications Assistance for Law Enforcement Act. So stay tuned on that front as well.]
[Update 2016-02-26 13:15 – San Bernardino police chief Jarrod Burguan gave an interview to NPR in which he admitted there is probably nothing useful on the phone. He said:
I’ll be honest with you, I think that there is a reasonably good chance that there is nothing of any value on the phone
which pretty much shows you what I was saying – it’s not about the phone.]
[Update 2016-02-26 15:30 – Bloomberg just reported on a secret government memo detailing how the government is trying to find ways around device encryption – despite wide reports that it’s interested only in this one phone and not in setting a precedent.]