Apple vs. the FBI: The son of the Clipper Chip?

The news reports about the lawsuit between Apple and the FBI over a terrorist’s iPhone are fraught with misinformation and security theater. The case has been characterized as everything from privacy’s last stand to the tech industry’s gift to criminals around the world. I assume you have already read something about it, so I will start by providing two documents that you may not have links to. Both of them pre-date the Apple case.

First is the Keys Under Doormats paper, written by more than a dozen security researchers and published in July 2015. The paper does a very good job of laying out the issues involved in decrypting our modern computing devices. Many of these researchers were involved in the Clipper Chip debates of the mid-1990s, when the government last tried to force its way into our devices.

While you should read the entire paper, here are some highlights. The paper concludes that “the damage that could be caused by law enforcement exceptional access requirements would be even greater today than it would have been 20 years ago.” The authors also say that providing such access would be “unworkable in practice, raise enormous legal and ethical questions, and would undo progress on security at a time when Internet vulnerabilities are causing extreme economic harm.”

The second document was written by the Manhattan District Attorney’s office last November. It deals exclusively with whole-disk smartphone encryption, which is the central issue in the case. It contains a proposal that device vendors be required to unlock any phone in response to a search warrant, along with justifications and technical questions for both Apple and Google. Again, the entire document is worth reading, but one statistic stands out: between September 2014 and September 2015, the DA’s office was unable to execute approximately 111 search warrants for smartphones because those devices were running iOS 8, which encrypts its contents by default. The DA claims this feature benefits criminals and imperils the safety of us all.
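To see why default encryption stymies these warrants, consider how a disk key is derived from a passcode. Here is a minimal Python sketch of the general PBKDF2 approach; it is an illustration only, not Apple’s actual design, which also entangles the passcode with a unique hardware key so that the derivation (and any brute-force attempt) can only run on the phone itself.

    # Illustrative only: stretch a short passcode into a 256-bit disk key.
    import hashlib
    import os

    def derive_disk_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

    salt = os.urandom(16)                     # stored in the clear on the device
    disk_key = derive_disk_key("4821", salt)
    print(disk_key.hex())                     # unrecoverable without the passcode

Lose the passcode and you lose the key, and with it the data. That unrecoverability is exactly the property the DA’s office objects to.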

Okay, here are some of my own thoughts.

Is the DA’s proposal a backdoor way around encryption? The government and law enforcement officials say no. I would disagree, and say that their proposal is probably better characterized as a side door. Having a way inside an encrypted disk compromises the disk’s security, no matter how it is done or who holds the keys.
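To make the side door concrete, here is a toy Python model of key escrow, using the third-party cryptography package. The names and the scheme are mine, deliberately simplified and not any vendor’s real design: the per-device disk key is stored twice, once wrapped under the owner’s key and once under an escrow key held by the vendor.

    # Toy key escrow: one disk key, two ways in.  (pip install cryptography)
    from cryptography.fernet import Fernet

    disk_key = Fernet.generate_key()      # encrypts everything on the device
    user_key = Fernet.generate_key()      # stands in for the owner's passcode key
    escrow_key = Fernet.generate_key()    # held by the vendor, for every device

    wrapped_for_user = Fernet(user_key).encrypt(disk_key)
    wrapped_for_escrow = Fernet(escrow_key).encrypt(disk_key)

    # The owner can unlock the disk...
    assert Fernet(user_key).decrypt(wrapped_for_user) == disk_key
    # ...but so can anyone holding the escrow key: a single point of failure.
    assert Fernet(escrow_key).decrypt(wrapped_for_escrow) == disk_key

However well the escrow key is guarded, it is one secret whose theft opens every device at once, which is the Doormats authors’ point.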

Shouldn’t Apple, Google et al. want to cooperate with law enforcement? Sure they should, but in a way that doesn’t threaten the overall security of everyone else. I side with the “doormats” folks on this one. The issue, as they put it, is that “Law enforcement cannot be guaranteed access without creating serious risk that criminal intruders will gain the same access.”

Is this really about one phone? The FBI initially stated that it was interested in only a single iPhone, and then later changed its story. The FBI is being somewhat disingenuous here: if Apple develops the technology to break into this phone, that technology will certainly be used in numerous other cases. The FBI carefully picked a test case involving a known criminal, a terrorist, to make its request more sympathetic to the courts and the public.

Don’t encryption tools benefit criminals? Many of us say that we have nothing to hide. Perhaps that is true, but why should citizens have their phones compromised by others who are either less sanguine about their right to privacy or who are trying to gain access for illegal purposes? Sure, gaining access to encrypted information isn’t easy: you might have read how the FBI arrested Ross Ulbricht for his activities with Silk Road. But that’s the whole point. The FBI got around the various encryption protocols he was using by seizing his open laptop at a public library in San Francisco, preventing him from closing his session, so that agents could verify his identity and gain access to his files.

Why can’t corporate IT departments use mobile device management tools to open employees’ phones for the law? Indeed, this is sort of what happened in the San Bernardino case. However, the shooter’s employer, the county health department, had only partially installed the MobileIron MDM tool. Because it wasn’t completely implemented, investigators couldn’t get all the information out of the phone. Certainly many IT managers who have heard about this now recognize the value of MDM, and perhaps they will finish their own installations as a result. But plenty of phones that land in law enforcement’s hands will be in a similar state: do we really want to pass legislation to compel IT workers to do their jobs properly? And just because I have a personally owned phone that is managed by an MDM doesn’t mean that IT can obtain any information from it.
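For the curious, here is a rough Python sketch of the kind of command an MDM server queues to unlock a managed iPhone, based on my reading of Apple’s MDM protocol; all of the server-side plumbing (enrollment, push notifications, check-ins) is omitted, so treat the details as illustrative. The point it captures: the device hands its UnlockToken to the server only when enrollment completes, so a half-finished rollout leaves the server with nothing to unlock with.

    # Sketch of an Apple MDM "ClearPasscode" command payload (illustrative).
    import plistlib
    import uuid

    def build_clear_passcode_command(unlock_token: bytes) -> bytes:
        command = {
            "CommandUUID": str(uuid.uuid4()),
            "Command": {
                "RequestType": "ClearPasscode",
                # The UnlockToken is escrowed by the server when enrollment
                # completes. No completed enrollment, no token, no unlock.
                "UnlockToken": unlock_token,
            },
        }
        return plistlib.dumps(command)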

6 thoughts on “Apple vs. the FBI: The son of the Clipper Chip?”

  1. I have been trying to figure out how Apple is going to get into this phone.
    Here’s why:
    1) They say they don’t want to create a backdoor (side door in your terminology)
    2) Which implies there currently is NO backdoor
    3) Doesn’t that mean that they are going to somehow have to hack into the phone just like anyone else – law enforcement, law abiding, or not? What makes it any easier for Apple to hack in than the FBI hiring independent hackers?

  2. Encryption that is so strong that a government cannot break it is a problem for governments, but a benefit and protection to private citizens and businesses. The government even mandates in many cases that private data be kept private, and one of the best ways of doing so is with encryption. From a free speech and self-incrimination perspective, if someone decides to keep something private via encryption, then there should not be a built-in hole so that a government or corporation could access it at will. Would you trust your banking data to something like that? If you want to pay with Apple Pay, would you want there to be a known security hole that someone might figure out how to exploit to get at your banking info?

    We cannot afford deliberate insecurity. We are bad enough at plugging the security holes we already have.

    One of the key principles of Mobile Device Management is that it can be used to keep corporate data on phones private, even securely erasing the phone if need be. But Mobile Device Management involves giving up personal privacy for corporate manageability.

    The FBI might only need to look as far as the applications on the phone or the carrier. Quite a few of the applications on your phone (and your PC, especially with Windows 10) have likely been granted all sorts of permissions to contact the “mother ship” and data will be stored there as well.

    As long as your device is unlocked, it may be accessible by others.

    • Well, if one has a phone owned by and issued by an enterprise or corporation, what expectation should one have that personal data will be secure and private? None whatsoever! Corporate policy should make this crystal clear to anyone issued a phone. With mobile phones being so inexpensive these days, and with at least a few very inexpensive mobile plans, there is no excuse for someone to mix his or her personal data with that of the company. Better to simply carry a second mobile phone for personal use.

  3. David, I read this piece while in Israel, and again today. It remains one of the most lucid explanations of the whole ball of wax involving device encryption, privacy, security, backdoors, and no doors. Maybe you should go before Congress and explain it all to those dolts? ;>)
