One of the more interesting debates in the privacy and security community these days is the ongoing battle between Apple and the FBI. The big question being asked is, are we opening Pandora’s iPhone by asking Tim Cook and Apple to open the backdoor to the iPhone?
The media has reported on this endlessly over the past month, and it was the hottest topic during the highly touted RSA Conference. But there are conflicting views on what the so-called battle is actually about. So without taking sides, let’s run a play-by-play of what’s happened so far.
The part that everyone knows – the FBI is looking to gain access to the phone of Syed Farook, one of the San Bernardino shooters.
Judge Sheri Pym of the Federal District Court in Central California issued a court order asking Apple to modify the iOS on Farook’s iPhone, creating a “backdoor” for the FBI. Typically on iPhones, after 10 wrong passcode guesses, the phone will wipe the symmetric encryption key — the key that sits between the storage and the CPU and provides access to the contents of the phone.
The court ordered Apple to assist the FBI by disabling the 10-wrong-guesses lockout. Again, this is part of the software and can theoretically be changed. There would still be an 80 millisecond hardware-enforced delay per attempt to slow down brute force attacks. Additionally, the FBI is seeking an electronic method of inputting passcode guesses. This would basically allow them to brute force their way into the phone, instead of having some intern sit there and guess (e.g. 0000, 0001, 0002, and so on).
In layman’s terms, the FBI wants to hook Farook’s phone up to a brute force generator at 80 milliseconds per guess, without the downside of potentially having the phone wiped after 10 wrong guesses.
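To see why that 80 millisecond delay matters, here is a quick back-of-the-envelope calculation — a sketch that assumes only the per-guess delay the article mentions and no wipe-after-10-failures limit:

```python
# Worst-case time to brute-force a numeric passcode, assuming an
# 80 ms hardware-enforced delay per guess and no 10-failure wipe.
DELAY_S = 0.080  # 80 milliseconds per attempt

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every possible passcode of the given length."""
    attempts = 10 ** digits           # e.g. 10,000 codes for 4 digits
    return attempts * DELAY_S / 3600

print(f"4-digit: {worst_case_hours(4):.2f} hours")  # ~0.22 hours (about 13 minutes)
print(f"6-digit: {worst_case_hours(6):.2f} hours")  # ~22.22 hours
```

With the wipe disabled, a 4-digit passcode falls in minutes; even a 6-digit one falls in about a day. That is why disabling the lockout is the whole ballgame.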
Of course, we all know Apple’s response – “No!” Apple has stated that they will do everything they can to fight this.
Now that we have looked at what the FBI wants and what the court order says, let’s clear up the confusion on what the current technology of the iPhone says. People are using loosely defined terms. “Backdoor” has become kind of a catchall phrase when talking about access to encrypted devices.
An iPhone periodically checks for updates. The iPhone sends its unique device ID and a randomly generated nonce (one-time code) to Apple. If Apple has an update to send to the iPhone, it bundles the device ID and the nonce with the update package. Apple then signs the bundle with its super-secret private key and pushes it back to the phone. The phone verifies that the signature is correct and that the device ID and nonce both match.
So, why does all of this matter? Well, this means that every single update is customized. And Apple does this for a reason. Apple wants to prevent an older version of iOS from being cross installed and allowing a downgrade attack. This would allow an attacker to recreate old flaws in the iOS that are widely known to exist in earlier versions, but are now fixed in current versions.
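The scheme described above can be sketched in a few lines. This is a simplified illustration, not Apple’s actual protocol: an HMAC stands in for Apple’s real public-key signature (in the real system the phone holds only Apple’s public key, not the signing secret), and all names are hypothetical.

```python
import hashlib
import hmac
import os

# Stand-in for Apple's private signing key (the real system uses
# asymmetric signatures; HMAC is used here only to keep the sketch short).
APPLE_SECRET = os.urandom(32)

def sign_update(device_id: bytes, nonce: bytes, update: bytes) -> bytes:
    """Apple's side: bind the update to one device and one request."""
    return hmac.new(APPLE_SECRET, device_id + nonce + update,
                    hashlib.sha256).digest()

def verify_update(device_id: bytes, nonce: bytes,
                  update: bytes, tag: bytes) -> bool:
    """The phone's side: accept only a package signed for this
    exact device ID and this exact nonce."""
    expected = hmac.new(APPLE_SECRET, device_id + nonce + update,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

payload = b"iOS update payload"
nonce = os.urandom(16)
tag = sign_update(b"DEVICE-A", nonce, payload)

print(verify_update(b"DEVICE-A", nonce, payload, tag))   # True: matches
print(verify_update(b"DEVICE-B", nonce, payload, tag))   # False: wrong device
print(verify_update(b"DEVICE-A", os.urandom(16), payload, tag))  # False: replayed nonce
```

Because the signature covers the device ID and the one-time nonce, the same signed package cannot be installed on a second phone or replayed later on the same phone — which is exactly the property that blocks downgrade attacks.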
In exchange, Apple has accepted the burden of not being able to mass-distribute any of its iOS updates: every package must be signed per device.
What Does This Mean?
So the fundamental question in the original Apple vs. FBI debate… Can Apple respond uniquely to this singular request and provide the FBI, either in their facility or remotely, with a piece of software that answers the court’s demands and is not reusable ever again, not even on the same iPhone?
The answer is yes. They can do just that. The update-signing mechanism described above already ties every software package to a single device, so the technology is sound: Apple has the ability to open this single phone and only this phone.
Apple has filed a formal response to the FBI request. One section beautifully states their position on the matter. Again in layman’s terms:
Apple recognizes the struggle between the needs of law enforcement and the privacy interests of the public. They think the FBI has taken the wrong direction by bringing the matter into a public forum. Apple acknowledges the FBI’s request to make a brute force attack easier and calls the solution a backdoor to the iPhone. A backdoor would mean that criminals and foreign agents would have a way to access other iPhones.
Apple opposes the government’s stance that this is a one-time deal and points to many other cases seeking to have phones unlocked. Further, Apple says this is just the beginning and the floodgates would open. They point to the government potentially overstepping other privacy boundaries as well, such as turning on the microphone or activating the video camera on iPhones.
Where We Stand Now
The reality is that the iPhone in question probably doesn’t hold any valuable information. It was Farook’s business phone, provided by his employer, the county. He destroyed his personal phone; that’s gone. And the iPhone the FBI wants to access hadn’t been backed up in the six weeks prior to the incident.
The FBI actually went against Apple’s recommendation and had the county reset the iCloud password. Had the password not been reset, the phone would have backed up to iCloud automatically once plugged in on a trusted Wi-Fi network.
Given what the court order is asking and what Apple’s technology is capable of, this one request sounds doable at face value. But the larger battle is really over the precedent this case creates.
Other law enforcement agencies are lined up, eagerly waiting for an FBI victory, so they can access other Apple devices in their investigations. Then the question presents itself of foreign governments requesting access as well. Apple sells iPhones in China and must adhere to Chinese law. What happens if the Chinese government sees what’s happening and sends a truckload of phones to Cupertino, California for Apple to unlock?
The FBI picked the perfect case to fight. When the government throws the word ‘terrorism’ around, it packs a punch like the right arm of Mike Tyson. However, Apple is working diligently to make sure that future versions of iOS don’t run into this problem.
Apple’s position is that a backdoor in future encryption technologies would cripple U.S. businesses like Apple and Google and compromise the privacy protections and security of consumers. The people who really want full encryption solutions will still be able to get it. There are hundreds of encryption solutions outside of the U.S., and they are free. Bad guys with something to hide will still use full encryption.
And that is Apple’s point, and why they are fighting. We need to buckle up, because this will be an up-and-down roller coaster until the very end… which may be in the Supreme Court.