Some familiar voices took Apple's side last month in its fight against the FBI-prompted court order demanding that it hack into the iPhone used by one of the shooters in the San Bernardino, California attacks.
Facebook, Amazon, Google, Microsoft, Cisco Systems, Box, Dropbox and others submitted briefs last week to the U.S. District Court for the Central District of California, challenging several legal facets of the government’s case.
On one hand, this is a surprise, because these technology giants usually go their own way, steering clear of acknowledging - never mind supporting - each other. On the other hand, the actions taken by these companies are a clear indication that every single one of them finds common ground on the issue of security. The underlying tone is this: “If it can happen to Apple, it can happen to anyone, including us.” And it’s really not much of a stretch to think that way. While the events leading to this point had a strong Big Brother component to them, the core problem is not about this phone in this situation. It’s about every piece of technology in every situation. When you think about the disparate technologies the companies listed above offer - ecommerce, document storage, personal computing, email, and more - you can start to see why this might be disturbing to them.
Companies like Apple - and all of the others listed above - take security seriously. The Snowden revelations and the recent financial and healthcare hacks have put everyone on high alert about security vulnerabilities. Our mobile phones are no longer just personal devices; they’re also used heavily for business work (including viewing files on storage sites like Dropbox and Box), ecommerce transactions (Amazon and other retail giants), and calendars and email (Google), all rolled into one tiny device.
Apple’s security is built into its operating system (OS). After the Snowden revelations, Apple made significant changes to its software, ensuring that the data on its iPhone and iPad devices is accessible only to someone who has the device passcode. The devices use strong AES encryption with a device-unique 256-bit key that is “tangled” with the user’s passcode. That device key is not available to Apple and is secured within the phone with no way to export it, making decryption of the data without the passcode effectively impossible. Apple deliberately designed its iOS security so that no one - not even Apple itself - could decrypt it. These protections also include deterrents against guessing the user’s passcode, such as escalating delays between failed attempts. The court order, sought by the FBI, demanded that Apple create a custom version of iOS that would eliminate these passcode-deterrent safeguards.
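To make the “tangling” idea concrete, here is a minimal sketch in Python of how a key-derivation function can entangle a passcode with a non-exportable device key. This is illustrative only: Apple’s actual scheme runs inside hardware with its own iteration and key-hierarchy design, and the names and parameters below (the simulated device key, the iteration count) are assumptions, not Apple’s implementation.

```python
import hashlib
import secrets

# Stand-in for the fused, non-exportable 256-bit device-unique key.
# On a real device this never leaves the hardware.
device_uid = secrets.token_bytes(32)

passcode = "1234".encode()

# Derive the data-protection key by mixing the passcode with the device
# key via PBKDF2. Because the device key only exists on this phone, any
# brute-force attempt must run on the device itself.
derived_key = hashlib.pbkdf2_hmac("sha256", passcode, device_uid, 200_000)

print(len(derived_key))  # 32 bytes, i.e. a 256-bit key
```

The consequence the article describes falls out of this construction: without the device-bound key, even a correct passcode guess made off-device derives nothing useful, which is why the FBI needed Apple’s cooperation rather than simply copying the flash storage.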
The industry reaction to the court order is two-fold. On one hand, it’s a response to “what’s next.” It’s easy to make and defend this type of lawful-intercept request in the name of “public safety.” The problem is in the precedent it creates. If Apple is forced to comply this time, what happens the next time, and the time after that? Every one of those companies could be next, asked to make their products less secure despite the efforts they have made to make them more secure. If the U.S. government is successful, it sets a precedent for other countries as well. Asking companies to purposefully weaken their technology - to create a backdoor, so to speak - goes against everything their security engineering stands for. And if this order is enforced, every company can look down the road and imagine a scenario where the U.S. government - or any government, for that matter - could compel technology companies like theirs to deploy software over-the-air (OTA) that would reveal a wealth of information about the user, such as location, personal data, and so on.
It’s also a strong statement about a government’s right to encroach on its citizens’ privacy, and about companies’ right to create secure solutions and devices. On one hand, the government has created regulations like HIPAA and Sarbanes-Oxley to protect the privacy of citizens, and the industry has dutifully complied. Now it is asking for the opposite: to make personal data less secure, at least in certain instances. And it is at the government’s discretion to decide what those special circumstances are.
Data privacy legislation is already appearing in the United States and the U.K., and it’s causing some businesses to consider things like jurisdictional advantage as a core business strategy. More legislation isn’t necessarily the answer, as society - and businesses - as a whole want information to be more secure.
On the other side of the equation is the “key under the mat” argument. What happens if the custom code Apple writes to crack this particular iPhone is used again? Once created, this hack can be replicated and reused over and over, and not necessarily just by the good guys. If Apple makes its software less secure, how is that a good thing? The same holds true for every company joining Apple in saying “no.” Building in a backdoor undermines businesses and the security of transactions, and it also means that governments are failing to see the importance of personal security.
Apple has already started the process of making its devices more secure with the introduction of the Secure Enclave in A7-based devices. The Secure Enclave is a separate coprocessor that isolates key management and security safeguards from the main OS, making it even harder to access encrypted data or mount brute-force attacks.
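The brute-force protections mentioned above can be modeled with a short sketch. This is a hypothetical illustration of the general pattern - escalating delays after repeated failures, with an optional erase after ten attempts - not Apple’s code; the specific delay values and the `try_passcode` helper are invented for the example.

```python
import time

# Hypothetical escalating delays (in seconds) after the Nth failed attempt.
# Early failures incur no delay; later ones wait progressively longer.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600}

def try_passcode(guess: str, actual: str, failures: int):
    """Model one unlock attempt. Returns (unlocked, failures, wiped)."""
    if guess == actual:
        return True, 0, False
    failures += 1
    if failures >= 10:
        # Models the optional "erase data after 10 failed attempts" setting.
        return False, failures, True
    time.sleep(DELAYS.get(failures, 0))  # enforce the escalating delay
    return False, failures, False
```

Even this toy version shows why the safeguards matter: with a device-enforced delay schedule and a wipe threshold, a four-digit passcode cannot simply be exhausted by trying all 10,000 combinations, which is precisely what the requested custom iOS would have changed.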
Apple took a strong stand for itself and for all of us, that our private data should remain private. And now, it’s no longer standing alone.
Chris Peel, VP of Engineering at Echoworx