Apple on the ropes but comes out swinging

In late February, the news broke that Apple is to contest a US court order requiring it to create a ‘backdoor’ (a method of circumventing software security measures) to help the FBI unlock the iPhone used by one of the perpetrators of the San Bernardino terrorist attack. The FBI wants to access the device because it believes the phone may hold important information relating to the attack, for example whether other people were involved, or whether there were other planned targets. The phone had been regularly backed up to Apple’s iCloud service, so the FBI was able to obtain some data from the iCloud servers. However, the last back-up was made on 19 October 2015, after which point the FBI believes the back-up function was intentionally disabled to ensure that important data could not be recovered.

In seeking the order, the FBI identified two particular iPhone security features that it wants to bypass. Firstly, if an incorrect passcode is entered 10 times in a row, all data is automatically wiped from the device. Secondly, iPhone passcodes must be entered manually, which prevents a ‘brute force’ attack on an iPhone’s passcode (i.e. an automated method of generating a large number of consecutive passcode attempts). Apple’s concern is that it is, in effect, being forced to create a new operating system which would weaken the security of its devices. If that software fell into the wrong hands, Apple’s customers would be at risk.
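The interplay of those two features can be made concrete with a back-of-the-envelope calculation. The sketch below is purely illustrative: the passcode lengths and the assumed time per guess are hypothetical figures chosen for the example, not Apple’s published parameters (only the 10-attempt wipe limit comes from the article).

```python
# Illustrative only: why automated brute-forcing is feasible without the two
# protections, and hopeless with them. The 80 ms per attempt is an assumption.

def expected_attempts(digits: int) -> int:
    """On average, half of the numeric passcode space must be tried."""
    return 10 ** digits // 2

def brute_force_hours(digits: int, seconds_per_attempt: float = 0.08) -> float:
    """Expected wall-clock time for an automated guessing attack."""
    return expected_attempts(digits) * seconds_per_attempt / 3600

def wipe_success_probability(digits: int, max_attempts: int = 10) -> float:
    """Chance of guessing correctly before the 10-attempt wipe triggers."""
    return max_attempts / 10 ** digits

if __name__ == "__main__":
    # Without manual entry or the wipe, even a 6-digit code falls in hours.
    print(f"4 digits: ~{brute_force_hours(4):.2f} h expected")
    print(f"6 digits: ~{brute_force_hours(6):.2f} h expected")
    # With the wipe enabled, an attacker gets only 10 tries in total.
    print(f"6 digits, wipe on: {wipe_success_probability(6):.5%} chance")
```

On these assumed numbers, an automated attack cracks a 6-digit passcode in roughly half a day, which is why the FBI wants both protections removed; with the wipe feature left on, the odds of success collapse to one in a hundred thousand.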

Interestingly, in an unrelated case concerning drug offences, a different US court has since refused to grant an order compelling Apple to give the FBI access to a locked iPhone. As the decisions were given by different courts and arose from different circumstances (in particular, different offences), neither decision is binding precedent for the other. The FBI’s applications in both cases were founded on a very old piece of US legislation called the All Writs Act (AWA), which dates back to 1789 and gives federal courts the power to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” So, rather a wide jurisdiction then.

From a legal perspective, that is part of the problem. The AWA was introduced into US law for a particular purpose – to ensure that the (then) newly created US government and courts had the power to govern effectively and to introduce measures not covered by existing laws. The AWA was certainly not intended to be used in the manner the FBI is now seeking to use it. Supporters of Apple argue that the order obtained by the FBI was wrongly granted because it is not “agreeable to the usages and principles of law”, as it potentially cuts across privacy laws. Further, the AWA only permits orders that are “necessary or appropriate”. Guidance on when the AWA jurisdiction should be exercised was handed down by the US Supreme Court in 1977, which held that an order must not place an “unreasonable burden” on its subject. Apple will no doubt argue that the order obtained by the FBI falls foul of this requirement, as it is being forced to create a way to hack its own devices, putting customers at risk. Moreover, if the FBI wants such powers, they should be explicitly mandated in legislation passed by Congress.

It should be noted, however, that court orders requiring the provision of mobile phone data under the AWA are not unusual. Apple has previously complied with orders requiring it to retrieve data from iPhones and hand that data to the FBI. However, those orders related to earlier versions of iOS. This, it seems, is different. More recent versions of iOS have robust security which, according to Apple, it cannot currently override. So, Apple is not being ordered merely to provide data. It is being ordered to write entirely new software which it considers “too dangerous to create”. That is where Apple is drawing the line. While Apple might be able to keep the backdoor secret on this one occasion, once created it would set a precedent and inevitably lead to a multitude of similar requests. At that point, the backdoor would have to be used on so many occasions and on so many devices that it might be hard for Apple to guarantee its safety.

Apple’s position is interesting from a UK law perspective, not least because Apple is also lobbying for changes to the UK’s Investigatory Powers Bill, which the government is trying to rush through Parliament. The Investigatory Powers Bill would give the UK government and security services the right to require tech companies to provide just the sort of assistance that Apple is trying to resist in the US. In particular, under the proposed legislation, tech companies could be required to assist in investigating devices by altering the way their devices work or by providing backdoors to circumvent security settings.

However, even if the Investigatory Powers Bill gets through Parliament, tech companies might be able to rely on a proviso in the draft legislation to resist requests for assistance. The section in question states that a ‘technology provider’ would not be required to implement measures which ‘are not reasonably practicable for it to take’. The scope of this qualification is not clear and would need to be interpreted and applied by the courts on a case-by-case basis. However, tech companies could theoretically rely upon it where providing assistance would mean incurring disproportionate costs or (if it can be proved) creating real and potentially damaging vulnerabilities in their products which would pose significant risks to consumers.

Apple’s challenge, both here and in the US, highlights the tension between ensuring that security services have the powers they need to protect national security, and putting consumers at risk by forcing tech companies to compromise their own security measures. What happens next will define how far the US and UK governments and courts are prepared to go in order to combat terrorism.

Jeremy Harris, Partner, IP & Technology Disputes at Kemp Little

Image Credit: Shutterstock/ymgerman