Apple has been ordered by a US court to help the FBI gain access to data on an iPhone belonging to San Bernardino gunman Syed Rizwan Farook. Farook and his wife killed 14 people in the California city late last year before being shot dead by police.
The FBI says the phone contains information crucial to the investigation, and that it needs Apple’s help to unlock it. Data on Apple devices is encrypted by default - and has been since September 2014 - which means no one other than the device owner can access it. And that includes Apple itself.
The encryption offered by Apple is great news for anyone who values their privacy, but not so great for law enforcement agencies. Once a device is locked, the only way to unlock it is by entering the correct passcode, and the data will be erased after ten incorrect attempts have been made.
The FBI wants access to the contents of Farook’s iPhone and is asking Apple to do two things:
- Make changes to Farook’s device, so it’s possible to make unlimited attempts at unlocking it.
- Make it possible for the FBI to "brute force" attack the phone to speed up the time it will take to find the correct unlock code.
Farook used a four-digit PIN, which means there are 10,000 possible combinations.
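To see why lifting the attempt limit matters, here is a minimal Python sketch of the brute-force idea. It is purely illustrative: the `check_pin` callback is a hypothetical stand-in for the device's real passcode check, which is not accessible like this in practice.

```python
from itertools import product

def brute_force_pin(check_pin):
    """Try every four-digit PIN from 0000 to 9999 until one succeeds.

    check_pin: a callable that returns True for the correct PIN.
    Returns the PIN that succeeded, or None if none matched.
    """
    for digits in product("0123456789", repeat=4):
        pin = "".join(digits)
        if check_pin(pin):
            return pin
    return None

# Hypothetical stand-in for the device's passcode check.
secret = "7294"
found = brute_force_pin(lambda pin: pin == secret)
print(found)  # prints "7294" after at most 10,000 guesses
```

With no limit on attempts and no artificial delay between guesses, exhausting all 10,000 combinations is trivial - which is exactly why the erase-after-ten-failures safeguard exists.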
Apple says it will contest the court order, and Tim Cook has written an open letter to customers explaining the company’s stance.
Apple is right to reject this court order, because what is at stake is too valuable to lose. The government is essentially asking Apple to eliminate a crucial feature of iPhone security, and create a master key that can unlock any Apple device.
The government wants us to trust that it will only use this power for good - to protect its citizens from the bad guys - but there’s no way this backdoor won’t be misused and abused.
As my colleague Mark Wilson said in the newsroom earlier today: "There would need to be very, very clear guidelines in place before Apple should consider allowing encryption circumvention. It would set a very dangerous precedent if Apple just bent over and took it. Should encryption be breakable for 'known' terrorists? 'Suspected' terrorists? Anyone who has been arrested? Who decides what counts as 'terrorism'? It's not as simple as saying 'that man did a bad thing, he loses the rights other people have'."
It's a slippery slope. It starts off with the FBI asking to view the data on a killer’s phone, and ends with any law enforcement agency being able to snoop on anyone’s phone. And once there’s a backdoor in place, what’s to stop bad guys using it? As Tim Cook said last November following calls from the government to weaken encryption, "You can't have a backdoor that's only for the good guys".
The Information Technology Industry Council (ITI) summed up the situation beautifully around the same time, saying: "Encryption is a security tool we rely on every day to stop criminals from draining our bank accounts, to shield our cars and airplanes from being taken over by malicious hacks, and to otherwise preserve our security and safety.
"We deeply appreciate law enforcement’s and the national security community’s work to protect us, but weakening encryption or creating backdoors to encrypted devices and data for use by the good guys would actually create vulnerabilities to be exploited by the bad guys, which would almost certainly cause serious physical and financial harm across our society and our economy. Weakening security with the aim of advancing security simply does not make sense."
This is a hugely important battle for Apple, and one that - for all our sakes - it desperately needs to win.