Security versus Privacy – which wins out?

It’s a situation that many of us would never have thought we would see – technology giants putting aside their differences to support a competitor fully and openly. But that’s exactly what has happened in the San Bernardino case, with Microsoft, Google, WhatsApp and Facebook backing Apple in its fight not to help the FBI break into an iPhone recovered during the massacre last December.

Apple chief executive Tim Cook has said that rewriting code to assist the authorities would “undermine the very freedoms and liberty our government is meant to protect”, while the Reform Government Surveillance lobby group issued a statement that “technology companies should not be required to build in backdoors to the technologies that keep their users’ information secure” – a statement that was subsequently tweeted by Microsoft’s chief legal officer and CEO.

The ethical dilemma is real

The whole debate comes down to the basic idea of which we ought to value most: security or privacy? It started with Edward Snowden’s notorious whistleblowing on the NSA’s gathering of phone records. Government authorities argue that they must have the ability to access customer data in cases affecting national security; technology companies, on the other hand, claim that this would be the start of a slippery slope towards unacceptable intrusion into individuals’ privacy.

The ethical dilemma is real. However, when it comes down to it, vendors and technology firms have a responsibility to protect their customers’ data. Insider attacks and remote hacking alike are on the rise, and information security organisations must do whatever it takes to ensure businesses and their customers are safe.

In the case of Apple versus the FBI, the government essentially asked Apple to create a key that could unlock any Apple device. Although there was good intention behind the request, Tim Cook himself said: “you can’t have a backdoor that’s only for good guys”. So who’s to say this backdoor wouldn’t be misused in the days, weeks or months to follow, with potentially catastrophic consequences if it fell into the wrong hands?

When we put it like that, the stark reality of the dangers of the data ending up in the wrong hands pushes us to choose security over privacy.

Severe betrayal of trust

This is something that we, and all technology companies, have to take a strong stance on as an industry.

We have our own secure operations centre. If we were to grant the police or security services unfettered access to it whenever they asked, without the proper legal framework being followed, how could we claim to offer our clients genuinely secure data storage? It would be a severe betrayal of trust. Any keys generated by our systems are owned by the user who generated them, and it is that user’s responsibility to disclose them if compelled under the Regulation of Investigatory Powers Act (RIPA) 2000. Our responsibility is to architect our systems so that we do not have access to key material, meaning we are not even in a position to disclose it.
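The principle described above – keys generated and held by the user, with the provider storing only data it cannot read – can be sketched in a few lines of Python. This is purely illustrative and not Imerja’s actual system: the class and function names are hypothetical, and the XOR routine is a toy stand-in for a real cipher such as AES, used only to show the separation of roles.

```python
import secrets


def generate_user_key(length: int = 32) -> bytes:
    # The key is created and held by the user; it is never sent to the provider.
    return secrets.token_bytes(length)


def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR stream for illustration only -- NOT secure cryptography.
    # XOR is its own inverse, so the same call also decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


class StorageProvider:
    """Hypothetical provider: stores opaque blobs and holds no key material."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, user_id: str, blob: bytes) -> None:
        self._blobs[user_id] = blob

    def get(self, user_id: str) -> bytes:
        return self._blobs[user_id]


# The user encrypts locally; only ciphertext ever reaches the provider.
key = generate_user_key()
provider = StorageProvider()
provider.put("alice", toy_encrypt(key, b"confidential record"))

# Even under legal compulsion, the provider can disclose only ciphertext;
# recovering the plaintext requires the user's key.
recovered = toy_encrypt(key, provider.get("alice"))
```

In this model, a disclosure order served on the provider yields nothing readable; only the user, as key owner, is in a position to comply.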

With this in mind, perhaps the answer is for the FBI to seek better support in developing its own technical expertise, rather than seeking to compel a vendor to do its job for it.

Matt Hampton, chief technology officer at Imerja