After months of negotiations, this week saw the European Commission (EC) announce a replacement for Safe Harbour, which was declared invalid in October 2015.
The new framework, dubbed the EU-US Privacy Shield, has been put in place to protect the rights of Europeans when their data is transferred to the United States and ensure legal certainty for businesses.
So what implications does this have for businesses on both sides of the Atlantic? And what do businesses need to do now to ensure they comply with this new framework? We speak to industry experts to get their views.
A responsibility for data sovereignty
Deema Freij, global privacy officer at Intralinks, feels data sharing can’t be taken for granted any more. “Companies and their cloud providers are more responsible than ever for data sovereignty, and this responsibility is only going to increase when the GDPR is adopted, leaving organisations with a two-year time limit to comply. The penalties for wrongdoing are well-publicised and severe for companies that fail to adapt to the new data privacy landscape.”
“At the moment, businesses have switched - or are switching - to other legal solutions so they are able to transfer personal data to the US, in a bid to avoid any issues arising from the Court of Justice of the European Union’s (CJEU) decision invalidating Safe Harbour 1.0. Those legal solutions include the EU-prescribed Model Clauses. Now, if organisations choose to stay on these model clauses, nothing will change, and they can still use them to support data transfers globally. Model clauses work for all data transfers – not exclusively for transfers of personal data to the US – but they are admin-heavy.
“Alternatively, they can use Safe Harbour 2.0 as a means of transferring personal data from the European Economic Area (EEA) to the US - and it won’t be as much of an administrative burden. Model clauses will still be needed for any other data transfers outside of the EEA, however.”
Issues with self-regulation
Safe Harbour has historically been a self-regulated framework and David Mount, director, security solutions consulting EMEA, at Micro Focus, believes this is one of the core issues with any alternative.
“Historically, companies have proved their compliance with the agreement by ticking a box stating that the company adheres to the principles of Safe Harbour and has adequate controls in place. There are some fundamental issues with this, since self-certification does not foster trust and transparency – in fact, it does the opposite.
“It’s important to create more transparency around what data is being stored, what can be shared and what the purpose of this is, but levels of trust are always going to be low in a self-regulated environment. It will be interesting to see how negotiations have addressed the arguably conflicting ideas of trust and self-certification, and whether there is any other way to effectively police data sharing when there is so much data and so many parties involved.”
The movement of data
But what about managing the data itself?
“Meeting the requirements of EU data privacy standards is extremely challenging at the best of times, let alone when the goalposts are constantly being moved,” said Richard Shaw, senior director, field technical operations, EMEA at MapR. “Fundamentally, though, how an organisation is able to adapt to this is largely dependent on how it manages its data.
“The reality is that the sheer volume of data an organisation of even modest proportions generates these days is staggering. This means that the only way to effectively provide the US authorities with the information they demand, in a way that complies with all mandated regulatory requirements, is by automating governance processes around the management, control and analysis of data. Compliance protocols can be embedded into the system, guarding against nefarious intervention by rogue elements. Without this level of management and control over data, the task becomes a manual effort that’s simply not fit for purpose.”
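To make the idea of “compliance protocols embedded into the system” concrete, here is a minimal, purely illustrative sketch of an automated policy gate that a data pipeline could call before releasing records for transfer. The field names and legal-basis labels are hypothetical, not drawn from any specific product:

```python
# Illustrative only: an automated gate that runs on every transfer,
# rather than relying on manual, after-the-fact audits.

ALLOWED_FIELDS = {"order_id", "country", "timestamp"}   # non-personal data
PERSONAL_FIELDS = {"name", "email", "ip_address"}       # need a legal basis

def release_for_transfer(record, legal_basis=None):
    """Return a copy of the record that is safe to transfer.

    Personal fields are dropped unless a documented legal basis
    (e.g. 'privacy-shield' or 'model-clauses') accompanies the transfer.
    """
    cleared = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    if legal_basis in {"privacy-shield", "model-clauses"}:
        cleared.update(
            {k: v for k, v in record.items() if k in PERSONAL_FIELDS}
        )
        cleared["_legal_basis"] = legal_basis  # recorded for the audit trail
    return cleared
```

Because the check runs in code on every record, the compliance decision and its audit trail scale with data volume in a way a manual review cannot.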
Access to real-time traffic patterns
However, just addressing the challenges of a new framework from a data residence perspective is incomplete at best, said Dave Allen, SVP & General Counsel at Dyn.
“Businesses need to understand that the actual paths data travels are also a very important factor to consider, and in many ways a more complex problem given the constraints that come with the cross-border routing of data across several sovereign states,” said Allen.
“While there is no silver bullet for compliance with the emerging regulatory regimes that govern data flows, visibility into routing paths along the open Internet and private networks needs to be seriously considered by businesses that rely on the global Internet to serve their customers. In this era of emerging geographic restrictions, having access to traffic patterns in real time, along with geo-location information, provides a much more complete solution to the challenges posed by the EU-US Privacy Shield framework.”
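The kind of routing visibility Allen describes can be sketched in a few lines. This is a toy illustration only: the hop-to-country table stands in for live geo-IP data, and the addresses are hypothetical; a real deployment would combine traceroute, BGP and geo-location feeds.

```python
# Toy sketch: flag hops on a route that fall outside approved jurisdictions.

HOP_COUNTRY = {            # hypothetical geo-location lookup
    "10.0.0.1": "DE",
    "80.81.192.1": "DE",
    "195.66.224.1": "GB",
    "129.250.2.1": "US",
}

def offending_hops(path, allowed_countries):
    """Return the hops on a traceroute-style path whose location is
    unknown or outside the set of allowed jurisdictions."""
    return [
        hop for hop in path
        if HOP_COUNTRY.get(hop, "unknown") not in allowed_countries
    ]
```

Run continuously against observed paths, a check like this turns “where does our data actually travel?” from a one-off audit question into a monitorable signal.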
Encryption and on-premise keys
Security should also be a consideration here. In light of the stronger obligations, safeguards and transparency of data brought about by the EU-US Privacy Shield, Peter Galvin, Senior VP of Strategy at Thales e-Security, notes that techniques such as encryption will ensure information is protected, regardless of its location.
“Robust encryption ensures the safety and security of data wherever it is in the world, allowing organisations to leverage cloud-based infrastructures while ensuring the safety of their sensitive data,” said Galvin.
“Crucial to this encryption process is effective key management. By ensuring they keep their ‘keys’ on premise – or by allowing them to ‘bring their own keys’ stored safely in a hardware security module (HSM) – organisations hosting protected data in the cloud will be able to take control of their data, no longer needing to worry about external decisions influencing their policies.”
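The on-premise-keys pattern Galvin describes is commonly implemented as envelope encryption: each object is encrypted under its own data key, and only the *wrapped* data key travels with the ciphertext, while the key-encryption key (KEK) never leaves the organisation (in practice it would sit in an HSM). A minimal sketch, using the third-party `cryptography` package’s Fernet recipe rather than any specific vendor’s API:

```python
# Illustrative envelope-encryption sketch; requires `pip install cryptography`.
from cryptography.fernet import Fernet

kek = Fernet.generate_key()  # key-encryption key: stays on premise / in an HSM

def encrypt_for_cloud(plaintext):
    data_key = Fernet.generate_key()                 # fresh key per object
    ciphertext = Fernet(data_key).encrypt(plaintext)
    wrapped_key = Fernet(kek).encrypt(data_key)      # wrapped by the local KEK
    return ciphertext, wrapped_key                   # both safe to store off-site

def decrypt_on_premise(ciphertext, wrapped_key):
    data_key = Fernet(kek).decrypt(wrapped_key)      # only works with the KEK
    return Fernet(data_key).decrypt(ciphertext)
```

The design point is the one Galvin makes: the cloud provider holds ciphertext and wrapped keys only, so decisions made outside the organisation cannot expose the data without the on-premise KEK.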