Several months after GDPR came into effect, it has already begun to make its mark. The most extensive restructuring of privacy rules in twenty years has triggered a sharp rise in complaints from consumers, breach notifications from companies and hand-wringing all around. However, the expansion of data protection laws – including but not limited to GDPR – will have been a failure if the private sector’s reaction is mere technical compliance with those laws. Instead, the robust data protection laws being passed throughout the world will only truly be successful when companies have internalised the values behind them. To do so, companies need to adopt tools that help them become allies to consumers and regulators, allowing them to bridge the gaps between three sets of interests: the understandable concerns of consumers, the reasonable goals of regulators and the legitimate data usage needs of companies.
The push vs. the pull
Unfortunately, consumers, regulators and companies currently appear to be stuck in an adversarial relationship when it comes to the use of personal data. Consumers are increasingly – and justifiably – concerned about how their personal data is being used. They know they’re giving up troves of personal data in order to access the products and services they want, and they are demanding that the companies providing those products and services act responsibly in using their data. They are doing whatever they can to “pull” companies towards more responsible use of their data – threatening to complain to their local regulator if companies fail to act, which can result in fines and sanctions for the offending companies and, at a minimum, the tarnishing of their brands.
Regulators (and legislators), for their part, are struggling both to address these consumer concerns and to avoid stifling the innovation and delivery of the products and services that those very same consumers want. To do so, they are “pushing” companies, with the threat of potentially crippling fines, to treat data more responsibly, while aiming not to push so hard that the quality of data-driven products and services erodes, innovation is suffocated and even companies acting in good faith are buried in mountains of red tape.
Caught between these two forces – complementary but not always coordinated or even aligned – is the private sector. It includes companies with irresponsible intentions or lazy execution, willing in either case to run roughshod over the rights of individuals in the pursuit of profit. But it also includes companies that are responsible in their use of data, yet do not want that responsibility to come at the expense of providing top-quality, data-driven products and services and innovating to bring new offerings to market.
Bridging the gaps
A bridge is needed. A bridge that will allow these three interests – consumer concerns, regulator goals and business needs – to each be satisfied. As with the overcoming of any great challenge, the solution lies in the mind rather than in the hands. For the solution is not simply compliance with laws. Rather, the solution lies with a change in attitude. The private sector – having benefited greatly from the digitalisation of data and having, in using that data, sparked the consumer concerns that are driving the ongoing wave of regulation – must lead the way in this attitude adjustment, by focusing not on complying with the law while using data, but rather on complying with the law as a way to use data.
How can this be done? The answer is privacy by design – the use of technical innovation to create holistic approaches to the use of data that place privacy at the forefront, building out the usage elements from there. This contrasts with the usual approach to data protection law compliance, which uses legal, administrative and other compliance techniques to adjust or mitigate data usage just enough to technically satisfy the requirements of data protection laws, usually at the bare minimum level. It means treating privacy – and the responsible use of consumers’ data – as a priority to be revered, rather than a nuisance to be dealt with.
To be clear, this doesn’t mean abandoning intelligent business use of data – it means being responsible and transparent with that use; continuing to pursue company objectives, of course, but not at the expense of, or even with priority over, the individual data rights of the customer. And this is where the “push” of regulators and the “pull” of consumers come into play. Internalising the values behind data protection laws – and therefore pursuing a privacy by design approach – is actually good business. Being open with customers about how their data will be used and for how long (including adherence to GDPR principles) is a great opportunity to let consumers buy into your company and its culture and principles – not just your products.
Companies that are successful in the long term understand that you live and die by your reputation with your customers. Being fined by a regulator is bad business – but allowing your brand to be tarnished among consumers for being irresponsible with their data may bring you to a point from which you will struggle to recover. That is why smart companies in this data-driven economy will gravitate towards privacy by design approaches – they will internalise the privacy-focused values of consumers and seek partners and solutions that do likewise.
To “use it & lose it” or to “anonymise & analyse”?
Bridging the gaps between the three key interests at play here is, of course, not simple. But there are ways to do it. In our experience, the best ways to bridge these gaps are two different privacy by design approaches: “Use It & Lose It” and “Anonymise & Analyse”.
The “Use It & Lose It” approach is as simple as it sounds – and best suited to companies whose needs are equally simple. If a company holds significant amounts of personal data – gathered legally and in the normal course of its business – and yet its business needs don’t rely in any significant way on the retention of that data, then the company should consider simply using the data for the business purpose for which it was collected and then immediately and irretrievably deleting it. This “Use It & Lose It” approach satisfies the company’s business needs, addresses consumer concerns (as the data is deleted immediately after use) and complies with the legal principle of data retention (since the data is kept only for as long as is necessary).
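To make the idea concrete, here is a minimal sketch of a “Use It & Lose It” workflow. The record type, field names and the confirmation-sending step are illustrative assumptions rather than any specific system; in practice, “losing” the data also means purging copies in backups, caches and logs.

```python
from dataclasses import dataclass, field

# Hypothetical record type for illustration only.
@dataclass
class Order:
    customer_email: str                      # personal data, needed only to confirm the order
    items: list = field(default_factory=list)

def fulfil_and_purge(order: Order, send_confirmation) -> dict:
    """Use personal data for its collected purpose, then delete it."""
    # Use it: the email address serves the purpose it was collected for.
    send_confirmation(order.customer_email, order.items)
    # Keep only non-personal aggregates needed for business records.
    summary = {"item_count": len(order.items)}
    # Lose it: remove the identifier immediately after use. A production
    # system would also purge backups and logs to make deletion irretrievable.
    order.customer_email = None
    order.items = []
    return summary
```

The point of the sketch is the ordering: the personal data is consumed by exactly one step and never survives past it, so there is nothing left to retain, secure or justify.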
The second approach we see to bridge these gaps is the one that our company, Trūata, has pioneered from its inception – the “Anonymise & Analyse” approach. This one is more nuanced but can be a game-changer for the companies using it. The typical user of this approach is a large company with a significant amount of personal data gathered, in compliance with data protection laws, for a specific purpose. But, unlike the companies discussed above, the companies best suited to this approach cannot simply use that data and lose it. Rather, their business models depend on using the data for a secondary purpose that involves continuously analysing it – whether to improve their products and services, to innovate by developing new products and services, or to perform analysis used in consulting services or market research.
Companies using this “Anonymise & Analyse” approach must put the data they hold through a state-of-the-art anonymisation process and then – only once the data has been anonymised to such an extent that the risk of re-identification is reduced to an insignificant level – analyse the resulting anonymised data set. Of course, the data will never be 100 per cent as analytically valuable as it was in its identifiable state. But if state-of-the-art anonymisation techniques are correctly applied by an experienced and dedicated team of data scientists and data privacy experts, working in coordination, the anonymisation can be conducted so that the data is neither under-anonymised (which would carry a high risk of re-identification) nor over-anonymised (which would pulverise the analytical utility of the data). The result, if the process is applied correctly, is a data set anonymised to the optimal degree: useful analytical utility is preserved while the risk of re-identification is reduced to an insignificant level. This approach therefore bridges the gaps between the needs of the private sector, the concerns of consumers (who are not identifiable from the data) and the requirements of regulators (since data anonymised in this way is no longer considered personal data).
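Trūata’s actual anonymisation techniques are far more sophisticated and are not described here, but the under- vs over-anonymisation trade-off can be illustrated with a simplified, classic technique: generalising quasi-identifiers into bands and suppressing any group smaller than k (k-anonymity). All field names, the band width and the k threshold below are assumptions made for the sketch.

```python
from collections import Counter

def generalise_age(age: int, band: int = 10) -> str:
    """Replace an exact age with a coarser band, e.g. 34 -> '30-39'."""
    lo = (age // band) * band
    return f"{lo}-{lo + band - 1}"

def anonymise(records: list, k: int = 3) -> list:
    """Drop direct identifiers, generalise quasi-identifiers, enforce k-anonymity."""
    # Generalising too little leaves re-identification risk (under-anonymised);
    # generalising too much (e.g. one giant band) destroys utility (over-anonymised).
    generalised = [
        {"age_band": generalise_age(r["age"]), "region": r["postcode"][:2]}
        for r in records  # direct identifiers (name, email, ...) are simply not copied over
    ]
    # Suppress rows whose quasi-identifier combination occurs fewer than k times,
    # since such rows could single out an individual.
    counts = Counter((g["age_band"], g["region"]) for g in generalised)
    return [g for g in generalised if counts[(g["age_band"], g["region"])] >= k]
```

Tuning the band width and k is precisely the balancing act described above – a dedicated team would measure both re-identification risk and analytical utility empirically rather than fixing these parameters in advance.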
The trick with the “Anonymise & Analyse” approach is that it’s not just about applying the approach correctly – it must also be applied by the correct entity. If a company anonymises and then analyses its own data while retaining the original identifiable data set, it is really just performing a parlour trick of sorts: regulators will – justifiably – argue that the anonymisation can be reversed by that company at any time, allowing the re-identification of individuals. The approach then becomes a charade and, while the company is satisfying its own needs, it is not addressing the concerns of consumers or the goals of regulators. Hiring someone to perform this anonymisation as a processor suffers from the same problem, as that processor can be instructed by its customer to reverse the anonymisation. Instead, companies using the “Anonymise & Analyse” approach must do so via an independent third-party provider that, acting as a controller, specialises in such anonymisation and analysis services. Only such a firm, applying state-of-the-art anonymisation techniques independently via its experienced and dedicated team of data scientists and privacy experts – without influence from its customers – can be deemed truly independent and therefore truly able to provide an “Anonymise & Analyse” service that bridges the gaps between its customers’ needs, consumers’ concerns and regulators’ goals. By engaging such an independent provider, data-driven companies can continue to obtain rich analysis from the data sets they hold whilst demonstrating to regulators and consumers that they have indeed internalised the values behind data protection laws.
What bridge is right for you?
In the data-driven economy, neither consumer confidence nor regulatory compliance alone is enough to succeed. Consumer trust is the foundation of success. That trust requires internalising the values behind data protection laws and embracing a privacy by design philosophy – whether the “Use It & Lose It” approach or the “Anonymise & Analyse” approach – that is appropriate for your company and its needs. So ask yourself: how will I show consumers and regulators that I have embraced the spirit of the data protection laws? How will I show them that I understand their concerns and want to address them? Which privacy by design approach is right for me? And who will I partner with to pursue that approach successfully?