People might be more identifiable than previously thought from supposedly anonymised information contained in large databases, according to a technology law expert. New research recommends that privacy practices and even privacy laws need to change.
Increasing amounts of personal information are collected by organisations and stored in massive databases. That information is sometimes used or released after being stripped of elements that could identify individuals in a process called 'anonymisation'.
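The anonymisation step described here can be sketched in a few lines. This is a toy illustration with made-up field names, not any organisation's actual process: direct identifiers such as name and social security number are deleted before release, while other fields are kept.

```python
# Toy sketch of naive anonymisation: strip fields that directly name a
# person, keep everything else for release.
DIRECT_IDENTIFIERS = {"name", "ssn"}

def anonymise(record):
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {"name": "A. Example", "ssn": "000-00-0000",
          "zip": "02138", "dob": "1945-07-21", "sex": "F"}

print(anonymise(record))
# → {'zip': '02138', 'dob': '1945-07-21', 'sex': 'F'}
```

Note that the "anonymised" record still carries zip code, date of birth and sex, which is exactly the residue Ohm's argument turns on.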
University of Colorado Law School Associate Professor Paul Ohm, though, has said that those anonymisation techniques no longer work and that it is now possible to identify people from the release of supposedly anonymised records.
"With the supposed power of anonymisation you can share the data with anyone you want, you can store the data for as long as you like," Ohm told podcast OUT-LAW Radio. "And traditionally it has been a conversation stopper. Once you assert anonymisation everyone nods their heads and says 'that's fine, privacy is protected here, let's focus on something else'."
"But even though you are deleting many of the identifying fields of information, everything you leave behind retains identifying power," he said.
Ohm said that research has shown that a combination of increasingly powerful computers and the prevalence of large databases has made it possible to "re-identify" people whose records have been anonymised.
"One researcher in Massachusetts about 15 years ago discovered that 87% of Americans are uniquely identified by three pieces of information: their date of birth, their sex and their zip [post] code," he said. "Now the problem was that until she announced her findings, zip code, birth date and sex were three pieces of information that we presumed were privacy-protecting, were anonymised. So they appeared in all sorts of databases."
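The re-identification attack Ohm describes works by linking an "anonymised" dataset against a public one that shares those three fields. A minimal sketch with entirely fabricated records (the table names and values here are hypothetical):

```python
# Linkage attack sketch: join an "anonymised" medical table against a
# public roll on the quasi-identifier triple (zip, dob, sex).
anonymised_medical = [
    {"zip": "02138", "dob": "1945-07-21", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "80302", "dob": "1980-03-02", "sex": "M", "diagnosis": "asthma"},
]

public_voter_roll = [
    {"name": "A. Example", "zip": "02138", "dob": "1945-07-21", "sex": "F"},
    {"name": "B. Example", "zip": "80302", "dob": "1990-11-15", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def reidentify(medical, voters):
    """Re-attach names to medical records that match a voter on all
    three quasi-identifier fields."""
    index = {tuple(v[k] for k in QUASI_IDENTIFIERS): v["name"] for v in voters}
    matches = []
    for record in medical:
        key = tuple(record[k] for k in QUASI_IDENTIFIERS)
        if key in index:
            matches.append((index[key], record["diagnosis"]))
    return matches

print(reidentify(anonymised_medical, public_voter_roll))
# → [('A. Example', 'hypertension')]
```

One voter shares all three fields with a medical record, so a name is re-linked to a diagnosis even though the released table contained no names at all.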
Ohm told OUT-LAW Radio that the problem was huge because such trust had been placed in anonymisation that it was enshrined in legislation. "Virtually every privacy law allows you to escape the strictures and requirements of the privacy law completely once you've anonymised your data," he said. "Every policy maker who has ever encountered a privacy law, and that's in every country on earth, will need to re-examine the core assumptions they made when they wrote that law."
Ohm said that the problem was hard to solve because the very pieces of information that identify a person are the ones most useful to researchers.
He proposed, though, that in some fields of research, such as health, it would be possible to open up much more data than is currently permitted as long as you controlled access to it.
"We can't trust technology any more but at the same time we don't want to keep this information from researchers. So my solution is that we shift our trust from the technology to the people," he said. "We write down the rules of trust among health researchers … [we say] you can get my data but only on a need to know basis."