It is a fact universally acknowledged that laws and regulations lag behind technology. Augmented reality (AR) and virtual reality (VR) applications are no exception to this.
Online games platforms have all the ingredients to attract hackers and cybercriminals: large numbers of individuals and personal data, in-game purchases (involving the handling of payment details), code running on “always on” and “always present” mobile devices and players (whether kids or not) likely to be more interested in the gaming experience than staying safe in cyberspace.
Little wonder then that the Sony PlayStation Network hack in 2011 compromised the accounts of 77 million people, exposing personal data and payment card information. As well as losses to the company through network downtime of over 20 days, Sony was fined £250,000 by the UK’s Data Protection Authority, the ICO, and publicly censured for poor security practices, such as failing to keep systems updated with security patches. Sony also offered affected players free games and one year's identity theft protection, further increasing the financial impact.
When you take the vulnerabilities of online gaming and overlay them with the uniquely personal, intimate interface that AR/VR applications have with their users, the consequences of security lapses could be even more immediate than theft and identity fraud. Early reports about this year’s new AR game which has taken the world by storm (we need not speak its name) suggest that AR and VR applications are just as vulnerable to data hacks and malware as other systems. This may sound like the realms of science fiction, but "ransomware" could take on a whole new meaning – imagine a user being diverted to a would-be kidnapper's house by following directions on a compromised AR treasure hunting app.
The fact that AR and VR applications will be used in the mobile environment means they will be installed on devices bristling with sensors: microphones, cameras, GPS, accelerometers, gyroscopes and more. Of course, in many cases those very sensors are required for the augmented or virtual reality aspects to function (when a wearer of a VR headset turns their head to observe the adoring fans from the viewpoint of Glastonbury’s Pyramid Stage, the application has to detect that movement in order to adjust the display).
There is also the potential for a user’s physical safety to be put at risk, for example by cyber criminals taking over a VR headset and showing flashing images intended to trigger an epileptic seizure (there is a report of malicious visitors to an online epilepsy chat room attempting just this by embedding flashing pixels in their posts). In the AR context, what if malicious code created a view of the world with real-life hazards blocked out, causing the distracted player to endanger themselves?
Frankly, even without any outside intervention, there are numerous reports of distracted Pokémon Go! players stuck up trees, in mud pits or even colliding with traffic or each other. But are such misadventures the responsibility of a game’s publisher? And what other liabilities could arise?
If an AR or VR game involves the collection by the game publisher in the UK of personal data (that is, information which relates to an identifiable, living individual), then there is a duty under the Data Protection Act 1998 to use “appropriate” technical or organisational measures to protect the personal data from unauthorised access or use, or from loss or damage. What is appropriate in the circumstances depends on the kind of harm that could be caused to the individual by the loss or access of the particular information; so, the greater the potential harm, the more effort is expected under the law. An example of personal data collected could be an email address, or a social media OAuth login used to register for a game.
In the UK, companies are expected to inform the ICO about security breaches involving personal data; the ICO may publish that information and could issue a fine of up to £500,000. It may also be prudent to notify affected individuals, especially where they need to take steps to protect themselves, e.g. changing passwords or checking bank accounts.
Individuals have a right to sue a company that causes them damage by breaching the Data Protection Act. Historically such court cases have been rare, mainly because of the high costs of litigation and the difficulty in proving (financial) damage. There are recent and forthcoming changes to the law in this area which make it likely that it will become easier for individuals to bring claims in future.
Other types of potential loss or damage could include physical injury or death (e.g. photosensitive seizures), or damage to property through trespass or harassment, e.g. the Massachusetts man whose home, a former church, turned out to be a modern-day Pokémon Gym. Such claims would not necessarily be straightforward for individuals to win. For example, in an injury case, it would need to be proven that (a) a contractual duty or common law duty of care was owed to the player, (b) the duty had been breached, e.g. by an innately hazardous game design, and (c) the player suffered loss which was caused by the breach and of a kind which would have been reasonably foreseeable.
Minimising Potential Problems
There is considerable uncertainty about how courts in the UK (or elsewhere) would apply the law in the new context of AR/VR gaming. This article closes with some suggested areas for consideration which may prove helpful when trying to manage potential liability.
- Use good industry practice when it comes to security. Limit your app’s access to other data and sensors on the mobile device to what is essential for the app to function. Arrange default settings so that the minimum of data is collected. Remember that the less data you hold, the less can be lost, stolen or misused by a third party. Early builds of Pokémon Go! reportedly granted the app full Google account access, which would have included the ability to read Gmail messages on iOS (even if that never actually happened).
- Consider what security measures would be "appropriate" – this means that you need to take account of the kind of information and the kinds of potential harm which a security breach might cause. In the AR/VR environment, that requires an element of blue sky thinking. Be vigilant about security flaws and be prompt in applying third party patches and issuing new releases to your users, if required. Do not leave known problems unfixed.
- If the operation of the game could involve the transfer of personal data outside the EEA, especially to the U.S., then ensure you know which exemption from the Data Protection Act’s restriction on data exports you are relying on.
- If a game could be played by children (under 18s) then extra care should be taken; consider whether you need to obtain verifiable parental consent. Are there ways of avoiding collecting any personal data about minors?
- What warnings and alerts would it be appropriate to provide to users? What should they say and how should they be communicated? Niantic’s terms and conditions for Pokémon Go! tell users that they play at their own risk and that they should be aware of their surroundings. Warnings are repeated to users each time the app is started. It would be usual for the terms to try to exclude liability for certain types of loss; however, bear in mind this may not be effective, e.g. trying to exclude liability for personal injury caused by negligence. Can insurance cover be obtained?
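The least-privilege point in the first bullet above can be made concrete at the platform level. As a hypothetical sketch (the package name and comments are illustrative, not taken from any real title), an Android AR app's manifest might declare only the permissions its core features genuinely need:

```xml
<!-- Hypothetical AndroidManifest.xml excerpt: request only what the AR features need. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.ar.treasurehunt"> <!-- placeholder package name -->

    <!-- Needed: camera feed for the AR overlay; precise location for map-based play. -->
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

    <!-- Deliberately NOT requested: contacts, SMS, call logs or account-wide access.
         The less the app can reach, the less a compromise can expose. -->
</manifest>
```

Omitting permissions such as READ_CONTACTS altogether, rather than requesting them "just in case", means a compromised app has far less personal data to give up.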
In these times of increasing focus on privacy laws and consumer protection, we can expect some interesting new applications of existing and new laws in augmented and virtual reality systems.
Kate Brimsted, Partner in Information Technology, Privacy & Data Security team, Reed Smith
Image Credit: Knight Center for Journalism / Flickr