Organisations worldwide are collecting, storing and managing ever-increasing data volumes. Many are deciding to store this data in the cloud because it’s unsustainable to maintain it in their own data centres. But then the unimaginable happens – the organisation receives a ransom email from a group of hackers, explaining that they have gained control of the organisation’s data in the cloud and are demanding a hefty sum to give it back. What to do?
First, learn from other businesses that have had very public experiences. Take Uber in 2016: the company paid the ransom and hoped the data breach would never surface, believing disclosure would cost it the trust of its customers. Worse, going public might be seen as a show of vulnerability, and an invitation to other hackers to take the company for another ransom ride. Unfortunately for Uber, the attack became public in late 2017, and it is now facing angry customers, stakeholders, and regulators. How did Uber get here, and what can organisations do to prevent this kind of massive data breach from happening to them?
One of the most important things companies need to do is to stay updated on the kinds of threats they’re facing. We are at a time when IT environments are going through a dramatic digital transformation, with legacy infrastructure replaced by modern cloud-based solutions. Naturally, following the growing adoption of cloud solutions, a new type of enterprise security threat is emerging, riding on the waves of ransomware: it’s called ‘Leaking Cloud Buckets’.
What are leaking cloud buckets?
When data is exposed on a public cloud, most often as the result of a misconfigured storage bucket, it is known as a ‘leaking cloud bucket’ incident.
Every public cloud storage service offers buckets, a term coined by AWS for the repositories that house data objects in the cloud. (Azure’s closest equivalent is the blob ‘container’.) Enterprise customers can configure storage buckets in any way they choose, including the region in which the bucket is maintained, the lifecycle rules for objects in the bucket, general access rights, and much more.
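It is exactly these access-rights settings that go wrong in a leaking-bucket incident. As a minimal sketch, here is what an automated audit of bucket configurations might look like; the configuration format and field names below are illustrative only, not any provider’s real API schema:

```python
# Minimal sketch: flag risky settings in hypothetical bucket configurations.
# The dict format here is illustrative, not a real cloud provider's schema.

def audit_bucket(config):
    """Return a list of warnings for a single bucket configuration."""
    warnings = []
    acl = config.get("acl", "private")
    if acl in ("public-read", "public-read-write"):
        warnings.append(f"{config['name']}: ACL grants public access ({acl})")
    if not config.get("encryption_at_rest", False):
        warnings.append(f"{config['name']}: encryption at rest is disabled")
    return warnings

buckets = [
    {"name": "hr-records", "acl": "public-read", "encryption_at_rest": False},
    {"name": "build-artifacts", "acl": "private", "encryption_at_rest": True},
]

for bucket in buckets:
    for warning in audit_bucket(bucket):
        print("WARNING:", warning)
```

A check like this, run regularly against every bucket an organisation owns, catches the ‘world-readable by accident’ misconfiguration before a hacker’s scanner does.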
In the last year, there has been a wave of such incidents afflicting notable organisations such as Uber, Verizon, Viacom, Dow Jones, and even U.S. military organisations.
Who’s to blame? Is it the customers, the cloud providers, the storage vendors or the hackers? As it turns out, the root cause of the problem doesn’t lie with the cloud providers involved, be they AWS, Microsoft, IBM, or Google, but with the way these buckets are configured and used by enterprise administrators. Ultimately, most cases drill down to the age-old problem of user error – no outside hack necessary.
Is this really so surprising? Let’s not forget that Gartner predicts that, through 2020, 95% of cloud security failures will be the customer’s fault. Those of us who have been in IT for some time know that user/admin error has long plagued IT organisations. Here is how it happens in the case of those leaking buckets.
Two attributes of these buckets should not be ignored. First, cloud storage, and with it every storage bucket, is a shared service that resides outside the private cloud and firewall perimeter. Second, cloud buckets are built on object storage, which does not enforce the file-system Access Control Lists (ACLs) that organisations have used for years to define granular, file-level permissions.
These inherent weaknesses, coupled with the relative immaturity of cloud storage administration compared with decades of enterprise IT experience on traditional storage, result in unprotected storage that is likely to fall prey to hackers who constantly run scans searching for their next victim.
What can I do to avoid a Leaking Cloud Bucket?
Luckily there are simple precautions that can ensure data remains protected within the organisation’s boundaries:
1. Encrypt data and keep the keys in your pocket
IT staff will sleep a lot better at night if they follow a simple rule: if the company’s data is outside its walls, it must be encrypted. Just as no one would access sensitive information over public wi-fi without a VPN, companies shouldn’t use public cloud storage without proper encryption. If the data is encrypted at rest and only certain staff members have access to the encryption keys, then there is nothing to worry about if a storage bucket becomes exposed: encrypted data will be useless to any non-authorised user. This is a vital insurance against the probability – large or small – that someday an error will occur.
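The essential point is that encryption happens client-side, before upload, so the keys never reach the cloud. The toy sketch below illustrates that principle with a one-time pad (XOR with a fresh random key the length of the message); a production deployment would instead use a vetted authenticated cipher such as AES-GCM, or a provider’s client-side encryption SDK with customer-managed keys:

```python
import secrets

# Illustration only: a one-time pad makes the "keep the keys in your pocket"
# idea concrete. Real deployments should use a vetted AEAD cipher (e.g.
# AES-GCM) via a maintained crypto library, not hand-rolled XOR.

def xor_bytes(data, key):
    """XOR two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(data, key))

def encrypt(plaintext):
    # Fresh random key, as long as the message, generated on-premises.
    key = secrets.token_bytes(len(plaintext))
    return xor_bytes(plaintext, key), key

def decrypt(ciphertext, key):
    return xor_bytes(ciphertext, key)

record = b"employee-salaries.csv contents"
ciphertext, key = encrypt(record)       # only the ciphertext goes to the bucket
restored = decrypt(ciphertext, key)     # the key stays inside the organisation
```

If the bucket holding `ciphertext` is ever exposed, the attacker gets random-looking bytes; without the key, which never left the organisation, the leak is harmless.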
2. Manage access permissions
Use a multi-layer access control system that starts with the access permissions of the bucket itself and extends all the way down to the file level for the relevant workloads, preserving those permissions and connecting them to central directory authentication systems.
3. Invest in data loss prevention (DLP)
Leverage DLP software to monitor data-access patterns and flag deviations that may indicate data leakage. These tools can also block policy violations, making it possible to stop users from sending sensitive data outside company walls.
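At their simplest, the content-inspection side of such tools is pattern matching over outbound data. The sketch below shows the idea with two deliberately simplified rules; real DLP products use far more robust detectors (checksums, fingerprinting, machine learning):

```python
import re

# Toy DLP rule set: patterns for data that should never leave company walls.
# Both regexes are simplified for illustration; real products go much further.
RULES = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan(text):
    """Return the names of the rules the outbound text violates."""
    return [name for name, pattern in RULES.items() if pattern.search(text)]

outbound = "Invoice for card 4111 1111 1111 1111, contact pay@example.com"
violations = scan(outbound)
if violations:
    print("blocked:", violations)  # a real DLP tool would quarantine this
```

The same scan logic can run at several choke points: on upload to a bucket, on email gateways, and on endpoint sync clients.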
4. Lock down endpoints and offices
Use enterprise mobility management (EMM) and mobile device management (MDM) tools to eliminate shadow IT and create secure productivity spaces within both corporate-provided and BYOD devices.
5. Periodic penetration tests
Penetration (pen) testing is essential when adding new infrastructure to the network, such as cloud storage. But it is good practice to perform regular pen tests to evaluate the organisation’s security posture and ensure no new leaks have appeared over time.
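One step such a pen test typically automates is probing the organisation’s own bucket URLs without credentials and classifying the responses. The probe itself is stubbed out in this sketch (a real test would issue HTTPS requests against endpoints the organisation owns); the bucket names are hypothetical:

```python
# Sketch of one pen-test step: classify the HTTP status returned when a
# candidate bucket URL is probed without credentials. The probe is stubbed;
# a real test would request URLs the organisation owns and is authorised to test.

def classify(status_code):
    """Map an unauthenticated bucket-probe status to a finding."""
    if status_code == 200:
        return "PUBLIC - bucket contents listable without credentials"
    if status_code == 403:
        return "exists but access denied (expected for a private bucket)"
    if status_code == 404:
        return "no such bucket"
    return f"unexpected response ({status_code})"

# Stubbed probe results for hypothetical bucket names.
probe_results = {"acme-backups": 200, "acme-logs": 403, "acme-old": 404}
for bucket, status in probe_results.items():
    print(bucket, "->", classify(status))
```

Run on a schedule, a probe like this surfaces a bucket that has silently drifted to public long before an outside scanner finds it.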
All of these measures should be at the top of every organisation’s privacy agenda; only then will organisations have a chance to avoid the fate suffered by so many leaky cloud bucket victims.
Sabo Taylor Diab, Vice President Global Marketing at CTERA Networks
Image Credit: Alexskopje / Shutterstock