Increasingly, enterprises are using cloud-based environments as the foundation of day-to-day operations. In fact, Delphix’s recent research into the state of DevOps, surveying 100 IT professionals in the UK, revealed that 93 per cent of UK businesses are now using either a public or private cloud, with Microsoft Azure (33 per cent), Amazon Web Services (28 per cent) and Rackspace (20 per cent) the three most popular providers. However, whilst the cloud has helped cut costs and boost flexibility, one constraint is still holding organisations back – data.
Many organisations build their new applications on public cloud architectures but leave a huge proportion of their old applications on-premises. This is partly due to the difficulty of moving data, a classic barrier when migrating workloads to the cloud. The problem is usually not hardware or software-level incompatibility with cloud infrastructure; it’s simply that moving the supporting data for organisations with hundreds of applications is a huge challenge.
In part, this is because manually migrating data to the cloud is enormously labour-intensive and error-prone. If your data is stored in databases, you’ll need a database administrator (or a team of them) to run these tasks. After all, their expertise is necessary to ensure the transfer is performed without data loss or corruption.
One way around this is for organisations to use tools that create a “golden” backup of their data and continuously stream any updates. A copy of this data can then be virtualised so workloads can be tested in non-production environments before migrating to the cloud. This ensures the cloud migration process is successful and the business can quickly resolve any issues.
Creating these non-production copies enables organisations to refresh virtual data at will. It also means that for every production environment migrated, a handful of non-production environments won’t need to be migrated at all, simplifying and speeding up the whole migration process.
Hiding sensitive data
Some companies use replication to move data into the public cloud, but this opens up an important hybrid cloud challenge around data security. Sometimes enterprises can’t move confidential data into the cloud, but for cost reasons they would still like to run their test and development workloads there. To avoid leaving confidential data exposed, the only option is to mask the confidential pieces of data; for instance, by replacing real credit card numbers in a database with fake ones before moving anything into the cloud.
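To make that masking step concrete, here is a minimal sketch in Python – not Delphix’s implementation, just a hypothetical illustration. The `mask_pan` helper, the shared secret, and the keyed-hash approach are all assumptions; the point is that masking is deterministic, so the same real card number always maps to the same fake one and joins across tables still line up after masking.

```python
import hashlib
import random

def mask_pan(pan: str, secret: str = "rotate-me") -> str:
    """Replace a real card number (PAN) with a fake one of the
    same length and layout. Deterministic: the same input always
    produces the same output, preserving referential integrity."""
    digits = "".join(ch for ch in pan if ch.isdigit())
    # Seed a PRNG from a keyed hash of the real digits so the
    # mapping is repeatable but not reversible without the secret.
    seed = hashlib.sha256((secret + digits).encode()).hexdigest()
    rng = random.Random(seed)
    fake = "".join(str(rng.randint(0, 9)) for _ in digits)
    # Re-insert the original separators so the format is preserved.
    out, i = [], 0
    for ch in pan:
        if ch.isdigit():
            out.append(fake[i])
            i += 1
        else:
            out.append(ch)
    return "".join(out)

row = {"name": "A. Customer", "card": "4111-1111-1111-1111"}
masked = {**row, "card": mask_pan(row["card"])}
```

A production masking tool would go further – for example, generating numbers that still pass Luhn validation so downstream applications accept them – but the principle is the same: the real values never leave the premises.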
With security breaches appearing in the news on a regular basis, organisations should be using data masking more broadly. The ability to audit what data is masked and find all the places where data exists in unmasked form is critical to hardening security practices. Advanced data virtualisation tools enable IT to mask virtual copies of golden backups to hide any sensitive data, making the whole development process much more secure.
The excitement around moving to the public cloud has taken a while to reach established enterprises, but now that they’ve got the bug, seemingly everyone wants to migrate to cut costs and reduce organisational inefficiency. Enterprises can extend these benefits with data virtualisation and Data as a Service software to make their cloud migrations cheaper, faster and easier.
Iain Chidgey, vice president, international sales at Delphix