Data. It’s what everyone is talking about. For enterprises, increased connectivity has created a data deluge, and with it a serious challenge: the sheer quantity of data being created is causing the complexity and cost of data management to skyrocket.
Trying to make sense of this data is going to be a huge challenge for all organisations. So, why does this problem exist?
The root cause is the proliferation of data copies: multiple copies of the same thing, or outdated versions of it. Consumers make many copies of data: on backup drives, multiple devices and cloud storage. Businesses are worse, because they need to maintain copies for application development, regulatory compliance, business analytics and disaster protection. IDC estimates that 60 per cent of what is stored in data centres is actually copy data, costing companies worldwide as much as $44 billion to manage.
Copy data also poses a significant security threat. Each added copy expands an organisation’s attack surface, giving hackers looking to get at important information more material to work with. A recent IDC study found that the typical organisation holds as many as 375 data copies at any one time.
So, how do enterprises combat this?
Copy data virtualisation is the process of freeing an organisation’s data from its legacy physical infrastructure, and it is increasingly how forward-thinking companies tackle the issue of data copies. By eliminating redundant copies and creating a single ‘golden master’, virtual copies of production-quality data are made available immediately to everyone in the organisation who needs them.
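The ‘golden master’ idea can be illustrated with a minimal copy-on-write sketch: one physical copy of the data, with each virtual copy storing only the blocks it has changed. The class and method names below are purely illustrative, not any vendor’s actual API.

```python
# Illustrative sketch of copy data virtualisation via copy-on-write:
# a single physical 'golden master', plus lightweight virtual copies
# that record only the blocks they change. Names are hypothetical.

class GoldenMaster:
    def __init__(self, blocks):
        self.blocks = dict(enumerate(blocks))  # the one physical copy

    def virtual_copy(self):
        return VirtualCopy(self)


class VirtualCopy:
    def __init__(self, master):
        self.master = master
        self.overrides = {}  # only changed blocks are stored here

    def read(self, i):
        # Reads fall through to the master unless this copy changed the block
        return self.overrides.get(i, self.master.blocks[i])

    def write(self, i, data):
        self.overrides[i] = data  # copy-on-write: the master stays untouched


master = GoldenMaster(["blk0", "blk1", "blk2"])
dev = master.virtual_copy()       # e.g. a copy for test/development
dev.write(1, "patched")

print(dev.read(1))                # "patched" - visible only to this copy
print(master.blocks[1])           # "blk1" - golden master unchanged
print(len(dev.overrides))         # 1 - storage cost is one block, not three
```

The point of the sketch: ten such virtual copies of a three-block master would cost only the blocks actually changed, which is why eliminating full physical copies cuts storage so dramatically.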
For IT managers, copy data virtualisation could be the way to address the ever-increasing rise in data. However, as with any significant overhaul, implementing it across an organisation requires planning and strategic thinking. Here are the four steps I believe businesses need to follow to maximise the copy data virtualisation opportunity:
- Choose your platform carefully
Every organisation faces its own set of challenges, and these will shape the choice of platform. While the choice has to suit the needs of each individual business, there is a set of criteria commonly used. A typical enterprise will have workloads across a number of different systems – virtual machines on VMware and physical machines on Windows, for example. The chosen platform has to support all of these systems, databases and applications; this is a must for copy data virtualisation to take effect.
- The initial use case of choice
IT departments adopt a number of overlapping technologies, such as software for backups, snapshots, disaster recovery, and more. Data virtualisation removes the need for these redundant technologies by creating virtual, on-demand data copies that serve a number of these use cases with one platform.
Choose one use case initially and roll it out first. In doing this, you’ll be able to iron out any issues that come up before a wider roll-out.
- What exactly do you need to prepare?
Now is the time to understand your specific needs so you can design the underlying infrastructure to support them. You need to be asking some important questions:
- At what rate is the production data changing?
- What is the retention time required for virtualising backup?
- How many virtual data copies do you need at any one time?
- What testing will be done with that data (performance, functionality, scaling etc.)?
- How much bandwidth do you need (especially important if you’re working with multiple data centres across different locations)?
- How is your data being replicated and encrypted?
By answering all of these questions before you start investing in infrastructure, you will save a lot of time and money.
- Hybrid cloud
Many organisations have begun harnessing both private and public cloud offerings to create a hybrid cloud infrastructure. These hybrid clouds adopt the control and security of a private cloud, along with the flexibility and low cost of public cloud offerings. Working together, they can give organisations a powerful solution to meet the increased demands on IT from the rest of the organisation.
One of the main benefits of this hybrid cloud approach is enhanced resilience and agility – using the public cloud means fewer outages and less downtime.
A hybrid approach also allows you to multi-purpose infrastructure – for data recovery and test and development simultaneously, for example – helping to cut down on costs and complexity.
By implementing data virtualisation and reducing physical copies of data, organisations will spend less on storage and can move on to the most important stages of data management: analysing data and developing higher-quality applications faster.
The results are wide-ranging, from immediate access to virtual copies of data to less data moving across networks, less stored data, significantly reduced storage costs, and the total removal of costly operational complexity.
Ash Ashutosh, CEO, Actifio
Image Credit: Welcomia / Shutterstock