"Continued data growth in IT data centres is placing a tremendous strain on data backup. Understanding the challenge is the first step to finding a solution." - Bill Andrews, president and CEO, ExaGrid Systems
Most organisations retain backups for anywhere from several weeks to several years. In the average IT environment, primary storage data grows at 30 per cent or more per year, which means data volumes double roughly every two and a half years. These factors need to be taken into account to ensure your backup system is sized to handle your data not just today but into the future as well, so that performance is maintained and the backup window stays short as data volumes increase.
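The doubling figure above follows directly from compound growth. As an illustration (not part of the original guide), a short calculation shows why a 30 per cent annual growth rate doubles data in about two and a half years:

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for data volume to double at a given compound annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# At 30 per cent annual growth, data doubles in roughly 2.6 years
print(round(doubling_time(0.30), 1))  # → 2.6
```

The same formula lets you project how much backup capacity you will need at any point in the future, rather than sizing only for today's volumes.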
This is the first in a series of sections from this guide. The guide in its entirety identifies and explains the various backup complexities, and the ultimate solution that solves the backup and restore problem permanently.
The movement from using tape for backup to using disk for backup is well underway, with about 70 per cent of IT organisations now using disk staging in front of a tape library onsite, and approximately 15 per cent of IT organisations having replaced their onsite tape backups with disk-based backup with data deduplication. Industry sales of tape libraries have been on the decline for the past four years, while sales of disk-based backup with deduplication are seeing double-digit growth year-over-year.
Many factors should be considered when making the move to disk with deduplication; it is not as straightforward as simply buying some disk. Unlike primary storage, disk backup solutions are built on anywhere from hundreds of thousands to as many as a million lines of software code. That much software is required to work properly with a backup application, deduplicate the data, store the deduplicated data and replicate it.
Disk-based backup is not merely a commodity storage solution or a NAS share with storage plus deduplication. Disk backup requires a purpose-built approach that directly affects backup performance, the backup window, the time to restore traditional or image backups, the time to make offsite tape copies, the ability to perform instant recoveries and the speed of recovery from a site disaster. The purpose-built approach also affects the long-term cost of expansion over time. Some IT organisations have taken the simple route of purchasing a disk backup system with deduplication from their primary storage vendor, backup application vendor, tape library vendor or server vendor, only to find that they failed to ask a fraction of the important questions before buying.
In this guide, we will first explain the differences between the various backup approaches. We will then provide the list of questions you should ask each vendor about every solution to avoid experiencing the following 10 consequences:
- Inability to operate with or support all of your current backup applications and utilities or those you might use in the future
- Missing support for all of the pertinent features of your backup application
- Slow backups, or backups that start quickly when you deploy the solution but then degrade over time as data grows, resulting in an expanding backup window
- Expensive forklift upgrades to replace the controller in order to maintain performance and keep the backup window short as data grows
- Slow restores of traditional full backups or images
- Slow auxiliary copy or offsite tape copy
- Lack of support for instant recovery of files, objects and VMs, so recoveries take hours instead of minutes
- High cost due to increased equipment needs over time
- Constant additional purchases of disk capacity due to a poor deduplication rate
- Use of additional expensive bandwidth due to poor deduplication rates
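The last two consequences both trace back to the deduplication ratio. A brief illustration (with hypothetical numbers, not vendor figures) shows how a weaker ratio translates directly into more disk and more replication bandwidth:

```python
def stored_capacity_tb(logical_backup_tb: float, dedup_ratio: float) -> float:
    """Physical disk needed to hold a given logical amount of backup data,
    assuming a simple N:1 deduplication ratio."""
    return logical_backup_tb / dedup_ratio

# 100 TB of retained backups at a 10:1 ratio needs 10 TB of physical disk;
# at 4:1 the same backups need 25 TB -- 2.5x the disk purchases, and
# proportionally more WAN bandwidth to replicate the deduplicated data offsite
print(stored_capacity_tb(100, 10))  # → 10.0
print(stored_capacity_tb(100, 4))   # → 25.0
```

Because retained backup data grows over time, even a modest difference in deduplication ratio compounds into a large difference in total cost.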
This guide explains the various backup complexities, enabling you to ask the right questions and make the right decision for your specific environment and requirements. Stay tuned for part two of this guide, which will be live on ITProPortal shortly.
Image credit: iStock