More and more businesses are moving their data to the cloud and adopting SaaS delivery models for their software. In making this switch, many assume they are handing responsibility for safeguarding that data to someone else.
But companies still need to take charge of protecting their own information. We spoke to Rob May, senior vice president of business development at Datto, to find out more about how enterprises can protect themselves.
Surely SaaS applications are backed up by the service provider, so do companies really need to take extra measures?
RM: Software-as-a-service (SaaS) providers have backups for their own disasters, not for your disasters. If they lose a server, they can restore the data. But if Mary in the accounting department deletes a file, your SaaS provider probably can't get that back for you. It has no way to know if the deletion was intentional or accidental, and most providers will flush the file from their servers.
Isn't it possible to just back up data to a public cloud service? Why are special techniques needed?
RM: The special techniques are needed because getting data out of these SaaS applications at the scale of millions of users per day is difficult. It can be a real technical challenge to play nice with the various application programming interfaces (APIs) provided by these SaaS companies, while respecting their per-user and per-domain data limits and understanding all the various problems that could arise from dealing with them.
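The per-user and per-domain limits Rob mentions are typically handled client-side with a rate limiter. Here is a minimal sketch using a token bucket; the `TokenBucket` class, the example rates, and `fetch_for_user` are illustrative assumptions, not Datto's implementation or any specific provider's quota values:

```python
import time

class TokenBucket:
    """Allows `rate` calls per second, with bursts up to `capacity` calls."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def acquire(self) -> None:
        """Block until a token is available, then consume it."""
        while True:
            now = time.monotonic()
            # Refill tokens in proportion to the time elapsed since the last check.
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            time.sleep((1 - self.tokens) / self.rate)

# Separate buckets keep the per-user quota independent of the domain-wide quota.
domain_bucket = TokenBucket(rate=100, capacity=100)  # illustrative rates
user_buckets: dict[str, TokenBucket] = {}

def fetch_for_user(user_id: str, fetch):
    """Run one API call on behalf of a user, honouring both quota levels."""
    user_buckets.setdefault(user_id, TokenBucket(rate=5, capacity=5))
    user_buckets[user_id].acquire()  # respect the per-user limit
    domain_bucket.acquire()          # then the per-domain limit
    return fetch()
```

Throttling on the client, rather than waiting for the provider to reject calls, is what lets a backup service run at this scale without tripping the API's own enforcement.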
It's relatively easy to capture data and applications but what about preferences, settings and all the other stuff that makes a system usable?
RM: It's actually not that easy to capture the data. There are a thousand things that can go wrong. When you ask the Google Drive API to send a file, and several seconds later it still hasn't responded, what do you do? Maybe the API isn't working. Maybe the file doesn't exist. Maybe the network connection is broken. Maybe you reached your limit on API calls. You have to be able to automatically diagnose what happened and fix it. On top of that, you have to pull all the metadata. SaaS backup isn't just about files - it's about understanding and restoring the state of the application.
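The automatic diagnose-and-fix loop Rob describes usually means classifying each failure as retryable or permanent, then retrying with exponential backoff. A minimal sketch follows; the exception classes and `download_with_retry` helper are hypothetical stand-ins for whatever errors a real API such as Google Drive's actually reports:

```python
import random
import time

# Hypothetical error classes standing in for real API responses.
class RateLimited(Exception): pass      # e.g. HTTP 429: you hit your call limit, back off
class NotFound(Exception): pass         # e.g. HTTP 404: the file no longer exists
class NetworkTimeout(Exception): pass   # the call never came back

RETRYABLE = (RateLimited, NetworkTimeout)

def download_with_retry(fetch, max_attempts: int = 5, base_delay: float = 0.5):
    """Call `fetch()` until it succeeds, backing off exponentially on transient errors."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except NotFound:
            # The file is genuinely gone (deleted, deliberately or not): record
            # that fact rather than retrying forever.
            return None
        except RETRYABLE:
            if attempt == max_attempts - 1:
                raise
            # Exponential backoff with jitter avoids hammering a throttled API.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

The key design point is that "no response after several seconds" is not one condition but several, and each one warrants a different action: a missing file should be recorded, a rate limit should be waited out, and a broken connection should be retried.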
Doesn't running SaaS software make life easier in the event of a major disaster anyway as most information can be accessed from the cloud?
RM: Running SaaS does remove some of the disasters that could strike on-premises software. You don't have to worry about server failure, for instance. But the cloud is very collaborative. Many other third-party applications will be tied to your data. When the CEO downloads a new iPhone app to sync her Gmail or Google Drive information, and that app has a bug that deletes some data, it's nice to be able to log in to a cloud-to-cloud backup platform and restore it in 60 seconds.
Are there any barriers to using the cloud to back up already cloud-based systems?
RM: There are rare times when the APIs don't expose access to all the data we want to back up, but they are evolving rapidly and getting better about that all the time.
How can users be sure their backup is secure?
RM: Users should evaluate the security of any backup service to make sure it has the service organization controls in place to be SOC 2-compliant for security and confidentiality standards. Users should also take care to implement up-to-date security and encryption practices of their own.
Datto has recently acquired cloud-to-cloud backup company Backupify. Will this see your operation moving into new areas?
RM: Absolutely. Datto has a great channel of managed service providers that can cost-effectively reach much of the small- to medium-sized business (SMB) market in a scalable way that Backupify didn't previously pursue. We are really excited to be able to provide our service to that market segment. There is a huge need among organizations of all sizes to back up their data, since the effects of losing one file or one customer’s information can have a ripple effect on profitability.