According to Flexera’s 2020 State of the Cloud report, 74 percent of businesses globally are using at least one public cloud alongside a private cloud to store their data, a combination known as a hybrid cloud. Hybrid cloud environments have grown popular because they allow organizations to benefit from the security and autonomy of a private cloud while retaining the reliability and economies of scale offered by the public cloud.
When it comes to storage, a hybrid strategy comes with some challenges. For private clouds, you need to have the team and the capital at hand to ensure that your private cloud provides the performance you need. For public clouds, you need to ensure that your providers are giving you an economical and secure solution that you can easily integrate with your other public and private clouds.
To choose the right public cloud, keep an eye out for the metrics and indicators that show whether a provider delivers the performance, security, and value you’re looking for.
With the growing ubiquity of hybrid cloud, expectations on public cloud providers have risen: customers need to be able to rapidly read and write stored data. Many organizations now rightfully expect to be able to save or open any cloud-stored file within milliseconds. Rapid access doesn’t just improve productivity by saving time spent accessing files; it also conserves data center resources such as power and CPU cycles, which makes storage more affordable in the long run.
Public cloud providers have many tools at their disposal to improve read and write times. One place providers often look is the storage medium they use: more efficient providers tend to pick SMR (Shingled Magnetic Recording) disk drives, which increase storage density, or SSDs (Solid State Drives), which deliver far faster read/write speeds than conventional HDDs (hard disk drives).
Another way providers improve the customer experience is by optimizing the architecture of their data center so that inbound and outbound data packets go through the fewest steps possible. They may also adopt a purpose-built file system designed to fully leverage the hardware and architecture of their data center, which helps ensure that requests for files are processed as quickly as possible.
The amount of work providers have put into optimizing their data centers will be reflected in their benchmarks. These benchmarks should be the first point of reference for customers, and thankfully there are many third-party testing tools, such as Intel’s COSBench, to help customers decide which provider is best for their needs.
All else being equal, a provider that consistently gives you higher throughput speeds should be preferred over the alternatives.
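Beyond published benchmarks, it is worth running your own throughput tests with workloads that resemble your actual usage. As a rough illustration, a simple harness like the following can time writes and reads against any storage backend and report throughput; the `MemoryStore` class here is a made-up stand-in, and in practice you would point the same harness at each candidate provider’s S3-compatible client.

```python
import time


def measure_throughput(store, payload: bytes, n_objects: int = 50) -> dict:
    """Time n_objects writes then reads, and return MB/s for each phase."""
    start = time.perf_counter()
    for i in range(n_objects):
        store.put(f"bench/obj-{i}", payload)
    write_s = time.perf_counter() - start

    start = time.perf_counter()
    for i in range(n_objects):
        store.get(f"bench/obj-{i}")
    read_s = time.perf_counter() - start

    total_mb = n_objects * len(payload) / 1e6
    return {"write_MBps": total_mb / write_s, "read_MBps": total_mb / read_s}


class MemoryStore:
    """In-memory stand-in for an object store client (illustrative only)."""

    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]
```

Running the same harness with identical payloads against each provider gives an apples-to-apples comparison that published numbers rarely provide.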
Reviewing security measures
A hybrid cloud strategy cannot be seen as an opportunity to lower security standards for the public cloud, and the best providers ensure that sensitive data and workloads can run securely in their data centers. Top-tier security practices are essential for many organizations from a legal and regulatory perspective, which means that a certain base level of security is obligatory when picking a hybrid cloud solution.
The best way to start due diligence on the security side is to research the ISO security standards your organization needs to comply with. Then, look at which standards each potential provider complies with; providers often list their certifications on their own websites.
One feature that’s worth considering is a data immutability function. An immutable object storage bucket is one where the data stored within cannot be deleted or altered by anyone for a set retention period. This is particularly useful for files that you’d like to keep read-only, or for backups held against data loss elsewhere in your hybrid ecosystem. Data immutability prevents accidental deletions, administrative errors, and malware from tampering with or erasing this information, and in some jurisdictions and industries it can even help satisfy regulatory requirements.
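The retention logic behind an immutable bucket can be sketched in a few lines. The class and method names below are hypothetical, written purely to illustrate the concept; real providers expose immutability through their own APIs, such as S3 Object Lock.

```python
from datetime import datetime, timedelta, timezone


class ImmutabilityError(Exception):
    """Raised when an operation would violate a retention policy."""


class ImmutableBucket:
    """Toy model of a bucket that enforces per-object retention windows."""

    def __init__(self, retention: timedelta):
        self.retention = retention
        self._objects = {}  # key -> (data, retain_until)

    def put(self, key: str, data: bytes) -> None:
        # Overwriting an existing object would alter locked data, so refuse.
        if key in self._objects:
            raise ImmutabilityError(f"{key} cannot be overwritten")
        retain_until = datetime.now(timezone.utc) + self.retention
        self._objects[key] = (data, retain_until)

    def delete(self, key: str) -> None:
        # Deletes are only honored once the retention window has elapsed.
        _, retain_until = self._objects[key]
        if datetime.now(timezone.utc) < retain_until:
            raise ImmutabilityError(f"{key} is locked until {retain_until}")
        del self._objects[key]
```

The key design point is that the lock is enforced server-side for everyone, administrators included, which is what makes immutability effective against both human error and malware.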
Looking at pricing plans
The best storage plan for your data and applications will involve considerations of security, accessibility and, of course, cost. First, you have the “hyperscaler” public cloud providers, such as Amazon, Google and Microsoft. They typically offer varying storage tiers, each with its own pricing and performance characteristics: “hot” storage for frequently accessed data, “cool” storage for infrequently accessed data, and archive storage for data that is seldom accessed. These plans are inflexible when a business’s data requirements grow or its hot/cool/archive ratios change, which makes budgeting out future cloud storage costs quite difficult.
In addition, the hyperscalers also tend to charge their customers beyond the nominal cost of their tiers, in the form of fees for data egress or making API calls. This means that many companies can face steep price increases just for trying to read their data more, even if their storage requirements don’t otherwise change. This makes accurate forecasting almost impossible and adds an extra layer of complexity to configuring a hybrid cloud environment.
Newer providers are abandoning these practices, however, and are offering commodity pricing instead: that is, charging a flat rate per gigabyte stored per month, without regard to how often that data is accessed or how it is split between hot, cool and archive storage. This allows companies to scale up their storage requirements easily and budget for future use, and it also saves the time that would otherwise be spent micromanaging data usage on top of the rest of their hybrid cloud infrastructure.
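The forecasting difference between the two models is easy to see with a back-of-the-envelope calculation. The rates below are invented for illustration and do not reflect any provider’s actual pricing:

```python
def tiered_monthly_cost(stored_gb: float, egress_gb: float, api_calls: int,
                        storage_rate: float = 0.023,    # $/GB-month (illustrative)
                        egress_rate: float = 0.09,      # $/GB egressed (illustrative)
                        api_rate: float = 0.0000004) -> float:
    """Hyperscaler-style bill: storage plus egress plus per-request fees."""
    return stored_gb * storage_rate + egress_gb * egress_rate + api_calls * api_rate


def flat_monthly_cost(stored_gb: float, flat_rate: float = 0.006) -> float:
    """Commodity-style bill: a single per-GB-stored rate, no usage fees."""
    return stored_gb * flat_rate


# Reading the same 10 TB twice as often doubles egress and API charges on
# the tiered plan, while the flat-rate bill stays exactly the same.
baseline = tiered_monthly_cost(10_000, egress_gb=5_000, api_calls=2_000_000)
heavier_reads = tiered_monthly_cost(10_000, egress_gb=10_000, api_calls=4_000_000)
flat = flat_monthly_cost(10_000)
print(baseline, heavier_reads, flat)
```

With flat pricing, next year’s bill depends on one variable (how much you store), which is precisely what makes forecasting tractable.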
Affordability, performance, and security are all essential to making sure a hybrid cloud works, which makes it vital that you ensure your public cloud has these qualities. By taking the time to look at how different public cloud providers stack up, you’ll be in a stronger position to fully leverage the opportunities of a hybrid cloud strategy.
David Friend, co-founder and CEO, Wasabi Technologies