
Building and managing a cost-effective hybrid cloud strategy


Seventy-two percent of enterprises describe their cloud strategy as "hybrid first," a clear sign that today's businesses understand, or at least appreciate, the value of hybrid cloud solutions. But understanding the value and knowing how to start building a hybrid cloud strategy are two entirely different things.

For many organizations, managing the budget is the main roadblock to executing an operational, efficient hybrid cloud strategy. Several key factors should be considered in order to create a cost-effective solution that still delivers the value you expect to achieve.


The challenges


Building a successful hybrid cloud strategy requires optimization — for performance, compliance, security, and cost. The biggest obstacle to achieving optimization is the sheer complexity of it all. As hybrid cloud architectures include a mixture of public, private, and on-prem resources, complexity inherently increases as your hybrid cloud solution begins to materialize.

One point of difficulty is simply technical training. From one public cloud provider to the next, you may be surprised by the staggering variety of products and feature sets that require at least moderate technical expertise to use effectively. While certifications are readily available, not everyone has the time, or even the desire, to become an expert in every technology before beginning a hybrid cloud planning project.

Additionally, selecting and using the right tools can be tricky, if not downright intimidating. Hyperscale vendors, for example, offer plenty of tools to help you monitor and manage your workloads; however, their cloud-native solutions typically don't play well with other private clouds or with your on-prem systems. Likewise, your familiar "tried and true" on-prem tools were most likely never designed to integrate with half a dozen public cloud providers.

Finally, data ingress and egress transfers can become pricey. If your data never changes, that's easy. But actually getting your data into a public cloud provider, not to mention pulling it back out again, can quickly rack up a very expensive bill. One of the largest hyperscalers, for example, does not charge for data ingress into any service, but pulling data out, or even transferring it between services, will generally incur fees. Unfortunately, no one is exempt from this challenge, not even NASA.

It was reported that after selecting a public cloud provider for its Earthdata Cloud, NASA determined that its data footprint would grow from 32 petabytes to nearly 250 petabytes by 2025.

This astronomical yet unavoidable increase will weigh heavily on the cost of retrieving that data from its provider. If your data frequently moves between different clouds, or even between different services within the same cloud, pay very close attention to your design in order to minimize unnecessary movement between services and clouds, or look for a provider that is willing to waive or limit these fees.
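To see why egress planning matters, it helps to put rough numbers on a data pull. The sketch below is a minimal tiered-cost estimator; the per-gigabyte rates are hypothetical placeholders, since real hyperscaler pricing varies by region and service, so you would substitute your provider's published rates.

```python
# Rough egress-cost estimator. The tiered rates below are HYPOTHETICAL
# placeholders; real provider pricing varies by region and service.

HYPOTHETICAL_EGRESS_TIERS = [
    (10_240, 0.09),        # first 10 TB (in GB) at $0.09/GB
    (40_960, 0.085),       # next 40 TB at $0.085/GB
    (float("inf"), 0.07),  # everything beyond at $0.07/GB
]

def estimate_egress_cost(gb_out: float) -> float:
    """Return an estimated egress bill for gb_out gigabytes."""
    cost, remaining = 0.0, gb_out
    for tier_size, rate in HYPOTHETICAL_EGRESS_TIERS:
        billed = min(remaining, tier_size)
        cost += billed * rate
        remaining -= billed
        if remaining <= 0:
            break
    return round(cost, 2)

# Pulling a multi-petabyte archive back out, even once, is a budget line item:
print(f"${estimate_egress_cost(250 * 1024 * 1024):,.2f}")
```

Even at modest per-gigabyte rates, a one-time retrieval of a footprint measured in petabytes lands in the millions of dollars, which is exactly why transfer fees belong in the design phase, not the invoice-review phase.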

The solutions

The good news is that solid solutions exist for most, if not all, of these challenges. One easy way to better optimize your workloads and minimize costs is compliance with cloud mandates. By implementing, and (gasp) actually following, a well-designed cloud governance policy focused on consistency and predictability, you can minimize costs while increasing your applications' performance. Win-win!

When planning for the costs of your inevitable data transfers, pay special attention to what needs to access your data and how, and design your solution to prevent unnecessary ingress/egress.  The more you know about your data, the better your hybrid cloud optimization strategy will be, and you’ll see better performance and lower costs as a result.

Additionally, putting in the time to find the right tools will pay dividends after you've implemented your hybrid cloud solution. Seek out only those tools that are compatible with a variety of hosted environments, and avoid solutions that are specific to a single environment. This will help you unite your underlying infrastructure under a single interface, simplify deployments across multiple clouds, and standardize processes across workloads.
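That "single interface" idea can be sketched as a thin abstraction layer over each environment. The adapter classes and method names below are hypothetical stand-ins; in a real tool, each adapter would wrap the corresponding provider SDK or your on-prem automation.

```python
# A minimal sketch of a provider-agnostic tooling layer. The adapter
# classes and methods are hypothetical; real implementations would wrap
# each provider's SDK or your on-prem automation.

from abc import ABC, abstractmethod

class CloudWorkload(ABC):
    """One interface, many environments: public cloud or on-prem."""

    @abstractmethod
    def deploy(self, name: str) -> str: ...

class PublicCloudAdapter(CloudWorkload):
    def deploy(self, name: str) -> str:
        return f"deployed {name} to public cloud"

class OnPremAdapter(CloudWorkload):
    def deploy(self, name: str) -> str:
        return f"deployed {name} on-prem"

def deploy_everywhere(targets: list[CloudWorkload], name: str) -> list[str]:
    # The same call works regardless of where the workload lands.
    return [t.deploy(name) for t in targets]

print(deploy_everywhere([PublicCloudAdapter(), OnPremAdapter()], "billing-api"))
```

The payoff of this shape is that adding a second public cloud, or retiring one, means writing or deleting one adapter rather than rewriting every deployment script.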

Flexibility in your approach is key to optimization. Your organization’s needs will evolve as your business grows. You will need to redesign or reconfigure your environment perhaps multiple times down the road. Revisit your workloads and applications, legacy and new, regularly to determine how to consistently increase performance and scalability. Staying nimble will also help you stay on top of your total costs.

All things considered, if your organization is looking to experiment with the public cloud, or if you need to keep certain legacy systems on-premises, you will find real value in hybrid cloud solutions.

By knowing your options, staying compliant and flexible, understanding the rules of your public cloud providers, investing in the right tools, and following best practices, you'll be well on your way to successfully and cost-effectively managing your very own hybrid cloud solution.


Gabe Matthews is a Senior Cloud Engineer for Otava. With more than 20 years of experience in I.T., Gabe brings a unique and broad perspective to whatever challenges he faces. Beginning with dial-up internet technical support for a southern Indiana internet service provider in the late '90s, progressing to today's brave new world of virtualization and cloud computing, his career spans a broad swath of the technological revolution. A life-long learner, Gabriel has earned certifications in Microsoft Active Directory, SQL Server, Networking, and more recently, Veeam Backup & Replication. He enjoys writing and teaching, sharing his perspective and experience with others whenever possible.