Decisive leadership, business expertise, ultimate responsibility for successes and failures—those are the expectations every C-Suite executive must own and direct their collective skills toward meeting. A fourth, perhaps less personality-driven and more dependent on grit and longevity, is knowing the domain in which you operate.
For the C-Suite, the domain has never been limited to product, but now the overarching technologies it takes to execute on product are evolving so quickly that they require extra attention. Disruptive technologies are real. We learn them or we fail. Cloud is one of the most pervasive disruptors—discussed everywhere—yet still one of the least well understood. The six strategic truths about cloud reported here certainly aren’t the only nuggets of wisdom needed on a cloud journey, but they’re a critical start.
Building In-house Cloud Automation Is a Marriage, Not a Fling
Enterprises can’t compete without automation. Somewhere in your company, there’s a team trying to build cloud automation. But unless you’ve got a Netflix-like operation under the hood, chances are that team has underestimated how hard the task is, or has realized it and is struggling. And chances are management and executives are in for a surprise. The automation probably won’t work well, and it won’t scale quickly, correctly, and securely. Go find that team, and make sure you’re aware at a detailed level of what’s happening. Why? Because they have become the nexus impacting many executive-level plans—your multi-year projections, what you’ll be able to accomplish, and how much it will cost.
When you dig in with them, look for danger signs. First, is that team or its leadership telling you cloud automation is less than a one-year project? Rest assured, it is not. Second, is anyone telling you that effective automation can be assembled from a pile of scripts and point solutions? Be skeptical. Netflix, rejecting the pain those kinds of shortcuts bring, hand-rolled full system software. Even as they adopted containers within that system, they forked and wrote their own container technology to solve scaling issues that weren’t solved by market-hot container brands. They accounted for a thousand contingencies. That’s the level of effort it took to do it right in-house.
After (and if) your team gets an initial collection of parts working, expect to hit major, unforeseen obstacles. Expect no one to know why there are dragons under the surface or exactly how to slay them. None of this means your team isn’t smart, devoted, and hard-working. Were they given the resources needed? Is the permanent, demanding nature of cloud automation thoroughly understood by everyone on the technical teams across the shop? DIY cloud automation is a massive commitment that requires massive investment. It won’t be accomplished with 10 people; for an enterprise, it will take closer to 100. It’s not a one-year affair. It’s a marriage of constant work and compromise. And keep in mind that the pace of cloud services development continues to move at lightning speed, while your team is finding its footing. The catch-up work is ever expanding and permanent. That’s what in-house cloud automation means.
There Are No Silver Bullets for Hybrid Cloud
While you’re swallowing the bitter “in-house automation” pill and looking outside the box for solutions, it’s likely your enterprise is also employing a hybrid cloud model. Managing private cloud, public cloud, and an on-premises data center is a complex endeavor that requires tracking and tuning all manner of connective layers. And if you’re making a transition, that requires a graduated re-architecture and implementation. When someone tells you—and someone will—that they have a solution for hybrid cloud that makes public and private cloud work the same, or that they can easily federate your environments, raise a red flag.
You’re probably being offered a solution that’s limiting your team technologically in a profound way. If you choose a reductive product or platform, ultimately that’s to the advantage of your competitors. For example, imagine you decide to distill cloud computing to a containers-only implementation because you can run containers both on your in-house infrastructure and on the cloud. That’s sold as a big answer to cloud, especially regarding standardization and management. But, it’s not nearly a big enough answer. It’s severely limiting. All kinds of other technologies get ignored with a container-centric strategy. As the proliferation of cloud resources results in more and better offerings and strategies, your teams may grow frustrated with the limitations of containers. Where are the integration answers? They don’t exist yet. There are no silver bullets in this space.
The closest you can get is using an agnostic solution that doesn’t lock you into one set of cloud technologies or one path, but instead accounts for what’s coming, whatever it may be. That has to be a single system that provides for API integration regardless of the nature of the services in cloud provider innovation pipelines.
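One way to picture an agnostic approach is a thin, provider-neutral interface with per-cloud adapters behind it. The Python sketch below uses entirely hypothetical adapter and method names; the point is the shape of the idea: business logic depends only on the interface, so adding or swapping providers never touches it.

```python
from abc import ABC, abstractmethod

class CloudService(ABC):
    """Provider-agnostic interface; callers depend on this, never on a vendor SDK."""
    @abstractmethod
    def provision(self, spec: dict) -> dict:
        ...

class AwsAdapter(CloudService):
    def provision(self, spec: dict) -> dict:
        # A real adapter would call the provider's API here; this stub just
        # returns a result in the shared shape (names are illustrative).
        return {"provider": "aws", "resource": spec["name"], "status": "created"}

class GcpAdapter(CloudService):
    def provision(self, spec: dict) -> dict:
        return {"provider": "gcp", "resource": spec["name"], "status": "created"}

def deploy(service: CloudService, spec: dict) -> dict:
    # Business logic is written once, against the interface: swapping providers,
    # or adding one that doesn't exist yet, never touches this code path.
    return service.provision(spec)
```

New services land as new adapters behind the same interface, which is what “accounts for what’s coming, whatever it may be” means in practice.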
A System of Freedom, Not Doctrine, Wins
Avoid approaches, some held over from data center thinking, that force-fit your applications into a predetermined set of cloud templates. That anti-pattern removes much of the value of cloud. If teams have only a multiple-choice menu of what they can do on the cloud, because those choices made it through your policy and security folks, all you have is a remote data center from a developer’s perspective. You’ve taken away your developers’ ability to innovate and missed the biggest point of being on cloud.
Somewhere out there are competitors who won’t make that mistake—their developers will outpace and outperform your developers because they have freedom. To avoid the multiple-choice, force-fit conundrum, what’s needed is a way to enforce security policies and compliance regimes in a dynamic world that promotes experimentation. Give developers a system that lets them explore, create, and try new tactics—and tells them when they break the rules. That’s quite different from a platform with a “here’s the only set of things you may do” interface. The latter is far too limiting. Its day-in, day-out effect is to crush innovative architecting and coding, to stifle invention. The degree to which your developers have freedom will make or break your competitive edge.
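Policy-as-code is the usual mechanism for this “freedom plus guardrails” posture: rules are expressed as checks that run against whatever developers build, reporting violations rather than restricting the menu up front. A minimal Python sketch, with entirely hypothetical rules and resource fields:

```python
# Hypothetical policy rules: each is a (description, predicate) pair over a
# resource description; the predicate returns True when the resource complies.
RULES = [
    ("storage must be encrypted",
     lambda r: r.get("type") != "bucket" or r.get("encrypted", False)),
    ("no resource may be world-readable",
     lambda r: r.get("acl") != "public-read"),
]

def check(resource: dict) -> list:
    """Return the list of policy violations; an empty list means compliant.

    Developers can build whatever they want; the system only reports rule
    breaches, rather than presenting a fixed menu of pre-approved choices."""
    return [name for name, ok in RULES if not ok(resource)]
```

The inversion matters: the checks constrain outcomes, not the developer’s palette of tools.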
Beware of Relentless Acceleration and Savage Competition
In tandem with the last point, and worth driving home: the cloud is now a battlefield on which you’re competing to provide benefits and value to your customers. Everyone’s there, every competitor known and unknown, and the good ones are already leveraging capabilities around speed, innovation, and iteration. It’s already happening.
For companies that aren’t leveraging those capabilities, that failure could be existential no matter what industry you’re in. The digital transformation is very real and cloud accelerated it. Uber and Airbnb grabbed market share because cloud computing slammed pace into fifth gear from second gear in transportation and hospitality. The pace that enterprise was used to is gone forever. Good developers—who understand cloud thoroughly, who share best practices across the organization, and whose creativity is enabled by systems that don’t force lowest common denominator patterns on them—accelerate a business’s pace in the cloud. To some degree, those developers will behave as free agents. They will go where they’re liberated to be creative and constructively challenged to invent. If another company has a King in the North or a fireproof Breaker of Chains, you’re out of luck.
Cloud Is Democratizing the Software Eating the World
Executives are charged with seeing the broader connections in industry and developing insights based on those connections. So, in the most rapidly developing tech sectors, how do cloud, the Internet of Things (IoT), and artificial intelligence (AI), for example, play together? How are they connected? A fundamental truth to dwell on is that they are all the software eating the world, to use Marc Andreessen’s famous phrasing. What’s cloud? Cloud is driving infrastructure purely via software, not via human beings. What’s IoT? IoT is exposing devices that we haven’t traditionally thought of as computing devices to software. AI? Neural networks and deep learning depend on nontraditional algorithms that process massive amounts of data, infer rules, recognize patterns, and produce results that cannot always be fully explained. Cognitive computing might ultimately be a more descriptive term than AI for the kinds of non-deterministic systems being developed.
What’s so compelling in all this, in expansions of software so sophisticated, is that mind-numbingly powerful technologies like deep learning are becoming accessible to every proverbial garage engineer—thanks to the cloud. We’ll see more and more cloud services of this nature and beyond with APIs on them. Using those effectively will be a value differentiator for organizations for sure. It’s an absolute marker of democratization when the biggest, most powerful stuff in computing just shows up one day with an API on it and you’ve got access. The potential is exciting and extraordinary, but it also represents a profound problem (outside of obvious ethical considerations that might be firing off in your head!).
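The “just shows up one day with an API on it” point is worth making concrete. Calling a cloud inference service typically amounts to assembling a small JSON payload and POSTing it to an endpoint. The sketch below builds such a payload; every field name and the model identifier are invented for illustration, not any vendor’s real schema:

```python
import json

def build_inference_request(image_b64: str, model: str = "image-classifier-v1") -> str:
    """Assemble a JSON payload for a hypothetical cloud inference API.

    The schema here is illustrative only; each real service defines its own.
    The point is how little code stands between a developer and a deep
    learning model once the model is exposed as an API."""
    payload = {
        "model": model,
        "inputs": [{"type": "image", "data": image_b64}],
        "options": {"top_k": 5},
    }
    return json.dumps(payload)
```

A few lines like these, plus an HTTP client and credentials, and the proverbial garage engineer is running deep learning at scale.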
Imagine a business application, the boundary of which encompasses your traditional infrastructure, typical cloud services, and also these new machine learning and cognitive learning services exposed by the cloud. How do you begin to integrate and control that system? That’s not a problem widely solved and neatly packaged by an array of vendors. Not at all. It requires a system that can tackle those uniquely challenging boundaries and provide the level of repeatability and predictability businesses need. Government too for that matter.
We Live in a Brave New World of Inverted Computing Truths
A shift in computing has been underway, accelerated in the last decade, that enterprise leaders must face, ponder, and use to their advantage. In the past, computing hardware was consistent and predictable. You could point at the machines, touch them, and name them. You knew what the hardware was, so the entropy you had to fight was in software. Now, that’s inverted. Now you write a program and it runs on thousands of computers in the cloud and you have no definition of the overall system. The hardware itself has become the highest variability part of the system because its use is constantly moving. The hardware is dynamically being allocated.
So, as unpredictable and difficult as it is to get determinism out of software, the cloud computer your applications are running on, made up of so many disparate components, is even more difficult to predict: how it will perform, how much it costs you at any given moment, what the shape of tomorrow looks like as new, powerful services become available. Everything’s highly mutable, and the constantly evolving computer is the new normal. Hardware is now the moving part. We’ll be dealing with this rapid change for decades if not longer. Because of this, expect the rise of functional languages like Clojure and Haskell to continue in general, and Ludwig for infrastructure and policy specifics across cloud. When hardware is in flux, software needs more discipline. Haskell and languages like it embody one of the stricter programming styles ever invented.
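The discipline functional languages impose can be approximated even in mainstream languages. This Python sketch (with illustrative field names) shows the core habit Haskell enforces by default: immutable values and pure functions, so the same inputs always yield the same outputs with no hidden state to fight.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Instance:
    """An immutable description of a compute instance (fields are illustrative).

    frozen=True means the value can never be mutated after creation; any
    "change" must produce a new value instead."""
    name: str
    cpus: int
    region: str

def scale(inst: Instance, factor: int) -> Instance:
    # Pure function: no mutation, no side effects; it derives a new value,
    # leaving the original untouched.
    return replace(inst, cpus=inst.cpus * factor)
```

When the hardware underneath is the volatile part, this kind of rigor in the software layer is what keeps the overall system reasoned-about rather than guessed-at.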
Some of this is daunting, but the cloud journey is worth taking. Cloud is a juggernaut that’s unstoppable now, regardless of our levels of preparation. For most companies, its use is inevitable. Being on the strategic edge, with a view of what’s really happening and armed with knowledge that compels you to act, is probably the best place for the C-Suite to be.
Josh Stella, Co-founder and CEO of Fugue
Image Credit: TZIDO SUN / Shutterstock