Data growth hasn’t gone unnoticed in recent years, but how many of us realise just how fast it is accelerating? To put it into perspective, 90 per cent of the data that exists today was created in the past two years alone. The biggest change is that most of the data generated today comes from machines, not from humans using applications. Compounding the problem, data has gravity - if it is created at the edge of your network or in your factory, you may be unable to move it to a central data centre or the cloud to process it in real time, as your business may require.
So data gravity really does matter, and there is a growing need to analyse and react to data where it is created. Artificial intelligence (AI) and deep learning (DL) are taking what’s possible with analytics to the next level, and they are impacting every industry without exception. In fact, Gartner predicts that by 2020 AI will be pervasive in almost every software-driven product and service.
The opportunity AI presents is particularly exciting. Not only can it help analyse big data at a scale and speed beyond human capacity, it also makes it possible to build exciting new services and products for customers. The efficiency and reliability AI brings to operations, through automation within the infrastructure itself, also deserve a mention.
The challenge in implementing a cohesive data strategy is that infrastructure has typically been built up over time rather than designed from scratch. As a result, it is most likely fragmented and far from cloud-like: simple, scalable and agile. Given the importance and value of data, it is time to re-think IT infrastructure from the bottom up and put data at the heart of the design. That means investing in a truly data-centric architecture, built on five key principles.
Consolidated & simplified
To achieve the full potential of data and boost efficiency, it is critical to move away from data islands and instead consolidate your data. This is where all-flash changes the game, consolidating numerous applications into large storage pools, so that what were once multiple tiers of storage are simplified into one. This drives efficiency, agility and security, and management can be converged as a result, ensuring that storage plugs in neatly to your infrastructure orchestration strategy.
Designed for real-time
The second pillar of your architecture is designing it for real-time, because today slow data simply is not an option. Real-time data makes applications run faster, customer experiences instantaneous and immersive, and employees more productive. It is also worth pointing out that real-time means not only real-time data but also real-time copies: the ability to take copies of data and easily share them between multiple users - for example, fresh copies of production data shared with test and development.
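To make the real-time copies idea concrete, here is a minimal sketch of snapshot-and-clone semantics. The classes and method names are invented for illustration, not any vendor’s API; a real array exposes the same idea (point-in-time, space-efficient copies) through its management interface, modelled here as a simple deep copy.

```python
from dataclasses import dataclass, field
import copy

@dataclass
class Volume:
    name: str
    data: dict = field(default_factory=dict)

class StoragePool:
    """Toy model of a consolidated storage pool (illustrative only)."""

    def __init__(self):
        self.volumes = {}

    def create(self, name):
        self.volumes[name] = Volume(name)
        return self.volumes[name]

    def snapshot_clone(self, source, clone_name):
        # A point-in-time copy of the source volume; a real array would
        # do this space-efficiently, but a deep copy shows the semantics.
        src = self.volumes[source]
        clone = Volume(clone_name, copy.deepcopy(src.data))
        self.volumes[clone_name] = clone
        return clone

pool = StoragePool()
prod = pool.create("prod-db")
prod.data["orders"] = [1, 2, 3]

# Hand test/dev a fresh copy of production without touching production.
dev = pool.snapshot_clone("prod-db", "dev-db")
dev.data["orders"].append(99)   # experiment freely on the clone

print(prod.data["orders"])      # production data is unchanged
print(dev.data["orders"])       # the clone has diverged safely
```

The point of the sketch is the isolation: test and development get data that is fresh, while production remains untouched.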
On demand & self-driving
The third pillar of your data centre is on-demand and self-driving, representing a paradigm shift in how we think about operating storage for businesses. What if your storage team could stop being a storage operations team and instead see its mission as the in-house storage services provider for the business? What if it could deliver data-as-a-service, on demand, to each of the development teams, just as those developers could get from the public cloud? And instead of a never-ending cycle of reactive troubleshooting, what if your storage team could spend its time automating and orchestrating the infrastructure to make it self-driving and ultra-agile?
For this to become a reality, though, it will require significant changes in how you operate. First, you must anticipate the business’s needs and design a set of elastically scalable storage services that let you build ahead of consumption. On the front end, it is all about standard services and standard APIs; on the back end, it is all about automation over administration - and the tools needed to make that happen are becoming increasingly accessible and sophisticated.
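The “standard services, standard APIs” idea can be sketched as a small self-service catalogue: developers request storage from a published menu, and the back end validates the request and turns it into a provisioning order for the automation layer. Every name below (the catalogue entries, tiers and limits) is illustrative, not a real product’s interface.

```python
# Hypothetical service catalogue published to internal development teams.
CATALOGUE = {
    "dev":  {"tier": "standard", "max_gb": 500},
    "prod": {"tier": "premium",  "max_gb": 10_000},
}

def provision(service: str, size_gb: int) -> dict:
    """Validate a self-service request against the catalogue and return
    the provisioning order the automation layer would execute."""
    if service not in CATALOGUE:
        raise ValueError(f"unknown service: {service}")
    spec = CATALOGUE[service]
    if size_gb > spec["max_gb"]:
        raise ValueError(f"{service} requests are capped at {spec['max_gb']} GB")
    return {"service": service, "tier": spec["tier"], "size_gb": size_gb}

# A developer requests 100 GB from the "dev" service.
order = provision("dev", 100)
print(order)
```

The design choice is that the catalogue, not a human administrator, enforces the rules - which is what lets the storage team spend its time on automation rather than ticket queues.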
Multi-cloud by design
Architectures of the future will be multi-cloud - even if you run everything on-premises. Take a look at your existing environment: you have a production cloud, you probably support multiple development environments, it is more than likely that you host a cloud for analytics, and you almost certainly operate a global backup network. Each of these environments increasingly aspires to run like the cloud, with those same attributes of being simple, on-demand, elastic and a driver for innovation.
At the same time, each has its own requirements, which means you have to design a next-generation data strategy that delivers the cloud data experience to every one of these environments while also delivering the unique capabilities each demands.
This is why your data-centric architecture should be designed with multi-cloud in mind - on the assumption that you will need to manage data across multiple clouds, with the data portability and openness to make this possible. If you don’t design for this, you face a real danger of lock-in, because data is the foundation of any infrastructure.
Prepared for what’s next
The final element to embrace is the reality of how fast the world of data is actually moving. Eight years ago, 1PB of flash required six racks of space, and AI was a research project. Fast forward to today: 1PB can be stored in less than 3U, and AI and automation are becoming mainstream technologies. Data itself moves just as fast, so it is imperative to design an architecture that has the performance for tomorrow but is built to evolve and innovate for years to come.
So what does this all mean? What if you could implement a data-centric architecture that was consolidated on flash, enabled real-time across the business, and delivered data-as-a-service to each of your internal clouds?
A data-centric architecture allows core applications to be simplified while reducing the cost of IT. It empowers developers with on-demand data, making builds faster and providing the agility required for DevOps and the continuous integration and delivery pipeline. It also enables the delivery of next-generation analytics and acts as a data hub for the modern data pipeline - including powering AI initiatives.
In a nutshell, a data-centric architecture will provide you with the platform needed to accelerate your business and stay one step ahead of the competition – today, tomorrow and ten years from now.
Peter Gadd, VP Northern EMEA, Pure Storage