The big data advantage, part three: ‘The Dude abides’

We know the ferocity of big data will only continue to grow, and the need for systems that can perform at super-scale will grow along with it.

In my prior two articles on big data analytics, I highlighted the vast opportunities available to companies that effectively take advantage of big data, as well as the many obstacles that prevent organisations from attaining those tangible benefits, including:

  • Increasing complexity across all fronts
  • An onslaught of analytics tools available to organisations
  • The difficulty of retaining the right skillsets within IT teams
  • Delays in receiving insights and executing informed business decisions

For a full explanation of how these barriers manifest within organisations and how they can affect business, see part two of my big data advantage series, “this is a very complicated case… you know, a lot of ins, a lot of outs, a lot of what-have-yous.”

It is important for organisations to remember, however, that these hurdles can be overcome, regardless of how high they may seem and how far away the organisation is from achieving a big data advantage. Before jumping into a proof-of-concept or costly experiment, there are organisational, analytics and technology considerations that you can evaluate now and use to develop a successful strategy.

In fact, Gartner says that one of the main reasons big data projects fail to come together is a lack of proper management, combined with lots of trial and error from companies that just throw systems together willy-nilly. One example of the organisational challenges you should consider is the diversity of teams that need to be involved. A 2016 ESG (Enterprise Strategy Group) survey, for example, found that seven different areas of competence were seen as crucial or important to have engaged in a big data project. If you think about what it's like to coordinate just one meeting with seven different groups, the difficulty of gaining consensus among all those unique concerns and interests becomes apparent.

Furthermore, different teams will likely have different processes for defining requirements, evaluating products and integrating systems. Beyond this coordination issue, many companies do not have the variety of expertise required in-house for a complete analytics solution, and some don't even know what expertise they're lacking. A thorough evaluation of staff skills, cross-organisational processes and how outcomes will be measured is therefore a solid first step.

When it comes to big data analytics, there are a few trends worth considering that can have an impact on goals and strategy. Although we are all familiar with the volume, variety and velocity of big data, mentioned in my previous blogs, few of us can really grasp what our industry will look like when the amount of data, or its speed, doubles, triples or quadruples. If you build a solution that is only ready for your needs today, you'll shortly be disappointed. That means you'll need to give yourself headroom, planning for scale and dynamic change. Flexibility is doubly important given the rate at which analytics tools are proliferating, so those taking a long view of ROI should ensure their solutions aren't locked into just one or two uses.

For innovative businesses grappling with the technological realities of big data, we believe an agile analytics environment, one that combines system agility, speed, performance and accessibility, is needed to deliver high-frequency insights efficiently and consistently. Given the difficulties mentioned above, a pre-integrated and validated hardware-software solution is an option that lets organisations significantly shorten time-to-value and be ready for data in days rather than months. When evaluating this type of platform, consider whether you'll be able to easily run and manage multiple analytics workloads (such as Spark, Hadoop and graph, among others), so you can implement increasingly complex workflows within a single system. It's also clear that open, standards-based frameworks are important for the inevitable customisation and for supporting emerging tools, so look for systems that leverage common technologies like OpenStack, Kafka and Docker.
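
To make that multi-workload idea concrete, here is a minimal sketch, not a description of any specific product, of how two of the frameworks named above can be combined in one workflow: Spark Structured Streaming consuming events from a Kafka topic and aggregating them on the same platform. The broker address, topic name and windowing logic are all hypothetical placeholders.

```python
# Minimal sketch: streaming ingest (Kafka) and analytics (Spark) on one platform.
# Assumes a Spark cluster with the spark-sql-kafka connector package available;
# the broker address and topic name below are placeholders, not real endpoints.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = (SparkSession.builder
         .appName("agile-analytics-sketch")
         .getOrCreate())

# Subscribe to a Kafka topic as a streaming DataFrame.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
          .option("subscribe", "sensor-events")              # placeholder topic
          .load())

# Kafka delivers values as bytes; cast to string, then count events
# per one-minute window using the record timestamp Kafka provides.
counts = (events
          .withColumn("value", col("value").cast("string"))
          .groupBy(window(col("timestamp"), "1 minute"))
          .count())

# Write the rolling counts to the console for inspection; in practice the
# same DataFrame could feed a machine-learning stage on the same cluster.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```

The point of the sketch is the architecture rather than the code itself: because ingest and analysis share one system, the aggregated stream could be handed straight to a training or scoring step without moving data between platforms.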

As I mentioned earlier, we know the ferocity of big data will only continue to grow, and the need for systems that can perform at super-scale will grow along with it. With this in mind, organisations looking to harness the power of big data must adopt an agile analytics platform designed around the needs described above. That includes the ability to achieve high-frequency insights, from basic analysis and streaming analytics to machine learning, on a single platform, even with dynamic workflows. By doing so, your organisation can move far beyond its former limitations, achieving accelerated time to value as well as the freedom to evolve for whatever needs and challenges may materialise in the future.

In summary, let's work together on transforming your toughest obstacles into your biggest opportunities. And that's really the point, isn't it? We know you can do amazing things, and we know we can help get you there, because it's what we have always done. So to continue my string of references to the film The Big Lebowski, my final quote will be, appropriately: “The Dude abides.”

Amy Hodler, Senior Analytics Product Marketing Manager, Cray

Image source: Shutterstock/wk1003mike