Digital transformation - the process of using machines, networks, operations, applications, and business processes to improve an organization’s ability to remain competitive in the digital economy - relies on data. The new world of digital-everything is using analytics to predict and respond to the forces influencing business success, from tracking customer habits to predicting process failures before they happen.
The move to digital transformation comes just in time. The growth of online data and the advent of big data now make it nearly impossible to run any good-sized business on paper. To keep up with the influx of new data, businesses need to digitize their assets, and every executive and manager is working to catch up with or get ahead of their competitors in reaping the rewards of the all-digital enterprise.
But there’s a fair distance between the vision and the reality, and that’s where the problem – or, to use the language of business, the challenge – lies: how do we bridge the gap and achieve digital transformation?
Answering this question will require taking a new look at how we create value for the organization, and how that value is powered by the native language of our digital world, the data itself.
Data: Costs Unsustainable
How do we get there? We start by taking a new look at data.
Data used to be relatively cheap to store and manage. We’d take it from the daily events of the business, then index it and transform it into compatible formats for use by enterprise applications and databases. This is still happening today, of course. Business units are bypassing IT and going to clouds and other resources for their processing. Everyone gets the data they need, and in the format they need, to do their work.
But the cost of storing and processing the data is growing faster than the rate of data growth itself. This model is unsustainable, and it has forced organizations to pick and choose the data they believe is valuable, based on past needs rather than future ones.
It’s clear that the growth of data is outstripping our ability to handle it by the same methods – duplicating and changing it again and again to suit each new data store design. Digital operations require near-real-time coordination among an organization’s various machines and processes. This reveals a fatal flaw in current thinking: cross-division audit trails and other complex workflows can be stopped in their tracks by incompatible data types and formats. Incorporating data-cleansing or other processes adds complexity and cost, and makes it even harder, and more expensive, to work back to the original meanings of the data when it was first created.
Analytics as a Data Service
We also need to look at analytics in a new way, because analytics will help us get the most out of the upcoming petabytes of new data.
Analytics, or BI, used to be thought of as a sub-function of IT, mainly useful for answering questions from the business unit. This model continued as business units brought in their own analytical resources, but most questions continued to be fairly narrow in scope because they were based on the specific needs of the unit.
But because today’s analytics platforms are capable of handling massive volumes of data in real time, they should now be targeted directly to the data, and in particular to the event-streams of original new data that come into website, firewall, and other enterprise logs. The analytics can then capture and work with this data before it gets indexed and modified, thus preserving it as a single version of truth.
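The approach described above – querying raw event streams before they are indexed or reshaped – can be sketched in a few lines. This is a minimal illustration, not any particular vendor's implementation; the Apache-style access-log format and the `analyze` function are hypothetical examples chosen for clarity.

```python
# Sketch: run analytics directly on raw log events, without first
# indexing or transforming them. The original lines stay untouched,
# so they remain the single version of truth for any later question.
import re
from collections import Counter

# Hypothetical example format: an Apache-style web access log.
LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(\w+) (\S+) [^"]*" (\d{3})')

def analyze(raw_events):
    """Aggregate HTTP status codes straight from the raw event stream."""
    status_counts = Counter()
    for line in raw_events:
        match = LOG_LINE.match(line)
        if match:
            method, path, status = match.groups()
            status_counts[status] += 1
    return status_counts

events = [
    '10.0.0.1 - - [01/Jan/2024:12:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.2 - - [01/Jan/2024:12:00:01 +0000] "POST /login HTTP/1.1" 401 128',
    '10.0.0.1 - - [01/Jan/2024:12:00:02 +0000] "GET /index.html HTTP/1.1" 200 512',
]
print(analyze(events))  # Counter({'200': 2, '401': 1})
```

Because the analysis reads the events as they arrived, a different question tomorrow – failed logins by source address, say – can be answered from the very same unmodified stream.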
In this way, the analytics platform becomes a data-answers-as-a-service for business units, IT, security and other users. And it can let the data tell its own story, even if that story crosses multiple business units.
Conventional BI-like analytics are designed to solve problems, and often those problems, and the questions the users ask, are narrow in scope. When analytics is used as an enterprise-wide data access service, questions can be more wide-ranging and open-ended. Organizations can move from “How can we fix this?” to “How can we get value from this?”
Data Analytics Benefits
In a 2018 survey of Fortune 1000 executives, NewVantage Partners found that more than 97 percent are actively investing in big data and analytics, but just 73 percent are realizing measurable benefits. Moving to a data-analytics initiative with an enterprise-wide perspective helps CIOs and CEOs maximize their analytics investments by reducing unit-by-unit competition for separate budgets, and lets the CIO determine which expenditures are best for the organization as a whole.
And while the longer-term benefit of this approach comes from its ability to provide visibility across the enterprise’s operational boundaries, it can fit easily into an initial role with one department – security, for instance – and plug into other organizational analytic and data management resources as needed. It then can grow as needed to handle the inevitable expansions of real-time event streams, bringing on new department clients as their data needs increase.
Simplifying, rather than complicating, IT infrastructure will be key to making headway in the new order of digital operations. A data analytics initiative that takes event streams from operational systems – warehouses, production facilities, supply chains, ERP, security and other sources – as well as from online traffic, and delivers a top-down view to simplify, rather than obscure, operations management, is critical.
Perhaps most important, it will give businesses a clear vision of how to capitalize on the convergence of big data and analytics. The two seem like a perfect match but today’s executives are having a hard time figuring out the details. The need for a new approach is clear, and executives are waiting.
Colin Britton, Chief Strategy Officer at Devo