Investing in Big Data solutions is necessary for gaining an insight advantage, but it is not sufficient. A number of supplementary conditions also need to be met.
The most important is that IT is seen as just one component of the solution, albeit an important one. All areas of the operating model need to be aligned for any technology project to work – processes, metrics, competencies, organisation design and culture. That means involving other functional areas – HR, Transformation, Analytics and so on – in addition to IT.
And alignment of these areas requires any such initiative to be business- rather than IT-led. Indeed, research by Gartner has found that less than 10 per cent of Big Data proofs of concept (PoCs) progress to implementation, with lack of alignment with business objectives cited as the main reason so few proceed.
A PoC should do two things – prove that the technology works, and prove that it works in such a way as to deliver business value. The second is harder to achieve than the first. Because a PoC runs under laboratory conditions, business performance cannot be fully optimised. Also, the competencies required to use new tools to their full capability take time to develop – typically longer than the PoC period.
Finally, any experiment requires assumptions to be made – over time these can be refined, but in the first instance they are quite likely to be flawed. Even with these three challenges, however, the data generated should enable a reasonable inference of the run-rate benefits achievable once the solution is in production, the competencies have been developed and the assumptions have been refined. If the PoC is not set up to achieve this, it becomes little more than a proof of technology.
As well as being business-led, Big Data initiatives require support from the leadership team. This cements the business-led requirement highlighted above and it also enables the required culture change. Despite talk of data being an asset, most organisations have not invested in ‘small’ or operational data, specifically with regard to quality – accuracy, completeness and timeliness.
Maintaining data quality requires governance – a steering board with clearly defined roles and responsibilities, plus KPIs for which a senior business executive is held accountable – and resources in the form of a data stewardship team using appropriate tools to control data management processes, monitor quality and fix any problems they find.
But this all costs money and the business case for investment has never been as obvious as with other parts of the IT landscape. The benefits of high quality data are much fuzzier than the headcount or time savings offered by improved integration and automation. One consequence has been data being viewed as digital exhaust – a by-product of IT-enabled processes, transactions and interactions – not a valuable digital asset. The exhaust metaphor highlights parallels with the automotive world.
As cars became commonplace in the 1950s, exhaust emissions increasingly became a problem – an untreated pollutant that caused smog, exacerbated breathing problems and caused lead poisoning. Vehicle emission regulations were introduced in the mid-1960s (with California at the forefront) and these were initially met by de-tuning engines. But, as standards were progressively tightened, de-tuning started to severely impact fuel efficiency, until the only option was to develop an alternative. This led to the introduction of catalytic converters in 1975. These converters performed no other function than to reduce exhaust emission toxicity, making them an expensive but necessary evil driven by regulatory requirement (and societal benefit).
With customers incurring a cost for catalytic converters for a collective rather than individual benefit, there was an incentive to find an alternative that contributed to automotive performance rather than just mitigated harmful effects. And by the late 1980s turbocharging technology – which harnessed exhaust gases to improve power and efficiency – had become standard. Today the technology has evolved such that a modern small car can achieve 60 miles per gallon while driving across a city and emit cleaner air than it takes in.
Figure 1 - Automotive Turbocharger
The history of data in regulated industries such as financial services follows a similar arc. Since the financial crisis, banks have invested heavily in data management, but primarily to meet regulatory reporting requirements. As with the catalytic converter, the expenditure has been enforced by regulatory necessity rather than strategic choice.
But, with personalisation a key strategic objective for many organisations and advances in analytics and decision management engines acting as the enabling technology, the importance of digital exhaust in optimising business performance is starting to be recognised. Data management is exiting its catalytic converter stage and becoming the organisational turbocharger.
This brings us to the next success criterion – insight advantage requires a system rather than a solution approach. Big Data technologies are only part of the IT capabilities required to optimise customer-facing performance through the generation of next best actions that drive retention, revenue generation and customer lifetime value (see Figure 2 below). These include an omni-channel platform for handling digital interactions, traditional data warehousing and an analytics layer that incorporates at-rest and real-time capabilities, all of which contribute to an enhanced single customer view.
The at-rest capability is made up of discovery tools that enable visualisation of patterns which, in turn, inform predictive models that update propensity and other scores on a batch basis. The real-time capability uses in-session online behaviour or other expressions of intent (for example, broadcasting dissatisfaction on social media or complaining to service-centre staff) to update scores as appropriate.
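The two scoring paths can be illustrated with a minimal sketch. Everything here – the class, function and field names, the signal format and the weighting – is hypothetical, chosen only to show the shape of a batch (at-rest) refresh alongside a real-time adjustment triggered by an intent signal; a production implementation would sit on the warehouse and streaming layers described above.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    customer_id: str
    propensity: dict = field(default_factory=dict)  # score per candidate action

def batch_rescore(profiles, model):
    """At-rest path: recompute propensity scores for all customers on a batch basis."""
    for p in profiles:
        p.propensity = model(p)

def realtime_adjust(profile, signal, weight=0.2):
    """Real-time path: nudge scores when an in-session intent signal arrives
    (e.g. a complaint lowers 'upsell' and raises 'retention_offer')."""
    for action, delta in signal.items():
        current = profile.propensity.get(action, 0.0)
        # Clamp the adjusted score to the [0, 1] range
        profile.propensity[action] = min(1.0, max(0.0, current + weight * delta))

# Illustrative usage with a stub model in place of a real predictive model
profiles = [CustomerProfile("c1")]
batch_rescore(profiles, lambda p: {"upsell": 0.6, "retention_offer": 0.3})
realtime_adjust(profiles[0], {"upsell": -1.0, "retention_offer": +1.0})
```

The point of the sketch is the division of labour: the batch model owns the baseline scores, while the real-time path only applies bounded adjustments on top of them.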
This analytics layer feeds a decision engine incorporating business rules and optimisation capabilities (including flexibility on the KPI being optimised) to generate the next best action. This is then executed via the interface through which the customer is interacting (for inbound) or via the customer's preferred outbound channel. Adaptive learning is supported by feeding back whether the executed action generated the desired customer response, which is then incorporated into the updated single customer view.
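The decision-engine step admits a similarly compact sketch. Again, all names and numbers are hypothetical: business rules are modelled as predicates that filter candidate actions, the optimisation step simply maximises a weighted KPI score (the weights providing the flexibility on which KPI is optimised), and the feedback function moves a score towards the observed response.

```python
def next_best_action(scores, rules, kpi_weight):
    """scores: {action: {kpi: value}}; rules: predicates an action must pass;
    kpi_weight: {kpi: weight} - flexibility on the KPI being optimised."""
    eligible = {a: kpis for a, kpis in scores.items()
                if all(rule(a) for rule in rules)}
    if not eligible:
        return None
    # Pick the eligible action with the highest weighted KPI score
    return max(eligible,
               key=lambda a: sum(kpi_weight.get(k, 0.0) * v
                                 for k, v in eligible[a].items()))

def record_outcome(scores, action, kpi, success, step=0.1):
    """Adaptive learning: nudge the score towards the observed customer response."""
    target = 1.0 if success else 0.0
    scores[action][kpi] += step * (target - scores[action][kpi])

# Illustrative usage
scores = {
    "retention_offer": {"retention": 0.7, "revenue": 0.2},
    "upsell": {"retention": 0.1, "revenue": 0.8},
}
rules = [lambda a: a != "upsell"]  # e.g. suppress upsell after a complaint
action = next_best_action(scores, rules, {"retention": 1.0})
record_outcome(scores, action, "retention", success=True)
```

Changing the `kpi_weight` argument (say, towards revenue rather than retention) changes which action wins without touching the rules or the scores, which is the flexibility the text describes.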
Figure 2 - Digital Turbocharger
While IT is the key enabler, building a data turbocharger is a multi-faceted challenge requiring business rather than IT teams to take the lead to ensure there is focus on value rather than technology. It also needs a holistic approach to operating model development and senior level sponsorship to make it happen.
But the good news is that any organisation that achieves these three things will find itself in the minority, ensuring its insight advantage is highly sustainable.
Springman is Head of Customer Analytics at Sopra Steria.