Making decisions with data – Taking an agile, value-based approach to designing analytics

In my last piece on how businesses use data, I outlined a maturity model for adopting analytics across an organisation. However, while that approach works at the strategic level, BI teams also have opportunities to increase the value their projects deliver more quickly. This is based on making analytics more agile.

Agile (with a big A) refers to a philosophy of software development first articulated in 2001. The Agile Manifesto was published to help development teams move away from large projects that missed the company’s requirements, and instead focus on smaller, incremental improvements that add up over time.

While the principles of Agile have now entered everyday language, the world of Business Intelligence is one where large legacy projects have persisted.

However, this is starting to change. Just as Agile has gone from a specific term in software development to being an IT management buzzword, there has been a lot of effort to make BI respond faster to business requirements. Today, “agile” with a small A represents all the potential that IT and digital processes can provide in making things run faster. The growth of visual discovery tools has exposed data analysis capabilities to a larger audience, thus expanding the number of “information producers” inside an organisation, while Cloud BI platforms now represent a viable alternative to the legacy BI platforms that previously ruled the roost.

These cloud-architected platforms enable companies to be more agile by dramatically accelerating the rate of progress and shortening the time required to deliver value to the business. By eliminating many of the tasks associated with traditional on-premises deployments, the cloud gives organisations the ability to experiment with much less risk – “fail fast” is the popular term – and to respond to business changes more quickly.

Cloud BI solutions with a true web-scale, multi-tenant architecture are also pioneering the concept of “virtual BI spaces”, which introduces fascinating possibilities in terms of how organisations network different sources of data together. This gives decentralised line of business teams and individuals greater autonomy, without creating the analytical silos that lead to information chaos.

The challenge is for BI teams to change their management mindset around data. This means taking a more “Agile” approach to working with data and to supporting the line of business teams that are asking for more of it. Rather than acting as the gatekeepers for data and the sole creators of reports, IT can instead help those line of business teams access and build their own analytics, dashboards and reports.

Iterating on analytics and data

To make the process easier, there are three steps that can help:

  1. Distinguish global versus local data requirements: Some data and definitions will be common across the enterprise – financial metrics necessary for quarterly results or compensation, for example. These need to be defined and managed centrally for consistency. Other data and definitions will be specific to the local team. Responsibility for this data can therefore be held locally in the format that best fits the needs of the team.
  2. Empower teams to serve their own needs: End users want to make more use of data, but this can require some work beforehand to prepare it for analysis, which can take time for the central team. Helping individuals or local teams to combine and prepare data on their own, without the need for centralised data modelling, can speed up the process and avoid bottlenecks (a simple sketch of this follows this list). This self-service approach to data can also greatly increase agility by removing the data preparation burden from IT.
  3. Networking BI together: Bringing local and global data together can help teams see and understand business performance faster than relying on static reports. By networking data and BI together, local teams can execute their analysis while still getting global governance support. The use of “virtual spaces” gives decentralised teams and individuals the ability to work in sandboxes that are networked with each other, thus avoiding data silos. As a result, business users can collectively create a common semantic layer, built to ensure global governance.
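
To make the global/local split and the self-service idea more concrete, here is a minimal sketch in Python with pandas. It is not tied to Birst or any specific BI platform; the table names, columns and the revenue metric are illustrative assumptions. The point is simply that a centrally owned metric definition stays consistent while a local team blends in data it manages itself.

```python
import pandas as pd

# --- Step 1: global, centrally governed definitions ---
# Hypothetical finance extract maintained by the central BI team.
finance = pd.DataFrame({
    "region":  ["EMEA", "EMEA", "APAC", "APAC"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [1_200_000, 1_350_000, 900_000, 1_050_000],
})

def quarterly_revenue(df: pd.DataFrame) -> pd.DataFrame:
    """The single, centrally owned definition of revenue by quarter.
    Every team reuses this function rather than re-deriving the metric."""
    return df.groupby("quarter", as_index=False)["revenue"].sum()

# --- Step 2: local, self-service preparation ---
# Hypothetical campaign spend tracked only by the marketing team.
campaign_spend = pd.DataFrame({
    "quarter": ["Q1", "Q2"],
    "spend":   [150_000, 180_000],
})

# The local team blends its own data with the governed metric
# without asking the central team to remodel anything.
blended = quarterly_revenue(finance).merge(campaign_spend, on="quarter")
blended["revenue_per_spend"] = blended["revenue"] / blended["spend"]

print(blended)
```

Because the metric is defined once and reused, the local team can iterate on its own blend as often as it likes without putting the consistency of the global numbers at risk.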

After this three-step process is completed, it’s important to see how the new analytics approach is performing for the team using it. Are they seeing improvements in the quality of their decisions, now that they have more accurate data to draw on? Are they able to make decisions faster than before? The answers can then feed back into the framework to drive further improvement.

For example, marketing teams may get more insight around how many new business leads they create and how many continue through the sales process. However, the metric that matters is not the number of leads passed through sales; it’s the volume of business that these leads convert into, and how quickly this takes place. For marketing, looking at this “lead-to-cash” process can show how well lead generation investments are doing at producing revenue, and where there are opportunities to take time out of the marketing and sales process. By looking at the data in context, the department can improve its whole approach to managing budgets and deliver better return on investment.
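
As a rough illustration of what a “lead-to-cash” view might measure, the sketch below computes the conversion rate, the revenue those leads turn into, and the average cycle time from a small, made-up set of lead records. The column names and figures are assumptions for illustration, not data from any real system.

```python
import pandas as pd

# Hypothetical lead records; the column names and figures are made up.
leads = pd.DataFrame({
    "lead_id":    [1, 2, 3, 4, 5],
    "created":    pd.to_datetime(["2017-01-05", "2017-01-12", "2017-02-01",
                                  "2017-02-10", "2017-03-03"]),
    "closed_won": pd.to_datetime(["2017-03-20", None, "2017-04-15",
                                  None, "2017-05-01"]),
    "deal_value": [50_000, 0, 80_000, 0, 30_000],
})

# Leads that actually converted into business.
won = leads.dropna(subset=["closed_won"])

# The metrics that matter here are not raw lead counts, but the revenue
# the leads convert into and how long the lead-to-cash cycle takes.
conversion_rate    = len(won) / len(leads)
revenue_from_leads = int(won["deal_value"].sum())
avg_cycle_days     = (won["closed_won"] - won["created"]).dt.days.mean()

print(f"Conversion rate:       {conversion_rate:.0%}")
print(f"Revenue from leads:    {revenue_from_leads:,}")
print(f"Avg lead-to-cash days: {avg_cycle_days:.0f}")
```

Tracked over time, these same figures show whether changes to campaigns or to the sales handover are actually shortening the cycle and improving the return on marketing investment.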

By looking at data in this way, it’s possible to spot potential improvements that benefit the whole business. Taking a “value-based” approach to designing analytics tools can help each team not only improve its performance over time, but also see where its investments are paying off.

This is also about making those decisions far faster. Rather than waiting six months to see the impact of a decision, the results can show up much sooner in the analytics tools made available to the teams.

Pedro Arellano, Director of Product Strategy, Birst
