Solving the attribution challenge

There’s no denying the growth of digital, but not every part of the discipline has advanced at the same pace. Attribution has struggled to keep up: despite the noise in the industry around it, and the acknowledged inadequacy of the models currently in use, adoption of statistical techniques to identify the actual contribution of our digital campaigns has been limited.

With the customer journey ever increasing in complexity, continuing to take a simplistic view of attribution is no longer tenable. The solution lies in using science to measure accurately not just the influence of each channel, but also how channels are interconnected.

The most frequently used attribution models are still either click- or rules-based: the former include first click and last click; the latter include even distribution, time decay, and positional. None of these methods is without its flaws, but their main appeal is ease of implementation: they come pre-packaged within advertising platforms and don’t require additional investment.
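To make those rules concrete, here is a minimal sketch of each model applied to a single hypothetical converting journey. The channel names, the two-position half-life for time decay, and the 40 per cent endpoint share for the positional model are illustrative assumptions, not the defaults of any particular platform.

```python
def first_click(path):
    # 100 per cent of the credit goes to the first touchpoint.
    return {path[0]: 1.0}

def last_click(path):
    # 100 per cent of the credit goes to the final touchpoint.
    return {path[-1]: 1.0}

def even_distribution(path):
    # Credit split equally across every touchpoint in the journey.
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + 1.0 / len(path)
    return credit

def time_decay(path, half_life=2.0):
    # Touchpoints closer to the conversion earn exponentially more credit;
    # the half-life of two positions is an illustrative assumption.
    weights = [0.5 ** ((len(path) - 1 - i) / half_life) for i in range(len(path))]
    credit = {}
    for channel, weight in zip(path, weights):
        credit[channel] = credit.get(channel, 0.0) + weight / sum(weights)
    return credit

def positional(path, endpoint_share=0.4):
    # U-shaped: 40 per cent each to the first and last touchpoints,
    # with the remaining 20 per cent spread across the middle.
    if len(path) == 1:
        return {path[0]: 1.0}
    credit = {}
    def add(channel, weight):
        credit[channel] = credit.get(channel, 0.0) + weight
    middle = path[1:-1]
    if middle:
        add(path[0], endpoint_share)
        add(path[-1], endpoint_share)
        for channel in middle:
            add(channel, (1.0 - 2 * endpoint_share) / len(middle))
    else:
        # Only two touchpoints: split the credit evenly between them.
        add(path[0], 0.5)
        add(path[-1], 0.5)
    return credit

journey = ["Display", "Email", "Paid Search"]  # a hypothetical converting journey
for model in (first_click, last_click, even_distribution, time_decay, positional):
    print(f"{model.__name__:>17}: {model(journey)}")
```

Run against the same journey, the five models hand out very different credit, which is precisely why the choice of model, rather than the actual performance of the channels, can end up driving budget decisions.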

The inadequacy of current solutions

Looking back on the journey of digital marketing since its birth in the late Nineties, it soon becomes clear there’s a disconnect between the rapid development of real-time bidding (RTB), mobile, and other innovations, and the stagnation of the platforms and methodologies used to measure them.

Clicks aren’t full representations of the value of a channel, and a failure to recognise that becomes perpetuated in the form of inadequate measurement. Buying ads on a fraudulent site, for example, usually comes very cheap (impressions for 1-5p CPM), and such inventory often shows good click-through rates (CTRs above 1 per cent).

So, if you’re an advertiser with fraud in your inventory, you’ll see a good number of clicks for a small amount of money. Remove those fraudulent clicks and it can look as though traffic is being bought on more expensive sites with worse click-through rates, because the marketing key performance indicators (KPIs) will often go down.
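A hypothetical worked example makes the distortion visible. All figures below are illustrative assumptions, chosen only to sit within the ranges quoted above (1-5p CPM and CTRs above 1 per cent for fraudulent inventory).

```python
# Illustrative figures only: one million impressions each of fraudulent
# and legitimate inventory, priced and clicking within the ranges above.

def spend_and_clicks(impressions, cpm_pounds, ctr):
    """Return (spend in pounds, clicks) for a block of impressions."""
    spend = impressions / 1000 * cpm_pounds
    clicks = impressions * ctr
    return spend, clicks

fraud_spend, fraud_clicks = spend_and_clicks(1_000_000, 0.03, 0.015)  # 3p CPM, 1.5% CTR
legit_spend, legit_clicks = spend_and_clicks(1_000_000, 2.00, 0.002)  # £2 CPM, 0.2% CTR

blended_cpc = (fraud_spend + legit_spend) / (fraud_clicks + legit_clicks)
clean_cpc = legit_spend / legit_clicks

print(f"Blended cost per click (fraud included): £{blended_cpc:.2f}")  # ~£0.12
print(f"Clean cost per click (fraud removed):    £{clean_cpc:.2f}")    # £1.00
```

On these numbers the blended cost per click is roughly 12p; strip out the fraud and it jumps to £1.00, so the clean traffic looks about eight times worse on a click-centric KPI even though nothing about its real performance has changed.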

This distortion is why it is very common to see 75-80 per cent of conversions driven by view-through rather than click-through in display advertising. A click-centric measurement model will prevent businesses from capitalising on the full potential of their marketing investment by distorting the actual performance of safe, non-fraudulent traffic.

Accepting the challenge

There are several factors that contribute to the current lack of sophistication around the measurement of digital marketing attribution. The first is a shortage of the technical skills needed to employ big data technologies: processing and transforming extreme volumes of hyper-structured data (web log files) into a form that enables advanced statistical analysis.

It’s worth clarifying the term hyper-structured: weblogs are sometimes, and incorrectly, referred to as unstructured data. In reality, once you look into the details of how the data is generated by digital platforms, it becomes clear that the structure is there within the information itself, and it can be derived through string parsing. It’s not structure in the traditional business intelligence sense of columns and rows, but structure it is nonetheless.
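As a small illustration of deriving that structure by parsing, the sketch below pulls rows-and-columns fields out of a single weblog line. The line follows the common Apache ‘combined’ log format, and the field names are our own labels rather than any standard schema.

```python
import re

# Pattern for the Apache 'combined' log format: client IP, timestamp,
# request line, status, bytes sent, referrer and user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<protocol>[^"]+)" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

line = (
    '203.0.113.7 - - [12/Mar/2016:10:15:32 +0000] '
    '"GET /landing?utm_source=email&utm_campaign=spring HTTP/1.1" '
    '200 5123 "https://example.com/newsletter" "Mozilla/5.0"'
)

record = LOG_PATTERN.match(line).groupdict()
# The 'unstructured' line is now a structured record: columns and rows,
# derived purely through string parsing.
print(record["ip"], record["timestamp"], record["path"])
```

Even the campaign parameters, such as utm_source in the request path above, sit inside the raw string waiting to be parsed out in exactly the same way.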

As a result, the technical skills required to manage this complexity are not as widely available as the traditional analytical skills used for the widely adopted, but inapt, current measurement methodologies (e.g. last click and first click).

Power of the analytics experts

The second factor is the difficulty of striking the right mix of expertise between business acumen and the technical understanding of how digital channels and tracking platforms form part of the same ecosystem. The ability to marry the business dimension with the actual implementation of the marketing objectives into digital channels and their tracking platforms is a crucial requirement for enabling advanced algorithmic attribution.

The third, and a key, factor is the shortage of data scientists. The right people with the right skills enable you to establish the link between the business objectives and the statistical methodologies required to reach the desired outcome, but these people are still difficult to come by. This is a challenge that won’t go away anytime soon, and organisations whose core competence is around data and analytics will be better placed to attract and retain such elusive and valuable talent.

We’re working to address these issues by developing an approach to attribution combining big data processing, advanced statistical techniques and constraint-based optimisation. Our Algorithmic Attribution Modelling product provides our clients with a robust, scalable solution for measuring the influence of each channel in their digital marketing campaigns and empowers them to make optimal decisions on budget allocation to drive performance of future campaigns.
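The details of that product are beyond the scope of this piece, but as an illustration of what ‘algorithmic’ attribution can mean in practice, here is a heavily simplified sketch of one well-known data-driven technique, the removal effect: a channel’s credit is how much the overall conversion rate drops when that channel is taken out of every journey. The journeys, the channel names, and the simplifying assumption that removing a channel breaks any journey that touched it are all ours; this is not a description of the Experian product.

```python
# Hypothetical observed journeys: (ordered touchpoints, converted?).
paths = [
    (["Display", "Search"], True),
    (["Email", "Search"], True),
    (["Search"], True),
    (["Display"], False),
    (["Email"], False),
]

def conversion_rate(paths, removed=None):
    # Simplifying assumption: a journey that touched the removed channel
    # is treated as broken, i.e. counted as non-converting.
    conversions = sum(
        converted and (removed is None or removed not in touches)
        for touches, converted in paths
    )
    return conversions / len(paths)

baseline = conversion_rate(paths)
channels = {channel for touches, _ in paths for channel in touches}

# Each channel's removal effect: the drop in conversion rate without it.
removal_effects = {
    channel: baseline - conversion_rate(paths, removed=channel)
    for channel in channels
}

# Normalise the effects into fractional credit that sums to one.
total_effect = sum(removal_effects.values())
credit = {channel: effect / total_effect for channel, effect in removal_effects.items()}
print(credit)  # here: Search 0.6, Display 0.2, Email 0.2
```

Even on five toy journeys, the technique captures something the rules-based models cannot: Search earns the most credit not because of where it happens to sit in the path, but because the conversions genuinely depend on it.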

Organisations should always strive for robust measurement frameworks to make the most effective business decisions, and attribution is no different in this respect. Any model that allows brands to reduce uncertainty in decision-making should be pursued, and statistical modelling for algorithmic attribution certainly falls under this ‘uncertainty reduction’ banner.

Rafael Garcia-Navarro is Chief Analytics Officer UK&I at Experian. He is responsible for leading the analytics function, with expertise in data architecture, big data and predictive analytics.