The threat of fraud is growing. More than a million cases were reported in Britain in just the first half of 2016, costing a total of £399.5 million – a 25 per cent increase on the same period in 2015.
Fraud poses a massive threat to the financial services industry – not only through the cost that affected institutions incur, but also through the impact it can have on confidence, both of customers and in the markets.
The digitisation of financial services has changed how criminals commit fraud, with hordes of card details from data breaches available on the dark web. However, the rise of new technology is also enabling financial services institutions to detect and combat fraud more effectively. By utilising the vast amount of data that is generated every day, companies can more accurately spot fraud and reduce the number of false positives.
Faster analysis to prevent fraud
With the massive potential that analytics have for eliminating fraud, it is perhaps unsurprising that the financial services industry is ahead of the curve in big data analytics. Most large banks and financial institutions now routinely analyse a wealth of clients’ behavioural characteristics to more effectively determine which transactions are fraudulent.
From monitoring clients’ location, employment details, account balances, spending patterns, and even the speed at which a customer swipes their credit card, behavioural analytics can determine whether the card is being used by the owner.
However, speed of analysis is key to successful fraud detection. The most effective fraud prevention occurs when the analysis is fast enough to detect fraudulent transactions in time to stop them from going through – or at least to reduce their impact.
While identifying fraud a month after a transaction is completed may help the institution verify customer complaints, such delayed results cannot prevent the theft, and the institution is left with the same cost – both from a monetary and a brand perspective.
It is for this reason that financial services organisations have been leading the big data analytics revolution, with IDC estimating that banks would spend nearly US $17 billion on big data and business analytics solutions in 2016.
Analytics for fraud prevention has long been a cornerstone of a successful firm. But with the ever-growing amount of data that institutions hold, storing this data and guaranteeing the low latency required for real-time analysis is an increasingly complex challenge.
Preventing data delivery delays
However, behavioural analysis can only run as fast as the slowest element in an organisation’s data centre. Data must be delivered to the application quickly, enabling the analysis to be conducted within the short time frame that institutions have to prevent fraud.
Many financial services companies store hundreds of terabytes of market data – and some larger firms, even petabytes. While this wealth of data helps companies deliver more accurate analysis, analytics applications sometimes face performance issues because they cannot access the necessary data fast enough. This creates an app data gap.
To understand the impact that an app data gap can have on an application, think of the delays you face when software on your computer stutters, or when you struggle to open a document from your organisation’s server. In the same way that the speed at which you can work is impaired, so is a big data application that cannot access its data fast enough.
Looking specifically at fraud detection, slow delivery of data to the analytics application ultimately reduces the speed of the analysis. Since even short delays can result in fraudulent transactions going through, the cost to firms of not addressing data latency in their infrastructure could be significant.
It’s therefore essential that financial services institutions take steps to remove the barriers to data velocity, to ensure their analytics programmes can run at the necessary speeds to effectively detect and prevent fraud.
Identifying the cause
When looking for the root of application breakdowns, the finger is frequently first pointed at storage. Yet, Nimble Storage’s analysis of 7,500 companies demonstrated that in 54 per cent of cases the issue arises from interoperability, configuration, and/or not following best practice steps unrelated to storage.
One underlying cause of these issues is that most data centre components are built independently. So even ‘best of breed’ elements can create interoperability issues across the entire infrastructure.
And buying all data centre components from one vendor isn’t a quick fix against this threat, as many large companies’ solutions are themselves the product of acquisitions of smaller businesses and their products.
Closing the app data gap
Companies looking to remove the barriers to data velocity across their infrastructure should seek out solutions that incorporate predictive analytics and machine learning, addressing capacity and interoperability issues before an app data gap is created.
Adopting these solutions enables IT teams to analyse performance metrics gathered from a large number of high-performing environments to establish a baseline. This helps them identify poor performance earlier, reducing the impact on the application.
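As a rough illustration of this baselining approach, the sketch below computes a mean-and-standard-deviation baseline from latency samples gathered in healthy environments, then flags readings that fall outside it. The metric values and the three-sigma threshold are invented for the example:

```python
# Sketch: baseline performance metrics from healthy environments, then flag
# outliers early. Sample values and the 3-sigma threshold are illustrative.
from statistics import mean, stdev

def build_baseline(samples: list[float]) -> tuple[float, float]:
    """Baseline = mean and standard deviation of healthy-environment latencies."""
    return mean(samples), stdev(samples)

def is_degraded(latency_ms: float, baseline: tuple[float, float],
                k: float = 3.0) -> bool:
    """Flag a reading more than k standard deviations above the baseline mean."""
    mu, sigma = baseline
    return latency_ms > mu + k * sigma

# Latencies (ms) gathered from a fleet of well-performing environments
healthy = [1.1, 0.9, 1.0, 1.2, 0.8, 1.0, 1.1, 0.9]
baseline = build_baseline(healthy)

print(is_degraded(1.3, baseline))  # within normal variation
print(is_degraded(9.5, baseline))  # flagged before users feel the app data gap
```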
Using sensors that monitor the activity of different elements across the entire infrastructure when an event occurs helps the IT team to identify cause-and-effect relationships. By comparing these results against those of other environments, teams can spot conflicts between the releases of different components and so prevent the problems that interoperability issues would otherwise cause.
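One simple way to surface such cause-and-effect candidates is to count which components’ events co-occur within a short time window. The component names, timestamps, and window size below are hypothetical:

```python
# Sketch: correlate timestamped events from different infrastructure components
# to surface cause-and-effect candidates. All event data here is invented.
from collections import Counter

def correlated_pairs(events: list[tuple[float, str]],
                     window: float = 5.0) -> Counter:
    """Count pairs of components whose events occur within `window` seconds."""
    pairs = Counter()
    events = sorted(events)
    for i, (t1, c1) in enumerate(events):
        for t2, c2 in events[i + 1:]:
            if t2 - t1 > window:
                break  # events are sorted, so later ones are farther apart
            if c1 != c2:
                pairs[tuple(sorted((c1, c2)))] += 1
    return pairs

events = [(0.0, "nic-firmware"), (1.2, "hypervisor"), (3.0, "storage-array"),
          (40.0, "hypervisor"), (41.5, "nic-firmware"), (90.0, "storage-array")]
print(correlated_pairs(events).most_common(1))
# the (hypervisor, nic-firmware) pair co-occurs most often
```

A pair that repeatedly co-occurs across many customers’ environments is a strong hint that those two component releases interact badly.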
Machine learning applied across software releases can also enable IT teams to use correlations across the stack to optimise performance and availability.
Reducing the cost of fraud
With the cost of fraud rising, it is essential that financial services institutions increase data velocity across their infrastructure to ensure that the powerful analytics applications that can identify fraudulent activity are running effectively.
By looking across the entire infrastructure stack, IT teams can reduce the complex and diverse operations that can slow down the delivery of data to applications and, in turn, analytics programmes. And when time really does cost money, no financial services institution can afford not to.
Jason Monger, Senior Systems Engineer, Financial Services, Nimble Storage