Deep learning – The next big thing in data analytics… and you probably haven’t heard of it!

(Image credit: Enzozo / Shutterstock)

By the end of this year, it’s predicted that Deep Learning will be a core component in the toolkit of 80 per cent of data scientists. That’s a big jump, considering machine learning and AI are still regarded as cutting-edge technologies, and Deep Learning goes a step beyond both.

Deep Learning is used by some of the world’s largest brands: Google cites it as the cornerstone of its voice and image recognition algorithms, Netflix uses it to work out what you want to watch next, Amazon to predict what you’ll buy next, and researchers at MIT even use it to predict the future. A growing number of tech vendors entering the space are keen to discuss how this is the next generation of data science, but what exactly is it? And is it more than just another name for AI?

What even is deep learning?

Deep Learning, Machine Learning and AI are inextricably tied together. Machine Learning is cutting-edge tech – it’s the field of AI which today shows the most promise at providing tools that industry and society can use to drive change.

However, if Machine Learning is cutting edge, then Deep Learning is science fiction. Where Machine Learning uses neural networks to mimic the human decision-making process, Deep Learning goes even further and attempts to create ‘thoughts’ and ‘intelligence’ – whether human or digital.

Essentially, Deep Learning involves feeding a computer system a lot of data, which it can use to make decisions about other data. This data is fed through neural networks, as in machine learning. These networks – logical constructions which ask a series of binary true/false questions of, or extract a numerical value from, every bit of data which passes through them – classify it according to the answers received.
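To make that concrete, here is a minimal sketch of the idea in Python. Each ‘question’ in the network is a neuron: it weighs up its inputs and returns a number between 0 and 1 rather than a hard true/false, and data flows through successive layers of these neurons. This is an illustrative toy, not any vendor’s implementation; the weights here are made up.

```python
import math

def neuron(inputs, weights, bias):
    """One 'question' in the network: a weighted sum of the inputs,
    squashed to a 0-1 confidence score by a sigmoid."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 / (1 + math.exp(-z))

def classify(inputs, layers):
    """Feed data through successive layers of neurons; each layer is a
    list of (weights, bias) pairs. Returns the final layer's outputs."""
    activations = inputs
    for layer in layers:
        activations = [neuron(activations, w, b) for w, b in layer]
    return activations

# A single neuron answering one 'question' about a two-feature input:
score = classify([1, 0], [[([2.0, -1.0], 0.0)]])
```

A real deep network differs only in scale: many layers, thousands of neurons per layer, and weights learned from data rather than written by hand.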

These networks are called ‘Deep Neural Networks’ – the type of structure that can analyse and classify datasets on the scale of Google’s image library or Facebook’s advertising information. With datasets of that size, and logical networks sophisticated enough to handle their classification, it becomes not only feasible but easy for a computer to analyse a picture and determine, with a high degree of accuracy, what it represents to a human.

Deep learning in action: Insurance quotes

It’s easiest to understand through a real-world example that uses a vast amount of data to produce a single, intelligent result. Take insurance websites, which seem relatively simple – you enter your personal details along with your car’s number plate and in seconds the system pops out a personalised quote for a policy.

To create that single-figure response, the system must draw on billions of data points and previous insurance cases to draw a conclusion about how ‘risky’ you are as a driver. First, the system would be given access to a huge database of car types, including their shape, size, maximum speed and even colour, to determine the likelihood of an accident. This could be manually compiled or, in more advanced use cases, gathered automatically by a system programmed to search the internet and ingest the data it finds there.

Next, it would take the data that needs to be processed – real-world data which contains the insights, in this case insurance records and crash data. By comparing this new data with the data it has “learned”, the system can classify, with a certain probability of accuracy, the likelihood that you will have an accident within the next year.
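One simple way to picture that comparison step is a similarity lookup: score a new applicant against past records and read the risk off the most similar cases. The sketch below does exactly that; the feature names (`age`, `max_speed`) and the records are hypothetical, and a real insurer’s model would be far more elaborate.

```python
def accident_probability(driver, history, k=2):
    """Estimate risk as the accident rate among the k past records most
    similar to this driver. 'driver' is a dict of numeric features;
    each history record pairs features with a known outcome."""
    def distance(a, b):
        # Euclidean distance over the driver's features
        return sum((a[f] - b[f]) ** 2 for f in a) ** 0.5

    nearest = sorted(history, key=lambda rec: distance(driver, rec["features"]))[:k]
    return sum(rec["had_accident"] for rec in nearest) / len(nearest)

# Hypothetical past cases: young drivers of fast cars crashed, older ones didn't.
history = [
    {"features": {"age": 20, "max_speed": 150}, "had_accident": 1},
    {"features": {"age": 22, "max_speed": 140}, "had_accident": 1},
    {"features": {"age": 50, "max_speed": 110}, "had_accident": 0},
    {"features": {"age": 55, "max_speed": 100}, "had_accident": 0},
]
risk = accident_probability({"age": 21, "max_speed": 145}, history)
```

The output is the “certain probability of accuracy” the article describes: not a yes/no answer, but a likelihood derived from the data the system has already seen.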

The above process is reasonably straightforward, but with Deep Learning the system can learn from experience over time and improve the odds of an accurate quote. This is commonly referred to as the system ‘training’ itself – in essence, learning from its mistakes the way humans do.

At first, it may incorrectly assume a young driver has a high likelihood of crashing based on age alone. If those crashes don’t materialise, however, the system can learn which differentiators matter most and adjust the quote accordingly, producing the optimum outcome for both consumer and insurance underwriter.
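The ‘learning from mistakes’ loop can be sketched in a few lines. Real deep networks do this with backpropagation across many layers; the toy below uses a single-layer logistic update to show the same principle – every time a prediction misses its label, the weights are nudged to reduce that error. The feature (a normalised age) and the data are invented for illustration.

```python
import math

def predict(x, weights, bias):
    """Predicted accident probability for one feature vector."""
    z = sum(w * v for w, v in zip(weights, x)) + bias
    return 1 / (1 + math.exp(-z))

def train(records, labels, epochs=500, lr=0.5):
    """Nudge the weights a little every time a prediction misses its
    label - the system 'learning from its mistakes'."""
    weights = [0.0] * len(records[0])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(records, labels):
            error = predict(x, weights, bias) - y  # the 'mistake'
            weights = [w - lr * error * v for w, v in zip(weights, x)]
            bias -= lr * error
    return weights, bias

# Hypothetical data: feature is age scaled to 0-1; young drivers (low
# values) crashed, older drivers didn't.
records = [[0.2], [0.3], [0.8], [0.9]]
labels = [1, 1, 0, 0]
weights, bias = train(records, labels)
```

After training, the model quotes high risk for a 0.25-aged driver and low risk for a 0.85-aged one – and if fresh data contradicted that pattern, the same loop would gradually unlearn it, which is exactly the self-correction described above.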

Adding value to the business

With tools like Deep Learning and AI, advanced data analytics users can add more to the organisation than ever before. For the enterprise, successful insights from data are crucial for providing business value beyond the norm.

However, deploying predictive models that use this technology continues to be a major pain point for many in the enterprise. It’s important to have the right data analytics platform – one that spares users the labour-intensive parts of deploying predictive models and lets them do so faster and more reliably.

As business leaders scramble to get the most out of Deep Learning and AI, it’s important that lessons are learnt from past buzzword failures. Intelligent technology will be powered by Deep Learning, but it will be one component of data analytics. Advanced data analytics platforms can give the enterprise the best of both worlds.

By getting caught up in the hype and obsessing over how you can get business value out of the latest trends and buzzwords, you risk not seeing the wood for the trees. It’s important to take a step back and assess how Deep Learning can deliver realistic results – or whether there’s a more human-centric way to use the smartest tools in your business tool chest: your employees. That begins with arming them with the right tools.

Nick Jewell, Director, Alteryx EMEA

Nick is Lead Technology Evangelist at Alteryx. He works within the product management team to present its end-to-end platform vision, acting as an evangelist with analysts, data scientists and the public.