Q&A: Artificial intelligence, advancements and applications

Thomas Bradley, NVIDIA Developer Technology Manager EMEA, answers questions regarding AI.

1. Until recently, most people thought AI was science fiction. What’s changed?   

The concept of artificial intelligence has been around for decades; Alan Turing first speculated that machines could one day think like humans back in the 1950s. But it’s the combination of research breakthroughs, the wider availability of big data, and advances in graphics processing unit (GPU) technology that has ignited the AI explosion taking place today.  

When Google DeepMind’s AlphaGo system beat South Korean champion Lee Se-dol at the ancient Chinese game Go in March 2016, it marked a turning point in AI’s place in the public consciousness. Given that there are more possible Go positions than there are atoms in the universe, researchers had predicted it would be years before AI could become sophisticated enough to beat a human.

AlphaGo used a form of AI called “deep learning” to master Go. In effect, it taught itself to play, training an artificial neural network by simulating millions of games. Over the last five years, work by several pioneering scientists on accelerating deep learning using GPUs has brought this approach out of the textbooks and into the datacentre.

2. When talking about AI, a lot of terms get used interchangeably. Is there a difference between artificial intelligence, machine learning and deep learning?   

These terms aren’t the same but they’re related. In essence they’re a bit like Russian dolls – AI is the overarching idea, and machine learning and deep learning fit inside it.    

At the heart of AI is the concept of creating a computer that can think like a human. We’re a long way from that vision, but we can already build systems that outperform humans at specific tasks. Machine learning is an approach to AI in which computers are trained to find the solution to a problem by feeding them large amounts of data. Deep learning has enabled many practical applications of machine learning and, by extension, the overall field of AI.

Deep learning solves problems by building layered networks of nodes, or neurons, known as deep neural networks. These networks are loosely inspired by the structure of the human brain. In conventional computing, a programmer writes lines of code that tell the computer how to solve a problem; deep learning is different because it enables computers to learn to solve problems by trial and error, in much the same way human children learn.
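
To make that concrete, here is a minimal sketch of a small deep neural network, written in PyTorch purely for illustration (the article doesn’t name a framework, and the layer sizes and data below are made up). It shows the stacked layers of neurons and a single trial-and-error training step.

```python
# Minimal illustrative sketch of a deep neural network (PyTorch assumed,
# not specified in the article). Layers of neurons are stacked, and the
# network learns by trial and error: it makes a prediction, measures the
# error, and nudges its weights to reduce it.
import torch
import torch.nn as nn

# A small "deep" network: an input layer, two hidden layers, an output layer.
model = nn.Sequential(
    nn.Linear(784, 128),  # input layer -> first hidden layer of 128 neurons
    nn.ReLU(),
    nn.Linear(128, 64),   # second hidden layer of 64 neurons
    nn.ReLU(),
    nn.Linear(64, 10),    # output layer: 10 classes (e.g. digits 0-9)
)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def train_step(inputs, labels):
    """One trial-and-error step: predict, measure the error, adjust weights."""
    optimizer.zero_grad()
    predictions = model(inputs)          # forward pass: the network's "guess"
    loss = loss_fn(predictions, labels)  # how wrong was the guess?
    loss.backward()                      # work out which weights to adjust
    optimizer.step()                     # nudge the weights to reduce the error
    return loss.item()

# Hypothetical usage with random data, just to show the call:
x = torch.randn(32, 784)             # a batch of 32 flattened 28x28 images
y = torch.randint(0, 10, (32,))      # their (random) labels
print(train_step(x, y))
```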

3. NVIDIA is best known for computer graphics. How come you’ve entered the AI space?   

The GPU was invented by NVIDIA in 1999 for 3D graphics. In computing terms, gaming is a big data problem, requiring massive amounts of data to be processed in parallel to generate realistic scenes.

Deep learning is a fundamentally new software model in which billions of software neurons and trillions of connections are trained in parallel and then deployed to solve real-world problems. Thanks to their gaming heritage, GPUs are naturally great at parallel workloads and speed up deep neural networks by 10-20x, accelerating both the training and deployment (known as ‘inferencing’) phases of deep learning.
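
As a rough illustration of what that offloading looks like in code, the sketch below (again assuming PyTorch, with made-up sizes) places a network and its data on a GPU when one is available, then runs both a training-style forward pass and a gradient-free inference pass. It does not measure the 10-20x speed-up quoted above.

```python
# Sketch of how a deep learning framework offloads work to a GPU
# (PyTorch assumed for illustration). The same code path covers both
# training and inference; only the device placement changes.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, 10),
).to(device)                                # copy the network's weights onto the GPU

batch = torch.randn(1024, 784).to(device)   # move the input data to the GPU too

# Training-style forward pass: gradients are tracked so weights can be updated.
out = model(batch)

# Inference ("inferencing"): gradients disabled, which is faster and lighter.
with torch.no_grad():
    predictions = model(batch).argmax(dim=1)

print(predictions.shape, predictions.device)
```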

4. What are the practical applications of AI? How long before we see it being widely used as a business tool?

Market research firm Tractica has sized the market for AI systems for enterprise applications at $11.1 billion by 2024. Thanks to GPU-accelerated deep learning, AI is already finding its way into deployment across many industries.

We’ve seen such a huge appetite for this technology that, earlier this year, we launched the world’s first deep learning supercomputer in a box. It’s designed as a ‘plug and play’ AI solution to help companies and institutions get started with AI quickly and easily – OpenAI, the non-profit artificial intelligence research team co-chaired by Elon Musk, took delivery of the first DGX-1 this August.

From startups to enterprise, AI is having a transformative effect. AI won’t be an industry – it will be part of every industry.

One example we’ve all experienced first-hand is the way AI is being used to customise how consumers interact with vendors and how they procure and receive products and services. When companies like Amazon and Netflix suggest products, they’re using deep learning to analyse not only our own purchasing and browsing history but also that of thousands of other consumers, delivering uncannily insightful results.
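
The sketch below is a toy version of the general idea behind such recommenders: an embedding-based collaborative filter, written in PyTorch for illustration. It is an assumption about the broad technique, not a description of Amazon’s or Netflix’s actual systems, and every name and number in it is hypothetical.

```python
# Toy embedding-based recommender (illustrative assumption only; real
# production recommenders are far more elaborate). Each user and each
# product is learned as a small vector from the histories of many
# consumers at once; the dot product scores how well they match.
import torch
import torch.nn as nn

NUM_USERS, NUM_ITEMS, DIM = 1000, 500, 32   # hypothetical sizes

class Recommender(nn.Module):
    def __init__(self):
        super().__init__()
        self.users = nn.Embedding(NUM_USERS, DIM)   # one vector per consumer
        self.items = nn.Embedding(NUM_ITEMS, DIM)   # one vector per product

    def forward(self, user_ids, item_ids):
        # Score = how well a user's learned tastes line up with a product.
        return (self.users(user_ids) * self.items(item_ids)).sum(dim=1)

model = Recommender()

# Hypothetical query: score three candidate products for user 42.
user = torch.tensor([42, 42, 42])
candidates = torch.tensor([7, 99, 301])
print(model(user, candidates))   # highest score = strongest recommendation
```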

In warehouses and manufacturing plants, AI will also be revolutionary. Industrial robots that can learn new processes, rather than requiring costly modification or replacement, will bring huge gains in effectiveness and flexibility to production lines. There’s some exciting work being done in this ‘future factories’ field by companies like French start-up Akeoplus. And in warehouses, we’ve already seen online retailing giant Zalando achieve impressive improvements in its systems by implementing deep learning to calculate the most efficient picking routes.
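
To show what calculating an efficient picking route actually involves, here is a deliberately simple greedy nearest-neighbour heuristic in plain Python. It is not Zalando’s deep-learning approach (which the article doesn’t describe); it just makes the underlying optimisation problem tangible: visit every shelf location while walking as little as possible.

```python
# Toy illustration of the picking-route problem only. This is a greedy
# nearest-neighbour heuristic, NOT the deep-learning system mentioned
# in the article; the warehouse coordinates below are made up.
from math import dist

def greedy_picking_route(start, shelf_locations):
    """Visit every shelf location, always walking to the nearest unvisited one."""
    route, current = [start], start
    remaining = list(shelf_locations)
    while remaining:
        nearest = min(remaining, key=lambda loc: dist(current, loc))
        remaining.remove(nearest)
        route.append(nearest)
        current = nearest
    return route

# Hypothetical order: (aisle, bay) positions in metres, starting at the depot.
order = [(12, 3), (2, 8), (7, 1), (12, 9)]
print(greedy_picking_route((0, 0), order))
```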

If you’re interested in finding out more about applications of AI, check out the GPU Technology Conference Europe, taking place in Amsterdam on September 28th and 29th. As well as hearing from some of Europe’s leading AI experts, the conference includes a program of hands-on labs where attendees can learn how to get started with deep learning on GPUs.  


ABOUT THE AUTHOR

Thomas Bradley, NVIDIA Developer Technology Manager EMEA.