Intel and Baidu team up on AI training

(Image credit: Enzozo / Shutterstock)

Intel and Baidu are teaming up to work on a new chip, designed for training deep learning models at ‘lightning speed’.

The two companies confirmed the plan at the Baidu Create AI developer conference, recently held in Beijing. Intel Corporate VP Naveen Rao said his company is teaming up with Baidu on the new Intel Nervana Neural Network Processor for Training, or NNP-T for short.

The joint effort spans both hardware and software design, with the goal of training deep learning models at high speed.

“The next few years will see an explosion in the complexity of AI models and the need for massive deep learning compute at scale. Intel and Baidu are focusing their decade-long collaboration on building radical new hardware, co-designed with enabling software, that will evolve with this new reality – something we call ‘AI 2.0,’” said Naveen Rao.

Intel and Baidu have been partners for years. Since 2016, Intel has been optimising Baidu’s PaddlePaddle deep learning framework for its Xeon Scalable processors, and the two are now optimising the NNP-T for PaddlePaddle.
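For context, the sketch below shows roughly what training a model with PaddlePaddle’s Python API looks like. It is a minimal, hypothetical example (Paddle 2.x-style calls, toy data), not Baidu’s actual workloads or the Xeon- and NNP-T-specific optimisations the companies describe.

```python
import paddle

paddle.set_device("cpu")  # e.g. a Xeon Scalable host; accelerator backends differ

# Toy regression model and data, purely illustrative
model = paddle.nn.Linear(in_features=10, out_features=1)
loss_fn = paddle.nn.MSELoss()
opt = paddle.optimizer.Adam(learning_rate=1e-3, parameters=model.parameters())

x = paddle.randn([32, 10])
y = paddle.randn([32, 1])

for step in range(100):
    pred = model(x)          # forward pass
    loss = loss_fn(pred, y)  # compute loss
    loss.backward()          # backward pass
    opt.step()               # update parameters
    opt.clear_grad()         # reset gradients for the next step
```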

The two companies are also working on MesaTEE, a memory-safe function-as-a-service (FaaS) computing framework built on Intel’s Software Guard Extensions (SGX) technology.

As VentureBeat notes, Intel sees a large part of its future in AI: “The Santa Clara company’s AI chip segments notched $1 billion in revenue last year, and Intel expects the market opportunity to grow 30% annually from $2.5 billion in 2017 to $10 billion by 2022.”
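As a quick sanity check on those quoted figures (using nothing beyond the numbers cited above), growing from $2.5 billion in 2017 to $10 billion in 2022 implies a compound annual growth rate of roughly 32%, in line with the “30% annually” claim:

```python
# Implied compound annual growth rate from the quoted market figures
start, end, years = 2.5, 10.0, 5          # $billions, over a five-year span
cagr = (end / start) ** (1 / years) - 1   # (10 / 2.5)^(1/5) - 1
print(f"Implied CAGR: {cagr:.1%}")        # ~31.9%
```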