Qualcomm is launching its latest hardware aimed at servers and edge computing.
During its recent AI Day conference in San Francisco, the company unveiled the Qualcomm Cloud AI 100 platform, its latest attempt to get a foot in the door of cloud computing.
This is by no means an old chip in a fancy new dress. It's a completely new design, built from the ground up: a 7nm device dedicated to AI inference, meaning it won't be training neural networks, but doing the grunt work of running numbers through networks that have already been trained.
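The training/inference split can be sketched in a few lines of NumPy: inference is nothing more than a forward pass through frozen weights, with no gradients and no weight updates. The weights and inputs below are made-up values purely for illustration, not anything tied to Qualcomm's hardware:

```python
import numpy as np

# Frozen weights, as if produced by an already-trained network (made-up values).
W = np.array([[0.2, -0.5],
              [0.8,  0.1]])
b = np.array([0.3, -0.2])

def infer(x):
    """Inference: push inputs through fixed weights.
    No backpropagation, no updates -- just the 'grunt work' of matrix math
    that an inference accelerator is built to do at scale."""
    return np.maximum(0.0, W @ x + b)  # one linear layer followed by ReLU

print(infer(np.array([1.0, 2.0])))  # a single forward pass
```

Training, by contrast, would wrap this forward pass in a loop that also computes gradients and rewrites `W` and `b`, which is exactly the workload this chip is not designed for.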
According to Android Authority, Qualcomm is directly aiming at the Nvidia Tesla T4 series, as well as the Google Edge TPU inference chips.
Qualcomm claims that the new offering can achieve anywhere between three and 50 times the performance of the Snapdragon 855 or the Snapdragon 820; media outlets speculate this translates to roughly 350 TOPS.
The company also claims a tenfold improvement over existing AI inference solutions.
The Qualcomm Cloud AI 100 supports the Caffe, Keras, MXNet, TensorFlow, PaddlePaddle, and Cognitive Toolkit frameworks, along with the Glow, ONNX, and XLA runtimes.
At the same time, the company announced three new mobile chips: the Snapdragon 730, 730G, and 665. They should be faster, support higher display resolutions (particularly for gaming), and handle on-device AI better than their predecessors.
Image Credit: Jejim / Shutterstock