NVIDIA speeds up deep learning through new AI cloud container registry

(Image credit: Sergey Nivens / Shutterstock)

Nvidia has marked a new step forward in AI development with the release of its NVIDIA GPU Cloud (NGC) container registry.

NGC helps developers take their first steps into deep learning by offering free access to a “comprehensive, easy-to-use, fully optimised deep learning software stack”.

At the moment, however, the cloud-based service is available only “to users of the just-announced Amazon Elastic Compute Cloud (Amazon EC2) P3 instances featuring NVIDIA Tesla V100 GPUs”.

The company said it plans to expand support to other platforms “soon”, though no dates were mentioned.

“The NVIDIA GPU Cloud democratizes AI for a rapidly expanding global base of users,” said Jim McHugh, vice president and general manager of Enterprise Systems at NVIDIA. “NGC frees developers from the complexity of integration, allowing them to move quickly to create sophisticated neural networks that deliver the transformative powers of AI.”

Getting started is a three-step process: sign up for an NGC account, run an optimised NVIDIA image on a cloud service provider's platform, and pull containers from the NGC registry.
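In practice, the final step uses standard Docker tooling against NVIDIA's registry at nvcr.io. The sketch below assumes Docker and the nvidia-docker runtime are installed on the cloud instance; the image tag shown is illustrative, not taken from the article, so check the NGC catalogue for current tags. The commands that contact the registry are commented out because they require an NGC API key.

```shell
# Hypothetical sketch of pulling an NGC framework container.
REGISTRY=nvcr.io                               # NVIDIA's container registry
IMAGE=$REGISTRY/nvidia/tensorflow:17.10        # illustrative tag (year.month scheme)

# Authenticate with your NGC API key; the username is the literal
# string '$oauthtoken' (quoted so the shell does not expand it):
# docker login -u '$oauthtoken' -p <your-NGC-API-key> $REGISTRY

# Pull the optimised framework image and run it with GPU access:
# docker pull $IMAGE
# nvidia-docker run -it --rm $IMAGE
```

From there, the running container exposes the tuned framework directly, so training code can be launched inside it without any manual library integration.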

Nvidia says the key benefits of using NGC include instant access to the most widely used GPU-accelerated frameworks, maximum performance, and pre-integration.

The company also claims the containers available on the NGC container registry will be continuously developed, “ensuring each deep learning framework is tuned for the fastest training possible on the latest NVIDIA GPUs.”

There’s also a short video explanation available on YouTube.