Microsoft has announced a new deep-learning platform that will allow the company to further its cloud-based AI services.
Project Brainwave will allow Microsoft Azure users to run complex deep-learning models at very high performance.
The service runs on an FPGA-powered system designed for ultra-low-latency deep learning in the cloud. Announcing the project in a blog post, Microsoft said that with Intel’s Stratix 10 FPGAs, Brainwave sustained 39.5 teraflops on a large gated recurrent unit (GRU) model, serving each request in under a millisecond.
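For readers unfamiliar with the model class behind that benchmark, the following is a minimal sketch of a single GRU time step in plain NumPy. The sizes, weights, and function names are purely hypothetical for illustration; Brainwave serves far larger models and, of course, runs them on FPGAs rather than on a CPU.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU time step: combine input x and hidden state h into a new state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # interpolated new state

# Tiny hypothetical dimensions, just for demonstration.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 3
params = (
    rng.standard_normal((n_hid, n_in)), rng.standard_normal((n_hid, n_hid)), np.zeros(n_hid),
    rng.standard_normal((n_hid, n_in)), rng.standard_normal((n_hid, n_hid)), np.zeros(n_hid),
    rng.standard_normal((n_hid, n_in)), rng.standard_normal((n_hid, n_hid)), np.zeros(n_hid),
)
x = rng.standard_normal(n_in)
h = np.zeros(n_hid)
h_next = gru_step(x, h, params)
print(h_next.shape)  # prints (3,)
```

A production GRU applies this step across thousands of time steps and millions of parameters, which is why a system that can sustain tens of teraflops at sub-millisecond request latency matters for serving such models in real time.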
Microsoft said that Brainwave, running on hardware microservices, pushes the boundaries of the types of AI-powered services that can be deployed in the cloud, such as computer vision and language processing.
According to the blog post, FPGAs will be available to outside developers via Azure next year.
“In the near future, we’ll detail when our Azure customers will be able to run their most complex deep learning models at record-setting performance. With the Project Brainwave system incorporated at scale and available to our customers, Microsoft Azure will have industry-leading capabilities for real-time AI,” Microsoft said.
However, the Redmond giant is not the only company looking to FPGAs for its cloud data centres: Amazon and Google are pursuing similar efforts.
“Project Brainwave achieves a major leap forward in both performance and flexibility for cloud-based serving of deep learning models,” Microsoft added.