Microsoft, Intel Unveil FPGA-driven ‘Real-Time AI’ in Azure
We know about the seemingly light-speed processing power of FPGAs and their natural fit for data-dense AI workloads. But we also know that FPGAs pose usability and programmability problems that flummox IT shops. It has been assumed that it would take the resource riches of a FANG-class company (Facebook, Amazon, Netflix, Google) or Microsoft to wrestle FPGAs into a practical technology for AI applications.
Microsoft, using Intel’s new 14 nm Stratix 10 and other FPGA technologies, is working to do just that. The company has launched Project Brainwave, a “real-time AI” capability to be made available on the Azure public cloud infrastructure. It is designed to process live data streams, such as video, sensor feeds, or search queries, and rapidly return results to users.
Microsoft demonstrated its FPGA-based deep learning platform at Hot Chips 2017, a semiconductor symposium. Microsoft reported that Stratix 10 showed sustained performance of 39.5 teraflops (using Microsoft's custom 8-bit format), running each request in under one millisecond. “At that level of performance, the Brainwave architecture sustains execution of over 130,000 compute operations per cycle, driven by one macro-instruction being issued each 10 cycles,” the company said.
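Those figures hang together arithmetically. The quote gives only operations per cycle and total teraflops, so the clock rate below is an assumption for illustration, not a number from Microsoft:

```python
# Back-of-the-envelope check of the Hot Chips figures. The clock rate is an
# assumed value; the quote only states ops/cycle and sustained teraflops.
ops_per_cycle = 130_000      # "over 130,000 compute operations per cycle"
clock_hz = 300e6             # assumed ~300 MHz FPGA clock

teraflops = ops_per_cycle * clock_hz / 1e12
print(f"{teraflops:.1f} TFLOPS")  # 39.0 TFLOPS, in line with the reported 39.5
```

At roughly 300 MHz, "over 130,000" operations per cycle lands right at the reported sustained throughput, which is why the sub-millisecond per-request latency is plausible without batching.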
Microsoft said it was the first major cloud service provider to deploy FPGAs in its public cloud infrastructure. According to Intel, “the technology advancements it is demonstrating today with Intel Stratix 10 FPGAs enable the acceleration of deep neural networks (DNNs) that replicate 'thinking' in a manner that is conceptually similar to that of the human brain.”
Intel and Microsoft drew a contrast between Stratix 10’s capabilities and those of “many silicon AI accelerators today (that) require grouping multiple requests together [called ‘batching’] to achieve high performance.” They said Stratix 10 demonstrated more than 39 teraflops of sustained performance on a single request, “a new level of cloud performance for real-time AI computation, with record low latency, record performance and batch-free execution of AI requests.”
“We exploit the flexibility of Intel FPGAs to incorporate new innovations rapidly, while offering performance comparable to, or greater than, many ASIC-based deep learning processing units,” said Doug Burger, distinguished engineer at Microsoft Research NExT.
Microsoft said Project Brainwave is built with three main layers:
First, it leverages the massive FPGA infrastructure that Microsoft has been deploying over the past few years. “By attaching high-performance FPGAs directly to our datacenter network, we can serve DNNs as hardware microservices, where a DNN can be mapped to a pool of remote FPGAs and called by a server with no software in the loop,” said Burger. This architecture reduces latency, since the CPU does not need to process incoming requests, and allows high throughput.
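A toy model can show where that latency saving comes from. The numbers below are invented placeholders, purely to contrast the two call paths Burger describes:

```python
# Toy model of the two request paths. All latencies are invented placeholders,
# used only to illustrate what "no software in the loop" removes.
NETWORK_HOP_US = 10     # hypothetical one-way network hop
CPU_HANDLING_US = 50    # hypothetical software request handling on a host CPU
FPGA_COMPUTE_US = 100   # hypothetical DNN evaluation on the FPGA

def latency_with_cpu_in_loop():
    # request -> host CPU software -> FPGA -> host CPU software -> reply
    return 2 * NETWORK_HOP_US + 2 * CPU_HANDLING_US + FPGA_COMPUTE_US

def latency_hardware_microservice():
    # request -> FPGA attached directly to the datacenter network -> reply
    return 2 * NETWORK_HOP_US + FPGA_COMPUTE_US

print(latency_with_cpu_in_loop())       # 220 microseconds
print(latency_hardware_microservice())  # 120 microseconds
```

Whatever the real numbers, the structural point holds: dropping the host CPU from the data path removes two software-handling stages from every request.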
Second, Project Brainwave uses a “soft” DNN processing unit (or DPU), synthesized onto FPGAs, “providing a design that scales across a range of data types, with the desired data type being a synthesis-time decision. The design combines both the ASIC digital signal processing blocks on the FPGAs and the synthesizable logic to provide a greater and more optimized number of functional units.”
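Microsoft has not published the details of its custom narrow format, so the sketch below is only an illustration of the idea of a synthesis-time data-type choice: a hypothetical 8-bit float (1 sign, 5 exponent, 2 mantissa bits) whose width parameters are fixed up front, trading precision for more functional units in the same silicon:

```python
import math

def quantize_fp8(x, exp_bits=5, man_bits=2):
    """Round x to a hypothetical 8-bit float (1 sign, 5 exp, 2 mantissa bits).
    Illustrative only -- not Microsoft's actual 8-bit format."""
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    m, e = math.frexp(abs(x))          # abs(x) == m * 2**e, with 0.5 <= m < 1
    # Keep man_bits fractional bits of the mantissa.
    scale = 2 ** (man_bits + 1)
    m_q = round(m * scale) / scale
    # Clamp the exponent to the representable range.
    bias = 2 ** (exp_bits - 1)
    e = max(min(e, bias), -bias + 1)
    return sign * m_q * 2 ** e

print(quantize_fp8(3.14159))  # 3.0 -- coarse, but cheap in hardware
```

Because the data type is fixed when the design is synthesized, every multiplier in the fabric can be sized to exactly that width, which is how a "soft" DPU squeezes more parallel operations out of the same FPGA than a general-purpose layout would.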
Burger said this approach “can incorporate research innovations into the hardware platform quickly (typically a few weeks), which is essential in this fast-moving space. As a result, we achieve performance comparable to – or greater than – many of these hard-coded DPU chips but are delivering the promised performance today.”
Third, Project Brainwave incorporates a software stack designed to support widely used deep learning frameworks. “We already support Microsoft Cognitive Toolkit and Google’s TensorFlow, and plan to support many others,” Burger said. “We have defined a graph-based intermediate representation, to which we convert models trained in the popular frameworks, and then compile down to our high-performance infrastructure.”
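Microsoft has not published the IR itself, but the general shape of a graph-based intermediate representation is well established. The minimal sketch below (all names hypothetical) shows a model as a graph of operations plus the kind of ordering pass a compiler runs before lowering it to hardware instructions:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    op: str                          # e.g. "matmul", "relu"
    inputs: list = field(default_factory=list)

@dataclass
class Graph:
    nodes: list = field(default_factory=list)

    def add(self, op, *inputs):
        node = Node(op, list(inputs))
        self.nodes.append(node)
        return node

def topological_order(graph):
    """Emit nodes in dependency order -- a typical first pass before
    lowering a dataflow graph to hardware instructions."""
    order, seen = [], set()
    def visit(node):
        if id(node) in seen:
            return
        seen.add(id(node))
        for i in node.inputs:
            visit(i)
        order.append(node)
    for node in graph.nodes:
        visit(node)
    return order

# A model imported from a framework would be converted into such a graph:
g = Graph()
x = g.add("input")
w = g.add("weights")
y = g.add("matmul", x, w)
out = g.add("relu", y)
print([n.op for n in topological_order(g)])
# ['input', 'weights', 'matmul', 'relu']
```

Converting each framework's model format into one shared graph form is what lets a single backend compiler target the FPGA infrastructure regardless of whether the model came from Cognitive Toolkit or TensorFlow.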
“We are working to bring this powerful, real-time AI system to users in Azure, so that our customers can benefit from Project Brainwave directly, complementing the indirect access through our services such as Bing,” Burger said.