Microsoft Claims Fastest Network in the Cloud

Microsoft CEO Satya Nadella demonstrated some of the AI supercomputing capabilities of the company's Azure public cloud, which Microsoft says is now among the fastest cloud services available. The secret sauce is the use of field-programmable gate arrays.

Microsoft CEO Satya Nadella demonstrated some of the AI supercomputing capabilities of the company's Azure platform. In his keynote this week at Microsoft Ignite in Atlanta, he said the company started upgrading every node in its Azure public cloud two years ago with software-defined networking (SDN) infrastructure developed using field-programmable gate arrays (FPGAs).

The result is that Microsoft's Azure public cloud fabric is now built on a 25 gigabit-per-second backbone -- up from 10Gbps -- with a 10x reduction in latency, which Microsoft believes gives Azure the highest-speed network among cloud services. Combined with new GPU nodes recently made available in the Azure Portal, the upgraded fabric, Microsoft also claims, can function as the world's fastest supercomputer, capable of running artificial intelligence, cognitive computing and neural network-based applications.

The stealth upgrade started two years ago, when Microsoft began installing the FPGAs -- programmable chips from Altera, now part of Intel, that implement the SDN functions. Microsoft revealed the Azure infrastructure and network upgrade at this week's Ignite conference in Atlanta, where Nadella demonstrated some of the newly bolstered Azure's AI supercomputing capabilities during his keynote session late Monday.

"We have the ability, through the magic of the fabric that we've built to distribute your machine learning tasks and your deep neural nets to all of the silicon that is available so that you can get performance that scales," Nadella said.

Doug Burger, a networking expert from Microsoft Research, joined Nadella on stage to describe why Microsoft made a significant investment in the FPGAs and SDN architecture. "FPGAs are programmable hardware," Burger explained. "What that means is that you get the efficiency of hardware, but you also get flexibility because you can change their functionality on the fly. And this new architecture that we've built effectively embeds an FPGA-based AI supercomputer into our global hyper-scale cloud. We get awesome speed, scale and efficiency. It will change what's possible for AI."

Burger said Microsoft is using a special type of neural network called a "convolutional neural net," which can recognize the content within a collection of images. Adding a 30-watt FPGA to a server turbocharges it, allowing the server to recognize images significantly faster than the CPU could on its own. "It gives the server a huge boost for AI tasks," he said.
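
Microsoft has not published the model behind that demo, but for readers unfamiliar with the term, the sketch below shows in broad strokes what a convolutional neural net looks like in code. It is a toy PyTorch example with arbitrary layer sizes and a made-up 10-class output, not Microsoft's network, and it runs on an ordinary CPU or GPU rather than an FPGA.

```python
# Minimal illustrative convolutional neural net (NOT Microsoft's model).
# Layer sizes, input resolution and the 10-class output are assumptions.
import torch
import torch.nn as nn

class TinyConvNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Convolutional layers learn local image features (edges, textures).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 16x16 -> 8x8
        )
        # A fully connected layer maps the pooled features to class scores.
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = x.flatten(1)
        return self.classifier(x)

# Classify a batch of 32x32 RGB images (random tensors stand in for real photos).
model = TinyConvNet()
images = torch.randn(4, 3, 32, 32)
scores = model(images)
print(scores.argmax(dim=1))  # predicted class index for each image
```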

Showing a more complex task, Burger demonstrated how adding four FPGA boards to a high-end 24-core CPU configuration can translate the 1,400-page novel War and Peace from Russian to English in 2.5 seconds. "Our accelerated cognitive services run blazingly fast," he said. "Even more importantly, we can now do accelerated AI on a global scale, at hyper-scale."

Applying 50 FPGA boards across 50 nodes, the AI-based cloud supercomputer can translate 5 billion words into another language in less than a tenth of a second, according to Burger -- a rate amounting to 100 trillion operations per second. "That crazy speed shows the raw power of what we've deployed in our intelligent cloud," he said.
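
Readers who want to gauge the scale of those figures can divide them out directly. The snippet below is only back-of-envelope arithmetic on the numbers quoted above; the roughly 2,000 operations per word it implies is a derived figure, not a published benchmark.

```python
# Back-of-envelope arithmetic on the quoted figures (a reader's sanity check,
# not Microsoft's own accounting).
words = 5e9              # 5 billion words translated
seconds = 0.1            # "less than a tenth of a second"
ops_per_second = 100e12  # quoted aggregate rate: 100 trillion operations/second

words_per_second = words / seconds                  # 5e10 words per second
ops_per_word = ops_per_second / words_per_second    # ~2,000 operations per word
print(f"{words_per_second:.1e} words/s, about {ops_per_word:.0f} ops per word")
```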

In an interview, Burger described the deployment of this new network infrastructure in Azure as a major milestone and differentiator for the company's public cloud. "This architecture is disruptive," Burger said, noting it's also deployed in the fabric of the Bing search engine. "So when you do a Bing search, you're actually touching this new fabric."

About the Author

Jeffrey Schwartz is the editor of 1105 Media's Redmond magazine, an editor-at-large and columnist for Redmond Channel Partner magazine, and author of a blog covering enterprise cloud computing called The Schwartz Cloud Report. Earlier in his tenure with the Enterprise Computing Group of 1105 Media, he held senior editorial positions with Application Development Trends, Visual Studio Magazine and Redmond Developer News. He has covered all aspects of enterprise IT for more than two decades and has spent much of that time writing about mobile computing technology. Before joining 1105 Media's Enterprise Computing group, he held several senior editorial roles with such publications as VARBusiness (now part of CRN), InternetWeek and CommunicationsWeek.
