News

New Azure AI VMs Immediately Claim Top500 Supercomputer Rankings

Visual Studio coders who dabble in artificial intelligence projects can now take advantage of new Azure virtual machines (VMs) featuring NVIDIA GPUs with 80 GB of memory each, which immediately claimed four spots on the TOP500 supercomputer list, Microsoft said.

In June, the company claimed to have the "fastest public cloud supercomputer" when it announced scale-out NVIDIA A100 GPU clusters.

Building on those instances, Microsoft last week announced the new NDm A100 v4 series of VMs, which sport NVIDIA A100 Tensor Core 80 GB GPUs. The company said the series expands Azure's leadership-class AI supercomputing scalability in the public cloud and has claimed four official places on the TOP500 list.

The company positioned the new high-memory NDm A100 v4 series as bringing AI/supercomputer power to the masses, giving organizations an opportunity to use it to attain a competitive advantage. That's done with the help of a class-leading design that features the following (a short code sketch after the spec image shows one way such an interconnect gets exercised in practice):

  • In-network computing
  • 200 Gb/s of dedicated interconnect bandwidth and GPUDirect RDMA for each GPU
  • An all-new PCIe Gen 4.0-based architecture
[Image: The Specs (source: Microsoft)]
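
To give a sense of how those interconnect features get used, here is a minimal sketch of a multi-GPU all-reduce benchmark in PyTorch. PyTorch's NCCL backend is what typically takes advantage of GPUDirect RDMA and fast interconnects when they are available; the launcher, tensor size, and iteration counts here are illustrative assumptions, not anything from Microsoft's announcement.

```python
# Minimal sketch: time an all-reduce across the GPUs on one VM.
# Assumes PyTorch with CUDA and a process-per-GPU launcher, e.g.:
#   torchrun --nproc_per_node=8 allreduce_check.py
import os
import time

import torch
import torch.distributed as dist


def main():
    # torchrun sets LOCAL_RANK, RANK and WORLD_SIZE for each process.
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # NCCL is the backend that can exploit GPUDirect RDMA / fast interconnects.
    dist.init_process_group(backend="nccl")

    numel = 256 * 1024 * 1024  # 1 GiB of float32 per GPU
    x = torch.ones(numel, dtype=torch.float32, device="cuda")

    # Warm up, then time a few all-reduces.
    for _ in range(5):
        dist.all_reduce(x)
    torch.cuda.synchronize()

    iters = 20
    start = time.time()
    for _ in range(iters):
        dist.all_reduce(x)
    torch.cuda.synchronize()
    elapsed = time.time() - start

    if dist.get_rank() == 0:
        gib = x.element_size() * x.numel() / 2**30
        print(f"avg all-reduce of {gib:.1f} GiB took {elapsed / iters * 1e3:.1f} ms")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

The same script works across multiple VMs when launched with a multi-node launcher, which is where the per-GPU interconnect bandwidth matters most.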

"We live in the era of large-scale AI models, the demand for large scale computing keeps growing," said Sherry Wang, senior program manager, Azure HPC and AI. "The original ND A100 v4 series features NVIDIA A100 Tensor Core GPUs each equipped with 40 GB of HBM2 memory, which the new NDm A100 v4 series doubles to 80 GB, along with a 30 percent increase in GPU memory bandwidth for today’s most data-intensive workloads. RAM available to the virtual machine has also increased to 1,900 GB per VM- to allow customers with large datasets and models a proportional increase in memory capacity to support novel data management techniques, faster checkpointing, and more."

About the Author

David Ramel is an editor and writer at Converge 360.


