News

Microsoft Releases Beta of High-Performance Computing Server

Microsoft took the next step into the arena of high-performance computing (HPC) today when it announced the release of the first beta of Windows HPC Server 2008, the successor to Compute Cluster Server 2003.

The public beta was scheduled to be available here; as of press time, however, no download was listed on the website. The final version is expected to be generally available in the second half of 2008, Microsoft said in a press release.

The product was renamed, Microsoft said, "to reflect its readiness to tackle the most challenging HPC workloads." HPC Server 2008 is based on Windows Server 2008, slated for availability early next year; it will be a specially crafted version meant to run on large clusters. Microsoft revealed few details of the server, but did list a number of key features, including a service-oriented architecture (SOA) job scheduler; support for partners' clustered file systems; better failover capabilities; more efficient and scalable management tools; and high-speed networking.

Kyril Faenov, general manager of HPC at Microsoft, said in the release that Redmond has seen substantial gains in its HPC environment from the new server. "By upgrading to Windows HPC Server 2008 on our 2,048-core production test cluster, we increased the LINPACK performance by 30 percent and were able to deploy and validate the cluster in less than two hours using out-of-the-box software. Expanding beyond traditional MPI [Message Passing Interface]-based HPC applications, Windows HPC Server 2008 enables support for high-throughput SOA applications with its advanced Web service routing capability and paves the way for bringing HPC capabilities to a broad range of enterprise applications."

LINPACK is a software benchmark that solves a dense system of linear equations. It's frequently used to measure computing performance, and is the basis for ranking supercomputers on the TOP500 list.
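To give a sense of what a LINPACK-style measurement involves, here is a minimal sketch in Python. This is not Microsoft's or Netlib's benchmark code; it simply solves a random dense system by Gaussian elimination, times it, and reports a rate using the standard LINPACK operation count of roughly 2n³/3 floating-point operations:

```python
import random
import time

def solve(a, b):
    """Solve the n x n system a·x = b in place,
    using Gaussian elimination with partial pivoting."""
    n = len(a)
    # Forward elimination
    for k in range(n):
        # Pivot: swap in the row with the largest entry in column k
        p = max(range(k, n), key=lambda i: abs(a[i][k]))
        a[k], a[p] = a[p], a[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            m = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= m * a[k][j]
            b[i] -= m * b[k]
    # Back substitution
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / a[i][i]
    return x

n = 200
a = [[random.random() for _ in range(n)] for _ in range(n)]
b = [sum(row) for row in a]  # chosen so the exact solution is all ones
t0 = time.perf_counter()
x = solve([row[:] for row in a], b[:])
elapsed = time.perf_counter() - t0
flops = (2 / 3) * n ** 3 + 2 * n ** 2  # standard LINPACK flop count
print(f"{flops / elapsed / 1e6:.1f} MFLOPS, "
      f"max error {max(abs(v - 1.0) for v in x):.2e}")
```

The real benchmark uses heavily optimized, parallel linear-algebra kernels (and, on clusters, MPI), which is why results like the 30 percent LINPACK gain Microsoft cites reflect the whole system, not just raw CPU speed.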

Five years is a long time between server releases in the computer age, but HPC Server 2008 has followed nearly the same timeline as Windows Server 2008, which will arrive about five years after Windows Server 2003. Until Compute Cluster Server 2003 appeared, Microsoft had largely been frozen out of the HPC space, which was dominated by "big iron" and more scalable OSes like Unix and Linux.

HPC is still largely confined to the realm of science, engineering, medicine and other research; many of the largest computer clusters are found at research universities. The trend over the last few years has been to put more cores on each processor and more processors in each machine, as it has become increasingly difficult to squeeze additional performance out of a single processor.

About the Author

Keith Ward is the editor in chief of Virtualization & Cloud Review. Follow him on Twitter @VirtReviewKeith.

