
ML.NET Now Works on ARM Devices and Blazor WebAssembly

ML.NET now works on ARM64 and Apple M1 devices, and on Blazor WebAssembly, with some limitations for each.

Microsoft regularly updates ML.NET, an open source, cross-platform machine learning (ML) framework for .NET devs, along with the integral Model Builder component that features a simple visual interface for building, training and deploying custom ML models without requiring deep expertise.

In the June 2021 update, the new ARM64 and Blazor WebAssembly functionality was unveiled.

"You can now perform training and inferencing with ML.NET on ARM64 and Apple M1 (in addition to Linux and macOS) devices which enables platform support for mobile and embedded devices as well as ARM-based servers," Microsoft announced.

[Figure: Training and inferencing on a Pinebook Pro laptop running the Manjaro ARM Linux distribution (source: Microsoft).]
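For context, ML.NET code does not change across architectures: the same training and inferencing calls now run on ARM64 and Apple M1, subject to the limitations listed below. Here is a minimal sketch, assuming a tiny in-memory dataset and the SDCA logistic regression trainer (which is not on the unsupported list); the SentimentData and SentimentPrediction classes and their columns are illustrative, not taken from the announcement.

// Minimal ML.NET training + inferencing sketch (illustrative classes and data).
// The same code runs unchanged on x64, ARM64 and Apple M1 devices.
using Microsoft.ML;
using Microsoft.ML.Data;

public class SentimentData
{
    public string Text { get; set; }
    public bool Label { get; set; }
}

public class SentimentPrediction
{
    [ColumnName("PredictedLabel")]
    public bool Prediction { get; set; }
    public float Probability { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var mlContext = new MLContext(seed: 0);

        // Tiny in-memory training set, just to show the API shape.
        var samples = new[]
        {
            new SentimentData { Text = "I love this product", Label = true },
            new SentimentData { Text = "Terrible experience", Label = false },
            new SentimentData { Text = "Works great", Label = true },
            new SentimentData { Text = "Would not recommend", Label = false },
        };
        var trainData = mlContext.Data.LoadFromEnumerable(samples);

        // Featurize the text column, then train a binary classifier.
        var pipeline = mlContext.Transforms.Text
            .FeaturizeText("Features", nameof(SentimentData.Text))
            .Append(mlContext.BinaryClassification.Trainers.SdcaLogisticRegression());

        var model = pipeline.Fit(trainData);

        // Inferencing via a prediction engine.
        var engine = mlContext.Model.CreatePredictionEngine<SentimentData, SentimentPrediction>(model);
        var result = engine.Predict(new SentimentData { Text = "Really enjoyed it" });
        System.Console.WriteLine($"Positive: {result.Prediction} ({result.Probability:P1})");
    }
}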

Limitations that can surface as a "DLL not found" exception when training or inferencing on ARM include:

  • Symbolic SGD, TensorFlow, OLS, TimeSeries SSA, TimeSeries SrCNN and ONNX are not currently supported for training or inferencing.
  • LightGBM is currently supported for inferencing, but not training.
  • You can add LightGBM and ONNX support by compiling those libraries for ARM yourself, but neither project provides pre-compiled ARM/ARM64 binaries.

For Blazor WebAssembly -- the client-side component of Blazor, which allows for C#-based web development instead of JavaScript -- some training and inferencing is possible on .NET 6, currently in preview. It has the same limitations as those listed for ARM above, plus two more:

  • You must currently set the EnableMLUnsupportedPlatformTargetCheck flag to false to install ML.NET in a Blazor WebAssembly project (see the sketch after this list).
  • LDA and Matrix Factorization are not supported.
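As a rough illustration of the inferencing side in a Blazor WebAssembly app, a pre-trained model could be downloaded as a static asset and scored client side, as in the sketch below. The model.zip asset and the ReviewInput and ReviewPrediction classes are hypothetical; the EnableMLUnsupportedPlatformTargetCheck property noted above goes in the project file.

// Sketch of client-side inferencing in Blazor WebAssembly (model.zip and the
// input/output classes are hypothetical). The project file must also set
// <EnableMLUnsupportedPlatformTargetCheck>false</EnableMLUnsupportedPlatformTargetCheck>.
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.ML;

public class ReviewInput
{
    public string Text { get; set; }
}

public class ReviewPrediction
{
    public bool PredictedLabel { get; set; }
    public float Probability { get; set; }
}

public static class BrowserScorer
{
    public static async Task<ReviewPrediction> PredictAsync(HttpClient http, string text)
    {
        // Download the pre-trained model published as a static asset with the app.
        var modelBytes = await http.GetByteArrayAsync("model.zip");
        using var modelStream = new MemoryStream(modelBytes);

        var mlContext = new MLContext();
        var model = mlContext.Model.Load(modelStream, out _);

        // Score a single input in the browser.
        var engine = mlContext.Model.CreatePredictionEngine<ReviewInput, ReviewPrediction>(model);
        return engine.Predict(new ReviewInput { Text = text });
    }
}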

According to the release notes for the new update (ML.NET v1.5.5), other new features include:

  • New API allowing the confidence parameter to be a double (#5623). The confidence level can now be specified as a double, which helps when you need more precision than an int allows.
  • Support for exporting the ValueMapping estimator to ONNX (#5577); see the sketch after this list.
  • New API to treat TensorFlow output as batched or not batched (#5634). You can now specify whether the output from TensorFlow is batched.
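The ValueMapping-to-ONNX addition, for instance, means a fitted value-mapping transform can be written out with the existing ConvertToOnnx API. Below is a minimal sketch, assuming the Microsoft.ML and Microsoft.ML.OnnxConverter packages; the category values and column names are illustrative.

// Sketch: export a fitted ValueMapping transform to ONNX (new in v1.5.5).
// Assumes the Microsoft.ML.OnnxConverter package; data and columns are illustrative.
using System.Collections.Generic;
using System.IO;
using Microsoft.ML;

public class CategoryInput
{
    public string Category { get; set; }
}

public static class ValueMappingOnnxExport
{
    public static void Export(string onnxPath)
    {
        var mlContext = new MLContext();

        var data = mlContext.Data.LoadFromEnumerable(new[]
        {
            new CategoryInput { Category = "cat" },
            new CategoryInput { Category = "dog" },
        });

        // Map string categories to numeric ids with the ValueMapping estimator.
        var mapping = new[]
        {
            new KeyValuePair<string, float>("cat", 0f),
            new KeyValuePair<string, float>("dog", 1f),
        };
        var pipeline = mlContext.Transforms.Conversion.MapValue("CategoryId", mapping, "Category");

        var model = pipeline.Fit(data);

        // Export the fitted transform to an ONNX model file.
        using var stream = File.Create(onnxPath);
        mlContext.Model.ConvertToOnnx(model, data, stream);
    }
}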

The release notes also detail many bug fixes and documentation updates.

For the Model Builder component specifically, features previously presented in preview are now publicly available, including:

  • Config-based training with generated code-behind files
  • Restructured Advanced Data Options
  • Redesigned Consume step

AutoML, used to automate the time-consuming, iterative tasks of machine learning model development, has also been improved in cooperation with Microsoft Research. Generally, that partnership is expected to provide these benefits (a sketch of the current AutoML API follows the list):

  • Enabling AutoML support for all ML.NET scenarios
  • Allowing more precise control over the hyperparameter search space
  • Enabling more training environments, including local, Azure, and on-prem distributed training
  • Opening up future collaborations on advanced ML tech, like Neural Architecture Search (NAS)
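For reference, the AutoML API these improvements build on looks roughly like the sketch below: a regression experiment that searches featurizers, trainers and hyperparameters for a fixed time budget. The taxi-fare file, its schema and the column names are hypothetical.

// Sketch of the ML.NET AutoML API (Microsoft.ML.AutoML package); the data file
// and column layout are hypothetical.
using System;
using Microsoft.ML;
using Microsoft.ML.AutoML;
using Microsoft.ML.Data;

public class TaxiTrip
{
    [LoadColumn(0)] public float TripDistance { get; set; }
    [LoadColumn(1)] public float PassengerCount { get; set; }
    [LoadColumn(2)] public float FareAmount { get; set; }
}

public static class AutoMlDemo
{
    public static void Main()
    {
        var mlContext = new MLContext();

        // Load training data from a CSV file (path and schema are illustrative).
        var trainData = mlContext.Data.LoadFromTextFile<TaxiTrip>(
            "taxi-fare-train.csv", hasHeader: true, separatorChar: ',');

        // Let AutoML try featurizers, trainers and hyperparameters for 60 seconds.
        var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);
        var result = experiment.Execute(trainData, labelColumnName: nameof(TaxiTrip.FareAmount));

        Console.WriteLine($"Best trainer: {result.BestRun.TrainerName}, " +
                          $"RSquared: {result.BestRun.ValidationMetrics.RSquared:F3}");
    }
}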

Among the first concrete improvements are better benchmark results, which reflect an increased number of models explored across different training times, along with other metrics.

The announcement post also includes results of a survey conducted by the dev team, revealing -- among many other things -- that the biggest blockers, pain points and challenges respondents reported for using ML.NET include:

  • Small ML.NET community
  • Docs and samples (quantity, quality, real world)
  • Insufficient deep learning support
  • Specific ML scenario or algorithm not supported by ML.NET
  • Afraid Microsoft will abandon it

The post details how Microsoft plans to address those concerns.

About the Author

David Ramel is an editor and writer for Converge360.
