News

Microsoft Eases Integration with Semantic Kernel AI SDK

Microsoft is previewing a new .NET library that eases integration with its Semantic Kernel SDK, which is itself used to add AI functionality to .NET apps.

The new Microsoft.Extensions.AI NuGet library, however, primarily targets not AI app developers but the partner community: Independent Software Vendors (ISVs), AI solution providers, and companies that supply AI systems such as large language models (LLMs) or embeddings. Its purpose is to provide utilities for working with generative AI components.

The basic idea is to provide unified API abstractions, expressed in idiomatic C#, that let platform developers and others work with any AI provider while relying on standard implementations for caching, telemetry, tool calling and other common tasks.

The initial preview comes with reference implementations for:

  • OpenAI
  • Azure AI Inference
  • Ollama
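To illustrate the provider-agnostic idea, the following sketch assumes the preview API shape of Microsoft.Extensions.AI (the `IChatClient` abstraction, an `OllamaChatClient` reference implementation, and a `CompleteAsync` method); exact type and member names may change before GA.

```csharp
// Sketch only: assumes the preview shapes of IChatClient, OllamaChatClient
// and CompleteAsync from Microsoft.Extensions.AI; names may change before GA.
using Microsoft.Extensions.AI;

// Code written against IChatClient works the same whether the concrete
// client talks to a local Ollama model, OpenAI, or Azure AI Inference.
IChatClient client =
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.1");

var response = await client.CompleteAsync("What is .NET?");
Console.WriteLine(response.Message.Text);
```

Swapping in a different provider's `IChatClient` implementation should leave the calling code above unchanged.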
[Figure: Microsoft.Extensions.AI Architecture (source: Microsoft).]

Core benefits, Microsoft's .NET dev team said, include:

  • Unified API: Delivers a consistent set of APIs and conventions for integrating AI services into .NET applications.
  • Flexibility: Allows .NET library authors to use AI services without being tied to a specific provider.
  • Ease of Use: Enables .NET developers to experiment with different packages using the same underlying abstractions, maintaining a single API throughout their application.
  • Componentization: Simplifies adding new capabilities and facilitates the componentization and testing of applications.
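The componentization benefit comes from a middleware-style pipeline. The sketch below assumes the preview's `ChatClientBuilder` and its caching/telemetry/tool-calling extension methods; the builder's exact shape has varied across previews, so treat the names as illustrative.

```csharp
// Sketch only: assumes the preview's ChatClientBuilder pipeline and its
// UseDistributedCache / UseOpenTelemetry / UseFunctionInvocation extensions.
// The builder's exact shape has varied across preview releases.
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;

IChatClient inner =
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.1");

// 'cache' is any IDistributedCache instance supplied by the host app.
IChatClient client = new ChatClientBuilder(inner)
    .UseDistributedCache(cache)   // standard caching middleware
    .UseOpenTelemetry()           // standard telemetry middleware
    .UseFunctionInvocation()      // automatic tool (function) calling
    .Build();
```

Each `Use*` call wraps the inner client in another `IChatClient`, so capabilities compose and can be tested in isolation.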

And, while primarily targeting those platform/library developers and service providers, Microsoft listed ways for other groups to get involved:

  • Library Developers: If you own libraries that provide clients for AI services, consider implementing the interfaces in your libraries. This allows users to easily integrate your NuGet package via the abstractions.
  • Service Consumers: If you're developing libraries that consume AI services, use the abstractions instead of hardcoding to a specific AI service. This approach gives your consumers the flexibility to choose their preferred service.
  • Application Developers: Try out the abstractions to simplify integration into your apps. This enables portability across models and services, facilitates testing and mocking, leverages middleware provided by the ecosystem, and maintains a consistent API throughout your app, even if you use different services in different parts of your application (e.g., hybrid scenarios mixing local and hosted models).
  • Ecosystem Contributors: If you're interested in contributing to the ecosystem, consider writing custom middleware components.
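For contributors writing custom middleware, the natural extension point is a delegating client. The sketch below assumes the preview's `DelegatingChatClient` base class and `CompleteAsync` signature; the hypothetical `LoggingChatClient` name and behavior are purely illustrative.

```csharp
// Sketch only: assumes the preview's DelegatingChatClient base class and
// CompleteAsync signature from Microsoft.Extensions.AI.
using Microsoft.Extensions.AI;

// A hypothetical middleware component that logs the latest prompt
// before delegating to whatever IChatClient it wraps.
public sealed class LoggingChatClient(IChatClient innerClient)
    : DelegatingChatClient(innerClient)
{
    public override async Task<ChatCompletion> CompleteAsync(
        IList<ChatMessage> chatMessages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        Console.WriteLine($"Prompt: {chatMessages[^1].Text}");
        return await base.CompleteAsync(chatMessages, options, cancellationToken);
    }
}
```

Because the wrapper is itself an `IChatClient`, it slots into any pipeline regardless of which provider sits at the bottom.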

Microsoft's Semantic Kernel dev team also provided a look at what's next for the project: "When Microsoft.Extensions.AI moves from preview to general availability (GA), we will enable this package in Semantic Kernel .NET and when our Memory packages (for .NET, Python, and Java) move to GA within the next few months, we are excited to support vector service providers and ISVs in delivering their services directly to professional developers. More to come about an abstraction package for this group."

Other steps in the near term listed by the .NET team include:

  • Continue collaborating with Semantic Kernel on integrating Microsoft.Extensions.AI as its foundational layer.
  • Update existing samples like eShop to use Microsoft.Extensions.AI.
  • Work with everyone across the .NET ecosystem on the adoption of Microsoft.Extensions.AI. The more providers implement the abstractions, the more consumers use them, and the more middleware components are built, the more powerful all of the pieces become.

About the Author

David Ramel is an editor and writer at Converge 360.
