AI Toolkit for VS Code Now Lets You 'Bring Your Own Model'
Microsoft updated its AI Toolkit for Visual Studio Code, which now lets developers "bring your own model" to the open-source-based code editor.
The toolkit, still in preview, was introduced in May 2024 at the company's Build developer conference, enabling developers to explore, try, fine-tune and integrate state-of-the-art AI models into applications. It features a Model Catalog and a Model Playground, helping devs tweak settings for different models and immediately see the effects. Or, as its official description reads: "AI Toolkit for VS Code streamlines generative AI app development by integrating tools and models. Browse and download public and custom models; author, test and evaluate prompts; fine-tune; and use them in your applications."
Those models primarily came from Hugging Face and Azure AI Studio, showcased in the Model Catalog.
Today, Microsoft announced an update to the tool that expands its capabilities to let developers bring their own models into the toolkit, obtained from Ollama or via APIs.
"AI Toolkit extension for VS Code now supports external local models via Ollama," said the announcement. "It has also added support [for] remote hosted models using API keys for OpenAI, Google and Anthropic. As we have seen in past blog posts, AI Toolkit supports a range of models using [the] GitHub Marketplace of models. However, you might require support for external models hosted by Google, Anthropic and OpenAI which are either not available in the GitHub catalog of models, or [you] might want to use the models served by Ollama."
Ollama, like Hugging Face, provides access to AI models. Broadly, Ollama is geared toward local, private model execution, while Hugging Face offers a comprehensive platform for AI model development and collaboration.
"Several developers are also using Ollama to experiment and play with models using the command line," Microsoft said. "Ollama is an open-source AI tool that allows users to run large language models (LLMs) on their local systems. It's a valuable tool for industries that require data privacy, such as healthcare, finance and government, which might need locally hosted models. So, AI Toolkit already supports some locally downloadable models, such as those in the Phi series by Microsoft or those by Mistral. Ollama supports a wider variety of models, especially those from Meta's Llama series of LLMs and SLMs. The complete list of models currently supported by Ollama can be found at [the] Ollama library."
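Once a model has been pulled, Ollama serves it through a local REST API (by default on port 11434), so a locally hosted model can be queried from any language, not just the command line. Here is a minimal sketch in Python; the model name and prompt are illustrative, and actually sending the request assumes Ollama is installed and running locally:

```python
import json

# Ollama's default local endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> str:
    """Assemble the JSON body for Ollama's /api/generate endpoint.

    "stream": False asks for a single complete response instead of
    a stream of partial tokens.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

if __name__ == "__main__":
    payload = build_generate_request("llama3", "Why is the sky blue?")
    print(payload)
    # To actually send it (requires `ollama serve` running locally):
    # import urllib.request
    # req = urllib.request.Request(
    #     OLLAMA_URL,
    #     data=payload.encode(),
    #     headers={"Content-Type": "application/json"},
    # )
    # print(json.loads(urllib.request.urlopen(req).read())["response"])
```

Because the API is plain HTTP on localhost, nothing in the prompt or response leaves the machine, which is the privacy property the announcement highlights for regulated industries.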
In May 2024 the toolkit listed 3,925 installs; today it lists more than 46,000.
About the Author
David Ramel is an editor and writer at Converge 360.