News

Microsoft Previews AI Prompt Playground for VS Code: Prompty

Prompt engineering positions might not be pulling down $335,000 salaries anymore, but Microsoft is still pumping out AI model prompting guidance. A recent effort is Prompty, described as an intuitive prompt playground delivered as a Visual Studio Code extension.

Those sky-high salaries were a thing in the ancient days of prompt engineering (last year); since then, the legitimacy of the "engineer" part of the discipline has dimmed and even been widely ridiculed.

Nevertheless, developers using GitHub Copilot and other AI systems powered by large language models (LLMs) still struggle to get the best results from AI interactions. So Microsoft in May published the Prompty extension, which was updated last month and was the subject of a blog post last week extolling its virtues and explaining its workings.

As of this writing, the tool, still in preview, has been installed some 2,200 times and reviewed by only one user, who gave it a perfect 5.0 rating.

Prompty VS Code Extension (source: Microsoft).

The associated Prompty GitHub repo describes the extension as an intuitive prompt playground within VS Code to streamline the prompt engineering process, stating: "Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers. The primary goal is to accelerate the developer inner loop."
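As the repo describes it, a Prompty asset pairs YAML front matter (metadata and model configuration) with a templated prompt body in a single .prompty file. The sketch below follows the published format, but the deployment name and sample values are placeholders for illustration:

```yaml
---
name: Support Reply
description: A sample prompt asset (illustrative values)
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-4o   # placeholder deployment name
sample:
  question: How do I reset my password?
---
system:
You are a helpful support assistant.

user:
{{question}}
```

The front matter tells the tooling which model and API to target, while `{{question}}` is a template variable filled in at run time, which is what lets the same prompt file travel between languages and runtimes.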

An Example Prompty Prompt (source: Microsoft).

Its main site says Prompty comprises three things -- a spec, tooling and a runtime -- and provides benefits including:

  • Feel confident while building: Understand what's coming in and going out and how to manage it effectively.
  • Language agnostic: Use with any language or framework you are familiar with.
  • Flexible and simple: Integrate into whatever development environments or workflows you have.

Microsoft provided more guidance last week, stating: "Prompty provides an intuitive interface to interact with LLMs directly from your development environment, making it easier than ever to add AI features to your projects.

"Prompty is developed by Microsoft and is available for free on the Visual Studio Code marketplace. Whether you are building chatbots, creating content generators, or experimenting with other AI-driven applications, Prompty can significantly streamline your development process."

A sample flow using the tool was presented as:

  1. Installation: Begin by installing the Prompty extension from the Visual Studio Code Marketplace.
  2. Setup: After installation, configure the extension by providing your API keys and setting up the necessary parameters to connect to the LLM of your choice, such as GPT-4o.
  3. Integration: Prompty integrates seamlessly with your development workflow. Start by creating a new file or opening an existing one where you want to use the LLM. Prompty provides commands and snippets to easily insert prompts and handle responses.
  4. Development: Write prompts directly in your codebase to interact with the LLM. Prompty supports various prompt formats and provides syntax highlighting to make your prompts readable and maintainable. Once your prompt is ready, you can use the extension to generate code snippets, create documentation, or even debug your applications by asking the LLM specific questions.
  5. Testing: Test your prompts and adjust them as needed to get the desired responses from the LLM. Prompty allows you to iterate quickly, refining your prompts to improve the accuracy and relevance of the AI's responses.
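The flow above rests on prompts living in plain files rather than buried in application code. As a rough illustration of why that aids portability, here is a minimal hand-rolled Python sketch -- not Prompty's own runtime -- that splits a .prompty-style file into its front matter and prompt body and fills in a `{{variable}}` placeholder:

```python
import re

def parse_prompty(text):
    # Split the YAML-style front matter (between "---" markers)
    # from the templated prompt body that follows it.
    match = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    front_matter, body = match.group(1), match.group(2)
    return front_matter, body

def render(body, **values):
    # Substitute {{name}} placeholders, mimicking the template syntax.
    for key, val in values.items():
        body = body.replace("{{" + key + "}}", val)
    return body

# A tiny illustrative asset; field names and values are placeholders.
sample = """---
name: Support Reply
---
system:
You are a helpful support assistant.

user:
{{question}}"""

fm, body = parse_prompty(sample)
print(render(body, question="How do I reset my password?"))
```

Because the asset is just text with a declared structure, any language that can split front matter from body and substitute variables can consume it, which is what the "language agnostic" claim amounts to in practice.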

The post goes on to walk through that flow in the context of a real sample using a WebAPI application.

"Prompty offers .NET developers an efficient way to integrate AI capabilities into their applications," Microsoft's Bruno Capuano concluded. "By using this Visual Studio Code extension, developers can effortlessly incorporate GPT-4o and other Large Language Models into their workflows.

"Prompty and Semantic Kernel simplifies the process of generating code snippets, creating documentation, and debugging applications with AI-driven focus."

More information can be found in the "Getting Started with Prompty" quickstart guidance.

The tool joins other prompting guidance from Microsoft, such as sharing Copilot prompts in a GitHub repo.

About the Author

David Ramel is an editor and writer for Converge360.

