Microsoft Releases Prompty Extension For VSCode
Written by Kay Ewbank   
Tuesday, 03 September 2024

Microsoft has released Prompty, a tool designed for creating, managing, debugging, and evaluating Large Language Model (LLM) prompts for your AI applications. The idea is that developers will be able to use it to integrate LLMs like GPT-4o directly into .NET apps.

Prompty is free on the Visual Studio Code Marketplace, although it can't be used with Visual Studio Professional. It is an asset class and format for LLM prompts, and Microsoft says it is designed to enhance observability, understandability, and portability for developers. The logo for Prompty does bear an eerie echo of Clippy from Microsoft Office, but try not to let that frighten you off.


Prompty is made up of three parts: the specification, the tooling, and the runtime. The specification states that Prompty is intended to be a language-agnostic asset class for creating and managing prompts. It uses the common markdown format with modified front matter to specify metadata, model settings, and sample data, while the content itself is provided in a standard template format.
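
To give an idea of the format, here is a minimal sketch of what a Prompty asset might look like. The field names and template style follow the examples in the Prompty repository, but the specific values (the prompt's name, inputs, and deployment) are purely illustrative:

    ---
    name: Customer Support Prompt
    description: Answers customer questions briefly and politely
    model:
      api: chat
      configuration:
        type: azure_openai
        azure_deployment: gpt-4o
    sample:
      firstName: Jane
      question: What hours are you open?
    ---
    system:
    You are a helpful customer support assistant. Keep answers short and polite.

    user:
    {{firstName}} asks: {{question}}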

The tooling provides front matter autocompletion, syntax highlighting, and colorization, with validation in the form of red squiggles for undefined variables. There is also a quick run option, code generation, and evaluation generation.

Prompty can be used to write prompts directly in your codebase to interact with the LLM. It supports various prompt formats, and the syntax highlighting aims to make your prompts readable and maintainable. Once your prompt is ready, you can use the extension to generate code snippets, create documentation, or even debug your applications by asking the LLM specific questions.

Prompty's runtime is whatever engine understands and can execute the format. As a standalone file, a Prompty asset can't really do anything without the help of the extension (when developing) or the runtime (when running). It targets LangChain, Semantic Kernel, and Prompt Flow as supporting runtimes, working in Python (Prompt Flow and LangChain) and in C# (Semantic Kernel). Plans include support for TypeScript and JavaScript, and the format is also understood by Azure AI Studio.
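
As a rough illustration of the Python side, the sketch below assumes the prompty package is installed and that the support.prompty file from the earlier example sits in the working directory; the exact API surface may differ between releases:

    # A minimal sketch, assuming the prompty Python package (pip install prompty)
    # and a support.prompty asset whose template expects firstName and question.
    import prompty

    # Load the asset and run it against the model named in its front matter,
    # passing values for the template inputs.
    response = prompty.execute(
        "support.prompty",
        inputs={"firstName": "Jane", "question": "What hours are you open?"},
    )

    print(response)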

You can define your model configurations directly in VS Code and switch between them. You can also use VS Code settings to define a model configuration at the user level, for use across different Prompty files, or at the workspace level, to share with team members via Git.

OpenAI is also supported; you can store the API key in VS Code settings or use ${env:xxx} to read it from an environment variable.
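
For example, the model configuration in a Prompty file's front matter could point at an environment variable instead of a hard-coded key. This is a sketch, with OPENAI_API_KEY simply standing in for whatever variable name you use:

    model:
      api: chat
      configuration:
        type: openai
        name: gpt-4o
        api_key: ${env:OPENAI_API_KEY}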

Prompty is available now on GitHub and the VS Code Marketplace.


More Information

Prompty On GitHub

Prompty Webpage

Prompty On Visual Studio Marketplace

Related Articles

GitHub Launches AI Models Sandbox

Microsoft Goes All Out On Generative AI

Microsoft And GitHub Announce Copilot Extensions At Build 2024

OpenAI Introduces GPT-4o, Loses Sutskever

Microsoft's Generative AI for Beginners

To be informed about new articles on I Programmer, sign up for our weekly newsletter, subscribe to the RSS feed and follow us on Twitter, Facebook or LinkedIn.


Last Updated ( Tuesday, 03 September 2024 )