Microsoft Releases Prompty Extension For VSCode
Written by Kay Ewbank
Tuesday, 03 September 2024
Microsoft has released Prompty, a tool designed for creating, managing, debugging, and evaluating Large Language Model (LLM) prompts for your AI applications. The idea is that developers will be able to use it to integrate LLMs like GPT-4o directly into .NET apps. Prompty is free on the Visual Studio Code Marketplace, although it can't be used with Visual Studio Professional. It is an asset class and format for LLM prompts, and Microsoft says it is designed to enhance observability, understandability, and portability for developers. The logo for Prompty does bear an eerie echo of Clippy from Microsoft Office, but try not to let that frighten you off.

Prompty is made up of the specification, its tooling, and its runtime. The tooling features "front matter autocompletion", syntax highlighting and colorization. Validation is provided with red squiggles for undefined variables, and there's a quick run option, code generation and evaluation generation.

Prompty can be used to write prompts directly in your codebase to interact with the LLM. It supports various prompt formats, and the syntax highlighting aims to make your prompts readable and maintainable. Once you have your prompt ready, you can use the extension to generate code snippets, create documentation, or even debug your applications by asking the LLM specific questions.

Prompty's runtime is whatever engine understands and can execute the format. As a standalone file, a prompty can't really do anything without the help of the extension (when developing) or the runtime (when running). It targets LangChain, Semantic Kernel, and Prompt Flow as supporting runtimes. It works in Python (Prompt Flow and LangChain), and in C# (Semantic Kernel). Plans include support for TypeScript and JavaScript, and the format is also understood in Azure AI Studio.

You can define your model configurations directly in VS Code, and switch between different model configurations.
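To give a feel for the format, here is a minimal sketch of what a .prompty file looks like: YAML front matter describing the asset and its model configuration, followed by the prompt template itself. The specific names and values below (the deployment name, the sample question) are illustrative assumptions rather than a definitive example; check the Prompty specification for the exact fields.

```
---
# Front matter: metadata and model configuration for this prompt asset
name: SupportAssistant
description: Answers product questions in a friendly tone
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-4o   # hypothetical deployment name
sample:
  question: How do I reset my password?
---
system:
You are a helpful support assistant. Answer briefly and politely.

user:
{{question}}
```

The `{{question}}` placeholder is filled in by the runtime (or by the extension's quick run option) from the inputs you supply, with the `sample` section providing default values for testing.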
You can also use VS Code settings to define model configuration at the user level, for use across different prompty files, and at the workspace level, to share with team members via Git. OpenAI is also supported; you can store the key in VS Code settings or use ${env:xxx} to read the API key from an environment variable.

Prompty is available now on GitHub and the VS Code Marketplace.

More Information
Prompty On Visual Studio Marketplace

Related Articles
GitHub Launches AI Models Sandbox
Microsoft Goes All Out On Generative AI
Microsoft And GitHub Announce Copilot Extensions At Build 2024
OpenAI Introduces GPT-4o, Loses Sutskever
Microsoft's Generative AI for Beginners
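As a rough illustration of the settings approach, an OpenAI model configuration with an environment-variable key might look something like the fragment below in settings.json. The setting name and field names here are assumptions for illustration; consult the extension's documentation for the exact schema.

```
{
  // Hypothetical model configuration entry for the Prompty extension;
  // the actual setting key and fields may differ from this sketch.
  "prompty.modelConfigurations": [
    {
      "name": "gpt-4o-openai",
      "type": "openai",
      // ${env:xxx} reads the key from an environment variable
      // instead of storing it in settings
      "api_key": "${env:OPENAI_API_KEY}",
      "model": "gpt-4o"
    }
  ]
}
```

Placing this in user settings makes the configuration available across all your prompty files, while putting it in workspace settings lets you share it with your team via Git (in which case the ${env:xxx} form keeps the actual key out of the repository).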
Last Updated ( Tuesday, 03 September 2024 )