# Prompt — Azure OpenAI Library
Welcome to the documentation for Prompt, a lightweight .NET 8 library for Azure OpenAI chat completions.
## Overview
Prompt provides a clean, minimal API for integrating Azure OpenAI into .NET applications. It handles the boilerplate — connection pooling, retry policies, environment configuration — so you can focus on building.
## Key Features
- Single-call prompts — `Main.GetResponseAsync()` for quick one-shot interactions
- Multi-turn conversations — `Conversation` maintains full message history across turns
- Template engine — `PromptTemplate` with `{{variable}}` placeholders and composition
- Prompt chaining — `PromptChain` pipes outputs between steps for multi-step reasoning
- Model presets — `PromptOptions` with factory methods for code generation, creative writing, summarization, and data extraction
- Automatic retries — exponential backoff for 429 rate-limit and 503 service errors
- Serialization — save/load conversations, templates, and chains as JSON
- Thread-safe — singleton client with connection pooling, safe for concurrent use
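A minimal sketch of the multi-turn conversation feature is shown below. Only the `Conversation` class name appears in this overview, so the constructor argument and `GetResponseAsync` method shape are assumptions for illustration; check the Conversations guide for the actual signatures.

```csharp
using Prompt;

// Hypothetical sketch — method names are assumed, not confirmed by this README.
var conversation = new Conversation("You are a helpful assistant.");

// Each call appends to the conversation's message history.
string? first = await conversation.GetResponseAsync("What is a Span<T> in .NET?");

// The follow-up question is answered with the earlier turn in context.
string? followUp = await conversation.GetResponseAsync("Show a short example of one.");
```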
## Quick Start
```shell
dotnet add package prompt-llm-aoi
```

```csharp
using Prompt;

// One-shot prompt
string? response = await Main.GetResponseAsync("Explain quantum computing.");
Console.WriteLine(response);
```
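The template engine can be combined with one-shot prompts along these lines. `PromptTemplate` and its `{{variable}}` placeholders come from the feature list above, but the constructor and `Render` call are illustrative assumptions; see the Templates guide for the real API.

```csharp
using Prompt;

// Hypothetical sketch — PromptTemplate exists per the feature list,
// but this constructor and Render signature are assumed.
var template = new PromptTemplate(
    "Summarize the following text in {{style}} style:\n{{text}}");

string article = "Quantum computers use qubits, which can exist in superposition...";

// Fill the {{style}} and {{text}} placeholders, then send the rendered prompt.
string rendered = template.Render(new { style = "bullet-point", text = article });
string? summary = await Main.GetResponseAsync(rendered);
Console.WriteLine(summary);
```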
## Documentation
| Guide | Description |
|---|---|
| Getting Started | Installation, configuration, and first prompt |
| Conversations | Multi-turn dialogue with history and serialization |
| Templates | Reusable prompts with `{{variable}}` placeholders and composition |
| Prompt Chains | Multi-step reasoning pipelines |
| Model Options | Temperature, tokens, penalties, and presets |
| Error Handling | Exception types, retry behavior, and production patterns |
| Migration Guide | Upgrading between versions with breaking change details |
| API Reference | Full class and method documentation |