These samples demonstrate how to use the Rust binding for Foundry Local.
Prerequisites:

- Rust 1.70.0 or later
| Sample | Description |
|---|---|
| native-chat-completions | Non-streaming and streaming chat completions using the native chat client. |
| audio-transcription-example | Audio transcription (non-streaming and streaming) using the Whisper model. |
| foundry-local-webserver | Start a local OpenAI-compatible web server and call it with a standard HTTP client. |
| tool-calling-foundry-local | Tool calling with streaming responses, multi-turn conversation, and local tool execution. |
| tutorial-chat-assistant | Build an interactive multi-turn chat assistant (tutorial). |
| tutorial-document-summarizer | Summarize documents with AI (tutorial). |
| tutorial-tool-calling | Create a tool-calling assistant (tutorial). |
| tutorial-voice-to-text | Transcribe and summarize audio (tutorial). |
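The `foundry-local-webserver` sample above exposes an OpenAI-compatible endpoint, so any HTTP client can talk to it using the standard `/v1/chat/completions` request shape. As a rough illustration of that request body (the model name is a placeholder, and real code should use a JSON library rather than hand-built strings):

```rust
// Illustrative only: builds a minimal OpenAI-compatible chat-completions
// JSON body using just the standard library. The model name below is a
// placeholder; Foundry Local reports the actual endpoint and model IDs
// at runtime. Note: no JSON escaping is done here -- real code should
// serialize with a JSON library such as serde_json.

/// Build the JSON body for a single-turn chat completion request.
fn chat_request_body(model: &str, user_message: &str) -> String {
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}]}}"#,
        model, user_message
    )
}

fn main() {
    let body = chat_request_body("phi-3.5-mini", "Hello");
    // In a real program this body would be POSTed to
    // http://localhost:<port>/v1/chat/completions with an HTTP client.
    println!("{}", body);
}
```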
Clone the repository:

```shell
git clone https://github.com/microsoft/Foundry-Local.git
cd Foundry-Local/samples/rust
```
Run a sample:
```shell
cargo run -p native-chat-completions
```
Or navigate to a sample directory and run directly:

```shell
cd native-chat-completions
cargo run
```
Tip: Each sample's `Cargo.toml` uses `[target.'cfg(windows)'.dependencies]` to automatically enable the `winml` feature on Windows for broader hardware acceleration; on macOS and Linux the standard SDK is used. No manual configuration is needed.
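The platform-conditional dependency described above might look roughly like this in a sample's `Cargo.toml` (the crate name and version are placeholders, not copied from the repository):

```toml
# Placeholder crate name/version; the real samples' manifests may differ.
[dependencies]
foundry-local = "0.1"

# On Windows the same dependency is re-declared with the winml feature;
# Cargo merges the feature sets, so winml is enabled only on Windows.
[target.'cfg(windows)'.dependencies]
foundry-local = { version = "0.1", features = ["winml"] }
```

This works because Cargo unions the features requested for a dependency across the base and target-specific sections, applying the target-specific entry only when the `cfg(windows)` condition holds.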