These samples demonstrate how to use Foundry Local with Python.

Prerequisites:

- Python 3.11 or later
Available samples:

| Sample | Description |
|---|---|
| native-chat-completions | Initialize the SDK, start the local service, and run streaming chat completions. |
| audio-transcription | Transcribe audio files using the Whisper model. |
| web-server | Start a local OpenAI-compatible web server and call it with the OpenAI Python SDK. |
| tool-calling | Tool calling with custom function definitions (get_weather, calculate). |
| langchain-integration | LangChain integration for building translation and text generation chains. |
| tutorial-chat-assistant | Build an interactive multi-turn chat assistant (tutorial). |
| tutorial-document-summarizer | Summarize documents with AI (tutorial). |
| tutorial-tool-calling | Create a tool-calling assistant (tutorial). |
| tutorial-voice-to-text | Transcribe and summarize audio (tutorial). |
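The tool-calling samples pass custom function definitions (such as `get_weather`) to the model alongside the chat request. As a rough sketch, an OpenAI-style tool definition for `get_weather` might look like the following; the exact schema and parameter names in the samples may differ:

```python
import json

# Hypothetical tool definition in the OpenAI function-calling format;
# the actual samples' schemas are assumptions here, not copied from the repo.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# A list of such definitions is passed as the `tools` argument to a
# chat completion call; the model can then respond with a tool call
# that the application dispatches to the matching local function.
print(json.dumps(get_weather_tool, indent=2))
```

See the `tool-calling` and `tutorial-tool-calling` samples for the working implementations.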
Getting started:

- Clone the repository:

  ```bash
  git clone https://github.com/microsoft/Foundry-Local.git
  cd Foundry-Local/samples/python
  ```

- Navigate to a sample and install dependencies:

  ```bash
  cd native-chat-completions
  pip install -r requirements.txt
  ```

- Run the sample:

  ```bash
  python src/app.py
  ```
> [!TIP]
> Each sample's `requirements.txt` uses environment markers to automatically install the right SDK for your platform. On Windows, `foundry-local-sdk-winml` is installed for broader hardware acceleration; on macOS and Linux, the standard `foundry-local-sdk` is used. Just run `pip install -r requirements.txt`; platform detection is handled for you.
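The platform selection described in the tip is done with PEP 508 environment markers in `requirements.txt`. A minimal sketch of what such a file might contain (package names come from the tip above; the marker expressions and any version pins are illustrative, not copied from the repo):

```text
# Windows gets the WinML-enabled SDK for broader hardware acceleration
foundry-local-sdk-winml; sys_platform == "win32"

# macOS and Linux use the standard SDK
foundry-local-sdk; sys_platform != "win32"
```

`pip` evaluates the marker after the `;` at install time, so only the line matching the current platform is installed.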