diff --git a/docs/auth/byok.md b/docs/auth/byok.md
index b244c453..f1e63475 100644
--- a/docs/auth/byok.md
+++ b/docs/auth/byok.md
@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
 | Azure OpenAI / Azure AI Foundry | `"azure"` | Azure-hosted models |
 | Anthropic | `"anthropic"` | Claude models |
 | Ollama | `"openai"` | Local models via OpenAI-compatible API |
+| Microsoft Foundry Local | `"openai"` | Local on-device models via OpenAI-compatible API |
 | Other OpenAI-compatible | `"openai"` | vLLM, LiteLLM, etc. |
 
 ## Quick Start: Azure AI Foundry
@@ -250,6 +251,53 @@ provider: {
 }
 ```
 
+### Microsoft Foundry Local
+
+[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device with an OpenAI-compatible API. Install the Foundry Local CLI, start a model, then point the SDK at your local endpoint:
+
+```typescript
+provider: {
+  type: "openai",
+  baseUrl: "http://localhost:<port>/v1", // replace <port> with the active port
+  // No apiKey needed; the endpoint is local
+}
+```
+
+> **Note:** Foundry Local starts its service on a **dynamic port**, not a fixed one. Run `foundry service status` to confirm which port the service is currently listening on, then use that port in your `baseUrl`.
+
+To get started with Foundry Local:
+
+```bash
+# Install the Foundry Local CLI
+winget install Microsoft.FoundryLocal
+
+# List available models
+foundry model list
+
+# Run a model (starts the local server automatically)
+foundry model run phi-4-mini
+
+# Check which port the service is listening on
+foundry service status
+```
+
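+If you script the setup, you can avoid hard-coding the port. A minimal sketch, assuming you export the port reported by `foundry service status` as a `FOUNDRY_LOCAL_PORT` environment variable (a name chosen here for illustration; Foundry Local does not set it for you):
+
+```typescript
+// Read the current Foundry Local port from the environment so the
+// provider config survives the service restarting on a new port.
+const port = process.env.FOUNDRY_LOCAL_PORT;
+if (!port) {
+  throw new Error("Set FOUNDRY_LOCAL_PORT to the port shown by `foundry service status`");
+}
+
+const provider = {
+  type: "openai",
+  baseUrl: `http://localhost:${port}/v1`,
+  // No apiKey needed; the endpoint is local
+};
+```
+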
 ### Anthropic
 ```typescript
 provider: {
@@ -305,6 +353,7 @@ Some Copilot features may behave differently with BYOK:
 |----------|-------------|
 | Azure AI Foundry | No Entra ID auth; must use API keys |
 | Ollama | No API key; local only; model support varies |
+| [Microsoft Foundry Local](https://foundrylocal.ai) | Local only; model availability depends on device hardware; no API key required |
 | OpenAI | Subject to OpenAI rate limits and quotas |
 
 ## Troubleshooting
@@ -368,6 +417,28 @@ curl http://localhost:11434/v1/models
 ollama serve
 ```
 
+### Connection Refused (Foundry Local)
+
+Foundry Local uses a dynamic port that may change between restarts. Confirm the active port:
+
+```bash
+# Check the service status and port
+foundry service status
+```
+
+Update your `baseUrl` to match the port shown in the output. If the service is not running, start a model to launch it:
+
+```bash
+foundry model run phi-4-mini
+```
+
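+Once you know the port, you can sanity-check the endpoint the same way as any other OpenAI-compatible server (replace `<port>` with the value reported by `foundry service status`):
+
+```bash
+# Should list the models the local service currently exposes
+curl http://localhost:<port>/v1/models
+```
+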
 ### Authentication Failed
 
 1. Verify your API key is correct and not expired