docs/auth/byok.md (47 additions, 0 deletions)
@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model providers
| Azure OpenAI / Azure AI Foundry | `"azure"` | Azure-hosted models |
| Anthropic | `"anthropic"` | Claude models |
| Ollama | `"openai"` | Local models via OpenAI-compatible API |
| Microsoft Foundry Local | `"openai"` | Local on-device models via OpenAI-compatible API |
| Other OpenAI-compatible | `"openai"` | vLLM, LiteLLM, etc. |

## Quick Start: Azure AI Foundry
@@ -250,6 +251,36 @@ provider: {
}
```

### Microsoft Foundry Local

[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device behind an OpenAI-compatible API. Install the Foundry Local CLI, then point the SDK at your local endpoint:

```typescript
provider: {
  type: "openai",
  baseUrl: "http://localhost:<PORT>/v1",
  // No apiKey needed for Foundry Local
}
```

> **Note:** Foundry Local listens on a **dynamic port** rather than a fixed one. Use `foundry service status` to confirm the port the service is currently listening on, then use that port in your `baseUrl`.
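
Because the port can change, you may prefer to discover it at startup rather than hard-code it. Here is a minimal sketch of that idea, assuming Node.js and that the `foundry service status` output contains the service URL as `http://localhost:<port>` (the exact output format may vary by CLI version, and `getFoundryLocalBaseUrl` is a hypothetical helper, not part of the SDK):

```typescript
import { execSync } from "node:child_process";

// Hypothetical helper: scrape the active port from `foundry service status`.
// Assumes the output contains a URL like http://localhost:57127 -- adjust
// the regex if your CLI version prints something different.
function getFoundryLocalBaseUrl(): string {
  const status = execSync("foundry service status", { encoding: "utf8" });
  const match = status.match(/http:\/\/localhost:(\d+)/);
  if (!match) {
    throw new Error("No port found -- is the Foundry Local service running?");
  }
  return `http://localhost:${match[1]}/v1`;
}

const provider = {
  type: "openai",
  baseUrl: getFoundryLocalBaseUrl(),
  // No apiKey needed for Foundry Local
};
```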

To get started with Foundry Local:

```bash
# Windows: Install Foundry Local CLI (requires winget)
winget install Microsoft.FoundryLocal
# macOS / Linux: see https://foundrylocal.ai for installation instructions

# List available models
foundry model list

# Run a model (starts the local server automatically)
foundry model run phi-4-mini

# Check the port the service is running on
foundry service status
```
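
Once a model is running, you can sanity-check the endpoint before wiring it into the SDK. A quick sketch (Node 18+ for the global `fetch`; the port below is a placeholder, substitute the one reported by `foundry service status`; the `{ data: [...] }` response shape follows the usual OpenAI-compatible `/v1/models` convention):

```typescript
// Sketch: list the models served by the local endpoint.
const baseUrl = "http://localhost:57127/v1"; // placeholder port

const res = await fetch(`${baseUrl}/models`);
if (!res.ok) throw new Error(`Unexpected status ${res.status}`);
const { data } = await res.json();
// Each entry's `id` is the model name to use in requests, e.g. "phi-4-mini"
console.log(data.map((m: { id: string }) => m.id));
```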

### Anthropic

```typescript
@@ -305,6 +336,7 @@ Some Copilot features may behave differently with BYOK:
|----------|-------------|
| Azure AI Foundry | No Entra ID auth; must use API keys |
| Ollama | No API key; local only; model support varies |
| [Microsoft Foundry Local](https://foundrylocal.ai) | Local only; model availability depends on device hardware; no API key required |
| OpenAI | Subject to OpenAI rate limits and quotas |

## Troubleshooting
@@ -368,6 +400,21 @@ curl http://localhost:11434/v1/models
ollama serve
```

### Connection Refused (Foundry Local)

Foundry Local uses a dynamic port that may change between restarts. Confirm the active port:

```bash
# Check the service status and port
foundry service status
```

Update your `baseUrl` to match the port shown in the output. If the service is not running, start a model to launch it:

```bash
foundry model run phi-4-mini
```
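
If you would rather fail fast in code, a small connectivity probe can distinguish a stopped service or stale port from other errors. A sketch under the same assumptions as above (Node 18+; `checkFoundryLocal` is a hypothetical helper, not an SDK API):

```typescript
// Sketch: probe the configured endpoint before handing it to the SDK.
async function checkFoundryLocal(baseUrl: string): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/models`);
    return res.ok;
  } catch {
    // A connection error here usually means nothing is listening on this
    // port: re-run `foundry service status` and update your baseUrl.
    return false;
  }
}

if (!(await checkFoundryLocal("http://localhost:57127/v1"))) { // placeholder port
  console.error("Foundry Local unreachable -- check `foundry service status`");
}
```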

### Authentication Failed

1. Verify your API key is correct and not expired