
feat(config): add [[providers]] in .forge.toml#2821

Merged
tusharmath merged 3 commits into main from config-provider-support on Apr 3, 2026

Conversation

Collaborator

@tusharmath tusharmath commented Apr 3, 2026

Summary

Allow users to define custom or override built-in providers directly in forge.toml, enabling teams to integrate proprietary or self-hosted LLM endpoints without code changes.

Context

Previously, adding a new LLM provider or adjusting an existing provider's settings (URL, API key variable, protocol, headers) required modifying the built-in provider registry in Rust source code. This made it impossible for teams to use self-hosted models, enterprise proxies, or custom OpenAI-compatible endpoints through configuration alone.

This feature unlocks a configuration-first workflow: any provider reachable via a supported wire protocol (OpenAI, Anthropic, Google, Bedrock, OpenAIResponses) can now be registered or overridden entirely from forge.toml.

Changes

  • Added ProviderEntry and ProviderUrlParam structs to forge_config for inline provider definitions
  • Added a providers field to ForgeConfig that accepts a list of ProviderEntry values
  • Implemented From<ProviderEntry> and From<ProviderUrlParam> conversions in forge_repo to transform config entries into internal ProviderConfig / UrlParamVarConfig types
  • Updated forge.schema.json with full JSON Schema definitions for ProviderEntry and ProviderUrlParam, enabling editor autocomplete and validation
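
Based on the fields exercised in the examples below, the new config types might look roughly like this. This is a hedged sketch: the exact field names, derives, and optionality in the real `forge_config` crate may differ.

```rust
// Hypothetical sketch of the new forge_config types; the actual crate
// may use different derives (e.g. serde, schemars) and field types.
use std::collections::BTreeMap;

/// One entry of [[providers.url_param_vars]].
#[derive(Debug, Clone)]
pub struct ProviderUrlParam {
    /// Placeholder name substituted into {{NAME}} in the URL template.
    pub name: String,
    /// Optional allow-list of values, e.g. ["eastus", "westeurope"].
    pub options: Option<Vec<String>>,
}

/// One entry of [[providers]] in forge.toml.
#[derive(Debug, Clone)]
pub struct ProviderEntry {
    /// Matches a built-in provider id to override it, or adds a new one.
    pub id: String,
    pub url: Option<String>,
    pub api_key_var: Option<String>,
    /// Wire protocol: "OpenAI", "Anthropic", "Google", "Bedrock", "OpenAIResponses".
    pub response_type: Option<String>,
    pub url_param_vars: Option<Vec<ProviderUrlParam>>,
    pub custom_headers: Option<BTreeMap<String, String>>,
}
```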

Key Implementation Details

  • Override semantics: entries with an id matching a built-in provider override that provider's settings field by field; entries with a new id are appended to the provider list.
  • URL templates: {{VAR}} placeholders in url are substituted from url_param_vars at runtime.
  • Custom headers: HTTP headers can be injected per provider via custom_headers.
  • Wire protocol: selected via response_type, which accepts "OpenAI", "Anthropic", "Google", "Bedrock", or "OpenAIResponses".
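
The runtime substitution of {{VAR}} placeholders can be sketched as a simple string replacement; the function name and signature here are illustrative, not the actual forge_repo API:

```rust
use std::collections::HashMap;

/// Hypothetical sketch: replace each {{NAME}} placeholder in a URL
/// template with its resolved value from url_param_vars.
fn substitute_url(template: &str, vars: &HashMap<String, String>) -> String {
    let mut url = template.to_string();
    for (name, value) in vars {
        // format! doubles braces, so this builds the literal "{{NAME}}".
        url = url.replace(&format!("{{{{{}}}}}", name), value);
    }
    url
}
```

For example, with AZURE_REGION = "eastus", the Azure template below resolves to "https://eastus.openai.azure.com/...".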

Use Cases

Register a self-hosted OpenAI-compatible endpoint:

```toml
[[providers]]
id = "my_openai_proxy"
url = "https://proxy.internal/v1/chat/completions"
api_key_var = "PROXY_API_KEY"
response_type = "OpenAI"
```

Override a built-in provider's base URL with template variables:

```toml
[[providers]]
id = "azure"
url = "https://{{AZURE_REGION}}.openai.azure.com/openai/deployments/{{AZURE_DEPLOYMENT}}/chat/completions"

[[providers.url_param_vars]]
name = "AZURE_REGION"
options = ["eastus", "westeurope"]

[[providers.url_param_vars]]
name = "AZURE_DEPLOYMENT"
```

Add custom headers for an enterprise provider:

```toml
[[providers]]
id = "enterprise_llm"
url = "https://llm.corp.example.com/v1/chat"
api_key_var = "CORP_LLM_KEY"
response_type = "OpenAI"

[providers.custom_headers]
X-Org-ID = "my-org"
X-Cost-Center = "eng"
```

Testing

```sh
# Run all config and provider repo tests
cargo insta test --accept -p forge_config -p forge_repo

# Verify schema and types compile
cargo check
```

@github-actions github-actions bot added the type: feature Brand new functionality, features, pages, workflows, endpoints, etc. label Apr 3, 2026
@tusharmath tusharmath changed the title feat(config): add inline provider definitions to forge config feat(config): add [[providers]] in .forge.toml Apr 3, 2026
@tusharmath tusharmath merged commit 4e2d666 into main Apr 3, 2026
12 checks passed
@tusharmath tusharmath deleted the config-provider-support branch April 3, 2026 11:59