
fix(mcp): pass base_url to Anthropic LLM and OpenAI Embedder clients#1159

Open
Milofax wants to merge 1 commit into getzep:main from Milofax:fix/anthropic-base-url

Conversation


@Milofax Milofax commented Jan 17, 2026

Summary

The LLMClientFactory and EmbedderFactory in factories.py read api_url from the config but never pass it to the client constructors. As a result:

  • AnthropicClient to always use https://api.anthropic.com instead of configured URL
  • OpenAIEmbedder to always use https://api.openai.com/v1 instead of configured URL

This breaks self-hosted/proxy setups where users want to route requests through custom endpoints.

Changes

  • Pass base_url=config.providers.anthropic.api_url to GraphitiLLMConfig for Anthropic
  • Pass base_url=config.providers.openai.api_url to OpenAIEmbedderConfig for OpenAI embedder
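The change above can be sketched as follows. This is a minimal illustration, not the actual factories.py code: the dict-based providers structure and the build_anthropic_llm_config helper are hypothetical stand-ins; only the names api_url, base_url, and GraphitiLLMConfig come from the PR description.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GraphitiLLMConfig:
    api_key: Optional[str] = None
    # When base_url is None, the Anthropic client falls back to
    # https://api.anthropic.com, which is the bug this PR fixes.
    base_url: Optional[str] = None

def build_anthropic_llm_config(providers: dict) -> GraphitiLLMConfig:
    anthropic = providers["anthropic"]
    # Before the fix: api_url was read from config but never forwarded,
    # so the client always used the official endpoint.
    # After the fix: forward it as base_url so proxy/self-hosted
    # endpoints are honored.
    return GraphitiLLMConfig(
        api_key=anthropic.get("api_key"),
        base_url=anthropic.get("api_url"),
    )

providers = {"anthropic": {"api_key": "sk-test", "api_url": "http://localhost:8317/v1"}}
cfg = build_anthropic_llm_config(providers)
print(cfg.base_url)  # http://localhost:8317/v1
```

The OpenAI embedder change is the same pattern: the value already read from config.providers.openai.api_url is forwarded as base_url on OpenAIEmbedderConfig instead of being dropped.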

Test Plan

  • Tested with CLIProxyAPI (an Anthropic-compatible proxy): requests now route to the custom endpoint
  • Tested with Ollama (OpenAI-compatible embeddings): embeddings now use the custom endpoint

🤖 Generated with Claude Code

The api_url from config was not being passed to the client constructors,
causing them to default to official API endpoints instead of custom ones.

Milofax commented Mar 8, 2026

Just checking in — this ensures base_url is properly passed to both the Anthropic LLM and OpenAI Embedder clients, which is needed for self-hosted setups. Would appreciate a look when you have time!

