docs: clarify Ollama and OpenAI-compatible setup#441

Open
Beandon13 wants to merge 1 commit into aliasrobotics:main from Beandon13:docs/ollama-openai-compatible-428-20260504155803

Conversation

@Beandon13

Summary

  • Clarifies local Ollama setup with the ollama/ model prefix and the /v1 base URL.
  • Adds an OpenAI-compatible provider page documenting OPENAI_BASE_URL with the openai/ prefix.
  • Updates the README custom base URL example so it uses OPENAI_BASE_URL instead of OLLAMA_API_BASE.
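As a rough illustration of the routing the bullets above describe, the two setups could look like the following sketch. The endpoint URLs and model names here are hypothetical placeholders, not text taken from the PR's docs:

```shell
# Local Ollama: models are referenced with the ollama/ prefix, and the
# base URL points at Ollama's OpenAI-compatible /v1 endpoint.
# (localhost:11434 is Ollama's default port; adjust if yours differs.)
export OLLAMA_API_BASE="http://localhost:11434/v1"
# e.g. model name: ollama/llama3

# Generic OpenAI-compatible endpoint: models use the openai/ prefix,
# and OPENAI_BASE_URL points at the provider's API root (hypothetical URL).
export OPENAI_BASE_URL="https://my-endpoint.example.com/v1"
# e.g. model name: openai/my-model
```

The point of the split is that the prefix selects the provider route while the base-URL variable tells that route where to send requests.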

Why

Issue #428 calls out confusion around Ollama, OpenAI-compatible endpoints, and when provider prefixes are required. The updated docs make the routing explicit and give copy-pasteable examples for both a standard local Ollama setup and generic OpenAI-compatible endpoints.

Test plan

  • python3 - <<'PY' ... docs sanity checks ... PY
  • uvx --with mkdocs-material --with 'mkdocstrings[python]' mkdocs build --site-dir /tmp/cai-docs-build-428

Note: mkdocs build --strict still fails on existing repository-wide warnings unrelated to this change; non-strict build succeeds.

Fixes #428

