feat: support multiple LLM providers (OpenAI, Anthropic, Ollama)#5

Open
Aditya8840 wants to merge 2 commits into main from feat/issue-1
Conversation

@Aditya8840
Owner

Summary

Closes #1

  • Introduces an LLMProvider abstraction that decouples the agent loop from any specific LLM SDK
  • Implements three providers: OpenAI, Anthropic (Claude), and Ollama (local models)
  • Each provider manages its own message format, tool schema conversion, and API interaction behind a unified interface
  • Adds --provider CLI flag and DROIDPILOT_PROVIDER env var for provider selection
  • Anthropic SDK is an optional dependency: pip install droidpilot[anthropic]

Usage examples

# OpenAI (default, backward compatible)
droidpilot "Open Settings"

# Anthropic Claude
droidpilot "Open Settings" --provider anthropic --model claude-sonnet-4-20250514

# Local Ollama
droidpilot "Open Settings" --provider ollama --model llama3

Architecture

| Component | Purpose |
| --- | --- |
| `providers/base.py` | `LLMProvider` ABC defining the interface |
| `providers/openai_provider.py` | OpenAI implementation |
| `providers/anthropic_provider.py` | Anthropic implementation with tool format conversion |
| `providers/ollama_provider.py` | Ollama via OpenAI-compatible API (inherits the OpenAI provider) |
| `providers/__init__.py` | Registry, default models, `create_provider()` factory |

Test plan

  • 20 unit tests covering provider registry, tool format conversion, and all three provider implementations
  • black . formatting passes
  • mypy . passes (only pre-existing ui_tree.py error remains)
  • Manual test with OpenAI provider
  • Manual test with Anthropic provider
  • Manual test with Ollama provider
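As an illustration of the tool-format conversion those tests cover: OpenAI nests a JSON Schema under `tools[i].function.parameters`, while Anthropic's Messages API expects a flat tool dict whose schema lives under `input_schema`. A conversion helper (the function name is assumed, not taken from the PR) might look like:

```python
# Hypothetical OpenAI-to-Anthropic tool spec converter; the name and
# placement are assumptions, but both wire formats are as documented.
def openai_tool_to_anthropic(tool: dict) -> dict:
    fn = tool["function"]  # OpenAI wraps tools as {"type": "function", ...}
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],  # Anthropic's key for the JSON Schema
    }
```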

🤖 Generated with Claude Code

Aditya8840 and others added 2 commits March 26, 2026 23:54
Introduce a provider abstraction that decouples the agent loop from
any specific LLM SDK. Each provider manages its own message format,
tool schema conversion, and API interaction behind a unified interface.

- Add LLMProvider ABC with add_user_message/get_tool_call/add_tool_result
- Implement OpenAI, Anthropic, and Ollama providers
- Ollama reuses OpenAI provider via inheritance (OpenAI-compatible API)
- Add --provider CLI flag and DROIDPILOT_PROVIDER env var
- Default model per provider (gpt-4o, claude-sonnet-4-20250514, llama3)
- Anthropic SDK as optional dependency: pip install droidpilot[anthropic]
- Add 20 unit tests covering registry, tool conversion, and all providers

Closes #1

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Add GitHub Actions CI workflow (black, mypy, pytest)
- Fix mypy arg-type error in ui_tree.py by asserting bounds is not None
  (already guaranteed by the _is_visible guard above)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Development

Successfully merging this pull request may close these issues.

Support multiple LLM providers (Claude, Gemini, local models)