A lightweight, Dockerized middleware that lets you use the official Claude Code CLI and Opencode with the Qwen3-Coder-Plus backend (via Qwen Portal).
- Model Translation: Seamlessly routes requests to Qwen3-Coder-Plus.
- Credential Integration: Securely mounts your existing local Qwen OAuth credentials (`~/.qwen/oauth_creds.json`) into the container.
- Dockerized: Runs in a lightweight Python container with no dependency pollution on your host machine.
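The credential mount and port mapping described above would typically be wired together in a Compose file along these lines (a sketch only — the service name, build setup, and in-container path are assumptions, not the project's actual file):

```yaml
services:
  qwen-proxy:
    build: .                  # hypothetical: build from the repo's Dockerfile
    ports:
      - "3455:3455"           # expose the LiteLLM proxy port on the host
    volumes:
      # mount the host's Qwen OAuth credentials read-only into the container
      - ~/.qwen/oauth_creds.json:/root/.qwen/oauth_creds.json:ro
```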
The Qwen Code Proxy leverages LiteLLM, an open-source AI gateway that provides a unified proxy interface for 100+ LLMs.
- Python Application Wrapper - Runs `main.py` and manages the LiteLLM proxy with retry logic and graceful shutdown
- LiteLLM Proxy Server - Runs inside a Docker container on port `3455`
- Credential Manager - Securely accesses Qwen OAuth credentials with thread-safe caching
`claude` / `opencode` → Local Proxy (`3455`) → LiteLLM Translation → Qwen Portal API → Response Back to CLI

- Parameter Filtering: Anthropic-specific parameters like `thinking` and `betas` are automatically dropped
- Credential Caching: API keys are cached with file modification monitoring to avoid unnecessary reads
- Retry Mechanism: Automatic retries with configurable attempts and delays
- Graceful Shutdown: Signal handling for clean process termination
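The retry and graceful-shutdown behaviour above can be sketched roughly like this (a simplified illustration; the function and variable names are invented and do not reflect the project's actual `main.py`):

```python
import signal
import time

shutting_down = False

def _handle_signal(signum, frame):
    """Mark the supervisor for a clean exit on SIGINT/SIGTERM."""
    global shutting_down
    shutting_down = True

def run_with_retries(start_proxy, max_attempts=3, delay=1.0):
    """Start the proxy, retrying on failure up to max_attempts times."""
    signal.signal(signal.SIGINT, _handle_signal)
    signal.signal(signal.SIGTERM, _handle_signal)
    for attempt in range(1, max_attempts + 1):
        if shutting_down:
            return False          # a signal arrived; stop retrying
        try:
            start_proxy()         # blocks while the proxy process runs
            return True
        except Exception as exc:
            print(f"attempt {attempt} failed: {exc}")
            if attempt < max_attempts:
                time.sleep(delay)
    return False
```

A supervisor loop like this keeps transient startup failures from killing the container while still honouring termination signals.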
- Docker & Docker Compose: Installed and running.
- Claude Code CLI: Installed on your host machine (Claude Console Auth).
- Opencode CLI: Installed on your host machine (Zen Auth).
```bash
# claude-code CLI installation
npm install -g @anthropic-ai/claude-code

# opencode installation
npm install -g opencode-ai
```

- Qwen Credentials: You must be logged into the Qwen CLI on your machine.
- Clone the repository:

  ```bash
  git clone <repository-url>
  cd qwen-code-proxy
  ```

- Start the Proxy: Run the Docker container in the background. This will build the image and start the LiteLLM proxy on port `3455`:

  ```bash
  docker compose up -d
  ```

- Verify Status: Ensure the container is running and listening:

  ```bash
  docker compose logs -f
  ```

  You should see `🚀 Starting Qwen Proxy on http://0.0.0.0:3455`.
To use the proxy, configure the Claude CLI to point to localhost instead of Anthropic's servers.
Add the following to your Claude CLI configuration file (`~/.claude/settings.json`):

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://127.0.0.1:3455"
  }
}
```

When you run `claude`, it will ask you to log in. Use console auth.
Add the following to your Opencode CLI configuration file (`~/.config/opencode/opencode.json`):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "litellm": {
      "name": "litellm",
      "options": {
        "baseURL": "http://127.0.0.1:3455"
      },
      "models": {
        "openai/qwen3-coder-plus": {
          "name": "openai/qwen3-coder-plus"
        }
      }
    }
  }
}
```

When you run `opencode`, it will ask you to log in. Use Zen auth.
After login, run `/connect`, select `litellm`, and enter a dummy API key (e.g. `sk-xxx`) to connect.
- Feature Parity: Some Anthropic-specific features may not be fully supported by Qwen
- Rate Limits: Subject to Qwen Portal's rate limits and usage policies
- Offline Access: Requires internet connectivity to reach Qwen Portal API
For development, you can run the proxy directly without Docker:
```bash
# Install dependencies with uv
uv sync

# Run the proxy directly with Python
uv run python main.py

# Run with custom configuration via environment variables
QWEN_LOG_LEVEL=DEBUG uv run python main.py
```

This approach is useful for debugging and development, bypassing the Docker container for faster iteration cycles.
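Reading configuration such as `QWEN_LOG_LEVEL` from the environment typically looks something like this (a sketch; any variable or function names beyond `QWEN_LOG_LEVEL` are assumptions, not the project's actual code):

```python
import logging
import os

def configure_logging():
    """Set the log level from QWEN_LOG_LEVEL, defaulting to INFO."""
    level_name = os.environ.get("QWEN_LOG_LEVEL", "INFO").upper()
    # fall back to INFO if the value is not a recognised level name
    level = getattr(logging, level_name, logging.INFO)
    logging.basicConfig(
        level=level,
        format="%(asctime)s %(levelname)s %(message)s",
    )
    return level
```

Because the value is read at startup, changing `QWEN_LOG_LEVEL` takes effect on the next run of `main.py`.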
- Q: How do I view the server logs?
  A: Run `docker compose logs -f`. This is useful for debugging connection issues or verifying that requests are hitting the Qwen API.

- Q: How do I update the project?
  A: Pull the latest changes (if any), then rebuild the container:

  ```bash
  docker compose up -d --build
  ```

- Q: I'm getting API errors or authentication issues. How can I refresh my token?
  A: If you encounter API errors, refresh your token by navigating to the project folder and running:

  ```bash
  cd qwen-code-proxy   # Navigate to project directory
  qwen "Hello" && docker compose restart
  ```

  This restarts the proxy container and refreshes the token from your credentials file. The proxy uses thread-safe caching with file modification monitoring, so it automatically picks up updated tokens when the credentials file changes.
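The cache-refresh behaviour described in that answer can be sketched like this (illustrative only — the class name and the `access_token` field are assumptions about the credentials file, not the project's actual credential manager):

```python
import json
import os
import threading

class CredentialCache:
    """Cache the API key, re-reading the file only when its mtime changes."""

    def __init__(self, path):
        self.path = path
        self._lock = threading.Lock()   # thread-safe access to the cache
        self._mtime = None
        self._key = None

    def get_key(self):
        with self._lock:
            mtime = os.path.getmtime(self.path)
            if mtime != self._mtime:    # file changed (or first read)
                with open(self.path) as f:
                    creds = json.load(f)
                self._key = creds["access_token"]
                self._mtime = mtime
            return self._key
```

Checking the file's modification time on each lookup is what lets a `qwen "Hello"`-triggered token refresh propagate without restarting the credential manager itself.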
Use only for personal, non-commercial purposes. Do not use for any illegal or unauthorized purpose.
- LiteLLM for providing the API translation and proxy infrastructure
- Anthropic for the excellent Claude Code CLI
- Opencode for their open-source CLI tool
- Alibaba Cloud for the Qwen3-Coder-Plus model and API access