Take control of your AI traffic.
A lightweight local proxy that sits between your AI tools and the Anthropic API. See every request in real time. Route models to any provider. One line to set up, zero code changes.
```sh
brew install panbanda/croxy/croxy
```

Pre-built binaries are also available on the releases page.
```sh
croxy init                # create ~/.config/croxy/config.toml
croxy start               # start in background
eval "$(croxy shellenv)"  # point AI tools at croxy
```

Add to your shell profile for automatic setup:

```sh
eval "$(croxy shellenv)"
```

That's it. Claude Code, Cursor, and any Anthropic-compatible client will now route through croxy automatically.
- Live dashboard -- requests per minute, token throughput, response time percentiles (p50/p95/p99), per-model breakdowns, status code distribution, and error tracking, all updating in real time
- Model routing -- regex patterns and AI-based auto-routing send requests to different providers (Anthropic, Ollama, vllm-mlx, anything Anthropic-compatible) based on model name or conversation content
- Zero integration -- one `eval` in your shell profile, no SDK changes, no per-project config
- Foreground or background -- run with a TUI dashboard, detach to background, or reattach to a running instance
`croxy init` creates a starter config at `~/.config/croxy/config.toml` with Anthropic and Ollama pre-configured. Edit it to add providers and routing rules.
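As a rough sketch of what a provider plus a regex routing rule might look like (the key names here are hypothetical, not croxy's real schema; the generated file and the configuration guide are authoritative):

```toml
# Hypothetical shape for illustration only -- run `croxy init` and read
# the generated config for the actual key names.

[providers.ollama]
base_url = "http://localhost:11434"   # any Anthropic-compatible endpoint

[[routes]]
model = "^llama.*"     # regex matched against the requested model name
provider = "ollama"    # matching requests are sent to this provider
```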
See the configuration guide for the full reference, including provider setup for Ollama, vllm-mlx, and mixed-provider routing. For auto-routing with AI classification, see the routing guide.
| Command | Description |
| --- | --- |
| `croxy` | Run in foreground with TUI dashboard |
| `croxy start` | Start in background |
| `croxy stop` | Stop background instance |
| `croxy init` | Create default config file |
| `croxy shellenv` | Print `ANTHROPIC_BASE_URL` export if running |
| `croxy config get\|set` | Read or modify config values |
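For example, `croxy config get|set` can read or change a value without opening the file (the dotted key paths below are made up for illustration; use the keys from your own config):

```sh
# Key paths are illustrative, not the real schema:
croxy config get providers.ollama.base_url
croxy config set providers.ollama.base_url "http://localhost:11434"
```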
MIT - see LICENSE for details.
