
feat: add OpenRouter proxy for Cursor CLI agent #3087

Open

AhmedTMM wants to merge 1 commit into main from feat/cursor-openrouter-proxy

Conversation

@AhmedTMM
Collaborator

Summary

Adds a local ConnectRPC-to-REST translation proxy that enables Cursor CLI to route LLM traffic through OpenRouter, bypassing Cursor's proprietary protocol lock-in.

Architecture

Cursor CLI → Caddy (HTTPS/H2) → split routing:
  /agent.v1.AgentService/* → H2C Node.js (BiDi streaming → OpenRouter)
  everything else → HTTP/1.1 Node.js (fake auth, models, config)
+ /etc/hosts spoofs api2.cursor.sh → localhost
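A minimal Caddyfile matching this split might look like the following sketch. The backend ports and the use of Caddy's internal CA are assumptions for illustration, not values taken from this PR:

```caddyfile
api2.cursor.sh {
	# Caddy terminates TLS + HTTP/2; the internal CA root must be in the
	# system trust store for Cursor CLI to accept the certificate.
	tls internal

	# BiDi streaming RPCs go to the H2C (cleartext HTTP/2) backend.
	@agent path /agent.v1.AgentService/*
	reverse_proxy @agent h2c://127.0.0.1:8081

	# Everything else (auth, models, config) hits the HTTP/1.1 backend.
	reverse_proxy 127.0.0.1:8082
}
```

Combined with an `/etc/hosts` entry mapping `api2.cursor.sh` to `127.0.0.1`, the CLI's hardcoded hostname resolves to this local listener.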

How it works

  1. Proxy deploys during configure() — installs Caddy, uploads Node.js proxy scripts, configures hosts spoofing
  2. Proxy starts during preLaunch() — starts Caddy + two Node.js backends (systemd or setsid fallback)
  3. Cursor CLI connects — thinks it's talking to api2.cursor.sh, actually hits our proxy
  4. Proxy translates — decodes ConnectRPC protobuf → calls OpenRouter /v1/chat/completions (streaming) → encodes response as AgentServerMessage protobuf frames
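The encoding half of step 4 can be sketched in TypeScript. The Connect streaming envelope (1 flag byte + 4-byte big-endian length) is the documented wire format; the protobuf field numbers used here for the nested `TextDeltaUpdate` path are illustrative placeholders, not the reverse-engineered ones:

```typescript
// Protobuf varint encoding (enough for small lengths and tags).
function varint(n: number): number[] {
  const out: number[] = [];
  while (n > 0x7f) { out.push((n & 0x7f) | 0x80); n >>>= 7; }
  out.push(n);
  return out;
}

// Length-delimited field: tag = (fieldNo << 3) | wire type 2, then length, then payload.
function lenField(fieldNo: number, payload: Uint8Array): Uint8Array {
  return Uint8Array.from([...varint((fieldNo << 3) | 2), ...varint(payload.length), ...payload]);
}

// AgentServerMessage.InteractionUpdate.TextDeltaUpdate.text as nested
// length-delimited messages. Field numbers are assumptions for this sketch.
function encodeTextDelta(text: string, fields = { interaction: 2, delta: 1, text: 1 }): Uint8Array {
  const textMsg = lenField(fields.text, new TextEncoder().encode(text));
  const deltaMsg = lenField(fields.delta, textMsg);
  return lenField(fields.interaction, deltaMsg);
}

// Connect streaming envelope: flag byte (0x02 = end-of-stream) + u32 BE length + message.
function envelope(msg: Uint8Array, endStream = false): Uint8Array {
  const frame = new Uint8Array(5 + msg.length);
  frame[0] = endStream ? 0x02 : 0x00;
  new DataView(frame.buffer).setUint32(1, msg.length, false);
  frame.set(msg, 5);
  return frame;
}
```

Each OpenRouter SSE delta would be wrapped this way and written to the open HTTP/2 response stream.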

Proto schemas (reverse-engineered from CLI binary)

  • AgentServerMessage.InteractionUpdate.TextDeltaUpdate.text for streaming text
  • agent.v1.ModelDetails (model_id=1, display_model_id=3, display_name=4) for model list
  • TurnEndedUpdate (field 14) for turn completion
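With the `ModelDetails` field numbers listed above, a single model entry can be hand-encoded without a protobuf library. This sketch assumes all three fields are strings (wire type 2) and short enough that tags and lengths fit in one byte:

```typescript
// Sketch only: field numbers come from the reverse-engineered schema above;
// the string-typed assumption and single-byte lengths are simplifications.
function encodeModelDetails(modelId: string, displayModelId: string, displayName: string): Uint8Array {
  const enc = new TextEncoder();
  // Length-delimited field: tag = (fieldNo << 3) | 2, then length, then UTF-8 bytes.
  const field = (no: number, s: string): number[] => {
    const b = enc.encode(s);
    return [(no << 3) | 2, b.length, ...b];
  };
  return Uint8Array.from([
    ...field(1, modelId),        // model_id = 1
    ...field(3, displayModelId), // display_model_id = 3
    ...field(4, displayName),    // display_name = 4
  ]);
}
```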

New files

  • packages/cli/src/shared/cursor-proxy.ts — proxy scripts + deployment functions
  • packages/cli/src/__tests__/cursor-proxy.test.ts — 17 tests (protobuf, framing, deployment)

Verified

  • Lint: 0 errors
  • Tests: 1963 pass (1 pre-existing macOS failure)
  • E2E on Sprite VM: Cursor CLI printed proxy response with EXIT=0

Test plan

  • spawn cursor sprite → verify proxy starts, Cursor CLI launches
  • Send a message → verify response streams through OpenRouter
  • spawn cursor hetzner → verify works on non-Sprite cloud

🤖 Generated with Claude Code

Cursor CLI uses a proprietary ConnectRPC/protobuf protocol with BiDi
streaming over HTTP/2. It validates API keys against Cursor's own servers
and hardcodes api2.cursor.sh for agent streaming — making direct
OpenRouter integration impossible.

This adds a local translation proxy that intercepts Cursor's protocol
and routes LLM traffic through OpenRouter:

Architecture:
  Cursor CLI → Caddy (HTTPS/H2, port 443) → split routing:
    /agent.v1.AgentService/* → H2C Node.js (BiDi streaming → OpenRouter)
    everything else → HTTP/1.1 Node.js (fake auth, models, config)

Key components:
- cursor-proxy.ts: proxy scripts + deployment functions
- Caddy reverse proxy for TLS + HTTP/2 termination
- /etc/hosts spoofing to intercept api2.cursor.sh
- Hand-rolled protobuf codec for AgentServerMessage format
- SSE stream translation (OpenRouter → ConnectRPC protobuf frames)
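The SSE side of that translation can be sketched as below; the chunk shape is OpenRouter's OpenAI-compatible one, and the function name is illustrative:

```typescript
// Sketch: pull text deltas out of an OpenRouter SSE chunk so they can be
// re-encoded as AgentServerMessage frames. Assumes OpenAI-compatible
// { choices: [{ delta: { content } }] } events, terminated by "data: [DONE]".
function* sseDeltas(chunk: string): Generator<string> {
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") return;
    const content = JSON.parse(payload)?.choices?.[0]?.delta?.content;
    if (typeof content === "string") yield content;
  }
}
```

A real proxy additionally has to buffer across network reads, since an SSE event can be split mid-line between chunks.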

Proto schemas reverse-engineered from Cursor CLI binary v2026.03.25:
- AgentServerMessage.InteractionUpdate.TextDeltaUpdate.text
- agent.v1.ModelDetails (model_id, display_model_id, display_name)
- TurnEndedUpdate (input_tokens, output_tokens)

Tested end-to-end on Sprite VM: Cursor CLI printed proxy response with
EXIT=0.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
AhmedTMM marked this pull request as ready for review March 28, 2026 17:47