
feat: add Anthropic Messages API and Google Gemini API endpoints #40

Merged
KochC merged 7 commits into main from dev
Mar 27, 2026
Conversation


@KochC KochC commented Mar 27, 2026

Summary

  • Adds POST /v1/messages — Anthropic Messages API (non-streaming + SSE streaming with message_start, content_block_delta, message_stop events)
  • Adds POST /v1beta/models/:model:generateContent — Google Gemini API (non-streaming)
  • Adds POST /v1beta/models/:model:streamGenerateContent — Google Gemini streaming (newline-delimited JSON)
  • New pure helpers: normalizeAnthropicMessages, normalizeGeminiContents, extractGeminiSystemInstruction, mapFinishReasonToAnthropic, mapFinishReasonToGemini
  • 35 new tests — total 77 → 112, all passing
  • Updated README with usage examples for all three new API styles

Closes #38, #39
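A rough sketch of what `normalizeAnthropicMessages` might do (the names and shapes here are assumptions based on the public Anthropic content-block format, not the repo's actual code):

```javascript
// Hypothetical sketch: flatten Anthropic-style messages, whose content may be
// a plain string or an array of typed content blocks, into simple role/text
// pairs. Illustrates the helper's likely job; not the real implementation.
function normalizeAnthropicMessages(messages) {
  return messages.map((msg) => {
    const text =
      typeof msg.content === "string"
        ? msg.content
        : msg.content
            .filter((block) => block.type === "text")
            .map((block) => block.text)
            .join("\n");
    return { role: msg.role, text };
  });
}

const parts = normalizeAnthropicMessages([
  { role: "user", content: "Hello" },
  { role: "assistant", content: [{ type: "text", text: "Hi there" }] },
]);
// parts: [{ role: "user", text: "Hello" }, { role: "assistant", text: "Hi there" }]
```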

KochC added 7 commits March 27, 2026 15:44
- Add 17 new integration tests: CORS edge cases (disallowed origins,
  no-origin header, OPTIONS for disallowed origin), auth (401/pass-through),
  and error handling (400/502/404) for /v1/chat/completions
  Closes #14, closes #16
- Add ESLint with flat config, npm run lint script, and Lint job in CI
  Closes #15
- Improve README with quickstart section, npm install instructions, and
  corrected package name; add type column to env vars table
  Closes #17
- Implement streaming for POST /v1/chat/completions (issue #11):
  subscribe to opencode event stream, pipe message.part.updated deltas
  as SSE chat.completion.chunk events, finish on session.idle
- Implement streaming for POST /v1/responses (issue #11):
  emit response.created / output_text.delta / response.completed events
- Fix provider-agnostic system prompt hint (issue #12): remove
  'OpenAI-compatible' wording so non-OpenAI models are not confused
- Add TextEncoder and ReadableStream to ESLint globals
- Add streaming integration tests (happy path, unknown model, session.error)
- Extract createSseQueue() helper, eliminating duplicated SSE queue pattern
  in /v1/chat/completions and /v1/responses streaming branches (closes #34)
- Add tests for GET /v1/models happy path, empty providers, and error path (closes #33)
- Add tests for POST /v1/responses: happy path, validation, streaming, session.error (closes #32)
- Fix package.json description to be provider-agnostic (closes #35)
- Add engines field declaring bun >=1.0.0 requirement (closes #35)
- Line coverage: 55% → 89%, function coverage: 83% → 94%
- POST /v1/messages — Anthropic Messages API with streaming (SSE)
- POST /v1beta/models/:model:generateContent — Gemini non-streaming
- POST /v1beta/models/:model:streamGenerateContent — Gemini NDJSON streaming
- New helpers: normalizeAnthropicMessages, normalizeGeminiContents,
  extractGeminiSystemInstruction, mapFinishReasonToAnthropic/Gemini
- 35 new tests (77 → 112 total, all passing)
- Update README to document all supported API formats

Closes #38, #39
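The two finish-reason mappers in the commit list could plausibly look like this (a sketch: the internal reason strings such as `"length"` and `"tool_calls"` are assumptions, while the returned values follow the documented Anthropic `stop_reason` and Gemini `finishReason` vocabularies):

```javascript
// Hypothetical sketch of the finish-reason mappers. Input reason strings are
// assumed internal values; outputs use the documented Anthropic stop_reason
// and Gemini finishReason enums.
function mapFinishReasonToAnthropic(reason) {
  switch (reason) {
    case "length":
      return "max_tokens"; // hit the token limit
    case "tool_calls":
      return "tool_use"; // model invoked a tool
    default:
      return "end_turn"; // normal completion
  }
}

function mapFinishReasonToGemini(reason) {
  switch (reason) {
    case "length":
      return "MAX_TOKENS";
    default:
      return "STOP";
  }
}

console.log(mapFinishReasonToAnthropic("length")); // "max_tokens"
console.log(mapFinishReasonToGemini("stop")); // "STOP"
```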
@KochC KochC merged commit c516686 into main Mar 27, 2026
6 checks passed


Successfully merging this pull request may close these issues.

feat: add Anthropic Messages API endpoint (POST /v1/messages)
