
feat: implement SSE streaming and support all opencode providers#30

Merged
KochC merged 3 commits into main from dev on Mar 27, 2026


Conversation

Owner

@KochC KochC commented Mar 27, 2026

Summary

  • Issue #11 (streaming not implemented): Implemented SSE streaming for both POST /v1/chat/completions and POST /v1/responses
    • Uses client.event.subscribe() to receive message.part.updated delta events in real time
    • Terminates on session.idle for the active session
    • Chat completions: emits chat.completion.chunk SSE events + [DONE]
    • Responses API: emits full lifecycle events (response.created, response.output_text.delta, response.completed)
    • Stream errors propagate gracefully via SSE before [DONE]
  • Issue #12 (llm support for all opencode providers): Updated the system prompt hint to be provider-agnostic (removed "OpenAI-compatible" wording); the model resolver already supports all opencode providers
  • Added TextEncoder and ReadableStream to ESLint globals
  • Added 3 streaming integration tests (happy path, unknown model → 502 before SSE, session.error propagation)

Closes #11, Closes #12
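The chat-completions streaming path described above can be sketched roughly as follows. `sseChunk` and `streamDeltas` are hypothetical helper names for illustration only, not functions from this PR; the real implementation drives the stream from opencode's `message.part.updated` delta events and finishes on `session.idle`, rather than iterating a fixed array.

```javascript
// Rough sketch only: format one OpenAI-style chat.completion.chunk SSE frame.
// `sseChunk` is a hypothetical helper, not code from this PR.
function sseChunk(delta, model, done = false) {
  if (done) return "data: [DONE]\n\n";
  const payload = {
    id: "chatcmpl-demo", // placeholder id for illustration
    object: "chat.completion.chunk",
    model,
    choices: [{ index: 0, delta: { content: delta }, finish_reason: null }],
  };
  return `data: ${JSON.stringify(payload)}\n\n`;
}

// Pipe a sequence of text deltas out as a web ReadableStream of SSE bytes,
// ending with the [DONE] sentinel (a stand-in here for the session.idle signal).
function streamDeltas(deltas, model) {
  const encoder = new TextEncoder();
  return new ReadableStream({
    start(controller) {
      for (const d of deltas) {
        controller.enqueue(encoder.encode(sseChunk(d, model)));
      }
      controller.enqueue(encoder.encode(sseChunk("", model, true)));
      controller.close();
    },
  });
}
```

Returning a `ReadableStream` of encoded bytes is why `TextEncoder` and `ReadableStream` had to be added to the ESLint globals: both are web globals in modern Node, not imports.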

KochC added 3 commits on March 27, 2026 at 15:44
- Add 17 new integration tests: CORS edge cases (disallowed origins,
  no-origin header, OPTIONS for disallowed origin), auth (401/pass-through),
  and error handling (400/502/404) for /v1/chat/completions
  Closes #14, closes #16
- Add ESLint with flat config, npm run lint script, and Lint job in CI
  Closes #15
- Improve README with quickstart section, npm install instructions, and
  corrected package name; add type column to env vars table
  Closes #17
- Implement streaming for POST /v1/chat/completions (issue #11):
  subscribe to opencode event stream, pipe message.part.updated deltas
  as SSE chat.completion.chunk events, finish on session.idle
- Implement streaming for POST /v1/responses (issue #11):
  emit response.created / output_text.delta / response.completed events
- Fix provider-agnostic system prompt hint (issue #12): remove
  'OpenAI-compatible' wording so non-OpenAI models are not confused
- Add TextEncoder and ReadableStream to ESLint globals
- Add streaming integration tests (happy path, unknown model, session.error)
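The Responses-API lifecycle from the streaming commit can be illustrated with a minimal event sequence. `responsesLifecycle` is an invented name and the payloads are trimmed to a couple of fields (`response_id` is an assumed field for illustration); only the event order matches the PR description: `response.created`, then `response.output_text.delta` per chunk, then `response.completed`.

```javascript
// Minimal illustration (not the PR's code) of the SSE event order the
// /v1/responses endpoint emits while streaming.
function responsesLifecycle(textParts, responseId = "resp-demo") {
  const frames = [];
  const push = (type, extra = {}) =>
    frames.push(
      `event: ${type}\ndata: ${JSON.stringify({ type, response_id: responseId, ...extra })}\n\n`
    );
  push("response.created");
  for (const delta of textParts) {
    push("response.output_text.delta", { delta });
  }
  push("response.completed");
  return frames;
}
```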
@KochC KochC merged commit d9f2662 into main Mar 27, 2026
6 checks passed


Development

Successfully merging this pull request may close these issues.

llm support for all opencode providers (#12)
streaming not implemented (#11)

1 participant