Context:
We use Abacus as a router to access multiple model providers (OpenAI, Grok, Anthropic). We need structured JSON responses, so we send response_format: {"type": "json"} per your API docs.
- I am following the example code from your docs here:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://routellm.abacus.ai/v1",
    api_key=<api_key>,
)
```

- I'm sending messages with response_format: {"type": "json"} in the chat completions request.
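For clarity, here is the shape of the full request we send, written as the raw payload dict rather than a live call (the model name and prompt are illustrative, not from our production code):

```python
import json

# The chat-completions payload we POST to /v1/chat/completions
# through the OpenAI client (model and prompt are illustrative).
payload = {
    "model": "gpt-4.1",
    "messages": [
        {"role": "user", "content": "Summarize this as JSON with keys 'title' and 'tags'."}
    ],
    "response_format": {"type": "json"},
}
print(json.dumps(payload, indent=2))
```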
What's working
- OpenAI models through Abacus (e.g., gpt-4.1, gpt-5) return valid JSON when we set response_format: {"type": "json"}.
What's not working:
- Grok, Qwen, and Anthropic models through Abacus return malformed JSON.
- Most failures come from extra text before the JSON object (e.g., prose or markdown code fences) or other invalid JSON, which breaks JSON.parse().
Hypothesis:
Abacus may be ignoring the response_format parameter for non-OpenAI providers, or those providers need different handling.
Questions:
- Is response_format supported for all model providers through Abacus, or only OpenAI?
- Are there provider-specific requirements for enforcing JSON responses?
- Are there workarounds I can use for the other model providers?
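In the meantime, we're considering a defensive workaround on our side: strip any surrounding text and pull out the first JSON object before parsing. A minimal sketch in Python (extract_json is our own helper, not part of any Abacus or OpenAI API; it does not handle braces inside string values):

```python
import json
import re

def extract_json(text: str):
    """Best-effort: pull the first JSON object out of a model reply
    that may be wrapped in prose or markdown code fences."""
    # Drop markdown fences like ```json ... ```
    text = re.sub(r"```(?:json)?", "", text)
    # Scan for the first balanced {...} span
    start = text.find("{")
    if start == -1:
        raise ValueError("no JSON object found")
    depth = 0
    for i, ch in enumerate(text[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return json.loads(text[start:i + 1])
    raise ValueError("unbalanced JSON object")

# Example: a reply with extra text around the JSON
reply = 'Sure, here is the result:\n```json\n{"ok": true, "n": 2}\n```'
print(extract_json(reply))  # -> {'ok': True, 'n': 2}
```

This masks the symptom rather than fixing the router behavior, so we'd still prefer response_format to be honored end to end.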
Thanks for any info you can provide :)