Response_format not working for non-OpenAI models when using OpenAI SDK #16

@aarondr77

Description


Context:

We use Abacus as a router to access multiple model providers (OpenAI, Grok, Anthropic). We need structured JSON responses, so we send response_format: {"type": "json"} per your API docs.

  • I am following the example code from your docs:

    from openai import OpenAI

    client = OpenAI(
        base_url="https://routellm.abacus.ai/v1",
        api_key=<api_key>,
    )

  • I'm sending messages with response_format: {"type": "json"} in the chat completions request
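To make the setup above concrete, here is a minimal sketch of how we assemble the request. The build_request helper is our own illustration (not part of the OpenAI SDK), and the model name and prompt are just examples; the key point is that response_format: {"type": "json"} is included exactly as the docs describe.

```python
def build_request(model: str, messages: list) -> dict:
    """Assemble chat-completion kwargs with JSON output requested."""
    return {
        "model": model,
        "messages": messages,
        # Structured-output flag, per the Abacus API docs:
        "response_format": {"type": "json"},
    }

# Usage (the actual call requires a valid api_key and network access):
# resp = client.chat.completions.create(
#     **build_request("gpt-4.1", [{"role": "user", "content": "Return JSON"}])
# )
```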

What's working:

  • OpenAI models through Abacus (e.g., gpt-4.1, gpt-5) return valid JSON when we set response_format: {"type": "json"}.

What's not working:

  • Grok, Qwen, Anthropic models through Abacus return malformed JSON.
  • Most failures are caused by extra text before the JSON object, or other invalid JSON, which breaks JSON.parse().
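As a stopgap for the "extra text around the JSON" failure mode, we parse defensively by slicing out the span between the first and last brace before parsing. This is our own hypothetical helper (extract_json is not part of any SDK), and it only handles surrounding prose; it cannot repair JSON that is invalid inside the braces.

```python
import json

def extract_json(text: str) -> dict:
    """Best-effort: parse the {...} span embedded in a model reply.

    Strips prose before/after the JSON object. Raises ValueError if
    no braces are present, and json.JSONDecodeError if the span
    between them is not valid JSON.
    """
    start = text.index("{")       # first opening brace
    end = text.rindex("}") + 1    # last closing brace
    return json.loads(text[start:end])
```

This recovers the common case we see from the non-OpenAI providers, but it is a workaround, not a fix, which is why provider-side enforcement matters.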

Hypothesis:

Abacus may be ignoring the response_format parameter for non-OpenAI providers, or those providers need different handling.

Questions:

  • Is response_format supported for all model providers through Abacus, or only OpenAI?
  • Are there provider-specific requirements for enforcing JSON responses?
  • Are there workarounds I can use for the other model providers?

Thanks for any info you can provide :)
