Summary
The Responses API instrumentation in both the `openai` and `ruby-openai` integrations has an incorrect parameter name (`max_tokens` instead of `max_output_tokens`) and is missing several parameters that are part of the stable upstream API. Users who set `max_output_tokens`, `service_tier`, `include`, `text`, or `background` on their Responses API calls get no visibility into these values in their Braintrust span metadata.
What is missing
The `METADATA_FIELDS` constant in both Responses API instrumentations currently includes:
```ruby
METADATA_FIELDS = %i[
  model instructions modalities tools parallel_tool_calls
  tool_choice temperature max_tokens top_p frequency_penalty
  presence_penalty seed user metadata store response_format
  reasoning previous_response_id truncation
].freeze
```
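Based on the upstream parameter list, a corrected constant could look like the following. This is a sketch, not the SDK's actual code; field order is illustrative:

```ruby
# Sketch of a corrected METADATA_FIELDS for the Responses API
# instrumentation: max_tokens is replaced by max_output_tokens, and the
# missing stable parameters are appended.
METADATA_FIELDS = %i[
  model instructions modalities tools parallel_tool_calls
  tool_choice temperature max_output_tokens top_p frequency_penalty
  presence_penalty seed user metadata store response_format
  reasoning previous_response_id truncation
  service_tier include text background
].freeze
```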
Wrong parameter name
| Current field | Correct field | Why it matters |
|---|---|---|
| `max_tokens` | `max_output_tokens` | The Responses API uses `max_output_tokens`, not `max_tokens`. The SDK captures a field that doesn't exist in the Responses API, and misses the one users actually pass. |
Missing parameters
| Parameter | Why it matters |
|---|---|
| `service_tier` | Controls processing priority (`auto`, `default`, `flex`, `priority`). Already captured in the Chat Completions instrumentation, so omitting it here is an inconsistency within the SDK. |
| `include` | Array controlling what additional data is returned (e.g. `web_search_call.action.sources`, `code_interpreter_call.outputs`). Important for understanding what data the user requested. |
| `text` | Text response format configuration for structured outputs; the newer alternative to `response_format` in the Responses API. |
| `background` | Boolean for async/background response execution. Changes how the response is processed; users need to see this in traces to understand response behavior. |
Braintrust docs status
`not_found`: The Braintrust docs at https://www.braintrust.dev/docs/instrument/wrap-providers do not mention `max_output_tokens`, `service_tier`, `include`, `text`, or `background` for the Responses API.
Upstream sources
- `OpenAI::Models::Responses::ResponseCreateParams` defines `max_output_tokens` (not `max_tokens`), `service_tier`, `include`, `text`, `background`, and other parameters in `lib/openai/models/responses/response_create_params.rb`
Local repo files inspected
- `lib/braintrust/contrib/openai/instrumentation/responses.rb` (lines 26–31): `METADATA_FIELDS` uses `max_tokens` (wrong) and is missing `max_output_tokens`, `service_tier`, `include`, `text`, `background`
- `lib/braintrust/contrib/ruby_openai/instrumentation/responses.rb` (lines 32–37): identical `METADATA_FIELDS`, same issues
- `lib/braintrust/contrib/openai/instrumentation/chat.rb` (lines 27–32): Chat Completions already captures `service_tier`, showing this is an oversight in the Responses instrumentation
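Drift between the two instrumentations' field lists for shared parameters like `service_tier` is easy to check mechanically. The field lists below are abbreviated stand-ins for the real constants in `chat.rb` and `responses.rb`:

```ruby
# Hypothetical stand-ins for the two instrumentations' METADATA_FIELDS.
chat_fields      = %i[model temperature service_tier max_tokens]
responses_fields = %i[model temperature max_tokens]

# Parameters captured for Chat Completions but not for Responses.
missing_from_responses = chat_fields - responses_fields
# => [:service_tier]
```

Note that this catches only shared parameters; an API-specific rename like `max_tokens` vs. `max_output_tokens` still has to be verified against the upstream parameter definitions.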