Summary
The Braintrust Rust SDK has no wrapper for async-openai, the dominant Rust client for the OpenAI API. Every other Braintrust SDK provides an OpenAI client wrapper (wrapOpenAI in TypeScript, wrap_openai in Python, traceopenai.NewMiddleware in Go, BraintrustOpenAI.wrapOpenAI in Java, BraintrustOpenAI.WrapOpenAI in .NET, Braintrust.instrument!(:openai) in Ruby), but the Rust SDK has no equivalent.
What is missing
A wrap_openai-style function that takes an async-openai Client and returns a traced version that automatically creates spans for:
- Chat completions — client.chat().create() and client.chat().create_stream()
- Responses API — client.responses().create() and its streaming variant
- Embeddings — client.embeddings().create()
The wrapper should capture inputs, outputs, model name, token usage, latency, and tool calls in Braintrust spans, matching the behavior of the OpenAI wrappers in other Braintrust SDKs.
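A minimal sketch of the wrapping pattern described above, using only hypothetical stand-in types (Span, ChatResponse, ChatClient, TracedClient are illustrations, not the real async-openai or Braintrust SDK APIs, and the call is synchronous here for brevity where the real wrapper would await client.chat().create()):

```rust
use std::time::Instant;

// Hypothetical span record; the SDK's SpanBuilder/SpanHandle
// (src/span.rs) would fill a similar role in a real integration.
#[derive(Debug, Default)]
struct Span {
    input: String,
    output: String,
    model: String,
    prompt_tokens: u32,
    completion_tokens: u32,
    latency_ms: u128,
}

// Stand-in for an async-openai chat response (illustrative only).
struct ChatResponse {
    content: String,
    prompt_tokens: u32,
    completion_tokens: u32,
}

// Stand-in for the underlying client call.
trait ChatClient {
    fn create(&self, model: &str, prompt: &str) -> ChatResponse;
}

// The wrap_openai-style idea: delegate the call, then record input,
// output, model, token usage, and latency into a span.
struct TracedClient<C: ChatClient> {
    inner: C,
}

impl<C: ChatClient> TracedClient<C> {
    fn create(&self, model: &str, prompt: &str) -> (ChatResponse, Span) {
        let start = Instant::now();
        let resp = self.inner.create(model, prompt);
        let span = Span {
            input: prompt.to_string(),
            output: resp.content.clone(),
            model: model.to_string(),
            prompt_tokens: resp.prompt_tokens,
            completion_tokens: resp.completion_tokens,
            latency_ms: start.elapsed().as_millis(),
        };
        (resp, span)
    }
}
```

The real integration would additionally handle tool calls and route the span through the SDK's exporter, but the delegate-then-record shape is the same one the other Braintrust SDKs use.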
The SDK already has building blocks that could support this:
- extract_openai_usage() in src/extractors.rs — parses OpenAI usage fields
- wrap_stream_with_span() in src/stream.rs — attaches spans to async streams
- SpanBuilder / SpanHandle in src/span.rs — manual span creation
But there is no integration that wires these together around async-openai client calls.
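For the streaming half of the gap, the shape of the missing glue can be sketched with stdlib types. This is an illustration only: StreamSpan, SpannedStream, and wrap_with_span are hypothetical names, and a synchronous Iterator stands in for the async Stream that the SDK's wrap_stream_with_span() actually operates on. The key behavior is recording each chunk and finalizing the span exactly when the stream is exhausted:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Hypothetical span for a streamed response: accumulates chunks and
// is marked finished when the stream ends.
#[derive(Debug, Default)]
struct StreamSpan {
    chunks: Vec<String>,
    finished: bool,
}

// Wraps an iterator of chunks so each item is recorded on the span.
struct SpannedStream<I: Iterator<Item = String>> {
    inner: I,
    span: Rc<RefCell<StreamSpan>>,
}

impl<I: Iterator<Item = String>> Iterator for SpannedStream<I> {
    type Item = String;
    fn next(&mut self) -> Option<String> {
        match self.inner.next() {
            Some(chunk) => {
                self.span.borrow_mut().chunks.push(chunk.clone());
                Some(chunk)
            }
            None => {
                // Underlying stream is exhausted: finalize the span.
                self.span.borrow_mut().finished = true;
                None
            }
        }
    }
}

// Returns the wrapped stream plus a handle to the span it populates.
fn wrap_with_span<I: Iterator<Item = String>>(
    inner: I,
) -> (SpannedStream<I>, Rc<RefCell<StreamSpan>>) {
    let span = Rc::new(RefCell::new(StreamSpan::default()));
    (SpannedStream { inner, span: Rc::clone(&span) }, span)
}
```

An async-openai integration would apply the same pattern to the ChatCompletionResponseStream returned by create_stream(), merging the deltas into the span's output before closing it.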
Library significance
- 4.3M+ total downloads on crates.io, ~2M recent downloads
- 106 published versions, latest 0.35.0 (April 2026)
- By far the most adopted Rust OpenAI client
- Covers: chat completions, responses API, embeddings, audio, images, assistants, fine-tuning, streaming, and more
- Official docs: https://docs.rs/async-openai
- GitHub: https://github.com/64bit/async-openai
- Crate: https://crates.io/crates/async-openai
Braintrust docs status
supported (in other languages) — Braintrust documents OpenAI client wrapping for TypeScript, Python, Ruby, Go, Java, and .NET on the Trace LLM calls page and the OpenAI integration page. Rust is not listed as a supported language for OpenAI wrapping.
Upstream sources
Local files inspected
- src/extractors.rs — extract_openai_usage() parses OpenAI usage fields but is not wired to any client wrapper
- src/stream.rs — wrap_stream_with_span() exists but is a generic stream helper, not an async-openai integration
- src/span.rs — span creation infrastructure exists
- src/lib.rs — public API exports; no async-openai references
- Cargo.toml — no async-openai dependency
- Full codebase search for "async-openai", "async_openai", "openai::Client" — zero results