Summary
The Braintrust Rust SDK has no instrumentation support for genai, a multi-provider AI client crate for Rust that abstracts chat completions, streaming, and embeddings across 16+ providers. No other Braintrust SDK (Python, JS, Go, Ruby, Java, .NET) instruments this library either, as it is Rust-specific.
What is missing
The genai crate provides execution-oriented API surfaces that could be instrumented to automatically capture spans:
- Client::exec_chat(model, chat_request, options) — chat completion across all supported providers
- Client::exec_chat_stream(model, chat_request, options) — streaming chat completion with chunked responses
- embed module — embedding execution across providers
- ChatResponse — contains model output, usage metadata, and provider-specific fields
Wrapping these methods would enable automatic tracing of genai-based Rust applications, capturing inputs, outputs, token usage, latency, and streaming time-to-first-token (TTFT) for any provider genai supports.
Library significance
- 732 GitHub stars, 774 commits on main branch
- 164K+ total downloads on crates.io (~57K recent)
- 16 supported providers: OpenAI, Anthropic, Gemini, xAI, Ollama, Groq, DeepSeek, Cohere, Together, Fireworks, Nebius, and more
- Latest stable version: 0.5.3 (actively maintained, with 0.6.0 beta series in progress)
- API docs at https://docs.rs/genai (84% documentation coverage)
Braintrust docs status
not_found — No mention of the genai Rust crate in Braintrust documentation. It is not listed on the SDK integrations page, the AI providers page, or the trace LLM calls page. Rust is not listed as a supported language for LLM call tracing.
Upstream sources
Local files inspected
- src/extractors.rs — only OpenAI and Anthropic usage extractors; no genai-related code
- src/stream.rs — stream aggregator for OpenAI Chat Completions format only
- src/lib.rs — public API exports; no genai references
- Cargo.toml — no genai dependency
- Full codebase search for "genai", "rust-genai", "jeremychone", "exec_chat" — zero results