
[bot] No AWS Bedrock Converse API usage extraction support #52

@braintrust-bot

Description


Summary

The SDK provides usage extractors for OpenAI (extract_openai_usage) and Anthropic (extract_anthropic_usage), but has no equivalent for the AWS Bedrock Converse API. Bedrock returns usage with camelCase field names (inputTokens, outputTokens, totalTokens) that neither existing extractor can parse. Braintrust documents Bedrock as a supported cloud provider, and the Go SDK already traces Converse, ConverseStream, and InvokeModel calls.

What is missing

AWS Bedrock's Converse API returns a response whose usage object uses camelCase field names:

{
  "output": {
    "message": {
      "role": "assistant",
      "content": [{"text": "Response text"}]
    }
  },
  "stopReason": "end_turn",
  "usage": {
    "inputTokens": 30,
    "outputTokens": 628,
    "totalTokens": 658
  },
  "metrics": {
    "latencyMs": 1275
  }
}

Key differences from OpenAI and Anthropic formats:

  1. Field naming: camelCase (inputTokens, outputTokens, totalTokens) — neither prompt_tokens/completion_tokens (OpenAI) nor input_tokens/output_tokens (Anthropic)
  2. Latency metrics: metrics.latencyMs provides server-side latency — useful for span timing
  3. Cache tokens for Claude models: When using Claude via Bedrock, responses include cacheReadInputTokenCount and cacheWriteInputTokenCount in the usage object
  4. Stop reason at root: stopReason is at the response root level with Bedrock-specific values (end_turn, tool_use, max_tokens, stop_sequence, guardrail_intervened, content_filtered)

Passing a Bedrock Converse response through extract_openai_usage() returns empty metrics because it looks for usage.prompt_tokens / usage.completion_tokens (snake_case), which don't exist in Bedrock's camelCase format. Similarly, extract_anthropic_usage() looks for usage.input_tokens (snake_case), which also doesn't match.
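A minimal sketch of what the missing extractor could look like. The function name, the UsageMetrics field names, and the use of a plain map to stand in for the parsed usage JSON object are all assumptions for illustration; the SDK's actual struct layout may differ.

```rust
use std::collections::HashMap;

/// Hypothetical stand-in for the SDK's UsageMetrics struct (field names assumed).
#[derive(Debug, Default, PartialEq)]
struct UsageMetrics {
    prompt_tokens: Option<u64>,
    completion_tokens: Option<u64>,
    total_tokens: Option<u64>,
    cache_read_tokens: Option<u64>,
    cache_write_tokens: Option<u64>,
}

/// Sketch: map Bedrock's camelCase `usage` fields onto UsageMetrics.
/// `usage` stands in for the parsed JSON usage object (e.g. a serde_json map).
fn extract_bedrock_usage(usage: &HashMap<&str, u64>) -> UsageMetrics {
    UsageMetrics {
        prompt_tokens: usage.get("inputTokens").copied(),
        completion_tokens: usage.get("outputTokens").copied(),
        total_tokens: usage.get("totalTokens").copied(),
        // Reported only for Claude models with prompt caching (per the issue body).
        cache_read_tokens: usage.get("cacheReadInputTokenCount").copied(),
        cache_write_tokens: usage.get("cacheWriteInputTokenCount").copied(),
    }
}

fn main() {
    // The usage object from the example response above.
    let usage = HashMap::from([
        ("inputTokens", 30u64),
        ("outputTokens", 628),
        ("totalTokens", 658),
    ]);
    let m = extract_bedrock_usage(&usage);
    assert_eq!(m.prompt_tokens, Some(30));
    assert_eq!(m.completion_tokens, Some(628));
    assert_eq!(m.total_tokens, Some(658));
    assert_eq!(m.cache_read_tokens, None);
    println!("{:?}", m);
}
```

The cache-token keys fall through to None when absent, so the same extractor covers both Claude and non-Claude Bedrock models.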

Braintrust docs status

Supported — Braintrust's Bedrock integration page documents tracing support: "Converse, ConverseStream, and InvokeModel calls are traced." The Go SDK provides Bedrock Runtime middleware that captures token usage, including cache tokens for Claude models. The Rust SDK has no Bedrock support.

Upstream sources

Relationship to existing issues

Local files inspected

  • src/extractors.rs — only extract_openai_usage() and extract_anthropic_usage() exist; no Bedrock extractor
  • src/types.rs — UsageMetrics struct could represent Bedrock token data if mapped, but no mapping exists
  • src/stream.rs — stream aggregator only parses OpenAI Chat Completions chunk format; no Bedrock ConverseStream support
  • src/lib.rs — public API exports; no Bedrock references
  • Full codebase grep for "bedrock", "inputTokens", "outputTokens", "converse", "latencyMs" — zero results
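On the stream side, ConverseStream delivers usage once, in a trailing metadata event, rather than per-chunk as in OpenAI's Chat Completions format. A sketch of an aggregator under that assumption (the event and struct shapes here are invented for illustration, not the SDK's actual types):

```rust
/// Assumed, simplified event shapes for a ConverseStream response.
#[derive(Debug)]
enum ConverseStreamEvent {
    ContentBlockDelta { text: String },
    MessageStop { stop_reason: String },
    // Final event carrying the usage totals.
    Metadata { input_tokens: u64, output_tokens: u64, total_tokens: u64 },
}

#[derive(Debug, Default)]
struct Aggregated {
    text: String,
    stop_reason: Option<String>,
    usage: Option<(u64, u64, u64)>, // (input, output, total)
}

/// Concatenate text deltas and record usage from the metadata event.
fn aggregate<I: IntoIterator<Item = ConverseStreamEvent>>(events: I) -> Aggregated {
    let mut agg = Aggregated::default();
    for ev in events {
        match ev {
            ConverseStreamEvent::ContentBlockDelta { text } => agg.text.push_str(&text),
            ConverseStreamEvent::MessageStop { stop_reason } => agg.stop_reason = Some(stop_reason),
            ConverseStreamEvent::Metadata { input_tokens, output_tokens, total_tokens } => {
                agg.usage = Some((input_tokens, output_tokens, total_tokens));
            }
        }
    }
    agg
}

fn main() {
    let events = vec![
        ConverseStreamEvent::ContentBlockDelta { text: "Hello".into() },
        ConverseStreamEvent::ContentBlockDelta { text: ", world".into() },
        ConverseStreamEvent::MessageStop { stop_reason: "end_turn".into() },
        ConverseStreamEvent::Metadata { input_tokens: 30, output_tokens: 628, total_tokens: 658 },
    ];
    let agg = aggregate(events);
    assert_eq!(agg.text, "Hello, world");
    assert_eq!(agg.usage, Some((30, 628, 658)));
    println!("{:?}", agg);
}
```

Because usage arrives exactly once, the aggregator records it rather than summing across chunks — a structural difference from the existing OpenAI stream path in src/stream.rs.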
