feat(datadog): support per-request ml_app override via metadata #25684

Open · liranddd wants to merge 1 commit into BerriAI:main
Conversation
Greptile Summary: This PR adds per-request `ml_app` overrides via request metadata for Datadog LLM Observability; spans are grouped by `ml_app` at flush time and sent as separate batches.
Confidence Score: 5/5
| Filename | Overview |
|---|---|
| litellm/integrations/datadog/datadog_llm_obs.py | Core change: groups log_queue by _dd_ml_app and sends separate batches per ml_app. Clean-copy approach correctly avoids mutating queue entries. Two inline imports violate project style. Multi-group partial-success can cause duplicate spans on retry. |
| litellm/integrations/datadog/datadog_handler.py | Adds get_datadog_ml_app() helper that reads DD_LLMOBS_ML_APP and falls back to get_datadog_service(). Clean, backwards-compatible addition. |
| litellm/types/integrations/datadog_llm_obs.py | Adds optional _dd_ml_app field to LLMObsPayload TypedDict as an internal routing hint. Field is documented as stripped before sending. Clean change. |
| tests/test_litellm/integrations/datadog/test_per_request_ml_app.py | 5 new mock-only tests covering payload field, absence without override, env-var default, multi-app grouping, failure-keeps-queue, and success-clears-queue. All mock the HTTP client; no real network calls. Good coverage. |
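The clean-copy grouping described in the table above can be sketched as follows. This is a simplified illustration, not the actual `async_send_batch()` code: the helper name and span shape are assumptions; only the behavior (group by `_dd_ml_app`, copy before stripping, never mutate queue entries) comes from the review summary.

```python
from collections import defaultdict
from typing import Any, Dict, List


def group_spans_by_ml_app(
    log_queue: List[Dict[str, Any]], default_ml_app: str
) -> Dict[str, List[Dict[str, Any]]]:
    """Group queued spans by their internal _dd_ml_app routing hint.

    Each span is shallow-copied and the internal field is popped from the
    copy, so queue entries are never mutated and the hint never reaches
    the Datadog API.
    """
    batches: Dict[str, List[Dict[str, Any]]] = defaultdict(list)
    for span in log_queue:
        clean = dict(span)  # copy first; the original queue entry stays intact
        ml_app = clean.pop("_dd_ml_app", None) or default_ml_app
        batches[ml_app].append(clean)
    return batches
```

Because the copy happens before the pop, a failed flush can safely leave the original queue untouched for retry.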
Sequence Diagram

```mermaid
sequenceDiagram
    participant Caller
    participant LiteLLM
    participant DDLLMObsLogger
    participant DatadogAPI
    Caller->>LiteLLM: completion(model, messages, metadata={ml_app: 'svc-a'})
    LiteLLM->>DDLLMObsLogger: async_log_success_event(kwargs)
    DDLLMObsLogger->>DDLLMObsLogger: create_llm_obs_payload()<br/>reads metadata.ml_app, stores _dd_ml_app='svc-a'
    DDLLMObsLogger->>DDLLMObsLogger: log_queue.append(payload)
    Note over DDLLMObsLogger: On flush (batch_size or periodic)
    DDLLMObsLogger->>DDLLMObsLogger: async_send_batch()<br/>group spans by _dd_ml_app
    DDLLMObsLogger->>DatadogAPI: POST /api/intake/llm-obs/v1/trace/spans<br/>ml_app='svc-a', spans=[… stripped of _dd_ml_app]
    DatadogAPI-->>DDLLMObsLogger: 202 Accepted
    DDLLMObsLogger->>DatadogAPI: POST /api/intake/llm-obs/v1/trace/spans<br/>ml_app='svc-b', spans=[…]
    DatadogAPI-->>DDLLMObsLogger: 202 Accepted
    DDLLMObsLogger->>DatadogAPI: POST /api/intake/llm-obs/v1/trace/spans<br/>ml_app=DD_LLMOBS_ML_APP (default), spans=[…]
    DatadogAPI-->>DDLLMObsLogger: 202 Accepted
    DDLLMObsLogger->>DDLLMObsLogger: log_queue.clear()
```
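The flush step in the diagram sends one POST per `ml_app` group. A minimal sketch of building those request bodies, assuming a body shape of `{"data": {"attributes": {"ml_app": …, "spans": […]}}}` (the exact intake body shape is an assumption based on the endpoint shown above, not confirmed by this PR):

```python
from typing import Any, Dict, List


def build_intake_payloads(
    batches: Dict[str, List[Dict[str, Any]]]
) -> List[Dict[str, Any]]:
    """Build one illustrative intake body per ml_app group.

    Each body carries ml_app at the batch level, since the Datadog
    intake API scopes ml_app per request rather than per span.
    """
    return [
        {"data": {"type": "span", "attributes": {"ml_app": ml_app, "spans": spans}}}
        for ml_app, spans in batches.items()
    ]
```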
Codecov Report❌ Patch coverage is
Allow callers to pass ml_app in request metadata to control the Application column in Datadog LLM Observability. Also adds support for the DD_LLMOBS_ML_APP env var. Fallback chain: metadata.ml_app → DD_LLMOBS_ML_APP → DD_SERVICE. Closes BerriAI#20701
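The fallback chain in the commit message (metadata.ml_app → DD_LLMOBS_ML_APP → DD_SERVICE) can be sketched as a small resolver. The function name is hypothetical (the PR's actual helper is `get_datadog_ml_app()`, which covers only the env-var portion); the precedence order is taken directly from the commit message.

```python
import os
from typing import Optional


def resolve_ml_app(metadata: Optional[dict]) -> Optional[str]:
    """Resolve the ml_app for a request, in priority order:
    1. per-request metadata.ml_app
    2. DD_LLMOBS_ML_APP environment variable
    3. DD_SERVICE environment variable (pre-existing behavior)
    """
    if metadata and metadata.get("ml_app"):
        return metadata["ml_app"]
    return os.environ.get("DD_LLMOBS_ML_APP") or os.environ.get("DD_SERVICE")
```

Requests without an override resolve exactly as before the change, which is what keeps the feature backwards compatible.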
Summary
- Callers can pass `ml_app` in request metadata to control the Application column in Datadog LLM Observability
- Spans are grouped by `ml_app` at flush time and sent as separate batches (the Datadog intake API requires `ml_app` at the batch level)
- Without an override, behavior falls back to `DD_SERVICE`, so the change is fully backwards compatible

Closes #20701
Motivation
When multiple services share a single LiteLLM proxy, all LLM traces appear under the same application in Datadog LLM Observability. There is currently no way to distinguish which service made the call. This PR lets callers tag their requests so they appear as distinct applications.
Usage
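A minimal sketch of the request shape, based on the metadata key this PR reads (model name and messages are placeholders; the dict mirrors the arguments a caller would pass to `litellm.completion()`):

```python
# The metadata.ml_app key is what the Datadog LLM Obs integration reads
# to override the Application column for this one request.
request_kwargs = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "hello"}],
    "metadata": {"ml_app": "svc-a"},  # per-request Application override
}

# litellm.completion(**request_kwargs) would then emit a span tagged
# ml_app="svc-a" instead of the proxy-wide default.
```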
Result in Datadog LLM Obs
Changes
- `litellm/integrations/datadog/datadog_llm_obs.py`: read `ml_app` from metadata, group batches by it
- `litellm/types/integrations/datadog_llm_obs.py`: add internal `_dd_ml_app` field (stripped before send)
- `tests/test_litellm/integrations/datadog/test_per_request_ml_app.py`: 5 new tests