PyPI-distributable LLM control plane: gateway choke point, cost attribution, OTel instrumentation, and offline reporting as an inspectable engineering artifact.
Updated Mar 30, 2026 - Python
FinOps cost attribution for AI code agents: maps Kiro, Cursor, and Claude Code spend to git commits to reveal per-task cost, waste patterns, and agent ROI signals.
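The attribution idea described above can be sketched in a few lines: group per-request agent spend by the git commit the work landed in and sum it. This is a minimal illustration; the record shape (`UsageRecord`, `attribute_spend`) is hypothetical and not the project's actual API.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class UsageRecord:
    agent: str          # e.g. "cursor", "claude-code", "kiro" (illustrative values)
    commit_sha: str     # commit the agent session's edits landed in
    cost_usd: float     # metered API spend for the request

def attribute_spend(records):
    """Sum agent spend per commit to expose per-task cost."""
    per_commit = defaultdict(float)
    for r in records:
        per_commit[r.commit_sha] += r.cost_usd
    return dict(per_commit)

records = [
    UsageRecord("cursor", "a1b2c3", 0.42),
    UsageRecord("cursor", "a1b2c3", 0.13),
    UsageRecord("claude-code", "d4e5f6", 1.07),
]
print(attribute_spend(records))
```

Once spend is keyed by commit, per-task cost and waste patterns (e.g. many expensive sessions behind a trivial diff) fall out of ordinary aggregation.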
Pre-dispatch policy evaluation and cost attribution for LLM inference, built on llmscope.
Evidence-based quality gate for LLM deployments: evaluates telemetry against latency, cost, and error policies to produce auditable go/no-go release decisions for CI/CD.
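A telemetry-vs-policy gate of the kind described above can be sketched as a pure comparison: the release is "go" only when every observed metric is within its policy limit. The policy shape, metric names, and thresholds below are assumptions for illustration, not the project's configuration format.

```python
def evaluate_gate(metrics, policy):
    """Return (go, violations): 'go' only if every metric is within its limit.

    Missing metrics count as violations, so the gate fails closed.
    """
    violations = [
        f"{name}: {metrics.get(name)} exceeds limit {limit}"
        for name, limit in policy.items()
        if metrics.get(name, float("inf")) > limit
    ]
    return (len(violations) == 0, violations)

metrics = {"p95_latency_ms": 840, "cost_per_1k_req_usd": 3.1, "error_rate": 0.004}
policy  = {"p95_latency_ms": 1000, "cost_per_1k_req_usd": 5.0, "error_rate": 0.01}

go, violations = evaluate_gate(metrics, policy)
print("GO" if go else "NO-GO", violations)  # prints: GO []
```

Returning the violation list alongside the boolean is what makes the decision auditable: a CI/CD job can fail the pipeline and attach the exact policy breaches as evidence.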
Claude skill for code-agent execution policy: scope control, credit burn reduction, and cheap context recovery for spec-driven implementation workflows.
OTel-native typed primitives for LLM cost attribution and telemetry, published on PyPI.
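A "typed primitive" for cost attribution might look like an immutable record of token counts that derives spend from prices, so downstream telemetry never carries untyped floats. This is an illustrative sketch; the class name, fields, and pricing parameters are assumptions, not the published package's API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TokenUsage:
    """Immutable token-count record; cost is derived, never stored."""
    model: str
    prompt_tokens: int
    completion_tokens: int

    def cost_usd(self, prompt_price_per_1k: float,
                 completion_price_per_1k: float) -> float:
        """Compute spend from token counts and per-1k-token prices."""
        return (self.prompt_tokens / 1000 * prompt_price_per_1k
                + self.completion_tokens / 1000 * completion_price_per_1k)

u = TokenUsage("example-model", prompt_tokens=1200, completion_tokens=300)
print(u.cost_usd(0.5, 1.5))  # 1.2 * 0.5 + 0.3 * 1.5 = 1.05 USD
```

Freezing the dataclass keeps usage records hashable and safe to attach as span attributes; pricing stays a parameter so the same record can be re-costed when rates change.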