
# ai-billing

Middleware for the Vercel AI SDK that sends usage events directly to billing platforms (Stripe, Polar, and Lago).



## Supported Providers

| Provider | Package |
| --- | --- |
| OpenRouter | `@ai-billing/openrouter` |
| OpenAI | `@ai-billing/openai` |
| Vercel AI Gateway | `@ai-billing/gateway` |
| OpenAI Compatible | `@ai-billing/openai-compatible` |
| Groq | `@ai-billing/groq` |
| Google Generative AI | `@ai-billing/google` |
| Anthropic | `@ai-billing/anthropic` |
| xAI Grok | `@ai-billing/xai` |
| MiniMax | `@ai-billing/minimax` |

## Supported Destinations

| Destination | Package |
| --- | --- |
| Polar.sh | `@ai-billing/polar` |
| Stripe | `@ai-billing/stripe` |
| OpenMeter (Kong) | `@ai-billing/openmeter` |
| Lago | `@ai-billing/lago` |

## Installation

```bash
npm install @ai-billing/core @ai-billing/openrouter # Example for OpenRouter
```

## Basic Usage

Wrap your model provider with the billing middleware.

```ts
import { wrapLanguageModel } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createOpenRouterV3Middleware } from '@ai-billing/openrouter';

const billingMiddleware = createOpenRouterV3Middleware({});

const model = wrapLanguageModel({
  model: createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY })('google/gemini-2.0-flash-001'),
  middleware: billingMiddleware,
});
```


## Send usage to Polar.sh

Add a Polar destination to the middleware to forward usage events to Polar.sh.

```ts
import { streamText, wrapLanguageModel } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createOpenRouterV3Middleware } from '@ai-billing/openrouter';
import { createPolarDestination } from '@ai-billing/polar';

const billingMiddleware = createOpenRouterV3Middleware({
  destinations: [
    createPolarDestination({
      accessToken: process.env.POLAR_ACCESS_TOKEN,
      eventName: 'llm_usage',
    })
  ],
});

const model = wrapLanguageModel({
  model: createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY })('google/gemini-2.0-flash-001'),
  middleware: billingMiddleware,
});

const { textStream } = await streamText({
  model,
  messages: [{ role: 'user', content: 'Quantify the value of metadata.' }],
  providerOptions: {
    'ai-billing-tags': { userId: 'usr_123', org: 'Acme' }
  },
});
```

## Status and Roadmap

> **Note:** We are prioritizing support for text models.

### Active Development

The full list of AI SDK providers can be found at https://ai-sdk.dev/providers/. Additional providers are planned for future implementation. To prioritize a specific provider, please open a GitHub issue.

## Architecture

The package consists of two primary components:

### 1. Provider Middleware

  • adapters specialized for `@ai-sdk/*` providers that understand each provider's specific `providerMetadata` usage shape
  • provider-specific cost calculation logic that turns usage into cost
  • a `PriceResolver` that lets you pass custom prices at request time
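To make the price-resolution idea concrete, here is a minimal sketch of how a resolver might turn raw token usage into a cost. The type names (`ModelPrice`, `PriceResolver`, `usageToCost`) and the per-million-token price fields are illustrative assumptions, not the actual `@ai-billing` API:

```ts
// Hypothetical per-million-token price for a model (illustrative shape,
// not the real @ai-billing types).
interface ModelPrice {
  inputPerMTok: number;  // USD per 1M input tokens
  outputPerMTok: number; // USD per 1M output tokens
}

// A price resolver maps a model id to its price, letting callers
// override pricing at request time.
type PriceResolver = (modelId: string) => ModelPrice | undefined;

const customPrices: Record<string, ModelPrice> = {
  'google/gemini-2.0-flash-001': { inputPerMTok: 0.1, outputPerMTok: 0.4 },
};

const resolvePrice: PriceResolver = (modelId) => customPrices[modelId];

// Turn raw token usage into a USD cost using the resolved price;
// returns undefined when no price is known for the model.
function usageToCost(
  modelId: string,
  usage: { promptTokens: number; completionTokens: number },
  resolve: PriceResolver,
): number | undefined {
  const price = resolve(modelId);
  if (!price) return undefined;
  return (
    (usage.promptTokens / 1_000_000) * price.inputPerMTok +
    (usage.completionTokens / 1_000_000) * price.outputPerMTok
  );
}
```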

### 2. Destinations

  • functions that receive a normalized `BillingEvent` and handle the API calls to external services
  • allow charging in credits using standardized meters
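A destination, conceptually, is just an async function that consumes normalized events. The sketch below shows an in-memory destination; the `BillingEvent` field names here are illustrative assumptions, not the actual `@ai-billing` event schema:

```ts
// Hypothetical normalized billing event (illustrative fields, not the
// real @ai-billing schema).
interface BillingEvent {
  model: string;
  usage: { promptTokens: number; completionTokens: number };
  costUsd?: number;
  tags?: Record<string, string>;
}

// A destination is an async function that forwards one event.
type Destination = (event: BillingEvent) => Promise<void>;

// Minimal in-memory destination, handy for tests; a real destination
// would POST the event to Stripe, Polar, Lago, etc.
function createMemoryDestination(): { send: Destination; events: BillingEvent[] } {
  const events: BillingEvent[] = [];
  return {
    events,
    send: async (event) => {
      events.push(event); // record instead of calling an external API
    },
  };
}
```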