Middleware for the Vercel AI SDK that sends usage events directly to billing platforms (Stripe, Polar, OpenMeter, and Lago).
| Destination | Package |
|---|---|
| Polar.sh | @ai-billing/polar |
| Stripe | @ai-billing/stripe |
| OpenMeter (Kong) | @ai-billing/openmeter |
| Lago | @ai-billing/lago |
```bash
npm install @ai-billing/core @ai-billing/openrouter # Example for OpenRouter
```

Wrap your model provider with the billing middleware:
```ts
import { wrapLanguageModel } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createOpenRouterV3Middleware } from '@ai-billing/openrouter';

const billingMiddleware = createOpenRouterV3Middleware({});

const model = wrapLanguageModel({
  model: createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY })('google/gemini-2.0-flash-001'),
  middleware: billingMiddleware,
});
```

Wrap your model provider with the billing middleware and define your destinations.
```ts
import { streamText, wrapLanguageModel } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createOpenRouterV3Middleware } from '@ai-billing/openrouter';
import { createPolarDestination } from '@ai-billing/polar';

const billingMiddleware = createOpenRouterV3Middleware({
  destinations: [
    createPolarDestination({
      accessToken: process.env.POLAR_ACCESS_TOKEN,
      eventName: 'llm_usage',
    }),
  ],
});

const model = wrapLanguageModel({
  model: createOpenRouter({ apiKey: process.env.OPENROUTER_API_KEY })('google/gemini-2.0-flash-001'),
  middleware: billingMiddleware,
});

const { textStream } = await streamText({
  model,
  messages: [{ role: 'user', content: 'Quantify the value of metadata.' }],
  providerOptions: {
    'ai-billing-tags': { userId: 'usr_123', org: 'Acme' },
  },
});
```

Note: We are prioritizing support for TEXT models.
Active Development
The full list of AI SDK providers can be found at https://ai-sdk.dev/providers/. The following providers are planned for future implementation; to prioritize a specific provider, please open a GitHub issue.
The package consists of two primary components:

- **Middleware** specialized for `@ai-sdk/*` providers: it understands the specific `providerMetadata` shapes of different LLM providers, includes provider-specific cost calculation logic that turns usage into cost, and offers a `PriceResolver` that allows passing custom prices at request time.
- **Destinations**: functions that receive a normalized `BillingEvent` and handle the API calls to external services; destinations can also allow charging in credits using standardized meters.
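As a rough sketch of how these two pieces fit together (the `BillingEvent` shape, the prices, and the helper names below are illustrative assumptions, not the actual `@ai-billing/core` API):

```typescript
// Illustrative sketch only: these types and prices are assumptions,
// not the real @ai-billing/core API.
type BillingEvent = {
  model: string;
  inputTokens: number;
  outputTokens: number;
  costUsd: number;
  tags?: Record<string, string>;
};

// Hypothetical USD prices per million tokens.
const PRICES: Record<string, { input: number; output: number }> = {
  'google/gemini-2.0-flash-001': { input: 0.1, output: 0.4 },
};

// Provider-specific cost calculation: turn token usage into a cost.
function usageToCost(model: string, inputTokens: number, outputTokens: number): number {
  const p = PRICES[model];
  if (!p) throw new Error(`no price configured for model: ${model}`);
  return (inputTokens * p.input + outputTokens * p.output) / 1_000_000;
}

// A destination receives the normalized event and forwards it to an
// external service (Stripe, Polar, OpenMeter, Lago).
type Destination = (event: BillingEvent) => Promise<void>;

const logDestination: Destination = async (event) => {
  console.log(`billed $${event.costUsd.toFixed(6)} for ${event.model}`, event.tags ?? {});
};
```

A real destination would replace the `console.log` with an authenticated API call to the billing platform.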

