I built an open-source tool that helps add usage-based billing for your LLM projects

These days, monetizing a project built on top of OpenAI or Anthropic is a huge hassle. You have to figure out:

What is my OpenAI and Anthropic cost for each user?
How much should I charge each user?
How do I impose a usage limit on each user to ensure profitability for each pricing tier?

BricksLLM helps you answer all of these questions via a highly scalable API gateway built specifically for LLMs.

Here is a quick demo:

For each user, you could create a proxy API key (through the REST endpoint) with a spend limit of $100/month and a rate limit of 10,000 requests/month:
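As a rough sketch, that request could look something like this (the admin port, endpoint path, and field names below are illustrative assumptions; the repo documents the actual key-management API):

// Sketch: create a proxy key with a monthly spend limit and rate limit.
// The port, path, and field names are assumptions for illustration only.
const res = await fetch("http://localhost:8001/api/key-management/keys", {
  method: "PUT",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    name: "user-123-key",   // hypothetical per-user key name
    costLimitInUsd: 100,    // $100/month spend limit (assumed field name)
    rateLimit: 10000,       // 10,000 requests/month (assumed field name)
  }),
});
console.log(await res.json()); // response includes the proxy key to hand to your user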

Then, you can redirect your OpenAI/Anthropic requests to us and start using the key:

// OpenAI Node SDK v4
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: "MY-SECRET-KEY", // the proxy key created earlier
  baseURL: "http://localhost:8002/api/providers/openai/v1", // redirect requests through BricksLLM
});

That’s it. Just start using OpenAI/Anthropic as you would normally.
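For example, a regular chat completion call works exactly as before; the only difference is that the request flows through the gateway, which meters it against the key’s limits (the model name here is just an example):

// A normal OpenAI SDK call; BricksLLM proxies it and records the usage.
const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(completion.choices[0].message.content);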

You can query usage metrics by key ID, model, custom ID, and user ID:
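A minimal sketch of such a query, assuming a reporting endpoint on the admin port (the path and query parameter names are illustrative assumptions; see the repo for the actual reporting API):

// Sketch: fetch usage metrics filtered by key ID and user ID.
// Endpoint path and query parameters are assumptions for illustration only.
const metrics = await fetch(
  "http://localhost:8001/api/reporting/events?keyIds=user-123-key&userId=user-123"
);
console.log(await metrics.json()); // usage data you can feed into analytics or billing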

The usage data can be used both for analytics and for Stripe integration. BricksLLM is free and open-source! You can spin it up with a single Docker command. Under the hood, it’s just a Go web server with a PostgreSQL database and a Redis cache.

Check us out and let me know what you think!

Here is the repo if you want to learn more about it: https://github.com/bricks-cloud/bricksllm
