Introduction
Backmesh is an open source Backend as a Service (BaaS) for AI apps. It lets you securely call LLM APIs directly from your mobile or web app using any LLM SDK without exposing private API keys. The only changes in your app are to replace:
- The LLM API URL with the Backmesh Gatekeeper URL.
- The LLM private key with the authenticated user's JWT.
For example, with the OpenAI SDK and Supabase auth:

```javascript
import OpenAI from "openai";
import { createClient } from "@supabase/supabase-js";

const BACKMESH_URL =
  "https://edge.backmesh.com/v1/proxy/gbBbHCDBxqb8zwMk6dCio63jhOP2/wjlwRswvSXp4FBXwYLZ1/v1";

const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY); // your Supabase project credentials
const {
  data: { session },
} = await supabase.auth.getSession();

const client = new OpenAI({
  baseURL: BACKMESH_URL, // point the SDK at the Backmesh Gatekeeper
  apiKey: session.access_token, // the user's JWT replaces the private key
  dangerouslyAllowBrowser: true, // safe: no private key ships to the browser
});
```
How is the LLM API protected?
- JWT authentication: Requests are verified with JWTs from the app's authentication provider, so only your authenticated users can reach the LLM API through the Backmesh Gatekeeper.
- Rate limits per user: Configurable per-user rate limits prevent abuse (e.g. no more than 5 OpenAI API calls per user per hour).
- Resource access control: Sensitive API resources like Files and Threads are protected so only the users who create them can access them later.
For more details, see the security documentation.
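The per-user rate limit can be pictured as a fixed-window counter keyed by the user id from the JWT. The sketch below is illustrative, not Backmesh's actual implementation: the function names and in-memory `Map` are assumptions, and a real Cloudflare Worker would keep the counts in KV or a Durable Object instead.

```javascript
// Illustrative fixed-window rate limiter keyed by user id (the JWT `sub` claim).
// NOT Backmesh's actual implementation -- a sketch of the idea only.
const WINDOW_MS = 60 * 60 * 1000; // 1 hour window
const MAX_CALLS = 5; // e.g. at most 5 LLM API calls per user per hour

const windows = new Map(); // userId -> { start, count }

function allowRequest(userId, now = Date.now()) {
  const w = windows.get(userId);
  if (!w || now - w.start >= WINDOW_MS) {
    // First call, or the previous window has elapsed: start a fresh window.
    windows.set(userId, { start: now, count: 1 });
    return true;
  }
  if (w.count < MAX_CALLS) {
    w.count += 1;
    return true;
  }
  return false; // over the limit: the gateway would respond with HTTP 429
}
```

A request that returns `false` here would be rejected before any LLM API call is made, so a misbehaving client cannot run up your bill.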
Supported LLM Private Key APIs:
- OpenAI
- Gemini
- Anthropic
- Cloudflare Workers AI
Supported Authentication Providers:
- Supabase
- Firebase
Leave a comment on Discord if your provider or LLM API is not supported.
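Whichever provider issues the token, the Gatekeeper identifies a user from the claims inside the JWT. The sketch below only decodes the payload to show the shape of those claims; it is an assumption for illustration, and a real gateway must also verify the token's signature against the provider's public keys (JWKS) before trusting any claim.

```javascript
// Decode a JWT payload (illustrative only -- signature verification against
// the auth provider's JWKS is required in a real gateway and is omitted here).
function decodeJwtPayload(jwt) {
  const [, payload] = jwt.split(".");
  // JWT payloads are base64url-encoded JSON.
  return JSON.parse(Buffer.from(payload, "base64url").toString("utf8"));
}

// Minimal claim checks: token not expired, and a subject (user id) present.
function extractUserId(jwt, now = Date.now() / 1000) {
  const claims = decodeJwtPayload(jwt);
  if (typeof claims.exp === "number" && claims.exp < now) {
    throw new Error("token expired");
  }
  if (!claims.sub) throw new Error("missing sub claim");
  return claims.sub;
}
```

The `sub` claim returned here is what per-user rate limits and resource access control key on.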
LLM Analytics without SDKs
Backmesh automatically instruments LLM requests so you can understand LLM API usage across your users, e.g. error rates, costs, and response times per model. Please leave a comment on Discord describing which LLM API endpoints you use and which analytics you would like to see.
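As an illustration of the kind of analytics this enables, per-request cost can be estimated from the token usage that OpenAI-style responses already include. The price table below is hypothetical and the function name is an assumption; this is not Backmesh's actual accounting code.

```javascript
// Hypothetical per-million-token prices in USD -- illustrative only, not
// real pricing data used by Backmesh.
const PRICES = {
  "model-a": { input: 2.5, output: 10 },
  "model-b": { input: 0.15, output: 0.6 },
};

// Estimate one request's cost from the `usage` object that OpenAI-style
// responses return alongside the model name.
function estimateCost(model, usage) {
  const p = PRICES[model];
  if (!p) return null; // unknown model: record usage without a cost estimate
  return (
    (usage.prompt_tokens * p.input + usage.completion_tokens * p.output) / 1e6
  );
}
```

Aggregating these estimates by user or by model is what surfaces in a cost dashboard.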
Hosting
Backmesh is open source and can be self-hosted in your own Cloudflare account, which includes a generous free tier. We also offer a hosted SaaS with different pricing plans. Note that LLM API analytics are displayed only in the SaaS dashboard.