Introduction

Backmesh lets your mobile or web app (e.g. JavaScript, native mobile, Flutter, React, React Native, etc.) call private key LLM APIs without a server or cloud function backend. Backmesh is hosted on Cloudflare’s edge, so response times are typically lower than with traditional servers and cloud functions.

How Backmesh works

Backmesh provides secure LLM API access by using your authentication provider’s JWT to ensure only authorized users can make API calls. It also enforces protections such as per-user rate limits (e.g., no more than 5 OpenAI API calls per hour per user). For more details, see the security documentation.
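To make the per-user rate limiting concrete, here is a minimal sketch of a fixed-window limiter enforcing the "5 calls per hour per user" example above. This is an illustrative in-memory model, not Backmesh's actual implementation; the names `allowCall`, `WINDOW_MS`, and `MAX_CALLS` are placeholders.

```typescript
// Illustrative sketch of per-user rate limiting (e.g. at most 5 LLM API
// calls per hour per user). In-memory fixed-window limiter; Backmesh's
// real implementation on Cloudflare's edge will differ.
const WINDOW_MS = 60 * 60 * 1000; // one-hour window
const MAX_CALLS = 5;              // calls allowed per user per window

const windows = new Map<string, { start: number; count: number }>();

function allowCall(userId: string, now: number): boolean {
  const w = windows.get(userId);
  if (!w || now - w.start >= WINDOW_MS) {
    // First call, or the previous window expired: start a fresh window.
    windows.set(userId, { start: now, count: 1 });
    return true;
  }
  if (w.count < MAX_CALLS) {
    w.count++;
    return true;
  }
  // Over the per-user limit; the proxy would reject the request here.
  return false;
}
```

A request that fails this check never reaches the LLM provider, so a single user cannot burn through your API budget.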

LLM Private Key APIs Supported:

  • OpenAI
  • Gemini
  • Anthropic

Authentication Providers Supported:

  • Supabase
  • Firebase
  • Auth0
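Putting the pieces together, the client-side pattern looks roughly like this: instead of shipping a provider API key in the app, the client sends the JWT it already holds from its auth provider (Supabase, Firebase, or Auth0), and the proxy verifies it before attaching the real key and forwarding the call. The proxy URL, header layout, and model name below are illustrative assumptions, not the actual Backmesh endpoint.

```typescript
// Sketch of a client-side request through an LLM proxy. The user's JWT
// replaces the provider API key; the proxy validates it and attaches the
// real key server-side. PROXY_URL is a hypothetical placeholder.
const PROXY_URL = "https://example-proxy.invalid/v1/chat/completions";

type ProxiedRequest = {
  url: string;
  method: "POST";
  headers: Record<string, string>;
  body: string;
};

function buildProxiedRequest(userJwt: string, prompt: string): ProxiedRequest {
  return {
    url: PROXY_URL,
    method: "POST",
    headers: {
      // The auth provider's JWT, not an OpenAI/Gemini/Anthropic key.
      Authorization: `Bearer ${userJwt}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // illustrative model name
      messages: [{ role: "user", content: prompt }],
    }),
  };
}
```

The request could then be dispatched with `fetch(req.url, req)`; because no provider key ever reaches the client, there is nothing for a malicious user to extract from the app bundle.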

LLM Analytics without SDKs

Backmesh automatically instruments LLM requests so you can understand LLM API usage across your users, e.g. error rates, costs, and response times across models. Please leave a comment in this GitHub Issue or message us on Discord with more information about which LLM API endpoints you are using and what analytics you would like to see.

Tutorials