Can We Use LLM APIs Without a Backend?

Asked By CreativeCat89 On

Hey folks! I've noticed that consuming LLM APIs has become quite popular, and typically, we need a backend to manage the API keys for security purposes. However, it feels a bit excessive to create a backend for every single AI app just to access model APIs. For instance, we developed a custom app for a client that processes a PDF using AI model APIs and generates multiple output PDFs. We just use a 'generateObject' call, but we still require a backend for the API access.

This got me thinking: what if there was a service that could act as a sort of proxy backend? This would allow us to input API keys in a dashboard and have it connect to various model APIs. The service could provide CORS options and security measures to ensure only specific web and mobile apps could use it.

I'm really interested to hear your thoughts on something like this!

8 Answers

Answered By QuickFixer77 On

Just rent a cheap host like Render.com, run a small FastAPI app that forwards requests, and keep your keys in environment variables. It's a pretty straightforward setup that doesn't need a dedicated product, and updating it is easy.
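To make the idea concrete, here's a minimal sketch of that kind of key-holding proxy using only the Python standard library (a FastAPI version would look similar, just with decorators). The upstream URL, the `ALLOWED_ORIGINS` set, and the `LLM_API_KEY` environment variable name are all illustrative assumptions, not anything the answer specified:

```python
import os
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Example upstream endpoint; swap in whichever model API you use.
UPSTREAM_URL = "https://api.openai.com/v1/chat/completions"
# Only these web apps may call the proxy (CORS-style allowlist).
ALLOWED_ORIGINS = {"https://myapp.example.com"}


def origin_allowed(origin: str) -> bool:
    """Return True if the request's Origin is on the allowlist."""
    return origin in ALLOWED_ORIGINS


class ProxyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        origin = self.headers.get("Origin", "")
        if not origin_allowed(origin):
            self.send_response(403)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # The API key lives only on the server, never in the client bundle.
        req = urllib.request.Request(
            UPSTREAM_URL,
            data=body,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {os.environ['LLM_API_KEY']}",
            },
        )
        with urllib.request.urlopen(req) as resp:
            payload = resp.read()
        self.send_response(200)
        self.send_header("Access-Control-Allow-Origin", origin)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)


def run(port: int = 8080) -> None:
    """Start the proxy; call run() from your entrypoint."""
    HTTPServer(("", port), ProxyHandler).serve_forever()
```

Note the caveat from elsewhere in this thread: an Origin header check keeps honest browsers in line, but it is not real authentication, since any non-browser client can forge it.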

Answered By CodeCrafter88 On

So you're saying instead of writing a backend to directly interface with the API, we hand our API keys over to some third-party proxy? They would handle the security, but we would still have to manage a backend for that proxy, right?

FootlooseCoder -

Not really! Think of Firebase: it handles security through App Check and authorized domains, so the Firebase API key can live in the client without being a liability.

Answered By CloudGuru97 On

Putting an LLM API behind an unauthenticated endpoint effectively gives everyone free access to your subscription. Why not just use a lambda function if you're trying to avoid running a full backend?
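For illustration, a lambda-style handler for this can be a single function. This is a sketch under assumptions: the shared `APP_TOKEN` secret, the `x-app-token` header name, and the upstream URL are all hypothetical (a real deployment would more likely use an API Gateway authorizer or Cognito instead of a shared token):

```python
import json
import os
import urllib.request

# Hypothetical shared secret the client app must send; prevents the
# function URL from being a free-for-all on your LLM subscription.
APP_TOKEN = os.environ.get("APP_TOKEN", "change-me")
# Example upstream endpoint; swap in whichever model API you use.
UPSTREAM_URL = "https://api.openai.com/v1/chat/completions"


def lambda_handler(event, context):
    headers = event.get("headers") or {}
    # Reject callers that don't present the shared app token.
    if headers.get("x-app-token") != APP_TOKEN:
        return {"statusCode": 401,
                "body": json.dumps({"error": "unauthorized"})}
    # Forward the request body upstream, attaching the server-held key.
    req = urllib.request.Request(
        UPSTREAM_URL,
        data=event["body"].encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['LLM_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": 200, "body": resp.read().decode()}
```

The point of the 401 branch is the whole argument: without some check like it, the lambda is exactly the unauthenticated endpoint being warned about.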

Answered By TechWhiz81 On

How common are apps that solely call LLM APIs without any other backend support? It seems like a pretty niche situation. If it's just for internal use without the need for authentication, embedding the API key in the frontend might work. If the whole point is to avoid deploying a backend, your market could be really limited.

CreativeCat89 -

That’s a fair point. For example, if someone uses Supabase or Firebase for authentication and database directly from the client, would that make sense for accessing an LLM API? I've noticed Firebase has some integrated AI logic, but it only works with their own APIs.

Answered By APIProGuy23 On

You might be looking for something like n8n to handle that. It can automate workflows and makes it easier to wire APIs together.

Answered By SparkyDev42 On

I recently heard about a service called BuildShip that seems to fit your description. They offer no-code access to LLM calls, which might cover what you're looking for!

CreativeCat89 -

Thanks for the tip! That sounds like a solid option. They're combining API services with a visual builder, right?

Answered By GatewayExpert45 On

Are you thinking of something like an API gateway? Check out Cloudflare’s documentation on that; it’s worth looking into.

CreativeCat89 -

Not quite! You’d still need to build and deploy an API gateway. I’m looking for an out-of-the-box solution.
