I'm looking to set up different rate limits for various user plans starting in 2025. For instance:
* Free users: 10 requests/month
* Tier 1 users: 30 requests/month
* Pro users: 100 requests/month
* Rate limits would reset on the 1st of each month
* The limits should be enforced before any requests hit the backend.
How should I implement this? Should I store counters in Redis, use Cloudflare Workers KV or Durable Objects, track usage in my backend database, or use an API gateway with built-in quota rules? Any pointers to common industry practice would be appreciated!
3 Answers
For limits at this scale, I'd stick with a database solution. With a cap of 100 requests per month, that's at most a few writes per user per day, which is a trivial load. You keep things simple and avoid running an external service.
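A minimal sketch of the database approach, here with Python and an in-memory SQLite database standing in for the real backend (the `usage` table, its columns, and the `try_consume` helper are all assumptions for illustration). The important detail is the atomic conditional `UPDATE`: concurrent requests can never push a user past their quota because the row only updates while `count < quota`.

```python
import sqlite3

# In-memory DB standing in for the backend database (assumption:
# a `usage` table keyed by user_id; a real schema will differ).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE usage (
        user_id INTEGER PRIMARY KEY,
        period  TEXT NOT NULL,      -- e.g. '2025-01'; new period = fresh row
        count   INTEGER NOT NULL,
        quota   INTEGER NOT NULL    -- 10 / 30 / 100 depending on plan
    )
""")
conn.execute("INSERT INTO usage VALUES (1, '2025-01', 0, 10)")  # a free user

def try_consume(user_id: int, period: str) -> bool:
    """Atomically consume one request if the user is still under quota."""
    cur = conn.execute(
        """UPDATE usage
           SET count = count + 1
           WHERE user_id = ? AND period = ? AND count < quota""",
        (user_id, period),
    )
    conn.commit()
    return cur.rowcount == 1  # one row updated => request allowed

# A free user (quota 10) gets exactly 10 allowed requests this month.
results = [try_consume(1, "2025-01") for _ in range(12)]
print(results.count(True))  # 10
print(results[10])          # False: request 11 is rejected
```

The monthly reset falls out of the `period` column: on the 1st you simply start writing rows for the new period (or reset `count` in a scheduled job), rather than needing any expiry machinery.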
Honestly, keeping it simple is key! Resolve the user's plan during authentication and enforce the limit in middleware. This is the approach Laravel's built-in rate limiter takes.
Like this:
```php
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\RateLimiter;

RateLimiter::for('uploads', function (Request $request) {
    // No limit for VIP customers; otherwise 100 requests per minute, keyed by IP.
    return $request->user()->vipCustomer()
        ? Limit::none()
        : Limit::perMinute(100)->by($request->ip());
});
```
That approach sounds solid, appreciate the example! I think my current serverless architecture complicates things unnecessarily.
Starting with Redis could be a smart choice: it's easy to build on and scales as your user base grows. Store a counter per account with an expiry at the end of the month; each API call decrements it. If the key doesn't exist yet, look up the user's plan and initialize the counter to that plan's limit. Enforcement is then a single check that the counter is still above zero.
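A sketch of that flow in Python. To keep it runnable without a server, a plain dict stands in for Redis; the equivalent redis-py calls are noted in comments, and `plan_limits`, the key format, and the helper names are all assumptions:

```python
import calendar
from datetime import datetime, timezone

# Stand-in for Redis. With a real client you'd use r.set(..., ex=..., nx=True)
# and r.decr(key) instead of dict operations.
store: dict[str, int] = {}

plan_limits = {"free": 10, "tier1": 30, "pro": 100}  # from the question

def seconds_until_month_end(now: datetime) -> int:
    """TTL so the counter expires (and therefore resets) on the 1st."""
    last_day = calendar.monthrange(now.year, now.month)[1]
    month_end = datetime(now.year, now.month, last_day, 23, 59, 59,
                         tzinfo=timezone.utc)
    return max(1, int((month_end - now).total_seconds()))

def allow_request(user_id: str, plan: str) -> bool:
    key = f"quota:{user_id}"
    if key not in store:
        # Redis: r.set(key, plan_limits[plan],
        #              ex=seconds_until_month_end(datetime.now(timezone.utc)),
        #              nx=True)
        store[key] = plan_limits[plan]
    if store[key] <= 0:
        return False
    store[key] -= 1  # Redis: r.decr(key)
    return True

results = [allow_request("u1", "free") for _ in range(12)]
print(results.count(True))  # 10: requests 11 and 12 are rejected
```

One caveat the dict version glosses over: the read-then-decrement above is a race under concurrency. With real Redis you'd rely on `DECR` being atomic and treat a returned value below zero as a rejection, rather than checking the counter first.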
That sounds reasonable! I was thinking the same, but I wonder whether every new product really needs its own Redis setup just for this.

Thanks for the input! But doesn't that mean I'd still have to implement the plan and limit lookup myself? I was hoping for something more off-the-shelf.