Hi there,

I came across the following issue; I'm not sure if it's a bug or if I misunderstood something.
We have an API endpoint with the default throttle middleware applied (60 requests per minute). Our client posts to this endpoint quite frequently from an automated service, which is currently configured to wait 20 seconds between requests, so in theory it sends 3 requests per minute. However, it is being locked out by the rate limiter all the time.
By doing some debugging, I realised that every time RateLimiter::hit() is called, the cache file containing the attempt counter is renewed.
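To make the effect concrete, here is a small self-contained sketch; the class below is my own simplification of the counter kept by the file cache driver, not the framework's actual code:

```php
<?php

// My own stand-in for the attempt counter stored by the file cache driver
// (not the framework's code). Every increment recomputes the remaining
// lifetime in whole minutes and re-stores the entry with it, which is
// what keeps pushing the expiry forward.
class SlidingCounterStub
{
    private $value = 0;
    private $expiresAt;

    public function __construct($decaySeconds)
    {
        $this->expiresAt = time() + $decaySeconds;
    }

    public function increment()
    {
        // Mirrors ceil(($expire - time()) / 60) in FileStore.
        $minutesLeft = (int) ceil(($this->expiresAt - time()) / 60);

        // Re-store with the rounded-up lifetime: the expiry becomes
        // "a full minute from now" on every single hit.
        $this->expiresAt = time() + $minutesLeft * 60;

        return ++$this->value;
    }
}

// Called once per request by the rate limiter: as long as requests arrive
// at least once a minute, the counter never expires and only grows.
$counter = new SlidingCounterStub(60);
$counter->increment();
```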
It looks like the root of the problem is in Illuminate\Cache\FileStore L87:
```php
// Next, we'll extract the number of minutes that are remaining for a cache
// so that we can properly retain the time for things like the increment
// operation that may be performed on the cache. We'll round this out.
$time = ceil(($expire - time()) / 60);
```
$time will always be 1 if we limit per minute, which means the client will still reach their limit even if it's set to 1 million, as long as they keep hitting the endpoint at least once every minute.
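A quick worked example of that arithmetic (the numbers are only illustrative):

```php
<?php

// Say the counter was first written 20 seconds ago, so 40 seconds remain:
$expire = time() + 40;
$time   = ceil(($expire - time()) / 60); // ceil(40 / 60) = 1

// FileStore re-stores the counter for $time (= 1) more minute on every
// increment, so each request pushes the expiry a full minute into the
// future and the attempt count never gets a chance to reset.
```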
Do you think this is a legitimate bug, or is it intentional? What would be the solution to this, apart from increasing the $decayMinutes parameter on the ThrottleRequests middleware?
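One workaround that seems to sidestep this (my assumption, not a confirmed fix): back the cache with a store whose native increment leaves the key's TTL untouched, such as Redis, so the attempt counter still expires $decayMinutes after the first hit instead of sliding forward. The route and controller names below are placeholders:

```php
// .env - switch the default cache store away from the file driver
// CACHE_DRIVER=redis

// Routes file - the throttle middleware itself stays exactly the same:
Route::group(['middleware' => 'throttle:60,1'], function () {
    Route::post('api/endpoint', 'ApiController@store');
});
```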