Memcache

Roman edited this page Mar 8, 2019 · 2 revisions

RateLimiterMemcache

Usage

See all options here

  const Memcached = require('memcached');
  const memcached = new Memcached('127.0.0.1:11211');
  const {RateLimiterMemcache} = require('rate-limiter-flexible');

  const opts = {
    storeClient: memcached,
    points: 5, // Number of points
    duration: 1, // Per second(s)
  };

  const rateLimiter = new RateLimiterMemcache(opts);

  rateLimiter.consume(userId)
    .then((rateLimiterRes) => {
      // ... Some app logic here ...
    })
    .catch((rejRes) => {
      if (rejRes instanceof Error) {
        // Some Memcached error
        // Never happens if `insuranceLimiter` is set up
        // Otherwise decide what to do with it
      } else {
        // Consumed more than allowed
        const secs = Math.round(rejRes.msBeforeNext / 1000) || 1;
        // `res` here is an Express-style response object
        res.set('Retry-After', String(secs));
        res.status(429).send('Too Many Requests');
      }
    });
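The `Retry-After` computation above rounds `msBeforeNext` to whole seconds and falls back to 1, so a client never receives a zero delay. A minimal sketch of that logic as a standalone helper (the `retryAfterSeconds` name is ours, not part of the library):

```javascript
// Convert milliseconds until the next allowed request into a
// Retry-After value in whole seconds. Never return 0, since a
// client could interpret it as "retry immediately".
function retryAfterSeconds(msBeforeNext) {
  return Math.round(msBeforeNext / 1000) || 1;
}

console.log(retryAfterSeconds(250));  // 0.25 s rounds to 0, fallback yields 1
console.log(retryAfterSeconds(3500)); // rounds to 4
```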

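Since a Memcached outage surfaces as an `Error` in the `catch` branch above, the `insuranceLimiter` option can keep limiting in process memory while the store is down. A minimal sketch, reusing the same `points`/`duration` values as above:

```javascript
const Memcached = require('memcached');
const {RateLimiterMemcache, RateLimiterMemory} = require('rate-limiter-flexible');

const memcached = new Memcached('127.0.0.1:11211');

// If Memcached becomes unreachable, consume() falls back to the
// in-memory limiter instead of rejecting with an Error.
const rateLimiter = new RateLimiterMemcache({
  storeClient: memcached,
  points: 5,
  duration: 1,
  insuranceLimiter: new RateLimiterMemory({
    points: 5,
    duration: 1,
  }),
});
```

Note that the in-memory fallback is per process, so with PM2 running several workers the effective limit during an outage is higher than with the shared Memcached store.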
RateLimiterMemcache benchmark

The endpoint is a pure Node.js endpoint launched in `node:10.5.0-jessie` and `memcached:1.5.12` Docker containers by PM2 with 4 workers.

Load generated by `bombardier -c 1000 -l -d 30s -r 2000 -t 5s http://127.0.0.1:8000`

The test runs 1000 concurrent connections at a maximum of 2000 requests per second for 30 seconds.

Statistics        Avg      Stdev        Max
  Reqs/sec      1999.33     431.91    3304.15
  Latency        3.89ms   641.62us    13.64ms
  Latency Distribution
     50%     3.77ms
     75%     4.44ms
     90%     5.13ms
     95%     5.64ms
     99%     7.02ms
  HTTP codes:
    1xx - 0, 2xx - 15151, 3xx - 0, 4xx - 44865, 5xx - 0
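The HTTP code totals are consistent with the load profile: at a 2000 req/s cap for 30 seconds bombardier issues about 60 000 requests, and with `points: 5` per key per second most of them exceed the limit and receive a 429. A quick sanity check on the numbers above:

```javascript
// Totals from the benchmark run above
const ok = 15151;       // 2xx: requests that consumed a point successfully
const rejected = 44865; // 4xx: 429 responses from the limiter
const total = ok + rejected;

console.log(total);                         // 60016, close to 2000 req/s * 30 s = 60000
console.log((rejected / total).toFixed(2)); // 0.75, i.e. ~75% of requests were limited
```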

Heap snapshot statistics under high traffic
