@animir
Last active April 8, 2020 12:38
Benchmark BurstyRateLimiter from rate-limiter-flexible and hyacinth
const { RateLimiterRedis, BurstyRateLimiter } = require('rate-limiter-flexible');
const cluster = require('cluster');
const http = require('http');
const Ioredis = require('ioredis');
const TokenBucket = require('hyacinth');

const redisClient = new Ioredis({});
const numberOfUsers = 100;

if (cluster.isMaster) {
  // Master process: fork 6 workers, each running an HTTP server on port 3002.
  for (let i = 0; i < 6; i++) {
    cluster.fork();
  }
} else {
  const hyacinthTokenBucket = new TokenBucket({
    redis: redisClient,
  });

  // Main limiter: 2 points per second; burst limiter: 5 extra points per 10 seconds.
  const burstyLimiter = new BurstyRateLimiter(
    new RateLimiterRedis({
      redis: redisClient,
      keyPrefix: 'rlflx',
      points: 2,
      duration: 1,
      inmemoryBlockOnConsumed: 2,
    }),
    new RateLimiterRedis({
      redis: redisClient,
      keyPrefix: 'rlburst',
      points: 5,
      duration: 10,
      inmemoryBlockOnConsumed: 5,
    })
  );

  const srv = http.createServer(async (req, res) => {
    if (req.url === '/favicon.ico') {
      res.writeHead(404);
      res.end();
      return;
    }

    if (req.url === '/bucket') {
      // Rate limit a random simulated user id with the hyacinth token bucket.
      hyacinthTokenBucket.rateLimit(`${Math.floor(Math.random() * numberOfUsers)}`, 1, 7, 500, function (err, tokensRemaining) {
        if (err || tokensRemaining < 0) {
          res.writeHead(429);
          res.end();
          return;
        }
        res.end();
      }).catch((err) => {
        res.writeHead(429);
        res.end();
      });
    } else {
      // All other routes go through BurstyRateLimiter.
      burstyLimiter.consume(`${Math.floor(Math.random() * numberOfUsers)}`)
        .then((rlRes) => {
          res.end();
        })
        .catch((rej) => {
          res.writeHead(429);
          res.end();
        });
    }
  });

  srv.listen(3002);
}
animir commented Apr 4, 2020

This small script starts a Node.js server in cluster mode with 6 workers on port 3002. It connects to Redis, which should be running locally on the default port.
The hyacinth TokenBucket limits localhost:3002/bucket; all other routes are limited by BurstyRateLimiter from rate-limiter-flexible.

Both limiters allow 2 requests per second per user, with a traffic burst allowance of up to 7 requests.
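The burst arithmetic can be sketched with a minimal in-memory model (MemoryLimiter and BurstySketch below are hypothetical stand-ins, not the library's classes): consume from the main limiter first, and only when it rejects, spend a point from the burst limiter.

```javascript
// Hypothetical in-memory limiter standing in for RateLimiterRedis — just
// enough state to demonstrate the flow (per-key point bucket with a reset time).
class MemoryLimiter {
  constructor({ points, duration }) {
    this.points = points;
    this.durationMs = duration * 1000;
    this.buckets = new Map(); // key -> { remaining, resetAt }
  }

  async consume(key) {
    const now = Date.now();
    let bucket = this.buckets.get(key);
    if (!bucket || now >= bucket.resetAt) {
      bucket = { remaining: this.points, resetAt: now + this.durationMs };
      this.buckets.set(key, bucket);
    }
    if (bucket.remaining <= 0) {
      throw new Error('rate limited');
    }
    bucket.remaining -= 1;
    return { remainingPoints: bucket.remaining };
  }
}

// Sketch of the BurstyRateLimiter idea: the main limiter is consulted first,
// and only when it rejects does the request spend a point from the burst limiter.
class BurstySketch {
  constructor(mainLimiter, burstLimiter) {
    this.mainLimiter = mainLimiter;
    this.burstLimiter = burstLimiter;
  }

  async consume(key) {
    try {
      return await this.mainLimiter.consume(key);
    } catch (rej) {
      return this.burstLimiter.consume(key); // may also reject with 'rate limited'
    }
  }
}
```

With points: 2 per second plus points: 5 per 10 seconds, the first 2 + 5 = 7 immediate requests from one key succeed and the 8th is rejected until a window resets.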

animir commented Apr 4, 2020

Benchmark on a MacBook Pro (late 2016) at 1000 rps: ./bombardier -c 1000 -l -d 30s -r 1000 -t 1s http://127.0.0.1:3002

Benchmark 1k rps

BurstyRateLimiter from rate-limiter-flexible

numberOfUsers = 100

Statistics Avg Stdev Max
Reqs/sec 1000.25 170.60 1727.79
Latency 2.81ms 1.40ms 32.12ms
Latency Distribution
50% 2.78ms
75% 4.72ms
90% 5.55ms
95% 6.40ms
99% 8.57ms
HTTP codes:
1xx - 0, 2xx - 6995, 3xx - 0, 4xx - 23012, 5xx - 0

numberOfUsers = 500

Statistics Avg Stdev Max
Reqs/sec 1001.69 256.35 5400.52
Latency 5.26ms 3.48ms 101.83ms
Latency Distribution
50% 4.85ms
75% 5.82ms
90% 7.42ms
95% 8.11ms
99% 11.28ms
HTTP codes:
1xx - 0, 2xx - 25392, 3xx - 0, 4xx - 4608, 5xx - 0

numberOfUsers = 1000

Statistics Avg Stdev Max
Reqs/sec 997.50 205.85 1812.13
Latency 5.23ms 1.43ms 33.56ms
Latency Distribution
50% 4.89ms
75% 5.65ms
90% 7.01ms
95% 7.87ms
99% 11.64ms
HTTP codes:
1xx - 0, 2xx - 29779, 3xx - 0, 4xx - 230, 5xx - 0

numberOfUsers = 5000

Statistics Avg Stdev Max
Reqs/sec 998.30 207.06 2072.28
Latency 4.64ms 797.55us 26.36ms
Latency Distribution
50% 4.54ms
75% 5.17ms
90% 5.84ms
95% 6.29ms
99% 7.38ms
HTTP codes:
1xx - 0, 2xx - 30009, 3xx - 0, 4xx - 0, 5xx - 0

numberOfUsers = 10000

Statistics Avg Stdev Max
Reqs/sec 995.38 301.15 2422.79
Latency 5.06ms 1.22ms 44.77ms
Latency Distribution
50% 4.97ms
75% 5.50ms
90% 6.07ms
95% 6.42ms
99% 7.43ms
HTTP codes:
1xx - 0, 2xx - 30004, 3xx - 0, 4xx - 0, 5xx - 0

numberOfUsers = 25000

Statistics Avg Stdev Max
Reqs/sec 995.03 250.79 2568.23
Latency 4.94ms 1.89ms 59.15ms
Latency Distribution
50% 4.79ms
75% 5.31ms
90% 5.89ms
95% 6.30ms
99% 8.60ms
HTTP codes:
1xx - 0, 2xx - 30007, 3xx - 0, 4xx - 0, 5xx - 0

numberOfUsers = 50000

Statistics Avg Stdev Max
Reqs/sec 996.21 295.37 2231.47
Latency 4.98ms 1.41ms 45.06ms
Latency Distribution
50% 4.89ms
75% 5.39ms
90% 5.95ms
95% 6.34ms
99% 7.95ms
HTTP codes:
1xx - 0, 2xx - 30010, 3xx - 0, 4xx - 0, 5xx - 0

TokenBucket from hyacinth

numberOfUsers = 100

Statistics Avg Stdev Max
Reqs/sec 999.56 125.25 1571.39
Latency 5.27ms 630.14us 15.88ms
Latency Distribution
50% 5.31ms
75% 5.93ms
90% 6.45ms
95% 6.68ms
99% 7.15ms
HTTP codes:
1xx - 0, 2xx - 6601, 3xx - 0, 4xx - 23409, 5xx - 0

numberOfUsers = 500

Statistics Avg Stdev Max
Reqs/sec 1004.58 141.39 2240.13
Latency 5.37ms 0.97ms 39.23ms
Latency Distribution
50% 5.33ms
75% 5.92ms
90% 6.56ms
95% 6.86ms
99% 7.54ms
HTTP codes:
1xx - 0, 2xx - 28326, 3xx - 0, 4xx - 1684, 5xx - 0

numberOfUsers = 1000

Statistics Avg Stdev Max
Reqs/sec 994.39 312.01 1697.09
Latency 5.27ms 677.74us 21.18ms
Latency Distribution
50% 5.27ms
75% 5.89ms
90% 6.44ms
95% 6.67ms
99% 7.28ms
HTTP codes:
1xx - 0, 2xx - 30007, 3xx - 0, 4xx - 3, 5xx - 0

numberOfUsers = 5000

Statistics Avg Stdev Max
Reqs/sec 1005.06 139.64 3447.78
Latency 5.32ms 2.97ms 81.50ms
Latency Distribution
50% 5.05ms
75% 5.69ms
90% 6.35ms
95% 6.81ms
99% 10.10ms
HTTP codes:
1xx - 0, 2xx - 30008, 3xx - 0, 4xx - 0, 5xx - 0

numberOfUsers = 10000

Statistics Avg Stdev Max
Reqs/sec 999.08 186.53 2804.92
Latency 5.14ms 1.28ms 50.85ms
Latency Distribution
50% 5.12ms
75% 5.76ms
90% 6.25ms
95% 6.48ms
99% 7.21ms
HTTP codes:
1xx - 0, 2xx - 30010, 3xx - 0, 4xx - 0, 5xx - 0

numberOfUsers = 25000

Statistics Avg Stdev Max
Reqs/sec 997.65 193.90 1746.05
Latency 5.01ms 561.44us 16.93ms
Latency Distribution
50% 5.00ms
75% 5.60ms
90% 6.11ms
95% 6.35ms
99% 6.96ms
HTTP codes:
1xx - 0, 2xx - 30010, 3xx - 0, 4xx - 0, 5xx - 0

numberOfUsers = 50000

Statistics Avg Stdev Max
Reqs/sec 998.58 167.49 1920.22
Latency 5.16ms 1.28ms 36.14ms
Latency Distribution
50% 5.11ms
75% 5.76ms
90% 6.26ms
95% 6.51ms
99% 7.52ms
HTTP codes:
1xx - 0, 2xx - 30010, 3xx - 0, 4xx - 0, 5xx - 0
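The 2xx counts above can be roughly sanity-checked: each simulated user id can pass at most (sustained rate × 30 s + burst allowance) requests during the 30-second run, capped by the 30 000 requests bombardier sends in total. This is an upper-bound model only (it assumes the burst window refills three times in 30 s for BurstyRateLimiter and once for the token bucket, which are my assumptions, and it ignores the random, uneven spread of requests over user ids):

```javascript
// Upper-bound model for successful (2xx) requests in a 30 s run at 1000 rps.
const TOTAL_REQUESTS = 1000 * 30; // bombardier sends 30 000 requests total

function max2xx(numberOfUsers, sustainedPerSec, burstPoints, burstWindows) {
  // Per-user cap: sustained allowance over 30 s plus burst points per refill window.
  const perUserCap = sustainedPerSec * 30 + burstPoints * burstWindows;
  return Math.min(TOTAL_REQUESTS, numberOfUsers * perUserCap);
}

// BurstyRateLimiter: 2/s sustained, 5 burst points per 10 s window (assumed 3 windows in 30 s).
console.log(max2xx(100, 2, 5, 3)); // 7500 — observed 6995 2xx
// hyacinth token bucket: 2/s sustained, bucket of 7 extra tokens spent once.
console.log(max2xx(100, 2, 7, 1)); // 6700 — observed 6601 2xx
// From 500 users up the per-user cap stops binding, matching the shrinking 4xx counts.
console.log(max2xx(500, 2, 5, 3)); // 30000
```

Observed 2xx counts sit somewhat below the caps because traffic is distributed over user ids randomly, so some ids never exhaust their allowance while others are throttled.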
