@membphis
Created May 31, 2019 05:45
apisix benchmark
Google Cloud, CPU: 8 cores, memory: 8 GB

1 route: 1 upstream + 2 plugins (limit-count, prometheus)

curl http://127.0.0.1:2379/v2/keys/apisix/routes/1 -X PUT -d value='
{
    "methods": ["GET"],
    "uri": "/hello",
    "id": 1,
    "plugin_config": {
        "limit-count": {
            "count": 999999999,
            "time_window": 60,
            "rejected_code": 503,
            "key": "remote_addr"
        },
        "prometheus":{}
    },
    "upstream": {
        "type": "roundrobin",
        "nodes": {
            "127.0.0.1:80": 1,
            "127.0.0.2:80": 1
        }
    }
}'
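To confirm the route was stored, the same etcd v2 keys endpoint can be read back (a sketch assuming etcd is listening on 127.0.0.1:2379 and `jq` is installed; any JSON viewer works just as well):

```shell
# Read the route back from etcd's v2 keys API and print the stored value.
# The .node.value field holds the JSON document written by the PUT above.
curl -s http://127.0.0.1:2379/v2/keys/apisix/routes/1 | jq -r '.node.value'
```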

benchmark:

1 worker

$ wrk -d 60 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   751.72us  519.65us  28.46ms   97.67%
    Req/Sec     6.81k   576.40     8.19k    71.50%
  Latency Distribution
     50%  687.00us
     75%  797.00us
     90%    0.95ms
     99%    1.50ms
  812960 requests in 1.00m, 197.66MB read
Requests/sec:  13548.22
Transfer/sec:      3.29MB

$ wrk -d 60 -c 20 -t 3 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  3 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.38ms  511.99us  25.02ms   94.30%
    Req/Sec     4.40k   339.61     5.42k    69.17%
  Latency Distribution
     50%    1.30ms
     75%    1.46ms
     90%    1.68ms
     99%    2.46ms
  787475 requests in 1.00m, 191.47MB read
Requests/sec:  13123.18
Transfer/sec:      3.19MB

2 workers

$ wrk -d 60 -c 15 -t 3 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  3 threads and 15 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   630.30us  592.27us  21.84ms   97.97%
    Req/Sec     8.60k   803.35    10.76k    73.00%
  Latency Distribution
     50%  542.00us
     75%  631.00us
     90%  769.00us
     99%    3.26ms
  1541071 requests in 1.00m, 374.69MB read
Requests/sec:  25672.15
Transfer/sec:      6.24MB

$ wrk -d 60 -c 20 -t 3 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  3 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   709.52us  542.48us  17.23ms   97.72%
    Req/Sec     8.96k   747.55    11.33k    63.67%
  Latency Distribution
     50%  627.00us
     75%  718.00us
     90%    0.86ms
     99%    2.84ms
  1605642 requests in 1.00m, 390.39MB read
Requests/sec:  26758.08
Transfer/sec:      6.51MB

4 workers

$ wrk -d 60 -c 30 -t 4 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  4 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   593.16us  532.57us  19.84ms   97.76%
    Req/Sec    12.60k   787.10    29.49k    76.47%
  Latency Distribution
     50%  521.00us
     75%  610.00us
     90%  735.00us
     99%    2.66ms
  3010956 requests in 1.00m, 732.08MB read
Requests/sec:  50099.28
Transfer/sec:     12.18MB

$ wrk -d 60 -c 30 -t 4 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  4 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   596.89us  520.54us  18.43ms   97.70%
    Req/Sec    12.45k   816.47    14.86k    71.29%
  Latency Distribution
     50%  526.00us
     75%  618.00us
     90%  748.00us
     99%    2.43ms
  2974487 requests in 1.00m, 723.21MB read
Requests/sec:  49566.19
Transfer/sec:     12.05MB
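A quick sanity check on how throughput scales with worker count, using the best Requests/sec figure from each pair of runs above (an illustrative awk sketch; the numbers are copied from the wrk output):

```shell
# Best Requests/sec per worker count for the 2-plugin route, and the
# resulting per-worker throughput. Scaling is close to linear.
awk 'BEGIN {
    rps[1] = 13548; rps[2] = 26758; rps[4] = 50099
    for (w in rps)
        printf "%d worker(s): %d rps, %d rps/worker\n", w, rps[w], rps[w] / w
}'
```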

1 route: 1 upstream + 0 plugins

curl http://127.0.0.1:2379/v2/keys/apisix/routes/1 -X PUT -d value='
{
    "methods": ["GET"],
    "uri": "/hello",
    "id": 1,
    "plugin_config": {},
    "upstream": {
        "type": "roundrobin",
        "nodes": {
            "127.0.0.1:80": 1,
            "127.0.0.2:80": 1
        }
    }
}'

benchmark:

1 worker

$ wrk -d 60 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   631.22us  557.01us  33.61ms   98.68%
    Req/Sec     8.15k   841.78    14.96k    74.27%
  Latency Distribution
     50%  569.00us
     75%  679.00us
     90%  824.00us
     99%    1.28ms
  973460 requests in 1.00m, 177.27MB read
Requests/sec:  16197.32
Transfer/sec:      2.95MB

$ wrk -d 60 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   633.87us  672.99us  36.72ms   99.17%
    Req/Sec     8.26k   741.36    15.59k    74.52%
  Latency Distribution
     50%  566.00us
     75%  668.00us
     90%  797.00us
     99%    1.23ms
  987001 requests in 1.00m, 179.74MB read
Requests/sec:  16422.67
Transfer/sec:      2.99MB

2 workers

$ wrk -d 60 -c 20 -t 3 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  3 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   548.77us  486.36us  21.25ms   98.45%
    Req/Sec    11.56k     0.90k   14.24k    74.11%
  Latency Distribution
     50%  486.00us
     75%  557.00us
     90%  670.00us
     99%    1.34ms
  2070664 requests in 1.00m, 377.08MB read
Requests/sec:  34509.82
Transfer/sec:      6.28MB

$ wrk -d 60 -c 20 -t 3 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  3 threads and 20 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   543.78us  476.62us  20.84ms   98.48%
    Req/Sec    11.65k   688.96    15.49k    75.61%
  Latency Distribution
     50%  485.00us
     75%  551.00us
     90%  652.00us
     99%    1.29ms
  2086181 requests in 1.00m, 379.90MB read
Requests/sec:  34757.43
Transfer/sec:      6.33MB

4 workers

$ wrk -d 60 -c 30 -t 4 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  4 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   512.77us  465.89us  18.01ms   97.42%
    Req/Sec    14.41k     1.14k   18.13k    69.54%
  Latency Distribution
     50%  448.00us
     75%  537.00us
     90%  664.00us
     99%    1.92ms
  3441865 requests in 1.00m, 626.78MB read
Requests/sec:  57350.01
Transfer/sec:     10.44MB

$ wrk -d 60 -c 30 -t 4 --latency http://127.0.0.1:9080/hello
Running 1m test @ http://127.0.0.1:9080/hello
  4 threads and 30 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   521.80us  525.59us  17.60ms   97.61%
    Req/Sec    14.40k     1.18k   17.49k    70.92%
  Latency Distribution
     50%  448.00us
     75%  537.00us
     90%  667.00us
     99%    2.39ms
  3440251 requests in 1.00m, 626.48MB read
Requests/sec:  57321.88
Transfer/sec:     10.44MB
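Comparing the best run of each configuration gives a rough cost for the limit-count + prometheus plugins (an illustrative awk sketch; figures are copied from the wrk output above):

```shell
# Best Requests/sec with 0 plugins vs. 2 plugins, per worker count,
# and the relative throughput drop the two plugins cost.
awk 'BEGIN {
    plain[1] = 16423; plain[2] = 34757; plain[4] = 57350
    plug[1]  = 13548; plug[2]  = 26758; plug[4]  = 50099
    for (w in plain)
        printf "%d worker(s): %.1f%% fewer rps with limit-count + prometheus\n",
               w, 100 * (plain[w] - plug[w]) / plain[w]
}'
```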
