@maxogden maxogden/readme.md
Last active Aug 8, 2016


simple 4mb buffer proxy benchmarks

the goal: to do fast virtual host routing, e.g. to have a single process on a machine listening on port 80 and proxying data based on HTTP Host to other non-port-80 web processes on the same machine

many people use nginx for this because nginx is currently faster than node for data-heavy applications (see below)

about these benchmarks

they use the JS proxies from https://github.com/substack/bouncy/tree/master/bench

and this node server: https://github.com/substack/bouncy/blob/master/bench/bench.js#L11-L16

each result was generated with ab -n 5000 -c 10

nginx used is v1.4.4 w/ 256 worker connections and 1 worker process

benchmarks were done on a MacBook Air, all against local http servers

no proxy (control)

Time taken for tests:   13.915 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      20971895000 bytes
HTML transferred:       20971520000 bytes
Requests per second:    359.34 [#/sec] (mean)
Time per request:       27.829 [ms] (mean)
Time per request:       2.783 [ms] (mean, across all concurrent requests)
Transfer rate:          1471865.46 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   0.9      1      26
Processing:    10   27   4.3     26      57
Waiting:        0    1   1.4      1      26
Total:         11   28   4.6     27      59

Percentage of the requests served within a certain time (ms)
  50%     27
  66%     28
  75%     29
  80%     30
  90%     32
  95%     35
  98%     42
  99%     48
 100%     59 (longest request)

nginx proxy_pass

Time taken for tests:   37.414 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      20972000000 bytes
HTML transferred:       20971520000 bytes
Requests per second:    133.64 [#/sec] (mean)
Time per request:       74.827 [ms] (mean)
Time per request:       7.483 [ms] (mean, across all concurrent requests)
Transfer rate:          547405.79 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.3      0       5
Processing:    13   74  23.7     70     469
Waiting:        4   23   9.2     22     109
Total:         13   75  23.7     70     470

Percentage of the requests served within a certain time (ms)
  50%     70
  66%     73
  75%     76
  80%     79
  90%     90
  95%    104
  98%    128
  99%    146
 100%    470 (longest request)

bouncy

Time taken for tests:   55.968 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      20976089379 bytes
HTML transferred:       20975714304 bytes
Requests per second:    89.34 [#/sec] (mean)
Time per request:       111.936 [ms] (mean)
Time per request:       11.194 [ms] (mean, across all concurrent requests)
Transfer rate:          366003.19 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.5      0      22
Processing:    44  111  25.1    107     358
Waiting:       10   31  14.9     28     273
Total:         44  112  25.2    107     358

Percentage of the requests served within a certain time (ms)
  50%    107
  66%    118
  75%    126
  80%    132
  90%    144
  95%    154
  98%    166
  99%    173
 100%    358 (longest request)

node-http-proxy

Time taken for tests:   74.141 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      20971895000 bytes
HTML transferred:       20971520000 bytes
Requests per second:    67.44 [#/sec] (mean)
Time per request:       148.281 [ms] (mean)
Time per request:       14.828 [ms] (mean, across all concurrent requests)
Transfer rate:          276236.99 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.2      0       2
Processing:    72  148  28.7    139     385
Waiting:       24   59  19.8     54     245
Total:         73  148  28.7    139     385

Percentage of the requests served within a certain time (ms)
  50%    139
  66%    158
  75%    170
  80%    175
  90%    188
  95%    197
  98%    209
  99%    218
 100%    385 (longest request)

hostproxy

Time taken for tests:   46.905 seconds
Complete requests:      5000
Failed requests:        0
Write errors:           0
Total transferred:      20971895000 bytes
HTML transferred:       20971520000 bytes
Requests per second:    106.60 [#/sec] (mean)
Time per request:       93.811 [ms] (mean)
Time per request:       9.381 [ms] (mean, across all concurrent requests)
Transfer rate:          436631.96 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.1      0       3
Processing:    58   94  26.7     86     534
Waiting:        4   28  23.5     24     466
Total:         58   94  26.8     86     534

Percentage of the requests served within a certain time (ms)
  50%     86
  66%     92
  75%     98
  80%    104
  90%    124
  95%    133
  98%    142
  99%    153
 100%    534 (longest request)
@maxogden commented Jan 8, 2014:

here's the nginx.conf I used:

events {
  worker_connections  256;
}

error_log /Users/max/Desktop/bench/error.log debug;

pid /Users/max/Desktop/bench/nginx.pid;

http {

  server {
    access_log  /Users/max/Desktop/bench/access.log;

    listen          8886;
    server_name     _;

    index           index.html index.htm;
    root            html;
    location / {
      proxy_pass http://foo;
    }
  }
  upstream foo {
    server 127.0.0.1:7501;
  }
}

@maxogden commented Jan 8, 2014:

source for hostproxy bench:

var hostproxy = require('hostproxy')
var net = require('net')

// returns a proxy that, for each incoming connection, opens a plain
// TCP connection to the single local backend (the host argument is
// ignored here since the bench only has one backend)
module.exports = function (port) {
  return hostproxy(function (host) {
    return net.connect(port, 'localhost')
  })
}
@indexzero commented Jan 8, 2014:

What version of node was this on?

@maxogden commented Jan 9, 2014:

@indexzero latest stable, 0.10.24

@mikeal commented Jan 9, 2014:

what are you proxying to?

IMO, the best thing would be a TCP server that just returns the HTTP response as a static buffer. Same with the client.

@mikeal commented Jan 9, 2014:

if you look at the code in hostproxy it isn't really optimized yet at all. there are lots of unnecessary buffer slices and such, but i wonder if that's even where it's spending its time. might want to get @trevnorris in here.

@trevnorris commented Jan 9, 2014:

Not sure what is supposed to be happening here. Just taking data from point A and sending it to point B?

If that's the case then I'll write up a quick module that just opens up two points and dumps the input from one fd to the other.
