@jamwt
Last active December 14, 2015 02:58
$ httperf --verbose --num-conns=100 --num-calls=100 --port=4000
httperf --verbose --client=0/1 --server=localhost --port=4000 --uri=/ --send-buffer=4096 --recv-buffer=16384 --num-conns=100 --num-calls=100
httperf: maximum number of open descriptors = 1024
reply-rate = 673.6
reply-rate = 611.0
reply-rate = 660.8
Maximum connect burst length: 1
Total: connections 100 requests 10000 replies 10000 test-duration 15.447 s
Connection rate: 6.5 conn/s (154.5 ms/conn, <=1 concurrent connections)
Connection time [ms]: min 114.3 avg 154.5 max 171.4 median 161.5 stddev 17.4
Connection time [ms]: connect 0.1
Connection length [replies/conn]: 100.000
Request rate: 647.4 req/s (1.5 ms/req)
Request size [B]: 62.0
Reply rate [replies/s]: min 611.0 avg 648.5 max 673.6 stddev 33.0 (3 samples)
Reply time [ms]: response 0.2 transfer 1.3
Reply size [B]: header 125.0 content 1048576.0 footer 2.0 (total 1048703.0)
Reply status: 1xx=0 2xx=10000 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.00 system 0.00 (user 0.0% system 0.0% total 0.0%)
Net I/O: 663032.9 KB/s (5431.6*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
$ httperf --verbose --num-conns=100 --num-calls=100 --port=3000
httperf --verbose --client=0/1 --server=localhost --port=3000 --uri=/ --send-buffer=4096 --recv-buffer=16384 --num-conns=100 --num-calls=100
httperf: maximum number of open descriptors = 1024
reply-rate = 789.0
reply-rate = 728.9
Maximum connect burst length: 1
Total: connections 100 requests 10000 replies 10000 test-duration 13.315 s
Connection rate: 7.5 conn/s (133.2 ms/conn, <=1 concurrent connections)
Connection time [ms]: min 93.7 avg 133.2 max 150.6 median 137.5 stddev 13.3
Connection time [ms]: connect 0.1
Connection length [replies/conn]: 100.000
Request rate: 751.0 req/s (1.3 ms/req)
Request size [B]: 62.0
Reply rate [replies/s]: min 728.9 avg 758.9 max 789.0 stddev 42.5 (2 samples)
Reply time [ms]: response 0.1 transfer 1.3
Reply size [B]: header 95.0 content 1048576.0 footer 2.0 (total 1048673.0)
Reply status: 1xx=0 2xx=10000 3xx=0 4xx=0 5xx=0
CPU time [s]: user 0.00 system 0.00 (user 0.0% system 0.0% total 0.0%)
Net I/O: 769162.6 KB/s (6301.0*10^6 bps)
Errors: total 0 client-timo 0 socket-timo 0 connrefused 0 connreset 0
Errors: fd-unavail 0 addrunavail 0 ftab-full 0 other 0
{-# LANGUAGE OverloadedStrings #-}
-- Uses Warp + Wai + Scotty
import qualified Data.Text.Lazy as T
import Web.Scotty

main :: IO ()
main = scotty 3000 $ do
    get "/" $ text big
  where
    big = T.pack $ replicate (1024 * 1024) 'X'
jamwt commented Feb 22, 2013

Note: this assumes the Go and Node code mean to return those 'X's as UTF-8 (the text action is going to do a UTF-8 encoding pass over the 1 MB payload on every request). If we skipped that step and returned the data as-is, without worrying about Unicode, this would be even faster (application/octet-stream style).
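A minimal sketch of that octet-stream variant (hypothetical, not part of the gist): Scotty's raw action sends a lazy ByteString as-is, with no encoding pass, and setHeader marks the response as application/octet-stream.

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Hypothetical variant of the server above: `raw` sends the lazy
-- ByteString bytes unchanged, so no UTF-8 encoding pass happens.
import qualified Data.ByteString.Lazy.Char8 as BL
import Web.Scotty

main :: IO ()
main = scotty 3000 $
    get "/" $ do
        setHeader "Content-Type" "application/octet-stream"
        raw big
  where
    big = BL.replicate (1024 * 1024) 'X'  -- same 1 MiB 'X' payload
```

Whether this closes the gap in practice would need another httperf run; the sketch only shows where the encoding step is skipped.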
