I ran Apache Bench like so:
ab -n 1000 -c 100 'http://127.0.0.1:8000/'
Server Software: uvicorn
Server Hostname: 127.0.0.1
Server Port: 8000
Document Path: /
Document Length: 13 bytes
Concurrency Level: 100
Time taken for tests: 0.130 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 132000 bytes
HTML transferred: 13000 bytes
Requests per second: 7672.07 [#/sec] (mean)
Time per request: 13.034 [ms] (mean)
Time per request: 0.130 [ms] (mean, across all concurrent requests)
Transfer rate: 988.98 [Kbytes/sec] received
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    2   1.1      1       6
Processing:     2   11   6.5      9      36
Waiting:        1    9   6.0      7      31
Total:          4   13   6.5     10      37
Percentage of the requests served within a certain time (ms)
50% 10
66% 13
75% 15
80% 16
90% 25
95% 29
98% 30
99% 34
100% 37 (longest request)
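As a sanity check, ab's derived figures hang together: requests per second is completed requests divided by total time, the first "Time per request" is concurrency divided by RPS, and the second is total time divided by request count. A quick sketch with the first run's numbers (the small gap versus ab's 7672.07 is because ab uses its unrounded internal timer, while the report rounds "Time taken for tests" to 0.130 s):

```python
# Recompute ab's derived metrics from the first run's raw figures.
n_requests = 1000
concurrency = 100
total_seconds = 0.130  # "Time taken for tests" (rounded in the report)

rps = n_requests / total_seconds                    # report: 7672.07 #/sec
mean_ms = concurrency / rps * 1000                  # report: 13.034 ms (mean)
across_all_ms = total_seconds / n_requests * 1000   # report: 0.130 ms

print(round(rps), round(mean_ms, 1), round(across_all_ms, 3))
```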
Server Hostname: 127.0.0.1
Server Port: 8000
Document Path: /
Document Length: 17 bytes
Concurrency Level: 100
Time taken for tests: 0.316 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 107000 bytes
HTML transferred: 17000 bytes
Requests per second: 3168.34 [#/sec] (mean)
Time per request: 31.562 [ms] (mean)
Time per request: 0.316 [ms] (mean, across all concurrent requests)
Transfer rate: 331.07 [Kbytes/sec] received
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    2   1.4      2       9
Processing:     5   28   6.9     29      43
Waiting:        2   24   6.4     25      39
Total:         10   30   6.8     31      45
Percentage of the requests served within a certain time (ms)
50% 31
66% 33
75% 35
80% 36
90% 40
95% 42
98% 43
99% 43
100% 45 (longest request)
Server Software:
Server Hostname: 127.0.0.1
Server Port: 8000
Document Path: /
Document Length: 17 bytes
Concurrency Level: 100
Time taken for tests: 0.152 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 107000 bytes
HTML transferred: 17000 bytes
Requests per second: 6569.61 [#/sec] (mean)
Time per request: 15.222 [ms] (mean)
Time per request: 0.152 [ms] (mean, across all concurrent requests)
Transfer rate: 686.47 [Kbytes/sec] received
Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    2   1.9      2      11
Processing:     2   12   4.7     11      27
Waiting:        1   10   4.4      9      22
Total:          6   15   4.4     14      28
Percentage of the requests served within a certain time (ms)
50% 14
66% 16
75% 17
80% 19
90% 21
95% 23
98% 25
99% 25
100% 28 (longest request)
Running a similar test, I'm also seeing that uvicorn is faster than sanic on trivial short requests. One thing you've left out is that uvicorn also gets significantly faster with
access_log=False
My results with
-n 100000 -c 200
(total time / 95th-percentile request time / RPS)
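For reference, disabling the access log is a one-line change to how uvicorn is started. A minimal sketch (the hello-world app below is an illustrative stand-in, not the app actually benchmarked above):

```python
# Minimal ASGI "Hello, world!" app -- 13 bytes of body, matching the
# first run's Document Length. A stand-in for whatever app was tested.
async def app(scope, receive, send):
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"Hello, world!"})


def main():
    import uvicorn
    # access_log=False skips formatting and writing a log line per
    # request, which removes I/O from the hot path under load.
    uvicorn.run(app, host="127.0.0.1", port=8000, access_log=False)
```

Call main() to start the server, then point ab at http://127.0.0.1:8000/ as above.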