AWS EC2
Image: Ubuntu 22.04 LTS, 64-bit (x86)
Instance: t2.micro
Location: ap-south-1 (Mumbai)
vCPUs: 1 | RAM: 1 GB

This is a Python vs Go HTTP benchmark. For Python I will use Flask and FastAPI served by Gunicorn; for Go, the built-in net/http package. I will use hey to generate the HTTP load.
code for api.go

package main

import (
	"fmt"
	"net/http"
)

func main() {
	http.HandleFunc("/", HelloServer)
	http.ListenAndServe(":8000", nil)
}

// HelloServer echoes the request path back, so GET / returns "Hello, !"
// (8 bytes, which matches the Size/request in the results below).
func HelloServer(w http.ResponseWriter, r *http.Request) {
	fmt.Fprintf(w, "Hello, %s!", r.URL.Path[1:])
}
To build and run:

go build -o api api.go && ./api
Load test with 250 concurrent connections and 5000 total requests (the Go server listens on :8000):

root@mayank:~/golang# ./hey -c 250 -n 5000 http://localhost:8000/
Summary:
Total: 2.9152 secs
Slowest: 1.3516 secs
Fastest: 0.0384 secs
Average: 0.0917 secs
Requests/sec: 1715.1451
Total data: 40000 bytes
Size/request: 8 bytes
Response time histogram:
0.038 [1] |
0.170 [4363] |■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
0.301 [120] |■
0.432 [434] |■■■■
0.564 [12] |
0.695 [56] |■
0.826 [1] |
0.958 [2] |
1.089 [3] |
1.220 [0] |
1.352 [8] |
Latency distribution:
10% in 0.0401 secs
25% in 0.0409 secs
50% in 0.0435 secs
75% in 0.0651 secs
90% in 0.3042 secs
95% in 0.3440 secs
99% in 0.6570 secs
Details (average, fastest, slowest):
DNS+dialup: 0.0016 secs, 0.0384 secs, 1.3516 secs
DNS-lookup: 0.0000 secs, 0.0000 secs, 0.0000 secs
req write: 0.0000 secs, 0.0000 secs, 0.0006 secs
resp wait: 0.0861 secs, 0.0384 secs, 1.3324 secs
resp read: 0.0000 secs, 0.0000 secs, 0.0012 secs
Status code distribution:
[200] 5000 responses
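As a sanity check, the reported Requests/sec is just total requests divided by total wall time. A quick check with the numbers from the summary above:

```python
# Throughput check for the Go run, using figures from the hey summary.
total_requests = 5000
total_secs = 2.9152  # "Total" from the summary

rps = total_requests / total_secs
print(f"{rps:.2f}")  # ~1715.15, matching the reported 1715.1451 up to rounding
```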
Flask==2.2.2
gunicorn==20.1.0
code for api.py

from flask import Flask

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello, World!'
Starting Gunicorn with 3 workers and 1 thread each, following the (2 × CPU) + 1 rule:

gunicorn --workers=3 --threads=1 -b :80 api:app
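The worker count comes from Gunicorn's common rule of thumb, (2 × CPU) + 1. A minimal sketch that derives it from the machine's CPU count:

```python
import multiprocessing

def gunicorn_workers(cpu_count: int) -> int:
    # Gunicorn's suggested default: (2 x num_cores) + 1
    return (2 * cpu_count) + 1

print(gunicorn_workers(1))  # t2.micro has 1 vCPU -> 3 workers
print(gunicorn_workers(multiprocessing.cpu_count()))  # for the current host
```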
Load test with 250 concurrent connections and 5000 total requests (Gunicorn binds to :80):

root@mayank:~/python-flask# ./hey -c 250 -n 5000 http://localhost/
Summary:
Total: 12.7446 secs
Slowest: 4.3519 secs
Fastest: 0.0780 secs
Average: 0.5286 secs
Requests/sec: 392.3238
Total data: 65000 bytes
Size/request: 13 bytes
Response time histogram:
0.078 [1] |
0.505 [3901] |■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
0.933 [224] |■■
1.360 [491] |■■■■■
1.788 [315] |■■■
2.215 [0] |
2.642 [2] |
3.070 [0] |
3.497 [55] |■
3.924 [4] |
4.352 [7] |
Latency distribution:
10% in 0.1143 secs
25% in 0.2048 secs
50% in 0.4065 secs
75% in 0.4622 secs
90% in 1.3421 secs
95% in 1.3787 secs
99% in 3.1049 secs
Details (average, fastest, slowest):
DNS+dialup: 0.2643 secs, 0.0780 secs, 4.3519 secs
DNS-lookup: 0.0000 secs, 0.0000 secs, 0.0000 secs
req write: 0.0000 secs, 0.0000 secs, 0.0005 secs
resp wait: 0.2639 secs, 0.0390 secs, 2.3810 secs
resp read: 0.0002 secs, 0.0000 secs, 0.0767 secs
Status code distribution:
[200] 5000 responses
fastapi==0.92.0
gunicorn==20.1.0
uvicorn==0.20.0
code for api.py

from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def root():
    return {"message": "Hello World"}
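The Size/request values hey reports line up with the bodies each endpoint actually sends: Go's handler writes "Hello, !" for GET / (8 bytes), Flask returns "Hello, World!" (13 bytes), and FastAPI serialises the dict to compact JSON with no spaces (25 bytes). A quick check:

```python
import json

# What each endpoint returns for GET /
go_body = "Hello, !"  # r.URL.Path[1:] is empty for "/"
flask_body = "Hello, World!"
# FastAPI emits compact JSON (no spaces around separators)
fastapi_body = json.dumps({"message": "Hello World"}, separators=(",", ":"))

print(len(go_body), len(flask_body), len(fastapi_body))  # 8 13 25
```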
Starting Gunicorn with 3 workers and 1 thread each ((2 × CPU) + 1), using a Uvicorn worker class so Gunicorn can serve the ASGI app:

gunicorn --workers=3 --threads=1 -k uvicorn.workers.UvicornH11Worker -b :80 api:app
Load test with 250 concurrent connections and 5000 total requests (Gunicorn binds to :80):

root@mayank:~/python-fastapi# ./hey -c 250 -n 5000 http://localhost/
Summary:
Total: 5.3730 secs
Slowest: 4.3693 secs
Fastest: 0.0389 secs
Average: 0.1493 secs
Requests/sec: 930.5720
Total data: 125000 bytes
Size/request: 25 bytes
Response time histogram:
0.039 [1] |
0.472 [4717] |■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■■
0.905 [171] |■
1.338 [67] |■
1.771 [22] |
2.204 [13] |
2.637 [6] |
3.070 [1] |
3.503 [1] |
3.936 [0] |
4.369 [1] |
Latency distribution:
10% in 0.0412 secs
25% in 0.0424 secs
50% in 0.0475 secs
75% in 0.1402 secs
90% in 0.3963 secs
95% in 0.5025 secs
99% in 1.2908 secs
Details (average, fastest, slowest):
DNS+dialup: 0.0020 secs, 0.0389 secs, 4.3693 secs
DNS-lookup: 0.0000 secs, 0.0000 secs, 0.0000 secs
req write: 0.0000 secs, 0.0000 secs, 0.0006 secs
resp wait: 0.1411 secs, 0.0388 secs, 4.3570 secs
resp read: 0.0023 secs, 0.0000 secs, 0.8092 secs
Status code distribution:
[200] 5000 responses
Go won: roughly 1715 req/s against FastAPI's 930 and Flask's 392 on the same 1-vCPU box. Bye.
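Pulling the three summaries together (numbers copied from the hey output above), a small script to compare throughput relative to Flask:

```python
# Requests/sec from the three hey summaries above
results = {
    "go net/http": 1715.1451,
    "fastapi": 930.5720,
    "flask": 392.3238,
}

baseline = results["flask"]
for name, rps in sorted(results.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} {rps:9.1f} req/s  ({rps / baseline:.1f}x Flask)")
```

Go comes out about 4.4x Flask and FastAPI about 2.4x, consistent with the latency distributions above.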