@mhausenblas · Created December 20, 2019
EKS on Fargate vertical scale testing

Using the official resource-consumer as a stress tester.

Setup

Using an EKS on Fargate cluster in eu-west-1:

$ kubectl version
Client Version: version.Info{Major:"1", Minor:"16", GitVersion:"v1.16.3", GitCommit:"b3cbbae08ec52a7fc73d334838e18d17e8512749", GitTreeState:"clean", BuildDate:"2019-11-14T04:25:00Z", GoVersion:"go1.12.13", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"14+", GitVersion:"v1.14.8-eks-b8860f", GitCommit:"b8860f6c40640897e52c143f1b9f011a503d6e46", GitTreeState:"clean", BuildDate:"2019-11-25T00:55:38Z", GoVersion:"go1.12.10", Compiler:"gc", Platform:"linux/amd64"}

The test runs are performed as follows:

# launch the stress tester app:
kubectl run stresstest \
        --image=gcr.io/kubernetes-e2e-test-images/resource-consumer:1.4

# create service for stress tester app:
kubectl expose deployment stresstest \
        --port=8080 --target-port=8080

# make stress tester app available locally:
kubectl port-forward svc/stresstest 8888:8080

# launch stress test, consuming 800MB over 60sec:
curl localhost:8888/ConsumeMem \
     --data "megabytes=800&durationSec=60" 

# launch stress test, consuming 5GB over 60sec:
curl localhost:8888/ConsumeMem \
     --data "megabytes=5000&durationSec=60"


# exec into pod:
kubectl exec -it stresstest-7fdbff75cf-llb2k -- sh

$ cat /proc/self/cgroup
11:memory:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
10:pids:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
9:cpu,cpuacct:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
8:cpuset:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
7:hugetlb:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
6:blkio:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
5:net_cls,net_prio:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
4:devices:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
3:perf_event:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
2:freezer:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
1:name=systemd:/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb

$ cat /sys/fs/cgroup/memory/ecs/163cf68d423448aeabbc049abc9441be/163cf68d423448aeabbc049abc9441besupervisord/kubepods/besteffort/pod9b6e0e87-2312-11ea-b782-0608babfc524/ac6b2d0e8adca65bce1020c4edd4eca35db3b95f5763f4f3f4a5d423dc69c8fb
No such file or directory

Note the besteffort segment in the paths above: the pod runs without resource requests or limits. Also, the full host-side path is not visible from inside the container, since the cgroup filesystem there is mounted relative to the container's own cgroup. The usage counter can instead be read directly from the mount root:

# sample memory usage every 6s:
$ while true ; do cat /sys/fs/cgroup/memory/memory.usage_in_bytes ; sleep 6 ; done
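
To see the effective memory cap directly (not captured in the original runs), the corresponding cgroup limit file could be read the same way:

$ cat /sys/fs/cgroup/memory/memory.limit_in_bytes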

Observations

100MB run

For the 100MB memory stress test, that is, using curl localhost:8888/ConsumeMem --data "megabytes=100&durationSec=60", we see the following memory usage in the pod (from /sys/fs/cgroup/memory/memory.usage_in_bytes):

2007040
98238464
107618304
107491328
107487232
107495424
107614208
107618304
107487232
107499520
107618304
2199552
2289664

This is the baseline, maxing out at ca. 102MB.
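
The counter values are raw bytes; a quick conversion of the peak sample (a sanity check, not part of the original log) confirms the ~102MB figure:

# convert the peak sample from bytes to MiB:
$ echo $(( 107618304 / 1024 / 1024 ))
102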

400MB run

For the 400MB memory stress test, that is, using curl localhost:8888/ConsumeMem --data "megabytes=400&durationSec=60", we see the following memory usage in the pod (from /sys/fs/cgroup/memory/memory.usage_in_bytes):

2289664
389632000
423837696
424050688
424517632
424284160
424189952
424382464
424153088
424292352
6344704
6230016

Also as expected, maxing out at ca. 404MB.

500MB run

For the 500MB memory stress test, that is, using curl localhost:8888/ConsumeMem --data "megabytes=500&durationSec=60", we see the following memory usage in the pod (from /sys/fs/cgroup/memory/memory.usage_in_bytes):

5758976
5754880
306044928
*BOOM*

The process gets OOM-killed and the pod restarted at BOOM; the last data point before that was ca. 291MB. This is consistent with the pod running as besteffort (no memory requests, as the cgroup paths above show), so Fargate falls back to a small default task size of around 0.5GB.
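
To confirm the kill from the Kubernetes side, one could also inspect the container's last state (a hypothetical check, not captured in the original runs; the OOMKilled output shown is the expected value):

$ kubectl get pod stresstest-7fdbff75cf-llb2k \
      -o jsonpath='{.status.containerStatuses[0].lastState.terminated.reason}'
OOMKilled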

Repeating the experiment with a refined script that samples every second confirms that, as usage approaches the 500MB boundary, the process gets OOM-killed:

$ while true ; do printf '%s: %s bytes\n' $(date +%s) $(cat /sys/fs/cgroup/memory/memory.usage_in_bytes) ; sleep 1 ; done

Yields:

1576842590: 2187264 bytes
1576842591: 2379776 bytes
1576842592: 2383872 bytes
1576842593: 128122880 bytes
1576842594: 249790464 bytes
1576842595: 323055616 bytes
1576842596: 370008064 bytes
1576842597: 409317376 bytes
*BOOM*
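
A plausible follow-up, sketched here but not part of the original test runs: EKS on Fargate derives the task size from the pod's resource requests, so rerunning the stress test with an explicit memory request should move the OOM boundary upwards. The stresstest-2g name and the 2Gi figure below are illustrative assumptions:

# hypothetical re-run with an explicit memory request (values are assumptions):
cat <<EOF | kubectl apply -f -
apiVersion: apps/v1
kind: Deployment
metadata:
  name: stresstest-2g
spec:
  replicas: 1
  selector:
    matchLabels:
      app: stresstest-2g
  template:
    metadata:
      labels:
        app: stresstest-2g
    spec:
      containers:
      - name: stresstest
        image: gcr.io/kubernetes-e2e-test-images/resource-consumer:1.4
        resources:
          requests:
            memory: 2Gi
EOF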