@initcron · created May 29, 2024 03:06
cluster.yaml — EKS cluster config with two managed node groups. Launch an individual node group with: eksctl create nodegroup -f cluster.yaml --include=ng-1-workers
apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig

metadata:
  name: eks-cluster-01
  region: ap-southeast-1

# Reuse an existing VPC and its public subnets instead of letting eksctl create one.
vpc:
  id: "vpc-0f001c1415b1ae568"
  subnets:
    public:
      apsoutheast1a:
        id: subnet-016f8559ad5d152b6
      apsoutheast1b:
        id: subnet-084e0206b2b44a90a
      apsoutheast1c:
        id: subnet-042f8e43e3b2d0148

managedNodeGroups:
  - name: ng-1-workers
    labels: { role: workers }
    instanceType: t3.small
    desiredCapacity: 1
    minSize: 1
    maxSize: 5
    maxPodsPerNode: 17
    ssh:
      allow: true
      publicKeyName: eks-spore
    # Tags following the Cluster Autoscaler auto-discovery convention.
    tags:
      k8s.io/cluster-autoscaler/enabled: "true"
      k8s.io/cluster-autoscaler/eks-cluster-01: "owned"
    updateConfig:
      maxUnavailable: 1   # replace at most one node at a time during rolling updates
  - name: ng-2-workers
    labels: { role: workers }
    instanceType: t3.medium
    desiredCapacity: 2
    minSize: 1
    maxSize: 4
    maxPodsPerNode: 17
    ssh:
      allow: true
      publicKeyName: eks-spore
    tags:
      k8s.io/cluster-autoscaler/enabled: "true"
      k8s.io/cluster-autoscaler/eks-cluster-01: "owned"
    updateConfig:
      maxUnavailable: 1

iam:
  withOIDC: true   # enable the OIDC provider, needed for IAM roles for service accounts
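
One possible flow for using this file, assuming it is saved as cluster.yaml as the description suggests: create the control plane first, then bring up node groups selectively with --include.

  # create the cluster control plane only, skipping the node groups
  eksctl create cluster -f cluster.yaml --without-nodegroup

  # launch a single node group from the same config
  eksctl create nodegroup -f cluster.yaml --include=ng-1-workers

  # verify what was created
  eksctl get nodegroup --cluster eks-cluster-01 --region ap-southeast-1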
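The k8s.io/cluster-autoscaler/* tags match the Cluster Autoscaler's auto-discovery convention. As a sketch (not part of this gist), an autoscaler deployment would discover these groups with flags along the lines of:

  --cloud-provider=aws
  --node-group-auto-discovery=asg:tag=k8s.io/cluster-autoscaler/enabled,k8s.io/cluster-autoscaler/eks-cluster-01

Note that auto-discovery reads tags from the underlying Auto Scaling groups; for managed node groups, eksctl may need propagateASGTags: true set on each node group so the tags above are copied through to the ASGs.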