@janeczku
Last active January 15, 2024 09:31
Banzai Cluster Logging Elasticsearch Example
apiVersion: logging.banzaicloud.io/v1beta1
kind: ClusterFlow
metadata:
  name: archive
spec:
  match:
    - select: {}
  outputRefs:
    - s3
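
The `s3` entry in `outputRefs` must match the name of a `ClusterOutput`; none with that name appears in this gist. A minimal hypothetical stub of what it could look like — the bucket, region, and buffer values below are placeholders, not part of the original:

apiVersion: logging.banzaicloud.io/v1beta1
kind: ClusterOutput
metadata:
  name: s3                      # must equal the outputRefs entry above
  namespace: cattle-logging-system
spec:
  s3:
    s3_bucket: example-bucket   # placeholder
    s3_region: eu-central-1     # placeholder
    buffer:
      timekey: 10m
      timekey_wait: 30s
      timekey_use_utc: true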
---
apiVersion: logging.banzaicloud.io/v1beta1
kind: ClusterOutput
metadata:
  name: cluster-output-es
  namespace: cattle-logging-system
  labels:
    someLabel: foo
spec:
  elasticsearch:
    host: elasticsearch-elasticsearch-cluster.default.svc.cluster.local
    port: 443
    scheme: https
    ssl_verify: false
    user: username
    password:
      valueFrom:
        secretKeyRef:
          name: es-log-writer
          key: token
    ca_file:
      valueFrom:
        secretKeyRef:
          name: es-log-writer
          key: ca.pem
    logstash_format: true
    logstash_prefix: k8s-cluster-foo
    logstash_dateformat: "%Y-%m-%d"  # strftime pattern; "YYYY-MM-DD" would be emitted literally
    include_tag_key: true
    reload_connections: false
    reconnect_on_error: true
    reload_on_failure: true
    buffer:
      type: file
      chunk_limit_size: 4M          # determines the HTTP payload size per bulk request
      total_limit_size: 1024MB      # maximum total buffer size on disk
      flush_mode: interval
      flush_interval: 10s
      flush_thread_count: 2         # parallel flush threads
      overflow_action: block
      retry_forever: true           # never discard buffer chunks
      retry_type: exponential_backoff
      retry_max_interval: 60s
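
Both `secretKeyRef`s above expect a Secret named `es-log-writer` in the same namespace, carrying `token` and `ca.pem` keys. A sketch of that Secret — the credential and certificate values are placeholders:

apiVersion: v1
kind: Secret
metadata:
  name: es-log-writer
  namespace: cattle-logging-system
type: Opaque
stringData:
  token: <elasticsearch-password>   # placeholder
  ca.pem: |                         # placeholder CA certificate
    -----BEGIN CERTIFICATE-----
    ...
    -----END CERTIFICATE-----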
---
apiVersion: logging.banzaicloud.io/v1beta1
kind: ClusterFlow
metadata:
  name: log-all-elastic
  namespace: cattle-logging-system
  labels:
    someLabel: foo
spec:
  filters:
    - tag_normaliser: {}
  selectors: {}
  globalOutputRefs:
    - cluster-output-es
---
apiVersion: logging.banzaicloud.io/v1beta1
kind: Output
metadata:
  name: cluster-output-s3
  namespace: cattle-logging-system
spec:
  s3:
    aws_key_id:
      valueFrom:
        secretKeyRef:
          name: s3-write-logs
          key: awsAccessKeyId
    aws_sec_key:
      valueFrom:
        secretKeyRef:
          name: s3-write-logs
          key: awsSecretAccessKey
    s3_bucket: log-archive-cluster-foo
    s3_region: eu-central-1
    path: logs/${tag}/%Y/%m/%d/
    buffer:
      timekey: 10m
      timekey_wait: 30s
      timekey_use_utc: true
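
The `aws_key_id` and `aws_sec_key` references above expect a Secret named `s3-write-logs` with `awsAccessKeyId` and `awsSecretAccessKey` keys. A sketch of that Secret — the credential values are placeholders:

apiVersion: v1
kind: Secret
metadata:
  name: s3-write-logs
  namespace: cattle-logging-system
type: Opaque
stringData:
  awsAccessKeyId: <access-key-id>          # placeholder
  awsSecretAccessKey: <secret-access-key>  # placeholder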
Raw td-agent S3 store fragment (from a Terraform template: `${...}` are template variables, `$${tag}` and `%%{...}` are escapes, and the trailing `%{ endif }` closes a conditional that starts outside this fragment):

<store>
  @type s3
  s3_bucket "${s3_bucket}"
  s3_region "${s3_region}"
  path "${s3_prefix}%Y/%m/%d/%H/$${tag}/"
  s3_object_key_format "%%{path}%%{time_slice}_%%{hex_random}_%%{index}_$${chunk_id}.%%{file_extension}"
  storage_class "${storage_class}"
  auto_create_bucket false
  check_bucket true
  <buffer tag,time>
    @type file
    path /td-agent/buffer/s3-buffer
    timekey 3600
    timekey_wait 10m
    timekey_use_utc true
    flush_at_shutdown true
    flush_thread_count 8
    retry_forever true
    retry_max_interval 128s
  </buffer>
</store>
%{ endif }
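
With `timekey 3600` the buffer cuts one chunk per hour, and the `%Y/%m/%d/%H` strftime tokens in `path` place each chunk under an hourly prefix. A small Python sketch of that mapping — the `logs/` prefix and tag value here are illustrative placeholders, not values from the config above:

```python
from datetime import datetime, timezone

def s3_prefix(tag: str, when: datetime) -> str:
    # strftime expansion of a path like "logs/%Y/%m/%d/%H/<tag>/";
    # with timekey 3600, all records from the same UTC hour share one prefix.
    return when.strftime(f"logs/%Y/%m/%d/%H/{tag}/")

ts = datetime(2024, 1, 15, 9, 31, tzinfo=timezone.utc)
print(s3_prefix("kubernetes.var.log", ts))
# -> logs/2024/01/15/09/kubernetes.var.log/
```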