@superdump
Created November 2, 2016 12:28
filebeat.yml configuration file (note: the various /rootfs paths are mounted into the container from the host as read-only volumes; a compose-style sketch of these mounts follows the configuration):
---
filebeat.prospectors:
  - input_type: log
    paths:
      - /rootfs/var/log/*.log
      - /rootfs/var/log/*/*.log
  - input_type: log
    paths:
      - /rootfs/var/lib/docker/containers/*/*.log
    json.message_key: log
    json.keys_under_root: true
    json.add_error_key: true

# avoid feedback loop
processors:
  - drop_event:
      when:
        equals:
          some.key.in.structure.that.has.value: filebeat

output.elasticsearch:
  hosts: ["ELASTICSEARCH_URL"]
  pipeline: log-source-and-time
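
For context, a docker-compose-style sketch of the read-only /rootfs mounts that the prospector paths above expect. The service name, image reference and compose layout are illustrative assumptions, not part of the original gist:

version: "2"
services:
  filebeat:
    image: filebeat:5.x   # illustrative image reference; use whichever filebeat >= 5.0 image you run
    volumes:
      # host log locations exposed read-only under /rootfs, matching the prospector paths above
      - /var/log:/rootfs/var/log:ro
      - /var/lib/docker/containers:/rootfs/var/lib/docker/containers:ro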
The log-source-and-time pipeline (note: if all logs come from containers, the set processor and the on_failure handler can be removed or modified as desired; also, install this pipeline into your Elasticsearch >= 5.x cluster before shipping logs to it, as sketched after the pipeline definition):
{
  "description": "Parse time field from docker logs",
  "processors": [
    {
      "set": {
        "field": "log_source",
        "value": "container"
      }
    },
    {
      "date": {
        "field": "time",
        "formats": ["ISO8601"],
        "on_failure": [
          {
            "set": {
              "field": "log_source",
              "value": "host"
            }
          }
        ]
      }
    }
  ]
}
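
Installing the pipeline means registering it with Elasticsearch's ingest API (PUT _ingest/pipeline/log-source-and-time). A minimal sketch using Python and the requests library, assuming the pipeline JSON above has been saved as log-source-and-time.json and that the URL below is replaced with your cluster's address; both names are placeholders:

import json
import requests

# Placeholder; substitute the real Elasticsearch URL (same value as in filebeat.yml).
ELASTICSEARCH_URL = "http://localhost:9200"

# Load the pipeline definition shown above (assumed to be saved in this file).
with open("log-source-and-time.json") as f:
    pipeline = json.load(f)

# PUT _ingest/pipeline/log-source-and-time registers the pipeline so that
# filebeat's output.elasticsearch.pipeline setting can reference it by name.
resp = requests.put(
    ELASTICSEARCH_URL + "/_ingest/pipeline/log-source-and-time",
    json=pipeline,
)
resp.raise_for_status()
print(resp.json())  # expect {"acknowledged": true}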