I hereby claim:
- I am robertiansweetman on github.
- I am robertsweetman (https://keybase.io/robertsweetman) on keybase.
- I have a public key ASDlCOMAtGQo4wptr0qW9Dk90Ofr7w-6UrQwmv5aQ8xiJAo
To claim this, I am signing this object:
ARG ELK_VERSION=7.6.2
FROM docker.elastic.co/elasticsearch/elasticsearch:${ELK_VERSION} AS elasticbase
# copy index template files into the container
COPY ./elastic/index-templates /home/elasticsearch/index-templates/
# copy management scripts into the container - needed later to send the index templates to the Elasticsearch REST endpoint
COPY << management script - not supplied in this gist >>
# script to check Elasticsearch status before uploading settings
COPY ./check-ready.sh check-ready.sh
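The check-ready.sh script itself is not included above. A minimal sketch of what such a script could look like, assuming an ES_HOST of http://localhost:9200, a 30-attempt poll, the legacy `_template` API (correct for 7.6.x), and JSON templates under /home/elasticsearch/index-templates — all assumptions, not values taken from this gist:

```shell
#!/bin/sh
# Sketch only - not the author's actual script. Host, retry count, and
# template directory below are assumed defaults.

ES_HOST="${ES_HOST:-http://localhost:9200}"
TEMPLATE_DIR="${TEMPLATE_DIR:-/home/elasticsearch/index-templates}"

# Pull the "status" value out of a _cluster/health JSON response.
parse_status() {
  printf '%s' "$1" | grep -o '"status":"[a-z]*"' | cut -d'"' -f4
}

# Poll cluster health until it reports green or yellow (up to ~60s).
wait_for_elasticsearch() {
  i=0
  while [ "$i" -lt 30 ]; do
    body=$(curl -s "${ES_HOST}/_cluster/health") || body=""
    case "$(parse_status "$body")" in
      green|yellow) return 0 ;;
    esac
    i=$((i + 1))
    sleep 2
  done
  echo "elasticsearch not ready, giving up" >&2
  return 1
}

# PUT each template file to the (legacy, pre-7.8) _template endpoint.
upload_templates() {
  for f in "${TEMPLATE_DIR}"/*.json; do
    name=$(basename "$f" .json)
    curl -s -XPUT "${ES_HOST}/_template/${name}" \
      -H 'Content-Type: application/json' \
      --data-binary "@${f}"
  done
}

# Entry point in the real script would be:
#   wait_for_elasticsearch && upload_templates
```

The script is split into functions so the readiness check can be reused before any other provisioning step.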
# I started here: https://discuss.elastic.co/t/multiple-filebeats/153114/5
# however I found it quite confusing...
# In the end I managed to get the setup below working.
# One 'pipeline.conf' creates 3 indexes; documents for one of them are manipulated extensively by Logstash before indexing,
# while the other 2 use the Filebeat ingest node approach, sending documents directly to Elasticsearch.
# All the filtering relies on adding fields:type="value" to the Filebeat config for each log source,
# because the [@metadata][pipeline] approach as officially documented DOES NOT APPEAR TO WORK (YMMV).
input {
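To illustrate the fields:type approach described above — the "app" type value, hosts, and index name here are placeholders for illustration, not values from this gist:

```
# In filebeat.yml, each log input tags its documents, e.g.:
#   - type: log
#     paths:
#       - /var/log/app/*.log
#     fields:
#       type: "app"
#
# pipeline.conf can then route on that field, e.g. in the output block:
output {
  if [fields][type] == "app" {
    elasticsearch {
      hosts => ["elasticsearch:9200"]
      index => "app-%{+YYYY.MM.dd}"
    }
  }
}
```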