Short Example of Logstash Multiple Pipelines

This gist is just a personal practice record of Logstash Multiple Pipelines.

The following summary assumes that the Logstash and Filebeat executables are on the PATH and that everything runs locally on localhost.
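
Throughout this summary, $THIS_GIST_DIR stands for the directory containing the files of this gist; for example (the variable name is only a convention of this gist):

cd path/to/this/gist      # wherever the gist files were cloned or copied
export THIS_GIST_DIR=$PWD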

Logstash config

pipelines.yml

Refers to the two pipeline configs, pipeline1.config and pipeline2.config (both are reproduced under Files at the end of this gist).

Unlike in logstash.yml, environment variables cannot be used in pipelines.yml for some reason; see Issue #8452.

Each path.config here specifies only a file name, so Logstash has to be launched from the directory where those config files reside.
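
As a hypothetical illustration, an entry like the following would not have ${THIS_GIST_DIR} expanded (Issue #8452), which is why bare file names are used here instead:

- pipeline.id: pipeline_1
  path.config: "${THIS_GIST_DIR}/pipeline1.config"   # not expanded in pipelines.yml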

pipeline1.config

Deals with syslog lines and listens on port 5044.
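
For reference, the %{SYSLOGLINE} grok pattern used in this pipeline matches lines like the following made-up example and extracts fields such as timestamp, logsource, program, pid, and message:

Jun 19 14:27:01 myhost CRON[12345]: (root) CMD (some-command)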

pipeline2.config

Deals with Apache access logs and listens on port 5045.

Almost the same as the example found in Parsing Logs with Logstash.
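
For reference, the %{COMBINEDAPACHELOG} grok pattern used in this pipeline matches Apache combined-format lines like the following made-up example and extracts fields such as clientip, timestamp, verb, request, response, and agent; the geoip filter then looks up clientip:

203.0.113.10 - - [19/Jun/2022:14:27:00 +0000] "GET /index.html HTTP/1.1" 200 1234 "http://example.com/" "Mozilla/5.0"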

Run Logstash

Here we set --path.settings to point Logstash at $THIS_GIST_DIR, where it looks for pipelines.yml. We must not pass the -f option, because Logstash reads pipelines.yml only when no pipeline config is specified on the command line.

pushd $THIS_GIST_DIR
mkdir -p data logs
logstash --path.settings $THIS_GIST_DIR --path.logs $THIS_GIST_DIR/logs --path.data $THIS_GIST_DIR/data

First Filebeat

The first Filebeat config is filebeat1.yml, which specifies only syslog in the paths setting. Here we use a copy of syslog as the input.

# Works only if syslog is written in /var/log...
cp /var/log/syslog $THIS_GIST_DIR/

If a registry file from a previous Filebeat run remains, we remove it.

rm -i $THIS_GIST_DIR/data/registry

Now we can run Filebeat and watch Logstash emit the filtered output to stdout.

pushd $THIS_GIST_DIR
mkdir -p data logs
filebeat --path.home $THIS_GIST_DIR -c filebeat1.yml

After all the log lines are printed, we can shut down Filebeat, e.g. with CTRL-C.

Second Filebeat

For the second pipeline, we download the sample Apache log archive (logstash-tutorial.log.gz) referenced in Parsing Logs with Logstash and unzip it to obtain logstash-tutorial.log.
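
For example (the URL below is the one used by the Parsing Logs with Logstash tutorial at the time of writing; adjust it if the file has moved):

cd $THIS_GIST_DIR
curl -L -O https://download.elastic.co/demos/logstash/gettingstarted/logstash-tutorial.log.gz
gunzip logstash-tutorial.log.gz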

This time, Logstash emits the parsed Apache log onto stdout.

pushd $THIS_GIST_DIR
mkdir -p data logs
filebeat --path.home $THIS_GIST_DIR -c filebeat2.yml

Output to Elasticsearch

By replacing the output section in pipeline{1,2}.config with the following, we can direct the filtered logs to Elasticsearch.

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}
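
As an optional variation, each pipeline could write to its own index by also setting the index option of the elasticsearch output (the index name below is made up; the Kibana query that follows assumes the default logstash-* indices, so skip this if you want to follow along as-is):

output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
        index => "syslog-%{+YYYY.MM.dd}"
    }
}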

Then we can issue a search query like the one below from the Kibana Dev Tools console.

GET /logstash-*/_search
{
  "query": {
    "match": {
      "program.keyword": "cron"
    }
  }
}
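
A similar query could target the Apache events from the second pipeline, for example matching on the response field extracted by COMBINEDAPACHELOG (a made-up example; field names depend on the grok pattern and the default mapping):

GET /logstash-*/_search
{
  "query": {
    "match": {
      "response": "404"
    }
  }
}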

References

Multiple Pipelines: https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
Parsing Logs with Logstash: https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html
Logstash issue #8452: https://github.com/elastic/logstash/issues/8452

Files

filebeat1.yml

filebeat.inputs:
- type: log
  paths:
    - syslog
output.logstash:
  hosts: ["localhost:5044"]

filebeat2.yml

filebeat.inputs:
- type: log
  paths:
    - logstash-tutorial.log
output.logstash:
  hosts: ["localhost:5045"]

pipeline1.config

input {
    beats {
        port => "5044"
    }
}
filter {
    grok {
        match => { "message" => "%{SYSLOGLINE}" }
    }
}
output {
    stdout { codec => rubydebug }
}

pipeline2.config

input {
    beats {
        port => "5045"
    }
}
filter {
    grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
        source => "clientip"
    }
}
output {
    stdout { codec => rubydebug }
}

pipelines.yml

- pipeline.id: pipeline_1
  path.config: "pipeline1.config"
- pipeline.id: pipeline_2
  path.config: "pipeline2.config"