@ccdle12
Last active March 5, 2021 02:59
Cheat sheet for using the ELK stack for logging in microservices.

ELK

Elasticsearch, Logstash, Kibana

A tech stack used for centralized logging.

Logstash Setup

Standard Input for Logstash

Useful for a dev environment: this takes standard input and writes it to Elasticsearch.

This is the config for Logstash - logstash.config:

input {
  stdin {}
}

## Add your filters / logstash plugins configuration here

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}

Currently I attach to the container and run logstash -f /elk-config/logstash.config. You can then enter text on the command line and it will be picked up and forwarded to Elasticsearch.

The messages will appear in Kibana after initializing the index pattern in Management.

Setting up to receive logs from logspout

Logspout will listen to container syslog output and route it via UDP to Logstash.

input {
  udp {
    port => 25826
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}
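To smoke-test the UDP input without logspout, you can push a single log-shaped line at it by hand. This is a hypothetical helper, not part of the gist; the host and port are assumptions (127.0.0.1 is a placeholder for wherever port 25826 is reachable from):

```python
import socket

# Placeholder address: adjust to wherever the Logstash UDP input
# from the config above is reachable.
LOGSTASH_HOST = "127.0.0.1"
LOGSTASH_PORT = 25826

line = b"2019-08-08 09:38:42,133 - api_gateway - INFO - response to order request: CONFIRMED"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sent = sock.sendto(line, (LOGSTASH_HOST, LOGSTASH_PORT))
sock.close()

# UDP is fire-and-forget: sendto reports bytes handed to the kernel,
# not delivery, so confirm in Kibana that the event actually arrived.
```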

Grok Filtering

We can set up filters that sort log messages into fields, which makes it easier to search and filter by those fields in the Kibana dashboard.

The filter picks up on logs that have the format timestamp - service_name - log-level - message.

If all the services follow the above format, filtering between service logs will be much easier.
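A service written in Python can emit lines in this shape with the stdlib logging module; the sample log line below happens to match Python's default asctime style. The formatter string is a sketch of one way to conform, not something prescribed by the gist:

```python
import logging

# Formatter producing `timestamp - service_name - log-level - message`,
# matching the shape the grok filter expects.
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")

handler = logging.StreamHandler()
handler.setFormatter(formatter)

logger = logging.getLogger("api_gateway")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("response to order request: CONFIRMED")
# emits: <timestamp> - api_gateway - INFO - response to order request: CONFIRMED
```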

Grok filtering can be hard to test; this website is very useful: https://grokdebug.herokuapp.com/

2019-08-08 09:38:42,133 - api_gateway - INFO - response to order request: CONFIRMED

%{TIMESTAMP_ISO8601:timestamp} - %{WORD:service} - %{LOGLEVEL:log-level} - %{GREEDYDATA:message}

NOTE: WORD matches a single word (no whitespace), which is why it fits the service name.
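Grok patterns compile down to regular expressions, so a rough Python equivalent is handy for sanity-checking log lines locally. The character classes below are my approximations of the grok definitions, not their exact regexes:

```python
import re

# Approximate Python regex for:
# %{TIMESTAMP_ISO8601:timestamp} - %{WORD:service} - %{LOGLEVEL:log-level} - %{GREEDYDATA:message}
PATTERN = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}(?:[.,]\d+)?)"
    r" - (?P<service>\w+)"
    r" - (?P<log_level>[A-Za-z]+)"
    r" - (?P<message>.*)"
)

m = PATTERN.match(
    "2019-08-08 09:38:42,133 - api_gateway - INFO - response to order request: CONFIRMED"
)
print(m.group("service"), m.group("log_level"))
```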

input {
  udp {
    port => 25826
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} - %{WORD:service} - %{LOGLEVEL:log-level} - %{GREEDYDATA:message}" }
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
  }
}
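The date filter parses the grok-captured timestamp field so the event is indexed at the log's own time rather than the ingest time. Logstash does this natively; the sketch below only illustrates the parsing step in Python:

```python
from datetime import datetime

# Parse the same ISO8601-style timestamp the date filter handles.
# ",133" are milliseconds; %f accepts 1-6 fractional digits.
ts = datetime.strptime("2019-08-08 09:38:42,133", "%Y-%m-%d %H:%M:%S,%f")
print(ts.isoformat())
```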

Sample Setup for ELK in docker-compose

Below is the current docker-compose setup:

    elasticsearch:
      container_name: elasticsearch
      image: docker.elastic.co/elasticsearch/elasticsearch:5.2.2
      expose:
        - "9200"
        - "9300"
      environment:
        - "xpack.security.enabled=false"
        - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      networks:
        - simplebank

    kibana:
      container_name: kibana
      image: docker.elastic.co/kibana/kibana:6.2.2
      ports:
        - "5601:5601"
      environment:
        - "xpack.security.enabled=false"
      depends_on:
        - "elasticsearch"
      networks:
        - simplebank

    logstash:
      container_name: logstash
      image: docker.elastic.co/logstash/logstash:5.2.2
      expose:
        - "25826"
      volumes:
        - $PWD/elk-config:/elk-config
      command: logstash -f /elk-config/logstash.config
      depends_on:
        - "elasticsearch"
      networks:
        - simplebank

    logspout:
      container_name: logspout
      image: gliderlabs/logspout:v3
      command: "udp://logstash:25826"
      restart: always
      volumes:
        - "/var/run/docker.sock:/tmp/docker.sock"
      depends_on:
        - elasticsearch
        - logstash
        - kibana
      networks:
        - simplebank