# ELK Stack
The ELK stack consists of Elasticsearch, Logstash, and Kibana:
- Elasticsearch for distributed search and data analytics
- Logstash for centralized logging, log enrichment, and parsing
- Kibana for powerful and beautiful data visualizations
The ELK architecture:

           sends data             stores data                            reads data
Data/Log ------------> [Logstash] ------------> [Elasticsearch ip:port] <----------- [{nginx:80}/Kibana]
Logstash is used to:
- Manage events
- Collect data
- Parse data
- Enrich data
- Store data
Input ------------> Filter -------------> Output

Input:  log files, streams, datastores, files, monitoring, queues, network
Filter: parse, enrich, tag, drop
Output: email, datastore, files, monitoring, queues, API, paper
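The three stages map directly onto sections of a Logstash config file. A minimal sketch (the file path and grok pattern here are illustrative, not part of this setup):

```
input {
  file { path => "/var/log/messages" }                 # read events from a log file
}
filter {
  grok { match => [ "message", "%{SYSLOGBASE2}" ] }    # parse the syslog preamble
}
output {
  stdout { codec => rubydebug }                        # print each structured event
}
```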
Elasticsearch is a flexible and powerful open-source, distributed, real-time search and analytics engine with a RESTful API. It works seamlessly with Kibana, letting you see and interact with your data and visualize logs and time-stamped data.
1 - Install Java
2 - Install Elasticsearch
# wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.1.1.tar.gz
# tar zxvf elasticsearch-1.1.1.tar.gz
# cd elasticsearch-1.1.1/
# ./bin/elasticsearch
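Once the process is up, you can sanity-check it from another terminal. Elasticsearch listens on port 9200 by default (adjust host and port if yours differ):

```shell
# an ES 1.x node answers the root URL with a small JSON document
# that includes "status" : 200
curl -s http://localhost:9200/ | grep '"status"'
```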
3 - Install Logstash
# wget https://download.elasticsearch.org/logstash/logstash/logstash-1.4.1.tar.gz
# tar -zxvf logstash-1.4.1.tar.gz
Test
# cd logstash-1.4.1/bin
# ./logstash -e 'input { stdin { } } output { stdout {} }'
Then type a message in the terminal, for example "hello world". You should see output like:
logstash startup complete
# hello world
output: "2013-11-21T01:22:14.405+0000 0.0.0.0 hello world"
Press Ctrl+C to exit, then create a config file:
# vim first-log.conf
------------------------------------------------------------------------------------------
input {
file {
path => ["/var/log/messages", "/var/log/*", "/d2/java/jboss-eap-6.3/standalone/log/*"]
}
}
output {
elasticsearch { host => localhost } # use "localhost" or your Elasticsearch IP address.
stdout { codec => rubydebug }
}
------------------------------------------------------------------------------------------
# ./logstash -f first-log.conf &
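With the pipeline running, Logstash's elasticsearch output writes events into daily indices named logstash-YYYY.MM.DD. You can list them with the _cat API (available since Elasticsearch 1.0):

```shell
# list all indices -- expect logstash-* entries once events start flowing
curl -s 'http://localhost:9200/_cat/indices?v'
```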
You will see something like:
------------------------------------------------------------------------------------------
{
"message" => "Dec 23 15:10:01 srv238 CROND[28897]: (root) CMD (/usr/lib64/sa/sa1 1 1)",
"@version" => "1",
"@timestamp" => "2014-12-23T17:10:02.060Z",
"host" => "srv238.cnpq.br",
"path" => "/var/log/cron"
}
{
"message" => "Dec 23 15:20:01 srv238 systemd: Created slice user-0.slice.",
"@version" => "1",
"@timestamp" => "2014-12-23T17:20:02.479Z",
"host" => "srv238.xxx.xx",
"path" => "/var/log/messages"
}
------------------------------------------------------------------------------------------
4 - Install Kibana (on APACHE or NGINX)
with apache:
# cd /var/www/html
# wget https://download.elasticsearch.org/kibana/kibana/kibana-3.1.0.tar.gz
# tar -xzvf kibana-3.1.0.tar.gz
# mv kibana-3.1.0 kibana
with nginx:
# cd /usr/share/nginx/html
# wget https://download.elasticsearch.org/kibana/kibana/kibana-3.1.0.tar.gz
# tar -xzvf kibana-3.1.0.tar.gz
# mv kibana-3.1.0 kibana
Access URL: <your_ip_address>/kibana
Instructions for getting an ELK stack set up quickly on a Mac. Paths are opinionated. You'll have to infer and change. Sorry mate. 🍰
Install Homebrew if you haven't already. You probably have. If not, you should.
brew install elasticsearch nginx logstash
# do yourself a favor and get a better services command than launchctl
brew tap gapple/services
Download these guys:
- logstash-1.4.2.tar.gz
- logstash-contrib-1.4.2.tar.gz
- kibana-3.1.2.tar.gz (or v4 if you prefer)
I usually throw app installs in /opt on Linux. Mac doesn't have this by default. This is up to you. Elasticsearch is going to be under /usr/local because of homebrew.
If you used brew, run brew info elasticsearch to find the config folder (usually /usr/local/etc/elasticsearch), then edit it:

vim /usr/local/etc/elasticsearch/elasticsearch.yml

If you downloaded and extracted the Elasticsearch tarball instead, edit elasticsearch.yml in the config/ directory of the extracted tree.
- Now add the following lines for kibana 3 compatibility:
# kibana 3 compatibility
http.cors.enabled: true
http.cors.allow-origin: http://localhost:8080
- Change the cluster name to just elasticsearch in elasticsearch.yml:
cluster.name: elasticsearch
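A quick way to confirm the edits took (the path assumes the brew install location from above):

```shell
# should print the two CORS lines and the cluster name
grep -E '^(http\.cors|cluster\.name)' /usr/local/etc/elasticsearch/elasticsearch.yml
```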
Create a patterns directory (the path must match the patterns_dir setting in the Logstash config below):
mkdir -p /Users/username/logstash/patterns.d/
Create an apache-error pattern file at /Users/username/logstash/patterns.d/apache-error:
APACHE_ERROR_LOG \[(?<timestamp>%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\] \[.*:%{LOGLEVEL:loglevel}\] \[pid %{NUMBER:pid}\] \[client %{IP:clientip}:.*\] %{GREEDYDATA:message}
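For reference, this pattern targets the Apache 2.4 error-log format; a line it should match looks like this (hypothetical example, not taken from a real server):

```
[Fri Dec 23 15:10:01.123456 2014] [core:error] [pid 28897] [client 10.0.0.1:54321] File does not exist: /var/www/favicon.ico
```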
Create /Users/username/logstash/logstash.conf:
# logstash config
input {
file {
path => [ "/var/log/*.log", "/var/log/messages", "/var/log/syslog" ]
type => "syslog"
}
file {
path => "/var/apache/logs/custom_log"
type => "apache_access_log"
}
file {
path => "/var/apache/logs/error_log"
type => "apache_error_log"
}
}
filter {
if [type] == "apache_access_log" {
mutate { replace => { "type" => "apache-access" } }
grok {
match => { "message" => "%{COMBINEDAPACHELOG}" }
}
date {
match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
}
}
if [type] == "apache_error_log" {
mutate { replace => { "type" => "apache-error" } }
grok {
patterns_dir => [ "/Users/username/logstash/patterns.d" ]
match => [ "message", '%{APACHE_ERROR_LOG}' ]
overwrite => [ "message" ]
}
if !("_grokparsefailure" in [tags]) {
date {
match => [ "timestamp", "EEE MMM dd HH:mm:ss.SSSSSS yyyy" ]
}
}
}
if [type] == "syslog" {
grok {
match => [ "message", "%{SYSLOGBASE2}" ]
}
}
}
output {
elasticsearch { host => localhost }
stdout { codec => rubydebug }
}
# eof
- To verify that elasticsearch is running, visit port 9200 on whatever host you've set it up on:
http://hostname.com:9200/
- To perform a search:
http://hostname.com:9200/_search?pretty
- To view the mappings:
http://hostname.com:9200/_mappings?pretty
- To view cluster health:
http://localhost:9200/_cluster/health?pretty
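The same checks from the command line (assuming the default localhost:9200):

```shell
curl -s 'http://localhost:9200/'                        # node info
curl -s 'http://localhost:9200/_search?pretty'          # search across all indices
curl -s 'http://localhost:9200/_mappings?pretty'        # view mappings
curl -s 'http://localhost:9200/_cluster/health?pretty'  # cluster health
```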
$ elasticsearch
$ logstash agent -f ~/logstash/logstash.conf
# your preference on /opt or not
mkdir /opt
sudo chown $USER /opt
# unpack logstash and very valuable contrib tarballs
cd /opt
tar xzf ~/Downloads/logstash-1.4.2.tar.gz
cd logstash-1.4.2
tar xzf ~/Downloads/logstash-contrib-1.4.2.tar.gz --strip-components=1
# put kibana in nginx
cd /usr/local/var/www
tar xzf ~/Downloads/kibana-3.1.2.tar.gz
mv kibana-3.1.2 kibana
# edit config.js with your hostname
vi kibana/config.js
# line 32 - read the comments on why you might not want localhost here
# for dev box only
elasticsearch: "http://localhost:9200",
# enable cors for kibana3 + elasticsearch 1.4
vi /usr/local/Cellar/elasticsearch/1.4.3/config/elasticsearch.yml
# the services command is from the brew/tap at the top, love it
$ brew services restart elasticsearch
# make sure nginx starts by itself
# nginx config is in /usr/local/etc/nginx/nginx.conf if you need to look at it
# it won't need any edits for kibana. it's just js/html in a directory.
# browse to http://localhost:8080/kibana (you should see a kibana page)
# Now, let's change the default page to logstash.
cd /usr/local/var/www/kibana/app/dashboards
mv default.json default.json.orig
cp logstash.json default.json
# refresh the kibana page. It will be logstash's default now.
Now you have to suck your logs into logstash. That's a different tutorial.