Here are some configs to run Elasticsearch as a Docker container, along with Logstash and Kibana. You will see that the Elasticsearch config is kept separate from Logstash and Kibana because I'm assuming you may want to use the ES container for things other than just log analysis.
Go to https://docs.docker.com/install/, open the "Docker CE" menu, choose your OS, and follow the instructions.
After installing, open your terminal and type:
$ docker version
You should see information about the Client and Server versions. Now type:
$ docker-compose version
You should see the docker-compose version as well.
Now that you can execute docker and docker-compose commands, create an elasticsearch folder wherever you want (as a suggestion, /home/your_user/docker/elasticsearch).
Inside the elasticsearch folder, create the folders config, esdata, and plugins. Inside the config folder, create the elasticsearch.yml file with this content. If you have any plugin (for example, repository-s3), copy the plugin's folder into the elasticsearch/plugins folder, e.g. elasticsearch/plugins/repository-s3.
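The folder layout above can be sketched as follows. The elasticsearch.yml content here is an assumption, not the original linked file: the cluster name matches the curl output shown later in this guide, and disabling X-Pack security is a guess to allow unauthenticated access to port 9200.

```shell
# Create the folder structure described above.
mkdir -p elasticsearch/config elasticsearch/esdata elasticsearch/plugins

# Write a minimal elasticsearch.yml -- these settings are assumptions,
# not the author's original file.
cat > elasticsearch/config/elasticsearch.yml <<'EOF'
cluster.name: "elasticsearch"
network.host: 0.0.0.0
# The official 5.x images ship with X-Pack; disabling security lets
# curl reach port 9200 without credentials (assumption).
xpack.security.enabled: false
EOF
```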
Now you need to set the right permissions on those folders and change their ownership. Elasticsearch's user and group ID is 1000, so go back to the elasticsearch folder level (cd ..) and type:
$ sudo chown -R 1000:1000 esdata config plugins
Create the docker-compose.yml file with this content. Run Elasticsearch with:
$ docker-compose up -d
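The docker-compose.yml referenced above might look roughly like this. This is a sketch, not the original file: the image tag is inferred from the version number in the curl output below, the container-side paths are the defaults of the official image, and the memory settings are assumptions.

```yaml
version: '2'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:5.4.3
    ports:
      - "9200:9200"
    volumes:
      - ./esdata:/usr/share/elasticsearch/data
      - ./config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - ./plugins:/usr/share/elasticsearch/plugins
    environment:
      # Heap size is an assumption; tune to your machine.
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
```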
You can use docker-compose logs to see what's going on with the container.
If everything goes right, you will be able to run:
$ curl -XGET localhost:9200
And see something like:
{
  "name" : "BtSe67_",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "aCTyezR7TXqkqLn77qZEKA",
  "version" : {
    "number" : "5.4.3",
    "build_hash" : "eed30a8",
    "build_date" : "2017-06-22T00:34:03.743Z",
    "build_snapshot" : false,
    "lucene_version" : "6.5.1"
  },
  "tagline" : "You Know, for Search"
}
With Elasticsearch working properly, create a kibana folder in your docker folder (e.g. /home/your_user/docker/kibana). Inside the kibana folder, create the folders config, logs/nginx, and logs/rails. I didn't know for sure what the right permissions for the logs folder and its subfolders would be, so I just set 777:
$ sudo chmod -R 777 logs
Inside the config folder, create the logstash.conf file with this content.
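A logstash.conf along these lines would fit the setup described here. This is a sketch under assumptions: the container-side log paths, the nginx grok pattern, and the Elasticsearch host are guesses, while the index names are chosen to match the logstash-nginx-* and logstash-rails-* patterns used in Kibana later. Extracting the occurrency_time field for Rails logs would need a filter specific to your log format, which is not shown.

```
input {
  file {
    path => "/logs/nginx/*.log"    # assumed container mount point
    start_position => "beginning"
    type => "nginx"
  }
  file {
    path => "/logs/rails/*.log"    # assumed container mount point
    start_position => "beginning"
    type => "rails"
  }
}

filter {
  if [type] == "nginx" {
    # Standard pattern for nginx access logs in combined format (assumption).
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]  # assumed reachable hostname
    # Produces the logstash-nginx-* / logstash-rails-* indexes used below.
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
  }
}
```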
Back in the kibana folder, create the docker-compose.yml with this content.
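This docker-compose.yml might be sketched like this. The image versions are assumed to match the Elasticsearch container above, and the container paths are the official image defaults. How kibana and logstash reach the separately started Elasticsearch container (a shared Docker network, links, or host networking) depends on your setup and is not shown here.

```yaml
version: '2'
services:
  kibana:
    image: docker.elastic.co/kibana/kibana:5.4.3
    ports:
      - "5601:5601"
    environment:
      # Assumes the ES container is resolvable as "elasticsearch".
      - ELASTICSEARCH_URL=http://elasticsearch:9200
  logstash:
    image: docker.elastic.co/logstash/logstash:5.4.3
    volumes:
      - ./config/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      - ./logs:/logs
```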
Run Kibana with:
$ docker-compose up -d kibana
Paste the log files (they must have the .log extension) into the respective folders (logs/nginx, logs/rails) and run:
$ docker-compose up logstash
You will be able to see Logstash parsing those files and sending them to Elasticsearch.
Now, access http://localhost:5601 and you will get a page asking you to configure an index pattern. Fill the "Index name or pattern" field with logstash-nginx-*, select @timestamp in the "Time-field name" dropdown, and click Create. Then click the + button to add a new index pattern, enter logstash-rails-*, select occurrency_time, and click Create.
Click on Discover and you will see the indexed content of each index (rails and nginx).
Logstash parses and sends new logs to Elasticsearch as soon as it detects new files.
If you want to check the indexes created by Logstash, you can run:
$ curl -XGET 0:9200/_cat/indices?v
To delete all indexes created by Logstash, run:
$ curl -XDELETE '0:9200/logstash-*'