@motine
Last active August 29, 2015 14:08
docker elasticsearch kibana try-out

Story

Daemon

To start the Elasticsearch container (no persistence): docker run -d -p 9200:9200 -p 9300:9300 --name els dockerfile/elasticsearch

Check that the daemon is up: docker ps

Populate

To run the Ruby script: docker run --rm -it -v /Users/motine/Repositories/docker/elastisearch:/root --link els:els binaryphile/ruby:2.1.2 /bin/bash (the link creates ELS_* environment variables in the container, and -v mounts a local folder).

Add an entry to test whether the link is up:

ELS_URL="http://$ELS_PORT_9200_TCP_ADDR:$ELS_PORT_9200_TCP_PORT"
curl -XPUT "$ELS_URL/dayone/article/1" -d '{
    "title" : "hello",
    "text" : "world"
}'
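The same check can be done from Ruby using only the standard library. This is a minimal sketch: the ELS_* variables are the ones injected by --link els:els, and the localhost/9200 fallbacks are assumptions for running it outside the linked container.

```ruby
require 'net/http'
require 'json'
require 'uri'

# ELS_* variables come from the --link; fallbacks are illustrative assumptions.
addr = ENV.fetch('ELS_PORT_9200_TCP_ADDR', 'localhost')
port = ENV.fetch('ELS_PORT_9200_TCP_PORT', '9200')
uri  = URI("http://#{addr}:#{port}/dayone/article/1")

# Build the same PUT request as the curl command above.
req = Net::HTTP::Put.new(uri.request_uri, 'Content-Type' => 'application/json')
req.body = JSON.generate(title: 'hello', text: 'world')

# Uncomment to actually send it against the running container:
# res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }
# puts res.code
```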

Now you can install the dependencies and populate Elasticsearch:

gem install --no-ri --no-rdoc plist elasticsearch
ruby populate.rb

You can docker commit the changes in the Elasticsearch container (the indexed data) and in the Ruby container (the installed gems) to preserve them across restarts.

Play

After starting irb we can play around:

require 'elasticsearch'
require 'yaml'
client = Elasticsearch::Client.new(host: "http://#{ENV["ELS_PORT_9200_TCP_ADDR"]}:#{ENV["ELS_PORT_9200_TCP_PORT"]}")

puts client.search(index: 'dayone', body: { query: { match: { title: 'Bolognese' } } }).to_yaml
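The search call above just sends a JSON query body to the _search endpoint. As a small sketch, the body can be built separately and inspected before handing it to the client (match_query is a hypothetical helper, not part of the elasticsearch gem; the title field comes from populate.rb):

```ruby
require 'json'

# Build the query-DSL hash that client.search sends as its :body.
def match_query(field, value)
  { query: { match: { field => value } } }
end

puts JSON.pretty_generate(match_query(:title, 'Bolognese'))
# The same hash can then be passed on:
# client.search(index: 'dayone', body: match_query(:title, 'Bolognese'))
```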

Kibana

To get Kibana up, make sure the Elasticsearch daemon is running and populated (mind that the container loses its state after stopping, unless you commit). Run this image: docker run --rm -p 80:80 --name kibana arcus/kibana. Keep in mind that Kibana connects to the Elasticsearch service from the client's browser.

Find out the IP with boot2docker ip and go to http://192.168.59.103/#/dashboard/file/guided.json. In the settings, set the index to dayone/article.

# Put all the dayone files into subfolder `data`.
Encoding.default_external = Encoding::UTF_8
Encoding.default_internal = Encoding::UTF_8
require 'plist'
require 'elasticsearch'

ELS_URL = "http://#{ENV["ELS_PORT_9200_TCP_ADDR"]}:#{ENV["ELS_PORT_9200_TCP_PORT"]}"
client = Elasticsearch::Client.new(host: ELS_URL)

Dir["data/*.doentry"].each do |path|
  entry = Plist.parse_xml(path)
  entry_info = {
    tags: entry["Tags"],
    text: entry["Entry Text"].lines[2..-1].join.strip, # lines keep their newlines, so no join separator is needed
    title: entry["Entry Text"].lines[0].strip,         # first line of the entry text is the title
    creation: entry["Creation Date"]
  }
  client.index index: 'dayone', type: 'article', id: entry["UUID"], body: entry_info # use the DayOne UUID as the document id
end
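The title/text split in the loop above can be checked without a real .doentry file. The sample text below is hypothetical; it just mimics the DayOne convention of title, blank line, body:

```ruby
# Hypothetical "Entry Text" of a DayOne entry: title, blank line, body.
entry_text = "My first entry\n\nHello world.\nSecond line."

title = entry_text.lines[0].strip       # first line becomes the title
text  = entry_text.lines[2..-1].join.strip # everything after the blank line becomes the body

puts title
puts text
```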