Replaying logs to logstash
Copy the compressed log files to a work area.
Uncompress them and remove the date part of the file name.
Copy /etc/logstash/conf.d/*.conf to a work location.
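For example, a minimal shell sketch of this setup, assuming the compressed logs live under /var/log/myapp and /tmp/replay is the work area (both paths and the file name are assumptions, not the real locations):

mkdir -p /tmp/replay
cd /tmp/replay
# copy a compressed log in and uncompress it (example file name)
cp /var/log/myapp/access.log-20150101.gz .
gunzip access.log-20150101.gz
# strip the date part of the file name
mv access.log-20150101 access.log
# copy the Logstash configs to the work area
cp /etc/logstash/conf.d/*.conf .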
Modify the conf files to change the output to stdout { codec => "rubydebug" }.
You want to do this to make sure things are working before you push logs into Elasticsearch.
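For example, the output section of a copied conf file might look like this while testing (a sketch of the stdout output with the rubydebug codec, which pretty-prints each event to the terminal):

output {
  stdout { codec => "rubydebug" }
}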
Modify the conf files to change the path in the input/file section:
file {
  type => "@@DOCUMENT_TYPE@@"
  path => ["@@LOG_FILE@@"]
  sincedb_path => "/dev/null"
  start_position => "beginning"
}
In the same input/file section, set start_position => "beginning" and sincedb_path => "/dev/null" (as shown above).
start_position => "beginning" makes Logstash read the file from the start instead of tailing it, and sincedb_path => "/dev/null" stops it from remembering its position, so the file is parsed from the beginning each time Logstash is started.
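As a filled-in sketch, with a made-up type and log path standing in for the @@DOCUMENT_TYPE@@ and @@LOG_FILE@@ placeholders:

input {
  file {
    type => "apache-access"              # hypothetical document type
    path => ["/tmp/replay/access.log"]   # hypothetical uncompressed, renamed log file
    sincedb_path => "/dev/null"          # don't persist the read position between runs
    start_position => "beginning"        # read from the start of the file, don't tail
  }
}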
Test the config: /opt/logstash/bin/logstash -f ${config_file}
You should see verbose rubydebug output for each event.
Ctrl-C to stop the output.
Change the conf file to switch the output back to Elasticsearch.
Add a line to the elasticsearch output to send the data to a specific index:
elasticsearch {
  host => "localhost"
  index => "@@INDEX_NAME@@"
}
Run Logstash to load the file: /opt/logstash/bin/logstash -f ${config_file}
Check your logs to see when everything has been loaded, or watch the system run queue. When it settles down, the file is loaded.
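If the output is going to Elasticsearch on localhost as above, you can also watch the document count of the target index grow while the load runs (a sketch; @@INDEX_NAME@@ is whatever index you configured in the output):

curl 'http://localhost:9200/@@INDEX_NAME@@/_count?pretty'
# or list all indices with their document counts
curl 'http://localhost:9200/_cat/indices?v'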