@theprogrammerin
Last active March 9, 2020 08:48
This is a Logstash configuration file to parse and push the generalised slow-query log generated by [https://gist.github.com/theprogrammerin/e3206a4ec7a7a4086ac2].
#
#
# Ashutosh Agrawal
# http://blog.theprogrammer.in
#
# 2014-11-20 v1.0
#
# This is a Logstash [http://logstash.net/] config for parsing the data out of
# the modified slow-query log generated by
# https://gist.github.com/theprogrammerin/e3206a4ec7a7a4086ac2
#
# Expected input (one line per aggregated query):
#
# Database: <db name> | User: <user info> | Host: <host_ip> | NoQuery: <integer> | TotalTime: <number> | AverageTime: <number> | TimeTaken: <string> | RowsAnalyzed: <string> | Pattern: <string> | Orignal: <string>
#
# (Note: "Orignal" is spelled exactly as the generator script emits it; the
# grok pattern below matches that literal text, so do not "fix" it here.)
#
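#
# A minimal illustrative input line (all values below are hypothetical, not
# taken from a real log):
#
# Database: shop_production | User: app[app] | Host: 10.0.0.5 | NoQuery: 12 | TotalTime: 36.4 | AverageTime: 3.03 | TimeTaken: 2-4s | RowsAnalyzed: 1000-50000 | Pattern: use shop_production; SET timestamp=XXX; SELECT * FROM orders WHERE user_id = N | Orignal: use shop_production; SET timestamp=1416441600; SELECT * FROM orders WHERE user_id = 42
#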
input {
  stdin { }
}
filter {
  grok {
    match => ["message", "Database: %{DATA:database} \| User: %{DATA:user} \| Host: %{DATA:host_ip} \| NoQuery: %{INT:no_queries} \| TotalTime: %{NUMBER:total_time} \| AverageTime: %{NUMBER:average_time} \| TimeTaken: %{DATA:time_taken} \| RowsAnalyzed: %{DATA:rows_analyzed} \| Pattern: %{GREEDYDATA:dumy1}SET timestamp=XXX\; %{GREEDYDATA:pattern_query} \| Orignal: %{GREEDYDATA:dumy2}SET timestamp=%{NUMBER:time}\; %{GREEDYDATA:original_query}"]
    add_field => { "namespace" => "slowquery" }
  }
  #
  # Use this if you use a gem like marginalia
  # (https://github.com/basecamp/marginalia) [in Rails] to append general
  # request information to each query as a trailing comment.
  # This extracts the request_uid, application, controller, action, line
  # number, etc. You can modify it to parse any custom parameters you append
  # to the query.
  #
  grok {
    match => ["original_query", "%{GREEDYDATA:query} \/\*request_uid:%{GREEDYDATA:request_uid},application:%{GREEDYDATA:application},controller:%{GREEDYDATA:controller},action:%{GREEDYDATA:action},line:%{GREEDYDATA:line}\*\/"]
  }
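  #
  # For example, a query ending in the (hypothetical) comment
  #   /*request_uid:abc123,application:Shop,controller:users,action:show,line:42*/
  # yields request_uid => "abc123", application => "Shop",
  # controller => "users", action => "show", line => "42".
  #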
  #
  # Overwrite @timestamp with the time the query was actually executed,
  # rather than the time Logstash ingested the line.
  #
  date {
    match => ["time", "UNIX"]
  }
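  #
  # For example, time => "1416441600" (epoch seconds) sets
  # @timestamp to 2014-11-20T00:00:00Z.
  #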
  mutate {
    remove_field => ["time", "dumy1", "dumy2"]
  }
}
# Use this output instead to test the parsing locally:
#
# output {
#   stdout { codec => rubydebug }
# }
# Output to Elasticsearch.
# (elasticsearch_http is the legacy Logstash 1.x output plugin;
# newer versions use the elasticsearch output instead.)
output {
  elasticsearch_http {
    host => "localhost"
  }
}
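#
# On Logstash 2.x and later, where elasticsearch_http is no longer available,
# a minimal equivalent sketch (assuming Elasticsearch listens on the default
# port 9200) would be:
#
# output {
#   elasticsearch {
#     hosts => ["localhost:9200"]
#   }
# }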