@untergeek · Last active February 20, 2017 12:54
Logstash Debugging #1
input {
  # file {
  #   path => ["C:/logs/Proj/*/*.log"]
  #   start_position => beginning
  # }
  stdin { }
}
filter {
  grok {
    match => { "message" => [
      "%{TIMESTAMP_ISO8601:Logdate} \[%{WORD:LogLevel}\] \[%{NUMBER:ThreadId}\] ?%{GREEDYDATA:data}",
      "%{TIMESTAMP_ISO8601:Logdate} \[%{WORD:LogLevel}\] \[%{WORD:Type}\] ?%{GREEDYDATA:data}",
      "%{TIMESTAMP_ISO8601:Logdate} \[%{WORD:LogLevel}%{SPACE}\] \[%{NUMBER:ThreadId}\] ?%{GREEDYDATA:data}",
      "%{TIMESTAMP_ISO8601:Logdate} \[%{NUMBER:ThreadId}\] %{WORD:LogLevel} ?%{GREEDYDATA:data}",
      "%{TIMESTAMP_ISO8601:Logdate} %{WORD:LogLevel} ?%{GREEDYDATA:data}"
    ] }
    #match => { "path" => "%{GREEDYDATA}/%{GREEDYDATA:Component}/%{GREEDYDATA:}_%{GREEDYDATA:ProcessId}.log" }
    #break_on_match => false
  }
  date {
    match => [ "Logdate", "YYYY-MM-dd HH:mm:ss,SSS" ]
  }
}
output {
  # elasticsearch {
  #   protocol => "http"
  #   host => localhost
  # }
  stdout { codec => rubydebug }
}
$ bin/logstash -f h.conf --verbose
Grok patterns path {:patterns_dir=>["/Users/buh/logstash-1.4.2/patterns/*"], :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/firewalls", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/grok-patterns", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/haproxy", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/java", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/junos", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/linux-syslog", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/mcollective", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/mcollective-patterns", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/mongodb", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/nagios", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/postgresql", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/redis", :level=>:info}
Grok loading patterns from file {:path=>"/Users/buh/logstash-1.4.2/patterns/ruby", :level=>:info}
Match data {:match=>{"message"=>["%{TIMESTAMP_ISO8601:Logdate} \\[%{WORD:LogLevel}\\] \\[%{NUMBER:ThreadId}\\] ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} \\[%{WORD:LogLevel}\\] \\[%{WORD:Type}\\] ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} \\[%{WORD:LogLevel}%{SPACE}\\] \\[%{NUMBER:ThreadId}\\] ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} \\[%{NUMBER:ThreadId}\\] %{WORD:LogLevel} ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} %{WORD:LogLevel} ?%{GREEDYDATA:data}"]}, :level=>:info}
Grok compile {:field=>"message", :patterns=>["%{TIMESTAMP_ISO8601:Logdate} \\[%{WORD:LogLevel}\\] \\[%{NUMBER:ThreadId}\\] ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} \\[%{WORD:LogLevel}\\] \\[%{WORD:Type}\\] ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} \\[%{WORD:LogLevel}%{SPACE}\\] \\[%{NUMBER:ThreadId}\\] ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} \\[%{NUMBER:ThreadId}\\] %{WORD:LogLevel} ?%{GREEDYDATA:data}", "%{TIMESTAMP_ISO8601:Logdate} %{WORD:LogLevel} ?%{GREEDYDATA:data}"], :level=>:info}
Pipeline started {:level=>:info}
2014-10-13 16:42:46,946 [1] DEBUG SolaceManager - Creating Solace session: Host=abc, VPN=xyz, User=indiana, ConnectRetries=-1, SendBlocking=True
{
       "message" => "2014-10-13 16:42:46,946 [1] DEBUG SolaceManager - Creating Solace session: Host=abc, VPN=xyz, User=indiana, ConnectRetries=-1, SendBlocking=True",
      "@version" => "1",
    "@timestamp" => "2014-10-13T22:42:46.946Z",
          "host" => "Aironaut.local",
       "Logdate" => "2014-10-13 16:42:46,946",
      "ThreadId" => "1",
      "LogLevel" => "DEBUG",
          "data" => "SolaceManager - Creating Solace session: Host=abc, VPN=xyz, User=indiana, ConnectRetries=-1, SendBlocking=True"
}
@untergeek (Author) commented:

I took the liberty of correcting this and testing it with stdin and stdout. I simply pasted your example log line in at the command line; the rubydebug output is shown above.
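
For a repeatable test, the same line can be piped into stdin instead of pasted interactively. A minimal sketch, assuming the sample line has been saved to a file named sample.log (a hypothetical name):

$ bin/logstash -f h.conf < sample.log

Because the config reads from the stdin input, the redirected file is consumed exactly as if it had been typed in.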

Notes

  • Filters are processed in the order in which they appear. You had a date filter before any date had been extracted by grok, so it had nothing to parse. I moved the date filter below your grok filter and changed its match field to Logdate, the field name you specified in your grok patterns (see the filter sketch after this list).
  • break_on_match => false tells grok to keep testing the remaining patterns even after one matches. Commenting that line out restores the default (break_on_match => true), which stops at the first matching pattern, and the config worked for me immediately afterward. That setting is likely the reason you found multiple fields in your output.
  • The elasticsearch_http output is deprecated. The elasticsearch output plugin now has HTTP support built in; you only need to specify protocol => "http" for it to work. I updated this for you (see the output sketch below).
  • Your date match statement used MMM, which matches an abbreviated month name such as "Oct", while your Logdate field contains a two-digit numeric month. I removed the superfluous M so the pattern reads MM. As you can see in the output above, @timestamp now matches the Logdate field.
  • You also added a second match statement to your grok rule (the commented-out match on the path field). What are you trying to accomplish there?
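
For reference, here is a minimal sketch of the corrected filter ordering and date format described above, trimmed to the one grok pattern that matched the sample line:

filter {
  grok {
    # extract Logdate first, so the date filter below has a field to parse
    match => { "message" => "%{TIMESTAMP_ISO8601:Logdate} \[%{NUMBER:ThreadId}\] %{WORD:LogLevel} ?%{GREEDYDATA:data}" }
    # break_on_match defaults to true: grok stops at the first pattern that matches
  }
  date {
    # MM matches a two-digit numeric month ("10"); MMM would expect "Oct"
    match => [ "Logdate", "YYYY-MM-dd HH:mm:ss,SSS" ]
  }
}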
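
And the matching output section, with the deprecated elasticsearch_http replaced by the elasticsearch output over HTTP; the host value here is a placeholder:

output {
  elasticsearch {
    protocol => "http"  # HTTP support is built into this plugin now
    host => "localhost" # placeholder; point this at your Elasticsearch node
  }
  stdout { codec => rubydebug } # handy while debugging; remove in production
}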
