@tormaroe
Last active August 16, 2018 20:08
A very simple DSL for monitoring log files that I'm working on.
# grammar.y: racc grammar, compiled to parser.rb by the Rakefile below
class Parser

token ALERT
token DEFINITION
token WHEN
token LINE
token NEXT
token MATCHES
token TIMES
token WITHIN
token MINUTES
token TRIGGER
token HIGH
token CRITICAL
token MEDIUM
token LOW
token END
token REGEX
token WORD
token STRING
token NUMBER
rule
  Root:
    /* Nothing */                { result = Alerts.new([]) }
  | Alerts                       { result = val[0] }
  ;

  Alerts:
    Alert                        { result = Alerts.new([val[0]]) }
  | Alerts Alert                 { result = val[0] << val[1] }
  ;

  Alert:
    ALERT DEFINITION Rules END   { result = Alert.new(val[2]) }
  ;

  Rules:
    Rule                         { result = [val[0]] }
  | Rules Rule                   { result = val[0] << val[1] }
  ;

  Rule:
    WhenRule
  | TriggerRule
  ;

  WhenRule:
    WHEN LINE MATCHES REGEX      { result = WhenRule.new(0, val[3]) }
  | WHEN NEXT MATCHES REGEX      { result = WhenRule.new(1, val[3]) }
  ;

  TriggerRule:
    TRIGGER Severity ALERT STRING { result = TriggerRule.new(val[1], val[3]) }
  ;

  Severity:
    CRITICAL
  | HIGH
  | MEDIUM
  | LOW
  ;
end
---- header
  require "./lexer"
  require "./nodes"

---- inner
  def parse(code, show_tokens = false)
    @tokens = Lexer.new.tokenize(code)
    puts @tokens.inspect if show_tokens
    do_parse
  end

  def next_token
    @tokens.shift
  end
# lexer.rb
class Lexer
  KEYWORDS = [
    "Alert", "definition",
    "When", "line", "next", "matches",
    "times", "within", "minutes",
    "Trigger", "HIGH", "CRITICAL", "MEDIUM", "LOW", "alert",
    "end"
  ]

  def tokenize code
    code.chomp!
    i = 0
    tokens = []
    while i < code.size
      chunk = code[i..-1]
      if word = chunk[/\A(\w+)/, 1]
        # Keywords become their own token type (e.g. :WHEN); anything else is a :WORD.
        if KEYWORDS.include? word
          tokens << [word.upcase.to_sym, word]
        else
          tokens << [:WORD, word]
        end
        i += word.size
      elsif regex = chunk[/\A\/(.*?)\//, 1]
        tokens << [:REGEX, Regexp.new(regex)]
        i += regex.size + 2 # skip the surrounding slashes as well
      elsif string = chunk[/\A"(.*?)"/, 1]
        tokens << [:STRING, string]
        i += string.size + 2 # skip the surrounding quotes as well
      elsif number = chunk[/\A([0-9]+)/, 1]
        tokens << [:NUMBER, number.to_i]
        i += number.size
      elsif chunk.match(/\A[ \n]/)
        i += 1 # skip whitespace
      else
        # Catch-all: emit the single character as its own token.
        value = chunk[0, 1]
        tokens << [value, value]
        i += 1
      end
    end
    tokens
  end
end
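The chunk-scanning loop above can be shown in isolation. This is a trimmed-down sketch of the same idea (a hypothetical three-keyword subset, not the gist's actual Lexer): slice off the unconsumed remainder, try each pattern anchored at the start, and advance the index by however much was matched.

```ruby
# Hypothetical mini-keyword list for illustration only.
KEYWORDS = %w[When line matches]

def scan(code)
  i = 0
  tokens = []
  while i < code.size
    chunk = code[i..-1] # the unconsumed remainder
    if (word = chunk[/\A(\w+)/, 1])
      # Keyword => its own token type; anything else => :WORD.
      tokens << [KEYWORDS.include?(word) ? word.upcase.to_sym : :WORD, word]
      i += word.size
    elsif (regex = chunk[/\A\/(.*?)\//, 1])
      tokens << [:REGEX, Regexp.new(regex)]
      i += regex.size + 2 # account for the slashes too
    else
      i += 1 # whitespace / anything else: skip one character
    end
  end
  tokens
end

p scan('When line matches /NRDB/')
# => [[:WHEN, "When"], [:LINE, "line"], [:MATCHES, "matches"], [:REGEX, /NRDB/]]
```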
# Main runner: parses a DSL file given on the command line and evaluates it.
require './parser'
require './nodes_eval'

class Runtime
  def initialize
    @file = "test.log"
  end

  # Iterates over every line of the log file, tracking the current index
  # so rules can inspect (and look ahead of) the current line.
  def loglines
    @lines = File.readlines(@file)
    @lines.each_with_index do |line, index|
      @index = index
      yield
    end
  end

  def current_line
    @lines[@index]
  end

  # Temporarily moves the index forward n lines for the duration of the block.
  def advance n
    @index += n
    yield
    @index -= n
  end
end

# Note: this top-level eval shadows Kernel#eval, which is why TriggerRule
# calls Kernel.eval explicitly.
def eval code
  Parser.new.parse(code).eval(Runtime.new)
end

if file = ARGV.first
  eval File.read(file)
else
  puts "You need to pass in a DSL file."
end
Alert definition
  When line matches /\[ERROR\]/
  When next matches /NRDB/
  Trigger MEDIUM alert "NRDB seem to be down"
end
Alert definition
  When line matches /\[ERROR\]/
  Trigger CRITICAL alert "#{ line }"
end
# nodes.rb: AST node classes built by the parser
class Alerts
  def initialize nodes
    @nodes = nodes
  end

  def << node
    @nodes << node
    self
  end
end

class Alert
  def initialize rules
    # When-rules are sorted by order: the current-line rule (0) before lookahead (1).
    @when_rules = rules.
      select { |r| r.class == WhenRule }.
      sort_by { |r| r.order }
    @trigger_rules = rules.
      select { |r| r.class == TriggerRule }
  end
end

class WhenRule
  attr_reader :order

  def initialize order, regex
    @order, @regex = order, regex
  end
end

class TriggerRule
  def initialize severity, message
    @severity, @message = severity, message
  end
end

class StringNode
  def initialize value
    @value = value
  end
end
# nodes_eval.rb: evaluation logic, reopening the AST node classes
class Alerts
  def eval context
    context.loglines do
      @nodes.each do |n|
        was_triggered = n.eval context
        break if was_triggered
      end
    end
  end
end

class Alert
  def eval context
    if @when_rules.all? { |r| r.eval(context) }
      @trigger_rules.first.eval(context)
      return true
    end
    false
  end
end

class WhenRule
  def eval context
    temp = nil
    # Look @order lines ahead (0 = the current line itself) and test the regex there.
    context.advance @order do
      temp = (@regex =~ context.current_line)
    end
    not temp.nil?
  end
end

class TriggerRule
  def eval context
    line = context.current_line
    # Re-evaluate the message as a Ruby string so "#{ line }" in the DSL
    # is interpolated against the current log line.
    puts "#{@severity} ALERT: #{Kernel.eval('"' + @message + '"')}"
  end
end
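The deferred interpolation in TriggerRule#eval can be demonstrated on its own: the message read from the DSL file is a plain string that happens to contain `#{ line }`, and wrapping it in double quotes and eval-ing it interpolates against whatever `line` is in scope at that point. A minimal sketch of the same trick:

```ruby
line = "[ERROR] some other error"
message = '#{ line }' # single-quoted: not interpolated yet, as when read from the DSL file
interpolated = eval('"' + message + '"')
puts interpolated
# => [ERROR] some other error
```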
# Rakefile
task :default => [:gen_parser]

task :gen_parser do
  puts "Recreating parser.rb from grammar.."
  %x{racc -o parser.rb grammar.y}
end

task :test do
  puts %x{ruby test.rb}
end
[DEBUG] Some stuff
[DEBUG] Some more stuff
[ERROR] in log file
The error is in NRDB
[ERROR] some other error
[DEBUG] foo bar quux
# test.rb: shows each stage of the pipeline for the first sample file
code = File.read("logalert_1.txt")

puts "SOURCE CODE:"
puts "============"
puts code
puts

require './lexer'
tokens = Lexer.new.tokenize(code)
puts "TOKENS:"
puts "======="
p tokens
puts

require './parser'
puts "AST:"
puts "===="
p Parser.new.parse(code)
puts