
@Azeemering
Azeemering / inputs.conf
Created October 30, 2018 12:46
WinEventLog Security with base blacklist example
[WinEventLog://Security]
disabled = 0
start_from = oldest
current_only = 0
evt_resolve_ad_obj = 1
checkpointInterval = 5
blacklist1 = EventCode="4662" Message="Object Type:(?!\s*groupPolicyContainer)"
blacklist2 = EventCode="566" Message="Object Type:(?!\s*groupPolicyContainer)"
blacklist3 = EventCode="4688" Message="New Process Name:\s*(?i)(?:[C-F]:\\Program Files\\Splunk(?:UniversalForwarder)?\\bin\\(?:btool|splunkd|splunk|splunk\-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi))\.exe)"
blacklist4 = EventCode="4689" Message="Process Name:\s*(?i)(?:[C-F]:\\Program Files\\Splunk(?:UniversalForwarder)?\\bin\\(?:btool|splunkd|splunk|splunk\-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi))\.exe)"
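The 4662/566 blacklists rely on a negative lookahead: the event is dropped whenever the text after "Object Type:" is anything other than groupPolicyContainer, so only group-policy object access survives. A minimal sketch of that lookahead behavior in Python (the sample messages are invented for illustration):

```python
import re

# Same lookahead as blacklist1/blacklist2: a match means the event is
# blacklisted (dropped); no match means the event is forwarded.
pattern = re.compile(r"Object Type:(?!\s*groupPolicyContainer)")

dropped = "Object Type:\tuser"                    # not a GPO -> blacklisted
kept = "Object Type:\tgroupPolicyContainer"       # GPO access -> forwarded

print(bool(pattern.search(dropped)))  # True  (event dropped)
print(bool(pattern.search(kept)))     # False (event kept)
```

The 4688/4689 blacklists work the same way in reverse: they match (and drop) process start/stop events for Splunk's own binaries, cutting self-generated noise.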
Azeemering / streamstats correlation spl
Last active July 3, 2018 20:25
Unlike Sysmon, Windows hosts older than Windows 10 / Server 2016 do not provide the parent process name within event 4688 (New Process Created), only the creator process ID, which is an important piece for threat hunting. Therefore, you need to do some correlation to recover it. There are many ways to do it (supporting lookups, summary index…
EventCode=4688
| table _time ComputerName New_Process_Name New_Process_ID Creator_Process_ID
| eval proc_name_id_all=New_Process_Name."#mysep#".New_Process_ID
| sort 0 + _time
| streamstats time_window=60s values(proc_name_id_all) AS proc_name_id_all by ComputerName
| eval parent=mvfind(proc_name_id_all, "#mysep#".Creator_Process_ID."$")
| eval parent=replace(mvindex(proc_name_id_all,parent), "^(.+)#mysep#.+$", "\1")
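The search above builds "name#mysep#id" strings per host, then uses mvfind/replace to look up the entry whose ID matches Creator_Process_ID and strip the separator. A hypothetical offline sketch of that lookup logic (sample process names and IDs are invented):

```python
# Each entry in a host's 60-second window is "<process name>#mysep#<pid>",
# mirroring proc_name_id_all in the SPL. find_parent() plays the role of
# the mvfind + replace pair: match on the pid, return the name.
SEP = "#mysep#"

def find_parent(proc_name_id_all, creator_pid):
    for entry in proc_name_id_all:
        name, _, pid = entry.rpartition(SEP)
        if pid == creator_pid:
            return name
    return None  # creator started outside the time window

window = [r"C:\Windows\explorer.exe#mysep#0x1a2c",
          r"C:\Windows\System32\cmd.exe#mysep#0x3f10"]
print(find_parent(window, "0x1a2c"))  # C:\Windows\explorer.exe
```

As in the SPL, the lookup fails (returns nothing) when the creator process started before the 60-second streamstats window, so a wider time_window trades memory for coverage.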
Two search ideas to try to check whether we are receiving the expected amount of data, based on averages for the same hour and same weekday over multiple past weeks
Using streamstats:
| tstats count by index, _time span=1h
| sort 0 - _time
| eval weekday=strftime(_time,"%a")
| eval week_hour=strftime(_time, "%H")
| eval today=strftime(now(), "%a")
| eval hour=strftime(relative_time(now(), "-1h"), "%H")
| eval sameDay=if(today=weekday AND hour = week_hour ,1,0)
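The SPL above flags the hourly buckets that share the current weekday and hour; averaging those buckets gives the baseline to compare the latest hour against. A hypothetical offline sketch of that baseline calculation (the bucket counts are invented sample data):

```python
from datetime import datetime, timedelta
from statistics import mean

# Hourly event counts keyed by bucket start time -- invented sample data:
# the same Monday 09:00 bucket across four consecutive weeks.
base = datetime(2018, 10, 1, 9)  # a Monday, 09:00
counts = {base - timedelta(weeks=w): 1000 - 50 * w for w in range(4)}

latest_bucket = base
# Keep only historical buckets with the same weekday and hour (sameDay=1).
history = [v for t, v in counts.items()
           if t != latest_bucket
           and t.weekday() == latest_bucket.weekday()
           and t.hour == latest_bucket.hour]
baseline = mean(history)
print(counts[latest_bucket], baseline)  # current hour vs. historical average
```

A large gap between the current count and the baseline suggests a feed has stopped (or started flooding) for that index.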
<form>
<label>Splunk Users and Activity</label>
<fieldset submitButton="false">
<input type="time" token="field1" searchWhenChanged="true">
<label>Date</label>
<default>
<earliest>-7d@h</earliest>
<latest>now</latest>
</default>
</input>
<dashboard>
<label>Top Cumulative Search Run Times</label>
<row>
<panel>
<table>
<title>Top Cumulative Search Run Times</title>
<search>
<query>( index=_internal sourcetype="scheduler" savedsearch_name=* run_time=* scheduled_time=* app=* user=* user!=nobody sid=* ) OR ( ( index=sos OR index=os ) sourcetype="ps" ARGS="*search*" RSZ_KB=* id user ) | rex field=ARGS "search(_|\s)--id=(?&lt;sid&gt;[\_\-\w\.]+)(_|\s)--" | rex field=ARGS "--user=(?&lt;user&gt;.+?)(_|\s)--" | stats values(run_time) as run_time, max(RSZ_KB) as RSZ_KB, max(VSZ_KB) as VSZ_KB, values(savedsearch_name) as savedsearch_name, values(user) as user by sid | stats sum(run_time) as sum_run_time, max(run_time) as max_run_time, max(RSZ_KB) as max_RSZ_KB, by savedsearch_name, user | sort - max_run_time, max_RSZ_KB | eval minute_threshold = 5 | where max_run_time&gt;(60*minute_threshold) OR sum_run_time&gt;(60*minute_threshold) | join type=left user [ | rest /services/au
| tstats summariesonly=t count WHERE index=* by splunk_server _time | timechart span=5m sum(count) by splunk_server
https://sourceforge.net/projects/syslog-slogger/
index=_internal source=*splunkd.log* AggregatorMiningProcessor OR LineBreakingProcessor OR DateParserVerbose WARN
| rex "(?<type>(Failed to parse timestamp
|suspiciously far away
|outside of the acceptable time window
|too far away from the previous
|Accepted time format has changed
|Breaking event because limit of \d+|Truncating line because limit of \d+))"
| eval type=if(isnull(type),"unknown",type)
| rex "source::(?<eventsource>[^\|]*)\|host::(?<eventhost>[^\|]*)\|(?<eventsourcetype>[^\|]*)\|(?<eventport>[^\s]*)"
| eval eventsourcetype=if(isnull(eventsourcetype),data_sourcetype,eventsourcetype)
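The first rex in the search classifies each splunkd WARN line by the first known timestamp/line-breaking phrase it contains, defaulting to "unknown". A minimal Python sketch of that classification step (the sample log lines are invented):

```python
import re

# Same alternation as the SPL rex: first matching phrase becomes the type.
TYPE_RX = re.compile(
    r"(?P<type>Failed to parse timestamp"
    r"|suspiciously far away"
    r"|outside of the acceptable time window"
    r"|too far away from the previous"
    r"|Accepted time format has changed"
    r"|Breaking event because limit of \d+"
    r"|Truncating line because limit of \d+)")

def classify(line):
    m = TYPE_RX.search(line)
    return m.group("type") if m else "unknown"

print(classify("WARN DateParserVerbose - Failed to parse timestamp for event"))
print(classify("WARN AggregatorMiningProcessor - some other warning"))  # unknown
```

Grouping by this type plus the extracted eventsource/eventhost/eventsourcetype quickly shows which inputs have timestamp or line-breaking problems.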
Azeemering / Splunk .conf2017 Community Theater Sessions
Created October 3, 2017 21:49
All Splunk .conf2017 Community Theater Sessions Slides
https://conf.splunk.com/files/2017/slides/automate-security-operations-with-phantom-splunk.pdf
https://conf.splunk.com/files/2017/slides/creating-welcome-pages.pdf
https://conf.splunk.com/files/2017/slides/sandboxing-with-splunk-with-docker.pdf
https://conf.splunk.com/files/2017/slides/using-netflow-for-insider-threat-detection.pdf
https://conf.splunk.com/files/2017/slides/advanced-dashboarding-tips-techniques.pdf
https://conf.splunk.com/files/2017/slides/analyzing-and-measuring-webinar-impact-with-splunk.pdf
https://conf.splunk.com/files/2017/slides/beat-business-rules-the-power-of-combining-text-mining-and-machine-learning-on-your-logs-for-accurate-and-fully-automatic-classification.pdf
https://conf.splunk.com/files/2017/slides/building-a-product-business-on-top-of-splunk.pdf
https://conf.splunk.com/files/2017/slides/bushfire-alerting-automation-system.pdf
https://conf.splunk.com/files/2017/slides/business-value-delivery-for-enterprise-splunk-customers-a-use-case-from-abn-amro-bank.pdf
Azeemering / Splunk .conf2017 Breakout Sessions Slides
Last active June 21, 2018 15:31
All Splunk .conf2017 Breakout Sessions
https://conf.splunk.com/files/2017/slides/a-day-in-the-life-of-a-gdpr-breach.pdf
https://conf.splunk.com/files/2017/slides/a-journey-to-awesome-without-the-baggage-how-difficult-became-easy-with-splunk-at-john-lewis.pdf
https://conf.splunk.com/files/2017/slides/a-trip-through-the-splunk-data-ingestion-and-retrieval-pipeline.pdf
https://conf.splunk.com/files/2017/slides/apt-splunking-searching-for-adversaries-with-quadrants-and-other-methods.pdf
https://conf.splunk.com/files/2017/slides/achieve-operational-efficiency-in-car-manufacturing-with-advanced-analytics.pdf
https://conf.splunk.com/files/2017/slides/acute-care-telemetry-datastream-process-monitoring-visualization-and-search-with-splunk.pdf
https://conf.splunk.com/files/2017/slides/advanced-analytics-with-splunk-using-apache-spark-machine-learning-and-spark-graph.pdf
https://conf.splunk.com/files/2017/slides/advanced-machine-learning-using-the-extensible-ml-api.pdf
https://conf.splunk.com/files/2017/slides/advanced-security-monitoring-for-critical-groups