AM_Log
2015-04-07 10:00:05,257 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426000_129/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426000_129 sessionKey=ZanA2yt8WM0YV7e4Bsm6^XkeKgCSP5wIqs7Kr5k3W0M0tDwR49oj_MbZM_FHZLXLJezM2^MNrsffLMFGATiXdFk0uOC5amEhFywsB7aNB8T1Ds7XpQjBCYruoqjuOD2_NNx4tftZ3vJNaCG1so alert=To Global Failed Login >3 Alert
2015-04-07 10:00:05,263 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426000_129' has been fired.
2015-04-07 10:00:05,541 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-07 10:00:05,541 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-07 10:00:05,548 DEBUG Incident settings: [ ]
2015-04-07 10:00:05,548 INFO No incident settings found for To Global Failed Login >3 Alert, switching back to defaults.
2015-04-07 10:00:05,548 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign": false, "auto_previous_resolve": false, "urgency": "low", "alert_script": "", "category": "", "subcategory": "", "tags": "", "auto_assign_user": "", "run_alert_script": false}
2015-04-07 10:00:05,556 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 8 results.
2015-04-07 10:00:05,573 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-07 10:00:05,573 DEBUG Transformed 24h into 86400 seconds
2015-04-07 10:00:05,625 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-07 10:00:05,625 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-07 10:00:05,625 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-07 10:00:05,625 DEBUG Matched impact in lookup, returning value=medium
2015-04-07 10:00:05,625 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426000_129
2015-04-07 10:00:05,967 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=e10360c4-b55f-4647-b990-6b5ebeed8204
2015-04-07 10:00:05,993 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-07 10:00:05,993 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-07 10:00:05,993 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': 'low'}
2015-04-07 10:00:05,994 DEBUG Matched priority in lookup, returning value=low
2015-04-07 10:00:06,032 DEBUG Create event will be: time=2015-04-07T10:00:06.032579 severity=INFO origin="alert_handler" event_id="57409d1035526f0bf3200554c0fe4407" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="e10360c4-b55f-4647-b990-6b5ebeed8204" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426000_129" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428426001"
2015-04-07 10:00:06,039 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426000_129 with incident_id=e10360c4-b55f-4647-b990-6b5ebeed8204
2015-04-07 10:00:06,067 DEBUG results for incident_id=e10360c4-b55f-4647-b990-6b5ebeed8204 written to collection.
2015-04-07 10:00:06,067 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426000_129 incident_id=e10360c4-b55f-4647-b990-6b5ebeed8204 result_id=0 written to collection incident_results
2015-04-07 10:00:06,067 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-07 10:00:06,074 INFO Alert metadata written to index=alerts-to-inf
2015-04-07 10:00:06,074 INFO Alert handler finished. duration=0.817s
2015-04-07 10:10:05,343 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428426600_166/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428426600_166 sessionKey=xPfM83h4e5YzMJ2BfZBFpgyJdj9K^IdFZ070JGRcWtmZDRQASK85zxESUG4HAW7uN^3KcDW9Tcv0QZE14vpMyG^d_gndWqQwAtl^oMZVc3kbjWtWyeL0K6Jq5MvlKaG2SneDRAtvxfJEFpA alert=TO Failed Logon >3 Alert
2015-04-07 10:10:05,349 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428426600_166' has been fired.
2015-04-07 10:10:05,624 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low"}
2015-04-07 10:10:05,624 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-07 10:10:05,630 DEBUG Incident settings: [ { "alert" : "TO Failed Logon >3 Alert", "tags" : "AD", "auto_previous_resolve" : false, "urgency" : "high", "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Logon Failed", "run_alert_script" : false, "category" : "Active Directory", "auto_assign" : false, "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ]
2015-04-07 10:10:05,630 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-07 10:10:05,630 DEBUG Incident config after getting settings: {"_key": "55240dda3403e174be1efd92", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "subcategory": "Logon Failed", "category": "Active Directory", "auto_assign": false, "auto_assign_user": "", "tags": "AD", "alert": "TO Failed Logon >3 Alert", "alert_script": "", "auto_previous_resolve": false, "urgency": "high", "_user": "nobody"}
2015-04-07 10:10:05,638 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 12 results.
2015-04-07 10:10:05,656 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-07 10:10:05,656 DEBUG Transformed 24h into 86400 seconds
2015-04-07 10:10:05,658 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426600_167/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426600_167 sessionKey=BK1901hqXWe2M1c3PZGTbw5^DUDMSpP22z7zjpNmAw7lrC4FUiUkJSoLKGaKJmtOeI14bxMS0e26tl7dOmzZV93E4r1gvTT6VmcFGR9bOS7WeBwWecsshWdB5UEwgelwqsmFnCZsp_eu9o alert=To Global Failed Login >3 Alert
2015-04-07 10:10:05,665 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426600_167' has been fired.
2015-04-07 10:10:05,684 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-07 10:10:05,684 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-07 10:10:05,685 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-07 10:10:05,685 DEBUG Matched impact in lookup, returning value=medium
2015-04-07 10:10:05,685 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428426600_166
2015-04-07 10:10:05,933 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"}
2015-04-07 10:10:05,933 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-07 10:10:05,939 DEBUG Incident settings: [ { "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "auto_previous_resolve" : false, "urgency" : "high", "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Logon Failed", "run_alert_script" : false, "category" : "Active Directory", "auto_assign" : false, "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-07 10:10:05,940 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-07 10:10:05,940 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "alert": "To Global Failed Login >3 Alert", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "auto_assign": false, "_key": "55240dda3403e174be1efd93", "subcategory": "Logon Failed", "_user": "nobody", "run_alert_script": false, "urgency": "high", "alert_script": "", "auto_assign_user": "", "category": "Active Directory", "tags": "AD"}
2015-04-07 10:10:05,948 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 8 results.
2015-04-07 10:10:05,967 DEBUG Parsed savedsearch settings: severity=5 expiry=24h digest_mode=True
2015-04-07 10:10:05,968 DEBUG Transformed 24h into 86400 seconds
2015-04-07 10:10:05,987 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=1be363e4-8d82-4c3b-9342-895839208a41
2015-04-07 10:10:05,994 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-07 10:10:05,995 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-07 10:10:05,995 DEBUG Querying lookup with filter={'severity_id': '5'}
2015-04-07 10:10:05,995 DEBUG Matched impact in lookup, returning value=high
2015-04-07 10:10:05,995 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426600_167
2015-04-07 10:10:06,016 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-07 10:10:06,016 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-07 10:10:06,016 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-07 10:10:06,016 DEBUG Matched priority in lookup, returning value=high
2015-04-07 10:10:06,059 DEBUG Create event will be: time=2015-04-07T10:10:06.059861 severity=INFO origin="alert_handler" event_id="122b08b8a2f027c559d7ddbf4432a455" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="1be363e4-8d82-4c3b-9342-895839208a41" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428426600_166" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428426601"
2015-04-07 10:10:06,066 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428426600_166 with incident_id=1be363e4-8d82-4c3b-9342-895839208a41
2015-04-07 10:10:06,094 DEBUG results for incident_id=1be363e4-8d82-4c3b-9342-895839208a41 written to collection.
2015-04-07 10:10:06,094 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428426600_166 incident_id=1be363e4-8d82-4c3b-9342-895839208a41 result_id=0 written to collection incident_results
2015-04-07 10:10:06,094 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-07 10:10:06,101 INFO Alert metadata written to index=alerts-to-inf
2015-04-07 10:10:06,101 INFO Alert handler finished. duration=0.759s
2015-04-07 10:10:06,290 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=318a8715-45d4-445e-b4c8-515c07af5860
2015-04-07 10:10:06,316 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-07 10:10:06,316 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-07 10:10:06,316 DEBUG Querying lookup with filter={'impact': 'high', 'urgency': u'high'}
2015-04-07 10:10:06,316 DEBUG Matched priority in lookup, returning value=critical
2015-04-07 10:10:06,334 DEBUG Create event will be: time=2015-04-07T10:10:06.334074 severity=INFO origin="alert_handler" event_id="27ebd7364efe74f9573169a568380627" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="318a8715-45d4-445e-b4c8-515c07af5860" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426600_167" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428426601"
2015-04-07 10:10:06,340 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426600_167 with incident_id=318a8715-45d4-445e-b4c8-515c07af5860
2015-04-07 10:10:06,368 DEBUG results for incident_id=318a8715-45d4-445e-b4c8-515c07af5860 written to collection.
2015-04-07 10:10:06,368 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428426600_167 incident_id=318a8715-45d4-445e-b4c8-515c07af5860 result_id=0 written to collection incident_results
2015-04-07 10:10:06,368 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-07 10:10:06,376 INFO Alert metadata written to index=alerts-to-inf
2015-04-07 10:10:06,376 INFO Alert handler finished. duration=0.718s
2015-04-07 10:20:06,519 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428427200_207/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428427200_207 sessionKey=33X8gt8RYV81NAjME1e5YnRisAt2^IHgv0Xj17GiVTGIosCDhE182HGO3KZIqH^8GuYxulFonDCrP4oU1vb5BYUfJSNpYafq__VfzhxC3s7VX_RQ2BPsowEoIr8i_Um3Se8eLFnFYCoOREx alert=To Global Failed Login >3 Alert
2015-04-07 10:20:06,525 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428427200_207' has been fired.
2015-04-07 10:20:06,601 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427200_206/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427200_206 sessionKey=VT_uWExYhwnZXSv4CzJNuEuVkceW^kzmB47TZevr9EF8VDUvOLz1NqnIqFyDQa7aI^mrxNqB9iHgFS73TANjKhrcwRMLIsyN9Q^SjSswk8SWteqapQoMvjyciis1tLV^eCQg4LNVAFLc alert=TO Failed Logon >3 Alert
2015-04-07 10:20:06,607 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427200_206' has been fired.
2015-04-07 10:20:06,796 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-07 10:20:06,797 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-07 10:20:06,803 DEBUG Incident settings: [ { "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "auto_previous_resolve" : false, "urgency" : "high", "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Logon Failed", "run_alert_script" : false, "category" : "Active Directory", "auto_assign" : false, "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-07 10:20:06,803 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-07 10:20:06,803 DEBUG Incident config after getting settings: {"urgency": "high", "tags": "AD", "auto_ttl_resolve": false, "category": "Active Directory", "auto_assign": false, "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "auto_assign_user": "", "_key": "55240dda3403e174be1efd93", "run_alert_script": false, "_user": "nobody", "alert_script": "", "alert": "To Global Failed Login >3 Alert", "subcategory": "Logon Failed"}
2015-04-07 10:20:06,811 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 7 results.
2015-04-07 10:20:06,827 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-07 10:20:06,827 DEBUG Transformed 24h into 86400 seconds
2015-04-07 10:20:06,873 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-07 10:20:06,873 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-07 10:20:06,874 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-07 10:20:06,874 DEBUG Matched impact in lookup, returning value=medium
2015-04-07 10:20:06,874 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428427200_207
2015-04-07 10:20:06,885 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_owner": "unassigned", "default_impact": "low"}
2015-04-07 10:20:06,885 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-07 10:20:06,892 DEBUG Incident settings: [ { "alert" : "TO Failed Logon >3 Alert", "tags" : "AD", "auto_previous_resolve" : false, "urgency" : "high", "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Logon Failed", "run_alert_script" : false, "category" : "Active Directory", "auto_assign" : false, "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ]
2015-04-07 10:20:06,892 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-07 10:20:06,892 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "_user": "nobody", "subcategory": "Logon Failed", "tags": "AD", "alert": "TO Failed Logon >3 Alert", "category": "Active Directory", "alert_script": "", "auto_previous_resolve": false, "urgency": "high", "run_alert_script": false, "_key": "55240dda3403e174be1efd92"}
2015-04-07 10:20:06,901 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 12 results.
2015-04-07 10:20:06,919 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-07 10:20:06,919 DEBUG Transformed 24h into 86400 seconds
2015-04-07 10:20:06,948 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-07 10:20:06,948 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-07 10:20:06,949 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-07 10:20:06,949 DEBUG Matched impact in lookup, returning value=medium
2015-04-07 10:20:06,949 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427200_206
2015-04-07 10:20:07,178 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=bdf61b6a-328f-4329-82ae-b3f05a63f141
2015-04-07 10:20:07,208 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-07 10:20:07,209 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-07 10:20:07,209 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-07 10:20:07,209 DEBUG Matched priority in lookup, returning value=high
2015-04-07 10:20:07,255 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=cefd4fe4-3640-446a-a535-546f82ff17af
2015-04-07 10:20:07,267 DEBUG Create event will be: time=2015-04-07T10:20:07.267185 severity=INFO origin="alert_handler" event_id="a90e425eaee8f250498395e79a8d0d02" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="bdf61b6a-328f-4329-82ae-b3f05a63f141" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428427200_207" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428427201"
2015-04-07 10:20:07,274 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428427200_207 with incident_id=bdf61b6a-328f-4329-82ae-b3f05a63f141
2015-04-07 10:20:07,303 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-07 10:20:07,306 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-07 10:20:07,306 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-07 10:20:07,306 DEBUG Matched priority in lookup, returning value=high
2015-04-07 10:20:07,336 DEBUG results for incident_id=bdf61b6a-328f-4329-82ae-b3f05a63f141 written to collection.
2015-04-07 10:20:07,336 DEBUG Create event will be: time=2015-04-07T10:20:07.336663 severity=INFO origin="alert_handler" event_id="b47e8c15f1fea927128324197141e72e" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="cefd4fe4-3640-446a-a535-546f82ff17af" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427200_206" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428427201"
2015-04-07 10:20:07,336 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428427200_207 incident_id=bdf61b6a-328f-4329-82ae-b3f05a63f141 result_id=0 written to collection incident_results
2015-04-07 10:20:07,336 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-07 10:20:07,345 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427200_206 with incident_id=cefd4fe4-3640-446a-a535-546f82ff17af
2015-04-07 10:20:07,348 INFO Alert metadata written to index=alerts-to-inf
2015-04-07 10:20:07,348 INFO Alert handler finished. duration=0.829s
2015-04-07 10:20:07,370 DEBUG results for incident_id=cefd4fe4-3640-446a-a535-546f82ff17af written to collection.
2015-04-07 10:20:07,370 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427200_206 incident_id=cefd4fe4-3640-446a-a535-546f82ff17af result_id=0 written to collection incident_results
2015-04-07 10:20:07,370 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-07 10:20:07,378 INFO Alert metadata written to index=alerts-to-inf
2015-04-07 10:20:07,378 INFO Alert handler finished. duration=0.777s
2015-04-07 10:22:04,940 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427320_222/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427320_222 sessionKey=7BDBO_EZh8RJVv3EPe5vK4U_OdjqsoBeexTwzTM^gUliiAvbrH6Rpu_Jb00_QZok7MqdJmMxnWJm_^3xMMfNrxzwAy9eHzSCMnlekLLfMhFrEF_nSRZO9qZNTgrvjWxW5UutIJ_OrWs4 alert=TO Failed Logon >3 Alert | |
2015-04-07 10:22:04,946 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427320_222' has been fired. | |
2015-04-07 10:22:05,219 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"} | |
2015-04-07 10:22:05,219 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-07 10:22:05,225 DEBUG Incident settings: [ { "alert" : "TO Failed Logon >3 Alert", "tags" : "AD", "auto_previous_resolve" : false, "urgency" : "high", "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Logon Failed", "run_alert_script" : false, "category" : "Active Directory", "auto_assign" : false, "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ] | |
2015-04-07 10:22:05,225 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-07 10:22:05,225 DEBUG Incident config after getting settings: {"category": "Active Directory", "tags": "AD", "auto_assign_owner": "unassigned", "alert_script": "", "auto_assign_user": "", "_key": "55240dda3403e174be1efd92", "urgency": "high", "subcategory": "Logon Failed", "_user": "nobody", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign": false, "auto_previous_resolve": false, "alert": "TO Failed Logon >3 Alert"} | |
2015-04-07 10:22:05,233 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 9 results. | |
2015-04-07 10:22:05,250 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-07 10:22:05,250 DEBUG Transformed 24h into 86400 seconds | |
2015-04-07 10:22:05,276 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-07 10:22:05,276 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-07 10:22:05,276 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-07 10:22:05,277 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-07 10:22:05,277 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427320_222 | |
2015-04-07 10:22:05,570 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=bd6ce207-010e-4c77-b0e1-5e5de9adc65d | |
2015-04-07 10:22:05,597 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-07 10:22:05,597 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-07 10:22:05,597 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-07 10:22:05,597 DEBUG Matched priority in lookup, returning value=high | |
2015-04-07 10:22:05,647 DEBUG Create event will be: time=2015-04-07T10:22:05.647077 severity=INFO origin="alert_handler" event_id="36ff064c18fc079357ee55c9c962205e" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="bd6ce207-010e-4c77-b0e1-5e5de9adc65d" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427320_222" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428427321" | |
2015-04-07 10:22:05,653 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427320_222 with incident_id=bd6ce207-010e-4c77-b0e1-5e5de9adc65d | |
2015-04-07 10:22:05,681 DEBUG results for incident_id=bd6ce207-010e-4c77-b0e1-5e5de9adc65d written to collection. | |
2015-04-07 10:22:05,681 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427320_222 incident_id=bd6ce207-010e-4c77-b0e1-5e5de9adc65d result_id=0 written to collection incident_results | |
2015-04-07 10:22:05,681 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-07 10:22:05,688 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-07 10:22:05,689 INFO Alert handler finished. duration=0.749s | |
2015-04-07 10:23:04,996 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427380_224/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427380_224 sessionKey=WHXMVqph0_q2VwDDSIiUEZX3PzOSLUdPw2FBk1BMlAlTJ7TTNIxKWK_V2ltmBSLBlsmq37b0aptXeGuJ^uET00Cq3gJpF9rUM7o9eBAnccNptfSmlMihlznPwapXrfK7keibxUpMPeY9nSa alert=TO Failed Logon >3 Alert | |
2015-04-07 10:23:05,002 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427380_224' has been fired. | |
2015-04-07 10:23:05,272 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"} | |
2015-04-07 10:23:05,272 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-07 10:23:05,278 DEBUG Incident settings: [ { "alert" : "TO Failed Logon >3 Alert", "tags" : "AD", "auto_previous_resolve" : false, "urgency" : "high", "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Logon Failed", "run_alert_script" : false, "category" : "Active Directory", "auto_assign" : false, "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ] | |
2015-04-07 10:23:05,278 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-07 10:23:05,278 DEBUG Incident config after getting settings: {"run_alert_script": false, "auto_previous_resolve": false, "subcategory": "Logon Failed", "_user": "nobody", "tags": "AD", "category": "Active Directory", "auto_ttl_resolve": false, "urgency": "high", "auto_assign_user": "", "auto_assign": false, "auto_assign_owner": "unassigned", "alert": "TO Failed Logon >3 Alert", "_key": "55240dda3403e174be1efd92", "alert_script": ""} | |
2015-04-07 10:23:05,286 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 9 results. | |
2015-04-07 10:23:05,302 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-07 10:23:05,303 DEBUG Transformed 24h into 86400 seconds | |
2015-04-07 10:23:05,328 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-07 10:23:05,328 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-07 10:23:05,328 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-07 10:23:05,328 DEBUG Matched impact in lookup, returning value=medium
2015-04-07 10:23:05,328 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427380_224
2015-04-07 10:23:05,621 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=146e00fd-da71-41d1-9326-b2efa4067e1d
2015-04-07 10:23:05,648 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-07 10:23:05,648 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-07 10:23:05,648 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-07 10:23:05,648 DEBUG Matched priority in lookup, returning value=high
2015-04-07 10:23:05,687 DEBUG Create event will be: time=2015-04-07T10:23:05.687161 severity=INFO origin="alert_handler" event_id="8f8b20c63c96498aad8a3f7d4cd31312" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="146e00fd-da71-41d1-9326-b2efa4067e1d" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427380_224" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428427381"
2015-04-07 10:23:05,693 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427380_224 with incident_id=146e00fd-da71-41d1-9326-b2efa4067e1d
2015-04-07 10:23:05,721 DEBUG results for incident_id=146e00fd-da71-41d1-9326-b2efa4067e1d written to collection.
2015-04-07 10:23:05,721 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428427380_224 incident_id=146e00fd-da71-41d1-9326-b2efa4067e1d result_id=0 written to collection incident_results
2015-04-07 10:23:05,721 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-07 10:23:05,728 INFO Alert metadata written to index=alerts-to-inf
2015-04-07 10:23:05,728 INFO Alert handler finished. duration=0.732s
2015-04-08 07:20:04,224 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5a06b9498aceed650_at_1428502800_5491/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5a06b9498aceed650_at_1428502800_5491 sessionKey=sMx_hI^tcIAH1GA2M^5yLpQA8np7io^8l3WjCb6bQjqNbcAif6z8cFhXdcpdxciSsB2LpYv1x5TblzV8HNYVz9q^TCkymhxnO9YXcFaH5A_RsDyZ_v_iFFnRW1dSHH1T4y6Gc1779P0 alert=TO ClamAV Alert
2015-04-08 07:20:04,230 INFO alert_handler started because alert 'TO ClamAV Alert' with id 'scheduler__XXmeXX__search__RMD5a06b9498aceed650_at_1428502800_5491' has been fired.
2015-04-08 07:20:04,500 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-08 07:20:04,500 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20ClamAV%20Alert%22%7D
2015-04-08 07:20:04,506 DEBUG Incident settings: [ { "auto_assign" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Linux", "alert" : "TO ClamAV Alert", "category" : "Antivirus", "run_alert_script" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "urgency" : "high", "tags" : "Virus", "_user" : "nobody", "_key" : "552436d73403e12f580a9533" } ]
2015-04-08 07:20:04,506 INFO Found incident settings for TO ClamAV Alert
2015-04-08 07:20:04,507 DEBUG Incident config after getting settings: {"_key": "552436d73403e12f580a9533", "auto_assign_owner": "unassigned", "tags": "Virus", "auto_assign_user": "", "alert_script": "", "_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "urgency": "high", "auto_previous_resolve": false, "subcategory": "Linux", "run_alert_script": false, "category": "Antivirus", "alert": "TO ClamAV Alert"}
2015-04-08 07:20:04,514 INFO Found job for alert TO ClamAV Alert. Context is 'search' with 1 results.
2015-04-08 07:20:04,532 DEBUG Parsed savedsearch settings: severity=5 expiry=24h digest_mode=True
2015-04-08 07:20:04,532 DEBUG Transformed 24h into 86400 seconds
2015-04-08 07:20:04,574 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-08 07:20:04,574 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-08 07:20:04,574 DEBUG Querying lookup with filter={'severity_id': '5'}
2015-04-08 07:20:04,574 DEBUG Matched impact in lookup, returning value=high
2015-04-08 07:20:04,574 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5a06b9498aceed650_at_1428502800_5491
2015-04-08 07:20:04,871 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=92d67915-e2d8-45a2-b08e-12fe9df48bc8
2015-04-08 07:20:04,899 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-08 07:20:04,899 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-08 07:20:04,899 DEBUG Querying lookup with filter={'impact': 'high', 'urgency': u'high'}
2015-04-08 07:20:04,900 DEBUG Matched priority in lookup, returning value=critical
2015-04-08 07:20:04,944 DEBUG Create event will be: time=2015-04-08T07:20:04.944316 severity=INFO origin="alert_handler" event_id="deda835f6207d86f9fc4b25b2e4719d0" user="splunk-system-user" action="create" alert="TO ClamAV Alert" incident_id="92d67915-e2d8-45a2-b08e-12fe9df48bc8" job_id="scheduler__XXmeXX__search__RMD5a06b9498aceed650_at_1428502800_5491" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428502801"
2015-04-08 07:20:04,951 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5a06b9498aceed650_at_1428502800_5491 with incident_id=92d67915-e2d8-45a2-b08e-12fe9df48bc8
2015-04-08 07:20:04,978 DEBUG results for incident_id=92d67915-e2d8-45a2-b08e-12fe9df48bc8 written to collection.
2015-04-08 07:20:04,979 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5a06b9498aceed650_at_1428502800_5491 incident_id=92d67915-e2d8-45a2-b08e-12fe9df48bc8 result_id=0 written to collection incident_results
2015-04-08 07:20:04,979 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-08 07:20:04,985 INFO Alert metadata written to index=alerts-to-inf
2015-04-08 07:20:04,985 INFO Alert handler finished. duration=0.762s
2015-04-08 07:20:05,381 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55fb18d0d921a7b5a_at_1428502800_5492/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55fb18d0d921a7b5a_at_1428502800_5492 sessionKey=wHI139kU7H1YIuvoC0M6XGIIDqvxHVjjtHvH^Ugrx^l62fKGMy1nagmYaWZ_Dz5UgC_U9Wm0ioNn^Bz5WeUvRFYYtaYWJA4QWF_ijZZlyEB_I9XwTYYRo9i3o1IU_SkpiuF0ykGITF alert=TO TrendAV Alert
2015-04-08 07:20:05,387 INFO alert_handler started because alert 'TO TrendAV Alert' with id 'scheduler__XXmeXX__search__RMD55fb18d0d921a7b5a_at_1428502800_5492' has been fired.
2015-04-08 07:20:05,660 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-08 07:20:05,661 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20TrendAV%20Alert%22%7D
2015-04-08 07:20:05,667 DEBUG Incident settings: [ { "auto_assign" : false, "auto_assign_owner" : "unassigned", "subcategory" : "Windows", "alert" : "TO TrendAV Alert", "category" : "Antivirus", "run_alert_script" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "urgency" : "high", "tags" : "Virus", "_user" : "nobody", "_key" : "552436d73403e12f580a9535" } ]
2015-04-08 07:20:05,668 INFO Found incident settings for TO TrendAV Alert
2015-04-08 07:20:05,668 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "category": "Antivirus", "auto_assign": false, "_user": "nobody", "urgency": "high", "auto_ttl_resolve": false, "run_alert_script": false, "tags": "Virus", "alert": "TO TrendAV Alert", "subcategory": "Windows", "_key": "552436d73403e12f580a9535", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false}
2015-04-08 07:20:05,675 INFO Found job for alert TO TrendAV Alert. Context is 'search' with 1 results.
2015-04-08 07:20:05,692 DEBUG Parsed savedsearch settings: severity=5 expiry=24h digest_mode=True
2015-04-08 07:20:05,692 DEBUG Transformed 24h into 86400 seconds
2015-04-08 07:20:05,717 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-08 07:20:05,717 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-08 07:20:05,717 DEBUG Querying lookup with filter={'severity_id': '5'}
2015-04-08 07:20:05,717 DEBUG Matched impact in lookup, returning value=high
2015-04-08 07:20:05,717 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55fb18d0d921a7b5a_at_1428502800_5492
2015-04-08 07:20:06,012 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=6c656df6-c906-4f21-ab81-f199aae24cb3
2015-04-08 07:20:06,038 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-08 07:20:06,038 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-08 07:20:06,038 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'high'}
2015-04-08 07:20:06,038 DEBUG Matched priority in lookup, returning value=critical
2015-04-08 07:20:06,071 DEBUG Create event will be: time=2015-04-08T07:20:06.071537 severity=INFO origin="alert_handler" event_id="a19d967904829cfada4aaf709c8fed44" user="splunk-system-user" action="create" alert="TO TrendAV Alert" incident_id="6c656df6-c906-4f21-ab81-f199aae24cb3" job_id="scheduler__XXmeXX__search__RMD55fb18d0d921a7b5a_at_1428502800_5492" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428502801"
2015-04-08 07:20:06,077 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55fb18d0d921a7b5a_at_1428502800_5492 with incident_id=6c656df6-c906-4f21-ab81-f199aae24cb3
2015-04-08 07:20:06,105 DEBUG results for incident_id=6c656df6-c906-4f21-ab81-f199aae24cb3 written to collection.
2015-04-08 07:20:06,105 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55fb18d0d921a7b5a_at_1428502800_5492 incident_id=6c656df6-c906-4f21-ab81-f199aae24cb3 result_id=0 written to collection incident_results
2015-04-08 07:20:06,106 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-08 07:20:06,112 INFO Alert metadata written to index=alerts-to-inf
2015-04-08 07:20:06,112 INFO Alert handler finished. duration=0.732s
2015-04-09 08:20:06,290 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428592800_13064/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428592800_13064 sessionKey=1Y2uV^gzIR4MxBgI8uuiu4tQZb2tU0DbLxWN4yiuwgugNZysbRjVbjeupyG8F2AWpm0dYOFkikMZVFzrbtdNVNgO8xX0Xwv0l_bQ17ajMdz4xEeHF_PahG8_naa5UX5sFH4YaMI4baTVTkXP0F alert=To Global Failed Login >3 Alert
2015-04-09 08:20:06,296 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428592800_13064' has been fired.
2015-04-09 08:20:06,410 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428592800_13061/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428592800_13061 sessionKey=Enxy4YQE20MMOl10Re6VCRcSSlIRMcw07etGPTYMkECYAdtwTqBKkzguIQg8_bp1DxnE_4zJlCQgyAEPc2cYSDMxYYFSSVmDSxE2BUEEFq^qgx3l6ZLMAH^GDoKzVwR1B8V05Azvsb1D11rZKiL alert=TO Failed Logon >3 Alert
2015-04-09 08:20:06,416 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428592800_13061' has been fired.
2015-04-09 08:20:06,575 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"}
2015-04-09 08:20:06,576 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:20:06,582 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Global Logon Failed", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:20:06,582 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:20:06,582 DEBUG Incident config after getting settings: {"tags": "AD", "alert": "To Global Failed Login >3 Alert", "urgency": "high", "auto_previous_resolve": false, "_user": "nobody", "_key": "55240dda3403e174be1efd93", "alert_script": "", "run_alert_script": false, "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "subcategory": "Global Logon Failed", "category": "Active Directory", "auto_assign": false, "auto_assign_user": ""}
2015-04-09 08:20:06,590 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 5 results.
2015-04-09 08:20:06,608 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:20:06,608 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:20:06,652 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:20:06,653 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:20:06,653 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:20:06,653 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:20:06,653 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428592800_13064
2015-04-09 08:20:06,699 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"}
2015-04-09 08:20:06,699 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-09 08:20:06,705 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ]
2015-04-09 08:20:06,706 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-09 08:20:06,706 DEBUG Incident config after getting settings: {"auto_assign_user": "", "run_alert_script": false, "auto_ttl_resolve": false, "urgency": "high", "auto_previous_resolve": false, "tags": "AD", "auto_assign_owner": "unassigned", "category": "Active Directory", "auto_assign": false, "alert": "TO Failed Logon >3 Alert", "alert_script": "", "_key": "55240dda3403e174be1efd92", "subcategory": "Logon Failed", "_user": "nobody"}
2015-04-09 08:20:06,713 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 4 results.
2015-04-09 08:20:06,730 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:20:06,730 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:20:06,757 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:20:06,757 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:20:06,757 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:20:06,757 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:20:06,757 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428592800_13061
2015-04-09 08:20:06,955 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=49c3d996-149b-40c4-b282-44756568dbdb
2015-04-09 08:20:06,983 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:20:06,983 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:20:06,983 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:20:06,983 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:20:07,025 DEBUG Create event will be: time=2015-04-09T08:20:07.025196 severity=INFO origin="alert_handler" event_id="7613a0dc710a0d77776566e5581e3ea5" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="49c3d996-149b-40c4-b282-44756568dbdb" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428592800_13064" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428592801"
2015-04-09 08:20:07,032 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428592800_13064 with incident_id=49c3d996-149b-40c4-b282-44756568dbdb
2015-04-09 08:20:07,059 DEBUG results for incident_id=49c3d996-149b-40c4-b282-44756568dbdb written to collection.
2015-04-09 08:20:07,059 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428592800_13064 incident_id=49c3d996-149b-40c4-b282-44756568dbdb result_id=0 written to collection incident_results
2015-04-09 08:20:07,059 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:20:07,068 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:20:07,068 INFO Alert handler finished. duration=0.779s
2015-04-09 08:20:07,071 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=2af1f733-e117-4d25-95ca-e6d30697d871
2015-04-09 08:20:07,101 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:20:07,101 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:20:07,101 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:20:07,101 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:20:07,129 DEBUG Create event will be: time=2015-04-09T08:20:07.128995 severity=INFO origin="alert_handler" event_id="e893f2db209d7ff46004e02e55e105f0" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="2af1f733-e117-4d25-95ca-e6d30697d871" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428592800_13061" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428592801"
2015-04-09 08:20:07,135 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428592800_13061 with incident_id=2af1f733-e117-4d25-95ca-e6d30697d871
2015-04-09 08:20:07,163 DEBUG results for incident_id=2af1f733-e117-4d25-95ca-e6d30697d871 written to collection.
2015-04-09 08:20:07,163 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428592800_13061 incident_id=2af1f733-e117-4d25-95ca-e6d30697d871 result_id=0 written to collection incident_results
2015-04-09 08:20:07,164 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:20:07,170 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:20:07,171 INFO Alert handler finished. duration=0.761s
2015-04-09 08:30:05,535 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593400_13103/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593400_13103 sessionKey=yedIYNjqHe^7ns^m1Y^JyrhHoiGuT_8xcIf0Zquivk9sNu5E5lU0^yuQTgBMG8^nKm^Bgx9Zc3bXuMb2T0MbbU8AVKZDtVWpYmeBy^VJuC5v_7khE6oKBv8t8EVFHXUSivYs4yp85^fzyZR alert=To Global Failed Login >3 Alert
2015-04-09 08:30:05,541 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593400_13103' has been fired.
2015-04-09 08:30:05,827 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-09 08:30:05,827 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:30:05,833 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Global Logon Failed", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:30:05,834 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:30:05,834 DEBUG Incident config after getting settings: {"_key": "55240dda3403e174be1efd93", "auto_assign_owner": "unassigned", "urgency": "high", "alert": "To Global Failed Login >3 Alert", "category": "Active Directory", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign_user": "", "auto_assign": false, "_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "subcategory": "Global Logon Failed", "tags": "AD"}
2015-04-09 08:30:05,841 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 4 results.
2015-04-09 08:30:05,858 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:30:05,858 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:30:05,883 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:30:05,884 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:30:05,884 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:30:05,884 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:30:05,884 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593400_13103
2015-04-09 08:30:06,196 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=23c77640-9ce9-490e-96fa-843c353e94bf
2015-04-09 08:30:06,222 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:30:06,223 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:30:06,223 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:30:06,223 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:30:06,268 DEBUG Create event will be: time=2015-04-09T08:30:06.268463 severity=INFO origin="alert_handler" event_id="d7c303c6c7f60988972014e0f7f7bdd8" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="23c77640-9ce9-490e-96fa-843c353e94bf" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593400_13103" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593401"
2015-04-09 08:30:06,275 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593400_13103 with incident_id=23c77640-9ce9-490e-96fa-843c353e94bf
2015-04-09 08:30:06,302 DEBUG results for incident_id=23c77640-9ce9-490e-96fa-843c353e94bf written to collection.
2015-04-09 08:30:06,303 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593400_13103 incident_id=23c77640-9ce9-490e-96fa-843c353e94bf result_id=0 written to collection incident_results
2015-04-09 08:30:06,303 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:30:06,310 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:30:06,310 INFO Alert handler finished. duration=0.775s
2015-04-09 08:30:06,371 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593400_13115/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593400_13115 sessionKey=nXOpRyVV23Kk594xsTypuhYPnjwvXheOFKHfxxp6JUY9sCnbt0uvNIn2Gtr^VoKRH87NqFKVvVRofQcTjjAMhi7MB51wrEvDu_tTc3y47UmqenloH5AkEI6eU7UzA8_zwGOg6L3UrFROIQHNTc7 alert=TO Failed Logon >3 Alert
2015-04-09 08:30:06,377 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593400_13115' has been fired.
2015-04-09 08:30:06,650 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_priority": "low"}
2015-04-09 08:30:06,650 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-09 08:30:06,657 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ]
2015-04-09 08:30:06,657 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-09 08:30:06,657 DEBUG Incident config after getting settings: {"auto_assign_user": "", "auto_assign": false, "auto_ttl_resolve": false, "subcategory": "Logon Failed", "_user": "nobody", "auto_assign_owner": "unassigned", "alert_script": "", "_key": "55240dda3403e174be1efd92", "tags": "AD", "category": "Active Directory", "run_alert_script": false, "alert": "TO Failed Logon >3 Alert", "urgency": "high", "auto_previous_resolve": false}
2015-04-09 08:30:06,665 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 4 results.
2015-04-09 08:30:06,682 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:30:06,682 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:30:06,707 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:30:06,707 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:30:06,707 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:30:06,707 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:30:06,708 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593400_13115
2015-04-09 08:30:07,009 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=9f160995-a021-4619-9f14-722863b75032
2015-04-09 08:30:07,036 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:30:07,036 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:30:07,037 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:30:07,037 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:30:07,054 DEBUG Create event will be: time=2015-04-09T08:30:07.054527 severity=INFO origin="alert_handler" event_id="9470731a335b9b1719ae730b86be5086" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="9f160995-a021-4619-9f14-722863b75032" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593400_13115" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593401"
2015-04-09 08:30:07,061 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593400_13115 with incident_id=9f160995-a021-4619-9f14-722863b75032
2015-04-09 08:30:07,088 DEBUG results for incident_id=9f160995-a021-4619-9f14-722863b75032 written to collection.
2015-04-09 08:30:07,089 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593400_13115 incident_id=9f160995-a021-4619-9f14-722863b75032 result_id=0 written to collection incident_results
2015-04-09 08:30:07,089 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:30:07,096 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:30:07,097 INFO Alert handler finished. duration=0.726s
2015-04-09 08:33:04,012 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593580_13143/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593580_13143 sessionKey=v26VJ99G7jsgfpF7asqMCsNvGEDOgqz6PU6gAkq_z3L0ows87f6^^C_WV9AzmrE7xL9kBamz3FBaYWh^M2lkufRk5yyyyupoV9wdklmKZ4yEbosOnvYOaquuLv9y^uHdKiKkc9CWVcR5FoAS2C alert=TO Failed Logon >3 Alert
2015-04-09 08:33:04,019 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593580_13143' has been fired.
2015-04-09 08:33:04,296 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 08:33:04,296 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-09 08:33:04,302 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ]
2015-04-09 08:33:04,302 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-09 08:33:04,303 DEBUG Incident config after getting settings: {"tags": "AD", "alert": "TO Failed Logon >3 Alert", "urgency": "high", "auto_previous_resolve": false, "_user": "nobody", "auto_assign_user": "", "run_alert_script": false, "_key": "55240dda3403e174be1efd92", "subcategory": "Logon Failed", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "auto_assign": false, "category": "Active Directory", "alert_script": ""}
2015-04-09 08:33:04,310 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:33:04,328 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:33:04,328 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:33:04,356 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:33:04,356 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:33:04,356 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:33:04,356 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:33:04,357 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593580_13143
2015-04-09 08:33:04,656 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=d32aec65-6bbc-4a37-961f-77dbcd2445e6
2015-04-09 08:33:04,682 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:33:04,682 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:33:04,683 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:33:04,683 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:33:04,705 DEBUG Create event will be: time=2015-04-09T08:33:04.705712 severity=INFO origin="alert_handler" event_id="b1ad743b7ef27e0d8ba5fb653b7473a3" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="d32aec65-6bbc-4a37-961f-77dbcd2445e6" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593580_13143" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593581"
2015-04-09 08:33:04,712 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593580_13143 with incident_id=d32aec65-6bbc-4a37-961f-77dbcd2445e6
2015-04-09 08:33:04,740 DEBUG results for incident_id=d32aec65-6bbc-4a37-961f-77dbcd2445e6 written to collection.
2015-04-09 08:33:04,740 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593580_13143 incident_id=d32aec65-6bbc-4a37-961f-77dbcd2445e6 result_id=0 written to collection incident_results
2015-04-09 08:33:04,740 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:33:04,747 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:33:04,747 INFO Alert handler finished. duration=0.735s
2015-04-09 08:34:03,806 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593640_13146/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593640_13146 sessionKey=5NtnNEQ9fembCAxM4JK2WsTfYAZKk03ZQOt2JT9iEEOxPxap18TDNs7alodWNBPfi5GkADI5oa5^qNhNlINc8dbTWMdwepD9SbCxmq6ZK^jJuRtlLoPEyeTnC5Sn7tU_s5mk8O^Vce2B6chq3MnT alert=TO Failed Logon >3 Alert | |
2015-04-09 08:34:03,812 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593640_13146' has been fired. | |
2015-04-09 08:34:04,085 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"} | |
2015-04-09 08:34:04,085 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-09 08:34:04,090 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ] | |
2015-04-09 08:34:04,091 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-09 08:34:04,091 DEBUG Incident config after getting settings: {"_key": "55240dda3403e174be1efd92", "auto_assign_owner": "unassigned", "alert": "TO Failed Logon >3 Alert", "category": "Active Directory", "run_alert_script": false, "subcategory": "Logon Failed", "tags": "AD", "auto_assign": false, "alert_script": "", "_user": "nobody", "urgency": "high", "auto_assign_user": "", "auto_ttl_resolve": false, "auto_previous_resolve": false} | |
2015-04-09 08:34:04,098 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results. | |
2015-04-09 08:34:04,116 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:34:04,116 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:34:04,142 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:34:04,142 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:34:04,142 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:34:04,142 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:34:04,142 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593640_13146 | |
2015-04-09 08:34:04,442 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=7e97eeff-0a86-4efd-a2e6-781ade3d6aa8 | |
2015-04-09 08:34:04,468 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:34:04,468 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:34:04,469 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:34:04,469 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:34:04,506 DEBUG Create event will be: time=2015-04-09T08:34:04.506923 severity=INFO origin="alert_handler" event_id="87a51dc4f4eb4d5d85a6becc2bdf4e5a" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="7e97eeff-0a86-4efd-a2e6-781ade3d6aa8" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593640_13146" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593641" | |
2015-04-09 08:34:04,513 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593640_13146 with incident_id=7e97eeff-0a86-4efd-a2e6-781ade3d6aa8 | |
2015-04-09 08:34:04,541 DEBUG results for incident_id=7e97eeff-0a86-4efd-a2e6-781ade3d6aa8 written to collection. | |
2015-04-09 08:34:04,541 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593640_13146 incident_id=7e97eeff-0a86-4efd-a2e6-781ade3d6aa8 result_id=0 written to collection incident_results | |
2015-04-09 08:34:04,541 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:34:04,548 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:34:04,548 INFO Alert handler finished. duration=0.743s | |
2015-04-09 08:35:04,875 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593700_13152/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593700_13152 sessionKey=30^agT8DviT^4yCDECvcLeYA3EsW6nmqgQAXCw0Sk5YuvZpYJ0oKO5JTNeuhzfMj_VgU8wwOEV79AS19cG8tp1KFbhf896UodOLhRGEMTHXHvZL^PsNdXTMc2AbUpkA3agQ_VWP8tuQMQiXOyio alert=TO Failed Logon >3 Alert | |
2015-04-09 08:35:04,880 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593700_13152' has been fired. | |
2015-04-09 08:35:05,159 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"} | |
2015-04-09 08:35:05,159 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-09 08:35:05,165 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ] | |
2015-04-09 08:35:05,166 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-09 08:35:05,166 DEBUG Incident config after getting settings: {"urgency": "high", "category": "Active Directory", "auto_assign": false, "_key": "55240dda3403e174be1efd92", "run_alert_script": false, "tags": "AD", "alert": "TO Failed Logon >3 Alert", "_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "subcategory": "Logon Failed"} | |
2015-04-09 08:35:05,174 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results. | |
2015-04-09 08:35:05,194 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:35:05,194 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:35:05,227 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:35:05,227 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:35:05,227 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:35:05,227 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:35:05,227 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593700_13152 | |
2015-04-09 08:35:05,523 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=9a522df5-bd1d-4d4f-b956-73d671065eb1 | |
2015-04-09 08:35:05,550 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:35:05,550 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:35:05,550 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:35:05,550 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:35:05,566 DEBUG Create event will be: time=2015-04-09T08:35:05.565993 severity=INFO origin="alert_handler" event_id="5a192967d1aa2a1abc0be1ec58d9f9e1" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="9a522df5-bd1d-4d4f-b956-73d671065eb1" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593700_13152" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593701" | |
2015-04-09 08:35:05,572 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593700_13152 with incident_id=9a522df5-bd1d-4d4f-b956-73d671065eb1 | |
2015-04-09 08:35:05,600 DEBUG results for incident_id=9a522df5-bd1d-4d4f-b956-73d671065eb1 written to collection. | |
2015-04-09 08:35:05,600 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593700_13152 incident_id=9a522df5-bd1d-4d4f-b956-73d671065eb1 result_id=0 written to collection incident_results | |
2015-04-09 08:35:05,600 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:35:05,607 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:35:05,607 INFO Alert handler finished. duration=0.733s | |
2015-04-09 08:36:04,097 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593760_13156/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593760_13156 sessionKey=OEcOqiTVmJd0AgS5QUNl5GZkUUmCHMr4CcX5AMhbw1HVtAkUvRZ9DqLbGJ93YzMvSXBNoQ^tILQWw9wPcL1pn1fyNzBBTFBGTgMamF54SAh480uuIi0cWuUlxvpzE2uRjJWqcoBzLrVNLIBhTEXQ alert=TO Failed Logon >3 Alert | |
2015-04-09 08:36:04,103 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593760_13156' has been fired. | |
2015-04-09 08:36:04,204 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593760_13157/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593760_13157 sessionKey=njpz^Gslhaag_H6ktPB9tTmnFivZDtaVW0dxR5t4cWV3ScpNngjYCQuYvcoOEdhrS9f3MFjc02nYAyYaPK7g9R8WHdgX5jdZ45NCJk0SB1hdtUXuu7o6PSmvkdt4yqMljKGXkTlK^TzRmdo alert=To Global Failed Login >3 Alert | |
2015-04-09 08:36:04,210 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593760_13157' has been fired. | |
2015-04-09 08:36:04,385 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"} | |
2015-04-09 08:36:04,386 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-09 08:36:04,391 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ] | |
2015-04-09 08:36:04,391 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-09 08:36:04,392 DEBUG Incident config after getting settings: {"tags": "AD", "alert_script": "", "alert": "TO Failed Logon >3 Alert", "run_alert_script": false, "auto_assign_user": "", "_user": "nobody", "urgency": "high", "auto_previous_resolve": false, "subcategory": "Logon Failed", "category": "Active Directory", "auto_ttl_resolve": false, "auto_assign": false, "auto_assign_owner": "unassigned", "_key": "55240dda3403e174be1efd92"} | |
2015-04-09 08:36:04,399 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results. | |
2015-04-09 08:36:04,417 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:36:04,417 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:36:04,442 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:36:04,443 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:36:04,443 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:36:04,443 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:36:04,443 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593760_13156 | |
2015-04-09 08:36:04,488 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"} | |
2015-04-09 08:36:04,488 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 08:36:04,493 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Global Logon Failed", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 08:36:04,494 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 08:36:04,494 DEBUG Incident config after getting settings: {"_key": "55240dda3403e174be1efd93", "auto_assign_owner": "unassigned", "alert": "To Global Failed Login >3 Alert", "category": "Active Directory", "run_alert_script": false, "subcategory": "Global Logon Failed", "tags": "AD", "auto_assign": false, "alert_script": "", "_user": "nobody", "urgency": "high", "auto_assign_user": "", "auto_ttl_resolve": false, "auto_previous_resolve": false} | |
2015-04-09 08:36:04,501 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results. | |
2015-04-09 08:36:04,519 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:36:04,519 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:36:04,545 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:36:04,546 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:36:04,546 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:36:04,546 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:36:04,546 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593760_13157 | |
2015-04-09 08:36:04,738 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=556351ea-a1ea-4d66-8b78-ea0cf3d31976 | |
2015-04-09 08:36:04,764 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:36:04,764 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:36:04,765 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'} | |
2015-04-09 08:36:04,765 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:36:04,786 DEBUG Create event will be: time=2015-04-09T08:36:04.786281 severity=INFO origin="alert_handler" event_id="b0a6738a0525ffef73e70c36324376e9" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="556351ea-a1ea-4d66-8b78-ea0cf3d31976" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593760_13156" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593761" | |
2015-04-09 08:36:04,792 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593760_13156 with incident_id=556351ea-a1ea-4d66-8b78-ea0cf3d31976 | |
2015-04-09 08:36:04,822 DEBUG results for incident_id=556351ea-a1ea-4d66-8b78-ea0cf3d31976 written to collection. | |
2015-04-09 08:36:04,822 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593760_13156 incident_id=556351ea-a1ea-4d66-8b78-ea0cf3d31976 result_id=0 written to collection incident_results | |
2015-04-09 08:36:04,822 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:36:04,830 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:36:04,830 INFO Alert handler finished. duration=0.733s | |
2015-04-09 08:36:04,846 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=2de41777-9798-473d-8b60-d964235fa3d9 | |
2015-04-09 08:36:04,873 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:36:04,873 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:36:04,873 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:36:04,873 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:36:04,889 DEBUG Create event will be: time=2015-04-09T08:36:04.889904 severity=INFO origin="alert_handler" event_id="bfa53a4774c51e67e10c83e30a0c5e78" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="2de41777-9798-473d-8b60-d964235fa3d9" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593760_13157" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593761" | |
2015-04-09 08:36:04,896 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593760_13157 with incident_id=2de41777-9798-473d-8b60-d964235fa3d9 | |
2015-04-09 08:36:04,924 DEBUG results for incident_id=2de41777-9798-473d-8b60-d964235fa3d9 written to collection. | |
2015-04-09 08:36:04,924 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593760_13157 incident_id=2de41777-9798-473d-8b60-d964235fa3d9 result_id=0 written to collection incident_results | |
2015-04-09 08:36:04,924 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:36:04,931 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:36:04,931 INFO Alert handler finished. duration=0.727s | |
2015-04-09 08:37:08,516 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593820_13160/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593820_13160 sessionKey=vVs9Dbb3lJFz5gBXdIqUfVwmnWonE31g3s0EOgtjdRUj7BZ4yNe4CQIkEPczOIN5UfcpuMTX24k9LiBSjyACtvxylWbtfZZA3spmadDSc95Upo8SAKrESWcUUZqT8MLZUD^^AybS9bTeBVXQcJye alert=To Global Failed Login >3 Alert | |
2015-04-09 08:37:08,522 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593820_13160' has been fired. | |
2015-04-09 08:37:08,705 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593820_13159/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593820_13159 sessionKey=_c33uRSDS5FiCdW2tMRZ^Btzf7dDrEc^Z9jHmkHpaMOZ68BIQsXaxK9_cg496l5fFdl9Ac^coRbh2qMbv8ybpXkGlV5vEV9oHAashX9WyyN2wkV82u^MBN7RzlZcKFqrXiIn6z2Flhl9Q87 alert=TO Failed Logon >3 Alert | |
2015-04-09 08:37:08,711 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593820_13159' has been fired. | |
2015-04-09 08:37:08,798 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"} | |
2015-04-09 08:37:08,798 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 08:37:08,804 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Global Logon Failed", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 08:37:08,804 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 08:37:08,804 DEBUG Incident config after getting settings: {"alert_script": "", "auto_previous_resolve": false, "tags": "AD", "urgency": "high", "alert": "To Global Failed Login >3 Alert", "category": "Active Directory", "_key": "55240dda3403e174be1efd93", "auto_assign": false, "auto_assign_user": "", "_user": "nobody", "run_alert_script": false, "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "subcategory": "Global Logon Failed"} | |
2015-04-09 08:37:08,812 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results. | |
2015-04-09 08:37:08,829 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:37:08,829 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:37:08,855 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:37:08,856 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:37:08,856 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:37:08,856 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:37:08,856 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593820_13160 | |
2015-04-09 08:37:08,983 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-09 08:37:08,983 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-09 08:37:08,990 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed TEST", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ] | |
2015-04-09 08:37:08,990 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-09 08:37:08,990 DEBUG Incident config after getting settings: {"auto_assign": false, "auto_ttl_resolve": false, "urgency": "high", "auto_previous_resolve": false, "alert_script": "", "_user": "nobody", "category": "Active Directory", "alert": "TO Failed Logon >3 Alert", "subcategory": "Logon Failed TEST", "run_alert_script": false, "_key": "55240dda3403e174be1efd92", "auto_assign_owner": "unassigned", "tags": "AD", "auto_assign_user": ""} | |
2015-04-09 08:37:08,998 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results. | |
2015-04-09 08:37:09,015 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:37:09,015 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:37:09,042 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:37:09,042 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:37:09,042 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:37:09,042 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:37:09,042 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593820_13159 | |
2015-04-09 08:37:09,157 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=396b1148-685f-4e56-b9af-d0e4b0343c79 | |
2015-04-09 08:37:09,183 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:37:09,183 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:37:09,184 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:37:09,184 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:37:09,225 DEBUG Create event will be: time=2015-04-09T08:37:09.225171 severity=INFO origin="alert_handler" event_id="e1dcce17169afe3775fa2600449052be" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="396b1148-685f-4e56-b9af-d0e4b0343c79" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593820_13160" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593821" | |
2015-04-09 08:37:09,231 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593820_13160 with incident_id=396b1148-685f-4e56-b9af-d0e4b0343c79 | |
2015-04-09 08:37:09,259 DEBUG results for incident_id=396b1148-685f-4e56-b9af-d0e4b0343c79 written to collection. | |
2015-04-09 08:37:09,259 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593820_13160 incident_id=396b1148-685f-4e56-b9af-d0e4b0343c79 result_id=0 written to collection incident_results | |
2015-04-09 08:37:09,259 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:37:09,267 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:37:09,267 INFO Alert handler finished. duration=0.751s | |
2015-04-09 08:37:09,345 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=b658a7c2-2400-4542-9966-36e1cace51b6 | |
2015-04-09 08:37:09,371 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:37:09,371 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:37:09,371 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:37:09,371 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:37:09,396 DEBUG Create event will be: time=2015-04-09T08:37:09.396469 severity=INFO origin="alert_handler" event_id="c71ce6e154a5c8e8033aef7683fb3089" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="b658a7c2-2400-4542-9966-36e1cace51b6" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593820_13159" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593821" | |
2015-04-09 08:37:09,402 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593820_13159 with incident_id=b658a7c2-2400-4542-9966-36e1cace51b6 | |
2015-04-09 08:37:09,430 DEBUG results for incident_id=b658a7c2-2400-4542-9966-36e1cace51b6 written to collection. | |
2015-04-09 08:37:09,430 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593820_13159 incident_id=b658a7c2-2400-4542-9966-36e1cace51b6 result_id=0 written to collection incident_results | |
2015-04-09 08:37:09,430 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:37:09,437 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:37:09,437 INFO Alert handler finished. duration=0.732s | |
2015-04-09 08:38:04,589 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593880_13162/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593880_13162 sessionKey=QD2zynw8U6wVy^dPT9PP3XqCxC^rMNaCRbWbRc_2NHfqvjlX^3gXQY3g5RicyaQMN^uzEfEhEABLDNhTGBKD121zeKRYNG7yRBggMEPr2crXfV86vRjGT5c5oWXr4Pk8pTUUD0_S8aQFgglRWo alert=TO Failed Logon >3 Alert | |
2015-04-09 08:38:04,595 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593880_13162' has been fired. | |
2015-04-09 08:38:04,669 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593880_13163/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593880_13163 sessionKey=vLkBR_zVrR4ix5Q8B1Jdisr8^73K1UI79imndiA8TT0MRtBGPEMNWAisznaKY3WQCcp1NhEXAY55p30kqHSIkGvjrWXKxd6^R7G0O1GXhWDpP7jO1WE01jb9NaqZcrv7LLSYinKsIjAG alert=To Global Failed Login >3 Alert | |
2015-04-09 08:38:04,675 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593880_13163' has been fired. | |
2015-04-09 08:38:04,864 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"} | |
2015-04-09 08:38:04,864 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-09 08:38:04,870 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed TEST", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ] | |
2015-04-09 08:38:04,870 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-09 08:38:04,870 DEBUG Incident config after getting settings: {"urgency": "high", "auto_previous_resolve": false, "run_alert_script": false, "alert": "TO Failed Logon >3 Alert", "tags": "AD", "_key": "55240dda3403e174be1efd92", "alert_script": "", "auto_assign_user": "", "auto_assign_owner": "unassigned", "_user": "nobody", "subcategory": "Logon Failed TEST", "category": "Active Directory", "auto_ttl_resolve": false, "auto_assign": false}
2015-04-09 08:38:04,878 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:38:04,895 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:38:04,895 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:38:04,920 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:38:04,920 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:38:04,920 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:38:04,921 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:38:04,921 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593880_13162
2015-04-09 08:38:04,947 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 08:38:04,947 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:38:04,953 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Global Logon Failed", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:38:04,954 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:38:04,954 DEBUG Incident config after getting settings: {"category": "Active Directory", "subcategory": "Global Logon Failed", "alert": "To Global Failed Login >3 Alert", "tags": "AD", "auto_ttl_resolve": false, "_user": "nobody", "auto_assign_user": "", "auto_assign": false, "_key": "55240dda3403e174be1efd93", "auto_assign_owner": "unassigned", "run_alert_script": false, "auto_previous_resolve": false, "urgency": "high", "alert_script": ""}
2015-04-09 08:38:04,961 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:38:04,978 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:38:04,978 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:38:05,004 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:38:05,005 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:38:05,005 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:38:05,005 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:38:05,005 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593880_13163
2015-04-09 08:38:05,220 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=c09d9fd2-b871-4cf5-a726-9850582b87d7
2015-04-09 08:38:05,248 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:38:05,248 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:38:05,248 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:38:05,248 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:38:05,276 DEBUG Create event will be: time=2015-04-09T08:38:05.275966 severity=INFO origin="alert_handler" event_id="f87340efacd03b2d39ee11e913dbbecf" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="c09d9fd2-b871-4cf5-a726-9850582b87d7" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593880_13162" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593882"
2015-04-09 08:38:05,284 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593880_13162 with incident_id=c09d9fd2-b871-4cf5-a726-9850582b87d7
2015-04-09 08:38:05,303 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=3b7122aa-65fc-4903-8a64-af8a671b9f3c
2015-04-09 08:38:05,310 DEBUG results for incident_id=c09d9fd2-b871-4cf5-a726-9850582b87d7 written to collection.
2015-04-09 08:38:05,310 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593880_13162 incident_id=c09d9fd2-b871-4cf5-a726-9850582b87d7 result_id=0 written to collection incident_results
2015-04-09 08:38:05,310 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:38:05,318 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:38:05,318 INFO Alert handler finished. duration=0.729s
2015-04-09 08:38:05,331 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:38:05,331 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:38:05,332 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:38:05,332 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:38:05,345 DEBUG Create event will be: time=2015-04-09T08:38:05.345290 severity=INFO origin="alert_handler" event_id="0be959211848678d0b2813f2a78d9e37" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="3b7122aa-65fc-4903-8a64-af8a671b9f3c" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593880_13163" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593882"
2015-04-09 08:38:05,352 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593880_13163 with incident_id=3b7122aa-65fc-4903-8a64-af8a671b9f3c
2015-04-09 08:38:05,379 DEBUG results for incident_id=3b7122aa-65fc-4903-8a64-af8a671b9f3c written to collection.
2015-04-09 08:38:05,379 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593880_13163 incident_id=3b7122aa-65fc-4903-8a64-af8a671b9f3c result_id=0 written to collection incident_results
2015-04-09 08:38:05,379 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:38:05,386 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:38:05,386 INFO Alert handler finished. duration=0.717s
2015-04-09 08:39:04,041 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593940_13165/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593940_13165 sessionKey=VQC_tE6oUvDbZJaKJ5kRkVMOGaDzevIMXkpSlBnl54kktl7SE2ivB2yZwqao73FdLeuxXY35X8d0ubVPSEyieVgj0qo8GWMHFmAX2s5ezcdZcu08fbx3FRdHWZUNCcMlgEZS1Y^A4GA alert=TO Failed Logon >3 Alert
2015-04-09 08:39:04,047 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593940_13165' has been fired.
2015-04-09 08:39:04,076 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593940_13166/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593940_13166 sessionKey=CMFlibjytJcsd39CiWD4mzFPFrPwGem^0UNHB0_fNF_AS0pXP9Xz0ZX125m0vpZ6LyBc6yXvm_jBgMFdO5MpZo9Aicl^mHWzPRTZOM1j9KqXI5FOXxwLLFFBR5zlA7EtWDicoBpLcevIeo alert=To Global Failed Login >3 Alert
2015-04-09 08:39:04,082 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593940_13166' has been fired.
2015-04-09 08:39:04,319 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 08:39:04,319 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-09 08:39:04,326 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Logon Failed TEST", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd92" } ]
2015-04-09 08:39:04,326 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-09 08:39:04,326 DEBUG Incident config after getting settings: {"tags": "AD", "run_alert_script": false, "alert": "TO Failed Logon >3 Alert", "subcategory": "Logon Failed TEST", "_key": "55240dda3403e174be1efd92", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "category": "Active Directory", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "urgency": "high"}
2015-04-09 08:39:04,333 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:39:04,351 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:39:04,351 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:39:04,362 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_priority": "low"}
2015-04-09 08:39:04,362 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:39:04,369 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Global Logon Failed", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:39:04,369 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:39:04,369 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "run_alert_script": false, "_key": "55240dda3403e174be1efd93", "alert_script": "", "auto_previous_resolve": false, "urgency": "high", "alert": "To Global Failed Login >3 Alert", "tags": "AD", "category": "Active Directory", "subcategory": "Global Logon Failed", "_user": "nobody", "auto_assign_user": "", "auto_assign": false, "auto_ttl_resolve": false}
2015-04-09 08:39:04,377 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:39:04,377 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:39:04,378 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:39:04,378 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:39:04,378 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:39:04,378 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593940_13165
2015-04-09 08:39:04,395 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:39:04,396 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:39:04,421 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:39:04,421 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:39:04,421 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:39:04,421 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:39:04,421 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593940_13166
2015-04-09 08:39:04,680 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=1dd5e7bc-7fb2-42ac-846e-9fcf9e222a58
2015-04-09 08:39:04,707 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:39:04,707 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:39:04,708 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:39:04,708 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:39:04,716 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=332ea890-e867-4885-a882-820369b60ea4
2015-04-09 08:39:04,730 DEBUG Create event will be: time=2015-04-09T08:39:04.730306 severity=INFO origin="alert_handler" event_id="52443e7eb723068fd84076242a051c11" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="1dd5e7bc-7fb2-42ac-846e-9fcf9e222a58" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593940_13165" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593941"
2015-04-09 08:39:04,736 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593940_13165 with incident_id=1dd5e7bc-7fb2-42ac-846e-9fcf9e222a58
2015-04-09 08:39:04,744 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:39:04,744 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:39:04,744 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:39:04,745 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:39:04,765 DEBUG Create event will be: time=2015-04-09T08:39:04.765274 severity=INFO origin="alert_handler" event_id="3068ce1e8407987c8fcdbe3784d1515e" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="332ea890-e867-4885-a882-820369b60ea4" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593940_13166" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428593941"
2015-04-09 08:39:04,765 DEBUG results for incident_id=1dd5e7bc-7fb2-42ac-846e-9fcf9e222a58 written to collection.
2015-04-09 08:39:04,765 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428593940_13165 incident_id=1dd5e7bc-7fb2-42ac-846e-9fcf9e222a58 result_id=0 written to collection incident_results
2015-04-09 08:39:04,765 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:39:04,773 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593940_13166 with incident_id=332ea890-e867-4885-a882-820369b60ea4
2015-04-09 08:39:04,774 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:39:04,775 INFO Alert handler finished. duration=0.734s
2015-04-09 08:39:04,800 DEBUG results for incident_id=332ea890-e867-4885-a882-820369b60ea4 written to collection.
2015-04-09 08:39:04,800 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428593940_13166 incident_id=332ea890-e867-4885-a882-820369b60ea4 result_id=0 written to collection incident_results
2015-04-09 08:39:04,800 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:39:04,807 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:39:04,808 INFO Alert handler finished. duration=0.732s
2015-04-09 08:40:05,664 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594000_13175/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594000_13175 sessionKey=ALigG9a2Yu_iN5WXNNW^cm06N4Gjr4VCbotJMivOLnermDwMQcBSw2fd5svTkBoig6fOiliff4KwTE8tb2mLvLRtlwoCA52wgJA2^TyZdzgX2a9LpcMe4JkY^YKcN1AjVv5_bcElscr alert=To Global Failed Login >3 Alert
2015-04-09 08:40:05,670 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594000_13175' has been fired.
2015-04-09 08:40:05,749 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594000_13172/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594000_13172 sessionKey=g0gQuE5uD3^VQTQ7rBDyZbXK7Mm0^rDGpSVSVy4NoDYpu_1V^_9yOmZuPptez2^nYJevmUTGATtCMOW9vbDZCGObCTsiA^3feEd95vCY^q4nCxb7YhyDV^GUbsR2^EdJF8mLt2bgyDPYJJHfcN alert=TO Failed Logon >3 Alert
2015-04-09 08:40:05,755 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594000_13172' has been fired.
2015-04-09 08:40:05,944 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 08:40:05,945 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:40:05,951 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Global Logon Failed", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:40:05,951 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:40:05,951 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign": false, "auto_previous_resolve": false, "urgency": "high", "alert_script": "", "_user": "nobody", "category": "Active Directory", "alert": "To Global Failed Login >3 Alert", "subcategory": "Global Logon Failed", "tags": "AD", "run_alert_script": false, "_key": "55240dda3403e174be1efd93", "auto_assign_owner": "unassigned", "auto_assign_user": ""}
2015-04-09 08:40:05,959 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:40:05,976 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:40:05,977 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:40:06,002 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:40:06,002 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:40:06,002 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:40:06,003 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:40:06,003 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594000_13175
2015-04-09 08:40:06,033 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 08:40:06,033 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-09 08:40:06,039 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "unknown", "subcategory" : "unknown", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "_user" : "nobody", "_key" : "55269d1f3403e12f580a9558" } ]
2015-04-09 08:40:06,039 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-09 08:40:06,039 DEBUG Incident config after getting settings: {"alert_script": "", "auto_previous_resolve": false, "urgency": "high", "run_alert_script": false, "auto_assign_owner": "unassigned", "_key": "55269d1f3403e12f580a9558", "auto_assign_user": "", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "tags": "[Untagged]", "alert": "TO Failed Logon >3 Alert", "subcategory": "unknown", "category": "unknown"}
2015-04-09 08:40:06,047 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:40:06,064 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:40:06,064 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:40:06,091 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:40:06,091 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:40:06,091 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:40:06,092 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:40:06,092 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594000_13172
2015-04-09 08:40:06,302 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=e1e80ff6-e1d6-4e06-b676-d1654e0fbeab
2015-04-09 08:40:06,329 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:40:06,330 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:40:06,330 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:40:06,330 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:40:06,350 DEBUG Create event will be: time=2015-04-09T08:40:06.350737 severity=INFO origin="alert_handler" event_id="8deeab377e3cfc2b44758901436af05a" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="e1e80ff6-e1d6-4e06-b676-d1654e0fbeab" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594000_13175" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594001"
2015-04-09 08:40:06,358 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594000_13175 with incident_id=e1e80ff6-e1d6-4e06-b676-d1654e0fbeab
2015-04-09 08:40:06,385 DEBUG results for incident_id=e1e80ff6-e1d6-4e06-b676-d1654e0fbeab written to collection.
2015-04-09 08:40:06,385 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594000_13175 incident_id=e1e80ff6-e1d6-4e06-b676-d1654e0fbeab result_id=0 written to collection incident_results
2015-04-09 08:40:06,385 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:40:06,392 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:40:06,393 INFO Alert handler finished. duration=0.729s
2015-04-09 08:40:06,398 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=e993fad4-b077-41ca-a96c-6d6602934ce5
2015-04-09 08:40:06,426 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:40:06,426 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:40:06,426 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:40:06,427 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:40:06,453 DEBUG Create event will be: time=2015-04-09T08:40:06.453931 severity=INFO origin="alert_handler" event_id="910942d57973f7c181ad01be49d12f11" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="e993fad4-b077-41ca-a96c-6d6602934ce5" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594000_13172" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594001"
2015-04-09 08:40:06,460 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594000_13172 with incident_id=e993fad4-b077-41ca-a96c-6d6602934ce5
2015-04-09 08:40:06,488 DEBUG results for incident_id=e993fad4-b077-41ca-a96c-6d6602934ce5 written to collection.
2015-04-09 08:40:06,489 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594000_13172 incident_id=e993fad4-b077-41ca-a96c-6d6602934ce5 result_id=0 written to collection incident_results
2015-04-09 08:40:06,489 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:40:06,496 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:40:06,496 INFO Alert handler finished. duration=0.748s
2015-04-09 08:41:04,981 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594060_13196/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594060_13196 sessionKey=szuLHAlKZPQ4ExRmM5OdWlF4RrwNWmIoylb_Rj^VjXfU028UlJpA_QX9dk46uw^6Ze4ISQ2IQlfT4vjwSnikSC4drJi8e^de^bdkNHH_XA6uyLf5p_MzB4jMk22VKKETQ4qeLiCUcu_u alert=TO Failed Logon >3 Alert
2015-04-09 08:41:04,986 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594060_13196' has been fired.
2015-04-09 08:41:05,256 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-09 08:41:05,256 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-09 08:41:05,262 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "unknown", "subcategory" : "unknown", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "_user" : "nobody", "_key" : "55269d1f3403e12f580a9558" } ]
2015-04-09 08:41:05,262 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-09 08:41:05,262 DEBUG Incident config after getting settings: {"auto_assign": false, "_key": "55269d1f3403e12f580a9558", "auto_ttl_resolve": false, "urgency": "high", "tags": "[Untagged]", "category": "unknown", "auto_previous_resolve": false, "alert_script": "", "run_alert_script": false, "_user": "nobody", "auto_assign_user": "", "subcategory": "unknown", "auto_assign_owner": "unassigned", "alert": "TO Failed Logon >3 Alert"}
2015-04-09 08:41:05,270 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:41:05,287 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:41:05,287 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:41:05,314 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:41:05,314 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:41:05,315 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:41:05,315 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:41:05,315 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594060_13196
2015-04-09 08:41:05,610 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=fca4c19a-e416-45ba-9f45-d0240a53a43f
2015-04-09 08:41:05,639 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:41:05,639 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:41:05,639 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:41:05,639 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:41:05,676 DEBUG Create event will be: time=2015-04-09T08:41:05.676621 severity=INFO origin="alert_handler" event_id="35b4e9e832d442fa382d71655fa48d1c" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="fca4c19a-e416-45ba-9f45-d0240a53a43f" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594060_13196" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594062"
2015-04-09 08:41:05,683 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594060_13196 with incident_id=fca4c19a-e416-45ba-9f45-d0240a53a43f
2015-04-09 08:41:05,746 DEBUG results for incident_id=fca4c19a-e416-45ba-9f45-d0240a53a43f written to collection.
2015-04-09 08:41:05,746 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594060_13196 incident_id=fca4c19a-e416-45ba-9f45-d0240a53a43f result_id=0 written to collection incident_results
2015-04-09 08:41:05,746 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:41:05,754 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:41:05,755 INFO Alert handler finished. duration=0.774s
2015-04-09 08:42:04,724 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594120_13201/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594120_13201 sessionKey=kSisAo94sbbkuOMotbNcqMM60UL4rVqt8f3Uanv_kDnAEuF8ryzdm2WQhTOKhQZAXmjuM4KGgs94GFIaDJWf_zAJDKSOsFCdRF6S01mlY3tqBqp_O8H2jBjfg0Ib_CKwh6JLpXbZ^bA alert=TO Failed Logon >3 Alert
2015-04-09 08:42:04,730 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594120_13201' has been fired.
2015-04-09 08:42:05,003 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"}
2015-04-09 08:42:05,004 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D
2015-04-09 08:42:05,009 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "unknown", "subcategory" : "unknown", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "_user" : "nobody", "_key" : "55269d1f3403e12f580a9558" } ]
2015-04-09 08:42:05,010 INFO Found incident settings for TO Failed Logon >3 Alert
2015-04-09 08:42:05,010 DEBUG Incident config after getting settings: {"tags": "[Untagged]", "auto_previous_resolve": false, "subcategory": "unknown", "alert_script": "", "_user": "nobody", "auto_assign": false, "auto_assign_user": "", "auto_ttl_resolve": false, "run_alert_script": false, "category": "unknown", "alert": "TO Failed Logon >3 Alert", "urgency": "high", "_key": "55269d1f3403e12f580a9558", "auto_assign_owner": "unassigned"}
2015-04-09 08:42:05,018 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results.
2015-04-09 08:42:05,036 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:42:05,036 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:42:05,063 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:42:05,063 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:42:05,063 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:42:05,063 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:42:05,064 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594120_13201
2015-04-09 08:42:05,358 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=7a267466-105e-4289-879a-95790b3c7a11
2015-04-09 08:42:05,385 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:42:05,385 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:42:05,385 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:42:05,385 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:42:05,407 DEBUG Create event will be: time=2015-04-09T08:42:05.407336 severity=INFO origin="alert_handler" event_id="5d941b11c1baa0562287f06561ebbeb3" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="7a267466-105e-4289-879a-95790b3c7a11" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594120_13201" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594122"
2015-04-09 08:42:05,413 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594120_13201 with incident_id=7a267466-105e-4289-879a-95790b3c7a11
2015-04-09 08:42:05,441 DEBUG results for incident_id=7a267466-105e-4289-879a-95790b3c7a11 written to collection.
2015-04-09 08:42:05,442 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594120_13201 incident_id=7a267466-105e-4289-879a-95790b3c7a11 result_id=0 written to collection incident_results
2015-04-09 08:42:05,442 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:42:05,448 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:42:05,448 INFO Alert handler finished. duration=0.725s
2015-04-09 08:43:04,186 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594180_13203/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594180_13203 sessionKey=iqE5dA9ux25jh02ir8c57UAgrWEGxPuWRHfiYLN2NbydUZlqgBFCXjr5M3KIs^5tKNwA4gyIi2V5rm8hxhhDCCnbohHrVGpJiD6nTdCaub6n9CEzvC42_sj7VFWoH3fgsg6RNnGLYEb0F0mE alert=TO Failed Login Alert
2015-04-09 08:43:04,191 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594180_13203' has been fired.
2015-04-09 08:43:04,233 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594180_13204/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594180_13204 sessionKey=gLXMzIJrNama7S84pNNQO2qwK_mitlt346fTQWelLDdgryj4P71QoSFxJLIZ0Fb^LGw1aec5iJYIlgiF6JdTK0ceDESJFP8JL5jrw0AgAawXMkxj3VQeL_p7XAa^RmZGPCv7CykDcrL2r7R alert=TO Failed Logon >3 Alert
2015-04-09 08:43:04,239 INFO alert_handler started because alert 'TO Failed Logon >3 Alert' with id 'scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594180_13204' has been fired.
2015-04-09 08:43:04,463 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_owner": "unassigned", "default_impact": "low"} | |
2015-04-09 08:43:04,463 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:43:04,470 DEBUG Incident settings: [ ] | |
2015-04-09 08:43:04,470 INFO No incident settings found for TO Failed Login Alert, switching back to defaults. | |
2015-04-09 08:43:04,470 DEBUG Incident config after getting settings: {"tags": "", "category": "", "auto_assign_user": "", "run_alert_script": false, "subcategory": "", "auto_previous_resolve": false, "urgency": "low", "auto_ttl_resolve": false, "auto_assign": false, "alert_script": ""} | |
2015-04-09 08:43:04,478 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results. | |
2015-04-09 08:43:04,496 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:43:04,496 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:43:04,515 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-09 08:43:04,515 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Logon%20%3E3%20Alert%22%7D | |
2015-04-09 08:43:04,521 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "unknown", "subcategory" : "unknown", "alert" : "TO Failed Logon >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "_user" : "nobody", "_key" : "55269d1f3403e12f580a9558" } ] | |
2015-04-09 08:43:04,521 INFO Found incident settings for TO Failed Logon >3 Alert | |
2015-04-09 08:43:04,522 DEBUG Incident config after getting settings: {"subcategory": "unknown", "alert": "TO Failed Logon >3 Alert", "auto_assign": false, "run_alert_script": false, "auto_ttl_resolve": false, "tags": "[Untagged]", "auto_assign_owner": "unassigned", "_key": "55269d1f3403e12f580a9558", "auto_assign_user": "", "auto_previous_resolve": false, "category": "unknown", "alert_script": "", "_user": "nobody", "urgency": "high"} | |
2015-04-09 08:43:04,523 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:43:04,523 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:43:04,524 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:43:04,524 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:43:04,524 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594180_13203 | |
2015-04-09 08:43:04,530 INFO Found job for alert TO Failed Logon >3 Alert. Context is 'search' with 3 results. | |
2015-04-09 08:43:04,548 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:43:04,548 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:43:04,573 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:43:04,573 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:43:04,573 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:43:04,573 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:43:04,573 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594180_13204 | |
2015-04-09 08:43:04,821 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=2a73cc47-4abd-4aae-8e32-c8ce31323fc8 | |
2015-04-09 08:43:04,849 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:43:04,849 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:43:04,850 DEBUG Querying lookup with filter={'urgency': 'low', 'impact': 'medium'} | |
2015-04-09 08:43:04,850 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 08:43:04,869 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=dd193e5b-ad46-4d47-bf1d-2686fd3803e9 | |
2015-04-09 08:43:04,896 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:43:04,897 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:43:04,897 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:43:04,897 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:43:04,899 DEBUG Create event will be: time=2015-04-09T08:43:04.899841 severity=INFO origin="alert_handler" event_id="77fedb2cb4e9f0c6546e3205d614b446" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="2a73cc47-4abd-4aae-8e32-c8ce31323fc8" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594180_13203" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428594181" | |
2015-04-09 08:43:04,907 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594180_13203 with incident_id=2a73cc47-4abd-4aae-8e32-c8ce31323fc8 | |
2015-04-09 08:43:04,933 DEBUG results for incident_id=2a73cc47-4abd-4aae-8e32-c8ce31323fc8 written to collection. | |
2015-04-09 08:43:04,934 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594180_13203 incident_id=2a73cc47-4abd-4aae-8e32-c8ce31323fc8 result_id=0 written to collection incident_results | |
2015-04-09 08:43:04,934 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:43:04,934 DEBUG Create event will be: time=2015-04-09T08:43:04.934249 severity=INFO origin="alert_handler" event_id="61db15ac5d1d643a06475fcfe645997d" user="splunk-system-user" action="create" alert="TO Failed Logon >3 Alert" incident_id="dd193e5b-ad46-4d47-bf1d-2686fd3803e9" job_id="scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594180_13204" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594181" | |
2015-04-09 08:43:04,941 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594180_13204 with incident_id=dd193e5b-ad46-4d47-bf1d-2686fd3803e9 | |
2015-04-09 08:43:04,944 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:43:04,944 INFO Alert handler finished. duration=0.759s | |
2015-04-09 08:43:04,968 DEBUG results for incident_id=dd193e5b-ad46-4d47-bf1d-2686fd3803e9 written to collection. | |
2015-04-09 08:43:04,969 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5c37f320e95eeb368_at_1428594180_13204 incident_id=dd193e5b-ad46-4d47-bf1d-2686fd3803e9 result_id=0 written to collection incident_results | |
2015-04-09 08:43:04,969 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:43:04,976 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:43:04,976 INFO Alert handler finished. duration=0.744s | |
2015-04-09 08:44:04,020 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594240_13206/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594240_13206 sessionKey=KXH8BpCLYe_zxUOl1qvDW6O^1GGkAl^2Oh__tZVIAZsI^OUvdPz4VskKAzHVY3WTz0QqRNDrZ3CzwknE7TZlpQWZHzLbl8d4WcuThuD7Z_3WJEIN9FJMI6pM7pKwKMwC_2Cf7FTEMFpJzp6ELol7 alert=TO Failed Login Alert | |
2015-04-09 08:44:04,026 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594240_13206' has been fired. | |
2015-04-09 08:44:04,292 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low"} | |
2015-04-09 08:44:04,292 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:44:04,298 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "low", "category" : "Active Directory", "subcategory" : "unknown", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 08:44:04,298 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 08:44:04,298 DEBUG Incident config after getting settings: {"alert": "TO Failed Login Alert", "category": "Active Directory", "run_alert_script": false, "subcategory": "unknown", "urgency": "low", "tags": "[Untagged]", "auto_assign": false, "auto_ttl_resolve": false, "_user": "nobody", "alert_script": "", "auto_assign_user": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "_key": "55269e303403e12f580a9565"} | |
2015-04-09 08:44:04,305 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results. | |
2015-04-09 08:44:04,322 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:44:04,322 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:44:04,348 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:44:04,348 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:44:04,349 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:44:04,349 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:44:04,349 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594240_13206 | |
2015-04-09 08:44:04,642 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=7480420f-9a64-4a3e-8842-1867ae82952c | |
2015-04-09 08:44:04,668 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:44:04,669 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:44:04,669 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-09 08:44:04,669 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 08:44:04,708 DEBUG Create event will be: time=2015-04-09T08:44:04.708183 severity=INFO origin="alert_handler" event_id="46041ff9fb12f802c1cf77e712a97960" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="7480420f-9a64-4a3e-8842-1867ae82952c" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594240_13206" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428594241" | |
2015-04-09 08:44:04,714 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594240_13206 with incident_id=7480420f-9a64-4a3e-8842-1867ae82952c | |
2015-04-09 08:44:04,742 DEBUG results for incident_id=7480420f-9a64-4a3e-8842-1867ae82952c written to collection. | |
2015-04-09 08:44:04,742 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594240_13206 incident_id=7480420f-9a64-4a3e-8842-1867ae82952c result_id=0 written to collection incident_results | |
2015-04-09 08:44:04,742 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:44:04,749 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:44:04,750 INFO Alert handler finished. duration=0.73s | |
2015-04-09 08:45:05,551 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594300_13214/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594300_13214 sessionKey=QGYoPCE70qvQa^cr7HZWTkIG295HEMFY5PwXgQX2F9zVChy5l9lhahvbB^AYNUwTRhbpSSHgN9VAL9811wQgSR2alGXtAv9raeBvW89hoTF2M2nAU9RWXQsHZsgxIye2lXt5uhUJd9Rr1qOtaMg4 alert=TO Failed Login Alert | |
2015-04-09 08:45:05,558 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594300_13214' has been fired. | |
2015-04-09 08:45:05,833 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"} | |
2015-04-09 08:45:05,833 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:45:05,839 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "low", "category" : "Active Directory", "subcategory" : "unknown", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 08:45:05,839 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 08:45:05,839 DEBUG Incident config after getting settings: {"auto_assign": false, "category": "Active Directory", "alert": "TO Failed Login Alert", "_key": "55269e303403e12f580a9565", "_user": "nobody", "subcategory": "unknown", "auto_assign_user": "", "run_alert_script": false, "auto_ttl_resolve": false, "tags": "[Untagged]", "urgency": "low", "auto_previous_resolve": false, "alert_script": "", "auto_assign_owner": "unassigned"} | |
2015-04-09 08:45:05,847 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results. | |
2015-04-09 08:45:05,865 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:45:05,865 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:45:05,890 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:45:05,891 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:45:05,891 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:45:05,891 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:45:05,891 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594300_13214 | |
2015-04-09 08:45:06,191 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=ee0b1551-097d-4437-8688-af9d3c4889a0 | |
2015-04-09 08:45:06,218 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:45:06,218 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:45:06,218 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-09 08:45:06,219 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 08:45:06,250 DEBUG Create event will be: time=2015-04-09T08:45:06.250248 severity=INFO origin="alert_handler" event_id="b85333b1c631387819eac9183dd56a75" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="ee0b1551-097d-4437-8688-af9d3c4889a0" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594300_13214" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428594301" | |
2015-04-09 08:45:06,256 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594300_13214 with incident_id=ee0b1551-097d-4437-8688-af9d3c4889a0 | |
2015-04-09 08:45:06,284 DEBUG results for incident_id=ee0b1551-097d-4437-8688-af9d3c4889a0 written to collection. | |
2015-04-09 08:45:06,284 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594300_13214 incident_id=ee0b1551-097d-4437-8688-af9d3c4889a0 result_id=0 written to collection incident_results | |
2015-04-09 08:45:06,284 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:45:06,292 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:45:06,292 INFO Alert handler finished. duration=0.741s | |
2015-04-09 08:46:04,296 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594360_13223/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594360_13223 sessionKey=ta9vpZzyz2I2o_KkEsHMW2CGdHIjqmmv9gN10km0YDDBknzkBiPzKvAQVgS^cdlzsGnBsmIZasTuoBRXPgLToOddLmsGs_nq438ejv5nn5zWt0CrqBXYRiZhfEvvCygyCDkbM9f0Fv5FABYJ alert=TO Failed Login Alert | |
2015-04-09 08:46:04,302 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594360_13223' has been fired. | |
2015-04-09 08:46:04,586 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-09 08:46:04,586 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:46:04,593 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "low", "category" : "Active Directory", "subcategory" : "unknown", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 08:46:04,594 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 08:46:04,594 DEBUG Incident config after getting settings: {"tags": "[Untagged]", "urgency": "low", "_user": "nobody", "alert": "TO Failed Login Alert", "subcategory": "unknown", "auto_assign_user": "", "auto_assign": false, "auto_ttl_resolve": false, "auto_previous_resolve": false, "alert_script": "", "auto_assign_owner": "unassigned", "category": "Active Directory", "_key": "55269e303403e12f580a9565", "run_alert_script": false} | |
2015-04-09 08:46:04,603 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results. | |
2015-04-09 08:46:04,623 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:46:04,623 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:46:04,650 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:46:04,650 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:46:04,650 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:46:04,650 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:46:04,650 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594360_13223 | |
2015-04-09 08:46:04,952 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=d4d8c3d8-bd83-40b5-a180-a3d3cb0df5ee | |
2015-04-09 08:46:04,979 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:46:04,980 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:46:04,980 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-09 08:46:04,980 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 08:46:04,996 DEBUG Create event will be: time=2015-04-09T08:46:04.996088 severity=INFO origin="alert_handler" event_id="f7d23675b72a0e45a66f5c5a6b3f4471" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="d4d8c3d8-bd83-40b5-a180-a3d3cb0df5ee" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594360_13223" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428594361" | |
2015-04-09 08:46:05,002 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594360_13223 with incident_id=d4d8c3d8-bd83-40b5-a180-a3d3cb0df5ee | |
2015-04-09 08:46:05,030 DEBUG results for incident_id=d4d8c3d8-bd83-40b5-a180-a3d3cb0df5ee written to collection. | |
2015-04-09 08:46:05,030 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594360_13223 incident_id=d4d8c3d8-bd83-40b5-a180-a3d3cb0df5ee result_id=0 written to collection incident_results | |
2015-04-09 08:46:05,030 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:46:05,038 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:46:05,038 INFO Alert handler finished. duration=0.742s | |
2015-04-09 08:47:04,124 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594420_13225/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594420_13225 sessionKey=BWB1nzfqY^nfVqq4zXbtEbF2ZzmssEvJLs0RrnT5lr3JoJ0giOCHB9KZzuldCSoNVxO7dh^AzGYpR5IXKy5MTk9FNXLzy9OfCeQDnnt9XfiTbc4J^x3JGiDMJBE7OAS_sRmag4uWz13 alert=TO Failed Login Alert | |
2015-04-09 08:47:04,130 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594420_13225' has been fired. | |
2015-04-09 08:47:04,400 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_owner": "unassigned", "default_priority": "low"} | |
2015-04-09 08:47:04,400 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:47:04,406 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "low", "category" : "Active Directory", "subcategory" : "Single System Login Failed", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 08:47:04,407 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 08:47:04,407 DEBUG Incident config after getting settings: {"run_alert_script": false, "category": "Active Directory", "auto_previous_resolve": false, "auto_ttl_resolve": false, "alert": "TO Failed Login Alert", "urgency": "low", "auto_assign_owner": "unassigned", "alert_script": "", "tags": "AD", "subcategory": "Single System Login Failed", "_user": "nobody", "auto_assign_user": "", "_key": "55269e303403e12f580a9565", "auto_assign": false} | |
2015-04-09 08:47:04,414 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results. | |
2015-04-09 08:47:04,432 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:47:04,432 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:47:04,458 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:47:04,458 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:47:04,458 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:47:04,458 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:47:04,458 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594420_13225 | |
2015-04-09 08:47:04,758 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=63181dfc-29ba-4f1e-b9ff-4660b48ed81e | |
2015-04-09 08:47:04,784 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:47:04,785 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:47:04,785 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-09 08:47:04,785 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 08:47:04,802 DEBUG Create event will be: time=2015-04-09T08:47:04.802624 severity=INFO origin="alert_handler" event_id="c3e4f8dafeb034056d22edb9e44a8c99" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="63181dfc-29ba-4f1e-b9ff-4660b48ed81e" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594420_13225" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428594421" | |
2015-04-09 08:47:04,808 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594420_13225 with incident_id=63181dfc-29ba-4f1e-b9ff-4660b48ed81e | |
2015-04-09 08:47:04,837 DEBUG results for incident_id=63181dfc-29ba-4f1e-b9ff-4660b48ed81e written to collection. | |
2015-04-09 08:47:04,837 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594420_13225 incident_id=63181dfc-29ba-4f1e-b9ff-4660b48ed81e result_id=0 written to collection incident_results | |
2015-04-09 08:47:04,837 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:47:04,843 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:47:04,843 INFO Alert handler finished. duration=0.72s | |
2015-04-09 08:48:04,384 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594480_13227/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594480_13227 sessionKey=obPw44LcxwsVDhIVXtedzCv_CwMXbjZqLqRWy1aGxKBUFjr1XZdIuk4jpx2vtD0X1N3aa7kcKgui7E4jdEjpPnAynmCQYAZGPZgcgzOVzLjPrpmwi9oNgRdBpWGnrZhBHG0FI4MbDWIK1F alert=TO Failed Login Alert | |
2015-04-09 08:48:04,390 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594480_13227' has been fired. | |
2015-04-09 08:48:04,660 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"} | |
2015-04-09 08:48:04,661 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:48:04,667 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Single System Login Failed", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 08:48:04,667 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 08:48:04,667 DEBUG Incident config after getting settings: {"auto_assign": false, "urgency": "high", "run_alert_script": false, "tags": "AD", "_key": "55269e303403e12f580a9565", "category": "Active Directory", "alert": "TO Failed Login Alert", "alert_script": "", "auto_previous_resolve": false, "auto_assign_user": "", "_user": "nobody", "subcategory": "Single System Login Failed", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false} | |
2015-04-09 08:48:04,675 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results. | |
2015-04-09 08:48:04,692 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:48:04,692 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:48:04,716 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:48:04,717 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:48:04,717 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:48:04,717 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:48:04,717 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594480_13227
2015-04-09 08:48:05,007 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=922dcfc1-2c77-47ad-ac7e-bc81bf0142cd
2015-04-09 08:48:05,033 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:48:05,033 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:48:05,034 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:48:05,034 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:48:05,057 DEBUG Create event will be: time=2015-04-09T08:48:05.057314 severity=INFO origin="alert_handler" event_id="d7853567df4110c9d9039da93e8ac7e0" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="922dcfc1-2c77-47ad-ac7e-bc81bf0142cd" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594480_13227" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594481"
2015-04-09 08:48:05,063 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594480_13227 with incident_id=922dcfc1-2c77-47ad-ac7e-bc81bf0142cd
2015-04-09 08:48:05,091 DEBUG results for incident_id=922dcfc1-2c77-47ad-ac7e-bc81bf0142cd written to collection.
2015-04-09 08:48:05,091 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594480_13227 incident_id=922dcfc1-2c77-47ad-ac7e-bc81bf0142cd result_id=0 written to collection incident_results
2015-04-09 08:48:05,091 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:48:05,098 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:48:05,098 INFO Alert handler finished. duration=0.714s
2015-04-09 08:49:04,216 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594540_13231/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594540_13231 sessionKey=AAWAMkUeS^AkGEPyL0uDwEQ6Vm^8rydH88cFea4mcy0oVJ24W7nkiGeRLe8vwK2fSRc7M2BkzjaMXUF7ghcfwFHlPP^tD5F7LOjY9AVhfrTleQd4V8nTOq5_d9WTvikW2B^MSiV4wvZVxwofyF alert=TO Failed Login Alert
2015-04-09 08:49:04,222 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594540_13231' has been fired.
2015-04-09 08:49:04,491 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low"}
2015-04-09 08:49:04,491 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 08:49:04,497 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "Single System Login Failed", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 08:49:04,497 INFO Found incident settings for TO Failed Login Alert
2015-04-09 08:49:04,497 DEBUG Incident config after getting settings: {"auto_assign": false, "auto_ttl_resolve": false, "alert": "TO Failed Login Alert", "category": "Active Directory", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "_key": "55269e303403e12f580a9565", "tags": "AD", "alert_script": "", "auto_assign_user": "", "urgency": "high", "subcategory": "Single System Login Failed", "run_alert_script": false, "_user": "nobody"}
2015-04-09 08:49:04,505 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 08:49:04,523 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:49:04,523 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:49:04,549 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:49:04,549 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:49:04,549 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:49:04,550 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:49:04,550 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594540_13231
2015-04-09 08:49:04,843 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=bff888da-dc4b-41d8-93df-5b3dda876117
2015-04-09 08:49:04,870 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:49:04,870 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:49:04,870 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:49:04,870 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:49:04,897 DEBUG Create event will be: time=2015-04-09T08:49:04.897216 severity=INFO origin="alert_handler" event_id="d964f3080f2cb310023874c9db0d78ad" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="bff888da-dc4b-41d8-93df-5b3dda876117" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594540_13231" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594541"
2015-04-09 08:49:04,903 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594540_13231 with incident_id=bff888da-dc4b-41d8-93df-5b3dda876117
2015-04-09 08:49:04,931 DEBUG results for incident_id=bff888da-dc4b-41d8-93df-5b3dda876117 written to collection.
2015-04-09 08:49:04,932 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594540_13231 incident_id=bff888da-dc4b-41d8-93df-5b3dda876117 result_id=0 written to collection incident_results
2015-04-09 08:49:04,932 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:49:04,939 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:49:04,939 INFO Alert handler finished. duration=0.723s
2015-04-09 08:50:05,666 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594600_13238/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594600_13238 sessionKey=2CZ7w^83q0NlZlu0GnhAH_CmV5gBXjb^AMpGq6odZdXUGBk7T7yHY9JXERha5Y^eoKsmVohHtGm7x0iy^GB652jUVOH1CzPJ2VDUkcPxySgvkALOJGRzWfeC43bAovRCRRGQz2dzOpvBfF alert=TO Failed Login Alert
2015-04-09 08:50:05,672 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594600_13238' has been fired.
2015-04-09 08:50:05,755 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594600_13241/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594600_13241 sessionKey=Ze734I16FNqymYHx6l1vOtiXNRVZwzBPicA6jPgDt_kRDNLTLZqbSYqaCxBxYALjmyG5jZxoVUTdn8mRd7p6lvbdtaqMJOn8poUld6XLLwl8fP2aQSZpKHnLqIRAiRcCISGc^i8ozH4KjqG7h9BN alert=To Global Failed Login >3 Alert
2015-04-09 08:50:05,761 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594600_13241' has been fired.
2015-04-09 08:50:05,943 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 08:50:05,943 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 08:50:05,950 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 08:50:05,950 INFO Found incident settings for TO Failed Login Alert
2015-04-09 08:50:05,950 DEBUG Incident config after getting settings: {"auto_assign_user": "", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "tags": "AD", "alert": "TO Failed Login Alert", "subcategory": "", "category": "Active Directory", "alert_script": "", "auto_previous_resolve": false, "urgency": "high", "run_alert_script": false, "auto_assign_owner": "unassigned", "_key": "55269e303403e12f580a9565"}
2015-04-09 08:50:05,958 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 08:50:05,975 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:50:05,975 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:50:06,000 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:50:06,000 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:50:06,000 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:50:06,000 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:50:06,000 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594600_13238
2015-04-09 08:50:06,033 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"}
2015-04-09 08:50:06,034 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:50:06,040 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:50:06,040 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:50:06,040 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "category": "Active Directory", "_user": "nobody", "auto_assign": false, "urgency": "high", "auto_ttl_resolve": false, "alert": "To Global Failed Login >3 Alert", "run_alert_script": false, "tags": "AD", "_key": "55240dda3403e174be1efd93", "subcategory": "", "alert_script": "", "auto_assign_user": "", "auto_previous_resolve": false}
2015-04-09 08:50:06,049 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 08:50:06,068 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:50:06,068 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:50:06,100 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:50:06,100 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:50:06,101 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:50:06,101 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:50:06,101 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594600_13241
2015-04-09 08:50:06,303 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=df2c2a17-8c09-4ad5-8fd0-b8da2a0ecb3e
2015-04-09 08:50:06,329 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:50:06,330 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:50:06,330 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:50:06,330 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:50:06,372 DEBUG Create event will be: time=2015-04-09T08:50:06.372146 severity=INFO origin="alert_handler" event_id="331dc8fdd0a69ae4855a1fa28828d2d8" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="df2c2a17-8c09-4ad5-8fd0-b8da2a0ecb3e" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594600_13238" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594601"
2015-04-09 08:50:06,379 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594600_13238 with incident_id=df2c2a17-8c09-4ad5-8fd0-b8da2a0ecb3e
2015-04-09 08:50:06,404 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=b01a00c2-a82f-468c-b27a-d27cc669d60d
2015-04-09 08:50:06,406 DEBUG results for incident_id=df2c2a17-8c09-4ad5-8fd0-b8da2a0ecb3e written to collection.
2015-04-09 08:50:06,406 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594600_13238 incident_id=df2c2a17-8c09-4ad5-8fd0-b8da2a0ecb3e result_id=0 written to collection incident_results
2015-04-09 08:50:06,407 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:50:06,414 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:50:06,414 INFO Alert handler finished. duration=0.749s
2015-04-09 08:50:06,432 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:50:06,432 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:50:06,432 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:50:06,432 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:50:06,475 DEBUG Create event will be: time=2015-04-09T08:50:06.475708 severity=INFO origin="alert_handler" event_id="3fd5b28cd7818157c3bd13dc00763693" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="b01a00c2-a82f-468c-b27a-d27cc669d60d" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594600_13241" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594601"
2015-04-09 08:50:06,482 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594600_13241 with incident_id=b01a00c2-a82f-468c-b27a-d27cc669d60d
2015-04-09 08:50:06,509 DEBUG results for incident_id=b01a00c2-a82f-468c-b27a-d27cc669d60d written to collection.
2015-04-09 08:50:06,509 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594600_13241 incident_id=b01a00c2-a82f-468c-b27a-d27cc669d60d result_id=0 written to collection incident_results
2015-04-09 08:50:06,509 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:50:06,517 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:50:06,517 INFO Alert handler finished. duration=0.762s
2015-04-09 08:51:13,966 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594660_13264/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594660_13264 sessionKey=DsoyUN6hiF0SVw8WS4kncc4QWjblIa_JUHrstft2zLsrEoWJZPddn6E6vy2p^JTydoc2u1yoOZnw8DAKdkujtanWrYRw_nQg0vP76lvFR8h4_yIcr9yzjd7ciTdAVXPQIbh14KhywGwB8JuM0o alert=To Global Failed Login >3 Alert
2015-04-09 08:51:13,972 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594660_13264' has been fired.
2015-04-09 08:51:14,099 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594660_13263/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594660_13263 sessionKey=T4BYDqlmzEPoBEfOmJS9aR1l9vdeXcGfDGKkJc_3ZUXLdIn3PB3wmYiHWhWUNU7oIseBUOUdsktk^_nga^uI4n1vTNJc24Os1KseiGRgjKSqVvS4vSanem08mDPFMKQJDu^6l4wz0SKMkQyL_0LW alert=TO Failed Login Alert
2015-04-09 08:51:14,105 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594660_13263' has been fired.
2015-04-09 08:51:14,249 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 08:51:14,250 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:51:14,256 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:51:14,256 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:51:14,256 DEBUG Incident config after getting settings: {"urgency": "high", "auto_previous_resolve": false, "run_alert_script": false, "auto_assign_user": "", "_user": "nobody", "tags": "AD", "alert_script": "", "alert": "To Global Failed Login >3 Alert", "_key": "55240dda3403e174be1efd93", "auto_assign": false, "auto_assign_owner": "unassigned", "subcategory": "", "category": "Active Directory", "auto_ttl_resolve": false}
2015-04-09 08:51:14,265 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 08:51:14,283 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:51:14,283 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:51:14,309 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:51:14,309 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:51:14,309 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:51:14,309 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:51:14,309 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594660_13264
2015-04-09 08:51:14,376 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 08:51:14,376 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 08:51:14,382 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 08:51:14,383 INFO Found incident settings for TO Failed Login Alert
2015-04-09 08:51:14,383 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "auto_assign_user": "", "_key": "55269e303403e12f580a9565", "auto_previous_resolve": false, "alert_script": "", "tags": "AD", "alert": "TO Failed Login Alert", "run_alert_script": false, "_user": "nobody", "subcategory": "", "category": "Active Directory", "auto_ttl_resolve": false, "auto_assign": false, "urgency": "high"}
2015-04-09 08:51:14,391 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 08:51:14,408 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:51:14,408 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:51:14,436 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:51:14,436 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:51:14,437 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:51:14,437 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:51:14,437 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594660_13263
2015-04-09 08:51:14,614 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=33e3b825-4ca7-4840-b569-52c78aa3516d
2015-04-09 08:51:14,642 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:51:14,642 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:51:14,642 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:51:14,642 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:51:14,683 DEBUG Create event will be: time=2015-04-09T08:51:14.683476 severity=INFO origin="alert_handler" event_id="c38d37f064aa8a2a031d0f19a5bfe790" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="33e3b825-4ca7-4840-b569-52c78aa3516d" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594660_13264" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594671"
2015-04-09 08:51:14,690 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594660_13264 with incident_id=33e3b825-4ca7-4840-b569-52c78aa3516d
2015-04-09 08:51:14,718 DEBUG results for incident_id=33e3b825-4ca7-4840-b569-52c78aa3516d written to collection.
2015-04-09 08:51:14,719 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594660_13264 incident_id=33e3b825-4ca7-4840-b569-52c78aa3516d result_id=0 written to collection incident_results
2015-04-09 08:51:14,719 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:51:14,727 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:51:14,727 INFO Alert handler finished. duration=0.761s
2015-04-09 08:51:14,739 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=faf74373-8f11-48df-a35e-1609ff408407
2015-04-09 08:51:14,767 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:51:14,768 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:51:14,768 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 08:51:14,768 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:51:14,787 DEBUG Create event will be: time=2015-04-09T08:51:14.787557 severity=INFO origin="alert_handler" event_id="55d936c8b378c814f7d192a4ffa616f0" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="faf74373-8f11-48df-a35e-1609ff408407" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594660_13263" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594671"
2015-04-09 08:51:14,794 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594660_13263 with incident_id=faf74373-8f11-48df-a35e-1609ff408407
2015-04-09 08:51:14,822 DEBUG results for incident_id=faf74373-8f11-48df-a35e-1609ff408407 written to collection.
2015-04-09 08:51:14,822 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594660_13263 incident_id=faf74373-8f11-48df-a35e-1609ff408407 result_id=0 written to collection incident_results
2015-04-09 08:51:14,822 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:51:14,829 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:51:14,830 INFO Alert handler finished. duration=0.731s
2015-04-09 08:52:04,907 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594720_13267/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594720_13267 sessionKey=n^4NCJehs_LiTZUKqSRPnnKfjuq5LD3xl_h1qD0LDskHmmzm1FPuiotcwuPjT6nNi7a7xarZxWvzUxveaOr1Hr5CfpLU0znw1Tix9ugplmIvxfCm5NXJzBrVmfGrAruDGdr9rB_tAqhzzo alert=To Global Failed Login >3 Alert
2015-04-09 08:52:04,913 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594720_13267' has been fired.
2015-04-09 08:52:05,186 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-09 08:52:05,187 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 08:52:05,193 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 08:52:05,193 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 08:52:05,193 DEBUG Incident config after getting settings: {"_user": "nobody", "alert_script": "", "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "auto_assign": false, "auto_ttl_resolve": false, "run_alert_script": false, "category": "Active Directory", "urgency": "high", "_key": "55240dda3403e174be1efd93", "auto_assign_user": "", "subcategory": "", "alert": "To Global Failed Login >3 Alert", "tags": "AD"}
2015-04-09 08:52:05,201 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 08:52:05,219 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:52:05,219 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:52:05,245 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:52:05,246 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:52:05,246 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:52:05,246 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:52:05,246 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594720_13267
2015-04-09 08:52:05,315 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594720_13266/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594720_13266 sessionKey=SGiPXB1_7e430beChxwlu8OrM^BzGisLKgPBmPndb7wx7NsKa0uQx8OcKyMByZ94baGQjLuELN1eWj4rcJ2^r8vzI^Xs2rrahgCYovs_WE3EztOqt1t5fnQ2fDTLX0JT7U56uyRdsZ2W alert=TO Failed Login Alert
2015-04-09 08:52:05,320 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594720_13266' has been fired.
2015-04-09 08:52:05,542 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=0d7c2da2-a655-4197-a65a-78b70203d37d
2015-04-09 08:52:05,567 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:52:05,567 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:52:05,568 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:52:05,568 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:52:05,592 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 08:52:05,593 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 08:52:05,602 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 08:52:05,602 INFO Found incident settings for TO Failed Login Alert
2015-04-09 08:52:05,602 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "_user": "nobody", "alert_script": "", "urgency": "high", "auto_previous_resolve": false, "subcategory": "", "auto_ttl_resolve": false, "run_alert_script": false, "auto_assign_user": "", "_key": "55269e303403e12f580a9565", "category": "Active Directory", "alert": "TO Failed Login Alert", "auto_assign": false, "tags": "AD"}
2015-04-09 08:52:05,604 DEBUG Create event will be: time=2015-04-09T08:52:05.604672 severity=INFO origin="alert_handler" event_id="3b3f605e2ac53453ac7550d0af369667" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="0d7c2da2-a655-4197-a65a-78b70203d37d" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594720_13267" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594721"
2015-04-09 08:52:05,612 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594720_13267 with incident_id=0d7c2da2-a655-4197-a65a-78b70203d37d
2015-04-09 08:52:05,614 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 08:52:05,636 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 08:52:05,637 DEBUG Transformed 24h into 86400 seconds
2015-04-09 08:52:05,639 DEBUG results for incident_id=0d7c2da2-a655-4197-a65a-78b70203d37d written to collection.
2015-04-09 08:52:05,639 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594720_13267 incident_id=0d7c2da2-a655-4197-a65a-78b70203d37d result_id=0 written to collection incident_results
2015-04-09 08:52:05,639 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:52:05,646 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:52:05,646 INFO Alert handler finished. duration=0.74s
2015-04-09 08:52:05,663 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 08:52:05,664 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 08:52:05,664 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 08:52:05,665 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 08:52:05,665 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594720_13266
2015-04-09 08:52:05,957 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=5c189d94-c389-4da2-9521-b2c1a1ec274f
2015-04-09 08:52:05,983 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 08:52:05,983 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 08:52:05,983 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 08:52:05,983 DEBUG Matched priority in lookup, returning value=high
2015-04-09 08:52:06,014 DEBUG Create event will be: time=2015-04-09T08:52:06.014713 severity=INFO origin="alert_handler" event_id="926ba0fd8316cf6633fa9ddee479f5e5" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="5c189d94-c389-4da2-9521-b2c1a1ec274f" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594720_13266" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594721"
2015-04-09 08:52:06,021 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594720_13266 with incident_id=5c189d94-c389-4da2-9521-b2c1a1ec274f
2015-04-09 08:52:06,049 DEBUG results for incident_id=5c189d94-c389-4da2-9521-b2c1a1ec274f written to collection.
2015-04-09 08:52:06,049 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594720_13266 incident_id=5c189d94-c389-4da2-9521-b2c1a1ec274f result_id=0 written to collection incident_results
2015-04-09 08:52:06,049 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 08:52:06,056 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 08:52:06,056 INFO Alert handler finished. duration=0.742s
2015-04-09 08:53:04,102 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594780_13269/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594780_13269 sessionKey=HQ3fe^F7RLF4pRBhDtq0hAh1^8UD5DGV8uLGIiVTVbyAnrF3pH^X8U^W7aEdfpo6OPbBIKBQ_pjXGrmZ_xi83Q4WVPG2OQMQkGALFQ^Q^JjIdiSENIoeeDpnNpBtTgvfM7xPhKLVJSnEVC alert=TO Failed Login Alert
2015-04-09 08:53:04,108 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594780_13269' has been fired.
2015-04-09 08:53:04,392 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 08:53:04,393 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 08:53:04,399 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 08:53:04,400 INFO Found incident settings for TO Failed Login Alert
2015-04-09 08:53:04,400 DEBUG Incident config after getting settings: {"alert_script": "", "tags": "AD", "subcategory": "", "_user": "nobody", "auto_assign": false, "auto_assign_owner": "unassigned", "alert": "TO Failed Login Alert", "run_alert_script": false, "category": "Active Directory", "auto_previous_resolve": false, "auto_ttl_resolve": false, "urgency": "high", "_key": "55269e303403e12f580a9565", "auto_assign_user": ""} | |
2015-04-09 08:53:04,408 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results. | |
2015-04-09 08:53:04,419 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594780_13270/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594780_13270 sessionKey=89L0QbpRTGu2KW1BItIAip6Q5UmfY65FM03WDzxOb_8xGugW1BKNd9PhfQO9NADyF3CGLgLiEcJDX1iGecL_IJh1H4L9go2PeTVvmDm0Tcs3CAYL_aAOsxox7plri2z05drjEmq6rhW^typmlC alert=To Global Failed Login >3 Alert | |
2015-04-09 08:53:04,425 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594780_13270' has been fired. | |
2015-04-09 08:53:04,428 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:53:04,428 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:53:04,456 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:53:04,457 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:53:04,457 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:53:04,457 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:53:04,457 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594780_13269 | |
2015-04-09 08:53:04,702 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low"} | |
2015-04-09 08:53:04,702 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 08:53:04,710 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 08:53:04,710 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 08:53:04,710 DEBUG Incident config after getting settings: {"run_alert_script": false, "auto_ttl_resolve": false, "_key": "55240dda3403e174be1efd93", "auto_previous_resolve": false, "alert_script": "", "tags": "AD", "urgency": "high", "auto_assign_owner": "unassigned", "auto_assign": false, "alert": "To Global Failed Login >3 Alert", "auto_assign_user": "", "category": "Active Directory", "_user": "nobody", "subcategory": ""} | |
2015-04-09 08:53:04,720 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results. | |
2015-04-09 08:53:04,741 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:53:04,741 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:53:04,770 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=e953faf7-1b35-4699-bed7-e940fc254f8f | |
2015-04-09 08:53:04,770 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:53:04,770 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:53:04,771 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:53:04,771 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:53:04,771 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594780_13270 | |
2015-04-09 08:53:04,799 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:53:04,799 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:53:04,799 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:53:04,799 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:53:04,833 DEBUG Create event will be: time=2015-04-09T08:53:04.833454 severity=INFO origin="alert_handler" event_id="dcd2cde6db93bd9d766a459c76a5a1c6" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="e953faf7-1b35-4699-bed7-e940fc254f8f" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594780_13269" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594781" | |
2015-04-09 08:53:04,839 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594780_13269 with incident_id=e953faf7-1b35-4699-bed7-e940fc254f8f | |
2015-04-09 08:53:04,875 DEBUG results for incident_id=e953faf7-1b35-4699-bed7-e940fc254f8f written to collection. | |
2015-04-09 08:53:04,875 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594780_13269 incident_id=e953faf7-1b35-4699-bed7-e940fc254f8f result_id=0 written to collection incident_results | |
2015-04-09 08:53:04,876 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:53:04,884 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:53:04,884 INFO Alert handler finished. duration=0.782s | |
2015-04-09 08:53:05,077 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=76c20109-70b1-4b34-8527-fa11d4ff2b5a | |
2015-04-09 08:53:05,103 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:53:05,103 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:53:05,104 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:53:05,104 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:53:05,141 DEBUG Create event will be: time=2015-04-09T08:53:05.141613 severity=INFO origin="alert_handler" event_id="7ac2907538b6b01f1072387fd873f37b" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="76c20109-70b1-4b34-8527-fa11d4ff2b5a" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594780_13270" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594781" | |
2015-04-09 08:53:05,147 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594780_13270 with incident_id=76c20109-70b1-4b34-8527-fa11d4ff2b5a | |
2015-04-09 08:53:05,175 DEBUG results for incident_id=76c20109-70b1-4b34-8527-fa11d4ff2b5a written to collection. | |
2015-04-09 08:53:05,175 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594780_13270 incident_id=76c20109-70b1-4b34-8527-fa11d4ff2b5a result_id=0 written to collection incident_results | |
2015-04-09 08:53:05,175 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:53:05,182 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:53:05,182 INFO Alert handler finished. duration=0.764s | |
2015-04-09 08:54:04,006 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594840_13273/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594840_13273 sessionKey=5J0gC4pvVGdpkiYhYMHGVuE2xBdnjK1zFkGeE6iQwzo5RB9X0Vb^q3QE6KZbuavlyrJXQJEDfaWCsD4akBXTH6rA2D08aP0kiMkhC3oX1SxuGqxUrFyemdn3_9vOihXOda1z3xbv1S7s alert=To Global Failed Login >3 Alert | |
2015-04-09 08:54:04,012 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594840_13273' has been fired. | |
2015-04-09 08:54:04,123 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594840_13272/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594840_13272 sessionKey=Sgfco9^rrUV0Ohwq3RBM1cMTflnEzm^TY6_EDbKcu8f3Z2giM79ZJDz2aZV0Aur37AP4f5^l09YSsguW7Mav7jYn9DDddOOkmO_fCfDzh^4Wu8LHFrgqdsDLmRT0d8P7l9AwRvAFlIgRz7HL alert=TO Failed Login Alert | |
2015-04-09 08:54:04,129 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594840_13272' has been fired. | |
2015-04-09 08:54:04,292 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"} | |
2015-04-09 08:54:04,293 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 08:54:04,299 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 08:54:04,299 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 08:54:04,300 DEBUG Incident config after getting settings: {"_user": "nobody", "auto_ttl_resolve": false, "urgency": "high", "auto_assign": false, "_key": "55240dda3403e174be1efd93", "subcategory": "", "category": "Active Directory", "run_alert_script": false, "alert": "To Global Failed Login >3 Alert", "alert_script": "", "auto_previous_resolve": false, "tags": "AD", "auto_assign_owner": "unassigned", "auto_assign_user": ""} | |
2015-04-09 08:54:04,308 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results. | |
2015-04-09 08:54:04,327 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:54:04,327 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:54:04,358 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:54:04,358 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:54:04,359 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:54:04,359 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:54:04,359 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594840_13273 | |
2015-04-09 08:54:04,422 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"} | |
2015-04-09 08:54:04,422 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:54:04,429 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 08:54:04,430 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 08:54:04,430 DEBUG Incident config after getting settings: {"run_alert_script": false, "subcategory": "", "alert": "TO Failed Login Alert", "category": "Active Directory", "_user": "nobody", "alert_script": "", "urgency": "high", "tags": "AD", "auto_assign": false, "auto_previous_resolve": false, "auto_assign_user": "", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "_key": "55269e303403e12f580a9565"} | |
2015-04-09 08:54:04,439 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results. | |
2015-04-09 08:54:04,459 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:54:04,459 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:54:04,485 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:54:04,486 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:54:04,486 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:54:04,486 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:54:04,486 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594840_13272 | |
2015-04-09 08:54:04,665 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=5ab02c86-1201-4017-917a-bdef53d61bf8 | |
2015-04-09 08:54:04,691 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:54:04,691 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:54:04,691 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 08:54:04,691 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:54:04,730 DEBUG Create event will be: time=2015-04-09T08:54:04.730528 severity=INFO origin="alert_handler" event_id="4c619c7c50980d7f7794c8aaa248d83d" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="5ab02c86-1201-4017-917a-bdef53d61bf8" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594840_13273" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594841" | |
2015-04-09 08:54:04,737 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594840_13273 with incident_id=5ab02c86-1201-4017-917a-bdef53d61bf8 | |
2015-04-09 08:54:04,765 DEBUG results for incident_id=5ab02c86-1201-4017-917a-bdef53d61bf8 written to collection. | |
2015-04-09 08:54:04,765 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428594840_13273 incident_id=5ab02c86-1201-4017-917a-bdef53d61bf8 result_id=0 written to collection incident_results | |
2015-04-09 08:54:04,765 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:54:04,773 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:54:04,773 INFO Alert handler finished. duration=0.767s | |
2015-04-09 08:54:04,800 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=f57b1951-decb-42ef-8923-1d89b85dcb0b | |
2015-04-09 08:54:04,828 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:54:04,828 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:54:04,828 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'} | |
2015-04-09 08:54:04,829 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:54:04,868 DEBUG Create event will be: time=2015-04-09T08:54:04.868666 severity=INFO origin="alert_handler" event_id="3b5558edefb1d6e73f080e9089625b21" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="f57b1951-decb-42ef-8923-1d89b85dcb0b" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594840_13272" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594841" | |
2015-04-09 08:54:04,875 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594840_13272 with incident_id=f57b1951-decb-42ef-8923-1d89b85dcb0b | |
2015-04-09 08:54:04,903 DEBUG results for incident_id=f57b1951-decb-42ef-8923-1d89b85dcb0b written to collection. | |
2015-04-09 08:54:04,903 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594840_13272 incident_id=f57b1951-decb-42ef-8923-1d89b85dcb0b result_id=0 written to collection incident_results | |
2015-04-09 08:54:04,903 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:54:04,911 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:54:04,911 INFO Alert handler finished. duration=0.788s | |
2015-04-09 08:55:04,602 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594900_13279/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594900_13279 sessionKey=_GL_NutTWV7NDPDg34vAyJaUxqvCORk8YC0XdP8AL6DIXnbjPVUVK5XlhRUo8RKh1cU_dC6I24O_kcwOVxbWuSC1Jy585grIC9RZzJNuTFtlUew6F2aPdmmWrdnP2fmw2xWDIOZGQEArlB_igF alert=TO Failed Login Alert | |
2015-04-09 08:55:04,608 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594900_13279' has been fired. | |
2015-04-09 08:55:04,877 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf"} | |
2015-04-09 08:55:04,878 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 08:55:04,884 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 08:55:04,884 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 08:55:04,884 DEBUG Incident config after getting settings: {"alert": "TO Failed Login Alert", "subcategory": "", "auto_assign_owner": "unassigned", "tags": "AD", "auto_ttl_resolve": false, "run_alert_script": false, "category": "Active Directory", "auto_previous_resolve": false, "auto_assign": false, "auto_assign_user": "", "urgency": "high", "_key": "55269e303403e12f580a9565", "_user": "nobody", "alert_script": ""} | |
2015-04-09 08:55:04,892 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results. | |
2015-04-09 08:55:04,909 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 08:55:04,909 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 08:55:04,934 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 08:55:04,934 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 08:55:04,935 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 08:55:04,935 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 08:55:04,935 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594900_13279 | |
2015-04-09 08:55:05,228 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=54046ee3-8a82-48ea-abaa-302f8d0a7bf8 | |
2015-04-09 08:55:05,253 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 08:55:05,254 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 08:55:05,254 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'} | |
2015-04-09 08:55:05,254 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 08:55:05,280 DEBUG Create event will be: time=2015-04-09T08:55:05.279965 severity=INFO origin="alert_handler" event_id="e3a474a05ab84e1cc2d74608700740aa" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="54046ee3-8a82-48ea-abaa-302f8d0a7bf8" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594900_13279" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428594901" | |
2015-04-09 08:55:05,286 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594900_13279 with incident_id=54046ee3-8a82-48ea-abaa-302f8d0a7bf8 | |
2015-04-09 08:55:05,314 DEBUG results for incident_id=54046ee3-8a82-48ea-abaa-302f8d0a7bf8 written to collection. | |
2015-04-09 08:55:05,314 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428594900_13279 incident_id=54046ee3-8a82-48ea-abaa-302f8d0a7bf8 result_id=0 written to collection incident_results | |
2015-04-09 08:55:05,314 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 08:55:05,321 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 08:55:05,321 INFO Alert handler finished. duration=0.72s | |
2015-04-09 09:00:06,361 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595200_13299/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595200_13299 sessionKey=xRYz3A9mIjwrnhXhJmWrphnu1cnIHYz4xenf0XHklmYsWimi2ep2iGQ^67c8KJjQMKmLr8NdSabEPW5fyKSyH_Yegl7q05W5KPXnmaZS^xmwlGsZ968ehGgCyLv_ue86IHuF5AxGdUJNqF74EF8l alert=To Global Failed Login >3 Alert | |
2015-04-09 09:00:06,368 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595200_13299' has been fired. | |
2015-04-09 09:00:06,657 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low"} | |
2015-04-09 09:00:06,657 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 09:00:06,664 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 09:00:06,664 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 09:00:06,664 DEBUG Incident config after getting settings: {"_key": "55240dda3403e174be1efd93", "category": "Active Directory", "tags": "AD", "alert_script": "", "auto_assign_owner": "unassigned", "urgency": "high", "auto_previous_resolve": false, "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "auto_assign_user": "", "alert": "To Global Failed Login >3 Alert", "subcategory": "", "_user": "nobody"} | |
2015-04-09 09:00:06,672 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results. | |
2015-04-09 09:00:06,690 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 09:00:06,690 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 09:00:06,716 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 09:00:06,716 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 09:00:06,716 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 09:00:06,716 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 09:00:06,716 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595200_13299 | |
2015-04-09 09:00:06,733 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595200_13302/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595200_13302 sessionKey=zDdBClsYwKUJ_sOV9CIMQr1F^CKVNUL9gJSa3_I6bjFO6tplznmub5Ne5br4pkxwKXs9MZs^jgQ50AsXOfDjXrBWe3cDVpSoFQesokqeO6uMRxd_BGcVdpWMyv1mdavD9X__QYkQ9VA alert=TO Failed Login Alert | |
2015-04-09 09:00:06,739 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595200_13302' has been fired. | |
2015-04-09 09:00:07,021 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"} | |
2015-04-09 09:00:07,021 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 09:00:07,028 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 09:00:07,028 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 09:00:07,029 DEBUG Incident config after getting settings: {"urgency": "high", "_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "auto_assign": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "run_alert_script": false, "category": "Active Directory", "alert": "TO Failed Login Alert", "subcategory": "", "_key": "55269e303403e12f580a9565", "tags": "AD"} | |
2015-04-09 09:00:07,037 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results. | |
2015-04-09 09:00:07,038 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=a647d096-02fa-4488-8338-1be262cb096c | |
2015-04-09 09:00:07,058 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 09:00:07,059 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 09:00:07,072 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 09:00:07,072 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 09:00:07,073 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 09:00:07,073 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 09:00:07,091 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 09:00:07,091 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 09:00:07,091 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 09:00:07,092 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 09:00:07,092 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595200_13302 | |
2015-04-09 09:00:07,111 DEBUG Create event will be: time=2015-04-09T09:00:07.111031 severity=INFO origin="alert_handler" event_id="c0834f7f63005fe777b5eeef4f5884b9" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="a647d096-02fa-4488-8338-1be262cb096c" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595200_13299" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428595201" | |
2015-04-09 09:00:07,117 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595200_13299 with incident_id=a647d096-02fa-4488-8338-1be262cb096c
2015-04-09 09:00:07,145 DEBUG results for incident_id=a647d096-02fa-4488-8338-1be262cb096c written to collection.
2015-04-09 09:00:07,145 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595200_13299 incident_id=a647d096-02fa-4488-8338-1be262cb096c result_id=0 written to collection incident_results
2015-04-09 09:00:07,145 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:00:07,152 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:00:07,152 INFO Alert handler finished. duration=0.792s
2015-04-09 09:00:07,397 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=595056d9-b930-4964-9b81-3828d397fdff
2015-04-09 09:00:07,427 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:00:07,427 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:00:07,427 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 09:00:07,427 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:00:07,453 DEBUG Create event will be: time=2015-04-09T09:00:07.453502 severity=INFO origin="alert_handler" event_id="b6ad62ac305483d164451f4d74b7a680" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="595056d9-b930-4964-9b81-3828d397fdff" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595200_13302" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428595201"
2015-04-09 09:00:07,460 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595200_13302 with incident_id=595056d9-b930-4964-9b81-3828d397fdff
2015-04-09 09:00:07,488 DEBUG results for incident_id=595056d9-b930-4964-9b81-3828d397fdff written to collection.
2015-04-09 09:00:07,488 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595200_13302 incident_id=595056d9-b930-4964-9b81-3828d397fdff result_id=0 written to collection incident_results
2015-04-09 09:00:07,488 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:00:07,496 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:00:07,496 INFO Alert handler finished. duration=0.763s
2015-04-09 09:10:06,645 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595800_13363/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595800_13363 sessionKey=3PgHYDovRLOPO9ISPGdxemEbD0A2mibOYtB4vnUcZQj^aHQWiNmsYfSwHIC1L1T1tIaeBQKqpZJhMqVpu8OmzDrSMF1sgRMZxLG8o_2Y1HguIXHDyNlOsIVO4kwT8UF66KvPoFj6RGR alert=TO Failed Login Alert
2015-04-09 09:10:06,651 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595800_13363' has been fired.
2015-04-09 09:10:06,929 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 09:10:06,929 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 09:10:06,930 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595800_13366/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595800_13366 sessionKey=euTlbiBdVxBnAEO__lefJhfXwPEgbPUUZQy_yhjQQQKgnrcHDPu7kZmG7bhTxlmSGvFN4ZDJ_WFsJMAR_3zTBot_kyAOuzCz^lwHvhY4AC21KjE1YVCaE3CkLMnNdohPK1UvFgzFAlTE3C alert=To Global Failed Login >3 Alert
2015-04-09 09:10:06,935 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 09:10:06,936 INFO Found incident settings for TO Failed Login Alert
2015-04-09 09:10:06,936 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595800_13366' has been fired.
2015-04-09 09:10:06,936 DEBUG Incident config after getting settings: {"category": "Active Directory", "tags": "AD", "alert_script": "", "alert": "TO Failed Login Alert", "auto_assign": false, "urgency": "high", "run_alert_script": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "subcategory": "", "auto_ttl_resolve": false, "_key": "55269e303403e12f580a9565", "auto_previous_resolve": false, "_user": "nobody"}
2015-04-09 09:10:06,944 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 12 results.
2015-04-09 09:10:06,962 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:10:06,963 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:10:06,990 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:10:06,990 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:10:06,990 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:10:06,990 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:10:06,990 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595800_13363
2015-04-09 09:10:07,210 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 09:10:07,210 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 09:10:07,216 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 09:10:07,216 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 09:10:07,216 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "category": "Active Directory", "subcategory": "", "_key": "55240dda3403e174be1efd93", "auto_assign_owner": "unassigned", "auto_assign": false, "alert": "To Global Failed Login >3 Alert", "alert_script": "", "tags": "AD", "auto_previous_resolve": false, "urgency": "high", "_user": "nobody", "auto_assign_user": "", "run_alert_script": false}
2015-04-09 09:10:07,224 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 5 results.
2015-04-09 09:10:07,244 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:10:07,244 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:10:07,275 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:10:07,275 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:10:07,275 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:10:07,275 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:10:07,275 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595800_13366
2015-04-09 09:10:07,291 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=dee0eded-1316-4a9c-ad1b-7f6da2bf3871
2015-04-09 09:10:07,322 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:10:07,322 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:10:07,322 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 09:10:07,322 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:10:07,354 DEBUG Create event will be: time=2015-04-09T09:10:07.354621 severity=INFO origin="alert_handler" event_id="a0d532554a6c53001463f2bb0aa3e06b" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="dee0eded-1316-4a9c-ad1b-7f6da2bf3871" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595800_13363" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428595801"
2015-04-09 09:10:07,361 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595800_13363 with incident_id=dee0eded-1316-4a9c-ad1b-7f6da2bf3871
2015-04-09 09:10:07,388 DEBUG results for incident_id=dee0eded-1316-4a9c-ad1b-7f6da2bf3871 written to collection.
2015-04-09 09:10:07,388 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428595800_13363 incident_id=dee0eded-1316-4a9c-ad1b-7f6da2bf3871 result_id=0 written to collection incident_results
2015-04-09 09:10:07,389 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:10:07,396 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:10:07,397 INFO Alert handler finished. duration=0.752s
2015-04-09 09:10:07,580 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=c8d62172-df36-42fe-87c9-ab9c4d42d06c
2015-04-09 09:10:07,613 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:10:07,613 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:10:07,614 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 09:10:07,614 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:10:07,628 DEBUG Create event will be: time=2015-04-09T09:10:07.628844 severity=INFO origin="alert_handler" event_id="84ff3566376919671eee815c70e34213" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="c8d62172-df36-42fe-87c9-ab9c4d42d06c" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595800_13366" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428595801"
2015-04-09 09:10:07,635 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595800_13366 with incident_id=c8d62172-df36-42fe-87c9-ab9c4d42d06c
2015-04-09 09:10:07,663 DEBUG results for incident_id=c8d62172-df36-42fe-87c9-ab9c4d42d06c written to collection.
2015-04-09 09:10:07,663 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428595800_13366 incident_id=c8d62172-df36-42fe-87c9-ab9c4d42d06c result_id=0 written to collection incident_results
2015-04-09 09:10:07,663 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:10:07,671 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:10:07,671 INFO Alert handler finished. duration=0.742s
2015-04-09 09:20:07,540 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428596400_13425/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428596400_13425 sessionKey=EUeiXGJxJAfM9eTT40fzEW4AQngGwx8la^kQSdCpalbbDX2TJl5KMrcFG61LiasObmS4Otf4ZjTWhdiklB5MZxlu7HMDapgdEC6ITzLEIRYF0GTpyN_0zMGMRzP2LKJrbtHMbTpkEMm8QAA6YzL alert=TO Failed Login Alert
2015-04-09 09:20:07,546 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428596400_13425' has been fired.
2015-04-09 09:20:07,825 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 09:20:07,825 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 09:20:07,831 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 09:20:07,831 INFO Found incident settings for TO Failed Login Alert
2015-04-09 09:20:07,831 DEBUG Incident config after getting settings: {"alert": "TO Failed Login Alert", "run_alert_script": false, "auto_assign": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "tags": "AD", "_user": "nobody", "auto_previous_resolve": false, "alert_script": "", "auto_ttl_resolve": false, "subcategory": "", "urgency": "high", "_key": "55269e303403e12f580a9565", "category": "Active Directory"}
2015-04-09 09:20:07,839 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 6 results.
2015-04-09 09:20:07,857 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:20:07,857 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:20:07,873 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428596400_13428/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428596400_13428 sessionKey=6KwFjz^qDGV2OmU8WY72bHZDGOttm9t8yEsCcKGar3ToxaGhFF3Xibd4sXfnQCeoaIsxiK6vPEAh1FXynILm2SXsKVj1aCpqrJfXPYeFiOgxKNGgT3cqjENDptixskjlBbBnY^6zdF7NKPnR alert=To Global Failed Login >3 Alert
2015-04-09 09:20:07,879 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428596400_13428' has been fired.
2015-04-09 09:20:07,882 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:20:07,883 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:20:07,883 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:20:07,883 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:20:07,883 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428596400_13425
2015-04-09 09:20:08,176 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low"}
2015-04-09 09:20:08,177 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 09:20:08,183 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 09:20:08,184 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 09:20:08,184 DEBUG Incident config after getting settings: {"subcategory": "", "_user": "nobody", "alert": "To Global Failed Login >3 Alert", "auto_assign_owner": "unassigned", "auto_assign": false, "auto_assign_user": "", "category": "Active Directory", "auto_ttl_resolve": false, "run_alert_script": false, "urgency": "high", "auto_previous_resolve": false, "alert_script": "", "_key": "55240dda3403e174be1efd93", "tags": "AD"}
2015-04-09 09:20:08,193 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 5 results.
2015-04-09 09:20:08,210 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=567ade6a-925b-4f53-8c6d-a8b248e3e053
2015-04-09 09:20:08,214 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:20:08,214 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:20:08,263 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:20:08,263 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:20:08,263 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 09:20:08,263 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:20:08,265 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:20:08,266 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:20:08,266 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:20:08,266 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:20:08,266 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428596400_13428
2015-04-09 09:20:08,312 DEBUG Create event will be: time=2015-04-09T09:20:08.312306 severity=INFO origin="alert_handler" event_id="b373948d24dcc1707aa5da24b9fcede1" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="567ade6a-925b-4f53-8c6d-a8b248e3e053" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428596400_13425" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428596402"
2015-04-09 09:20:08,320 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428596400_13425 with incident_id=567ade6a-925b-4f53-8c6d-a8b248e3e053
2015-04-09 09:20:08,380 DEBUG results for incident_id=567ade6a-925b-4f53-8c6d-a8b248e3e053 written to collection.
2015-04-09 09:20:08,381 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428596400_13425 incident_id=567ade6a-925b-4f53-8c6d-a8b248e3e053 result_id=0 written to collection incident_results
2015-04-09 09:20:08,381 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:20:08,389 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:20:08,389 INFO Alert handler finished. duration=0.85s
2015-04-09 09:20:08,577 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=ad924a07-6d61-494b-bd6f-2da17c4b693f
2015-04-09 09:20:08,605 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:20:08,605 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:20:08,606 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 09:20:08,606 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:20:08,622 DEBUG Create event will be: time=2015-04-09T09:20:08.621952 severity=INFO origin="alert_handler" event_id="06e2b1d3548b3a459993f45044875dad" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="ad924a07-6d61-494b-bd6f-2da17c4b693f" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428596400_13428" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428596402"
2015-04-09 09:20:08,629 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428596400_13428 with incident_id=ad924a07-6d61-494b-bd6f-2da17c4b693f
2015-04-09 09:20:08,656 DEBUG results for incident_id=ad924a07-6d61-494b-bd6f-2da17c4b693f written to collection.
2015-04-09 09:20:08,656 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428596400_13428 incident_id=ad924a07-6d61-494b-bd6f-2da17c4b693f result_id=0 written to collection incident_results
2015-04-09 09:20:08,656 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:20:08,664 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:20:08,664 INFO Alert handler finished. duration=0.791s
2015-04-09 09:30:06,579 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597000_13477/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597000_13477 sessionKey=1xAcEU87lGzeL5UpvcGgchT5pzT5DhQuIIzit3WWgKieZTDNWXUH1bz42a0Gm0QUFnwR8urV6B6WIiDkYcOknhhzyNZtOEFnmpwo8sV_zYHsCXNOFgv_rOF0yCs6x4AHQVnISkG5HX181B9c alert=To Global Failed Login >3 Alert
2015-04-09 09:30:06,585 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597000_13477' has been fired.
2015-04-09 09:30:06,862 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_priority": "low"}
2015-04-09 09:30:06,862 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 09:30:06,868 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 09:30:06,868 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 09:30:06,868 DEBUG Incident config after getting settings: {"alert_script": "", "auto_assign_user": "", "auto_previous_resolve": false, "alert": "To Global Failed Login >3 Alert", "tags": "AD", "run_alert_script": false, "_key": "55240dda3403e174be1efd93", "subcategory": "", "_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "urgency": "high", "auto_assign_owner": "unassigned", "category": "Active Directory"}
2015-04-09 09:30:06,876 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 4 results.
2015-04-09 09:30:06,894 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:30:06,894 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:30:06,899 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597000_13480/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597000_13480 sessionKey=HOJ9c3vvYFNFUbKFcvLbjWTTT75g_koKfotYnXufHCIAWLNDibGUYW02R_9mtGFUWhuTcKfWrxS6sd2Yoqw^l7HMML7vycJ28bduaHN3mq7pcKosl_WR8F1TA^Wq3v1QjvmnJuvGR0n0i3C alert=TO Failed Login Alert
2015-04-09 09:30:06,905 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597000_13480' has been fired.
2015-04-09 09:30:06,922 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:30:06,922 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:30:06,923 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:30:06,923 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:30:06,923 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597000_13477
2015-04-09 09:30:07,180 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low"}
2015-04-09 09:30:07,180 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 09:30:07,188 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 09:30:07,188 INFO Found incident settings for TO Failed Login Alert
2015-04-09 09:30:07,188 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "auto_assign_user": "", "_key": "55269e303403e12f580a9565", "auto_assign_owner": "unassigned", "category": "Active Directory", "alert": "TO Failed Login Alert", "subcategory": "", "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "urgency": "high", "tags": "AD", "alert_script": "", "_user": "nobody"}
2015-04-09 09:30:07,198 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 4 results.
2015-04-09 09:30:07,218 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:30:07,218 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:30:07,236 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=72a97cec-8b04-45a1-9506-1ef61fd94b57
2015-04-09 09:30:07,249 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:30:07,249 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:30:07,249 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:30:07,249 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:30:07,249 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597000_13480
2015-04-09 09:30:07,268 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:30:07,268 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:30:07,269 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 09:30:07,269 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:30:07,293 DEBUG Create event will be: time=2015-04-09T09:30:07.293734 severity=INFO origin="alert_handler" event_id="83f7bdfefbb3c24199a79015ea22296e" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="72a97cec-8b04-45a1-9506-1ef61fd94b57" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597000_13477" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428597001"
2015-04-09 09:30:07,301 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597000_13477 with incident_id=72a97cec-8b04-45a1-9506-1ef61fd94b57
2015-04-09 09:30:07,328 DEBUG results for incident_id=72a97cec-8b04-45a1-9506-1ef61fd94b57 written to collection.
2015-04-09 09:30:07,328 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597000_13477 incident_id=72a97cec-8b04-45a1-9506-1ef61fd94b57 result_id=0 written to collection incident_results
2015-04-09 09:30:07,328 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:30:07,335 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:30:07,335 INFO Alert handler finished. duration=0.757s
2015-04-09 09:30:07,551 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=ed91253a-5047-463e-8d29-2089bfb30207
2015-04-09 09:30:07,584 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:30:07,584 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:30:07,585 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 09:30:07,585 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:30:07,602 DEBUG Create event will be: time=2015-04-09T09:30:07.602184 severity=INFO origin="alert_handler" event_id="3903d5812d3cd34d11fc2f97af34448d" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="ed91253a-5047-463e-8d29-2089bfb30207" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597000_13480" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428597002"
2015-04-09 09:30:07,609 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597000_13480 with incident_id=ed91253a-5047-463e-8d29-2089bfb30207
2015-04-09 09:30:07,636 DEBUG results for incident_id=ed91253a-5047-463e-8d29-2089bfb30207 written to collection.
2015-04-09 09:30:07,636 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597000_13480 incident_id=ed91253a-5047-463e-8d29-2089bfb30207 result_id=0 written to collection incident_results
2015-04-09 09:30:07,637 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 09:30:07,644 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 09:30:07,644 INFO Alert handler finished. duration=0.745s
2015-04-09 09:40:06,633 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597600_13531/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597600_13531 sessionKey=v7BY3Yb7Y2oAYDDJjDHcFmY^YktK0bXtEaQ4Dh36b5AiCmyBXihYdclwKmr^kN0WCt^2yTfvypXS2rekoNvFL2JmIGgClfHxUFpBHYY1di2Q_aPdOZy5q3OacB_YCHNlalzLJ^TCsEqX9jjy alert=TO Failed Login Alert
2015-04-09 09:40:06,639 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597600_13531' has been fired.
2015-04-09 09:40:06,776 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597600_13534/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597600_13534 sessionKey=SR908Z8Y6hTxmHyHqprJzd6eCNt4Y_B6PzSjHc_UsV_xJ44N0X5jdwFJxU1EXFaddyojEpiIFORXXfLIu8me7dwjh4flI5UaUfwOkNCGPovF^z9ogtuGOLk1T0jpJkEWIjHZFy28rF7etiZvIC alert=To Global Failed Login >3 Alert
2015-04-09 09:40:06,782 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597600_13534' has been fired.
2015-04-09 09:40:06,921 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 09:40:06,921 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 09:40:06,928 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 09:40:06,928 INFO Found incident settings for TO Failed Login Alert
2015-04-09 09:40:06,928 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "alert_script": "", "tags": "AD", "alert": "TO Failed Login Alert", "auto_assign": false, "urgency": "high", "auto_assign_user": "", "_user": "nobody", "run_alert_script": false, "category": "Active Directory", "_key": "55269e303403e12f580a9565", "subcategory": "", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned"}
2015-04-09 09:40:06,936 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 09:40:06,957 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:40:06,957 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:40:06,987 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:40:06,987 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:40:06,987 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:40:06,988 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:40:06,988 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597600_13531
2015-04-09 09:40:07,064 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_priority": "low"}
2015-04-09 09:40:07,064 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 09:40:07,070 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 09:40:07,071 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 09:40:07,071 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "auto_assign": false, "_key": "55240dda3403e174be1efd93", "auto_ttl_resolve": false, "category": "Active Directory", "subcategory": "", "_user": "nobody", "auto_assign_user": "", "run_alert_script": false, "auto_previous_resolve": false, "urgency": "high", "alert": "To Global Failed Login >3 Alert", "alert_script": "", "tags": "AD"}
2015-04-09 09:40:07,079 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 09:40:07,100 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 09:40:07,100 DEBUG Transformed 24h into 86400 seconds
2015-04-09 09:40:07,132 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 09:40:07,132 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 09:40:07,133 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 09:40:07,133 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 09:40:07,133 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597600_13534
2015-04-09 09:40:07,302 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=8d84252d-2c0d-4604-ae59-c2ffff6f43df
2015-04-09 09:40:07,331 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 09:40:07,331 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 09:40:07,331 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 09:40:07,332 DEBUG Matched priority in lookup, returning value=high
2015-04-09 09:40:07,376 DEBUG Create event will be: time=2015-04-09T09:40:07.376892 severity=INFO origin="alert_handler" event_id="70bb01cf98e8935696a05a9f5fc47fd3" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="8d84252d-2c0d-4604-ae59-c2ffff6f43df" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597600_13531" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428597601"
2015-04-09 09:40:07,383 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597600_13531 with incident_id=8d84252d-2c0d-4604-ae59-c2ffff6f43df | |
2015-04-09 09:40:07,412 DEBUG results for incident_id=8d84252d-2c0d-4604-ae59-c2ffff6f43df written to collection. | |
2015-04-09 09:40:07,412 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428597600_13531 incident_id=8d84252d-2c0d-4604-ae59-c2ffff6f43df result_id=0 written to collection incident_results | |
2015-04-09 09:40:07,412 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 09:40:07,422 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 09:40:07,423 INFO Alert handler finished. duration=0.79s | |
2015-04-09 09:40:07,446 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=0129e7d1-e7d0-4b2e-8fd5-6d49c7fbd0bf | |
2015-04-09 09:40:07,476 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 09:40:07,476 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 09:40:07,476 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 09:40:07,476 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 09:40:07,515 DEBUG Create event will be: time=2015-04-09T09:40:07.515343 severity=INFO origin="alert_handler" event_id="372034a64e1166a252bd1e2d6f5e3b78" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="0129e7d1-e7d0-4b2e-8fd5-6d49c7fbd0bf" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597600_13534" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428597601" | |
2015-04-09 09:40:07,521 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597600_13534 with incident_id=0129e7d1-e7d0-4b2e-8fd5-6d49c7fbd0bf | |
2015-04-09 09:40:07,549 DEBUG results for incident_id=0129e7d1-e7d0-4b2e-8fd5-6d49c7fbd0bf written to collection. | |
2015-04-09 09:40:07,549 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428597600_13534 incident_id=0129e7d1-e7d0-4b2e-8fd5-6d49c7fbd0bf result_id=0 written to collection incident_results | |
2015-04-09 09:40:07,549 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 09:40:07,556 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 09:40:07,557 INFO Alert handler finished. duration=0.781s | |
2015-04-09 09:50:06,012 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598200_46/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598200_46 sessionKey=1TmZMUiLrBYZ_Tna5t_loUPXG2f8faBl_2VrZTfPmsr_6i8bpgrYjy_7TJBIUbsdpTZ8KjLy9nGmJBPHnG4^CL0VOfe7K0tImjGyIUaMHPQdtwuh5pzTwOhoAC4vMrudRiq^dZbh alert=To Global Failed Login >3 Alert | |
2015-04-09 09:50:06,018 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598200_46' has been fired. | |
2015-04-09 09:50:06,286 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf"} | |
2015-04-09 09:50:06,286 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 09:50:06,292 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 09:50:06,293 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 09:50:06,293 DEBUG Incident config after getting settings: {"urgency": "high", "_user": "nobody", "auto_assign": false, "auto_previous_resolve": false, "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "alert": "To Global Failed Login >3 Alert", "alert_script": "", "_key": "55240dda3403e174be1efd93", "tags": "AD", "auto_assign_user": "", "subcategory": "", "run_alert_script": false, "category": "Active Directory"} | |
2015-04-09 09:50:06,300 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results. | |
2015-04-09 09:50:06,318 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 09:50:06,318 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 09:50:06,363 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 09:50:06,363 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 09:50:06,363 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 09:50:06,363 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 09:50:06,363 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598200_46 | |
2015-04-09 09:50:06,665 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=7d1b7a40-eaf1-48ac-9078-33184f5ec33f | |
2015-04-09 09:50:06,692 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 09:50:06,692 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 09:50:06,693 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'} | |
2015-04-09 09:50:06,693 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 09:50:06,727 DEBUG Create event will be: time=2015-04-09T09:50:06.727516 severity=INFO origin="alert_handler" event_id="261b657a720b0efcb9613813ae16d017" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="7d1b7a40-eaf1-48ac-9078-33184f5ec33f" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598200_46" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428598201" | |
2015-04-09 09:50:06,734 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598200_46 with incident_id=7d1b7a40-eaf1-48ac-9078-33184f5ec33f | |
2015-04-09 09:50:06,761 DEBUG results for incident_id=7d1b7a40-eaf1-48ac-9078-33184f5ec33f written to collection. | |
2015-04-09 09:50:06,761 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598200_46 incident_id=7d1b7a40-eaf1-48ac-9078-33184f5ec33f result_id=0 written to collection incident_results | |
2015-04-09 09:50:06,761 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 09:50:06,769 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 09:50:06,769 INFO Alert handler finished. duration=0.757s | |
2015-04-09 09:55:07,737 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598500_77/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598500_77 sessionKey=SYsrcpBuT6PeaPlKrtTLPJty752rrn5w5_vEV9OY9x^P1cabwFpwZ^CVxh4xF_49IrafhCLXIYwAXpqO07suk787u2UdLM5xfCGNCoPDi0mcyd7PhJybJOR2Uj8IB9_QCefqOckh1hiQFoLr alert=TO Failed Login Alert | |
2015-04-09 09:55:07,743 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598500_77' has been fired. | |
2015-04-09 09:55:08,018 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"} | |
2015-04-09 09:55:08,018 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 09:55:08,024 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 09:55:08,024 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 09:55:08,025 DEBUG Incident config after getting settings: {"subcategory": "", "auto_ttl_resolve": false, "auto_assign": false, "_key": "55269e303403e12f580a9565", "auto_previous_resolve": false, "_user": "nobody", "category": "Active Directory", "tags": "AD", "alert_script": "", "alert": "TO Failed Login Alert", "urgency": "high", "run_alert_script": false, "auto_assign_user": "", "auto_assign_owner": "unassigned"} | |
2015-04-09 09:55:08,032 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results. | |
2015-04-09 09:55:08,050 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 09:55:08,050 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 09:55:08,077 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 09:55:08,077 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 09:55:08,078 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 09:55:08,078 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 09:55:08,078 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598500_77 | |
2015-04-09 09:55:08,372 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=cdbc1c19-4aff-4942-9459-40bbf0c4fb78 | |
2015-04-09 09:55:08,399 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 09:55:08,399 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 09:55:08,399 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 09:55:08,400 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 09:55:08,433 DEBUG Create event will be: time=2015-04-09T09:55:08.433175 severity=INFO origin="alert_handler" event_id="3567cf3efce0e1e56b8c74aa72fcede7" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="cdbc1c19-4aff-4942-9459-40bbf0c4fb78" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598500_77" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428598504" | |
2015-04-09 09:55:08,439 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598500_77 with incident_id=cdbc1c19-4aff-4942-9459-40bbf0c4fb78 | |
2015-04-09 09:55:08,467 DEBUG results for incident_id=cdbc1c19-4aff-4942-9459-40bbf0c4fb78 written to collection. | |
2015-04-09 09:55:08,467 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598500_77 incident_id=cdbc1c19-4aff-4942-9459-40bbf0c4fb78 result_id=0 written to collection incident_results | |
2015-04-09 09:55:08,467 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 09:55:08,474 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 09:55:08,475 INFO Alert handler finished. duration=0.738s | |
2015-04-09 10:00:06,435 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598800_98/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598800_98 sessionKey=4wEmPJgGHyUcJL4nqhunph0zk^olKQf^978gIqKBcVgaMXyEIznAGL0o3_rRnJf0yVIw3uH6uYaPIA_6CBbBxkWjMLqM57tYya^QX7sdw7kFmgbx0RGLAsFPrjXx3MBh9abFKUuw alert=TO Failed Login Alert | |
2015-04-09 10:00:06,441 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598800_98' has been fired. | |
2015-04-09 10:00:06,576 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598800_95/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598800_95 sessionKey=IhB1VvcMuVMtKIpR1Utp2HFkpZYAcGV_Bd1VnYhKVOniSXxnWqb4oqBg2xpE2vYxwNXJooV7HmcL7snQvtp^v83FczPddNY6qxPb9jvaBPcE25eM2Gk4j2SE3_CuZvAAbVJhsxyAhV_crNRZ3o alert=To Global Failed Login >3 Alert | |
2015-04-09 10:00:06,582 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598800_95' has been fired. | |
2015-04-09 10:00:06,718 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"} | |
2015-04-09 10:00:06,718 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 10:00:06,724 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "TO Failed Login Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ] | |
2015-04-09 10:00:06,724 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 10:00:06,724 DEBUG Incident config after getting settings: {"auto_assign_user": "", "subcategory": "", "alert": "TO Failed Login Alert", "auto_assign_owner": "unassigned", "urgency": "high", "auto_previous_resolve": false, "run_alert_script": false, "_key": "55269e303403e12f580a9565", "category": "Active Directory", "_user": "nobody", "alert_script": "", "tags": "AD", "auto_assign": false, "auto_ttl_resolve": false} | |
2015-04-09 10:00:06,732 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results. | |
2015-04-09 10:00:06,751 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 10:00:06,752 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 10:00:06,779 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 10:00:06,779 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 10:00:06,779 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 10:00:06,779 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 10:00:06,780 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598800_98 | |
2015-04-09 10:00:06,867 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"} | |
2015-04-09 10:00:06,867 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 10:00:06,874 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "high", "category" : "Active Directory", "subcategory" : "", "alert" : "To Global Failed Login >3 Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 10:00:06,874 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 10:00:06,874 DEBUG Incident config after getting settings: {"alert": "To Global Failed Login >3 Alert", "_user": "nobody", "_key": "55240dda3403e174be1efd93", "tags": "AD", "run_alert_script": false, "urgency": "high", "alert_script": "", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "subcategory": "", "auto_assign_user": "", "auto_assign": false, "auto_previous_resolve": false, "category": "Active Directory"} | |
2015-04-09 10:00:06,882 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results. | |
2015-04-09 10:00:06,899 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 10:00:06,900 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 10:00:06,930 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 10:00:06,931 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 10:00:06,931 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 10:00:06,931 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 10:00:06,931 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598800_95 | |
2015-04-09 10:00:07,090 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=aff0aae2-9584-4a02-8c55-d9c0c0886e4b | |
2015-04-09 10:00:07,121 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 10:00:07,121 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 10:00:07,121 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'} | |
2015-04-09 10:00:07,121 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 10:00:07,151 DEBUG Create event will be: time=2015-04-09T10:00:07.151229 severity=INFO origin="alert_handler" event_id="26b68f4399819408d0a794e7936a4a62" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="aff0aae2-9584-4a02-8c55-d9c0c0886e4b" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598800_98" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428598801" | |
2015-04-09 10:00:07,157 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598800_98 with incident_id=aff0aae2-9584-4a02-8c55-d9c0c0886e4b | |
2015-04-09 10:00:07,185 DEBUG results for incident_id=aff0aae2-9584-4a02-8c55-d9c0c0886e4b written to collection. | |
2015-04-09 10:00:07,186 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428598800_98 incident_id=aff0aae2-9584-4a02-8c55-d9c0c0886e4b result_id=0 written to collection incident_results | |
2015-04-09 10:00:07,186 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 10:00:07,193 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 10:00:07,193 INFO Alert handler finished. duration=0.758s | |
2015-04-09 10:00:07,244 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=89faee27-75d6-42a0-8847-dcf197a9e4fc | |
2015-04-09 10:00:07,273 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 10:00:07,273 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 10:00:07,274 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-09 10:00:07,274 DEBUG Matched priority in lookup, returning value=high | |
2015-04-09 10:00:07,290 DEBUG Create event will be: time=2015-04-09T10:00:07.290016 severity=INFO origin="alert_handler" event_id="a7787788ec7e704b60cd5826d0567cc4" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="89faee27-75d6-42a0-8847-dcf197a9e4fc" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598800_95" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428598801" | |
2015-04-09 10:00:07,297 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598800_95 with incident_id=89faee27-75d6-42a0-8847-dcf197a9e4fc | |
2015-04-09 10:00:07,324 DEBUG results for incident_id=89faee27-75d6-42a0-8847-dcf197a9e4fc written to collection. | |
2015-04-09 10:00:07,325 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428598800_95 incident_id=89faee27-75d6-42a0-8847-dcf197a9e4fc result_id=0 written to collection incident_results | |
2015-04-09 10:00:07,325 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 10:00:07,332 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 10:00:07,333 INFO Alert handler finished. duration=0.757s | |
2015-04-09 10:00:07,501 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428598800_100/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428598800_100 sessionKey=U^H3L6iT1Y^FQVQ4RSRKaN946U2sNpx3bJLBmjehtnCxCOIGrCrG5Tnv3vW^ur4EASYOqJuUdYB5fRDRbZrT9vbXeNpd9XJlqiWVo95pEM^N^^N8xk8zpXBoNJRRhuFH87IJ33zUD1KVgo alert=TO AD Audit Rule Alert | |
2015-04-09 10:00:07,510 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428598800_100' has been fired. | |
2015-04-09 10:00:07,859 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"} | |
2015-04-09 10:00:07,859 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-09 10:00:07,866 DEBUG Incident settings: [ { "auto_previous_resolve" : false, "urgency" : "low", "category" : "Active Directory", "subcategory" : "", "alert" : "TO AD Audit Rule Alert", "auto_assign" : false, "run_alert_script" : false, "auto_ttl_resolve" : false, "auto_assign_owner" : "unassigned", "tags" : "AD", "_user" : "nobody", "_key" : "552436d73403e12f580a9531" } ] | |
2015-04-09 10:00:07,866 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-09 10:00:07,866 DEBUG Incident config after getting settings: {"alert_script": "", "subcategory": "", "tags": "AD", "auto_assign": false, "_user": "nobody", "auto_assign_owner": "unassigned", "category": "Active Directory", "_key": "552436d73403e12f580a9531", "run_alert_script": false, "auto_ttl_resolve": false, "auto_previous_resolve": false, "auto_assign_user": "", "urgency": "low", "alert": "TO AD Audit Rule Alert"} | |
2015-04-09 10:00:07,875 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results. | |
2015-04-09 10:00:07,900 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-09 10:00:07,900 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 10:00:07,939 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 10:00:07,939 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 10:00:07,939 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-09 10:00:07,940 DEBUG Matched impact in lookup, returning value=low | |
2015-04-09 10:00:07,940 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428598800_100 | |
2015-04-09 10:00:08,285 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=44312e7e-6de3-40b9-980f-8603a65e2720 | |
2015-04-09 10:00:08,317 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 10:00:08,317 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 10:00:08,317 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'low'} | |
2015-04-09 10:00:08,317 DEBUG Matched priority in lookup, returning value=informational | |
2015-04-09 10:00:08,361 DEBUG Create event will be: time=2015-04-09T10:00:08.361283 severity=INFO origin="alert_handler" event_id="a7e4154577a46ad3a207862cf3225c5f" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="44312e7e-6de3-40b9-980f-8603a65e2720" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428598800_100" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428598801" | |
2015-04-09 10:00:08,369 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428598800_100 with incident_id=44312e7e-6de3-40b9-980f-8603a65e2720 | |
2015-04-09 10:00:08,396 DEBUG results for incident_id=44312e7e-6de3-40b9-980f-8603a65e2720 written to collection. | |
2015-04-09 10:00:08,396 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428598800_100 incident_id=44312e7e-6de3-40b9-980f-8603a65e2720 result_id=0 written to collection incident_results | |
2015-04-09 10:00:08,396 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 10:00:08,404 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 10:00:08,404 INFO Alert handler finished. duration=0.903s | |
2015-04-09 10:10:06,096 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428599400_162/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428599400_162 sessionKey=_6fB2MG98FzPHNd6dyztVVjbkc5cNug2^z050J3bovRlC7spemfn8BJqui8JPusmA06Qw3ttw6hYENJYAQQeDCzTOlF8my82znDuiCA1RMvvGJgH61cfUIwkzwrsXoWkH5WNT^CpVQsmvtz7jF alert=To Global Failed Login >3 Alert | |
2015-04-09 10:10:06,102 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428599400_162' has been fired. | |
2015-04-09 10:10:06,177 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428599400_159/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428599400_159 sessionKey=NKL^g0D1ngewmtRZqQnnbZdj886YcjG24npxnwrCi2HJNKqdLceZvnIfdtJYNAPuoDFgTAPwcviDPDe54mUSvD08wX_iCgEdXn^UIerRKKRhsugy349WVZJVD12l6jqpqREyVO7dFs8RUFY alert=TO Failed Login Alert | |
2015-04-09 10:10:06,183 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428599400_159' has been fired. | |
2015-04-09 10:10:06,381 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low"} | |
2015-04-09 10:10:06,382 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 10:10:06,388 DEBUG Incident settings: [ { "category" : "Active Directory", "tags" : "AD", "alert" : "To Global Failed Login >3 Alert", "urgency" : "high", "auto_assign" : false, "run_alert_script" : false, "subcategory" : "", "auto_ttl_resolve" : false, "auto_previous_resolve" : false, "auto_assign_owner" : "unassigned", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ] | |
2015-04-09 10:10:06,388 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 10:10:06,388 DEBUG Incident config after getting settings: {"tags": "AD", "_key": "55240dda3403e174be1efd93", "alert_script": "", "urgency": "high", "auto_previous_resolve": false, "run_alert_script": false, "category": "Active Directory", "auto_assign": false, "auto_ttl_resolve": false, "auto_assign_user": "", "alert": "To Global Failed Login >3 Alert", "auto_assign_owner": "unassigned", "_user": "nobody", "subcategory": ""}
2015-04-09 10:10:06,396 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 10:10:06,413 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:10:06,413 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:10:06,441 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:10:06,441 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:10:06,441 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:10:06,441 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:10:06,441 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428599400_162
2015-04-09 10:10:06,466 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 10:10:06,466 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 10:10:06,473 DEBUG Incident settings: [ { "category" : "Active Directory", "tags" : "AD", "alert" : "TO Failed Login Alert", "urgency" : "high", "auto_assign" : false, "run_alert_script" : false, "subcategory" : "", "auto_ttl_resolve" : false, "auto_previous_resolve" : false, "auto_assign_owner" : "unassigned", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 10:10:06,473 INFO Found incident settings for TO Failed Login Alert
2015-04-09 10:10:06,473 DEBUG Incident config after getting settings: {"_key": "55269e303403e12f580a9565", "urgency": "high", "subcategory": "", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "tags": "AD", "auto_previous_resolve": false, "alert_script": "", "auto_assign_user": "", "_user": "nobody", "run_alert_script": false, "category": "Active Directory", "auto_assign": false, "alert": "TO Failed Login Alert"}
2015-04-09 10:10:06,481 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 10:10:06,498 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:10:06,498 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:10:06,524 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:10:06,524 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:10:06,524 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:10:06,525 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:10:06,525 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428599400_159
2015-04-09 10:10:06,743 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=7902157e-d3ff-48d1-b36f-f566b60cdfcb
2015-04-09 10:10:06,769 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:10:06,770 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:10:06,770 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 10:10:06,770 DEBUG Matched priority in lookup, returning value=high
2015-04-09 10:10:06,786 DEBUG Create event will be: time=2015-04-09T10:10:06.786140 severity=INFO origin="alert_handler" event_id="7ebae38952353ecd16dd3f8356d277b2" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="7902157e-d3ff-48d1-b36f-f566b60cdfcb" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428599400_162" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428599401"
2015-04-09 10:10:06,793 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428599400_162 with incident_id=7902157e-d3ff-48d1-b36f-f566b60cdfcb
2015-04-09 10:10:06,820 DEBUG results for incident_id=7902157e-d3ff-48d1-b36f-f566b60cdfcb written to collection.
2015-04-09 10:10:06,820 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428599400_162 incident_id=7902157e-d3ff-48d1-b36f-f566b60cdfcb result_id=0 written to collection incident_results
2015-04-09 10:10:06,820 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:10:06,829 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:10:06,829 INFO Alert handler finished. duration=0.734s
2015-04-09 10:10:06,831 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=dc43668a-c26b-485b-b5e3-29697969da2f
2015-04-09 10:10:06,859 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:10:06,860 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:10:06,860 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 10:10:06,860 DEBUG Matched priority in lookup, returning value=high
2015-04-09 10:10:06,889 DEBUG Create event will be: time=2015-04-09T10:10:06.889178 severity=INFO origin="alert_handler" event_id="6cc4745f2228f83ef9a860864eb91b23" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="dc43668a-c26b-485b-b5e3-29697969da2f" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428599400_159" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428599401"
2015-04-09 10:10:06,896 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428599400_159 with incident_id=dc43668a-c26b-485b-b5e3-29697969da2f
2015-04-09 10:10:06,923 DEBUG results for incident_id=dc43668a-c26b-485b-b5e3-29697969da2f written to collection.
2015-04-09 10:10:06,923 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428599400_159 incident_id=dc43668a-c26b-485b-b5e3-29697969da2f result_id=0 written to collection incident_results
2015-04-09 10:10:06,923 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:10:06,930 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:10:06,931 INFO Alert handler finished. duration=0.754s
2015-04-09 10:20:06,281 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600000_223/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600000_223 sessionKey=lfFVrihdgDNX^We3Lu82xE_n2UcJ7wxqBQee5CKFsGAW12U_3sZCLe3NMBZRI6PR3WbezRbLyFhX0o2n865l2pgZNVKMCi_2vY^vUubkO8242VeYw2h0n^bpK5rriS5wgDzu1AMYL4PsJulV alert=To Global Failed Login >3 Alert
2015-04-09 10:20:06,286 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600000_223' has been fired.
2015-04-09 10:20:06,288 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600000_220/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600000_220 sessionKey=mMKq^KXan6szs^ObBEvg6ivpyaNJZGgSUbd8hbDiCCZuFNneF_0g4S4Dtb3eYJ5pJvMsqaeNjPI6tsG3M3FFp4w6g8^PS^0ulAqkfA87gAv9zstwbYA67grEiSwB36otvVTMBhjWWK5kmp4w alert=TO Failed Login Alert
2015-04-09 10:20:06,295 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600000_220' has been fired.
2015-04-09 10:20:06,568 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low"}
2015-04-09 10:20:06,568 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 10:20:06,570 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-09 10:20:06,571 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 10:20:06,575 DEBUG Incident settings: [ { "category" : "Active Directory", "tags" : "AD", "alert" : "To Global Failed Login >3 Alert", "urgency" : "high", "auto_assign" : false, "run_alert_script" : false, "subcategory" : "", "auto_ttl_resolve" : false, "auto_previous_resolve" : false, "auto_assign_owner" : "unassigned", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 10:20:06,575 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 10:20:06,575 DEBUG Incident config after getting settings: {"category": "Active Directory", "auto_previous_resolve": false, "subcategory": "", "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "urgency": "high", "tags": "AD", "alert_script": "", "_user": "nobody", "alert": "To Global Failed Login >3 Alert", "auto_assign_user": "", "_key": "55240dda3403e174be1efd93", "auto_assign_owner": "unassigned"}
2015-04-09 10:20:06,578 DEBUG Incident settings: [ { "category" : "Active Directory", "tags" : "AD", "alert" : "TO Failed Login Alert", "urgency" : "high", "auto_assign" : false, "run_alert_script" : false, "subcategory" : "", "auto_ttl_resolve" : false, "auto_previous_resolve" : false, "auto_assign_owner" : "unassigned", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 10:20:06,578 INFO Found incident settings for TO Failed Login Alert
2015-04-09 10:20:06,578 DEBUG Incident config after getting settings: {"auto_assign_user": "", "auto_assign": false, "category": "Active Directory", "auto_ttl_resolve": false, "_user": "nobody", "subcategory": "", "auto_assign_owner": "unassigned", "alert": "TO Failed Login Alert", "alert_script": "", "tags": "AD", "_key": "55269e303403e12f580a9565", "run_alert_script": false, "urgency": "high", "auto_previous_resolve": false}
2015-04-09 10:20:06,585 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 4 results.
2015-04-09 10:20:06,587 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 4 results.
2015-04-09 10:20:06,604 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:20:06,604 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:20:06,606 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:20:06,607 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:20:06,647 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:20:06,647 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:20:06,648 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:20:06,648 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:20:06,648 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600000_223
2015-04-09 10:20:06,650 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:20:06,651 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:20:06,651 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:20:06,651 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:20:06,651 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600000_220
2015-04-09 10:20:06,954 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=dcda7ab2-05d9-49f5-83a0-f4c36a1df206
2015-04-09 10:20:06,957 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=5ace76d2-2bee-4eda-af1f-e4196c4f58c9
2015-04-09 10:20:06,999 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:20:06,999 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:20:07,000 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 10:20:07,000 DEBUG Matched priority in lookup, returning value=high
2015-04-09 10:20:07,003 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:20:07,003 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:20:07,004 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 10:20:07,004 DEBUG Matched priority in lookup, returning value=high
2015-04-09 10:20:07,034 DEBUG Create event will be: time=2015-04-09T10:20:07.034752 severity=INFO origin="alert_handler" event_id="0bb2a7823125c3e225b4c9eb61868931" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="5ace76d2-2bee-4eda-af1f-e4196c4f58c9" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600000_220" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428600001"
2015-04-09 10:20:07,034 DEBUG Create event will be: time=2015-04-09T10:20:07.034910 severity=INFO origin="alert_handler" event_id="55ee77a2dcac656a834a51ce9945df2f" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="dcda7ab2-05d9-49f5-83a0-f4c36a1df206" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600000_223" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428600001"
2015-04-09 10:20:07,044 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600000_220 with incident_id=5ace76d2-2bee-4eda-af1f-e4196c4f58c9
2015-04-09 10:20:07,045 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600000_223 with incident_id=dcda7ab2-05d9-49f5-83a0-f4c36a1df206
2015-04-09 10:20:07,076 DEBUG results for incident_id=5ace76d2-2bee-4eda-af1f-e4196c4f58c9 written to collection.
2015-04-09 10:20:07,076 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600000_220 incident_id=5ace76d2-2bee-4eda-af1f-e4196c4f58c9 result_id=0 written to collection incident_results
2015-04-09 10:20:07,076 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:20:07,083 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:20:07,083 INFO Alert handler finished. duration=0.795s
2015-04-09 10:20:07,110 DEBUG results for incident_id=dcda7ab2-05d9-49f5-83a0-f4c36a1df206 written to collection.
2015-04-09 10:20:07,111 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600000_223 incident_id=dcda7ab2-05d9-49f5-83a0-f4c36a1df206 result_id=0 written to collection incident_results
2015-04-09 10:20:07,111 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:20:07,118 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:20:07,118 INFO Alert handler finished. duration=0.838s
2015-04-09 10:30:07,333 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600600_274/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600600_274 sessionKey=^b6BhloHz4jhr6O9G7_achtprlS9Ocglx7dHLzSt3VoqBoQwYnwZ2URQ2ctvowqRi4tWbiu4zOHHUH94939niiUlCN40LfF7IO4C_g65_sgI3HDeTxza2jf0GmlM3akbAKrsPp8HgcQT^hR alert=TO Failed Login Alert
2015-04-09 10:30:07,338 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600600_274' has been fired.
2015-04-09 10:30:07,497 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600600_271/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600600_271 sessionKey=pnsTPwGAUkYP9IoSMSfX9yUfjg7TMDSKyVooWVecMX6qcNOkTGzc^jBu9_^F2YM_TchwNzeDpGdlfr9lqSSkcTTB5HDi_hg0gODugi_9BubDn3x^_QW3cR^h6Tg5yRM_Si5OENCMHWtP alert=To Global Failed Login >3 Alert
2015-04-09 10:30:07,502 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600600_271' has been fired.
2015-04-09 10:30:07,623 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 10:30:07,624 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 10:30:07,632 DEBUG Incident settings: [ { "category" : "", "tags" : "", "alert" : "TO Failed Login Alert", "urgency" : "high", "auto_assign" : false, "run_alert_script" : false, "subcategory" : "", "auto_ttl_resolve" : false, "auto_previous_resolve" : false, "auto_assign_owner" : "unassigned", "_user" : "nobody", "_key" : "55269e303403e12f580a9565" } ]
2015-04-09 10:30:07,632 INFO Found incident settings for TO Failed Login Alert
2015-04-09 10:30:07,632 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "auto_assign_user": "", "alert_script": "", "subcategory": "", "_key": "55269e303403e12f580a9565", "run_alert_script": false, "tags": "", "alert": "TO Failed Login Alert", "urgency": "high", "auto_ttl_resolve": false, "auto_assign": false, "_user": "nobody", "category": "", "auto_assign_owner": "unassigned"}
2015-04-09 10:30:07,642 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 10:30:07,665 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:30:07,665 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:30:07,708 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:30:07,710 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:30:07,711 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:30:07,711 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:30:07,711 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600600_274
2015-04-09 10:30:07,818 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_owner": "unassigned"}
2015-04-09 10:30:07,818 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 10:30:07,828 DEBUG Incident settings: [ { "category" : "", "tags" : "", "alert" : "To Global Failed Login >3 Alert", "urgency" : "high", "auto_assign" : false, "run_alert_script" : false, "subcategory" : "", "auto_ttl_resolve" : false, "auto_previous_resolve" : false, "auto_assign_owner" : "unassigned", "_user" : "nobody", "_key" : "55240dda3403e174be1efd93" } ]
2015-04-09 10:30:07,828 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 10:30:07,828 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "urgency": "high", "auto_assign_user": "", "run_alert_script": false, "_user": "nobody", "tags": "", "alert": "To Global Failed Login >3 Alert", "alert_script": "", "_key": "55240dda3403e174be1efd93", "auto_assign": false, "auto_assign_owner": "unassigned", "subcategory": "", "auto_ttl_resolve": false, "category": ""}
2015-04-09 10:30:07,839 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 10:30:07,861 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:30:07,862 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:30:07,898 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:30:07,899 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:30:07,899 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:30:07,899 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:30:07,899 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600600_271
2015-04-09 10:30:08,079 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=390a13cc-ceab-4976-a962-397a2ea7ad01
2015-04-09 10:30:08,124 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:30:08,125 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:30:08,125 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 10:30:08,125 DEBUG Matched priority in lookup, returning value=high
2015-04-09 10:30:08,182 DEBUG Create event will be: time=2015-04-09T10:30:08.182436 severity=INFO origin="alert_handler" event_id="74fc000362aae52fb852b65e406d20c6" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="390a13cc-ceab-4976-a962-397a2ea7ad01" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600600_274" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428600601"
2015-04-09 10:30:08,190 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600600_274 with incident_id=390a13cc-ceab-4976-a962-397a2ea7ad01
2015-04-09 10:30:08,249 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=e6c63d7a-8480-4d56-8548-e58ab17663ad
2015-04-09 10:30:08,260 DEBUG results for incident_id=390a13cc-ceab-4976-a962-397a2ea7ad01 written to collection.
2015-04-09 10:30:08,260 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428600600_274 incident_id=390a13cc-ceab-4976-a962-397a2ea7ad01 result_id=0 written to collection incident_results
2015-04-09 10:30:08,260 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:30:08,270 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:30:08,270 INFO Alert handler finished. duration=0.938s
2015-04-09 10:30:08,287 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:30:08,287 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:30:08,288 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 10:30:08,288 DEBUG Matched priority in lookup, returning value=high
2015-04-09 10:30:08,330 DEBUG Create event will be: time=2015-04-09T10:30:08.330434 severity=INFO origin="alert_handler" event_id="61a5f52205615f30bcbb0aa4975bf676" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="e6c63d7a-8480-4d56-8548-e58ab17663ad" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600600_271" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428600601"
2015-04-09 10:30:08,339 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600600_271 with incident_id=e6c63d7a-8480-4d56-8548-e58ab17663ad
2015-04-09 10:30:08,399 DEBUG results for incident_id=e6c63d7a-8480-4d56-8548-e58ab17663ad written to collection.
2015-04-09 10:30:08,400 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428600600_271 incident_id=e6c63d7a-8480-4d56-8548-e58ab17663ad result_id=0 written to collection incident_results
2015-04-09 10:30:08,400 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:30:08,408 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:30:08,408 INFO Alert handler finished. duration=0.912s
2015-04-09 10:40:06,011 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601200_325/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601200_325 sessionKey=5BE46BvBqNe7X4LX^Wiv0A4VXnYz^SUBgnwzBm6sEIAaE^4vJ3L_aEypNv5ACyo79h^5Sevr^jtB9cmXJBFq61^b39T36K0ccvpy3M8lseiahJFR0D_dWMf9uV__sj2gzOmPzmo5 alert=TO Failed Login Alert | |
2015-04-09 10:40:06,016 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601200_325' has been fired. | |
2015-04-09 10:40:06,132 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601200_328/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601200_328 sessionKey=60_HCHlQZnkEt7gt1v2ANV5JpAu5ky6MJ_QrpifBYl8mqCjtzOgGwDPXJ9jlHkMeS0nW6rsXca8OiOXEw9OhwZUq1UeEQFur7zUWW4BVdU9V0W3kZsuZ^Mb^a_TXjbsHBZne4ggoy^nh alert=To Global Failed Login >3 Alert | |
2015-04-09 10:40:06,138 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601200_328' has been fired. | |
2015-04-09 10:40:06,292 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"} | |
2015-04-09 10:40:06,292 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D | |
2015-04-09 10:40:06,299 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ] | |
2015-04-09 10:40:06,300 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 10:40:06,300 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "auto_assign": false, "_key": "5526b8b13403e13421162b0c", "category": "unknown", "auto_ttl_resolve": false, "subcategory": "unknown", "_user": "nobody", "run_alert_script": false, "auto_assign_user": "", "urgency": "low", "auto_previous_resolve": false, "alert_script": "", "alert": "TO Failed Login Alert", "tags": "[Untagged]"} | |
2015-04-09 10:40:06,309 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 4 results. | |
2015-04-09 10:40:06,327 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 10:40:06,327 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 10:40:06,353 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 10:40:06,353 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 10:40:06,354 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 10:40:06,354 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 10:40:06,354 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601200_325 | |
2015-04-09 10:40:06,416 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"} | |
2015-04-09 10:40:06,416 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D | |
2015-04-09 10:40:06,423 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "To Global Failed Login >3 Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ] | |
2015-04-09 10:40:06,423 INFO Found incident settings for To Global Failed Login >3 Alert | |
2015-04-09 10:40:06,423 DEBUG Incident config after getting settings: {"alert": "To Global Failed Login >3 Alert", "auto_assign": false, "auto_assign_user": "", "alert_script": "", "_key": "5526b8b13403e13421162b0d", "auto_ttl_resolve": false, "run_alert_script": false, "category": "unknown", "subcategory": "unknown", "auto_assign_owner": "unassigned", "_user": "nobody", "tags": "[Untagged]", "urgency": "low", "auto_previous_resolve": false} | |
2015-04-09 10:40:06,431 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 4 results. | |
2015-04-09 10:40:06,451 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 10:40:06,451 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 10:40:06,480 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 10:40:06,480 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 10:40:06,481 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 10:40:06,481 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 10:40:06,481 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601200_328 | |
2015-04-09 10:40:06,662 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=69735802-c794-46cf-85ff-285eb1e57e8d | |
2015-04-09 10:40:06,689 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 10:40:06,689 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 10:40:06,690 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-09 10:40:06,690 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 10:40:06,727 DEBUG Create event will be: time=2015-04-09T10:40:06.727911 severity=INFO origin="alert_handler" event_id="efc1714f94cd4a737696f724727cb044" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="69735802-c794-46cf-85ff-285eb1e57e8d" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601200_325" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428601201" | |
2015-04-09 10:40:06,734 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601200_325 with incident_id=69735802-c794-46cf-85ff-285eb1e57e8d
2015-04-09 10:40:06,763 DEBUG results for incident_id=69735802-c794-46cf-85ff-285eb1e57e8d written to collection.
2015-04-09 10:40:06,763 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601200_325 incident_id=69735802-c794-46cf-85ff-285eb1e57e8d result_id=0 written to collection incident_results
2015-04-09 10:40:06,763 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:40:06,772 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:40:06,772 INFO Alert handler finished. duration=0.762s
2015-04-09 10:40:06,791 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=30c1426e-b0d8-4513-9ffe-67a63db9184c
2015-04-09 10:40:06,819 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:40:06,819 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:40:06,819 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 10:40:06,819 DEBUG Matched priority in lookup, returning value=low
2015-04-09 10:40:06,831 DEBUG Create event will be: time=2015-04-09T10:40:06.831419 severity=INFO origin="alert_handler" event_id="29bb2fefe0a835aff763cf0e1f7ce0f4" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="30c1426e-b0d8-4513-9ffe-67a63db9184c" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601200_328" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428601201"
2015-04-09 10:40:06,837 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601200_328 with incident_id=30c1426e-b0d8-4513-9ffe-67a63db9184c
2015-04-09 10:40:06,865 DEBUG results for incident_id=30c1426e-b0d8-4513-9ffe-67a63db9184c written to collection.
2015-04-09 10:40:06,866 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601200_328 incident_id=30c1426e-b0d8-4513-9ffe-67a63db9184c result_id=0 written to collection incident_results
2015-04-09 10:40:06,866 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:40:06,872 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:40:06,872 INFO Alert handler finished. duration=0.741s
2015-04-09 10:50:05,972 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601800_385/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601800_385 sessionKey=qGCNeWwPeyaxZFETkd3h119TsRJy7kO6s0OUHJefSql0VV8ZF1rGLD8GI8c^qA95CVXrU9k78UPahjgovnKHqcviPigihYZrXzxOBa9I2p1FdH04ufCf6KJNBxTVNcowaWi8Lm4QHc2KH3o alert=To Global Failed Login >3 Alert
2015-04-09 10:50:05,978 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601800_385' has been fired.
2015-04-09 10:50:06,168 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601800_382/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601800_382 sessionKey=IABiwmX9GnldffJf6NTmPEWaN^guYg82LuLJrS^QqoxNKUeuma6Q7btSxuvMMvD7waB^jteKpOW6a8K_vYzSJscBWyk8^XfmiDOwWKymBMvL7t5_akPX3xExpqNdWgwMDo2bW5hkDN alert=TO Failed Login Alert
2015-04-09 10:50:06,173 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601800_382' has been fired.
2015-04-09 10:50:06,254 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-09 10:50:06,254 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 10:50:06,260 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "To Global Failed Login >3 Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 10:50:06,260 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 10:50:06,260 DEBUG Incident config after getting settings: {"_key": "5526b8b13403e13421162b0d", "auto_assign_owner": "unassigned", "urgency": "low", "alert": "To Global Failed Login >3 Alert", "category": "unknown", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign_user": "", "auto_assign": false, "_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "subcategory": "unknown", "tags": "[Untagged]"}
2015-04-09 10:50:06,268 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 4 results.
2015-04-09 10:50:06,286 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:50:06,286 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:50:06,312 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:50:06,312 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:50:06,312 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:50:06,312 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:50:06,312 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601800_385
2015-04-09 10:50:06,451 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 10:50:06,451 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 10:50:06,457 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 10:50:06,457 INFO Found incident settings for TO Failed Login Alert
2015-04-09 10:50:06,458 DEBUG Incident config after getting settings: {"subcategory": "unknown", "_key": "5526b8b13403e13421162b0c", "run_alert_script": false, "tags": "[Untagged]", "alert": "TO Failed Login Alert", "auto_previous_resolve": false, "auto_assign_user": "", "alert_script": "", "category": "unknown", "auto_assign_owner": "unassigned", "urgency": "low", "auto_ttl_resolve": false, "auto_assign": false, "_user": "nobody"}
2015-04-09 10:50:06,465 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 10:50:06,482 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 10:50:06,483 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:50:06,508 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:50:06,508 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:50:06,508 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 10:50:06,508 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 10:50:06,509 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601800_382
2015-04-09 10:50:06,615 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=0661bf76-c0a6-4a4b-90b7-f5fa091d0a8a
2015-04-09 10:50:06,643 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:50:06,643 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:50:06,644 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 10:50:06,644 DEBUG Matched priority in lookup, returning value=low
2015-04-09 10:50:06,678 DEBUG Create event will be: time=2015-04-09T10:50:06.677962 severity=INFO origin="alert_handler" event_id="3402dc2cf38bccdbb9334ff5b8eab5c0" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="0661bf76-c0a6-4a4b-90b7-f5fa091d0a8a" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601800_385" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428601801"
2015-04-09 10:50:06,684 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601800_385 with incident_id=0661bf76-c0a6-4a4b-90b7-f5fa091d0a8a
2015-04-09 10:50:06,712 DEBUG results for incident_id=0661bf76-c0a6-4a4b-90b7-f5fa091d0a8a written to collection.
2015-04-09 10:50:06,712 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428601800_385 incident_id=0661bf76-c0a6-4a4b-90b7-f5fa091d0a8a result_id=0 written to collection incident_results
2015-04-09 10:50:06,712 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:50:06,719 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:50:06,719 INFO Alert handler finished. duration=0.747s
2015-04-09 10:50:06,811 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=44d9871b-9bdb-4250-8052-48fe2de5f518
2015-04-09 10:50:06,838 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:50:06,838 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:50:06,838 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 10:50:06,838 DEBUG Matched priority in lookup, returning value=low
2015-04-09 10:50:06,884 DEBUG Create event will be: time=2015-04-09T10:50:06.884028 severity=INFO origin="alert_handler" event_id="b1a434b6fcc7546e0d82e3cc3aeefb4a" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="44d9871b-9bdb-4250-8052-48fe2de5f518" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601800_382" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428601801"
2015-04-09 10:50:06,890 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601800_382 with incident_id=44d9871b-9bdb-4250-8052-48fe2de5f518
2015-04-09 10:50:06,918 DEBUG results for incident_id=44d9871b-9bdb-4250-8052-48fe2de5f518 written to collection.
2015-04-09 10:50:06,918 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428601800_382 incident_id=44d9871b-9bdb-4250-8052-48fe2de5f518 result_id=0 written to collection incident_results
2015-04-09 10:50:06,918 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:50:06,925 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:50:06,925 INFO Alert handler finished. duration=0.758s
2015-04-09 10:50:07,007 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428601800_380/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428601800_380 sessionKey=Ax7T3NFJnDh2qvgo8ZQuFKoceyghp3ig4zgOq3V1xUyQPr14ruoyFEzcy1hepDxSAXwD8awa5sKBc0OOaUxTyYeZCUzgV43XHElDf9DgGs0hLpk7KgyICk^clwrfg^Bm^HtzeLsuYiNCSF alert=TO AD Audit Rule Alert
2015-04-09 10:50:07,013 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428601800_380' has been fired.
2015-04-09 10:50:07,291 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"}
2015-04-09 10:50:07,291 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 10:50:07,297 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "TO AD Audit Rule Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 10:50:07,298 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 10:50:07,298 DEBUG Incident config after getting settings: {"_user": "nobody", "auto_assign": false, "urgency": "low", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "category": "unknown", "alert_script": "", "auto_assign_user": "", "auto_previous_resolve": false, "alert": "TO AD Audit Rule Alert", "run_alert_script": false, "tags": "[Untagged]", "_key": "5526b8b13403e13421162b09", "subcategory": "unknown"}
2015-04-09 10:50:07,305 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results.
2015-04-09 10:50:07,326 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 10:50:07,326 DEBUG Transformed 24h into 86400 seconds
2015-04-09 10:50:07,355 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 10:50:07,355 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 10:50:07,355 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 10:50:07,355 DEBUG Matched impact in lookup, returning value=low
2015-04-09 10:50:07,355 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428601800_380
2015-04-09 10:50:07,662 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=ed16b7ac-fe30-4c2c-8752-ac3bf59730de
2015-04-09 10:50:07,691 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 10:50:07,691 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 10:50:07,691 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'low'}
2015-04-09 10:50:07,691 DEBUG Matched priority in lookup, returning value=informational
2015-04-09 10:50:07,755 DEBUG Create event will be: time=2015-04-09T10:50:07.755892 severity=INFO origin="alert_handler" event_id="ef9f7c01bbc57bf6dad7d6e3137dfb27" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="ed16b7ac-fe30-4c2c-8752-ac3bf59730de" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428601800_380" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428601801"
2015-04-09 10:50:07,763 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428601800_380 with incident_id=ed16b7ac-fe30-4c2c-8752-ac3bf59730de
2015-04-09 10:50:07,789 DEBUG results for incident_id=ed16b7ac-fe30-4c2c-8752-ac3bf59730de written to collection.
2015-04-09 10:50:07,790 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428601800_380 incident_id=ed16b7ac-fe30-4c2c-8752-ac3bf59730de result_id=0 written to collection incident_results
2015-04-09 10:50:07,790 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 10:50:07,798 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 10:50:07,798 INFO Alert handler finished. duration=0.792s
2015-04-09 11:00:05,882 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428602400_39/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428602400_39 sessionKey=n9bLw69w0TZLNs0AftEzg0oX_sYuUb^0x^5sAlw0lV3aRIaNotw557JYoQyuzLzv7lMAITZJiq23tVADxr^fIdcZ0lwvCCMH1g7ji1bSCKIqxqyBegmc7RyoajkGhgNKKVAR_Y7C2RsgMF alert=To Global Failed Login >3 Alert
2015-04-09 11:00:05,888 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428602400_39' has been fired.
2015-04-09 11:00:06,165 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-09 11:00:06,165 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 11:00:06,171 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "To Global Failed Login >3 Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 11:00:06,171 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 11:00:06,171 DEBUG Incident config after getting settings: {"alert": "To Global Failed Login >3 Alert", "auto_assign_owner": "unassigned", "subcategory": "unknown", "auto_assign_user": "", "urgency": "low", "alert_script": "", "category": "unknown", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign": false, "tags": "[Untagged]", "_key": "5526b8b13403e13421162b0d", "auto_previous_resolve": false, "_user": "nobody"}
2015-04-09 11:00:06,179 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 5 results.
2015-04-09 11:00:06,198 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:00:06,198 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:00:06,241 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:00:06,241 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:00:06,241 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:00:06,241 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:00:06,241 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428602400_39
2015-04-09 11:00:06,418 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428602400_42/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428602400_42 sessionKey=NuJ_D0_p7MKbUfbczwcAraVgU03ZMVmRZTuqWpH9XXIQhLqoyAAyJwxpHd86Ra7d4LRueGI01pQ9_SvThwZccprlqkXm46vlyruwKLOriljARpSo_DLvskRwDZNcL1KSmiqAyb68ADpUucRcCC alert=TO Failed Login Alert
2015-04-09 11:00:06,424 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428602400_42' has been fired.
2015-04-09 11:00:06,543 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=503558ef-ba7a-457c-9ced-57897e763344
2015-04-09 11:00:06,570 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:00:06,571 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:00:06,571 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:00:06,571 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:00:06,608 DEBUG Create event will be: time=2015-04-09T11:00:06.608513 severity=INFO origin="alert_handler" event_id="7461e65a9f6dd859eb338c13b21b5a3c" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="503558ef-ba7a-457c-9ced-57897e763344" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428602400_39" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428602401"
2015-04-09 11:00:06,615 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428602400_39 with incident_id=503558ef-ba7a-457c-9ced-57897e763344
2015-04-09 11:00:06,643 DEBUG results for incident_id=503558ef-ba7a-457c-9ced-57897e763344 written to collection.
2015-04-09 11:00:06,643 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428602400_39 incident_id=503558ef-ba7a-457c-9ced-57897e763344 result_id=0 written to collection incident_results
2015-04-09 11:00:06,643 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:00:06,650 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:00:06,651 INFO Alert handler finished. duration=0.769s
2015-04-09 11:00:06,700 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"}
2015-04-09 11:00:06,700 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:00:06,706 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:00:06,706 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:00:06,706 DEBUG Incident config after getting settings: {"auto_assign": false, "alert": "TO Failed Login Alert", "_key": "5526b8b13403e13421162b0c", "_user": "nobody", "auto_assign_user": "", "urgency": "low", "subcategory": "unknown", "run_alert_script": false, "auto_ttl_resolve": false, "tags": "[Untagged]", "category": "unknown", "auto_previous_resolve": false, "alert_script": "", "auto_assign_owner": "unassigned"}
2015-04-09 11:00:06,714 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 5 results.
2015-04-09 11:00:06,731 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:00:06,731 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:00:06,757 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:00:06,757 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:00:06,757 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:00:06,757 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:00:06,758 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428602400_42
2015-04-09 11:00:07,051 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=1aa95756-fa24-4f72-bc3c-47924c8aa6bf
2015-04-09 11:00:07,080 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:00:07,080 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:00:07,080 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:00:07,080 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:00:07,121 DEBUG Create event will be: time=2015-04-09T11:00:07.121842 severity=INFO origin="alert_handler" event_id="7a29fa2f7f90f526083afd21181d8d18" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="1aa95756-fa24-4f72-bc3c-47924c8aa6bf" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428602400_42" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428602401"
2015-04-09 11:00:07,129 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428602400_42 with incident_id=1aa95756-fa24-4f72-bc3c-47924c8aa6bf
2015-04-09 11:00:07,156 DEBUG results for incident_id=1aa95756-fa24-4f72-bc3c-47924c8aa6bf written to collection.
2015-04-09 11:00:07,156 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428602400_42 incident_id=1aa95756-fa24-4f72-bc3c-47924c8aa6bf result_id=0 written to collection incident_results
2015-04-09 11:00:07,156 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:00:07,163 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:00:07,164 INFO Alert handler finished. duration=0.746s
2015-04-09 11:10:05,999 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603000_103/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603000_103 sessionKey=xgzLQiaH14wQWWLfAVnO8_^0dbpAKmDTGApPwZV9Ed47PtHlyWkS^qdMwUQ9k8fR8lUdUR2vS6vRuhPOJecXL2yJIGYberBqjqc6I92QrTV6NOVOGHdBsEKPQzqmyktWTe08XMzRrVEEAfe4xF alert=TO Failed Login Alert
2015-04-09 11:10:06,005 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603000_103' has been fired.
2015-04-09 11:10:06,154 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603000_106/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603000_106 sessionKey=JleaitnAmywrw43pb8BOImOCz4F_YMEoScJrLYD8cnW10vZ6YO8rZU^Qsdvjpie69CaKa_VmvXsZ8ZbPyeGUpi04UGgsqIaBkBpooCt^s^ZjVI3HjU6o3dGBVHYee^mp8QIPQa8y alert=To Global Failed Login >3 Alert
2015-04-09 11:10:06,159 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603000_106' has been fired.
2015-04-09 11:10:06,279 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low"}
2015-04-09 11:10:06,279 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:10:06,286 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:10:06,286 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:10:06,286 DEBUG Incident config after getting settings: {"alert": "TO Failed Login Alert", "alert_script": "", "tags": "[Untagged]", "_user": "nobody", "auto_assign_user": "", "run_alert_script": false, "auto_previous_resolve": false, "urgency": "low", "auto_ttl_resolve": false, "category": "unknown", "subcategory": "unknown", "auto_assign_owner": "unassigned", "auto_assign": false, "_key": "5526b8b13403e13421162b0c"}
2015-04-09 11:10:06,294 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:10:06,313 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:10:06,313 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:10:06,339 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:10:06,339 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:10:06,339 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:10:06,339 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:10:06,339 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603000_103
2015-04-09 11:10:06,433 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 11:10:06,434 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 11:10:06,439 DEBUG Incident settings: [ { "subcategory" : "unknown", "auto_ttl_resolve" : false, "category" : "unknown", "tags" : "[Untagged]", "alert" : "To Global Failed Login >3 Alert", "auto_previous_resolve" : false, "urgency" : "low", "run_alert_script" : false, "auto_assign_owner" : "unassigned", "auto_assign" : false, "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 11:10:06,440 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 11:10:06,440 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "_key": "5526b8b13403e13421162b0d", "_user": "nobody", "subcategory": "unknown", "auto_ttl_resolve": false, "run_alert_script": false, "urgency": "low", "auto_assign_user": "", "auto_assign": false, "auto_assign_owner": "unassigned", "category": "unknown", "tags": "[Untagged]", "alert": "To Global Failed Login >3 Alert", "alert_script": ""}
2015-04-09 11:10:06,448 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 4 results.
2015-04-09 11:10:06,468 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:10:06,468 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:10:06,495 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:10:06,495 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:10:06,495 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:10:06,495 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:10:06,496 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603000_106
2015-04-09 11:10:06,639 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=be2a1bc3-6ffc-4949-878c-61760bda952b
2015-04-09 11:10:06,664 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:10:06,665 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:10:06,665 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:10:06,665 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:10:06,680 DEBUG Create event will be: time=2015-04-09T11:10:06.680892 severity=INFO origin="alert_handler" event_id="9a318fd36520cd45547211ef8bef62d6" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="be2a1bc3-6ffc-4949-878c-61760bda952b" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603000_103" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603001"
2015-04-09 11:10:06,687 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603000_103 with incident_id=be2a1bc3-6ffc-4949-878c-61760bda952b
2015-04-09 11:10:06,715 DEBUG results for incident_id=be2a1bc3-6ffc-4949-878c-61760bda952b written to collection.
2015-04-09 11:10:06,715 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603000_103 incident_id=be2a1bc3-6ffc-4949-878c-61760bda952b result_id=0 written to collection incident_results
2015-04-09 11:10:06,715 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:10:06,722 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:10:06,722 INFO Alert handler finished. duration=0.724s
2015-04-09 11:10:06,793 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=9cd421fa-7116-4175-b545-442ec58d90bd
2015-04-09 11:10:06,820 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:10:06,820 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:10:06,821 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:10:06,821 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:10:06,853 DEBUG Create event will be: time=2015-04-09T11:10:06.853206 severity=INFO origin="alert_handler" event_id="18bfa7578d683e67d048e52eaf80facb" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="9cd421fa-7116-4175-b545-442ec58d90bd" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603000_106" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603001"
2015-04-09 11:10:06,859 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603000_106 with incident_id=9cd421fa-7116-4175-b545-442ec58d90bd
2015-04-09 11:10:06,887 DEBUG results for incident_id=9cd421fa-7116-4175-b545-442ec58d90bd written to collection.
2015-04-09 11:10:06,887 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603000_106 incident_id=9cd421fa-7116-4175-b545-442ec58d90bd result_id=0 written to collection incident_results
2015-04-09 11:10:06,887 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:10:06,894 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:10:06,894 INFO Alert handler finished. duration=0.741s
2015-04-09 11:17:04,897 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603420_152/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603420_152 sessionKey=kSFRhrcxYCB6CDcoQw4G8pqlkgxaiUeeV9Z53ROQYo3J5^fs90bfXfbFyIPdeCfAvMIJ7HlLoVSNcqaiMxsPpeGJEsUIdGqJX8pmLvGaD1i6xP6BObWsdngOruS0ZCbuzTvTG8lcGyYPKN alert=Test Alert | |
2015-04-09 11:17:04,903 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603420_152' has been fired. | |
2015-04-09 11:17:05,173 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"} | |
2015-04-09 11:17:05,174 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-09 11:17:05,180 DEBUG Incident settings: [ ] | |
2015-04-09 11:17:05,180 INFO No incident settings found for Test Alert, switching back to defaults. | |
2015-04-09 11:17:05,180 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "category": "", "subcategory": "", "alert_script": "", "tags": "", "auto_previous_resolve": false, "urgency": "low", "run_alert_script": false, "auto_assign_user": "", "auto_assign": false} | |
2015-04-09 11:17:05,189 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-09 11:17:05,208 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 11:17:05,208 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:17:05,239 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:17:05,239 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:17:05,240 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 11:17:05,240 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 11:17:05,240 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603420_152 | |
2015-04-09 11:17:05,535 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=850cd16d-60ce-479b-875d-84d582c9aa93 | |
2015-04-09 11:17:05,566 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:17:05,566 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:17:05,566 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': 'low'} | |
2015-04-09 11:17:05,567 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:17:05,581 DEBUG Create event will be: time=2015-04-09T11:17:05.581637 severity=INFO origin="alert_handler" event_id="0b0cfa377c727c1e393f71d9104644a8" user="splunk-system-user" action="create" alert="Test Alert" incident_id="850cd16d-60ce-479b-875d-84d582c9aa93" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603420_152" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603421" | |
2015-04-09 11:17:05,587 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603420_152 with incident_id=850cd16d-60ce-479b-875d-84d582c9aa93 | |
2015-04-09 11:17:05,616 DEBUG results for incident_id=850cd16d-60ce-479b-875d-84d582c9aa93 written to collection. | |
2015-04-09 11:17:05,616 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603420_152 incident_id=850cd16d-60ce-479b-875d-84d582c9aa93 result_id=0 written to collection incident_results | |
2015-04-09 11:17:05,616 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:17:05,622 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:17:05,623 INFO Alert handler finished. duration=0.726s | |
2015-04-09 11:18:04,666 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603480_154/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603480_154 sessionKey=VKTe4ByD_aUm4cyEOTd2NZWjPM3s2S1K8H8Jt8CVAum3YJIBCqJjkBPNz1P6R0Y5FjybQzyCqSgxOh1EBmR2gqmZz_DFGf05AGp^hWELNNdxAPVyRFUznuRcZd0SzZ8pfmZm4Rn5LChP alert=Test Alert
2015-04-09 11:18:04,672 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603480_154' has been fired.
2015-04-09 11:18:04,951 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf"}
2015-04-09 11:18:04,952 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:18:04,958 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:18:04,958 INFO Found incident settings for Test Alert
2015-04-09 11:18:04,958 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "tags": "[Untagged]", "alert": "Test Alert", "urgency": "low", "run_alert_script": false, "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "category": "unknown", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false}
2015-04-09 11:18:04,966 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-09 11:18:04,983 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:18:04,983 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:18:05,010 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:18:05,010 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:18:05,010 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:18:05,010 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:18:05,010 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603480_154
2015-04-09 11:18:05,311 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=1b1e3138-159c-4842-ad81-701340b88191
2015-04-09 11:18:05,339 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:18:05,339 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:18:05,339 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:18:05,339 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:18:05,382 DEBUG Create event will be: time=2015-04-09T11:18:05.382305 severity=INFO origin="alert_handler" event_id="cd73fff4ded7cd4afb1effd722428d9f" user="splunk-system-user" action="create" alert="Test Alert" incident_id="1b1e3138-159c-4842-ad81-701340b88191" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603480_154" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603481"
2015-04-09 11:18:05,388 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603480_154 with incident_id=1b1e3138-159c-4842-ad81-701340b88191
2015-04-09 11:18:05,416 DEBUG results for incident_id=1b1e3138-159c-4842-ad81-701340b88191 written to collection.
2015-04-09 11:18:05,416 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603480_154 incident_id=1b1e3138-159c-4842-ad81-701340b88191 result_id=0 written to collection incident_results
2015-04-09 11:18:05,416 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:18:05,423 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:18:05,423 INFO Alert handler finished. duration=0.757s
2015-04-09 11:19:04,394 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603540_157/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603540_157 sessionKey=zN9qeq2yu2GJXyNmcNlrHObztJqAzw23uGJoS__NS7PVfS_gn4UKP_ud9esy08SQnhEEReOMTfHX70^mPpPF^pdVxfqgx954GDptuJNxzAVYtOZa2bUxAT2LaXevtvf1qNd93Hk8jN alert=Test Alert
2015-04-09 11:19:04,400 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603540_157' has been fired.
2015-04-09 11:19:04,672 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 11:19:04,672 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:19:04,679 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:19:04,679 INFO Found incident settings for Test Alert
2015-04-09 11:19:04,680 DEBUG Incident config after getting settings: {"urgency": "low", "auto_assign_user": "", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "alert": "Test Alert", "_key": "5526c2413403e135d146268b", "alert_script": "", "subcategory": "unknown", "tags": "[Untagged]", "category": "unknown", "run_alert_script": false}
2015-04-09 11:19:04,688 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-09 11:19:04,707 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:19:04,708 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:19:04,735 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:19:04,735 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:19:04,736 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:19:04,736 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:19:04,736 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603540_157
2015-04-09 11:19:05,033 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=17647891-8d43-4191-bce4-f5069e069b6c
2015-04-09 11:19:05,059 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:19:05,059 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:19:05,059 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:19:05,059 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:19:05,106 DEBUG Create event will be: time=2015-04-09T11:19:05.106745 severity=INFO origin="alert_handler" event_id="ec1f1ffb3d9cf571e35e594538e6c1bb" user="splunk-system-user" action="create" alert="Test Alert" incident_id="17647891-8d43-4191-bce4-f5069e069b6c" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603540_157" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603541"
2015-04-09 11:19:05,112 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603540_157 with incident_id=17647891-8d43-4191-bce4-f5069e069b6c
2015-04-09 11:19:05,141 DEBUG results for incident_id=17647891-8d43-4191-bce4-f5069e069b6c written to collection.
2015-04-09 11:19:05,141 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603540_157 incident_id=17647891-8d43-4191-bce4-f5069e069b6c result_id=0 written to collection incident_results
2015-04-09 11:19:05,141 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:19:05,148 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:19:05,148 INFO Alert handler finished. duration=0.754s
2015-04-09 11:20:06,476 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603600_163/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603600_163 sessionKey=cSm3ehsczjCVDbTFhjf_GRM7632SvA^CBrsv4xQYJAK09lwEf_heeIF9I2540YVqSHLo3isN80a0hf9Zy2HcRt6jZE6y_Oi6jBjJB2_iElLJmmww8lpXEOaT5^jxvaaN0OAqe4azrQSN alert=To Global Failed Login >3 Alert
2015-04-09 11:20:06,482 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603600_163' has been fired.
2015-04-09 11:20:06,533 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603600_164/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603600_164 sessionKey=LijXbIIBbA0DA7QPAQMIU6NlmxuGexKYMrMQ^QU7d1IwtVVwbF^b1Iod3p8EvokZeJc64jQsOybYjOVjpOrEl_y3gSO3^XES^hmLw9fzLTVxYYOL78c^CcQkTyfpnsO_KBfhLh0W alert=Test Alert
2015-04-09 11:20:06,539 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603600_164' has been fired.
2015-04-09 11:20:06,757 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low"}
2015-04-09 11:20:06,758 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 11:20:06,763 DEBUG Incident settings: [ { "alert" : "To Global Failed Login >3 Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 11:20:06,764 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 11:20:06,764 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "tags": "[Untagged]", "urgency": "low", "auto_assign": false, "auto_previous_resolve": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "_key": "5526b8b13403e13421162b0d", "category": "unknown", "_user": "nobody", "run_alert_script": false, "subcategory": "unknown", "alert": "To Global Failed Login >3 Alert", "alert_script": ""}
2015-04-09 11:20:06,771 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 11:20:06,789 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:20:06,790 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:20:06,816 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:20:06,817 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:20:06,817 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:20:06,817 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:20:06,817 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603600_163
2015-04-09 11:20:06,822 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 11:20:06,822 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:20:06,829 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:20:06,829 INFO Found incident settings for Test Alert
2015-04-09 11:20:06,829 DEBUG Incident config after getting settings: {"category": "unknown", "tags": "[Untagged]", "alert_script": "", "alert": "Test Alert", "auto_assign": false, "urgency": "low", "run_alert_script": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "subcategory": "unknown", "auto_ttl_resolve": false, "_key": "5526c2413403e135d146268b", "auto_previous_resolve": false, "_user": "nobody"}
2015-04-09 11:20:06,836 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-09 11:20:06,854 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:20:06,854 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:20:06,879 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:20:06,880 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:20:06,880 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:20:06,880 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:20:06,880 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603600_164
2015-04-09 11:20:07,116 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a51aed37-7b2d-4aa7-a84e-e8990d5f4c35
2015-04-09 11:20:07,145 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:20:07,145 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:20:07,146 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:20:07,146 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:20:07,195 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a24df1a2-37b3-4163-950d-2ebbfa41bd92
2015-04-09 11:20:07,197 DEBUG Create event will be: time=2015-04-09T11:20:07.197608 severity=INFO origin="alert_handler" event_id="8c22c82f15bea3997cfae25feb2dd193" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="a51aed37-7b2d-4aa7-a84e-e8990d5f4c35" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603600_163" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603601"
2015-04-09 11:20:07,206 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603600_163 with incident_id=a51aed37-7b2d-4aa7-a84e-e8990d5f4c35
2015-04-09 11:20:07,227 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:20:07,227 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:20:07,227 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:20:07,227 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:20:07,232 DEBUG results for incident_id=a51aed37-7b2d-4aa7-a84e-e8990d5f4c35 written to collection.
2015-04-09 11:20:07,232 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428603600_163 incident_id=a51aed37-7b2d-4aa7-a84e-e8990d5f4c35 result_id=0 written to collection incident_results
2015-04-09 11:20:07,232 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:20:07,240 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:20:07,240 INFO Alert handler finished. duration=0.764s
2015-04-09 11:20:07,267 DEBUG Create event will be: time=2015-04-09T11:20:07.267137 severity=INFO origin="alert_handler" event_id="c1787574a90ae582645d3e651cd85390" user="splunk-system-user" action="create" alert="Test Alert" incident_id="a24df1a2-37b3-4163-950d-2ebbfa41bd92" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603600_164" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603601"
2015-04-09 11:20:07,273 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603600_164 with incident_id=a24df1a2-37b3-4163-950d-2ebbfa41bd92
2015-04-09 11:20:07,302 DEBUG results for incident_id=a24df1a2-37b3-4163-950d-2ebbfa41bd92 written to collection.
2015-04-09 11:20:07,302 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603600_164 incident_id=a24df1a2-37b3-4163-950d-2ebbfa41bd92 result_id=0 written to collection incident_results
2015-04-09 11:20:07,302 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:20:07,309 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:20:07,310 INFO Alert handler finished. duration=0.777s
2015-04-09 11:20:07,529 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603600_167/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603600_167 sessionKey=UsZ4r^7Gpc_3yXf1hUk9b2qwKruNfjRzgTor1Ei71LL_a^xK_^gPyyx^cp09Fwyz9XV4wZeaRyCA8scq7OU^RXrRhQx3N9_P3KO7HnLjpmHPMi^33O6ba8o1K^5DGg_aa_oFONHgSxUEoX0h3N alert=TO Failed Login Alert
2015-04-09 11:20:07,535 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603600_167' has been fired.
2015-04-09 11:20:07,838 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"}
2015-04-09 11:20:07,838 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:20:07,844 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:20:07,844 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:20:07,844 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "urgency": "low", "_user": "nobody", "alert_script": "", "alert": "TO Failed Login Alert", "auto_assign_user": "", "subcategory": "unknown", "tags": "[Untagged]", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "_key": "5526b8b13403e13421162b0c", "run_alert_script": false, "auto_assign": false, "category": "unknown"}
2015-04-09 11:20:07,852 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 11:20:07,870 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:20:07,871 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:20:07,897 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:20:07,897 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:20:07,898 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:20:07,898 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:20:07,898 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603600_167
2015-04-09 11:20:08,197 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=f01a8276-34a4-49a4-ba1a-e70908ae0bac
2015-04-09 11:20:08,223 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:20:08,223 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:20:08,224 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:20:08,224 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:20:08,271 DEBUG Create event will be: time=2015-04-09T11:20:08.271136 severity=INFO origin="alert_handler" event_id="2bdc02deae15aa2a2badbb399f4eb38f" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="f01a8276-34a4-49a4-ba1a-e70908ae0bac" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603600_167" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603601"
2015-04-09 11:20:08,278 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603600_167 with incident_id=f01a8276-34a4-49a4-ba1a-e70908ae0bac
2015-04-09 11:20:08,305 DEBUG results for incident_id=f01a8276-34a4-49a4-ba1a-e70908ae0bac written to collection.
2015-04-09 11:20:08,305 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428603600_167 incident_id=f01a8276-34a4-49a4-ba1a-e70908ae0bac result_id=0 written to collection incident_results
2015-04-09 11:20:08,305 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:20:08,312 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:20:08,312 INFO Alert handler finished. duration=0.783s
2015-04-09 11:21:14,377 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603660_194/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603660_194 sessionKey=BbJU9H4WMKHky0Gi1QGxhStN_pbVe6tYAUhvaUOsDK0vbVKMvM1Y3hYKzZTTvxrxKIwftFupyYYbfPfYfOsT2cv2voiPg1Wwkeltm^vzzkF2tPLRykX1YicQ8m9tbtCyLTjrtdYyrGN alert=Test Alert
2015-04-09 11:21:14,383 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603660_194' has been fired.
2015-04-09 11:21:14,653 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 11:21:14,653 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:21:14,659 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:21:14,660 INFO Found incident settings for Test Alert
2015-04-09 11:21:14,660 DEBUG Incident config after getting settings: {"alert": "Test Alert", "auto_assign": false, "_user": "nobody", "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "alert_script": "", "auto_ttl_resolve": false, "run_alert_script": false, "urgency": "low", "auto_assign_user": "", "auto_assign_owner": "unassigned", "tags": "[Untagged]", "category": "unknown", "auto_previous_resolve": false}
2015-04-09 11:21:14,668 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-09 11:21:14,685 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:21:14,685 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:21:14,711 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:21:14,711 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:21:14,711 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:21:14,711 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:21:14,711 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603660_194
2015-04-09 11:21:15,008 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=0322bee5-c088-4d4b-910c-4c3907a00f49
2015-04-09 11:21:15,035 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:21:15,036 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:21:15,036 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:21:15,036 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:21:15,050 DEBUG Create event will be: time=2015-04-09T11:21:15.050160 severity=INFO origin="alert_handler" event_id="2db3c0e55f472db322e09c1a35a22121" user="splunk-system-user" action="create" alert="Test Alert" incident_id="0322bee5-c088-4d4b-910c-4c3907a00f49" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603660_194" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603671"
2015-04-09 11:21:15,056 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603660_194 with incident_id=0322bee5-c088-4d4b-910c-4c3907a00f49
2015-04-09 11:21:15,084 DEBUG results for incident_id=0322bee5-c088-4d4b-910c-4c3907a00f49 written to collection.
2015-04-09 11:21:15,084 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603660_194 incident_id=0322bee5-c088-4d4b-910c-4c3907a00f49 result_id=0 written to collection incident_results
2015-04-09 11:21:15,085 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:21:15,091 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:21:15,092 INFO Alert handler finished. duration=0.715s
2015-04-09 11:22:04,401 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603720_196/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603720_196 sessionKey=TIPyiIFMBDcNtLdSMRb^v1G2olGe4cmYJzcqiaXR1JJsYopwrN7AFNfwFZNO1mOjLILcHiUDIN^6kDxiEB23ej5Xo7DUeYPgg18PQ88TWLMJrFKYkq8BXwsUkAbpNiW5gCDn_s1sOEMlxO46 alert=Test Alert
2015-04-09 11:22:04,406 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603720_196' has been fired.
2015-04-09 11:22:04,677 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_priority": "low"}
2015-04-09 11:22:04,677 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:22:04,683 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:22:04,683 INFO Found incident settings for Test Alert
2015-04-09 11:22:04,683 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "auto_assign_user": "", "run_alert_script": false, "urgency": "low", "alert": "Test Alert", "alert_script": "", "tags": "[Untagged]", "category": "unknown", "_user": "nobody", "auto_assign": false, "auto_previous_resolve": false, "_key": "5526c2413403e135d146268b", "auto_ttl_resolve": false, "subcategory": "unknown"}
2015-04-09 11:22:04,691 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-09 11:22:04,708 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:22:04,708 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:22:04,734 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:22:04,734 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:22:04,735 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:22:04,735 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:22:04,735 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603720_196 | |
2015-04-09 11:22:05,030 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a22b50ad-bfd0-4af6-92a9-093c1b0babf2 | |
2015-04-09 11:22:05,057 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:22:05,057 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:22:05,057 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-09 11:22:05,057 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:22:05,104 DEBUG Create event will be: time=2015-04-09T11:22:05.104353 severity=INFO origin="alert_handler" event_id="e10a8a63fdb0cf20fa45ab2a08a07186" user="splunk-system-user" action="create" alert="Test Alert" incident_id="a22b50ad-bfd0-4af6-92a9-093c1b0babf2" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603720_196" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603721" | |
2015-04-09 11:22:05,110 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603720_196 with incident_id=a22b50ad-bfd0-4af6-92a9-093c1b0babf2 | |
2015-04-09 11:22:05,138 DEBUG results for incident_id=a22b50ad-bfd0-4af6-92a9-093c1b0babf2 written to collection. | |
2015-04-09 11:22:05,139 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603720_196 incident_id=a22b50ad-bfd0-4af6-92a9-093c1b0babf2 result_id=0 written to collection incident_results | |
2015-04-09 11:22:05,139 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:22:05,145 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:22:05,145 INFO Alert handler finished. duration=0.745s | |
2015-04-09 11:23:04,105 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603780_198/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603780_198 sessionKey=vdzs5Ojk9vBi3JEiZcUXGMiqccVCmedRaDh8y430jL5FQym_8cS6TM19vjjFnAboXJdqMFWVm6eN_8eSKnQ_ZV8yj20i8E^uP1oVcQMYTjQ9KJgCzxzvWJAqa00U9u_Ln_X6ycRlgtV6HzQhSC alert=Test Alert | |
2015-04-09 11:23:04,111 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603780_198' has been fired. | |
2015-04-09 11:23:04,380 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-09 11:23:04,380 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-09 11:23:04,386 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-09 11:23:04,386 INFO Found incident settings for Test Alert | |
2015-04-09 11:23:04,386 DEBUG Incident config after getting settings: {"auto_assign_user": "", "_key": "5526c2413403e135d146268b", "auto_assign_owner": "unassigned", "alert_script": "", "category": "unknown", "alert": "Test Alert", "subcategory": "unknown", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign": false, "tags": "[Untagged]", "urgency": "low", "auto_previous_resolve": false, "_user": "nobody"} | |
2015-04-09 11:23:04,394 INFO Found job for alert Test Alert. Context is 'search' with 3 results. | |
2015-04-09 11:23:04,413 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 11:23:04,413 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:23:04,438 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:23:04,439 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:23:04,439 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 11:23:04,439 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 11:23:04,439 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603780_198 | |
2015-04-09 11:23:04,729 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=92f7bf04-b3bb-4ca4-b4fc-1c601bc86b51 | |
2015-04-09 11:23:04,755 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:23:04,755 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:23:04,755 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-09 11:23:04,756 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:23:04,802 DEBUG Create event will be: time=2015-04-09T11:23:04.802230 severity=INFO origin="alert_handler" event_id="ed0f65d0c1dcd0d08ad5ff95af8adcab" user="splunk-system-user" action="create" alert="Test Alert" incident_id="92f7bf04-b3bb-4ca4-b4fc-1c601bc86b51" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603780_198" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603781" | |
2015-04-09 11:23:04,808 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603780_198 with incident_id=92f7bf04-b3bb-4ca4-b4fc-1c601bc86b51 | |
2015-04-09 11:23:04,836 DEBUG results for incident_id=92f7bf04-b3bb-4ca4-b4fc-1c601bc86b51 written to collection. | |
2015-04-09 11:23:04,836 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603780_198 incident_id=92f7bf04-b3bb-4ca4-b4fc-1c601bc86b51 result_id=0 written to collection incident_results | |
2015-04-09 11:23:04,836 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:23:04,843 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:23:04,844 INFO Alert handler finished. duration=0.739s | |
2015-04-09 11:24:04,026 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603840_200/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603840_200 sessionKey=OrzARXN1^EoFtkM^^Ns4vBT_ka0kJDobnzOqthSu93Alki1g927m5EMb46GCNz2uNF_3ZSyk^GQ6hB96CtIJeEZeRIj2p2r54ohp9hBFVc7y7caO1rCDtc9Z34t^mhIFtWQjgv67 alert=Test Alert | |
2015-04-09 11:24:04,032 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603840_200' has been fired. | |
2015-04-09 11:24:04,305 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"} | |
2015-04-09 11:24:04,305 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-09 11:24:04,311 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-09 11:24:04,311 INFO Found incident settings for Test Alert | |
2015-04-09 11:24:04,311 DEBUG Incident config after getting settings: {"_user": "nobody", "alert": "Test Alert", "auto_assign": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "tags": "[Untagged]", "alert_script": "", "auto_previous_resolve": false, "_key": "5526c2413403e135d146268b", "auto_ttl_resolve": false, "category": "unknown", "subcategory": "unknown", "run_alert_script": false, "urgency": "low"} | |
2015-04-09 11:24:04,319 INFO Found job for alert Test Alert. Context is 'search' with 3 results. | |
2015-04-09 11:24:04,337 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 11:24:04,337 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:24:04,362 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:24:04,362 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:24:04,362 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 11:24:04,362 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 11:24:04,363 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603840_200 | |
2015-04-09 11:24:04,652 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=9e45ef88-32bd-4c63-a0a5-5f0866e9c1c1 | |
2015-04-09 11:24:04,678 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:24:04,678 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:24:04,678 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-09 11:24:04,678 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:24:04,701 DEBUG Create event will be: time=2015-04-09T11:24:04.701744 severity=INFO origin="alert_handler" event_id="66d61ed335d0b5296708af0e44a63d93" user="splunk-system-user" action="create" alert="Test Alert" incident_id="9e45ef88-32bd-4c63-a0a5-5f0866e9c1c1" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603840_200" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603841" | |
2015-04-09 11:24:04,708 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603840_200 with incident_id=9e45ef88-32bd-4c63-a0a5-5f0866e9c1c1 | |
2015-04-09 11:24:04,736 DEBUG results for incident_id=9e45ef88-32bd-4c63-a0a5-5f0866e9c1c1 written to collection. | |
2015-04-09 11:24:04,736 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603840_200 incident_id=9e45ef88-32bd-4c63-a0a5-5f0866e9c1c1 result_id=0 written to collection incident_results | |
2015-04-09 11:24:04,736 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:24:04,742 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:24:04,743 INFO Alert handler finished. duration=0.717s | |
2015-04-09 11:25:04,888 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428603900_204/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428603900_204 sessionKey=zYv3oKu9aJYOn27bSgZk^oUlQx0V9KaR7r8wdXOUwY4ILNF32uN4Y8jqVhgkuNrNqnHOsHpXon2ahiCfKaZ6^ZvoOmWJ2lC00Hsgz4qQpmqLV7uJvxzZe3h6LsQ2s_TW7BKG5E9x alert=TO AD Audit Rule Alert | |
2015-04-09 11:25:04,894 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428603900_204' has been fired. | |
2015-04-09 11:25:05,170 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"} | |
2015-04-09 11:25:05,170 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-09 11:25:05,176 DEBUG Incident settings: [ { "alert" : "TO AD Audit Rule Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-09 11:25:05,176 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-09 11:25:05,176 DEBUG Incident config after getting settings: {"alert_script": "", "alert": "TO AD Audit Rule Alert", "tags": "[Untagged]", "category": "unknown", "auto_assign_owner": "unassigned", "auto_assign": false, "auto_assign_user": "", "urgency": "low", "auto_previous_resolve": false, "auto_ttl_resolve": false, "subcategory": "unknown", "_user": "nobody", "_key": "5526b8b13403e13421162b09", "run_alert_script": false} | |
2015-04-09 11:25:05,184 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results. | |
2015-04-09 11:25:05,201 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-09 11:25:05,201 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:25:05,226 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:25:05,226 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:25:05,227 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-09 11:25:05,227 DEBUG Matched impact in lookup, returning value=low | |
2015-04-09 11:25:05,227 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428603900_204 | |
2015-04-09 11:25:05,327 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603900_207/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603900_207 sessionKey=yPiHqMbsTE28ZgZzzV21Z1KDxsuF^3qlfnp1t5wuk9e4yh7JvCvX67qEv5i^r6i5s947Xpbfk1zwgNXnGhAMz1FRsI02hy6Wr3Ecf0c^qCO_ktn02EN0yzEpy5aUBJXSgmKFAM8olqFDaDC alert=Test Alert | |
2015-04-09 11:25:05,332 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603900_207' has been fired. | |
2015-04-09 11:25:05,528 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=2707243a-2987-4239-8555-887f757a16ba | |
2015-04-09 11:25:05,557 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:25:05,558 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:25:05,558 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'low'} | |
2015-04-09 11:25:05,558 DEBUG Matched priority in lookup, returning value=informational | |
2015-04-09 11:25:05,592 DEBUG Create event will be: time=2015-04-09T11:25:05.592858 severity=INFO origin="alert_handler" event_id="08cb83a3db61a2c0bdee3ba78253d3dc" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="2707243a-2987-4239-8555-887f757a16ba" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428603900_204" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603901" | |
2015-04-09 11:25:05,600 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428603900_204 with incident_id=2707243a-2987-4239-8555-887f757a16ba | |
2015-04-09 11:25:05,611 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf"} | |
2015-04-09 11:25:05,616 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-09 11:25:05,622 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-09 11:25:05,622 INFO Found incident settings for Test Alert | |
2015-04-09 11:25:05,622 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "tags": "[Untagged]", "alert": "Test Alert", "_key": "5526c2413403e135d146268b", "urgency": "low", "run_alert_script": false, "subcategory": "unknown", "category": "unknown", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false} | |
2015-04-09 11:25:05,626 DEBUG results for incident_id=2707243a-2987-4239-8555-887f757a16ba written to collection. | |
2015-04-09 11:25:05,626 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428603900_204 incident_id=2707243a-2987-4239-8555-887f757a16ba result_id=0 written to collection incident_results | |
2015-04-09 11:25:05,627 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:25:05,631 INFO Found job for alert Test Alert. Context is 'search' with 3 results. | |
2015-04-09 11:25:05,634 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:25:05,634 INFO Alert handler finished. duration=0.747s | |
2015-04-09 11:25:05,651 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 11:25:05,651 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:25:05,677 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:25:05,678 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:25:05,678 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 11:25:05,678 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 11:25:05,678 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603900_207 | |
2015-04-09 11:25:05,969 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=1ffbd7d6-d043-4630-b322-04b504fb1796 | |
2015-04-09 11:25:05,995 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:25:05,995 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:25:05,996 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-09 11:25:05,996 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:25:06,037 DEBUG Create event will be: time=2015-04-09T11:25:06.037920 severity=INFO origin="alert_handler" event_id="cca37d036803e0af03750a5dc29be1b1" user="splunk-system-user" action="create" alert="Test Alert" incident_id="1ffbd7d6-d043-4630-b322-04b504fb1796" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603900_207" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603901" | |
2015-04-09 11:25:06,044 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603900_207 with incident_id=1ffbd7d6-d043-4630-b322-04b504fb1796 | |
2015-04-09 11:25:06,072 DEBUG results for incident_id=1ffbd7d6-d043-4630-b322-04b504fb1796 written to collection. | |
2015-04-09 11:25:06,072 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603900_207 incident_id=1ffbd7d6-d043-4630-b322-04b504fb1796 result_id=0 written to collection incident_results | |
2015-04-09 11:25:06,072 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:25:06,078 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:25:06,079 INFO Alert handler finished. duration=0.752s | |
2015-04-09 11:26:29,490 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603960_14/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603960_14 sessionKey=xQkD72oyag9PGYZkIOxh2QMQ2foKVU69Brb9XOOGn2ebo7MR0CfpEP^oj4^hngmS3KQ4996ENyFuNwpSegaTy1aeieru5scPUp9t5hDI_WOzd70mBRYS1s1ZdCJo5XyUmdmk8G32zBUetcKK alert=Test Alert | |
2015-04-09 11:26:29,496 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603960_14' has been fired. | |
2015-04-09 11:26:29,806 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"} | |
2015-04-09 11:26:29,806 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-09 11:26:29,815 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-09 11:26:29,815 INFO Found incident settings for Test Alert | |
2015-04-09 11:26:29,815 DEBUG Incident config after getting settings: {"auto_assign": false, "alert": "Test Alert", "_key": "5526c2413403e135d146268b", "_user": "nobody", "subcategory": "unknown", "auto_assign_user": "", "run_alert_script": false, "auto_ttl_resolve": false, "tags": "[Untagged]", "alert_script": "", "auto_previous_resolve": false, "urgency": "low", "auto_assign_owner": "unassigned", "category": "unknown"} | |
2015-04-09 11:26:29,824 INFO Found job for alert Test Alert. Context is 'search' with 3 results. | |
2015-04-09 11:26:29,842 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 11:26:29,842 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:26:29,885 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:26:29,885 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:26:29,886 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 11:26:29,886 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 11:26:29,886 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603960_14 | |
2015-04-09 11:26:30,221 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=ba140135-81e1-444e-aa93-601c83e6a329 | |
2015-04-09 11:26:30,253 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:26:30,253 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:26:30,254 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-09 11:26:30,254 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:26:30,286 DEBUG Create event will be: time=2015-04-09T11:26:30.286893 severity=INFO origin="alert_handler" event_id="968245c1f37cd8d732c82437b5a65c29" user="splunk-system-user" action="create" alert="Test Alert" incident_id="ba140135-81e1-444e-aa93-601c83e6a329" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603960_14" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428603985" | |
2015-04-09 11:26:30,294 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603960_14 with incident_id=ba140135-81e1-444e-aa93-601c83e6a329 | |
2015-04-09 11:26:30,356 DEBUG results for incident_id=ba140135-81e1-444e-aa93-601c83e6a329 written to collection. | |
2015-04-09 11:26:30,356 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428603960_14 incident_id=ba140135-81e1-444e-aa93-601c83e6a329 result_id=0 written to collection incident_results | |
2015-04-09 11:26:30,356 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:26:30,366 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:26:30,366 INFO Alert handler finished. duration=0.876s | |
2015-04-09 11:27:03,930 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604020_20/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604020_20 sessionKey=eR6d6i0u7rkiGedFK7ymcnZ1N_8tCY^RT7sAK4bxrMLm6pDDrZ3tQt^qKuE3_N1VAnqLUjhMaPR9nyCmqxhHZVqoS7CxqpMoP75UM6efmw1_KTUWsVt0noi0ssXi0ZKvyF9N^D28lFvDvF alert=Test Alert | |
2015-04-09 11:27:03,936 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604020_20' has been fired. | |
2015-04-09 11:27:04,214 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low"} | |
2015-04-09 11:27:04,214 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-09 11:27:04,220 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-09 11:27:04,220 INFO Found incident settings for Test Alert | |
2015-04-09 11:27:04,220 DEBUG Incident config after getting settings: {"category": "unknown", "auto_previous_resolve": false, "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "tags": "[Untagged]", "alert_script": "", "_user": "nobody", "_key": "5526c2413403e135d146268b", "alert": "Test Alert", "subcategory": "unknown", "auto_assign_user": "", "urgency": "low", "auto_assign_owner": "unassigned"} | |
2015-04-09 11:27:04,228 INFO Found job for alert Test Alert. Context is 'search' with 3 results. | |
2015-04-09 11:27:04,247 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 11:27:04,247 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:27:04,275 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:27:04,275 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:27:04,276 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 11:27:04,276 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 11:27:04,276 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604020_20 | |
2015-04-09 11:27:04,589 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=b90627dc-b1dc-4e0d-acdf-53a77a58eae4 | |
2015-04-09 11:27:04,616 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:27:04,616 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:27:04,617 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-09 11:27:04,617 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:27:04,639 DEBUG Create event will be: time=2015-04-09T11:27:04.639197 severity=INFO origin="alert_handler" event_id="e57247d784ecf6475eec565d18ffcc88" user="splunk-system-user" action="create" alert="Test Alert" incident_id="b90627dc-b1dc-4e0d-acdf-53a77a58eae4" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604020_20" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604021" | |
2015-04-09 11:27:04,646 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604020_20 with incident_id=b90627dc-b1dc-4e0d-acdf-53a77a58eae4
2015-04-09 11:27:04,673 DEBUG results for incident_id=b90627dc-b1dc-4e0d-acdf-53a77a58eae4 written to collection.
2015-04-09 11:27:04,673 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604020_20 incident_id=b90627dc-b1dc-4e0d-acdf-53a77a58eae4 result_id=0 written to collection incident_results
2015-04-09 11:27:04,673 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:27:04,682 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:27:04,682 INFO Alert handler finished. duration=0.753s
2015-04-09 11:28:04,251 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604080_23/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604080_23 sessionKey=bDsy67cZg5r177e1VOuLe5r9VGWkfRs8N^QDFjLK97pwfyxwzmAiyhzCIB0rJ0PQ8GQlcvH0sUwYm45o_hUf0TQmtrJxu3EQLlibP2Hq4pr8z1doJnevOex5r^LAQBdzlgr6FnZ7IJxlCI3 alert=Test Alert
2015-04-09 11:28:04,257 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604080_23' has been fired.
2015-04-09 11:28:04,536 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_owner": "unassigned", "default_impact": "low"}
2015-04-09 11:28:04,537 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:28:04,543 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:28:04,543 INFO Found incident settings for Test Alert
2015-04-09 11:28:04,543 DEBUG Incident config after getting settings: {"auto_assign_user": "", "auto_assign_owner": "unassigned", "_key": "5526c2413403e135d146268b", "alert": "Test Alert", "auto_previous_resolve": false, "category": "unknown", "run_alert_script": false, "subcategory": "unknown", "tags": "[Untagged]", "urgency": "low", "auto_ttl_resolve": false, "auto_assign": false, "_user": "nobody", "alert_script": ""}
2015-04-09 11:28:04,551 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:28:04,569 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:28:04,569 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:28:04,595 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:28:04,596 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:28:04,596 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:28:04,596 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:28:04,596 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604080_23
2015-04-09 11:28:04,895 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=93d53bda-4d29-4032-a034-e6319ef7df1c
2015-04-09 11:28:04,922 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:28:04,923 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:28:04,923 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:28:04,923 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:28:04,949 DEBUG Create event will be: time=2015-04-09T11:28:04.949247 severity=INFO origin="alert_handler" event_id="063a32951d5e23a6861f0ad1f3257943" user="splunk-system-user" action="create" alert="Test Alert" incident_id="93d53bda-4d29-4032-a034-e6319ef7df1c" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604080_23" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604081"
2015-04-09 11:28:04,955 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604080_23 with incident_id=93d53bda-4d29-4032-a034-e6319ef7df1c
2015-04-09 11:28:05,018 DEBUG results for incident_id=93d53bda-4d29-4032-a034-e6319ef7df1c written to collection.
2015-04-09 11:28:05,018 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604080_23 incident_id=93d53bda-4d29-4032-a034-e6319ef7df1c result_id=0 written to collection incident_results
2015-04-09 11:28:05,018 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:28:05,026 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:28:05,026 INFO Alert handler finished. duration=0.775s
2015-04-09 11:29:04,031 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604140_25/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604140_25 sessionKey=pR0huP1gjPKJO2cOMt3oKUNheUopIzBwhmNVmfn4AN9_H2^DNfra1NRKSJ0OC4vus_cagTzkcbE3XYv_np6GvG_L72Xv4fMbnx_FFtdikauDtG^TSdsL_^AFX7_OaNEeQpXv_VSdN86DIxFX alert=Test Alert
2015-04-09 11:29:04,037 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604140_25' has been fired.
2015-04-09 11:29:04,307 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"}
2015-04-09 11:29:04,307 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:29:04,314 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:29:04,314 INFO Found incident settings for Test Alert
2015-04-09 11:29:04,315 DEBUG Incident config after getting settings: {"subcategory": "unknown", "auto_ttl_resolve": false, "_user": "nobody", "_key": "5526c2413403e135d146268b", "auto_previous_resolve": false, "tags": "[Untagged]", "alert_script": "", "alert": "Test Alert", "run_alert_script": false, "category": "unknown", "auto_assign": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "urgency": "low"}
2015-04-09 11:29:04,323 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:29:04,341 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:29:04,341 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:29:04,366 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:29:04,366 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:29:04,367 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:29:04,367 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:29:04,367 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604140_25
2015-04-09 11:29:04,664 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=18c341c2-3d72-41b8-8272-373ca1a00a8c
2015-04-09 11:29:04,691 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:29:04,691 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:29:04,691 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:29:04,691 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:29:04,722 DEBUG Create event will be: time=2015-04-09T11:29:04.722044 severity=INFO origin="alert_handler" event_id="8038a36eac82759793b5a7d47750256b" user="splunk-system-user" action="create" alert="Test Alert" incident_id="18c341c2-3d72-41b8-8272-373ca1a00a8c" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604140_25" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604141"
2015-04-09 11:29:04,728 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604140_25 with incident_id=18c341c2-3d72-41b8-8272-373ca1a00a8c
2015-04-09 11:29:04,756 DEBUG results for incident_id=18c341c2-3d72-41b8-8272-373ca1a00a8c written to collection.
2015-04-09 11:29:04,756 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604140_25 incident_id=18c341c2-3d72-41b8-8272-373ca1a00a8c result_id=0 written to collection incident_results
2015-04-09 11:29:04,757 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:29:04,763 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:29:04,763 INFO Alert handler finished. duration=0.732s
2015-04-09 11:30:06,464 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604200_36/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604200_36 sessionKey=J402l4wMgjZRGLu0ly3HyTCBsl_Eya7myzEcJcKWGReRV2TEg_W^Xv7X7k8orMNgIhkkLIoiklmFPBWfTEgk5Ap4OvU3yiTHN4vwp6swDywDsqrJNnhuxSOrp^3X6nMO3j8DJXQG6i^G9PbyLN alert=To Global Failed Login >3 Alert
2015-04-09 11:30:06,469 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604200_36' has been fired.
2015-04-09 11:30:06,534 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604200_37/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604200_37 sessionKey=QrGZDcY6ObSYGak_0M_9feXIBifhMDAZwFLZ9VANSi_8hyO7VNt^09QtKdl43E_AIeg6fkHgPfiHBGfzXZQozxKShtC2YCT8FaH^5BYYnP21Gc1lOa43jilVUuTKNCwpqTqAfy7uDxvXS3FpIN alert=Test Alert
2015-04-09 11:30:06,540 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604200_37' has been fired.
2015-04-09 11:30:06,753 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-09 11:30:06,753 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 11:30:06,759 DEBUG Incident settings: [ { "alert" : "To Global Failed Login >3 Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 11:30:06,760 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 11:30:06,760 DEBUG Incident config after getting settings: {"auto_assign_user": "", "subcategory": "unknown", "alert": "To Global Failed Login >3 Alert", "auto_assign_owner": "unassigned", "urgency": "low", "auto_previous_resolve": false, "run_alert_script": false, "_key": "5526b8b13403e13421162b0d", "category": "unknown", "_user": "nobody", "alert_script": "", "tags": "[Untagged]", "auto_assign": false, "auto_ttl_resolve": false}
2015-04-09 11:30:06,767 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results.
2015-04-09 11:30:06,786 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:30:06,786 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:30:06,815 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:30:06,815 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:30:06,815 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:30:06,815 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:30:06,816 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604200_36
2015-04-09 11:30:06,833 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-09 11:30:06,833 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:30:06,840 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:30:06,840 INFO Found incident settings for Test Alert
2015-04-09 11:30:06,840 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "auto_ttl_resolve": false, "alert_script": "", "tags": "[Untagged]", "urgency": "low", "_user": "nobody", "auto_assign": false, "auto_assign_user": "", "subcategory": "unknown", "run_alert_script": false, "category": "unknown", "auto_assign_owner": "unassigned", "alert": "Test Alert", "_key": "5526c2413403e135d146268b"}
2015-04-09 11:30:06,848 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:30:06,866 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:30:06,866 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:30:06,893 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:30:06,893 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:30:06,893 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:30:06,894 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:30:06,894 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604200_37
2015-04-09 11:30:07,129 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=168ebd55-3e6c-41de-bf8c-814969ff50c5
2015-04-09 11:30:07,165 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:30:07,165 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:30:07,165 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:30:07,166 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:30:07,191 DEBUG Create event will be: time=2015-04-09T11:30:07.191655 severity=INFO origin="alert_handler" event_id="8f1d55b7f5d04f307a92b5a8fc1b4470" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="168ebd55-3e6c-41de-bf8c-814969ff50c5" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604200_36" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604201"
2015-04-09 11:30:07,198 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604200_36 with incident_id=168ebd55-3e6c-41de-bf8c-814969ff50c5
2015-04-09 11:30:07,210 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=08219d2a-c5b2-4c38-93d7-1c08f3622a08
2015-04-09 11:30:07,227 DEBUG results for incident_id=168ebd55-3e6c-41de-bf8c-814969ff50c5 written to collection.
2015-04-09 11:30:07,227 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604200_36 incident_id=168ebd55-3e6c-41de-bf8c-814969ff50c5 result_id=0 written to collection incident_results
2015-04-09 11:30:07,227 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:30:07,234 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:30:07,235 INFO Alert handler finished. duration=0.771s
2015-04-09 11:30:07,245 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:30:07,245 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:30:07,246 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:30:07,246 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:30:07,262 DEBUG Create event will be: time=2015-04-09T11:30:07.262668 severity=INFO origin="alert_handler" event_id="b2e88a9ec7463ded123ee5b29be645b2" user="splunk-system-user" action="create" alert="Test Alert" incident_id="08219d2a-c5b2-4c38-93d7-1c08f3622a08" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604200_37" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604201"
2015-04-09 11:30:07,270 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604200_37 with incident_id=08219d2a-c5b2-4c38-93d7-1c08f3622a08
2015-04-09 11:30:07,331 DEBUG results for incident_id=08219d2a-c5b2-4c38-93d7-1c08f3622a08 written to collection.
2015-04-09 11:30:07,331 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604200_37 incident_id=08219d2a-c5b2-4c38-93d7-1c08f3622a08 result_id=0 written to collection incident_results
2015-04-09 11:30:07,331 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:30:07,339 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:30:07,339 INFO Alert handler finished. duration=0.805s
2015-04-09 11:30:07,544 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604200_40/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604200_40 sessionKey=GfvluIRf1I_v5UutFOyF3fs5dLRcx_5NWaXC^a1nJyO59L2vJYoXLiNNCSNWSEpnlyUMQwNRC^ApceheymG2CLoO4QPuuhwVnTNTWEab6OYyMPp7BY2XoI6NmlvSnVUQnYYxvz_BEevMBN alert=TO Failed Login Alert
2015-04-09 11:30:07,550 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604200_40' has been fired.
2015-04-09 11:30:07,852 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 11:30:07,852 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:30:07,860 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:30:07,860 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:30:07,861 DEBUG Incident config after getting settings: {"category": "unknown", "alert": "TO Failed Login Alert", "subcategory": "unknown", "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "urgency": "low", "tags": "[Untagged]", "auto_previous_resolve": false, "_user": "nobody", "alert_script": "", "auto_assign_user": "", "_key": "5526b8b13403e13421162b0c", "auto_assign_owner": "unassigned"}
2015-04-09 11:30:07,872 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:30:07,895 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:30:07,895 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:30:07,931 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:30:07,931 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:30:07,931 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:30:07,931 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:30:07,931 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604200_40
2015-04-09 11:30:08,311 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=1ebe5905-1cc3-49d7-82f2-c9929772f14d
2015-04-09 11:30:08,351 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:30:08,352 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:30:08,352 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:30:08,352 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:30:08,400 DEBUG Create event will be: time=2015-04-09T11:30:08.400698 severity=INFO origin="alert_handler" event_id="9a6863bebab23841025ee5b7d3e27197" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="1ebe5905-1cc3-49d7-82f2-c9929772f14d" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604200_40" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604202"
2015-04-09 11:30:08,409 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604200_40 with incident_id=1ebe5905-1cc3-49d7-82f2-c9929772f14d
2015-04-09 11:30:08,478 DEBUG results for incident_id=1ebe5905-1cc3-49d7-82f2-c9929772f14d written to collection.
2015-04-09 11:30:08,479 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604200_40 incident_id=1ebe5905-1cc3-49d7-82f2-c9929772f14d result_id=0 written to collection incident_results
2015-04-09 11:30:08,479 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:30:08,490 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:30:08,490 INFO Alert handler finished. duration=0.947s
2015-04-09 11:31:14,875 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604260_68/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604260_68 sessionKey=YcQCM^9QYWU7JMuQW9MLVfwLvIUQlD45Vug01kDi0mZJKuRd3DIuelRZ7YIdsIT2Q411nlQU4zaUvkdgD7ZwcXsPPr8pCbS2txCQD5yYvvLH9fIeDsEHh1_V8CSNDe_2v27GpUoxiH0QJB3 alert=TO Failed Login Alert
2015-04-09 11:31:14,881 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604260_68' has been fired.
2015-04-09 11:31:15,156 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 11:31:15,156 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:31:15,162 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:31:15,162 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:31:15,162 DEBUG Incident config after getting settings: {"run_alert_script": false, "alert_script": "", "tags": "[Untagged]", "subcategory": "unknown", "_key": "5526b8b13403e13421162b0c", "auto_assign_owner": "unassigned", "category": "unknown", "alert": "TO Failed Login Alert", "auto_previous_resolve": false, "auto_ttl_resolve": false, "_user": "nobody", "auto_assign": false, "urgency": "low", "auto_assign_user": ""}
2015-04-09 11:31:15,170 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 4 results.
2015-04-09 11:31:15,188 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:31:15,188 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:31:15,215 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:31:15,215 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:31:15,215 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:31:15,216 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:31:15,216 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604260_68
2015-04-09 11:31:15,383 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604260_69/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604260_69 sessionKey=fvluZjUEf_MqjeelBbkj5AC0SdLU5B2RFVKZkEPX8pBnNx2BMz3TVxshaCi7vFx6jB2U7tY^5_ME2TcJi5eqKzcUFxYQkC6Pir8gQPWgdSl1Cee6y2Gsr0gzm91uRisLfdkN0FbxPELhrD_D alert=Test Alert
2015-04-09 11:31:15,389 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604260_69' has been fired.
2015-04-09 11:31:15,521 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=2e962116-cc24-437d-bf14-9d2d50d40726
2015-04-09 11:31:15,548 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:31:15,548 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:31:15,548 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:31:15,548 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:31:15,578 DEBUG Create event will be: time=2015-04-09T11:31:15.578341 severity=INFO origin="alert_handler" event_id="6049c0a7a044ed896137234d5c198e6c" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="2e962116-cc24-437d-bf14-9d2d50d40726" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604260_68" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604272"
2015-04-09 11:31:15,584 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604260_68 with incident_id=2e962116-cc24-437d-bf14-9d2d50d40726
2015-04-09 11:31:15,612 DEBUG results for incident_id=2e962116-cc24-437d-bf14-9d2d50d40726 written to collection.
2015-04-09 11:31:15,613 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604260_68 incident_id=2e962116-cc24-437d-bf14-9d2d50d40726 result_id=0 written to collection incident_results
2015-04-09 11:31:15,613 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:31:15,619 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:31:15,620 INFO Alert handler finished. duration=0.745s
2015-04-09 11:31:15,663 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"}
2015-04-09 11:31:15,664 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:31:15,670 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:31:15,670 INFO Found incident settings for Test Alert
2015-04-09 11:31:15,670 DEBUG Incident config after getting settings: {"subcategory": "unknown", "run_alert_script": false, "category": "unknown", "alert": "Test Alert", "alert_script": "", "_user": "nobody", "auto_ttl_resolve": false, "auto_assign": false, "tags": "[Untagged]", "urgency": "low", "auto_assign_user": "", "auto_previous_resolve": false, "_key": "5526c2413403e135d146268b", "auto_assign_owner": "unassigned"}
2015-04-09 11:31:15,679 INFO Found job for alert Test Alert. Context is 'search' with 4 results.
2015-04-09 11:31:15,698 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:31:15,698 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:31:15,723 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:31:15,724 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:31:15,724 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:31:15,724 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:31:15,725 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604260_69
2015-04-09 11:31:16,025 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a48c8588-5873-4d9e-a321-f68ad080b9d2
2015-04-09 11:31:16,051 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:31:16,051 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:31:16,051 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:31:16,052 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:31:16,091 DEBUG Create event will be: time=2015-04-09T11:31:16.091537 severity=INFO origin="alert_handler" event_id="2b77908efa6da1881edbfbc580095d9c" user="splunk-system-user" action="create" alert="Test Alert" incident_id="a48c8588-5873-4d9e-a321-f68ad080b9d2" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604260_69" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604272"
2015-04-09 11:31:16,097 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604260_69 with incident_id=a48c8588-5873-4d9e-a321-f68ad080b9d2
2015-04-09 11:31:16,125 DEBUG results for incident_id=a48c8588-5873-4d9e-a321-f68ad080b9d2 written to collection.
2015-04-09 11:31:16,126 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604260_69 incident_id=a48c8588-5873-4d9e-a321-f68ad080b9d2 result_id=0 written to collection incident_results
2015-04-09 11:31:16,126 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:31:16,132 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:31:16,132 INFO Alert handler finished. duration=0.749s
2015-04-09 11:32:04,487 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604320_72/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604320_72 sessionKey=kyMmLNRClC0pZdvtNtRh62AbU11fUYopSfxmXghzX9ayKRY0jPruGC1c^8Q3_wav54v0CDO^F^nNjPho9zV4UspWGNs62XY^D9hANqFVUTvUUF8AfYyQKyUPwNfDG5KJJH2EiyJ^TMXc alert=Test Alert
2015-04-09 11:32:04,493 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604320_72' has been fired.
2015-04-09 11:32:04,645 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604320_71/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604320_71 sessionKey=OrwpTMHt0Hi1rMOk7_FpP522nll4Sdzr^mw6Jg8AMv8iRTRcBswz0FSbQHNmsL2xnNkyG3^TNNKKV9DMYS^XQx^RBMGrqEIqPxRwvzTMO0HUy1mQQsHpsJggVYdKdHIVRPXVveVQX6L0I8x alert=TO Failed Login Alert
2015-04-09 11:32:04,651 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604320_71' has been fired.
2015-04-09 11:32:04,773 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 11:32:04,773 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:32:04,780 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:32:04,780 INFO Found incident settings for Test Alert
2015-04-09 11:32:04,780 DEBUG Incident config after getting settings: {"auto_assign_user": "", "auto_assign_owner": "unassigned", "_key": "5526c2413403e135d146268b", "alert": "Test Alert", "category": "unknown", "auto_previous_resolve": false, "run_alert_script": false, "subcategory": "unknown", "urgency": "low", "tags": "[Untagged]", "auto_assign": false, "auto_ttl_resolve": false, "_user": "nobody", "alert_script": ""}
2015-04-09 11:32:04,789 INFO Found job for alert Test Alert. Context is 'search' with 4 results.
2015-04-09 11:32:04,806 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:32:04,806 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:32:04,831 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:32:04,831 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:32:04,831 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:32:04,832 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:32:04,832 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604320_72
2015-04-09 11:32:04,931 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf"}
2015-04-09 11:32:04,932 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:32:04,938 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:32:04,938 INFO Found incident settings for TO Failed Login Alert | |
2015-04-09 11:32:04,938 DEBUG Incident config after getting settings: {"_key": "5526b8b13403e13421162b0c", "auto_assign_owner": "unassigned", "run_alert_script": false, "urgency": "low", "auto_previous_resolve": false, "alert_script": "", "category": "unknown", "subcategory": "unknown", "alert": "TO Failed Login Alert", "tags": "[Untagged]", "auto_ttl_resolve": false, "_user": "nobody", "auto_assign": false, "auto_assign_user": ""} | |
2015-04-09 11:32:04,946 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 4 results. | |
2015-04-09 11:32:04,964 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-09 11:32:04,965 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 11:32:04,991 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 11:32:04,991 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 11:32:04,992 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-09 11:32:04,992 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-09 11:32:04,992 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604320_71 | |
2015-04-09 11:32:05,134 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=e7308757-a6ef-4536-b124-3a05db08c6cb | |
2015-04-09 11:32:05,161 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:32:05,161 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 11:32:05,161 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-09 11:32:05,161 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 11:32:05,203 DEBUG Create event will be: time=2015-04-09T11:32:05.203283 severity=INFO origin="alert_handler" event_id="8e014dda329d26eabe58c0afd6441e06" user="splunk-system-user" action="create" alert="Test Alert" incident_id="e7308757-a6ef-4536-b124-3a05db08c6cb" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604320_72" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604321" | |
2015-04-09 11:32:05,209 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604320_72 with incident_id=e7308757-a6ef-4536-b124-3a05db08c6cb | |
2015-04-09 11:32:05,237 DEBUG results for incident_id=e7308757-a6ef-4536-b124-3a05db08c6cb written to collection. | |
2015-04-09 11:32:05,237 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604320_72 incident_id=e7308757-a6ef-4536-b124-3a05db08c6cb result_id=0 written to collection incident_results | |
2015-04-09 11:32:05,238 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 11:32:05,244 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 11:32:05,245 INFO Alert handler finished. duration=0.758s | |
2015-04-09 11:32:05,298 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=2e121878-ea19-4847-8363-878c3bcd940e
2015-04-09 11:32:05,325 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:32:05,325 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:32:05,325 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:32:05,325 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:32:05,341 DEBUG Create event will be: time=2015-04-09T11:32:05.341328 severity=INFO origin="alert_handler" event_id="663afcdf7a2fad1f3690a9ac2613ff58" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="2e121878-ea19-4847-8363-878c3bcd940e" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604320_71" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604321"
2015-04-09 11:32:05,348 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604320_71 with incident_id=2e121878-ea19-4847-8363-878c3bcd940e
2015-04-09 11:32:05,375 DEBUG results for incident_id=2e121878-ea19-4847-8363-878c3bcd940e written to collection.
2015-04-09 11:32:05,375 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604320_71 incident_id=2e121878-ea19-4847-8363-878c3bcd940e result_id=0 written to collection incident_results
2015-04-09 11:32:05,376 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:32:05,383 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:32:05,383 INFO Alert handler finished. duration=0.738s
2015-04-09 11:33:04,352 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604380_74/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604380_74 sessionKey=I81ZN9Qz1uwGoeXhN2sYVM4cLX4ZaM0i0unPYahSCN^3rhOXQnIn4TyK_YIunz1I6YNJHNF8j_Sfhi2UDVc9JumLa4JO6SuIzUBxRmhW3oeo2snHMTF5ArvychLSZnFOVQgOq17S alert=TO Failed Login Alert
2015-04-09 11:33:04,358 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604380_74' has been fired.
2015-04-09 11:33:04,435 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604380_75/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604380_75 sessionKey=B6GKlEFsrwi2hRIgr_jUOSHo50YbP^BXKCWMML^pJZZVG0iraU3^s_tiVA3puVNEVno6To931vUQMOVP_YuN7E1y3MHRs7SOL9vKxTseCJ60wu2PylwTszLjtJDVUZ1lzrIx7z5k alert=Test Alert
2015-04-09 11:33:04,441 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604380_75' has been fired.
2015-04-09 11:33:04,631 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 11:33:04,632 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:33:04,638 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:33:04,638 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:33:04,638 DEBUG Incident config after getting settings: {"urgency": "low", "subcategory": "unknown", "_user": "nobody", "run_alert_script": false, "category": "unknown", "tags": "[Untagged]", "alert_script": "", "auto_assign_user": "", "alert": "TO Failed Login Alert", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "_key": "5526b8b13403e13421162b0c", "auto_ttl_resolve": false, "auto_assign": false}
2015-04-09 11:33:04,646 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:33:04,663 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:33:04,663 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:33:04,687 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:33:04,688 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:33:04,688 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:33:04,688 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:33:04,688 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604380_74
2015-04-09 11:33:04,727 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 11:33:04,727 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:33:04,733 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:33:04,733 INFO Found incident settings for Test Alert
2015-04-09 11:33:04,733 DEBUG Incident config after getting settings: {"auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "urgency": "low", "auto_assign_owner": "unassigned", "category": "unknown", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "tags": "[Untagged]", "run_alert_script": false, "alert": "Test Alert", "subcategory": "unknown", "_key": "5526c2413403e135d146268b"}
2015-04-09 11:33:04,742 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:33:04,760 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:33:04,760 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:33:04,787 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:33:04,787 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:33:04,787 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:33:04,787 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:33:04,787 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604380_75
2015-04-09 11:33:04,989 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=eb2d151f-ad69-489e-b7d9-4e11df45f116
2015-04-09 11:33:05,015 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:33:05,015 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:33:05,015 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:33:05,015 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:33:05,038 DEBUG Create event will be: time=2015-04-09T11:33:05.038941 severity=INFO origin="alert_handler" event_id="3a6e7b11ee4976389e0db5401f6e7904" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="eb2d151f-ad69-489e-b7d9-4e11df45f116" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604380_74" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604381"
2015-04-09 11:33:05,045 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604380_74 with incident_id=eb2d151f-ad69-489e-b7d9-4e11df45f116
2015-04-09 11:33:05,074 DEBUG results for incident_id=eb2d151f-ad69-489e-b7d9-4e11df45f116 written to collection.
2015-04-09 11:33:05,074 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604380_74 incident_id=eb2d151f-ad69-489e-b7d9-4e11df45f116 result_id=0 written to collection incident_results
2015-04-09 11:33:05,074 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:33:05,082 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:33:05,083 INFO Alert handler finished. duration=0.731s
2015-04-09 11:33:05,098 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=81212835-0e21-4ba1-ad97-6451a5cad6d0
2015-04-09 11:33:05,124 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:33:05,124 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:33:05,125 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:33:05,125 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:33:05,142 DEBUG Create event will be: time=2015-04-09T11:33:05.142848 severity=INFO origin="alert_handler" event_id="9d8904837e2265d97ed2c12c04e45464" user="splunk-system-user" action="create" alert="Test Alert" incident_id="81212835-0e21-4ba1-ad97-6451a5cad6d0" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604380_75" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604381"
2015-04-09 11:33:05,149 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604380_75 with incident_id=81212835-0e21-4ba1-ad97-6451a5cad6d0
2015-04-09 11:33:05,177 DEBUG results for incident_id=81212835-0e21-4ba1-ad97-6451a5cad6d0 written to collection.
2015-04-09 11:33:05,177 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604380_75 incident_id=81212835-0e21-4ba1-ad97-6451a5cad6d0 result_id=0 written to collection incident_results
2015-04-09 11:33:05,177 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:33:05,184 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:33:05,184 INFO Alert handler finished. duration=0.749s
2015-04-09 11:34:04,234 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604440_78/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604440_78 sessionKey=EnbP2wQHi1Fw^ox4jzwFe4Aa2Ht2XopbB7_pnqI8N2ntPMhnV6KeXo^zMZhJf9ib1BqNSBQWrYVEdDoZImSVz^X6HNv36mj_DnQmhHO9oirgatzwuhZEnNm_Wj6ipsEr21U58tTrxSctSHWE alert=TO Failed Login Alert
2015-04-09 11:34:04,240 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604440_78' has been fired.
2015-04-09 11:34:04,376 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604440_79/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604440_79 sessionKey=yjZinERlmgugqTvhLTfHiZ40ptazDFNJk05xoMazFwrT2alI894Ix_fiWUg9jnaHcArPCZhHus4FhfgcbEv79Z4OrDSQadDpiiCIuowf5nvnW2wqh0DNzElKCW^rX92UX4YbkpvQn2r alert=Test Alert
2015-04-09 11:34:04,382 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604440_79' has been fired.
2015-04-09 11:34:04,516 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 11:34:04,516 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:34:04,522 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:34:04,523 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:34:04,523 DEBUG Incident config after getting settings: {"urgency": "low", "auto_assign": false, "alert": "TO Failed Login Alert", "_key": "5526b8b13403e13421162b0c", "tags": "[Untagged]", "run_alert_script": false, "_user": "nobody", "auto_assign_user": "", "auto_previous_resolve": false, "alert_script": "", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "subcategory": "unknown", "category": "unknown"}
2015-04-09 11:34:04,530 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:34:04,549 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:34:04,549 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:34:04,577 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:34:04,577 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:34:04,578 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:34:04,578 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:34:04,578 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604440_78
2015-04-09 11:34:04,668 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-09 11:34:04,669 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:34:04,675 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:34:04,675 INFO Found incident settings for Test Alert
2015-04-09 11:34:04,675 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "alert_script": "", "category": "unknown", "_user": "nobody", "urgency": "low", "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "alert": "Test Alert", "auto_assign": false, "run_alert_script": false, "auto_assign_user": "", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "tags": "[Untagged]"}
2015-04-09 11:34:04,683 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:34:04,703 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:34:04,703 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:34:04,734 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:34:04,734 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:34:04,735 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:34:04,735 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:34:04,735 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604440_79
2015-04-09 11:34:04,887 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=98f67b6e-530a-4393-a6ac-df044230a27b
2015-04-09 11:34:04,914 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:34:04,914 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:34:04,914 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:34:04,915 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:34:04,982 DEBUG Create event will be: time=2015-04-09T11:34:04.982553 severity=INFO origin="alert_handler" event_id="9550d91cc1661e77223e21a856163002" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="98f67b6e-530a-4393-a6ac-df044230a27b" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604440_78" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604441"
2015-04-09 11:34:04,990 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604440_78 with incident_id=98f67b6e-530a-4393-a6ac-df044230a27b
2015-04-09 11:34:05,016 DEBUG results for incident_id=98f67b6e-530a-4393-a6ac-df044230a27b written to collection.
2015-04-09 11:34:05,017 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604440_78 incident_id=98f67b6e-530a-4393-a6ac-df044230a27b result_id=0 written to collection incident_results
2015-04-09 11:34:05,017 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:34:05,024 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:34:05,024 INFO Alert handler finished. duration=0.791s
2015-04-09 11:34:05,047 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=20d830ff-e71a-464d-8b5b-a3dabb60357a
2015-04-09 11:34:05,075 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:34:05,075 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:34:05,075 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:34:05,075 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:34:05,119 DEBUG Create event will be: time=2015-04-09T11:34:05.119602 severity=INFO origin="alert_handler" event_id="eab6ebecfbacfab18c8618f8b49098db" user="splunk-system-user" action="create" alert="Test Alert" incident_id="20d830ff-e71a-464d-8b5b-a3dabb60357a" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604440_79" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604441"
2015-04-09 11:34:05,126 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604440_79 with incident_id=20d830ff-e71a-464d-8b5b-a3dabb60357a
2015-04-09 11:34:05,154 DEBUG results for incident_id=20d830ff-e71a-464d-8b5b-a3dabb60357a written to collection.
2015-04-09 11:34:05,154 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604440_79 incident_id=20d830ff-e71a-464d-8b5b-a3dabb60357a result_id=0 written to collection incident_results
2015-04-09 11:34:05,154 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:34:05,161 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:34:05,161 INFO Alert handler finished. duration=0.785s
2015-04-09 11:35:04,766 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604500_86/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604500_86 sessionKey=OhmT8ycYbu3MMhusPnkx^GN^nQVDLSnw9X8sEE1YvdiyPTMnwl5p^wPxaiu1Qvrzj^kd7mUI6B0geCEoWyYnTcGe_1KaVmx5nUUhxLYNzUm3ju2A2viSCAuRY74vIIWLa0OG1U0vu7Tm alert=TO Failed Login Alert
2015-04-09 11:35:04,772 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604500_86' has been fired.
2015-04-09 11:35:04,975 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604500_88/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604500_88 sessionKey=vfKTHpFNmcFu9cMG2qWCBE4gNdQJqBham_JIuntubbQYtpPeGtYKbaCEg5zjKvY6sp3UetydPHF3bA_KpmO3cec_Gieta^R1xqTSOpnIH5gqn1eXDTdeoUaVhrKPHNTzB8yEtz_YVW3 alert=Test Alert
2015-04-09 11:35:04,981 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604500_88' has been fired.
2015-04-09 11:35:05,054 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 11:35:05,055 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:35:05,060 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:35:05,060 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:35:05,061 DEBUG Incident config after getting settings: {"category": "unknown", "_key": "5526b8b13403e13421162b0c", "tags": "[Untagged]", "alert_script": "", "auto_assign_user": "", "alert": "TO Failed Login Alert", "urgency": "low", "subcategory": "unknown", "run_alert_script": false, "_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "auto_previous_resolve": false, "auto_assign_owner": "unassigned"}
2015-04-09 11:35:05,068 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:35:05,086 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:35:05,086 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:35:05,113 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:35:05,113 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:35:05,113 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:35:05,113 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:35:05,113 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604500_86
2015-04-09 11:35:05,263 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 11:35:05,264 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:35:05,269 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:35:05,270 INFO Found incident settings for Test Alert
2015-04-09 11:35:05,270 DEBUG Incident config after getting settings: {"auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "tags": "[Untagged]", "run_alert_script": false, "alert": "Test Alert", "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "urgency": "low", "auto_assign_owner": "unassigned", "category": "unknown"}
2015-04-09 11:35:05,278 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:35:05,295 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:35:05,295 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:35:05,320 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:35:05,320 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:35:05,320 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:35:05,320 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:35:05,320 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604500_88
2015-04-09 11:35:05,419 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=c80c16f3-8596-45bf-b044-257ff64bb032
2015-04-09 11:35:05,450 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:35:05,451 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:35:05,451 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:35:05,451 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:35:05,495 DEBUG Create event will be: time=2015-04-09T11:35:05.495114 severity=INFO origin="alert_handler" event_id="9cb50fffcc29c441c28b9890651c7fbd" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="c80c16f3-8596-45bf-b044-257ff64bb032" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604500_86" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604501"
2015-04-09 11:35:05,502 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604500_86 with incident_id=c80c16f3-8596-45bf-b044-257ff64bb032
2015-04-09 11:35:05,528 DEBUG results for incident_id=c80c16f3-8596-45bf-b044-257ff64bb032 written to collection.
2015-04-09 11:35:05,528 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604500_86 incident_id=c80c16f3-8596-45bf-b044-257ff64bb032 result_id=0 written to collection incident_results
2015-04-09 11:35:05,529 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:35:05,537 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:35:05,537 INFO Alert handler finished. duration=0.772s
2015-04-09 11:35:05,634 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=6aee2cad-1dfb-43ea-85c1-9de4ad55e7d2
2015-04-09 11:35:05,665 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:35:05,665 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:35:05,665 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:35:05,665 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:35:05,699 DEBUG Create event will be: time=2015-04-09T11:35:05.699935 severity=INFO origin="alert_handler" event_id="cefe94804cd8180419a80cd6736ba678" user="splunk-system-user" action="create" alert="Test Alert" incident_id="6aee2cad-1dfb-43ea-85c1-9de4ad55e7d2" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604500_88" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604501"
2015-04-09 11:35:05,706 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604500_88 with incident_id=6aee2cad-1dfb-43ea-85c1-9de4ad55e7d2
2015-04-09 11:35:05,734 DEBUG results for incident_id=6aee2cad-1dfb-43ea-85c1-9de4ad55e7d2 written to collection.
2015-04-09 11:35:05,734 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604500_88 incident_id=6aee2cad-1dfb-43ea-85c1-9de4ad55e7d2 result_id=0 written to collection incident_results
2015-04-09 11:35:05,734 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:35:05,741 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:35:05,741 INFO Alert handler finished. duration=0.766s
2015-04-09 11:36:04,076 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604560_92/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604560_92 sessionKey=mfg78yWyBvN8iRlrkVbXwyZYZLZBtB0uSiPONyGVbXj5gEgnvzWFO0NyT4cIwQctr5xFbio9uZ3VTeHIr3d0adPj23RgXm2UhQ9Sv4SQqlO0AsiLlppPGhMsNoQ5Myi0mqt5F81^hc8mYpF alert=Test Alert
2015-04-09 11:36:04,081 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604560_92' has been fired.
2015-04-09 11:36:04,233 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604560_91/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604560_91 sessionKey=zPG7hlSrQ5G3mJcluIupxX^8QWP5sQbagrV2Vg9k4loIj0qTnkzDm69ZcOwP7MrRW6bh4SnvKP21Kzz_NNO5oVN6MVrz3cbxURq0jh9NJVAjNi9EjfLOGCQj68YHOfMNLiPGTqsSuT2p0p0t alert=TO Failed Login Alert
2015-04-09 11:36:04,239 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604560_91' has been fired.
2015-04-09 11:36:04,354 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf"}
2015-04-09 11:36:04,354 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:36:04,360 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:36:04,360 INFO Found incident settings for Test Alert
2015-04-09 11:36:04,360 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "_user": "nobody", "auto_assign": false, "auto_assign_user": "", "category": "unknown", "subcategory": "unknown", "alert": "Test Alert", "tags": "[Untagged]", "urgency": "low", "auto_previous_resolve": false, "alert_script": "", "_key": "5526c2413403e135d146268b", "auto_assign_owner": "unassigned", "run_alert_script": false}
2015-04-09 11:36:04,369 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:36:04,387 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:36:04,387 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:36:04,413 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:36:04,413 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:36:04,414 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:36:04,414 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:36:04,414 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604560_92
2015-04-09 11:36:04,516 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-09 11:36:04,516 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:36:04,522 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:36:04,522 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:36:04,522 DEBUG Incident config after getting settings: {"auto_assign": false, "_key": "5526b8b13403e13421162b0c", "_user": "nobody", "auto_assign_owner": "unassigned", "alert_script": "", "subcategory": "unknown", "tags": "[Untagged]", "urgency": "low", "alert": "TO Failed Login Alert", "category": "unknown", "run_alert_script": false, "auto_previous_resolve": false, "auto_ttl_resolve": false, "auto_assign_user": ""}
2015-04-09 11:36:04,530 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:36:04,549 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:36:04,550 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:36:04,575 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:36:04,576 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:36:04,576 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:36:04,576 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:36:04,576 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604560_91
2015-04-09 11:36:04,714 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a24da01e-8ef1-43af-8c1a-aa924e1c9350
2015-04-09 11:36:04,740 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:36:04,740 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:36:04,740 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:36:04,740 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:36:04,774 DEBUG Create event will be: time=2015-04-09T11:36:04.774096 severity=INFO origin="alert_handler" event_id="7641096ec17f6bf53f19528adb28ee7e" user="splunk-system-user" action="create" alert="Test Alert" incident_id="a24da01e-8ef1-43af-8c1a-aa924e1c9350" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604560_92" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604561"
2015-04-09 11:36:04,780 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604560_92 with incident_id=a24da01e-8ef1-43af-8c1a-aa924e1c9350
2015-04-09 11:36:04,808 DEBUG results for incident_id=a24da01e-8ef1-43af-8c1a-aa924e1c9350 written to collection.
2015-04-09 11:36:04,808 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604560_92 incident_id=a24da01e-8ef1-43af-8c1a-aa924e1c9350 result_id=0 written to collection incident_results
2015-04-09 11:36:04,808 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:36:04,815 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:36:04,815 INFO Alert handler finished. duration=0.74s
2015-04-09 11:36:04,874 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=7c9210cc-eebb-4fa0-9904-1090c9c9eaf8
2015-04-09 11:36:04,900 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:36:04,901 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:36:04,901 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:36:04,901 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:36:04,945 DEBUG Create event will be: time=2015-04-09T11:36:04.945908 severity=INFO origin="alert_handler" event_id="4c4adff9dbc2bf6bffd7c560e1eb3f2d" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="7c9210cc-eebb-4fa0-9904-1090c9c9eaf8" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604560_91" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604561"
2015-04-09 11:36:04,952 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604560_91 with incident_id=7c9210cc-eebb-4fa0-9904-1090c9c9eaf8
2015-04-09 11:36:04,980 DEBUG results for incident_id=7c9210cc-eebb-4fa0-9904-1090c9c9eaf8 written to collection.
2015-04-09 11:36:04,980 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604560_91 incident_id=7c9210cc-eebb-4fa0-9904-1090c9c9eaf8 result_id=0 written to collection incident_results
2015-04-09 11:36:04,980 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:36:04,987 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:36:04,987 INFO Alert handler finished. duration=0.754s
2015-04-09 11:37:04,282 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604620_95/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604620_95 sessionKey=0Du2xMTdraFbEXdgKjNzd5wlhS26KOXl8eEigFX16wP5gkf^JBoWtuutu2ez4uTL_aAxazlq6^d7Vu5koHg^4eFfTjhIc5lqn33z^SgMryBMBptk1HEsQJ2m8Itkhtr25s6wrRdoOCdp58F alert=Test Alert
2015-04-09 11:37:04,288 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604620_95' has been fired.
2015-04-09 11:37:04,342 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604620_94/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604620_94 sessionKey=XxKk9cgy8wNrdLVJzDs46ykeVpoDWQEFAH7pigCvQsrVl1yO1Oc7LPgyLDqTokSntUzXJ55Ifo4L6pi^KLBAz30MoKKOI3bGwMwcND9ze7vyMFtZRXBLKgmutH4lHfLVB0L^7zh^7zmqbN alert=TO Failed Login Alert
2015-04-09 11:37:04,348 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604620_94' has been fired.
2015-04-09 11:37:04,557 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 11:37:04,558 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:37:04,564 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:37:04,564 INFO Found incident settings for Test Alert
2015-04-09 11:37:04,564 DEBUG Incident config after getting settings: {"tags": "[Untagged]", "category": "unknown", "auto_assign_user": "", "alert_script": "", "auto_assign_owner": "unassigned", "run_alert_script": false, "_user": "nobody", "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "urgency": "low", "auto_assign": false, "auto_ttl_resolve": false, "alert": "Test Alert", "auto_previous_resolve": false}
2015-04-09 11:37:04,571 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:37:04,588 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:37:04,588 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:37:04,616 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:37:04,616 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:37:04,616 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:37:04,616 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:37:04,617 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604620_95
2015-04-09 11:37:04,619 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low"}
2015-04-09 11:37:04,619 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:37:04,626 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:37:04,627 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:37:04,627 DEBUG Incident config after getting settings: {"auto_assign_user": "", "_key": "5526b8b13403e13421162b0c", "auto_assign_owner": "unassigned", "_user": "nobody", "subcategory": "unknown", "run_alert_script": false, "category": "unknown", "alert": "TO Failed Login Alert", "alert_script": "", "auto_previous_resolve": false, "auto_assign": false, "auto_ttl_resolve": false, "urgency": "low", "tags": "[Untagged]"}
2015-04-09 11:37:04,635 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:37:04,652 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:37:04,652 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:37:04,678 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:37:04,678 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:37:04,678 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:37:04,678 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:37:04,678 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604620_94
2015-04-09 11:37:04,916 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=63552797-d39b-475d-9121-6d85faaaa196
2015-04-09 11:37:04,947 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:37:04,947 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:37:04,947 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:37:04,947 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:37:04,972 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=02aee537-9c87-4a23-86b9-329f77b4d7d2
2015-04-09 11:37:04,981 DEBUG Create event will be: time=2015-04-09T11:37:04.981630 severity=INFO origin="alert_handler" event_id="14fd7b2d36b3763b4ef1ca272dcd6612" user="splunk-system-user" action="create" alert="Test Alert" incident_id="63552797-d39b-475d-9121-6d85faaaa196" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604620_95" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604621"
2015-04-09 11:37:04,988 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604620_95 with incident_id=63552797-d39b-475d-9121-6d85faaaa196
2015-04-09 11:37:05,008 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:37:05,008 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:37:05,008 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:37:05,008 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:37:05,016 DEBUG results for incident_id=63552797-d39b-475d-9121-6d85faaaa196 written to collection.
2015-04-09 11:37:05,016 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604620_95 incident_id=63552797-d39b-475d-9121-6d85faaaa196 result_id=0 written to collection incident_results
2015-04-09 11:37:05,016 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:37:05,023 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:37:05,023 INFO Alert handler finished. duration=0.741s
2015-04-09 11:37:05,051 DEBUG Create event will be: time=2015-04-09T11:37:05.051092 severity=INFO origin="alert_handler" event_id="971a00de84f3b85e9014839e214435dd" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="02aee537-9c87-4a23-86b9-329f77b4d7d2" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604620_94" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604621"
2015-04-09 11:37:05,057 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604620_94 with incident_id=02aee537-9c87-4a23-86b9-329f77b4d7d2
2015-04-09 11:37:05,085 DEBUG results for incident_id=02aee537-9c87-4a23-86b9-329f77b4d7d2 written to collection.
2015-04-09 11:37:05,085 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604620_94 incident_id=02aee537-9c87-4a23-86b9-329f77b4d7d2 result_id=0 written to collection incident_results
2015-04-09 11:37:05,085 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:37:05,092 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:37:05,092 INFO Alert handler finished. duration=0.75s
2015-04-09 11:38:03,938 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604680_97/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604680_97 sessionKey=9ilm7SB8n9ocyW2iRcr_hN7ebAD_8kqiIqRsCbGhzgrQpBB3iBsZMYRLV8eRWnfVd3Mwuv7MNRWsfMt6dt30wk6nQozN_iE_3ezlm3vhPJGLeeZqUXL3tM9hYGbmvVhJd2GgpsxyqwDnDtA alert=Test Alert
2015-04-09 11:38:03,944 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604680_97' has been fired.
2015-04-09 11:38:04,211 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-09 11:38:04,211 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:38:04,217 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:38:04,217 INFO Found incident settings for Test Alert
2015-04-09 11:38:04,217 DEBUG Incident config after getting settings: {"alert_script": "", "auto_assign": false, "_key": "5526c2413403e135d146268b", "run_alert_script": false, "urgency": "low", "auto_previous_resolve": false, "alert": "Test Alert", "auto_assign_user": "", "category": "unknown", "auto_ttl_resolve": false, "_user": "nobody", "subcategory": "unknown", "auto_assign_owner": "unassigned", "tags": "[Untagged]"}
2015-04-09 11:38:04,225 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:38:04,242 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:38:04,242 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:38:04,266 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:38:04,266 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:38:04,267 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:38:04,267 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:38:04,267 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604680_97
2015-04-09 11:38:04,558 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=2c9cd68a-a709-4e06-bb10-a26c82584254
2015-04-09 11:38:04,583 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:38:04,583 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:38:04,583 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:38:04,583 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:38:04,611 DEBUG Create event will be: time=2015-04-09T11:38:04.611115 severity=INFO origin="alert_handler" event_id="50cda4ab577a5e60e09693e17ca4995a" user="splunk-system-user" action="create" alert="Test Alert" incident_id="2c9cd68a-a709-4e06-bb10-a26c82584254" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604680_97" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604681"
2015-04-09 11:38:04,617 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604680_97 with incident_id=2c9cd68a-a709-4e06-bb10-a26c82584254
2015-04-09 11:38:04,645 DEBUG results for incident_id=2c9cd68a-a709-4e06-bb10-a26c82584254 written to collection.
2015-04-09 11:38:04,645 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604680_97 incident_id=2c9cd68a-a709-4e06-bb10-a26c82584254 result_id=0 written to collection incident_results
2015-04-09 11:38:04,645 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:38:04,652 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:38:04,652 INFO Alert handler finished. duration=0.714s
2015-04-09 11:40:05,211 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604800_106/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604800_106 sessionKey=2HlBJGpRVWFPz_NXBmKjEqtsryYa8iR^Z6bk9eEmk8Yp125punNqwRhzXYuIyFEKJW^ox322hZ^7HPnOCejuYmNR11Wf5wowwKLko7nztJePM8SsasrRy8OfCmzml1QZP2KOL8SawCSqvJYO alert=TO Failed Login Alert
2015-04-09 11:40:05,217 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604800_106' has been fired.
2015-04-09 11:40:05,361 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604800_109/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604800_109 sessionKey=cEleBolFRh95Yrlg^SZwQFRU8bjanJQD_il0toUhRe4MmWNHWTkbzLaxti13Jx5UlKb1DDSiAklsEU_q2Xjyq2o15QySdVVuN2CGoHbEdzX5IPvAq^8xMyKhH7Ep9xPeX2x3ssWS alert=Test Alert
2015-04-09 11:40:05,367 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604800_109' has been fired.
2015-04-09 11:40:05,490 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low"}
2015-04-09 11:40:05,490 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:40:05,497 DEBUG Incident settings: [ { "alert" : "TO Failed Login Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:40:05,497 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:40:05,497 DEBUG Incident config after getting settings: {"auto_assign": false, "_user": "nobody", "subcategory": "unknown", "tags": "[Untagged]", "auto_previous_resolve": false, "alert_script": "", "alert": "TO Failed Login Alert", "urgency": "low", "auto_assign_owner": "unassigned", "_key": "5526b8b13403e13421162b0c", "auto_ttl_resolve": false, "auto_assign_user": "", "category": "unknown", "run_alert_script": false}
2015-04-09 11:40:05,504 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 11:40:05,522 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:40:05,523 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:40:05,549 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:40:05,549 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:40:05,549 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:40:05,549 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:40:05,549 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604800_106
2015-04-09 11:40:05,647 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"}
2015-04-09 11:40:05,648 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-09 11:40:05,654 DEBUG Incident settings: [ { "alert" : "Test Alert", "urgency" : "low", "auto_previous_resolve" : false, "auto_assign" : false, "run_alert_script" : false, "subcategory" : "unknown", "auto_assign_owner" : "unassigned", "tags" : "[Untagged]", "auto_ttl_resolve" : false, "category" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-09 11:40:05,654 INFO Found incident settings for Test Alert
2015-04-09 11:40:05,654 DEBUG Incident config after getting settings: {"run_alert_script": false, "auto_ttl_resolve": false, "urgency": "low", "auto_previous_resolve": false, "tags": "[Untagged]", "auto_assign_owner": "unassigned", "category": "unknown", "alert_script": "", "auto_assign": false, "auto_assign_user": "", "alert": "Test Alert", "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "_user": "nobody"}
2015-04-09 11:40:05,662 INFO Found job for alert Test Alert. Context is 'search' with 3 results.
2015-04-09 11:40:05,680 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:40:05,681 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:40:05,706 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:40:05,706 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:40:05,706 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:40:05,706 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:40:05,706 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604800_109
2015-04-09 11:40:05,864 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=dc9bdfc5-5ff6-4980-816f-8fb136e7daad | |
2015-04-09 11:40:05,891 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 11:40:05,891 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:40:05,891 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:40:05,891 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:40:05,917 DEBUG Create event will be: time=2015-04-09T11:40:05.916949 severity=INFO origin="alert_handler" event_id="9c70ca5ba5a847b5b676ec78dfddb16d" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="dc9bdfc5-5ff6-4980-816f-8fb136e7daad" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604800_106" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604801"
2015-04-09 11:40:05,923 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604800_106 with incident_id=dc9bdfc5-5ff6-4980-816f-8fb136e7daad
2015-04-09 11:40:05,951 DEBUG results for incident_id=dc9bdfc5-5ff6-4980-816f-8fb136e7daad written to collection.
2015-04-09 11:40:05,951 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428604800_106 incident_id=dc9bdfc5-5ff6-4980-816f-8fb136e7daad result_id=0 written to collection incident_results
2015-04-09 11:40:05,951 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:40:05,958 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:40:05,958 INFO Alert handler finished. duration=0.748s
2015-04-09 11:40:06,019 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=d56880e1-fbb9-4eb3-8e03-f42841a49f51
2015-04-09 11:40:06,044 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:40:06,045 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:40:06,045 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-09 11:40:06,045 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:40:06,088 DEBUG Create event will be: time=2015-04-09T11:40:06.088777 severity=INFO origin="alert_handler" event_id="3c032b28f3b014f20de13923f0dd3159" user="splunk-system-user" action="create" alert="Test Alert" incident_id="d56880e1-fbb9-4eb3-8e03-f42841a49f51" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604800_109" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604801"
2015-04-09 11:40:06,095 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604800_109 with incident_id=d56880e1-fbb9-4eb3-8e03-f42841a49f51
2015-04-09 11:40:06,123 DEBUG results for incident_id=d56880e1-fbb9-4eb3-8e03-f42841a49f51 written to collection.
2015-04-09 11:40:06,123 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428604800_109 incident_id=d56880e1-fbb9-4eb3-8e03-f42841a49f51 result_id=0 written to collection incident_results
2015-04-09 11:40:06,123 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:40:06,130 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:40:06,130 INFO Alert handler finished. duration=0.769s
2015-04-09 11:40:06,254 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604800_110/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604800_110 sessionKey=F0e^o1rZjL3NyBYtHrGXYl88prVd^I01jqq9b4iFogbMXfS24hMcaMvgzWqnz9bsbr_KUJAQrJvlL^7HOeUi9nBAZiu1ZGhhlCWSkLbQe3ZuLTdurra_J3iBwuWagBS2_7z3w38eDmK7 alert=To Global Failed Login >3 Alert
2015-04-09 11:40:06,260 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604800_110' has been fired.
2015-04-09 11:40:06,526 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 11:40:06,527 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 11:40:06,532 DEBUG Incident settings: [ { "alert" : "To Global Failed Login >3 Alert", "auto_previous_resolve" : false, "auto_assign" : false, "auto_assign_owner" : "unassigned", "auto_ttl_resolve" : false, "urgency" : "low", "run_alert_script" : false, "subcategory" : "unknown", "tags" : "[Untagged]", "category" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 11:40:06,532 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 11:40:06,533 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "auto_assign": false, "auto_ttl_resolve": false, "_user": "nobody", "alert_script": "", "tags": "[Untagged]", "category": "unknown", "run_alert_script": false, "_key": "5526b8b13403e13421162b0d", "auto_assign_owner": "unassigned", "urgency": "low", "alert": "To Global Failed Login >3 Alert", "auto_assign_user": "", "subcategory": "unknown"}
2015-04-09 11:40:06,540 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results.
2015-04-09 11:40:06,557 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:40:06,557 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:40:06,582 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:40:06,582 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:40:06,582 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:40:06,582 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:40:06,582 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604800_110
2015-04-09 11:40:06,873 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=6c43709f-fb28-4f47-bfca-5c02f1141648
2015-04-09 11:40:06,899 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:40:06,899 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:40:06,899 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-09 11:40:06,900 DEBUG Matched priority in lookup, returning value=low
2015-04-09 11:40:06,943 DEBUG Create event will be: time=2015-04-09T11:40:06.943385 severity=INFO origin="alert_handler" event_id="aa3f44725d9d7fc161207bb69cf9e509" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="6c43709f-fb28-4f47-bfca-5c02f1141648" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604800_110" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428604801"
2015-04-09 11:40:06,949 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604800_110 with incident_id=6c43709f-fb28-4f47-bfca-5c02f1141648
2015-04-09 11:40:06,977 DEBUG results for incident_id=6c43709f-fb28-4f47-bfca-5c02f1141648 written to collection.
2015-04-09 11:40:06,978 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428604800_110 incident_id=6c43709f-fb28-4f47-bfca-5c02f1141648 result_id=0 written to collection incident_results
2015-04-09 11:40:06,978 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:40:06,985 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:40:06,985 INFO Alert handler finished. duration=0.731s
2015-04-09 11:50:05,445 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428605400_164/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428605400_164 sessionKey=Xujyoe3Js1PhyZvx9XNEg4QXIRble5zzgVYysyLVPXZGuCX5sWetowJvvfeEnhUpAWmbhsRqeY1nInaFUSyfT4qSk41U8YJTQUrsLbMypPbg4Oio1lr1XllgU1YDDTDt3DCSKYXKTfEJJdDypo alert=TO Failed Login Alert
2015-04-09 11:50:05,451 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428605400_164' has been fired.
2015-04-09 11:50:05,466 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428605400_167/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428605400_167 sessionKey=QNHLB0ifVvqbqlAaRJ1u1Pl79IXSaK9qbBqgKV46Go0YEq_AzXMkOCepD4ZH04drZw2NIpH4WANMjYxvNZMvLbSqKeiAueViaACY92nHcp6vnE_UWFUiKXOWxjS3bnWskuWfgnYUz^8UIC alert=To Global Failed Login >3 Alert
2015-04-09 11:50:05,472 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428605400_167' has been fired.
2015-04-09 11:50:05,728 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_priority": "low"}
2015-04-09 11:50:05,728 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 11:50:05,737 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 11:50:05,737 INFO Found incident settings for TO Failed Login Alert
2015-04-09 11:50:05,737 DEBUG Incident config after getting settings: {"urgency": "high", "auto_assign": false, "category": "Active Directory", "_key": "5526b8b13403e13421162b0c", "tags": "AD", "run_alert_script": false, "alert": "TO Failed Login Alert", "auto_previous_resolve": false, "alert_script": "", "auto_assign_user": "", "_user": "nobody", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "subcategory": "unknown"}
2015-04-09 11:50:05,747 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 4 results.
2015-04-09 11:50:05,752 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-09 11:50:05,753 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 11:50:05,759 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 11:50:05,759 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 11:50:05,760 DEBUG Incident config after getting settings: {"urgency": "high", "subcategory": "unknown", "_user": "nobody", "run_alert_script": false, "category": "Active Directory", "tags": "AD", "alert_script": "", "auto_assign_user": "", "auto_previous_resolve": false, "alert": "To Global Failed Login >3 Alert", "_key": "5526b8b13403e13421162b0d", "auto_ttl_resolve": false, "auto_assign": false, "auto_assign_owner": "unassigned"}
2015-04-09 11:50:05,768 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 4 results.
2015-04-09 11:50:05,770 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:50:05,770 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:50:05,788 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 11:50:05,788 DEBUG Transformed 24h into 86400 seconds
2015-04-09 11:50:05,807 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:50:05,807 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:50:05,807 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:50:05,808 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:50:05,808 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428605400_164
2015-04-09 11:50:05,821 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 11:50:05,821 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 11:50:05,822 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 11:50:05,822 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 11:50:05,822 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428605400_167
2015-04-09 11:50:06,112 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=624b2d98-14df-43b9-9765-fac36e8662e0
2015-04-09 11:50:06,129 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=fcc2e619-c221-4e05-8b1c-75a489b0bf85
2015-04-09 11:50:06,146 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:50:06,146 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:50:06,146 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 11:50:06,146 DEBUG Matched priority in lookup, returning value=high
2015-04-09 11:50:06,162 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 11:50:06,162 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 11:50:06,162 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 11:50:06,163 DEBUG Matched priority in lookup, returning value=high
2015-04-09 11:50:06,185 DEBUG Create event will be: time=2015-04-09T11:50:06.185749 severity=INFO origin="alert_handler" event_id="4013d57033b61c0263a1b01c3f1c29f0" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="fcc2e619-c221-4e05-8b1c-75a489b0bf85" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428605400_167" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428605401"
2015-04-09 11:50:06,187 DEBUG Create event will be: time=2015-04-09T11:50:06.187492 severity=INFO origin="alert_handler" event_id="f5e728746d9ec28b42db2b41ba85109e" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="624b2d98-14df-43b9-9765-fac36e8662e0" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428605400_164" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428605401"
2015-04-09 11:50:06,193 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428605400_167 with incident_id=fcc2e619-c221-4e05-8b1c-75a489b0bf85
2015-04-09 11:50:06,194 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428605400_164 with incident_id=624b2d98-14df-43b9-9765-fac36e8662e0
2015-04-09 11:50:06,227 DEBUG results for incident_id=fcc2e619-c221-4e05-8b1c-75a489b0bf85 written to collection.
2015-04-09 11:50:06,227 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428605400_167 incident_id=fcc2e619-c221-4e05-8b1c-75a489b0bf85 result_id=0 written to collection incident_results
2015-04-09 11:50:06,227 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:50:06,235 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:50:06,235 INFO Alert handler finished. duration=0.769s
2015-04-09 11:50:06,262 DEBUG results for incident_id=624b2d98-14df-43b9-9765-fac36e8662e0 written to collection.
2015-04-09 11:50:06,263 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428605400_164 incident_id=624b2d98-14df-43b9-9765-fac36e8662e0 result_id=0 written to collection incident_results
2015-04-09 11:50:06,263 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 11:50:06,270 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 11:50:06,270 INFO Alert handler finished. duration=0.826s
2015-04-09 12:00:06,847 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606000_217/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606000_217 sessionKey=dyvQm6nOSpO4NDgU9dL1LpPUY_lH80g4W8u7x4YlwBzFuG8t4UGj51upcn7SZXsRkW94h4VPKKInWkgIrMJtNKVT0I0lJOrXJGl2kmIJuP38UOn18zyMTSJt82ImqMP5AR8ocRmr3znH_NFIEC alert=TO Failed Login Alert
2015-04-09 12:00:06,853 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606000_217' has been fired.
2015-04-09 12:00:06,944 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606000_214/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606000_214 sessionKey=0i8YW8tGcB7BEQQ1Y9SXrB2QAVdDpZUtyuQt1FQ3LFqr1fIf1JcBt7UMc97W1LuxYSjgkceEgh3Cf12Y9jpjVhET_eyEmA1CJKzGV96SppAnEWtTd^kIoRGRzqKfEPyEhyrjsXUxzODUxR8l alert=To Global Failed Login >3 Alert
2015-04-09 12:00:06,950 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606000_214' has been fired.
2015-04-09 12:00:07,131 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low"}
2015-04-09 12:00:07,131 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 12:00:07,137 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 12:00:07,138 INFO Found incident settings for TO Failed Login Alert
2015-04-09 12:00:07,138 DEBUG Incident config after getting settings: {"alert": "TO Failed Login Alert", "auto_assign_user": "", "subcategory": "unknown", "urgency": "high", "_user": "nobody", "auto_assign_owner": "unassigned", "category": "Active Directory", "run_alert_script": false, "auto_previous_resolve": false, "tags": "AD", "auto_assign": false, "auto_ttl_resolve": false, "_key": "5526b8b13403e13421162b0c", "alert_script": ""}
2015-04-09 12:00:07,146 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 12:00:07,167 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:00:07,168 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:00:07,199 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:00:07,200 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:00:07,200 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:00:07,200 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:00:07,200 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606000_217
2015-04-09 12:00:07,235 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf"}
2015-04-09 12:00:07,235 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 12:00:07,241 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 12:00:07,241 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 12:00:07,241 DEBUG Incident config after getting settings: {"alert": "To Global Failed Login >3 Alert", "auto_ttl_resolve": false, "subcategory": "unknown", "_key": "5526b8b13403e13421162b0d", "auto_previous_resolve": false, "_user": "nobody", "category": "Active Directory", "alert_script": "", "auto_assign": false, "tags": "AD", "urgency": "high", "auto_assign_owner": "unassigned", "run_alert_script": false, "auto_assign_user": ""}
2015-04-09 12:00:07,249 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results.
2015-04-09 12:00:07,267 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:00:07,267 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:00:07,295 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:00:07,295 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:00:07,296 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:00:07,296 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:00:07,296 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606000_214
2015-04-09 12:00:07,511 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=0049a6f1-f941-4e78-89e2-4425caedd0ad
2015-04-09 12:00:07,547 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:00:07,547 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:00:07,547 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 12:00:07,548 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:00:07,579 DEBUG Create event will be: time=2015-04-09T12:00:07.579598 severity=INFO origin="alert_handler" event_id="0ad8e14f0722d9e5c2da4eb9b06c024b" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="0049a6f1-f941-4e78-89e2-4425caedd0ad" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606000_217" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428606001"
2015-04-09 12:00:07,588 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606000_217 with incident_id=0049a6f1-f941-4e78-89e2-4425caedd0ad
2015-04-09 12:00:07,627 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=f22727e6-a907-4be8-b56e-bc3bdba9852f
2015-04-09 12:00:07,649 DEBUG results for incident_id=0049a6f1-f941-4e78-89e2-4425caedd0ad written to collection.
2015-04-09 12:00:07,650 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606000_217 incident_id=0049a6f1-f941-4e78-89e2-4425caedd0ad result_id=0 written to collection incident_results
2015-04-09 12:00:07,650 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:00:07,661 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:00:07,662 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:00:07,662 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:00:07,662 INFO Alert handler finished. duration=0.816s
2015-04-09 12:00:07,662 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 12:00:07,662 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:00:07,686 DEBUG Create event will be: time=2015-04-09T12:00:07.686661 severity=INFO origin="alert_handler" event_id="cdb96243b074ef6937e62467ccc062c4" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="f22727e6-a907-4be8-b56e-bc3bdba9852f" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606000_214" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428606001"
2015-04-09 12:00:07,697 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606000_214 with incident_id=f22727e6-a907-4be8-b56e-bc3bdba9852f
2015-04-09 12:00:07,762 DEBUG results for incident_id=f22727e6-a907-4be8-b56e-bc3bdba9852f written to collection.
2015-04-09 12:00:07,762 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606000_214 incident_id=f22727e6-a907-4be8-b56e-bc3bdba9852f result_id=0 written to collection incident_results
2015-04-09 12:00:07,762 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:00:07,771 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:00:07,772 INFO Alert handler finished. duration=0.828s
2015-04-09 12:10:06,054 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606600_278/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606600_278 sessionKey=35kSNsObP92^LxcETESkSi1prJDARoRpcj97_Kp2dollVcEYpftWcdrABlHU1et82ySZV9rEDp5eTEYT4vtvm7TafhJk9lhmFNDES96eCzGpswvlmA^wyCwcLqETBXzAf42DtUUKire4nN alert=TO Failed Login Alert
2015-04-09 12:10:06,060 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606600_278' has been fired.
2015-04-09 12:10:06,338 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 12:10:06,339 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 12:10:06,340 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606600_281/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606600_281 sessionKey=KGywnT9CiCWVCRy3qF46j0rerbr4rXTGRfkNuwOC11bHpS74Uj9m_wbZsdeD0Wa4gb^6iNqrWvhujURqv4udmIcgSv83aFpUa7SlvEgNv5GSlfTqwoMz2tL3O0jqRfp^shMIM8AHQDQm1xjQ alert=To Global Failed Login >3 Alert
2015-04-09 12:10:06,345 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 12:10:06,345 INFO Found incident settings for TO Failed Login Alert
2015-04-09 12:10:06,345 DEBUG Incident config after getting settings: {"auto_assign_user": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "_key": "5526b8b13403e13421162b0c", "run_alert_script": false, "subcategory": "unknown", "alert": "TO Failed Login Alert", "category": "Active Directory", "_user": "nobody", "alert_script": "", "tags": "AD", "urgency": "high", "auto_ttl_resolve": false, "auto_assign": false}
2015-04-09 12:10:06,346 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606600_281' has been fired.
2015-04-09 12:10:06,354 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 3 results.
2015-04-09 12:10:06,372 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:10:06,372 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:10:06,396 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:10:06,396 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:10:06,397 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:10:06,397 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:10:06,397 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606600_278
2015-04-09 12:10:06,618 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 12:10:06,618 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 12:10:06,625 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 12:10:06,625 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 12:10:06,625 DEBUG Incident config after getting settings: {"urgency": "high", "auto_ttl_resolve": false, "auto_assign": false, "_user": "nobody", "category": "Active Directory", "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "auto_assign_user": "", "alert_script": "", "subcategory": "unknown", "_key": "5526b8b13403e13421162b0d", "run_alert_script": false, "tags": "AD", "alert": "To Global Failed Login >3 Alert"}
2015-04-09 12:10:06,632 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 3 results.
2015-04-09 12:10:06,651 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:10:06,651 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:10:06,679 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:10:06,679 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:10:06,680 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:10:06,680 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:10:06,680 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606600_281
2015-04-09 12:10:06,696 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=4c3860d1-2e3f-4395-8be4-58e1bbfa7f92 | |
2015-04-09 12:10:06,722 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:10:06,722 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:10:06,722 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 12:10:06,723 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:10:06,759 DEBUG Create event will be: time=2015-04-09T12:10:06.759498 severity=INFO origin="alert_handler" event_id="1e152c4288f20903b2a957794cc1aa83" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="4c3860d1-2e3f-4395-8be4-58e1bbfa7f92" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606600_278" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428606601"
2015-04-09 12:10:06,766 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606600_278 with incident_id=4c3860d1-2e3f-4395-8be4-58e1bbfa7f92
2015-04-09 12:10:06,794 DEBUG results for incident_id=4c3860d1-2e3f-4395-8be4-58e1bbfa7f92 written to collection.
2015-04-09 12:10:06,794 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428606600_278 incident_id=4c3860d1-2e3f-4395-8be4-58e1bbfa7f92 result_id=0 written to collection incident_results
2015-04-09 12:10:06,794 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:10:06,801 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:10:06,802 INFO Alert handler finished. duration=0.748s
2015-04-09 12:10:06,986 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=1ce69d85-2fa6-4d29-896a-c19e03c004f7
2015-04-09 12:10:07,018 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:10:07,019 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:10:07,019 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 12:10:07,019 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:10:07,052 DEBUG Create event will be: time=2015-04-09T12:10:07.052799 severity=INFO origin="alert_handler" event_id="77dc5fee7703970e6b2a9f3eeb8b41fd" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="1ce69d85-2fa6-4d29-896a-c19e03c004f7" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606600_281" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428606601"
2015-04-09 12:10:07,060 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606600_281 with incident_id=1ce69d85-2fa6-4d29-896a-c19e03c004f7
2015-04-09 12:10:07,121 DEBUG results for incident_id=1ce69d85-2fa6-4d29-896a-c19e03c004f7 written to collection.
2015-04-09 12:10:07,121 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428606600_281 incident_id=1ce69d85-2fa6-4d29-896a-c19e03c004f7 result_id=0 written to collection incident_results
2015-04-09 12:10:07,122 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:10:07,130 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:10:07,130 INFO Alert handler finished. duration=0.79s
2015-04-09 12:20:06,189 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607200_342/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607200_342 sessionKey=HhiXAnXOtZSkcDYaU_eH_Z_xqy5LH0VQTG^5XhV4kkOTb5HvAjevLOLBHbKbx5Ys2gBX6OneAqEUPCESVGUTE5fxtrNXHpzV4E1RZ8ph^Sxpupvb^ozPKBpEtUFm3xbrYx1t3jyd5DUpUN alert=To Global Failed Login >3 Alert
2015-04-09 12:20:06,195 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607200_342' has been fired.
2015-04-09 12:20:06,470 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low"}
2015-04-09 12:20:06,470 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 12:20:06,477 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 12:20:06,477 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 12:20:06,477 DEBUG Incident config after getting settings: {"subcategory": "unknown", "alert": "To Global Failed Login >3 Alert", "auto_assign_owner": "unassigned", "auto_assign_user": "", "_key": "5526b8b13403e13421162b0d", "category": "Active Directory", "_user": "nobody", "run_alert_script": false, "auto_previous_resolve": false, "auto_ttl_resolve": false, "alert_script": "", "tags": "AD", "urgency": "high", "auto_assign": false}
2015-04-09 12:20:06,480 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607200_339/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607200_339 sessionKey=MLptVPnlofB_bJIh758vqnW6jhNNE8yZDYAxQAEqQMLsJET^diuBU4_V6hQX6mJWeUDhq7md1FrWzcRlVbvn4JqkUdIB3XUk9PrH^JxE55BGgPSs8HRnbbSEExtzlfalccFQalVVhgbU^2R alert=TO Failed Login Alert
2015-04-09 12:20:06,485 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 12:20:06,487 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607200_339' has been fired.
2015-04-09 12:20:06,506 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:20:06,506 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:20:06,532 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:20:06,532 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:20:06,532 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:20:06,532 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:20:06,533 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607200_342
2015-04-09 12:20:06,766 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 12:20:06,766 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 12:20:06,772 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 12:20:06,772 INFO Found incident settings for TO Failed Login Alert
2015-04-09 12:20:06,772 DEBUG Incident config after getting settings: {"_key": "5526b8b13403e13421162b0c", "auto_assign_owner": "unassigned", "auto_assign": false, "auto_ttl_resolve": false, "category": "Active Directory", "subcategory": "unknown", "auto_previous_resolve": false, "urgency": "high", "_user": "nobody", "auto_assign_user": "", "run_alert_script": false, "alert": "TO Failed Login Alert", "alert_script": "", "tags": "AD"}
2015-04-09 12:20:06,780 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 12:20:06,799 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:20:06,799 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:20:06,831 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:20:06,832 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:20:06,832 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:20:06,832 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:20:06,832 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607200_339
2015-04-09 12:20:06,845 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=8dc9091e-bc3c-48ee-944c-ca1598be495b
2015-04-09 12:20:06,874 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:20:06,874 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:20:06,875 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 12:20:06,875 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:20:06,922 DEBUG Create event will be: time=2015-04-09T12:20:06.922780 severity=INFO origin="alert_handler" event_id="6709e2aecad09432461c8ad01e3d410e" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="8dc9091e-bc3c-48ee-944c-ca1598be495b" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607200_342" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428607201"
2015-04-09 12:20:06,930 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607200_342 with incident_id=8dc9091e-bc3c-48ee-944c-ca1598be495b
2015-04-09 12:20:06,957 DEBUG results for incident_id=8dc9091e-bc3c-48ee-944c-ca1598be495b written to collection.
2015-04-09 12:20:06,957 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607200_342 incident_id=8dc9091e-bc3c-48ee-944c-ca1598be495b result_id=0 written to collection incident_results
2015-04-09 12:20:06,957 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:20:06,966 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:20:06,966 INFO Alert handler finished. duration=0.777s
2015-04-09 12:20:07,133 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=78a9cee9-a3b9-4605-a990-ed34cd1f5ff4
2015-04-09 12:20:07,163 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:20:07,163 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:20:07,163 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 12:20:07,163 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:20:07,197 DEBUG Create event will be: time=2015-04-09T12:20:07.197331 severity=INFO origin="alert_handler" event_id="42bba55114f22b1cfb63a4fce926a425" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="78a9cee9-a3b9-4605-a990-ed34cd1f5ff4" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607200_339" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428607201"
2015-04-09 12:20:07,205 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607200_339 with incident_id=78a9cee9-a3b9-4605-a990-ed34cd1f5ff4
2015-04-09 12:20:07,231 DEBUG results for incident_id=78a9cee9-a3b9-4605-a990-ed34cd1f5ff4 written to collection.
2015-04-09 12:20:07,231 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607200_339 incident_id=78a9cee9-a3b9-4605-a990-ed34cd1f5ff4 result_id=0 written to collection incident_results
2015-04-09 12:20:07,232 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:20:07,239 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:20:07,239 INFO Alert handler finished. duration=0.759s
2015-04-09 12:30:06,232 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607800_390/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607800_390 sessionKey=5J8MVNN18jbV^dRIwHvNhSPsxQa6HbCax6qu8KYXMtoR2gf8e0eUlRN1WSDEsuwsSyqRJYar^hmVBoFZ6UhFiTil2Y7UuzlEzcXbT5T9w7sOSOaQcTxbY1u9JI8jO9v2h8uqQ4NmAFOBWo alert=To Global Failed Login >3 Alert
2015-04-09 12:30:06,238 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607800_390' has been fired.
2015-04-09 12:30:06,510 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 12:30:06,511 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 12:30:06,517 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 12:30:06,517 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 12:30:06,517 DEBUG Incident config after getting settings: {"urgency": "high", "tags": "AD", "auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "auto_previous_resolve": false, "category": "Active Directory", "auto_assign_owner": "unassigned", "alert": "To Global Failed Login >3 Alert", "_key": "5526b8b13403e13421162b0d", "subcategory": "unknown", "auto_assign_user": "", "alert_script": "", "run_alert_script": false}
2015-04-09 12:30:06,525 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 12:30:06,544 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:30:06,544 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:30:06,571 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:30:06,571 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:30:06,571 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:30:06,571 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:30:06,571 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607800_390
2015-04-09 12:30:06,878 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=e3fed58e-8a07-450e-8c46-61e6255310bd
2015-04-09 12:30:06,905 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:30:06,905 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:30:06,906 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 12:30:06,906 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:30:06,940 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607800_393/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607800_393 sessionKey=reroGUMmzmm2AcC^_bjUzWO6OsKMHLazM3_651f^iK2KVUHID7OoA0u5F9g4d3DZm049u2CgGl4pPPTzPYC748hllgud1fNYZ7CprriDlDiCd1M_glJkeZjcDQRPIWBJFg5rL6fT_^15PF alert=TO Failed Login Alert
2015-04-09 12:30:06,946 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607800_393' has been fired.
2015-04-09 12:30:06,951 DEBUG Create event will be: time=2015-04-09T12:30:06.951528 severity=INFO origin="alert_handler" event_id="7e70ba56752281c773224f81686d821e" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="e3fed58e-8a07-450e-8c46-61e6255310bd" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607800_390" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428607801"
2015-04-09 12:30:06,959 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607800_390 with incident_id=e3fed58e-8a07-450e-8c46-61e6255310bd
2015-04-09 12:30:06,986 DEBUG results for incident_id=e3fed58e-8a07-450e-8c46-61e6255310bd written to collection.
2015-04-09 12:30:06,986 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428607800_390 incident_id=e3fed58e-8a07-450e-8c46-61e6255310bd result_id=0 written to collection incident_results
2015-04-09 12:30:06,986 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:30:06,994 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:30:06,994 INFO Alert handler finished. duration=0.763s
2015-04-09 12:30:07,224 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 12:30:07,225 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 12:30:07,231 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 12:30:07,231 INFO Found incident settings for TO Failed Login Alert
2015-04-09 12:30:07,231 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "tags": "AD", "auto_previous_resolve": false, "category": "Active Directory", "auto_assign": false, "_user": "nobody", "urgency": "high", "alert_script": "", "run_alert_script": false, "subcategory": "unknown", "auto_assign_user": "", "_key": "5526b8b13403e13421162b0c", "alert": "TO Failed Login Alert", "auto_assign_owner": "unassigned"}
2015-04-09 12:30:07,239 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 12:30:07,258 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:30:07,258 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:30:07,286 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:30:07,286 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:30:07,286 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:30:07,286 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:30:07,287 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607800_393
2015-04-09 12:30:07,601 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=6e7cc92e-7ce8-46d3-969a-ea963181c04c
2015-04-09 12:30:07,636 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:30:07,636 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:30:07,637 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 12:30:07,637 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:30:07,672 DEBUG Create event will be: time=2015-04-09T12:30:07.672552 severity=INFO origin="alert_handler" event_id="b049a36fca2c3d87832ef90cbf5bc969" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="6e7cc92e-7ce8-46d3-969a-ea963181c04c" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607800_393" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428607801"
2015-04-09 12:30:07,680 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607800_393 with incident_id=6e7cc92e-7ce8-46d3-969a-ea963181c04c
2015-04-09 12:30:07,752 DEBUG results for incident_id=6e7cc92e-7ce8-46d3-969a-ea963181c04c written to collection.
2015-04-09 12:30:07,752 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428607800_393 incident_id=6e7cc92e-7ce8-46d3-969a-ea963181c04c result_id=0 written to collection incident_results
2015-04-09 12:30:07,752 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:30:07,762 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:30:07,762 INFO Alert handler finished. duration=0.823s
2015-04-09 12:40:06,119 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428608400_447/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428608400_447 sessionKey=g8MjDW1RV85Z0rMcWBNSgzcuOpJ_s8wVVT^JFw5_Kcb6llSl6^UQN647O9cXa3WsSRL5dJ7AtGNvUGbrv5Jgiww2VG1B3M0J7Kd9EOXW1OZYnj62RPJZnmIkoid3HCLfRg8oMqvMsSjm alert=To Global Failed Login >3 Alert
2015-04-09 12:40:06,125 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428608400_447' has been fired.
2015-04-09 12:40:06,399 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"}
2015-04-09 12:40:06,399 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 12:40:06,405 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 12:40:06,406 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 12:40:06,406 DEBUG Incident config after getting settings: {"_user": "nobody", "run_alert_script": false, "auto_assign_user": "", "urgency": "high", "auto_previous_resolve": false, "alert_script": "", "alert": "To Global Failed Login >3 Alert", "tags": "AD", "auto_assign_owner": "unassigned", "auto_assign": false, "_key": "5526b8b13403e13421162b0d", "category": "Active Directory", "auto_ttl_resolve": false, "subcategory": "unknown"}
2015-04-09 12:40:06,413 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 12:40:06,431 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:40:06,431 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:40:06,457 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:40:06,457 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:40:06,457 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:40:06,457 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:40:06,458 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428608400_447
2015-04-09 12:40:06,616 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428608400_444/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428608400_444 sessionKey=eFns4KG14IeyGDieIS2mRzrYR^sQkJ99r0u^z4xn1erJqha2kydNw5WLBgNIbOvk9IH5KDLoDjmmmtfkvKWxNuwxwyN5IdBzXOcPccfxXyq^lG3UNB_u4cxqZFTkJfBcJFq0Z6g9l53 alert=TO Failed Login Alert
2015-04-09 12:40:06,622 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428608400_444' has been fired.
2015-04-09 12:40:06,756 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=9041b049-492e-47e4-bc12-05f0765b6241
2015-04-09 12:40:06,783 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:40:06,783 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:40:06,783 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 12:40:06,783 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:40:06,821 DEBUG Create event will be: time=2015-04-09T12:40:06.821599 severity=INFO origin="alert_handler" event_id="251ce56bb9d8e7379d2603985f95ca0c" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="9041b049-492e-47e4-bc12-05f0765b6241" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428608400_447" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428608401"
2015-04-09 12:40:06,828 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428608400_447 with incident_id=9041b049-492e-47e4-bc12-05f0765b6241
2015-04-09 12:40:06,855 DEBUG results for incident_id=9041b049-492e-47e4-bc12-05f0765b6241 written to collection.
2015-04-09 12:40:06,856 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428608400_447 incident_id=9041b049-492e-47e4-bc12-05f0765b6241 result_id=0 written to collection incident_results
2015-04-09 12:40:06,856 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:40:06,863 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:40:06,864 INFO Alert handler finished. duration=0.745s
2015-04-09 12:40:06,898 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 12:40:06,898 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 12:40:06,904 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 12:40:06,904 INFO Found incident settings for TO Failed Login Alert
2015-04-09 12:40:06,905 DEBUG Incident config after getting settings: {"subcategory": "unknown", "auto_ttl_resolve": false, "urgency": "high", "category": "Active Directory", "_key": "5526b8b13403e13421162b0c", "auto_assign": false, "_user": "nobody", "auto_assign_user": "", "auto_assign_owner": "unassigned", "tags": "AD", "alert_script": "", "auto_previous_resolve": false, "alert": "TO Failed Login Alert", "run_alert_script": false}
2015-04-09 12:40:06,912 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 12:40:06,930 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:40:06,930 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:40:06,959 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:40:06,959 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:40:06,959 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:40:06,959 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:40:06,959 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428608400_444
2015-04-09 12:40:07,249 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=43bd5f23-fd2c-490c-88c4-ecdb658b13cd
2015-04-09 12:40:07,277 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:40:07,278 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:40:07,278 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 12:40:07,278 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:40:07,300 DEBUG Create event will be: time=2015-04-09T12:40:07.300345 severity=INFO origin="alert_handler" event_id="06f4b115238df51e9ae491adc6bf43ad" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="43bd5f23-fd2c-490c-88c4-ecdb658b13cd" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428608400_444" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428608401"
2015-04-09 12:40:07,306 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428608400_444 with incident_id=43bd5f23-fd2c-490c-88c4-ecdb658b13cd
2015-04-09 12:40:07,334 DEBUG results for incident_id=43bd5f23-fd2c-490c-88c4-ecdb658b13cd written to collection.
2015-04-09 12:40:07,335 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428608400_444 incident_id=43bd5f23-fd2c-490c-88c4-ecdb658b13cd result_id=0 written to collection incident_results
2015-04-09 12:40:07,335 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:40:07,342 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:40:07,342 INFO Alert handler finished. duration=0.726s
2015-04-09 12:50:05,835 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609000_501/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609000_501 sessionKey=UdVqvq0qdiPcxvoYHcWBo9swO3sSX1hHE5B8xt1Vtcw2ANVae^v0SmpMGOlEjZzC3rOEY3WrnJiv^4mRuLnjeQGoAPBVqg16^JhrSYpDIuiNOcuYVQ552QCrOjGPTVw9mkb_OPjN^Vr alert=TO Failed Login Alert | |
2015-04-09 12:50:05,841 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609000_501' has been fired. | |
2015-04-09 12:50:06,033 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609000_504/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609000_504 sessionKey=BJ_HmCqrCPwYrXaZEM^2JU5HpGAl8^zzVAMF_vVYfSiN8X^rfVhWCdWl^xUt4lsSxpE4L8Ruje_7IKr4iTvNg7yMpsUY4TAYHDenwuUfpZdYBdh_hnU1^bksZ5UArmRD6BhAMYXpIN alert=To Global Failed Login >3 Alert
2015-04-09 12:50:06,039 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609000_504' has been fired.
2015-04-09 12:50:06,112 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned"}
2015-04-09 12:50:06,112 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 12:50:06,118 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 12:50:06,118 INFO Found incident settings for TO Failed Login Alert
2015-04-09 12:50:06,118 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign_user": "", "auto_assign": false, "_user": "nobody", "subcategory": "unknown", "category": "Active Directory", "tags": "AD", "alert": "TO Failed Login Alert", "auto_previous_resolve": false, "urgency": "high", "alert_script": "", "_key": "5526b8b13403e13421162b0c", "run_alert_script": false, "auto_assign_owner": "unassigned"}
2015-04-09 12:50:06,126 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 12:50:06,143 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:50:06,143 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:50:06,169 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:50:06,169 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:50:06,170 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:50:06,170 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:50:06,170 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609000_501
2015-04-09 12:50:06,311 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 12:50:06,311 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 12:50:06,318 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 12:50:06,318 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 12:50:06,319 DEBUG Incident config after getting settings: {"alert": "To Global Failed Login >3 Alert", "_user": "nobody", "_key": "5526b8b13403e13421162b0d", "tags": "AD", "run_alert_script": false, "urgency": "high", "alert_script": "", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "subcategory": "unknown", "auto_assign_user": "", "auto_assign": false, "auto_previous_resolve": false, "category": "Active Directory"}
2015-04-09 12:50:06,326 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 12:50:06,343 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 12:50:06,343 DEBUG Transformed 24h into 86400 seconds
2015-04-09 12:50:06,369 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 12:50:06,369 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 12:50:06,370 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 12:50:06,370 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 12:50:06,370 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609000_504
2015-04-09 12:50:06,467 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=0f757da4-61e6-48b3-96e2-9d5be3fc69fc
2015-04-09 12:50:06,498 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:50:06,498 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:50:06,499 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 12:50:06,499 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:50:06,531 DEBUG Create event will be: time=2015-04-09T12:50:06.531085 severity=INFO origin="alert_handler" event_id="c8e5040aeb2572562a5fc9dac07be5f8" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="0f757da4-61e6-48b3-96e2-9d5be3fc69fc" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609000_501" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428609001"
2015-04-09 12:50:06,537 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609000_501 with incident_id=0f757da4-61e6-48b3-96e2-9d5be3fc69fc
2015-04-09 12:50:06,565 DEBUG results for incident_id=0f757da4-61e6-48b3-96e2-9d5be3fc69fc written to collection.
2015-04-09 12:50:06,565 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609000_501 incident_id=0f757da4-61e6-48b3-96e2-9d5be3fc69fc result_id=0 written to collection incident_results
2015-04-09 12:50:06,565 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:50:06,572 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:50:06,572 INFO Alert handler finished. duration=0.738s
2015-04-09 12:50:06,669 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=7f2e5e48-ad90-414c-bb22-ba58cb0bbc1a
2015-04-09 12:50:06,696 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 12:50:06,697 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 12:50:06,697 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 12:50:06,697 DEBUG Matched priority in lookup, returning value=high
2015-04-09 12:50:06,737 DEBUG Create event will be: time=2015-04-09T12:50:06.737112 severity=INFO origin="alert_handler" event_id="12dcb08aa0a2e0b1d38989988a1d5080" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="7f2e5e48-ad90-414c-bb22-ba58cb0bbc1a" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609000_504" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428609001"
2015-04-09 12:50:06,743 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609000_504 with incident_id=7f2e5e48-ad90-414c-bb22-ba58cb0bbc1a
2015-04-09 12:50:06,771 DEBUG results for incident_id=7f2e5e48-ad90-414c-bb22-ba58cb0bbc1a written to collection.
2015-04-09 12:50:06,771 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609000_504 incident_id=7f2e5e48-ad90-414c-bb22-ba58cb0bbc1a result_id=0 written to collection incident_results
2015-04-09 12:50:06,771 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 12:50:06,778 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 12:50:06,779 INFO Alert handler finished. duration=0.746s
2015-04-09 13:00:06,362 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609600_551/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609600_551 sessionKey=WNL3fFQOT1CiASJDqh6Z6nj0_FsSzB585dNAQNCaPaBj4hsWkVkg1dl6nhxl9UdSSvhYChtQAxYI_1ZiECd6W8XMpHcY3ylpQf0ej32M7aarZpQ6VdED0fPgGJzRdlyKQjyJldajNmAe alert=To Global Failed Login >3 Alert
2015-04-09 13:00:06,368 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609600_551' has been fired.
2015-04-09 13:00:06,650 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"}
2015-04-09 13:00:06,651 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 13:00:06,657 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 13:00:06,657 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 13:00:06,658 DEBUG Incident config after getting settings: {"category": "Active Directory", "auto_assign": false, "alert": "To Global Failed Login >3 Alert", "auto_assign_user": "", "_key": "5526b8b13403e13421162b0d", "subcategory": "unknown", "alert_script": "", "run_alert_script": false, "auto_ttl_resolve": false, "urgency": "high", "auto_previous_resolve": false, "tags": "AD", "auto_assign_owner": "unassigned", "_user": "nobody"}
2015-04-09 13:00:06,665 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 13:00:06,682 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 13:00:06,682 DEBUG Transformed 24h into 86400 seconds
2015-04-09 13:00:06,705 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609600_554/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609600_554 sessionKey=DdZKY^_zJYMYrXnoN6egkTIQ9LatwcoNceBeMaZ1HLhovPRB_ihZZDKfDx0YJ1Nul8sutFbfXoiwUOUKQKFLlVaW7_9cT03qpodnGpPE9kkpKyj37iqXbb6HtVvhrOsden8qC4N2y6VMEC alert=TO Failed Login Alert
2015-04-09 13:00:06,710 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 13:00:06,710 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 13:00:06,710 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 13:00:06,710 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 13:00:06,710 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609600_551
2015-04-09 13:00:06,711 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609600_554' has been fired.
2015-04-09 13:00:07,003 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_owner": "unassigned", "default_priority": "low"}
2015-04-09 13:00:07,003 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 13:00:07,009 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 13:00:07,009 INFO Found incident settings for TO Failed Login Alert
2015-04-09 13:00:07,010 DEBUG Incident config after getting settings: {"alert": "TO Failed Login Alert", "urgency": "high", "run_alert_script": false, "category": "Active Directory", "auto_assign_user": "", "auto_ttl_resolve": false, "_user": "nobody", "auto_previous_resolve": false, "auto_assign": false, "alert_script": "", "tags": "AD", "_key": "5526b8b13403e13421162b0c", "subcategory": "unknown", "auto_assign_owner": "unassigned"}
2015-04-09 13:00:07,018 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 13:00:07,024 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=c3acfaa6-6997-4bdd-be14-b47254f5f9fe
2015-04-09 13:00:07,041 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 13:00:07,041 DEBUG Transformed 24h into 86400 seconds
2015-04-09 13:00:07,059 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 13:00:07,059 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 13:00:07,060 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 13:00:07,060 DEBUG Matched priority in lookup, returning value=high
2015-04-09 13:00:07,076 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 13:00:07,077 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 13:00:07,077 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 13:00:07,077 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 13:00:07,077 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609600_554
2015-04-09 13:00:07,103 DEBUG Create event will be: time=2015-04-09T13:00:07.103646 severity=INFO origin="alert_handler" event_id="5719a3086f71a0730518fda026f431f5" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="c3acfaa6-6997-4bdd-be14-b47254f5f9fe" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609600_551" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428609601"
2015-04-09 13:00:07,110 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609600_551 with incident_id=c3acfaa6-6997-4bdd-be14-b47254f5f9fe
2015-04-09 13:00:07,138 DEBUG results for incident_id=c3acfaa6-6997-4bdd-be14-b47254f5f9fe written to collection.
2015-04-09 13:00:07,138 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428609600_551 incident_id=c3acfaa6-6997-4bdd-be14-b47254f5f9fe result_id=0 written to collection incident_results
2015-04-09 13:00:07,138 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 13:00:07,147 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 13:00:07,147 INFO Alert handler finished. duration=0.785s
2015-04-09 13:00:07,383 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=35968e70-66a9-43c7-af84-1933da5aaca9
2015-04-09 13:00:07,412 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 13:00:07,412 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 13:00:07,413 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 13:00:07,413 DEBUG Matched priority in lookup, returning value=high
2015-04-09 13:00:07,448 DEBUG Create event will be: time=2015-04-09T13:00:07.448301 severity=INFO origin="alert_handler" event_id="70ee6d571029403782d300981910925e" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="35968e70-66a9-43c7-af84-1933da5aaca9" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609600_554" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428609601"
2015-04-09 13:00:07,455 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609600_554 with incident_id=35968e70-66a9-43c7-af84-1933da5aaca9
2015-04-09 13:00:07,483 DEBUG results for incident_id=35968e70-66a9-43c7-af84-1933da5aaca9 written to collection.
2015-04-09 13:00:07,483 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428609600_554 incident_id=35968e70-66a9-43c7-af84-1933da5aaca9 result_id=0 written to collection incident_results
2015-04-09 13:00:07,483 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 13:00:07,491 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 13:00:07,492 INFO Alert handler finished. duration=0.787s
2015-04-09 13:00:08,549 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428609600_556/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428609600_556 sessionKey=epSvHN5N5NA7qiYCNC4hoI8OBhY4ocetXGMbq6EddGI^iQ42b9EaGK35scnjj6q_MXmK7LZiHsks_kJWbcffuWPr8HZd0VgF3lj1uEN5TUxW4xXHZbOrq^W4Kk3t^UepjqEELFFeTqoYwuoIbN alert=TO AD Audit Rule Alert
2015-04-09 13:00:08,556 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428609600_556' has been fired.
2015-04-09 13:00:08,857 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"}
2015-04-09 13:00:08,857 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 13:00:08,864 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 13:00:08,864 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 13:00:08,865 DEBUG Incident config after getting settings: {"alert": "TO AD Audit Rule Alert", "subcategory": "unknown", "auto_assign_user": "", "_user": "nobody", "run_alert_script": false, "alert_script": "", "auto_previous_resolve": false, "_key": "5526b8b13403e13421162b09", "auto_assign_owner": "unassigned", "tags": "AD", "urgency": "medium", "auto_ttl_resolve": false, "category": "Active Directory", "auto_assign": false}
2015-04-09 13:00:08,873 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results.
2015-04-09 13:00:08,894 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 13:00:08,895 DEBUG Transformed 24h into 86400 seconds
2015-04-09 13:00:08,921 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 13:00:08,921 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 13:00:08,922 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 13:00:08,922 DEBUG Matched impact in lookup, returning value=low
2015-04-09 13:00:08,922 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428609600_556
2015-04-09 13:00:09,223 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=bf935b02-9008-46db-b0e4-ea813a21311b
2015-04-09 13:00:09,249 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 13:00:09,249 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 13:00:09,250 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-09 13:00:09,250 DEBUG Matched priority in lookup, returning value=low
2015-04-09 13:00:09,296 DEBUG Create event will be: time=2015-04-09T13:00:09.296659 severity=INFO origin="alert_handler" event_id="294beb91277cb2d125dc3d1d00329fda" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="bf935b02-9008-46db-b0e4-ea813a21311b" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428609600_556" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428609601"
2015-04-09 13:00:09,303 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428609600_556 with incident_id=bf935b02-9008-46db-b0e4-ea813a21311b
2015-04-09 13:00:09,331 DEBUG results for incident_id=bf935b02-9008-46db-b0e4-ea813a21311b written to collection.
2015-04-09 13:00:09,331 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428609600_556 incident_id=bf935b02-9008-46db-b0e4-ea813a21311b result_id=0 written to collection incident_results
2015-04-09 13:00:09,331 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 13:00:09,338 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 13:00:09,338 INFO Alert handler finished. duration=0.79s
2015-04-09 13:10:06,563 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428610200_618/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428610200_618 sessionKey=gdq9UMTdAsRcuQVVsvAdXUoeXRx0TSnlgqOOF_S3jr21yHIirABTsRCyMpQ1UWx0hK1ZNmL_3^p9Nr_FBuug85_wT2_gnhrQbWp8hmvIHi82ymG^6CMQb50^JcqBVE3TDOB^TlshnXKT alert=To Global Failed Login >3 Alert
2015-04-09 13:10:06,569 INFO alert_handler started because alert 'To Global Failed Login >3 Alert' with id 'scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428610200_618' has been fired.
2015-04-09 13:10:06,571 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428610200_615/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428610200_615 sessionKey=E1U10aOcEJ8jdSoWrWUQZxec9TJzNSlgMSzY2VnjFWh0Eonq9H8Xi1RiIsJY0srU5s^VKsx3gmu6zgT5ZO4VohYve9A68cJsfP0mow3V^uPg5KdqH3_yGqKXbofaovkRAICytuBMiG2AIo alert=TO Failed Login Alert
2015-04-09 13:10:06,577 INFO alert_handler started because alert 'TO Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428610200_615' has been fired.
2015-04-09 13:10:06,848 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"}
2015-04-09 13:10:06,848 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22To%20Global%20Failed%20Login%20%3E3%20Alert%22%7D
2015-04-09 13:10:06,855 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "To Global Failed Login >3 Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0d" } ]
2015-04-09 13:10:06,855 INFO Found incident settings for To Global Failed Login >3 Alert
2015-04-09 13:10:06,855 DEBUG Incident config after getting settings: {"tags": "AD", "alert_script": "", "auto_previous_resolve": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "_user": "nobody", "auto_assign": false, "_key": "5526b8b13403e13421162b0d", "alert": "To Global Failed Login >3 Alert", "run_alert_script": false, "category": "Active Directory", "subcategory": "unknown", "auto_ttl_resolve": false, "urgency": "high"}
2015-04-09 13:10:06,861 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_owner": "unassigned", "default_impact": "low"}
2015-04-09 13:10:06,861 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Failed%20Login%20Alert%22%7D
2015-04-09 13:10:06,864 INFO Found job for alert To Global Failed Login >3 Alert. Context is 'search' with 2 results.
2015-04-09 13:10:06,867 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Failed Login Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0c" } ]
2015-04-09 13:10:06,867 INFO Found incident settings for TO Failed Login Alert
2015-04-09 13:10:06,867 DEBUG Incident config after getting settings: {"alert": "TO Failed Login Alert", "category": "Active Directory", "run_alert_script": false, "subcategory": "unknown", "tags": "AD", "urgency": "high", "auto_ttl_resolve": false, "auto_assign": false, "_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "_key": "5526b8b13403e13421162b0c"}
2015-04-09 13:10:06,875 INFO Found job for alert TO Failed Login Alert. Context is 'search' with 2 results.
2015-04-09 13:10:06,882 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 13:10:06,883 DEBUG Transformed 24h into 86400 seconds
2015-04-09 13:10:06,894 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-09 13:10:06,895 DEBUG Transformed 24h into 86400 seconds
2015-04-09 13:10:06,918 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 13:10:06,918 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 13:10:06,918 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 13:10:06,918 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 13:10:06,918 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428610200_618
2015-04-09 13:10:06,928 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 13:10:06,928 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 13:10:06,928 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-09 13:10:06,929 DEBUG Matched impact in lookup, returning value=medium
2015-04-09 13:10:06,929 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428610200_615
2015-04-09 13:10:07,228 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=e6221414-b6c5-4945-92c6-8b27119e054d
2015-04-09 13:10:07,229 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=f46a2864-f059-4297-b7ca-53176eb424cd
2015-04-09 13:10:07,270 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 13:10:07,271 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 13:10:07,271 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'}
2015-04-09 13:10:07,271 DEBUG Matched priority in lookup, returning value=high
2015-04-09 13:10:07,275 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 13:10:07,275 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 13:10:07,275 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-09 13:10:07,276 DEBUG Matched priority in lookup, returning value=high
2015-04-09 13:10:07,307 DEBUG Create event will be: time=2015-04-09T13:10:07.307638 severity=INFO origin="alert_handler" event_id="445899c6d85c2d7c6c547cd9c5e92f8d" user="splunk-system-user" action="create" alert="To Global Failed Login >3 Alert" incident_id="e6221414-b6c5-4945-92c6-8b27119e054d" job_id="scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428610200_618" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428610202"
2015-04-09 13:10:07,308 DEBUG Create event will be: time=2015-04-09T13:10:07.308021 severity=INFO origin="alert_handler" event_id="83623d69e41adc75723952f8f26ca004" user="splunk-system-user" action="create" alert="TO Failed Login Alert" incident_id="f46a2864-f059-4297-b7ca-53176eb424cd" job_id="scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428610200_615" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428610202"
2015-04-09 13:10:07,316 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428610200_618 with incident_id=e6221414-b6c5-4945-92c6-8b27119e054d
2015-04-09 13:10:07,317 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428610200_615 with incident_id=f46a2864-f059-4297-b7ca-53176eb424cd
2015-04-09 13:10:07,350 DEBUG results for incident_id=e6221414-b6c5-4945-92c6-8b27119e054d written to collection.
2015-04-09 13:10:07,350 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5cd6099d5b7d84045_at_1428610200_618 incident_id=e6221414-b6c5-4945-92c6-8b27119e054d result_id=0 written to collection incident_results
2015-04-09 13:10:07,350 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 13:10:07,357 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 13:10:07,357 INFO Alert handler finished. duration=0.795s
2015-04-09 13:10:07,385 DEBUG results for incident_id=f46a2864-f059-4297-b7ca-53176eb424cd written to collection.
2015-04-09 13:10:07,385 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5853430e3bafc3e3f_at_1428610200_615 incident_id=f46a2864-f059-4297-b7ca-53176eb424cd result_id=0 written to collection incident_results
2015-04-09 13:10:07,386 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 13:10:07,393 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 13:10:07,393 INFO Alert handler finished. duration=0.823s
2015-04-09 15:00:06,621 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428616800_1208/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428616800_1208 sessionKey=51_WpDgtC6UhYHn8xlLY9tUykxic62nPJgZeyi3fFiOmpotTlOqc5PSat6FC_tB777VxrMMFW35DjGAmZGYyVMWY4Dg66lagehWaMCwdo4zKQkmCBeKT^97DGSRdW7wp6O6DTjTtSRCT0wYx alert=TO AD Audit Rule Alert
2015-04-09 15:00:06,627 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428616800_1208' has been fired.
2015-04-09 15:00:06,898 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low"}
2015-04-09 15:00:06,898 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 15:00:06,904 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 15:00:06,905 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 15:00:06,905 DEBUG Incident config after getting settings: {"urgency": "medium", "auto_assign_owner": "unassigned", "tags": "AD", "alert": "TO AD Audit Rule Alert", "auto_assign_user": "", "subcategory": "unknown", "auto_previous_resolve": false, "auto_assign": false, "auto_ttl_resolve": false, "_key": "5526b8b13403e13421162b09", "_user": "nobody", "alert_script": "", "category": "Active Directory", "run_alert_script": false}
2015-04-09 15:00:06,912 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 6 results.
2015-04-09 15:00:06,929 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 15:00:06,929 DEBUG Transformed 24h into 86400 seconds
2015-04-09 15:00:06,955 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 15:00:06,956 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 15:00:06,956 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 15:00:06,956 DEBUG Matched impact in lookup, returning value=low | |
2015-04-09 15:00:06,956 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428616800_1208 | |
2015-04-09 15:00:07,257 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=7b61404a-3c9b-490f-8d7f-babbf66f1946 | |
2015-04-09 15:00:07,287 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 15:00:07,287 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 15:00:07,287 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'} | |
2015-04-09 15:00:07,287 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 15:00:07,331 DEBUG Create event will be: time=2015-04-09T15:00:07.331352 severity=INFO origin="alert_handler" event_id="76082634839ba78a95011b9080fa3db4" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="7b61404a-3c9b-490f-8d7f-babbf66f1946" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428616800_1208" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428616801" | |
2015-04-09 15:00:07,337 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428616800_1208 with incident_id=7b61404a-3c9b-490f-8d7f-babbf66f1946 | |
2015-04-09 15:00:07,365 DEBUG results for incident_id=7b61404a-3c9b-490f-8d7f-babbf66f1946 written to collection. | |
2015-04-09 15:00:07,365 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428616800_1208 incident_id=7b61404a-3c9b-490f-8d7f-babbf66f1946 result_id=0 written to collection incident_results | |
2015-04-09 15:00:07,366 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 15:00:07,372 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 15:00:07,373 INFO Alert handler finished. duration=0.752s | |
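[Editor's note: the "Transformed 24h into 86400 seconds" lines above show the expiry string from the savedsearch settings being converted to a TTL in seconds. A minimal sketch of that conversion, assuming a simple number-plus-unit format; this is an illustration, not the actual alert_manager code.]

```python
import re

# Seconds per supported unit; the unit set is an assumption for illustration.
UNIT_SECONDS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

def expiry_to_seconds(expiry):
    """Parse an expiry like '24h' or '30m' into a number of seconds."""
    match = re.fullmatch(r"(\d+)([smhd])", expiry)
    if not match:
        raise ValueError("unsupported expiry format: %r" % expiry)
    value, unit = match.groups()
    return int(value) * UNIT_SECONDS[unit]

print(expiry_to_seconds("24h"))  # 86400, matching the log line
```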
2015-04-09 15:35:05,003 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428618900_1412/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428618900_1412 sessionKey=c^pgor24lzOylewDQYbnTQJ^hTR9RZ4mERMOwgXygBZBch8R8Rwb59KgqMr7lHvdvr4B8indlGVauFl650BIajFPfx4go9zf49dtPGuB6fNmJPBVy_UtpvLecgIWhWrw2y6a3vqEPwPJN6IciC0 alert=TO AD Audit Rule Alert
2015-04-09 15:35:05,009 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428618900_1412' has been fired.
2015-04-09 15:35:05,289 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 15:35:05,290 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 15:35:05,296 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 15:35:05,296 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 15:35:05,297 DEBUG Incident config after getting settings: {"_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "auto_ttl_resolve": false, "auto_assign": false, "run_alert_script": false, "_key": "5526b8b13403e13421162b09", "category": "Active Directory", "urgency": "medium", "auto_assign_user": "", "tags": "AD", "subcategory": "unknown", "alert": "TO AD Audit Rule Alert", "auto_assign_owner": "unassigned"}
2015-04-09 15:35:05,304 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 3 results.
2015-04-09 15:35:05,322 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 15:35:05,322 DEBUG Transformed 24h into 86400 seconds
2015-04-09 15:35:05,349 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 15:35:05,349 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 15:35:05,350 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 15:35:05,350 DEBUG Matched impact in lookup, returning value=low
2015-04-09 15:35:05,350 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428618900_1412
2015-04-09 15:35:05,649 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=0a771dba-6bc4-4b62-b949-dce472f95cee
2015-04-09 15:35:05,675 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 15:35:05,676 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 15:35:05,676 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-09 15:35:05,676 DEBUG Matched priority in lookup, returning value=low
2015-04-09 15:35:05,706 DEBUG Create event will be: time=2015-04-09T15:35:05.706210 severity=INFO origin="alert_handler" event_id="fd10a31fde7c28d46853434a427f2409" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="0a771dba-6bc4-4b62-b949-dce472f95cee" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428618900_1412" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428618901"
2015-04-09 15:35:05,713 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428618900_1412 with incident_id=0a771dba-6bc4-4b62-b949-dce472f95cee
2015-04-09 15:35:05,740 DEBUG results for incident_id=0a771dba-6bc4-4b62-b949-dce472f95cee written to collection.
2015-04-09 15:35:05,741 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428618900_1412 incident_id=0a771dba-6bc4-4b62-b949-dce472f95cee result_id=0 written to collection incident_results
2015-04-09 15:35:05,741 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 15:35:05,749 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 15:35:05,749 INFO Alert handler finished. duration=0.746s
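[Editor's note: the alert_priority lookup lines above query a CSV with a filter of urgency and impact and return a priority. A minimal sketch of that CSV matching, using only the standard library; the column names mirror the log's filter keys, but the rows of SAMPLE_LOOKUP are assumed for illustration and are not copied from alert_priority.csv.sample.]

```python
import csv
import io

# Hypothetical lookup content in the (impact, urgency) -> priority shape
# suggested by the log's filter={'urgency': ..., 'impact': ...} lines.
SAMPLE_LOOKUP = """impact,urgency,priority
low,low,low
low,medium,low
medium,medium,medium
high,high,high
"""

def match_priority(lookup_csv, impact, urgency):
    """Return the priority of the first row matching impact and urgency."""
    for row in csv.DictReader(io.StringIO(lookup_csv)):
        if row["impact"] == impact and row["urgency"] == urgency:
            return row["priority"]
    return None

print(match_priority(SAMPLE_LOOKUP, "low", "medium"))  # low
```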
2015-04-09 15:50:05,634 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428619800_1480/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428619800_1480 sessionKey=Taoi4TNM_M^jkCloGtUdcXRbUNO5wSSr4^Xx2b7UMuW4kT6DqwUX2DYQjbVlfnnH3pWCunAHZpY9jijEIk3W6YIszvmrUrqyk35biWjv9gWdvjW1FrNMe6pq_ysbqqOA_ewAeJAWTF alert=TO AD Audit Rule Alert
2015-04-09 15:50:05,640 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428619800_1480' has been fired.
2015-04-09 15:50:05,913 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-09 15:50:05,913 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 15:50:05,919 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 15:50:05,919 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 15:50:05,920 DEBUG Incident config after getting settings: {"auto_assign_user": "", "_user": "nobody", "auto_assign": false, "alert_script": "", "category": "Active Directory", "subcategory": "unknown", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "run_alert_script": false, "_key": "5526b8b13403e13421162b09", "urgency": "medium", "auto_previous_resolve": false, "alert": "TO AD Audit Rule Alert", "tags": "AD"}
2015-04-09 15:50:05,927 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results.
2015-04-09 15:50:05,944 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 15:50:05,945 DEBUG Transformed 24h into 86400 seconds
2015-04-09 15:50:05,972 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 15:50:05,973 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 15:50:05,973 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 15:50:05,973 DEBUG Matched impact in lookup, returning value=low
2015-04-09 15:50:05,973 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428619800_1480
2015-04-09 15:50:06,273 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=e09208c2-ecac-444f-99b7-ff7213f9241c
2015-04-09 15:50:06,306 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 15:50:06,306 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 15:50:06,306 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-09 15:50:06,306 DEBUG Matched priority in lookup, returning value=low
2015-04-09 15:50:06,330 DEBUG Create event will be: time=2015-04-09T15:50:06.330805 severity=INFO origin="alert_handler" event_id="b5f527f78473931648547b3d6ec40cd6" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="e09208c2-ecac-444f-99b7-ff7213f9241c" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428619800_1480" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428619801"
2015-04-09 15:50:06,338 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428619800_1480 with incident_id=e09208c2-ecac-444f-99b7-ff7213f9241c
2015-04-09 15:50:06,365 DEBUG results for incident_id=e09208c2-ecac-444f-99b7-ff7213f9241c written to collection.
2015-04-09 15:50:06,365 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428619800_1480 incident_id=e09208c2-ecac-444f-99b7-ff7213f9241c result_id=0 written to collection incident_results
2015-04-09 15:50:06,365 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 15:50:06,372 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 15:50:06,372 INFO Alert handler finished. duration=0.739s
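[Editor's note: the "Query for alert settings" lines above are URL-encoded JSON. A minimal standard-library sketch of decoding one back into a dict, using one of the encoded strings from the log verbatim.]

```python
import json
from urllib.parse import unquote

# Encoded query copied from the log lines above.
encoded = "%7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D"

# unquote() yields the JSON text; json.loads() parses it into a dict.
query = json.loads(unquote(encoded))
print(query)  # {'alert': 'TO AD Audit Rule Alert'}
```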
2015-04-09 16:15:06,264 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621300_1632/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621300_1632 sessionKey=pW^01aGiglchOq6wpCkSr7kKOyYGlAOIPV8YcqMGZSmtPWcrmoWC1k^5U8J14C9R57cGlXspWcy8qUtaBbLycfFcS0EnIEBPIhXObrtpar0P3UJ3ibXnKyTuvl5CFV0TF64bgRzqFsJpoo alert=TO AD Audit Rule Alert
2015-04-09 16:15:06,270 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621300_1632' has been fired.
2015-04-09 16:15:06,546 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-09 16:15:06,546 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 16:15:06,553 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 16:15:06,553 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 16:15:06,553 DEBUG Incident config after getting settings: {"alert": "TO AD Audit Rule Alert", "_user": "nobody", "auto_assign": false, "_key": "5526b8b13403e13421162b09", "auto_assign_owner": "unassigned", "auto_assign_user": "", "run_alert_script": false, "auto_previous_resolve": false, "alert_script": "", "tags": "AD", "urgency": "medium", "auto_ttl_resolve": false, "subcategory": "unknown", "category": "Active Directory"}
2015-04-09 16:15:06,561 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 3 results.
2015-04-09 16:15:06,579 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 16:15:06,579 DEBUG Transformed 24h into 86400 seconds
2015-04-09 16:15:06,604 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 16:15:06,605 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 16:15:06,605 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 16:15:06,605 DEBUG Matched impact in lookup, returning value=low
2015-04-09 16:15:06,605 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621300_1632
2015-04-09 16:15:06,897 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=fd043d69-d46d-4337-b5b5-f1edc2fedd28
2015-04-09 16:15:06,923 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 16:15:06,923 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 16:15:06,924 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'}
2015-04-09 16:15:06,924 DEBUG Matched priority in lookup, returning value=low
2015-04-09 16:15:06,960 DEBUG Create event will be: time=2015-04-09T16:15:06.960233 severity=INFO origin="alert_handler" event_id="cf3525a1cac3963fd351459fa6c6d43a" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="fd043d69-d46d-4337-b5b5-f1edc2fedd28" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621300_1632" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428621301"
2015-04-09 16:15:06,967 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621300_1632 with incident_id=fd043d69-d46d-4337-b5b5-f1edc2fedd28
2015-04-09 16:15:06,994 DEBUG results for incident_id=fd043d69-d46d-4337-b5b5-f1edc2fedd28 written to collection.
2015-04-09 16:15:06,994 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621300_1632 incident_id=fd043d69-d46d-4337-b5b5-f1edc2fedd28 result_id=0 written to collection incident_results
2015-04-09 16:15:06,995 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 16:15:07,002 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 16:15:07,002 INFO Alert handler finished. duration=0.738s
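[Editor's note: the 'Create event will be:' lines above serialize incident metadata as space-separated key="value" pairs. A sketch of how such a line could be assembled; the field names mirror the log, but the function itself is hypothetical and ignores quoting of embedded double quotes.]

```python
def format_event(fields):
    """Serialize an ordered dict of fields as key="value" pairs."""
    parts = []
    for key, value in fields.items():
        parts.append('%s="%s"' % (key, value))
    return " ".join(parts)

event = format_event({"origin": "alert_handler", "action": "create",
                      "owner": "unassigned", "status": "new"})
print(event)
# origin="alert_handler" action="create" owner="unassigned" status="new"
```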
2015-04-09 16:20:06,770 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621600_1649/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621600_1649 sessionKey=cY7XIvZZM9owetDOd_7gGW4mdE01UgEdufHHP8FV5^ciQn1bZmksgYqJrmMNZgBZQpvNcRbUR1vbfG53e0bm8QT^2GTGhU3YWmRjvZT4qHML8mzol9kRGLNo7e4_HAvyXHm8dqYpsb6H alert=TO AD Audit Rule Alert
2015-04-09 16:20:06,776 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621600_1649' has been fired.
2015-04-09 16:20:07,055 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 16:20:07,055 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 16:20:07,062 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 16:20:07,062 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 16:20:07,062 DEBUG Incident config after getting settings: {"auto_assign": false, "auto_assign_owner": "unassigned", "_key": "5526b8b13403e13421162b09", "subcategory": "unknown", "category": "Active Directory", "auto_ttl_resolve": false, "run_alert_script": false, "auto_assign_user": "", "_user": "nobody", "urgency": "medium", "auto_previous_resolve": false, "tags": "AD", "alert_script": "", "alert": "TO AD Audit Rule Alert"}
2015-04-09 16:20:07,071 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results.
2015-04-09 16:20:07,091 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 16:20:07,091 DEBUG Transformed 24h into 86400 seconds
2015-04-09 16:20:07,120 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 16:20:07,121 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 16:20:07,121 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 16:20:07,121 DEBUG Matched impact in lookup, returning value=low
2015-04-09 16:20:07,121 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621600_1649
2015-04-09 16:20:07,428 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=cc37001a-4aeb-45ee-8dea-a18704a5537d
2015-04-09 16:20:07,458 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 16:20:07,458 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 16:20:07,458 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-09 16:20:07,458 DEBUG Matched priority in lookup, returning value=low
2015-04-09 16:20:07,490 DEBUG Create event will be: time=2015-04-09T16:20:07.490905 severity=INFO origin="alert_handler" event_id="363106309dda8a9db9430a2e7fd34b82" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="cc37001a-4aeb-45ee-8dea-a18704a5537d" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621600_1649" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428621601"
2015-04-09 16:20:07,499 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621600_1649 with incident_id=cc37001a-4aeb-45ee-8dea-a18704a5537d
2015-04-09 16:20:07,525 DEBUG results for incident_id=cc37001a-4aeb-45ee-8dea-a18704a5537d written to collection.
2015-04-09 16:20:07,525 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428621600_1649 incident_id=cc37001a-4aeb-45ee-8dea-a18704a5537d result_id=0 written to collection incident_results
2015-04-09 16:20:07,525 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 16:20:07,532 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 16:20:07,532 INFO Alert handler finished. duration=0.762s
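[Editor's note: every run above logs "No valid urgency field found in results. Falling back to default_urgency=medium". A minimal sketch of that fallback under stated assumptions: the set of valid urgencies and the function name are illustrative, not taken from alert_manager.]

```python
# Assumed set of recognized urgency values.
VALID_URGENCIES = {"low", "medium", "high"}

def resolve_urgency(result, default_urgency):
    """Use the result's urgency field when valid, else fall back."""
    urgency = result.get("urgency")
    if urgency in VALID_URGENCIES:
        return urgency
    return default_urgency

print(resolve_urgency({}, "medium"))                    # medium (fallback)
print(resolve_urgency({"urgency": "high"}, "medium"))   # high
```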
2015-04-09 16:35:04,834 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428622500_1737/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428622500_1737 sessionKey=xBLFbehVZNhmXfAZNrstwxBruAxterJ3oPFrvXKvuQG2GqhoC95xHggNkwEG^SYgpAvzefwjyPRcwrpXv0zWns_ttQV0Wku_5Hpo_mufovbKSWEemlJBDfdNwljTSy1uEGJtctN7Ao alert=TO AD Audit Rule Alert
2015-04-09 16:35:04,839 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428622500_1737' has been fired.
2015-04-09 16:35:05,126 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low"}
2015-04-09 16:35:05,126 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 16:35:05,132 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 16:35:05,132 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 16:35:05,132 DEBUG Incident config after getting settings: {"category": "Active Directory", "auto_previous_resolve": false, "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "tags": "AD", "alert_script": "", "_user": "nobody", "_key": "5526b8b13403e13421162b09", "alert": "TO AD Audit Rule Alert", "subcategory": "unknown", "auto_assign_user": "", "urgency": "medium", "auto_assign_owner": "unassigned"}
2015-04-09 16:35:05,140 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 4 results.
2015-04-09 16:35:05,157 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 16:35:05,157 DEBUG Transformed 24h into 86400 seconds
2015-04-09 16:35:05,183 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 16:35:05,183 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 16:35:05,184 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 16:35:05,184 DEBUG Matched impact in lookup, returning value=low
2015-04-09 16:35:05,184 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428622500_1737
2015-04-09 16:35:05,482 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=5ec48db4-b4fb-40dd-8cb4-9fc517769785
2015-04-09 16:35:05,509 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 16:35:05,509 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 16:35:05,510 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'}
2015-04-09 16:35:05,510 DEBUG Matched priority in lookup, returning value=low
2015-04-09 16:35:05,551 DEBUG Create event will be: time=2015-04-09T16:35:05.551143 severity=INFO origin="alert_handler" event_id="2f4c7cdb1698f78b5b3c2bee27ba6c13" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="5ec48db4-b4fb-40dd-8cb4-9fc517769785" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428622500_1737" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428622501"
2015-04-09 16:35:05,558 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428622500_1737 with incident_id=5ec48db4-b4fb-40dd-8cb4-9fc517769785
2015-04-09 16:35:05,619 DEBUG results for incident_id=5ec48db4-b4fb-40dd-8cb4-9fc517769785 written to collection.
2015-04-09 16:35:05,620 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428622500_1737 incident_id=5ec48db4-b4fb-40dd-8cb4-9fc517769785 result_id=0 written to collection incident_results
2015-04-09 16:35:05,620 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 16:35:05,628 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 16:35:05,628 INFO Alert handler finished. duration=0.795s
2015-04-09 17:15:06,361 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428624900_1957/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428624900_1957 sessionKey=FYhDEFta2u7ts7uGqZQ^BFYjufhMXt720F9_ExHrmyF5vA2xMgu91JsmNIgo20J2nnqtjsR0qCvVT9U0_DqxmQnl9yRuf7rgW0M30ZDGwBmhf6Hv05QPV2XLAuLB1MhovHT7RrVVxDkHoIo alert=TO AD Audit Rule Alert
2015-04-09 17:15:06,367 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428624900_1957' has been fired.
2015-04-09 17:15:06,649 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-09 17:15:06,650 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 17:15:06,656 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 17:15:06,656 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 17:15:06,656 DEBUG Incident config after getting settings: {"auto_assign": false, "_user": "nobody", "auto_ttl_resolve": false, "urgency": "medium", "auto_assign_owner": "unassigned", "category": "Active Directory", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "tags": "AD", "run_alert_script": false, "alert": "TO AD Audit Rule Alert", "subcategory": "unknown", "_key": "5526b8b13403e13421162b09"}
2015-04-09 17:15:06,664 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results.
2015-04-09 17:15:06,682 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 17:15:06,682 DEBUG Transformed 24h into 86400 seconds
2015-04-09 17:15:06,707 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 17:15:06,707 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 17:15:06,707 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 17:15:06,707 DEBUG Matched impact in lookup, returning value=low
2015-04-09 17:15:06,707 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428624900_1957
2015-04-09 17:15:07,003 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=15677e77-a68a-429f-a224-2f24e8837d28
2015-04-09 17:15:07,030 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 17:15:07,030 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 17:15:07,031 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-09 17:15:07,031 DEBUG Matched priority in lookup, returning value=low
2015-04-09 17:15:07,055 DEBUG Create event will be: time=2015-04-09T17:15:07.055005 severity=INFO origin="alert_handler" event_id="80c8b73e9f97aaeeb5452f2186672346" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="15677e77-a68a-429f-a224-2f24e8837d28" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428624900_1957" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428624902"
2015-04-09 17:15:07,061 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428624900_1957 with incident_id=15677e77-a68a-429f-a224-2f24e8837d28
2015-04-09 17:15:07,089 DEBUG results for incident_id=15677e77-a68a-429f-a224-2f24e8837d28 written to collection.
2015-04-09 17:15:07,089 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428624900_1957 incident_id=15677e77-a68a-429f-a224-2f24e8837d28 result_id=0 written to collection incident_results
2015-04-09 17:15:07,090 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 17:15:07,097 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 17:15:07,097 INFO Alert handler finished. duration=0.736s
2015-04-09 17:20:06,063 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428625200_1974/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428625200_1974 sessionKey=^h9dEBU8wTf^vgxZwYuiF_SwY7VFOgY8ikZDyyw0U8UZwlZzeOCH6uQ9S6cxma0W1MCibwPD0FVrLI4QWFzIss8_kFUbJQhighpgX61TmOmLR95g3WSrSK_xPqXIDI8krQkF9aHutg7S1z4mvC alert=TO AD Audit Rule Alert
2015-04-09 17:20:06,069 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428625200_1974' has been fired.
2015-04-09 17:20:06,340 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf"}
2015-04-09 17:20:06,341 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-09 17:20:06,347 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-09 17:20:06,347 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-09 17:20:06,347 DEBUG Incident config after getting settings: {"_key": "5526b8b13403e13421162b09", "auto_assign_owner": "unassigned", "run_alert_script": false, "urgency": "medium", "auto_previous_resolve": false, "alert_script": "", "category": "Active Directory", "subcategory": "unknown", "alert": "TO AD Audit Rule Alert", "tags": "AD", "auto_ttl_resolve": false, "_user": "nobody", "auto_assign": false, "auto_assign_user": ""}
2015-04-09 17:20:06,355 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results.
2015-04-09 17:20:06,373 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-09 17:20:06,373 DEBUG Transformed 24h into 86400 seconds
2015-04-09 17:20:06,398 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-09 17:20:06,398 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-09 17:20:06,398 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-09 17:20:06,398 DEBUG Matched impact in lookup, returning value=low
2015-04-09 17:20:06,398 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428625200_1974
2015-04-09 17:20:06,692 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=f7adffe3-c871-4e10-86e7-4d4de1b9bb7c
2015-04-09 17:20:06,717 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-09 17:20:06,718 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-09 17:20:06,718 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'}
2015-04-09 17:20:06,718 DEBUG Matched priority in lookup, returning value=low
2015-04-09 17:20:06,766 DEBUG Create event will be: time=2015-04-09T17:20:06.766006 severity=INFO origin="alert_handler" event_id="b5932ecfce24dc21464ac2d4abd6f7fe" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="f7adffe3-c871-4e10-86e7-4d4de1b9bb7c" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428625200_1974" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428625201"
2015-04-09 17:20:06,772 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428625200_1974 with incident_id=f7adffe3-c871-4e10-86e7-4d4de1b9bb7c
2015-04-09 17:20:06,802 DEBUG results for incident_id=f7adffe3-c871-4e10-86e7-4d4de1b9bb7c written to collection.
2015-04-09 17:20:06,802 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428625200_1974 incident_id=f7adffe3-c871-4e10-86e7-4d4de1b9bb7c result_id=0 written to collection incident_results
2015-04-09 17:20:06,802 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-09 17:20:06,811 INFO Alert metadata written to index=alerts-to-inf
2015-04-09 17:20:06,811 INFO Alert handler finished. duration=0.748s
2015-04-09 18:05:05,018 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428627900_2222/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428627900_2222 sessionKey=HwW5BUhcNUHe7fmsEF05htE_RgOVJpEtUI4nrCE_aMLQ6QsxCqPo1p5XpaeZmvpoPiOYRDTnc_zZUBS9ECgigwri3zz4mMGkpF11t3fCriI^upZV8Xx14p2kRLQf8Ic5bEXcoIUqR6F alert=TO AD Audit Rule Alert | |
2015-04-09 18:05:05,024 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428627900_2222' has been fired. | |
2015-04-09 18:05:05,304 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-09 18:05:05,304 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-09 18:05:05,310 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-09 18:05:05,310 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-09 18:05:05,311 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign": false, "auto_assign_user": "", "category": "Active Directory", "auto_assign_owner": "unassigned", "subcategory": "unknown", "_user": "nobody", "_key": "5526b8b13403e13421162b09", "tags": "AD", "alert_script": "", "auto_previous_resolve": false, "urgency": "medium", "alert": "TO AD Audit Rule Alert", "run_alert_script": false} | |
2015-04-09 18:05:05,319 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results. | |
2015-04-09 18:05:05,339 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-09 18:05:05,339 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 18:05:05,367 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 18:05:05,367 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 18:05:05,367 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-09 18:05:05,367 DEBUG Matched impact in lookup, returning value=low | |
2015-04-09 18:05:05,367 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428627900_2222 | |
2015-04-09 18:05:05,664 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=7611e392-a3e3-4a6a-a1bf-7dd74e2c304f | |
2015-04-09 18:05:05,691 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 18:05:05,691 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 18:05:05,691 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'} | |
2015-04-09 18:05:05,691 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 18:05:05,766 DEBUG Create event will be: time=2015-04-09T18:05:05.766054 severity=INFO origin="alert_handler" event_id="64d0375bdb897e3cd07bca447b87edf1" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="7611e392-a3e3-4a6a-a1bf-7dd74e2c304f" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428627900_2222" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428627901" | |
2015-04-09 18:05:05,772 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428627900_2222 with incident_id=7611e392-a3e3-4a6a-a1bf-7dd74e2c304f | |
2015-04-09 18:05:05,802 DEBUG results for incident_id=7611e392-a3e3-4a6a-a1bf-7dd74e2c304f written to collection. | |
2015-04-09 18:05:05,802 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428627900_2222 incident_id=7611e392-a3e3-4a6a-a1bf-7dd74e2c304f result_id=0 written to collection incident_results | |
2015-04-09 18:05:05,802 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 18:05:05,810 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 18:05:05,811 INFO Alert handler finished. duration=0.793s | |
2015-04-09 18:10:06,037 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428628200_2241/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428628200_2241 sessionKey=g8HfMmcKNe0If14ZGzjoSPzVRiICzJoAaWRmLHmpjuTOVe1sqS_RWhIveK2d9TDWpkHEY8YpBpMpjsjhDh0^gHLQRlVjHNTum6dIq5KehJXByAP26HtZx8p4G3ds7og8hZ9S^zwX7z3Rrj5cFC alert=TO AD Audit Rule Alert | |
2015-04-09 18:10:06,042 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428628200_2241' has been fired. | |
2015-04-09 18:10:06,315 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low"} | |
2015-04-09 18:10:06,315 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-09 18:10:06,321 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-09 18:10:06,321 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-09 18:10:06,321 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_previous_resolve": false, "urgency": "medium", "auto_assign": false, "_user": "nobody", "tags": "AD", "subcategory": "unknown", "auto_assign_user": "", "alert_script": "", "run_alert_script": false, "category": "Active Directory", "auto_assign_owner": "unassigned", "alert": "TO AD Audit Rule Alert", "_key": "5526b8b13403e13421162b09"} | |
2015-04-09 18:10:06,329 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results. | |
2015-04-09 18:10:06,346 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-09 18:10:06,346 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 18:10:06,371 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 18:10:06,371 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 18:10:06,372 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-09 18:10:06,372 DEBUG Matched impact in lookup, returning value=low | |
2015-04-09 18:10:06,372 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428628200_2241 | |
2015-04-09 18:10:06,668 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=eff5a3be-4909-4ece-aa26-5b6120284b2d | |
2015-04-09 18:10:06,695 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 18:10:06,695 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 18:10:06,695 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'} | |
2015-04-09 18:10:06,695 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 18:10:06,709 DEBUG Create event will be: time=2015-04-09T18:10:06.709549 severity=INFO origin="alert_handler" event_id="de5da5a413ae28ef56428e7e4623bbee" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="eff5a3be-4909-4ece-aa26-5b6120284b2d" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428628200_2241" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428628201" | |
2015-04-09 18:10:06,716 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428628200_2241 with incident_id=eff5a3be-4909-4ece-aa26-5b6120284b2d | |
2015-04-09 18:10:06,744 DEBUG results for incident_id=eff5a3be-4909-4ece-aa26-5b6120284b2d written to collection. | |
2015-04-09 18:10:06,744 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428628200_2241 incident_id=eff5a3be-4909-4ece-aa26-5b6120284b2d result_id=0 written to collection incident_results | |
2015-04-09 18:10:06,744 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 18:10:06,751 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 18:10:06,751 INFO Alert handler finished. duration=0.715s | |
2015-04-09 18:25:04,544 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629100_2332/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629100_2332 sessionKey=AtVatR4pgQu6Jr_xqpLSLTjehRodg8Z0f8uOG67LzMNxMNQOhMD0a6b9y6Tb_g_5MUALBjl0guClaX_P41f^VEwV0cwsP^d_yd8wMLU0QaOcF0b1UmrjRUTNLGrg0hcNX8fRLKylmqy5 alert=TO AD Audit Rule Alert | |
2015-04-09 18:25:04,550 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629100_2332' has been fired. | |
2015-04-09 18:25:04,826 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"} | |
2015-04-09 18:25:04,826 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-09 18:25:04,832 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-09 18:25:04,832 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-09 18:25:04,833 DEBUG Incident config after getting settings: {"auto_assign_user": "", "_key": "5526b8b13403e13421162b09", "auto_assign_owner": "unassigned", "alert_script": "", "category": "Active Directory", "alert": "TO AD Audit Rule Alert", "subcategory": "unknown", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign": false, "tags": "AD", "urgency": "medium", "auto_previous_resolve": false, "_user": "nobody"} | |
2015-04-09 18:25:04,840 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 8 results. | |
2015-04-09 18:25:04,859 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-09 18:25:04,859 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 18:25:04,885 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 18:25:04,885 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 18:25:04,885 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-09 18:25:04,886 DEBUG Matched impact in lookup, returning value=low | |
2015-04-09 18:25:04,886 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629100_2332 | |
2015-04-09 18:25:05,185 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=c71a39b5-b6e3-438e-a6b5-bdbddbf66745 | |
2015-04-09 18:25:05,212 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 18:25:05,212 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 18:25:05,212 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'} | |
2015-04-09 18:25:05,212 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 18:25:05,255 DEBUG Create event will be: time=2015-04-09T18:25:05.255481 severity=INFO origin="alert_handler" event_id="7a856e17a4abe23ba6db6bcc84d42929" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="c71a39b5-b6e3-438e-a6b5-bdbddbf66745" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629100_2332" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428629101" | |
2015-04-09 18:25:05,262 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629100_2332 with incident_id=c71a39b5-b6e3-438e-a6b5-bdbddbf66745 | |
2015-04-09 18:25:05,289 DEBUG results for incident_id=c71a39b5-b6e3-438e-a6b5-bdbddbf66745 written to collection. | |
2015-04-09 18:25:05,290 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629100_2332 incident_id=c71a39b5-b6e3-438e-a6b5-bdbddbf66745 result_id=0 written to collection incident_results | |
2015-04-09 18:25:05,290 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 18:25:05,297 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 18:25:05,297 INFO Alert handler finished. duration=0.753s | |
2015-04-09 18:30:06,237 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629400_2354/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629400_2354 sessionKey=ge^X7qxDwxBJR9Fg0PqrAQU16n^Arq3AjJFJM60PQh_OmZK6ylkbLI7O8e1li86ZBGN8uQc4^^fRQFT0OA7MybD^15wIFPP5UOJut91wx57lczdNPefOGP0kl19lBbGH8ywx1N19lC alert=TO AD Audit Rule Alert | |
2015-04-09 18:30:06,243 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629400_2354' has been fired. | |
2015-04-09 18:30:06,520 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"} | |
2015-04-09 18:30:06,520 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-09 18:30:06,527 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-09 18:30:06,527 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-09 18:30:06,527 DEBUG Incident config after getting settings: {"auto_assign_user": "", "tags": "AD", "auto_ttl_resolve": false, "run_alert_script": false, "auto_assign": false, "alert": "TO AD Audit Rule Alert", "subcategory": "unknown", "category": "Active Directory", "urgency": "medium", "_key": "5526b8b13403e13421162b09", "_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned"} | |
2015-04-09 18:30:06,535 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 4 results. | |
2015-04-09 18:30:06,553 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-09 18:30:06,554 DEBUG Transformed 24h into 86400 seconds | |
2015-04-09 18:30:06,580 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-09 18:30:06,580 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-09 18:30:06,580 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-09 18:30:06,580 DEBUG Matched impact in lookup, returning value=low | |
2015-04-09 18:30:06,580 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629400_2354 | |
2015-04-09 18:30:06,883 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=6da7d089-6d07-4eec-8cac-f220df4e9de3 | |
2015-04-09 18:30:06,909 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-09 18:30:06,910 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-09 18:30:06,910 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'} | |
2015-04-09 18:30:06,910 DEBUG Matched priority in lookup, returning value=low | |
2015-04-09 18:30:06,925 DEBUG Create event will be: time=2015-04-09T18:30:06.925206 severity=INFO origin="alert_handler" event_id="67ed54d37935388cea328c0f531cfdf7" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="6da7d089-6d07-4eec-8cac-f220df4e9de3" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629400_2354" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428629401" | |
2015-04-09 18:30:06,931 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629400_2354 with incident_id=6da7d089-6d07-4eec-8cac-f220df4e9de3 | |
2015-04-09 18:30:06,960 DEBUG results for incident_id=6da7d089-6d07-4eec-8cac-f220df4e9de3 written to collection. | |
2015-04-09 18:30:06,961 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428629400_2354 incident_id=6da7d089-6d07-4eec-8cac-f220df4e9de3 result_id=0 written to collection incident_results | |
2015-04-09 18:30:06,961 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-09 18:30:06,970 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-09 18:30:06,970 INFO Alert handler finished. duration=0.733s | |
2015-04-10 00:11:13,564 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428649200_4148/results.csv.gz job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428649200_4148 sessionKey=nvp7ThKFKDvQc3D3axDhCD_HolPX166KIeo_sCFiKnqnvUnPFtbuaaShHxtc0mYB9JRC2Nehj5vbwSIrgeVceI_5ql_bUhblXwYBcbmNH3MF3qpWGpQ2SLuyTKLUYUxp7oDXTcb^1zF alert=Splunk Forwarder Status | |
2015-04-10 00:11:13,570 INFO alert_handler started because alert 'Splunk Forwarder Status' with id 'scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428649200_4148' has been fired. | |
2015-04-10 00:11:13,838 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf"} | |
2015-04-10 00:11:13,838 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Splunk%20Forwarder%20Status%22%7D | |
2015-04-10 00:11:13,845 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "Splunk Forwarder Status", "tags" : "Splunk", "category" : "Splunk", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b08" } ] | |
2015-04-10 00:11:13,845 INFO Found incident settings for Splunk Forwarder Status | |
2015-04-10 00:11:13,845 DEBUG Incident config after getting settings: {"subcategory": "", "run_alert_script": false, "category": "Splunk", "alert": "Splunk Forwarder Status", "alert_script": "", "_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "urgency": "medium", "tags": "Splunk", "auto_assign_user": "", "auto_previous_resolve": false, "_key": "5526b8b13403e13421162b08", "auto_assign_owner": "unassigned"} | |
2015-04-10 00:11:13,853 INFO Found job for alert Splunk Forwarder Status. Context is 'search' with 59 results. | |
2015-04-10 00:11:13,870 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 00:11:13,870 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 00:11:13,896 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 00:11:13,896 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 00:11:13,896 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 00:11:13,896 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 00:11:13,896 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428649200_4148 | |
2015-04-10 00:11:14,187 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=06ff4993-325d-43b4-a84d-42517316c83b | |
2015-04-10 00:11:14,213 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 00:11:14,213 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 00:11:14,214 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'medium'} | |
2015-04-10 00:11:14,214 DEBUG Matched priority in lookup, returning value=medium | |
2015-04-10 00:11:14,253 DEBUG Create event will be: time=2015-04-10T00:11:14.253625 severity=INFO origin="alert_handler" event_id="7f51272723a76e074223f0e3eee5bb8c" user="splunk-system-user" action="create" alert="Splunk Forwarder Status" incident_id="06ff4993-325d-43b4-a84d-42517316c83b" job_id="scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428649200_4148" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428649207" | |
2015-04-10 00:11:14,260 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428649200_4148 with incident_id=06ff4993-325d-43b4-a84d-42517316c83b | |
2015-04-10 00:11:14,288 DEBUG results for incident_id=06ff4993-325d-43b4-a84d-42517316c83b written to collection. | |
2015-04-10 00:11:14,288 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428649200_4148 incident_id=06ff4993-325d-43b4-a84d-42517316c83b result_id=0 written to collection incident_results | |
2015-04-10 00:11:14,288 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 00:11:14,295 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 00:11:14,295 INFO Alert handler finished. duration=0.731s | |
2015-04-10 06:50:05,032 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428673800_6378/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428673800_6378 sessionKey=PLmFx1W4e^qibXsCPgRGcAu0nCA6o3nMwIx7Gy3iXLxVSBVD8dgYZaGDWNY2_Q2KkE8GeMaPn9mLN6FhItJO_dyoV9YLtwGcWE8LI1GJv7jd^p7SKtuqCP5h6cRnsajcs5TkH1DjzzYw_l9n alert=Builtin Account Used | |
2015-04-10 06:50:05,038 INFO alert_handler started because alert 'Builtin Account Used' with id 'scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428673800_6378' has been fired. | |
2015-04-10 06:50:05,306 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"} | |
2015-04-10 06:50:05,307 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Builtin%20Account%20Used%22%7D | |
2015-04-10 06:50:05,313 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "Builtin Account Used", "tags" : "Builtin", "category" : "Builtin Account", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b07" } ] | |
2015-04-10 06:50:05,313 INFO Found incident settings for Builtin Account Used | |
2015-04-10 06:50:05,313 DEBUG Incident config after getting settings: {"category": "Builtin Account", "urgency": "high", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign": false, "auto_previous_resolve": false, "alert_script": "", "_user": "nobody", "alert": "Builtin Account Used", "subcategory": "", "auto_assign_user": "", "_key": "5526b8b13403e13421162b07", "auto_assign_owner": "unassigned", "tags": "Builtin"} | |
2015-04-10 06:50:05,321 INFO Found job for alert Builtin Account Used. Context is 'search' with 6 results. | |
2015-04-10 06:50:05,339 DEBUG Parsed savedsearch settings: severity=5 expiry=24h digest_mode=True | |
2015-04-10 06:50:05,340 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 06:50:05,366 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 06:50:05,366 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 06:50:05,367 DEBUG Querying lookup with filter={'severity_id': '5'} | |
2015-04-10 06:50:05,367 DEBUG Matched impact in lookup, returning value=high | |
2015-04-10 06:50:05,367 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428673800_6378 | |
2015-04-10 06:50:05,660 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=9f0554c2-7da6-41e5-a890-e2a34d515918 | |
2015-04-10 06:50:05,686 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 06:50:05,687 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 06:50:05,687 DEBUG Querying lookup with filter={'impact': 'high', 'urgency': u'high'} | |
2015-04-10 06:50:05,687 DEBUG Matched priority in lookup, returning value=critical | |
2015-04-10 06:50:05,709 DEBUG Create event will be: time=2015-04-10T06:50:05.709193 severity=INFO origin="alert_handler" event_id="7ba6640beb0d399473035a221a8ba8f6" user="splunk-system-user" action="create" alert="Builtin Account Used" incident_id="9f0554c2-7da6-41e5-a890-e2a34d515918" job_id="scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428673800_6378" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428673801" | |
2015-04-10 06:50:05,715 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428673800_6378 with incident_id=9f0554c2-7da6-41e5-a890-e2a34d515918 | |
2015-04-10 06:50:05,744 DEBUG results for incident_id=9f0554c2-7da6-41e5-a890-e2a34d515918 written to collection. | |
2015-04-10 06:50:05,745 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428673800_6378 incident_id=9f0554c2-7da6-41e5-a890-e2a34d515918 result_id=0 written to collection incident_results | |
2015-04-10 06:50:05,745 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 06:50:05,754 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 06:50:05,754 INFO Alert handler finished. duration=0.722s | |
2015-04-10 07:10:06,035 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428675000_6489/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428675000_6489 sessionKey=wxsB5kLGe_lamQOoE_84uhbHKSJJX2fISf5HIdLMoDWBNv2RT9bELDxaTm6I7J8VfNOtE_kTxdrmNsQIuYNL0jd9j0UnCUvK2S6wF5Zv8qXlZGRGneCstk2MXHOocMvbbM7DB^uKUZQ78gTOVsY alert=TO AD Audit Rule Alert | |
2015-04-10 07:10:06,041 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428675000_6489' has been fired. | |
2015-04-10 07:10:06,317 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low"}
2015-04-10 07:10:06,317 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-10 07:10:06,324 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-10 07:10:06,324 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-10 07:10:06,325 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "alert": "TO AD Audit Rule Alert", "auto_assign_owner": "unassigned", "auto_assign": false, "auto_ttl_resolve": false, "urgency": "medium", "run_alert_script": false, "_user": "nobody", "subcategory": "unknown", "tags": "AD", "category": "Active Directory", "_key": "5526b8b13403e13421162b09", "auto_assign_user": "", "alert_script": ""}
2015-04-10 07:10:06,332 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results.
2015-04-10 07:10:06,350 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-10 07:10:06,350 DEBUG Transformed 24h into 86400 seconds
2015-04-10 07:10:06,377 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 07:10:06,377 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 07:10:06,377 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-10 07:10:06,377 DEBUG Matched impact in lookup, returning value=low
2015-04-10 07:10:06,377 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428675000_6489
2015-04-10 07:10:06,680 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=e162e7ff-b836-4b77-89c2-f808c8300358
2015-04-10 07:10:06,706 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 07:10:06,706 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 07:10:06,706 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-10 07:10:06,706 DEBUG Matched priority in lookup, returning value=low
2015-04-10 07:10:06,726 DEBUG Create event will be: time=2015-04-10T07:10:06.726306 severity=INFO origin="alert_handler" event_id="bfeb139dffb7ac504710bdd5bb9f77e9" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="e162e7ff-b836-4b77-89c2-f808c8300358" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428675000_6489" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428675001"
2015-04-10 07:10:06,733 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428675000_6489 with incident_id=e162e7ff-b836-4b77-89c2-f808c8300358
2015-04-10 07:10:06,760 DEBUG results for incident_id=e162e7ff-b836-4b77-89c2-f808c8300358 written to collection.
2015-04-10 07:10:06,760 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428675000_6489 incident_id=e162e7ff-b836-4b77-89c2-f808c8300358 result_id=0 written to collection incident_results
2015-04-10 07:10:06,760 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 07:10:06,767 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 07:10:06,767 INFO Alert handler finished. duration=0.733s
2015-04-10 07:30:06,597 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676200_6602/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676200_6602 sessionKey=1yfT^LjzsVsHAQ8VC84je3KqSXOe5mmtlIIRaTJjcZgeiASb6UVgyxIR8EXqOXx2BUJCV2c6xNEzmmSBpcmmHvkyhU15SVh9Dv^UdRa5whIgo5^LHVkufBTvtmCPNlG0nBSRdve_W7H4VN alert=TO AD Audit Rule Alert
2015-04-10 07:30:06,603 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676200_6602' has been fired.
2015-04-10 07:30:06,879 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low"}
2015-04-10 07:30:06,880 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-10 07:30:06,886 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-10 07:30:06,886 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-10 07:30:06,886 DEBUG Incident config after getting settings: {"subcategory": "unknown", "alert": "TO AD Audit Rule Alert", "auto_ttl_resolve": false, "tags": "AD", "auto_assign_owner": "unassigned", "category": "Active Directory", "run_alert_script": false, "auto_previous_resolve": false, "auto_assign": false, "alert_script": "", "_key": "5526b8b13403e13421162b09", "urgency": "medium", "auto_assign_user": "", "_user": "nobody"}
2015-04-10 07:30:06,893 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 6 results.
2015-04-10 07:30:06,910 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-10 07:30:06,910 DEBUG Transformed 24h into 86400 seconds
2015-04-10 07:30:06,935 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 07:30:06,935 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 07:30:06,935 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-10 07:30:06,935 DEBUG Matched impact in lookup, returning value=low
2015-04-10 07:30:06,936 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676200_6602
2015-04-10 07:30:07,249 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=9a88609e-67d5-4d32-8b22-726af59e10a6
2015-04-10 07:30:07,280 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 07:30:07,280 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 07:30:07,281 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-10 07:30:07,281 DEBUG Matched priority in lookup, returning value=low
2015-04-10 07:30:07,327 DEBUG Create event will be: time=2015-04-10T07:30:07.327602 severity=INFO origin="alert_handler" event_id="0590192672f6ec1a4d9b78b4f9d02bd5" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="9a88609e-67d5-4d32-8b22-726af59e10a6" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676200_6602" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428676201"
2015-04-10 07:30:07,334 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676200_6602 with incident_id=9a88609e-67d5-4d32-8b22-726af59e10a6
2015-04-10 07:30:07,362 DEBUG results for incident_id=9a88609e-67d5-4d32-8b22-726af59e10a6 written to collection.
2015-04-10 07:30:07,362 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676200_6602 incident_id=9a88609e-67d5-4d32-8b22-726af59e10a6 result_id=0 written to collection incident_results
2015-04-10 07:30:07,362 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 07:30:07,370 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 07:30:07,370 INFO Alert handler finished. duration=0.774s
2015-04-10 07:35:04,570 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676500_6636/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676500_6636 sessionKey=NJiS9e0nHAy_90SMYTB8gzp7KSKgwyWW2JG8fdvXFHA6uRq4Vyq1cXV5A28BEYGNCnnFi5J0UACARtY0tbmPPGDF0aoVuOJBQuAxM^ylUw1^eQVRw14_i9tTulxOmlMJZwEV_SLjUpOdCqy_^3L alert=TO AD Audit Rule Alert
2015-04-10 07:35:04,576 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676500_6636' has been fired.
2015-04-10 07:35:04,853 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-10 07:35:04,854 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-10 07:35:04,861 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-10 07:35:04,861 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-10 07:35:04,861 DEBUG Incident config after getting settings: {"subcategory": "unknown", "_user": "nobody", "auto_assign_owner": "unassigned", "auto_assign": false, "auto_assign_user": "", "category": "Active Directory", "auto_ttl_resolve": false, "_key": "5526b8b13403e13421162b09", "run_alert_script": false, "urgency": "medium", "auto_previous_resolve": false, "alert_script": "", "alert": "TO AD Audit Rule Alert", "tags": "AD"}
2015-04-10 07:35:04,869 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 16 results.
2015-04-10 07:35:04,886 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-10 07:35:04,887 DEBUG Transformed 24h into 86400 seconds
2015-04-10 07:35:04,912 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 07:35:04,912 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 07:35:04,912 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-10 07:35:04,913 DEBUG Matched impact in lookup, returning value=low
2015-04-10 07:35:04,913 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676500_6636
2015-04-10 07:35:05,217 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=e9d27bea-0e4c-4dca-a973-d5211da69444
2015-04-10 07:35:05,246 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 07:35:05,246 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 07:35:05,246 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'}
2015-04-10 07:35:05,246 DEBUG Matched priority in lookup, returning value=low
2015-04-10 07:35:05,264 DEBUG Create event will be: time=2015-04-10T07:35:05.264774 severity=INFO origin="alert_handler" event_id="74ed04c17044abc4b215c4278841ec28" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="e9d27bea-0e4c-4dca-a973-d5211da69444" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676500_6636" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428676501"
2015-04-10 07:35:05,271 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676500_6636 with incident_id=e9d27bea-0e4c-4dca-a973-d5211da69444
2015-04-10 07:35:05,299 DEBUG results for incident_id=e9d27bea-0e4c-4dca-a973-d5211da69444 written to collection.
2015-04-10 07:35:05,299 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676500_6636 incident_id=e9d27bea-0e4c-4dca-a973-d5211da69444 result_id=0 written to collection incident_results
2015-04-10 07:35:05,300 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 07:35:05,307 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 07:35:05,307 INFO Alert handler finished. duration=0.738s
2015-04-10 07:40:05,467 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676800_6649/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676800_6649 sessionKey=HId5v0krotiz2We2R8X1r8WBgU7H01FcT3kd8VI3L8kCA3TdjmdWp8ew_HlTquxi_p5cXK2SMSzqxAdpmPepXbN9d8yXPwe3QEK0lYigQKvJZstWUhkPIB9u7WKRdwGspdmKNKcUgwC alert=TO AD Audit Rule Alert
2015-04-10 07:40:05,473 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676800_6649' has been fired.
2015-04-10 07:40:05,747 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_urgency": "low", "default_priority": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-10 07:40:05,747 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-10 07:40:05,753 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-10 07:40:05,754 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-10 07:40:05,754 DEBUG Incident config after getting settings: {"run_alert_script": false, "category": "Active Directory", "_user": "nobody", "auto_previous_resolve": false, "alert_script": "", "tags": "AD", "auto_ttl_resolve": false, "auto_assign": false, "auto_assign_user": "", "subcategory": "unknown", "alert": "TO AD Audit Rule Alert", "_key": "5526b8b13403e13421162b09", "urgency": "medium", "auto_assign_owner": "unassigned"}
2015-04-10 07:40:05,762 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 28 results.
2015-04-10 07:40:05,779 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-10 07:40:05,779 DEBUG Transformed 24h into 86400 seconds
2015-04-10 07:40:05,805 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 07:40:05,805 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 07:40:05,805 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-10 07:40:05,805 DEBUG Matched impact in lookup, returning value=low
2015-04-10 07:40:05,806 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676800_6649
2015-04-10 07:40:06,105 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=13299e6c-8575-402b-a330-f8b93ac713fe
2015-04-10 07:40:06,130 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 07:40:06,131 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 07:40:06,131 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-10 07:40:06,131 DEBUG Matched priority in lookup, returning value=low
2015-04-10 07:40:06,157 DEBUG Create event will be: time=2015-04-10T07:40:06.157302 severity=INFO origin="alert_handler" event_id="be26b16a9dacce7f81588d610ff01a0e" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="13299e6c-8575-402b-a330-f8b93ac713fe" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676800_6649" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428676801"
2015-04-10 07:40:06,163 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676800_6649 with incident_id=13299e6c-8575-402b-a330-f8b93ac713fe
2015-04-10 07:40:06,191 DEBUG results for incident_id=13299e6c-8575-402b-a330-f8b93ac713fe written to collection.
2015-04-10 07:40:06,191 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428676800_6649 incident_id=13299e6c-8575-402b-a330-f8b93ac713fe result_id=0 written to collection incident_results
2015-04-10 07:40:06,192 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 07:40:06,198 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 07:40:06,198 INFO Alert handler finished. duration=0.732s
2015-04-10 07:45:05,646 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428677100_6684/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428677100_6684 sessionKey=_TUAjX0ozicXvECh0sXa0Hckw4XPQ3qWfaKgq1IvGIepJZwRUN^JgrBPXEhWCEwqjzwzSPXaKP5T8L8p6EQBXXOx_gtqF8BCrGi5L0P_3Bi5PBEcgHK2qQcYo^GNcnZZqFEm4GOZsQyn27MC alert=TO AD Audit Rule Alert
2015-04-10 07:45:05,652 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428677100_6684' has been fired.
2015-04-10 07:45:05,930 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low"}
2015-04-10 07:45:05,931 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-10 07:45:05,937 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-10 07:45:05,937 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-10 07:45:05,937 DEBUG Incident config after getting settings: {"_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "alert": "TO AD Audit Rule Alert", "alert_script": "", "auto_assign_user": "", "category": "Active Directory", "auto_assign_owner": "unassigned", "tags": "AD", "_key": "5526b8b13403e13421162b09", "subcategory": "unknown", "run_alert_script": false, "auto_previous_resolve": false, "urgency": "medium"}
2015-04-10 07:45:05,945 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 16 results.
2015-04-10 07:45:05,962 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-10 07:45:05,962 DEBUG Transformed 24h into 86400 seconds
2015-04-10 07:45:05,988 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 07:45:05,988 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 07:45:05,989 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-10 07:45:05,989 DEBUG Matched impact in lookup, returning value=low
2015-04-10 07:45:05,989 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428677100_6684
2015-04-10 07:45:06,295 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=b91520b0-0ebe-473e-970a-0ffac9998d0e
2015-04-10 07:45:06,320 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 07:45:06,320 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 07:45:06,321 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'}
2015-04-10 07:45:06,321 DEBUG Matched priority in lookup, returning value=low
2015-04-10 07:45:06,365 DEBUG Create event will be: time=2015-04-10T07:45:06.365713 severity=INFO origin="alert_handler" event_id="f10c0007e4d4b57d5e2beaa19e23d63a" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="b91520b0-0ebe-473e-970a-0ffac9998d0e" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428677100_6684" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428677101"
2015-04-10 07:45:06,371 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428677100_6684 with incident_id=b91520b0-0ebe-473e-970a-0ffac9998d0e
2015-04-10 07:45:06,400 DEBUG results for incident_id=b91520b0-0ebe-473e-970a-0ffac9998d0e written to collection.
2015-04-10 07:45:06,400 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428677100_6684 incident_id=b91520b0-0ebe-473e-970a-0ffac9998d0e result_id=0 written to collection incident_results
2015-04-10 07:45:06,400 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 07:45:06,406 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 07:45:06,406 INFO Alert handler finished. duration=0.761s
2015-04-10 08:10:06,443 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428678600_6814/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428678600_6814 sessionKey=k69PmmPNc7XK9aUs8h3DHD25kOFokpltTP75TLBczeY4i9wZvBChJvCB27NCer_IvKsVYy7qRb5BOimFvjALuVlW9PA9GrqRcVSI4^f2dz0zsWj5eeIVdR7sTNs57pFXy^8S7nGfvaG7CF alert=TO AD Audit Rule Alert
2015-04-10 08:10:06,449 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428678600_6814' has been fired.
2015-04-10 08:10:06,716 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low"}
2015-04-10 08:10:06,716 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-10 08:10:06,722 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-10 08:10:06,722 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-10 08:10:06,723 DEBUG Incident config after getting settings: {"alert": "TO AD Audit Rule Alert", "tags": "AD", "_user": "nobody", "urgency": "medium", "auto_previous_resolve": false, "subcategory": "unknown", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "run_alert_script": false, "alert_script": "", "_key": "5526b8b13403e13421162b09", "auto_assign_user": "", "auto_assign": false, "category": "Active Directory"}
2015-04-10 08:10:06,730 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 3 results.
2015-04-10 08:10:06,747 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-10 08:10:06,747 DEBUG Transformed 24h into 86400 seconds
2015-04-10 08:10:06,772 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 08:10:06,772 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 08:10:06,773 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-10 08:10:06,773 DEBUG Matched impact in lookup, returning value=low
2015-04-10 08:10:06,773 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428678600_6814
2015-04-10 08:10:07,065 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=c4dda43d-84db-4d60-bfef-e4ea5b0e4877
2015-04-10 08:10:07,094 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 08:10:07,095 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 08:10:07,095 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-10 08:10:07,095 DEBUG Matched priority in lookup, returning value=low
2015-04-10 08:10:07,115 DEBUG Create event will be: time=2015-04-10T08:10:07.114941 severity=INFO origin="alert_handler" event_id="e8837def88cec3c9a60eb298a4613dc3" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="c4dda43d-84db-4d60-bfef-e4ea5b0e4877" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428678600_6814" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428678601"
2015-04-10 08:10:07,121 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428678600_6814 with incident_id=c4dda43d-84db-4d60-bfef-e4ea5b0e4877
2015-04-10 08:10:07,149 DEBUG results for incident_id=c4dda43d-84db-4d60-bfef-e4ea5b0e4877 written to collection.
2015-04-10 08:10:07,149 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428678600_6814 incident_id=c4dda43d-84db-4d60-bfef-e4ea5b0e4877 result_id=0 written to collection incident_results
2015-04-10 08:10:07,149 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 08:10:07,156 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 08:10:07,156 INFO Alert handler finished. duration=0.713s
2015-04-10 08:45:05,543 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428680700_7009/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428680700_7009 sessionKey=Nk8fverWOZFgfuTz4ehSQN6xuqyODl8wLUaJFcweCtkgXOXwV2Aq97d2q9iWmHZOltx_Ru^oMAF9sy8c3aMoitN6vZCx5r4FS6FSrQeOUMm7wplD1zcrsWm5JpvwtmPBl2JGhNbR8hr69N alert=TO AD Audit Rule Alert
2015-04-10 08:45:05,549 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428680700_7009' has been fired.
2015-04-10 08:45:05,830 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_owner": "unassigned"}
2015-04-10 08:45:05,830 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D
2015-04-10 08:45:05,837 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ]
2015-04-10 08:45:05,837 INFO Found incident settings for TO AD Audit Rule Alert
2015-04-10 08:45:05,837 DEBUG Incident config after getting settings: {"subcategory": "unknown", "category": "Active Directory", "tags": "AD", "alert": "TO AD Audit Rule Alert", "auto_ttl_resolve": false, "auto_assign_user": "", "auto_assign": false, "_user": "nobody", "_key": "5526b8b13403e13421162b09", "run_alert_script": false, "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "urgency": "medium", "alert_script": ""}
2015-04-10 08:45:05,845 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results.
2015-04-10 08:45:05,863 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True
2015-04-10 08:45:05,863 DEBUG Transformed 24h into 86400 seconds
2015-04-10 08:45:05,889 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 08:45:05,889 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 08:45:05,890 DEBUG Querying lookup with filter={'severity_id': '2'}
2015-04-10 08:45:05,890 DEBUG Matched impact in lookup, returning value=low
2015-04-10 08:45:05,890 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428680700_7009
2015-04-10 08:45:06,187 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=454ba4c6-af19-4bbd-9055-a7c14fea7f12
2015-04-10 08:45:06,214 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 08:45:06,214 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 08:45:06,214 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'}
2015-04-10 08:45:06,215 DEBUG Matched priority in lookup, returning value=low
2015-04-10 08:45:06,242 DEBUG Create event will be: time=2015-04-10T08:45:06.242874 severity=INFO origin="alert_handler" event_id="f61e58fe3b5f283cacdfc2dd5b4a0c2b" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="454ba4c6-af19-4bbd-9055-a7c14fea7f12" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428680700_7009" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428680701"
2015-04-10 08:45:06,249 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428680700_7009 with incident_id=454ba4c6-af19-4bbd-9055-a7c14fea7f12
2015-04-10 08:45:06,277 DEBUG results for incident_id=454ba4c6-af19-4bbd-9055-a7c14fea7f12 written to collection.
2015-04-10 08:45:06,277 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428680700_7009 incident_id=454ba4c6-af19-4bbd-9055-a7c14fea7f12 result_id=0 written to collection incident_results
2015-04-10 08:45:06,277 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 08:45:06,284 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 08:45:06,284 INFO Alert handler finished. duration=0.742s
2015-04-10 09:30:04,144 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428683400_7240/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428683400_7240 sessionKey=eiHvNOS6a^BOI^4VSsNAm3UXbJmLKC9q^tluGrnnZR_NAbpeA^waXUOfhmkEuCfyOSMCEEIvpUf1Yk8rKOGRyDc1AIGfRc7cKCvw48zNLY53NDzxFSt46u9MbkKV54pIb0jVzDtOGdqyEC alert=TO Linux Failed Login Alert
2015-04-10 09:30:04,150 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428683400_7240' has been fired.
2015-04-10 09:30:04,427 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low"}
2015-04-10 09:30:04,427 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D
2015-04-10 09:30:04,434 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ]
2015-04-10 09:30:04,434 INFO Found incident settings for TO Linux Failed Login Alert
2015-04-10 09:30:04,435 DEBUG Incident config after getting settings: {"urgency": "high", "auto_previous_resolve": false, "run_alert_script": false, "alert": "TO Linux Failed Login Alert", "tags": "[Linux]", "_key": "5526b8b13403e13421162b0e", "alert_script": "", "auto_assign_user": "", "auto_assign_owner": "unassigned", "_user": "nobody", "subcategory": "unknown", "category": "Linux", "auto_ttl_resolve": false, "auto_assign": false}
2015-04-10 09:30:04,443 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 3 results.
2015-04-10 09:30:04,466 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 09:30:04,466 DEBUG Transformed 24h into 86400 seconds
2015-04-10 09:30:04,500 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 09:30:04,500 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 09:30:04,501 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 09:30:04,501 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 09:30:04,501 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428683400_7240
2015-04-10 09:30:04,816 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=9ebb25d5-1bd7-4fc7-9874-60251bdbf35e
2015-04-10 09:30:04,854 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 09:30:04,855 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 09:30:04,855 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-10 09:30:04,855 DEBUG Matched priority in lookup, returning value=high
2015-04-10 09:30:04,896 DEBUG Create event will be: time=2015-04-10T09:30:04.896298 severity=INFO origin="alert_handler" event_id="9bee5f0a50803f5df250c53d2ee01fa3" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="9ebb25d5-1bd7-4fc7-9874-60251bdbf35e" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428683400_7240" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428683401"
2015-04-10 09:30:04,903 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428683400_7240 with incident_id=9ebb25d5-1bd7-4fc7-9874-60251bdbf35e
2015-04-10 09:30:04,931 DEBUG results for incident_id=9ebb25d5-1bd7-4fc7-9874-60251bdbf35e written to collection.
2015-04-10 09:30:04,931 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428683400_7240 incident_id=9ebb25d5-1bd7-4fc7-9874-60251bdbf35e result_id=0 written to collection incident_results
2015-04-10 09:30:04,931 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 09:30:04,940 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 09:30:04,940 INFO Alert handler finished. duration=0.796s
2015-04-10 09:40:05,865 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684000_7299/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684000_7299 sessionKey=qZxxZUm2ztjNsAgiZyVTUc9oNrkD2FtWWMtTWAAqA_mn6hDpeav2E8Kx6B_B3jrvYe2rgfZMeGwwDYpr4J5ZDW8VFOwb73JgqCIGzzZtoYLjge^6jx7E0Nt_9CFRO4shppnTfYRez3yNscYWuo alert=TO AD Audit Rule Alert | |
2015-04-10 09:40:05,871 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684000_7299' has been fired. | |
2015-04-10 09:40:06,141 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_urgency": "low"} | |
2015-04-10 09:40:06,141 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 09:40:06,147 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 09:40:06,147 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 09:40:06,147 DEBUG Incident config after getting settings: {"tags": "AD", "alert": "TO AD Audit Rule Alert", "auto_previous_resolve": false, "urgency": "medium", "_key": "5526b8b13403e13421162b09", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "subcategory": "unknown", "alert_script": "", "category": "Active Directory", "auto_assign": false, "_user": "nobody", "auto_assign_user": ""} | |
2015-04-10 09:40:06,155 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results. | |
2015-04-10 09:40:06,172 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 09:40:06,172 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 09:40:06,197 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 09:40:06,197 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 09:40:06,197 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 09:40:06,197 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 09:40:06,198 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684000_7299 | |
2015-04-10 09:40:06,491 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=9a794d5d-3c77-4297-8317-e14a5f7a8685 | |
2015-04-10 09:40:06,517 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 09:40:06,517 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 09:40:06,517 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'} | |
2015-04-10 09:40:06,517 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 09:40:06,552 DEBUG Create event will be: time=2015-04-10T09:40:06.552829 severity=INFO origin="alert_handler" event_id="65168361c4e4905626b8482c6b53fef7" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="9a794d5d-3c77-4297-8317-e14a5f7a8685" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684000_7299" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428684001" | |
2015-04-10 09:40:06,559 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684000_7299 with incident_id=9a794d5d-3c77-4297-8317-e14a5f7a8685 | |
2015-04-10 09:40:06,587 DEBUG results for incident_id=9a794d5d-3c77-4297-8317-e14a5f7a8685 written to collection. | |
2015-04-10 09:40:06,587 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684000_7299 incident_id=9a794d5d-3c77-4297-8317-e14a5f7a8685 result_id=0 written to collection incident_results | |
2015-04-10 09:40:06,587 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 09:40:06,594 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 09:40:06,594 INFO Alert handler finished. duration=0.729s | |
2015-04-10 09:50:05,878 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684600_7354/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684600_7354 sessionKey=8zEDKZQAHtG6kpsbuVyCd1GNN^4Hm0j0IxgmVohOBERUJxHsxWIGqQr4DAhwydS9^w3pShROLNotue3ekUyhM9s3eQVC46xvioW98kwQCMSIceahZaU7JSyUVkaB9AixIszKCCPOXgvO3PTC alert=TO AD Audit Rule Alert | |
2015-04-10 09:50:05,884 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684600_7354' has been fired. | |
2015-04-10 09:50:06,158 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"} | |
2015-04-10 09:50:06,158 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 09:50:06,164 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 09:50:06,165 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 09:50:06,165 DEBUG Incident config after getting settings: {"auto_assign_user": "", "_key": "5526b8b13403e13421162b09", "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "alert_script": "", "_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "urgency": "medium", "tags": "AD", "subcategory": "unknown", "run_alert_script": false, "category": "Active Directory", "alert": "TO AD Audit Rule Alert"} | |
2015-04-10 09:50:06,173 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results. | |
2015-04-10 09:50:06,190 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 09:50:06,191 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 09:50:06,216 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 09:50:06,216 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 09:50:06,216 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 09:50:06,216 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 09:50:06,216 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684600_7354 | |
2015-04-10 09:50:06,506 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=81472c4f-f81d-493a-adf4-dc60e3c35067 | |
2015-04-10 09:50:06,531 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 09:50:06,531 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 09:50:06,532 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'} | |
2015-04-10 09:50:06,532 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 09:50:06,570 DEBUG Create event will be: time=2015-04-10T09:50:06.570527 severity=INFO origin="alert_handler" event_id="398a4b3e556a7207b0823a5170fefbb9" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="81472c4f-f81d-493a-adf4-dc60e3c35067" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684600_7354" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428684601" | |
2015-04-10 09:50:06,576 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684600_7354 with incident_id=81472c4f-f81d-493a-adf4-dc60e3c35067 | |
2015-04-10 09:50:06,639 DEBUG results for incident_id=81472c4f-f81d-493a-adf4-dc60e3c35067 written to collection. | |
2015-04-10 09:50:06,639 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684600_7354 incident_id=81472c4f-f81d-493a-adf4-dc60e3c35067 result_id=0 written to collection incident_results | |
2015-04-10 09:50:06,639 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 09:50:06,646 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 09:50:06,646 INFO Alert handler finished. duration=0.769s | |
2015-04-10 09:55:04,509 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684900_7385/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684900_7385 sessionKey=_eSGpcT3qdKWmD^cQpN3tvn^DFbcNg4zQp3kRyahJQHlNzQ3LjgLCFi4QNPWyyweBxikCaIgGxK2OWlVolXfn3tNxI3pkyjMSEbMbqOy21NF3pQk6igGY6xQ084hxw3RWGhD8QkesUljcElD alert=TO AD Audit Rule Alert | |
2015-04-10 09:55:04,515 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684900_7385' has been fired. | |
2015-04-10 09:55:04,782 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf"} | |
2015-04-10 09:55:04,783 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 09:55:04,789 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 09:55:04,789 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 09:55:04,789 DEBUG Incident config after getting settings: {"tags": "AD", "subcategory": "unknown", "alert_script": "", "auto_previous_resolve": false, "_user": "nobody", "auto_assign": false, "auto_assign_owner": "unassigned", "auto_assign_user": "", "auto_ttl_resolve": false, "run_alert_script": false, "_key": "5526b8b13403e13421162b09", "category": "Active Directory", "urgency": "medium", "alert": "TO AD Audit Rule Alert"} | |
2015-04-10 09:55:04,796 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 15 results. | |
2015-04-10 09:55:04,814 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 09:55:04,814 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 09:55:04,839 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 09:55:04,839 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 09:55:04,840 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 09:55:04,840 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 09:55:04,840 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684900_7385 | |
2015-04-10 09:55:05,132 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=8d1d6589-e4a8-41e5-9540-d03d052b1d09 | |
2015-04-10 09:55:05,158 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 09:55:05,159 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 09:55:05,159 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'} | |
2015-04-10 09:55:05,159 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 09:55:05,173 DEBUG Create event will be: time=2015-04-10T09:55:05.173670 severity=INFO origin="alert_handler" event_id="45a8ea53cc1c12cfb8abc517502c1733" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="8d1d6589-e4a8-41e5-9540-d03d052b1d09" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684900_7385" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428684901" | |
2015-04-10 09:55:05,179 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684900_7385 with incident_id=8d1d6589-e4a8-41e5-9540-d03d052b1d09 | |
2015-04-10 09:55:05,208 DEBUG results for incident_id=8d1d6589-e4a8-41e5-9540-d03d052b1d09 written to collection. | |
2015-04-10 09:55:05,208 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428684900_7385 incident_id=8d1d6589-e4a8-41e5-9540-d03d052b1d09 result_id=0 written to collection incident_results | |
2015-04-10 09:55:05,208 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 09:55:05,214 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 09:55:05,214 INFO Alert handler finished. duration=0.705s | |
2015-04-10 10:00:06,215 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685200_7407/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685200_7407 sessionKey=QRk2HtAiNcxInGBK3_1XC8aCh7roKVUUVH7oCbWPTHqLTCwVJkr5x7t3y2zyoJgMGddmxTQeIUTcJtjW3keAaOJXgRyTdDSh0U_ArLHkFescG5Vu2Q8hkiFAZ3GG4AuUXJSR7itU8JrNMl3G6N alert=TO AD Audit Rule Alert | |
2015-04-10 10:00:06,220 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685200_7407' has been fired. | |
2015-04-10 10:00:06,505 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"} | |
2015-04-10 10:00:06,505 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 10:00:06,511 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 10:00:06,511 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 10:00:06,511 DEBUG Incident config after getting settings: {"_key": "5526b8b13403e13421162b09", "auto_assign_user": "", "_user": "nobody", "auto_assign": false, "urgency": "medium", "category": "Active Directory", "alert": "TO AD Audit Rule Alert", "tags": "AD", "alert_script": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "run_alert_script": false, "auto_ttl_resolve": false, "subcategory": "unknown"} | |
2015-04-10 10:00:06,519 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results. | |
2015-04-10 10:00:06,536 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 10:00:06,536 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 10:00:06,561 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 10:00:06,561 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 10:00:06,562 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 10:00:06,562 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 10:00:06,562 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685200_7407 | |
2015-04-10 10:00:06,862 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=b675d4d1-185d-40cb-811e-a697fd34c16a | |
2015-04-10 10:00:06,887 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 10:00:06,888 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 10:00:06,888 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'} | |
2015-04-10 10:00:06,888 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 10:00:06,916 DEBUG Create event will be: time=2015-04-10T10:00:06.916497 severity=INFO origin="alert_handler" event_id="d18b825b711045230b9286e51ed3a8f5" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="b675d4d1-185d-40cb-811e-a697fd34c16a" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685200_7407" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428685201" | |
2015-04-10 10:00:06,923 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685200_7407 with incident_id=b675d4d1-185d-40cb-811e-a697fd34c16a | |
2015-04-10 10:00:06,950 DEBUG results for incident_id=b675d4d1-185d-40cb-811e-a697fd34c16a written to collection. | |
2015-04-10 10:00:06,951 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685200_7407 incident_id=b675d4d1-185d-40cb-811e-a697fd34c16a result_id=0 written to collection incident_results | |
2015-04-10 10:00:06,951 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 10:00:06,958 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 10:00:06,958 INFO Alert handler finished. duration=0.744s | |
2015-04-10 10:05:04,725 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685500_7445/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685500_7445 sessionKey=5XTU5x2BOpRP77La_2G_lDVS1cbUvTW8^zrjnHHZDmcXUokIciWAcz5^CvEnFhbOosZCB_FGuxZEZmvS^B5jiydmNyt5x918dIzy^jMn0cx5JB44RGC62SApb4LJI9iyXpfxXQkUuOJuMTRay^To alert=TO AD Audit Rule Alert | |
2015-04-10 10:05:04,731 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685500_7445' has been fired. | |
2015-04-10 10:05:05,007 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"} | |
2015-04-10 10:05:05,007 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 10:05:05,014 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 10:05:05,014 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 10:05:05,014 DEBUG Incident config after getting settings: {"run_alert_script": false, "tags": "AD", "alert": "TO AD Audit Rule Alert", "subcategory": "unknown", "_key": "5526b8b13403e13421162b09", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "category": "Active Directory", "auto_assign": false, "_user": "nobody", "urgency": "medium", "auto_ttl_resolve": false} | |
2015-04-10 10:05:05,022 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 13 results. | |
2015-04-10 10:05:05,040 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 10:05:05,041 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 10:05:05,067 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 10:05:05,067 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 10:05:05,068 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 10:05:05,068 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 10:05:05,068 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685500_7445 | |
2015-04-10 10:05:05,372 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=60397bf1-6877-4fc5-8a5e-9552afb6f5e4 | |
2015-04-10 10:05:05,397 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 10:05:05,398 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 10:05:05,398 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'} | |
2015-04-10 10:05:05,398 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 10:05:05,436 DEBUG Create event will be: time=2015-04-10T10:05:05.436791 severity=INFO origin="alert_handler" event_id="b98640c4765f98287d5e54c4eb783258" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="60397bf1-6877-4fc5-8a5e-9552afb6f5e4" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685500_7445" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428685501" | |
2015-04-10 10:05:05,443 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685500_7445 with incident_id=60397bf1-6877-4fc5-8a5e-9552afb6f5e4 | |
2015-04-10 10:05:05,471 DEBUG results for incident_id=60397bf1-6877-4fc5-8a5e-9552afb6f5e4 written to collection. | |
2015-04-10 10:05:05,471 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685500_7445 incident_id=60397bf1-6877-4fc5-8a5e-9552afb6f5e4 result_id=0 written to collection incident_results | |
2015-04-10 10:05:05,471 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 10:05:05,478 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 10:05:05,478 INFO Alert handler finished. duration=0.754s | |
2015-04-10 10:10:06,231 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685800_7464/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685800_7464 sessionKey=wAi8IRU61u3etgsH^zvGDXEkTJUKdYVD6W16XwKYymw1CpPgKaNy04sPXu2_xH3BTeEYWwQFaQACiLwRXyAnpko9rpI9OHxTv9Vt3M4P4nWiqeYS2k1c8y^hh^5D3huds2paiTVL91N alert=TO AD Audit Rule Alert | |
2015-04-10 10:10:06,237 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685800_7464' has been fired. | |
2015-04-10 10:10:06,507 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low"} | |
2015-04-10 10:10:06,507 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 10:10:06,513 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 10:10:06,514 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 10:10:06,514 DEBUG Incident config after getting settings: {"alert": "TO AD Audit Rule Alert", "alert_script": "", "auto_assign_user": "", "category": "Active Directory", "tags": "AD", "subcategory": "unknown", "_user": "nobody", "run_alert_script": false, "urgency": "medium", "auto_ttl_resolve": false, "auto_assign": false, "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "_key": "5526b8b13403e13421162b09"} | |
2015-04-10 10:10:06,521 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 2 results. | |
2015-04-10 10:10:06,539 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 10:10:06,539 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 10:10:06,565 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 10:10:06,565 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 10:10:06,566 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 10:10:06,566 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 10:10:06,566 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685800_7464 | |
2015-04-10 10:10:06,860 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=a4989505-d253-4399-8706-2111cd521e2a | |
2015-04-10 10:10:06,886 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 10:10:06,886 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 10:10:06,886 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'} | |
2015-04-10 10:10:06,886 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 10:10:06,921 DEBUG Create event will be: time=2015-04-10T10:10:06.921877 severity=INFO origin="alert_handler" event_id="182d09fae0070e6ca6b01a484f7a1b79" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="a4989505-d253-4399-8706-2111cd521e2a" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685800_7464" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428685801" | |
2015-04-10 10:10:06,928 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685800_7464 with incident_id=a4989505-d253-4399-8706-2111cd521e2a | |
2015-04-10 10:10:06,956 DEBUG results for incident_id=a4989505-d253-4399-8706-2111cd521e2a written to collection. | |
2015-04-10 10:10:06,956 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428685800_7464 incident_id=a4989505-d253-4399-8706-2111cd521e2a result_id=0 written to collection incident_results | |
2015-04-10 10:10:06,956 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 10:10:06,963 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 10:10:06,963 INFO Alert handler finished. duration=0.732s | |
2015-04-10 10:15:05,684 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428686100_7506/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428686100_7506 sessionKey=qqlgqqM8VCpViCFapY8E1Iq9zBxszwcGxsu2EXzm5FQP15Zd00VGyyzoxGOdCHKZGs1ZFk1BiNBAsm7YyDuOYKC8Q7bCVQ9Rd8bfAG3LBbI0XgPJi^HgLVccKH1nff6zUptzJQ9BI26318o alert=TO AD Audit Rule Alert | |
2015-04-10 10:15:05,690 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428686100_7506' has been fired. | |
2015-04-10 10:15:05,966 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-10 10:15:05,967 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 10:15:05,973 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 10:15:05,973 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 10:15:05,973 DEBUG Incident config after getting settings: {"subcategory": "unknown", "_key": "5526b8b13403e13421162b09", "tags": "AD", "run_alert_script": false, "alert": "TO AD Audit Rule Alert", "auto_previous_resolve": false, "auto_assign_user": "", "alert_script": "", "category": "Active Directory", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "urgency": "medium", "auto_assign": false, "_user": "nobody"} | |
2015-04-10 10:15:05,981 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 8 results. | |
2015-04-10 10:15:05,999 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 10:15:05,999 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 10:15:06,027 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 10:15:06,028 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 10:15:06,028 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 10:15:06,028 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 10:15:06,028 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428686100_7506 | |
2015-04-10 10:15:06,332 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=64b35302-eba5-44f1-96a5-09f631e20d01 | |
2015-04-10 10:15:06,359 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 10:15:06,359 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 10:15:06,359 DEBUG Querying lookup with filter={'impact': 'low', 'urgency': u'medium'} | |
2015-04-10 10:15:06,360 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 10:15:06,387 DEBUG Create event will be: time=2015-04-10T10:15:06.387740 severity=INFO origin="alert_handler" event_id="4164b1e84b8311597c35fe2394ccd5b9" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="64b35302-eba5-44f1-96a5-09f631e20d01" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428686100_7506" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428686101" | |
2015-04-10 10:15:06,394 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428686100_7506 with incident_id=64b35302-eba5-44f1-96a5-09f631e20d01 | |
2015-04-10 10:15:06,422 DEBUG results for incident_id=64b35302-eba5-44f1-96a5-09f631e20d01 written to collection. | |
2015-04-10 10:15:06,422 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428686100_7506 incident_id=64b35302-eba5-44f1-96a5-09f631e20d01 result_id=0 written to collection incident_results | |
2015-04-10 10:15:06,422 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 10:15:06,429 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 10:15:06,429 INFO Alert handler finished. duration=0.745s | |
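The run above logs "Transformed 24h into 86400 seconds" when parsing the saved search's `expiry` setting. A minimal sketch of such a unit-suffix conversion follows; the function name and unit table are assumptions for illustration, not the actual alert_manager implementation.

```python
# Hypothetical sketch of the "Transformed 24h into 86400 seconds" step.
# expiry_to_seconds and its unit table are illustrative assumptions,
# not code from the alert_manager app.

def expiry_to_seconds(expiry: str) -> int:
    """Convert an expiry string like '24h' into a number of seconds."""
    units = {"s": 1, "m": 60, "h": 3600, "d": 86400}
    suffix = expiry[-1]
    if suffix in units:
        return int(expiry[:-1]) * units[suffix]
    # No recognized unit suffix: treat the value as plain seconds.
    return int(expiry)

print(expiry_to_seconds("24h"))  # 86400, matching the TTL in the log
```

With `expiry=24h` this yields the `ttl="86400"` seen in the create event below it.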
2015-04-10 10:30:06,111 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428687000_7577/results.csv.gz job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428687000_7577 sessionKey=kjDwZfb1z_tCEHTEjcnERn^p^XP1hlmCjV17WXhHafHUQJ2gr_ny5DFTZHOcFGeeDMee589OXfDYhfIyIch9Jkq9uGbLLY9_YxndESz17384M^0Ccl9ykLevemppzvXV0TcbSi6clbB6KC alert=TO AD Audit Rule Alert | |
2015-04-10 10:30:06,118 INFO alert_handler started because alert 'TO AD Audit Rule Alert' with id 'scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428687000_7577' has been fired. | |
2015-04-10 10:30:06,390 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-10 10:30:06,390 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20AD%20Audit%20Rule%20Alert%22%7D | |
2015-04-10 10:30:06,397 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "TO AD Audit Rule Alert", "tags" : "AD", "category" : "Active Directory", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b09" } ] | |
2015-04-10 10:30:06,397 INFO Found incident settings for TO AD Audit Rule Alert | |
2015-04-10 10:30:06,397 DEBUG Incident config after getting settings: {"subcategory": "unknown", "_key": "5526b8b13403e13421162b09", "tags": "AD", "run_alert_script": false, "alert": "TO AD Audit Rule Alert", "auto_previous_resolve": false, "auto_assign_user": "", "alert_script": "", "category": "Active Directory", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "urgency": "medium", "auto_assign": false, "_user": "nobody"} | |
2015-04-10 10:30:06,405 INFO Found job for alert TO AD Audit Rule Alert. Context is 'search' with 1 results. | |
2015-04-10 10:30:06,422 DEBUG Parsed savedsearch settings: severity=2 expiry=24h digest_mode=True | |
2015-04-10 10:30:06,422 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 10:30:06,447 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 10:30:06,447 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 10:30:06,447 DEBUG Querying lookup with filter={'severity_id': '2'} | |
2015-04-10 10:30:06,447 DEBUG Matched impact in lookup, returning value=low | |
2015-04-10 10:30:06,447 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428687000_7577 | |
2015-04-10 10:30:06,741 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=aab76e2e-aebb-4a96-93be-766309d10bfe | |
2015-04-10 10:30:06,767 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 10:30:06,767 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 10:30:06,768 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'low'} | |
2015-04-10 10:30:06,768 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 10:30:06,797 DEBUG Create event will be: time=2015-04-10T10:30:06.797059 severity=INFO origin="alert_handler" event_id="c4ea0b4c79cb3135378dc6e97e3471ba" user="splunk-system-user" action="create" alert="TO AD Audit Rule Alert" incident_id="aab76e2e-aebb-4a96-93be-766309d10bfe" job_id="scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428687000_7577" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428687001" | |
2015-04-10 10:30:06,803 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428687000_7577 with incident_id=aab76e2e-aebb-4a96-93be-766309d10bfe | |
2015-04-10 10:30:06,831 DEBUG results for incident_id=aab76e2e-aebb-4a96-93be-766309d10bfe written to collection. | |
2015-04-10 10:30:06,832 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD55810cbf1bbfc399b_at_1428687000_7577 incident_id=aab76e2e-aebb-4a96-93be-766309d10bfe result_id=0 written to collection incident_results | |
2015-04-10 10:30:06,832 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 10:30:06,839 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 10:30:06,839 INFO Alert handler finished. duration=0.728s | |
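Each run resolves impact from `severity_id` via `alert_impact.csv`, then priority from the impact/urgency pair via `alert_priority.csv.sample`, as the "Querying lookup with filter=..." lines show. The sketch below mimics that two-step CSV lookup; the sample rows and the `query_lookup` helper are illustrative assumptions, since the real lookup files ship with the alert_manager app and may contain different mappings.

```python
import csv
import io

# Illustrative stand-ins for alert_impact.csv and alert_priority.csv.sample.
# Row contents are assumptions chosen to reproduce the values in this log.
ALERT_IMPACT = """severity_id,impact
2,low
4,medium
"""
ALERT_PRIORITY = """impact,urgency,priority
low,medium,low
medium,low,low
"""

def query_lookup(content, filt):
    """Return the first CSV row matching every field in the filter,
    like the 'Querying lookup with filter=...' DEBUG lines above."""
    for row in csv.DictReader(io.StringIO(content)):
        if all(row.get(k) == v for k, v in filt.items()):
            return row
    return None

# severity=2 -> impact=low, then (impact=low, urgency=medium) -> priority=low
impact = query_lookup(ALERT_IMPACT, {"severity_id": "2"})["impact"]
priority = query_lookup(ALERT_PRIORITY,
                        {"impact": impact, "urgency": "medium"})["priority"]
print(impact, priority)  # low low
```

This reproduces the `Matched impact ... value=low` and `Matched priority ... value=low` pair from the 10:30 run.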
2015-04-10 11:29:04,133 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690540_7889/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690540_7889 sessionKey=25Up^H7tmneq9dePKOlRxOL8cmyKFksMt6_CK_ndj3xfh^zvafnsClLrr2ksVtfZqQxD63wTAFKO4C4N3gYu9sQbHJAwG^lPN7y3os4i4kpXmW4ZzCD_5U0b929R67AH_YWVmzZ5rtZN^sR alert=Test Alert | |
2015-04-10 11:29:04,139 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690540_7889' has been fired. | |
2015-04-10 11:29:04,413 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_owner": "unassigned", "default_priority": "low"} | |
2015-04-10 11:29:04,413 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:29:04,419 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:29:04,419 INFO Found incident settings for Test Alert | |
2015-04-10 11:29:04,419 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "auto_ttl_resolve": false, "_user": "nobody", "auto_assign": false, "category": "unknown", "urgency": "low", "run_alert_script": false, "alert_script": "", "auto_assign_user": "", "subcategory": "unknown", "tags": "[Untagged]", "_key": "5526c2413403e135d146268b", "alert": "Test Alert", "auto_assign_owner": "unassigned"} | |
2015-04-10 11:29:04,427 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:29:04,444 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:29:04,444 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:29:04,469 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:29:04,469 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:29:04,469 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:29:04,469 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:29:04,469 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690540_7889 | |
2015-04-10 11:29:04,766 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=271cb033-92bb-485f-8659-3a9c47cb5fe0 | |
2015-04-10 11:29:04,792 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:29:04,792 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:29:04,792 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-10 11:29:04,792 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:29:04,812 DEBUG Create event will be: time=2015-04-10T11:29:04.812407 severity=INFO origin="alert_handler" event_id="c42fe395051df57819bd369f2a1bef54" user="splunk-system-user" action="create" alert="Test Alert" incident_id="271cb033-92bb-485f-8659-3a9c47cb5fe0" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690540_7889" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690541" | |
2015-04-10 11:29:04,819 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690540_7889 with incident_id=271cb033-92bb-485f-8659-3a9c47cb5fe0 | |
2015-04-10 11:29:04,847 DEBUG results for incident_id=271cb033-92bb-485f-8659-3a9c47cb5fe0 written to collection. | |
2015-04-10 11:29:04,847 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690540_7889 incident_id=271cb033-92bb-485f-8659-3a9c47cb5fe0 result_id=0 written to collection incident_results | |
2015-04-10 11:29:04,847 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:29:04,854 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:29:04,854 INFO Alert handler finished. duration=0.721s | |
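The "Create event will be: ..." lines assemble the incident's audit event as space-separated `key="value"` pairs. A rough sketch of that formatting step is below; the function name, field ordering, and quoting rules are assumptions for illustration only.

```python
# Hypothetical sketch of the 'Create event will be: ...' formatting step.
# format_create_event is an illustrative helper, not alert_manager code.

def format_create_event(time_iso, **fields):
    """Build a key="value" audit event string, prefixed with the
    unquoted time/severity/origin fields seen in the log."""
    pairs = " ".join(f'{k}="{v}"' for k, v in fields.items())
    return f'time={time_iso} severity=INFO origin="alert_handler" {pairs}'

event = format_create_event(
    "2015-04-10T11:29:04.812407",
    action="create",
    alert="Test Alert",
    owner="unassigned",
    status="new",
    urgency="low",
    ttl="86400",
)
print(event)
```

Quoting every value keeps multi-word fields such as `alert="Test Alert"` intact when the event is later parsed back out of the index.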
2015-04-10 11:30:06,414 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690600_7900/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690600_7900 sessionKey=EKugb64LeShPlWLzYLyf5vq4PGLcMwdhFRT9ClchYOKENJ1FAXQ745yA7uOYo6FYxf^ING2u1nGgfnBvdsBERdtmMOhDjCh3oWjNFZqTB149lTaoXd8a4QNfuW22^pFPeC7NUaqNefL alert=Test Alert | |
2015-04-10 11:30:06,420 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690600_7900' has been fired. | |
2015-04-10 11:30:06,695 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_owner": "unassigned"} | |
2015-04-10 11:30:06,696 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:30:06,702 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:30:06,702 INFO Found incident settings for Test Alert | |
2015-04-10 11:30:06,702 DEBUG Incident config after getting settings: {"auto_assign_user": "", "urgency": "low", "auto_assign": false, "category": "unknown", "subcategory": "unknown", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "run_alert_script": false, "_key": "5526c2413403e135d146268b", "auto_previous_resolve": false, "alert_script": "", "alert": "Test Alert", "tags": "[Untagged]", "_user": "nobody"} | |
2015-04-10 11:30:06,710 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:30:06,727 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:30:06,727 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:30:06,752 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:30:06,752 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:30:06,752 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:30:06,753 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:30:06,753 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690600_7900 | |
2015-04-10 11:30:07,060 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=5c458b92-95c2-49be-ba96-6cd4518fd03e | |
2015-04-10 11:30:07,086 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:30:07,086 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:30:07,086 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-10 11:30:07,086 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:30:07,133 DEBUG Create event will be: time=2015-04-10T11:30:07.133764 severity=INFO origin="alert_handler" event_id="a2ab2d481b6c45e1521cb0e8e06b7a66" user="splunk-system-user" action="create" alert="Test Alert" incident_id="5c458b92-95c2-49be-ba96-6cd4518fd03e" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690600_7900" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690601" | |
2015-04-10 11:30:07,140 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690600_7900 with incident_id=5c458b92-95c2-49be-ba96-6cd4518fd03e | |
2015-04-10 11:30:07,167 DEBUG results for incident_id=5c458b92-95c2-49be-ba96-6cd4518fd03e written to collection. | |
2015-04-10 11:30:07,168 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690600_7900 incident_id=5c458b92-95c2-49be-ba96-6cd4518fd03e result_id=0 written to collection incident_results | |
2015-04-10 11:30:07,168 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:30:07,175 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:30:07,175 INFO Alert handler finished. duration=0.761s | |
2015-04-10 11:31:25,999 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690660_7929/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690660_7929 sessionKey=G1gLiPo1LbDqHCinc4hAtn_Ds1Rg8_UEPO_OyS_3YXRbQJxBlQ_eBMYXRDuqAeFFP23PhaenhJQCGT0KX8bDpVzK0n049_j7VlrVgpfV24GFINi4V2Lz1L9ewTUT0iLwihZEhGmLZu3 alert=Test Alert | |
2015-04-10 11:31:26,004 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690660_7929' has been fired. | |
2015-04-10 11:31:26,274 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low"} | |
2015-04-10 11:31:26,274 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:31:26,280 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:31:26,280 INFO Found incident settings for Test Alert | |
2015-04-10 11:31:26,280 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "auto_assign": false, "run_alert_script": false, "subcategory": "unknown", "alert": "Test Alert", "tags": "[Untagged]", "_key": "5526c2413403e135d146268b", "urgency": "low", "alert_script": "", "_user": "nobody", "auto_previous_resolve": false, "category": "unknown"} | |
2015-04-10 11:31:26,289 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:31:26,307 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:31:26,307 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:31:26,333 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:31:26,334 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:31:26,334 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:31:26,334 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:31:26,334 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690660_7929 | |
2015-04-10 11:31:26,630 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=b62335d3-6886-45fd-9f48-34539bb73dd3 | |
2015-04-10 11:31:26,663 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:31:26,664 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:31:26,664 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-10 11:31:26,664 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:31:26,696 DEBUG Create event will be: time=2015-04-10T11:31:26.696140 severity=INFO origin="alert_handler" event_id="343d0f6a9c50ba66af9e96bdbf34c74a" user="splunk-system-user" action="create" alert="Test Alert" incident_id="b62335d3-6886-45fd-9f48-34539bb73dd3" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690660_7929" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690683" | |
2015-04-10 11:31:26,704 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690660_7929 with incident_id=b62335d3-6886-45fd-9f48-34539bb73dd3 | |
2015-04-10 11:31:26,730 DEBUG results for incident_id=b62335d3-6886-45fd-9f48-34539bb73dd3 written to collection. | |
2015-04-10 11:31:26,730 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690660_7929 incident_id=b62335d3-6886-45fd-9f48-34539bb73dd3 result_id=0 written to collection incident_results | |
2015-04-10 11:31:26,730 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:31:26,737 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:31:26,737 INFO Alert handler finished. duration=0.739s | |
2015-04-10 11:32:04,200 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690720_7931/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690720_7931 sessionKey=N6cqEfLGYCyVKZYIQMiBvejVtAgY88fjHlKUES2ihu8xSeutclx3riRH5BH0QgJL_TccuJWEylcFefz9cJp69Ik97bA5WhgArqTE48y3osw0bXVSapRW9DQBpzdokvcJmmbjY4lvzuBn6Wwoy10 alert=Test Alert | |
2015-04-10 11:32:04,206 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690720_7931' has been fired. | |
2015-04-10 11:32:04,471 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_owner": "unassigned"} | |
2015-04-10 11:32:04,471 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:32:04,477 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:32:04,477 INFO Found incident settings for Test Alert | |
2015-04-10 11:32:04,478 DEBUG Incident config after getting settings: {"alert_script": "", "auto_ttl_resolve": false, "run_alert_script": false, "auto_assign_owner": "unassigned", "tags": "[Untagged]", "_user": "nobody", "auto_previous_resolve": false, "category": "unknown", "auto_assign_user": "", "alert": "Test Alert", "auto_assign": false, "subcategory": "unknown", "urgency": "low", "_key": "5526c2413403e135d146268b"} | |
2015-04-10 11:32:04,485 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:32:04,503 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:32:04,504 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:32:04,530 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:32:04,530 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:32:04,530 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:32:04,530 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:32:04,531 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690720_7931 | |
2015-04-10 11:32:04,824 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=3c82bb05-c036-46fd-994a-059e91e31127 | |
2015-04-10 11:32:04,850 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:32:04,850 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:32:04,850 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-10 11:32:04,851 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:32:04,894 DEBUG Create event will be: time=2015-04-10T11:32:04.894478 severity=INFO origin="alert_handler" event_id="3598f15a02822207401b1c419542904f" user="splunk-system-user" action="create" alert="Test Alert" incident_id="3c82bb05-c036-46fd-994a-059e91e31127" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690720_7931" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690721" | |
2015-04-10 11:32:04,901 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690720_7931 with incident_id=3c82bb05-c036-46fd-994a-059e91e31127 | |
2015-04-10 11:32:04,929 DEBUG results for incident_id=3c82bb05-c036-46fd-994a-059e91e31127 written to collection. | |
2015-04-10 11:32:04,929 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690720_7931 incident_id=3c82bb05-c036-46fd-994a-059e91e31127 result_id=0 written to collection incident_results | |
2015-04-10 11:32:04,929 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:32:04,936 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:32:04,936 INFO Alert handler finished. duration=0.737s | |
2015-04-10 11:33:04,216 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690780_7933/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690780_7933 sessionKey=En1J4ZKBGiW56uyXRMuIkvi5DMvFSLGJLFxcK_LD_FCFXz_wid^ztYGoZ14e5inWJpkyLtAWjHRJC_uJphtumSYT3AIcEMEEY0mAEtlk86KqPzT259WyUeeoVAloLSzcY52X2^yi9jubRIMM alert=Test Alert | |
2015-04-10 11:33:04,222 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690780_7933' has been fired. | |
2015-04-10 11:33:04,493 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"} | |
2015-04-10 11:33:04,493 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:33:04,500 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:33:04,500 INFO Found incident settings for Test Alert | |
2015-04-10 11:33:04,500 DEBUG Incident config after getting settings: {"auto_assign_user": "", "category": "unknown", "auto_assign": false, "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "run_alert_script": false, "_user": "nobody", "auto_previous_resolve": false, "alert_script": "", "alert": "Test Alert", "tags": "[Untagged]", "urgency": "low"} | |
2015-04-10 11:33:04,508 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:33:04,526 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:33:04,526 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:33:04,551 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:33:04,551 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:33:04,551 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:33:04,552 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:33:04,552 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690780_7933 | |
2015-04-10 11:33:04,843 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=e8d01830-413b-4641-911e-e86835b03e79 | |
2015-04-10 11:33:04,869 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:33:04,869 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:33:04,869 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-10 11:33:04,870 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:33:04,904 DEBUG Create event will be: time=2015-04-10T11:33:04.904099 severity=INFO origin="alert_handler" event_id="ca6ff27cd1920953cf3cc50e1ff6d6e6" user="splunk-system-user" action="create" alert="Test Alert" incident_id="e8d01830-413b-4641-911e-e86835b03e79" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690780_7933" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690781" | |
2015-04-10 11:33:04,910 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690780_7933 with incident_id=e8d01830-413b-4641-911e-e86835b03e79 | |
2015-04-10 11:33:04,938 DEBUG results for incident_id=e8d01830-413b-4641-911e-e86835b03e79 written to collection. | |
2015-04-10 11:33:04,938 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690780_7933 incident_id=e8d01830-413b-4641-911e-e86835b03e79 result_id=0 written to collection incident_results | |
2015-04-10 11:33:04,938 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:33:04,945 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:33:04,946 INFO Alert handler finished. duration=0.73s | |
2015-04-10 11:34:04,212 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690840_7936/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690840_7936 sessionKey=Civ_DHJbQZd7S3KLMANTp_lmJSjA7zfmIAnX7a6NVNppofTjJ29oxa_gTd5zPDXg5oZrPkmXEgI4qH46cm8S0JB1teMKWKMBWyFVcN8DfWPevY^K93B8KgOMIdShgu061LoUE2tmvsrD alert=Test Alert | |
2015-04-10 11:34:04,218 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690840_7936' has been fired. | |
2015-04-10 11:34:04,487 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf"} | |
2015-04-10 11:34:04,487 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:34:04,493 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:34:04,493 INFO Found incident settings for Test Alert
2015-04-10 11:34:04,493 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "auto_assign_user": "", "auto_ttl_resolve": false, "run_alert_script": false, "category": "unknown", "auto_assign": false, "tags": "[Untagged]", "alert": "Test Alert", "subcategory": "unknown", "urgency": "low", "_key": "5526c2413403e135d146268b", "_user": "nobody", "alert_script": "", "auto_previous_resolve": false}
2015-04-10 11:34:04,501 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:34:04,518 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:34:04,519 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:34:04,544 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:34:04,545 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:34:04,545 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:34:04,545 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:34:04,545 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690840_7936
2015-04-10 11:34:04,835 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=37294a69-dbb9-4e5e-9bb1-9069c06438a9
2015-04-10 11:34:04,860 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:34:04,860 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:34:04,861 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:34:04,861 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:34:04,895 DEBUG Create event will be: time=2015-04-10T11:34:04.895306 severity=INFO origin="alert_handler" event_id="10a373290782dbd8133455ca224b39ed" user="splunk-system-user" action="create" alert="Test Alert" incident_id="37294a69-dbb9-4e5e-9bb1-9069c06438a9" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690840_7936" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690841"
2015-04-10 11:34:04,901 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690840_7936 with incident_id=37294a69-dbb9-4e5e-9bb1-9069c06438a9
2015-04-10 11:34:04,929 DEBUG results for incident_id=37294a69-dbb9-4e5e-9bb1-9069c06438a9 written to collection.
2015-04-10 11:34:04,929 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690840_7936 incident_id=37294a69-dbb9-4e5e-9bb1-9069c06438a9 result_id=0 written to collection incident_results
2015-04-10 11:34:04,929 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:34:04,936 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:34:04,936 INFO Alert handler finished. duration=0.725s
2015-04-10 11:35:04,744 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690900_7943/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690900_7943 sessionKey=v9OgBMKUr5xME9fUpA9FpdMUI0abAY3j23ez31i4SirJSmmvWOjWmCC5xI_j_WSyJ3fb8L54LVaVrs1iifRKX64yRwWqxEU09Y^HVV3VZmd_yMLs3^4_TSXpaJNQ3SB0ELRRJl_a1bxLTC alert=Test Alert
2015-04-10 11:35:04,750 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690900_7943' has been fired.
2015-04-10 11:35:05,040 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-10 11:35:05,040 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:35:05,046 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:35:05,047 INFO Found incident settings for Test Alert
2015-04-10 11:35:05,047 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "_key": "5526c2413403e135d146268b", "_user": "nobody", "auto_ttl_resolve": false, "subcategory": "unknown", "run_alert_script": false, "urgency": "low", "auto_assign_owner": "unassigned", "auto_assign_user": "", "auto_assign": false, "category": "unknown", "alert": "Test Alert", "alert_script": "", "tags": "[Untagged]"}
2015-04-10 11:35:05,054 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:35:05,071 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:35:05,071 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:35:05,097 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:35:05,097 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:35:05,097 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:35:05,097 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:35:05,098 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690900_7943
2015-04-10 11:35:05,396 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a0f6a9e4-8475-4e1e-9449-1c30d749fad5
2015-04-10 11:35:05,422 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:35:05,422 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:35:05,422 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:35:05,423 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:35:05,444 DEBUG Create event will be: time=2015-04-10T11:35:05.444163 severity=INFO origin="alert_handler" event_id="6a00f0d4c40419ac6503c803226290dc" user="splunk-system-user" action="create" alert="Test Alert" incident_id="a0f6a9e4-8475-4e1e-9449-1c30d749fad5" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690900_7943" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690901"
2015-04-10 11:35:05,450 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690900_7943 with incident_id=a0f6a9e4-8475-4e1e-9449-1c30d749fad5
2015-04-10 11:35:05,478 DEBUG results for incident_id=a0f6a9e4-8475-4e1e-9449-1c30d749fad5 written to collection.
2015-04-10 11:35:05,478 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690900_7943 incident_id=a0f6a9e4-8475-4e1e-9449-1c30d749fad5 result_id=0 written to collection incident_results
2015-04-10 11:35:05,478 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:35:05,487 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:35:05,487 INFO Alert handler finished. duration=0.743s
2015-04-10 11:36:04,078 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690960_7946/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690960_7946 sessionKey=oMjNdqn4D520FFWAK7SorVPaNn8sTA7drkQVloD5dxf^3DzCDSr6Cjvp1aCp4pr82YALOMHgPps5D4BUGh23HY_rYb1B7dXXUorBzli^RiyeWYCz_4gkr6YS9epLAoibAo4OcEcAM0RKgF alert=Test Alert
2015-04-10 11:36:04,084 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690960_7946' has been fired.
2015-04-10 11:36:04,353 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_urgency": "low"}
2015-04-10 11:36:04,353 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:36:04,359 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:36:04,359 INFO Found incident settings for Test Alert
2015-04-10 11:36:04,359 DEBUG Incident config after getting settings: {"alert": "Test Alert", "tags": "[Untagged]", "run_alert_script": false, "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "alert_script": "", "auto_assign_user": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "category": "unknown", "_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "urgency": "low"}
2015-04-10 11:36:04,367 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:36:04,384 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:36:04,384 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:36:04,409 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:36:04,409 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:36:04,409 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:36:04,409 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:36:04,410 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690960_7946
2015-04-10 11:36:04,700 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=2f096944-335c-441d-8e84-e2bac977eb80
2015-04-10 11:36:04,726 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:36:04,726 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:36:04,726 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:36:04,726 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:36:04,763 DEBUG Create event will be: time=2015-04-10T11:36:04.763357 severity=INFO origin="alert_handler" event_id="c61817a30b5ec0ed3dd258359a67552f" user="splunk-system-user" action="create" alert="Test Alert" incident_id="2f096944-335c-441d-8e84-e2bac977eb80" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690960_7946" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428690961"
2015-04-10 11:36:04,769 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690960_7946 with incident_id=2f096944-335c-441d-8e84-e2bac977eb80
2015-04-10 11:36:04,797 DEBUG results for incident_id=2f096944-335c-441d-8e84-e2bac977eb80 written to collection.
2015-04-10 11:36:04,797 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428690960_7946 incident_id=2f096944-335c-441d-8e84-e2bac977eb80 result_id=0 written to collection incident_results
2015-04-10 11:36:04,797 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:36:04,804 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:36:04,804 INFO Alert handler finished. duration=0.726s
2015-04-10 11:37:03,991 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691020_7948/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691020_7948 sessionKey=h5ZAt4EhRAYM1VGX77CSySy92GXy3o5BoCh720jADxYuf5uvezscWbvGsUvlCgPRKzmzpI_37_uz3R7TYZhNXGg2qusFX8yY0VrQCbZhFZx4f9m_ItAgZagpJrsuWVUo5GuBbrNg22aaN6YE2F alert=Test Alert
2015-04-10 11:37:03,997 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691020_7948' has been fired.
2015-04-10 11:37:04,268 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-10 11:37:04,268 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:37:04,274 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:37:04,274 INFO Found incident settings for Test Alert
2015-04-10 11:37:04,274 DEBUG Incident config after getting settings: {"urgency": "low", "_key": "5526c2413403e135d146268b", "auto_assign_user": "", "subcategory": "unknown", "alert": "Test Alert", "_user": "nobody", "alert_script": "", "auto_previous_resolve": false, "auto_assign": false, "auto_ttl_resolve": false, "run_alert_script": false, "auto_assign_owner": "unassigned", "category": "unknown", "tags": "[Untagged]"}
2015-04-10 11:37:04,283 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:37:04,301 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:37:04,301 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:37:04,327 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:37:04,328 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:37:04,328 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:37:04,328 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:37:04,328 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691020_7948
2015-04-10 11:37:04,619 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=3042d618-5858-4793-a46f-421f322d0b5f
2015-04-10 11:37:04,644 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:37:04,645 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:37:04,645 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:37:04,645 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:37:04,664 DEBUG Create event will be: time=2015-04-10T11:37:04.664065 severity=INFO origin="alert_handler" event_id="82501bc75868a6af031e93c33e964703" user="splunk-system-user" action="create" alert="Test Alert" incident_id="3042d618-5858-4793-a46f-421f322d0b5f" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691020_7948" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691021"
2015-04-10 11:37:04,669 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691020_7948 with incident_id=3042d618-5858-4793-a46f-421f322d0b5f
2015-04-10 11:37:04,698 DEBUG results for incident_id=3042d618-5858-4793-a46f-421f322d0b5f written to collection.
2015-04-10 11:37:04,698 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691020_7948 incident_id=3042d618-5858-4793-a46f-421f322d0b5f result_id=0 written to collection incident_results
2015-04-10 11:37:04,698 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:37:04,704 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:37:04,705 INFO Alert handler finished. duration=0.714s
2015-04-10 11:38:03,931 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691080_7950/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691080_7950 sessionKey=r6WGGMwggzNrboZCqifr2s^vzMjRvmi_xW1Nd^mqtolUN1hWQWgSeWFdJXcblHOWF1aHQSjIxUSKjumw1tI5d^Gt84BTkRPqh68PppAb_0Qn0QMmd7Sz9fv9B43lZZC9AmrYuASsYPo alert=Test Alert
2015-04-10 11:38:03,936 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691080_7950' has been fired.
2015-04-10 11:38:04,208 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned"}
2015-04-10 11:38:04,209 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:38:04,215 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:38:04,215 INFO Found incident settings for Test Alert
2015-04-10 11:38:04,215 DEBUG Incident config after getting settings: {"subcategory": "unknown", "auto_ttl_resolve": false, "category": "unknown", "_key": "5526c2413403e135d146268b", "auto_assign": false, "auto_assign_owner": "unassigned", "tags": "[Untagged]", "alert": "Test Alert", "alert_script": "", "auto_previous_resolve": false, "urgency": "low", "auto_assign_user": "", "run_alert_script": false, "_user": "nobody"}
2015-04-10 11:38:04,222 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:38:04,239 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:38:04,239 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:38:04,264 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:38:04,264 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:38:04,264 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:38:04,264 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:38:04,264 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691080_7950
2015-04-10 11:38:04,561 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=db114286-b95c-4be4-833f-8c6a810b713d
2015-04-10 11:38:04,591 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:38:04,591 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:38:04,591 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:38:04,591 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:38:04,630 DEBUG Create event will be: time=2015-04-10T11:38:04.630819 severity=INFO origin="alert_handler" event_id="b6c37a766e3b7bf6347b4ac534ff0047" user="splunk-system-user" action="create" alert="Test Alert" incident_id="db114286-b95c-4be4-833f-8c6a810b713d" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691080_7950" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691081"
2015-04-10 11:38:04,637 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691080_7950 with incident_id=db114286-b95c-4be4-833f-8c6a810b713d
2015-04-10 11:38:04,665 DEBUG results for incident_id=db114286-b95c-4be4-833f-8c6a810b713d written to collection.
2015-04-10 11:38:04,665 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691080_7950 incident_id=db114286-b95c-4be4-833f-8c6a810b713d result_id=0 written to collection incident_results
2015-04-10 11:38:04,665 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:38:04,672 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:38:04,672 INFO Alert handler finished. duration=0.742s
2015-04-10 11:39:03,919 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691140_7952/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691140_7952 sessionKey=GG2_BS8rpsYCGEwDRVGzHj6ALL90DCllHmXC2gaeHhe5CQPGVH2J6MDww4pl1LcUPuOaOaI2^2Av^E8pgCehaBuEVDrrxCdm8ZtkUivtUBS3m16swu^y^WwvZiUe^igPlQvlwzrO24OjVN alert=Test Alert
2015-04-10 11:39:03,925 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691140_7952' has been fired.
2015-04-10 11:39:04,192 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_urgency": "low", "default_priority": "low", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-10 11:39:04,192 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:39:04,198 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:39:04,198 INFO Found incident settings for Test Alert
2015-04-10 11:39:04,198 DEBUG Incident config after getting settings: {"auto_assign_user": "", "alert_script": "", "tags": "[Untagged]", "category": "unknown", "_user": "nobody", "run_alert_script": false, "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "urgency": "low", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "auto_assign": false, "alert": "Test Alert", "auto_previous_resolve": false}
2015-04-10 11:39:04,206 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:39:04,223 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:39:04,223 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:39:04,248 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:39:04,248 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:39:04,248 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:39:04,249 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:39:04,249 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691140_7952
2015-04-10 11:39:04,539 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=115f1a12-20fb-400e-90cb-e5a4ea8c4623
2015-04-10 11:39:04,565 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:39:04,566 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:39:04,566 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:39:04,566 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:39:04,599 DEBUG Create event will be: time=2015-04-10T11:39:04.599184 severity=INFO origin="alert_handler" event_id="e8467c49a291d7b04c066777cb0fb273" user="splunk-system-user" action="create" alert="Test Alert" incident_id="115f1a12-20fb-400e-90cb-e5a4ea8c4623" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691140_7952" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691141"
2015-04-10 11:39:04,605 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691140_7952 with incident_id=115f1a12-20fb-400e-90cb-e5a4ea8c4623
2015-04-10 11:39:04,633 DEBUG results for incident_id=115f1a12-20fb-400e-90cb-e5a4ea8c4623 written to collection.
2015-04-10 11:39:04,633 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691140_7952 incident_id=115f1a12-20fb-400e-90cb-e5a4ea8c4623 result_id=0 written to collection incident_results
2015-04-10 11:39:04,633 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:39:04,640 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:39:04,640 INFO Alert handler finished. duration=0.722s
2015-04-10 11:40:05,646 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691200_7961/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691200_7961 sessionKey=aCbfkNmFXFROe^v1AalYTcuk2qEh1mO02Vpipf5xZo6dNRbBTkH8ulELrYcKNU7fJJttm^qP3bUnvWUM_w6OvKDFlYWddQEPUsVeEvpn3GdjyYtelO6iytrLZjQI6MKlSepLfBUyJchEWo alert=Test Alert
2015-04-10 11:40:05,652 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691200_7961' has been fired.
2015-04-10 11:40:05,925 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-10 11:40:05,925 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:40:05,931 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:40:05,931 INFO Found incident settings for Test Alert
2015-04-10 11:40:05,931 DEBUG Incident config after getting settings: {"run_alert_script": false, "subcategory": "unknown", "alert": "Test Alert", "category": "unknown", "_user": "nobody", "alert_script": "", "urgency": "low", "tags": "[Untagged]", "auto_assign": false, "auto_previous_resolve": false, "auto_assign_user": "", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "_key": "5526c2413403e135d146268b"}
2015-04-10 11:40:05,939 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:40:05,956 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:40:05,956 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:40:05,982 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:40:05,982 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:40:05,983 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:40:05,983 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:40:05,983 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691200_7961
2015-04-10 11:40:06,271 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=2fbc11f5-8d85-44d2-b2fd-49540625d3f3
2015-04-10 11:40:06,297 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:40:06,298 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:40:06,298 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:40:06,298 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:40:06,341 DEBUG Create event will be: time=2015-04-10T11:40:06.341008 severity=INFO origin="alert_handler" event_id="dd831cc31dc31aee0f4a7dc0b8eb97de" user="splunk-system-user" action="create" alert="Test Alert" incident_id="2fbc11f5-8d85-44d2-b2fd-49540625d3f3" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691200_7961" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691201"
2015-04-10 11:40:06,347 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691200_7961 with incident_id=2fbc11f5-8d85-44d2-b2fd-49540625d3f3
2015-04-10 11:40:06,375 DEBUG results for incident_id=2fbc11f5-8d85-44d2-b2fd-49540625d3f3 written to collection.
2015-04-10 11:40:06,375 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691200_7961 incident_id=2fbc11f5-8d85-44d2-b2fd-49540625d3f3 result_id=0 written to collection incident_results
2015-04-10 11:40:06,375 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:40:06,382 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:40:06,382 INFO Alert handler finished. duration=0.737s
2015-04-10 11:41:19,089 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691260_7983/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691260_7983 sessionKey=RH3OtWF2aIwXXXk6kSMSo01mDgUlVFgzGxYkNPyzVSfHc4nOeRiUvHk1fDsAMQVtU9WIuWFFmbhUxJa9u1dBzXVeW3ichnDi7IYjkue5TljARWHXohLpuFPgad2XNlqjDU_rzKhSaXHOOmF7nXr alert=Test Alert
2015-04-10 11:41:19,095 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691260_7983' has been fired.
2015-04-10 11:41:19,367 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low"}
2015-04-10 11:41:19,368 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:41:19,373 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:41:19,374 INFO Found incident settings for Test Alert
2015-04-10 11:41:19,374 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "auto_assign_user": "", "urgency": "low", "alert_script": "", "_user": "nobody", "category": "unknown", "subcategory": "unknown", "alert": "Test Alert", "auto_ttl_resolve": false, "tags": "[Untagged]", "auto_assign_owner": "unassigned", "_key": "5526c2413403e135d146268b", "auto_assign": false, "run_alert_script": false}
2015-04-10 11:41:19,381 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:41:19,399 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:41:19,400 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:41:19,425 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:41:19,425 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:41:19,426 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:41:19,426 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:41:19,426 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691260_7983
2015-04-10 11:41:19,721 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=e6559e33-8ec7-4d8a-a34a-68d62d106c71
2015-04-10 11:41:19,746 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:41:19,747 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:41:19,747 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:41:19,747 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:41:19,769 DEBUG Create event will be: time=2015-04-10T11:41:19.769486 severity=INFO origin="alert_handler" event_id="62f4e8c4c0b527abcb79b8b78261f93e" user="splunk-system-user" action="create" alert="Test Alert" incident_id="e6559e33-8ec7-4d8a-a34a-68d62d106c71" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691260_7983" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691276"
2015-04-10 11:41:19,775 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691260_7983 with incident_id=e6559e33-8ec7-4d8a-a34a-68d62d106c71
2015-04-10 11:41:19,803 DEBUG results for incident_id=e6559e33-8ec7-4d8a-a34a-68d62d106c71 written to collection.
2015-04-10 11:41:19,804 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691260_7983 incident_id=e6559e33-8ec7-4d8a-a34a-68d62d106c71 result_id=0 written to collection incident_results
2015-04-10 11:41:19,804 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:41:19,810 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:41:19,811 INFO Alert handler finished. duration=0.722s
2015-04-10 11:42:04,155 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691320_7986/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691320_7986 sessionKey=BAYjSCKbesES7AJYMpmZo06gA_MPtcCwkW9RyfL5gvhdJ4E_iwmmewh0PsM6Q85V8jPIfzBRLWlNkYBJxQU9J1DTJLzyVJlqee2sPjc9BUZdhm0WDVZiVu8k^45p^H9tXHoDZUpZD06JlTR alert=Test Alert | |
2015-04-10 11:42:04,161 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691320_7986' has been fired. | |
2015-04-10 11:42:04,428 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"} | |
2015-04-10 11:42:04,429 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:42:04,434 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:42:04,435 INFO Found incident settings for Test Alert | |
2015-04-10 11:42:04,435 DEBUG Incident config after getting settings: {"urgency": "low", "auto_assign_owner": "unassigned", "run_alert_script": false, "_user": "nobody", "subcategory": "unknown", "tags": "[Untagged]", "category": "unknown", "auto_assign_user": "", "alert_script": "", "auto_previous_resolve": false, "alert": "Test Alert", "auto_assign": false, "auto_ttl_resolve": false, "_key": "5526c2413403e135d146268b"} | |
2015-04-10 11:42:04,442 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:42:04,460 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:42:04,460 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:42:04,485 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:42:04,485 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:42:04,485 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:42:04,486 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:42:04,486 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691320_7986 | |
2015-04-10 11:42:04,775 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=bca010c0-66b2-4255-9293-5fa4b2d8eb0c | |
2015-04-10 11:42:04,801 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:42:04,801 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:42:04,802 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-10 11:42:04,802 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:42:04,850 DEBUG Create event will be: time=2015-04-10T11:42:04.850673 severity=INFO origin="alert_handler" event_id="a14aa5f6ab59052585abb3c9dc54537b" user="splunk-system-user" action="create" alert="Test Alert" incident_id="bca010c0-66b2-4255-9293-5fa4b2d8eb0c" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691320_7986" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691321" | |
2015-04-10 11:42:04,858 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691320_7986 with incident_id=bca010c0-66b2-4255-9293-5fa4b2d8eb0c | |
2015-04-10 11:42:04,884 DEBUG results for incident_id=bca010c0-66b2-4255-9293-5fa4b2d8eb0c written to collection. | |
2015-04-10 11:42:04,884 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691320_7986 incident_id=bca010c0-66b2-4255-9293-5fa4b2d8eb0c result_id=0 written to collection incident_results | |
2015-04-10 11:42:04,884 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:42:04,891 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:42:04,891 INFO Alert handler finished. duration=0.737s | |
2015-04-10 11:43:04,060 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691380_7988/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691380_7988 sessionKey=qSueAfgwPLAeVtCGkSdH2hdO_^cDluK3lqwUjabn0BZZNXqyYxk4yMuqRXMZRozz1WunF5iMQIFscjJLex7Vn84uVGsOPiqsUqVHK2aJH5yqpgLMl2aJw^h94EZVMujok1VLRSuQ5mQT alert=Test Alert | |
2015-04-10 11:43:04,066 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691380_7988' has been fired. | |
2015-04-10 11:43:04,335 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"} | |
2015-04-10 11:43:04,335 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:43:04,341 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:43:04,341 INFO Found incident settings for Test Alert | |
2015-04-10 11:43:04,341 DEBUG Incident config after getting settings: {"_user": "nobody", "subcategory": "unknown", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "category": "unknown", "auto_previous_resolve": false, "auto_assign_user": "", "run_alert_script": false, "tags": "[Untagged]", "_key": "5526c2413403e135d146268b", "alert": "Test Alert", "auto_assign": false, "urgency": "low", "alert_script": ""} | |
2015-04-10 11:43:04,349 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:43:04,367 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:43:04,367 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:43:04,392 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:43:04,392 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:43:04,393 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:43:04,393 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:43:04,393 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691380_7988 | |
2015-04-10 11:43:04,686 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=38809c3b-30c6-4035-8014-875b3a48c060 | |
2015-04-10 11:43:04,711 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:43:04,712 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:43:04,712 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-10 11:43:04,712 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:43:04,751 DEBUG Create event will be: time=2015-04-10T11:43:04.751798 severity=INFO origin="alert_handler" event_id="e3cc46e1cc4566dc20acf2f6c2087845" user="splunk-system-user" action="create" alert="Test Alert" incident_id="38809c3b-30c6-4035-8014-875b3a48c060" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691380_7988" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691381" | |
2015-04-10 11:43:04,758 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691380_7988 with incident_id=38809c3b-30c6-4035-8014-875b3a48c060 | |
2015-04-10 11:43:04,786 DEBUG results for incident_id=38809c3b-30c6-4035-8014-875b3a48c060 written to collection. | |
2015-04-10 11:43:04,786 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691380_7988 incident_id=38809c3b-30c6-4035-8014-875b3a48c060 result_id=0 written to collection incident_results | |
2015-04-10 11:43:04,786 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:43:04,793 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:43:04,793 INFO Alert handler finished. duration=0.734s | |
2015-04-10 11:44:03,904 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691440_7990/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691440_7990 sessionKey=qJu2W66BLETnNSoc^1g9CmRUGwg5BGpaa7l_1bOwLWeUm9BLtEuSJ0GF3Gl0uCo7ygnUwyYyo0AzNWBGo2cG1zMsiWvjQSKJcTYOtZCkXJmIYSshDG7kWdFN9IdUbBOYWDjK6GN8glL5 alert=Test Alert | |
2015-04-10 11:44:03,910 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691440_7990' has been fired. | |
2015-04-10 11:44:04,174 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"} | |
2015-04-10 11:44:04,174 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:44:04,180 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:44:04,180 INFO Found incident settings for Test Alert | |
2015-04-10 11:44:04,180 DEBUG Incident config after getting settings: {"subcategory": "unknown", "auto_ttl_resolve": false, "_user": "nobody", "category": "unknown", "auto_previous_resolve": false, "_key": "5526c2413403e135d146268b", "tags": "[Untagged]", "alert": "Test Alert", "alert_script": "", "run_alert_script": false, "auto_assign_user": "", "auto_assign": false, "auto_assign_owner": "unassigned", "urgency": "low"} | |
2015-04-10 11:44:04,188 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:44:04,206 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:44:04,206 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:44:04,231 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:44:04,231 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:44:04,231 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:44:04,231 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:44:04,231 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691440_7990 | |
2015-04-10 11:44:04,520 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=ba079ce4-bab0-4017-afaf-65a0ea992993 | |
2015-04-10 11:44:04,545 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:44:04,546 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:44:04,546 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'} | |
2015-04-10 11:44:04,546 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:44:04,578 DEBUG Create event will be: time=2015-04-10T11:44:04.578397 severity=INFO origin="alert_handler" event_id="461cff0d88f44a10d59cf5a936997204" user="splunk-system-user" action="create" alert="Test Alert" incident_id="ba079ce4-bab0-4017-afaf-65a0ea992993" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691440_7990" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691441" | |
2015-04-10 11:44:04,584 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691440_7990 with incident_id=ba079ce4-bab0-4017-afaf-65a0ea992993 | |
2015-04-10 11:44:04,612 DEBUG results for incident_id=ba079ce4-bab0-4017-afaf-65a0ea992993 written to collection. | |
2015-04-10 11:44:04,613 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691440_7990 incident_id=ba079ce4-bab0-4017-afaf-65a0ea992993 result_id=0 written to collection incident_results | |
2015-04-10 11:44:04,613 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:44:04,619 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:44:04,619 INFO Alert handler finished. duration=0.716s | |
2015-04-10 11:45:07,930 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691500_7999/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691500_7999 sessionKey=ZnmkjNouVpKgm29OjBJ5fTNLC3Zw^Ty3RzxhZ^vAj7ExczmRu^xV9K_pVOGRV_q9ZUCt_r2ryQ47s4U2W5SZls8PAFDj15qWibz0W6yrhJxIhNvAJO4VqDDBLFCcbzcK^CuSP_3e7lw3uC alert=Test Alert | |
2015-04-10 11:45:07,936 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691500_7999' has been fired. | |
2015-04-10 11:45:08,222 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_owner": "unassigned"} | |
2015-04-10 11:45:08,222 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:45:08,229 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:45:08,229 INFO Found incident settings for Test Alert | |
2015-04-10 11:45:08,229 DEBUG Incident config after getting settings: {"_user": "nobody", "auto_assign": false, "alert_script": "", "tags": "[Untagged]", "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "urgency": "low", "alert": "Test Alert", "run_alert_script": false, "auto_assign_owner": "unassigned", "category": "unknown", "auto_ttl_resolve": false, "auto_assign_user": "", "auto_previous_resolve": false} | |
2015-04-10 11:45:08,237 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:45:08,255 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:45:08,255 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:45:08,280 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:45:08,280 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:45:08,281 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:45:08,281 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:45:08,281 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691500_7999 | |
2015-04-10 11:45:08,572 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a9f98e07-85db-4919-ba95-670f68855200 | |
2015-04-10 11:45:08,599 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:45:08,599 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:45:08,599 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-10 11:45:08,600 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:45:08,636 DEBUG Create event will be: time=2015-04-10T11:45:08.636010 severity=INFO origin="alert_handler" event_id="79304a21434295780b697cf26918db7e" user="splunk-system-user" action="create" alert="Test Alert" incident_id="a9f98e07-85db-4919-ba95-670f68855200" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691500_7999" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691504" | |
2015-04-10 11:45:08,642 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691500_7999 with incident_id=a9f98e07-85db-4919-ba95-670f68855200 | |
2015-04-10 11:45:08,670 DEBUG results for incident_id=a9f98e07-85db-4919-ba95-670f68855200 written to collection. | |
2015-04-10 11:45:08,670 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691500_7999 incident_id=a9f98e07-85db-4919-ba95-670f68855200 result_id=0 written to collection incident_results | |
2015-04-10 11:45:08,670 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:45:08,677 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:45:08,677 INFO Alert handler finished. duration=0.748s | |
2015-04-10 11:46:04,046 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691560_8007/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691560_8007 sessionKey=DML9Bm3lI7rjxAkh6bbw8VmJ4quc6^_HkcAPMuqFX_Vk1ncdbos5h7YRtDpoCECjQUIbRZje1L6DXhyi4wI46ZCVzHOK9w6RU1Q4bOSj5sV6SOMBhyNFt30vwS2dEbsBCKhdxrN_5oGWNcLmxGS alert=Test Alert | |
2015-04-10 11:46:04,051 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691560_8007' has been fired. | |
2015-04-10 11:46:04,325 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"} | |
2015-04-10 11:46:04,325 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:46:04,331 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:46:04,332 INFO Found incident settings for Test Alert | |
2015-04-10 11:46:04,332 DEBUG Incident config after getting settings: {"subcategory": "unknown", "alert": "Test Alert", "category": "unknown", "_user": "nobody", "run_alert_script": false, "auto_assign_user": "", "auto_previous_resolve": false, "alert_script": "", "auto_assign": false, "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "_key": "5526c2413403e135d146268b", "tags": "[Untagged]", "urgency": "low"} | |
2015-04-10 11:46:04,340 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:46:04,357 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:46:04,357 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:46:04,382 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:46:04,382 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:46:04,383 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:46:04,383 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:46:04,383 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691560_8007 | |
2015-04-10 11:46:04,678 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=9a2adc87-7bb7-4efd-93ca-90c673215fba | |
2015-04-10 11:46:04,704 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:46:04,704 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:46:04,705 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-10 11:46:04,705 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:46:04,721 DEBUG Create event will be: time=2015-04-10T11:46:04.721330 severity=INFO origin="alert_handler" event_id="6f920f305850bea219aea083a449e7da" user="splunk-system-user" action="create" alert="Test Alert" incident_id="9a2adc87-7bb7-4efd-93ca-90c673215fba" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691560_8007" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691561" | |
2015-04-10 11:46:04,727 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691560_8007 with incident_id=9a2adc87-7bb7-4efd-93ca-90c673215fba | |
2015-04-10 11:46:04,755 DEBUG results for incident_id=9a2adc87-7bb7-4efd-93ca-90c673215fba written to collection. | |
2015-04-10 11:46:04,756 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691560_8007 incident_id=9a2adc87-7bb7-4efd-93ca-90c673215fba result_id=0 written to collection incident_results | |
2015-04-10 11:46:04,756 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:46:04,763 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:46:04,763 INFO Alert handler finished. duration=0.718s | |
2015-04-10 11:47:04,015 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691620_8009/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691620_8009 sessionKey=qgdyPBAW5bTEc5yGvWD4dmb2rrsrzL^ckxc_eu7rY4jiyr5ubQeH0QEr3379npFL_u_JDBlDkeE6A3m1rlYZMVKjz2rJGCCprijcQpeTLsIY3HtHmU9gGh7rjIjz3pnrN89d941YXTA alert=Test Alert | |
2015-04-10 11:47:04,021 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691620_8009' has been fired. | |
2015-04-10 11:47:04,288 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"} | |
2015-04-10 11:47:04,288 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:47:04,294 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:47:04,294 INFO Found incident settings for Test Alert | |
2015-04-10 11:47:04,294 DEBUG Incident config after getting settings: {"tags": "[Untagged]", "alert_script": "", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "run_alert_script": false, "alert": "Test Alert", "category": "unknown", "auto_ttl_resolve": false, "urgency": "low", "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "auto_assign": false, "_user": "nobody", "auto_assign_user": ""} | |
2015-04-10 11:47:04,302 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:47:04,319 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:47:04,319 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:47:04,343 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:47:04,344 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:47:04,344 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:47:04,344 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:47:04,344 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691620_8009 | |
2015-04-10 11:47:04,635 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=3f3f5812-480c-465d-8f89-c2b94e658eec | |
2015-04-10 11:47:04,660 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-10 11:47:04,660 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-10 11:47:04,661 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'} | |
2015-04-10 11:47:04,661 DEBUG Matched priority in lookup, returning value=low | |
2015-04-10 11:47:04,691 DEBUG Create event will be: time=2015-04-10T11:47:04.691920 severity=INFO origin="alert_handler" event_id="a6f4a80e23c8304746a1211e2237f72f" user="splunk-system-user" action="create" alert="Test Alert" incident_id="3f3f5812-480c-465d-8f89-c2b94e658eec" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691620_8009" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691621" | |
2015-04-10 11:47:04,698 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691620_8009 with incident_id=3f3f5812-480c-465d-8f89-c2b94e658eec | |
2015-04-10 11:47:04,726 DEBUG results for incident_id=3f3f5812-480c-465d-8f89-c2b94e658eec written to collection. | |
2015-04-10 11:47:04,726 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691620_8009 incident_id=3f3f5812-480c-465d-8f89-c2b94e658eec result_id=0 written to collection incident_results | |
2015-04-10 11:47:04,726 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:47:04,733 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:47:04,733 INFO Alert handler finished. duration=0.718s | |
2015-04-10 11:48:03,966 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691680_8011/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691680_8011 sessionKey=aIi_U14PoMKpHJk7oilH0HjePUzFADPNeQToe_GP61YOVDxf3V7EF1_18SZOuwAtEJlylX1vY9ARE^SeeFCIHPEYGoOH_BDBA1X9fQy_pOFCN2tAEiX8Qy8PzH9b7b1igzEBtpgtdA^xpNbNvo alert=Test Alert | |
2015-04-10 11:48:03,971 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691680_8011' has been fired. | |
2015-04-10 11:48:04,238 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_priority": "low"} | |
2015-04-10 11:48:04,238 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D | |
2015-04-10 11:48:04,244 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ] | |
2015-04-10 11:48:04,244 INFO Found incident settings for Test Alert | |
2015-04-10 11:48:04,244 DEBUG Incident config after getting settings: {"_user": "nobody", "auto_assign_user": "", "auto_assign": false, "auto_ttl_resolve": false, "alert": "Test Alert", "tags": "[Untagged]", "category": "unknown", "subcategory": "unknown", "alert_script": "", "auto_previous_resolve": false, "urgency": "low", "auto_assign_owner": "unassigned", "run_alert_script": false, "_key": "5526c2413403e135d146268b"} | |
2015-04-10 11:48:04,252 INFO Found job for alert Test Alert. Context is 'search' with 2 results. | |
2015-04-10 11:48:04,269 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-10 11:48:04,269 DEBUG Transformed 24h into 86400 seconds | |
2015-04-10 11:48:04,294 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-10 11:48:04,294 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-10 11:48:04,295 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-10 11:48:04,295 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-10 11:48:04,295 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691680_8011 | |
2015-04-10 11:48:04,586 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=8b82d86a-b742-4f61-8f95-9c312015d2f4 | |
2015-04-10 11:48:04,612 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:48:04,612 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:48:04,612 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:48:04,612 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:48:04,629 DEBUG Create event will be: time=2015-04-10T11:48:04.629739 severity=INFO origin="alert_handler" event_id="b2649b683a2c8feea193261002f0dd8e" user="splunk-system-user" action="create" alert="Test Alert" incident_id="8b82d86a-b742-4f61-8f95-9c312015d2f4" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691680_8011" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691681"
2015-04-10 11:48:04,635 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691680_8011 with incident_id=8b82d86a-b742-4f61-8f95-9c312015d2f4
2015-04-10 11:48:04,664 DEBUG results for incident_id=8b82d86a-b742-4f61-8f95-9c312015d2f4 written to collection.
2015-04-10 11:48:04,664 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691680_8011 incident_id=8b82d86a-b742-4f61-8f95-9c312015d2f4 result_id=0 written to collection incident_results
2015-04-10 11:48:04,664 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:48:04,671 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:48:04,671 INFO Alert handler finished. duration=0.706s
2015-04-10 11:49:04,099 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691740_8015/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691740_8015 sessionKey=WRp79SeuTyjGUBo^WLGEZU0Mu1dzA_KiZx2BvTG3s9iV^qPlpVkB88VjG2iQUlJgdsJU2^QwrDWhbvdWEq5thT15v3NIo2SKiqUojZ8nvRGpCBwDnSIp0w8juefDX7L3lyGe5_yo1wr alert=Test Alert
2015-04-10 11:49:04,105 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691740_8015' has been fired.
2015-04-10 11:49:04,378 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"}
2015-04-10 11:49:04,378 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:49:04,385 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:49:04,385 INFO Found incident settings for Test Alert
2015-04-10 11:49:04,385 DEBUG Incident config after getting settings: {"alert_script": "", "urgency": "low", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "run_alert_script": false, "_key": "5526c2413403e135d146268b", "_user": "nobody", "auto_assign": false, "auto_assign_user": "", "auto_ttl_resolve": false, "alert": "Test Alert", "tags": "[Untagged]", "category": "unknown", "subcategory": "unknown"}
2015-04-10 11:49:04,393 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:49:04,411 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:49:04,411 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:49:04,437 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:49:04,438 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:49:04,438 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:49:04,438 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:49:04,438 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691740_8015
2015-04-10 11:49:04,731 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=4590fcdc-75d2-4c73-95f5-759dc4d902dd
2015-04-10 11:49:04,758 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:49:04,758 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:49:04,758 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:49:04,758 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:49:04,827 DEBUG Create event will be: time=2015-04-10T11:49:04.827177 severity=INFO origin="alert_handler" event_id="17269729931b6bae0c8f03417ded82ca" user="splunk-system-user" action="create" alert="Test Alert" incident_id="4590fcdc-75d2-4c73-95f5-759dc4d902dd" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691740_8015" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691741"
2015-04-10 11:49:04,834 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691740_8015 with incident_id=4590fcdc-75d2-4c73-95f5-759dc4d902dd
2015-04-10 11:49:04,861 DEBUG results for incident_id=4590fcdc-75d2-4c73-95f5-759dc4d902dd written to collection.
2015-04-10 11:49:04,861 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691740_8015 incident_id=4590fcdc-75d2-4c73-95f5-759dc4d902dd result_id=0 written to collection incident_results
2015-04-10 11:49:04,861 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:49:04,869 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:49:04,869 INFO Alert handler finished. duration=0.77s
2015-04-10 11:50:05,550 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691800_8024/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691800_8024 sessionKey=1BdfZXoQMIerRR1RJtwKXqtBXFhpfJBCLZ1a0NShu4Jv9DmYR4_3zg_KhGXCLktiw4rOlZkaiUNzEBQ7M^qvz6ZxcgdM7VJfu7LCHLm8rVXw3cqi52UeJwLqlC7O5K0W6X3PCWxxykw1Rg0 alert=Test Alert
2015-04-10 11:50:05,556 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691800_8024' has been fired.
2015-04-10 11:50:05,833 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low", "default_urgency": "low"}
2015-04-10 11:50:05,834 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:50:05,839 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:50:05,839 INFO Found incident settings for Test Alert
2015-04-10 11:50:05,840 DEBUG Incident config after getting settings: {"auto_assign": false, "category": "unknown", "auto_assign_user": "", "run_alert_script": false, "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "alert_script": "", "_user": "nobody", "tags": "[Untagged]", "urgency": "low", "alert": "Test Alert"}
2015-04-10 11:50:05,847 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:50:05,865 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:50:05,865 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:50:05,890 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:50:05,891 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:50:05,891 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:50:05,891 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:50:05,891 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691800_8024
2015-04-10 11:50:06,186 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=50882a1a-860e-45c1-80e2-e3734a17da1e
2015-04-10 11:50:06,212 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:50:06,212 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:50:06,213 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:50:06,213 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:50:06,226 DEBUG Create event will be: time=2015-04-10T11:50:06.226626 severity=INFO origin="alert_handler" event_id="b19ae4d6ad5e2e05191125dd8f1d8dfd" user="splunk-system-user" action="create" alert="Test Alert" incident_id="50882a1a-860e-45c1-80e2-e3734a17da1e" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691800_8024" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691801"
2015-04-10 11:50:06,232 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691800_8024 with incident_id=50882a1a-860e-45c1-80e2-e3734a17da1e
2015-04-10 11:50:06,260 DEBUG results for incident_id=50882a1a-860e-45c1-80e2-e3734a17da1e written to collection.
2015-04-10 11:50:06,261 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691800_8024 incident_id=50882a1a-860e-45c1-80e2-e3734a17da1e result_id=0 written to collection incident_results
2015-04-10 11:50:06,261 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:50:06,267 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:50:06,267 INFO Alert handler finished. duration=0.717s
2015-04-10 11:51:30,060 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691860_8046/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691860_8046 sessionKey=laIlTY32ZcbVaUO6fnENv_0kT05D4N3uhySeJIAbUX7Hd0qsJpqJ825TNvrrZ4jIVF_SGUu8A1UirrRc2_9y9Le7TUhdDJD^3PClILeiFN0gUP4VqJYxh3kz1pRy3NFNR3yw009bSLTHtEZJX9pg alert=Test Alert
2015-04-10 11:51:30,066 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691860_8046' has been fired.
2015-04-10 11:51:30,336 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_priority": "low", "default_urgency": "low"}
2015-04-10 11:51:30,336 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:51:30,342 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:51:30,342 INFO Found incident settings for Test Alert
2015-04-10 11:51:30,342 DEBUG Incident config after getting settings: {"alert_script": "", "auto_assign_user": "", "auto_previous_resolve": false, "alert": "Test Alert", "tags": "[Untagged]", "run_alert_script": false, "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "_user": "nobody", "auto_assign": false, "auto_ttl_resolve": false, "urgency": "low", "auto_assign_owner": "unassigned", "category": "unknown"}
2015-04-10 11:51:30,350 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:51:30,368 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:51:30,368 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:51:30,393 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:51:30,393 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:51:30,393 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:51:30,393 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:51:30,394 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691860_8046
2015-04-10 11:51:30,686 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=dfb73a41-20c0-42b1-b1e4-024e49790205
2015-04-10 11:51:30,711 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:51:30,711 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:51:30,712 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:51:30,712 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:51:30,727 DEBUG Create event will be: time=2015-04-10T11:51:30.727616 severity=INFO origin="alert_handler" event_id="d5689d757ddf7d1e0df2ead4fb662507" user="splunk-system-user" action="create" alert="Test Alert" incident_id="dfb73a41-20c0-42b1-b1e4-024e49790205" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691860_8046" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691887"
2015-04-10 11:51:30,733 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691860_8046 with incident_id=dfb73a41-20c0-42b1-b1e4-024e49790205
2015-04-10 11:51:30,761 DEBUG results for incident_id=dfb73a41-20c0-42b1-b1e4-024e49790205 written to collection.
2015-04-10 11:51:30,762 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691860_8046 incident_id=dfb73a41-20c0-42b1-b1e4-024e49790205 result_id=0 written to collection incident_results
2015-04-10 11:51:30,762 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:51:30,768 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:51:30,768 INFO Alert handler finished. duration=0.709s
2015-04-10 11:52:06,679 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691920_8048/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691920_8048 sessionKey=WLiOAoFVQLaf41q9F0cnyyT9V13l1_M49Kk^OmC6zf5QL7uTBbeb4IBr_tTlCbdbJfizMTDfein5hqwqLykbrrlN2T^_2IvO501i8MQ9ullkSWDxHfAnqZJRu63PQQk7FUG8TZsqZdEQvUtk9Ztj alert=Test Alert
2015-04-10 11:52:06,685 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691920_8048' has been fired.
2015-04-10 11:52:06,961 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low"}
2015-04-10 11:52:06,961 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:52:06,968 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:52:06,968 INFO Found incident settings for Test Alert
2015-04-10 11:52:06,968 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign": false, "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "alert": "Test Alert", "alert_script": "", "auto_assign_user": "", "category": "unknown", "tags": "[Untagged]", "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "_user": "nobody", "run_alert_script": false, "urgency": "low"}
2015-04-10 11:52:06,975 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:52:06,993 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:52:06,993 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:52:07,022 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:52:07,022 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:52:07,022 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:52:07,022 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:52:07,022 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691920_8048
2015-04-10 11:52:07,320 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=b25ed386-d7cc-42d5-acc5-810493fb8a4d
2015-04-10 11:52:07,346 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:52:07,346 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:52:07,347 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:52:07,347 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:52:07,390 DEBUG Create event will be: time=2015-04-10T11:52:07.390389 severity=INFO origin="alert_handler" event_id="58496ada7df9d8c50b51ef79c4a557ee" user="splunk-system-user" action="create" alert="Test Alert" incident_id="b25ed386-d7cc-42d5-acc5-810493fb8a4d" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691920_8048" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691924"
2015-04-10 11:52:07,397 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691920_8048 with incident_id=b25ed386-d7cc-42d5-acc5-810493fb8a4d
2015-04-10 11:52:07,426 DEBUG results for incident_id=b25ed386-d7cc-42d5-acc5-810493fb8a4d written to collection.
2015-04-10 11:52:07,426 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691920_8048 incident_id=b25ed386-d7cc-42d5-acc5-810493fb8a4d result_id=0 written to collection incident_results
2015-04-10 11:52:07,426 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:52:07,434 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:52:07,434 INFO Alert handler finished. duration=0.756s
2015-04-10 11:53:03,939 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691980_8050/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691980_8050 sessionKey=wt0_lHLr2UyTUCq0dTYo1qLy3Piw23LfR5qHSXoXChZKvvR5AtrzFkO4qu58n9s_XHww1jJ391XTVsj^320IJk6E8MNSs7QoyNz9MNzINNVK1ijPc4Whfft963aTMszjZ6NgwemHY_g alert=Test Alert
2015-04-10 11:53:03,945 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691980_8050' has been fired.
2015-04-10 11:53:04,216 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low"}
2015-04-10 11:53:04,216 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:53:04,221 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:53:04,222 INFO Found incident settings for Test Alert
2015-04-10 11:53:04,222 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "_user": "nobody", "run_alert_script": false, "auto_assign_owner": "unassigned", "auto_previous_resolve": false, "category": "unknown", "tags": "[Untagged]", "alert_script": "", "alert": "Test Alert", "auto_assign": false, "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "auto_assign_user": "", "urgency": "low"}
2015-04-10 11:53:04,229 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:53:04,247 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:53:04,247 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:53:04,273 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:53:04,273 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:53:04,273 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:53:04,273 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:53:04,273 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691980_8050
2015-04-10 11:53:04,566 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=79ef86c9-1222-476f-a9fe-a2d3c765ee59
2015-04-10 11:53:04,592 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:53:04,593 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:53:04,593 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:53:04,593 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:53:04,629 DEBUG Create event will be: time=2015-04-10T11:53:04.629734 severity=INFO origin="alert_handler" event_id="08eefd58997d8370fae5f6a6ae493d53" user="splunk-system-user" action="create" alert="Test Alert" incident_id="79ef86c9-1222-476f-a9fe-a2d3c765ee59" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691980_8050" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428691981"
2015-04-10 11:53:04,636 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691980_8050 with incident_id=79ef86c9-1222-476f-a9fe-a2d3c765ee59
2015-04-10 11:53:04,664 DEBUG results for incident_id=79ef86c9-1222-476f-a9fe-a2d3c765ee59 written to collection.
2015-04-10 11:53:04,664 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428691980_8050 incident_id=79ef86c9-1222-476f-a9fe-a2d3c765ee59 result_id=0 written to collection incident_results
2015-04-10 11:53:04,664 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:53:04,671 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:53:04,671 INFO Alert handler finished. duration=0.732s
2015-04-10 11:54:03,882 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692040_8052/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692040_8052 sessionKey=7Z1LRn7MTMoUiFZrcpqK2ULmq_y6_Xnh5gqcPKZE6dLufp6mQnnL0NcT466BLsgMLksnYMSxIv5^k5u^l09gazVMsMTKKljPSGMIGpWTTrxde8LsyPbI7OMNIt7xbKW2E5MaP6NgNpVeOm4l alert=Test Alert
2015-04-10 11:54:03,887 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692040_8052' has been fired.
2015-04-10 11:54:04,153 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_impact": "low", "default_priority": "low", "default_owner": "unassigned"}
2015-04-10 11:54:04,153 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:54:04,159 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:54:04,159 INFO Found incident settings for Test Alert
2015-04-10 11:54:04,159 DEBUG Incident config after getting settings: {"category": "unknown", "alert": "Test Alert", "subcategory": "unknown", "run_alert_script": false, "auto_ttl_resolve": false, "auto_assign": false, "tags": "[Untagged]", "urgency": "low", "alert_script": "", "_user": "nobody", "auto_assign_user": "", "_key": "5526c2413403e135d146268b", "auto_assign_owner": "unassigned", "auto_previous_resolve": false}
2015-04-10 11:54:04,167 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:54:04,184 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:54:04,185 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:54:04,209 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:54:04,210 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:54:04,210 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:54:04,210 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:54:04,210 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692040_8052
2015-04-10 11:54:04,497 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=c3d6ba3d-87c6-48a3-8e39-d4faa2bab3fd
2015-04-10 11:54:04,523 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:54:04,523 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:54:04,524 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:54:04,524 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:54:04,562 DEBUG Create event will be: time=2015-04-10T11:54:04.562671 severity=INFO origin="alert_handler" event_id="f7a5ba68a941fb062f10a214a9a19303" user="splunk-system-user" action="create" alert="Test Alert" incident_id="c3d6ba3d-87c6-48a3-8e39-d4faa2bab3fd" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692040_8052" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428692041"
2015-04-10 11:54:04,568 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692040_8052 with incident_id=c3d6ba3d-87c6-48a3-8e39-d4faa2bab3fd
2015-04-10 11:54:04,597 DEBUG results for incident_id=c3d6ba3d-87c6-48a3-8e39-d4faa2bab3fd written to collection.
2015-04-10 11:54:04,597 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692040_8052 incident_id=c3d6ba3d-87c6-48a3-8e39-d4faa2bab3fd result_id=0 written to collection incident_results
2015-04-10 11:54:04,597 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:54:04,603 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:54:04,604 INFO Alert handler finished. duration=0.722s
2015-04-10 11:55:13,661 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692100_8058/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692100_8058 sessionKey=J197qg_^i^Z4VuHgdoAEBQjcNeJt9rgMXj2jiehyE2oS9m2KdlL9a3Nue9i9xeoIGQ^BGMC8T1ma^HczMnaJY0GGYUJvEkyPQSzw9eknA9xhjq_ibaXaoxKS3_D^ZMRLqKSZVjJcAeSSO5o alert=Test Alert
2015-04-10 11:55:13,667 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692100_8058' has been fired.
2015-04-10 11:55:13,945 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"}
2015-04-10 11:55:13,945 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:55:13,951 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:55:13,951 INFO Found incident settings for Test Alert
2015-04-10 11:55:13,951 DEBUG Incident config after getting settings: {"alert": "Test Alert", "subcategory": "unknown", "_key": "5526c2413403e135d146268b", "tags": "[Untagged]", "auto_assign_owner": "unassigned", "auto_ttl_resolve": false, "run_alert_script": false, "category": "unknown", "auto_previous_resolve": false, "auto_assign_user": "", "urgency": "low", "auto_assign": false, "_user": "nobody", "alert_script": ""}
2015-04-10 11:55:13,959 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:55:13,977 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:55:13,977 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:55:14,004 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:55:14,004 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:55:14,004 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:55:14,004 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:55:14,004 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692100_8058
2015-04-10 11:55:14,308 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=47836c3a-01fb-4e9c-a879-fb04cd64504f
2015-04-10 11:55:14,337 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:55:14,338 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:55:14,338 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:55:14,338 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:55:14,377 DEBUG Create event will be: time=2015-04-10T11:55:14.377242 severity=INFO origin="alert_handler" event_id="96a28b2c30268cfa1b9ca33deaeb5498" user="splunk-system-user" action="create" alert="Test Alert" incident_id="47836c3a-01fb-4e9c-a879-fb04cd64504f" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692100_8058" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428692101"
2015-04-10 11:55:14,383 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692100_8058 with incident_id=47836c3a-01fb-4e9c-a879-fb04cd64504f
2015-04-10 11:55:14,411 DEBUG results for incident_id=47836c3a-01fb-4e9c-a879-fb04cd64504f written to collection.
2015-04-10 11:55:14,411 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692100_8058 incident_id=47836c3a-01fb-4e9c-a879-fb04cd64504f result_id=0 written to collection incident_results
2015-04-10 11:55:14,411 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:55:14,418 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:55:14,419 INFO Alert handler finished. duration=0.758s
2015-04-10 11:56:04,164 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692160_8062/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692160_8062 sessionKey=VCEuojvjkVPEh1KlMvd9oAm2jgTtICzC8jf^47RO2tGmHQGHESRAmxQ_TYeuxvZeB^t7mEhVtqQE4IA^VXLoGdueMxjjFNO96E5T0YwySdN6Ci9prAqO0g61RvISp8zPv1kUet2kFCx alert=Test Alert
2015-04-10 11:56:04,170 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692160_8062' has been fired.
2015-04-10 11:56:04,436 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned"}
2015-04-10 11:56:04,436 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:56:04,442 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:56:04,442 INFO Found incident settings for Test Alert
2015-04-10 11:56:04,442 DEBUG Incident config after getting settings: {"alert": "Test Alert", "run_alert_script": false, "auto_assign": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "tags": "[Untagged]", "_user": "nobody", "auto_previous_resolve": false, "alert_script": "", "auto_ttl_resolve": false, "_key": "5526c2413403e135d146268b", "subcategory": "unknown", "category": "unknown", "urgency": "low"}
2015-04-10 11:56:04,450 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:56:04,467 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:56:04,467 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:56:04,492 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:56:04,492 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:56:04,493 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:56:04,493 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:56:04,493 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692160_8062
2015-04-10 11:56:04,782 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=3941e28d-2162-486b-be64-11602339caa1
2015-04-10 11:56:04,807 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:56:04,807 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:56:04,808 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:56:04,808 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:56:04,843 DEBUG Create event will be: time=2015-04-10T11:56:04.843533 severity=INFO origin="alert_handler" event_id="7272b398d687347e6f773330a8634677" user="splunk-system-user" action="create" alert="Test Alert" incident_id="3941e28d-2162-486b-be64-11602339caa1" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692160_8062" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428692161"
2015-04-10 11:56:04,849 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692160_8062 with incident_id=3941e28d-2162-486b-be64-11602339caa1 | |
2015-04-10 11:56:04,877 DEBUG results for incident_id=3941e28d-2162-486b-be64-11602339caa1 written to collection. | |
2015-04-10 11:56:04,878 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692160_8062 incident_id=3941e28d-2162-486b-be64-11602339caa1 result_id=0 written to collection incident_results | |
2015-04-10 11:56:04,878 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-10 11:56:04,884 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-10 11:56:04,884 INFO Alert handler finished. duration=0.721s | |
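Each run above resolves severity to an impact via the alert_impact lookup, then (impact, urgency) to a priority via the alert_priority lookup. A minimal sketch of that two-step chain; the CSV contents below are hypothetical, inferred only from the value pairs visible in this log (severity_id 4 -> medium, 5 -> high; medium/low -> low, medium/medium -> medium, high/high -> critical), and the real lookup files shipped with the alert_manager app may differ:

```python
import csv
import io

# Hypothetical lookup contents, reconstructed from values seen in this log.
ALERT_IMPACT_CSV = """severity_id,impact
4,medium
5,high
"""

ALERT_PRIORITY_CSV = """impact,urgency,priority
medium,low,low
medium,medium,medium
high,high,critical
"""

def query_lookup(csv_text, filter_dict, return_field):
    """Return return_field from the first row matching every filter field,
    mirroring the 'Querying lookup with filter=...' steps in the log."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        if all(row[key] == value for key, value in filter_dict.items()):
            return row[return_field]
    return None

def resolve_priority(severity_id, urgency):
    # Step 1: severity_id -> impact (alert_impact lookup).
    impact = query_lookup(ALERT_IMPACT_CSV, {"severity_id": severity_id}, "impact")
    # Step 2: (impact, urgency) -> priority (alert_priority lookup).
    priority = query_lookup(
        ALERT_PRIORITY_CSV, {"impact": impact, "urgency": urgency}, "priority"
    )
    return impact, priority

# The 'Test Alert' run: severity=4 with default_urgency=low.
print(resolve_priority("4", "low"))   # ('medium', 'low')
print(resolve_priority("5", "high"))  # ('high', 'critical')
```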
2015-04-10 11:57:03,902 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692220_8064/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692220_8064 sessionKey=7qBytxId1Ii32HLKodeis9Di42UIa5gQewbOhYVNu_gZ0TdybhTIi_blsOmwczLU3PoDSFlOcWMof_tA60Pa6Y1OpPdNJHqFhELBvjByUs1kuJNiX85dXD527Gm5yy5MtQqv_AL8ewYKaN alert=Test Alert
2015-04-10 11:57:03,908 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692220_8064' has been fired.
2015-04-10 11:57:04,174 DEBUG Parsed global alert handler settings: {"default_impact": "low", "default_owner": "unassigned", "default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low"}
2015-04-10 11:57:04,174 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:57:04,180 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:57:04,180 INFO Found incident settings for Test Alert
2015-04-10 11:57:04,181 DEBUG Incident config after getting settings: {"run_alert_script": false, "category": "unknown", "alert_script": "", "auto_assign": false, "_user": "nobody", "_key": "5526c2413403e135d146268b", "auto_ttl_resolve": false, "tags": "[Untagged]", "subcategory": "unknown", "auto_assign_user": "", "auto_assign_owner": "unassigned", "alert": "Test Alert", "auto_previous_resolve": false, "urgency": "low"}
2015-04-10 11:57:04,188 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:57:04,205 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:57:04,205 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:57:04,230 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:57:04,230 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:57:04,231 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:57:04,231 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:57:04,231 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692220_8064
2015-04-10 11:57:04,521 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=a6c3cbcf-f131-4b27-9863-b0fe6df43bd2
2015-04-10 11:57:04,547 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:57:04,547 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:57:04,547 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'low'}
2015-04-10 11:57:04,547 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:57:04,571 DEBUG Create event will be: time=2015-04-10T11:57:04.571393 severity=INFO origin="alert_handler" event_id="8fa157e02a2c9ed92bbae90e90703f31" user="splunk-system-user" action="create" alert="Test Alert" incident_id="a6c3cbcf-f131-4b27-9863-b0fe6df43bd2" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692220_8064" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428692221"
2015-04-10 11:57:04,577 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692220_8064 with incident_id=a6c3cbcf-f131-4b27-9863-b0fe6df43bd2
2015-04-10 11:57:04,605 DEBUG results for incident_id=a6c3cbcf-f131-4b27-9863-b0fe6df43bd2 written to collection.
2015-04-10 11:57:04,605 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692220_8064 incident_id=a6c3cbcf-f131-4b27-9863-b0fe6df43bd2 result_id=0 written to collection incident_results
2015-04-10 11:57:04,605 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:57:04,612 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:57:04,612 INFO Alert handler finished. duration=0.71s
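The 'Query for alert settings' lines are percent-encoded JSON filters sent to the incident-settings store. Decoding one of the filters seen above with only the standard library (illustrative; the log itself does not show how the handler builds the query):

```python
import json
from urllib.parse import unquote

# One of the encoded filters that appears in this log.
encoded = "%7B%22alert%22%3A%20%22Test%20Alert%22%7D"

# Percent-decode, then parse the resulting JSON object.
query = json.loads(unquote(encoded))
print(query)  # {'alert': 'Test Alert'}
```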
2015-04-10 11:58:03,853 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692280_8066/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692280_8066 sessionKey=HlO4nbDurmKN9RPeKzSFXc6owd1I7D6V0A25yFVwcreh5QG7Us1Q7AS87_K1LiOAbijfK1H5rM^cD2CCLHXbjgdU9Kmt9Fp^VPB4oVHCnFB9f2tc5kM7ssOfaLzXNwPjg3RRZeeVARde0WL alert=Test Alert
2015-04-10 11:58:03,859 INFO alert_handler started because alert 'Test Alert' with id 'scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692280_8066' has been fired.
2015-04-10 11:58:04,134 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_impact": "low", "default_urgency": "low", "index": "alerts-to-inf"}
2015-04-10 11:58:04,135 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Test%20Alert%22%7D
2015-04-10 11:58:04,141 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "low", "alert" : "Test Alert", "tags" : "[Untagged]", "category" : "unknown", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526c2413403e135d146268b" } ]
2015-04-10 11:58:04,141 INFO Found incident settings for Test Alert
2015-04-10 11:58:04,141 DEBUG Incident config after getting settings: {"alert": "Test Alert", "subcategory": "unknown", "auto_assign_owner": "unassigned", "tags": "[Untagged]", "auto_ttl_resolve": false, "run_alert_script": false, "category": "unknown", "auto_previous_resolve": false, "auto_assign": false, "auto_assign_user": "", "urgency": "low", "_key": "5526c2413403e135d146268b", "_user": "nobody", "alert_script": ""}
2015-04-10 11:58:04,149 INFO Found job for alert Test Alert. Context is 'search' with 2 results.
2015-04-10 11:58:04,168 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-10 11:58:04,168 DEBUG Transformed 24h into 86400 seconds
2015-04-10 11:58:04,194 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-10 11:58:04,194 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-10 11:58:04,195 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-10 11:58:04,195 DEBUG Matched impact in lookup, returning value=medium
2015-04-10 11:58:04,195 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692280_8066
2015-04-10 11:58:04,492 DEBUG No valid urgency field found in results. Falling back to default_urgency=low for incident_id=906b9b84-0d45-4c1f-a291-250e2216c598
2015-04-10 11:58:04,518 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-10 11:58:04,518 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-10 11:58:04,518 DEBUG Querying lookup with filter={'urgency': u'low', 'impact': 'medium'}
2015-04-10 11:58:04,518 DEBUG Matched priority in lookup, returning value=low
2015-04-10 11:58:04,535 DEBUG Create event will be: time=2015-04-10T11:58:04.535597 severity=INFO origin="alert_handler" event_id="f038c837a345930c319546a4c254ac7f" user="splunk-system-user" action="create" alert="Test Alert" incident_id="906b9b84-0d45-4c1f-a291-250e2216c598" job_id="scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692280_8066" result_id="0" owner="unassigned" status="new" urgency="low" ttl="86400" alert_time="1428692281"
2015-04-10 11:58:04,541 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692280_8066 with incident_id=906b9b84-0d45-4c1f-a291-250e2216c598
2015-04-10 11:58:04,569 DEBUG results for incident_id=906b9b84-0d45-4c1f-a291-250e2216c598 written to collection.
2015-04-10 11:58:04,570 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5ada861e3c4e7d72f_at_1428692280_8066 incident_id=906b9b84-0d45-4c1f-a291-250e2216c598 result_id=0 written to collection incident_results
2015-04-10 11:58:04,570 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-10 11:58:04,577 INFO Alert metadata written to index=alerts-to-inf
2015-04-10 11:58:04,577 INFO Alert handler finished. duration=0.724s
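Every run also logs 'Transformed 24h into 86400 seconds' when converting the saved search's expiry into a TTL. A minimal sketch of such a duration parser, assuming simple `<number><unit>` strings; the actual alert_manager implementation may accept other formats:

```python
import re

# Assumed unit table; the real parser may support additional units.
_UNITS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

def expiry_to_seconds(expiry):
    """Convert an expiry string like '24h' into seconds, as in the
    'Transformed 24h into 86400 seconds' log lines."""
    match = re.fullmatch(r"(\d+)([smhd])", expiry)
    if not match:
        raise ValueError("unsupported expiry format: %r" % expiry)
    value, unit = match.groups()
    return int(value) * _UNITS[unit]

print(expiry_to_seconds("24h"))  # 86400
```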
2015-04-11 00:12:01,198 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428735600_11852/results.csv.gz job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428735600_11852 sessionKey=4jOF4kVuttIMgwau3srbMIe7j^bgXXB4IP9Gx91u^^Q1abZw_RnJqlikXIVdJY0q3sK15BIQ3mAIj^drirGSgcbF0W7Zl2jdERPWozNG9ErNfznxTEx6TWvsFYV2Bw0M1ehIokZdkXVoop3XzbCj alert=Splunk Forwarder Status
2015-04-11 00:12:01,204 INFO alert_handler started because alert 'Splunk Forwarder Status' with id 'scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428735600_11852' has been fired.
2015-04-11 00:12:01,472 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_owner": "unassigned", "default_priority": "low", "default_impact": "low"}
2015-04-11 00:12:01,473 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Splunk%20Forwarder%20Status%22%7D
2015-04-11 00:12:01,479 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "Splunk Forwarder Status", "tags" : "Splunk", "category" : "Splunk", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b08" } ]
2015-04-11 00:12:01,479 INFO Found incident settings for Splunk Forwarder Status
2015-04-11 00:12:01,479 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "tags": "Splunk", "auto_ttl_resolve": false, "auto_assign": false, "_user": "nobody", "alert_script": "", "category": "Splunk", "run_alert_script": false, "urgency": "medium", "_key": "5526b8b13403e13421162b08", "auto_assign_owner": "unassigned", "alert": "Splunk Forwarder Status", "auto_assign_user": "", "subcategory": ""}
2015-04-11 00:12:01,488 INFO Found job for alert Splunk Forwarder Status. Context is 'search' with 65 results.
2015-04-11 00:12:01,509 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-11 00:12:01,509 DEBUG Transformed 24h into 86400 seconds
2015-04-11 00:12:01,542 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-11 00:12:01,542 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-11 00:12:01,543 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-11 00:12:01,543 DEBUG Matched impact in lookup, returning value=medium
2015-04-11 00:12:01,543 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428735600_11852
2015-04-11 00:12:01,837 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=8cf577e0-6609-4060-b9c4-ba4810eb859a
2015-04-11 00:12:01,866 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-11 00:12:01,867 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-11 00:12:01,867 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'medium'}
2015-04-11 00:12:01,867 DEBUG Matched priority in lookup, returning value=medium
2015-04-11 00:12:01,887 DEBUG Create event will be: time=2015-04-11T00:12:01.887045 severity=INFO origin="alert_handler" event_id="3d2a47e5ead1d49855506e43318cb867" user="splunk-system-user" action="create" alert="Splunk Forwarder Status" incident_id="8cf577e0-6609-4060-b9c4-ba4810eb859a" job_id="scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428735600_11852" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428735607"
2015-04-11 00:12:01,893 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428735600_11852 with incident_id=8cf577e0-6609-4060-b9c4-ba4810eb859a
2015-04-11 00:12:01,921 DEBUG results for incident_id=8cf577e0-6609-4060-b9c4-ba4810eb859a written to collection.
2015-04-11 00:12:01,921 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428735600_11852 incident_id=8cf577e0-6609-4060-b9c4-ba4810eb859a result_id=0 written to collection incident_results
2015-04-11 00:12:01,921 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-11 00:12:01,930 INFO Alert metadata written to index=alerts-to-inf
2015-04-11 00:12:01,930 INFO Alert handler finished. duration=0.733s
2015-04-11 04:20:05,198 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428751200_13215/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428751200_13215 sessionKey=FnNqUTelPhEU5oCiknFgjmLX1lAxju0^7RyHtaWhpX^svTiRQBfXXaKJlTUsFTo_KT1fHWdFpAky9D5Ut2XP6F7lWNsroqKnEBleFRRYSAbXYfu9nGUym3WpGqZ8enWsKqepSmXePm4ZCFHrPlTL alert=Builtin Account Used
2015-04-11 04:20:05,203 INFO alert_handler started because alert 'Builtin Account Used' with id 'scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428751200_13215' has been fired.
2015-04-11 04:20:05,479 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"}
2015-04-11 04:20:05,479 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Builtin%20Account%20Used%22%7D
2015-04-11 04:20:05,486 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "Builtin Account Used", "tags" : "Builtin", "category" : "Builtin Account", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b07" } ]
2015-04-11 04:20:05,486 INFO Found incident settings for Builtin Account Used
2015-04-11 04:20:05,486 DEBUG Incident config after getting settings: {"category": "Builtin Account", "subcategory": "", "urgency": "high", "run_alert_script": false, "_user": "nobody", "tags": "Builtin", "alert_script": "", "auto_previous_resolve": false, "auto_assign_user": "", "auto_assign_owner": "unassigned", "auto_assign": false, "_key": "5526b8b13403e13421162b07", "auto_ttl_resolve": false, "alert": "Builtin Account Used"}
2015-04-11 04:20:05,493 INFO Found job for alert Builtin Account Used. Context is 'search' with 2 results.
2015-04-11 04:20:05,511 DEBUG Parsed savedsearch settings: severity=5 expiry=24h digest_mode=True
2015-04-11 04:20:05,511 DEBUG Transformed 24h into 86400 seconds
2015-04-11 04:20:05,535 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-11 04:20:05,536 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-11 04:20:05,536 DEBUG Querying lookup with filter={'severity_id': '5'}
2015-04-11 04:20:05,536 DEBUG Matched impact in lookup, returning value=high
2015-04-11 04:20:05,536 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428751200_13215
2015-04-11 04:20:05,838 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=eb8f3a33-5017-4942-89a1-9df9c90f16f8
2015-04-11 04:20:05,865 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-11 04:20:05,866 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-11 04:20:05,866 DEBUG Querying lookup with filter={'impact': 'high', 'urgency': u'high'}
2015-04-11 04:20:05,866 DEBUG Matched priority in lookup, returning value=critical
2015-04-11 04:20:05,895 DEBUG Create event will be: time=2015-04-11T04:20:05.895660 severity=INFO origin="alert_handler" event_id="1a4c6ab4047ba82c285bcbd6bf07e306" user="splunk-system-user" action="create" alert="Builtin Account Used" incident_id="eb8f3a33-5017-4942-89a1-9df9c90f16f8" job_id="scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428751200_13215" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428751201"
2015-04-11 04:20:05,901 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428751200_13215 with incident_id=eb8f3a33-5017-4942-89a1-9df9c90f16f8
2015-04-11 04:20:05,930 DEBUG results for incident_id=eb8f3a33-5017-4942-89a1-9df9c90f16f8 written to collection.
2015-04-11 04:20:05,930 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428751200_13215 incident_id=eb8f3a33-5017-4942-89a1-9df9c90f16f8 result_id=0 written to collection incident_results
2015-04-11 04:20:05,930 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-11 04:20:05,937 INFO Alert metadata written to index=alerts-to-inf
2015-04-11 04:20:05,937 INFO Alert handler finished. duration=0.74s
2015-04-11 04:40:05,440 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428752400_13314/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428752400_13314 sessionKey=kx6JU6RYU3ROar1m9t^wDUSmj6RhHDi2Lx16W2eOHY8x5Nlw49D6GswNSiwbe8Nm40ZArNbKGwOzxl9tqjuDmnhR^LVRspvhyvzRu5LX0jxLu5vLtLVqr4HH4T^dcb9YO8ebXTwkZdgu alert=Builtin Account Used
2015-04-11 04:40:05,446 INFO alert_handler started because alert 'Builtin Account Used' with id 'scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428752400_13314' has been fired.
2015-04-11 04:40:05,716 DEBUG Parsed global alert handler settings: {"default_priority": "low", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_owner": "unassigned"}
2015-04-11 04:40:05,717 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Builtin%20Account%20Used%22%7D
2015-04-11 04:40:05,723 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "Builtin Account Used", "tags" : "Builtin", "category" : "Builtin Account", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b07" } ]
2015-04-11 04:40:05,723 INFO Found incident settings for Builtin Account Used
2015-04-11 04:40:05,723 DEBUG Incident config after getting settings: {"run_alert_script": false, "auto_previous_resolve": false, "_key": "5526b8b13403e13421162b07", "_user": "nobody", "subcategory": "", "auto_ttl_resolve": false, "urgency": "high", "auto_assign_user": "", "auto_assign": false, "auto_assign_owner": "unassigned", "category": "Builtin Account", "tags": "Builtin", "alert": "Builtin Account Used", "alert_script": ""}
2015-04-11 04:40:05,731 INFO Found job for alert Builtin Account Used. Context is 'search' with 2 results.
2015-04-11 04:40:05,750 DEBUG Parsed savedsearch settings: severity=5 expiry=24h digest_mode=True
2015-04-11 04:40:05,750 DEBUG Transformed 24h into 86400 seconds
2015-04-11 04:40:05,782 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-11 04:40:05,782 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-11 04:40:05,782 DEBUG Querying lookup with filter={'severity_id': '5'}
2015-04-11 04:40:05,782 DEBUG Matched impact in lookup, returning value=high
2015-04-11 04:40:05,783 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428752400_13314
2015-04-11 04:40:06,079 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=f3a805ea-7418-4c98-88ef-4eaeb3460540
2015-04-11 04:40:06,105 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-11 04:40:06,105 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-11 04:40:06,106 DEBUG Querying lookup with filter={'impact': 'high', 'urgency': u'high'}
2015-04-11 04:40:06,106 DEBUG Matched priority in lookup, returning value=critical
2015-04-11 04:40:06,118 DEBUG Create event will be: time=2015-04-11T04:40:06.118186 severity=INFO origin="alert_handler" event_id="9c9472848fd66d797b1681789f3bcdaa" user="splunk-system-user" action="create" alert="Builtin Account Used" incident_id="f3a805ea-7418-4c98-88ef-4eaeb3460540" job_id="scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428752400_13314" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428752401"
2015-04-11 04:40:06,125 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428752400_13314 with incident_id=f3a805ea-7418-4c98-88ef-4eaeb3460540
2015-04-11 04:40:06,152 DEBUG results for incident_id=f3a805ea-7418-4c98-88ef-4eaeb3460540 written to collection.
2015-04-11 04:40:06,152 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428752400_13314 incident_id=f3a805ea-7418-4c98-88ef-4eaeb3460540 result_id=0 written to collection incident_results
2015-04-11 04:40:06,153 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-11 04:40:06,160 INFO Alert metadata written to index=alerts-to-inf
2015-04-11 04:40:06,160 INFO Alert handler finished. duration=0.72s
2015-04-11 05:00:05,371 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428753600_13419/results.csv.gz job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428753600_13419 sessionKey=9D6cb1T787SXZGQMX^GhfUW8mzrly3o^5F3_iytxt7E^XCZHcI1b0GD_G81WDEQNTdDqI7SY5uvRMNyE^JdBT_qEcUtagGwpIq_L3gVlh55VDRi0dJ5GVfewS2HgW5Z8sIPiyELNk4z5useV6QL alert=Builtin Account Used
2015-04-11 05:00:05,377 INFO alert_handler started because alert 'Builtin Account Used' with id 'scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428753600_13419' has been fired.
2015-04-11 05:00:05,672 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned"}
2015-04-11 05:00:05,672 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Builtin%20Account%20Used%22%7D
2015-04-11 05:00:05,680 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "Builtin Account Used", "tags" : "Builtin", "category" : "Builtin Account", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b07" } ]
2015-04-11 05:00:05,680 INFO Found incident settings for Builtin Account Used
2015-04-11 05:00:05,680 DEBUG Incident config after getting settings: {"auto_ttl_resolve": false, "auto_assign": false, "auto_assign_user": "", "_user": "nobody", "subcategory": "", "category": "Builtin Account", "tags": "Builtin", "alert": "Builtin Account Used", "urgency": "high", "auto_previous_resolve": false, "alert_script": "", "_key": "5526b8b13403e13421162b07", "run_alert_script": false, "auto_assign_owner": "unassigned"}
2015-04-11 05:00:05,690 INFO Found job for alert Builtin Account Used. Context is 'search' with 3 results.
2015-04-11 05:00:05,712 DEBUG Parsed savedsearch settings: severity=5 expiry=24h digest_mode=True
2015-04-11 05:00:05,712 DEBUG Transformed 24h into 86400 seconds
2015-04-11 05:00:05,740 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-11 05:00:05,740 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-11 05:00:05,740 DEBUG Querying lookup with filter={'severity_id': '5'}
2015-04-11 05:00:05,740 DEBUG Matched impact in lookup, returning value=high
2015-04-11 05:00:05,740 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428753600_13419
2015-04-11 05:00:06,044 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=635c635d-a0d9-4706-aa12-1d378f123540
2015-04-11 05:00:06,070 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-11 05:00:06,070 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-11 05:00:06,070 DEBUG Querying lookup with filter={'impact': 'high', 'urgency': u'high'}
2015-04-11 05:00:06,070 DEBUG Matched priority in lookup, returning value=critical
2015-04-11 05:00:06,102 DEBUG Create event will be: time=2015-04-11T05:00:06.102592 severity=INFO origin="alert_handler" event_id="7a09e6aa38099f28536b8afbe15b84a8" user="splunk-system-user" action="create" alert="Builtin Account Used" incident_id="635c635d-a0d9-4706-aa12-1d378f123540" job_id="scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428753600_13419" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428753601"
2015-04-11 05:00:06,109 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428753600_13419 with incident_id=635c635d-a0d9-4706-aa12-1d378f123540
2015-04-11 05:00:06,137 DEBUG results for incident_id=635c635d-a0d9-4706-aa12-1d378f123540 written to collection.
2015-04-11 05:00:06,137 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD5a0eed3be33141a19_at_1428753600_13419 incident_id=635c635d-a0d9-4706-aa12-1d378f123540 result_id=0 written to collection incident_results
2015-04-11 05:00:06,137 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-11 05:00:06,144 INFO Alert metadata written to index=alerts-to-inf
2015-04-11 05:00:06,144 INFO Alert handler finished. duration=0.773s
2015-04-12 00:10:03,905 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428822000_19357/results.csv.gz job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428822000_19357 sessionKey=6S3eV99E^tvdlukmJlMo44QtLtFFq1HVf7Ky4e^EAI3BxncdCNM7FfRfgP_p5XfjNK0H^TQr8cCZ0KnYgh^JalEjXsI11VsBI82WIrtBPfb0z3fzz26qvPAosE82fZ^REDD2DTPmA5o alert=Splunk Forwarder Status
2015-04-12 00:10:03,910 INFO alert_handler started because alert 'Splunk Forwarder Status' with id 'scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428822000_19357' has been fired.
2015-04-12 00:10:04,190 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "index": "alerts-to-inf", "default_impact": "low", "default_urgency": "low", "default_priority": "low"}
2015-04-12 00:10:04,190 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Splunk%20Forwarder%20Status%22%7D
2015-04-12 00:10:04,197 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "Splunk Forwarder Status", "tags" : "Splunk", "category" : "Splunk", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b08" } ]
2015-04-12 00:10:04,197 INFO Found incident settings for Splunk Forwarder Status
2015-04-12 00:10:04,197 DEBUG Incident config after getting settings: {"run_alert_script": false, "auto_ttl_resolve": false, "subcategory": "", "_user": "nobody", "auto_previous_resolve": false, "_key": "5526b8b13403e13421162b08", "alert": "Splunk Forwarder Status", "alert_script": "", "tags": "Splunk", "category": "Splunk", "auto_assign_owner": "unassigned", "auto_assign_user": "", "auto_assign": false, "urgency": "medium"}
2015-04-12 00:10:04,205 INFO Found job for alert Splunk Forwarder Status. Context is 'search' with 159 results.
2015-04-12 00:10:04,223 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-12 00:10:04,223 DEBUG Transformed 24h into 86400 seconds
2015-04-12 00:10:04,249 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-12 00:10:04,249 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-12 00:10:04,249 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-12 00:10:04,249 DEBUG Matched impact in lookup, returning value=medium
2015-04-12 00:10:04,249 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428822000_19357
2015-04-12 00:10:04,549 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=bdbc79e9-6aca-4783-b43a-0fe6cf16cbf0
2015-04-12 00:10:04,575 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-12 00:10:04,575 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-12 00:10:04,575 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'medium'}
2015-04-12 00:10:04,575 DEBUG Matched priority in lookup, returning value=medium
2015-04-12 00:10:04,596 DEBUG Create event will be: time=2015-04-12T00:10:04.596712 severity=INFO origin="alert_handler" event_id="b6d4019f3b44ebf7f8b9e84d47adbead" user="splunk-system-user" action="create" alert="Splunk Forwarder Status" incident_id="bdbc79e9-6aca-4783-b43a-0fe6cf16cbf0" job_id="scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428822000_19357" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428822007"
2015-04-12 00:10:04,603 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428822000_19357 with incident_id=bdbc79e9-6aca-4783-b43a-0fe6cf16cbf0
2015-04-12 00:10:04,665 DEBUG results for incident_id=bdbc79e9-6aca-4783-b43a-0fe6cf16cbf0 written to collection. | |
2015-04-12 00:10:04,665 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428822000_19357 incident_id=bdbc79e9-6aca-4783-b43a-0fe6cf16cbf0 result_id=0 written to collection incident_results | |
2015-04-12 00:10:04,665 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-12 00:10:04,673 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-12 00:10:04,673 INFO Alert handler finished. duration=0.769s | |
2015-04-13 00:11:10,738 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428908400_26601/results.csv.gz job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428908400_26601 sessionKey=b6gVcw9SVXVSfwZ^kurXcdRtequ407yeUHzhpCWJBeM8RWy5_6i_EYnyHz5h_Gjk7bJ2_R3rbZM2amzQK8uHZ78RliCBSeuXKAF4kcvejnAoRyrYfm10HtWv1UgCnqjbjBmTjeZnsXb_yRyZIwR alert=Splunk Forwarder Status | |
2015-04-13 00:11:10,744 INFO alert_handler started because alert 'Splunk Forwarder Status' with id 'scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428908400_26601' has been fired. | |
2015-04-13 00:11:11,013 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low"} | |
2015-04-13 00:11:11,013 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22Splunk%20Forwarder%20Status%22%7D | |
2015-04-13 00:11:11,019 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "medium", "alert" : "Splunk Forwarder Status", "tags" : "Splunk", "category" : "Splunk", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "", "_user" : "nobody", "_key" : "5526b8b13403e13421162b08" } ] | |
2015-04-13 00:11:11,020 INFO Found incident settings for Splunk Forwarder Status | |
2015-04-13 00:11:11,020 DEBUG Incident config after getting settings: {"tags": "Splunk", "auto_previous_resolve": false, "subcategory": "", "alert_script": "", "_user": "nobody", "auto_assign": false, "auto_assign_user": "", "auto_ttl_resolve": false, "run_alert_script": false, "category": "Splunk", "alert": "Splunk Forwarder Status", "urgency": "medium", "auto_assign_owner": "unassigned", "_key": "5526b8b13403e13421162b08"} | |
2015-04-13 00:11:11,028 INFO Found job for alert Splunk Forwarder Status. Context is 'search' with 45 results. | |
2015-04-13 00:11:11,047 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-13 00:11:11,047 DEBUG Transformed 24h into 86400 seconds | |
2015-04-13 00:11:11,074 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-13 00:11:11,074 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-13 00:11:11,074 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-13 00:11:11,074 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-13 00:11:11,075 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428908400_26601 | |
2015-04-13 00:11:11,375 DEBUG No valid urgency field found in results. Falling back to default_urgency=medium for incident_id=ee7a2e01-4b35-4c94-9b6e-18f1c623d624 | |
2015-04-13 00:11:11,402 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-13 00:11:11,402 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-13 00:11:11,402 DEBUG Querying lookup with filter={'urgency': u'medium', 'impact': 'medium'} | |
2015-04-13 00:11:11,402 DEBUG Matched priority in lookup, returning value=medium | |
2015-04-13 00:11:11,440 DEBUG Create event will be: time=2015-04-13T00:11:11.439971 severity=INFO origin="alert_handler" event_id="d1ddb980409c867021fdf3c502bd2045" user="splunk-system-user" action="create" alert="Splunk Forwarder Status" incident_id="ee7a2e01-4b35-4c94-9b6e-18f1c623d624" job_id="scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428908400_26601" result_id="0" owner="unassigned" status="new" urgency="medium" ttl="86400" alert_time="1428908407" | |
2015-04-13 00:11:11,449 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428908400_26601 with incident_id=ee7a2e01-4b35-4c94-9b6e-18f1c623d624 | |
2015-04-13 00:11:11,474 DEBUG results for incident_id=ee7a2e01-4b35-4c94-9b6e-18f1c623d624 written to collection. | |
2015-04-13 00:11:11,474 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD52672d170304a312d_at_1428908400_26601 incident_id=ee7a2e01-4b35-4c94-9b6e-18f1c623d624 result_id=0 written to collection incident_results | |
2015-04-13 00:11:11,474 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-13 00:11:11,481 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-13 00:11:11,482 INFO Alert handler finished. duration=0.744s | |
2015-04-13 07:10:06,223 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934200_28773/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934200_28773 sessionKey=ymAPhFmP31Mq4hFyqgXxxVuOLKjiFlan1TFFXX4t9xQbrbhFvg^uyQy2rbTO^uuYuj6475srAtGfXDe^IwAnrsYW6mPmUwQ5EWCcmtbjaZzeO59XeT4W5ogP_0aeWB4^dxetDGWCi^a alert=TO Linux Failed Login Alert | |
2015-04-13 07:10:06,228 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934200_28773' has been fired. | |
2015-04-13 07:10:06,495 DEBUG Parsed global alert handler settings: {"index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-13 07:10:06,495 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D | |
2015-04-13 07:10:06,501 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ] | |
2015-04-13 07:10:06,501 INFO Found incident settings for TO Linux Failed Login Alert | |
2015-04-13 07:10:06,502 DEBUG Incident config after getting settings: {"auto_assign_owner": "unassigned", "subcategory": "unknown", "_user": "nobody", "auto_ttl_resolve": false, "auto_assign": false, "alert": "TO Linux Failed Login Alert", "auto_assign_user": "", "auto_previous_resolve": false, "urgency": "high", "run_alert_script": false, "category": "Linux", "_key": "5526b8b13403e13421162b0e", "tags": "[Linux]", "alert_script": ""} | |
2015-04-13 07:10:06,509 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 1436 results. | |
2015-04-13 07:10:06,527 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-13 07:10:06,527 DEBUG Transformed 24h into 86400 seconds | |
2015-04-13 07:10:06,552 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-13 07:10:06,552 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-13 07:10:06,553 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-13 07:10:06,553 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-13 07:10:06,553 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934200_28773 | |
2015-04-13 07:10:06,882 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=a8c03008-03d2-4df1-9aa3-b791ab577556 | |
2015-04-13 07:10:06,908 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-13 07:10:06,908 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-13 07:10:06,909 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-13 07:10:06,909 DEBUG Matched priority in lookup, returning value=high | |
2015-04-13 07:10:06,948 DEBUG Create event will be: time=2015-04-13T07:10:06.948424 severity=INFO origin="alert_handler" event_id="8e871830f2e71bbf3ac5b06fb63fc4cc" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="a8c03008-03d2-4df1-9aa3-b791ab577556" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934200_28773" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428934202" | |
2015-04-13 07:10:06,955 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934200_28773 with incident_id=a8c03008-03d2-4df1-9aa3-b791ab577556 | |
2015-04-13 07:10:07,019 DEBUG results for incident_id=a8c03008-03d2-4df1-9aa3-b791ab577556 written to collection. | |
2015-04-13 07:10:07,019 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934200_28773 incident_id=a8c03008-03d2-4df1-9aa3-b791ab577556 result_id=0 written to collection incident_results | |
2015-04-13 07:10:07,019 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-13 07:10:07,027 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-13 07:10:07,027 INFO Alert handler finished. duration=0.805s | |
2015-04-13 07:20:06,399 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934800_28828/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934800_28828 sessionKey=7U2abAvlBZseU12a^VBjgcubIthmrerP5FUvQhN^nzWeBqI8De6mLIUfFkIioQTMhWiTyHaDvVKnNWm4Dl0PWi4YK4lb9Ti259a^JzMrf_Sl2S^TCCnletbSf860zRZ_BRtzU141m6r alert=TO Linux Failed Login Alert | |
2015-04-13 07:20:06,405 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934800_28828' has been fired. | |
2015-04-13 07:20:06,675 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_priority": "low", "default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low"} | |
2015-04-13 07:20:06,675 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D | |
2015-04-13 07:20:06,681 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ] | |
2015-04-13 07:20:06,681 INFO Found incident settings for TO Linux Failed Login Alert | |
2015-04-13 07:20:06,681 DEBUG Incident config after getting settings: {"auto_assign_user": "", "_key": "5526b8b13403e13421162b0e", "auto_assign_owner": "unassigned", "subcategory": "unknown", "run_alert_script": false, "category": "Linux", "_user": "nobody", "alert": "TO Linux Failed Login Alert", "alert_script": "", "auto_previous_resolve": false, "auto_assign": false, "auto_ttl_resolve": false, "urgency": "high", "tags": "[Linux]"} | |
2015-04-13 07:20:06,689 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 2215 results. | |
2015-04-13 07:20:06,707 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-13 07:20:06,707 DEBUG Transformed 24h into 86400 seconds | |
2015-04-13 07:20:06,733 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-13 07:20:06,733 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-13 07:20:06,733 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-13 07:20:06,733 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-13 07:20:06,733 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934800_28828 | |
2015-04-13 07:20:07,076 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=01f20ed2-5fbf-4f09-be7e-efc5a21adf33 | |
2015-04-13 07:20:07,105 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-13 07:20:07,105 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-13 07:20:07,105 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-13 07:20:07,105 DEBUG Matched priority in lookup, returning value=high | |
2015-04-13 07:20:07,139 DEBUG Create event will be: time=2015-04-13T07:20:07.139145 severity=INFO origin="alert_handler" event_id="b067b23179892d22a812636edc57a4d5" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="01f20ed2-5fbf-4f09-be7e-efc5a21adf33" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934800_28828" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428934801" | |
2015-04-13 07:20:07,146 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934800_28828 with incident_id=01f20ed2-5fbf-4f09-be7e-efc5a21adf33 | |
2015-04-13 07:20:07,244 DEBUG results for incident_id=01f20ed2-5fbf-4f09-be7e-efc5a21adf33 written to collection. | |
2015-04-13 07:20:07,245 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428934800_28828 incident_id=01f20ed2-5fbf-4f09-be7e-efc5a21adf33 result_id=0 written to collection incident_results | |
2015-04-13 07:20:07,245 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-13 07:20:07,252 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-13 07:20:07,252 INFO Alert handler finished. duration=0.853s | |
2015-04-13 07:30:05,552 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428935400_28876/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428935400_28876 sessionKey=8ZpuiBKNfCDUd5_9MaqXPpo7DbHiQqRgMYAK_ZDb0iPu7XUWwd0_Jul_NoWWMiMbFIBwYpelwUzUQ5FQizbQylZW6okwTuKSYCdTrPG7QsgtEJXV0EkCbOmRrPpqj_TSi6u7m9yt0_wMkEE3 alert=TO Linux Failed Login Alert | |
2015-04-13 07:30:05,558 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428935400_28876' has been fired. | |
2015-04-13 07:30:05,851 DEBUG Parsed global alert handler settings: {"default_owner": "unassigned", "default_impact": "low", "index": "alerts-to-inf", "default_urgency": "low", "default_priority": "low"} | |
2015-04-13 07:30:05,851 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D | |
2015-04-13 07:30:05,858 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ] | |
2015-04-13 07:30:05,858 INFO Found incident settings for TO Linux Failed Login Alert | |
2015-04-13 07:30:05,858 DEBUG Incident config after getting settings: {"alert_script": "", "urgency": "high", "auto_previous_resolve": false, "auto_assign_owner": "unassigned", "run_alert_script": false, "_key": "5526b8b13403e13421162b0e", "_user": "nobody", "auto_assign": false, "auto_assign_user": "", "auto_ttl_resolve": false, "alert": "TO Linux Failed Login Alert", "tags": "[Linux]", "category": "Linux", "subcategory": "unknown"} | |
2015-04-13 07:30:05,866 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 2991 results. | |
2015-04-13 07:30:05,885 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-13 07:30:05,885 DEBUG Transformed 24h into 86400 seconds | |
2015-04-13 07:30:05,913 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-13 07:30:05,914 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-13 07:30:05,914 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-13 07:30:05,914 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-13 07:30:05,914 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428935400_28876 | |
2015-04-13 07:30:06,296 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=6b86205c-8a60-45d4-9c26-f63eb119b706 | |
2015-04-13 07:30:06,324 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-13 07:30:06,325 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-13 07:30:06,325 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'} | |
2015-04-13 07:30:06,325 DEBUG Matched priority in lookup, returning value=high | |
2015-04-13 07:30:06,360 DEBUG Create event will be: time=2015-04-13T07:30:06.360726 severity=INFO origin="alert_handler" event_id="bd16620547b33de767387c5c4363d365" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="6b86205c-8a60-45d4-9c26-f63eb119b706" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428935400_28876" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428935401" | |
2015-04-13 07:30:06,367 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428935400_28876 with incident_id=6b86205c-8a60-45d4-9c26-f63eb119b706 | |
2015-04-13 07:30:06,468 DEBUG results for incident_id=6b86205c-8a60-45d4-9c26-f63eb119b706 written to collection. | |
2015-04-13 07:30:06,468 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428935400_28876 incident_id=6b86205c-8a60-45d4-9c26-f63eb119b706 result_id=0 written to collection incident_results | |
2015-04-13 07:30:06,468 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-13 07:30:06,476 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-13 07:30:06,476 INFO Alert handler finished. duration=0.925s | |
2015-04-13 07:40:05,073 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936000_28922/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936000_28922 sessionKey=dvQSIzwpMokFjPOVAAas4xnenDw3jjUz6iVxCavWhJEIp80eT_ub_PwJkSItO1hLGRyElmrMVGpXPhYd7d03Dm3jvsJyjDRmUACMuquFLxPmnbI8moj3gA38AcxHtObm7rqre6Z6zVg alert=TO Linux Failed Login Alert | |
2015-04-13 07:40:05,078 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936000_28922' has been fired. | |
2015-04-13 07:40:05,346 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_impact": "low", "default_owner": "unassigned", "default_priority": "low"} | |
2015-04-13 07:40:05,346 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D | |
2015-04-13 07:40:05,353 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ] | |
2015-04-13 07:40:05,353 INFO Found incident settings for TO Linux Failed Login Alert | |
2015-04-13 07:40:05,353 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "auto_assign_user": "", "_key": "5526b8b13403e13421162b0e", "auto_assign_owner": "unassigned", "category": "Linux", "alert": "TO Linux Failed Login Alert", "subcategory": "unknown", "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "urgency": "high", "tags": "[Linux]", "alert_script": "", "_user": "nobody"} | |
2015-04-13 07:40:05,361 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 3457 results. | |
2015-04-13 07:40:05,380 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-13 07:40:05,380 DEBUG Transformed 24h into 86400 seconds | |
2015-04-13 07:40:05,410 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-13 07:40:05,410 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-13 07:40:05,411 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-13 07:40:05,411 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-13 07:40:05,411 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936000_28922 | |
2015-04-13 07:40:05,783 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=ec107955-a8f2-480b-a294-677acde91685 | |
2015-04-13 07:40:05,816 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-13 07:40:05,816 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-13 07:40:05,816 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-13 07:40:05,816 DEBUG Matched priority in lookup, returning value=high | |
2015-04-13 07:40:05,854 DEBUG Create event will be: time=2015-04-13T07:40:05.854559 severity=INFO origin="alert_handler" event_id="beba6730ccd7f826d8f94bb89c0ede06" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="ec107955-a8f2-480b-a294-677acde91685" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936000_28922" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428936001" | |
2015-04-13 07:40:05,861 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936000_28922 with incident_id=ec107955-a8f2-480b-a294-677acde91685 | |
2015-04-13 07:40:05,961 DEBUG results for incident_id=ec107955-a8f2-480b-a294-677acde91685 written to collection. | |
2015-04-13 07:40:05,962 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936000_28922 incident_id=ec107955-a8f2-480b-a294-677acde91685 result_id=0 written to collection incident_results | |
2015-04-13 07:40:05,962 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-13 07:40:05,970 INFO Alert metadata written to index=alerts-to-inf | |
2015-04-13 07:40:05,970 INFO Alert handler finished. duration=0.898s | |
2015-04-13 07:50:04,985 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936600_28974/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936600_28974 sessionKey=PcituHFZmfzh45N44exnEfXHIRI16FpsZVRC1AFBr9bg4cRz3j4_PDJNLo5QbG9912yk32U5oTYOPkQTSUP_vP3YxGMcjNhUf7TG2AU2XcDZzLZxbCcejWJH_g6NNWQxGQtNGZB26AJydOTG alert=TO Linux Failed Login Alert | |
2015-04-13 07:50:04,991 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936600_28974' has been fired. | |
2015-04-13 07:50:05,262 DEBUG Parsed global alert handler settings: {"default_urgency": "low", "index": "alerts-to-inf", "default_priority": "low", "default_impact": "low", "default_owner": "unassigned"} | |
2015-04-13 07:50:05,263 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D | |
2015-04-13 07:50:05,269 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ] | |
2015-04-13 07:50:05,269 INFO Found incident settings for TO Linux Failed Login Alert | |
2015-04-13 07:50:05,269 DEBUG Incident config after getting settings: {"auto_previous_resolve": false, "category": "Linux", "tags": "[Linux]", "alert_script": "", "urgency": "high", "alert": "TO Linux Failed Login Alert", "run_alert_script": false, "auto_assign": false, "auto_ttl_resolve": false, "_key": "5526b8b13403e13421162b0e", "auto_assign_user": "", "auto_assign_owner": "unassigned", "subcategory": "unknown", "_user": "nobody"} | |
2015-04-13 07:50:05,277 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 1317 results. | |
2015-04-13 07:50:05,295 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True | |
2015-04-13 07:50:05,295 DEBUG Transformed 24h into 86400 seconds | |
2015-04-13 07:50:05,320 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager | |
2015-04-13 07:50:05,320 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding... | |
2015-04-13 07:50:05,320 DEBUG Querying lookup with filter={'severity_id': '4'} | |
2015-04-13 07:50:05,321 DEBUG Matched impact in lookup, returning value=medium | |
2015-04-13 07:50:05,321 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936600_28974 | |
2015-04-13 07:50:05,650 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=e0526ede-9fd2-4ce5-8ac5-1f4f20104c91 | |
2015-04-13 07:50:05,676 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager | |
2015-04-13 07:50:05,676 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding... | |
2015-04-13 07:50:05,676 DEBUG Querying lookup with filter={'impact': 'medium', 'urgency': u'high'} | |
2015-04-13 07:50:05,676 DEBUG Matched priority in lookup, returning value=high | |
2015-04-13 07:50:05,704 DEBUG Create event will be: time=2015-04-13T07:50:05.704183 severity=INFO origin="alert_handler" event_id="7151d028678a5145308d52f6625c29e8" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="e0526ede-9fd2-4ce5-8ac5-1f4f20104c91" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936600_28974" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428936601" | |
2015-04-13 07:50:05,710 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936600_28974 with incident_id=e0526ede-9fd2-4ce5-8ac5-1f4f20104c91 | |
2015-04-13 07:50:05,775 DEBUG results for incident_id=e0526ede-9fd2-4ce5-8ac5-1f4f20104c91 written to collection. | |
2015-04-13 07:50:05,775 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428936600_28974 incident_id=e0526ede-9fd2-4ce5-8ac5-1f4f20104c91 result_id=0 written to collection incident_results | |
2015-04-13 07:50:05,775 INFO Attempting Alert metadata write to index=alerts-to-inf | |
2015-04-13 07:50:05,783 INFO Alert metadata written to index=alerts-to-inf
2015-04-13 07:50:05,783 INFO Alert handler finished. duration=0.798s
2015-04-13 08:00:04,526 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428937200_29011/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428937200_29011 sessionKey=z2x42L_7rLp4X6OtYbozjVaZ8ZykYQwMV^ZXnUL9m4Myjv32RDYzDYDzaogv8YE32ET6aZJePjMtQ6^sKFMx5Ruykg_QpQsOrXuuB8eez2R2jj^zq6bY36ChkwAjunInPpseVoJyyZ0 alert=TO Linux Failed Login Alert
2015-04-13 08:00:04,532 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428937200_29011' has been fired.
2015-04-13 08:00:04,808 DEBUG Parsed global alert handler settings: {"default_impact": "low", "index": "alerts-to-inf", "default_owner": "unassigned", "default_urgency": "low", "default_priority": "low"}
2015-04-13 08:00:04,808 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D
2015-04-13 08:00:04,814 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ]
2015-04-13 08:00:04,814 INFO Found incident settings for TO Linux Failed Login Alert
2015-04-13 08:00:04,814 DEBUG Incident config after getting settings: {"run_alert_script": false, "auto_assign": false, "auto_assign_owner": "unassigned", "urgency": "high", "auto_assign_user": "", "tags": "[Linux]", "alert_script": "", "alert": "TO Linux Failed Login Alert", "category": "Linux", "_user": "nobody", "_key": "5526b8b13403e13421162b0e", "auto_previous_resolve": false, "subcategory": "unknown", "auto_ttl_resolve": false}
2015-04-13 08:00:04,822 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 3 results.
2015-04-13 08:00:04,847 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-13 08:00:04,847 DEBUG Transformed 24h into 86400 seconds
2015-04-13 08:00:04,874 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-13 08:00:04,874 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-13 08:00:04,874 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-13 08:00:04,874 DEBUG Matched impact in lookup, returning value=medium
2015-04-13 08:00:04,875 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428937200_29011
2015-04-13 08:00:05,189 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=5d043b21-11d9-43b2-8f80-f86629ae3b53
2015-04-13 08:00:05,218 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-13 08:00:05,218 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-13 08:00:05,219 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-13 08:00:05,219 DEBUG Matched priority in lookup, returning value=high
2015-04-13 08:00:05,271 DEBUG Create event will be: time=2015-04-13T08:00:05.271129 severity=INFO origin="alert_handler" event_id="269eee6722a3933767aa05f44a701f46" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="5d043b21-11d9-43b2-8f80-f86629ae3b53" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428937200_29011" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428937201"
2015-04-13 08:00:05,278 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428937200_29011 with incident_id=5d043b21-11d9-43b2-8f80-f86629ae3b53
2015-04-13 08:00:05,306 DEBUG results for incident_id=5d043b21-11d9-43b2-8f80-f86629ae3b53 written to collection.
2015-04-13 08:00:05,306 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428937200_29011 incident_id=5d043b21-11d9-43b2-8f80-f86629ae3b53 result_id=0 written to collection incident_results
2015-04-13 08:00:05,306 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-13 08:00:05,314 INFO Alert metadata written to index=alerts-to-inf
2015-04-13 08:00:05,314 INFO Alert handler finished. duration=0.789s
2015-04-13 11:40:04,557 DEBUG Parsed arguments: job_path=/opt/splunk/var/run/splunk/dispatch/scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428950400_30135/results.csv.gz job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428950400_30135 sessionKey=yA_WEUPbfFE4qpNdd2shMY6vwZ2AqStDCD9KfK9ZSUrPJbqDm37o6RD5u0cH3GROMJATmBlTvy^nX3u1^D_WuWVc5Q2zLgAD4hGaU^y3A7e2dqMlD6nXMDjSk9Zglai^feruVTl6rHy8cUbV5N alert=TO Linux Failed Login Alert
2015-04-13 11:40:04,563 INFO alert_handler started because alert 'TO Linux Failed Login Alert' with id 'scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428950400_30135' has been fired.
2015-04-13 11:40:04,844 DEBUG Parsed global alert handler settings: {"default_priority": "low", "default_owner": "unassigned", "default_urgency": "low", "default_impact": "low", "index": "alerts-to-inf"}
2015-04-13 11:40:04,844 DEBUG Query for alert settings: %7B%22alert%22%3A%20%22TO%20Linux%20Failed%20Login%20Alert%22%7D
2015-04-13 11:40:04,850 DEBUG Incident settings: [ { "auto_assign_owner" : "unassigned", "run_alert_script" : false, "urgency" : "high", "alert" : "TO Linux Failed Login Alert", "tags" : "[Linux]", "category" : "Linux", "auto_assign" : false, "auto_previous_resolve" : false, "auto_ttl_resolve" : false, "subcategory" : "unknown", "_user" : "nobody", "_key" : "5526b8b13403e13421162b0e" } ]
2015-04-13 11:40:04,850 INFO Found incident settings for TO Linux Failed Login Alert
2015-04-13 11:40:04,850 DEBUG Incident config after getting settings: {"subcategory": "unknown", "urgency": "high", "_key": "5526b8b13403e13421162b0e", "auto_ttl_resolve": false, "auto_assign_owner": "unassigned", "auto_assign_user": "", "alert_script": "", "auto_assign": false, "tags": "[Linux]", "category": "Linux", "alert": "TO Linux Failed Login Alert", "_user": "nobody", "run_alert_script": false, "auto_previous_resolve": false}
2015-04-13 11:40:04,858 INFO Found job for alert TO Linux Failed Login Alert. Context is 'search' with 133 results.
2015-04-13 11:40:04,881 DEBUG Parsed savedsearch settings: severity=4 expiry=24h digest_mode=True
2015-04-13 11:40:04,882 DEBUG Transformed 24h into 86400 seconds
2015-04-13 11:40:04,907 DEBUG Got lookup content for lookup=alert_impact. filename=alert_impact.csv app=alert_manager
2015-04-13 11:40:04,907 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_impact.csv found. Proceeding...
2015-04-13 11:40:04,907 DEBUG Querying lookup with filter={'severity_id': '4'}
2015-04-13 11:40:04,907 DEBUG Matched impact in lookup, returning value=medium
2015-04-13 11:40:04,907 INFO Creating incident for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428950400_30135
2015-04-13 11:40:05,212 DEBUG No valid urgency field found in results. Falling back to default_urgency=high for incident_id=7b59e60c-1234-48f9-beac-65b322e72f7e
2015-04-13 11:40:05,237 DEBUG Got lookup content for lookup=alert_priority. filename=alert_priority.csv.sample app=alert_manager
2015-04-13 11:40:05,237 DEBUG Lookup file /opt/splunk/etc/apps/alert_manager/lookups/alert_priority.csv.sample found. Proceeding...
2015-04-13 11:40:05,238 DEBUG Querying lookup with filter={'urgency': u'high', 'impact': 'medium'}
2015-04-13 11:40:05,238 DEBUG Matched priority in lookup, returning value=high
2015-04-13 11:40:05,270 DEBUG Create event will be: time=2015-04-13T11:40:05.270145 severity=INFO origin="alert_handler" event_id="d77472d95d9fc64810f0a77fe2596eab" user="splunk-system-user" action="create" alert="TO Linux Failed Login Alert" incident_id="7b59e60c-1234-48f9-beac-65b322e72f7e" job_id="scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428950400_30135" result_id="0" owner="unassigned" status="new" urgency="high" ttl="86400" alert_time="1428950401"
2015-04-13 11:40:05,276 INFO Incident initial state added to collection for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428950400_30135 with incident_id=7b59e60c-1234-48f9-beac-65b322e72f7e
2015-04-13 11:40:05,304 DEBUG results for incident_id=7b59e60c-1234-48f9-beac-65b322e72f7e written to collection.
2015-04-13 11:40:05,304 INFO Alert results for job_id=scheduler__XXmeXX__search__RMD53e172472389e3d2b_at_1428950400_30135 incident_id=7b59e60c-1234-48f9-beac-65b322e72f7e result_id=0 written to collection incident_results
2015-04-13 11:40:05,304 INFO Attempting Alert metadata write to index=alerts-to-inf
2015-04-13 11:40:05,311 INFO Alert metadata written to index=alerts-to-inf
2015-04-13 11:40:05,311 INFO Alert handler finished. duration=0.754s