One of our users says his alert falsely triggered five times today: at 2:15 AM, 2:45 AM, 3:30 AM, 3:45 AM, and 5:30 AM.
Alert condition - if number of events is less than 1
Schedule - every 15 minutes
Query:
earliest=-1h@h latest=@h index=*** host=**** source=*/***access* OR source=*/****access* "POST" | rex "POST (?<service>\S+)" | stats count by service
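For reference, here is a quick way to see what window `-1h@h` / `@h` resolves to for a given run time (just a sketch for illustration; the 02:15 timestamp is only an example, and the hour snapping follows the search user's timezone):

| makeresults | eval run_time=strptime("05/20/2016 02:15:00", "%m/%d/%Y %H:%M:%S") | eval window_earliest=strftime(relative_time(run_time, "-1h@h"), "%Y-%m-%d %H:%M:%S"), window_latest=strftime(relative_time(run_time, "@h"), "%Y-%m-%d %H:%M:%S") | table window_earliest window_latest

So a run at 02:15 searches the 01:00:00–02:00:00 hour of the same day.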
I searched the corresponding hour windows as shown below and found results in all three.
earliest="05/20/2016:02:00:00" latest="05/20/2016:03:00:00" index=*** host=**** source=*/***access* OR source=*/****access* "POST" | rex "POST (?\S+)" | stats count by service
earliest="05/20/2016:03:00:00" latest="05/20/2016:04:00:00" index=*** host=**** source=*/***access* OR source=*/****access* "POST" | rex "POST (?\S+)" | stats count by service
earliest="05/20/2016:05:00:00" latest="05/20/2016:06:00:00" index=*** host=**** source=*/***access* OR source=*/****access* "POST" | rex "POST (?\S+)" | stats count by service
The scheduler.log entries below show that the alert triggered the email action and that result_count was 0 for every run.
`index=_internal source="/apps/splunk/var/log/splunk/scheduler.log" savedsearch_name="" alert_actions="email"`
05-20-2016 05:30:24.340 -0400 INFO SavedSplunker - savedsearch_id="**************************", user="*****", app="***", savedsearch_name="**************************", status=success, digest_mode=1, scheduled_time=1463736600, window_time=0, dispatch_time=1463736621, run_time=1.067, result_count=0, alert_actions="email", sid="scheduler__****__***__*******_at_1463736600_50_F1061003-1E94-4141-A542-EA5DBA9F15DE", suppressed=0, thread_id="AlertNotifierWorker-0"
05-20-2016 03:45:42.383 -0400 INFO SavedSplunker - savedsearch_id="**************************", user="*****", app="***", savedsearch_name="**************************", status=success, digest_mode=1, scheduled_time=1463730300, window_time=0, dispatch_time=1463730340, run_time=0.438, result_count=0, alert_actions="email", sid="scheduler__****__***__*******_at_1463736600_50_F1061003-1E94-4141-A542-EA5DBA9F15DE", suppressed=0, thread_id="AlertNotifierWorker-0"
05-20-2016 03:30:43.814 -0400 INFO SavedSplunker - savedsearch_id="**************************", user="*****", app="***", savedsearch_name="**************************", status=success, digest_mode=1, scheduled_time=1463729400, window_time=0, dispatch_time=1463729441, run_time=0.611, result_count=0, alert_actions="email", sid="scheduler__****__***__*******_at_1463729400_386_F1061003-1E94-4141-A542-EA5DBA9F15DE", suppressed=0, thread_id="AlertNotifierWorker-0"
05-20-2016 02:45:04.295 -0400 INFO SavedSplunker - savedsearch_id="**************************", user="*****", app="***", savedsearch_name="**************************", status=success, digest_mode=1, scheduled_time=1463726700, window_time=0, dispatch_time=1463726702, run_time=0.814, result_count=0, alert_actions="email", sid="scheduler__****__***__*******_at_1463726700_443_F1061003-1E94-4141-A542-EA5DBA9F15DE", suppressed=0, thread_id="AlertNotifierWorker-0"
05-20-2016 02:15:44.778 -0400 INFO SavedSplunker - savedsearch_id="**************************", user="*****", app="***", savedsearch_name="**************************", status=success, digest_mode=1, scheduled_time=1463724900, window_time=0, dispatch_time=1463724942, run_time=0.406, result_count=0, alert_actions="email", sid="scheduler__****__***__*******_att_1463724900_197_F1061003-1E94-4141-A542-EA5DBA9F15DE", suppressed=0, thread_id="AlertNotifierWorker-0"
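To line each zero-result run up against the window it actually searched, scheduler.log can be tabulated like this (a sketch; it assumes the alert's -1h@h/@h range and uses the scheduled_time epoch field shown in the entries above):

`index=_internal source="/apps/splunk/var/log/splunk/scheduler.log" savedsearch_name="**************************" alert_actions="email" | eval window_earliest=strftime(relative_time(tonumber(scheduled_time), "-1h@h"), "%Y-%m-%d %H:%M:%S"), window_latest=strftime(relative_time(tonumber(scheduled_time), "@h"), "%Y-%m-%d %H:%M:%S") | table _time scheduled_time window_earliest window_latest result_count run_time status`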
The python.log entries below show that the email was sent out each time.
`index=_internal source="/apps/splunk/var/log/splunk/python.log" host="*" subject="*****"`
2016-05-20 05:30:24,319 -0400 INFO sendemail:112 - Sending email. subject="************", results_link="**********", recipients="[u'****', u'****', u'****']", server="localhost"
2016-05-20 03:45:42,352 -0400 INFO sendemail:112 - Sending email. subject="************", results_link="**********", recipients="[u'****', u'****', u'****']", server="localhost"
2016-05-20 03:30:43,795 -0400 INFO sendemail:112 - Sending email. subject="************", results_link="**********", recipients="[u'****', u'****', u'****']", server="localhost"
2016-05-20 02:45:04,277 -0400 INFO sendemail:112 - Sending email. subject="************", results_link="**********", recipients="[u'****', u'****', u'****']", server="localhost"
2016-05-20 02:15:44,761 -0400 INFO sendemail:112 - Sending email. subject="************", results_link="**********", recipients="[u'****', u'****', u'****']", server="localhost"
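To count how many notification emails actually went out overnight, the sendemail entries can be charted per 15-minute slot (a sketch, reusing the masked subject from above):

`index=_internal source="/apps/splunk/var/log/splunk/python.log" "Sending email" subject="*****" earliest="05/20/2016:00:00:00" latest="05/20/2016:06:00:00" | timechart span=15m count`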
So does this mean it’s a Splunk issue?
Maybe the indexers were not available to return results, and that caused the alert to trigger?
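If indexer availability is the suspicion, one thing I could check is splunkd.log on the search head around those windows for warnings or errors (a rough sketch; I'm not sure which components would show the problem, so it just groups by component):

`index=_internal source=*splunkd.log* (log_level=ERROR OR log_level=WARN) earliest="05/20/2016:02:00:00" latest="05/20/2016:06:00:00" | stats count by host, component`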