This code:
| makeresults
| eval StartTime = strptime("2018-01-01 00:00:00", "%Y-%m-%d %H:%M:%S")
| eval EndTime = strptime("2018-01-01 00:10:00", "%Y-%m-%d %H:%M:%S")
| eval Elapsed = EndTime - StartTime
| fieldformat Elapsed = strftime(Elapsed, "%H:%M:%S")
results in Elapsed=08:10:00, or eight hours and ten minutes, instead of just ten minutes.
I would like some assistance on why this happens and how to fix it. Thanks a lot!
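This is most likely because strftime() treats its first argument as a Unix timestamp (seconds since the epoch) and renders it in the search head's time zone, so an Elapsed of 600 seconds is formatted as the wall-clock time 600 seconds after the epoch (e.g. 08:10:00 if the search head is in a UTC+8 time zone) rather than as a duration. A minimal sketch of one way around this, using tostring() with the "duration" option instead of strftime():
| makeresults
| eval StartTime = strptime("2018-01-01 00:00:00", "%Y-%m-%d %H:%M:%S")
| eval EndTime = strptime("2018-01-01 00:10:00", "%Y-%m-%d %H:%M:%S")
| eval Elapsed = EndTime - StartTime
| fieldformat Elapsed = tostring(Elapsed, "duration")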
↧
Invalid result when getting the span between two dates
↧
Incorrect timestamp displayed
Even though TIME_PREFIX, MAX_TIMESTAMP_LOOKAHEAD, TIME_FORMAT, and so on are correctly defined in props.conf, the timestamp shown in the search results (the _time value) differs from the actual timestamp in the event.
For example, "2018 Jan 11 16:36:16" is displayed as "1970 Jan 1 12:05:43".
Could you tell me where the problem might be?
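A _time in 1970 usually means the timestamp processor did not apply TIME_FORMAT to the text it was meant to and instead parsed some other value, so it is worth re-checking that TIME_PREFIX points just before the timestamp and that MAX_TIMESTAMP_LOOKAHEAD is long enough. A minimal props.conf sketch for a timestamp like "2018 Jan 11 16:36:16", assuming the timestamp sits at the start of the event and using a hypothetical sourcetype name (this must be applied on the indexer or heavy forwarder and only affects newly indexed data):
[my_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %Y %b %d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 25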
↧
↧
blacklisting file with pattern in the filename
Hello,
We would like to exclude some files from indexing using a blacklist. At the moment it looks as follows and works fine:
`blacklist =rtedump|_alert_|available\.log$|nameserver_history\.trc$|table_consistency_check|\.(?i:gz|json|old|py|tar|txt|xml|zip)$`
I would like to additionally exclude the user traces, which can be identified by the following ending pattern (checked on regex101.com):
`[ICDicd]\d{6,}\.trc`
What would the new blacklist have to look like? Would it just be an extension of the existing one, as follows?
`blacklist =[ICDicd]\d{6,}\.trc|rtedump|_alert_|available\.log$|nameserver_history\.trc$|table_consistency_check|\.(?i:gz|json|old|py|tar|txt|xml|zip)$`
Kind Regards,
Kamil
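Since blacklist values in inputs.conf are unanchored regular expressions matched against the full file path, the combined value proposed above should work as written. One hedged refinement: anchoring the new alternative with `$`, as the existing `\.log$` and `\.trc$` alternatives already do, keeps it from matching that pattern somewhere in the middle of a path:
`blacklist =[ICDicd]\d{6,}\.trc$|rtedump|_alert_|available\.log$|nameserver_history\.trc$|table_consistency_check|\.(?i:gz|json|old|py|tar|txt|xml|zip)$`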
↧
Why is the extracted field not shown and not available for search?
I extracted three fields.
The data is `\\VMMSNEWPALM2SER\Process(TIDC.Imports)\% Privileged Time, ,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0`
I want to extract `VMMSNEWPALM2SER`, `Process(TIDC.Imports)`, and `% Privileged Time`.
My rex patterns are `\\\\(?P\w+)`, `\w+\\\\(?P.*)\\\\`, and `^[^\)\n]*\)\\(?P.*?),`
![alt text][1]
But now I can't always use the `v_fields` to find data, even though I'm sure I've extracted it.
![alt text][2]
And I can see the `v_fields` in the fields sidebar on the left.
Why?
What should I do?
[1]: /storage/temp/254910-a1.png
[2]: /storage/temp/254909-a3.png
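As a hedged side note on the extractions themselves: backslash escaping in rex is easy to get wrong with paths like this, and one way to sidestep it is to split the raw event on the backslash with eval. The field names server, object, and counter below are hypothetical, and counting segments from the end (negative mvindex positions) avoids having to reason about the leading double backslash:
| eval parts = split(_raw, "\\")
| eval server = mvindex(parts, -3)
| eval object = mvindex(parts, -2)
| eval counter = mvindex(split(mvindex(parts, -1), ","), 0)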
↧
same search query returns different number of results
Hi,
I have Splunk Free (I am afraid this is not present in the "choose product" list; I switched from "Enterprise Trial"...).
I am using the same user (there is only the admin user in Splunk Free) and run a very simple query several times:
host="abc-def.csv"
with the time picker set to "All time". Moreover, the indexed records do not change during the searches (a one-time CSV load).
Also, event sampling is set to "No event sampling".
Now, strangely, I get a different number of events returned each time (e.g. ranging from 132k to 169k events).
Why is this so? Is there some kind of timeout, and how can I increase it?
There are several similar posts, but none of them apply here - e.g. I use a single user and the index does not change.
Thanks!
Best Regards
Florian
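One hedged way to narrow this down is to force the search to run to completion and report a single number, rather than reading the event count while the search is still loading; if this count is stable across runs, the variation is likely a progress artifact of the UI rather than a real difference in matching events:
host="abc-def.csv" | stats count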
↧
↧
dynamic comparison of today to last week
index = abc App_Name=xyz earliest=-0d@d latest=now
| multikv
| eval ReportKey="Today"
| append [search index = abc App_Name=xyz earliest=-7d@d latest=-6d@d | multikv | eval ReportKey="LastWeek" | eval _time=_time+60*60*24*7]
| eval _time=if(isnotnull(new_time), new_time, _time)
| timechart span=5m sum(TOTAL_TRANSACTIONS) as Transactions by ReportKey
I have written this query to compare today with last week.
But what if someone wants to compare yesterday's data with the corresponding day last week, or the day before yesterday with its corresponding day last week, and so on?
Could you please help me write the query for that?
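A hedged sketch of the same pattern shifted back one day, comparing yesterday with the same weekday last week (the index, app, and field names are taken from the query above; for N days ago, move both time windows back by N days and keep the 7-day _time shift):
index = abc App_Name=xyz earliest=-1d@d latest=@d
| multikv
| eval ReportKey="Yesterday"
| append [search index = abc App_Name=xyz earliest=-8d@d latest=-7d@d | multikv | eval ReportKey="LastWeek" | eval _time=_time+60*60*24*7]
| timechart span=5m sum(TOTAL_TRANSACTIONS) as Transactions by ReportKey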
↧
Can we change the background color for Splunk input choices?
I have a dashboard where panels hide/show according to the linked list choices you make.
Now I want individual choices to have a specific background color and font color, but when I try to set a background color, all of the linked list choices get that background color.
↧
Issue regarding summary index fields
Hello,
I have created a scheduled search which populates a summary index from a custom index.
My main custom index has around 100 fields, but those fields are not appearing in the summary index; only the host, source, and sourcetype fields are present.
When I add `table field1, field2`, etc. to the scheduled search query, those fields do appear in the summary index. But when I use `table *` in the search query, I do not get any fields in the summary index.
I have to explicitly specify the field names in `table`, which is tedious considering the number of fields.
Is there any way to fix this issue?
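This behavior is consistent with only the fields present in the final result table being written to the summary index; summarized events are re-indexed under the stash sourcetype, so the original search-time extractions do not re-apply there. A minimal sketch of a commonly suggested variant, using `fields *` instead of `table *` ahead of an explicit collect (the index names are hypothetical, and whether this helps depends on how the summary is actually being populated):
index=my_custom_index
| fields *
| collect index=my_summary addtime=true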
↧
Send failure while pushing PK to search peer = https://*.*.*.*:8089, Read Timeout
I'm getting the above warning messages in the internal Splunk logs every minute from each of our three search heads.
The search peer in question is in our secondary site (let's say site B) relative to the search heads (site A), but there are two other search peers in the same site (B) that we don't get any warning messages for.
I've done a ping and netcat from each of the search heads in site A to each of the three search peers in site B, and the results are the same for each one: connection established and similar ping times.
It doesn't appear to be a connection issue, so I'm wondering what else could be causing it.
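Given that the message is specifically a read timeout while the search heads push the key to that one peer, one hedged thing to look at is the distributed search timeouts on the search heads. A sketch of distsearch.conf with illustrative values (check the distsearch.conf.spec shipped with your version before applying):
[distributedSearch]
connectionTimeout = 30
sendTimeout = 120
receiveTimeout = 120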
↧
↧
How to create a dashboard with dependencies between assets, like a tree or topology?
How can I create a dashboard showing dependencies between assets, like a tree or topology, similar to the one used in the "IT Service Intelligence" app?
Thank you very much in advance.
↧
Combine multiple cron jobs into a single cron job for a single DB input
Hi All,
I have a DB input created in the Splunk DB Connect app. I want to execute the query on a cron schedule. The problem is that I want to run the first job every 45 minutes from 0:00 to 12:00, and then the second job should run every hour from 13:00 to 23:00.
Is there any way to execute these two schedules as a single cron job? Any help will be much appreciated.
Thanks
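A standard five-field cron expression cannot express both windows at once: the minute and hour fields combine independently, so a single expression listing minutes 0,45 with hours 0-23 would also fire at :45 during the 13:00-23:00 window, and a true 45-minute cadence (0:00, 0:45, 1:30, ...) cannot be written in one expression at all. A hedged sketch of the closest approximation as two separate schedules (i.e. two DB inputs, or one broader schedule filtered in the query):
# runs at :00 and :45 of each hour from 00:00 through 12:45 (approximates "every 45 minutes")
0,45 0-12 * * *
# runs at the top of every hour from 13:00 through 23:00
0 13-23 * * *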
↧
want to trigger severity based on two violations and the criteria below
sourcetype=xreGuide XRE-07*** IS_VISIBLE=true
| bucket _time span=10m
| stats dc(receiverId) as receiverIds by _time
| eval psev=case(receiverIds<=499, "4", receiverIds<=9999, "2", receiverIds>10000, "1")
| eventstats count as VIOLATIONS by psev
| eval severity=if(VIOLATIONS>1 AND psev=3, 3, 4)
| eventstats min(severity) as overallSeverity
| fields _time receiverIds overallSeverity
| rename overallSeverity as severity
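One hedged observation about the query above: case() never assigns psev a value of 3 (its branches return "4", "2", and "1"), so the later test psev=3 can never be true, and a receiverIds of exactly 10000 falls through every branch. A sketch with numeric values and the gap closed, where comparing against 2 is only an illustration of referencing a value that case() can actually return:
| eval psev=case(receiverIds<=499, 4, receiverIds<=9999, 2, receiverIds>=10000, 1)
| eventstats count as VIOLATIONS by psev
| eval severity=if(VIOLATIONS>1 AND psev=2, 3, 4)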
↧
"View Capabilities" page missing...
We upgraded our Splunk instance and found that when you click on "view capabilities" for a user on the Access Controls >> Users page, it takes you to a great picture of Buttercup and a 404. Does anyone know what to restore to fix this?
↧
↧
Rex Help in a search query
I have field values in the formats below and need to extract the trailing extensions (cjs, js, etc.) from them and store them in a separate field. Can anyone help me with this? Thanks!
sample=/abc/test/ipts/jquery-1.3.1-vsdoc.cjs
sample=/abc/test/ipts/jquery-1.3.js
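A hedged sketch, assuming the extension is whatever follows the last dot at the end of the value and using a hypothetical output field name:
| rex field=sample "\.(?<extension>[^./]+)$"
If sample is not already an extracted field, the same pattern can be run against _raw instead (rex field=_raw ...).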
↧
Custom Adaptive Response Action in ES with Validation
Hello,
I'm unable to get field validation working in a custom Adaptive Response Action in Splunk Enterprise Security. What I want to achieve is validation that obliges the user to fill in a field (a required field), but I can't get even the simplest validation working. When I click the run button in the adaptive actions modal view on the incident, I get no validation, just a message saying "action has been dispatched".
Furthermore, which settings should I put in alert_actions.conf.spec and savedsearches.conf.spec? The documentation I have read is quite vague.
Thanks!
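On the spec-file part of the question, a minimal sketch following the custom alert action convention that adaptive response actions build on, with a hypothetical action name and parameter: parameters are declared as param.<name> in the app's README/alert_actions.conf.spec and as action.<action_name>.param.<name> in README/savedsearches.conf.spec.
# README/alert_actions.conf.spec
[myaction]
param.target_field = <string>
# README/savedsearches.conf.spec
action.myaction.param.target_field = <string>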
↧
Can you help me trigger severity based on two violations and the criteria below?
sourcetype=xreGuide XRE-07*** IS_VISIBLE=true
| bucket _time span=10m
| stats dc(receiverId) as receiverIds by _time
| eval psev=case(receiverIds<=499, "4", receiverIds<=9999, "2", receiverIds>10000, "1")
| eventstats count as VIOLATIONS by psev
| eval severity=if(VIOLATIONS>1 AND psev=3, 3, 4)
| eventstats min(severity) as overallSeverity
| fields _time receiverIds overallSeverity
| rename overallSeverity as severity
↧
JSON format log getting truncated
I have a log which has JSON-formatted lines in the middle. Splunk is indexing the log but is truncating the JSON part to 26 lines. How do I get the full log without Splunk truncating the JSON lines?
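Event truncation is usually governed by props.conf on the indexer or heavy forwarder: TRUNCATE caps the bytes per event (default 10000) and, for line-merged events, MAX_EVENTS caps the lines per event (default 256). A hedged sketch with a hypothetical sourcetype name and illustrative limits (this only affects newly indexed data):
[my_json_sourcetype]
TRUNCATE = 200000
MAX_EVENTS = 1000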
↧
↧
Could you help me use rex to extract end value extensions from field values?
I have field values in the formats below and I need to extract the trailing extensions (cjs, js, etc.) from them and store them in a separate field. Can anyone help me with this? Thanks!
sample=/abc/test/ipts/jquery-1.3.1-vsdoc.cjs
sample=/abc/test/ipts/jquery-1.3.js
↧
In a dashboard, can we change the background color for Splunk input choices?
I have a dashboard where panels hide/show according to the linked list choices you make.
Now, I want individual choices to have a specific background color and font color, but when I try to use a certain background color, all of the linked list choices get that background color.
↧