We just migrated two search head VMs to a new data center. Now we are having intermittent timeouts where the search heads cannot distribute to peers, and I am seeing this error message:
`WARN GetRemoteAuthToken - Unable to get auth token from peer: https://xx.xx.xxx:8089 due to: Read Timeout; exceeded 60000 milliseconds`
↧
unable to distribute to peer - unable to get auth token - READ TIMEOUT
↧
Search finds no events even though I know the event exists in the Boss of the SOC v1 dataset
Hi everyone,
I am practicing on the events and having trouble searching the dataset. When I search for the answer directly I can see the event, but when I use a Splunk search query the answer does not appear for some reason.
Question
What is the name of the file that defaced the imreallynotbatman.com website? Please submit only the name of the file with extension (For example "notepad.exe" or "favicon.ico")
Answer is poisonivy-is-coming-for-you-batman.jpeg
So if I just search `poisonivy-is-coming-for-you-batman.jpeg`, it gives me two events:
![alt text][1]
However, when I run `sourcetype=suricata src_ip=192.168.250.70 | table url | search url=*batman*` it does not give me that event, and the same happens with a lot of questions. Any suggestion as to what is happening?
![alt text][2]
[1]: /storage/temp/255814-1.png
[2]: /storage/temp/255815-2.png
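One thing that might help narrow this down (a sketch, since I cannot see your extractions): search for the literal filename first and let stats show which fields actually carry it, then build the filtered query from that. The field could be named differently from `url` in your Suricata sourcetype, or the event direction may not match the `src_ip` you expect:

```
sourcetype=suricata "poisonivy-is-coming-for-you-batman.jpeg"
| stats count by src_ip, dest_ip, url
```

If `url` comes back empty here, the field is simply not extracted under that name, which would explain why `| table url | search url=*batman*` drops the event.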
↧
↧
Sourcetype override problems
I have the universal forwarder installed on a Windows 2012 server. I am trying to monitor a log directory for a custom application. The application creates a new log file for each month, so I have many text files in the folder that look like 201808.txt, 201807.txt, 201806.txt, etc.
When I monitor the directory, instead of applying the sourcetype I am telling Splunk to use, it sets the sourcetype to the filename. How can I fix this?
On the Windows Server, inputs.conf:
[monitor://C:\BlueIris\log]
disabled = false
sourcetype = blueiris
On the indexer, props.conf:
[blueiris]
sourcetype = blueiris
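In case it helps anyone arriving here: a sketch of a possible fix, under the assumption that the inputs.conf above lives on the universal forwarder. A `sourcetype` key inside a `[blueiris]` sourcetype stanza in props.conf does nothing; overriding by source path is done with a `[source::…]` stanza on the parsing tier (indexer or heavy forwarder):

```
# props.conf on the indexer -- forces the sourcetype for files read from this path
[source::C:\BlueIris\log\*.txt]
sourcetype = blueiris
```

Note that this only affects newly indexed data; already-indexed events keep their old sourcetype.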
↧
Calculate median for each type on the hourly aggregation
Dear all,
There are three columns of data: time (a time scale in steps of 10 minutes), val (number of transactions) and type (type of automated system, only 3 different types).
I need to aggregate the data for each type at the hour level and calculate median(val) for each type on that hourly aggregation, so the answer should be 3 time series of the same length.
What I did:
source="data.txt" | chart median(val) by type, date_hour
But the X-axis does not contain all hours; some of them are aggregated into an "OTHER" column.
Thanks in advance for the help.
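A sketch of one way around the "OTHER" grouping (assuming the field names above): `chart` truncates the number of columns by default, so either raise the limit (`chart limit=0 useother=f …`) or bin the time axis explicitly and pivot with `xyseries`:

```
source="data.txt"
| bin _time span=1h
| stats median(val) as med_val by _time, type
| xyseries _time type med_val
```

This yields one row per hour and one column per type, i.e. three aligned time series.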
↧
Bootstrap with Splunk
Has anyone ever used Bootstrap with Splunk?
I want to know where to start with this.
TIA!
↧
↧
What does bin _time span do here?
Hi,
I am having a bit of difficulty understanding what **bin _time span** does here. Below is a query shared in the Splunk community to find requests per minute by OrgName per day:
index=data earliest=-1d | bin _time span=1d
| streamstats count as Req by OrgName, _time
| eval requestsPerMin=Req/24/60
| eval requestsPerSec=requestsPerMin/60
| stats avg(requestsPerMin) as avgRequestPerSec, max(requestsPerMin) as peakRequestPerMin by OrgName, _time
My confusion here is what `bin _time span` does. I also want to find requestsPerMin for last week and month-wise. Could someone explain how that can be done?
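For what it's worth, my current understanding (a sketch, assuming the same data): `bin _time span=1d` rounds every event's `_time` down to the start of its day, so the later `by OrgName, _time` clauses group events per org per day; dividing the daily count by 24*60 then gives the average requests per minute. To get the same figure per week, one could widen the span and the divisor accordingly:

```
index=data earliest=-7d
| bin _time span=1w
| stats count as Req by OrgName, _time
| eval requestsPerMin=Req/(7*24*60)
```

A `span=1mon` variant, with a divisor matching the number of days in each month, would give the monthly view.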
↧
Creating an "About this dashboard" popup modal view when opening a dashboard
Hi all!
I am developing an app which contains a lot of dashboards, and I want users to be prompted with information about each dashboard when they open it. I am also planning on letting the user check "Do not show this message again", but first things first.
How can I create a simple popup like [this](https://www.splunk.com/blog/2014/02/24/using-bootstrap-modal-with-splunk-simple-xml.html), except that it pops up when the dashboard is opened, without having to click an "info" button?
Unfortunately I have very little experience with JavaScript, but I am not afraid to use it.
I really hope someone will be able to help me out; I'm running out of keywords to Google.
↧
IMAPmailbox index returns 0 events
Hi all, I have an Outlook email account that receives real-time email notifications about PC backups, and I want to index all of these emails with my installed `IMAPmailbox` app for data analysis and visualization. But right now `index=mail` returns 0 events. Could someone help me with this?
In `inputs.conf`, I only changed the Unix stanza's `disabled` attribute to true and the Windows one to false. I also modified the configuration using the Splunk UI and checked `imap.conf`. It all looks normal.
Also, not sure if this is related, but the inbox of the email account is empty: all notifications are shown in the `Status Report` folder at the same level as the Inbox, divided into Success, Failed, Warning and Test.
But my search still returns nothing, and I would like to know if extra modifications need to be made for this to work. Thanks for your help!
↧
Two-dimensional table with static headers (row and column) and dynamic values
Hi,
I would like to create the following table:
            Blue     Yellow
    Mazda   _____    _______
    Honda   _____    _______
    Audi    _____    _______
The top row and the left column need to be static; only the values inside depend on the search results.
I have tried so many options but could not find a solution to this.
Does anyone have the correct **full** answer?
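A sketch of one approach (the `index=vehicles`, `car`, and `color` names are hypothetical): generate the static row labels with `makeresults`, then left-join the dynamic values onto them, so every car appears even when the search returns nothing for it:

```
| makeresults
| eval car=split("Mazda,Honda,Audi", ",")
| mvexpand car
| table car
| join type=left car
    [ search index=vehicles | chart count over car by color ]
| fillnull value=0
```

The column headers come from whatever the subsearch's `by color` produces; if the colors themselves must always appear even with no data, they could be seeded the same way via `appendcols` or a lookup.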
↧
↧
Convert KB to MB
Hi
I need to convert these two counters from KB to MB:
TotalSpaceKB=486757372
FreeSpaceKB=435455092
Do I have to divide them by 1024000?
Thanks
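For reference: 1 MB = 1024 KB in the binary convention (use 1000 for the decimal convention), so you divide by 1024, not 1024000 (dividing by 1024*1024 would give GB). A sketch with the values above:

```
| makeresults
| eval TotalSpaceKB=486757372, FreeSpaceKB=435455092
| eval TotalSpaceMB=round(TotalSpaceKB/1024, 2)
| eval FreeSpaceMB=round(FreeSpaceKB/1024, 2)
```

That works out to roughly 475,349 MB total and 425,249 MB free.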
↧
Ingesting InfluxDB data into Splunk
How do I ingest data from InfluxDB into Splunk using a curl command?
I get a response when I run curl on the command line.
How can I execute the same from the Splunk search head? Please advise.
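One common pattern (a sketch; the script name, interval, and index are assumptions) is to wrap the working curl call in a script and register it as a scripted input, so Splunk periodically runs it and indexes whatever it prints to stdout:

```
# inputs.conf -- hypothetical scripted input wrapping the curl call
[script://./bin/influx_poll.sh]
interval = 300
sourcetype = influxdb:metrics
index = influx
disabled = false
```

The script itself would just contain the curl command that already works on the command line.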
↧
XML unstructured data
↧
Can I collect application logs from Azure to Splunk?
I already know that I can collect application logs into Azure Application Insights and use a storage account to stream this data to Event Hubs, but can Splunk pull this data? If yes, how can I configure an input in Splunk to do that?
If someone has documentation about this, it would be very helpful.
↧
↧
compress sent reports
Hi all,
I have the problem that sometimes one of my reports exceeds the email attachment limits.
I could reduce the fields in the report, but this isn't a good solution: it wouldn't satisfy the final customer, and the problem could still occur.
I worked around it by giving the customer the ability to run the report manually, but the customer isn't fully satisfied.
Is there a way to compress (zip or tar) a report before sending it to the email system?
I think this is an important feature, and it is strange that nobody has implemented it in Splunk.
Bye.
Giuseppe
↧
Show multiple fields in a table after using mvexpand in a query
Hi, I am looking at data which includes a field with multiple values. For instance:
$name$, $products$, $country$
============================
an example of an event:
name:
Peter Thompson
products:
windows 10
office 2017
adobe reader 9
country:
Germany
============================
What I am trying to achieve is a table like:
name products country
Peter Thompson Windows 10 Germany
Peter Thompson Office 2017 Germany
Peter Thompson Adobe Reader 9 Germany
============================
To do this I am using mvexpand on the products field, which gives me the separated products sorted by rarity. However, I cannot seem to create a table after that which pulls back the other values such as the name and country. It appears that after mvexpand or rare, all other fields are lost.
============================
My query:
index=data sourcetype=stuff | mvexpand products | rare limit=10 products | eventstats count AS total by products, name | table count, name
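A sketch of a possible fix (using the field names from the question): `rare` keeps only the fields it is given, which is why name and country disappear. Grouping with `stats` preserves all three fields, and sorting ascending by count approximates the "rarest first" behaviour:

```
index=data sourcetype=stuff
| mvexpand products
| stats count by name, products, country
| sort 10 count
| table name, products, country, count
```

`sort 10 count` keeps the 10 rows with the lowest counts; drop the 10 to keep everything.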
↧
Can we use object storage for cold buckets (cold storage)?
Hello,
I am looking at cold storage options for Splunk for longer-term data retention.
Can we use object storage for this?
Has anyone tried testing it?
The Splunk version is 7.1.0 with ITSI.
↧
Create a dashboard and use a text field to search multiple strings
Hello all,
I have created a dashboard for Nessus reports. The results are huge, so our users need to exclude some of them.
For example, they need to exclude the SSH and Telnet vulnerability reports, so they need a text field to type those into, which is then used in the background as field1!=ssh AND field2!=telnet.
My question is how to deploy this with only one input field where they just type a string like "ssh telnet".
Is it possible? Or is there another solution?
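A sketch of one possible approach (assuming a dashboard text input whose token is `exclude`, and a hypothetical `signature` field holding the vulnerability name): a subsearch can split the space-separated input into values and return them as an OR clause, which the outer search negates:

```
index=nessus NOT
    [| makeresults
     | eval signature=split(trim("$exclude$"), " ")
     | mvexpand signature
     | table signature ]
```

With "ssh telnet" typed into the box, the subsearch expands to `NOT (signature="ssh" OR signature="telnet")`.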
↧
↧
How to host a Splunk distributed deployment on Microsoft Azure?
We are in the phase of deploying Splunk on Microsoft Azure. We would like to know what the limitations are if we deploy Splunk using the standard deployment option provided in Azure.
Any suggestions and help will be appreciated.
↧
Splunk 7.1 Bad Request — editTracker failed, reason='Unable to connect to license master=https://example.de:8089 Error resolving: Name or service not known'
So I tried to set up a new Splunk instance (currently a trial version) and I want that instance to be a Splunk license slave of a master (Enterprise version) in another network. I already tried to set up an A record for this (IP -> example.com) to reach the master Splunk instance from my trial instance, but every time I try, the error above appears. Am I still missing something? Thanks in advance.
Kind regards,
Daniel
↧
Calculate median values for the column for 7 weeks
Dear all,
There are two columns of data: `time` (a time scale in steps of 10 minutes) and `val` (number of transactions).
I need to calculate median values (`med_val`) of the `val` column over 7 weeks. For example, for the point 12.04.2018 15:00:00, med_val = `median` of `val` at the 7 points 05.04.2018 15:00:00, 29.03.2018 15:00:00, 22.03.2018 15:00:00, 15.03.2018 15:00:00, 8/8/2018 15:00:00, 03.03.2018 15:00:00 and 22.02.2018 15:00:00, i.e. the median over the same time of day on the same day of the week across 7 weeks. If there is no data for a point, we consider that 0 transactions were performed.
The best that I could come up with is:
| timechart span=10m median(val) | timewrap 1w series=exact
Are there any good solutions?
Thanks in advance!
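A sketch of an alternative (assuming the same fields): build the 10-minute series with zero-fill, key each row by weekday and time of day, then take a trailing median over the last 7 occurrences of each slot with `streamstats`:

```
| timechart span=10m sum(val) as val
| fillnull value=0 val
| eval slot=strftime(_time, "%w %H:%M")
| streamstats window=7 median(val) as med_val by slot
| table _time, val, med_val
```

Each row's med_val is then the median of that weekday/time slot across the current and previous 6 weeks; adding `current=f` would restrict it to the previous 7 weeks only.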
↧