There seems to be a bug when searching events with JSON data if the field names are nested.
For example, this search works as expected:
sourcetype=cmdb | rename data.ip_v4_address AS ip
This search, however, does not put any value in the ip field:
sourcetype=cmdb | eval ip=data.ip_v4
If, however, I create an alias for data.ip_v4, I can use eval to access the value of the aliased field.
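For what it's worth, single-quoting the field name inside eval also seems to work around it, presumably because eval otherwise parses the dot as its string-concatenation operator:

```spl
sourcetype=cmdb | eval ip='data.ip_v4'
```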
I complained to my Splunk SE about this over a year ago and it still hasn't been fixed as of 7.1.1.
↧
Bug accessing nested JSON field values
↧
Can I pass an indexed Date value to the time picker?
As far as I know, the time picker searches based on the time the data was indexed in Splunk. I need to search based on the Date values in the events in my dataset. Is this possible?
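For example, if each event carried a Date field such as 2018-06-29, I imagine something like this to make searches use that value instead (just a sketch — the sourcetype, field name, and format here are only illustrative examples, not my actual data):

```spl
sourcetype=mydata | eval _time=strptime(Date, "%Y-%m-%d")
```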
↧
Why is the text input not filtering results correctly in the dashboard?
So I have a dashboard with 3 different inputs. I noticed something really weird with my text input. If I enter the number 22 into my text input search, results on my dashboard stats table include 21, **22**, 24, 25, 27, 28, 29, 33, etc.
Why is my text input showing me results that don't match the text input exactly?
![alt text][1]
[1]: /storage/temp/251081-test.png
Below is my query for the stats table:
(index=cms_vm) $arrayfield$ $lun$ $datacenter$
| dedup VM
| eval DatastoreName=replace(DatastoreName,".+_(\d+)$","\1")
| eval StorageArray=replace(StorageArray,"^[^_]*_[^_]*\K.*$","")
| eval VM=upper(VM)
| eval StorageArray=upper(StorageArray)
| join type=outer VM [search index="cms_app_server" | fields VM Application]
| table VM OperatingSystem_Code Datacenter StorageArray DatastoreName Application
| rename OperatingSystem_Code AS "Operating System", StorageArray AS "Storage Array", DatastoreName AS "LUN"
Any help would be appreciated as I can't seem to pinpoint why this is occurring.
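One thing I did wonder is whether the bare $lun$ token is being matched as a free-text term anywhere in the event rather than as an exact field value, and whether I need to scope it to a field instead — this is just a guess:

```spl
(index=cms_vm) $arrayfield$ DatastoreName="*$lun$*" $datacenter$
```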
↧
Query with 3 subsearches returning several items
Hello there!
This is my first post here :)
I've already read a lot of questions and answers and tried a lot of things, but... I'm still not getting anything good :( :(
I need to combine 3 different queries to get my final result.
I would like to run only ONE query instead of doing the 3 of them step by step.
__A. The first query: get all the transactionIds__
Extract all fields called "transactionId" from one source where the word 'ERROR' is seen:
* "] ERROR" source=*exp* | table transactionId | dedup transactionId
for example, this will return 2 lines :
dd2ff560-7bcd-11e8-8ac7-005056ac4954
db846840-7bcd-11e8-8ac7-005056ac4954
__B. Based on the transactionIds found in query A, find the correlationIds__:
* source=*mb05* HTTPHeaderHandler.InboundHeaders ( transactionId from query A ) | rename message_id as correlationId | table correlationId
My query in step-by-step mode looks like:
* source=*mb05* HTTPHeaderHandler.InboundHeaders (dd2ff560-7bcd-11e8-8ac7-005056ac4954 OR db846840-7bcd-11e8-8ac7-005056ac4954) | rename message_id as correlationId | table correlationId
The result is also 2 lines:
zz31ca20-7bcd-11e8-8ac7-005056ac4954
zz863d00-7bcd-11e8-8ac7-005056ac4954
__C. With the correlationIds found in B, get all the lines with exceptions__:
* source=*mb05* ExceptionHandler.HandledException ( correlationID from query B) | fields _raw
In my step-by-step mode:
* source=*mb05* ExceptionHandler.HandledException ( zz31ca20-7bcd-11e8-8ac7-005056ac4954 OR zz863d00-7bcd-11e8-8ac7-005056ac4954 ) | fields _raw
That gives me the logs I'm looking for.
It's a bit annoying to do it step by step,
so I'd like to get something like:
* source=*mb05* ExceptionHandler.HandledException [ search source=*mb05* HTTPHeaderHandler.InboundHeaders [ search * "] ERROR" source=*exp* | table transactionId | dedup transactionId ] | rename message_id as correlationId | table correlationId ] | fields _raw
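I suspect the subsearches may also need to hand back bare values rather than fieldname="value" pairs, so perhaps each one needs a rename ... AS search at the end — I'm not sure this is the right approach (untested):

```spl
source=*mb05* ExceptionHandler.HandledException
    [ search source=*mb05* HTTPHeaderHandler.InboundHeaders
        [ search "] ERROR" source=*exp*
          | dedup transactionId
          | fields transactionId
          | rename transactionId AS search ]
      | rename message_id AS correlationId
      | dedup correlationId
      | fields correlationId
      | rename correlationId AS search ]
| fields _raw
```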
If anybody has a clue to help me, I will be more than happy! :D
Thanks in advance for your help!
↧
Wire format of tcpsplunk, a.k.a. cooked mode
I've scoured the net looking for info on cooked mode, which is used in Splunk-to-Splunk communications (e.g. forwarder to indexer).
There is almost nothing -- no Wireshark decode module, no info on delimiters, lengths, signing, stamping, message boundaries, etc.
This creates a false sense of security by obscurity and provides fertile ground for hackers to reverse engineer and attack proprietary Splunk protocols; e.g. see the repo FooBallZ/SplunkPWNScripts on GitHub.
Why is this protocol shrouded in secrecy? Any references to a tcpsplunk decoder?
Given the above intro, deflecting/sidestepping counter-questions like "what problem are you trying to solve?" is pointless.
↧
How do I make macro arguments get parsed as fields instead of literals?
I am trying to create a macro that will take a field from an existing query. But when I try to call it, the macro treats its argument as a literal value rather than as the value of the search field.
Specifically what I am trying to do is to lookup info about a queried machine in Carbon Black.
The macro looks like this:
name: reqsensorsearch(1)
sensorsearch query="$sensor_search$"
I tried testing my macro with
|makeresults | eval sensor_search="hostname:" | `reqsensorsearch(sensor_search)`
but if I press Ctrl+Shift+E (to expand and display), I see this:
| makeresults
| eval sensor_search="hostname:"
| sensorsearch query="sensor_search"
What do I need to do to make the macro accept the value of the field sensor_search, so that it runs as:
|sensorsearch query="hostname:"
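One thing I've been wondering is whether map could do the substitution instead, since macro arguments are expanded at parse time, before any field values exist. Something like this (untested sketch):

```spl
| makeresults
| eval sensor_search="hostname:"
| map search="| sensorsearch query=\"$sensor_search$\""
```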
↧
From Splunk connect to cassandra through SplunkDBConnect
Hi,
From Splunk, I am trying to connect to Cassandra through Splunk DB Connect. In the db_connection_types.conf file I made the changes below:
[cassandra]
displayName = DSE Cassandra
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcUrlFormat = jdbc:cassandra://host/keyspace
jdbcDriverClass = com.dbschema.CassandraJdbcDriver
port = 9042
database = cassandra
I deployed the necessary JARs under the drivers directory. After restarting, I can see the driver listed in the drivers section of the Splunk UI, but it is not installed.
I checked the logs and am seeing this error message:
2018-06-29 13:56:15.075 -0500 3536@CB2012D060 [main] WARN com.splunk.dbx.service.driver.DriverServiceImpl - action=load_drivers Can not load any driver from files [/C:/Program%20Files/Splunk/etc/apps/splunk_app_db_connect/drivers/cassandra-jdbc-2.1.1.jar]
Can someone please help?
↧
splunkd service 7.1.1 on Windows 10 RS4 x64 keeps stopping
Hello,
I'm new to Splunk, so please bear with me. I have just installed the forwarder service on a Windows 10 RS4 x64 image. After a couple of minutes, the service just stops. I then get the errors described here: answers.splunk.com/answers/609072/i-am-not-able-to-run-splunkkd-service-on-windows10.html and here: answers.splunk.com/answers/301878/has-anyone-come-across-the-error-the-splunkforward.html. I found the log, and here are the errors I get:
ERROR TcpOutputProc - LightWeightForwarder/UniversalForwarder not configured. Please configure outputs.conf.
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"" Method invocation failed because [System.Management.Automation.PowerShellAsyncResult] does not contain a method named
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"" 'Close'.
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"" At C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.ps1:280 char:1
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"" + $psDisposer.runspace.Close()
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"" + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"" + CategoryInfo : InvalidOperation: (:) [], ParentContainsErrorRecordException
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe"" + FullyQualifiedErrorId : MethodNotFound
06-29-2018 14:13:37.135 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\bin\splunk-powershell.exe""
06-29-2018 14:13:37.775 -0700 ERROR ExecProcessor - message from ""C:\Program Files\SplunkUniversalForwarder\etc\apps\SA-ModularInput-PowerShell\bin\healthlog.bat"" ^C
I didn't see a python folder as mentioned in the first link above, and since it didn't say what was deleted, I'm not going to attempt anything like that. I have tried reinstalling the service, but I get the same issue. AFAIK we have it installed on other machines; I'm not sure about Win 10 RS4, but I'd like to think so.
The plan is to have this installed on a template that would be cloned by users, with everything reporting to the Splunk server. However, I don't know if that's possible, or whether I need to install it again, since the Splunk forwarder was already installed on the template when I cloned it; I don't know if that caused a problem. I'd also like to know whether the Splunk forwarder sends data to the Splunk server even if nobody is logged onto the machine, or if someone creates a machine from the template but doesn't log in to it for some time. Any and all help is appreciated. Thanks in advance.
↧
transaction alternative
I have data in the following format:
1:26:[06/28/2018][08:00:00.149][6959][3868982128][s537565/r17][servername1][filename.cpp:27][ActionMessage::ProcessMessage][][][][][][][][][][][][][][][][** Received request.][servername2]
2:26:[06/28/2018][08:00:00.159][6959][3868982128][s537565/r17][servername1][filename.cpp:57][ActionMessage::ProcessMessage][global1][global2][][][][Millz][][][][][][][][][][** Status: Authorized. ][]
3:26:[06/28/2018][08:00:00.149][6959][4005350256][s537565/r17][servername1][filename.cpp:27][ActionMessage::ProcessMessage][][][][][][][][][][][][][][][][** Received request.][servername2]
4:26:[06/28/2018][08:00:00.159][6959][4005350256][s537565/r17][servername1][filename.cpp:57][ActionMessage::ProcessMessage][global1][global2][][][][Millz][][][][][][][][][][** Status: Authenticated. ][]
| transaction field5 host maxevents=5 startswith="\*\* Received*" endswith="\*\* Status*" | sort -duration| table field25, duration
Need the table as
** Status: Authorized. .010
** Status: Authenticated. .010
However it results as
** Received request.
** Status: Authorized. .010
** Received request.
** Status: Authenticated. .010
What's the best way to normalise this, and also get the table to contain only the 'endswith' string?
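I was also wondering whether something stats-based could replace transaction entirely — maybe along these lines (untested sketch, using the same field5/field25/host fields as above):

```spl
| stats range(_time) AS duration, latest(field25) AS field25 by field5, host
| sort -duration
| table field25, duration
```

Since the 'Status' line is always the later of the pair, latest(field25) should keep only the 'endswith' string, and range(_time) gives the duration.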
↧
Splunk Add on For AWS - Generic S3 input
I have set up the Splunk Add-on for AWS.
For a generic S3 bucket, we tried adding files of different formats to the bucket.
A text file is read properly by Splunk, but for a CSV file only the header portion is read and the rest is ignored.
Nothing can be seen in the _internal logs.
Can someone suggest the reason for this and the solution?
↧
How to create a bar chart with different colors, sorted by count
I'm trying to create a bar chart. But if I just do it by:
index="cn-*"
|top limit=5 SrcIP
the bar chart shows in just single color.
if I do it by:
index="cn-*"
|top limit=5 SrcIP
|eval temp=""
|xyseries temp SrcIP count
The bar chart shows in different colors, but the bars no longer seem sorted by count.
So, is there any way to create a bar chart with different colors that is also sorted by count? I also need drilldown on the bar chart.
↧
Splunk Form
Hi Splunk Community,
I have been working with Splunk for quite a while and recently wanted to create my own Splunk form using XML. The form I am currently attempting to create lets a user input multiple source IP addresses (e.g. 10.1.1.1.1, 10.2.2.2.2, ...), and Splunk would display all of each user's information, including their full name, phone number, and email address, based on the source IP addresses entered into the form. For example, if I type in 10.1.1.1.1,10.2.2.2.2, then Splunk would display the full name, phone number, and email address for those two source IP addresses. This is what I have so far:
Splunk Phishing Email Form:
A simple XML form that displays the user's information once the Source IP address is inputted.
index="wineventlogs" user!= "*$"
[ eval src_ip = "Source_IP" | makemv src_ip delim="," | mvexpand src_ip | fields src_ip]
| dedup user | table user, user_nick, user_phone, user_email
I cannot test this in Splunk because for some reason I am not able to access it at home. I would very much appreciate it if anybody could tell me whether the XML code is right. If not, could you please tell me what is wrong with it and how I could fix it? Thank you!
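In case it clarifies what I'm attempting, here is roughly how I imagined the token being split — assuming the text input's token is named src_ip (just a sketch, untested):

```spl
index="wineventlogs" user!="*$"
    [| makeresults
     | eval src_ip="$src_ip$"
     | makemv delim="," src_ip
     | mvexpand src_ip
     | fields src_ip
     | format ]
| dedup user
| table user, user_nick, user_phone, user_email
```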
↧
When I use Splunk commands in the command prompt, another command prompt window pops up and executes the commands without showing proper results
E.g.: when I try the Splunk command below in cmd, another similar command prompt pops up, executes the command, and closes once the query has executed, without showing the result.
C:\Program Files\Splunk\bin>splunk test
↧
Unable to set direct access to Splunk from the C: directory
C:\>"C:\Program Files\Splunk\bin\splunk.exe" envvars > setSplunkEnv.bat & setSplunkEnv
Access is denied.
'setSplunkEnv' is not recognized as an internal or external command,
operable program or batch file.
↧
How can I split similar fields into multiple related events?
I have some events like this: a WiFi AP and the DEVICEs connected to it. A one-to-many AP-to-DEVICE relationship exists.
AP,DEVICE
---------------
A1,D1
A3,D2
A3,D3
A3,D4
A4,D5
A5,D6
A5,D7
I need to reformat this data to be like this:
tuple,D
-----------
1,A1
1,D1
2,A3
2,D2
3,A3
3,D3
4,A3
4,D4
5,A4
5,D5
6,A5
6,D6
7,A5
7,D7
Is there a cleaner way of doing this than below?
search cmd
| streamstats count as tuple
| eval point="1,2"
| makemv delim="," point
| mvexpand point
| eval D=if(point=1,AP,DEVICE)
| table tuple D
FYI: this is so I can use the Map+ vis to draw lines between these connected devices.
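One slightly cleaner variant I've been considering uses mvappend instead of the point trick (untested):

```spl
search cmd
| streamstats count AS tuple
| eval D=mvappend(AP, DEVICE)
| mvexpand D
| table tuple D
```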
↧
Showing a true value for the if function
Hello all, I have the query below, which is based on a ping request running on the back end.
The data looks like this:
Reply from 192.168.1.1: bytes=32 time=48ms TTL=64
sourcetype=pingr Server=192.168.1.104
| stats avg(ms) as averages by Server
| fields - Server
| appendpipe
[ stats count
| eval averages=0
| where count==0
| fields - count ]
So the search above will give me a value of 0 if the server is actually off, instead of returning no results. I was wondering if it's possible to show text like "Server is off" when the value 0 is returned, and show the actual value when the server is on. I have tried the if command with eval and it kind of works, but any value other than 0 should show the correct average calculated earlier.
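The sort of final step I have in mind would look something like this, appended after the appendpipe (just a sketch):

```spl
| eval averages=if(averages == 0, "Server is off", tostring(averages))
```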
Is this possible? Any help is highly appreciated.
thanks
↧
Geo map - Tile customisation using leaflet app
![alt text][1]
[1]: /storage/temp/251091-img2.png
How do I display the content of the value in the tiles using the Leaflet map? It's taking only the placemark name when displaying.
Regards,
Nadhiya
↧
Calculate the percentage of the sum of volume between the current and the previous month
base search...
| eval Month = case(Month = "2018-02","Feb",Month = "2018-03","Mar", Month = "2018-04","Apr")
| eval month_curr=relative_time(now(), "-3mon")
| eval month_curr=strftime(month_curr, "%b")
| eval month_prev=relative_time(now(), "-4mon")
| eval month_prev=strftime(month_prev, "%b")
| chart sum(Volume) AS Vol over Shop by Month
I need to calculate the percentage between Mar and Feb.
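What I'm imagining is something appended after the chart, roughly like this (assuming the chart produces Feb and Mar columns as above — untested):

```spl
| eval pct_change = round((Mar - Feb) / Feb * 100, 2)
```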
↧
What is the identification process for SplunkForwarder?
I've been asked to write a document about the process of SplunkForwarder connecting with a deployer or indexer and forwarding data.
I've been browsing Splunk docs for an hour or two and all I see is configuration / troubleshooting tutorials.
What is the identification process step by step?
I'm guessing it's something like: heartbeat > enlistment > application push > data push, but I'm not sure myself and I hope you guys have a better, more detailed explanation for me.
↧
Unable to upload Tutorial data
I'm unable to upload any data from the tutorial due to the warning message below. I haven't unzipped the file, but the problem still persists:
"Preview is not supported for this archive file, but it can still be indexed"
↧