What is the difference between a Distributed and a Clustered environment?
Hi. Could someone explain the difference between a Distributed and a Clustered environment in relation to Splunk? I keep thinking they are the same.
Thanks in advance!
↧
Splunk JavaScript SDK: Is it possible to include column headings as a row in the chart array?
Hi all,
I'm currently working with the Splunk SDK for JavaScript and I am having some difficulties formatting the data from queries. I have a search that looks like this:
* | chart count by _time my_field limit=0 usenull=f useother=f
Which in Splunk looks something like this:
_time A B C D
2017-01-26 10:18:42 2 4 8 0
2017-01-26 10:18:43 0 7 6 3
2017-01-26 10:18:44 4 9 5 2
2017-01-26 10:18:46 7 0 5 0
But when I get the results from running the query through the JavaScript SDK, the array I get is basically this:
2017-01-26 10:18:42 2 4 8 0
2017-01-26 10:18:43 0 7 6 3
2017-01-26 10:18:44 4 9 5 2
2017-01-26 10:18:46 7 0 5 0
(without the headers)
Is there an easy way to include the column headers in the array I get in JavaScript?
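For what it's worth, the results call in the JavaScript SDK wraps the REST results endpoint, and that endpoint already returns the column headers as a separate fields list alongside the rows, so a header row can be built by prepending that list. If the SDK's parsed results object exposes those same fields and rows properties, the same prepend works directly in JavaScript. A minimal sketch of the underlying payload shape, written against the REST API in Python (the host, credentials, and job SID are placeholders, not from the question):

```python
# Sketch of the raw payload the SDK's results call is built on: with
# output_mode=json_rows the endpoint returns "fields" (the headers) and
# "rows" (the data) separately, so the headers can be prepended as a row.
import requests

resp = requests.get(
    "https://splunk.example.com:8089/services/search/jobs/YOUR_SID/results",
    params={"output_mode": "json_rows", "count": 0},
    auth=("admin", "changeme"),   # placeholder credentials
    verify=False,                 # only for a quick local test
)
data = resp.json()

chart = [data["fields"]] + data["rows"]  # header row followed by data rows
for row in chart:
    print(row)
```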
↧
How to add a static table in a dashboard?
I added the following table to my dashboard, mainly to show the legends from a statistics table. The table showed the legends and their contents properly, but the statistics table no longer worked. I guess this is not the right way to add a static table to a Splunk dashboard. Any suggestions? Thanks.
↧
How do I write a regular expression to return a matching pattern in my logs?
Any string starting with COLDAPP and ending with a double colon is a Tx id in my logs. It can appear at the beginning, middle, or end, since the logs are not always fully structured. How do I write a regex to return a matching pattern that starts with COLDAPP and ends with a double colon, excluding the double colon from the returned pattern?
Example of log:
BaseProcessor pool-62-thread-84 - COLDAPP_WS_780144376_148455147959900002_pbv14slm2_12910::3tWofZ2Bb
I am trying
COLDAPP(?P.*?)::
it returns
_WS_780144376_148455147959900002_pbv14slm2_12910
How do I return
COLDAPP_WS_780144376_148455147959900002_pbv14slm2_12910
Thanks in advance for your help.
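One likely fix, sketched below with plain Python re and an assumed field name tx_id (in Splunk the same pattern would go into a rex command): keep the COLDAPP literal inside the capture group instead of in front of it, so the whole id is captured while the double colon stays outside the group.

```python
import re

log = ("BaseProcessor pool-62-thread-84 - "
       "COLDAPP_WS_780144376_148455147959900002_pbv14slm2_12910::3tWofZ2Bb")

# COLDAPP sits inside the named group, so it is part of the captured value;
# the :: is still matched but stays outside the group. The group name tx_id
# is only an example.
match = re.search(r"(?P<tx_id>COLDAPP.*?)::", log)
if match:
    print(match.group("tx_id"))
    # COLDAPP_WS_780144376_148455147959900002_pbv14slm2_12910
```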
↧
What does "P" stand for in regular expression query?
I am trying to understand more about a regular expression query used in Splunk. what does character P stands for in the regex example?
(?P)
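For context, the P appears in the named-capture-group construct (?P<name>...): it marks the Python-style named-group syntax, which PCRE (the regex engine behind Splunk's rex) also supports. A small illustration in Python; the pattern and group names are just examples:

```python
import re

# (?P<name>...) defines a named capture group; the P marks the Python-style
# named-group extension that PCRE (and therefore Splunk's rex) adopted.
m = re.search(r"(?P<host>[\w.-]+):(?P<port>\d+)", "server=web01.example.com:8089")
print(m.group("host"))  # web01.example.com
print(m.group("port"))  # 8089
```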
↧
Admins beware: 6.5.1 causes clients to reach out directly to Splunk
FYI, we discovered that in 6.5.1, Splunk now places the burden of checking whether it is up to date on the client rather than the server. So, since most client machines have access to the internet, lots of interesting information gets passed back to Splunk, including:
Any associated Splunk Answers user/cookie information
All Splunk roles on the server
GUIDs of Splunk licenses on the server
We are observing the request going to https://quickdraw.splunk.com even though we have 'updateCheckerBaseURL = 0' in web.conf. The request to quickdraw.splunk.com only occurs after a successful login.
If anyone knows how to turn off this behavior, it would be greatly appreciated.
↧
How to export data from Splunk into GRC MetricStream?
Hi,
Please share the process for exporting data from Splunk automatically. I am looking for a solution that allows data to be exported to a node or shared-drive folder where it can be picked up and ingested into the GRC tool, or a solution that involves an API or data stream from Splunk into MetricStream.
Thanks
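One pattern that might fit the shared-drive option, sketched below with the Splunk SDK for Python (splunklib). The host, credentials, search, and output path are assumptions; a scheduler (cron or similar) could run the script and MetricStream could pick the CSV up from the drop folder.

```python
# Sketch: run a search via the Splunk Python SDK and drop the results as a
# CSV on a shared drive for MetricStream to ingest. All connection details,
# the query, and the path below are placeholders.
import csv

import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="splunk.example.com", port=8089,
    username="svc_export", password="***")

stream = service.jobs.oneshot(
    "search index=grc sourcetype=control_results earliest=-24h",
    count=0)

rows = [dict(r) for r in results.ResultsReader(stream) if isinstance(r, dict)]

if rows:
    with open("/mnt/grc_share/splunk_export.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
```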
↧
Is what I am doing a correct use of custom search commands?
I am not sure whether what I am trying to do is a good idea, and I am new to Splunk, so please go easy on me.
We have a system (I will refer to it as "System A") that has a very specific purpose. It has a database of information and generates reports on subsets of that information. System A works very well for its intended purpose. Its only real deficiency is that it is difficult to find the right subset of the data on which to generate the report.
This is where Splunk comes in. It turns out that Splunk has some really great search capabilities that we would like to leverage. In fact, we are already doing so. All we need now is some way to get those search results back into System A.
What I've come up with is a new custom search command. After generating the search via Splunk Web, I simply add the following:
| script export_to_System_A
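That kind of command is certainly possible; here is a minimal sketch of what it might look like as a streaming custom search command built on the Splunk SDK for Python (splunklib.searchcommands). The class name, endpoint URL, and payload handling are assumptions, and the script would still need to be registered in commands.conf under whatever command name you choose.

```python
# export_to_system_a.py -- sketch of a streaming custom search command that
# posts each search result to System A and passes the events through.
# The endpoint URL and payload format are placeholders.
import json
import sys
import urllib.request

from splunklib.searchcommands import Configuration, StreamingCommand, dispatch


@Configuration()
class ExportToSystemACommand(StreamingCommand):

    def stream(self, records):
        for record in records:
            payload = json.dumps(dict(record)).encode("utf-8")
            request = urllib.request.Request(
                "https://system-a.example.com/api/import",  # hypothetical
                data=payload,
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(request)  # no retry/error handling here
            yield record


dispatch(ExportToSystemACommand, sys.argv, sys.stdin, sys.stdout, __name__)
```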
↧
How to move dashboards from a test environment to production?
I have a couple of dashboards under an app in our test environment, and I want to move these dashboards to production. How can I do this?
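One low-tech approach, sketched below with assumed paths and app names: Simple XML dashboards are stored as .xml files under the app's local/data/ui/views directory, so they can be copied out of the test instance and dropped into (or packaged with) the corresponding app on production.

```python
# Sketch: collect Simple XML dashboards from a test app so they can be
# deployed to the same app on production. Paths and app names are assumed.
import shutil
from pathlib import Path

src = Path("/opt/splunk/etc/apps/my_test_app/local/data/ui/views")
dst = Path("./my_test_app_package/default/data/ui/views")
dst.mkdir(parents=True, exist_ok=True)

for view in src.glob("*.xml"):
    shutil.copy2(view, dst / view.name)
    print(f"copied {view.name}")
```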
↧
How to change the default color of dashboard footers?
Hello All,
Could someone tell me how to change the default footer that shows on my dashboard and form pages?
I have custom CSS set up, and all of my dashboards that use Simple XML (rather than Advanced XML) have a grayed-out footer.
I would like to set the footer background color to match the rest of my page bodies.
v.6.5
Thanks all!
↧
How to find the Splunk application type in the navigation menu?
How can I find out whether I am using Splunk Enterprise, Splunk Cloud, etc., without asking the Splunk admin?
I use the web URL on port 8000.
↧
Can Splunk integrate with Automation Anywhere?
We are trying to integrate Splunk with Automation Anywhere, which is an RPA tool. Any ideas on that? Is there an app for it?
↧
Can we trigger a workflow based on an alert?
We are trying to build a workflow that makes a POST call to a ticketing system. Can we trigger the workflow based on an alert?
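Alerts can kick off a custom alert action (or a webhook), so one option is a small script that reads the alert payload and makes the POST call. A rough sketch, assuming the custom alert action interface (a JSON payload on stdin) and a hypothetical ticketing endpoint:

```python
# Sketch of a custom alert action script: read the payload Splunk passes on
# stdin and open a ticket with a POST call. The ticketing URL and field
# mapping are placeholders.
import json
import sys
import urllib.request

payload = json.load(sys.stdin)        # alert metadata plus result fields
result = payload.get("result", {})    # the triggering search result

ticket = {
    "summary": "Splunk alert: " + payload.get("search_name", "unknown"),
    "details": json.dumps(result),
}

request = urllib.request.Request(
    "https://ticketing.example.com/api/tickets",
    data=json.dumps(ticket).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(request)
```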
↧
Has anyone integrated Splunk with Spotfire?
Has anyone integrated Splunk with Spotfire? The idea is to make Splunk data available to Spotfire (a BI tool) for visualization. I am aware of the ODBC integration; please post your comments if any of you have more information on this topic.
↧
Why is the REST API removing a leading pipe before an "inputcsv" command?
It appears that my use of the REST API is somehow causing a leading pipe to be stripped before an inputcsv command. I have this Python search string:
"| inputcsv scale_med_validation_data | apply fastflux_model | where 'predicted(is_attack)' = 1 | eval t = now()+3600*1 | eval report_hour=strftime(t, "%H") | eval report_date=strftime(t, "%m/%d/%Y") | tail 50 | collect index=fastflux_summary"
This works as desired when entered manually through the web interface.
However, when submitted through the REST API, the jobs screen shows the search query missing the leading pipe:
"inputcsv scale_med_validation_data | apply fastflux_model | where 'predicted(is_attack)' = 1 | eval t = now()+3600*1 | eval report_hour=strftime(t, "%H") | eval report_date=strftime(t, "%m/%d/%Y") | tail 50 | collect index=fastflux_summary"
Naturally, this causes the inputcsv to fail, and so none of the REST API jobs succeed. Why might the leading pipe not be making it through here?
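Not a diagnosis, but one way to rule out the client code, sketched below with the Splunk SDK for Python and placeholder connection details (the query is shortened for brevity): submit the string verbatim through jobs.create and then check the resulting job in the jobs screen. If the pipe still disappears, the place to look is whatever builds or encodes the search parameter before the POST.

```python
# Sketch: submit the piped search exactly as written through the Python SDK.
# Connection details are placeholders; the query is shortened for brevity.
import time

import splunklib.client as client

service = client.connect(
    host="splunk.example.com", port=8089,
    username="admin", password="***")

query = ("| inputcsv scale_med_validation_data | apply fastflux_model "
         "| where 'predicted(is_attack)' = 1 | tail 50 "
         "| collect index=fastflux_summary")

job = service.jobs.create(query)   # the search string is passed as-is
while not job.is_done():
    time.sleep(1)

# Look this sid up in the jobs screen; the leading pipe should still be there.
print(job.sid)
```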
↧
Can we configure the forwarders to use SFTP for transferring the files?
Can we configure the forwarders to use SFTP for transferring files? If not, is there any way to encrypt the data sent by the Universal Forwarder (UF)? Does the UF support SSL?
↧
How to chart and compare memory usage of my JSON data?
I've got an interesting JSON:
{"timeStamp":"2017-01-26 23:59","name":"myVM1","counter":"mem.usage.average","description":"Memory usage as percentage of total configured or available memory","unit":"%","values":{"2017-01-26 10:00":"8.99","2017-01-26 09:55":"19.39","2017-01-26 09:50":"7.99"}}
{"timeStamp":"2017-01-26 23:59","name":"myVM2","counter":"mem.usage.average","description":"Memory usage as percentage of total configured or available memory","unit":"%","values":{"2017-01-26 10:00":"33.11","2017-01-26 09:55":"42.12","2017-01-26 09:50":"23.32"}}
The key is the timestamps. Can someone please provide the syntax to chart the two so I can compare memory usage? Thanks!
↧