I would like to display the weekday in the column heading.
|Search....
| eval weekday=strftime(now(),"%A")
Output:

    S.no    Daily    weekday
    1       101      Thursday
    2       210      Thursday

Desired output:

    S.no    Daily (Thursday)
    1       101
    2       210
Tried xyseries and transpose but I couldn't find a way to flip only one column instead of the whole table.
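One sketch that avoids transposing the whole table: `eval` supports dynamic field names via `{field}`, so you can build the heading text in a field and copy `Daily` into it (this assumes `Daily` is the column to relabel):

```
<your search>
| eval heading="Daily (".strftime(now(),"%A").")"
| eval {heading}=Daily
| fields - Daily, heading
```

The `{heading}` syntax names the new field after the *value* of `heading`, so the column comes out as `Daily (Thursday)` and the rest of the table is untouched.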
↧
↧
Can you help me with deduping events in a data model?
Hi everyone
How do I leave just unique events by specified field in an accelerated data model?
My base search looks like:
index=main source=transactions tx_type=purchase | `registration_time` | `type_user`
And I'm trying to add a child dataset with just one constraint: dedup transaction_id.
But it doesn't work.
Duplicated transaction_id values cannot be deleted from the original source, because duplication is a feature of some transactions. But in the data model we need just one occurrence of each, e.g. to sum revenue.
With this query we get a larger result than we actually have, because transactions with the same transaction_id are counted several times. How can we get the correct result?
Thanks in advance!
| tstats sum(transactions.price) AS Revenue from datamodel=transact.transactions where (nodename = transactions) groupby _time span=1month
| rename transactions.* as *
| timechart span=1month first(Revenue) as revenue
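One possible workaround (a sketch, not tested against your data model): since constraints can't express `dedup`, group the tstats call by `transactions.transaction_id` first so each transaction contributes one price, then sum afterwards:

```
| tstats latest(transactions.price) AS price from datamodel=transact.transactions
    where nodename=transactions groupby transactions.transaction_id _time span=1mon
| stats sum(price) AS Revenue by _time
| timechart span=1mon first(Revenue) AS revenue
```

`latest(...)` here is an assumption that duplicate rows for one transaction_id carry the same price; swap in another aggregation if that doesn't hold.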
↧
↧
How to predict from time series values for Multiple fields? (Machine Learning Toolkit)
index=deg host=host sourcetype=source source=logpath Name=* Pages=* Schema=*
| eval SizeInMB = ((Pages * 4 * 1024)/1048576)
| eval Maximum = max(SizeInMB)
| sort -Maximum
| where Maximum > 25000
| timechart span=1hr max(Maximum) by Name
| rename Maximum as "MBSize"
| predict "Name 1"
This is my search query that produces a time series of data points (every hour) for Megabyte Size of around 10 tables names.
| _time | Name 1 | Name 2 | Name 3 | Name 4 | ... |
|-------|--------|--------|--------|--------|-----|
| **01:00** | MBsize | MBsize | MBsize | MBsize | ... |
| **02:00** | MBsize | MBsize | MBsize | MBsize | ... |
| **03:00** | MBsize | MBsize | MBsize | MBsize | ... |
| **04:00** | MBsize | MBsize | MBsize | MBsize | ... |
My question is: how do I predict the MBsize for, say, 2 months into the future for **each table name**, based on the previously gathered data points for each one?
I understand how to forecast for one table name by specifying `| predict "Name 1"`, but I don't know how to pass all the table names in as a list and predict multiple tables based on their past data points.
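A sketch using the built-in `predict` command, which accepts several fields in one invocation (the names `"Name 1"` etc. are placeholders for your actual table names; `future_timespan` is counted in units of the timechart span, so roughly 1440 hourly periods ≈ 2 months):

```
| timechart span=1h max(SizeInMB) by Name
| predict "Name 1" AS pred_1 "Name 2" AS pred_2 "Name 3" AS pred_3 future_timespan=1440
```

Each field gets its own prediction series plus upper/lower confidence bounds; extend the list with the remaining table names.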
↧
How do you use Rest API Modular Input to create an incident in ServiceNow (SNOW) without the Splunk add-on?
I have an alert configured in Splunk. Whenever that alert gets triggered, I need to call an API to create an incident in SNOW.
I am aware of the SNOW add-on, but we are not required to use it.
So please help with another way to create an incident. We have an endpoint URL to create it.
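A minimal sketch of calling the ServiceNow Table API directly, e.g. from a script invoked by the triggered alert. The instance name, credentials, and incident fields are placeholders; your endpoint URL and required fields may differ:

```shell
# Build the incident payload (field names are assumptions -- check your incident table schema)
PAYLOAD='{"short_description":"Splunk alert triggered","urgency":"2","category":"software"}'
echo "$PAYLOAD"

# POST it to the ServiceNow Table API (commented out here; fill in your instance and credentials)
# curl -u "$SNOW_USER:$SNOW_PASS" -X POST \
#      "https://YOUR_INSTANCE.service-now.com/api/now/table/incident" \
#      -H "Content-Type: application/json" -H "Accept: application/json" \
#      -d "$PAYLOAD"
```

Splunk can run such a script as an alert action, or any HTTP client can POST the same payload.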
↧
Why am I getting the following 'call not properly authenticated' error when using Splunk SDK for JavaScript?
Whenever I try to do a search query using Splunk SDK for JavaScript (using node), I get the following error message:
{ messages: [ { type: 'WARN', text: 'call not properly authenticated' } ] }
I checked multiple forums but none have helped.
I am able to login (always), but as soon as I call search() function, I get this error.
Here is my function call:
    splunkService.login (err, success) ->
      if err
        console.log err.data
      else
        console.log("Login is successful") # this always succeeds
        query = 'search index=a sourcetype=b application=*abc* | top 1 error'
        params =
          earliest_time: '-15m'
          latest_time: 'now'
          exec_mode: 'normal'
        searchSplunk(query, params)

    searchSplunk = (query, params) ->
      splunkService.search query, params, (err, job) ->
        if err
          console.log 'Error Encountered:'
          console.log err.data
        else
          console.log 'Job ID: ' + job.sid
          #console.log job
          job.track {period: 200},
            done: (job) ->
              console.log 'Done, here!'
              job.results {}, (err, results, job) ->
                if err
                  console.log err
                else
                  results.fields.forEach (field) ->
                    console.log field
                  results.rows.forEach (row) ->
                    console.log row
            failed: (job) ->
              console.log 'Job failed'
            error: (err) ->
              console.log err.data
Please note: oneshotSearch() always works; search() works, but only about 2 times out of 10.
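One thing worth checking (an assumption about the cause, not a confirmed fix): that WARN usually means a later REST call went out without a valid session key. The JavaScript SDK's `Service` accepts an `autologin` option that re-authenticates transparently when a call comes back unauthenticated; a sketch of the connection options you would pass:

```javascript
// Hypothetical connection options for new splunkjs.Service(...);
// autologin re-runs login automatically if a call fails authentication.
const serviceOptions = {
  scheme: 'https',
  host: 'localhost',   // placeholder
  port: 8089,
  username: 'admin',   // placeholder
  password: 'changeme',// placeholder
  autologin: true
};
console.log(JSON.stringify(serviceOptions));
```

With `autologin` set, the intermittent failures of the polled `normal` search jobs should go away if an expiring session key is indeed the problem.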
↧
↧
How do you view last event time in an inputlookup?
I have an inputlookup which searches a CSV that looks like:
Field_A Field_B
A test1
B test2
I want to run a search where I get the output, but with a column added showing the last event observed from Field_A in Splunk, with an output like:
Field_A Field_B Last_event
A test1 9/22/18 7:28:16
B test2 9/25/18 7:28:16
I have written a query but it does not seem to work:
inputlookup excel.csv | table Field_A Field_B | appendcols [search index=my_index src=Field_A| stats head 1 | stats first(_time)| eval Last_Seen_Event=strftime(Last_Seen_Event,"%+")]
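A possible rewrite (a sketch; it assumes the lookup's `Field_A` values match the `src` field in the index):

```
| inputlookup excel.csv
| join type=left Field_A [
    search index=my_index
    | stats latest(_time) AS Last_event by src
    | rename src AS Field_A ]
| eval Last_event=strftime(Last_event, "%m/%d/%y %H:%M:%S")
```

`stats latest(_time) by src` collapses the index search to one timestamp per source, and the left join keeps lookup rows even when no event was ever seen.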
↧
How do I monitor input on Windows machine with a wild card character?
I want to monitor a log file from the below location on a Windows server.
D:\Program Files\Apache Software Foundation\Tomcat 8.5\webapps\config\
However, based on the version of tomcat, the folder name changes. It could be Tomcat 6.0 or Tomcat 7.5 — etc. — on some servers. So, I tried with a different input stanza on the Universal Forwarder in the inputs.conf file.
[monitor://d:\Program Files\Apache Software Foundation\Tomcat*\webapps\config\audit.log]
[monitor://d:\Program Files\Apache Software Foundation\\*\webapps\config\audit.log]
[monitor://d:\Program Files\Apache Software Foundation\\...\webapps\config\audit.log]
None of the above options work, and in splunkd.log I can see the below entry:
09-28-2018 13:44:24.422 +0000 INFO TailingProcessor - Adding watch on path: d:\Program Files\Apache Software Foundation.
which means it is not recognizing the folder structure mentioned in the input stanza.
Please suggest a solution.
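One approach worth trying (a sketch; the `whitelist` regex is an assumption based on your path): monitor the stable parent directory recursively and whitelist the file you want, instead of putting wildcards in the stanza path:

```
[monitor://D:\Program Files\Apache Software Foundation]
recursive = true
whitelist = webapps\\config\\audit\.log$
```

`whitelist` is matched as a regex against the full path of each file found under the monitored directory, so the doubled backslashes each match one literal path separator regardless of the Tomcat folder name.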
↧
In Splunk Enterprise, can you help me set up the AMQP Modular Input Set up?
Hello,
we are trying to pull in the JSON message from a RabbitMQ server. However, we seem to be getting all the AMQP metadata from the queue in front of it. Is there a way to pull in just the msg_body?
Here is an example of the events in Splunk:
Fri Sep 28 12:55:36 BST 2018 name=amqp_msg_received event_id=null msg_queue=ES_queue msg_exchange=BMISG msg_body={"TIMESTAMP":"2018-09-27-18:14:26.727","MESSAGETYPE":"INFO","SYSTEM":"BMI","MODULE":"Prep Step","SUBMODULE":"unionData","MESSAGE":"Testing Data.","RUNID":"TEST_201806_064"}
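If the modular input itself can't be configured to drop the metadata, a search-time sketch (assuming `msg_body` always holds a single JSON object, as in the example above) is to pull the JSON out with `rex` and parse it with `spath`:

```
<your search>
| rex field=_raw "msg_body=(?<msg_body>\{.*\})"
| spath input=msg_body
```

This leaves the raw event intact on disk but gives you the JSON keys (TIMESTAMP, MESSAGETYPE, SYSTEM, ...) as search-time fields.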
↧
How do I add a new field in the output?
host=pcde* sourcetype=qwedc
| eval host_type=case(host LIKE "%raf%", "RAF", host LIKE "%tap%", "TAP", host LIKE "%dft%", "DFT")
| streamstats count as Req by host_type
| eval RequestsPerMin=Req/24/60
| eval RequestsPerSec=RequestsPerMin/60
| timechart span=5m avg(RequestsPerSec) as AvgRequestsPerSec , max(RequestsPerSec) as MaxRequestsPerSec , p95(RequestsPerSec) as P95RequestsPerSec
|eval AvgRequestsPerSec=round(AvgRequestsPerSec,2), MaxRequestsPerSec=round(MaxRequestsPerSec,2), P95RequestsPerSec=round(P95RequestsPerSec,2)
|sort -MaxRequestsPerSec, -P95RequestsPerSec
============================================================
I want to add host_type to the output so that I can check each server type's request count, like:

    _time                  Host_type    AvgRequestsPerSec    MaxRequestsPerSec    P95RequestsPerSec
    2018-09-28 19:00:00    raf          0.19                 0.31                 0.30
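Since `timechart` with a `by` clause would spread each aggregation into one column per host_type, a `bin` + `stats` sketch keeps host_type as a row field and matches the desired table:

```
| bin _time span=5m
| stats avg(RequestsPerSec) AS AvgRequestsPerSec
        max(RequestsPerSec) AS MaxRequestsPerSec
        p95(RequestsPerSec) AS P95RequestsPerSec
        by _time host_type
```

The rest of your pipeline (the rounding `eval` and the `sort`) can follow unchanged.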
========================================================================
Please look at the screenshot for more info.
![alt text][1]
[1]: /storage/temp/255066-screen-shot-2018-09-29-at-125822-am.png
↧
↧
What is the best way to number each event in descending time?
I need to assign a number to each event, sorted in descending _time order.
Ex
    Event     _time      Count
    Event1    11:54:51   1
    Event2    11:53:57   2
    Event3    11:53:52   3
I can use `| streamstats count`.
But does this guarantee events in descending order for historical searches on clustered indexers? Sorting on _time is affecting query performance, so is there any way to assign an incrementing count based on descending _time order?
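As a sketch: historical searches return events newest-first by default, so `streamstats` alone usually numbers them as desired, with no sort needed:

```
<your search>
| streamstats count AS Count
```

If strict ordering must be guaranteed regardless of how indexers interleave results, `| sort 0 -_time | streamstats count AS Count` forces it, at the cost of the sort (`sort 0` lifts the default 10,000-row limit).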
↧
How do you customize the drilldown of a search?
I've got a search viewed as a table and one of the values of the table cell is a URL. I want to be able to click on that URL and have the browser take me to it. My search results show that I need to customize the drilldown, but I don't see a drilldown customization in a search. Am I missing it somewhere? Is there any way to customize the drilldown of the "details" cell such that a click will take me to the URL that is the value of that cell?
Here's my (redacted) screenshot that shows what happens when I click on it (View events, Other events, etc.):
![screenshot][1]
[1]: /storage/temp/256094-search-with-link.png
↧
How do you disable a vertical scrollbar for a chart?
I have a panel with 2 charts. One of the charts, which uses trellis, has a vertical scrollbar and a pager. I would like to disable the scrollbar and use only the pager with multiple pages.
Is there an option to disable the vertical scrollbar in charts and use only the pager?
↧
↧
How to set tokens from HTML section of XML dashboard?
I have a simple XML dashboard with an html section inside the fieldset section. I want to set a token based on the value of the input field in the html section when I press the submit button. How do I do this? Does anyone have an example I could follow?
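Simple XML can't bind HTML inputs to tokens on its own; the usual pattern (a sketch, assuming hypothetical element ids `my_input` and `my_submit` in your `<html>` panel, wired up from a dashboard JS file in `appserver/static`) uses the SplunkJS token models:

```
require(['jquery', 'splunkjs/mvc', 'splunkjs/mvc/simplexml/ready!'], function($, mvc) {
    var defaultTokens = mvc.Components.get('default');
    var submittedTokens = mvc.Components.get('submitted');
    $('#my_submit').on('click', function() {
        var value = $('#my_input').val();
        // Setting both models mirrors what a fieldset submit button does
        defaultTokens.set('my_token', value);
        submittedTokens.set('my_token', value);
    });
});
```

Reference the JS file from the dashboard with `<dashboard script="my_script.js">`, then use `$my_token$` in your panels as usual.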
↧
Can you help me display only the last event?
Hello,
when I execute the request below, I want to display only the last event, without playing with the time token or doing a dedup on time.
index="windows-wmi" sourcetype="wmi:diskdrive" | table host Caption DeviceID FirmwareRevision Status
How do I do that, please?
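A sketch: events come back newest-first, so `head 1` keeps only the most recent one (use `| dedup host` instead if you want the latest event per host):

```
index="windows-wmi" sourcetype="wmi:diskdrive"
| head 1
| table host Caption DeviceID FirmwareRevision Status
```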
↧
↧
Extract a specific field with regex
I want to extract the field named http_agent from my logs.
The raw field is:
"http_host=""nts.mapnanyp.com""","http_agent=""Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:62.0) Gecko/20100101 Firefox/62.0""",http_retcode=200,"msg=""HTTP get request from,content_switch_name="none",server_pool_name="NTS","user_name=""Unknown""","http_refer=""https://mysite/dashboard/new/datalist.aspx?
I want to show just the part before http_retcode; the result should be:
(http_agent=""Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:62.0) Gecko/20100101 Firefox/62.0)
Can you tell me how I can do that?
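A sketch with `rex`, assuming the doubled quotes around the value are literal in _raw (as shown above) and the agent string itself contains no quotes:

```
<your search>
| rex field=_raw "http_agent=\"\"(?<http_agent>[^\"]+)\"\""
| table http_agent
```

The `[^\"]+` stops at the closing doubled quote, so everything after (including `http_retcode`) is excluded.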
↧
How do you enable CORS in Splunk Enterprise?
I was trying to build an application outside the Splunk Web interface using the **Splunk REST API** and **Angular 6**, and I got a **CORS rejection error**.
REST API (I used):
`https://localhost:8089/services/auth/login`
result:
![alt text][1]
**Angular code:**
login(data) {
const body = new URLSearchParams();
body.set('username', data.username);
body.set('password', data.password);
body.set('output_mode', 'json');
const options = {
headers: new HttpHeaders().set('Content-Type', 'application/x-www-form-urlencoded')
};
console.log(body.toString());
this.http.post('https://localhost:8089/services/auth/login', body.toString(), options).subscribe(
(resData) => {
console.log(resData);
},
(err) => {
console.log(err);
}
);
console.log(data);
}
**But it is working fine with Postman.**
Then I tried changing server.conf:
![alt text][2]
[1]: /storage/temp/256099-screenshot-20180929-203605.png
[2]: /storage/temp/256098-screenshot-20180929-204724.png
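The relevant setting (check against your version's server.conf spec) is `crossOriginSharingPolicy` under `[httpServer]`, which controls CORS on the management port 8089; a sketch assuming an Angular dev server on localhost:4200:

```
[httpServer]
crossOriginSharingPolicy = http://localhost:4200
```

Restart splunkd afterwards. A value of `*` allows any origin, which is convenient in development but not recommended in production.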
↧
Scheduled report sends an email with no results, but a manual search shows data
Hello,
I have a report that runs daily, going back 24 hours. It was reporting results just fine until a few days ago, and nothing about it has changed, yet now the results in the email are blank even though running the search manually returns results. What could possibly cause this type of behavior? I have checked the schedule settings, permissions, and the search itself; none of it differs from what I originally saved. Other saved reports and alerts are running just fine.
Are there any backend changes to Splunk that have been known to cause this type of behavior? Perhaps changes in the environment?
There was a post about this same issue in 2015 titled "Scheduled report shows "No results found" but manual report sees data", which was never answered.
↧