Channel: Questions in topic: "splunk-enterprise"

Dashboard creation

Hi, I need help creating a dashboard where clicking on service A displays a panel driven by one query, and clicking on service B shows results in the same panel using a different query. I have N services like that. Kindly help me with it.
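
One common way to approach this (a sketch, not from the original post; the token name, index, and field names are hypothetical) is to have each service click set a token via drilldown and let a single shared panel run a search parameterised by that token:

    index=main sourcetype=service_logs service="$service_tok$"
    | timechart count by status

If each service needs a genuinely different query rather than just a different filter, the drilldown can instead set the token to a whole search fragment that the panel search expands.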

NOT with the transaction command

I have a query like the one below and it returns results:

    index=ABC host=xyz123 | transaction startswith="failure" endswith="success" maxevents=2 maxspan=1m

Now I want to display the opposite of this result, something like:

    index=ABC host=xyz123 NOT ( | transaction startswith="failure" endswith="success" maxevents=2 maxspan=1m)

How can I achieve this?
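
One pattern that is sometimes used for this kind of exclusion (a sketch, not from the original post): run the transaction in a subsearch, return a field that uniquely identifies the matched events, and exclude those values in the outer search. Here `event_id` is a hypothetical unique identifier, and subsearches have their own result and time limits, so treat this only as a starting point:

    index=ABC host=xyz123 NOT
        [ search index=ABC host=xyz123
          | transaction startswith="failure" endswith="success" maxevents=2 maxspan=1m
          | fields event_id
          | mvexpand event_id
          | dedup event_id ]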

Conditional SPL

How do you build a query that takes two different SPL paths based on a condition within the data? Example: Write the results of a query to a summary index only if the search name does not begin with "TEST"?
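
A sketch of one way to approximate this without true branching (assuming a hypothetical field `search_name` carrying the saved-search name, and placeholder index names): filter before `collect`, so nothing is written to the summary index when the condition is not met:

    index=my_index sourcetype=my_sourcetype
    | where NOT like(search_name, "TEST%")
    | collect index=my_summary

Because `collect` only writes the events that reach it, the `where` clause effectively plays the role of the condition.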

Error: read ECONNRESET

Connection reset looks like a networking issue to me but I checked our firewall logs and I don't see any denies on packets sent by our Splunk HF where the add-on is installed. Can you give me any guidance on how to troubleshoot this?

    02-25-2019 09:45:35.922 -0500 ERROR ExecProcessor - message from "/Data/splunk/etc/apps/TA-Azure_Monitor/bin/azure_activity_log.sh" Modular input azure_activity_log://***** Error getting event hub creds: RequestError: Error: read ECONNRESET
    02-25-2019 09:46:35.855 -0500 ERROR ExecProcessor - message from "/Data/splunk/etc/apps/TA-Azure_Monitor/bin/azure_diagnostic_logs.sh" Modular input azure_diagnostic_logs://***** Diag Logs Error getting event hub creds: RequestError: Error: read ECONNRESET

Duplicate values causing conflict with Dropdown

Hi, I have a dropdown input populated with this search:

    index=apache_nifi "info.eventSource"="QDH"
    | sort 0 headers.ispwebServiceHeader.requestInfo.serviceID
    | fields headers.ispwebServiceHeader.requestInfo.serviceID
    | dedup headers.ispwebServiceHeader.requestInfo.serviceID

and with these configurations:

Field For Label: headers.ispwebServiceHeader.requestInfo.serviceID
Field For Value: headers.ispwebServiceHeader.requestInfo.serviceID

But I always get this error: *"Duplicate values causing conflict"*, even though the values really are unique (I tested the query in a separate search). I tried suggestions from other Answers but with no positive results. Any ideas? Thanks a lot.
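
A sketch of a populating search that guarantees one row per value (same field as in the post; the `trim` and the shorter alias are assumptions, added in case stray whitespace or the long dotted field name is producing apparent duplicates):

    index=apache_nifi "info.eventSource"="QDH"
    | eval serviceID=trim('headers.ispwebServiceHeader.requestInfo.serviceID')
    | stats count by serviceID
    | sort 0 serviceID
    | fields serviceID

With this, both Field For Label and Field For Value would point at `serviceID`.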

Copy index

I'm looking for a clean way to copy an index or duplicate a data stream without having to index it twice. We have a Splunk production environment, but are setting up a new environment. This one is more development based, but would use some of the data that is running in production. Since we don't want to mix dev and prod, but also don't want to index the data twice, what would be the best way to make certain data or indexes available to both machines? We tried a setup with forwarding from the prod machine, and with transforms and props we managed to get the correct data to our dev machine, but then the prod machine stopped indexing altogether...

Help required regarding lookup

I have a lookup (search_query.csv) with data as below:

    Name  Subcategory  Query
    Get   Vehicle      index=abc | where api=fig | table api msg
    Post  Summary      index=cfg | where api=his | table api msg

(Note: the lookup has thousands of rows like the ones above.)

Now I would like to run a query using the above lookup: when Name and Subcategory match, it has to execute the corresponding query (i.e., use Name and Subcategory as input, get the Query as output, and use that output query as the search). Is it possible?
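
One pattern that is sometimes suggested for this (a sketch, not from the original post; `map` has its own limits on the number of searches it runs and on quoting inside the substituted value, so treat it as a starting point): pull the matching row with `inputlookup` and hand its Query field to the `map` command. The Name/Subcategory values below are just the example row from the lookup:

    | inputlookup search_query.csv
    | search Name="Get" Subcategory="Vehicle"
    | fields Query
    | map maxsearches=1 search="search $Query$"

Here `map` substitutes the Query value from the matching lookup row into the template search and runs it.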

Details in alert manager are not displayed

Hello, after updating the app from version 2.1.1 to version 2.2.0 it does not show me the details of the incidents generated in incident posture. ![alt text][1] [1]: /storage/temp/270655-alertmanager.jpg

Unable to get value on x-axis

I have tabular data like below:

    EventTime              SQL CPU Utilization  Other Process CPU Utilization  Total CPU Utilization
    2019-02-24 10:00:48.0  0                    3                              3
    2019-02-24 10:01:48.0  0                    2                              2
    2019-02-24 10:02:48.0  0                    1                              1
    2019-02-24 10:03:48.0  0                    1                              1
    2019-02-24 10:04:48.0  0                    2                              2
    2019-02-24 10:05:48.0  0                    2                              2
    2019-02-24 10:06:48.0  0                    2                              2
    2019-02-24 10:07:48.0  0                    3                              3
    2019-02-24 10:08:48.0  0                    5                              5
    2019-02-24 10:09:48.0  0                    3                              3

I tried to use a line chart and plot EventTime on the X-axis and the remaining values on the Y-axis. I am able to get the values on the Y-axis, but the X-axis is not displaying the data from the EventTime field. I used the query below:

    index=main sourcetype="SQL" host=ABC
    | eval Total_CPU_Utilization=(SQLCPUUtilization+OtherProcessCPUUtilization)
    | chart latest(SQLCPUUtilization) as "SQL CPU Utilization", latest(OtherProcessCPUUtilization) as "Other Process CPU Utilization", latest(Total_CPU_Utilization) as "Total CPU Utilization" by EventTime

Do I need to make any changes in my query?
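
A sketch of one alternative (the strptime format string and the one-minute `span` are assumptions based on the sample rows and may need adjusting): convert EventTime into `_time` and use `timechart`, so the X-axis becomes a proper time axis:

    index=main sourcetype="SQL" host=ABC
    | eval _time=strptime(EventTime, "%Y-%m-%d %H:%M:%S.%N")
    | eval Total_CPU_Utilization=SQLCPUUtilization+OtherProcessCPUUtilization
    | timechart span=1m latest(SQLCPUUtilization) as "SQL CPU Utilization" latest(OtherProcessCPUUtilization) as "Other Process CPU Utilization" latest(Total_CPU_Utilization) as "Total CPU Utilization"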

HTTP Event Collector Indexing Slow

Fellow splunksters, I have been sending data to Splunk via TCP sockets for a while and never had any issues. I switched some of our apps over to using the HTTP Event Collector via the Python Splunk API, so for example:

    import splunklib.client as splunk_client
    service = splunk_client.connect(host='127.0.0.1', port=8089, username=, password=)
    index = service.indexes['my_index']
    index.submit(message, sourcetype='_json', host='local')

I think it is important to note that the data coming into my script is MQTT data. As the data comes in (about 1 event every 2 seconds), Splunk is able to index it just fine. However, if the data stream is interrupted, the events are stored until the connection is re-established and then all the events flood to the Splunk server. This is when it takes about 10-15 minutes to index anywhere from 150 to 300 events. I certainly do expect some delay, just not 15 minutes. Has anybody else had this issue with the HTTP Event Collector? Is there a more efficient way of indexing data so this doesn't happen? Is a TCP socket faster than HEC? I am currently waiting for our IT department to allocate more resources to our Splunk server (such as RAM and CPU cores); maybe that will help increase performance? Thanks!
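
To narrow down where the delay happens, one commonly used check (a sketch; the index name is a placeholder) is to compare event time with index time, which shows how far behind indexing runs after the backlog floods in:

    index=my_index earliest=-60m
    | eval lag_seconds=_indextime-_time
    | timechart span=1m avg(lag_seconds) as avg_lag max(lag_seconds) as max_lag

If the lag stays small, the delay is more likely on the sending side; the SDK's submit call posts one event per HTTP request, so batching events into fewer requests is usually the first thing to try.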

Compare and filter table results

Given the table below:

    VIP          Group   State
    Primary_VIP  Group1  Down
    Backup_VIP   Group1  Down
    Primary_VIP  Group3  Down
    Backup_VIP   Group4  Down

How can I filter the results to show only the events where both the Primary and Backup VIPs are down in the same group? E.g. I'd like to keep just:

    VIP          Group   State
    Primary_VIP  Group1  Down
    Backup_VIP   Group1  Down
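
A sketch of one way to do this, appended to the search that produces the table above (it assumes one Primary and one Backup row per group at most, as in the example): count the distinct VIPs that are down per group and keep the groups where both appear:

    | search State="Down"
    | eventstats dc(VIP) as vips_down by Group
    | where vips_down==2
    | table VIP Group State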

Are there limitations on using the searchmatch() eval function in props.conf?

I have the following eval statement:

    | eval aaa=case(
        action=="opened","success",
        action=="closed","success",
        action=="succeeded","success",
        action=="failed","failure",
        action=="Accepted","success",
        action=="Invalid","failure",
        searchmatch("error trying to bind as user"),"failure",
        action=="new user","created",
        action=="new group","created",
        action=="add" AND app=="usermod","modified",
        action=="removed" AND app="gpasswd","modified",
        app=="usermodd" AND action=="change","modified",
        app=="usermod" AND action=="lock","modified",
        searchmatch("setting system clock"),"success",
        action=="clock_sync","success",
        app=="chage" and action=="changed","modified",
        app=="aide" AND action="created","added",
        app=="aide" AND action=="changed","modified",
        app=="aide" AND action=="removed","deleted",
        app=="ip route" AND action=="add","added",
        searchmatch("changed password expiry"),"modified",
        searchmatch("ip route add"),"added",
        searchmatch("ip route del"),"deleted",
        searchmatch("ip route replace"),"modified",
        useradd_action=="new user" OR useradd_action=="new group","added",
        action=="Up" OR action=="up","modified",
        action=="Down" OR action=="down","modified")

If I use that statement in the search pipeline, it works. If I define it in an EVAL- statement in props.conf, it breaks completely. If I remove the searchmatch() statements, it works. Is searchmatch() not supported in props.conf? If not, is there a workaround? I tried things like `_raw=="*my text*"` and that didn't work either. I understand searchmatch() is an alias for the match() statement. I tried using match() as well and that doesn't work either. Any ideas?
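
One workaround that is often suggested (a sketch, not confirmed against the original environment) is to replace each `searchmatch("some text")` with `match(_raw, "some text")`, since match() only references a field and is more likely to behave the same at search time and in a props.conf EVAL-. A minimal pipeline to validate the rewritten expression before moving it into props.conf (index and sourcetype are placeholders):

    index=my_index sourcetype=my_sourcetype
    | eval aaa_test=case(
        match(_raw, "error trying to bind as user"), "failure",
        match(_raw, "setting system clock"), "success",
        match(_raw, "changed password expiry"), "modified",
        true(), "no_match")
    | stats count by aaa_test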

How can I set up an alert for Splunk errors?

Splunk only notifies me of errors like file system permission issues in the top-right messages dropdown. Since I rarely use the Splunk web interface, I'm always missing them. Is there a way to get notified of these errors? Can I set up alerts for them?
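
A sketch of a search that could be saved as a scheduled alert for this (the filtering is deliberately broad and would need tuning to your environment):

    index=_internal sourcetype=splunkd log_level=ERROR
    | stats count by host, component
    | sort - count

Saved as an alert with a "number of results greater than 0" trigger condition, this sends a notification whenever splunkd logs new errors.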

How do you extract a field that ends at a question mark?

I want to extract a field that captures the data before the question mark, as below:

    api_call "Get \search\ip\6789\?=number\90"

where api_call is an already extracted field. I wrote the rex as `api_call "/"(?[^/?])"`. The required result is `Get \search\ip\6789\`, but it doesn't seem to work.
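
A sketch of one way to do this, appended to the search where api_call is already extracted (`api_base` is just an illustrative name): capture everything up to, but not including, the first question mark:

    | rex field=api_call "^(?<api_base>[^?]+)"
    | table api_call api_base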

Can you help to match events with an inputlookup search?

Hi, I use the basic query below in order to collect the model of a host (workstation):

    index="xx" sourcetype="WMI:Model" | table host Model

In parallel, I have a CSV file called "cmdb" with a field called "HOSTNAME", which corresponds to the field "host" in my search. I want to match these two fields (host and HOSTNAME) in order to collect in the same table the host, the Model, and other fields from my CSV file, like CLIENT_USER, COUNTRY, STATUS, ROOM, SITE and TOWN. Could you help me please?
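
A sketch of one way to join the two (assuming cmdb.csv is available as a lookup table file and that CLIENT_USER, COUNTRY, STATUS, ROOM, SITE and TOWN are its column names):

    index="xx" sourcetype="WMI:Model"
    | stats latest(Model) as Model by host
    | lookup cmdb.csv HOSTNAME as host OUTPUT CLIENT_USER COUNTRY STATUS ROOM SITE TOWN
    | table host Model CLIENT_USER COUNTRY STATUS ROOM SITE TOWN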

How come my dashboard eval token is not working correctly?

I have two "parallel" multivalue fields. One has friendly names and the other has the actual URLs. In the example below, given the friendly name (cnn), it finds the corresponding URL:

    | makeresults
    | fields - _time
    | eval friendly_names="google,facebook,cnn", urls="http://google.com,http://facebook.com,http://cnn.com"
    | makemv friendly_names delim=","
    | makemv urls delim=","
    | eval url=mvindex(urls, mvfind(friendly_names,"cnn"))
    | table url

This works exactly as I expect it to. When I try to use it in a dashboard event handler, it doesn't work. Does anyone have any thoughts on what I might be doing wrong, or an alternative way to do this? My goal is to generate some custom ugly URLs behind the scenes, but to present the user with friendly names on the dashboard. The panel search ends with `| table friendly_names` over a -24h@h to now time range, and in the event handler I try to set the token with **mvindex(urls,mvfind(friendly_names,"$click.value2$"))**.
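
One alternative (a sketch, not from the original dashboard) is to avoid doing the mvindex/mvfind inside the token eval at all, and instead let a small search, driven only by the `$click.value2$` token, resolve the friendly name to the URL:

    | makeresults
    | fields - _time
    | eval friendly_names="google,facebook,cnn", urls="http://google.com,http://facebook.com,http://cnn.com"
    | makemv delim="," friendly_names
    | makemv delim="," urls
    | eval url=mvindex(urls, mvfind(friendly_names, "$click.value2$"))
    | table url

The drilldown then only needs to pass the clicked friendly name into the token, and the mapping from friendly name to URL happens in SPL, which is the part already known to work.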

Create a dashboard to count errors by date

I need to create a dashboard that counts the number of logs by date and the number of errors by date.
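
A sketch of a panel search for this (the index name and the way an "error" is detected are placeholders and would need adjusting to the actual data):

    index=my_index
    | eval is_error=if(match(_raw, "(?i)error"), 1, 0)
    | timechart span=1d count as total_logs sum(is_error) as error_count

On a dashboard this renders naturally as a column or line chart with one point per day.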

Create a Dashboard to count number of logs and count errors by date

I need to create a dashboard to count the number of logs and the number of errors by date. Can you help me by sending the script, please?