Channel: Questions in topic: "splunk-enterprise"

How to display validation errors in the Setup form?

I've written a configuration handler script (Python) using admin.MConfigHandler. The script works, but I haven't found any way to provide good feedback to the user when a field fails to validate. The docs I've found are lacking in this regard. Browsing existing scripts, there seem to be two common ways of handling validation failures: raise an exception (typically admin.ArgValidationException) or write some text to stderr. The problem is that neither gives the user useful feedback: all you get is a red box at the top of the page with a generic "Error while posting..." message. Looking at the HTML generated from setup.xml, you can see that each field has a DIV attached with the class "widgeterror". If you manually modify the markup, a nice bold red message is displayed next to the field. Short version: the form is already prepared for this type of feedback. How do you "light it up"?
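For reference, this is roughly what the exception-raising pattern looks like in a handler today (a minimal sketch that runs inside Splunk's bundled Python; the conf file, stanza, and field names are made up for illustration). The exception message only surfaces in the generic banner, not next to the field, which is exactly the limitation being asked about:

    import splunk.admin as admin

    class ExampleSetupHandler(admin.MConfigHandler):
        def setup(self):
            # Declare the arguments the setup form is allowed to POST.
            if self.requestedAction == admin.ACTION_EDIT:
                self.supportedArgs.addOptArg('poll_interval')

        def handleList(self, confInfo):
            # Return current settings so the form can be pre-populated.
            for stanza, settings in self.readConf('myapp').items():
                for key, val in settings.items():
                    confInfo[stanza].append(key, val)

        def handleEdit(self, confInfo):
            raw = self.callerArgs.data.get('poll_interval', [None])[0]
            if raw is None or not str(raw).isdigit():
                # This message only appears in the red error banner at the
                # top of the page, not in the per-field "widgeterror" DIV.
                raise admin.ArgValidationException(
                    'poll_interval must be a positive integer')
            self.writeConf('myapp', 'settings', {'poll_interval': raw})

    admin.init(ExampleSetupHandler, admin.CONTEXT_NONE)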

Process monitoring in Windows

Hi. On our system there are multiple java.exe processes running, and we need to monitor the CPU and memory usage of all of them. I tried whitelisting in the inputs.conf file, but it did not work. We do not need to monitor any other processes on our Windows server. Is there a specific setting to use in inputs.conf to monitor only the running java.exe processes, or is there an app to monitor Java processes? We could not configure JMX, because our Apache setup is completely customized by our product owner and we cannot edit it further. Thanks
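A perfmon input is one way to scope collection to specific process instances (a minimal sketch, assuming the Windows add-on's perfmon input is available; the stanza name and index are placeholders, and the instance wildcard may need adjusting for your counter set, since multiple java processes show up as java, java#1, java#2 and so on):

    # inputs.conf on the Windows forwarder
    [perfmon://JavaProcess]
    object = Process
    counters = % Processor Time; Working Set
    # Collect only process instances whose name begins with "java"
    instances = java*
    interval = 60
    index = windows_perf
    disabled = 0

Because the instances setting restricts collection at the source, no other processes on the host are collected.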

About log rotation best practices

I am planning to rotate the logs that Splunk is monitoring, using the compress option as part of rotation. One thing worries me: if a file is rotated and compressed before Splunk has finished reading it, will those events be missing from Splunk? Is there a recommended rotation setup?
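One common safeguard is logrotate's delaycompress option, which keeps the most recent rotated file uncompressed for one full cycle so the monitor input has time to finish reading it (a sketch; the path and retention count are placeholders):

    /var/log/myapp/*.log {
        daily
        rotate 7
        compress
        # Keep the newest rotated file uncompressed for one cycle,
        # giving Splunk time to finish reading it before compression.
        delaycompress
        missingok
        notifempty
    }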

Compound transform - logical AND condition

Is support for compound transform conditions in the works? I have cases where I would like to set a DEST_KEY only if an event matches two SOURCE_KEYs at once (a logical AND), for example a specific source and host: source=/logs/beta/... AND host=vserver1451, applied within a sourcetype stanza. I understand that one can specify multiple transforms for a props.conf stanza, but these behave independently, which is not what I am after.
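One workaround, if you can key the props stanza on source rather than sourcetype, is to split the AND across the two files: the props stanza selector supplies one condition and the transform's REGEX supplies the other (a sketch with made-up stanza and index names):

    # props.conf -- the source pattern is the first condition
    # ("..." is the recursive wildcard in source:: stanzas)
    [source::/logs/beta/...]
    TRANSFORMS-route_beta = route_beta_vserver

    # transforms.conf -- the host is the second condition
    [route_beta_vserver]
    SOURCE_KEY = MetaData:Host
    # Host metadata values carry the "host::" prefix
    REGEX = ^host::vserver1451$
    DEST_KEY = _MetaData:Index
    FORMAT = beta_index

The transform only runs for events whose source matches the props stanza, and its REGEX only matches the desired host, so the DEST_KEY is set only when both conditions hold.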

How do I distribute the search app bundle on a search head cluster?

In a search head cluster we can use the deployer to distribute app bundles, but I've always had a question. Suppose I need to update a configuration file in the search app: for example, I want to add a lookup table under `search/lookups/` or a static file (.js) under `search/appserver/static/`. How should I do it?

A. Copy `$SPLUNK_HOME/etc/apps/search` from a search head to `$SPLUNK_HOME/etc/shcluster/apps/` on the deployer, add the new lookup table or static file there, and then distribute the bundle with the `splunk apply` CLI.

B. Create a `search` directory directly under `$SPLUNK_HOME/etc/shcluster/apps` on the deployer, create a `lookups` directory inside it, add the lookup table there, and then distribute the bundle with the `splunk apply` CLI.

Which method, A or B, is correct? Please forgive my English; any help would be appreciated.
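For reference, option B looks roughly like this on the deployer (a sketch; the target URI and credentials are placeholders):

    mkdir -p $SPLUNK_HOME/etc/shcluster/apps/search/lookups
    cp my_lookup.csv $SPLUNK_HOME/etc/shcluster/apps/search/lookups/
    $SPLUNK_HOME/bin/splunk apply shcluster-bundle \
        -target https://sh1.example.com:8089 -auth admin:changeme

The deployer pushes whatever sits under shcluster/apps on top of each member's copy of the app, so only the files you want distributed need to live there.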

How to make Splunk re-read a file when only the date stamp of the file changes

I have a system where I use SSH to pull status data from a remote system every 5 minutes and store it in a file that Splunk is set to monitor. My problem is that Splunk only indexes the data when the file content changes. I would like Splunk to index the full content every time the file's timestamp changes (the 5-minute cron job), even if nothing inside the file has changed. Is this possible?

Example, first run: **red=1 yellow=2**, file timestamp **09:05**. Splunk now shows two events.
Second run: **red=1 yellow=2**, file timestamp **09:10**. Splunk shows no new events. I need both events to appear every 5 minutes, even when they do not change.
Third run: **red=1 yellow=3**, file timestamp **09:15**. Splunk shows the events again, since the file content has changed.
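One setting worth trying is CHECK_METHOD in props.conf, which changes how the monitor input decides whether a file counts as new (a sketch; the source path is a placeholder, and the docs note this is meant for small files that are rewritten in place):

    # props.conf on the instance doing the monitoring
    [source::/opt/status/status.log]
    # Re-read the whole file whenever its modification time changes,
    # instead of checksumming the content.
    CHECK_METHOD = modtime

With modtime, the entire file is re-indexed on every timestamp change, which also means duplicate events by design; that appears to be exactly the behavior being asked for here.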

Nested Splunk XML Condition Match Possibility

I have two conditions that need to be compared in order to set a token value on the second dropdown: the first condition comes from dropdown 1 (token="view") and the second from dropdown 2 (token="category"). Please tell me if there is an alternative way of doing this.
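In newer SimpleXML (around 6.6 and later, if I recall correctly) a `<change>` condition accepts an eval-style match attribute, which lets you test both tokens at once (a sketch; the token values and the token being set are made up):

    <input type="dropdown" token="category">
      ...
      <change>
        <condition match="$view$ == &quot;summary&quot; AND $category$ == &quot;network&quot;">
          <set token="panel_query">index=net_summary</set>
        </condition>
        <condition>
          <unset token="panel_query"></unset>
        </condition>
      </change>
    </input>

On versions without match support, the usual fallback is to combine both tokens inside the panel's search itself, for example with an eval case() on `$view$` and `$category$`.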

Is IMAP App for Splunk capable of ingesting emails from SMTP?

Hi team. I would like to ask whether we can ingest emails delivered via SMTP using the IMAP app. Thank you! Kevin Alzaga

What happens if you specify two paths in a volume in indexes.conf?

What happens if you specify two paths in a volume in indexes.conf? For example:

    [volume:example]
    path = /opt/splunk/examplevolume
    path = /opt/splunk/var/lib/splunk
    maxVolumeDataSizeMB = 400000

Does Splunk send the data to each location, split it across the two paths, or choose just one? Cheers
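For what it's worth, a volume stanza takes a single path; when the same attribute is repeated in one stanza, Splunk's conf parser keeps only one value (the last one read, in my experience), so all the data lands in one location. To actually spread storage across two mount points you would define two volumes and assign them to different bucket tiers, for example (a sketch with placeholder sizes):

    [volume:hot]
    path = /opt/splunk/examplevolume
    maxVolumeDataSizeMB = 200000

    [volume:cold]
    path = /opt/splunk/var/lib/splunk
    maxVolumeDataSizeMB = 400000

    [main]
    homePath = volume:hot/main/db
    coldPath = volume:cold/main/colddb
    # thawedPath cannot reference a volume
    thawedPath = $SPLUNK_HOME/var/lib/splunk/main/thaweddb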

How to create a Splunk app/dashboard that fetches from an external API using JavaScript?

Hi guys, I am a newbie to Splunk and am trying to work with an API. The goal is to create an app that gets data from an external API. I only know JavaScript and C#. The API looks something like this: `http://domain.com/API/controller/function?param1=val1&startdate=date&endate=enddate`. I've read some articles from the community, but they didn't get me anywhere. Hopefully I can get an answer here; a JavaScript sample for a Splunk dashboard would be nice :) Thank you for your help in advance.
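One common pattern is a dashboard extension script that calls the API from the browser and renders the response (a minimal sketch; the URL, dates, and element ID are placeholders, and the external host must allow cross-origin requests or the browser will block the call with a CORS error):

    // appserver/static/fetch_api.js, referenced from the dashboard via
    // <dashboard script="fetch_api.js">
    require([
        'jquery',
        'splunkjs/mvc/simplexml/ready!'
    ], function ($) {
        var url = 'http://domain.com/API/controller/function' +
                  '?param1=val1&startdate=2017-09-01&endate=2017-09-25';
        $.getJSON(url)
            .done(function (data) {
                // Dump the raw response into an <html> panel placeholder.
                $('#api_result').text(JSON.stringify(data, null, 2));
            })
            .fail(function (xhr) {
                $('#api_result').text('API call failed: ' + xhr.status);
            });
    });

If CORS turns out to be a blocker, the usual alternative is to pull the data server-side with a scripted or modular input and build the dashboard on the indexed results instead.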

Forward to another index

Dear all, may I ask the experts a noob question? Currently I am forwarding data from several forwarders (F_a, F_b, F_c) to a Splunk indexer (S_a), where it is collected in three different indexes: a, b, c.

For research purposes I would now like all the data sent to S_a to also go to another indexer (S_b): F_a/F_b/F_c ---> S_a ---> S_b. This is of course easy to set up via "Configuring Forwarding" in the management console. The challenge is that I want all the data coming from S_a to S_b to be collected in one single index, e.g. "abc". In terms of indexes: S_a has a, b, c, while S_b has only abc. The idea is to feed all data from indexes a, b, c on S_a into the single index abc on S_b, not as a one-time copy but in real time during forwarding. Is that possible, and how? In an ideal case, the name of the original index would be kept in an additional field. Best regards, and thanks a lot in advance for your answers.
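One approach (a sketch; host names, ports, and the sourcetype are placeholders) is to forward from S_a as uncooked data and receive it on S_b with a plain TCP input, which lets you pin the index on the receiving side:

    # outputs.conf on S_a
    [tcpout]
    defaultGroup = research
    # Keep indexing locally on S_a while also forwarding
    indexAndForward = true

    [tcpout:research]
    server = s_b.example.com:9998
    # Send raw rather than cooked data so S_b treats it as a new stream
    sendCookedData = false

    # inputs.conf on S_b -- a plain TCP input, since the data is uncooked
    [tcp://9998]
    index = abc
    sourcetype = from_s_a

The caveat is that raw forwarding discards the original metadata, so the source index name from S_a would not survive automatically; preserving it would take extra work, such as tagging events at ingest on S_b per forwarding port or rewriting them before they leave S_a.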

Expand Splunk Enterprise in a hardware-constrained environment

We are currently running Splunk in single-instance mode and have grown enough that we need to expand. I have the ability to provision an existing beefy server as a second indexer; however, I do not have the hardware for a dedicated search head. My reading of the Splunk documentation is that I will need three servers minimum (1 SH, 2 IX). Is that correct? Is there no way to "just add a second indexer" (a management question)?

Deposit Structuring

I am trying to create a query that calculates the amount of money a person deposits within an hour and then compares that to the amount the person withdraws within the following hour. I've been able to calculate the amounts and display them as a table, but I'd really like to be able to compare the values and alert when they match. For example, given the following data:

    9/24/2017 10:00:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=499.00 direction=external
    9/24/2017 10:01:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=499.00 direction=external
    9/24/2017 10:02:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=499.00 direction=external
    9/24/2017 10:03:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=3.00 direction=external
    9/24/2017 10:05:00 AM: account=123456789 memberName="Bad guy" command=withdrawal channel=atm amount=500.00 direction=external
    9/24/2017 10:06:00 AM: account=123456789 memberName="Bad guy" command=withdrawal channel=atm amount=500.00 direction=external
    9/24/2017 10:07:00 AM: account=123456789 memberName="Bad guy" command=withdrawal channel=atm amount=500.00 direction=external
    9/24/2017 11:00:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=499.00 direction=external
    9/24/2017 11:01:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=499.00 direction=external
    9/24/2017 11:02:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=499.00 direction=external
    9/24/2017 11:03:00 AM: account=123456789 memberName="Bad guy" command=deposit channel=atm amount=5.00 direction=external
    9/24/2017 11:05:00 AM: account=123456789 memberName="Bad guy" command=withdrawal channel=atm amount=500.00 direction=external
    9/24/2017 11:06:00 AM: account=123456789 memberName="Bad guy" command=withdrawal channel=atm amount=500.00 direction=external
    9/24/2017 11:07:00 AM: account=123456789 memberName="Bad guy" command=withdrawal channel=atm amount=500.00 direction=external

I ran the query:

    index=test sourcetype="test_transactions" direction=external
    | bin span=1h _time
    | stats sum(amount) AS "Total" by _time date_hour account command
    | where Total >= 1500

which produced a nice table showing the aggregate amounts per hour per command, but I am still not sure how to compare the deposit and withdrawal totals per hour to each other. I'd appreciate any help you can give me. Thanks
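One way to line the two hourly totals up for comparison (a sketch against the field names above) is to pivot deposits and withdrawals into separate columns with conditional sums, then compare each hour's withdrawals to the previous hour's deposits using streamstats:

    index=test sourcetype="test_transactions" direction=external
    | bin span=1h _time
    | stats sum(eval(if(command=="deposit", amount, 0)))    AS deposits
            sum(eval(if(command=="withdrawal", amount, 0))) AS withdrawals
            by _time account
    | sort 0 account _time
    | streamstats current=f window=1 last(deposits) AS prev_hour_deposits by account
    | where withdrawals >= 1500 AND withdrawals == prev_hour_deposits

Note that in the sample data the matching deposits and withdrawals actually land in the same hour, so depending on how the structuring pattern is defined you may want to compare against `deposits` rather than `prev_hour_deposits`, or against both.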

How to normalize field value from two different sourcetypes?

Suppose I have two sourcetypes: in sourcetype=proxy1_source the field url starts with "http://", and in sourcetype=proxy2_source the field url does not start with "http://". How do I search for all events in both sourcetypes so that I can table the url field, with all urls from proxy2_source prepended with "http://"? Also, proxy2_source doesn't always have the url field. I tried this:

    | rex field=url "(?<scheme>((http|https)://))"
    | fillnull value="http://"
    | eval url = scheme + url

but I got output such as:

    http://http:// (this keeps showing up)
    http://http://www.domain.com/any/thing/ (proxy1_source - redundant http://)

while the proxy2_source urls look fine with the intended http:// prefix. Thanks in advance.
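A single conditional eval avoids the double prefix (a sketch; it leaves events without a url field untouched rather than inventing one):

    sourcetype=proxy1_source OR sourcetype=proxy2_source
    | eval url = case(
          isnull(url),              null(),
          match(url, "^https?://"), url,
          true(),                   "http://" . url)
    | table sourcetype url

Because match() checks for an existing scheme, the prefix is only prepended when it is actually missing, which is why proxy1_source urls come through unchanged.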

How Do I Create a Splunk Setup View

My goal is to create a setup page that users will be forced to configure when installing my Splunk app. I know that Splunk has [setup.xml][1] pages, but I want to use the setup view that was released around Splunk 6.4.0. The only documentation I can find for it is in the [app.conf][2] spec: it's listed in the `[ui]` stanza as a `setup_view` property that allows you to specify the setup view. However, that's it! I can't find any more information on this. Can anyone help me figure out how to use it?

[1]: http://dev.splunk.com/view/SP-CAAAE8U
[2]: http://docs.splunk.com/Documentation/Splunk/6.6.3/admin/Appconf#.5Bui.5D
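For reference, wiring the property up looks roughly like this (a sketch; `my_setup` is a hypothetical view name):

    # default/app.conf
    [ui]
    is_visible = true
    label = My App
    # Points at default/data/ui/views/my_setup.xml
    setup_view = my_setup

    [install]
    # Marks the app as unconfigured until setup completes,
    # which is what forces users through the setup view.
    is_configured = false

The view itself is an ordinary dashboard (often an HTML/JS view) that writes settings via the REST API and then flips is_configured to true when the user finishes.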

ForeScout addon for Splunk

I have installed the ForeScout add-on in Splunk and configured it by following the documentation. In Wireshark I can see that CounterACT is sending messages to Splunk; however, I cannot see the graphs, and when I search for the data there appears to be none in the Splunk Web GUI. The data seems to reach Splunk (based on Wireshark), so why can't I see it? I do not know what to investigate next. Please advise.
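A first check worth running (a generic sketch, not specific to the ForeScout add-on) is whether anything is being indexed at all for the expected input:

    index=_internal source=*metrics.log* group=per_sourcetype_thruput
    | stats sum(kb) AS kb_indexed by series

together with an all-index search over a wide time range, in case the events are landing in an unexpected index or with a skewed timestamp:

    index=* earliest=-7d | stats count by index sourcetype

If nothing shows up in either, the usual suspects are the receiving input itself (a port or protocol mismatch with what CounterACT sends) or errors and blocked queues in splunkd.log.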

I need proper documentation for the Splunk SAP PowerConnect add-on

Hey, does SAP PowerConnect ingest flat files, HTML, or other file types?

Problem splitting data: lines are lost from scripted input data

I have a script that works fine. When I run it from the CLI like this, I get the correct result:

    /opt/splunk/bin/splunk cmd /opt/splunk/etc/apps/MikroTik/bin/mikrotik_upnp.sh

    Flags: X - disabled, I - invalid, D - dynamic
     0 D ;;; upnp 10.10.10.32: Teredo chain=dstnat action=dst-nat to-addresses=10.10.10.32 to-ports=57050 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=57050
     1 D ;;; upnp 10.10.10.84: Skype UDP at 10.10.10.84:48153 (3904) chain=dstnat action=dst-nat to-addresses=10.10.10.84 to-ports=48153 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=48153
     2 D ;;; upnp 10.10.10.84: Skype TCP at 10.10.10.84:48153 (3904) chain=dstnat action=dst-nat to-addresses=10.10.10.84 to-ports=48153 protocol=tcp dst-address=110.12.197.134 in-interface=ether1 dst-port=48153
     3 D ;;; upnp 10.10.10.128: Skype UDP at 10.10.10.128:43905 (3909) chain=dstnat action=dst-nat to-addresses=10.10.10.128 to-ports=43905 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=43905
     4 D ;;; upnp 10.10.10.128: Skype TCP at 10.10.10.128:43905 (3909) chain=dstnat action=dst-nat to-addresses=10.10.10.128 to-ports=43905 protocol=tcp dst-address=110.12.197.134 in-interface=ether1 dst-port=43905
     5 D ;;; upnp 10.10.10.129: Skype UDP at 10.10.10.129:20139 (3910) chain=dstnat action=dst-nat to-addresses=10.10.10.129 to-ports=20139 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=20139
     6 D ;;; upnp 10.10.10.129: Skype TCP at 10.10.10.129:20139 (3910) chain=dstnat action=dst-nat to-addresses=10.10.10.129 to-ports=20139 protocol=tcp dst-address=110.12.197.134 in-interface=ether1 dst-port=20139
     7 D ;;; upnp 10.10.10.125: 3074 UDP chain=dstnat action=dst-nat to-addresses=10.10.10.125 to-ports=3074 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=3074
     8 D ;;; upnp 10.10.10.152: WhatsApp (1505943818) () chain=dstnat action=dst-nat to-addresses=10.10.10.152 to-ports=56265 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=56265
     9 D ;;; upnp 10.10.10.152: WhatsApp (1505944513) () chain=dstnat action=dst-nat to-addresses=10.10.10.152 to-ports=61271 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=61271
    10 D ;;; upnp 10.10.10.152: WhatsApp (1505945615) () chain=dstnat action=dst-nat to-addresses=10.10.10.152 to-ports=62934 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=62934
    11 D ;;; upnp 10.10.10.32: uTorrent (TCP) chain=dstnat action=dst-nat to-addresses=10.10.10.32 to-ports=28816 protocol=tcp dst-address=110.12.197.134 in-interface=ether1 dst-port=28816
    12 D ;;; upnp 10.10.10.32: uTorrent (UDP) chain=dstnat action=dst-nat to-addresses=10.10.10.32 to-ports=28816 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=28816

But in Splunk I only get 9 events. It stops at event 7, so events 8, 9, 10, 11 and 12 are missing, and the result looks like this (every event shows host = Varg, source = /opt/splunk/etc/apps/MikroTik/bin/mikrotik_upnp.sh, sourcetype = mikrotik2):

    25/09/2017 11:21:52.000   7 D ;;; upnp 10.10.10.125: 3074 UDP chain=dstnat action=dst-nat to-addresses=10.10.10.125 to-ports=3074 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=3074
    25/09/2017 11:21:52.000   6 D ;;; upnp 10.10.10.129: Skype TCP at 10.10.10.129:20139 (3910) chain=dstnat action=dst-nat to-addresses=10.10.10.129 to-ports=20139 protocol=tcp dst-address=110.12.197.134 in-interface=ether1 dst-port=20139
    25/09/2017 11:21:52.000   5 D ;;; upnp 10.10.10.129: Skype UDP at 10.10.10.129:20139 (3910) chain=dstnat action=dst-nat to-addresses=10.10.10.129 to-ports=20139 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=20139
    25/09/2017 11:21:52.000   4 D ;;; upnp 10.10.10.128: Skype TCP at 10.10.10.128:43905 (3909) chain=dstnat action=dst-nat to-addresses=10.10.10.128 to-ports=43905 protocol=tcp dst-address=110.12.197.134 in-interface=ether1 dst-port=43905
    25/09/2017 11:21:52.000   3 D ;;; upnp 10.10.10.128: Skype UDP at 10.10.10.128:43905 (3909) chain=dstnat action=dst-nat to-addresses=10.10.10.128 to-ports=43905 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=43905
    25/09/2017 11:21:52.000   2 D ;;; upnp 10.10.10.84: Skype TCP at 10.10.10.84:48153 (3904) chain=dstnat action=dst-nat to-addresses=10.10.10.84 to-ports=48153 protocol=tcp dst-address=110.12.197.134 in-interface=ether1 dst-port=48153
    25/09/2017 11:21:52.000   1 D ;;; upnp 10.10.10.84: Skype UDP at 10.10.10.84:48153 (3904) chain=dstnat action=dst-nat to-addresses=10.10.10.84 to-ports=48153 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=48153
    25/09/2017 11:21:52.000   0 D ;;; upnp 10.10.10.32: Teredo chain=dstnat action=dst-nat to-addresses=10.10.10.32 to-ports=57050 protocol=udp dst-address=110.12.197.134 in-interface=ether1 dst-port=57050
    25/09/2017 11:21:52.000   Flags: X - disabled, I - invalid, D - dynamic

I tried to split events on a blank line but did not get it to work. inputs.conf:

    [script://$SPLUNK_HOME/etc/apps/MikroTik/bin/mikrotik_upnp.sh]
    disabled = false
    interval = 300
    sourcetype = mikrotik2

props.conf:

    [mikrotik2]
    DATETIME_CONFIG =
    NO_BINARY_CHECK = true
    category = Custom
    pulldown_type = true
    LINE_BREAKER =
    BREAK_ONLY_BEFORE = \d+\s+D\s
    disabled = false

Why do 5 events get lost? Is it due to my BREAK_ONLY_BEFORE? Is there a better way to do it (use LINE_BREAKER instead)?
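The usual fix for data like this is to break on LINE_BREAKER alone and turn line merging off entirely (a sketch against the output above):

    # props.conf
    [mikrotik2]
    SHOULD_LINEMERGE = false
    # Break before each numbered rule line ("0 D ;;; upnp ...");
    # the first capture group is consumed as the event boundary.
    LINE_BREAKER = ([\r\n]+)(?=\d+\s+D\s+;;;)
    # The script output carries no timestamp, so stamp events
    # with the current index time instead of letting Splunk guess.
    DATETIME_CONFIG = CURRENT

BREAK_ONLY_BEFORE depends on the line-merging stage, which can behave surprisingly when the whole output arrives as one scripted-input chunk; LINE_BREAKER with SHOULD_LINEMERGE=false is generally both more predictable and faster.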

Replacement for Join

The example below produces the output I need, but I will exceed the join command's limitations (50k results). Can someone suggest a different way to accomplish this without using the join command?

As an example:

**Primary table**: Customer = 1, 2, 3
**Secondary table**: Customer = 1, 2, 3, 2 and Spend = 100, 200, 300, 400

**Search**:

    index=primary
    | join type=left max=0 Customer
        [search index=secondary]
    | table Customer Spend

**Output**: Customer = 1, 2, 2, 3 and Spend = 100, 200, 400, 300
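The standard join replacement is to search both indexes at once and group with stats, which has no 50k subsearch cap (a sketch against the fields above):

    (index=primary) OR (index=secondary)
    | stats values(Spend) AS Spend by Customer
    | mvexpand Spend
    | table Customer Spend

values() collects every Spend seen for a Customer across both indexes, and mvexpand splits the multivalue back into one row per spend, matching the one-to-many shape of the left join. Two caveats worth checking on your data: values() deduplicates, so identical Spend amounts for the same customer would collapse (stats list(Spend) keeps duplicates), and customers present only in primary should still come through with an empty Spend, but it is worth verifying that left-join-like behavior.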

How to create a 7-day report that takes the last 6 days from a summary index and today's data from another index, switching indexes at the day boundary

I want to create a report over the last 7 days of data that takes the last 6 days from the summary index and today's data from another index, switching indexes at the day boundary. Is this possible?
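One way is to stitch the two time ranges together with multisearch (a sketch; the index names, sourcetype, and final stats are placeholders, and both legs must be streaming searches):

    | multisearch
        [ search index=summary_daily earliest=-6d@d latest=@d ]
        [ search index=main sourcetype=my_data earliest=@d latest=now ]
    | stats count by host

An append-based variant works too if either leg needs non-streaming commands, though that one is subject to the usual subsearch limits.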