Channel: Questions in topic: "splunk-enterprise"

Setting phoneHomeIntervalInSecs

I need to set this for my Windows deployment clients only. Can I add this entry to web.conf, or can I only add it to deploymentclient.conf? If so, does it have the same impact on my HF and indexer? I want my Windows UF clients to poll every 6 hours instead of every minute, since I will have 12,000 of them. Thanks!
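A minimal sketch of where this setting normally lives, assuming the clients check in to a deployment server: phoneHomeIntervalInSecs goes in deploymentclient.conf on each client (for example in an app you push or bake into the Windows UF install), not in web.conf. Six hours is 21600 seconds; the default is 60:

    # deploymentclient.conf on the Windows universal forwarders
    [target-broker:deploymentServer]
    targetUri = <deployment-server-host>:8089
    phoneHomeIntervalInSecs = 21600

Because the setting is per client, it only changes the clients that carry this file; an HF or indexer that keeps its default deploymentclient.conf keeps polling at its own interval.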

Invalid FORMAT when creating a field transformation

I have these events that come with a `source` attribute something like `source = /var/collectd/csv/sv3vm5b/cpu-0/cpu-idle-2018-01-10` and I need to extract the CPU number (the `cpu-0` part, which can also be `cpu-1`, `cpu-2`, or `cpu-3`). So I tried to create (for my sourcetype) a transformation (`Fields: Field transformations: Add new`). The destination app is `search`, the new field name is `cpu`, the type is `regex-based` with the regular expression `^.*/cpu-([0-9]+)/` and the source key `source`. According to the form, the default format (`::$1`) should do just fine, so I leave the Format box blank. But it won't save, yielding this error message: `Encountered the following error while trying to save: Invalid FORMAT:` (I would add a screen capture but I don't have enough karma yet). Help?
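A minimal sketch of the equivalent transforms.conf/props.conf entries, assuming the goal is a search-time field named `cpu` (the stanza name `extract_cpu` and the sourcetype name are placeholders). Supplying a FORMAT explicitly avoids the blank-Format validation error:

    # transforms.conf
    [extract_cpu]
    SOURCE_KEY = source
    REGEX = /cpu-([0-9]+)/
    FORMAT = cpu::$1

    # props.conf
    [your_sourcetype]
    REPORT-cpu = extract_cpu

Alternatively, a named capture group such as `/cpu-(?<cpu>[0-9]+)/` should work without any FORMAT at all.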

LEFT JOIN not working

The last left join below (the product_name one) is what I am trying to join onto the search, but it is not listing all product_names per machine.

    `GEN_ProductionWorkstations`
    | table machine
    | join type=left machine
        [ search index=sccm sourcetype=otl_dbin_machineinfo host=opspk source=dbmon-dump:/otl_dbin_machineinfo ]
    | join type=left machine
        [ search index=ad source=otl_addnsscan
          | eval machine=lower(machine)
          | rename data as IP, name as machine
          | table machine, IP, User_Name0, Model0, lastRebootDaysAgo ]
    | join type=left machine
        [ search index=sccm computername=* product_name=*
          | search category!="Device Drivers, Configuration, and Utilities"
          | eval machine=lower(computername)
          | fields machine, product_name
          | dedup machine, product_name
          | table machine, product_name ]

The final table should list machine, IP, User_Name0, Model0, lastRebootDaysAgo, product_name. I think the issue is that there can be many product_names per machine but only one User_Name0 and IP per machine.
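A hedged alternative, assuming the goal is all product names per machine in one (multivalue) field: collapse product_name inside the subsearch before joining, so the left join only needs one matching row per machine (by default, join keeps only a single matching subsearch row per main-search row):

    | join type=left machine
        [ search index=sccm computername=* product_name=*
          | search category!="Device Drivers, Configuration, and Utilities"
          | eval machine=lower(computername)
          | stats values(product_name) as product_name by machine ]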

Splunk App for AWS on Splunk Enterprise - indexer and search head Splunk version

Hello, it looks like this app requires Splunk Enterprise to be at least v6.5.0. Do the indexers also need to be v6.5.0, or is it okay if just the search head that the app and the add-on will be installed on is running Splunk v6.5.0? Thanks in advance. Usup

How to combine wildcard results into one field?

    index=perfmonitor sourcetype=dc_perfmonitor source="f:*" | fields + host, "*Processor Time" | stats avg("*Processor Time") by host

The output of this query is a long list of hosts with a staggered table of each machine's average total processor time. I want to combine all of these results into a single column. Basically, how do I take the fields matched by this wildcard (each has a space in its name) and turn them into one more general field, like "ProcessorTime", instead of "Machine1 Processor Time", "Machine2 Processor Time", "Machine3 Processor Time", etc.?
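A minimal sketch of one way to collapse the wildcarded columns, assuming each event only carries its own host's "... Processor Time" field, so coalesce simply picks up whichever one is present (the name ProcessorTime is arbitrary):

    index=perfmonitor sourcetype=dc_perfmonitor source="f:*"
    | foreach "*Processor Time" [ eval ProcessorTime = coalesce(ProcessorTime, '<<FIELD>>') ]
    | stats avg(ProcessorTime) as "Processor Time" by host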

TA-ClamAV App - Ingesting ClamAV data from sources other than the Syslog

I'm interested in using the ClamAV App for Splunk and have installed it in my environment, but so far I'm not getting data ingested as expected. That said, I'm not that surprised: we aren't sending the clamscan results directly to syslog; instead that data goes into dated files that are created in /var/log/clamav. So I'm trying to figure out how to make the ClamAV app find those logs and use them for the data that feeds the app's dashboard. Hopefully that all makes sense, but if not, bear with me and I'll explain and/or provide more information as necessary, to hopefully get this app working for my needs (and help educate others as well). Thanks
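A hedged starting point, assuming the files can be read by a Splunk instance or forwarder on that box: add a monitor input for the directory, then point it at whatever sourcetype and index the ClamAV app actually searches (the values below are guesses; check the app's props.conf and dashboard searches and substitute the real ones):

    # inputs.conf
    [monitor:///var/log/clamav]
    # assumption: replace with the sourcetype/index the ClamAV app expects
    sourcetype = clamav
    disabled = 0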

Encountered the following error while trying to update: Splunkd daemon is not responding: (u"Error connecting to /servicesNS/nobody/Splunk_TA_snow/apps/local/Splunk_TA_snow/setup: ('The read operation timed out',)",)

Hello all, I have the following error: Encountered the following error while trying to update: Splunkd daemon is not responding: (u"Error connecting to /servicesNS/nobody/Splunk_TA_snow/apps/local/Splunk_TA_snow/setup: ('The read operation timed out',)",) For the past hour I have been refreshing, restarting, and logging out (I'm testing this on my local Splunk instance), but none of that seems to work. I am on a Windows machine, if that is something to consider. Can someone please help me out with this error? Thanks
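A hedged workaround sketch, assuming the setup page is simply timing out while waiting on splunkd: raising the Splunk Web-to-splunkd timeout in web.conf (and restarting Splunk) sometimes lets the setup call complete, though it does not address whatever is making the call slow in the first place:

    # $SPLUNK_HOME/etc/system/local/web.conf
    [settings]
    # seconds; the default is 30
    splunkdConnectionTimeout = 300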

Get current logged in user's Email Address in Javascript SDK

I'm attempting to get the currently logged-in user's email address from the JavaScript SDK to pass to a js variable. I can get the currently logged-in username; however, the *configKey* values don't seem to be listed in this docs article for getConfigValue, so I'm not sure what value to use for the email address: https://docs.splunk.com/Documentation/Splunk/6.6.5/ModuleRef/Splunk.util#Parameters_8 Does anybody have the *configKey* value for the user's email address in the JavaScript SDK? Or even an explanation of all of the options for *configKey* under getConfigValue? Here's the code I'm trying:

    require(["underscore","jquery","splunkjs/mvc","splunkjs/mvc/utils","splunkjs/mvc/simplexml/ready!"],
        function(_, $, mvc, utils) {
            var username = Splunk.util.getConfigValue("USERNAME").toLowerCase();
            var email = Splunk.util.getConfigValue("email");
    });
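One alternative sketch, assuming the dashboard is allowed to call splunkd's REST API: the authentication/current-context endpoint returns the logged-in user's settings, including an email field, so the SplunkJS service object can fetch it instead of getConfigValue:

    require([
        "jquery",
        "splunkjs/mvc",
        "splunkjs/mvc/simplexml/ready!"
    ], function($, mvc) {
        var service = mvc.createService();
        // assumption: the email is exposed as entry[0].content.email for the current user
        service.get("/services/authentication/current-context", {output_mode: "json"}, function(err, response) {
            if (err) { console.error(err); return; }
            var content = response.data.entry[0].content;
            var username = content.username;
            var email = content.email;
            console.log(username, email);
        });
    });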

How to get the sum according to field values?

Hi guys, I am counting the number of events by the field "LOCATION". This field has 4 locations: A, B, C, and D. I need to get the count of events from locations A, B, and C and name it "Position 1", then the events from location D as "Position 2". After that, I need to get the percentage of events in Position 1 (A, B, C) versus Position 2 (D). How can I do that? Thank you!
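A minimal sketch, assuming the LOCATION values are literally A, B, C, and D (swap the real values into the case() comparisons):

    your search here
    | eval Position=case(LOCATION="A" OR LOCATION="B" OR LOCATION="C", "Position 1", LOCATION="D", "Position 2")
    | stats count by Position
    | eventstats sum(count) as total
    | eval percent=round(count*100/total, 2)
    | fields Position count percent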

Where can I find when I added or deleted a license?

I have some licenses, and I may have added and deleted one of them a few months ago. Now I forget when I deleted it. How can I find out? I searched index=_internal, but found nothing.
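A hedged starting point: license adds and removals go through the licenser REST endpoints, and splunkd records every REST call in splunkd_access.log, so a search like the one below may show them. Note that _internal typically keeps only about 30 days of data by default, so changes from several months ago may already have aged out:

    index=_internal sourcetype=splunkd_access "licenser/licenses" (POST OR DELETE)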

Need help in trending chart with one single line

Hi, when I select a value from the filter that has both true and false values, I get trend lines for both. But when I select a value that has only false values and 0 true values, I do not get any lines at all. My requirement: when I select a value that has only true values, there should be a single trend line showing the true values. Please help me out with this.
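A hedged sketch of one way to keep a line on the chart even when one outcome has zero events, assuming a field named result with the values true/false (swap in your real field and values); count(eval(...)) produces 0 for empty buckets instead of dropping the series entirely:

    your search here
    | timechart span=1h count(eval(result="true")) as true_count count(eval(result="false")) as false_count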

help regex

I have logs like these:

    00:00:47: %LINK-3-UPDOWN: Interface GigabitEthernet0/1, changed state to down
    00:00:48: %LINEPROTO-5-UPDOWN: Line protocol on Interface Vlan3, changed state to up
    00:00:48: %LINEPROTO-5-UPDOWN: Line protocol on Interface GigabitEthernet0/5, changed state to down

How can I extract severity and interface? TIA
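A hedged rex sketch that matches the three sample lines above (the field names severity and interface are arbitrary):

    your search here
    | rex "%\w+-(?<severity>\d+)-\w+:.*Interface (?<interface>[^\s,]+)"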

Understanding command in search

**Hello everyone, I need help understanding these search commands. I tried to read the documentation and still did not understand. I would be happy to receive an explanation and not a link to study commands. The commands are:**

    sourcetype=cisco_wsa_squid BLOCK | stats values(x_webroot_threat_name) as "Threat Name"

    sourcetype=access_combined | stats count(action) as "Total Events" avg(price) as "Average Price" sum(price) as "Total Amount" by action | rename action as Actio

**Thank you!**

Timechart to start from the most recent time where a condition is met

Hi all I have "my search | timechart avg(Throughput) span=5m by id". For each id, the throughput fluctuates and and drop to 0 several times. I want to show the user the throughput from the most recent time + 1hour earlier. I do not need to show the older events where Throughput=0. Thank you in advance for your suggestions.

Not getting data from Heavy Forwarder

Hello, we recently deployed Splunk Enterprise. Our goal is to monitor Wi-Fi usage. Our Wi-Fi devices send log data to a syslog server; on the syslog server I have installed an HF and configured all the required settings, but unfortunately I am not seeing any data flow to the Splunk indexer.

Configuration:

**Heavy Forwarder**

outputs.conf:

    [tcpout:group1]
    server = X.X.X.X:9997

    [tcpout]
    indexAndForward = true

inputs.conf:

    [monitor:///var/log/messages]
    sourcetype = cisco:ise:syslog

**Splunk Enterprise**

Enabled receiving on port 9997.

inputs.conf:

    [default]
    host = splunk server hostname

    [splunktcp://9997]
    disabled = 0

The firewall has been adjusted not to block traffic on the port. Ping and telnet tests are both successful, but I am not sure why I am not able to see data. Kindly let me know suggestions to fix the issue. Regards, MC
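A hedged outputs.conf sketch for comparison, with an explicit defaultGroup tying the HF to the tcpout group (indexAndForward = true is only needed if the heavy forwarder should also keep a local indexed copy). After a restart, `splunk list forward-server` on the HF should show the indexer under the active forwards:

    # outputs.conf on the heavy forwarder
    [tcpout]
    defaultGroup = group1

    [tcpout:group1]
    server = X.X.X.X:9997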

I am not receiving the logs of my Linux machine

I want to receive the logs of a Linux machine that has a UF installed, on my Windows machine which has Splunk Enterprise (free) with a domain account. I edited inputs.conf and outputs.conf as follows. Suppose the IP of my Linux machine is 192.168.5.007 and the Windows receiver is 192.168.2.047:9997.

**In inputs.conf of the Splunk forwarder**

    [monitor:///path.../myfile]
    index = INDEX_NAME
    host = 192.168.5.007
    sourcetype = linux:log

**outputs.conf**

    [tcpout-server://192.168.2.047:9997]
    compressed = false

**In inputs.conf of Splunk Enterprise on Windows**

    [splunktcp://9997]
    disabled = 0

Can someone tell me whether I have done everything right or whether I have to change anything? I edited the inputs.conf and outputs.conf under system\local.
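A hedged sketch of the outputs.conf shape that is usually expected on a universal forwarder: a [tcpout] stanza naming a default group plus a [tcpout:<group>] stanza listing the receiver, rather than only a [tcpout-server://...] stanza. Also make sure the index named in inputs.conf (INDEX_NAME) actually exists on the indexer:

    # outputs.conf on the Linux universal forwarder
    [tcpout]
    defaultGroup = default-autolb-group

    [tcpout:default-autolb-group]
    server = 192.168.2.047:9997
    compressed = false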

my search string is truncated after a question mark in a custom drilldown search

My search string is truncated after a question mark in a custom drilldown search. I have a statistics table in a dashboard, and when I click a row of that table, I have a custom search: I go to Edit Drilldown -> Link to search and choose Custom. The problem is that I am using a regular expression in my custom search, and when I click the table row it takes me to the new custom search, but everything after the regular expression's question mark is truncated. Here is my custom search:

    host="myhost" field1=$row.id$ | rex "\|mynewField:(?.*)\|"

However, when I click and go to the custom search, the search that appears is:

    host="myhost" field1=$row.id$ | rex "\|mynewField:(

So basically the question mark and everything after it is truncated, and I get an "Unbalanced quotes" error.

servers-attribute of distsearch.conf not visible

Hello, I need a small clarification about distsearch.conf. As per the documentation, to connect the SH with an indexer, one can configure the **SH** in any of 3 ways: **CLI**, **GUI**, or the **conf** file. The doc describes this nicely, thanks for that. In my case, the Splunk environment was already set up in my organisation, so I am not aware which way was used to add the search peer to the search head. In the **SH GUI**, the Settings --> Distributed search --> Search peers server entry is visible, and the SH fetches data from the indexer fine. But my problem is that I am not able to find out which conf file that server setting is stored in. I tried to locate distsearch.conf inside the whole Splunk directory, but I could not find the server setting anywhere. I also tried to debug with the **btool** command on the SH and was surprised to see that even there the servers setting is not visible. **Summarizing the problem:** the setting is visible in the GUI, but I have no clue which conf file it is stored in.
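A hedged way to track it down: peers added through the GUI or CLI normally land in distsearch.conf under a [distributedSearch] stanza (often in $SPLUNK_HOME/etc/system/local/), and btool with --debug prints the file each value comes from:

    $SPLUNK_HOME/bin/splunk btool distsearch list --debug

    # the kind of stanza it should report, wherever it lives:
    [distributedSearch]
    servers = https://indexer1:8089,https://indexer2:8089

If btool shows nothing at all, one possibility is that the search head is attached to an indexer cluster, in which case the peers are handed out by the cluster master rather than stored in distsearch.conf.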

Parsing a field in csv having key value pairs and displaying it as header and value in table

We are trying to parse an entry in a csv which is in the format below:

    2018-01-11 00:00:00,default.MS_2016,shekhar.com,Forward-Mapping,No,,10.102.31.216,1,0,2,0,0,0,1,0,0,0,0,1,0,0,1,1,0,1,0,8,0,0,hinf=10|cert=15|ipsec=80|

The value in the last column needs to be parsed and displayed as separate columns with the corresponding values. For example, for the entry "hinf=10|cert=15|ipsec=80|" the table needs to display:

    hinf  cert  ipsec
    10    15    80

Could anyone please let me know if this is possible?
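A hedged sketch of one search-time approach, assuming the last column is already extracted into a field (called kv_col here, a made-up name): copy it into _raw and let the extract command split on | and =:

    your search here
    | eval _raw=kv_col
    | extract pairdelim="|" kvdelim="="
    | table hinf cert ipsec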

Unable to initialize modular input "mqinput" defined inside the app "mq_ta": Introspecting scheme=mqinput: script running failed (exited with code 1)

I am getting the error below: Unable to initialize modular input "mqinput" defined inside the app "mq_ta": Introspecting scheme=mqinput: script running failed (exited with code 1)
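A hedged first diagnostic step: the underlying failure from the introspection run is usually written to splunkd.log, so searching _internal for it may reveal the real error (a missing dependency, a bad path, and so on):

    index=_internal source=*splunkd.log* (ExecProcessor OR ModularInputs) mqinput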