Channel: Questions in topic: "splunk-enterprise"

What's the best approach for migrating on-premises Splunk to the AWS cloud?

Currently we have Splunk search heads (one of them with Enterprise Security), an indexer cluster, a deployment server, and a license master, which we need to migrate from virtual machines and physical boxes to the AWS cloud. What would be the best approach so that the data in the buckets is not lost and all the configurations and apps stay intact?
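One commonly used approach, offered only as a sketch and not necessarily the best fit for your environment: stand up new indexers in AWS, join them to the existing indexer cluster so the buckets replicate across, and decommission the on-premises peers only once replication has caught up; the search heads, deployment server, and license master can then be moved by copying their $SPLUNK_HOME/etc over. The host names, port, and secret below are placeholders.

# On each new AWS indexer: join the existing cluster as a peer.
/opt/splunk/bin/splunk edit cluster-config -mode slave -master_uri https://cluster-master.example.com:8089 -replication_port 9887 -secret <pass4SymmKey>
/opt/splunk/bin/splunk restart

# On each on-premises peer, after the new peers are up and the data has replicated:
# take it offline gracefully so the cluster re-establishes the replication and search factors first.
/opt/splunk/bin/splunk offline --enforce-counts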

Kafka Splunk Sink connector: All channels have back pressure

Hi, I have a Kafka cluster running, and periodically, the active controller fails. This causes issues with the Splunk sink connector and therefore stops the process of streaming audit data from Cloudera to Splunk. The error I'm getting over and over is:

[2019-02-21 14:54:10,672] INFO add 1 failed batches (com.splunk.kafka.connect.SplunkSinkTask:322)
[2019-02-21 14:54:10,672] INFO total failed batches 263 (com.splunk.kafka.connect.SplunkSinkTask:47)
[2019-02-21 14:54:10,672] ERROR failed to send batch (com.splunk.kafka.connect.SplunkSinkTask:261)
com.splunk.hecclient.HecException: All channels have back pressure
    at com.splunk.hecclient.LoadBalancer.send(LoadBalancer.java:62)
    at com.splunk.hecclient.Hec.send(Hec.java:233)
    at com.splunk.kafka.connect.SplunkSinkTask.send(SplunkSinkTask.java:257)
    at com.splunk.kafka.connect.SplunkSinkTask.handleFailedBatches(SplunkSinkTask.java:127)
    at com.splunk.kafka.connect.SplunkSinkTask.put(SplunkSinkTask.java:62)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:495)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:288)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

I have restarted and reset the sink connector configuration. It is still spitting out this error after an hour. NOTE: This setup has been working for almost 2 weeks. It has only started doing this today. It did do this about 5 days ago. It came up with the same error and then corrected itself. Thanks!
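The back-pressure error means every HEC channel the connector load-balances across is reporting back pressure, which usually points at the receiving Splunk/HEC side rather than at the connector itself. As a rough check, assuming the HEC endpoints log to the _internal index as usual (component and field names can vary by Splunk version), searches like these on the Splunk side show whether the HEC inputs are throwing errors or the indexing queues are blocking while the Kafka controller fails over:

index=_internal sourcetype=splunkd component=HttpInputDataHandler
| stats count by host log_level

index=_internal source=*metrics.log* group=queue blocked=true
| stats count by host name

If the queues block at the same times the controller fails, the fix is more likely on the indexing/HEC capacity side than in the connector configuration.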

How do you Enable/Disable (Hide/show) dropdown field based on options in radio buttons?

Scenario: I need to extract records based on 3 different options. 1) Based only on days of the week: the user selects from which day to which day he wants to extract the records (Sunday to Saturday). 2) Based only on time: the user selects from what time to what time he wants to extract the output (e.g. 9PM-7AM, 11AM-6PM, etc.). 3) Based on both: the user selects a day range as well as a time range. Goal: 1) If a user selects the radio option "Only Days", he cannot modify the hour fields, and records are extracted for the days that match the user's choice. 2) If a user selects the radio option "Only Hours", he cannot modify the day fields, and records are extracted for the hour range that matches the user's choice. 3) If a user selects "Both", he can modify both the day and hour fields, and the result is displayed for the appropriate match. [The Simple XML for the form was stripped of its markup when pasted; it contains a radio input with the choices "Only Days", "Only Hours", and "Both (Default)", a -30d@d to now time range, two day dropdowns (Sunday through Saturday), and two hour dropdowns (12 AM through 11 PM).] The "depends" here seems to have no effect at all. Also, do I need to make different query statements for each user option (using "depends" in the element), or is there a more efficient solution?
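Since the form XML did not survive the paste, the following is only a minimal sketch of the usual pattern, with invented token and input names: depends on an input reacts to whether a token is set, so the radio input's <change> handler has to set or unset the controlling tokens, and the dropdowns then hang off those tokens.

<fieldset submitButton="false">
  <input type="radio" token="mode">
    <label>Filter by</label>
    <choice value="days">Only Days</choice>
    <choice value="hours">Only Hours</choice>
    <choice value="both">Both</choice>
    <default>both</default>
    <change>
      <condition value="days">
        <set token="show_days">true</set>
        <unset token="show_hours"></unset>
      </condition>
      <condition value="hours">
        <set token="show_hours">true</set>
        <unset token="show_days"></unset>
      </condition>
      <condition value="both">
        <set token="show_days">true</set>
        <set token="show_hours">true</set>
      </condition>
    </change>
  </input>
  <!-- Each dropdown only appears while its controlling token is set. -->
  <input type="dropdown" token="day_from" depends="$show_days$">
    <label>From day</label>
    <!-- day choices as in your form -->
  </input>
  <input type="dropdown" token="hour_from" depends="$show_hours$">
    <label>From hour</label>
    <!-- hour choices as in your form -->
  </input>
</fieldset>

A single search can usually cover all three modes if the unused tokens fall back to wide-open defaults, rather than maintaining a separate query statement per option.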

How come extra tabs in our drilldown dashboards are opening without apparent cause?

CONTEXT: I have a set of servers divided into several groups. From the top page, a user makes a choice from a dropdown to get to a dashboard with data from all servers in that group, and from the first drilldown, the user can choose (again from a dropdown) a specific server within the group to open a dashboard with logs specific to the chosen server. The server lists and groups are fairly malleable, so I have set each of the top 2 dashboards to populate the dropdowns dynamically from a lookup table. PROBLEM: When I open the top-level dash, 1 to 3 tabs of the 1st drilldown open (in new tabs), and either 3 or 9 of the second drilldown open. This shouldn't happen until a choice is made.
Top board dropdown (the XML markup was lost when pasting; the dropdown is populated by the search below, uses group as both its label and value field, and links to drilldown1?group_tok=$group_tok$):
| inputlookup group_ip_host | dedup group | table group
1st drilldown dropdown (populated by the search below over a -60m@m to now time range, uses host as both its label and value field, and links to drilldown2?host=$host_tok$):
| inputlookup group_ip_host where group=$group_tok$ | table host
Any idea why I'm getting these extra tabs opened without a choice actually being made? And why in batches of 3? I've tried autoRun=true and false, and submitButton=true and false (obviously not at the same time for each), and I get the same behaviour.
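A hedged sketch of one common shape for this kind of input; the lookup, field, and token names come from the question, but the structure is an assumption since the original XML was stripped out. In Simple XML, a <change>/<link> on an input fires whenever the token receives a value, including when a <default>, <initialValue>, or selectFirstChoice populates it as the dashboard loads, and that alone can open drilldown tabs with no user click. Removing any default so the link only fires on an explicit selection is one way to test that theory:

<input type="dropdown" token="group_tok" searchWhenChanged="true">
  <label>Server group</label>
  <fieldForLabel>group</fieldForLabel>
  <fieldForValue>group</fieldForValue>
  <search>
    <query>| inputlookup group_ip_host | dedup group | table group</query>
  </search>
  <!-- No <default> or <initialValue>: the link below should only fire on a real selection. -->
  <change>
    <link>drilldown1?group_tok=$value$</link>
  </change>
</input>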

How do I search for a letter at a specific position?

I'm very new to Splunk and need help with a search. I want to perform a search that shows me the results where the 5th letter of the server name is "p". Is this possible? Thank you!
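Assuming the server name is in the host field (swap in whichever field actually holds it) and with placeholder index and sourcetype values, a minimal sketch that keeps only events whose 5th character is "p", using a regular expression anchored on the first four characters:

index=your_index sourcetype=your_sourcetype
| regex host="^.{4}p"

The like() function is an alternative, with _ matching exactly one character: ... | where like(host, "____p%")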

How do you make a decision based on a row value?

I'm trying to create a traffic-light style dashboard for my applications based on their status and tier level. If any one application's status in a tier is RED, I want the tier to be shown as RED, even though there are other applications in the same tier that are GREEN or AMBER. Can you suggest how my search query should look?
Example data:
SrV | App | Tier  | Status
S1  | A1  | Tier1 | AMBER
S2  | A1  | Tier1 | AMBER
S3  | A2  | Tier2 | AMBER
S4  | A3  | Tier3 | GREEN
S5  | A4  | Tier2 | GREEN
S6  | A2  | Tier2 | AMBER
S7  | A4  | Tier2 | GREEN
S8  | A5  | Tier1 | RED
to something like:
Tier1 Tier2 Tier3
RED   AMBER GREEN
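A minimal sketch of one way to get there, assuming the events already carry the Tier and Status fields shown above: map each status to a numeric severity, keep the worst severity per tier, map it back to a color, and transpose so the tiers become columns.

... your base search ...
| eval sev=case(Status=="RED",3, Status=="AMBER",2, Status=="GREEN",1)
| stats max(sev) as sev by Tier
| eval Status=case(sev==3,"RED", sev==2,"AMBER", sev==1,"GREEN")
| fields Tier Status
| transpose header_field=Tier

Color-based formatting on the resulting single row then gives the red/amber/green view per tier.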

What is the Splunk SPL for grouping similar values and outputting a newly defined value in an additional column?

Hi, I have a field named OS. After running the following SPL, this field is populated with multiple values such as those below:
| inputlookup Host.csv | stats dc(host) as Count by OS | fields - Count
Result: WINDOWS NT, WINDOWS SERVER 2003, WINDOWS SERVER 2008, WINDOWS SERVER 2012, LINUX, LINUX 6.7, LINUX 7.0, SOLARIS 9, SOLARIS 10
I want an additional column in the results so that all the Windows values above display "Windows", all the Linux values display "Linux", and so on, as in the attached screenshot (/storage/temp/269597-spl.png). How? I tried to use eval and case, but it seems like I'm not getting it, or I'm having a long day. Thanks in advance!
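A minimal sketch using eval with case() and match(), assuming the family can be derived from the leading word of each OS value (the OS_Family column name is just an illustration):

| inputlookup Host.csv
| stats dc(host) as Count by OS
| fields - Count
| eval OS_Family=case(match(OS,"(?i)^WINDOWS"),"Windows", match(OS,"(?i)^LINUX"),"Linux", match(OS,"(?i)^SOLARIS"),"Solaris", true(),OS)
| table OS OS_Family

The true(),OS pair at the end keeps any OS value that does not match one of the families instead of dropping it to null.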

In Splunk Enterprise 7.x, how do you get results of a search based on CSV content?

Splunk Enterprise 7.x. I am basically trying to get this to work: https://answers.splunk.com/answers/519950/ho-to-get-search-input-from-csv-file.html I have created a .csv, imported it, and created a lookup definition. I am trying to filter the results of a "*" search for service status so that it only displays the services listed in the .csv:
eventtype=hostmon_winows Type=Service host="SCCM" (Name="*") Startmode="*" State="*" [ | inputlookup SCCMServicesCSV.csv | fields ServiceName ] | dedup host, Name | table host, Name, Startmode, State
This query is from the Windows App. It works just fine when the lookup section isn't included. Can someone tell me what I am doing wrong? Thank you, Ron Jones
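One detail that looks likely, offered as a hedged sketch rather than a confirmed fix: the subsearch returns a field called ServiceName, but the outer search filters on Name, so the terms the subsearch generates never line up with the outer events. Renaming the field inside the subsearch so it matches the outer field is the usual pattern:

eventtype=hostmon_winows Type=Service host="SCCM" Startmode="*" State="*"
    [ | inputlookup SCCMServicesCSV.csv | fields ServiceName | rename ServiceName AS Name ]
| dedup host, Name
| table host, Name, Startmode, State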

Adding a Unix indexer to support apps

Again, new to Splunk. I currently have a single instance of Splunk Enterprise installed on a Win12 R2 server. We would like to use a few apps that either do not work on a Windows installation, or that reportedly run better on a *nix system. My plan is to install Splunk Enterprise on a *nix server as an indexer and install the apps in question on that server. If I do this, will there be two URLs, one for the Windows Splunk and the other for the *nix Splunk, that the users will have to use? Are there any lessons learned I need to be aware of? Again, I appreciate all the assistance thus far; this community has been great. Ron Jones
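Whether users end up with one URL or two depends on how the *nix box is wired in. If it is added to the existing Windows search head as a distributed-search peer, users keep the single Windows URL for searching and only the indexing (and any indexer-side apps) moves to the new box; if the *nix instance is run as its own search head, there would indeed be a second URL. A hedged sketch of the peer approach, with made-up host names and credentials and the default management port 8089:

# Run on the existing Windows search head, pointing at the new *nix indexer.
splunk add search-server https://nix-indexer.example.com:8089 -auth admin:<local_admin_password> -remoteUsername admin -remotePassword <remote_admin_password>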

How do you count files in which multiple fields meet certain conditions?

I have a few files. They all have the same columns and look like this:
timestamp field1 field2 ...
1544079360.84132 99
1544079363.52629 98
1544081067.48075 100
1544081377.48521 100
...
I want to count the files in which both field1 and field2 reached 100 or above. I tried:
... | search field1>=100 AND field2>=100
but it didn't work. I believe it's because there were null values. So I tried filldown:
... | filldown field1, field2
but it's still not working. I also tried eventstats with no luck, and I'd rather not use eventstats anyway, as it gets very slow as the data grows. Any thoughts? Thank you!
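A minimal sketch of the usual pattern, assuming each file corresponds to one value of the source field (swap in whatever field actually identifies the file): collapse to one row per file with the maximum of each field, filter on both maxima, then count the surviving rows.

... your base search ...
| stats max(field1) as max_field1 max(field2) as max_field2 by source
| where max_field1>=100 AND max_field2>=100
| stats count as files_meeting_condition

Because stats collapses the events first, it does not matter that field1 and field2 never appear in the same raw event, which is presumably what tripped up the search and filldown attempts.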

Is Splunk 7.x Fundamentals Part 1 (eLearning) free?

I just finished all the modules and the final quiz. My question is: do I have to pay for the certification for "Splunk 7.x Fundamentals Part 1 (eLearning)", or is it free?

Not prompted for Splunk user upon installation

Hi, I'm trying to migrate my Splunk instance from an AWS VM to a GCP VM. I copied the file splunk-6.5.0-59c8927def0f-Linux-x86_64.tgz over to my new GCP instance and ran the following commands:
sudo tar xvzf splunk-6.5.0-59c8927def0f-Linux-x86_64.tgz -C /opt
sudo useradd splunk -d /opt/splunkforwarder
sudo passwd splunk
sudo chown -R splunk:splunk $SPLUNK_HOME
./splunk start
The install goes fine, except I get this message: "Your license is expired. Please login as an administrator to update the license." I'm not sure what license this is referring to. During the installation I wasn't prompted to create a Splunk user, so I have no login for the Web UI. I tried using this to add a user, but I got blocked because there's no Splunk username to use:
$ ./splunk edit user admin -password mooredna
Splunk username: admin
Password:
Login failed
$ exit
Can someone point out what I'm doing wrong?
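For what it's worth, a hedged sketch based on how 6.5.x behaves: that release does not prompt you to create an admin account at install time; it ships with the default credentials admin / changeme, so the interactive login above most likely failed because a different password was entered at the prompt. Something like this, with the new password as a placeholder (adjust the path if Splunk was extracted somewhere other than /opt/splunk):

# Authenticate with the 6.5.x default credentials, then set a real admin password.
cd /opt/splunk/bin
sudo -u splunk ./splunk edit user admin -password '<new_password>' -auth admin:changeme

The license message is a separate issue: a fresh extract normally starts a 60-day Enterprise trial, so an "expired" message on a brand-new install is worth checking against the system clock or any configuration copied over from the old instance.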

ITSI - Episode Review - 1 KPI

I'm very new to Splunk and ITSI. We have created a service for VMware VMs. The Service has several KPIs like memory and CPU. A few of the VMs have CPUs in Critical status. Episode Review shows 0 episodes. Is it possible to have the specific servers show up in Episode Review?
