Channel: Questions in topic: "splunk-enterprise"

Why is the CPU usage showing 80-100% (falsely)? -- Search help

I am trying to get CPU usage for a specific process on Windows. My search looks like this:

```
host=host1 AND sourcetype="Perfmon:Process" AND counter="% Processor Time" AND process_name="server*"
| table _time, counter, process_name, Value
```

My results show mostly 100 for Value, which is not really true. Windows runs on a VM. The results look like this:

```
2017-09-22T14:40:28.000-0400  % Processor Time  server    100
2017-09-22T14:39:43.000-0400  % Processor Time  server    100
2017-09-22T14:37:28.000-0400  % Processor Time  server    100
2017-09-22T14:32:58.000-0400  % Processor Time  server#1  100
2017-09-22T14:32:13.000-0400  % Processor Time  server    100
2017-09-22T14:38:13.000-0400  % Processor Time  server    100
2017-09-22T14:31:28.000-0400  % Processor Time  server#1  11.30968265
2017-09-22T14:21:43.000-0400  % Processor Time  server    100
2017-09-22T14:18:43.000-0400  % Processor Time  server#1  0.105369743
2017-09-22T14:36:43.000-0400  % Processor Time  server    0.034732856
2017-09-22T14:35:58.000-0400  % Processor Time  server#1  0.14049302
2017-09-22T14:29:13.000-0400  % Processor Time  server    100
2017-09-22T14:28:28.000-0400  % Processor Time  server#1  84.84122861
2017-09-22T14:20:58.000-0400  % Processor Time  server#1  100
2017-09-22T14:16:28.000-0400  % Processor Time  server    100
2017-09-22T14:14:58.000-0400  % Processor Time  server#1  100
```

What should I do? Why is it pulling all the 100s? About 80% of events show 100. Is it an agent configuration issue?
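One thing worth knowing here: Perfmon's per-process "% Processor Time" is computed against a single logical processor, so a multi-threaded process on a multi-core VM can legitimately report values at or near 100 (or above). A minimal sketch that normalizes to a host-relative percentage, assuming the VM has 4 logical processors (substitute your actual vCPU count):

```
host=host1 sourcetype="Perfmon:Process" counter="% Processor Time" process_name="server*"
| eval pct_of_host = Value / 4
| timechart span=5m avg(pct_of_host) by process_name
```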

Search that shows first and last event time + total count of events per user

I have a list of the top 10 users that failed to log in to a site, and I want to take the events related to those ten users and get a readout of:

- Time of first event
- Time of last event
- Total number of events

for each user in that top-ten list. Here is an example of what it would look like on paper:

| user_email | Start | Stop | Total |
|---|---|---|---|
| bob@bob.com | 02/28/17 - 01:16:19:PM | 09/22/17 - 10:36:51:AM | 35 |
| smith@smith.com | 04/1/17 - 05:32:15:PM | 06/26/17 - 11:22:06:PM | 7 |

Here is what I have so far; really I am just missing how to get the total number of events per user column:

```
index="test" Event_ID="123456" [search index="test" Event_ID="123456" | top limit=10 user_email | table user_email]
| stats earliest(_time) as start, latest(_time) as stop by user_email
| eval start=strftime(start, "%m/%d/%y - %I:%M:%S:%p")
| eval stop=strftime(stop, "%m/%d/%y - %I:%M:%S:%p")
```
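Adding `count` to the existing `stats` call should produce the missing column. A sketch built directly on the search above:

```
index="test" Event_ID="123456" [search index="test" Event_ID="123456" | top limit=10 user_email | table user_email]
| stats earliest(_time) as start, latest(_time) as stop, count as Total by user_email
| eval start=strftime(start, "%m/%d/%y - %I:%M:%S:%p")
| eval stop=strftime(stop, "%m/%d/%y - %I:%M:%S:%p")
```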

How can I import this CSV file into Splunk?

I have: 1 search head, 1 deployment server, and 4 indexers (non-clustered). This is the raw CSV file; the top line is the header:

```
date,name,capacity,free_capacity,virtual_capacity,used_capacity,real_capacity,overallocation,compression_virtual_capacity,compression_compressed_capacity,compression_uncompressed_capacity
1470207600,myserver,62.00TB,16.67TB,163.02TB,41.80TB,45.24TB,262,86.72TB,34.97TB,69.88TB
1470207600,MigrationPool_8192,0,0,0.00MB,0.00MB,0.00MB,0,0.00MB,0.00MB,0.00MB
1470207600,MigrationPool_512,0,0,0.00MB,0.00MB,0.00MB,0,0.00MB,0.00MB,0.00MB
1470294000,myserver,62.00TB,16.67TB,163.02TB,41.81TB,45.25TB,262,86.72TB,34.99TB,69.88TB
1470294000,MigrationPool_8192,0,0,0.00MB,0.00MB,0.00MB,0,0.00MB,0.00MB,0.00MB
```

I have props.conf and transforms.conf on my UF alongside my inputs.conf.

/opt/splunkforwarder/etc/apps/myapp/local/inputs.conf:

```
[monitor:///usr/local/bin/reports/storage/emc_capacity.out]
disabled = false
index = zz_test
sourcetype = VMAX_capacity

[monitor:///usr/local/bin/reports/storage/tustin_svc_capacity_rpts.out]
disabled = false
index = zz_test
sourcetype = SVC_capacity

[monitor:///usr/local/bin/reports/storage/idc_svc_capacity_rpts.out]
disabled = false
index = zz_test
sourcetype = SVC_capacity
```

/opt/splunkforwarder/etc/apps/myapp/local/props.conf:

```
[VMAX_capacity]
REPORT -VMAX_capacity = VMAX_storage_csv

[SVC_capacity]
REPORT -SVC_capacity = SVC_storage_csv
```

/opt/splunkforwarder/etc/apps/myapp/local/transforms.conf:

```
[SVC_storage_csv]
DELIMS = ","
FIELDS = "date","name","capacity","free_capacity","virtual_capacity","used_capacity","real_capacity","overallocation","compression_virtual_capacity","compression_compressed_capacity","compression_uncompressed_capacity"

[VMAX_storage_csv]
DELIMS = ","
FIELDS = "Date","Array","Useable","Used","UsedPercent","UsedGrowth","Free","Subscribed","SubscribedMax","SubscribedPercent","SubscribedGrowth","Snapshot","compression","ExpansionNeeded"
```

When I run this search on my search head:

```
index=zz_test sourcetype=SVC_capacity
```

the data is not parsed. My questions: do props.conf and transforms.conf need to be on my indexers? On the UF? Do my props.conf and transforms.conf look correct? Any assistance much appreciated.
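A hedged pointer rather than a definitive answer: `REPORT`-based extractions are search-time settings, and search-time configuration is read where the search runs, so the usual placement is on the search head, not on a universal forwarder (which does no parsing). Note also that the setting name takes no space before the hyphen. A minimal sketch of the same stanzas as they would appear on the search head:

```
# props.conf on the search head
[SVC_capacity]
REPORT-SVC_capacity = SVC_storage_csv

# transforms.conf on the search head
[SVC_storage_csv]
DELIMS = ","
FIELDS = "date","name","capacity","free_capacity","virtual_capacity","used_capacity","real_capacity","overallocation","compression_virtual_capacity","compression_compressed_capacity","compression_uncompressed_capacity"
```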

Why can't I start the Splunk Forwarder on a local computer? (Error 1069, login failure)

Splunk version 6.2.9.276372. Windows could not start the SplunkForwarder service on the local computer:

```
Error 1069: The service did not start due to a logon failure.
```
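For context, error 1069 is Windows reporting that the account the service is configured to run as failed to log on — typically an expired or changed password, or a missing "Log on as a service" right. A hedged sketch for inspecting the configured logon account from an elevated prompt (the service name is taken from the error above):

```
sc qc SplunkForwarder
```

Re-entering the account's password on the Log On tab of the service's properties in services.msc is the usual remedy.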

My case statement is putting events in the "other" category -- why?

Hi guys, I have a user_agent and a url field from an ELB log file. I am checking the user_agent field for values that contain Googlebot and Bingbot. If the user_agent field has either of these values, I want them displayed in the results as google_bot and bing_bot; otherwise, the events that don't match either condition should fall under the "other" category. My problem is that when the search finalizes, it ends up putting every event in the "other" category, but while the search is running, it splits them into the proper categories I want (google_bot, bing_bot, other). I don't understand why my case statement defaults to putting every event in the "other" category. I have tried multiple commands, shown below, but they all end with the same result, where every event is placed in the "other" category. Can anyone help me understand why it is doing this?

```
index=vgl | eval bot=coalesce(case(user_agent LIKE "%google%", "google_bot", user_agent LIKE "%bing%", "bing_bot"), "other") | chart count(bot) AS count_bot by url, bot usenull=false

index=vgl | eval bot=case(user_agent LIKE "%google%", "google_bot", user_agent LIKE "%bing%", "bing_bot", True(), "other") | chart count(bot) AS count_bot by url, bot usenull=false

index=vgl | eval bot=case(user_agent LIKE "%google%", "google_bot", user_agent LIKE "%bing%", "bing_bot", 1=1, "other") | chart count(bot) AS count_bot by url, bot usenull=false
```

**This is how I want my results:**

![alt text][1]

**This is by the time it finalizes the job:**

![alt text][2]

[1]: /storage/temp/217625-1.jpg
[2]: /storage/temp/217626-2.jpg
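One thing worth checking, offered as a hedged guess rather than a diagnosis: SPL's LIKE operator is case-sensitive, so `"%google%"` never matches a user agent containing `Googlebot`. Lower-casing the field first sidesteps that:

```
index=vgl
| eval bot=case(like(lower(user_agent), "%googlebot%"), "google_bot",
                like(lower(user_agent), "%bingbot%"), "bing_bot",
                true(), "other")
| chart count AS count_bot by url, bot usenull=false
```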

Is Telegram Alert Action app compatible with Splunk v6.5.1?

Hi folks, we have Splunk Enterprise 6.5.1 running in a cluster of three search heads and three indexers. Can we install the Telegram Alert Action app in this environment? Regards, Pedro

Qualys Technology Add-on (TA) for Splunk: How can I get the full knowledge base downloaded from Qualys onto my search heads?

Hi guys, I'm trying to get the full knowledge base downloaded from Qualys onto my search heads. I currently have the "basic" knowledge base downloading fine. However, I don't know which parameters in the Python scripts to change to download the full knowledge base. I've tried changing `detail = Basic` to `All`, and the new full knowledge base is downloaded into the tmp file; however, it's not parsed into the qualys_kb.csv file. I've looked at the logs, and the errors I'm seeing look like this:

```
Exception while parsing. dict contains fields not in fieldnames: 'CVSS_REPORT_CONFIDENCE', 'CVSS_ACCESS', 'CVSS_REMEDIATION_LEVEL', 'CVSS_EXPLOITABILITY', 'CVSS_IMPACT', 'CVSS_AUTHENTICATION' :: Traceback (most recent call last):
  return self.writer.writerow(self._dict_to_list(rowdict))
  File "/opt/splunk/lib/python2.7/csv.py", line 148, in _dict_to_list
ValueError: dict contains fields not in fieldnames: 'CVSS_REPORT_CONFIDENCE', 'CVSS_ACCESS', 'CVSS_REMEDIATION_LEVEL', 'CVSS_EXPLOITABILITY', 'CVSS_IMPACT', 'CVSS_AUTHENTICATION'

TA-QualysCloudPlatform: 2017-09-22T13:18:23Z PID=1552 [MainThread] DEBUG: TA-QualysCloudPlatform [knowledge_base] - Exception while parsing. dict contains fields not in fieldnames: 'CVSS_AUTHENTICATION', 'CVSS_REMEDIATION_LEVEL', 'CVSS_IMPACT', 'CVSS_REPORT_CONFIDENCE', 'CVSS_EXPLOITABILITY', 'CVSS_ACCESS' :: Traceback (most recent call last):
  return self.writer.writerow(self._dict_to_list(rowdict))
  File "/opt/splunk/lib/python2.7/csv.py", line 148, in _dict_to_list
ValueError: dict contains fields not in fieldnames: 'CVSS_AUTHENTICATION', 'CVSS_REMEDIATION_LEVEL', 'CVSS_IMPACT', 'CVSS_REPORT_CONFIDENCE', 'CVSS_EXPLOITABILITY', 'CVSS_ACCESS'
```

I've looked at the full knowledge base XML file in the temp directory, and it does not contain any of the field names mentioned above (nor does the basic version), so I'm unsure where they're coming from. Any Python wizards out there who can help me find what I need to change in the script for the full pull? I've not explained it too well, so if anyone needs more info, I'll try to explain it a bit better! Cheers!
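For background on the error itself: Python's `csv.DictWriter` raises exactly this `ValueError` when a row dict contains keys that are not in the `fieldnames` list it was constructed with. The two standard remedies are adding the missing keys to `fieldnames` or constructing the writer with `extrasaction='ignore'` so unknown keys are silently dropped. A self-contained sketch — the field names here are illustrative, not the TA's actual column list:

```python
import csv

fieldnames = ['QID', 'TITLE', 'SEVERITY']  # hypothetical subset of the TA's columns
row = {'QID': '105', 'TITLE': 'Example', 'SEVERITY': '3',
       'CVSS_ACCESS': 'NETWORK'}  # extra key that would normally raise ValueError

with open('qualys_kb.csv', 'w') as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames, extrasaction='ignore')
    writer.writeheader()
    writer.writerow(row)  # CVSS_ACCESS is dropped instead of raising
```

Where this maps onto the TA depends on how its parser builds the writer, so treat it as a pointer, not a patch.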

Machine learning toolkit Assistant - Detect numerical outliers - Timechart value by field

I am trying to use the Machine Learning Toolkit assistant for detecting numerical outliers in transaction response time for multiple targets. I want to treat the data set for each target over a period of time separately and apply the algorithm to each set. I am using this query in the assistant:

```
index=dc10 sourcetype=ML | timechart useother=f limit=20 span=10m values(resptime) by name
```

I expect to use the "resptime" field to analyze and split by the "name" field. However, this is not working as I expected: I am getting the values of "name" in the "Field to analyze" drop-down. I can use it against a single target (name) and it works fine. Is there a way to apply the algorithm the way I need? I don't want to write separate queries to create a model for each of the targets.
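One possible reshaping, offered as a sketch: `timechart ... by name` pivots each value of name into its own column, which is why the assistant offers the names themselves as fields to analyze. Keeping name as a row field leaves a single resptime column to analyze (the `avg` here is an assumption; swap in whatever aggregation fits):

```
index=dc10 sourcetype=ML
| bin _time span=10m
| stats avg(resptime) AS resptime by _time, name
```

Whether the assistant can then split the analysis by name depends on the MLTK version, so this may still need a per-target loop.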

Splunk real-time data input from HTML via REST

I've been trying to find a way for Splunk to ingest real-time data, and I came across the REST API input, thinking it could be a solution to my problem. But after I set up a REST API input based on the instructions from Splunk, no data is being added to Splunk. Could anyone tell me what I did wrong? For testing purposes, I used a Wikipedia page as the endpoint URL, and I did not set up any kind of handler. I just want to know whether this REST input can get me any type of information from the site. Someone recommended that I define a custom sourcetype, but I don't know what I should define in it. If this is the way to fix it, can anyone please be specific about what I should put in this custom sourcetype? I'm very new to the Splunk REST API!
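On the custom sourcetype question, a minimal hedged sketch, assuming the endpoint returns JSON (the stanza name is a placeholder):

```
# props.conf
[my_rest_json]
KV_MODE = json
TRUNCATE = 0
```

An HTML page like Wikipedia won't yield structured fields either way, so a JSON-returning endpoint makes a better first test.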

Splunking user-agent string to extract browser with the version

Hi, I would like to extract and show the browser and version from the user-agent string, so as to distinguish the different versions of the same browser being used by the users of our application. Below is the Splunk query currently used to extract the browser details:

```
index="sample_index" sourcetype="log_alias" operation="log-in" AND userAgent!="-" AND userAgent!="Apache" AND userAgent!="Load-weight" AND userAgent!="Java" AND userAgent!="Jakarta Commons-HttpClient"
| stats count(eval(match(userAgent, "Firefox"))) as "Firefox",
        count(eval(match(userAgent, "Chrome"))) as "Chrome",
        count(eval(match(userAgent, "Safari") AND NOT match(userAgent, "Chrome"))) as "Safari",
        count(eval(match(userAgent, "MSIE|Trident"))) as "IE",
        count(eval(NOT match(userAgent, "Chrome|Firefox|Safari|MSIE|Trident"))) as "Other"
```

This query shows the count of users by browser usage. A screenshot of the statistics is shown below:

![alt text][1]

[1]: /storage/temp/216609-screen-shot-2017-09-22-at-32928-pm.png

I would like to extract and break out each individual browser by its version(s). For instance, if two users use two different versions of the Google Chrome browser, that should be extracted separately. Please suggest. Thanks.
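A hedged starting point: pull the version token that follows the browser name with `rex`, then count by browser and version. Real user-agent strings are messy (Chrome UAs also contain "Safari", and IE 11 only advertises "Trident"), so this sketch deliberately handles just the simple cases:

```
index="sample_index" sourcetype="log_alias" operation="log-in" userAgent!="-"
| rex field=userAgent "(?<browser>Firefox|Chrome|MSIE)[/ ](?<version>[\d.]+)"
| eval browser=coalesce(browser, "Other")
| stats count by browser, version
```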

Splunk logging logback appender is not terminating

We followed the steps in the documentation below for "Enable logging to HTTP Event Collector in your Java project": http://dev.splunk.com/view/splunk-logging-java/SP-CAAAE7M

In our testing, we are seeing that the Splunk logger appender does not terminate: our Java program keeps running and never exits. Steps to reproduce:

- Import the attached code into the IntelliJ IDE
- Change the HTTP Event Collector token and URL in logback.xml
- Run the main Java class

Unfortunately, I cannot attach files here due to too few karma points, so here are the files:

**test.java**

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.splunk.logging.*;

public class test {
    public static void main(String[] args) {
        System.out.println("This is first");
        Logger logger = LoggerFactory.getLogger("splunk");
        logger.info("This is a test event for Logback test");
        logger.error("This is a test error for Logback test");
        System.out.println("This is second");
    }
}
```

**logback.xml**

(The XML tags were stripped when this was posted; what survives is the HEC URL http://splunkheavyforwarder.dev.aws.away.black:8088, the token 640D5ED1-4D44-484A-8576-F8BA16746954, a `true` flag, and two layout patterns of the form `%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n`.)

Let me know in case you need additional information.
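A hedged suggestion: the HEC appender runs worker threads that can keep the JVM alive unless logback is shut down. Stopping the LoggerContext at the end of main is the standard logback shutdown call; it flushes appenders and stops their threads:

```java
import org.slf4j.LoggerFactory;
import ch.qos.logback.classic.LoggerContext;

// ... at the end of main(), after the last log call:
LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
context.stop();  // flushes appenders and stops worker threads so the JVM can exit
```

Whether the Splunk appender's threads are daemon threads may vary by version, so this is offered as the usual fix rather than a guarantee.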

Search using metadata returns different results

I'm getting different search results for the metadata I added to my log events. What did I misconfigure?

Added to inputs.conf on the forwarder:

```
_meta = datacenter::aws
```

Added to fields.conf on the forwarder:

```
[datacenter]
INDEXED = true
```

Returns very few results: `datacenter=aws`

Returns all results: `datacenter::aws`
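A hedged pointer: `field::value` queries the indexed field directly, while `datacenter=aws` is interpreted at search time, which requires the searching tier to know the field is indexed. fields.conf is consulted where the search runs, so the usual fix is to deploy the same stanza to the search head (and any search peers), not just the forwarder:

```
# fields.conf on the search head
[datacenter]
INDEXED = true
```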

Why does search not find data when using wild card in middle of search term?

I have JSON data which is indexed and can be searched. This is an example of the data:

```
Product: {
    BottleSizeMls: 750mls
    BottleSizeName: Bottle
    Id: 0
    Notes: null
    Title: MOSS WOOD Ribbon Vale Merlot, Margaret River 2013
    Winery: null
}
```

I have four searches; the first three work and the last one does not:

```
Product.Title="MOSS WOOD Ribbon Vale Merlot, Margaret River*"
Product.Title="MOSS WOOD Ribbon Vale Merlot, Margaret River 2013"
Product.Title="MOSS WOOD Ribbon Vale Merlot, Margaret River*" Product.Title="*2013"
Product.Title="MOSS WOOD Ribbon Vale Merlot, Margaret River*2013"
```

I need to use the wildcard because the third-party data is inconsistent and sometimes comes with extra words before the year rather than just a single space. This happens for only one or two different wines and works in 99.9% of cases. I have checked the original JSON, and there is only a single space in the source data. Any thoughts on how to diagnose?
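For context, a hedged explanation: wildcards in the middle of a term interact badly with how Splunk segments indexed terms, so `River*2013` can fail even when the data matches. A common workaround is to narrow with an indexable prefix and post-filter with `where like()` — note the single quotes required around a field name containing a dot:

```
Product.Title="MOSS WOOD Ribbon Vale Merlot, Margaret River*"
| where like('Product.Title', "MOSS WOOD Ribbon Vale Merlot, Margaret River%2013")
```

`%` is like()'s any-characters wildcard, so this keeps events with anything between "River" and "2013".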

Arduino with Splunk

Hello, we are trying to use Splunk with an Arduino over Wi-Fi (ESP8266). We used port 8088 and the code below, but it does not recognize this port and sends to port 443 instead. We have some problems and can't connect the two together. Does anyone have an example of how we can connect them? We are trying with this code (comments translated from Portuguese; the first `#include` header name was stripped in the original post and is restored from its comment):

```cpp
/* Temperature meter with LM35, ESP8266 and Thingspeak
 *
 * Reads the temperature from the LM35 sensor and sends it to the internet
 * via the Thingspeak cloud platform.
 *
 * library modified 23 Mar 2016 on Arduino IDE 1.6.7
 * by Cléber Werlang (http://iot4nerds.wordpress.com/)
 */
#include <ESP8266WiFi.h>  // ESP8266 library
#include "Limits.h"

String apiKey = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"; // put the token here
const char* ssid = "Renato iPhone 5s";  // Wi-Fi SSID
const char* password = "xxxxxxxxxxx";   // Wi-Fi password
const char* server = "nnn,nnn,nn,172";

const int sensorTemp = A0;     // analog pin the temperature sensor is connected to
int valorSensorTemp = 0;       // holds the value read from the temperature sensor
int menorValorTemp = INT_MAX;  // holds the lowest temperature reading

WiFiClient client;

void setup() {
    Serial.begin(115200);
    delay(10);
    WiFi.begin(ssid, password);
    Serial.println();
    Serial.println();
    Serial.print("Connecting to ");
    Serial.println(ssid);
    WiFi.begin(ssid, password);
    while (WiFi.status() != WL_CONNECTED) {
        delay(100);
        Serial.print(".");
    }
    Serial.println("");
    Serial.println("Connected!");
}

void loop() {
    Serial.println("Reading temperature!");
    // To smooth out the LM35's large reading variations, take 8 readings
    // and keep the lowest one.
    menorValorTemp = INT_MAX;  // start with the largest possible int
    for (int i = 1; i <= 8; i++) {
        // Read the temperature sensor.
        valorSensorTemp = analogRead(sensorTemp);
        // Convert the raw reading to approximate degrees Celsius.
        valorSensorTemp *= 0.54;
        // Keep the lowest temperature read.
        if (valorSensorTemp < menorValorTemp) {
            menorValorTemp = valorSensorTemp;
        }
        delay(150);
    }
    String postStr = apiKey;
    postStr += "&field1=";
    postStr += String(menorValorTemp);
    postStr += "\r\n\r\n";
    client.connect("https://nnn,nnn,nn,172", 8088);
    client.println("POST /1/inputs/http?index=&sourcetype=arduino&host=arduino");
    client.println("Authorization: Splunk ");
    // client.print("Authorization: Splunk " + apiKey);
    client.print(postStr.length());
    client.println("Content-Type: text/plain");
    client.println("Connection: close");
    client.println("");
    client.println(postStr);
    Serial.println("");
    Serial.print("Temperature: ");  // print the temperature on the serial console
    Serial.print(menorValorTemp);
    Serial.println("C");
    Serial.println("Sending to ThingSpeak");
    client.stop();
    delay(14000);  // ThingSpeak enforces a 14-second delay between updates.
    Serial.println("Sent!");
    Serial.println("");
}
```
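For comparison, a hedged sketch of what a plain HTTP Event Collector POST usually looks like from a WiFiClient. The host and token below are placeholders, and this assumes HEC is listening on 8088 with SSL disabled — `WiFiClient` speaks plain TCP, so an https:// endpoint would need `WiFiClientSecure` instead:

```cpp
// Hedged sketch: POST one JSON event to Splunk HEC over plain HTTP.
String body = "{\"event\": {\"temperature\": " + String(menorValorTemp) + "}}";
if (client.connect("splunk.example.com", 8088)) {          // host and port, not a URL string
    client.println("POST /services/collector HTTP/1.1");   // standard HEC endpoint
    client.println("Host: splunk.example.com");
    client.println("Authorization: Splunk " + apiKey);     // token goes in this header
    client.println("Content-Type: application/json");
    client.print("Content-Length: ");
    client.println(body.length());
    client.println("Connection: close");
    client.println();
    client.println(body);
}
```

Note that `client.connect()` takes a hostname or IP plus a port; passing a string like "https://..." as the host is one likely reason the request never reaches port 8088.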

Filter event data using a conditional regex

Hi all, below is my raw event data:

```
{"FormatVersion":"1.1","StartTime":"2017-09-22T01:11:38.565Z","EndTime":"2017-09-22T01:11:39.468Z","EventType":"Login","Result":"Success","UserId":"dmorand","TerminalId":"172.16.3.85","SessionId":"RCIcAM1DxUYmG7WMDMkEuQXyGTpOqcBMtyrGOPpFUPU=","LoginUri":"/login-auth/saml","EventSource":"Platform","ServerHostname":"fe02.hbc.stage.us-west-2.orionsaas"}
```

I want events indexed according to these conditions:

1. If "EventType":"Login" and "LoginUri":"/login-auth/saml", index the event. In other words, discard events where EventType is Login and LoginUri is not /login-auth/saml.
2. If "EventType":"Login" and "LoginUri" is not present, index the event.
3. If "EventType" is not Login, index the event.

Please help with making this regex.
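A hedged sketch of the usual index-time filtering pattern: props.conf routes matching events to the null queue. The sourcetype name is a placeholder, and the regex assumes LoginUri always appears after EventType, as in the sample event:

```
# props.conf
[my_json_events]
TRANSFORMS-drop_nonsaml_logins = drop_nonsaml_logins

# transforms.conf
[drop_nonsaml_logins]
# Matches Login events whose LoginUri is present but is NOT /login-auth/saml;
# matching events are discarded. Non-Login events and Login events without
# a LoginUri never match, so they are indexed.
REGEX = "EventType":"Login".*"LoginUri":"(?!/login-auth/saml")
DEST_KEY = queue
FORMAT = nullQueue
```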

Configuration changes alert.

This query relates to changes to .conf files by any user:

```
index=_audit sourcetype=audittrail *.conf NOT (action=search OR action=modified) | table _time user file_name action modtime
```

I am getting results, but not the users, which is the whole purpose of this search. The users listed in the results are -1, 0, and n/a. Is anything wrong with my Splunk configuration, or does Splunk actually not capture the usernames?

Search results for specific users in the lookup field

Hi all, I am trying to list activity where local admin rights were granted by accounts other than the authorized user accounts. The authorized user accounts are in a lookup table called "ITSD.csv", and I am running the query below:

```
index=winendpoint EventCode=4732 Group_Name="Administrators" [| inputlookup ITSD.csv | table User]
```

This is not giving me any results, even though there are events of local admin rights being granted by users in the list. Can somebody help?
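Two hedged observations: first, a subsearch's returned field names must match the field names in the events, so if the events carry the account in a field other than User, the lookup column needs renaming; second, since the goal is grants by accounts *other than* the authorized list, the subsearch belongs under a NOT. A sketch, assuming the relevant event field is Account_Name (a guess — check the actual field in your EventCode 4732 events):

```
index=winendpoint EventCode=4732 Group_Name="Administrators"
    NOT [| inputlookup ITSD.csv | fields User | rename User AS Account_Name]
```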

Splunk Visualization based on mouse cursor

Hello all, I have a dashboard with the visualizations below:

![alt text][1]

Now suppose I place the mouse cursor on user 1, "alchen", in the first visualization. I want it to highlight the events for the same user in the second visualization as well. Can this be done in Splunk? I have no idea whether such a feature exists. Kindly help.

Regards, Shailendra Patil

[1]: /storage/temp/216614-screen-shot-2017-09-23-at-55131-pm.png
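Simple XML has no hover-linking between panels as far as I know, but click-based linking via drilldown tokens is standard and gets close. A hedged sketch (field names and the search are placeholders): the first panel sets a token on click, and the second panel's search references it:

```xml
<!-- first panel: set a token when a value is clicked -->
<drilldown>
  <set token="selected_user">$click.value$</set>
</drilldown>

<!-- second panel: filter its search by the token -->
<search>
  <query>index=myindex user="$selected_user$" | table _time user action</query>
</search>
```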

Field transformations using the Splunk UI

Team, I need help defining 3 new fields using the Splunk user interface:

1. Decision=Agree: the field name should be "Decision" and the matching value is "Agree".
2. A field named "Time", in timestamp format (date and HH:MM:SS).
3. SourceIP.

Any help is greatly appreciated.
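Without seeing the raw events, any extraction is a guess, but here is a hedged inline sketch of the same three fields using rex/eval — the patterns assume key=value text and a dotted-quad IP somewhere in the event:

```
... | rex "Decision=(?<Decision>\w+)"
    | rex "(?<SourceIP>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
    | eval Time=strftime(_time, "%Y-%m-%d %H:%M:%S")
```

The Field Extractor UI (Settings > Fields > Field extractions, or the "Extract New Fields" workflow) can generate equivalent regexes interactively.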

Left Join Not Returning All Fields

So, as an example:

**Primary table**

| Customer |
|---|
| 1 |
| 2 |
| 3 |

**Secondary table**

| Customer | Spend |
|---|---|
| 1 | 100 |
| 2 | 200 |
| 3 | 300 |
| 2 | 400 |

Search:

```
index=primary | join Customer [search index=secondary] | table Customer Spend
```

Output:

| Customer | Spend |
|---|---|
| 1 | 100 |
| 2 | 400 |
| 3 | 300 |

It is NOT returning all values for Customer 2. I need ALL values from the secondary table, not just one. Please advise. Thanks!
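A hedged note: `join` keeps only one matching subsearch row per join key by default; `max=0` lifts that limit so all matches come through. A sketch built on the search above:

```
index=primary
| join type=left max=0 Customer [search index=secondary | fields Customer Spend]
| table Customer Spend
```

Where data volumes allow, `stats values(Spend) by Customer` across both indexes is often a more scalable alternative to join.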

