Channel: Questions in topic: "splunk-enterprise"

How do you remove varying characters from a string in a field?

Hi again! I need help with removing characters from a string. We have a tool that generates a user field that is typically domain\user. Because the domain is static, I have used replace to fix that: I replace domain\* with * in user. However, sometimes the user is a local account on a workstation, and the "domain" part becomes the computer name, which varies from machine to machine, so my previous trick won't work. How can I strip off varying computer names of different lengths and report only the user (which can also be different lengths)? Examples: computer\user, computer1\user, computer2\user1, comp\us1. All I want is what is on the right side of the \. Thanks!
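One way to do this, sketched under the assumption that the field is literally named user: split on the backslash and keep the last piece, which also leaves values that contain no backslash untouched.

... | eval user=mvindex(split(user, "\\"), -1)

A rex extraction of everything after the last \ works too, though the backslash has to be escaped inside the quoted regex.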

Splunk App for AWS - CloudWatch Alarms Only

Hi, I've been reading the docs but can't find what I'm looking for. We have the Splunk App for AWS; is there a way to receive just CloudWatch alarms in Splunk? We do not want to ingest all of the CloudWatch metrics. Thanks! Chris

How do you calculate the mean of a timewrap series?

I am trying to write a query that counts the number of errors for the last 5 minutes and then compares that error total against the average for the previous 15 minutes, triggering an alert if series s0 is greater than the mean. I found part of the solution on Splunk Answers ([https://answers.splunk.com/answers/151921/how-to-set-up-alert-when-error-count-of-latest-week-is-greater-than-average-of-all-weeks-in-past-30-days.html][1]), but my eval expression to calculate the mean field does not seem to work; the field is just empty. What am I not doing right here? The query is below:

index=cisco_uc sourcetype=RCD
| where like(Variable10, "Tx|%")
| bucket _time span=5m
| stats latest(Variable10) as Variable10 by _time Variable2
| timechart span=5m count(eval(like(Variable10, "%|U%"))) as U_Count
| timewrap 5min series=short
| eval mean=(U_Count_s1 + U_Count_s2 + U_Count_s3)/3
| where U_Count_s0 > mean

  [1]: https://answers.splunk.com/answers/151921/how-to-set-up-alert-when-error-count-of-latest-week-is-greater-than-average-of-all-weeks-in-past-30-days.html
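For comparison, here is an alternative sketch of the same idea that avoids the per-series column names entirely, using a trailing streamstats window over the 5-minute buckets (index, sourcetype and field names are taken from the query above; the window of 3 previous buckets covers the prior 15 minutes):

index=cisco_uc sourcetype=RCD
| where like(Variable10, "Tx|%")
| bucket _time span=5m
| stats count(eval(like(Variable10, "%|U%"))) as U_Count by _time
| streamstats current=f window=3 avg(U_Count) as mean
| where U_Count > mean

For alerting you would typically keep only the most recent bucket, for example by appending | tail 1.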

How do you merge events on common field values?

Basically I have two fields, index and sourcetype:

Index    Sourcetype
index1   sourcetypeA
index2   sourcetypeA

How do I merge the values of index on the common value of sourcetype? Expected result:

Sourcetype    Index
sourcetypeA   index1 index2
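A minimal sketch of the usual approach, assuming the two columns really are fields named index and sourcetype in your results: group by the shared field and collect the distinct values of the other one.

... | stats values(index) as index by sourcetype

If you need the merged values on a single line separated by spaces, a follow-up | eval index=mvjoin(index, " ") flattens the multivalue field.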

How do I append event(s) fields to a separate event based on two timestamps?

Hello, I have two sets of data: Trip Metadata (A) and Individual Trip Coordinates (B).

Set A fields:
- StartTime
- EndTime

Set B fields:
- Coordinates
- Timestamp

How do I append the Set B Coordinates field to events of Set A only if the Set B timestamp falls between the Set A start and end time? Both sets of data live in the same index but with different sourcetypes. Thanks for reading.
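One way to sketch this, assuming an index called trips, sourcetypes called trip_metadata and trip_coords, and StartTime, EndTime and Timestamp all being epoch values (all of these names are placeholders; convert string times with strptime first): run a subsearch per metadata event with the map command and keep only the coordinates whose Timestamp falls inside that trip's window.

index=trips sourcetype=trip_metadata
| map search="search index=trips sourcetype=trip_coords
    | where Timestamp >= $StartTime$ AND Timestamp <= $EndTime$
    | stats values(Coordinates) as Coordinates
    | eval StartTime=$StartTime$, EndTime=$EndTime$" maxsearches=100

map re-runs the inner search once per outer result, substituting the $StartTime$ and $EndTime$ tokens, so it can get expensive if Set A is large.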


How can I perform math calculations within my XML dashboard?

I would like to take a drilldown token created by clicking a bar on a timechart, add 1800 to its value, and use the result in the "latest" tag of the new chart that is generated. Here's what I currently have:
[Simple XML snippet with its tags stripped in the paste; what remains shows a time input defaulting to -24h@h / now, a chart search using $field1.earliest$, $field1.latest$ and accountId=$accountNumber$ | timechart count(), a drilldown based on $click.value$, a note reading 'click.value$ + 500', and a second chart search using $timespan$ and $timeSpan2$.]
I cleared out most of the fluff to make it a bit easier to read through. The $timeSpan$ token works perfectly fine, but how do I add 1800 to $timeSpan$ and set the result as the token $timeSpan2$? Thank you, Jonathan
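For reference, a minimal sketch of this kind of drilldown arithmetic, assuming the clicked value is an epoch time; the token and element names below are illustrative, not recovered from the stripped-out XML above:

<drilldown>
  <!-- keep the clicked value, and also store it shifted by 1800 seconds -->
  <set token="timeSpan">$click.value$</set>
  <eval token="timeSpan2">$click.value$ + 1800</eval>
</drilldown>

The <eval> token action accepts an eval-style expression, so the second chart's <search> can then use <earliest>$timeSpan$</earliest> and <latest>$timeSpan2$</latest>.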

Can you help me parse the timestamp of some Apache access logs I'm manually ingesting?

Hi all, I am trying to manually ingest some Apache access logs and am having issues parsing the timestamp. For some reason, the logs have dashes instead of colons (see example below). I've tried using regex to look for the "- -" before the timestamp, and I've used %Y-%m-%d--%h-%m-%s as the string in strptime(). I have also installed the Apache add-on and it doesn't make a difference.

127.0.0.1 - - 2012-06-27--01-28-51 "GET /mifs/c/i/reg/scep/enroll/local/scep.html?operation=GetCACert&message=MobileIronSCEP HTTP/1.1" 200 1757 "-" "Java/1.6.0_26"
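A minimal props.conf sketch for that layout, assuming a custom sourcetype name (note that hours, minutes and seconds need the upper-case %H, %M and %S conversion specifiers):

[apache_access_dashes]
TIME_PREFIX = \s-\s-\s
TIME_FORMAT = %Y-%m-%d--%H-%M-%S
MAX_TIMESTAMP_LOOKAHEAD = 25

This would go on the instance that first parses the data, or be set on the sourcetype in the Add Data wizard when uploading manually.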

confirmation of the CLONE_SOURCETYPE config

Hello, I need a confirmation of my CLONE_SOURCETYPE configuration. I have the following requirements:

**sourcetype 1: ISP_hanatraces**
Here I would like all the events containing the strings
- csns.admin.AdminConnection
- csns.admin.commands
- alter system alter configuration
to be cloned to the new index mlbso_changelog and the sourcetype ISP_parameterChangelog.

**sourcetype 2: ISP_executed_statements**
Here I would like to copy all the events completely to the new index mlbso_changelog and the sourcetype ISP_DMLchangelog.

My configuration for that would be:

**props.conf**

[ISP_hanatraces]
TRANSFORMS-ISP_parameterChangelog_clone

[ISP_executed_statements]
SHOULD_LINEMERGE = false
LINE_BREAKER = ((?:\r?\n){2,})
TRANSFORMS-ISP_executed_statements_clone

**transforms.conf**

[ISP_parameterChangelog_clone]
CLONE_SOURCETYPE = ISP_parameterChangelog
REGEX = .*(?i)(csns\.admin\.AdminConnection|csns\.admin\.commands|alter system alter configuration)(?-i).*
FORMAT = mlbso_changelog
DEST_KEY = _MetaData:index

[ISP_executed_statements_clone]
CLONE_SOURCETYPE = ISP_DMLchangelog
FORMAT = mlbso_changelog
DEST_KEY = _MetaData:index

Does the above make sense? Also, the original sourcetypes have some sensitive data hashed in props.conf using SEDCMD later in the file, but I would like to clone the data before the hashing happens. Are the cloning and transform rules applied in the order in which they appear in props.conf? If so, that would be fine for me, as the cloning entries above come first. Kind regards, Kamil

Splunk reporting "lookup table does not exist" for one indexer out of 20

Getting the below errors when searching a few different types of indexes:

7 errors occurred while the search was executing. Therefore, search results might be incomplete. Hide errors.
[indexer1] The lookup table 'MSADGroupType' does not exist. It is referenced by configuration 'windows:ad'.
[indexer1] The lookup table 'nix_action_lookup' does not exist. It is referenced by configuration 'aws:ec2:unix'.
[indexer1] The lookup table 'windows_action_lookup' does not exist. It is referenced by configuration 'windows:ad'.
[indexer1] The lookup table 'windows_app_lookup' does not exist. It is referenced by configuration 'windows:ad'.
[indexer1] The lookup table 'windows_audit_changes_lookup' does not exist. It is referenced by configuration 'windows:ad'.
[indexer1] The lookup table 'windows_privilege_lookup' does not exist. It is referenced by configuration 'windows:ad'.
[indexer1] The lookup table 'windows_vendor_info_lookup' does not exist. It is referenced by configuration 'windows:ad'.

We checked the lookups and the lookup table permissions; the permissions are correct and they are set to Global. In the same application we were able to run a | inputlookup and it came back without any errors. We also checked the indexer and it is running fine.
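As a quick cross-check from the search head, the REST endpoint for lookup table files shows which app and sharing level each lookup lives under, which helps confirm that the app containing them is one that gets replicated to the search peers in the knowledge bundle (lookup name taken from the errors above):

| rest /services/data/lookup-table-files
| search title="MSADGroupType*" OR title="windows_*_lookup*"
| table eai:acl.app eai:acl.sharing title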

Could I save the predicted value after using an ML model?

As the title says, I am using the Splunk Machine Learning Toolkit. I'm confused about whether I can save the results of a prediction. Ideally, I would save the results alongside my raw data; that's exactly what I want to do. Besides, could I make predictions while uploading raw data? Thank you.
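A minimal sketch of one way to persist predictions, assuming a model saved as my_model from the MLTK and a summary index called predictions that already exists (both names are placeholders): apply the model in a search and write the scored events out with collect.

index=main sourcetype=sensor_data
| apply my_model
| collect index=predictions

Raw events are immutable once indexed, so the usual pattern is to store predictions in a separate summary index like this, or in a lookup via | outputlookup, rather than modifying the original data.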

Post-process search - No results found

Hi, I have the following code for a post-process search, but it does not show me any results.

[Simple XML snippet with its tags stripped; what remains shows a base search of index=_internal source="*" sourcetype=splunkd_access with the $field1.earliest$ / $field1.latest$ time tokens defaulting to -24h@h and now, and a post-process search of stats count by method.]

Could you help me, please? Regards.
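For reference, a minimal sketch of how a post-process search is wired up in Simple XML, using the searches quoted above (the panel type and token names are assumptions):

<form>
  <fieldset>
    <input type="time" token="field1">
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <search id="base">
    <query>index=_internal source="*" sourcetype=splunkd_access | fields method</query>
    <earliest>$field1.earliest$</earliest>
    <latest>$field1.latest$</latest>
  </search>
  <row>
    <panel>
      <table>
        <search base="base">
          <query>stats count by method</query>
        </search>
      </table>
    </panel>
  </row>
</form>

One common gotcha: a non-transforming base search only passes a limited set of fields to its post-process searches, so adding | fields method (as above) to the base query is often what makes the post-process results appear.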

Granular Deletion of Metrics?

All, how can I delete specific metrics? We have a GDPR concern that is blocking our metrics use cases: the worry is that a metric name itself might contain PAN/PII, so until we have Luhn scanning of metric names and a way to delete individual metrics, we're kind of stalled.

Is it possible to execute a different search query for exporting to CSV?

Hello Splunk Experts, I have a Simple XML dashboard with a simple query:

index="__" sourcetype="__" source="___" DOCUMENT_ID="'$documentId$'" SERVICE_NAME="$serviceName$" SUCCESS_STATE="$successState$"
| eval SERVICE_INPUT = replace(SERVICE_INPUT,"{{","")
| eval SERVICE_INPUT = replace(SERVICE_INPUT,"}}","")
| eval SERVICE_OUTPUT = replace(SERVICE_OUTPUT,"{{","")
| eval SERVICE_OUTPUT = replace(SERVICE_OUTPUT,"}}","")
| eval SIP = SERVICE_INPUT
| eval SOP = SERVICE_OUTPUT
| eval SERVICE_INPUT = substr(SERVICE_INPUT,1,15) + toString("...")
| eval SERVICE_OUTPUT = substr(SERVICE_OUTPUT,1,15) + toString("...")
| table TIME_STAMP, TIME_SPENT, SERVICE_NAME, SUCCESS_STATE, DOCUMENT_ID, SERVICE_INPUT, SERVICE_OUTPUT, TRANSACTION_TYPE, MACHINE_NAME, SIP, SOP

(The panel also uses the $dateFrom.earliest$ / $dateFrom.latest$ time tokens and sets the $row.SIP$ / $row.SOP$ drilldown tokens; those tags were stripped in the paste.)

The values for SERVICE_INPUT and SERVICE_OUTPUT can be quite huge, so I am shortening them for display in the results table. Beneath the main results table I have a drilldown: clicking a particular row prints the complete XML message for SERVICE_INPUT and SERVICE_OUTPUT with the help of the $row.SIP$ and $row.SOP$ token values. Now I want to export the search results to a CSV with each entire row, i.e. the untrimmed SERVICE_INPUT and SERVICE_OUTPUT. I want these columns exported to the CSV: TIME_STAMP, TIME_SPENT, SERVICE_NAME, SUCCESS_STATE, DOCUMENT_ID, SERVICE_INPUT, SERVICE_OUTPUT, TRANSACTION_TYPE, MACHINE_NAME. I also want to omit the SIP and SOP values, as they would be redundant for each row. In short, I need the untrimmed SERVICE_INPUT and SERVICE_OUTPUT to be exported, leaving SIP and SOP out of the export .csv file. How do I go about this? Any help will be highly appreciated. Thanks and Regards, Sabari Nathan Krishnan
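For reference, a sketch of the export-oriented variant of the search above, with the display trimming and the SIP/SOP copies removed (nothing new is introduced here beyond dropping those evals):

index="__" sourcetype="__" source="___" DOCUMENT_ID="'$documentId$'" SERVICE_NAME="$serviceName$" SUCCESS_STATE="$successState$"
| eval SERVICE_INPUT = replace(SERVICE_INPUT,"{{","")
| eval SERVICE_INPUT = replace(SERVICE_INPUT,"}}","")
| eval SERVICE_OUTPUT = replace(SERVICE_OUTPUT,"{{","")
| eval SERVICE_OUTPUT = replace(SERVICE_OUTPUT,"}}","")
| table TIME_STAMP, TIME_SPENT, SERVICE_NAME, SUCCESS_STATE, DOCUMENT_ID, SERVICE_INPUT, SERVICE_OUTPUT, TRANSACTION_TYPE, MACHINE_NAME

The open question in the post is how to attach a second query like this to the dashboard's export action; running it as a separate report or a hidden panel and exporting from there is a workaround rather than a built-in feature.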

Pretty-print an XML value in a table

Hello Splunk Experts, I have a Simple XML dashboard that displays an input XML and an output XML as a couple of columns in the results table, with all whitespace stripped out between the XML tags. I would like to pretty-print the XML contents on the dashboard. I am not in a position to use external applications or external modules for pretty-printing. Is there a way to work around this? Any help would be highly appreciated. Thanks in advance. Thanks and Regards, Sabari Nathan Krishnan

Creating End_Loading_Time

Hi Splunkers, I am facing another problem: the logs I have contain only three fields, Start_Loading_Time, _Event_Reference and Event_Name. An example record with dummy data is shown below:

11:00:31:800,3200,ABCDeposit;11:00:33:940,3201,ABCSelectAmount;11:00:35:320,3202,ABCSelectAccount;11:00:42:670,3203,ABCConfirm;11:00:50:350,3204,ACBSuccessfulEnd
.......
.......
.......

I have used the split function to split the above record on ";", which gives me:

11:00:31:800,3200,ABCDeposit
11:00:33:940,3201,ABCSelectAmount
11:00:35:320,3202,ABCSelectAccount
11:00:42:670,3203,ABCConfirm
11:00:50:350,3204,ACBSuccessfulEnd

I have then used the regex below to capture the two fields I'm after:

(?<Start_Loading_Time>[^\,]+)\,\d*\,(?<Event_Name>\w+[^\n]+)

What I am trying to do is take "11:00:33:940" minus 1 millisecond as the End_Loading_Time for ABCDeposit, while "11:00:33:940" stays the Start_Loading_Time for ABCSelectAmount. Similarly, I want "11:00:35:320" minus 1 millisecond as the End_Loading_Time for ABCSelectAmount, while "11:00:35:320" stays the Start_Loading_Time for ABCSelectAccount, and so on. Any suggestion or help will be much appreciated. Many thanks in advance!
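One way to sketch the next-event lookup, assuming each split record has already become its own result (for example via mvexpand) with Start_Loading_Time and Event_Name extracted as above; the time format string and the 1 ms subtraction are the assumptions here:

| eval start_epoch=strptime(Start_Loading_Time, "%H:%M:%S:%3N")
| sort 0 - start_epoch
| streamstats current=f window=1 last(start_epoch) as next_start_epoch
| eval End_Loading_Time=if(isnotnull(next_start_epoch), strftime(next_start_epoch - 0.001, "%H:%M:%S:%3N"), null())
| sort 0 start_epoch

Sorting newest-first makes streamstats current=f window=1 pick up the chronologically next event's start time for each row; subtracting 0.001 seconds gives the "minus 1 millisecond" end time, and the final event (ACBSuccessfulEnd) is left without an End_Loading_Time.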

rename hostname

Hi, how do I rename a host name in Splunk? I am onboarding a particular syslog feed into Splunk, and I want to rename the hosts: a => b, c => d. How can I achieve this? Thanks, Dinesh
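If a search-time rename is enough, a minimal sketch with eval/case covers the two mappings mentioned above (the host names a, b, c and d are taken straight from the question):

... | eval host=case(host=="a", "b", host=="c", "d", true(), host)

Renaming the host at index time instead would be done with a props.conf/transforms.conf host override on the syslog sourcetype, so the events are stored under the new host value from the start.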

How do you search for hosts that have not checked in within a time frame?

Hello All, I am relatively new to Splunk and need some help with this search query. I have hosts that are required to check in periodically to an external source, and I want to know which hosts have failed to do so in, let's say, the last 24 hours. Here is what I have so far:

sourcetype="web" src_requires_av=true dest=requiredsite.com
| table src_ip, src_nt_host, src_mac, src_bunit
| dedup src_ip, src_mac

This outputs all devices that have successfully checked in, but I want the output to be the devices that have ***not checked in***.
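One common pattern, sketched with the fields from the search above and the assumption that a longer lookback (here 30 days) is an acceptable stand-in for the list of known hosts: take each host's most recent check-in and keep only those whose latest check-in is older than 24 hours. Hosts that have never checked in at all would instead need a comparison against an asset list or lookup.

sourcetype="web" src_requires_av=true dest=requiredsite.com earliest=-30d@d latest=now
| stats max(_time) as last_seen by src_ip, src_nt_host, src_mac, src_bunit
| where last_seen < relative_time(now(), "-24h")
| convert ctime(last_seen)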

Could the receiver automatically make predictions when data is uploaded from the forwarder?

Hi All, I have been using the Splunk Machine Learning Toolkit. I have a trained model, and I can use it to make predictions on new data. I know how this works in Splunk Web, but I don't want to run the prediction manually. Could the prediction happen automatically as the raw data is uploaded into Splunk? If it could work, how? Thank you.
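Predictions in the ML Toolkit happen at search time with the apply command, so the usual way to get them "automatically" is a scheduled search that scores data shortly after it arrives. A savedsearches.conf sketch of that idea follows; the model name my_model, the input search, and the summary index predictions are all assumptions:

[apply_model_every_5_minutes]
search = index=main sourcetype=sensor_data | apply my_model | collect index=predictions
dispatch.earliest_time = -5m@m
dispatch.latest_time = @m
cron_schedule = */5 * * * *
enableSched = 1

The same thing can be set up from the UI by saving the apply search as a scheduled report or alert.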

Could I just use commands to visualize results?

I use Splunk Enterprise. To save an alert together with a chart, I need to know whether a visualization can be produced by commands rather than by clicking the Visualization tab. Alternatively, is there any way to alert a user with a chart without any manual clicking? Thank you.

