Hello,
I am using KNIME to clean data and Splunk for data analytics and visualization.
I would like to connect these tools so that KNIME can act as a data source for Splunk.
I was wondering if there is any solution for this?
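There is no native KNIME-to-Splunk connector that I know of, but one common pattern (an assumption on my part, not something either vendor mandates) is to have the final node of the KNIME workflow POST its rows to Splunk's HTTP Event Collector (HEC); the host, token, index, and sourcetype below are placeholders:

```
POST https://<splunk-host>:8088/services/collector/event
Authorization: Splunk <hec-token>

{"event": {"col1": "value1", "col2": "value2"}, "index": "main", "sourcetype": "knime:csv"}
```

KNIME can issue this from a REST/scripting node; alternatively, KNIME can write CSV files to a directory that a Splunk forwarder monitors.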
↧
How to connect KNIME with Splunk?
↧
What is the process to upload to Splunkbase once an add-on is developed?
Hi,
Once the add-on is developed, can we upload it to Splunkbase right away or is there any process it has to pass?
I understand the certification is not mandatory but is there any other process?
↧
While performing searches, getting warning messages like "usage limit exceeded 500 MB"
While performing searches, we are getting the "usage limit exceeded 500 MB" warning message. To overcome this error, we increased the default srchDiskQuota limit from 500MB to 1000MB for the specific roles in authorize.conf.
But the same error still persists.
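For reference, the change described above would look like the following in authorize.conf (the role name is an example). Since a user's quota comes from their roles, it is worth confirming the affected users actually hold the edited role, that the change was made on the search head(s), and that Splunk was restarted or the config reloaded afterwards:

```
# authorize.conf (role name is an example)
[role_power]
srchDiskQuota = 1000
```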
↧
CA SiteMinder - How to get audit logs into Splunk?
I need to get the SiteMinder audit logs into Splunk. Currently we have them going into an Oracle DB. We want to eliminate the Oracle DB and have the audit logs go directly from SiteMinder into Splunk. Is this possible?
↧
How to forward the entire contents of a CSV file even if it's unchanged daily?
Hello,
I'm attempting to forward a set of .csv files for administrator group auditing. However, it only forwards (or at least the search only returns) changes to the .csv files. For audit reasons, I need the entire contents of each .csv to be ingested, not just the changes.
Is there a way to force the forwarder to ignore the fact it already gathered the data?
thanks,
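A monitor input deliberately skips data it has already seen, so one workaround (a sketch; paths, names, and interval are placeholders) is a scripted input that prints the whole file on a schedule, making Splunk ingest the full contents on every run:

```
# inputs.conf (paths and names are placeholders)
[script:///opt/splunk/etc/apps/my_app/bin/dump_csv.sh]
interval = 86400
sourcetype = admin_group_audit
index = main
```

where dump_csv.sh simply does `cat /path/to/admin_groups/*.csv`. Expect duplicate events across days by design, with license usage growing accordingly.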
↧
How to send production events into dev cluster?
We recently upgraded some of our equipment and decided to move the old equipment into our dev environment to set it up as a separate cluster. What is the best way to route production data over to the dev environment? I am looking for the best option that isn't too demanding on the production environment and won't eat up our license. We were considering setting up Splunk HEC as a possibility. I am just brainstorming ideas currently and am open to all suggestions! Thanks!
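For comparison with HEC, the forwarder-side option (a sketch; group and host names are placeholders) is to list both environments in defaultGroup in outputs.conf, which clones every event to each group. Note the cloned copy is still indexed, and therefore metered, in the dev cluster unless dev runs its own license:

```
# outputs.conf on the production forwarders (names are placeholders)
[tcpout]
defaultGroup = prod_indexers, dev_indexers

[tcpout:prod_indexers]
server = prod-idx1:9997, prod-idx2:9997

[tcpout:dev_indexers]
server = dev-idx1:9997, dev-idx2:9997
```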
↧
How to combine unique values of the field into one?
I am trying to make a report with the unique combinations of ID, AVER SRV & ZONE. However, I am getting lots of duplicate rows because I have multiple values for ZONE. Is there any way I can combine all the ZONE values into one field so I won't have so much duplication?
Currently I am using following query:
* | dedup ID AVER SRV ZONE | fields + ID, SRV, ZONE
Now if the ZONE has multiple values, I am getting multiple entries; instead, I am trying to have one entry with all the different zones combined.
Please advise.
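One way to get a single row per ID/AVER/SRV combination (a sketch against the fields named above) is to aggregate with stats rather than dedup, collecting the zones into one multivalue field:

```
* | stats values(ZONE) as ZONE by ID, AVER, SRV
```

Appending `| eval ZONE=mvjoin(ZONE, ", ")` would turn the multivalue result into a single comma-separated string.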
↧
How can I get my inputlookup to exclude holidays?
Hello I have a search that I use to calculate days between 2 dates.
The search is like this:
|index=dev_tsv "BO Type"="assessments" | rename "BO ID" as id| convert timeformat="%Y-%m-%d %H:%M:%S.%6N" mktime("Step Date Started") AS starttime mktime("Step Date Completed") AS endtime mktime("Step Due Date") AS cumulativeDueDate mktime("Step Actual Due Date") AS actualDueDate
|eval dueDateRange=mvrange(cumulativeDueDate,now(),86400)
|convert ctime(dueDateRange) timeformat="%+"
| eval pastDueDays =mvcount(mvfilter(NOT match(dueDateRange,"(Sun|Sat).*")))
| mvexpand pastDueDays | search NOT [| inputlookup exclude_holidays | eval holiday_date=strptime(holiday_date, "%Y-%m-%d %H:%M:%S") | rename holiday_date as pastDueDays ]
|eval pastDueDays=if(isnull(pastDueDays),"0", pastDueDays)
|dedup "Workflow Name" "Workflow Phase" "Workflow Process Due Date" "Workflow Process Name" "Workflow Process Sort Order" "Workflow Started Date" "Workflow Step Name" "Step Due Date" "Workflow Step Sort Order"
| table "Workflow Name" "Workflow Phase" "Workflow Process Due Date" "Workflow Process Name" "Workflow Process Sort Order" "Workflow Started Date" "Workflow Step Name" "Step Due Date" "Workflow Step Sort Order" pastDueDays
Right now it works, but it does not exclude the exclude_holidays dates from the lookup; it just excludes weekends.
The CSV simply contains holiday_date values like:
holiday_date
2018-01-01 00:00:00
2018-01-15 00:00:00
2018-02-19 00:00:00
2018-05-28 00:00:00
2018-07-04 00:00:00
2018-09-03 00:00:00
2018-10-08 00:00:00
2018-11-12 00:00:00
2018-11-22 00:00:00
2018-12-25 00:00:00
Any ideas what would be preventing me from being able to use the lookup and exclude those dates?
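One likely cause: after the strptime, the subsearch's holiday_date is an epoch timestamp, while pastDueDays at that point holds a count of days, so the NOT filter can never match. A sketch of an alternative (untested; requires mvmap, Splunk 8.0+) that drops holidays from dueDateRange before counting, replacing the mvrange/convert/mvcount steps and keeping dueDateRange as epoch values:

```
| eval dueDateRange=mvrange(cumulativeDueDate, now(), 86400)
| eval holidays=[| inputlookup exclude_holidays
    | eval d=strftime(strptime(holiday_date, "%Y-%m-%d %H:%M:%S"), "%Y-%m-%d")
    | stats values(d) as d
    | eval d="\"" . mvjoin(d, ",") . "\""
    | return $d]
| eval pastDueDays=mvcount(mvmap(dueDateRange,
    if(match(strftime(dueDateRange, "%a"), "Sat|Sun")
       OR like(holidays, "%" . strftime(dueDateRange, "%Y-%m-%d") . "%"),
       null(), dueDateRange)))
```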
↧
What regex command should I use to remove the special characters after a number?
I want to remove the special character after a number, please help.
data:
7.62\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00
I want:
7.62. The number is not constant, it will keep changing, so I need to remove only the special characters.
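A rex extraction (a sketch against the sample above; `data` as the field name is an assumption) keeps only the leading number, whatever its value:

```
... | rex field=data "^(?<number>\d+(\.\d+)?)"
```

Alternatively, a sed-style rex can rewrite the field in place: `| rex field=data mode=sed "s/^(\d+(\.\d+)?).*$/\1/"`.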
↧
Splunk 7.1.2 login and logout options missing
Hi,
I'm unable to see the login button on my Splunk home page (localhost:8000), and the logout button is missing as well.
I'm seeing the below error in messages:
Missing or malformed messages.conf stanza for LM_LICENSE:WARNINGS_GENERATED
Can someone explain what is happening here?
Thanks,
Ramu Chittiprolu
↧
How many pipelines should I use on a forwarder?
I'm trying to figure out how many pipelines to set on my forwarders to maximize the following:
- Throughput
- Data distribution to my indexers
- Resource utilization
What are the things I need to be aware of when adding more pipelines? The default is 1.
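For reference, the setting lives in server.conf on the forwarder (the value 2 here is just an example):

```
# server.conf on the forwarder
[general]
parallelIngestionPipelines = 2
```

Each pipeline carries its own full set of queues and its own outbound connection, so a second pipeline can roughly double throughput and improves spread across indexers, at the cost of roughly one additional CPU core; in practice, going beyond 2 on a universal forwarder rarely pays off unless the host has cores to spare.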
↧
Splunk DB Connect in SHC
Hi,
I have recently installed the DB Connect app in the search head cluster, and the following message is being displayed on the search heads.
```DB Connect is running in SHCluster! Some functions will be disabled under this environment.```
What functions are disabled in the app while running in a search head cluster?
Thanks
↧
Fields in datamodel "ns_waf" don't exist
Hi.
Fields present in the datamodel "ns_waf" don't exist.
Anyone have these fields extracted? nswaf_action, nswaf_appliance, nswaf_company, etc.
This app doesn't have any extractions defined in props.conf.
Regards,
Bruce Campos
↧
Multiple alerts in one query
Please help, I want a query for the below scenario.
Requirement 1:
Check the occurrence of 0 in a 10-minute timeframe.
If the value is continuously 0 for 5 minutes, increment a counter at every occurrence of 0 and send an alert.
When the value > 0, reset the counter.
Requirement 2:
Check if specific logs have not updated for some time and send an alert.
Requirement 3:
Check the occurrence of an event in a 10-minute timeframe and throw an alert at some threshold.
In this case the source file is different.
-----
All of these in one query, and the alert should specify what is wrong.
↧
How can we use multiple AWS accounts in an indexer cluster?
I have two AWS accounts (A and B), and I have installed two instances in each account. Is there a way to connect across AWS accounts? If yes, how?
↧
How can I create an alert for RDP logins without a CyberArk credential check-out?
I am looking for a way to capture events where a user did not check out credentials from CyberArk before using them to RDP. A scenario would be that someone checked out credentials for 12 hours, used them, and then uses them again after 12 hours without doing another checkout.
I have a search that uses the transaction command to capture CyberArk checkout events and Windows login events, looking for incomplete transactions with closed_txn=0. I have a query built, but it does not seem to be capturing the right events: in the results that combine the two logs (CyberArk checkout and Windows login), the account name that was checked out does not match the account name used to log in via RDP.
(index=wineventlog AND sourcetype="WinEventLog:Security" AND (Logon_Type=3 OR Logon_Type=10 OR Logon_Type=11) AND [inputlookup xxx.csv | fields + Account_Name ])
OR (index=main AND sourcetype="cyberark:epv:cef" AND "Retrieve password" AND (cn2="(Action: Show Password)" OR cn2="(Action: Copy Password)"))
| eval "Logon Behaviour"=case(Logon_Type==3,"Interactive",Logon_Type==10,"Remote Interactive",Logon_Type==11,"Cached Interactive")
| eval "Windows Account"=mvindex(Account_Name,1)
| transaction startswith="Retrieve password" endswith="Logon Type" keepevicted=1 keeporphans=1
| search closed_txn=0
| bucket _time span=12h
| stats count by _time "Windows Account" action Workstation_Name "Logon Behaviour"
| rename action as "Logon Result" Workstation_Name as "Workstation"
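One adjustment worth trying (a sketch, untested; it assumes the CyberArk duser and the Windows target Account_Name hold the shared account, per the sample events below): normalize both into one field and give transaction that field as its key, so only events for the same account get paired:

```
| eval acct=lower(coalesce(duser, mvindex(Account_Name, 1)))
| transaction acct startswith="Retrieve password" endswith="Logon Type"
    keepevicted=1 keeporphans=1 maxspan=12h
```

The maxspan=12h mirrors the 12-hour checkout window and keeps unrelated events far apart in time from being glued together.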
Below are a sample checkout event and a sample RDP event. I would appreciate any help in figuring this out or a better way to accomplish something like this.
**Check Out Event**
Aug 16 06:50:53 XXXXXX ABC: 0|Cyber-Ark|Vault|9.80.0000|295|Retrieve password|5|act=Retrieve password suser=user1 fname=fname1 dvc=dvc1 shost=192..x.x.x dhost=dhost1 **duser=duser1** externalId= app= reason= cs1Label="Affected User Name" cs1= cs2Label="Safe Name" cs2=EA_VSA cs3Label="Device Type" cs3=Operating System cs4Label="Database" cs4= cs5Label="Other info" cs5=192.168.122.30 cn1Label="Request Id" cn1= cn2Label="Ticket Id" cn2=(Action: Copy Password)Troubeshooting msg=(Action: Copy Password)Troubeshooting
**RDP event**
LogName=Security
SourceName=Microsoft Windows security auditing.
EventCode=4625
EventType=0
Type=Information
ComputerName=computername1
TaskCategory=Logon
OpCode=Info
RecordNumber=12345678
Keywords=Audit Failure
Message=An account failed to log on.
Subject:
Security ID: NT AUTHORITY\SYSTEM
Account Name: AccountName1
Account Domain: domain1
Logon ID: 0x3E7
Logon Type: 2
Account For Which Logon Failed:
Security ID: NULL SID
**Account Name: duser1**
Account Domain: domain1
Failure Information:
Failure Reason: Unknown user name or bad password.
Status: 0xC000006D
Sub Status: 0xC000006A
Process Information:
Caller Process ID: 0x5f4
Caller Process Name: C:\Program Files\process
Network Information:
Workstation Name: workstation1
Source Network Address: -
Source Port: -
Detailed Authentication Information:
Logon Process: Advapi
Authentication Package: Negotiate
Transited Services: -
Package Name (NTLM only): -
Key Length: 0
↧