I have one data model with acceleration. I am using Splunk version 6.5.3. On my Splunk instance, It is showing 100% built data model but still, I am not getting data with tstats command and summariesonly=true.
This thing is only happening in my particular instance. It's working on other colleagues instance without any issue with same Splunk version.
How to debug this problem? Is there any steps to debug?
↧
facing issue in datamodel summary creation
↧
Can I use the Splunk Supporting Add-on for Active Directory (SA-ldapsearch) to enumerate group membership for a specified user?
I have seen how the Splunk Supporting Add-on for Active Directory (SA-ldapsearch) can give me a list of all groups, and enumerate their membership, to include both nested and direct membership. I have also seen how I can retrieve all users, and the groups which they are a member of.
Does anyone have a search where I can search Active Directory with SA-ldapsearch, specify a user, and enumerate all group membership, to include any inherited groups?
Desired Results:
User | Group | Membership Type
John.Doe | Domain Users | Direct
John.Doe | Accounting | Direct
John.Doe | Finance Dept. | Nested
I believe I had some success achieving this with the data from Active Directory monitoring - however, I'd prefer to use SA-ldapsearch for this.
↧
↧
Need to inventory the large number of searches, queries and reports and dump .csv file.
I need to inventory the large number of searches, queries and reports and dump the details (name), scheduled time, search attributes, owner, email etc, into a .csv file. IS there a quick and easy way to do this?
↧
subsearch truncating the result
I have one query where I want to get the results in terms of installed,uninstalled and exception machines.
Not installed machine details we are getting from a Lookup file,exception machine details getting from ldap search and installed machine details we are getting from dbquery.
When I am joining installed and exceptions machine count with dbquery which is having more than 70K results ..final results getting truncated and I am getting partial result.
| inputlookup newuptodatead.csv
| where strptime(pwdLastSet,"%Y/%m/%d %H:%M:%S %Z")>=relative_time(now(),"-45d@d")
| rex field=distinguishedName "CN=.*?,DC=(?.*?),DC="|search DC=na OR DC=ap OR DC=eu OR DC=sa OR DC=mea
| rex field=distinguishedName "CN=.*?,OU=(?.*?),DC="
| search ADOU="*EngineeringLabs*" AND ADOU!="*Harvester*" AND ADOU!="*Image*" AND operatingSystem!="*Server*" AND operatingSystem="Windows*" AND cn != "*Kratos*" AND cn != "*harv*"
| eval InWBSN="Not Installed"| eval cn=upper(cn)
| join cn type=left [
| ldapsearch domain=x search="(&(objectCategory=group)(cn=WBSN_RM))" attrs="member" | rename _raw as _raw2 | fields member | append [
| ldapsearch domain=x search="(&(objectCategory=group)(cn=wbsn_rm_bsod))" attrs="member" | rename _raw as _raw2 | fields member ] | append [
| ldapsearch domain=x search="(&(objectCategory=group)(cn=wbsn_rm_tomcat))" attrs="member" | rename _raw as _raw2 | fields member ]
| stats values(member) as dn
| rex field=dn "CN=(?.*?),OU="
|table cn
|mvexpand cn
|eval cn=upper(cn)
| eval InWBSN= " Exception"
| sort 0 cn]
| join cn type=left [
| ldapsearch domain=x search="(&(objectCategory=group)(cn=WBSN_RM))" attrs="member" | rename _raw as _raw2 | fields member | append [
| ldapsearch domain=x search="(&(objectCategory=group)(cn=wbsn_rm_bsod))" attrs="member" | rename _raw as _raw2 | fields member ] | append [
| ldapsearch domain=x search="(&(objectCategory=group)(cn=wbsn_rm_tomcat))" attrs="member" | rename _raw as _raw2 | fields member ]
| stats values(member) as dn
| rex field=dn "CN=(?.*?),OU="
|table cn
|mvexpand cn
| eval cn=upper(cn)
|eval InWBSN= " Exception"
| sort 0 -cn]
| join cn type=left
[ | dbquery "AA81" " SELECT [KEY] AS cn, UPDATE_DATE AS _time FROM PA_DYNAMIC_STATUS WHERE UPDATE_DATE >= DATEADD(DAY,-60,GETDATE()) "
| append maxout=0 maxtime=0
[ | dbquery "AA" " SELECT [KEY] AS cn, UPDATE_DATE AS _time FROM PA_DYNAMIC_STATUS WHERE UPDATE_DATE >= DATEADD(DAY,-60,GETDATE()) " ]
| rex field=cn "(?[^\\.]*)"
| eval cn=trim(upper(cn))
| eval InWBSN="Installed"
| sort 0 cn ]
| join cn type=left
[ | dbquery "AA81" " SELECT [KEY] AS cn, UPDATE_DATE AS _time FROM PA_DYNAMIC_STATUS WHERE UPDATE_DATE >= DATEADD(DAY,-60,GETDATE()) "
| append
[ | dbquery "AA" " SELECT [KEY] AS cn, UPDATE_DATE AS _time FROM PA_DYNAMIC_STATUS WHERE UPDATE_DATE >= DATEADD(DAY,-60,GETDATE()) " ]
| rex field=cn "(?[^\\.]*)"
| eval cn=trim(upper(cn))
| eval InWBSN="Installed"
| sort 0 -cn]| stats Count by InWBSN
Result should be like:
InWBSN count
Installed ****
Not Installed *****
Exception *****
I am getting partial result when I am incorporating my search with Dbquery search.
↧
Pairing two fields that came from the same event
I have a handful of fields that I've extracted from the raw event data using the `rex` function. Now that I have these fields, I've applied some stat grouping and counting to them, but I'm unable to properly display additional fields without grouping on them.
The stats line looks like this: ` | stats values(problem) AS prob count by problemType problemLocation` that produces a table that holds a problemType in the first column, the problem location in the second column, and a list of unique problems that are of that problemType and were caused at that problemLocation in a third column- given by the `values` function.
Each event only has 1 value for `prob` in this third column, and also only 1 value for another field `X`. I'm now trying to display, in a fourth column, values of this `X` field that match up in the same row with the displayed `prob` values in the third column without having to group everything else by them.
Any help would be appreciated.
↧
↧
Typing queue blocked
How to find sources/sourcetypes/hosts/indexes causing typing queue blockage?
↧
Can I use a savedsearch to filter events before the first pipe?
Hello,
I'm trying to make my searches more efficient and I'd like to know if savedsearches (or maybe macros) can be used to filter before the first pipe. Assume I have two indexes named `current` and `history`. I'd like to filter events in `history` based on a value in `current`. Simplified indexes:
Current
ID,Current_Status
0001,Open
0002,Open
0003,Closed
History
ID,Historical_Status
0001,Open
0002,Open
0003,Open
0003,In Progress
0003,Closed
Assuming I'd like to analyze events in `history` for IDs in `current` that have `Current_Status="Closed"`, would it be possible to avoid having to load all data, make a join, and then use a `where` condition by including some sort of subsearch in the first line?
I've already got as far as creating a savedsearch `get_current_status` that will return the current value:
index="current" ID="$ID" | table Current_Status
I can successfully call this in a search as well:
| savedsearch get_current_status ID=ID
What I can't do, however, is figure out whether I can use a subsearch to filter in the first line. Something along the lines of:
index="history" [|savedsearch get_current_status ID=ID]="Closed"
Is this possible?
Thank you and best regards,
Andrew
↧
Scheduled jobs dropped from an indexer, too much memory free
I have 4 indexers that always have the same memory load (monitored through Zabbix) - usually consistent at around 90% memory free, however, splunkix04, for some reason dropped all of its scheduled jobs and its memory is free up to 95%.
I've checked all the monitoring for splunkix04, everything is good; its pointing to two search heads...this has happened before, and I did a systemctl restart splunk and the scheduled jobs did catch on to splunkix04 and everything was normal. However, a few days later, splunkix04 dropped its scheduled searches again. splunkix04 has only 11 splunk helpers, while my other 3 indexers have about 3 dozen.
1. What can be done to ensure splunkix04 doesn't drop its scheduled searches?
2. When I do a systemctl restart splunk on splunkix04, what is the impact on logging data (I have 3 other indexers).
Splunk Enterprise 6.5.3
Thanks.
↧
Regex to extract one or more lines with same heading in single event
Hello guys,
I'm adding this to my search in order to extract fields :
| rex max_match=0 field=_raw "CC :' \d+' de DN : 'CN=(?[^,]+)[^']+'\n(- CODE \(serial : (?P
\d+)\) error.\n-+\n)+"
Event example :
CC :' 223' de DN : 'CN=XXX 2025, ABCDEFGHIJKLMNOPQRSTUVWXYZ'
- CODE (serial : 1234) error.
---------------------------------------------------------
- CODE (serial : 5676) error.
---------------------------------------------------------
- CODE (serial : 5677) error.
---------------------------------------------------------
- CODE (serial : 5678) error.
---------------------------------------------------------
- CODE (serial : 5679) error.
---------------------------------------------------------
CC :' 224' de DN : 'CN=YYY 2025, ABCDEFGHIJKLMNOPQRSTUVWXYZ'
I want to get XXX 2025:1234,XXX 2025:5678...etc like a tree with 1 or more branches.
The problem is it returns only last match : last code.
Thanks a lot.
Regex101 link : https://regex101.com/r/M96VAN/2
↧
↧
Drilldown does not trigger in Maps+
Hi,
after deploying Maps+, I'm able to do a lot of things, with the exception of drilldown.
I have enabled it in the *Format Visualization* settings, as per documentation, and I've filled the *Edit drilldown* form, but when I double click on a marker, all I get is a zoom in (or, sometimes, nothing at all). I've also tried to convert the panel to HTML, and tweaked around it, but the drilldown does not trigger.
One strange thing is that, in the *Edit drilldown* form, I get this warning:
*This custom visualization might not support drilldown behavior.*
Then I save the panel, and nothing happens on the double click.
Any suggestion?
↧
Debut API call from REST API Modular Input
Hi everyone,
what is the best solution to debug an input of the "**REST API Modular Input**" Apps ?
it's possible to call the GET input manualy ?
regars
↧
Trouble with self-signed certs between Forwarder and Indexer
Hello Everyone,
I am having trouble configuring self-signed certs and was wondering if I could possibly get some advice.
I am doing this in a test environment with the express purpose of replicating the configurations listed in the splunk docs (.../Splunk/7.1.3/Security/Howtoself-signcertificates)
These configs are being performed on a deployment server. The deployment server's splunk.secret was replicabed to all boxes upon initial install. This splunk 7.1.2 on RHEL 7.
Currently I am getting the following error:
- ERROR TcpInputProc - Error encountered for connection from src=10.0.0.1:36014. error:1408F10B:SSL routines:SSL3_GET_RECORD:wrong version number
- WARN SSLCommon - Received fatal SSL3 alert. ssl_state='SSLv3 read server certificate B', alert_description='unknown CA'.
Here is my configuration:
# Create a key to sign your certificates.
/opt/splunk/bin/splunk cmd openssl genrsa -aes256 -out myCAPrivateKey.key 2048
splunk_$certs
- Generate a new Certificate Signing Request (CSR) When prompted, create a password for the key.
/opt/splunk/bin/splunk cmd openssl req -new -key myCAPrivateKey.key -out myCACertificate.csr
splunk_$certs
- Anything not specified is left default/blank
Country Name (2 letter code) [AU]:US
State or Province Name (full name) [Some-State]:DC
Locality Name (eg, city) []:Washington
Organization Name (eg, company) [Internet Widgits Pty Ltd]:MyTestOrg
Organizational Unit Name (eg, section) []:SecDiv
A challenge password []:splunk_$certs
Common Name (e.g. server FQDN or YOUR name) []:Deployment_Server
- Use the CSR myCACertificate.csr to generate the public certificate:
/opt/splunk/bin/splunk cmd openssl x509 -req -in myCACertificate.csr -sha512 -signkey myCAPrivateKey.key -CAcreateserial -out myCACertificate.pem -days 10950
splunk_$certs
# Create the server certificate for the search head to forward its data to the indexers
/opt/splunk/bin/splunk cmd openssl genrsa -aes256 -out myServerPrivateKey.key 2048
splunk_$certs
- Generate and sign a new server certificate
/opt/splunk/bin/splunk cmd openssl req -new -key myServerPrivateKey.key -out myServerCertificate.csr
splunk_$certs
- Anything not specified is left default/blank
Country Name (2 letter code) [AU]:US
State or Province Name (full name) [Some-State]:DC
Locality Name (eg, city) []:Washington
Organization Name (eg, company) [Internet Widgits Pty Ltd]:MyTestOrg
Organizational Unit Name (eg, section) []:SecDiv
Common Name (e.g. server FQDN or YOUR name) []:Search_Head
A challenge password []:splunk_$certs
/opt/splunk/bin/splunk cmd openssl x509 -req -in myServerCertificate.csr -SHA256 -CA myCACertificate.pem -CAkey myCAPrivateKey.key -CAcreateserial -out myServerCertificate.pem -days 1095
splunk_$certs
- Create a single PEM file
- Once you have your certificates, you must combine the server certificate and your keys into a single file that Splunk software can use.
cat myServerCertificate.pem myServerPrivateKey.key myCACertificate.pem > myNewServerCertificate.pem
- The CA cert is copied to a deployment app so it can be reused. The Search_Head/server certs are moved.
- The 'dev_transit_forwarder_certs' app is transfered to the search head via the deployment server
cp myCA* /opt/splunk/etc/deployment-apps/dev_transit_forwarder_certs/splunk
mv myNew* /opt/splunk/etc/deployment-apps/dev_transit_forwarder_certs/splunk
mv myServer* /opt/splunk/etc/deployment-apps/dev_transit_forwarder_certs/splunk
# Create the server certificate for the search head to forward its data to the indexers
/opt/splunk/bin/splunk cmd openssl genrsa -aes256 -out myServerPrivateKey.key 2048
splunk_$certs
- Generate and sign a new server certificate
/opt/splunk/bin/splunk cmd openssl req -new -key myServerPrivateKey.key -out myServerCertificate.csr
splunk_$certs
- Anything not specified is left default/blank
Country Name (2 letter code) [AU]:US
State or Province Name (full name) [Some-State]:DC
Locality Name (eg, city) []:Washington
Organization Name (eg, company) [Internet Widgits Pty Ltd]:MyTestOrg
Organizational Unit Name (eg, section) []:SecDiv
Common Name (e.g. server FQDN or YOUR name) []:Indexer
A challenge password []:splunk_$certs
/opt/splunk/bin/splunk cmd openssl x509 -req -in myServerCertificate.csr -SHA256 -CA myCACertificate.pem -CAkey myCAPrivateKey.key -CAcreateserial -out myServerCertificate.pem -days 1095
splunk_$certs
- Create a single PEM file
- Once you have your certificates, you must combine the server certificate and your keys into a single file that Splunk software can use.
cat myServerCertificate.pem myServerPrivateKey.key myCACertificate.pem > myNewServerCertificate.pem
- The certs are copied to a deployment app.
- The 'dev_transit_indexer_certs' app is transfered to two indexers via the deployment server
cp myCA* /opt/splunk/etc/deployment-apps/dev_transit_indexer_certs/splunk
mv myNew* /opt/splunk/etc/deployment-apps/dev_transit_indexer_certs/splunk
mv myServer* /opt/splunk/etc/deployment-apps/dev_transit_indexer_certs/splunk
# Search Head configurations
### /dev_hf_outputs/local/server.conf
[sslConfig]
sslRootCAPath = /opt/splunk/etc/apps/dev_transit_forwarder_certs/splunk/myCACertificate.pem
### /dev_hf_outputs/local/outputs.conf
[indexAndForward]
index = false
[tcpout]
defaultGroup = dev_indexers
indexAndForward = false
[tcpout:dev_indexers]
server = 10.0.0.10:9996,10.0.0.11:9996
disabled = 0
[tcpout:splunkssl]
clientCert = /opt/splunk/etc/apps/dev_transit_forwarder_certs/splunk/myNewServerCertificate.pem
sslPassword = splunk_$certs
sslVerifyServerCert = false
# Indexer configurations
### /dev_indexers_inputs/local/server.conf
[sslConfig]
sslRootCAPath = /opt/splunk/etc/apps/dev_transit_indexer_certs/splunk/myCACertificate.pem
### /dev_indexers_inputs/local/inputs.conf
[splunktcp-ssl:9996]
disabled = 0
[SSL]
serverCert = /opt/splunk/etc/apps/dev_transit_indexer_certs/splunk/myNewServerCertificate.pem
sslPassword = splunk_$certs
requireClientCert= false
↧
Tabs in multiple rows
Hi All,
I am planning to create tabs in multiple rows and the first tab in each row should be "Active" (meaning as soon as the dashboard loads the Active tabs from each row should display panels under it).
Row1: Tab1(Active), Tab2, Tab3, Tab4
Row2: Tab5(Active), Tab6, Tab7
Row3: Tab8(Active),Tab9
I was successful in creating the tabs for all the three rows and also i linked the panels to each respective tabs.
The issue i am facing is, when i am specifying "li class="active" for Tab 1 it works and it shows the panels under it as soon as the dashboard loads.
When i am repeating the same for Tabs5& Tabs8, it is not working for them (meaning the panels under it don't show up until the selection is made) which it shouldn't be the case.
Seems like the scripts or the dashboard is taking only Tab1 into consideration for the Active part.
I am using the tabs.js and tabs.css scripts from "making a tabs in splunk" page.
Any suggestions or help would be really helpful.
Thanks,
Sandeep.
↧
↧
English version?
Hello. Do you have a version of this with English labels and variables? I'll go through and translate it if not, but won't spend the time if you already have one.
Thanks!,Hello, do you happen to have a version of this with English labels and variables? I can go through and translate it if not, but won't spend the time if you already have one.
Thanks!
↧
How do I pair two fields that came from the same event?
I have a handful of fields that I've extracted from the raw event data using the `rex` function. Now that I have these fields, I've applied some stat grouping and counting to them, but I'm unable to properly display additional fields without grouping on them.
The stats line looks like this: ` | stats values(problem) AS prob count by problemType problemLocation` . That line produces a table that holds a problemType in the first column, the problem location in the second column, and a list of unique problems that are of that problemType and were caused at that problemLocation in a third column, given by the `values` function.
Each event only has 1 value for `prob` in this third column, and also only 1 value for another field `X`. I'm now trying to display, in a fourth column, values of this `X` field that match up in the same row with the displayed `prob` values in the third column without having to group everything else by them.
Any help would be appreciated.
↧
How do I find what is causing my typing queue blockage?
How do I find sources/source types/hosts/indexes causing typing queue blockage?
↧
How do I make tabs in multiple rows in a dashboard?
Hi All,
I am planning to create tabs in multiple rows and the first tab in each row should be "Active" , (meaning as soon as the dashboard loads, the Active tabs from each row should display panels under it).
Row1: Tab1(Active), Tab2, Tab3, Tab4
Row2: Tab5(Active), Tab6, Tab7
Row3: Tab8(Active),Tab9
I was successful in creating the tabs for all the three rows and also i linked the panels to each respective tabs.
The issue i am facing is, when i am specifying "li class="active" for Tab 1 , it works and it shows the panels under it as soon as the dashboard loads.
But, when i am repeating the same for Tabs5 & Tabs8, it is not working for them (meaning the panels under it don't show up until the selection is made, which it shouldn't be the case).
Seems like the scripts or the dashboard is taking only Tab1 into consideration for the "Active" part.
I am using the tabs.js and tabs.css scripts from "making a tabs in splunk" page.
Any suggestions or help would be really helpful.
Thanks,
Sandeep.
↧
↧
Can I use a savedsearch (or a macro) to filter events before the first pipe?
Hello,
I'm trying to make my searches more efficient and I'd like to know if savedsearches (or maybe macros) can be used to filter before the first pipe. Assume I have two indexes named `current` and `history`. I'd like to filter events in `history` based on a value in `current`. Simplified indexes:
Current
ID,Current_Status
0001,Open
0002,Open
0003,Closed
History
ID,Historical_Status
0001,Open
0002,Open
0003,Open
0003,In Progress
0003,Closed
Assuming I'd like to analyze events in `history` for IDs in `current` that have `Current_Status="Closed"`, would it be possible to avoid having to load all data, make a join, and then use a `where` condition by including some sort of subsearch in the first line?
I've already got as far as creating a savedsearch `get_current_status` that will return the current value:
index="current" ID="$ID" | table Current_Status
I can successfully call this in a search as well:
| savedsearch get_current_status ID=ID
What I can't do, however, is figure out whether I can use a subsearch to filter in the first line. Something along the lines of:
index="history" [|savedsearch get_current_status ID=ID]="Closed"
Is this possible?
Thank you and best regards,
Andrew
↧
Why is my indexer dropping scheduled jobs (too much free memory)?
I have 4 indexers that always have the same memory load (monitored through Zabbix). They are usually consistent at around 90% memory free. However, for some reason splunkix04 — one of my indexers — dropped all of its scheduled jobs and its memory is free up to 95%.
I've checked all the monitoring for splunkix04, but everything is good; it's pointing to two search heads...this has happened before, and I did a systemctl to restart Splunk, and the scheduled jobs did catch on to splunkix04 and everything was normal. However, a few days later, splunkix04 dropped its scheduled searches again. splunkix04 has only 11 Splunk helpers, while my other 3 indexers have about 3 dozen.
1. What can be done to ensure splunkix04 doesn't drop its scheduled searches?
2. When I do a systemctl restart splunk on splunkix04, what is the impact on logging data (I have 3 other indexers)?
Splunk Enterprise 6.5.3
Thanks.
↧
Will you help me create a regex to extract one or more lines with same heading in a single event?
Hello guys,
I'm adding this to my search in order to extract fields :
| rex max_match=0 field=_raw "CC :' \d+' de DN : 'CN=(?[^,]+)[^']+'\n(- CODE \(serial : (?P\d+)\) error.\n-+\n)+"
Event example :
CC :' 223' de DN : 'CN=XXX 2025, ABCDEFGHIJKLMNOPQRSTUVWXYZ'
- CODE (serial : 1234) error.
---------------------------------------------------------
- CODE (serial : 5676) error.
---------------------------------------------------------
- CODE (serial : 5677) error.
---------------------------------------------------------
- CODE (serial : 5678) error.
---------------------------------------------------------
- CODE (serial : 5679) error.
---------------------------------------------------------
CC :' 224' de DN : 'CN=YYY 2025, ABCDEFGHIJKLMNOPQRSTUVWXYZ'
I want to get XXX 2025:1234,XXX 2025:5678...etc like a tree with 1 or more branches.
The problem is it returns only last match : 5679
Thanks a lot.
Regex101 link : https://regex101.com/r/M96VAN/2
↧