When I search with this query, it shows the wrong results:
| metadata type=hosts index=* | lookup domain.csv host output domain datacenter host IP | search domain=Y | eval age=(now()-recentTime) | convert ctime(*Time) | append [| inputlookup domain.csv ] | dedup host | fields host IP domain datacenter lastTime age totalCount | sort lastTime
When I search with this query, it shows full results:
host=*wdc* |stats count by host
Any help would be appreciated.
↧
Query help?
↧
Performance impact of using ignoreOlderThan on a forwarder
Hi
So we have a server that writes out thousands of files a day.
Over the course of two months we can have 70K+ files.
We can't enforce an aggressive archival policy for the project team due to other retention constraints.
There are also .dat.gz files, which may incur an even bigger CPU hit once we take the thousands of older files into account.
ignoreOlderThan = [s|m|h|d]
* Causes the monitored input to stop checking files for updates if their
modtime has passed this threshold. This improves the speed of file tracking
operations when monitoring directory hierarchies with large numbers of
historical files (for example, when active log files are colocated with old
files that are no longer being written to).
* As a result, do not select a cutoff that could ever occur for a file
you wish to index. Take downtime into account!
* Suggested value: 14d (that is, 2 weeks).
* A file whose modtime falls outside this time window when seen for the first
time will not be indexed at all.
* Default: 0, meaning no threshold.
If ignoreOlderThan is used in inputs.conf, is this evaluated when the input is monitored, or is an internal blacklist set up?
I think there might be an internal list, because the following holds for ignoreOlderThan, from what I read in other answers:
***Once a file is ignored, it never comes back to being monitored, even if its timestamp is updated. The only way to stop this is to remove the setting and restart Splunk.***
**Now the question is: how will this affect the performance of the Splunk forwarder over time, with thousands of blacklist entries being set up internally?**
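For reference, applying the setting looks like this in inputs.conf (the monitored path below is hypothetical):

```
[monitor:///var/log/myapp]
ignoreOlderThan = 14d
```

Per the spec text quoted above, a file whose modtime is already past the cutoff when it is first seen is never indexed at all, so the value has to be chosen with downtime in mind.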
↧
↧
After upgrade from 6.6.1 to 7.0.0, indexer stops taking data and throws SSL errors
Hi,
I have upgraded my Splunk from version 6.6.1 to 7.0.0. After the upgrade I am seeing some SSL exceptions, and the indexer is not taking data. I am using version 6.3.3 for the Splunk forwarder.
**Errors in splunkd.log**
10-09-2017 16:39:22.628 +0200 WARN SSLCommon - Received fatal SSL3 alert. ssl_state='SSLv3 read client hello C', alert_description='protocol version'.
10-09-2017 16:39:22.628 +0200 WARN HttpListener - Socket error from 127.0.0.1 while idling: error:1408A10B:SSL routines:ssl3_get_client_hello:wrong version number
There are some known issues and workarounds for this. Even though these workarounds are for an older version, I have tried them, but I am still getting the same errors.
http://docs.splunk.com/Documentation/Splunk/6.6.0/ReleaseNotes/KnownIssues#Upgrade_Issues
I am not sure if the indexer is not taking data because of these errors.
Has anyone encountered such a problem, or have an idea how to fix it?
Thanks
Ankit
↧
Bootstrap Enterprise
Regards,
I need to create a responsive control panel, and I realized that Splunk ships a modified version of the Bootstrap classes for this functionality. Is there a manual describing these classes?
↧
↧
If-then statement where the output will exclude a value from the search
I want a statement that will evaluate field A and, if the value of field A equals 1, exclude any value of field B from the search.
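One way to read this requirement is to null out field B whenever A is 1, so B's values drop out of downstream results. A minimal SPL sketch (the base search and field names are placeholders for whatever the real search uses):

```
index=your_index
| eval B=if(A==1, null(), B)
```

If instead the intent is to drop the whole event when A equals 1, `| search NOT A=1` would be the simpler form.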
↧
Splunk uninstallation / kill command failed -- Connection Refused error on port 22
Hi,
I was trying to uninstall Splunk due to some issues in the existing installation. I followed the steps for "Uninstall Splunk Enterprise manually" as described at the link below:
https://docs.splunk.com/Documentation/Splunk/7.0.0/Installation/UninstallSplunk
But as soon as I executed the command below (as directed at the above link), I got a Connection Refused error on port 22 for the server. I don't know how this is related; any hint would be really helpful.
kill -9 `ps -ef | grep splunk | grep -v grep | awk '{print $2;}'`
Thanks in advance
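As a side note on what that pipeline actually selects, it can be previewed against sample ps output before anything is piped into kill: any process whose command line merely mentions "splunk" will match. The ps lines below are fabricated for illustration:

```shell
# Fabricated ps -ef output: a splunkd process, an editor open on a Splunk
# config file, and the grep itself.
ps_sample='root  1234     1  0 10:00 ?      00:00:05 /opt/splunk/bin/splunkd
user  5678  4321  0 10:01 pts/0  00:00:00 vim /opt/splunk/etc/system/local/inputs.conf
user  9012  4321  0 10:02 pts/0  00:00:00 grep splunk'

# Same filter chain as the uninstall step: everything mentioning "splunk"
# except the grep itself. Note that the vim PID (5678) is selected too,
# even though it is not a Splunk process.
echo "$ps_sample" | grep splunk | grep -v grep | awk '{print $2}'
```

Running the preview first (without the surrounding `kill -9 \`...\``) shows whether anything besides Splunk processes would be caught.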
↧
Problem: contract employees can't use the domain where Splunk is. Proposed solution: a new search head in a secondary domain, connected to the current Splunk instance. Thoughts?
We are only allowed to use AD accounts when accessing Splunk, but in our PCI DSS environment some users are not allowed to have accounts by policy, due either to being contractors or to age restrictions.
Is it at all possible to have another search head in another domain, but connected to the same Splunk instance we already have? That way the search head would be in the domain where our users are.
I am just wondering how we can grant access to Splunk if the users cannot have AD accounts in the same domain as Splunk and we cannot use trusts.
↧
Override host field with event data
Hello,
I am indexing some data from a file monitor, and I want to override the host field with data that lies inside the events. Below is a sample of the data, with the values I want for the host field in bold.
Mon Oct 09 2017 15:24:18 **SE-001** sshd[5905]: Failed password for invalid user postgres from 49.212.64.138 port 4856 ssh2
Mon Oct 09 2017 15:24:13 **ACME-005** sshd[2792]: Failed password for nsharpe from 10.2.10.163 port 1148 ssh2
Mon Oct 09 2017 15:24:12 **ops-sys-006** sshd[4105]: Failed password for sync from 233.77.49.94 port 4595 ssh2
Mon Oct 09 2017 15:24:19 **PROD-MFS-001** sshd[74897]: pam_unix(sshd:session): session closed for user nsharpe by (uid=0)
Mon Oct 09 2017 15:24:07 **PROD-MFS-001** su: pam_unix(su:session): session closed for user root
The data is indexed under the linux_secure sourcetype. To achieve the host override, I added one props.conf stanza and one transforms.conf stanza in /etc/system/local on the indexers:
props.conf
[linux_secure]
TRANSFORMS-sethost = set_hostname_linux_secure
SHOULD_LINEMERGE = false
transforms.conf
[set_hostname_linux_secure]
REGEX = (?<=:\d{2}\s).*?(?=\s)
FORMAT = host::$1
DEST_KEY = MetaData:Host
The above configuration is not working, and the events are still being indexed with host set to the name of the forwarder they come from.
Any idea what's wrong with this configuration and how I can implement the host override?
Thanks a lot!
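As an aside, the extraction can be dry-run at search time before relying on the transforms.conf stanza. The sketch below pastes one of the sample events into makeresults and uses rex with a named capture group (the group name event_host is made up for illustration); note that the pattern in the stanza above consists only of lookarounds and defines no capturing group for FORMAT's $1 to reference:

```
| makeresults
| eval _raw="Mon Oct 09 2017 15:24:18 SE-001 sshd[5905]: Failed password for invalid user postgres from 49.212.64.138 port 4856 ssh2"
| rex field=_raw ":\d{2}\s(?<event_host>\S+)\s"
| table event_host
```

If the capture works here, the same grouped pattern can then be carried back into the index-time config.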
↧
↧
Help optimizing an advanced DB Connect query
The query below takes approximately 20 minutes to run, and I need help optimizing it. The point of the query is to gather the number of problematic data conditions for each client. The conditions are:
1. Where PROCESSEDFLAG='N' (from the VA table)
2. Where WORKFLOWSTATUS is null (from the CA table)
3. Where PROCESSEDFLAG='N' (from the VS table).
In our current setup, each client has their own database, and clients are split across servers. That is why I have formatted the search with a different subsearch for each problematic condition for each client. What options are there to speed up or otherwise optimize the current query?
|dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID1\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as ALPVCA
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID2\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as ARTVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID3\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as CEGVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID4\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as CFUSVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID5\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as IRMVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID6\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as MARVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID7\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as USBIVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID8\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as WHIVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID9\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-124" | stats count as WOSVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID10\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-124" | stats count as POWVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID11\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-124" | stats count as NABVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID12\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-125" | stats count as NDQVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID13\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-127" | stats count as WEDVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID14\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-127" | stats count as HLMVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID15\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-127" | stats count as ICBCVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID16\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-128" | stats count as CFUKVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID17\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-128" | stats count as MEDVCA]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID18\".\"dbo\".\"VA\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-128" | stats count as SPIVCA]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID1\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-126" | stats count as ALPWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID2\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-126" | stats count as ARTWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID3\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-126" | stats count as CEGWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID4\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-126" | stats count as CFUSWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID5\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-116" | stats count as IRMWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID6\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-116" | stats count as MARWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID7\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-116" | stats count as USBIWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID8\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-116" | stats count as WHIWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID9\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-124" | stats count as WOSWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID10\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-124" | stats count as POWWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID11\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-124" | stats count as NABWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID12\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-125" | stats count as NDQWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID13\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-127" | stats count as WEDWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID14\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-127" | stats count as HLMWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID15\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-127" | stats count as ICBCWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID16\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-128" | stats count as CFUKWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID17\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-128" | stats count as MEDWS]
| appendcols [dbxquery query="SELECT ClientID FROM \"clientID18\".\"dbo\".\"CA\" WHERE WORKFLOWSTATUS IS NULL" connection="connection-128" | stats count as SPIWS]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID5\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as IRMVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID6\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as MARVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID7\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as USBIVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID8\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-116" | stats count as WHIVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID9\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-124" | stats count as WOSVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID10\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-124" | stats count as POWVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID11\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-124" | stats count as NABVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID12\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-125" | stats count as NDQVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID13\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-127" | stats count as WEDVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID14\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-127" | stats count as HLMVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID15\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-127" | stats count as ICBCVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID16\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-128" | stats count as CFUKVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID17\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-128" | stats count as MEDVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID18\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-128" | stats count as SPIVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID1\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as ALPVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID2\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as ARTVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID3\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as CEGVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID4\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as CFUSVSEC]
| eval Totals= ARTVCA+CEGVCA+CFUSVCA+IRMVCA+MARVCA+USBIVCA+WHIVCA+WOSVCA+POWVCA+NDQVCA+WEDVCA+HLMVCA+ICBCVCA+CFUKVCA+MEDVCA+SPIVCA+ARTVSEC+CEGVSEC+CFUSVSEC+IRMVSEC+MARVSEC+USBIVSEC+WHIVSEC+WOSVSEC
+POWVSEC+NDQVSEC+WEDVSEC+HLMVSEC+ICBCVSEC+CFUKVSEC+MEDVSEC+SPIVSEC+ARTWS+CEGWS+CFUSWS+IRMWS+MARWS+USBIWS+WHIWS+WOSWS+POWWS+NDQWS+WEDWS+HLMWS+ICBCWS+CFUKWS+MEDWS+SPIWS
| table Totals
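One direction sometimes suggested for setups like this is to collapse the subsearches that share a connection into a single dbxquery whose SQL does the per-client counting, so each connection is opened once instead of once per client. A hedged sketch for two of the clients on connection-126 (table, column, and connection names are taken from the search above; the metric labels and layout are illustrative only):

```
| dbxquery connection="connection-126" query="
    SELECT 'ALPVCA' AS metric, COUNT(*) AS cnt
      FROM \"clientID1\".\"dbo\".\"VA\"
     WHERE PROCESSEDFLAG = 'N' AND ADDEDDT > DATEADD(hour, -24, GETDATE())
    UNION ALL
    SELECT 'ARTVCA' AS metric, COUNT(*) AS cnt
      FROM \"clientID2\".\"dbo\".\"VA\"
     WHERE PROCESSEDFLAG = 'N' AND ADDEDDT > DATEADD(hour, -24, GETDATE())"
| stats sum(cnt) AS Totals
```

With one such query per connection, the searches could then be combined with a handful of appends rather than fifty-odd appendcols.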
↧
What other additional software gets automatically shipped with Splunk?
I have heard that Splunk comes with some additional software. Can someone tell me what that is?
↧
Dashboard help -- Can I use Bootstrap Enterprise so my dashboard has responsive design?
![alt text][1]
Regards,
I'm developing a home panel using HTML components inside a panel in Simple XML format.
Ex:
... my code here
What I need to do is make this control panel responsive to other screen resolutions and mobile devices. I know that Splunk uses a modified version of the Bootstrap library, but I do not know which classes would let me adapt the same control panel to other resolutions.
Are there guides for the Enterprise Bootstrap version used by Splunk?
[1]: /storage/temp/217790-dashboard-home.png
↧
Is it possible to find values that are wrapped by quotation marks in a lookup table?
I have a lookup table that has values that are wrapped in quotation marks. For example: "fw: Help".
If I try to search for this with the following search, nothing comes up:
|inputlookup zyx.csv | search column="fw: Help"
I have even tried this as well:
|inputlookup zyx.csv | search column=fw: Help
BUT if I do | search column=* it populates all the values in the lookup table, which is fine, but I need help finding a value that is wrapped in quotation marks.
How can I do that?
Is it even possible to search for a value that is wrapped in quotation marks?
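If the quotation marks are literally part of the stored value, one commonly used approach in SPL is to escape them with backslashes inside the quoted search string. A sketch, untested against this particular lookup:

```
| inputlookup zyx.csv
| search column="\"fw: Help\""
```

Here the outer quotes delimit the search term and the backslash-escaped inner quotes are matched as literal characters of the value.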
↧
↧
Splunk appender not logging long HTTP headers
The basic issue is that for longer requests, the Splunk logger is not logging the HTTP headers.
We are using splunk-1.5.0.0.jar and splunk-library-javalogging-1.5.2.jar -- https://github.com/splunk/splunk-library-javalogging. We use log4j to push HTTP SOAP requests and responses to Splunk. The application itself is a very basic proxy running on Mule CE 3.8.1; it simply wraps SOAP calls and logs the requests and responses. We add several custom headers that are critical to see in the logs.
We find that when the requests are long, Splunk does not log the headers. We know the headers are present because they do appear in the response log; HTTP headers seem to always be present in the logged responses no matter the length. For small request bodies, the headers appear in the Splunk logs correctly.
Is there a way to force Splunk to log the HTTP headers regardless of the request length? A trimmed example of what we see in the log is below.
10/9/17
8:23:34.572 AM
{
logger: org.mule.module.http.internal.HttpMessageLogger
message: LISTENER
severity: DEBUG
thread: [dmvproxy-1.0.0-SNAPSHOT].HTTP_Listener_Configuration.worker.43
}
↧
Scheduled report error -- Search process did not exit cleanly, exit_code=-1, description="exited with code -1"
When running the following manually, there are no issues. But when it is scheduled, the following errors are noted and half the information is not present.
[subsearch]: [1spl-ind04-dc1] Search process did not exit cleanly, exit_code=-1, description="exited with code -1". Please look in search.log for this peer in the Job Inspector for more info.
[subsearch]: [1spl-ind04-dc2] Search process did not exit cleanly, exit_code=-1, description="exited with code -1". Please look in search.log for this peer in the Job Inspector for more info.
|inputlookup idstoroles.csv | where like(role_name, "%External User%") | join type=left user_id [search index=onelogin event_type_id=5 | eval mytime=strftime(_time, "%Y-%m-%d") | lookup idstoroles.csv user_id
| stats distinct_count(mytime) as "total logins" by role_name, user_name, user_id | where like(role_name, "%External User%")] | table username, user_name, firstname, lastname, user_id, "total logins" | outputlookup logins.csv
↧
How to get a universal forwarder GUID from deployment server via API?
I'm trying to create a script to delete decommissioned computers from deployment servers (DS). I think I can use the curl command below to delete them from the deployment server, but how can I get the GUID of a server from its host name?
Example: I need to get the GUID of a server xyz that I want to delete from the DS; once I have the GUID, I can use the deployment/server/clients endpoint to delete it.
curl -k -u admin:pass --request DELETE https://localhost:8089/services/deployment/server/clients/
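The same endpoint can be listed (a GET instead of DELETE) and the GUID picked out of the response. The JSON below is a fabricated stand-in for the response shape, and the sed line is only a sketch of the extraction; a real script would use a proper JSON parser and match on the hostname first:

```shell
# Fabricated fragment standing in for the output of something like:
#   curl -k -u admin:pass \
#     "https://localhost:8089/services/deployment/server/clients?output_mode=json"
sample='{"entry":[{"name":"client1","content":{"hostname":"xyz","guid":"0A1B2C3D-0000-1111-2222-333344445555"}}]}'

# Pull the guid field out of the fragment.
echo "$sample" | sed -n 's/.*"guid":"\([^"]*\)".*/\1/p'
```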
↧
Single host is showing up as multiple sources (e.g. server1 and ip-server1). How can I clean this up?
Greetings,
In Splunk search, some of the hosts are showing under multiple host names. I would like to combine the host names into one for cleanup purposes. I fixed the initial reporting issue, but cannot figure out how to make the logs show up under one host.
Example: server1 and ip-server1 are the same host, but show as two sources. I would like both sources to show as server1.
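For the events that are already indexed, one illustrative search-time workaround is to normalize the host value on the fly. The sketch below assumes the only variant is a literal `ip-` prefix, and the index name is a placeholder:

```
index=your_index
| eval host=replace(host, "^ip-", "")
| stats count by host
```

This only changes what the search reports; the stored host metadata on old events is unchanged.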
↧
↧
Change row color when the field "time value" increases
Hi ,
In my dashboard I am watching the ticketing system and calculating a time frame. If the ticket age is greater than 02:00 hours, the row should turn red; if it is less than 02:00, it should be yellow; and if it is less than 01:30, it should turn green.
![alt text][1]
[1]: /storage/temp/216742-12345.jpg
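The threshold logic itself can be computed in SPL and used to drive the coloring. A sketch with the thresholds described above, where age_minutes stands in for however the ticket age is actually calculated:

```
| eval range=case(age_minutes >= 120, "red",
                  age_minutes >= 90,  "yellow",
                  true(),             "green")
```

A field carrying values like these can then be mapped to colors in the dashboard layer.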
↧
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID18\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-128" | stats count as SPIVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID1\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as ALPVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID2\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as ARTVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID3\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as CEGVSEC]
| appendcols [dbxquery query="SELECT PROCESSEDFLAG FROM \"clientID4\".\"dbo\".\"VS\" WHERE PROCESSEDFLAG = 'N' and ADDEDDT >DATEADD(hour, -24, GETDATE()) " connection="connection-126" | stats count as CFUSVSEC]
| eval Totals= ARTVCA+CEGVCA+CFUSVCA+IRMVCA+MARVCA+USBIVCA+WHIVCA+WOSVCA+POWVCA+NDQVCA+WEDVCA+HLMVCA+ICBCVCA+CFUKVCA+MEDVCA+SPIVCA+ARTVSEC+CEGVSEC+CFUSVSEC+IRMVSEC+MARVSEC+USBIVSEC+WHIVSEC+WOSVSEC
+POWVSEC+NDQVSEC+WEDVSEC+HLMVSEC+ICBCVSEC+CFUKVSEC+MEDVSEC+SPIVSEC+ARTWS+CEGWS+CFUSWS+IRMWS+MARWS+USBIWS+WHIWS+WOSWS+POWWS+NDQWS+WEDWS+HLMWS+ICBCWS+CFUKWS+MEDWS+SPIWS
| table Totals
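A note on robustness: with this many `appendcols [dbxquery ...]` subsearches, the final `eval Totals` returns null if even one subsearch yields no row, because a single missing field makes the whole arithmetic expression null. A hedged alternative sketch (assuming the count field names all end in `VCA`, `VSEC`, or `WS` as above) is to zero-fill missing fields and let `addtotals` sum whatever is present, instead of hand-writing the long `eval`:

```
... all appendcols subsearches as above ...
| fillnull value=0
| addtotals fieldname=Totals *VCA *VSEC *WS
| table Totals
```

`fillnull` replaces any missing count with 0, and the wildcard field list on `addtotals` avoids having to keep the `eval` in sync when a new client database is added.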
↧
Throttle unless count increases
In an Enterprise Security correlation search I have a report that sends an email when an email address is seen across multiple unique accounts. Each unique account has its own row, and a field called dc_accounts shows how many unique accounts the email address is present on. I'm trying to determine the best way to throttle within the correlation search. The search time window spans 90 days, and I only want rows to show up on the report again if the count increases (so accounts are not re-reviewed). I think the best approach is to throttle on the fields account, email, and dc_accounts, but only when dc_accounts has not increased from the previous value. In other words, if the account and email values are the same and dc_accounts is less than or equal to the previous value, I don't want to see the row.
Is there a way to achieve this? I have thought about writing the output of each run to a lookup file, then using an inputlookup earlier in the search to retrieve the previous value and compare. But I am not sure if that is the only option available.
The relevant portion of the search is below. There are multiple roles per account, which is the reason for the multi-value name and user_id fields.
| dedup first_name, last_name, acct_no, user_id, email
| stats list(first_name) AS first_name list(last_name) AS last_name list(user_id) AS user_id values(acct_no) AS acct_nos by email
| stats list(first_name) AS first_name list(last_name) AS last_name list(user_id) AS user_id dc(acct_nos) AS dc_accounts by email,acct_nos
| search dc_accounts>1
| mvexpand first_name
| mvexpand last_name
| mvexpand user_id
| dedup email, acct_nos, first_name, last_name, user_id
| stats values(email) AS email values(first_name) AS first_name values(last_name) AS last_name values(user_id) AS user_id by acct_nos, dc_accounts
| eval first_name=mvjoin(first_name, " ; ")
| eval last_name=mvjoin(last_name, " ; ")
| eval user_id=mvjoin(user_id, " ; ")
| table first_name, last_name, acct_nos, user_id, email, dc_accounts
| sort 0 -dc_accounts,email
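The lookup-based comparison described in the question can be sketched roughly as follows. This is only an illustration, not a tested correlation search: the lookup name `email_account_state.csv` and the `prev_dc_accounts` field are assumptions, and the state lookup is keyed on email since dc_accounts is a per-email count:

```
... correlation search as above ...
| lookup email_account_state.csv email OUTPUT dc_accounts AS prev_dc_accounts
| eval prev_dc_accounts=coalesce(prev_dc_accounts, 0)
| where dc_accounts > prev_dc_accounts
| fields - prev_dc_accounts
```

The state lookup then needs to be refreshed so suppressed rows keep their previous high-water mark, for example with a separate scheduled step that merges new counts into the existing file:

```
... correlation search as above ...
| table email dc_accounts
| inputlookup append=true email_account_state.csv
| stats max(dc_accounts) AS dc_accounts by email
| outputlookup email_account_state.csv
```

With this in place, a row only surfaces when dc_accounts exceeds the value recorded on a previous run, which matches the "only alert when the count increases" requirement without relying on the built-in throttle window.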
↧