Channel: Questions in topic: "splunk-enterprise"
Viewing all 47296 articles

Create a stacked bar chart, conflating values from multiple fields

Each event in my data set falls into one of two categories: 1) Has a field called "os_platform" and a field called "parameters.From" 2) Has a field called "os" and a field called "params.from" I would like to generate a stacked bar chart where there is one bar per value of either os or os_platform (whichever is present for each event), and where each bar is split into a segment for each value of parameters.From or params.from (whichever is present for each event). What would a query look like which does this?
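A sketch of one possible answer, using `coalesce` to merge the two naming variants into single fields (field names taken from the question; note the single quotes around field names containing dots):

```
... | eval platform=coalesce(os, os_platform)
    | eval from=coalesce('params.from', 'parameters.From')
    | chart count over platform by from
```

Rendered as a column chart with stack mode enabled, this gives one bar per platform value, split into segments per from value.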

Windows Last Logon against a .csv file

I am trying to search for a list of users' last Windows logons through Splunk. For an individual user I use the search: *USERNAME* logon eventtype=windows_logon_success | table User_time However, I am trying to do this for around 300 users. Is there a way to do this in bulk by importing a lookup .csv file, having the search match on the username, and exporting a new list with the last logon date populated? Any help would be great. Thanks
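One possible sketch, assuming a lookup file named users.csv with a column named user holding the 300 usernames (both names are hypothetical; adjust to the real file and field names). The subsearch turns the lookup rows into an OR filter, then stats finds each user's most recent logon:

```
eventtype=windows_logon_success
    [ | inputlookup users.csv | rename user AS User | fields User ]
| stats max(_time) AS last_logon BY User
| eval last_logon=strftime(last_logon, "%Y-%m-%d %H:%M:%S")
| outputlookup users_last_logon.csv
```

The outputlookup at the end writes the populated list to a new CSV that can be exported or reused.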

How do I get the event associated to a fired_alert?

I run this search: index=_audit action=fired_alert and I get back the following, which looks like the properties of the alert:

Audit:[timestamp=11-08-2017 06:52:57.231, id=35143213, user=admin, action=alert_fired, ss_user="nobody", ss_app="search", ss_name="RDP by GenericID Prod", sid="rt_scheduler__admin__search__RMD5cf6dac5adc7385e1_at_1510141830_38328.0", alert_actions="email,notable,resilient", severity=3, trigger_time=1510141971, expiration=1510228377, digest_mode=0, triggered_alerts=1][OhxQLHMR0bgxLAaqfsIRoIsoknIp5H1APZ24P4Hm/9FDp8O0VT46WEsP+yLAPbuHYNBkjd8X2/Lu4tVXmqLy+d738KZDjCqFTCu9WcwwILDA97uAfDes/bqw0KamiumItENPlXSQkZIGLfuULHuVoBWOdWrIDF5MMp2y19XsXps=]

The search for the alert looks like this: index="wineventlog" EventCode=4648 Logon_ID=0x3e7 Process_Name="C:\\Windows\\System32\\winlogon.exe" [ | inputlookup serts-prod.csv | rename genid as user_identity | table user_identity] | eval discovered_date=ceil(_time) * 1000| fields host, user, Account_Domain, discovered_date

How do I get the values in the fields statement? They should be my username, my workstation, and my logon domain. Thanks!
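One possible approach, sketched from the audit record above: the sid field identifies the search job that fired the alert, and while that job's artifact is still on disk its results can be retrieved with loadjob. First pull the sid from the audit index:

```
index=_audit action=alert_fired ss_name="RDP by GenericID Prod"
| table trigger_time sid
```

Then paste the sid value into a new search as `| loadjob <sid>` to see the triggering events. Note this is a sketch with a caveat: real-time scheduled artifacts (sids starting with rt_scheduler) tend to expire quickly, so this may only work for recent firings or after increasing the alert's artifact TTL.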

base search to use in new window drilldown dashboard

Hello All, Is there a way to put a base search in the first dashboard, and have its drilldown open a new dashboard that reuses the base search from the first dashboard?
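For context, base searches and post-process searches do not span dashboards; a common workaround is to pass a value through a URL token in the drilldown. A minimal Simple XML sketch (target_dashboard and base_tok are hypothetical names; the clicked value could equally be a URL-encoded search fragment):

```
<!-- panel drilldown in the first dashboard -->
<drilldown>
  <link target="_blank">target_dashboard?form.base_tok=$click.value$</link>
</drilldown>

<!-- a search in target_dashboard consuming the token -->
<search>
  <query>index=main $base_tok$ | stats count by host</query>
</search>
```

The target dashboard needs an input (or token declaration) named base_tok so the URL parameter is picked up.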

Configuring app for first time. What index to specify?

I am setting up this app for the first time and plan on ingesting Azure logs. The app is installed on my heavy forwarder, as I plan to send any ingested data to my indexers for replication and redundancy. How can I specify which index this app should use? The documentation mentions sourcetypes, but I'm not sure if I need to create an index on my indexers, or how exactly that will work if I don't specify an index.
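A general sketch of the usual pattern (the index name and input stanza name below are hypothetical; use whatever the app's setup page actually creates): the index is created on the indexers, and the input stanza on the heavy forwarder gets an index= setting.

```
# indexes.conf on the indexers -- the index must exist there, not on the forwarder
[azure]
homePath   = $SPLUNK_DB/azure/db
coldPath   = $SPLUNK_DB/azure/colddb
thawedPath = $SPLUNK_DB/azure/thaweddb

# inputs.conf on the heavy forwarder -- add index= to the app's input stanza
[my_azure_input_stanza]
index = azure
```

If no index is specified, events typically land in the default index (main), so creating a dedicated index and pointing the input at it keeps the data separable.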

Managing many alerts simultaneously

I am currently managing 50 alerts, and this number will multiply in the next couple of weeks. Editing my alerts is cumbersome: if I want to change a common property, I have to change every single instance by itself. Is there a way to change an alert property, like its permissions or triggers, for multiple alerts at a time? I have looked at "Alert Manager", but it seems to be tailored to managing incidents, not the alerts themselves.
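One possible approach is to bypass the UI and edit the configuration files directly. Permissions for every saved search in an app can be set in a single stanza in the app's metadata file (a sketch; adjust the role names):

```
# metadata/local.meta in the app -- one stanza covers all saved searches in the app
[savedsearches]
access = read : [ * ], write : [ admin, power ]
export = system
```

Non-permission properties (schedules, trigger conditions, actions) live as stanzas in savedsearches.conf, so a scripted find-and-replace there, or batched POSTs to the saved/searches REST endpoint, can change many alerts at once. A restart or reload is needed for file edits to take effect.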

apply shcluster-bundle returning insufficient permission to access this resource

Hi, I'm trying to deploy new apps to my search head cluster via the deployer, and when running the apply shcluster-bundle command I receive an error message. `/splunkdrive/splunk/bin/splunk apply shcluster-bundle --answer-yes -auth admin:{{ADMIN_PASSWORD}} -target https://{{SEARCH_HEAD_IP}}:8089` It returns: **insufficient permission to access this resource** I have already tested an API call with the same credentials, and it was OK on both the deployer and the search head cluster member.

Error in TA-wazuh-api-connector

Hi, I have some problems with this TA. I installed it per the instructions, but in splunkd.log I see errors for all wazuh_api_* inputs. Splunk version 7.0.0, standalone.
----------------------------------
11-08-2017 12:55:40.905 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" Traceback (most recent call last): 11-08-2017 12:55:40.905 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/modinput_wrapper/base_modinput.py", line 113, in stream_events 11-08-2017 12:55:40.905 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" self.parse_input_args(input_definition) 11-08-2017 12:55:40.905 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/modinput_wrapper/base_modinput.py", line 152, in parse_input_args 11-08-2017 12:55:40.905 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" self._parse_input_args_from_global_config(inputs) 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/modinput_wrapper/base_modinput.py", line 171, in _parse_input_args_from_global_config 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" ucc_inputs = global_config.inputs.load(input_type=self.input_type) 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File 
"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/global_config/configuration.py", line 270, in load 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" input_item['entity'] 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/global_config/configuration.py", line 175, in _load_endpoint 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" **query 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/packages/splunklib/binding.py", line 287, in wrapper 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" return request_fun(self, *args, **kwargs) 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/packages/splunklib/binding.py", line 69, in new_f 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" val = f(*args, **kwargs) 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/packages/splunklib/binding.py", line 665, in get 11-08-2017 12:55:40.906 +0000 ERROR 
ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" response = self.http.get(path, self._auth_headers, **query) 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/packages/splunklib/binding.py", line 1160, in get 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" return self.request(url, { 'method': "GET", 'headers': headers }) 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/packages/splunklib/binding.py", line 1221, in request 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" raise HTTPError(response) 11-08-2017 12:55:40.906 +0000 ERROR ExecProcessor - message from "python /opt/splunk/etc/apps/TA-wazuh-api-connector/bin/wazuh_api_info_basic.py" HTTPError: HTTP 500 Internal Server Error -- {"messages":[{"type":"ERROR","text":"Unexpected error \"\" from python handler: \"REST Error [500]: Internal Server Error -- Traceback (most recent call last):\n File \"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py\", line 113, in wrapper\n for name, data, acl in meth(self, *args, **kwargs):\n File \"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py\", line 348, in _format_all_response\n self._encrypt_raw_credentials(cont['entry'])\n File \"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py\", line 
382, in _encrypt_raw_credentials\n change_list = rest_credentials.decrypt_all(data)\n File \"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/credentials.py\", line 286, in decrypt_all\n all_passwords = credential_manager._get_all_passwords()\n File \"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/utils.py\", line 154, in wrapper\n return func(*args, **kwargs)\n File \"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/credentials.py\", line 272, in _get_all_passwords\n clear_password += field_clear[index]\nTypeError: cannot concatenate 'str' and 'NoneType' objects\n\". See splunkd.log for more details."}]} 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_agents:api_server``splunk_cred_sep``1: is not utf8, skipping 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_agents:api_server``splunk_cred_sep``2: is not utf8, skipping 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_decoders:api_server``splunk_cred_sep``1: is not utf8, skipping 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_decoders:api_server``splunk_cred_sep``2: is not utf8, skipping 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_info_basic:api_server``splunk_cred_sep``1: is not utf8, skipping 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from 
stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_info_basic:api_server``splunk_cred_sep``2: is not utf8, skipping 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_rules:api_server``splunk_cred_sep``1: is not utf8, skipping 11-08-2017 12:55:41.617 +0000 ERROR PasswordHandler - Decrypted password from stanza=credential:__REST_CREDENTIAL__#TA-wazuh-api-connector#data/inputs/wazuh_api_rules:api_server``splunk_cred_sep``2: is not utf8, skipping 11-08-2017 12:55:41.642 +0000 ERROR AdminManagerExternal - Stack trace from python handler:\nTraceback (most recent call last):\n File "/opt/splunk/lib/python2.7/site-packages/splunk/admin.py", line 130, in init\n hand.execute(info)\n File "/opt/splunk/lib/python2.7/site-packages/splunk/admin.py", line 594, in execute\n if self.requestedAction == ACTION_LIST: self.handleList(confInfo)\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunk_aoblib/rest_migration.py", line 38, in handleList\n AdminExternalHandler.handleList(self, confInfo)\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/admin_external.py", line 40, in wrapper\n for entity in result:\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py", line 120, in wrapper\n raise RestError(500, traceback.format_exc())\nRestError: REST Error [500]: Internal Server Error -- Traceback (most recent call last):\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py", line 113, in wrapper\n for name, data, acl in meth(self, *args, **kwargs):\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py", line 348, in _format_all_response\n 
self._encrypt_raw_credentials(cont['entry'])\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py", line 382, in _encrypt_raw_credentials\n change_list = rest_credentials.decrypt_all(data)\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/credentials.py", line 286, in decrypt_all\n all_passwords = credential_manager._get_all_passwords()\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/utils.py", line 154, in wrapper\n return func(*args, **kwargs)\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/credentials.py", line 272, in _get_all_passwords\n clear_password += field_clear[index]\nTypeError: cannot concatenate 'str' and 'NoneType' objects\n\n 11-08-2017 12:55:41.642 +0000 ERROR AdminManagerExternal - Unexpected error "" from python handler: "REST Error [500]: Internal Server Error -- Traceback (most recent call last):\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py", line 113, in wrapper\n for name, data, acl in meth(self, *args, **kwargs):\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py", line 348, in _format_all_response\n self._encrypt_raw_credentials(cont['entry'])\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/handler.py", line 382, in _encrypt_raw_credentials\n change_list = rest_credentials.decrypt_all(data)\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/splunktaucclib/rest_handler/credentials.py", line 286, in decrypt_all\n all_passwords = credential_manager._get_all_passwords()\n File "/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/utils.py", line 154, in wrapper\n return func(*args, **kwargs)\n File 
"/opt/splunk/etc/apps/TA-wazuh-api-connector/bin/ta_wazuh_api_connector/solnlib/credentials.py", line 272, in _get_all_passwords\n clear_password += field_clear[index]\nTypeError: cannot concatenate 'str' and 'NoneType' objects\n". See splunkd.log for more details.

Error in model "Network_Resolution" Cannot add field 'reply_code_id' because it already exists

Error in model "Network_Resolution": Cannot add field 'reply_code_id' because it already exists in object 'DNS'. It's a problem with splunkbase.splunk.com/app/1621/ Also, the data model Network_Resolution has an empty name (prntscr.com/h7sas3) and empty datasets (prntscr.com/h7sb8b).

Help with previous week's index totals by day report

I have this report for one day's license usage emailed to me daily, showing totals by index and a daily total. I want a report in this same format repeated five times, showing the previous week's days, Monday through Friday, and I'm not sure how to accomplish it. It's a manager report, so they can see daily totals for the previous week, emailed out on Monday morning. The email and schedule part is not the issue for me, just the SPL learning curve. index=_internal earliest=-1d@d latest=-0d@d source=*license_usage.log* type=Usage | bucket span=1d _time | stats sum(b) as bytes by _time idx | eval gb=round(bytes/1024/1024/1024,3) | addtotals row=false col=true | rename idx as Index | rename gb as "Total GB"
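A possible sketch building on the search above: widen the time range to the previous Monday-Friday, then pivot so each day becomes a column. The `@w1` snap means that, when the report runs on Monday, `-7d@w1` is the prior Monday and `-2d@w1` is Saturday 00:00, covering Mon-Fri (verify the boundaries in your environment):

```
index=_internal source=*license_usage.log* type=Usage earliest=-7d@w1 latest=-2d@w1
| bucket span=1d _time
| stats sum(b) AS bytes BY _time idx
| eval gb=round(bytes/1024/1024/1024,3)
| eval day=strftime(_time, "%a %m/%d")
| xyseries idx day gb
| addtotals fieldname="Week Total"
| rename idx AS Index
```

xyseries produces one row per index with one GB column per day, and addtotals appends a per-index weekly total, keeping the per-day breakdown readable in a single emailed table.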

How to Restrict Index to only one user?

Hello, I need to upload sensitive data into Splunk to do my analysis, and I only want myself to be able to see the data. Would the process below ensure that, or do I need to take any other steps?
- create a new index "test"
- create a new user role "testuser"
- ensure that all other user roles cannot search the index "test"
- set the new user role "testuser" to search only the index "test"
- assign myself the role "testuser"
- upload my data into the index "test"
A few other questions: my Splunk uses LDAP authentication, but can I just add the Splunk role "testuser" without adding an LDAP authentication step/group for that role? Also, the default for my normal user role is "all internal indexes"; this would not include my new index "test", correct?
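The role part of the steps above can be sketched in authorize.conf (a minimal sketch; role and index names from the question):

```
# authorize.conf -- a role that can search only the "test" index
[role_testuser]
srchIndexesAllowed = test
srchIndexesDefault = test
```

Note that restricting this role is not enough on its own: any role with srchIndexesAllowed = * (admin typically has this) can still search the index, so the "ensure all other roles cannot search it" step means auditing every existing role's srchIndexesAllowed setting.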

Splunk ES Adaptive Response Action Adhoc invocation failed

Hello, We have an AR action, and it works fine with a correlation search. But when we try to invoke it as an ad hoc action, it fails with the following error message: ActiveResponseException: Invalid parameter for adhoc modular action. We use the sendalert command in our alert_actions.conf, so according to the Splunk documentation it should support ad hoc invocation. The command in our alert_actions.conf follows the Splunk example for adaptive response: command = sendalert $action_name$ results_file="$results.file$" results_link="$results.url$" param.action_name=$action_name$ | stats count None of the log files in the $SPLUNK_HOME/var/log/splunk folder provides useful information. How can we debug this, please? Thanks!

Lookups excluding answers if multiple same lines are found

I have a lookup that end users can update. However, they might make a mistake and put in the same data twice. The issue is that when this happens, Splunk won't return either result, so the data is lost.

Initial search:
.....| lookup lookup.csv Context_Command AS "Context+Command" OUTPUT Tags CC_Description Threshold

Example 1 - Working. With this lookup table I get 10 rows returned to me (as I should); it finds a match for NULL#Login, and this is good:
Context_Command  CC_Description  Tags   Alert  Threshold
NULL#Login       TEST2           TEST2  y      5

Example 2 - Not working. With this lookup table I get 8 rows returned to me, and NULL#Login is excluded:
Context_Command  CC_Description  Tags   Alert  Threshold
NULL#Login       TEST2           TEST2  y      5
NULL#Login       TEST2           TEST2  y      5

I know this is a human problem; however, this file can have hundreds if not thousands of lines, and this will become difficult to manage.
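One possible mitigation is to clean the file automatically rather than rely on the users: a small scheduled search that rewrites the lookup with duplicates removed. A sketch (dedup keeps the first occurrence of each Context_Command):

```
| inputlookup lookup.csv
| dedup Context_Command
| outputlookup lookup.csv
```

Scheduled to run shortly before the searches that consume the lookup, this keeps the file free of the duplicate rows that trip up the match.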

Issue with Splunk TA-Tomcat, not parsing properly

Hi Team, we have an issue with the Splunk add-on TA-Tomcat: the logs are not parsing properly. Can you help with troubleshooting? I guess we can fix it via props and transforms, but I'm not sure how to investigate and fix it. Thanks, Koushik

need help in formatting the data

Hello All, I'm trying to format JSON data with a custom sourcetype. Below are my sample events:

{"formatVersion":"1.0", "vendor":"BeyondTrust","product":"BeyondInsight","version":"6.3.1","agentid":"PBPS","severity":"0","eventid":"PBPS","eventname":"Requestor","eventdesc":"Request Response Expire","eventdate":"Nov 07 2017 21:31:11","sourcehost":"test-vm-1","sourceip":"127.0.0.1","eventsubject":"0127.0.00.001","eventtype":"0","user":"ssltest", "nvps" : {"clienthost":"test-vm-1", "eventseverity":"0", "logsystemid":"121", "logtime":"11/07/2017 21:31:11", "username":"ssltest", "userid":"2", "roleused":"Requestor", "objecttypeid":"7", "objecttype":"Request Response", "objectid":"14", "operation":"Expire", "failed":"False", "target":"localhost/btuser", "details":"ReleaseRequest #9"}}
{"formatVersion":"1.0", "vendor":"BeyondTrust","product":"BeyondInsight","version":"6.3.1","agentid":"PBPS","severity":"0","eventid":"PBPS","eventname":"System","eventdesc":"Release Request Expire","eventdate":"Nov 07 2017 21:31:11","sourcehost":"test-vm-1","sourceip":"127.0.0.1","eventsubject":"0127.0.00.001","eventtype":"0","user":"Internal process","workgroupid":"1","workgroupdesc":"BeyondTrust Workgroup", "nvps" : {"clienthost":"test-vm-1", "eventseverity":"0", "logsystemid":"122", "logtime":"11/07/2017 21:31:11", "username":"Internal process", "userid":"0", "roleused":"System", "objecttypeid":"6", "objecttype":"Release Request", "objectid":"9", "operation":"Expire", "failed":"False", "target":"ManagedSystem=localhost ManagedAccount=btuser", "details":"ReleaseRequest #9, Ticket #, TicketSystem="}}

(In the raw feed the two events are concatenated back to back, with no line break between them.) And props.conf is:
TIME_PREFIX=\"eventdate\":\"
TIME_FORMAT= %b %d %Y %H:%M:%S
LINE_BREAKER=([\r\n]+)\s*\{"formatVersion
SHOULD_LINEMERGE=false
ANNOTATE_PUNCT=false
TRUNCATE = 0
KV_MODE=json
AUTO_KV_JSON=true
I'm facing an issue with the line breaker; can anyone help me?
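One likely cause, given that the events arrive concatenated as `}}{"formatVersion`: the LINE_BREAKER above requires a newline (`[\r\n]+`) between events, which never occurs. A sketch of a possible fix (stanza name hypothetical) that breaks between the closing braces and the next opening brace, using an empty capture group so no characters are discarded at the boundary:

```
[your_custom_sourcetype]
SHOULD_LINEMERGE = false
# break between }} and {"formatVersion; the empty group () discards nothing
LINE_BREAKER = \}\}()\{"formatVersion
TIME_PREFIX = "eventdate":"
TIME_FORMAT = %b %d %Y %H:%M:%S
KV_MODE = json
TRUNCATE = 0
```

In LINE_BREAKER, the first capture group marks the event boundary and its contents are removed from the stream, so keeping it empty preserves both the trailing `}}` of one event and the leading `{` of the next, leaving each event as valid JSON.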

nested eval command in search query.

Hi, I have to use a nested eval command in my search query. **Requirement:** if isnotnull(GC_TIMESTAMP), then set _time = GC_TIMESTAMP; else, if $Log_or_live$=="Log", set _time = $mlc_log_start_time$ + relative_time; else leave _time = _time. I tried to get it working, but it's malformed. Please help me correct the syntax: eval _time=if(isnotnull(GC_TIMESTAMP),GC_TIMESTAMP,(if($Log_or_live$=="Log"),$mlc_log_start_time$ + relative_time,_time)))
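A possible correction: in the attempt above, the inner `if(...)` is closed right after its condition, cutting off its second and third arguments. Keeping all three arguments inside the inner if gives (sketch; the $...$ tokens are dashboard tokens, and if $Log_or_live$ expands to a bare word it needs double quotes around it to be compared as a string):

```
... | eval _time=if(isnotnull(GC_TIMESTAMP), GC_TIMESTAMP, if("$Log_or_live$"=="Log", $mlc_log_start_time$ + relative_time, _time))
```

Each if takes exactly three arguments (condition, value-if-true, value-if-false), so nesting means the whole inner if sits in the third slot of the outer one.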

How to send data to an index cluster using HTTP REST POST via /services/receivers/simple

Hi, we want to send data to an index cluster using HTTP REST POST via /services/receivers/simple. We tried using the master node's IP, but then the data is not replicated to the peer nodes; it just stays on the master node. What should we do differently to use /services/receivers/simple with an indexer cluster? Thanks in advance for your reply. Regards, Vishal

How can I correlate current splunk events with historical field value pairs?

I have a complex audit request that has been quite difficult to provide results for via Splunk. The data set concerns vulnerability management information from multiple scanning appliances. For the purpose of this question, a given event contains the following fields: **ip_address, data_ctr, signature, cvss**. Events are generated per **signature**, so each event will have a unique **signature** value but may have duplicate values for **ip_address, cvss,** and/or **data_ctr**.

Goal: identify unique **ip_address** values per **data_ctr** where the **cvss** value falls in one of two ranges (7.0-8.9 = high, 9.0-10 = critical). Once this list has been established, I need to pair the unique **signature** and **ip_address** field values to determine whether this unique combination has existed in the following time ranges: 0-30 days ago, 31-60 days ago, 61+ days ago. If the pair has been seen across multiple spans, use the latest event to determine where it falls.

ip_address  cvss  signature   0-30 days  31-60 days  61+ days
1.2.3.4     high  OpenSSL     FALSE      TRUE        FALSE
1.2.5.6     high  Weak SNMP   TRUE       FALSE       FALSE
1.3.8.9     crit  WannaCry    FALSE      FALSE       TRUE

One of the difficulties in correlating all of this information is the initial search used to identify the **ip_address** values. I only want to look at the most recent events for a given **ip_address** to determine whether it falls within the established **cvss** threshold. To do this, I have used the following search:

splunk_server_group=xxx index=xxx sourcetype=xxx |join audit_id [search index=xxx sourcetype=xxx | where status="Finished"|dedup name]

This base search basically tells Splunk to only look at events (or scan data) from unique scans (**audit_id**) that have finished, and to throw away older data based on the scan **name**. In this way we are able to narrow our searches down to the most recent, complete data sets. This search is pivotal in determining which ip_addresses are flagging signatures at a point in time, and should not be edited unless absolutely necessary. The issue is that this join eliminates all of the historical data that is needed to determine whether unique **ip_address** / **signature** field value pairs have flagged in the past. Is there a way to hold on to this historical data so that it can be referenced?

I realize this is a complex ask, and I appreciate any and all help that could put me on the right path forward. Please let me know if there is any additional information needed to better understand my situation. Thanks!

Anyone have a good maintenance routine document that I can copy?

Good morning, I am just trying to formalize, automate, and hopefully hand off some Splunk maintenance to some junior employees, trying to free up some time, that sort of thing. Does anyone already have a document or site you can refer me to that lays out a normal maintenance routine for Splunk? Thanks, Daniel

Splunk Stream: Failed to detect Splunk_TA_stream status

I just installed the Stream app on an on-prem heavy forwarder, and when I select "Collect data from this machine using Wire Data input (Splunk_TA_stream)." I get the following error: Failed to detect Splunk_TA_stream status. The splunk_app_stream log shows me: Error getting the streamfwd auth, return streamfwd auth is disabled Has anyone encountered this issue? If so, can you please provide insight on how to solve it? Regards,

