Hi all,
I want to create a correlation search to further enhance our current Splunk security alerts by correlating searches across multiple host and device logs, then merging the results and showing them together. For my first attempt, I would like to merge the results of our firewall logs and web server logs.
Take, for example, the search query below for the web server logs:
**host="web server" Source_IP="IP Address"| bucket span=1m _time | stats values(URL_Action) as URL, values(response) AS http_response count by _time , Source_IP**
The above search shows the URL web activity and its HTTP response associated with the specified source IP, per minute.
Below is the search query for the Check Point firewall logs:
**host=checkpoint src="IP Address" | bucket span=1m _time | stats values(orig) AS firewall_type, values(s_port) AS source_port, values(dst) AS destination_ip, values(service) AS service, values(action) AS Action, count BY _time, src**
The above search shows the firewall activity (firewall zone, source port, destination, destination port, accept/drop action, and so on) associated with the specified source IP, per minute.
For this correlation search, I am attempting to correlate and merge the results of the web server logs and the firewall logs based on the source IP field (src in one, Source_IP in the other). I have already tried eval correlation_field=case(isnotnull(...)) and eval correlation_field=coalesce(...), but neither produced the output I wanted. I have also tried other approaches such as stats and chart, to no avail.
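For reference, this is roughly the shape of the coalesce attempt (only a sketch, with the field list abbreviated):
`host="web server" Source_IP="IP Address" OR (host=checkpoint src="IP Address")
| eval correlation_field=coalesce(Source_IP, src)
| bucket span=1m _time
| stats values(URL_Action) AS URL, values(response) AS http_response, values(dst) AS destination_ip, values(action) AS Action, count BY _time, correlation_field`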
Please advise on a correlation search that merges results from multiple host and device logs. Ideally, I would like to start from our firewall alert that detects a suspicious source IP with a high count of dropped traffic, and then correlate against other device logs to see whether the same source IP is attempting activity elsewhere. This would further improve our security alerting, since it would show not only the firewall results but also results from other device logs.
Let me know if you need anything further.
↧
Create correlation search from multiple host or device logs based on the Source IP field
↧
Sparkline not working in 7.2.6
Hi
I am on version 7.1.6 and want to move to 7.2.6, but I have noticed that sparklines don't work in the new version.
Or am I missing something?
This is the SPL
index=mlc_live | table host _time | chart sparkline count by host | fields - count
When I change from Fast Mode to Verbose Mode it works, but you can't save a search that way.
Any help would be great.
↧
systemd journald logs not being read by TA_NIX
Why doesn't the Unix / Linux add-on have default support for picking up logs from systemd's journald?
I realize journald is a binary format, but I don't think it makes sense to ignore that data or to ask each Splunk user to find a manual way of pulling it in. It should be part of this app.
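(For context, the manual workaround I have seen suggested is a scripted input that shells out to journalctl; a sketch only, with the script name and interval made up:)
`# inputs.conf
[script://./bin/journald.sh]
interval = 60
sourcetype = systemd:journald
index = os`
where bin/journald.sh runs something like `journalctl --since "-60s" --no-pager -o json`.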
Will Splunk please fix this?
Thanks!
↧
perfmon Accurate Memory Utilization Percentage
We are starting to use the Splunk App for Infrastructure to ingest Windows metrics. One of the KPIs we want to measure is the percentage of memory used on servers, but that does not appear to be an available perfmon counter. The only way to calculate the percentage of memory actually used seems to be to capture the server's total memory somehow and then do a calculation like:
(totalMem - Available_MBytes)/totalMem * 100
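In SPL I imagine it would look something like this (only a sketch; the host_memory lookup is hypothetical and would have to be populated separately, e.g. from a WMI or PowerShell input):
`sourcetype="Perfmon:Memory" counter="Available MBytes"
| lookup host_memory host OUTPUT total_mb
| eval mem_used_pct=round((total_mb - Value)/total_mb*100, 2)
| timechart avg(mem_used_pct) BY host`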
How are others capturing total memory and calculating memory usage in a percentage?
↧
User requests to Splunk servers
Hi,
I have more than 22 search heads and 18 indexers.
I want to know how many requests are coming to each search head and indexer.
Can someone help me with a basic query for this?
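Something along these lines is the idea (just a sketch; I am not sure _audit is the right place to count requests from):
`index=_audit action=search info=granted
| stats count AS search_requests BY host`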
↧
Data not readable on receiver
Hello All,
I have a question. It seems that I am unable to correctly configure forwarding from a server which has the Universal Forwarder installed (and which acts as if it is forwarding data).
On the forwarder I have inputs set to a log file, and outputs set to the Splunk Enterprise Server.
I have attempted (via both the web interface and the CLI) to configure a receiver on everyone's favorite port: 9997.
I have not configured anything in "Data Inputs" or "Monitoring" on the Splunk Enterprise server.
I get NO data from the server with the Universal Forwarder installed.
If I delete the receiver port (9997), go to the Add Data area, select Monitor, and then add port, IP, a generic one-line sourcetype,
and an index, then I get data in, but it is all unreadable slashes and zeros, etc.
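For reference, here is roughly what I have on the forwarder (paths and the server address are changed, so treat this as approximate):
`# inputs.conf on the Universal Forwarder
[monitor:///var/log/myapp.log]
index = main
sourcetype = myapp

# outputs.conf on the Universal Forwarder
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = splunk-server:9997`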
So my question is: what am I missing here?
Thanks
eholz1
↧
Is there a roadmap to provide a Cloud data model for the CIM app
Hello,
I was wondering whether there is an enhancement request with Splunk to define a data model specifically for cloud service providers.
There are some TAs for AWS, Azure, and Google out on Splunkbase.
My organization is considering defining a custom data model for data from AWS/Azure.
However, we would like to ask whether something is already in the works from the Splunk gurus.
↧
Problems collecting logs from Check Point OPSEC
Hello!
We have a problem with OPSEC LEA. Splunk restarted because of a storage problem (it runs on CentOS 7), and since then we cannot collect logs from Check Point; we get the following message:
2019-05-10 17:59:58,532 +0000 log_level=ERROR, pid=2081, tid=Thread-4, file=ta_data_collector.py, func_name=index_data, code_line_no=108 | [input_name="live_checkpoint_fixed_new1" data="non_audit"] Failed to index data, reason:
Traceback (most recent call last):
File "/opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/splunk_ta_checkpoint_opseclea/splunktaucclib/data_collection/ta_data_collector.py", line 105, in index_data
self._do_safe_index()
File "/opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/splunk_ta_checkpoint_opseclea/splunktaucclib/data_collection/ta_data_collector.py", line 148, in _do_safe_index
self._client = self._create_data_client()
File "/opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/splunk_ta_checkpoint_opseclea/splunktaucclib/data_collection/ta_data_collector.py", line 73, in _create_data_client
ckpt = self._get_ckpt()
File "/opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/splunk_ta_checkpoint_opseclea/splunktaucclib/data_collection/ta_data_collector.py", line 64, in _get_ckpt
return self._checkpoint_manager.get_ckpt()
File "/opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/splunk_ta_checkpoint_opseclea/splunktaucclib/data_collection/ta_checkpoint_manager.py", line 32, in get_ckpt
return self._store.get_state(key)
File "/opt/splunk/etc/apps/Splunk_TA_checkpoint-opseclea/bin/splunk_ta_checkpoint_opseclea/splunktalib/state_store.py", line 202, in get_state
state = json.load(jsonfile)
File "/opt/splunk/lib/python2.7/json/__init__.py", line 291, in load
**kw)
File "/opt/splunk/lib/python2.7/json/__init__.py", line 339, in loads
return _default_decoder.decode(s)
File "/opt/splunk/lib/python2.7/json/decoder.py", line 364, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/opt/splunk/lib/python2.7/json/decoder.py", line 382, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
We hope someone can help us.
Regards.
↧
Compare a field date with current date for alert
I have a simple Windows script that collects CRL expiration dates and runs as a scheduled task every 24 hours:
`echo | set /P = "%date:~4,10% %time:~1,7% " >> c:\crl_expiration.log
echo | set /P = "crl1.crl " >> c:\crl_expiration.log
openssl crl -inform DER -in \\x.x.x.x\crl\crl1.crl -noout -nextupdate >> c:\crl_expiration.log
echo.>>c:\crl_expiration.log`
The log output looks like this:
`05/09/2019 13:00:01 crl1.crl nextUpdate=May 15 17:00:00 2019 GMT
05/09/2019 13:00:02 crl2.crl nextUpdate=May 15 17:00:00 2019 GMT
05/09/2019 13:00:05 crl3.crl nextUpdate=May 15 17:00:00 2019 GMT`
These get indexed by Splunk.
I need to compare the CRL expiration date listed as nextUpdate with today's date, and I need to create an alert if the CRLs are going to expire soon.
`| eval dateadded_epoch = strptime('Date Added', "%b %d %H:%M:%S %Y") | where dateadded_epoch >= relative_time(now(), "-1d@d")`
I have not been able to get this to work for some reason, and if someone has a suggestion for a better way to do this, it would be welcome.
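For clarity, here is the direction I think I need to go (a rough sketch; the rex pattern and the 7-day threshold are guesses on my part):
`| rex "nextUpdate=(?<next_update>\w+ \d+ \d+:\d+:\d+ \d{4}) GMT"
| eval next_update_epoch=strptime(next_update, "%b %d %H:%M:%S %Y")
| eval days_left=round((next_update_epoch - now())/86400, 1)
| where days_left < 7`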
↧
Need to Calculate Response Time matching Index ID between 2 lines in the search
![alt text][1]
[1]: /storage/temp/272686-splunk.jpg
If you look at the attached screenshot: due to multiple calls at the same time, the response sometimes takes a while, and we need to match the ID and calculate the difference.
Right now I am using the query below; it works well when the request and response come next to each other.
index=e1prd host=pite1bspd1 sourcetype=e1_npd_bssv *PublishedMethod* | transaction transid startswith="startPublishedMethod" endswith="finishPublishedMethod" | rename duration AS Response_Time | table Instance_ID, Response_Time
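I am wondering whether a stats-based approach along these lines would pair the events more reliably than transaction when calls overlap (just a sketch):
`index=e1prd host=pite1bspd1 sourcetype=e1_npd_bssv *PublishedMethod*
| stats min(_time) AS start_time, max(_time) AS end_time, values(Instance_ID) AS Instance_ID BY transid
| eval Response_Time=end_time-start_time
| table Instance_ID, Response_Time`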
Please advise
Thanks,
Santhosh
↧
Splunk OVA for VMware - when updating to CentOS 7?
Does anyone have information on when the OVAs will be built on CentOS 7? CentOS 6 is EOL 11/30/2020, and our UNIX team strongly prefers CentOS 7 at this point.
I am aware we can build our own DCNs on CentOS 7, which we may opt to do. Just figured I'd ask if this is on the radar.
Matt
↧
Avecto Log Ingestion
Hi,
I'm trying to get Avecto data to my Splunk indexer.
The current environment: I have a Splunk app that sends WinEvent:Application logs to the indexer, with some event codes whitelisted.
What I want is to create a separate app that collects the Avecto logs. However, the Avecto logs only exist in WinEvent:Application and do not appear under Applications and Services Logs, so I cannot use a "Full name" channel in my inputs.conf stanza.
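For reference, this is the kind of stanza I have been experimenting with (a sketch; I am not sure the SourceName filter syntax is correct):
`[WinEventLog://Application]
index = avecto
whitelist = SourceName=%Avecto%`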
I found an Avecto and Splunk integration guide online and tried it: https://www.beyondtrust.com/docs/privilege.management/documents/mac/pm-splunk-integration-guide-1-0-0.pdf
However, it failed to collect the logs, saying: WinEventMon::configure: Failed to find Event Log with channel name="Avecto Defendpoint Service".
Is there any way I can build my Avecto app without changing my WinEvent:Application app?
Thanks.
↧
Splunk HTTP Event Collector
Hi, I am currently running into 2 issues. First, all my HTTP Event Collector tokens are disabled, and the UI says to enable global settings; when I click to enable global settings, it says: unable to write to /etc/apps/httpinput/disabled:0. Please help.
The second issue is that DB Connect v3.1.4 says it cannot contact the task server port. When I try to change the port to 1025 or 1030 and restart, it fails to restart. Please guide me on this.
↧
How to add hyperlink to column value in Splunk dashboard?
Hi,
Can someone please advise how to add a hyperlink to the column fields in a table?
E.g.
How can I make the NewField value a hyperlink on the Field1 values (e.g. ID01 and ID02) in the table below, instead of a separate field?
Table:

    Field1    NewField
    ID01      http://company.test.com/page-ID01
    ID02      http://company.test.com/page-ID02
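I suspect the answer involves a table drilldown in Simple XML, something like this sketch (the link format is my guess):
`<table>
  <search>
    <query>my table search here</query>
  </search>
  <drilldown>
    <link target="_blank">http://company.test.com/page-$row.Field1$</link>
  </drilldown>
</table>`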
↧
map command returns main index even searching _internal only
I tried to test the map command on Splunk 7.1.3 with the following search:
index=_internal earliest=-60m | map maxsearches=1 search="search index=_internal earliest=-6m latest=-1m | head 1"
Theoretically, this search should return only one event, from index=_internal.
However, lots of events from the main index are returned:
![alt text][1]
[1]: /storage/temp/273643-screen-shot-2019-05-11-at-40751-pm.png
Is this a bug?
↧
Phantom app for Splunk
There was an error adding the server configuration. Verify server's 'Allowed IPs' and authorization configuration.
Status: 403
Text: Forbidden
I am getting this error when trying both SSL and non-SSL configurations between Splunk and Phantom,
with the local "192.168.0.0/16" allowed IP or "any".
↧
Help calculating an interval in seconds between the current date and an event timestamp
Hello
I need to monitor the interval between the now() date and each event's creation date, so I need to calculate the difference between these two dates in seconds.
I get no error when I execute my search, but the DiffInSeconds eval does not work correctly, so I think something is wrong.
For example, if the now() time is 9:05 and the event creation time is 9:00 (so 300 seconds), I should normally see some events displayed, because if I run just index="x" sourcetype="winhostmon" Type=Service Name=SplunkForwarder I do get events.
So what is the issue, please?
index="x" sourcetype="winhostmon" Type=Service Name=SplunkForwarder
| eval timenow =now()
| eval EventCreatedTime=_time
| eval DiffInSeconds = (timenow - EventCreatedTime)
| eval Status=if(DiffInSeconds<900, "Online", "Offline")
| convert timeformat="%d-%b-%Y %H:%M:%S %p %Z" ctime(EventCreatedTime)
| table host EventCreatedTime DiffInSeconds Status
↧
help on tostring function
Hello
I am doing the distinct count below in my search:
| stats dc(host) AS OnlineCount by Code
| where Code = "Online"
| fields OnlineCount
| appendpipe
[ stats count
| where count=0]
But I also need to add text after the distinct count.
So I am doing this after the distinct count, but I get nothing:
| eval dc = if(dc== 0, "no host", tostring(dc) + " hosts")
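(For what it is worth, I wonder whether the field name is the problem, since the stats renames the count to OnlineCount; something like this sketch is what I am trying to achieve:)
`| eval OnlineCount=if(OnlineCount == 0, "no host", tostring(OnlineCount) + " hosts")`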
Could you help me, please?
↧
Key Error in Python Splunk SDK while accessing KV store
Hi,
I'm getting a KeyError while accessing the Splunk KV store using the Splunk Python SDK.
Python version: 2.7
Splunk SDK version: 1.6.6
OS: Windows 7 & 10
Code:
import sys, json
from splunklib.client import connect
import getpass

def main():
    # Getting Splunk access details from the user
    hostname = raw_input("Enter Splunk Hostname/IP: ")
    splunkUser = raw_input("Enter Splunk Admin Username: ")
    splunkPassword = getpass.getpass("Enter Splunk Admin Password: ")
    HOST = hostname
    PORT = 8089
    USERNAME = splunkUser
    PASSWORD = splunkPassword
    # Connecting to the Splunk server
    splunk_service = connect(host=HOST, port=PORT, username=USERNAME, password=PASSWORD)
    print("Splunk connection succeed !!!!")
    splunk_service.namespace['owner'] = 'Nobody'
    collection_name = "test"
    collection = splunk_service.kvstore[collection_name]
    if collection_name in splunk_service.kvstore:
        print("{} Already exist !!!".format(collection_name))
    else:
        splunk_service.kvstore.create(collection_name)
    # Adding data to the KV store
    collection.data.insert(json.dumps({"ID": "123", "sId": 1234, "Tech": "text"}))
    # Retrieving data from the KV store
    kv_data = json.dumps(collection.data.query(), indent=1)
    print(kv_data)

if __name__ == '__main__':
    main()
Error:
Traceback (most recent call last):
  File "kv_store_opps.py", line 36, in <module>
    main()
  File "kv_store_opps.py", line 21, in main
    collection = splunk_service.kvstore[collection_name]
  File "C:\Python27\lib\site-packages\splunklib\client.py", line 1243, in __getitem__
    raise KeyError(key)
KeyError: UrlEncoded('test')
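Looking at the traceback, the failure is on the kvstore lookup line, so I wonder whether the lookup simply needs to happen after the existence check, something like this sketch:

    # hypothetical reordering: create the collection (if missing) before looking it up
    if collection_name not in splunk_service.kvstore:
        splunk_service.kvstore.create(collection_name)
    collection = splunk_service.kvstore[collection_name]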
Any help on this much appreciated.
Thanks,
Mani
↧
Splunk for desktop environment
Can Splunk be used to collect and manage Windows 10 event traces and performance data? Are there any use cases where Splunk was used to manage an end-user environment (desktops/laptops)? Any help will be appreciated.
↧