I have some data in Splunk that I would like to send over to a custom Servicenow table for record creation. Right now, I am trying to do this with a custom search command that invokes a python script. The python script makes a REST API post request to the Servicenow table.
The script is making contact with the Servicenow table, but I am having trouble sending actual field values from the Splunk search results to the Servicenow table. For example, my Splunk search results contain fields such as "asset ip address", "asset name", and "cve", and I would like to send the field values for each of those fields to the matching columns in the Servicenow table. **If anyone knows how to send actual field values from search results to a SNOW table using a custom search command / python script, your assistance would be greatly appreciated.**
Here is the script I have been trying:
#Need to install requests package for python
#easy_install requests
import requests
import math
import csv
import sys
import re
import time
import splunk.Intersplunk
#Test 1
# Custom streaming command to pass events
def customcommand(results):
    try:
        for result in results:
            u_asset_ip_address = result['Asset IP Address']
            u_asset_names = result['Asset Names']
            u_asset_os_name = result['Asset OS Name']
            u_asset_risk_score = result['Asset Risk Score']
            u_exploit_count = result['Exploit Count']
            u_malware_kit_count = result['Malware Kit Count']
            u_service_name = result['Service Name']
            u_service_port = result['Service Port']
            u_service_product = result['Service Product']
            u_service_protocol = result['Service Protocol']
            u_site_names = result['Site Name']
            u_vulnerability_age = result['Vulnerability Age']
            u_vulnerability_cve_ids = result['Vulnerability CVE IDs']
            u_vulnerability_cvssv3_score = result['Vulnerability CVSSv3 Score']
            u_vulnerability_id = result['Vulnerability ID']
            u_vulnerability_proof = result['Vulnerability Proof']
            u_vulnerability_risk_score = result['Vulnerability Risk Score']
            u_vulnerability_reference_ids = result['Vulnerability Reference IDs']
            u_vulnerability_severity_level = result['Vulnerability Severity Level']
            u_vulnerability_title = result['Vulnerability Title']
            u_vulnerable_since = result['Vulnerable Since']
    except:
        import traceback
        stack = traceback.format_exc()
# Get the events from splunk
results, dummyresults, settings = splunk.Intersplunk.getOrganizedResults()
# Send the events to be worked on
results = customcommand(results)
# Set the request parameters
url = 'https:///api/now/import/u_splunk_vulnerability_import'
# Eg. User name="admin", Password="admin" for this code sample.
user = 'user'
pwd = 'pass'
# Set proper headers
headers = {"Content-Type":"application/json","Accept":"application/json"}
# Do the HTTP request
response = requests.post(url, auth=(user, pwd), headers=headers ,data="{\"u_asset_ip_address\":\"'{u_asset_ip_address}'\",\"u_asset_names\":\"'{u_asset_names}'\",\"u_asset_os_name\":\"'{u_asset_os_name}'\",\"u_asset_risk_score\":\"'{u_asset_risk_score}'\",\"u_exploit_count\":\"'{u_exploit_count}'\",\"u_malware_kit_count\":\"'{u_malware_kit_count}'\",\"u_service_name\":\"'{u_service_name}'\",\"u_service_port\":\"'{u_service_port}'\",\"u_service_product\":\"'{u_service_product}'\",\"u_service_protocol\":\"'{u_service_protocol}'\",\"u_site_names\":\"'{u_site_names}'\",\"u_vulnerability_age\":\"'{u_vulnerability_age}'\",\"u_vulnerability_cve_ids\":\"'{u_vulnerability_cve_ids}'\",\"u_vulnerability_cvssv3_score\":\"'{u_vulnerability_cvssv3_score}'\",\"u_vulnerability_id\":\"'{u_vulnerability_id}'\",\"u_vulnerability_proof\":\"'{u_vulnerability_proof}'\",\"u_vulnerability_reference_ids\":\"'{u_vulnerability_reference_ids}'\",\"u_vulnerability_risk_score\":\"'{u_vulnerability_risk_score}'\",\"u_vulnerability_severity_level\":\"'{u_vulnerability_severity_level}'\",\"u_vulnerability_title\":\"'{u_vulnerability_title}'\",\"u_vulnerable_since\":\"'{u_vulnerable_since}'\",\"sys_target_table\":\"\"}")
# Check for HTTP codes other than 200
if response.status_code != 200:
print('Status:', response.status_code, 'Headers:', response.headers, 'Error Response:',response.json())
exit()
# Decode the JSON response into a dictionary and use the data
data = response.json()
print(data)
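For reference, here is a minimal sketch of one way to post the field values of each result, assuming the requests json= parameter is acceptable to the ServiceNow import set API; the instance URL, credentials, and column list below are placeholders, not the actual values:

import requests
import splunk.Intersplunk

def send_results(results):
    # Placeholders -- substitute the real instance, credentials, and column names
    url = 'https://YOUR_INSTANCE.service-now.com/api/now/import/u_splunk_vulnerability_import'
    user = 'user'
    pwd = 'pass'
    headers = {"Content-Type": "application/json", "Accept": "application/json"}
    for result in results:
        # Build one JSON payload per Splunk result, mapping result fields to table columns
        payload = {
            "u_asset_ip_address": result.get('Asset IP Address', ''),
            "u_asset_names": result.get('Asset Names', ''),
            "u_vulnerability_cve_ids": result.get('Vulnerability CVE IDs', ''),
            # ... the remaining u_* columns follow the same pattern ...
        }
        # json= serializes the dict and sets the request body, so no manual string building
        response = requests.post(url, auth=(user, pwd), headers=headers, json=payload)
        if response.status_code not in (200, 201):
            return splunk.Intersplunk.generateErrorResults(
                "ServiceNow import failed: %s %s" % (response.status_code, response.text))
    return results

results, dummyresults, settings = splunk.Intersplunk.getOrganizedResults()
results = send_results(results)
# Hand the events back to the search pipeline
splunk.Intersplunk.outputResults(results)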
↧
Anyone see something wrong with my sourcetype rename here?
All,
I am trying to rename a subset of logs. I am expecting the logs to get their sourcetype renamed, but they remain WinEventLog:Application instead of Trend.
[WinEventLog:Application]
TRANSFORMS-wintrans = idx2trend,st2trend,route_stubhinfo_to_es_TU
[st2trend]
REGEX = Trend Micro OfficeScan Server[\S\s]+product_version
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::trend
↧
How do you Sort by a Specific Order by values of a field
I am trying to figure out if there is a way to sort my table by the field "Whs", which has values of:
GUE -- I want to show rows for GUE data first
GUR -- followed by GUR
I also need to sort by a field called "Type", and the sort needs to follow this order of Type values:
Full_CS
Ovsz
PTL
B_Bay
Floor
then repeat in that order showing rows for GUR.
Here's my current query
index=test sourcetype="example"
| stats values(Machine) as Machine, values(Whs) as Whs, values(Dscrptn) as Dscrptn, values(Percent_Sorted) as Percent_Sorted, sum(Total_CS) as Total_CS, sum(Online_CS) as Online_CS, sum(Inducted_CS) as Inducted_CS, sum(FullCSUnits) as FullCSUnits, sum(Full_CS) as Full_CS, sum(BTS) as BTS, sum(Total_Packed) as Total_Packed, sum(Total_VAS) as Total_VAS, sum(In_VAS) as In_VAS, sum(Total_Non_Vas) as Total_Non_Vas, sum(eval(Total_Units_Machine - (Total_Units_Machine * Percent_Sorted / 100))) as RmnngUnits by Wave, _time
| table Machine, Whs, Wave, Dscrptn, Percent_Sorted, RmnngUnits, Total_CS, Online_CS, Inducted_CS, FullCSUnits, Full_CS, BTS, Total_VAS, Total_Non_Vas, In_VAS, Total_Packed
| sort WHS, Type
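Note that the posted query sorts on WHS (the field is Whs) and Type is not carried through the stats, which is why the sort has no effect. A minimal sketch of one way to impose the custom order, assuming Whs and Type are both present in the results (the whs_rank and type_rank fields are made up for illustration):
... | eval whs_rank=case(Whs=="GUE", 1, Whs=="GUR", 2, true(), 99)
| eval type_rank=case(Type=="Full_CS", 1, Type=="Ovsz", 2, Type=="PTL", 3, Type=="B_Bay", 4, Type=="Floor", 5, true(), 99)
| sort 0 whs_rank type_rank
| fields - whs_rank type_rank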
↧
Splunk_TA_Windows event blacklist not working
I have the below configuration in Splunk_TA_Windows inputs.conf to blacklist the NT AUTHORITY\SYSTEM events for event code 4663.
But my blacklist3 is not working as expected; I still get the events indexed.
Can someone help me resolve the issue?
[WinEventLog://Security]
disabled = 0
start_from = oldest
current_only = 0
evt_resolve_ad_obj = 1
checkpointInterval = 5
blacklist1 = EventCode="4662" Message="Object Type:(?!\s*groupPolicyContainer)"
blacklist2 = EventCode="566" Message="Object Type:(?!\s*groupPolicyContainer)"
***blacklist3 = EventCode="4663" Message="Security ID:(\w[NT]\s\w+.\w+)***
renderXml=false
index = winlogs
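A hedged sketch of an alternative blacklist3, assuming the rendered 4663 message contains the literal string NT AUTHORITY\SYSTEM (check an indexed event to confirm the exact text before relying on this):
blacklist3 = EventCode="4663" Message="(?i)NT AUTHORITY\\SYSTEM"
The posted version is also missing the closing quote on the Message value, which by itself would keep the blacklist from applying.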
↧
Counting events in a transaction
I need to count the number of particular events in a transaction. Here, I need to count the number of tickets that failed.
[2018-11-16 16:59:45 0665 - Scanned barcode: EndOfTicketBarcode, 2705600009993 (Referrer=2705600009993, POSNumber=056, Checksum=3)
2018-11-16 16:59:54 0003 - Send ticket failed
2018-11-16 17:00:06 1833 - Send ticket failed
2018-11-16 17:00:52 8165 - BLClient :: Connected to 84.10.32.2:8091],
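A minimal sketch of one way to count the failures per transaction, assuming the index/sourcetype names and the start/end boundary strings below (they are guesses based on the sample):
index=your_index sourcetype=your_sourcetype
| transaction startswith="Scanned barcode" endswith="BLClient :: Connected"
| eval failed_tickets=mvcount(split(_raw, "Send ticket failed")) - 1
| table _time failed_tickets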
↧
Converting log events to metrics using existing fields
Good morning all. I am reading the docs on how to create sourcetypes for metrics, but none go into how to just use existing fields instead of regex. I am using fluentbit to send metrics to HEC (and it works perfectly) in JSON format.
How do I use the existing fields to rewrite the sourcetype as metrics?
I included a screenshot of what the events look like.
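A hedged sketch of the log-to-metrics configuration introduced in Splunk 7.2, assuming a hypothetical sourcetype name and measure field names (replace them with whatever fluentbit actually sends, and route the data to a metrics index):
# props.conf
[fluentbit_json]
METRIC-SCHEMA-TRANSFORMS = metric-schema:extract_fluentbit_metrics
# transforms.conf
[metric-schema:extract_fluentbit_metrics]
# fields listed here become metric measures; the remaining fields become dimensions
METRIC-SCHEMA-MEASURES = cpu_p, mem_used, swap_used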
↧
Splunk connect for Kafka Issue - failed to send batch SplunkSinkTask
I have tried connecting with Splunk Connect for Kafka in multiple scenarios, but I keep receiving the error below and the data is not reaching my Splunk indexers. I have an intermediate forwarder between the Kafka server and the Splunk indexer. I have tried sending a test message from the Kafka server using the HEC token that was generated on the HF, and that works fine; the data reaches the indexer. But when I use the curl command as per the docs, I do not receive the data. Could you please advise:
ERROR failed to send batch (com.splunk.kafka.connect.SplunkSinkTask:261)
java.lang.IllegalArgumentException: Illegal character in scheme name at index 0: :8088/services/collector/raw?index=kafka_connect
at java.base/java.net.URI.create(URI.java:883)
at org.apache.http.client.methods.HttpPost.<init>(HttpPost.java:73)
at com.splunk.hecclient.Indexer.send(Indexer.java:115)
at com.splunk.hecclient.HecChannel.send(HecChannel.java:61)
at com.splunk.hecclient.LoadBalancer.send(LoadBalancer.java:56)
at com.splunk.hecclient.Hec.send(Hec.java:233)
at com.splunk.kafka.connect.SplunkSinkTask.send(SplunkSinkTask.java:257)
at com.splunk.kafka.connect.SplunkSinkTask.handleFailedBatches(SplunkSinkTask.java:127)
at com.splunk.kafka.connect.SplunkSinkTask.put(SplunkSinkTask.java:62)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:564)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:322)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:225)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:193)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.net.URISyntaxException: Illegal character in scheme name at index 0: 10.225.198.57:8088/services/collector/raw?index=kafka_connect
at java.base/java.net.URI$Parser.fail(URI.java:2915)
at java.base/java.net.URI$Parser.checkChars(URI.java:3086)
at java.base/java.net.URI$Parser.checkChar(URI.java:3096)
at java.base/java.net.URI$Parser.parse(URI.java:3111)
at java.base/java.net.URI.<init>(URI.java:600)
at java.base/java.net.URI.create(URI.java:881)
... 19 more
[2018-11-18 00:45:35,596] INFO handled 1 failed batches with 2 events (com.splunk.kafka.connect.SplunkSinkTask:130)
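The "Illegal character in scheme name at index 0" message suggests the HEC endpoint was configured without an http:// or https:// scheme. A hedged sketch of the relevant connector properties, assuming the HF at 10.225.198.57 listens for HEC on 8088 over HTTPS (the token value is a placeholder, and the connector appends /services/collector itself):
splunk.hec.uri=https://10.225.198.57:8088
splunk.hec.token=YOUR-HEC-TOKEN
splunk.hec.ssl.validate.certs=false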
↧
Splunk-Winevtlog.exe Initial High CPU Utilization on Installation of Windows Splunk Forwarder v 7.1.2
Hi,
Right after the initial install of the Splunk Windows Forwarder the **Splunk-Winevtlog.exe** process consistently runs at 25% utilization.
This will happen for **3 to 5 hours**, then it will go down to zero and never happen again.
Wondering if anyone else may have seen this and how to prevent this from happening.
The forwarders are being installed on Windows 10 devices.
Thanks for all the help I'm getting on this forum. :)
Alan
↧
Json ingest with weird characters or binary
Hi all,
I was trying to ingest some JSON files; however, the JSON seems to have some weird characters or binary content, and parsing failed.
Example of JSON:
{
"abc": "*weird_characters*"
}
I got this error : ERROR JsonLineBreaker - JSON Stream ID: xxxxxxxxxxxxxxxxxxxxxx had parsing error: Unexpected character while parsing backslash escape: 'x'
I have experimented with a lot of props.conf settings, including setting binary to false. I suspect this has something to do with encoding.
How do I solve this?
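A hedged sketch of a props.conf stanza to experiment with, assuming the error comes from structured (INDEXED_EXTRACTIONS = json) parsing choking on the non-standard \x escapes; the sourcetype name is a placeholder, and the idea is to leave the JSON to search-time extraction instead:
[my_json_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
# do not set INDEXED_EXTRACTIONS = json for this data; parse the JSON at search time instead
KV_MODE = json
CHARSET = UTF-8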
Thanks in advance
↧
[SmartStore] Can I get key configuration for Smart Store?
I am planning to set up SmartStore and was looking for the key configuration. Could you please share the key configuration settings?
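A hedged sketch of the core indexes.conf settings for an S3-backed SmartStore deployment; the bucket, endpoint, and keys below are placeholders, and the SmartStore docs should be checked for version-specific settings:
# indexes.conf
[volume:remote_store]
storageType = remote
path = s3://my-smartstore-bucket
remote.s3.access_key = <access key>
remote.s3.secret_key = <secret key>
remote.s3.endpoint = https://s3.us-east-1.amazonaws.com
[default]
# point indexes at the remote volume; can also be set per index
remotePath = volume:remote_store/$_index_name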
↧
logstash configurations for sending data to splunk through tcp
Hi All,
I am using rsyslog and a Logstash agent to forward data to Splunk. I am able to send data over TCP from rsyslog, but I am not able to using the Logstash agent. Can you please help me with the Logstash configuration to forward data to Splunk?
logstash config
input {
  pipe {
    Appid => "XXXX"
    path => "/path/script.sh"
  }
}
output {
  tcp {
    host => "XXXXX.XXXX"
    port => XXX
  }
}
I have mentioned the above config as a sample.
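A hedged sketch of a Logstash output pointed at a Splunk TCP data input, assuming a TCP input (for example on port 5514) has been created on the Splunk side under Settings > Data inputs > TCP; the host and port are placeholders:
output {
  tcp {
    host => "splunk.example.com"
    port => 5514
    # one event per line so Splunk can break events on newlines
    codec => line { format => "%{message}" }
  }
}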
↧
Survey kind of dashboard is possible in splunk?
Hi,
we have a simple dashboard that contains tweets for our client, as shown below:
1. I transferred money last night, am worried I sent it to the wrong person. Ap only giving me a 6 digit number how 2 get more details?
2. List item @Run_amuk @CommBank I set up an account online super easy, #ApplePay is awesome. Looks like it’s goodbye CommBank after 30 years.
3. ...
4. ...
5. ...
6. ...
7. goes on
The client wants two additional fields, Agree and Disagree, for further identification of whether the message is a real message or can be ignored. This dashboard is used by a front-end user who will click on one of these options. As soon as they click one of the options, that tweet should disappear from the dashboard, and we need the TweetText followed by the Agree and/or Disagree field in a CSV file.
Can we achieve this solution? Kindly advise with the code.
↧
Discard specific fields and keep the rest
Hi All,
could you please let me know how to discard specific fields and keep the rest while indexing in Splunk?
like
field 1=a
field 2 =b
field 3 = c
field 4 =d
Fields 1 to 3 have to be discarded and only field 4 should show up in the Splunk events / _raw.
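A hedged sketch of one index-time approach using SEDCMD in props.conf, assuming the raw events literally contain "field 1=", "field 2=" and "field 3=" key/value pairs as shown; the sourcetype name is a placeholder:
[my_sourcetype]
# strip the field 1-3 key/value pairs from _raw before indexing, keeping field 4
SEDCMD-remove_fields = s/field\s*[1-3]\s*=\s*\S+\s*//g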
Thanks
Rakesh Singh
↧
splunk version upgrade issue
HI All,
I upgraded Splunk 6.5 to Splunk 7.1.1 on Linux. We are good with XML dashboards only. HTML dashboards that use external JavaScript code are not loading.
I want to know what the issue is, and which version will solve all related issues with good performance?
Happy Splunking.
Thanks in advance. :)
↧
dbxquery for real time selects
Hello,
The colleague who is a Splunk admin in our company installed the DB Connect App. Now, I would like to use it not only to create inputs and read from them but also for the real time queries, which would produce tables directly consumable in my dashboard.
The point is that when I execute dbxquery on the indexer, where the DB Connect app has been installed, all works fine. I can also open it in the search there.
But when I try to execute the dbxquery where I actually would like to, in the search server, I get the following error:
Search Factory: Unknown search command 'dbxquery'.
Also, the dbxquery command does not appear in blue there as it does on the indexer, which I guess means it is not recognized.
Could you please confirm that the DB Connect app also has to be installed on the server where the search is executed, i.e., not only on the indexer?
Kind Regards,
Kamil
↧
HELP ON REGEX PLEASE
Hello
I want to extract the field below from my event
ABDM-TOUPDATE.$w$
could you help me please?
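A hedged sketch, assuming the value always looks like ABDM-SOMETHING.$w$ and that extracting it into a field named extracted_code (a made-up name) is acceptable:
... | rex field=_raw "(?<extracted_code>ABDM-[A-Z]+\.\$w\$)"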
↧
how to extract fields from XML file
sample data
Can you please help me out to build a regular expression to get the rows as mentioned below.
ID="C1", DATE="2018-11-16 09:20:01", CODE="A1", AMOUNT="100"
ID="C1", DATE="2018-11-16 09:20:01", CODE="A2", AMOUNT="200"
ID="C1", DATE="2018-11-16 09:20:01", CODE="A3", AMOUNT="300"
ID="C2", DATE="2018-11-16 09:20:01", CODE="A2", AMOUNT="100"
ID="C2", DATE="2018-11-16 09:20:01", CODE="A1", AMOUNT="200"
↧
conditional token using row.host
On the first panel of a dashboard, the user clicks on a host to get more panels to display. One such panel depends on whether row.host contains a string. The panel I want depends on $disk$.
'$row.host$ true '
but the panel depending on $disk$ is not working. All the other panels are fine, but they are using $host$.
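A hedged sketch of a drilldown in Simple XML that only sets $disk$ when the clicked host contains a given substring ("disk" here is an assumption); the <condition match> syntax should be verified against the Simple XML reference for your version:
<drilldown>
  <condition match="match(&quot;$row.host$&quot;, &quot;disk&quot;)">
    <set token="host">$row.host$</set>
    <set token="disk">true</set>
  </condition>
  <condition>
    <set token="host">$row.host$</set>
    <unset token="disk"></unset>
  </condition>
</drilldown>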
↧
AssertionConsumerService URL is missing from the SAMLRequest in SP-initiated SSO flow.
I am currently working with our Okta team to get SSO working with Splunk. However, we cannot get the AssertionConsumerService URL to send. We've tried adding the field manually in authentication.conf and the field still doesn't send.
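For reference, a hedged sketch of the authentication.conf [saml] settings that typically determine the AssertionConsumerService URL Splunk writes into the SAMLRequest; the host and port are placeholders, and the SAML docs for your version should be checked:
[saml]
# Splunk derives the ACS URL (<fqdn>:<redirectPort>/saml/acs) from these values
fqdn = https://splunk.example.com
redirectPort = 443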
↧
confirmation of the CLONE_SOURCETYPE config
Hello,
I would need a confirmation of my CLONE_SOURCETYPE configuration.
I have the following requirements:
**sourcetype 1: ISP_hanatraces**
Here I would like all the events containing the strings:
csns.admin.AdminConnection
csns.admin.commands
alter system alter configuration
to be cloned to the new index mlbso_changelog and the source type ISP_parameterChangelog.
**sourcetype 2: ISP_executed_statements**
Here I would like to copy all the events completely to the new index mlbso_changelog and the source type ISP_DMLchangelog.
My configuration for that would be:
**props.conf**
[ISP_hanatraces]
TRANSFORMS-ISP_parameterChangelog_clone = ISP_parameterChangelog_clone
[ISP_executed_statements]
SHOULD_LINEMERGE = false
LINE_BREAKER = ((?:\r?\n){2,})
TRANSFORMS-ISP_executed_statements_clone = ISP_executed_statements_clone
**transforms.conf**
[ISP_parameterChangelog_clone]
CLONE_SOURCETYPE = ISP_parameterChangelog
REGEX = .*(?i)(csns\.admin\.AdminConnection|csns\.admin\.commands|alter system alter configuration)(?-i).*
FORMAT = mlbso_changelog
DEST_KEY = _MetaData:index
[ISP_executed_statements_clone]
CLONE_SOURCETYPE = ISP_DMLchangelog
FORMAT = mlbso_changelog
DEST_KEY = _MetaData:index
Does the above make sense?
Also, the original sourcetypes have some sensitive data hashed in props.conf using SEDCMD later in the file. However, I would still like to clone the data before the hashing. Would the cloning and transform rules be applied in the order in which they appear in props.conf? Then it would be fine for me, as the cloning entries above would come first.
Kind regards,
Kamil
↧