I notice that with RHEL 7 (7.3 at least), several of the scripts currently do not work.
- interfaces.sh: returns no data
- services.sh: doesn't return systemctl service status
Are these planned to be fixed soon? If I were to fix this locally, to whom should I forward the suggested fixes?
↧
Splunk Add-on for Unix and Linux - RHEL 7 support? Some scripts do not work.
↧
dbxquery failure; Failure starting process
RHEL6, Splunk 7.0, DB Connect 3.1.1, java 1.8, mysql jdbc connector
from the search.log:
11-07-2017 20:52:28.844 INFO SearchParser - PARSING: | dbxquery query="select 1;" connection="Moog"
11-07-2017 20:52:28.847 INFO ISplunkDispatch - Not running in splunkd. Bundle replication not triggered.
11-07-2017 20:52:28.848 INFO UserManager - Setting user context:
11-07-2017 20:52:28.848 INFO UserManager - Done setting user context: NULL ->
11-07-2017 20:52:28.848 INFO ChunkedExternProcessor - Running process: /opt/splunk/etc/apps/splunk_app_db_connect/linux_x86_64/bin/command.sh -Dlogback.configurationFile\=../config/command_logback.xml -DDBX_COMMAND_LOG_LEVEL\=INFO -cp ../jars/command.jar com.splunk.dbx.command.DbxQueryCommand
11-07-2017 20:52:28.851 ERROR ChunkedExternProcessor - Failure starting process
A JDBC connection pool is created, though, and the SQL Explorer does populate with data.
↧
↧
service field - showing VideoMSN
Hi mates,
I would like to clarify why the "service" field shows "VideoMSN" as the service for traffic on well-known TCP ports such as 25 (which should be SMTP), 22 (which is SSH), and so on.
I'm not sure whether this comes from our firewall or whether the "VideoMSN" description is written by Splunk itself.
There are other events, sent from the same firewall, that show the correct description, for example TCP 80 HTTP.
↧
Concepts about inbound | outbound traffic. Public and private addresses
Hi mates,
I'm trying to figure out why I'm seeing LAN addresses as the source IP when my search clearly filters on inbound traffic coming from the Internet.
Is it possible this is related to something wrong in the firewall?
For example, if I perform this search:
index=index-example sourcetype=firewall src_interface=WAN action=allowed | stats count by src dest service | sort 10 - count
I expected to receive public IP addresses at the top. Instead I received IPs from 172.20.X.X/24, which is my LAN subnet.
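As a quick sanity check outside Splunk, Python's standard ipaddress module can tell you whether a given src value falls in an RFC 1918 private range (a minimal sketch; the sample addresses are made up):

```python
import ipaddress

def is_private(addr):
    """Return True if addr is in a private (e.g. RFC 1918) range."""
    return ipaddress.ip_address(addr).is_private

# 172.20.x.x sits inside the private 172.16.0.0/12 block, so seeing it
# as a source behind a WAN interface is worth investigating.
print(is_private("172.20.1.5"))   # private LAN address
print(is_private("8.8.8.8"))      # public Internet address
```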
↧
↧
How do you compare sets of field values from two searches?
I'm basically trying to identify whether some of my hosts are not doing something successfully, as they should on a daily basis, and alert as needed.
The process outputs a specific line when the task is successful, and I only need to match it once.
So I've been trying to do a sub-search, without much success.
my searches so far:
try 1:
sourcetype="myapp" | dedup host | eval allhost=host | eval joinf=1 | join max=0 joinf [search sourcetype="myapp" "Database updated" | dedup host | eval updatedhost=host | eval joinf=1] | eval match=if(allhost==updatedhost, 1,0)
try 2:
sourcetype="myapp" | dedup host | stats dc(host) as allhost | appendcols [search sourcetype="myapp" "Database updated" | dedup host | stats dc(host) as updatedhost ] | eval nodiff=if(match(allhost,updatedhost),"True","False") | table nodiff
^^^ this only matches the total host count; I need more detail (i.e., which hosts do NOT match)
try 3:
sourcetype="myapp" OR (sourcetype="myapp" "Database updated") | streamstats count by host | stats values(host) as host | mvexpand host | eval Status = if(match(host), "MATCH","NO MATCH") | table host,Status
^^^ not working, since I don't know how to identify the second set of 'host' values for the match
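For reference, the underlying set logic the tries above are after — all hosts, minus the hosts that logged "Database updated" — can be sketched in plain Python (the host names are hypothetical):

```python
# Hypothetical host lists, standing in for the results of the two searches.
all_hosts = {"web01", "web02", "db01", "db02"}
updated_hosts = {"web01", "db01"}   # hosts that logged "Database updated"

# Hosts that did NOT report a successful update -- the ones to alert on.
missing = sorted(all_hosts - updated_hosts)
print(missing)  # ['db02', 'web02']
```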
↧
REST API Modular Input: How to remove meta data from JSON and split it in to multiple events
Hi All,
I am pulling this json from REST input.
{"meta":{"bucket":"second","bucketsize":"1","tstart":1510090302753,"tend":1510093902753,"group":{"mname":{"desc":"Monitor name of measurement","type":"string","name":"Monitor Name"},"monid":{"desc":"Slot ID of measurement","type":"number","name":"Slot ID"}},"monid":[22762032],"metrics":{"count":{"desc":"Number of total hits or datapoints","unit":"number","name":"Total number of hits"},"avail":{"desc":"Average Availability of selected Measurements","unit":"%","name":"Availability"},"nwtme":{"desc":"Total time of all network traffic measured by the agent","unit":"ms","name":"Total Network time"},"uxtme":{"desc":"Full User Experience time as reported by the browser","unit":"ms","name":"User Experience"}},"limit":100000,"dbtime":3,"dbname":"db_dt_wa_raw_5 ","apiversion":"48.0.0.201709291816.154491-8"},"data":[{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":37874,"uxtme":118764,"avail":1,"mtime":1510090652753},{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":33711,"uxtme":120795,"avail":1,"mtime":1510091094753},{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":44886,"uxtme":134951,"avail":1,"mtime":1510091548753},{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":32109,"uxtme":114532,"avail":1,"mtime":1510091999753},{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":31346,"uxtme":116506,"avail":1,"mtime":1510092457753},{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":34060,"uxtme":126494,"avail":1,"mtime":1510092909753},{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":34784,"uxtme":120856,"avail":1,"mtime":1510093359753}]}
I would like to remove the header up to "data":[ and the trailing ]} of the JSON,
and create a separate event from each array element, like this:
{"mname":"NM - SearchAndPurchase_Guest - Chrome Agent","monid":22762032,"count":1,"nwtme":34784,"uxtme":120856,"avail":1,"mtime":1510093359753}
I have already added the settings below to props.conf and restarted, but it does not work:
TIME_PREFIX = \"mtime\":
SHOULD_LINEMERGE = false
LINE_BREAKER = ,\{|}(,){
TRUNCATE = 200000
SEDCMD-remove_header = s/\{\"meta[^\[]*\[*//g
SEDCMD-remove_footer = s/\]\}//g
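For comparison, here is what the intended transformation looks like in plain Python: parse the payload, drop the meta header, and emit one event per element of the data array. This is a sketch using a trimmed-down stand-in payload, not the add-on's own processing:

```python
import json

# Trimmed-down stand-in for the REST response shown above.
raw = '{"meta":{"bucket":"second"},"data":[{"mname":"A","mtime":1},{"mname":"B","mtime":2}]}'

payload = json.loads(raw)
# One event per element of the "data" array; the "meta" header is discarded.
events = [json.dumps(rec) for rec in payload["data"]]
for e in events:
    print(e)
```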
↧
How to merge and unmerge only specific cells, so that I can export results easily to Excel?
![alt text][1]
[1]: /storage/temp/219709-formatneeded.png
Can someone please help me get the search results into the format shown above? I used the stats command to group the fields in the search results.
Thanks!
↧
↧
Different results when using the Splunk search bar versus searching over HTTP.
For Splunk events with this kind of payload
[TS: Tue Jul 4 19:28:00 2017 PDT] [PPTID: tid1] [ABC: XYZ][ASD: YHG1] [ANDG: ldsnfn] [PPID: id1]
this query, when run from the Splunk search bar,
index=sampleindex sourcetype=samplesourcetype ![alt text][1]
gives output like this:
PPTID PPID TS
tid1 id1 Tue Jul 4 19:28:00 2017 PDT
When I run the same search over HTTP against the Splunk server, the query never returns.
![alt text][2]
What should the query look like when used over HTTP?
Splunk version: 6.6.3
[1]: /storage/temp/218714-screen-shot-2017-11-07-at-34628-pm.png
[2]: /storage/temp/218715-screen-shot-2017-11-07-at-35404-pm.png
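One frequent cause of this symptom: searches submitted to Splunk's REST search endpoint (/services/search/jobs) must begin with the literal word `search`, which the search bar adds implicitly. A minimal sketch of building such a request body (the index and sourcetype are taken from the question; the table of fields is illustrative, and the actual HTTP POST is omitted):

```python
from urllib.parse import urlencode

# The SPL as typed in the search bar.
spl = 'index=sampleindex sourcetype=samplesourcetype | table PPTID PPID TS'

# Over REST, the query string must start with the "search" command.
body = urlencode({'search': 'search ' + spl, 'output_mode': 'json'})
print(body)  # form body for POST /services/search/jobs
```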
↧
How can I add HTML and background colors to a dashboard's text panel?
I'm trying to create a panel on a dashboard with white text and a shaded background.
So far, this colors the whole page pink:
↧
I'm having trouble sending results.csv to slack_alert.py
Hi guys, I'm new to Splunk, and apologies if this has been answered, but I have been searching for the last two days with no luck.
I downloaded a slack_alert.py script from the function1 website. I created an alert, and its results are generated in:
/opt/splunk/var/run/splunk/dispatch/scheduler__st_RU1DLWFwcC1Jc2lsb24__RMD5f420d17a3db08388_at_1510037280_879
In there is a file called “results.csv.gz”. How do I get the "events.message" column into the "alert =" variable in the script?
**events.message**
**The /var partition is near capacity (>90.0% used)**
=======script below===========
import httplib, json
import sys
import csv
import gzip

def openany(path):
    # The dispatch results file is gzipped (results.csv.gz).
    return gzip.open(path) if path.endswith('.gz') else open(path)

def main():
    WEBHOOK_URL = ''
    headers = {'Content-Type': 'application/json'}
    color = 'danger'
    # For scripted alert actions, sys.argv[8] is the path to the results file.
    results_file = sys.argv[8]
    for result in csv.DictReader(openany(results_file)):
        alert = result["events.message"]
        message = {
            'username': 'Splunk',
            'fallback': alert,
            'pretext': alert,
            'color': color,
            'icon_url': '',
            'fields': [
                {
                    'title': "SPLUNK Alert Notification",
                    'value': alert,
                    'short': True
                }
            ]
        }
        connection = httplib.HTTPSConnection('hooks.slack.com')
        connection.request('POST', WEBHOOK_URL, json.dumps(message), headers)
        print json.dumps(message)
        response = connection.getresponse()
        print response.read().decode()

if __name__ == '__main__':
    main()
Thank you in advance, and sorry if it's a bit vague; I'm pretty confused myself.
↧
How do I find out what the source OTHER is? It is eating up the GB ingested per day
When using the SOS (Splunk on Splunk) metrics in Splunk 5.0.4, I am getting sources in the category OTHER, but I don't know what information is in this source, as it can't be searched on. How do I find out what OTHER is and exclude it, so that I can lower my indexed GB per day?
↧
↧
Option to extract one specific field from different patterns in one go?
For the same sourcetype, I have many different patterns from which I want to extract one specific field. Is there an option to handle all these extractions (same sourcetype, different patterns) in one go, instead of going through multiple regex field extractions?
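One way to fold several patterns into a single extraction is a regex that alternates over the differing context and captures the field into one named group; Splunk's rex command uses the same `(?P<field>...)` named-group syntax. A Python sketch with made-up log lines:

```python
import re

# Several different log formats, all carrying the same field of interest.
lines = [
    "action=login user=alice status=ok",
    "Failed login for bob from 10.0.0.1",
    "account carol locked out",
]

# One pattern: alternate over the differing prefixes, capture into one group.
pattern = re.compile(r"(?:user=|login for |account )(?P<user>\w+)")

users = [m.group("user") for m in (pattern.search(l) for l in lines) if m]
print(users)  # ['alice', 'bob', 'carol']
```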
↧
How to find the user roles which have access to srchIndexesAllowed * or _*
I need to find the user roles which have srchIndexesAllowed = "* or _*".
The command below lists all the roles with their srchIndexesAllowed details, but how do I get only the roles which have permission to all indexes?
| rest /services/admin/roles | table title, srchIndexesAllowed | rename title as role
Also, is there any way to find this with the btool command?
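The filtering step the REST search needs can be illustrated in plain Python: keep only the roles whose srchIndexesAllowed values include * or _* (the role data below is hypothetical; in SPL the equivalent would be a where/search clause on the multivalue field):

```python
# Hypothetical output of | rest /services/admin/roles
roles = [
    {"title": "admin", "srchIndexesAllowed": ["*", "_*"]},
    {"title": "user",  "srchIndexesAllowed": ["main"]},
    {"title": "power", "srchIndexesAllowed": ["*"]},
]

# Roles granted the wildcard index patterns.
all_index_roles = [r["title"] for r in roles
                   if {"*", "_*"} & set(r["srchIndexesAllowed"])]
print(all_index_roles)  # ['admin', 'power']
```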
↧
Search head cluster member does not come back up after a restart command from the master
Hi,
I have a three-member search head cluster (SHC).
I see one SHC member go down for a restart but never come back up. This is the log line:
`INFO SHCSlave - event=SHPSlave::handleHeartbeatDone master has instructed peer to restart`
The SHC has three members with a dynamic captain.
What could be going wrong?
Please help.
↧
Another installation is in progress
I have a problem while installing the Universal Forwarder on Windows Server 2008 R2.
The dialog box says, "Another installation is in progress. You must complete that installation before continuing this one".
![alt text][1]
I checked the temporary files and the Task Manager, and nothing looks odd.
Any solution?
Thanks for the attention.
[1]: /storage/temp/219711-splunk.jpg
↧
↧
Not able to access Splunk Web after Splunk installation
I was trying to install Splunk 6.1.1 on CentOS 6.8.
The installation was successful, but I was unable to access the Splunk Web interface.
Can anyone suggest where I went wrong or what I missed?
![alt text][1]
![alt text][2]
[1]: /storage/temp/218718-1.png
[2]: /storage/temp/218721-2.png
↧
Using app.manifest file of Splunk Packaging Toolkit to declare app dependencies
Hi experts,
Recently I read a bit about the Splunk Packaging Toolkit (http://dev.splunk.com/view/packaging-toolkit/SP-CAAAE9V) and learned that, apart from a lot of other utilities, it can also be used to declare your **app dependencies using an app.manifest file**. Has anyone tried declaring app dependencies with this toolkit before?
I have an app that's divided into three parts (dashboards app, TA, TA for Adaptive Response). I want to define simple dependencies between these apps that would warn the user with something like: 'TA is a required app for using TA-AR. Please install TA before proceeding with the TA-AR installation'.
From all that I've read about the app.manifest file, it allows the developer to define a dependencies stanza and list all required apps. However, I haven't been able to find any examples of how to write this file, **where to place it in the app directory structure**, and **whether or not app.manifest can be defined without using the entire Splunk Packaging Toolkit**.
Thanks.
↧
Dataset regexed field to uppercase
Hello.
I have a dataset with a regular expression where I extract the computer's hostname into a `hostname` field.
However, in the searches I base on this, a lowercase hostname does not work; so, for example, I have
How can I add a simple eval to the dataset that does `| eval hostname=upper(hostname)`?
(The error I get when I try to do this in the GUI is `Error in 'eval' command: Fields cannot be assigned a boolean result. Instead, try if([bool expr], [expr], [expr])`)
↧