How to get the count of forwarders that are reporting from each application/Workspace?
Hi Splunkers,
I want to get the count of forwarders that are reporting for each application/workspace.
Example: I have created 4 apps/workspaces for 4 different teams, and I now want the forwarder count for each of them.
Is there a search that can give me the above information in a single query?
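For reference, the kind of single search I am imagining (a rough sketch only; it counts distinct hosts as a proxy for forwarders, assumes each team's app reads from its own dedicated index, and `team1_idx` etc. are placeholder names):

| tstats dc(host) as forwarder_count where index=team1_idx OR index=team2_idx OR index=team3_idx OR index=team4_idx by index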
Thanks in advance,
Thippesh
↧
How to specify a value in one place and use it in several searches?
I have several saved searches that contain `where vehicle_distance<=100`. I want to make the value of 100 tunable in _one_ place rather than having to edit all the searches.
I've discovered that I can set up a macro, say, `nbmaxdist` that contains the above definition, then use it in a search like `| nbmaxdist |` (enclosed in backticks that I can't easily show here because a backtick is a Markdown special character).
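In other words, the definition lives in one place, roughly like this (a sketch of the macros.conf stanza; the name and clause are the ones from my searches):

[nbmaxdist]
definition = where vehicle_distance<=100

and every saved search just invokes the macro where the where clause used to be.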
This works, but is this the prescribed way to do this?
↧
Anomali ThreatStream Community App: How to perform a basic search?
I installed the Anomali ThreatStream Community App from Splunkbase and uploaded 2 sets of data:
(1) Network.log, which has several IP addresses I am interested in.
(2) web.log, which has several URLs that I hope will show up after my search.
However, after I click "Generating Network Summary", "Generating Web Summary", or "Generating and Uploading Summaries", and then click "Run" next to "Generating and Uploading Summaries", I get "Failed to run the search".
I was told I should be able to see the search results and a graph.
So, what am I doing wrong?
↧
Is there any way to do stats count over multiple time frames?
I am trying to replace something written in Perl that outputs to .xls format. I want to count IP addresses in each subnet; I have about 3,500 subnets that I want to summarize across multiple time frames (current, -30 days, -60 days, -90 days). I have done the first part by doing a CIDR lookup from IP to subnet and then counting, as in the search below.
I am looking for alternate ideas to accomplish the same thing. Help?
Tim
( index=network_dns OR index=network_bro ) earliest=-30d
| rex field=named_message "client (?<client_ip>\d+\.\d+\.\d+\.\d+)"
| fields _time id_orig_h id_resp_h client_ip
| eval ip=coalesce(id_orig_h, id_resp_h, client_ip)
| regex ip="\d+\.\d+\.\d+\.\d+"
| dedup ip
| lookup cidr_ranges subnet AS ip OUTPUT subnet
| eval ip_class=if(`is_my_network(ip)`, "MINE", "External")
| stats count(ip) as count by subnet, ip_class
| where ip_class="MINE"
| where subnet!="UNKNOWN"
| sort subnet
| table count subnet
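One alternate direction I have been toying with is a single 90-day search that buckets events by age and counts distinct IPs per bucket (a rough sketch only; it reuses the fields from the search above):

( index=network_dns OR index=network_bro ) earliest=-90d
| rex field=named_message "client (?<client_ip>\d+\.\d+\.\d+\.\d+)"
| eval ip=coalesce(id_orig_h, id_resp_h, client_ip)
| regex ip="\d+\.\d+\.\d+\.\d+"
| lookup cidr_ranges subnet AS ip OUTPUT subnet
| eval age=case(_time>=relative_time(now(), "-30d@d"), "0-30d", _time>=relative_time(now(), "-60d@d"), "30-60d", true(), "60-90d")
| stats dc(ip) as distinct_ips by subnet, age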
↧
How to best resolve peer indexers being down due to a wrong homePath?
While pushing the cluster bundle from the cluster master to the indexers, one of the apps being pushed had a wrong homePath. As a result, the peers could not restart and were down. Each indexer was then manually restarted after removing the app from the slave-apps folder. Is that good practice, or will there be any loss of buckets in the process?
↧
Anomali ThreatStream Community App: What does "Error in TsidxStats: Could not find datamodel: TS_Optic" mean and how do I fix it?
I am new to the Splunk world. I was trying to use the Anomali ThreatStream Community App and run a search, but I get the following errors:
(1) Error in "TsidxStats": Could not find datamodel: TS_Optic
(2) The search job has failed due to an error. You may be able to view the job in the "Job Inspector".
My question is: what is the "TS_Optic" data model? How do I create one?
↧
How to build a search that shows the uptime of Splunk and the hosts reporting to Splunk?
I am required to build a search that shows the uptime of all my Splunk components over a period of one month.
I am also required to build a search that shows the hosts reporting to Splunk.
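For the hosts-reporting part, the kind of starting point I have in mind (a rough sketch only; it assumes every component forwards its internal logs, so index=_internal sees each host):

| tstats max(_time) as last_seen where index=_internal by host
| eval minutes_since_last_report=round((now()-last_seen)/60, 0)

For the uptime part I am not sure where to start, hence the question.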
↧
How can I get the latest event by a specific field?
Hello,
I have the following event data:
City,Date,Temp,Sky
New York,2016-11-10,20,Clear
New York,2016-11-09,19,Cloudy
San Francisco,2016-11-20,20,Clear
San Francisco,2016-11-19,18,Rain
Rome,2016-11-20,11,Rain
Rome,2016-11-19,10,Partly Cloudy
What I would like to do is get the latest event for each city, so I have results:
City,Date,Temp,Sky
New York,2016-11-10,20,Clear
San Francisco,2016-11-20,20,Clear
Rome,2016-11-20,11,Rain
I have tried
stats first(City) by City
But this just gives me a list that I cannot use. Any help would be great!
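For what it's worth, the shape of result I'm after would come from something like this (a sketch, assuming the Date field is in ISO format so a lexical sort is also chronological):

| sort 0 -Date
| dedup City
| table City Date Temp Sky

but I suspect there is a more idiomatic way.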
Thank you!
Andrew
↧
Why doesn't this custom search command call a class method?
Given an excerpt from a custom search command:
import logging

# Imports assumed from the Splunk SDK for Python; the excerpt did not show them.
from splunklib.searchcommands import Configuration, ConfigurationSetting, EventingCommand

logger = logging.getLogger( 'nbclosest' )
logger.setLevel( logging.DEBUG )

K_STAG = 'stop_tag'
K_TIME = '_time'
K_VDIST = 'vehicle_distance'
K_VID = 'vehicle_id'

@Configuration()
class NextBusClosestStop( EventingCommand ):

    class ConfigurationSettings( EventingCommand.ConfigurationSettings ):
        required_fields = ConfigurationSetting(value=[K_TIME, K_VID, K_VDIST, K_STAG])

    def __init__( self ):
        super( NextBusClosestStop, self ).__init__()
        # ...

    def drain( self ):
        logger.debug( 'enter drain()' )
        # do drain code

    def transform( self, records ):
        logger.debug( 'enter transform()' )
        for rec in records:
            # ...
            yield rec
        logger.debug( 'exit transform()' )
        self.drain()
The `transform()` function is called and both `enter transform()` and `exit transform()` appear in `search.log`, but I never see `enter drain()` logged, and the code is indeed never called (because the results produced are wrong).
However, if I copy & paste the code from `drain()` and put it "inline" in place of `self.drain()`, then the code executes.
How can it be the case that `self.drain()` isn't called?
↧
How do I display negative values through geostats?
Hello,
I'm busy mapping temperatures for locations around the world, and in some cases the value is negative. Unfortunately, negative values are not displayed on the map by the geostats command, so I was wondering how to enable them. I've tried setting the marker sizes so that they are the same for all values, but this only makes positive values the same size; negative values are still missing...
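The only workaround I can think of is shifting the values into positive territory before geostats, along these lines (a sketch only; the 100-degree offset is arbitrary and distorts what the marker size means):

... | eval temp_for_size=Temp+100 | geostats latest(temp_for_size) by City

so I'd much rather find a proper way to display the real values.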
Any ideas?
Thank you!
Andrew
↧
Is it possible to create a multi-value visualization for each value of a grouped field?
Hello,
I am trying to create a variable-sized visualization based on the value of a field, grouped by another field. To explain what I mean, I have a table with temperatures:
CITY,TEMP
Tokyo,6.67
New York,10.29
Rio de Janeiro,29.54
Bamako,32.56
Milano,9.27
Port Elizabeth,19.41
Norilsk,-24.43
Perth,16.72
What I would like to do is create a single visualization that gives me, say, a single value or gauge for each city. I would like to avoid having to create a separate search and a separate visualization for each city. I'd just like to create one visualization that includes all the values in the table.
Is this possible?
Thank you!
Andrew
↧
Why is the knowledge bundle directory filling up after 6.5.1 upgrade?
I am having an issue with the knowledge bundle directory not deleting old bundles. This started after upgrading from 6.3 to 6.5.1. We have only 1 search head, which keeps sending bundles to the directory on the search peer every 2 minutes and does not remove the older ones; there are more than the default 5 being kept. Is there any place I can look to find the cause?
↧
I have logs with out-of-order timestamped events. Will searches compensate for this and correct the timestamp order?
Given this excerpt from log files I generate and index:
2016-11-19 20:34:21 GMT vehicle_id="1009" route="E" speed=0 distance=136 stop_tag="4502"
2016-11-19 20:36:44 GMT vehicle_id="1009" route="E" speed=13 distance=4 stop_tag="4529"
2016-11-19 19:46:23 GMT vehicle_id="1006" route="E" speed=21 distance=140 stop_tag="7795"
2016-11-19 20:18:10 GMT vehicle_id="1007" route="E" speed=9 distance=42 stop_tag="5240"
2016-11-19 20:38:28 GMT vehicle_id="1009" route="E" speed=21 distance=281 stop_tag="4516"
you'll notice that the time-stamps are out-of-order. The way these are generated is that a web site is polled and it returns the set of vehicles that have changed since the last poll. Included also (but not shown here) is how long ago the vehicle "phoned home" to the web site. When I generate a log file entry, I subtract the "ago" value from "now" to get the time at which the vehicle actually transmitted the data and use _that_ time for the time-stamp. Hence the time-stamps are out-of-order.
Whenever I want to write searches against this data, do I have to do anything special? For example, if I use `streamstats` as part of a search, does Splunk compensate for the out-of-order events and return them in correct time-order? Or do I always have to include `sort _time` as part of the search pipeline prior to invoking `streamstats`?
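For concreteness, the kind of pipeline I mean (a sketch; the index name is a placeholder, the fields are from the log excerpt above):

index=nextbus route="E"
| sort 0 _time
| streamstats current=f last(stop_tag) as prev_stop_tag by vehicle_id

Is the explicit `sort 0 _time` always needed in a pipeline like that, or does Splunk handle it?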
Do I have to do something else to compensate for other search commands?
Note to aaraneta_splunk: This question is **NOT** only about `streamstats`. I used `streamstats` as an _example_ only. Please do _not_ edit the question again. Thanks.
↧
How do I extract a numerical value from within a string using the rex command?
Hello,
I've been reading up on the `rex` command and using it to split strings, but I cannot for the life of me get it working. I have the following input:
FORECAST
Sun and a few passing clouds. High 13C. Winds light and variable.
Partly cloudy. Low 4C. Winds light and variable.
Overcast with rain showers at times. High around 10C. Winds NE at 10 to 15 km/h. Chance of rain 60%.
Cloudy skies. Low 2C. Winds ENE at 10 to 15 km/h.
Partly cloudy skies. High 6C. Winds light and variable.
A mostly clear sky. Low -1C. Winds light and variable.
A mainly sunny sky. High 6C. Winds light and variable.
A mostly clear sky. Low near 0C. Winds light and variable.
and my goal is to extract the temperature value into a new field FORECAST_C. I basically have to remove everything up to the space before the temperature value, and everything after it starting with "C.". My desired output is:
FORECAST_C
13
4
10
2
6
-1
6
0
I've come up with
rex field=entry ".\s(.*)C\w."
but it's clearly not working...
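For comparison, I imagine the working pattern looks something like this (a sketch; it assumes the text lives in a field named FORECAST rather than entry, and captures an optional minus sign and digits immediately before "C."):

rex field=FORECAST "(?<FORECAST_C>-?\d+)C\."

but I'd also like to understand what is wrong with my attempt.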
Can somebody give me a hand?
Thanks!
Andrew
↧
How to modify my search to add historic counts and averages to my current day's count?
Hello everyone,
I have the following search, which displays the usernames, their accessing-application count for that day, and the average accessing-application count across all users:
index=foo sourcetype=foo
| stats dc(A) as accessing_application_count by usernames
| eventstats avg(accessing_application_count) as avg_accessing_application_count
| stats max(accessing_application_count) as max_accessing_application_count max(avg_accessing_application_count) as avg_accessing_application_count by usernames
Which Displays something as follows
usernames max_accessing_application_count avg_accessing_application_count
abc 3 4.982456
def 0 4.982456
ghi 10 4.982456
Now I want to similarly calculate each user's **max_accessing_application_count** over the last 3 days, based on each day's **max_accessing_application_count** during those 3 days, and the average too, like below:
usernames max_accessing_application_count last_3days_max_accessing_ avg_accessing_application_count last_7d_avg
abc 3 6 4.982456 7.8
def 0 4 4.982456 7.8
ghi 10 7 4.982456 7.8
The average should be calculated as each day's average over the last 3 days, and then the average of those 3 daily averages.
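For concreteness, the direction I imagine for the historic part (a rough sketch only; untested, field names follow the example above):

index=foo sourcetype=foo earliest=-3d@d
| bin _time span=1d
| stats dc(A) as daily_count by _time, usernames
| eventstats avg(daily_count) as daily_avg by _time
| stats max(daily_count) as last_3days_max_accessing_application_count avg(daily_avg) as last_3d_avg by usernames

Is something like this right, and how would I combine it with the current day's numbers above?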
Please suggest any ideas that could help me with this query.
↧
How to edit my search to display data on a weekly chart?
Hi All,
For a trend chart, I have data for the following dates:
2016-10-29 - Saturday
2016-11-05 - Saturday
2016-11-12 - Saturday
2016-11-15 - Tuesday
2016-11-26 - Saturday
I want a weekly chart that shows Saturday's date (the last day of every week) on the axis.
Here, 15 Nov's data should also be shown under 2016-11-19 (that week's Saturday). How can I do it?
Currently I am using this search
index ="64581-np" earliest=-24w@w6 latest=now sourcetype = "fn_details" matchConfidence!="Not Vulnerable"
[
| tstats max(_time) as maxTime WHERE index ="64581-np" earliest=-24w@w7 sourcetype="fn_details" by source _time span=1w
| sort -maxTime
| stats first(source) as source by _time
| fields source]
| fields fieldNoticeId,matchConfidence,source
| eval _time = _time + (86400*7)
| eval _time=if(_time>now(),relative_time(now(),"@d"),_time)
| eval dayWeek =strftime(_time,"%Y-%m-%d")
| eval workField = fieldNoticeId.":".dayWeek
| dedup workField,matchConfidence
| replace "Potentially Vulnerable" with "Potentially" in matchConfidence
| stats list(matchConfidence) as matchConfidence by workField
| eval statusOutput=if(matchConfidence LIKE "Potentially" AND !(matchConfidence LIKE "Vulnerable"),"Potentially Vulnerable","Vulnerable")
| eval id=mvindex(split(workField,":"),0)
| eval dayWeek=mvindex(split(workField,":"),1)
| chart dc(id) over dayWeek by statusOutput
which is giving me the following dates:
2016-09-10
2016-09-17
2016-09-24
2016-10-01
2016-10-08
2016-10-22
2016-10-29
2016-11-05
2016-11-12
2016-11-15
2016-11-26
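I suspect the missing piece is snapping each event's date forward to its week-ending Saturday before charting, something like this (a sketch; @w0 snaps back to the preceding Sunday, so adding 6 days lands on that week's Saturday):

| eval dayWeek=strftime(relative_time(_time, "@w0") + 6*86400, "%Y-%m-%d")

but I'm not sure how to fit it into the search above.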
Help much appreciated!
↧
What is the function of eval in this search?
We use the eval command to create new fields, but here it is used as a function, e.g. `|stats count(eval(method="GET")) as get`. Can someone explain this example clearly? What is `eval` doing here?
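(For context, the fuller pattern I have seen looks like this sketch, counting GET and POST events side by side; `access_*` is a placeholder sourcetype:

sourcetype=access_* | stats count(eval(method="GET")) as get count(eval(method="POST")) as post

I understand a plain `stats count`; it is the eval-inside-count part that confuses me.)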
↧
How do I make summary indexing keep stats on distinct items?
I am having trouble understanding summary indexing: I am keeping stats on container objects, but I am really interested in stats on the distinct objects inside the containers. How do I make summary indexing keep stats on the distinct items?
Let's say I am an internet service provider (ISP) and I provide IP addresses that are grouped by network, and I wish to count the addresses in use. I would summarize as follows:
index=blah | fields ip
| eval age_category=case(_time>relative_time(now(), "-60d@d") AND _time<relative_time(now(), "-30d@d") AND _time
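For concreteness, roughly what I am trying to do end to end (a sketch; it uses the si- variant of stats so that distinct counts survive summarization, and `network` plus the saved-search name are placeholders):

index=blah
| eval age_category=case(_time>=relative_time(now(), "-30d@d"), "0-30d", _time>=relative_time(now(), "-60d@d"), "30-60d", true(), "60d+")
| sistats dc(ip) by network, age_category

and then, against the summary index, the same stats call:

index=summary source="my_summary_search" | stats dc(ip) by network, age_category

Is that the right mechanism?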
↧
How to put a new line character in the underLabel tag?
I have about 150 characters in the underLabel tag.
How can I add a new line where the panel ends, and set the string to fit the size of the panel?
↧
REST API Modular Input: How can I make REST calls to MS Project Server 2013?
Hello,
I am busy trying to configure the REST API Modular Input to make a REST call to my MS Project Server 2013 instance, but I'm not exactly sure how to configure the form and Custom Authentication Handler. I have a piece of Python code that authenticates me and makes a request to the URL that I want to retrieve data from:
import urllib2

# NTLM auth handler from the python-ntlm package.
from ntlm import HTTPNtlmAuthHandler

SITE = "https:////"
USERNAME = "domain\\username"
PASSWORD = ""

# Register an NTLM-capable opener for the site with urllib2.
passman = urllib2.HTTPPasswordMgrWithDefaultRealm()
passman.add_password(None, SITE, USERNAME, PASSWORD)
auth_NTLM = HTTPNtlmAuthHandler.HTTPNtlmAuthHandler(passman)
opener = urllib2.build_opener(auth_NTLM)
urllib2.install_opener(opener)

# Query the Project Server OData endpoint for resource IDs and names.
URL = "https:////_api/ProjectData/Resources?$select=ResourceId,ResourceName"
response = urllib2.urlopen(URL)
headers = response.info()
print("headers: {}".format(headers))
body = response.read()
print("response: " + body)
What I am having trouble with is figuring out how to modify the `authhandlers.py` file for authentication and what to put in the `Endpoint URL` field in the configuration form.
Any pointers would be greatly appreciated!
Thank you,
Andrew
↧