Hello,
I want to divide AverageCount by AverageTotal. The problem is that AverageCount is split by sourcetype, while AverageTotal is split by a field. For example:
index=x sourcetype=SAT --> I calculate AverageCount using this search
index=x sourcetype=TotalTru Site=SAT --> I calculate AverageTotal by day using this search
Is there a way to use an eval statement with an if condition specifying which site to relate the average to? I was thinking:
If sourcetype is SAT, then eval by site when Site is SAT.
index="x"
| bin _time span=1d
| stats count(Number) as CountEvents by _time, sourcetype
| chart avg(CountEvents) as AverageCount by sourcetype
| append
[search sourcetype=TotalTru
| eval Total=if(sourcetype=="SAT",
....
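One possible way to express the whole thing in a single search, using eval-based aggregation instead of append (a sketch; it assumes the daily total lives in a `Total` field on the TotalTru events):

    index=x (sourcetype=SAT OR (sourcetype=TotalTru Site=SAT))
    | bin _time span=1d
    | stats count(eval(sourcetype=="SAT")) as CountEvents, sum(eval(if(sourcetype=="TotalTru", Total, null()))) as DayTotal by _time
    | stats avg(CountEvents) as AverageCount, avg(DayTotal) as AverageTotal
    | eval Ratio=AverageCount/AverageTotal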
Thanks!!
↧
IF statement inside EVAL
↧
How do I combine storage statistics of indexes with the index, sourcetype, and host?
I can use a rest search against the `services/data/indexes` endpoint to calculate storage statistics, like the index size in GB, of each index. I would like to combine these storage statistics with a table that has the index, sourcetype, and host. Currently, I'm using this tstats search:
`| tstats count where index=* by index, sourcetype, host | stats list(host) as Hosts by index, sourcetype | rename index as "Index", sourcetype as "Sourcetype(s)"`
I don't believe that `|rest` and `|tstats` can be used together. Is there a way I can do this only using `|tstats`? Possibly by using license usage?
Any help is appreciated.
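For what it's worth, a generating `| rest` search can run inside a subsearch, so one option is to join the storage figures onto the tstats table by index name (a sketch; `currentDBSizeMB` is the size field that endpoint returns):

    | tstats count where index=* by index, sourcetype, host
    | stats list(host) as Hosts by index, sourcetype
    | join type=left index
        [| rest /services/data/indexes
        | eval SizeGB=round(currentDBSizeMB / 1024, 2)
        | rename title as index
        | fields index SizeGB]
    | rename index as "Index", sourcetype as "Sourcetype(s)"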
↧
↧
Combine data across multiple sources, then split the answer into separate rows when exported to CSV
Good afternoon,
I am trying to take data from multiple sourcetypes, combine it by a common field, and then output one entry per line when exporting to CSV. I'm having difficulty because there are several fields, but only a couple have multiple values. The fields with multiple values show up in one cell. I have tried several suggestions I came across while searching, but none of them seem to do what I'm attempting.
To start with, another organization hosts the Splunk instance, so I do not have access to any back-end modifications such as props.conf. I am able to run searches and create dashboards; that is about it.
We have 1 index assigned to this data, and 4 sourcetypes. The data I need is spread across all 4 sourcetypes, and there is one common field (key) among the four. Three of the four sourcetypes return a single event per key, but the 4th can return multiple events per key. When I run my search using stats, I get the data from the first three neatly in a line, but the fields from the fourth have multiple lines per row. When exported, these show up as a single cell in Excel. Hope this makes sense.
Example:
Sourcetype1 contains Fielda Fieldb Fieldc
Sourcetype2 contains Fielda Fieldd Fielde
Sourcetype3 contains Fielda Fieldf Fieldg
Sourcetype4 contains Fielda FieldH FieldI FieldJ
index=* [search index=* Search_criteria | table Fielda | rename Fielda as query] | stats values(*) as * by Fielda
| stats list(Fieldb) as Fieldb, list(Fieldc) as Fieldc, list(Fieldd) as Fieldd, list(Fielde) as Fielde, list(Fieldf) as Fieldf, list(Fieldg) as Fieldg, list(FieldH) as FieldH, list(FieldI) as FieldI, list(FieldJ) as FieldJ by Fielda
Result would look like:
Fieldb Fieldc Fieldd Fielde Fieldf Fieldg FieldH FieldI FieldJ
A1 A1 A1 A1 A1 A1 A1 A1 A1
A1.1 A1.1
A2 A2 A2 A2 A2 A2 A2 A2 A2
A2.1 A2.1
A2.2 A2.2
A3 A3 A3 A3 A3 A3 A3 A3 A3
A4 A4 A4 A4 A4 A4 A4 A4 A4
A5 A5 A5 A5 A5 A5 A5 A5 A5
A5.1 A5.1
A5.2 A5.2
And I need it to look like this when exported to CSV:
Fieldb Fieldc Fieldd Fielde Fieldf Fieldg FieldH FieldI FieldJ
A1 A1 A1 A1 A1 A1 A1 A1 A1
A1 A1 A1 A1 A1 A1 A1 A1.1 A1.1
A2 A2 A2 A2 A2 A2 A2 A2 A2
A2 A2 A2 A2 A2 A2 A2 A2.1 A2.1
A2 A2 A2 A2 A2 A2 A2 A2.2 A2.2
A3 A3 A3 A3 A3 A3 A3 A3 A3
A4 A4 A4 A4 A4 A4 A4 A4 A4
A5 A5 A5 A5 A5 A5 A5 A5 A5
A5 A5 A5 A5 A5 A5 A5 A5.1 A5.1
A5 A5 A5 A5 A5 A5 A5 A5.2 A5.2
I've tried using transaction instead of the stats command. I've also tried adding "by Fielda FieldI FieldJ" at the end of the stats, and that just seems to create multiple entries for each possible combination of the .1 and .2 answers. Any help that could be offered would be greatly appreciated.
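One possible way to get one exported row per multivalue entry is to zip the multivalue fields together, expand, and split them back out (a sketch; it assumes FieldI and FieldJ are the positionally aligned multivalue pair; if FieldH also carries multiple values, extend the mvzip chain the same way):

    index=* [search index=* Search_criteria | table Fielda | rename Fielda as query]
    | stats values(Fieldb) as Fieldb, values(Fieldc) as Fieldc, values(Fieldd) as Fieldd, values(Fielde) as Fielde, values(Fieldf) as Fieldf, values(Fieldg) as Fieldg, values(FieldH) as FieldH, list(FieldI) as FieldI, list(FieldJ) as FieldJ by Fielda
    | eval zipped=mvzip(FieldI, FieldJ, "|")
    | mvexpand zipped
    | eval FieldI=mvindex(split(zipped, "|"), 0), FieldJ=mvindex(split(zipped, "|"), 1)
    | fields - zipped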
↧
Need help understanding how Transform "access-extractions" works
Hi to all who read this. Hoping one of you might be able to provide some assistance.
We have an app that produces logs in the extended common log format. Right now the sourcetype we are using is linked to the access-extractions transform, but it is not giving us all the required fields.
I have tried a number of different approaches to get the required values using regex, but due to the nature of the logs, it feels like I might need a large number of regex entries to capture all variations.
After figuring out that we were using the access-extractions transform, I thought a better approach would be to edit it to suit. However, I'm still pretty new to regex and not really sure what the regex in this transform is actually doing or how it works.
A sample of the logs we are working with:
10.x.x.x www.blah.au - [20/Aug/2018:08:06:19 +1000] "GET /ebs/picmi/picmirepository.nsf/PICMI?OpenForm&t=PI&k=D&r=http%3A%2F%2Fwww.assediomoral.org%2Findex.php%2Fspip.php%3Farticle106 HTTP/1.1" 200 53245 "http://a.bla.es/?u=https%3A%2F%2Fwww.ebs.tga.gov.au%2Febs%2Fpicmi%2Fpicmirepository.nsf%2FPICMI%3FOpenForm%26t%3DPI%26k%3DD%26r%3Dhttp%253A%252F%252Fwww.assediomoral.org%252Findex.php%252Fspip.php%253Farticle106" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.189 Safari/537.36 Vivaldi/1.95.1077.60" 422 "" "d:/Lotus/Domino/data/ebs/picmi/picmirepository.nsf"
10.x.x.x www.blah.au "107831_67744" [20/Aug/2018:08:06:19 +1000] "GET /ebs/lm/lmdrafts.nsf/xAgentUpdateValidationMonitoring.xsp?documentId=7D35903C63DAEB54CA2582C000426C09&dojo.preventCache=1534716380650 HTTP/1.1" 200 78 "https://www.ebs.tga.gov.au/ebs/LM/LMDrafts.nsf/GenApp.xsp?documentId=7d35903c63daeb54ca2582c000426c09&action=editDocument" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/11.1.2 Safari/605.1.15" 31 "_ga=GA1.3.644697231.1517015993;
_gid=GA1.3.1541615874.1534641115; DomAuthSessId=A004127B4D088BDBD4B14B7E1BF0928B; WelcomeDialogLM=1; SessionID=9E1B7E03146C77042992C7B008ABB7DB303BC2AD" "d:/Lotus/Domino/data/ebs/lm/lmdrafts.nsf"
10.x.x.x www.blah.au - [20/Aug/2018:08:06:15 +1000] "GET /ebs/picmi/picmirepository.nsf/PICMI?OpenForm&t=PI&k=D&r=http%3A%2F%2Fwww2.ogs.state.ny.us%2Fhelp%2Furlstatusgo.html%3Furl%3Dhttp%253A%252F%252Fpedagogie.ac-toulouse.fr%252Feco-golfech%252Fspip.php%253Farticle129 HTTP/1.1" 200 53566 "https://www.apemsa.es/web/guest/analisis-de-agua/-/asset_publisher/7OQq/content/dureza?redirect=https%3A%2F%2Fwww.ebs.tga.gov.au%2Febs%2Fpicmi%2Fpicmirepository.nsf%2FPICMI%3FOpenForm%26t%3DPI%26k%3DD%26r%3Dhttp%253A%252F%252Fwww2.ogs.state.ny.us%252Fhelp%252Furlstatusgo.html%253Furl%253Dhttp%25253A%25252F%25252Fpedagogie.ac-toulouse.fr%25252Feco-golfech%25252Fspip.php%25253Farticle129" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.170 Safari/537.36,gzip(gfe)" 282 "" "d:/Lotus/Domino/data/ebs/picmi/picmirepository.nsf"
10.x.x.x www.blah.au - [20/Aug/2018:08:06:15 +1000] "GET /ebs/picmi/picmirepository.nsf/PICMI?OpenForm&t=PI&k=D&r=http%3A%2F%2Fwww2.ogs.state.ny.us%2Fhelp%2Furlstatusgo.html%3Furl%3Dhttp%253A%252F%252Fpedagogie.ac-toulouse.fr%252Feco-golfech%252Fspip.php%253Farticle129 HTTP/1.1" 200 53566 "https://www.apemsa.es/web/guest/analisis-de-agua/-/asset_publisher/7OQq/content/dureza?redirect=https%3A%2F%2Fwww.ebs.tga.gov.au%2Febs%2Fpicmi%2Fpicmirepository.nsf%2FPICMI%3FOpenForm%26t%3DPI%26k%3DD%26r%3Dhttp%253A%252F%252Fwww2.ogs.state.ny.us%252Fhelp%252Furlstatusgo.html%253Furl%253Dhttp%25253A%25252F%25252Fpedagogie.ac-toulouse.fr%25252Feco-golfech%25252Fspip.php%25253Farticle129" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.170 Safari/537.36,gzip(gfe)" 282 "" "d:/Lotus/Domino/data/ebs/picmi/picmirepository.nsf"
The particular fields we are after are the last 3, which represent the time to process, the cookie header, and the translated URL.
The regex from access-extractions:
^[[nspaces:clientip]]\s++[[nspaces:ident]]\s++[[nspaces:user]]\s++[[sbstring:req_time]]\s++[[access-request]]\s++[[nspaces:status]]\s++[[nspaces:bytes]](?:\s++"(?<referer>[[bc_domain:referer_]]?+[^"]*+)"(?:\s++[[qstring:useragent]](?:\s++[[qstring:cookie]])?+)?+)?[[all:other]]
I'm assuming I need to update the last part, `[[all:other]]`, but I have tried running this in the GUI search box and in regex101 and neither seems to be able to work with it (the `[[...]]` pieces are references to other transforms.conf stanzas, not plain regex), so I'm struggling to understand how to update it correctly.
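For example, a search-time `rex` along these lines can pull out the trailing three values without touching the transform (a sketch; the sourcetype and field names are made up, and a cookie that wraps across lines, as in the second sample, will defeat the `$` anchor):

    sourcetype=your_access_sourcetype
    | rex "(?<time_to_process>\d+)\s+\"(?<cookie>[^\"]*)\"\s+\"(?<translated_url>[^\"]*)\"\s*$"
    | table _time time_to_process cookie translated_url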
↧
Using the number of events in bins to find percentile
Hello all,
I have a seemingly simple goal: bucketing events by time and finding the 95th percentile of the total number of events in each bin. I'm able to get the counts for each bin, but I'm not sure how to feed those counts into the percentile functions (`p()`).
This is how I'm getting the count for each bin:
`| bin _time span=5m | stats count by _time `
Now I want to use the values in the count column as an input list to calculate `p95()`.
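One option is a second `stats` pass over the binned counts, since the percentile functions accept any numeric field (a sketch):

    | bin _time span=5m
    | stats count by _time
    | stats perc95(count) as p95_count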
Thanks for the help in advance.
↧
↧
We are in the process of moving to the cloud, but we have an on-premises Splunk license. Can we use this license in the cloud?
Hi! We currently have an on-premises Splunk license, but we are in the process of moving to the cloud, and part of this process is keeping a hybrid setup. For this reason we would like to know whether it is possible to use the same on-premises license, because the strategy is to move the Splunk search head to the cloud and keep three indexers (two on-premises and one in the cloud). Is this possible?
↧
Need to create a dashboard which can select multiple fields based on user selection of a checkbox/radio button
Hi all,
I need to create a dashboard that can search on multiple extracted fields based on the user's selection of a checkbox/radio button.
![alt text][1]
For example, I want the user to be able to search with both fields, ServiceID AND Username. I currently have the single-field search working, but I can't seem to figure out how to pass multiple fields.
Will greatly appreciate any help with this.
[1]: /storage/temp/254749-msp.png
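For illustration, a minimal Simple XML form with two inputs feeding one search might look like this (the index name and token names are made up; defaulting each token to `*` keeps an empty box from filtering anything out):

    <form>
      <label>Multi-field search</label>
      <fieldset submitButton="true">
        <input type="text" token="service_tok">
          <label>ServiceID</label>
          <default>*</default>
        </input>
        <input type="text" token="user_tok">
          <label>Username</label>
          <default>*</default>
        </input>
      </fieldset>
      <row>
        <panel>
          <table>
            <search>
              <query>index=your_index ServiceID="$service_tok$" Username="$user_tok$" | table ServiceID Username</query>
            </search>
          </table>
        </panel>
      </row>
    </form>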
↧
Can I hide/unhide specific text boxes using a single checkbox?
Hi,
I am trying to get a checkbox to hide/reveal specific text boxes.
For example, say I have the following checkbox with three choices:
- c1
- c2
- c3
Furthermore, I have three text boxes t1, t2, and t3.
If c1 is ticked, I want t1 to be revealed; otherwise I want it to be hidden.
c2 and c3 will perform the same function on t2 and t3 respectively.
How would I go about monitoring the state of all the choices simultaneously in order to set/unset tokens, so that I can pass those tokens as `depends` attributes to reveal/hide the individual text boxes?
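A Simple XML sketch of that pattern, assuming a Splunk version where `<change>` accepts `<eval>` (token names are made up):

    <fieldset>
      <input type="checkbox" token="cb_tok">
        <label>Show boxes</label>
        <choice value="c1">c1</choice>
        <choice value="c2">c2</choice>
        <choice value="c3">c3</choice>
        <delimiter> </delimiter>
        <change>
          <!-- null() unsets a token, which hides any input that depends on it -->
          <eval token="show_t1">if(match("$cb_tok$", "c1"), "true", null())</eval>
          <eval token="show_t2">if(match("$cb_tok$", "c2"), "true", null())</eval>
          <eval token="show_t3">if(match("$cb_tok$", "c3"), "true", null())</eval>
        </change>
      </input>
      <input type="text" token="t1" depends="$show_t1$"><label>t1</label></input>
      <input type="text" token="t2" depends="$show_t2$"><label>t2</label></input>
      <input type="text" token="t3" depends="$show_t3$"><label>t3</label></input>
    </fieldset>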
Thank you for your time.
↧
Splunk App for AWS stops collecting data from AWS after migration from 6 to 7
We are seeing that indexing stopped for the S3 buckets (e.g. billing and CloudTrail) after the migration. I am not seeing any error message. It just stopped and is not retrying anymore.
↧
↧
How do I create an alert when a value is greater than "X" directly following a specific string within a log file?
This is a snippet of the log file. I want to receive an email when the value that follows "Memory used by APP:" exceeds 4000 MB.
[2018-08-17 11:59:51.909.196][0x0000219c][Info][GENERAL] Memory used by APP: 3321.77734375 MB.
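One possible alert search (a sketch; the index name is a placeholder, and the alert would be configured to trigger when the number of results is greater than 0):

    index=your_app_index "Memory used by APP:"
    | rex "Memory used by APP:\s+(?<mem_mb>[\d.]+)\s+MB"
    | where tonumber(mem_mb) > 4000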
↧
Getting alerts even though I disabled the alert from the GUI
Hi,
I have disabled an alert from the GUI, but I am still receiving Splunk alerts. Can you please let me know why this is happening?
↧
How do I run search jobs using a field name from an imported CSV file?
Dear Team,
I have imported one CSV file and searched it using the portal (port 8000) and the REST API (port 8089). The REST API search is not working because of *CustomField*, which is one of the field names in the CSV file. Please find the details below; kindly advise. Thanks in advance.
**Url:**
https://localhost:8089/services/search/jobs
**Header:**
Authorization:Splunk 3nUbDwfNUWdQ^7zb5sI6l2xXAqZ^U4xK4fEBinNarM7jvOs4^bAOBNyA1LsBy5ev2I2dXmZE6aThm_MPf3gCG7B9XnGlalyvj4HWaaa
Content-Type:application/x-www-form-urlencoded
**Body:**
search:search sourcetype=csv
*CustomField:fc3aa216*
**Response:**
Returns all rows, instead of returning only the rows that contain CustomField=fc3aa216.
But if I do the same using the portal, it works fine:
http://localhost:8000/en-US/app/search/search
sourcetype="csv" CustomField="fc3aa216"
↧
How to search using a field name from the CSV file via the REST API
Dear Team,
I have imported one CSV file and searched using the sourcetype and *customfield* (one of the column headers in the CSV file); it works fine in the portal (port 8000).
If I do the same using the REST API (port 8089), I get all rows instead of only the rows that contain the customfield value. Please find the details below.
**URL:**
https://localhost:8089/services/search/jobs
**Method:**
POST
**Header:**
Authorization:Splunk 3nUbDwfNUWdQ^7zb5sI6l2xXAqZ^U4xK4fEBinNarM7jvOs4dgdfgdfgdfgnGlalyvj4HWrcWz
Content-Type:application/x-www-form-urlencoded
**Body:**
search:search sourcetype=csv
*customfield:fc3aa216*
Kindly advise. Thanks in advance.
↧
↧
What causes this splunkd search head assertion in Splunk 7.1.1?
Hello,
splunkd: /home/build/build-src/nightlight/src/framework/SearchResultsMem.cpp:839: SearchResultsMem::iterator SearchResultsMem::erase(SearchResultsMem::iterator, SearchResultsMem::iterator): Assertion `it != end()' failed.
I'm trying to track down RAM issues, but at the same time I would like more specific information about what Splunk is trying to do.
Please advise...
Mark
↧
How to display a comparison between the previous week and the current week in a single value with a trendline
Hi,
I have a query which should ideally give me the request count for the last week and the current week:
index=data earliest=-1w@w latest=now | eval Latency=case(walltime<500, "0-0.5s", walltime>=500 AND walltime<1000, "0.5s-1s", walltime>=1000 AND walltime<3000, "1s-3s", walltime>=3000 AND walltime<6000, "3s-6s", walltime>=6000 AND walltime<10000, "6s-10s", walltime>=10000 AND walltime<30000, "10s-30s", walltime>=30000, ">=30s") | timechart span=1w count as RequestCount by Latency
When represented as a single value, it should show the current week's value along with a trendline compared against the previous week. But the above query returns results for both the previous week and the latest week, not just the recent 3 days of the current week, which is wrong. Here is a preview of the result:
![alt text][1]
How do I display only the current week's result value, compared with the previous week, to show how much lower or higher the count is in percent?
Also, is it possible to schedule this as a search and have it indexed in a summary index? For example, say every week I run a query to store that week's result under the name "Previous_week"; how do I then compare the current week's result with the summarized result under the source "Previous_week"?
[1]: /storage/temp/255750-screen-shot-2018-08-21-at-120108-pm.png
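On the summary-index part, a scheduled weekly search along these lines could store each completed week (a sketch; the `summary` index and the `Previous_week` source name are placeholders):

    index=data earliest=-1w@w latest=@w
    | stats count as RequestCount
    | collect index=summary source="Previous_week"

The comparison could then read it back, e.g.:

    index=data earliest=@w latest=now
    | stats count as CurrentCount
    | appendcols [search index=summary source="Previous_week" | stats latest(RequestCount) as PreviousCount]
    | eval PctChange=round((CurrentCount - PreviousCount) / PreviousCount * 100, 1)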
↧
In a real-time alert, if I use the lookup command, too many alerts are triggered
Splunk ver 7.1.1
I'm using a real-time alert that triggers when there is an event whose src_ip matches black_list.csv, like below:
index=hogehoge
| lookup black_list.csv src OUTPUT status
| where isnotnull(status)
| table _time src status
But if there is such an event, this alert is continuously triggered every 5 seconds for the same event!
I think this is caused by the `lookup` command touching every event again whenever a new event arrives.
Should I not use `lookup` in a real-time alert?
And is this expected behavior?
* If this is written in the documentation, I'm sorry...
Please someone help me.
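For comparison, the same logic as a scheduled (non-real-time) alert over a sliding window evaluates each event only once (a sketch; the 5-minute window and matching schedule are arbitrary):

    index=hogehoge earliest=-5m@m latest=@m
    | lookup black_list.csv src OUTPUT status
    | where isnotnull(status)
    | table _time src status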
↧
Index Cluster - Backup strategy
Hi All,
We have 3 indexers in an indexer cluster with a cluster master. Currently, the data is being backed up periodically to AWS S3/Glacier storage.
We want to understand whether we need to shut down Splunk before we back up the buckets.
Reviewed this [http://docs.splunk.com/Documentation/Splunk/6.5.0/Indexer/Backupindexeddata][1]
[1]: http://docs.splunk.com/Documentation/Splunk/6.5.0/Indexer/Backupindexeddata
↧
↧
Dashboard "Cover Page"
Hi,
we have created a multi-page dashboard which is automatically exported to PDF.
To make the report look less technical, we would like to add a cover page.
The cover page should simply have: no footer, no header, a title, a custom background color for the whole page, and the time range (e.g. 2018/08/14 - 2018/08/21).
We already tried placing PNGs via HTML tags and looking through the pdfgen Python scripts, without success.
Is there any way to achieve this?
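For reference, this is roughly the kind of cover row we have been experimenting with as the first page (title, colors, and dates are placeholders; the PDF renderer honors only a subset of CSS):

    <row>
      <panel>
        <html>
          <div style="background-color:#1e3250; color:#ffffff; text-align:center; padding:220px 0;">
            <h1>Weekly Report</h1>
            <p>2018/08/14 - 2018/08/21</p>
          </div>
        </html>
      </panel>
    </row>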
Many thanks in advance,
Martin
↧
Getting "Unknown host" when creating a new connection using the Progress DataDirect JDBC driver for SQL Server
I am trying to configure a SQL Server JDBC driver for Splunk, and I configured the DataDirect SQL Server JDBC driver in Splunk DB Connect successfully.
But I am unable to connect to SQL Server.
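"Unknown host" usually means the host name in the connection settings cannot be resolved from the DB Connect host. For reference, a DataDirect SQL Server JDBC URL generally has this shape (host, port, and database name are placeholders):

    jdbc:datadirect:sqlserver://dbhost.example.com:1433;databaseName=mydb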
↧
Combine multiple rows into one with a common key
Hello,
I have a log that records data bit by bit. I want to combine the events so that I have only one row of data.
Example:
![alt text][1]
I've tried mvcombine, but when there are multiple values for a field it doesn't work as I would like. I've also tried stats values, but I struggle to expand both the pet and male/female fields, as they don't have the same number of distinct values.
Do you know how to do it?
Thanks for the help
[1]: /storage/temp/254755-splunk.png
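A sketch of one stats-based approach (the index, key, and field names are guesses, since the real ones are in the screenshot):

    index=your_index
    | stats values(pet) as pet, values(gender) as gender by record_id
    | eval pet=mvjoin(pet, ", "), gender=mvjoin(gender, ", ")

`stats values` gathers everything for one key onto one row, and `mvjoin` then collapses each multivalue result into a single cell.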
↧