.NET Client windows authentication
Hi,
is it possible to authenticate using Windows authentication in the C# (.NET) client?
Given that Splunk supports LDAP, I wonder if it can use the Windows credentials.
The Service.LogonAsync method requires a username and password.
Many thanks
↧
↧
Differentiate various Splunk server using color?
We are installing a larger Splunk setup.
It has a lot of index servers.
**Search servers.
Jobs/deploy servers
Test servers**
Is there any way to customize the very top header so we can see which server we are logged in to?
Today we can see it by a URL like **http://splunk-jobs** etc.
A different color on the top bar would be nice.
I know I can make a custom app with a logo.png, but that is only shown when logging in.
↧
extracting time and formatting it
Hello fellows,
I have an issue that I'm not really sure how to solve.
Well, in the event I have a time in the following format: "datetime":"20180829 073501672".
I have created a regex that extracts this value, but now I need to format it the following way: 2018 08 29 07:35:01:672.
Any suggestions?
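In SPL this kind of conversion is usually done with strptime/strftime inside an eval; the parse-and-reformat logic can be sketched in Python (function name is just for illustration):

```python
from datetime import datetime

def reformat_datetime(raw: str) -> str:
    """Parse 'YYYYMMDD HHMMSSmmm' and reformat as 'YYYY MM DD HH:MM:SS:mmm'."""
    # %f absorbs the trailing 3-digit milliseconds after %H%M%S
    dt = datetime.strptime(raw, "%Y%m%d %H%M%S%f")
    return dt.strftime("%Y %m %d %H:%M:%S") + f":{dt.microsecond // 1000:03d}"

print(reformat_datetime("20180829 073501672"))  # 2018 08 29 07:35:01:672
```

The same format strings should carry over to SPL's `strptime(datetime, "%Y%m%d %H%M%S%f")` followed by a `strftime` on the result.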
↧
How to manage clients, and how to collect client logon/logoff information
Please tell me how to manage client machines in Splunk.
The version is Splunk Enterprise 7.1.2.
I believe an agent or similar needs to be installed,
but how should I perform the installation? I am using Splunk Enterprise 7.1.2.
I want to collect login/logoff information from 74 client machines.
To do this, I believe the managed clients first need to be registered with the server; please tell me how to configure this.
↧
↧
aggregate url calls, ignoring numbers in the route
Hi guys,
I have URLs like these:
https://localhost/Client/V2/clients/23423/acc/view
https://localhost/Client/V2/clients/23424/acc/view
https://localhost/Client/V2/clients/23425/acc/view
https://localhost/Client/V2/clients/23423/acc/basic
https://localhost/Client/V2/clients/23423/acc/basic
https://localhost/Client/V2/clients/23423/acc/basic
https://localhost/Client/V2/clients/23425/acc/basic
I want to group them into two rows:
url | count
https://localhost/Client/V2/clients/*/acc/view| 3
https://localhost/Client/V2/clients/*/acc/basic | 4
How can I aggregate?
It is a url field.
I tried with
| rex field=url "https://localhost/Client/V2/clients/(\d+)/*" | table url
but it did not work.
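The trick is to replace the numeric segment with a wildcard first, then count each resulting pattern. A sketch in Python, using the sample URLs from the question:

```python
import re
from collections import Counter

urls = [
    "https://localhost/Client/V2/clients/23423/acc/view",
    "https://localhost/Client/V2/clients/23424/acc/view",
    "https://localhost/Client/V2/clients/23425/acc/view",
    "https://localhost/Client/V2/clients/23423/acc/basic",
    "https://localhost/Client/V2/clients/23423/acc/basic",
    "https://localhost/Client/V2/clients/23423/acc/basic",
    "https://localhost/Client/V2/clients/23425/acc/basic",
]

# Replace the numeric client id with '*', then count each normalized pattern.
counts = Counter(re.sub(r"/clients/\d+/", "/clients/*/", u) for u in urls)
for url, count in counts.items():
    print(url, "|", count)
```

In SPL, something along the lines of `| eval url=replace(url, "/clients/\d+/", "/clients/*/") | stats count by url` should achieve the same grouping.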
↧
number of members in a Search Head Cluster
Hi at all,
for a customer, I need to replicate knowledge objects between two search heads and provide high availability.
The best solution is a Search Head Cluster, but the problem is that I have only two search heads and Splunk best practices require at least three members.
From your experience, could I use a Search Head Cluster with only two members without major problems?
If I cannot use a cluster, as a workaround I thought of using a script to replicate all the knowledge objects from SH1 to SH2. Can anyone suggest a different workaround?
Bye.
Giuseppe
↧
Conditional eval to fill a new field depending on other fields
Hello,
I want to create a new field that takes the value of another field, depending on which one is filled.
For example, I have 5 fields but only one can be filled at a time. The other fields don't have any value.
Field1:
Field2:
Field3:
Field4: Ok
Field5:
How can I write the eval to check: if field1 is null, take the value of field2; if that is also null, take the value of field3; and so on until it reaches the non-null field?
Thank you.
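SPL's eval has a `coalesce()` function that returns the first non-null argument, which is exactly this pattern; the logic can be sketched in Python (field values are hypothetical):

```python
def coalesce(*values):
    """Return the first value that is filled (mimicking SPL's coalesce)."""
    for v in values:
        if v:  # treats None and "" as "not filled"
            return v
    return None

field1, field2, field3, field4, field5 = None, None, None, "Ok", None
print(coalesce(field1, field2, field3, field4, field5))  # Ok
```

In SPL this would typically be `| eval newfield=coalesce(Field1, Field2, Field3, Field4, Field5)`.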
↧
Chart JSON values
I have JSON just like this. I want to chart the data inside the values key. The keys of the data in values will be dynamic.
{"Domain":"DotComMobile","Metrics":"regionCode","values":{"NY":563,"FL":270,"NJ":265,"PA":256,"MA":223,"CA":214,"VA":164,"MD":163,"TX":155,"OH":151}}
How would that be achieved?
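The reshaping needed, turning the dynamic keys under `values` into name/count rows that a chart can consume, can be sketched in Python using the event from the question:

```python
import json

raw = ('{"Domain":"DotComMobile","Metrics":"regionCode",'
       '"values":{"NY":563,"FL":270,"NJ":265,"PA":256,"MA":223,'
       '"CA":214,"VA":164,"MD":163,"TX":155,"OH":151}}')

event = json.loads(raw)
# Flatten the dynamic keys into (region, count) rows, sorted by count descending.
rows = sorted(event["values"].items(), key=lambda kv: -kv[1])
for region, count in rows:
    print(region, count)
```

In Splunk, one common route is `spath` to pull out the `values.*` fields, then transposing or `stats`-ing them into name/value pairs for the chart; the Python above just shows the target shape.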
↧
↧
Splunk Add-On for Box. Multiple instances.
Reference: https://answers.splunk.com/answers/668545/splunk-add-on-for-box-and-multiple-box-tenants.html
One of our customers has mentioned that Splunk is updating the connector and that it will include the capability to support multiple Box instances at the same time, rather than requiring the workaround above. Can anyone confirm whether that is the case, or give an idea of where we can get a formal answer from Splunk before we start all the work detailed in that workaround?
Thanks in advance.
↧
Problem ingesting from HEC, sslv3 alert certificate unknown
We're attempting to ingest from ELK servers into Splunk using ELK -> HEC, but are having difficulties getting past SSL. Due to firewall constraints, we're only able to send to one heavy forwarder on port 8088, which already has SSL enabled. We don't have certificate validation enabled.
The data will be coming in from a company that we've purchased, so they're not on our domain, and I'm not certain whether their root CA cert is in effect in our domain, nor whether that matters here.
They are sending using the following structure:
http_method => "put"
format => "json"
url => https://nattdip:8088/services/collector
headers => {"Authorization" =>"Bearer d****d-9f84-4a3a-a9fd-6*******e"}
content_type => "application/json"
We've tried both put and post as the method, and they get the same error:
[HTTP Output Failure] Could not fetch URL {:url=>"https://nattdip:8088/services/collector", :method=>:post,
I see the following in my _internal log:
08-28-2018 15:45:13.287 -0400 WARN HttpListener - Socket error from sourceip while idling: error:14094416:SSL routines:ssl3_read_bytes:sslv3 alert certificate unknown
We've added their root CA to the pem file that Splunk uses to protect web & HEC, but get the same error.
Any suggestions would be great. Thanks very much.
↧
What have people's experiences been setting up Splunk in Azure
Hi guys,
Just a general question: what have people's experiences been when setting up a clustered Splunk environment in Azure? (Or AWS, as I imagine it's very similar.)
Are there any things I specifically need to watch out for, or any performance-related things I should know about?
Cheers
↧
How to chart a list of key/value pair fields only if a condition matches ?
Hi Splunk'az,
I have events composed of 64 key/value pairs that are extracted into fields at index time:
"d" : {
    "field01" : [ 0 ],
    "field02" : [ 5 ],
    "field03" : [ 2 ],
    "field04" : [ 3 ],
    [...]
    "field64" : [ 38 ]
}
I would like to chart the "value" of the field "only if" it is above a certain threshold.
I was initially thinking of using 'where':
| stats count last(field*) | where field* > 100
But the above doesn't work, as 'where' can't contain a wildcard...
Then I was looking at the 'foreach' command, and trying something like:
| foreach c5x* [eval new_<<FIELD>>=if(<<FIELD>> > 0, <<FIELD>>, null())] | table new_*
But that doesn't work either, and I would still have to get rid of the null values from fields using a wildcard again, so the problem would remain the same.
So, how can I chart a set of field values only if the value is above a certain threshold, and without having to hardcode the complete list of fields, obviously? ;-)
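The filtering being asked for is "keep only fields whose value exceeds the threshold, drop the rest entirely", which can be sketched in Python (field values and the 100 threshold are hypothetical):

```python
# Hypothetical event: field names as in the question, values made up.
fields = {"field01": 0, "field02": 5, "field03": 2, "field04": 150, "field64": 38}
threshold = 100

# Keep only fields above the threshold; the others are dropped outright
# rather than set to null, so no wildcard cleanup pass is needed afterwards.
above = {name: value for name, value in fields.items() if value > threshold}
print(above)
```

In SPL, one possible route is `| foreach field* [eval <<FIELD>>=if(<<FIELD>> > 100, <<FIELD>>, null())]` followed by charting functions, since most stats/chart functions ignore null values; this is a sketch, not a tested answer.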
Thanks in advance!
↧
↧
Remote File & Directory input not automatically inputting data
I currently have a Remote File & Directory Data Input on the following log
'`C:\Windows\System32\winevt\Logs\Microsoft-Windows-TerminalServices-LocalSessionManager%4Operational.evtx`'
If I disable and re-enable the data input, it will import the log data. If I then generate events in the log, they are not automatically imported into Splunk. However, if I go back and disable and re-enable the data input, it imports the backlog of events perfectly. Is there any way to automate this?
↧
Timechart as VS Timechart by
Over the last 3 days I have been trying to create a dashboard with single values + trends.
The query was something like this:
* | dedup 1 src | timechart count by src
**The goal was to get total number of src based on dashboard time range (before talking about the trends).**
Right now, by mistake (I must be honest), I changed the query: I replaced the word "by" with "as", and it seems to work, but not perfectly. When I change the time range to All time I expect to see all events, but I get only one of them (although there are 5 events in the results):
**Can someone please tell me why I got confused and to translate my goal above into query correctly for next time?**
Thanks
↧
How to search with a fixed time span timechart everyday?
I am trying to find my average response time of everyday events (not the average of all the events of that day, but only the events from 10AM to 1PM), for just the last 7 days.
sourcetype="super:access" host=xa20hlf** | eval headers=split(_raw," ") | eval resp_time=mvindex(headers,10) | eval resptime_time_seconds=resp_time*0.001| timechart span=1d eval(round(avg(resptime_time_seconds),2)) as avgTime
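One way to restrict each day to the 10:00–13:00 window before averaging, sketched in Python over hypothetical (timestamp, response-time) pairs:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical events: (timestamp, response time in seconds).
events = [
    (datetime(2018, 8, 27, 9, 30), 1.2),   # before 10AM, ignored
    (datetime(2018, 8, 27, 10, 15), 0.8),
    (datetime(2018, 8, 27, 12, 45), 1.0),
    (datetime(2018, 8, 28, 11, 0), 2.0),
    (datetime(2018, 8, 28, 14, 0), 9.9),   # after 1PM, ignored
]

per_day = defaultdict(list)
for ts, resp in events:
    if 10 <= ts.hour < 13:  # keep only the 10AM-1PM window
        per_day[ts.date()].append(resp)

for day, vals in sorted(per_day.items()):
    print(day, round(sum(vals) / len(vals), 2))
```

In SPL, the equivalent restriction is often done with the default datetime field, e.g. `| where date_hour>=10 AND date_hour<13` (or a search-time filter on `date_hour`) placed before the `timechart span=1d`.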
↧
cannot log in with domain user
I installed Splunk on Linux; I have 2 heavy forwarders and one indexer, all of them Linux based.
I want to log in to my indexer with a domain user. I configured LDAP on my indexer and it is OK; my user has administrator privileges.
I have one problem: I cannot log in with my domain user on the indexer. Can you tell me what my problem is?
All the configuration for LDAP is correct.
↧
↧
What is the maximum field name length
I have a library for creating application event logs formatted as key-value pairs. It allows the caller to create arbitrary keys to include some attribute in the logging event. What is the maximum field name length that Splunk will accept, so I can configure the logger interface to match it?
↧
Need to be able to know the number of events processed
Question 1: I have a dashboard with multiple timechart-type queries running with different logic and different preset times.
I want to know how many events were processed to produce each chart. And, if possible, can I have that value shown in the same dashboard?
Question 2: I want a query that can tell me how many events there are in a specific time period I choose, e.g. 25th Aug from 10:00 to 14:00, or something like 15th August all day, possibly in a timechart visualization like Monday = 30000 events, Tuesday = 45666 events, etc., in a line chart.
↧
Extract multiple values from a multi-value field and use these in a table?
I have a multivalue field (custom_4) separated by dollar signs that I have split into separate values with the search below. However, that only puts each value on a different line within the same row. I would like to create a column header for each value and place each value under its own header. Basically, when I split the multivalue field using makemv, I want the resulting single values to appear across the row for the same record, with separate column names, instead of as multiple rows as it is now. The new column headers (fields) would be: Tool, ID, Severity, Incident ID, Progress. Thanks!
index=UIM sourcetype=nas_transaction_log
| makemv delim="$" custom_4
| top limit=20 custom_4
Before:
"Tool name"
"ID#"
"Severity"
"incident id#"
"status"
What I want:
Tool ID Severity Incident ID Progress
"Tool name" "ID#" "severity" "incident#" "status"
↧