Channel: Questions in topic: "splunk-enterprise"
Viewing all 47296 articles

Will Splunk DB Connect and HEC work on the same HTTPS port? If so, how can we test it?

I want to test whether the Splunk DB Connect app and the HTTP Event Collector both work on the same HTTPS port. Any help will be appreciated.
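For reference, a quick way to test whether HEC is answering on a given HTTPS port is to post a test event to it. A minimal Python sketch, with placeholder host and token (HEC listens on its own port, 8088 by default, separate from splunkd's 8089 management port):

```python
import json
import urllib.request

# Placeholders -- substitute your own host, HEC port and token.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_request(url, token, event):
    """Build (but do not send) a HEC POST request for a test event."""
    body = json.dumps({"event": event, "sourcetype": "manual_test"}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Authorization": f"Splunk {token}"},
        method="POST",
    )

req = build_hec_request(HEC_URL, TOKEN, "hello from HEC")
print(req.full_url, req.get_header("Authorization"))
```

Calling `urllib.request.urlopen(req)` would actually send the event; an HTTP 200 with a `"Success"` body means HEC accepted it on that port.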

Date not parsing correctly.

I have two Splunk environments, DEV and PROD, and I am sending events from the same syslog source. I have this date-parsing configuration:

TIME_PREFIX=severity\=\d+\|
MAX_TIMESTAMP_LOOKAHEAD=22
TIME_FORMAT=%Y-%b-%d %H:%M:%S
TZ = UTC

Here is the event string:

Aug 29 11:08:30 tnnwsau1 CEF:1|RSA|Netwitness|10.6|severity=2|2018-Aug-29 15:05:07|Executables

In DEV it parses correctly (2018-Aug-29 15:05:07); in PROD, however, it picks up Aug 29 11:08:30. My DEV is RHEL 6, PROD is RHEL 7. Is there some global setting that might be an issue? DEV is a single search head, whereas PROD is a clustered SH. Any thoughts? Thanks!
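The settings themselves can be sanity-checked outside Splunk by mimicking what TIME_PREFIX, MAX_TIMESTAMP_LOOKAHEAD and TIME_FORMAT do. A rough Python simulation (not Splunk's actual parser) confirms the configuration matches the sample event:

```python
import re
from datetime import datetime

# Sample event and the props.conf settings from the question.
event = ("Aug 29 11:08:30 tnnwsau1 CEF:1|RSA|Netwitness|10.6"
         "|severity=2|2018-Aug-29 15:05:07|Executables")

# TIME_PREFIX: Splunk looks for the timestamp right after this match.
prefix = re.compile(r"severity=\d+\|")
m = prefix.search(event)

# MAX_TIMESTAMP_LOOKAHEAD=22: only this many characters after the prefix.
candidate = event[m.end():m.end() + 22]

# TIME_FORMAT=%Y-%b-%d %H:%M:%S
parsed = datetime.strptime(candidate.split("|")[0], "%Y-%b-%d %H:%M:%S")
print(parsed)  # 2018-08-29 15:05:07
```

Since the settings parse the sample correctly here, the difference between DEV and PROD is more likely to be where the props.conf lives (timestamping happens at parse time on the indexers/heavy forwarders, not on search heads) than the RHEL version.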

Track user session from VPN to Windows server(s)

I would like to be able to track a user's login session from the VPN and then the login to a Windows server(s). User login scenario: VPN login --> Windows Server login --> Windows Server login.

VPN search --> userid=user.id index=x "Login succeeded" | rex "\]\sGDI\\\(?[^\(]+)"

Windows search --> user=user.id index=wineventlog sourcetype="WinEventLog:Security" Account_Name!="*$" AND Account_Name!=SYSTEM AND EventCode=4624 AND user!="ANONYMOUS LOGON"

I would like to create a table that shows their login time at the VPN, plus the login time and process(es) started on the Windows servers. I can do this separately, but how do you combine the searches? Thanks.
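One common SPL pattern for this is to combine both searches with an OR (or `append`) and then `stats ... by user`. Purely to illustrate the shape of the combined table, a Python sketch with made-up field names (these are assumptions, not the question's actual extractions):

```python
from collections import defaultdict

# Hypothetical parsed events from the two searches.
vpn_logins = [
    {"user": "jdoe", "vpn_time": "09:00"},
]
win_logins = [
    {"user": "jdoe", "win_time": "09:05", "process": "cmd.exe"},
    {"user": "jdoe", "win_time": "09:12", "process": "powershell.exe"},
]

# Mimics what `stats values(...) by user` over both sourcetypes produces:
# one row per user, with VPN and Windows details side by side.
sessions = defaultdict(lambda: {"vpn_time": None, "windows": []})
for e in vpn_logins:
    sessions[e["user"]]["vpn_time"] = e["vpn_time"]
for e in win_logins:
    sessions[e["user"]]["windows"].append((e["win_time"], e["process"]))

for user, s in sessions.items():
    print(user, s["vpn_time"], s["windows"])
```

The key requirement in either language is a shared join key (the normalized user name) across both event sources.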

Search by user request parameter in Splunk Dashboard

I have an event type that returns JSON data after the search eventtype="my_new". Data returned after this search:

data: {"Id":2001373223232214,"SerialNumber":6773737,"Unique":"200000","messageType":"READY","activeStartTime":"2018-08-29T09:30:00.000-0500","activeEndTime":"2018-08-30T11:00:00.000-0500","additional":["fitness","pool","public"],}

Now I want to give the user a feature on the dashboard to search for the whole message by entering the `Unique` number in a search box. I have no clue how to do this in Splunk. Please help me with this.
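In a dashboard this is typically a text input whose token is substituted into the search, e.g. eventtype="my_new" Unique=$unique_tok$ (the token name is an assumption). The filtering the token drives amounts to:

```python
import json

# Hypothetical events as returned by eventtype="my_new".
events = [
    {"Id": 1, "Unique": "200000", "messageType": "READY"},
    {"Id": 2, "Unique": "300000", "messageType": "DONE"},
]

def search_by_unique(events, wanted):
    """Return every event whose Unique field matches the user's input,
    the way a dashboard token substituted into the search would."""
    return [e for e in events if e.get("Unique") == wanted]

print(json.dumps(search_by_unique(events, "200000")))
```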

Audit modifications to search head cluster

I need to create a query that reveals who changed which objects on the search head cluster, and when (excluding modifications to personal items). My query so far is:

index=_internal source="/opt/splunk/var/log/splunk/splunkd_ui_access.log" method=POST host="sh_svr_*" NOT prefs NOT parser NOT intentionsparser NOT login NOT jobs NOT "/dispatch" | stats count values(uri) by user

Q1) Is there a better query to use?
Q2) If I continue down this path, how can I exclude results where the user name is contained within the URI string?
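For Q2, something like `| where NOT match(uri, user)` may work (hedged: `match()` treats the user value as a regex, so unusual user names would need escaping). The intended logic, sketched in Python:

```python
# Hypothetical result rows from the stats output.
rows = [
    {"user": "alice", "uri": "/servicesNS/alice/search/saved/searches/mine"},
    {"user": "alice", "uri": "/servicesNS/nobody/search/saved/searches/shared"},
]

# Drop rows where the URI contains the acting user's own name,
# i.e. modifications to that user's personal objects.
shared_only = [r for r in rows if r["user"] not in r["uri"]]
print(shared_only)
```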

website-monitoring 271: "The read operation timed out"

When I try to add a new URI, I keep getting the error below while saving:

Encountered the following error while trying to save: `Splunkd daemon is not responding: ("Error connecting to /servicesNS/admin/launcher/data/inputs/web_ping: ('The read operation timed out',)",)`

I have website-monitoring_271 installed in my Splunk.

Need a report of users using XabAB_TBBBBB_Dashboard (report window: 20th August to date)

I'm looking for a query to find which users are using the dashboards. When I try to find this with the query below, it gives wrong results. Thanks in advance :)

index="_internal" sourcetype=splunkd_ui_access | rex "/app/(?[^/]+)/(?[^?/\s]+)" | stats values(user)
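The rex in the question appears to have lost its capture-group names (angle brackets are often stripped by forum HTML), which would make it invalid as written. A likely reconstruction, with assumed group names, shown in Python:

```python
import re

# Assumed reconstruction of the question's rex; the group names
# "app" and "dashboard" are guesses for the stripped <...> names.
pattern = re.compile(r"/app/(?P<app>[^/]+)/(?P<dashboard>[^?/\s]+)")

uri = "/en-US/app/search/XabAB_TBBBBB_Dashboard?earliest=-7d"
m = pattern.search(uri)
print(m.group("app"), m.group("dashboard"))
```

With the named groups restored, the SPL could then filter with `| search dashboard="XabAB_TBBBBB_Dashboard" | stats values(user)` over the desired time range.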

How to determine sendmail issue

I am getting an error after setting up email alerting. The error I get is:

08-29-2018 15:33:19.626 +0000 ERROR ScriptRunner - stderr from '/opt/splunk/bin/python /opt/splunk/etc/apps/search/bin/sendemail.py "results_link=https://ourdomain.com/app/ourapp/@go?sid=scheduler__twaller__ourapp__ABC5980879bd671d6025_at_1535556780_96501" "ssname=Todds Test Email" "graceful=True" "trigger_time=1535556798" results_file="/opt/splunk/var/run/splunk/dispatch/scheduler__twaller__ourapp__ABC5980879bd671d6025_at_1535556780_96501/results.csv.gz"': ERROR:root:Connection unexpectedly closed while sending mail to: me@gmail.com

My mail server settings are:
Mail host: smtp-relay.gmail.com:465
Email security: Enable SSL
No username/password
Link hostname: https://ourdomain.com
Send emails as: MyCompany

Any ideas why I might be getting these errors about the closed connection? I get the same when searching and using sendemail like:

index=main | head 5 | sendemail to=me@gmail.com server=smtp-relay.gmail.com:465 subject="Here is an email notification" message="This is an example message" sendresults=true inline=true format=raw sendpdf=true

I appreciate the assistance, as always.
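One frequent cause of "Connection unexpectedly closed" on port 465 is a TLS-mode mismatch: port 465 expects implicit TLS (SMTPS) from the very first byte, while STARTTLS-style clients begin in plaintext, and the server then drops the connection. This is not a confirmed diagnosis for the question, but the distinction in Python terms is:

```python
import smtplib

# Port 465 requires an implicit-TLS client (smtplib.SMTP_SSL);
# ports like 587 use a plain client that upgrades via starttls().
# Speaking plain SMTP to 465 typically ends with the server
# closing the connection -- matching the error in the question.
def smtp_client_for(port):
    """Return the smtplib class appropriate for a submission port."""
    if port == 465:
        return smtplib.SMTP_SSL  # implicit TLS from the first byte
    return smtplib.SMTP          # plaintext, optionally starttls()

print(smtp_client_for(465).__name__, smtp_client_for(587).__name__)
```

If the relay also accepts port 587, trying smtp-relay.gmail.com:587 with TLS enabled is a low-effort way to test this hypothesis.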

Why is my Remote File & Directory input not automatically inputting data?

I currently have a Remote File & Directory data input on the following log: '`C:\Windows\System32\winevt\Logs\Microsoft-Windows-TerminalServices-LocalSessionManager%4Operational.evtx`'. If I disable and enable the data input, it imports the log data. If I then generate events within the log, they do not automatically import into Splunk. However, if I go back and disable and enable the data input again, it imports the backlog of events perfectly. Is there any way to automate this?

LDAP configuration: cannot login with domain user

I installed Splunk on Linux; I have 2 HFs and one indexer, all of them Linux-based. I want to log in to my indexer with a domain user, so I configured LDAP on the indexer and that part is OK, and my user has administrator privileges. My one problem is that I cannot log in with my domain user on the indexer. All of the LDAP configuration is correct. Can you tell me what my problem is?

How to extract multiple values from a multi-value field and use these in a table?

I have a multivalue field (custom_4) separated by dollar signs, which I have split into separate values with the search below. However, that only separates each value onto a different line in the same row. I would like to create a column header for each new value and put each value under its own column. Basically, when I split the multivalue field using makemv, I want the new single values to appear across the row for the same record, with separate column names, instead of as multiple rows as it is now. The new column headers (fields) would be: Tool, ID, Severity, Incident Id, Progress. Thanks!

index=UIM sourcetype=nas_transaction_log | makemv delim="$" custom_4 | top limit=20 custom_4

Before:
"Tool name" "ID#" "Severity" "incident id#" "status"

What I want:
Tool ID Severity Incident ID Progress
"Tool name" "ID#" "severity" "incident#" "status"
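One common SPL answer here is makemv followed by mvindex() into named fields, e.g. | eval Tool=mvindex(custom_4,0), ID=mvindex(custom_4,1), and so on (the field order is assumed from the sample). The same split, sketched in Python:

```python
# Column order is an assumption based on the question's sample row.
HEADERS = ["Tool", "ID", "Severity", "Incident Id", "Progress"]

def split_custom_4(raw):
    """Turn one $-delimited custom_4 value into a row of named columns,
    like makemv + mvindex() would in SPL."""
    return dict(zip(HEADERS, raw.split("$")))

row = split_custom_4("Tool name$ID#$severity$incident#$status")
print(row)
```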

Separate the count of two fields into ranges

Hi - I have a dataset that contains two scan-date fields per server. There are 50,000 events in the dataset, one event per server:

hostname, days_since_hw_scan, days_since_sw_scan
server1, 2, 3
server2, 20, 10
server3, 5, 19
...

I want to summarise the dataset so that I have a count of both scan-date fields within a range of days, e.g.:

Range of days, hw_host_scan_count, sw_host_scan_count
0-5, x, y
6-10, x, y
11-15, x, y
...

I can get this OK for one of the fields using the chart command below, but I am looking for a table that includes both fields:

*chart count by hw_host_scan_count span=5*

Any suggestions appreciated. Thanks.
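One approach in SPL is to bin both fields into the same range labels and count each separately before combining the results. The bucketing logic itself, sketched in Python with the sample rows (bucket edges of 0-4, 5-9, ... are an assumption; adjust to match the desired 0-5, 6-10 labels):

```python
from collections import Counter

# Sample rows from the question: (hostname, hw days, sw days).
servers = [
    ("server1", 2, 3),
    ("server2", 20, 10),
    ("server3", 5, 19),
]

def bucket(days, span=5):
    """Label a day count with its 5-day range, like `bin ... span=5`."""
    lo = (days // span) * span
    return f"{lo}-{lo + span - 1}"

hw = Counter(bucket(h) for _, h, _ in servers)
sw = Counter(bucket(s) for _, _, s in servers)

# One row per range, with a count column for each field.
for rng in sorted(set(hw) | set(sw), key=lambda r: int(r.split("-")[0])):
    print(rng, hw.get(rng, 0), sw.get(rng, 0))
```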

Calculate average of events and compare from multiple log files

I want to create a query based on the scenario below:

1. There is a "Login" event in several different source files.
2. Calculate the average of the "Login" event count for each source file, then the average of those averages (i.e. the combined average across all source files).
3. Compare the combined average with each source file's average; if the difference exceeds some threshold, send an alert naming the source file that has the issue.
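The calculation described above, sketched in Python with made-up counts and an assumed threshold (in SPL this would roughly correspond to a per-source count, an eventstats/avg over all sources, and a where clause feeding the alert):

```python
# Hypothetical per-source "Login" counts over some time buckets.
counts = {
    "app1.log": [10, 12, 11],
    "app2.log": [9, 10, 11],
    "app3.log": [2, 1, 3],  # clearly below the others
}

THRESHOLD = 5  # allowed deviation from the combined average (assumption)

# Step 2: per-source average, then the average of those averages.
per_source_avg = {src: sum(v) / len(v) for src, v in counts.items()}
combined_avg = sum(per_source_avg.values()) / len(per_source_avg)

# Step 3: flag any source deviating from the combined average.
alerts = [src for src, avg in per_source_avg.items()
          if abs(avg - combined_avg) > THRESHOLD]
print(alerts)  # ['app3.log']
```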

sed replace command replaces too much

I need some help figuring out why my sed replace command is replacing all of the text to the end of the event in Splunk, rather than just the specific text I had it look for. As part of a GDPR-compliance project, I was tasked with anonymizing personal names that come through Splunk, which my solution does; but I'm finding that everything after the replaced text is being cut off as well. In my props.conf file I've added this section to do the replace:

[host::...*]
SEDCMD-GDPR-anonymize-firstname = s/\"FirstName\"[=:].*\".*?\"/"FirstName":"######"/g

These are JSON messages, so I have Splunk look for the "FirstName":"Billy" pair and replace whatever it finds between the double quotes with the pound signs, which it does. Here's a sample message that I want to anonymize:

"Beneficiary_LocalID":"TZ056500190",**"FirstName":"Billy"**,"Location":"Tanzania"

**Desired result:** "Beneficiary_LocalID":"TZ056500190","FirstName":"######","Location":"Tanzania"

**Actual result:** "Beneficiary_LocalID":"TZ056500190","FirstName":"######"

Do I have something wrong in my regex statement that is causing the rest of the event to be included in the replacement? Any help would be greatly appreciated.
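The likely culprit is the greedy `.*`: it runs to the end of the event and only backtracks far enough to satisfy the trailing `\".*?\"`, so the match swallows everything after the name. Assuming Splunk's SEDCMD regex engine behaves like PCRE here, Python reproduces both the reported behavior and a tighter alternative (the `[^"]*` fix is a suggestion, not the poster's confirmed solution):

```python
import re

event = '"Beneficiary_LocalID":"TZ056500190","FirstName":"Billy","Location":"Tanzania"'

# The question's pattern: greedy .* runs to the end of the event and
# backtracks only far enough to satisfy the final ".*?", so the match
# extends through the last quoted value and Location is consumed too.
greedy = re.sub(r'"FirstName"[=:].*".*?"', '"FirstName":"######"', event)
print(greedy)  # Location is gone, matching the "Actual result"

# Tighter pattern: match only the quoted value itself with [^"]*.
tight = re.sub(r'"FirstName"[=:]"[^"]*"', '"FirstName":"######"', event)
print(tight)   # Location survives, matching the "Desired result"
```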

Splunk_TA_nix not working

Hi, I'm having trouble with the TA_nix application installed on RHEL 7, Splunk version 7.1.1. I'm getting the data from my server that has TA_nix installed; my Splunk server (single instance) also has TA_nix and app_for_nix to monitor the RHEL 7 server, but the reports show "No results found". The data is there: if I search *index=os* or *index=df*, it shows events from the host I want to monitor, but the dashboards are dead. Can you help me? Maybe I can look at some logs or something? Thanks,

What is the maximum length for a field name?

I have a library for creating application event logs formatted as key-value pairs. It allows the caller to create arbitrary keys to include any attribute in the logging event. What is the maximum field-name length that Splunk will accept, so I can configure the logger interface to match it?
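The limit itself needs confirmation from Splunk documentation, but the logger-side guard described above might look like this (MAX_KEY_LEN is a placeholder to be replaced with the real limit, and format_kv is a hypothetical name, not the library's actual API):

```python
# Placeholder until Splunk's actual field-name limit is confirmed.
MAX_KEY_LEN = 64

def format_kv(**fields):
    """Render one logging event as key="value" pairs, rejecting keys
    longer than the configured maximum."""
    for key in fields:
        if len(key) > MAX_KEY_LEN:
            raise ValueError(f"key too long: {key!r}")
    return " ".join(f'{k}="{v}"' for k, v in fields.items())

print(format_kv(user="jdoe", action="login"))
```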