Channel: Questions in topic: "splunk-enterprise"

Why is line breaking not consistent on Tomcat logs?

I've written the props.conf below and placed it in etc\apps\\local. I'm getting sporadic results and lines are being chunked together. Any help would be greatly appreciated.

    [tomcat:jackrabbit:log]
    SHOULD_LINEMERGE = false
    MAX_TIMESTAMP_LOOKAHEAD = 23
    LINE_BREAKER = ([\r\n]+)(\d{4}-\d{2}-\d{2}_\d{2}:\d{2}:\d{2}.\d{3})
    TIME_FORMAT = %Y-%m-%d_%H:%M:%S.%3N
    TIME_PREFIX = ^
    #BREAK_ONLY_BEFORE = ([\r\n]+)(\d{4}-\d{2}-\d{2}_\d{2}:\d{2}:\d{2}.\d{3})

Logged events:

    2019-11-12_14:06:11.682 [http-nio-8081-exec-3_UpdateFundingRate_null] TRACE: 78420:1: setObject: 1 Inputs - |SEAGH0R5| ownerId, ALR_RID_OWNER, java.lang.String
    2019-11-12_14:06:11.682 [http-nio-8081-exec-3_UpdateFundingRate_null] TRACE: 78420:1: setObject: 2 Inputs - |CUS  | ownerType, ALR_CDE_OWNER_TYPE, java.lang.String
    2019-11-12_14:06:11.698 [http-nio-8081-exec-3_UpdateFundingRate_null] DEBUG: execute sql jar:file:/C:/LOANIQ/Server/mssxml.jar!/78420.xml, Row Count = 0
    2019-11-12_14:06:11.729 [http-nio-8081-exec-7_RunXQuery_null] DEBUG: execute trans for xml file = SqlQuery[1,JDBCAdapterSqlXml[78420,jar:file:/C:/LOANIQ/Server/mssxml.jar!/78420.xml,in:2,out:9,count:1,exec:DEFAULT]]
    2019-11-12_14:06:11.729 [http-nio-8081-exec-7_RunXQuery_null] TRACE: Prepared 78420:1 { SELECT ALR_TSP_REC_CREATE , ALR_UID_REC_CREATE , ALR_TXT_DETAILS , ALR_RID_ALERT , ALR_RID_OWNER , ALR_CDE_OWNER_TYPE , ALR_TXT_SHORT_DESC , ALR_TSP_REC_UPDATE , ALR_UID_REC_UPDATE FROM VLS_ALERT WHERE ALR_RID_OWNER = CAST ( ? AS CHAR ( 8 ) ) AND ALR_CDE_OWNER_TYPE = CAST ( ? AS CHAR ( 5 ) ) /* LIQ-78420.xml */ } com.misys.liq.jsqlaccess.adapter.jdbcadapter.JDBCWrapper `com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement@6ee94345
    2019-11-12_14:06:11.729 [http-nio-8081-exec-7_RunXQuery_null] TRACE: 78420:1: setObject: 1 Inputs - |+3BATO74| ownerId, ALR_RID_OWNER, java.lang.String
    2019-11-12_14:06:11.729 [http-nio-8081-exec-7_RunXQuery_null] TRACE: 78420:1: setObject: 2 Inputs - |DEA  | ownerType, ALR_CDE_OWNER_TYPE, java.lang.String
    2019-11-12_14:06:11.744 [http-nio-8081-exec-7_RunXQuery_null] DEBUG: execute sql jar:file:/C:/LOANIQ/Server/mssxml.jar!/78420.xml, Row Count = 0
    2019-11-12_14:06:11.776 [http-nio-8081-exec-9_RunXQuery_null] DEBUG: execute trans for xml file = SqlQuery[1,JDBCAdapterSqlXml[78420,jar:file:/C:/LOANIQ/Server/mssxml.jar!/78420.xml,in:2,out:9,count:1,exec:DEFAULT]]
    2019-11-12_14:06:11.776 [http-nio-8081-exec-9_RunXQuery_null] TRACE: Prepared 78420:1 {
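
A minimal sketch of how that stanza is often tightened up, keeping the sourcetype and timestamp format from the post; the only real changes are the escaped dot in the regex and dropping the redundant capture group, so treat it as a starting point rather than a guaranteed fix:

    [tomcat:jackrabbit:log]
    SHOULD_LINEMERGE = false
    # Break only where a new "YYYY-MM-DD_HH:MM:SS.mmm" timestamp starts a line.
    LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}_\d{2}:\d{2}:\d{2}\.\d{3}
    TIME_PREFIX = ^
    TIME_FORMAT = %Y-%m-%d_%H:%M:%S.%3N
    # 23 characters covers "2019-11-12_14:06:11.682".
    MAX_TIMESTAMP_LOOKAHEAD = 23

Also worth remembering: LINE_BREAKER and SHOULD_LINEMERGE are parse-time settings, so they only take effect on the first full Splunk instance that parses the data (an indexer or heavy forwarder). If the app only reaches a universal forwarder, the stanza is ignored there, which commonly produces exactly this kind of inconsistent breaking.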

Split events to different indexes and sanitize PII data

I have a use case where I need to store PII data in one index and sanitized data in another index. I can use CLONE_SOURCETYPE, which works, but I also want to take the generic data that has no PII and put it in a third index. The idea is as follows:

1) Store non-PII data in index1.
2) Store PII data in a protected index2.
3) Store the same PII data, but sanitized, in a general index3.

So for incoming data: events with no queries go to index1, events with queries go to index2, and a duplicate of the index2 data is sanitized with SED/transforms etc. and sent to index3.
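
A hedged sketch of one way to wire this up with index-time transforms. The sourcetype, index names, and regexes are placeholders (PII_PATTERN stands in for whatever identifies a PII-bearing event, and the SEDCMD masking a 9-digit number is only an illustration):

    # props.conf
    [my_sourcetype]
    TRANSFORMS-routing = route_non_pii, route_pii, clone_for_sanitizing

    [my_sourcetype_sanitized]
    # Mask the sensitive values on the cloned copy, then route it to the general index.
    SEDCMD-mask_pii = s/\d{9}/XXXXXXXXX/g
    TRANSFORMS-route_clone = route_sanitized

    # transforms.conf
    [route_non_pii]
    REGEX = ^(?!.*PII_PATTERN)
    DEST_KEY = _MetaData:Index
    FORMAT = index1

    [route_pii]
    REGEX = PII_PATTERN
    DEST_KEY = _MetaData:Index
    FORMAT = index2

    [clone_for_sanitizing]
    REGEX = PII_PATTERN
    CLONE_SOURCETYPE = my_sourcetype_sanitized

    [route_sanitized]
    REGEX = .
    DEST_KEY = _MetaData:Index
    FORMAT = index3

CLONE_SOURCETYPE duplicates the matching events and runs the copies through the props/transforms of the new sourcetype, which is what lets the clone be masked and routed independently of the original; keep in mind the cloned data counts against the license a second time.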

2.01 sign-ins worked for 12 hours and then stopped with 403 Forbidden errors

Hello, version 2.01 is the only one I installed. I configured sign-ins and audit, and the data started flowing. Twelve hours later, the sign-ins source started returning 403 Forbidden while the audit source continued to work. I tried to figure out what the problem was, decided to wipe everything off and start from scratch configuring only the sign-in logs, and it still does not work. I have no clue where to go from here - anyone with similar experiences? Not sure where to look for help. Here's the full error:

    2019-11-12 18:02:58,198 ERROR pid=66637 tid=MainThread file=base_modinput.py:log_error:307 | Get error when collecting events.
    Traceback (most recent call last):
      File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_ms_aad/modinput_wrapper/base_modinput.py", line 127, in stream_events
        self.collect_events(ew)
      File "/opt/splunk/etc/apps/TA-MS-AAD/bin/MS_AAD_signins.py", line 84, in collect_events
        input_module.collect_events(self, ew)
      File "/opt/splunk/etc/apps/TA-MS-AAD/bin/input_module_MS_AAD_signins.py", line 77, in collect_events
        sign_ins = azutils.get_items(helper, access_token, url)
      File "/opt/splunk/etc/apps/TA-MS-AAD/bin/ta_azure_utils/utils.py", line 33, in get_items
        raise e
    HTTPError: 403 Client Error: Forbidden for url: https://graph.microsoft.com/beta/auditLogs/signIns?$orderby=createdDateTime&$filter=createdDateTime+gt+2019-11-11T18:02:57.386563Z+and+createdDateTime+le+2019-11-12T22:55:57.688509Z

How to find spike in total count of a field?

I'd like to be able to search for the following:

1) A timechart over X days of the sum/total count of a field.
2) Spikes or % increase in that sum/total compared to the previous hour, day, week, etc.

For example, we are ingesting Palo Alto logs and I'd like to see which fields have the highest sum/total for the time period I run, and then also see the % increase/decrease of that sum/total compared to an hour/day/week/etc. earlier. I'm not looking to break the count out by field value using 'BY'; I'm just interested in the sum/total of all events for a field. Thx
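
A sketch of the hour-over-hour version of this, assuming Palo Alto traffic data in an index called pan_logs with sourcetype pan:traffic (both placeholders); swapping the span and the comparison window gives the day/week variants:

    index=pan_logs sourcetype=pan:traffic
    | timechart span=1h count AS total
    | streamstats current=f window=1 last(total) AS prev_total
    | eval pct_change = round((total - prev_total) / prev_total * 100, 1)

The streamstats command with current=f and window=1 carries each bucket's previous total forward, so pct_change is the percentage change relative to the prior hour.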

Update the search string of a SearchManager via JavaScript

I have a button that changes the search command string. I want to update that string on the "search" property of the SearchManager and trigger it to refresh the data in the view. Is that possible? Can someone please help? An example would be easiest for me to understand.

Steps to Clean Up a search head in a search head cluster

Hi guys, it would be helpful if anyone could share knowledge or steps for cleaning up a search head in a search head cluster environment. I'd like to know what gets cleaned up and what the overall process is. Thanks in advance!! Sarah

Send job to background - will it resume after a splunkd Windows service restart?

Hi, I have manually sent a query to the background as a job. It will run quite long, since the disks are not the fastest and the timeframe is 6 months. Will Splunk resume the query once the splunkd Windows service is restarted, or will it have to start from 0?

XML file not parsing in Splunk

Hi, a sales order XML file is not being parsed in the Splunk web interface. I'm trying to fetch the sales order count based on a specific keyword in the XML file (many such files are generated in the same location each day), and the sales order count needs to be displayed by time frame.

Note: the sales orders need to be counted based on the field "**行のデータを抽出しました。**" (roughly, "extracted rows of data"); from the XML below the sales order count is 132. I'm attaching the XML file below - please advise.

Description of the XML: the MDNET-to-NetSuite .xml files are stored in the same location. From these .xml files we need to get the sales order header count and line details (i.e. read the values from the raw .xml files). For MDNET I need to fetch the line count of 132, plus the details relating to the name field (**InsertIntoNetSuite**).

(XML sample attachment not shown.)
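
If each extracted row produces one indexed event containing that phrase, a counting search could be as simple as the sketch below; the index and sourcetype are placeholders, and this assumes the files are being ingested at all, which the post suggests is the underlying problem:

    index=main sourcetype=netsuite_xml "行のデータを抽出しました。"
    | timechart span=1d count AS sales_order_lines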

How to add a header to the row-number column of a table in a dashboard

I have set up a row-number column in a dashboard table, as in the figure below, but I want it to have a header. How can I change the blank header, in the same way the simple_xml_examples app changes the rows with JavaScript?
![alt text][1]
[1]: /storage/temp/275133-rowsnumber-header.png

Build a distributed search environment with trial version?

Hello, I wanted to build a distributed search environment with Splunk under the **trial license**. But every time I try to configure one of my two instances as a search peer, after I enter all the required details I get an error message. Nothing else comes to mind as to why this should not work, other than a trial license limitation. What do you think?

Splunk 8.x osquery

Curious if the current app or any future versions will be compatible with Splunk 8.x

Splunk Smartstore - Can we implement this solution for a framework that consists of multiple unclustered Indexers and if yes, how do we do that?

Hello everyone, I wanted to see if you have any inputs or suggestions on this. Recently my team and I attended the Splunk conference (.conf19) and went through some sessions on Splunk SmartStore. We want to implement this solution in our environment; we created the necessary epics and started building some related stories.

SmartStore is an indexer feature that provides a way to use remote object stores, such as Amazon S3, to store indexed data. By reducing reliance on local storage, SmartStore allows us to scale compute and storage resources separately, improving the efficiency of resource usage.

One of our brands/customers is using a Splunk instance that consists of multiple unclustered indexers. What would be the best approach to implement SmartStore with this framework (i.e. unclustered indexers)? Is it possible to implement this solution at all, and what options do we have on our plate here? Appreciate any feedback on this. Thank you, vivek
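
For reference, a minimal indexes.conf sketch of what pointing an index at a SmartStore remote volume looks like; the bucket, endpoint, credentials, and index name are placeholders. Whether this is advisable on standalone, unclustered indexers is a separate question, since each indexer would still manage its own buckets independently:

    [volume:remote_store]
    storageType = remote
    path = s3://my-smartstore-bucket/indexes
    remote.s3.access_key = <access key>
    remote.s3.secret_key = <secret key>
    remote.s3.endpoint = https://s3.us-east-1.amazonaws.com

    [my_index]
    homePath = $SPLUNK_DB/my_index/db
    coldPath = $SPLUNK_DB/my_index/colddb
    thawedPath = $SPLUNK_DB/my_index/thaweddb
    # This is what makes the index SmartStore-enabled.
    remotePath = volume:remote_store/$_index_name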

Embed Saved Search with API (preferably Python SDK)

I'm programmatically generating saved searches with the Python SDK, which is great. I then want to embed those saved searches on an external website (Confluence). Embeddable dashboards would obviate the need for the saved-search generation, but alas that feature is seemingly never coming. Is it possible to embed saved searches with the API? I can't figure out a way to do that.

How to add different marker types for different lines in 1 chart?

Hi, I have 3 lines in 1 chart (average, threshold, total_alarm). I would like to use a different marker type for each of the 3 lines. I have tried a charting option for this, but the same marker gets applied to all 3 lines.

Website Monitoring Not Reporting Data

Hey Splunkers, we have configured hundreds of URLs to monitor their response codes. Surprisingly this stopped working: there is no data coming from any of the web_ping:// sources. When I checked splunkd.log on the search head, I didn't find any trace of errors or warnings for this. Can you help me with keywords or a procedure to troubleshoot it? I searched for web_ping and ERROR but didn't find anything fishy. To my surprise, when I restart the Splunk services I get logs from these URLs for some time, and then it stops again. I am using Splunk version 6.6.3. Checking the resource usage on the SH, everything seems normal.

Logged in User's Timezone on Menu Bar?

Is it possible to put the logged-in user's timezone (from their preferences) onto the menu bar (top right), next to their account name? Some users working in Splunk forget what timezone they have selected when performing queries, etc. This would be a good visual indicator and reminder.

How can I connect my Ionic app to a Splunk Enterprise server?

I am trying to connect my Ionic app to a Splunk Enterprise server, but I don't know how to do it. I installed the JavaScript SDK for Splunk in my Ionic project and then added a script to connect, but it returns 404 Not Found. I need some help, please.

How to raise an alert for sourcetype=netstat

Hi Splunkers, how can I write a Splunk query to show the state of a port for a local address? The netstat result covers all of the ports on a particular server, and the results look like:

    Proto Recv-Q Send-Q LocalAddress ForeignAddress State
    tcp   0      0      0.0.0.0:111  0.0.0.0:*      LISTEN

In this case, how should I write the query so that it alerts if the State for port 111 changes from LISTEN to CLOSE_WAIT, CLOSED, etc.?
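
A sketch of one way to extract the state and alert on it, assuming the netstat output is indexed line by line as above; the host value and the extracted field names are placeholders invented for this example:

    sourcetype=netstat host=myserver
    | rex "^(?<proto>\S+)\s+\d+\s+\d+\s+(?<local_address>\S+)\s+(?<foreign_address>\S+)\s+(?<state>\S+)"
    | where like(local_address, "%:111") AND state != "LISTEN"

Saved as an alert that triggers when the number of results is greater than zero, this would fire whenever port 111 shows up in any state other than LISTEN.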

Index is showing 0 data.

I created a VM (EC2 in AWS, CentOS) and attached the Splunk EBS volume to it, mounted on /opt. On this server Splunk is running well, but the indexes show 0 data in Splunk Web, even though I can see data under $SPLUNK_DB for all of my application indexes.

indexes.conf has the configuration below:

    [prod-abc]
    homePath = $SPLUNK_DB/prod-abc/db
    coldPath = $SPLUNK_DB/prod-abc/colddb
    coldToFrozenDir = $SPLUNK_DB/prod-abc/frozendb
    thawedPath = $SPLUNK_DB/prod-abc/thaweddb
    frozenTimePeriodInSecs = 63072000
    maxDataSize = auto

If I create a new test index with the settings below, changing the paths to absolute ones, data does populate in the index:

    [prod-test]
    homePath = /opt/splunk/volr/splunk/prod-abc/db
    coldPath = /opt/splunk/volr/splunk//prod-abc/colddb
    coldToFrozenDir = /opt/splunk/volr/splunk//prod-abc/frozendb
    thawedPath = /opt/splunk/volr/splunk//prod-abc/thaweddb
    frozenTimePeriodInSecs = 63072000
    maxDataSize = auto

My splunk-launch.conf looks like this:

    # Where splunk is installed.
    SPLUNK_HOME=/opt/splunk
    # Location where we want to store the indexed data.
    SPLUNK_DB=/volr/splunk
    # Splunkd daemon name
    SPLUNK_SERVER_NAME=splunkd
    # Splunkweb daemon name
    SPLUNK_WEB_NAME=splunkweb
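
Based only on the paths in the post, one hedged observation: the absolute paths that work resolve under /opt/splunk/volr/splunk, while SPLUNK_DB is set to /volr/splunk, so $SPLUNK_DB may simply be expanding to a different directory than the one holding the data. A sketch of the splunk-launch.conf change that would line the two up, assuming /opt/splunk/volr/splunk really is where the buckets live:

    # splunk-launch.conf
    SPLUNK_HOME=/opt/splunk
    # Point SPLUNK_DB at the directory that actually holds the index buckets.
    SPLUNK_DB=/opt/splunk/volr/splunk
    SPLUNK_SERVER_NAME=splunkd
    SPLUNK_WEB_NAME=splunkweb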

Set a default index for all INPUTS within a specific app only

I have an app with a long list of inputs. I want to set them to go to a specific index (let's say `index = my_index`). I can achieve this by placing `index = my_index` under the `[default]` stanza in the app. However, will this affect other apps on the forwarder with their inputs set to default? I deploy this app with a Deployment Server to several UFs.
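
A hedged sketch of the more explicit alternative, which sidesteps the question entirely: set the index on each input stanza in this app's inputs.conf rather than relying on `[default]`. The monitor paths and index name here are placeholders:

    # inputs.conf in the deployed app
    [monitor:///var/log/app_one]
    index = my_index

    [monitor:///var/log/app_two]
    index = my_index

As far as I understand, `[default]` stanza settings are merged across all configuration files of the same type during layering, so keeping the index assignment on the individual stanzas is the safer way to guarantee that other apps' inputs on the same forwarder are untouched.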

