Channel: Questions in topic: "splunk-enterprise"
Viewing all 47296 articles

Is load balancing in UF/HF bound to cause data imbalance on IDXC eventually?

Is load balancing in HF and UF's `outputs.conf` bound to cause data imbalance on IDXC over time? If yes, I wholeheartedly accept that data rebalancing is something we need to do on a regular basis; I just need confirmation from the community. If not, meaning that the line (according to [this][1] Splunk doc) *"every 30 seconds, the forwarder switches the data stream to another indexer in the group, selected at random"* guarantees that the amount of data stays balanced among the peers, then what causes data imbalance and how can it be prevented? Thanks in advance. [1]: https://docs.splunk.com/Documentation/Splunk/6.2.4/Forwarding/Setuploadbalancingd
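For reference, a minimal sketch of the forwarder-side settings in question (the group and server names are hypothetical):

```
# outputs.conf on the UF/HF -- hypothetical group and server names
[tcpout:idxc_group]
server = idx1.example.com:9997, idx2.example.com:9997, idx3.example.com:9997
# time-based load balancing: switch to a randomly selected peer
# every 30 seconds (the default)
autoLBFrequency = 30
```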

Conditional searching using eval command with if match

Hi SMEs: I would like to define a print event type to differentiate remote prints from office print jobs. From my print logs, I'd like to define channel = "Remote Print" where the printer name contains "*WING*RCA*", else "Office Print". I started off with: | eval channel = if(match(like printer="*WING*RCA*", "Remote Print"), "Office Print") -- I'm still relatively new to these commands and would appreciate any assistance. Thanks in advance, Mac
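A working form of that eval would pass match() a field and a regular expression (match takes a regex, not a wildcard pattern) and give if() both a true and a false branch. A sketch, assuming the field is named printer:

```
| eval channel = if(match(printer, "WING.*RCA"), "Remote Print", "Office Print")
```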

Cannot break AWS Cloudtrail events

Hi, the CloudTrail logs come into Splunk without proper event breaking; Splunk only recognizes the first event's timestamp. This is a problem because each 'Records' array contains a large number of separate events, each with its own timestamp. Each event is a JSON block starting with eventVersion. Here is an anonymized sample: { "Records": [ { "eventVersion": "1.05", "userIdentity": { "type": "IAMUser", "principalId": "AAAAAAAAAAAAAAAA12345", "arn": "arn:aws:iam::999999999999:user/S3_ContentProvider", "accountId": "999999999999", "accessKeyId": "BBBBBBBBBBBBBBB12345", "userName": "S3_ContentProvider" }, "eventTime": "2020-03-05T04:16:50Z", "eventSource": "sns.amazonaws.com", "eventName": "ListTopics", "awsRegion": "us-east-1", "sourceIPAddress": "10.10.10.10", "userAgent": "aws-sdk-java/1.11.192 Linux/3.10.0-693.21.1.el7.x86_64 Java_HotSpot(TM)_64-Bit_Server_VM/25.45-b02/1.8.0_45 exec-env/AWS_ECS_EC2", "requestParameters": null, "responseElements": null, "requestID": "0000000a-000a-000a-000a-00000000000a", "eventID": "0000000a-000a-000a-000a-00000000000b", "eventType": "AwsApiCall", "recipientAccountId": "999999999999" }, { "eventVersion": "1.05", "userIdentity": { "type": "IAMUser", "principalId": "AAAAAAAAAAAAAAAA12345", "arn": "arn:aws:iam::999999999999:user/S3_ContentProvider", "accountId": "999999999999", "accessKeyId": "BBBBBBBBBBBBBBB12345", "userName": "S3_ContentProvider" }, "eventTime": "2020-03-05T04:17:04Z", "eventSource": "sns.amazonaws.com", "eventName": "ListTopics", "awsRegion": "us-east-1", "sourceIPAddress": "10.10.10.11", "userAgent": "aws-sdk-java/1.11.192 Linux/3.10.0-693.21.1.el7.x86_64 Java_HotSpot(TM)_64-Bit_Server_VM/25.45-b02/1.8.0_45 exec-env/AWS_ECS_EC2", "requestParameters": null, "responseElements": null, "requestID": "0000000a-000a-000a-000a-00000000000a", "eventID": "0000000a-000a-000a-000a-00000000000b", "eventType": "AwsApiCall", "recipientAccountId": "999999999999" } ] } Thanks.
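One common approach for data shaped like this (a sketch, not a definitive fix: the sourcetype name is hypothetical and the regexes should be validated against real samples) is to break on the boundary between records and anchor timestamp extraction on eventTime:

```
# props.conf -- hypothetical custom sourcetype for these files
[aws:cloudtrail:custom]
SHOULD_LINEMERGE = false
# break between records: "},{" followed by "eventVersion"
LINE_BREAKER = \}\s*(,)\s*\{\s*"eventVersion"
TIME_PREFIX = "eventTime"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%SZ
# optionally strip the wrapping {"Records":[ ... ]} at ingest time
SEDCMD-strip_header = s/^\{\s*"Records"\s*:\s*\[//g
SEDCMD-strip_footer = s/\]\s*\}$//g
```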

[Docker] can not show logs in Splunk

Hi all. I want to send Docker logs from ECS. I ran these commands:

    # openssl x509 -in cacert.pem -subject -noout
    # openssl x509 -in server.pem -subject -noout
    # docker run --name splunk-nginx -p 80:80 -d \
        --log-driver=splunk \
        --log-opt splunk-token=XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX \
        --log-opt splunk-url=https://splunk.example.com:8088 \
        --log-opt splunk-capath=/etc/docker/cacert.pem \
        --log-opt splunk-caname=SplunkServerDefaultCert \
        --log-opt splunk-index="web" \
        --log-opt splunk-sourcetype=access_combined \
        --log-opt splunk-format=raw \
        nginx

But there are no logs in Splunk. I checked the connection with this command:

    curl -k https://XXXXXXX:8088/services/collector/event -H "Authorization: Splunk XXXXXXXXXXXXXX" -d '{"event": "hello world"}'

It works well: "hello world" shows up in Splunk. So I think the logging driver does not work, but where should I fix it? The logConfiguration in AWS? Docker? Some other configuration? Thank you.
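When the containers are launched by ECS rather than by docker run, the driver options go in the task definition's logConfiguration instead. A sketch (the token and URL values are placeholders):

```json
{
  "logConfiguration": {
    "logDriver": "splunk",
    "options": {
      "splunk-token": "XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX",
      "splunk-url": "https://splunk.example.com:8088",
      "splunk-index": "web",
      "splunk-sourcetype": "access_combined",
      "splunk-format": "raw"
    }
  }
}
```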

How to mark differences between two CSVs

Hi guys, there is a CSV that gets updated once a day with details such as: VMName, Group, CPU, Memory, Storage, PowerState. I need to add a column "AnyChanges" with the value Yes or No, so that if any value changed for a particular host it shows Yes: VMName, Group, CPU, Memory, Storage, PowerState, AnyChanges. Note: this needs to be checked for a month; if there are any changes they should be highlighted in the "AnyChanges" column. Please let me know how this can be implemented.
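One way to sketch this (lookup file names are hypothetical, and it assumes yesterday's copy of the CSV is saved as a second lookup before each update):

```
| inputlookup vm_details.csv
| lookup vm_details_prev.csv VMName OUTPUT CPU as prev_CPU, Memory as prev_Memory,
    Storage as prev_Storage, PowerState as prev_PowerState
| eval AnyChanges=if(CPU!=prev_CPU OR Memory!=prev_Memory OR Storage!=prev_Storage
    OR PowerState!=prev_PowerState, "Yes", "No")
| table VMName, Group, CPU, Memory, Storage, PowerState, AnyChanges
```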

First event

Hello, this is my query:

    | loadjob savedsearch="myquery"
    | where (strftime(_time, "%Y-%m-%d") >= "2020-02-26") AND (strftime(_time, "%Y-%m-%d") <= "2020-03-03") AND STEP=="Click"
    | bucket _time span=1d
    | table _time, MESSAGE
    | where MESSAGE = "337668c2-162c-4f4f-bda9-92f7816f2752" OR MESSAGE = "46095117-4dcb-4ebc-9906-8c23f1a1a26b" OR MESSAGE = "60eb62a4-c54a-4fc0-9aaa-17726ff62929" OR MESSAGE = "8b5e055c-17ab-4135-8b90-1fbc65032792"

And this is the result: ![alt text][1] What I want is only the lines in yellow: if I have the same message on the 26th, 27th and 28th, I must keep only the one from the 26th. [1]: /storage/temp/284673-ask.png
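One way to keep only the first occurrence of each MESSAGE (a sketch built on the query above, untested; "..." stands for the same date and MESSAGE filters already shown):

```
| loadjob savedsearch="myquery"
| where ...
| stats min(_time) as _time by MESSAGE
| bucket _time span=1d
| table _time, MESSAGE
```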

Splunk upgrade 6.5.2 to 7.3.4

Hi team! If I want to upgrade from 6.5.2 to 7.3.4, can I do it directly, or do I have to do an extra step? Does a documented upgrade path exist? I have 1 cluster master + deployer | 2 indexers | 3 search heads | 1 HF. Thank you!

Splunk forwarder preventing Docker rebuild

I am wondering if anyone has come across this issue before.

System and application versions:
• Docker version 18.09.4
• Splunk version 7.2.6 (?)
• Windows Server 2019 1809 Build

A summary of what we've discovered and background information:
• Splunk seems to prevent Docker from starting containers; they are stuck in a "Created" state
• We do not use Splunk explicitly as our Docker logging service, i.e. Splunk is not referenced in any Docker config
• Docker and the SplunkForwarder service both start up on host boot
• Changing the dependencies on the service (i.e. Docker starts first or Splunk starts first) doesn't fix the issue
• Stopping Splunk while Docker is running and then creating the containers works
  o As soon as one container has started successfully, we can start Splunk and still create more containers
• Restarting Splunk while Docker is running and then creating the containers does not work

Steps to reproduce on a machine:
1. Boot the server; Docker and Splunk start automatically
2. Attempt to run docker-compose to create our containers with no containers already running or in an exited state; Docker gets stuck with containers in a "Created" state

Steps to mitigate the issue:
1. When there are no containers running, stop the Splunk service
2. Run docker-compose to create at least one container successfully
3. Start the Splunk service
4. Run docker-compose to bring up any remaining containers

Any help or ideas for a workaround would be appreciated. TIA

Visualizations like Chart, Single View in Modal Window

All, I was reusing the Modal Window project from Ian Gillespie as described in the Hurricane Labs tutorial project. This project shows a TABLE in the modal window; I would like to have different visualizations such as chart and single value panels instead. Instead of using the **TableView**, I tried changing it to **ChartView**, **ChartElement**, etc., but I am not able to make it work: I still get the output as a table in the modal window. Could someone show me how to do that? It would be really helpful to have an example for a single value view as well as ChartView.

**Dashboard Code:**

    Master panel: index=_internal | stats count by sourcetype   (time range: -60m@m to now)
    Slave panel:  index=_internal | dedup group | table group   (time range: -60m@m to now)

**Script - "modalviewsearchapp1.js"**

    require([
        'underscore',
        'backbone',
        '../app/search/components/ModalViews',
        'splunkjs/mvc',
        'splunkjs/mvc/searchmanager',
        'splunkjs/mvc/simplexml/ready!'
    ], function(_, Backbone, ModalView, mvc, SearchManager) {
        var master = mvc.Components.get("master");
        var tokens = mvc.Components.getInstance("submitted");
        var slave = mvc.Components.get("slave");

        var detailSearch = new SearchManager({
            id: "detailSearch",
            earliest_time: "-24h@h",
            latest_time: "now",
            preview: true,
            cache: false,
            search: "index=_internal sourcetype=$sourcetype$ | timechart count"
        }, {tokens: true, tokenNamespace: "submitted"});

        var detailedSearch = new SearchManager({
            id: "detailedSearch",
            earliest_time: "-24h@h",
            latest_time: "now",
            preview: true,
            cache: false,
            search: "index=_internal group=$group$ | chart count by sourcetype"
        }, {tokens: true, tokenNamespace: "submitted"});

        master.on("click", function(e) {
            e.preventDefault();
            if (e.field === "sourcetype") {
                var _title = e.data['click.value'];
                tokens.set('sourcetype', _title);
                var modal = new ModalView({ title: _title, search: detailSearch });
                modal.show();
            }
        });

        slave.on("click", function(e) {
            e.preventDefault();
            if (e.field === "group") {
                var _title = e.data['click.value'];
                tokens.set('group', _title);
                var modal = new ModalView({ title: _title, search: detailedSearch });
                modal.show();
            }
        });
    });

**Script - ModalViews**

    define([
        'underscore',
        'backbone',
        'jquery',
        'splunkjs/mvc',
        'splunkjs/mvc/searchmanager',
        'splunkjs/mvc/simplexml/element/table',
        'splunkjs/mvc/chartview',
        'splunkjs/mvc/simplexml/element/chart',
        'splunkjs/ready!'
    // NOTE: the original callback was function(_, Backbone, $, mvc, SearchManager, ChartElement),
    // which binds the name ChartElement to the 6th dependency -- the TABLE element module --
    // which would explain a table always rendering. The parameter list below matches every dependency.
    ], function(_, Backbone, $, mvc, SearchManager, TableElement, ChartView, ChartElement) {

        // Modal markup (the template HTML was lost from the posted question; this is a
        // reconstruction consistent with the selectors used in events/show below).
        var modalTemplate =
            '<div class="modal">' +
                '<div class="modal-header">' +
                    '<h3><%- title %></h3>' +
                    '<a href="#" class="close">&times;</a>' +
                '</div>' +
                '<div class="modal-body"></div>' +
                '<div class="modal-footer"></div>' +
            '</div>' +
            '<div class="modal-backdrop"></div>';

        var ModalView = Backbone.View.extend({
            defaults: { title: 'Not set' },
            initialize: function(options) {
                this.options = options;
                this.options = _.extend({}, this.defaults, this.options);
                this.childViews = [];
                console.log('Hello from the modal window: ', this.options.title);
                this.template = _.template(modalTemplate);
            },
            events: {
                'click .close': 'close',
                'click .modal-backdrop': 'close'
            },
            render: function() {
                var data = { title: this.options.title };
                this.$el.html(this.template(data));
                return this;
            },
            show: function() {
                $(document.body).append(this.render().el);
                // container that the visualization renders into
                $(this.el).find('.modal-body').append('<div id="modalVizualization"></div>');
                $(this.el).find('.modal').css({
                    width: '90%',
                    height: 'auto',
                    left: '5%',
                    'margin-left': '0',
                    'max-height': '100%'
                });
                var search = mvc.Components.get(this.options.search.id);
                var detailTable = new ChartElement({
                    id: "detailTable",
                    'charting.chart': 'pie',   // missing comma after this option fixed
                    managerid: search.name,
                    el: $('#modalVizualization')
                }).render();
                this.childViews.push(detailTable);
                search.startSearch();
            },
            close: function() {
                this.unbind();
                this.remove();
                _.each(this.childViews, function(childView) {
                    childView.unbind();
                    childView.remove();
                });
            }
        });

        return ModalView;
    });

Code Sample does not work well.

    | makeresults
    | eval _raw="Source1_field2,Count dev,6 prod,5 uat,7 qa,8"
    | multikv forceheader=1
    | table Source1_field2, Count
    | rename COMMENT as "this is sample your stats output"
    | transpose 0 header_field=Source1_field2
    | eval "prod + uat"=prod+uat
    | fields - prod uat
    | transpose 0 column_name="Source1_field2" header_field=column

This code works, but the `code sample` formatting adds extra spaces; when I copy and paste it into search, it does not work. What should I do?

Limiting Results of matching values in an array field

Does anyone know of a way to return only the matching values of a subsearch in the string array field of the parent search?

    index="email" sourcetype="email_links"
        [ search index="sinkholed" sourcetype="bad_http" | rename raw_host as "extracted_host{}" | fields "extracted_host{}" ]
    | stats dc("rcptto{}") as recipient_dc values("rcptto{}") values("extracted_host{}") values(subject) by from
    | sort recipient_dc

The query works fine except I'm getting back more than I want: the results in the "extracted_host{}" field are everything in that field's value array instead of just the matching criteria. For example, suppose the subsearch contains a sinkholed domain called baddomain.com. The results I see in "extracted_host{}" are: baddomain.com, www.w3.org, abc123advertisement.com, etcetcetc.com. I would like to return only what matched in the subsearch. Any assistance is greatly appreciated.
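One possible approach (a sketch, untested; the lookup name is hypothetical): materialize the sinkholed hosts as a lookup, expand the multivalue field, and keep only the rows that match before re-aggregating:

```
index="sinkholed" sourcetype="bad_http" | dedup raw_host | table raw_host
| outputlookup bad_hosts.csv
```

then:

```
index="email" sourcetype="email_links"
| rename "extracted_host{}" as extracted_host
| mvexpand extracted_host
| lookup bad_hosts.csv raw_host AS extracted_host OUTPUT raw_host AS matched
| where isnotnull(matched)
| stats dc("rcptto{}") as recipient_dc values("rcptto{}") values(extracted_host) values(subject) by from
| sort recipient_dc
```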

Scripted input with bash script is not generating any results

I have a bash script that queries audit.log using ausearch for events that I have configured in audit.rules with a specific key attached. This is the general idea of the script:

    # Assign path variables
    # Capture saved timestamp from last execution
    # Save new timestamp for future execution
    # Execute query using ausearch
    # Redirect stdout and stderr to two different variables
    # Check that the stderr variable does not equal "" and exit execution if true

This is where I have tried multiple things; while all of them work when executed from a terminal, they don't generate any results when Splunk executes them:

    echo $stdout_var

or

    echo $stdout_var > /path/to/tmp
    cat /path/to/tmp

I have even tried monitoring "/path/to/tmp"; that is when I realized this might be a user permissions issue, since the file is generated but there is never any content in it. Currently SPLUNK_OS_USER=root, but does that mean the script is executed as SPLUNK_OS_USER? Or do I have to configure the script through Splunk to run as a specific user? Again, when I execute the script manually from the CLI as root it works exactly as expected, but it generates nothing when executed through the scripted input.
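For reference, a minimal scripted-input stanza (the app path, interval, sourcetype, and index are hypothetical). Scripted inputs run as the user that owns the splunkd process and with a minimal environment, so absolute paths to binaries such as ausearch are worth verifying inside the script:

```
# inputs.conf -- hypothetical app path and settings
[script://$SPLUNK_HOME/etc/apps/my_audit_app/bin/audit_query.sh]
interval = 300
sourcetype = linux:audit:custom
index = main
disabled = 0
```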

Ingesting same Windows log with two different input stanzas

I am collecting Sysmon logs via a Splunk UF in XML format (renderXml=true). I need to forward some specific Sysmon events to QRadar without XML formatting, while continuing to send all Sysmon events in XML format to Splunk. I tried two different stanzas in inputs.conf to ingest the same log in two different ways, but it does not seem to work; it looks like Splunk merges the two together at runtime. The idea was to filter the non-XML events on the HF using props.conf, transforms.conf and _TCP_SYSLOG to send them to QRadar.

    [WinEventLog://Microsoft-Windows-Sysmon/Operational]
    disabled = 0
    renderXml = true
    index = sysmon

    [WinEventLog://Microsoft-Windows-Sysmon/Operational]
    disabled = 0
    renderXml = false
    index = sysmon
    whitelist = 1,22

Merge two rows based on common field value

I've a table like below and I want to merge rows that share the same COMMONID:

    JBID   JOBTYPE   START_TIME            END_TIME              COMMONID
                     2020-03-10T06:30:00   2020-03-10T08:30:00   abc
    6398   Medium                                                abc
    5649   Medium                                                def
                     2020-03-10T08:30:00   2020-03-10T10:30:00   def
    5649   Medium                                                ghi
                     2020-03-20T08:30:00   2020-03-20T10:30:00   ghi
                     2020-03-11T08:30:00   2020-03-11T10:30:00   jkl
    6383   Medium                                                jkl
    7070   Medium                                                mno
                     2020-03-10T08:30:00   2020-03-10T10:30:00   mno
    11690  Medium                                                pqr
                     2020-03-12T06:30:00   2020-03-12T08:30:00   pqr
                     2020-03-19T06:30:00   2020-03-19T08:30:00   stu
    6398   Medium                                                stu
    6398   Medium                                                vwx
                     2020-03-10T06:30:00   2020-03-10T08:30:00   vwx

The resulting table should look like this:

    JBID   JOBTYPE   START_TIME            END_TIME              COMMONID
    6398   Medium    2020-03-10T06:30:00   2020-03-10T08:30:00   abc
    5649   Medium    2020-03-10T08:30:00   2020-03-10T10:30:00   def
    5649   Medium    2020-03-20T08:30:00   2020-03-20T10:30:00   ghi
    6383   Medium    2020-03-11T08:30:00   2020-03-11T10:30:00   jkl
    7070   Medium    2020-03-10T08:30:00   2020-03-10T10:30:00   mno
    11690  Medium    2020-03-12T06:30:00   2020-03-12T08:30:00   pqr
    6398   Medium    2020-03-19T06:30:00   2020-03-19T08:30:00   stu
    6398   Medium    2020-03-10T06:30:00   2020-03-10T08:30:00   vwx

How do I achieve this?
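One way to merge rows like this (a sketch, assuming each COMMONID has at most one value per column across its rows, as in the sample) is to aggregate with stats:

```
... | stats values(JBID) as JBID, values(JOBTYPE) as JOBTYPE,
          values(START_TIME) as START_TIME, values(END_TIME) as END_TIME
      by COMMONID
| table JBID, JOBTYPE, START_TIME, END_TIME, COMMONID
```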

Adding a new menu item in Palo Alto Networks App

Hi, I want to create my own dashboard, a customised version of the ones in the app. When I try to use the built-in searches outside of the app they don't work, and when I use the clone option in the app there doesn't seem to be any way to add my new dashboard to the app's menu system, so that, for instance, under Operations there would be the normal dashboards plus my new one. I'm pretty new to this; am I missing something obvious? Thanks for any help.
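In general, an app's menu is defined by its navigation file (data/ui/nav/default.xml, also editable via Settings > User interface > Navigation menus). A sketch of adding a hypothetical dashboard to an existing collection (the view name is a placeholder):

```xml
<nav search_view="search">
  <collection label="Operations">
    <!-- existing app views appear here -->
    <view name="my_custom_dashboard" />
  </collection>
</nav>
```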

I'm looking for an app that pulls all Windows Defender logs from Azure to Splunk

I have found two apps (this one and this one), but the first one only pulls security alerts, and for the other one you need to deploy an app to the servers. The thing is, we also need the clients' info, and they don't have forwarders installed. Is there an app that pulls all Windows Defender logs from Azure?

Scheduled report searches no longer behave like ad-hoc searches

We use Splunk to report on daily smart-grid meter data, with 1 indexer, 1 search head and 1 heavy forwarder. We have observed that since the upgrade to 7.3.3 in December 2019, the results of scheduled searches no longer contain all the expected fields/field values. When running the same searches manually, we do see all the fields and field values. These searches are used to populate KV store lookup tables, which consequently do not get populated properly. The search ends with a collect into a KV store and returns around 3 million records. Although our search query and volumes are large (3M records), according to the search logs there are no errors and the searches complete successfully.

Splunk apps best practices - how to organize user search environment ?

Hi, in my organization we have a fully automated Splunk Enterprise environment, automated except for one thing: apps on the search head. I would like to ask for advice on how to organize the search head environment for multiple users. In our environment we have 50% advanced Splunk users (app developers) and 50% dashboard readers. At the moment everyone uses the Search app as the default one, and each user stores all their searches there. How do we back up each user's environment? In many Splunk YouTube videos I have seen people create a Splunk app and store everything there; the app is then located in $SPLUNK_HOME/etc/apps/user-appname and can easily be managed by a system administrator (tar, backup, restore). The problem is that only an admin can create apps. Is there any way to store users' search environments in containers/folders that an administrator can easily manipulate? I would like to store Splunk apps in a git repo and user search environments somewhere in backup, so that when I do a search head deployment, my script installs the list of current apps and restores the last backup of user searches. What is the best way to manage this? I would appreciate any support. Thanks in advance.

How do we map the same CIM field from different data models?

How do we map the same CIM field from different data models? For example, from the same sourcetype: field1 maps to the Inventory model's 'dest' field, and field2 maps to the Alert model's 'dest' field.

Alert not getting triggered with cron schedule

Hello all, I have configured an alert with `earliest=-24h` and `head 3000`, and I can see from the search that lots of results are populating, but no alerts are getting generated. The alert threshold is "greater than 2" and 77 results are returned. I have integrated the alert with Splunk. At first I thought the integration might be broken, but I am verifying from Activity -> Triggered Alerts and I do not see anything there: https://share.getcloudapp.com/kpuYKLmd I am not sure if this is due to the cron schedule or other settings, so here they are: https://share.getcloudapp.com/o0uD6gyX

