Channel: Questions in topic: "splunk-enterprise"

SplunkJS in my own webapp: cannot load "splunkjs/ready!" (module with "!")

Dear all, I am following the tutorial available here: http://dev.splunk.com/view/SP-CAAAEW5 I just copied/pasted the HTML and placed the whole thing under my local Apache server (http://127.0.0.1). When loading the page in my browser (Firefox or Chrome), I get the following error:

> Error: Load timeout for modules: splunkjs/ready!_unnormalized2,splunkjs/ready!
> http://requirejs.org/docs/errors.html#timeout
> config.js line 138 > eval:166:17
> makeError http://127.0.0.1/static/splunkjs/config.js line 138 > eval:166:17
> checkLoaded http://127.0.0.1/static/splunkjs/config.js line 138 > eval:692:23
> checkLoaded/checkLoadedTimeoutId< http://127.0.0.1/static/splunkjs/config.js line 138 > eval:713:25

It works perfectly for the other modules, which don't have the exclamation mark "!":

    var deps = [
        "splunkjs/ready!",
        "splunkjs/mvc/searchmanager",
        "splunkjs/mvc/chartview",
        "splunkjs/mvc/eventsviewerview"
    ];

When I remove the exclamation mark, the module loads properly. It looks like the exclamation mark points to an AMD loader plugin, but I don't understand why this tutorial would mix loading styles, so I am wondering whether this is a tutorial error. Can anyone explain this error? Thank you.
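For context on the "!": in RequireJS, `module!resource` is loader-plugin syntax, so `splunkjs/ready!` is not a malformed module name; it asks the `splunkjs/ready` plugin to delay the callback until the SplunkJS environment (including its connection to splunkd) has bootstrapped. A load timeout there usually means the bootstrap never completes rather than a tutorial error. A minimal sketch of the two knobs to check while debugging, assuming a local setup; the host, port, and credentials are placeholders, not values from the tutorial:

    // Sketch only: tell SplunkJS where to find splunkd;
    // "splunkjs/ready!" cannot resolve until this connection works.
    splunkjs.config({
        proxyPath: "/proxy",
        scheme: "https",
        host: "localhost",
        port: 8089,
        authenticate: { username: "admin", password: "changeme" }
    });

    // Disable RequireJS's load timeout while debugging, so a slow
    // bootstrap surfaces as a hang instead of a misleading timeout error.
    require.config({
        waitSeconds: 0
    });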

Line Chart over _time by fieldname

Hi, I am trying the query below to plot a line chart:

    index=abc
    | eval Time=round(endtime-starttime)
    | chart values(Time) as Time over _time by Type

There can be multiple Type values. My problem is that some _time buckets end up with a multivalue Time field, which is then not plotted on the graph, and I cannot hard-code `mvexpand` on a Type column because the Type values are not fixed. My output looks like this:

    _time               Build
    29/01/2019 12:01    2 3 4
    29/01/2019 12:12    5

From the above, only the value 5 is plotted; the others are not shown because the cell is multivalue, and I cannot apply `mvexpand Build` here because the column name can change.
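A minimal sketch of one way around the multivalue cells, assuming a single aggregated value per _time/Type pair is acceptable (index and field names are the asker's): swap `values(Time)`, which keeps every value, for a single-valued aggregation such as `avg` or `max`.

    index=abc
    | eval Time=round(endtime-starttime)
    | chart avg(Time) as Time over _time by Type

Since `chart ... over _time by Type` creates one column per Type value automatically, no `mvexpand` on a named column is needed.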

extract _raw to field

Team, when I search a particular sourcetype, source, and index, I want one interesting field, perhaps called msg, whose value is the whole _raw message. How can I achieve this via configuration? Please help.
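A minimal sketch of one configuration route, assuming a search-time calculated field is acceptable; the sourcetype name is a placeholder:

    # props.conf on the search head: sketch; copies the raw event text
    # into a field named "msg" at search time
    [your_sourcetype]
    EVAL-msg = _raw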

how to search multiple strings

Hi Team, I have a list of 200 filenames (strings) that need to be searched in Splunk. Each filename is unique. For example, if I have filenames like 1.txt, 2.txt, 3.txt ... 200.txt, I am trying it like below:

    (1548225008333.4546.-1092053882.Oxalis_jhsediapp02.netsentral.no.doc.xml OR 1126864-1548236892-8712_ehfd.jcloud.no.doc.xml) |
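A sketch of the usual pattern for long OR lists, assuming the 200 names can be loaded into a CSV lookup (the lookup and column names here are placeholders): a subsearch whose output field is renamed to `query` expands into bare search terms, so the OR list never has to be typed out.

    index=your_index
        [| inputlookup filenames.csv
         | fields filename
         | rename filename AS query ]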

cron schedule for last day of every month

Hello all, I want to run a report on the last day of the month and send an email. Can someone help me with the cron value for it?
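Standard cron has no "last day of the month" token, so one common workaround is sketched below: schedule the search for days 28-31 and let the search itself check whether tomorrow is the 1st (the 23:55 run time is an assumption).

    # savedsearches.conf: sketch; fires at 23:55 on days 28-31
    cron_schedule = 55 23 28-31 * *

Then gate the search so it only returns results (and therefore triggers the email) on the actual last day:

    ... | where strftime(relative_time(now(), "+1d@d"), "%d") = "01"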

Regex field extractions in search results are blank

Hi all, I'm trying to create a search that includes some regex. Ultimately, I'm trying to parse some information (filename and file hash) out of the raw event and show that information in separate fields on a table. The other fields not mentioned are already parsed out by default; I just need the filename and file hash information to be parsed out as well. When I perform the search below, nothing shows up in the two new fields I created (fname, fileHash). Could someone help me with my search?

**Search Command:**

    index=antivirus CLF_ReasonCode="virus log" VLF_SecondActionResult="File passed"
    | rex field=_raw "fileHash= <(?<fileHash>.*)> fname= <(?<fname>.*)>"
    | table _time cef_name VLF_SecondActionResult fname fileHash

**Raw Data:**

    Dec 24 11:39:47 test.test.com Dec 24 2018 11:39:47 testy-test001.test.test.com CEF:0|Test Test|Control Manager|0.0SP3|AV:File quarantined|Trojan.W77M.POWLOAD.SMNM2|3|deviceExternalId=000 rt=Dec 24 2018 13:51:23 GMT+00:00 cntLabel=AggregatedCount cnt=1 dhost=TEST000 act=File quarantined cn1Label=VLF_PatternNumber cn1=0000000 cn2Label=VLF_SecondAction cn2=1 cs1Label=VLF_FunctionCode cs1=Real-time Scan cs2Label=VLF_EngineVersion cs2=0.000.0000 cs3Label=CLF_ProductVersion cs3=0.0 cs4Label=CLF_ReasonCode cs4=virus log cs5Label=VLF_FirstActionResult cs5=File quarantined cs6Label=VLF_SecondActionResult cs6=N/A cat=000 dvchost=TEST-TEST cn3Label=CLF_ServerityCode cn3=2 fname=test.doc filePath=C:\\Users\\u000000\\Downloads\\ dst=255.255.2.255 fileHash=dddd0c5df90e20af01f7ad1e73ea17777d87777b deviceFacility=ExecScan

I used these guides as references: https://docs.splunk.com/Documentation/Splunk/7.2.3/SearchReference/Rex http://blog.hortonew.com/how-to-use-regex-rex-in-splunk
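One observation for reference: in the raw event shown, `fname=` appears before `fileHash=`, and neither value is wrapped in angle brackets, so the pattern above can never match. A sketch of a rex that matches the sample as posted (field names per the question):

    | rex field=_raw "fname=(?<fname>\S+).*?fileHash=(?<fileHash>\w+)"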

Does Splunk support AppDynamics add-on integration if it is behind SAML?

Hi, to capture AppDynamics metrics in Splunk, we are trying to integrate the AppDynamics add-on. We have successfully integrated it and started collecting data from AppDynamics in those environments where the authentication method is LDAP. We are getting a 401 error for the environment where SAML is the authentication method. Below is the error message:

    ERROR ExecProcessor - message from "python /SPLUNKHOME/etc/apps/Splunk_TA_AppDynamics/bin/appdynamics_summary.py" ERROR 401 Client Error: Unauthorized for url: https://IntentionallyMasked:8181/controller/rest/applications?output=JSON&time-range-type=BEFORE_NOW&duration-in-mins=5

Does Splunk support AppDynamics integration in a SAML environment? NOTE: the user ID and password are the same ones used to log in to AppDynamics in those SAML environments. Regards, Saurabh

HEC large field value not extracted but is in _raw

We have a field in our HEC input that is larger than 10,000 characters. When searching the data that came in via HEC, the field has not been extracted. It is in _raw and I can pull it out of there, but I would really like the field to be extracted. props.conf has:

    TRUNCATE = 0

I can manually input the same data via a text file and the large field (a blob of JSON text) is extracted and available fine; just not when it comes in via HEC. See screenshots.
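One limit worth checking, offered as an assumption rather than a confirmed diagnosis: automatic search-time key/value extraction only scans a fixed number of characters per event, set in limits.conf and defaulting to 10,240, which lines up suspiciously well with a ~10,000-character field going missing.

    # limits.conf on the search head: sketch
    [kv]
    # default is 10240; raise it so very large fields are still auto-extracted
    maxchars = 40960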

percentage of field

I have two fields, body.response.successcount and body.response.failurecount. How do I write a query for the success count % and the failure count % individually?
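A minimal sketch, assuming the two counters should be summed across the search results and each percentage taken against their combined total:

    ... | stats sum(body.response.successcount) AS success, sum(body.response.failurecount) AS failure
    | eval success_pct = round(100 * success / (success + failure), 2)
    | eval failure_pct = round(100 * failure / (success + failure), 2)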

Search Head Cluster connected to Multiple Single Site Index Clusters

I have a Search Head Cluster consisting of 3 search heads. This Search Head Cluster is going to attach to 6 different single-site Index Clusters. Is it possible to restrict all searches from querying every Index Cluster? If I set "srchIndexesDefault" to none and set "srchIndexesAllowed" to the indexes that can be searched, and those indexes don't exist on some of the Index Clusters, will the indexers from those sites still be searched? I am trying to maintain performance on the Index Clusters and not have every cluster hit with every search.
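For reference, a sketch of the two role settings mentioned, as they would appear in authorize.conf (the role and index names are placeholders). Note that these control which indexes a role may search, which is a different question from whether the search head fans the search out to every configured peer in the first place.

    # authorize.conf: sketch
    [role_app_team]
    # no indexes searched by default
    srchIndexesDefault =
    # only these indexes may be searched explicitly
    srchIndexesAllowed = index_a;index_b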

Splunk Universal Forwarder Duplicate Logs (Windows)

Hello, I am currently trying to configure Splunk Universal Forwarders on Windows workstations. The Universal Forwarder is configured to send security logs directly to our indexer. I have the Windows Add-on installed on the Universal Forwarder, and my inputs.conf file is in the \local\ directory. It is forwarding logs to the indexer as (mostly) intended. The issue I am currently experiencing is that when the Splunk service restarts on a workstation, it begins forwarding event logs to the indexer that have already been indexed. I'm semi-familiar with what the fishbucket is supposed to do, but it doesn't seem like the checkpoint is keeping track of the events that have already been indexed. Here are the relevant parts of my inputs.conf:

    [WinEventLog://Security]
    index = winsec
    checkpointInterval = 5
    disabled = 0
    start_from = newest

Would greatly appreciate any help you may provide. Thank you!
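One thing to try, stated as an assumption rather than a confirmed fix: `start_from = newest` changes how the input decides where to begin reading after a restart, and reverting to the default direction while relying on the checkpoint is a reasonable first step when duplicates appear. A sketch:

    [WinEventLog://Security]
    index = winsec
    checkpointInterval = 5
    disabled = 0
    # default direction; resume from the saved checkpoint rather than re-deciding
    start_from = oldest
    # set current_only = 1 instead to collect only events that arrive while running
    current_only = 0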

Migrate searches, dashboards, etc. from a standalone search head to a new Search Head Cluster

Can I move the /splunk/etc/apps/search/local folder to the deployer's shcluster/apps/search/local folder (and then push the package)? Reading the documentation, it seems this would be a bad idea, but I want to move the savedsearches.conf, macros, etc. Or would it be a better idea to copy these folders to the search heads individually and restart Splunk? This is from the docs:

> **Caution:** Do not use the deployer to push default apps, such as the search app, to the cluster members. In addition, make sure that no app in the configuration bundle has the same name as a default app. Otherwise, it will overwrite that app on the cluster members. For example, if you create an app called "search" in the configuration bundle, it will overwrite the default search app when you push it to the cluster members.
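A sketch of one way to honor that caution, assuming the standalone content is staged under a new app name so the default search app is never overwritten (all paths, the app name, and the target URI are placeholders):

    # On the deployer: copy the standalone content into a differently named app
    mkdir -p $SPLUNK_HOME/etc/shcluster/apps/migrated_search/local
    cp -r /path/to/old_sh/etc/apps/search/local/. $SPLUNK_HOME/etc/shcluster/apps/migrated_search/local/

    # Push the configuration bundle via any one cluster member
    splunk apply shcluster-bundle -target https://sh1.example.com:8089 -auth admin:changeme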

Is it possible to filter the IIS logs so that status 200 events are not forwarded?

I configured the Advanced Logging log files on a server to be forwarded to Splunk. This is the structure of the log files:

    #Fields: sitename date-UTC time-UTC date-local time-local Method cs-uri-stem cs-uri-query Username c-ip Status Substatus TimeTakenMS

But I want only the events whose status differs from 200 (Status <> 200) to be forwarded. Does anyone know how I can do this? Regards
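A sketch of the usual mechanism: route status-200 events to the nullQueue at parse time, which has to happen on an indexer or heavy forwarder because a universal forwarder does not parse events. The sourcetype name is a placeholder, and the regex assumes Status is the 11th space-delimited field, per the header above.

    # props.conf: sketch
    [iis:advanced]
    TRANSFORMS-drop200 = drop_status_200

    # transforms.conf: sketch; skip the first 10 fields, then match Status 200
    [drop_status_200]
    REGEX = ^(?:\S+\s+){10}200\s
    DEST_KEY = queue
    FORMAT = nullQueue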

json split out nested key/values to separate events

Hello, I have a giant JSON blob that has some similar key names for nested events with different values. I'd like one item per row/column, but I'm not sure how to achieve this. Can anyone help?

_raw:

    {"startTime": "2019-01-29T08:30:31", "finishTime": "2019-01-29T08:30:31", "elapsedTime": 0.284014, "server": "REDACTED", "worker": "BF4B6-CEA8C", "results": [{"flavors": {"mime": ["application/zip"], "yara": ["zip_file"]}, "entropyMetadata": {"entropy": 7.557669739231512}, "hashMetadata": {"md5": "1c1ce116840df2cb3d3d42a685ecedea", "sha1": "0da6c24a4e1bb556b89d20af6362298c68647bf9", "sha256": "06744aa51a21da65ad110297da4e917d0d0fc689aa0d7b4ee090faa6a716da8a", "ssdeep": "96:Hg4NX829EYfFlUfDw1G1WV2w86FnxW204sGCfEHe1X8dI:7sRolULhw8hs7Dir"}, "headerMetadata": {"header": "PK\u0003\u0004\u0014\u0000\b\u0000\b\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0018\u0000\u0000\u0000xl/worksheets/Sheet1"}, "selfMetadata": {"filename": "file.11312", "depth": 0, "uid": "3d2950df-8303-42ba-91f3-8b5e022465c6", "rootUid": "3d2950df-8303-42ba-91f3-8b5e022465c6", "hash": "06744aa51a21da65ad110297da4e917d0d0fc689aa0d7b4ee090faa6a716da8a", "rootHash": "06744aa51a21da65ad110297da4e917d0d0fc689aa0d7b4ee090faa6a716da8a", "source": "REDACTED", "scannerList": ["ScanEntropy", "ScanHash", "ScanHeader", "ScanSelf", "ScanYara", "ScanZip"], "size": 5044}, "yaraMetadata": {"matches": ["zip_file"]}, "zipMetadata": {"total": {"files": 9, "extracted": 9}}}, {"flavors": {"mime": ["application/xml"], "yara": ["xml_file"]}, "entropyMetadata": {"entropy": 4.6681910854877975}, "hashMetadata": {"md5": "3acc18198f587033e8e86523b4fd6fc2", "sha1": "9d6efa2cbf0e79b76a2cd4d18c27833169218292", "sha256": "8119d45d66c68bf0de35667726235f7703592c5bf7fb45c6a0d7c50de072a5d4", "ssdeep": "96:zXmmy1FUo5YVDGWqxWKieWwvUWx8vWiHaWs1WS+vWdcWofEGWYiWd1W134WgLWFt:zXmmyko5R4Rc"}, "headerMetadata": {"header": "
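A minimal sketch of the usual pattern for this shape of event, assuming each element of the `results` array should become its own row (field names follow the sample):

    ... | spath path=results{} output=result
    | mvexpand result
    | spath input=result
    | table hashMetadata.md5 hashMetadata.sha256 selfMetadata.filename selfMetadata.size

`spath path=results{} output=result` copies each array element into a multivalue field, `mvexpand` splits them into one event apiece, and the second `spath` re-extracts each element's keys as fields.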

Need the correct regular expression for my rex command

Here is my event's raw data: {"line":"level=info t=\"2019-01-29T18:19:42.999Z\" rt=2 method=GET path=\"/contentskus/5b7ee52a4f6b9c001b049ac3?dma=999\u0026itemsPerPage=25\u0026page=1\" sc=200 dma=999 apikey=DEFAULT amzn_trace_id=unknown enabledFeatures=recommendations,upcomingSearch,popularityQueriesPlatformSpecific,availabilityTimes,avoidDefaultQuery,useFavoritesExternalSchemaForD2C,useFavoritesV2ForFavoritesFilter,endCardRecommendations,cmsAuthFallback os=2 rid=\"6962240ed296c770\" mode=published","source":"stdout","tag":"ecs-uat_**admin_v1sandbox_blue**-35-uat-service-admin-d8d0a7a0c2dfe8ea6c00/503343765eaa","attrs":{"SERVICE_NAME":"admin","SERVICE_TAGS":"contentgroups,contentsettings,contentskus,metatags,settings,userroles,users,s3signurl","SERVICE_VERSION":"v1sandbox","com.amazonaws.ecs.task-arn":"arn:aws:ecs:us-west-2:776609208984:task/305f6e5a-d20d-4aa2-877d-1bba2d442a7b"}}

I'm trying to create a new field called service_name by extracting the highlighted portion of the above event. The regular expression that I wrote is:

    \W\W\W[e][c][s]\S(?<service_name>\w{1,})

Achieved result: uat_admin_v1sandbox_blue Expected result: admin_v1sandbox_blue Please help!
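A sketch of one adjustment, assuming the environment prefix (here `uat`) never itself contains an underscore: skip everything up to the first underscore after `ecs-`, then start the capture, which stops naturally at the next hyphen because `\w` does not match `-`. The `tag` field name assumes the JSON keys are auto-extracted; run it against _raw otherwise.

    | rex field=tag "ecs-[^_]+_(?<service_name>\w+)"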

How to get full listing of indexes and GB ingest

I've been using the following search to get a count of GB ingested daily (24 hrs) and over 30 days, but I'm only getting the top 10 indexes. How can I get the others beyond the top 10?

    index=_internal source=*license_usage.log type=Usage idx=*
    | eval GB = b/1024/1024/1024
    | timechart span=1d useother=0 sum(GB) by idx
    | rename idx as Index, sum(GB) as Gigabyte
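For what it's worth, `timechart` only keeps its top 10 split-by series unless told otherwise, so the likely fix is simply lifting that limit with `limit=0`:

    index=_internal source=*license_usage.log type=Usage idx=*
    | eval GB = b/1024/1024/1024
    | timechart span=1d limit=0 useother=0 sum(GB) by idx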

Which is the best way to get a count of events indexed for the entire environment?

1. `| eventcount summarize=false | stats sum(count)`

2. `index=_internal source=*metrics.log group=thruput | stats sum(ev)` (see https://docs.splunk.com/Documentation/Splunk/7.2.3/Troubleshooting/Aboutmetricslog)

My use case is currently for the last 30 days, and I am getting *very* different results. The first search appears to be identical to the dashboard that shows up in the "What to Search" section on the Search & Reporting home page. I want to believe the metrics.log approach to be more accurate, given that I can't apply a time modifier to the first search command, `| eventcount`.
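A third option worth sketching, since it does honor a time range and counts indexed events directly from index metadata (the 30-day window is written inline to match the use case; extend the where clause with `OR index=_*` to include internal indexes):

    | tstats count where index=* earliest=-30d@d latest=now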
