Channel: Questions in topic: "splunk-enterprise"

Can you help me with an XML field extraction?

I have a log file that outputs different formats depending on the portion of the application doing the logging. Some of the events look like the XML sample data shown here, and I'd like to find some way to extract the key-value pairs out of them. If a transforms/props configuration can be put in place that recognizes events like this one and extracts the fields I need, without interfering with the other single-line machine data and JSON entries in the log, that would be nice. If multiple stanzas can be entered to account for both XML and JSON, even better. Honestly, I'd be happy with an inline solution using |extract, |xmlkv, |xpath, or something like that. Otherwise, I will be forced to write some pretty nasty |rex statements for each field. Thanks!

Looking for:
Clientid 11111111
MemberFirstName Jane
MemberLastName Doe
Gender FEMALE
DOB 11/11/1911
EmployeeIDNum xxxxx
MentorFirstName
MentorLastName

Event samples:

2018-09-25 12:48:23,599 [tp-bio-8001-exec-151] [ STANDARD] [ ] [ PHSInt:01.01] (og.Domain_FW_Apollo_Int_.Action) INFO hostname01.domain.com|10.200.200.200|HTTP|AssessmentServices|Services|SaveAssessmentAnswers|AD0A0F376B08E09090B78F37816A41733 - INSERTING INTO SERVICE REQUEST LOG:--SERVICEREQUESTTYPE -->:SaveAssessmentAnswers--SERVICEREQUESTSTATUS -->:--TRANSACTIONID-->:3740e6fc-99xx-43f2-ba47-4630da0aaeda--MEMBERELIGID-->:--PID-->:--PARTICIPANTID-->:--DEBUGMESSAGE-->:[hostname01.domain.com] --REQUEST-->:3740e6fc-59ee-43f2-ba47-4630da0aaeda11111111931ImpersonatorDetailMEMBERMemberFirstNameJaneMemberLastNameDoeGenderFEMALEDOB01/01/1911EmployeeIDNum35121212121212--RESPONSE-->:3740e6fc-59ee-43f2-ba47-4630da0aaedaMessageWe’re sorry, we’re not able to verify your account information. Please contact your benefits administrator. --REFERENCEID-->:

2018-09-25 12:47:21,248 [tp-bio-8001-exec-177] [ STANDARD] [ ] [ PHSInt:01.01] (og.Alere_FW_Apollo_Int_.Action) INFO hostname.domain.com|10.214.6.60|HTTP|AssessmentServices|Services|SaveAssessmentAnswers|A6E53D8C7F19456C1484D3F2307AB5FDB - INSERTING INTO SERVICE REQUEST LOG:--SERVICEREQUESTTYPE -->:SaveAssessmentAnswers--SERVICEREQUESTSTATUS -->:--TRANSACTIONID-->:a8667bd9-2be5-4655-8d9a-dd47e8111ce4--MEMBERELIGID-->:--PID-->:--PARTICIPANTID-->:--DEBUGMESSAGE-->:[hostname.domain.com] --REQUEST-->:axx67bd9-2be5-4655-8d9a-dd47e8111ce411121212931ImpersonatorDetailPARENTMentorFirstNameJaneMentorLastNameDoeMemberFirstNameAidenMemberLastNameDoeGenderMALEDOB01/1/2001EmployeeIDNum351111111111--RESPONSE-->:axx67bd9-2be5-4655-8d9a-dd47e8111ce4MessageWe’re sorry, we’re not able to verify your account information. Please contact your benefits administrator. --REFERENCEID-->:

Other samples in the log that are not XML:

2018-09-25 13:17:46,541 [tp-bio-8004-exec-171] [ STANDARD] [ ] [ PHSInt:01.01] (lo_Data_System_BatchLog.Action) INFO hostname06.domain.com|10.200.200.200|HTTP|HealthIndicatorsInt|Services|saveHealthData|A30AC19E66FD562E79942068C75D03XXF - In UpdateBatchLog:ID=20001,Type=ProcessEvent,Action=P-212799085,Status=INFO,Message=Processing of EventNew HD,Exception=

JSON, I think:
2018-09-25 13:17:45,929 [ PegaRULES-Batch-18] [ STANDARD] [ ] [ ApolloCCBatch:01.01] (on.Domain_FW_Apollo_Int_.Action) INFO - INSERTING INTO SERVICE REQUEST LOG:--SERVICEREQUESTTYPE -->:MPEAPI--SERVICEREQUESTSTATUS -->:200--TRANSACTIONID-->:DOE--MEMBERELIGID-->:99999999--PID-->:999999999--PARTICIPANTID-->:JOHN--DEBUGMESSAGE-->:[hostname04.domain.com] OK [Time Elapsed=697.0ms]--REQUEST-->:{ "MemberProductEligibilityRequest":{ "requestHeader":{ "applicationName":"APPLICATION", "transactionId":"bc99999b547b64cf99a01cabd625e0bc7" }, "consumerDetails":{ "firstName":"JOHN", "lastName":"DOE", "dateOfBirth":"1900-05-09T00:00:00Z", "searchId":"999999999", "contractNumber":"999999" }, "filteringAttributes":{ "includeExtendedAttributes":"true", "applyFilters":"true" }, "requestDetails":{ "requestType":"BIG5", "searchType":"ALL" } }}--RESPONSE-->:{"MemberProductEligibilityResponse":{"responseHeader":{"transactionId":"bc2706b547b64cf99a01cabd625e0bc7"},"consumerDetails":[{"demographics":{**** Section suppressed for logging ****},"contactDetails":{**** Section suppressed for logging ****},"idSet":{**** Section suppressed for logging ****},"populationDetails":{"populationEffectiveDate":"2018-01-01T00:00:00Z","populationCancelDate":"9999-12-31T00:00:00Z","populationId":"POP33477","populationDateAssigned":"2017-12-12T00:00:00Z","populationBrandingType":"Optum Logo","populationBrandingEffectiveDate":"2018-01-01T00:00:00Z"},"coverageDetails":{"recordType":"HEALTH_COVERAGE","employeeStatus":"A","contractNumber":"0999999","eligibilitySourceSystem":"CS","planVariation":"0106","reportingCode":"0106","customerName":"TESLA","coverageType":"M","coverageEffectiveDate":"2018-01-01T00:00:00Z","hireDate":"2001-01-04T00:00:00Z","stateOfIssue":"CA","legalEntity1":"20020","marketSite":"0004422"},"extendedAttributes":{"ecExtended":[],"elExtended":[],"euExtended":[{"typeCode":"EU3","value":"0004422","effectiveDate":"2001-01-01T00:00:00Z","cancelDate":"9999-12-31T00:00:00Z"},{"typeCode":"EU3","value":"0004422","effectiveDate":"2001-01-01T00:00:00Z","cancelDate":"9999-12-31T00:00:00Z"}],"cuExtended":[],"suExtended":[],"muExtended":[]},"productDetails":{"product":[{"source":"Optum","productEvent1":"Productname for Life","productEffectiveDate":"2018-01-01T00:00:00Z","productTerminationDate":"2199-12-31T00:00:00Z"}]}}]}}--REFERENCEID-->:999999
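A minimal inline sketch of one approach, assuming the --REQUEST--> payload is well-formed XML inside the raw event (the index, sourcetype, and extracted field name request_xml below are placeholders, not anything from the original post):

index=your_index sourcetype=your_sourcetype "INSERTING INTO SERVICE REQUEST LOG"
| rex "--REQUEST-->:(?<request_xml>.+?)--RESPONSE-->:"
| spath input=request_xml
| table Clientid MemberFirstName MemberLastName Gender DOB EmployeeIDNum MentorFirstName MentorLastName

Since spath handles both XML and JSON, the same pattern (isolate the payload with rex, then spath input=...) could cover the JSON-style events as well without touching the single-line machine data.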

How to create a pie chart with specific field values?

Hi, here is a sample lookup file (sample.csv):

Animal  Color
horse   black
horse   Grey
horse   Orange
horse   Grey
horse   Grey

How can I display ONLY the colors black vs. grey in a pie chart, with their respective percentages? My code below only shows a single-colored pie chart and the counts.

| inputlookup sample.csv | chart count(eval(color = "black")) as Noir count(eval(color = "grey")) as Gris
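One possible fix, assuming the lookup column is named Color (note that eval comparisons are case-sensitive and the sample mixes "black" and "Grey"): give the pie chart a single count split by the category field rather than two separate count columns. A sketch:

| inputlookup sample.csv
| eval Color=lower(Color)
| search Color="black" OR Color="grey"
| stats count by Color

With one row per color, the pie chart visualization renders each slice with its percentage automatically.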

Trouble signing in to the Splunk web UI

I don't have LDAP set up and am using Splunk authentication. When I try to log in to the web UI, I sometimes receive the error "First time signing in?", and when I try with the admin password it gives me the same error. Has anyone encountered this issue? Thanks for any help!
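If the admin credentials really are lost, one hedged option on Splunk 7.1 or later is to reset them with a user-seed.conf: move $SPLUNK_HOME/etc/passwd aside, create the file below, then restart Splunk (the password value is a placeholder):

# $SPLUNK_HOME/etc/system/local/user-seed.conf
[user_info]
USERNAME = admin
PASSWORD = your-new-password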

Output field name,value from stats table

Hello, I'm trying to get a very specific output format that can be fed into our ticketing system. I have the following table in Splunk (the top line is the field names):

sender             recipient          subject
lolwut@domain.com  bob@company.com    example1
lolwut@domain.com  alice@company.com  example2

This can either be a table or a set of stats values() multivalue fields. I need the final table to output to a CSV like this:

sender     lolwut@domain.com
sender     lolwut@domain.com
recipient  bob@company.com
recipient  alice@company.com
subject    example1
subject    example2
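One sketch that turns such a table into name/value pairs: add a row key with streamstats, then unpivot with untable (the leading "..." stands for whatever base search produces the table, and field names assume the table shown above):

... | table sender recipient subject
| streamstats count as row
| untable row field value
| fields field value

Appending something like | sort 0 field (if the ticketing system needs the pairs grouped by field name) and | outputcsv then produces the two-column CSV.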

Join 3 sourcetypes using OR

It would be great if anyone could help me join data from 3 sourcetypes:

1) sourcetype_1 has fields AA, MM, CC, Amt1
2) sourcetype_2 has fields AA, mm, ss
3) sourcetype_3 has fields SS, cc, Amt2

Step 1 - Join the first and second sourcetypes where sourcetype_1.AA == sourcetype_2.AA && sourcetype_1.MM == sourcetype_2.mm
Step 2 - Join the result of step 1 and the third sourcetype where sourcetype_3.cc == step1.CC and sourcetype_3.SS == step1.ss

In my final result, I want to group by OO, CC and SS and show all the CC and SS where Amt2 is less than Amt. Note - there is a many-to-many relationship between some of the required fields. Thanks in advance!
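A sketch with two joins, assuming "OO" means AA and "Amt" means Amt1 (the index name is a placeholder); with many-to-many keys a single search plus stats over coalesced fields is usually safer than join, but this mirrors the steps as described:

index=your_index sourcetype=sourcetype_1
| join type=inner AA MM [ search index=your_index sourcetype=sourcetype_2 | rename mm as MM | fields AA MM ss ]
| rename ss as SS
| join type=inner CC SS [ search index=your_index sourcetype=sourcetype_3 | rename cc as CC | fields CC SS Amt2 ]
| where Amt2 < Amt1
| stats values(Amt1) as Amt1 values(Amt2) as Amt2 by AA CC SS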

How do I see used vs available license usage?

I have the below search against a particular heavy index, which lists its daily volume consumed.

index=_internal source=*license_usage.log type=Usage | eval totalMB = b/1024/1024 | eval totalGB = totalMB/1024 | rename idx as index | search index="xxx-xxx" | timechart span=1d sum(totalGB) by index

_time    xxx-xxx
------------------------
date     200GB

I want the above query tweaked so that it compares the daily usage against 1000GB and lists the output like below:

_time    xxx-xxx    Total     %Used
---------------------------------------
date     200GB      1000GB    20%

Thanks, Laks
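A sketch of one way to tweak it, assuming a fixed 1000 GB daily allowance for that index (the index name stays a placeholder):

index=_internal source=*license_usage.log type=Usage idx="xxx-xxx"
| eval totalGB = b/1024/1024/1024
| timechart span=1d sum(totalGB) as "xxx-xxx"
| eval Total = 1000
| eval "%Used" = round('xxx-xxx' / Total * 100, 2)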

displaying columns with an inner join

I need to run a report that gives me phone numbers that appeared >= 2 times within the same minute, with the corresponding CallID, and then I need to do an inner join to include another field called CV7. I was able to accomplish that with this query:

index=ABC sourcetype=ABC_MainReportLog "Entered Phone Number" Phone!=1234567890 | dedup CallID | table CallID Phone _time | join type=inner CallID [ search index=ABC sourcetype=ABC_core_MainReportLog "\|RemoteApplicationData\|" CV7=* | dedup CallID] | fields Phone, State, CallID | bucket _time span=1m | stats count(CallID) as Count by _time Phone State | where Count >=2

The query above gives me _time, Phone, State and Count, but I also need to display the CallID. How do I do that? Thanks in advance!
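One sketch: keep the rest of the query the same and add a values() aggregation so the CallIDs that make up each count are carried through to the output:

... | bucket _time span=1m
| stats count(CallID) as Count values(CallID) as CallID by _time Phone State
| where Count >= 2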

Conditional Field Alias (Not in search bar)

I would like to create a field alias like A=B, but only if C="some value" (A is the new field, while B and C are existing fields). I went to Settings and looked under "Field Aliases" for a condition option, but I could not find one. Does anyone know how to do this? Thanks
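Field aliases themselves are unconditional; a calculated field in props.conf can do the same job conditionally. A minimal sketch, with the sourcetype name and the test value as placeholders:

# props.conf
[your_sourcetype]
EVAL-A = if(C="some value", B, null())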

How to apply $ symbol for Y-axis values in Splunk Chart

We are trying to add a $ symbol to the Y-axis values of a Splunk column chart in a dashboard visualization. How can we do that?
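As far as I know there is no built-in option to prefix the Y-axis tick values themselves with a currency symbol in Simple XML charts; a common workaround is to put the symbol in the axis title instead. A sketch (the query and title text are placeholders):

<chart>
  <search>
    <query>... your search ...</query>
  </search>
  <option name="charting.axisTitleY.text">Amount ($)</option>
</chart>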

Issues running a query after upgrading from Splunk 6.6.4 to 7.1.1

I have the below query, which was working perfectly fine in Splunk 6.6.4. After upgrading to the new Splunk version 7.1.1, the query is showing "No results found". This is a multi-select dropdown.

| set intersect [| pivot ticket_feed_dm ticket_feed_obj SPLITROW current_ticket_state | eval current_ticket_state = trim(current_ticket_state) | table current_ticket_state] [| inputlookup ticketState_lookup | search Category = $res_un$ | rename Sub-Category as sub_category | eval sub_category = trim(sub_category) | table sub_category]

Any leads would be helpful.
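One way to narrow this down is to run each leg of the set intersect on its own after the upgrade and see which one stopped returning rows, substituting a literal value for the $res_un$ token (the literal below is just an example):

| pivot ticket_feed_dm ticket_feed_obj SPLITROW current_ticket_state
| eval current_ticket_state = trim(current_ticket_state)
| table current_ticket_state

| inputlookup ticketState_lookup
| search Category = "some-category"
| rename Sub-Category as sub_category
| eval sub_category = trim(sub_category)
| table sub_category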

Updating GeoLite2-City-Latest.mmdb on deployment server

I have a deployment server configured, and on it I have changed the GeoLite2-City.mmdb, but now I'm having trouble pushing it out to the whole infrastructure. I have tried the command 'splunk reload deploy-server', but it seems it is not syncing. How can I force the different Splunk instances to fetch the new database and ditch the old one? Thank you!
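One hedged approach: ship the .mmdb inside a deployment app and point iplocation at it via limits.conf, rather than overwriting the bundled copy (the app name and paths below are assumptions):

# $SPLUNK_HOME/etc/deployment-apps/geoip_db/default/limits.conf
[iplocation]
db_path = /opt/splunk/etc/apps/geoip_db/data/GeoLite2-City.mmdb

Put the database file in the app (e.g. under data/), map the app to the right server class, then run splunk reload deploy-server -class <your_serverclass>; the clients pick up the changed app on their next phone-home.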

Changing the URL of a dashboard

I originally had a dashboard named "System". I cloned it and named the new dashboard "System." (note the period at the end). Now, on the Dashboard page, both dashboards have the same URL "10.1.1.1/en-US/app/search/system". Clicking on either one brings me to the cloned dashboard. Is there a way to get the original dashboard back?
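Dashboards are stored on disk as XML files whose file name is the URL id, so one way to see what happened is to list the view directories and rename or remove the clone (the paths below assume the search app and a private dashboard owned by admin, which are assumptions):

ls $SPLUNK_HOME/etc/apps/search/local/data/ui/views/
ls $SPLUNK_HOME/etc/users/admin/search/local/data/ui/views/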

Have you ever made a Splunk-Slack integration?

I'm currently indexing events from a Slack team; I am indexing data from different channels, but not all channels. It looks as if the channels I want to index are treated as private on Slack, but they are not private. I am indexing from 375 channels, but not from the one that I want. I guess this is a Slack restriction... Can someone help this soul?

How can I get data into Splunk Enterprise?

Dear all, I'm a beginner. I just set up Splunk Enterprise. Could you please help me get data from Windows servers? I don't know how to deal with them. One more thing: I also tried to read the Splunk documentation, but I still cannot understand how to handle this. Kindly please help me. Thank you so much!
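The usual route is to install the Splunk Universal Forwarder on each Windows server and enable the Windows event log inputs. A minimal sketch of the two forwarder-side files (the output group name and indexer address are placeholders):

# inputs.conf on the Windows universal forwarder
[WinEventLog://Application]
disabled = 0
[WinEventLog://Security]
disabled = 0
[WinEventLog://System]
disabled = 0

# outputs.conf on the same forwarder
[tcpout]
defaultGroup = my_indexers
[tcpout:my_indexers]
server = your-indexer.example.com:9997

The indexer (or heavy forwarder) needs a matching receiving port enabled under Settings > Forwarding and receiving (9997 is the conventional port).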

How to sum or subtract values depending on 2 distinct fields and chart it?

Hi folks, I have a table in the following format:

Date      Buy(qty)  CurrencyBuy  Sell(qty)  CurrencySell
Jan/2017  500       ETH          0.2        BTC           (meaning I **bought** 500 ETH for 0.2 BTC)
Feb/2017  700       ETH          0.3        BTC           (meaning I **bought** 700 ETH for 0.3 BTC)
Mar/2017  0.2       BTC          400        ETH           (meaning I **sold** 400 ETH for 0.2 BTC)

What I'm looking to do is a graph that plots the amount of ETH I have throughout time:

Date      Amount
Jan/2017  500
Feb/2017  1200
Mar/2017  800

Thanks for your help folks!
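A sketch, assuming the rows are already in chronological order and the field names match the headers above (in eval, names containing parentheses need single quotes; the leading "..." stands for the base search):

... | eval delta = case(CurrencyBuy="ETH", 'Buy(qty)', CurrencySell="ETH", 0 - 'Sell(qty)', true(), 0)
| streamstats sum(delta) as Amount
| table Date Amount

The running sum from streamstats gives the cumulative ETH balance per row, which can then be charted over Date (or _time, if available).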

Way to launch a Splunk app from another Splunk TA/add-on after initial setup

Hello, I have two apps: one is a connector app/add-on and the other is the actual Splunk app. I want to create a dependency between the two, as my connector add-on will do the initial setup and fetch logs/create indexes, and then my actual Splunk app will display the logs. Is there any way to do that? Or is there any way to create a "launch app" button in my connector, so that once the setup has been done, the user can directly click the button on the connector app to launch the actual Splunk app?
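One simple option is a cross-link: a dashboard or panel in the connector app can point at the main app by its relative URL. A sketch in Simple XML (the app and view names are placeholders):

<dashboard>
  <label>Setup complete</label>
  <row>
    <panel>
      <html>
        <a href="/app/my_main_app/overview">Open the main app</a>
      </html>
    </panel>
  </row>
</dashboard>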

Checkpoint compatibility

Dear All, the Checkpoint app's listing page includes the following compatibility description:

COMPATIBILITY
Products: Splunk Cloud, Splunk Enterprise
Splunk Versions: 6.3, 6.2
Platform: Platform Independent

On my infrastructure I have Splunk Enterprise 6.5 and it seems to work. What does this description stand for? I want to upgrade to 7.1, and I am not sure whether the app is supported and works properly on the latest version of Splunk Enterprise, 7.1. Thank you in advance.

Server Roles in the DMC

I have inherited a Splunk non-clustered, distributed enterprise environment. I believe that my Splunk instances have too many server roles assigned to them. Is there documentation stating:

1. What role(s) should a Heavy Forwarder have?
2. What role(s) should a Search Head have? (Search Head role only, or KV store as well?)
3. What role(s) should an Indexer have? (Indexer role only, or KV store as well?)
4. What role(s) should a Deployment Server have?
5. What role(s) should the DMC Server have?

Right now, the 4 server roles Indexer, KV Store, License Master, and Search Head are assigned to my DMC. It is the License Master for my infrastructure, so I know that role is required. I am having a hard time finding documentation online that explicitly states how the server role assignment should be. Thanks in advance.

Want to extend/increase the size of all buckets in Splunk by time period (days)

Hi everyone, I have gone through some Splunk documents about buckets, but most of them discuss how to increase/extend the size of a bucket by volume, i.e. MB/GB. My concern is that I want to size my buckets by days (for example, I want to store my last 60 days of data in my hot bucket). I know that I have to convert the days to a seconds value and then use it in the bucket configuration, but I didn't find any proper example in the Splunk docs. Can anyone help me with this, or point me to good documentation with a proper example? It'll be very helpful for me. Thanks, Saibal6
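The relevant indexes.conf settings are expressed in seconds. A minimal sketch for a 60-day target (the index name is a placeholder; 60 days = 5,184,000 seconds); in practice retention is usually controlled with frozenTimePeriodInSecs rather than by keeping 60 days in a single hot bucket:

# indexes.conf
[your_index]
# roll a hot bucket once it spans ~60 days of event time
maxHotSpanSecs = 5184000
# keep data searchable (hot/warm/cold) for at least 60 days before freezing
frozenTimePeriodInSecs = 5184000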

Added data not showing up in respective source types

I added some dummy data yesterday after creating an index and a respective source type, but this morning I found there was no data in the sourcetype, and the index is also saying "No data found". I have the Splunk admin role. Please help. Thanks in advance.
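A couple of quick checks, since badly parsed timestamps often land dummy events outside the selected time range: search the index over all time and see where the events ended up (the index name is a placeholder):

index=your_index earliest=0
| stats count min(_time) as first_event max(_time) as last_event by source sourcetype
| convert ctime(first_event) ctime(last_event)

If the events show up here with odd timestamps, the fix is usually in timestamp recognition (props.conf) or simply widening the time range picker.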

