I have a service that runs an SQL query against a database and returns a simple two-column table, "Item" and "Item count", containing the name of an item and its quantity. These tables have no fixed number of rows. The service outputs the table as JSON in the format below:
{ "odata.metadata":"”:[
{ "ItemName":"Item1","ItemCount":23
}
{ "ItemName":"Item2","ItemCount":12
// etc…
}
{ “ItemName”:”Item71”,”ItemCount”:2
}
I have managed to pull this data into Splunk via the REST API modular input app. However, in its current state it isn't very useful in search: I can't search for items with an item count less than 10, or for an item with a specific name, because all of the rows are grouped into one massive event in Splunk. Splunk has managed to extract the fields, but they end up with ugly names that aren't attractive for search either, e.g. "value{}.ItemCount" rather than just "ItemCount".
What I would like to do is extract each individual row of this table/JSON data into its own event, so that each event from one table carries the table's timestamp plus a field for each attribute (item name and item count). Each event would then look more like a traditional log entry, like below:
15/12/2015 09:47 Item1 23
15/12/2015 09:47 Item2 12
15/12/2015 09:47 Item71 2
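To make concrete what I mean, here is a quick sketch of the transformation I'm after (the "value" key is my guess at where the rows live, based on the "value{}.ItemCount" field names Splunk extracted; the rows carry no timestamp of their own, so I stamp each line with the time the table was pulled):

```python
import json
from datetime import datetime

def rows_to_log_lines(raw_json, query_time):
    """Flatten the OData-style response into one log-style line per row,
    each stamped with the time the query ran (rows have no timestamp)."""
    payload = json.loads(raw_json)
    stamp = query_time.strftime("%d/%m/%Y %H:%M")
    return ["%s %s %s" % (stamp, row["ItemName"], row["ItemCount"])
            for row in payload.get("value", [])]

# Example with a cut-down response:
raw = '{"odata.metadata":"x","value":[{"ItemName":"Item1","ItemCount":23}]}'
print(rows_to_log_lines(raw, datetime(2015, 12, 15, 9, 47)))
# → ['15/12/2015 09:47 Item1 23']
```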
Is this possible with the custom response handlers? I've looked at the example code in responsehandlers.py, and the handlers all seem very simple: they convert the response into another format and just print it. I need to extract each row and submit it as a **distinct** event! Another priority is getting the data into this format before search time, so if I could do it all in the response handler, that would be perfect.
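For reference, this is the rough shape of the handler I have in mind, modelled on the examples I saw in responsehandlers.py. The `__init__`/`__call__` signature mirrors those examples but I haven't verified it against the app, the `print_xml_stream` stub below only stands in for the helper the app provides, and the "value" key is my assumption about where the rows live:

```python
import json

def print_xml_stream(data):
    # Stand-in for the helper responsehandlers.py provides, which wraps
    # each event in the modular input <stream><event> XML for Splunk.
    print("<event><data>%s</data></event>" % data)

def split_rows(raw_response_output):
    """Split an OData-style JSON response into one JSON string per row.
    Assumes the rows sit under a top-level "value" key, matching the
    "value{}.ItemCount" field names Splunk extracted."""
    payload = json.loads(raw_response_output)
    return [json.dumps(row) for row in payload.get("value", [])]

class JSONRowsHandler:
    """Sketch of a custom response handler; the class shape copies the
    examples in responsehandlers.py (an unverified assumption)."""
    def __init__(self, **args):
        pass

    def __call__(self, response_object, raw_response_output, response_type,
                 req_args, endpoint):
        if response_type == "json":
            for row in split_rows(raw_response_output):
                print_xml_stream(row)  # each row becomes a distinct event
        else:
            print_xml_stream(raw_response_output)
```

If something like this works, each table row would land in Splunk as its own event with clean `ItemName`/`ItemCount` fields, which is exactly what I'm after.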
Many thanks.