I'm a Splunk newbie, so feel free to challenge any of my assumptions.
I'm tasked with integrating our proprietary product's event/alert database.
I believe the correct approach (in a simple case) is to install DB Connect and a forwarder (universal or heavy?) on the database server host, and Splunk Enterprise as an indexer/search head on a separate "query/reporting" host.
The difficulty I'm encountering is that at least one table has a column containing XML. This XML describes a variable list of additional fields that depends on the event/alert type (similar to a Windows event log), and some of those fields contain text with embedded commas, which breaks the CSV processing.
These fields should be searchable and selectable on the search head, but I'm not sure of the best approach to processing them.
I started to look into custom search commands to transform the SQL Server record into an appropriate form:
A CSV representation seems problematic, not just because delimiter characters can appear in the field text, but because producing the header row requires processing all records first to determine the full set of additional fields.
One option is to convert the data into a "key=value;" representation.
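To illustrate the "key=value;" idea, here's a minimal sketch of flattening such an XML fragment. The element and attribute names (`Event`, `Data`, `Name`) are assumptions modeled on Windows event log XML; the real schema will differ. Quoting the values keeps embedded commas and semicolons from being misread as delimiters downstream:

```python
import xml.etree.ElementTree as ET

def xml_to_kv(xml_text):
    """Flatten a variable-field XML fragment into a key="value"; string.

    Assumes one level of child elements whose Name attribute (or tag)
    is the field name -- adjust for the actual schema.
    """
    root = ET.fromstring(xml_text)
    pairs = []
    for child in root:
        name = child.get("Name", child.tag)
        value = (child.text or "").strip()
        # Quote each value so embedded commas/semicolons survive;
        # crudely neutralize interior double quotes.
        pairs.append('%s="%s"' % (name, value.replace('"', "'")))
    return ";".join(pairs)

# Hypothetical event with a comma inside a field value
sample = ('<Event><Data Name="user">alice</Data>'
          '<Data Name="msg">disk full, volume C:</Data></Event>')
print(xml_to_kv(sample))
# -> user="alice";msg="disk full, volume C:"
```

This sidesteps the CSV header problem entirely, since each event carries its own field names.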
Can I define a custom sourcetype to handle the data?
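If a custom sourcetype is viable, I imagine it would look something like this `props.conf` stanza (the sourcetype name is made up; `KV_MODE` is the setting that controls automatic field extraction at search time):

```ini
# props.conf -- "myproduct:events" is a hypothetical sourcetype name
[myproduct:events]
# If events are indexed with the raw XML intact:
KV_MODE = xml
# ...or, if events are pre-converted to key="value"; pairs:
# KV_MODE = auto
```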
I expect the answer is probably a combination of these approaches.
By the way, I installed Splunk Enterprise and DB Connect, but even with a reduced set of records I exceeded the daily indexing limit on the trial license. Advice on avoiding this would be helpful.