I'm developing a Splunk app that I plan to make available on Splunkbase.
The app contains dashboards that visualize data from various proprietary source types.
In a test or production environment, users will forward data to the app from the system that generates those source types. (More precisely, users will forward data to a *TCP data input* defined by the app.)
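For context, the data input I'm referring to is just a stanza in the app's `inputs.conf`, along the lines of the sketch below (the port number, source type name, and index are placeholders, not the values my app actually uses):

```
# default/inputs.conf (illustrative; port, sourcetype, and index are placeholders)
[tcp://1514]
sourcetype = vendor:proprietary:events
index = main
disabled = 0
```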
For demo purposes, for users who want to see examples of the visualizations without forwarding their own data, I want to provide sample data.
I have 50 MB of sample data in a JSON Lines file. This compresses to a 1.5 MB `.zip` file.
I'm debating how and where to make this sample data available to app users.
I'm anticipating that users will download the `.zip` file from somewhere, uncompress it, and then upload it using Splunk Web, following some simple instructions (for example, to select the appropriate custom source type during upload) that I'll probably provide in the app's detailed description on Splunkbase.
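The one-time upload itself would boil down to something like the following (file name, source type, and index are placeholders); users who prefer the CLI over Splunk Web could do the equivalent with `splunk add oneshot`:

```
# One-time indexing of the uncompressed sample file from the CLI
# (file name, sourcetype, and index are placeholders)
$SPLUNK_HOME/bin/splunk add oneshot sample_data.jsonl \
    -sourcetype vendor:proprietary:events -index main
```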
I've previously built Docker images for such demo purposes, and that's worked fine: a single `docker run` command creates a working Splunk installation with the app and sample data. However, this time, I want to make the sample data available separately, outside of a Docker image, to give users the choice of which Splunk installation to use to host the app.
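To illustrate, that Docker-based demo is roughly a one-liner like this (the image name, password, and ports are illustrative, not my actual image, which bakes in the app and sample data):

```
# Start a throwaway Splunk instance for the demo
# (image name, password, and ports are illustrative)
docker run -d -p 8000:8000 -p 1514:1514 \
    -e SPLUNK_START_ARGS="--accept-license" \
    -e SPLUNK_PASSWORD="changeme123" \
    example/splunk-demo-app:latest
```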
I don't like the idea of bloating the Splunk app, which is currently only a few dozen kilobytes, with a 1.5 MB `.zip` file that is only useful for demos.
How do other Splunk app developers support this use case? What's the best practice for providing sample data for an app on Splunkbase, without "bloating" the app itself? (Note that my source types are proprietary, non-trivial to programmatically synthesize from scratch, and not generally available from public sources.)