Hi all,
We have some scripts that populate lookups via the Splunk lookup REST API ([docs][1]).
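For context, lookup_fill.py does roughly the following. This is only a simplified sketch: the KV store collection, app, credentials, and the use of the `requests` library are placeholders here, and the real script differs in the details.

    #!/usr/bin/env python3
    # Simplified sketch of our lookup-filling script (names are illustrative).
    # It replaces the contents of a KV store-backed lookup via the Splunk REST API.
    import json
    import requests

    SPLUNK_BASE = "https://localhost:8089"       # management port
    APP = "myapp"                                # app that owns the lookup
    COLLECTION = "my_lookup_collection"          # placeholder KV store collection
    AUTH = ("svc_lookup_fill", "password")       # placeholder service account

    def fill_lookup(rows):
        """Replace the collection contents with the given list of dicts."""
        base = "{0}/servicesNS/nobody/{1}/storage/collections/data/{2}".format(
            SPLUNK_BASE, APP, COLLECTION)
        # Delete all existing records, then batch-insert the new ones.
        requests.delete(base, auth=AUTH, verify=False).raise_for_status()
        resp = requests.post(
            base + "/batch_save",
            auth=AUTH,
            verify=False,
            headers={"Content-Type": "application/json"},
            data=json.dumps(rows),
        )
        resp.raise_for_status()
        # Anything printed here would end up in the index via the scripted input.
        print("loaded {0} rows into {1}".format(len(rows), COLLECTION))

    if __name__ == "__main__":
        fill_lookup([{"host": "web01", "owner": "team-a"},
                     {"host": "db01", "owner": "team-b"}])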
We also have a search head cluster (SHC).
It would be great to use the SHC to run our scripts on one of the live nodes.
The best candidate for this seems to be inputs.conf: we can not only run the script, but also collect its STDOUT and STDERR into an index (Docker style), for example:
    [script://$SPLUNK_HOME/etc/apps/myapp/bin/lookup_fill.py]
    interval = 50 23 * * *
    sourcetype = lookup_fill
    index = index_for_scripts_output
But when we use inputs.conf, the script starts on all SHC nodes.
Can you advise a way to run the script from inputs.conf on a single node only, or is there a better way to:
1. Run a custom script on one of the SHC nodes (ideally the least loaded one)?
2. Collect the script's STDOUT and STDERR into an index?
Thank you.
[1]: http://dev.splunk.com/view/webframework-developapps/SP-CAAAEZG