I have a 4-server Splunk scenario:
1. index server
2. deployment server
3. search head server
4. deployment client server (with a Splunk Universal Forwarder that is known to be configured correctly and working, i.e., it forwards to the index server OK)
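For context, the Universal Forwarder on the deployment client already sends data to the index server; its outputs.conf is along these lines (the output group name, hostname, and port are placeholders for my actual values):

```
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = <index-server-hostname>:9997
```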
On the index server, I placed the following 200-row CSV file and successfully ingested it into the index `foo_index` using oneshot (the command is sketched after the sample):
"DateTime","foo"
"10/1/2019 12:03:20 AM","cat"
.. 198 more similar rows
"10/1/2019 11:55:20 PM","dog"
After verifying that the 200 events were ingested (via the count search sketched after the sample below), I edited the CSV file and added 200 more rows:
**"DateTime","foo"
"10/1/2019 12:03:20 AM","cat"
.. 198 more similar rows
"10/1/2019 11:55:20 PM","dog"
"10/2/2019 12:01:20 AM","mouse"
.. 198 more similar rows
"10/2/2019 11:59:59 PM","mouse"**
Then, on the deployment server, I created a remote folder monitor and deployed it to the deployment client, which created `\etc\apps\xxx\local\inputs.conf` on that client:
[monitor://D:\foo]
disabled = false
index = foo_index
sourcetype = csv
crcSalt =
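For completeness, the deployment server maps the app to the client with a serverclass.conf along these lines (the server class name and whitelist entry are placeholders for my actual values):

```
[serverClass:foo_monitor]
whitelist.0 = <deployment-client-hostname>

[serverClass:foo_monitor:app:xxx]
restartSplunkd = true
stateOnClient = enabled
```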