I have used the dump command to extract data from our production server so I can play with it on my local machine.
I have 6 different hosts in production, so I'd like to run 6 dumps so that the host segregation is respected.
I used something like this:
sourcetype=MySourceType host=node1 | dump basefilename=MySourceType-node1-2016Jan01ToMay31
It worked, but the results were split by 'source', as in the source file name.
The log files the data comes from are located at paths like
\\node1.company.com\logs\logfile1.log
There are 32 log files, so my dump results end up like this:
$SPLUNK_HOME/var/run/splunk/dispatch//dump/node1/node1.company.com/logs/logfile1.log/MySourceType-node1-2016Jan01ToMay31_450_10.raw.gz
$SPLUNK_HOME/var/run/splunk/dispatch//dump/node1/node1.company.com/logs/logfile1.log/MySourceType-node1-2016Jan01ToMay31_816_40.raw.gz
$SPLUNK_HOME/var/run/splunk/dispatch//dump/node1/node1.company.com/logs/logfile2.log/MySourceType-node1-2016Jan01ToMay31_450_21.raw.gz
$SPLUNK_HOME/var/run/splunk/dispatch//dump/node1/node1.company.com/logs/logfile2.log/MySourceType-node1-2016Jan01ToMay31_816_51.raw.gz
I think the 450 and 816 come from me running the dump once, cancelling, and running it again with a different time frame.
My main issue is: how can I consolidate the dump into a single file per host, instead of having it segregated by source?
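One workaround, if the per-source split can't be avoided in the search itself, is to merge the chunks after the fact. The gzip format allows members to be concatenated: joining the `.raw.gz` pieces with plain `cat` produces a single valid `.raw.gz` that `gunzip`/`zcat` reads as one stream, with no recompression needed. A minimal sketch (the `consolidate_dump` helper name is mine, not a Splunk command):

```shell
#!/bin/sh
# Merge every *.raw.gz chunk under a dump directory into one .raw.gz.
# Concatenated gzip members decompress as the concatenation of their
# contents, so cat alone is enough -- no need to unzip and re-zip.
consolidate_dump() {
    # $1 = directory tree containing the dump chunks
    # $2 = path of the single output file to create
    find "$1" -name '*.raw.gz' -exec cat {} + > "$2"
}
```

Pointed at the `node1` subtree of the dispatch directory, this would yield one `MySourceType-node1-2016Jan01ToMay31.raw.gz` covering all 32 source files. Note the chunks are appended in whatever order `find` walks them, so events are grouped by source file rather than globally time-ordered.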