I found an old question from 2010, but I'm still confused about the timings/duration values versus the execution cost.
I have a search that finishes with `This search has completed in 87.776 seconds`.

Below is a snapshot of the job inspector. According to the [documentation][1], `command.search.index` (here `102.39`) tells how long it took to look into the TSIDX files for the locations to read in the raw data. I assume this value is in seconds?
What confuses me is this: the entire search took only 87.776 seconds, yet a single component took more than that, and several components report durations much higher than the total. (Note: this is a search head cluster (SHC) and indexer cluster environment.) Do I need to divide the duration (seconds) value by the number of cluster members to get the real value in seconds?
| Duration (seconds) | Component | Invocations | Input count | Output count |
|---|---|---|---|---|
| 0.96 | command.eval | 960 | 14,440 | 14,440 |
| 0.10 | command.fields | 240 | 3,606 | 3,606 |
| 0.48 | command.pretransaction | 478 | 10,818 | 10,818 |
| 0.00 | command.rename | 1 | - | - |
| 275.92 | command.search | 718 | 7,361 | 11,100 |
| 102.39 | command.search.index | 326 | - | - |
| 24.10 | command.search.filter | 435 | - | - |
| 4.37 | command.search.fieldalias | 195 | 737,809 | 737,809 |
| 3.88 | command.search.calcfields | 195 | 737,809 | 737,809 |
| 1.66 | command.search.expand_search | 1 | - | - |
| 0.00 | command.search.index.usec_1_8 | 83,018,230 | - | - |
| 0.00 | command.search.index.usec_512_4096 | 1 | - | - |
| 0.00 | command.search.index.usec_64_512 | 14 | - | - |
| 0.00 | command.search.index.usec_8_64 | 125,952 | - | - |
There are more entries, but I've truncated the list here.
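My current reading of the docs is that these component durations are accumulated across all invocations on all search peers, which run in parallel, so the sum can legitimately exceed the wall-clock time. A toy sketch of that interpretation (the per-peer numbers below are hypothetical, not taken from the real job):

```python
# Hypothetical per-peer time spent in command.search on 4 indexers.
# If durations are accumulated across peers that work in parallel,
# the reported total is the SUM, while the user only waits roughly
# as long as the slowest peer.
per_peer_search_seconds = [70.1, 68.9, 69.5, 67.4]

reported_duration = sum(per_peer_search_seconds)  # what the job inspector would show
wall_clock_estimate = max(per_peer_search_seconds)  # roughly what the user waits

print(f"reported component duration: {reported_duration:.1f} s")
print(f"approx. wall-clock time:     {wall_clock_estimate:.1f} s")
```

If that interpretation is right, dividing by a fixed cluster-member count would only be accurate if every peer did exactly the same amount of work, which seems unlikely; hence my question.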
[1]: https://docs.splunk.com/Documentation/Splunk/7.3.1/Search/ViewsearchjobpropertieswiththeJobInspector