I am running the following queries to get the average event count per second and per day over a one-week period, but the results make no sense.
This is the query for the average per day over a week:
index=some_some2_idx ns=something app_name=my-api STATUS_CODE!=5*
| timechart span=1d count as requests_per_second
| stats avg(requests_per_second)
The average-per-day result is 3010.5714285714284.
This is the query for the average per second over a week:
index=some_some2_idx ns=something app_name=my-api STATUS_CODE!=5*
| timechart span=1s count as requests_per_second
| stats avg(requests_per_second)
The average-per-second result is 1.4238227146814404.
There are 86400 seconds in a day, so 86400 * 1.4238227146814404 = 123018 and some change. I am not expecting an exact match, but that is not even remotely close to 3010.5714285714284. What am I missing? Is this expected, or is there something wrong with my queries?
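For clarity, this is the sanity check I am doing, written as a standalone search so anyone can reproduce the arithmetic (it only uses makeresults and eval with the two numbers above, no index or field names from my environment):

| makeresults
| eval avg_per_second = 1.4238227146814404
| eval seconds_per_day = 86400
| eval implied_per_day = avg_per_second * seconds_per_day
| table avg_per_second seconds_per_day implied_per_day

implied_per_day comes out to roughly 123018, which is the number I am comparing against the 3010.57 per-day average.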