Hey all,
I wanted to see if someone can help me out with this. Basically I'm trying to get the duration between two scenarios: how long it takes each user to get from scenario_1 to scenario_2, broken out by service. This is what I have so far, and it seems to work when I run it for an individual service:
index=index_name (scenariotype="scenario_1" OR scenariotype="scenario_2") user_ID="*" service_name="*service_1*" | transaction user_ID | stats mean(duration) AS "Mean Duration(In Seconds)" by service_name
Stats table shows:
service_name | Mean Duration(In Seconds)
service_1 | 7.25
It returns a low number, and when I manually checked the mean time by user_ID, it was correct.
However, when I get the mean duration for all services, I get a much higher number, especially for service_1 above. Keep in mind, I have 9 services I'm trying to get numbers from. So when I run the following without specifying a service_name (or with more than one service_name), I get much higher mean durations for exactly the same time period (note that service_1 below is the same service as above but returns a much higher number):
index=index_name (scenariotype="scenario_1" OR scenariotype="scenario_2") user_ID="*" | transaction user_ID | stats mean(duration) AS "Mean Duration(In Seconds)" by service_name
Stats table shows:
service_name | Mean Duration(In Seconds)
service_1 | 189.57
service_2 | 5.75
service_3 | 5.75
service_4 | 1.35
service_5 | 6.25
service_6 | 10.40
service_7 | 4.53
service_8 | 8.78
service_9 | 6.72
I've also experimented with looking further back in time, and the mean duration goes up the further back I go, whenever I don't limit the search to a single service.
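In case it helps to see what I've been wondering: if the problem is that `transaction user_ID` stitches together a user's events from different services into one long transaction, then grouping the transaction by both fields might isolate each service. This is just a sketch of what I mean, I haven't confirmed it gives the right numbers:

```
index=index_name (scenariotype="scenario_1" OR scenariotype="scenario_2") user_ID="*" | transaction user_ID, service_name | stats mean(duration) AS "Mean Duration(In Seconds)" by service_name
```

The idea is that `transaction` accepts a list of fields, so events would only be grouped when both user_ID and service_name match.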
Hopefully I made sense, and someone can help me figure out what I'm doing wrong.
thx!!