I am conducting an experiment on a couple of computer systems, but the results don't make sense to me. I made each system perform 1000 operations:
System A performs operations at a rate of 476/second
System B performs operations at a rate of 88/second
But when I look at the frequency distribution of the operation times, the large gap in throughput does not make sense to me:
Time (s)     B     A
<1         908     0
1-3         84   977
4-6          3    13
7-9          2     7
10-12        2     2
13-15        1     0
16-18        0     0
19-21        0     0
22-24        0     0
25-27        0     0
28-30        0     1
31-33        0     0
34-36        0     0
37-39        0     0
40+          0     0
As you can see, system B performed 90% of its operations in under a second, while system A performed 97% of its operations in 1-3 seconds. I would have expected a much larger spread of operation times for system B than for system A, given the huge difference in throughput between the two systems.
Do my results make sense?
There are so many possible points of confusion here that I hardly know where to start.
First, I wonder what 'time' means in your tabulation. It can hardly be seconds: if the rate is 88/sec, then how can there possibly be 908 operations within the first second?
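To illustrate the mismatch, here is a rough sanity check in Python. It assumes (and this is only an assumption, since the question does not say) that the bins record per-operation completion times in seconds, approximated by their midpoints, and that the operations ran one after another:

    # Rough sanity check: treat each bin as a per-operation time (seconds),
    # approximate it by its midpoint, and compute the sequential throughput
    # the histogram would imply.
    midpoints = [0.5, 2, 5, 8, 11, 14, 17, 20, 23, 26, 29, 32, 35, 38, 40]
    counts = {
        "B": [908, 84, 3, 2, 2, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0],
        "A": [0, 977, 13, 7, 2, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0],
    }
    quoted = {"B": 88, "A": 476}  # rates quoted in the question, ops/sec
    for name, c in counts.items():
        mean_time = sum(m * n for m, n in zip(midpoints, c)) / sum(c)
        print(f"System {name}: mean ~{mean_time:.2f} s/op -> "
              f"~{1 / mean_time:.2f} ops/sec (quoted: {quoted[name]}/sec)")

Under that reading, system B comes out near 1.5 ops/sec and system A near 0.5 ops/sec; neither is anywhere near its quoted rate, whichever way I try to read the table.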
Second, one might suppose the 'rates' refer to the rates of exponential distributions. But then there would be essentially no possibility of observing zero outcomes within the first second and so many in the following two seconds, as your table shows for system A.
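To see why: if an operation's time X were exponentially distributed with the quoted rate, then P(X >= 1 sec) = exp(-rate), which is vanishingly small for rates like these. A quick check, again assuming that interpretation of the rates, which the question does not confirm:

    import math

    # If an operation time X ~ Exponential(rate), then P(X >= 1 s) = exp(-rate).
    for rate in (476, 88):
        p = math.exp(-rate)
        print(f"rate {rate}/sec: P(one operation takes >= 1 s) = {p:.3g}")
    # rate 476/sec: ~2e-207
    # rate  88/sec: ~6e-39

Yet your table shows all 1000 of system A's operations taking at least a second, and 92 of system B's doing so; under this model, both outcomes are effectively impossible.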
So I conclude that the units are likely inconsistent and that the times for individual operations cannot be exponentially distributed.
On this site one supposes a probability model lies behind the question, but from the information you provide, I have no idea what that model might be.
From my own (limited) experience with benchmarking software, I suspect there is more to the "speed" of these machines than the quoted rates. Could there be some competing process running in the background? Different interfaces? Viruses?
Unless you can provide clarification and additional information, I doubt you will get a more helpful answer here.