Let's say I download a file that is 3500 bytes in 2401 milliseconds. What is the calculation to work out Internet speed?
I ask because I'm developing a piece of software that does just this to make a guess at the user's current internet speed. My problem is I don't know how to use these 2 numbers to calculate the Mbps.
Here's what I've got so far. Not sure if I'm on the right track but I'm getting the wrong result.
- I convert the bytes to bits by multiplying the bytes by 8
- Divide the bits by the number of seconds it took to transfer the file to get bits per second
- Divide by 1,000,000 to get the Mbps
Mbps = ((fileBytes * 8) / durationInSeconds) / 1000000
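As a concrete sketch of that formula (Python here, purely for illustration; the function name is my own):

```python
def mbps(file_bytes: int, duration_ms: float) -> float:
    """Convert a transfer size in bytes and a duration in milliseconds to Mbps."""
    bits = file_bytes * 8              # bytes -> bits
    seconds = duration_ms / 1000.0     # milliseconds -> seconds
    return bits / seconds / 1_000_000  # bits per second -> megabits per second

print(mbps(3500, 2401))  # ~0.01166 Mbps
```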
If I put my numbers into this function I get a completely different result:
((3500 * 8) / 2.401) / 1000000
(28000 / 2.401) / 1000000
11661.80758017492711 / 1000000
= 0.01166180758017
The result 0.01166180758017 is not correct, as my Internet speed is ~4 Mbps. Where am I going wrong?
Your math is right; the measured download speed turns out to be abysmal because of other issues.
Try a (significantly) bigger file. I mean a lot bigger: hundreds of megabytes. Otherwise the actual download time is swamped by the time it takes to connect to the host and exchange all the other messages (e.g. the SSL/TLS handshake if you run over HTTPS), plus the time the host itself takes to respond.
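A minimal sketch of that approach in Python (the URL is a placeholder for a large test file you control; `urllib` and the function names are my own choices, just for illustration). Starting the clock only once the response body begins arriving keeps the connection setup and handshakes out of the measurement:

```python
import time
import urllib.request

def timed_read(stream, chunk_size: int = 64 * 1024):
    """Read a file-like object to EOF; return (total_bytes, elapsed_seconds)."""
    start = time.monotonic()
    total = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        total += len(chunk)
    return total, time.monotonic() - start

def measure_mbps(url: str) -> float:
    """Time only the body transfer; connection setup and the HTTP/TLS
    handshakes happen inside urlopen(), before the clock starts."""
    with urllib.request.urlopen(url) as resp:
        total_bytes, seconds = timed_read(resp)
    return (total_bytes * 8) / seconds / 1_000_000

# Placeholder URL -- substitute a large (100s of MB) file you control:
# print(measure_mbps("https://example.com/100MB.bin"))
```

Even this still includes server-side effects such as response latency and TCP ramp-up at the start of the transfer, which is another reason a large file gives a more stable number.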