Calculating the time it takes to send a file


The time to send a file is calculated as: $$\textrm{time} = \frac{\textrm{file size}}{\textrm{link capacity}}.$$ In this example, $\textrm{file size} = 4492643566$ bytes, and $\textrm{link capacity} = 100$ Mbit/s. Sending the file and the header over a $100$ Mbit/s link will take $359.4$ seconds $\approx 6$ minutes.

I don't understand the result, $4492643566/100 = 359$?

Best answer:

Note the difference in units: 1 byte = 8 bits.

So multiply the file size by 8 to get its size in bits.

The link capacity is given in Mbit/s ($100$ of them), and each Mbit is $1000000 = 10^6$ bits.

So you need to calculate: $$\dfrac{4492643566 \times 8\,\text{ bits}}{100\times 10^6 \text{ bits per second}}\approx 359.4\text{ seconds}$$
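The conversion above can be sketched in a short Python function (the function name and parameter names are illustrative, not from the original post):

```python
def transfer_time_seconds(file_size_bytes, link_capacity_mbps):
    """Time to send a file over a link, in seconds."""
    bits = file_size_bytes * 8            # 1 byte = 8 bits
    bits_per_second = link_capacity_mbps * 10**6  # 1 Mbit = 10^6 bits
    return bits / bits_per_second

# The example from the question: 4492643566 bytes over a 100 Mbit/s link
t = transfer_time_seconds(4492643566, 100)
print(round(t, 1))  # 359.4
```

Forgetting either conversion factor (the 8 or the $10^6$) is what makes the naive division $4492643566/100$ come out wrong.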