Why do I have to add 7 when calculating how many bytes to allocate to store x bits?


So I'm using a programming language whose division always rounds down to the nearest integer: 0.125 becomes 0, 0.875 becomes 0, 1.45 becomes 1, and 1.95 becomes 1.

Keeping that in mind, I can't use a formula like bits / 8. If we have 3 bits, 3 / 8 gives 0.375, which rounds down to 0. The same goes for 7 / 8: it gives 0.875, which also rounds down to 0.
That won't work, because we need at least 1 byte to store anywhere from 1 to 7 bits.
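To make the failure concrete, here is a small sketch (mine, not from the original post) in Python, where the `//` operator is floor division and so matches the rounding-down behavior described above:

```python
# Plain floor division under-allocates: every bit count from 1 to 7
# yields 0 bytes, even though each of them needs a whole byte.
for bits in range(1, 8):
    print(f"{bits} bits -> {bits // 8} bytes")  # always 0 here
```

Each line of output shows 0 bytes, which is exactly the under-allocation problem described above.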

So to calculate how many bytes I need to allocate to store x bits, I use this formula instead.

Let's say we have 3 bits (the actual values don't matter):
var bits = 3;
I then compute (bits + 7) / 8, which is 10 / 8 = 1.25. After rounding down, that gives the correct value: 1.

My question is: why do I have to add 7 specifically? Why can't I add some value from 1 to 6? For this example I could add 5, because 3 + 5 = 8, and 8 / 8 is 1.

Why does adding 7 work for all values?
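As an empirical illustration (a Python sketch I wrote, not part of the original question), you can check that adding 7 matches the mathematical ceiling of bits / 8 for every bit count, while every smaller offset under-allocates for at least one value:

```python
import math

# (bits + 7) // 8 equals ceil(bits / 8) for every non-negative bit count.
for bits in range(1000):
    assert (bits + 7) // 8 == math.ceil(bits / 8)

# Any offset smaller than 7 fails for some bit count.
# For example, with an offset of 5 and bits = 1:
# (1 + 5) // 8 == 0, but storing 1 bit needs 1 byte.
for offset in range(7):
    assert any((bits + offset) // 8 < math.ceil(bits / 8)
               for bits in range(1, 9))

print("ok: only +7 works for all bit counts")
```

Intuitively, 7 is the largest remainder that bits can leave modulo 8, so adding 7 is exactly enough to push every non-multiple of 8 up into the next byte without ever over-allocating; any smaller offset leaves some remainder (such as bits = 1) stranded in the byte below.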