I recently came across an algorithm on Stack Overflow for integer division that rounds up.
If we have two numbers $a, b \in \mathbb{Z}^+$ and we want to divide $a$ by $b$ such that the result is always rounded up, we can use:
$$f(a, b) = \frac{a + b - 1}{b}$$
Now, if we take an easy example:
$$f(5, 3) = \frac{5 + 3 - 1}{3} = \frac{7}{3} = 2.\overline{3},$$
which, since we assume integer division, will be truncated from $2.\overline{3}$ to $2 = \lceil \frac{5}{3} \rceil$.
Is there a formal proof of this algorithm, i.e. a proof that the result is always just large enough that truncating its fractional part under integer division yields $\lceil \frac{a}{b} \rceil$?
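For concreteness, the formula from the question can be sketched in Python, whose `//` operator performs the floor (integer) division assumed above; the function name `ceil_div` is my own:

```python
def ceil_div(a, b):
    """Divide positive integers a by b, rounding up, using only integer arithmetic.

    (a + b - 1) // b equals ceil(a / b) for a, b > 0, which is the claim
    the question asks to prove.
    """
    return (a + b - 1) // b


# The example from the question: f(5, 3) truncates 7/3 = 2.33... to 2.
print(ceil_div(5, 3))
```

Note that in languages like C, integer division truncates toward zero rather than flooring, but for positive operands the two coincide, so the same expression works there.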
$\frac{a+b-1}{b}=\frac{a}{b} + \frac{b-1}{b}$. Since $b \geq 1$, the value of $\frac{b-1}{b}$ satisfies $0 \leq \frac{b-1}{b} < 1$. Now let's take the floor of both sides.
If $b$ divides $a$, then $\frac{a}{b}$ is an integer, so adding a quantity in $[0, 1)$ does not change the floor: $\lfloor\frac{a+b-1}{b}\rfloor = \lfloor\frac{a}{b} + \frac{b-1}{b}\rfloor = \frac{a}{b} = \lceil\frac{a}{b}\rceil.$
Now let's analyze the case where $b$ doesn't divide $a$. As before, $\lfloor\frac{a+b-1}{b}\rfloor = \lfloor\frac{a}{b} + \frac{b-1}{b}\rfloor$. Write $\frac{a}{b} = w + \frac{r}{b}$ where $w$ is an integer and $r$ is the remainder; since $b \nmid a$, we know that $0 < r \leq b-1$. Therefore, $\frac{a}{b}+\frac{b-1}{b} = w+\frac{r+b-1}{b} = (w+1)+\frac{r-1}{b}$, where $0 \leq r-1 < b$. Taking the floor then gives $\lfloor\frac{a}{b} + \frac{b-1}{b}\rfloor = w+1$. Since $w+1 = \lceil\frac{a}{b}\rceil$ when $b \nmid a$, this is to say $\lfloor\frac{a+b-1}{b}\rfloor = \lceil\frac{a}{b}\rceil$.
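The identity proved above can also be sanity-checked by brute force. A small Python sketch (the helper name `matches_ceiling` is my own; `-(-a // b)` is an exact integer ceiling for positive operands):

```python
def matches_ceiling(limit):
    """Check (a + b - 1) // b == ceil(a / b) for all 1 <= a, b <= limit.

    -(-a // b) computes ceil(a / b) exactly in integer arithmetic,
    avoiding any floating-point rounding concerns.
    """
    return all(
        (a + b - 1) // b == -(-a // b)
        for a in range(1, limit + 1)
        for b in range(1, limit + 1)
    )
```

Such a check is no substitute for the proof, but it covers both branches of the argument (the $b \mid a$ and $b \nmid a$ cases) across many values.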