Let's say I have two timers with uniform random timeouts in the range $(a,b)$. When one timer goes off, it needs to communicate with the other timer, which takes time $c \ll a,b$, and then perform an operation taking time $\varepsilon \ll c$ before resetting it. However, we do not want the second timer to go off while we perform the operation (going off before that is fine). What is the chance of that happening? (The timers are independent and are assumed to start at the same time.)
I believe this can be represented as
$$t_1 \sim U(a,b)$$ $$t_2 \sim U(a,b)$$
$$ p(t_1+c < t_2 < t_1+c+\varepsilon) $$
but I have no idea how to compute this probability in terms of $a,b,c,\varepsilon$.
Work with the difference $D = t_2 - t_1$. Since $t_1, t_2$ are independent and uniform on $(a,b)$, $D$ has a triangular distribution on $(-(b-a),\, b-a)$ with density $$f_D(d) = \frac{(b-a) - |d|}{(b-a)^2}, \qquad |d| < b-a.$$ Your event is $c < D < c + \varepsilon$, so assuming $0 \le c$ and $c + \varepsilon \le b-a$ (the window fits inside the support of $D$), $$p = \int_c^{c+\varepsilon} \frac{(b-a) - d}{(b-a)^2}\,\mathrm{d}d = \frac{\varepsilon\left(b - a - c - \varepsilon/2\right)}{(b-a)^2}.$$ Note that $c$ does matter: the larger $c$ is, the less room remains for $t_2$ to land in the window, and the probability drops to zero once $c \ge b-a$. Since $\varepsilon \ll c$, this is approximately $\dfrac{\varepsilon\,(b-a-c)}{(b-a)^2}$.
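A quick Monte Carlo sketch to sanity-check this formula (the function names and example parameters are my own, chosen for illustration):

```python
import random

def collision_probability(a, b, c, eps, trials=200_000):
    """Monte Carlo estimate of P(t1 + c < t2 < t1 + c + eps)
    for independent t1, t2 ~ Uniform(a, b)."""
    hits = 0
    for _ in range(trials):
        t1 = random.uniform(a, b)
        t2 = random.uniform(a, b)
        if t1 + c < t2 < t1 + c + eps:
            hits += 1
    return hits / trials

def closed_form(a, b, c, eps):
    """Closed-form value eps * (b - a - c - eps/2) / (b - a)^2,
    valid when 0 <= c and c + eps <= b - a."""
    w = b - a
    return eps * (w - c - eps / 2) / w**2

# Example: a=10, b=20, c=1, eps=0.1 -> closed form gives 0.00895;
# the estimate should agree up to Monte Carlo noise.
print(closed_form(10, 20, 1, 0.1))
print(collision_probability(10, 20, 1, 0.1))
```

Running this, the simulated frequency matches the closed form, while the proposed $\varepsilon/b = 0.01$ and the naive $\varepsilon/(b-a) = 0.01$ both overshoot because they ignore the triangular density and the offset $c$.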