From Holzmann, 1991, *Design and Validation of Computer Protocols* (no, it's not a networks question):
The error model provided by the binary symmetric channel predicts that the probability of a series of at least n contiguous error-free bit transmissions, called an "error-free interval" (EFI), is equal to:
$$Pr(\operatorname{EFI} \geq n) = (1 - b)^n $$
where $b$ is the long-term average bit error rate.
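To see how the formula behaves numerically, here is a minimal sketch that evaluates $Pr(\operatorname{EFI} \geq n) = (1-b)^n$ for a few interval lengths; the bit error rate `b = 1e-4` is an illustrative value, not one from the book:

```python
# Evaluate Pr(EFI >= n) = (1 - b)^n for an assumed long-term
# bit error rate b (the value 1e-4 is illustrative only).
b = 1e-4

def p_efi(n: int, b: float = b) -> float:
    """Probability of an error-free interval of at least n bits."""
    return (1.0 - b) ** n

for n in (1, 10, 100, 1000, 10000):
    print(f"n = {n:6d}   Pr(EFI >= n) = {p_efi(n):.6f}")
```

The printed values make the shape of the decay visible: the probability falls off geometrically in $n$, which is relevant to the question below.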
The probability decreases linearly with the length of the interval. Similarly, the probability that the duration of a burst exceeds n bits decreases linearly with n.
What could the author mean by "decreases linearly"? Is it about the derivative (with $b$ a constant and $n$ the variable, that would be $\frac{d}{dn}(1-b)^n = (1-b)^n \ln(1-b)$, which is not linear in $n$), or something else?