Is there any bias in choosing a random number $1\ldots n$ in C as $1+(\mbox{rand}()\%n)$?


I need random numbers $1\ldots n$ in a program written in C. The library rand() function returns numbers in the range $0\ldots\mbox{RAND_MAX}$, where $\mbox{RAND_MAX}$ is typically $\sim2^{31}-1$. So I've just been taking $1+(\mbox{rand}()\%n)$ as a random number $1\ldots n$ (where C syntax $i\%j$ denotes "$i$ modulo $j$", the remainder after dividing $i$ by $j$). That is, equivalently, if we wrote the result of $\mbox{rand}()$ in base $n$, then I'd just take the low-order digit (plus one) as my random number.

Is this procedure just as random as $\mbox{rand}()$, or have I introduced any kind of bias?