Task: Show that the sequence of integers made up of the $k$ rightmost bits generated by an LCG with $m = 2^n$ has a period of at most $2^k$.
I actually get a hint to define the output as the following: $R_i = X_i \operatorname{mod} 2^k$ and derive a recursion.
The LCG algorithm is to provide a seed $X_0$ and iterate for $i \geq 1$, $$ X_i = aX_{i-1} + c \bmod 2^n; \ \ R_i = X_i \bmod 2^k. $$ Deriving a recursion for $R_i$ is difficult when $c \neq 0$, so I thought to set $c = 0$. Then $$ R_i = a^i X_0 \bmod 2^k. $$ How can I derive from this that the period is at most $2^k$?
To clarify: for every nonnegative integer $x$ and positive integer $m$, the notation "$x\bmod m$" means the unique nonnegative integer less than $m$ whose difference from $x$ is a multiple of $m$ — i.e., `x % m` in most programming languages.

Verify that $$ X_i = aX_{i-1} + c \bmod 2^n $$ implies $$ R_i = aR_{i-1} + c \bmod 2^k. $$ (This uses the fact that $2^k$ divides $2^n$.) In particular, $R_i$ is uniquely determined by $R_{i-1}$.
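A quick numerical sanity check of this recursion, with arbitrary illustrative parameters (not from any standard generator):

```python
# Check that the low k bits of an LCG mod 2^n satisfy the same
# recursion taken mod 2^k: R_i = (a*R_{i-1} + c) mod 2^k.
n, k = 16, 5
a, c, x = 1103, 12345, 42  # hypothetical parameters chosen for illustration

r = x % 2**k
for _ in range(1000):
    x = (a * x + c) % 2**n   # full LCG state
    r = (a * r + c) % 2**k   # recursion using only the low k bits
    assert r == x % 2**k     # low k bits of X_i agree with R_i
```

The assertion never fires because reducing mod $2^k$ commutes with the update whenever $2^k \mid 2^n$.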
Consider $R_0, R_1, \ldots, R_{2^k}$, which are $2^k+1$ integers between $0$ and $2^k-1$. By the pigeonhole principle, two of them must be equal, say $R_i = R_j$ with $0\le i<j\le 2^k$.
Since each term is uniquely determined by its predecessor, $R_i = R_j$ forces $R_{i+t} = R_{j+t}$ for all $t \ge 0$. Hence the sequence $R_0, R_1, \ldots$ is eventually periodic with period $j-i$, which is at most $2^k$.
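The pigeonhole argument can be sketched in code: iterate the low-bit recursion until a value repeats, and read off the period. Parameters are again arbitrary choices for illustration.

```python
def low_bit_period(a, c, r0, k):
    """Return the (eventual) period of R_{i+1} = (a*R_i + c) mod 2^k."""
    seen = {}            # value -> first index at which it appeared
    r = r0 % 2**k
    i = 0
    while r not in seen:     # at most 2^k distinct values, so this stops
        seen[r] = i
        r = (a * r + c) % 2**k
        i += 1
    return i - seen[r]       # j - i in the notation of the argument above

k = 4
p = low_bit_period(1103, 12345, 42, k)
assert 1 <= p <= 2**k        # period of the low k bits never exceeds 2^k
```

Because the state space has only $2^k$ values, the loop must revisit one within $2^k+1$ steps, which is exactly the pigeonhole step of the proof.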