Linear recursions in finite fields


Let $F$ be a finite field and let $\alpha$, $\beta$ be distinct nonzero elements of $F$. Let $\alpha$ have order $r$ and let $\beta$ have order $s$. Let $M = \operatorname{lcm}(r, s)$. Let $a,b$ be nonzero elements of $F$ and define a sequence $u_n = a\alpha^n + b\beta^n$.

I now have to show that $M$ is the period of the sequence $u_n$, and I am choosing to do this by showing that there is no $L$ with $1 \le L < M$ such that $u_{n+L} = u_n$ for all $n$.

I was warned that this is a challenging problem, so I am wondering what my proof is missing, because I found it pretty straightforward:

Since $r \mid M$ and $s \mid M$, we have $\alpha^M = \beta^M = 1$, so $u_{n + M} = a\alpha^{n + M} + b\beta^{n + M} = a\alpha^n\alpha^M + b\beta^n\beta^M = a\alpha^n + b\beta^n = u_n$.
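As a sanity check, the claim can be tested numerically in a small prime field. The sketch below (all concrete values, $p = 13$, $\alpha = 3$, $\beta = 5$, $a = 2$, $b = 7$, are my own assumed example, not from the problem) computes the orders $r$, $s$, the value $M = \operatorname{lcm}(r, s)$, and then finds the minimal period of $u_n$ by brute force:

```python
from math import lcm

p = 13  # assumed example: the field F_13 = Z/13Z

def order(x, p):
    """Multiplicative order of the nonzero element x modulo the prime p."""
    k, y = 1, x % p
    while y != 1:
        y = y * x % p
        k += 1
    return k

alpha, beta = 3, 5  # distinct nonzero elements of F_13
a, b = 2, 7         # arbitrary nonzero coefficients

r, s = order(alpha, p), order(beta, p)  # here r = 3, s = 4
M = lcm(r, s)                           # here M = 12

# u_n = a*alpha^n + b*beta^n over a window long enough to detect the period
u = [(a * pow(alpha, n, p) + b * pow(beta, n, p)) % p for n in range(3 * M)]

# minimal period: smallest L >= 1 with u[n+L] == u[n] on a full window
L = next(L for L in range(1, 2 * M + 1)
         if all(u[n + L] == u[n] for n in range(M)))
print(r, s, M, L)  # expect the minimal period L to equal M
```

This only checks one instance, of course; it confirms the statement, not the proof.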

Now suppose such an $L$ does exist, so that $u_{n+L} = a\alpha^{n + L} + b\beta^{n + L} = a\alpha^{n}\alpha^{L} + b\beta^{n}\beta^L = u_n$ for all $n$. Then $r \mid L$ and $s \mid L$, so $M \mid L$ since $M$ is the smallest positive integer divisible by both $r$ and $s$. But $1 \le L < M$, and we have a contradiction.

That's my proof. Am I missing anything big?