Proving a constant-time algorithm to find $A_n \mod 50$ when $A_n$ is a linear combination of the $k$ previous terms


I have the following scenario, based on a practice problem:

Suppose we are given a sequence of integers $A_n$ with $A_0,\ldots,A_{k-1} < 50$, where each subsequent term is a linear combination of the previous $k$ terms, i.e., $A_i = A_{i-1} b_1 + A_{i-2} b_2 + \dots + A_{i-k} b_k$ for $i \ge k$. We know $A_0$ through $A_{k-1}$ and the coefficients $b_1$ through $b_k$.
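For concreteness, here is a minimal sketch of computing $A_n$ directly from that recurrence (function and variable names are mine, not from the problem statement):

```python
def linear_recurrence(a_init, coeffs, n):
    """Return A_n where A_i = b_1*A_{i-1} + b_2*A_{i-2} + ... + b_k*A_{i-k}.

    a_init: [A_0, ..., A_{k-1}]; coeffs: [b_1, ..., b_k].
    """
    k = len(a_init)
    if n < k:
        return a_init[n]
    window = list(a_init)  # the last k terms, oldest first
    for _ in range(k, n + 1):
        # pair b_1 with A_{i-1}, b_2 with A_{i-2}, etc.
        nxt = sum(b * a for b, a in zip(coeffs, reversed(window)))
        window = window[1:] + [nxt]
    return window[-1]
```

With `a_init=[0, 1]` and `coeffs=[1, 1]` this reproduces the Fibonacci numbers, which is a convenient sanity check. This direct loop is $O(n)$, which is the baseline the question is trying to beat.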

I already did what the practice problem asked, but I want to extend this further logically.

I want to figure out how to prove that $A_n \mod 50$ can be computed in less than $O(\log n)$ time. I programmed and plotted a basic scenario where $k=2$; $A_0$, $A_1$, $b_1$, $b_2$ are randomly generated; and 40 values of $A_n \mod 50$ are plotted.
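The experiment looked roughly like this sketch (the seed and the ranges for the random coefficients are my assumptions, since the originals aren't specified):

```python
import random

random.seed(0)  # assumed; any seed works
a, b = random.randrange(50), random.randrange(50)       # A_0, A_1
b1, b2 = random.randrange(1, 10), random.randrange(1, 10)  # coefficient range assumed

values = [a % 50, b % 50]
for _ in range(38):
    # A_i = b_1*A_{i-1} + b_2*A_{i-2}, reduced mod 50 as we go
    a, b = b, (b1 * b + b2 * a) % 50
    values.append(b)
print(values)  # 40 values of A_n mod 50
```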

What I noticed was that sometimes these values repeated -- but at varying intervals. And some didn't seem to repeat at all, even when I adjusted the program to plot 1000 values.

I know that, logically, since we reduce mod 50 at each step, there are only finitely many possible values, so we should be able to find $A_n \mod 50$ in constant time as a function of the constant $k$ (rather than of $n$). Intuitively it makes sense.

But then how would one prove this if the repetition interval varies depending on the starting values? Is there some recurrence relation we can set up? Some proof by mathematical induction?

Basically, in this scenario, how do you prove that a constant-time algorithm to find $A_n \mod 50$ exists, and that the algorithm is correct and complete?