Consider a linear constant coefficient rth order difference equation
$\sum_{j=0}^r \alpha_j U^{(n+j)}=0$
with $\alpha_r \neq 0$ and $\alpha_0 \neq 0$.
I know that if $s$ is a root of the polynomial
$p(s)= \sum_{j=0}^r \alpha_j s^j$, then $U^{(n)}=Cs^n$ solves $\sum_{j=0}^r \alpha_j U^{(n+j)}=0$ for any constant $C$.
My question is: the following was claimed in my class. Suppose $p(s)$ has $r$ distinct roots $s_1,s_2,\ldots,s_r$. Then the linearity of $\sum_{j=0}^r \alpha_j U^{(n+j)}=0$ gives that every solution has the form $U^{(n)}=c_1s_1^n+\cdots+c_r s_r^n$ for some choice of constant coefficients.
I am not sure what the reasoning is behind this. I have done uniqueness and existence proofs before, but I have never seen a proof that a given form captures all solutions. Does anyone have a reference for this proof, or ideas on how one would show it?
Thank you.
Start by showing that $U^{(n)}=s_k^n$ is a nontrivial solution: $$\sum_{j=0}^r\alpha_jU^{(n+j)}=\sum_{j=0}^r\alpha_js_k^{n+j}=s_k^n\sum_{j=0}^r\alpha_js_k^j=0,$$ which holds if and only if $s_k$ is a root of the characteristic equation $$\sum_{j=0}^r\alpha_js_k^j=0.$$

If $U^{(n)}$ and $V^{(n)}$ are solutions of the linear homogeneous difference equation, and $C$ is any constant, then $$\begin{align}\sum_{j=0}^r\alpha_j\left(U^{(n+j)}+V^{(n+j)}\right)&=\sum_{j=0}^r\alpha_jU^{(n+j)}+\sum_{j=0}^r\alpha_jV^{(n+j)}=0+0=0\\ \sum_{j=0}^r\alpha_jCU^{(n+j)}&=C\sum_{j=0}^r\alpha_jU^{(n+j)}=C(0)=0\end{align}$$ So solutions may be combined via the superposition principle.

Now suppose we are given initial conditions $U^{(j)}=b_j$ for $0\le j<r$. We want to find constants $c_k$, $1\le k\le r$, such that $$U^{(j)}=\sum_{k=1}^rc_ks_k^j=b_j$$ for $0\le j<r$. This is possible because the matrix of coefficients $A_{jk}=s_k^j$ is a Vandermonde matrix, which is nonsingular when the $s_k$ are distinct. Look at how this works for a small Vandermonde matrix: $$\begin{align}\det\begin{bmatrix}1&1&1&1\\s_1&s_2&s_3&s_4\\s_1^2&s_2^2&s_3^2&s_4^2\\s_1^3&s_2^3&s_3^3&s_4^3\end{bmatrix}&\stackrel{R_3-s_1R_2}{=}\det\begin{bmatrix}1&1&1&1\\s_1&s_2&s_3&s_4\\s_1^2&s_2^2&s_3^2&s_4^2\\0&s_2^2(s_2-s_1)&s_3^2(s_3-s_1)&s_4^2(s_4-s_1)\end{bmatrix}\\ &\stackrel{R_2-s_1R_1}{=}\det\begin{bmatrix}1&1&1&1\\s_1&s_2&s_3&s_4\\0&s_2(s_2-s_1)&s_3(s_3-s_1)&s_4(s_4-s_1)\\0&s_2^2(s_2-s_1)&s_3^2(s_3-s_1)&s_4^2(s_4-s_1)\end{bmatrix}\\ &\stackrel{R_1-s_1R_0}{=}\det\begin{bmatrix}1&1&1&1\\0&(s_2-s_1)&(s_3-s_1)&(s_4-s_1)\\0&s_2(s_2-s_1)&s_3(s_3-s_1)&s_4(s_4-s_1)\\0&s_2^2(s_2-s_1)&s_3^2(s_3-s_1)&s_4^2(s_4-s_1)\end{bmatrix}\\ &=\det\begin{bmatrix}(s_2-s_1)&(s_3-s_1)&(s_4-s_1)\\s_2(s_2-s_1)&s_3(s_3-s_1)&s_4(s_4-s_1)\\s_2^2(s_2-s_1)&s_3^2(s_3-s_1)&s_4^2(s_4-s_1)\end{bmatrix}\\ &=(s_2-s_1)(s_3-s_1)(s_4-s_1)\det\begin{bmatrix}1&1&1\\s_2&s_3&s_4\\s_2^2&s_3^2&s_4^2\end{bmatrix}\end{align}$$ where we have expanded by minors along the first column and then divided column $1$ by 
$(s_2-s_1)$, column $2$ by $(s_3-s_1)$, and column $3$ by $(s_4-s_1)$. So a sequence of elimination operations, working from the bottom row up, reduces the Vandermonde determinant to one of order smaller by $1$, and by induction $$\det A=\prod_{j=1}^{r-1}\prod_{k=j+1}^r(s_k-s_j)\ne0$$ because the roots are all distinct, as we are told. So there is a unique set of coefficients $c_k$ that satisfies the initial conditions. Finally, a solution satisfying the initial conditions is unique, because the difference equation determines every later term (this is where $\alpha_r\ne0$ is needed): $$\begin{align}U^{(r)}&=-\frac1{\alpha_r}\sum_{j=0}^{r-1}\alpha_jU^{(j)}\\ U^{(r+1)}&=-\frac1{\alpha_r}\sum_{j=0}^{r-1}\alpha_jU^{(j+1)}\end{align}$$ and so on.
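If it helps to see the construction numerically, here is a small sketch in Python. The recurrence $U^{(n+2)}-5U^{(n+1)}+6U^{(n)}=0$ and its initial conditions are made up for illustration (its characteristic polynomial $s^2-5s+6$ has distinct roots $2$ and $3$): we solve the Vandermonde system for the $c_k$ and check the closed form against direct iteration of the recurrence.

```python
# Illustrative example (not from the question): the recurrence
# 6 U^(n) - 5 U^(n+1) + U^(n+2) = 0, with characteristic roots 2 and 3.
import numpy as np

alpha = [6.0, -5.0, 1.0]           # alpha_0, alpha_1, alpha_2
roots = np.roots(alpha[::-1])      # roots of sum_j alpha_j s^j (highest degree first)
b = np.array([1.0, 0.0])           # initial conditions U^(0) = 1, U^(1) = 0

# Vandermonde system A_{jk} = s_k^j; solve A c = b for the coefficients c_k.
# np.vander(roots, increasing=True) has entry [k, j] = s_k^j, so transpose it.
A = np.vander(roots, increasing=True).T
c = np.linalg.solve(A, b)

def closed_form(n):
    """U^(n) = sum_k c_k s_k^n."""
    return (c * roots**n).sum().real

# Direct iteration: U^(n+2) = -(alpha_0 U^(n) + alpha_1 U^(n+1)) / alpha_2.
U = list(b)
for n in range(8):
    U.append(-(alpha[0] * U[n] + alpha[1] * U[n + 1]) / alpha[2])

# The closed form reproduces the iterated sequence term by term.
for n in range(10):
    assert abs(U[n] - closed_form(n)) < 1e-9
```

Solving the $2\times2$ Vandermonde system here gives $c_1 s_1^0 + c_2 s_2^0 = 1$ and $c_1 s_1 + c_2 s_2 = 0$, exactly the $r=2$ case of the linear system above.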