A sequence satisfying a linear recurrence relation with constant coefficients is sometimes called a $C$-finite sequence. That is, a sequence $\{a_n\}$ is $C$-finite iff $$a_n = \sum_{k = 1}^d c_k a_{n - k}$$ for all suitable $n$, where $c_1, \dots, c_d$ are constants and $d$ is a positive integer. The degree of a $C$-finite sequence is the smallest $d$ for which such a recurrence holds.
If $\{a_n\}$ is a nonzero $C$-finite sequence with degree $d$, is the matrix \begin{equation*} \begin{bmatrix} a_{d - 1} & a_{d - 2} & \cdots & a_0 \\ a_{d} & a_{d - 1} & \cdots & a_1 \\ \vdots & \vdots & \ddots & \vdots \\ a_{2d - 2} & a_{2d - 3} & \cdots & a_{d - 1} \end{bmatrix} \end{equation*} of full rank?
(Note that some obvious "counterexamples" are actually not counterexamples at all. For example, take $a_n = a_{n - 1} / 2 + a_{n - 2} / 2$ with $a_0 = a_1 = 1$. This sequence is constantly $1$, so it has degree 1, not 2.)
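To see this "counterexample" collapse concretely, here is a minimal sketch in Python, using exact rational arithmetic to avoid floating-point noise:

```python
from fractions import Fraction

# Build a_n = a_{n-1}/2 + a_{n-2}/2 with a_0 = a_1 = 1, exactly.
a = [Fraction(1), Fraction(1)]
for n in range(2, 8):
    a.append(a[n - 1] / 2 + a[n - 2] / 2)

# Every term is 1: the sequence is constant, hence degree 1, not 2.
print(a)

# Accordingly, the order-2 matrix [[a_1, a_0], [a_2, a_1]] is singular:
det = a[1] * a[1] - a[0] * a[2]
print(det)  # 0
```

So the singularity of the order-2 matrix here does not contradict the question: the sequence simply does not have degree 2.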
As a further question, if we construct a matrix of the same form, but of order $m \neq d$, can we say anything about the rank of the matrix then?
I came across this question while trying to justify the following technique: Suppose that we have a sequence $\{a_n\}$ that we suspect of being a degree $d$ $C$-finite sequence. If this were the case, then the following system would be consistent: \begin{align} \label{eqn:recurrence-guess} \begin{split} a_{d - 1} c_1 + a_{d - 2} c_2 + \cdots + a_0 c_d &= a_d \\ a_{d} c_1 + a_{d - 1} c_2 + \cdots + a_1 c_d &= a_{d + 1} \\ &\vdots \\ a_{2d - 2} c_1 + a_{2d - 3} c_2 + \cdots + a_{d - 1} c_d &= a_{2d - 1}. \end{split} \end{align} This is a square system of order $d$ with coefficient matrix \begin{equation*} \begin{bmatrix} a_{d - 1} & a_{d - 2} & \cdots & a_0 \\ a_{d} & a_{d - 1} & \cdots & a_1 \\ \vdots & \vdots & \ddots & \vdots \\ a_{2d - 2} & a_{2d - 3} & \cdots & a_{d - 1} \end{bmatrix}. \end{equation*} Solving this for the coefficients gives us a nice conjecture.
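As a sanity check of the technique, here is a sketch that solves the system above by exact Gaussian elimination over the rationals; the helper name `guess_coefficients` is mine, and the Fibonacci numbers serve as the test sequence ($d = 2$):

```python
from fractions import Fraction

def guess_coefficients(a, d):
    """Solve the d x d system a_{d-1+j} c_1 + ... + a_j c_d = a_{d+j},
    j = 0, ..., d-1, by Gaussian elimination over the rationals."""
    # Augmented matrix: row j is [a_{d-1+j}, a_{d-2+j}, ..., a_j | a_{d+j}].
    M = [[Fraction(a[d - 1 + j - i]) for i in range(d)] + [Fraction(a[d + j])]
         for j in range(d)]
    for col in range(d):
        # Find a pivot; raises StopIteration if the matrix is singular.
        piv = next(r for r in range(col, d) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        for r in range(d):
            if r != col and M[r][col] != 0:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[r][d] / M[r][r] for r in range(d)]

fib = [0, 1]
for _ in range(6):
    fib.append(fib[-1] + fib[-2])

print(guess_coefficients(fib, 2))  # [Fraction(1, 1), Fraction(1, 1)]
```

As hoped, the recovered coefficients $c_1 = c_2 = 1$ reproduce $a_n = a_{n-1} + a_{n-2}$.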
It would be desirable for an actual $C$-finite sequence of degree $d$ to yield a full-rank coefficient matrix, so that the coefficients $c_1, \dots, c_d$ are unique.
Suppose the matrix is not full rank. Write its columns as $\def\a{{\bf a}}\a_0, \dots, \a_{d-1}$, where $\a_i = (a_i, a_{i+1}, \dots, a_{i+d-1})^T$; by assumption they are linearly dependent. Let $\sum_{i=0}^{d-1}k_i\a_i=0$ be a linear dependence, and let $e$ be the largest index for which $k_e\neq 0$. Then the $j$th row of this dependence gives $$ a_{e+j}=-\frac{k_{e-1}}{k_e}a_{e+j-1}-\frac{k_{e-2}}{k_e}a_{e+j-2}-\dots -\frac{k_{0}}{k_e}a_{j}. $$ This holds for all $0\le j \le d-1$. You can then prove by induction that this holds for all $j \ge0$, using the original recurrence. Letting $\tilde k_i=-\frac{k_{e-i}}{k_e}$, $$ a_{e+j}=\sum_{i=1}^d c_i a_{e+j-i}\stackrel{\text{ind}}=\sum_{i=1}^d c_i \sum_{h=1}^e\tilde k_ha_{e+j-i-h}=\sum_{h=1}^e\tilde k_h\sum_{i=1}^dc_ia_{e+j-i-h}=\sum_{h=1}^e\tilde k_h a_{e+j-h}. $$ The step $\stackrel{\text{ind}}=$ applies the inductive hypothesis to the $d$ previous cases $a_{e+j-1}, \dots, a_{e+j-d}$, which is legitimate because there were $d$ base cases.
Finally, $a_{e+j}=\sum_{h=1}^e\tilde k_h a_{e+j-h}$ for all $j \ge 0$ shows that $\{a_n\}$ satisfies a recurrence of order $e \le d - 1$, contradicting the minimality of $d$. Hence the matrix is full rank.
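A small numerical illustration of this argument, under assumptions of my choosing: take the recurrence $a_n = 3a_{n-1} - 2a_{n-2}$ with $a_0 = 1$, $a_1 = 2$, whose solution is actually $a_n = 2^n$ (degree 1, so the claimed degree 2 is not minimal). The order-2 matrix is singular, and the dependence between its columns yields the shorter recurrence, exactly as in the proof:

```python
from fractions import Fraction

# A "degree-2" recurrence whose solution is actually 2^n (degree 1):
a = [Fraction(1), Fraction(2)]
for n in range(2, 10):
    a.append(3 * a[n - 1] - 2 * a[n - 2])

# The order-2 coefficient matrix [[a_1, a_0], [a_2, a_1]] is singular:
assert a[1] * a[1] - a[0] * a[2] == 0

# A dependence between the columns (a_0, a_1)^T and (a_1, a_2)^T:
# 2*(a_0, a_1) - 1*(a_1, a_2) = 0, i.e. k_0 = 2, k_1 = -1, so e = 1.
k0, k1 = Fraction(2), Fraction(-1)
kt1 = -k0 / k1  # tilde k_1 = 2

# The induction in the proof says a_{1+j} = kt1 * a_j for ALL j >= 0:
print(all(a[j + 1] == kt1 * a[j] for j in range(len(a) - 1)))  # True
```

The extracted order-1 recurrence $a_{j+1} = 2a_j$ is exactly the shorter recurrence whose existence contradicts minimality of $d = 2$.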