It is well known that if linear recurrences $u_n$ and $v_n$ have characteristic polynomials $K_u$ and $K_v$, then $u_n + v_n$ has characteristic polynomial $K_u K_v$. Is the converse true? If my characteristic polynomial factors as $K_u K_v$, then can I find $u_n$ and $v_n$ such that my recurrence can be written as $u_n + v_n$?
linear recurrence and characteristic polynomial

569 views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) at 2026-05-16 20:03:54. 1 answer below.
For the sake of closure, let me expand my comments into an answer.
The answer is "no"; but it becomes a "yes" if you assume $K_{u}$ and $K_{v}$ to be coprime. Before I start proving it, let me introduce my own notation, which I believe is clearer than yours (specifically, I will avoid the use of the nebulous notion of a "linear recurrence" and the easily misunderstood word "characteristic polynomial").
Fix a field $F$. Set $\mathbb{N}=\left\{ 0,1,2,\ldots\right\} $. Let $F^{\infty}$ be the set $\left\{ \left( a_{0},a_{1},a_{2},\ldots\right) \ \mid\ a_{i}\in F\text{ for each }i\in\mathbb{N}\right\} $ of all infinite sequences of elements of $F$. This set $F^{\infty}$ is an $F$-vector space (where the operations are entrywise: e.g., we have $\left( a_{0},a_{1} ,a_{2},\ldots\right) +\left( b_{0},b_{1},b_{2},\ldots\right) =\left( a_{0}+b_{0},a_{1}+b_{1},a_{2}+b_{2},\ldots\right) $).
Definition. Let $P=\sum\limits_{i=0}^{k}p_{i}X^{i}\in F\left[ X\right] $ be a polynomial (with $k\in\mathbb{N}$ and $p_{0},p_{1},\ldots,p_{k}\in F$). A sequence $\left( a_{0},a_{1},a_{2},\ldots\right) \in F^{\infty}$ will be called $P$-linearly recursive if each $n\in\mathbb{N}$ satisfies $\sum\limits_{i=0}^{k}p_{i}a_{n+i}=0$.

(My notion of a "$P$-linearly recursive sequence" roughly corresponds to your "recursive sequence with characteristic polynomial $P$", but keep in mind that $P$ is not uniquely determined by the sequence. For example, each $X$-linearly recursive sequence is also $X^{2}$-linearly recursive.)

The claim I shall prove is the following:

Theorem 1. Let $P\in F\left[ X\right] $ and $Q\in F\left[ X\right] $ be two coprime polynomials. Let $\mathbf{a}\in F^{\infty}$ be a $PQ$-linearly recursive sequence. Then $\mathbf{a}$ can be written as the sum of a $P$-linearly recursive sequence with a $Q$-linearly recursive sequence.
Before we prove Theorem 1, we introduce a helpful $F\left[ X\right] $-module structure on $F^{\infty}$. Namely, let $S$ be the $F$-linear map
$F^{\infty}\rightarrow F^{\infty},\ \left( a_{0},a_{1},a_{2},\ldots\right) \mapsto\left( a_{1},a_{2},a_{3},\ldots\right) $.
This map $S$ is called the shift operator (since it shifts a sequence by $1$ forward, dropping the very first entry). It is easy to see that
\begin{equation} S^{i}\left( a_{0},a_{1},a_{2},\ldots\right) =\left( a_{0+i} ,a_{1+i},a_{2+i},\ldots\right) \label{1} \tag{1} \end{equation}
for each $i\in\mathbb{N}$ and each $\left( a_{0},a_{1},a_{2},\ldots\right) \in F^{\infty}$.
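As a quick illustration of the shift operator and of \eqref{1} (a minimal Python sketch of my own; infinite sequences are modeled by finite prefixes, and the name `shift` is not part of the answer):

```python
def shift(a, i=1):
    """Apply the shift operator S^i to a sequence.

    Infinite sequences are modeled by finite prefixes (Python lists),
    so S^i simply drops the first i entries.
    """
    return a[i:]

a = [0, 1, 1, 2, 3, 5, 8, 13]   # a prefix of some sequence (a_0, a_1, ...)
print(shift(a))      # [1, 1, 2, 3, 5, 8, 13]
print(shift(a, 3))   # [2, 3, 5, 8, 13], i.e. the entries a_{0+3}, a_{1+3}, ...
```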
Let $\operatorname*{End}\left( F^{\infty}\right) $ denote the $F$-algebra of all endomorphisms of the $F$-vector space $F^{\infty}$. By the universal property of the polynomial ring $F\left[ X\right] $, there exists a unique $F$-algebra homomorphism $\phi:F\left[ X\right] \rightarrow \operatorname*{End}\left( F^{\infty}\right) $ satisfying $\phi\left( X\right) =S$. Consider this $\phi$, and use it to make $F^{\infty}$ into an $F\left[ X\right] $-module. Thus, this $F\left[ X\right] $-module structure on $F^{\infty}$ satisfies $X\mathbf{a}=\underbrace{\phi\left( X\right) }_{=S}\mathbf{a}=S\mathbf{a}$ for each $\mathbf{a}\in F^{\infty}$. We can now easily describe how any polynomial acts on $F^{\infty}$:
Proposition 2. Let $k\in\mathbb{N}$ and $p_{0},p_{1},\ldots,p_{k}\in F$. Let $P=\sum\limits_{i=0}^{k}p_{i}X^{i}\in F\left[ X\right] $, and let $\mathbf{a}=\left( a_{0},a_{1},a_{2},\ldots\right) \in F^{\infty}$. Then $P\mathbf{a}=\left( \sum\limits_{i=0}^{k}p_{i}a_{0+i},\sum\limits_{i=0}^{k}p_{i}a_{1+i},\sum\limits_{i=0}^{k}p_{i}a_{2+i},\ldots\right) $.

Proof of Proposition 2. We have $\mathbf{a}=\left( a_{0},a_{1},a_{2} ,\ldots\right) $. Hence, each $i\in\mathbb{N}$ satisfies \begin{equation} S^{i}\mathbf{a}=S^{i}\left( a_{0},a_{1},a_{2},\ldots\right) =\left( a_{0+i},a_{1+i},a_{2+i},\ldots\right) \label{2} \tag{2} \end{equation} (by \eqref{1}). On the other hand, applying the map $\phi$ to the equality $P=\sum\limits_{i=0}^{k}p_{i}X^{i}$, we obtain
$\phi\left( P\right) =\phi\left( \sum\limits_{i=0}^{k}p_{i}X^{i}\right) =\sum\limits_{i=0}^{k}p_{i}\phi\left( X\right) ^{i}$ (since $\phi$ is an $F$-algebra homomorphism)
$=\sum\limits_{i=0}^{k}p_{i}S^{i}$ (since $\phi\left( X\right) =S$).
But the definition of the $F\left[ X\right] $-module structure on $F^{\infty}$ shows that
$P\mathbf{a}=\underbrace{\phi\left( P\right) }_{=\sum\limits_{i=0}^{k} p_{i}S^{i}}\mathbf{a}=\sum\limits_{i=0}^{k}p_{i}\underbrace{S^{i}\mathbf{a} }_{\substack{=\left( a_{0+i},a_{1+i},a_{2+i},\ldots\right) \\\text{(by \eqref{2})}}}$
$=\sum\limits_{i=0}^{k}p_{i}\left( a_{0+i},a_{1+i},a_{2+i},\ldots\right) =\left( \sum\limits_{i=0}^{k}p_{i}a_{0+i},\sum\limits_{i=0}^{k}p_{i} a_{1+i},\sum\limits_{i=0}^{k}p_{i}a_{2+i},\ldots\right) $.
This proves Proposition 2.
Corollary 3. Let $P\in F\left[ X\right] $ be a polynomial, and let $\mathbf{a}\in F^{\infty}$ be a sequence. Then $\mathbf{a}$ is $P$-linearly recursive if and only if $P\mathbf{a}=0$.

Proof of Corollary 3. Write $P$ in the form $P=\sum\limits_{i=0}^{k} p_{i}X^{i}$ for some $k\in\mathbb{N}$ and some $p_{0},p_{1},\ldots,p_{k}\in F$. Write the sequence $\mathbf{a}\in F^{\infty}$ in the form $\mathbf{a} =\left( a_{0},a_{1},a_{2},\ldots\right) $. Thus, Proposition 2 yields $P\mathbf{a}=\left( \sum\limits_{i=0}^{k}p_{i}a_{0+i},\sum\limits_{i=0} ^{k}p_{i}a_{1+i},\sum\limits_{i=0}^{k}p_{i}a_{2+i},\ldots\right) $. Hence, we have the following chain of equivalences:
$\left( P\mathbf{a}=0\right) $
$\Longleftrightarrow\ \left( \left( \sum\limits_{i=0}^{k}p_{i}a_{0+i} ,\sum\limits_{i=0}^{k}p_{i}a_{1+i},\sum\limits_{i=0}^{k}p_{i}a_{2+i} ,\ldots\right) =0\right) $
$\Longleftrightarrow\ \left( \text{each }n\in\mathbb{N}\text{ satisfies } \sum\limits_{i=0}^{k}p_{i}a_{n+i}=0\right) $
$\Longleftrightarrow\ \left( \mathbf{a}\text{ is }P\text{-linearly recursive}\right) $
(by the definition of "$P$-linearly recursive"). This proves Corollary 3.
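Proposition 2 and Corollary 3 are easy to check numerically (a hedged Python sketch; the helper `apply_poly` and the finite-prefix model of sequences are my own, not part of the answer):

```python
def apply_poly(p, a):
    """Prefix of the sequence P * a, where p = [p_0, p_1, ..., p_k] lists
    the coefficients of P = sum_i p_i X^i and a is a prefix of a sequence.

    By Proposition 2, the n-th entry of P * a is sum_i p_i * a_{n+i};
    we return only the entries that the prefix a determines.
    """
    k = len(p) - 1
    return [sum(p[i] * a[n + i] for i in range(k + 1))
            for n in range(len(a) - k)]

# The Fibonacci sequence is (X^2 - X - 1)-linearly recursive, so by
# Corollary 3 the polynomial X^2 - X - 1 (coefficient list [-1, -1, 1])
# must send it to the zero sequence:
fib = [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
print(apply_poly([-1, -1, 1], fib))   # [0, 0, 0, 0, 0, 0, 0, 0]
```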
Proof of Theorem 1. Since $F\left[ X\right] $ is a principal ideal domain, Bezout's identity shows that there are two polynomials $U\in F\left[ X\right] $ and $V\in F\left[ X\right] $ such that $PU+QV=\gcd\left( P,Q\right) $. Consider these $U$ and $V$. Thus, $PU+QV=\gcd\left( P,Q\right) =1$ (since $P$ and $Q$ are coprime).
Corollary 3 (applied to $PQ$ instead of $P$) shows that $\mathbf{a}$ is $PQ$-linearly recursive if and only if $PQ\mathbf{a}=0$. Hence, $PQ\mathbf{a} =0$ (since $\mathbf{a}$ is $PQ$-linearly recursive).
The sequence $PU\mathbf{a}\in F^{\infty}$ satisfies $Q\left( PU\mathbf{a} \right) =\underbrace{QPU}_{=UPQ}\mathbf{a}=U\underbrace{PQ\mathbf{a}}_{=0} =0$. But Corollary 3 (applied to $Q$ and $PU\mathbf{a}$ instead of $P$ and $\mathbf{a}$) shows that $PU\mathbf{a}$ is $Q$-linearly recursive if and only if $Q\left( PU\mathbf{a}\right) =0$. Thus, $PU\mathbf{a}$ is $Q$-linearly recursive (since $Q\left( PU\mathbf{a}\right) =0$).
The sequence $QV\mathbf{a}\in F^{\infty}$ satisfies $P\left( QV\mathbf{a} \right) =\underbrace{PQV}_{=VPQ}\mathbf{a}=V\underbrace{PQ\mathbf{a}}_{=0} =0$. But Corollary 3 (applied to $QV\mathbf{a}$ instead of $\mathbf{a}$) shows that $QV\mathbf{a}$ is $P$-linearly recursive if and only if $P\left( QV\mathbf{a}\right) =0$. Thus, $QV\mathbf{a}$ is $P$-linearly recursive (since $P\left( QV\mathbf{a}\right) =0$).
Now, $PU\mathbf{a}+QV\mathbf{a}=\underbrace{\left( PU+QV\right) } _{=1}\mathbf{a}=\mathbf{a}$. Hence, $\mathbf{a}=PU\mathbf{a}+QV\mathbf{a} =QV\mathbf{a}+PU\mathbf{a}$ is the sum of a $P$-linearly recursive sequence (namely, $QV\mathbf{a}$) with a $Q$-linearly recursive sequence (namely, $PU\mathbf{a}$). This proves Theorem 1.
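Here is the proof carried out on a concrete example (a Python sketch under my own conventions: sequences as finite prefixes, polynomials as coefficient lists, and the Bezout cofactors computed by hand for $P=X-2$ and $Q=X-3$, which satisfy $P\cdot1+Q\cdot\left( -1\right) =1$):

```python
def apply_poly(p, a):
    """Prefix of P * a for the coefficient list p = [p_0, ..., p_k]."""
    k = len(p) - 1
    return [sum(p[i] * a[n + i] for i in range(k + 1))
            for n in range(len(a) - k)]

# a_n = 2^n + 3^n is PQ-linearly recursive for PQ = (X - 2)(X - 3).
a = [2**n + 3**n for n in range(10)]

# Bezout: P*U + Q*V = 1 with U = 1 and V = -1.
pu_a = apply_poly([-2, 1], a)   # P*U*a = (X - 2) a
qv_a = apply_poly([3, -1], a)   # Q*V*a = -(X - 3) a

print(pu_a)   # [1, 3, 9, ...]: powers of 3, hence Q-linearly recursive
print(qv_a)   # [1, 2, 4, ...]: powers of 2, hence P-linearly recursive
print([x + y for x, y in zip(pu_a, qv_a)] == a[:9])   # True: their sum is a
```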
Notice that Theorem 1 gives an alternative approach to explicit formulas for linearly recursive sequences (like the Binet formula for Fibonacci numbers). Indeed, repeated application of Theorem 1 yields the following corollary:
Corollary 4. Let $P\in F\left[ X\right] $ be a polynomial that factors as $P=P_{1}^{m_{1}}P_{2}^{m_{2}}\cdots P_{k}^{m_{k}}$ for some pairwise coprime polynomials $P_{1},P_{2},\ldots,P_{k}\in F\left[ X\right] $ and some $m_{1},m_{2},\ldots,m_{k}\in\mathbb{N}$. Then each $P$-linearly recursive sequence can be written as a sum $\mathbf{a}_{1}+\mathbf{a}_{2}+\cdots+\mathbf{a}_{k}$, where each $\mathbf{a}_{j}$ is a $P_{j}^{m_{j}}$-linearly recursive sequence.

For example, if $F=\mathbb{C}$ and $P=X^{2}-X-1$, then we can apply Corollary 4 to any $P$-linearly recursive sequence (setting $k=2$, $P_{1}=X-\dfrac {1+\sqrt{5}}{2}$, $m_{1}=1$, $P_{2}=X-\dfrac{1-\sqrt{5}}{2}$, and $m_{2}=1$). We thus conclude that each $\left( X^{2}-X-1\right) $-linearly recursive sequence can be written as the sum of an $\left( X-\dfrac{1+\sqrt{5}} {2}\right) $-linearly recursive sequence (i.e., a geometric sequence with ratio $\dfrac{1+\sqrt{5}}{2}$) with an $\left( X-\dfrac{1-\sqrt{5}} {2}\right) $-linearly recursive sequence (i.e., a geometric sequence with ratio $\dfrac{1-\sqrt{5}}{2}$). This lets you easily prove the Binet formula for the Fibonacci sequence. In general, you might have to work harder (particularly if $P$ is not squarefree, and so some of the $m_{i}$ are $>1$).
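The resulting Binet formula can be checked in floating point (a Python sanity check of my own; the tolerance 1e-6 is an arbitrary choice):

```python
import math

phi = (1 + math.sqrt(5)) / 2   # root of X^2 - X - 1
psi = (1 - math.sqrt(5)) / 2   # the other root

fib = [0, 1]
for _ in range(18):
    fib.append(fib[-1] + fib[-2])

# Matching the initial values 0 and 1 in the decomposition forces the
# geometric coefficients 1/sqrt(5) and -1/sqrt(5), i.e. Binet's formula
# F_n = (phi^n - psi^n) / sqrt(5):
binet = [(phi**n - psi**n) / math.sqrt(5) for n in range(20)]
print(all(abs(b - f) < 1e-6 for b, f in zip(binet, fib)))   # True
```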
Notice also that Theorem 1 does not hold if we drop the requirement that $P$ and $Q$ be coprime. For example, the sequence $\left( 0,1,2,\ldots\right) $ is $\left( X-1\right) ^{2}$-linearly recursive, but cannot be written as a sum of two $\left( X-1\right) $-linearly recursive sequences. (In fact, the $\left( X-1\right) ^{2}$-linearly recursive sequences are the arithmetic sequences, while the $\left( X-1\right) $-linearly recursive sequences are the constant sequences.) There is a counterexample for every pair $\left( P,Q\right) $ of non-coprime polynomials $P$ and $Q$ (over any field $F$).
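This counterexample is just as easy to verify (a small Python check, again using my finite-prefix convention for sequences):

```python
seq = list(range(12))   # the arithmetic sequence 0, 1, 2, ...

# (X - 1)^2 = X^2 - 2X + 1 annihilates seq, so seq is
# (X - 1)^2-linearly recursive:
print(all(seq[n + 2] - 2 * seq[n + 1] + seq[n] == 0 for n in range(10)))  # True

# But (X - 1) sends seq to the constant sequence (1, 1, 1, ...), not to 0;
# and a sum of two (X - 1)-linearly recursive (i.e. constant) sequences is
# again constant, so it can never equal (0, 1, 2, ...).
print([seq[n + 1] - seq[n] for n in range(11)])   # [1, 1, ..., 1]
```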