Calculate the radius of convergence and the sum of the series


Suppose $u_0=1$ and $v_0=0$, then for all $n\geq 0$ $$ u_{n+1}=-u_n-2v_n,~~v_{n+1}=3u_n+4v_n$$ Calculate the radius of convergence and the sum of $\sum_{n\geq 0}\frac{u_n}{n!}x^n$ and $\sum_{n\geq 1}\frac{v_n}{n!}x^n$.

I thought about the matrix representation to solve the question, so $$ \begin{pmatrix} u_{n+1} \\ v_{n+1} \end{pmatrix} = \begin{bmatrix} -1 & -2 \\ 3 & 4 \end{bmatrix} \begin{pmatrix} u_{n} \\ v_{n} \end{pmatrix}$$ where $$ \begin{pmatrix} u_{n} \\ v_{n} \end{pmatrix} = A^n \begin{pmatrix} u_{0} \\ v_{0} \end{pmatrix} \quad\text{and}\quad A = \begin{bmatrix} -1 & -2 \\ 3 & 4 \end{bmatrix}$$
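This matrix form is easy to sanity-check numerically: iterating the recurrence should agree with applying $A^n$ to the initial vector. A minimal sketch (variable names are mine):

```python
import numpy as np

# Recurrence matrix A from the question
A = np.array([[-1, -2],
              [3, 4]])

x = np.array([1, 0])  # (u_0, v_0) = (1, 0)
terms = [x]
for _ in range(5):
    x = A @ x          # (u_{n+1}, v_{n+1}) = A (u_n, v_n)
    terms.append(x)

# Same terms via (u_n, v_n) = A^n (u_0, v_0)
for n, t in enumerate(terms):
    assert np.array_equal(t, np.linalg.matrix_power(A, n) @ np.array([1, 0]))
```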

I think I now need to work with the eigenvalues and eigenvectors, but I don't have an idea how to do so!


Hints:

  • calculate the eigenvalues; you'll get $1$ and $2$
  • calculate the associated eigenvectors, for instance $(1,-1)$ and $(-2,3)$
  • what can be said about $a_n=u_n+v_n$ and $b_n=3u_n+2v_n$?
  • solve the equations for $a_n$ and $b_n$
  • substitute back to recover $u_n$ and $v_n$

Alternatively, for the third point, you can compute the matrix of eigenvectors $P=\begin{bmatrix}1&-2\\-1&3\end{bmatrix}$ such that $A=PDP^{-1}$ with $D=\begin{bmatrix}1&0\\0&2\end{bmatrix}$, so that $A^n=PD^nP^{-1}$.

You'll notice from the form of $P^{-1}=\begin{bmatrix}3&2\\1&1\end{bmatrix}$ that this amounts to introducing $a_n$ and $b_n$ (up to the order of the components) by setting $$\begin{pmatrix}b_n\\a_n\end{pmatrix}=P^{-1}\begin{pmatrix}u_n\\v_n\end{pmatrix}$$
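The diagonalization above can be checked numerically, assuming the eigenvector/eigenvalue pairing $D=\operatorname{diag}(1,2)$; a minimal sketch:

```python
import numpy as np

A = np.array([[-1, -2], [3, 4]])
P = np.array([[1, -2], [-1, 3]])   # columns: eigenvectors (1,-1) and (-2,3)
D = np.diag([1, 2])                # corresponding eigenvalues 1 and 2

# A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Hence A^n = P D^n P^{-1}; spot-check at n = 6
n = 6
assert np.allclose(np.linalg.matrix_power(A, n),
                   P @ np.diag([1.0**n, 2.0**n]) @ np.linalg.inv(P))
```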


Here is a variant approach to solving it.

The characteristic polynomial of the matrix $A$ is $\chi(x)=2-3x+x^2$, so by Cayley-Hamilton $\chi(A)=A^2-3A+2I=0$.

Applying this to the vector $\begin{pmatrix}u_n\\v_n\end{pmatrix}$ gives $\chi(A)\begin{pmatrix}u_n\\v_n\end{pmatrix}=\begin{pmatrix}u_{n+2}\\v_{n+2}\end{pmatrix}-3\begin{pmatrix}u_{n+1}\\v_{n+1}\end{pmatrix}+2\begin{pmatrix}u_n\\v_n\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix}$

So both sequences satisfy the linear recurrence

$\begin{cases}u_{n+2}-3u_{n+1}+2u_n=0\\v_{n+2}-3v_{n+1}+2v_n=0\end{cases}$
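As a quick sanity check, the terms generated by the original coupled recurrence do satisfy this second-order scalar recurrence; a short sketch:

```python
# Generate (u_n, v_n) from the original coupled recurrence...
us, vs = [1], [0]
for _ in range(10):
    u, v = us[-1], vs[-1]
    us.append(-u - 2 * v)
    vs.append(3 * u + 4 * v)

# ...and check x_{n+2} - 3 x_{n+1} + 2 x_n = 0 for both sequences
for n in range(9):
    assert us[n + 2] - 3 * us[n + 1] + 2 * us[n] == 0
    assert vs[n + 2] - 3 * vs[n + 1] + 2 * vs[n] == 0
```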

Solving it classically, the solutions are linear combinations of powers of the eigenvalues (or equivalently, of the roots of $\chi(x)=0$).

Thus $\exists (\alpha,\beta)\in\mathbb R^2$ such that $u_n=\alpha\, 1^n + \beta\, 2^n$, and similarly for $v_n$ with a different pair $(\alpha,\beta)$.

Calculate with $u_0=1,\ v_0=0$ to determine all these constants, and you get the result shown in automaticallyGenerated's answer.
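The constant-fitting step can be sketched as a small linear solve from the first two terms of each sequence (names are illustrative):

```python
import numpy as np

u0, v0 = 1, 0
u1, v1 = -u0 - 2 * v0, 3 * u0 + 4 * v0   # u_1 = -1, v_1 = 3

# x_n = alpha*1^n + beta*2^n  =>  n=0: alpha + beta = x_0,  n=1: alpha + 2*beta = x_1
M = np.array([[1.0, 1.0],
              [1.0, 2.0]])
alpha_u, beta_u = np.linalg.solve(M, [u0, u1])   # expect alpha_u = 3, beta_u = -2
alpha_v, beta_v = np.linalg.solve(M, [v0, v1])   # expect alpha_v = -3, beta_v = 3
```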


As you can see, everything is tightly entangled: the classical approach via the characteristic equation and the approach via matrix diagonalization are just different presentations of the same thing.


If you write out a few terms of $u_n$ and $v_n$, you can spot the pattern $$u_n = -2 \cdot 2^n+3, \qquad v_n = 3 \cdot 2^n-3$$

This can be proven rigorously by induction: if we assume the rule holds for $u_n$ and $v_n$, then $$u_{n+1} = -(-2 \cdot 2^n+3) - 2(3 \cdot 2^n-3) = - 4 \cdot 2^n + 3 = -2 \cdot 2^{n+1} + 3$$

$$v_{n+1} = 3(-2 \cdot 2^n+3) + 4(3 \cdot 2^n-3) = 6 \cdot 2^n - 3 = 3 \cdot 2^{n+1} - 3$$

Using that, we can see that $$\sum_{n=0}^{\infty}\frac{u_n}{n!}x^n = \sum_{n=0}^{\infty}\frac{-2 \cdot 2^n+3}{n!}x^n$$ and $$\sum_{n=0}^{\infty}\frac{v_n}{n!}x^n = \sum_{n=0}^{\infty}\frac{3 \cdot 2^n-3}{n!}x^n$$ (since $v_0=0$, starting the second sum at $n=0$ rather than $n=1$ changes nothing).

Can you take it from here?
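If it helps to check your final answer: assuming the standard expansion $\sum_{n\ge 0}\frac{(ax)^n}{n!}=e^{ax}$, valid for every $x$ (so both radii of convergence are infinite), the sums should come out to $3e^x - 2e^{2x}$ and $3e^{2x} - 3e^x$. A numeric sanity check of that reading:

```python
import math

def partial_sum(coeff, x, N=80):
    """Partial sum of sum_{n=0}^{N} coeff(n)/n! * x**n."""
    return sum(coeff(n) / math.factorial(n) * x**n for n in range(N + 1))

x = 1.3  # arbitrary test point; the series converge for all x
su = partial_sum(lambda n: -2 * 2**n + 3, x)   # sum of u_n/n! x^n
sv = partial_sum(lambda n: 3 * 2**n - 3, x)    # sum of v_n/n! x^n

assert abs(su - (3 * math.exp(x) - 2 * math.exp(2 * x))) < 1e-9
assert abs(sv - (3 * math.exp(2 * x) - 3 * math.exp(x))) < 1e-9
```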