Cosine of a $2 \times 2$ non-diagonalisable matrix


Given

$$A = \begin{bmatrix} \pi-1 & 1\\ -1 & \pi+1 \end{bmatrix}$$

I need to calculate its cosine, $\cos(A)$. Typically I use diagonalisation for this type of problem:

$$\cos(A) = P \cos(D) P^{-1}$$

However, in this problem the matrix has a repeated eigenvalue, $\lambda_1=\lambda_2=\pi$, so there are not two linearly independent eigenvectors and the method does not apply.

Are there any alternative approaches?

The Maclaurin series expansion of $\cos(A)$ also seems unworkable, since $A$ is not nilpotent: no power $A^n$ is the zero matrix, so the series does not terminate.

There are 4 answers below.

Accepted answer:

When you do the Jordan decomposition, you get $A = SJS^{-1}$ with $$ S = \begin{pmatrix}1&1\\1&2\end{pmatrix}\quad\text{and}\quad J = \begin{pmatrix}\pi&1\\0&\pi\end{pmatrix}. $$ You find that $$ J^n = \begin{pmatrix}\pi^n&n\pi^{n-1}\\0&\pi^n\end{pmatrix}. $$ Since $A^n = SJ^nS^{-1}$, this implies \begin{align} \cos(A) &= \sum_{n=0}^\infty \frac{(-1)^nA^{2n}}{(2n)!} = S\left(\sum_{n=0}^\infty \frac{(-1)^nJ^{2n}}{(2n)!}\right)S^{-1}\\ &= S\left(\sum_{n=0}^\infty \frac{(-1)^n}{(2n)!}\begin{pmatrix}\pi^{2n}&2n\pi^{2n-1}\\0&\pi^{2n}\end{pmatrix}\right)S^{-1}\\ &= S\begin{pmatrix}\cos(\pi)&\sum_{n=0}^\infty\frac{(-1)^n 2n\pi^{2n-1}}{(2n)!}\\0&\cos(\pi)\end{pmatrix}S^{-1} = S\begin{pmatrix}-1&\sum_{n=1}^\infty\frac{(-1)^n\pi^{2n-1}}{(2n-1)!}\\0&-1\end{pmatrix}S^{-1}\\ &= S\begin{pmatrix}-1&-\sin(\pi)\\0&-1\end{pmatrix}S^{-1} = -I_2. \end{align}
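As a numerical sanity check of this decomposition (a sketch using NumPy; $S$, $J$, and $\cos(J)$ are taken directly from the computation above):

```python
import numpy as np

pi = np.pi
A = np.array([[pi - 1.0, 1.0],
              [-1.0,     pi + 1.0]])

# S and J from the Jordan decomposition above
S = np.array([[1.0, 1.0],
              [1.0, 2.0]])
J = np.array([[pi, 1.0],
              [0.0, pi]])
S_inv = np.linalg.inv(S)

# Verify A = S J S^{-1}
assert np.allclose(A, S @ J @ S_inv)

# cos of the Jordan block, as derived above: [[cos(pi), -sin(pi)], [0, cos(pi)]]
cos_J = np.array([[np.cos(pi), -np.sin(pi)],
                  [0.0,        np.cos(pi)]])

result = S @ cos_J @ S_inv
print(np.round(result, 10))  # numerically indistinguishable from -I
```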

Answer:

Use the Cayley-Hamilton theorem, a consequence of which is that for any analytic function $f$ of a $2\times2$ matrix $A$, $f(A)=aI+bA$ for some scalar coefficients $a$ and $b$. To compute $a$ and $b$, use the fact that this equation is satisfied by the eigenvalues of the matrix, i.e., $a+b\lambda=\cos\lambda$. Since $A$ doesn’t have two distinct eigenvalues, you can generate a second, independent equation by differentiating both sides to get $b=-\sin\lambda$. Substituting $\lambda=\pi$, we quickly get $a=-1$, $b=0$, therefore $\cos(A)=-I$. One nice thing about this approach is that you don’t need to know whether or not the matrix is diagonalizable; you just need the algebraic multiplicities of its eigenvalues.
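This recipe is easy to carry out numerically (a minimal sketch with NumPy; the names `a`, `b`, and `lam` mirror the scalars above):

```python
import numpy as np

pi = np.pi
A = np.array([[pi - 1.0, 1.0],
              [-1.0,     pi + 1.0]])

lam = pi                   # double eigenvalue of A
b = -np.sin(lam)           # from differentiating a + b*lam = cos(lam)
a = np.cos(lam) - b * lam  # back-substitute into a + b*lam = cos(lam)

# Cayley-Hamilton: cos(A) = a*I + b*A
cosA = a * np.eye(2) + b * A
print(np.round(cosA, 10))  # numerically indistinguishable from -I
```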

Answer:

When working this sort of thing out I usually take powers of the matrix and look for a pattern. In this case I found,

$$ M^0 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$

$$ M^1 = \begin{bmatrix} \pi -1 & 1 \\ -1 & \pi+1 \end{bmatrix}$$

$$ M^2 = \begin{bmatrix} \pi^2 - 2\pi & 2\pi \\ -2\pi & \pi^2 + 2\pi\end{bmatrix}$$

$$ M^3 = \begin{bmatrix} \pi^3 - 3\pi^2 & 3\pi^2 \\ -3\pi^2 & \pi^3 + 3\pi^2\end{bmatrix}$$

$$ M^4 = \begin{bmatrix} \pi^4 - 4\pi^3 & 4\pi^3 \\ -4\pi^3 & \pi^4 + 4\pi^3 \end{bmatrix}.$$

This leads me to suspect that for $p\geq 0$ we have,

$$ \boxed{M^p = \begin{bmatrix} \pi^p - p \pi^{p-1} & p \pi^{p-1} \\ -p \pi^{p-1} & \pi^p + p \pi^{p-1} \end{bmatrix}}$$

This can be confirmed by a simple induction proof. Now we can evaluate $\cos(M)$ using the series definition of $\cos$

$$ \cos(M) = \sum_{p=0}^\infty (-1)^p\frac{M^{2p}}{(2p)!}$$

$$ = \sum_{p=0}^\infty \frac{(-1)^p}{(2p)!}\begin{bmatrix} \pi^{2p} - 2p \pi^{2p-1} & 2p \pi^{2p-1} \\ -2p \pi^{2p-1} & \pi^{2p} + 2p \pi^{2p-1} \end{bmatrix}$$


Bringing the summation inside the matrix entries, we get the following:

$$ \cos(M) = \begin{bmatrix} \sum_{p=0}^\infty \frac{(-1)^p \pi^{2p}}{(2p)!} - \sum_{p=0}^\infty \frac{(-1)^p 2p \pi^{2p-1}}{(2p)!} & \sum_{p=0}^\infty \frac{(-1)^p 2p \pi^{2p-1}}{(2p)!} \\ - \sum_{p=0}^\infty \frac{(-1)^p 2p \pi^{2p-1}}{(2p)!} & \sum_{p=0}^\infty \frac{(-1)^p \pi^{2p}}{(2p)!} + \sum_{p=0}^\infty \frac{(-1)^p 2p \pi^{2p-1}}{(2p)!} \end{bmatrix} $$

So we need to evaluate the following sums,

$$ S_1 = \sum_{p=0}^\infty \frac{(-1)^p \pi^{2p}}{(2p)!} = \cos(\pi),$$

and

$$ S_2 = \sum_{p=0}^\infty \frac{(-1)^p 2p \pi^{2p-1}}{(2p)!} = \left.\frac{d}{dx} \sum_{p=0}^\infty \frac{(-1)^p x^{2p}}{(2p)!}\right|_{x=\pi} = -\sin(\pi).$$

We now have

$$ \cos(M) = \begin{bmatrix} S_1 - S_2 & S_2 \\ -S_2 & S_1 + S_2 \end{bmatrix} = \begin{bmatrix} \cos(\pi) + \sin(\pi) & -\sin(\pi) \\ \sin(\pi) & \cos(\pi) - \sin(\pi) \end{bmatrix} = -I, $$

since $\cos(\pi) = -1$ and $\sin(\pi) = 0$.

Answer:

Another approach using the Cayley-Hamilton theorem (a proof covering non-diagonalizable matrices is given at https://en.wikipedia.org/wiki/Cayley%E2%80%93Hamilton_theorem) would be to find the recurrence satisfied by the powers of $A$, generalize it to $A^n$, and substitute the resulting pattern into the series for $\cos A$.