Two questions about trig functions of matrices

So the first part of this question asked me to define $\cos A$ and $\sin A$, where $A$ is an $n \times n$ real matrix. I used the Taylor series expansions of sine and cosine to get

$\sin A = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!}A^{2n+1}$

$\cos A = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}A^{2n}$
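
(As a quick sanity check, not part of the definition, these partial sums can be compared against SciPy's built-in matrix functions `scipy.linalg.sinm` and `scipy.linalg.cosm`; the matrix `A` below is just an arbitrary example.)

```python
import numpy as np
from scipy.linalg import sinm, cosm

def sin_cos_series(A, terms=30):
    """Partial sums of the power series defining sin(A) and cos(A)."""
    S, C = np.zeros_like(A, dtype=float), np.eye(A.shape[0])
    term = np.eye(A.shape[0])            # running term A^k / k!, starting at k = 0
    for k in range(1, 2 * terms):
        term = term @ A / k              # now equals A^k / k!
        if k % 4 == 1:   S += term       # +A, +A^5/5!, ...
        elif k % 4 == 2: C -= term       # -A^2/2!, -A^6/6!, ...
        elif k % 4 == 3: S -= term       # -A^3/3!, ...
        else:            C += term       # +A^4/4!, ...
    return S, C

A = np.array([[0.0, 1.0], [-2.0, 0.5]])  # arbitrary test matrix
S, C = sin_cos_series(A)
print(np.allclose(S, sinm(A)), np.allclose(C, cosm(A)))  # True True
```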

For part a, I need to show that

$\frac{d}{dt}\sin(At) = A\cos(At)$

$\frac{d}{dt}\cos(At) = -A\sin(At)$

I was able to do this for sin with the following steps:

$\frac{d}{dt}\sin(At) = \frac{d}{dt}\left(\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!}(At)^{2n+1}\right) = \frac{d}{dt}\left(\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!}A^{2n+1}t^{2n+1}\right)=\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!}A^{2n+1}(2n+1)t^{2n} = \sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}A^{2n+1}t^{2n}=A\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}(At)^{2n}=A\cos(At)$

However, for cosine I get stuck. I start off the same way:

$\frac{d}{dt}\cos(At) = \frac{d}{dt}\left(\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}(At)^{2n}\right)=\frac{d}{dt}\left(\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}A^{2n}t^{2n}\right)=\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n)!}A^{2n}(2n)t^{2n-1}=\sum_{n=1}^{\infty} \frac{(-1)^n}{(2n-1)!}A^{2n}t^{2n-1}$ (the $n=0$ term is constant and drops out, so the last sum starts at $n=1$)

I need the $2n-1$ to become $2n+1$, but I don't see how to get there.

For part b, I need to show that this identity still holds:

$(\cos At)^2 + (\sin At)^2 = I$

The hint I am given is to "Show that both sides define solutions to a certain first-order matrix-valued initial value problem." But I can't figure out what this IVP is supposed to be.
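
(Numerically, both parts do check out, e.g. with SciPy's matrix sine/cosine and a finite-difference derivative, so it really is just the arguments I'm missing:)

```python
import numpy as np
from scipy.linalg import sinm, cosm

A = np.array([[0.0, 1.0], [-2.0, 0.5]])   # arbitrary test matrix
t, h = 0.7, 1e-6

# Part a: central finite difference vs. the claimed derivatives
d_sin = (sinm(A * (t + h)) - sinm(A * (t - h))) / (2 * h)
d_cos = (cosm(A * (t + h)) - cosm(A * (t - h))) / (2 * h)
print(np.allclose(d_sin,  A @ cosm(A * t), atol=1e-6))   # True
print(np.allclose(d_cos, -A @ sinm(A * t), atol=1e-6))   # True

# Part b: the identity (cos At)^2 + (sin At)^2 = I
P = cosm(A * t) @ cosm(A * t) + sinm(A * t) @ sinm(A * t)
print(np.allclose(P, np.eye(2)))                          # True
```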

1 Answer

Instead of focusing on a particular function, consider a generic analytic function and its derivative: $$f(x) = \sum_{k=0}^\infty \beta_k x^k \qquad\implies\qquad g(x) = \frac{df}{dx} = \sum_{k=1}^\infty k\,\beta_k\, x^{k-1}$$ By the chain rule, $$\dot f = \left(\frac{df}{dx}\right)\left(\frac{dx}{dt}\right) = g\,\dot x$$ Now apply these functions to a square matrix $X = X(t)$: $$F = f(X) = \sum_{k=0}^\infty \beta_k X^k \qquad\implies\qquad G = g(X) = \sum_{k=1}^\infty k\,\beta_k\, X^{k-1}$$ Differentiating the series for $F$ term by term gives $$\dot F = \sum_{k=1}^\infty \beta_k \sum_{j=0}^{k-1} X^j\,\dot X\,X^{k-j-1} \;\ne\; G\,\dot X$$ The problem is that, in general, $\dot X$ does not commute with $X$.

However, given a constant matrix $A$, the matrix $X=At$ does commute with its time derivative, since $$\dot X = A \qquad\implies\qquad X\dot X = \dot X X = A^2 t$$ Substituting this $X$ into the expression for $\dot F$ allows the terms to be collected exactly the way they were in the scalar case, yielding $\,\dot F = GA = AG$.
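
Written out with $X = At$ and $\dot X = A$ (using only the fact that $A$ commutes with every power of $At$), the collected sum is $$\dot F = \sum_{k=1}^\infty \beta_k \sum_{j=0}^{k-1} (At)^j\,A\,(At)^{k-j-1} = \sum_{k=1}^\infty k\,\beta_k\,A^k t^{k-1} = A\sum_{k=1}^\infty k\,\beta_k\,(At)^{k-1} = A\,g(At) = AG$$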

In this particular problem, $f(x)=\sin(x)$ and $g(x)=\cos(x)$, so $\dot F = A\cos(At)$; the same argument with $f(x)=\cos(x)$ and $g(x)=-\sin(x)$ gives the second identity.
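
That second case is exactly where the series computation in the question stalled. Written out, with no new ingredients beyond the index shift $n \mapsto n+1$: $$\frac{d}{dt}\cos(At) = \sum_{n=1}^{\infty} \frac{(-1)^n}{(2n-1)!}A^{2n}t^{2n-1} = \sum_{n=0}^{\infty} \frac{(-1)^{n+1}}{(2n+1)!}A^{2n+2}t^{2n+1} = -A\sum_{n=0}^{\infty} \frac{(-1)^n}{(2n+1)!}(At)^{2n+1} = -A\sin(At)$$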


For part b, if you are given a functional identity $$f_1(x) + f_2(x) = f_3(x)$$ then it doesn't matter whether you evaluate the LHS or the RHS at a matrix argument, because the two sides are identical as functions. Here $f_1(x)=\sin^2(x)$, $f_2(x)=\cos^2(x)$, and $f_3(x)=1$.

You merely need to recognize the RHS as a function of $x$ in its own right, which translates into matrix form as $$x^0 = 1 \qquad\implies\qquad X^0 = I$$ Or, if you prefer, you can think of the constant function as a Taylor series with a single term: $$f_3(x) = \sin^2(x)+\cos^2(x) = \sum_{k=0}^0 x^k = 1$$
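
(For what it's worth, the hint in the question can also be realized directly; this is a sketch, using part a and the fact that $A$, $\sin(At)$, and $\cos(At)$ all commute, each being a power series in $At$. Let $E(t) = (\cos At)^2 + (\sin At)^2$. Then $$\dot E = -A\sin(At)\cos(At) - \cos(At)\,A\sin(At) + A\cos(At)\sin(At) + \sin(At)\,A\cos(At) = 0,$$ so $E$ solves the initial value problem $\dot E = 0$, $E(0) = I$, whose only solution is the constant $I$.)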