What is the cosine/sine of a matrix good for?


In an Advanced Linear Algebra class, we recently learned about the cosine, exponential and sine of a matrix - how to define them, how to compute them, etc. However, we did not learn about the applications of, or the need for, these concepts. I have read in some places about how the exponential of a matrix is used in robotics (exponential coordinates), but what is the need for defining the cosine and sine? Is it just for fun?


There are 2 best solutions below


Before I start, take into account that evaluating a function at a matrix does not mean entrywise evaluation (which is what Octave and Matlab do by default): $$\text{If } A=[A_{i,j}] \text{ then } f(A)\neq[f(A_{i,j})].$$

For a real function $f$ with a Taylor expansion, we first define the matrix equivalent of $f$ for diagonal matrices: $$\text{If } D=\begin{bmatrix}d_1\\&d_2\\&&\ddots\\&&&d_n\end{bmatrix} \qquad \text{then } f(D):=\begin{bmatrix}f(d_1)\\&f(d_2)\\&&\ddots\\&&&f(d_n)\end{bmatrix}.$$

Next we extend $f$ to non-diagonal matrices that admit an eigendecomposition: $$\text{If } A=P\ \Lambda\ P^{-1} \text{ where } \Lambda \text{ is diagonal, then }f(A):=P\ f(\Lambda)\ P^{-1}.$$

The matrix exponential is useful because the solution of $$f'(t)=-A\ f(t)$$ is $$f(t) =\exp(-tA)\ f(0) =P\ \exp(-t\Lambda)\ P^{-1} f(0).$$ Similarly, when all eigenvalues of $A$ are positive (so that $\sqrt{\Lambda}$ is real), we can use the matrix cosine and sine to express the solution of $$f''(t)=-A\ f(t)$$ as $$f(t)=P\ \cos(t\sqrt{\Lambda})\ P^{-1} f(0) +P\ (\sqrt{\Lambda})^{-1}\sin(t\sqrt{\Lambda})\ P^{-1} f'(0).$$ Note, however, that this does not directly involve $\cos(A) =P\ \cos(\Lambda)\ P^{-1}$ or $\sin(A) =P\ \sin(\Lambda)\ P^{-1}$ themselves, because the solution features $\sqrt{\Lambda}$ rather than $\Lambda$.
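To make the definition above concrete, here is a minimal numpy/scipy sketch (the helper name `fun_of_matrix` is my own) that evaluates $f(A)=P\,f(\Lambda)\,P^{-1}$ for a diagonalizable matrix and checks it against `scipy.linalg.cosm`, while also confirming it differs from the entrywise `np.cos`:

```python
import numpy as np
from scipy.linalg import cosm

def fun_of_matrix(f, A):
    """Evaluate f at a diagonalizable matrix via A = P @ diag(w) @ inv(P)."""
    w, P = np.linalg.eig(A)
    # P * f(w) scales column j of P by f(w[j]), i.e. P @ diag(f(w))
    return (P * f(w)) @ np.linalg.inv(P)

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])       # diagonalizable, eigenvalues 2 and 3

C = fun_of_matrix(np.cos, A)
assert np.allclose(C, cosm(A))          # matches scipy's matrix cosine
assert not np.allclose(C, np.cos(A))    # entrywise cosine is different
```

This sketch assumes `A` is diagonalizable with a well-conditioned eigenvector matrix; in general, `scipy.linalg.funm` is the more robust tool.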

I think matrices such as $\exp(-t\sqrt[4]{\Lambda})$, $\exp(t\sqrt[4]{\Lambda})$, $\cos(t\sqrt[4]{\Lambda})$ and $\sin(t\sqrt[4]{\Lambda})$ appear when solving $f^{(4)}(t)=A\ f(t)$, since the scalar roots of $\mu^4=a$ are $\pm a^{1/4}$ and $\pm i\,a^{1/4}$. (I worked it out once but have forgotten the details.)

Finally, I will mention that matrices such as the Discrete Cosine Transform and Discrete Sine Transform matrices (which actually *are* built as entrywise evaluations $[\cos(a_{i,j})]$, $[\sin(b_{i,j})]$) are useful because their columns are eigenvectors of certain special tridiagonal matrices (Britanak, Yip and Rao, *Discrete Cosine and Sine Transforms: General Properties, Fast Algorithms and Integer Approximations*, 2006) that arise when numerically solving some differential equations.
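This eigenvector property is easy to verify numerically for the DST-I matrix and the standard second-difference (discrete Laplacian) tridiagonal matrix; a small sketch, using the well-known closed forms $v_k(j)=\sin\!\big(\tfrac{jk\pi}{n+1}\big)$ and $\lambda_k=2-2\cos\!\big(\tfrac{k\pi}{n+1}\big)$:

```python
import numpy as np

n = 8
j, k = np.meshgrid(np.arange(1, n + 1), np.arange(1, n + 1), indexing="ij")
S = np.sin(j * k * np.pi / (n + 1))    # DST-I matrix: entrywise sines

# second-difference matrix: 2 on the diagonal, -1 on the off-diagonals
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# its eigenvalues, one per column of S
lam = 2 - 2 * np.cos(np.arange(1, n + 1) * np.pi / (n + 1))

# column k of S is an eigenvector of T with eigenvalue lam[k]
assert np.allclose(T @ S, S * lam)
```

The same identity is why fast DST/DCT algorithms diagonalize these finite-difference operators when solving, e.g., the discrete Poisson equation.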


COMMENT (posted as an answer because it was "too long by 75 characters" for a comment). In functional analysis, the notion of a function that operates on an algebra is defined as follows: let $y = f(x)$ be a function, say, from $\mathbb R$ to $\mathbb R$. If forming $f(M)$ makes sense in the algebra $\mathcal A$, then $N = f(M)$ is well defined as a function from $\mathcal A$ to $\mathcal A$. A well-known example is the exponential $f(x)=e^x$, which operates on the algebra $\mathcal A_n$ of $n\times n$ matrices. I don't know whether the sine and cosine functions operate on $\mathcal A_n$ (maybe not, because of $|\sin(x)|\le 1$). French mathematicians call this "calcul symbolique" (see Katznelson's theorem, an extraordinary result concerning all analytic functions on the circle $\mathbb T$).