If $A$ is an $n\times n$ matrix with entries $a_{ij}$ ($i$ the row index, $j$ the column index), then $e^A$ is also a matrix, as can be seen by expanding it in a power series. Is $e^A$ always convergent and defined for any $n\times n$ matrix? What are the entries $e^A_{ij}$ in terms of the $a_{ij}$?
$\sin(x)$ for real numbers $x$ can be interpreted easily geometrically by looking at the unit circle. Is there any geometrical interpretation of $\sin(A)$ when $A$ is a matrix? What applications does it have?
Slight addition:
Is $\sin(A)$ periodic in any sense, i.e. is there a matrix $B$ not $0$ such that $\sin(A+B)=\sin(A)$ for all matrices $A$?
Does $\sin(A)^2+\cos(A)^2=I$ hold?
And do all the regular rules from algebra transfer, e.g. does $e^Ae^B=e^{A+B}$ hold?
Is there a consistent definition of $M/N$ for matrices $M,N$, such that $e^A/e^B=e^{(A-B)}$ holds?

First, there is no nice direct formula for the entries of the matrix exponential. It gets quite complicated to compute for non-diagonalizable matrices.
Pick your favorite analytic function: $f(x) = \sum\limits_{j=0}^\infty a_jx^j$. Let $A$ be an $n \times n$ matrix and let $\|A\|=\sqrt {\sum\limits_{1 \leq i,j \leq n} |a_{ij}|^2}$.
It's not hard to show that $\|A^j\| \leq \|A\|^j$. Thus $\|\sum_{j=0}^k a_jA^j\| \leq \sum_{j=0}^k |a_j|\|A\|^j$ and so if $f(\|A\|) =\sum\limits_{j=0}^\infty a_j\|A\|^j$ is absolutely convergent, then the series for $f(A)=\sum\limits_{j=0}^\infty a_jA^j$ is convergent.
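The norm bound above is easy to check numerically. A minimal sketch (variable names are my own) verifying $\|A^j\| \leq \|A\|^j$ for the Frobenius norm:

```python
import numpy as np

# Numerical spot check of the bound ||A^j|| <= ||A||^j
# for the Frobenius norm defined above.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

def norm(M):
    return np.linalg.norm(M, 'fro')

for j in range(1, 6):
    lhs = norm(np.linalg.matrix_power(A, j))
    rhs = norm(A) ** j
    assert lhs <= rhs + 1e-9, (j, lhs, rhs)
```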
Now we know that the series for $\sin$, $\cos$, $\exp$ are absolutely convergent everywhere, so $\sin(A)$, $\cos(A)$, $e^A$ are defined for all square matrices.
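To illustrate, here is a sketch comparing a truncated power series for $e^A$ against SciPy's built-in matrix exponential (the helper `exp_series` is mine, not a library function); since $e^x$ converges absolutely at $x=\|A\|$, the partial sums converge rapidly:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# Truncated power series sum_{j=0}^{terms-1} A^j / j!;
# absolute convergence at ||A|| guarantees this converges.
def exp_series(A, terms=30):
    result = np.zeros_like(A)
    term = np.eye(A.shape[0])   # j-th term A^j / j!
    for j in range(terms):
        result = result + term
        term = term @ A / (j + 1)
    return result

print(np.allclose(exp_series(A), expm(A)))  # → True
```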
If you only deal with diagonalizable matrices, then life is much simpler. Suppose $P^{-1}AP=D=\mathrm{diag}(\lambda_1,\dots,\lambda_n)$. Then, given any function $f(x)$, you can define $$f(D) = \begin{pmatrix} f(\lambda_1) & 0 & \cdots & 0 \\ 0 & f(\lambda_2) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & f(\lambda_n) \end{pmatrix}$$ Then define $f(A)=Pf(D)P^{-1}$.
For diagonalizable matrices this definition gives the same results as the series definition for sin, cos, and exp. But it also allows you to define $\sqrt{A}$ (when all eigenvalues are non-negative) and many other such matrix functions.
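A minimal sketch of this recipe, taking $f=\sqrt{\cdot}$ on a symmetric positive definite matrix so the eigenvalues are non-negative and the eigenvector matrix is orthogonal (the construction of `A` and the helper `apply_fn` are my own for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
A = M @ M.T + 3 * np.eye(3)    # symmetric with positive eigenvalues

# A = P diag(w) P^T with orthogonal P, so f(A) = P f(D) P^T.
w, P = np.linalg.eigh(A)

def apply_fn(f):
    return P @ np.diag(f(w)) @ P.T

root = apply_fn(np.sqrt)
print(np.allclose(root @ root, A))  # → True
```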
$\sin^2(A)+\cos^2(A)=I_n$ holds for diagonalizable matrices by the definition above. I believe it also holds in general; one way to see this is that diagonalizable matrices are dense in the complex $n\times n$ matrices and both sides of the identity are continuous in $A$.
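As a sanity check, here is a sketch verifying the identity numerically on a matrix that is *not* diagonalizable (a Jordan block), using SciPy's matrix sine and cosine:

```python
import numpy as np
from scipy.linalg import sinm, cosm

# A single 2x2 Jordan block: repeated eigenvalue 2,
# one-dimensional eigenspace, hence not diagonalizable.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

S, C = sinm(A), cosm(A)
print(np.allclose(S @ S + C @ C, np.eye(2)))  # → True
```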
Functions of matrices are quite important. The matrix exponential appears all over mathematics. It connects Lie algebras and Lie groups. I'm not aware of any obvious uses or geometric interpretation for matrix sine and cosine. I have seen the square root of a matrix appear several times in the context of numerical linear algebra.