Let $A \in \text{Mat}(n,n,\mathbb{C})$ and let $I$ be a subset of $\mathbb{R}$ or $\mathbb{C}$. Further, let $f:I\to\mathbb{C}$ and $g:I\to\mathbb{C}$ be two functions for which $f(A)$ and $g(A)$ are defined. How can one prove the following two statements about the functions $f+g: I\to \mathbb{C}$ and $f\cdot g: I\to \mathbb{C}$?
i) The matrix $(f+g)(A)$ is defined and it holds that $(f+g)(A) = f(A) + g(A)$.
ii) The matrix $(f\cdot g)(A)$ is defined and it holds that $(f\cdot g)(A) = f(A) \cdot g(A)$.
Edit: To the downvoters of this question: if I could see what to do here, I would solve it myself. I posted it precisely because I thought I was missing important information. This is the exercise as given, so please bear that in mind.
Keep it simple first and define $f(A)$ for matrices $A$ which are diagonalizable. If $A = VDV^{-1}$, where $D$ is diagonal, define $f(A) = Vf(D)V^{-1}$, where $$ f(D) = \text{diag}(f(d_1), f(d_2), \dotsc, f(d_n))$$ is the diagonal matrix obtained by applying $f$ to the individual entries of $D$ (this requires the eigenvalues $d_1, \dotsc, d_n$ to lie in $I$). Since diagonal matrices add and multiply entrywise, $(f+g)(D) = f(D) + g(D)$ and $(f\cdot g)(D) = f(D)\,g(D)$, and from this it is straightforward to verify parts i) and ii).
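As an illustrative sanity check (not a proof), the diagonalizable case can be verified numerically. The sketch below assumes NumPy; `matrix_function` is a name I made up for the eigendecomposition-based definition above, and the concrete choices $f = \exp$, $g = \sin$ are arbitrary:

```python
import numpy as np

def matrix_function(A, f):
    """Evaluate f(A) via an eigendecomposition A = V D V^{-1},
    applying f to the eigenvalues on the diagonal of D."""
    d, V = np.linalg.eig(A)
    return V @ np.diag(f(d)) @ np.linalg.inv(V)

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # a generic matrix is diagonalizable almost surely

f, g = np.exp, np.sin

# (f+g)(A) should equal f(A) + g(A)
sum_fg = matrix_function(A, lambda z: f(z) + g(z))
# (f*g)(A) should equal f(A) g(A)
prod_fg = matrix_function(A, lambda z: f(z) * g(z))

print(np.allclose(sum_fg, matrix_function(A, f) + matrix_function(A, g)))
print(np.allclose(prod_fg, matrix_function(A, f) @ matrix_function(A, g)))
```

Note that the key step for ii) is exactly the one from the proof: $f(D)\,g(D) = (f\cdot g)(D)$ because diagonal matrices multiply entrywise, so the two $V^{-1}V$ factors in $f(A)\,g(A)$ cancel.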
Then push to general matrices and sufficiently smooth functions via the Jordan decomposition; see the Wikipedia article for the appropriate way of doing this:
https://en.wikipedia.org/wiki/Matrix_function
You will need Leibniz's rule for differentiating a product.
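To spell out where Leibniz's rule enters: for a single Jordan block $J$ with eigenvalue $\lambda$, the standard definition (as on the Wikipedia page, assuming $f$ is sufficiently differentiable at $\lambda$) is

$$
f(J) = \begin{pmatrix}
f(\lambda) & f'(\lambda) & \tfrac{f''(\lambda)}{2!} & \cdots & \tfrac{f^{(m-1)}(\lambda)}{(m-1)!} \\
 & f(\lambda) & f'(\lambda) & \ddots & \vdots \\
 & & \ddots & \ddots & \tfrac{f''(\lambda)}{2!} \\
 & & & f(\lambda) & f'(\lambda) \\
 & & & & f(\lambda)
\end{pmatrix},
$$

so checking that $(f\cdot g)(J) = f(J)\,g(J)$ amounts to comparing the $k$-th superdiagonals, which is exactly the Leibniz rule

$$
(f\cdot g)^{(k)}(\lambda) = \sum_{j=0}^{k} \binom{k}{j} f^{(j)}(\lambda)\, g^{(k-j)}(\lambda).
$$

Part i) for Jordan blocks is immediate, since differentiation is linear.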
At some point you might want to have a look at Higham's book on matrix functions:
http://www.maths.manchester.ac.uk/~higham/fm/
It is about as good as they come.