Newton's method for root finding is well known for scalar-valued differentiable functions $x\mapsto f(x)$. It is an iterative method where taking one step (iteration) means computing:
$$x_n = x_{n-1}-\frac{f(x_{n-1})}{f'(x_{n-1})}$$
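As a concrete instance of this scalar iteration, here is a minimal sketch (the function names are my own) applying it to $f(x)=x^2-2$, whose positive root is $\sqrt{2}$:

```python
# Scalar Newton iteration: x_n = x_{n-1} - f(x_{n-1}) / f'(x_{n-1}).
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:  # stop once the update is negligible
            break
    return x

# f(x) = x^2 - 2, f'(x) = 2x; starting from x0 = 1 this converges to sqrt(2).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
print(root)
```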
It can be extended to more advanced methods that use higher-order derivatives (e.g. the second derivative), or to multidimensional counterparts built on gradients and Hessians and other cool stuff, but let us pause at this basic setting.
Would it make any sense to plug in a matrix function, expressed as a (possibly truncated) power series expansion, pretending that the derivative of said matrix expression is obtained by just plugging in the matrix wherever $x$ normally goes? In other words, to compute
$$X_n = X_{n-1}-\frac{f(X_{n-1})}{f'(X_{n-1})}$$
If so (since matrix multiplication is not commutative), should we replace the division by $f(X_{n-1})f'(X_{n-1})^{-1}$, by $f'(X_{n-1})^{-1}f(X_{n-1})$, or by something else entirely?
One example where it does work (though whether the derivative really makes sense there is unclear to me) is root finding for complex polynomials, where the convergence as a function of the initial iterate generates the Newton fractal; the image below, from Wikipedia, appears to show a fifth-degree polynomial:
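For what it's worth, here is a small numerical experiment of my own (a sketch, not a general recipe) applying the naive matrix iteration to $f(X)=X^2-A$ with the pretend derivative $f'(X)=2X$. Note that the "division" becomes a linear solve, and since $X_0=A$ commutes with $A$, both orderings from the question coincide here; the iterates stay polynomials in $A$, and the sequence converges to the principal square root of $A$:

```python
import numpy as np

# Naive "scalar-style" Newton step for f(X) = X^2 - A, pretending f'(X) = 2X:
#   X_{n+1} = X_n - (2 X_n)^{-1} (X_n^2 - A) = (X_n + X_n^{-1} A) / 2.
def matrix_sqrt_newton(A, iters=20):
    X = A.copy()  # X0 = A commutes with A, so the ordering ambiguity vanishes
    for _ in range(iters):
        # np.linalg.solve(X, A) computes X^{-1} A without forming the inverse
        X = 0.5 * (X + np.linalg.solve(X, A))
    return X

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
X = matrix_sqrt_newton(A)
print(np.allclose(X @ X, A))  # True: X is a square root of A
```

This is, in fact, a known iteration for the matrix square root, so at least in this special case the naive substitution does something sensible.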

Are you asking about the Newton–Raphson method?
Let $E$ be a Banach space and $F:E\rightarrow E$ be a $C^{\infty}$ function. We consider the iteration $X_0\in E,\;X_{n+1}=X_n-F'(X_n)^{-1}(F(X_n))$. If $(X_n)$ converges to $U$ and if $F'(U)\in L(E)$ is invertible, then $F(U)=0$; in practice, at each step we solve, in $Y$, the linear equation $F'(X_n)(Y)=F(X_n)$, which is easy when the condition number of $F'(X_n)$ is not too large.
The convergence is very fast once $X_n$ is close to $U$. Conversely, we are sure that the sequence converges when $X_0$ is close to $U$; yet such a choice of $X_0$ is difficult to make when we don't know any approximation of $U$.
When $\dim(E)=1$, we can often draw the graph of the function $F$ and easily obtain an approximation of $U$. Yet the larger the dimension of $E$, the more difficult the search for an approximation.
Example: find $z=a+ib\in\mathbb{C}\setminus \{0\}$ such that $e^z=1+\operatorname{Re}(z)+2\operatorname{Im}(z)$.
We obtain the system $F(a,b)=(e^a \cos(b)-1-a,\;e^a\sin(b)-2b)=(0,0)$. As can be seen by plotting (on a computer) these two implicit curves, there are infinitely many solutions. For example, one solution is close to $X_0=(0.75,0.6)$.
Here $F'(a,b)=\begin{pmatrix}e^a\cos(b)-1&-e^a\sin(b)\\e^a\sin(b)&e^a\cos(b)-2\end{pmatrix}$ is invertible in a neighborhood of $X_0$.
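A quick numerical check of this example (my own sketch), solving $F'(X_n)(Y)=F(X_n)$ with a linear solver at each step, starting from the suggested $X_0=(0.75,0.6)$:

```python
import numpy as np

def F(a, b):
    # F(a,b) = (e^a cos b - 1 - a, e^a sin b - 2b)
    return np.array([np.exp(a) * np.cos(b) - 1 - a,
                     np.exp(a) * np.sin(b) - 2 * b])

def Fprime(a, b):
    # Jacobian of F, as given above
    ea = np.exp(a)
    return np.array([[ea * np.cos(b) - 1, -ea * np.sin(b)],
                     [ea * np.sin(b),      ea * np.cos(b) - 2]])

X = np.array([0.75, 0.6])  # initial guess read off the plotted curves
for _ in range(20):
    Y = np.linalg.solve(Fprime(*X), F(*X))  # solve F'(X_n)(Y) = F(X_n)
    X = X - Y

print(X, F(*X))  # F(*X) should be numerically zero
```

The iteration converges in a handful of steps, since $X_0$ is already close to the solution and $F'$ is invertible in that neighborhood.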