I know that if $A$ is a nonsingular matrix, i.e. $\det A \ne 0$, then $A^p=\exp\left(p\ln A\right)$ holds for any real exponent $p$. But what if $A$ is singular? Then $A$ has a zero eigenvalue, so the matrix logarithm does not exist.
Is there any extension in this case?
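For the nonsingular case, the formula $A^p=\exp(p\ln A)$ can be checked numerically. A minimal sketch, assuming `numpy` is available and picking an illustrative diagonalizable matrix with positive real eigenvalues (the matrix and the helper name `real_matrix_power` are my choices, not from the question): the eigendecomposition $A = V\,\mathrm{diag}(\lambda)\,V^{-1}$ gives $A^p = V\,\mathrm{diag}(\lambda^p)\,V^{-1}$, which is equivalent to $\exp(p\ln A)$ for such matrices.

```python
import numpy as np

# Illustrative nonsingular, diagonalizable matrix (eigenvalues 4 and 9)
A = np.array([[4.0, 1.0],
              [0.0, 9.0]])

def real_matrix_power(A, p):
    """A^p = V diag(w**p) V^{-1} via eigendecomposition.
    Assumes A is diagonalizable with positive real eigenvalues."""
    w, V = np.linalg.eig(A)
    return (V * w**p) @ np.linalg.inv(V)

root = real_matrix_power(A, 0.5)
print(np.allclose(root @ root, A))  # the computed square root squares back to A
```

This works precisely because every eigenvalue is nonzero (indeed positive), so $\lambda^p$ is well defined; a zero eigenvalue breaks the construction, which is what the question is about.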
You cannot do this in general, even for positive powers. For example, suppose $N$ is a nilpotent matrix of index $n > 1$ on $\mathbb{R}^{n}$, i.e., $N^{n}=0$ but $N^{n-1}\ne 0$. If $M=\sqrt{N}$ existed, you would have $M^{2n}=0$ but $M^{2n-2}\ne 0$, so the minimal polynomial of $M$ would be $\lambda^{k}$ with $k \ge 2n-1 > n$. That is impossible for an $n\times n$ matrix, because the degree of the minimal polynomial cannot exceed $n$.
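The hypotheses $N^{n}=0$, $N^{n-1}\ne 0$ are easy to verify numerically for a concrete instance. A small sketch, assuming `numpy` and using the $3\times 3$ shift matrix as the illustrative nilpotent of index $3$ (my choice of example):

```python
import numpy as np

# 3x3 shift matrix: ones on the superdiagonal, nilpotent of index 3
N = np.diag([1.0, 1.0], k=1)

print(np.any(N @ N))      # N^2 is nonzero
print(np.any(N @ N @ N))  # N^3 is the zero matrix
```

If a square root $M$ of this $N$ existed, the argument above would give $M^{6}=0$ but $M^{4}\ne 0$, forcing a minimal polynomial of degree at least $5$ for a $3\times 3$ matrix.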
Because there has been no activity on this question, I thought I'd give you the simplest example. $$ A = \left[\begin{array}{cc} 0 & 1 \\ 0 & 0\end{array}\right] $$ This matrix satisfies $A^{2}=0$ but $A\ne 0$, so the minimal polynomial of $A$ is $m(\lambda)=\lambda^{2}$. If there were a $2\times 2$ matrix $B$ such that $B^{2}=A$, then $B^{4}=A^{2}=0$, so the minimal polynomial $q$ of $B$ would divide $\lambda^{4}$. But $q$ cannot have degree greater than $2$, which would force $B^{2}=0$, contradicting $B^{2}=A\ne 0$. You can also try directly, and see that you cannot get $B$ such that $B^{2}=A$, even though you already know it from these general considerations. Try it: $$ B^{2} = \left[\begin{array}{cc} a & b\\ c & d \end{array}\right] \left[\begin{array}{cc} a & b\\ c & d \end{array}\right] = \left[\begin{array}{cc}a^{2}+bc & ab+bd \\ ca+dc & cb+d^{2}\end{array}\right] = \left[\begin{array}{cc} 0 & 1 \\ 0 & 0 \end{array}\right]. $$ Because $a^{2}+bc=0$ and $cb+d^{2}=0$, subtracting gives $a^{2}-d^{2}=0$, so $a=\pm d$. But $a\ne -d$, because that would contradict $ab+bd=(a+d)b=1$. So $a=d$ must hold. The lower-left entry is $c(a+d)=0$, and $a+d\ne 0$ because $(a+d)b=1$, so $c=0$. That now forces $0=a^{2}+bc=a^{2}$, which gives $a=0$ and thus $d=0$, so $ab+bd=0$, yielding the desired contradiction. So there is no $B$ such that $B^{2}=A$ (you can see it abstractly, and you can see it concretely).
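The same entrywise system can be handed to a computer algebra system, which confirms the hand computation. A sketch, assuming `sympy` is available (the variable names mirror the algebra above):

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
B = sp.Matrix([[a, b], [c, d]])
A = sp.Matrix([[0, 1], [0, 0]])

# The four entrywise polynomial equations of B^2 = A
eqs = list(B * B - A)
sols = sp.solve(eqs, [a, b, c, d], dict=True)
print(sols)  # empty list: no square root of A exists, even over C
```

Note that `solve` searches over the complex numbers by default, so this also illustrates that the obstruction is not about staying real: the minimal-polynomial argument rules out a square root over $\mathbb{C}$ as well.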