In inferential statistics, one works with the likelihood $L$ and differentiates it with respect to a given parameter $\theta$ (or set of parameters) in order to maximize it. Thus we require the derivative with respect to the parameter to be zero. If several solutions are found, the one chosen is the one that maximizes the likelihood.
Let's consider the Poisson likelihood, where $n$ is the number of observed events: $L(\theta)=e^{-\theta}\frac{\theta^{n}}{n!}$. Requiring the derivative to vanish gives $\frac{\theta^{n-1}e^{-\theta}}{n!}(n-\theta)=0$. This condition has three candidate solutions: $\theta=0$, $\theta\to\infty$, and $\theta=n$. (I don't know, formally, whether $\theta\to\infty$ counts as a solution.)
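As a quick numerical sanity check of the claim that $\theta=n$ is the maximizer, here is a minimal sketch (the value $n=7$ is an arbitrary example, not from the question) that evaluates the Poisson likelihood on a grid and locates its maximum:

```python
import math

def poisson_likelihood(theta, n):
    """Poisson likelihood L(theta) = exp(-theta) * theta**n / n!"""
    return math.exp(-theta) * theta**n / math.factorial(n)

n = 7  # observed count (arbitrary example value)

# Evaluate L on a fine grid over (0, 20] and locate its maximum numerically.
grid = [i / 100 for i in range(1, 2001)]
best = max(grid, key=lambda t: poisson_likelihood(t, n))
print(best)  # 7.0, i.e. theta = n
```

The grid maximum lands at $\theta=n$, consistent with the analytic solution above.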
Among them, the one that maximizes the likelihood is $\theta=n$.
For computational convenience, one most often works with the logarithm ($\ln$) of the likelihood. Since $\ln$ is only defined for positive arguments, $\theta=0$ no longer appears as a candidate solution. In addition, once the $\ln$ is introduced, the solution $\theta\to\infty$ disappears.
Is there a risk, for some special forms of the likelihood (and if so, is there an example), that using the logarithm ($\ln$) in the likelihood maximization leads to losing solutions that would have maximized the likelihood had we not used the logarithm?
No. Because the logarithm is a strictly increasing function, if a likelihood function has $n$ maxima, the log-likelihood function will have $n$ maxima at exactly the same parameter values.
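This can be illustrated numerically. The sketch below (with an arbitrary example value $n=5$) maximizes both the Poisson likelihood and its logarithm over the same grid and confirms that the maximizers coincide:

```python
import math

n = 5  # observed count (example value)

def L(theta):
    # Poisson likelihood
    return math.exp(-theta) * theta**n / math.factorial(n)

def logL(theta):
    # Poisson log-likelihood: ln L = -theta + n*ln(theta) - ln(n!)
    return -theta + n * math.log(theta) - math.log(math.factorial(n))

# Maximize both functions over the same grid in (0, 20].
grid = [i / 1000 for i in range(1, 20001)]
argmax_L = max(grid, key=L)
argmax_logL = max(grid, key=logL)
print(argmax_L, argmax_logL)  # 5.0 5.0 -- the maximizers coincide
```

Because $\ln$ only reorders nothing (it preserves the ordering of function values), any grid point that maximizes $L$ also maximizes $\ln L$, and vice versa.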