Let $X$ and $Y$ be random variables with $E(Y)=\mu$ and $E(Y^2)<\infty$. Deduce that the random variable $f(X)$ that minimizes $E[(Y-f(X))^2\mid X]$ is $f(X)=E[Y\mid X]$.
I just found the minimum by taking derivatives:
$$\frac{d}{d f(X)}E[(Y-f(X))^2|X]=-2E[Y-f(X)|X]$$
$$=-2E[Y|X]+2E[f(X)|X]=0$$ $$\Leftrightarrow E[Y|X]=E[f(X)|X]$$ $$\Leftrightarrow f(X)=E[Y|X],$$ where the last step uses $E[f(X)|X]=f(X)$, since $f(X)$ is a function of $X$.
Is this right?
I found this solution.
Is this wrong too?
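As a quick numerical sanity check of the claim (the distribution below is my own toy choice, not part of the problem): for each value $x$, the conditional MSE $c \mapsto E[(Y-c)^2 \mid X=x]$ should be minimized at $c = E[Y \mid X=x]$, which we can estimate from samples and compare against a grid search.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (an assumption for illustration): X uniform on {0, 1, 2},
# Y = X^2 + standard normal noise, so E[Y | X = x] = x^2.
n = 200_000
x = rng.integers(0, 3, size=n)
y = x.astype(float) ** 2 + rng.normal(0.0, 1.0, size=n)

gaps = []
for val in range(3):
    ys = y[x == val]
    cond_mean = ys.mean()  # sample estimate of E[Y | X = val]
    # Minimize the empirical conditional MSE over a grid of constants c.
    grid = np.linspace(cond_mean - 2.0, cond_mean + 2.0, 401)
    mses = [((ys - c) ** 2).mean() for c in grid]
    best = grid[np.argmin(mses)]
    # The grid minimizer should coincide with the conditional mean.
    gaps.append(abs(best - cond_mean))
```

Each `gaps` entry comes out (essentially) zero, matching the derivative calculation above: the empirical MSE is a parabola in $c$ with vertex at the sample mean.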

Let us try to make your argument rigorous. First, let's agree that the goal is to find a (measurable) function $f : \mathbb{R} \to \mathbb{R}$ such that for every (measurable) function $g : \mathbb{R} \to \mathbb{R}$ the inequality $$ \mathbb{E}\left[(Y- f(X))^2 \mid X\right] \leq \mathbb{E}\left[(Y - g(X))^2 \mid X\right] \tag{1} $$ holds almost surely.

Now, let $\Omega$ be the sample space on which $X$ is defined, let $h(X) : \Omega \to \mathbb{R}$ be a representative of $\mathbb{E}\left[Y \mid X\right]$, and let $k(X)$ be a representative of $\mathbb{E}\left[ Y^2 \mid X \right]$, each defined for every $\omega \in \Omega$. Expanding the square and using linearity of conditional expectation, $$ \mathbb{E}\left[(Y-f(X))^2 \mid X\right] = k(X) - 2 f(X) h(X) + f(X)^2 $$ (where equality means the RHS lies in the equivalence class of the LHS).

Now fix $\omega \in \Omega$ and minimize $$ k(X)(\omega) - 2 \lambda h(X)(\omega) + \lambda^2 $$ over the variable $\lambda$. Using differentiation as you did above (or by completing the square), the minimizer is $\lambda = h(X)(\omega)$. Thus, defining $\lambda(\omega) = h(X)(\omega)$ for each $\omega \in \Omega$ minimizes the previous expression pointwise on $\Omega$, so with $f(X) = h(X)$, inequality $(1)$ holds almost surely for every $g$.
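To see inequality $(1)$ at work numerically, here is a small Monte Carlo sketch (the model and competing predictors are my own assumptions for illustration): with $Y = 2X + \varepsilon$, the conditional mean is $h(x) = 2x$, and its mean squared error should beat that of any other predictor $g(X)$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed model for illustration: X ~ N(0, 1), Y = 2X + noise,
# so the true conditional mean is h(x) = 2x.
n = 500_000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

def mse(pred):
    """Empirical mean squared error of a predictor array."""
    return ((y - pred) ** 2).mean()

mse_h = mse(2.0 * x)         # f(X) = E[Y | X], the conditional mean
mse_g1 = mse(1.5 * x)        # a mis-scaled competitor
mse_g2 = mse(2.0 * x + 0.3)  # a biased competitor
```

Here `mse_h` comes out near the noise variance (about 1), strictly below both competitors, consistent with $(1)$ after taking expectations.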