The identity is the following: $E(Y|Z=z) = \int E(Y|Z=z,X=x) P(X=x|Z=z) dx$
I would start by doing:
$E(Y|Z=z) = \int y P(Y=y|Z=z)dy $
and then $$\int y P(Y=y|Z=z)dy=\int y \int P(Y=y|Z=z,X=x) P(X=x) dx \ dy. \ (1)$$
I know this last step is not right, since for the equality to hold it should be $$\int y P(Y=y|Z=z)dy=\int y \int P(Y=y|Z=z,X=x) P(X=x|Z=z) dx \ dy. \ (2)$$
What's the intuition / step needed to arrive at equality number 2?
Thanks in advance
Well, assuming you are dealing with probability density functions for continuous random variables (or, more generally, measure-theoretic notation), the Law of Total Probability for conditioned random variables states:
$$\begin{aligned} P(Y{=}y\mid Z{=}z) &= \int_X P(Y{=}y, X{=}x\mid Z{=}z)~\mathrm d x \\ &= \int_X P(Y{=}y\mid X{=}x, Z{=}z)\,P(X{=}x\mid Z{=}z)~\mathrm d x \end{aligned}$$
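A small discrete sanity check of this identity (the pmf values below are arbitrary, not from the question): build a joint pmf over $(X, Y, Z)$ and verify that $P(Y{=}y\mid Z{=}z) = \sum_x P(Y{=}y\mid X{=}x, Z{=}z)\,P(X{=}x\mid Z{=}z)$ for every $(y, z)$.

```python
# Discrete check of P(Y=y | Z=z) = sum_x P(Y=y | X=x, Z=z) * P(X=x | Z=z)
# using an arbitrary strictly positive joint pmf p(x, y, z).
import itertools
import random

random.seed(0)
X, Y, Z = range(2), range(3), range(2)

raw = {k: random.random() + 0.1 for k in itertools.product(X, Y, Z)}
s = sum(raw.values())
p = {k: v / s for k, v in raw.items()}  # normalised joint pmf

def pr(pred):
    """Probability of the event {(x, y, z) : pred(x, y, z)}."""
    return sum(v for (x, y, z), v in p.items() if pred(x, y, z))

max_err = 0.0
for y0 in Y:
    for z0 in Z:
        lhs = pr(lambda x, y, z: y == y0 and z == z0) / pr(lambda x, y, z: z == z0)
        rhs = sum(
            pr(lambda x, y, z: x == x0 and y == y0 and z == z0)
            / pr(lambda x, y, z: x == x0 and z == z0)   # P(Y=y0 | X=x0, Z=z0)
            * pr(lambda x, y, z: x == x0 and z == z0)
            / pr(lambda x, y, z: z == z0)               # P(X=x0 | Z=z0)
            for x0 in X
        )
        max_err = max(max_err, abs(lhs - rhs))

print(max_err)  # agrees up to floating-point rounding
```

The two sides are algebraically identical (the intermediate $P(X{=}x, Z{=}z)$ factors cancel), so the only discrepancy is floating-point rounding.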
As to intuition, it follows closely from the definition of conditional probability for events. When the sample space is partitioned by a sequence of disjoint events $(B_i)$, then:
$$\begin{aligned}\mathsf P(A\mid C) &= \dfrac{\mathsf P(A\cap C)}{\mathsf P(C)} \\ &= \dfrac{\mathsf P\bigl(A\cap C\cap \bigcup_i B_i\bigr)}{\mathsf P(C)} \\ &= \dfrac{\sum_i \mathsf P(A\cap B_i\cap C)}{\mathsf P(C)} \\ &= \dfrac{\sum_i \mathsf P(A\mid B_i\cap C)~\mathsf P(B_i\cap C)}{\mathsf P(C)} \\ &= \sum_i \mathsf P(A\mid B_i\cap C)~\mathsf P(B_i\mid C)\end{aligned}$$
Also
$$\begin{aligned} P(Y{=}y) &= \int_X\int_Z P(Y{=}y, X{=}x, Z{=}z)~\mathrm d z~\mathrm d x \\ &= \int_X\int_Z P(Y{=}y\mid X{=}x, Z{=}z)\,P(X{=}x\mid Z{=}z)\,P(Z{=}z)~\mathrm d z~\mathrm d x \end{aligned}$$