I am trying to marginalize a conditional expectation and I am not sure whether what I am doing is correct. It agrees with my intuition, but it seems to conflict with the law of total expectation. This is the problem:
$E[Y|X_1,X_2, L_1] = 0.3 + 2X_1 + 3X_2 + 5X_1X_2 + 6L_1 $
I would like to get $E[Y|X_1, X_2]$ by averaging over all values of $L_1$. The distribution of $L_1$ is Bernoulli with $P(L_1 = 1 \mid X_1) = 0.1 + 3X_1$.
What I am doing is as follows:
$E[Y|X_1, X_2] = 0.3 + 2X_1 + 3X_2 + 5X_1X_2 + 6(0.1 + 3X_1)$
since $L_1$ will be $1$ with probability $0.1 + 3X_1$ and $0$ otherwise.
However according to the law of total expectation:
$E[Y|X_1, X_2] = E[Y|X_1,X_2, L_1 = 1]P(L_1 = 1 \mid X_1) + E[Y|X_1,X_2, L_1 = 0]P(L_1 = 0 \mid X_1)$
This suggests that the probability multiplies the whole expression, not just the coefficient $6$.
Can someone tell me what is the right way to marginalize the conditional expectation given above?
Both are correct: they give the same result.
Let $Z=0.3+2X_1+3X_2+5X_1X_2$ to save some space.
So $\mathsf E(Y\mid X_1,X_2,L_1) = Z+6L_1$.
Then by the Law of Total Expectation: $$\small\begin{align}\mathsf E(Y\mid X_1,X_2) &= \mathsf E(Y\mid X_1,X_2,L_1{=}1)\mathsf P(L_1{=}1\mid X_1)+ \mathsf E(Y\mid X_1,X_2,L_1{=}0)\mathsf P(L_1{=}0\mid X_1)\\[1ex]&= (Z+6)(0.1+3X_1)+(Z+0)(0.9-3X_1)\\[1ex]&= Z+6(0.1+3X_1)\end{align}$$ As your intuition gave you. The $Z$ terms weighted by the two complementary probabilities sum back to $Z$, which is why only the $6L_1$ term picks up the factor $0.1+3X_1$.
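A quick numeric sanity check confirms the two computations coincide. The values $X_1=0.2$ and $X_2=1$ below are arbitrary illustrations (chosen so that $0.1+3X_1=0.7$ is a valid probability):

```python
# Check that substituting E[L1 | X1] into the 6*L1 term gives the
# same answer as the full two-case law-of-total-expectation expansion.
x1, x2 = 0.2, 1.0                         # example values; 0.1 + 3*x1 must lie in [0, 1]
p = 0.1 + 3 * x1                          # P(L1 = 1 | X1) = 0.7 here
z = 0.3 + 2 * x1 + 3 * x2 + 5 * x1 * x2  # Z = E[Y | X1, X2, L1] without the 6*L1 term

shortcut = z + 6 * p                          # plug in E[L1 | X1] directly
expansion = (z + 6) * p + (z + 0) * (1 - p)   # condition on L1 = 1 and L1 = 0

print(shortcut, expansion)  # the two values agree
```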