How to prove the following identity?


Given
$$
\begin{aligned}
A &= \theta(X) \cdot D + g(X) + \epsilon, & \mathbf E[\epsilon \mid X] &= 0, \\
D &= f(X) + \eta, & \mathbf E[\eta \mid X] &= 0, \\
& & \mathbf E[\eta \cdot \epsilon \mid X] &= 0,
\end{aligned}
$$
how does one prove the following?
$$
A - \mathbf E[A \mid X] = \theta(X) \cdot \big(D - \mathbf E[D \mid X]\big) + \epsilon
$$


Since $\mathbf E[\epsilon \mid X] = 0$ and $\mathbf E[\eta \mid X] = 0$, linearity of conditional expectation gives
$$
\begin{aligned}
\mathbf E[D \mid X] &= \mathbf E[f(X) \mid X] + \mathbf E[\eta \mid X] = f(X), \\
\mathbf E[A \mid X] &= \mathbf E[\theta(X) \cdot (f(X) + \eta) + g(X) + \epsilon \mid X] \\
&= \theta(X) f(X) + \theta(X)\,\mathbf E[\eta \mid X] + g(X) + \mathbf E[\epsilon \mid X] \\
&= \theta(X) f(X) + g(X) \\
&= \theta(X)\,\mathbf E[D \mid X] + g(X).
\end{aligned}
$$
Subtracting,
$$
\begin{aligned}
A - \mathbf E[A \mid X] &= \theta(X) \cdot D + g(X) + \epsilon - \big(\theta(X)\,\mathbf E[D \mid X] + g(X)\big) \\
&= \theta(X)\,\big(D - \mathbf E[D \mid X]\big) + \epsilon.
\end{aligned}
$$

Note that the assumption $\mathbf E[\eta \cdot \epsilon \mid X] = 0$ is not needed: the derivation uses only the two conditional mean-zero conditions on $\epsilon$ and $\eta$.
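Since the identity is purely algebraic (it holds pointwise once $\mathbf E[D \mid X] = f(X)$ and $\mathbf E[A \mid X] = \theta(X) f(X) + g(X)$ are plugged in), it can be checked numerically. A minimal sketch in Python; the specific choices of $\theta$, $g$, $f$ and the noise distributions below are arbitrary, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Arbitrary example functions (any measurable choices work).
theta = lambda x: 1.0 + x**2
g = lambda x: np.cos(x)
f = lambda x: 2.0 * x

X = rng.normal(size=n)
eta = rng.normal(size=n)   # drawn independently of X, so E[eta | X] = 0
eps = rng.normal(size=n)   # likewise E[eps | X] = 0

D = f(X) + eta
A = theta(X) * D + g(X) + eps

# As derived above: E[D | X] = f(X) and E[A | X] = theta(X) f(X) + g(X).
E_D_given_X = f(X)
E_A_given_X = theta(X) * f(X) + g(X)

lhs = A - E_A_given_X
rhs = theta(X) * (D - E_D_given_X) + eps
assert np.allclose(lhs, rhs)  # the identity holds pointwise, not just on average
```

The check passes exactly (up to floating-point error) for every sample, which reflects that the result is an identity in the random variables, not merely an equality of expectations.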