Problem about error term


So the question is: there are three random variables $X$, $Y$ and $\epsilon$, whose relationship is given by:

$Y = \beta_0 + \beta_1 X + \epsilon$

Determine whether each of the following statements is true, and prove it or give a counterexample:

[1] If $E(X\epsilon) = 0$, then $E(X^2\epsilon) = 0$

[2] If $E(\epsilon|X) = 0$, then $E(X^2\epsilon) = 0$

[3] If $E(\epsilon|X) = 0$, then $X$ and $\epsilon$ are independent

[4] If $E(X\epsilon) = 0$, then $E(\epsilon|X) = 0$

[5] If $E(\epsilon|X) = 0$ and $E(\epsilon^2|X) = \sigma^2$, then $X$ and $\epsilon$ are independent

My professor did not really go through error terms in lecture, but he sent this as a task... The only thing written in the lecture notes was:

$Y$ and $X$ can be described by $Y = E(Y|X) + \epsilon$, where

1. $\epsilon$ is mean-independent of $X$, that is, $E(\epsilon|X) = 0$;

2. $\epsilon$ is uncorrelated with any function of $X$.
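To convince myself of point 2, I ran a quick simulation. The setup here is my own made-up example (not from the lecture): $X$ standard normal and $\epsilon = XZ$ with $Z$ standard normal independent of $X$, so $E(\epsilon|X) = X\,E(Z) = 0$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# My own construction: X standard normal, eps = X * Z with Z standard
# normal and independent of X, so E(eps | X) = X * E(Z) = 0.
X = rng.standard_normal(n)
Z = rng.standard_normal(n)
eps = X * Z

# Mean-independence should make eps (approximately) uncorrelated with
# any function g of X:
for g in (lambda x: x, lambda x: x**2, np.sin):
    print(np.mean(g(X) * eps))  # all close to 0
```

All three sample averages come out near zero, which matches the claim that $E(g(X)\epsilon) = E(g(X)E(\epsilon|X)) = 0$ for any function $g$.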

I have no idea what this means at all, but here is what I tried:

for [1]

$Cov(X,X\epsilon) = E(X^2\epsilon) - E(X)E(X\epsilon)$

Since $E(X\epsilon) = 0$, this reduces to $Cov(X, X\epsilon) = E(X^2\epsilon)$, but that only relates the two quantities... I really got stuck here.
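While stuck, I also tried to search for a counterexample numerically. The construction below is my own guess (not from the lecture): $X$ standard normal and $\epsilon = X^2 - 1$, chosen so that by symmetry $E(X\epsilon) = E(X^3) - E(X) = 0$, while $E(X^2\epsilon) = E(X^4) - E(X^2) = 3 - 1 = 2 \neq 0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Candidate counterexample to [1] (my own construction): X standard
# normal and eps = X**2 - 1.  By symmetry E(X * eps) = E(X**3) - E(X)
# = 0, yet E(X**2 * eps) = E(X**4) - E(X**2) = 3 - 1 = 2.
X = rng.standard_normal(n)
eps = X**2 - 1

print(np.mean(X * eps))     # approximately 0
print(np.mean(X**2 * eps))  # approximately 2, not 0
```

If this is right, it would settle [1] in the negative (and, since here $E(\epsilon|X) = X^2 - 1 \neq 0$ despite $E(X\epsilon) = 0$, the same example seems relevant to [4]).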

[2]

This one I think I kind of proved it

$E(X^2\epsilon) = E(E(X^2\epsilon|X))$

$= E(X^2 E(\epsilon|X)) = E(X^2 \cdot 0) = 0$ ??

Hence, I think this one is true, but I am still kind of stuck here. I don't really understand the concept and need some help.
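For [3], I also tried a quick numerical experiment with a construction of my own: $\epsilon = XZ$ with $Z$ standard normal and independent of $X$. Then $E(\epsilon|X) = 0$, but $\epsilon$ does not look independent of $X$, since $Var(\epsilon|X) = X^2$ depends on $X$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# My own construction for [3]: eps = X * Z with Z standard normal and
# independent of X.  Then E(eps | X) = 0, but Var(eps | X) = X**2, so
# eps and X should not be independent.
X = rng.standard_normal(n)
eps = X * rng.standard_normal(n)

print(np.mean(X * eps))            # approximately 0 (mean-independence)
print(np.cov(X**2, eps**2)[0, 1])  # approximately 2, far from 0
```

The nonzero covariance between $X^2$ and $\epsilon^2$ suggests that mean-independence does not imply full independence, if I set this up correctly.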