I am studying linear regression. I have been given that $E[X(Y-b_0-b_1X)] = 0$, and told that it follows from this that $E[Y-b_0-b_1X]X = 0$. Why is this so? I have tried expanding the bracket inside the expectation in the first equation and then splitting it up using the linearity of the expectation, but this has failed to yield any insight.
Furthermore, I am curious about a different, more general question: if X and Y are random variables and $E[XY] = 0$, can anything interesting be concluded?
I think the OP misunderstands the "moment conditions" in the context of linear regression, and @Nicolas Agote's answer (albeit correct) does not resolve this misunderstanding.
A linear regression assumes that the conditional mean $\mathbb E[Y|X]$ of a random variable $Y$ given another random variable $X$ has the functional form $$\mathbb E[Y|X] = b_0 + b_1 X,$$ where $b_0$ and $b_1$ are unknown constants. The goal is to estimate these unknown constants based on observed data. One way of estimating these parameters is the Method of Moments approach. With this approach, $b_0$ and $b_1$ are estimated such that the sample version of the moment condition $$\mathbb E[X(Y - b_0 -b_1X)] = 0$$ is satisfied (together with the companion condition $\mathbb E[Y - b_0 - b_1X] = 0$, since two unknown parameters require two equations).
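To make this concrete, here is a minimal numerical sketch of the Method of Moments approach on simulated data (the data-generating process and the true values $b_0 = 2$, $b_1 = 3$ are hypothetical, chosen only for illustration). The sample versions of the two moment conditions are two linear equations in $(b_0, b_1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)  # hypothetical data: true b0 = 2, b1 = 3

# Sample moment conditions:
#   mean(y - b0 - b1*x)     = 0
#   mean(x*(y - b0 - b1*x)) = 0
# Rearranged, these are two linear equations in (b0, b1):
#   b0 * 1       + b1 * mean(x)   = mean(y)
#   b0 * mean(x) + b1 * mean(x^2) = mean(x*y)
A = np.array([[1.0, x.mean()],
              [x.mean(), (x * x).mean()]])
rhs = np.array([y.mean(), (x * y).mean()])
b0_hat, b1_hat = np.linalg.solve(A, rhs)

print(b0_hat, b1_hat)  # close to the true values 2 and 3
```

Solving these two sample moment equations reproduces the familiar OLS estimates of intercept and slope.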
In fact, the first equation does not imply the equation $$ X\mathbb E[Y-b_0-b_1X] = 0.$$ Instead, if the model is correct, i.e. if $$\mathbb E[Y|X] = b_0 + b_1X$$ holds, this equation is trivially true:
\begin{align*} X\mathbb E[Y-b_0-b_1X] &= X\mathbb E[Y] - Xb_0 - Xb_1\mathbb E[X] \\ &= X\mathbb E[\mathbb E[Y|X]] - Xb_0 - Xb_1\mathbb E[X] \\ &= X\mathbb E[b_0 + b_1X] - Xb_0 - Xb_1\mathbb E[X] \\ &= Xb_0 + Xb_1\mathbb E[X] - Xb_0 - Xb_1\mathbb E[X] \\ &= 0,\end{align*} where linearity of $\mathbb E$ is used in the first and fourth lines, the law of iterated expectation in the second, and the model assumption $\mathbb E[Y|X] = b_0 + b_1X$ in the third. Note that $$\mathbb E[X(Y - b_0 -b_1X)] = 0$$ is not needed.
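The key step is that $\mathbb E[Y - b_0 - b_1X]$ is a constant equal to zero under the model, so the product with $X$ vanishes for any realization of $X$. A quick simulation illustrates this (again with hypothetical data where the true parameters are known by construction, so the model $\mathbb E[Y|X] = b_0 + b_1X$ holds exactly):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
b0, b1 = 2.0, 3.0                       # hypothetical true parameters
x = rng.normal(size=n)
y = b0 + b1 * x + rng.normal(size=n)    # so E[Y|X] = b0 + b1*X holds

# Sample estimate of the constant E[Y - b0 - b1*X]:
inner = np.mean(y - b0 - b1 * x)
print(inner)  # close to 0, hence X * E[Y - b0 - b1*X] is close to 0 for any X
```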