Understanding Linear Regression from statistical learning theory lecture


I was reading about linear regression, but I am unable to understand why the red-marked term below turns out to be zero. Can anybody help me understand that? Thanks in advance.

[Image: excerpt from the lecture notes, with the term in question marked in red]

Accepted answer:

By the tower property, expectations of conditional expectations are just expectations, e.g.

$$\mathbb E[\mathbb E[z \mid x]] = \mathbb E[z].$$

Also, any function of $x$ can be pulled out of a conditional expectation given $x$.
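The tower property can be checked exactly on a small discrete distribution. This is a minimal sketch; the joint probability table is an assumption chosen purely for illustration.

```python
# A tiny discrete joint distribution over (x, z); the table is an assumption
# for illustration only.  P(x, z):
joint = {
    (0, 1): 0.1, (0, 2): 0.3,
    (1, 1): 0.2, (1, 2): 0.4,
}

# E[z] computed directly from the joint distribution
Ez = sum(p * z for (x, z), p in joint.items())

# E[E[z|x]]: marginal of x, then conditional expectations, then average over x
px = {}
for (x, z), p in joint.items():
    px[x] = px.get(x, 0.0) + p
Ez_given_x = {
    x0: sum(p * z for (x, z), p in joint.items() if x == x0) / px[x0]
    for x0 in px
}
EEz = sum(px[x0] * Ez_given_x[x0] for x0 in px)

print(Ez, EEz)  # the two values agree, as the tower property says
```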

Then, writing $g(x) = \mathbb E[y \mid x]$ for the regression function and assuming $(x,y) \sim \mathcal D$, you can calculate

$$\begin{aligned}\mathbb E[(h(x) - g(x))(g(x) - y)] = &\mathbb E[\mathbb E[(h(x) - g(x))(g(x) - y)|x]] \\ = &\mathbb E[(h(x) - g(x))(g(x) - \mathbb E[y|x])] \\ = &\mathbb E[(h(x) - g(x))(g(x) - g(x))] \\ = & 0.\end{aligned}$$
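The identity is also easy to confirm numerically. The sketch below is a Monte Carlo check under assumed choices: $x$ standard normal, $y = g(x) + \varepsilon$ with zero-mean noise so that $g(x) = \mathbb E[y\mid x]$ by construction, and an arbitrary competing predictor $h$.

```python
import random

random.seed(0)

def g(x):
    # The regression function: E[y|x] for the model simulated below.
    return 2.0 * x

def h(x):
    # An arbitrary other predictor (assumption chosen for illustration).
    return x * x - 1.0

N = 200_000
total = 0.0
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    y = g(x) + random.gauss(0.0, 1.0)  # y = E[y|x] + zero-mean noise
    total += (h(x) - g(x)) * (g(x) - y)

# The sample average of (h(x) - g(x))(g(x) - y) is close to 0,
# matching the derivation above.
print(total / N)
```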