I'm trying to follow along with this YouTube video on deriving the coefficient equations for the LMMSE estimator. See link: https://www.youtube.com/watch?v=r4hz6KBXZLQ
Here's my work:
$E\bigg[\Big((\Theta- E[\Theta])-\underline{h}^T (\underline{x}-E[~ \underline{x}~])~\Big)~\underline{x}\bigg] = 0$
let $\underline{y} = (\Theta- E[\Theta])-\underline{h}^T (\underline{x}-E[~ \underline{x}~])$
$E\bigg[\langle \underline{y}, \underline{x} \rangle\bigg] = 0$
$E\bigg[\langle \underline{x}, \underline{y} \rangle\bigg] = 0$
$\text{Cov}(\underline{x}, \underline{y}) + E[~\underline{x}~]~E[~\underline{y}~]^T=0$
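The identity used in that step, $E[\underline{x}\,\underline{y}^T] = \text{Cov}(\underline{x}, \underline{y}) + E[~\underline{x}~]~E[~\underline{y}~]^T$, can be sanity-checked by Monte Carlo. A minimal sketch with a made-up distribution (none of the numbers below come from the video):

```python
import numpy as np

# Monte Carlo check of E[x y] = Cov(x, y) + E[x] E[y], applied
# componentwise for a random vector x and a correlated scalar y.
# The means and coefficients here are arbitrary, for illustration only.
rng = np.random.default_rng(1)
n = 200_000

x = rng.standard_normal((n, 2)) + np.array([1.0, -2.0])   # E[x] = (1, -2)
y = 0.5 * x[:, 0] - x[:, 1] + rng.standard_normal(n)      # scalar, correlated with x

lhs = (x * y[:, None]).mean(axis=0)                       # sample estimate of E[x y]
cov_xy = np.array([np.cov(x[:, i], y)[0, 1] for i in range(2)])
rhs = cov_xy + x.mean(axis=0) * y.mean()                  # Cov(x, y) + E[x] E[y]

print(lhs, rhs)  # the two sides should agree up to Monte Carlo noise
```
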
Now we find $E[~\underline{y}~]$ so that we can substitute it into the preceding equation:
$\underline{y} = (\Theta- E[\Theta])-\underline{h}^T (\underline{x}-E[~ \underline{x}~])$
$E[\underline{y}] = E\bigg[(\Theta- E[\Theta])-\underline{h}^T (\underline{x}-E[~ \underline{x}~])\bigg]$
$E[\underline{y}] = \bigg(E[\Theta]- E\Big[E[\Theta]\Big]\bigg)-\underline{h}^T \bigg(E[\underline{x}]-E\Big[E[~ \underline{x}~]\Big]\bigg)$
$E[\underline{y}] = \bigg(E[\Theta]- E[\Theta]\bigg)-\underline{h}^T \bigg(E[~ \underline{x}~]-E[\underline{x}]\bigg)$
$E[\underline{y}] = \underline{0}$
Thus:
$\text{Cov}(\underline{x}, \underline{y}) + E[~\underline{x}~]\cdot \underline{0}^T=0$
$\text{Cov}(\underline{x}, \underline{y}) = 0$
I get a little bit stuck on the next step of the proof (see the video). I've posted a picture of the video frame with red ink showing where I get messed up.

\begin{align}
\text{Cov}(x, y) &= E[(x-E[x])(y-E[y])^\top] & \text{defn. of covariance} \\
&= E[(x-E[x])y^\top] & \text{$y$ has zero mean} \\
&= E[(x-E[x])((\theta - E[\theta]) - h^\top(x - E[x]))^\top] & \text{defn. of $y$} \\
&= E[(x-E[x])(\theta - E[\theta])^\top] - E[(x-E[x])(h^\top(x-E[x]))^\top] & \text{linearity of expectation} \\
&= E[(x-E[x])(\theta - E[\theta])^\top] - E[(x-E[x])(x-E[x])^\top]\, h & \text{since $(h^\top(x-E[x]))^\top = (x-E[x])^\top h$}
\end{align}
Note that distributing the transpose leaves a trailing $h$, not $h^\top$. Setting this to zero gives $\text{Cov}(x, \theta) = \text{Cov}(x, x)\, h$, i.e. $h = \text{Cov}(x, x)^{-1}\,\text{Cov}(x, \theta)$.
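As a numerical sanity check (not from the video): the last line says $\text{Cov}(x,y)$ vanishes exactly when $h$ solves $\text{Cov}(x,x)\, h = \text{Cov}(x,\theta)$. A minimal NumPy sketch, with an arbitrary made-up joint covariance for $(x_1, x_2, \theta)$:

```python
import numpy as np

# Verify that h = Cov(x,x)^{-1} Cov(x,theta) makes Cov(x, y) = 0,
# where y = (theta - E[theta]) - h^T (x - E[x]).
# The joint covariance below is random and illustrative, not from the video.
rng = np.random.default_rng(0)

A = rng.standard_normal((3, 3))
joint_cov = A @ A.T                  # positive-definite covariance of (x_1, x_2, theta)

Sigma_x = joint_cov[:2, :2]          # Cov(x, x), 2x2
sigma_x_theta = joint_cov[:2, 2]     # Cov(x, theta), length-2 vector

# LMMSE coefficients: solve Cov(x,x) h = Cov(x, theta).
h = np.linalg.solve(Sigma_x, sigma_x_theta)

# Cov(x, y) = Cov(x, theta) - Cov(x, x) h, which should vanish.
cov_x_y = sigma_x_theta - Sigma_x @ h
print(cov_x_y)  # ~ [0, 0] up to floating-point error
```

Working with exact covariance matrices (rather than samples) makes the cancellation hold up to floating-point precision, which isolates the algebra from Monte Carlo noise.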