I can't figure out how to derive the estimator via the normal equations for weighted linear regression. (It's supposed to be similar to the ordinary normal equations.)
I set up the problem as $W(y-X\beta)^T(y-X\beta)$.
My Steps:
$W(y^Ty - 2(X\beta)^Ty + (X\beta)^T(X\beta))$
$Wy^Ty - 2W(X\beta)^Ty + W(X\beta)^T(X\beta)$
Take the derivative w.r.t. $\beta$ and set it equal to zero:
$2WX^Ty = 2WX^TX\beta \rightarrow WX^Ty = WX^TX\beta$
But this isn't right because the answer is:
$$\hat{\beta} = (X^TWX)^{-1}X^TWy$$
I'm not sure what I'm doing wrong.
Note: $W$ is an $n \times n$ matrix whose only non-zero entries lie on the diagonal. $\beta$ is the parameter vector; $\hat{\beta}$ is the estimator.
Your problem formulation is incorrect. Note that $W(y-X\beta)^T(y-X\beta)$ is an $n \times n$ matrix, since $(y-X\beta)^T(y-X\beta)$ is a scalar and $W$ is $n \times n$. The weighted least squares problem is to minimize the scalar $(y-X\beta)^TW(y-X\beta)$. Differentiating with respect to $\beta$ and setting the gradient to zero gives $-2X^TWy + 2X^TWX\beta = 0$, i.e. $X^TWX\hat{\beta} = X^TWy$, which yields the solution you're looking for.
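As a numerical sanity check, here is a minimal sketch (assuming NumPy; the data and weights are made up for illustration) that computes $\hat{\beta} = (X^TWX)^{-1}X^TWy$ and confirms it matches the equivalent trick of rescaling each row by $\sqrt{w_i}$ and running ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
w = rng.uniform(0.5, 2.0, size=n)   # positive weights on the diagonal of W
W = np.diag(w)
y = X @ beta_true + rng.normal(size=n) / np.sqrt(w)

# Closed-form WLS estimator: solve (X^T W X) beta = X^T W y
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Equivalent formulation: scale rows by sqrt(w), then ordinary least squares
Xs = X * np.sqrt(w)[:, None]
ys = y * np.sqrt(w)
beta_ols, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

print(np.allclose(beta_hat, beta_ols))  # True: the two formulations agree
```

Solving the linear system with `np.linalg.solve` is preferable to forming the explicit inverse $(X^TWX)^{-1}$, which is slower and less numerically stable.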