How do I prove the bias-variance decomposition of MSE, and how does it work in the multidimensional case?


Let $x \in \mathbb R$ be the true value of the parameter we are trying to estimate with an estimator $\hat x$. The bias, variance and mean squared error are defined as $$b(\hat x) = E[\hat x - x],$$ $$V(\hat x) = E[(\hat x - E[\hat x])^2],$$ $$MSE(\hat x) = E[(\hat x - x)^2].$$

  • I know that $MSE(\hat x) = V(\hat x) + [b(\hat x)]^2$, but I don't see why. How can I prove it? I've tried expanding the $(a-b)^2$ parts as $a^2 - 2ab + b^2$, but it didn't help because I don't know what to do with $[b(\hat x)]^2$.
  • What is the relationship between these three quantities if we replace $x, \hat x$ with $X, \hat X \in \mathbb R^n$?

Regarding only your first question: try showing that both $MSE(\hat{x})$ and $V(\hat{x})+b(\hat{x})^{2}$ equal $\mathbb E(\hat{x}^{2})-2x\,\mathbb E(\hat{x})+x^{2}$, using only the basic definitions of the terms you mentioned together with basic properties of expectation and variance. In particular, use the linearity of expectation, the identity $V(\hat{x})=\mathbb E(\hat{x}^{2})-\mathbb E(\hat{x})^{2}$, and the fact that the expectation of a constant is the constant itself ($x$ is a constant here, not a random variable).
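Filling in the hint, one way the algebra can be written out (using the definitions from the question and the variance identity above):

```latex
\begin{align*}
MSE(\hat x) &= E[(\hat x - x)^2]
  = E[\hat x^2] - 2x\,E[\hat x] + x^2,\\
V(\hat x) + b(\hat x)^2
  &= \bigl(E[\hat x^2] - E[\hat x]^2\bigr) + \bigl(E[\hat x] - x\bigr)^2\\
  &= E[\hat x^2] - E[\hat x]^2 + E[\hat x]^2 - 2x\,E[\hat x] + x^2\\
  &= E[\hat x^2] - 2x\,E[\hat x] + x^2
  = MSE(\hat x).
\end{align*}
```

The first line uses linearity of expectation and $E[x^2] = x^2$ since $x$ is a constant; in the second block the $E[\hat x]^2$ terms cancel, which is exactly where the cross term from your $(a-b)^2$ expansion disappears.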
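The identity can also be sanity-checked numerically. Below is a small Monte Carlo sketch (the shrinkage estimator $\hat x = 0.9\,\bar x$ is a made-up example, chosen only so the bias is nonzero); note that with empirical moments the identity holds exactly, not just approximately, because the residuals about the empirical mean sum to zero:

```python
import random
import statistics

# Hypothetical setup (not from the original post): estimate the mean x of a
# normal distribution with the deliberately biased estimator
# xhat = 0.9 * (sample mean), so both bias and variance contribute to MSE.
random.seed(0)

x = 2.0           # true parameter value
n = 10            # sample size per experiment
trials = 50_000   # Monte Carlo repetitions

estimates = []
for _ in range(trials):
    sample = [random.gauss(x, 1.0) for _ in range(n)]
    estimates.append(0.9 * statistics.fmean(sample))

mean_est = statistics.fmean(estimates)
bias = mean_est - x                                              # b(xhat)
var = statistics.fmean((e - mean_est) ** 2 for e in estimates)   # V(xhat)
mse = statistics.fmean((e - x) ** 2 for e in estimates)          # MSE(xhat)

# The decomposition predicts MSE = V + b^2; for empirical moments this is
# an algebraic identity, so it holds up to floating-point rounding only.
print(mse, var + bias ** 2)
assert abs(mse - (var + bias ** 2)) < 1e-9
```

The empirical bias should land near the theoretical value $0.9x - x = -0.2$, while `mse` and `var + bias ** 2` agree to machine precision.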