Connection between error probability and MSE


Setup: Suppose we have an estimate $\hat{\beta}\in\mathbb{R}^n$ of a signal $\beta\in\mathbb{R}^n$. Define the error probability $P_e=\mathbb{P}[\hat{\beta}\neq\beta]$ and the mean squared error (MSE) $\frac{1}{n}\|\beta-\hat{\beta}\|_2^2$.

Question: If $\text{MSE}\rightarrow0$ as $n\rightarrow\infty$, can we show that $P_e\rightarrow0$ as $n\rightarrow\infty$? This seems intuitive to me, but I can't show it rigorously. I was hoping to prove it via an upper bound of the form $$ P_e\leq C\cdot \text{MSE}, $$ where $C$ is some constant (not growing as $n\rightarrow\infty$). Is this possible? Thanks.



On BEST ANSWER

This isn't possible in general. Consider $\beta=0\in\mathbb{R}^n$ and $\hat\beta=\frac{1}{n}\mathbf{1}$ (every entry equal to $1/n$). Then $\text{MSE}=\frac{1}{n}\sum_{i=1}^n\left(\frac{1}{n}\right)^2=\frac{1}{n^2}\rightarrow0$, yet $\hat\beta\neq\beta$ for every $n$, so $P_e=1$ for all $n$. In particular, no constant $C$ can make the bound $P_e\leq C\cdot\text{MSE}$ hold, since the left side stays at $1$ while the right side vanishes.
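A quick numerical sketch of this counterexample (the choice of $n$ values is arbitrary; since the estimator here is deterministic, $P_e$ is just the indicator of $\hat\beta\neq\beta$):

```python
import numpy as np

# Counterexample: beta = 0 in R^n, beta_hat = (1/n, ..., 1/n).
for n in [10, 100, 1000]:
    beta = np.zeros(n)
    beta_hat = np.full(n, 1.0 / n)
    mse = np.mean((beta - beta_hat) ** 2)   # equals 1/n^2, tends to 0
    p_e = float(np.any(beta_hat != beta))   # deterministic estimator: P_e is 0 or 1
    print(n, mse, p_e)
```

The printed MSE shrinks like $1/n^2$ while the error probability stays pinned at $1$, confirming that no $n$-independent constant $C$ can satisfy $P_e\leq C\cdot\text{MSE}$.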